Inside my program I am using a TCP/IP client connection to a server.
Everything works correctly, except for one thing I don't understand.
When I send data to the socket with the 'send' system call, I can see
on my LAN analyser two, and sometimes three, Ethernet frames, even
though I am sending only a small amount of data. This makes the data
hard to read on my LAN analyser (I'm using another protocol on top of
TCP). It would not matter much, except that when the data is split
into three frames I notice a problem on the server (which I cannot
debug).
Does anyone know a way to force the TCP/IP connection to send
one Ethernet frame per send call?
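For reference, here is a minimal, self-contained sketch of the kind of exchange described above, reproduced over a loopback TCP connection. Everything in it (the `demo` function name, the loopback setup, the 168-byte payload taken from the PS below) is illustrative, not the original program. It shows the underlying point: TCP is a byte stream with no message boundaries, so a single send is not guaranteed to arrive, or appear on the wire, as a single unit, and the receiver must loop on recv and let the application protocol delimit its own records.

```python
# Illustrative sketch only: one send() of 168 bytes over loopback TCP.
# TCP preserves the byte stream, not message boundaries, so a single
# recv() may return fewer bytes than were sent; collect until done.
import socket

def demo(payload_size=168):
    # Listener on an ephemeral loopback port (stands in for the server).
    lsock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    lsock.bind(("127.0.0.1", 0))
    lsock.listen(1)

    # Client side: one send() call, as in the question.
    csock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    csock.connect(lsock.getsockname())
    ssock, _ = lsock.accept()

    sent = csock.send(b"x" * payload_size)

    # Server side: loop on recv() until the expected amount arrives,
    # since nothing guarantees it comes back in one piece.
    received = b""
    while len(received) < payload_size:
        chunk = ssock.recv(payload_size - len(received))
        if not chunk:
            break
        received += chunk

    for s in (csock, ssock, lsock):
        s.close()
    return sent, len(received)

print(demo())
```

How many Ethernet frames those bytes occupy on the wire depends on the stack (MSS/MTU, Nagle's algorithm, timing), which is why an analyser can show two or three frames for one small send even when all the bytes arrive intact.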
PS: the same program on Windows NT sends one Ethernet frame per
send system call (168 bytes of data).
PS: I have increased the buffer sizes without any change. I'm using