How to set buffer size in client-server app using sockets?
Posted by nelly on Stack Overflow, 2010-06-17.
First of all, I am new to networking, so I may say something dumb here. Consider a client-server application using sockets (.NET with C#, if that matters).
- The client sends some data; the server processes it and sends back a string.
- The client sends some other data; the server processes it, queries the database, and sends back several hundred items from the database.
- The client sends some other type of data, and the server notifies some other clients.
My question is how to set the buffer size correctly for read/write operations.
Should I do something like this: byte[] buff = new byte[client.ReceiveBufferSize]?
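To make that concrete, here is a rough sketch of the kind of read I have in mind (the host name and port are placeholders, not from my real code):

    using System;
    using System.Net.Sockets;

    class ReceiveSketch
    {
        static void Main()
        {
            // "localhost" and 5000 are placeholders for the real server endpoint.
            using (TcpClient client = new TcpClient("localhost", 5000))
            using (NetworkStream stream = client.GetStream())
            {
                // Buffer sized from the socket's receive buffer; a single Read may
                // still return fewer bytes than the buffer can hold.
                byte[] buff = new byte[client.ReceiveBufferSize];
                int read = stream.Read(buff, 0, buff.Length);
                Console.WriteLine("got {0} bytes", read);
            }
        }
    }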
I am thinking of something like this: the client sends data to the server (and the server will follow the same pattern):
byte[] bytesToSend = new byte[2048] // 2048 as the standard frame size for any command sent by the client
bytes 0..1 -> command type
bytes 2..2047 -> command parameters
byte[] bytesToReceive = new byte[8] / new byte[64] / new byte[8192] // chosen by a switch on the command type
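Put as code, the client side of that scheme would look roughly like this (the command code, parameter string and reply size are made up; the read loop is there because a single Read is not guaranteed to return a whole frame):

    using System;
    using System.Net.Sockets;
    using System.Text;

    class FramedCommandSketch
    {
        const int FrameSize = 2048;

        static void Main()
        {
            using (TcpClient client = new TcpClient("localhost", 5000)) // placeholder endpoint
            using (NetworkStream stream = client.GetStream())
            {
                // One fixed-size request frame: bytes 0..1 = command type, 2..2047 = parameters.
                byte[] frame = new byte[FrameSize];
                frame[0] = 0;
                frame[1] = 2; // made-up code for "query the database"
                byte[] args = Encoding.UTF8.GetBytes("category=books"); // made-up parameter
                Array.Copy(args, 0, frame, 2, args.Length);
                stream.Write(frame, 0, frame.Length);

                // Reply size picked per command, as in the switch idea (64 is made up).
                int replySize = 64;
                byte[] reply = new byte[replySize];
                int total = 0;
                while (total < replySize)
                {
                    int n = stream.Read(reply, total, replySize - total);
                    if (n == 0) throw new Exception("connection closed mid-reply");
                    total += n;
                }
                Console.WriteLine("read {0} reply bytes", total);
            }
        }
    }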
But what happens when a client is notified by the server without having sent any data first? What is the correct way to accomplish what I am trying to do? Thanks for reading.
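Is a dedicated listening loop on the client what is needed for that case? A rough, untested sketch of what I am imagining (same made-up 2048-byte frame and placeholder endpoint):

    using System;
    using System.Net.Sockets;
    using System.Threading;

    class NotificationListenerSketch
    {
        const int FrameSize = 2048;

        // Keep reading fixed-size frames so a server push always has a reader waiting.
        static void Listen(NetworkStream stream)
        {
            byte[] frame = new byte[FrameSize];
            while (true)
            {
                int total = 0;
                while (total < FrameSize)
                {
                    int n = stream.Read(frame, total, FrameSize - total);
                    if (n == 0) return; // server closed the connection
                    total += n;
                }
                Console.WriteLine("notification received, command type {0}",
                                  BitConverter.ToUInt16(frame, 0));
            }
        }

        static void Main()
        {
            using (TcpClient client = new TcpClient("localhost", 5000)) // placeholder endpoint
            using (NetworkStream stream = client.GetStream())
            {
                Thread listener = new Thread(() => Listen(stream)) { IsBackground = true };
                listener.Start();
                Console.ReadLine(); // keep the client alive while it waits for pushes
            }
        }
    }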