Language used: C#, IDE: VC2008SP1.
So I have two questions:
1. From the server side, how can I maintain the desired transfer speed of the packets? At the moment I pace them with Thread.Sleep between packets, but I suspect that is the wrong approach (there is a sketch of what I mean near the end of this post).
2. On the client side I have a problem: about 20% of the UDP packets are lost, so the pictures cannot be decoded. The client uses asynchronous UDP:
Code:
using System;
using System.IO;
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Windows.Forms;

// Fields declared elsewhere in the class:
//   _client        - UdpClient
//   _rcv_endp      - IPEndPoint the client listens on
//   _stream        - MemoryStream that accumulates the packets of one picture
//   _header_length - size of the per-packet header in bytes

public void client_start()
{
    try
    {
        _client = new UdpClient(_rcv_endp);
        _client.BeginReceive(ReceiveCallback, null);
    }
    catch (Exception e)
    {
        MessageBox.Show(e.Message);
    }
}

private void ReceiveCallback(IAsyncResult ar)
{
    try
    {
        // Queue the next receive before finishing this one.
        _client.BeginReceive(ReceiveCallback, null);
        byte[] receiveBytes = _client.EndReceive(ar, ref _rcv_endp);

        // Bytes 5..7 of the header spell "NEW" on the first packet of a picture.
        if (_stream.Length == 0)
        {
            // Nothing buffered yet: only start accumulating at the beginning of a picture.
            if (Encoding.ASCII.GetString(receiveBytes, 5, 3) == "NEW")
                _stream.Write(receiveBytes, _header_length, receiveBytes.Length - _header_length);
        }
        else
        {
            // A new picture has started: hand the previous one over and reset the buffer.
            if (Encoding.ASCII.GetString(receiveBytes, 5, 3) == "NEW")
            {
                OnCompleted(_stream);
                _stream.SetLength(0);
                _stream.Position = 0;
            }
            _stream.Write(receiveBytes, _header_length, receiveBytes.Length - _header_length);
        }
    }
    catch (Exception e)
    {
        MessageBox.Show(e.Message);
    }
}
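To clarify the packet layout that the check above relies on: the first packet of every picture carries the ASCII marker "NEW" at bytes 5..7 of the header. The builder below is only a placeholder sketch, not my real sender code (the counter fields are made up, and it assumes _header_length is at least 8):

Code:
// Hypothetical sender-side header builder, shown only to illustrate the
// "NEW" marker at bytes 5..7 that ReceiveCallback checks. The frame/packet
// counter fields are placeholders, not my real protocol.
private byte[] BuildHeader(ushort frameId, ushort packetNo, bool firstPacketOfFrame)
{
    byte[] header = new byte[_header_length];                  // assumes _header_length >= 8
    BitConverter.GetBytes(frameId).CopyTo(header, 0);          // bytes 0..1 (placeholder)
    BitConverter.GetBytes(packetNo).CopyTo(header, 2);         // bytes 2..3 (placeholder)
    if (firstPacketOfFrame)
        Encoding.ASCII.GetBytes("NEW").CopyTo(header, 5);      // bytes 5..7, checked by the receiver
    return header;
}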
OnCompleted(_stream) is an event delegate; after it fires, the picture is decoded from the stream and shown.
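For context, the event is consumed roughly like this (a trimmed-down sketch with illustrative names, not my exact code):

Code:
// Hypothetical wiring of the OnCompleted event (e.g. in the form constructor).
// _receiver and pictureBox1 are placeholder names for this sketch.
_receiver.OnCompleted += stream =>
{
    stream.Position = 0;
    using (Image img = Image.FromStream(stream))
    {
        Bitmap copy = new Bitmap(img);                                      // detach the bitmap from the stream
        pictureBox1.BeginInvoke((Action)(() => pictureBox1.Image = copy));  // back to the UI thread
    }
};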
On a direct 100 Mbit Ethernet network, packets are only not lost with Thread.Sleep(20) or more between packets, but then I get nowhere near the desired 25 fps... Maybe you have some advice for me?
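To make question 1 concrete, the send loop currently works roughly like this (a simplified sketch; GetJpegFrames, SplitIntoPackets and the parameter names are illustrative, not my real code):

Code:
// Rough sketch of the current sender pacing (question 1); illustrative only.
// GetJpegFrames() and SplitIntoPackets() are placeholder names.
private void SendLoop(UdpClient server, IPEndPoint clientEndp)
{
    foreach (byte[] frame in GetJpegFrames())                 // one JPEG per frame
    {
        foreach (byte[] packet in SplitIntoPackets(frame))    // header + payload for each UDP packet
        {
            server.Send(packet, packet.Length, clientEndp);
            Thread.Sleep(20);   // fixed delay per packet: no more loss, but a frame of N packets
                                // now needs N * 20 ms, far over the 40 ms budget of 25 fps
        }
    }
}

With 20 ms per packet, a picture of, say, 20 packets already takes around 400 ms to send, so 25 fps is impossible; I suspect I should pace the packets against the 40 ms frame budget instead of sleeping a fixed time per packet, but I am not sure that is the right way to do it.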
Thanks, and sorry for my English.