How to debug packet loss?
Posted by Gene Vincent on Stack Overflow, 2010-04-27
I wrote a C++ application (running on Linux) that serves an RTP stream at about 400 kbps. To most destinations this works fine, but some destinations experience packet loss. The problematic destinations seem to have a slower connection in common, though even a slow link should have plenty of headroom for a 400 kbps stream.
Since these destinations can receive similar RTP streams from other applications without packet loss, my application might be at fault.
I already verified a few things:

- In a tcpdump on the sending machine, I see all RTP packets going out.
- A UDP send buffer is in place; I tried sizes between 64 KB and 300 KB (see the sketch after this list).
- The RTP packets mostly stay below 1400 bytes to avoid fragmentation.
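For reference, here is a stripped-down sketch of the send-buffer setup (the real sender is more involved; the socket setup and sizes here are illustrative):

    #include <cstdio>
    #include <sys/socket.h>
    #include <unistd.h>

    int main() {
        int sock = socket(AF_INET, SOCK_DGRAM, 0);
        if (sock < 0) { std::perror("socket"); return 1; }

        // Request a 300 KB send buffer (one of the sizes I tried).
        int requested = 300 * 1024;
        if (setsockopt(sock, SOL_SOCKET, SO_SNDBUF,
                       &requested, sizeof(requested)) < 0) {
            std::perror("setsockopt(SO_SNDBUF)");
        }

        // Read back the effective size; Linux doubles the requested
        // value internally to account for bookkeeping overhead.
        int effective = 0;
        socklen_t len = sizeof(effective);
        getsockopt(sock, SOL_SOCKET, SO_SNDBUF, &effective, &len);
        std::printf("SO_SNDBUF requested=%d effective=%d\n",
                    requested, effective);

        close(sock);
        return 0;
    }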
What can a sending application do to minimize the possibility of packet loss, and what would be the best way to debug such a situation?
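To make that more concrete: one thing I am wondering about is whether pacing the packets evenly, instead of writing them out in bursts, would help on slower links. A rough sketch of what I mean (the send call is a placeholder, not my actual code):

    #include <chrono>
    #include <thread>

    // At ~400 kbps with ~1400-byte packets the stream is only about
    // 400000 / (1400 * 8) ≈ 36 packets/s, i.e. one packet every ~28 ms.
    void paced_send_loop() {
        using clock = std::chrono::steady_clock;
        const auto interval = std::chrono::milliseconds(28);
        auto next = clock::now();
        for (;;) {
            // send_next_rtp_packet();  // placeholder for the real send
            next += interval;
            std::this_thread::sleep_until(next);
        }
    }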