Hello,
I've made an application that receives UDP packets (around 820
packets per second, 651 bytes per packet).
When the same application that sends and receives the packets uses the
loopback interface (IP 127.0.0.1), it receives all packets correctly.
But when the sender and receiver are on different computers, connected
only through an Ethernet crossover cable, packets are lost.
And not just one or two once in a while, which would be acceptable: of
the 820 packets per second, around 200 are lost. (That's only about
820 x 651 bytes, roughly 0.5 MB/s, so raw bandwidth shouldn't be the limit.)
What can be the cause of that?
Has anyone seen something similar?
Is there an error in the Windows TCP/IP stack?
Does getting the packets from the network interface and processing them
take so much more time than using the loopback interface?
I'm using Windows XP SP2 with the firewall disabled.
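To give an idea of the setup, below is a stripped-down sketch of what I mean
by the receive side. The port number, the buffer sizes, and the explicit
SO_RCVBUF bump are placeholders for illustration, not my exact code:

#include <winsock2.h>
#include <string.h>
#pragma comment(lib, "ws2_32.lib")

int main(void)
{
    WSADATA wsa;
    if (WSAStartup(MAKEWORD(2, 2), &wsa) != 0)
        return 1;

    SOCKET s = socket(AF_INET, SOCK_DGRAM, IPPROTO_UDP);
    if (s == INVALID_SOCKET)
        return 1;

    /* Placeholder: enlarge the receive buffer in case the default is too
       small for ~820 packets/s of 651 bytes each. */
    int rcvbuf = 1 << 20;  /* 1 MB, arbitrary value for illustration */
    setsockopt(s, SOL_SOCKET, SO_RCVBUF, (const char *)&rcvbuf, sizeof(rcvbuf));

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port = htons(5000);              /* placeholder port */
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    if (bind(s, (struct sockaddr *)&addr, sizeof(addr)) == SOCKET_ERROR)
        return 1;

    char buf[2048];
    for (;;) {
        /* Blocking receive; each datagram is around 651 bytes. */
        int n = recv(s, buf, sizeof(buf), 0);
        if (n == SOCKET_ERROR)
            break;
        /* ... process the packet here ... */
    }

    closesocket(s);
    WSACleanup();
    return 0;
}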
Thanks in advance,
Chris