I'm trying to write a Reliable Data Transfer program for class. The problem is that something on the computer itself seems to interfere with the process.
Essentially, we're taking a file provided to us, breaking it into packets, and using a "Sender" to transport it over UDP to a "Receiver" that makes an exact copy. Obviously there's more to it than that (checksums, sequence numbers, timeouts, etc.), but in my program, while the packets are definitely being sent, they aren't being received. It doesn't seem to be a problem with the code itself, though, because the exact same program works fine on the Macs and on my professor's computer. Someone told me it might be due to a firewall on my computer.
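For context, here is roughly the shape of what I mean, a minimal sender/receiver sketch rather than my actual assignment code; the port 9876, the loopback address, and the payload are just placeholders:

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class UdpSketch {
    public static void main(String[] args) throws Exception {
        // Receiver: bind a socket and wait for one packet in a background thread.
        DatagramSocket receiver = new DatagramSocket(9876);
        new Thread(() -> {
            try {
                byte[] buf = new byte[1024];
                DatagramPacket incoming = new DatagramPacket(buf, buf.length);
                receiver.receive(incoming);  // blocks until a datagram arrives
                System.out.println("Received: "
                        + new String(incoming.getData(), 0, incoming.getLength()));
                receiver.close();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }).start();

        // Sender: build a datagram addressed to the receiver and send it.
        DatagramSocket sender = new DatagramSocket();
        byte[] payload = "packet 0".getBytes();
        DatagramPacket outgoing = new DatagramPacket(
                payload, payload.length, InetAddress.getByName("127.0.0.1"), 9876);
        sender.send(outgoing);
        sender.close();
    }
}

On my machine, code along these lines sends fine but the receive side never sees anything, even though the same approach works elsewhere.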
The "internet" in the title refers to another odd thing: whenever my computer has any kind of connection to the internet, I get a lot of errors talking about how the datagram packet couldn't be sent. But when the connection is removed, I don't get any of them.
I've heard this has happened on other computers in my class as well. Do you know what I could do to keep this from happening, or what is causing it?
Many firewalls block all incoming UDP unless the computer behind the firewall has already sent a UDP packet out to the sending computer. This is called "punching a hole" in the firewall. Check whether the Windows firewall, ZoneAlarm, Norton Internet Security, or something similar is running and blocking incoming packets, and turn it off (or add an exception for your program).
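If you want to try the hole-punching idea instead of disabling the firewall, a rough sketch is below. The receiver first sends a throwaway datagram to the sender's address and port so a stateful firewall records the flow and lets traffic back in; the address 192.0.2.10 and port 5000 are placeholders for wherever your sender actually lives:

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class ReceiverWithHolePunch {
    public static void main(String[] args) throws Exception {
        DatagramSocket socket = new DatagramSocket(5000);   // receiver's own port
        InetAddress senderAddr = InetAddress.getByName("192.0.2.10");
        int senderPort = 5000;

        // Outbound "hello" packet: its only job is to open the firewall mapping.
        byte[] hello = "hello".getBytes();
        socket.send(new DatagramPacket(hello, hello.length, senderAddr, senderPort));

        // Now incoming datagrams from the sender should be allowed through.
        byte[] buf = new byte[1500];
        DatagramPacket packet = new DatagramPacket(buf, buf.length);
        socket.receive(packet);
        System.out.println("Got " + packet.getLength() + " bytes from "
                + packet.getAddress());
    }
}

The sender side would just need to ignore (or expect) that initial "hello" datagram. Whether this works depends on the particular firewall, so disabling it temporarily is still the quickest test.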