Here's something I've been noticing lately when I browse the web. I use Firefox mostly, but I've seen it on other browsers as well.
Let's say I'm trying to reach a particular website, or I've just clicked a link on a popular site that is busy but not currently being hammered; that is, I'm not the only one connecting to that server, but its load is still well below the maximum it can handle.
I'll type the address in or click the link, and the little animated icon will start spinning to tell me the page is downloading. Then I'll wait 10-15 seconds without the page even starting to appear.
So I get impatient: I click the STOP button and start the request again (click in the address bar and press Enter, or just click the link again), and usually the requested page shows up within a second or two.
I haven't used dial-up in a while, so I'm not describing slow dial-up behavior; this happens on a fast DSL connection, and even at work, where I've been able to download files at over 1 megaBYTE per second (and I mean bytes, not just bits).
So I guess the question is: if the first request doesn't get a reasonably quick response, why don't browsers simply send it again, to see whether a second attempt gets through? Yes, some sites would get hammered a bit more, but shouldn't a protocol like HTTP account for something like that?
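For the technically inclined, here is roughly what the stop-and-retry trick does by hand, sketched as client-side retry logic with a short timeout. This is a hypothetical illustration (the function names and timeout values are my own invention, not how any browser actually implements it): instead of letting one stalled connection attempt sit for many seconds, you abandon it quickly and start fresh.

```python
import socket


def fetch_with_retry(connect, attempts=3, timeout=2.0):
    """Try a connection function up to `attempts` times, each with a
    short timeout, instead of waiting on a single stalled attempt.

    `connect` is any callable taking a timeout (seconds) that raises
    OSError on failure, e.g. a TCP connect or an HTTP request. A lost
    initial packet normally makes the OS wait through escalating TCP
    retransmission timers (roughly 3s, 6s, 12s...), which matches the
    "stuck for 10-15 seconds" symptom; a brand-new attempt often
    succeeds immediately, which matches the stop-and-retry fix.
    """
    last_error = None
    for _ in range(attempts):
        try:
            return connect(timeout)
        except OSError as e:  # socket.timeout is a subclass of OSError
            last_error = e    # give up on this attempt, retry fresh
    raise last_error
```

A usage sketch: `fetch_with_retry(lambda t: socket.create_connection(("example.com", 80), timeout=t))` would attempt the connection three times, two seconds each, before giving up, instead of stalling for the better part of a minute on one attempt.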
(Note: if you're a programmer or a network admin or something similar, and you know the answer but it's technical, be advised that I have a degree in computer science and can probably follow the more technical stuff you might write. And if I do understand it, I might even "translate" it into words that laypeople are more likely to understand, as I tend to be pretty good at that.)
Creativity begets criticism.
Check out my new blog: http://50wordstories.ca
Also check out my old game design blog: http://stealmygamedesigns.blogspot.com