Sometimes I boil water in a teapot. Sometimes I boil water in the microwave. The weird thing is, I seem to get different results in terms of the speed of cooling. When I use a teapot, I typically don't bring the water to a full boil, just until I can hear that it's almost boiling. When I use the nuker, I often (inadvertently) bring it to a full, bubbling boil. However, it seems that water boiled in the microwave cools much faster than water boiled on the stove.
Now, this makes no sense to me at all. I can think of no mechanism by which the way that water was boiled should have anything to do with how fast it cools. As far as I know, 212 degree water is 212 degree water. If anything, I would think that the microwaved water should cool slower, because in that case, I'm actually boiling it in the mug, which would heat the mug up, and so I wouldn't lose any heat from pouring the water into a cold mug. Yet it still seems to me, in the absence of actual experimentation, that the nuked water cools faster.
I'm crazy, right? There's no difference, and I'm just imagining it, right?
I can't imagine the way you heat water would have any effect on how it cools. You're probably just going crazy.
edit: Since the water is boiling more vigorously, could it be expending its energy in the form of heat faster than the almost-boiling water?
The microwave doesn't heat the cup, just the liquid inside. Then the cup will draw heat from the water.
Contrast this to the teapot. The stove heats the pot (higher than 212) and the hot pot boils the water. Hot pot = hot water for a longer period of time.
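If you want to put rough numbers on that, here's a quick back-of-the-envelope mixing calculation. Everything in it (masses, specific heats, starting temperatures) is a plausible guess rather than a measurement, and it ignores losses to the air:

```python
# Rough mixing calculation: find the equilibrium temperature of hot water
# plus a mug, ignoring losses to the air. All numbers are plausible
# guesses for a typical setup, not measurements.

WATER_MASS = 0.35   # kg (~12 oz of water)
WATER_C = 4186      # J/(kg*K), specific heat of water
MUG_MASS = 0.30     # kg, typical ceramic mug
MUG_C = 850         # J/(kg*K), rough value for ceramic

def equilibrium_temp(t_water, t_mug):
    """Weighted average of the two temperatures by heat capacity."""
    cw = WATER_MASS * WATER_C
    cm = MUG_MASS * MUG_C
    return (cw * t_water + cm * t_mug) / (cw + cm)

# Boiling water meets a room-temperature vessel:
print(equilibrium_temp(100.0, 20.0))   # ~88 C

# Boiling water in a vessel that's already hot (say 90 C):
print(equilibrium_temp(100.0, 90.0))   # ~98.5 C
```

So a cold vessel knocks roughly ten degrees off the water right away, while a pre-heated one barely touches it, which is the "hot pot = hot water for longer" point.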
When I first moved out on my own and didn't have an electric kettle, I'd nuke my water. Once I got a kettle, for whatever reason, I noticed the water seemed to stay warmer just that little bit longer. Despite, as you said, the fact that the microwaved water was probably brought to a much higher temperature.
The other instance I've noticed this is when I heat up something like soup to put in a thermos. Either way, I pre-heat the thermos with boiling water for 10 minutes before putting the soup in. But if I heat the soup in the microwave, it's generally lukewarm by the end of the day. Meanwhile, the exact same soup in the exact same thermos, except heated on the stove instead, will stay piping hot almost until the end of the day.
God, I hope there's a good explanation out there. It's always bugged me, and I could never figure it out myself.
It really shouldn't be all that noticeable, though. If anything, the mug would keep the water hot for longer, since ceramic is an insulating material.
I presume you're also using the same mug for this, correct?
There is a related belief that warm water will yield ice cubes faster than cold water. This is also impossible, except that if enough of the warmer water evaporates, you end up with smaller ice cubes, which freeze faster.
Well, water will stop boiling when its temperature drops below the BP, which happens soon after the source of heat is removed, so I can't see how the boiling would affect the cooling. The exception is the superheated case that you mentioned, where boiling would rapidly bring the temperature down to the BP.
I mean after it's in the mug. I've never done a controlled experiment, but it seems to hold regardless of what mug I use.
As to the warmer-water-freezes-faster thing, there's actually a tiny kernel of truth to it. If you superheat water to just below boiling, it will freeze faster than water that's a couple degrees cooler. It has to do with the lowered density of water immediately below the boiling point, or something. It lets the superheated water get a head start, so to speak. But hot water won't freeze faster than cold water, because yes, that's just impossible.
Imagine you have two ice cube trays in the freezer, one with warm water and one with cold. The warm one will melt the frost and the tray will sink into it; the cold one won't. This increases the surface area of the tray in contact with the frost already in the freezer and results in the heat being drawn off quicker.
Combine this with the larger temperature difference, and the cubes might freeze slightly quicker. I say might because I read the article on this a good while ago and can't remember if it was a hypothetical article or not.
What he said. I had an elaborate explanation thought up, but this does the job nicely.
The difference "in the cup" might be because the microwaved water is having the heat leeched from it from the instant the microwaving stops, while the tea- kettled water is buffered from heat loss until it hits the cup. Thus your end product tea kettle water seems (is) hotter.
Hey, it looks good on paper.
In a nutshell, the hot mug would have a faster rate of conduction, as per Fourier's law of conduction, so the water heated in the microwave would cool faster.
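For the curious, here's roughly what Fourier's law predicts for conduction through a mug wall. The conductivity and dimensions are assumed, typical-ceramic-mug values, and it's a steady-state idealization rather than a real measurement:

```python
# Steady-state conduction through a mug wall, per Fourier's law:
#   q = k * A * (T_hot - T_cold) / thickness
# The conductivity and dimensions below are assumed values for a
# typical ceramic mug, chosen for illustration only.

K_CERAMIC = 1.5         # W/(m*K), rough thermal conductivity of ceramic
WALL_AREA = 0.03        # m^2, approximate inner surface of a mug
WALL_THICKNESS = 0.005  # m (5 mm wall)

def heat_flow_watts(t_inside, t_outside):
    """Conductive heat flow through the wall, in watts."""
    return K_CERAMIC * WALL_AREA * (t_inside - t_outside) / WALL_THICKNESS

# Boiling water in a mug whose outer wall is still near room temperature:
print(heat_flow_watts(100.0, 25.0))  # ~675 W

# Once the outer wall itself is hot, the gradient is smaller,
# and so is the heat flow:
print(heat_flow_watts(100.0, 80.0))  # ~180 W
```

The main thing the law says is that the heat flow scales with the temperature difference across the wall.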
You should also consider that the water heated in the microwave will have been degassed - bringing it past the boiling point will drive out any bubbles of dissolved gas from the water. This means that the microwaved water will not possess an additional layer of insulation in the form of bubbles between the water and the edge of the mug, allowing the heat to pass more freely through the conductor (the edge of the mug). Comparatively, the water brought to just below boiling point in a kettle should have retained its bubbles and so will contain an additional layer of insulation along the edge of the mug.
?
Are there any wizards present? They do this kind of shit all the time.
You'd think so, wouldn't you?
http://en.wikipedia.org/wiki/Mpemba_effect
My thought is that it's mostly the fact that hot water contains less dissolved gas than cool water, and this in some way makes it cool down and freeze faster. They still don't know exactly how it works.
The cup is being heated by the water while the microwave is on, too. By the time the microwave is finished, a substantial amount of energy has gone microwave->water->mug. I suspect that the same volume of water would heat up much more quickly in something that won't absorb as much heat, like a paper cup.
I demand statistics. And graphs. With those little error bars and all that.
Either way, microwaves are awesome. I'm going to go home and microwave some water in ElJeffe's honor.
Interesting. I once read a pretty credible article about how it was complete bollocks, but I guess if the phenomenon actually has a name it must be legit.
I always assumed that the answer was something along the lines of the heat being a side effect of the radiation you're exposing the water to bouncing its molecules around, and since the radiation will dissipate much more rapidly once the water is outside the radiation-proofed microwave, the amount of energy contained by the water drops very rapidly.
This would be in comparison to water molecules that are accelerated by gradually increasing the amount of heat they're exposed to, whose only way of reducing their speed is for the heat to transfer into the air around them.
I.e., the water/cheese/whatever doesn't really absorb much of the energy from the microwave; it's just that its molecules are moved about a lot by the radiation, and once the source of radiation is removed and the ambient radiation dissipates, there is not much energy retained, because not much was ever absorbed by the component molecules.
Now that I've typed that out, it sounds really asinine, and is probably very wrong. But it made sense in my head when I was a kid, so I'm just going to run with it.
CUZ THERE'S SOMETHING IN THE MIDDLE AND IT'S GIVING ME A RASH
With a kettle, the heat is coming from one direction, and the water moves away from it when it boils (turns into water vapor).
Random guess there but it sounds good.
Yeah - because if you heat soup, or something viscous like that, in the microwave, the sides can be boiling but the centre can be cold - whereas if you heat it on the stove, convection means that it heats evenly.
I dunno.
From a physics standpoint, it is probably bunk. That is to say that if hot water can freeze faster, it's because it has less gas dissolved in it - not because it's hot. If you let it cool to room temperature after boiling, I'd expect that it would still freeze faster than tap water.
If gas is not the cause, it's possible that a hot container could form a better connection with the frosty floor of a freezer, resulting in substantially faster cooling that way.
Side effects of the water being hot, rather than the temperature of the water itself, are more likely to cause the phenomenon.
Yeah, it is pretty wrong on quite a few levels.
That's pretty much what I was thinking, although it's not that the microwaves can't penetrate the water, it's that they're focused on some very specific places inside the microwave (this is why all modern microwaves have a carousel inside them). You can use paper from a thermal printer to see where the hotspots in your microwave are.
But yeah, not all of the water gets heated as thoroughly in a microwave AFAIK.
Your thermos sucks ass, get ye vacuum flask.
Alas, that's what I own. If it were a regular, non-insulated thermos, I doubt it would stay warm more than a few hours.
Regardless, that doesn't explain the large difference in how long things stay warm when heated in a microwave compared to on a stovetop. The comments about the microwave unevenly heating the liquid make sense, but I always stir the living daylights out of items in the microwave, and in the case of my thermos, I heat it well past boiling several times between stirrings. So I would think the liquid would be as evenly heated as on a stovetop in that case.
I doubt that a difference in pressure from the kettle to the microwave would be significant enough to cause a noticeable change, but that's just what popped into my mind.
I'd be inclined to agree with the methods of heating: with the microwave, the water is not heated evenly. Some of it begins to boil, but the entire mugful has not reached boiling temperature. The stove, however, will heat it more evenly until all of it is at the boiling point.
What he said.
You could approximately test this by boiling water in the microwave at a much lower setting, taking longer. That is usually the fix for 'hotspots' as it allows more time for natural convection etc.
It might even be the case that if you find a setting which boils water in the same amount of time as your kettle boils water, they then cool down at the same rate.
Temperature is a function of the movement of particles in matter. All molecular particles above absolute zero are constantly moving at speeds corresponding to their temperature. The most common way heat is conducted is for something to excite those molecules and cause them to move faster. This can happen either by direct surface contact - i.e., you put your hand on something that is hot and the molecules from it transfer energy directly into the surface of your hand - or it can happen through radiation, for example the heat you get from being in the sun as opposed to the shade. I'm not good with photon theories, though, so it's hard to explain the difference.
Anyway, to the point. Warm water freezes faster than cold because, with its molecules moving faster, it actually transfers its energy faster to external molecules, like air. Think of an unbroken pool table. A cue ball hit hard into the rack may actually stop moving far quicker than one hit slowly, because it arrives sooner to transfer its energy to the other balls. Now imagine an enormous table with a bunch of approaching cue balls on one side and a bunch of relatively inert numbered balls on the other. All of these balls are moving at random angles and bouncing off each other. The faster the cue balls are moving in this chaotic environment, the sooner they are likely to come in contact with something and transfer their kinetic force.
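To put an actual number on "moving at speeds corresponding to their temperature": the ideal-gas relation gives v_rms = sqrt(3*k_B*T/m). Liquid water isn't an ideal gas, so treat this strictly as an order-of-magnitude sketch:

```python
import math

# Back-of-the-envelope check on "temperature = molecular motion":
# for an ideal gas, mean kinetic energy per molecule is (3/2) * k_B * T,
# so v_rms = sqrt(3 * k_B * T / m). Liquid water isn't an ideal gas,
# so this is an order-of-magnitude illustration only.

K_B = 1.380649e-23                    # J/K, Boltzmann constant
M_WATER = 18.015e-3 / 6.02214076e23   # kg, mass of one H2O molecule

def v_rms(temp_kelvin):
    return math.sqrt(3 * K_B * temp_kelvin / M_WATER)

print(v_rms(373.15))  # boiling water: ~720 m/s
print(v_rms(293.15))  # room temperature: ~640 m/s
```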
The microwave & the Stove:
The stove transfers a whole bunch of energy through direct contact for a nice even heat. The microwave sends in... well, microwaves, to superexcite the particles in your food (and any other molecules in there). By exciting the particles so rapidly, you can heat things much quicker than with standard conducted heat, but now the molecules are all moving around like effing crazy. When they come in contact with colder air, they lose their load faster than a 30-year-old virgin.
I'm pretty certain it's irrelevant, as water cools at an exponential rate. So as it cools down, its rate of cooling slows down anyway. Hotter water only cools faster to begin with.
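For example, here's Newton's law of cooling with a made-up time constant; note how a 5-degree head start shrinks as both mugs cool:

```python
import math

# Newton's law of cooling: T(t) = T_env + (T0 - T_env) * exp(-t / tau).
# The time constant tau is an assumption; it depends on the mug, the
# air, whether the mug is covered, and so on.

T_ENV = 20.0   # C, room temperature
TAU = 15.0     # minutes, assumed time constant for a mug of water

def temp_after(t0, minutes):
    return T_ENV + (t0 - T_ENV) * math.exp(-minutes / TAU)

# Full boil vs. "almost boiling":
for start in (100.0, 95.0):
    print(start, "->", round(temp_after(start, 10), 1), "C after 10 min")
# Prints ~61.1 and ~58.5: the 5-degree head start has
# already shrunk to about 2.6 degrees.
```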
You are skipping to Physics 102. Physics 101 is bullshit explanations.
Easiest example: black shirts and white shirts heat more equally in the dryer than in the sun.
Since some particles are very active and some are much less so, when your water comes out, it will measure at the average temperature (a characteristic of temperature is that things of different temperatures are always inherently moving towards their average). Keep in mind also that your water is not pure H2O. It's got particulate matter, chemicals, and biological compounds, unless you've actually gone through the distillation process in perfectly controlled conditions.
Anyway, back to the point: the molecules that are more excited drive up the temperature measurement, but they also transfer their energy faster.
I don't know if that explains it very well or not... and to be honest, there is a lot more complexity than I understand.
I just skimmed that, but I think you're implying that ElJeffe's water is radioactive. I'm not at all convinced of that.
Hmm.
Temperature is basically a measurement of energy, right? So a glass full of water will have an average temperature proportional to the average energy of its particles. If you imagine a glass of water in a perfectly-insulated container, then you would expect the temperature to remain constant over time, whether the particles had wildly variant velocities or whether they were all uniform. The total energy in the water, and the average energy of a given particle, aren't going to change. So in that case, the energy distribution should be irrelevant.
However, perfect insulators don't exist. Even if the mug is an excellent insulator, the top surface of the water is exposed to the air. The question, then, is whether, given the same average energy, a collection of uniform particles would suffer more energy loss to the surrounding air than a collection of wildly variant particles. Without running the numbers, my intuition tells me that the net energy loss would be the same in each case.
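Actually, the numbers are easy to run. Here's a quick check (the parcel count, rate constant, and temperatures are all made up): with a linear cooling law, the total loss depends only on the mean temperature, not the spread.

```python
import random

# Numerically checking the intuition above: if each parcel of water
# loses heat at a rate proportional to (T_i - T_env), a linear law,
# then total loss depends only on the MEAN temperature, not the spread.
# Parcel count, rate constant, and temperatures are all made up.

T_ENV = 20.0
RATE = 0.05   # fraction of the excess temperature lost per step, assumed
N = 10000

uniform = [90.0] * N                                 # every parcel at 90
variant = [random.uniform(80.0, 100.0) for _ in range(N)]  # mean is 90

def step(temps):
    """Advance one cooling step; return total energy lost (arbitrary units)."""
    lost = 0.0
    for i, t in enumerate(temps):
        dq = RATE * (t - T_ENV)
        temps[i] = t - dq
        lost += dq
    return lost

print(step(uniform))  # 35000.0
print(step(variant))  # ~35000, same up to random sampling noise
```

The two totals match because the loss law is linear, so summing over parcels only ever sees the mean.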
I'm thinking that a lack of adequate convection, and large-scale temperature variation in the mug (say, between the outer and inner areas of the water) may be more to blame, here.
The thing is, unless you have some pretty exotic stuff going on, your particles are going to collide and share energy and reach equilibrium on a much shorter time scale than they will lose energy to the environment.
Yeah, that's why I was looking at other explanations for ElJeffe's observations, such as the containers having different levels of insulation.
More specifically, temperature is a measurement of the kinetic energy of molecules.
Black T-shirts dry quicker in the sun than white T-shirts because they don't reflect radiation from the sun but absorb it as kinetic energy, which in turn transfers to the water in the shirt; the water then evaporates once its molecules are moving fast enough to become gaseous.
I'd imagine the same thing happens in a microwave - the radiation is converted to kinetic energy in the substance being microwaved, which makes it hot. When you remove something from the microwave, the only thing it is radiating is heat. It isn't dousing you in electromagnetic radiation.