Monday, June 28, 2010

Think You Know What Temperature Is?

My goal here is to describe the concept of temperature. I think most non-scientists would struggle to define the concept and few would get it right.

We have to build up to the definition. And because it involves the concept of entropy, let's start with an explanation of that term. Say you have a pair of common dice and you throw them onto a tabletop. What is the probability that you'll roll a 3 (meaning the dots on each die together sum to 3)? To answer this question, you have to figure out how many different ways there are to arrange the dice to get that sum. Here, it's pretty easy. The die that lands closest to you is a 1 and the one that lands farther away is a 2, or vice versa. So there are two ways to get a 3. Out of how many total unique arrangements of the dice? Well, 36, as for each of the 6 numbers the first die can take, the other die can take any of its 6 numbers. And 6x6 is 36. So the answer to our question, "What is the probability that you'll roll a 3?" is 2 / 36 = 0.0556 or 5.56%. OK, now what's the probability you'll roll a 7? There are 6 different ways to arrange the dice to get a 7 (1 and 6, 2 and 5, 3 and 4, 4 and 3, and so on). That means 6 / 36 or 16.67% of the time you'll roll a 7. In fact, 7 is the most probable sum you can roll. All other sums come up less than 16.67% of the time. (If you've played the dice game Craps, I'm sure you know this.)

OK, so let's create a couple of definitions in terms of the dice rolling. We'll call the sum of the two dice (i.e. what you roll) the macrostate and the different arrangements that give you this macrostate the microstates. So the answer to the question, "What are the various microstates that give you the macrostate 7?" is, written as ordered pairs, (1,6) (2,5) (3,4) (4,3) (5,2) (6,1). The macrostate 7 has 6 microstates. How many microstates does the macrostate 2 have? Just one. Snake eyes. (1,1). Let's call the number of microstates that correspond to a macrostate the multiplicity of the macrostate. Macrostate 2 then has a multiplicity of 1, and macrostate 7 has a multiplicity of 6.
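If you'd like to check these numbers yourself, here's a short Python sketch (my illustration, not part of the original argument) that enumerates all 36 microstates and counts the multiplicity of each macrostate:

```python
from collections import defaultdict
from fractions import Fraction

# Enumerate every microstate (ordered pair) of two dice and group
# them by macrostate (the sum of the two faces).
microstates = defaultdict(list)
for a in range(1, 7):
    for b in range(1, 7):
        microstates[a + b].append((a, b))

# Multiplicity of a macrostate = the number of its microstates.
multiplicity = {total: len(pairs) for total, pairs in microstates.items()}

print(multiplicity[7])                # 6 microstates for a 7
print(multiplicity[2])                # 1 microstate: snake eyes
print(Fraction(multiplicity[7], 36))  # probability 1/6, i.e. 16.67%
```

A macrostate's probability is just its multiplicity divided by 36, which is why 7, with multiplicity 6, wins.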

In this dice example, the probability of rolling the dice and getting the macrostate with the largest multiplicity (i.e. macrostate 7) is larger than the probability of any other macrostate, but it's not a lot larger. There's a 16.67% chance of getting a 7, but there's a 13.89% chance of getting either a 6 or 8. But let's now look at a different example: the number of ways to arrange a million atoms in a sealed glass aquarium that is otherwise empty. The atoms constitute a very dilute gas and are whizzing about throughout the enclosure. In this case, a microstate refers to the specific location and momentum of every atom, and the macrostate is the state of the system defined by a few large-scale properties like pressure, volume, temperature, or total mass of all the atoms. Common sense or experience will likely tell you that the most probable macrostate (i.e. the macrostate with the largest multiplicity) is a more-or-less uniform distribution of the atoms throughout the aquarium. That's correct, but to be a bit more quantitative about this, let's ask the following question: What is the chance that all the atoms will be on the left side of the tank? Let's imagine each atom has a 50% chance of being on the left side of the tank and a 50% chance of being on the right side. Each atom flips a theoretical coin and if it lands Heads Up then the atom is on the left side. Tails Up and the atom is on the right side. With only two atoms, what are the chances that both will get Heads Up and be on the left side of the tank? 1 in 4 or 25%. But what are the odds that one million coin flips for one million atoms will result in one million Heads Up? For all practical purposes, zero! Unless the atoms all start out on the left side, because I put them there, they'll never spontaneously all move to the left side. Now consider the probability of the atoms being uniformly spread out over the entire volume.
This macrostate has an unimaginably large multiplicity, as any one atom could take any other position in the entire volume and the macrostate would not change. This is not so in the previous case, where the macrostate is defined as all the atoms being on the left side of the tank. One of those atoms can't take just any other position in the entire volume. It can only take a position in half the volume (the left half). If it were to take a position in the right half, it would define a new macrostate.
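The coin-flip argument is easy to put in code. This little Python sketch (again, just my illustration) shows how fast the probability of "all atoms on the left" dies off as the number of atoms grows:

```python
# Chance that every one of N atoms independently lands on the left
# half of the tank: (1/2)**N, by the coin-flip model.
def prob_all_left(n_atoms: int) -> float:
    return 0.5 ** n_atoms

print(prob_all_left(2))          # 0.25, as with the two-atom case
print(prob_all_left(100))        # already around 8e-31
print(prob_all_left(1_000_000))  # underflows to 0.0: "practically zero"
```

Even at a hundred atoms the odds are hopeless; at a million, the number is so small that a computer's floating-point arithmetic can't even represent it.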

Now, a system doesn't have to start out in the macrostate of largest multiplicity. I can force it to start out in an unlikely macrostate. But given time, it's going to naturally (without any kind of conscious master plan, of course) evolve to that most probable macrostate.

To step from macrostates and multiplicity to entropy is rather easy. Entropy is simply defined as the natural logarithm of the multiplicity. Entropy = log(multiplicity) where the log has base 'e'. It's also common to write, Entropy = ln(multiplicity). If you're not familiar with a natural logarithm (note: the "natural" means base 'e'), it's simply this: the natural logarithm of a number x is the power to which you would have to raise 'e' to get x. 'e' is just a number, itself, albeit a special one: 2.71828... So, ln(4) = 1.386... because (2.71828)^(1.386) = 4. Anyway, the main point is that entropy is a measure of multiplicity, but, fortunately for people who have to deal with it, a lot smaller in value. Now, let's say we do start out with an aquarium tank of atoms in the gas state, and initially they are all trapped on the left side of the tank behind a partition. When I remove the partition, the system evolves to the macrostate of largest multiplicity. As the atoms are filling the tank, the system is passing through different macrostates on its way towards the most likely one, and its entropy is climbing towards ... its maximum! Because ln(x) > ln(y) when x > y (e.g. ln(5) > ln(4)), the entropy will achieve its largest value when the macrostate with the largest multiplicity is achieved. And we know that this state is always achieved, given time and without external influences. (This is basically the Second Law of Thermodynamics, just so you know. Succinctly put, an isolated macroscopic system will evolve to the macrostate of largest entropy and will then remain there.)
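Here's a Python sketch of the entropy-as-ln(multiplicity) idea, using the left-half/right-half coin-flip model from above (my simplification: a macrostate is just "k of the n atoms are in the left half", so its multiplicity is the binomial coefficient C(n, k); math.lgamma computes ln(x!) without building astronomically large integers):

```python
import math

def ln_multiplicity(n_atoms: int, n_left: int) -> float:
    """Entropy = ln(multiplicity) for the macrostate 'n_left of
    n_atoms atoms are in the left half'. Multiplicity = C(n, k),
    computed in log space: ln(x!) = lgamma(x + 1)."""
    n, k = n_atoms, n_left
    return math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)

n = 1_000_000
print(ln_multiplicity(n, n))       # all atoms on the left: ln(1) = 0
print(ln_multiplicity(n, n // 2))  # evenly spread: about 693,140 -- the max
```

The all-on-the-left macrostate has entropy zero (one microstate), while the evenly-spread macrostate's entropy is enormous, which is exactly why the gas expands to fill the tank and stays that way.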

Temperature is defined in terms of entropy, but before we get to the definition, we need to address the concept of heat. My dictionary defines heat as "a form of transferred energy that arises from the random motion of molecules and is felt as temperature, especially as warmth or hotness." The word "transferred" is key, as only energy in transit may be correctly called "heat." An object cannot possess or contain a certain amount of heat. (A lot of people get this wrong!) It can contain energy, though, and this energy, in transit, is how one object can heat another. Therefore, we should try to avoid using the word heat as a noun and stick to using it as a verb. The definition as a verb is this: "To become or make something warm or hot." It's fine to say, "I'm heating the water." It's not okay to say, "I'm losing heat through the top of my head; I should wear a cap." OK, so let's say we place a "hot" object on top of a "cold" object. What happens? The cold object gets hotter and the hot object gets colder. The hot object heats the cold object by transferring energy to it. This energy is passed along by vibrating atoms in the hotter object knocking into vibrating atoms in the colder object, causing them to vibrate more strongly. Even though many people say this, I won't go so far as to say that temperature is a measure of this vibrating energy per particle. It is true that in certain situations the energy per particle is directly proportional to temperature, but this is only true when the particles, the atoms, are spread out enough so that they rarely run into one another. And this is only found in dilute gases. So, even though some people say that temperature is just a measure of average kinetic energy or thermal energy, it's really not this simple. Sometimes it is, but usually it's not. Hence, this long article.

So what is temperature? I can give you a general, qualitative answer at this point, but a quantitative answer will take a bit more preparation. Here's the general, qualitative answer: Temperature is hotness measured on some definite scale. That is, it is a number that we assign to objects as a way to rank those objects in a sequence that indicates which object will gain energy (and which will lose energy) by heating when they are placed in thermal contact. Using this definition, we can say that a cold pan placed on a hot stove has a lower temperature than the stove burner because energy is being passed as heat from the stove burner into the pan.

Say you take two blocks of metal, one a fair bit hotter than the other, and you touch them together. We know that energy will transfer from the hotter object to the colder object until an equilibrium state is reached, at which point both objects will have the same temperature. (I know I'm using the word "temperature" before formally defining it, so, ummm, just use the general definition from the paragraph above.) What else can we say about this condition of equilibrium? We can say that the total macrostate of the combined system is, upon reaching equilibrium, the macrostate with the largest multiplicity. That is, as one object heats the other, entropy is increasing, and when the energy exchange stops, upon reaching an equilibrium state, entropy has been maximized. Now, at this equilibrium state, and only at this state, moving an infinitesimal (read: very, very small) bit of energy from one object to the other has a negligible effect on the total entropy. Such a relocation of energy would increase the entropy of one of the blocks just as much as it decreased the entropy of the other block. (This is not the case before equilibrium is reached.) Therefore, we can set up an equation (involving two partial derivatives) to define this equilibrium state: (∂s1/∂u1) = (∂s2/∂u2), where s is entropy and u is energy and 1 and 2 refer to blocks 1 and 2. This is an ugly HTML way of saying: the change in entropy in block 1 due to a tiny increase (or decrease) in energy is equal to the change in entropy in block 2 due to a tiny decrease (or increase) in energy (of the same size as the change in block 1, due to conservation of energy). That is, total entropy of the system (blocks 1 and 2) is not changing. Now, because we have this equation which is valid when the two objects are in thermal equilibrium, we can make a meaningful connection between temperature and the terms in the equation. First, we agree that when objects are in thermal equilibrium, they have the same temperature.
This is not some natural law but is simply part of our definition of temperature. And so here it goes: the quantity (the partial derivative) ∂s/∂u, i.e. the change in an object's entropy per infinitesimal change in its energy by heating, is equal to ... well, not its temperature, but 1 divided by its temperature. 1/T = ∂s/∂u --> T = 1 / (∂s/∂u). This is our definition of temperature!

As stated before, (∂s1/∂u1) does not equal (∂s2/∂u2) before equilibrium is reached. Initially, one of these two derivatives is larger than the other, and the owner of the larger derivative is going to be the block that gains energy. Think about it for a second and you'll realize, entropy is going to grow to its maximum value only if the block that is "better at converting energy to entropy", meaning it has the larger derivative, is getting energy from the block that is "worse at converting energy to entropy." From the above, we know that the block with the greater derivative will be the colder one. And it will keep taking energy from the hotter block until it and the hotter block reach some common intermediate temperature, at which point energy transfer no longer increases entropy and the system has reached its most probable macrostate. Now, once equilibrium is reached, energy does not cease to move back and forth between the two blocks; there will be small fluctuations in energy in each block (up in one, down in the other), but these changes are simply described by different microstates that all belong to the same most-probable macrostate. Temperature, which is a property that describes a macrostate, is now constant.
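To watch this play out numerically, here's a Python sketch built on the Einstein solid, a standard toy model that I'm bringing in for illustration (the argument above doesn't depend on it). Each block is a set of oscillators sharing energy in whole units, with multiplicity C(q + N - 1, q). We split the total energy every possible way between the two blocks, find the split with the largest total entropy, and check that ∂s/∂u (approximated by a finite difference of one energy unit) comes out nearly equal for the two blocks at that split:

```python
import math

def entropy(n_osc: int, q: int) -> float:
    # Einstein-solid multiplicity: C(q + n - 1, q); entropy = ln of it.
    return math.log(math.comb(q + n_osc - 1, q))

N1, N2, Q = 300, 200, 100   # oscillators in each block, total energy units

# Total entropy for every way of splitting the Q units between the blocks.
S = [entropy(N1, q1) + entropy(N2, Q - q1) for q1 in range(Q + 1)]
q1_star = max(range(Q + 1), key=lambda q1: S[q1])
print(q1_star)  # equilibrium split, close to Q * N1 / (N1 + N2) = 60

# At equilibrium, ds1/du1 and ds2/du2 (one-unit finite differences) agree:
dS1 = entropy(N1, q1_star + 1) - entropy(N1, q1_star)
dS2 = entropy(N2, Q - q1_star + 1) - entropy(N2, Q - q1_star)
print(round(dS1, 3), round(dS2, 3))  # nearly equal -> equal temperatures
```

Away from q1_star the two derivatives differ, and shoving energy toward the block with the bigger one raises the total entropy, which is exactly the "better at converting energy to entropy" story above.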

One last thing. The way I've defined things, our temperature T is actually called the fundamental temperature. And its units are the units of energy, not units of conventional temperature like kelvin or degrees Celsius. That's fine. No one ever said temperature had to be measured in some particular units like degrees Celsius. People just like using those units. But to arrive at the conventional temperature we're used to dealing with, divide the fundamental temperature by a constant called the Boltzmann constant. Its value is 1.381 x10^-23 joules/kelvin. Yes, it's very small. This gives us the temperature in kelvin, which is just the temperature in degrees Celsius plus 273.15. Turned around, °C = K - 273.15. That is, 283.15 K = 10 °C.
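In code, the whole chain of conversions is trivial, but writing it out keeps the units straight:

```python
K_B = 1.381e-23  # Boltzmann constant, joules per kelvin

def fundamental_to_kelvin(t_fund_joules: float) -> float:
    # Conventional temperature = fundamental temperature / k_B.
    return t_fund_joules / K_B

def kelvin_to_celsius(t_kelvin: float) -> float:
    return t_kelvin - 273.15

# Room temperature (293.15 K) expressed as a fundamental temperature
# is a tiny number of joules, about 4e-21:
t_fund = 293.15 * K_B
print(fundamental_to_kelvin(t_fund))  # back to ~293.15 K
print(kelvin_to_celsius(283.15))      # ~10 degrees C, as in the text
```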

Wednesday, February 3, 2010

Why Must Gravity Exist?

Why must gravity exist?

To begin with, we accept (gratefully) that the laws of physics are the same everywhere in the universe. This didn't have to be the case, but it appears that it is the case. A person throws a ball into the air and it falls back to the Earth. An observer here in Austin would describe the motion of the ball in the same way, using the same equations, as an observer in Boston, Berlin or Bangkok. I push on a rock, and it pushes back on me (with an equal but oppositely directed force), both on Earth and on the Moon. Even on Pluto. A proton is attracted to an electron with a force determined according to a certain equation, valid here in the Milky Way galaxy, but also in other galaxies. An experiment repeated in many different locations may yield different answers, but each time the answer will be calculated by using the same equation. Gravity is weaker at the top of Mount Everest, far from the Earth's center, but a ball dropped from that height falls according to the same equation of motion as a ball dropped from my apartment balcony.

Taking things a step further, the laws of physics are the same whether you are standing still or moving at a constant velocity. (Is standing still even possible? I may be at rest on the Earth, but the Earth is hurtling around the sun, so I'm moving relative to the sun. If anything is moving, everything is moving. It's all relative!) Flying in an airplane, if you were to toss a coin up and forward, it would trace out a parabola on its way to the ground, just as it would do if you were standing firmly on the Earth when you tossed it. If the aisles were wide enough and you didn't fear arrest, you could play catch with another passenger, tossing a baseball back and forth. It would be no different than if you were doing so in Central Park, New York. The ball wouldn't behave differently. Unless the plane hit an air pocket, that is. Or slowed down suddenly. Or turned. Then the ball would appear to move in some arbitrary way. But we're assuming a constant velocity. Moving at a constant speed in a straight line. If the ride were especially smooth, and all of the windows of the airplane were closed, you wouldn't even be able to tell if you were moving. Just like we can't tell that we (and the Earth) are moving through space right now, orbiting the sun. If everything around you is moving at the same speed you are, then as far as you're concerned, nothing is moving. So in summary, regardless of how fast I'm moving, as long as I'm moving at a constant velocity, I can describe some phenomena using the same laws of physics as someone else that is moving at a different constant velocity. I, standing in the middle of a basketball court, will describe the path of a falling ball using the same equation as a boy sailing across the floor on his skateboard, traveling a constant 5 mph. The universe was just built that way.

Okay, so we're getting closer to answering the original question: why does gravity exist? Hey, if it only took a few sentences to explain it, you'd probably already know it!

Our goal: a world in which the laws of physics are the same to all observers. Are we there, yet? No, not quite. Now we come to acceleration. Say gravity didn't exist. There would be nothing pressing you against your seat. You might float right off the chair as you pressed on your keyboard and it pushed back on you. But what if a rocket was affixed to the bottom of the chair? Upon ignition, it would propel your chair, and your chair would propel you, up towards the ceiling. If you closed your eyes and plugged your ears, all you'd notice would be the sensation of the chair pressing against your butt. It would feel no different than if gravity were pulling you into the seat. Einstein hit upon this idea. Gravity and acceleration are the same thing! As far as scientific experiments designed to test this idea are concerned, if you were in a steadily accelerating windowless elevator, accelerating at the same rate forever, you wouldn't be able to tell if you were accelerating or standing still on the surface of a planet (with gravity pulling you to the ground). Were it not for gravity, you'd be able to tell the difference. Without gravity, if you were standing, with feet pressed against a surface, you would know that the surface was accelerating upwards against your feet. Acceleration would be absolute, not relative. (But we want it to be relative. It shouldn't matter whether or not you are accelerating when you choose a set of equations to model some phenomenon. Just like it shouldn't matter where you are or what your relative velocity is compared to someone or something else.) Without gravity, if you held a ball in front of you and released it, you would witness one of two outcomes. The ball might hover there in front of you or it might "fall" towards the ground. Since there's no gravity, the only way to explain its "fall" would be to assert that the ground (and you) were accelerating upwards while the ball was at rest.
Once it came in contact with the ground, it would start accelerating upwards with you, resting there at your feet. You see, the ball would behave differently based on whether or not you were accelerating. A person accelerating would describe the motion of the ball using different equations than would a person not accelerating. The laws of physics would not be the same for the two observers. To ensure that the laws of physics are indeed the same for all observers, gravity must exist! With gravity, we can use the same set of equations to describe that ball's path, regardless of whether we are standing on the surface of a planet or steadily accelerating through space in a windowless elevator.

So that's why gravity must exist in a universe in which the laws of physics are the same to all observers, no matter where those observers are, whether they are standing still, moving at a steady velocity, or accelerating.

Wednesday, January 20, 2010

Glass: Liquid or Solid?

Is glass a liquid or a solid? Actually, the answer is not straightforward.

Solids, made of atoms or molecules that lack the thermal energy to bump past one another and so get locked into place by electric attraction, are generally classified as crystalline or amorphous. In a crystalline solid, the atoms or molecules lie in an orderly array (i.e. they have an orderly internal structure). Not so in amorphous solids, in which the atoms or molecules lie in a random jumble.

Glass is an amorphous solid. However, some chemists prefer to label glass as a supercooled liquid. Normally, the cooling of a substance from a very hot liquid state results in the crystallization of the substance. Its atoms or molecules line up in an orderly manner as they settle into place. Glass is produced by cooling a molten liquid fast enough that crystallization doesn't occur. As the glass cools, the time needed for it to exhibit liquid behavior, such as flowing, increases and reaches extremes. It becomes like a very thick syrup, so viscous (or resistant to flow) that liquid behavior becomes noticeable only on a geologic timescale (as in billions of years).

Another defining characteristic of glasses, which differentiates them from normal solids, is that they lack a latent heat of fusion. The latent heat of fusion is the amount of thermal energy which must be absorbed or lost for a substance to change states (from solid to liquid or vice versa). In other words, if you were to put a container of liquid in a room set to a temperature well below the liquid's freezing point, the liquid would begin giving off heat and decrease in temperature. When it reached its freezing point, however, it would remain at a constant temperature (the freezing point) as it continued to give off heat. This would be the actual phase transition, and the heat given off during this transition is termed the "latent heat of fusion". After the liquid solidified, it would start dropping in temperature again as it continued releasing heat. It would stop releasing heat when its temperature matched the temperature of the room. Water, for example, has a specific amount of heat it must release, per gram, as it transitions from liquid water to ice. While it's releasing this heat of fusion, its temperature remains steady at 0 degrees C. Glass, on the other hand, never enters a stage during its production in which its temperature stops decreasing as it continues to give off heat. There is no defined phase transition between molten glass and "solid" glass.
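To put numbers on that water example, here's a quick Python sketch. The figures are the well-known values for water, roughly 334 J/g for the latent heat of fusion and 4.18 J/(g·K) for the specific heat of the liquid; they're mine, not quoted from the post:

```python
# Standard (approximate) values for water:
L_FUSION = 334.0   # latent heat of fusion, joules per gram
C_LIQUID = 4.18    # specific heat of liquid water, J/(g*K)

def heat_released(mass_g: float, t_start_c: float) -> float:
    """Heat given off by liquid water cooling from t_start_c down to
    0 C and then freezing completely. The temperature holds steady
    at 0 C during the entire phase transition."""
    cooling = mass_g * C_LIQUID * t_start_c  # liquid cools to 0 C
    freezing = mass_g * L_FUSION             # latent heat, at constant 0 C
    return cooling + freezing

print(heat_released(100.0, 20.0))  # 100 g from 20 C: 8360 + 33400 J
```

Notice that the freezing term dwarfs the cooling term; that plateau at 0 °C is where most of the energy comes out, and it's exactly the plateau that glass never exhibits.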

As a side note, some people claim to have seen evidence of flow in very old glass (perhaps very old windows in a historic building). These claims are not valid. Because of the amount of time needed for glass to start exhibiting liquid behavior, no human will ever witness such a thing.

Saturday, January 2, 2010

Entropy and Ice

There are many more ways to be messy than organized. Things tend toward disorder because, statistically, there are just so many more ways to be disordered than ordered. This is the central idea behind the Second Law of Thermodynamics. Exhale and the carbon dioxide that comes from your mouth is not likely to float there in front of your face, bound up like a little ball of gas. It's not likely to stay so organized. The molecules are going to randomly move about and distance themselves from one another. There's no force pushing them apart; it's random motion. Divide the room you're in into a billion equal-sized cubes. The cubes right in front of your mouth have a large number of carbon dioxide (CO2) molecules in them, right after you exhale. The other cubes around the room have some CO2 molecules scattered about them. With all these molecules whizzing about at large velocities, what are the odds that as many CO2 molecules will enter the cubes (the space; the volume) in front of your mouth as will leave that volume? Practically zero. The universe naturally tends towards disorder. In fact, nothing will happen spontaneously unless it (i.e. the process, the reaction) increases the disorder of the universe. Entropy is a property of a system that measures the system's disorder. When a system becomes more disordered, we can say its entropy has increased. While we can talk of entropy in a qualitative sense, it's a number. (Its units are joules per kelvin; energy is measured in joules, while temperature is measured in kelvins, so we have amount of energy per unit (think degree) of temperature.) We see some process occur spontaneously and we think, entropy must have increased.

Go outside on a winter day and you may see ice -- solid water. And solids are more organized than liquids. In a piece of ice the water molecules are arranged in a certain pattern, and the ice itself takes up a small amount of space and doesn't spread out like liquid water. How could water suddenly become more organized in its structure? The ice you see formed spontaneously. Does this violate the Second Law of Thermodynamics, which states that the entropy of an isolated system must increase for any spontaneous process?

It's true that the entropy of the water decreased when the ice froze, but what about the entropy of the "isolated system"? Does the water/ice constitute an isolated system? No. It's in direct contact with the ground and with the air. We realize the Second Law holds only when we realize that the entropy of something else increased when the entropy of the water decreased. And not only did the entropy of something else increase, but it increased by more than the entropy of the water decreased. What became more disordered in this process? The air around the water-turned-ice. When ice freezes, it releases heat, and that heat goes into the air. The heat warms the air and gets some air molecules to speed up a bit. It increases the air's local temperature ever so slightly.

I need to mention that our intuitive notion of temperature is a result of atoms and molecules in motion (i.e. thermal energy). When something is heated up, its atoms/molecules move and vibrate more rapidly. When something is cooled down, its atoms/molecules move and vibrate less rapidly. Our skin can sense these motions, in the aggregate, and our brains interpret these sensations as temperature. With increased thermal energy comes increased entropy (or disorder). So hot air is more disordered than cold air. And, in our example, the air around the water-turned-ice is slightly more disordered than before the water froze, because the air has just been heated up a bit.

What dictates when the increase in the entropy of the air around a pool of water is enough to allow the water to become more organized (i.e. to turn to ice)? We now know that the increase in the entropy of the air must be greater than the decrease in the entropy of the water for such a process to occur spontaneously. (Thanks to the Second Law of Thermodynamics.) When is this the case? Why doesn't a pool of water, out on the sidewalk, turn to ice on a summer day? Again, what determines when the entropy of the air will increase by a greater amount than the entropy of the water will decrease? ... The initial temperature of the air, of course. But why?

Scenario A: I'm in the mood to dance. I go to a night club and make my way to the dance floor. Some hundred people are already shaking their bodies to the beat. I join in.
Scenario B: I'm in the mood to dance. I head over to the campus library and walk directly to the large reading room. Some hundred students are spread about the room, at tables, quietly reading. I start dancing in the middle of the room. They stare.

In which scenario have I contributed most to the energy in the room? Definitely scenario B. At the night club, there's already a scene. A hundred people dancing. I go relatively unnoticed. At the library, however, I'm really stirring things up. I'm definitely noticed. With it so calm and quiet prior to my arrival, I significantly contribute to the energy in the room once I start dancing. Likewise, adding heat to a cool system differs from adding heat to a hot system. "[T]he arriving [thermal] energy more noticeably stirs up the molecules of a cool system, which have little thermal motion, than those of a hot system, in which the molecules are already moving vigorously." (Chemical Principles, 3rd edition, Atkins & Jones, p. 247)

We can now reason that the heat released by freezing water really stirs things up on a cold day, when the air molecules have little thermal motion. And the heat that would be released by the water on a warm day would be an insignificant contribution to the thermal motion of the warm air. It would add little to the entropy of the air. So for a given amount of heat, introduced into a system, entropy will increase more when the system is at a low temperature than when it is at a high temperature. What's the magic temperature below which a bit of released heat (from water) will increase the entropy of the air more than the entropy of the water will decrease? Zero degrees centigrade, of course. Water, we know, freezes at 0 degrees C. When the outside air temperature is greater than 0 degrees C, the heat that would come from the freezing of water is not capable of increasing the entropy of the air enough to warrant the freezing. Only when the temperature is 0 degrees or less is the entropy of the entire system increased by the freezing of water, so only then is freezing spontaneous.
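We can make this bookkeeping quantitative with the standard relation ΔS = q/T (entropy change when heat q is transferred at temperature T). Here's a Python sketch; the latent heat of fusion of water (about 334 J/g) is a standard value I'm supplying, not something from the post:

```python
L_FUSION = 334.0   # J/g, latent heat of fusion of water (standard value)
T_FREEZE = 273.15  # K, freezing point of water

def net_entropy_change(t_air_celsius: float, mass_g: float = 1.0) -> float:
    """Total entropy change when mass_g of water at 0 C freezes.
    The water loses q / T_FREEZE of entropy; the surrounding air
    gains q / T_air. Freezing is spontaneous only if the sum > 0."""
    q = mass_g * L_FUSION
    t_air = t_air_celsius + 273.15
    return q / t_air - q / T_FREEZE

print(net_entropy_change(-10.0))  # cold day: positive -> water freezes
print(net_entropy_change(25.0))   # warm day: negative -> no freezing
print(net_entropy_change(0.0))    # at 0 C exactly: zero, the tipping point
```

Because the air's entropy gain q/T_air shrinks as T_air grows, the sum flips sign precisely at 0 °C, which is the "magic temperature" the paragraph above arrives at.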

Sources: Chemical Principles, 3rd edition, by Atkins & Jones;
Principles of Chemistry, by Michael Munowitz