My goal here is to describe the concept of temperature. I suspect most non-scientists would struggle to define it, and few would get it right.
We have to build up to the definition. And because it involves the concept of entropy, let's start with an explanation of that term. Say you have a pair of common dice and you throw them onto a tabletop. What is the probability that you'll roll a 3 (meaning the dots on the two dice together sum to 3)? To answer this question, you have to figure out how many different ways there are to arrange the dice to get that sum. Here, it's pretty easy. The die that lands closest to you is a 1 and the one that lands farther away is a 2, or vice versa. So there are two ways to get a 3. Out of how many total unique arrangements of the dice? Well, 36, as for each of the 6 numbers the first die can take, the other die can take any of its 6 numbers. And 6 x 6 is 36. So the answer to our question, "What is the probability that you'll roll a 3?" is 2/36 = 0.0556, or 5.56%. OK, now what's the probability you'll roll a 7? There are 6 different ways to arrange the dice to get a 7 (1 and 6, 2 and 5, 3 and 4, 4 and 3, 5 and 2, 6 and 1). That means 6/36, or 16.67%, of the time you'll roll a 7. In fact, 7 is the most probable sum you can roll. All other sums come up less than 16.67% of the time. (If you've played the dice game Craps, I'm sure you know this.)
OK, so let's create a couple of definitions in terms of the dice rolling. We'll call the sum of the two dice (i.e. what you roll) the macrostate and the different arrangements that give you this macrostate the microstates. So the answer to the question, "What are the various microstates that give you the macrostate 7?" is, written as ordered pairs, (1,6) (2,5) (3,4) (4,3) (5,2) (6,1). The macrostate 7 has 6 microstates. How many microstates does the macrostate 2 have? Just one. Snake eyes. (1,1). Let's call the number of microstates that correspond to a macrostate the multiplicity of the macrostate. Macrostate 2 then has a multiplicity of 1, and macrostate 7 has a multiplicity of 6.
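All of this bookkeeping is easy to check with a few lines of Python. Here's a quick sketch that enumerates every microstate and tallies the multiplicity of each macrostate:

```python
from collections import Counter

# Enumerate all 36 microstates (ordered pairs of dice) and tally the
# multiplicity of each macrostate (the sum of the two dice).
multiplicity = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))

print(multiplicity[2])        # 1  (snake eyes is the only way)
print(multiplicity[3])        # 2  ((1,2) and (2,1))
print(multiplicity[7])        # 6  (the most probable sum)
print(multiplicity[7] / 36)   # ≈ 0.1667, the 16.67% chance of rolling a 7
```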
In this dice example, the probability of rolling the dice and getting the macrostate with the largest multiplicity (i.e. macrostate 7) is larger than the probability of any other macrostate, but it's not a lot larger. There's a 16.67% chance of getting a 7, but there's a 13.89% chance of getting either a 6 or an 8. But let's now look at a different example: the number of ways to arrange a million atoms in a sealed glass aquarium that is otherwise empty. The atoms constitute a very dilute gas and are whizzing about throughout the enclosure. In this case, a microstate refers to the specific location and momentum of every atom, and the macrostate is the state of the system defined by a few large-scale properties like pressure, volume, temperature, or total mass of all the atoms. Common sense or experience will likely tell you that the most probable macrostate (i.e. the macrostate with the largest multiplicity) is a more-or-less uniform distribution of the atoms throughout the aquarium. That's correct, but to be a bit more quantitative about this, let's ask the following question: What is the chance that all the atoms will be on the left side of the tank? Let's imagine each atom has a 50% chance of being on the left side of the tank and a 50% chance of being on the right side. Each atom flips a theoretical coin: if it lands Heads Up, the atom is on the left side; Tails Up, and the atom is on the right side. With only two atoms, what are the chances that both will get Heads Up and be on the left side of the tank? 1 in 4, or 25%. But what are the odds that one million coin flips for one million atoms will result in one million Heads Up? For all practical purposes, zero! Unless the atoms all start out on the left side, because I put them there, they'll never spontaneously all move to the left side. Now consider the probability of the atoms being uniformly spread out over the entire volume.
This macrostate has an unimaginably large multiplicity, as any one atom could take any other position in the entire volume and the macrostate would not change. This is not so in the previous example, where the macrostate is defined as all the atoms being on the left side of the tank. There, an atom can't take just any position in the entire volume. It can only take a position in half the volume (the left half). If it were to take a position in the right half, it would define a new macrostate.
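To put numbers on the coin-flip argument, here's a short Python sketch (the helper name p_all_left is just for illustration) showing how fast the all-on-the-left probability collapses as the atom count grows:

```python
import math

# Each atom independently "flips a coin": probability 1/2 of being on
# the left side, so P(all N atoms on the left) = (1/2)**N.
def p_all_left(n_atoms):
    return 0.5 ** n_atoms

print(p_all_left(2))     # 0.25, the two-atom case from the text
print(p_all_left(100))   # ~7.9e-31, already hopeless at just 100 atoms

# For a million atoms the result underflows ordinary floating point,
# so use logarithms to see just how small the probability is:
log10_p = -1_000_000 * math.log10(2)
print(log10_p)           # ≈ -301030: the probability is about 10**(-301030)
```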
Now, a system doesn't have to start out in the macrostate of largest multiplicity. I can force it to start out in an unlikely macrostate. But given time, it's going to naturally (without any kind of conscious master plan, of course) evolve to that most probable macrostate.
To step from macrostates and multiplicity to entropy is rather easy. Entropy is simply defined as the natural logarithm of the multiplicity. Entropy = log(multiplicity), where the log has base 'e'. It's also common to write Entropy = ln(multiplicity). If you're not familiar with the natural logarithm (the "natural" means base 'e'), it's simply this: the natural logarithm of a number x is the power you would have to raise 'e' to in order to get x. 'e' is just a number itself, albeit a special one: 2.71828... So, ln(4) = 1.386... because (2.71828)^(1.386) = 4. Anyway, the main point is that entropy is a measure of multiplicity, but, fortunately for people who have to deal with it, a lot smaller in value. Now, let's say we do start out with an aquarium tank of atoms in the gas state, and initially they are all trapped on the left side of the tank behind a partition. When I remove the partition, the system evolves to the macrostate of largest multiplicity. As the atoms are filling the tank, the system is passing through different macrostates on its way towards the most likely one, and its entropy is climbing towards ... its maximum! Because ln(x) > ln(y) when x > y (e.g. ln(5) > ln(4)), the entropy will achieve its largest value when the macrostate with the largest multiplicity is achieved. And we know that this state is always achieved, given time and without external influences. (This is basically the Second Law of Thermodynamics, just so you know. Succinctly put: an isolated macroscopic system will evolve to the macrostate of largest entropy and will then remain there.)
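As a toy illustration (counting only left/right positional microstates and ignoring momenta, which is a big simplification), the entropy jump from removing the partition works out like this in Python:

```python
import math

# Toy model: count only positional microstates. If each of N atoms can
# sit in either half of the tank, the spread-out macrostate has about
# 2**N times the multiplicity of the all-on-the-left macrostate.
# Entropy = ln(multiplicity), so removing the partition raises the
# entropy by roughly ln(2**N) = N * ln(2).
N = 1_000_000
delta_S = N * math.log(2)
print(delta_S)                # ≈ 693147, vs. a multiplicity ratio of 2**1000000

# And the ln() sanity check from the text:
print(math.log(4))            # 1.386...
print(math.e ** math.log(4))  # ≈ 4.0, recovering x from ln(x)
```

Notice how tame the entropy number is compared to the multiplicity ratio it stands in for.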
Temperature is defined in terms of entropy, but before we get to the definition, we need to address the concept of heat. My dictionary defines heat as "a form of transferred energy that arises from the random motion of molecules and is felt as temperature, especially as warmth or hotness." The word "transferred" is key, as only energy in transit may be correctly called "heat." An object cannot possess or contain a certain amount of heat. (A lot of people get this wrong!) It can contain energy, though, and this energy, in transit, is how one object can heat another. Therefore, we should try to avoid using the word heat as a noun and stick to using it as a verb. The definition as a verb is this: "to become or make something warm or hot." It's fine to say, "I'm heating the water." It's not okay to say, "I'm losing heat through the top of my head; I should wear a cap." OK, so let's say we place a "hot" object on top of a "cold" object. What happens? The cold object gets hotter and the hot object gets colder. The hot object heats the cold object by transferring energy to it. This energy is passed along by vibrating atoms in the hotter object knocking into vibrating atoms in the colder object, causing them to vibrate more strongly. Many people will tell you that temperature is a measure of this vibrational energy per particle, but I won't go that far. It is true that in certain situations the energy per particle is directly proportional to temperature, but only when the particles, the atoms, are spread out enough that they rarely run into one another, and that is only the case in dilute gases. So, even though some people say that temperature is just a measure of average kinetic energy or thermal energy, it's really not this simple. Sometimes it is, but usually it's not. Hence, this long article.
So what is temperature? I can give you a general, qualitative answer at this point, but a quantitative answer will take a bit more preparation. Here's the general, qualitative answer: Temperature is hotness measured on some definite scale. That is, it is a number that we assign to objects as a way to rank those objects in a sequence that indicates which object will gain energy (and which will lose energy) by heating when they are placed in thermal contact. Using this definition, we can say that a cold pan placed on a hot stove has a lower temperature than the stove burner because energy is being passed as heat from the stove burner into the pan.
Say you take two blocks of metal, one a fair bit hotter than the other, and you touch them together. We know that energy will transfer from the hotter object to the colder object until an equilibrium state is reached, at which point both objects will have the same temperature. (I know I'm using the word "temperature" before formally defining it, so, ummm, just use the general definition from the paragraph above.) What else can we say about this condition of equilibrium? We can say that the total macrostate of the combined system is, upon reaching equilibrium, the macrostate with the largest multiplicity. That is, as one object heats the other, entropy is increasing, and when the energy exchange stops, upon reaching an equilibrium state, entropy has been maximized. Now, at this equilibrium state, and only at this state, moving an infinitesimal (read: very, very small) bit of energy from one object to the other has a negligible effect on the total entropy. Such a relocation of energy would increase the entropy of one of the blocks just as much as it decreased the entropy of the other block. (This is not the case before equilibrium is reached.) Therefore, we can set up an equation (involving two partial derivatives) to define this equilibrium state: (∂s1/∂u1) = (∂s2/∂u2), where s is entropy and u is energy and 1 and 2 refer to blocks 1 and 2. This is an ugly HTML way of saying: the change in entropy in block 1 due to a tiny increase (or decrease) in energy is equal to the change in entropy in block 2 due to a tiny decrease (or increase) in energy (of the same size as the change in block 1, due to conservation of energy). That is, the total entropy of the system (blocks 1 and 2) is not changing. Now, because we have this equation, which is valid when the two objects are in thermal equilibrium, we can make a meaningful connection between temperature and the terms in the equation. First, we agree that when objects are in thermal equilibrium, they have the same temperature.
This is not some natural law but is simply part of our definition of temperature. And so here it is: the quantity (the partial derivative) ∂s/∂u, i.e. the change in an object's entropy per infinitesimal change in its energy by heating, is equal to ... well, not its temperature, but 1 divided by its temperature. 1/T = ∂s/∂u --> T = 1 / (∂s/∂u). This is our definition of temperature!
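To see the definition in action, here's a Python sketch that applies T = 1/(∂s/∂u) to a model entropy function. The form s(u) = (3N/2) ln u + const is borrowed from the standard statistical-mechanics treatment of a dilute monatomic gas; it's an assumption imported here, not something derived in this article:

```python
import math

N = 1_000  # number of atoms in the object (a made-up value)

def s(u):
    # Model entropy: s(u) = (3N/2) * ln(u), the dilute-monatomic-gas form.
    return 1.5 * N * math.log(u)

def temperature(u, du=1e-4):
    # T = 1 / (ds/du), with the derivative taken numerically
    # (central difference).
    dsdu = (s(u + du) - s(u - du)) / (2 * du)
    return 1.0 / dsdu

u = 50.0                   # the object's energy, in fundamental units
print(temperature(u))      # ≈ 0.0333
print(2 * u / (3 * N))     # analytic check: ds/du = 3N/(2u), so T = 2u/(3N)
```

For this particular model, temperature does come out proportional to energy per particle, which is exactly the dilute-gas special case mentioned earlier.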
As stated before, (∂s1/∂u1) does not equal (∂s2/∂u2) before equilibrium is reached. Initially, one of these two derivatives is larger than the other, and the owner of the larger derivative is going to be the block that gains energy. Think about it for a second and you'll realize that entropy is going to grow to its maximum value only if the block that is "better at converting energy to entropy," meaning it has the larger derivative, is getting energy from the block that is "worse at converting energy to entropy." From the above, we know that the block with the greater derivative will be the colder one. And it will keep taking energy from the hotter block until it and the hotter block reach some common intermediate temperature, at which point energy transfer no longer increases entropy and the system has reached its most probable macrostate. Now, once equilibrium is reached, energy does not cease to move back and forth between the two blocks; there will be small fluctuations in energy in each block (up in one, down in the other), but these changes are simply described by different microstates that all belong to the same most-probable macrostate. Temperature, which is a property that describes a macrostate, is now constant.
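This whole story can be simulated with a toy model. Below, each block's entropy is taken to be c·ln(u), where c plays the role of the block's size or heat capacity (an illustrative assumption, not the entropy of any real metal). Energy moves in small steps in whichever direction increases total entropy, and the flow stops exactly when the two derivatives, and hence the two temperatures, match:

```python
import math

# Toy model of two blocks in thermal contact: s_i(u) = c_i * ln(u).
c1, c2 = 100.0, 300.0
u1, u2 = 900.0, 100.0        # block 1 starts hot, block 2 cold
step = 0.01                  # size of each small energy transfer

def total_entropy(a, b):
    return c1 * math.log(a) + c2 * math.log(b)

def dsdu(c, u):
    return c / u             # derivative of c * ln(u) with respect to u

# Greedily move energy in whichever direction raises total entropy.
while True:
    if total_entropy(u1 - step, u2 + step) > total_entropy(u1, u2):
        u1, u2 = u1 - step, u2 + step   # colder block (larger ds/du) gains
    elif total_entropy(u1 + step, u2 - step) > total_entropy(u1, u2):
        u1, u2 = u1 + step, u2 - step
    else:
        break   # no transfer increases entropy: equilibrium reached

T1, T2 = 1 / dsdu(c1, u1), 1 / dsdu(c2, u2)
print(T1, T2)   # the two temperatures agree (in fundamental units)
```

The loop never consults temperature directly; it only chases entropy uphill, yet it halts precisely when the temperatures are equal, which is the whole point of the definition.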
One last thing. The way I've defined things, our temperature T is actually called the fundamental temperature. And its units are units of energy, not units of conventional temperature like kelvin or degrees Celsius. That's fine. No one ever said temperature had to be measured in some particular units like degrees Celsius. People just like using those units. But to arrive at the conventional temperature we're used to dealing with, divide the fundamental temperature by a constant called the Boltzmann constant. Its value is 1.381 x 10^-23 joules/kelvin. Yes, it's very small. This gives us the temperature in kelvin, which is just the temperature in degrees Celsius plus 273.15. Turned around, °C = K - 273.15. That is, 283.15 K = 10 °C.
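In code, the conversions look like this (the helper names are just for illustration):

```python
k_B = 1.381e-23  # Boltzmann constant, in joules per kelvin

def fundamental_to_kelvin(tau):
    # Conventional temperature = fundamental temperature / k_B.
    return tau / k_B

def kelvin_to_celsius(T):
    return T - 273.15

tau = 3.910e-21                  # a fundamental temperature, in joules
T = fundamental_to_kelvin(tau)
print(T)                         # ≈ 283.1 kelvin
print(kelvin_to_celsius(T))      # ≈ 10 degrees Celsius
```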
Monday, June 28, 2010
