Wednesday, October 14, 2009

Superconductivity

When Thomas Edison went about providing electric power to New York City in the late 1800s, he knew that energy was dissipated as heat in the wires that delivered electric current to his customers. This reduced the amount of power that made it to the homes of his customers, and presented Edison with the problem of trying to minimize this power loss. (I talked about this in a previous blog entry.)

The issue faced by Edison was one of electrical resistance, which is a measure of the degree to which an object opposes an electric current through it. When current flows through an object with resistance, electrical energy is converted to heat at a rate equal to the square of the current times the resistance. This rate is a measure of power loss.
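That rate can be written as P = I²R. Here's a quick sketch of the relationship (the numbers are illustrative, not figures from Edison's grid):

```python
def power_loss_watts(current_amps, resistance_ohms):
    """Heat dissipated in a conductor: P = I^2 * R."""
    return current_amps ** 2 * resistance_ohms

# Illustrative values: 100 A through a wire with 0.5 ohm of resistance
print(power_loss_watts(100, 0.5))  # 5000 W lost as heat
# Because the loss goes as the SQUARE of the current,
# halving the current cuts the loss by a factor of four
print(power_loss_watts(50, 0.5))   # 1250 W
```

That square is why the loss is so punishing: deliver the same power at half the current (by doubling the voltage) and you lose only a quarter as much to heat.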

While Edison had means to lessen this loss of power, he couldn't escape it completely. That's because conductors (i.e. materials that conduct electricity) naturally heat up as an electric current moves through them. The electrons that comprise this current, as they snake forward through the material, are constantly bumping into the atoms (ions) of the conductor. At each collision, an electron loses a bit of kinetic energy to an ion, increasing the kinetic energy of the ion, generating heat and increasing the temperature of the conductor. While conductors exhibit less resistance at lower temperatures, ordinary conductors can never be cooled enough to achieve zero resistance.

It was in 1911 that a scientist, Heike Kamerlingh Onnes, discovered that certain unusual conductors, under certain conditions, do possess zero electrical resistance. That is, passing an electric current through these materials does not heat them and, therefore, no power is lost in them. The reason no one had seen such behavior before: it only takes place in certain materials, and these materials have to be unimaginably cold. It was only just prior to 1911 that such cold temperatures were achieved in the laboratory (by Onnes himself). Onnes had cooled helium gas so far (down to 4.2 degrees above absolute zero) that it condensed into a liquid. Using this liquid helium as a refrigerant, he tested the electrical resistance of mercury and was amazed to find that it actually dropped to ZERO! Such behavior was a completely new phenomenon, never before witnessed. Onnes labeled it "superconductivity."

Onnes didn't understand what was going on inside the superconducting material. How could the electrons avoid bumping into the material's ions and passing kinetic energy to them? Why did such behavior occur only below a certain temperature, labeled the critical temperature? Twenty-two years later, in 1933, the answer was still unknown. But that year, Walter Meissner and Robert Ochsenfeld made an important new discovery about superconducting materials (which, as a class, had expanded to include materials other than mercury). They found that superconductors expelled applied magnetic fields. Magnetic field lines that passed through a sample of material were, in a sense, pushed out of the material (or, more accurately, cancelled within the material) when the material was cooled below its critical temperature. This finding, now known as the Meissner effect, provided evidence that superconductivity was, most fundamentally, a magnetic phenomenon. It also changed the mindset that the fundamental property of a superconductor was zero resistance.

A theory explaining the phenomenon of superconductivity was proposed in 1957 by John Bardeen, Leon Cooper, and Robert Schrieffer. It became known as the BCS Theory, after their initials. It had to do with phonons (not photons) and Cooper pairs. Phonons are quantized crystal lattice vibrations. What does this mean? Certain materials exist as crystals, which means "the constituent atoms, molecules or ions [which are atoms or molecules with a net electric charge] are packed in a regularly ordered, repeating pattern in all three spatial dimensions." (Wikipedia) The graphic below is an example of a unit cell, which is periodically repeated in three dimensions to form a crystal. Each sphere represents an atom and the tubes represent bonds between atoms.


A lattice is a sort of framework upon which, at each point, there exists a unit cell like you see pictured above. So the crystal looks the same when viewed from any lattice point. As an electron moves through a crystal, it exerts a force (i.e. it pulls) on the positively charged lattice ions, distorting them towards its (the electron's) path. As the electron then moves away from that point on the lattice, the lattice ions return to their original position. Because all atoms in a crystal are connected, "the displacement of one or more atoms from their equilibrium positions will give rise to a set of vibration waves propagating through the lattice." (Wikipedia) Finally, these vibration waves are quantized, which means they can't possess just any amount of energy but only certain discrete numerical values.

What happens as an electron moves through a crystal, generating a phonon? Let's picture an electric current flowing through the material. One electron after another. An electron zips past a point in the crystal lattice, distorting the lattice through the creation of a phonon. The lattice is pulled inward towards the negatively-charged electron, but the electron quickly moves away, faster than the lattice can relax back to its original position. This creates a region of positive charge, as the lattice ions that are pulled inward are positively charged.

Here's the cool part. A second electron can be attracted to the region of positive charge along the path of the first electron. And these two electrons, which would normally repel one another (because they are both negatively charged), can become bound to one another. "If this binding energy is higher than the energy provided by kicks from oscillating atoms in the conductor (which is true at low temperatures), then the electron pair will stick together and resist all kicks, thus not experiencing resistance." (Wikipedia) These electron pairs are called Cooper pairs, and they lie at the heart of the BCS Theory. They are what allow for superconductivity; they carry the superconducting current.

But, as noted just above, the temperature has to be low. Above a critical temperature, the atoms in the crystal are jostling around too much, bumping into the electron pairs with enough force to knock them apart. This breaking apart of the Cooper pairs destroys superconductivity in the material, and the material becomes "normal." What's the highest temperature at which a known material will superconduct? A special ceramic material comprised of many different atoms has been observed to superconduct at -135 degrees C. Notice the negative sign. The holy grail of those working in the field is to find a material that superconducts at room temperature. (Obviously, no material yet identified would have helped Edison ... although there are techniques, which I addressed in a previous blog entry, that lessen the problem.)

The material that superconducts at -135 degrees C (or 138 K), like all materials that superconduct above around -243 degrees C (or 30 K), is called a "high-temperature" superconductor. This is obviously a relative term. Such materials are not explained by the BCS Theory, and there is no good theory yet to describe how these high-temperature superconductors work.
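For reference, the Celsius figures above convert to Kelvin by adding 273.15. A minimal sketch, using the temperatures quoted in this entry:

```python
def celsius_to_kelvin(temp_c):
    """Convert a Celsius temperature to Kelvin."""
    return temp_c + 273.15

print(celsius_to_kelvin(-135))  # ~138 K, the record "high-temperature" superconductor
print(celsius_to_kelvin(-243))  # ~30 K, the rough cutoff for the "high-temperature" label
```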

Sunday, October 4, 2009

Blackbodies

What is a blackbody?

A blackbody is an idealized type of object that absorbs ALL electromagnetic radiation that falls on it. It therefore reflects no light. Now suppose that the blackbody is in thermal equilibrium with its surroundings. (This means that it's at the same temperature as its surroundings.) With a bit of physics background, it becomes apparent that the object must not only absorb all radiation incident upon it (which is what makes it a blackbody) but must also emit radiation at an equal rate; otherwise, the net inflow or outflow of radiation would cause its temperature to change. (It should seem reasonable that radiation incident on an object can alter its temperature ... think of a microwave oven.) This emission of radiation may be in the form of visible light, so even though no light is reflected from the blackbody, the body may still give off light. In other words, it may not actually be black in color.

(If you're familiar with the term "electromagnetic radiation" and you know what wavelength is, you can skip this paragraph.) Electromagnetic radiation is the collective term for radiation in its many guises. Microwaves, radio waves, visible light, X-rays. These are all examples of radiation, which can be viewed as a wave, with electric and magnetic components, propagating through space, carrying along energy. Waves can vary from one another in various ways, with a notable example being wavelength, or the distance between two adjacent crests or troughs. (More precisely, wavelength is the distance between adjacent maximums in the oscillating electric field.) Frequency, which is inversely proportional to wavelength, is another distinguishing characteristic of waves. It's a measure of the rate of oscillation of the wave. As a wave passes through a point in space, the shorter the wavelength, the more rapidly crests (or troughs) pass through that point. And vice-versa. This inverse proportionality holds, by the way, because radiation travels at a constant speed -- namely, the speed of light. In what wavelengths can radiation come? Any and all. A millionth of a centimeter, or two centimeters, or two meters, or bigger or smaller. People have arbitrarily divided this wavelength continuum into sections and given names to the different regions. Radiation that has a wavelength anywhere between 1 mm and 10 cm is called microwave radiation. Radiation with a wavelength between 400 and 700 nanometers (a nanometer is one billionth of a meter) is called visible light. And, within the range of visible light, wavelength furthermore determines the color. As an example, light at 500 nm is green. You get the idea.
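That inverse proportionality is just frequency = c / wavelength. A minimal sketch (the example wavelengths are my own choices):

```python
SPEED_OF_LIGHT = 299_792_458  # m/s; the same for all electromagnetic radiation

def frequency_hz(wavelength_m):
    """Frequency of an electromagnetic wave: f = c / wavelength."""
    return SPEED_OF_LIGHT / wavelength_m

print(frequency_hz(500e-9))  # green light: ~6.0e14 Hz
print(frequency_hz(0.02))    # a 2 cm microwave: ~1.5e10 Hz
```

Shrink the wavelength and the frequency climbs in exact proportion, because the wave's speed never changes.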

Funny thing about a blackbody in thermal equilibrium: it will emit a specific radiation spectrum (or a specific distribution of energy spread over all the possible wavelengths of radiation) that is characteristic NOT of the shape or size of the object, or even of what it is made of, but ONLY of the temperature of the object. Therefore, any two objects at 300 K (I'm using the Kelvin temperature scale, where the temperature in Kelvin is always 273.15 degrees higher than what it is in Celsius) will have the same radiation spectrum, and two objects at 3000 K will also share the same radiation spectrum, though one that is different than that shared by the objects at 300 K. (The radiation spectrum is generally referred to as a thermal spectrum, but I'll stick with the first term.) Here's a graph of spectra at 4 different temperatures. Each has a peak, with a rather sharp tapering on the shorter-wavelength side and a more gradual tapering on the longer-wavelength side.



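The curves in that graph come from Planck's law, which I haven't spelled out here. As a sketch (the formula is the standard one; the function name and wavelength grid are my own), we can scan the 5800 K curve numerically and confirm that its peak lands in the visible band:

```python
import math

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
K_B = 1.380649e-23   # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Spectral radiance of a blackbody at a given wavelength and temperature."""
    numerator = 2 * H * C**2 / wavelength_m**5
    return numerator / (math.exp(H * C / (wavelength_m * K_B * temp_k)) - 1)

# Scan 100-3000 nm to locate the peak of the 5800 K curve
grid_nm = range(100, 3001)
peak_nm = max(grid_nm, key=lambda nm: planck(nm * 1e-9, 5800))
print(peak_nm)  # close to 500 nm, squarely in the visible band
```

Note that only the temperature enters the formula alongside universal constants: no shape, size, or material, exactly as claimed above.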
Quick question: How hot should the filament in a light bulb be, so that it will produce the same "white" light spectrum produced by the Sun? Answer: the same temperature as the surface of the Sun, which is about 5800 K. From the graph above (the 6000 K curve comes closest), we can see that something at this temperature produces most of its radiation in and around the visible portion of the spectrum but also produces X-rays and microwaves and other types of waves in lesser quantities. And because the spectrum peak at about 5800 K is in the middle of the visible region, we get from the Sun fairly equal amounts of all the different colors of visible light. The colors of the rainbow (in roughly equal amounts) blend to form a nice "white" light. (But, you object, the Sun is yellow! Actually, it only looks yellow from the surface of the Earth because of the distorting effects of the atmosphere.)

What would we see, however, if the Sun's surface were only 3000 K? Well, the Sun would then emit a lot more red light (which corresponds to the right side of the visible band) than blue light (which is nearer the left side of the band), and sunlight would have a reddish hue (though I don't think it would be very obvious to the naked eye).

No doubt you've seen something glowing "red hot", like, say, the heating element on the stove. The color is an indication that the stove is hot enough to produce a radiation spectrum that has a sufficient bit of energy allocated to the visible red region but little or no energy allocated to the other, shorter-wavelength colors of light, which would make the stove appear more orange or even white, depending on its exact temperature. For the same stove, guess which section of the radiation spectrum is best represented, so to speak. That would be infrared radiation, which our bodies perceive as heat. (A hot stove burner that is dull red in color is about 800 K; if you can raise the temperature high enough, it will turn orange at about 1150 K.)
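There's a standard rule of thumb behind these temperatures, Wien's displacement law (not named in the post): the wavelength where a blackbody spectrum peaks is a fixed constant divided by the temperature. A minimal sketch:

```python
WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

def peak_wavelength_nm(temp_k):
    """Wavelength (in nm) at which a blackbody's spectrum peaks: Wien's law."""
    return WIEN_B / temp_k * 1e9

print(peak_wavelength_nm(5800))  # ~500 nm: green, middle of the visible band
print(peak_wavelength_nm(3000))  # ~970 nm: near-infrared, so the visible tail looks reddish
print(peak_wavelength_nm(800))   # ~3600 nm: deep infrared, with just a faint red glow
```

Hotter means a shorter peak wavelength, which is exactly why a burner goes from dull red toward orange as it heats up.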


(Another graph to look at.)

Normal incandescent bulbs don't get close to the temperature of the Sun, so they fall short of producing the same pleasant white light that emanates from our star. These bulbs contain tungsten filaments that reach temperatures of about 2500 K, so the light coming from them is redder than that of the Sun. And tungsten has the highest melting point of all metals (about 3700 K), so to better mimic the color spectrum of the Sun, we can't simply heat a metal filament to 5800 K; it would melt well before reaching that temperature. We must turn to a bulb that produces light not by getting hot, but by a different mechanism. Fluorescent bulbs are a case in point.

Is a blackbody black in color? It can be, but it doesn't have to be, as we saw in the first paragraph. The Sun is very nearly a blackbody, meaning it absorbs very nearly all radiation incident upon it. It also remains fairly constant in temperature because, even though it's emitting a lot of radiation, it's creating more of it deep within its core. And it's most definitely not black in color. Actually, to a reasonable approximation, all matter in thermal equilibrium behaves like a blackbody. A book does. A car does. Even a person does. As an example, the actress Halle Berry is a blackbody. Now, if she were a true ideal blackbody, she wouldn't reflect light (and she wouldn't have made the movie Catwoman), so we wouldn't be able to see her in color. She would appear pitch black. She would absorb all light and emit a radiation spectrum characteristic of something at 98.6 F (or 310 K). But she, like all people, only approximates a blackbody. We reflect some of the light falling on us, which makes us visible, and absorb the rest. And we all emit a radiation spectrum that peaks in the middle of the infrared region of the spectrum (like the stove), producing so little visible light (as well as certain other wavelengths) that we don't shine in a dark room (unlike the stove, which is hotter). Want to be able to "see" someone who isn't reflecting visible light? Try infrared goggles. These pick up the infrared radiation produced by the person (and other warm things around the person) and convert it to visible light that your eyes can detect.