For millennia, if people wanted to see after dark, they had to do it by moonlight or starlight. The human eye is really good at that—we can adapt to light levels a million times lower than sunlight—but there's a limit. If there aren't enough photons to trigger the light-sensing cells in our eyes' retinae, we can't see.
Sometime in the distant past, about a million years ago by some estimates, proto-humans learned to control fire. They probably used it first for heat, and later for cooking, but it's no great stretch of the imagination to think that they used it for light, too. A burning branch will let you explore a lot deeper into a cave than you could otherwise go.
A burning branch has one big drawback, though: It doesn't last long. So right from the start, we can assume that the search was on for better lighting methods. Let's have a look at what some of those advances in lighting were, what they've led to today, and where we might go in the future.
It wouldn't have taken long for a prehistoric cave-person to discover that a branch covered in pitch (hardened tree sap) burns longer and brighter than a regular stick. It doesn't blow out as easily, either, which makes it much more useful if you're walking around with a torch in your hand. Pitch can be gathered and melted and smeared on green sticks that don't easily burn by themselves, so it's safe to assume that pitch-covered sticks pretty quickly became the standard lighting fixture for upscale cave dwellers.
According to the archaeological record, that remained the pinnacle of lighting technology for thousands of years. But we've found evidence that about 70,000 years ago someone discovered that moss soaked in animal fat, set in a bowl, and lit would burn for quite a while. If the bowl was a seashell or a hollowed rock, it would last a lot longer than the wooden shaft of a torch, and it could be refilled with more moss and fat when necessary. That became a common alternative to the torch, and it also offers evidence that people were cooking meat over fires by then, too.
Torches and fat lamps have one thing in common: They're smoky, which was probably not such a bad thing. The smoke kept mosquitoes and biting flies at bay, which probably led to a considerable health gain.
Animal fat is a relatively rare resource, though. You have to kill an animal to get it, and that's not an easy task. It's also a valuable foodstuff, so any fat that went into lamps was fat that didn't go into a person's belly. Fortunately certain plants also produce combustible fats, and those fats are often liquid at room temperature. We call that liquid fat "oil."
Olives in particular are an excellent source of oil, and they grew plentifully around the eastern Mediterranean, not far from where agriculture got its start. So olive oil became another common lamp fuel, as did sesame oil in Mesopotamia. Vegetable oil also stored longer than animal fat without spoiling, which made it easier to stockpile fuel.
Technology isn't a one-way street, however. The invention of vegetable oils didn't eliminate the use of animal fat, and in northern climes animals continued to be a prime source of lamp fuel. Whales, in particular, became such a common source that an entire industry sprang up around their harvest, eventually driving them to near extinction.
During the whale oil boom, and perhaps driving it, came the first major innovation in oil lamps: the 1780 invention of the "central draft fixed oil lamp," which used a hollow cylindrical wick that greatly increased its efficiency. Four years later the addition of a glass chimney helped control air flow, and the oil lamp we still occasionally see in antique shops became the standard of the day.
What about candles? Those were invented not long (only 1500 years) after oil lamps, but they were expensive to produce. Making a candle required tallow, which is rendered and purified animal fat, or spermaceti, which is a waxy oil extracted from sperm whales, or beeswax, which required someone really brave to procure it. Wicks were either wrapped with softened tallow or dipped repeatedly in liquid tallow or spermaceti or melted beeswax until they were the right diameter to burn completely without excess dripping or coring (burning down the center without melting the outer part). So for thousands of years, candles were for the rich.
In the 19th century, experiments with coal produced a waxy substance called "paraffin." Paraffin burned cleanly and produced an agreeably white light, and it quickly became the favorite material for candles. Furthermore, with the development of petroleum wells and refineries around the same time, paraffin became both common and cheap. Candle-making became a big business, prices dropped, and candles became available to the lower classes.
However, in one of those ironies that can only be appreciated after the fact, the death knell of the candle began to toll even before the discovery of paraffin. Two technologies emerged nearly simultaneously to challenge the candle for supremacy: gas and electricity.
In the late 18th century and early 19th a practical system for distilling a flammable gas from coal was developed. That gas could be burned directly at the end of a pipe, producing a flame that burned cleanly and persisted for as long as the gas supply held out.
At the same time, electricity was becoming a technological contender. It was known practically from the first electrical battery that a spark created light, so it was a fairly straight shot to the invention of the arc lamp, which was essentially a big spark that jumped between two carbon electrodes. Compared to candles, arc lights were insanely bright and harshly blue-white, which didn't exactly make them popular. More problematically, the arc ate away at the electrodes, so you had to keep adjusting their separation to keep the arc going.
Gas lighting had none of those problems, and the invention of the mantle, a fiber bag soaked in rare-earth salts that glowed brightly when heated in a gas flame, further improved the gas light. So gas lighting was popular for most of the 19th century.
To further the irony of the race between candles, gas, and electricity, the big break for electricity came when Joseph Swan and Thomas Edison designed a dimmer electric light. The incandescent bulb, which used a cellulose filament heated until it glowed, was a much softer, warmer color than an arc light. It was brighter than candles, but not so bright that it drove people from the room.
The lighting element had to be in a vacuum or the superheated filament would burn up, so these new gadgets were sealed in glass bulbs. Edison attached a screw-in base to his, and the light bulb as we know it was born. When the technology allowed, the cellulose filament was replaced with a tungsten wire that lasted longer and was more shock resistant. Advances in generating electricity came along with the light bulb, so people didn't have to power them with batteries. Electrical transmission lines sprang up between generating stations and factories and the homes of the wealthy, and increases in efficiency brought those power lines to more and more households until nearly everyone had electric lights.
Up to this point, all our light-generating technology depended upon incandescence: If you get something hot enough, it will glow. But there are other ways to make things glow, and some of them are much more efficient than incandescence. Fluorescence is one of those technologies.
Fluorescent lamps work by using an electrical discharge to excite atoms of a particular material (usually mercury vapor). The electrons in these excited atoms jump to a higher orbit, then quickly fall back to their ground state, and when they do that the energy of their jump is released as a photon of light. Those photons have too short a wavelength (ultraviolet) for us to see, so the inner surface of the fluorescent light bulb is coated with a compound that glows when struck with ultraviolet light. That phosphorescent coating glows at a wavelength we can see. Fluorescent lights tend to be bluer than incandescent lights, but they can be tuned with various materials to produce light of softer colors.
Fluorescent lights are much more efficient than incandescent lights. While incandescent lights convert about 5% of their energy to light, fluorescent lights convert about 22%. That means they run cooler and don't cost as much to light a room.
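Those percentages translate directly into the power bill. A quick back-of-the-envelope calculation using the rough figures above (5% and 22%) shows why fluorescents run so much cooler:

```python
# Rough comparison using the efficiency figures above:
incandescent_eff = 0.05  # ~5% of input power becomes visible light
fluorescent_eff = 0.22   # ~22% for a fluorescent tube

# A 100 W incandescent bulb yields about 5 W of visible light:
light_out_w = 100 * incandescent_eff

# Power a fluorescent tube needs to produce that same 5 W of light:
fluorescent_power_w = light_out_w / fluorescent_eff
print(f"{fluorescent_power_w:.0f} W")  # about 23 W
```

The other 95 watts of the incandescent bulb's input, of course, leave as heat.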
One downside to fluorescent lights is that the mercury vapor eventually reacts with the phosphorescent coating and the bulbs lose their efficiency over time.
For years, streetlights used a similar technology, with sodium rather than mercury as the excited gas inside the bulb. When sodium electrons gain and release energy, they do it at a wavelength that we can see. That means they don't require a phosphor coating, so sodium vapor lights are simpler to make and they last for years. Their light is a soft amber color, not really suitable for indoor use but relatively pleasing outdoors at night, which led to sodium vapor lamps being used for streetlights.
There was one really cool advantage to sodium vapor streetlights: Their light pollution was monochromatic, which means they only put out that amber color. Astronomers, or anyone else who didn't want to see their stray light, needed only to use a filter that blocked that one frequency and the light would vanish completely.
Then along came Light Emitting Diodes. They use a phenomenon called electroluminescence, which is similar to the way electrons give off light when they jump from a high orbital to their ground state in an atom. In an LED, electrons driven across a semiconductor junction drop into vacant spots, called "holes," on the other side. The width of the semiconductor's band gap determines the energy released when an electron drops into a hole, and that in turn determines the wavelength of the light emitted.
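The band-gap-to-wavelength relationship is just the single-photon energy formula, wavelength = hc/E. A quick sketch (the 2.7 eV figure for an indium-gallium-nitride blue LED is a typical textbook value, not something from this article):

```python
# Photon wavelength for a given band-gap energy.
# Using hc ≈ 1239.84 eV·nm (Planck constant times speed of light):
HC_EV_NM = 1239.84

def gap_to_wavelength_nm(gap_ev):
    """Wavelength (nm) of a photon carrying gap_ev electron-volts."""
    return HC_EV_NM / gap_ev

# An indium-gallium-nitride blue LED has a gap of roughly 2.7 eV:
print(round(gap_to_wavelength_nm(2.7)))  # ~459 nm, deep blue
```

A bigger gap means a more energetic photon and a shorter wavelength, which is why pushing LEDs from infrared up through red to blue meant finding materials with ever-wider band gaps.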
Early LEDs only emitted infrared light, which our eyes can't detect. Later improvements produced red ones, which became ubiquitous in numeric displays and power indicator lights. It took quite a lot of effort, but new materials have increased the band gap and thus the frequency of LED emission, so now we have LEDs of nearly every color.
Like sodium vapor lights, and the mercury vapor inside fluorescent lights, the output of an LED is monochromatic: The size of the band gap produces a single wavelength of light. So manufacturers of LED light bulbs meant for general lighting borrowed a trick from fluorescent bulbs and surrounded the actual light-producing element with phosphorescent material that glows at other wavelengths when excited.
Unfortunately modern LEDs have two of the same drawbacks as carbon arc lights: Their light tends to be harsh and blue. Bringing down the "color temperature" of LED lighting, making it warmer and yellower, is an ongoing struggle.
The color temperature is basically how hot you would have to get an object for it to glow the same color as your light source. Candles burn yellow-orange and have a CT of 1800 Kelvin. A typical incandescent light bulb has a CT of 2700K, fluorescents typically run between 3500K and 5500K, and LEDs used for lighting (as opposed to those used for indicator lights and numerical displays) run from about 2700K to 5000K.
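Those Kelvin figures connect to ordinary blackbody physics: Wien's displacement law gives the wavelength at which a glowing body of a given temperature radiates most strongly. Checking the numbers above (the Wien constant is a standard physical value, not from the article):

```python
WIEN_B = 2.898e6  # Wien's displacement constant, in nm·K

def peak_wavelength_nm(temp_k):
    """Peak emission wavelength (nm) of a blackbody at temp_k kelvin."""
    return WIEN_B / temp_k

for name, temp in [("candle", 1800), ("incandescent", 2700), ("cool LED", 5000)]:
    print(f"{name}: {peak_wavelength_nm(temp):.0f} nm")
# candle: 1610 nm, incandescent: 1073 nm, cool LED: 580 nm
```

Note that a candle flame and an incandescent filament both peak well into the infrared, which is why so much of their output is heat rather than visible light, while a 5000K source peaks near the middle of the visible band.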
The 2700K lights are harder to make, though, so we see a lot of 4000K and 5000K lights used for street lighting and even in-home lighting. The problem is, that's way too blue to be comfortable, or even healthy, and blue light scatters way more than yellow light (which is why the sky is blue). So while LED streetlights have reduced lighting costs for cities, they have vastly increased light pollution.
LEDs are at least ten times as efficient as incandescent lighting and over twice as efficient as fluorescent lighting, so they're being adopted at an amazing rate. In the near future, that's where the money is. But further afield?
LEDs still waste energy as heat, both in the circuitry needed to drive them and in the emission of light itself, so it would be nice to increase their efficiency or find a better alternative. Right now the theoretical maximum is 400 lumens per watt, and LEDs produce about half that, so there's room for improvement.
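To make those efficacy numbers concrete, here's a rough household comparison. The 200 lm/W LED figure follows from "about half" of the 400 lm/W ceiling quoted above; the ~15 lm/W incandescent efficacy and 900-lumen target are typical values assumed for illustration, not figures from this article:

```python
# Rough comparison of power needed to light a room to the same level.
lumens_needed = 900          # roughly the output of a 60 W incandescent
incandescent_lm_per_w = 15   # assumed typical incandescent efficacy
led_lm_per_w = 200           # half the 400 lm/W theoretical maximum

print(f"incandescent: {lumens_needed / incandescent_lm_per_w:.0f} W")  # 60 W
print(f"LED:          {lumens_needed / led_lm_per_w:.1f} W")           # 4.5 W
```

Even a best-case LED, in other words, still leaves a factor of two on the table before hitting the theoretical limit.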
It would be really nice to have a light source whose color temperature could be adjusted as easily as its brightness. Blue during the day, yellower at night. It would be nice to be able to control the bandwidth of the emission, too, so light pollution could be mitigated with simple filters again.
Glare is a huge problem with LED lighting, since the individual light emitting elements are so tiny and so bright. Flat panel technology could solve that problem by spreading out the light so there's no single source. Imagine a room in which the entire ceiling glowed softly, lighting everything evenly. Imagine streetlights that only shine on the ground, not in your eyes. Imagine lights that only come on when people need them, but stay off and don't contribute to light pollution when they aren't needed. (That technology exists; we just need to implement it.)
We've come a long way from a pitch-covered stick on fire, but we have a long way yet to go.
Jerry Oltion has been a science nut since he was old enough to spell "curious." He has written science fiction almost as long, and has done astronomy somewhat less. He writes a regular column on amateur telescope making for Sky & Telescope magazine, and spends many, many nights a year out under the stars.
To contact us, send an email to Fantasy & Science Fiction.
Copyright © 1998–2020 Fantasy & Science Fiction All Rights Reserved Worldwide