Smartphone cameras have significantly improved in recent years. Computational photography and AI allow these devices to capture stunning images that can surpass what we see with the naked eye. Photos of the northern lights, or aurora borealis, provide one particularly striking example.
If you saw the northern lights during the geomagnetic storms in May 2024, you might have noticed that your smartphone made the photos look even more vivid than reality.
Auroras, known as the northern lights (aurora borealis) or southern lights (aurora australis), occur when the solar wind disturbs Earth’s magnetic field. They appear as streaks of color across the sky.
What makes photos of these events more striking than the view with the naked eye? As a professor of computational photography, I’ve seen how the latest smartphone features overcome the limitations of human vision.
Your eyes in the dark
Human eyes are remarkable. They allow you to see footprints in a sun-soaked desert and pilot vehicles at high speeds. However, your eyes perform less impressively in low light.
Human eyes contain two types of light-sensing cells: rods and cones. Rods are far more numerous and much more sensitive to light than cones. Cones handle color but need more light to function. As a result, at night your vision relies heavily on rods and misses most color.
The result is like wearing dark sunglasses to watch a movie. At night, colors appear washed out and muted. Similarly, under a starry sky, the vibrant hues of the aurora are present but often too dim for your eyes to see clearly.
In low light, your brain prioritizes motion detection and shape recognition to help you navigate. Because of this trade-off, the ethereal colors of the aurora are often invisible to the naked eye. Technology is the only way to boost their brightness to a level your eyes can register.
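One way technology does this is by combining many short, noisy exposures into one cleaner frame before brightening it, a technique used in smartphone night modes. The sketch below is a simplified illustration of that idea, not any specific phone's algorithm: the scene values, noise level, and gain are all made-up numbers chosen to show the effect.

```python
import numpy as np

rng = np.random.default_rng(42)

# A dim, uniform patch of sky: about 5% of full brightness.
true_scene = np.full((100, 100), 0.05)

def capture(scene, noise_sigma=0.02):
    """Simulate one noisy low-light exposure (hypothetical sensor noise)."""
    return scene + rng.normal(0.0, noise_sigma, scene.shape)

# One frame vs. the average of 16 frames of the same scene.
single = capture(true_scene)
stacked = np.mean([capture(true_scene) for _ in range(16)], axis=0)

# Apply the same software gain to both, as a night mode might.
gain = 10.0
bright_single = np.clip(single * gain, 0.0, 1.0)
bright_stacked = np.clip(stacked * gain, 0.0, 1.0)

# Averaging N frames cuts random noise by roughly sqrt(N),
# so the stacked image stays much cleaner after brightening.
print("single-frame noise:", round(float(bright_single.std()), 3))
print("stacked noise:     ", round(float(bright_stacked.std()), 3))
```

The point of the sketch is the square-root relationship: averaging 16 frames reduces random sensor noise by about a factor of 4, which is what lets software safely multiply the brightness of a dark aurora scene without drowning it in grain.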