If you’ve ever wished you could see the world through the eyes of another animal, we have good news for you. We also wondered about that and, being scientists who specialise in colour vision, have created a solution: a camera system and software package that allows you to record videos in animal-view colours.
Many animals, including bees, birds and even mammals like reindeer and mice, can perceive ultraviolet light. Indeed, the lack of UV sensitivity in humans is the exception rather than the rule. At the other end of the visible light spectrum, human eyes have receptors that are sensitive to red, while many animals – including bees, mice and dogs – are just as blind to red as we are to ultraviolet light.
Even when it comes to blues and greens, colours perceived across the animal kingdom, the precise wavelength of the light an animal would experience as “pure blue” or “pure green” is specific to the species. As a result, no two species see the world in the same colours.
We invite you to stare at the sky and appreciate that its blueness is the joint product of the sunlight being scattered in the atmosphere and your own sensory system. The colour you see is specific to you – in fact, for many animals, the sky is ultraviolet-coloured.
Now, slowly lower your eyes and try to imagine how the rest of the landscape might appear to other species. With our new camera system, we have taken one step closer to understanding the wonderful, strange world that other animals live in.
Capturing the world in motion

While we cannot possibly imagine how ultraviolet appears to the animals that can perceive it, we can visualise it using false-colour imagery. For example, for honeybees, which are sensitive to three types of light (ultraviolet, blue and green), we can shift their perceivable colours into the human visible range such that ultraviolet is represented as blue, blue becomes green, and green becomes red.
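To make that channel shift concrete, here is a minimal Python sketch of the honeybee false-colour mapping. It assumes the per-pixel ultraviolet, blue and green receptor responses have already been extracted and normalised to the range 0–1; the function name and normalisation are illustrative assumptions, not the authors’ exact pipeline.

```python
import numpy as np

def honeybee_false_colour(uv, blue, green):
    """Display a honeybee's three receptor channels as human-visible RGB.

    uv, blue and green are 2-D arrays of per-pixel receptor responses,
    assumed here to be normalised to [0, 1].
    """
    # Shift each bee channel one step into the human-visible range:
    # ultraviolet is shown as blue, bee-blue as green, bee-green as red.
    rgb = np.stack([green, blue, uv], axis=-1)  # channel order: R, G, B
    return np.clip(rgb, 0.0, 1.0)
```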
Until now, we could only apply this process to immobile objects. False-colour photography relies on taking a series of photos through a succession of optical filters and then overlaying them, and this sequential method means everything must stay in exactly the same position across all the photos.
This is a serious drawback. It makes for a laborious process that limits the number of objects that can realistically be imaged. For example, photographing an iridescent peacock feather from a hundred different angles would mean screwing each filter on and off a hundred times.
Even worse, all movement-related information is discarded. Yet the living world is in constant motion: trees sway in the wind, leaves flutter, and birds hop along branches looking for insects that scuttle in the undergrowth. We needed a way to visualise all this movement.
The first challenge was to devise a camera that records in ultraviolet and visible light simultaneously. The solution turned out to be a beam splitter. This specialised piece of optical equipment reflects ultraviolet light as if it were a mirror, but allows visible light to pass through, just like clear glass.
We positioned two cameras (nothing too fancy, the same kind you can buy in shops and online, but with one modified to record in ultraviolet) in a 3D-printed casing, such that the modified camera received reflected ultraviolet light while a stock camera received transmitted visible light. We overlaid and synchronised the recordings of the two cameras, and a series of conversion steps allowed us to calculate the amount of light that had reached each camera’s sensors.
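Because the cameras and beam splitter are rigidly mounted, the two video streams can be registered with a single fixed transform estimated once from a calibration target. The sketch below assumes that setup; the transform values and function names are placeholders for illustration, not the authors’ published code.

```python
import cv2
import numpy as np

# Placeholder affine transform mapping UV-camera pixels onto the visible
# camera's pixel grid. In practice it would be estimated once (for example
# with cv2.estimateAffinePartial2D on matched points from a calibration
# target), since the rigid beam-splitter housing keeps the geometry fixed.
UV_TO_VIS = np.array([[1.002, 0.001, -3.5],
                      [-0.001, 1.002,  2.1]], dtype=np.float32)

def align_uv_frame(uv_frame, vis_frame):
    """Warp a UV frame into the visible camera's pixel grid so the two
    synchronised streams can be overlaid pixel by pixel."""
    h, w = vis_frame.shape[:2]
    return cv2.warpAffine(uv_frame, UV_TO_VIS, (w, h))
```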
From these sensor values, we could estimate the amount of light that would have been captured by an animal’s eye if it were seeing the scene from the vantage point of our camera.
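As a hedged sketch of that last step: the conversion from linearised camera channels to an animal’s photoreceptor responses can be expressed as a per-pixel linear transform. The matrix coefficients below are placeholders only; in a real workflow they would come from calibrating the cameras’ spectral sensitivities against the target species’ known receptor sensitivities.

```python
import numpy as np

# Hypothetical 3x4 matrix mapping four linearised camera channels
# (UV, blue, green, red) to a honeybee's three receptor catches
# (UV, blue, green). The numbers are illustrative placeholders.
CAMERA_TO_BEE = np.array([
    [0.9, 0.1, 0.0, 0.0],   # bee ultraviolet receptor
    [0.1, 0.8, 0.1, 0.0],   # bee blue receptor
    [0.0, 0.1, 0.8, 0.1],   # bee green receptor
])

def estimate_receptor_catches(frame):
    """frame: (H, W, 4) array of linearised camera responses (UV, B, G, R).
    Returns an (H, W, 3) array of estimated bee receptor catches."""
    h, w, _ = frame.shape
    flat = frame.reshape(-1, 4)          # one row per pixel
    catches = flat @ CAMERA_TO_BEE.T     # apply the linear mapping
    return catches.reshape(h, w, 3)
```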
Try it yourself

We have made all the code needed for the video conversions, and the plans for the camera system, freely available online, along with our best attempt to explain how to build the camera from scratch.
Our goal is for other researchers to build their own cameras and to use them to answer their own questions about how other species see the world. There are so many possibilities.
We can record the dances of peacocks and see how dazzling their feathers appear to peahens. The iridescence of these feathers extends into the ultraviolet – our recordings show the feathers appear even more colourful to their target audience than to us.
We can accurately describe how the startle displays of caterpillars appear to their bird predators, and understand why the unexpected flash of colourful patterns scares them away. We can ask questions about how animals move between spots on the forest floor to show off or hide their colours.
We can also create image records of butterflies and other insects held in museum collections and offer animal-view conversions as part of a digital library. And we can ensure glass facades are sufficiently visible to birds who might otherwise collide with them.
But the most exciting questions will be those we have yet to consider. Only now that we have started taking videos of the natural world in colours that animals see are we beginning to notice how much information is out there. Discoveries await you in your own backyard.
Vera Vasas, Research Fellow in Ecology and Evolution, University of Sussex and Daniel Hanley, Assistant Professor, George Mason University
This article is republished from The Conversation under a Creative Commons license.