Wild radish (Raphanus raphanistrum) flower in natural colour (left), uv only (centre) and false bee colours (right).
We’re attracted to flowers because of their form, scent, and colour, the very things that attract their pollinators. It’s clear that some colours are associated with particular pollinator groups, like red for birds and blue for bees. However, most pollinating animals perceive a different colour spectrum from us. Humans are unusual—but not unique—among mammals in having three colour receptor genes, which code for opsin proteins whose light reception peaks in the blue, green, and red wavelengths. Some people—more often males than females because of sex-linkage—can perceive only two, usually blue and green. For them red and green may be indistinguishable, or red appears as black, and hues like purple that mix red with another colour will not look the same as they do to people with three-colour vision. There are websites that show what the world looks like to people with fewer than three opsins. This is important, because none of us can see one of the colours that birds and bees can see, ultraviolet; so we’re a bit like colour-blind bees, if such things exist (I wouldn’t be at all surprised to find they do).
It seems that early mammals were nocturnal and sacrificed full colour vision for eyes that had many more rods, enabling better, but monochromatic, night vision. Through opsin gene duplications and divergence of the colour sensitivity of the duplicated genes, many mammals have regained a second opsin and a few, like humans, even regained a third. This is engagingly described by Richard Dawkins in “The Ancestor’s Tale”. But non-mammal vertebrates like birds, lizards, and fish can have more opsins, often three similar to the three human ones plus one that perceives a portion of the ultraviolet part of the spectrum. Many insects, on the other hand, have three opsins that perceive uv, blue, and green.
All this indicates that if we’re to understand colour as an attractant for pollinators, we need some way to study the ranges of flower colours that the pollinators can perceive. Often this means using a spectrometer to measure the reflected wavelengths, as done by Bischoff et al. (2013). But an important aspect of flower colour is the arrangement of contrasting patterns in flowers, like stripes and spots that function as nectar guides and ensure insects are both attracted and aligned correctly in the flower to find the rewards and accomplish pollen transfer. One way to study these patterns is through photography.
Viola banksii flower in natural colour (left), uv only (centre) and false bee colours (right).
First, I’ll describe how to photograph just the uv portion of the reflected light from a flower. Then I’ll describe how to simulate the flower’s reflectance across the whole spectrum perceived by an insect.
Photography in uv light.
Digital photography has made uv photography easy in some ways and hard in others. It’s hard, or at least expensive, because most modern digital cameras have very effective filters that prevent the sensor from recording any uv or infra-red light. There are some very expensive uv-sensitive cameras that are used in forensic work (bruises can be detected in uv light long after they’ve faded from normal vision). But we’re in luck because a few older DSLR cameras are weakly sensitive to uv light, and these are still available second hand. One of the reliable ones that many people use is the Nikon D70. I bought a couple of good ones on line for about $200 each, although one has an occasionally sticky shutter.
Secondly, many lenses have numerous glass elements and glass absorbs uv light. There’s another problem too: most lenses focus uv light slightly differently from the visible wavelengths, and because we can’t see the uv we have no way of focusing the image before we take the picture. You could pay $10,000 for a quartz lens to eliminate these problems, but there are adequate cheaper solutions. The EL-Nikkor enlarger lenses often transmit enough uv light because as fixed-focus lenses they need fewer glass elements. The different models vary in their uv transmission and in how closely their uv focus matches the visible focus. Overall the EL-Nikkor 80mm f5.6 seems to be the best bet. As a bonus, it’s a pretty good close-up lens with a couple of sets of rings or a bellows. I got a couple of used ones on line for about £40 each, and I see them popping up for sale every few weeks. Note there are two versions of this lens and the one that’s recommended is the older type that has some chrome (not all black) on its outer parts. I found this blog post, which compares some of these Nikkor enlarger lenses and the site has many other very useful posts and great images.
Various sites recommend a focussing helicoid to make up for the EL-Nikkor being a fixed focus lens. I bought one, but I don’t need it for close-up shots, because I’m using close-up rings and so I can simply focus by moving the camera with the tripod. But if you want to photograph landscapes in uv or infra-red (IR) with this lens, you’ll need a helicoid. Either way, you will need a connecting ring to join the lens’s screw mount to the helicoid’s or close-up ring’s bayonet mount (eBay).
Next, you need to block all visible light, because the sensor is orders of magnitude more sensitive to it than to uv light. There are several uv pass filters available that block all visible light while allowing uv through. But most of them, and unfortunately those that are reasonably priced, also let IR through, and the camera sensor is also much more sensitive to IR than to uv. So if you use those, your photo will be an IR, rather than uv, image. What you need is a so-called Venus filter, made by Baader, and designed for astronomers to photograph uv reflectance from Venus. I got mine new from Baader’s Australian agent. It’s probably the most expensive item in the kit.
Important safety warning: never use this filter to look at the sun or to photograph the sun. Although it blocks all visible light, remember that invisible, but very dangerous, uv light is pouring through it.
There’s an additional problem: attaching the filter to the front of the lens. The filter holder on the EL-Nikkor lens has an unusual diameter (34.5 mm, while the filter is 52 mm), but there’s an engineer in Belarus who makes 34.5–52 mm connecting rings, available through eBay. Or you could get a 34–52 mm connecting ring and glue it onto the lens.
Finally you need a light source that produces uv, and again this is a problem because standard flashes are filtered to protect our eyes from damaging uv wavelengths. Following recommendations, I bought a Vivitar 285HV flash, and asked a technician to remove the filter (there’s a YouTube instructable for how to do it, but I didn’t want to do it myself for fear of electric shock from the capacitor). Alternatively, you can use sunlight, but that’s better done outside (because window glass filters out some uv) so you run into trouble with wind movement in long exposures. It doesn’t matter that the flash also produces a lot of visible light, because the Venus filter blocks it all. I’m not sure, but I hope using the uv-enabled flash is not too damaging to the eyes, so long as you don’t ever fire it towards a face.
Nikon D70 camera, two sets of close-up rings, EL-Nikkor 80mm f5.6 lens, and Baader Venus filter (left) and modified Vivitar flash (right).
So that’s the kit. It took me a couple of months to pull it all together and it probably cost close to $1000. There are other options, but this was based on what seemed like the most knowledgeable advice I could find on line.
How do I use it? First, I use visible light to arrange my subject and focus the camera, set to manual. Then I screw on the filter, set the camera to take a 10 second exposure and while the shutter is open I fire the flash a couple of times at very close range, from the left and right sides of the flower. You can use a cheap infra-red remote to avoid camera shake, but if you give it a second to settle before firing the flash that won’t be a problem. I hold the flash in my hand and fire it manually while the shutter is open. It’s important to try to avoid shadows, because these can look falsely like uv absorbing regions, especially when you come to assemble false colour images. A uv-enabled ring flash would be ideal I think.
You can ramp up the ISO setting, and I tend to use an f11 aperture although f8 would help brighten the image. Even with the flash so close, the image is often pretty pale, so sometimes f8 is necessary.
Sometimes there are uv-absorbing regions in flowers that you wouldn’t expect from the visible range, and that’s what is fascinating about this. In musk, for example, the lower corolla lobe is uv dark, while the others reflect. In Genista stenopetala, the keel and wings absorb uv while the standard reflects it, although its corolla is uniformly yellow to our eyes.
Simulating insect vision.
Unfortunately, people sometimes jump from “insects can see uv” to thinking uv photos show what insects see. They don’t. Bees and many other insects see complex colours made up of uv, blue and green, in the same way we see colours made up of blue, green, and red.
Each pixel on a TV or monitor is a colour based on the contribution of blue, green, and red to that point. If insects had TVs, they’d display colours based on contributions of uv, blue, and green to each dot, but we’d only be able to see the blue and green components. We have no idea what uv looks like or how it mixes with blue and green. All we can hope to do is to replace our blue, green, and red channels with uv, blue, and green channels. Now the uv signal looks blue to us, the blue green, and the green red. This process shifts the bee-visible spectrum into the range of wavelengths that are human visible, in much the same way as a musician might transpose a song from a low into a high key: the relationship between the notes and the chords is the same, but it sounds different in a higher key (we can do this with low frequency whale calls by transposing them into a higher frequency that we can hear).
How do we do this in practice? You need two identical exposures, so it needs a firm tripod. First photograph the flower in visible light, and then repeat the exposure in uv only, by using the filter and uv flash. It takes a bit of trial and error to make sure the two photos are similarly exposed, but you can adjust a lot with Photoshop or GIMP. I tend to open the aperture a stop between taking the colour image and the uv image and of course the uv photo is taken with the much brighter light of the flash at close range.
I’ll now describe what I do in GIMP to bring these two images together as a false colour representation of what a bee might see (false colour because it’s in the human colour palette, even though it shows the colour contrasts that a bee would see).
I start with the uv photo. As it comes off the camera it’s usually in pink or purple hues (the red and blue sensors respond best to the uv light). Here are the raw colour (left) and uv-only pictures of a musk flower (Erythranthe moschata) combined side by side in one picture but not otherwise modified.
Open the uv image first. I usually convert the uv picture to monochrome by dragging the saturation slider all the way to the left.
Then usually I need to adjust the brightness. Most times the automatic setting does this pretty well.
That’s your basic uv picture, but it’ll need to be aligned exactly to the colour one, so I crop it to a convenient size and shape (usually square) and centre it on a distinctive pixel somewhere. There are a couple of ways I’ve figured out to do this, but it doesn’t matter how you do it so long as the uv and visible colour pictures are cropped to exactly the same dimensions and the crops are centred on the same pixel (in practice it can be a tiny bit out without too much damage). Once you’ve cropped it, save it, in tiff if you’re fussy, or jpg is probably fine.
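If you’d rather not eyeball the crops, the matching step can also be scripted. Here’s a minimal pure-Python sketch of the idea, treating an image as a list of pixel rows; the reference-pixel coordinates and the 10×10 example images are hypothetical stand-ins for values you’d read off in your editor.

```python
def crop_centred(pixels, cx, cy, size):
    """Crop a square of side `size` centred on pixel (cx, cy).

    `pixels` is a list of rows, each row a list of pixel values.
    Applying this with the same cx, cy, and size to the uv and the
    visible photo guarantees identically placed, identically sized crops.
    """
    half = size // 2
    return [row[cx - half:cx + half] for row in pixels[cy - half:cy + half]]

# Hypothetical example: two 10x10 "images", shared reference pixel at (6, 5).
visible = [[(r, c, 0) for c in range(10)] for r in range(10)]  # RGB triples
uv = [[r * c for c in range(10)] for r in range(10)]           # uv intensities

vis_crop = crop_centred(visible, cx=6, cy=5, size=4)
uv_crop = crop_centred(uv, cx=6, cy=5, size=4)
# Both crops are now 4x4 and centred on the same pixel.
```

The same arithmetic works whatever tool you use: pick one distinctive pixel, note its coordinates in each photo, and cut the same-sized window around it in both.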
Next open the colour photo. Set the brightness either manually or auto, and then crop it exactly the same as the uv one.
Then in the colour menu decompose it into its three colour channels: red, green, and blue.
On screen you’ll see a new image in monochrome, which will actually be made up of three layers, one for each colour channel. Now add the uv photo as a new layer, to give you four layers: uv, blue, green, and red.
Go back to the colour menu and compose the picture again, setting the blue channel to use the uv layer, the green channel to use the blue layer, and the red channel to use the green layer. The red layer is not used. Watch your new false colour image appear on screen!
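If you prefer scripting to clicking through menus, the same channel swap is only a few lines with the Pillow imaging library. This is a sketch, not the exact GIMP procedure; the filenames in the usage comment are hypothetical. The mapping is the one described above: uv into the blue channel, blue into green, green into red, with the original red channel discarded.

```python
from PIL import Image

def bee_false_colour(rgb_img, uv_img):
    """Recompose a false colour "bee vision" image.

    rgb_img: the visible-light photo (RGB mode)
    uv_img:  the uv-only photo, already converted to greyscale ("L" mode)
    Channels are shifted: green -> red, blue -> green, uv -> blue.
    The visible photo's red channel is not used.
    """
    r, g, b = rgb_img.split()
    return Image.merge("RGB", (g, b, uv_img))

# Hypothetical usage with the cropped, aligned images:
# visible = Image.open("musk_visible.tif")
# uv = Image.open("musk_uv.tif").convert("L")
# bee_false_colour(visible, uv).save("musk_bee.tif")
```

Both inputs must have identical dimensions, which is exactly why the earlier cropping step matters.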
These false colour pictures still need a little care in their interpretation. The pink and orange musk flower is unlikely to be what a bee sees, but any contrasting colour patterns visible in the false colour image will also be visible to a bee, even when those contrasts don’t show up in the human-colour palette. We have no idea what a bee experiences when it sees a flower in its uv-blue-green colour palette, any more than we know what infra-red or radio waves might look like.
And of course the choice of which channels to fill with which colours is entirely arbitrary. Here's the musk flower again, this time with blue in the blue channel, green in the green channel, and uv in the red channel.
Birds can see uv as well as blue, green, and red. Not only do we not know what uv looks like, but we can’t begin to guess what its colour combinations look like, such as uv+blue+red. It’s like visualising four dimensions in only three, with the added problem that we can’t conceive what one of the dimensions looks like at all. Most New Zealand flowers visited by birds tend to reflect strongly in uv and red and absorb blue and green wavelengths, appearing just red to us but some unknowable blended colour to their pollinators.