UltravioletPhotography

Modeling Bee Vision with Stacks or Filters? Well, no, we can't really.


Andrea B.

Recommended Posts

I was wondering if it would be useful to make three separate monochrome images showing an approximation to the bee's UV, G and B receptors (see here). One could then combine these in different ways (to show contrasts between channels, for example) rather than simply summing them by combining in RGB channels.

 

How would this be done? Filters we use for UV (e.g., UG1) seem to be a reasonable approximation to the bee's UV response, but if we white balance the UV image and then desaturate, isn't this like an unweighted sum over the range of the camera's UV band? Should the image be adjusted somehow to give more weight to wavelengths near the peak bee UV sensitivity (~350 nm)? Or how should the monochrome UV image be created?

 

Are there filters available that mimic the B and G channels, which overlap considerably and extend into the UV?


I'm not sure how you would identify the wavelengths near the peak bee UV sensitivity if using a broadband UV-pass filter? Maybe if you used a narrowband UV-pass filter peaking at 350 nm?

 

There are blue-pass and green-pass filters. Google Schott Optical filters and look for them. I have a set of color band filters from Baader. But Schott or Hoya would quite likely be less expensive. Cadmium might be able to offer some advice on B and G filters. I don't recall anything just now about the transmission features of my color band filters.

 

Could you just take the Blue and Green channels from a Visible photo and use them? That's typically how we use B and G to make a stack. You can extract the B and G channels in most converters, in Photoshop or in Raw Digger.
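If it helps, pulling those channels out is a one-liner in numpy once the image is loaded as an array (channel order and the tiny example values below are assumptions for illustration):

```python
import numpy as np

def extract_channels(rgb):
    """Split an H x W x 3 RGB array into its green and blue planes.

    `rgb` is assumed to be a numpy array with channel order R, G, B
    (indices 0, 1, 2), as most converters will hand it to you.
    """
    green = rgb[..., 1]
    blue = rgb[..., 2]
    return green, blue

# Tiny 1x2 example image: a pure-green pixel and a pure-blue pixel.
img = np.array([[[0, 200, 0], [0, 0, 150]]], dtype=np.uint8)
g, b = extract_channels(img)
print(g.tolist())  # [[200, 0]]
print(b.tolist())  # [[0, 150]]
```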

 

I don't really have a specific answer about how best to make use of a monochrome UV photo for stacking or other uses. I do know you don't want to simply remove all colors from a color file or you might not be left with any tones !! :wacko: So I would probably start with Greyscale and experiment with that for a while. I recall a few years ago I made a mess of some stacks by using desaturation instead of greyscale because I either did not know or had forgotten that different colors of the same brightness* saturation look alike in a desaturated file. (Sometimes we simply do not think about the most obvious things. La !!!!! Speaking for meself, of course.)
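That desaturation trap is easy to demonstrate numerically. A small sketch, not tied to any particular editor (HSL-style desaturation versus a luminosity greyscale with the common Rec. 601 weights):

```python
import numpy as np

# Two fully saturated colours at full intensity: pure red and pure blue.
red  = np.array([255, 0, 0], dtype=float)
blue = np.array([0, 0, 255], dtype=float)

def desaturate(rgb):
    """HSL-style desaturation: lightness = (max + min) / 2."""
    return (rgb.max() + rgb.min()) / 2

def greyscale(rgb):
    """Luminosity greyscale using the common Rec. 601 weights."""
    return rgb @ np.array([0.299, 0.587, 0.114])

print(desaturate(red), desaturate(blue))  # identical: both 127.5
print(greyscale(red), greyscale(blue))    # distinct: red maps brighter
```

Desaturation maps both colours to the same grey; the weighted greyscale keeps them apart, which is exactly the difference Andrea describes.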

 

You could use a range of reflective standards (from 99% white to 2% black with one or two intermediates) for proper calibration of tones in a monochrome file.
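One way to use such standards: fit a line between the measured pixel values and the known reflectances, then invert it to calibrate the whole file. A sketch with invented pixel readings (the reflectance set matches the one suggested above):

```python
import numpy as np

# Known reflectances of the standards and the mean pixel values measured
# from them in the monochrome file (pixel values invented for the demo).
reflectance = np.array([0.99, 0.50, 0.18, 0.02])
pixel_value = np.array([235.0, 122.0, 46.0, 8.0])

# Fit pixel = a * reflectance + b, then invert it so that any pixel
# value in the image can be mapped back to estimated reflectance.
a, b = np.polyfit(reflectance, pixel_value, 1)

def to_reflectance(p):
    return (p - b) / a

print(round(to_reflectance(235.0), 2))  # close to 0.99 (the white standard)
```

A linear fit assumes the file is already in a linear (not gamma-encoded) state; otherwise linearise first.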

 

Perhaps some other folks have some insights here?

 

* ha-ha! There I make an error while talking about a past error!

What about the Luminance channel of L*a*b? I don't have access to any app with that right now, so I can't go compare it to desaturation or greyscale monochromes.
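For comparison without any app: L* of L*a*b* can be computed directly from sRGB values. A sketch of the standard sRGB-to-L* pipeline (D65 white point, sRGB primaries):

```python
def lab_lightness(r, g, b):
    """CIE L* of an sRGB colour (components 0-255), D65 white point."""
    def linearise(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    # Relative luminance Y from linear RGB (sRGB/Rec. 709 weights).
    y = 0.2126 * linearise(r) + 0.7152 * linearise(g) + 0.0722 * linearise(b)
    # CIE Y -> L*, with the standard 216/24389 cutoff for dark values.
    f = y ** (1 / 3) if y > 216 / 24389 else (24389 / 27 * y + 16) / 116
    return 116 * f - 16

print(round(lab_lightness(255, 255, 255), 1))  # 100.0 (white)
print(round(lab_lightness(0, 255, 0), 1))      # pure green: high L*
print(round(lab_lightness(0, 0, 255), 1))      # pure blue: much lower L*
```

Unlike plain desaturation, L* separates equally saturated hues, so it behaves more like a luminosity greyscale than like a desaturated file.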
The reference list in the paper (#2) above will keep anyone amused for hours. Ref 44 (Proc Natl Acad Sci USA 2013, 110:18686-18691) is all about how bees manage to land safely on flowers rather than crashing into them, and shows where the research of Horridge's group developed and has been applied in teaching flying robots how to land. "When approaching a vertically oriented target bees reduce their speed using the rate of expansion of the viewed image." This is achieved with the colour-blind green sensors in the bee eye: the bee slows down until the flicker caused by the edges of the rapidly approaching flower stops, at which point the bee has landed. No GPS needed!
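The landing strategy quoted above can be sketched as a toy simulation: if the bee holds the relative expansion rate v/d constant, its speed falls in proportion to distance, so it arrives at near-zero speed. (The rate constant and time step below are arbitrary illustrative values, not from the paper.)

```python
# Toy simulation of the constant-expansion-rate landing strategy.
k = 2.0            # chosen relative expansion rate, 1/s (illustrative)
d, dt = 1.0, 0.001 # start 1 m from the flower; 1 ms time steps
trace = []
for _ in range(3000):   # simulate 3 s of approach
    v = k * d           # hold v/d = k, so speed tracks distance
    d -= v * dt
    trace.append((d, v))

final_d, final_v = trace[-1]
print(round(final_d, 4), round(final_v, 4))  # both near zero: a soft landing
```

Distance decays exponentially, which is why the strategy needs no rangefinder: the flicker cue alone regulates the deceleration.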

Regarding a bee UV monochrome image: we have some wavelength info in our UV images due to the different spectral responses of the Bayer filters. It looks to me like there is response in the blue channel at longer wavelengths (~399 nm), red and green overlap in the middle range (~360 nm), and green predominates at shorter wavelengths (~340 nm). (I am basing this on sparticle images of good lenses seen on this site, plus some graphs of Bayer filter response which only cover the long end of UV, around 370-400 nm.)

 

Maybe just taking the red channel from the raw file might give a reasonable quick approximation - I think I saw someone do that on here?

Andy Perrin
Oh Jim, that is a can of worms! We have been debating whether you can use the false colors to get spectral info for ages. The issue is that if you have two or more wavelengths that are being reflected at the same time then you get a weighted average of their R G and B responses, and you can’t distinguish it from the same RGB components produced using a single wavelength in between.
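A tiny numerical illustration of that ambiguity, with made-up channel sensitivities deliberately constructed so the metamerism is exact (the middle column is the average of the outer two):

```python
import numpy as np

# Invented channel sensitivities: rows = R, G, B; columns = response to
# narrow bands at 350, 365 and 380 nm.
S = np.array([[0.2, 0.3, 0.4],
              [0.5, 0.4, 0.3],
              [0.1, 0.3, 0.5]])

single = np.array([0.0, 1.0, 0.0])   # pure 365 nm light
mix    = np.array([0.5, 0.0, 0.5])   # 50/50 blend of 350 nm and 380 nm

print(S @ single)   # camera RGB response to the single wavelength
print(S @ mix)      # identical response from the two-wavelength mix
```

Two physically different spectra, one camera response: the RGB triplet alone cannot tell them apart, which is Andy's point.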

True, but neither can the bee.

 

All that I am trying to do is match the image grayscale intensity to the bee UV receptor response (in a crude and simple way).

 

To do this one needs to understand the camera spectral sensitivity (camera here meaning the whole imaging system, including lens and filter). To measure it, one can present a range of single wavelengths (or narrow bands).

 

From what I have seen, this (after white balancing) usually gives a spectrum going from green through yellow to blue with only some hints of red. I don't think I have ever seen any red in a white-balanced UV image. To me this indicates that there is considerable spectral overlap between the red and green camera sensors in the UV range. If one were actually designing a UV camera one wouldn't do it like this, but we are dealing with the accidental spectral responses of sensors designed for the visible range.

 

All we have to play with are the RGB sensor outputs. To be honest I don't fully understand how to do it, but I am suggesting that the red sensor output could give a rough match to the bee UV sensor? How does one get just that output? It's not the same as the red channel from a camera-generated monochrome image, is it?
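On that last question: the raw red sensor output is the set of red photosites in the undemosaiced Bayer mosaic, whereas the red channel of a normal image has been interpolated from neighbouring sites during demosaicing. A sketch on a simulated RGGB mosaic (with a real raw file, a library such as rawpy exposes the undemosaiced values via its raw_image attribute):

```python
import numpy as np

# Simulated 4x4 sensor mosaic with an RGGB Bayer pattern: red photosites
# sit at even rows / even columns. Values are just placeholders.
mosaic = np.arange(16, dtype=float).reshape(4, 4)

red_sites = mosaic[0::2, 0::2]   # only the R photosites, with no
print(red_sites.tolist())        # interpolation from G or B neighbours
```

The result is a quarter-resolution image made purely from the red filter's response, which is the quantity Jim is after.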


This website https://maxmax.com/spectral_response.htm shows how the responses could be measured, but they only went to 375 nm and it's not clear if a UV filter was in place.

 

Has nobody here done anything like this?

 

I do have access to a monochromator and spectroradiometer so could do this. I think it would be publishable? Although ideally it would be done for a few different cameras - maybe there could be a collaboration?


Jim, I've been working on this (measuring spectral response curves) for months, and have now gone down to 280nm and up to 800nm. Threads:

 

http://www.ultravioletphotography.com/content/index.php/topic/2764-bayer-filter-transmission-in-uv-and-ir/

 

http://www.ultravioletphotography.com/content/index.php/topic/2580-build-thread-at-home-measurement-of-camera-uv-spectral-response/page__fromsearch__1


Wow, brilliant work Jonathan!

 

Huge apologies for missing that!

 

I'll have a good read through and try to understand before commenting further.

No problem Jim. There's quite a bit in there, as it has evolved somewhat from the initial idea, so have a read and let me know if you have any questions.

Looks really good Jonathan!

 

Just wondered: in your results (post #5) you show absolute sensitivities. I wonder what they would look like plotted as relative sensitivity, that is, each point as a % of the total area under each curve (or maybe even more simply as a % of the maximum per curve), with the greens averaged, I think. Maybe this is simplistic, but I think that would be like the response after white balancing? To test, you could then simulate what colour would be generated at each wavelength band (RGB ratio) and compare to sparticle images or similar.
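The normalisation described here might look like this in numpy (all response numbers are invented placeholders, not Jonathan's measurements):

```python
import numpy as np

# Invented absolute channel responses on a coarse wavelength grid (nm).
wavelengths = np.array([320, 340, 360, 380, 400])
R = np.array([0.02, 0.10, 0.30, 0.25, 0.08])
G = np.array([0.05, 0.20, 0.35, 0.15, 0.05])
B = np.array([0.01, 0.03, 0.10, 0.20, 0.30])

def relative(curve):
    """Rescale a curve to % of its own maximum."""
    return 100 * curve / curve.max()

# RGB ratio a narrow band at each wavelength would produce (the
# "simulated colour" test suggested above): normalise columns to sum 1.
stack = np.stack([R, G, B])
rgb_ratio = stack / stack.sum(axis=0)
print(np.round(rgb_ratio[:, 2], 2))   # channel mix for the 360 nm band
```

Comparing those per-wavelength ratios against sparticle images would then be a direct sanity check on the measured curves.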

 

It would also then be interesting to compare each curve, and combinations of curves, with the bee short-wavelength receptor. Maybe green plus red would be the closest, although its central peak would be a little long?

 

I read somewhere that white flowers need to have low UV reflectivity, otherwise they look the same as the background to bees. We tend to show them blue in UV photos, but maybe to simulate the bee UV channel that blue should not contribute too much to the monochrome mix...

Thanks Jim. You're getting beyond my basic maths abilities now. If you want, drop me a PM with your email address and I can send you the file to have a play with. Oh and I agree, I definitely think there is something publishable in the spectral response work.

I admire all this creative thinking but some facts are missing.

 

Wavelength to color is not a 1-1 correspondence.

Wavelength to grey tone is even further from a 1-1 correspondence.

Every monochrome model lumps together some different colors.

 

Bees do not have trichromatic vision.

As per Horridge, bees have Blue Content Cue and Edge Contrast Cue (via Green receptor) as their two preferred visual Cues.

 

For the bee the edges of a Yellow flower would stimulate the green receptor flicker detection in a small visual field. The flicker is seen as contrast changes not as "green".

The Yellow area of the flower would stimulate nothing. To the bee the Yellow flower is Absence of Blue Cue which is also a visual Cue. The Yellow flower is "bee black" to the bee.

Older papers about Bee Vision used to postulate that the bee could "see" yellow as maybe green via stimulation of the trailing end of the green receptor. But no.

 

I read somewhere that white flowers need to have low UV reflectivity, otherwise they look the same as the background to bees. We tend to show them blue in UV photos, but maybe to simulate the bee UV channel that blue should not contribute too much to the monochrome mix...

 

White flowers stimulate the bee's blue receptor, and the bee measures blue content in that flower area within a wide visual field. A white flower would have more blue content than a green grass or foliage background. If the white flower also has some UV reflectivity, that reduces the blue content.

 

White flower edges stimulate the bee's green receptor separately, and the bee sees flicker while moving within small visual fields.

The bee can determine an angle between the Blue Cue and the Green Cue.

 

Exercise for the Interested Reader: Take the green content of the background and the blue content of the white flower and look at them in monochrome. Under what monochrome conversions will they be the same? different? (Think saturation, brightness, luminance.) Report back with answers if you like. I already know. But I want you (the undefined 'you' who is reading, no specific person in mind) to know the answer.
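For anyone who wants to try the exercise numerically before reporting back, here is one way to set it up (the two RGB patches are invented, chosen only as plausible foliage-green and UV-blue tones):

```python
# A foliage green and a "UV-blue" flower tone, pushed through three
# common monochrome models. The patch values are made up for the demo.
green_bg = (60, 140, 50)
blue_fl  = (50, 60, 140)

def lightness(rgb):        # HSL desaturation: (max + min) / 2
    return (max(rgb) + min(rgb)) / 2

def brightness(rgb):       # HSV value: just the max component
    return max(rgb)

def luminance(rgb):        # Rec. 601 luma weights
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

for f in (lightness, brightness, luminance):
    print(f.__name__, f(green_bg), f(blue_fl))
```

With these particular patches, the saturation- and brightness-style conversions lump the two areas together while the luminance weighting separates them, which is the kind of outcome the exercise is probing.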

 


 

I really truly do not want to stop anybody from investigating broadband camera modeling of how bees, butterflies, other insects and animals see the world. But please study the facts first so your modeling makes sense for the particular animal. Thank you. Off soapbox now.

 

You know, I'm gonna have to go back through all my posts and add some Editorial note about how I was totally ignorant when I made bee vision remarks. I'm not at all embarrassed to do so but it will be soooooo time consuming!!!

 


 

I have been trying to make some updated charts showing a flower as humans see it, as the UV-pass camera sees it, in bee colors and in the Horridge bee detection model. The bee colors model is now considered inapplicable to real bee vision which is now known not to be trichromatic. Nevertheless it holds some interest as a comparison to the Horridge model so I have included it.

 

 

Shasta Daisy

This daisy is white with a yellow center shown against a green foliage background. The reflectivities are shown with a + sign. Absorption is shown with a - sign.

 

As the UV-pass camera sees it, the flower is rather uniformly UV-absorbing with some streaking and bright pollen, but I'm not using tiny details for these simple models.

 

If the bee had trichromatic vision (it does not), then the bee would see the daisy as bee-cyan with a green, lime-green or possibly yellow center.

 

In the Horridge model of bee vision, the bee detects the daisy by measuring the blue content and noting the edge contrast changes (shown in green). Those are the two preferred cues amongst the bee's several visual cues.

 

ShastaDaisy.jpg

 

 

 

Yellow Flower with UV-Absorbing Bullseye

Here is a generic yellow flower in the Asteraceae family which has a UV-absorbing central area, the well-known bulls-eye. The flower is shown against a generic green foliage background.

 

The UV-pass camera system sees the flower rays as UV-reflective and the center disc as UV-absorbing.

 

If the bee had trichromatic vision (it does not), then the bee would see the central disc as bee green, lime-green or possibly yellow. The bee would see the flower rays as the combination colour UVGreen, which I have shown by mottling green with dissolved purple as this is an imaginary colour to us humans.

 

In the Horridge model of bee vision, the bee detects the yellow flower by detecting the lack of blue content and noting the edge contrast changes (shown in green). Edge contrast changes are a preferred cue in bee vision but I don't know where Absence of Blue Content ranks as a bee cue.

 

BullseyeModel.jpg

 

 

 

If you don't know that I welcome all comments and corrections and suggestions, well, you do now !!! :D

 

Done now at 12:41 PM 22 May 2018. All edits will now be marked as such.


Andrea, I would suggest you consider the hypotheses and tests of other researchers, in addition to Horridge, no matter how thick his book is.

 

There is experimental evidence (2015) that bees and bumblebees strongly prefer to land on UV-absorbing areas of yellow flowers, even if these UV-absorbing areas are located on the periphery of artificial "flowers" (reverse bullseye, if I may call it like that). Which means they detect UV absorbing areas and use it for foraging.

 

There is also strong correlation between pollinator types (birds versus bees) and absence/presence of UV-absorbing nectar guides in yellow flowers.


I need a reference please !! Thanks. I'm happy to look at other work.

 

Horridge book not thick at all, btw. Very short and readable.

 

Here is a critique of Horridge's work. Quite interesting.

https://www.ncbi.nlm...les/PMC2845063/

 

 

All in all, there is no question that Adrian Horridge has made very significant contributions to the understanding of pattern recognition in bees, and that he has constructed a very plausible model of their behavior to small angle targets. His book is an excellent introduction to the subject, despite, or even sometimes because, of his very opinionated outlook. I say “despite”, because if one would believe him, the behavioral side of the studies are virtually finished, and they definitely are not. But also “because”, since a careful reading of the book will reveal the many interesting experiments of wide angle vision that remain to be done, along with the impetus that the reader is continually challenged to prove some of his most provocative statements wrong.

All I am after at the moment is a better understanding of the UV camera spectral response (i.e. measured response to monochromatic sources in the standard way this is defined) and how you might make a monochrome image that matches the response of the bee UV receptor. I have to take one step at a time and I will need to study how the different bee receptor inputs are combined, but it seems a reasonable place to start to think of the initial input layer.
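One hedged sketch of that matching step, once measured channel curves exist: solve a least-squares problem for fixed channel weights whose weighted sum best approximates the bee UV receptor curve, then apply those weights per pixel. All curves below are invented placeholders, not measurements:

```python
import numpy as np

# Invented camera channel responses (rows R, G, B) on a wavelength grid,
# and an invented bee UV receptor curve peaking near 350 nm.
wl = np.array([320, 335, 350, 365, 380])                 # nm
cam = np.array([[0.05, 0.15, 0.30, 0.35, 0.20],          # R channel
                [0.10, 0.25, 0.35, 0.25, 0.10],          # G channel
                [0.02, 0.05, 0.10, 0.20, 0.35]])         # B channel
bee_uv = np.array([0.10, 0.40, 1.00, 0.45, 0.10])

# Least-squares weights w so that sum_i w_i * cam_i(wl) ~ bee_uv(wl).
w, *_ = np.linalg.lstsq(cam.T, bee_uv, rcond=None)
fit = cam.T @ w
print(np.round(w, 2))     # weight per channel
print(np.round(fit, 2))   # best fixed-mix approximation to the bee curve
```

The monochrome image would then be w_R*R + w_G*G + w_B*B per pixel; how well a fixed linear mix can match the receptor depends entirely on the real measured curves.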
The bee receptor inputs are not combined in a trichromatic way, but I'm seriously not intending for that to stop you from your research here. I think the issue lies in how to define the monochrome you want. It is intriguing what you are proposing, but I can't quite see how it isn't going to lump some responses that you might want to keep separate. But I am very much going to shut up now and simply await your investigation. Go for it!!
Thanks Andrea - phew! When I skim-read what you wrote I thought it said: "I'm seriously intending to stop you from your research here" :D
I hope that everyone would know that I wouldn't ever do such a thing. My goodness!
Regarding the reason for the UV signature: could it have anything to do with the way the flowers (of the daisy - "day's eye" - type particularly) open up in the day and face the sun? Somehow maybe having UV absorption at the base of a petal and UV reflection further out might cause the petal to bend to the light. Maybe fanciful - just a thought I had while walking the dog. I am sure this subject has been researched somewhere - does anyone know?
Andy Perrin
Jim, the UV only makes up 5% of the sunshine, so the difference in absorption would be pretty small?

I know that there is a paper "out there" about the UV-absorbing areas and how they affect the reproductive parts. I will try to find it.

 

IIRC, I don't think the UV absorption all by itself has anything to do with floral heliotropism, but that is a good thought and we should try to verify it. Look for both heliotropism and phototropism.

 

Jim, you have such good thoughts and questions which have caused me to learn even more about various UV matters. Thank you !!

