
Bayer filter transmission in UV and IR



Ulf, I'm sure there are better ways to approach this, but there is a trade-off between convenience/usefulness and precision. I chose the 20 nm step size as my monochromator didn't allow me any better resolution than that due to peak overlap. If I had a better monochromator, perhaps it would make sense to use smaller steps, but not with the current setup.
Link to comment

Jonathan, did you convert your regular Canon to full spectrum or did you buy a third camera that is converted?

If you still have the normal one, it would be interesting to compare the direct sensitivity of all three on the same curve.

Or maybe not; if I remember correctly, the off-the-shelf one dropped fast at 400nm. I think I would like to see the mono vs full spectrum plotted as two separate curves though, rather than divided out. This analysis of yours gets to the old MaxMax point that full spectrum cameras don't see much below 360nm.

 

I have a plot here of all 3 cameras (Colour - Unmodified, Monochrome, and Multispectral - UV-Vis-IR with Bayer filter remaining), showing overall sensitivity between 280nm and 800nm. To calculate the curves I just took averages of the 4 channels - Red, Green, Blue and Green 2 - and corrected them for variations in light intensity across the measurement range. There are 2 lines per camera as I measured these curves in 2 goes (280nm to 480nm, and 440nm to 800nm). The units are arbitrary but relative: twice the score means twice the sensitivity to a given wavelength.
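For anyone who wants to redo this arithmetic, the whole calculation is just an average of the four channel readings divided by the relative lamp intensity at each wavelength. A minimal sketch in Python (the array values here are placeholders, not my actual measurements):

import numpy as np

# Measurement grid: 20 nm steps from 280 nm to 800 nm
wavelengths = np.arange(280, 801, 20)

# Mean raw value per channel (R, G, B, G2) at each wavelength - placeholder data
channels = np.ones((4, wavelengths.size))

# Relative intensity of the light source at each wavelength, measured separately
lamp = np.full(wavelengths.size, 0.8)

# Overall sensitivity: average the four channels, then correct for the lamp.
# The result is in arbitrary but relative units, as in the plot below.
sensitivity = channels.mean(axis=0) / lamp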

post-148-0-18468900-1526897985.jpg

 

It does indeed echo MaxMax's point that the full spectrum cameras would be expected to see very little below 360nm. This will be mainly driven by the Bayer filter absorption, as the sensor itself in the monochrome conversion remains sensitive down to 300nm, and drops steadily between 400nm and 300nm.

 

Now I must emphasize these are averages of all 4 channels, so individual channels would show higher values depending on wavelength.

 

Thanks to Ulf for suggesting the gridlines, it does make it easy to compare the data from the different cameras.

Link to comment

Ulf, I'm sure there are better ways to approach this, but there is a trade-off between convenience/usefulness and precision. I chose the 20 nm step size as my monochromator didn't allow me any better resolution than that due to peak overlap. If I had a better monochromator, perhaps it would make sense to use smaller steps, but not with the current setup.

 

Jonathan,

Not intended as criticism.

 

I was mainly interested in how the windowing of the monochromator works.

I have never worked with a monochromator myself and wondered if the selected wavelengths within the 20nm span gave a reasonably constant throughput.

If that is true, it might, with a lot of work, be possible to extract more data with overlapping spans, a bit like the method used by modern pixel-shift to improve resolution. A sketch of what I mean follows.
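To make the pixel-shift analogy concrete, here is a rough sketch in Python with made-up numbers: if the 20nm windows are stepped by 10nm instead of 20nm, each reading is the sum of two adjacent 10nm bins, and the narrower bins can be estimated by regularized least squares. Whether this works in practice depends entirely on whether the throughput within each window is constant, which is exactly my question.

import numpy as np

n_bins = 12                      # hypothetical 10 nm bins across the range
n_windows = n_bins - 1           # 20 nm windows stepped by 10 nm

# Each window reading sums two adjacent 10 nm bins
A = np.zeros((n_windows, n_bins))
for i in range(n_windows):
    A[i, i:i + 2] = 1.0

true_bins = np.linspace(1.0, 3.0, n_bins)   # a made-up "true" spectrum
readings = A @ true_bins                     # simulated overlapping readings

# One equation short of determined, so add a small smoothness penalty
# (second differences) to pick a sensible solution
D = np.diff(np.eye(n_bins), 2, axis=0)
A_reg = np.vstack([A, 1e-3 * D])
b_reg = np.concatenate([readings, np.zeros(D.shape[0])])
estimate, *_ = np.linalg.lstsq(A_reg, b_reg, rcond=None)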

 

I still think it is better to calculate the effects of filters instead of doing the measurement with the filter on the lens.

Then you will reduce the problem caused by noise and can maintain a reasonable baseline = 0.

 

(filter transmission) x (sensor response)
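In code form it is just a pointwise product once the filter curve is interpolated onto the measurement grid. A sketch in Python (the filter numbers here are invented; in practice take them from the datasheet):

import numpy as np

# Sensor response measured without the filter, on the 20 nm grid
wavelengths = np.arange(280, 801, 20)
sensor_response = np.ones(wavelengths.size)          # placeholder values

# Filter transmission digitized from a datasheet at its own wavelengths
filt_wl = np.array([300.0, 320.0, 340.0, 360.0, 380.0, 400.0])
filt_tx = np.array([0.05, 0.40, 0.80, 0.90, 0.70, 0.10])   # made-up numbers

# Interpolate the filter curve onto the measurement grid, zero outside its range
filt_on_grid = np.interp(wavelengths, filt_wl, filt_tx, left=0.0, right=0.0)

# (filter transmission) x (sensor response)
effective_response = filt_on_grid * sensor_response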

Link to comment
Ulf. Not a problem, I was just trying to explain the reason for my choice of 20 nm intervals. Yes, I can try calculating it, but I always prefer to see raw, measured data first if possible. In my line of work, I've seen far too many models created before the data they are meant to be modelling has been measured. The people using them then blame real life for not matching the model. And yes, I have had conversations which went along the lines of "your measurements must be wrong as they don't match my model".
Link to comment

Ok, a wonderfully controversial topic for discussion. Can a RAW image from a camera provide spectral information about reflected wavelengths? Looking at the camera sensitivity curve through the Baader U filter, with the multispectral Canon 5DSR and 85mm Asahi UAT lens, I get the following set of curves for the Red, Green, Blue and Green 2 channels (as shared in the thread above):

post-148-0-33610200-1527091487.jpg

 

Looking at the curve, the ratio of Red to Green to Blue changes at different wavelengths. For instance at 380nm, R:G:B is about 2:1:1, and at 360nm R:G:B is about 3:2:1. So is it possible to look at the R:G:B ratios in a RAW photograph and infer something about the wavelengths of light being reflected?

 

As an example I took a photo of a buttercup in the garden with this setup (5DSR multispectral, UAT lens, Baader U filter) and imported it into RawDigger as a RAW composite (not an RGB render). I then measured Red, Green and Blue values from two areas. First, the petals of the buttercup:

post-148-0-47642100-1527091831.jpg

 

Second, the background foliage:

post-148-0-77801700-1527091852.jpg

 

If I then look at the ratios of Red to Green to Blue in these two areas, I get the following.

 

Petals R:G:B 3.41 : 1.60 : 1.00

Foliage R:G:B 2.08 : 1.04 : 1.00

 

So clearly the Red to Green to Blue ratio for the buttercup is different to that of the surrounding foliage, as would be expected; otherwise, why would they be different colours in the final images?

 

What does this mean though? Going back to the sensitivity curve for the Baader U + UAT + EOS 5DSR multispectral above, it is possible to see where the ratios of Red to Green to Blue most closely match those of the buttercup and the foliage. From this it looks like the foliage ratios are closest to the ones seen around 380nm, while the petals are closer to 365nm.
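To make the matching explicit, here is a rough sketch of the comparison in Python. The calibration ratios below are illustrative placeholders in the spirit of the chart, not precise read-offs, and the method is just a nearest-neighbour match of blue-normalized ratios:

import numpy as np

# Approximate R:G:B ratios (normalized to blue), one row per wavelength
wavelengths = np.array([340, 360, 380])
calib_ratios = np.array([[1.33, 1.33, 1.00],    # 340 nm
                         [3.33, 2.00, 1.00],    # 360 nm
                         [2.00, 1.00, 1.00]])   # 380 nm

def nearest_wavelength(rgb):
    # Normalize to blue and find the closest calibration point
    target = np.asarray(rgb, dtype=float) / rgb[2]
    return wavelengths[np.argmin(np.linalg.norm(calib_ratios - target, axis=1))]

# Measured ratios from RawDigger, already normalized to blue
print(nearest_wavelength([3.41, 1.60, 1.00]))   # petals  -> 360
print(nearest_wavelength([2.08, 1.04, 1.00]))   # foliage -> 380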

 

I accept this is a somewhat controversial area, and there are still a lot of questions to be asked and answered. Do I know the reflectance spectra of the foliage in UV? No. Do I know the reflectance spectra of the light parts of the buttercup's petals in UV? No. Can these be measured? Yes. What about mixing of two wavelengths of light hitting the sensor? That remains to be determined, especially when measuring in the UV. Have I taken into account the drop in UV intensity as a function of wavelength? No, although a drop in intensity would not be expected to change the ratio of red to green to blue at a given wavelength, only the absolute values.

 

Right, I'll be the first to admit, this is very early days for this type of work, and there are plenty of things still to address. More work needed :)

Link to comment

This is how it is typically done!

Looks good so far. Keep on goin' !!!!!

 

One thing you need to figure out is how to deal with: The same hue at different intensity. Light/dark versions of the same hue. Saturated/unsaturated versions of the same hue.

 

In the raw photo I have rarely found more than a few clustered hues when informally measuring around the photo. I think learning how to make a hue map of your original photo might also be useful in this effort. I've tried that a few times. Limited success because I didn't have the correct tools!! Need to research this more. Like I have the time!! (I don't.) I found one app for doing this, but it was old and no longer worked. If you find something, let us know. I suspect MatLab might be useful for this???

 

I think I'd take the transmission chart and code it with hue color bars at full strength?? Then you could make versions with 50% saturation or 50% brightness and so on. I've done this informally with raw hues from UV photos of the standards. (Posted somewhere. Can't find anything anymore 'cause we have too much.)

 

 

 

I envision an app........run the raw UV photo thru it, creates a kind of clustered hue map, marks hues with wavelengths. Eh voilà - the poor man's spectrothingy. Write it Jonathan & Andy & whoever.

:D :D :D B) B) B) smiles of encouragement.....

Link to comment

Jonathan - I have been thinking about this a lot!

 

Can you clarify: are the sensor spectral response curves normalized to the illuminating source, i.e. is the y-axis % transmission? If not, it is a convolution of the input spectrum and the sensor response, and therefore the ratios would not be directly comparable to an image taken under a different light source.

 

Are the raw images completely un-adjusted? They look closer to being white balanced compared to what I get out of my camera - is there any firmware-type adjustment to compensate for the different responses in the different channels?

 

Unfortunately, I think better wavelength resolution is needed. The range of interest is probably only, say, 400-340nm, and this is only 4 data points. Your curves all have the same peak position at 380 nm. I think the false colours come from subtle differences in the peak position, probably shifts of less than 20 nm?

 

Regarding working backwards from the ratio of RGB to input spectrum - this is what is known as an ill-posed inversion problem. That is, there are a large number of (infinite?) input spectra that can give rise to the same RGB ratios. Generally speaking, you can only find a solution which has fewer parameters than the number of independent (uncorrelated) measurements. Sorry if that sounds a bit technical ...

Link to comment

As I say, plenty more to look at.

 

Jim, the y axis is spectral response normalised to the intensity of the light source across the measurement range.

 

The raw images are straight from the camera and as 'unadjusted' as I can get them. In RawDigger they are then opened as raw composite images, not RGB rendered images, which is probably why they look different.

 

More resolution would be great, but that is about as far as I can push my system. With money to burn I'd buy a fancy monochromator and better light source etc etc, but until the lottery win this'll have to do.

 

You lost me in the last paragraph, sorry. Bit beyond me.

Link to comment

Jonathan, to answer your question, I think yes. But how we get there and what we learn is the fun part.

To tease out filter effects, you should do exactly what you did but with the U340/S8612 and U360/S8612 combinations. If you get the same camera sensitivity curves for these stacks with different peak maxima, then we get part of the answer. You may not even need the S8612 if your monochromator is clean. But it would be needed for a cross-reference floral image.

 

Something I like to see is the different images I get with filters and my LED source. I am lucky and my ZWB1 does not see visible light, as I have a 405nm-centered lamp and much larger exposure values. I also convinced myself by placing a Wratten 2A in front and seeing nothing. But I still may pick up a U340 to compare. I am also hoping that a UG1 and a UG11 are present in a lot of microscope UV bandpass filters I bought.

Link to comment

Regarding working backwards from the ratio of RGB to input spectrum - this is what is known as an ill-posed inversion problem. That is, there are a large number of (infinite?) input spectra that can give rise to the same RGB ratios.

 

Or, said more simply (mathematically), the mapping between wavelength and RGB color is not 1-to-1. Thus, it is not invertible. You can sort of go in one direction with a fair amount of accuracy, but you probably can't go backwards with any guarantees.

 

That is to say, most visible wavelengths or wavelength "mixtures" will usually (but not always!) map to an RGB color as represented, say, in a CIE colorimetric representation. And "accidentally", so to speak, we get the same deal with UV wavelengths to RGB false color as seen in the raw file before WB. But the way color spaces are defined causes some problems with the mapping (see "twists", etc.). Then the effects of saturation, brightness and luminance must be accounted for. And the camera demosaicing algorithms further mush up the final colors in the raw photo (before WB). So going from color back to wavelength has to undo all the assumptions (tristimulus values etc etc etc), and after that there are still many choices of wavelength or wavelength combo from whence cameth that color.
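For anyone who likes seeing the degeneracy numerically, here is a tiny Python sketch with invented response curves: three channels sampled at twenty wavelength bins leave a seventeen-dimensional family of spectra that all produce exactly the same RGB triple. Physically a spectrum must also be non-negative, which shrinks the family but does not eliminate it.

import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(0)

# Three channel response curves over 20 wavelength bins (invented values)
responses = rng.random((3, 20))

spectrum = rng.random(20)          # some input spectrum
rgb = responses @ spectrum         # the RGB triple the camera records

# Any null-space vector of the response matrix can be added to the spectrum
# without changing the recorded RGB at all
ns = null_space(responses)         # shape (20, 17): a 17-dimensional family
other = spectrum + 0.1 * ns[:, 0]
print(np.allclose(responses @ other, rgb))   # True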

 

Nevertheless everyone we have ever had on UVP has wanted to solve this insoluble problem! Jonathan has made some good progress towards a reasonable approximation. I really do envision an approximate solution which perhaps would be of some use as a preliminary field estimate. But you won't ever want to bet the farm on it methinks.

 

I forgot to mention the dependence on camera sensor, lens in use, filter in use and illumination. IF sunlight is used as the illumination, then all color-wavelength maps will need to be correlated with time-of-day, altitude, lat/long, time-of-year? Or can that be skipped? I don't really know.

Link to comment

I did just write a long and detailed post on this topic, but it has vanished!

 

I don't have the energy to re-write it, but there was some overlap with what Andrea has just said anyway.

 

In the specific case you mention of the buttercup and the background - what you suggest (that you could determine the source from the RGB ratio) would only work if the source was monochromatic. Otherwise there are very many possible spectra that can give the same RGB response.

 

This problem is only solvable if you can make some big assumptions about the source - if, for instance, you knew something of the chromophores present and their reflectance spectra. That's just an example - you can't go backwards to a solution that has more detail than the resolution of the measurement system unless you have some other information about the solution in advance.

Link to comment
To be clear, I am not saying that I think the camera can become a spectrometer that's capable of measuring down to very specific wavelengths - is this pixel an image of a surface that's reflecting 341nm and that one 344nm? I think for me, the next step is to capture some spectra of the reflectance from some buttercups (the bit that's yellow in the UV image) and foliage and see what they look like. I'm an empiricist, and work better with data....
Link to comment

I think I am thinking about your question differently. If you have a monochromatic light source and you shine it on a flower, the flower will act like a prism, shifting the wavelengths slightly, and provide new wavelengths as fluorescent output based on any excitation provided by the light source. For example, with my 405nm LED light bulb I get a spread back, which includes some UV, and I do get back IR fluorescence.

 

Now if you have a white light source, then all bets are off. Spectral unmixing is not hard; this is how I used to do 8-color flow cytometry. But you need controls for everything. With sunlight, the controls needed would outweigh your time.
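For what it's worth, once you have the controls the unmixing itself is small linear algebra. A sketch in Python (the control matrix here is invented; in practice each column comes from imaging one pure control):

import numpy as np
from scipy.optimize import nnls

# Rows = detector channels, columns = controls (invented numbers)
controls = np.array([[0.9, 0.2, 0.1],
                     [0.3, 0.8, 0.2],
                     [0.1, 0.3, 0.9]])

measured = np.array([1.4, 1.1, 0.8])   # channel readings for one pixel/cell

# Non-negative least squares gives the abundance of each control
abundances, residual = nnls(controls, measured)
print(abundances)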

Link to comment

Just to remind us what we are trying to understand, here's spectral sensitivity of the honeybee:

 

post-28-0-58390900-1527141137.jpg

Link to comment

Jonathan, we gotta try to find out how to make a hue map of a raw UV file!! I wonder if the RawDigger guys can help me with this.

 

Dave, look at the one I found in color! I don't know how accurate it is.

Unfortunately, try as I might, I cannot find a credit for this chart.

https://fieldguidetohummingbirds.wordpress.com/2008/11/11/do-we-see-what-bees-see/

 

15e19615c5269d7246d3d8f15b2cb055--bee-safe-bee-do.jpg

Link to comment

I.Fell.Down.the.MatLab.Rabbit.Hole.

oh wow! So cool. Can def make a hue map with MatLab.

 

I'll be back sometime next autumn...........

Link to comment

Thanks Andrea,

 

That also shows the bird eye response which is probably a whole new topic, more questions.

 

Dave

Link to comment
Andy Perrin

I.Fell.Down.the.MatLab.Rabbit.Hole.

oh wow! So cool. Can def make a hue map with MatLab.

 

I'll be back sometime next autumn...........

MWAHAH-HAH-HAH! Yes, it is lovely, you have total control.

Link to comment

[[Off Topic: Andy, just what I need --- another coding obsession !!! I spent two months this winter writing Unix shell scripts to rename my botanical files and folders. It was so FUN!! Now I want to get lost in MatLab so I can make Hue Maps of the raw UV files. To what end, well, I don't know. It would just be interesting. Of course, don't let me stop you-who-knows-MatLab from doing that. Heh, heh, heh........]]

 

I hope Jonathan will forgive me for butting into his topic and going nutso over the idea of a Hue Map. I promise to stay on topic henceforth.

Link to comment
Andy Perrin

Yeah, I don't think you really can invert a hue into a wavelength for the reasons already stated? So I'm not sure what this hue map does exactly.

 

Just to point out some of the issues, suppose we wanted to do this in visible light. If you take a photo of a subject that's partly in sunlight and partly in shadow, the shadows look bluer. So how would you infer anything about the reflection of the surface when the same surface might have two different hues depending on whether you put your imaginary eyedropper tool in the shadows or the sunlit part?

Link to comment

I wasn't going there just yet. That is, to any kind of wavelength-hue correlation, which I tend to rant about on occasion, even though I *am* curious just how far one can go in that direction and am encouraging the attempt by Jonathan.

 

I merely want a raw UV Hue map to see "what is there". I've been looking at the raw UV hues for some time now and how they differ amongst our various UV-pass options. In my botanical photos and photos of standards I can almost tell what filter was used by looking at the raw composite.

 

If you look back at Jonathan's transmission chart above, you should be able to "see" the raw hues in it. As an example, any transmission interval where red is over blue and green is going to produce some kind of desaturated red. For example, (R, g, b), where g = b and g, b < R.

 

All raw files I've looked at so far seem to have orange, pink, magenta, purple, and blue-violet colors. I've posted about the Hues and shown some examples using the standards. [Filter Test] AndreaU MK II with BU, SEU, U-360 :: Standards & CC Passport

Link to comment
Andy Perrin

Huh, okay, I think I'm just totally lost then. I thought you were talking about this comment:

I envision an app........run the raw UV photo thru it, creates a kind of clustered hue map, marks hues with wavelengths. Eh voilà - the poor man's spectrothingy. Write it Jonathan & Andy & whoever.

:D :D :D B) B) B) smiles of encouragement.....

 

So what is a "raw UV Hue map"? What are you mapping to what? State the domain and the range, Ms. Former Math Teacher.

Link to comment

domain: raw photo pixels color-grouped in some way. Perhaps using MatLab segmentation routine? Or posterize the photo a bit?

range: pixel color-group average Hue (and saturation and brightness)
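Something like this is what I have in mind, sketched in Python for concreteness (the bucket size is arbitrary, and the image here is a stand-in for a demosaiced raw composite):

import numpy as np
from matplotlib.colors import rgb_to_hsv

img = np.random.rand(120, 160, 3)    # stand-in for the raw composite, floats in [0, 1]

hsv = rgb_to_hsv(img)
hues = hsv[..., 0] * 360.0           # hue in degrees

# Posterize into 15-degree hue buckets, then report each occupied bucket's
# average hue, saturation and brightness
buckets = (hues // 15).astype(int)
for b in np.unique(buckets):
    mask = buckets == b
    print(f"bucket {b:2d}: hue {hues[mask].mean():5.1f} deg, "
          f"sat {hsv[..., 1][mask].mean():.2f}, "
          f"bright {hsv[..., 2][mask].mean():.2f}, {mask.sum()} px")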

 

As for the comment you quoted -- once you had a Hue map, you could attempt to correlate it with wavelengths by using the transmission chart and converting that to hues/colors and matching up.

 

Look at Jonathan's transmission chart in Post #30.

At 380 nm there is a ratio R:G:B = 180:90:90. Scale that to an RGB value (255, 127, 127) to get the hue 0° with saturation = 50% and brightness = 100%. This is a desaturated red we commonly call "pink".

 

At 360 nm, Jonathan's chart shows an approximate ratio R:G:B = 100:60:30. Scaled, that would be (255, 153, 77), which is hue 26°, sat = 70%, bright = 100%. So the chart has moved towards orange, which sits at 30° on the color wheel.

((I'm rounding off a bit just to make this easier.))

 

At 340 nm there is an approximate (255, 255, 192), which is 60°/23%/100%, a yellow.

And so on......
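If anyone wants to check my arithmetic, Python's colorsys module does the same conversion (expect hair-width differences from my rounding):

import colorsys

# The scaled RGB values from the worked examples above
for label, rgb in [("380 nm", (255, 127, 127)),
                   ("360 nm", (255, 153, 77)),
                   ("340 nm", (255, 255, 192))]:
    h, s, v = colorsys.rgb_to_hsv(*(c / 255 for c in rgb))
    print(f"{label}: hue {h * 360:.0f} deg, sat {s:.0%}, bright {v:.0%}")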

Link to comment
Andy Perrin

I see, yes, you could do the former.

 

Regarding the comment, that would not give you any useful info, because the reflected colors aren't single wavelengths, they are mixtures. It's a hopeless dream, guys. What you want is an imaging spectrometer. They exist! We could home-brew one with a motorized filter wheel! But what I don't think we can do is use the Bayer response to get more than a crude grouping into three UV wavelength ranges, and even if we did that, the aforementioned issue of different parts of the image having different illumination (shadows vs. direct sun, etc.; even just variations in angle make a difference to this sort of thing) comes into play.

Link to comment
