UltravioletPhotography

Color vs Wavelength: Two Visible Examples


Andrea B.


Wed 18 Apr 2018 EDITOR'S NOTE: Just a quick note to let you know that it will take me more than one session to get this topic completely written. So I am temporarily locking the post to comments until I am finished. Then we will open it up and welcome any and all comments, corrections and suggestions.

 

Thurs 19 Apr 2018 UTC 12:19 Editor's Note: I think I am mostly done. I will re-read this for typos. And then re-read this for meaning. (It is still Wednesday the 18th here on the East Coast of the US.)

 

Sat 21 Apr 2018 Editor's Note: Thanks to Andy Perrin for proper terminology: bands and bandwidth. I am cleaning up the posts to use that.

 

Please credit UVP, Jonathan Crowther, Andrea Blum in any discussions of this article elsewhere.

Don't make us come after you with tar and pitchforks.


 

Brief Intro

The topic of using a converted camera as a device to determine the value of reflected wavelengths has been debated since the beginning of digital UV/IR photography. Can it be done? How accurate are the results? This will be an addition to that dialogue.

 

Here are two previous posts on this subject:

.

Our UVP member Jonathan (handle: JMC) of JMC Scientific Consulting Ltd has very kindly given me some raw CR2 visible files from his sensor spectral response measurements using a Canon EOS 5DS R so that I could investigate the raw colors. You can read more about his ongoing explorations here in the original thread:

.

I made a nice Excel chart showing my explorations of the raw colors associated with the 460 nm and 560 nm measurements. I will also attempt to list the factors affecting the colors we see in this chart, in order to better understand how a converted digital camera can be used spectrometrically.

 

I'm not drawing any conclusions here in this post. This is just a little raw color investigation, OK? :)

 

You can help, please, by adding your comments, suggestions and corrections so that we can all better understand this topic.

 

CHART

This is the Excel chart exported to a JPG. The conversion to JPG and a downward resize introduce tiny artifacts and a bit of softness in the labels, so if you would like a clean version, there is a PDF upload link following the Chart.

 

Jon_ColorChart01.jpg

 

 

Chart PDF: Jon_ColorChart.pdf

 


Raw Colors

greatly simplified...

The digital camera records light intensity at each physical photosite. The number of recording levels, or bit depth, is usually 12-bit or 14-bit in current digital cameras. The raw file contains all these light intensity measurements together with the Bayer filter color the light passed through. During the demosaicing process, a raw color is developed for each photosite by taking into account the surrounding photosites, and that raw color is assigned to a pixel (which may not necessarily correspond to a physical photosite).

Then the pixel colors are tweaked by application of autoscaling and gamma. Autoscaling "stretches" the 12/14-bit intensity representation to the full 16-bit depth; this increases brightness when RGB values are compared. Historically, a gamma curve was applied to an image to compensate for the non-linear response of a CRT monitor. Now gamma is applied to compensate for the linearity of the raw data and for the fact that computers (and other digital display devices) still expect gamma to have been applied, even though we no longer use CRT monitors. The gamma curve alters both the color hue and the brightness when RGB values are compared.
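To make the autoscaling and gamma steps concrete, here is a toy numeric sketch in Python. The 14-bit triplet is invented for illustration; this is not Raw Digger's actual algorithm.

```python
# Toy sketch of the autoscaling and gamma steps described above.
# The raw triplet is invented for illustration -- this is not
# Raw Digger's actual algorithm.

def autoscale(value, bit_depth=14):
    """Stretch a 12/14-bit linear intensity to the full 16-bit range."""
    return round(value * 65535 / (2 ** bit_depth - 1))

def apply_gamma(value, gamma=2.2, full=65535):
    """Encode a linear 16-bit intensity with a power-law gamma curve."""
    return round(full * (value / full) ** (1 / gamma))

raw = (400, 4200, 4000)                          # dark linear R, G, B (14-bit)
scaled = tuple(autoscale(v) for v in raw)        # brighter, but still linear
encoded = tuple(apply_gamma(v) for v in scaled)  # gamma lifts dark values most
print(raw, scaled, encoded)
```

Comparing `scaled` and `encoded` shows why the chart's brightness percentages jump after each step even though the underlying photosite measurements never change.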

 

References:

Raw Color Dependencies

  • physical photosites (and microlenses, etc.)
  • photosite bit-depth
  • Bayer filter layout
  • Bayer filter dyes
  • demosaicing algorithm
  • autoscaling
  • gamma
  • camera settings for noise reduction, ISO and raw compression (or the lack thereof)

.

Chart Color Labels: For each color block both the HSB and RGB values are provided. For review, H = the color wheel hue in degrees. S = the color saturation in %. B = the color brightness in %. More about the HSB model can be found here: HSL and HSV. Most of us are familiar with the RGB model, but for completeness here is a link: RGB Color Model.
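For readers who want to check the chart's numbers themselves, Python's standard `colorsys` module converts between the two models (it calls HSB "HSV"):

```python
import colorsys

def rgb_to_hsb(r, g, b):
    """8-bit RGB triplet -> (hue in degrees, saturation %, brightness %)."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return round(h * 360), round(s * 100), round(v * 100)

print(rgb_to_hsb(0, 0, 255))  # pure blue  -> (240, 100, 100)
print(rgb_to_hsb(0, 255, 0))  # pure green -> (120, 100, 100)
```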

 

 

CHART BOX 1: Raw color samples before autoscaling and gamma

The CR2 raw file was opened in Raw Digger. It was then exported as a TIF with no autoscaling and no gamma applied. The color blocks in box 1A show the dark raw colors as sampled in my Photoshop Elements with a 5x5 sampler. The colors are so dark that we cannot really see very well what they are. So the color blocks in box 1B show the raw dark colors pulled up to full saturation and full brightness. Now we can get a feel for what raw colors were recorded. At 460 nm the color is a kind of blue and at 560 nm the color is a kind of green. Which blue and which green remain to be developed.
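The "pulled up to full saturation and full brightness" step in box 1B amounts to keeping the hue and forcing S and B to 100%. A sketch of that operation (the dark sample triplet here is made up; the real box 1A values are on the chart):

```python
import colorsys

def full_sat_bright(r, g, b):
    """Keep the hue of an 8-bit RGB triplet, force S and B to 100%."""
    h, _, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    rr, gg, bb = colorsys.hsv_to_rgb(h, 1.0, 1.0)
    return round(rr * 255), round(gg * 255), round(bb * 255)

print(full_sat_bright(6, 10, 15))  # a dark bluish sample, made vivid
```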

 

Chart Box 1 Dependencies

  • Raw Digger demosaicing algorithm
  • Photoshop sampling size
  • Photoshop and Excel RGB color pickers
  • Conversion to web-viewable JPG

.

CHART BOX 2: Raw color samples with autoscaling but no gamma.

The CR2 raw file was opened in Raw Digger. It was then exported as a TIF with autoscaling applied but no gamma. The autoscaled color blocks in box 2A are still dark. So the fully saturated and bright versions are shown in box 2B. We do not see much change between the dark colors in boxes 1A and 2A, but there has been an increase of brightness from 6% to 32% for the 460 nm color blocks - and from 4% brightness to 19% brightness for the 560 nm raw color block. Those are big brightness increases even if our eyes don't perceive it as such. (Larger samples might be more revealing here.)

 

Chart Box 2 Dependencies

  • Raw Digger demosaicing algorithm
  • Photoshop sampling size
  • Photoshop and Excel RGB color pickers
  • Conversion to web-viewable JPG
  • Autoscaling algorithm

.

CHART BOX 3: Raw color samples with gamma but no autoscaling.

The CR2 raw file was opened in Raw Digger. It was then exported as a TIF with gamma 2.2 applied but no autoscaling. Again, the raw gamma-ed color blocks in box 3A are dark. So box 3B shows the fully saturated, fully bright versions in order to better understand this color. Note that brightness has increased over the values in box 1A while the saturation has decreased.
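The brightness-up/saturation-down effect of gamma noted above can be demonstrated on any dark, saturated triplet (the sample here is invented):

```python
import colorsys

def gamma_encode(rgb, gamma=2.2):
    """Apply a power-law gamma to each 8-bit channel (simple sketch)."""
    return tuple(round(255 * (c / 255) ** (1 / gamma)) for c in rgb)

dark = (5, 41, 39)            # illustrative dark, saturated color
bright = gamma_encode(dark)

for name, rgb in (("before", dark), ("after", bright)):
    _, s, v = colorsys.rgb_to_hsv(*(c / 255 for c in rgb))
    print(name, rgb, f"S={s:.0%} B={v:.0%}")
# Gamma raises brightness but compresses the channel ratios,
# so saturation drops -- the box 1A vs 3A pattern.
```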

 

Chart Box 3 Dependencies

  • Raw Digger demosaicing algorithm
  • Photoshop sampling size
  • Photoshop and Excel RGB color pickers
  • Conversion to web-viewable JPG
  • Gamma value

.

CHART BOX 4: Final raw color sample. Both autoscaling and gamma applied.

The CR2 raw file was opened in Raw Digger. It was then exported as a TIF with both gamma 2.2 and autoscaling applied. Note that we can finally distinguish the raw color in box 4A. It is no longer dark. But I have supplied the fully saturated/bright version again in box 4B just to continue the pattern. The dependencies have already been listed, so will not be repeated here.

 


 

PAUSE

for a brief consideration of True Colors

For each of the 460 nm and 560 nm wavelengths, we have 4 versions of the raw color recorded by the camera. All are derived without any application of white balance. The raw colors derived from the 460 nm sample are at 232°, 230°, 217° and 217° on the color wheel. They are all blue hues with a touch of green. For the 560 nm sample, the raw colors are at 108°, 107°, 98° and 99° on the color wheel. These are all green hues with a touch of red - or you might say, with a touch of yellow.

Which hue truly represents the 460 nm wavelength? And the 560 nm wavelength?

Is there an answer to that? Good question!

 

ADDED 21 Apr 2018: I think I have just encountered the "twist" in color modeling which explains why the raw samples seem to fall into two groups.

 

I think I hear Cyndi somewhere in the background ......

 


 

 

CHART BOX 5: Canon EOS 5DS JPG Auto White Balance

Every raw file contains a JPG which is created by the camera's JPG software. When these photos were made, the camera was set to Auto white balance - because you have to set it to something. Box 5 shows the color sample from the JPGs. Clearly the 460 nm JPG color has picked up some red to veer away from blue towards a blue-violet. The JPG green has picked up some blue, but didn't travel quite as far away from the very first raw sample.

Although this is just one white balance example, I am very comfortable in stating that I don't think a white-balanced JPG color should be used for any wavelength-to-color correlation. What do you think?

 

 

CHART BOX 7: Reference Colors

The last three color blocks on the chart provide some RGB reference colors. The two colors in box 7A are the pure reference hues at 240° and 120° on the color wheel: blue (0, 0, 255) and green (0, 255, 0). The other color blocks shown in boxes 7B and 7C are some typical "named" colors which are close in hue to some of the fully saturated/bright colors shown in preceding boxes.

 

 

CHART BOX 6: Wavelength-to-RGB Approximation

Before discussing my result in Box 6 showing the conversion of 460 nm and 560 nm wavelengths to RGB colors, I want to list some references. Let's just say that there are a lot of assumptions made when assigning an RGB value to a particular wavelength. If you want to learn more about that, you can get started with these links.

  • The original program for the online wavelength-to-color tool is found here: http://www.efg2.com/...ng/Spectra.htm. This page is sponsored by Earl F. Glynn. Code for the conversion is presented on the page.
    Please note the Disclaimer mid-way down the page. There is no unique one-to-one mapping between wavelength and RGB values.

  • In the above links you will find lots of references and other links to color and spectral topics. Of particular interest, I think, is this discussion: Rendering Spectra.

  • Measuring Color, 4th Ed. by R.W.G. Hunt (2011, Wiley). This is one of several good books about colorimetry. In particular I've been reading in it about how a wavelength is matched to an RGB triplet. This correlation is based on results from human color vision experiments (of course, what else could it be based on, smile). Tristimulus values, spectral weighting functions and color matching functions are key terms in this process.

.

In Box 6 I have displayed the color which results from using the Wavelength to Color tool to obtain RGB values for the 460 nm and 560 nm wavelengths. The values and color seen in Box 6 are sure to raise some questions. I know I've got some!

  • For the raw blue, we sampled hues 232°/230° and 217°/217° as designated on the color wheel. (The two sets are dependent on how the raw color was developed for the photo file.) However, the Wavelength to Color tool assigned the hue 211° and RGB (0, 123, 255) to 460 nm. The 211° hue is very close to a blue with half-green, which is often called "azure".

  • For the raw green we sampled hues of 108°/107° and 98°/99°. But the Wavelength to Color tool assigned the hue 117° and RGB (195, 255, 0) to the 560 nm wavelength. The 117° hue has moved away from the sampled hues by quite a few degrees towards a yellow-green not too far from yellow.

.

I do not know for sure why the Wavelength to Color hues differ so much from the raw sampled hues.

I think it might be because the program includes a gamma correction different from the one used by Raw Digger.
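For what it's worth, the widely circulated approximation behind such tools (descended from Dan Bruton's FORTRAN code, which the efg page adapts) does apply its own gamma, typically 0.8. A Python transcription of that common approximation reproduces the Box 6 values:

```python
def wavelength_to_rgb(wl, gamma=0.8):
    """Piecewise-linear visible-wavelength-to-RGB approximation
    (after Dan Bruton's classic FORTRAN code). wl in nm, 380-780."""
    if 380 <= wl < 440:
        r, g, b = -(wl - 440) / 60, 0.0, 1.0
    elif 440 <= wl < 490:
        r, g, b = 0.0, (wl - 440) / 50, 1.0
    elif 490 <= wl < 510:
        r, g, b = 0.0, 1.0, -(wl - 510) / 20
    elif 510 <= wl < 580:
        r, g, b = (wl - 510) / 70, 1.0, 0.0
    elif 580 <= wl < 645:
        r, g, b = 1.0, -(wl - 645) / 65, 0.0
    elif 645 <= wl <= 780:
        r, g, b = 1.0, 0.0, 0.0
    else:
        r, g, b = 0.0, 0.0, 0.0
    # Intensity falls off toward the limits of visibility.
    if 380 <= wl < 420:
        factor = 0.3 + 0.7 * (wl - 380) / 40
    elif 700 < wl <= 780:
        factor = 0.3 + 0.7 * (780 - wl) / 80
    else:
        factor = 1.0
    return tuple(round(255 * (c * factor) ** gamma) for c in (r, g, b))

print(wavelength_to_rgb(460))  # (0, 123, 255) -- matches Box 6
print(wavelength_to_rgb(560))  # (195, 255, 0) -- matches Box 6
```

Note the 0.8 exponent: a raw file exported with gamma 2.2 (or with none) will not land on the same hues as a tool that encodes with 0.8.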

 

I have tried to list the dependencies at each step of the sampling, and to provide references for wavelength-to-color conversion. As mentioned before, there certainly are a lot of dependencies and assumptions at play in determining raw colors and wavelength colors.

 

I will continue to think about this and (ideally) learn more - perhaps learn enough to explain this. Meanwhile, if anyone has any insights here, it's your turn to weigh in and help us all to figure this out. You know I want to hear from you, so please add your comments, questions, thoughts and suggestions. And Thank You!! :D

 

.

Link to comment
Edits made to discussion of gamma in Raw Colors section. References to gamma added.
Link to comment

Very interesting Andrea, and a lot more there than I had considered. I think I will need to go back through it a few times before I get it all.

 

With regards to your question;

 

Although this is just one white balance example, I am very comfortable in stating that I don't think a white-balanced JPG color should be used for any wavelength-to-color correlation. What do you think?

 

Yes, I think you're right: when it comes to trying to extract wavelength-related information, white-balanced (and especially auto-white-balanced) JPGs are not the way to go.

 

Something else to keep in mind. There is a difference between trying to get colours in an image which are representative of and closely resemble the original wavelengths, and trying to extract wavelength related information from a RAW file. In the former, you have all the considerations of how to process the RAW file to output the image. In the latter, it's a case of opening the RAW file and extracting the R, G and B values, and trying to look for differences in those values at different wavelengths. I must admit even as I write this I am still trying to get my head around it, so will no doubt have to come back and rethink this again to make it clearer.

 

When we use filters to look at UV we are in effect looking at a narrow range of wavelengths - much narrower than the visible spectrum. So I think it is only natural to assume within that narrow range of wavelengths there will be much smaller range of 'colours' which can be differentiated.

Link to comment
Andy Perrin

“When we use filters to look at UV we are in effect looking at a narrow range of wavelengths - much narrower than the visible spectrum. So I think it is only natural to assume within that narrow range of wavelengths there will be much smaller range of 'colours' which can be differentiated.“

 

This is not very good logic - color (and information) is actually related to FREQUENCY not wavelength. Frequency is inversely proportional to wavelength and we are using much higher frequencies. The bandwidth is greater between 400-300nm, than 500-400nm, for example. I haven’t checked whether the whole visible spectrum has more bandwidth than UVA, though, so caveat there. But for example this is why FM radio is better quality than AM, and UHF was higher quality (in old analog TV) than VHF.

 

In principle there can be more colors in UV, not fewer, but it will depend on where we put the cutoffs - I will have to try to calculate it later.

 

Another consideration is that the false colors we record depend on the Bayer dye responses, so if they were flat across UV we wouldn’t get any false colors at all. But conversely we could get better color sensitivity if we had designed the Bayer dyes themselves to react to specific sub bands of UVA. Instead we amplify small differences using white balance.

Link to comment

Some random thoughts:

 

-Any such discussion needs to consider metamerism: there are an infinite number of light wavelength distributions (metamers) that produce the same tristimulus value (perceived color, either by sensor or retina.) A sensor or retina has only three wavelength "bins" and the rest is interpolation. For pure, monochromatic light there is a one-to-one correspondence between color and wavelength. Deviate from this condition and all bets are off. A spectrometer has hundreds of bins in the same range, so the different distributions are easier to tease apart. Monochromatic light is rare in nature; most real-world applications involve broad-band illumination. Thus, any conclusions about wavelength based on a single photograph are going to be quite vague and approximate and no substitute for proper reflectance spectrometry.

 

-Historically, I think gamma may have had as much to do with film response curves as with CRT properties--film is linear only in a narrow range and the full response curve is a sigmoid.

 

-The reseau dyes in a digital camera were not designed to discriminate wavelengths outside the 400-700 nm range. That they do so at all is accidental; that they do so less well than in the design range is to be expected.

Link to comment

“When we use filters to look at UV we are in effect looking at a narrow range of wavelengths - much narrower than the visible spectrum. So I think it is only natural to assume within that narrow range of wavelengths there will be much smaller range of 'colours' which can be differentiated.“

 

This is not very good logic - color (and information) is actually related to FREQUENCY not wavelength. Frequency is inversely proportional to wavelength and we are using much higher frequencies. The bandwidth is greater between 400-300nm, than 500-400nm, for example. I haven’t checked whether the whole visible spectrum has more bandwidth than UVA, though, so caveat there. But for example this is why FM radio is better quality than AM, and UHF was higher quality (in old analog TV) than VHF.

 

In principle there can be more colors in UV, not fewer, but it will depend on where we put the cutoffs - I will have to try to calculate it later.

 

I never knew that - about the colour being related to frequency rather than wavelength. Every day's a school day....

Link to comment

Frequency and wavelength are not independent parameters--they are related by

 

F = c/λ

 

So to say that something is related to the one but not the other wants explaining.

Link to comment
Andy Perrin

OlDoinyo, they are definitely RELATED, but that is not the same thing as being equivalent. Information is proportional to frequency bandwidth - the difference between the high and low frequency cutoff in your signal. Small wavelengths have high frequencies via the equation you quoted. So c/300nm - c/400nm is a bigger number than c/400nm - c/500nm. You can transmit more information per second in one band than the other, even though they are both 100nm wide bands.

 

In addition to the AM/FM and VHF/UHF examples, this is also why Blu-ray can store more info than an ordinary CD-ROM, and why fiber optics can transmit info faster than copper wire (which is restricted to lower frequencies).

 

--

 

That said, it looks like c/320nm-c/400nm is NOT larger than c/400nm-c/700nm, so it looks like visible light does have more bandwidth despite being lower frequency, just because we have a large enough range of wavelengths to nullify the advantage of going to higher frequencies. (If you put your visible light cutoff at 600nm, they are exactly the same bandwidth.)

Link to comment
I think that the ability of a system to render colors in a given band (whether one chooses to define such by frequency or wavelength) is related to the ability to cram three appropriately-shaped sensing bins into that band. I don't think it has as much to do with "bandwidth" as discussed above, because that governs information density in the temporal domain, not the frequency domain; and spectrometers can have sensing bins much narrower than those of typical reseau dyes--that is demonstrated technology. More than three bins will not give you more colors, because everything must ultimately be output in a three-color space.
Link to comment
Andy Perrin

Ok, I have lost track of whether we are talking about using the camera as a spectroscope or are we talking about HUMAN PERCEIVABLE colors. Definitely cameras that have more dyes could get more out of UV than we are. They have hyperspectral cameras that do this.

 

The problem isn't that UV does not have enough information capacity to show more colors, and that is what I was talking about above. It does, we just don't have devices that can image them.

Link to comment

Nice discussion :)

 

The definitions & assumptions one starts with are always important and should be clear & distinct. I have problems with the term "color", especially with regard to reflected light versus emitted light. Jonathan's pictures represent emitted light and, due to the set-up, each represents a very narrow bandwidth, so we could assume, for easier discussion, that it is monochromatic. If we look at reflected light, this is a different story. We see/perceive a color which is made up from some kind of light, but what is reflected depends on the reflecting surface. E.g. a "rainbow" "emits" all "wavelength related" colors, but we can perceive more colors, like brown, which are not part of these natural colors.

 

Then, if we approach the problem from a simple mathematical viewpoint, it is easy to say, what can be done and what not.

 

We have three functions which describe the wavelength response of the R, G and B pixels, and we get three values: one for R, one for G and one for B. If we know the recorded light is monochromatic, we have enough equations to solve the problem. If there are a couple more wavelengths in our light source, it is a no-go. All further processing of the RAW data only changes the three wavelength response functions, so our equations to solve the riddle will look different, but it does not provide further equations.
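Werner's argument can be sketched in Python. The Gaussian channel responses and dye centers below are invented for illustration (real Bayer dye curves must be measured); the point is that channel ratios cancel the unknown intensity, so a monochromatic wavelength can be recovered by brute-force search:

```python
import math

def response(wl, center, width):
    """Invented Gaussian channel response -- real dye curves are measured."""
    return math.exp(-((wl - center) / width) ** 2)

def channels(wl, intensity=1.0):
    """Simulated R, G, B readings for monochromatic light at wl nm."""
    return tuple(intensity * response(wl, c, 50)
                 for c in (600, 540, 460))  # made-up dye centers

def recover_wavelength(measured):
    """Brute-force search for the wavelength whose channel *ratios*
    best match the measurement (ratios cancel the unknown intensity)."""
    def ratios(t):
        s = sum(t)
        return tuple(v / s for v in t)
    target = ratios(measured)
    return min(range(380, 701),
               key=lambda wl: sum((a - b) ** 2
                                  for a, b in zip(ratios(channels(wl)), target)))

print(recover_wavelength(channels(560, intensity=0.25)))  # recovers 560
```

With two or more wavelengths mixed in the source, the same three ratios no longer pin down a unique answer, which is the "no go" case above.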

 

It gets a bit more complicated when the wavelength response functions are not only wavelength dependent but also intensity dependent, which we have not yet considered ;)

Link to comment

EDIT 21 APR 2018:

Old incorrect vocabulary corrected in this post and my other posts.

I had used the phrases "wavelength bandwidth" and a "frequency bandwidth".

Here are the proper terms.

band = a small section of a spectrum. Often designated as an interval with endpoints: [a, b] or (a, b) or a-b (that's a dash).

bandwidth = the size of an interval. But most commonly used in communications and signal processing to refer to the range of a frequency band.

passband = the range of an optical filter. The word bandwidth is also often used to designate an optical filter's passband. Example: the BaaderU has a full width at half maximum (FWHM) of 325-369 nm.


 

 

 

Let me clarify some things. (For my own sake, at least. And maybe for some readers, too.)

 

The visible interval [400nm, 500nm] has a nanometer width of 100 nm.

Its frequency bandwidth, from c/500 to c/400, is 1.5 × 10^14 hertz.

 

The ultraviolet interval [300nm, 400nm] also has a nanometer width of 100nm.

But its frequency bandwidth, from c/400 to c/300, is 2.5 × 10^14 hertz, which is greater than the frequency bandwidth of the visible interval.

 

The wavelengths within the ultraviolet band [300, 400nm] have higher frequencies than the wavelengths within the visible interval [400, 500nm].

Because, obviously, fractions like c/300, c/310, etc. are larger numbers than fractions like c/400, c/410, etc.

 

Review: The 'c' character there represents the speed of light in nanometers/second in the wavelength-to-frequency relationship c = λ * f where λ = wavelength and f = frequency.
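The two bandwidth figures above are easy to verify numerically (using c ≈ 3 × 10^17 nm/s):

```python
C = 3e17  # speed of light, in nm/s (approximate)

def freq_bandwidth(lo_nm, hi_nm):
    """Frequency bandwidth, in hertz, of the wavelength band [lo_nm, hi_nm]."""
    return C / lo_nm - C / hi_nm

print(freq_bandwidth(400, 500))  # visible band: 1.5e14 Hz
print(freq_bandwidth(300, 400))  # UV band:      2.5e14 Hz
```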

 

However, given the 1-to-1 relationship between wavelength and frequency, both the visible band [400, 500nm] and the ultraviolet band [300, 400nm] contain (represent) the same number of wavelengths and frequencies.

 

My point: We don't want to confuse an interval's length (how long) with an interval's cardinality (how many).

Nor do we want to confuse frequency size or wavelength size with anything else.

OK then. B)

 

[BTW, to represent the infinite number of wavelengths (or frequencies) within [300nm, 400nm], mathematics has created a special transfinite number to designate the cardinality of that interval, namely aleph-one: ℵ₁. It represents what is called uncountable infinity in mathematics, as opposed to countable infinity. But I digress......]

 


With me so far? I wanted to dust off these band/bandwidth/wavelength/frequency definitions there with some actual examples.


 

Now, in theory, always a dangerous way to begin..... :)

 

In theory, every wavelength in the visible band [400nm, 700nm] can be considered a visible color.

21 APR 2018 ADDED: ....at least for the purposes of formal colorimetry ideas. We grant that human vision more truly defines the notion of 'color'. But we have to deal with the digital version here.

 

In practice?? Well, of course, there is no way to reasonably represent an infinitude of visible colors in order to display those colors digitally or in print. So, we abandon the 1-to-1 theoretical relationship between a visible wavelength and a color to simplify our color models.

 

For example, there are 16,777,216 colors in an RGB model where each component can be assigned a value from 0 to 255. That representation squashes infinity into a mere 16 million plus.

 

Yes, our chromaticity diagrams appear to show an infinite gamut in the nicely displayed charts, but in actual practice a color designation eventually gets crunched into a hexadecimal value or an RGB triplet or some other finite-number-of-bits representation.

And so it goes....

 

 

What about the ultraviolet band?

Again, in theory, every one of the infinite number of wavelengths within [300nm, 400nm] could be considered a false color. And from the discussion above, we see that there are the same number of potential false colours in the interval [300nm, 400nm] as there are potential visible colours in [400nm, 500nm].

 

But we can still only represent a finite number of them in a UV false color model.

 

The problems we have with UV false colour are more about what "gets in the way" of properly recording it. I certainly do wish there were a Bayer mask which could separate the interval [300nm, 400nm] into equivalent R, G and B regions. So, somebody please get busy with this invention! You will make millions selling it to us UVP members!!!

 


This all started because.....?

People want to use a converted camera as a simple UV spectral device. Fine. But be aware that you are trying to map backwards from a finite-bit color representation to one wavelength amongst an uncountable infinity in the interval [300nm, 400nm]. And there are so very many variables affecting the color representation in the first place that it is difficult to account for how they might affect the mapping.

(On the equator at high noon in summer or on top of a mountain in Norway midwinter? You might get different UV photos of the same subject.)

 

Generally people try to simplify matters by mapping backwards from RGB representations to whole number wavelengths at full amplitude. Fine. But be aware that you have introduced another compression: the 16 million (plus) RGB triplets must go back to 100 whole numbers. (Brightness? Saturation? Don't go there!)

 

This all gets so messy so fast.

Link to comment

P.S. I am still thinking about "information" and large frequencies.

 

Yes, with radio waves more info can be carried per unit of time with a higher frequency wave than with a lower one. But how does that translate to color wavelengths?

Link to comment

21 APR 2018 EDIT: Sorry this post was so unclear about what I was trying to do that it is best skipped. I don't like to delete things because it causes confusion. But I'm crossing it out.

 

Simplify the entire problem by taking the RGB back to just one dominant wavelength, after trying to take into account illuminants, reference white, gamma, etc. This is somewhat do-able, but still laden with assumptions.

 

Hacking through a whole lot of arithmetic, some of which I probably got not quite right: the raw value in the first post sampled as (15, 159, 153); with assumptions of D65, gamma 2.2 and Adobe RGB, I somehow got 473 nm, rounded off. But I have no idea what assumptions to make for monochromatic input photographed in a laboratory setting. Clearly 473 nm is not 460 nm, so something went wrong somewhere. Oh well. I gave it a good try.

 

Sorry, that was all very vague. I was trying to "go backwards" and I did not succeed.

I should delete this post, but I do not like to do that. So kindly ignore it!!! :wacko: :wacko: :wacko:

Link to comment

"For example, there are 16,777,216 colors in an RGB model where each component can be assigned a value from 0 to 255. That representation squashes infinity into a mere 16 million plus."

 

Isn't it 65536 colors with 256 shades of brightness?

Link to comment
Andy Perrin
There is the notion of a wavelength bandwidth and then there is a frequency bandwidth.

Andrea, there is no such thing as "wavelength bandwidth". The physical principle is that every frequency band of the same width can carry the same amount of information. This is not true of wavelength ranges, full stop. You are muddying the waters there.

Link to comment

That is most assuredly not my intention. So let's try to provide me with the proper vocabulary, OK?

21 APR 2018 EDIT: Andy helped me clear up the terminology and I have made corrections now! :-)

 

What do you call this ultraviolet interval [300nm, 400nm]? Or this visible interval [400nm, 500nm]?

And why do we refer to a filter as having, for example, a bandwidth of 325 - 369 nm with wavelength designations as the endpoints ?

 

And I am not arguing against physics here in any way. I have not contested the fact of higher frequency carrying more information. I haven't even mentioned it. B)

 

****************

 

In principle there can be more colors in UV, not fewer, but it will depend on where we put the cutoffs - I will have to try to calculate it later.

 

The point I'm trying to make is this: If we grant that each visible wavelength in the interval [400nm, 700nm] is associated with a color (however difficult it might be to actually represent those colors in a finite bit system), then by that same reasoning every UV wavelength in the interval [300nm, 400nm] can potentially be associated with a false color. And in either case there is an infinite number of colors or false colors. The UV interval has neither more nor fewer false colors than the visible interval has visible colors.

 

The UV wavelengths, having higher frequency, may be capable of encoding more information. But I'm not at all sure what that has to do with color. Color IS the wavelength --- well, as long as the appropriate physiological response has been triggered in our brains after the wavelength hits our eyes.

 

****************

 

It is possible that we are all getting hung up on a definition of color?

It is not an easy definition to get hold of. :D

Color is a human physiological response triggered by light hitting our eyes. That it is possible to associate one color with one specific visible wavelength is just a simplification which helps with modeling color and color response and color spaces and colorimetry and so on.

 

Werner has suggested that for the 16 million+ RGB triplets, there are really only 65 thousand(+) colours with 256 degrees of brightness. That is certainly a valid way to look at RGB triplets. The casual interpretation most of us use in daily life is that bright blue (0, 0, 255) and dark blue (x, x, 255) and pale blue (x, x, 255) are three different colors although we do label them all as some kind of "blue".

 

I really don't want to get too far into all this. Color and all the nuances of the subject have been well documented online and in books.

 

Added Later: But please do feel free to discuss it as much as you like!! I didn't mean to sound like I was cutting off discussion or anything. It's just that I have some other things to do. I've run out of time for now. :lol: :lol: :lol:

***********

 

The basic problem in this topic: How is an RGB value assigned to a photograph of a monochromatic light?

I didn't get too far with trying to figure out which were the "proper" RGB triplets for the photos of 460 nm and 560 nm.

Link to comment

Your question:

"The topic of using a converted camera as a device to determine the value of reflected wavelengths has been debated since the beginning of digital UV/IR photography. Can it be done? How accurate are the results? This will be an addition to that dialogue."

 

The simple answer is yes. But what will hang you up is how you ask this question.

 

Ocean Optics spectrometers work by simply shining light onto a CCD sensor and reading the wavelengths. The tricky parts are the separation, uniformity and calibration.

 

In the simplest case, you have a sensor modified with the RGB layer removed, a known prism or grating separating your light source, and a known distance on your sensor where the full spectrum falls. With a standard you can calibrate exact sensor location against known wavelength, and you have replicated an Ocean Optics spectrometer. This can be done with a converted camera, a Sigma camera, or, if you are really careful, a super-resolution camera like the Olympus E-M5 Mk II taking multiple images so each dye is exposed at the same point in space.
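The calibration step described here can be sketched as a simple pixel-position-to-wavelength fit. The mercury-line wavelengths below are real, but the pixel positions are invented for illustration:

```python
import numpy as np

# Known emission lines of a mercury calibration lamp (nm).
known_wavelengths = np.array([404.7, 435.8, 546.1])

# Pixel columns where those lines landed on the sensor
# (hypothetical values, for illustration only).
pixel_positions = np.array([312.0, 498.0, 1155.0])

# A first-order (linear) fit is often adequate over a short span;
# higher orders can absorb grating nonlinearity.
coeffs = np.polyfit(pixel_positions, known_wavelengths, 1)

def pixel_to_wavelength(px):
    """Map a sensor pixel column to wavelength in nm."""
    return float(np.polyval(coeffs, px))
```

With the calibration in hand, any feature's pixel position on the sensor converts directly to a wavelength, which is the whole trick behind the camera-as-spectrometer idea.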

 

Where this gets tricky is if you have an RGB dye layer. LDP has indicated that different manufacturers, and even different cameras within a manufacturer, use different dyes, each with a different wavelength response. So, to simplify: with an RGB camera you lose spectral resolution, as you will need to ensure a band reproducibly hits all four sensor dye sites. Otherwise your calibration will get trickier.

 

Where I see you getting lost above is in talking about color. This is a truly non-quantified thing. Genetically, everyone on this forum will see the colors differently. We have been told and trained to say that a representation is orange, but to each of us it will look different. So there you will get lost. Best to deal with quantified numbers: separation distances from a known grating, and distance on the sensor, to tease out wavelengths with known, reproducible standards.

Andy Perrin

Ok, a band is a contiguous range of frequencies or wavelengths; bandwidth is a number equal to the highest frequency minus the lowest frequency in that band.

 

What do you call this ultraviolet interval [300nm, 400nm]? Or this visible interval [400nm, 500nm]?

A band!

 

And why do we refer to a filter as having, for example, a bandwidth of 325-369 nm, with wavelength designations as the endpoints?

That is really called the passband of the filter, but people are informal sometimes, which is fine IF everyone is on the same page, vocabulary-wise! (There may be some added difficulty because the vocabulary here mostly comes from signal processing, but it overlaps with optics, and the different fields don't perfectly agree on the language, I think? Even though it is all the same physics whether you are an electrical engineer or an optics person.)
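As a tiny numerical illustration of the passband/bandwidth distinction, using the 325-369 nm example above:

```python
def passband_stats(low_nm, high_nm):
    """Return (bandwidth, centre wavelength) of a passband in nm."""
    return high_nm - low_nm, (low_nm + high_nm) / 2.0

# The filter example from the discussion: a 325-369 nm passband.
bandwidth, centre = passband_stats(325, 369)
print(bandwidth, centre)  # 44 347.0 : a 44 nm bandwidth centred at 347 nm
```

The 325-369 nm pair is the passband; the single number 44 nm is the bandwidth.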


I can see this topic is going to be an interesting one.

 

To be clear, from my end, the question came about as I was originally interested in knowing the spectral response of my cameras as a function of wavelength. I approached Canon and was told the information was proprietary and they couldn't share it. So I built my own device to measure it. It isn't perfect, but it gives me an idea of what is going on, until someone comes along with something better. Once I had some data I got to wondering whether there were differences in the 'colour' recorded by the camera from the different wavelengths in the UV region. In essence, can the information recorded by the camera provide some information about the wavelength of light hitting the sensor? Am I expecting the camera to be a spectrometer with a resolution of 1 nm? Obviously not. Would it be interesting if the 'colour' information recorded could provide some information about the wavelengths coming in? Yes, to me it would. I say 'colour' because, to me, colour as a concept is somewhat subjective, an interpretation of the wavelength.

 

I have some new and very preliminary data to share here on my two UV-modified cameras (a Canon 7D Mk I and a Nikon D810). Both of these have been modified by ACS in the UK for UV imaging with their proprietary filter, whose transmission characteristics we do not know. I will look at this again with a Baader U and a UV-Vis-IR modified camera when I get one.

 

When I assessed my two cameras with my measurement device (covered in my other build thread, http://www.ultraviol...ctral-response/ ) in the UV, and looked at the Red, Green and Blue responses from the RAW files in RawDigger, I plotted the following graphs:

post-148-0-52206500-1524215197.jpg

 

Yes, I am aware there is signal above the baseline at 400 nm; however, I saw a similar effect at 400 nm when I used the Baader U on my monochrome camera (again, I will retest with the Baader U and UV-Vis-IR camera when I get one). This suggests the wavelength resolution of my device is perhaps not as good as I'd like, which I accept is an issue. The non-zero baseline, I think, has something to do with background noise from the long exposures, as it takes 30 s to capture each image.

 

However, I took the data from these curves, normalised the y-axis values so they ran between 0 and 255, and used the R, G and B values to create colour patches in PowerPoint for the different wavelengths. This is what it looks like:

post-148-0-55296100-1524243472.jpg
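The normalisation step described above can be sketched in a few lines; the channel responses below are invented values, not Jonathan's measured data:

```python
def normalise_to_255(r, g, b):
    """Scale the channel responses so the largest maps to 255."""
    peak = max(r, g, b)
    scale = 255.0 / peak if peak else 0.0
    return tuple(round(v * scale) for v in (r, g, b))

# Invented raw channel responses at one wavelength:
print(normalise_to_255(800, 600, 200))  # -> (255, 191, 64)
```

The resulting triplet can then be used directly as the fill colour of a patch, which is essentially what the PowerPoint chart does for each wavelength.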

 

The aim here was to see if there were big differences in the 'colour' recorded for the different wavelengths in the UV, and to me it looks as though there are, based on this. Perhaps this is a way to create a colour palette for UV images.

 

I am aware of making some assumptions when doing this, not least that I am saving these images as JPGs. So I emphasize again: this is done on a home-built device to help me try and understand what is going on with my cameras.

 

Question

Does anyone have a RAW file of the light from one of the UV LED torches (Nichia 365 nm), taken ideally with a UV lens such as the UV-Nikkor? That should have a relatively tight spread in wavelength, and it would be interesting to see what the R, G and B values from RawDigger look like for a light source with a much smaller FWHM than light that has come through my monochromator, and whether they are at least similar to my 360 nm and 380 nm data.

 

EDIT - 20th April 2018, added RGB values to colour palette chart.


Andy, thanks!!!! Always appreciate getting pointers to the correct terminology and term usage. Later today I will go back through the post and reword the terminology. {{21 APR 2018: Done!!!}}


 

Jonathan, just remember --->>>>>> you can do this, BUT......please promise me that you will learn, understand and list the factors which affect what "color" you are getting, so that you can answer all questions about your work and defend it. Please also remember and understand why mixed light could create the same color response as monochromatic light. You can make a list of factors affecting UV false color in a topic in the reference section and then reference that topic with a link. I might already have one there, but don't recall at the mo'.

That's all I have to say on this. :lol: :lol: :lol:

{{{{Everyone else says: oh sure andrea...when were you ever done talking. LOL....}}}}

 

Edit: 21 Apr 2018. I rephrased a sentence in the preceding paragraph which contained a word that had potential implications that I did not intend and which came across poorly on the page. That happens sometimes. But when it does we can fix it, OK?

 

As for the color patches on the chart, I'm thinking that it is important to add a note saying whether you are showing the fully saturated, fully bright versions of the colors or not. Also, labeling the color's position on the color wheel (i.e., its "degree", like 240°, 255°, etc.) would be most helpful to all of us who are color enthusiasts. Alternatively, or additionally, the RGB values would be useful.

 

You can see those orange and orange-pink colours in scads of the raw composite photos I've posted. Let me try to find the link to the topic where I categorize our broadband UV-pass filters as being either "orange filters" or "magenta filters" according to whether they peak at 350 nm or 380 nm. It confirms what you are showing here, but in a general sense. [Filter Test] Raw Colour Differences in 6 UV-Pass Filters

 

 

And I think I have scads of UV-LED photos of the Spectralon standards under both a 365 nm UV LED and a 385 nm UV LED. But please be patient, as I have to attend to other social & household matters for a couple of days. I'm sure other folks reading this will contribute theirs as well, giving you a better sample set.


Andrea, I'm not here to mislead people. I would make that promise, but I'm getting a bit forgetful these days, so let's just say I will TRY to learn.

 

I have a line on the chart which says "Normalised highest value to 255 (scale 0-255)". Is that not the same as saying these are the fully saturated versions? Serious question; I don't know, but I thought that was what I was implying.....

 

I'll rework the chart and put in the RGB values (probably over the weekend). I'm not that familiar with the "degree" way of describing colour, although I think there are online conversion tools to get degree from RGB if needed.

 

No rush on the RAW file; I'm going to be travelling over the next couple of weeks and won't have any time to work on this.


Andrea, I'm not here to mislead people.

oh I know that! I'm really sorry if there was an unintended implication there! Sometimes I'm "too Mom", trying to be protective of my UVP members. :lol:

 

Edit: 21 Apr 2018. I rephrased that sentence in the first post because it had potential implications that I did not intend and which came across poorly on the page. That happens sometimes. But when it does we can fix it, OK?

 

The degree isn't necessary if the RGB values are available. Should you ever want to use it, most of the color pickers in converters either show HSB (Hue, Saturation, Brightness) or can be set to show HSB. (Or, sometimes, HSV.)
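For anyone wanting to go from an RGB triplet to the hue "degree" without a color picker, Python's standard library can do the conversion; the triplet here is just an illustration:

```python
import colorsys

def rgb_to_hsb_degrees(r, g, b):
    """Return (hue in degrees 0-360, saturation %, brightness %)."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h * 360.0, s * 100.0, v * 100.0

# Pure bright blue sits at 240 degrees on the color wheel:
hue, sat, bri = rgb_to_hsb_degrees(0, 0, 255)
print(round(hue), sat, bri)  # 240 100.0 100.0
```

HSV and HSB are the same model under two names, so the same function covers both labels mentioned above.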

 

Yes, you are correct that "normalised highest value = 255" indicates full saturation in the dominant wavelength. I missed that! I.e., I didn't read the labels on the chart - just got caught up in the colors. :rolleyes:

Added: But that does not necessarily imply a fully saturated color.

 

That 380 nm color is so pretty. I wonder if it has a name?

You should see the strange color the Wavelength-to-RGB tool produces for 380nm. Way different. I've gotta try to figure that thing out. I've glanced at the code but haven't had time to analyze it.

 

Added: Here is the 380-to-RGB color. Do you recognize this color? You certainly will if you bring it up to full brightness and see the resultant screaming magenta. Ouch! Magenta, of course, does not exist as a spectral color. So the Wavelength-to-RGB tool is showing its limitations here -- although it is a noble attempt to model the high UV colours.

380waveToRgb.jpg

 

fullBright.jpg

 

fullSat.jpg
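For context on why such tools emit a dark magenta near 380 nm: many of them follow a piecewise approximation in the style of Dan Bruton's well-known wavelength-to-RGB code. The sketch below is that general family of approximation, not necessarily the exact code of the particular tool mentioned. Below 440 nm the red channel ramps back up while blue stays full (mixing toward violet/magenta), and an intensity falloff darkens the result toward the 380 nm edge.

```python
def wavelength_to_rgb(nm):
    """Rough visible-wavelength to RGB mapping, violet/blue end only."""
    if 380 <= nm < 440:
        r, g, b = (440 - nm) / 60.0, 0.0, 1.0  # red rises toward 380 nm
    elif 440 <= nm < 490:
        r, g, b = 0.0, (nm - 440) / 50.0, 1.0
    else:
        raise ValueError("only 380-490 nm is sketched here")
    # Intensity falloff toward the 380 nm edge; this is why the raw
    # tool output looks dark until it is raised to full brightness.
    factor = 0.3 + 0.7 * (nm - 380) / 40.0 if nm < 420 else 1.0
    return tuple(round(255 * c * factor) for c in (r, g, b))

print(wavelength_to_rgb(380))  # -> (76, 0, 76), a dark magenta
```

Raising (76, 0, 76) to full brightness gives (255, 0, 255), the "screaming magenta" described above, which makes the tool's limitation at the spectrum's edge easy to see.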


Jonathan, I think that makes sense. The question you're asking is different from the one Andrea started the thread with. You are asking about the wavelength sensitivity of the RGB dyes in your cameras and how they compare.

MaxMax used to have some of this data. I just checked their site and the writing is still there but not the images, unless something is wrong with my computer:

https://maxmax.com/f...ectral-response

They have the data for the Nikon D200, D300, D700 and Canon 40D. I remember it being quite different for each camera. They may have the data for your cameras, so you should contact them to see and compare.

 

As for using a converted camera as a spectrometer: yes, you can actually get 0.1 nm resolution with your monochrome Canon. You just need to build a box and place the grating on a pivot so it can be adjusted in very, very fine movements. Then attach the camera to a sliding rail which can also be adjusted in very small, controlled increments. Your light enters the box and hits the grating, to be separated into a rainbow of colors. The further away the camera is, the more the rainbow will spread before it hits the camera sensor, giving you resolution. The pivot on the grating will let you select the "color" range that hits the sensor. The limits will be the intensity of the light source, and calibration. You can use a used theatrical spotlight or a motorcycle lamp to overcome the intensity limit, and an Hg lamp or mixed laser pointers to help with calibration. That is the beauty of the camera method of doing spectroscopy: you give up range to gain resolution, or gain full range by giving up resolution with very quick snapshots.
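The "distance buys resolution" point can be put into rough numbers with the grating equation, sin(theta) = m * lambda / d. The 1000 line/mm film and 0.5 m camera distance below are illustrative values, not a tested setup:

```python
import math

d_nm = 1e6 / 1000.0   # groove spacing of a 1000 line/mm grating, in nm
L_mm = 500.0          # grating-to-sensor distance, in mm (illustrative)

def sensor_position_mm(wavelength_nm, order=1):
    """First-order diffraction: sin(theta) = order * lambda / d."""
    theta = math.asin(order * wavelength_nm / d_nm)
    return L_mm * math.tan(theta)

# Linear separation at the sensor of two lines 1 nm apart, near the
# Hg green line at 546 nm:
sep = sensor_position_mm(547.0) - sensor_position_mm(546.0)
print(f"{sep:.2f} mm per nm at the sensor")
```

At roughly a millimetre per nanometre, a sensor with a few-micron pixel pitch has pixels to spare for sub-0.1 nm sampling, which is where the quoted resolution figure becomes plausible (actual resolution also depends on slit width, focus and calibration).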

 

The spectrophotometers I used in the past were simply a mixed light source of tungsten and deuterium lamps. The tungsten lamps were good from 330 nm to 1100 nm and the deuterium lamps are good below 320 nm. The light enters a grating to isolate the wavelength, moves through your sample and then hits a photomultiplier tube (PMT), which amplifies the weak signals to get readings. I see now on eBay you can get a cheap visible 320 nm to 1000 nm spectrometer for around $400. You just need to manually adjust the wavelength for each reading. This is not so bad, as the calibration has been done.

 

After reading some of these posts and seeing the low cost of 1000 line/mm grating films, I think I may play around with this on my SD14. Optical-bench calibrations were the fun part of setting up microscopes. I never did that myself, but a month of fine-tuning the optics was typical. Looks like you can now have the fun at home with cheap source equipment.

All the best,

David


Andrea, I too must apologise; I sometimes read too much into things. No problem with trying to be 'mum'. I'll update the chart now with the new version with RGB values in it.

 

David (Dabateman), thanks. I've been chatting with Dan at MaxMax about some of this while developing it all - he made my monochrome camera and has an interest in this calibration work. Not sure where the graphs have gone on his page; perhaps he is updating them (there's nothing wrong with your computer, I can't see them either). My work is done with a little manual monochromator (an eBay special which cost me about $400, a lot less than a new one would be). In theory my method should give similar results to the grating method, assuming the grating doesn't absorb more light in the UV. Please do experiment - the more info we have the better. My work is only one approach and it would be great to see what others are coming up with.

