UltravioletPhotography

Filter Series :: Name-brand UV-Pass Shootout with LaLaU, LuvU-II, KolariU, Moon UVA Miami and More


Andrea B.

Recommended Posts

As long as we are on this color thing......(I'm having fun anyway!! I love this stuff!!)....here is a representation of the colors between high UV and violet. (Might have been posted in the past??) It is just an attempt at modeling those colors with RGB values, so please don't read too much into it. Just for fun.

Reference Linkie: https://academo.org/...r-relationship/

 

Note that there are a huge number of errors and misperceptions in the comments to the linked page. You might want to just not read that stuff. Too difficult to sort out what is correct and what's not.

 

 

I regret having to say that the age of your eyes will play a huge role in how you perceive both spectral violet and this simple RGB color strip. As an example, to my good, young eye the 5 patches are quite distinct, but with my bad, old eye the three patches on the left are almost the same. Weird, eh. (Note that my bad, old eye likely is much worse at color than most of you alls' eyes. That old eye is really needing some surgery soon. Yuk!)

 

It would be nice if we could process our reflected UV photos to actually produce some of these colors. But it doesn't seem possible.

 

Wavelengths_Violet_Purple.jpg
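For anyone who wants to regenerate a strip like this, here is a minimal sketch of the violet end of Dan Bruton's well-known wavelength-to-RGB approximation. Bruton's mapping starts at 380 nm; clamping everything below that to the 380 nm color is my own (hypothetical) way of extending it toward UV, so take the sub-380 patches with a grain of salt:

```python
def violet_rgb(nm):
    """Approximate 8-bit RGB for 380-440 nm, after Dan Bruton's classic mapping.

    Below 380 nm there is no standard mapping; this simply clamps to the
    380 nm color, which is one hypothetical way to extend the strip into UV.
    """
    nm = max(380.0, min(nm, 440.0))
    r = (440.0 - nm) / (440.0 - 380.0)   # red component falls off toward blue
    b = 1.0
    # Bruton's intensity ramp: dim toward the short-wave end
    k = 0.3 + 0.7 * (nm - 380.0) / (420.0 - 380.0) if nm < 420 else 1.0
    return tuple(round(255 * k * c) for c in (r, 0.0, b))

for nm in (380, 395, 410, 425, 440):
    print(nm, violet_rgb(nm))
```

Note that in this approximation the red component actually rises toward 380 nm while the overall intensity falls, which is roughly how the strip above behaves.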

Link to comment
Andy Perrin
They are all distinct to me. Colorblind or no, my vision seems to do better on the violet side of the spectrum.
Link to comment

Any RGB triplet with one zero or two zeros is 100% saturated.

Any RGB triplet with one 255 or two 255s is 100% bright.

 

In the preceding violet strip, all the colors are fully saturated and the brightness increases from left to right.
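Those two rules are just the HSV definitions of saturation (1 − min/max) and value (max); a quick check with Python's stdlib colorsys, using a couple of arbitrary violet triplets:

```python
import colorsys

def hsv_of(r, g, b):
    """Return (hue in degrees, saturation %, value %) for an 8-bit RGB triplet."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return round(h * 360), round(s * 100), round(v * 100)

# a zero anywhere in the triplet forces saturation to 100%
print(hsv_of(128, 0, 255))   # (270, 100, 100): fully saturated AND fully bright
print(hsv_of(64, 0, 128))    # (270, 100, 50): fully saturated, half bright
```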

Link to comment

Time to flush some thoughts...

 

We see UV as violet down to about 360-370 nm; below that it starts to look bluer, and by 340-350 nm it looks almost pure blue, similar to the 435.8 nm mercury g-line if you want to experience it. I did try looking at a Convoy through a diffraction grating (with my eye out of the beam, but still seeing the LED), and I could see the light go from violet to a bluer shade of violet, though still not as blue as a 340 nm LED. Don't try this, at least not without knowing that UV is bad for the eyes. But really, don't do this. I had confirmation that 340-350 nm appears blue from this Quora post, which I read after experiencing UV-blue myself, to confirm what I had experienced. (I think I already linked that Quora post in the past.)

 

So, assuming our cameras see UV colors as follows, with a "normal" white balance done on a white standard under a broad UV source such as sunlight, and with a broad bandpass filter passing from roughly 340 nm (and below) up to about 400 nm:

-lavender at about 380-400 nm;

-greenish yellow around 360-370 nm;

-green around 340 nm and lower;

 

then, by re-mapping lavender to purple (leaving it almost unchanged), yellow to blue-purple and green to almost pure blue, you would get a palette similar to what our eyes would see.
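A hypothetical sketch of that re-mapping, treating both the camera's false colors and the target palette as HSV hue angles. The anchor angles below are my rough guesses for "lavender", "greenish yellow" and "green", not measured values:

```python
import colorsys

# assumed anchors (degrees): camera false-color hue -> target "human UV" hue
REMAP = {290: 285,   # lavender -> purple (almost unchanged)
         60: 260,    # greenish yellow -> blue-purple
         120: 240}   # green -> almost pure blue

def remap_pixel(r, g, b):
    """Swap a pixel's hue to the target of the nearest anchor, keeping S and V."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    deg = h * 360
    # nearest anchor on the circular hue wheel
    nearest = min(REMAP, key=lambda a: min(abs(deg - a), 360 - abs(deg - a)))
    nr, ng, nb = colorsys.hsv_to_rgb(REMAP[nearest] / 360, s, v)
    return round(nr * 255), round(ng * 255), round(nb * 255)

print(remap_pixel(0, 255, 0))   # pure false-color green -> (0, 0, 255), pure blue
```

A real implementation would interpolate between anchors rather than snapping to the nearest one, but the snap version shows the idea.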

 

Or, working with RAWs, which is better, and knowing your camera's response (you did a lovely grating test) and the wavelengths involved (you may want to redo it with the Fraunhofer lines in focus; that would help a lot), you can apply the same reasoning: send purple to purple, pink to blue-purple and orange to almost pure blue.

 

This, anyway, doesn't take into account the quickly falling sensitivity of the human eye to shorter UV wavelengths, which I think falls faster than a camera's. If I look out of my window with my ZWB2 (2 mm) + Chinese BG39 (2 mm) stack, I see monochromatic violet, there's no blue.

 

On a side note, at least based on my personal experience, IR looks solid red, from 700 nm up to where our eyes completely stop seeing it. So, if you want to make "human vision IR", just make it monochromatic 100% red.

 


The Abney effect works a bit with green on me, I see it going just a bit bluer. With purple, it may be going just a tiny bit redder, but I would say it doesn't work with purple.

 

To see brown as orange, you have to display a completely black screen with a brown square or circle in the middle. If there is white around it (like in this forum), your brain will auto-adjust it back to brown.
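If you want to generate such a test image yourself, here is a minimal stdlib sketch that writes a brown square on a black field as a binary PPM (the dimensions and the particular RGB value used for "brown" are arbitrary choices):

```python
# "brown" is just dark orange; surrounded by pure black and viewed bright
# in a dark room, the square should read as orange
W, H, SQ = 1280, 720, 160                   # image size, square half-size
BROWN, BLACK = bytes((150, 75, 0)), bytes((0, 0, 0))

plain = BLACK * W                           # a row without the square
banded = b"".join(BROWN if abs(x - W // 2) < SQ else BLACK
                  for x in range(W))        # a row crossing the square

with open("brownBlack.ppm", "wb") as f:
    f.write(b"P6\n%d %d\n255\n" % (W, H))   # binary PPM header
    for y in range(H):
        f.write(banded if abs(y - H // 2) < SQ else plain)
```

Most image viewers (and converters like ImageMagick) open PPM directly, so you can view it full-screen or convert it to JPG.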

Link to comment
Spectral violet to my good eye looks exactly like that 395 nm patch above. Only when looking at that color as light, it is an entirely different sensation than I get when looking at the 395 nm patch. I don't know how to describe it.
Link to comment

395°?

 

My 10 W 405 nm LED makes my whole bedroom violet when turned on at maximum power. The color is absolutely gorgeous, and I can reproduce the hue I see. I don't have the RGB coordinates now; I will post them when I turn on my PC tomorrow, if anyone is interested.

 

Strong violet looks more grayish-bluish to me, it's like it loses saturation.

Link to comment

then, by re-mapping lavender to purple (leaving it almost unchanged), yellow to blue-purple and green to almost pure blue, you would get a palette similar to what our eyes would see.

 

A combination of purple, blue-purple and blue doesn't seem like it would produce as interesting a photograph as does our false-blue/yellow palette ??

Purple to blue is only 30° on the color wheel. Not much variation. Perhaps saturation and brightness differences would help though.
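For reference, that 30° figure can be checked with colorsys, using a violet-purple and pure blue as (hypothetical) endpoints:

```python
import colorsys

def hue_deg(r, g, b):
    """Hue angle in degrees on the HSV color wheel."""
    return colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[0] * 360

# assumed endpoint colors: a violet-purple and pure blue
purple = hue_deg(128, 0, 255)   # ~270 degrees
blue   = hue_deg(0, 0, 255)     # 240 degrees
print(round(purple - blue))     # roughly 30 degrees of separation
```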

Link to comment
I don't know exactly where the peak of my light was. What I saw was like that 395 nm patch. And I don't know how to describe the difference between the light itself and the light reflected from the monitor patch above. It is different.
Link to comment

I have to try this.

Click it up to fill screen?

brownBlack.jpg

Link to comment
well foopsy-doodle. I made the brown square too big. Trying again.
Link to comment

expand browser. close side bars. close toolbars. click up jpg to fill screen.

 

brownBlack.jpg

Link to comment

Having an OLED screen (or similar) with deep blacks helps, as well as being in a dark room and raising the screen brightness to the max. It must look bright, brighter than the surroundings.

 


Do you mean 395 nm? Or is there a 395º color?

Link to comment

And so.....I showed the preceding to the SigOth and asked "what color is that"?

 

"Orange", he said. "Well, maybe orange-brown".

 

I surely cannot get orange from that square meself !!!!

Link to comment

Sorry, I meant 395 nm, not 395°.

Corrected above.

Link to comment

then, by re-mapping lavender to purple (leaving it almost unchanged), yellow to blue-purple and green to almost pure blue, you would get a palette similar to what our eyes would see.

 

A combination of purple, blue-purple and blue doesn't seem like it would produce as interesting a photograph as does our false-blue/yellow palette ??

Purple to blue is only 30° on the color wheel. Not much variation. Perhaps saturation and brightness differences would help though.

yes, not so nice. I prefer our white balanced images with the color palette we are used to.

 

If you want to replicate human UV vision, don't forget that we can't focus very far away in UV; our eyes have a very bad focus shift there (who wants an achromatic doublet in his eyes?). At 340 nm, I cannot focus further away than ~20 cm. I didn't test myself at 365 nm, but one meter is too much. I cannot focus blue light past 3-4 m. I think I can reach infinity with green and above. IR at infinity is no problem. I am not short-sighted, BTW; I actually see stars and mountains clearly, but I have a little astigmatism in my right eye which makes me see double stars instead of single ones, and 2-3 moons instead of one. It's a minor defect, though, approx. 0.5 diopters.

 

When I look outside through a Hoya R72 filter, I actually see everything very sharply. One reason is probably that I see mainly at 700-750 nm, so my vision there is almost monochromatic.

Link to comment
Once when I was driving on one of the Safaris, Birna said "stop a few meters up the road there". I drove on for about a half a mile before stopping.
Link to comment

"Here are the usual suspects in the WB game played in any light: converter, converter white balance algorithm, camera profile or not, camera, lens, filter, amount of illumination, kind of illumination, time of day...."

 

Digging out my notes from 2006...

 

If the OEM RAW converter is used, it can make accurate use of the WB data (since the sensor is fully characterized by the designer). In this case WB is adjusted by applying a coordinated colour temperature correction prior to the XYZ transform, i.e. in photometric space. This occurs prior to demosaicing/interpolation.

 

When 3rd-party RAW converters are used, WB adjustments are applied as RGB scalars in the RGB transform, i.e. in colorimetric space. This occurs after demosaicing/interpolation.

 

These two methods can vary slightly in their output and the latter can also produce colour artifacts.
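A minimal sketch of the second method, per-channel RGB scalars applied to already-demosaiced data (the function name and sample values are made up for illustration):

```python
import numpy as np

def apply_wb_scalars(rgb, neutral):
    """Scale each channel so the sampled `neutral` patch comes out gray.

    rgb     : float array (..., 3), demosaiced linear RGB in [0, 1]
    neutral : length-3 reading of a patch that should be neutral
    """
    neutral = np.asarray(neutral, dtype=float)
    gains = neutral.mean() / neutral        # per-channel multipliers
    return np.clip(rgb * gains, 0.0, 1.0)   # clipping here is one source
                                            # of the colour artifacts

# a greenish cast: the gray card reads (0.40, 0.50, 0.30)
patch = np.array([0.40, 0.50, 0.30])
print(apply_wb_scalars(patch, patch))       # all three channels come out equal
```

Highlights that were clipped before the gains are applied cannot be rebalanced correctly, which is one way this post-demosaic approach produces the artifacts mentioned above.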

 

This was the OEM approach used in the days when a custom WB for IR could easily be set using green grass. At some point, some DSLRs started to have issues creating a custom WB by this method, and it was reported that manufacturers were implementing a different approach to WB application.

 

Does anyone know exactly what changes were made to the original WB application?

 

I am unsure at what point WB pre-conditioning was implemented, perhaps this was their change in methodology?

Link to comment

If I'm getting what you wrote (not at all sure I do!), it seems that originally the CIE-space tristimulus values were computed for use during the demosaicing process. So maybe the change was to use some other color space for determining the x, y, z?

 

The problem with my speculation is that I thought the CIE color space is supposed to be good, supposed to be the standard for matching how the human eye sees color. But maybe better ones have been figured out? So that was the change which was made?

 

The other problem with my speculation is that I'm not at all sure why an initial CIE xyz transform to specify colors would skew an outcome from the camera's custom white balance.

 

Side Note: I'm not sure why some wider human-eye color spaces aren't used as the camera color space instead of aRGB or sRGB. But maybe sRGB, etc., is still used because our digital images are first viewed on a monitor, and monitors are limited in color representation. (Not as much as in the past, but still.)

 

****

 

I'm also thinking that perhaps the newer WB algorithms make use of some "memory scenes". Like, oh I'm seeing trees and blue sky so I know I want to tweak one color one way and another color another way for proper WB, rather than applying one temperature shift to all colors simultaneously. Memory scenes have been used to improve metering, so why not use them to improve WB?

 

******

Link to comment
