
Is <No Profile> the right alternative?


Recommended Posts

Andrea showed how selecting different alternatives for the input profile in PhotoNinja gives different results here:

https://www.ultravio...__fromsearch__1

 

There has also been a long discussion about how different filters show flowers with different hues, especially blue flowers.

 

I have been thinking some more about this and have formed an opinion based on how I think colour management works.

My simplified view of colour management looks like this:

[attached diagram: simplified view of colour management]

Colour management is quite complex and I do not know if I got all of it right.

I am very open to any corrections.
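To make that simplified view concrete, here is a minimal numerical sketch in Python of the steps I have in mind (white balance, input profile, output encoding). The matrix and gain values are invented for illustration only; a real input profile is measured per camera model and is usually more than a single 3x3 matrix.

```python
import numpy as np

# Hypothetical camera-to-linear-sRGB input profile matrix (illustrative numbers only;
# a real profile is measured per camera model and may contain more than a 3x3 matrix).
CAMERA_TO_SRGB = np.array([
    [ 1.80, -0.60, -0.20],
    [-0.25,  1.45, -0.20],
    [ 0.05, -0.55,  1.50],
])

def develop(raw_rgb, wb_gains=(2.0, 1.0, 1.6)):
    """Simplified development: white balance -> input profile -> gamma encoding."""
    balanced = raw_rgb * np.asarray(wb_gains)                 # per-channel white balance
    linear = np.clip(balanced @ CAMERA_TO_SRGB.T, 0.0, 1.0)   # input profile: camera RGB -> working space
    return linear ** (1.0 / 2.2)                              # simple display gamma on the output side

# One demosaiced camera-RGB pixel, normalised to 0..1.
print(develop(np.array([0.20, 0.35, 0.15])))
```

In this simplified picture, the "no profile" case roughly corresponds to replacing CAMERA_TO_SRGB with the identity matrix, which is where the hue differences discussed below come from.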

 

The main issue here is whether input profile data should be used or not when processing images for a standardised, formal usage.

The digital output from any sensor is determined by many different things, but it is normally rather similar for a specific camera model.

 

The goal for manufacturers and creators of image-processing software is to get the colours in the images as good and correct as possible, showing a wide diversity of nuances and hues.

That is why camera-model-specific profile data exist.

 

The profile data are used to correct for both the specific Bayer-matrix response and the effect of the internal filters normally mounted in the camera.

 

During full-spectrum conversion the internal filters are removed, making the input profile quite incorrect, but the profile still, in some sense, contains information related to the silicon-sensor response and the Bayer-matrix response.

Those are the main components controlling how image data is separated into the RGB channels, and they are quite important for the appearance of the hues.

 

When using different filters that pass different amounts of light from the upper part of the spectrum, close to 400 nm, I find it logical that this is reflected in the hues and colours of properly white-balanced images, like I showed here:

https://www.ultravio...dpost__p__45847

 

Andreas "no profile" setting eliminates those differences making all flowers similarly blue.

I do not think that is correct, especially as there are quite large differences in Bayer response between different cameras.

I think using a profile matching the camera used would help equalise those differences, at least to some degree.

Link to comment

David once said that the dyes used in the Bayer filters on most (all?) cameras are made by the same manufacturer. I can't find the link now.

 

This is interesting, as it possibly means that standard Bayer-filter cameras actually all see colors the same way, including UV. Has someone tested this in the past? It would be nice if we collected diffraction-grating images of the Sun taken with broadband UV-pass filters (~320-400 nm), with the Fraunhofer lines in focus, and rendered them without profiles, white balances or anything else, just pure raws rendered as .jpgs. I can do that with the Canon EOS M, although my Chinese BG39 is probably not that good below 340 nm; someone else could try it with a Nikon, a Sony, etc., and we could compare them. If we find that raw colors are the same for most cameras, then not using any profiles to process the images may help with standardisation.
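For anyone who wants to try that comparison, here is a minimal sketch of such a "pure raw" rendering in Python, using the rawpy library; the file name is a placeholder, and the exact behaviour depends on what LibRaw does internally:

```python
import rawpy
import imageio

# Render a raw file with no input profile, no white balance and no auto brightening,
# so the output stays as close to the camera's native (CFA-determined) colours
# as rawpy/LibRaw allows.  "sun_spectrum.CR2" is a placeholder file name.
with rawpy.imread("sun_spectrum.CR2") as raw:
    rgb = raw.postprocess(
        output_color=rawpy.ColorSpace.raw,   # stay in camera colour space: no profile applied
        use_camera_wb=False,                 # ignore the in-camera white balance
        use_auto_wb=False,
        user_wb=[1.0, 1.0, 1.0, 1.0],        # unity multipliers, i.e. no white balance at all
        no_auto_bright=True,                 # no automatic brightness adjustment
    )

imageio.imwrite("sun_spectrum_pure_raw.jpg", rgb)
```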

Link to comment

Even if the manufacturer is the same, there can still be different variants of the dyes; we do not know that.

 

Also not all silicon image sensors respond identically.

If all standard Bayer filter cameras actually see colours the same way, there would not be any reason to have different profiles.

 

I think the differences in sensor and Bayer response are bigger than the differences in internal-filter response.

I have seen some old measurements of Bayer/sensor response, possibly by MaxMax. They were indeed different between camera models.

 

You can also look at the measurements made by Jonathan some years ago. He might have some input to this.

 

I think the idea of not using any profiles, as Andrea did, was intended to help with standardisation, but I do not think that is correct.

That idea is exactly what I am challenging here above.

Link to comment

No, not the filters.

Most cameras use the same Fuji dyes on the sensor. But the coverglass and the AR coatings on that coverglass are different. Then the microlenses are different, and the filter stack on top of all that (the one we typically remove) is different. The only real difference in dyes is that Canon, and sometimes Olympus, use two slightly different green dyes. But I need to track that down again. I think the recent Olympus cameras are using the same green.

 

I really should keep a sheet with all my references, as I am starting to forget who told me what when I contacted people to sort this all out.

 

As an example, Panasonic is boasting about the new AR coatings on the actual coverglass of the old 20 Mpixel sensor inside the new GH5mk2, just as Olympus did when they released the Em1mk3.

 

I now worry that the 350 nm AR cut-off on the coverglass, the problem Jonathan showed, will be present on all new cameras.

Link to comment

Yes, the coverglass, microlenses, BG filters and hot mirrors are different between cameras, but what determines the colors is the CFA. If a camera sees 340 nm as orange in raw (mine does, for example) then 340 nm will always look orange regardless of the filters above the sensor.

 

But apparently this is not the case, and there are differences in raw colors between different cameras.

 

As I understand it, color profiles were meant to correct the colors in normal visible-light photography (it feels odd to think that our cameras were actually meant for visible light only, and had to be modified to take UV/IR photos). This correction differs between cameras, as the BG filters and hot mirrors are different: some are bluer, some pass more red, and so on. But in UV, where our camera sensors are almost "naked", things are different; raw colors are more important, and we base our UV palette on the white balance we apply.

 

So I don't think color profiles are as important for multispectral photography, maybe they don't matter at all.

Link to comment
Andy Perrin

This kind of uncertainty about what is really happening to the image during processing is why I’m drifting more and more towards MATLAB and programming the entire thing (at least up to the General Image Enhancement part) myself. That way I can track what on earth is being done to the pixels precisely.

 

One of many projects I have in mind is to write a RAW processor for Sony so I understand the demosaicing better.
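For anyone curious what the core of a demosaicer looks like, here is a very small bilinear sketch (in Python/numpy rather than MATLAB, purely for illustration), assuming an RGGB mosaic with the black level already subtracted; real converters use far more sophisticated interpolation:

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear_rggb(mosaic):
    """Naive bilinear demosaic of an RGGB Bayer mosaic (2-D float array)."""
    h, w = mosaic.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1.0
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1.0
    g_mask = 1.0 - r_mask - b_mask

    # Green is averaged from its 4 axial neighbours; red/blue from their axial/diagonal neighbours.
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

    r = convolve(mosaic * r_mask, k_rb)
    g = convolve(mosaic * g_mask, k_g)
    b = convolve(mosaic * b_mask, k_rb)
    return np.dstack([r, g, b])
```

The mosaic could come, for example, from rawpy's raw_image_visible array; white balance and any input profile would then be applied to the resulting three planes.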

Link to comment

Yes and no, Stefano. The transmission of these ingredients will affect the colors. That is why Ulf's absolute transmission spectra tell the full story. If your glass or AR coatings have various peaks, valleys and absolute cut-offs, then that will affect the colors and the response. If you have an AR coating that cuts hard at 350nm, like Jonathan has shown, that will have a major impact. Also, if that same coating has a dip at 480nm, that will have an impact.

 

Don't forget about the coverglass and microlenses.

Link to comment
Andy Perrin
David is correct— the coatings and glass have their own spectra. It is incorrect to ignore them totally. CFA is only part of the total absorption even if it is the largest part.
Link to comment
In my view, the choice is mostly an aesthetic one. It is true that the camera profile (for daylight/flash etc.) is optimised by the maker. However, any optimisation is guaranteed to break down when we push the camera outside its predefined operational domain and the internal filter(s) are removed. The assumptions about the specific (and calibrated) response of the sensor to the spectral parts of the incoming illumination are violated. I generally find that the most pleasing "false colours" result when I set Photo Ninja to "No profile". Your mileage may vary. After all, the resulting colours are "false", so we cannot say one rendition is correct and another is incorrect.
Link to comment

I thought more about what I wrote above.

 

So, a monochrome-converted sensor sees in B&W. This is because every channel has the same response. The same can be said in upper IR, above about 800-850 nm.

 

The CFA is the only thing that determines the raw color of a monochromatic light source as perceived by the camera. If the microlenses are uniform, and the silicon photodiodes below all have the same response curve, then, no matter what coverglass, conversion filter, lens or lighting you have, a monochromatic light source will always appear to have the same color. Only the intensity will change.

 

This means that, if most cameras use the same exact CFA, even on different sensors, as long as the bare photodiodes on a sensor have a uniform response, and the microlenses are uniform, most cameras will see a precise wavelength (for example, a mercury line) as the same exact raw color.

 

What does change, if the filters above the sensor are different between cameras (as well as the absorption curves of the microlenses, etc.), is the raw color of polychromatic light, since the ratios of the camera sensitivities at the different wavelengths will vary between cameras.

 

This also means that different cameras will see different raw colors in standard UV images, as long as you don't use a very narrow bandpass filter.
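Here is a small numeric sketch of that argument in Python, under the same simplifying assumptions (pure transmission, no inter-layer reflections); all sensitivity and transmission numbers are made up for illustration:

```python
import numpy as np

# Hypothetical sensor+CFA sensitivities per channel, tabulated at 340, 360, 380, 400 nm
# (not measured data).
cfa = {
    "R": np.array([0.30, 0.20, 0.10, 0.05]),
    "G": np.array([0.15, 0.15, 0.10, 0.10]),
    "B": np.array([0.10, 0.20, 0.35, 0.50]),
}

# Two hypothetical transmission curves for everything in front of the CFA.
filter_flat   = np.array([0.90, 0.90, 0.90, 0.90])
filter_sloped = np.array([0.20, 0.50, 0.80, 0.90])

def raw_ratio(spectrum, transmission):
    """Raw R:G:B ratio for a given incoming spectrum seen through a given filter stack."""
    rgb = np.array([np.sum(cfa[c] * transmission * spectrum) for c in "RGB"])
    return rgb / rgb.max()

mono_340 = np.array([1.0, 0.0, 0.0, 0.0])    # monochromatic 340 nm line
broadband = np.ones(4)                        # flat broadband UV

# Monochromatic light: the filter scales all channels by the same factor, so the ratio is unchanged.
print(raw_ratio(mono_340, filter_flat), raw_ratio(mono_340, filter_sloped))
# Broadband light: the ratios (and hence the raw hue) shift with the filter curve.
print(raw_ratio(broadband, filter_flat), raw_ratio(broadband, filter_sloped))
```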

Link to comment
Andy Perrin
Stefano again, still wrong. The layers in front of the CFA also reflect different proportions of the incoming light as a function of wavelength even for monochromatic light. I think you are picturing the process as one of pure absorption without light bouncing between layers. But reality is that light takes a complex path through the layers and the ones underneath affect the reflections. This is a relatively small effect but it’s not zero.
Link to comment
So, if I understand, part of the light passes through the CFA multiple times, because it is partially reflected at the glass-air and air-glass interfaces in the glass above, and this depends on factors like the refractive index of such glass.
Link to comment

I have a question here.

Where does Andrea use "no profiles"???

 

Making camera profiles is my big thing!!

The only time I use "no profile" is in Raw Digger, to show a file prior to white balance, because profiles typically also contain some WB information.

 

So I R confuzzed !!

Profiled and non-Profiled White Balance of UV False-Color

 

In that topic I showed everyone 3 false-color outcomes: no profile, camera-specific profile, generic daylight profile. Given that UV photos have false color, I would let everyone decide how they want to specify their false-color outcome, because we have no definition of what "proper" false colour is. (As Birna has said above.) But I personally always use the profiled (middle) version because I think I might as well use the camera's original colors for my false colours. YMMV.

 

Deciding how you want your false color to look is not really a color management issue, IMHO. I'm splitting hairs a bit here :grin: but color management is about making sure your camera, converter software, monitor, and printer all know how to interpret the colors in the photo. Color management of a false-color photo (or *any* photo) is taken care of by assigning it a color space like sRGB or ProPhoto RGB, etc.
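As a concrete illustration of that last point, assigning a colour space is essentially just tagging the file with a profile; here is a minimal Pillow sketch (the file names are placeholders):

```python
from PIL import Image, ImageCms

# Embed an sRGB ICC profile when saving, so every colour-managed viewer
# interprets the (false) colours the same way.  File names are placeholders.
srgb = ImageCms.ImageCmsProfile(ImageCms.createProfile("sRGB"))
img = Image.open("uv_false_colour.tif")
img.convert("RGB").save("uv_false_colour_tagged.jpg", icc_profile=srgb.tobytes())
```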

 

OK, so there is

  • specification of colors false or otherwise, and there is
  • managing those colors through the camera, converter, monitor, printer.

Well, that helped me a bit. La!!! :lol:

Maybe someone else will find it useful. :cool:

 


 

As for the issue of whether all Bayer filters/dyes are the same --- well, that's not particularly the main point AFAIC. The key issue is what the demosaicing algorithm is, and also what color space is used to specify the colors in. And those are different by model and by brand. I've never had a Nikon give colors SOOC quite the same way as a Pentax does, for example. Each camera maker has a color "look" that they like. Canons are said to be best for Caucasian skin tones. Pentax supposedly makes the best landscapes. Sony massages their raw data. And so forth. I don't know what the latest buzz is on any of that. But there is sooooooooo much that goes into producing the raw file or the JPG that I think it would take forever to figure it all out. And much of it is kept "secret" by the manufacturers.

 

Oh well. Time for me to go eat dinner instead of rattling on here.

Link to comment
From what I understand of the early MaxMax attempts to convert cameras to mono, the camera's algorithms still 'think' that the CFA is in place and process the image as such, even though the output is mono, but with the wrong gradients, if that is the correct word?
Link to comment

Yes, the camera's color specification algorithms and demosaicing algorithms do not change for a mono conversion. So the tonal gradients* and tonal range might not be optimal for a good monochrome "look". This is because during ordinary demosaicing the 2 green, 1 blue, and 1 red pixels in each set get smushed into one specified color (to oversimplify) with one tone. If you demosaic using a monochrome demosaicing algorithm, then you preserve 4 different tonal values.

 

*The tonal gradients aren't wrong. They are just not making full use of the data.
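A tiny numpy illustration of that difference, with made-up pixel values:

```python
import numpy as np

# One RGGB cell from a mono-converted sensor: four photosites, four tonal values.
mosaic_cell = np.array([[100.0, 120.0],
                        [118.0,  60.0]])

# Monochrome handling: keep every photosite as its own tone (4 values preserved).
mono_tones = mosaic_cell.ravel()

# Ordinary demosaicing plus a grey conversion collapses the cell towards a single
# tone (the mean is used here as a crude stand-in for that smushing).
collapsed_tone = mosaic_cell.mean()

print(mono_tones, collapsed_tone)   # [100. 120. 118. 60.]  vs  99.5
```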

 

Shortly after MaxMax and others began converting cameras to mono, somebody wrote up some mono converter software. I think it was the Raw Digger people. Anybody have a linkie?

Link to comment

Thanks Ulf, that is good to know if I ever go down that path....

The FastRawViewer, in the same link, is quite usable too, even if its adjustment range cannot cope with the most extreme BUG raw files white balanced against PTFE.

That is normally not a big problem, as such files often look better white balanced in other ways, and PTFE WB is not mandatory here for those files.

I often prefer other WB alternatives, found by clicking around in the image until I get something that looks good.
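For what it is worth, clicking a patch and balancing on it boils down to per-channel multipliers; here is a minimal numpy sketch, where the patch coordinates stand in for whatever area was clicked:

```python
import numpy as np

def wb_from_patch(image, y0, y1, x0, x1):
    """Per-channel gains that make the selected patch neutral grey (image is H x W x 3, linear)."""
    patch_mean = image[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)   # mean R, G, B of the patch
    return patch_mean.mean() / patch_mean                           # scale each channel to the overall mean

def apply_wb(image, gains):
    return np.clip(image * gains, 0.0, 1.0)

# Example: balance on a region that should be neutral (a PTFE target, or just an
# area that "looks right"); the coordinates below are placeholders.
# balanced = apply_wb(img, wb_from_patch(img, 200, 240, 310, 350))
```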

Link to comment
