UltravioletPhotography

What color profile to use?


Fandyus

Recommended Posts

13 hours ago, Fedia said:

Fandyus said he white balanced the image before taking the photo (that's why the infrared tail is blue). He couldn't have the grating on his lens and a white target in the frame at the same time. The white balance coordinates are in the metadata. Why would a reference be so important?

.

the white balance in the camera works well only for the visible range (VIS), which each post-production software then translates differently

.

12 hours ago, Fandyus said:

Edit: @photoniI didn't see your comment earlier where you said I'm making a mess. I'm not sure if that's something lost in translation but I don't think that's a very polite way to express yourself. And as Fedia said, the file is pre color balanced. Even if it wasn't, you could just turn the colour balance off and process the data knowing there was an orange filter on the lens at the time of the image being taken. Please read my post properly next time before you accuse me of making a mess. Thanks!

the catch is using an orange filter with an orange artificial light; you're asking miracles of your sensor.

.

this is the kind of reasoning that makes me want to close my UVP account

.

hello guys, I'll stop here; I'm a touchy old retiree

Link to comment
9 minutes ago, photoni said:

the white balance in the camera works well only for the visible range (VIS), which each post-production software then translates differently

Sony can white balance just fine in-camera for UV and IR alike. Please stop assuming things.

 

9 minutes ago, photoni said:

the catch is using an orange filter with an orange artificial light; you're asking miracles of your sensor.

I do not. A halogen light emits a very wide spectrum of wavelengths and is just fine for infrared images, and I believe many fellow members can back me up on this. If you think halogen lights are "orange" you might want to get your vision checked. If they were orange, the orange filter would be indistinguishable from a clear filter under a halogen light, but I can distinguish them easily.

I still don't get what you're complaining about. You very obviously don't understand my issue. Maybe you should consider not getting involved in this thread anymore.

Link to comment

@Andrea B. Thanks for analysing the grating image in Raw Digger. You said that you didn't see any major problems in the way darktable represented it. But funnily enough, it's by looking at our results that the flaws of darktable become even more obvious to me.

 

We can clearly see that in order to display the bright green edge of the line, Darktable decided to empty the blue channel completely. This makes the blue channel unusable because it bears a "negative imprint" of the green channel.

 

I thought I would find another example somewhere, so I googled "traffic lights in the fog" and found this image:

 

89622945-pais-brouillard.jpg.6e884b256d1733b7e5e860408b05f13b.jpg

 

If you look closely at the red halos around the lights, you can see they don't make sense: they are darker in luminance than their background!

I converted the image to B&W to verify this:

 

89622945-pais-brouillardBW.jpg.8e996c2315e6601c59751ff52c1bfeba.jpg

 

Indeed, the glowing area around the light in the fog is darker than its background.

 

Let's look at the RGB values:

 

The red halo is pure red (252; 0; 0).

 

feudeciculationcouleurdufeu.jpg.5230dea765cf409c9b771a4f4fc9518c.jpg

 

The background is a light brown, and its Green and Blue values are respectively 109 and 71.

 

feudecirculationcouleurdufond.jpg.01bef6c78ef6e75f6be28eda35507e99.jpg
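The "darker halo" observation can be sanity-checked numerically. A minimal sketch, assuming Rec.709 luma weights for the B&W conversion (the actual tool may weight the channels differently) and assuming the background patch also has a red value of 187:

```python
# Relative luminance with Rec.709 weights -- an assumption; the B&W
# conversion used on the screenshot may use different coefficients.
def luma(r, g, b):
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

halo = luma(252, 0, 0)      # pure-red halo, ~53.6
fog = luma(187, 109, 71)    # light-brown background, ~122.8

# Despite "glowing", the halo carries far less luminance than the fog.
assert halo < fog
```

So a pure-red pixel really is less than half as bright as the light-brown fog around it, which matches the B&W conversion above.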

 

If there weren't any color subtraction to represent the red halo, the only solution would be to add the values of the red light to the values of the background. Of course, with the background's red value already at 187, the clipping point would be reached very soon.

 

RGB values of the red halo in a scenario without color subtraction:  R = 255 (clipped), G = 109, B = 71

 

This unsubtracted red would look like this:

 

rougesanssubstraction.png.ed7dbc271608f0e608005858274b1166.png

 

Not very red.

 

Note that if we had HDR monitors that could display values beyond 255, the absence of subtraction wouldn't prevent the red light from being red. It would have RGB coordinates of 439; 109; 71, which I think would produce something fairly red.

 

This is impossible on regular screens, and that's why color spaces exist: to be able to translate the very wide gamut of reality for our little screens. And the key to this is color subtraction.
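The arithmetic above can be sketched in a few lines; the values are the ones quoted in this post, and a per-channel clip at 255 stands in for an 8-bit display:

```python
# Additive mix of the red light over the fog background, no color subtraction.
bg = (187, 109, 71)      # background: light brown
light = (252, 0, 0)      # red traffic light

ideal = tuple(b + l for b, l in zip(bg, light))   # (439, 109, 71): fine for HDR
clipped = tuple(min(v, 255) for v in ideal)       # (255, 109, 71): washed-out pink
stored = (252, 0, 0)                              # what was actually encoded:
                                                  # G and B subtracted to keep the hue
```

The clipped triple keeps the brightness but loses the saturation; the stored triple keeps the hue at the cost of the luminance dip demonstrated above.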

 

The smaller the color space (e.g. sRGB), the more color subtraction has to be performed to represent saturated colors.

 

As we saw above, color subtraction produces luminance artefacts. That's what we saw in Fandyus' extracted blue channel.

 

Link to comment

I think there is a nuance that should be made when we say our sRGB screens can't display certain colors. They actually can, but they do it through subtraction. A wide-gamut display, which is essentially a screen with very high brightness capability, doesn't have to subtract anything, since it can produce bright AND saturated colors at the same time.

 

 

Link to comment

Colour theory and colour profiles can be very confusing and difficult to grasp.

I too am struggling to get an optimal workflow for that.

 

As I understand it the RAW files only carry an assigned colour profile from the camera for the tiny preview image and for when looking at a RAW file on the camera.

The RAW data itself is without any such information.

 

When converting the RAW file you can choose the profile and colour space the resulting image should have. Adobe RGB is one of the good alternatives.

When processing my image files (tiffs) I want a wide working colour space with many bits, like 16-bit data, to use all the information I can get from the camera and avoid clipping the data prematurely when processing the image further.

 

When the image is processed to taste, it can be prepared for its intended use.

For showing things on the web you unfortunately have to use the tiny sRGB to show images in a similar way on all browsers and monitors.

For making a good print there are similar considerations matching the image data to the profile of the printer and paper type, but that is another story.

 

 

Link to comment
8 minutes ago, Fedia said:

I think there is a nuance that should be made when we say our sRGB screens can't display certain colors. They actually can, but they do it through subtraction. A wide-gamut display, which is essentially a screen with very high brightness capability, doesn't have to subtract anything, since it can produce bright AND saturated colors at the same time.

 

 

I think this statement is not correct.

Different monitor types have different RGB material types with differently wide gamuts that can show more or less of a colour space.

To get it completely right you must then also calibrate the monitor, as the colour behaviour drifts.

 

The need for sRGB on the web is more due to limitations in crappy browsers. 

Link to comment
6 minutes ago, ulf said:

Different monitor types have different RGB material types with differently wide gamuts that can show more or less of a colour space.

To get it completely right you must then also calibrate the monitor, as the colour behaviour drifts.

Indeed, the spectral purity of the light coming from the RGB dots also determines what kind of gamut the screen can display. But even the best SDR monitor won't be able to show the wide gamuts that an HDR screen can show. Maximum brightness and gamut are linked; I don't see how it could be otherwise.

Link to comment
1 hour ago, Fedia said:

flaws of darktable become even more obvious to me

I have not read the entire conversation yet, but I think Darktable is not the only program to do this. I'm pretty sure ACR, and by extension Lightroom, would do the same, and also RawTherapee. The negative ghosting seems to be some sort of way of achieving "lifelike saturation", though I don't find it very lifelike personally.

 

Edit: so I basically thought the same. All I can add is that I have a QLED Samsung screen, so wider gamut than normal LED but not HDR.

Link to comment

Toni, we forgive you for being a grumpy, old retiree !! Chill a bit while we work this out. You are a good contributor here and I will not let you close your account. 😁

 

*********

 

Fandy, I am going to continue with the Darktable experiment. There is a quirk in Darktable in the Exposure tool. I want to figure that out.

 

*********

 

Fedia, I'll get to your fog image eventually, OK?

 

*********

 

Ulf:  As I understand it the RAW files only carry an assigned colour profile from the camera for the tiny preview image and for when looking at a RAW file on the camera. The RAW data itself is without any such information.

 

That is correct. So we must have an Input Profile, a Working "Profile", and an Output Profile to work with a raw photo file. As mentioned, we typically do not have to specify an Input Profile because most photo software automatically applies a profile specific to the camera model recorded in the raw data. Darktable does require you to set an input profile, which should be "standard color matrix" in order to trigger the application of a camera-model-specific profile.

 

Color Management 101:  Translation between Color Spaces

 

Profiles translate between the color spaces of devices or software.

In a device profile, the colors of the device are listed in a prescribed way so that the receiving device or receiving software can decode them.

  • Physical devices such as cameras, monitors, and printers each have their own color spaces.
  • Photo software also has a working color space used by its various tools.

For example, a Nikon D610 camera profile tells Darktable how to translate the D610 raw colors into Linear Rec2020 RGB colors. After processing is complete, the Adobe RGB profile tells the monitor or printer how to translate the Linear Rec2020 RGB colors to the monitor or printer colors. 

 

For a JPG straight from the camera with an attached profile, the flow is simply a translation of the camera raw colors to the viewing device.

camera raw colors --> aRGB/sRGB profile --> monitor or printer colors.
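To first order, a matrix-based profile is just a 3×3 linear map, and the chain above is matrix multiplication. A toy sketch; both matrices here are invented for illustration, not real camera or display profiles:

```python
import numpy as np

# Hypothetical input profile: camera raw RGB -> working space (e.g. Linear Rec2020).
CAM_TO_WORKING = np.array([[0.90, 0.10, 0.00],
                           [0.05, 0.90, 0.05],
                           [0.00, 0.10, 0.90]])

# Hypothetical output profile: working space -> display space (e.g. sRGB).
WORKING_TO_OUTPUT = np.array([[ 1.10, -0.05, -0.05],
                              [-0.02,  1.04, -0.02],
                              [-0.01, -0.04,  1.05]])

camera_rgb = np.array([0.40, 0.30, 0.20])     # one demosaiced, linear raw pixel
working_rgb = CAM_TO_WORKING @ camera_rgb     # where the editing tools operate
output_rgb = WORKING_TO_OUTPUT @ working_rgb  # what the monitor/printer receives
```

Real ICC profiles can also contain lookup tables and tone curves, but the two-hop structure (input profile into the working space, output profile out of it) is the same.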

 

 

Please let me know if something is not clear.

Link to comment

I FORGOT TO ASK:  Fandyus, what is the filter you used for the Dandelion photo?

Link to comment

Yeah, Darktable is failing the blue channel test !!

 

I finally got to try some variations on the Exposure bar using the Dandelion photo. Here are the results.

 

First I set Color Calibration to show the Blue Channel (if indeed we are truly getting the blue channel.....).

 

Color Calibration ON for blue.

All other settings OFF.

As Fandy originally noted, this is weirdly dark.

EDIT:  26 Aug 2023. Note that with Darktable's White Balance set to OFF, this should represent the raw blue channel of the photo. But it is much darker than Raw Digger's version of the raw blue channel.

dandyDktblBlueChanExpOffWBOff.jpg

 

 

Color Calibration ON for blue.

Exposure set to Automatic.

All other settings OFF.

I tested this because this auto exposure seems to be initially set when you open a photo in Darktable.

Again too dark.

dandyDktblBlueChanExpAutoWBOff.jpg

 

 

Color Calibration ON for blue.

White Balance ON and set to Camera.

All other settings OFF.

Better. But still rather dark.

This is a yellow flower, so you would expect the yellow area to be dark in the blue channel

given that yellow = red + green.

EDIT:  26 Aug 2023. When Darktable's White Balance is set to ON/Camera, then this represents the blue channel of the photo As Shot and not the raw blue channel.

dandyDktblBlueChanExpOffWBCam.jpg

Link to comment

I might be daft, but why shouldn't this flower (which *isn't* a dandelion) be rendered very dark in UV? In fact, that is quite normal for many yellow-coloured members of the Asteraceae.

 

Something is lost in this jungle of channels, colour spaces, and what-nots.

Link to comment

Here is a composite. On the top row are the R, G, B channels produced by Raw Digger.

On the bottom row are the R, G, B channels produced by Darktable with Exposure = OFF and White Balance = ON.

(R, G, B from left to right)

 

The red and green channels produced by Darktable seem to be close enough.

But the blue channel from Darktable is still a bit dark.

I think that there is some difference in how Darktable renders the B&W.

 

EDIT 26 Aug 2023:  The Raw Digger channels represent the raw data channels. The Darktable channels, made with WB = ON represent the channels from the photo As Shot.

So this really isn't a "fair" comparison.

Composite.jpg

Link to comment

Birna, you are of course correct. Any "pure" yellow will have a very dark blue channel because nothing is recorded in the blue channel for pure yellow areas. The yellow in the Dandelion may not be "pure". 😁

 

Also:  OH, WOOPS!! Not a Dandelion? I wasn't paying attention due to being absorbed in trying to figure out that crazy Darktable. Is the flower a Hieracium or Hypochaeris?

 

 

I measured around on the yellow part of the JPG. There is definitely a reddish cast. The red and green amounts are not the same so as to give a pure yellow.

Link to comment

I edited the instructions for finding Darktable Channels. I'll repeat them here.

Another Edit:  26 Aug 2023. I added clarification for finding either raw data channels or channels As Shot.

NOTE:  The raw data channels we are getting from Darktable do not seem to correspond to the raw data channels obtained from Raw Digger.

 

Channels in Darktable

 

Initialize

Input Profile:  Set it to "standard color matrix" so that Darktable applies a camera model specific profile to translate the camera's raw colors into the working space colors used by Darktable's tools.

Working Profile: Set it to Linear Rec2020 RGB as recommended by the Darktable developers. You can use Adobe RGB here if you like.

Output Profile:  Set it to aRGB if printing, or sRGB if the photo will be displayed on a website.

Exposure Tool:  OFF. You don't want Darktable applying any of its auto corrections to exposure when splitting channels.

For channels of the photo As Shot, set the White Balance Tool to ON/Camera.

For raw data channels, set the White Balance Tool to OFF.

All other Tools:  OFF

 

Get a Channel

Go to the Color Calibration Module.

Right click on the Color Calibration bar and select basic channel mixer.

Click gray (on the right).

 

For the red channel, move the input red slider all the way to the right until value 1.0 is attained.

Right click on the Color Calibration bar and select store as preset

Give the preset a name like Red Channel. 

 

Reset the Color Calibration Module by clicking the Reset Parameters button.

 

Repeat the procedure to create Green Channel and Blue Channel presets.

 

Now when you go to the Color Calibration Module, you can right click on the bar

to select those channel presets.
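For an independent cross-check of the raw data channels (the WB = OFF case above), the CFA planes can be pulled straight from the file with rawpy, a LibRaw binding, bypassing any converter's profiles and gamut handling. A sketch; the channel names assume an RGGB-style sensor, and you pass your own file path:

```python
import numpy as np

def split_cfa(cfa, colors, names=("R", "G", "B", "G2")):
    """Split a Bayer mosaic into one image per CFA color (zeros elsewhere)."""
    out = {}
    for idx, name in enumerate(names):
        chan = np.zeros_like(cfa)
        mask = colors == idx
        chan[mask] = cfa[mask]
        out[name] = chan
    return out

def raw_channels(path):
    import rawpy  # pip install rawpy
    with rawpy.imread(path) as raw:
        # raw_colors_visible holds a 0-3 CFA color index per photosite
        return split_cfa(raw.raw_image_visible.astype(np.float32),
                         raw.raw_colors_visible)
```

Comparing these planes against Raw Digger's channels would show whether the discrepancy lies in Darktable's pipeline or elsewhere.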

 

I have one more thing to try. Stay tuned. It will take me a while to get to it.

Link to comment

I repeated the channel split with the Working Profile set to a larger gamut:  Linear ProPhoto RGB. This produced a slightly less dark blue channel.

 

Left blue channel using Linear Rec2020 RGB.

Right blue channel using Linear ProPhoto RGB.

dandyDktblBlueChanExpOffWBCam01.jpgdandyDktblBlueChanExpOffWBCam_proPhoto.jpg

Link to comment
3 hours ago, Andrea B. said:

Fandyus, what is the filter you used for the Dandelion photo?

It's a Rowi GO-2 orange filter.

Link to comment
17 minutes ago, Fandyus said:

It's a Rowi GO-2 orange filter.

Isn't the Dandelion a visible image taken with an unconverted camera? The grating is the one taken with a full-spectrum camera + orange filter, isn't it?

Link to comment

I think my final channel split from Darktable is not too far off the mark. I think you could be comfortable using it for channel stacks.

EDIT:  26 Aug 2023. Assuming you do not want raw data channels, but rather channels from the photo As Shot.

 

But shouldn't a channel split be the same from any app??

 

I learned a lot about Darktable from this effort. It is a mighty converter with many sophisticated tools which require a lot of study to use properly. But like other photo apps, presets are very helpful.

 

I'll look at some of the other items in this topic later.

Link to comment

I downloaded the latest version of darktable this morning to be able to use the color calibration module.

 

Making sure the "clip negative RGB from gamut" box is ticked makes the blue channel more natural. When the image is displayed in color, this box lowers the saturation a little bit.

 

Capturedcran2023-08-26155056.jpg.bbeb7b41dcbcf42590d67f9c48fa395e.jpg

 

There are a lot of different parameters, and it seems impossible to just extract the data from one channel. When we output a grayscale from the blue channel like I did above, modifications have already been applied in the CAT (chromatic adaptation transform) module.

 

What we are trying to do here is very simple, but the software was obviously not made for this.

Link to comment

oh, oh, I might go with Toni: at least a little bit of a mess here? ;-)

 

Has anyone read the manual of darktable? It uses a completely different workflow, as it tries to use a calculation space not limited to 8-bit or 10-bit like older software.

 

E.g. it tries to use the original linear space as long as possible (closely related to the physical measurement the sensor is doing).

"Older" software often first does a non-linear conversion (gamma correction) and then calculates on that.

This has an impact on how WB works (same for the colour space selected!).

 

E.g. differences in the final output might already begin with the demosaic procedure you choose.

 

Gamut clipping in particular is mentioned as something to use only with an understanding of what is going on; it can have an effect on the blue channel ... (not recommended).

 

But it seems darktable has a lot of features to look into :-)

 

Link to comment
19 hours ago, Fedia said:

Isn't the Dandelion a visible image taken with an unconverted camera? The grating is the one taken with a full-spectrum camera + orange filter, isn't it?

Yes, the "dandelion" is taken with an EOS 77D. The diffraction grating shot is with my full spectrum Sony a6000.

Link to comment

I don't think we are making a mess: we are simply trying to solve a particular problem AND learn a little bit about color science along the way.

 

Capturedcran2023-08-26221951.png.d9dc8e12d110d3ec4887ee33b536066f.png

 

I think I made quite a lot of visual demonstrations of the issue. Even if I'm not clear in my words, you can understand what problem I'm pointing at.

 

I tried all the demosaicing methods and none of them has a significant effect at this "macro" scale (we are viewing the image as a whole).

 

All I'm saying is that gamut clipping keeps the dips from happening:

 

gamutclipping.jpg.6a900c470ee735503b29e8d8273caeff.jpg

 

Here gamut clipping is set to sRGB instead of OFF.
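A sketch of why the option matters: converting a saturated wide-gamut color down to sRGB can produce negative channel values, and "clip negative RGB from gamut" flattens those to zero. The matrix is the commonly published linear BT.2020 → BT.709 primaries conversion, quoted from memory, so verify it before relying on it:

```python
import numpy as np

# Linear-light Rec2020 -> sRGB/Rec709 primaries conversion (approximate,
# quoted from memory -- check against BT.2087 before serious use).
REC2020_TO_SRGB = np.array([[ 1.6605, -0.5876, -0.0728],
                            [-0.1246,  1.1329, -0.0083],
                            [-0.0182, -0.1006,  1.1187]])

saturated_green = np.array([0.10, 0.90, 0.05])  # inside Rec2020, outside sRGB
srgb = REC2020_TO_SRGB @ saturated_green        # red and blue come out negative
clipped = np.clip(srgb, 0.0, None)              # what the "clip" option does
```

Clipping the negatives costs a little saturation, but the channels no longer carry negative imprints of their neighbours, which matches the flatter curve in the screenshot.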

 
Link to comment

I was thinking, would it help you all if I posted a few more raw files of pictures I took with the diffraction grating? I appreciate you are investigating this, I am unfortunately not able to help right now but providing some data is the least I can do.

Link to comment

I just played with it a little bit and found that linear Rec709 seems not to cause any channel clipping, even with all three channels displayed.

Link to comment
