UltravioletPhotography

Lens and camera tests with sparticle and monochromator


Jim Lloyd


White balance using the Baader U (only), not through the 340BP10.

Essentially, the individual BP filters in the Sparticle groups, when shot with the Baader U (or another filter) on the lens, become a stack of sorts with the filter on the lens.

So each of the BP filters in the group is its own stack with the Baader U.

However, the white balance is done only through the Baader U, not through the Baader U + BP filter(s).

So if you want your 340/10 filter to have about the same color as the 340BP10 filters shown in the Sparticle groups in this topic,

then white balance on PTFE using the Baader U alone, then stack the 340 on the front of the Baader U and shoot a pic, then apply the PTFE Baader U white balance to the stacked pic.

The color should become more like what you see with the Sparticle 340BP10.


Gotcha !! Sorry I misinterpreted the first time through. :rolleyes:

 

I have lots of BaaderU white balance presets saved, so I will try this.


Only just seen the latest posts here as I don't seem to be getting notification emails at the moment for some reason. - Also a bit slow as recovering from minor op - but will try to catch up ...

 

 

Andrea - sorry, what did you mean by:

 

 

" 1) I would really like to see a raw color patch beside the predicted color patch."

 

I haven't actually done the prediction for the exact filter used in the sparticle, but plan to do this shortly ...


Gotcha !! Sorry I misinterpreted the first time through. :rolleyes:

 

I have lots of BaaderU white balance presets saved, so I will try this.

 

Might work OK, but maybe best to use the same scene to shoot both the Baader U WB shot, and the Baader U + 340-10 stack shot.

The lighting might not be the same between some other Baader U WB shot and a new Baader U + 340 stack.


The left side of this Edmund 340/10 photo has had a BaaderU white balance + D610 color profile combo applied. And there's the green! On the right side is the usual blue/yellow rendering. In strong sunlight between 10 AM and 3 PM, white balance presets do not vary by much, if at all. I've got scads of such presets to compare to.

helianthus_uvEdmund340x10_sun_20180612wf_10463wbCompare.jpg

 

 

[Off Topic] I still do not know if this Edmund 340/10 filter is recording properly. It is supposedly OD4. So why is the UV-absorbing area on the petal base not as dark as it should be? The filter was rear mounted, but I must still be getting some flare. [/Off Topic]

 

 

For the record, the 340/10 raw colors are shown in the next photo. For this photo the contrast and black/white points were adjusted because Raw Digger only applies the usual histogram scaling and "gamma" curving -- the net effect of which leaves the raw photo looking a bit flat.

helianthus_uvEdmund340x10_sun_20180612wf_10463rawCompPn.jpg

[Off Topic] I still do not know if this Edmund 340/10 filter is recording properly. It is supposedly OD4. So why is the UV-absorbing area on the petal base not as dark as it should be? The filter was rear mounted, but I must still be getting some flare. [/Off Topic]

 

Some thoughts:

- we know that both the sunlight and the sensitivity of the Bayer/microlens-covered sensor drop dramatically below 340 nm. It's possible that even at OD4, the blocking is not enough.

 

- or possibly the flower becomes somewhat reflective again in the short waves?

But in 12 years of using the BaaderU, I've never seen that green in any UV photos.

Andrea, as the wavelength shortens, the camera sensitivity and UV intensity drop quickly. So the relative contribution of the 340nm light compared to the longer wavelengths is much reduced, even though the filter still shows good transmission.

 

A while ago I shared this for my monochrome camera, showing how the sunlight and sensor sensitivity skewed the effective transmission curve for the filter (post #17):

 

http://www.ultravioletphotography.com/content/index.php/topic/2644-thought-expt-correcting-filter-transmission-with-camera-sensitivity-and-light-intensity/page__view__findpost__p__19974

 

With a multispectral conversion, rather than a monochrome one, there'll be even less contribution at 340 nm, as the sensor sensitivity curve drops more steeply below 400 nm. Hence it's not surprising you don't see green. Hopefully that makes sense; if not, I'll try to prepare a graph to show the effect more clearly.
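The weighting effect can be sketched numerically. Every curve below is an illustrative Gaussian stand-in (not measured data); the point is simply that the recorded contribution at each wavelength is the product of filter transmission, sensor sensitivity and source intensity:

```python
import numpy as np

# Illustrative sketch only: all three curves are assumed stand-ins,
# not measured data. The point is the product structure.
wl = np.arange(280, 451)  # wavelength grid, nm

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

t_filter = gauss(wl, 360, 30)                # broad UV band-pass
s_sensor = gauss(wl, 450, 60)                # steep roll-off into the UV
i_sun = np.clip((wl - 300) / 150, 0, None)   # crude solar ramp

# Effective weighting of each wavelength in the recorded image
effective = t_filter * s_sensor * i_sun

# Relative contribution of 340 nm light vs 400 nm light
ratio = effective[wl == 340][0] / effective[wl == 400][0]
print(f"340 nm carries {ratio:.2f}x the weight of 400 nm")
```

Even with the filter transmitting well at 340 nm, the sensor and sunlight roll-offs push its effective weight well below that of 400 nm.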


(note: I just accidently del-ed my post. Sorry. Will try to recreate.)

 

Yes, of course, that makes sense for the broadband BaaderU.

 

However, all that matters at 340nm is that there is enough 340 in sunlight for filters such as the 340/10 to be useable. Which apparently there is or I would not be able to make the sunflower photos above. :D

 

But I would like to let this topic get back on track now that I finally managed to apply the BaaderU settings to the 340/10 photo. :lol: :lol: :lol:


But in 12 years of using the BaaderU, I've never seen that green in any UV photos.

 

Probably because we don't come across a 340 nm source of 10 nm width except in the artificial situation of Sparticle images. Flower reflectance spectra, for example, tend to show relatively few peaks and troughs, with the most rapid changes around the UV/visible border and relatively little spectral shape between 320-380 nm (based on Chittka's work).

 

So if you look at my second graph in the first post of this thread, you'll see that you only get the green over a narrow band. Once the spectral content hitting the camera is broadened out (say, a 330-360 nm source), the red and green channels will tend to equalize, producing a yellow.


Update:

 

Continuing with the plan:

 

First, I fitted continuous curves through my sensor response data by fitting two Gaussians to the data points, using the Solver function in Excel to minimize the sum of squared errors:

 

post-175-0-61822400-1534595578.jpg

 

Just to recap: this is the measured RGB response (normalized to source irradiance) of a Nikon D3200 with a physical BG40 2 mm filter and Nikkor EL 80 mm f/5.6 (old style) lens, plus a modeled UG1 2 mm filter, modeled sunlight and modeled white balance (that is, normalized so that the area under each curve is the same).
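The Excel Solver step has a direct Python equivalent: a least-squares fit of a sum of two Gaussians. The wavelength grid and "true" parameters below are synthetic stand-ins, not the measured D3200 data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Python sketch of the Excel Solver fit: least-squares fit of a sum of
# two Gaussians. The sample data are synthetic stand-ins, not the
# measured D3200 response.
def two_gauss(x, a1, mu1, s1, a2, mu2, s2):
    g = lambda a, mu, s: a * np.exp(-0.5 * ((x - mu) / s) ** 2)
    return g(a1, mu1, s1) + g(a2, mu2, s2)

wl = np.arange(300, 461, 10.0)
true_params = (1.0, 360.0, 20.0, 0.6, 420.0, 30.0)   # hypothetical
samples = two_gauss(wl, *true_params)

# curve_fit minimises the sum of squared errors, like Solver's SSE target
fit_params, _ = curve_fit(two_gauss, wl, samples,
                          p0=(1, 350, 25, 0.5, 430, 25))
sse = float(np.sum((two_gauss(wl, *fit_params) - samples) ** 2))
print(f"SSE of fit: {sse:.3e}")
```

As with Solver, a sensible starting guess (`p0`) matters; a two-Gaussian model has several local minima if the initial peaks are placed badly.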

 

Then I used data that Jonathan kindly provided on the Sparticle filter transmissions. In each case I approximated the transmission by a Gaussian fit (adjusting the parameters manually in Excel until the curve matched the data).

 

Then I convolved the filter transmission with the RGB response for each filter and summed to get the predicted response (i.e. at each wavelength between 280-450 nm I multiplied transmission and response, then summed over all wavelengths).
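The multiply-and-sum step can be sketched as follows; the RGB response curves here are illustrative Gaussians, not the fitted model, and `predict_rgb` is a hypothetical helper name:

```python
import numpy as np

# Sketch of the prediction step: multiply filter transmission by the
# channel response at each wavelength, then sum over 280-450 nm.
# The response curves are illustrative stand-ins, not the fitted model.
wl = np.arange(280, 451, 1.0)

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

response = {                       # hypothetical RGB response model
    "R": gauss(wl, 370, 25),
    "G": 0.5 * gauss(wl, 350, 20),
    "B": 0.8 * gauss(wl, 380, 30),
}

def predict_rgb(center_nm, fwhm_nm):
    """Predicted raw RGB for a narrow band-pass filter."""
    sigma = fwhm_nm / 2.355                  # FWHM -> Gaussian sigma
    t = gauss(wl, center_nm, sigma)          # filter transmission
    return {ch: float(np.sum(t * r)) for ch, r in response.items()}

print(predict_rgb(340, 10))
```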

 

Then I compared the Sparticle image of the filters with the predicted colours from the model. The Sparticle image was obtained with the same camera/filters as described above, raw converted and white balanced in Photo Ninja with reference to PTFE tape within the same image.

 

It is immediately apparent that although the colours look approximately correct, the luminance is low at the short and long wavelength ends. So the luminance was adjusted manually to get a match by eye.

 

 

Results

post-175-0-80398100-1534596138.jpg

 

Discussion

 

A reasonable match between the measured and predicted colours can be obtained, but only if the luminance is manually adjusted to obtain that match.

 

I am unsure why this is. I think it is most likely due to non-linearity of the sensor response with respect to source irradiance. In the original measurements using a monochromator, the output irradiance was reasonably independent of wavelength, but the exposure time was varied to avoid saturating the sensor. The exposure time at 400 nm was roughly 1/20 of that at 340 nm; the response was then scaled in proportion to exposure time and output irradiance (aperture and ISO were not adjusted). The underlying assumption here is that response is proportional to exposure, but in reality it is likely to have a sigmoid shape.

 

I have used a modeled solar spectrum rather than a measured one. This is also likely to impact the results, since not only does the absolute UV level vary with sun position and time of day, but the ratio of shorter to longer wavelengths (say 350 vs 400 nm) may also change considerably.

 

The colours and image appearance vary a little depending on what software is used to demosaic and white balance. This will also have some impact on the relationship between model and image.

 

There could be some inaccuracy in the filter transmission measurements, although I think this would only be small and have limited impact compared to the points above.

 

Conclusion

 

The work appears to show promise and to be heading in the right direction, although quite a few issues need to be addressed. The Bayer/sensor response shown appears to have roughly the correct RGB ratios at each wavelength, but the overall responses are probably overestimated at the peak and underestimated at the short and long wavelength ends.

 

This is probably as far as I will take this for now, as I think it would require a bigger investment of time than I have available to complete properly. I feel I have done enough to illustrate the principle, and ideally I will keep on the lookout for someone else to work with to take this forward. My main thrust now, I think, has to be more on the art side.

 

 

 

I'd be interested in feedback ...


The issue of non-linearity (of sensor response vs exposure) is described very well in this paper:

 

Garcia et al. determine spectral sensor response in two steps: first they determine response vs exposure and fit a curve so that they can work back from a measured response to a linearised exposure (the exposure that would have given that response if the system were linear; described here in detail). Then they determine the linearised relative response as a function of wavelength.
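The inversion step can be sketched as below. The gamma-like response curve is an assumed stand-in for a camera's real measured response-vs-exposure data, and `linearise` is a hypothetical helper name, not from the paper:

```python
import numpy as np

# Sketch of the linearisation idea: assume a known (here, gamma-like)
# response-vs-exposure curve, then invert it by interpolation so a
# measured value maps back to the exposure a linear system would need.
exposure = np.linspace(0.0, 1.0, 256)
response = exposure ** (1 / 2.2)        # assumed non-linear response

def linearise(measured):
    """Map a measured (0-1) response back to relative exposure."""
    return float(np.interp(measured, response, exposure))

# A reading halfway up the response curve corresponds to far less than
# half the exposure (roughly 0.5**2.2, i.e. about 0.22)
print(linearise(0.5))
```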

 

In my work with the monochromator I haven't taken this requirement for linearisation into account, and the same issue arises with the Sparticle images. As shown by Jonathan here, the transmission through the 396, 405 and 406 filters is about 2.5x that of the other filters. The solar irradiance is also lower at shorter wavelengths, so the overall exposure at the shorter wavelengths may be about 1/4 of that at 400 nm.

 

Looking at the response curves from Garcia (see below) suggests to me that the response at 1/4 of peak exposure might be around 2x what a linear response would predict, which would fit with the fact that my Sparticle images look brighter than I predicted. It is interesting that the response-exposure curves have slightly different shapes for different channels, which suggests that not only will the absolute intensity of the response change, but so too will the (false) colour.

 

Of course in photographic processing we tend to play around with the exposure and maybe reduce highlights and boost shadows anyway, suggesting that aesthetically we do not want a linear response. But for any quantitative work, the non-linear response needs to be considered.

 

Any thoughts ?

 

post-175-0-51683900-1534691116.png


Just a thought to add to the above ...

 

It would be important to take linearity into account when comparing the spectral responses of two different cameras, as an observed spectral response difference could actually be due to different exposure/response curve shapes.


You also need to keep in mind how the image is handled and how the data are extracted, Jim. Quick demonstration: take 8 Spectralon diffuse reflectance calibration standards with reflectances between 2% and 99%. Put them out in sunshine and photograph them with a monochrome 5DS R, Asahi UAT 85 mm lens and Baader U filter at ISO 800, f/8 and 1/30 s, saving both a RAW file and a JPEG in camera, and you get something that looks like this:

post-148-0-66205300-1534693309.jpg

 

Take the JPEG and extract grey-scale scores for the 8 tiles in ImageJ (numbers between 0 and 255); for the RAW file, open it in RawDigger as a RAW composite, read the channel information from each of the 8 tiles, and average them together to get an average response. Plot both of these against the reflectance of the tiles and you get these two graphs (JPEG data on the left, RAW composite data on the right):

post-148-0-73719100-1534693423.jpg

 

I'll need to go and re-read the Garcia paper, but depending on how they handled the RAW files they could be dealing with data which has already been heavily processed, which they then need to process further to get the curve back out of it. This is why I do all my work using RAW composite files: I get a linear response to changes in intensity.
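The contrast between the two graphs can be sketched with a toy model; the reflectance values and the 1/2.2 JPEG encoding below are illustrative assumptions, not the actual 5DS R data:

```python
import numpy as np

# Sketch of the Spectralon comparison: a linear sensor model for the
# RAW composite vs a gamma-encoded JPEG. All numbers are assumed.
reflectance = np.array([0.02, 0.05, 0.10, 0.25, 0.40, 0.60, 0.80, 0.99])

raw = 4000.0 * reflectance                 # linear in reflectance
jpeg = 255.0 * reflectance ** (1 / 2.2)    # gamma-encoded preview

# RAW correlates perfectly with reflectance; the JPEG curve does not
r_raw = np.corrcoef(reflectance, raw)[0, 1]
r_jpeg = np.corrcoef(reflectance, jpeg)[0, 1]
print(f"RAW r={r_raw:.4f}, JPEG r={r_jpeg:.4f}")
```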


Thanks Jonathan - a very nice demonstration.

 

I'm a bit puzzled now. I now recall that the sensors are actually quite linear inherently. This is what Garcia et al. say in their paper about measurement of the RGB values:

 

"Images were recorded in the native raw format for each camera and subsequently converted into the sRGB IEC61966-2.1 color space and encoded into an 8-bit scale employing Adobe Camera Raw plug-in (v 6.2) for Photoshop version CS5 (Adobe Systems, San Jose, CA)."

 

So I guess it is that Adobe Camera Raw conversion that introduces the non-linearity in their case. It seems a bit crazy, then, that they went to such lengths to correct for it; maybe they didn't have any other way to access the actual raw values?

 

Anyway, regarding my work: for the monochromator measurements I used the raw RGB values from RawDigger (I think I am right in saying the values given are RAW composite values, independent of the display method used).

 

So I think then the spectral response curves should be OK and my concerns over non-linearity there are unfounded.

 

To compare the images with my model's predictions, I applied white balance to the Sparticle images using Photo Ninja (based on the PTFE tape in the image); I guess this is where the non-linearity comes in?

 

I suppose I should check the predictions without white balance against the raw composite image of the sparticle to clarify ...


(this is pertaining to several posts back)

Given that all the BP filters are shot at the same time, with the same exposure, it seems there is ample sensor sensitivity to record the green at about the same brightness as the other BP filters.

However, this is from back lighting, not UV reflected from objects.

I can't say how much light in the 340BP10 band is being reflected from objects, but that range of UV is illuminating them, and the sensor seems able to respond strongly enough compared with the other bandpass sections of the range.


So further to earlier post I have now investigated further.

 

The sparticle image seen in Rawdigger as raw composite is shown here:

 

post-175-0-16513100-1534775131.png

 

(This is 5 sec exposure iso 800, Nikkor EL 80 mm (old style) D3200 + UG1 2mm + BG40 2 mm)

 

In this shot some of the filters are just over-saturating (in the R channel), but if I use an example with a lower exposure, some of the filters are very difficult to see.

 

I used RawDigger to obtain average raw RGB values in 100x100 squares (on an image equivalent to the above, but with a 2 sec exposure to avoid over-saturation). I used just one of the G values.

 

Then I scaled these according to the total area under each filter's transmission curve, to compensate for the variable amounts of total transmission.
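The scaling step, sketched with made-up numbers (hypothetical transmission curves and raw averages, not the measured ones):

```python
import numpy as np

# Sketch of the scaling step: divide each filter's raw value by the
# area under its transmission curve, so filters that simply pass more
# total light are not over-weighted. All numbers are illustrative.
wl = np.arange(300, 451, 1.0)

def band(center, fwhm, peak):
    s = fwhm / 2.355
    return peak * np.exp(-0.5 * ((wl - center) / s) ** 2)

filters = {340: band(340, 10, 0.6), 405: band(405, 10, 0.9)}
raw_green = {340: 120.0, 405: 900.0}       # hypothetical raw averages

# Area via a plain sum, since the grid spacing is 1 nm
scaled = {c: raw_green[c] / float(np.sum(t)) for c, t in filters.items()}
print(scaled)
```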

 

I have plotted these together with my monochromator measurements (to recap: these are in 10 nm bands, with values also obtained as raw values using RawDigger, and the same lens, camera and BG40 2 mm filter; the UG1 2 mm filter and sunlight were modeled afterwards).

 

These are shown below - the overall normalization is consistent within each group of curves, but arbitrary.

 

post-175-0-26220000-1534775684.jpg

 

 

This looks good to me - not an exact match, but sufficiently close to me to indicate that the response curves are essentially correct as they match based on two separate measurement methods.

 

Of course I should have been more aware of some of this before, but the key learning point for me is that when the image is rendered from the raw values, as well as the white balance correction, a gamma correction is also likely to be applied (and has been in the examples I have shown in previous posts).

 

This is explained nicely here (although of course you all knew this already ...)

 

If one also considers that green appears perceptually brighter for the same physical value (maybe opening a can of worms here ...) then we can explain why the 340 nm band pass filter can appear quite bright (after wb and gamma correction) despite the sensor response being pretty low at this wavelength. However in a real life scene it is unlikely that there will be much reflecting just in a narrow band around 340.

 

Quick demo of apparent brightness of green:

 

Here RGB set separately to the same value, but green looks brighter.

 

post-175-0-32685100-1534777038.jpg
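The perceptual effect has a standard numeric counterpart: in the Rec. 709 relative-luminance formula, green carries roughly 72% of the luminance weight. A minimal check:

```python
# Rec. 709 relative-luminance weights: equal channel values do not give
# equal perceived brightness, because green is weighted most heavily.
def channel_luminance(r, g, b):
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

print(channel_luminance(1, 0, 0))   # pure red   -> 0.2126
print(channel_luminance(0, 1, 0))   # pure green -> 0.7152
print(channel_luminance(0, 0, 1))   # pure blue  -> 0.0722
```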

 

Incidentally, along the way in this latest work I discovered a small error in what I had done before relating to the green channels, such that the values were all slightly (~10%) higher than they should have been. I am never sure whether to use the average of G and G2 or to sum them; if not stated, I have taken the average. In the above I actually compared the average of G and G2 for the monochromator measurement with just the G values, as I found very little difference between G and G2.


If I apply a gamma correction after white balance to my monochromator data this is what I get:

 

(What I mean is: first I normalize the curves so that each has the same area (= 1), then apply the formula V_new = V_old^(1/gamma), where V_old is the original value, V_new is the new value and gamma = 2.2.)
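As a sketch, the normalise-then-gamma step looks like this (the channel curve is an arbitrary Gaussian stand-in, not the measured data):

```python
import numpy as np

# The normalise-then-gamma step from the text: scale the curve to unit
# area, then apply V_new = V_old**(1/gamma) with gamma = 2.2.
wl = np.arange(300, 451, 1.0)
curve = np.exp(-0.5 * ((wl - 360) / 25) ** 2)   # hypothetical channel

curve = curve / np.sum(curve)      # unit area (1 nm grid spacing)
gamma = 2.2
corrected = curve ** (1 / gamma)

# Gamma correction lifts small values relatively more than large ones,
# which is why the faint patches at the band ends brighten most
lift = corrected / np.maximum(curve, 1e-12)
print(lift[0] > lift[len(wl) // 2])
```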

 

post-175-0-25203500-1534789518.jpg

 

(for ease of comparison here is the graph without gamma correction)

 

post-175-0-68565800-1534789565.jpg

 

Seems plausible that this accounts for the observed Sparticle images.

  • 2 weeks later...

I have done some further work and refinements on this and got about as far as I can take it now. Very useful learning experience for me and hopefully of interest to others.

 

A quick recap: I measured the camera response using a Xe arc lamp and monochromator - camera: full-spectrum D3200 + BG40 2 mm + Nikkor EL 80 mm. Then I added models of the UG1 2 mm filter and sunlight; this response was fitted with a sum of two Gaussians for each of the RGB channels.

To check that the model appears reasonable, I imaged a series of narrow-band filters in sunlight and used the response model and filter transmission measurements to predict the appearance of the filters (the "Sparticle").

 

Here are the predicted and actual filter colours (shown as image and graphically)

 

post-175-0-50366600-1535481881.jpg

 

This is an improvement on the previous result - I updated the solar model parameters (see here), as I previously had the wrong time of day. This can have a big impact on the relative irradiance at different wavelengths.

I also made some changes to the filter transmissions. The major change was a reduction in the transmission through the 365 nm filter - this may be a coincidence, but it seemed to be too high roughly in proportion to its area (it's bigger than the others).

 

As this looked like a reasonable match to the raw RGB values, I moved on to predicting colours after white balancing.

Previous work showed that the filter images at either end needed to be boosted, so I introduced a gamma correction of 2.2. This was done after white balancing, individually for the RGB channels.

White balance was simply determined by measuring the area under the response model and scaling the blue and green channels in inverse proportion to their ratios to the red channel.
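That white-balance step can be sketched as follows; the response curves here are illustrative stand-ins for the fitted model, not the real fitted parameters:

```python
import numpy as np

# Sketch of the white-balance step: integrate each channel's modelled
# response, fix red at gain 1, and scale green and blue in inverse
# proportion to their area ratio against red.
wl = np.arange(280, 451, 1.0)

def gauss(mu, s, a=1.0):
    return a * np.exp(-0.5 * ((wl - mu) / s) ** 2)

response = {"R": gauss(370, 25),
            "G": gauss(350, 20, 0.5),
            "B": gauss(380, 30, 0.8)}
areas = {ch: float(np.sum(r)) for ch, r in response.items()}

wb_gain = {ch: areas["R"] / a for ch, a in areas.items()}  # R gain == 1
balanced = {ch: areas[ch] * wb_gain[ch] for ch in areas}   # equal areas
print(wb_gain)
```

After applying the gains, each channel integrates to the same total, which is exactly the "equal area under each curve" normalization described earlier in the thread.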

 

This is the result compared to the image obtained in View NX2 by white balancing on Teflon tape:

 

post-175-0-68629900-1535482551.jpg

 

This looks convincing enough to me.

 

I will come back and add more comments later ...


Are you using any color profiles for this system of camera+lens+filter+light?

 

Something which might be useful: take each of your predicted and actual colors and find their color wheel reference color at full saturation and brightness. It isn't all that important to do this, but it does sometimes give a better "feel" for what a color is. For example, your 407 color bars are so dark that I cannot see what color they might be.


I have to admit ignorance here, as I am not sure what you mean by colour profiles. Could you educate me?

 

I have been working with RGB, but I will have a look at HSL values


Reflections on this work:

 

1. As proof of principle / pilot work it has been successful.

2. The monochromator method works, although there is probably some low level of stray light, which I would need to investigate and quantify.

3. To compare with the Sparticle accurately, quantifying the relative filter transmissions is important and a potential source of error.

4. It is important to get an accurate solar spectrum, although white balance will reduce solar spectrum variations.

5. The exact method used to convert from RAW to a rendered, white-balanced image varies with software in ways that might not be clear to the user.

