UltravioletPhotography

Camera sensitivity limit(s) measurement, and possible way to use your camera as a spectrometer


Stefano


Today, out of curiosity, I tried to see how deep into UV my camera can see. I took my 340 nm LED and a diffraction grating, and wanted to see whether the spectrum extended further than that of a 365 nm LED. It worked, but the result wasn't very precise. In fact, a diffraction grating is only approximately linear for wavelengths significantly shorter than its pitch, and Andy and I discussed this here: https://www.ultravio...post__p__31708.

 

I had two things to do: find an exact formula to calculate the wavelength of a feature by counting pixels in an image (you will see later), and find something with a very precise, known wavelength to use as a reference.

 

Let's start with the reference:

 

I have a CFL bulb. These bulbs emit both mercury lines and phosphor lines. I don't know the wavelengths of the phosphor emissions, but I know the mercury lines very well. So I used two books (as usual) to make a slit, mounted my UV-pass filter (ZWB2 (2 mm) + Chinese BG39 (2 mm)) on the camera, and put a diffraction grating on top. ISO 1600 and +2 exposure compensation weren't enough, so I put the camera in "scene" mode and shot some 60-second exposures at ISO 80. This time it worked wonderfully. The last image looked like this:

[Image: 60-second UV exposure of the CFL spectrum through the grating]

Notice that in "scene" mode I can't white-balance, and those are "RAW" colors.

 

Crop:

[Image: crop showing the two bright lines]

 

See? There are two big lines. The blue one is the 404.7 nm H-line and the pink/orange one is the famous 365.4 nm I-line. So I had two references. In Paint, I measured the distance (in pixels) between the center slit and the two H-lines, and I was surprised by the consistency of the results. The first distance (left line to center slit) is 1473 pixels; the second (center slit to right line) is 1475 pixels. The average is 1474 pixels. To be sure that it was the H-line, I took a visible-spectrum image. The colors are funky because I had a UV white balance set (I didn't use "scene" mode, because it would have overexposed massively):

[Image: visible-light spectrum of the CFL]

 

You can also see the 546.1 nm green line.

 

I tried again to measure the pixel distances of the H-lines, but got much less consistent results: 1462 and 1499 pixels. The average is 1480.5 pixels, which is close to the previous average, but not quite the same. Maybe the different focusing (because of the focus shift between VIS and UV) altered the magnification? Anyway, I was now sure that the first line in the UV shot was the H-line.

 

I measured the distances of the I-lines from the center slit and they are even more consistent: 1320 and 1319 pixels, with 1319.5 pixels as the average. Doing a simple proportion, assuming that 1474 pixels correspond to 404.7 nm, you get 362.3 nm, which is close, but not as close as it should have been. The problem is that diffraction gratings are not linear. It took me a while to figure out a proper formula, and here it is (for reference, see here: https://www.ultravio...dpost__p__31725):

 

I know that sin θ = λ/d, and that tan θ = x/L, so λ = d·x/√(x² + L²), where x is our displacement in pixels and L is the distance between the diffraction grating and the wall. But we don't have a wall, and so we have to calculate an equivalent L. It sounds strange, but to make things work we have to imagine a distance, in pixels, between the camera and the imaginary wall the lines are projected on.

 

To calculate it, I used the known data from the H-line:

sin θ = λ/d → θ = arcsin(404.7 nm / 1000 nm) ≈ 23.9°

tan θ = x/L → L = x / tan θ

L = 1474 px / tan(23.9°) ≈ 3331 px

With x = 1474 pixels.
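In code, this calibration step is just a few lines. A minimal Python sketch using the numbers above (the printed L is only valid for my camera at this zoom):

```python
import math

d = 1000.0        # grating pitch in nm (1000 lines/mm -> 1 um = 1000 nm)
lam_ref = 404.7   # mercury H-line wavelength, nm
lam_ref = 404.7   # mercury H-line wavelength, nm
x_ref = 1474.0    # measured displacement of the H-line, pixels

# First-order grating equation: sin(theta) = lambda / d,
# geometry: tan(theta) = x / L  =>  L = x / tan(asin(lambda / d))
theta = math.asin(lam_ref / d)
L = x_ref / math.tan(theta)
print(round(L))   # effective grating-to-"wall" distance in pixels, ~3331
```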

 

And that gives us our final formula:

λ = (1000 nm) · x / √(x² + 3331²)   (x and 3331 in pixels)

Where x, to be clear, is again the distance in pixels between the center slit and the lines. Note that this formula only works for me and my camera at the same zoom. You can make your own for your camera; you just need to change L.

 

Doing the calculation again for the I-line, we get λ ≈ 368.3 nm, which is closer. Not perfect, but closer. Here one pixel is about 0.28 nm.*

 

Now the spectrum with the 340 nm LED:

[Image: spectrum of the 340 nm LED]

 

Again, "scene" mode and "RAW" colors. This time I measured the distance between the left and right extremes of the spectra (the shortest wavelengths visible in the image) and divided by two. The midpoint isn't exactly in the middle, where the LED is, but offset to the right by about 7 pixels, similar to the visible-light image before; by averaging the two sides I should eliminate this error (in theory).

 

Displacement: 1223 pixels.

Wavelength: 344.7 nm.
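As a sketch, the same conversion in Python (assuming L ≈ 3331 px from the H-line calibration; the function name is mine, nothing standard):

```python
import math

d = 1000.0   # grating pitch in nm
L = 3331.0   # effective distance in pixels (specific to my camera and zoom)

def wavelength_nm(x_pixels):
    # lambda = d * x / sqrt(x^2 + L^2)
    return d * x_pixels / math.hypot(x_pixels, L)

print(round(wavelength_nm(1223), 1))  # 340 nm LED edge: ~344.7 nm
```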

 

...so, apparently my camera can still record a tiny bit at ~345 nm? That's quite deep, but unless I messed up somewhere, that's the limit. The LED should peak between 340 and 345 nm, so I can't see the peak.

 

If you do what I did, you can measure the cut-off wavelength of your camera, and the peak wavelength of LEDs and other sources. If you want to measure above or near 1000 nm, use a diffraction grating with fewer than 1000 lines/mm; 500 lines/mm will work.

 

*corrected after posting.

This is also a function of the lens and filter, I would think? Like, what if the lens rather than the sensor is cutting off everything past 344, how would you know?


You are right. Here we have the sensor (which you can divide into a "monochrome" sensor + Bayer filter), the lens, the filter and the diffraction grating. My camera combined with that filter can (apparently) see that low. The lens is fixed, and the filter probably makes little difference.

 

I think the sensor is capable of more, and the lens is the limiting factor, but I can’t know for sure.


Just a guess, but if the camera is converted to full spectrum, then I think most camera sensors will see down to about 300 nm or lower, given a lens with far enough reach.

So I think the 340 nm limit is not the sensor (camera) but rather the lens.


Stefano, you can also shine your light at a vertical needle instead of using a slit; it will give a nice thin line of reflection.

The 340 nm LED, where can I get one please?



That's a nice idea, thanks.

 

That's the LED: https://www.ultravioletphotography.com/content/index.php/topic/3770-340-nm-led-first-impressions

 

https://www.ebay.com/itm/LED-chip-3W-UVA-340nm-high-power-LED-emitter-diode-deep-UV-345nm-SMD6363-60-70mw/264452973660?_trkparms=ispr%3D1&hash=item3d92a0245c:m:mUg5IqYGSQlu3V0kuD_Ivbw&enc=AQAEAAACQBPxNw%2BVj6nta7CKEs3N0qXfoTVdZN%2BIMiRwvVOp7QOwGC8sd8Io%2F%2BD9I8XOe2MvO48mPyzqoOIC0s%2F0HCmJkcgmXK7CYqfgxG0UZTEH%2Fk8XrarNglgt4GNzuQjkHmnC2NpkT1J%2FP%2F%2BqqvVEgwVrsiO%2FFAif6Bbm%2Fl5Bch4xJLkQqkijdzwm%2BEtQtdtS3oIEtZYAFnBhoyMXzaQK1OAyVmDE60EJ98l2Qo7XbDm0%2BH7rcin61PednZR6bnEJVj7XW99CUuslZ%2BfktguYo4f%2Fa%2F%2FcOVPB8dhaSDvXKtAx%2BgWjE85VaHhsc0pR%2BZ%2Fm19oFMSPJWaUNk1igUBE3nWTAMXIw5f5S33J4F8wVZfpLBd%2FY7zfFk6IVTCWaVaQi%2Bc7PTnzCNoNeyierJi7YrrNKaRoc%2F0lHGgOShY0J%2Bl7i9qyf7wdcuPWIMEBwnP7U7UU80Dy406mSKPqGGaKG4C3k037zhuc%2BDK8BRJ87XyD1W4JuhjAysen%2BikN88tai5PFto%2Fil1gWbcYGLTZj1v2m%2BQTuya%2FHzD3XKh8eUQtmTI5YHdvDfRHcqQe8r50wWOwjDfCpidOF9Admo0Oiym3%2BMGwdnM%2FlgL9Cv2glnDN1G9rWWZqBftJFpVosuDWh01WYHLQn1YL4g25Br8divYivrE2qjaDsCHLgLS9MiSOJ75mv0iykCNKOrB3%2B%2FiNaJhzNy19CjspZVE%2FmwHZGLIy9JzpPQ0Y3Cx5%2Bzt9nh2uGM4tzydtFuvR7IHEfvM67uZnmgrw%3D%3D&checksum=264452973660703bf25fb9464ca4b60eff6adc8ac2df

 

It seems it is no longer available...

 

Probably it is identical to this: https://www.ebay.com/itm/High-Power-3W-UVA-340nm-Led-Chips-Deep-UV-Led-Emitter-Diodes-Radiant-flux-55mW/202120730112?hash=item2f0f559600:m:mjTkCxAMKs2aloJPxecaFYg


Col, please note the radiant flux for the LED in the last link Stefano posted above.

It is only 55 mW, even though the input power is 3 W.

The efficiency is only about 1.8%; the other 98% of the input power ends up heating the LED.

 

I think you already know this and want such LEDs for other purposes:

It is not well suited as a light source for UV photography!

 

Compare with the 365nm-LEDs used in Convoy and Nemo.

They have a conversion efficiency in the neighborhood of 30%.

 

Again, it is not the input power that is the key parameter when looking for light-sources.


Thanks Stefano for the link.

 

Ulf, yes I realise it will be a dim light. I just want it because I don't have any LEDs lower than 365 nm, and I want to see what the camera sees, etc.


...I have some issues and need some help. I tried the formula with a 532 nm laser pointer, which of course comes with a bonus 808 nm line from the pump laser. I got 1874 pixels for the green laser and 2903 pixels for the IR one. Plugging the numbers into my formula, I got 490 nm and 657 nm. Instead, assuming linearity and using the 404.7 nm line as a reference, I get 514.5 and 797 nm. Much better. Now... is the grating actually linear? This doesn't make any sense to me. I am sure that the grating has 1000 lines/mm, since I measured θ = 32° with the 532 nm laser, and the theoretical angle is 32.14°. My formula has a horizontal asymptote at 1000 nm as x grows towards infinity, and is approximately linear for small x values. Can someone figure out what's wrong?

 

I measured the angle of the IR laser. I got ~54.5°, which corresponds to 814 nm.
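To show what I mean, here is a small Python sketch of my formula (with L ≈ 3331 px from the H-line calibration) applied to the two laser displacements, plus the asymptote:

```python
import math

d, L = 1000.0, 3331.0  # grating pitch in nm; effective distance in px

def wavelength_nm(x):
    # lambda = d * x / sqrt(x^2 + L^2)
    return d * x / math.hypot(x, L)

print(round(wavelength_nm(1874)))   # green laser dot -> ~490 nm
print(round(wavelength_nm(2903)))   # IR dot -> ~657 nm
print(round(wavelength_nm(1e9)))    # x -> infinity: approaches d = 1000 nm
```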

Stefano, lenses have distortion also, and that distortion is often a function of wavelength to boot. (That's what chromatic aberration is.) Aside from any measurement inaccuracy like misalignments, that's probably the second biggest source of error.


I too thought about lens distortion, but it seems the lens is corrected quite well. Straight lines appear straight, even at the edges, so imaging the diffracted dots should be the same as projecting them on a wall. You mentioned chromatic aberration, but the 808 nm laser line was way off. I could accept 10-20 nanometers of error, which is still big for me, but I am off by more than 100 nm, which is more than 10% of the value.

 

At least, I hope my formulas are correct.


I did some other tests and the results are interesting. I learned something I didn't know. First, as another test confirming that my grating and my laser work as expected, I tried to measure the green laser again. The grating is 1000 lines/mm (d = 1 µm). I projected the dots on a wall. L was 0.439 m, and I measured x as 0.2785 m (I measured the distance between the two extremes and divided by two). Using λ = d·x/√(x² + L²) (again, see here for reference: https://www.ultravio...dpost__p__31725), I got λ = 535.7 nm, which is very close to the real value, 532 nm at room temperature.
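As a quick numerical check of the wall measurement above, in Python (distances in meters):

```python
import math

d = 1.0e-6   # grating pitch: 1000 lines/mm -> 1 um
L = 0.439    # grating-to-wall distance, m
x = 0.2785   # dot displacement on the wall, m

# lambda = d * x / sqrt(x^2 + L^2)
lam = d * x / math.hypot(x, L)
print(round(lam * 1e9, 1))   # ~535.7 nm, vs. the laser's 532 nm
```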

 

I then wanted to simply find the ratio between x at 808 nm and x at 532 nm with a 1000 lines/mm diffraction grating. This ratio is constant. I found the formula for x as a function of everything else, which is x = L·λ/√(d² − λ²) (I just discovered that the formula I wrote here https://www.ultravio...dpost__p__31750 is wrong. That's quite embarrassing; I verified this formula with real numbers and it works). Putting in the numbers and taking the ratio, this came out as 2.183. So, if the first (and only) dot at 532 nm is one meter away from the central spot, the 808 nm one is 2.183 meters away from it. Now I did two different things that I thought were equivalent, but turned out not to be. First I photographed the dots projected on the wall. In theory, if the lens doesn't distort and so on, they should follow the ratio I calculated above. I measured 1179 pixels for the green one and 2738 for the IR one. The ratio is 2.32, not as close as I desired, but quite close.
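The ratio calculation can be sketched in Python; since x = L·λ/√(d² − λ²), the quantity x/L depends only on λ and the L cancels in the ratio:

```python
import math

d = 1000.0  # grating pitch in nm

def displacement_over_L(lam_nm):
    # invert lambda = d*x/sqrt(x^2 + L^2):  x = L*lambda/sqrt(d^2 - lambda^2)
    return lam_nm / math.sqrt(d * d - lam_nm * lam_nm)

ratio = displacement_over_L(808.0) / displacement_over_L(532.0)
print(round(ratio, 3))  # ~2.183
```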

 

Now the interesting part. I put the diffraction grating on the camera and photographed the "virtual" dots. I measured the distances: 1823 pixels for the green dot and 2903 for the IR dot. Their ratio is 1.59, way off the expected 2.183.

 

So projecting the dots on the wall and seeing the virtual ones are not two equivalent processes.

Below are the images I used for the calculations, resized for the forum.

 

Real dots:

[Image: real dots projected on the wall]

 

Virtual dots:

[Image: virtual dots photographed through the grating]

 

You can already see, just by eyeballing the distances, that the ratios are not the same.

 

Edit: I edited the images labelling the dots.

Stefano, I tried googling various versions of this (quite popular) experiment around the web, and in all cases, people used the real images, not virtual images with the grating on the camera. I think probably I would trust that methodology more, at least until we understand the source of the discrepancy.
I agree. With the real images we have exact formulas, and the behavior of the diffraction grating is well understood. As far as I know, real spectrometers project the “rainbow” on the linear sensor directly, but I think prisms are preferred over gratings, and they behave differently.
