UltravioletPhotography

Corrected Raspberry Pi HQ review to 335nm


dabateman


Updated July 19, 2020.

Well, computers do exactly what you tell them, not what you want. Corrections added below.

 

I have now built my Pi-based camera. It was quite exciting to construct. I have a 4.3" touch screen with the Pi 4B board, and a UPS HAT connected, which takes two of the same batteries used in the Convoy flashlights. So I have it fully mobile.

I got two HQ camera modules. On one, I carefully popped the CM500 filter out with a Q-tip after carefully breaking the glue with tweezers, so I can put it back later.

The stock camera should allow for UV transmission.

 

But the sad news from my test today: the full-spectrum converted camera is only very faintly sensitive at 370nm. It's black with the 335nm and 313nm filters, using the quartz UAT lens and two Exo-Terra bulbs.

I see something faint at 370nm using analog gain switch 16, so at basically ISO 1600, as the Pi uses:

 

ISO = analog gain x digital gain x 100.
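A minimal sketch of that mapping (a hypothetical helper, not part of my camera scripts; this is the nominal formula, and note the update below shows the EXIF values my module actually reports are lower):

```python
# Nominal ISO per the Pi convention: ISO = analog gain x digital gain x 100.
def nominal_iso(analog_gain, digital_gain=1):
    return int(analog_gain * digital_gain * 100)

print(nominal_iso(16))    # -ag 16 -> nominal ISO 1600
print(nominal_iso(8, 4))  # -ag 8 -dg 4 -> nominal ISO 3200
```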

 

But this is most likely the longer-wavelength end of my 370bp15 filter's range.

I get an image with the BaaderU and the U330WB80 improved filter. So the deep-UV end might only be 375nm to 380nm.

I get a good image with the 390bp25 filter, so that may be the filter of choice.

 

So if you want an HQ camera for UV, there is no point in removing the CM500 filter, as the filter lets in more UV than the camera can see (still true after the update). I only saw a half-stop drop at 335nm from the filter. So stacking 2.5mm, or maybe even 3mm, of U360, UG1 or ZWB2 glass with a stock HQ Pi camera should give nice high-band UV. There is space for that behind the lens and in front of the camera if using a Convoy-cut filter sized around 20mm.

 

I haven't tested the IR yet, but others have, and IR always seems easy.

 

Not sure if a monochrome-converted camera would be better. It's possible the stack on top of the sensor is really cutting out the UV. Would need to see a test to see if it's worth the cost.

 

I did figure out how to convert the jpeg + raw files to DNG using PyDNG. I also have code to isolate the four channels, and the modified dcraw also works to develop the jpeg + raw files. So all the back-end work was done prior to today's test.

I have guizero providing on-screen touch camera control, using modified Merlin camera code. I still need to finalize that package.

 

Updated:

So I inspected my EXIF data and found a major problem. I thought I could fix the ISO and let the shutter speed float to long values like a normal camera. However, there seems to be a problem with my version of raspistill: the slowest shutter speed it floats to is 1/16 second. So not very good for testing deep-UV transmission.

 

For my camera module, the analog gain switches map to: -ag 1 = ISO 40, 2 = ISO 80, 4 = ISO 160, 8 = ISO 320 and 16 = ISO 640, with digital gain fixed at 1.

So my original UVB test was conducted at ISO 640 with a 1/16 second shutter speed and F4.5 on the quartz UAT lens. Not much light reaching the camera, and no wonder I didn't see anything.

 

My raspistill does seem to have an issue, as I can't get a shutter speed above 21 seconds (that's with the -md 3 switch). Using the -ex verylong switch, I can only get a 10.2 second maximum shutter speed. So I will need to find a fix or an update.
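For reference, a capture command of the kind these tests boil down to (a sketch using the same switches as my scripts below; the filename is hypothetical, and -ss takes microseconds):

```shell
# 10 second raw capture at -ag 8 (ISO 320 on my module), sensor mode 3
raspistill -f -md 3 -r -ex off -ss 10000000 -ag 8 -dg 1 -o test_ss10.jpg
```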

 

July 19, 2020 images (full-spectrum module, filter removed):

 

Visible, ISO 160 (-ag 4, -dg 1), 1/16 second shutter speed, two Exo-Terra UVB lamps, UAT set to F8, because who doesn't like diffraction at an F44 equivalent:

post-188-0-18640000-1595144023.jpg

 

330WB80 improved filter only, showing what I previously saw: 1/16 shutter speed, ISO 640 (-ag 16, -dg 1), with UAT at F5.6:

post-188-0-07382100-1595144043.jpg

 

330WB80 improved filter only with a correct exposure: 10 seconds, ISO 320 (-ag 8, -dg 1), F5.6:

post-188-0-04262900-1595144058.jpg

 

370bp15 + 330WB80 improved 10 seconds shutter speed, ISO 320 (-ag 8, -dg 1), F5.6:

post-188-0-97780300-1595144072.jpg

 

335bp10 + 330WB80 improved 10 seconds shutter speed, ISO 2560 (-ag 16, -dg 4), F5.6:

post-188-0-62343600-1595144090.jpg

 

335bp10 + 330WB80 improved 21 seconds shutter speed, ISO 320 (-ag 8, -dg 1), F5.6:

post-188-0-66669900-1595144110.jpg

 

 

I have been chatting with Dan at MaxMax. He might be able to significantly improve UV response with a monochrome conversion. He hinted at being able to see a 255nm LED with the sensor after conversion, but he is still working on the HQ camera.

 

I hope it's affordable; it would be very exciting to maybe get into UVC.

 

The only major problem I see with raspistill, once I get it fixed, is that it needs about 7x the duration of your shutter speed on top of the capture itself. So a 10 second exposure actually takes 80 seconds with my camera: the first 5-6 captures are for camera adjustment, then the main capture, then data processing to save. So if I can get the 269 second maximum shutter speed to actually work, that one capture would take over 30 minutes.
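As a back-of-the-envelope check (a hypothetical estimate assuming roughly 7 extra shutter-durations for adjustment frames and saving, which matches the 10 s to 80 s observation):

```python
# Rough wall-clock time for one raspistill capture: the exposure itself
# plus ~7 extra shutter-durations of adjustment frames and processing.
def capture_wall_time(shutter_s, overhead_factor=7):
    return shutter_s * (1 + overhead_factor)

print(capture_wall_time(10))        # 80 s, as observed
print(capture_wall_time(269) / 60)  # roughly half an hour for the longest exposure
```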


Not sure if a monochrome-converted camera would be better. It's possible the stack on top of the sensor is really cutting out the UV. Would need to see a test to see if it's worth the cost.

I think it would certainly help. Quite sad that the sensor is so insensitive to UV.

Yes, I became so used to seeing 313nm with the EM1 that I expect it as a minimum.

I was disappointed that the Sigma SDQ can only see to 335nm, but this is quite sad, not even reaching 370nm.

 

At least I have a full portable computer with the pi and not an expensive camera purchase.

 

I can also figure out how to do all my processing in camera. So I could add an astrophotography camera instead.

I am still eyeing the ZWO 1600mm, but due to COVID it is completely sold out, used copies sell fast, and new ones are on back order. No airplanes has been a huge boon for astrophotography.

 


Dave, it is probably the cover glass on the sensor that is doing the UV blocking...?

Are you referring to the CM500 filter? It should behave more or less like BG39. If he removed it, there should not be any cover glass I think.

Are you referring to the CM500 filter? It should behave more or less like BG39. If he removed it, there should not be any cover glass I think.

 

No, Colin might be correct. I worry about Sony sensors, as Jonathan has shown that even the cover glass on the sensor has coatings that block UV below 350nm.

 

Stefano,

A sensor has many layers. Look at how complicated the Raspberry Pi IMX219 sensor is:

https://www.google.com/amp/s/maxmax.com/maincamerapage/monochrome-cameras/raspberry-pi-mono/amp

 

Raw silicon detection layer, antireflection coating, sub-lens focusing layer with the color filter array, large microlenses to focus onto the color array, then cover glass with antireflection coatings.

Lots of UV-blocking stuff there, as who normally cares about UV detection? 90% of people want the highest possible quantum efficiency at 550nm to see our green world. There might even be coatings there to block the abundance of IR that most also don't want.

 

 

 

WiSi-Testpilot

Da Bateman, thank you for your interesting report. Unfortunately I destroyed my camera when I tried to remove the Bayer filter. The cover glass is very thick and stable. I broke it and then removed it. After that I should have taken a UV photo to see whether it still works.

I would like to have the GSENSE2020BSI sensor...

Best regards,

Wilhelm


Da Bateman, thank you for your interesting report. Unfortunately I destroyed my camera when I tried to remove the Bayer filter. The cover glass is very thick and stable. I broke it and then removed it. After that I should have taken a UV photo to see whether it still works.

I would like to have the GSENSE2020BSI sensor...

Best regards,

Wilhelm

 

What can you use to illuminate down to 200nm?

WiSi-Testpilot

What can you use to illuminate down to 200nm?

 

I would like to see corona discharges on high-voltage lines in daylight.


Are you referring to the CM500 filter? It should behave more or less like BG39. If he removed it, there should not be any cover glass I think.

 

Stefano, here is a digital camera sensor. It is still not 'bare', as it has a clear cover glass mounted on top of the black ceramic holder for the sensor.

This hermetically seals the sensor inside the black ceramic holder, protecting the sensor with the microlenses and colour filter array on its surface, as well as the very fine gold wires that take the signals from the sensor to the motherboard.

What this clear cover glass's transmittance is I don't know, but the astrophotography people who clean off the microlenses and CFA have to remove this clear cover glass, and they replace it with quartz or fused silica to reach the lowest UV wavelengths.

Cheers

Col

post-31-0-15622900-1594975065.jpg


This is not in the main corrected text, as I can't push the shutter speed yet, but if anyone is curious, this is my full-spectrum converted module (no filter) with a 313nm test.

 

2 Exo-Terra UVB lamps

UAT at F5.6

 

313bp25 filter with 330WB80 improved, ISO 320 (-ag 8, -dg 1), with a 10.2 second shutter speed, the maximum with the -ex verylong exposure switch:

post-188-0-39481200-1595144935.jpg

 

You can just see something near the center. If I could get 60 or 100 seconds to work, you might actually see something real.


Maybe I am weird, but one thing I have enjoyed about building the Pi camera is that it's a complete system. I can take images with it on a tripod or hand held.

Then I can bring it to my desk, connect a USB cable and an HDMI cable to my monitor, which has a keyboard and mouse connected, and edit the photos directly on the Pi. I still need to install Hugin, GIMP and RawTherapee, but I can, and I can use the command line with dcraw to make the main changes. The 4channels-like script schoolpost wrote only works on single images, but it works.

If a camera company made a camera like that, where you had full edit control with just a large screen, keyboard and mouse, then I might buy into it. Or I'll just keep building it myself.

I have the 8 GB Pi 4 version, so it is nearly more powerful than my other computers anyway, which just have J1800 processors and max out at 8 GB of RAM.


Nice greens below 340 nm. It seems every sensor sees UV as green in that wavelength range.

 

It is more correct to say that the dye used for green is most sensitive to light below 340nm. The red dye is there a little, but blue is not very sensitive below 370nm.

 

Looking quickly at the individual green channels, they are different: green 1 does not equal green 2. I will have to process the UV shots. I just ran a quick Tiffen 47, 58, 29 filter test to ensure the separation was correct.
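The channel isolation comes down to strided slicing of the Bayer mosaic. A self-contained toy example (synthetic 4x4 mosaic, not real sensor data; which sub-array is which color depends on the sensor's Bayer order):

```python
import numpy as np

# Toy 4x4 Bayer mosaic: each 2x2 cell holds four distinct photosites,
# so the two "green" sub-arrays really are independent samples.
mosaic = np.array([[1, 2, 1, 2],
                   [3, 4, 3, 4],
                   [1, 2, 1, 2],
                   [3, 4, 3, 4]])

g1 = mosaic[0::2, 0::2]  # one of the four half-resolution sub-arrays
g2 = mosaic[1::2, 1::2]  # another sub-array, from the opposite corner
print(g1.shape, g2.shape)  # (2, 2) (2, 2)
print((g1 == g2).all())    # False: separate samples, as seen in the UV shots
```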

  • 3 weeks later...

For anyone interested this is what my marvelous Pi4 camera looks like:

 

Front:

post-188-0-93036700-1596569476.jpg

 

Back:

post-188-0-79960700-1596569468.jpg

 

I have the Waveshare 4.3 inch touch HDMI monitor working with my GUI camera apps on the home screen. Power is delivered by two 18650 batteries, the same ones I use in the Convoy flashlights. I use a CSI-to-HDMI cable adapter and can hot swap camera modules using the HDMI cable, allowing for dedicated UV-A, UV/Vis, IR, and full-spectrum cameras. Any lens can then be used without worrying about adding a front filter.

 

I have the embedded raw working out to DNG files using Csaba Nagy's brilliant PyDNG program. I also create a TIFF file using 6by9's modified dcraw.

 

This is the code I use to get all the files in a folder converted to DNG using PyDNG:

 

from pydng.core import RPICAM2DNG
import os

def files(path):
    for file in os.listdir(path):
        if os.path.isfile(os.path.join(path, file)):
            if ".jpg" in file:
                yield os.path.join(path, file)

for file in files("/home/pi/Downloads/FS/"):
    RPICAM2DNG().convert(file)

 

 

For this to work you need PyDNG installed.

 

This is the code I use to also get the four isolated channels off the sensor:

 

from pydng.core import RPICAM2DNG
import os
from PIL import Image

def files(path):
    for file in os.listdir(path):
        if os.path.isfile(os.path.join(path, file)):
            if ".jpg" in file:
                yield os.path.join(path, file)

def channelSplit(data, file):
    file_name = os.path.splitext(file)[0]

    # The four Bayer sub-arrays are data[0::2, 0::2], data[1::2, 0::2],
    # data[0::2, 1::2] and data[1::2, 1::2]; the suffix on each saved
    # file names the channel. The << 4 shift scales the 12-bit raw
    # values up to the 16-bit TIFF range.
    img_b = Image.fromarray(data[0::2, 0::2] << 4)
    img_b.save(file_name + "_b.tif")
    img_g1 = Image.fromarray(data[1::2, 0::2] << 4)
    img_g1.save(file_name + "_g1.tif")
    img_g2 = Image.fromarray(data[0::2, 1::2] << 4)
    img_g2.save(file_name + "_g2.tif")
    img_r = Image.fromarray(data[1::2, 1::2] << 4)
    img_r.save(file_name + "_r.tif")
    return data

for file in files("/home/pi/Downloads/FS/"):
    RPICAM2DNG().convert(file, process=channelSplit)

 

 

For this to work you also need to install Pillow.

 

This is an example of the code I use for the on-screen display to capture images. This one is configured with a fixed ISO of approximately 160. The HQ camera has an automatic behaviour where it will only set shutter speeds of 1/16 second and faster, nothing slower, so I have fixed settings at one-stop intervals below.
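The one-stop ladder of -ss values used by the capture buttons is just successive doublings, expressed in the microseconds raspistill expects (a quick check, not part of the app):

```python
# One-stop shutter steps from 1/8 s up to 4 s, in microseconds for -ss.
base_us = 125_000  # 1/8 second
steps = [base_us * 2**i for i in range(6)]
print(steps)  # [125000, 250000, 500000, 1000000, 2000000, 4000000]
```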

 

 

from guizero import App, PushButton, Text, Window
from time import sleep
import time
import datetime
import sys, os
import subprocess

def clear():
    show_busy()
    os.system("rm -v /home/pi/Downloads/UV4/*")
    hide_busy()

def show_busy():
    busy.show()
    print("busy now")

def hide_busy():
    busy.hide()
    print("no longer busy")

def fullscreen():
    app.tk.attributes("-fullscreen", True)

def notfullscreen():
    app.tk.attributes("-fullscreen", False)

# Generate timestamp string used to name photos
def timestamp():
    tstring = datetime.datetime.now()
    #print("Filename generated ...")
    return tstring.strftime("%Y%m%d_%H%M%S")

global capture_number
capture_number = timestamp()
video_capture_number = timestamp()

def long_preview():
    show_busy()
    print("15 second preview")
    os.system("raspistill -f -t 15000")
    hide_busy()

def longB_preview():
    show_busy()
    print("30 second preview")
    os.system("raspistill -f -t 30000")
    hide_busy()

def capture_image():
    show_busy()
    capture_number = timestamp()
    print("Raspistill starts")
    os.system("raspistill -f -md 3 -r -ag 4 -dg 1 -o /home/pi/Downloads/FS/" + str(capture_number) + "_ag4_a.jpg")
    print("Raspistill done")
    hide_busy()

def SS_in8():
    show_busy()
    capture_number = timestamp()
    print("Raspistill starts")
    os.system("raspistill -f -md 3 -r -ex off -ss 125000 -ag 4 -dg 1 -o /home/pi/Downloads/FS/" + str(capture_number) + "_ag4_ssin8.jpg")
    print("Raspistill done")
    hide_busy()

def SS_in4():
    show_busy()
    capture_number = timestamp()
    print("Raspistill starts")
    os.system("raspistill -f -md 3 -r -ex off -ss 250000 -ag 4 -dg 1 -o /home/pi/Downloads/FS/" + str(capture_number) + "_ag4_ssin4.jpg")
    print("Raspistill done")
    hide_busy()

def SS_in2():
    show_busy()
    capture_number = timestamp()
    print("Raspistill starts")
    os.system("raspistill -f -md 3 -r -ex off -ss 500000 -ag 4 -dg 1 -o /home/pi/Downloads/FS/" + str(capture_number) + "_ag4_ssin2.jpg")
    print("Raspistill done")
    hide_busy()

def ss_1():
    show_busy()
    capture_number = timestamp()
    print("Raspistill starts")
    os.system("raspistill -f -md 3 -r -ex off -ss 1000000 -ag 4 -dg 1 -o /home/pi/Downloads/FS/" + str(capture_number) + "_ag4_ss1.jpg")
    print("Raspistill done")
    hide_busy()

def ss_2():
    show_busy()
    capture_number = timestamp()
    print("Raspistill starts")
    os.system("raspistill -f -md 3 -r -ex off -ss 2000000 -ag 4 -dg 1 -o /home/pi/Downloads/FS/" + str(capture_number) + "_ag4_ss2.jpg")
    print("Raspistill done")
    hide_busy()

def ss_4():
    show_busy()
    capture_number = timestamp()
    print("Raspistill starts")
    os.system("raspistill -f -md 3 -r -ex off -ss 4000000 -ag 4 -dg 1 -o /home/pi/Downloads/FS/" + str(capture_number) + "_ag4_ss4.jpg")
    print("Raspistill done")
    hide_busy()

def DNG():
    show_busy()
    subprocess.Popen(["python3", "/home/pi/TouchCam/DNG_FS.py", "--yes"])
    hide_busy()

def DCRaw():
    show_busy()
    os.system("/home/pi/dcraw/dcraw -a -6 -T /home/pi/Downloads/FS/*.jpg")
    hide_busy()

def DCRawMono():
    show_busy()
    os.system("/home/pi/dcraw/dcraw -d -6 -T /home/pi/Downloads/Mono/*.jpg")
    hide_busy()

def Chan():
    show_busy()
    subprocess.Popen(["python3", "/home/pi/TouchCam/ChanPil_FS.py", "--yes"])
    hide_busy()

app = App(layout="grid", title="Camera Controls", bg="black", width=800, height=480)
space0 = PushButton(app, grid=[0,0], width=50, height=50, image="/home/pi/TouchCam/icon/del.png", command=quit)

button = PushButton(app, grid=[1,1], width=150, height=150, image="/home/pi/TouchCam/icon/prev.png", command=long_preview)
text1 = Text(app, color="white", grid=[1,2], text="Focus")
space = PushButton(app, grid=[2,0], width=10, height=10, image="/home/pi/TouchCam/icon/100trans.png", command=notfullscreen)

button2 = PushButton(app, grid=[3,1], width=150, height=150, image="/home/pi/TouchCam/icon/cam.png", command=capture_image)
text2 = Text(app, color="white", grid=[3,2], text="Auto")
space2 = PushButton(app, grid=[4,0], width=10, height=10, image="/home/pi/TouchCam/icon/100black.png", command=longB_preview)

button3 = PushButton(app, grid=[5,1], width=150, height=150, image="/home/pi/TouchCam/icon/cam.png", command=SS_in8)
text3 = Text(app, color="white", grid=[5,2], text="AG4 1/8")
space3 = PushButton(app, grid=[6,0], width=10, height=10, image="/home/pi/TouchCam/icon/100black.png", command=DCRaw)

button4 = PushButton(app, grid=[7,1], width=150, height=150, image="/home/pi/TouchCam/icon/cam.png", command=SS_in4)
text4 = Text(app, color="white", grid=[7,2], text="AG4 1/4")
space4 = PushButton(app, grid=[8,0], width=10, height=10, image="/home/pi/TouchCam/icon/100black.png", command=Chan)

button5 = PushButton(app, grid=[1,3], width=150, height=150, image="/home/pi/TouchCam/icon/cam.png", command=SS_in2)
text5 = Text(app, color="white", grid=[1,4], text="AG4 1/2")
button6 = PushButton(app, grid=[3,3], width=150, height=150, image="/home/pi/TouchCam/icon/cam.png", command=ss_1)
text6 = Text(app, color="white", grid=[3,4], text="AG4 1sec")
button7 = PushButton(app, grid=[5,3], width=150, height=150, image="/home/pi/TouchCam/icon/cam.png", command=ss_2)
text7 = Text(app, color="white", grid=[5,4], text="AG4 2sec")
button8 = PushButton(app, grid=[7,3], width=150, height=150, image="/home/pi/TouchCam/icon/cam.png", command=ss_4)
text8 = Text(app, color="white", grid=[7,4], text="AG4 4sec")

busy = Window(app, bg="red", height=100, width=800, title="busy")

app.tk.attributes("-fullscreen", True)
busy.hide()
app.display()

 

Just copy and paste this code into .py files, add the attached icons to the specified TouchCam icon folder, and you will be all set with a GUI camera on the Pi system. Quite fun.

post-188-0-80236700-1596570678.png

post-188-0-16017800-1596570689.png

post-188-0-29526100-1596570700.png

post-188-0-32587900-1596570709.png

post-188-0-27144200-1596570780.png
