UltravioletPhotography

[UV SAFETY WARNING] Raspberry Pi HQ: affordable, fast, UV-sensitive sensor!


dabateman


Yes, the sensor is sensitive to UV without Bayer removal. In fact, if you have a tight 365 nm LED light source and a dark room, you can image without any filters, as the stock CM500 filter on the sensor passes roughly 350 nm to 650 nm.

You can add a 3 mm ZWB2 filter to your lens to do UV with a stock camera. The stock CM500 is only 1 mm thick, so it is not the best at removing all IR leakage. Adding S8612 is best.

 

I use presets with fixed ISO and shutter speeds, with white-balance settings depending on my HQ module.

The code I posted in that link contains my template program.
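dabateman's actual template isn't reproduced here, but a minimal sketch of the fixed-preset approach with the legacy `picamera` library might look like the following. The preset names and ISO/shutter/gain values are illustrative placeholders, not his real settings:

```python
# Sketch: fixed-exposure presets for the Pi HQ camera (legacy picamera API).
# All numeric values below are illustrative, not dabateman's settings.
PRESETS = {
    # name: (iso, shutter_us, awb_gains)
    "uv_365nm": (800, 100000, (1.0, 1.0)),   # 1/10 s, neutral WB gains
    "visible":  (100, 10000,  (1.5, 1.2)),   # 1/100 s
}

def apply_preset(camera, name):
    """Apply a fixed-exposure preset to a picamera.PiCamera instance."""
    iso, shutter_us, awb_gains = PRESETS[name]
    camera.iso = iso
    camera.shutter_speed = shutter_us   # microseconds
    camera.exposure_mode = "off"        # lock exposure (after AGC settles)
    camera.awb_mode = "off"             # manual white balance
    camera.awb_gains = awb_gains
```

With a camera attached you would then do something like `with picamera.PiCamera() as cam: apply_preset(cam, "uv_365nm"); cam.capture("shot.dng")`.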

 

I think I read somewhere that the stereo HAT doesn't work well with the HQ. I bought a camera scheduler, but still haven't fully tested it.

https://www.waveshare.com/camera-scheduler.htm

 

https://www.raspberrypi.org/forums/viewtopic.php?t=267761

 


If the camera is sensitive to X-rays and doesn't get damaged (sensor and memory), you could make a pinhole out of lead and use it for reflected X-ray photography, which is something I have never seen done (airport X-ray scanners using backscattered X-rays are similar to that). I will not mess with X-rays now or in the near future: in Italy you have to ask permission even to own a tube, and of course X-rays are dangerous.

 

MaxMax sells VUV phosphors; maybe those can let you see in deep UV. The concept is similar to the SWIR upconversion phosphors Andy tested.


3D-printing materials (the filament) might also leak some IR that can contaminate the images, especially when using the sun as the light source.

 

Light leakage can be tricky to track down in any camera structure.

The amount of light disturbing the UV-image is sometimes rather small.

I often use a laser pointer, painting over all suspect areas while observing the live image from the camera.

Sometimes the light can bounce off surfaces several times before reaching the sensor, but only if the angle is right.



Interesting. Just to make sure we are on the same page: the HAT resizes the images and puts them into one frame.

 

 

Arducam has schedulers, but I don't like their performance (11 FPS).

https://www.arducam.com/product/multi-camera-v2-1-adapter-raspberry-pi/

 

At the moment I'm asking Arducam whether we can use our own cameras with their HAT, or must use their cameras (despite the same sensor array).


Your links look good. Both seem to support the HQ camera.

 

I am mixing up the various stereo Pi versions. There are many, and I once had them straight. There is also StereoPi, which seems to be on version 2 with support for the new Compute Module.

 

You will know more about this than me, as I was only up to speed back in June. It looks like much has changed since then.

 


Reply from Arducam about their "Arducam 12MP*2 Synchronized Stereo Camera Bundle Kit for Raspberry Pi" HAT (link):

"Regular RPI HQ camera doesn’t work on our HAT.

So we sell the bundle kit to fix all the necessary modifications to the cameras.

You’d better to use our bundle as a start point. "

 

Most unfortunate. I'm asking them if they can send me modification instructions.

Thanks for the lens recommendation. As mentioned, too bad it does not have a thread for a filter, but it's easy to tape on a 30 mm filter adapter.

post-134-0-82783700-1608423633.jpg

post-134-0-74562100-1608423652.jpg

post-134-0-04581100-1608423673.jpg


If those filters are 25.5mm, can they screw on behind the lens?

See if you can remove the 5 mm adapter ring, screw the filter onto the camera first, and then the lens onto the filters.

It might work, but I'm not sure about your threads.


The dual camera setup is coming together.

 

Notes:

Currently using two Pis (may change later).

Waiting for another Pi 12MP camera (to replace the styrofoam-mounted camera).

 

I had trouble taking RGB images while the UV light was on. Now I'm wondering how I can imitate the intensity and spectrum of the sun indoors, to test the setup and software before taking it all outside. I would like simultaneous video.

 

Da Batman, you had some recommendations to add a filter to the RGB camera. Now I know why. :)

post-134-0-49020500-1608451827.jpg

post-134-0-14486400-1608451841.jpg

post-134-0-68016900-1608451853.jpg


Try setting the lens to its farthest focus setting and the zoom to maximum telephoto, then hold the front touching a book with words and slowly pull back to see if anything comes into focus.

It's possible the added glass is a lot, and you may need to add a 5 mm spacer after the filters and try my test again.

Too much distance pushes the infinity mark into super macro, but too much glass behind a lens will shift the closest focus point toward infinity. So you need to see whether it went macro or beyond infinity.


I'm still hoping to connect two of these cameras to one platform. Using the Pi had a bottleneck of only one MIPI port, and the workarounds were subpar. I currently have two Pis, one for each camera, and have not started streaming yet.

 

Now I'm looking at the Jetson Nano. It's much like a Pi, but with an extra chip for AI (which we don't need to use). And it's only $99.

 

The Jetson Nano Dev Kit has two MIPI ports:

https://developer.nvidia.com/embedded/jetson-nano-developer-kit

 

YouTube tutorial on using two cameras on it in Python:

 

And there are several examples of adding a battery pack to it and making it portable:

https://www.nvidia.com/en-us/autonomous-machines/embedded-systems/jetbot-ai-robot-kit/

 

looks promising...
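The two-camera setup from the tutorial can be sketched in Python through OpenCV's GStreamer backend, which is the usual route for the Nano's CSI ports. This is a minimal sketch assuming the standard `nvarguscamerasrc` element; the resolution and framerate values are example choices:

```python
# Sketch: pipeline strings for the two CSI cameras on a Jetson Nano.
# Each MIPI port is addressed by sensor-id (0 and 1).

def gst_pipeline(sensor_id, width=1920, height=1080, fps=30):
    """Build an nvarguscamerasrc pipeline string for one CSI camera."""
    return (
        f"nvarguscamerasrc sensor-id={sensor_id} ! "
        f"video/x-raw(memory:NVMM), width={width}, height={height}, "
        f"framerate={fps}/1 ! nvvidconv ! "
        "video/x-raw, format=BGRx ! videoconvert ! "
        "video/x-raw, format=BGR ! appsink drop=1"
    )

# On the Nano itself (requires OpenCV built with GStreamer support):
# import cv2
# cap_rgb = cv2.VideoCapture(gst_pipeline(0), cv2.CAP_GSTREAMER)
# cap_uv  = cv2.VideoCapture(gst_pipeline(1), cv2.CAP_GSTREAMER)
# ok_rgb, frame_rgb = cap_rgb.read()
# ok_uv,  frame_uv  = cap_uv.read()
```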


Extra note on the MIPI camera to Pi setup:

 

100 cm long MIPI ribbon cable works great.

200 cm long MIPI ribbon cable works, but you lose connection to the camera every 10 minutes.


Got two of these cameras streaming to one platform, using the Jetson Nano.

 

I'm only using the default examples, so I'm not getting raw images or adjusting the exposure yet.

 

I had to remove R8 first:

 

The 100 cm ribbon cable was too long. I had to shorten it to work reliably.
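For the exposure adjustment mentioned above, the `nvarguscamerasrc` element exposes manual-exposure properties that can be set in the pipeline string. A minimal sketch, assuming the standard JetPack property names (`exposuretimerange` in nanoseconds, `gainrange`, `aelock`, `awblock`); the values are illustrative:

```python
# Sketch: locking exposure and gain in the nvarguscamerasrc element.
# Setting both ends of a range to the same value pins that parameter.

def locked_exposure_src(sensor_id, exposure_ns=33000000, gain=1.0):
    """Source element with auto-exposure, auto-gain, and AWB disabled."""
    return (
        f"nvarguscamerasrc sensor-id={sensor_id} "
        f'exposuretimerange="{exposure_ns} {exposure_ns}" '
        f'gainrange="{gain} {gain}" aelock=true awblock=true'
    )
```

This fragment would replace the plain `nvarguscamerasrc sensor-id=N` at the front of the capture pipeline.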

post-134-0-19908700-1609242715.jpg


You're lucky the R8 was so close to the edge of the board and could just be wiped off.

 

I wonder why that modification is needed. What makes the difference? What have you lost or gained? Is the driving voltage different?

 

Now that you have the soldering irons out, there are other timing modifications you can make. The pins for flash and extra-long exposure have been discussed on the Pi camera board. You can expose for hours.

 

https://www.raspberrypi.org/forums/viewtopic.php?f=43&t=281913



It had something to do with the voltage level of a reset signal.

 

https://www.hackster.io/SaadTiwana/embedded-diaries-how-to-use-rpi-hq-camera-with-jetson-e2063e


Hello Ori333,

I'm reading your thread with great interest. A few months ago I also had an HQ camera; see the link. I tried to remove the Bayer filter. The camera didn't like that.

Best regards,

Wilhelm

 

https://www.ultravioletphotography.com/content/index.php/topic/3883-raspberry-pi-hq-camera-12mp/page__pid__37553__st__40#entry37553

 

Thanks. You got a great image with the quartz lens.


You're right about being lucky. Otherwise I would have had to get the board holder out, change the tips of the irons to match the pad sizes... and use a magnifier.

 

  • 1 month later...

Update: Two live video streams from the Raspberry Pi HQ camera:

One is regular RGB

The other is monochrome (Bayer filter removed) with a quartz lens and filters (XNiteBP1 25.5mm and XNite330 25.5mm).

 

Any comments to get a better UV image?

 

https://imgur.com/a/KY0cNWN

 

 

Edit: I have an XNite330, not a 330C.


Yes. Put lens hoods on those babies! The low contrast can often be improved with a lens hood of appropriate length (which you may have to experiment to find).

