UltravioletPhotography

Mars in UV!



Andy Perrin

Mars tricolor pics using three bands between 110 and 340nm! 
 

Mars in UV

 

Given our experiences with UV tricolors here, it seems notable that they got some significant color differences using that range of wavelengths. Apparently our problem is that we have been using too narrow a spectral range: there just aren't enough differences in absorption within UV-A for most substances to make tricolors show very much.
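
For anyone who wants to try it on their own band images, assembling a tricolor is basically just mapping three registered monochrome frames onto R, G and B. A minimal numpy sketch (the function name and the simple per-channel stretch are illustrative choices, not any particular tool):

import numpy as np

def tricolor(band_long, band_mid, band_short):
    """Map three registered monochrome band images (longest wavelength
    first) onto the R, G and B channels, stretching each to full range."""
    channels = []
    for band in (band_long, band_mid, band_short):
        b = band.astype(float)
        b -= b.min()
        peak = b.max()
        channels.append(b / peak if peak > 0 else b)
    return np.dstack(channels)  # H x W x 3 array, values in 0..1

With three nearly identical UV-A frames the result stays grey; color only appears where the bands actually differ in reflectance.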

Link to comment

Indeed. And the same goes for tri-colour IR, by the way, most outcomes being very drab and dull with the odd deviant image in between.

Link to comment

Beautiful images. I can only dream about imaging at 110 nm; silicon doesn't even respond there (I'll have to look up how they did that).

 

Of course, the wider the band, the higher the chance of getting colors.

Link to comment
10 hours ago, Stefano said:

Beautiful images. I can only dream about imaging at 110 nm; silicon doesn't even respond there (I'll have to look up how they did that).

 

Of course, the wider the band, the higher the chance of getting colors.

Silicon absolutely responds to far UVC. MaxMax's monochrome-converted Raspberry Pi HQ sensor has been reported to pick up X-rays.

The problem is that many sensors have material on top of them, and only the topmost layer is sensitive. So you either use a shaved, bare sensor or one optimized for UV. The optimizations include:

The main way silicon sensors are altered to improve UV QE is UV enhancement. This involves additional back-thinning of the sensor, so that even a shallow photon penetration depth can still result in conversion to a photoelectron. The process is similar to traditional back-thinning, in which the p+ layer (a doped layer with an excess of holes) sits at the surface of the sensor, but UV enhancement results in an even thinner p+ layer, bringing the potential peak as close to the sensor surface as possible. This minimizes the dead region at the surface in which UV photons cannot be collected, which is essential for improving UV QE.
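
For a rough sense of why that dead layer matters so much in the UV, here is a Beer-Lambert style sketch (the absorption lengths and layer thicknesses below are illustrative orders of magnitude, not measured silicon data):

import math

def collected_fraction(absorption_length_nm, dead_layer_nm):
    """Fraction of photons that make it past an insensitive surface layer,
    using a simple Beer-Lambert model: exp(-dead / absorption length)."""
    return math.exp(-dead_layer_nm / absorption_length_nm)

# Deep-UV photons are absorbed within nanometres of the surface,
# visible photons within roughly a micron (illustrative values).
for label, absorption_length in [("250 nm UV", 6), ("550 nm VIS", 1000)]:
    for dead in (10, 2):  # hypothetical dead-layer thicknesses in nm
        print(label, f"dead layer {dead} nm:",
              f"{collected_fraction(absorption_length, dead):.2f}")

Thinning the dead layer from 10 nm to 2 nm takes the UV figure from about 0.19 to about 0.72 while barely changing the visible one, which is the whole point of the extra back-thinning.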

Link to comment

I knew that camera sensors can respond to light between 190 and 1100 nm, according to Wikipedia. I also knew about X-rays (even gamma rays and maybe other forms of radiation appear as white dots). But I didn't know about UV deeper than 190 nm or so.

Link to comment
6 hours ago, dabateman said:

Silicon absolutely responds to far UVC. MaxMax's monochrome-converted Raspberry Pi HQ sensor has been reported to pick up X-rays.

The problem is that many sensors have material on top of them, and only the topmost layer is sensitive. So you either use a shaved, bare sensor or one optimized for UV. The optimizations include:

The main way silicon sensors are altered to improve UV QE is UV enhancement. This involves additional back-thinning of the sensor, so that even a shallow photon penetration depth can still result in conversion to a photoelectron. The process is similar to traditional back-thinning, in which the p+ layer (a doped layer with an excess of holes) sits at the surface of the sensor, but UV enhancement results in an even thinner p+ layer, bringing the potential peak as close to the sensor surface as possible. This minimizes the dead region at the surface in which UV photons cannot be collected, which is essential for improving UV QE.

 

I'd try to capture Mars images like that, but the atmosphere cuts out deep UV. A satellite (or maybe a very high mountain?) is needed. Also, atmospheric "seeing" limitations affect UV images far more than, say, infrared ones. I'd rather try this using different bands of IR, which would be comparatively easy. We'd be able to use regular lenses too.

Link to comment

Mars is a relatively small object in the sky, and you would need a focal length much longer than any camera lens offers to make it a reasonable size in the image.

 

Last autumn Jupiter was in a good position for imaging and I tried to capture it.

I think that, seen from Earth, it then had a similar apparent size to Mars.

Even with almost 1000 mm of focal length the planet did not cover that many pixels.
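
For scale, the small-angle maths shows why. A sketch assuming roughly 40 arcseconds for Jupiter near opposition and a ~4 µm pixel pitch (both illustrative numbers):

ARCSEC_PER_RAD = 206265.0

def pixels_across(angular_diameter_arcsec, focal_length_mm, pixel_pitch_um):
    """Pixels spanned by a disc of the given angular diameter:
    image size = focal length * angle (small-angle approximation)."""
    size_mm = focal_length_mm * (angular_diameter_arcsec / ARCSEC_PER_RAD)
    return size_mm * 1000.0 / pixel_pitch_um

print(pixels_across(40, 1000, 4))  # ~48 pixels across at 1000 mm
print(pixels_across(40, 200, 4))   # ~10 pixels with a 200 mm camera lens

So a disc a few tens of pixels wide is about the best one can hope for with roughly 1000 mm of focal length.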

 

  

Link to comment

I think Jupiter is always bigger than Mars in the sky. Right now, Mars is very small.

 

Diffraction is also a limit for resolution.

 

With a large telescope (I think at least 30 cm or 12"), good or excellent seeing and processing software, you can see the discs of Galilean moons.

 

Probably you can take UVA photos of Mars, but very likely UVB is asking too much, unless you have a huge telescope.
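
A quick Rayleigh-criterion check (illustrative, ignoring the central obstruction, and assuming the 30 cm aperture mentioned above):

ARCSEC_PER_RAD = 206265.0

def rayleigh_limit_arcsec(wavelength_nm, aperture_mm):
    """Rayleigh criterion: smallest resolvable angle ~ 1.22 * lambda / D."""
    return 1.22 * (wavelength_nm * 1e-9) / (aperture_mm * 1e-3) * ARCSEC_PER_RAD

print(rayleigh_limit_arcsec(550, 300))  # ~0.46" in the visible
print(rayleigh_limit_arcsec(310, 300))  # ~0.26" in UVB

The Galilean moon discs are roughly 1-1.8", so a 30 cm aperture is within reach when the seeing cooperates; diffraction in UVB is actually slightly better than in the visible, and the atmosphere (transmission and seeing) is the harder problem there, as noted above.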

Link to comment
1 hour ago, Stefano said:

I think Jupiter is always bigger than Mars in the sky. Right now, Mars is very small.

 

Diffraction is also a limit for resolution.

 

With a large telescope (I think at least 30 cm or 12"), good or excellent seeing and processing software, you can see the discs of Galilean moons.

 

Probably you can take UVA photos of Mars, but very likely UVB is asking too much, unless you have a huge telescope.

You are right, but a telescope size like 30 cm or 12" is more about how much light it can collect.

 

The magnification, which is set by the focal length, must also be large to make the planet fill many pixels on the camera.

I am not into astronomy and do not own a telescope. I just wanted to test the limits of the camera and lenses I have.

 

When capturing Jupiter in VIS I could also easily see the Galilean moons, but not as discs, only as pixels of differing intensity.

This is a final result after a lot of processing:

[Image: Jupiter and the Galilean moons, captured 2023-02-28]

I used almost 100 source files and a lot of image processing and enhancement. 
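
The heart of that kind of processing is frame stacking. A minimal sketch of just the averaging step (real planetary stackers also grade frames by sharpness and align them before combining):

import numpy as np

def stack_frames(frames):
    """Average a list of already-aligned frames; noise drops roughly with
    the square root of the number of frames stacked."""
    return np.mean(np.stack([f.astype(float) for f in frames]), axis=0)

With ~100 frames that is roughly a 10x noise reduction, which is what makes heavy sharpening possible afterwards.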

Link to comment

Extended-spectrum astrophotography would make a nice thread on its own here. IR in particular has huge and often overlooked advantages in astrophotography.

Link to comment

Ulf's photo is not bad. It's better than the few I was able to take of Jupiter.

 

This year I followed Venus and saw its phase change as the months passed. I never tried to take UV images of Venus, which is well known to show its cloud structure in the UV, but I took some nice VIS ones using my phone at the eyepiece of my telescope.

 

On multiple occasions I was able to see it in my telescope in broad daylight, as early as 3-4 pm (if you want to try this, take great care to avoid pointing your telescope at the Sun; I waited for the Sun to pass behind a nearby building).

 

Although I was not able to see it with the naked eye, in the telescope it was clearly visible; it had more contrast with the sky than the Moon does, which makes sense.

Link to comment

I don't know. The white balance was done on a gray card in sunlight. That region of the nebula emits mainly narrow-band red (hydrogen alpha, 656 nm) and blue (hydrogen beta). But there may be some IR emission lines from hydrogen as well; I don't know.

Link to comment
