UltravioletPhotography

Moon in UV


Andreas


Several days ago I took images of the moon in ultraviolet light using a Baader U 2" filter in front of a Beroflex 400mm lens mounted on a full-spectrum-converted Olympus OM-D EM10 Mark II camera. I used a Nano Tracker to minimize blurring due to Earth's rotation. Unfortunately, the sharpness of the Beroflex lens is not perfect in the ultraviolet and infrared ranges; nevertheless, in at least two marked areas I can see differences in brightness in the UV image compared to the images taken in the visible and infrared ranges.

The image taken with an Olympus 75-300mm zoom lens mounted on a standard Olympus OM-D EM10 Mark II camera is included for comparison, because the sharpness of this zoom lens at 300mm is better than that of the old Beroflex 400mm lens.

 

Has anyone else tried to take UV images of the moon with some success?

 

 

Ultraviolet Light: Baader U 2"

post-147-0-55083400-1536012207.jpg

 

Visible Light: Hoya UV IR Cut

post-147-0-63357400-1536012217.jpg

 

Infrared Light: Zomei IR720

post-147-0-71413600-1536012231.jpg

 

Visible Light: Olympus 75-300mm at 300mm on Olympus OM-D EM10 Mark II without additional filter

post-147-0-63375200-1536012513.jpg

  • 3 months later...

Now I can add some better-quality images taken with the Olympus OM-D EM10 Mark II and a Celestron C6 with a focal reducer.

The images in visible and infrared light have high color saturation. I think these colors in particular are interesting.

 

By the way, a Celestron C6 is not only a telescope; it is also a telephoto lens with a large focal length of about 1500mm, or about 1000mm with the focal reducer. It is fully usable in UV, even with the focal reducer. I did not observe a focus shift. The resolution of the UV image is reduced only by the long exposure time and atmospheric turbulence.

 

Visible Light: SVBONY UV/IR Cut, Exposure time: 1/90s

post-147-0-95769800-1545502448.jpg

 

Infrared Light: Zomei IR720, Exposure time: 1/90s

post-147-0-94181500-1545502437.jpg

 

Ultraviolet Light: Baader U 2", Exposure time: 8s

post-147-0-90255800-1545502468.jpg


By the way, that darker black/blue area in your original RGB visual shot (and others of the same color) represents titanium-rich areas.

 

Let's desaturate each of your images for a monochrome comparison.

Left to right: Visual, IR, UV. Not much difference without the color.

post-87-0-01648300-1545514057.jpg
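For anyone wanting to reproduce this step, desaturation is just a weighted average of the color channels. A minimal numpy sketch, assuming the frames are loaded as (H, W, 3) arrays; the Rec. 709 luma weights here are one common choice, not necessarily what the editor above used:

```python
import numpy as np

def desaturate(rgb):
    """Collapse an RGB image of shape (H, W, 3) to a single luminance
    channel using Rec. 709 weights (0.2126 R + 0.7152 G + 0.0722 B)."""
    weights = np.array([0.2126, 0.7152, 0.0722])
    return rgb.astype(float) @ weights
```

Applied to each of the UV, visual, and IR frames, this gives the three grayscale panels for a side-by-side comparison.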


Interestingly, they define UV as 450 nm ...

I think that is a typo, occurring once. 415 nm is what they have written everywhere else for the shortest wavelength band.


Wikipedia has more info about Clementine:

https://en.wikipedia.org/wiki/Clementine_(spacecraft)

 

The sensor consisted of a catadioptric telescope with an aperture of 46 mm and fused silica lenses focused onto a coated Thompson CCD camera with a bandpass of 250–1000 nm and a six-position filter wheel. The wavelength response was limited on the short wavelength end by the transmission and optical blur of the lens, and on the long end by the CCD response. The CCD was a frame transfer device which allowed three gain states (150, 350, and 1000 electrons/bit). Integration times varied from 1–40 ms depending on gain state, solar illumination angle, and filter. The filter center wavelengths (and bandpass widths (FWHM)) were 415 nm (40 nm), 750 nm (10 nm), 900 nm (30 nm), 950 nm (30 nm), 1000 nm (30 nm), and a broad-band filter covering 400–950 nm.


Very interesting. If you put the visible image through my Independent Component Analysis method (like I did for the fields here), you can see the color variations better.

post-94-0-47655900-1545552735.jpg
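As a rough illustration of this kind of decomposition (not necessarily the exact method used for the image above), scikit-learn's FastICA can treat each pixel's (R, G, B) triple as one observation and unmix the channels into statistically independent components, which exaggerate subtle color variations when viewed as separate maps:

```python
import numpy as np
from sklearn.decomposition import FastICA

def ica_color_components(rgb):
    """Unmix the color channels of an (H, W, 3) image into three
    statistically independent components via FastICA."""
    h, w, _ = rgb.shape
    pixels = rgb.reshape(-1, 3).astype(float)  # one row per pixel
    ica = FastICA(n_components=3, random_state=0)
    comps = ica.fit_transform(pixels)          # shape (h*w, 3)
    return comps.reshape(h, w, 3)
```

Each of the three component planes can then be contrast-stretched and viewed in grayscale, or remapped back to RGB for a false-color rendering.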


Here is a simple multispectral composite.

 

Blue channel = UV shot desaturated

Green channel = Visual shot desaturated

Red channel = IR shot desaturated

Auto levels, + vibrance, sharpened.

 

post-87-0-63889200-1545569982.jpg
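The channel mapping above can be sketched in numpy as follows; the array names are placeholders for the three desaturated frames, a simple min-max stretch stands in for the auto-levels step, and the vibrance and sharpening steps are omitted:

```python
import numpy as np

def auto_levels(channel):
    """Stretch one channel to the full 0-1 range (a simple stand-in
    for an image editor's auto-levels step)."""
    lo, hi = channel.min(), channel.max()
    return (channel - lo) / (hi - lo) if hi > lo else channel * 0.0

def multispectral_composite(uv, vis, ir):
    """Map desaturated IR, visual, and UV frames to the R, G, and B
    channels of a false-color composite."""
    return np.stack([auto_levels(ir), auto_levels(vis), auto_levels(uv)],
                    axis=-1)
```

The frames must be registered (aligned) before stacking, otherwise the composite shows colored fringes at edges.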

  • 2 years later...

Knowing that the green channel matters most for sharpness, because of how we humans perceive color, I made a composite of the first three pictures. I saturated it a bit, but overall it's quite a subtle result.

post-350-0-65601500-1629508094.jpg


Did you delete the first three pics? I see the composite but that's all.

 

You can improve on that with better alignment of the images, and also with some sharpening and denoising.

  • 5 months later...
On 8/21/2021 at 4:00 AM, Andy Perrin said:

Did you delete the first three pics? I see the composite but that's all.

 

You can improve on that with better alignment of the images, and also with some sharpening and denoising.

Luminance denoising ruins detail, so I never use it. Sharpening likewise adds little real detail, but modest amounts are OK.

I don't like overcooked images in general.

I don't think the images could be aligned any better, you can take a shot at it yourself if you see any misalignments.

51 minutes ago, Fandyus said:

Luminance denoising ruins detail, so I never use it. Sharpening likewise adds little real detail, but modest amounts are OK.

I don't like overcooked images in general.

I don't think the images could be aligned any better, you can take a shot at it yourself if you see any misalignments.

You're replying rather late to this!

 

Careful luminance denoising using an app or filter dedicated to the job does not have to ruin detail. Examples would be Neat Image and also the new Topaz software (in moderation). Moderation is key.

 

I have my own moon pics, if you don't like the free advice, it's worth what you paid...

15 hours ago, Andy Perrin said:

You're replying rather late to this!

 

Careful luminance denoising using an app or filter dedicated to the job does not have to ruin detail. Examples would be Neat Image and also the new Topaz software (in moderation). Moderation is key.

 

I have my own moon pics, if you don't like the free advice, it's worth what you paid...

Topaz is great, but that is largely due to machine learning, which "makes up" detail that isn't there. I have not tried Neat Image, so I can't judge how that works.


Neat Image is not machine learning; it's based on more conventional methods (probably Fourier or wavelets). The Topaz denoising software doesn't make up details as far as I know. You may be thinking of the sharpening tool or Gigapixel (in the same package). The denoiser is based on machine learning, but it just removes noise. There is also a sharpening slider, but you don't have to use it.

