Andreas Posted September 3, 2018 Several days ago I took images of the moon in ultraviolet light using a Baader U 2" filter in front of a Beroflex 400mm lens mounted on a full-spectrum converted Olympus OM-D EM10 Mark II camera. I used a Nano Tracker to minimize blurring due to Earth's rotation. Unfortunately, the sharpness of the Beroflex lens is not perfect in the ultraviolet and infrared range; nevertheless, in at least the two marked areas I can see differences in brightness in the UV image compared to the images taken in the visible and infrared range. The image taken with an Olympus 75-300mm zoom lens mounted on a standard Olympus OM-D EM10 Mark II camera is for comparison, because the sharpness of this zoom lens at 300mm is better than the sharpness of the old Beroflex 400mm lens. Has anyone else tried to take UV images of the moon with some success? Ultraviolet light: Baader U 2" Visible light: Hoya UV IR Cut Infrared light: Zomei IR720 Visible light: Olympus 75-300mm at 300mm on Olympus OM-D EM10 Mark II without additional filter
OlDoinyo Posted September 6, 2018 To our cameras the moon in UV does not appear terribly different from the way it does in visible light. There are some subtle differences, which have been mapped by the Clementine orbiter.
nfoto Posted September 7, 2018 Interestingly, they define UV as 450 nm ...
Andreas Posted December 22, 2018 Now I can add some images of better quality, taken with the Olympus OM-D EM10 Mark II and a Celestron C6 with focal reducer. The images in visible and infrared light have high color saturation; I think these colors in particular are interesting. By the way, a Celestron C6 is not only a telescope, it is also a telephoto lens with a large focal length of about 1500mm (native) and 1000mm (with focal reducer) respectively. It is fully usable in UV, even with the focal reducer. I did not observe a focus shift. The resolution of the UV image is reduced only by the long exposure time and atmospheric turbulence. Visible light: SVBONY UV/IR Cut, exposure time: 1/90s Infrared light: Zomei IR720, exposure time: 1/90s Ultraviolet light: Baader U 2", exposure time: 8s
Cadmium Posted December 22, 2018 By the way, that darker black/blue area in your original RGB visual shot (and others of the same color) represents titanium-rich areas. Let's desaturate each of your images for a monochrome comparison. Left to right: visual, IR, UV. Not much difference without the color.
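Desaturating each shot for a monochrome comparison like this amounts to collapsing RGB to a single luminance channel. A minimal numpy sketch (the Rec. 601 luma weights here are just one common choice, not necessarily what Cadmium's editor used):

```python
import numpy as np

def desaturate(rgb):
    """Collapse an H x W x 3 RGB array to single-channel luminance.

    Uses the Rec. 601 luma weights; a plain mean of the three
    channels would also do for a rough comparison.
    """
    weights = np.array([0.299, 0.587, 0.114])
    return rgb @ weights

# Tiny synthetic example: a 2x2 "image" (red, green / blue, white)
img = np.array([[[255, 0, 0], [0, 255, 0]],
                [[0, 0, 255], [255, 255, 255]]], dtype=float)
mono = desaturate(img)
print(mono.shape)  # (2, 2)
```

Any photo editor's "desaturate" does essentially this, just with its own choice of weights.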
ulf Posted December 23, 2018 Share Posted December 23, 2018 Interestingly, they define UV as 450 nm ...I think that is a typo, occurring once. 415nm is what they have written in all other places for the shortest wavelength band. Link to comment
ulf Posted December 23, 2018 Share Posted December 23, 2018 Wikipedia had more info about Clementine:https://en.wikipedia.org/wiki/Clementine_(spacecraft) The sensor consisted of a catadioptric telescope with an aperture of 46 mm and fused silica lenses focused onto a coated Thompson CCD camera with a bandpass of 250–1000 nm and a six-position filter wheel. The wavelength response was limited on the short wavelength end by the transmission and optical blur of the lens, and on the long end by the CCD response. The CCD was a frame transfer device which allowed three gain states (150, 350, and 1000 electrons/bit). Integration times varied from 1–40 ms depending on gain state, solar illumination angle, and filter. The filter center wavelengths (and bandpass widths (FWHM)) were 415 nm (40 nm), 750 nm (10 nm), 900 nm (30 nm), 950 nm (30 nm), 1000 nm (30 nm), and a broad-band filter covering 400–950 nm. Link to comment
Andy Perrin Posted December 23, 2018 Very interesting. If you put the visible image through my Independent Component Analysis method (like I did for the fields here), you can see the color variations better.
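Andy's Independent Component Analysis approach separates subtle, correlated color variation into independent channels. ICA itself usually needs a library (e.g. scikit-learn's FastICA); as a simpler numpy-only stand-in, PCA decorrelation of the three color channels illustrates the same basic idea of rotating the channel axes so subtle variation gets its own component:

```python
import numpy as np

def decorrelate_channels(rgb):
    """Rotate an H x W x 3 image into its principal components.

    PCA decorrelation, used here as a simple stand-in for ICA:
    both rotate the channel axes so that subtle color variation
    lands in its own component instead of being spread across R/G/B.
    """
    h, w, _ = rgb.shape
    flat = rgb.reshape(-1, 3).astype(float)
    flat -= flat.mean(axis=0)             # center each channel
    cov = np.cov(flat, rowvar=False)      # 3x3 channel covariance
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Project onto the eigenvectors (eigh sorts eigenvalues ascending,
    # so the strongest component comes last)
    components = flat @ eigvecs
    return components.reshape(h, w, 3)

rng = np.random.default_rng(0)
img = rng.uniform(0, 255, size=(8, 8, 3))
pcs = decorrelate_channels(img)
print(pcs.shape)  # (8, 8, 3)
```

Stretching each resulting component to full range (and viewing them as a false-color image) is what makes the faint variation visible.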
Cadmium Posted December 23, 2018 Share Posted December 23, 2018 Here is a simple multispectral composite. Blue channel = UV shot desaturatedGreen channel = Visual shot desaturatedRed channel = IR shot desaturatedAuto levels, + vibrance, sharpened. Link to comment
Cadmium Posted December 23, 2018 Share Posted December 23, 2018 Original Visual, IR, and UV images used for Standard Deviation composite. Link to comment
Fandyus Posted August 21, 2021 Knowing that the green channel is the most important when it comes to sharpness, due to the way we humans perceive color, I made a composite of the first three pictures. I saturated it a bit, but overall it's quite a subtle result.
Andy Perrin Posted August 21, 2021 Did you delete the first three pics? I see the composite, but that's all. You can improve on that with better alignment of the images, and also with some sharpening and denoising.
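The alignment suggested here can be estimated with FFT phase correlation, which recovers whole-pixel translations between two same-sized frames. A numpy-only sketch (real moon frames may also need rotation/scale handling and sub-pixel refinement, which this ignores):

```python
import numpy as np

def estimate_shift(ref, img):
    """Estimate the integer (row, col) translation taking ref to img
    using FFT phase correlation; both images must be the same size."""
    cross = np.conj(np.fft.fft2(ref)) * np.fft.fft2(img)
    cross /= np.abs(cross) + 1e-12        # keep only the phase
    corr = np.fft.ifft2(cross).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape))
    # Peaks past the halfway point wrap around to negative offsets
    size = np.array(ref.shape)
    wrap = peak > size // 2
    peak[wrap] -= size[wrap]
    return tuple(int(p) for p in peak)

# Synthetic check: shift a random frame and recover the offset
rng = np.random.default_rng(1)
ref = rng.uniform(size=(32, 32))
img = np.roll(ref, (3, -5), axis=(0, 1))
shift = estimate_shift(ref, img)
print(shift)  # (3, -5)
```

Once the shift is known, `np.roll` (or a crop) brings the bands into register before compositing.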
Fandyus Posted February 15, 2022 On 8/21/2021 at 4:00 AM, Andy Perrin said: Did you delete the first three pics? I see the composite, but that's all. You can improve on that with better alignment of the images, and also with some sharpening and denoising. Using luminance denoising ruins detail, so I never use it; sharpening is more or less making something out of nothing, but modest amounts are OK. I don't like overcooked images in general. I don't think the images could be aligned any better; you can take a shot at it yourself if you see any misalignments.
Andy Perrin Posted February 15, 2022 51 minutes ago, Fandyus said: Using luminance denoising ruins detail, so I never use it; sharpening is more or less making something out of nothing, but modest amounts are OK. I don't like overcooked images in general. I don't think the images could be aligned any better; you can take a shot at it yourself if you see any misalignments. You're replying rather late to this! Careful luminance denoising using an app or filter dedicated to the job does not have to ruin detail. Examples would be Neat Image and also the new Topaz software (in moderation). Moderation is key. I have my own moon pics. If you don't like the free advice, it's worth what you paid...
Fandyus Posted February 16, 2022 15 hours ago, Andy Perrin said: You're replying rather late to this! Careful luminance denoising using an app or filter dedicated to the job does not have to ruin detail. Examples would be Neat Image and also the new Topaz software (in moderation). Moderation is key. I have my own moon pics. If you don't like the free advice, it's worth what you paid... Topaz is great, but it works largely through machine learning, which "makes up" detail that isn't there. I have not tried Neat Image, so I can't judge how that works.
Andy Perrin Posted February 16, 2022 Neat Image is not machine learning; it's based on more conventional methods (probably Fourier or wavelets). The Topaz software for denoising doesn't make up details as far as I know. You may be thinking of the sharpening tool or Gigapixel (in the same package). The denoiser is based on machine learning, but it just removes noise. There is also a sharpening slider, but you don't have to use it.
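As a toy illustration of what a conventional frequency-domain denoiser does (far cruder than anything Neat Image actually implements, whose internals are not documented here), one can keep only the strongest Fourier coefficients and discard the rest:

```python
import numpy as np

def fourier_denoise(img, keep_fraction=0.1):
    """Crude frequency-domain denoise: keep only the strongest
    FFT coefficients and zero the rest. Real Fourier/wavelet
    denoisers use locally adaptive thresholds and noise profiles
    rather than one global cutoff like this."""
    spec = np.fft.fft2(img.astype(float))
    mags = np.abs(spec)
    thresh = np.quantile(mags, 1.0 - keep_fraction)
    spec[mags < thresh] = 0
    return np.fft.ifft2(spec).real

# A noise-free constant frame passes through essentially unchanged,
# since all its energy sits in the single DC coefficient
flat = np.full((8, 8), 7.0)
clean = fourier_denoise(flat)
```

The `keep_fraction` knob plays the role of the strength slider: too aggressive and real lunar detail goes with the noise, which is the "overcooking" Fandyus objects to.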