UltravioletPhotography

Macro: I am glad that I persisted.


colinbm


I am glad that I persisted.
Thanks for your help & encouragement on this forum.
Still getting the shakes out of the system & dialling in the very fine adjustment.

The 10x microscope objective has performed very well.
This is a 1mm wide crop of the Green Jewel Beetle's shell.
Only the centre strip is stacked; it was 12 frames over about 0.1mm of depth.
The magenta speck seems to be dust?

 

[attached image]

 


Thanks Ulf, 

Yes, it is certainly nice to see this level of detail, but it is still early days; hopefully there is better to come.


Thanks Andrea
I still have a way to go, but the goal is to see the structures of the 'Structural Colour' in these beetles & butterflies etc.
I am working up to a 60x objective. 20x will be next week, I hope, followed by 40x & 60x next year, all going well, while I slowly improve the stability of the system & the very fine focus steps.


Colin, I think that goal is likely beyond the reach of visible-light microscopy. Very short-wavelength UV light might do the trick. But no matter what wavelength you use, the NA (numerical aperture) sets an absolute limit on the resolution of the system, as a function of wavelength. You will probably need a very high NA even when using short-wavelength light. By the way, the magnification of an infinity-corrected objective is completely irrelevant to your goal. All that matters is its NA. For example, a 10x 0.5 will outresolve a 20x 0.42.
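The NA argument above can be put into numbers with the standard Abbe diffraction limit, d = λ / (2·NA). A minimal Python sketch (the function name is mine) showing why the 10x 0.5 beats the 20x 0.42:

```python
# Abbe diffraction limit: smallest resolvable spacing d = wavelength / (2 * NA).
# Magnification does not appear anywhere in the formula -- only NA matters.

def abbe_limit_nm(wavelength_nm: float, na: float) -> float:
    """Smallest resolvable feature spacing, in nanometres."""
    return wavelength_nm / (2 * na)

green = 540  # nm, roughly mid-visible

print(abbe_limit_nm(green, 0.50))  # 10x/0.5  -> 540.0 nm
print(abbe_limit_nm(green, 0.42))  # 20x/0.42 -> ~642.9 nm (coarser, i.e. worse)
```

Shorter wavelengths shrink d in direct proportion, which is why short-wavelength UV helps regardless of the objective's magnification.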


There is one other issue, which has surely been addressed somewhere in these threads. The resolution I mention in my preceding comment is the resolution of the "aerial image", the actual image projected onto the sensor. But the sensor generally can't capture the full resolution of the aerial image, because of the pixels and the Bayer filter matrix. If you have a 20Mp camera, the sensor is only capturing blue data at 5Mp and filling in the rest via software. Same with red. Green is 10Mp of real data. These are very low resolutions. The solution is to use pixel-shifting to move the sensor around so that full colour data is taken at each pixel-sized point in the aerial image. Pentax cameras do this by taking four shifted images per shot. Olympus and Panasonic do it by taking eight images per shot; this can reveal sub-pixel detail at the cost of gigantic files. The most recent Sony cameras let you choose the level of shifting, up to 16 images per shot. These techniques squeeze the most resolution out of the aerial image. However, they are not worth the trouble if your aerial image is deep into diffraction territory.
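The 20Mp arithmetic above comes straight from the RGGB Bayer mosaic: red and blue each cover 1/4 of the photosites, green covers 1/2, and the rest is interpolated by the demosaicing software. A toy Python sketch (function name is mine):

```python
# Per-channel *measured* sample counts for an RGGB Bayer sensor.
# Everything beyond these counts is interpolated during demosaicing.

def bayer_channel_mp(total_mp: float) -> dict:
    return {
        "red": total_mp / 4,    # 1 of every 4 photosites
        "green": total_mp / 2,  # 2 of every 4 photosites
        "blue": total_mp / 4,   # 1 of every 4 photosites
    }

print(bayer_channel_mp(20))  # {'red': 5.0, 'green': 10.0, 'blue': 5.0}
```

Pixel-shifting restores full colour sampling by measuring all three channels at every photosite position across the shifted exposures.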

 

Another solution is to use a monochrome camera with no Bayer filter.


The actual structures that create the color are going to require electron microscopy to resolve. This is because they are going to be on the scale of the light wavelength itself (hundreds of nanometers) and thus the light will perforce be too long in wavelength to resolve such features.


56 minutes ago, OlDoinyo said:

The actual structures that create the color are going to require electron microscopy to resolve. This is because they are going to be on the scale of the light wavelength itself (hundreds of nanometers) and thus the light will perforce be too long in wavelength to resolve such features.

Yes, I have told him this multiple times. I was thinking, though, that UV might be enough to see something, especially UVB.

 

Even if he doesn't reach the destination, the journey might be very pretty.


Well, resolution is exactly proportional to wavelength, all else being equal. If green interference structures are about the same scale as the wavelength of green light, say 540nm, and if you need two pixels to resolve a structure, then you need to use light whose wavelength is less than 270nm. That's just an upper bound; the real number has to be even lower, to take into account NA and other issues.
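The bound in that paragraph is just a division, but it is worth making explicit (a minimal sketch; the function name and the two-samples-per-feature assumption are as stated in the post):

```python
# Upper bound on the usable illumination wavelength: if you need
# `samples_needed` resolved samples across a feature, the wavelength
# must be no longer than feature_size / samples_needed.
# This ignores NA and other losses, so the real requirement is stricter.

def max_usable_wavelength_nm(feature_nm: float, samples_needed: int = 2) -> float:
    return feature_nm / samples_needed

print(max_usable_wavelength_nm(540))  # 270.0 nm upper bound for ~540 nm structures
```

270nm is deep-UV territory, well below what ordinary camera optics and sensors handle, which is the crux of the difficulty.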

 

Having said that, I did once seem to photograph the interference structures on a wing scale of a Sunset Moth. But maybe these were just artifacts; I am still not too sure.

https://www.photomacrography.net/forum/viewtopic.php?p=261204

 


lukaszgryglicki

254nm light is easy to get very cheaply, but then filtering becomes the major problem IMHO (I'm at that stage now; I can't get a good 254nm image that doesn't contain other wavelengths).

 


I have been thinking about why my photos (see link above) seem to capture the iridescence-producing layers even without using UV light. We may all have made a mistake in assuming that the structures that produce the colour must be on the same scale as the wavelength of light. The physics of interference does not require the layer spacing to be one set distance; the physical requirement is that the path difference be an integer multiple of the wavelength in question. The structures could therefore be much bigger than the wavelength of the light that they reflect.
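The integer-multiple point can be illustrated numerically. A toy sketch (normal incidence, with the refractive index folded into the path difference for simplicity; an illustrative assumption, not a full thin-film model, and the function name is mine):

```python
# Constructive interference: path difference = m * wavelength for integer m.
# So one path difference reinforces a whole series of wavelengths, and a
# spacing several times larger than the wavelength can still reflect it.

def reinforced_wavelengths_nm(path_difference_nm: float, max_order: int = 6):
    """Wavelengths reinforced by a given path difference, orders m = 1..max_order."""
    return [path_difference_nm / m for m in range(1, max_order + 1)]

# A 1620 nm path difference reinforces 540 nm green at order m = 3:
print(reinforced_wavelengths_nm(1620))
# [1620.0, 810.0, 540.0, 405.0, 324.0, 270.0]
```

On this reasoning a layer structure three times the size of a green wavelength could still produce green iridescence, which would make the layers easier to resolve optically.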


Thanks Lou
I have often wondered about doubling & halving of wavelengths.
Common green lasers are actually IR lasers, natively emitting 1064nm light that is then frequency-doubled, halving the wavelength to 532nm.
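The halving Colin mentions is second-harmonic generation: doubling the frequency halves the wavelength. A two-line Python sketch (function name is mine):

```python
# Second-harmonic generation: the doubling crystal doubles the optical
# frequency, so the output wavelength is exactly half the input's.

def second_harmonic_nm(wavelength_nm: float) -> float:
    return wavelength_nm / 2

print(second_harmonic_nm(1064))  # 532.0 nm -- the familiar green laser line
```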

