UltravioletPhotography

Do we know the fuzzy surface resolution limits?


dabateman


Looking through Ultrapurple's x-ray images, and after my recent dips into 280nm imaging, I wonder if anyone has calculated the surface resolution limits for imaging an object.

What I am referring to is that there should be a line where shorter wavelengths (higher energies) result in a blurry image, as you reach enough energy to start to enter into ionizing radiation. I am thinking this may be around 300nm, since lower energies (longer wavelengths) seem to produce "crisper" images of fine hair structures. My 313nm images look good; the sweet spot may be 335nm. But at shorter, higher-energy wavelengths I start to see blurring. After thinking about x-rays and ionization at 260nm, I now suspect this blurring may be more than just poor filters and very low camera sensitivity. There should be a point where the light enters into the object, as in x-ray images, and thus the surface will be blurry. Just as there is an IR point where the wavelengths are long enough to pass through the sample, causing blurring.

Has anyone calculated these points?

 

I am now guessing that the high-energy point is around 300nm and the low-energy point around 1100nm. Below and above these, surfaces will be blurry.

 

Andy Perrin

I think your understanding of the physics is flawed here. The fuzziness at longer wavelengths is caused by diffraction. It is a function of your aperture, wavelength, and sensor resolution, not ability to ionize. There is also a separate effect where objects tend to become more transparent at longer wavelengths, but that only affects focus via depth of field. That is, if you can see through the material and you are focusing on the surface, then the layers under the surface will be blurry.

 

https://en.m.wikipedia.org/wiki/Airy_disk

https://en.m.wikipedia.org/wiki/Diffraction-limited_system

https://en.m.wikipedia.org/wiki/Circle_of_confusion
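To put rough numbers on the diffraction argument, here is a small Python sketch (my own illustration, using the Rayleigh criterion from the Airy disk article above, not anything from this thread) showing how the Airy disk shrinks with wavelength at a fixed aperture:

```python
# Rayleigh criterion: for a lens at f-number N, the first minimum of the
# Airy pattern lands on the sensor at a radius of about 1.22 * lambda * N.
# Smaller radius = finer diffraction-limited detail.

def airy_radius_um(wavelength_nm: float, f_number: float) -> float:
    """Radius of the first Airy minimum on the sensor, in micrometres."""
    wavelength_um = wavelength_nm / 1000.0
    return 1.22 * wavelength_um * f_number

# Compare a few of the wavelengths discussed in this thread, all at f/8.
# The trend is monotonic: shorter wavelength, smaller Airy disk, with no
# lower cutoff from diffraction itself.
for wl in (1100, 650, 405, 365, 313):
    print(f"{wl:5d} nm at f/8 -> Airy radius ~ {airy_radius_um(wl, 8):.2f} um")
```

Note this depends only on wavelength and aperture, not photon energy.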

enricosavazzi

Another (theoretical) possibility for fuzziness at very short wavelengths:

 

Polished optical surfaces are actually full of small scratches and pits, if you look at high enough magnification (with an SEM for example). Lens coatings are also uneven. Also, the precision with which the curvature of optical surfaces is ground (for example lambda/20) is typically measured with green light. If you decrease the wavelength used for imaging, the precision relative to the wavelength becomes worse.

 

So the optics for extreme UV must be ground and polished with much more exacting standards than those for VIS and NUV. If this is not done, when you decrease wavelength, first you should reach a maximum in resolution where diffraction is low and degradation by optical defects is also low. Beyond this maximum, diffraction is still decreasing, but surface defects have an increasing importance and image resolution decreases as a result.

dabateman

Andy,

At long wavelengths (low energy), I said the light will pass through the sample, thus being fuzzy. My new thought is to understand at what short, high-energy wavelengths you start to see into the sample, removing surface details. At 800nm you can easily pass through most cells, which is what allows two-photon excitation microscopy to do interesting things. So maybe I have the low-energy limit too high. It would depend on your tolerance for fuzziness.

 

Enrico, you may be correct. What I am seeing may just be due to incorrectly polished filters. But I do wonder where the transition point is from sharp surface structure to seeing through surfaces, as with bones. This would cause a fuzzy surface.

 

For the chemists out there, I am thinking about transition points, like the freezing and boiling points of water. Things get interesting at transition points, and I wonder if we know the visual ones.

dabateman

Ok, I think I may not have explained this correctly, leading to possible misconceptions, because I was mainly interested in the high-energy section of the spectrum. See if this question makes more sense:

 

At long wavelengths, like microwaves and radio waves, the wavelengths are really big and will pass through objects. Shrink this down to IR wavelengths and you are small enough to see things, as the wavelength no longer passes through the object. I think I will set my upper fuzzy threshold at 900nm, now thinking about what objects look like through an 1100nm long-pass filter. So to increase resolution of the surface details, we can drop the wavelength. However, in doing this we are also increasing the energy of each photon. So a 405nm light source will resolve surface details better than a 650nm light source. In UV, a 365nm light source produces very sharp surface details of fine hairs and dots.

 

But let's flip this: an x-ray has a very, very small wavelength, and yet passes through the surface of an object, showing only the hard minerals and metals inside. I am assuming the major reason is that the energy level is now so high that the ionizing effect is too great to provide any details of the surface. If shrinking the wavelength before allowed us to see small hairs, why not see the atoms on the surface with sufficiently small wavelengths?

 

So now my question (the one before was rhetorical): at what point does the high energy of the wavelength cause a loss in surface resolution? Possibly due to ionizing of the surface.

 

My 255nm light has high enough energy to fill the room with ozone. So I think even at 260nm the energy level may be too high for the shorter wavelength to show any finer surface detail. I am thinking the practical lower limit may be 300nm. However, I wonder if this has been calculated. I couldn't find it quickly, but I may not have searched for the correct terms.
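As a sanity check on the energies involved, here is my own back-of-envelope calculation (not a published limit): photon energy in eV is roughly 1239.84 divided by the wavelength in nm.

```python
# Photon energy from wavelength: E[eV] ~ hc / lambda ~ 1239.84 eV*nm / lambda[nm].

def photon_energy_ev(wavelength_nm: float) -> float:
    return 1239.84 / wavelength_nm

for wl in (365, 300, 255, 242, 185, 124):
    print(f"{wl:5d} nm -> {photon_energy_ev(wl):5.2f} eV")

# 255 nm is ~4.9 eV: below the ~5.1 eV (~242 nm) needed to split O2 into
# the oxygen atoms that form ozone. Ozone from low-pressure mercury UV-C
# lamps is usually attributed to their 185 nm line (~6.7 eV) instead.
# Typical ionization energies are ~10 eV and up, i.e. wavelengths below
# ~124 nm, which is well short of anything discussed in this thread.
```

So on energy grounds alone, 300nm light (~4.1 eV) is far from ionizing; any blur seen there likely needs a different explanation.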

Andy Perrin
You want to know when the light source will destroy the surface itself? That has nothing to do with resolution and a lot to do with the chemistry of the particular surface. There will not be a single answer. Metals behave differently from nonmetals for a start.
OlDoinyo
It sounds as if you are conflating transparency/opacity and blurriness. They are not the same thing. In theory, resolution increases with decreasing wavelength, all other factors being equal. There is no theoretical constraint on this of which I am aware. That is not to deny that there are practical obstacles to using shorter wavelengths (constructing suitable optics, etc.)
dabateman

Yes OlDoinyo, you have it.

That would be a better way to search for it: the transparency index, the point at which the surface goes from sharp and highly detailed to fading out and becoming transparent.


From what I have found, it gets complex. The penetration depth seems to depend on the refractive index, the interfaces, and the material spacing. There are other factors as well.

Biological surfaces seem to shift most clearly around 10nm, the point between soft x-rays and UV-C: not generally reflective, and thus more transparent.

 

However, proteins, the amino acid backbone, and other biological molecules absorb at 220nm to 300nm, so this leads to a loss in surface reflection. This is the fuzzy point I was alluding to, so my guess of 300nm was not far off.

 

So for electromagnetic radiation, UV-C may be the surface structure limit for imaging.

 

So at the high end, in IR, we have greater depth penetration due to the longer wavelengths, up to the absorption of water.

At the low end we have the wavelengths passing through the sample, as the matrix is broad and there is not much to reflect off of, like gamma rays passing through. But we lose resolution due to absorption by the material and the light not reflecting, most strongly for proteins at 230nm.

 

I have used electron microscopy and atomic force microscopy to gain resolution, but those use something tapping on or bouncing off the material for imaging. I have been curious about the electromagnetic imaging limits.

Andy Perrin
However, proteins and amino acid backbone and other biological molecules absorb at 220nm to 300nm, so this leads to loss in surface reflection. This is the fuzzy point[...]

No, absorption in the material barely affects surface reflection. Surface reflection depends only on the light angle and the complex refractive indices of the surface and the surrounding medium. The complex refractive index has the usual (real-number) refractive index as its real part and the extinction coefficient, which depends on the absorption in the material, as its imaginary part.

 

N = n - ik, where N is the complex refractive index, n is the real refractive index, k is the extinction coefficient, and i = sqrt(-1).

 

The reflection coefficient from the surface is (for the case of light perpendicular to the surface):

R = ( |N_air - N_surf| / |N_air + N_surf| )^2

 

The extinction coefficient k = alpha*lambda/(4π), where alpha is the absorption coefficient of the material. But k is usually pretty small compared to n for most insulators.

So R does depend on the absorption but it's generally a small effect in insulators.

dabateman

Nope, the absorption at 280nm is strong enough to prevent quite a bit of the reflection.

When I have time I will shoot a flower, a sugar cube, a chunk of salt, and a piece of cocoa fat.

Andy Perrin
You are attributing the overall reflectance to the surface reflectance. They are not the same. There is also scattering inside the material.
Ultrapurplepix

This is a very interesting thread. To a certain extent I think the 'fuzzy threshold' depends on the material (and I am using 'fuzzy' in a wide sense): a landscape in near-IR can look fairly sharp, but a person's face is soft; as I understand it, this is caused by the longer wavelengths being reflected from deeper in the tissue. But that's not the question here.

 

Coming back to materials, this is an area that's being relentlessly investigated by the semiconductor companies. Their 'problem' is sort-of the reverse of ours, in that they need to project a tiny, well-focused image onto a photoresist ('emulsion') to delineate the tiny bits of silicon used to make parts of individual transistors. In round figures the state of the art is a minimum feature size of 10nm, though this is partly achieved by very clever use of phase techniques as well as more traditional optical systems.

 

There's a very interesting Wikipedia article on extreme UV lithography for fabricating integrated circuits that deals with these and related issues, including a look at surface interactions at photon energies in the tens of eV range.
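The lithographers' figure of merit is essentially the same diffraction limit discussed earlier in the thread. Here is my own sketch of the Rayleigh resolution formula with ballpark published values (the k1 factor and numerical apertures below are typical textbook numbers, not any specific tool's spec):

```python
# Rayleigh criterion for projection lithography: the minimum printable
# feature (critical dimension) is roughly CD = k1 * lambda / NA, where
# k1 ~ 0.3-0.4 with modern tricks and NA is the numerical aperture.

def min_feature_nm(wavelength_nm: float, numerical_aperture: float,
                   k1: float = 0.4) -> float:
    return k1 * wavelength_nm / numerical_aperture

# Deep-UV immersion (193 nm, NA ~ 1.35) vs EUV (13.5 nm, NA ~ 0.33):
print(f"DUV 193 nm: ~{min_feature_nm(193, 1.35):.0f} nm features")
print(f"EUV 13.5 nm: ~{min_feature_nm(13.5, 0.33):.0f} nm features")
```

The EUV number lands in the mid-teens of nanometres, consistent with the ~10nm feature sizes mentioned above once phase tricks and multiple patterning are added.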
