UltravioletPhotography

Showing results for tags 'Processing'.

  1. I bought a Stitz Universal Stereo Adapter on eBay, with the intent of using it to take simultaneous captures in UV and IR, or similar two-filter-at-once combinations. I haven't had time to mess with it much, but I did a proof of concept below. Please note that the air outside is horribly smoky due to fires on the west coast of the US and in Canada, so much of the haze is not the fault of the system but of the atmosphere. First the setup: And here is a photo. The two sides were independently edited in Photoshop and PhotoNinja and combined afterwards with layers. The large black line can be adjusted with the diaphragm control, but I didn't have time to adjust it.
  2. One of the things I have been trying for some time is to get interesting UV photos of flowers in stereo (i.e. 3D), especially in close-up and macro. In this post I'll outline the technique (happy to provide more detail if you need it), and provide some samples: you will need red/cyan anaglyph glasses (a couple of dollars on eBay or Amazon) to view them. Also, because of the 800-pixel width that UVP imposes, you will not get the full effect, and won't be able to do what I love - zooming in and "wandering around" the flower and looking at the detailed internal structure. If you've got some red/cyan glasses and want to look at some full-resolution files then drop me a line. And if you haven't got the glasses but would like to see any of the detailed images in 2D, let me know - but you really lose a lot in 2D. I've now got to a point where I get a reasonable percentage of successes, although some flowers make life difficult because of the amount they move during the process: some move in their entirety (wilting, esp. under the bombardment from flashguns), and some just waggle their stamens about. Dandelions are particularly irritating in this respect. This movement is a problem because it can easily take 30-45 minutes to complete one stereo shot. This is because I use focus stacking to get the image quality and depth of field, and that can mean 100 shots for each of the stereo pair of images. The traditional way to make stereo pairs is to move the camera sideways between the two images. For image distances of 1-10 metres, the sideways separation would be the inter-ocular distance of 6 or 7 cm; for greater image distances you would use greater separations (I have used 100 metres for a subject 12 miles away), and for shorter distances a smaller separation. However, for macro this does not work - with even a small separation the image disappears from the field of view. 
So you need to use a toe-in approach - swing the camera through about 5 degrees between the two stereo images, and then move it sideways until the image comes back into the field of view. There is nothing magical about 5 degrees - it is the angle between the lines of sight of the 2 eyes for a subject at 0.5 metres. Other people use 3.7 degrees, or some other number. If you read materials about stereo photography, they often say that toe-in is a definite no-no, but modern 3D software can handle it perfectly. (I use the excellent and free StereoPhoto Maker to create the stereo image from my stereo pairs.) One of the difficulties with any kind of macro work is the shallow depth of field that you experience. To overcome this I use focus stacking - taking a number of images focused on different planes in the subject, and combining them in software to get a single sharp image. I use Zerene software for this. This also overcomes the problem of how to focus accurately in UV, because all you need to find out is where the image starts to come into focus and where it starts to go out of focus. Another difficulty with UV+macro is getting enough light on the subject. I use 3 cheap-from-China but powerful WS-560 flashguns (with the UV-absorbing "lens" removed) a few centimetres away from the subject. The repeated shooting for focus stacking overheats the flashguns, and already three have failed for this reason. But they're so cheap (I just bought a couple new for $20 each) that they're effectively a consumable rather than a capital purchase. Let's look at some examples. They're all taken on a full-spectrum Canon EOS M, U340+S8612 filters, flash, white-balanced on PTFE. Lenses were either a Steinheil Cassar S 2.8/50mm or an El-Nikkor 5.6/105mm, always stopped down to f/8. First, one that didn't work so well. This is a type of Geranium AFAIK. The body of the flower is OK, but the central area is not so good to look at. This is because of stamen movement. 
I mentioned above that Dandelions cause movement problems. This shot isn't too bad, but I had to crop it a lot to get rid of moving petals. Fruit blossom seems to work well - here are Quince, Pear and Alpine Strawberry. On the strawberry, you'll see a couple of tiny spiders in cyan. This is an irritant that comes from focus stacking - insects that wander around the flower while you are taking the images. In fact on this shot there were about a dozen out-of-focus spiders in cyan (i.e. on the right image) and a few in red (i.e. on the left image) which I had to post-process out of the image. Now we have a Daffodil. This works really well in a full-resolution image, with great wander-about capability. Here are a few shots of an Orchid - full flower and then a macro shot of the interior. I have provided a visible light shot (taken on a Canon EOS 6D Mk 2, Sigma 105mm macro lens, ring flash) of the interior for comparison. The UV interior shot is another great one to wander around. Now a Bluebell: A chrysanthemum: A daisy: A Grape Hyacinth: An "ordinary" Hyacinth: Magnolia (this is another flower that can move significantly when young - I could actually observe it twitching): Snowdrop (also good for wandering around inside): Forsythia: And finally a Sunflower:
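For anyone curious how the red/cyan composition works: once the two focus-stacked images are aligned, the anaglyph simply takes its red channel from the left-eye image and its green and blue channels from the right-eye image. A minimal NumPy sketch (illustrative only; `make_anaglyph` is my name for it, and real tools like StereoPhoto Maker also handle alignment, rotation and toe-in correction):

```python
import numpy as np

def make_anaglyph(left_rgb: np.ndarray, right_rgb: np.ndarray) -> np.ndarray:
    """Compose a red/cyan anaglyph from an aligned stereo pair:
    red channel from the left-eye image, green and blue from the right."""
    anaglyph = right_rgb.copy()
    anaglyph[..., 0] = left_rgb[..., 0]  # replace red with the left eye's red
    return anaglyph

# Toy 2x2 example: a bright left frame and a dark right frame.
left = np.full((2, 2, 3), 200, dtype=np.uint8)
right = np.full((2, 2, 3), 50, dtype=np.uint8)
out = make_anaglyph(left, right)
print(out[0, 0])  # red from left (200), green/blue from right (50)
```

Viewed through red/cyan glasses, each eye then sees only "its" image, which is all the stereo effect needs.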
  3. Getting blue/purple flowers to be the correct color has become quite a difficult task. The flower is a Penstemon which has a striking, vivid dark blue color combined with some moderately dark purple/pinkish blushes. I cannot get the correct color unless I adjust the color manually* after conversion and white balance. I recently posted an example of a Blue Flax flower for which I had a similar problem. But this Penstemon flower is worse. It is completely off in color. *By "manual color adjustment" I mean a color adjustment made using either a color brush in PSE or a color point in NX2, either of which is applied only to the flowers. Here are 3 conversion attempts followed by the repaired color. This Photo Ninja profiled version is only slightly closer to the actual color. This version is close enough to keep. The color was repaired in Capture NX2 by placing a color point onto a flower and adjusting the red (less) and blue (more) sliders until the color looked as much like the flower before me as I could make it. I also had to darken the color with the color point brightness slider. I'm sure everyone has said whatever there is to say about this when I showed the Blue Flax flower, so I'm not really expecting any replies or comments. However, this example may be of use to someone in the future who needs to know that the correct colors can be "painted in" successfully. :bee:
  4. A while ago I found that it was possible to remove visible light contamination in UVIVF photography by subtracting an (averaged) image without the torch from an image illuminated with the torch. This method was used to excellent effect with the Queen Anne's Lace that I showed some time ago. The secret is to only subtract 16-bit linear images, which can be obtained from PhotoNinja by turning off everything except the white balance. Do not try image subtraction on JPEGs! You won't get nice results. The filters used were BG38 2mm + Tiffen Haze 2E, and the contaminating background light is streetlights. The white balance and color correction were taken from the profile I made for the gourd photos the other day. The torch was the Nemo. The procedure was to take 30 photos with the torch off using the built-in intervalometer in my Sony A7S, and then repeat the process with 30 more photos while light painting with the Nemo. I then took the median of the no-torch photos to get a combined no-torch image, and took the MAXIMUM of the light-painted images to get a combined with-torch image. Then I subtracted the streetlight-only image from the streetlights+UV image in Photoshop and adjusted contrast on the results. Final result: Image with torch + streetlights: Image with streetlights only:
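The median/maximum stacking and linear subtraction described above can be sketched in NumPy. This is a minimal illustration assuming the frames are already loaded as 16-bit linear arrays; `remove_ambient` is an illustrative name, not the actual Photoshop workflow:

```python
import numpy as np

def remove_ambient(torch_frames, ambient_frames):
    """Subtract ambient (streetlight) contamination from a light-painted
    UVIVF stack. All frames must be 16-bit LINEAR data; subtracting
    gamma-encoded JPEGs gives wrong results.

    ambient: per-pixel median of the no-torch frames (robust combine)
    torch:   per-pixel maximum of the light-painted frames
    """
    ambient = np.median(np.stack(ambient_frames).astype(np.float64), axis=0)
    torch = np.max(np.stack(torch_frames).astype(np.float64), axis=0)
    return np.clip(torch - ambient, 0, 65535).astype(np.uint16)

# Toy example: ambient level 1000 everywhere; one light-painted frame
# adds 5000 counts of fluorescence on top of the ambient.
amb = [np.full((2, 2), 1000, dtype=np.uint16) for _ in range(3)]
lit = [a + s for a, s in zip(amb, [0, 5000, 0])]
clean = remove_ambient(lit, amb)
print(clean[0, 0])  # -> 5000 (fluorescence only, ambient removed)
```

The median makes the ambient estimate robust to transient lights (passing cars, etc.), while the maximum keeps whichever light-painted frame best illuminated each pixel.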
  5. Editor's Note: I added the gear and exposure details. Measured the WB. Made a profile in Photo Ninja. Looks pretty good to me. The CC Card and the flower were photographed between 9 and 11 am in the same location. Nikon D610-fullSpectrum + Coastal Optics 60/4.5 + Baader UV/IR-Cut Filter + Sunlight f/16 for 1/1600" @ISO-100 Applied WB and profile during conversion in Photo Ninja. Got this. It is not blue. It should be blue. Nikon D610-fullSpectrum + Coastal Optics 60/4.5 + Baader UV/IR-Cut Filter + Sunlight f/11 for 1/2500" @ISO-100 With thanks to the pets for the cat hair contributions! This happens a lot with blue flowers or blue-purple flowers.
  6. Hello all, I'm a new member and have been trying my hand at UV photos. I wanted to post some of my early attempts and ask some very intro-level questions to help me get off the ground. Forgive me for a general "getting started" post rather than specific posts for each question. All UV shots are with a Fujifilm X-T3 full-spectrum conversion and a Kolarivision UV pass filter on the Fujifilm 27mm f/2.8 pancake or Fujifilm 60mm f/2.4 lens. 1. I'm clueless about white balance for UV. I've read online that one should use a gray card and I've read the advice on the present site as well. It seems clear that the typical method I've used for IR photos (i.e., brightly lit green foliage for a custom WB) does not work too well. Auto WB also doesn't seem to work. The first shot below is auto WB. The second shot is a custom WB on green foliage. The third shot is a B&W version. Ultimately, I think I will do mostly B&W UV, just as I have done lots of B&W IR (see the fourth shot, just for fun). I think from these shots I can conclude that the 27mm f/2.8 Fuji pancake does OK with UV. This is the "correct" pattern for a dandelion, right? This would indicate little or no IR leak on the Kolarivision filter, I think. 2. In full sun, f/2.8, ISO 160, does the 1.1 second exposure seem OK? From that, would you say this lens is passing acceptable amounts of UV light, or would you say that this is too long a shutter speed, indicating that the amount of UV light being passed is problematically low? Of course I could raise the ISO (although, since the X-T3 is a crop camera, I would not want to go too high). I mostly do nature photography, which typically means wind, so 1 second exposures are usually not fun. 3. I have several old Nikon AI-S lenses (55mm 3.5 macro; 28mm 3.5; 105mm 2.5; I also have the 50mm 1.8D). All 52mm filter threads. 
The only UV pass filter I have is the 39mm from Kolari (purchased because Kolari said that the 60mm f/2.4 macro lens from Fuji is good for UV (and has a 39mm thread); I'm also using it on the 27mm f/2.8, which also has a 39mm thread). I can still exchange the 39mm Kolari UV pass filter for a 52mm version, but would probably only do so if some of the AI-S (or the D) lenses work well for UV. I've read in many places that older, simpler lenses with less advanced coatings might work well for UV, but I haven't seen specific recommendations for these AI-S lenses. Any thoughts? 4. Kolarivision's website noted that the Fujifilm 60mm macro lens was good for UV, but I found it to be hotspotty--even wide open, as the last picture shows (wide open it is more like vignetting, but stopped down the lighter center becomes more pronounced and well defined). Stopping down to 5.6 or 8 made it significantly worse. I am well versed in the hot spot issues with IR; do we usually have the same problems with UV? (And yes, that is the ghost of a man visiting a grave just to the right of center in the background.) But....IT'S HIS OWN GRAVE!!! ;-) Thanks everyone for any newbie advice you can offer. John
  7. Andrea shows how selecting different alternatives for the input profile in PhotoNinja gives different results here: https://www.ultravio...__fromsearch__1 There has also been a long discussion about how different filters show flowers with different hues, especially blue flowers. I have been thinking some more about this and have formed an opinion based on how I think colour management works. My simplified view of colour management looks like this: Colour management is quite complex and I do not know if I got all of it right. I am very open to any corrections. The main issue here is whether input profile data should be used or not when processing images for a standardised formal usage. The digital output from any sensor is defined by many different things, but is normally rather similar for a specific camera model. The goal for manufacturers and creators of image processing software is to get colours in the images as good and correct as possible, showing a wide diversity of nuances and hues. Therefore there exist camera-model-specific profile data. The profile data is used to correct for both the specific Bayer-matrix response and the effect of the internal filters normally mounted in the camera. During full-spectrum conversion the internal filters are removed, making the input profile quite incorrect, but the profile still in some sense contains information related to the silicon-sensor response and Bayer-matrix response. Those are the main components controlling how image data is separated into the RGB channels, and they are quite important for the appearance of the hues. When using different filters that allow different amounts of light from the upper part of the spectrum close to 400 nm, I find it logical that that is reflected in the hues and colours of properly white balanced images, like I showed here: https://www.ultravio...dpost__p__45847 Andrea's "no profile" setting eliminates those differences, making all flowers similarly blue. 
I do not think that is correct, especially as there are considerable differences in Bayer response between different cameras. I think using a profile matching the camera used would help equalise those differences, at least to some degree.
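To illustrate the mechanism under discussion: the linear part of a camera input profile is essentially a 3x3 matrix that remixes the camera-native RGB channels, which is why different profiles can pull apart hues that a "no profile" setting leaves looking identical. A toy sketch (the matrix values here are invented purely for illustration; real profiles also include tone curves and other corrections):

```python
import numpy as np

def apply_profile_matrix(rgb_linear, M):
    """Apply a 3x3 colour matrix (the linear part of a camera input
    profile) to linear RGB pixels. Real profiles contain more than
    this; the sketch keeps only the matrix step."""
    return np.clip(rgb_linear @ M.T, 0.0, 1.0)

pixels = np.array([[0.2, 0.1, 0.8]])  # one bluish linear RGB pixel

# Identity matrix behaves like "no profile": channels pass through.
unchanged = apply_profile_matrix(pixels, np.eye(3))

# A hypothetical matrix that mixes some blue into red and attenuates
# blue shifts the hue - the kind of remixing a profile performs.
M = np.array([[1.0, 0.0, 0.2],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.8]])
shifted = apply_profile_matrix(pixels, M)  # red gains 0.2*blue
```

Two cameras with different Bayer responses would need different matrices to land on the same hue, which is the equalising effect described above.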
  8. Stefano (2021) Canis familiaris Linnaeus, 1758[2] (Canidae) Domestic Dog. Dog photographed in ultraviolet and infrared light. LINK Camera: Full-spectrum Canon EOS M Lens: Soligor 35 mm f/3.5 Filters: UV: ZWB2 (2 mm) + Chinese BG39 (2 mm) IR: Hoya R72 The first images are UV: f/4(?), ISO 6400, 1/30 s exposure (repost from here): f/4(?), ISO 6400, 1/30 s exposure f/(?), ISO 25600, 1/15 s exposure f/(?), ISO 3200, 1/30 s exposure. A bit out of focus, but I like it: These last two are IR: f/(?), ISO 100, 1/60 s exposure f/(?), ISO 400, 1/500 s exposure. Starting with the raw image, I increased the saturation to the max in Photo Ninja. Then saved it as a .jpg, opened it in IrfanView, increased the saturation to the max again, and swapped the red and blue channels (BGR).
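The red/blue channel swap used above (IrfanView's RGB to BGR) is a one-line operation on the pixel array; a small NumPy sketch for anyone who wants to script it instead (`bgr_swap` is an illustrative name, not IrfanView's implementation):

```python
import numpy as np

def bgr_swap(rgb: np.ndarray) -> np.ndarray:
    """Swap the red and blue channels (RGB -> BGR) by reversing the
    last axis - the same operation as IrfanView's channel swap."""
    return rgb[..., ::-1].copy()

pixel = np.array([[[10, 20, 30]]], dtype=np.uint8)  # one RGB pixel
bgr = bgr_swap(pixel)
print(bgr[0, 0])  # -> [30 20 10]
```

On a white-balanced IR image this maps longer wavelengths to red and shorter ones to cyan, as discussed elsewhere in these threads.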
  9. A local Bottle tree with the full-spectrum converted Sigma fp with DB850 + Schott OG550 filters. I tested 5 of the 'yellow' filters that I have today, and I will post them later. All the individual setups were individually white balanced with a grey card and only processed to JPG in IrfanView, then channel swapped to GBR and cropped.
  10. By rights, this topic doesn't really belong on UVP, which is why I'm sticking it in the chat room, but I have a feeling many people here will be interested in it, because it's about another kind of invisible imaging. I've been working for several years with a Ph.D. student in Florida who is building a large-scale wind tunnel for imitating thunderstorm downbursts over models of medium-sized buildings. The problem he has is how to visualize the flow over these models in a wind tunnel the size of a small warehouse. Currently he is using a smoke-based system, but it occurred to me that another way would be to put pieces of black tape at various places on the model and heat them with spotlights or small lasers, then visualize the refractive index changes in the air as it flows. The imaging of refractive index gradients has a long history in fluid mechanics and is known as Schlieren imaging. It is usually done with large parabolic mirrors, but a new kind of computational Schlieren called Background-Oriented Schlieren has recently come of age, and it is extremely simple to do, as I will demonstrate below. The basic idea is that you take your warm object that is making the hot air currents and you place it in front of a large screen of randomly placed dots. Small changes in refractive index caused by warming the air make the dots "dance" on the background. Prior to placing the warm object in the scene, you take a "tare" image of just the screen with the dots, and then you can use special programs that use image correlations to determine how far each background dot has moved. The final output is then shown as a grayscale image where the amount of left-right movement of the dots is coded as a shade of gray (or sometimes a false color). The setup is shown below, both as a schematic from a paper by Gary Settles, "Smartphone schlieren and shadowgraph imaging," and also in my kitchen. 
My setup: Note that the size of the screen I used is actually much too small, which I knew ahead of time, but this was intended only as a proof of concept. I wanted to know how easy it was, and how practical it would be to build a setup in a large wind tunnel. I also wanted to demonstrate the concepts for my student, and show him how to do the processing. My tare image (actually an average of 30 aligned images to reduce noise) looked like this: A second image with the candle lit looked like this. No movement of the background is apparent, and it couldn't be seen with the naked eye either. At this point I was very nervous that it wouldn't work! After processing the images, though, the airflow popped right out! Images were processed in MATLAB using a freeware program called PIVlab. Places where I had no dot screen or that had very little texture for the PIVlab program to detect came out noisy. I have also Photoshopped the candle and bowl back into the photo, which is standard procedure in BOS imaging. Here are some more examples: I do believe this is the first time THE AIR ITSELF has been imaged on UVP! I also took thermal photos, but what you see here is not the air but the soot from the candle flame (since gases don't emit blackbody radiation). And before Stefano points it out, yes, I COULD have visualized the CO2 absorption in MWIR with the other camera, but at the moment I don't have a way to support that 7 kg camera in my kitchen. https://vimeo.com/535737212
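The correlation step that programs like PIVlab perform can be illustrated with a toy block-matching routine: compare a small window of the tare image against horizontally shifted windows of the test image and keep the best match. This sketch uses brute-force sum of squared differences; real PIV codes use FFT-based correlation with sub-pixel peak fitting, and `block_shift_x` is an illustrative name:

```python
import numpy as np

def block_shift_x(tare, test, x, y, win=16, search=3):
    """Estimate the horizontal dot displacement at (x, y) by comparing
    a win x win window of the tare image against horizontally shifted
    windows of the test image (minimum sum of squared differences)."""
    ref = tare[y:y + win, x:x + win].astype(np.float64)
    best_dx, best_ssd = 0, np.inf
    for dx in range(-search, search + 1):
        cand = test[y:y + win, x + dx:x + dx + win].astype(np.float64)
        ssd = np.sum((ref - cand) ** 2)
        if ssd < best_ssd:
            best_ssd, best_dx = ssd, dx
    return best_dx

# Synthetic check: a random dot screen shifted right by 2 pixels.
rng = np.random.default_rng(0)
tare = rng.integers(0, 256, (64, 64))
test = np.roll(tare, 2, axis=1)
print(block_shift_x(tare, test, 10, 10))  # -> 2
```

Evaluating this at every grid point and mapping the displacement to a gray shade gives exactly the kind of output image described above.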
  11. In a recent thread about possible harm to insects when flashing them (https://www.ultravio...cts-with-flash/), naturally the question of using high ISO popped up. ISO performance has been mentioned all across the board in different topics, the merits of different companies and models discussed, etc. (and it is a constant source of heated discussion all over the internet), but Canon users on this site are few and far between. So I wanted to test my UV workhorse Canon and see how it performs at high ISO. The camera: Canon EOS 6D, introduced in 2013 (https://en.wikipedia...ki/Canon_EOS_6D), a full-frame camera with 20.2 MP, native ISO range from 100 to 25600, with the option of going to 50 at the lower end, and 51200 and 102400 at the upper end, respectively. Now, 8 years later, I would expect new models to show a significant improvement at higher ISO, as is the case with all manufacturers and lines. Its successor, the 6D II (2017), already sports a native ISO range from 100 to 40000 (and the latest mirrorless bodies 100 to 102400, so things are developing). I bought it for the very reason that, being the discontinued model, it was cheaper to be had second-hand. It was converted by MaxMax to black & white, i.e. getting rid of the Bayer filter, and at the same time removing the usual blocking filter, replacing it with their proprietary X-Nite 330C filter. This filter does have a small IR leak of 1% at 720nm (https://maxmax.com/s...r-x-2-2mm-thick ). Converting the camera to b/w boosted its UV response by an unknown factor (I've read claims of fivefold somewhere), and as a matter of fact, in most cases the difference between camera and camera+S8612 filter is all but non-existent. So, before I did this test, I always tried to keep ISO below 1600. If I shoot with a tripod, then ISO can be as low as 100 (for perfect quality), but if I don't want to smear out movement or want to take a handheld shot, then up goes the shutter speed, and up goes ISO as well. 
The prospect of shooting handheld was my main motivation for getting this modification; especially when I'm on city trips with some friends, I want to be able to take snapshots. So, without further ado, if you've braved this somewhat lengthy introduction, here are the results. I shot a stamp of mine, indoors, with the camera on the tripod, sensor-to-stamp distance about 55cm (I think I missed perfect focus ever so slightly). The lens was the EL-Nikkor 105 mm f/5.6, set to f/8, used on a helicoid, of course. A full-spectrum Yongnuo YN560III was triggered by radio from the camera. I shot a series from 100 to 25600, trying first to adjust the level of flash so that the initial histogram would be about the same, and for 25600 I shot some more tests where I allowed some under-exposure, up to about 3 stops. I didn't think it necessary to go lower. For 51200 and 102400 I just tried to keep the histogram around the centre. I'm not showing all of them; this would be too boring even for myself. This is a 1000x1000 px crop from the 5472x3648 pixel frame. In post (Lightroom) I mainly adjusted brightness, structure and clarity. Noise reduction was only performed from 25600 upwards. The stamp has a size of about 24x29 mm (on top you can see the technical paper I placed it on; the thick lines are spaced 0.5 cm apart). ISO 100: ISO 1600: ISO 12800: ISO 25600 (aiming for good exposure): ISO 25600 (worst exposure, 2.5-3 steps under): ISO 51200: ISO 102400: Summary: I was actually surprised that the loss of quality from 1600 to 12800 is this small. I wouldn't hesitate to go to 3200 or 6400 in the field, even to 12800 if it meant getting the shot. The first 25600 is also better than I had expected, so if I had to crank up the shutter speed to capture an insect without flash (and without too much detail), I might use it. However, if at 25600 the shot is still underexposed, the story changes quite rapidly, as most of the fine detail is lost. 
It might still be acceptable as documentation: like here, if I only wanted to prove the stamp to be the famous Millstatt stamp, worth millions of euros, it would do the job (which, sadly, is not true; the stamp, with a nominal value of 1 Österreichische Schilling, can be bought for 0.45 €). The same is true for 51200 and 102400: documentation can be done, but that's about it, and the latter is almost completely unusable, as was to be expected. Or, if it is already pretty dark and one goes for a spooky photo of an old castle, it might work as well. Caveats, if you want to apply this to other Canons or other subjects: This camera has got its Bayer filter removed, so on top of the increased sensitivity in UV and spatial resolution, any colour effects at higher ISO will simply not be there. I'm sure most of you will have had some coloured high-ISO shot converted to black/white so that it doesn't look quite as awful. I am anything but an expert in handling photos with a large amount of noise. Using Lightroom, Photoshop, Noise Ninja, etc., an expert will probably be able to improve the quality considerably. This means I'm showing roughly the worst case. The subject is flat, and the conditions were perfectly controlled. Of course, this is a way to test the performance of the sensor; out in the field everything changes. There will be other caveats which I'm not thinking about right now, as usual.
  12. Some photos taken a couple of days ago. Nothing special, I just wanted to share. Camera: full-spectrum Canon EOS M Lens: Soligor 35 mm f/3.5 Filter: ZWB2 (2 mm) + Chinese BG39 (2 mm) f/3.5, ISO 800, 1/4 s exposure My skin is a bit purple. Below is the same image in a BGR version with increased saturation; I corrected the overexposed areas by replacing the color with white (tolerance 10), all done in IrfanView: I always like how houses come out in UV. I like them when they are purple, and also in this red version. There are some houses around with that red color, so they almost look "normal". Daisies at f/3.5. Although this aperture is not so wide, it does produce a nice shallow depth of field. I may try the Helios f/2 lens one day; it isn't as UV-capable as the Soligor but has a wider aperture. The problem is that dust is always waiting to jump onto the sensor. f/3.5, ISO 100, 4 s exposure f/22, ISO 100, 120.2 s exposure. Those flowers are all-yellow in visible light, but the UV-dark center makes them appear different. I am sure these flowers are already present in the botanical section somewhere. f/22, ISO 100, 420.3 s exposure. Photo taken on a sunny day, but in the shade. The wind ruined it a bit, but it still came out nice. ISO 25600, 1/30 s exposure. I don't know the aperture, but it probably wasn't wide open. And finally, not a flower, but an old ball chewed by our dog. It moved a bit during the long exposure. f/22, ISO 100, 150.2 s exposure.
  13. Note: I am using Google Translate. Hello everybody. I have read almost all the discussions, but I can't find the solution to my problem, or maybe I don't understand it. Being new to UV photography, when I process my photos I can't get the expected results. Let me explain: I use a full-spectrum D300 with a Hoya U-360 filter combined with a Schott S8612, on 1970s-era optics. The photos obviously have a magenta cast, and after balancing the white on grey, on white, and on magenta (sample photo using a ColorChecker), the photo turns almost black and white. If instead I use a color profile created with the ColorChecker and PhotoNinja, the photo becomes dominated by green. How do I get results like the various published photos, such as the flowers, and obtain the shades that UV photography should capture and return? Thanks so much.
  14. I contacted Photo Ninja support to see if a group discount on the software might be possible, and it actually might be. Also, a new version will be released soon. When I was a member (I may still be; I haven't checked in in years) of an Olympus photo group, we were able to get a discount on LightZone and PhotoAcute software. They were very generous with 50% off. I have no idea what to expect, but they want to know if this will be a general listing or a restricted list of individuals. I personally think a list of interested people might be best, similar to how we funded the Hoya filter project. Please let me know if you are interested so that I can relay an expected number of people.
  15. I searched through the forums for this, as well as searching for the URL where the product is located, so it seems no one has shared this before. I figured I'd bring it to everyone's attention. It only has RGB and GBW squares, and costs a whopping $875 (US dollars) though. http://www.imagescienceassociates.com/mm5/merchant.mvc?Screen=PROD&Store_Code=ISA001&Product_Code=TUVUVGC&Category_Code=TARGETS
  16. Ok, this one is for Mark Jones. Selfie with half-sunscreen. Camera: converted Sony A7S Settings: F8, ISO8000, 1/50" Lens: Novoflex Noflexar 35/3.5 Filters: UG11 2mm + S8612 1.5mm This is the picture converted from RAW but with no processing at all besides resizing: This is the picture with denoising with Neat Image plugin, curves, highlight reduction, and a final autocontrast in Photoshop: Here are 1:1 crops. First the no processing image: After processing:
  17. I had this thought for a while. We have already briefly talked about this in the past (below are some links), but I never saw an actual experiment about it (if someone already tried this and posted it on the forum, please link your experiment). So, to put it simply, our cameras have a limited but non-zero ability to distinguish different wavelengths in both UV and IR, which appear as different colors, and this is especially apparent when an image is properly white balanced. In UV, the usual color palette (after a WB) starts with blue at the longest wavelengths (usually near 400 nm, depending on the filter being used), then violet/lavender, then white (at a precise wavelength), then a greenish yellow, and usually around 340 nm there is green, but in real world photos UV-green is almost never seen, unless you have very specific materials such as ruby. If you are interested in a discussion about UV-green objects, read this nice talk members had. In IR (using a 720 nm filter, like Hoya R72 or similar), the palette is quite similar: we start with a yellowish/orange at the 700 nm edge, then yellow, then white, then cyan/blue. I just talked about this here. This means that, both in UV and in IR, the shortest wavelengths appear yellow, and the longest appear blue. But there are really two channels, as Andy showed here. In particular, there isn't the equivalent of a green channel. We have yellow and blue, two complementary colors, they give white when mixed, but that's it. The fact that we have two channels only also explains why there is always a neutral wavelength which appears white. I read on Wikipedia that colorblind people also experience this. BGR images (images in which the red and blue channels have been swapped) map longer wavelengths as red and shorter wavelengths as cyan. This is, to an extent, something similar to a true tri-color image. 
Regarding true tri-color images, I recently experimented with this technique in IR here, and before me Bernard Foot did the same in both UV and IR. I suggest you see his work if you are interested in the technique. These images have three actual channels of information, and so it is possible to see red, green and blue objects in the same image, although green seems to be the rarest color. My experiment is to compare a true tri-color image, which was made as described in my initial post, with a simulated version done with more usual methods. Cameras are much less sensitive to wavelengths above 900 nm than to those below, so I needed a light source heavily weighted towards longer wavelengths, and running halogen lamps at low voltages didn't provide enough light. So I used my DIY incandescent lamp (this one) to provide enough light for my purposes. I ran it at about 30 W of power. The resulting images had too much noise and were quite dark, so I took 199 of them and did a mix of stacking and averaging (sum/50), giving an image about 4 times brighter and roughly 14 times less noisy (stacking N frames reduces the relative noise by about √N). I did an in-camera white balance before taking the images, and used a Hoya R72 filter. This was the result: Not the best image in the world, but nice enough. The green color cast and the vertical stripes are a result of the noise at low brightnesses. Then I re-white balanced it, swapped the red and blue channels, and increased saturation to the max twice. ...and this is the final result: I got blue water and some reds. Then I did the proper tri-color version, and this is how it looks: This is the visible light reference: For sure, the two images are not identical, but they are quite similar. Also, on the Rubik's cube, the green squares became orange and the blue ones became yellow in the tri-color image. I think I can see a slight difference in the simulated tri-color image too: Visible reference: True tri-color: Simulated tri-color: Isn't the square in the corner a bit redder? 
I may be overlooking something, but I have this impression. Conclusion: in IR (probably also in UV, but I only checked IR), doing a BGR channel swap on a white-balanced image can give a clue of what a true tri-color image would look like, although a true tri-color image can only be made using three filters or three light sources and combining the resulting channels properly. Other occasions on which members, including me, have talked about this (I may have missed something): https://www.ultravio...dpost__p__39254 https://www.ultravio...dpost__p__36801 https://www.ultravio...dpost__p__25092 (the post contains some links).
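The BGR channel swap discussed above is a one-line operation on an RGB array. A minimal sketch, using numpy on a synthetic pixel rather than a real photo (the array values here are placeholders, not data from the post):

```python
import numpy as np

def bgr_swap(arr: np.ndarray) -> np.ndarray:
    """Swap the red and blue channels of an H x W x 3 RGB array.

    Reversing the last axis turns RGB into BGR, leaving green untouched.
    """
    return arr[..., ::-1]

# A single pure-red pixel becomes pure blue after the swap.
pixel = np.array([[[255, 0, 0]]], dtype=np.uint8)
print(bgr_swap(pixel)[0, 0])  # -> [  0   0 255]
```

On a white-balanced IR frame this maps the long-wavelength (blue) end to red and the short-wavelength (yellow) end toward cyan, as described in the post.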
  18. I saw this on eBay: https://www.ebay.co.uk/itm/333795972392 It is a SWIR camera listed as "non functioning and for replacement parts" (translated from Italian; this may not be how the listing appears in English). I think everything is there except the lens; maybe someone could repair it and make it work? What do you think?
  19. I started this new topic since I thought the previous one, which covered only my first attempts, was already complete. I have taken more images, and I am posting them here. This is the first topic: https://www.ultravio...t-tri-color-ir/ As a reminder, the camera is a full-spectrum Panasonic DMC-F3; I used a Hoya R72 filter for the IR images and a white LED for the visible light references. Some tri-color images have been white balanced; I will note that for every image. Note: sometimes my camera doesn't autofocus, and usually (ironically) the IR images are sharper than the visible ones the camera was designed to take. Channels: Red: ~940 nm; Green: ~850 nm; Blue: ~730 nm. Visible reference: Tri-color IR (original): I don't think I need to say this, but to be clear, the brands shown are random and there isn't the slightest intent to advertise them. The first bottle from the left is water; the third is denatured alcohol (90°). They both appear bluish, but the alcohol is greener. Pens, assorted colors. If the colors aren't clear, from left to right they are BLUE RED BLUE GREEN GREEN BLACK BLACK BLUE RED BLUE. Visible reference: Tri-color IR (original): Full-size crop: Some inks became transparent (red), others orange (blue), others red (green), and the black pens remained black. Black pen ink becomes transparent to IR when thin (see my pen ink filter). Rodolfa Visible reference: Tri-color IR (original): Amethyst (I am 99.9% sure it is that) Visible reference: Tri-color IR (white balanced): Various minerals The rock/mineral below has a quite strong orange fluorescence under 365 nm UV. Visible reference: Tri-color IR (white balanced): Visible reference: Tri-color IR (white balanced, almost identical to the original): The middle specimen is actually green: Visible reference: Tri-color IR (white balanced, similar to the original): The middle specimen is actually blue: Bonus: a failed attempt. 
I tried to photograph an orange, but it slowly settled and thus moved a bit. I processed the images anyway and this is the result (crop): Thoughts and conclusions: White, red, orange and yellow plastics usually come out white; blue plastics are usually yellow; green plastics and dark blue plastics are usually orange/brown; black plastics are often black. Most minerals don't have strong IR false colors: except for one strong yellow, all I got were shades of orange and pink, and a very pale blue from the amethyst. I think I will take more images, probably 3-6, and post them here. If you have any suggestions (I am running out of ideas!), please share them here.
  20. Inspired by Bernard's excellent work, I wanted to try full color/tri-color IR too. I didn't use three separate filters, but three separate light sources. More below. I have a wide range of IR LEDs, currently seven between 730 and 1050 nm, but only three of them are usable as of now, since I haven't yet attached the others to a heatsink, and thus they would overheat. Those three LEDs are the most common IR LEDs you can find online, emitting at 730, 850 and 940 nm. Their peaks are roughly evenly spaced, which makes them suitable for tri-color photography (any combination can be used, but evenly spaced filters/light sources are better in my opinion). The LEDs are the "10 W" type, with 9 chips in a 3S3P configuration and a maximum rated forward current of 900 mA. They probably emit 1-3 W of light, not more. I may one day write a topic about my LEDs in detail. I ran all of them at full power. The target was water. People who have read my posts for a while know that I like seeing the absorption of water in the near-infrared spectrum (possibly, one day, even in SWIR), and since I know from experience that water appears noticeably darker at 940 nm, I wanted to combine three images to make it appear blue. I used my full-spectrum Panasonic DMC-F3 and a Hoya R72 filter to prevent any possible (but unlikely) contamination by visible light, and to prevent movement between the images I mounted the camera on a tripod, which I attached to the floor with double-sided tape. Since I didn't care about colors in the single images, and would have needed to convert them to B&W anyway, I shot them directly in B&W in-camera. To get uniform exposures between the images I put the camera in auto ISO mode, and it worked very well. I put a paper tissue in the background as a white target; normal copy paper would have worked as well. The water thickness was 28 mm, and the LEDs were ~50 cm (~20 in) from the container. I mapped 730 nm to blue, 850 nm to green and 940 nm to red. 
Image settings were f/2.8 for all images, and 1/30 s ISO 80 for 730 nm; 1/30 s ISO 100 for 850 nm; 1/8 s ISO 320 for 940 nm. Combined final image (just the three channels stacked, no white balancing, no alignment, no post-processing): Increased saturation: Any suggestions are welcome.
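Combining the three monochrome exposures into one tri-color frame amounts to stacking them as the R, G and B planes of a single image, with the mapping the post describes (940 nm to red, 850 nm to green, 730 nm to blue). A minimal sketch with numpy; the frame arrays below are synthetic placeholders, not the actual exposures:

```python
import numpy as np

def combine_tricolor(f940: np.ndarray, f850: np.ndarray, f730: np.ndarray) -> np.ndarray:
    """Stack three H x W monochrome frames into an H x W x 3 RGB image.

    Mapping follows the post: 940 nm -> red, 850 nm -> green, 730 nm -> blue.
    """
    return np.dstack([f940, f850, f730])

# Synthetic 2x2 frames standing in for the three B&W exposures:
f940 = np.full((2, 2), 200, dtype=np.uint8)
f850 = np.full((2, 2), 120, dtype=np.uint8)
f730 = np.full((2, 2), 40, dtype=np.uint8)
rgb = combine_tricolor(f940, f850, f730)
print(rgb.shape)  # -> (2, 2, 3)
```

With this mapping, a subject that absorbs strongly at 940 nm (like the water in the post) loses red and shifts toward blue/cyan in the composite.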
  21. Today I tried to "shoot" (more accurately, to "generate") an IRG image. I took the visible light image with a piece of Chinese BG39 (2 mm) on the lens and the IR image with a Hoya R72, then processed the visible image to move the original green channel to the blue channel and the original red channel to the green channel (discarding the original blue channel), converted the IR image to B&W and then to red, and stacked the two images. The result is OK color-wise, but the images are clearly not aligned, and it's not that great in general. As a concept, I like it. Tell me if this is the right section for this kind of work (maybe Ultraviolet & Multispectral would have been better) and also whether I should include the settings of the original images.
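The channel shuffle described above can be sketched as a small numpy operation: the visible frame's green goes to blue, its red goes to green, its blue is discarded, and a monochrome IR frame becomes the new red channel. The arrays here are synthetic placeholders rather than the actual photos:

```python
import numpy as np

def irg_composite(visible: np.ndarray, ir: np.ndarray) -> np.ndarray:
    """Build an IRG image from an H x W x 3 visible RGB array and an
    H x W monochrome IR array, per the shuffle described in the post."""
    out = np.empty_like(visible)
    out[..., 0] = ir               # IR frame -> red channel
    out[..., 1] = visible[..., 0]  # visible red -> green channel
    out[..., 2] = visible[..., 1]  # visible green -> blue channel
    return out                     # visible blue is discarded

vis = np.array([[[10, 20, 30]]], dtype=np.uint8)  # one synthetic RGB pixel
ir = np.array([[99]], dtype=np.uint8)             # one synthetic IR pixel
print(irg_composite(vis, ir)[0, 0])  # -> [99 10 20]
```

In practice the two source frames would need to be aligned first, which is exactly the problem noted in the post.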
  22. Today my dad and I tried to cook some chestnuts on a "barbecue" (little more than a pan with charcoal). It was a failure, since the chestnuts were too dry. But it provided material for invisible light imaging.
Camera: Full-spectrum Panasonic DMC-F3
Filter: Hoya R72
Lit charcoal glows IR-blue, since its blackbody radiation is heavily weighted towards longer wavelengths and peaks in the deep SWIR/early MWIR region (roughly around 3 µm).
F-stop: f/2.8, ISO 200, 1/30 s exposure.
[attachment=20552:P1000206.JPG]
F-stop: f/2.8, ISO 200, 1/13 s exposure.
[attachment=20553:P1000227.JPG]
F-stop: f/2.8, ISO 400, 1/15 s exposure.
[attachment=20554:P1000258.JPG]
Then I noticed some flames were IR-yellow. That's odd for a flame. We initially used a chemical to help the flame get going (not exactly healthy, but it hopefully burned off; in any case we didn't eat the chestnuts), and I think it has an emission line in the 700-800 nm range. Flames sometimes change color depending on the chemical being burned. I have seen cyan-green flames when burning thin residues of plastic from a metal wire I heated to cut a plastic bottle. Yeah, avoid these experiments indoors.
F-stop: f/2.8, ISO 400, 1/400 s exposure (both images).
[attachment=20555:P1000274.JPG]
[attachment=20556:P1000275.JPG]
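The "peaks roughly around 3 µm" estimate for glowing charcoal can be checked with Wien's displacement law, lambda_peak = b / T with b ≈ 2898 µm·K. A quick sketch, assuming a charcoal temperature of about 1000 K (the temperature is an assumption, not a figure from the post):

```python
WIEN_B_UM_K = 2898.0  # Wien's displacement constant, in micrometre-kelvins

def peak_wavelength_um(temperature_k: float) -> float:
    """Peak emission wavelength (µm) of a blackbody at temperature_k (K)."""
    return WIEN_B_UM_K / temperature_k

# Glowing charcoal at an assumed ~1000 K peaks near 2.9 µm,
# consistent with the deep SWIR / early MWIR estimate above.
print(round(peak_wavelength_um(1000.0), 2))  # -> 2.9
```

Only the short-wavelength tail of that 3 µm-peaked curve reaches the camera's 700-1100 nm sensitivity band, which is why the charcoal registers in IR at all.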
  23. Today I received a BG3 filter (2 mm) and decided to test what happens if I use it on an unconverted Nikon D600. The result is not quite UV photography, so I decided to post it in UV/IR experiences; maybe someone will be interested in how such a combination looks (everything is blue) and what can be done with it. I found this combination can be used artistically by applying some base tone curve adjustments in the DNG Profile Editor and then adjusting exposure/temp/tint/blue colour in Lightroom. This inspired me to dismantle my camera today, and hopefully it will be OK after removing the UV/IR cut filter. Here is the original file. Here is how the base tone curve looks in the DNG Profile Editor. Here are some tweaking results in Lightroom plus white balancing in Photo Ninja.
  24. Yesterday Stephan suggested IrfanView to me as software for editing images: https://www.ultravio...post__p__39238. I noticed it has a channel-swapping feature, and I really liked it. Below are some images I have already posted on the forum in the past, converted to BGR. Image: https://www.ultravio...-1576877461.png Post: https://www.ultravio...dpost__p__31485 Image: https://www.ultravio...-1581372961.jpg Post: https://www.ultravio...dpost__p__33080 Image: https://www.ultravio...-1586374208.jpg Post: https://www.ultravio...dpost__p__34983 Image: https://www.ultravio...-1585946815.jpg Post: https://www.ultravio...dpost__p__34909 Image: https://www.ultravio...-1591277922.jpg Post: https://www.ultravio...dpost__p__36235 Image: https://www.ultravio...-1602537628.jpg Post: https://www.ultravio...dpost__p__39198 I remember this comment by Andy: https://www.ultravio...dpost__p__36801 My answer: BGR (in both IR and UV) can sort of mimic true tri-color images, but the colors are not exact (reds are not exactly red, etc.; you can see this in the first image in this post), though this can be fixed in post-processing. The biggest issue is that, in both IR and UV, you lack green: you have the extremes, red and blue, but not the middle green band. This technique reminds me of IRG done the way Andy did it here: https://www.ultravio...chrome-rainbow. Yes, it works and it is quite accurate, but it is an approximation that (in my opinion) cannot replace true tri-color images.
  25. Since I'm photographing Queen Anne's Lace by every method lately, here is the UVIVF, which was startlingly pretty. The main technical component here was an attempt to subtract off the background light, since the plant was slightly illuminated by street lights. I took two batches of photos, one with the 15 W UV torch (hereafter called "the Nemo") and one without it. Then I averaged each batch separately to remove the noise, and subtracted one averaged image from the other. Since the only difference between the two is the UVIVF and/or any contamination from the torch, the hope is that this removes the effect of the spurious lighting. I think it was mostly a success, since the background became quite black. The camera setup was essentially the same as in the previous zinnia thread. Visible reflectance, illuminated by streetlights. This is the averaged photo; I also adjusted exposure to make the scene visible (it looks black otherwise at the original exposure settings). Also (for display purposes, but not in the image subtraction), I white balanced on the outer petals. 30 photos x f/8, ISO 500, 1/4 s UVIVF + streetlights 38 photos x f/8, ISO 500, 1/4 s UVIVF after subtracting streetlights in Photoshop (UVIVF + streetlights) - (streetlights) = UVIVF, hopefully Crop of the UVIVF: 1-1 Crop of the visible: 1-1 Crop of the UVIVF: I will say, the averaging really drives the noise down to almost nothing. Since the Starry Sky Stacker program has tools for evaluating image quality and accepting/rejecting each photo individually before stacking, and the camera takes all the photos in a batch, the actual time for taking and processing these photos was about an hour start to finish. In fact, because the quality improvement is so dramatic, there is no reason not to use this procedure for almost every photo I take of stationary subjects.
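The average-then-subtract workflow above (done in Photoshop in the post) can be sketched numerically: mean-stack each batch to suppress random noise, then subtract the background-only average from the torch-plus-background average. The frames below are synthetic, with a known "signal" of 100 on top of a streetlight level of 30, just to show the arithmetic:

```python
import numpy as np

def average_frames(frames: list) -> np.ndarray:
    """Mean-stack a list of same-shaped frames; noise falls roughly as 1/sqrt(N)."""
    return np.mean(np.stack(frames), axis=0)

def subtract_background(lit: np.ndarray, background: np.ndarray) -> np.ndarray:
    """Subtract the background average, clipping negatives to zero."""
    return np.clip(lit - background, 0.0, None)

# Synthetic batches: "lit" = UVIVF signal (100) + streetlights (30) + noise;
# "dark" = streetlights (30) + noise.
rng = np.random.default_rng(0)
lit = [130.0 + rng.normal(0, 5, (4, 4)) for _ in range(30)]
dark = [30.0 + rng.normal(0, 5, (4, 4)) for _ in range(30)]
result = subtract_background(average_frames(lit), average_frames(dark))
print(float(result.mean()))  # close to the true signal of 100
```

The clip step matters because subtraction can drive near-black background pixels slightly negative; in Photoshop the subtract blend mode clamps the same way.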