UltravioletPhotography

Showing results for tags 'TriColour'.

  1. [since a "generally formal presentation" is required, any help from the admins is appreciated] In a TriColour image we want to construct an image in which each RGB channel represents a certain wavelength band. This is analog to how our eyes and camera work in the visible spectrum, having red light in the red channel, green light in the green channel and blue light in the blue channel. In a TriColour image we do the same, except we don't use red, green and blue light, but other bands, often in the UV or IR spectrum. This is how I do it. I used an image I already posted here as my example. To build this particular image I took three photos of the same subject at 730, 850 and 940 nm. If you use different light sources, it is important to place them in the exact same spot. If you use the same light sources and filter the bands with banpass filters, you have to be careful not to move your camera when changing them. As Bernard already said, this technique is only suitable for static objects. I suggest reading his topic too, where he describes his method. Here are my images, converted to black and white (I took them directly in monochrome in-camera, and I only have the .jpgs). If your images have colors, it is very important to convert them to black and white. 730 nm: 850 nm: 940 nm: You can already see differences between the images. Now I would transform these images into "channels": I open them in IrfanView, and go to "images": and then go to "Color corrections..." (I couldn't take a screenshot of the dropdown window). Here you will find RGB sliders in the lower left: If you want to make a "red channel", drop the G and B sliders to zero. if you want to make a "green channel", drop the R and B sliders to zero, and for a "blue channel" drop R and G to zero. I converted the images as follows: Red: 940 nm; Green: 850 nm; Blue: 730 nm; This is how they should look like after the procedure: 730 nm: 850 nm: 940 nm: The last step is the stacking. I use a software called "Image Stacker" for that. If you are going to use the same software, remember to select "Stack": and this is the final result: Placing a neutral target in the images (such as PTFE) can help to balance colors. One advantage of doing the white balance in IrfanView is that it just re-weights the channels, without creating anything that wasn't there. Other members use other softwares, and you can use your own.
  2. Note: Please bear with me on this topic. I'm not going to be able to get everything posted all in one sitting because we have lots of things going on here at UVP Headquarters - West. MaxMax makes 3 Infrared bandpass filters: https://maxmax.com/filters/bandpass-ir These are beautifully made filters. Each has a shiny coloured side (dichroic?) which gives them their designation as blue, green and red. The color designation has nothing to do with how the raw or false colours appear in a finished IR photo - at least as far as I can determine currently. On the linked page Dan has shown some RGB stacks made with the 3 filters. My initial experiments were not quite that colourful, but that's OK. I'll get there eventually. For my first experiments I had simply wanted to explore the basics. I always like to look at the demosaiced raw photo before any RGB multipliers are applied to set the white balance. This raw composite gives an idea of how the camera is really recording through its Bayer filter. The excellent app Raw Digger outputs a very basic raw composite with only minimal contrast/saturation curves. Gear: Nikon D600 + some 35mm lens + Sunlight Measurements of the raw colour casts were made over the white Spectralon. Then a square of the fully saturated colour was added to the raw composite. IR BP Blue max transmission about 93% half-max width about 90 nm f/3.5 for 1/500" @ ISO-400 Like other IR-pass filters which pass some high red, we get lots of false colour with the BP Blue. It has a strong Orange raw colour cast. IR BP Green max transmission about 90% half-max width about 65 nm The colour cast for the BP Green is very unsaturated. When the saturation is pushed, we can see that the colour cast is Red with about a 25% contribution from Blue (so headed towards Cerise, I suppose). f/3.5 for 1/500" @ ISO-400 IR BP Red max transmission about 85% half-max width about 37 nm You can see why the BP Red requires a longer exposure time. It's lots less wide and peaks out there in the 900s. The colour cast for the BP Red is definitely Magenta after full saturation is reached. f/3.5 for 1/40" @ ISO-400
  3. This topic is a spin-off from this topic, using different filters to those in that topic's title: https://www.ultravio...__fromsearch__1 This is about getting full (and, of course, false) colours in IR. It uses the same techniques described for UV in this topic: https://www.ultravio...__fromsearch__1 In outline, this approach takes 3 images in different parts of the NIR spectrum, and uses these for the RGB channels to get a full colour image. To get the three images, the following filter set was used: 700-800nm range: Midwest Optical BP735 + R72 (the R72 is used to block the red leakage from the BP735) - this forms the blue channel. 800-900nm range: Midwest Optical BN850. 900nm+ range: Midwest Optical LP1000. These filters give the following transmission curves. The curves have been adapted to include typical CMOS sensor sensitivity variation across the spectrum, and have been adjusted such that their heights are the same (as differential exposures are used to overcome the different transmission levels of the three filters): The first results are below: The first image is a standard visible light shot. The second is the full-colour IR shot. It is disappointing in that it shows only faint colouring. But ... The third image is the full-colour IR shot with the saturation wound right up. Colour at last! White balancing was done against the white PTFE tile. You can see that the 18% grey target at the back is not so grey in the IR! Most of the foliage (inc. the red poinsettia leaves) is neutral - I was expecting a slight cyan tinge, as IR reflectance by foliage decreases slightly across this part of the spectrum. The effects of man-made colourants are always interesting. Note how the green marker pen and green booklet have quite different colours to each other in IR. But the blue marker pen and blue book both become yellow (i.e. absorbing in 700-800, reflecting/transmitting in the 900+ band). And the red printing ink is completely transparent across the frequency range. The red berry-like things and the darker red parts of the roses in the vase have come out cyan, indicating that whatever colouring chemistry is reflecting red in the visible region is continuing to reflect light as we move into the 700-800 and 800-900 bands, but not in the 900+ band.
  4. The TriColour technique is probably my favourite way of representing false color in invisible light photos (UV, IR, and theoretically any other band of the EM spectrum). False colors will always be false, but I feel like this technique makes "truer" false colors, as there's a logical meaning behind them. Traditionally, this is done by taking three photos of the subject at three different wavelengths. The images must be superimposable, meaning the subject must stay still, the lighting should stay the same, and the images must be taken from the same point of view, otherwise color fringing will occur. This works fine if those conditions are met, and I have taken some images this way which have little to basically no color defects. For videos, however, things are much different. Normal color sensors have subpixels for red, green and blue, and take the frames at the same time in all spectral bands. Outside the visible spectrum, if such sensors are not available, one has to use different strategies. Method 1 (naive method): the most obvious approach is to use three cameras, as close as possible, with three filters on their lenses, and take the frames at the same time. It would work fine for far-away subjects, but at close distances parallax would be obvious. Pros: - simple to implement; Cons: - needs three sensors and three lenses; - parallax at close distances. Method 2: filter wheel. I discussed this idea here: a spinning filter wheel is placed in front of the lens, and the setup is timed so that the sensor takes a frame every time a new filter is in place. If done quickly enough, this could allow for TriColour video. The problem is that fringing would be visible for fast-moving objects, and the sensor will have different sensitivities at different wavelengths, so ND filters might be needed. Pros: - only one sensor and one lens are needed; Cons: - difficult to build (the sensor and the filters must be synchronized); - the lens must be corrected for chromatic aberration. Method 3: dichroic mirrors. To take three images at the same time at three different wavelengths from the same point of view, dichroic mirrors can be used. They reflect certain wavelengths and transmit others, essentially splitting the image. The biggest downside is that the lens must be either telephoto or strongly retrofocus in design, as the image plane cannot be close to the rear element. As for the retrofocus lens, here's a very raw attempt, at f/8: Pros: - allows for true simultaneous images without parallax; - corrects chromatic aberration (by adjusting the individual sensors); Cons: - requires three sensors; - for wide angle images, the lens must be strongly retrofocus, which makes it difficult to design; - dichroic mirrors in UV are not easily available (maybe interference filters at 45° could be used, although they are usually designed for near-perpendicular light beams). A similar technique has been successfully used here. Method 4: dichroic mirrors with image screen. This is a possible improvement of the previous method. It's the same camera as before, but the image is first projected onto a screen by a first lens, and then the screen is imaged with a second lens with a longer focal length. This way a retrofocus lens is not needed. To increase the brightness of the image, the screen could be made with a microlens array or a Fresnel lens, although I doubt it would work much better. 
Something similar was used by Andy for his early SWIR experiments: https://www.ultravioletphotography.com/content/index.php?/topic/2112-swir-camera-setup-and-some-pics Pros: - allows for true simultaneous images without parallax; - doesn't need a retrofocus lens; Cons: - requires two lenses; - the sensitivity is likely lower than in the previous method, which is a problem especially for UVB; - the first lens must be corrected for chromatic aberration. To connect multiple sensors, I think a Raspberry Pi or similar could be used. I had other more exotic ideas (like using phosphors excited by different wavelengths), but I don't think they could be practically built. I think method #3 is the most reasonable.
  5. I've been using my new Aerochrome filter set a lot these past weeks. It produces really articulate colors, to the point that I don't think the original film necessarily did better. (Plus digital offers a completely different dynamic range, making digital IRG photos very distinct from the OG.) So far I haven't noticed any plant health information that isn't distinguishable with the naked eye: young leaves show pink, older leaves red, and the more they wither the more they lean toward orange-brown and yellow, until they are completely dry and show grey. My guess is that Aerochrome is not intended to be properly white balanced the way I do with my photos, and that it therefore shows a strong dichotomy between cyan dead plants and pink healthy plants. It is maybe easier to tell the difference between cyan and pink than between grey and green. So yes, from what I've seen nothing special is revealed; it's just way more beautiful than the usual boring green. To recap, my setup for these pictures is: Filters: Midopt TB550/650/850 + Lee "Flesh Pink" + GRB3 (+ Cokin diffusion filter on some shots). Camera: full-spectrum Canon 1200D. The channels are swapped in darktable; no IR subtraction is needed. I work from sRGB JPEGs, with no color edits at all. I edit the contrast in Lightroom.
  6. I spent some time yesterday taking many images of this unremarkable piece of plastic fantastic. I got these at a flea market the other day, thinking they were cool, but given that they're made from pretty cheap plastic, I might have changed my mind on that over time. Regardless, it was interesting to "investigate" them. The first thing I did was an IR tri-color, using the GRB3 method I discovered recently. Very underwhelming, but expected. Next, I illuminated the glasses with three different IR-filtered LED light sources (a green one, a blue one and a 395nm one), recorded the fluorescence in IR for each using my 850nm longpass filter, and made a tri-color image out of them afterwards. Lastly, I opened up the lens all the way to f/1.4, since I knew I would need a lot of light. I mounted the blue spotlight I was using beforehand and did an IR tri-color of the IR fluorescence under blue. I'm not exactly sure if I learned anything from this, besides maybe the fact that the orange shot glass really likes to fluoresce, but it was interesting, still. I would like to investigate more objects this way later.
  7. The following experiments are inspired by the findings of @Christoph. Here is a series of pictures with my full-spectrum Canon 1200D and a stack of these three filters: Midopt Triple Bandpass 550/660/850nm, Lee "Flesh Pink" gel, GRB3 from Tangsinuo. The transmission curve of the three filters combined should look something like this: Both the green and the IR spike are heavily filtered. When the TB is used alone, it seems to suffer from a very weak red transmission: without the GRB3 the red channel records as much (if not more) IR than red, requiring IR subtraction of 100%. The green spike transmission, on the contrary, is very powerful. To the eye (at least mine) the filter is green. I'm not sure how brightly humans are able to perceive red at 660nm; it's pretty deep already. I then decided to reduce the transmission of IR and green equally to give room for the red transmission. The GRB3 is used to minimise IR contamination in the red channel, and the Lee gel is used to minimise green contamination in the blue/IR channel and to make it more or less match the output level of the red channel. The blue channel stays very underexposed compared to the other two, and is brought back in white balance (pretty extreme: lower than 1900 K). The exposure value with these filters ranges from 1/50s to 1/100s in sunlight at f/5.6, ISO 200. It's a very dark combo. All the pictures are channel swapped in Resolve from the camera JPEGs; the process is video ready. No saturation was added and zero IR subtraction is needed. The fact that IR transmission is cut by 90% by the GRB3 causes some green leaks to be perceived in the blue/IR channel. This leads to the need to apply a hue correction to the sky in order to make it look properly blue and not purple-ish. This process is simple and non-destructive. The original color of the sky after channel swap:

     Important notice: the channel swap causes a significant decrease in contrast and micro-contrast. For a moment I thought this was due to the JPEG compression, but it is not. Do you remember this law? Y = 0.1 Blue + 0.6 Green + 0.3 Red. It is actually crucial in understanding why the images look better unswapped. It describes how the human eye is sensitive to luminosity: the green value is far more decisive in the perceived brightness of an object than the red and blue values.

     A quick example. Unswapped image: Swapped image: it looks less detailed and dynamic; the contrast between the bright grass and the dark trees is lessened. Swapped image but with the "preserve luminance" box ticked: the image is sharper and more alive. (Click on the image and use the viewer to compare both instantaneously.)

     Explanation: the grass in the unswapped image appears turquoise; let's make it cyan for the sake of clarity. The trees appear blue. Cyan RGB values are (0; 255; 255), blue RGB values are (0; 0; 255). The minimum luminance (Y) is 0 and the maximum luminance is 1.
     Y(cyan grass) = 0.1 x B(255/255) + 0.6 x G(255/255) + 0.3 x R(0/255) = 0.7
     Y(blue trees) = 0.1 x B(255/255) + 0.6 x G(0/255) + 0.3 x R(0/255) = 0.1
     In the unswapped image the contrast between the brightness of the trees and the grass is large: Y=0.1 versus Y=0.7. Now, let's do the same for the swapped image, where the grass is magenta and the trees are red.
     Y(magenta grass) = 0.1 x B(255/255) + 0.6 x G(0/255) + 0.3 x R(255/255) = 0.4
     Y(red trees) = 0.1 x B(0/255) + 0.6 x G(0/255) + 0.3 x R(255/255) = 0.3
     In the swapped image the contrast between the brightness of the trees and the grass is smaller than in the original image (Y=0.3 versus Y=0.4).

     In this chart the colors are ranked by luminance: white Y=1, yellow Y=0.9, cyan Y=0.7, green Y=0.6, magenta Y=0.4, red Y=0.3, blue Y=0.1, black Y=0. As you can see, the brightness of magenta and red is very close, as opposed to cyan and blue, which are very far apart. So that's why channel swapping sometimes makes the image lose quality. A solution to this is to tick the "preserve luminance" box in the channel mixer (a rough scripted illustration of what this does follows at the end of this post). It doesn't work in every situation, since it makes yellow objects turn way darker, leading to an unnatural look. Ticking the box also makes the sky way brighter. It's a tradeoff that has to be made for each individual picture. In the selection I posted above, a few have the box ticked and most don't.
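
A rough illustration of the luminance arithmetic above and of what a "preserve luminance" option can do, using the post's approximate weights. This is not darktable's or Resolve's actual channel-mixer code, and the input filename is a placeholder.

```python
# Approximate luminance bookkeeping for a channel swap, using the post's
# weights Y = 0.3 R + 0.6 G + 0.1 B (close to the Rec.601 luma weights).
import numpy as np
from PIL import Image

img = np.asarray(Image.open("unswapped.jpg").convert("RGB"), dtype=np.float32)
w = np.array([0.3, 0.6, 0.1], dtype=np.float32)   # R, G, B weights

y_before = img @ w                                # per-pixel luminance before the swap
swapped = img[..., [2, 1, 0]]                     # a simple R<->B swap as an example
y_after = swapped @ w

# "Preserve luminance": rescale each pixel so its luminance matches the original.
scale = y_before / np.maximum(y_after, 1e-6)
out = np.clip(swapped * scale[..., None], 0, 255).astype(np.uint8)
Image.fromarray(out).save("swapped_preserve_luma.png")
```
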
  8. I have been taking a lot of photos lately. This time I decided to investigate a pair of cucumbers. I illuminated them with a halogen spotlight, which emits enough light for IR, visible and UV. The IR tri-color uses my GRB3 method; UV is taken with a ZWB2+QB39 stack (which surprisingly does not leak significantly, even with halogen). I will also be including the individual pictures in case anyone else wants to take a shot at processing them (please do post). I'd especially appreciate it if someone managed to stack all 7 channels together; I could only stack them by binning the visible and the IR parts of the spectrum separately. The pictures are in full HD, so if you still have a 1080p display, you might want to enlarge. visible IR tri color 850nm+720nm+red Aerochrome simulation GBUV full spectrum individual color channels: 950nm longpass ~850nm band ~720nm band red band green band blue band UV band (400-350nm) I also decided to stack the images in Photoshop using the Range stack mode, and I got interesting results (a short scripted sketch of this per-pixel range follows at the end of this post). All of the bands stacked: All of the bands except for UV stacked: IR only stacked: IR only stacked (normalized): Bonus: IR stacked, normalized and processed with Topaz Denoise AI: Here's the IR range mapped on the visible image: Here's the range between the 720 and 850 bands mapped on the visible image:
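
For anyone who prefers a script to Photoshop, the per-pixel range across aligned band images (what the Range stack mode reports) can be sketched like this; the filenames are placeholders and the frames are assumed to be aligned and the same size.

```python
# Per-pixel range (max minus min) across aligned monochrome band images;
# brighter output = less consistent reflectance across the bands.
import numpy as np
from PIL import Image

files = ["ir950.png", "ir850.png", "ir720.png", "red.png", "green.png", "blue.png", "uv.png"]
stack = np.stack([np.asarray(Image.open(f).convert("L"), dtype=np.float32) for f in files])

rng = stack.max(axis=0) - stack.min(axis=0)
lo, hi = rng.min(), rng.max()
rng_norm = 255 * (rng - lo) / max(hi - lo, 1e-6)   # "normalized" version
Image.fromarray(rng_norm.astype(np.uint8)).save("range_stack.png")
```
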
  9. Lately, I have been having much fun taking pictures of different objects in many bands and then combining the data I get in different ways. I have gotten many interesting results which I will share later, but for now I would like to share images of this beautifully blue mineral I got at a flea market on Saturday. The images are in full HD. As I usually do, I took several images of the rock in different bands: three IR bands with my GRB3 method, a normal RGB image (as seen above), and a UV image with a ZWB2 and QB39. 950nm 850nm 720nm red green blue UV Out of curiosity, I also did a UV picture with the same exposure time, aperture and ISO with a 510nm longpass filter screwed on top of the two filters. I think the result was very impressive considering this was a halogen spotlight and not even the sun or some other better light source. This image was pushed by 8.612 stops in Darktable. Now for the more artistic interpretations. G-B-UV 950nm-850nm-720nm Aerochrome simulation 850+720+R full spectrum (850nm+950nm+720nm)-(R+G+B)-UV Edit: here's a full spectrum stack made with a hybrid method I developed using advice from both @Stefano and @Andrea B. (found in this thread): I think it looks much better. (stackmode maximum)-(stackmode median)-(stackmode minimum) stackmode range, normalized Brighter areas show where the mineral is the least consistent in its reflectance. It's very inconsistent overall.
  10. As you know, all cameras react differently to IR light once they are converted to full spectrum. I don't know if this is really well understood, so I wanted to start a topic and hear your opinions about it. I just got the Sony F828 (the camera that can be converted to full spectrum with just a magnet) and I can now compare its full-spectrum colors to my Canon. Canon 1200D, white balanced full spectrum colors, no additional filters: Sony F828, white balanced full spectrum colors, no additional filter: The pictures were white balanced and saturated from the RAW files in Lightroom. The two cameras are pretty far apart, as we can see from these pictures. They are also pretty far apart in terms of technology: the first is an 18 Mpx CMOS and the second is a 20-year-old 8 Mpx CCD with a one-of-a-kind RGBE sensor. Sony cameras are known not to perform as well as Canon with the IRchrome filter. The images above indeed show that this Canon camera has a predisposition to record IR in the red channel compared to the Sony. Now, do more recent Sony cameras produce blue SOOC like this one does? I don't know... Maybe the Sony users here can help me from their experience.
  11. I have already tried several ways of achieving this. This time, though, I think I got the best results. The blue channel is a 720nm longpass filter plus the GRB3; the green channel is an 850nm longpass plus the GRB3; the red channel is a 950nm longpass. Here's the transmission curve for GRB3 as published by Tangsinuo themselves. As you can see, stacking this with either a 720nm or an 850nm longpass would create a distinct peak by itself. Here's a hypothetical and not-at-all scientifically accurate representation of how this (most likely) works (a toy calculation along the same lines follows at the end of this post): The only disadvantage here is that the transmission of the green channel stack is very low, so the images take a long time to expose properly, but otherwise I think it works perfectly. I tested this by taking a picture of a Guinness World Records book that is pearlescent in the visible to see if it persists in IR. See for yourself. Visible (phone photo) IR Tri-color You still see red, yellow, green and blue, so I think the spectrum is pretty good. Here are some more examples of images made with this method. Visible: Assorted Lightsources IR Tri-color Visible: Glassware and Liquids (the yellow-green one is uranium glass) IR Tri-color Visible: Filters, gemstones, remote control, minerals. (I unfortunately bumped the tripod on the last photo so I couldn't align properly) IR Tri-color Visible: gemstones alone IR Tri-color IR Tri-color Bonus: Fruits (focused on the damaged apple area on purpose) IR Tri-color Bonus: jars with various foodstuffs IR Tri-color Bonus: Sony DSC-F828 digital camera The camera is not very interesting, truth be told. The plastic is very persistently black. The next thing I would like to try is capturing infrared fluorescence this way, to see if it is as "colorful" as the visible sort.
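
A toy calculation of the filter-stacking idea mentioned above: the effective band is roughly the product of the longpass curve and the GRB3 curve, which turns a longpass edge into a peak. The curve shapes below are invented for illustration only and are not Tangsinuo's measured data.

```python
# Toy model: effective transmission of (longpass + GRB3) is the product of
# the two curves, so a longpass edge against a falling IR lobe gives a band.
import numpy as np

wl = np.arange(650, 1101)                                  # wavelength axis in nm
longpass_720 = 1.0 / (1.0 + np.exp(-(wl - 720) / 5.0))     # smooth 720 nm cut-on (made up)
grb3_ir = 0.5 * np.exp(-((wl - 800.0) / 120.0) ** 2)       # invented GRB3 IR lobe (made up)

effective = longpass_720 * grb3_ir                         # what the sensor actually sees
print("effective band peaks near %d nm" % wl[int(np.argmax(effective))])
```
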
  12. A process for full-colour UV (actually, UVA) photographs using tri-colour separation images is covered in the thread at https://www.ultravio...__fromsearch__1 . An equivalent process for IR (actually, NIR) uses the same methods, but with different filters (and without the problems associated with the camera's low sensitivity in UV). In this post are some images which show typical results from these techniques. Unlike simulated Aerochrome images, no visible light is involved – the images are pure UV or pure IR. Just as a reminder, the tri-colour separation images used the following filters: UV: Red Channel: 380BP20 Green Channel: 345BP25 Blue Channel: 315BP25 (peak transmission at about 323nm) IR: Red Channel: CWL at about 1,000 nm Green Channel: CWL at about 850 nm Blue Channel: CWL at about 735 nm Camera in all images is a full-spectrum Sony A6000. Firstly a few shots showing groups of related items. Here are some printed materials (Visible, UV, then IR images; Lens = Focotar-2): Next, various containers with metal, plastic, and paint (Visible, UV, then IR images; Lens = Focotar-2). In UV, a lot of plastics come out as brown (i.e. increasing absorption as wavelength decreases) irrespective of their visible colour, and the same plastics tend to come out white in IR. (Oil-based paints similarly come out brown in UV.). There is also a glass of water here. In UV this is yellow, which is because of the absorption of shorter wavelengths by the glass; in IR it is blue, because of the increasing absorption at longer wavelengths by the water. The plastic bottle of isopropyl alcohol at bottom left shows a similar effect in IR. Now, some fruit and vegetables (Visible, UV, then IR images; Lens = Focotar-2). Relatively little colour, with the objects looking dark/rotting in UV and shades of white in IR. Glazed Pottery (Visible, UV, then IR images; Lens = Focotar-2): My long-suffering wife. The red hair in the IR shot is close to what it was like when she was younger. The skin has a slight cyan colouring, indicating higher reflectance at the shorter IR wavelengths (the yellow patches are probably down to facial movement between shots). The redness in IR of the visibly red bricks is noticeable. The UV shot shows up the freckles, and the sun-blocking effect of face cream (rather than specific sun-block cream) applied several hours earlier: the brown colour of the face cream area indicates decreasing absorption as wavelength decreases. The mauve of the T-shirt is interesting – indicates that reflection dips in the middle of the UVA range (Visible, UV, then IR images; Lens = Focotar-2): Non-glazed Pottery (Visible, UV, then IR images; Lens = Focotar-2): Finally in this sequence, a car windshield (Visible, UV, then IR images; Lens = Focotar-2). The UV image shows very strong absorption, especially at shorter wavelengths, which is not so surprising. But I was surprised to see that there was quite a lot of IR absorption, predominantly at longer wavelengths. The above UV image brings to mind what the great Richard Feynman said about the first atomic bomb test at Los Alamos: "They gave out dark glasses that you could watch it with. Dark glasses! Twenty miles away, you couldn't see a damn thing through dark glasses. So I figured the only thing that could really hurt your eyes - bright light can never hurt your eyes - is ultraviolet light. I got behind a truck windshield, because the ultraviolet can't go through glass, so that would be safe ... this tremendous flash out there is so bright that I duck ... 
So I look back up, and I see this white light changing into yellow and then into orange ... Everybody else had dark glasses, and the people at six miles couldn't see it because they were all told to lie on the floor. I'm probably the only guy who saw it with the human eye." Looking at buildings now, it is interesting that brick and roof tiles that are red in the visible also come out reddish in IR. So an IR colour image could almost be mistaken for a standard visible colour image - until you have a true visible colour image for comparison (Visible, UV, then IR images; Lens = Focotar-2): A couple of noteworthy points about the next trio of images: the window frames and doors in the buildings to the left are brown in UV because they are plastic coated or painted with oil-based paints. In the building just right of centre, the wooden beams, which are weathered to grey in the visible image, come out brown in IR: this is a typical rendition of wood in colour IR (Visible, UV, then IR images; Lens = Focotar-2): (Visible, UV, then IR images; Lens = Focotar-2): In this image, the columns are painted white using an oil-based paint, and so come out brown in the UV image (Visible, UV, then IR images; Lens = Focotar-2): The next image is a vertical panorama. This church is on a hilltop and is a reporting point called "Golden Ball" for aircraft flying in to the local airfield (Visible, IR; Lens = Focotar-2): The following image shows the limitations of this technique when using focal lengths shorter than 50mm (with an APS-C sensor) and dichroic filters. You can see the colour shift towards the edges in the IR shot, and the additional problems caused by the small diameter (25mm) of the UV filters (Visible, UV, then IR images; Lens = Soligor 35mm f/3.5 enlarging lens): Turning to landscapes, this is where IR is at its best. UV landscapes show very little colour, and of course haze in the distance is more pronounced (which might be an effect you want). Skies come out blue in IR (adding to the effect of sometimes appearing as almost normal colour images); this is less noticeable in UV, with skies often appearing white – perhaps as a result of the sky burning out, because if you deliberately under-expose you can get a blue sky) (Visible, UV, then IR images; Lens = El-Nikkor 105mm): I have added a funky fail in to the following trio: this was an IR shot where the fast-moving clouds caused their shadows to move while I was changing filters for the three tri-colour separation exposures (Visible, UV, IR, then IR Funky Fail images; Lens = El-Nikkor 105mm): I like this shot in IR – it almost looks like a normal colour image, but then you have the surprising white background. Also, another funky fail (Visible, UV, IR, then IR Funky Fail images; Lens = El-Nikkor 105mm): This IR image shows subtle variations in colour between different areas of vegetation. (Visible, IR; Lens = Focotar 2): In this trio, we see clearly the effects of atmospheric scattering: the distance in the UV shot is very hazy, and the IR shot shows blue-green colouration in the distance. The building with chimney stacks towards the top-left of the image is about 19 Km away (it is all that remains of a power station which used to be a major landmark for local light aircraft) (Visible, UV, then IR images; Lens = El-Nikkor 105mm): Note here the brown colour of the bridge in UV, which again is down to the use of oil-based paint. 
The IR image shows subtle variations of colour in foliage, and the white paint on the bridge is obviously not white in IR (Visible, UV, then IR images; Lens = Focotar-2): Finally, flowers. I have not included any IR shots here, because these always come out as white, although you can squeeze a bit of colour out of them by ramping up the saturation to extreme levels. But here we have only UV shots. As a general (but not universal) rule, blue and white flowers come out red – presumably because the colourant reflecting blue light doesn't stop reflecting at 400 nm, but continues into the longer UV wavelengths. (Blue flowers also come out blue in straight shots taken through a Baader U, presumably because the longer UV wavelengths just happen to be triggering the blue channel.) Wild Strawberry (Visible, UV; Lens = Focotar-2): Cornflower (Visible, UV; Lens = Focotar-2): Bindweed (Visible, UV; Lens = Focotar-2): Mock Orange (Visible, UV; Lens = Focotar-2): Campanula – note how the difference in visible colour does not come through in UV (Visible, UV; Lens = Focotar-2): Margerite (Visible, UV; Lens = Focotar-2): Sweet Pea (Visible, UV; Lens = Focotar-2): Dead Nettle (Visible, UV; Lens = Focotar-2): Dog Rose (Visible, UV; Lens = Focotar-2): And yellow flowers tend to come out slightly cyan, indicating more reflectance at shorter wavelengths. (These come out slightly yellow with the Baader U). St. John's Wort (Visible, UV; Lens = Focotar-2): Buttercup (Visible, UV; Lens = Focotar-2): Dandelion (Visible, UV; Lens = Focotar-2): Hawksbeard (Visible, UV; Lens = Focotar-2): The rest of these images do not follow those general rules: Geranium (Visible, UV; Lens = Focotar-2): Iris (Visible, UV; Lens = Focotar-2): Pansy (Visible, UV; Lens = Focotar-2): Mallow (Visible, UV; Lens = Focotar-2): Clematis (Visible, UV; Lens = Focotar-2): Loosestrife (Visible, UV; Lens = Focotar-2): Aquilegia (Visible, UV; Lens = Focotar-2): Snapdragon (Visible, UV; Lens = Focotar-2): Tulips. Flowers of the same species but with different visible colours usually look the same in UV, but these tulips show that that is not always the case (Visible, UV; Lens = Cassar S):
  13. Hi, I have had this idea for a while; I'm sure this isn't anything new, but I wanted to share it. Using a camera with a high frame rate (at least 120 fps) and a rotating filter wheel, it could be possible, at least theoretically, to make a TriColour video. For example, having 4 filters for three wavelength bands (one doubled, ideally for the band at which the camera is least sensitive), at 120 fps would allow taking images at three different wavelengths 30 times per second, and with some digital processing producing a color video at 30 fps (a sketch of the frame de-multiplexing follows at the end of this post). For UV especially, the camera had better be monochrome for sensitivity reasons. If the blue channel is chosen to be UVB, taking a UVB photo at 1/120 s exposure time could maybe be possible using a fast lens and high ISO. See Lukas' post for UVB video. It's doable with a monochrome camera, and I strongly doubt you can do it with a color camera, unless you are recording the Sun. The resulting video would probably suffer from color fringing for fast-moving objects, and there's also the problem of different sensitivity to different wavelengths (one solution would be ND filters for the "brightest" ones). I don't have such a camera and I won't try this anytime soon, if I ever do. Has anyone tried this or is going to?
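
A sketch of the digital side of this idea, assuming a synchronised 120 fps monochrome frame stream and a wheel order of R, G, B, B per revolution (the doubled filter averaged for the least sensitive band). The capture and synchronisation hardware is left out, and the function name is hypothetical.

```python
# Demultiplex a 120 fps monochrome stream captured behind a 4-position wheel
# (repeating order R-band, G-band, B-band, B-band) into 30 fps colour frames.
import numpy as np

def demux_wheel_frames(frames):
    """frames: list of HxW uint8 arrays in repeating R, G, B, B order."""
    colour_frames = []
    for i in range(0, len(frames) - 3, 4):
        r = frames[i].astype(np.float32)
        g = frames[i + 1].astype(np.float32)
        b = 0.5 * (frames[i + 2].astype(np.float32) + frames[i + 3].astype(np.float32))
        colour_frames.append(np.clip(np.dstack([r, g, b]), 0, 255).astype(np.uint8))
    return colour_frames   # one colour frame per wheel revolution
```
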
  14. (UVP has no monetary associations with any company.) I was reviewing the available information about the MegaVision 120/4.5 Macro UV-IR ApoChromat lens which is listed in the UV-Dedicated section of the Lens Sticky. I learned that for Archival and Cultural Heritage Imaging, MegaVision offers a complete setup of camera, lens, LED lighting and software. LINKIE: http://mega-vision.c...l_heritage.html SEE ALSO: N-Shot Capture The camera is a 50 megapixel monochrome with no internal filtration. Nice. The LED lighting panel can be configured with 365 nm UV and 395 nm UV/Violet modules as well as with 10 different visible modules and 6 different IR modules. Again, nice! (And I'm willing to bet that a UV module below 365 nm would not be impossible to ask for.) The software controller permits a 12-band image capture in less than a minute. Woo, fast!! Here is the unusual thing: no filters are used with their "leakless" LED lighting. This would be a nice thing, methinks. But I do hope instructions are given to shoot in the dark. I particularly liked this: "Spectral confusion due to Bayer filter transmission overlap is eliminated, thereby reducing potential metameric failure and allowing improved display/printing flexibility." That's what we get here sometimes - spectral confusion. La!! Finally, I really enjoyed reading the last part of the LINKIE which discussed making not tri-colour but 5-colour to 8-colour images from a monochrome camera. The claim is made that to make "good" visible images from a mono cam, one needs more than 3 bands. Very interesting! What do you think this gear would cost? I'm not sure I even want to ask. :lol:
  15. Here Andrea suggested that I try a TriColour photograph of an egg, and here I am. I felt like it was better to create another topic, but I'm OK if this post is moved into the original topic. Usual setup (full-spectrum Canon EOS M, SvBony 0.5x focal reducer lens), with the following filters: Visible reference: Chinese BG39 (2 mm); TriColour: Blue channel: double 310 nm Chinese bandpass filter + ZWB1 (2*2 mm); Green channel: BrightLine 340/26 filter + ZWB1 (2*2 mm); Red channel: BrightLine 387/11 filter + Chinese BG39 (2 mm); Left: raw egg; right: boiled egg (both brown eggs); Visible reference: UV TriColour (there is some color fringing in the shadows, but the result is clear): I would never have expected the blue color; it is very unusual for something to be "blue" in UV. I did a quick Google search and found this: https://www.researchgate.net/figure/Average-spectral-reflectance-of-host-eggs-before-and-after-application-of-UV-block-a_fig2_281866377 https://www.researchgate.net/figure/Iridescence-of-eggshells-Specular-reflectance-of-a-T-major-eggshell-fragment-at_fig2_269398327 which may explain what I saw. It would be interesting if someone else tried the same experiment, preferably with a white reference (David did, but without the reference). Also, my paper tissue may not have a flat reflectance in UV, especially at 310 nm (if it darkens, the egg appears brighter).
  16. UV color is a complex and controversial topic. For example, Andrea will always remind you that UV false color is not strictly related to wavelength, as it depends on many factors (lighting spectrum, lens transmission, filter transmission, sensor response, white balance...), although we always see the same colors in our UV photos: blues, lavenders/purples, yellows, and sometimes green. Red is not a color that we would expect. A different way of thinking about color outside the visible spectrum in general is to make a TriColour/trichrome/tri-band image, which often produces more natural-looking colors (for example, the sky is still blue) and also gives a wavelength-color relationship. UVP member Bernard Foot experimented with the technique some years ago, and I have already tried it before. Other people (notably UVP member OlDoinyo) like to render white-balanced UV photos in BGR (swapping the red and blue channels), which also produces blue skies and a different color palette. Since I have a color camera, the images I take when making a TriColour image have colors, which I normally get rid of to make the channels. If I stack those images instead, I can simulate the raw image taken by a camera with an approximately flat response between about 310 nm and 400 nm, and with sunlight having a uniform spectrum too. This never happens in real life, even with a UV-dedicated lens. The interesting part is comparing the resulting colors with those of a normal UV photo. The equipment I used is the usual one: full-spectrum Canon EOS M, SvBony 0.5x focal reducer lens and the following filters: TriColour: Blue channel: double 310 nm Chinese bandpass filter + ZWB1 (2*2 mm) (the ZWB1's are not necessary, but I used them anyway); Green channel: BrightLine 340/26 filter + ZWB1 (2*2 mm); Red channel: BrightLine 387/11 filter + Chinese BG39 (2 mm); Standard UV: ZWB2 (2 mm) + Chinese BG39 (2 mm); Visible reference: Chinese BG39 (2 mm); The technique used to make the TriColour images is also the usual one, described here. The major difference is that I took multiple 310 nm exposures this time and stacked them taking the darkest pixels (5 exposures in both cases; a minimal sketch of this darkest-pixel stack follows at the end of this post). As for the raw color stacks, I set the brightness of each image to be about the same by eye and stacked them. Also, following Andy's advice last time, I raised the brightness of my images and the contrast in the TriColour stacks (also because the contrast in the original channels was removed in PhotoNinja during the processing). The visible and UV references are white-balanced in-camera, while the raw stacks were white-balanced in PhotoNinja. I used both UV-lavender and UV-yellow subjects. For the lavender, I picked three items with varying degrees of lavender: a magnifying glass on the left (transparent at 387 nm, mostly transparent at 340 nm and opaque at 310 nm), almost colorless; a white LED lightbulb in the middle; and a plastic lens on the right (mostly transparent at 387 nm but opaque at 340 nm and below), which shows a strong blue-purple color. Visible reference: Standard UV: White-balanced raw UV stack: As you can see, the color palette didn't change much, but since here the shorter UV wavelengths contribute much more to the image, the magnifying glass is noticeably darker. In general, objects with a pale lavender color got a color boost. UV TriColour: Here the color palette is obviously richer, with the color giving a good indication of the transmitted/reflected wavelengths. 
Standard UV, BGR: Compared to the TriColour rendition above, only the plastic lens on the right looks similar, while the color deviates more for items with a flatter UV response. For comparison, here's the raw stack, in BGR version: ...and now for the yellows. Here I used a 3 mm thick ZWB1 filter on the left, and a 2 mm thick ZWB2 filter on the right. Visible reference: Standard UV: Here the colors look similar, with the ZWB1 filter being slightly greener, as expected. Also the paper tissue I used apparently contains UV-absorbing fibers. White-balanced raw UV stack: Here things get weird. The ZWB1 filter got orange, which is a bit different from its normal color. Also, and this was expected, the difference in color (hence transmission) between the filters is more evident now. UV TriColour: Standard UV, BGR: UV stack, BGR: Raw or .tif files are available.
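
A minimal sketch of the darkest-pixel stacking of the 310 nm exposures mentioned in this post. The filenames are placeholders and the frames are assumed to be aligned; this is not the exact tool chain the author used.

```python
# Darkest-pixel (per-pixel minimum) stack of several aligned 310 nm exposures,
# which suppresses hot pixels and some of the high-ISO noise.
import numpy as np
from PIL import Image

files = ["uvb310_%d.png" % i for i in range(1, 6)]          # five placeholder frames
frames = np.stack([np.asarray(Image.open(f).convert("L")) for f in files])
darkest = frames.min(axis=0).astype(np.uint8)
Image.fromarray(darkest).save("uvb310_darkest.png")
```
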
  17. David has previously shown that this lens is good for UV down to UVB: https://www.ultravio...but-not-magical And here is an image quality test: https://www.ultravio...-tests-at-313nm Bernard Foot has also tested this lens for its intended purpose, and also reported good results: https://www.ultravio...th-svbony-et-al So I tested one myself. I built a lens using the following parts: The 25-52 mm adapter ring barely screws on the lens ring since the lens is thicker than the original UV filter. If you want to build a similar lens, either glue the step-up ring or try to find a deeper 25 mm filter ring. The lens should be mounted with the most curved side towards the subject and the flatter side towards the sensor. The focal length of this lens is about 45-48 mm (I used 48 mm to calculate the apertures below). Image quality Since this is a single lens, there's no correction for spherical and chromatic aberrations, so you will get both. You will need to refocus at different wavelengths. Full-spectrum Canon EOS M, Chinese BG39 (2 mm). Fully open (about f/2.4), ISO 200, 1/30 s exposure: About f/5, ISO 800, 1/30 s exposure: Stopping down even more will improve image quality a lot, especially at the edges, but of course this will require longer exposure times. UV (ZWB2 (2 mm) + Chinese BG39 (2 mm)), lens stopped down between f/3 and f/5, eyeballing: ISO 25600, 1/15 s exposure: Rubik's cube. All colors absorb at 310 nm and 340 nm, but the white squares are still a bit reflective at 387 nm, and they appear red here: ISO 100, 8 s exposure: ISO 25600, 1/30 s exposure: UVB performance and TriColour This lens is quite transparent to UVB, at least down to 310-313 nm (see David's test in the link at the beginning), while even lenses with a very good UV reach (such as my Soligor) do not transmit much there, just a few percent at most. This makes taking UVB images and TriColours easier, as I put the 310 nm images in the blue channel. With this lens there is a gain of at least 2-3 stops compared to a Soligor 35 mm f/3.5 lens at 310 nm. The focus shift between 310 and 340 nm is not very strong, but between 340 and 387 nm it is visible in live view (I would have expected the opposite). In the images below, I had to zoom in on some channels (especially the red one) and stretch them horizontally and/or vertically to overlay them as well as possible. I did that in Paint (a rough scripted version of this step follows at the end of this post). In the 310 nm images I only kept the green channel, as most of the signal is recorded there. Channels: Red: BrightLine 387/11 filter + Chinese BG39 (2 mm); Green: BrightLine 340/26 filter + ZWB1 (2*2 mm); Blue: Chinese 310 nm bandpass filter*2. Empty glass with paper tissue and rocks. This image was probably taken with the lens either fully open or very open, as it is quite soft. The absorption of the glass at 310 nm is visible as a yellow tint. Magnifying glass (top), plastic lens (bottom left), car headlight lens (bottom right). All the lenses are black at 310 nm, and the plastic lens is black at 340 nm too (hence the red color): Rubik's cube. All squares absorb UV, except for the white ones, which reflect at 387 nm and appear red here: 2 mm thick ZWB2 filter (left), 3 mm thick ZWB1 filter (right): From here on, I also stretched the channels to reduce color fringing. Before, I only zoomed them. CFL bulb. Both the base and the tube are white to the naked eye (although with slightly different shades). In UV, the base becomes a neutral gray (quite unusual in my opinion) and the tube becomes orange. 
This color means two things: firstly, the reflectance gradually decreases at shorter wavelengths (a sudden drop would usually look red or yellow), and secondly, the glass the tube is made of is mostly transparent at 387 nm and still partially transparent at 340 nm. I have detected the 365 nm mercury I-line coming from this very bulb in the past: https://www.ultravio...-a-spectrometer Polycarbonate goggles with atypical temples. I already showed them in this topic. Both temples absorb at 310 nm, but only one absorbs well in UVA: Some trees with some haze: Daisy: One day I should try an aspheric lens, to see if quality is significantly improved. But this lens is still nice and cheap.
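
A rough scripted version of the channel-zoom step described in this post (done in Paint by the author): one channel is upscaled by a small factor and cropped back to size, to compensate for the magnification difference between bands before merging. The filenames and the 1.01 factor are placeholders; the factor would be found by trial and error.

```python
# Upscale one channel slightly and centre-crop back to the original size,
# then merge the three aligned channels into a TriColour image.
from PIL import Image

def zoom_channel(ch, factor):
    w, h = ch.size
    big = ch.resize((round(w * factor), round(h * factor)))
    left, top = (big.width - w) // 2, (big.height - h) // 2
    return big.crop((left, top, left + w, top + h))

r = zoom_channel(Image.open("chan_387.png").convert("L"), 1.01)   # factor found by eye
g = Image.open("chan_340.png").convert("L")
b = Image.open("chan_310.png").convert("L")
Image.merge("RGB", (r, g, b)).save("tricolour_aligned.png")
```
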
  18. So recently, I have developed a new (maybe) way to investigate the properties of objects. I'm not sure if it's been tried here before, but what I do is use three light sources, each of a different wavelength; I filter these light sources with a QB39, and on the camera I mount a 720nm longpass and a 650nm longpass. The light sources I use are generic Chinese-made spotlights that you find on eBay or AliExpress. https://www.ebay.com/itm/164732255359?var=464226938551 https://www.aliexpress.com/item/4000068864312.html?spm=a2g0s.9042311.0.0.27424c4dpYRypM I use the blue, green and 395nm variety. The light causes the given object to fluoresce, but the amount and wavelength of the fluorescence differ slightly depending on the excitation wavelength. I then stack the images and map them to different channels to illustrate the differences. The 650nm longpass is used in tandem to weed out the little green light that leaks through the 720nm longpass. I have prepared two examples, an apple and a decaying leaf. leaf reference leaf under green leaf under blue leaf under 395nm leaf trichrome apple under green apple under blue apple under 395nm apple trichrome bonus: apple UVIVF If you have any suggestions of what else I should do, please do tell. I have tried a few other things, including pumpkin seeds, a mineral rock, the outside of an apple and the print on a juice box (to see how the different dyes and bare paper behave). I might post that later. So far I love the results; I think they're really interesting and rich in how the color varies, which cannot be said for many other techniques here, such as UV, which really only gives around 3-4 colors depending on your reach. The next step for me is to make a setup with three UV LEDs of different wavelengths to make UV trichromes. I think that's a superior way to see into UV, as it can give a full spectrum of colors and works in a way our brains understand. But enough rambling, hope you enjoy the post.
  19. One of my favorite posts on this forum is this one by Bernard. https://www.ultravio...lour-uv-and-ir/ I'd love to replicate something like this myself, at least to an extent. I want to get myself a set of LED chips with different wavelengths; I would then light up objects with them and create trichromes. This is my current shopping cart. 400-405nm/390-395nm for red 380-385nm for green 365-370nm for blue What I'm not sure about is the red channel: I was thinking perhaps I could get something even longer-wavelength to compensate for the fact that my reach won't go that deep, definitely not as deep as Bernard's, for example. And if I do settle for one of those options, which one should I pick? I guess I could also get both, since the 400-405nm one only costs about 3 USD. I don't know if there would even be a difference between those two, though. Is the 5nm difference worth it? Edit: Update, I just purchased the LED chips. I kept the blue and green options and I went with a 395-400nm for red. If all I get are some shades of yellow, beige and orange, that's fine with me. It's still a way of imaging things in UV that makes more sense than using the odd transmission curves of the Bayer array filters. Not saying I'm done with that type of UV photography, but I feel this is superior because it ties into the way we already perceive the world.
  20. Jonathan sold me some 25 mm filters some time ago, including two BrightLine filters, a 340/26 and a 387/11 (2 pieces). These filters pass two distinct UV bands with no overlap, and I thought they could be useful for TriColour UV. Since they leak IR, I had to stack the 340/26 filter with 2*2 mm thick ZWB1 and the 387/11 with 2 mm thick Chinese BG39. For the blue channel, I used two Chinese 310 nm filters stacked to eliminate any leak (alone, they leaked some 340 nm light). I putty-mounted all filters in 30-26 mm step-down rings. To avoid vignetting, I had to reverse-mount my step-down rings, so I mounted a 46-43 mm ring on the Soligor, then a 43-37 mm ring on top, then a 37-30 mm ring upside down, and then the 30-26 mm rings containing the filters. I cannot screw this assembly completely onto the lens because something touches the metal ring with the lens name and the serial number. The only rings which allow for reverse-mounting are those with wide threads. Below are two 46-37 mm step-down rings; they are the same except for the threads. 46-43 mm, 43-37 mm, 37-30 mm and 30-26 mm step-down rings mounted as described: *Edit: I noticed that sometimes narrow-thread adapters allow reverse-mounting. The filters putty-mounted on the 30-26 mm step-down rings. From left to right: 310 nm bandpass filter (stack of two), BrightLine 340/26 and BrightLine 387/11. The BrightLine filters are "lipstick mirrors", as Andrea would say, and that makes dust very visible (not in images, luckily). The 310 nm filter has an orange side and a gray side. The orange side is UV-yellow and IR-orange. 310 nm filter, orange side: My Soligor, according to Ulf's data, only transmits 1% at 310 nm, which means I have to use very long exposure times at very high ISO, and so my images are noisy at 310 nm. These two are my first attempts at UV TriColour: Camera: Full-spectrum Canon EOS M Lens: Soligor 35 mm f/3.5 Red: BrightLine 387/11 filter + Chinese BG39 (2 mm); Green: BrightLine 340/26 filter + ZWB1 (2*2 mm); Blue: Chinese 310 nm bandpass filter*2. You can notice a dichroic discoloration and a very noisy blue channel. Magnifying glass: Wooden house: UVB photography The blue channel in the photos above is in the UVB band at 310 nm. Since this wavelength is recorded basically in the green channel only, any signal in the red and blue channels is noise, and by keeping the green channel only I can cut some of the noise (a minimal sketch of this green-channel extraction follows at the end of this post). This is not enough to have good images, though. Some UVB photos at 310 nm. Lens (Soligor) at f/3.5 or f/4. I took the green channel only. ISO 25600, 60 s exposure, brightened by 50% in Windows Photo editor ISO 25600, 30 s exposure ISO 12800, 1/30 s exposure. This is the Sun. I don't know what causes that weird reflection effect. ISO 25600, 30 s exposure. Here I also tried to improve the image by adjusting the contrast and the brightness. Comments Although the Soligor makes it possible to take 310 nm photos, it is at the limit of what's possible. A better lens (such as a quartz lens or a lens transmitting significantly more than 1% at 310 nm) would help immensely.
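
A one-line version of the green-channel extraction described above; the filename is a placeholder for a developed 310 nm frame, and this is only a sketch of the idea rather than the author's exact workflow.

```python
# Keep only the green channel of a developed 310 nm frame: the UVB signal
# lands almost entirely there, so the red and blue channels mostly add noise.
from PIL import Image

green_only = Image.open("uvb310_developed.png").convert("RGB").split()[1]   # (R, G, B) -> G
green_only.save("uvb310_green.png")
```
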
  21. Today we went to Lake Como and of course I took a lot of images, in visible and invisible light. I made some TriColours with infrared in the red channel, visible in the green channel and UV in the blue channel. I still have to improve the technique, but the results are not too bad in my opinion. Camera: full-spectrum Canon EOS M Lens: Soligor 35 mm f/3.5 Filters: UV: ZWB2 (2 mm) + Chinese BG39 (2 mm); VIS: Chinese BG39 (2 mm) (only); IR: Hoya R72; All images taken at f/8 and ISO 100. The channels were obtained by converting the whole images to B&W. Except for the second image, all images are the "raw" stacks, with no WB applied. I did apply it in Photo Ninja in the second image by clicking the base of the column at the right. Normalizing the exposure times to the visible image, these are the relative exposures required: UV: ~200-250; VIS (+UV*): 1; IR: 2-2.6. *The BG39 filter used alone passes visible light as well as UV. The resulting image is almost completely VIS-only, but the sky does have a slight violet tint.
  22. Foot, B. (2020) Campanula persicifolia L. (Campanulaceae) Peach-leaved Bellflower. Flowers photographed in ultraviolet and visible light. Also with a UV stereo anaglyph and a TriColour RGB stack. https://www.ultravio...er-uv-anaglyph/ Location: Date: 30 June 2019 Reference: 1. Wikipedia (29 June 2021) Campanula persicifolia. Wikimedia Foundation, San Francisco, CA. 2. Fitter, R., Fitter, A., Blamey, M. (1996) Wild Flowers of Britain and Northern Europe, 5th Ed. Peach-leaved Bellflower, page 238. HarperCollins Publishers, London, U.K. Visible Light Ultraviolet Light (Baader-U UV-pass filter, flash): UV Stereo Anaglyph UV TriColour Blue Channel = 315nm CWL, Green Channel = 345nm CWL, Red Channel = 380nm CWL
  23. So, for a while, I was thinking of the cheapest, most practical way to start making NIR trichrome images. I could either go about it by buying three LED light sources at different wavelengths, or by buying three different bandpass filters. I have found options for both, but I'm not sure how practical either of them is. Obviously the bandpass solution would be perfect, but those filters are cheap options on eBay, so I don't know whether they'll be A) optically uniform and B) free of leaks at other wavelengths. The name of the store implies they're made for lasers, so I suppose optical uniformity is a thing they would want to deliver to keep the beam properly collimated? https://www.ebay.com/itm/133205598912?ssPageName=STRK%3AMEBIDX%3AIT&_trksid=p2060353.m1438.l2649 https://www.ebay.com/itm/133202534997?ssPageName=STRK%3AMEBIDX%3AIT&_trksid=p2060353.m1438.l2649 https://www.ebay.com/itm/133202547786?ssPageName=STRK%3AMEBIDX%3AIT&_trksid=p2060353.m1438.l2649 https://www.ebay.com/itm/133238203848?ssPageName=STRK%3AMEBIDX%3AIT&_trksid=p2060353.m1438.l2649 Has anyone tried those? If you can confirm they are reasonably OK, I'm buying them right away; I would putty-mount them on my Industar 50-2, which should be alright for the job with its tiny front element. Another option would be buying different light sources and waiting for night to take three pictures with different illumination; I have thought of these options. They're quite expensive, though, and I also wonder which option would be better for the blue channel, 660nm or 730nm? There are also 1 W versions of those bulbs, but I suspect 1 W is too weak, though I don't own any 1 W lights so I don't know. Anyone own anything like that? Can I expect to illuminate larger portions of the room with such a light, given that I use long exposures? And how long should I expect them to be? https://www.ebay.com/itm/254155708741?ssPageName=STRK%3AMEBIDX%3AIT&var=553505138524&_trksid=p2060353.m1438.l2649 https://www.ebay.com/itm/253563323272?ssPageName=STRK%3AMEBIDX%3AIT&var=553154114221&_trksid=p2060353.m1438.l2649 https://www.ebay.com/itm/253563323272?ssPageName=STRK%3AMEBIDX%3AIT&var=553154114249&_trksid=p2060353.m1438.l2649 https://www.ebay.com/itm/253563323272?ssPageName=STRK%3AMEBIDX%3AIT&var=553154114227&_trksid=p2060353.m1438.l2649 And there's a kind of bonus question I have: what is the ideal budget dual-bandpass filter to emulate Aerochrome? Ideally from Tangsinuo, as their filters are the best price-wise. I have a QB29 which looks nice, but not quite like Aerochrome.
  24. EDITs on 17 June 2021. If those of you who are interested would kindly contribute a working description of the terms multispectral image and tricolor image, I would greatly appreciate that. I want to get consensus descriptions nailed down so that we can refine tags, Stickies, tutorials and our general discussions such that everyone understands what we are talking about. Here is what I was thinking, but I am entirely open to changing my own thoughts about these descriptions. In my descriptions I am making the assumption that everyone knows what an RGB channel stack is!!!

     DESCRIPTION: A Tricolor Image is an RGB channel stack of 3 images where each image is made from a different sub-region within one of the following broad intervals. Typically three narrowband filters are used to photograph the subject. Ideally, the narrowband filters do not overlap or overlap very little.
     Ultraviolet [10-400 nm]
     Visible [400-700 nm]
     Infrared [700-10⁴ nm]
     The Tricolor goal is to assign visible colors to photos made with "invisible" wavelengths. The selection of sub-regions may of course be restricted to smaller intervals within these larger categories. Tricolor Example: Red [300-320 nm] + Green [330-350 nm] + Blue [360-380 nm] where each filter is a narrow 20 nm UV-bandpass.

     DESCRIPTION: A Multispectral Image is typically thought of as an RGB channel stack of 3 images - UV, Visible and IR - where each image is made within the following broad intervals using either a broadband or narrowband filter. Other combinations can also be considered multispectral, such as the UV, Vis and Vis example below.
     Ultraviolet [10-400 nm]
     Visible [400-700 nm]
     Infrared [700-10⁴ nm]
     The Multispectral goal is to produce an image which might represent the outcome of photographing a subject in mixed light, or to make an image which emulates the way an animal or insect might see. Multispectral Examples: Red-UV + Green-Vis + Blue-IR, where each image is made under a broadband filter such as the BaaderU or an RG780. Red (entire UV image) + Blue (Visible Green Channel) + Green (Visible Blue Channel). Mix it up and see what happens.

     The 7 Electromagnetic Wavebands: Nobody can quite agree on the endpoints, so please don't let this bother you. This is not Science Class. There won't be a pop quiz. For example, Wikipedia likes to start Infrared at 750 nm. But any UV/IR photographer would start it at 700 nm because wavelengths in 700-750 nm contaminate Visible (and UV) photos.
     Gamma Rays
     X-Rays
     Ultraviolet [10-400 nm]
     Visible [400-700 nm]
     Infrared [700-10⁴ nm]
     Microwaves
     Radio

     Some Sub-Wavebands: Same warning - don't sweat the endpoints. This is just a bit of guidance for general discussion.
     Near UV [300-400 nm]
     Middle UV [200-300 nm]
     Far UV [122-200 nm]
     Extreme UV [10-121 nm]
     UV-A [315-400 nm]
     UV-B [280-315 nm]
     UV-C [100-280 nm]
     Near IR (NIR) [700-1400 nm] - For the typical reflected IR photographer I think we would use 800-1100 nm.
     Shortwave IR (SWIR) [1400-3000 nm]
     Mid-wavelength IR (MWIR) [3000-8000 nm]
     Longwave IR (LWIR) [8000-15000 nm]

     NOTES: 1) Neither Tricolor nor Multispectral imagery is new. It's been done since the film (both still and movie) days. It is definitely much easier to make such images these days using digital files. 2) Yes, certain single dual bandpass filters or really wide filters might produce results which could be considered either Tricolor or Multispectral. 
Some light from such a filter is likely going to "contaminate" some other light, so the result may vary from what can be obtained with an RGB stack. Besides which, where's the fun?
  25. Today I tried my ZWB2 filter alone to take some UV/IR photos, and I also took photos of the solar spectrum with a diffraction grating. The equipment is the usual: full-spectrum Canon EOS M and Soligor 35 mm f/3.5, with the image white balanced in-camera. Here's one: You can see IR on the left, and UV on the right, with the 400-700 nm gap in between. But look closer at the IR section: There is an approximate RGB "rainbow", with blue for the longest wavelengths, green for the medium ones and orange for the shortest. Although this is only the IR band passed by the ZWB2 filter (UG1/U-360 equivalent, so about 700-800 nm), can we in some way separate those RGB components and use this technique to take one-shot TriColour IR images in the 700-800 nm band?