UltravioletPhotography


Showing results for tags 'Video'.


Found 22 results

  1. The TriColour technique is probably my favourite way of representing false color in invisible-light photos (UV, IR, and in theory any other band of the EM spectrum). False colors will always be false, but I feel this technique makes "truer" false colors, as there is a logical meaning behind them. Traditionally, this is done by taking three photos of the subject at three different wavelengths. The images must be superimposable: the subject must stay still, the lighting must stay the same, and the images must be taken from the same point of view, otherwise color fringing will occur. This is fine when those conditions are met, and I have taken some images this way with little to essentially no color defects.

For video, however, things are very different. Normal color sensors have subpixels for red, green and blue, and capture every frame in all spectral bands at the same time. Outside the visible spectrum, where such sensors are not available, other strategies are needed.

Method 1 (naive method)
The most obvious approach is to use three cameras, mounted as close together as possible, with three filters on their lenses, taking frames at the same time. This would work fine for far-away subjects, but at close distances parallax would be obvious.
Pros:
- simple to implement.
Cons:
- needs three sensors and three lenses;
- parallax at close distances.

Method 2: filter wheel
I discussed this idea here: a spinning filter wheel is placed in front of the lens, and the setup is timed so that the sensor takes a frame each time a new filter is in place. If done quickly enough, this could allow TriColour video. The problems are that fringing would be visible on fast-moving objects, and that the sensor has different sensitivities at different wavelengths, so ND filters might be needed.
Pros:
- only one sensor and one lens are needed.
Cons:
- difficult to build (the sensor and the filters must be synchronized);
- the lens must be corrected for chromatic aberration.
Method 3: dichroic mirrors
To take three images simultaneously at three different wavelengths from the same point of view, dichroic mirrors can be used. They reflect certain wavelengths and transmit others, essentially splitting the image. The biggest downside is that the lens must be either telephoto or strongly retrofocus in design, as the image plane cannot be close to the rear element. As for the retrofocus lens, here's a very raw attempt, at f/8:
Pros:
- allows true simultaneous images without parallax;
- chromatic aberration can be corrected by adjusting the individual sensors.
Cons:
- requires three sensors;
- for wide-angle images, the lens must be strongly retrofocus, which makes it difficult to design;
- dichroic mirrors for UV are not easily available (maybe interference filters at 45° could be used, although they are usually designed for near-perpendicular beams).
A similar technique has been successfully used here.

Method 4: dichroic mirrors with image screen
This is a possible improvement of the previous method. The camera is the same as before, but the image is first projected onto a screen by a first lens, and the screen is then imaged by a second lens of longer focal length. This way no retrofocus lens is needed. To increase the brightness of the image, the screen could be made with a microlens array or a Fresnel lens, although I doubt it would work much better. Something similar was used by Andy for his early SWIR experiments: https://www.ultravioletphotography.com/content/index.php?/topic/2112-swir-camera-setup-and-some-pics
Pros:
- allows true simultaneous images without parallax;
- doesn't need a retrofocus lens.
Cons:
- requires two lenses;
- sensitivity is likely lower than in the previous method, a problem especially for UVB;
- the first lens must be corrected for chromatic aberration.
To connect multiple sensors, I think a Raspberry Pi or similar could be used.
I had other more exotic ideas (like using phosphors excited by different wavelengths), but I don't think they could be practically built. I think method #3 is the most reasonable.
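Whichever capture method is used, the combining step is the same: register the three monochrome frames and stack them as RGB channels. A minimal sketch in Python/NumPy; the band-to-channel mapping and the per-channel normalisation (standing in for the ND-filter balancing mentioned above) are my assumptions:

```python
import numpy as np

def tricolour(long_band, mid_band, short_band):
    """Stack three registered monochrome frames (2-D arrays of equal
    shape) into one false-colour RGB image: longest wavelength -> red,
    middle -> green, shortest -> blue."""
    rgb = np.dstack([long_band, mid_band, short_band]).astype(np.float64)
    # Normalise each channel so differing band sensitivities are equalised.
    for c in range(3):
        ch = rgb[..., c]
        span = ch.max() - ch.min()
        if span > 0:
            rgb[..., c] = (ch - ch.min()) / span
    return rgb

# Tiny synthetic example: a 2x2 "subject" seen in three bands.
a = np.array([[0.2, 0.8], [0.4, 0.6]])
b = np.array([[0.1, 0.9], [0.3, 0.7]])
c = np.array([[0.5, 0.5], [0.2, 0.8]])
img = tricolour(a, b, c)
print(img.shape)  # (2, 2, 3)
```

The same function serves stills and video alike: for video, apply it frame triple by frame triple.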
  2. This YouTube channel (https://www.youtube.com/@invisiblerays) has some videos on UV/IR photography. The latest uploads show footage taken with Universe Kogaku lenses. These lenses have already been discussed on UVP, for example here: https://www.ultravioletphotography.com/content/index.php?/topic/4784-uv8040bk2-does-it-make-sense They are not chromatically corrected, as they use only one type of glass (fused silica), so they need to be refocused when changing wavelength. From the videos below, image quality looks pretty good, and there is not much visible chromatic aberration (I can't see any). They offer lenses from 6 mm to 105 mm, but only the lenses from 35 mm up cover an APS-C sensor (and strictly speaking not even those, as the 22 mm image-circle diameter matches the side, not the diagonal, of a Canon APS-C sensor, though they may cover a few extra millimetres). Two videos: https://youtu.be/6Rok76Quulk https://youtu.be/NvtFJsU8bTM
  3. Browsing YouTube, I found this video: https://youtu.be/TlsS3rUjg8c [the site is acting weird, at least on my PC, so I cannot embed the link] The uploader warns not to watch the video as it is boring, and I haven't watched the lens-building part. What I find particularly interesting is the beginning of the video. It shows a modern f/0.95 lens next to an older f/4 enlarger lens in UV. The enlarger lens doesn't show any false color, while the modern lens has a lavender tint, as expected. But it actually doesn't look too bad, and it seems to transmit a surprising amount of light. Some f/0.95 lenses could be interesting for UV photography if deep reach is not a priority.
  4. Hi, I have had this idea for a while. I'm sure it isn't anything new, but I wanted to share it. Using a camera with a high frame rate (at least 120 fps) and a rotating filter wheel, it could be possible, at least in theory, to make a TriColour video. For example, with 4 filters covering three wavelength bands (one band doubled, ideally the one at which the camera is least sensitive), 120 fps would allow taking images at three different wavelengths 30 times per second, and some digital processing would then yield a color video at 30 fps. For UV especially, the camera had better be monochrome, for sensitivity reasons. If the blue channel is chosen to be UVB, taking a UVB photo at 1/120 s exposure could perhaps be possible with a fast lens and high ISO. See Lukas' post for UVB video: it's doable with a monochrome camera, and I strongly doubt you can do it with a color camera, unless you are recording the Sun. The resulting video would probably suffer from color fringing on fast-moving objects, and there is also the problem of differing sensitivity at different wavelengths (one solution would be ND filters for the "brightest" bands). I don't have such a camera and I won't try this anytime soon, if ever. Has anyone tried this, or is going to?
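The demultiplexing step can be sketched in a few lines. Assuming a 4-slot wheel with the doubled filter being UV-B and a slot order I made up, one second of a 120 fps monochrome stream splits into per-band streams like this:

```python
# Demultiplex a 120 fps frame stream shot through a 4-slot filter
# wheel into three channel streams. Slot order is an assumption:
# UV-B, green, red, UV-B again (the doubled, least-sensitive band).
def demux(frames, pattern=("uvb", "green", "red", "uvb2")):
    channels = {"uvb": [], "green": [], "red": []}
    for i, frame in enumerate(frames):
        slot = pattern[i % len(pattern)]
        channels[slot.rstrip("2")].append(frame)  # "uvb2" -> "uvb"
    return channels

frames = list(range(120))  # stand-in for one second of frames
ch = demux(frames)
print(len(ch["uvb"]), len(ch["green"]), len(ch["red"]))  # 60 30 30
```

The doubled band ends up with 60 frames per second, which can be averaged in pairs before assembling each 30 fps colour frame.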
  5. Here's a video I shot a few months ago when I first got my Samsung Galaxy Z-Flip3 smartphone. I held a 760nm cut-on filter over the lens and was surprised how well it turned out. I finally created a Vimeo account to post the occasional video here, so this one is my first attempt. I didn't make any adjustments to the camera settings; it automatically pops into night mode when the scene gets dark enough. I only edited the video to reduce the bitrate so the file size would be better for online. There is a prominent hotspot in the IR portion of the video. If you're wondering why the foreground grass looks dark in IR, it's fake grass, some sort of vinyl plastic. Here are a few stills taken at the same location recently using the same smartphone and filter. I monochromed them and reduced them to 2000 pixels for posting here. The last one was taken in the shade and could barely be handheld due to a slow exposure of 1/17 second. The quality isn't superb, but remember these were taken with a lens smaller than the button on your shirt. The gear: the little 27mm filter is about the right size to allow a wide-angle shot from the phonecam's tiny lens. 720 and 760nm filters worked best; 800nm was getting too dark for handheld shots. The IR only works in bright daylight, but the little filter makes a nice accessory for this smartphone.
  6. I was thinking about germanium being transparent in IR, and I remembered some videos on this posted on this forum: https://www.ultravioletphotography.com/content/index.php?/topic/4782-hellow-from-poland-minsk-mazowiecki/&do=findComment&comment=48466 Or imagine how this big block of germanium would look in IR: it looks like a metal but breaks like glass (actually gallium, a real metal, also breaks with a conchoidal fracture). I searched on YouTube and found this video: I find it interesting because he shows a variety of materials in visible light and IR (I guess LWIR), including fused silica, calcium fluoride, sapphire, germanium and zinc selenide, as well as some metals (lead, tin, bismuth, zinc and silver, plus the non-metal carbon as graphite). As expected, some materials (windows in this case) are transparent only in visible light (for example fused silica), some only in IR (germanium), and some in both (calcium fluoride).

Another interesting thing he notes is how the surfaces of the metals (and graphite) appear more reflective in IR, as if they were more polished. I believe this is due to the longer wavelengths (20 times longer than visible light if he used a LWIR camera, as I think he did), which make the surface smoother relative to the wavelength. This is also why the ground looks like a mirror in this 90 GHz microwave image: And a similar effect happens in the THz band: In that post I said that the rough side of tinfoil should look smoother in LWIR, and Andy replied that the low resolution of his camera wouldn't let the difference be seen. I actually think the difference can be seen, perhaps under favourable conditions (or using other materials). UV does the opposite: I remember an image Mark once posted where you could see the scratches on a metal bowl better in UV, but his images are no longer here.
This makes me think that, in theory, the same effect should be visible with ordinary visible light, with surfaces appearing slightly smoother under red light than under blue light. Probably the reason we don't notice this every day is that the difference is too small.
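This wavelength dependence of apparent smoothness can be quantified with the Rayleigh roughness criterion: a surface reflects specularly when its height variations stay below roughly λ/(8 cos θ). A small sketch (the band wavelengths are illustrative picks, not from the videos above):

```python
import math

def max_smooth_roughness_nm(wavelength_nm, incidence_deg=0.0):
    """Rayleigh criterion: a surface acts as a mirror when its height
    variations stay below lambda / (8 * cos(theta))."""
    return wavelength_nm / (8 * math.cos(math.radians(incidence_deg)))

bands = [("UV 365 nm", 365), ("blue 450 nm", 450), ("red 650 nm", 650),
         ("LWIR 10 um", 10_000), ("90 GHz microwave", 3_330_000)]
for name, wl in bands:
    limit = max_smooth_roughness_nm(wl)
    print(f"{name}: looks specular below ~{limit:.0f} nm roughness")
```

The jump from tens of nanometres in UV to hundreds of micrometres at 90 GHz is why ground that looks matte to the eye becomes a mirror in the microwave image.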
  7. I did a short video of clouds using the EL-Nikkor 105mm lens with a DB850 + Tiffen #12 stack. Hue was rotated 45 degrees to make the sky blue, and contrast was adjusted. Speed has been increased by a factor of 10 relative to real time.
  8. SAFETY WARNING: UV-C is dangerous to your eyes and your skin. UVP DOES NOT SUPPORT USING UV-C ILLUMINATION. [UV SAFETY] UV-C Light Dangers A short YouTube video that includes a 254 nm clip: https://youtu.be/CaXzgumCn34. The gear they use is interesting.
  9. The new forum software has more video capability, so you might like to test that. How a particular type of video is viewed might also depend on your browser. If the video won't run in one browser, perhaps try another. I'm going to test an upload of Birna's Poppy Video. She filmed the poppy flashing its iridescence in the wind on a bright sunny day. I viewed this mp4 video in Safari with no problem. Rørslett, B. (2021) Eschscholzia californica Cham. (Papaveraceae) California Poppy. Ultraviolet video. Gear: Panasonic Lumix H-2 (full spectrum) + Coastal 60/4.0 + BaaderU UV-Pass Filter in Sunlight poppyVideo.mp4
  10. By rights, this topic doesn't really belong on UVP, which is why I'm sticking it in the chat room, but I have a feeling many people here will be interested in it, because it's about another kind of invisible imaging. I've been working for several years with a Ph.D. student in Florida who is building a large-scale wind tunnel for imitating thunderstorm downbursts over models of medium-sized buildings. The problem he has is how to visualize the flow over these models in a wind tunnel the size of a small warehouse. Currently he uses a smoke-based system, but it occurred to me that another way would be to put pieces of black tape at various places on the model, heat them with spotlights or small lasers, and then visualize the refractive-index changes in the air as it flows.

The imaging of refractive-index gradients has a long history in fluid mechanics and is known as Schlieren imaging. It is usually done with large parabolic mirrors, but a new kind of computational Schlieren called Background-Oriented Schlieren (BOS) has recently come of age, and it is extremely simple to do, as I will demonstrate below. The basic idea is that you take the warm object that is making the hot air currents and place it in front of a large screen of randomly placed dots. Small changes in refractive index caused by the warmed air make the dots "dance" against the background. Before placing the warm object in the scene, you take a "tare" image of just the dot screen, and then special programs use image correlations to determine how far each background dot has moved. The final output is shown as a grayscale image where the amount of left-right movement of the dots is coded as a shade of gray (or sometimes a false color). The setup is shown below, both as a schematic from a paper by Gary Settles, "Smartphone schlieren and shadowgraph imaging," and in my kitchen.
My setup: Note that the screen I used is actually much too small, which I knew ahead of time, but this was intended only as a proof of concept. I wanted to know how easy it was, and how practical it would be to build a setup in a large wind tunnel. I also wanted to demonstrate the concepts for my student and show him how to do the processing.

My tare image (actually an average of 30 aligned images, to reduce noise) looked like this: A second image with the candle lit looked like this. No movement of the background is obvious, and it couldn't be seen with the naked eye either. At this point I was very nervous that it wouldn't work! After processing the images, though, the airflow popped right out! Images were processed in MATLAB using a freeware program called PIVLab. Places where I had no dot screen, or that had very little texture for PIVLab to detect, came out noisy. I have also Photoshopped the candle and bowl back into the photo, which is standard procedure in BOS imaging.

Here are some more examples: I do believe this is the first time THE AIR ITSELF has been imaged on UVP! I also took thermal photos, but what you see there is not the air but the soot from the candle flame (since gases don't emit blackbody radiation). And before Stefano points it out: yes, I COULD have visualized the CO2 absorption in MWIR with the other camera, but at the moment I don't have a way to support that 7 kg camera in my kitchen. https://vimeo.com/535737212
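The dot-tracking step can be illustrated with simple FFT phase correlation, which recovers how far the dot pattern shifted between the tare and distorted images. (PIVLab does this per interrogation window with subpixel refinement; this sketch is only the integer-pixel core of the idea, on a synthetic dot screen.)

```python
import numpy as np

def dot_shift(tare, distorted):
    """Estimate the integer-pixel shift of the dot pattern between a
    tare image and a distorted image via FFT phase correlation."""
    f = np.conj(np.fft.fft2(tare)) * np.fft.fft2(distorted)
    corr = np.fft.ifft2(f / (np.abs(f) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the image back to negative values.
    if dy > tare.shape[0] // 2: dy -= tare.shape[0]
    if dx > tare.shape[1] // 2: dx -= tare.shape[1]
    return dy, dx

# Synthetic check: a random dot screen shifted by (2, -3) pixels.
rng = np.random.default_rng(0)
screen = rng.random((64, 64))
shifted = np.roll(screen, (2, -3), axis=(0, 1))
print(dot_shift(screen, shifted))  # (2, -3)
```

Running this over a grid of small windows instead of the whole frame gives the displacement field that BOS renders as shades of gray.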
  11. A week ago, Stefano PMed me with the title "MWIR camera at 'affordable' price?" I admit I was skeptical, but the skepticism turned to amazement as I read the eBay listing. For sale was an Agema 470 Pro, at a "Buy It Now" price of $650, or best offer. The camera is of the HgCdTe (or MCT) type, which means it has a single pixel and a high-speed rotating mirror that directs light onto the sensor, which is cooled via the Peltier effect to -80C or so. Effective resolution is 100x140 pixels. The sensitivity range is 2-5 microns, going from the long end of SWIR into the mid-MWIR. From 5-8 microns the air is absorbing, so no cameras are available in that range currently (nor likely ever to be). Beyond 8 microns is the usual LWIR window where my other thermal cameras work. The seller had posted pictures of the camera operating, and a power supply was easily available, so I thought: why not? So I got the camera.

The camera is extremely large and heavy: it is about 50 cm (20 inches) long and weighs 7 kg (15.4 lb). It has a monochrome viewfinder and a 3.5" floppy drive for storage. I do not have any floppy disks, alas. It took a few days to acquire a power supply and a light source. I bought a "Deep Heat Projector" from Arcadia Reptile. Arcadia had this to say when I asked about the spectrum last year in reference to my TriWave: I didn't buy it last year (didn't get around to it), but with the MWIR camera it was too handy to resist: a better SWIR light, and one that works for the short end of the MWIR. So I got that along with a socket for it.

Today all the stuff arrived and I put it together. The camera makes a revving-up noise like a jet engine: a slowly building whirrrrrrRRRRRR!R!R!R!R!!!!!! as the mirror spins faster and faster. The electronics turn on and a boot-up screen appears, showing that the software dates to May 1, 1989. The camera originally came with a variety of lenses, so the lens on the front is detachable.
The one it came with is a 20 deg FOV lens (in IR, camera lenses are described by field of view rather than focal length). With the lens removed, there is another (concave) lens behind it, and according to the ancient manual, which is still available from FLIR's website since FLIR bought Agema eons ago, you can use the camera in macro mode if you leave the outer lens off. Showing how highlights on my hand vanish when the reptile light is removed from my hand: https://youtube.com/Kz6nH_u0Teo Showing teeth changing temperature as I breathe: https://youtube.com/0exOl11PJSk
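For anyone used to focal lengths, a FOV spec converts via f = w / (2 tan(FOV/2)), where w is the image (or scan) width. The 10 mm width below is a made-up number just to show the arithmetic:

```python
import math

def focal_length_mm(fov_deg, image_width_mm):
    """Equivalent focal length for a given horizontal field of view
    and image width: f = w / (2 * tan(fov / 2))."""
    return image_width_mm / (2 * math.tan(math.radians(fov_deg) / 2))

# Hypothetical: a 20 degree FOV over a 10 mm scan width.
print(round(focal_length_mm(20, 10), 1))  # 28.4
```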
  12. Hi, I am new to UV photography and looking forward to learning a lot about it in this forum! I am very interested in using UV photography to create awareness about sun protection: to help visualize sun protection for children at an early age (and sun damage for adults). I am especially interested in low-cost options for UV photography that will function in normal sunlight without any additional light sources. My first attempt at buying a UV webcam (Model: XNiteUSB2S-MUV) was definitely not successful, so I hope that with this forum and the wealth of information provided by its members, I will be able to make a more informed decision on purchasing the right equipment.
  13. Greetings from Mount Vernon, Washington, USA! I am grateful to have been introduced and accepted into your community! I am interested in all things light and its use in application. My education is BSEET/ABET from DeVry Institute of Technology, Phoenix 1999. I have eighteen years of engineering and technician experience in the following industries: Industrial Nd:YAG lamp and diode pumped laser markers from 3-100 W at IR/Green wavelengths of 1064nm and 532nm (rofin) Semiconductor stepper & scanner photolithography at UV/DUV wavelengths of 365nm, 248nm, 193nm, and 157nm (ASML USA/Veldhoven at Motorola, TSMC, Texas Instruments, Micron, and IBM) Aerospace metrology laser and radar trackers (Janicki Industries) Commercial HVACR R&D full life cycle product development testing for manufacturing (Legend Brands/Dri-Eaz Products) I look forward to working with you to progress the art and science of Ultraviolet Photography! Create a Great Day! Aaron
  14. Ockertfc

    CA Hello

    Hello All, I am excited to join this group and explore the world of UV photography. My interests are in "insect vision", or simulated insect vision. I work in the flower industry and am starting a little side project to learn more about the unseen variation in our crops. Other than being a nerd, I have no experience in UV photography, and I am planning on using this as a learning platform and hopefully, in the near future, sharing some images.
  15. After a great deal of work, I have figured out how to access the video feed from my TriWave in MATLAB, and I've written a program to capture time lapses and process the resulting frames. Each frame of video is actually 120 frames from the camera (4 seconds' worth) averaged together, and this is repeated once a minute. This video took 90 minutes in real life but lasts 13 seconds, about 415 times faster than reality. Lens: Wollensak Velostigmat 25mm/1.4. Filter: Thorlabs FB1450-12 (1450nm, 12nm FWHM). https://youtube.com/dEgNbJj6NWU YouTube's compression really degrades it. ---- Update: I repeated the experiment at 1250nm. For some reason the TriWave is significantly less sensitive at 1250nm. (Or does my lens have lower transmission than at 1450nm? Unlikely.) I don't THINK it's the filter, since peak transmission is around 50% for the 1250nm filter but around 30% for the 1450nm one, and the bandwidths are almost the same. ANYway, excuse my noise. Filter: Thorlabs FB1250-10. https://youtube.com/Br3hr_6Z7Tc
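The per-minute averaging step can be sketched in a few lines (Python/NumPy here rather than the MATLAB actually used; the noisy frame source is a stand-in). Averaging n frames cuts random noise by roughly sqrt(n):

```python
import numpy as np

def timelapse_frame(grab, n=120):
    """Average n consecutive frames from a grab() callable into one
    low-noise time-lapse frame."""
    return np.mean([grab() for _ in range(n)], axis=0)

# Stand-in frame source: a flat 8x8 scene plus Gaussian read noise.
rng = np.random.default_rng(1)
scene = np.full((8, 8), 100.0)
frame = timelapse_frame(lambda: scene + rng.normal(0, 10, scene.shape))
print(frame.shape)  # (8, 8)
```

Repeating this once a minute for 90 minutes and playing the 90 resulting frames back as video gives the ~415x speed-up described above (90 min / 13 s).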
  16. Friday was another flyday! Continuing on from Cadmium's post (https://www.ultravio...v-with-a-drone/) and my first flight with the UV filter (https://www.ultravio...a-drone-part-2/), I managed to get some more favourable flight conditions this past weekend. What's odd is that my focusing seemed better in the first video ( ) but was off for most of this recent flight, while the gimbal was struggling in the first flight but not this time. One day I am going to find a happy medium. The difference this time is that I took off with the gimbal facing straight down. You can see from the treetops that I had a moderate breeze in both flights, so I think pointing the gimbal down during take-off helped stabilise it, as I was able to move it up and down without an issue once I was up in the air. This time I encoded the video at 1080p, so that you can see the degree of the focusing issues. Again, I put a Lumetri monochrome filter on the main part of the video: There's a very short segment at the end (without the Lumetri filter) where I was experimenting with the drone's built-in "Quick Shot" modes (one of those modes is called a "Dronie", which is a selfie with a drone). It's supposed to move away from the subject, climbing to a preset altitude, but I had many problems where it lost its lock on me. I had the same problem when attempting the "Helix" Quick Shot of the playground (not shown in this video). I am not sure whether the software uses colour to recognise the locked object or is purely driven by contrast, but it also failed when I put the Kolari Vision 850nm IR filter on. It might be down to a lack of sharp focus this time around; the Quick Shots only seem to work in the visible light range. Anyhow, for what it's worth, I included the Dronie attempt right at the end of the video. I still think this filter has potential, but I definitely need to play with it some more. Perhaps increase the ISO and lower the shutter speed a little.
  17. Following on from Cadmium's earlier post (https://www.ultravio...v-with-a-drone/), I managed to squeeze in a quick test flight today, using the filter that he had put together. Scattered thunderstorms were in the forecast for this weekend, so this was a rushed test at mid-day, and conditions were not optimal: * Steady breeze, not too strong, but very noticeable at higher altitudes (you can see this in the movement of the trees in the video). * Partly cloudy. I had my DJI Mavic 2 Pro (M2P) drone full-spectrum converted by Omni View Tech, a local DJI retailer and repair shop. I also purchased the Kolari Vision drone filter pack (6 standard) plus the M2P versions of the IRChrome and NDVI filters. White-balancing options are somewhat limited on the M2P. You can use a handful of presets and also set a specific temperature, but you can't CWB on a target, although it can be done in post. The M2P can shoot 4K H.265 DLOG for video and the standard DNG raw format for stills, so processing options are plentiful. Initial tests for UV sensitivity yielded some promising results. I took a quick shot from my balcony, hand-holding the drone (as I had no time to fly it yesterday, when I had just got my drone back from conversion). Here's yesterday's shot - an SOOC resized jpeg, taken at f/2.8, ISO 320 and 1/30sec, using the "incandescent" WB setting. I think the gimbal did well to stabilise my shot: Here's the test video from today's quick flight. I added a monochrome Lumetri filter in Premiere Pro and posted this as unlisted on my semi-abandoned YouTube channel, downsizing it to 480p in the process: When the drone was sitting on my table, it seemed to have no problem coping with the 7.5g filter (I was able to move the gimbal up and down without an issue), but up in the air the gimbal really struggled to stabilise the video, both when stationary and on the uplift, and it seemed to have issues moving the camera up and down.
When I switched out the UV filter for the KV NDVI filter (much lighter) I had no issues with stabilisation or gimbal movements. Again, this was only a quick test and in less-than-optimal conditions, so I need to revisit this when the weather permits. I may need to point the gimbal downwards for future flights. The good news is that there is virtually zero vignetting using the filter in both stills and video mode.
  18. Visible (sunshine, Resolve 60mm quartz lens, S8612 1.75mm + DB850 filter, Sony A7S camera, F16 ISO200 1/60") UV-A (sunshine, Resolve 60mm quartz lens, 330WB80 filter, Sony A7S camera, F16 ISO3200 2") UV video of tiny insects or arachnids coming out of center of flower when UV light is shined on it. They would not come out for visible light. Click through to make it big to see them better. They are very very tiny. https://youtube.com/52LwcRVlzF8 Visible (halogen, Resolve 60mm quartz lens, S8612 1.75mm + DB850 filter, Sony A7S camera, F8 ISO80 0.4") UVIVF (Convoy S2+ torch, Resolve 60mm quartz lens, S8612 1.75mm + DB850 filter, Sony A7S camera, F8 ISO1600 15") SWIR (halogen, Wollensak 25mm lens, Thorlabs 1500nm LP filter, Triwave camera, and god only knows how to quantify the exposure for this.) This is a pano. In situ photo for ID help.
  19. Chris Rooney

    Hello from Dubai

    Hi all, thank you for accepting my application, mods. I am a camera and lens engineer at a Dubai rental house. We have been asked to provide equipment for a skincare commercial to show sun damage; I'm sure you all know the type, enhanced blemishes etc. We hope to use a Red Weapon Helium with a modified OLPF (full spectrum), plus UG11 and hot-mirror filters. I have found a manufacturer in China who will make the UG11 in cinema-sized 4 x 5.6" and should start experimenting when they arrive. As for lighting, I was going to try the sun, as it is not in short supply here. Really excited, as it is something I haven't been asked for in 30 years working in this industry. Any advice is much appreciated and welcome. I have done some research on this forum, very informative, thank you. Chris R
  20. Art hat on. Experimenting / playing: Video here. D3200 full-spectrum camera, ASA 800. EL-Nikkor 80 mm f/5.6 (metal) at f/5.6. Hoya U-330 1.5 mm filter plus S8612 2 mm. Processed in Premiere Pro (Neat Video noise-reduction plug-in). Rather dull day, so low light levels, giving noisy images. Rather than trying to white balance, I have been trying other approaches to give a "pleasing" image (can't think of the right word at the moment). Previously I was doing channel swaps (G-B seemed good), but here I have rotated the hue by 90 degrees, which gets away from the magenta look. Initially I found the images to be very noisy; the Neat Video plug-in works well to reduce this. In other experiments I have done, noise has been less of a problem.
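The hue rotation used here can be reproduced per pixel with a round trip through HSV, the same operation the Premiere Pro hue wheel performs. A minimal sketch (the sample magenta-ish pixel value is made up):

```python
import colorsys

def rotate_hue(rgb, degrees):
    """Rotate the hue of one RGB pixel (components in 0-1) by the
    given angle, leaving saturation and value unchanged."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return colorsys.hsv_to_rgb((h + degrees / 360.0) % 1.0, s, v)

# A typical magenta-looking UV pixel pushed 90 degrees around the wheel.
px = tuple(round(c, 2) for c in rotate_hue((0.8, 0.2, 0.8), 90))
print(px)  # (0.8, 0.5, 0.2)
```

Applied to every pixel of every frame, this turns the usual magenta cast toward warmer tones without the channel swap.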
  21. I always wonder what setup they are using for these videos. Would be nice if they actually said so.