
Filter simulator using real world data in software


Daniel Gawędzki

Hello everyone, this post is going to be pretty informal, as I'm in a rush today and don't have time for polish.

For context, I like messing around with filter combinations, especially IR reduction filters in combination with color filters that transmit NIR in the 700-850nm range.

I got an idea a while back, and realized it would be pretty much impossible to execute without some tedium. I called up my optics manufacturer, and next thing I know, I have the following filters:

 

400nm+

450nm+

470nm+

490nm+

535nm+

565nm+

610nm+

650nm+

690nm+

720nm+

760nm+

800nm+

850nm+ *not included in today's test

 

I took an image with each of these filters using a tripod, with the same white balance and exposure settings, converted them all to TIFF, and then placed them into Blender.

[screenshot: the Blender node setup]

 

Although it's not easy to see, the leftmost two columns contain the images, which were kept in an HDR format to prevent data loss, which explains the odd node groups instead of plain image texture nodes. Each image then has the image from the next-higher cut-on filter subtracted from it, producing a band of data that expresses the difference between those two filters. Repeat this process for each image, and now you have a series of bands, each representing a range of light in nm. The image above in particular is the 720-760nm band plus the 760-800nm band, white balanced and set as the material output, as a proof of concept.
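If you'd rather see the subtraction step as code than as nodes, here's a minimal NumPy sketch of the same idea (illustrative only; loading the aligned TIFs into arrays is left out, and the names are made up):

```python
import numpy as np

# cut-on wavelengths of the long-pass filters, low to high
cutons = [400, 450, 470, 490, 535, 565, 610, 650, 690, 720, 760, 800]

# longpass[i] = linear float image (H, W, 3) shot through the
# cutons[i] nm+ filter; loading/alignment is assumed done already
def extract_bands(cutons, longpass):
    bands = {}
    for i in range(len(cutons) - 1):
        lo, hi = cutons[i], cutons[i + 1]
        # e.g. the 720-760nm band = (720nm+ image) - (760nm+ image)
        bands[(lo, hi)] = np.clip(longpass[i] - longpass[i + 1], 0, None)
    return bands
```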

 


 

With this data collection and processing method, it is in principle possible to emulate any and all filters, given small enough bandwidths. With this set of filters I was aiming for 40nm bands; the next set will be 20nm bands, and will give me even more of a headache.
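As a sketch of what the emulation step could look like in code (an assumption about the approach, not my exact node math): weight each band by the target filter's average transmission over that band, then sum.

```python
import numpy as np

# bands comes from extract_bands() above; transmission(nm) is an
# assumed callable returning the target filter's transmission (0..1)
def emulate_filter(bands, transmission, samples=5):
    out = 0.0
    for (lo, hi), band in bands.items():
        # approximate the filter's average transmission over this band
        weight = np.mean([transmission(nm)
                          for nm in np.linspace(lo, hi, samples)])
        out = out + band * weight
    return out  # white balance afterwards, as with a real filter
```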

 

I hope you enjoyed staring at this monstrosity, and I thank you for reading through. 


Interesting general idea, but I was left wondering about the purpose of doing long pass with subtraction (vs. using lots of bandpass filters). Jonathan did a bandpass version of this some time ago.

 

How do you afford all those by the way? That for me is the main issue. 
 

Also, it would be nice to see this method used to emulate a bandpass filter of known bandpass and compare the results directly. We know from previous attempts at subtracting images here that there are a number of "gotchas" you have to be careful of when doing image subtraction, having to do with nonlinearities in the demosaicing stage, especially gamma adjustments that are buried in the raw converter. You should validate your results.

Daniel Gawędzki

I didn't actually run into many issues here. First, I run a small optics company, so getting hold of so many long-pass filters wasn't a problem.

The reason I didn't use bandpass filters is that adding bandpass images together is not the same thing as subtracting long-pass images to produce bands of a given width. Bandpass filters added with each other can leave gaps in coverage, or overlaps, at any given wavelength; by subtracting long-pass filters, we get clean slices of those wavelength ranges. Of course there's still some imprecision in those slices, but they are guaranteed full coverage. It may not be perfect coverage, but it is more than enough to emulate filters to some degree.

I didn't actually hit many gotchas, apart from Blender not being able to load all of those image textures at the same time, so I had to bake some in and then try again. I just got a new graphics card, which should allow me to run this perfectly fine. I didn't really have any mosaicking issues either, aligned in Photoshop or not before being imported into Blender, which may have something to do with the long-pass filters, my lens, and a lack of focus breathing on my lens.

After this post, I was able to get to a point where I could emulate the beloved IR Chrome filter, my own version of it, and a competitor's (all of whose creators reside on this forum). The difference between my filter and Kolari's is minuscule, though, and neither emulation was all that amazing; they just had the right colors.

For the actual capturing and conversion process, I don't believe I did anything that could aggravate any raw issues, as I converted all of my files into 16-bit TIFFs before importing them into Blender. I had an okay but imprecise method of verification: comparing a simulated 720 to the real 720 via difference, which rendered mostly black apart from light edges. I also did this for 400-690, which yielded similar results. I appreciate the interest!
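In code terms, that difference check looks roughly like this (a sketch with made-up variable names, not my actual node setup):

```python
import numpy as np

# Sketch of the verification: the 720-760 and 760-800 bands plus the
# raw 800nm+ capture telescope back to the 720nm+ capture, so their
# difference against the real 720nm+ image should be near-black,
# apart from slight misalignment at edges.
def verify(bands, longpass_800, real_720):
    simulated_720 = bands[(720, 760)] + bands[(760, 800)] + longpass_800
    diff = np.abs(simulated_720 - real_720)
    print("mean abs error:", diff.mean(), "max abs error:", diff.max())
    return diff
```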


May we see the pics so we can judge for ourselves how well this works or not?

 

"I didn't really have any mosaicking issues either, aligned in Photoshop or not before being imported into Blender, which may have something to do with the long-pass filters, my lens, and a lack of focus breathing on my lens."
 

I think you didn’t follow my train of thought. The problem is the gamma correction that is automatically applied by many converters unless you turn it off somehow. You don’t get a linear signal. Twice the light intensity will not be twice the pixel value. Some RAW converters let you turn this off. In order to subtract things properly, you want everything to be linear first. 
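A tiny numeric example of why this matters (a gamma of 2.2 is assumed purely for illustration):

```python
gamma = 2.2
encode = lambda x: x ** (1 / gamma)  # simplified gamma encoding

a, b = 0.8, 0.5                      # linear scene intensities
print(encode(a) - encode(b))         # subtracting encoded values: ~0.17
print(encode(a - b))                 # subtracting linear, then encoding: ~0.58
```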

