Spectral irradiance of standard and UV flash, and a question



As part of my work I wanted to get a feel for the amount of light being emitted by a standard flash and one which has been modified for UVA photography. I wanted to share what I found and hopefully get some advice.

 

So to get a feel for this I ordered an Ocean Optics FX spectrometer. This is one of their newer models, with a fast acquisition time and a burst mode that can capture a lot of spectra in a short space of time. I went with it because I presumed the flash duration would be short, so I wanted a spectrometer that could capture information in the ms range (and wouldn't break the bank). It came with a fibre optic and cosine corrector, and was calibrated by Ocean Optics to enable me to take absolute measurements.

I built a box and painted the inside black so that I could mount the flash and fibre optic cable in a repeatable way and isolate them from the surroundings. Firing the flash and triggering the spectrometer was the interesting part: I set the spectrometer to measure 200 spectra, each 10ms in length, over 2s, started the data capture, and then triggered the flash. Looking at the raw data I got 199 baseline-noise spectra and 1 with the flash in it. So, success - I managed to capture the flash and get some quantitative data from it.

I have two Canon 600EX-RT flash heads, one standard and one modified for UVA imaging by Advanced Camera Services (ACS). I collected spectra from each of these, making sure they were both on the same setting. This is what they looked like.

post-148-0-64725700-1500484701.jpg

 

As expected, the standard, unmodified flash gave light from about 400nm upwards and no UV. The UV-modified one gave a strong peak in the UVA and also more light in the IR region than the unmodified flash (not hugely unexpected, as ACS told me I'd see plenty of IR from the flash because they do not use the same filter material as in their sensor modifications).

 

The questions I have on this:

 

The output of the spectrometer is in µW/cm²/nm. Is this meaningless here? The flash duration will be less than 10ms, so not all of that collection 'bin' will be made up of light from the flash - it is an average of light and dark. While I am comfortable that it is quantitative as a direct comparison between the two flashes, what do the numbers actually mean?

 

Is µW/cm²/nm the right way to plot the data for comparing flash output? Each UV photon carries more energy than an IR one, so does the higher peak in the UV region for the modified flash mean it is emitting more light in the UV than in the IR? Should I consider plotting a 'photon count' (for want of a better term) instead, taking the energy of the photons out of the equation?
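For clarity, the conversion I have in mind is just dividing the spectral irradiance by the photon energy at each wavelength. A minimal sketch in Python, with made-up numbers rather than my flash data:

```python
# Spectral irradiance in uW/cm^2/nm divided by the photon energy
# E = hc/lambda gives a photon rate in photons/s/cm^2/nm.
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s

for wavelength_nm in (365.0, 550.0, 800.0):   # UV, VIS and IR examples
    irradiance_uW = 10.0                      # same power at each wavelength
    photon_energy_J = h * c / (wavelength_nm * 1e-9)
    rate = (irradiance_uW * 1e-6) / photon_energy_J
    print(f"{wavelength_nm:5.0f} nm: {rate:.2e} photons/s/cm^2/nm")
```

Equal µW readings at 365nm and 800nm differ by a factor of about 2.2 in photon count, which is why I wonder whether the energy units are hiding something.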

 

I have become 'data blind' thinking about this too much and would welcome comments/advice.

Link to comment
First of all, thanks for posting this. I love data, especially comparative, quantified data. On first glance at your results I'm immediately struck by what appears to be a huge discrepancy in the total output of each flash. Since you used the same settings (I presume 'Manual'...?) for each test, the filtered flash should have produced considerably lower total output (its area under the curve should be no greater than the stock flash's). If you could normalise the raw data and repost, I think the results might look more as expected. Either way, this does clearly show the effective range of the filter (and a few interesting IR spikes).
Link to comment

The output of the spectrometer is in µW/cm²/nm. Is this meaningless here? The flash duration will be less than 10ms, so not all of that collection 'bin' will be made up of light from the flash - it is an average of light and dark.

 

Outside of the 10ms flash, what else would be in the collection 'bin' if it is otherwise dark? I'm not sure I understand.

 

Mark, if the vis flash and the UV flash have different filters, then this might not be a discrepancy?

This answer depends on whether JMC was using absolute irradiance mode and whether a calibration was performed before starting the measurements.

Link to comment

Mark - no problem. Yes, both settings were the same, although it was not Manual - the flash display showed ETTL. So I shall repeat tomorrow on Manual and see if that makes a difference. Thank you for that; that's why I posted it.

 

Andrea - my concern was this: will the length of the collection bin, relative to the duration of the flash, affect the measured value? Because the reading is in µW there is a time element to it (µW is µJ/s). If the light were being emitted for the full 10ms I wouldn't be concerned, but it is only being emitted for a fraction of that 10ms, and I don't know precisely what fraction. I cannot go faster than that, as my computer can't buffer the incoming data, so 10ms is the shortest bin I can use. So is what I am seeing an average calculated over the length of the collection bin, or a 'total light collected' value? For relative readings from one flash to another (of the same make) it wouldn't be a problem anyway, as the flash duration should be the same; but if I were to compare Nikon to Canon, for instance, where the flash durations may differ, would it then become a problem? Hopefully that makes more sense - I'm on about 2 hours of sleep a night at the moment due to a very ill family member, so it's harder than usual to think straight.
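To show what I mean with some made-up (not measured) numbers - a flash that is only on for 1ms inside a 10ms collection bin:

```python
# Illustrative numbers only: the energy delivered is fixed, so averaging
# it over a bin longer than the flash dilutes the reported irradiance.
flash_on_ms = 1.0            # assumed flash duration
bin_ms = 10.0                # spectrometer collection bin length
irr_during_flash_uW = 100.0  # uW/cm^2/nm while the flash is actually firing

energy_uJ = irr_during_flash_uW * (flash_on_ms / 1000.0)  # uW * s = uJ/cm^2/nm
reported_uW = energy_uJ / (bin_ms / 1000.0)               # averaged over the bin
print(reported_uW)  # 10.0 - a tenth of the true during-flash irradiance
```

If the instrument reports that kind of average, the number depends as much on my bin length as on the flash itself.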

 

With regard to the intensities: I haven't opened up the modified flash head, but as I understand it, it is not uncommon to change the bulb (or remove some coating from the bulb?). If that is the case, I am not massively surprised at the differences in intensity, but I will double-check tomorrow anyway.

Link to comment

Thank you for the update! I now understand your question. Which is not to say that I know the answer. :rolleyes:

Intuitively I would think that nothing would be recorded within the 10ms measuring interval but outside the flash interval. And I don't think it would make sense to average anything in an absolute irradiance measurement.

 

I'm thinking you could ask the Ocean Optics guys via their online chat and they could get an answer for you.

https://oceanoptics.com/

The Live Chat button is at the top of the home page.

 

FWIW, I'm guessing the filter on the UV-flash is about a 1mm thick UG1 or U-360. http://www.ultraviol...ndpost__p__7836

 

Sorry to hear you are dealing with a family illness. Those events are very difficult. Hope it resolves for the better very soon. Take care of yourself.

 

I think that Shane or Enrico or John D. might have an answer for you also. They don't typically stop by every day, but will no doubt also reply when they see this.

Link to comment
enricosavazzi

1) The output of the spectrometer is in µW/cm²/nm. Is this meaningless here? The flash duration will be less than 10ms, so not all of that collection 'bin' will be made up of light from the flash - it is an average of light and dark. While I am comfortable that it is quantitative as a direct comparison between the two flashes, what do the numbers actually mean?

 

2) Is µW/cm²/nm the right way to plot the data for comparing flash output? Each UV photon carries more energy than an IR one,

 

3) so does the higher peak in the UV region for the modified flash mean it is emitting more light in the UV than in the IR, and

 

2) should I consider plotting a 'photon count' (for want of a better term) instead, taking the energy of the photons out of the equation?

I can try to answer, although probably not in an exhaustive way.

 

1) Assuming the flash duration (decay down to 50% or 10% of peak level, depending on which standard you prefer) is about 1 ms, the spectrometer is basically integrating the reading across the measurement time span (10 ms). It may also be "blind" for an unspecified interval between readings, so the only reliable measurements are those made when the flash fires somewhere in the middle of the 10 ms window. If the measurement window starts after the flash has already fired, or ends before the flash emission decays to zero, you will have an unreliable measurement. This is why for this type of measurement it is better to use a spectrometer with an external trigger input (usually the same trigger that fires the flash).

 

Without a trigger input, you can either do it the way you are doing it and discard the readings that are significantly lower than average (which probably correspond to flashes truncated at their beginning or end), or use a wider measurement window (e.g. 100ms). The problem with wide measurement windows is that noise becomes higher, so the spectrum graph becomes more "jagged". A way to overcome this is to average, say, 10 or 20 successive readings, which reduces the total amount of random noise: it differs between successive readings and therefore "cancels out" when averaging. Some of the noise is non-random, and remains in the data after averaging.
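As an illustration of why the averaging helps - a synthetic sketch, not real spectrometer output:

```python
# Random noise differs between successive readings, so the mean of N
# spectra carries roughly sqrt(N) less of it than a single reading.
import numpy as np

rng = np.random.default_rng(0)
wl = np.linspace(200, 850, 1000)                     # wavelength grid, nm
true_spectrum = np.exp(-((wl - 430.0) / 40.0) ** 2)  # toy flash peak at 430 nm

readings = true_spectrum + rng.normal(0.0, 0.05, size=(20, wl.size))
averaged = readings.mean(axis=0)

print("noise std, single reading:", (readings[0] - true_spectrum).std())
print("noise std, 20-scan mean  :", (averaged - true_spectrum).std())
```

With 20 scans the residual random noise drops by roughly a factor of 4.5 (the square root of 20), which is usually enough to smooth out the jaggedness.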

 

2) You can do it either way, as long as you know how the spectrometer behaves. Basically, the CCD in the spectrometer gives you an electron count, rather than a photon count. A problem with obtaining a reliable photon count is that energetic photons like UV ones can statistically generate more than one electron. The spectrometer may or may not be compensating for this, depending on how it is calibrated.

 

3) No, it only means that the measured level at the peak is higher. To calculate the actual amount of UV versus IR, you need to calculate the area under the curve in each region and compare the two areas. Assuming, of course, that the spectrometer has been calibrated to give a linear response, uniform across the spectrum.
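A minimal sketch of that area comparison, with placeholder arrays standing in for the measured spectrum:

```python
# Integrate the spectrum over each band; wavelength in nm, irradiance in
# uW/cm^2/nm, so each band area comes out in uW/cm^2.
import numpy as np

wl = np.linspace(200, 850, 1000)   # placeholder wavelength grid
irr = np.ones_like(wl)             # placeholder irradiance values

def band_area(wl, irr, lo_nm, hi_nm):
    m = (wl >= lo_nm) & (wl <= hi_nm)
    return np.trapz(irr[m], wl[m])

uva = band_area(wl, irr, 320.0, 400.0)   # UVA band
nir = band_area(wl, irr, 700.0, 850.0)   # NIR band
print(f"UVA area: {uva:.1f}, NIR area: {nir:.1f}, ratio: {uva / nir:.2f}")
```

The band limits are only examples; choose whatever regions of the curve you want to compare.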

Link to comment

Enrico -- thanks!!!

 

To calculate the actual amount of UV versus IR, you need to calculate the area under the curve in each region and compare the two areas.

 

Well of course! The former calculus teacher (me) should have thought of that!! :D It's been a while.

Link to comment

UPDATE - New data using Manual mode on the flashes 20/07/17

 

Thanks all for your comments. It will take me a little time to digest them fully, as there is plenty to think about and plenty for me to be getting on with.

 

I've gone back and repeated the experiment using full Manual mode on both flashes (both set to 1/1 power). The unmodified flash was significantly brighter than before, while the modified one was a little darker. I'm not sure how it works, but I presume the ETTL mode stores some information about how the flash was used previously and applies it when the flash is triggered manually. Here's the new graph:

post-148-0-27341000-1500535005.jpg

I did a few runs with each unit, and there was a bit of variability from flash to flash. Once the flashes indicated they were charged, they continued to build charge (and flash brighter) the longer they were left. It was quite a big effect too - overall maximum output varied by about 20% over 5 flashes with each unit.

 

Below is a screenshot of part of one of the data sets. It shows 9 x 10ms time slots where data is being collected. Along the top is wavelength in nm.

post-148-0-93100400-1500535303.jpg

 

Time slot 5 is where the flash is being measured. The good thing is that on either side of time slot 5 the signal reverts to background, so the flash is being captured within one 10ms slot. I've collected about 40 data sets now, and in each one the flash has been within a single 10ms time slot.

 

I'm going to have a play later today with the length of the collection bin - increasing it from 10ms upwards - to see what this does to the reported intensity. I will also follow up with Ocean Optics to find out a bit more about how the value is calculated. The information about the calibration is a little 'light' - the invoice just says "In house spectroradiometric UV-Vis calibration". Perhaps the paperwork in the box has more details....

 

I must admit, overall for what it cost, it seems to be quite a good little spectrometer.

Link to comment

Ok. I have done a set of experiments using the standard, unmodified 600EX-RT flash in which I varied the length of the collection bins during acquisition of the spectra. I started at 10ms bins and increased through 20, 40, 80, 160, 320 and 640ms. For each of these I plotted the irradiance data produced by the spectrometer.

post-148-0-82933300-1500543610.jpg

 

From this I took the 430nm peak intensity from each plot and plotted it against the reciprocal of the collection time (1/10ms, 1/20ms, and so on), the thinking being that the flash duration will be very short in relation to the data collection time.

post-148-0-87483300-1500543722.jpg

 

To my surprise, I got an r² of 1. I've never seen that before, and my initial thought was 'what have I done wrong now?'. But it looks as though the values reported by the device do depend on the length of the collection bin relative to the duration of the flash: a longer collection time for the same flash duration results in a lower reported irradiance. So for the reading to be a true absolute irradiance, the light must last the entire length of the collection bin. That is fine for sunlight or other continuous light sources, but it makes it difficult to get absolute data for flashes with this setup (or to compare flashes with sunlight, etc.).
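A quick way to see what the straight line implies: if the reported peak is really (flash energy)/(bin length), then peak x bin length should be constant, and the slope of peak versus 1/bin is the per-flash energy. A sketch with placeholder values, not my measured peaks:

```python
# Placeholder peaks following a perfect 1/t law, to show the arithmetic.
import numpy as np

bin_s = np.array([10, 20, 40, 80, 160, 320, 640]) * 1e-3   # bin lengths, s
peak_uW = 0.5 / bin_s              # reported 430 nm peak, uW/cm^2/nm

energy_uJ = peak_uW * bin_s        # uJ/cm^2/nm delivered per flash
print("per-flash energy:", energy_uJ)          # constant for every bin length

slope, intercept = np.polyfit(1.0 / bin_s, peak_uW, 1)
print("slope (= energy):", slope, "intercept:", intercept)   # 0.5 and ~0
```

So even though the absolute irradiance reading is diluted by the bin, multiplying it back by the bin length should recover a bin-independent energy figure.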

 

This time round I also standardised the recharge time of the flash a bit more - I took the measurements about 10s after the recharge light came on, after manually firing the flash. The standardisation looks to have worked, given the straight-line relationship in the graph.

Link to comment
enricosavazzi
...

A longer collection time for the same flash duration results in a lower reported irradiance.

...

Irradiance is power per irradiated surface area, so increasing the length of the measurement window reduces the average irradiance across the window (since the duration of the flash remains the same and in the rest of the window there is no radiation). Somewhere, either in the onboard processing in the spectrometer or after the data is sent to the PC, the software is doing an electron count / measurement time calculation. The question is whether the spectrometer or PC software can be configured to give a simple electron count instead, which, except for noise, should be independent of the length of the measurement window. If it cannot, then the spectrometer is probably designed only for use with continuous radiation sources. In that case, you can still use it to assess the relative contribution of different wavelengths of a flash discharge, but not to make a reliable absolute power measurement of an electronic flash unit.

Link to comment

Thanks again Jonathan for posting the experiments and Enrico for the explanations. I am learning a LOT from this.

 

JMC: But it looks as though the values reported by the device do depend on the length of the collection bin relative to the duration of the flash.

This surprised me, however.....

 

Enrico: The question is whether the spectrometer or PC software can be configured to give a simple electron count instead, which, except for noise, should be independent of the length of the measurement window. If it cannot, then the spectrometer is probably designed only for use with continuous radiation sources. In that case, you can still use it to assess the relative contribution of different wavelengths of a flash discharge, but not to make a reliable absolute power measurement of an electronic flash unit.

....now I understand that spectrometer design plays a role.

 

 

I would very much like to have a spectrometer and its accessories to measure lens and filter transmission. While I'm still fairly adept at picking up new technologies and their proper usage, I do need to put some research into understanding more before I consider actually buying one.

Link to comment
With the older Ocean Optics spectrometers it was possible to rig up an external trigger to fire both the flash and the spectrometer (I haven't done it for 10 years, so I'm a little foggy on how I did it). I can't imagine this isn't possible with the newer ones.
Link to comment

It's no problem at all Andrea, I'm happy to share what I can, when I can (and while I have access to the kit ;) ), and thanks to everyone for your explanations (especially you Enrico).

 

Shane, I'm sure it can be done - probably very easily with a bit of time with a soldering iron and some relatively straightforward electronics - but that isn't something I can do at the moment. Thankfully the burst mode seems to work very well, and it's simply a case of opening the file in Excel and extracting the line in which the flash was detected.

Link to comment
  • 1 year later...

A bit of a resurrection here. I have been playing on and off with getting flash spectra, and have realised there is a problem with some of my spectra above, specifically in posts #8 and #9. There is a wavy section of the spectra between about 420nm and 620nm which has no fine structure. This is actually the sensor maxing out because the collection times were too long. I'm not sure I fully understand the maths the OO system is using here, but I tried reducing the collection time further and further, and found that at 1ms that smooth wavy part of the spectrum no longer appeared smooth - I started to see fine structure there, which is more in keeping with what I expected to see.
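For anyone checking their own spectra for this, saturation is easy to flag in the raw counts - a minimal sketch (the 16-bit ceiling here is an assumption; check the ADC depth of your own unit):

```python
# Counts pinned at (or very near) the ADC ceiling over a contiguous range
# are clipped - which is what produced my featureless 420-620 nm region.
import numpy as np

def saturated_samples(counts, adc_max=65535, frac=0.99):
    """Boolean mask of samples at or above frac * adc_max."""
    return np.asarray(counts) >= frac * adc_max

counts = np.array([1200, 64900, 65535, 65535, 65530, 30000])
print("saturated at indices:", np.flatnonzero(saturated_samples(counts)))
```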

 

I had an issue with the previous computer I was using - it couldn't buffer data quickly enough for me to run a burst sequence with the spectrometer at fast acquisition times, hence the wavy data for the full-power flashes. I recently attached it to a new computer and have been able to get below 10ms per collection bucket. In fact I can now get down to 10µs buckets, which is as fast as the spectrometer will allow. As I am manually triggering the flash and then pressing go on the data collection, I had to set it to run for 0.2s at 10µs intervals. So I get a lot of data (20,000 scans' worth - a big text file), but I managed to capture a flash from a Canon 600EX-RT flash (unmodified). The graph below shows 4 consecutive 10µs collection buckets (scans 5, 6, 7 and 8).

post-148-0-90269800-1537101569.jpg

 

Scan 5 was just before the flash was triggered, scan 6 was when it had just fired, scan 7 shows the flash decaying, and by scan 8 it is back to baseline. Overall duration of the flash from firing to fully decayed: <20µs. I had the cosine corrector on the collection fibre about 1 inch from the flashgun.
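In case it's useful, this is roughly how I pick the flash scans out of the 20,000 - the file layout (one scan per row, a single header row) is an assumption here, so adjust the loader to match the actual export:

```python
# Sum each scan across wavelength and flag scans that stand well clear of
# the baseline; those are the buckets containing the flash.
import numpy as np

data = np.loadtxt("burst_export.txt", skiprows=1)  # hypothetical filename
totals = data.sum(axis=1)                          # integrated signal per scan

baseline = np.median(totals)
mad = np.median(np.abs(totals - baseline))         # robust noise estimate
flash_scans = np.flatnonzero(totals > baseline + 10.0 * mad)
print("flash captured in scans:", flash_scans)     # e.g. scans 6 and 7
```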

 

This is not a UV scan, as it is a standard flash unit - I'm just sharing because I was amazed it could capture full spectra at 10µs intervals, and could see not only when the flash fired, but also its decay.

Link to comment
There is no light before it was triggered. Scan 5 is at baseline, before the flash goes off. Scan 6 is when the flash was fired. You can then see the decay in scan 7, before getting back to baseline at scan 8.
Link to comment

The converted flash looks to have either a UG1 or U-360 filter in front.

It would be interesting to see the unmodified unit.

I like the comparison I saw here between the modified and unmodified Canon 199A. I own that flash, and it is not bad for UV.

 

This post:

http://www.ultravioletphotography.com/content/index.php/topic/2385-canon-199a-uv-modified-and-not-uv-modified/page__hl__199a__fromsearch__1

Link to comment

(I was indeed confusing 5 and 7 - now that I'm on my computer and not looking at it on a phone I see what happened. Scan 5 is basically under 8. Sorry, bright sunshine + cellphone = not the best conditions for graph reading.)

--

Is there really a lot of infrared from that flash or is that your spectrometer misbehaving out of range?

Link to comment

(I was indeed confusing 5 and 7 - now that I'm on my computer and not looking at it on a phone I see what happened. Scan 5 is basically under 8. Sorry, bright sunshine + cellphone = not the best conditions for graph reading.)

--

Is there really a lot of infrared from that flash or is that your spectrometer misbehaving out of range?

 

The data is trustworthy up to 850nm, so yes the big peaks in the IR in scans 6 and 7 are real.

Link to comment

I wonder what might cause such strong peaks in the NIR range.

 

The sensitivity of the detector array and the efficiency of the grating of an OO UV-NIR spectrometer both decrease towards the NIR end.

I think your spectrometer is fitted with Grating #1 in this diagram:

https://oceanoptics....lpm_USB-Jaz.jpg

 

Do you know how trustworthy the results are?

 

I'm not sure about the grating, Ulf - the details are up in the loft somewhere - but from memory it was preconfigured for 200 to 850nm. It was calibrated for absolute irradiance measurements between 250nm and 850nm by Ocean Optics when bought. I did some flash measurements soon after buying it, and they also showed the peaks. Also, if you look at xenon flash spectra online, they have those peaks too, so yes, I believe it is trustworthy.

Link to comment

There are some strong emission lines in the xenon spectrum at 823nm, 828nm, 835nm and 841nm.

https://physics.nist.gov/PhysRefData/Handbook/Tables/xenontable2.htm

 

You can use the measured peaks to verify the wavelength calibration, at least at that end of the wavelength range.

 

The wavelength calibration in OO spectrometers is done with a third-order (four-coefficient) polynomial.

The intercept term is the one most likely to have drifted.
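A minimal sketch of that check, with placeholder values standing in for the peak positions the spectrometer actually reports:

```python
# Compare measured peak positions against the reference Xe lines; a
# roughly constant offset across the lines points at a drifted intercept
# term rather than the higher-order coefficients.
import numpy as np

xe_lines_nm = np.array([823.0, 828.0, 835.0, 841.0])   # reference lines above
measured_nm = np.array([823.7, 828.7, 835.6, 841.8])   # placeholder readings

offsets = measured_nm - xe_lines_nm
print("per-line offsets (nm):", offsets)
print("mean offset (nm):", offsets.mean())
```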

Link to comment
