Hello,
I have a question regarding measuring natural illuminant spectra underwater. I see that among the sample data files there is a CSV for “5m green water 400-700 nm”. Could you share how this was measured? I’m interested in doing something similar, but would like to know whether the illuminant spectrum should be measured in the same viewing plane as the animal, or whether measuring downwelling irradiance is OK.
Many thanks!
Mark
Hi Mark,
Jolyon is spot on. One issue we have been discovering is that, if your grey standard is not Lambertian, you might run into calibration issues when it is illuminated from the side (i.e. in shallow water, with most of the light coming from above but the image taken sideways, e.g. of an anemone or a wall). The measurement of the light environment itself is pretty straightforward, just as Jolyon said.
Cheers,
Cedric
Hi Mark,
I’m not sure exactly how Cedric measured these irradiance spectra underwater, though I imagine he used a cosine corrector and a spectrometer.
The illuminant you measure should ideally match the illumination falling on the object of interest. On the sea floor, things will normally receive light almost exclusively from above, so simply measuring the light from the upper hemisphere (or nearly so, as with an upward-pointing cosine corrector) would make most sense. But if you’re dealing exclusively with, e.g., pelagic conditions and sideways-facing surfaces, then measuring the light coming from the side would make most sense (i.e. turn the cosine corrector 90 degrees to point towards the horizon). If the surface being measured is not at all diffuse, then you might want to skip the cosine diffuser altogether.
The grey standard you use should also then match the orientation of the measured illuminant.
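In case it helps, here is a minimal sketch (Python, not the toolbox’s actual code) of the grey-standard normalisation that all of this feeds into; the pixel values and the 5% reflectance are just placeholders:

import numpy as np

object_px = np.array([412.0, 388.0, 455.0])  # linear pixel values sampled on the object (placeholder)
standard_px = 1260.0                         # mean linear pixel value of the grey standard (placeholder)
standard_reflectance = 0.05                  # placeholder: a 5% diffuse standard

# This estimate is only valid if the standard sees the same illumination as the
# object (same orientation relative to the light field), which is why the
# standard's orientation matters.
object_reflectance = object_px / standard_px * standard_reflectance
print(object_reflectance)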
Cheers,
Jolyon
You can certainly use the toolbox as-is to work with relative radiance (e.g. looking horizontally underwater). To convert to absolute radiance you will need to calibrate your camera against something of known radiance. Assuming you keep aperture and ISO constant, the linear (non-normalised) pixel value divided by the integration time (shutter/exposure duration) gives you a value proportional to radiance. Just compare these to your measured radiance values to convert to absolute radiometric measurements. Let me know if you’d like more of an explanation of this. There are certain assumptions made here (e.g. about sensor performance), but in my testing this has worked extremely well.
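To illustrate the arithmetic only (made-up numbers, and assuming a linear sensor with aperture and ISO held constant), the conversion looks something like this in Python:

import numpy as np

def relative_radiance(linear_pixel, integration_time_s):
    # Linear pixel value is proportional to radiance * integration time,
    # so dividing by integration time gives a value proportional to radiance.
    return linear_pixel / integration_time_s

# Calibration shot of a target whose radiance you measured independently.
known_radiance = 0.031     # placeholder units, e.g. W sr^-1 m^-2 from a spectroradiometer
calib_pixel = 2840.0       # linear pixel value of that target (placeholder)
calib_time = 1 / 200       # integration time of the calibration shot, seconds

scale = known_radiance / relative_radiance(calib_pixel, calib_time)

# Any later shot (same aperture and ISO) can now be put on an absolute scale.
scene_pixels = np.array([1510.0, 960.0, 2210.0])
scene_time = 1 / 60
absolute_radiance = relative_radiance(scene_pixels, scene_time) * scale
print(absolute_radiance)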
Re. spectra at different depths – absolutely, if you measure these yourself or use published spectra for different depths. You just need to make a new model at each depth.
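To make that concrete: this isn’t how the toolbox builds its mapping models, but the quantity that changes with depth is the quantum catch Q = ∫ I(λ) R(λ) S(λ) dλ, so each depth’s illuminant gives a different model. A rough sketch with made-up spectra:

import numpy as np

wl = np.arange(400.0, 701.0, 5.0)   # wavelengths, nm

def quantum_catch(illuminant, reflectance, sensitivity):
    # Q = integral of illuminant * reflectance * receptor sensitivity over wavelength
    return np.trapz(illuminant * reflectance * sensitivity, wl)

# Placeholder spectra - swap in your measured or published data.
illum_5m  = np.exp(-((wl - 530.0) / 60.0) ** 2)   # broad greenish light at 5 m (made up)
illum_20m = np.exp(-((wl - 510.0) / 40.0) ** 2)   # narrower, blue-shifted light at 20 m (made up)
reflectance = np.linspace(0.1, 0.4, wl.size)      # object reflectance (made up)
sensitivity = np.exp(-((wl - 560.0) / 50.0) ** 2) # one receptor class (made up)

for depth, illum in (("5 m", illum_5m), ("20 m", illum_20m)):
    print(depth, quantum_catch(illum, reflectance, sensitivity))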
Cheers,
Jolyon
Hi Jolyon,
Just to clarify: if I load a multispectral image as a linear image and then convert it to cone-catch, can I calculate the radiance contrast of an object against its background (horizontally underwater) using the colour and luminance JND calculator, or are the equations not suitable for this? I have also noticed that when the linear image is converted to cone-catch, the resulting image is white for each slice… not sure why that is.
Cheers,
Mark
Hi Jolyon,
Thanks for the reply! I appreciate the advice. I’m curious: is it possible to compare an object against the background radiance of a horizonless scene (e.g. open water) using the toolbox? Can radiance values be calculated using the camera’s RGB response and the sidewelling irradiance data? Also, if I calculate how the irradiance spectrum changes at different depths, can this be simulated in the model?
Cheers!
Mark