I’ve imaged different artificial lights in a dark lab and want to know whether these images can be used in the toolbox to determine the colour distance between lights as viewed by dichromatic and trichromatic fish. What would be the most appropriate pipeline to follow?
The method is based on comparing the reflectance of a surface in an image to the reflectance of one (or multiple) grey standards in the same image under the same illuminant. The problem with quantifying a light-emitting (rather than reflecting) object is that the comparison to the grey standard (which doesn’t emit light) becomes tricky. I would genuinely recommend using a spectrophotometer instead: measure the shape of the irradiance spectrum of each of your lights (no need for absolute units; to measure colour you just want the shape), then do the RNL maths by hand or using PAVO (treating your irradiance as if it were reflectance). As colour is the result of relative photoreceptor stimulation, this will give you the delta-S contrast between your lights.
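For what the "RNL maths by hand" look like, here is a minimal sketch of the receptor-noise-limited (Vorobyev & Osorio 1998) delta-S calculation in Python. Everything in it is illustrative: the Gaussian cone sensitivities, the Weber fractions, and the two toy light spectra are all placeholder assumptions — a real analysis would use your fish's measured cone sensitivities and your measured irradiance spectra, with Weber fractions justified from the literature.

```python
import numpy as np

# Hypothetical wavelength grid: 400-700 nm in 5 nm steps.
wl = np.arange(400, 701, 5)

def gaussian(peak, width=40.0):
    """Toy spectral curve. Stands in for measured cone sensitivities
    (or measured light spectra) in this illustration only."""
    return np.exp(-0.5 * ((wl - peak) / width) ** 2)

# Assumed dichromat: short- and long-wavelength-sensitive cones.
sensitivities = [gaussian(450), gaussian(570)]

# Assumed Weber fractions (receptor noise) -- placeholders, not
# measured values for any real fish.
weber = np.array([0.10, 0.05])

def quantum_catches(irradiance):
    """Relative quantum catch of each receptor for one spectrum.
    Only the spectral shape matters: any constant scale factor
    cancels in the log-ratios below."""
    return np.array([(irradiance * s).sum() for s in sensitivities])

def delta_s(spec_a, spec_b, weber=weber):
    """RNL colour distance (in JNDs) between two spectra, for a
    di- or trichromat depending on how many receptors are defined."""
    qa, qb = quantum_catches(spec_a), quantum_catches(spec_b)
    df = np.log(qa / qb)  # receptor contrasts
    if len(df) == 2:
        num = (df[0] - df[1]) ** 2
        den = weber[0] ** 2 + weber[1] ** 2
    elif len(df) == 3:
        e1, e2, e3 = weber
        num = ((e1 * (df[2] - df[1])) ** 2
               + (e2 * (df[2] - df[0])) ** 2
               + (e3 * (df[0] - df[1])) ** 2)
        den = (e1 * e2) ** 2 + (e1 * e3) ** 2 + (e2 * e3) ** 2
    else:
        raise ValueError("only di- and trichromats handled here")
    return np.sqrt(num / den)

# Two toy "lights": a bluish and a reddish emission spectrum.
blue_light = gaussian(460, 60)
red_light = gaussian(620, 60)
print(delta_s(blue_light, red_light))  # colour distance in JNDs
```

In PAVO the equivalent step is its receptor-noise colour-distance function fed with your irradiance spectra in place of reflectance, as suggested above.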
There are a few ways you could try to work around this issue, e.g. by shining your lights at a Spectralon white standard placed next to a second Spectralon white standard illuminated by a full-spectrum light (e.g. a PX2), while trying to avoid oversaturating any of your camera channels (especially under your coloured light). That should enable you to take a picture of your lights that captures their colour relative to a standard, which you can then use to measure the colour delta-S. Something along those lines xD Maybe Jolyon has an alternative idea?
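Since clipped channels silently break this kind of comparison, it may be worth checking each calibrated image before analysis. A small sketch, assuming a linear image loaded as an (H, W, channels) float array scaled to [0, 1]; the 0.99 cutoff is an arbitrary placeholder you would adapt to your camera's bit depth and noise:

```python
import numpy as np

def clipped_fraction(image, saturation=0.99):
    """Fraction of pixels at or above the saturation threshold,
    reported per channel. Values near zero in every channel suggest
    the exposure avoided clipping."""
    return (image >= saturation).mean(axis=(0, 1))

# Toy 2x2 three-channel "image" with a blown-out first channel.
img = np.array([[[1.00, 0.2, 0.1], [1.00, 0.3, 0.2]],
                [[0.99, 0.1, 0.0], [0.50, 0.4, 0.3]]])
print(clipped_fraction(img))  # per-channel saturated fraction
```

If any channel of the region you plan to measure reports a non-trivial clipped fraction, re-shoot at a shorter exposure rather than trying to correct it afterwards.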