Brand new to studying colour and image analysis and could really use some advice.

I’m trying to develop a gradient of coloured baits for an experiment with birds, going from green to yellow with various mixes in between. I want to show that the birds perceive a gradual gradient of colour from one extreme to the other (i.e. that each colour is as different from the one before it as from the one after).

Calculating colour distance is straightforward from a calibrated image (take the normalized RGB values, convert to L*a*b*, and compute a distance in that space), but that’s for human vision. I can’t work out how to do the equivalent for a photo converted to cone catch for blue tits (the closest available model to the species I’m interested in). When I measure the mean values for the ROIs after cone-catch conversion I get lw, mw, sw, and uv channels instead of normalized RGB, and I don’t think these can serve the same function directly.

So I tried running the QCPA framework: it produces new images, but no table of results, and even if it did I’m not 100% sure what I’d need to look at in such a table. Can I make comparisons of colour similarity using the QCPA framework, and if so, how? I think I want JNDs, because if, say, all the baits are 4 JND units away from their neighbours then they ought to form a smooth gradient, but the JND and RNL analysis user guides suggest running them through the QCPA, and I keep getting what look like errors when I try to use those tools on their own.
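For context, here is the calculation I think I’d be doing by hand from those mean cone-catch values: the Vorobyev–Osorio receptor-noise-limited (RNL) distance between two stimuli. This is only a sketch of my understanding — the Weber fraction, the 4:2:2:1 cone-abundance ratio, and the bait values below are placeholders (I’d substitute published blue tit figures), so please correct me if the approach itself is wrong:

```python
import numpy as np
from itertools import combinations

def rnl_jnd(qa, qb, weber, densities):
    """Receptor-noise-limited colour distance (in JND) between two
    stimuli given as cone-catch values, after Vorobyev & Osorio (1998).

    qa, qb    : cone catches for the two stimuli (same channel order)
    weber     : Weber fraction assigned to the most abundant cone type
    densities : relative cone abundances, same channel order
    """
    qa, qb, dens = (np.asarray(x, dtype=float) for x in (qa, qb, densities))
    e = weber * np.sqrt(dens.max() / dens)   # per-channel noise
    df = np.log(qa / qb)                     # log (Fechner) contrasts
    idx = range(len(df))
    num = sum(
        np.prod([e[k] for k in idx if k not in (i, j)]) ** 2
        * (df[i] - df[j]) ** 2
        for i, j in combinations(idx, 2)
    )
    den = sum(np.prod([e[k] for k in idx if k != m]) ** 2 for m in idx)
    return float(np.sqrt(num / den))

# Example: mean ROI cone catches for two adjacent baits, channel order
# [lw, mw, sw, uv]; the numbers and the abundance ratio are made up
# purely for illustration.
bait_a = [0.42, 0.38, 0.20, 0.05]
bait_b = [0.35, 0.40, 0.24, 0.05]
print(rnl_jnd(bait_a, bait_b, weber=0.05, densities=[4, 2, 2, 1]))
```

If every consecutive pair of baits came out at roughly the same ΔS, that would be the “evenly spaced gradient” result I’m after — is that in fact what the QCPA/RNL tools are computing internally?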

As a final spanner in the works, I don’t have a UV-capable camera. I have analysed the UV reflectance of my baits by other means and found it to be negligible (and similar across baits). Currently I am trying to figure out the analysis using a mock image that I pretend has UV in it. Is it possible to substitute a blank image for the UV channel, so that the analysis effectively ignores it, without compromising the validity of the visible-light analysis?
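Concretely, the mock UV layer I’ve been feeding in is just a uniform image matched to the visible photo’s dimensions, along these lines (numpy sketch; the dimensions and the 0.02 “negligible reflectance” constant are placeholders for my actual setup):

```python
import numpy as np

h, w = 1024, 1365  # match the visible-light photo's dimensions
# Flat, near-zero stand-in for the missing UV channel
uv_mock = np.full((h, w), 0.02, dtype=np.float32)
# (I then save this alongside the visible channels and run the
# cone-catch conversion on the combined stack.)
print(uv_mock.shape, float(uv_mock.mean()))
```

Does a constant channel like this distort anything in the RNL/JND maths for the other channels, or does it just contribute zero contrast as I’d hope?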

Thank you in advance for any help.


Measuring colour distance with bird perception