I am trying to calibrate the images I have from a multispectral camera, and I am struggling to figure out how to adapt the calibration (linearization models and a cone-catch model from a chart, using pastels with known reflectance as the colour chart) to this type of image. The multispectral camera captures a stack of 7 images (in TIFF) covering 7 different spectral ranges: 400, 500, 600, 700, 800, 900 nm and thermal. These images are all in grey levels (not RGB).
What I have done so far is convert these stacks into separate images, remove the 800, 900 and thermal channels (which are useless for my study) and align the remaining images. Now I can build a linearization model (using the pastels for calibration), but I don't know which of the following protocols to follow:
- Create a linearization model not for the camera as a whole but for each sensor of the camera, i.e. build the linearization for each channel's image independently.
- Create a stack of the four images (400, 500, 600 and 700 nm) and build the linearization model from this stack.
- Create a stack of the 700, 600 and 500 nm images and convert it to an RGB image where R = 700, G = 600 and B = 500. Build a linearization model from this RGB image for the "visible range" and use my 400 nm (330–470 nm) image as the UV image (I suppose I need an independent linearization for the UV anyway, no?).
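For the third protocol, the compositing step itself is straightforward. A minimal NumPy sketch, assuming the three aligned channels are already loaded as 2-D arrays (the variable names here are made up, not micaToolbox identifiers):

```python
import numpy as np

def make_visible_rgb(ch700, ch600, ch500):
    """Stack three single-channel images into one RGB array (R=700, G=600, B=500)."""
    return np.dstack([ch700, ch600, ch500])

# Dummy 2x2 channels standing in for the aligned TIFF planes
ch700 = np.full((2, 2), 0.7)
ch600 = np.full((2, 2), 0.6)
ch500 = np.full((2, 2), 0.5)
rgb = make_visible_rgb(ch700, ch600, ch500)  # shape (2, 2, 3)
```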
And then, how do I calibrate with the chart depending on the protocol? I did not find information on calibrating UV pictures, but I may have missed it.
Then, for my experiments, I take pictures of animals behaving in an arena. Because of this design, I photograph my colour standards in a separate picture before running the experiments. Does anyone know a trick to measure the colour standards once and then apply that measurement to my whole sequence of images to calibrate them, instead of doing it one by one?
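To make the batch idea concrete, here is a rough sketch of what I have in mind, outside micaToolbox. The function names and the simple linear pixel-to-reflectance model are my own assumptions, not anything the toolbox provides:

```python
import numpy as np

def fit_channel_calibration(measured_values, known_reflectances):
    """Least-squares fit of reflectance = gain * pixel + offset from the standards."""
    gain, offset = np.polyfit(measured_values, known_reflectances, 1)
    return gain, offset

def apply_calibration(frames, gain, offset):
    """Apply the same fitted coefficients to every frame in the sequence."""
    return [gain * f + offset for f in frames]

# Dummy standards: pixel values 10 and 90 for 10% and 90% reflectance
gain, offset = fit_channel_calibration([10.0, 90.0], [0.1, 0.9])
frames = [np.array([[10.0, 50.0], [90.0, 20.0]])]
calibrated = apply_calibration(frames, gain, offset)
```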
Thank you in advance. Best regards.
P.S.: I still cannot log in :s
In my literature search I found some spectral sensitivity data that some users were looking for, so let me know if that is still the case and I can share the reference.
Regarding linearisation, you should be able to test this yourself, e.g. check that the pixel values for known reflectance levels scale linearly. The chances are you can use the same model for all channels in your stack.
The built-in linearisation tools will show you the model equation you need. You can then apply it yourself (and write a little script to automate this) using the linearisation tool in plugins>micaToolbox>tools>linearisation function. Just copy the model parameters across, select the right equation and run it sequentially on all channels.
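The "copy the parameters and loop over channels" step can be sketched like this, outside ImageJ. The power-law form used here is only an assumption for illustration; substitute whichever equation and parameters the linearisation tool actually reports:

```python
import numpy as np

def linearise(channel, a, b):
    """Apply an assumed power-law linearisation: linear = a * pixel**b."""
    return a * np.power(channel, b)

def linearise_stack(channels, params):
    """Run the same linearisation function sequentially over all channels."""
    return {name: linearise(img, *params[name]) for name, img in channels.items()}

# Dummy channels and made-up model parameters (a, b) per channel
channels = {"400": np.array([[4.0]]), "500": np.array([[9.0]])}
params = {"400": (1.0, 2.0), "500": (1.0, 0.5)}
out = linearise_stack(channels, params)
```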
As for cone-mapping, you presumably don't need to use a chart, given you're working in IR, where no animals have visual sensitivity?