Recently I've been on a bit of a deep dive regarding human color vision and cameras. This left me with the general impression that RGB Bayer filters are vastly over-utilized (mostly due to market share), and that they are usually not great for tasks other than mimicking human vision! For example, if you have a stationary scene, why not put a whole bunch of filters in front of a mono camera and get much more frequency information?
That's common in high-end astrophotography, and almost exclusively used at professional observatories. However, scientists like filters that are "rectangular", with a flat passband and sharp falloff, very unlike human color vision.
Assuming the bands are narrow, that should allow approximately true-color images, shouldn't it?
Human S cone channel = sum over bands of (intensity in that band) * (human S-cone sensitivity in that band),
and similarly for the M and L cone channels, which approaches the integral representing true color in the limit of narrow bands.
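A minimal sketch of that sum, with made-up band centers and Gaussian stand-ins for the real cone fundamentals (in practice you'd substitute the actual Stockman-Sharpe curves and your own filter set):

    # Approximate cone responses as a weighted sum over narrowband measurements.
    # Band centers, intensities, and the Gaussian "cone sensitivities" below are
    # placeholders, not real data.
    import numpy as np

    band_centers_nm = np.array([420, 460, 500, 540, 580, 620, 660])  # hypothetical filter set
    band_intensity = np.array([0.2, 0.5, 0.9, 1.0, 0.8, 0.6, 0.3])   # measured signal per band

    def fake_cone_sensitivity(peak_nm, width_nm=60):
        """Stand-in for a real cone fundamental: a Gaussian centered at peak_nm."""
        return lambda wl: np.exp(-0.5 * ((wl - peak_nm) / width_nm) ** 2)

    cones = {"S": fake_cone_sensitivity(445),
             "M": fake_cone_sensitivity(540),
             "L": fake_cone_sensitivity(565)}

    # Riemann-sum approximation: narrower, more numerous bands get closer to the integral.
    lms = {name: float(np.sum(band_intensity * sens(band_centers_nm)))
           for name, sens in cones.items()}
    print(lms)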
Are the bands too wide for this to work?
> Are the bands too wide for this to work?
For wideband filters used for stars and galaxies, yes. Sometimes the filters are wider than the entire visible spectrum.
For narrowband filters used to isolate emission from a particular element, no. If you have just the Oxygen-III signal isolated from everything else, you can composite it as a perfect turquoise color.
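For the narrowband case, compositing amounts to tinting the monochrome frame with the chromaticity of the line. A rough sketch; the turquoise value for the ~500.7 nm O-III line here is eyeballed rather than derived from the CIE color matching functions:

    # Tint a monochrome narrowband frame with an assumed line color.
    # OIII_RGB is a rough guess at the sRGB hue of a ~500.7 nm line, not a
    # proper colorimetric conversion.
    import numpy as np

    OIII_RGB = np.array([0.0, 0.85, 0.75])  # approximate turquoise (hypothetical value)

    def tint(mono_frame, rgb):
        """mono_frame: 2-D array of per-pixel line intensity in [0, 1]."""
        return mono_frame[..., None] * rgb  # broadcast to (H, W, 3)

    oiii = np.random.rand(4, 4)                      # stand-in for a calibrated OIII exposure
    composite = np.clip(tint(oiii, OIII_RGB), 0, 1)  # add more tinted channels the same way
    print(composite.shape)  # (4, 4, 3)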
One big reason for filters in astronomy and astrophotography is to block certain frequency ranges, such as the emission lines from city lights.
The vast majority of consumers want their camera to take pictures of people that “look good” to the human eye; the other uses are niche.
But that said, I’m actually surprised that astrophotographers are so interested in calibrating stars to the human eye. The article shows through a number of examples (IR, hydrogen emission line) that the human eye is a very poor instrument for viewing the “true” color of stars. Most astronomical photographs use false colors (check the captions on the NASA archives) to show more than what the eye can see, to great effect.
I suspect it's because when conditions are right to actually see color in deep-sky objects, it's confounding that it doesn't look the same as the pictures. Especially if seeing the colors with your own eyes feels like a transcendent experience.
I've only experienced dramatic color from deep-sky objects a few times (the blue of the Orion Nebula vastly outshines all the other colors, for instance), and it's always sort of frustrating that the pictures show something so wildly different from what my own eyes see.
There's a good chance the real problem there is limited gamut on the screen, and with the right viewing method the RAW photo could look much much better.
If you get a big enough telescope, it will gather enough light that you'll see things in proper color. I've seen the Orion Nebula with a 10 inch reflector in a good location and the rich pinks, blues and reds were impossible to miss. These are the actual photons emitted from that object hitting your retina, so it's about as "true color" as you can get.
I think when astrophotographers are trying to render an image it makes sense that they would want the colors to match what your eyes would see looking through a good scope.
I think you want a push broom setup:
https://www.adept.net.au/news/newsletter/202001-jan/pushbroo...
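In case it helps picture the geometry: a push-broom sensor records one spatial line times all spectral bands per exposure, and the second spatial axis comes from motion. A toy simulation, with the scene, resolution, and band count all made up:

    # Toy push-broom scan: each exposure captures one spatial row across all
    # bands; stacking exposures as the platform moves builds a
    # (rows, cols, bands) hyperspectral cube. All numbers are placeholders.
    import numpy as np

    ROWS, COLS, BANDS = 100, 64, 32
    scene = np.random.rand(ROWS, COLS, BANDS)  # stand-in for the true spectral radiance

    def capture_line(scene, row):
        """One frame from the line sensor: (COLS, BANDS) for the row under the slit."""
        return scene[row]

    cube = np.stack([capture_line(scene, r) for r in range(ROWS)], axis=0)
    print(cube.shape)  # (100, 64, 32)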
Hyperspectral imaging is a really fun space. You can do a lot with some pretty basic filters and temporal trickery. However, once you're out of hot-mirror territory (the near-IR and IR filtering done on most cameras), things have to get pretty specialized.
But grab a cold mirror (an IR-pass filter that cuts visible light) and a night-vision camera for a real party on the cheap.
The technical term for this is multispectral imaging. Lots of applications across science and industry.
In case you weren't already aware, that last bit basically describes most optical scientific imaging (e.g. satellite imaging or spectroscopy in general).
And don't forget about polarization! There's more information out there than just frequency.
I guess that’s yet another dimension. Perhaps spin a polarizing filter in front of the camera to grab that?
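Spinning the filter through a few known angles would in principle let you solve for the linear Stokes parameters. A sketch assuming an ideal polarizer and the standard model I(theta) = 0.5 * (S0 + S1*cos(2*theta) + S2*sin(2*theta)):

    # Recover linear Stokes parameters (S0, S1, S2) from four exposures taken
    # with an ideal linear polarizer at 0, 45, 90, and 135 degrees.
    import numpy as np

    def linear_stokes(i0, i45, i90, i135):
        s0 = i0 + i90                 # total intensity (also i45 + i135)
        s1 = i0 - i90
        s2 = i45 - i135
        return s0, s1, s2

    def dolp_aolp(s0, s1, s2):
        dolp = np.hypot(s1, s2) / s0      # degree of linear polarization
        aolp = 0.5 * np.arctan2(s2, s1)   # angle of linear polarization (radians)
        return dolp, aolp

    # Example with scalar "pixels"; full image arrays work the same way.
    print(dolp_aolp(*linear_stokes(0.9, 0.6, 0.3, 0.6)))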
There are for sure things to explore. Craig Bohren once wrote that he wouldn't think of going anywhere without polarizing sunglasses. His books are really nice (Fundamentals of Atmospheric Radiation, Clouds in a Glass of Beer, ...).