Johns Hopkins applied mathematicians and astronomers have developed a new method to render images from ground-based telescopes as clear as those taken from space, a process that stands to expand the benefits of Earth-based instruments.
I believe this is a new deconvolution image stacking algorithm that can easily be run in hardware. It should work with any observatory. The math is far enough above my head that I can’t be sure though.
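For anyone curious what "deconvolution" means here: the classic example is Richardson–Lucy, which iteratively sharpens an image that was blurred by a known point-spread function (PSF). This is just a toy 1D sketch of that general technique, not the algorithm from the article; the PSF, signal, and iteration count are made up for illustration.

```python
# Toy 1D Richardson-Lucy deconvolution, pure Python.
# Illustrative only -- not the Hopkins team's algorithm.

def convolve_same(x, k):
    """Same-size 1D convolution with a centered kernel."""
    n, m = len(x), len(k)
    half = m // 2
    out = []
    for i in range(n):
        s = 0.0
        for j in range(m):
            idx = i + j - half
            if 0 <= idx < n:
                s += x[idx] * k[j]
        out.append(s)
    return out

def richardson_lucy(observed, psf, iterations=50):
    """Iteratively estimate the sharp signal that `psf` blurred."""
    psf_flipped = psf[::-1]
    estimate = [1.0] * len(observed)  # flat starting guess
    for _ in range(iterations):
        blurred = convolve_same(estimate, psf)
        ratio = [o / max(b, 1e-12) for o, b in zip(observed, blurred)]
        correction = convolve_same(ratio, psf_flipped)
        estimate = [e * c for e, c in zip(estimate, correction)]
    return estimate

# A single point source blurred by a small symmetric PSF:
psf = [0.25, 0.5, 0.25]               # normalized blur kernel
truth = [0, 0, 0, 4, 0, 0, 0]
observed = convolve_same(truth, psf)  # [0, 0, 1, 2, 1, 0, 0]
restored = richardson_lucy(observed, psf)
```

After a few dozen iterations the restored signal re-concentrates the flux back toward the original point source. Real pipelines do this in 2D, per-exposure, with a PSF measured from stars in the frame, then stack the results.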
It would be cool if this makes it into software that people could use at home. I would love to see what amateur astrophotographers could do with it.
Hopefully, this new algorithm is not overly taxing. The amount of processing they’ll have to do to keep up with Rubin must be staggering. It’s got, what, a 3.2-gigapixel camera mapping the entire night sky every few days? And then all that data has to be processed across the timeline of past observations. I wouldn’t be surprised if the computational demands are what kept it from becoming a reality until now.
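Quick back-of-envelope on that data volume, just from the 3.2-gigapixel figure; the bytes-per-pixel and exposures-per-night numbers below are rough assumptions, not Rubin's published specs:

```python
# Back-of-envelope raw data rate for a 3.2-gigapixel camera.
# bytes_per_pixel and exposures_per_night are assumptions.
pixels = 3.2e9               # sensor pixel count
bytes_per_pixel = 2          # assuming 16-bit raw readout
exposures_per_night = 1000   # order-of-magnitude guess

gb_per_exposure = pixels * bytes_per_pixel / 1e9
tb_per_night = gb_per_exposure * exposures_per_night / 1e3
print(f"~{gb_per_exposure:.1f} GB per exposure, ~{tb_per_night:.0f} TB per night")
```

So even with conservative assumptions you're in the multi-terabytes-per-night range before any reprocessing of the historical archive.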
Oh that would be cool!