Using event cameras for imaging through atmospheric turbulence
Long-range horizontal-path imaging through atmospheric turbulence is hampered by random, spatiotemporally varying shifting and blurring of scene points in the recorded imagery. Typical mitigation strategies employ software algorithms that combine optical flow, to estimate and reduce tip/tilt aberrations, with lucky-patch identification and data fusion, to filter higher-order aberrations. In practice, accurate and fast optical flow estimation under turbulence is highly challenging. Here we investigate whether these challenges can be overcome by using a neuromorphic camera. These sensors measure logarithmic changes in scene radiance in each pixel instead of the radiance itself. The changes are thresholded to produce a stream of events, which yields an ultrafast time resolution on the order of microseconds and enables efficient optical flow estimation at very high frame rates. We report on our initial experiments, in which we used a neuromorphic camera to image through turbulence in a controlled indoor setting. We analyze how the sensor responds to the turbulence-induced apparent scene motion and propose an algorithm for computing high-resolution images of the scene from the event stream in combination with intensity frames. Our initial results do not show a statistical relation between event counts and lucky patches. The image reconstruction based on the event stream did show promising improvements in output image sharpness and stability compared to the raw image stream, with many opportunities for further improvement.
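The abstract describes the neuromorphic sensing principle: each pixel emits an event whenever its log radiance changes by more than a contrast threshold. A minimal sketch of this event-generation model, applied to a sequence of intensity frames, is given below. The function name, threshold value, and frame-based simulation are illustrative assumptions, not the sensor's actual asynchronous circuitry.

```python
import numpy as np

def generate_events(frames, timestamps, threshold=0.2):
    """Simulate event-camera output from a sequence of intensity frames.

    A pixel emits an event when its log intensity has changed by more
    than `threshold` since that pixel's last event. Returns a list of
    (t, x, y, polarity) tuples. Illustrative model only: a real sensor
    operates asynchronously per pixel with microsecond latency, and a
    pixel crossing several thresholds between samples would emit
    multiple events, whereas one is emitted here for brevity.
    """
    eps = 1e-6                            # avoid log(0) on dark pixels
    log_ref = np.log(frames[0] + eps)     # per-pixel reference level
    events = []
    for t, frame in zip(timestamps[1:], frames[1:]):
        diff = np.log(frame + eps) - log_ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for x, y in zip(xs, ys):
            pol = 1 if diff[y, x] > 0 else -1
            events.append((t, x, y, pol))
            # reset the reference toward the crossed threshold level
            log_ref[y, x] += pol * threshold
    return events
```

Because the comparison is made on log intensity, the threshold corresponds to a fixed contrast change, which makes the sensor's response largely independent of absolute scene brightness.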
To reference this document use:
Defence, Safety and Security
COAT-2019 - workshop (Communications and Observations through Atmospheric Turbulence: characterization and mitigation), ONERA, Dec 2019, Châtillon, France.