
Exploring an event camera-based pyramid wavefront sensor
Jorge Tapia 1,*, Vicente Westerhout 1, Esteban Vera 1,*
1 : Pontificia Universidad Católica de Valparaíso
* : Corresponding author

Adaptive optics (AO) is essential in the Extremely Large Telescope (ELT) era to reach the scientific goals of these facilities in terms of accuracy and sensitivity [1]. Similarly, AO is fundamental to boosting the performance of future ground-satellite optical communications [2] and improving space situational awareness [3]. Given these new challenges for AO systems, innovations can be pursued by taking advantage of cutting-edge sensor technologies based on non-traditional principles. That is the case of event-based cameras, which use a dynamic vision sensor in which every pixel works independently and asynchronously, quantizing local relative intensity changes to generate spike events [4]. Due to its high acquisition speed and dynamic range, this sensor is a promising candidate for incorporation into a pyramid wavefront sensor (PWFS), leading to a potentially low-cost, yet still sufficiently sensitive, solution. However, a new challenge arises: how do we correctly obtain the interaction matrix of an event-based system? In this work, we test a PWFS based on an event camera at the PULPOS bench [5], an AO bench that has either a PWFS built with a 4-face pyramid prism of Zeonex or a digital PWFS, and that can be stimulated with high-order wavefront aberrations and pupil discontinuities (spiders or secondary obscuration) using a combination of spatial light modulators (LCOS or DMD) at the pupil plane. The event data is collected during a fixed emulated integration time to obtain an equivalent interaction matrix. Samples of the response to some known aberrations are shown in Fig. 1. We are currently investigating the potential and limitations of this approach for wavefront estimation.
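The calibration step described above can be sketched in a few lines: signed event counts are accumulated over the emulated integration window into a pseudo-frame, and push-pull responses to each calibration mode form the columns of the equivalent interaction matrix. This is a minimal illustrative sketch, not the actual PULPOS pipeline; the `(x, y, polarity, timestamp)` event layout and the helper names are assumptions, and real event-camera SDKs deliver events in their own packet formats.

```python
import numpy as np

def events_to_frame(events, shape, t0, t1):
    """Accumulate signed event counts into a pseudo-frame over [t0, t1).

    `events` is assumed to be an iterable of (x, y, polarity, timestamp)
    rows with polarity in {-1, +1}; this layout is a placeholder, not the
    output format of any particular camera SDK.
    """
    frame = np.zeros(shape)
    for x, y, p, t in events:
        if t0 <= t < t1:
            frame[int(y), int(x)] += p
    return frame

def interaction_matrix(push_frames, pull_frames, amplitude):
    """Build an interaction matrix, one column per calibration mode.

    Each column is the push-pull difference of accumulated event frames,
    normalized by twice the poke amplitude, mirroring the standard
    intensity-based PWFS calibration.
    """
    cols = [(fp - fm).ravel() / (2.0 * amplitude)
            for fp, fm in zip(push_frames, pull_frames)]
    return np.column_stack(cols)
```

With the interaction matrix in hand, wavefront estimation would proceed as usual, e.g. via a pseudo-inverse reconstructor applied to the accumulated event frame of an unknown aberration.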

 

References:

[1] Hubin, N., Ellerbroek, B. L., Arsenault, R., Clare, R. M., Dekany, R., Gilles, L., Kasper, M., Herriot, G., Le Louarn, M., Marchetti, E., Oberti, S., Stoesz, J., Veran, J. P., & Vérinaud, C. (2005). Adaptive optics for extremely large telescopes. Proceedings of the International Astronomical Union, 1(S232), 60-85.

[2] Chen, M., Liu, C., Rui, D., & Xian, H. (2018). Performance verification of adaptive optics for satellite-to-ground coherent optical communications at large zenith angle. Optics express, 26(4), 4230-4242.

[3] Copeland, M., Bennet, F., Rigaut, F., Korkiakoski, V., d'Orgeville, C., & Smith, C. (2018, July). Adaptive optics corrected imaging for satellite and debris characterisation. In Adaptive Optics Systems VI (Vol. 10703, p. 1070333). International Society for Optics and Photonics.

[4] Lichtsteiner, P., Posch, C., & Delbruck, T. (2008). A 128×128 120 dB 15 μs latency asynchronous temporal contrast vision sensor. IEEE Journal of Solid-State Circuits, 43(2), 566-576.

[5] Tapia, J., Bustos, F. P., Weinberger, C., Romero, B., & Vera, E. (2022, August). PULPOS: a multi-purpose adaptive optics test bench in Chile. In Adaptive Optics Systems VIII (Vol. 12185, pp. 2222-2231). SPIE.

 

 



  • Poster