
The Architecture of the MCAO Real-time Control Computer Cluster for the Daniel K. Inouye Solar Telescope
Dirk Schmidt 1, Andrew Beard 1
1 : National Solar Observatory

Long version for review:
The real-time control system for the multi-conjugate adaptive optics system of
the 4-meter Daniel K. Inouye Solar Telescope will process 11817 subapertures in
total, each 20x20 px, received from nine wavefront sensors, and control three
deformable mirrors with a total of 3452 actuators. The system will run the
control loop at a rate of 2000 Hz. To perform this tremendous amount of
processing, we use a cluster of ten x86 Linux servers interconnected with a
200-gigabit-per-second InfiniBand network.
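
To put these numbers in perspective, the following back-of-the-envelope sketch
computes the pixel load the cluster has to ingest and process per loop cycle.
The 16-bit pixel depth is an assumption made for illustration; it is not a
figure from this abstract.

    /* Back-of-the-envelope pixel load implied by the numbers above.
     * The 16-bit pixel depth is an assumption for illustration only. */
    #include <stdio.h>

    int main(void)
    {
        const double subaps   = 11817.0;     /* subapertures across nine WFS */
        const double px       = 20.0 * 20.0; /* pixels per subaperture       */
        const double bytes_px = 2.0;         /* assumed 16-bit pixel depth   */
        const double rate_hz  = 2000.0;      /* control loop rate            */

        double bytes_per_cycle = subaps * px * bytes_px;
        double bytes_per_sec   = bytes_per_cycle * rate_hz;

        printf("pixel data per loop cycle: %.1f MB\n",   bytes_per_cycle / 1e6);
        printf("aggregate pixel rate:      %.1f GB/s\n", bytes_per_sec / 1e9);
        return 0;
    }

With these assumptions, the nine cameras deliver roughly 9.5 MB of pixel data
per cycle, or about 19 GB/s in aggregate.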

An identical computer is dedicated to the camera of each of the nine wavefront
sensors. These computers process the Shack-Hartmann images with a two-dimensional
cross-correlation technique, and they reconstruct their share of the wavefront
modal coefficients with a matrix-vector multiplication. These partial
coefficients are then sent to the tenth computer in the network, which sums them
and computes the final actuator commands with another matrix-vector
multiplication. This computer also performs additional control loop functions
and manages the other computers.
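
A minimal sketch of this two-stage reconstruction is given below. The dimensions
are toy values and the zero-filled placeholder arrays stand in for the real
reconstruction and control matrices; the sketch illustrates the data flow only,
not the actual KAOS Evo 2 code.

    /* Sketch of the two-stage reconstruction: per-WFS partial modal
     * reconstruction, then summation and actuator-command computation
     * on the central node.  Dimensions are placeholders. */
    #include <stdio.h>

    #define N_WFS     9   /* wavefront sensor nodes               */
    #define N_SLOPES  8   /* cross-correlation shifts per node    */
    #define N_MODES   6   /* wavefront modal coefficients         */
    #define N_ACT     5   /* deformable mirror actuators          */

    /* y = A * x, with A stored row-major as rows x cols */
    static void matvec(int rows, int cols, const float *A,
                       const float *x, float *y)
    {
        for (int r = 0; r < rows; r++) {
            float acc = 0.0f;
            for (int c = 0; c < cols; c++)
                acc += A[r * cols + c] * x[c];
            y[r] = acc;
        }
    }

    int main(void)
    {
        static float R[N_WFS][N_MODES * N_SLOPES]; /* slopes -> partial modes */
        static float s[N_WFS][N_SLOPES];           /* measured shifts         */
        static float C[N_ACT * N_MODES];           /* modes -> actuator cmds  */

        float partial[N_WFS][N_MODES];
        float modes[N_MODES] = {0};
        float act[N_ACT];

        /* Stage 1: each WFS node reconstructs its share of the modal vector. */
        for (int w = 0; w < N_WFS; w++)
            matvec(N_MODES, N_SLOPES, R[w], s[w], partial[w]);

        /* Stage 2: the central node sums the partial vectors ... */
        for (int w = 0; w < N_WFS; w++)
            for (int m = 0; m < N_MODES; m++)
                modes[m] += partial[w][m];

        /* ... and maps the summed modes to deformable mirror commands. */
        matvec(N_ACT, N_MODES, C, modes, act);

        printf("first actuator command: %f\n", act[0]);
        return 0;
    }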

The cluster hardware is complete and is built around a total of eleven AMD
Epyc 7742 64-core processors. We are currently developing the real-time control
software, a clusterized advancement of the KAOS Evo 2 software used in Clear at
the Goode Solar Telescope. The InfiniBand communication layers for all relevant
data have been implemented using RDMA (remote direct memory access) for minimal
latency, and a simplified control loop is already running at 2000 Hz. The system
can be commanded from a single graphical user interface that displays all nine
wavefront sensor images in parallel. Development is ongoing and progressing
quickly.
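
As an illustration of what a fixed 2000 Hz cycle implies, the sketch below paces
a loop to an absolute deadline of 500 microseconds per cycle using
clock_nanosleep on Linux. This is a generic pacing pattern, not the
synchronization mechanism actually used in the cluster.

    /* Generic fixed-rate 2000 Hz loop with an absolute deadline. */
    #define _POSIX_C_SOURCE 200809L
    #include <stdio.h>
    #include <time.h>

    #define PERIOD_NS 500000L   /* 1 / 2000 Hz = 500 microseconds */

    int main(void)
    {
        struct timespec next;
        clock_gettime(CLOCK_MONOTONIC, &next);

        for (int cycle = 0; cycle < 2000; cycle++) { /* one second of cycles */
            /* ... receive pixels, correlate, reconstruct, send commands ... */

            /* advance the absolute deadline by one period and sleep until it */
            next.tv_nsec += PERIOD_NS;
            if (next.tv_nsec >= 1000000000L) {
                next.tv_nsec -= 1000000000L;
                next.tv_sec  += 1;
            }
            clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
        }
        puts("2000 cycles completed");
        return 0;
    }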

We will not have a complete optical setup in the lab before deployment at the
telescope, and therefore the implementation of the algorithms in the clusterized
control system will be tested with simulations. We will use the adaptive optics
simulation software Blur that will present the cluster with faked observations,
simulating both the wavefront sensor images and the deformable mirror effects in
the imaging of the sun through turbulence. On a server with two AMD Epyc 7742
processors, Blur can simulate one loop cycle in about 70 milliseconds.
To speed this up even further, we have ordered a server with two AMD Epyc 9654
96-core processors. This computer will be sending the simulated camera images
directly into the frame grabbers of the cluster computers using nine CoaXPress 2.0
simulator cards. We expect delivery of this computer in late March.

At AO4ELT7, we plan to present the software and hardware architecture of the
real-time control system, including the synchronization techniques we use for
lowest latency, to motivate our choices, and to report on the timing performance
and the lessons we have learned from using 200-gigabit-per-second InfiniBand.

Overall project status: The Wavefront Sensor System design passed its Final
Design Review in February 2023, and the system is being manufactured. Deliveries
of the custom-made cameras and of the first high-altitude deformable mirror are
expected in the first half of 2023. Fabrication of the second high-altitude
deformable mirror is under contract.

Note to the organizers: Screenshot for review only, not for public release. Thanks.


Short version for program:
The real-time control system for the multi-conjugate adaptive optics system of
the 4-meter Daniel K. Inouye Solar Telescope will process 11817 subapertures in
total, each 20x20 px, received from nine wavefront sensors, and control three
deformable mirrors with a total of 3452 actuators. The system will run the
control loop at a rate of 2000 Hz. The control system is based on the KAOS
Evo 2 software and will run on a cluster of ten x86 Linux servers interconnected
with a 200-gigabit-per-second InfiniBand network.

We will use the adaptive optics simulation software Blur to test the cluster
implementation before deployment at the telescope. Blur will present the cluster
with synthetic observations, simulating both the wavefront sensor images and the
effects of the deformable mirrors on the imaging of the Sun through turbulence.

At AO4ELT7, we plan to present the software and hardware architecture of the
real-time control system, including the synchronization techniques we use for
lowest latency, to motivate our choices, and to report on the timing performance
and the lessons we have learned from using 200-gigabit-per-second InfiniBand.


