Description
The volume of raw data produced by next-generation observatories, such as the Square Kilometre Array Observatory (SKAO), will be so large that it cannot be archived in its entirety and must be significantly reduced. This problem is well known in high-energy physics, particularly at the Large Hadron Collider (LHC), where the data streams captured by the detectors are reduced by several orders of magnitude during data acquisition using sophisticated real-time algorithms.
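As a toy illustration of this kind of trigger-style reduction (not the actual LHC trigger chain; the event model and energy threshold below are invented purely for demonstration), a simple selection cut already shrinks a simulated event stream by roughly two orders of magnitude:

```python
import random

def passes_trigger(event, energy_threshold_gev=50.0):
    """Toy stand-in for a trigger decision: keep only events above a threshold."""
    return event["energy_gev"] > energy_threshold_gev

# Simulate one million raw events with exponentially distributed energies.
raw_stream = ({"energy_gev": random.expovariate(1 / 10.0)} for _ in range(1_000_000))
kept = [event for event in raw_stream if passes_trigger(event)]
print(f"kept {len(kept)} of 1,000,000 events "
      f"(reduction factor ~{1_000_000 / max(len(kept), 1):.0f}x)")
```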
At the LHC, proton collisions are repeated over and over under the same initial conditions, so even rare events can be observed multiple times if the observation period is long enough. In astronomy, by contrast, the initial conditions cannot be influenced, and the experimental boundary conditions can change in unpredictable ways. Consequently, the workflows established in high-energy physics must be extended to allow real-time optimization of telescope control parameters. Several years ago, Michael Kramer, Stefan Wagner, and the author proposed the “Dynamic Life Cycle Model.” A characteristic feature of the model is the introduction of two feedback loops:
- from the data centers next to the observatories and
- from the archives in data centers distributed worldwide
back to the telescopes, in order to control them in real time or near-real time (see the sketch after this list).
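A minimal sketch of how these two feedback loops might be wired up in software, assuming entirely hypothetical component names, thresholds, and control parameters (the Dynamic Life Cycle Model itself does not prescribe this interface):

```python
from dataclasses import dataclass, field

@dataclass
class TelescopeConfig:
    """Hypothetical subset of telescope control parameters."""
    integration_time_s: float = 1.0
    channel_mask: list = field(default_factory=list)

class TelescopeController:
    """Receives parameter updates from either feedback loop and applies them."""
    def __init__(self):
        self.config = TelescopeConfig()

    def apply_update(self, source: str, **updates):
        for key, value in updates.items():
            setattr(self.config, key, value)
        print(f"[{source}] new config: {self.config}")

def on_site_feedback(controller: TelescopeController, rfi_fraction: float):
    """Loop 1: the data center next to the observatory reacts to live data quality,
    e.g. masking frequency channels affected by radio-frequency interference."""
    if rfi_fraction > 0.2:  # illustrative threshold, not taken from the model
        controller.apply_update("on-site data center", channel_mask=["band2:0-63"])

def archive_feedback(controller: TelescopeController, long_term_noise: float):
    """Loop 2: analysis in the worldwide archives feeds back slower, global
    optimizations, e.g. increasing the integration time for noisy fields."""
    if long_term_noise > 1.5:  # illustrative threshold, not taken from the model
        controller.apply_update("distributed archive", integration_time_s=2.0)

if __name__ == "__main__":
    controller = TelescopeController()
    on_site_feedback(controller, rfi_fraction=0.35)    # near-real-time loop
    archive_feedback(controller, long_term_noise=1.8)  # slower archive loop
```

The two loops differ mainly in latency and scope: the on-site loop reacts to live data quality in near-real time, while the archive loop feeds back slower, globally informed optimizations.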
The presentation provides a brief introduction to the model, followed by a discussion of selected computational challenges and an overview of the current status and future work.
| Affiliation of the submitter | German Center for Astrophysics (DZA) |
|---|---|
| Attendance | in-person |