This tutorial will provide a hands-on introduction to automating unit testing in Python using the industry-standard pytest framework. Participants will learn how to design software components that are easily testable, write effective unit tests using pytest to ensure software robustness and maintainability, and generate and utilize code coverage metrics to verify the comprehensiveness of their...
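As a minimal sketch of the kind of test the tutorial covers (the function under test and the file name are illustrative, not tutorial materials; coverage would be collected with the pytest-cov plugin, e.g. `pytest --cov`):

```python
# test_photometry.py -- illustrative only; run with `pytest`
import math

import pytest


def counts_to_magnitude(counts, zero_point=25.0):
    """Toy function under test: convert detector counts to an instrumental magnitude."""
    if counts <= 0:
        raise ValueError("counts must be positive")
    return zero_point - 2.5 * math.log10(counts)


def test_magnitude_at_one_count_equals_zero_point():
    assert counts_to_magnitude(1.0) == pytest.approx(25.0)


def test_nonpositive_counts_are_rejected():
    with pytest.raises(ValueError):
        counts_to_magnitude(0.0)
```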
The main goal of this tutorial is to teach participants how to use hierarchical Virtual Observatory (VO) standards allowing construction, exploration and querying of all-sky datasets. The Hierarchical Progressive Survey (HiPS) and the Space-Time Multi-Order Coverage map (ST-MOC) standards can be used by data providers to expose their datasets (images or catalogues), and astronomers can use...
Obtaining a clear picture of a new or even existing codebase is difficult. This
is especially true if the aim is to ascertain which parts of the code are
crucial to the primary function or performance of the software, and which are
for handling edge-cases, memory management, input/output (IO), or even argument
parsing. One of the most effective means by which to learn such information...
The growing volume of alerts from time-domain and multi-messenger astronomy, often with poor localization, necessitates automated and optimized follow-up scheduling. We present tilepy, an open-source Python package designed to meet this challenge by automatically generating efficient observation plans.
tilepy processes HEALPix-based sky maps to derive pointing schedules using various...
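As a hedged sketch of the kind of HEALPix handling such a tool builds on (generic healpy usage, not tilepy's own API; the file name is a placeholder):

```python
# Generic healpy sketch (not tilepy's API): read a probability sky map and
# rank pixels to find the region enclosing ~90% of the localization probability.
import numpy as np
import healpy as hp

prob = hp.read_map("gw_skymap.fits")              # per-pixel probability (placeholder file)
nside = hp.npix2nside(prob.size)

order = np.argsort(prob)[::-1]                    # pixels by decreasing probability
cumulative = np.cumsum(prob[order])
top = order[cumulative <= 0.9]                    # smallest ~90% credible region

ra, dec = hp.pix2ang(nside, top, lonlat=True)     # candidate pointing centres in degrees
print(f"{top.size} pixels enclose ~90% of the probability")
```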
PySE (Python Source Extractor) was developed by Spreeuw & Swinbank between 2005 and 2010, as part of the LOFAR Transients Key Project. It has been in continuous use since 2017 within the Amsterdam–ASTRON Radio Transients Facility and Analysis Center (AARTFAAC) pipeline. More recently (2023), major performance enhancements reduced runtime dramatically: offline processing of typical 2300²-pixel...
The Chinese Space Station Survey Telescope (CSST) is China's first major space-based optical survey facility, equipped with advanced instruments including the Main Survey Camera (MSC), Multi-Channel Imager (MCI), Integral Field Spectrograph (IFS), Cool Planet Imaging Coronagraph (CPIC), and High Sensitivity Terahertz Detection Module (HSTDM), covering wavelengths from near-ultraviolet (NUV) to...
The first Euclid Quick Data Release (Q1) encompasses approximately 30 million sources across 63.1 square degrees, marking the beginning of a mission providing petabytes of imaging data through Data Release 1 (DR1) and subsequent releases. Systematic scientific exploitation of these datasets frequently requires extraction of source-specific cutouts; however, the scale of modern surveys renders...
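A minimal sketch of extracting one such cutout with astropy (file name and target coordinates are placeholders; the Euclid-specific service discussed here is not shown):

```python
# Single-source cutout with astropy (placeholder file and coordinates; not the
# Euclid cutout service itself).
import astropy.units as u
from astropy.coordinates import SkyCoord
from astropy.io import fits
from astropy.nddata import Cutout2D
from astropy.wcs import WCS

with fits.open("euclid_vis_tile.fits") as hdul:
    data = hdul[0].data
    wcs = WCS(hdul[0].header)

target = SkyCoord(ra=150.1 * u.deg, dec=2.2 * u.deg)
cutout = Cutout2D(data, position=target, size=10 * u.arcsec, wcs=wcs)

fits.writeto("source_cutout.fits", cutout.data,
             header=cutout.wcs.to_header(), overwrite=True)
```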
ESA's Euclid cosmology mission was launched in 2023 and during its five-year lifetime it will observe one third of the sky at unprecedented resolution with its optical and near-infrared instruments.
The Euclid data will be hosted by the Euclid Science Archive, within the ESAC Science Data Centre (ESDC) in Madrid. The ESDC is in charge of the development, operations and maintenance of...
The Keck Observatory Archive ([KOA][1]) curates all observations acquired at the W. M. Keck Observatory. The archive is expected to grow rapidly as complex new instruments will soon be commissioned and as the expectations of archive users have expanded. In response, KOA has been implementing a new Python based VO-compliant query infrastructure. We have deployed real time ingestion of newly...
The Euclid satellite is an ESA mission that launched in July 2023. Euclid aims to observe an area of 14,000 deg^2 with two instruments, the Visible Imaging Channel (VIS) and the Near IR Spectrometer and imaging Photometer (NISP), down to VIS = 24.5 mag (10 sigma). Ground-based imaging data in griz from surveys such as the Dark Energy Survey and Pan-STARRS complement the Euclid data to enable...
BlueMUSE is a new blue-optimized, medium spectral resolution, panoramic integral field spectrograph being developed for ESO's VLT. While building on the legacy of the much requested MUSE instrument, its blue wavelength coverage to the atmospheric cutoff (~350 nm) will make it unique. In Galactic, extragalactic, and high-redshift domains, BlueMUSE will enable new science not possible with...
Most traditional data reduction pipelines are run on an investigator's local machine or on remote machines requiring a manual touch to be executed. This approach leads to data discoverability and reproducibility issues. Additionally, observatory sites are often remote, and one of the major challenges is facilitating data transfer from site to site. Data Central's Apache NiFi system aims to...
Bayesian imaging of astrophysical measurement data shares universal properties across the electromagnetic spectrum: it requires probabilistic descriptions of possible images and spectra, and of instrument responses. To unify Bayesian imaging, we present the Universal Bayesian Imaging Kit (UBIK). Currently, UBIK allows X-ray satellite data imaging for Chandra and eROSITA and soon radio...
The Nightly Digest (ND) is a web application that condenses Rubin Observatory operations into clear visual indicators and key metrics of efficiency, downtime, and key events for a large and diverse cohort of stakeholders.
Developed by the Astronomy Data and Computing Services (ADACS) team in Australia in collaboration with the Rubin Telescope and Site Software team as part of...
The Mikulski Archive for Space Telescopes (MAST) is the primary archive for the soon-to-launch Nancy Grace Roman Space Telescope. As part of that mission, a science platform known as the Roman Research Nexus (RRN) is being built to make it possible for the user community to view and analyze data at the scales Roman will produce it. As part of this effort, MAST is developing a set of tools that...
In computer science, the field of entertainment computing covers aspects such as game design, computer graphics, human-computer interaction, and artificial intelligence. At first glance, this application area seems far removed from astronomical research. On second thought, we discover many challenges that both areas encounter.
Dealing with large and complex data is a common challenge...
The largest astronomical catalogues now exceed the capacity of a single machine, hence multiple groups have been experimenting with clustered database solutions to scale beyond it. This BoF will provide a forum for sharing what solutions work and do not work, issues around licensing and possible solutions, and how to take existing astronomical database extensions and use...
2025 marks thirty years since the first release of WCSLIB, a software library closely linked to FITSWCS, the FITS World Coordinate System standard. From the start, WCSLIB informed the development of FITSWCS and now provides one of several practical implementations. In this talk I will describe WCSLIB's origins, its close connection with FITSWCS, and major milestones in their evolution. I...
As astronomy data sets become larger, efficient data ingestion systems are required to ensure science ready data products are promptly available to the community. Within Data Central’s Ingestion system, bottlenecks were identified with the use of py-spy and diagnostic queries against the ingestion database. Rectifying inefficient database usage resulted in an 8 times speed up of some ingestion...
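The specific fixes are not detailed here; as one common pattern behind such speed-ups, replacing per-row inserts with a single batched statement (illustrative only, using sqlite3 rather than the actual ingestion database):

```python
# Illustrative only: batching INSERTs is one typical remedy for the kind of
# inefficient database usage described above (sqlite3 and the schema are
# placeholders, not Data Central's actual system).
import sqlite3

rows = [(i, f"source_{i}", 18.5 + 0.001 * i) for i in range(100_000)]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE photometry (id INTEGER, name TEXT, mag REAL)")

# Slow pattern: one statement (and often one transaction) per row
# for row in rows:
#     con.execute("INSERT INTO photometry VALUES (?, ?, ?)", row)

# Faster pattern: one batched statement inside a single transaction
with con:
    con.executemany("INSERT INTO photometry VALUES (?, ?, ?)", rows)
```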
The SIXTE (SImulation of X-ray TElescopes) software is a general end-to-end simulation toolkit for X-ray observations, covering the full observation process from source photon generation to detector readout and the production of high-level output files. It is the official simulator for existing and future X-ray missions, such as eROSITA, NewAthena, THESEUS and AXIS.
Originally being...
Implementing a test process for long-lived scientific software with complex dependencies and many layers of code requires a change in perspective and culture within the entire development team. But it can be done! I will present some of the challenges we have faced in testing the [CASA][1] software, how we went from having a few tests to having too many tests, and why this needs to be...
Astronomical research increasingly depends on complex web-based user interfaces for data exploration, pipeline configuration, and visual inspection of results for quality assurance. As these interfaces grow in sophistication and user expectations rise, ensuring their reliability and usability across diverse environments becomes a critical challenge.
At Data Central we integrate automated UI...
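As a sketch of what an automated browser-level check can look like (Playwright is an assumed example framework and the URL a placeholder; the abstract does not specify the tooling used at Data Central):

```python
# Minimal browser-level UI check; Playwright and the URL are assumptions for
# illustration, not necessarily the tooling described above.
from playwright.sync_api import sync_playwright


def test_archive_search_page_loads():
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto("https://example.org/archive/search")
        # Fail if the page title does not contain the expected keyword
        assert "search" in page.title().lower()
        browser.close()
```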
The Cherenkov Telescope Array Observatory (CTAO) is the next-generation
very-high-energy gamma-ray observatory currently under construction.
With tens of telescopes planned at two sites, one in each hemisphere, it
will provide a significant improvement over current instruments in
sensitivity, energy range, and resolution. CTAO will generate tens of
petabytes per year, with a first analysis of...
Software tools and the algorithms underlying them have become critical to the advancement of astronomical research. The contribution of those who develop astronomical software can and should be directly linked to the discoveries made using these tools. The American Astronomical Society Journals, including the Astrophysical Journal and the Astronomical Journal, explicitly welcome articles whose...
In science, the lifecycle of software products is typically managed with limited resources while facing unlimited demand. Scientific software requirements are necessarily often dominated by internal project specifications and deadlines, but these internal priorities, while beneficial for the community as a whole, do not always align with the individual needs of our ultimate customers: general...
The Astro Data Lab science platform recently marked eight years of operational service, a significant milestone in the fast-evolving domains of big data research, software development, and computational infrastructure. Initially designed to host and analyze data from the Dark Energy Survey, Data Lab has expanded its scope far beyond these (modest) first goals. Now integral to the success of...
LOFAR is a high-throughput data facility facing several non-trivial technical
challenges in data processing and storage. Since the start of science operations, LOFAR has accumulated over 50 petabytes of data in its science data archive. Following a major upgrade of the instrument, it is expected that over the course of the next five years of operations the archive will grow to well over 100...
Astronomy’s data lifecycle is no longer defined solely by storage and processing technologies, but equally by the social and organizational structures that enable long-term usability, interoperability, and trust. Within ESA’s ESAC Science Data Centre (ESDC), we face these challenges at scale through missions such as Gaia and Euclid, and in the development of the Euclid Data Space (EDS)...
It has recently become possible to numerically simulate large, representative volumes of the Universe. These cosmological (magneto)hydrodynamical simulations solve for the coupled evolution of gas, dark matter, stars, and supermassive black holes interacting via self-gravity and fluid dynamics, all within the context of an expanding spacetime.
The IllustrisTNG...
Image generation is an important step in the modern astronomy data analysis workflow. It provides quick-look diagnostics on raw data or during the data reduction stages, enabling visual identification or classification of sources and features, and the presentation of the data to the larger scientific community. Traditionally, these images are created from stacking three (or more) scaled single...
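The traditional stacking of three scaled single-band images into a colour composite can be done, for instance, with astropy's Lupton scheme (band-to-channel assignment, file names, and stretch parameters are illustrative):

```python
# Quick-look RGB composite from three single-band frames, as described above
# (file names and stretch parameters are illustrative).
from astropy.io import fits
from astropy.visualization import make_lupton_rgb

r = fits.getdata("image_i.fits").astype(float)
g = fits.getdata("image_r.fits").astype(float)
b = fits.getdata("image_g.fits").astype(float)

rgb = make_lupton_rgb(r, g, b, stretch=0.5, Q=10, filename="quicklook.png")
```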
Twitter spent ten years as the de facto online platform for astronomy networking and outreach. However, recent events have seen it devolve into a politicized and ineffective platform for science communication and networking. The loss of Twitter has shown how fleeting online spaces can be. This raises the question: can we do better, or are astronomers doomed to always have their online homes...
The exponential growth in size and complexity of astronomical datasets from space missions presents significant computational and infrastructural challenges. ESA’s Euclid mission has already produced petabytes (PB) of processed data and is projected to produce 30 PB over its operational lifetime. Analysing and processing data on this scale requires specialised infrastructure and...
The discovery of transient phenomena—such as Gamma-Ray Bursts (GRBs), Fast Radio Bursts (FRBs), stellar flares, novae, and supernovae—together with the emergence of new cosmic messengers like high-energy neutrinos and gravitational waves, has revolutionized astrophysics in recent years. To fully exploit the scientific potential of multi-messenger and multi-wavelength follow-up observations, as...
The GAIA Datamining Platform provides interactive, JupyterHub-based access to the GAIA Data Release 3 dataset, which comprises 7TB of data.
The GAIA Data Release 4 dataset is expected to be in excess of 600TB.
We describe our progress in evolving the GAIA Data Mining Platform to a modern, Kubernetes-based, platform-independent deployment, named Astroflow, adding Dask functionality to...
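As a hedged sketch of the kind of Dask workflow being added (not Astroflow's own API; the Parquet path is a placeholder and the column names follow the Gaia archive schema):

```python
# Lazy, out-of-core selection over a partitioned catalogue with Dask
# (placeholder path; not Astroflow's internal interface).
import dask.dataframe as dd

gaia = dd.read_parquet(
    "gaia_dr3/*.parquet",
    columns=["source_id", "ra", "dec", "parallax", "phot_g_mean_mag"],
)

# Nothing is read until .compute(); the filter runs partition by partition
nearby_bright = gaia[(gaia["parallax"] > 10) & (gaia["phot_g_mean_mag"] < 12)]
n = nearby_bright["source_id"].count().compute()
print(n, "nearby bright sources selected")
```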
The James Webb Space Telescope is producing a firehose of extragalactic imaging data through its diversity of legacy programs. Community-organized initiatives, such as the Dawn JWST Archive, have come to fill the gap between archive products and uniformly reduced data that enable large-scale exploration and analysis. These programs are catalyzing further initiatives to generate value-added...
Modern astronomical surveys such as HST, JWST, Euclid, and LSST are generating petabyte-scale imaging archives across multiple wavelengths and epochs. Traditional image retrieval methods, based solely on metadata such as sky position, filter, or exposure time, are insufficient to identify objects with similar visual or physical characteristics. To enable efficient discovery in these...
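A generic sketch of similarity-based retrieval over precomputed image embeddings (the embedding model and file name are assumptions; the approach taken in this work is only summarized above):

```python
# Generic cosine-similarity lookup over precomputed image embeddings
# (embedding source and file name are hypothetical).
import numpy as np

embeddings = np.load("cutout_embeddings.npy")      # shape (n_images, n_dims)
query = embeddings[42]                             # embedding of a query cutout

norms = np.linalg.norm(embeddings, axis=1) * np.linalg.norm(query)
similarity = embeddings @ query / np.clip(norms, 1e-12, None)

best = np.argsort(similarity)[::-1][1:6]           # top 5, skipping the query itself
print("Most similar image indices:", best)
```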
With the new generations of large-scale surveys, we are faced with an avalanche of data that are no longer “images” but “cubes”, whose third dimension is either temporal or spectral. In this new era, traditional hierarchical science platform visualisation methods must evolve to exploit this third dimension.
Building on the Hierarchical Progressive Survey method – endorsed by the IVOA and...
Arrays of Cherenkov telescopes detect ultra-short (nanosecond) flashes of blue light produced when high-energy gamma rays hit Earth’s atmosphere, triggering particle cascades. The upcoming Cherenkov Telescope Array Observatory (CTAO) will generate hundreds of petabytes of data annually, requiring extensive atmospheric monitoring and rich metadata to reconstruct event lists, images, spectra,...
The amounts of raw data in next-generation observatories, such as the Square Kilometre Array Observatory (SKAO), will be so large that they cannot be archived in their entirety, but must be significantly reduced. This is well known in high-energy physics, particularly at the Large Hadron Collider (LHC), where the data streams captured by the detectors are reduced by several orders of magnitude...
The latest SDSS Data Release 19 comes with a new suite of tools for helping astronomers and students visualize and analyze the vast richness of this dataset. In this demo we will showcase several of these tools, including the Zora web application - a modern and reactive interface for searching SDSS data, exploring observed target metadata, and visualizing or accessing spectral data - and the...
Advanced science platforms must handle large data volumes, complex workflows, and collaborations that span multiple disciplines and partners. While scientific questions may differ between fields, the challenges of building reliable and reproducible data-driven research infrastructures are very similar.
This talk demonstrates how a geophysics-oriented science platform integrates well known...
Whilst the Research Data and Software (RDS) team at the AAO has more than doubled in size, the nature of our funding is such that there is very limited scope to hire additional infrastructure support personnel. Taking inspiration from the Southern African Large Telescope (SALT) Astronomy Operations user support model, we have introduced the concept of a "DevOps Roster" to ameliorate the load...
Next year the Fifth Generation of the Sloan Digital Sky Surveys (SDSS-V) will launch the 20th SDSS public data release, 25 years after its very first early data release appeared on-line in 2001. Much has changed in SDSS over that time: telescopes have been added, new instruments have been built, and old instruments have been retired. What has remained however is the commitment to make SDSS...
By combining data from different messengers (electromagnetic radiation, gravitational waves, neutrinos, cosmic rays, ...), one can gain a better understanding of the physics of the universe. A milestone was the binary neutron star merger of 2017, seen in gravitational waves by LIGO/Virgo and followed by a gamma-ray burst, an optical/infrared kilonova, and X-ray and radio counterparts.
There are...
The ADASS POC conducted a community survey. We would like to use its results to start a conversation on the future of ADASS.
The advances of generative AI are staggering and progress is expected to continue at high speed. In the near future astronomers will likely be able to use AI agents to accompany them in the entire process from proposal preparation to archival search, data analysis and publication. How to leverage the advantages for astronomy? How to mitigate the risks?
In this BoF we try to look into the...
In 2022 I transitioned from research in computational astrophysics to healthcare system modelling. I found most of the tools, techniques, and skills acquired during my work in astrophysics to be readily translatable to the new field, and the collaboration with experts from different backgrounds an extremely positive and stimulating experience for all those involved. In this talk, I will...
Current and upcoming astronomical surveys (e.g., SKA, LSST, Euclid, or even ALMA) present a significant data processing challenge, with data volumes that overwhelm traditional, single-node analysis workflows. Many of our community's essential analysis tools are built within the Python ecosystem, but they often struggle to scale to the high-performance computing (HPC) resources required for...
The quantities of data produced by next generation instruments such as the SKA, the DSA2000 and the ngVLA require new software ecosystems to convert observational data into science ready data products.
Traditionally, data and compute at such scales have been handled using HPC software and infrastructure. While this approach is still relevant going forward, the advent of (1) ubiquitous...
XMMGPT is a dual-purpose project which aims to serve as a unique access point to aid astronomers in their research with XMM-Newton data, and as an exploration of language models and AI systems within European Space Agency (ESA) workflows.
The system comprises four main parts: a heavily customized Agentic Retrieval Augmented Generation (RAG) pipeline, a visibility checker tool, a...
SciServer is a high-impact, highly successful Science Platform with a well-developed existing code base; an established user community; and demonstrated impact on scientific discovery, research, and education. SciServer has demonstrated a transformational impact in astronomy, providing collaborative features such as groups and file sharing, and free computational resources to access large...
Dadaflow is a new open source C++/CUDA library for the rapid development of high-performance, modular radio astronomy processing pipelines. Exploiting recent C++20 features, it provides strongly typed multidimensional data structures, a robust graph-based pipelining framework, and tooling for runtime control and interprocess communication. At its core, Dadaflow represents processing...
We present a novel algorithm designed to detect and correct cosmic ray (CR) hits in astronomical images obtained with the MEGARA integral field spectrograph at the Gran Telescopio Canarias (GTC). Traditional approaches in the MEGARA Data Reduction Pipeline (DRP) rely on median stacking of multiple exposures to mitigate CR contamination. However, this method becomes less effective for long...
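The traditional median-stacking rejection referred to above can be sketched as follows (the new single-exposure algorithm itself is not reproduced here):

```python
# Traditional CR rejection by median-combining repeated exposures, as described
# above; synthetic frames only, and not the new MEGARA algorithm.
import numpy as np


def median_stack(exposures):
    """Median-combine a list of co-registered 2-D exposures."""
    return np.median(np.stack(exposures, axis=0), axis=0)


frames = [np.ones((100, 100)) for _ in range(3)]
frames[1][50, 50] = 5000.0          # simulated cosmic-ray spike in one frame
clean = median_stack(frames)
assert clean[50, 50] == 1.0         # the outlier is rejected by the median
```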
This poster presents the China-VO (NADC) Dataset Metadata Specification, a foundational framework developed by the Chinese Virtual Observatory (China-VO) National Astronomical Data Center (NADC) to standardize the description and discovery of astronomical datasets. As data volumes grow exponentially, this specification addresses the critical need for a unified metadata standard to ensure data...
The increasing integration of astronomical data services with machine learning workflows has led to unprecedented demand on web-based astronomical databases. The SIMBAD astronomical database, operated by the Centre de Données astronomiques de Strasbourg (CDS), has experienced a significant surge in API requests, particularly from automated systems and AI model training pipelines that span...
Modern scientific simulations generate petabyte-scale datasets that exceed available memory, forcing researchers to compromise between simulation duration and resolution. We present Adaptive Quantization Networks (AQN), a neural compression method that learns to identify scientifically important features and allocates bits accordingly, rather than spreading error uniformly like traditional...
LAMOST’s double revolving fiber positioning is vital for efficient spectroscopy, with accuracy requirements of 0.″2. Traditionally, closed-loop control relies on back-illumination at fiber ends, but this study proposes a front-illumination method using focal plane images. It eliminates internal spectrograph lighting, reducing light pollution and avoiding extra photography. An AI model, trained...
To address the challenge of manually analyzing the approximately 50 daily X-ray transient candidates from the Einstein Probe (EP) satellite—a process that can take 10-30 minutes per source—we have developed an AI-driven Real-Time Transient Identification Assistance System. Built upon the AI Agent framework and leveraging Large Language Models, the system is designed to emulate an experienced...
The Astronomy Open Science Competence Centre Pilot project (Astro-CC Pilot) is an EU funded activity meant to enable the astronomy research communities to accelerate their use of Open Science by supporting the implementation of FAIR principles.
It will run community 'competence centre' events, to provide training on the implementation of interoperable services, the development of the...
BHTOM.space is a powerful platform designed to coordinate the photometric observations from a heterogeneous global network of small and medium telescopes, including both professional and amateur observatories. Built on a modular Django/Python backend with PostgreSQL and RESTful APIs, it supports automated ingestion and processing of photometric data (FITS, CSV), real-time cross-matching, and...
BlueMUSE is a new blue-optimized, medium spectral resolution, panoramic integral field spectrograph being developed for ESO's VLT. While building on the legacy of the much requested MUSE instrument, its blue wavelength coverage to the atmospheric cutoff (~350 nm) will make it unique. In Galactic, extragalactic, and high-redshift domains, BlueMUSE will enable new science not possible with...
The European Space Agency’s Euclid mission and the Vera C. Rubin Observatory’s Legacy Survey of Space and Time (LSST) are poised to revolutionize astrophysics. Euclid delivers sharp space-based imaging in one wide optical band with additional photometry in three NIR filters, while LSST provides deep multi-band photometry across six filters (u, g, r, i, z, y). Individually, these...
With the emergence of Large Language Models (LLMs), access to information encoded in text for scientific purposes has significantly increased, providing a source for the augmentation of scientific practices within physics collaborations and for open science. In this contribution, we describe the generation of interfaces to internal and external sources of scientific information and their application to...
CIELOS (Canary Islands data cEnter for astronomicaL Observations and
Simulations) will be an initiative led by the Instituto de
Astrofísica de Canarias (IAC) to manage, process, and archive the
large volumes of data produced by the Observatorios de Canarias
(OCAN). Designed to support both observational and simulation-based
research, CIELOS aims to become a key node in the...
Stellar occultations are a powerful method for determining the physical properties of small Solar System bodies such as asteroids, Centaurs, and trans-Neptunian objects (TNOs). By recording the precise moments a body passes in front of a background star, occultations provide accurate measurements of its projected profile, enabling size and shape reconstruction, especially when combined with...
The China Space Station Survey Telescope (CSST) is a next-generation Stage-IV sky survey telescope, which is scheduled to be launched in 2027. It is equipped with five scientific instruments, i.e. Multi-band Imaging and Slitless Spectroscopy Survey Camera (SC), Multi-Channel Imager (MCI), Integral Field Spectrograph (IFS), Cool Planet Imaging Coronagraph (CPI-C), and THz Spectrometer (TS). Due...
The Monitor of All-sky X-ray Image (MAXI), installed on the Exposed Facility of the Japanese Experiment Module "Kibo" on the International Space Station (ISS) in 2009, has been providing valuable data for X-ray astronomy ever since. The Data ARchives and Transmission System (DARTS: https://darts.isas.jaxa.jp), operated by JAXA, makes MAXI event data publicly available, enabling researchers to...
The Effelsberg Direct Digitization backend is a multi-science computing backend
for radio telescopes operating on commercial-off-the-shelf hardware. It
currently drives data recording of four independent telescopes, running on very
different computing clusters ranging from 2 to 36 HPC processing nodes. To
optimize the reliability and efficiency of software development, release...
One of the major challenges in astrophysics in the coming years will be managing and processing the large amounts of data that will be generated by the new generation of telescopes such as the Square Kilometre Array Observatory (SKAO). Of particular interest here is the ease of use of data services and the simple setup of data pipelines. The solution to this problem lies in...
The NSF National Radio Astronomy Observatory (NRAO) is expanding its scientific data archive to include legacy radio astronomy survey products, beginning with the ingestion of neutral hydrogen (HI) spectral-line data cubes from the Arecibo Legacy Fast ALFA (ALFALFA) survey. Conducted with the 305-m Arecibo radio telescope from 2005 to 2012 with its seven-beam ALFA receiver, ALFALFA represents one...
Accurate classification of astronomical light curves is important for interpreting the large datasets produced by modern surveys. However, most existing feature sets overlook the nonlinear dynamics inherent to the astrophysical systems that generate these signals. In this work, we explore features derived from nonlinear time-series analysis and assess their utility for light-curve...
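As one representative nonlinear time-series feature (the specific feature set explored in this work is only summarized above), permutation entropy can be computed as:

```python
# Permutation entropy: a simple nonlinear feature contrasting regular and
# stochastic signals (illustrative; not necessarily part of the feature set above).
import math
from collections import Counter

import numpy as np


def permutation_entropy(x, order=3):
    """Normalised Shannon entropy of ordinal patterns of length `order`."""
    patterns = Counter(
        tuple(np.argsort(x[i:i + order])) for i in range(len(x) - order + 1)
    )
    p = np.array(list(patterns.values()), dtype=float)
    p /= p.sum()
    return float(-np.sum(p * np.log(p)) / math.log(math.factorial(order)))


rng = np.random.default_rng(0)
print(permutation_entropy(np.sin(np.linspace(0, 20, 500))))  # low: regular signal
print(permutation_entropy(rng.normal(size=500)))             # near 1: white noise
```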
We present a comprehensive investigation of 10 detached eclipsing binaries composed of main-sequence stars, aiming to refine the precision of stellar parameter determination through multi-wavelength and multi-instrument data integration. This work combines spectroscopic data from the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST), photometric observations from TESS, and...
Current applications of machine learning in astrophysics focus on teaching machines to perform domain-expert tasks accurately and efficiently across enormous datasets. While essential in the big data era, this approach is limited by our intuitions and expectations, and provides at most only answers to the ‘known unknowns’. We are developing new tools to enable scientific breakthroughs by...
BlueMUSE will be an integral field spectrograph similar to MUSE, but covering a bluer wavelength range. As the two instruments are similar, the data reduction pipeline of BlueMUSE will be based on the pipeline for MUSE. While the MUSE pipeline does propagate the variance of pixels during the resampling into datacubes, it currently does not consider covariances. This can cause...
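A toy numerical illustration of the problem (numbers are illustrative): once resampling correlates neighbouring pixels, propagating variances alone underestimates the variance of any quantity summed over them.

```python
# Toy illustration: correlated resampled pixels, and the variance of their sum
# with and without the off-diagonal covariances (illustrative numbers only).
import numpy as np

var, rho = 1.0, 0.4                               # pixel variance and correlation
cov = var * (np.full((3, 3), rho) + (1.0 - rho) * np.eye(3))

weights = np.ones(3)                              # e.g. summing pixels in an aperture
full_variance = weights @ cov @ weights           # includes covariances: 5.4
naive_variance = np.sum(np.diag(cov))             # variances only: 3.0

print(full_variance, naive_variance)              # the naive estimate is too small
```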
Quasar accretion disks, predicted to emit optical continuum from regions spanning light-hours, remain challenging to resolve via traditional reverberation mapping (RM) techniques, which are limited by (1) long, daily-to-monthly observational cadences and (2) oversmoothing in damped random walk (DRW) time-series models. We present a 6-month, high-cadence (180 s exposure, 3-5 hours per night), cost-effective...
We will present the HiPS2FITS 3D prototype, the counterpart of HiPS2FITS for generating cube cutouts from HiPS 3D.
We will detail the technical challenges we faced and will present how this new service allows for analysis of cube data once the data of interest have been found using Aladin Lite or Aladin Desktop "visualization in context".
Our science depends on public digital resources, including data archives, software repositories, computational infrastructures, and catalogs. Recent actions by non-scientists have led to the loss of important digital resources in other fields, and even astrophysics assets may face uncertain futures. This poster invites astronomers to reflect on how dependent our research is on shared resources...
Bibliographies are a core tool used by observatories to evaluate the impact of their facilities and instruments. Yet, identifying and classifying papers referencing specific instruments is usually a manual, time-intensive task. We developed a large language model (LLM)-augmented pipeline to automatically construct a comprehensive list of instruments referenced across the full astronomy corpus...
We will present the implementation of the HiPS3D extension in Aladin Lite, enabling efficient visualization and exploration of large cube datasets directly in the browser. This work, carried out within the SRCNet framework, demonstrates how HiPS3D technology can support interactive navigation through GBs-TBs of spectral cube data. This article will focus on the technical details of the...
We attach great importance to opening our web pages and astronomical data services to all, particularly to people with disabilities. Tools already exist (audio conversion of web pages, etc.), and the idea was not to reinvent the wheel, but to do the additional work so that our content could be processed by these tools and to offer some new features. We began by raising our own awareness of different...
The X-ray Integral Field Unit (X-IFU) aboard the upcoming NewAthena mission relies on precise pulse detection and reconstruction for high-resolution X-ray spectroscopy. In its current baseline configuration, the pulse detection algorithm employs a one-sided derivative kernel (typically [1, 1, -1, -1]) to enhance pulse edges, triggering on threshold crossings of the filtered signal. However,...
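In essence, the baseline trigger convolves the record with that kernel and flags upward threshold crossings of the filtered signal; a minimal numpy sketch with synthetic data (threshold and pulse shape are illustrative):

```python
# Baseline-style edge trigger: convolve with the one-sided derivative kernel
# [1, 1, -1, -1] and flag upward threshold crossings (synthetic record;
# threshold and pulse shape are illustrative).
import numpy as np

rng = np.random.default_rng(1)
record = rng.normal(0.0, 1.0, 2000)
record[800:] += 50.0 * np.exp(-np.arange(1200) / 20.0)   # synthetic pulse at sample 800

kernel = np.array([1.0, 1.0, -1.0, -1.0])
filtered = np.convolve(record, kernel, mode="same")      # enhances rising edges

threshold = 20.0
crossings = np.flatnonzero((filtered[:-1] < threshold) & (filtered[1:] >= threshold))
print("Trigger samples:", crossings)
```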
After 45 years in operation, and two decades of remote queued service observing (QSO), Canada France Hawaii Telescope is developing Kealahou: a reconstruction of our entire QSO software system.
Kealahou started as a platform for a single spectrograph, and has since grown to host CFHT’s main suite of instruments, a “Phase 1” proposal submission and review system, and science operations...
We present a new value-added parameter catalog for the LAMOST (Large Sky Area Multi-Object Fiber Spectroscopic Telescope) survey, produced by a spectral foundation model that unifies low- and medium-resolution LAMOST spectra with high-precision labels from multiple high-resolution surveys. The model, built upon the SpecCLIP framework, learns a shared latent space among spectra of different...
The Stratospheric Observatory for Infrared Astronomy (SOFIA) gathered a considerable amount of scientific data between first light in May 2010 and the final observing flight in September 2022. The joint mission by NASA and DLR produced a diverse set of astronomical data from several science instruments.
During the active time of the mission the SOFIA Data Processing Software Team worked...
We have developed software for detecting moving objects in a large catalog database, based on PDR3-DUD (Public Data Release 3 for the Deep and Ultra Deep survey) of the Subaru Strategic Survey Project.
By comparing the catalogs based on the coadded image with those from each individual exposure, we extract objects detected in an exposure image but without a coadd detection as moving and/or transient object...
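A hedged sketch of that comparison using an astropy cross-match (synthetic coordinates, not the PDR3-DUD catalogs; the project's actual matching criteria may differ):

```python
# Keep exposure-catalog detections with no coadd counterpart within a matching
# radius (synthetic catalogues; the real matching criteria may differ).
import astropy.units as u
import numpy as np
from astropy.coordinates import SkyCoord

rng = np.random.default_rng(0)
coadd = SkyCoord(ra=rng.uniform(150, 151, 500) * u.deg,
                 dec=rng.uniform(1, 2, 500) * u.deg)
exposure = SkyCoord(ra=rng.uniform(150, 151, 520) * u.deg,
                    dec=rng.uniform(1, 2, 520) * u.deg)

idx, sep2d, _ = exposure.match_to_catalog_sky(coadd)
candidates = exposure[sep2d > 1.0 * u.arcsec]     # no coadd detection nearby
print(candidates.size, "exposure-only (moving/transient) candidates")
```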
Advancements in AI have propelled multi-modal models to the forefront of astronomical research, particularly in time-domain astronomy. These models integrate diverse data types—images, light curves, spectral data, and metadata—to enhance analysis and prediction of dynamic celestial phenomena like supernovae, variable stars, X-ray bursts, and tidal disruption events. The Time Domain...
Our aim is to provide and develop infrastructure and services for the PUNCH sciences in Germany that interface data and resources by international initiatives and providers to enable efficient data analysis and data management according to the FAIR principles.
Modern astronomical observatories generate a torrent of complex data, offering an unprecedented opportunity for discovery. This presents a classic challenge: how to effectively harness this abundance. In China, we are answering this call by strategically engaging the public through citizen science, transforming science data to a broader range of fields. This poster presents a comprehensive...
We present a case-study in migrating a mask design software application for a Multi-Object Spectrograph (MOIRCS, operating at Subaru Telescope) from an IDL implementation to a pure Python one. The port accomplished several goals, including:
1) freeing users from onerous licensing restrictions,
2) improving the overall stability and responsiveness of the program, and
3) improving the...
Within the RADIOBLOCKS EU project, JIVE, in collaboration with
partners, is investigating state-of the art accelerator technologies
for implementing a correlator for Very Long Baseline Interferometry
(VLBI). In particular we are developing GPU kernels that implement
the same algorithm as the SFXC CPU correlator. SFXC is the current
production correlator for the European VLBI Network...
In this work, we try to find a set of algorithms to constrain the cosmological parameters using a quasar dataset. Quasars are a potential cosmic probe that can fill the gap between the farthest observed Type Ia supernovae and the Cosmic Microwave Background (CMB). Quasars have been observed out to a redshift of z ~ 7.1. They can give valuable insight into the tensions of the...
In an era of increasingly complex and numerous astronomical instruments, a standardized description of observation targets is increasingly crucial for the efficient planning and archiving of astronomical observations.
To ensure seamless out-of-the-box interoperability between different systems and instruments, ESO has internally proposed a new unified standard (ESO-371803, “Astronomical...
The emerging technology of Data Science platforms addresses the need to analyze Big Data directly where it is stored, because moving large volumes of data is nearly impossible. Typical platforms, like Pangeo, ESA DataLab, and SciServer are complex cloud-based systems, running on large clusters, providing hundreds of users access to petabyte-scale astronomical data archives. They use flexible...
Software development has become an essential part of every sub-field of astronomy. Because of that, software citation is crucial for crediting earlier work, motivating funding, and encouraging reproducible and collaborative science. While there exists a well-developed ecosystem of tools and services to assist with citations to traditional publications, such as ADS/SciX, this infrastructure...
The Space-based multi-band astronomical Variable Objects Monitor (SVOM) mission, a joint Sino-French collaboration launched in 2024, is designed to detect, localize, and study gamma-ray bursts (GRBs) and other high-energy transients. Among its onboard instruments, the Microchannel X-ray Telescope (MXT) plays a central role by providing follow-up X-ray observations of GRB afterglows and other...
We present the Bayesian framework "Incliscope", a novel approach to estimating the inclinations of galaxies from optical images. In contrast to traditional methods, our solution does not rely on fitting ellipsoids in order to solve the axis-ratio equation. Instead, we use a probabilistic approach to obtain properly calibrated posterior distributions over inclinations from simulated...
Classifying and summarizing extensive datasets from diverse sky surveys is essential for advancing astronomical research. Integrating data from 4XMM-DR13 (X-ray), SDSS DR18 (optical), and CatWISE (IR) surveys, we constructed the XMM-WISE-SDSS sample. Cross-matching with SDSS/LAMOST spectral classifications provided a training set of stars, galaxies, quasars, and young stellar objects (YSOs)....
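As a hedged illustration of the classification step that such a training set enables (scikit-learn Random Forest chosen arbitrarily; features and labels below are synthetic stand-ins for the X-ray/optical/IR photometry and spectral classes described above):

```python
# Illustrative multi-class classification on synthetic stand-in features;
# not the classifier or feature set actually used in this work.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 6))                    # e.g. X-ray/optical/IR fluxes and colours
y = rng.integers(0, 4, size=5000)                 # star / galaxy / quasar / YSO labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("hold-out accuracy:", clf.score(X_test, y_test))
```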
The Python package teareduce has been developed to support teaching activities related to the reduction of astronomical data. Specifically, it serves as instructional material for students participating in practical classes on the processing of astronomical images acquired with various instruments and telescopes. These classes are part of the course Experimental Techniques in Astrophysics,...
The Canadian Data-Intensive Astrophysics PLatform (CanDIAPL) is a national, production-grade ecosystem that lets Canadian astronomers use petabyte-scale survey streams rather than drown in them. CanDIAPL will pair on-site streaming compute at SKA pathfinders (MeerKAT/South Africa; MWA/Australia) with dedicated off-site storage and analysis at the Alliance/CANFAR Nibi (formerly ‘Graham’) cloud...
The ASTRI Mini‑Array is an international project led by the Italian National Institute for Astrophysics (INAF) to build and operate nine dual‑mirror imaging atmospheric Cherenkov telescopes for very‑high‑energy (TeV) gamma‑ray astronomy and stellar intensity interferometry. This paper presents the Startup System, an observatory‑wide startup/shutdown orchestrator and monitoring layer that...
Julia is a modern, high-level, dynamically typed programming language designed for high-performance numerical and scientific computing. It combines the interactivity and ease of use of languages like Python or MATLAB with execution speeds approaching those of C/C++ and Fortran, enabled by its LLVM-based just-in-time (JIT) compilation and powerful multiple dispatch paradigm. A key advantage of...
NSF-DOE Vera C. Rubin Observatory's upcoming Legacy Survey of Space and Time (LSST) will process 20 terabytes of raw images into 10 million transient alerts per night, every night for ten years. The Prompt Processing system deployed at the SLAC National Accelerator Laboratory automatically handles incoming images and generates alerts in near real time. To meet the ambitious throughput,...
During 783 scientific flights, SOFIA, the Stratospheric Observatory for Infrared Astronomy of DLR and NASA, collected a wealth of scientific data, now available through IRSA at IPAC. After the end of flight operations in September 2022, during one year of post-operations, only a limited data reprocessing of SOFIA Observing Cycles 5 to 9 could be achieved. To complete the job, the SOFIA Data Center...
The ASTRI Mini-Array is an international project led by the Italian National Institute for Astrophysics (INAF) to construct and operate an array of nine dual-mirror Imaging Atmospheric Cherenkov Telescopes. The primary goal is to study very high-energy (TeV) gamma-ray sources and perform stellar intensity interferometry. This paper describes the design and implementation of the Startup System,...
Bayesian imaging of astrophysical measurement data shares universal properties across the electromagnetic spectrum: it requires probabilistic descriptions of possible images and spectra, and of instrument responses. To unify Bayesian imaging, we present the Universal Bayesian Imaging Kit (UBIK). Currently, UBIK allows X-ray satellite data imaging for Chandra and eROSITA and soon radio...
Debian Astro is a Debian Pure Blend dedicated to astronomy and astrophysics, providing a curated collection of software for observational and theoretical research, data analysis, and education. By integrating widely used packages—ranging from telescope and instrument control to data reduction pipelines and visualization tools—Debian Astro offers researchers a robust, reproducible, and fully...
Abstract: In the context of Open Science and Citizen Science, the WIVONA (We Implement Virtual Observatory Needs of Astrams) project, funded by the Gemini Pro/Am initiative at Observatoire de Paris, aims to promote software interoperability as well as to provide access to astronomical data through the Virtual Observatory (VO) for the amateur astronomical community. During the first two years...
With astronomy entering an era of petabyte to exabyte scale data from next-generation telescopes and surveys, existing cloud platforms face critical data management challenges. Traditional approaches of mounting entire datasets or copying filtered subsets create trade-offs between access efficiency and storage costs, while conventional storage engines lack flexible permission control. To...