
A machine learning model detects lags in active galactic nuclei
To explore the heart of the most energetic galaxies, astronomers use 3D reverberation mapping. This technique analyzes the temporal delays, or lags, in the light emitted by the accretion disks surrounding supermassive black holes. The future Vera Rubin Observatory will generate massive amounts of data for this task, but it also poses significant challenges that call for new analysis tools 🕰️.
Lags reveal the hidden structure of the disk
Short lags arise from the time it takes for light to cross the disk, making it possible to map its radial extent. Long negative lags, on the other hand, are more subtle and harder to detect: they are linked to the time it takes for matter to flow inward, offering clues about the disk's vertical structure. Detecting the latter with traditional methods is very difficult, especially in time series with gaps or a weak signal.
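To make the relationship concrete, here is a minimal sketch of the light-crossing argument behind short lags, assuming a hypothetical disk radius of about ten light-days; the numbers and function are illustrative, not taken from the study:

```python
# Minimal illustration (not the study's analysis): a short, positive lag scales
# with the light-crossing time of the disk, tau ~ R / c, so measuring tau
# constrains the disk's radial extent.
C_KM_S = 299_792.458        # speed of light in km/s
SECONDS_PER_DAY = 86_400

def crossing_lag_days(radius_km: float) -> float:
    """Light-crossing time, in days, of a disk with the given radius in km."""
    return radius_km / C_KM_S / SECONDS_PER_DAY

# Hypothetical radius of ~10 light-days, a typical accretion-disk scale (assumed value).
radius_km = 10 * C_KM_S * SECONDS_PER_DAY
print(f"{crossing_lag_days(radius_km):.1f} days")   # -> 10.0 days
```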
Challenges for the new era of observation:
- The Rubin Observatory will observe millions of AGN, but its data will have seasonal gaps (illustrated in the sketch after this list).
- The long negative lag signal is intrinsically faint and easily masked.
- Classic analysis methods do not scale to the enormous volume of expected data.
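As a rough picture of the first challenge, the sketch below builds a simulated observing log in which a source is visible for only part of each year; the season length and cadence are assumed values, not Rubin's actual schedule:

```python
import numpy as np

# Illustrative only: what "seasonal gaps" mean for a Rubin-like light curve.
# An AGN is observable for roughly half of each year, so the sampling has yearly holes.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 10 * 365.25, size=800))   # 10 years of random epochs (days)
observable = (t % 365.25) < 182.0                     # assumed ~6-month observing season
t_obs = t[observable]
print(f"{t_obs.size} of {t.size} epochs survive the seasonal gaps")
```

Any lag-detection method has to recover a delay signal from sampling like t_obs, where roughly half of every year is simply missing.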
It seems that even supermassive black holes can be slow to respond, although in their case the delay is measured in light-days.
A transformer revolutionizes detection
To overcome these barriers, a machine learning model based on the transformer architecture has been developed and trained. The model examines simulated light curves that mimic those Rubin will produce, learning to identify both types of lags automatically and robustly.
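The article does not describe the model's internals, so the following is only a minimal sketch of the general idea: a transformer encoder that classifies (time, flux) sequences while masking out missing epochs. Every architectural choice below (embedding size, number of layers, the two-class output) is an assumption for illustration, not the published design:

```python
import torch
import torch.nn as nn

class LightCurveTransformer(nn.Module):
    """Sketch of a transformer classifier for irregularly sampled light curves.

    All choices here are illustrative assumptions, not the published model.
    """

    def __init__(self, d_model: int = 64, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        # Each epoch carries (time, flux); project it into the model dimension.
        self.embed = nn.Linear(2, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 2)   # logits: [no long negative lag, lag present]

    def forward(self, x: torch.Tensor, pad_mask: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_epochs, 2); pad_mask: True where an epoch is missing (e.g. seasonal gaps)
        h = self.encoder(self.embed(x), src_key_padding_mask=pad_mask)
        # Average only over observed epochs, then classify.
        h = h.masked_fill(pad_mask.unsqueeze(-1), 0.0).sum(1) / (~pad_mask).sum(1, keepdim=True)
        return self.head(h)

# Toy batch: 8 simulated light curves of 200 (time, flux) samples with random gaps.
x = torch.randn(8, 200, 2)
pad_mask = torch.rand(8, 200) < 0.3
logits = LightCurveTransformer()(x, pad_mask)
print(logits.shape)   # torch.Size([8, 2])
```

The padding mask lets the attention layers skip missing epochs instead of interpolating across them, which is part of why attention-based models are attractive for gappy, irregularly sampled data.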
Results that make a difference:
- The model identifies the presence of a long negative lag with 96% completeness and only 0.04% contamination.
- It predicts the lag value with 98% accuracy.
- It far outperforms established techniques: the interpolated cross-correlation function achieves 54% accuracy and javelin only a