Astronomy Object of the Month: June 2024
Cosmic Leap: Swift Satellite and AI unravel the distance of the farthest Gamma-Ray Bursts
An international team of astrophysicists has developed a novel machine learning technique to measure the distances to gamma-ray bursts, the most powerful explosions in the universe. The results were published in the Astrophysical Journal Letters (ApJL) and the Astrophysical Journal Supplement Series (ApJSS). The lead authors of these works are Dr. Maria Dainotti (NAOJ, Japan) and Mr. Aditya Narendra (Astronomical Observatory of the Jagiellonian University, Poland).
Illustration: flow-chart scheme of the Gamma-Ray Burst analysis. Credit: The Authors.
Gamma-Ray Bursts (GRBs) are the most explosive phenomena to occur in the Universe since the Big Bang. In a few seconds, a GRB releases the same amount of energy our Sun releases over its entire lifetime. Because they are so explosive and bright, GRBs can be observed out to the edge of the visible Universe, back to when it was just 500 million years old. Looking at the Universe at such large distances opens the possibility of chasing the oldest stars, which are extremely rare to observe and are thought to be the progenitors of GRBs.
GRBs are believed to occur in different ways. One is when a massive star, more than 30 times heavier than our Sun, reaches the end of its life and explodes in a spectacular supernova. These are among the brightest events in the known universe and have been observed to give rise to a particular category of GRBs called Long GRBs. The second type of event that causes GRBs is a merger: the corpses of dead stars, such as neutron stars, are drawn together gravitationally and collide, releasing a huge amount of energy in a very short time. These events have been observed to lead to Short GRBs.
GRBs are observed not only far away but also at close distances, which allows us to understand how stars evolve over time and how many GRBs can occur in a given volume of space over a given time. However, measuring the distance of a GRB is a very challenging task, owing to the paucity of telescopes pointing at these serendipitous events and the limitations of other facilities. We therefore need an indirect measurement of the distance, and this is where the Neil Gehrels Swift satellite data and machine learning come to help us.
An international team, led by Dr. Maria Dainotti, Assistant Professor at the National Astronomical Observatory of Japan, and Mr. Aditya Narendra, a final-year doctoral student at the Jagiellonian University in Kraków, Poland, has developed a well-crafted machine learning methodology for inferring the distance to GRBs based entirely on GRB properties that do not directly depend on the distance.
The novelty of this approach is that, instead of using a single machine learning model, we gather several methods together to improve their predictive power. This method is called SuperLearner, and it assigns each algorithm a weight, ranging from 0 to 1, that corresponds to the predictive power of that individual method - says Dr. Dainotti. The advantage of SuperLearner is that the final prediction always performs better than the individual models. SuperLearner is also used to discard the least predictive algorithms: we set a threshold of 0.05 for accepting a model into our ensemble. This increases the chances that the method can determine the distance of GRBs more precisely.
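To make the weighting scheme concrete, here is a minimal illustrative sketch of a SuperLearner-style ensemble in Python. It is not the authors' actual pipeline: the base models, the synthetic data, and the superlearner_weights helper are all hypothetical stand-ins; only the 0-to-1 weights and the 0.05 acceptance threshold come from the text.

```python
# Minimal SuperLearner-style sketch (illustrative only, not the authors' code).
# Assumes scikit-learn and SciPy; X and z are synthetic placeholders for
# distance-independent GRB features and their measured redshifts.
import numpy as np
from scipy.optimize import nnls
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))                            # placeholder GRB features
z = np.abs(2.0 + X[:, 0] + 0.1 * rng.normal(size=200))  # placeholder redshifts

def superlearner_weights(models, X, z, threshold=0.05):
    """Weight each base model by its out-of-fold predictive power;
    weights below `threshold` (the 0.05 cut in the text) are dropped."""
    # Stack cross-validated predictions so the weights are fit on
    # data each model has not seen during training.
    P = np.column_stack([cross_val_predict(m, X, z, cv=5) for m in models])
    w, _ = nnls(P, z)            # non-negative weights
    w /= w.sum()                 # normalise into the 0-to-1 range
    w[w < threshold] = 0.0       # discard the least predictive models
    return w / w.sum()

models = [RandomForestRegressor(random_state=0), Ridge(),
          KNeighborsRegressor()]
w = superlearner_weights(models, X, z)

# Final ensemble prediction: weighted average of the retained models.
fitted = [m.fit(X, z) for m in models]
z_pred = sum(wi * m.predict(X) for wi, m in zip(w, fitted))
print("weights:", np.round(w, 3))
```

Fitting the weights on out-of-fold predictions, rather than in-sample ones, is what lets the ensemble reward genuinely predictive models instead of overfit ones.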
Because SuperLearner can combine any machine learning models into the final ensemble, we can adapt the methodology to the data we are working with - explains Mr. Narendra. A crucial component of training the machine learning models is cross-validation, a technique that mimics the real-life performance of a machine learning algorithm and thus gives us an accurate estimate of its performance.
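As an illustration of that idea, the sketch below runs a k-fold cross-validation loop; the model, the fold count, and the synthetic data are again placeholders rather than the published setup.

```python
# Minimal k-fold cross-validation sketch (illustrative placeholders only).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import KFold

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))                            # placeholder GRB features
z = np.abs(2.0 + X[:, 0] + 0.1 * rng.normal(size=200))  # placeholder redshifts

errors = []
for train_idx, test_idx in KFold(n_splits=5, shuffle=True,
                                 random_state=0).split(X):
    model = RandomForestRegressor(random_state=0)
    model.fit(X[train_idx], z[train_idx])   # train on four fifths of the data
    pred = model.predict(X[test_idx])       # predict the held-out fifth
    errors.append(mean_squared_error(z[test_idx], pred))

# Averaging the held-out errors mimics how the model would perform on
# GRBs it has never seen, i.e. its "real-life" performance.
print(f"cross-validated RMSE: {np.mean(errors) ** 0.5:.3f}")
```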
Indeed, the team showed that the GRB rate does not follow the star formation rate at small distances, opening the possibility that long-duration GRBs at small distances may be generated not by the collapse of massive stars (with masses 30 times larger than the mass of the Sun), as was previously assumed, but rather by the merger of very dense objects such as neutron stars. This claim was developed on the basis of Swift X-ray data by Dr. Vahé Petrosian, Professor at Stanford University, and Dr. Dainotti, while the current paper confirms it for the optical data as well.
This research pushes forward the frontier in both gamma-ray astronomy and machine learning. Follow-up research and innovation can help astrophysicists achieve more precise results and, in the future, even address cosmological problems.
Original publication: Maria Giovanna Dainotti, Aditya Narendra, Agnieszka Pollo, Vahé Petrosian, Małgorzata Bogdan, Kazunari Iwasaki, Jason Xavier Prochaska, Enrico Rinaldi, David Zhou, "Gamma-Ray Bursts as Distance Indicators by a Statistical Learning Approach", ApJL 967, L30 (2024).
The findings described are part of a study conducted in the Department of Stellar and Extragalactic Astronomy of the Jagiellonian University Astronomical Observatory in Kraków. A similar X-ray analysis using Swift data has also been published: Dainotti, M. G., et al., "Inferring the Redshift of More than 150 GRBs with a Machine-learning Ensemble Model", The Astrophysical Journal Supplement Series 271, 22 (2024), DOI: 10.3847/1538-4365/ad1aaf.
Aditya Narendra, Astronomical Observatory, Jagiellonian University, Aditya.Narendra [@] doctoral.uj.edu.pl