
Looking into the Depths: Technology and Science Combine to Reveal a Black Hole Flare in 3D



Researchers at Caltech have created the first 3D video showing flares around our galaxy’s supermassive black hole, using AI methods and data from the ALMA telescope. This interdisciplinary research, mixing astrophysics and computer science, opens up new opportunities for understanding black hole environments. (Artist’s concept.) Credit: SciTechDaily.com

Using AI and ALMA data, scientists have produced the first 3D video of flares around our galaxy’s central black hole, providing new insights into its dynamic environment.

Scientists believe the region immediately surrounding a black hole is turbulent, containing hot magnetized gas that spirals around in a disk at tremendous speeds and temperatures. Astronomical observations reveal that within such a disk, mysterious flares occur multiple times a day, briefly brightening before fading away. Now a team led by Caltech scientists has used telescope data and an artificial intelligence (AI) computer-vision technique to generate the first three-dimensional video depicting what such flares might look like around Sagittarius A* (Sgr A*), the supermassive black hole at the center of our Milky Way galaxy.

The 3D flare configuration includes two bright, compact features situated about 75 million kilometers (or half the distance between Earth and the Sun) from the black hole's center. It is based on data gathered by the Atacama Large Millimeter Array (ALMA) in Chile over a 100-minute period just after an eruption seen in X-ray data on April 11, 2017.

“This is the first three-dimensional portrayal of gas swirling near a black hole,” says Katie Bouman, assistant professor of computing and mathematical sciences, electrical engineering and astronomy at Caltech, whose team led the effort outlined in a new paper released today (April 22) in Nature Astronomy.

Aviad Levis, a postdoctoral scholar in Bouman’s team and the primary author of the new paper, stresses that while the video is not a simulation, it is also not a direct recording of events as they occurred. “It is a reconstruction based on our models of black hole physics. There is still a lot of uncertainty associated with it because it relies on these models being accurate,” he explains.


Using neural networks, a team led by Caltech has employed radio telescope data and models of black hole physics to create a 3D image showing how explosive flare-ups in the gas disk around our supermassive black hole, Sagittarius A* (Sgr A*), might appear. Here, the reconstructed 3D structure is depicted from a fixed angle as the model progresses over about 100 minutes, showing the trajectory the two bright features trace around the black hole. Credit: A. Levis/A. Chael/K. Bouman/M. Wielgus/P. Srinivasan

Using physics-informed AI to determine potential 3D structures

To produce the 3D image, the team had to create new computational imaging tools that could, for example, accommodate the bending of light caused by the curvature of space-time around objects of immense gravity, such as a black hole.

The team first discussed whether they could make a 3D video of flares near a black hole in June 2021. The Event Horizon Telescope (EHT) Collaboration, of which Bouman and Levis are members, had already published the first image of the supermassive black hole at the core of a distant galaxy, called M87, and was working to do the same with EHT data from Sgr A*. Pratul Srinivasan of Google Research, a co-author on the new paper, was at the time visiting the team at Caltech. He had helped develop a technique known as neural radiance fields (NeRF) that was then just starting to be used by researchers; it has since had a huge impact on computer graphics. NeRF uses deep learning to create a 3D representation of a scene based on 2D images. It provides a way to observe scenes from different angles, even when only limited views of the scene are available.
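At its heart, NeRF represents a scene as a function, typically a small neural network, that maps a 3D coordinate to the light emitted or reflected there. The sketch below is a minimal, illustrative version of that idea in Python with NumPy; the architecture, sizes, and names are invented for illustration and are not the team’s code, which must additionally encode black hole physics.

```python
import numpy as np

def positional_encoding(x, num_freqs=4):
    """Map 3D points to sin/cos features, as in NeRF, so a small
    network can represent fine spatial detail."""
    feats = [x]
    for k in range(num_freqs):
        feats += [np.sin(2**k * np.pi * x), np.cos(2**k * np.pi * x)]
    return np.concatenate(feats, axis=-1)

class TinyEmissionField:
    """Toy NeRF-style field: a 3D coordinate goes in, a scalar
    emissivity comes out. A real NeRF is trained by gradient
    descent so that its renderings match observed 2D views."""
    def __init__(self, in_dim, hidden=64, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0.0, 0.1, (in_dim, hidden))
        self.w2 = rng.normal(0.0, 0.1, (hidden, 1))

    def __call__(self, points):
        h = np.maximum(positional_encoding(points) @ self.w1, 0.0)  # ReLU
        return np.maximum(h @ self.w2, 0.0)  # non-negative emission

# Query the (untrained) field on a batch of 3D points.
field = TinyEmissionField(in_dim=3 + 3 * 2 * 4)  # raw coords + encodings
print(field(np.random.rand(5, 3)).shape)  # (5, 1)
```

The sin/cos positional encoding is the standard NeRF ingredient that lets a small network capture fine structure; training against real 2D views would adjust the weights by gradient descent.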

The team wondered if, by building on these recent developments in neural network representations, they could reconstruct the 3D environment around a black hole. Their big challenge: From Earth, as anywhere, we only get a single viewpoint of the black hole.


Here, the reconstructed 3D structure is shown at a single time (9:20 UT), directly after a flare was detected in X-ray, with the view rotating to help visualize the structure from all angles. Credit: A. Levis/A. Chael/K. Bouman/M. Wielgus/P. Srinivasan

The team thought that they might be able to overcome this problem because gas behaves in a somewhat predictable way as it moves around the black hole. Consider the analogy of trying to capture a 3D image of a child wearing an inner tube around their waist. To capture such an image with the traditional NeRF method, you would need photos taken from multiple angles while the child remained stationary. But in theory, you could ask the child to rotate while the photographer remained stationary taking pictures. The timed snapshots, combined with information about the child’s rotation speed, could be used to reconstruct the 3D scene equally well. Similarly, by leveraging knowledge of how gas moves at different distances from a black hole, the researchers aimed to solve the 3D flare reconstruction problem with measurements taken from Earth over time.
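In code, the rotating-child trick amounts to answering queries about the volume at time t by de-rotating the query points back to time zero and sampling a single static volume. Here is a hedged sketch under the simplifying assumption of rigid rotation about one axis at a known rate; the actual method uses radius-dependent orbital motion and relativistic effects, and all names here are illustrative.

```python
import numpy as np

def rotate_back(points, t, omega):
    """Map query points at time t back to their t=0 positions,
    assuming rigid rotation about the z-axis at angular rate omega.
    (The real method uses radius-dependent orbital motion.)"""
    angle = -omega * t  # undo the rotation accumulated by time t
    c, s = np.cos(angle), np.sin(angle)
    rot = np.array([[c, -s, 0.0],
                    [s,  c, 0.0],
                    [0.0, 0.0, 1.0]])
    return points @ rot.T

def emission_at_time(initial_volume, points, t, omega):
    """Evaluate the evolving scene at time t by sampling the
    *static* t=0 volume at de-rotated coordinates."""
    return initial_volume(rotate_back(points, t, omega))

# Example: a blob at (1, 0, 0) at t=0 is found at (0, 1, 0) a
# quarter-turn later.
blob = lambda p: np.exp(-np.sum((p - np.array([1.0, 0.0, 0.0]))**2, axis=-1))
print(emission_at_time(blob, np.array([[0.0, 1.0, 0.0]]), t=np.pi/2, omega=1.0))
```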

With this insight in hand, the team built a version of NeRF that takes into account how gas moves around black holes. But it also needed to consider how light bends around massive objects such as black holes. Under the guidance of co-author Andrew Chael of Princeton University, the team developed a computer model to simulate this bending, also known as gravitational lensing.
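The team’s lensing model traces light rays through curved space-time; a full geodesic calculation is beyond a short example, but Einstein’s weak-field deflection formula, α = 4GM/(c²b), gives a feel for the magnitude of the bending. The sketch below evaluates it for a ray passing Sgr A* at roughly the flare’s distance, assuming the commonly quoted black hole mass; this approximation is crude so close to a black hole, which is precisely why a proper ray-bending model was needed.

```python
import numpy as np

G = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8             # speed of light, m/s
M_SUN = 1.989e30        # solar mass, kg
M_BH = 4.15e6 * M_SUN   # commonly quoted Sgr A* mass (assumption)

def weak_field_deflection(impact_parameter):
    """Einstein's weak-field light-bending angle in radians:
    alpha = 4*G*M / (c^2 * b). Only a rough guide this close to
    a black hole, where full geodesic ray tracing is required."""
    return 4 * G * M_BH / (C**2 * impact_parameter)

b = 75e9  # ray passing ~75 million km from Sgr A*, in meters
print(np.degrees(weak_field_deflection(b)))  # ≈ 19 degrees
```

Even this rough estimate yields a deflection of nearly 19 degrees, far from negligible, so ignoring lensing would badly distort the reconstruction.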

With these considerations in place, the new version of NeRF was able to recover the structure of orbiting bright features around the event horizon of a black hole. Indeed, the initial proof-of-concept showed promising results on synthetic data.

A flare around Sgr A* to study

But the team needed some real data. That’s where ALMA came in. The EHT’s now famous image of Sgr A* was based on data collected on April 6–7, 2017, relatively calm days in the environment surrounding the black hole. But just a few days later, on April 11, astronomers detected a sudden, explosive brightening in the surroundings. When team member Maciek Wielgus of the Max Planck Institute for Radio Astronomy in Germany went back to the ALMA data from that day, he noticed a signal with a period matching the time it would take a bright spot within the disk to complete an orbit around Sgr A*. The team set out to recover the 3D shape of that brightening around Sgr A*.

ALMA is one of the most powerful radio telescopes in the world. Still, because of the vast distance to the galactic center (more than 26,000 light-years), even ALMA cannot resolve Sgr A*’s immediate surroundings. What ALMA measures are light curves, essentially videos of a single flickering pixel, created by collecting all of the radio-wavelength light the telescope detects at each moment of observation. Reconstructing a 3D volume from a single-pixel video might seem impossible. However, by folding in additional knowledge of the physics expected in the disk around a black hole, the team was able to work around the lack of spatial information in the ALMA data.
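That periodicity is consistent with simple orbital mechanics. As a back-of-the-envelope check, assuming a Newtonian circular orbit and the commonly quoted Sgr A* mass of about four million Suns (the true orbit is relativistic, so this is only an estimate), Kepler’s third law gives a period comparable to the roughly 100-minute ALMA observation window:

```python
import numpy as np

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg
M_BH = 4.15e6 * M_SUN  # commonly quoted Sgr A* mass (assumption)
r = 75e9               # orbital radius: 75 million km, in meters

# Kepler's third law for a circular orbit: T = 2*pi*sqrt(r^3 / (G*M)).
period = 2 * np.pi * np.sqrt(r**3 / (G * M_BH))
print(f"orbital period ≈ {period / 60:.0f} minutes")  # ≈ 92 minutes
```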

Strongly polarized light from the flares gave hints

ALMA doesn’t just capture a single light curve. In fact, it provides several such “videos” for each observation, because the telescope records data corresponding to different polarization states of the light. Like wavelength and intensity, polarization is a fundamental property of light; it describes how the electric field of a light wave is oriented relative to the wave’s direction of travel. “What we get from ALMA is two polarized single-pixel videos,” says Bouman, who is also a Rosenberg Scholar and a Heritage Medical Research Institute Investigator. “That polarized light is actually really, really informative.”

Recent theoretical studies suggest that hot spots forming within the gas are strongly polarized, meaning the light waves coming from these hot spots have a distinct preferred orientation direction. This is in contrast to the rest of the gas, which has a more random or scrambled orientation. By gathering the different polarization measurements, the ALMA data gave the scientists information that could help localize where the emission was coming from in 3D space.
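In radio astronomy, polarization is conventionally summarized with the Stokes parameters. As a sketch of the kind of quantity involved, not the team’s actual pipeline, the linear polarization fraction and angle follow directly from Stokes I, Q, and U:

```python
import numpy as np

def linear_polarization(I, Q, U):
    """Linear polarization fraction and electric-vector position
    angle (EVPA) from Stokes parameters. A strongly polarized hot
    spot shows a high fraction and a well-defined angle, while
    scrambled emission averages toward zero."""
    fraction = np.sqrt(Q**2 + U**2) / I
    evpa = 0.5 * np.arctan2(U, Q)  # radians, defined modulo pi
    return fraction, evpa

# Example: a hot-spot-like signal versus a nearly depolarized one.
print(linear_polarization(I=1.0, Q=0.25, U=0.15))
print(linear_polarization(I=1.0, Q=0.01, U=-0.01))
```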

Introducing Orbital Polarimetric Tomography

To determine a probable 3D structure that explained the observations, the team developed an updated version of its method that incorporated not only the physics of light bending and orbital dynamics around a black hole but also the polarized emission expected from hot spots orbiting one. In this technique, each potential flare structure is represented as a continuous volume using a neural network. This allows the researchers to computationally evolve the initial 3D structure of a hot spot over time as it orbits the black hole, producing a full light curve. They could then solve for the initial 3D structure that, when evolved according to black hole physics, best matched the ALMA observations.
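Conceptually, the fit is a loop: propose an initial 3D structure, evolve it forward under the orbital and lensing physics, render the resulting single-pixel light curve, and adjust the structure to reduce the mismatch with ALMA. The schematic Python below captures that loop under strong simplifications: `evolve` and `render` are placeholders for the paper’s physics models, the parameter array stands in for the neural volume, and the finite-difference gradient is a stand-in for proper gradient-based training.

```python
import numpy as np

def synthetic_light_curve(volume_params, times, evolve, render):
    """Forward model: evolve the candidate t=0 volume to each
    observation time, then render it down to a single flux value,
    mimicking ALMA's one-pixel 'video'."""
    return np.array([render(evolve(volume_params, t)) for t in times])

def fit_flare(observed, times, volume_params, evolve, render,
              lr=1e-2, steps=200, eps=1e-4):
    """Toy fit: finite-difference gradient descent on the squared
    error between synthetic and observed light curves."""
    def loss(p):
        residual = synthetic_light_curve(p, times, evolve, render) - observed
        return np.mean(residual ** 2)

    params = volume_params.astype(float).copy()
    for _ in range(steps):
        grad = np.zeros_like(params)
        base = loss(params)
        for i in range(params.size):
            bumped = params.copy()
            bumped.flat[i] += eps
            grad.flat[i] = (loss(bumped) - base) / eps
        params -= lr * grad
    return params
```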

The result is a video showing the clockwise movement of two compact bright regions that trace a path around the black hole. “This is very exciting,” says Bouman. “It didn’t have to come out this way. There could have been arbitrary brightness scattered throughout the volume. The fact that this looks a lot like the flares that computer simulations of black holes predict is very exciting.”

Levis explains that the project was a collaboration between computer scientists and astrophysicists that drew on cutting-edge advances in both fields: numerical codes that simulate how light propagates around black holes, and recent work in computational imaging.

The researchers emphasize that this technology is just the beginning. Levis describes it as an intriguing example of how AI and physics can collaborate to unveil unseen phenomena. They hope astronomers can apply it to other complex data to gain new insights.

The title of the new paper is “Orbital Polarimetric Tomography of a Flare Near the Sagittarius A* Supermassive Black Hole.”

Funding for the work came from the National Science Foundation, the Carver Mead New Adventures Fund at Caltech, the Princeton Gravity Initiative, and the European Research Council.

Reference: “Orbital Polarimetric Tomography of a Flare Near the Sagittarius A* Supermassive Black Hole” 22 April 2024, Nature Astronomy.
DOI: 10.1038/s41550-024-02238-3
