Chi-Kwan Chan waves his hand a few inches above a matchbox-size device. On a dark computer monitor, a million light dots appear as a solid sheet, each dot representing a light particle.
The photon sheet hovers above a black disc simulating a black hole. With a slow turn of the hand, the sheet approaches the black hole. As it passes, the gravitational monster swallows any light particles in its direct path, creating a circular cutout in the sheet of particles. The rest of the particles are on track to move past the black hole, or so it seems. But they don’t get very far: Instead of continuing along their straight lines of travel, their paths bend inward, loop around the black hole and converge at a single point, forming a sphere of photons around it.
“What you see here is light trapped in the fabric of space and time, curving around the black hole by its massive gravity,” explains Chan, an assistant astronomer at the University of Arizona’s Steward Observatory, who developed the computer simulation as part of his research into how black holes interact with things that happen to be nearby.
The demonstration was part of an event at UA’s Flandrau Science Center & Planetarium on Feb. 16 to kick off a UA-led, international project to develop new technologies that enable scientists to transfer, use and interpret massive datasets.
Known as the Partnerships for International Research and Education program, or PIRE, the effort is funded with $6 million over five years by the National Science Foundation, with an additional $3 million provided by partnering institutions around the world. While the award’s primary goal is to spawn technology that will help scientists take the first-ever picture of the supermassive black hole at the center of our Milky Way, the project’s scope is much bigger.
What looks like a fun little animation on Chan’s computer screen is in fact a remarkable feat of computing and programming: As the computational astrophysicist drags virtual photons around a virtual black hole, a powerful graphics processor solves complex equations that dictate how each individual light particle would behave under the influence of the nearby black hole — simultaneously and in real time.
Study Relies on Simulations
Unlike the crew in the movie “Interstellar,” astrophysicists can’t travel to a black hole and study it from close range. Instead, they have to rely on simulations that mimic black holes based on the physical properties known, or thought, to govern these most extreme objects in the universe.
Chan belongs to a group of researchers in an international collaboration called the Event Horizon Telescope, or EHT, that is gearing up to capture the first picture of a black hole — not just any black hole, but the supermassive black hole in the center of our galaxy. Called Sagittarius A* (abbreviated Sgr A* and pronounced “Sagittarius A-star”), this object has the mass of more than 4 million suns.
Since nothing, not even light, can escape a black hole, it casts a silhouette against the glowing background of in-falling plasma, one too small to be resolved by any single telescope. So far, the existence of Sgr A* has been inferred only from indirect observations, such as the intriguing choreography of stars in its vicinity, whose orbits clearly outline an unseen, incomprehensibly large mass.
“Imaging the black hole at the center of our galaxy from Earth is like trying to read the date on a dime on the East Coast from the UA campus,” says Feryal Özel, a professor of astronomy and physics at Steward and a co-investigator on the project. “There is not one telescope in existence that could do that.”
The EHT is an array of radio telescopes on five continents that together act as a virtual telescope the size of the Earth, the aperture needed to image “the date on the dime,” or in this case the supermassive black hole Sgr A*. To accomplish this, the individual telescopes must be precisely synced in time. Because existing internet connections and even satellite links are not precise enough to ensure this, the researchers rely on atomic clocks and … FedEx (more on that later).
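The scale of the challenge can be checked with the standard diffraction-limit estimate, angular resolution ≈ wavelength / aperture. At the EHT’s observing wavelength of roughly 1.3 mm, an Earth-sized aperture resolves features a few tens of microarcseconds across. The numbers below are round approximations, not project specifications:

```python
import math

# Back-of-the-envelope diffraction limit for an Earth-sized aperture:
#   theta ≈ lambda / D  (radians)
wavelength = 1.3e-3   # meters; EHT observes around 1.3 mm (about 230 GHz)
baseline = 1.2742e7   # meters; roughly Earth's diameter

theta_rad = wavelength / baseline
theta_uas = theta_rad * (180 / math.pi) * 3600 * 1e6   # microarcseconds

print(f"{theta_uas:.0f} microarcseconds")   # about 21 microarcseconds
```

That is the right order of magnitude for resolving the shadow of Sgr A*, and it is why no single dish, only a planet-spanning array, can take the picture.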
“Our PIRE project is a prime example of the kind of innovation you can only get by leveraging the innovative, intellectual capital in academia,” says Dimitrios Psaltis, the principal investigator on the project. “By its very nature, this project is multidisciplinary and requires expertise in areas ranging from detector development to high-performance computing and theoretical physics.”
At peak activity, the EHT will collect more data than any project before, according to Psaltis, a professor of astronomy and physics at the UA.
“We’re talking petabytes every single night,” he says, a volume comparable to the roughly three petabytes of video uploaded to YouTube each day. “Post-processing is a huge effort, and we will need additional data to improve the science that we hope will come from these observations.”
The team uses graphics processing units, or GPUs — processors developed for gaming that are capable of performing many calculations in parallel. This makes them more efficient and energy-saving than “regular” central processing units, or CPUs.
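The serial-versus-parallel contrast can be illustrated even on a CPU with NumPy, whose whole-array operations follow the same “one operation, many elements” pattern that GPU array libraries such as CuPy or JAX accelerate. This is only an illustration of the pattern, not part of the EHT pipeline:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 100_000)

# Serial style: visit one element at a time.
serial = np.empty_like(x)
for i in range(x.size):
    serial[i] = x[i] ** 2 + 1.0

# Data-parallel style: one operation over the whole array. On a GPU,
# each element would map to its own hardware thread.
parallel = x ** 2 + 1.0

assert np.allclose(serial, parallel)
```

Both produce identical results; the data-parallel form is what lets thousands of GPU cores work on the problem at once.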
“We hope that this technology will transfer to other areas of science and life,” said Joaquin Ruiz, dean of the UA College of Science, at the launch event.
Applications Could Be Extensive
The PIRE project is expected to spin off technologies that go beyond its primary goal. The fast processing of large datasets in real time and the efficient use of resources distributed across the globe will have applications ranging from self-driving cars to renewable energy production and national defense. Examples also include augmented reality applications, which demand fast computation on real-time input with minimal computing resources, Özel explains.
“This could be used, for example, in visual aids for security efforts around the globe where data connection bandwidth and energy supplies are limited,” she says, “so you want devices that make maximum use of precious resources available in those scenarios.”
The PIRE project team integrates researchers in the U.S., Germany, Mexico and Taiwan. Education of students and early-career scientists is a key component, providing internationally collaborative, hands-on experience in instrument technology, high-performance computing, and big and distributed data science. The project also sponsors monthly webinars and hackathons, as well as annual summer schools.
Fast and reliable real-time communication channels are crucial for syncing up telescopes scattered around the globe during observations, and improving such technology is one of PIRE’s goals. For now, EHT scientists rely on video chat, phones and whiteboards to keep track of each telescope location’s status. During a rare stretch of a few days in April 2017, skies were mostly clear at all nine observing sites in the EHT array, including Arizona, Hawaii, Chile, Mexico and Antarctica.
The South Pole Telescope, or SPT, site was incorporated under another NSF grant to the UA, with Dan Marrone as principal investigator. Last year was the first year that the full EHT observed as an array, and the first year in which the SPT participated.
During that first observation run, the observing stations that together make up the EHT pointed at the Milky Way’s center and collected radio waves originating from the supermassive black hole over the course of several nights. By obtaining the first-ever images of black holes, researchers will be able to directly test Einstein’s theory of general relativity in extreme conditions.
“Each telescope records its observation data onto a bunch of physical hard drives,” explains Marrone, an associate professor at Steward and a co-investigator on the PIRE award. “Precisely time-stamped, the drives are loaded into crates and delivered to processing centers in Cambridge, Massachusetts, and Bonn, Germany, via FedEx.”
The EHT data are shipped on physical carriers because current internet data pipelines aren’t up to the scale this endeavor requires. Data experts then combine the literal truckloads of data, synchronize them according to their time stamps and process them to extract the signal from the black hole, which in the raw data is buried under a blanket of noise and error, the inevitable side effect of turning the Earth into one giant telescope.
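The alignment step at the heart of that processing can be illustrated with a toy cross-correlation: two stations record the same signal with an unknown relative delay, and the peak of their cross-correlation recovers that delay so the recordings can be lined up. This is a synthetic sketch with made-up numbers, not the EHT’s actual correlator software, which handles petabytes of wideband, atomic-clock-stamped data:

```python
import numpy as np

rng = np.random.default_rng(42)
n, true_delay = 4096, 37                  # samples; hypothetical offset

signal = rng.normal(size=n)               # the shared "astronomical" signal
station_a = signal + 0.5 * rng.normal(size=n)              # noisy copy
station_b = np.roll(signal, true_delay) + 0.5 * rng.normal(size=n)

# Cross-correlate via FFT; the lag of the peak is the relative delay.
corr = np.fft.ifft(np.fft.fft(station_a).conj() * np.fft.fft(station_b)).real
recovered = int(np.argmax(corr))

print(recovered)    # 37
```

Even with substantial noise on each recording, the correlation peak stands far above the sidelobes, which is why the faint black-hole signal can be dug out of recordings dominated by noise.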
“PIRE is an international project that not only will revolutionize worldwide efforts to study black holes, but also usher astronomical projects into the era of big and distributed data science,” Psaltis says. “By awarding the PIRE project, the NSF has tasked the UA and its collaborators to contribute solutions that may inform many areas of technology, including the internet of tomorrow.”
Source: University of Arizona, written by Daniel Stolte.