
Improved computing provides a better look at the cosmos

September 6, 2013 By Jill Sakai

Photo: IceCube cables being connected

Once IceCube’s optical sensors were deployed, the crew pulled cables to connect them to the lab’s servers to collect data. Improved computer engineering is now being used to find the neutrino events of most interest to scientists.

Photo: Freija Descamps/NSF

Building a neutrino telescope, a unique instrument that detects extremely small, high-energy particles, out of 5,000 optical sensors embedded in a cubic kilometer of Antarctic ice was a tremendous engineering feat. It was also just the first challenge.

Computer engineering is now tackling the next task facing the Wisconsin IceCube Particle Astrophysics Center (WIPAC): sorting through massive amounts of data to find the few particles of interest to scientists.

WIPAC’s goal is to detect and analyze signs of cosmic neutrinos: mysterious subatomic particles released by distant astronomical phenomena such as exploding stars and black holes. Finding evidence of the highest-energy and most scientifically interesting particles in a sea of background data is an ongoing puzzle for the physicists.

“IceCube registers 3,000 events every second, tens of billions per year. Among these we have to filter out the approximately 10 interesting events reaching us from the cosmos,” says UW–Madison physics professor Francis Halzen, principal investigator of the IceCube project.
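For scale, the arithmetic behind that quote checks out (a quick sketch in Python; the rates are taken directly from Halzen's numbers above):

```python
# Back-of-the-envelope check of the event rates quoted above.
events_per_second = 3_000
seconds_per_year = 365 * 24 * 3600            # ~3.15e7 seconds

events_per_year = events_per_second * seconds_per_year
print(f"{events_per_year:.2e} events per year")   # ~9.5e10: tens of billions

interesting_events = 10                        # cosmic events, per the quote
print(f"signal fraction: {interesting_events / events_per_year:.0e}")  # ~1e-10
```

Roughly one event in ten billion is of cosmic origin, which is why the filtering problem is so demanding.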

Photo: Francis Halzen

Francis Halzen

One of the key challenges is handling the volume of data coming from the South Pole laboratory — sent to Madison via satellite — and providing efficient and timely access for researchers.

Terry Millar, a UW–Madison math professor and former associate dean for physical sciences in the Graduate School, saw an opportunity for collaboration. He connected Halzen with Chris Ré and Benjamin Recht, two assistant professors of computer science. With support from a Graduate School funding competition designed to encourage multidisciplinary research on campus, the groups applied advanced computing approaches to improve IceCube’s detection and particle track reconstruction methods.

Recht, Ré, and graduate student Mark Wellon developed an algorithm that uses robust statistical methods to estimate particle trajectories through the detector, targeting weak points in the existing reconstruction pipeline.
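The team's published method is described in their paper; purely as an illustration of the idea, here is a minimal sketch (the names and details are assumptions, not IceCube's actual code) of fitting a straight track through 3-D sensor-hit positions with a Huber loss, a standard robust alternative to ordinary least squares:

```python
import numpy as np
from scipy.optimize import least_squares

def perp_distances(params, hits):
    """Perpendicular distance from each hit to the line x = p + t*d."""
    p, d = params[:3], params[3:]
    d = d / np.linalg.norm(d)                  # unit direction vector
    diff = hits - p
    proj = diff - np.outer(diff @ d, d)        # component orthogonal to d
    return np.linalg.norm(proj, axis=1)

def fit_track(hits):
    """Robustly fit a straight track through an (N, 3) array of hit positions."""
    centroid = hits.mean(axis=0)
    _, _, vt = np.linalg.svd(hits - centroid)  # principal axis as initial guess
    x0 = np.concatenate([centroid, vt[0]])
    # loss="huber" is quadratic for small residuals and linear for large
    # ones, so stray hits have bounded influence on the fit.
    result = least_squares(perp_distances, x0, args=(hits,),
                           loss="huber", f_scale=1.0)
    direction = result.x[3:] / np.linalg.norm(result.x[3:])
    return result.x[:3], direction
```

The design choice mirrors the article's point: with a purely quadratic loss a single wild sensor reading can pull the whole track, while a robust loss caps each hit's influence.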

“[We] observed that the majority of the errors in the reconstruction pipeline were due to outliers in the data. These outliers could come from detectors firing randomly or from light scattering through the [Antarctic] ice,” says Ré, now at Stanford University.

Rather than trying to model the physics that produced the outliers, the computer science team focused on detecting and removing them. Surprisingly, they say, taking the physics out of the models produced more robust and reliable fits. A paper describing the research is available on arXiv.org.
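That detect-and-remove strategy (fit once, flag points with anomalously large residuals, drop them, and refit) can be illustrated with a toy line fit. This is again a hedged sketch: the synthetic data and the threshold are assumptions, not the published algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic straight-line data with a few wild points mixed in,
# standing in for randomly firing sensors and scattered light.
x = np.linspace(0, 10, 200)
y = 2.0 * x + 1.0 + rng.normal(0, 0.3, x.size)
y[rng.choice(x.size, 10, replace=False)] += rng.normal(0, 20, 10)

naive = np.polyfit(x, y, 1)                    # outliers drag this fit around

# Flag points whose residuals are large relative to a robust scale
# estimate (the median absolute deviation), drop them, and refit.
resid = y - np.polyval(naive, x)
mad = np.median(np.abs(resid - np.median(resid)))
keep = np.abs(resid) < 5 * 1.4826 * mad        # 1.4826 ~ MAD-to-sigma factor
clean = np.polyfit(x[keep], y[keep], 1)

print("naive fit:     ", naive)                # biased by the outliers
print("after trimming:", clean)                # close to the true (2.0, 1.0)
```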

“Our computer science collaborators reviewed our data stream and not only made it more efficient but managed to speed up the considerable computing effort performed on site at the South Pole,” says Halzen. “The results were spectacular.”

“Turning the potential of a cross-discipline collaboration into a high-impact activity is an art,” says Miron Livny, professor of computer sciences and director of the UW–Madison Center for High Throughput Computing. “I hope that this high-impact work will serve as a model and driver for future collaborations between computer and domain scientists on our campus.”

The Graduate School is currently accepting proposals for the 2014-2015 fall research competition, with a special competition for interdisciplinary projects.