Press Archive

National Science Foundation Forms Throughput Computing Partnership

Goal is to Cut Time to Solution by Creating Massive Supercomputer Networks

Published November 9, 2020

The 126 green dots on this map represent institutions participating in the OSG Compute Federation which provides approximately 160 million compute hours per month.

The National Science Foundation (NSF) has awarded $22.5 million to a partnership between the Center for High Throughput Computing (CHTC) at the University of Wisconsin–Madison and the Open Science Grid (OSG) to advance open science via the practice of distributed High Throughput Computing (dHTC). The project seeks to harness the computing capacity of thousands of computers assembled in a network of campus clusters, cutting the time to a science result from years to days, especially for applications that are parallel by design.

Collaborating in this initiative is the San Diego Supercomputer Center (SDSC) at the University of California San Diego. Frank Würthwein, SDSC’s lead for High-Throughput Computing and a physics professor at UC San Diego, is also executive director of OSG, a national cyberinfrastructure funded by the NSF to advance the sharing of resources, software, and knowledge.

Researchers and staff from the University of Southern California, Indiana University, the University of Chicago, and the University of Nebraska are also part of the collaboration, called the Partnership to Advance Throughput Computing, or PATh. The five-year award will fund more than 40 individuals across participating institutions, most of whom have been working together for years and in some cases for decades.

The collaboration is being undertaken to serve the growing need for throughput computing across the entire spectrum of research institutions and disciplines. The partnership will specifically focus on dHTC technologies and methodology, leveraging automation and building on distributed computing principles to enable researchers to run large ensembles of computational tasks.

“Research computing is vital to almost every corner of basic research, and investments in expertise and supporting services are essential to decrease time to results,” said SDSC Director Michael Norman. “The Open Science Grid led by Frank Würthwein has helped make data science accessible and adaptable to thousands of scientific projects large and small.”

Würthwein contributed to the creation of OSG in 2005 as its first executive director. OSG brings the power of high throughput computing to national and international research institutions and science projects. Two of these projects have received Nobel Prizes in physics: the discovery of the Higgs boson particle (2013) and the detection of gravitational waves (2017).

The HTCondor Software Suite developed and maintained by the CHTC will power the fabric of services for the national science and engineering community and be an integral part of the national ecosystem of coordinated cyberinfrastructure services promoted by the NSF. “We see PATh as a valuable component of this evolving ecosystem of services,” said Manish Parashar, director of the NSF Office for Advanced Cyberinfrastructure.
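To give a sense of the dHTC model that HTCondor enables, the sketch below shows a minimal HTCondor submit description file for a large ensemble of independent tasks; the executable name, file paths, and resource requests are illustrative assumptions, not part of the PATh project itself.

```
# Illustrative HTCondor submit file: run 100 independent tasks
# as one job cluster. "analyze.sh" is a hypothetical user script.
executable     = analyze.sh
arguments      = $(Process)
output         = out/job.$(Process).out
error          = out/job.$(Process).err
log            = job.log
request_cpus   = 1
request_memory = 2GB
queue 100
```

Submitted with `condor_submit`, each of the 100 tasks can then be scheduled independently on whatever federated capacity is available, which is what makes "pleasantly parallel" workloads a natural fit for high throughput computing.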

SDSC’s role within the project includes technology evaluation via a “Global Infrastructure Laboratory” (GIL). GIL will provide an ingest mechanism for new ideas and software into OSG and PATh.

Würthwein noted that the PATh project is funded as part of NSF’s computational ecosystem. PATh federates resources at the campus level, including more than 30 campus clusters funded during the last two years by the NSF’s Campus Cyberinfrastructure program, which helps campuses build core capabilities in research computing. In turn, the campuses are required to make some portion of their computing capacity available for external users. Most campuses do so via the OSG, which aggregates computing capacity from sites across the world. PATh helps to support the core technologies and services offered by the OSG.

“The NSF gave us a mandate and the means to move in a new direction that includes community-building and workforce development so that more and more researchers and campuses will benefit from distributed high throughput computing,” said Würthwein.

More information about the PATh project can be found on the PATh project website.

About SDSC

The San Diego Supercomputer Center (SDSC) is a leader and pioneer in high-performance and data-intensive computing, providing cyberinfrastructure resources, services, and expertise to the national research community, academia, and industry. Located on the UC San Diego campus, SDSC supports hundreds of multidisciplinary programs spanning a wide variety of domains, from astrophysics and earth sciences to disease research and drug discovery. In late 2020 SDSC will launch its newest National Science Foundation-funded supercomputer, Expanse. At over twice the performance of Comet, Expanse supports SDSC’s theme of ‘Computing without Boundaries’ with a data-centric architecture, public cloud integration, and state-of-the-art GPUs for incorporating experimental facilities and edge computing.