NW Advanced Computing Partnership Looks to Tackle Big Challenges

The Northwest Institute for Advanced Computing (NIAC) is taking shape, with researchers and professors from Pacific Northwest National Laboratory and the University of Washington brainstorming to identify a few large-scale strategic projects that could advance their budding partnership and the state of the art in computing.

At a coming-out event Wednesday at the UW, leaders from the two institutions articulated their vision for a leading-edge computing research organization that draws on and complements the strengths of the university and the national laboratory, and educates the next generation of experts.

Doug Ray, an associate lab director at PNNL, says that at maturity, the NIAC could grow to perhaps 50 or 60 research staff on or near the UW campus, with UW faculty doing stints in Richland, WA. The NIAC already has a physical presence—a space for about 10 people in Sieg Hall on the UW campus. But it will be as much a virtual organization, with faculty from around the campus able to participate from their home departments, he says.

So what exactly is the NIAC going to work on? That question will be answered over the coming months through a series of workshops and open houses, meant to narrow down the focus from three broad starting points that share a common base in fundamental scientific computing, data management, and analysis technologies.

Those starting points are: advanced and future computing systems hardware, software, and programming models—essentially the next-generation tools for scientific discovery and simulation; scalable modeling and simulation design; and data-driven science and discovery.

On Wednesday, small-group discussion topics included high-performance computing applications at the exascale (computing systems that can handle 10¹⁸ floating-point operations per second); graph analytics and network data analysis; urban science challenges; and systems biology.

Ray offered some example projects from PNNL’s current research portfolio that these technologies could address: real-time image reconstruction and analysis at extremely high resolution (three angstroms), potentially identifying new biological structures against which therapies could be targeted; atmospheric radiation measurements that examine how aerosols in clouds interact with sunlight; and the hosting of a 250-petabyte backup copy of what is expected to be the largest single scientific dataset on the planet, from the Belle II high-energy particle physics experiment ramping up in Japan.

Ed Lazowska, UW computer science professor, director of the eScience Institute, and a key link between PNNL and the university, put aspirations for the NIAC—unveiled early this year—in the context of the “dawn of a new era of discovery.”

Data-intensive scientific discovery, he says, joins computational science as another arrow in the quiver for researchers, complementing the older methods of theory, experiment, and observation.

A proliferation of low-cost sensors and simulations is creating a torrent of data that presents enormous opportunities and huge challenges. Smart homes, smart cars, smart health, smart robots operating in unstructured environments—all are enabled by advances in areas such as machine learning, computer vision, and cloud computing.

“The big data revolution is what’s putting the smarts in everything,” Lazowska says. (He is leading a discussion on data-driven discovery at Xconomy’s upcoming public forum: Big Insight—Making Sense of Big Data in Seattle on Nov. 19.)


For data-intensive science to reach its potential, the onus is on computer scientists to build tools that can be used directly by oceanographers, biologists, geologists, and even sociologists and researchers in other fields, without having to wait for a data scientist to run reports for them. (He likened this potential bottleneck to the database administrators who sat between researchers and their data in the 1970s.)

In addition to large projects, the NIAC is aimed at fostering more one-to-one interaction between researchers at PNNL and UW. Vikram Jandhyala, UW electrical engineering chair and co-director of the institute from the UW side, says several initial collaborations have already started, or are scaling up from existing joint efforts, in areas including graph analytics, social media, parallel languages, systems biology, modeling of eco-hazards, smart grid simulation, and data visualization.

NIAC-affiliated researchers are going after an NIH funding opportunity to create Centers of Excellence for Big Data Computing in the Biomedical Sciences. Other funding possibilities include Department of Defense university research programs.

Thom Dunning Jr., NIAC co-director from the PNNL side beginning officially in January, has spent his career in both university and national laboratory settings. He surveyed other university-national lab partnerships around the country for models that could be applied here.

Lawrence Berkeley National Laboratory and the University of California, Berkeley, have perhaps the best relationship, thanks to their shared history and proximity: you can walk from one to the other in about 15 minutes. The NIAC will not enjoy the benefits of such proximity, with PNNL located more than three hours from UW by car.

Another model is the Joint Institute for Computational Sciences, which links the University of Tennessee and Oak Ridge National Laboratory.

Dunning says the best model for the NIAC is the Computation Institute, established in 2000 by the University of Chicago and Argonne National Laboratory, which are separated by only 30 miles, but an hour of frustrating Chicago traffic. Dunning, who most recently led the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, says the Computation Institute was established for many of the same reasons as the NIAC.

In 13 years, it has grown to 120 researchers and faculty divided between the two institutions, plus 50 supporting staff and 30 graduate students (a number Dunning says is surprisingly low).


The Computation Institute has had some success in tapping new sources of funding for its partner institutions, pursuing a handful of very large strategic computing projects while also fostering smaller collaborations among principal investigators. It has attracted about $150 million in the last five years, he says.

The largest project was TeraGrid, a major network of high-performance computing centers and other resources for e-science, backed by the National Science Foundation (NSF).

“It’s very difficult for a national laboratory to get funding from the National Science Foundation,” Dunning says.

The 17 Department of Energy national laboratories get most of their funding from the energy and defense departments, while the NSF channels money—about $7 billion in the 2012 fiscal year—toward university researchers.

Argonne, Dunning says, attracted NSF funding “by basically masking themselves as University of Chicago,” even though it was Argonne expertise driving the project.

Another major Computation Institute project, Beagle, was driven by University of Chicago medical faculty who needed a supercomputer for life sciences research, and needed Argonne expertise to operate it, Dunning says. It was funded by the National Institutes of Health, among others.

Dunning says Computation Institute staffers told him that their failures include insufficient strategic focus—something they are trying to correct through a new Urban Center for Computation and Data, meant to corral resources around areas of expertise at both institutions.

He acknowledges that the NIAC’s initial research directions are relatively broad, setting up the potential for a similar lack of focus. Dunning and other UW and PNNL leaders say that to be successful, the NIAC needs to identify about three significant strategic projects that draw serious participation from as many researchers at both institutions as is practical.

The NIAC will also have a significant role for partnerships with industry, though Jandhyala acknowledges that this is one of the more challenging elements to be negotiated between the two institutions.

It seems to have good support from at least one key player locally. Cray CEO Peter Ungaro laid out the supercomputer maker’s big data strategy at the event Wednesday and says he is excited to form a partnership with the new institute.
