Supercomputing Along the Columbia River: PNNL’s Chinook Operators Crunching Digits on How to be More Green

Xconomy Seattle — 

The most powerful computer in Washington state goes by the name of Chinook, and stands along the banks of the Columbia River. It is being used for thorny computing tasks like modeling more efficient ways to store hydrogen for fuel cells, and ways to safely sequester carbon dioxide underground.

This morning, I heard an overview of what this machine does from Moe Khaleel of the Pacific Northwest National Laboratory in Richland, WA, as part of the Technology Alliance’s monthly Discovery Series talks. The lab is known for studying nuclear contamination left over from the Manhattan Project at the nearby Hanford site, but today it likes to tell stories about all the non-military things it’s working on. The lab has 4,200 employees and an $850 million annual budget, so there’s certainly a lot of potential to get things done.

The Chinook machine was built by Hewlett-Packard, with design help from the national lab, Khaleel said. It’s not the world’s most powerful machine—that title belongs to the roughly 1,000-teraflop Roadrunner at Los Alamos National Laboratory in New Mexico, while Chinook weighs in at about 160 teraflops, Khaleel said. But Chinook could achieve bigger things if pooled with resources just a few hours down the road at Washington State University in Pullman, and at the University of Washington in Seattle, he said.

Khaleel didn’t get too specific in his talk about what this machine is really being used for, other than national security, cybersecurity, and biology. One interesting part of the talk was the challenge of managing such a demanding piece of equipment. The thing draws enough juice to run up a $1 million a year electricity bill, with about one-third of that needed to keep it cool and the rest going to actual computing power, he said. Like most supercomputers, it’s quite inefficient, too, with 40 percent of the energy lost in converting AC power to DC, he said.
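For the curious, the figures Khaleel cited can be turned into a rough back-of-the-envelope breakdown. The split below is one reading of his numbers, not an official lab accounting: it applies the one-third cooling share to the $1 million bill, then applies the 40 percent conversion loss to the portion that reaches the computing hardware.

```python
# Back-of-the-envelope split of Chinook's quoted ~$1M annual power bill.
# All shares come from figures quoted in the talk; how they combine is
# an assumption made here for illustration.
ANNUAL_BILL = 1_000_000      # dollars per year, as quoted
COOLING_SHARE = 1 / 3        # roughly one-third goes to cooling
CONVERSION_LOSS = 0.40       # ~40% lost converting AC power to DC

cooling_cost = ANNUAL_BILL * COOLING_SHARE
computing_cost = ANNUAL_BILL - cooling_cost

# Of the dollars spent on the computing side, how much is lost
# before the electricity ever does useful computation?
conversion_cost = computing_cost * CONVERSION_LOSS

print(f"Cooling:           ${cooling_cost:,.0f}")
print(f"Computing:         ${computing_cost:,.0f}")
print(f"  lost in AC->DC:  ${conversion_cost:,.0f}")
```

Under that reading, roughly a quarter of the total bill is burned in power conversion alone, which helps explain why operational cost, not just the purchase price, dominates the conversation.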

“People tend to worry a lot about the upfront cost, but they forget about the operational cost,” he said.

These days the lab is trying to think of ways to make this system more “green,” Khaleel said. Scientists there are thinking of how best to capture the heat thrown off from the machine to help keep the rest of the building warm in winter. However, he said, “If it’s a cold winter, and the machine’s not running, folks could be in trouble.”

Not everyone in the audience was enamored with the idea of investing gobs more taxpayer money in supercomputers. Linden Rhoads, the University of Washington’s vice provost for technology transfer, asked Khaleel whether it might be a better idea to make an appeal to citizens to let scientists tap into the power of their millions of desktops around the country during off-peak hours. Khaleel pooh-poohed the idea, saying it would take too long to get the calculations done because improvements need to be made in virtualization technology to organize the data. (When I asked Rhoads afterwards if she bought that answer, she shook her head, saying maybe the country should then invest more in virtualization to tap into all those desktop computing resources.)

“If you asked people to contribute their computers to do this, they’d do it,” Rhoads said.