UCSD Supercomputer Center Looks to Create More Business Partnerships

The new five-story building that doubles the size of the San Diego Supercomputer Center is more than just concrete, steel, and glass. As Fran Berman, SDSC director (and an Xconomist) said yesterday, the new building also represents the next generation in supercomputing—and she wants to ensure that San Diego’s emerging companies feel like they’re part of the family.

Berman was ostensibly hosting a dedication ceremony for the SDSC’s $44 million expansion. But the event, which included breakfast for more than 100 local technology executives, venture capital partners, and others, also served to explain how supercomputing is crunching massive amounts of data to analyze Medicare fraud, cancer cells, earthquakes, and climate change.

Such research may not be directly useful to the venture community, but biotech and information technology startups increasingly need such capabilities, especially in digital storage and data mining, and in analyzing complex models and simulations.

Today “SDSC sees its role as a partner and pathfinder with the research and education community to harness the power of cyberinfrastructure to address science and society’s most critical problems,” Berman said.

The supercomputer center, which was founded on the UCSD campus 23 years ago, now boasts more archival storage capacity than any other academic institution in the world, about 25 petabytes (25 thousand trillion bytes). That’s about 1,000 times the digital plain-text equivalent of the printed collection in the Library of Congress.

The new building also houses some of the first optical networking equipment to be connected to the TeraGrid, a next-generation Internet for scientific research funded by the National Science Foundation.

“I have a number of models where we could work with industry,” said Ron Hawkins, SDSC’s director of industry relations.

The center could conduct sponsored research to address specific questions a company may have, Hawkins said. SDSC researchers also can do “high-end consulting” on corporate projects, or the center can simply offer access to its computational resources.

As one industry consultant noted, however, the programming required to solve complex problems on high-performance computers can be extremely expensive.

“Parallel processing is a specialized field, so, yes, it requires specialized expertise,” Hawkins acknowledged. “But the computing world and the IT world is going to parallel processing, so it’s only appropriate for us to work with industry to bring them along.”

Bruce V. Bigelow was the editor of Xconomy San Diego from 2008 to 2018.