Intel’s Diane Bryant Blazing Trail From Data Centers to Next Era of IT
Intel’s $7.7 billion acquisition of McAfee in 2010 has led the company to hardwire more anti-virus features and malware blockers into its newer PC and server chips. The technology can do things like shut off applications when a server is compromised, Bryant says. Intel also has been working on better encryption tools to improve data security in the cloud.
In a similar vein, the company has embedded what it calls “trusted execution technology” into its Xeon server processors. This is an effort to create pools of trusted servers within virtualized data centers, so that IT administrators can keep track of which servers are secure and which may be compromised. Sounds obvious, but I think the key lies in how the system uses measurements to verify the condition of both hardware and software in real time.
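Bryant didn't walk through the mechanics, but the basic idea of measurement-based trust can be sketched in a few lines of Python. Everything below (the component names, the hash values, the pool-admission rule) is my illustration of the general concept, not Intel's actual trusted execution technology:

```python
import hashlib

# Known-good ("golden") measurements an administrator records ahead of
# time. Names and values are illustrative, not real firmware hashes.
GOLDEN = {
    "firmware": hashlib.sha256(b"firmware-v1.2").hexdigest(),
    "bootloader": hashlib.sha256(b"bootloader-v3.0").hexdigest(),
    "hypervisor": hashlib.sha256(b"hypervisor-v5.1").hexdigest(),
}

def measure(component_blob: bytes) -> str:
    """Hash a component, the way a measured-boot chain records each stage."""
    return hashlib.sha256(component_blob).hexdigest()

def is_trusted(reported: dict) -> bool:
    """Admit a server to the trusted pool only if every reported
    measurement matches its golden value."""
    return all(reported.get(name) == digest for name, digest in GOLDEN.items())

# A clean server reports measurements that match the golden set...
clean = {
    "firmware": measure(b"firmware-v1.2"),
    "bootloader": measure(b"bootloader-v3.0"),
    "hypervisor": measure(b"hypervisor-v5.1"),
}
# ...while a tampered hypervisor changes its hash and flunks the check.
tampered = dict(clean, hypervisor=measure(b"hypervisor-rootkit"))

print(is_trusted(clean))     # True
print(is_trusted(tampered))  # False
```

The real system does this in hardware, continuously and cryptographically, but the payoff is the same: an administrator can ask which servers still match their known-good state and fence off the ones that don't.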
Meanwhile, for all its progress, Intel is still in the chipmaking business—which means it deals in the global-scale advantages and headaches of manufacturing. Last I checked, the company had fabrication plants as far-flung as Arizona, Oregon, Israel, and China (and even more test and assembly sites). I asked Bryant how the company tries to press its advantage in fabs.
As she puts it, Intel and Samsung are basically the only players left that have their own chip factories. Qualcomm, Nvidia, and Toshiba outsource their manufacturing, and Intel’s archrival Advanced Micro Devices (AMD) divested its manufacturing business in 2009, which became GlobalFoundries. Intel has been a clear leader in cramming more transistors and processing power onto smaller and smaller surfaces; this year it will be among the first to move to the 14-nanometer process (which refers to the tiny scale of the features etched onto silicon wafers).
Bryant credits Intel’s multibillion-dollar investment in R&D for that. I took that to mean both the billions it takes to build each fab, and the cost of research into the advanced physics and processes of manufacturing chips at such a scale. (I also take it that, for a chip designer, working with an independent fab would have major drawbacks, even though it’s much cheaper.)
Yet the competitive environment in servers—and elsewhere—remains treacherous. “We take all competitors seriously,” Bryant says.
One class of competitors stands out. All told, she says, there are about 15 hardware companies that are developing ARM-based processors for servers. That’s the same ARM Holdings whose chip architecture, known for its low power consumption, has come to dominate the mobile-device chip market. And one of the companies leading the way in ARM-based server chips is—you guessed it—Intel nemesis AMD.
Bryant shot down a series of what she called “myths” about ARM-based server chips—that they are more energy efficient, easily compatible with existing data centers, and so forth. She thinks all the different kinds of processing units (cores) put out by the 15-odd vendors will be a big disadvantage. “That variation is what’s going to kill them,” she says. In the data-center market, “variation is your enemy. It costs [manufacturers] money.”
In any case, none of the competition seems to shake Bryant’s main conclusion: that Intel will continue to ride a huge wave of Internet-connected devices and cloud services into the next era of tech. The main question in my mind—and this is outside Bryant’s domain for now—is whether Intel’s chips will find their way into enough end-user devices as well.