WEST LAFAYETTE, Ind. — It’s not just for geeks.
The new supercomputer at Purdue University — the fastest campus-based data cruncher in the country — has a host of down-to-earth applications, such as speeding the development of cancer treatments and predicting hazardous weather patterns.
“Without supercomputers, I wouldn’t be able to do this kind of research at all,” said Michael Baldwin, an assistant professor of atmospheric science, who is using vast amounts of data to develop computational techniques to predict hurricanes, tornadoes, and other weather phenomena.
The computer, a collaboration between Purdue and software companies, was unveiled without the advance hoopla surrounding previous Purdue supercomputers, largely because it uses proprietary new technology.
Nicknamed “Carter,” after a Purdue alumnus, the supercomputer was just ranked 54th in the latest international Top500.org list of the world’s most powerful supercomputers.
Gerry McCartney, Purdue’s chief information officer, said the computer’s capacity — 186.9 trillion multiplications per second — will give faculty the ability to run models and analyze data in countless ways.
“Things that are too slow, too difficult, too expensive, too dangerous or just impossible, like nano-wire manufacturing. You can’t even see this stuff — we are talking things 24 atoms wide, so how are we going to model what this looks like?” McCartney said.
“Well, how about doing it on this supercomputer?
“We have more and more of our faculty that need this research. So us being able to provide it to them is an advantage to get their science done. It is like a new laboratory.”
Professor Alan Qi uses Carter to plug in millions of data points per sample to identify cancer stem cells.
The research aims to curtail cancer growth by developing treatments that target cancer stem cells.
“This is a technique to understand the cell behavior,” Qi said. “This data (allows) you to examine every single cell.”
In his weather studies, Baldwin uses masses of data collected daily from the National Centers for Environmental Prediction.
“Currently, they receive roughly 3 billion unique weather-related observations globally on a daily basis. Most of those are from satellites and radar,” he said.
“We need as much computing capability as we can get to make our forecasts as detailed as possible. In our models, we combine physical processes of the atmosphere with high-resolution data from weather radar and satellites. We can use Carter to create forecasts that the public can use and potentially save lives.”
Intel and Hewlett-Packard Co. approached Purdue to build the computer, McCartney said, because of the university’s experience building supercomputers. Because Carter uses unreleased technology, the project was kept under wraps. At first, only four Purdue faculty members were invited to try out the computer.
Carter’s inner workings, McCartney said, will appeal to computer enthusiasts because it was built using not-yet-released Intel Xeon E5 “Sandy Bridge” processors and HP ProLiant servers.
It also has Mellanox FDR InfiniBand network cables that, until now, could not be tested at their full capacity of 56 gigabits per second.
In all, the computer has 648 server nodes featuring 1,296 processors totaling 10,368 cores.
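The reported totals are internally consistent, as a quick sanity check shows. (The two-socket, eight-core breakdown below is inferred from the article’s figures, not stated in it; Xeon E5 “Sandy Bridge” chips did ship with eight cores.)

```python
# Check Carter's reported hardware totals: 648 server nodes,
# 1,296 processors, 10,368 cores.
nodes = 648
processors_per_node = 1_296 // nodes     # inferred: 2 sockets per node
cores_per_processor = 10_368 // 1_296    # inferred: 8 cores per chip

processors = nodes * processors_per_node
cores = processors * cores_per_processor

print(processors_per_node, cores_per_processor)  # 2 8
print(processors, cores)                         # 1296 10368
```

The figures line up exactly, which is what you would expect from a uniform dual-socket cluster.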
Raj Hazra, general manager of high-performance computing for Intel, said Carter provides three times the performance of Purdue’s 2008 supercomputer, nicknamed Steele. It also consumes less than half the energy and is half the size.
Carter is still undergoing testing and should be available to additional faculty next spring. Cost of the computer is about $2.5 million, McCartney said.
The amount is being split by Purdue’s information technology department and faculty research funds.
In the future, McCartney said, Purdue could expand the computer to allow other universities to buy into its use.