Columns

This is your brain on HPC

For about two years now, server vendors have been flogging the idea that high-performance computing is spreading beyond traditional academic and government environments and emerging as a serious option for enterprise IT. HPC has become an important tool for visualizing oil and gas reservoirs, for automotive and aerospace crash analysis, for financial risk analysis, and for similar industrial-strength processing.

The truth is, the eggheads are still doing all the cool stuff. For example, researchers at the Laboratory of Neuro Imaging (LONI) at the University of California, Los Angeles, use HPC to make 3-D maps of the human brain. They employ a 306-node cluster of Sun Fire V20z servers to develop 3-D digital and functional neuroanatomic atlases for stereotactic localization and multisubject comparison.

LONI is the brainchild of Arthur Toga, PhD, the lab's founder and current director. "We do think of this as brain mapping," Toga says. "But this map is not a static thing. It’s a computational entity, where the map can be recomputed with changes in the parameters, the type of question you might pose, or the way you might want to perform the arithmetic on that data."

The lab is a computational resource; no patients are sitting around in bum-chilling hospital gowns. Here, the intracranial cartographers pursue programs that look at Alzheimer’s disease, schizophrenia, autism, brain development, tumor growth, fetal alcohol syndrome, drug abuse, and brain changes that result from AIDS.

They use image data from thousands of brain scans to create visually clarifying 3-D images that describe the shape, form, and functional attributes of the brains of individuals and groups or "subpopulations." To accomplish this, Toga and his colleagues must collect between 60 and 800 gigabytes of data per brain.

"Not only is the image data for each individual quite large," Toga says, "but you may be working with literally thousands of subjects to be sensitive to what are sometimes very subtle differences that are difficult to appreciate without appropriate statistical power. Couple that with algorithms that have literally millions of degrees of freedom, requiring the execution of simultaneous differential equations and other computationally expensive approaches. Needless to say, we need a lot of horsepower."

The lab recently upgraded to its current deployment of Opteron-based Sun machines running the Solaris 10 operating system. The servers are organized as a single massive cluster managed by Sun Control Station software. Sun N1 Grid Engine software manages the compute workload, treating not only the new Sun servers but also legacy servers from other vendors as a single computing resource.

"We are a research facility, so it’s not really a production-transition-processing type of group," Toga says. "The problems change and the algorithms change. And it’s a heterogeneous environment. We wanted something that was compatible with the way we work—large data sets, distributed processing—and we wanted the flexibility of a grid."

In this last respect, says Bjorn Andersson, Sun's director of HPC, LONI is a good example of how academia is actually making HPC more appealing to the mainstream. High-performance computing was once confined to dedicated, specialized, number-crunching supercomputers housed in purpose-built facilities. Now this type of computing is increasingly done on grids, which make resources available over the network and facilitate collaboration among a wider range of researchers.

You can browse the LONI image database.
-John K. Waters