CIO CORNER

This is the MIT CIO Symposium blog. We invite participation from speakers, sponsors, attendees, and interested parties.

Cloud Computing: Number Crunching Made Easy

By Annie Shum | May 3, 2009

Cloud computing is making high-end computing readily available to researchers in rich and poor nations alike.
Christopher Werth, NEWSWEEK, From the magazine issue dated May 18, 2009

A dwindling water supply spells disaster for the residents of Brazil’s arid Northeast, who live by subsistence agriculture. Droughts have become longer and more frequent, and every year more families set off for the urban slums. Predicting how rainfall patterns will shift over the next few years, and how those shifts will affect aquifers and agricultural output, has become an urgent task. Civil engineers need to know where to build reservoirs and how much water they should hold. But this kind of local climate modeling requires a lot of number crunching, and supercomputers are rare in these parts.

To get around this hurdle, a group of universities and government labs called SegHidro (which means “water security”) has pooled the computing resources of labs scattered throughout the country. Using software called OurGrid, the group adapts global climate models to local conditions, parceling out pieces of the massive job to ordinary computers across the network. This kind of collaboration is getting a big boost from new so-called cloud-computing services from Amazon, Google and Microsoft. By driving down the cost of scientific computation, these services promise to be a boon to researchers in rich and poor nations alike.
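
SegHidro’s approach is a classic “bag of tasks” pattern: break one large model run into independent pieces and hand each piece to whichever machine happens to be free. The sketch below is purely illustrative, not OurGrid’s actual interface; run_cell and the grid dimensions are hypothetical stand-ins, and local worker processes play the role of machines on the grid.

    # Illustrative bag-of-tasks sketch (not OurGrid's real API): split a
    # regional climate run into independent grid cells and farm the cells
    # out to worker processes standing in for machines on a grid.
    from concurrent.futures import ProcessPoolExecutor

    def run_cell(cell):
        """Hypothetical stand-in for one local climate-modeling task."""
        lat, lon = cell
        # A real task would downscale a global model for this cell; here
        # we just return a dummy "rainfall" number so the sketch runs.
        return cell, (lat * 7 + lon * 3) % 100

    def main():
        # One independent task per cell of a coarse 10 x 10 regional grid.
        cells = [(lat, lon) for lat in range(10) for lon in range(10)]
        with ProcessPoolExecutor(max_workers=8) as pool:
            results = dict(pool.map(run_cell, cells))
        print(f"completed {len(results)} independent tasks")

    if __name__ == "__main__":
        main()

Because each cell is simulated on its own, the same decomposition works whether the workers are idle lab PCs on a grid or rented machines in a cloud.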

Distributing big research computing tasks via the Web isn’t new—for years scientists have been divvying up projects among global networks of volunteers who make their PCs available for data crunching. But setting up these arrangements is still costly and cumbersome, and requires computer expertise. This means that a lot of worthy but little-known projects—such as research on specific strains of antiretroviral-resistant HIV found only in parts of Africa and South America, or the type of local climate modeling that SegHidro carries out—have fallen by the wayside. Cloud computing, though, is beginning to put the power of big data centers at the fingertips of anybody with a Web browser.

Cloud computing has some attractive qualities for scientific researchers. It delivers data storage and processing as a service, rather than as software loaded onto a local hard drive or hardware that sits on a desk somewhere. Information is held in massive data centers spread all over the world and is available on request. In the cloud, the “supercomputer” exists virtually, meaning no clunky hardware; the software interface is easy to use; and scientists can reach their data and simulations from just about anywhere simply by logging in. Amazon has been leading the way in on-demand computing in recent years, which is invaluable for organizations with large databases that don’t necessarily want to hire an IT department. The service is flexible and pay-as-you-go: an hour of computing time will set you back up to 80 cents, and storing or moving data can cost as little as 10 cents per gigabyte. Subscribers buy only what they use, which is ideal for research departments that face periodic peaks in the computational power they require.
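
To make the pay-as-you-go arithmetic concrete, here is a rough cost sketch using the per-hour and per-gigabyte figures quoted above; the rates and the workload numbers are illustrative assumptions, not a price list.

    # Back-of-the-envelope cloud cost estimate using the figures quoted
    # above (circa-2009 rates; illustrative assumptions, not a price list).
    HOURLY_RATE = 0.80   # USD per machine-hour, at the top of the range
    PER_GB_RATE = 0.10   # USD per gigabyte of data stored or moved

    def burst_cost(machines: int, hours_each: float, gigabytes: float) -> float:
        """Estimate the bill for one bursty research computation."""
        return machines * hours_each * HOURLY_RATE + gigabytes * PER_GB_RATE

    # Example: 20 machines for a 48-hour simulation, plus 500 GB of data.
    # 20 * 48 h * $0.80 = $768 of compute; 500 GB * $0.10 = $50 of data.
    print(f"${burst_cost(machines=20, hours_each=48, gigabytes=500):,.2f}")  # $818.00

The point of the model is that idle time costs nothing: a department that bursts like this a few times a year pays for the bursts alone, not for a machine room in between.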

The cloud is already making high-performance computing more readily available to researchers in the developed world. The Nimbus project at Argonne National Laboratory in Illinois has developed open-source software that can launch a virtual supercomputer within minutes on Amazon’s Elastic Compute Cloud. Earlier this month, nuclear physicists at Brookhaven National Laboratory in New York used the service to rush through a set of new simulations on data from the lab’s Relativistic Heavy-Ion Collider (built to give a glimpse of what the universe may have looked like in its first few moments) rather than waiting weeks or months for a slot to open up on the lab’s own big computers. The innovation could have a profound effect, simplifying and speeding up research in everything from renewable energy to drug testing.
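
Nimbus ships its own tooling, but the core move, renting a cluster with one API call and releasing it the moment the run ends, can be sketched roughly as below. The sketch uses boto3, the current AWS SDK for Python, as a stand-in rather than anything Nimbus provides, and the image ID and instance counts are placeholders; read it as an illustration of elastic provisioning, not the Brookhaven workflow.

    # Illustrative sketch of renting a small compute cluster from EC2 by
    # API call, in the spirit of what Nimbus automates. Uses boto3 (the
    # AWS SDK for Python); the image ID below is a placeholder, not real.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Ask for eight identical worker nodes built from a prepared machine image.
    resp = ec2.run_instances(
        ImageId="ami-00000000",    # placeholder for your simulation image
        InstanceType="c5.xlarge",  # instance size chosen arbitrarily
        MinCount=8,
        MaxCount=8,
    )
    ids = [inst["InstanceId"] for inst in resp["Instances"]]

    # Block until the "virtual supercomputer" is up, then hand the node
    # list to whatever scheduler will farm out the simulation tasks.
    ec2.get_waiter("instance_running").wait(InstanceIds=ids)
    print("cluster ready:", ids)

    # Pay-as-you-go: release the nodes as soon as the run finishes.
    ec2.terminate_instances(InstanceIds=ids)

The whole life cycle, from request to teardown, takes minutes of wall-clock time, which is what lets a lab jump a weeks-long queue for its own supercomputer.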

The developing world stands to gain from the same benefits. Using the cloud, labs can forgo the cumbersome process of linking hundreds, if not thousands, of desktops—or add more muscle to the networks they already have. “Cloud computing promises to be the next big wave in computing democratization,” says Dan Reed, director of Microsoft’s Cloud Computing Futures. “Researchers in developing countries can access the same data and use the same computing infrastructure as researchers in [developed] countries.”

But clouds still have a way to go before they’re widely accepted. To start, they must earn the trust of researchers, who may think twice about loading all their information onto the Web. (What if it’s all lost?) In addition, the services are still too expensive for many developing nations; even mere pennies per gigabyte can quickly add up, especially for labs with very little to begin with. A row broke out recently over the issue of accessibility: big players like Google, Microsoft and Amazon refused to sign on to the Open Cloud Manifesto, a document drafted by a consortium of technology firms that seeks to define the cloud as a public resource for everyone’s benefit, to be given over to philanthropic causes wherever possible.

Researchers, though, may not have to wait for a manifesto. Prices will eventually drop, and cloud operators could make their own special arrangements for customers with large workloads, or for those from countries most in need. The UK’s Hadley Centre for Climate Prediction and Research, for instance, is now in negotiations with Amazon to sponsor a West African researcher with free access to its cloud-computing services. The cloud is already beginning to bridge the digital divide.

URL: http://www.newsweek.com/id/195734
