Ovum on: Cloud Computing as the Enabler of HPC


by David Mitchell, SVP of IT Research at Ovum

Last week Wolfram Research announced that it was entering the cloud computing field. Together with Nimbis Services and R Systems, Wolfram is creating a service that allows users of the Mathematica product line to tap into additional high-performance computing (HPC) capacity in the cloud. The models that Mathematica users create are typically very computationally intensive, taking hours or even days to execute, which makes them ideal consumers of cloud-based HPC services – provided they are constructed to take advantage of highly parallel and distributed computing facilities.
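
To make that parallelism requirement concrete, here is a minimal sketch (in Python rather than the Wolfram Language, and not Wolfram's actual service API) of the embarrassingly parallel shape such a model needs: many independent runs whose results are combined at the end, so the work can be spread across cores or machines.

    # Hypothetical illustration: a model structured as independent simulations,
    # the shape of workload that parallel and distributed HPC rewards.
    import random
    from multiprocessing import Pool

    def simulate(seed: int) -> float:
        """One independent Monte Carlo trial: estimate pi from 100,000 samples."""
        rng = random.Random(seed)
        hits = sum(1 for _ in range(100_000)
                   if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
        return 4.0 * hits / 100_000

    if __name__ == "__main__":
        with Pool() as pool:                           # one worker per local core
            estimates = pool.map(simulate, range(32))  # trivially distributable
        print(sum(estimates) / len(estimates))

The same structure distributes naturally across a cluster or a cloud, because no run depends on any other.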

HPC is becoming more mainstream

The introduction of cloud services by Wolfram is another demonstration that HPC has come of age. It follows the wider push that Microsoft made into the HPC arena with the release of Windows HPC Server 2008 in September 2008. Both provide an infrastructure to support HPC workloads for packaged applications – Mathematica in the first case, and applications such as Excel and SharePoint in the latter – plus more general-purpose applications that can be developed with Visual Studio.

HPC was once available only to the scientific community and the 'quants' in the financial services industry. It required custom development to take advantage of multiple processors and multiple computers, typically built on the Message Passing Interface (MPI) communications protocol as the software enabler. Facilities cost millions of dollars and took years to create, with many funded by government research grants. Nowadays the supercomputer market is much more approachable, with the entry-level Cray CX1, running Windows HPC Server 2008, retailing for around $25,000.
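
For a flavour of what that custom development involves, the sketch below uses the mpi4py Python binding to MPI (the classic interfaces are C and Fortran). It is an illustrative fragment, not code from any system mentioned here: each process works on its own slice of a series, and a reduce operation combines the partial sums.

    # Minimal MPI sketch (run with: mpiexec -n 4 python pi_mpi.py).
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()   # this process's id, 0..size-1
    size = comm.Get_size()   # total number of cooperating processes

    N = 1_000_000
    # Leibniz series for pi, with terms striped across the ranks.
    partial = sum((-1.0) ** k / (2 * k + 1) for k in range(rank, N, size))
    total = comm.reduce(partial, op=MPI.SUM, root=0)  # partial sums combined on rank 0

    if rank == 0:
        print("pi is approximately", 4 * total)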

HPC adoption is also being driven by the increasing volume of available data and the desire of businesses to generate insight from it. Data is increasingly available from core transactional systems, as well as from high-volume sensor-based sources such as satellite imagery, GPS data and RFID. Extracting business insight from these data sources requires huge volumes of computing resources and sophisticated analysis tools.

HPC and Cloud are natural bedfellows, but brokers and other market roles will need to emerge

Cloud computing follows on from previous attempts, such as software as a service (SaaS) and grid computing, to create a variable-cost consumption model for computing services. However, the broader move is really towards computational markets. Currently there are computing consumers and computing providers, but, as in the utilities and energy markets, a richer variety of market roles will emerge. These will include wholesale providers, service aggregators, retail providers and metering agents.

The role of Nimbis Services within the Wolfram service is highly strategic, as it gives an early indication of the evolution towards these market roles. It allows the Wolfram service to draw upon many different computing providers, including several of the global TOP500 supercomputers and the facilities of the Amazon Elastic Compute Cloud. In effect, it acts as an intermediary, or broker, through which consumers can draw computing resources from different service providers.
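
The broker pattern itself is straightforward to sketch. The Python fragment below is purely hypothetical (the class names, providers and prices are invented for illustration and bear no relation to Nimbis's actual interface), but it shows the core job: match a request to a capable provider and meter what is consumed.

    # Hypothetical broker sketch: match a compute request to the cheapest capable
    # provider and meter the usage. All names and prices are invented.
    from dataclasses import dataclass

    @dataclass
    class Provider:
        name: str
        free_cores: int
        price_per_core_hour: float   # the metering basis

    class Broker:
        def __init__(self, providers: list[Provider]):
            self.providers = providers

        def place(self, cores: int, hours: float) -> tuple[str, float]:
            """Pick the cheapest provider with enough capacity; return (name, cost)."""
            capable = [p for p in self.providers if p.free_cores >= cores]
            if not capable:
                raise RuntimeError("no provider can satisfy the request")
            best = min(capable, key=lambda p: p.price_per_core_hour)
            best.free_cores -= cores
            return best.name, cores * hours * best.price_per_core_hour

    broker = Broker([Provider("supercomputer-A", 512, 0.90),
                     Provider("ec2-style-cloud", 10_000, 1.20)])
    print(broker.place(cores=256, hours=4.0))   # cheapest capable provider wins

A real broker must also handle authentication, scheduling and data movement, but the matching-and-metering core is exactly the new market role described above.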

There are many cloud computing providers coming on stream, but there are few facilities that allow the cloud computing consumer to change providers quickly, nor is there yet the range of interoperability standards that a fully liquid market would need – as Salesforce.com CEO Marc Benioff recently pointed out. However, a great deal more than technical standards will be needed if there are to be genuine moves towards the creation of computational markets.


