What are the differences between grid and clustered computing?


A "cluster" is a group of systems working together as one unit. For example, four database servers clustered together appear to clients as a single database server. If one system in the cluster goes down, the other servers are still available to do the work, and you can scale performance by adding more resources to the cluster.
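The idea above can be sketched in a few lines of Python. This is a toy model, not real clustering software; the node names and the pick-any-healthy-node routing are illustrative assumptions.

```python
# Minimal sketch: a cluster presents many nodes as one service,
# and surviving nodes keep serving requests when one fails.
class Cluster:
    """A group of nodes that appears to clients as a single server."""

    def __init__(self, nodes):
        self.healthy = set(nodes)

    def mark_down(self, node):
        # Simulate a node failure.
        self.healthy.discard(node)

    def handle(self, request):
        # Any healthy node can serve the request; the client never
        # needs to know which one responded.
        if not self.healthy:
            raise RuntimeError("no healthy nodes in the cluster")
        node = sorted(self.healthy)[0]  # arbitrary but deterministic choice
        return f"{node} served {request}"


db = Cluster(["db1", "db2", "db3", "db4"])
db.mark_down("db1")           # one server fails...
print(db.handle("SELECT 1"))  # ...the cluster still answers
```

Adding a name to the `healthy` set is the "scale by adding resources" step; real cluster managers do the same bookkeeping with heartbeats and load balancers.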

In a basic sense, grid computing is many systems performing many functions. When one function needs more resources, or the resources needed to perform a function are unavailable, the grid can provision idle resources from elsewhere in the grid.

Clustered computing was an evolutionary step toward grid computing: the work performed to make clustered computing a reality is also what makes grid computing a reality. A grid can be composed of multiple clusters, single nodes, or both. For instance, in the grid, I might have one database cluster, two web server clusters, and a single database node. When the database cluster needs more resources, the grid provisions one node from an idle web server cluster to help out the database cluster. Once the workload on the database cluster goes down, that node can be provisioned elsewhere.
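The provisioning scenario just described can be modeled as moving a node between pools. This is a deliberately simplified sketch; the pool and node names are made up, and a real grid scheduler would decide moves from load metrics rather than explicit calls.

```python
# Toy model of grid provisioning: pools of nodes (clusters or singles),
# with idle capacity moved to wherever the workload spikes.
class Grid:
    def __init__(self, pools):
        # pools maps a pool name to the list of nodes it currently owns
        self.pools = pools

    def provision(self, from_pool, to_pool):
        """Move one node from an idle pool to a busy one."""
        if not self.pools[from_pool]:
            raise RuntimeError(f"{from_pool} has no spare nodes")
        node = self.pools[from_pool].pop()
        self.pools[to_pool].append(node)
        return node


grid = Grid({
    "db-cluster":    ["db1", "db2"],
    "web-cluster-1": ["web1", "web2"],
    "web-cluster-2": ["web3", "web4"],
    "db-single":     ["db5"],
})

# Database workload spikes: borrow an idle web node...
moved = grid.provision("web-cluster-1", "db-cluster")
# ...and give it back (or send it elsewhere) when the load drops.
grid.provision("db-cluster", "web-cluster-1")
```

The key point the sketch captures is that membership in a pool is just bookkeeping to the grid: the same node can serve the database cluster today and a web cluster tomorrow.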

Note that you are not required to have clusters in your grid and not all clusters are part of a grid. But clusters and grids do work nicely together.

This was first published in January 2008
