
Clustered computing vs. grid computing

A SearchOracle.com member asks, "What are the differences between grid and clustered computing?"

A "cluster" is a group of systems working together as one unit. An example is four database servers clustered together that appear to clients as a single database server. If one system in the cluster goes down, the other servers are still available to handle the work. You can scale performance by adding more resources to the cluster.
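The failover behavior described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the `Node` and `Cluster` classes are invented for this example, not part of any real clustering product): requests are routed round-robin over the healthy nodes, so losing one node does not make the service unavailable.

```python
class Node:
    def __init__(self, name):
        self.name = name
        self.healthy = True

class Cluster:
    """Hypothetical cluster: routes each request to a healthy node."""
    def __init__(self, nodes):
        self.nodes = nodes
        self._next = 0

    def route(self):
        # Only consider nodes that are still up.
        candidates = [n for n in self.nodes if n.healthy]
        if not candidates:
            raise RuntimeError("cluster unavailable: no healthy nodes")
        # Round-robin over the survivors.
        node = candidates[self._next % len(candidates)]
        self._next += 1
        return node.name

cluster = Cluster([Node("db1"), Node("db2"), Node("db3"), Node("db4")])
cluster.nodes[0].healthy = False              # db1 goes down
served = {cluster.route() for _ in range(6)}  # the other three keep serving
print(sorted(served))
```

Adding a fifth `Node` to the list is all it takes to scale out in this toy model, which mirrors the "add more resources to the cluster" point above.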

In a basic sense, grid computing is many systems performing many functions. When one function needs more resources, or resources to perform a function are unavailable, the grid can provision idle resources from elsewhere in the grid.

Clustered computing was an evolutionary step toward grid computing: the work done to make clustered computing a reality is also what makes grid computing possible. A grid can be composed of multiple clusters, single nodes, or both. For instance, a grid might contain one database cluster, two web server clusters, and a single database node. If the database cluster needs more resources, the grid provisions one node from an idle web server cluster to help out the database cluster. Once the workload on the database cluster goes down, that node can be provisioned elsewhere.
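The provisioning step described above can be sketched as follows. This is a hypothetical toy model (the `Cluster`, `Grid`, and `provision` names are invented for illustration, not a real grid scheduler's API): when one cluster gets busy, the grid moves an idle node from another cluster to it.

```python
class Cluster:
    def __init__(self, name, nodes):
        self.name = name
        self.nodes = list(nodes)

class Grid:
    """Hypothetical grid: a pool of clusters that can trade nodes."""
    def __init__(self, clusters):
        self.clusters = {c.name: c for c in clusters}

    def provision(self, donor, recipient):
        # Move one idle node from the donor cluster to the busy one.
        node = self.clusters[donor].nodes.pop()
        self.clusters[recipient].nodes.append(node)
        return node

grid = Grid([
    Cluster("db", ["db1", "db2"]),
    Cluster("web", ["web1", "web2", "web3"]),
])

# The db cluster is under load, so borrow a node from the idle web cluster.
moved = grid.provision(donor="web", recipient="db")
print(moved, len(grid.clusters["db"].nodes))
```

When the database workload drops, the same `provision` call in the opposite direction returns the node, which is the "provisioned elsewhere" step in the answer above.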

Note that you are not required to have clusters in your grid, and not all clusters are part of a grid. But clusters and grids do work nicely together.
