
SQL Server overworked, or more memory necessary?

For an 80-160 dimension OLAP query over three or four tables, each with 50-100 million rows, the query takes an hour on average on SQL Server. It is doing full table scans 60% of the time, with heavy paging and I/O queue depths of 5-6. The many GROUP BY and ORDER BY clauses cause a lot of sorting, mostly on disk and some in memory. Would doubling the memory and striping across twice as many disks solve the problem, or is SQL Server simply overworked in this case? CPU utilization is 100% with three users; there are seven or eight users in total, but the rest cannot get into the system. We are currently running a two-CPU Xeon 2.4 GHz machine with 4 GB of RAM. The database size is 300 GB; the average table size is 50 GB, and the maximum table size is 80 GB.

I would recommend taking a look at the data design before throwing more hardware at the problem. 80-160 dimensions in a single query seems excessive. Could it be that you have a one-size-fits-all data mart? I suggest getting your data architect/business analyst involved to gain an in-depth appreciation for how that query result set is actually being used, then review the data architecture for areas where the design can be improved.
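One concrete example of the kind of design change to look for: if a handful of groupings dominate the workload, pre-aggregating them into an indexed view lets SQL Server answer those GROUP BY queries from a small, materialized summary instead of scanning and sorting 50-100 million row base tables. This is only a sketch; the table and column names (dbo.FactSales, StoreID, SaleDate, Amount) are hypothetical stand-ins for your own schema:

```sql
-- Hypothetical fact table; all names here are illustrative only.
-- An indexed view pre-computes the aggregate so that queries grouping
-- by store and month read the small materialized result instead of
-- scanning and sorting the full fact table.
CREATE VIEW dbo.SalesByStoreMonth
WITH SCHEMABINDING
AS
SELECT StoreID,
       DATEPART(year, SaleDate)  AS SaleYear,
       DATEPART(month, SaleDate) AS SaleMonth,
       SUM(Amount)               AS TotalAmount,
       COUNT_BIG(*)              AS RowCnt   -- required for indexed views
FROM dbo.FactSales
GROUP BY StoreID, DATEPART(year, SaleDate), DATEPART(month, SaleDate);
GO

-- The unique clustered index is what materializes the view on disk.
CREATE UNIQUE CLUSTERED INDEX IX_SalesByStoreMonth
ON dbo.SalesByStoreMonth (StoreID, SaleYear, SaleMonth);
```

Whether an indexed view (or a plain summary table refreshed on a schedule) is the right fit depends on how volatile the fact data is, which is exactly the kind of question the data architect review should settle first.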


This was last published in January 2004

