We have an Oracle 8 database for one of the largest telecom providers. Every day we extract a huge amount of data from the main database server to another database. Our main problem is that the extraction and insertion take a long time, approximately 16 hours. We use PL/SQL procedures, scheduled to run at midnight, which complete in about 16 hours. Is there a better solution to this problem? We need the data to be available within six or seven hours.
PL/SQL can be rather slow for bulk inserts of data. I would look into using SQL*Loader, External Tables (Oracle 9i and later), or straight SQL statements. I have a white paper on my Web site titled "Data Loading with External Tables," which works through a sample performance case loading data in multiple ways.
SQL*Loader is Oracle's utility for bulk loading data from a text file. So you'll have to dump the data to a text file first. But the speed of this utility can't be beat.
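As a rough sketch, a minimal SQL*Loader setup might look like the following; the table name, column names, and file names are hypothetical:

```sql
-- load_calls.ctl -- hypothetical SQL*Loader control file
LOAD DATA
INFILE 'calls_extract.dat'
APPEND
INTO TABLE destination_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(call_id, subscriber_id,
 call_date DATE 'YYYY-MM-DD HH24:MI:SS',
 duration)
```

You would then invoke the utility from the command line, for example with `sqlldr userid=scott/tiger control=load_calls.ctl log=load_calls.log direct=true`. The `direct=true` option requests a direct-path load, which bypasses much of the normal SQL processing and is typically the fastest way to load a flat file.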
You are not using Oracle 9i, so the External Table option is not available to you. External Tables use the SQL*Loader engine to load the data, but the option adds some nice features, such as letting you query the flat file with ordinary SQL. The white paper mentioned above discusses External Tables.
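For reference, an External Table definition on Oracle 9i or later looks roughly like this; the directory path, file name, and columns are hypothetical:

```sql
-- Hypothetical example; requires Oracle 9i or later.
CREATE DIRECTORY ext_dir AS '/data/extracts';

CREATE TABLE calls_ext (
  call_id       NUMBER,
  subscriber_id NUMBER,
  call_date     DATE,
  duration      NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    (call_id, subscriber_id,
     call_date DATE 'YYYY-MM-DD',
     duration)
  )
  LOCATION ('calls_extract.dat')
)
REJECT LIMIT UNLIMITED;

-- The flat file can then be loaded with plain SQL:
INSERT INTO destination_table SELECT * FROM calls_ext;
```

The advantage over plain SQL*Loader is that the load becomes an ordinary SQL statement, so you can filter, join, and transform the data as you load it.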
A simple SQL statement, if possible, would be preferable to any PL/SQL block for loading data. Such a statement might look similar to the following:
INSERT INTO destination_table
SELECT * FROM source_table@source_database
WHERE source_column = my_condition;

In this case, the source table is in a remote database, accessed through a database link.
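On Oracle 8 you may also be able to speed up such an insert with a direct-path load, using the APPEND hint (and NOLOGGING on the destination table, if your backup and recovery strategy allows it). A sketch using the same hypothetical names:

```sql
-- Optional: minimize redo generation for the bulk load.
-- Make sure this is compatible with your recovery strategy first.
ALTER TABLE destination_table NOLOGGING;

-- Direct-path insert: data is written above the table's high-water
-- mark, bypassing the buffer cache and conventional-path overhead.
INSERT /*+ APPEND */ INTO destination_table
SELECT * FROM source_table@source_database
WHERE source_column = my_condition;

COMMIT;  -- required before the table can be queried again in this session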
If you convert to straight SQL or SQL*Loader, you should see significant improvements in your load times.