Method other than DB link for large data transfer
We are developing a datamart (on Oracle 9i) that takes data from two source systems. Because these are transaction systems, the data volumes are large. Currently we do incremental loads using a spool file and external tables, but the spool file is huge, takes a long time to generate, and consumes a lot of space on the server. Are there other ways to do this large data transfer?
Transmitting large volumes of data to a warehouse is a challenge that always deserves careful consideration. Your options are somewhat constrained: you are already creating flat files (the spool file that feeds your external tables), and you have found that approach too slow and too space-hungry. A database link would be another reasonable choice, but as your subject line says, that is ruled out too. So what else is there?

My suggestion would be transportable tablespaces (TT). With TT you do a VERY quick export of just the metadata, then copy that small export file together with the tablespace's datafiles from the source system and plug them into the target database. The row data is never unloaded or reloaded, so the transfer cost is essentially a file copy.

Another option is export/import. You can write the export into a named pipe and compress it on the fly, which avoids landing a huge dump file on disk and can improve performance considerably.

You could also create a standby database, but you would still have to transmit the data to the warehouse, so it does not really reduce the transfer volume.
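To make the TT flow concrete, here is a sketch of the 9i steps. The tablespace name (users_ts), datafile path, and credentials are placeholders, not anything from your system, and you should first confirm the tablespace set is self-contained with DBMS_TTS.

```sql
-- Sanity check: is the tablespace set self-contained?
EXEC DBMS_TTS.TRANSPORT_SET_CHECK('users_ts', TRUE);
SELECT * FROM transport_set_violations;   -- should return no rows

-- 1. On the source, make the tablespace read-only for the copy.
ALTER TABLESPACE users_ts READ ONLY;

-- 2. Export only the metadata (run exp from the OS shell):
--    exp userid="sys/password as sysdba" transport_tablespace=y
--        tablespaces=users_ts file=users_ts.dmp

-- 3. Copy users_ts.dmp and the datafile(s) to the target server,
--    then plug them in (again from the OS shell):
--    imp userid="sys/password as sysdba" transport_tablespace=y
--        datafiles='/u01/oradata/users01.dbf' file=users_ts.dmp

-- 4. Return the source tablespace to read-write when done.
ALTER TABLESPACE users_ts READ WRITE;
```

Note that in 9i both databases must be on the same platform and use compatible block sizes for this to work.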
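The piping trick works because exp will happily write to a named pipe while another process compresses what comes out the other end, so the full-size dump never touches disk. Here is a minimal generic sketch of the pattern; a plain printf stands in for exp (an assumption for illustration), and the /tmp paths are placeholders.

```shell
#!/bin/sh
# Create a named pipe (FIFO) to stand between producer and compressor.
mkfifo /tmp/exp_pipe

# Reader: compress whatever arrives on the pipe, in the background.
gzip -c < /tmp/exp_pipe > /tmp/export.dmp.gz &

# Producer: with Oracle this would be something like
#   exp userid=scott/tiger tables=emp file=/tmp/exp_pipe
# Here a simple printf plays that role.
printf 'sample export data\n' > /tmp/exp_pipe

# Wait for gzip to finish, then verify the round trip.
wait
gunzip -c /tmp/export.dmp.gz   # prints: sample export data
rm -f /tmp/exp_pipe
```

On the import side you can reverse the arrangement: gunzip writes into a pipe and imp reads from it, so neither end ever materializes the uncompressed file.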
I hope this gives you some food for thought.