Data Pump (and exp/imp)

Oracle Data Pump can be used to load the historical data from the source database into the target database, so that the source and target databases are in sync before replication starts. Data Pump is most useful when a subset of the database is being replicated. If the full database is being replicated, consider using a standby database, RMAN duplicate, or RMAN restore and recover to create the target database, although Data Pump can still be used for copying the full database. 

As part of the setup wizard, Dbvisit Replicate creates the Data Pump scripts, which are then run manually to export and import the data.

The advantages of Dbvisit Replicate creating the Data Pump script are:

  • The Data Pump (or exp) script contains a "flashback_scn" parameter with the correct SCN, so that the data is exported at the correct point in time. Dbvisit Replicate knows this SCN and replication will start from it, ensuring no data is lost during the transfer (see the query after this list).
  • All the tables to be replicated are contained in the Data Pump (or exp) script. 
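
For context, the SCN (System Change Number) is the point-in-time marker that flashback_scn refers to. To see the current SCN of the source database, the following standard Oracle query (not specific to Dbvisit Replicate) can be used:

SELECT current_scn FROM v$database;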

The following different Data Pump options are available in the setup wizard:

  1. Data Pump with network link. This Data Pump script is run from the target server and uses a manually created database link to extract data from the source database over SQL*Net into the target database. No export datafile is created. 
  2. Data Pump with export datafile. This Data Pump script is run from the source server, where it creates an export datafile which has to be manually copied to the target server, where the Data Pump import is run. 
  3. Pre-Data Pump exp/imp with export datafile. This exp script is run from the source server, where it creates an export datafile which has to be manually copied to the target server, where the imp script is run. 

In the setup wizard the above options are displayed as follows:

What data copy script to create? (dp_networklink/dp_exp/exp/ddl_file/ddl_run/none) 

The options in the setup wizard are:

  1. dp_networklink - Data Pump with network link
  2. dp_exp - Data Pump with export datafile
  3. exp - Oracle 9 export/import with export datafile
  4. ddl_file - A SQL DDL file is created to only create the objects (no data). This SQL has to be manually run in order to create the objects in the target database.
  5. ddl_run - The SQL DDL is automatically created and run at the target database to create the objects (no data)
  6. none - No data copy script is created

When is the Data Pump script created?

The Data Pump script (or any of the other scripts: exp, ddl_file, ddl_run) is NOT created when the setup wizard is run, but when the *-all.sh or *-all.bat script is run.

The Data Pump command to create the Data Pump script is in the .dbvrep script (which is run as part of the *-all.sh or *-all.bat script). The command starts with:

ENGINE PREPARE_DP ...

The flashback SCN is generated at the time that the *-all.sh or *-all.bat script is run.

How is the Data Pump script run?

The Data Pump, exp/imp or DDL script is generated by Dbvisit Replicate. It is then the responsibility of the user to run the generated script. The script can have one of two names:

  1. APPLY.sh
  2. APPLY.sql

The APPLY.sh file contains the Data Pump or exp/imp script. Please review the script BEFORE running it: check the TNS names, check the schema and table names, and (for the network link option) ensure that the database link has already been created. Make sure that the file contains the commands you want to run.
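
For example, a quick pre-flight check of the TNS entries might look like this (the TNS names are taken from the examples later in this article; substitute your own):

tnsping d112f_dbvisit210
tnsping d112f_dbvisit230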

The APPLY.sql file contains the DDL script (no data). Before running the APPLY.sql script, ensure that the object names and datatypes are what you want. This is especially important when going from Oracle to a different RDBMS.

Depending on the type of script, it has to be run on the source server, the target server, or both:

  1. dp_networklink - Run APPLY.sh on the target server. A database link in the target database pointing to the source database will have to be created MANUALLY (see the example later in this article)
  2. dp_exp - Edit the APPLY.sh and run the Data Pump export part on the source server, transfer the export datafile to the target server, then run the Data Pump import part on the target server. 
  3. exp - Edit the APPLY.sh and run the export part on the source server, transfer the export datafile to the target server, then run the import part on the target server. 
  4. ddl_file - Run the APPLY.sql on the target database
  5. ddl_run - No script has to be run manually; the DDL is run automatically at the target database

Can the Data Pump script generation be done manually?

Yes, the Data Pump script can be generated manually by Dbvisit Replicate outside of the setup wizard. The same command that is in the *.dbvrep script can be run at any time in the Dbvisit Replicate console to generate a new Data Pump or export/import script. The command is one of the internal ENGINE commands of Dbvisit Replicate. The flashback_scn is determined at the time the ENGINE command is run. 

Examples:

Manually generating the Data Pump script with a network link: 

ENGINE PREPARE_DP WRITE DP_NETWORKLINK DIRECTORY DATA_PUMP_DIR FILE /home/oracle/d112f/APPLY.sh DBLINK d112f_dbvisit210 USERID SYSTEM/oracle@d112f_dbvisit230 

Manually generating the exp/imp script with an export datafile:

ENGINE PREPARE_DP WRITE EXP FILE /home/oracle/d112f/APPLY.sh MINE_USERID SYSTEM/oracle@d112f_dbvisit210 APPLY_USERID SYSTEM/oracle@d112f_dbvisit230

Examples of Data Pump scripts generated by Dbvisit Replicate

An example of the Data Pump with network link script which is generated by Dbvisit Replicate is:

impdp SYSTEM/xxx@d112f_dbvisit230 table_exists_action=TRUNCATE network_link=d112f_dbvisit210 directory=DATA_PUMP_DIR flashback_scn=297940636 tables=OE.CUSTOMERS,OE.INVENTORIES,OE.LOGON,OE.ORDERENTRY_METADATA,OE.ORDERS,OE.ORDER_ITEMS,OE.PRODUCT_DESCRIPTIONS,OE.PRODUCT_INFORMATION,OE.STRESSTESTTABLE,OE.WAREHOUSES logfile=OE_CUSTOMERS.log JOB_NAME=DP_d112f_0001

If the import is going into a different tablespace, make sure you specify that with the parameter REMAP_TABLESPACE=source_tablespace:target_tablespace.
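
For example (a shortened version of the import command above, with the parameter appended; the tablespace names USERS and OE_DATA are purely illustrative):

impdp SYSTEM/xxx@d112f_dbvisit230 table_exists_action=TRUNCATE network_link=d112f_dbvisit210 directory=DATA_PUMP_DIR flashback_scn=297940636 tables=OE.CUSTOMERS logfile=OE_CUSTOMERS.log REMAP_TABLESPACE=USERS:OE_DATA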

In this example:

  • The source TNS name is: d112f_dbvisit210
  • The target TNS name is: d112f_dbvisit230
  • The flashback_scn is 297940636. This is the starting point for replication; all data up to this point is exported from the source database and imported into the target database.
  • The network_link is: d112f_dbvisit210. This has to be created manually in the target database with the following command:

CREATE PUBLIC DATABASE LINK d112f_dbvisit210 CONNECT TO system IDENTIFIED BY xxx USING 'd112f_dbvisit210';
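
Once created, the link can be verified from the target database with a simple query over it, for example:

SELECT sysdate FROM dual@d112f_dbvisit210;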

Data Pump with export datafile

An example of a Data Pump with export datafile script is:

echo ========================================
echo EXPORT
echo ========================================
expdp SYSTEM/xxxxxx@d112f_dbvisit210 dumpfile=OE_SCHEMA.dmp directory=DATA_PUMP_DIR \
flashback_scn=xxxxxxxxxx tables=OE.CUSTOMERS,OE.INVENTORIES,OE.LOGON,\
OE.ORDERENTRY_METADATA,OE.ORDERS,OE.ORDER_ITEMS,\
OE.PRODUCT_DESCRIPTIONS,OE.PRODUCT_INFORMATION,\
OE.STRESSTESTTABLE,OE.WAREHOUSES \
logfile=exp_OE_schema.log JOB_NAME=DP_d112f_0001
echo ========================================
echo IMPORT
echo ========================================
impdp SYSTEM/xxxx@d112f_dbvisit230 dumpfile=OE_SCHEMA.dmp directory=DATA_PUMP_DIR logfile=imp_OE_SCHEMA.log

The EXPORT part has to be run on the source server and the IMPORT part has to be run on the target server. 
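
The OS path that DATA_PUMP_DIR maps to can be found on either server with the query below; the scp command is only an illustration (host name and paths will differ in your environment):

SELECT directory_path FROM dba_directories WHERE directory_name = 'DATA_PUMP_DIR';

scp /u01/app/oracle/admin/d112f/dpdump/OE_SCHEMA.dmp oracle@target-server:/u01/app/oracle/admin/d112f/dpdump/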

If you have multiple tables whose names differ only by a special character and you select them one by one, the generated APPLY.sh can contain duplicate dump file names. Please check the APPLY.sh file before exporting, confirm the dump file names are unique, and make adjustments in the file if necessary before running it.

If you have LOB tables with a high amount of concurrent updates as well as reads of the LOB column, and you are getting ORA-01555: snapshot too old: rollback segment number with name "" too small or ORA-22924: snapshot too old errors, you may need to increase PCTVERSION or RETENTION. See forum document http://support.dbvisit.com/hc/en-us/articles/216524058
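
For illustration only (the LOB column name "notes" and the PCTVERSION value are hypothetical; PCTVERSION and RETENTION are mutually exclusive, so use one or the other):

ALTER TABLE OE.CUSTOMERS MODIFY LOB (notes) (PCTVERSION 20);
ALTER TABLE OE.CUSTOMERS MODIFY LOB (notes) (RETENTION);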

The normal order of starting the replication and running Data Pump is:

  1. Run the setup wizard
  2. Run the resulting *-all.sh or *-all.bat script
  3. Start Data Pump
  4. Start the MINE process. You do not have to wait until Data Pump has completed before starting the MINE process
  5. Start the APPLY process once Data Pump has completed and the data instantiation on the target database has finished

Examples

See the child pages below for examples.