Export and Import in Oracle

Oracle's export (exp) and import (imp) utilities are used to perform logical database backup and recovery. When exporting, database objects are dumped to a binary file which can then be imported into another Oracle database.

These utilities can be used to move data between different machines, databases or schema. However, as they use a proprietary binary file format, they can only be used between Oracle databases. One cannot export data and expect to import it into a non-Oracle database.

Various parameters are available to control what objects are exported or imported. To get a list of available parameters, run the exp or imp utilities with the help=yes parameter.
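These parameters can also be collected in a parameter file and passed with the PARFILE parameter, which keeps command lines short and avoids operating-system quoting problems. A hypothetical parameter file (the file name and values below are purely illustrative):

```
userid=system/manager
owner=scott
file=C:\backup\scott.dmp
log=C:\backup\scott.log
statistics=none
```

It would then be invoked as `exp parfile=export.par`.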

Export in Oracle

Syntax:

EXP userid=<user/password@sid> owner=<schema_owner> file=<path\filename.dmp> log=<path\logfile.log>

Exporting a whole schema

EXP userid=system/manager owner=scott STATISTICS=NONE file=C:\backup\scott_%today%.dmp log=C:\backup\scott_%today%.log

Exporting particular tables of a schema

C:\>EXP userid=system/admin@DIALNET FILE=C:\backup\scott_%today%.dmp log=C:\backup\scott_%today%.log TABLES=(SCOTT.EMP, SCOTT.DEPT)

Exporting particular rows of a table

EXP userid=system/manager@ORCL STATISTICS=NONE File=D:\SCOTT_Ora.dmp log=D:\SCOTT_Ora.log TABLES=scott.emp QUERY=\"where ename='JOHN'\"

(Note that OWNER and TABLES are mutually exclusive export modes and cannot be combined in one command.)

Another Export Example

exp scott/tiger file=abc.dmp tables=abc query=\"where sex=\'f\'\" rows=yes
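The backslashes in the two QUERY examples above are shell escapes, and the exact escaping needed differs between Windows and Unix shells. Placing QUERY in a parameter file avoids shell escaping altogether; a hypothetical parameter file for the last example might look like:

```
userid=scott/tiger
file=abc.dmp
tables=abc
rows=yes
query="where sex='f'"
```

invoked as `exp parfile=abc.par`.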

Import in Oracle

Syntax:

imp userid=<user/password@sid> file=<path\filename.dmp> fromuser=<schema> touser=<schema>

Importing a whole schema

imp userid=system file=Scott_ora.dmp fromuser=scott touser=scott

Importing particular tables

imp userid=system file=scott_ora.dmp fromuser=oraivr touser=oraivr TABLES=(EMP,DEPT)

Can one monitor how fast a table is imported?

If you need to monitor how fast rows are imported from a running import job, try one of the following methods:

Method 1:

select substr(sql_text, instr(sql_text, 'INTO "'), 30) table_name,
       rows_processed,
       round((sysdate - to_date(first_load_time, 'yyyy-mm-dd hh24:mi:ss')) * 24 * 60, 1) minutes,
       trunc(rows_processed /
             ((sysdate - to_date(first_load_time, 'yyyy-mm-dd hh24:mi:ss')) * 24 * 60)) rows_per_min
  from sys.v_$sqlarea
 where sql_text like 'INSERT %INTO "%'
   and command_type = 2
   and open_versions > 0;

For this to work one needs to be on Oracle 7.3 or higher (7.2 might also be OK). If the import has more than one table, this statement will only show information about the current table being imported.
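The rows_per_min column in the query above is simply rows_processed divided by the elapsed minutes since first_load_time. A quick sanity check of that arithmetic, using invented figures:

```shell
# Mirror the SQL arithmetic: rows_per_min = trunc(rows_processed / minutes).
# The numbers below are made up purely to illustrate the calculation.
rows_processed=120000
minutes=4                          # elapsed minutes since first_load_time
rows_per_min=$((rows_processed / minutes))
echo "${rows_per_min} rows/min"    # integer division, like trunc() in the SQL
```

So an import that has pushed 120000 rows in 4 minutes is running at 30000 rows per minute.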

Method 2:

Use the FEEDBACK=n import parameter. It tells IMP to display a dot for every n rows imported. For example, FEEDBACK=1000 will show a dot after every 1000 rows.
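The effect can be sketched as follows. This is only a simulation of what IMP does internally; the row counts are invented:

```shell
# Simulate IMP's FEEDBACK=n behaviour: one dot per completed batch of n rows.
feedback=1000
total_rows=5500
dots=""
imported=0
while [ $((imported + feedback)) -le "$total_rows" ]; do
  imported=$((imported + feedback))
  dots="${dots}."
done
printf '%s\n' "$dots"    # 5500 rows at FEEDBACK=1000 -> five dots
```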

How can one improve Import/Export performance?

EXPORT:

  • Set the BUFFER parameter to a high value (e.g. 2 MB, entered as the integer 2000000)
  • Set the RECORDLENGTH parameter to a high value (e.g. 64 KB, entered as the integer 64000)
  • Use DIRECT=yes (direct mode export)
  • Stop unnecessary applications to free-up resources for your job.
  • If you run multiple export sessions, ensure they write to different physical disks.
  • DO NOT export to an NFS mounted filesystem. It will take forever.
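The export tips above could be combined in a hypothetical parameter file along these lines (note that BUFFER only affects conventional-path exports and is ignored once DIRECT=yes is set, so RECORDLENGTH is the knob that matters here; paths are illustrative):

```
userid=system/manager
owner=scott
direct=yes
recordlength=65535
file=/u02/export/scott.dmp
log=/u02/export/scott.log
```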

IMPORT:

  • Create an indexfile so that you can create indexes AFTER you have imported data. Do this by setting INDEXFILE to a filename and then import. No data will be imported but a file containing index definitions will be created. You must edit this file afterwards and supply the passwords for the schemas on all CONNECT statements.
  • Place the file to be imported on a separate physical disk from the oracle data files
  • Increase DB_CACHE_SIZE (DB_BLOCK_BUFFERS prior to 9i) considerably in the init$SID.ora file
  • Set the LOG_BUFFER to a big value and restart oracle.
  • Stop redo log archiving if it is running (ALTER DATABASE NOARCHIVELOG;)
  • Create a BIG tablespace with a BIG rollback segment inside. Set all other rollback segments offline (except the SYSTEM rollback segment of course). The rollback segment must be as big as your biggest table (I think?)
  • Use COMMIT=N in the import parameter file if you can afford it
  • Use STATISTICS=NONE in the import parameter file to avoid the time-consuming import of optimizer statistics
  • Remember to run the indexfile previously created
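As a sketch, the import-side tips might be gathered into a parameter file like the hypothetical one below. INDEXES=N skips index creation during the data load, on the assumption that the indexfile generated earlier will be run afterwards:

```
userid=system
file=scott_ora.dmp
log=scott_imp.log
fromuser=scott
touser=scott
buffer=2000000
commit=n
statistics=none
indexes=n
```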