Wednesday, April 13, 2011

Resume and Attach Data Pump Operation

The ATTACH parameter is one of the more useful parameters in Oracle Data Pump. It comes in handy when a Data Pump operation is in progress and we need to pause it for a while and continue it later. This can be done by using the ATTACH parameter of Data Pump.
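
In outline, the whole workflow looks like this (a rough sketch only; the connect string, directory and file names are placeholders, and the full demo follows below):

C:\>expdp system/password@db directory=dp_dir schemas=hr dumpfile=hr_schema     (start the export as usual)
(Press Ctrl+C in the expdp client to reach the Export> prompt)
Export> STOP_JOB                                                (pause the job; its master table is kept)
C:\>expdp system/password@db attach=SYS_EXPORT_SCHEMA_01        (re-attach later, using the job name)
Export> CONTINUE_CLIENT                                         (restart the job and resume the log output)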

Supplying a job name allows the job to be more easily identified when querying the DBA_DATAPUMP_JOBS data dictionary view. If a job name is not specified, Oracle creates a default name for you of the form SYS_EXPORT_mode_nn, where mode denotes the type of export (FULL, SCHEMA, TABLE, etc.) and nn is an incrementing number starting at 01. The job name is then used to attach, and re-attach, to running jobs with the ATTACH parameter.
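
For example, a job name can be given explicitly with the JOB_NAME parameter (the credentials, directory and names below are only placeholders for illustration):

C:\>expdp system/password@noida directory=datapump1 schemas=hr dumpfile=hr_schema logfile=ex_hrlog.log job_name=HR_EXPORT_JOB

The job then appears in DBA_DATAPUMP_JOBS as HR_EXPORT_JOB, and we can later reconnect to it with attach=HR_EXPORT_JOB instead of the generated SYS_EXPORT_SCHEMA_nn name.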

The ATTACH parameter attaches the client session to an existing export job and automatically places you in the interactive-command interface. Export displays a description of the job to which you are attached and then shows the Export> prompt.
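
At the Export> prompt, the most commonly used interactive commands are the following (a quick reference, not a complete list):

Export> STATUS             (show the current state of the job and its worker processes)
Export> STOP_JOB           (stop the job but keep its master table, so it can be restarted later)
Export> START_JOB          (restart a stopped job)
Export> CONTINUE_CLIENT    (return to logging mode and restart the job if it is idling)
Export> EXIT_CLIENT        (leave the client session while the job keeps running on the server)
Export> KILL_JOB           (detach all clients and delete the job, including its master table)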

For a successful attachment, the dump file must be uncorrupted and in its proper location; otherwise we will not be able to attach. Below is a demo of the ATTACH parameter.

C:\>expdp system/xxxx@noida  directory=datapump1   schemas=hr   dumpfile=hr_schema  logfile=ex_hrlog.log
 Export: Release 11.1.0.6.0 - Production on Wednesday, 13 April, 2011 14:21:25
Copyright (c) 2003, 2007, Oracle.  All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_SCHEMA_01":  system/********@noida directory=datapump1 dumpfile=hr_schema logfile=ex_hrlog.log schemas=hr
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 448 KB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE

(Press Ctrl+C to interrupt the logging display and switch to the interactive-command interface, then perform the following steps at the Export> prompt)

Export> stop_job
Are you sure you wish to stop this job ([yes]/no): y

Now find out the name of the job to attach to:

C:\>set ORACLE_SID=noida
C:\>sqlplus / as sysdba
SQL*Plus: Release 11.1.0.6.0 - Production on Wed Apr 13 14:27:31 2011
Copyright (c) 1982, 2007, Oracle.  All rights reserved.
Connected to:
Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
SQL> select  owner_name , job_name , operation, job_mode from dba_datapump_jobs;
OWNER_NAME   JOB_NAME               OPERATION   JOB_MODE
----------   --------------------   ---------   --------
SYSTEM       SYS_EXPORT_SCHEMA_01   EXPORT      SCHEMA

SQL> exit
Disconnected from Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options

C:\>expdp system/xxxx@noida attach=SYS_EXPORT_SCHEMA_01
Export: Release 11.1.0.6.0 - Production on Wednesday, 13 April, 2011 14:36:26
Copyright (c) 2003, 2007, Oracle.  All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Job: SYS_EXPORT_SCHEMA_01
  Owner: SYSTEM
  Operation: EXPORT
  Creator Privs: TRUE
  GUID: A370CE1FA5154C27955DF34D010CB822
  Start Time: Wednesday, 13 April, 2011 14:37:27
  Mode: SCHEMA
  Instance: noida
  Max Parallelism: 1
  EXPORT Job Parameters:
  Parameter Name      Parameter Value:
     CLIENT_COMMAND        system/********@noida directory=datapump1 dumpfile=hr_schema logfile=ex_hrlog.log schemas=hr
  State: IDLING
  Bytes Processed: 0
  Current Parallelism: 1
  Job Error Count: 0
  Dump File: d:\dpump\hr_schema.dmp
    bytes written: 28,672
Worker 1 Status:
  Process Name: DW01
  State: UNDEFINED

Export> CONTINUE_CLIENT

Job SYS_EXPORT_SCHEMA_01 has been reopened at Wednesday, 13 April, 2011 14:37
Restarting "SYSTEM"."SYS_EXPORT_SCHEMA_01":  system/********@noida directory=datapump1 dumpfile=hr_schema logfile=ex_hrlog.log schemas=hr
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
Processing object type SCHEMA_EXPORT/VIEW/VIEW
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
. . exported "HR"."COUNTRIES"                            6.375 KB      25 rows
. . exported "HR"."DEPARTMENTS"                          7.015 KB      27 rows
. . exported "HR"."EMPLOYEES"                            16.80 KB     107 rows
. . exported "HR"."JOBS"                                 6.984 KB      19 rows
. . exported "HR"."JOB_HISTORY"                          7.054 KB      10 rows
. . exported "HR"."LOCATIONS"                            8.273 KB      23 rows
. . exported "HR"."REGIONS"                              5.484 KB       4 rows
Master table "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYSTEM.SYS_EXPORT_SCHEMA_01 is:
  d:\dpump\hr_schema.dmp
Job "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully completed at 14:44:00


Enjoy :)


