Dani's IT Blog

…moving from Oracle to Software Development

The gains and pains of dbms_datapump. Part I

with 20 comments

I personally have a love/hate relationship with dbms_datapump. On the one hand it is a very powerful tool; on the other, its documentation and error messages are not very helpful. Because of this, I would like to share some of my findings related to dbms_datapump. Let's start with some code which everyone can use to reproduce what I show here.

The following code copies a schema over a database link from one database to another. You can also use the same database as source and target, if you want to. I tried to keep the code as simple as possible, so pre- and post-checks are missing. The code is based on the one from Chris Poole, as well as on some ideas from Andrew Fraser and an Oracle Support engineer called Dan. Many thanks to all of them.

CREATE OR REPLACE PROCEDURE CopySchemaBlog (
  sourceSchema      IN VARCHAR2,
  destinationSchema IN VARCHAR2,
  databaseLink      IN VARCHAR2,
  newPassword       IN VARCHAR2,
  debug             IN BOOLEAN := FALSE)
IS
  handle                  NUMBER;
  message                 VARCHAR2(2000);
  flashbackSCN            NUMBER;
  logfileName             VARCHAR2(200);
  destinationSchemaExists NUMBER;
  jobState                user_datapump_jobs.state%TYPE;
BEGIN
  dbms_output.put_line('verify whether destination schema '||destinationSchema||' already exists');
  SELECT COUNT(username)
    INTO destinationSchemaExists
    FROM dba_users
      WHERE username = destinationSchema
      AND username NOT IN ('SYS','SYSTEM','DBSNMP','DIP','EXFSYS','OUTLN','TSMSYS');
  IF ( destinationSchemaExists > 0 ) THEN
    dbms_output.put_line('drop user '||destinationSchema);
    EXECUTE IMMEDIATE 'DROP USER '||destinationSchema||' CASCADE';
  END IF;

  dbms_output.put_line('open datapump job over database link '||databaseLink);
  handle := dbms_datapump.open (
    operation =>    'IMPORT',
    job_mode =>     'SCHEMA',
    remote_link =>  databaseLink);

  IF debug = TRUE THEN
    dbms_output.put_line('set logfile parameter');
    SELECT 'datapump_'||to_char(SYSDATE,'YYYYMMDD')||'.log'
      INTO logfileName
      FROM dual;
    dbms_datapump.add_file (
      handle =>     handle,
      filename =>   logfileName,
      directory =>  'DATA_PUMP_DIR',
      filetype =>   dbms_datapump.ku$_file_type_log_file);
  END IF;

  dbms_output.put_line('set FLASHBACK_SCN parameter');
  dbms_datapump.set_parameter(
    handle =>   handle,
    name =>     'FLASHBACK_SCN',
    value =>    dbms_flashback.get_system_change_number);

  dbms_output.put_line('define SCHEMA_LIST');
  dbms_datapump.metadata_filter(
    handle =>   handle,
    name =>     'SCHEMA_LIST',
    value =>    ''''||sourceSchema||'''');

  dbms_output.put_line('define schema remapping');
  dbms_datapump.metadata_remap(
    handle =>     handle,
    name =>       'REMAP_SCHEMA',
    old_value =>  sourceSchema,
    value =>      destinationSchema);

  dbms_output.put_line('start datapump job');
  dbms_datapump.start_job (handle);

  dbms_output.put_line('wait for job to finish');
  dbms_datapump.wait_for_job(handle, jobState);

  dbms_output.put_line('set new password for destination schema '||destinationSchema);
  EXECUTE IMMEDIATE 'ALTER USER '||destinationSchema||' IDENTIFIED BY '||newPassword;
EXCEPTION
WHEN OTHERS THEN
  dbms_output.put_line('-------------------------------------------------------------------------------------');
  dbms_output.put_line('-- Error Backtrace ------------------------------------------------------------------');
  dbms_output.put_line(dbms_utility.format_error_backtrace());
  dbms_output.put_line('-- Call Stack -----------------------------------------------------------------------');
  dbms_output.put_line(dbms_utility.format_call_stack());
  dbms_output.put_line('-------------------------------------------------------------------------------------');

  dbms_datapump.stop_job(handle);
  RAISE;
END;
/

When you run it, the output should look similar to the following.

SQL> exec copyschemablog('ORIGINAL','COPY','DAR102B','pw', TRUE);
verify whether destination schema COPY already exists
drop user COPY
open datapump job over database link DAR102B
set logfile parameter
set FLASHBACK_SCN parameter
define SCHEMA_LIST
define schema remapping
start datapump job
wait for job to finish
set new password for destination schema COPY

PL/SQL procedure successfully completed.

The logfile is written to the DATA_PUMP_DIR. Mine had the following content.


Starting "SYS"."SYS_IMPORT_SCHEMA_01":
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
. . imported "COPY"."T" 3 rows
Job "SYS"."SYS_IMPORT_SCHEMA_01" successfully completed at 16:05:01

To use this procedure, install it in the SYS schema of your test database (I have tested it with 10.2.0.2, 10.2.0.4 and 11.1.0.7 on Linux). In addition, create a database link which connects as the SYSTEM user. You can also use the SID of your database as the database link parameter, because a hidden database link to the database itself exists under the name of the SID. In a follow-up post I will show how dbms_datapump can be run with lower-privileged users (the documentation lacks some information in this area).
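For completeness, such a database link could be created along these lines. This is only a sketch: the link name, password and TNS alias are placeholders, not values from this post.

```sql
-- Placeholder names: adjust link name, credentials and TNS alias
-- to your environment before running this as the importing user.
CREATE DATABASE LINK DAR102B
  CONNECT TO system IDENTIFIED BY "your_system_password"
  USING 'DAR102B';

-- Quick sanity check that the link works:
SELECT instance_name FROM v$instance@DAR102B;
```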

When you run some tests with dbms_datapump, you might stumble upon one of the following errors:
ORA-31623: a job is not attached to this session via the specified handle
ORA-39001: invalid argument value
ORA-39004: invalid state

But what do they mean?
ORA-31623 is raised when you try to stop a job which doesn't exist. Usually the dbms_datapump.open function didn't succeed, and a subsequent call which tries to use the handle then raises this error.
ORA-39001 is a generic error and is raised for almost any problem with dbms_datapump.
ORA-39004 is raised when you try to run dbms_datapump through a database link which uses a connection qualifier (e.g. sid.domain.com@userxyz). After hours of testing I opened an SR, which resulted in the enhancement request 811695.1, "DataPump Import Over a NETWORK_LINK Fails if the Database Link Contains Connection Qualifiers".
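Since ORA-31623 typically comes from calling stop_job with a handle that was never successfully opened, an exception handler can guard against it, so that the original error is re-raised instead of being masked. A minimal sketch of the pattern (the guard is my addition, not part of the procedure above; the broken link name is a placeholder):

```sql
DECLARE
  handle NUMBER;
BEGIN
  -- If open fails, handle stays unset and stop_job would raise ORA-31623.
  handle := dbms_datapump.open(
    operation   => 'IMPORT',
    job_mode    => 'SCHEMA',
    remote_link => 'BROKEN_LINK');
  dbms_datapump.start_job(handle);
EXCEPTION
  WHEN OTHERS THEN
    -- Guard stop_job so it cannot mask the original error with ORA-31623.
    BEGIN
      dbms_datapump.stop_job(handle);
    EXCEPTION
      WHEN OTHERS THEN
        IF SQLCODE != -31623 THEN
          RAISE;
        END IF;
    END;
    RAISE;
END;
/
```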

Now the question is: what can you do if ORA-39001: invalid argument value is raised? First run the procedure with debug => TRUE, which forces it to write a logfile, and have a look at that logfile. If this doesn't help you any further, you can use SQL tracing to gain more information. Depending on the problem, you might have to trace on the remote database. You can activate tracing, amongst other ways, with the following command

ALTER SESSION SET EVENTS='10046 trace name context forever, level 12';

Afterwards, search for the string err= in your trace file, then run oerr with the number behind err= to find out what the error message stands for. For example, if you find err=942, run
oerr ora 942
00942, 00000, "table or view does not exist"
// *Cause:
// *Action:

If no logfile is written at all, then usually the dbms_datapump.open call didn't finish. In this case the database link is usually the problem. Verify your database link with SELECT * FROM v$instance@your_database_link.

Now I wish you a lot of fun with dbms_datapump. It would be nice to read some comments about your experiences with it.


Written by danirey

August 13, 2009 at 07:50

20 Responses


  1. I just hit your blog and found this post very useful, Dani. Thank you for sharing

    coskan

    August 13, 2009 at 17:32

    • Hi Coskan

      Thanks for the kind words. I follow your blog too. Keep up the good work!

      danirey

      August 13, 2009 at 17:57

  2. [...] Dani Rey-The gains and pains of dbms_datapump. Part I [...]

  3. [...] a comment » In my first post about dbms_datapump I showed you some basics about this usefull but sometimes hard to use package. Usually the next [...]

  4. i want to export hr schema using stored procedure,how will i do this….plz send me the code,and its usage..
    i am new to oracle

    IMRAN KHAN

    June 24, 2010 at 12:28

  5. Hi Imran

    I prepared a little example for you. I hope this is what you were looking for. It's based on some code I once found on the Internet, but unfortunately I don't remember the author.

    DECLARE
      handle      NUMBER;                         -- Data Pump job handle
      jobState    user_datapump_jobs.state%TYPE;  -- The status of the datapump job
      logfileName VARCHAR2(100);                  -- The name of the logfile
    BEGIN
    
    
    -- Create a (user-named) Data Pump job to do a schema export.
      handle := DBMS_DATAPUMP.OPEN('EXPORT','SCHEMA',NULL,'export_test');
    
    --define logfile
      SELECT 'datapump_'||to_char(SYSDATE,'YYYYMMDD')||'.log'
        INTO logfileName
        FROM dual;
      DBMS_DATAPUMP.ADD_FILE (
        handle =>     handle,
        filename =>   logfileName,
        directory =>  'DATA_PUMP_DIR',
        filetype =>   dbms_datapump.ku$_file_type_log_file);
    
    -- Specify a single dump file for the job (using the handle just returned)
    -- and a directory object, which must already be defined and accessible
    -- to the user running this procedure.
      DBMS_DATAPUMP.ADD_FILE(handle,'hr.dmp','DATA_PUMP_DIR');
    
    -- A metadata filter is used to specify the schema that will be exported.
      DBMS_DATAPUMP.METADATA_FILTER(handle,'SCHEMA_EXPR','IN (''HR'')');
    -- Start the job. An exception will be generated if something is not set up
    -- properly. 
      DBMS_DATAPUMP.START_JOB(handle);
      dbms_datapump.wait_for_job(handle, jobState);
      
      EXCEPTION
      WHEN OTHERS THEN
        dbms_output.put_line('-------------------------------------------------------------------------------------');
        dbms_output.put_line('-- Error Backtrace ------------------------------------------------------------------');
        dbms_output.put_line(dbms_utility.format_error_backtrace());
        dbms_output.put_line('-- Call Stack -----------------------------------------------------------------------');
        dbms_output.put_line(dbms_utility.format_call_stack());
        dbms_output.put_line('-------------------------------------------------------------------------------------');
        dbms_datapump.stop_job(handle);
        RAISE;
    END;
    /
    
    

    As you say you're new to Oracle, I would like to recommend my little "PL/SQL Basics for the DBA" series – http://danirey.wordpress.com/tag/plsql-basics/

    danirey

    June 28, 2010 at 05:53

  6. Hi Dani,

    thanks for this very useful post.

    I'm trying to replicate a schema from our production environment (10.2.0.3) into an XE database, which is installed on my local computer. For this, I'm executing your procedure on the XE with a user to which I've already granted the imp_full_database role and the connect and resource privileges (in the end even dba). On the source side, the user ('OMT') is granted exp_full_database (although I guess that is not necessary at all, since it is only exporting its own schema?)

    Database link points directly to the source schema ‘OMT’.

    When i execute the procedure

    begin
      CopySchemaBlog( sourceschema      => 'OMT',
                      destinationschema => 'testuser',
                      databaselink      => 'DWH_OMT',
                      newpassword       => 'test');
    end;

    i keep receiving the same error:

    ORA-31631: privileges are required
    ORA-06512: at "COPYSCHEMABLOG", line 82

    What privs am i missing? Thank’s for your support.

    Best Regards,
    Onur

    onurgokcen

    June 8, 2011 at 09:21

  7. Hi,
    thanks for your CopySchemaBlog procedure. It was very helpful for me.
    I had a problem defining destinationSchema with mixed lower and upper case characters (e.g. "AbCd").
    The procedure failed at "EXECUTE IMMEDIATE 'ALTER USER' ..." or "EXECUTE IMMEDIATE 'DROP USER' ...".
    I think Oracle internally handles lower case characters and spaces differently.
    When defining destinationSchema all in upper case, the procedure worked.
    Regards,
    Wilfried

    Wilfried

    June 14, 2011 at 11:40

    • Hi Wilfried

      My code only works with “standard” usernames, without mixed cases or special signs.

      If you would like to drop users with mixed case usernames you would have to change

      EXECUTE IMMEDIATE 'DROP USER '||destinationSchema||' CASCADE';

      to

      EXECUTE IMMEDIATE 'DROP USER "'||destinationSchema||'" CASCADE';

      Basically you have to protect the username with double quotes.
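      [Editor's note] An alternative, which is my suggestion and not from the original comment, is to let the built-in DBMS_ASSERT package do the quoting; ENQUOTE_NAME also validates the identifier:

```sql
-- ENQUOTE_NAME wraps the name in double quotes and validates it,
-- which also guards against SQL injection via the username parameter.
EXECUTE IMMEDIATE 'DROP USER '
  || sys.dbms_assert.enquote_name(destinationSchema, capitalize => FALSE)
  || ' CASCADE';
```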

      Regards
      Dani

      danirey

      June 16, 2011 at 21:10

  8. Hi Dani,

    You can make exception handler quite elaborative using the following code:

    EXCEPTION 
    WHEN OTHERS THEN
      DECLARE
        l_job_state VARCHAR2(4000);
        l_status ku$_status1020;
      BEGIN
        dbms_datapump.get_status(handle => handle,
                                 mask => dbms_datapump.KU$_STATUS_JOB_ERROR,
                                 job_state => l_job_state,
                                 status => l_status);
        dbms_output.put_line('STATE: ' || l_job_state);
        IF l_status.error IS NOT NULL AND l_status.error.count > 0 THEN
          dbms_output.put_line('ERROR: ');
          FOR i IN l_status.error.first..l_status.error.last LOOP
            dbms_output.put_line(l_status.error(i).LogText);
          END LOOP;
        END IF;
      END;
    
      dbms_datapump.stop_job(handle);
      RAISE;
    END;
    

    Nikolay Melekhin

    October 20, 2011 at 13:35

    • Hi Nikolay

      Many thanks for the valuable input. I'm currently on vacation and therefore don't have a machine to test it. But if it does what I think, it is very nice. Maybe you can run it with a user which doesn't have enough privileges, or a database link which does not work, and post the output.

      Cheers
      Dani

      danirey

      October 20, 2011 at 19:18

  9. Hi Dani,

    I made a procedure based on yours. I created a db link, and through this db link I import data to the destination schema. I created a scheduled job that has to run every night, but it runs only the first time; the next run fails with this error:
    "ORA-31623: a job is not attached to this session via the specified handle
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 1137
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 5658
    ORA-06512: at "SYS.COPYSCHEMAHUGODWH", line 96
    ORA-06512: at line 1"

    I see that the import datapump job ("SYS_IMPORT_SCHEMA_07") terminates successfully and disappears from the dba_datapump_jobs table, but the SYS_EXPORT_SCHEMA_01 job keeps hanging there in "not running" status, and the owner is the destination schema user.

    please help me,
    thank you,
    Szili

    szilard kele

    September 3, 2013 at 10:43

    • Hi Szili

      ORA-31623 is raised when you try to stop a job which doesn’t exist. Usually the dbms_datapump.open function didn’t succeed and now a following procedure, which tries to use the handle, raises this error.

      For further analysis you need to add some logging. So run your procedure with debug => TRUE

      Depending on the problem, you might have to trace on the remote database. You can activate trace, amongst others, with the following command

      ALTER SESSION SET EVENTS='10046 trace name context forever, level 12';

      Afterwards search for the string err= in your trace file. Then run oerr with the number behind err= to find out what the error message stands for.

      Good Luck
      Dani

      danirey

      September 3, 2013 at 11:39

      • Hi Dani,

        ty, for quick response.
        I have the debug = true, in the log file I see:

        Starting "SYS"."SYS_IMPORT_SCHEMA_07":
        .
        .
        .
        Job "SYS"."SYS_IMPORT_SCHEMA_07" completed with 11 error(s) at …

        the errors are procedure and package compilation warnings (ORA-39082).
        Can these cause the MY_SHEMA.SYS_EXPORT_SCHEMA_01 job to hang?

        On the other hand everything looks ok, the schema is created and the data is loaded.

        But when the job runs the next time, MY_SHEMA already exists and contains the sys_export_schema_01 master table, and my job fails.
        When I manually drop the master table, my job runs ok.

        Should I put this master table drop at the end of my procedure, or is this a very dirty solution?

        thanks for advise..
        szili

        szilard kele

        September 3, 2013 at 14:09

      • Hi Szili

        You’re welcome. Looks like a good analysis. Your compile errors shouldn’t be the problem. But using a target schema, which already exists, is an issue.

        I always deleted the target schema, when I wanted to copy a schema. Otherwise you might end up with a version mix of your source schema. For example, the name of a table might have changed. Without deleting the target schema upfront, you will end up with the old and the new version of this table.

        You can see my delete commands in the beginning of the CopySchemaBlog Procedure in the post http://danirey.wordpress.com/2009/09/15/the-gains-and-pains-of-dbms_datapump-part-ii/

        Cheers
        Dani

        danirey

        September 4, 2013 at 09:14

  10. Nice article, Dani, but I can't catch a whole class of errors with a simple EXCEPTION section. If the job *completes* with errors, which may be as severe as the requested objects not existing, no exception is thrown and the procedure completes "successfully":

    DECLARE
      job NUMBER;
      job_state VARCHAR2(30);
      status ku$_Status;
      -- logentry, etc...
      datapump_more_info EXCEPTION;
      PRAGMA EXCEPTION_INIT(datapump_more_info, -31627);
      invalid_argval EXCEPTION;
      PRAGMA EXCEPTION_INIT(invalid_argval, -39001);
    BEGIN
      job := DBMS_DATAPUMP.OPEN('EXPORT', 'TABLE', NULL, 'FAIL_JOB');

      DBMS_DATAPUMP.ADD_FILE(job, 'fail.dmp', 'DATA_PUMP_DIR');
      DBMS_DATAPUMP.METADATA_FILTER(job, 'NAME_LIST', '''NONEXISTENT_TABLE''');

      DBMS_DATAPUMP.START_JOB(job);
      DBMS_DATAPUMP.WAIT_FOR_JOB(job, job_state);

      dbms_output.put_line('Job state = '||job_state);
      DBMS_DATAPUMP.DETACH(job);
    EXCEPTION
      WHEN others THEN
        dbms_output.put_line('exception:');
        DBMS_DATAPUMP.GET_STATUS(handle => job,
                                 mask => dbms_datapump.ku$_status_job_error +
                                         dbms_datapump.ku$_status_job_status +
                                         dbms_datapump.ku$_status_wip,
                                 timeout => -1, job_state => job_state, status => status);
        -- yadda yadda
        RAISE;
    END;

    Also, by the time WAIT_FOR_JOB returns, the job is no longer valid and you can't use GET_STATUS to see the errors. So far my only solution is to loop after starting the job, call GET_STATUS repeatedly to get the error log entries, print the lines, and raise a user exception at the end. Any other ideas?

    arielCo

    September 7, 2013 at 00:42

    • Unfortunately this is true and hasn't changed for the better with Oracle 12c.

      In the PL/SQL Packages and Types Reference you can find a description of all the issues which can only be detected by querying dbms_datapump.get_status.

      In the Database Utilities documentation you can find an example of a procedure which loops over dbms_datapump.get_status to handle such situations.
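      [Editor's note] The polling approach mentioned here can be sketched roughly like this. It is an untested sketch modeled on the documentation example, with placeholder job and dump file names:

```sql
DECLARE
  job       NUMBER;
  job_state VARCHAR2(30);
  status    ku$_Status;
BEGIN
  job := dbms_datapump.open('EXPORT', 'SCHEMA', NULL, 'POLL_JOB');
  dbms_datapump.add_file(job, 'poll.dmp', 'DATA_PUMP_DIR');
  dbms_datapump.start_job(job);

  -- Poll with get_status instead of wait_for_job, so error log
  -- entries can still be read while the job is alive.
  LOOP
    dbms_datapump.get_status(
      handle    => job,
      mask      => dbms_datapump.ku$_status_job_error +
                   dbms_datapump.ku$_status_job_status +
                   dbms_datapump.ku$_status_wip,
      timeout   => 5,
      job_state => job_state,
      status    => status);
    IF status.error IS NOT NULL AND status.error.count > 0 THEN
      FOR i IN status.error.first .. status.error.last LOOP
        dbms_output.put_line(status.error(i).LogText);
      END LOOP;
    END IF;
    EXIT WHEN job_state = 'COMPLETED' OR job_state = 'STOPPED';
  END LOOP;

  dbms_datapump.detach(job);
END;
/
```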

      danirey

      September 15, 2013 at 12:12

    • Hi Dani,

      Thanks for the advice, and sorry for the late response, I was ill. You were right: the problem was that it could not drop the user, therefore it went to the exception block, where it wanted to stop a job which didn't exist…

      I then killed the session which was blocking the drop. The first day it ran fine, but from the second day/run my log is full of these errors:
      ORA-31693: Table data object "destinationschema"."ROAD_USAGE_PAYMENT_EXTRA_DATA":"ACTIVE" failed to load/unload and is being skipped due to error:
      ORA-01555: snapshot too old: rollback segment number 11 with name "_SYSSMU11_1081956174$" too small
      ORA-02063: preceding line from "source_schema"

      have you any idea why I get this? thank you

      best regards,
      szilard

      szilard kele

      September 23, 2013 at 15:12

      • Hi Dani,

        so the problem is probably that the undo_retention is too small? I found a script:

        SELECT d.undo_size/(1024*1024) "ACTUAL UNDO SIZE [MByte]",
               SUBSTR(e.value,1,25) "UNDO RETENTION [Sec]",
               ROUND((d.undo_size / (to_number(f.value) *
                 g.undo_block_per_sec))) "OPTIMAL UNDO RETENTION [Sec]"
          FROM (
                SELECT SUM(a.bytes) undo_size
                  FROM v$datafile a,
                       v$tablespace b,
                       dba_tablespaces c
                 WHERE c.contents = 'UNDO'
                   AND c.status = 'ONLINE'
                   AND b.name = c.tablespace_name
                   AND a.ts# = b.ts#
               ) d,
               v$parameter e,
               v$parameter f,
               (
                SELECT MAX(undoblks/((end_time-begin_time)*3600*24))
                       undo_block_per_sec
                  FROM v$undostat
               ) g
         WHERE e.name = 'undo_retention'
           AND f.name = 'db_block_size';

        the result on source schema is:

        "ACTUAL UNDO SIZE [MByte]","UNDO RETENTION [Sec]","OPTIMAL UNDO RETENTION [Sec]"
        10305, "900", 32370

        Do I have to increase the undo retention to at least 32370? Does it affect the production db in any unwanted way?

        ty,
        szili

        szilard kele

        September 23, 2013 at 17:39

