Dani's IT Blog

…moving from Oracle to Software Development

Archive for the ‘Advanced’ Category

The gains and pains of dbms_datapump. Part II

with one comment

In my first post about dbms_datapump I showed you some basics about this useful but sometimes hard-to-use package. Usually the next question after “how can I use this package?” is “how can I use it with a user other than SYS?”.

The goal is to install the procedure from my last post under a separate user. The database link should connect to a separate user as well.

Which privileges does a user need to install my procedure?

  • CREATE PROCEDURE is, of course, needed
  • IMP_FULL_DATABASE is the basic role to import stuff via dbms_datapump
  • CREATE TABLE is needed by dbms_datapump, because it uses a table to save some information during execution
  • with CREATE TABLE comes the need for QUOTA on the DEFAULT TABLESPACE
  • DROP and ALTER USER are specific to my procedure, because I drop the user and alter its password
  • CREATE SESSION is not necessarily needed. I would prefer to write a wrapper procedure around CopySchemaBlog and grant execute on it to my application user, or whatever user is best suited to run my jobs
  • GRANT SELECT ON sys.dba_users is needed to verify whether the user that is going to be copied already exists
  • EXECUTE ON sys.dbms_flashback is needed for the procedure to be valid
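The wrapper mentioned above could look roughly like this. This is only a minimal sketch: the procedure name RunNightlyCopy, the hard-coded parameter values and the grantee app_user are placeholders of mine, not part of the original setup.

```sql
-- hypothetical wrapper around CopySchemaBlog; grant EXECUTE on it
-- instead of granting the datapump privileges to the calling user
CREATE OR REPLACE PROCEDURE target_user.RunNightlyCopy IS
BEGIN
  target_user.CopySchemaBlog(
    sourceSchema      => 'ORIGINAL',
    destinationSchema => 'COPY',
    databaseLink      => 'blog',
    newPassword       => 'test1234');
END;
/

GRANT EXECUTE ON target_user.RunNightlyCopy TO app_user;
```

Because the wrapper runs with definer's rights, app_user needs no datapump privileges of its own, and TARGET_USER can live without CREATE SESSION.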



In the source database we need a user with a different set of privileges.

  • EXP_FULL_DATABASE is the basic privilege needed to export data of different users with dbms_datapump
  • FLASHBACK ANY TABLE is needed to do a consistent datapump export using FLASHBACK_SCN parameter. With this parameter set, dbms_datapump uses Flashback Query to export a consistent set of data. No Flashback Logs are required, because Flashback Query uses the UNDO tablespace and not Flashback Logs to get the before image of changed blocks
  • CREATE SESSION is needed to connect via database link to the remote site
  • SELECT ON sys.v_$instance is needed for my procedure to be able to verify from which instance the data is copied. This privilege is not needed by dbms_datapump
  • EXECUTE ON sys.dbms_flashback is needed to get the current system change number


    This is the script for the user on the source database.

    CREATE USER source_user IDENTIFIED BY &pw
    DEFAULT TABLESPACE users
    TEMPORARY TABLESPACE temp;
    
    --privileges for dbms_datapump
    GRANT EXP_FULL_DATABASE TO source_user;
    GRANT FLASHBACK ANY TABLE TO source_user;
    GRANT CREATE SESSION TO source_user;
    
    ALTER USER source_user DEFAULT ROLE ALL;
    
    GRANT SELECT ON sys.v_$instance TO source_user;
    GRANT EXECUTE ON sys.dbms_flashback TO source_user;
    

    This is the script to install the user and the CopySchemaBlog procedure on the destination database.

    CREATE USER target_user IDENTIFIED BY &pw
    DEFAULT TABLESPACE users
    TEMPORARY TABLESPACE temp;
    
    --privileges for dbms_datapump
    GRANT IMP_FULL_DATABASE TO target_user;
    GRANT DROP USER TO target_user;
    GRANT ALTER USER TO target_user;
    GRANT CREATE SESSION TO target_user;
    GRANT CREATE TABLE TO target_user;
    --GRANT FLASHBACK ANY TABLE TO target_user;
    
    ALTER USER target_user DEFAULT ROLE ALL;
    ALTER USER target_user QUOTA UNLIMITED ON users;
    
    GRANT SELECT ON sys.dba_users TO target_user;
    GRANT EXECUTE ON sys.dbms_flashback TO target_user;
    
    CREATE OR REPLACE PROCEDURE target_user.CopySchemaBlog (
      sourceSchema      IN VARCHAR2,
      destinationSchema IN VARCHAR2,
      databaseLink      IN VARCHAR2,
      newPassword       IN VARCHAR2,
      debug             IN BOOLEAN := FALSE)
    IS
      handle                  NUMBER;
      message                 VARCHAR2(2000);
      flashbackSCN            NUMBER;
      logfileName             VARCHAR2(200);
      destinationSchemaExists NUMBER;
      jobState                user_datapump_jobs.state%TYPE;
    BEGIN
      dbms_output.put_line('verify whether destination schema '||destinationSchema||' already exists');
      SELECT COUNT(username)
        INTO destinationSchemaExists
        FROM dba_users
          WHERE username = destinationSchema
          AND username NOT IN ('SYS','SYSTEM','DBSNMP','DIP','EXFSYS','OUTLN','TSMSYS');
      IF ( destinationSchemaExists > 0 ) THEN
        dbms_output.put_line('drop user '||destinationSchema);
        EXECUTE IMMEDIATE 'DROP USER '||destinationSchema||' CASCADE';
      END IF;
    
      dbms_output.put_line('open datapump job over database link '||databaseLink);
      handle := dbms_datapump.open (
        operation =>    'IMPORT',
        job_mode =>     'SCHEMA',
        remote_link =>  databaseLink);
    
      IF debug = TRUE THEN
        dbms_output.put_line('set logfile parameter');
        SELECT 'datapump_'||to_char(SYSDATE,'YYYYMMDD')||'.log'
          INTO logfileName
          FROM dual;
        dbms_datapump.add_file (
          handle =>     handle,
          filename =>   logfileName,
          directory =>  'DATA_PUMP_DIR',
          filetype =>   dbms_datapump.ku$_file_type_log_file);
      END IF;
    
      dbms_output.put_line('set FLASHBACK_SCN parameter');
      dbms_datapump.set_parameter(
        handle =>   handle,
        name =>     'FLASHBACK_SCN',
        value =>    dbms_flashback.get_system_change_number);
    
      dbms_output.put_line('define SCHEMA_LIST');
      dbms_datapump.metadata_filter(
        handle =>   handle,
        name =>     'SCHEMA_LIST',
        value =>    ''''||sourceSchema||'''');
    
      dbms_output.put_line('define schema remapping');
      dbms_datapump.metadata_remap(
        handle =>     handle,
        name =>       'REMAP_SCHEMA',
        old_value =>  sourceSchema,
        value =>      destinationSchema);
    
      dbms_output.put_line('start datapump job');
      dbms_datapump.start_job (handle);
    
      dbms_output.put_line('wait for job to finish');
      dbms_datapump.wait_for_job(handle, jobState);
    
      dbms_output.put_line('set new password for destination schema '||destinationSchema);
      EXECUTE IMMEDIATE 'ALTER USER '||destinationSchema||' IDENTIFIED BY '||newPassword;
    EXCEPTION
    WHEN OTHERS THEN
      dbms_output.put_line('-------------------------------------------------------------------------------------');
      dbms_output.put_line('-- Error Backtrace ------------------------------------------------------------------');
      dbms_output.put_line(dbms_utility.format_error_backtrace());
      dbms_output.put_line('-- Call Stack -----------------------------------------------------------------------');
      dbms_output.put_line(dbms_utility.format_call_stack());
      dbms_output.put_line('-------------------------------------------------------------------------------------');
    
      dbms_datapump.stop_job(handle);
      RAISE;
    END;
    /
    

    I tested the whole process on 11gR2 by installing the source and target users in one database. Then I created a user called ORIGINAL.

    
    SQL> CREATE USER original IDENTIFIED BY test1234;
    
    User created.
    
    SQL> GRANT connect, resource TO original;
    
    Grant succeeded.
    
    SQL> CREATE TABLE original.t (x NUMBER);
    
    Table created.
    
    SQL> INSERT INTO original.t VALUES(1);
    
    1 row created.
    
    SQL> commit;
    
    Commit complete.
    


    Afterwards I logged in as TARGET_USER and created a database link pointing to the same database (GLOBAL_NAMES=FALSE) but connecting as SOURCE_USER.

    
    CREATE DATABASE LINK blog CONNECT TO source_user IDENTIFIED BY test1234 USING 'dar112.ltdar.trivadis.com';
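    As a side note, you can quickly check how GLOBAL_NAMES is set before creating such a link. This check is an addition of mine, not part of the original test run.

```sql
-- if global_names is TRUE, the link name must match the remote global name
SELECT value FROM v$parameter WHERE name = 'global_names';
```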
    


    Now everything was prepared to execute a test run.

    
    SQL> set serveroutput on
    SQL> exec CopySchemaBlog(sourceSchema => 'ORIGINAL', destinationSchema => 'COPY', databaseLink => 'blog', newPassword => 'test1234', debug => TRUE);
    verify whether destination schema COPY already exists
    open datapump job over database link blog
    set logfile parameter
    set FLASHBACK_SCN parameter
    define SCHEMA_LIST
    define schema remapping
    start datapump job
    wait for job to finish
    set new password for destination schema COPY
    
    PL/SQL procedure successfully completed.
    


    The logfile had the following content

    
    Starting "TARGET_USER"."SYS_IMPORT_SCHEMA_02":
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 64 KB
    Processing object type SCHEMA_EXPORT/USER
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    . . imported "COPY"."T"                                       1 rows
    Job "TARGET_USER"."SYS_IMPORT_SCHEMA_02" successfully completed at 23:56:11
    


    Everything ran fine, which proves that the privileges are correct. My code should work with all versions of 10g and 11g (but I only tested it with 11gR2 and partially with 10gR2 and 11gR1).

    Written by danirey

    September 15, 2009 at 07:02

    The gains and pains of dbms_datapump. Part I

    with 20 comments

    I personally have a love/hate relationship with dbms_datapump. On one hand it is a very powerful tool, on the other hand documentation and error messages are not very helpful. Because of this, I would like to show you some of my findings related to dbms_datapump. Let’s start with some code, which everyone can use to test what I show here.

    The following code copies a schema over a database link from one database to another. You can also use the same database as source and target if you want to. I tried to keep the code as simple as possible, so pre- and post-checks are missing. The code is based on the one from Chris Poole as well as some ideas from Andrew Fraser and an Oracle Support engineer called Dan. Many thanks to all of them.

    CREATE OR REPLACE PROCEDURE CopySchemaBlog (
      sourceSchema      IN VARCHAR2,
      destinationSchema IN VARCHAR2,
      databaseLink      IN VARCHAR2,
      newPassword       IN VARCHAR2,
      debug             IN BOOLEAN := FALSE)
    IS
      handle                  NUMBER;
      message                 VARCHAR2(2000);
      flashbackSCN            NUMBER;
      logfileName             VARCHAR2(200);
      destinationSchemaExists NUMBER;
      jobState                user_datapump_jobs.state%TYPE;
    BEGIN
      dbms_output.put_line('verify whether destination schema '||destinationSchema||' already exists');
      SELECT COUNT(username)
        INTO destinationSchemaExists
        FROM dba_users
          WHERE username = destinationSchema
          AND username NOT IN ('SYS','SYSTEM','DBSNMP','DIP','EXFSYS','OUTLN','TSMSYS');
      IF ( destinationSchemaExists > 0 ) THEN
        dbms_output.put_line('drop user '||destinationSchema);
        EXECUTE IMMEDIATE 'DROP USER '||destinationSchema||' CASCADE';
      END IF;
    
      dbms_output.put_line('open datapump job over database link '||databaseLink);
      handle := dbms_datapump.open (
        operation =>    'IMPORT',
        job_mode =>     'SCHEMA',
        remote_link =>  databaseLink);
    
      IF debug = TRUE THEN
        dbms_output.put_line('set logfile parameter');
        SELECT 'datapump_'||to_char(SYSDATE,'YYYYMMDD')||'.log'
          INTO logfileName
          FROM dual;
        dbms_datapump.add_file (
          handle =>     handle,
          filename =>   logfileName,
          directory =>  'DATA_PUMP_DIR',
          filetype =>   dbms_datapump.ku$_file_type_log_file);
      END IF;
    
      dbms_output.put_line('set FLASHBACK_SCN parameter');
      dbms_datapump.set_parameter(
        handle =>   handle,
        name =>     'FLASHBACK_SCN',
        value =>    dbms_flashback.get_system_change_number);
    
      dbms_output.put_line('define SCHEMA_LIST');
      dbms_datapump.metadata_filter(
        handle =>   handle,
        name =>     'SCHEMA_LIST',
        value =>    ''''||sourceSchema||'''');
    
      dbms_output.put_line('define schema remapping');
      dbms_datapump.metadata_remap(
        handle =>     handle,
        name =>       'REMAP_SCHEMA',
        old_value =>  sourceSchema,
        value =>      destinationSchema);
    
      dbms_output.put_line('start datapump job');
      dbms_datapump.start_job (handle);
    
      dbms_output.put_line('wait for job to finish');
      dbms_datapump.wait_for_job(handle, jobState);
    
      dbms_output.put_line('set new password for destination schema '||destinationSchema);
      EXECUTE IMMEDIATE 'ALTER USER '||destinationSchema||' IDENTIFIED BY '||newPassword;
    EXCEPTION
    WHEN OTHERS THEN
      dbms_output.put_line('-------------------------------------------------------------------------------------');
      dbms_output.put_line('-- Error Backtrace ------------------------------------------------------------------');
      dbms_output.put_line(dbms_utility.format_error_backtrace());
      dbms_output.put_line('-- Call Stack -----------------------------------------------------------------------');
      dbms_output.put_line(dbms_utility.format_call_stack());
      dbms_output.put_line('-------------------------------------------------------------------------------------');
    
      dbms_datapump.stop_job(handle);
      RAISE;
    END;
    /
    

    When you run it, the output should look similar to the following.

    SQL> exec copyschemablog('ORIGINAL','COPY','DAR102B','pw', TRUE);
    verify whether destination schema COPY already exists
    drop user COPY
    open datapump job over database link DAR102B
    set logfile parameter
    set FLASHBACK_SCN parameter
    define SCHEMA_LIST
    define schema remapping
    start datapump job
    wait for job to finish
    set new password for destination schema COPY

    PL/SQL procedure successfully completed.

    The logfile is written to the DATA_PUMP_DIR. Mine had the following content.


    Starting "SYS"."SYS_IMPORT_SCHEMA_01":
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 64 KB
    Processing object type SCHEMA_EXPORT/USER
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    . . imported "COPY"."T" 3 rows
    Job "SYS"."SYS_IMPORT_SCHEMA_01" successfully completed at 16:05:01

    To use this procedure, you should install it in the SYS schema of your test database (I have tested it with 10.2.0.2, 10.2.0.4 and 11.1.0.7 on Linux). In addition, create a database link which connects to the SYSTEM user. You can also use the SID of your database as the database link parameter, because a hidden database link to the local database exists under the name of the SID. In a follow-up post I will show how dbms_datapump can be run with lower-privileged users (the documentation lacks some information in this area).
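    For example, a link to the SYSTEM user of the same database could be created like this. This is just a sketch of mine; the link name, the password and the connect string are placeholders.

```sql
CREATE DATABASE LINK selftest CONNECT TO system IDENTIFIED BY &pw USING 'orcl';

-- quick sanity check: the link should return the local instance
SELECT instance_name FROM v$instance@selftest;
```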

    When you do some tests with dbms_datapump, you might stumble upon any of the following errors:
    ORA-31623: a job is not attached to this session via the specified handle
    ORA-39001: invalid argument value
    ORA-39004: invalid state

    But what do they mean?
    ORA-31623 is raised when you try to stop a job which doesn’t exist. Usually the dbms_datapump.open function didn’t succeed, and a subsequent call that tries to use the handle raises this error.
    ORA-39001 is a generic error that is raised for almost any problem with dbms_datapump.
    ORA-39004 is raised when you try to run dbms_datapump through a database link which uses a connection qualifier (e.g. sid.domain.com@userxyz). After hours of testing I opened an SR, which ended in the enhancement request 811695.1 “DataPump Import Over a NETWORK_LINK Fails if the Database Link Contains Connection Qualifiers”.
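    To illustrate, a connection qualifier is a suffix appended to the link name with @ when the link is created. The following is only a sketch of mine with placeholder names; such a link works for ordinary queries, but passing it to dbms_datapump as remote_link runs into ORA-39004.

```sql
-- 'blog@copyjob' contains the connection qualifier 'copyjob'
CREATE DATABASE LINK blog@copyjob
  CONNECT TO source_user IDENTIFIED BY &pw
  USING 'dar112.ltdar.trivadis.com';
```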

    Now the question is: what can you do if ORA-39001: invalid argument value is raised? First run the procedure with debug => TRUE, which forces it to write a logfile. Then have a look at the logfile. If this doesn’t help you any further, you can fall back on SQL trace to gain some information. Depending on the problem, you might have to trace on the remote database. You can activate tracing, among other ways, with the following command

    ALTER SESSION SET EVENTS='10046 trace name context forever, level 12';
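    Don’t forget to switch the trace off again when you are done. On 10g and 11g the trace files end up in the directory pointed to by user_dump_dest.

```sql
ALTER SESSION SET EVENTS='10046 trace name context off';

-- where to look for the trace file
SELECT value FROM v$parameter WHERE name = 'user_dump_dest';
```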

    Afterwards search for the string err= in your trace file. Then run oerr with the number behind err= to find out what the error message stands for. For example, if you find err=942, run
    oerr ora 942
    00942, 00000, "table or view does not exist"
    // *Cause:
    // *Action:

    If no logfile is written at all, then usually the dbms_datapump.open procedure didn’t finish. In this case the database link is usually the problem. Verify your database link with SELECT * FROM v$instance@your_database_link.

    Now I wish you a lot of fun with dbms_datapump and it would be nice to read some comments about your experiences with it.

    Written by danirey

    August 13, 2009 at 07:50