Dani's IT Blog

…moving from Oracle to Software Development

The gains and pains of dbms_datapump. Part III


Imran Khan, a visitor of my blog, asked me how to export a schema using dbms_datapump. While preparing a little example for him, I once again came across an ugly detail in dbms_datapump which I didn’t mention in my earlier posts.

You might receive the following error message when you execute your code a second time.

ERROR at line 1:
ORA-39001: invalid argument value
ORA-06512: at line 42

If it happens the second time you execute your code, it is probably your DBMS_DATAPUMP.ADD_FILE call that fails.

But what’s the reason? dbms_datapump fails to add the file because it already exists. Unfortunately, the error message “ORA-39001: invalid argument value” doesn’t help much in finding the problem. It makes sense that Oracle doesn’t simply overwrite an existing file, but they could at least raise an error message that points toward the problem.

So it is the DBMS_DATAPUMP.ADD_FILE call for the dump file that causes the problem. To solve it, simply delete the file which should be used as the destination for the export. In my case it’s the file hr.dmp in the DATA_PUMP_DIR.
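For reference, a minimal schema export along these lines could look as follows. This is a sketch for illustration, not the exact code from the original post; it assumes the hr.dmp file and DATA_PUMP_DIR directory mentioned above, and it exports the HR schema:

```sql
DECLARE
  handle   NUMBER;
  jobState user_datapump_jobs.state%TYPE;
BEGIN
  handle := dbms_datapump.open (
    operation => 'EXPORT',
    job_mode =>  'SCHEMA');

  -- this call raises ORA-39001 if hr.dmp already exists in DATA_PUMP_DIR
  dbms_datapump.add_file (
    handle =>    handle,
    filename =>  'hr.dmp',
    directory => 'DATA_PUMP_DIR',
    filetype =>  dbms_datapump.ku$_file_type_dump_file);

  -- restrict the export to the HR schema
  dbms_datapump.metadata_filter (
    handle => handle,
    name =>   'SCHEMA_LIST',
    value =>  '''HR''');

  dbms_datapump.start_job (handle);
  dbms_datapump.wait_for_job (handle, jobState);
END;
/
```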


If you write a logfile and add it before the destination file, the problem is quite easy to find. The logfile will contain the following lines.

ORA-39000: bad dump file specification
ORA-31641: unable to create dump file "/u00/app/oracle/admin/DAR112/dmp/hr.dmp"
ORA-27038: created file already exists
Additional information: 1

To write a logfile, add the following lines to your script, directly after the job creation.

SELECT 'datapump_'||to_char(SYSDATE,'YYYYMMDD')||'.log'
  INTO logfileName
  FROM dual;

dbms_datapump.add_file (
  handle =>     handle,
  filename =>   logfileName,
  directory =>  'DATA_PUMP_DIR',
  filetype =>   dbms_datapump.ku$_file_type_log_file);
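If you prefer to clean up from within PL/SQL instead of deleting the stale dump file by hand, the existing file can be removed before the ADD_FILE call. This is a sketch; it assumes the hr.dmp file name and DATA_PUMP_DIR directory from the example above:

```sql
BEGIN
  -- remove a leftover dump file so that dbms_datapump.add_file can recreate it
  utl_file.fremove (
    location => 'DATA_PUMP_DIR',
    filename => 'hr.dmp');
EXCEPTION
  WHEN OTHERS THEN
    NULL;  -- ignore the error raised when the file does not exist
END;
/
```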

If you’re interested in more details about dbms_datapump, have a look at my other posts about dbms_datapump.


Written by danirey

June 28, 2010 at 16:57

The gains and pains of dbms_datapump. Part I


I personally have a love/hate relationship with dbms_datapump. On one hand it is a very powerful tool; on the other hand, its documentation and error messages are not very helpful. Because of this, I would like to show you some of my findings related to dbms_datapump. Let’s start with some code which everyone can use to test what I show here.

The following code copies a schema over a database link from one database to another. You can also use the same database as source and target if you want to. I tried to keep the code as simple as possible, so pre- and post-checks are missing. The code is based on the one from Chris Poole as well as some ideas from Andrew Fraser and an Oracle Support guy called Dan. Many thanks to all of them.

CREATE OR REPLACE PROCEDURE copyschemablog (
  sourceSchema      IN VARCHAR2,
  destinationSchema IN VARCHAR2,
  databaseLink      IN VARCHAR2,
  newPassword       IN VARCHAR2,
  debug             IN BOOLEAN := FALSE)
AS
  handle                  NUMBER;
  message                 VARCHAR2(2000);
  flashbackSCN            NUMBER;
  logfileName             VARCHAR2(200);
  destinationSchemaExists NUMBER;
  jobState                user_datapump_jobs.state%TYPE;
BEGIN
  dbms_output.put_line('verify whether destination schema '||destinationSchema||' already exists');
  SELECT COUNT(username)
    INTO destinationSchemaExists
    FROM dba_users
   WHERE username = destinationSchema;

  IF ( destinationSchemaExists > 0 ) THEN
    dbms_output.put_line('drop user '||destinationSchema);
    EXECUTE IMMEDIATE 'DROP USER '||destinationSchema||' CASCADE';
  END IF;

  dbms_output.put_line('open datapump job over database link '||databaseLink);
  handle := dbms_datapump.open (
    operation =>    'IMPORT',
    job_mode =>     'SCHEMA',
    remote_link =>  databaseLink);

  IF debug = TRUE THEN
    dbms_output.put_line('set logfile parameter');
    SELECT 'datapump_'||to_char(SYSDATE,'YYYYMMDD')||'.log'
      INTO logfileName
      FROM dual;
    dbms_datapump.add_file (
      handle =>     handle,
      filename =>   logfileName,
      directory =>  'DATA_PUMP_DIR',
      filetype =>   dbms_datapump.ku$_file_type_log_file);
  END IF;

  dbms_output.put_line('set FLASHBACK_SCN parameter');
  dbms_datapump.set_parameter (
    handle =>   handle,
    name =>     'FLASHBACK_SCN',
    value =>    dbms_flashback.get_system_change_number);

  dbms_output.put_line('define SCHEMA_LIST');
  dbms_datapump.metadata_filter (
    handle =>   handle,
    name =>     'SCHEMA_LIST',
    value =>    ''''||sourceSchema||'''');

  dbms_output.put_line('define schema remapping');
  dbms_datapump.metadata_remap (
    handle =>     handle,
    name =>       'REMAP_SCHEMA',
    old_value =>  sourceSchema,
    value =>      destinationSchema);

  dbms_output.put_line('start datapump job');
  dbms_datapump.start_job (handle);

  dbms_output.put_line('wait for job to finish');
  dbms_datapump.wait_for_job(handle, jobState);

  dbms_output.put_line('set new password for destination schema '||destinationSchema);
  EXECUTE IMMEDIATE 'ALTER USER '||destinationSchema||' IDENTIFIED BY '||newPassword;
EXCEPTION
  WHEN OTHERS THEN
    dbms_output.put_line('-- Error Backtrace ------------------------------------------------------------------');
    dbms_output.put_line(dbms_utility.format_error_backtrace);
    dbms_output.put_line('-- Call Stack -----------------------------------------------------------------------');
    dbms_output.put_line(dbms_utility.format_call_stack);
    RAISE;
END;
/

When you run it, the output should look similar to the following.

SQL> exec copyschemablog('ORIGINAL','COPY','DAR102B','pw', TRUE);
verify whether destination schema COPY already exists
drop user COPY
open datapump job over database link DAR102B
set logfile parameter
set FLASHBACK_SCN parameter
define SCHEMA_LIST
define schema remapping
start datapump job
wait for job to finish
set new password for destination schema COPY

PL/SQL procedure successfully completed.

The logfile is written to the DATA_PUMP_DIR. Mine had the following content.

Starting "SYS"."SYS_IMPORT_SCHEMA_01":
Estimate in progress using BLOCKS method...
Total estimation using BLOCKS method: 64 KB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE
. . imported "COPY"."T" 3 rows
Job "SYS"."SYS_IMPORT_SCHEMA_01" successfully completed at 16:05:01

To use this procedure, you should install it in the SYS schema of your test database (I have tested it on Linux). In addition, create a database link which connects to the SYSTEM user. You can also use the SID of your database as the database link parameter, because a hidden database link to the database itself exists under the name of the SID. In a following post I will show how dbms_datapump can be run with lower-privileged users (the documentation lacks some information in this area).
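Such a database link can be created with something like the following sketch. The link name DAR102B matches the sample output above; the password and TNS alias are placeholders you have to adapt:

```sql
-- database link connecting to the SYSTEM user of the source database
CREATE DATABASE LINK DAR102B
  CONNECT TO system IDENTIFIED BY your_system_password
  USING 'DAR102B';
```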

When you do some tests with dbms_datapump, you might stumble upon any of the following errors:
ORA-31623: a job is not attached to this session via the specified handle
ORA-39001: invalid argument value
ORA-39004: invalid state

But what do they mean?
ORA-31623 is raised when you try to stop a job which doesn’t exist. Usually the dbms_datapump.open function didn’t succeed, and a subsequent procedure which tries to use the handle raises this error.
ORA-39001 is a generic error which is raised almost any time there is a problem with dbms_datapump.
ORA-39004 is raised when you try to run dbms_datapump through a database link which uses a connection qualifier (e.g. sid.domain.com@userxyz). After hours of testing I opened an SR, which ended in the enhancement request 811695.1 “DataPump Import Over a NETWORK_LINK Fails if the Database Link Contains Connection Qualifiers”.

Now the question is: what can you do if “ORA-39001: invalid argument value” is raised? First run the procedure with debug => TRUE; this forces it to write a logfile. Then have a look at the logfile. If this doesn’t help you any further, you can abuse tracing to gain some information. Depending on the problem, you might have to trace on the remote database. You can activate tracing, among other ways, with the following command.

ALTER SESSION SET EVENTS='10046 trace name context forever, level 12';
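To locate the resulting trace file afterwards, a query like the following can be used (a sketch; V$DIAG_INFO is available from 11g onwards):

```sql
-- shows the full path of the trace file for the current session
SELECT value
  FROM v$diag_info
 WHERE name = 'Default Trace File';
```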

Afterwards, search for the string err= in your trace file, then run oerr with the number behind err= to find out what the error message stands for. For example, if you find err=942, run
oerr ora 942
00942, 00000, "table or view does not exist"
// *Cause:
// *Action:

If no logfile is written at all, then the dbms_datapump.open procedure usually didn’t complete. In this case the database link is most often the problem. Verify your database link with SELECT * FROM v$instance@your_database_link.

Now I wish you a lot of fun with dbms_datapump, and it would be nice to read some comments about your experiences with it.

Written by danirey

August 13, 2009 at 07:50