Dani's IT Blog

…moving from Oracle to Software Development

The gains and pains of dbms_datapump. Part III

with 3 comments

Imran Khan, a visitor of my blog, asked me how to export a schema using dbms_datapump. While preparing a little example for him, I once again came across an ugly detail in dbms_datapump which I didn't mention in my earlier posts.

You might receive the following error message when you execute your code the second time.

ERROR at line 1:
ORA-39001: invalid argument value
ORA-06512: at line 42

If it happens on the second run, it is most likely your DBMS_DATAPUMP.ADD_FILE call that fails.

But what's the reason? dbms_datapump fails to add the file because it already exists. Unfortunately, the generic "ORA-39001: invalid argument value" doesn't help much in finding the problem. It makes sense that Oracle doesn't just overwrite an existing file, but they could at least raise an error message that points towards the actual cause.

So it is the ADD_FILE call registering the dump file that causes the problem. To solve it, simply delete the file which should be used as the destination for the export. In my case it's the file hr.dmp in the DATA_PUMP_DIR directory.
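A minimal sketch of such a schema export, assuming the HR schema and the DATA_PUMP_DIR directory object; the ADD_FILE call for hr.dmp is the one that raises ORA-39001 on a second run:

DECLARE
  handle NUMBER;
BEGIN
  -- create a schema-mode export job
  handle := DBMS_DATAPUMP.OPEN(
    operation => 'EXPORT',
    job_mode  => 'SCHEMA');

  -- this call fails with ORA-39001 if hr.dmp already exists
  DBMS_DATAPUMP.ADD_FILE(
    handle    => handle,
    filename  => 'hr.dmp',
    directory => 'DATA_PUMP_DIR',
    filetype  => dbms_datapump.ku$_file_type_dump_file);

  -- restrict the export to the HR schema
  DBMS_DATAPUMP.METADATA_FILTER(
    handle => handle,
    name   => 'SCHEMA_EXPR',
    value  => 'IN (''HR'')');

  DBMS_DATAPUMP.START_JOB(handle);
  DBMS_DATAPUMP.DETACH(handle);
END;
/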


If you write a logfile, and add it before the destination file, the problem is quite easy to find. The logfile will contain the following lines.

ORA-39000: bad dump file specification
ORA-31641: unable to create dump file "/u00/app/oracle/admin/DAR112/dmp/hr.dmp"
ORA-27038: created file already exists
Additional information: 1
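If you have no OS access to the server, you can also remove the stale dump file from SQL with UTL_FILE.FREMOVE; a small sketch, assuming you have write privileges on the directory object:

BEGIN
  -- remove the leftover dump file so ADD_FILE can recreate it
  UTL_FILE.FREMOVE('DATA_PUMP_DIR', 'hr.dmp');
END;
/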

To write a logfile, you can add the following lines to your script, directly after the job creation.

SELECT 'datapump_'||to_char(SYSDATE,'YYYYMMDD')||'.log'
  INTO logfileName
  FROM dual;

DBMS_DATAPUMP.ADD_FILE(
  handle =>     handle,
  filename =>   logfileName,
  directory =>  'DATA_PUMP_DIR',
  filetype =>   dbms_datapump.ku$_file_type_log_file);

If you're interested in more details about dbms_datapump, have a look at my other posts about dbms_datapump.


Written by danirey

June 28, 2010 at 16:57

3 Responses


  1. […] 2. The gains and pains of dbms_datapump.  […]

  2. Hey, thanks for great tutorials! It’s really hard to find constructive and well structured information on this topic, and these posts seem exactly like that. It would be great to see a tutorial on how to import from existing dump file filtering by schema and table names to specified schema (my case).


    March 13, 2014 at 13:46

    • When I have a dump file, I usually just use impdp instead of dbms_datapump.
      If you have a dump file, you most probably copy it around with a shell script or trigger the import manually. In this case it is easier to use impdp.
      I think dbms_datapump is most interesting if you use it over a database link and don't involve anything outside the two databases.
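      For the commenter's case, an impdp sketch along these lines could work; the table names and the target schema HR_COPY are placeholders, not tested values:

      impdp system DIRECTORY=DATA_PUMP_DIR DUMPFILE=hr.dmp
            TABLES=HR.EMPLOYEES,HR.DEPARTMENTS
            REMAP_SCHEMA=HR:HR_COPY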


      March 13, 2014 at 14:02
