Hi everyone, hope you’re doing good!

I’m a member of a WhatsApp Group in Brazil for DBAs (not only Oracle), and one of the members asked about how to export a large list of tables with DataPump.

This question may sound a bit basic, but it really isn’t.

When you don’t want to, or can’t, export the entire schema but only a list of tables, then depending on the number of characters, you may face some issues:

  • UDE-00014: invalid value for parameter, '<parameter_value>'
  • UDE-00019: 'tables' parameter list is too long

We also have the following Support Note about this:

4000 Bytes Character Limit in DataPump Parameter File (Doc ID 2275833.1)

If you try to export a large list of tables by passing them as the value of the TABLES parameter (or of INCLUDE, or even EXCLUDE), you are limited to 4000 characters. To illustrate with an example, let’s imagine that I’d like to export a list of tables owned by OWNER_TEST. What I need to do is connect to the database (as the user whose tables will be exported, or as a DBA user):
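To get a feel for how quickly a plain comma-separated list reaches that limit, here is a quick sketch (the table names and count are made up for illustration):

```shell
# Build a comma-separated TABLES list for 300 hypothetical tables
# (SALES_TABLE_001, SALES_TABLE_002, ...) and measure its length.
LIST=$(printf 'SALES_TABLE_%03d,' $(seq 1 300))
LIST=${LIST%,}   # drop the trailing comma

echo "parameter value length: ${#LIST} characters"
# prints: parameter value length: 4799 characters
# At ~16 characters per entry, 300 tables already exceed the
# 4000-character limit described in Doc ID 2275833.1.
```

With realistic table names of 20-30 characters, the limit is hit well before 200 tables.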

create table export_tables (table_name varchar2(128)) tablespace users;

Table created.

insert into export_tables values ('TABLE1');

1 row created.

insert into export_tables values ('TABLE2');

1 row created.


commit;

Commit complete.

select * from export_tables;

TABLE_NAME
--------------------------------
TABLE1
TABLE2
So, we need to create what we call the “work” table, insert into it the names of all the tables we would like to export, and then run DataPump. Let me show how the command looks on the database server (operating system: Linux, in my case):
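If the table list already lives in a plain text file (one name per line), a small shell sketch can generate the INSERT statements for the work table. The file names here (tables.txt, load_export_tables.sql) are hypothetical:

```shell
# tables.txt is a hypothetical input file: one table name per line.
printf 'TABLE1\nTABLE2\nTABLE3\n' > tables.txt   # sample input

# Turn each name into an INSERT against the work table,
# and finish the script with a commit.
while read -r t; do
  echo "insert into export_tables values ('${t}');"
done < tables.txt > load_export_tables.sql
echo "commit;" >> load_export_tables.sql

cat load_export_tables.sql
```

The generated load_export_tables.sql can then be run in SQL*Plus to populate the work table in one step, instead of typing hundreds of INSERTs by hand.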

expdp owner_test/mypassword directory=export dumpfile=exp_test.dmp logfile=exp_test.log INCLUDE=TABLE:\"IN (SELECT table_name FROM owner_test.export_tables)\" SCHEMAS=OWNER_TEST

Export: Release – Production on Mon May 27 21:41:22 2024
Copyright (c) 1982, 2019, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 19c Enterprise Edition Release – Production
FLASHBACK automatically enabled to preserve database integrity.
Starting "OWNER_TEST"."SYS_EXPORT_SCHEMA_01": "/ AS SYSDBA" directory=extract dumpfile=exp_test.dmp logfile=exp_test.log INCLUDE=TABLE:"IN (SELECT table_name FROM owner_test.export_tables)" SCHEMAS=OWNER_TEST
Processing object type SCHEMA_EXPORT/TABLE/TABLE
. . exported "OWNER_TEST"."TABLE1"    544.7 KB   36745 rows
. . exported "OWNER_TEST"."TABLE2"    184.7 KB   11927 rows
ORA-39173: Encrypted data has been stored unencrypted in dump file set.
Master table "SYS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
Dump file set for SYS.SYS_EXPORT_SCHEMA_01 is:
Job "SYS"."SYS_EXPORT_SCHEMA_01" successfully completed at Mon May 27 21:41:37 2024 elapsed 0 00:00:14

Of course, in my example I used only two tables, but in a real-world scenario you could insert hundreds or thousands of table names.

So, in my example, the INCLUDE parameter is defined as:

INCLUDE=TABLE:"IN (SELECT table_name FROM owner_test.export_tables)"

If we would like to export only tables whose names end with the string ‘BKP’:

INCLUDE=TABLE:"LIKE '%BKP'"
You noticed that this is a powerful way to do it, right? You can, for example, select only the tables that match a specific creation date (the CREATED column of the DBA_OBJECTS view).
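As a sketch of that idea, the filter could go into a DataPump parameter file, which also avoids the shell escaping needed on the command line. The file name, schema, and cutoff date below are assumptions for illustration (and the exporting user needs privileges to query DBA_OBJECTS):

```
# exp_created.par -- hypothetical parameter file
directory=export
dumpfile=exp_created.dmp
logfile=exp_created.log
schemas=OWNER_TEST
include=TABLE:"IN (SELECT object_name FROM dba_objects WHERE owner = 'OWNER_TEST' AND object_type = 'TABLE' AND created >= DATE '2024-01-01')"
```

You would then run it as: expdp owner_test/mypassword parfile=exp_created.par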

There is a really useful Support Note with lots of examples:

Export/Import DataPump Parameters INCLUDE and EXCLUDE – How to Load and Unload Specific Objects (Doc ID 341733.1)

Hope this helps!