Oracle® Database Utilities 11g Release 2 (11.2), Part Number E10701-02
If you use original Export (exp) and Import (imp), then you may have scripts you have been using for many years. To ease the transition to the newer Data Pump Export and Import utilities, Data Pump provides a legacy mode which allows you to continue to use your existing scripts with Data Pump.
Data Pump enters legacy mode once it determines a parameter unique to original Export or Import is present, either on the command line or in a script. As Data Pump processes the parameter, the analogous Data Pump Export or Data Pump Import parameter is displayed. Oracle strongly recommends that you view the new syntax and make script changes as time permits.
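For example, a legacy-style invocation such as the following hypothetical command puts Data Pump Export into legacy mode; per the mappings described in Table 4-1, FILE is handled like DUMPFILE, LOG like LOGFILE, and CONSISTENT=y is remapped to FLASHBACK_TIME:
expdp hr FILE=hr.dmp LOG=hr.log CONSISTENT=y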
Note:
Data Pump Export and Import only handle dump files and log files in the Data Pump format. They never create or read dump files compatible with original Export or Import. If you have a dump file created with original Export, then you must use original Import to import the data into the database.
This chapter contains the following sections:
Parameter Mappings
Management of File Locations in Data Pump Legacy Mode
Adjusting Existing Scripts
This section describes how original Export and Import parameters map to the Data Pump Export and Import parameters that supply similar functionality.
Data Pump Export accepts original Export parameters when they map to a corresponding Data Pump parameter. Table 4-1 describes how Data Pump Export interprets original Export parameters. Parameters that have the same name and functionality in both original Export and Data Pump Export are not included in this table.
Table 4-1 How Data Pump Export Handles Original Export Parameters
| Original Export Parameter | Action Taken by Data Pump Export Parameter |
|---|---|
| BUFFER | This parameter is ignored because Data Pump does not make use of conventional mode. |
| COMPRESS | This parameter is ignored. In original Export, the COMPRESS parameter affected how the initial extent was managed. The Data Pump Export COMPRESSION parameter is used to specify how data is compressed in the dump file, and is not related to the original Export COMPRESS parameter. |
| CONSISTENT | Data Pump Export determines the current time and uses FLASHBACK_TIME. |
| CONSTRAINTS | If original Export used CONSTRAINTS=n, then Data Pump Export uses the EXCLUDE=CONSTRAINT parameter. The default behavior is to include constraints as part of the export. |
| DIRECT | This parameter is ignored. Data Pump Export automatically chooses the best export method. |
| FEEDBACK | The Data Pump Export STATUS=30 parameter is used. In original Export, feedback was given after a certain number of rows, as specified with the FEEDBACK parameter. In Data Pump Export, the status is given every so many seconds, as specified by STATUS. |
| FILE | Data Pump Export attempts to determine the path that was specified or defaulted to for the FILE parameter, and also to determine whether a directory object exists to which the schema has read and write access. See Management of File Locations in Data Pump Legacy Mode for more information about how Data Pump handles the original Export FILE parameter. |
| GRANTS | If original Export used GRANTS=n, then Data Pump Export uses the EXCLUDE=GRANT parameter. If original Export used GRANTS=y, then the parameter is ignored and does not need to be remapped because that is the Data Pump Export default behavior. |
| INDEXES | If original Export used INDEXES=n, then Data Pump Export uses the EXCLUDE=INDEX parameter. If original Export used INDEXES=y, then the parameter is ignored and does not need to be remapped because that is the Data Pump Export default behavior. |
| LOG | Data Pump Export attempts to determine the path that was specified or defaulted to for the LOG parameter, and also to determine whether a directory object exists to which the schema has read and write access. See Management of File Locations in Data Pump Legacy Mode for more information about how Data Pump handles the original Export LOG parameter. The contents of the log file will be those of a Data Pump Export operation. See Log Files for information about log file location and content. |
| OBJECT_CONSISTENT | This parameter is ignored because Data Pump Export processing ensures that each object is in a consistent state when being exported. |
| OWNER | The Data Pump SCHEMAS parameter is used. |
| RECORDLENGTH | This parameter is ignored because Data Pump Export automatically takes care of buffer sizing. |
| RESUMABLE | This parameter is ignored because Data Pump Export automatically provides this functionality to users who have been granted the EXP_FULL_DATABASE role. |
| RESUMABLE_NAME | This parameter is ignored because Data Pump Export automatically provides this functionality to users who have been granted the EXP_FULL_DATABASE role. |
| RESUMABLE_TIMEOUT | This parameter is ignored because Data Pump Export automatically provides this functionality to users who have been granted the EXP_FULL_DATABASE role. |
| ROWS | If original Export used ROWS=y, then Data Pump Export uses the CONTENT=ALL parameter. If original Export used ROWS=n, then Data Pump Export uses the CONTENT=METADATA_ONLY parameter. |
| STATISTICS | This parameter is ignored because statistics are always saved for tables as part of a Data Pump export operation. |
| TABLESPACES | If original Export also specified TRANSPORT_TABLESPACE=n, then Data Pump Export ignores the TABLESPACES parameter. If original Export also specified TRANSPORT_TABLESPACE=y, then Data Pump Export takes the names listed for the TABLESPACES parameter and uses them on the Data Pump Export TRANSPORT_TABLESPACES parameter. |
| TRANSPORT_TABLESPACE | If original Export used TRANSPORT_TABLESPACE=n (the default), then Data Pump Export uses the TABLESPACES parameter. If original Export used TRANSPORT_TABLESPACE=y, then Data Pump Export uses the TRANSPORT_TABLESPACES parameter and only the metadata is exported. |
| TRIGGERS | If original Export used TRIGGERS=n, then Data Pump Export uses the EXCLUDE=TRIGGER parameter. If original Export used TRIGGERS=y, then the parameter is ignored and does not need to be remapped because that is the Data Pump Export default behavior. |
| TTS_FULL_CHECK | If original Export used TTS_FULL_CHECK=y, then Data Pump Export uses the TRANSPORT_FULL_CHECK parameter. If original Export used TTS_FULL_CHECK=n, then the parameter is ignored and does not need to be remapped because that is the Data Pump Export default behavior. |
| VOLSIZE | When the original Export VOLSIZE parameter is used, it means the location specified for the dump file is a tape device. The Data Pump Export dump file format does not support tape devices, so this operation terminates with an error. |
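As an illustration of these mappings, the following hypothetical legacy-mode invocation and its native Data Pump Export equivalent request the same operation (the schema and file names are assumptions; file location is resolved as described in Management of File Locations in Data Pump Legacy Mode):
expdp hr FILE=hr.dmp GRANTS=n ROWS=n
expdp hr DUMPFILE=hr.dmp EXCLUDE=GRANT CONTENT=METADATA_ONLY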
Data Pump Import accepts original Import parameters when they map to a corresponding Data Pump parameter. Table 4-2 describes how Data Pump Import interprets original Import parameters. Parameters that have the same name and functionality in both original Import and Data Pump Import are not included in this table.
Table 4-2 How Data Pump Import Handles Original Import Parameters
| Original Import Parameter | Action Taken by Data Pump Import Parameter |
|---|---|
| BUFFER | This parameter is ignored because Data Pump Import does not make use of conventional path mode. |
| CHARSET | This parameter was desupported several releases ago and should no longer be used. It will cause the Data Pump Import operation to abort. |
| COMMIT | This parameter is ignored. Data Pump Import automatically performs a commit after each table is processed. |
| COMPILE | This parameter is ignored. Data Pump Import compiles procedures after they are created. A recompile can be executed if necessary for dependency reasons. |
| CONSTRAINTS | If original Import used CONSTRAINTS=n, then Data Pump Import uses the EXCLUDE=CONSTRAINT parameter. If original Import used CONSTRAINTS=y, then the parameter is ignored and does not need to be remapped because that is the Data Pump Import default behavior. |
| DATAFILES | The Data Pump Import TRANSPORT_DATAFILES parameter is used. |
| DESTROY | If original Import used DESTROY=y, then Data Pump Import uses the REUSE_DATAFILES=y parameter. If original Import used DESTROY=n, then the parameter is ignored and does not need to be remapped because that is the Data Pump Import default behavior. |
| FEEDBACK | The Data Pump Import STATUS=30 parameter is used. In original Import, feedback was given after a certain number of rows, as specified with the FEEDBACK parameter. In Data Pump Import, the status is given every so many seconds, as specified by STATUS. |
| FILE | Data Pump Import attempts to determine the path that was specified or defaulted to for the FILE parameter, and also to determine whether a directory object exists to which the schema has read and write access. See Management of File Locations in Data Pump Legacy Mode for more information about how Data Pump handles the original Import FILE parameter. |
| FILESIZE | This parameter is ignored because the information is already contained in the Data Pump dump file set. |
| FROMUSER | The Data Pump Import SCHEMAS parameter is used. |
| GRANTS | If original Import used GRANTS=n, then Data Pump Import uses the EXCLUDE=OBJECT_GRANT parameter. If original Import used GRANTS=y, then the parameter is ignored and does not need to be remapped because that is the Data Pump Import default behavior. |
| IGNORE | If original Import used IGNORE=y, then Data Pump Import uses the TABLE_EXISTS_ACTION=APPEND parameter, which allows the processing of table data to continue. If original Import used IGNORE=n, then the parameter is ignored and does not need to be remapped because that is the Data Pump Import default behavior. |
| INDEXES | If original Import used INDEXES=n, then Data Pump Import uses the EXCLUDE=INDEX parameter. If original Import used INDEXES=y, then the parameter is ignored and does not need to be remapped because that is the Data Pump Import default behavior. |
| INDEXFILE | The Data Pump Import SQLFILE=[directory_object:]file_name and INCLUDE=INDEX parameters are used. The same method and attempts made when looking for a directory object described for the FILE parameter also take place for the INDEXFILE parameter. If no directory object was specified on the original Import, then Data Pump Import uses the directory object specified with the DIRECTORY parameter. |
| LOG | Data Pump Import attempts to determine the path that was specified or defaulted to for the LOG parameter, and also to determine whether a directory object exists to which the schema has read and write access. See Management of File Locations in Data Pump Legacy Mode for more information about how Data Pump handles the original Import LOG parameter. The contents of the log file will be those of a Data Pump Import operation. See Log Files for information about log file location and content. |
| RECORDLENGTH | This parameter is ignored because Data Pump handles issues about record length internally. |
| RESUMABLE | This parameter is ignored because this functionality is automatically provided for users who have been granted the IMP_FULL_DATABASE role. |
| RESUMABLE_NAME | This parameter is ignored because this functionality is automatically provided for users who have been granted the IMP_FULL_DATABASE role. |
| RESUMABLE_TIMEOUT | This parameter is ignored because this functionality is automatically provided for users who have been granted the IMP_FULL_DATABASE role. |
| ROWS | If original Import used ROWS=n, then Data Pump Import uses the CONTENT=METADATA_ONLY parameter. If original Import used ROWS=y, then Data Pump Import uses the CONTENT=ALL parameter. |
| SHOW | If SHOW=y is specified, then the Data Pump Import SQLFILE parameter is used to write the DDL for the import operation to a file, instead of displaying it on the screen. The name of the file will be the file name specified on the DUMPFILE parameter, with an extension of .sql. |
| STATISTICS | This parameter is ignored because statistics are always saved for tables as part of a Data Pump Import operation. |
| STREAMS_CONFIGURATION | This parameter is ignored because Data Pump Import automatically determines it; it does not need to be specified. |
| STREAMS_INSTANTIATION | This parameter is ignored because Data Pump Import automatically determines it; it does not need to be specified. |
| TABLESPACES | If original Import also specified TRANSPORT_TABLESPACE=n (the default), then Data Pump Import ignores the TABLESPACES parameter. If original Import also specified TRANSPORT_TABLESPACE=y, then Data Pump Import takes the names supplied for the TABLESPACES parameter and applies them to the Data Pump Import TRANSPORT_TABLESPACES parameter. |
| TOID_NOVALIDATE | This parameter is ignored. OIDs are no longer used for type validation. |
| TOUSER | The Data Pump Import REMAP_SCHEMA parameter is used. The FROMUSER parameter must also have been specified. |
| TRANSPORT_TABLESPACE | If original Import used TRANSPORT_TABLESPACE=n (the default), then the parameter is ignored. If original Import used TRANSPORT_TABLESPACE=y, then the Data Pump Import TRANSPORT_DATAFILES parameter is used and only the metadata is imported. |
| TTS_OWNERS | This parameter is ignored because this information is automatically stored in the Data Pump dump file set. |
| VOLSIZE | When the original Import VOLSIZE parameter is used, it means the location specified for the dump file is a tape device. The Data Pump Import dump file format does not support tape devices, so this operation terminates with an error. |
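Similarly, the following hypothetical legacy-mode invocation and its native Data Pump Import equivalent request the same operation (the schema and file names are assumptions):
impdp hr FILE=hr.dmp IGNORE=y INDEXES=n
impdp hr DUMPFILE=hr.dmp TABLE_EXISTS_ACTION=APPEND EXCLUDE=INDEX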
Original Export and Import and Data Pump Export and Import differ on where dump files and log files can be written to and read from because the original version is client-based and Data Pump is server-based.
Original Export and Import use the FILE and LOG parameters to specify dump file and log file names, respectively. These file names always refer to files local to the client system, and they may also contain a path specification.
Data Pump Export and Import use the DUMPFILE and LOGFILE parameters to specify dump file and log file names, respectively. These file names always refer to files local to the server system and cannot contain any path information. Instead, a directory object is used to indirectly specify path information. The path value defined by the directory object must be accessible to the server. The directory object is specified for a Data Pump job through the DIRECTORY parameter. It is also possible to prepend a directory object to the file names passed to the DUMPFILE and LOGFILE parameters. For privileged users, Data Pump supports the use of a default directory object if one is not specified on the command line. This default directory object, DATA_PUMP_DIR, is set up at installation time.
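To illustrate the contrast (the schema, file names, and the dpump_dir directory object below are assumptions used only for this sketch): an original Export client writes its files by client-side path, whereas the equivalent Data Pump Export names only the files and resolves their location through a directory object:
exp hr FILE=/home/hr/hrdata.dmp LOG=/home/hr/hrdata.log
expdp hr DUMPFILE=hrdata.dmp LOGFILE=hrdata.log DIRECTORY=dpump_dir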
If Data Pump legacy mode is enabled and the original Export FILE=filespec parameter and/or LOG=filespec parameter are present on the command line, then the following rules of precedence are used to determine a file's location:
Note:
If the FILE parameter and LOG parameter are both present on the command line, then the rules of precedence are applied separately to each parameter.
Also, when a mix of original Export/Import and Data Pump Export/Import parameters are used, separate rules apply to them. For example, suppose you have the following command:
expdp system FILE=/user/disk/foo.dmp LOGFILE=foo.log DIRECTORY=dpump_dir
The Data Pump legacy mode file management rules, as explained in this section, would apply to the FILE parameter. The normal (that is, non-legacy mode) Data Pump file management rules, as described in Default Locations for Dump, Log, and SQL Files, would apply to the LOGFILE parameter.
1. If a path location is specified as part of the file specification, then Data Pump attempts to look for a directory object accessible to the schema executing the export job whose path location matches the path location of the file specification. If such a directory object cannot be found, then an error is returned. For example, assume that a server-based directory object named USER_DUMP_FILES has been defined with a path value of '/disk1/user1/dumpfiles/' and that read and write access to this directory object has been granted to the hr schema; a sketch of how such a directory object might be created follows this example. The following command causes Data Pump to look for a server-based directory object whose path value contains '/disk1/user1/dumpfiles/' and to which the hr schema has been granted read and write access:
expdp hr FILE=/disk1/user1/dumpfiles/hrdata.dmp
In this case, Data Pump uses the directory object USER_DUMP_FILES. The path value, in this example '/disk1/user1/dumpfiles/', must refer to a path on the server system that is accessible to the Oracle Database.
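The directory object assumed in this example could have been created and granted as follows. This is a minimal SQL sketch, assuming a suitably privileged user and an existing server path; it is not a step prescribed by this manual:
CREATE DIRECTORY user_dump_files AS '/disk1/user1/dumpfiles/';
GRANT READ, WRITE ON DIRECTORY user_dump_files TO hr;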
2. If a path location is specified as part of the file specification, then any directory object provided using the DIRECTORY parameter is ignored. For example, if the following command is issued, then Data Pump does not use the DPUMP_DIR directory object for the FILE parameter, but instead looks for a server-based directory object whose path value contains '/disk1/user1/dumpfiles/' and to which the hr schema has been granted read and write access:
expdp hr FILE=/disk1/user1/dumpfiles/hrdata.dmp DIRECTORY=dpump_dir
3. If no path location is specified as part of the file specification, then the directory object named by the DIRECTORY parameter is used. For example, if the following command is issued, then Data Pump applies the path location defined for the DPUMP_DIR directory object to the hrdata.dmp file:
expdp hr FILE=hrdata.dmp DIRECTORY=dpump_dir
4. If no path location is specified as part of the file specification and no directory object is named by the DIRECTORY parameter, then Data Pump does the following, in the order shown:
a. Data Pump looks for the existence of a directory object of the form DATA_PUMP_DIR_schema_name, where schema_name is the schema that is executing the Data Pump job. For example, the following command would cause Data Pump to look for the existence of a server-based directory object named DATA_PUMP_DIR_HR:
expdp hr FILE=hrdata.dmp
The hr schema also must have been granted read and write access to this directory object. If such a directory object does not exist, then the process moves to step b.
b. Data Pump looks for the existence of the client-based environment variable DATA_PUMP_DIR. For instance, assume that a server-based directory object named DUMP_FILES1 has been defined and the hr schema has been granted read and write access to it. Then on the client system, the environment variable DATA_PUMP_DIR can be set to point to DUMP_FILES1 as follows:
setenv DATA_PUMP_DIR DUMP_FILES1
expdp hr FILE=hrdata.dmp
Data Pump then uses the server-based directory object DUMP_FILES1 for the hrdata.dmp file. If the client-based environment variable DATA_PUMP_DIR does not exist, then the process moves to step c.
c. If the schema that is executing the Data Pump job has DBA privileges, then the default Data Pump directory object, DATA_PUMP_DIR, is used. This default directory object is established at installation time. For example, the following command causes Data Pump to attempt to use the default DATA_PUMP_DIR directory object, assuming that the system user has DBA privileges:
expdp system FILE=hrdata.dmp
See Also:
Default Locations for Dump, Log, and SQL Files for information about Data Pump file management rules of precedence under normal Data Pump conditions (that is, non-legacy mode)

Data Pump legacy mode requires that you review and update your existing scripts written for original Export and Import because of differences in file format and error reporting.
Data Pump Export and Import do not generate log files in the same format as those created by original Export and Import. Any scripts that parse the output of original Export and Import must be updated to handle the log file format used by Data Pump Export and Import. For example, the message Successfully Terminated does not appear in Data Pump log files.
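As a minimal sketch of the kind of change required, assuming a shell script that tested an original Export log for that message (the log file name here is hypothetical), the legacy check:
grep "Successfully Terminated" export.log
would instead need to test for the Data Pump job completion message, which has a form like Job "HR"."SYS_EXPORT_SCHEMA_01" successfully completed:
grep "successfully completed" export.log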
Data Pump Export and Import may not produce the same errors as those generated by original Export and Import. For example, if a parameter that is ignored by Data Pump Export would have had an out-of-range value in original Export, then an informational message is written to the log file stating that the parameter is being ignored. No value checking is performed; therefore, no error message is generated.