Oracle® Database Utilities
11g Release 2 (11.2)
Part Number E10701-02
Contents
List of Examples
List of Figures
List of Tables
Title and Copyright Information
Preface
Audience
Documentation Accessibility
Related Documentation
Syntax Diagrams
Conventions
What's New in Database Utilities?
New Features in Oracle Database Utilities 11g Release 2
New Features in Oracle Database Utilities 11g Release 1
Part I Oracle Data Pump
1 Overview of Oracle Data Pump
Data Pump Components
How Does Data Pump Move Data?
Using Data File Copying to Move Data
Using Direct Path to Move Data
Using External Tables to Move Data
Using Conventional Path to Move Data
Using Network Link Import to Move Data
Required Roles for Data Pump Export and Import Operations
What Happens During Execution of a Data Pump Job?
Coordination of a Job
Tracking Progress Within a Job
Filtering Data and Metadata During a Job
Transforming Metadata During a Job
Maximizing Job Performance
Loading and Unloading of Data
Monitoring Job Status
Monitoring the Progress of Executing Jobs
File Allocation
Specifying Files and Adding Additional Dump Files
Default Locations for Dump, Log, and SQL Files
Oracle RAC Considerations
Using Directory Objects When Oracle Automatic Storage Management Is Enabled
Using Substitution Variables
Moving Data Between Different Database Versions
SecureFile LOB Considerations
2 Data Pump Export
What Is Data Pump Export?
Invoking Data Pump Export
Data Pump Export Interfaces
Data Pump Export Modes
Full Export Mode
Schema Mode
Table Mode
Tablespace Mode
Transportable Tablespace Mode
Network Considerations
Filtering During Export Operations
Data Filters
Metadata Filters
Parameters Available in Export's Command-Line Mode
ATTACH
CLUSTER
COMPRESSION
CONTENT
DATA_OPTIONS
DIRECTORY
DUMPFILE
ENCRYPTION
ENCRYPTION_ALGORITHM
ENCRYPTION_MODE
ENCRYPTION_PASSWORD
ESTIMATE
ESTIMATE_ONLY
EXCLUDE
FILESIZE
FLASHBACK_SCN
FLASHBACK_TIME
FULL
HELP
INCLUDE
JOB_NAME
LOGFILE
NETWORK_LINK
NOLOGFILE
PARALLEL
PARFILE
QUERY
REMAP_DATA
REUSE_DUMPFILES
SAMPLE
SCHEMAS
SERVICE_NAME
SOURCE_EDITION
STATUS
TABLES
TABLESPACES
TRANSPORT_FULL_CHECK
TRANSPORT_TABLESPACES
TRANSPORTABLE
VERSION
Commands Available in Export's Interactive-Command Mode
ADD_FILE
CONTINUE_CLIENT
EXIT_CLIENT
FILESIZE
HELP
KILL_JOB
PARALLEL
START_JOB
STATUS
STOP_JOB
Examples of Using Data Pump Export
Performing a Table-Mode Export
Data-Only Unload of Selected Tables and Rows
Estimating Disk Space Needed in a Table-Mode Export
Performing a Schema-Mode Export
Performing a Parallel Full Database Export
Using Interactive Mode to Stop and Reattach to a Job
Syntax Diagrams for Data Pump Export
3 Data Pump Import
What Is Data Pump Import?
Invoking Data Pump Import
Data Pump Import Interfaces
Data Pump Import Modes
Full Import Mode
Schema Mode
Table Mode
Tablespace Mode
Transportable Tablespace Mode
Network Considerations
Filtering During Import Operations
Data Filters
Metadata Filters
Parameters Available in Import's Command-Line Mode
ATTACH
CLUSTER
CONTENT
DATA_OPTIONS
DIRECTORY
DUMPFILE
ENCRYPTION_PASSWORD
ESTIMATE
EXCLUDE
FLASHBACK_SCN
FLASHBACK_TIME
FULL
HELP
INCLUDE
JOB_NAME
LOGFILE
NETWORK_LINK
NOLOGFILE
PARALLEL
PARFILE
PARTITION_OPTIONS
QUERY
REMAP_DATA
REMAP_DATAFILE
REMAP_SCHEMA
REMAP_TABLE
REMAP_TABLESPACE
REUSE_DATAFILES
SCHEMAS
SERVICE_NAME
SKIP_UNUSABLE_INDEXES
SOURCE_EDITION
SQLFILE
STATUS
STREAMS_CONFIGURATION
TABLE_EXISTS_ACTION
TABLES
TABLESPACES
TARGET_EDITION
TRANSFORM
TRANSPORT_DATAFILES
TRANSPORT_FULL_CHECK
TRANSPORT_TABLESPACES
TRANSPORTABLE
VERSION
Commands Available in Import's Interactive-Command Mode
CONTINUE_CLIENT
EXIT_CLIENT
HELP
KILL_JOB
PARALLEL
START_JOB
STATUS
STOP_JOB
Examples of Using Data Pump Import
Performing a Data-Only Table-Mode Import
Performing a Schema-Mode Import
Performing a Network-Mode Import
Syntax Diagrams for Data Pump Import
4 Data Pump Legacy Mode
Parameter Mappings
Using Original Export Parameters with Data Pump
Using Original Import Parameters with Data Pump
Management of File Locations in Data Pump Legacy Mode
Adjusting Existing Scripts for Data Pump Log Files and Errors
Log Files
Error Cases
Exit Status
5 Data Pump Performance
Data Performance Improvements for Data Pump Export and Import
Tuning Performance
Controlling Resource Consumption
Effects of Compression and Encryption on Performance
Initialization Parameters That Affect Data Pump Performance
Setting the Size Of the Buffer Cache In a Streams Environment
6 The Data Pump API
How Does the Client Interface to the Data Pump API Work?
Job States
What Are the Basic Steps in Using the Data Pump API?
Examples of Using the Data Pump API
Part II SQL*Loader
7 SQL*Loader Concepts
SQL*Loader Features
SQL*Loader Parameters
SQL*Loader Control File
Input Data and Datafiles
Fixed Record Format
Variable Record Format
Stream Record Format
Logical Records
Data Fields
LOBFILEs and Secondary Datafiles (SDFs)
Data Conversion and Datatype Specification
Discarded and Rejected Records
The Bad File
SQL*Loader Rejects
Oracle Database Rejects
The Discard File
Log File and Logging Information
Conventional Path Loads, Direct Path Loads, and External Table Loads
Conventional Path Loads
Direct Path Loads
Parallel Direct Path
External Table Loads
Choosing External Tables Versus SQL*Loader
Behavior Differences Between SQL*Loader and External Tables
Multiple Primary Input Datafiles
Syntax and Datatypes
Byte-Order Marks
Default Character Sets, Date Masks, and Decimal Separator
Use of the Backslash Escape Character
Loading Objects, Collections, and LOBs
Supported Object Types
column objects
row objects
Supported Collection Types
Nested Tables
VARRAYs
Supported LOB Types
Partitioned Object Support
Application Development: Direct Path Load API
SQL*Loader Case Studies
Case Study Files
Running the Case Studies
Case Study Log Files
Checking the Results of a Case Study
8 SQL*Loader Command-Line Reference
Invoking SQL*Loader
Alternative Ways to Specify Parameters
Command-Line Parameters
BAD (bad file)
BINDSIZE (maximum size)
COLUMNARRAYROWS
CONTROL (control file)
DATA (datafile)
DATE_CACHE
DIRECT (data path)
DISCARD (file name)
DISCARDMAX (integer)
ERRORS (errors to allow)
EXTERNAL_TABLE
Restrictions When Using EXTERNAL_TABLE
FILE (tablespace file to load into)
LOAD (number of records to load)
LOG (log file)
MULTITHREADING
NO_INDEX_ERRORS
PARALLEL (parallel load)
PARFILE (parameter file)
READSIZE (read buffer size)
RESUMABLE
RESUMABLE_NAME
RESUMABLE_TIMEOUT
ROWS (rows per commit)
SILENT (feedback mode)
SKIP (records to skip)
SKIP_INDEX_MAINTENANCE
SKIP_UNUSABLE_INDEXES
STREAMSIZE
USERID (username/password)
Exit Codes for Inspection and Display
9 SQL*Loader Control File Reference
Control File Contents
Comments in the Control File
Specifying Command-Line Parameters in the Control File
OPTIONS Clause
Specifying File Names and Object Names
File Names That Conflict with SQL and SQL*Loader Reserved Words
Specifying SQL Strings
Operating System Considerations
Specifying a Complete Path
Backslash Escape Character
Nonportable Strings
Using the Backslash as an Escape Character
Escape Character Is Sometimes Disallowed
Identifying XMLType Tables
Specifying Datafiles
Examples of INFILE Syntax
Specifying Multiple Datafiles
Identifying Data in the Control File with BEGINDATA
Specifying Datafile Format and Buffering
Specifying the Bad File
Examples of Specifying a Bad File Name
How Bad Files Are Handled with LOBFILEs and SDFs
Criteria for Rejected Records
Specifying the Discard File
Specifying the Discard File in the Control File
Specifying the Discard File from the Command Line
Examples of Specifying a Discard File Name
Criteria for Discarded Records
How Discard Files Are Handled with LOBFILEs and SDFs
Limiting the Number of Discarded Records
Handling Different Character Encoding Schemes
Multibyte (Asian) Character Sets
Unicode Character Sets
Database Character Sets
Datafile Character Sets
Input Character Conversion
Considerations When Loading Data into VARRAYs or Primary-Key-Based REFs
CHARACTERSET Parameter
Control File Character Set
Character-Length Semantics
Shift-sensitive Character Data
Interrupted Loads
Discontinued Conventional Path Loads
Discontinued Direct Path Loads
Load Discontinued Because of Space Errors
Load Discontinued Because Maximum Number of Errors Exceeded
Load Discontinued Because of Fatal Errors
Load Discontinued Because a Ctrl+C Was Issued
Status of Tables and Indexes After an Interrupted Load
Using the Log File to Determine Load Status
Continuing Single-Table Loads
Assembling Logical Records from Physical Records
Using CONCATENATE to Assemble Logical Records
Using CONTINUEIF to Assemble Logical Records
Loading Logical Records into Tables
Specifying Table Names
INTO TABLE Clause
Table-Specific Loading Method
Loading Data into Empty Tables
Loading Data into Nonempty Tables
Table-Specific OPTIONS Parameter
Loading Records Based on a Condition
Using the WHEN Clause with LOBFILEs and SDFs
Specifying Default Data Delimiters
fields_spec
termination_spec
enclosure_spec
Handling Short Records with Missing Data
TRAILING NULLCOLS Clause
Index Options
SORTED INDEXES Clause
SINGLEROW Option
Benefits of Using Multiple INTO TABLE Clauses
Extracting Multiple Logical Records
Relative Positioning Based on Delimiters
Distinguishing Different Input Record Formats
Relative Positioning Based on the POSITION Parameter
Distinguishing Different Input Row Object Subtypes
Loading Data into Multiple Tables
Summary
Bind Arrays and Conventional Path Loads
Size Requirements for Bind Arrays
Performance Implications of Bind Arrays
Specifying Number of Rows Versus Size of Bind Array
Calculations to Determine Bind Array Size
Determining the Size of the Length Indicator
Calculating the Size of Field Buffers
Minimizing Memory Requirements for Bind Arrays
Calculating Bind Array Size for Multiple INTO TABLE Clauses
10 SQL*Loader Field List Reference
Field List Contents
Specifying the Position of a Data Field
Using POSITION with Data Containing Tabs
Using POSITION with Multiple Table Loads
Examples of Using POSITION
Specifying Columns and Fields
Specifying Filler Fields
Specifying the Datatype of a Data Field
SQL*Loader Datatypes
Nonportable Datatypes
INTEGER(n)
SMALLINT
FLOAT
DOUBLE
BYTEINT
ZONED
DECIMAL
VARGRAPHIC
VARCHAR
VARRAW
LONG VARRAW
Portable Datatypes
CHAR
Datetime and Interval Datatypes
GRAPHIC
GRAPHIC EXTERNAL
Numeric EXTERNAL
RAW
VARCHARC
VARRAWC
Conflicting Native Datatype Field Lengths
Field Lengths for Length-Value Datatypes
Datatype Conversions
Datatype Conversions for Datetime and Interval Datatypes
Specifying Delimiters
Syntax for Termination and Enclosure Specification
Delimiter Marks in the Data
Maximum Length of Delimited Data
Loading Trailing Blanks with Delimiters
How Delimited Data Is Processed
Fields Using Only TERMINATED BY
Fields Using ENCLOSED BY Without TERMINATED BY
Fields Using ENCLOSED BY With TERMINATED BY
Fields Using OPTIONALLY ENCLOSED BY With TERMINATED BY
Conflicting Field Lengths for Character Datatypes
Predetermined Size Fields
Delimited Fields
Date Field Masks
Specifying Field Conditions
Comparing Fields to BLANKS
Comparing Fields to Literals
Using the WHEN, NULLIF, and DEFAULTIF Clauses
Examples of Using the WHEN, NULLIF, and DEFAULTIF Clauses
Loading Data Across Different Platforms
Byte Ordering
Specifying Byte Order
Using Byte Order Marks (BOMs)
Suppressing Checks for BOMs
Loading All-Blank Fields
Trimming Whitespace
Datatypes for Which Whitespace Can Be Trimmed
Specifying Field Length for Datatypes for Which Whitespace Can Be Trimmed
Predetermined Size Fields
Delimited Fields
Relative Positioning of Fields
No Start Position Specified for a Field
Previous Field Terminated by a Delimiter
Previous Field Has Both Enclosure and Termination Delimiters
Leading Whitespace
Previous Field Terminated by Whitespace
Optional Enclosure Delimiters
Trimming Trailing Whitespace
Trimming Enclosed Fields
How the PRESERVE BLANKS Option Affects Whitespace Trimming
How [NO] PRESERVE BLANKS Works with Delimiter Clauses
Applying SQL Operators to Fields
Referencing Fields
Common Uses of SQL Operators in Field Specifications
Combinations of SQL Operators
Using SQL Strings with a Date Mask
Interpreting Formatted Fields
Using SQL Strings to Load the ANYDATA Database Type
Using SQL*Loader to Generate Data for Input
Loading Data Without Files
Setting a Column to a Constant Value
CONSTANT Parameter
Setting a Column to an Expression Value
EXPRESSION Parameter
Setting a Column to the Datafile Record Number
RECNUM Parameter
Setting a Column to the Current Date
SYSDATE Parameter
Setting a Column to a Unique Sequence Number
SEQUENCE Parameter
Generating Sequence Numbers for Multiple Tables
Example: Generating Different Sequence Numbers for Each Insert
11 Loading Objects, LOBs, and Collections
Loading Column Objects
Loading Column Objects in Stream Record Format
Loading Column Objects in Variable Record Format
Loading Nested Column Objects
Loading Column Objects with a Derived Subtype
Specifying Null Values for Objects
Specifying Attribute Nulls
Specifying Atomic Nulls
Loading Column Objects with User-Defined Constructors
Loading Object Tables
Loading Object Tables with a Subtype
Loading REF Columns
Specifying Table Names in a REF Clause
System-Generated OID REF Columns
Primary Key REF Columns
Unscoped REF Columns That Allow Primary Keys
Loading LOBs
Loading LOB Data from a Primary Datafile
LOB Data in Predetermined Size Fields
LOB Data in Delimited Fields
LOB Data in Length-Value Pair Fields
Loading LOB Data from LOBFILEs
Dynamic Versus Static LOBFILE Specifications
Examples of Loading LOB Data from LOBFILEs
Considerations When Loading LOBs from LOBFILEs
Loading BFILE Columns
Loading Collections (Nested Tables and VARRAYs)
Restrictions in Nested Tables and VARRAYs
Secondary Datafiles (SDFs)
Dynamic Versus Static SDF Specifications
Loading a Parent Table Separately from Its Child Table
Memory Issues When Loading VARRAY Columns
12 Conventional and Direct Path Loads
Data Loading Methods
Loading ROWID Columns
Conventional Path Load
Conventional Path Load of a Single Partition
When to Use a Conventional Path Load
Direct Path Load
Data Conversion During Direct Path Loads
Direct Path Load of a Partitioned or Subpartitioned Table
Direct Path Load of a Single Partition or Subpartition
Advantages of a Direct Path Load
Restrictions on Using Direct Path Loads
Restrictions on a Direct Path Load of a Single Partition
When to Use a Direct Path Load
Integrity Constraints
Field Defaults on the Direct Path
Loading into Synonyms
Using Direct Path Load
Setting Up for Direct Path Loads
Specifying a Direct Path Load
Building Indexes
Improving Performance
Temporary Segment Storage Requirements
Indexes Left in an Unusable State
Using Data Saves to Protect Against Data Loss
Using the ROWS Parameter
Data Save Versus Commit
Data Recovery During Direct Path Loads
Media Recovery and Direct Path Loads
Instance Recovery and Direct Path Loads
Loading Long Data Fields
Loading Data As PIECED
Optimizing Performance of Direct Path Loads
Preallocating Storage for Faster Loading
Presorting Data for Faster Indexing
SORTED INDEXES Clause
Unsorted Data
Multiple-Column Indexes
Choosing the Best Sort Order
Infrequent Data Saves
Minimizing Use of the Redo Log
Disabling Archiving
Specifying the SQL*Loader UNRECOVERABLE Clause
Setting the SQL NOLOGGING Parameter
Specifying the Number of Column Array Rows and Size of Stream Buffers
Specifying a Value for the Date Cache
Optimizing Direct Path Loads on Multiple-CPU Systems
Avoiding Index Maintenance
Direct Loads, Integrity Constraints, and Triggers
Integrity Constraints
Enabled Constraints
Disabled Constraints
Reenable Constraints
Database Insert Triggers
Replacing Insert Triggers with Integrity Constraints
When Automatic Constraints Cannot Be Used
Preparation
Using an Update Trigger
Duplicating the Effects of Exception Conditions
Using a Stored Procedure
Permanently Disabled Triggers and Constraints
Increasing Performance with Concurrent Conventional Path Loads
Parallel Data Loading Models
Concurrent Conventional Path Loads
Intersegment Concurrency with Direct Path
Intrasegment Concurrency with Direct Path
Restrictions on Parallel Direct Path Loads
Initiating Multiple SQL*Loader Sessions
Parameters for Parallel Direct Path Loads
Using the FILE Parameter to Specify Temporary Segments
Enabling Constraints After a Parallel Direct Path Load
PRIMARY KEY and UNIQUE KEY Constraints
General Performance Improvement Hints
Part III External Tables
13 External Tables Concepts
How Are External Tables Created?
Location of Datafiles and Output Files
Access Parameters
Datatype Conversion During External Table Use
External Table Restrictions
14 The ORACLE_LOADER Access Driver
access_parameters Clause
record_format_info Clause
FIXED length
VARIABLE size
DELIMITED BY
CHARACTERSET
PREPROCESSOR
Restriction When Using the PREPROCESSOR Clause
LANGUAGE
TERRITORY
DATA IS...ENDIAN
BYTEORDERMARK (CHECK | NOCHECK)
STRING SIZES ARE IN
LOAD WHEN
BADFILE | NOBADFILE
DISCARDFILE | NODISCARDFILE
LOGFILE | NOLOGFILE
SKIP
READSIZE
DISABLE_DIRECTORY_LINK_CHECK
DATE_CACHE
string
condition_spec
[directory object name:] filename
condition
range start : range end
field_definitions Clause
delim_spec
Example: External Table with Terminating Delimiters
Example: External Table with Enclosure and Terminator Delimiters
Example: External Table with Optional Enclosure Delimiters
trim_spec
MISSING FIELD VALUES ARE NULL
field_list
pos_spec Clause
start
*
increment
end
length
datatype_spec Clause
[UNSIGNED] INTEGER [EXTERNAL] [(len)]
DECIMAL [EXTERNAL] and ZONED [EXTERNAL]
ORACLE_DATE
ORACLE_NUMBER
Floating-Point Numbers
DOUBLE
FLOAT [EXTERNAL]
BINARY_DOUBLE
BINARY_FLOAT
RAW
CHAR
date_format_spec
VARCHAR and VARRAW
VARCHARC and VARRAWC
init_spec Clause
column_transforms Clause
transform
column_name
NULL
CONSTANT
CONCAT
LOBFILE
lobfile_attr_list
Example: Creating and Loading an External Table Using ORACLE_LOADER
Parallel Loading Considerations for the ORACLE_LOADER Access Driver
Performance Hints When Using the ORACLE_LOADER Access Driver
Restrictions When Using the ORACLE_LOADER Access Driver
Reserved Words for the ORACLE_LOADER Access Driver
15 The ORACLE_DATAPUMP Access Driver
access_parameters Clause
comments
COMPRESSION
ENCRYPTION
LOGFILE | NOLOGFILE
File Names for LOGFILE
VERSION Clause
Effects of Using the SQL ENCRYPT Clause
Unloading and Loading Data with the ORACLE_DATAPUMP Access Driver
Parallel Loading and Unloading
Combining Dump Files
Supported Datatypes
Unsupported Datatypes
Unloading and Loading BFILE Datatypes
Unloading LONG and LONG RAW Datatypes
Unloading and Loading Columns Containing Final Object Types
Tables of Final Object Types
Performance Hints When Using the ORACLE_DATAPUMP Access Driver
Restrictions When Using the ORACLE_DATAPUMP Access Driver
Reserved Words for the ORACLE_DATAPUMP Access Driver
Part IV Other Utilities
16 ADRCI: ADR Command Interpreter
About the ADR Command Interpreter (ADRCI) Utility
Definitions
Starting ADRCI and Getting Help
Using ADRCI in Interactive Mode
Getting Help
Using ADRCI in Batch Mode
Setting the ADRCI Homepath Before Using ADRCI Commands
Viewing the Alert Log
Finding Trace Files
Viewing Incidents
Packaging Incidents
About Packaging Incidents
Creating Incident Packages
Creating a Logical Incident Package
Adding Diagnostic Information to a Logical Incident Package
Generating a Physical Incident Package
ADRCI Command Reference
CREATE REPORT
ECHO
EXIT
HOST
IPS
Using the <ADR_HOME> and <ADR_BASE> Variables in IPS Commands
IPS ADD
IPS ADD FILE
IPS ADD NEW INCIDENTS
IPS COPY IN FILE
IPS COPY OUT FILE
IPS CREATE PACKAGE
IPS DELETE PACKAGE
IPS FINALIZE
IPS GENERATE PACKAGE
IPS GET MANIFEST
IPS GET METADATA
IPS PACK
IPS REMOVE
IPS REMOVE FILE
IPS SET CONFIGURATION
IPS SHOW CONFIGURATION
IPS SHOW FILES
IPS SHOW INCIDENTS
IPS UNPACK FILE
PURGE
QUIT
RUN
SET BASE
SET BROWSER
SET CONTROL
SET ECHO
SET EDITOR
SET HOMEPATH
SET TERMOUT
SHOW ALERT
SHOW BASE
SHOW CONTROL
SHOW HM_RUN
SHOW HOMEPATH
SHOW HOMES
SHOW INCDIR
SHOW INCIDENT
SHOW PROBLEM
SHOW REPORT
SHOW TRACEFILE
SPOOL
Troubleshooting ADRCI
17 DBVERIFY: Offline Database Verification Utility
Using DBVERIFY to Validate Disk Blocks of a Single Datafile
Syntax
Parameters
Sample DBVERIFY Output For a Single Datafile
Using DBVERIFY to Validate a Segment
Syntax
Parameters
Sample DBVERIFY Output For a Validated Segment
18 DBNEWID Utility
What Is the DBNEWID Utility?
Ramifications of Changing the DBID and DBNAME
Considerations for Global Database Names
Changing the DBID and DBNAME of a Database
Changing the DBID and Database Name
Changing Only the Database ID
Changing Only the Database Name
Troubleshooting DBNEWID
DBNEWID Syntax
Parameters
Restrictions and Usage Notes
Additional Restrictions for Releases Earlier Than Oracle Database 10g
19 Using LogMiner to Analyze Redo Log Files
LogMiner Benefits
Introduction to LogMiner
LogMiner Configuration
Sample Configuration
Requirements
Directing LogMiner Operations and Retrieving Data of Interest
LogMiner Dictionary Files and Redo Log Files
LogMiner Dictionary Options
Using the Online Catalog
Extracting a LogMiner Dictionary to the Redo Log Files
Extracting the LogMiner Dictionary to a Flat File
Redo Log File Options
Starting LogMiner
Querying V$LOGMNR_CONTENTS for Redo Data of Interest
How the V$LOGMNR_CONTENTS View Is Populated
Querying V$LOGMNR_CONTENTS Based on Column Values
The Meaning of NULL Values Returned by the MINE_VALUE Function
Usage Rules for the MINE_VALUE and COLUMN_PRESENT Functions
Querying V$LOGMNR_CONTENTS Based on XMLType Columns and Tables
Restrictions When Using LogMiner With XMLType Data
Example of a PL/SQL Procedure for Assembling XMLType Data
Filtering and Formatting Data Returned to V$LOGMNR_CONTENTS
Showing Only Committed Transactions
Skipping Redo Corruptions
Filtering Data by Time
Filtering Data by SCN
Formatting Reconstructed SQL Statements for Re-execution
Formatting the Appearance of Returned Data for Readability
Reapplying DDL Statements Returned to V$LOGMNR_CONTENTS
Calling DBMS_LOGMNR.START_LOGMNR Multiple Times
Supplemental Logging
Database-Level Supplemental Logging
Minimal Supplemental Logging
Database-Level Identification Key Logging
Disabling Database-Level Supplemental Logging
Table-Level Supplemental Logging
Table-Level Identification Key Logging
Table-Level User-Defined Supplemental Log Groups
Usage Notes for User-Defined Supplemental Log Groups
Tracking DDL Statements in the LogMiner Dictionary
DDL_DICT_TRACKING and Supplemental Logging Settings
DDL_DICT_TRACKING and Specified Time or SCN Ranges
Accessing LogMiner Operational Information in Views
Querying V$LOGMNR_LOGS
Querying Views for Supplemental Logging Settings
Steps in a Typical LogMiner Session
Enable Supplemental Logging
Extract a LogMiner Dictionary
Specify Redo Log Files for Analysis
Start LogMiner
Query V$LOGMNR_CONTENTS
End the LogMiner Session
Examples Using LogMiner
Examples of Mining by Explicitly Specifying the Redo Log Files of Interest
Example 1: Finding All Modifications in the Last Archived Redo Log File
Example 2: Grouping DML Statements into Committed Transactions
Example 3: Formatting the Reconstructed SQL
Example 4: Using the LogMiner Dictionary in the Redo Log Files
Example 5: Tracking DDL Statements in the Internal Dictionary
Example 6: Filtering Output by Time Range
Examples of Mining Without Specifying the List of Redo Log Files Explicitly
Example 1: Mining Redo Log Files in a Given Time Range
Example 2: Mining the Redo Log Files in a Given SCN Range
Example 3: Using Continuous Mining to Include Future Values in a Query
Example Scenarios
Scenario 1: Using LogMiner to Track Changes Made by a Specific User
Scenario 2: Using LogMiner to Calculate Table Access Statistics
Supported Datatypes, Storage Attributes, and Database and Redo Log File Versions
Supported Datatypes and Table Storage Attributes
Unsupported Datatypes and Table Storage Attributes
Supported Databases and Redo Log File Versions
SecureFile LOB Considerations
20 Using the Metadata APIs
Why Use the DBMS_METADATA API?
Overview of the DBMS_METADATA API
Using the DBMS_METADATA API to Retrieve an Object's Metadata
Typical Steps Used for Basic Metadata Retrieval
Retrieving Multiple Objects
Placing Conditions on Transforms
Accessing Specific Metadata Attributes
Using the DBMS_METADATA API to Re-Create a Retrieved Object
Using the DBMS_METADATA API to Retrieve Collections of Different Object Types
Filtering the Return of Heterogeneous Object Types
Using the DBMS_METADATA_DIFF API to Compare Object Metadata
Performance Tips for the Programmatic Interface of the DBMS_METADATA API
Example Usage of the DBMS_METADATA API
What Does the DBMS_METADATA Example Do?
Output Generated from the GET_PAYROLL_TABLES Procedure
Summary of DBMS_METADATA Procedures
Summary of DBMS_METADATA_DIFF Procedures
21 Original Export
What Is the Export Utility?
Before Using Export
Running catexp.sql or catalog.sql
Ensuring Sufficient Disk Space for Export Operations
Verifying Access Privileges for Export and Import Operations
Invoking Export
Invoking Export as SYSDBA
Command-Line Entries
Parameter Files
Interactive Mode
Restrictions When Using Export's Interactive Method
Getting Online Help
Export Modes
Table-Level and Partition-Level Export
Table-Level Export
Partition-Level Export
Export Parameters
BUFFER
Example: Calculating Buffer Size
COMPRESS
CONSISTENT
CONSTRAINTS
DIRECT
FEEDBACK
FILE
FILESIZE
FLASHBACK_SCN
FLASHBACK_TIME
FULL
Points to Consider for Full Database Exports and Imports
GRANTS
HELP
INDEXES
LOG
OBJECT_CONSISTENT
OWNER
PARFILE
QUERY
Restrictions When Using the QUERY Parameter
RECORDLENGTH
RESUMABLE
RESUMABLE_NAME
RESUMABLE_TIMEOUT
ROWS
STATISTICS
TABLES
Table Name Restrictions
TABLESPACES
TRANSPORT_TABLESPACE
TRIGGERS
TTS_FULL_CHECK
USERID (username/password)
VOLSIZE
Example Export Sessions
Example Export Session in Full Database Mode
Example Export Session in User Mode
Example Export Sessions in Table Mode
Example 1: DBA Exporting Tables for Two Users
Example 2: User Exports Tables That He Owns
Example 3: Using Pattern Matching to Export Various Tables
Example Export Session Using Partition-Level Export
Example 1: Exporting a Table Without Specifying a Partition
Example 2: Exporting a Table with a Specified Partition
Example 3: Exporting a Composite Partition
Warning, Error, and Completion Messages
Log File
Warning Messages
Nonrecoverable Error Messages
Completion Messages
Exit Codes for Inspection and Display
Conventional Path Export Versus Direct Path Export
Invoking a Direct Path Export
Security Considerations for Direct Path Exports
Performance Considerations for Direct Path Exports
Restrictions for Direct Path Exports
Network Considerations
Transporting Export Files Across a Network
Exporting with Oracle Net
Character Set and Globalization Support Considerations
User Data
Effect of Character Set Sorting Order on Conversions
Data Definition Language (DDL)
Single-Byte Character Sets and Export and Import
Multibyte Character Sets and Export and Import
Using Instance Affinity with Export and Import
Considerations When Exporting Database Objects
Exporting Sequences
Exporting LONG and LOB Datatypes
Exporting Foreign Function Libraries
Exporting Offline Locally Managed Tablespaces
Exporting Directory Aliases
Exporting BFILE Columns and Attributes
Exporting External Tables
Exporting Object Type Definitions
Exporting Nested Tables
Exporting Advanced Queue (AQ) Tables
Exporting Synonyms
Possible Export Errors Related to Java Synonyms
Support for Fine-Grained Access Control
Transportable Tablespaces
Exporting From a Read-Only Database
Using Export and Import to Partition a Database Migration
Advantages of Partitioning a Migration
Disadvantages of Partitioning a Migration
How to Use Export and Import to Partition a Database Migration
Using Different Releases of Export and Import
Restrictions When Using Different Releases of Export and Import
Examples of Using Different Releases of Export and Import
22 Original Import
What Is the Import Utility?
Table Objects: Order of Import
Before Using Import
Running catexp.sql or catalog.sql
Verifying Access Privileges for Import Operations
Importing Objects Into Your Own Schema
Importing Grants
Importing Objects Into Other Schemas
Importing System Objects
Processing Restrictions
Importing into Existing Tables
Manually Creating Tables Before Importing Data
Disabling Referential Constraints
Manually Ordering the Import
Effect of Schema and Database Triggers on Import Operations
Invoking Import
Command-Line Entries
Parameter Files
Interactive Mode
Invoking Import As SYSDBA
Getting Online Help
Import Modes
Import Parameters
BUFFER
COMMIT
COMPILE
CONSTRAINTS
DATA_ONLY
DATAFILES
DESTROY
FEEDBACK
FILE
FILESIZE
FROMUSER
FULL
Points to Consider for Full Database Exports and Imports
GRANTS
HELP
IGNORE
INDEXES
INDEXFILE
LOG
PARFILE
RECORDLENGTH
RESUMABLE
RESUMABLE_NAME
RESUMABLE_TIMEOUT
ROWS
SHOW
SKIP_UNUSABLE_INDEXES
STATISTICS
STREAMS_CONFIGURATION
STREAMS_INSTANTIATION
TABLES
Table Name Restrictions
TABLESPACES
TOID_NOVALIDATE
TOUSER
TRANSPORT_TABLESPACE
TTS_OWNERS
USERID (username/password)
VOLSIZE
Example Import Sessions
Example Import of Selected Tables for a Specific User
Example Import of Tables Exported by Another User
Example Import of Tables from One User to Another
Example Import Session Using Partition-Level Import
Example 1: A Partition-Level Import
Example 2: A Partition-Level Import of a Composite Partitioned Table
Example 3: Repartitioning a Table on a Different Column
Example Import Using Pattern Matching to Import Various Tables
Exit Codes for Inspection and Display
Error Handling During an Import
Row Errors
Failed Integrity Constraints
Invalid Data
Errors Importing Database Objects
Object Already Exists
Sequences
Resource Errors
Domain Index Metadata
Table-Level and Partition-Level Import
Guidelines for Using Table-Level Import
Guidelines for Using Partition-Level Import
Migrating Data Across Partitions and Tables
Controlling Index Creation and Maintenance
Delaying Index Creation
Index Creation and Maintenance Controls
Example of Postponing Index Maintenance
Network Considerations
Character Set and Globalization Support Considerations
User Data
Effect of Character Set Sorting Order on Conversions
Data Definition Language (DDL)
Single-Byte Character Sets
Multibyte Character Sets
Using Instance Affinity
Considerations When Importing Database Objects
Importing Object Identifiers
Importing Existing Object Tables and Tables That Contain Object Types
Importing Nested Tables
Importing REF Data
Importing BFILE Columns and Directory Aliases
Importing Foreign Function Libraries
Importing Stored Procedures, Functions, and Packages
Importing Java Objects
Importing External Tables
Importing Advanced Queue (AQ) Tables
Importing LONG Columns
Importing LOB Columns When Triggers Are Present
Importing Views
Importing Partitioned Tables
Support for Fine-Grained Access Control
Snapshots and Snapshot Logs
Snapshot Log
Snapshots
Importing a Snapshot
Importing a Snapshot into a Different Schema
Transportable Tablespaces
Storage Parameters
The OPTIMAL Parameter
Storage Parameters for OID Indexes and LOB Columns
Overriding Storage Parameters
Read-Only Tablespaces
Dropping a Tablespace
Reorganizing Tablespaces
Importing Statistics
Using Export and Import to Partition a Database Migration
Advantages of Partitioning a Migration
Disadvantages of Partitioning a Migration
How to Use Export and Import to Partition a Database Migration
Tuning Considerations for Import Operations
Changing System-Level Options
Changing Initialization Parameters
Changing Import Options
Dealing with Large Amounts of LOB Data
Dealing with Large Amounts of LONG Data
Using Different Releases of Export and Import
Restrictions When Using Different Releases of Export and Import
Examples of Using Different Releases of Export and Import
23 Enterprise Manager Configuration Assistant (EMCA)
Configuring Database Control with EMCA
Configuring Software Library With EMCA
Using an Input File for EMCA Parameters
Using EMCA With Oracle Real Application Clusters
Specifying the Ports Used by EMCA
EMCA Troubleshooting Tips
Using EMCA After Changing the Database Listener Port
Upgrading Database or Oracle ASM Instances With 11g Release 1 Grid Control Agents
Using EMCA When Database Host Name or IP Address Changes
Using EMCA When the TNS Configuration Is Changed
Part V Appendixes
A SQL*Loader Syntax Diagrams
Index