Oracle Dumpfile Compression Parameter

Oracle Tips by Burleson Consulting

This is an excerpt from "Oracle 10g New Features for Administrators" by Ahmed Baraka.

COMPRESSION = {METADATA_ONLY | NONE} - This parameter applies from Oracle 10.2 onward. It specifies whether to compress metadata before writing it to the dump file set. Compression reduces the amount of disk space consumed by dump files.
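For example, metadata compression could be requested at export time like this (a sketch only; the dpump_dir1 directory object and the scott account are assumed to exist):

expdp scott/tiger DIRECTORY=dpump_dir1
DUMPFILE=scott_meta.dmp
COMPRESSION=METADATA_ONLY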

Data Pump Import Parameters

You’ll need the IMPORT_FULL_DATABASE role to perform an import if the dump file for the import was created using the EXPORT_FULL_DATABASE role.

File- and Directory-Related Parameters

The Data Pump import utility uses the PARFILE, DIRECTORY, DUMPFILE, LOGFILE, and NOLOGFILE parameters in the same way as the Data Pump export utility.

SQLFILE - This parameter enables you to extract the DDL from the export dump file, without importing any data.

impdp salapati/sammyy1 DIRECTORY=dpump_dir1
DUMPFILE=finance.dmp
SQLFILE=dpump_dir2:finance.sql

REUSE_DATAFILES - This parameter tells Data Pump whether it should use existing datafiles for creating tablespaces during an import.
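As a sketch, assuming the datafiles for the imported tablespaces already exist on the target host, a full import could be told to reuse them rather than fail on the CREATE TABLESPACE statements:

impdp hr/hr FULL=y DIRECTORY=dpump_dir1
DUMPFILE=db_full.dmp
REUSE_DATAFILES=y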

Import Mode-Related Parameters

You can perform a Data Pump import in various modes, using the TABLES, SCHEMAS, TABLESPACES, and FULL parameters, just as in the case of the Data Pump export utility.

Filtering Parameters

The Data Pump import utility uses the CONTENT, EXCLUDE and INCLUDE parameters in the same way as the Data Pump export utility. If you use the CONTENT=DATA_ONLY option, you cannot use either the EXCLUDE or INCLUDE parameter during an import.

The QUERY parameter can also be used, but in this case Data Pump will use only the external tables data access method, rather than the direct-path method, to access the data.
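For example, a row-level filter could be applied on the import side through a parameter file (the table name, WHERE clause, and file names here are illustrative only; a parameter file sidesteps operating-system quoting problems with the double quotes):

impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=hr_emp.dmp PARFILE=emp_query.par

where emp_query.par contains:

TABLES=employees
QUERY=employees:"WHERE department_id = 50"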

TABLE_EXISTS_ACTION - Use this parameter to tell Data Pump what to do when a table already exists:

o SKIP (the default) - Data Pump skips the table if it already exists.

o APPEND - appends rows to the existing table.

o TRUNCATE - truncates the table and reloads the data from the export dump file.

o REPLACE - drops the table if it exists, then recreates and reloads it.
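For instance, the exported rows could be added to tables that already exist on the target (the directory and dump file names are assumed):

impdp hr/hr TABLES=employees
DIRECTORY=dpump_dir1 DUMPFILE=hr_emp.dmp
TABLE_EXISTS_ACTION=APPEND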

Job-Related Parameters

The JOB_NAME, STATUS, and PARALLEL parameters have the same meanings as their Data Pump export counterparts.


Remapping Parameters

REMAP_SCHEMA - Using this parameter, you can move objects from one schema to another.

impdp system/manager dumpfile=newdump.dmp
REMAP_SCHEMA=hr:oe
REMAP_DATAFILE - Changes the name of the source datafile to the target datafile name in all SQL statements where the source datafile is referenced: CREATE TABLESPACE, CREATE LIBRARY, and CREATE DIRECTORY.

Remapping datafiles is useful when you move databases between platforms that have different file naming conventions.

impdp hr/hr FULL=y DIRECTORY=dpump_dir1
DUMPFILE=db_full.dmp
REMAP_DATAFILE='DB1$:[HRDATA.PAYROLL]tbs6.f':'/db1/hrdata/payroll/tbs6.f'

REMAP_TABLESPACE - This parameter enables you to move objects from one
tablespace into a different tablespace during an import.

impdp hr/hr
REMAP_TABLESPACE='example_tbs':'new_tbs'
DIRECTORY=dpump_dir1 PARALLEL=2
JOB_NAME=cf1n02 DUMPFILE=employees.dmp
NOLOGFILE=Y

The Network Link Parameter

NETWORK_LINK - In a network import, the server contacts the remote source database referenced by the parameter value, retrieves the data, and writes it directly to the target database. No dump files are involved.

impdp hr/hr TABLES=employees
DIRECTORY=dpump_dir1
NETWORK_LINK=source_database_link
EXCLUDE=CONSTRAINT

The log file is written to dpump_dir1, the location specified in the DIRECTORY parameter.

The TRANSFORM Parameter

TRANSFORM - This parameter instructs the Data Pump import job to modify the storage attributes of the DDL that creates the objects during the import job.

TRANSFORM = transform_name:value[:object_type]

transform_name takes one of the following values:

SEGMENT_ATTRIBUTES - If the value is specified as y, then segment attributes (physical attributes, storage attributes, tablespaces, and logging) are included, with appropriate DDL. The default is y.

STORAGE - If the value is specified as y, the storage clauses are included, with appropriate DDL. The default is y. This parameter is ignored if SEGMENT_ATTRIBUTES=n.

OID - If the value is specified as n, the assignment of the exported OID during the creation of object tables and types is inhibited. Instead, a new OID is assigned. This can be useful for cloning schemas, but does not affect referenced objects. The default is y.

PCTSPACE - Accepts a number greater than zero, which acts as a percentage multiplier applied to extent allocations and datafile sizes.

object_type can take one of the following values:

CLUSTER, CONSTRAINT, INC_TYPE, INDEX, ROLLBACK_SEGMENT, TABLE, TABLESPACE, TYPE

impdp hr/hr TABLES=employees \
DIRECTORY=dp_dir DUMPFILE=hr_emp.dmp \
TRANSFORM=SEGMENT_ATTRIBUTES:n:table

impdp hr/hr TABLES=employees \
DIRECTORY=dp_dir DUMPFILE=hr_emp.dmp \
TRANSFORM=STORAGE:n:table
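Similarly, a PCTSPACE transform could be used to build a smaller test copy of a schema at half the original storage sizes (a sketch; the schema, directory, and dump file names are assumed):

impdp hr/hr SCHEMAS=hr
DIRECTORY=dpump_dir1 DUMPFILE=hr_full.dmp
TRANSFORM=PCTSPACE:50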

 


 

 

Copyright © 1996 - 2013 by Burleson. All rights reserved.

Oracle® is the registered trademark of Oracle Corporation.


