Oracle 10g DataPump

Oracle 10g DataPump, Part 1: Overview
By Jim Czuprynski

Synopsis: Oracle 10g's new DataPump utility is designed as the eventual replacement for the original Oracle Import and Export utilities. This article - the first in a series - provides an overview of the DataPump's suite of tools for extracting, transforming, and loading data within an Oracle database.

Oracle's Import and Export utilities offer a great way to create, retain, and reload a point-in-time snapshot of a database's structure and contents at the table, schema, tablespace, or database level. Database exports also have a place as a potential third line of defense in an in-depth disaster recovery plan (please view my article on Becoming A Master of Disaster for a more detailed discussion of this concept). I have also used these utilities on a regular basis to migrate data from our production environment to multiple development environments, but as reliable as these utilities are, they do have some drawbacks that needed to be addressed:

Operations are difficult to restart. If an export session died because it ran out of disk space, or if the session was terminated deliberately or accidentally, the only option is to restart the entire export from the beginning, or to create a new export control file at the point of failure and initiate the export again.

Execution is client-side and single-threaded. The Import and Export utilities use a client process to execute their respective operations. This means that the most powerful source of information - the database server itself - is unavailable to help control the Import or Export operation. Also, there is no way to utilize parallel processing on a database server while exporting data, so each Export operation is essentially one single, serial, potentially long-running task.

Dump files can grow enormously. The Export utility creates one or more export dump files during each session; during a complete dump of a database, these files can obviously grow quite large. Since it is difficult to estimate the approximate expected size of the completed export files, the operation may fail due to disk space constraints. Also, to import just one table or database object from a multi-schema export file, the Import utility has to read all dump files to locate the object before it can be restored - obviously not an optimal situation!
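To put these drawbacks in context, a full database export with the original utility boils down to a single, serial client session. The following is a minimal sketch only; the connect string and file names are illustrative, not taken from any real environment:

```shell
# One client process, one serial stream: if this session dies,
# the entire export must be restarted from the beginning.
exp system/password@prod FULL=Y FILE=full_db.dmp LOG=full_db.log
```

Everything - row fetching, dump file writing, progress reporting - runs inside that one client process.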
Tuning mechanisms are limited. Methods to tune a poorly executing Export or Import operation are relatively limited and somewhat arcane. Experimenting with different settings for the BUFFER parameter - which controls the maximum number of rows fetched into the Export utility's buffer at one time - is one of the few recommended options for improving performance.

Progress monitoring is difficult. While a DBA can certainly monitor the command session that is running the operation, or review the spooled output from an operation, there is no simple way to tell exactly how efficiently an Export or Import operation is performing, and there is no simple way to tell exactly what percentage of a large operation is completed.

Database object filtering features are limited. While it is certainly possible to export and import selected individual tables from a database or schema, filtering features are relatively limited. For example, it is impossible to export all tables with common names across schemas within the same export session. It is also impossible to select specific database object types other than tables (like packages, stored procedures, or triggers) for inclusion in an export dump file.

Enter The DataPump: Features Overview

Fortunately, Oracle 10g has addressed many of these issues with the DataPump, a new set of utilities that significantly expands the capabilities of the original Export and Import utilities.

Server-side parallel execution. DataPump Import and Export operations are scheduled and processed within the database server itself as a DataPump job - a much more effective method than client-side execution.
In addition, if the database server's OS and hardware can adequately support it, DataPump export operations can take advantage of parallel processes to write out multiple data streams simultaneously.

Improved control and restartability of operations. Since the database server controls the DataPump, a DBA can start an Export or Import operation and then detach from it, allowing the operation to run to completion without further intervention. The DBA can also reconnect to the operation from another command-line session, interrogate its current status and choose to pause the operation if necessary. And if it appears that the operation is dangerously close to failing - for example, because of insufficient disk space for an export file - the DBA can even add more disk files "on the fly" without canceling the existing operation. This provides a level of control at least an order of magnitude more powerful than the current Export/Import utilities.

Simplified monitoring of operational status. DataPump provides a new STATUS parameter that shows the status of an ongoing Export or Import operation. When STATUS is specified at operation startup, Oracle will report the operation's status to the terminal session of the command-line interface at the specified interval. The DBA can also reattach to a detached DataPump job, issue a STATUS request, determine how far the operation has progressed, and then detach from the job again.

Automated performance tuning. Unlike the traditional Export and Import utilities, DataPump implements self-tuning mechanisms to insure that tables are exported and imported in the most efficient manner. Gone are the BUFFER, COMMIT, COMPRESS, CONSISTENT, DIRECT, and RECORDLENGTH parameters.

Improved database object filtering. DataPump implements filtering of the database objects to capture during an Export operation or include as part of an Import operation with a new set of mutually exclusive parameters, INCLUDE and EXCLUDE. These parameters allow the DBA to specify almost any database object for inclusion or exclusion based on its object type and object name. This dramatically improves flexibility when exporting and importing; for example, it is now possible to gather objects with similar names across an entire database.

Export dump file control enhancements. In prior releases, the DBA could specify more than one dump file as a destination for an Export session; however, once the session commenced, no additional files could be specified. DataPump Export improves upon this by permitting parameterized file names similar to the naming conventions for redo logs and RMAN backup files.

DataPump Concepts

The DataPump is the next logical progression in a series of improvements that have appeared during the last few Oracle releases. The two most dramatic upgrades involve the ability to read data extremely quickly via direct path unloading, and the capability to write data directly to external tables.

Direct Path Unloads. Prior releases of Oracle had introduced the concept of the direct path load, which permits the server to bypass SQL parsing of INSERT statements so that data can be inserted directly and with blinding speed into database tables, providing the tables met certain conditions. Building on this concept, Oracle 10g adds the capability to perform direct path unloads of database tables within the same limits that apply for direct path loads.

Writing To External Tables. Oracle 9i Release 2 introduced the ability to read data from files stored outside the database as external tables using existing Oracle SQL*Loader access methods. Oracle 10g expands upon this feature by including a new access method, ORACLE_DATAPUMP, that allows the server to write directly to an external file.

DataPump Export uses these two enhancements to unload data extremely quickly. Even better, DataPump needs virtually no assistance in deciding which method is fastest for exporting data, and is relatively self-tuning.
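To get a feel for the new access method outside of a full DataPump job, the ORACLE_DATAPUMP driver can also be exercised directly through an external table. This is a minimal sketch, assuming a DIRECTORY object named EXPORT_DIR already exists; the table and file names are illustrative:

```sql
-- Unload a copy of HR.EMPLOYEES into a DataPump-format file
-- via the new ORACLE_DATAPUMP access driver (Oracle 10g).
CREATE TABLE hr_employees_unload
  ORGANIZATION EXTERNAL (
    TYPE ORACLE_DATAPUMP
    DEFAULT DIRECTORY export_dir
    LOCATION ('hr_employees.dmp')
  )
AS SELECT * FROM hr.employees;
```

The CREATE TABLE ... AS SELECT drives the unload: the server writes the query results directly into the external file as it creates the table.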
For example, the concept of direct vs. conventional exports is now meaningless, and several of the tuning parameters (e.g. BUFFER, RECORDLENGTH) have been deprecated.

The DataPump engine will choose to write to external tables over direct path unload whenever a table meets any of the following criteria:

- The table is part of a cluster.
- There is an active trigger on the table.
- Fine-grained access control is enabled for the table for INSERTs.
- One of the table's columns has a referential integrity constraint.
- The table has a LOB column, and that LOB column has a domain index.
- The table contains a BFILE column or VARRAY column, and those column(s) have an embedded, opaque TYPE.

DataPump Components

DataPump consists of three components that work in concert to manage and perform operations:

Command-Line Interface. Like the original Export (EXP.EXE) and Import (IMP.EXE) command-line interfaces, the DataPump provides two command-line interfaces, EXPDP.EXE and IMPDP.EXE, for controlling DataPump Export and Import operations, respectively.

DataPump also expands the command-line interface's capabilities by providing Interactive-Command Mode. This mode allows the DBA to start a DataPump operation and then disconnect from it by simply issuing a CTL+C keystroke. Later, the DBA simply opens another DataPump session while specifying the operation's job name and then issues the ATTACH directive to reconnect to the session. The DBA can then issue a series of commands, from taking a pulse of the operation via the STATUS directive, to adding more export dump files, or even terminating a long-running operation if desired.

DBMS_METADATA. Introduced in Oracle 9i, this PL/SQL supplied package provides methods to extract and format metadata - literally, "information about information" - from an Oracle database. At its core, DBMS_METADATA stores metadata in XML format for ultimate flexibility in its presentation of that metadata.
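As a quick illustration of DBMS_METADATA on its own, outside of any DataPump job, its GET_DDL function returns the complete DDL for a named object. The schema and table below are illustrative:

```sql
-- Return the CREATE TABLE statement for HR.EMPLOYEES as a CLOB;
-- SET LONG ensures SQL*Plus displays the full text rather than truncating it.
SET LONG 20000
SELECT DBMS_METADATA.GET_DDL('TABLE', 'EMPLOYEES', 'HR') FROM dual;
```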
DataPump Export uses DBMS_METADATA to gather all metadata about the database objects being processed, then stores that metadata in the Export operation's master table. DataPump Import uses the corresponding master table to extract information from the export dump file(s), and then uses DBMS_METADATA to create the DDL statements it needs to create new database objects as part of the Import operation.

DBMS_DATAPUMP. At the heart of the DataPump is the new DBMS_DATAPUMP PL/SQL supplied package. This package contains methods for exporting data from and importing data into any Oracle database. EXPDP and IMPDP are actually making calls to this new package to perform their respective operations.

What is especially powerful about DBMS_DATAPUMP is that a DBA can also utilize it directly, either in anonymous PL/SQL blocks or within a stored procedure, stored function, or package, to create custom DataPump Export and Import jobs. I will demonstrate how to accomplish this in the final article in this series when we look at some of the other advanced features of the DataPump.

DataPump Jobs, Master Processes, and the Master Table

DataPump also implements major (and in my opinion, elegant!) improvements to the execution and management of Export and Import operations. Each DataPump operation uses a master process to manage and control the export or import session. Each master process is identified by a unique job name, and DataPump automatically assigns one by default if one is not specified at execution time.

The master process controls the DataPump operation by invoking at least one worker process that actually performs the Export or Import.
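To sketch what direct use of DBMS_DATAPUMP looks like, here is a minimal anonymous block that defines and starts a schema-mode Export job. The job name, dump file name, and EXPORT_DIR directory object are assumptions for illustration only:

```sql
DECLARE
  h NUMBER;
BEGIN
  -- Open a schema-mode Export job (job name is illustrative)
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT',
                          job_mode  => 'SCHEMA',
                          job_name  => 'HR_EXPORT_JOB');
  -- The dump file lands in the EXPORT_DIR directory object
  DBMS_DATAPUMP.ADD_FILE(handle    => h,
                         filename  => 'hr_export.dmp',
                         directory => 'EXPORT_DIR');
  -- Restrict the job to the HR schema
  DBMS_DATAPUMP.METADATA_FILTER(handle => h,
                                name   => 'SCHEMA_EXPR',
                                value  => 'IN (''HR'')');
  DBMS_DATAPUMP.START_JOB(h);
  -- Detach; the server-side master process carries on without this session
  DBMS_DATAPUMP.DETACH(h);
END;
/
```

Note that the block can end immediately after START_JOB and DETACH: the master process on the server owns the job from that point onward.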
(This is extremely similar to the way Recovery Manager (RMAN) manages backup, restoration, and recovery processing.) If the DataPump operation has requested additional parallel processes for multiple-threaded processing, more than one worker process will be created and managed.

The master process also creates a master table as a user table within the user account in which the DataPump job is invoked. This master table is always named the same as the DataPump job, and it is used slightly differently depending on the type of DataPump operation, but it is always created within the same user account that is invoking the operation.

During an Export operation, the master table is created when the job commences, contains all target objects for the Export, and is retained until the job finishes successfully, at which point it is dropped. However, if the job is paused or fails due to error, the Export process uses the master table to determine what objects still need to be exported and how best to resume the job. Once the job completes successfully, the master table's contents are written to the Export file itself for use by an Import process.

On the other hand, an Import operation reads the master table from the Export dump file and then creates the master table under the appropriate user account. During an Import, the master process uses the master table to decide which database objects need to be created and in what order before any data is actually imported.

A Simple DataPump Export Scenario

To demonstrate an example of how easy it is to use DataPump for exporting data, I will create a dump file containing all database objects stored in the Human Resources (HR) schema for use in a later DataPump Import operation. (The next article will greatly expand upon the capabilities of DataPump Export, I assure you!)

Listing 1.1 shows how to create a DIRECTORY database object for storage of all DataPump dump files.
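A schema-mode export of this kind is typically driven by a small parameter file handed to EXPDP.EXE (for example, expdp system/password PARFILE=hr_export.par). The following is an illustrative sketch, not the article's actual parameter file; it assumes the EXPORT_DIR directory object exists:

```text
DIRECTORY=export_dir
DUMPFILE=hr_schema.dmp
LOGFILE=hr_schema.log
SCHEMAS=hr
STATUS=30
```

The STATUS=30 entry asks DataPump to report the job's progress to the terminal every 30 seconds.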
DataPump also uses the DIRECTORY object as a default location for log files and parameter files.

Listing 1.2 provides a set of queries to display listings of what object types can be processed with the DataPump, and whether a filter can be applied against the object type at the database, schema, or table level.

Listing 1.3 shows the DataPump Export command issued, its corresponding parameter file, and the log file that documents the results of the DataPump Export operation.

A Simple DataPump Import Scenario

Using DataPump to import data is equally simple. To illustrate, I have constructed the following scenario:

- The DataPump Export dump file created in the previous scenario will be used to import data into a new schema, HR_OLTP, that will eventually become the target of a new online transaction processing application under development.
- The new schema needs all of the tables from the existing HR schema and those tables' related objects, i.e. referential constraints, indexes, sequences, and triggers.
- No other database objects should be loaded to this new schema from the existing HR schema.

See Listing 1.4 for the SQL to create the new schema, the DataPump Import command issued, its corresponding parameter file, and the log file that documents the results of the DataPump Import operation.

Monitoring DataPump Operations

Oracle 10g provides two new views, DBA_DATAPUMP_JOBS and DBA_DATAPUMP_SESSIONS, that allow the DBA to monitor the progress of all DataPump operations. Listing 1.5 shows two queries that return valuable information about ongoing DataPump jobs and sessions.

Conclusion

Oracle 10g's new DataPump offers significant improvements over the original command-line-based Export and Import utilities, including multi-threaded processing, improved object filtering capabilities, and restartability. The next article in this series will dive deeply into utilizing the DataPump effectively to accomplish some typical (and not-so-typical!) "real-world" export and import tasks.
References and Additional Reading

While there is no substitute for direct experience, reading the manual is not a bad idea, either. I have drawn upon the following Oracle 10g documentation for the deeper technical details of this article:

B10750-01 Oracle Database New Features Guide

/*
|| Oracle 10g DataPump - Listing 1
||
|| Contains examples of new Oracle 10g DataPump Features.
||
|| Author: Jim Czuprynski
||
|| Usage Notes:
|| This script is provided to demonstrate various features of Oracle 10g's
|| new DataPump and should be carefully proofread before executing it against
|| any existing Oracle database to insure that no potential damage can occur.
*/

-----
-- Listing 1.1: Setting up a DIRECTORY object for DataPump use.
-- Note that the directory folder need not exist for this command
-- to succeed, but any subsequent attempt to utilize the DIRECTORY
-- object will fail until the folder is created on the server.
-- This should be run from SYSTEM for best results.
-----
DROP DIRECTORY export_dir;
CREATE DIRECTORY export_dir AS 'c:\oracle\export_dir';
GRANT READ, WRITE ON DIRECTORY export_dir TO hr, sh;

[The remainder of this copy - the DataPump Export log, consisting of "Processing object type SCHEMA_EXPORT/..." lines for each exported object type and ". . exported" lines giving the size and row count of each HR table - is too garbled and truncated to reproduce reliably.]