import issue (10.2.0.3, windows 2003)
import issue [message #410862] Tue, 30 June 2009 14:15
sant_new
Messages: 165
Registered: June 2008
Senior Member
Hi all,

I'm importing our customer's data into our database; it is about a 101 GB database. I had a lot of problems during the import, and this is the third import run. It seemed to be fine so far: it has imported the data and the objects, but it has been stuck at 'Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS' for about 12 hours now. The session is still active, so it makes me believe it is still computing the statistics. I know I should've used statistics=none, but I didn't know it was going to take this long to run the statistics...

I have to get this database ready ASAP, so I'm trying to figure out how much longer the statistics will take to complete. Is there a way to find this out, or to tell whether the import is stuck or has failed?

Please give me your suggestions. Thanks a lot
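
For anyone in the same spot: a quick way to at least confirm that the Data Pump job is still alive rather than hung is to query the job and long-operation views. A minimal sketch, run as a DBA (both views exist in 10.2):

SELECT owner_name, job_name, state
  FROM dba_datapump_jobs;

-- Rough progress of the job's long-running operations:
SELECT sid, opname, sofar, totalwork,
       ROUND(sofar / totalwork * 100, 1) AS pct_done
  FROM v$session_longops
 WHERE totalwork > 0
   AND sofar <> totalwork;

A state of EXECUTING in dba_datapump_jobs means the import is still working, even when the log file has not moved for hours.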
Re: import issue [message #410864 is a reply to message #410862] Tue, 30 June 2009 14:19
Michel Cadot
Messages: 68647
Registered: March 2007
Location: Nanterre, France, http://...
Senior Member
Account Moderator
No way to know how long it will take.

Regards
Michel
Re: import issue [message #410865 is a reply to message #410862] Tue, 30 June 2009 14:22
sant_new
Messages: 165
Registered: June 2008
Senior Member
Okay. I do see that all the data got imported, including objects like views, procedures, functions, triggers, etc.

Is computing table_statistics the last step in impdp? If so, does it matter if I kill the import and run a DBMS utility to gather the statistics instead?

Thanks
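
If the job is killed, gathering the statistics manually afterwards is a reasonable fallback. A minimal sketch, assuming DBMS_STATS is the utility meant (rather than the older DBMS_UTILITY.ANALYZE_SCHEMA) and that the target schema is VER, as elsewhere in this thread:

BEGIN
  DBMS_STATS.GATHER_SCHEMA_STATS(
    ownname          => 'VER',                        -- target schema; an assumption from this thread
    estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,  -- let Oracle choose the sample size
    cascade          => TRUE                          -- gather index statistics as well
  );
END;
/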
Re: import issue [message #410994 is a reply to message #410862] Wed, 01 July 2009 07:59
sant_new
Messages: 165
Registered: June 2008
Senior Member
The statistics finished a few minutes later, so it worked out fine. But for future imports, I'll have to use exclude=statistics.

I have another problem now. I have a full export of our client's database, but I want to import only one schema from the dump. I'm using the impdp command below:

 impdp ver/..@..  directory=dpdir dumpfile=exp_full.dmp logfile=ver.log full=n remap_schema=perm:ver exclude=statistics


I just want it to import the perm schema into the ver user, but the import is creating all the schemas that exist in the export dump.

Please give me your thoughts. Thanks a lot.
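
One approach worth noting: with Data Pump, a full export dump can be filtered at import time, so there is no need for FULL=N plus a list of excludes; the SCHEMAS parameter pulls a single schema out of the full dump. A sketch, reusing the connection string and file names from the command above:

 impdp ver/..@.. directory=dpdir dumpfile=exp_full.dmp logfile=ver.log schemas=perm remap_schema=perm:ver exclude=statistics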

Re: import issue [message #411019 is a reply to message #410862] Wed, 01 July 2009 08:42
sant_new
Messages: 165
Registered: June 2008
Senior Member
I just found that this filter can be used:
exclude=schema:"='userown'"

I tried this:

impdp ver/..@.. directory=dpdir dumpfile=exp_full.dmp logfile=ver.log full=n remap_schema=perm:ver exclude=statistics exclude=schema:"='userown'"

LRM-00116: syntax error at 'schema:' following '='

Please give me your thoughts.

Thanks
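
LRM-00116 here is almost certainly the Windows shell stripping the double quotes before impdp sees them. Escaping them with backslashes works (as found below), but a parameter file avoids operating-system quoting altogether. A sketch; the file name imp_ver.par is illustrative:

# imp_ver.par -- quotes need no escaping inside a parameter file
directory=dpdir
dumpfile=exp_full.dmp
logfile=ver.log
remap_schema=perm:ver
exclude=statistics
exclude=schema:"='USEROWN'"

impdp ver/..@.. parfile=imp_ver.par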
Re: import issue [message #411031 is a reply to message #410862] Wed, 01 July 2009 09:47
BlackSwan
Messages: 26766
Registered: January 2009
Location: SoCal
Senior Member
impdp help=yes

Import: Release 10.2.0.4.0 - Production on Wednesday, 01 July, 2009 7:46:15

Copyright (c) 2003, 2007, Oracle.  All rights reserved.


The Data Pump Import utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:

     Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

You can control how Import runs by entering the 'impdp' command followed
by various parameters. To specify parameters, you use keywords:

     Format:  impdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
     Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

USERID must be the first parameter on the command line.

Keyword               Description (Default)
------------------------------------------------------------------------------
ATTACH                Attach to existing job, e.g. ATTACH [=job name].
CONTENT               Specifies data to load where the valid keywords are:
                      (ALL), DATA_ONLY, and METADATA_ONLY.
DIRECTORY             Directory object to be used for dump, log, and sql files.
DUMPFILE              List of dumpfiles to import from (expdat.dmp),
                      e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
ENCRYPTION_PASSWORD   Password key for accessing encrypted column data.
                      This parameter is not valid for network import jobs.
ESTIMATE              Calculate job estimates where the valid keywords are:
                      (BLOCKS) and STATISTICS.
EXCLUDE               Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
FLASHBACK_SCN         SCN used to set session snapshot back to.
FLASHBACK_TIME        Time used to get the SCN closest to the specified time.
FULL                  Import everything from source (Y).
HELP                  Display help messages (N).
INCLUDE               Include specific object types, e.g. INCLUDE=TABLE_DATA.
JOB_NAME              Name of import job to create.
LOGFILE               Log file name (import.log).
NETWORK_LINK          Name of remote database link to the source system.
NOLOGFILE             Do not write logfile.
PARALLEL              Change the number of active workers for current job.
PARFILE               Specify parameter file.
QUERY                 Predicate clause used to import a subset of a table.
REMAP_DATAFILE        Redefine datafile references in all DDL statements.
REMAP_SCHEMA          Objects from one schema are loaded into another schema.
REMAP_TABLESPACE      Tablespace object are remapped to another tablespace.
REUSE_DATAFILES       Tablespace will be initialized if it already exists (N).
SCHEMAS               List of schemas to import.
SKIP_UNUSABLE_INDEXES Skip indexes that were set to the Index Unusable state.
SQLFILE               Write all the SQL DDL to a specified file.
STATUS                Frequency (secs) job status is to be monitored where
                      the default (0) will show new status when available.
STREAMS_CONFIGURATION Enable the loading of Streams metadata
TABLE_EXISTS_ACTION   Action to take if imported object already exists.
                      Valid keywords: (SKIP), APPEND, REPLACE and TRUNCATE.
TABLES                Identifies a list of tables to import.
TABLESPACES           Identifies a list of tablespaces to import.
TRANSFORM             Metadata transform to apply to applicable objects.
                      Valid transform keywords: SEGMENT_ATTRIBUTES, STORAGE
                      OID, and PCTSPACE.
TRANSPORT_DATAFILES   List of datafiles to be imported by transportable mode.
TRANSPORT_FULL_CHECK  Verify storage segments of all tables (N).
TRANSPORT_TABLESPACES List of tablespaces from which metadata will be loaded.
                      Only valid in NETWORK_LINK mode import operations.
VERSION               Version of objects to export where valid keywords are:
                      (COMPATIBLE), LATEST, or any valid database version.
                      Only valid for NETWORK_LINK and SQLFILE.

The following commands are valid while in interactive mode.
Note: abbreviations are allowed

Command               Description (Default)
------------------------------------------------------------------------------
CONTINUE_CLIENT       Return to logging mode. Job will be re-started if idle.
EXIT_CLIENT           Quit client session and leave job running.
HELP                  Summarize interactive commands.
KILL_JOB              Detach and delete job.
PARALLEL              Change the number of active workers for current job.
                      PARALLEL=<number of workers>.
START_JOB             Start/resume current job.
                      START_JOB=SKIP_CURRENT will start the job after skipping
                      any action which was in progress when job was stopped. 
STATUS                Frequency (secs) job status is to be monitored where
                      the default (0) will show new status when available.
                      STATUS[=interval]
STOP_JOB              Orderly shutdown of job execution and exits the client.
                      STOP_JOB=IMMEDIATE performs an immediate shutdown of the
                      Data Pump job.
Re: import issue [message #411037 is a reply to message #410862] Wed, 01 July 2009 10:54
sant_new
Messages: 165
Registered: June 2008
Senior Member
Just FYI if someone has the same problem, this worked:

EXCLUDE=SCHEMA:\"= 'USEROWN'\"
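
For completeness, the full command line with that escaped filter, using the same connection string and file names as earlier in the thread, would be:

impdp ver/..@.. directory=dpdir dumpfile=exp_full.dmp logfile=ver.log remap_schema=perm:ver exclude=statistics exclude=schema:\"='USEROWN'\"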
