ORA-01466: unable to read data – table definition has changed

I got the "ORA-01466: unable to read data - table definition has changed" error in an Oracle database during impdp.

 


 

Details of error are as follows.

ORA-31693: Table data object "MSDBA"."TEST_TABLE" failed to load/unload and is being skipped due to error:
ORA-02354: error in exporting/importing data
ORA-01466: unable to read data - table definition has changed
Processing object type TABLE_EXPORT/TABLE/PROCACT_INSTANCE
Processing object type TABLE_EXPORT/TABLE/TABLE

ORA-01466: Unable to read data -- Table definition has changed

Cause: This is a time-based read consistency error for a database object, such as a table or index. Either of the following may have happened:

The query was parsed and executed with a snapshot older than the time the object was changed.

The creation time-stamp of the object is greater than the current system time. This happens, for example, when the system time is set to a time earlier than the creation time of the object.

Action: If the cause is an old snapshot, then commit or roll back the transaction and resume work. If the cause is a creation time-stamp in the future, ensure the system time is set correctly. If the object creation time-stamp is still greater than the system time, then export the object's data, drop the object, recreate the object so it has a new creation time-stamp, import the object's data, and resume work.

 

 

The ORA-01466 error occurs when the last DDL time of the table or index is later than the snapshot time used by impdp. In other words, someone may have dropped or truncated the table while you were running the operation.

Or, if you use the FLASHBACK_SCN option, the SCN value supplied to the FLASHBACK_SCN parameter corresponds to a point in time earlier than the LAST_DDL_TIME recorded in DBA_OBJECTS for the specific table.

A table's DDL may have changed because of a truncate job, so LAST_DDL_TIME can end up ahead of the point in time represented by the FLASHBACK_SCN value supplied in the parameter.

 

 


In this example, a developer or business analyst had created a job that ran a TRUNCATE command while the Data Pump job was running.

To solve this error, check whether any DDL job is causing LAST_DDL_TIME to be updated to a newer value. If so, change the execution time of that job.
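If the conflicting truncate is driven by an Oracle Scheduler job, it can be disabled or postponed with the DBMS_SCHEDULER package. The job name below is only a hypothetical illustration; substitute the name of the job you find in DBA_SCHEDULER_JOBS:

```sql
-- Hypothetical job name: adjust to the job found in DBA_SCHEDULER_JOBS.
BEGIN
   -- Option 1: disable the job until the Data Pump operation completes.
   DBMS_SCHEDULER.DISABLE('MSDBA.TRUNCATE_TEST_TABLE_JOB');

   -- Option 2: push the next run time past the Data Pump window instead.
   DBMS_SCHEDULER.SET_ATTRIBUTE(
      name      => 'MSDBA.TRUNCATE_TEST_TABLE_JOB',
      attribute => 'start_date',
      value     => SYSTIMESTAMP + INTERVAL '6' HOUR);
END;
/
```

If you choose to disable the job, remember to re-enable it with DBMS_SCHEDULER.ENABLE after the export or import finishes.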

As a workaround, you can remove the FLASHBACK_SCN parameter until the conflicting DDL job is resolved.

EXPDP syntax:

expdp userid=<LOGIN>
     dumpfile=<DUMP_NAME>_%U.dmp
     logfile=<LOG_NAME>.log              
     filesize=28000m                                    
     flashback_scn=$SCN                              
     parallel=8              
     cluster=N              
     full=y
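The $SCN value above must be a valid SCN captured just before the export starts. Assuming you have SELECT privilege on V$DATABASE, it can be obtained as follows:

```sql
-- Current SCN of the database; pass this value as FLASHBACK_SCN.
SELECT current_scn FROM v$database;
```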

This error can also be generated for the FLASHBACK_TIME parameter.

To determine whether you are facing this issue, execute the following steps:

  1. Check DBA_OBJECTS for LAST_DDL_TIME:
    SQL> select object_name, last_ddl_time from dba_objects where owner='<SCHEMA_NAME>' and object_name='<TABLE_NAME>';

  2. Convert the SCN specified for the export to a timestamp:
    SQL> select SCN_TO_TIMESTAMP(SCN_specified_with_export) from dual;

    If LAST_DDL_TIME is newer than the SCN timestamp, you are facing this issue.

  3. Remove or reschedule the job that is running DDL commands on the failing object.
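Steps 1 and 2 can also be combined into a single query; the substitution variables below are placeholders for your own schema name, table name, and export SCN:

```sql
-- Compare the table's LAST_DDL_TIME with the timestamp of the export SCN.
-- If LAST_DDL_TIME is the later of the two, ORA-01466 is expected.
SELECT o.object_name,
       o.last_ddl_time,
       SCN_TO_TIMESTAMP(&scn_specified_with_export) AS scn_time
  FROM dba_objects o
 WHERE o.owner       = '&schema_name'
   AND o.object_name = '&table_name';
```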

 

 

If you want to learn Oracle Data Pump (Export/Import), click this link and read the articles.

Oracle Data Pump – Export Import ( Expdp Impdp ) Tutorial with Examples-4


About Mehmet Salih Deveci

I am the founder of SysDBASoft IT and IT Tutorial, and a certified expert in Oracle and SQL Server databases, Goldengate, Exadata, and Oracle Database Appliance administration with 10+ years of experience. I hold OCA, OCP, and OCE RAC Expert certificates. I have worked with 100+ banking, insurance, finance, telco, and other clients as a consultant, insourced, or outsourced. I have performed 200+ operations for these clients, such as Exadata installation, PoC, migration and upgrade, Oracle and SQL Server database upgrades, Oracle RAC installation, SQL Server AlwaysOn installation, database migration, disaster recovery, backup and restore, performance tuning, and periodic health checks. I have replicated 2000+ tables with Goldengate or SQL Server Replication for DWH databases at many clients. If you need Oracle DBA, SQL Server DBA, APPS DBA, Exadata, Goldengate, or EBS consultancy and training, you can send a mail to [email protected].
