
SQL DUMP missing information in a column


On Database A, I create test.dmp via the Oracle exp.exe command:

exp.exe %CONNECT% FILE=test.dmp LOG=%LOGFILE% DIRECT=Y STATISTICS=NONE

The .log file reports:

. . export table TBL_TEST 7000 rows exported (no error!). 

Version: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production

On Database B, I import test.dmp via the imp.exe command:

imp.exe %CONNECT% file=test.dmp LOG=%LOGFILE% FULL=Y

In the .log file I see this error:

. . importing table TBL_TEST
IMP-00019: row rejected due to ORACLE error 1400
IMP-00003: ORACLE error 1400 encountered
ORA-01400: cannot insert NULL into (TBL_TEST.COLUMN_A)

Version: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production

TBL_TEST.COLUMN_A is of type NUMBER(1,0), with Nullable = No and DATA_DEFAULT = 0.

How can this be possible? On Database A the column COLUMN_A is populated and also defined as NOT NULL. All the tables were recreated from scratch on Database B.

Do you have any idea why this can happen?

Thank you in advance,

Luisa Bradusca


Solution

  • This looks like the column in question was added to an existing table which already had data in it. In previous releases that could have been quite slow: without a default value specified, the column would have been added, its value set for all existing rows via an update, and only then could the NOT NULL constraint be added. With a default value the update would happen automatically, but it would still have taken time.

    In 11g Oracle takes that pain away when you specify a default value. That makes the column addition almost instant and is transparent to queries (see the SQL sketch at the end of this answer). This is explained in the documentation:

    If you specify the DEFAULT clause for a NOT NULL column, then the default value is stored as metadata but the column itself is not populated with data. However, subsequent queries that specify the new column are rewritten so that the default value is returned in the result set.

    The legacy exp tool doesn't understand that, though; it exports the values actually stored in the columns, which will be null for the old rows.

    You need to use the Data Pump export tool, expdp, instead - that will include the default value in the dumped data, so it will be inserted into the Database B table correctly. And then you need to use impdp for the import, of course (example commands below).
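
    For example, assuming the standard DATA_PUMP_DIR directory object exists on both servers and your user can read from and write to it (unlike exp, Data Pump reads and writes dump files on the database server, not the client); %CONNECT% is the same connect-string variable used in the question:

    rem Database A: DATA_PUMP_DIR must point to a server-side path the export can write to
    expdp %CONNECT% DIRECTORY=DATA_PUMP_DIR DUMPFILE=test.dmp LOGFILE=exp_test.log TABLES=TBL_TEST

    rem Database B: copy test.dmp into that server's DATA_PUMP_DIR first, then:
    impdp %CONNECT% DIRECTORY=DATA_PUMP_DIR DUMPFILE=test.dmp LOGFILE=imp_test.log FULL=Y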
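
    And as a minimal sketch of the metadata-only default behaviour described above (the table definition is illustrative, chosen to mirror the question, not taken from Database A):

    CREATE TABLE tbl_test (id NUMBER PRIMARY KEY);

    -- a pre-existing row, inserted before the column is added
    INSERT INTO tbl_test (id) VALUES (1);
    COMMIT;

    -- in 11g this is a metadata-only change: the existing row is not
    -- rewritten with the value 0; the default is only stored as metadata
    ALTER TABLE tbl_test ADD (column_a NUMBER(1,0) DEFAULT 0 NOT NULL);

    -- queries are rewritten to return the default, so this shows 0 for COLUMN_A
    SELECT id, column_a FROM tbl_test;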