If we just want to know how much disk space a dump will need, without actually exporting any data, we can use the ESTIMATE_ONLY=y parameter on the expdp command. With ESTIMATE_ONLY=y, Data Pump estimates the required space table by table and does not write a dump file. The demo below shows how the system user can view the disk space estimate for an export of the hr and scott schemas.
C:\>expdp system/ramtech@noida directory=dpump schemas=scott,hr estimate_only=y
Export: Release 11.1.0.6.0 - Production on Friday, 15 April, 2011 18:56:03
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_SCHEMA_01": system/********@noida directory=dpump schemas=scott,hr logfile=scott_hr.log estimate_only=y
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. estimated "HR"."COUNTRIES" 64 KB
. estimated "HR"."DEPARTMENTS" 64 KB
. estimated "HR"."EMPLOYEES" 64 KB
. estimated "HR"."JOBS" 64 KB
. estimated "HR"."JOB_HISTORY" 64 KB
. estimated "HR"."LOCATIONS" 64 KB
. estimated "HR"."REGIONS" 64 KB
. estimated "SCOTT"."DEPT" 64 KB
. estimated "SCOTT"."EMP" 64 KB
. estimated "SCOTT"."SALGRADE" 64 KB
. estimated "SCOTT"."BONUS" 0 KB
Total estimation using BLOCKS method: 640 KB
Job "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully completed at 18:57:38
Now we can see that about 640 KB of disk space is needed to export the scott and hr schemas (ten tables estimated at 64 KB each, plus the empty SCOTT.BONUS table).
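For comparison, the actual export of the same schemas can then be run and the size of the resulting dump file checked against the estimate. A minimal sketch, reusing the same directory and schemas as above (the dump file name here is just an example):
C:\>expdp system/ramtech@noida directory=dpump schemas=scott,hr dumpfile=scott_hr.dmp logfile=scott_hr.log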
Note:
- The ESTIMATE_ONLY parameter cannot be used in conjunction with the QUERY or DUMPFILE parameters.
- The ESTIMATE parameter does not give the exact size; it only gives an approximate size for the dump file.
- If the Data Pump export job involves compressed tables, the default size estimation given for the compressed tables is inaccurate when ESTIMATE=BLOCKS is used. This is because the size estimate does not reflect that the data was stored in a compressed form. To get a more accurate size estimate for compressed tables, use ESTIMATE=STATISTICS (see the example after these notes).
- The estimate may also be inaccurate if the QUERY, SAMPLE, or REMAP_DATA parameter is used.
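As mentioned in the notes above, the estimation method can be switched from the default BLOCKS to STATISTICS. A minimal sketch, reusing the same directory and schemas as the demo above:
C:\>expdp system/ramtech@noida directory=dpump schemas=scott,hr estimate_only=y estimate=statistics
Keep in mind that the STATISTICS method relies on the optimizer statistics of each table, so the estimate is only as good as the most recent statistics gathering on those tables.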
Enjoy :-)
2 comments:
This approach may not give you the exact size since it uses the BLOCKS method; you can use the STATISTICS method, which should be more accurate, no?
Refer to ESTIMATE={BLOCKS | STATISTICS}
Hi Vasudeva,
It all depends on the conditions. Neither method gives the exact size, but in general the BLOCKS method gives a reasonable approximation.
I have done some experiments and come to the conclusion that the actual dump size may vary by up to 50% from the estimate; when the dump file is small there may be more deviation, but for large files the deviation keeps decreasing.
I have added a few notes on this.
Your comments are very useful for me.
Thanks, and good times ahead. :) :)