Friday, May 27, 2011

What is Data Masking?


Data masking is the process of protecting sensitive information in non-production databases from inappropriate visibility. After sanitization, the database remains perfectly usable - the look-and-feel is preserved - but the information content is secure. The Data Masker software provides a simple, repeatable and "push-button" method of scrambling data in test systems.

It is important to be aware that data masking applies to more than just personal details – sometimes business-confidential information should be masked as well. For example, it may be desirable to prevent quarterly sales figures for certain products from appearing in an outsourced test database.

Why Mask Data?
1.) Legal Requirements : The regulatory environment surrounding a data holder's duties and obligations to protect the information they maintain is becoming increasingly rigorous in just about every legal jurisdiction. It is a safe assumption that the standards for the security and maintenance of data will only become stricter in the future.

2.) Loss of Confidence and Public Relations Disasters : It can reasonably be said that, in most locations, if a data escape happens at our organization, the formal legal sanctions applied by governmental bodies are not the only problem we will face. They may not even be the biggest of our immediate worries.

3.) Malicious Exposure :  Most people think the major risk to the information they hold comes from external entities (and organized syndicates) out to break in and steal the data. The assumption then follows that protecting the network with firewalls is the appropriate and sufficient response. There is no denying that such protection is necessary; however, it has been shown that in many cases the data is stolen by malicious insiders who have been granted access to it.

4.) Accidental Exposure :  The risk of accidentally exposing information is often neglected when considering the security risks associated with real test data. It is often thought that “there is no point in masking the test data because everybody has access to production anyway”. Not so: the risks associated with an accidental exposure of the data remain. Often, masking just the most sensitive information (credit card numbers, customer email addresses, etc.) is enough to mitigate much of the damage associated with an accidental exposure, and the masked database remains just as functional.

What Data to Mask:

I.) Light Masking on a Bug-Fix or Fire-Fighting Database.

II.) Medium Masking on Internal Development Databases.

III.) Thorough Masking on an Outsourced Database.

Is My Data Too Complex to be Masked?
Not likely. Data Masker handles even the most intricate data structures. It can preserve data relationships between rows in different tables, between rows in the same table, or even between columns within the same row. Data synchronization issues of this type can be handled automatically by adding simple, easily configured masking rules. Don't worry about the size of the data either - Data Masker has successfully masked tables containing hundreds of millions of rows.
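As a minimal illustration of the idea (plain SQL, not the Data Masker product itself, and with a hypothetical customers table), masked values can be derived from the row's key so that related columns stay consistent:

-- Derive both masked values from the same key (customer_id) so that
-- columns within a row, and joins across related tables, stay consistent.
update customers
   set email   = 'user' || customer_id || '@example.com',
       card_no = '4000-0000-0000-' || lpad(to_char(mod(customer_id, 10000)), 4, '0');
commit;

Because the masked values are deterministic functions of the key, any other table holding the same email or card number can be masked with the same formula and the relationships survive.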

What Platforms and Databases does Data Masker Support?
Oracle, SQL Server and DB2 UDB versions are available. The Data Masker software is installed on a Windows PC and operates on both local and remote databases. There are no server-side components for any Data Masker version. Oracle versions 9i, 10g and 11g, SQL Server versions 2000, 2005 and 2008, and DB2 UDB (LUW) versions 8.2 or greater are supported.

Enjoy     :-)




ORA-04030: Out of Process Memory

ORA-04030 indicates that an Oracle server process is unable to allocate more memory from the operating system. This memory consists of the PGA (Program Global Area), and its contents depend upon the server configuration. For dedicated server processes, it contains the stack and the UGA (User Global Area), which holds user session data, cursor information and the sort area. In a multithreaded configuration (shared server), the UGA is allocated in the SGA (System Global Area) and will not be responsible for ORA-04030 errors.
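Before digging further, it is worth checking how much PGA the instance is actually using. A quick sketch against the standard dynamic performance views (for instances using automatic PGA management):

SQL> show parameter pga_aggregate_target
SQL> select name, round(value/1024/1024) as mb from v$pgastat where name in ('total PGA allocated', 'maximum PGA allocated');

Comparing the maximum allocated figure against the target (and against the physical memory on the box) gives a first idea of whether the instance is simply oversubscribed.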

In my case, the error occurred when a client program connected to the Oracle database: the server process could not complete its work and threw ORA-04030: out of process memory.

After getting this error, we troubleshot it, and after some googling we finally came to the conclusion that it was caused by a bug. In the end we had to engage Oracle Support for this error.

This error can occur for various reasons, which are listed below:

1.) The Oracle process needs more memory to serve the client program's request and asks the operating system for additional memory, but the operating system cannot satisfy the request. This is likely to happen when the OS does not have enough free memory or swap space available.

2.) The OS places a limit on the amount of memory the Oracle process may use. The Oracle process has already reached that limit, and the request for additional memory therefore fails with this error.

3.) It might be an Oracle bug.
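Before settling on the bug explanation, it can help to identify which sessions hold the most PGA at the time of the error. A sketch joining the standard v$session and v$process views:

SQL> select s.sid, s.username, s.program, round(p.pga_max_mem/1024/1024) as max_pga_mb
     from v$session s, v$process p
     where s.paddr = p.addr
     order by p.pga_max_mem desc;

Sessions at the top of this list are the first candidates for a runaway cursor, a huge sort area, or a PL/SQL collection that keeps growing.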

Below are a few links which are quite useful for understanding this error.

http://orainternals.wordpress.com/2009/08/06/ora-4031-and-shared-pool-duration/
http://jhdba.wordpress.com/2009/11/13/problems-with-sga-a-multiiple-of-4gb-and-high-cpu-count/
http://www.tek-tips.com/viewthread.cfm?qid=835937
http://gdesaboyina.wordpress.com/2009/12/04/ora-04030-out-of-process-memory-when-trying-to-allocate-840-bytes-kgsp-heapkglss/


Enjoy      :-)


REMAP_TABLE Parameter of Data Pump in Oracle 11g


Oracle 11g Data Pump provides a new REMAP_TABLE parameter to remap table data to a new table name on the target database. We can use the REMAP_TABLE parameter to rename entire tables.
Syntax:
REMAP_TABLE=[schema.]old_tablename[.partition]:new_tablename
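For instance (with hypothetical schema, table and partition names), the parameter can remap a whole table or, using the partition form of the syntax, a single partition:

REMAP_TABLE=hr.sales:sales_copy
REMAP_TABLE=hr.sales.p_2010:sales_2010

Substitute your own names; these lines are purely illustrative of the syntax above.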

In 10g Data Pump, we use the REMAP_SCHEMA parameter to remap the schema name during the import, or the FROMUSER and TOUSER parameters in the original import utility. There is no parameter to remap table names. This means that 10g Import Data Pump can only import data into a table with the same name as the original table.
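For comparison, a typical 10g-style schema remap (the connection details and names here are placeholders) looks like this; note that it changes only the owning schema, never the table name:

impdp system/password@db10g dumpfile=hr.dmp logfile=imp_hr.log remap_schema=hr:scott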

In 10g, if we have to import a table into a database that already contains a table of the same name and structure, we can handle it in one of two ways.

I.) Temporarily rename the original source table before exporting, then rename it back afterwards.

II.) If the original source table cannot be renamed, follow the steps below (a rough sketch appears after this list):
a.) Import the dump into another schema.
b.) Rename the table.
c.) Export the table again.
d.) Finally, import the renamed table.
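As a rough sketch of workaround II (all names here are hypothetical: a scratch schema tmp, and the table test being renamed to newtest):

impdp tmp/tmp@db10g dumpfile=hr_test.dmp logfile=imp_tmp.log remap_schema=hr:tmp

SQL> conn tmp/tmp@db10g
SQL> alter table test rename to newtest;

expdp tmp/tmp@db10g dumpfile=tmp_newtest.dmp logfile=exp_tmp.log tables=newtest
impdp hr/hr@db10g dumpfile=tmp_newtest.dmp logfile=imp_hr.log remap_schema=tmp:hr

The 11g REMAP_TABLE parameter collapses all of this into a single import, as the demo below shows.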

REMAP_TABLE allows us to rename tables during an import operation. Here is a demo of REMAP_TABLE:

Here, we will create a table, take an export of it, and import it back into the same schema. In this scenario we have a table named "test" and we will rename it to "newtest".

1.) Create a table  "test"

SQL> conn hr/hr@noida
Connected.
SQL> create table test(id number);
Table created.
SQL> insert into test values (1);
1 row created.
SQL> insert into test values (2);
1 row created.
SQL> insert into test values (3);
1 row created.
SQL> insert into test values (4);
1 row created.
SQL> commit;
Commit complete.

SQL> select * from test;
        ID
----------
         1
         2
         3
         4

2.) Export the table "test"

SQL> host expdp hr/hr@noida dumpfile=hr_test.dmp logfile=hrtestlog.log tables=test

Export: Release 11.2.0.1.0 - Production on Fri May 27 11:20:43 2011
Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "HR"."SYS_EXPORT_TABLE_01":  hr/********@noida dumpfile=hr_test.dmp logfile=hrtestlog.log tables=test
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
. . exported "HR"."TEST"                                 5.031 KB       4 rows
Master table "HR"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for HR.SYS_EXPORT_TABLE_01 is:
  D:\APP\NEERAJS\ADMIN\NOIDA\DPDUMP\HR_TEST.DMP
Job "HR"."SYS_EXPORT_TABLE_01" successfully completed at 11:21:16

Since we now have the dump of the table "test", we import it into the HR schema with the new name "newtest".

3.) Import the dump with remap_table Parameter

SQL> host impdp hr/hr@noida dumpfile=hr_test.dmp logfile=imphrtestlog.log remap_table=hr.test:newtest
Import: Release 11.2.0.1.0 - Production on Fri May 27 11:22:11 2011
Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "HR"."SYS_IMPORT_FULL_04" successfully loaded/unloaded
Starting "HR"."SYS_IMPORT_FULL_04":  hr/********@noida dumpfile=hr_test.dmp logfile=imphrtestlog.log remap_table=hr.test:newtest
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "HR"."NEWTEST"                              5.031 KB       4 rows
Job "HR"."SYS_IMPORT_FULL_04" successfully completed at 11:22:25

Since the job completed successfully, we check the imported table, i.e. "newtest":

SQL> select * from tab;

TNAME                          TABTYPE  CLUSTERID
------------------------------ ------- ----------
COUNTRIES                      TABLE
DEPARTMENTS                    TABLE
EMPLOYEES                      TABLE
EMP_DETAILS_VIEW               VIEW
JOBS                           TABLE
JOB_HISTORY                    TABLE
LOCATIONS                      TABLE
NEWTEST                        TABLE
REGIONS                        TABLE
SYS_IMPORT_FULL_01             TABLE
SYS_IMPORT_FULL_02             TABLE
SYS_IMPORT_FULL_03             TABLE
TEST                           TABLE

13 rows selected.

SQL> select * from newtest;
        ID
----------
         1
         2
         3
         4

Note: Only objects created by the import will be remapped. In particular, pre-existing tables will not be remapped if TABLE_EXISTS_ACTION is set to TRUNCATE or APPEND.
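For example, suppose NEWTEST already existed before the import and we ran a hypothetical command like this (extending the demo above):

impdp hr/hr@noida dumpfile=hr_test.dmp logfile=imp_app.log remap_table=hr.test:newtest table_exists_action=append

Per the restriction above, the pre-existing table would not be remapped, so the command would not behave the way a quick reading of its REMAP_TABLE clause suggests.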

Enjoy   :-)