1.1 Check Knowledge Modules (CKM)
- Data quality control is essential to ensuring the overall consistency of the data in your information system's applications.
- Application data does not always satisfy the constraints and declarative rules imposed by the information system. You may, for instance, find orders with no customer, or order lines with no product. In addition, such incorrect data may propagate via integration flows.
- The CKM can be used in two ways:
- Check the consistency of existing data. This can be done on any data store or within interfaces, by setting the STATIC_CONTROL option to "Yes".
- Check consistency of the incoming data before loading the records to a target data store. This is done by using the FLOW_CONTROL option. In this case, the CKM simulates the constraints of the target data store on the resulting flow prior to writing to the target.
- In summary: the CKM can check either an existing table or the temporary "I$" table created by an IKM.
- The CKM accepts a set of constraints and the name of the table to check. It creates an "E$" error table to which it writes all the rejected records. The CKM can also remove the erroneous records from the checked result set.
- The following figures show how a CKM operates in both STATIC_CONTROL and FLOW_CONTROL modes.
Check Knowledge Module (STATIC_CONTROL)
In STATIC_CONTROL mode, the CKM reads the constraints of the table and checks them against the data of the table. Records that don't match the constraints are written to the "E$" error table in the staging area.
Check Knowledge Module (FLOW_CONTROL)
In FLOW_CONTROL mode, the CKM reads the constraints of the target table of the Interface. It checks these constraints against the data contained in the "I$" flow table of the staging area. Records that violate these constraints are written to the "E$" table of the staging area.
In both cases, a CKM usually performs the following tasks:
- Create the "E$" error table on the staging area. The error table should contain the same columns as the data store as well as additional columns to trace error messages, check origin, check date etc.
- Isolate the erroneous records in the "E$" table for each primary key, alternate key, foreign key, condition, mandatory column that needs to be checked.
- If required, remove erroneous records from the table that has been checked.
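To make this concrete, the lines below sketch the kind of SQL a CKM generates for a single mandatory-column check. All names used here (MY_TABLE, E$_MY_TABLE, KEY_COL, MANDATORY_COL, ERR_MESS) are illustrative assumptions; the real statements, error columns, and messages are produced by the KM and differ between ODI versions.
-- hypothetical mandatory-column check as a CKM might generate it
INSERT INTO E$_MY_TABLE (KEY_COL, MANDATORY_COL, ERR_MESS)
SELECT KEY_COL, MANDATORY_COL, 'MANDATORY_COL must not be null'
FROM MY_TABLE
WHERE MANDATORY_COL IS NULL;
-- in FLOW_CONTROL mode the SELECT would read from the "I$" flow table instead of MY_TABLE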
Exercise 1: Static Control on Source table
- The CKM is a model-level property, so we need to import a CKM and assign it to the model.
STEP 1: Importing the CKM
- Open the project, right-click on Check (CKM) → click on Import Knowledge Module
- Select CKM Oracle → click OK
STEP 2: Assigning the CKM to the Model
- Right-click on the source model → click Open → select the Control tab → select CKM Oracle as the Knowledge Module
- save → close
STEP 3: Checking the COMM column for NOT NULL values
- Create a table using the command below
- create table B107EMP as select * from emp
- Reverse engineer this table into the source model
- Expand the table → expand Attributes → open the COMM column → select the Mandatory and Static options
- save → close
- Right-click on this table → click Control → click Check → select the context properly → click OK → again OK
- Go to the database and notice that two tables have been created
- SNP_CHECK_TAB
- E$_B107EMP
- Observe that all records with a NULL COMM are copied into the E$_B107EMP table (the queries below are one way to verify)
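A minimal way to verify the result in SQL Developer, assuming the error table is named E$_B107EMP as above; the extra error-tracking columns (error message, check date, etc.) differ between ODI versions, so SELECT * is the safest way to see them.
-- rows rejected by the static check
select * from E$_B107EMP;
-- every rejected row should have a NULL COMM
select count(*) from E$_B107EMP where comm is null;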
Exercise 2: Static Control using a Package on a Data Store
- Create a package named CKM_STATIC_CONTROL
- Drag and drop the B107EMP datastore into the package
- Select the B107EMP datastore and set the Type to Datastore Check
- Check the Delete Errors from the Checked Table option
- save → run
- Observe the output (the queries below are one way to verify that the erroneous rows were removed)
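Because the Delete Errors from the Checked Table option is enabled, the rows with a NULL COMM should now be removed from B107EMP itself. A quick sanity check, using the table names from Exercise 1:
-- should return 0 after the package run
select count(*) from B107EMP where comm is null;
-- the removed rows remain available in the error table
select count(*) from E$_B107EMP;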
Exercise 3: Understanding Flow Control
Step 1: Prepare source data
- Open SQL Developer → connect to the source connection → double-click on the table B107EMP → go to the Data tab
- Insert the records shown below → COMMIT
INSERT INTO "SCOTT"."B108EMP" (EMPNO, ENAME, JOB, MGR, HIREDATE, SAL, COMM, DEPTNO) VALUES ('5555', 'RRITEC', 'Training', '7698', TO_DATE('08-SEP-81', 'DD-MON-RR'), '1500', '100', '20');
INSERT INTO "SCOTT"."B108EMP" (EMPNO, ENAME, JOB, MGR, HIREDATE, SAL, COMM, DEPTNO) VALUES ('5555', 'RRITEC', 'Training', '7698', TO_DATE('08-SEP-81', 'DD-MON-RR'), '1500', '100', '20');
INSERT INTO "SCOTT"."B108EMP" (EMPNO, ENAME, JOB, MGR, HIREDATE, SAL, COMM, DEPTNO) VALUES ('12', 'John', 'Manager', '7782', TO_DATE('08-SEP-81', 'DD-MON-RR'), '1500', '100', '20');
INSERT INTO "SCOTT"."B108EMP" (EMPNO, ENAME, JOB, MGR, HIREDATE, SAL, COMM, DEPTNO) VALUES ('6666', 'Nancy', 'Manager', '7782', TO_DATE('08-SEP-81', 'DD-MON-RR'), '1500', '100', '60');
Step 2: Create the target table
- Open the target schema B107TDBU and execute the script below
create table TGT_B107EMP as select * from scott.emp where 1=2
- Reverse engineer it into the target model
Step 3: Creating Constraints
- Open the target model and navigate to the target table TGT_B107EMP
- Right-click on Constraints → click on New Key
- In the Description, name it PK_EMPNO and select the Primary Key option
- Click on the Attributes tab → select EMPNO → click on >
- save → close
- Right-click on Constraints → click on New Reference
- In the Definition tab, provide the information shown below
- In the Columns tab, select DEPTNO as shown below
- Save → close
- Again right-click on Constraints → select New Condition
- Provide the information shown below (a DDL sketch of all three constraints follows this step)
- Save→ close
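For reference only, the three model constraints defined above correspond roughly to the database DDL sketched below. This is an assumption-laden sketch: the parent table of the reference and the expression of the condition come from the screenshots and are only guessed here, and in this exercise the constraints live only in the ODI model, so there is no need to run these statements.
-- assumed equivalents of the ODI constraints; not part of the exercise
ALTER TABLE TGT_B107EMP ADD CONSTRAINT PK_EMPNO PRIMARY KEY (EMPNO);
ALTER TABLE TGT_B107EMP ADD CONSTRAINT FK_DEPTNO FOREIGN KEY (DEPTNO) REFERENCES DEPT (DEPTNO); -- assumed parent table
ALTER TABLE TGT_B107EMP ADD CONSTRAINT CK_SAL CHECK (SAL > 0); -- assumed condition expression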
Step 4: Creating the Mapping
- Create a mapping named m_CKM_FLOW_CONTROL
- Drag and drop the source B107EMP and the target TGT_B107EMP
- Map the corresponding columns
- In the Flow tab, select an IKM and set the FLOW_CONTROL option to True
- Select the Control tab and observe the conditions
- Click on RUN → observe the output (the queries below are one way to verify the result)
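A minimal verification, assuming the error table created for the target is named E$_TGT_B107EMP; the error-tracking column names again vary by ODI version.
-- valid rows should have reached the target
select count(*) from TGT_B107EMP;
-- rejected rows (duplicate EMPNO, DEPTNO with no parent, ...) should land here
select * from E$_TGT_B107EMP;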