The source is a flat file, and we want to load unique and duplicate
records separately into two separate targets. How can this be done?





Answer / nitin

Create the mapping below to load unique records and duplicate records into separate targets:

Source -> SQ -> Sorter -> Aggregator -> Router -> Tgt_Unique
                                               -> Tgt_Duplicate

In the Aggregator, group by all ports and define an output port OUTPUT_COUNT = COUNT(*).
In the Router, define two groups: OUTPUT_COUNT > 1 and OUTPUT_COUNT = 1. Connect the OUTPUT_COUNT > 1 group to Tgt_Duplicate and the OUTPUT_COUNT = 1 group to Tgt_Unique.
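Note that grouping by all ports collapses each set of identical rows into one group, so the Router sees one row per distinct record. A minimal Python sketch of that Aggregator-plus-Router logic (illustrative only; not Informatica code):

```python
from collections import Counter

def split_by_count(rows):
    """Mimic the Aggregator (group by all ports, OUTPUT_COUNT = COUNT(*))
    feeding a Router: one row per distinct record, routed by its count."""
    counts = Counter(rows)  # one group per distinct row, like group-by-all-ports
    tgt_unique = [r for r, c in counts.items() if c == 1]    # OUTPUT_COUNT = 1
    tgt_duplicate = [r for r, c in counts.items() if c > 1]  # OUTPUT_COUNT > 1
    return tgt_unique, tgt_duplicate

src = [(1, 'a'), (2, 'b'), (1, 'a'), (3, 'c')]
print(split_by_count(src))  # ([(2, 'b'), (3, 'c')], [(1, 'a')])
```

With this design the duplicate target receives a single copy of each duplicated record, not every occurrence.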



Answer / ankit kansal

Hi,
What I understand from your problem is this: if your source contains 1, 2, 1, 2, 3, then only 3 is taken as unique, and 1 and 2 are considered duplicate values.

SRC -> SQ -> SRT -> EXP (set flags for duplicates) -> ROUTER -> JOINER -> EXP -> RTR -> 2 TGTS
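Under this interpretation every occurrence of a repeated value goes to the duplicate target. A short Python sketch of that flag-and-route semantics (an illustration of the logic, not of the Informatica transformations themselves):

```python
from collections import Counter

def route_all_rows(rows):
    """Count each value, then route every occurrence: values seen exactly
    once go to the unique target; every copy of a repeated value goes to
    the duplicate target."""
    counts = Counter(rows)
    tgt_unique = [r for r in rows if counts[r] == 1]
    tgt_duplicate = [r for r in rows if counts[r] > 1]
    return tgt_unique, tgt_duplicate

print(route_all_rows([1, 2, 1, 2, 3]))  # ([3], [1, 2, 1, 2])
```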

http://deepinopensource.blogspot.in/



Answer / mohank106

Refer to the link below; the answer is explained clearly there.

http://www.bullraider.com/database/informatica/scenario/11-informatica-scenario3



Answer / rani

Take a Source Qualifier, then place a Sorter transformation, select the Distinct option in the Sorter, and load the result into Unique_target.

In another pipeline, take a Lookup transformation on the target and compare it with the source; when a record occurs more than once, delete it from the target using an Update Strategy with DD_DELETE (2) and load it into Duplicate_target. Alternatively, take an unconnected Lookup and write a lookup override along the lines of count(*) ... having > 1, then load those records into Duplicate_target.
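In this approach the unique target gets one copy of every row (Sorter with Distinct), while the duplicate target gets the values that repeat (the count(*) ... having > 1 idea). A hedged Python sketch of that two-pipeline split, assuming rows are hashable values:

```python
from collections import Counter

def distinct_and_dups(rows):
    """Sketch of the two-pipeline idea: a Sorter with Distinct feeds the
    unique target (one copy of every row), while a count > 1 check picks
    the repeated values for the duplicate target."""
    unique_target = list(dict.fromkeys(rows))  # Distinct: keep first copy of each row
    counts = Counter(rows)
    duplicate_target = [r for r, c in counts.items() if c > 1]  # count(*) > 1
    return unique_target, duplicate_target

print(distinct_and_dups([1, 2, 1, 2, 3]))  # ([1, 2, 3], [1, 2])
```

Note this differs from the first answer: here a duplicated value appears in both targets, since Distinct keeps one copy of every row.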


