The source is a flat file, and the requirement is to load unique records and duplicate records separately into two different targets. How can this be done?
Answers were Sorted based on User's Feedback
Answer / nitin
Create the mapping as below to load unique records and duplicate records into separate targets:
Source -> SQ -> Sorter -> Aggregator -> Router -> Tgt_Unique
                                               -> Tgt_Duplicate
In the Aggregator, group by all ports and define an output port OUTPUT_COUNT = COUNT(*).
In the Router, define two groups, OUTPUT_COUNT > 1 and OUTPUT_COUNT = 1. Connect the output of the OUTPUT_COUNT > 1 group to Tgt_Duplicate and the output of the OUTPUT_COUNT = 1 group to Tgt_Unique.
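The answer describes an Informatica mapping, but as a minimal Python sketch of the same routing logic (the function name and tuple-per-row representation are assumptions for illustration, not part of the original answer): group by all ports, count each group, send count = 1 groups to the unique target and count > 1 groups to the duplicate target, one row per group, just as the Aggregator collapses each group to a single output row.

```python
from collections import Counter

def route_unique_and_duplicates(rows):
    """Mimic Sorter -> Aggregator (group by all ports, count) -> Router.

    `rows` is an iterable of tuples, one tuple per source record.
    Groups seen exactly once go to the unique target; groups seen more
    than once go to the duplicate target (one row per group, as the
    Aggregator emits a single row per group).
    """
    counts = Counter(tuple(r) for r in rows)          # OUTPUT_COUNT per group
    tgt_unique = [r for r, n in counts.items() if n == 1]
    tgt_duplicate = [r for r, n in counts.items() if n > 1]
    return tgt_unique, tgt_duplicate

# Example: (1,), (2,), (1,) -> unique = [(2,)], duplicate = [(1,)]
print(route_unique_and_duplicates([(1,), (2,), (1,)]))
```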
| Is This Answer Correct ? | 1 Yes | 0 No |
Answer / ankit kansal
Hi,
What I understand from your problem is this: if your source contains 1, 2, 1, 2, 3, then only 3 is taken as unique, and 1 and 2 are considered duplicate values.
SRC -> SQ -> SRT -> EXP (to set flags for duplicates) -> ROUTER -> JOINER -> EXP -> RTR -> 2 targets
http://deepinopensource.blogspot.in/
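A small Python sketch of that interpretation (not from the original answer, and the function name is hypothetical): every occurrence of a value that repeats goes to the duplicate target, and only values occurring exactly once go to the unique target, which is what joining the flagged/aggregated stream back to the original rows achieves in the mapping.

```python
from collections import Counter

def split_by_occurrence(values):
    """For source 1, 2, 1, 2, 3: unique -> [3], duplicate -> [1, 2, 1, 2]."""
    counts = Counter(values)                               # occurrences per value
    tgt_unique = [v for v in values if counts[v] == 1]     # values seen exactly once
    tgt_duplicate = [v for v in values if counts[v] > 1]   # keep every repeated occurrence
    return tgt_unique, tgt_duplicate

print(split_by_occurrence([1, 2, 1, 2, 3]))   # ([3], [1, 2, 1, 2])
```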
| Is This Answer Correct ? | 1 Yes | 1 No |
Answer / mohank106
Refer to the link below; the answer is explained clearly there.
http://www.bullraider.com/database/informatica/scenario/11-informatica-scenario3
| Is This Answer Correct ? | 0 Yes | 0 No |
Answer / rani
Take a Source Qualifier, then place a Sorter transformation, select the Distinct option in the Sorter, and load the result into Unique_target.
Take a Lookup transformation, look up on the target, and compare it with the source; when a record occurs more than once, delete that record from the target using an Update Strategy with DD_DELETE (2) and load it into Duplicate_target. Alternatively, using the source in another pipeline, take an unconnected Lookup and write a lookup override like COUNT(*) ... HAVING COUNT(*) > 1, then load those records into Duplicate_target.
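Again only an illustrative Python sketch under assumed names, not the original mapping: the Sorter's Distinct option corresponds to loading each distinct row once into Unique_target, and the lookup override with COUNT(*) ... HAVING > 1 corresponds to selecting the rows that occur more than once for Duplicate_target.

```python
from collections import Counter

def sorter_distinct_then_having(rows):
    """Unique_target gets each distinct row once; Duplicate_target gets
    the rows whose count is greater than one (COUNT(*) ... HAVING > 1)."""
    counts = Counter(rows)
    unique_target = list(counts)                               # distinct rows, first-seen order
    duplicate_target = [r for r, n in counts.items() if n > 1]
    return unique_target, duplicate_target

print(sorter_distinct_then_having([(1, 'a'), (2, 'b'), (1, 'a')]))
```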
| Is This Answer Correct ? | 0 Yes | 2 No |
In a table there are 1 million records, of which 3 records are duplicates. How will you find those 3 records?
In how many ways can a relational source definition be updated, and what are they?
Why do we use session partitioning in Informatica?
How do you use a PDF file as a source in Informatica?
A flat file has 1 lakh records and I want to push 52,000 records to the target. How can this be done?
What is the cumulative sum and moving sum?
What do you mean by incremental aggregation?
Does Informatica PowerCenter Designer have 64-bit support?
Does the Aggregator transformation ignore null values or consider them? Thanks in advance, Manojkumar
If the source has duplicate records with id and name columns (values: 1 a, 1 b, 1 c, 2 a, 2 b) and the target should be loaded as 1 a+b+c or 1 a||b||c, what transformations should be used for this?
Under which conditions can we not use the Joiner transformation (limitations of the Joiner transformation)?
1) What is the use of identifying bottlenecks in Informatica? 2) Where do we use shell scripting? 3) What is meant by Informatica?