

The source is a flat file, and we want to load the unique records and the duplicate
records separately into two separate targets. How can this be done?

Answers were sorted based on users' feedback




Answer / nitin

Create the mapping below to load the unique records and the duplicate records into separate targets:

Source -> SQ -> Sorter -> Aggregator -> Router -> Tgt_Unique
                                               -> Tgt_Duplicate

In the Aggregator, group by all ports and define an output port OUTPUT_COUNT = COUNT(*).
In the Router, define two groups, OUTPUT_COUNT > 1 and OUTPUT_COUNT = 1. Connect the output of the
OUTPUT_COUNT > 1 group to Tgt_Duplicate and the output of the OUTPUT_COUNT = 1 group to Tgt_Unique.
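
For readers who want to see the routing logic outside the Designer, here is a minimal Python sketch of the same idea; the file name, column handling, and target names are illustrative assumptions, not part of the original answer:

```python
from collections import Counter
import csv

# Read all rows from the flat-file source (hypothetical file name).
with open("source.txt", newline="") as f:
    rows = [tuple(r) for r in csv.reader(f)]

# Aggregator stage: group by all ports and count occurrences per group.
counts = Counter(rows)

# Router stage: route each group by its count.
# Note that, like an Aggregator, this emits one row per group.
tgt_unique = [row for row, n in counts.items() if n == 1]
tgt_duplicate = [row for row, n in counts.items() if n > 1]

print(len(tgt_unique), "unique rows;", len(tgt_duplicate), "duplicate groups")
```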

Is This Answer Correct ?    1 Yes 0 No


Answer / ankit kansal

Hi,
What I understand from your problem is: if the source contains 1, 2, 1, 2, 3, then only 3 is taken as unique, and 1 and 2 are both considered duplicate values.

SRC -> SQ -> Sorter -> Expression (to set flags for duplicates) -> Router -> Joiner -> Expression -> Router -> two targets

http://deepinopensource.blogspot.in/
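
A minimal Python sketch of this flag-based interpretation, using the single-column example from the answer (1, 2, 1, 2, 3); the variable names are illustrative, not from the original post:

```python
# Sorter stage: sort the source values so equal rows become adjacent.
source = [1, 2, 1, 2, 3]
sorted_rows = sorted(source)

# Expression stage: flag a row as duplicate if it equals its previous
# or next neighbour after sorting.
flags = [
    (i > 0 and sorted_rows[i] == sorted_rows[i - 1])
    or (i < len(sorted_rows) - 1 and sorted_rows[i] == sorted_rows[i + 1])
    for i in range(len(sorted_rows))
]

# Router stage: flagged values go to the duplicate target,
# the rest (here only 3) go to the unique target.
tgt_duplicate = [v for v, dup in zip(sorted_rows, flags) if dup]
tgt_unique = [v for v, dup in zip(sorted_rows, flags) if not dup]

print(tgt_unique)      # [3]
print(tgt_duplicate)   # [1, 1, 2, 2]
```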

Is This Answer Correct ?    1 Yes 1 No


Answer / mohank106

Refer to the link below; the answer is explained clearly there:

http://www.bullraider.com/database/informatica/scenario/11-informatica-scenario3

Is This Answer Correct ?    0 Yes 0 No


Answer / rani

Take the Source Qualifier, then place a Sorter transformation, select the Distinct option in the Sorter, and load the output into Unique_target.

Then take a Lookup transformation, look up on the target and compare it with the source; when a record occurs more than once, delete that record from the target using an Update Strategy with DD_DELETE (2) and load it into Duplicate_target. In another pipeline, take an unconnected Lookup and write a lookup override such as COUNT(*) ... HAVING > 1, then load those records into Duplicate_target.
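
A rough Python sketch of what this two-pipeline approach computes, assuming the Sorter's Distinct output means one copy of every source row and that the COUNT(*) ... HAVING > 1 override identifies the rows occurring more than once; the sample data and names are illustrative:

```python
from collections import Counter

source = [1, 2, 1, 2, 3]          # sample flat-file rows

# Pipeline 1: Sorter with the Distinct option keeps one copy of every row.
unique_target = sorted(set(source))

# Pipeline 2: the COUNT(*) ... HAVING > 1 override identifies keys that
# occur more than once; those rows go to Duplicate_target and, per the
# answer, are removed from Unique_target via DD_DELETE.
counts = Counter(source)
duplicate_keys = {k for k, n in counts.items() if n > 1}
duplicate_target = [v for v in source if v in duplicate_keys]
unique_target = [v for v in unique_target if v not in duplicate_keys]

print(unique_target)      # [3]
print(duplicate_target)   # [1, 2, 1, 2]
```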

Is This Answer Correct ?    0 Yes 2 No


More Informatica Interview Questions

In a table there are 1 million records, of which 3 records are duplicates. How will you find those 3 records?

6 Answers  


In how many ways can a relational source definition be updated, and what are they?

0 Answers   Informatica,


Why do we use session partitioning in Informatica?

2 Answers  


How do you use a PDF file as an Informatica source?

2 Answers   Ericsson, HP, IBM,


A flat file has 1 lakh (100,000) records and I want to push 52,000 records to the target. How?

7 Answers  


What are cumulative sum and moving sum?

0 Answers  


What do you mean by incremental aggregation?

0 Answers  


Does Informatica PowerCenter Designer have 64-bit support?

1 Answers  


Does the Aggregator transformation ignore null values or consider them? Thanks in advance, Manojkumar

3 Answers   IBM,


If the source has duplicate records in its id and name columns, with values 1 a, 1 b, 1 c, 2 a, 2 b, and the target should be loaded as 1 a+b+c (or 1 a||b||c), what transformations should be used for this?

4 Answers   ABC, Cap Gemini,


Under which conditions can we not use the Joiner transformation (limitations of the Joiner transformation)?

2 Answers  


1) What is the use of identifying bottlenecks in Informatica? 2) Where do we use shell scripting? 3) What is meant by Informatica?

1 Answers   CTS,

