

Source is a flat file and we want to load unique records and duplicate records separately into two separate targets. How can this be done?

Answers were sorted based on users' feedback




Answer / nitin

Create the mapping as below to load unique records and duplicate records into separate targets:

Source -> SQ -> Sorter -> Aggregator -> Router -> Tgt_Unique
                                                -> Tgt_Duplicate

In the Aggregator, group by on all ports and define an output port OUTPUT_COUNT = COUNT(*).
In the Router, define two groups, OUTPUT_COUNT > 1 and OUTPUT_COUNT = 1; connect the output from the OUTPUT_COUNT > 1 group to Tgt_Duplicate and the OUTPUT_COUNT = 1 group to Tgt_Unique.
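
Outside Informatica, the same aggregate-and-route logic can be sketched in a few lines of Python (a minimal sketch only; the sample rows are illustrative, and the whole row is assumed to be the grouping key, matching "group by on all ports"):

# Minimal sketch of the aggregate-and-route logic above.
# Assumption: the whole row is the grouping key; sample rows are illustrative.
from collections import Counter

rows = [("1", "A"), ("2", "B"), ("1", "A"), ("2", "B"), ("3", "C")]  # rows read from the flat file

counts = Counter(rows)                                    # Aggregator: COUNT per group
tgt_unique    = [r for r, c in counts.items() if c == 1]  # Router group OUTPUT_COUNT = 1
tgt_duplicate = [r for r, c in counts.items() if c > 1]   # Router group OUTPUT_COUNT > 1

print(tgt_unique)     # [('3', 'C')]
print(tgt_duplicate)  # [('1', 'A'), ('2', 'B')]

Note that, because the Aggregator collapses each group to a single row, Tgt_Duplicate receives one representative row per duplicated key rather than every occurrence.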

Is This Answer Correct ?    1 Yes 0 No


Answer / ankit kansal

Hi,
What I understand from your problem is that if your source contains 1, 2, 1, 2, 3, then only 3 is taken as unique, and 1 and 2 (all their occurrences) are considered duplicate values.

SRC -> SQ -> SRT -> EXP (set flags for duplicates) -> ROUTER -> JOINER -> EXP -> RTR -> 2 TGTS

http://deepinopensource.blogspot.in/
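
For this stricter reading (every occurrence of a repeated value goes to the duplicate target), the sort-then-flag pass can be sketched in Python as follows; treating the whole row as the key is an assumption, and the sample rows mirror the 1, 2, 1, 2, 3 example:

# Minimal sketch of the sort-then-flag idea: sort so equal rows are adjacent,
# then route whole groups based on their size.
from itertools import groupby

def route_rows(rows):
    tgt_unique, tgt_duplicate = [], []
    for _, grp in groupby(sorted(rows)):   # SRT: equal rows become adjacent
        grp = list(grp)
        if len(grp) == 1:
            tgt_unique.extend(grp)         # value occurs exactly once
        else:
            tgt_duplicate.extend(grp)      # every occurrence of a repeated value
    return tgt_unique, tgt_duplicate

u, d = route_rows([("1",), ("2",), ("1",), ("2",), ("3",)])
print(u)  # [('3',)]
print(d)  # [('1',), ('1',), ('2',), ('2',)]

This mirrors the Sorter plus Expression-flag pattern: once the rows are sorted, each row only needs to be compared with its neighbours to decide which target it belongs to.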

Is This Answer Correct ?    1 Yes 1 No


Answer / mohank106

Refer to the link below; the answer is crystal clear there:

http://www.bullraider.com/database/informatica/scenario/11-informatica-scenario3

Is This Answer Correct ?    0 Yes 0 No


Answer / rani

Take the Source Qualifier, then place a Sorter transformation, select the Distinct option in the Sorter, and load the result into Unique_target.

Take a Lookup transformation on the target and compare it with the source; when a record occurs more than once, delete that record from the target using an Update Strategy with DD_DELETE (constant 2) and load it into Duplicate_target. Alternatively, use the source in another pipeline with an unconnected Lookup whose lookup override uses GROUP BY ... HAVING COUNT(*) > 1, and load those records into Duplicate_target.
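
The lookup-override part of this approach boils down to a GROUP BY ... HAVING COUNT(*) > 1 query. A minimal sketch of that idea using an in-memory SQLite table in Python (table and column names are illustrative, not from the question):

# Illustrative sketch: DISTINCT for the unique pass, HAVING COUNT(*) > 1 for duplicates.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE src (val TEXT)")
con.executemany("INSERT INTO src VALUES (?)", [("1",), ("2",), ("1",), ("2",), ("3",)])

# Sorter with the Distinct option: one copy of every row.
distinct_rows = con.execute("SELECT DISTINCT val FROM src ORDER BY val").fetchall()

# Lookup-override style query: keys that occur more than once.
dup_keys = con.execute(
    "SELECT val FROM src GROUP BY val HAVING COUNT(*) > 1 ORDER BY val"
).fetchall()

print(distinct_rows)  # [('1',), ('2',), ('3',)]
print(dup_keys)       # [('1',), ('2',)]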

Is This Answer Correct ?    0 Yes 2 No


More Informatica Interview Questions

How will you update the row without using Update Strategy?

10 Answers   CTS,


What are the types of facts, with examples?

3 Answers  


I completed my MBA in 2008 and got a job as a business analyst in January 2008 through a consultancy. After 3 months they trained me as an Informatica developer, and I am continuing in this job. My question: when I go to interviews, HR people often ask, "You are an MBA graduate; how were you selected for this position?" I explain what I have mentioned above. Please tell me how I should answer this question.

0 Answers  


What is a junk dimension?

4 Answers   Cap Gemini,


Let's say I have a number of records in the source table and 3 destination tables A, B, and C. I have to insert records 1 to 10 into A, then 11 to 20 into B, and 21 to 30 into C. Then again 31 to 40 into A, 41 to 50 into B, and 51 to 60 into C, and so on up to the last record.

5 Answers  


How to load a relational source into a file target?

3 Answers   IBM,


How to create Target definition for flat files?

0 Answers   Informatica,


Can we use the mapping parameters or variables created in one mapping into any other reusable transformation?

0 Answers   Informatica,


How do you load first and last records into target table?

0 Answers  


What testing is done at the mapping level? Please give a brief explanation.

1 Answers   CTS,


What logic will you implement to load data into a fact table from n dimension tables?

4 Answers   TCS,


How can I schedule an Informatica job with the Unix cron scheduling tool?

2 Answers  

