

The source is a flat file and we want to load unique and duplicate records separately into two separate targets. How can this be done?

Answers were Sorted based on User's Feedback




Answer / nitin

Create the mapping below to load unique records and duplicate records into separate targets:

Source -> SQ -> Sorter -> Aggregator -> Router -> Tgt_Unique
                                               -> Tgt_Duplicate

In the Aggregator, group by all ports and define an output port OUTPUT_COUNT = COUNT(<any input port>) (Informatica's COUNT function takes a port, not *).
In the Router, define two groups, OUTPUT_COUNT > 1 and OUTPUT_COUNT = 1. Connect the output of the
OUTPUT_COUNT > 1 group to Tgt_Duplicate and the OUTPUT_COUNT = 1 group to Tgt_Unique.
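
Outside Informatica, the same group-by-and-count logic can be sketched in Python; a minimal sketch, assuming the flat file is a simple CSV (the file names source.csv, tgt_unique.csv and tgt_duplicate.csv are hypothetical). Note that, as with the Aggregator, the duplicate target receives one row per duplicate group rather than every occurrence.

import csv
from collections import Counter

# Count occurrences of each full row (equivalent to the Aggregator
# grouping by all ports with OUTPUT_COUNT = COUNT(<port>)).
with open("source.csv", newline="") as src:
    rows = [tuple(r) for r in csv.reader(src)]

counts = Counter(rows)

# Route rows the way the Router groups do:
#   OUTPUT_COUNT = 1 -> unique target, OUTPUT_COUNT > 1 -> duplicate target.
with open("tgt_unique.csv", "w", newline="") as uniq, \
     open("tgt_duplicate.csv", "w", newline="") as dup:
    uniq_writer, dup_writer = csv.writer(uniq), csv.writer(dup)
    for row, count in counts.items():
        (uniq_writer if count == 1 else dup_writer).writerow(row)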

Is This Answer Correct ?    1 Yes 0 No


Answer / ankit kansal

Hi,
What I understand from your problem is this: if your source contains 1, 2, 1, 2, 3, then only 3 is taken as unique, and 1 and 2 (all of their occurrences) are considered duplicate values.

SRC -> SQ -> SRT -> EXP (to set flags for duplicates) -> ROUTER -> JOINER -> EXP -> RTR -> 2 TGTS

http://deepinopensource.blogspot.in/
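
A rough Python sketch of that interpretation (hypothetical file names, a single-column flat file assumed): every occurrence of a value that appears more than once goes to the duplicate target, and only values seen exactly once reach the unique target.

import csv
from collections import Counter

# Hypothetical single-column flat file, e.g. values 1, 2, 1, 2, 3.
with open("source.csv", newline="") as src:
    values = [row[0] for row in csv.reader(src) if row]

counts = Counter(values)  # plays the role of the EXP/JOINER that flags duplicates

# Route every occurrence: values seen once -> unique, all others -> duplicate.
unique_rows    = [v for v in values if counts[v] == 1]   # e.g. [3]
duplicate_rows = [v for v in values if counts[v] > 1]    # e.g. [1, 2, 1, 2]

with open("tgt_unique.csv", "w", newline="") as uniq:
    csv.writer(uniq).writerows([v] for v in unique_rows)
with open("tgt_duplicate.csv", "w", newline="") as dup:
    csv.writer(dup).writerows([v] for v in duplicate_rows)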

Is This Answer Correct ?    1 Yes 1 No


Answer / mohank106

Refer to the link below; the answer is explained clearly there:

http://www.bullraider.com/database/informatica/scenario/11-informatica-scenario3

Is This Answer Correct ?    0 Yes 0 No


Answer / rani

Take a Source Qualifier, then place a Sorter transformation, select the Distinct option in the Sorter, and load the result into Unique_target.

Then take a Lookup transformation, look up on the target, and compare it with the source. When a record occurs more than once, delete that record from the target using an Update Strategy with DD_DELETE (constant value 2) and load
it into Duplicate_target. In another pipeline, use the source with an unconnected Lookup and write a lookup override using COUNT(*) with HAVING COUNT(*) > 1, then load those records into Duplicate_target.
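
A loose Python sketch of the net effect of this two-pass idea (hypothetical file names, simple CSV assumed): distinct rows stand in for the Sorter's Distinct option, and the count-greater-than-one check stands in for the lookup override.

import csv
from collections import Counter

with open("source.csv", newline="") as src:
    rows = [tuple(r) for r in csv.reader(src)]

# Pass 1: distinct rows, as the Sorter with the Distinct option would produce.
distinct_rows = list(dict.fromkeys(rows))

# Pass 2: rows occurring more than once (the COUNT(*) ... HAVING > 1 idea)
# are removed from the unique set and written to the duplicate target instead.
counts = Counter(rows)
with open("unique_target.csv", "w", newline="") as uniq:
    csv.writer(uniq).writerows(r for r in distinct_rows if counts[r] == 1)
with open("duplicate_target.csv", "w", newline="") as dup:
    csv.writer(dup).writerows(r for r in distinct_rows if counts[r] > 1)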

Is This Answer Correct ?    0 Yes 2 No


More Informatica Interview Questions

If the number of source columns changes every time (the first time it is 10, the next time it is 20, and so on), how do we deal with it without changing the mapping?

9 Answers  


How can we load a table from source to target starting with the 11th record?

6 Answers   IBM,


Tell me about any tools for scheduling purposes other than Workflow Manager and pmcmd.

0 Answers  


What is a dimensional model?

0 Answers  


Can we run a session without using workflows?

5 Answers   TCS,


Explain data warehouse architecture. How does data flow from the initial source to the end?

0 Answers   Keane India Ltd,


Commonly, how many mappings are there in banking projects?

1 Answers  


I have a flat file where sal has the value 10,000. I want to load the data in the same format, with sal as 10,000. If anybody knows the answer, please mail me. Thanks in advance. My mail id is chandranmca2007@gmail.com

2 Answers   DELL, iGate,


Hi, the source data is: col1 values are 5, 6, 7; col2 values are 3, 2, 1; col3 values are 8, 9, 10. I want the target to be: col1 5, 6, 7; col2 1, 2, 3; col3 8, 9, 10. How do I do this?

6 Answers  


Which expressions can we not use in mapplets? Can we join (relate) two dimensions in a schema? Why and where do we use the 'Sorted Input' option?

1 Answers   TCS,


Can anyone explain a sales project in Informatica?

8 Answers   MBT, Satyam,


What is meant by incremental aggregation?

0 Answers  

