I have 2 files: the 1st contains duplicate records only, and the 2nd file contains unique records. Example:
File1:
1 subhash 10000
1 subhash 10000
2 raju 20000
2 raju 20000
3 chandra 30000
3 chandra 30000
File2:
1 subhash 10000
5 pawan 15000
7 reddy 25000
3 chandra 30000
Output file: capture all the duplicates across both files, with a count.
1 subhash 10000 3
1 subhash 10000 3
1 subhash 10000 3
2 raju 20000 2
2 raju 20000 2
3 chandra 30000 3
3 chandra 30000 3
3 chandra 30000 3
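
To make the counting rule concrete: "subhash" appears twice in File1 and once in File2, so its combined count is 3 and it is written out 3 times. A minimal Python sketch (the records are hard-coded here; it only illustrates the required logic, not a DataStage job) that reproduces the expected output:

from collections import Counter

file1 = [
    "1 subhash 10000", "1 subhash 10000",
    "2 raju 20000",    "2 raju 20000",
    "3 chandra 30000", "3 chandra 30000",
]
file2 = [
    "1 subhash 10000", "5 pawan 15000",
    "7 reddy 25000",   "3 chandra 30000",
]

# Funnel both files into one stream and count each full record.
counts = Counter(file1 + file2)

# Keep only records seen more than once; emit each occurrence
# with its combined count appended.
for rec, n in counts.items():
    if n > 1:
        for _ in range(n):
            print(rec, n)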





Answer / subbuchamala

File1, File2 --> Funnel --> Copy ==> (1st link) AGG + (2nd link) JOIN --> Filter --> OutputFile
1. Pass the 2 files to a Funnel stage and then to a Copy stage.
2. From the Copy stage, send the 1st link to an Aggregator stage and the 2nd link to a Join stage.
3. In the Aggregator stage, group by the key columns (say ID, NAME) and take the count; then join the two links on the key column.
4. Filter on COUNT > 1 and send the output to OutputFile.
We get the desired output; a rough pandas equivalent is sketched below.
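
A minimal pandas sketch of this stage flow, outside DataStage (the column names and the full-record grouping key are assumptions): concat plays the Funnel, groupby the Aggregator, merge the Join, and the boolean mask the Filter.

import pandas as pd

cols = ["id", "name", "salary"]
file1 = pd.DataFrame([(1, "subhash", 10000), (1, "subhash", 10000),
                      (2, "raju", 20000),    (2, "raju", 20000),
                      (3, "chandra", 30000), (3, "chandra", 30000)],
                     columns=cols)
file2 = pd.DataFrame([(1, "subhash", 10000), (5, "pawan", 15000),
                      (7, "reddy", 25000),   (3, "chandra", 30000)],
                     columns=cols)

# Funnel: combine both files into a single stream.
funneled = pd.concat([file1, file2], ignore_index=True)

# Aggregator: group on the key columns and count rows per group.
agg = funneled.groupby(cols, as_index=False).size().rename(columns={"size": "count"})

# Join: inner-join the counts back onto the stream on the key.
joined = funneled.merge(agg, on=cols)

# Filter: keep only records whose count is greater than 1.
print(joined[joined["count"] > 1])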



Answer / ankit gosain

Hi,

This problem can be solved by creating a job with the following stages:

File1 ---+                       File1, File2
         |                         (ref links)
         +-- Funnel -- Aggregator -- Join -- Filter -- Tgt_File
         |
File2 ---+

1. Funnel both files (now you have the unique & duplicate records together).
2. Aggregate on the basis of any i/p column, with calculation type = Count Rows (say the o/p column is row_count).
3. Join the aggregated o/p with the i/p files 1 & 2 on the basis of the key, with join type = Inner Join.
4. In the Filter stage, set the where clause to row_count > 1.

If you have any further doubts or queries, reach me at
ankitgosian@gmail.com

Cheers,
Ankit :)



More DataStage Interview Questions

How can we achieve parallelism?

1 Answers   CTS,


What steps should be taken to improve Datastage jobs?

0 Answers  


What is the difference between validate and compile?

1 Answers   CTS,


What is the flow of loading data into fact & dimensional tables?

0 Answers  


Hi everyone, I have a scenario, please suggest: 1) Daily we are getting some huge data files, and all the files' metadata is the same; how can we load them into the target table? 2) A column has 10 records; at run time we have to send only the 5th and 6th records to the target. How can we do that? Please help me with the above scenarios, and if anyone has a Job Sequence, kindly send me one example and the scenario to my mail ID (nrvdwh@gmail.com).

3 Answers   HSBC,


What are the features of datastage flow designer?

0 Answers  


What is the difference between account and directory options?

0 Answers  


Given an input column and two output columns:

i/p   o/p1   o/p2
1     1      4
1     1      5
1     1      6
2     2
2     2
2     2
3     3
4
5
6

How do we populate the i/p rows into o/p1 & o/p2 using DataStage stages? And how would the same scenario be handled using SQL?

8 Answers   IBM,


How to reverse the string using SQL?

0 Answers   CTS,


Define Merge?

0 Answers  


How to achieve this output? Two input columns (ID & Name):

ID | Name
1  | Jack
1  | Kara

In the output there should be only 1 column, populated as:

1,Jack
1,Kara

0 Answers  


I wanted to use the range lookup scenario in a DataStage 7.5.2 server job. I have a date field in the source and I should match it with a field in the lookup file, but the fields should match even though there is some range between them. Can someone tell me how I can do that? Thanks.

0 Answers  

