I want to send all duplicate records to one target and all unique records to another target. How can we perform this? Please explain.
Example:
Input data:
eid
251
251
456
456
951
985
Output / Target 1:
251
251
456
456
Output / Target 2:
951
985
How do we achieve this?
Answer Posted / subhash
According to this logic, the Aggregator stage (grouping on eid with a count) produces:
251,2
456,2
951,1
985,1
The main data is:
251
251
456
456
951
985
If you join these two links on eid, the output will be:
251,2
251,2
456,2
456,2
951,1
985,1
Then, by specifying the condition count=1 on one output link, you get the unique records: 951, 985.
On another link, the condition count<>1 gives the duplicate records:
251,
251,
456,
456
This is our desired output.
The logic explained by Shiva is correct.
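The Aggregator → Join → Filter flow described above can be sketched outside DataStage as well. Here is a minimal Python illustration of the same counting-and-routing logic (the function name is illustrative, not a DataStage API):

```python
from collections import Counter

def split_by_duplication(records):
    """Split records into (duplicates, uniques) by occurrence count.

    Mirrors the Aggregator -> Join -> Filter flow:
    - Counter plays the Aggregator stage (eid -> count),
    - looking up counts[r] per row plays the Join back to the main data,
    - the two list comprehensions play the Filter links
      (count <> 1 -> duplicates target, count = 1 -> uniques target).
    """
    counts = Counter(records)                            # Aggregator output
    duplicates = [r for r in records if counts[r] != 1]  # count <> 1 link
    uniques = [r for r in records if counts[r] == 1]     # count = 1 link
    return duplicates, uniques

data = [251, 251, 456, 456, 951, 985]
dups, uniq = split_by_duplication(data)
print(dups)  # [251, 251, 456, 456]
print(uniq)  # [951, 985]
```

Note that the duplicate rows keep their full multiplicity (251 appears twice), matching the join-back step in the answer above.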