

How to delete duplicate records if we have a huge volume of
records in a table? (rowid is not the correct approach)

Answers are sorted based on users' feedback




Answer / keyrun

Hi, there are many ways to delete the duplicates!

You can remove duplicates with any of the following transformations:

Source Qualifier: check the 'Select Distinct' property (if the source is relational).

Sorter: check 'Distinct' on the Properties tab (if the source is a flat file).

Aggregator: group by the key port(s).
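For the relational case, checking 'Select Distinct' simply pushes a DISTINCT
into the query the Source Qualifier generates; a rough illustration, with emp
and its columns as placeholder names:

-- Effectively what the generated source query runs:
SELECT DISTINCT empno, ename, sal
FROM emp;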

Is This Answer Correct ?    7 Yes 0 No


Answer / chandu

SELECT * FROM <table_name>
WHERE rowid NOT IN (SELECT MIN(rowid)
                    FROM <table_name>
                    GROUP BY <key column(s) that define a duplicate>);

This lists the duplicate rows; to actually remove them, replace SELECT * FROM
with DELETE FROM (as in the last answer below).

Is This Answer Correct ?    5 Yes 0 No


Answer / chandrasekar

First, count the number of records using an Aggregator transformation with
a group-by port; for example, in the emp table, group by empno. Then use a
Filter transformation with the condition count = 1. The other records are
rejected and not loaded into the target (a rough SQL equivalent is sketched
below).

Hope it helps a little...
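A sketch of that Aggregator-plus-Filter logic in SQL, reusing the emp/empno
names from the answer. Note that it keeps only the keys occurring exactly
once; rows whose key is duplicated are dropped entirely rather than reduced
to a single survivor:

SELECT empno
FROM emp
GROUP BY empno
HAVING COUNT(*) = 1;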

Is This Answer Correct ?    1 Yes 1 No


Answer / cmanojkumar

Hi,
Thanks for your answer.
Sorry, I should have posted this question in the Oracle forum, not
Informatica.
Anyway, your answer is useful to me.
Could you please tell me how we can delete without rowid in
Oracle?

Is This Answer Correct ?    0 Yes 0 No
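For reference, one common Oracle approach that avoids rowid entirely, and is
often faster than a bulk DELETE on a huge table, is to rebuild the table; a
minimal sketch, assuming the table is called emp and nothing (constraints,
triggers, grants) depends on it:

-- Keep one copy of each row by rebuilding the table.
CREATE TABLE emp_dedup AS SELECT DISTINCT * FROM emp;
DROP TABLE emp;
RENAME emp_dedup TO emp;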


Answer / ravikumar2614

DELETE FROM EMP E1
WHERE ROWID < (SELECT MAX(ROWID) FROM EMP E2
               WHERE E1.EMPNO = E2.EMPNO);

(The correlation must be on the key columns, e.g. EMPNO, not on ROWID;
correlating on E1.ROWID = E2.ROWID matches only the row itself, so the
original query deletes nothing.)

If I am wrong please correct me on ravi.info2614@gmail.com

Is This Answer Correct ?    1 Yes 1 No


Answer / dilip ingole

DELETE FROM emp
WHERE rowid NOT IN (SELECT MIN(rowid) FROM emp GROUP BY eid, ename);

In the GROUP BY you need to mention every column that defines a duplicate
(all columns of the table, for exact duplicates).

Is This Answer Correct ?    0 Yes 0 No
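For very large tables, an analytic-function variant of the same idea often
performs better than NOT IN; a sketch, reusing the eid/ename example columns
(extend the PARTITION BY to every column that defines a duplicate):

-- Delete every row except the first one in each duplicate group.
DELETE FROM emp
WHERE rowid IN (SELECT rid
                FROM (SELECT rowid AS rid,
                             ROW_NUMBER() OVER (PARTITION BY eid, ename
                                                ORDER BY rowid) AS rn
                      FROM emp)
                WHERE rn > 1);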
