How to speed up loading of 100 million distinct records in Informatica?
(Initially they are loaded into the target without any transformation, taking 2 hours.)
Answer Posted / nitin tomer
1) Load the data using bulk load. For that we have to drop the indexes on the target table, but rebuilding the indexes after the load will also take a good amount of time.
2) Create a sequence from 1 to 100 million using a Sequence Generator and create pass-through partitions to load the data in parallel. With, say, 10 partitions, each partition handles 10 million rows: 1 to 10,000,000, then 10,000,001 to 20,000,000, and so on.
Partitioning will definitely give a huge performance improvement.
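The partition math in step 2 can be sketched as follows. This is a minimal illustration in Python, not anything Informatica-specific (partitions are actually configured in the Workflow Manager); the row count and partition count are taken from the answer above.

```python
# Split a 1..N key sequence into equal ranges, one per pass-through partition.
# Illustration of the answer's scheme: 10 partitions x ~10 million rows each.

def partition_ranges(total_rows, num_partitions):
    """Return a list of (start, end) key ranges covering 1..total_rows."""
    base, extra = divmod(total_rows, num_partitions)
    ranges, start = [], 1
    for i in range(num_partitions):
        # Spread any remainder over the first `extra` partitions.
        size = base + (1 if i < extra else 0)
        ranges.append((start, start + size - 1))
        start += size
    return ranges

ranges = partition_ranges(100_000_000, 10)
print(ranges[0])   # (1, 10000000)
print(ranges[-1])  # (90000001, 100000000)
```

Each (start, end) pair would become the key-range filter for one partition, so all 10 partitions read and write disjoint slices of the 100 million rows concurrently.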
Briefly explain your complete project (sales) flow, i.e. from the source received from the client, through the transformations, to dispatch to the end user. What are all the processes involved? Kindly describe it step by step.
How will the document be delivered to me?
What do you mean by DTM and Load Manager, and what is the difference between the Load Manager and the Load Balancer?
How do you load rows into a fact table in a data warehouse?
What is decode in static cache?
In what scenario do we use pushdown optimization to improve session performance? Can anyone give an example?
Suppose on 1st Nov 2010 you created a mapping that includes huge aggregator calculations, and it remains under process for the next two days. You notice that even on the 3rd day it is still calculating. Without changing the logic or the mapping, how will you troubleshoot it or get it to run? Explain the steps.
Which transformation should we use to normalise the COBOL and relational sources?
If we have delimiters at unwanted places in a flat file, how can we overcome this?
I have 100 records in the source table, but I want to load only records 1, 5, 10, 15, 20, ..., 100 into the target table. How can I do this? Explain the detailed mapping flow.
What is a connected transformation?
What is transformation?
What is the update strategy transformation?
Differentiate between source qualifier and filter transformation?
Explain sessions?