How to speed up loading of 100 million distinct records in Informatica?
(Initially they are loaded into the target without using any transformation, taking 2 hours.)
Answer Posted / nitin tomer
1) Load the data using bulk load. For that we have to drop the indexes on the target table; keep in mind that recreating the indexes after the load will also take a good amount of time (see the index sketch after this list).
2) Create a sequence from 1 to 100 million using a Sequence Generator and create pass-through partitions to load the data in parallel. For example, with 10 partitions each one handles a range of 10 million rows: 1 to 10,000,000, then 10,000,001 to 20,000,000, and so on.
Partitioning will definitely give a huge performance improvement; the sketches below illustrate both steps.
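
For the index step in point 1, here is a minimal sketch assuming an Oracle target; the table SALES_TGT, column SALE_ID, and index IDX_SALES_ID are hypothetical names used only for illustration:

    -- Before the session: drop the index so the bulk (direct-path) load is not slowed down
    DROP INDEX idx_sales_id;

    -- Run the Informatica session with Target load type = Bulk at this point.

    -- After the session: rebuild the index; PARALLEL and NOLOGGING are Oracle
    -- options that can cut the rebuild time considerably
    CREATE INDEX idx_sales_id ON sales_tgt (sale_id) PARALLEL 8 NOLOGGING;

    -- Optionally return the index to its default settings once built
    ALTER INDEX idx_sales_id NOPARALLEL LOGGING;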
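For point 2, one way to express the ranges is a per-partition SQL override on the Source Qualifier. This is a sketch assuming ten pass-through partitions and a hypothetical source table SRC_SALES whose SEQ_NO column already holds the 1 to 100,000,000 surrogate key values:

    -- Partition 1:
    SELECT * FROM src_sales WHERE seq_no BETWEEN 1 AND 10000000;

    -- Partition 2:
    SELECT * FROM src_sales WHERE seq_no BETWEEN 10000001 AND 20000000;

    -- Partitions 3 through 10 follow the same pattern,
    -- ending with seq_no BETWEEN 90000001 AND 100000000.

Each partition then reads a disjoint tenth of the data, so the reader and writer threads can work in parallel.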
Related Informatica interview questions:
Under what conditions will selecting sorted input in an Aggregator still not boost session performance?
What do you mean by incremental aggregation?
How can you differentiate between PowerCenter and PowerMart?
What is a negative (-ve) test case in your project?
Tell me the roles and responsibilities in your project (my project is development).
Tell me five session failures in real time and how you solved them in your project.
On the Informatica server, which files are created during session runs?
Where is Informatica rejected data stored? How do you extract the rejected data?
Suppose we are using a dynamic lookup cache, and a record succeeds in the lookup condition but fails in the target for some reason. What happens in the cache?
What is meant by lookup transformation? Explain the types of lookup transformation?
What is the SQL query override in the Source Qualifier in Informatica?
What are the different types of olap? Give an example?
How can you define a user-defined event?
I have worked on Informatica 7.1.1. I want to know the answer to the question below. The target is not created initially; how do you take the DDL script from the source and run the mapping by creating the target dynamically? Is it possible with a SQL Transformation?
How do you load first and last records into target table? How many ways are there to do it? Explain through mapping flows.