How do you speed up loading of 100 million distinct records in
Informatica?
(Initially they are loaded into the target without using any
transformation, which takes 2 hours.)
Answers were Sorted based on User's Feedback
Answer / dev
Have indexes on the sources and make sure the indexes on the target are dropped before the load (these are distinct records anyway), then opt for bulk loading. Once the load is done you can recreate the indexes and perhaps analyze the table as well.
| Is This Answer Correct ? | 14 Yes | 0 No |
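A rough sketch of the drop/recreate and analyze steps described in the answer above, written as pre-session and post-session SQL on the target connection. The table and index names are hypothetical, and the statistics call uses Oracle-style syntax; substitute your database's equivalent.

-- Pre-session SQL: drop the index so the bulk load avoids index maintenance
DROP INDEX idx_tgt_customer_id;

-- Post-session SQL: recreate the index and refresh optimizer statistics
CREATE INDEX idx_tgt_customer_id ON tgt_customer (customer_id);
ANALYZE TABLE tgt_customer COMPUTE STATISTICS;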
Answer / nitin tomer
1) Load the data using bulk load; for that we have to drop the indexes from the target table, though recreating the indexes after the load will also take a good amount of time.
2) Create a sequence from 1 to 100 million using a Sequence Generator and create pass-through partitions to load the data; say 10 partitions, each covering a range of about 10 million values (1-10,000,000, 10,000,001-20,000,000, and so on), as sketched below.
Partitioning will definitely give a huge performance improvement.
| Is This Answer Correct ? | 0 Yes | 0 No |
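One way to realize the partition ranges mentioned above is to give each pass-through partition its own Source Qualifier SQL override that filters a distinct key range. The table and column names below are hypothetical, and the ranges assume roughly 10 million rows per partition.

-- Partition 1 SQL override
SELECT * FROM src_records WHERE record_seq BETWEEN 1 AND 10000000;
-- Partition 2 SQL override
SELECT * FROM src_records WHERE record_seq BETWEEN 10000001 AND 20000000;
-- ... and so on, up to partition 10 covering 90000001 to 100000000

Key-range partitioning on the same sequence column is another option and avoids hand-editing each override.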
Without using the Rank transformation, how can we rank items using other transformations?
Explain pushdown optimization and its types in Informatica.
I have worked on Informatica 7.1.1 and want to know the answer to the question below. The target is not created initially. How can we take the DDL script from the source and run the mapping by creating the target dynamically? Is it possible with the SQL transformation?
I have a scenario which loads data from a single source to two targets, T1 and T2, where T1 has a primary key and T2 has a foreign-key relation. First the data has to load into T2, and then load into T1 if that record exists in T1... how can we achieve it?
How many joins are there in Informatica?
What are global and local shortcuts, and what are their advantages?
What are batches?
I have two tables: say table1 has the columns Empid, firstname, lastname, middlename, and table2 has Empid, firstname, lastname. Can I union them using the Union transformation?
Which kind of index is preferred in DWH?
What are the mapping parameters and mapping variables?
A flat file is located in a Unix source folder, and the data transfer happens every day at 8 a.m. What five validation questions can be asked about this scenario? I recently faced this question.
In the Update Strategy transformation we gave the DD_INSERT condition, and in the session we gave the delete condition. Then what will happen? Will the mapping run?