Suppose we are using a dynamic lookup in a mapping and the commit
interval set for the target is 10,000. How does the data get
committed in the lookup if only 100 rows were read from
the source and the dynamic lookup did not have the 100th row in it?
Answers were Sorted based on User's Feedback
Answer / abhishek singh
First of all, can you please tell me whether the target is used as
a source in this mapping.
If the target is used as a source in the mapping, then the commit
interval should be 1.
Otherwise, where the target is not used as a source, it
could be anything; in your case it will be 10,000.
| Is This Answer Correct ? | 13 Yes | 1 No |
Answer / akash
The cache data is not affected in this scenario; the commit interval is a target property.
| Is This Answer Correct ? | 1 Yes | 0 No |
Answer / imran
The dynamic lookup cache is updated in real time as each row is processed, independent of the target commit interval. The commit interval only controls when data is committed to the target table, not when the dynamic lookup cache is updated. Thus, the dynamic lookup will contain the 100th row immediately after it is processed, even though the target commit interval has not been reached.
| Is This Answer Correct ? | 0 Yes | 0 No |
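The behaviour described above can be illustrated with a small conceptual sketch. This is plain Python, not Informatica code; the commit interval, row shape, and lookup key are hypothetical, and the point is only that the cache fills row by row while the target commit is batched.

```python
# Conceptual sketch (NOT Informatica code): a dynamic lookup cache is
# updated per row, while rows are committed to the target in batches.

COMMIT_INTERVAL = 10_000  # hypothetical target commit interval

def run_session(source_rows):
    lookup_cache = {}    # dynamic lookup cache, keyed on the lookup key
    target_buffer = []   # rows waiting for the next commit point
    committed = []       # rows actually committed to the target

    for row in source_rows:
        key = row["id"]
        if key not in lookup_cache:
            lookup_cache[key] = row      # cache updated immediately, per row
        target_buffer.append(row)
        if len(target_buffer) >= COMMIT_INTERVAL:
            committed.extend(target_buffer)   # commit point reached
            target_buffer.clear()

    return lookup_cache, committed

# 100 source rows with a 10,000-row commit interval: the cache holds
# all 100 rows, but nothing has been committed to the target yet.
cache, committed = run_session([{"id": i} for i in range(100)])
print(len(cache), len(committed))  # -> 100 0
```

In other words, whether the 100th row is in the cache does not depend on the commit interval at all; the session would still commit the buffered rows at end of session.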
How do you list the top 10 salaries without using the Rank transformation?
I have a scenario that loads data from a single source to two targets, T1 and T2, where T1 has a primary key and T2 has a foreign-key relation to it. First the data has to load into T2, and then into T1 if that record exists in T1... how can we achieve this?
What are the scheduling options to run a session?
What are the types of facts, with examples?
Does Informatica PowerCenter Designer have 64-bit support?
What is the Joiner transformation in Informatica?
We have a parameter file in a Unix location where we have .txt files that will be used as sources in Informatica. I cannot use the source file name directly, as the file name keeps changing in the Unix location. I need to define $$InputFile as a parameter. Can anybody send me the parameter file and the steps to handle this?
In a table there are 1 million records, of which 3 are duplicates. How will you find those 3 records?
How do you delete the (flat file) data in the target table after it is loaded?
What is a data warehouse?
What are the main issues while working with flat files as sources and as targets?
What are the limitations of the truncate and load option?