What is the difference between a Junk and a Conformed Dimension?
Where can each be used in Informatica?
Answers are sorted based on user feedback
Answer / prabhu
A junk dimension contains data that is not currently useful but is
retained for future analysis purposes.
A conformed dimension, by contrast, is common to multiple fact
tables.
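The point above, that a conformed dimension is shared by multiple fact tables, can be sketched as follows. This is a minimal illustration, not Informatica code; the table and column names (`date_dim`, `sales_fact`, `inventory_fact`) are hypothetical.

```python
# Hypothetical conformed date dimension, shared by two star schemas.
date_dim = {
    20240101: {"date_key": 20240101, "year": 2024, "quarter": "Q1"},
}

# Two separate fact tables reference the SAME dimension via date_key.
sales_fact = [{"date_key": 20240101, "amount": 250.0}]
inventory_fact = [{"date_key": 20240101, "qty_on_hand": 40}]

def enrich(fact_rows, dim):
    """Resolve each fact row's foreign key through the shared dimension."""
    return [{**row, **dim[row["date_key"]]} for row in fact_rows]

# Because both facts resolve attributes through the same dimension rows,
# reports built on either star agree on what "Q1 2024" means.
print(enrich(sales_fact, date_dim))
print(enrich(inventory_fact, date_dim))
```

The design benefit is exactly this sharing: the dimension is built once and every fact table that joins to it reports against identical attribute values.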
Answer / sarvesh
Merging two or more low-cardinality columns from dimensional tables
into a single table is known as a "junk dimension".
A single dimension table that is shared by two or more fact tables
is known as a "conformed dimension".
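The junk-dimension idea above (folding several low-cardinality columns into one table) can be sketched like this. The flag columns and their values are illustrative assumptions, not from any specific schema.

```python
from itertools import product

# Hypothetical low-cardinality transaction attributes that would
# otherwise clutter the fact table or need three tiny dimensions.
payment_type = ["CASH", "CARD"]
is_gift = ["Y", "N"]
channel = ["WEB", "STORE"]

# Build the junk dimension as the cross product of all flag values,
# assigning a surrogate key to each distinct combination.
junk_dimension = [
    {"junk_key": key, "payment_type": p, "is_gift": g, "channel": c}
    for key, (p, g, c) in enumerate(
        product(payment_type, is_gift, channel), start=1
    )
]

# 2 x 2 x 2 = 8 rows; the fact table then stores a single junk_key
# instead of three separate flag columns.
for row in junk_dimension:
    print(row)
```

Since the cross product of low-cardinality columns stays small, the full junk dimension can usually be pre-populated rather than loaded incrementally.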
Answer / chaitanya
An interesting use for a junk dimension is to capture the context of a specific transaction. While our
common, conformed dimensions contain the key dimensional attributes of interest, there are likely
attributes about the transaction that are not known until the transaction is processed.
Can someone explain a Telecommunications (wireless) project in Informatica to me? Thanks in advance.
How many ways are there to create ports?
In an Aggregator transformation, how do we get the middle record? The source contains empno, name, sal, deptno, address.
When we are using a Dynamic Cache, which options do we select at the session level?
What are partitions in informatica and which one is used for better performance?
I have a flat-file source and two targets, t1 and t2. I want to load the odd-numbered records into t1 and the even-numbered records into t2.
Can we call a stored procedure from a Unix script that is run using a Command task?
What is a fact, and what types of fact tables are there?
CASE and LIKE functions in Informatica (my source is XML): CASE WHEN OS LIKE '%Windows%' AND OS LIKE '%200%' THEN 'Windows 200'; CASE WHEN OS LIKE '%Windows%' AND OS LIKE '%200%' AND OS LIKE '%64%' THEN 'Windows 200 64 bit', etc.
Define sessions in informatica etl?
Can we create a worklet inside a worklet?
If a session fails after loading 10,000 records into the target, how can you load from the 10,001st record onward when you run the session next time?