Is a snowflake or star schema used?
If a star schema is used, why?
Answers were Sorted based on User's Feedback
Answer / venkat
Of the above two schemas (snowflake and star), the star schema is best because it is denormalized: a centrally located fact table is surrounded by multiple dimension tables, so the fact table's size can be calculated from the lowest level of granularity of the dimensions.
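For illustration, here is a minimal star-schema sketch in Oracle SQL; the table and column names (fact_sales, dim_date, dim_product, dim_customer) are hypothetical and only meant to show a central fact table surrounded by denormalized dimensions:

-- Denormalized dimension tables: each carries all of its descriptive attributes
CREATE TABLE dim_date     (date_key     NUMBER PRIMARY KEY, calendar_date DATE, month_name VARCHAR2(20), year_num NUMBER);
CREATE TABLE dim_product  (product_key  NUMBER PRIMARY KEY, product_name VARCHAR2(100), category_name VARCHAR2(50));
CREATE TABLE dim_customer (customer_key NUMBER PRIMARY KEY, customer_name VARCHAR2(100), city VARCHAR2(50));

-- Central fact table at the lowest grain: one row per day, product and customer
CREATE TABLE fact_sales (
    date_key     NUMBER REFERENCES dim_date(date_key),
    product_key  NUMBER REFERENCES dim_product(product_key),
    customer_key NUMBER REFERENCES dim_customer(customer_key),
    quantity     NUMBER,
    sales_amount NUMBER(12,2)
);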
| Is This Answer Correct ? | 5 Yes | 0 No |
Answer / koti
A star schema is used for simple data marts, whereas a snowflake schema is used for more complex ones.
| Is This Answer Correct ? | 5 Yes | 1 No |
Answer / senthil
Hi,
People normally use a star schema in data warehousing because it is denormalized, which makes data easy to retrieve from the target and easy to understand.
A snowflake schema, on the other hand, is fully normalized. Normalized designs are normally used in OLTP systems, while the star schema is used in data warehousing.
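To contrast the two, here is a hedged sketch building on the hypothetical tables above: in a snowflake schema the product dimension would be normalized further, with the category attributes moved into their own table (dim_category and dim_product_sf are made-up names for this example):

-- Snowflaked (normalized) version of the product dimension:
-- the category attribute is moved out into its own table
CREATE TABLE dim_category (category_key NUMBER PRIMARY KEY, category_name VARCHAR2(50));
CREATE TABLE dim_product_sf (
    product_key  NUMBER PRIMARY KEY,
    product_name VARCHAR2(100),
    category_key NUMBER REFERENCES dim_category(category_key)
);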
| Is This Answer Correct ? | 2 Yes | 0 No |
Answer / radhakrishnansk
A star schema is used for easy retrieval; it is in 2nd normal form.
A snowflake schema is in 3rd normal form, so it needs more joins for retrieval.
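As a rough illustration of that point, using the hypothetical tables sketched above (and assuming the fact table pointed at the snowflaked product dimension), a report by product category needs one join in the star layout but an extra join in the snowflaked layout:

-- Star schema: the category sits directly on the dimension, so one join is enough
SELECT d.category_name, SUM(f.sales_amount)
FROM   fact_sales f
JOIN   dim_product d ON d.product_key = f.product_key
GROUP  BY d.category_name;

-- Snowflake schema: the normalized category table adds one more join
SELECT c.category_name, SUM(f.sales_amount)
FROM   fact_sales f
JOIN   dim_product_sf p ON p.product_key  = f.product_key
JOIN   dim_category   c ON c.category_key = p.category_key
GROUP  BY c.category_name;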
| Is This Answer Correct ? | 2 Yes | 3 No |
The source has one lakh records, which are loaded into the target. How can I compare whether the records were loaded into the table? For example, the source has Firstname and Lastname; how can I check in Oracle that the same Firstname/Lastname records were loaded into the target?
Hi experts, can anyone tell me how much PL/SQL we use in real time?
In a mapping, with a flat file as the source and a flat file as the target, versus a flat file as the source and Oracle as the target, which is faster? That is, which process completes first?
What is the difference between normal and bulk loading? Which one is recommended?
How many data models have you done in an Informatica project?
What are the transformations that restrict the partitioning of sessions?
How would you copy the content of one repository to another repository?
What are the types of schemas we have in a data warehouse, and what are the differences between them?
How to recover the standalone session?
What is data transformation manager process?
How to load the name of the currently processing flat file along with the data into the target using an Informatica mapping?
I am getting five sources in a day, and I do not know when I will get them. I need to load the data into the target and run the session, but I can't keep the session running, and I can't stop the session either. Please help me.