What is a staging area?
Answers were Sorted based on User's Feedback
Answer / haritha reddy
It is a temporary storage area where reconciliation of
data is possible.
You can extract data from different source systems and
transform it; you can aggregate and cleanse the data.
A staging area also reduces the burden on the source systems.
Is This Answer Correct ? | 18 Yes | 1 No |
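The extract/stage/cleanse/load flow described above can be sketched in a few lines. This is a minimal illustration only, using Python's built-in sqlite3 as a stand-in for both the staging schema and the warehouse; the table and column names are made up for the example.

```python
import sqlite3

# Two hypothetical source feeds (stand-ins for separate source systems).
# Note the messy data: untrimmed IDs, amounts stored as text.
source_a = [(" 101 ", "2000"), ("102", "1500")]
source_b = [("101", "500")]

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Staging table: raw, untyped copies of the source rows.
cur.execute("CREATE TABLE stg_sales (cust_id TEXT, amount TEXT)")
cur.executemany("INSERT INTO stg_sales VALUES (?, ?)", source_a + source_b)

# Cleanse (trim, cast) and aggregate inside the staging area,
# then load the result into the target table.
cur.execute("CREATE TABLE tgt_sales (cust_id INTEGER, total REAL)")
cur.execute("""
    INSERT INTO tgt_sales
    SELECT CAST(TRIM(cust_id) AS INTEGER), SUM(CAST(amount AS REAL))
    FROM stg_sales
    GROUP BY CAST(TRIM(cust_id) AS INTEGER)
""")
print(cur.execute("SELECT * FROM tgt_sales ORDER BY cust_id").fetchall())
# [(101, 2500.0), (102, 1500.0)]
```

Because the cleansing and aggregation run against the staging copy, the source systems are queried only once for the raw extract, which is the "reduced burden" the answer refers to.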
Answer / mln
This is a temporary area where the data is kept for
transformation and cleansing.
Is This Answer Correct ? | 8 Yes | 1 No |
Answer / anju
A staging area is used to integrate data from various
heterogeneous sources. One advantage is recoverability: if
a load session fails, we can restart from the data in the
staging area instead of re-extracting from the sources.
Is This Answer Correct ? | 7 Yes | 0 No |
Answer / sai suresh
The staging area is where data transformations take place. It
is a temporary storage area. Transformations such as data
scrubbing, data cleansing, data aggregation, and data merging
are performed there: the data is transformed from its source
format into the required business format, and then it is
loaded into the target.
Is This Answer Correct ? | 5 Yes | 0 No |
Answer / tapsi
The staging area is the place where data collected from the
different sources is cleansed and made ready for
transformation.
Is This Answer Correct ? | 2 Yes | 0 No |
Answer / mkselva
A place where data is processed before entering the
warehouse
Is This Answer Correct ? | 1 Yes | 0 No |
Answer / k raju
The staging area is where we keep all the data extracted from
the OLTP systems, offline, so that it can be cleansed and
made ready for transformation.
Is This Answer Correct ? | 1 Yes | 0 No |
Answer / simon
A staging area is a place where you hold temporary tables on the data warehouse server. Staging tables are connected to the work area or fact tables. We basically need a staging area to hold the data and perform data cleansing and merging before loading the data into the warehouse.
You can refer to dwhlaureate.blogspot.com
Is This Answer Correct ? | 1 Yes | 0 No |
Answer / lakshminarayana
An intermediate area between the source systems and the data warehouse; it is a temporary schema/user. After each session completes, the previous data must be deleted from the staging table before the next load.
Its main purpose is to improve performance on the source system: a join performed directly on the full source tables takes a long time to execute, whereas the staging area holds only the particular slice of data a session needs, not a full copy of the source. It is also used to cleanse the data before loading.
It is a relational database. No end user can access the data there, and no transformations are performed on it. If a session fails, we can re-read the required data directly from the staging area, which decreases the burden on the source systems.
Is This Answer Correct ? | 1 Yes | 0 No |
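The truncate-and-reload pattern in the answer above (clear the staging table each session, then pull only the slice of source data the session needs) can be sketched as follows. This is an illustrative sketch with invented table names, using sqlite3 in place of a real staging schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE src_orders (id INTEGER, region TEXT)")
con.executemany("INSERT INTO src_orders VALUES (?, ?)",
                [(1, "EU"), (2, "US"), (3, "EU")])
con.execute("CREATE TABLE stg_orders (id INTEGER, region TEXT)")

def refresh_staging(con, region):
    # Delete the previous session's rows before each new load.
    con.execute("DELETE FROM stg_orders")
    # Pull only the slice this session needs, not a full copy of the source.
    con.execute(
        "INSERT INTO stg_orders SELECT * FROM src_orders WHERE region = ?",
        (region,))

refresh_staging(con, "EU")
print(con.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0])  # 2
refresh_staging(con, "US")
print(con.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0])  # 1
```

Filtering at extract time is what keeps the staging copy small; the delete-before-load step is what makes each session independent of the previous one.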
Answer / giri
A staging area is a collection of databases. It is temporary storage.
Is This Answer Correct ? | 2 Yes | 2 No |
What are the challenges of data warehousing in the future?
Hi, I saw a mapping implemented by my seniors. In an Expression transformation they implemented the following logic: iif(is_date(in_UC_DATINV,'YYYYMMDD'), to_date(in_UC_DATINV,'YYYYMMDD'), 'Inventory Date is either invalid or null'). Inventory_Date is validated only with is_date(), not with NOT ISNULL(), yet the error message says "either invalid or null". Why? Does is_date() also check for NOT ISNULL(), or is something different going on in this logic? Please answer me. Thanks in advance.
What are ETL Tools?
What will happen if the select-list columns in the custom override SQL query and the order of the output ports in the Source Qualifier transformation do not match?
Consider the following scenario: I have a flat file source with 1000 records. I want the 1st row to go to the 1st target, the 2nd row to the 2nd target, and the 3rd row to the 3rd target. How will you do this?
How do we eliminate duplicate records in a flat file without using Sorter and Aggregator?
Hi all, I am new to this site and new to Informatica too, and I have a few questions. 1) When we load flat files into target tables, how do we identify duplicates, and where do we load the duplicate records for further reference? 2) How do we do change data capture? Is this the Slowly Changing Dimension technique? Thanks in advance.
Why do we use ENABLE HIGH PRECISION in session properties?
Why is the Union transformation an active transformation?
My flat file source is:

C_Id  1-nov-2011  8-nov-2011
100   2000        1500
101   2500        2000

I want my target as:

C_Id  Week_Num  Amt
100   45        2000
100   46        1500
101   45        2500
101   46        2000
What is the difference between source-based commit and target-based commit? Which is better with respect to performance?
If I have an index defined on the target table and I set the session to bulk load, will it work?