What is the usage of DataStage with materialized views?
Answer / syed naseruddin

Materialized views are useful in two places in DataStage:
1) when loading data from the standard interface (staging area) to the data warehouse;
2) when loading from the data warehouse to the reporting area.

Note: "standard interface" means the area into which we load standardized data, using DataStage, from different flat files and the staging area. On top of the standardized data we perform transformations according to business requirements and then load the data into the warehouse; in between, materialized views are used to refresh the data automatically. The same applies when loading data from the warehouse for reporting.
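As a minimal sketch of the "automatically refresh" idea above, here is an Oracle materialized view over a warehouse fact table that rebuilds itself once a day. The table and column names (sales_fact, region, sale_date, amount) are hypothetical examples, not from the original question:

```sql
-- Hypothetical fact table: sales_fact(region, sale_date, amount).
-- The view pre-aggregates daily sales for the reporting layer and
-- refreshes itself on a schedule, with no extra DataStage job needed.
CREATE MATERIALIZED VIEW mv_daily_sales
  BUILD IMMEDIATE
  REFRESH COMPLETE
  START WITH SYSDATE NEXT SYSDATE + 1   -- refresh once per day
AS
SELECT region,
       TRUNC(sale_date) AS sale_day,
       SUM(amount)      AS total_amount
FROM   sales_fact
GROUP  BY region, TRUNC(sale_date);
```

Reporting tools then query mv_daily_sales instead of the raw fact table, reading the pre-computed aggregates.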
Answer / hari

Hi kpk,

I think your answer is correct for Oracle views, but the question is how we can use materialized views in DataStage. In my view there is no need for materialized views in DataStage itself; they are used at reporting time (in OLAP).
Answer / kpk

Materialized views are similar to normal views, but the difference is that materialized views are physically stored in the database and are refreshed periodically to pick up new data.
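The "refreshed periodically" part can be made incremental in Oracle: a materialized view log records changed rows so a fast refresh applies only the deltas instead of rebuilding the whole view. A sketch, again using a hypothetical sales_fact table:

```sql
-- The log captures row-level changes on the base table.
CREATE MATERIALIZED VIEW LOG ON sales_fact
  WITH ROWID (region, amount)
  INCLUDING NEW VALUES;

-- A single-table materialized view eligible for fast refresh
-- (it must carry the base table's ROWID).
CREATE MATERIALIZED VIEW mv_sales_fast
  REFRESH FAST ON DEMAND
AS
SELECT ROWID AS row_id, region, amount
FROM   sales_fact;

-- Trigger a fast refresh manually, e.g. as an after-load step
-- once a DataStage job finishes writing to sales_fact:
BEGIN
  DBMS_MVIEW.REFRESH('MV_SALES_FAST', method => 'F');  -- 'F' = fast (incremental)
END;
/
```

ON DEMAND plus an explicit DBMS_MVIEW.REFRESH call is one common pattern for keeping the reporting layer in step with an ETL load; REFRESH ON COMMIT is an alternative when transactional freshness is required.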
Answer / srinu

Materialized views can be used with the Lookup stage for data references. The lookup does not import the physical data onto the local machine; it only references the data through the materialized view.