If the number of source columns changes every time (the first time it is 10, the next time it is 20, and so on), how do you deal with it without changing the mapping?
Answers were Sorted based on User's Feedback
Answer / shrikant
Hello,
If I understand this question properly, it says that the number of "source" columns is changing.
I do not agree with this scenario. You will probably not find such a design in data warehousing. As far as a DWH is concerned, it takes data from the OLTP systems and, after performing some operations (E - extract, T - transform), finally loads the data into the targets.
As posed, the question really concerns the OLTP design, and no OLTP system (or database design principle) suggests a varying number of columns.
So please do not get confused by this kind of trivial question. DWH is a very disciplined subject and it follows very good standards. Please go through the concepts first; you will then get a clear picture of DWH.
| Is This Answer Correct ? | 18 Yes | 3 No |
Answer / santosh sinha
This is really a confusing question and it raises questions about the OLTP design. However, if we do not change the columns that are actually used in our mapping, the mapping will not be affected; simply increasing the number of source columns has no effect.
| Is This Answer Correct ? | 2 Yes | 0 No |
Answer / sam
If your source is XML, this kind of scenario can arise. Take the complete XML DDL (with the maximum number of columns) as the source definition.
| Is This Answer Correct ? | 0 Yes | 0 No |
Answer / suganthi
We do have a design like this (in a DWH to data mart load). To accomplish it, we created a procedure that adds the new columns to the table.
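As an illustration only (not from the answer above), a column-adding procedure of this kind could look roughly like the following Python sketch; the table name, column list, and default datatype are all hypothetical, and a real procedure would map datatypes properly:

```python
# Hypothetical sketch: add any columns present in the incoming feed but
# missing from the staging/data mart table before the load runs.
import sqlite3  # stands in here for any DB-API 2.0 connection

def add_missing_columns(conn, table, incoming_columns, datatype="TEXT"):
    """Compare the incoming column list with the table and ALTER TABLE for the gaps."""
    cur = conn.execute(f"PRAGMA table_info({table})")  # SQLite-specific metadata query
    existing = {row[1] for row in cur.fetchall()}      # row[1] is the column name
    for col in incoming_columns:
        if col not in existing:
            # Single default datatype is a simplifying assumption.
            conn.execute(f"ALTER TABLE {table} ADD COLUMN {col} {datatype}")
    conn.commit()

# Example usage with an in-memory database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE product_dm (product_id INTEGER, name TEXT)")
add_missing_columns(conn, "product_dm", ["product_id", "name", "colour", "weight"])
```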
| Is This Answer Correct ? | 0 Yes | 0 No |
Answer / bidhar
If this is the requirement, then you will have to re-import the source metadata in the Source Analyzer and save it so that the new columns are reflected in the mapping source. But to pull them through to the target you need to connect the pipelines manually.
Such scenarios rarely come up in OLAP models. A good DWH design should be built in such a way that if any new columns appear in the OLTP systems in the future, the DWH can accommodate them.
For example, suppose the Product table has 20 columns in the OLTP system. The OLAP model can then have more than 20 columns for the Product dimension (ATTRIBUTE_1, ATTRIBUTE_2, and so on) and populate null values into the extra ones.
Now, whenever new columns are added to the OLTP Product table, we can simply map the new columns to the ATTRIBUTE columns, keeping the datatypes in mind.
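To make the generic-column idea concrete, here is a minimal Python sketch (not part of the original answer; the column names and the number of spare ATTRIBUTE slots are assumptions) that maps however many OLTP product columns arrive onto a fixed dimension layout, padding the unused slots with nulls:

```python
# Hypothetical sketch: map a varying set of OLTP columns onto a fixed
# dimension layout with spare generic columns (ATTRIBUTE_1 .. ATTRIBUTE_N).

FIXED_KEYS = ["PRODUCT_ID", "PRODUCT_NAME"]  # columns assumed to always exist
SPARE_SLOTS = 5                              # number of generic columns in the dimension

def to_dimension_row(oltp_row: dict) -> dict:
    """Return a row with a fixed layout regardless of how many OLTP columns arrive."""
    dim_row = {key: oltp_row.get(key) for key in FIXED_KEYS}
    # Any extra OLTP columns are pushed, in order, into the generic ATTRIBUTE columns.
    extras = [v for k, v in oltp_row.items() if k not in FIXED_KEYS]
    for i in range(SPARE_SLOTS):
        dim_row[f"ATTRIBUTE_{i + 1}"] = extras[i] if i < len(extras) else None
    return dim_row

# First load carries 3 source columns, a later load carries 4;
# the target layout stays the same either way.
print(to_dimension_row({"PRODUCT_ID": 1, "PRODUCT_NAME": "Pen", "COLOUR": "Blue"}))
print(to_dimension_row({"PRODUCT_ID": 2, "PRODUCT_NAME": "Ink", "COLOUR": "Black", "SIZE": "50ml"}))
```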
| Is This Answer Correct ? | 0 Yes | 0 No |
Answer / arnab
This should not happen when executing a data warehousing project; it reflects poor design and poor visualisation of the requirements.
The safest way to handle it is to re-import the source definition. Someone suggested creating a source with the maximum number of columns, but there is something called quality of code, and coding like that is really a bad example of Informatica coding.
| Is This Answer Correct ? | 0 Yes | 0 No |
Answer / sathish rajaiah
Create the source definition with the maximum number of fields (depending on the maximum number of columns you get in the source) and use that in the mapping.
Example: create a source definition with 20 fields and use it in the mapping; this will handle a source that arrives with anywhere between 1 and 20 columns.
Note: the above works only for flat file sources, not for RDBMS sources.
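Outside Informatica, the same padding idea can be sketched in a few lines of Python (an illustration only; the file name and the 20-field limit are taken from the example above, not from any real project):

```python
# Hypothetical sketch: read a delimited flat file whose records may carry
# anywhere from 1 to MAX_FIELDS columns, and pad the rest with empty strings
# so every row matches a fixed 20-field source definition.
import csv

MAX_FIELDS = 20  # matches the source definition created with the maximum column count

def read_padded(path):
    with open(path, newline="") as f:
        for record in csv.reader(f):
            # Pad short records; a record longer than MAX_FIELDS would signal
            # that the source definition itself needs to change.
            yield (record + [""] * MAX_FIELDS)[:MAX_FIELDS]

# Example usage: rows with 3, 10, or 20 delimited values all come out as 20 fields.
# for row in read_padded("products.csv"):
#     print(len(row), row)
```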
| Is This Answer Correct ? | 0 Yes | 6 No |
If I have an index defined on the target table and I set the session to bulk load, will it work?
Why can't we put a Sequence Generator or Update Strategy transformation before a Joiner transformation?
Has anyone faced an Informatica (ETL/Developer) or data warehousing interview in the UK? If so, please help me (I have about 3 years of experience in Informatica, data warehousing, Oracle, and Teradata). 1. How does the procedure work there? 2. What type of questions do they ask? 3. Which areas do they concentrate on more? Since this is the first time I am facing an interview in the UK, please help ASAP; it would be a great help for me. Thanks to all in advance.
How many input parameters can exist in an unconnected lookup?
Define Update Override?
Which transformation is needed while using the Cobol sources as source definitions?
Hi real-timers, I am waiting for your reply regarding ETL testing.
How do you join two flat files using the Joiner transformation if there is no matching port?
Informatica and data warehousing courses in Pune?
Delete data from the staging table as it loads to the target table. Here is the case: we are getting data from 3 different servers, A, B, and C. The data from server A was loaded into the staging table, we ran the task, and the data was loaded into the target table. Today, data from servers B and C was also loaded into the staging table. What techniques and transformations should be used to delete from staging only the data that has already been loaded into the target? Looking forward to your responses.
Why do we use the Source Qualifier transformation?
How do you extract original records to one target and duplicate records to another target?