We have one source table containing 100 records. We have to transfer the first set of 10 records (1-10) to one target table, the next set (11-20) to another target table, and continue like that until the 100th record.
Answer posted by kvikas2056
Use a Sequence Generator transformation to create a sequence (or create one with an Expression transformation), then in a Router transformation create 10 groups with conditions such as seq >= 1 AND seq <= 10, seq >= 11 AND seq <= 20, and so on, connecting each group to its own target table.
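The routing logic above can be sketched outside Informatica as well. This is a minimal Python illustration (not Informatica code) of what the Sequence Generator plus Router combination does: assign a running sequence number to each record and bucket records into groups of 10. The function name and the list-of-lists stand-in for the 10 target tables are assumptions for the sketch.

```python
def route_records(records, group_size=10, num_groups=10):
    """Mimic a Sequence Generator feeding a Router: number each record
    starting at 1, then route it to a group using conditions equivalent
    to seq >= 1 AND seq <= 10, seq >= 11 AND seq <= 20, etc."""
    # Each inner list stands in for one target table.
    targets = [[] for _ in range(num_groups)]
    for seq, record in enumerate(records, start=1):  # Sequence Generator
        group = (seq - 1) // group_size              # Router group index
        if group < num_groups:                       # default group dropped
            targets[group].append(record)
    return targets

targets = route_records(range(1, 101))
# targets[0] receives records 1-10, targets[1] receives 11-20,
# ..., targets[9] receives 91-100.
```

The integer-division form is equivalent to writing ten explicit range conditions in the Router; in Informatica you would spell each group's filter condition out explicitly.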
What are snapshots? What are materialized views & where do we use them? What is a materialized view log?
How would you estimate the size of the Aggregator transformation's data and index caches?
Explain the difference between etl tool and olap tools?
Explain do we need an etl tool? When do we go for the tools in the market?
Explain the methodology of data warehousing. (Polaries)
How do you handle performance issues in Informatica? Where can you monitor performance?
What is a mapping?
Hi, I have a mapping with a flat file source. The target update override property for the target table uses an UPDATE statement. There is no Update Strategy transformation between the source and target. The session's target properties have the Insert and "Update as Update" options checked. Does this mean that records will only be inserted and the update override will not be applied at all? Thanks
How can I test the accuracy of an ETL migration? I am very new to data warehousing. We are writing ETL scripts using the Scriptella tool. How can I test the correctness of the data? We are also generating reports using Pentaho; is there an easy way to test Pentaho? How can I test these ETL scripts written in Scriptella? Thanks in advance
What are three tier systems in etl?
A session S_MAP1 is in Repository A. While running the session, the error message `server hot-ws270 is connect to Repository B` was displayed. What does it mean?
What is a materialized view log?
What is dynamic insert?
Compare etl & elt?
Can Informatica be used as a cleansing tool? If yes, give examples of transformations that can implement a data cleansing routine.