

A flat file contains 200 records. I want to load the first 50
records on the first run of the job, the next 50 records on
the second run, and so on. How can you develop this job?




Answer / subhash

1st Way:
1. Add a 'row number' column in the Sequential File stage, so
that each record has a number associated with it.
2. Add a job parameter through which we can supply the record
number from which the job should start. This value can be
passed either from a Sequence Start Loop (list-type variable:
50, 100, 150, 200) or from a shell script.
3. In the Transformer, use a stage variable to count records
and pass through only the 50 records starting from that record
number (see the sketch after this list).
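
The same windowing logic can be sketched outside DataStage; this is only an illustration in Python, where start_row stands in for the job parameter, CHUNK_SIZE for the stage-variable count, and the file names are made up:

```python
# Illustrative only: mimics the job-parameter + stage-variable logic.

CHUNK_SIZE = 50  # records loaded per run

def load_chunk(source_path, target_path, start_row):
    """Write records [start_row, start_row + CHUNK_SIZE) to the target."""
    with open(source_path) as src, open(target_path, "w") as tgt:
        for row_number, record in enumerate(src, start=1):  # the 'row number' column
            if start_row <= row_number < start_row + CHUNK_SIZE:
                tgt.write(record)

if __name__ == "__main__":
    # Run 1: start_row=1, run 2: start_row=51, and so on, as a
    # Sequence Start Loop or shell script would supply it.
    load_chunk("input.txt", "target.txt", start_row=1)
```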

2nd Way:
Design the job like this:
1. Add a 'row number' column in the Sequential File stage, so
that each record has a number associated with it.
2. Use a Filter stage with two conditions:
a. row number <= 50 (1st link, to load the records into the
target file/database)
b. row number > 50 (2nd link, to write the remaining records
back to a file with the same name as the input file, in
overwrite mode)


So, the first time the job runs, the first 50 records are
loaded into the target, and at the same time the input file
is overwritten with the remaining records, i.e. 51 to 200.
The second time the job runs, the first 50 records of the
rewritten file (i.e. 51-100) are loaded into the target, and
the input file is again overwritten with the remainder, i.e.
101 to 200.
And so on: 50 records are loaded into the target on each run.
The sketch below mimics this consume-and-rewrite behaviour.
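
A minimal Python sketch of the 2nd way, assuming a plain text source file; the file names and CHUNK_SIZE are illustrative stand-ins for the two Filter-stage links:

```python
# Illustrative only: each run loads the first 50 records and
# overwrites the source with the unprocessed remainder.

CHUNK_SIZE = 50  # records loaded per run

def run_once(source_path, target_path):
    with open(source_path) as src:
        records = src.readlines()

    head, tail = records[:CHUNK_SIZE], records[CHUNK_SIZE:]

    # 1st link: load this run's 50 records into the target (append).
    with open(target_path, "a") as tgt:
        tgt.writelines(head)

    # 2nd link: overwrite the source with the remainder, so the
    # next run picks up the next 50 records automatically.
    with open(source_path, "w") as src:
        src.writelines(tail)

if __name__ == "__main__":
    run_once("input.txt", "target.txt")  # invoke once per job run
```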



More Data Stage Interview Questions

Hi, I did what you mentioned in the answer, i.e. source -> Transformer -> 3 datasets. I am able to see the data in the datasets, but it is not in sorted order... Can you tell me how to sort the data? I also checked hash partitioning with perform sort.

1 Answer   CGI,


1. What is a staging area? What is a stage variable?
2. This is my source:
source: id, name
100, murty
100, madan
100, saran
target: id, name
100, madan
We have three duplicate records for the id column; how can we get that record from the source?

1 Answer   HCL,


I have a project manager round on Saturday this week. Can you post the main questions I should prepare for? If you have any idea of the usual project questions, please send them to me. Thanks in advance.

2 Answers   IBM,


What is the difference between a server shared container and a parallel shared container?

6 Answers   CTS,


One file contains col1: 100, 200, 300, 400, 500, 100, 300, 600, 300. From this I want to retrieve only the duplicates, like this: tr1: 100, 100, 300, 300, 300. How is this possible in DataStage? Can anyone please explain clearly?

8 Answers   IBM,


What is OCI?

0 Answers  


Could anyone give a brief explanation about DataStage administration?

0 Answers  


What is IBM DataStage Flow Designer?

0 Answers  


1) Converting vertical PIVOTing without using the PIVOT stage in DataStage. Example:
Input:
DEPT_NO  EMPNAME
10       Subhash
10       Suresh
10       Sravs
Output:
DEPT_NO  EMP1     EMP2    EMP3
10       Subhash  Suresh  Sravs
2) How to implement horizontal PIVOTing without using the PIVOT stage?

3 Answers   Cognizant, UHG,


I have a source file which contains duplicate data. My requirement is that unique data should pass to one file and duplicate data should pass to another file. How?

7 Answers   CTS,


How can we move a DataStage job from the development environment to the testing environment with the help of a DataStage job, using Unix commands?

5 Answers  


Two source files contain the same metadata; a third file contains different data types. Can I funnel that file?

2 Answers  

