
A flat file contains 200 records. I want to load the first 50 records the first time the job runs, the next 50 records the second time, and so on. How would you develop such a job? Please give the steps.

Answer Posted / subhash

An alternative to VARUN's solution:

1. Add a 'row number' column in the Sequential File stage, so that each record carries a sequence number.
2. Add a job parameter that tells the job which 50-record batch to load. It can be passed either from a sequence Start Loop activity (list-type variable: 50, 100, 150, 200) or from a shell script (see the sketch after these steps).
3. In the Transformer, use a stage variable to count records and compare the row number against the parameter, so that only the 50 records belonging to the current batch pass through.
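
For step 2, a driver script along the following lines could invoke the job once per 50-record batch, passing the loop value as a job parameter. This is only a minimal sketch: the project name (MyProject), job name (LoadFlatFile), and parameter name (BATCH_END) are placeholders, the loop values follow the 50/100/150/200 list above, and the dsjob command-line client is assumed to be on the PATH.

    #!/bin/sh
    # Minimal driver sketch (placeholder names throughout): run the job once
    # per 50-record batch, passing the loop value as the parameter BATCH_END.
    # The Transformer logic of step 3 is assumed to keep rows whose row number
    # is greater than BATCH_END - 50 and no greater than BATCH_END.
    for BATCH_END in 50 100 150 200
    do
        dsjob -run -jobstatus -param BATCH_END=$BATCH_END MyProject LoadFlatFile
        STATUS=$?
        # With -jobstatus, dsjob waits for the job and its exit code reflects
        # the finishing status; accept 0 (OK) and 1 (finished with warnings).
        if [ $STATUS -ne 0 ] && [ $STATUS -ne 1 ]
        then
            echo "Batch ending at record $BATCH_END did not finish cleanly (status $STATUS)" >&2
            exit 1
        fi
    done

The same loop values could instead be supplied by the Start Loop activity of a job sequence, in which case no script is needed.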






Please help members by posting answers to the questions below.

What are the types of containers and how to create them?



Explain the connectivity between DataStage and data sources.



How will you load your daily/monthly job data into fact and dimension tables using DataStage?



What is Ad-Hoc access? What is the difference between Managed Query and Ad-Hoc access?



Explain how a source file is populated?



Describe the main features of datastage?



Hi, I am new to this tool. I have finished learning about DataStage and am now on a project. Please tell me, step by step, what I should be doing; I want to work as someone with experience would, so please help.



What is the use of the SDR function?



What is APT_CONFIG in DataStage?



Can you explain the Kafka connector?



How do you manage date conversion in DataStage?



What are the functionalities of link partitioner?



How do you implement complex jobs in DataStage?



What are the different common services in datastage?



In workload management there are three options (low, medium, and high priority jobs) that can be used for resource management. Why was this feature developed when jobs are already prescheduled by a scheduler or AutoSys? What is the use of workload management in that case?
