
1) What is your project architecture?
2) How do you move a project from development to UAT?
3) What is the difference between DataStage 6, 7.1 and DataStage 7.5?
4) How do you do error handling in DataStage?
5) What is unit testing, system testing and integration testing?
6) What is the exact difference between the BASIC Transformer and the normal Transformer? When would we go for the BASIC or the normal Transformer?

7) Why do we use third-party tools in DataStage?
8) What is the purpose of the Debugging stages? Where would we use them in real time?

Answer Posted / ramya

Answer for 6th question.

1. The Transformer stage is inherent PX functionality, whereas the BASIC Transformer uses a Server interface to call a Server Transformer stage. There is a severe performance impact as well as partitioning limitations, but it does give a PX job some access to existing Server functionality.
2. There are two types of transformer: i. the BASIC Transformer and ii. the parallel (native) Transformer. The BASIC Transformer can be used on SMP systems but not on MPP or cluster systems (BASIC is the language supported by the DataStage server engine and is available in Server jobs). In DataStage PX, the parallel Transformer is used.
3. Transformer stages are always active stages. The BASIC Transformer stage is part of the Server product, but the PX engine allows this stage to be called (the opposite, using a PX stage in a Server job, is not possible).

4.
In parallel jobs there are two very different stage types. One is called the Transformer stage, the other is called the BASIC Transformer stage.

Parallel jobs do not use the DataStage BASIC run machine; instead they use a C-based environment that is/was called osh (from "Orchestrate shell", Orchestrate being the original name of the parallel execution environment, which Ascential acquired when it purchased Torrent).

This is why the Transformer stage in parallel jobs does not use BASIC. The BASIC language is not supported in osh.

The BASIC Transformer stage in parallel jobs causes osh to
invoke the DataStage BASIC run machine for the stage to be
executed. This obviously is an extra overhead.


5. The Transformer stage in parallel jobs can NOT access
BASIC functions, etc. It uses a completely different
expression editor, with a largely equivalent list of
operators, and a different, though overlapping, list of
functions and other operands (but they're not BASIC
functions and operators).

The BASIC Transformer stage in parallel jobs is a direct equivalent of the Transformer stage in server jobs. Its use will carry a performance overhead, because the BASIC run machine will also need to be loaded when the parallel job is executed. But the convenience might well outweigh this consideration, particularly given the richness of the function set in BASIC.
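
To make the difference concrete, here is an illustrative sketch (not part of the original answer) of the same date-reformatting derivation written for each stage. The link and column names (DSLink1.MyDate) are made up for illustration, and exact format strings can vary between DataStage versions.

BASIC Transformer derivation (Server-style expression using the BASIC Iconv/Oconv functions):

    Oconv(Iconv(DSLink1.MyDate, "D/YMD[4,2,2]"), "D-DMY[2,2,4]")

Parallel Transformer derivation (native PX expression using the parallel function set):

    DateToString(StringToDate(DSLink1.MyDate, "%yyyy/%mm/%dd"), "%dd-%mm-%yyyy")

Both are intended to turn a string such as "2008/01/15" into "15-01-2008", but the first expression is only valid in a BASIC Transformer (or a Server job), while the second uses functions that exist only in the parallel Transformer's expression editor.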






Please help members by posting answers to the questions below.

What are the steps needed to create a simple basic DataStage job?

What are stage variables?

Define DataStage?

How many types of sorting methods are available in DataStage?

What are the different common services in DataStage?

Whom do you report to?

How do you import and export DataStage jobs?

What are the important features of DataStage?

Can you explain repository tables in DataStage?

How do you clean the DataStage repository?

Hi, I am facing a typical problem in every interview: "I need some critical scenarios faced in real time." Please help me, guys.

In workload management there are three options, Low priority, Medium priority and High priority jobs, which can be used for resource management. Why was this feature developed when jobs are already prescheduled by a scheduler or AutoSys? What is the use of workload management then?

How do you manage date conversion in DataStage?

Explain the connectivity between DataStage and data sources?

How will you move a hashed file from one location to another?
