Explain the scenario for the bulk loading and normal
loading options in Informatica Workflow Manager.
Answers were Sorted based on User's Feedback
Answer / rekha
Normal load: it loads the records one by one and writes a log
entry for each row, so it takes more time to complete.
Bulk load: it loads a number of records at a time and does not
write to the database log or follow trace levels, so it takes less time.
Use bulk mode to improve session performance.
Is This Answer Correct ? | 49 Yes | 4 No |
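The row-by-row vs. batched difference described above can be sketched outside Informatica. The following is a minimal illustration (not Informatica itself), using Python's sqlite3 as a stand-in target database: one writer sends an INSERT per record the way a normal-mode writer does, the other hands the whole batch to the driver at once, the way a bulk writer does.

```python
# Illustrative sketch only: contrasts row-by-row inserts ("normal load")
# with a single batched insert ("bulk load") against an in-memory SQLite
# database. None of these names are Informatica APIs.
import sqlite3
import time

rows = [(i, f"name_{i}") for i in range(50_000)]

def normal_load(conn):
    # One INSERT per record, the way a normal-mode writer sends rows.
    cur = conn.cursor()
    for row in rows:
        cur.execute("INSERT INTO t VALUES (?, ?)", row)
    conn.commit()

def bulk_load(conn):
    # All records handed to the driver at once, like a bulk writer.
    conn.executemany("INSERT INTO t VALUES (?, ?)", rows)
    conn.commit()

for loader in (normal_load, bulk_load):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")
    start = time.perf_counter()
    loader(conn)
    elapsed = time.perf_counter() - start
    count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
    print(f"{loader.__name__}: {count} rows in {elapsed:.3f}s")
    conn.close()
```

Both loaders end with the same row count; the batched version is typically noticeably faster, which mirrors the performance claim in the answer above.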
Answer / addy
Hi
I would like to mention that, apart from the logging differences,
there is a crucial difference: if we run the session in bulk
target load mode, we cannot recover the session from the point
of failure when we run it the next time, whereas if we run the
session in normal target load mode, we can recover the session,
provided the output being loaded to the target is deterministic
in nature. This is a really helpful feature for applications
handling real-time data.
-addy
Is This Answer Correct ? | 31 Yes | 4 No |
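The recovery difference addy describes can be shown with a small, hypothetical sketch. All names here are illustrative, not Informatica APIs: the point is only that a normal-style loader that records a checkpoint after each committed row can resume after a failure, while a bulk-style loader that stages everything has nothing to resume from.

```python
# Hypothetical sketch of session recovery. A "normal" loader commits
# row by row and records a checkpoint, so a restarted run resumes where
# it left off. A "bulk" loader only makes its rows visible on complete
# success, so recovery means a full reload.

def load_normal(rows, target, state, fail_at=None):
    """Load rows one at a time, checkpointing progress as we go."""
    start = state.get("checkpoint", 0)       # resume point from last run
    for i in range(start, len(rows)):
        if fail_at is not None and i == fail_at:
            raise RuntimeError("session failed mid-load")
        target.append(rows[i])
        state["checkpoint"] = i + 1          # durable progress marker

def load_bulk(rows, target, fail_at=None):
    """Load everything in one shot; a failure leaves nothing reusable."""
    staged = []
    for i, row in enumerate(rows):
        if fail_at is not None and i == fail_at:
            raise RuntimeError("session failed mid-load")
        staged.append(row)
    target.extend(staged)                    # only visible on full success

rows = list(range(10))
target, state = [], {}

# First run fails partway through; the checkpoint records how far we got.
try:
    load_normal(rows, target, state, fail_at=6)
except RuntimeError:
    pass

# The recovery run resumes from the checkpoint instead of row 0.
load_normal(rows, target, state)
print(target)   # all 10 rows, none loaded twice
```

A failed `load_bulk` run, by contrast, leaves `target` empty, so the only option is to run the whole load again from the start.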
Answer / shashank
In normal loading, the target writes all the rows to the
database log, while in bulk loading the database log does not
come into the picture (it is bypassed). So when a session in
normal mode fails, we can easily recover the session with the
help of the database log, but in the case of bulk loading we
cannot.
However, normal loading is very slow compared to bulk loading.
Is This Answer Correct ? | 14 Yes | 3 No |
Answer / sujith
Answers 5, 6 & 7 above are correct for this question.
Is This Answer Correct ? | 8 Yes | 1 No |
Answer / thiru
In normal loading, the Integration Service creates the log
before loading the target. This takes more time, but session
recovery is available.
In bulk loading, the Integration Service bypasses the log and
loads directly into the target. Session recovery is not
available, but performance increases.
Is This Answer Correct ? | 7 Yes | 2 No |
Answer / sankar
Normal loading: the Integration Service creates the database
log before loading data into the target database, so
-- the Integration Service can perform rollback and session
recovery.
Bulk loading: the Integration Service invokes the database's
bulk utility and bypasses the database log.
-- This improves data-loading performance.
-- Rollback cannot be performed.
Is This Answer Correct ? | 6 Yes | 2 No |
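The rollback point above can also be sketched. The following is an illustration only, with SQLite standing in for the target database: a normal-style writer runs the whole load inside one transaction, so a mid-load failure rolls everything back; a bulk-style writer here is simulated with autocommit, so a failure leaves the partially loaded rows behind.

```python
# Illustrative only: transaction-backed load vs. an autocommit load
# that cannot roll back. SQLite stands in for the target database;
# these are not Informatica APIs.
import sqlite3

def write_transactional(conn, rows, fail_at):
    # Whole load in one transaction: a failure rolls everything back.
    conn.execute("BEGIN")
    try:
        for i, row in enumerate(rows):
            if i == fail_at:
                raise RuntimeError("load failed")
            conn.execute("INSERT INTO t VALUES (?)", (row,))
        conn.execute("COMMIT")
    except RuntimeError:
        conn.execute("ROLLBACK")   # nothing from the failed run remains

def write_bulk_like(conn, rows, fail_at):
    # Autocommit: each row lands immediately, so a failure leaves
    # partial data that cannot be undone by the writer.
    try:
        for i, row in enumerate(rows):
            if i == fail_at:
                raise RuntimeError("load failed")
            conn.execute("INSERT INTO t VALUES (?)", (row,))
    except RuntimeError:
        pass

for writer in (write_transactional, write_bulk_like):
    # isolation_level=None puts sqlite3 in autocommit mode, so the
    # transactional writer manages BEGIN/COMMIT/ROLLBACK itself.
    conn = sqlite3.connect(":memory:", isolation_level=None)
    conn.execute("CREATE TABLE t (v INTEGER)")
    writer(conn, list(range(10)), fail_at=5)
    left = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
    print(f"{writer.__name__}: {left} rows remain after the failed load")
    conn.close()
```

After the simulated failure, the transactional writer leaves 0 rows and the autocommit writer leaves 5, which is the recoverability gap the answer describes.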
1) Bulk load & normal load
Normal: in this case, the server manager allocates
resources (buffers) as per the parameter settings. It creates
the log files in the database.
Bulk: in this case, the server manager allocates the maximum
resources (buffers) available, irrespective of the parameter
settings. It does not create any log files in the database.
In the first case, the data-loading process takes more time,
but other applications are not affected, while bulk data
loading is much faster, but other applications are affected.
Is This Answer Correct ? | 12 Yes | 13 No |
Answer / jyothsna katakam
When you select normal, it will check the primary-key and
foreign-key relationships while running the mapping, but when
you select bulk, it will not check any primary-key or
foreign-key relationships.
Is This Answer Correct ? | 7 Yes | 27 No |
In real time, in which situations can you use an unconnected Lookup transformation?
How many versions of Informatica have been developed so far?
What is the purpose of a surrogate key, and what is the difference between a primary key and a surrogate key?
What are conformed dimensions?
I have a table with a name field, e.g. the names Shankar, Prabhakar, Nitikripa. If 'a' occurs 3 times in a name, the row should go to target A; if 'b' occurs 3 times in a name, it should go to target B; ... if 'z' occurs 3 times in a name, it should go to target Z.
What is target load order?
Suppose we are using a dynamic Lookup in a mapping and the commit interval set for the target is 10000. How does the data get committed in the lookup if only 100 rows were read from the source and the dynamic lookup did not have the 100th row in it?
How to call the lookup qualifier in an unconnected Lookup?
What does role playing dimension mean?
If I have records like these in the source table:
rowid name
10001 gdgfj
10002 dkdfh
10003 fjfgdhgjk
10001 gfhgdgh
10002 hjkdghkfh
the target table should look like this, using an Expression transformation:
rowid name
10001 gdgfj
10002 dkdfh
10003 fjfgdhgjk
xx001 gfhgdgh
xx002 hjkdghkfh
(i.e., duplicate records should contain XX in their rowid)
List the transformations in Informatica.
How to obtain performance data for individual transformations?