What is denormalization and when would you go for it?
Answer posted by ravindra gaikwad
In computing, denormalization is the process of trying to optimize the read performance of a database by adding redundant data or by grouping data. In some cases, denormalization is a means of addressing performance or scalability problems in relational database software. You would typically go for it when a read-heavy workload is slowed down by expensive joins: duplicating a few frequently read columns can eliminate those joins, at the cost of extra storage and extra work to keep the redundant copies consistent on update.
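As a minimal sketch of the trade-off described above, using an in-memory SQLite database (the table and column names are purely illustrative, not from the original answer): the normalized design needs a join on every read, while the denormalized design copies the customer name into the orders table so the hot read path touches one table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized design: the customer name lives only in `customers`,
# so reading an order together with its customer's name needs a join.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, "
            "customer_id INTEGER, amount REAL)")
cur.execute("INSERT INTO customers VALUES (1, 'Alice')")
cur.execute("INSERT INTO orders VALUES (100, 1, 25.0)")

row = cur.execute(
    "SELECT c.name, o.amount "
    "FROM orders o JOIN customers c ON c.id = o.customer_id"
).fetchone()
print(row)  # ('Alice', 25.0)

# Denormalized design: the customer name is copied into the orders
# table. The read avoids the join, but every customer rename now has
# to update the redundant copies as well.
cur.execute("CREATE TABLE orders_denorm (id INTEGER PRIMARY KEY, "
            "customer_id INTEGER, customer_name TEXT, amount REAL)")
cur.execute("INSERT INTO orders_denorm VALUES (100, 1, 'Alice', 25.0)")

row_denorm = cur.execute(
    "SELECT customer_name, amount FROM orders_denorm"
).fetchone()
print(row_denorm)  # ('Alice', 25.0)
```

Both queries return the same answer; the difference is purely in the read path and in the maintenance burden the redundant `customer_name` column creates.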
A) Which two are benefits of Teradata's support for ANSI Standard SQL? (Choose two.)
1. data is distributed automatically
2. queries get optimized to better plans
3. submit queries from other database systems
4. can interface with BI tools
B) Which statement is true when comparing the advantages of third normal form to star schema?
1. Star schema tends to have fewer entities.
2. Star schema requires additional data storage.
3. Third normal form tends to have fewer entities.
4. Third normal form requires additional data storage.
C) Which two sets of functions does the Parsing Engine (PE) perform? (Select two.)
1. sorting, formatting, and aggregating of the final answer set
2. flow control of the data to and from the participating tables
3. SQL statement interpretation, syntax validation, and semantic evaluation
4. dispatching the step execution sequence to the AMP via the BYNET
D) Which two can be achieved with Teradata Active System Management (TASM)? (Choose two.)
1. disable hardware
2. react to hardware failure
3. influence response times
4. collect metadata
E) Which three mechanisms can be used to ensure security within the Teradata Database? (Choose three.)
1. views
2. spool limits
3. roles
4. access rights
5. profiles
What are the main phases of database development?
Which four data types cannot be used as a return type from a user-defined function?
Write short notes on manual refreshes.
Where is a DBMS used?
Do a fact table and a dimension table have a one-to-many relationship or a many-to-one relationship?
There is a trigger defined for INSERT operations on a table in an OLTP system. The trigger instantiates a COM object and passes the newly inserted rows to it for some custom processing. What do you think of this implementation? Could it be implemented better?
How do you use online backups?
What is an application role and explain a scenario when you would use one?
How would you design a database for an online site, which would average a million hits a day?
Explain the relational join operator.
Explain the network model.
How can I detect whether a given connection is blocked?
Write the fastest query to find out how many rows exist in a table?
What can you do to remove data from the cache and query plans from memory when testing the performance of a query repeatedly?