Database normalization is a technique for designing
relational database tables that minimizes duplication of information,
thereby protecting the database
against certain types of logical or structural problems.
For example, when multiple copies of a data item occur in a
table, the copies may become inconsistent when the data within
the table is updated. This results in a loss
of data integrity.
A table that is sufficiently normalized is less vulnerable to such
data anomalies because its structure follows the principle that
each piece of information should be represented
as a single instance only.
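The update anomaly described above can be sketched with Python's built-in sqlite3 module. The table and column names here are hypothetical, used only for illustration: a flat table repeats a customer's city on every order row, so a partial update leaves it inconsistent, while the normalized design stores the city once.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Unnormalized: the customer's city is repeated on every order row.
conn.execute("CREATE TABLE orders_flat (order_id INTEGER, customer TEXT, city TEXT)")
conn.executemany("INSERT INTO orders_flat VALUES (?, ?, ?)",
                 [(1, "Acme", "Boston"), (2, "Acme", "Boston")])

# An update that touches only one row leaves the table inconsistent:
conn.execute("UPDATE orders_flat SET city = 'Chicago' WHERE order_id = 1")
flat_cities = {row[0] for row in conn.execute(
    "SELECT city FROM orders_flat WHERE customer = 'Acme'")}
print(flat_cities)  # two conflicting cities for the same customer

# Normalized: the city is stored once, keyed by customer.
conn.execute("CREATE TABLE customers (name TEXT PRIMARY KEY, city TEXT)")
conn.execute("CREATE TABLE orders (order_id INTEGER, "
             "customer TEXT REFERENCES customers(name))")
conn.execute("INSERT INTO customers VALUES ('Acme', 'Boston')")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, "Acme"), (2, "Acme")])

# Now a single UPDATE changes the city everywhere it is seen via a join.
conn.execute("UPDATE customers SET city = 'Chicago' WHERE name = 'Acme'")
cities = {row[0] for row in conn.execute(
    "SELECT c.city FROM orders o JOIN customers c ON o.customer = c.name")}
print(cities)  # a single consistent value
```

In the normalized version the fact "Acme is in Chicago" exists in exactly one row, so it cannot disagree with itself.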
The trade-off is that more tables, and therefore more
joins, are required, which can reduce query performance. For this
reason, highly normalized tables are typically used in database
applications involving many isolated transactions; less normalized
tables tend to be used when complex relationships between data
entities and data attributes must be mapped.
Normalization is described in degrees of strictness, and each
table has a highest normal form (HNF). The normal forms are
cumulative: a table in third normal form (3NF) also satisfies
second normal form (2NF), but the reverse does not hold.
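The step from 2NF to 3NF can be illustrated with a sketch, again using sqlite3 and hypothetical names: a table where a department's location depends on the department, not on the employee key, is in 2NF but not 3NF; moving the transitive dependency into its own table produces a 3NF design.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# 2NF but not 3NF: dept_location depends on emp_id only transitively
# (emp_id -> dept -> dept_location), so it is duplicated for every
# employee in the same department.
conn.execute("""CREATE TABLE employees_2nf (
    emp_id INTEGER PRIMARY KEY,
    dept TEXT,
    dept_location TEXT)""")

# 3NF decomposition: the transitive dependency moves to its own table,
# so each department's location is stored exactly once.
conn.execute("""CREATE TABLE departments (
    name TEXT PRIMARY KEY,
    location TEXT)""")
conn.execute("""CREATE TABLE employees (
    emp_id INTEGER PRIMARY KEY,
    dept TEXT REFERENCES departments(name))""")

tables = {row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")}
print(tables)
```

Because the normal forms are cumulative, the decomposed design still satisfies 2NF; the decomposition only removes the remaining transitive dependency.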