Case Study: Database Development
Maintenance plans and activities that can be performed to improve data quality
To improve the quality of datasets, several activities can be performed: error detection and correction, process design, and process control and improvement. In the error detection method, errors are detected by comparing data against a correct baseline and by checking for missing values. Examining timestamps can reveal outdated or incorrect data, and applying correction policies can identify faults in the data.
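The error-detection checks described above can be sketched as a small routine. This is a minimal illustration, not a method named in the text: the record layout, the function name `detect_errors`, and the `MAX_AGE` staleness threshold are all assumptions made for the example.

```python
from datetime import datetime, timedelta

# Hypothetical record format: dicts with "id", "value", and "updated_at".
# MAX_AGE is an assumed staleness threshold for the timestamp check.
MAX_AGE = timedelta(days=30)

def detect_errors(records, baseline, now):
    """Flag missing values, deviations from a correct baseline,
    and records whose timestamp suggests the data is outdated."""
    issues = []
    for rec in records:
        rid = rec.get("id")
        # Missing-value check
        if rec.get("value") is None:
            issues.append((rid, "missing value"))
        # Baseline comparison: baseline maps id -> known-correct value
        elif rid in baseline and rec["value"] != baseline[rid]:
            issues.append((rid, "deviates from baseline"))
        # Timestamp check: old data may be incorrect
        if now - rec["updated_at"] > MAX_AGE:
            issues.append((rid, "stale timestamp"))
    return issues
```

A correction policy would then decide, per issue type, whether to repair the value from the baseline, refresh it from the source, or quarantine the record.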
In process control and improvement, the drawbacks of the error-detection implementation are identified; to overcome them, the Total Quality Management methodology …
Data quality measurement is performed to optimize performance, using data quality scorecards.
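A data quality scorecard can be as simple as a per-field score rolled up into an overall score. The sketch below scores only completeness (fraction of non-missing values); a real scorecard would add dimensions such as accuracy and consistency. The function name and scoring scheme are assumptions for illustration.

```python
def quality_scorecard(rows, fields):
    """Compute a per-field completeness score (fraction of rows where the
    field is present and non-empty) plus an overall average score."""
    scores = {}
    for f in fields:
        filled = sum(1 for r in rows if r.get(f) not in (None, ""))
        scores[f] = filled / len(rows) if rows else 0.0
    # Overall score: unweighted average across the measured fields
    scores["overall"] = sum(scores[f] for f in fields) / len(fields)
    return scores
```

Tracking such scores over time shows whether the maintenance activities are actually improving the data.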
3. By integrating data quality into the application or into the system development life cycle (SDLC) according to business requirements.
Within the SDLC, the data quality analyst synthesizes data quality requirements during the data quality requirements analysis phase and determines the relevant data quality dimensions. The analyst then proposes data control methods for data transformation or data extraction. In the data quality inspection and monitoring process, data validation rules are applied. After implementation, data quality is integrated into the SDLC, and data validation and correction are performed at the final stage. Concurrency control methods and lock …
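The data validation rules mentioned above can be expressed as named predicates applied to each row. This is a minimal sketch; the specific rules (`age_in_range`, `email_has_at`) and the report format are hypothetical examples, not rules from the text.

```python
# Hypothetical validation rules: (name, predicate) pairs applied per row.
RULES = [
    ("age_in_range", lambda row: row.get("age") is not None and 0 <= row["age"] <= 130),
    ("email_has_at", lambda row: row.get("email") is not None and "@" in row["email"]),
]

def validate(rows, rules):
    """Apply every rule to every row; pair each row with the names
    of the rules it failed, for downstream correction."""
    report = []
    for row in rows:
        failed = [name for name, check in rules if not check(row)]
        report.append((row, failed))
    return report
```

The failed-rule names feed the correction step at the end of the life cycle: rows with an empty failure list pass through, the rest are repaired or rejected.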
The most widely used methods for concurrency control are two-phase locking, timestamp-based concurrency control, multiversion two-phase locking using certify locks, the lock compatibility matrix, and lock granularity. Concurrency control enforces isolation among conflicting transactions and preserves consistency.

Two-phase locking (2PL) is a protocol in which a transaction acquires all the read and write permissions it needs on data items before releasing any of them. Locking grants a permission on a data item; unlocking removes it. There are two lock modes: shared and exclusive. In shared mode, several transactions can hold a shared lock on the same item for reading, provided no write lock is held on it. In exclusive mode, a single transaction holds the lock and may both read and write the item. Basic 2PL guarantees serializability but does not by itself prevent deadlock; in conservative 2PL, a transaction locks all its data items before execution begins, which does prevent deadlock. Deadlocks can also be avoided by rolling back a victim transaction using the wound-wait and wait-die algorithms, which rely on the timestamp method of concurrency control.

Granularity is the lockable unit of data: the degree of concurrency is low for coarse granularity and high for fine granularity. Multiple granularity locking is implemented with three intention lock types: intention shared (finer-grained shared locks will be requested below this node), intention exclusive (finer-grained exclusive locks will be requested), and shared intention exclusive (the node is locked in shared mode and finer-grained exclusive locks will be requested).
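The shared/exclusive compatibility rules above can be sketched as a small lock table. This is an illustrative toy, not a production lock manager: the class name, the grant/deny return convention, and the single-threaded bookkeeping are all assumptions; a real DBMS would block waiters, detect deadlocks, and handle lock upgrades.

```python
# Lock compatibility matrix for shared (S) and exclusive (X) modes:
# (held mode, requested mode) -> is the request compatible?
COMPATIBLE = {
    ("S", "S"): True,   # many readers may share an item
    ("S", "X"): False,  # a writer conflicts with readers
    ("X", "S"): False,  # readers conflict with a writer
    ("X", "X"): False,  # writers exclude each other
}

class LockTable:
    """Toy lock table: grants a request only if it is compatible
    with every lock currently held on the item by other transactions."""

    def __init__(self):
        self.held = {}  # item -> list of (transaction, mode)

    def acquire(self, txn, item, mode):
        for other_txn, other_mode in self.held.get(item, []):
            if other_txn != txn and not COMPATIBLE[(other_mode, mode)]:
                return False  # in a real system the requester would wait
        self.held.setdefault(item, []).append((txn, mode))
        return True

    def release(self, txn, item):
        self.held[item] = [(t, m) for t, m in self.held.get(item, []) if t != txn]
```

Under 2PL each transaction would call `acquire` only in its growing phase and `release` only in its shrinking phase; a denied request is where wound-wait or wait-die would compare timestamps to pick a victim instead of letting the transactions wait on each other forever.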