- SS Murthy, Tumuluri Srinath, D Prem Chand, K Naveen Kumar
INTRODUCTION
A voltmeter finds use wherever a voltage is to be measured.
A voltmeter is an instrument used for measuring the electrical potential difference between two points in an electric circuit. Analog voltmeters move a pointer across a scale in proportion to the voltage of the circuit. General-purpose analog voltmeters may have an accuracy of a few percent of full scale and are used with voltages from a fraction of a volt to several thousand volts.
Digital voltmeters give a numerical display of voltage by means of an analog-to-digital converter. Digital meters can be made highly accurate, typically better than 1%, and specially calibrated test instruments achieve higher accuracies still, with laboratory instruments capable of measuring to within a few parts per million. Meters using amplifiers can measure tiny voltages of microvolts or less. Digital voltmeters (DVMs) are usually designed around a special type of analog-to-digital converter called an integrating converter. Voltmeter accuracy is affected by many factors, including temperature and supply-voltage variations; to ensure that a digital voltmeter's reading stays within the manufacturer's specified tolerances, it should be calibrated periodically. Digital voltmeters necessarily have input amplifiers and, like vacuum-tube voltmeters, generally have a constant input resistance of 10 megohms regardless of the set measurement range.
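As context for the integrating converter mentioned above, a common form is the dual-slope ADC, which derives the unknown voltage from two timed integration phases. The relation below is a standard sketch of that principle; the symbols V_in, V_ref, t_1 and t_2 are illustrative and not taken from this report:

\[
  V_{\mathrm{in}} \, t_1 = V_{\mathrm{ref}} \, t_2
  \quad\Longrightarrow\quad
  V_{\mathrm{in}} = V_{\mathrm{ref}} \cdot \frac{t_2}{t_1}
\]

The unknown input V_in is integrated for a fixed time t_1, then a known reference of opposite polarity is integrated until the output returns to zero, which takes a measured time t_2. Because the result depends only on a ratio of two time intervals, integrating converters reject noise and component drift well, which is why they suit DVMs.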
This project aims at building a Digital Voltmeter using an 8051 microcontroller. All the data accessed and processed by the microcontroller is digital, and thus an analog-to-digital converter finds its place here: it converts the analog input voltage into a digital value that the microcontroller can read, process, and display.
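As a minimal sketch of how the 8051 side of such a voltmeter might read the converter, the fragment below assumes a hypothetical wiring of an 8-bit ADC0804 with its data bus on port P1 and its control lines on P2; the pin assignments and the 5 V full-scale reference are assumptions for illustration, not the actual circuit of this project. Register names follow the Keil C51 conventions from reg51.h.

#include <reg51.h>              /* Keil C51 register definitions for the 8051      */

/* Assumed wiring (hypothetical): ADC0804 data bus on P1, control lines on P2.     */
sbit ADC_WR   = P2^0;           /* start-of-conversion pulse (active low)          */
sbit ADC_RD   = P2^1;           /* output enable for the data latch (active low)   */
sbit ADC_INTR = P2^2;           /* end-of-conversion flag (goes low when done)     */

/* Trigger one conversion and return the raw 8-bit count. */
unsigned char adc_read(void)
{
    unsigned char count;

    P1 = 0xFF;                  /* configure P1 as an input port                   */
    ADC_WR = 0;                 /* falling edge on WR starts a conversion          */
    ADC_WR = 1;
    while (ADC_INTR == 1);      /* busy-wait until the ADC signals completion      */
    ADC_RD = 0;                 /* drive RD low to place the result on the bus     */
    count = P1;                 /* read the digital result                         */
    ADC_RD = 1;
    return count;
}

/* Scale the raw count to millivolts for an assumed 5 V full scale:
   one step = 5000 mV / 256 steps.                                                 */
unsigned int adc_to_millivolts(unsigned char count)
{
    return (unsigned int)(((unsigned long)count * 5000UL) / 256UL);
}

The microcontroller would then format this millivolt value for the display, for example a seven-segment or LCD module. Working in integer millivolts avoids floating-point arithmetic, which is expensive on the 8051.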