HARSH KUMAR
A voltmeter is an instrument used for measuring the electrical potential difference between two points in an electric circuit. Analog voltmeters move a pointer across a scale in proportion to the voltage of the circuit; digital voltmeters give a numerical display of voltage by use of an analog-to-digital converter.

Voltmeters are made in a wide range of styles. Instruments permanently mounted in a panel are used to monitor generators or other fixed apparatus. Portable instruments, usually equipped to also measure current and resistance in the form of a multimeter, are standard test instruments used in electrical and electronics work. Any measurement that can be converted to a voltage can be displayed on a meter that is suitably calibrated; for example, pressure, temperature, flow or level in a chemical process plant.

General-purpose analog voltmeters may have an accuracy of a few percent of full scale and are used with voltages from a fraction of a volt to several thousand volts. Digital meters can be made with high accuracy, typically better than 1%. Specially calibrated test instruments have higher accuracies, with laboratory instruments capable of measuring to accuracies of a few parts per million. Meters using amplifiers can measure tiny voltages of microvolts or less.

Part of the problem of making an accurate voltmeter is that of calibration to check its accuracy. In laboratories, the Weston cell is used as a standard voltage for precision work. Precision voltage references based on electronic circuits are also available. To ensure that a digital voltmeter's reading is within the manufacturer's specified tolerances, it should be periodically calibrated against a voltage standard such as the Weston cell.

Digital voltmeters necessarily have input amplifiers and, like vacuum-tube voltmeters, generally have a constant input resistance of 10 megohms regardless of the set measurement range. A digital voltmeter's accuracy is also affected by several factors, including supply voltage variations.
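The figures above (accuracy quoted as a percentage of full scale, the analog-to-digital conversion inside a digital meter, and the constant 10-megohm input resistance) can be made concrete with a short numerical sketch. The following Python example is illustrative only: the 16-bit converter, 2 V reference, and the example source resistance and range values are assumed for the sake of the demonstration, not taken from any particular instrument.

# Illustrative sketch: how a digital voltmeter's display relates to its ADC
# counts, how its 10-megohm input resistance loads the circuit under test,
# and what a "percent of full scale" accuracy specification means in volts.
# All component values below are assumptions for the example.

ADC_BITS = 16        # assumed resolution of the hypothetical converter
V_REF = 2.0          # assumed ADC reference voltage, in volts
R_INPUT = 10e6       # typical constant input resistance of a digital meter, ohms

def counts_to_volts(counts, range_gain=1.0):
    """Convert raw ADC counts to the displayed voltage for a given range."""
    return counts / (2 ** ADC_BITS - 1) * V_REF * range_gain

def loaded_reading(v_source, r_source):
    """Voltage the meter actually sees when the source has internal
    resistance r_source: the 10-megohm input forms a voltage divider."""
    return v_source * R_INPUT / (R_INPUT + r_source)

def full_scale_uncertainty(full_scale_volts, percent_of_full_scale):
    """Worst-case error of a meter specified as a percent of full scale."""
    return full_scale_volts * percent_of_full_scale / 100.0

if __name__ == "__main__":
    # A 1.5 V source with 100 kilohms of internal resistance reads about
    # 1.485 V, roughly a 1 percent loading error from the finite input resistance.
    print(round(loaded_reading(1.5, 100e3), 4))
    # Full-scale counts on the 1x range display the full reference voltage.
    print(counts_to_volts(2 ** ADC_BITS - 1))
    # A meter rated at 2 percent of full scale on a 300 V range can be off by
    # as much as 6 V, regardless of where on the scale the pointer sits.
    print(full_scale_uncertainty(300, 2))

The loading calculation also explains why a high, range-independent input resistance is desirable: the larger the meter's input resistance relative to the source resistance, the closer the displayed value is to the true open-circuit voltage.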