
For professional users, it’s vital to have measurement equipment calibrated regularly to ensure each piece is performing within its specifications. The calibration lab checks how well the multimeter measures a “standard”. If all the readings fall within the spec ranges for that device, they put their label of approval on the unit without cracking open the case. If a reading falls out of range, they arrange for the appropriate adjustment, then re-check. Depending on the equipment, this may mean analog changes to variable resistors, or digital offset changes in the software.
If the available adjustment range isn’t enough, there are three potential paths: (1) have the problem repaired, (2) put a sticker on the unit alerting the user which functions or ranges are not valid, or (3) decommission the unit, marking it “for reference only”, and replace it. The total cost will be the basic calibration, the adjustment work, and any repair work on top of that.
An important thing to note: a calibrated piece of equipment is not spot-on. It’s simply within the limits of the multimeter’s specifications. This is important to understand because a freshly calibrated piece of equipment will not necessarily make a good reference for adjusting other devices. Usually, you want the calibration standard’s specification limits to be more than 10X “better” (tighter) than those of the device you are calibrating. For example, a meter rated at ±0.5% should be checked against a standard good to roughly ±0.05% or better.
That’s most of what those of us in the Cheap-O realm need to know about calibration. We’re amateurs or occasional users. We may have paid a fraction of the cost of one annual calibration certificate for our meter. Does that mean calibration is entirely impractical, and that we have to live with whatever the manufacturer shipped us? Well, in many cases – no! We can do something about it!
What to Calibrate to?

Shown in this picture: An LM399 portable calibration circuit programmed to match a recently calibrated precision multimeter.
We usually don’t have free access to the calibration standards that professionals use. However, we can buy or borrow something called a transfer standard, which traces back to a real standard; think of it as a “copy of a copy”. These may still have more digits of precision than the meter we are calibrating, which can be good enough for casual users.
Easiest implementation: You may have an associate or friend who has a precision, calibrated multimeter. Just go through the corresponding ranges and compare readings while measuring the same things. Try to use some stable references, like a resistor, capacitor, power supply, signal generator, and so on. This gives you an idea of how well your meter is measuring.

Shown in this picture: A DMM and VOM simultaneously measuring the LM399 10V transfer standard.
To make the comparison, take turns attaching the leads. For some measurements, you can get both readings at the same time: for voltage, place both meters in parallel; for current, place both meters in series. For components like resistors, take turns only; don’t try to measure a component with both meters simultaneously, since each meter injects its own test current and the two will interfere.
Once you verify your meter measures within spec, rejoice! Make your own label if you want. If you find your meter measures out of range, you can instead attach a special note, such as “DCV measures 2% high”. Then, when you make measurements with that device, you can divide the reading by 1.02 to compensate manually.
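If you’d rather not do that division in your head each time, here is a minimal sketch of the correction, assuming the hypothetical “DCV measures 2% high” label from above:

```python
def compensate(reading, error_percent):
    """Correct a raw meter reading for a known, constant gain error.

    error_percent > 0 means the meter reads high (e.g. 2 means 2% high);
    error_percent < 0 means it reads low.
    """
    return reading / (1 + error_percent / 100.0)

# Example: the label says "DCV measures 2% high" and the display shows 5.10 V.
print(compensate(5.10, 2))  # -> 5.0 V, roughly the true value
```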
There are more direct forms of transfer standard you can use. For instance, if you search for “voltage reference” you may come across many reasonably-priced options for purchase. Simply supply power to the unit, let it stabilize for 15 minutes, then connect your meter to it. You will have access to an accurate voltage that was measured at the factory and recorded on the label. A transfer standard may also be a collection of precision components populated on a circuit board, such as resistors and capacitors, with similar labeling. Just make sure you always do your adjustment at room temperature unless stated otherwise, because values change with temperature.
While each of these units is inexpensive, the total cost of accumulating all of the required standards may be beyond your budget. Often people acquire them over time. It’s common to start with a DC voltage standard, then collect various precision (low-tolerance) resistors and capacitors. If you don’t have access to a standard today, you may find yourself in a better position tomorrow.
The KeepOnTesting lab has demonstrated many different standards in its review videos, informally at first and more professionally later. As the channel has progressed, these standards keep getting better! Watch the videos for ideas for your personal lab.
Preparing to Make Adjustments
What do you do if the Cheap-O meter you are using doesn’t fall within its design specification, or is further from being “on center” than you would like, and you want to make the correction rather than apply an “offset value” label? This usually requires opening the case. Keep in mind that any disassembly of the unit may void the warranty, result in damage, and may be unsafe if high voltages are present (so disconnect from mains!). Think seriously first: is it worth the risk to make the tweak? That probably depends on you and how far off the meter is. Personally, I like to align things whenever I can get inside the unit nondestructively. However, I know that things wear out from being opened and closed, especially cheap plastics.

Upon peering inside, you may find some small variable resistors on the main board with a “-” or “+” shaped adjustment slot. Usually there is white text near the adjuster reading something like “VR1”. If you see only one of these, it’s probably the DC voltage adjustment potentiometer. If another one with a different value sits directly nearby, it may be that DCV has separate coarse and fine adjustments, similar to what is shown here: 2K (coarse) and 200Ω (fine).

If you see several spaced apart (VR2, VR3, etc.), these are likely for different features of the meter such as ACV or current. You will need to do some research on your unit. If you get lucky, someone like Darren has already done the work for you!
If you do not find adjuster identification, you are “on your own” to determine experimentally what is affected by turning the pot, or to deduce how the circuit is wired. While I don’t recommend turning dials blindly, for a severe offset condition this may be your only option. A good plan is to make a pair of thin ink marks on the adjuster wheel and the base of the variable resistor so you can get back to the original position. Pictures can help you get back too, so take several when you first get inside.

Hopefully you have a set of plastic- or ceramic-tipped precision screwdrivers. If you miss the slot while making an adjustment, you are less likely to short something or cause physical damage. If the tip is metal, the next best option is a handle made of plastic or rubber; this helps reduce noise coupled in from the environment that could affect the smoothness of the adjustment. Also, turning the adjuster potentiometer clockwise doesn’t always result in an increase; sometimes it’s backwards from what you might expect.
For mains voltages, or anything above 30V, consider making iterative changes: disconnect the test leads while turning the potentiometer, then put the case back on before reconnecting to the voltage for another check.
At this point, you have identified which potentiometers in your unit adjust which parameters. What’s the procedure? This will depend on whether you are adjusting a DMM (Digital Multimeter) or an analog multimeter (VOM, Volt-Ohm Meter).
Digital Meter Calibration
For voltage, power up your voltage reference and select the lowest range that covers that voltage, so that you get the most digits of precision on the display. If you are using a variable voltage reference, try to select a value near the top end of the range. For instance, if the unit has 19,999 counts, you may want to use an 18V reference; for 9,999 counts, an 8V reference. If you get too close to the top of the range, the display might roll over into a lower-precision range. Usually there is only one adjustment covering all ranges, so choose a voltage representative of the typical measurements you make. If you usually measure between 3 and 10V, you may want to use a 5V source. The further a measurement is from the voltage you adjusted at, the further it may stray from nominal due to range changes and nonlinearity in the DMM.
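To see why a reference near the top of a range buys you extra usable digits, here is a minimal sketch. The 19.999 V and 199.99 V full-scale values are illustrative assumptions for a 19,999-count meter, not taken from any particular model:

```python
def resolution(range_full_scale_volts, counts):
    """Smallest step the display can show on a given range."""
    return range_full_scale_volts / counts

# A 19,999-count meter: an 18 V reference fits on the 19.999 V range (1 mV steps),
# while anything above ~20 V would force the 199.99 V range (10 mV steps).
print(resolution(19.999, 19999))   # 0.001 V per count
print(resolution(199.99, 19999))   # 0.01 V per count
```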
For current, there is usually one set of shunt resistors for mA and another for A. Two circuits almost always mean two banana jacks, one for mA and one for 10A, because each jack activates a different circuit. I recommend 150 mA and 1.5 A as reasonable test values. Often Cheap-O multimeters don’t have a current adjustment at all, and the only way to calibrate is to modify the resistors in the circuit. I’ve done this, but it’s an advanced topic I may cover elsewhere.
While inexpensive multimeters have a 10A jack, they are often rated for that amperage only 10 seconds or less, and often don’t have proper fusing. They may also use a common resistor instead of a proper current shunt. The value on the display may drift as the resistors inside heat up, which can be a bit frustrating while adjusting. The test leads that come with the meter might also heat up quickly. Therefore, during adjustment, avoid running more than 2 A continuously.
For other functions, such as diode test, resistance, and capacitance, make your measurements across various ranges. This applies whether your device is auto-ranging or manually ranged. You may not be able to turn a potentiometer inside the unit to adjust these, but it’s worth confirming how well your meter does at measuring each parameter.
Often Cheap-O meters don’t do very well at the low extremes: they may display a measurement, but it is increasingly far from what it should be, and unfortunately it can’t be adjusted back into calibration. Examples include low-value capacitors in the pF range and resistances below 10 Ω. Usually there aren’t many options other than to use a different meter with dedicated low ranges, so that the device has the appropriate sensitivity. On the upper end, DMMs will usually display “OL” when a measurement can’t be made accurately, so the high extreme is usually not as bad as the low extreme for a DMM.
Analog Meter Calibration
The suggested procedure for calibrating an analog meter is different from that for a digital meter. There are protection diodes across the meter coil which prevent damage to the movement if too low a range is selected by the user. These diodes “steal” some of the current away from the meter movement as the needle approaches maximum, causing low readings at the very top end of the scale. If you align at maximum scale, for instance calibrating 20V on the 20V range, the lower readings such as 5, 10, and 15V will be incorrect. Therefore, calibrate most analog meters using a value 2/3 to 5/6 of the way along the dial for best results.
If there are multiple adjustment pots, it’s best to start with DC volts. Choose the range you use most often and a source value that corresponds to an ink mark on the VOM scale. Connect and disconnect the voltage to ensure the needle consistently returns to the “0” tick mark. Be aware that static electricity can affect the needle and cause inconsistency. Tip: a dryer sheet may be lightly wiped across the lens to dissipate such static (don’t push too hard and scratch the lens).
To ensure the calibration is consistent between ranges, you may instead choose to calibrate the input impedance to meet the design intent of the VOM, rather than checking the needle against a reference voltage, which is actually an indirect path. This involves checking the resistance at the input jacks of the unit against the “DC Sensitivity” rating for the meter, found printed on the scale or in the user manual. This may not seem intuitive, but it is more effective at getting all the ranges calibrated than the voltage-reading comparison method, which tends to correct only one range selector position and skew the other ranges out of alignment.
The analog meter is called a “Volt-Ohm Meter” for a reason. It simply uses Ohm’s law and a chain of resistors to correctly actuate the meter movement. When that chain is misaligned, you cannot possibly get consistency when moving across ranges. Therefore, aligning the range selector resistors to the engineered meter sensitivity specification is Job #1 for an analog meter. If that is not effective, there may be other issues with the unit that need to be addressed.
I will spare you the proof and detailed math. Let’s jump ahead to how this impedance alignment is done. The key is to adjust for the resistance reading on the input terminals at the lowest voltage setting possible. This aligns the fundamental resistance in the voltage-sensing ladder. If the resistance reading checks out, there is no need to open the case, unless you decide you want to make an adjustment at step 5.
Analog Input Impedance Calibration Method
You can think of resistance and impedance as interchangeable for this purpose; sorry if the two different terms confuse things!
Step 1: Find the most accurate auto-ranging DMM you have available, with the most counts if possible. Set it to ohms. If you have yet another meter, check what voltage the DMM uses to measure resistance. Mine uses only 0.1V.
Step 2: Set the VOM being adjusted to its lowest voltage range that is not lower than the voltage measured in step 1. This ensures you do not over-range the VOM.
Step 3: Multiply your VOM’s “ohms/volt” sensitivity by this lowest voltage range. Example: 20,000 ohms/volt sensitivity (printed on the face plate of the dial) times the 0.6V range selected = 12,000 Ω (12K) target for aligned impedance.
Step 4: Connect the VOM’s red and black leads to the corresponding colors on the DMM ohmmeter. Compare the reading on the DMM to the target value calculated in the previous step. There’s no need to look at the VOM needle, but you will probably see it move because the DMM supplies a voltage in order to measure resistance.

Step 5: If you find correction is needed, adjust the DC Voltage potentiometer inside the VOM until the ohmmeter matches the target resistance calculated in step 3.
Step 6: While the ohmmeter is still attached, move to higher voltage ranges and record the resistance for each setting.
Step 7: Multiply the “ohms/volt” specification by each voltage range, same as step 3. Compare your readings against these targets. The resistance for each range should roughly correspond to the calculation, and the difference can be used to calculate % error (see the worked sketch after this step list). If this % is outside the meter’s specification, resistors may need to be replaced, or some other service may be needed.*
Step 8: As a final check, measure some sample voltages across various ranges to verify a good result and confirm that the impedance alignment method is effective for your device.
Step 9: Adjust other functions such as DCA and ACV if desired, matching the needle against a standard at a reading near 2/3 to 5/6 of full scale.

*If you determine some impedances cannot be adjusted into spec, consider cleaning and a general inspection first. Then, if necessary, replace resistors starting with the lowest DCV range that needs correction. Usually there is a chain of resistors attached to the range selector, so changes to a lower range affect the higher ranges. This is more advanced and probably requires a schematic or reverse engineering of the range selection circuit.
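For those who like to tabulate steps 3 and 7, here is a minimal sketch of the target-impedance and % error math. The 20,000 ohms/volt sensitivity matches the earlier example, but the range list and the measured values are illustrative assumptions; substitute your own readings:

```python
# Worked sketch of steps 3 and 7: target input impedance and % error per range.
SENSITIVITY = 20_000  # ohms per volt, as printed on the VOM face plate (example value)

# DC voltage range -> resistance measured at the input jacks (illustrative numbers)
measured = {0.6: 11_820, 3: 59_400, 12: 241_000, 60: 1_195_000}

for v_range, ohms in measured.items():
    target = SENSITIVITY * v_range               # step 3: ohms/volt x range
    error_pct = 100 * (ohms - target) / target   # step 7: deviation from target
    print(f"{v_range:>5} V range: target {target:,.0f} ohms, "
          f"measured {ohms:,.0f} ohms, error {error_pct:+.1f}%")
```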
One side note: a good practice any time you use an analog multimeter is to set the range so the needle lands in the middle 2/3 of the dial. There are errors associated with both ends. On the low end, your measurement will lack precision and be more influenced by other meter movement factors. At the high end, the protection diodes “start kicking in” and divert some of the current being measured away from the meter movement. Unless you know for certain that the manufacturer compensated the scale for the diode response, the upper 1/6 of the scale will likely have error. To check this, vary the input: if you get accurate center-dial readings but slightly low full-scale readings, they probably didn’t compensate the scale. In this case, for best calibration, you simply have to accept that full-scale readings are under-reported on an inexpensive meter which uses protection diodes.
Which brings another topic to light: ohms. If you cannot get the ohms range to zero out on an analog meter, consider changing the internal cell. Once the cell voltage drops too low, you cannot reach the full-scale zero when shorting the leads together. The ohmmeter relies on a reasonably healthy internal battery, and there is usually no low-battery warning on an inexpensive VOM.
The question here is: knowing about the protection-diode error, is the usual procedure of using the ohms-adjust knob to set “0” ohms at the full-scale reading the most accurate? Sometimes not! Unless you are measuring resistance near full scale, it may be better to use a verified precision resistor that lands around 2/3 of the way along the dial (usually 10 ohms) and zero to that before your actual mid-scale measurement. Keep in mind the precision resistor needed will change for higher scales (x10 requiring a 100Ω zeroing resistor, x100 requiring a 1KΩ resistor, and so on).
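If it helps, here is a trivial sketch of how that zeroing resistor scales with the ohms multiplier, assuming the roughly 10 Ω two-thirds-of-dial point mentioned above:

```python
# Pick a zeroing resistor that lands near 2/3 of the dial on each ohms range.
BASE_ZERO_OHMS = 10  # roughly 2/3 of the dial on the x1 range (assumed from the text)

for multiplier in (1, 10, 100, 1000):
    print(f"x{multiplier} range: zero against about {BASE_ZERO_OHMS * multiplier} ohms")
```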

Battery Tip: The battery or cell in an analog meter often gets forgotten, because most measurements are powered by the signal under test. The battery is only for ohms measurements and “ages out” before its capacity is used up. Standard types of cells (alkaline, heavy-duty, carbon-zinc) will usually start leaking before the owner notices the need for a new battery. The leakage is corrosive and will damage the metal contacts and more inside the meter.
Lithium “primaries” seldom leak and have a very long shelf life, so the chance of corrosion damage is practically eliminated. The “ultimate” version has no advantage here, so feel free to go for the less expensive “advanced” version. We’re not suggesting lithium-ion / lithium-polymer types; those don’t have the proper voltage. Even so, lithium primaries do have a slightly higher voltage when new (AAs and AAAs closer to 1.7V than 1.5V), so a handful of DMMs might not accept them.
Calibration correction opportunities in the Cheap-O realm:
About half the Cheap-O’s I own offer some form of adjustment for DCV. I have found a couple that can be “hacked” by replacing a fixed resistor with a variable one (an advanced topic). Also, many of them lack DCA adjustment but can be modified with resistor changes to align the current measurements for the mA and/or A ranges. ACV adjustment is very rare in this realm, usually only seen on analog meters.
Most DMMs do not offer adjustment for resistance or capacitance. However, check your owner’s manual; it may describe a way to “REL” out lead resistance or capacitance, improving low-range accuracy.
In general, if the DMM uses software calibration adjustments, it’s best left alone. If there are potentiometers on the board, adjustment is likely available without negative side effects.
Whether you want to optimize the accuracy of your meters for better measurements, out of OCD, or just for fun, there are ways to get there, depending on the model and your know-how and persistence to learn.
Turning knobs without some sort of understanding or standard to measure against is not advised, especially with meters which already measure well or undergo annual calibration. Most new meters in the Cheap-O realm are surprisingly accurate and need no special adjustment.
But hey, life is short and meters are cheap, so do it safely and KeepOnTesting!