Stupid Question re: Multimeter

UCLAHutchinson

Hi all,

I just bought this multimeter from Home Depot and am testing the output of a few driver boards with different battery configurations, but the readouts don't seem to make sense. When I have it in 20m/10A mode (there are three DC current modes, and this is the one the manual says to use for currents above 200 mA), the readouts for boards that are supposed to drive at 1 A are generally around 0.20 to 0.25. Is there a conversion factor of 5 in there somewhere? Or am I really getting only 200 mA?

Thanks
 
Most multimeters will give different readings at different current settings.

Get a 0.1 ohm, 1%, 3 or 5 watt resistor and insert it between the output of the driver and the load. Read the voltage drop across the resistor with the DC voltage setting and use Ohm's law (I = V / R) for a more accurate result. For currents over about one amp, also figure in the power dissipated by the resistor and add that to the initial result.
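
A rough sketch of that arithmetic (the 0.105 V reading and the 0.1 ohm shunt value here are just assumed examples, not measurements):

```python
# Shunt measurement: derive load current from the voltage drop across a
# known series resistor using Ohm's law (I = V / R). Values are illustrative.
R_SHUNT = 0.1    # ohms: 0.1 ohm, 1%, 3-5 W resistor in series with the load

v_drop = 0.105   # volts: example reading across the shunt on the DC volts range

current = v_drop / R_SHUNT       # I = V / R   -> 1.05 A
power = v_drop ** 2 / R_SHUNT    # P = V^2 / R -> ~0.11 W dissipated in the shunt

print(f"Load current: {current:.2f} A, shunt dissipation: {power:.2f} W")
```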

Curt
 
It could be argued that the meter is measuring the current accurately, but its own internal resistance (which is higher on the lower current ranges) is reducing the actual current that flows.
 
UCLAHutchinson said: When I have it in 20m/10A mode ... the readouts for boards that are supposed to drive at 1 A are generally around 0.20 to 0.25. Is there a conversion factor of 5 in there somewhere? Or am I really getting only 200 mA?
This is not right. Most meters will read 1 A much more accurately than that on the 10 A range.

It seems there might be something wrong with your measurement protocol, but there is not enough information in your post to know what it might be.

Have a read through this thread and see if it gives you any hints.
 
Are there any multimeters that measure the amps accurately without this extra trick?
Sure, but they're $$$. You can measure current with a clamp-on current probe. Still, a multimeter shouldn't throw off the reading by much.
 
Sure, but they're $$$. You can measure current with a clamp-on current probe. Still, a multimeter shouldn't throw off the reading by much.

A multimeter on a current setting adds resistance to the circuit. Some circuits tolerate this with little effect; others are greatly affected by it.

For example, when direct-driving an LED from a lithium cell, the resistance of your wiring matters a great deal, and the meter would have a huge effect. With an 18650 cell and an SST-50 you might get 4.5 A without the meter, with a total circuit resistance of around 0.1 ohms. Add a meter that contributes another 0.1 ohms and, in theory, the current drops by 50% (in practice a little less, because the LED's forward voltage falls as the current falls).
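
A quick sketch of those numbers (the 4.2 V cell voltage and 3.75 V forward voltage are assumed for illustration):

```python
# Direct-drive current: I = (V_batt - V_f) / R_total.
# Treating the LED forward voltage as constant shows the theoretical 50% drop;
# in reality V_f sags slightly at lower current, so the real drop is a bit smaller.
V_BATT = 4.2     # volts: fresh 18650 cell (assumed)
V_F = 3.75       # volts: SST-50 forward voltage near 4.5 A (assumed)
R_CIRCUIT = 0.1  # ohms: cell + wiring + switch resistance (assumed)
R_METER = 0.1    # ohms: meter shunt ("burden") resistance on a low range (assumed)

i_no_meter = (V_BATT - V_F) / R_CIRCUIT                # 4.5 A
i_with_meter = (V_BATT - V_F) / (R_CIRCUIT + R_METER)  # 2.25 A, i.e. 50% lower

print(f"Without meter: {i_no_meter:.1f} A, with meter: {i_with_meter:.2f} A")
```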

So it all depends on what the circuit is and where you put the meter.

Also, the meter's resistance changes with the range you select: the highest current range has the lowest resistance. That's why the 10 A or 20 A range was suggested. It disturbs the circuit less, even though the accuracy of the reading may not be as good.

D
 