Hi there,
Nice web page, remuen!
When I want to measure a current of more than
about 100 mA or so, I put a resistor in series
with the circuit. Then, turn the circuit on and
measure the voltage across the resistor. By
Ohm's law (see remuen's page), the voltage
measured across this resistor tells you exactly
how much current is flowing in the circuit. The
added benefit is that you don't drop as much
voltage as you do with a typical multimeter on
its current range.
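If it helps, here is the Ohm's law arithmetic as
a quick Python sketch (the function name and the
example values are just for illustration):

    # Ohm's law: I = V / R
    def shunt_current(v_measured, r_shunt):
        # current (amps) from the voltage across the series resistor
        return v_measured / r_shunt

    # e.g. 0.1 volts across a 0.1 ohm resistor -> 1.0 amps
    print(shunt_current(0.1, 0.1))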
For example, let's say we use a 0.1 ohm resistor.
You wire this resistor in series with the LED and
connect your voltmeter across the 0.1 ohm
resistor. Now you turn on the power. If you read
0.1 volts across the resistor, that means you
have 1 amp flowing in the circuit.
Here is a short table:
0.01 volts -- 0.1 amps
0.02 volts -- 0.2 amps
0.03 volts -- 0.3 amps
0.10 volts -- 1.0 amps
Inspecting the table above shows that all we
really have to do is multiply the voltmeter
reading by ten (10) to get the current flowing
through the circuit.
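In code, the multiply-by-ten rule for a 0.1 ohm
resistor is just this (a tiny sketch that
reproduces the table above):

    # for a 0.1 ohm shunt: I = V / 0.1 = 10 * V
    for v in (0.01, 0.02, 0.03, 0.10):
        print(f"{v:.2f} volts -- {10 * v:.1f} amps")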
This is by far the better choice for measuring
current, especially in switching circuits. One
note of caution, though: the tolerance of the
0.1 ohm resistor should be 1% or better if
possible, and it should not be a wire-wound type
if you intend to use it in switching circuits,
since wire-wound resistors are inductive and will
distort readings at high switching frequencies.
If you can't get a 1% resistor, you can
'calibrate' your chosen resistor by connecting it
in series with another meter (set to read
milliamps) and hooking it up to a circuit that
draws about 390 mA. You then measure the current
on that meter and the voltage across the 0.1 ohm
resistor with your original meter, compare the
two results, and come up with a 'calibration'
factor.
Let's say you measure 380 mA with the current
meter and you see a reading of 0.040 volts across
your 0.1 ohm resistor. Multiplying 0.040 by 10
(as shown above) suggests 400 mA of current is
flowing, but the current meter reads 380 mA. This
means we need to compute a calibration factor for
this resistor: we simply take 380, divide it by
400, and get 0.95 as our calibration factor.
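Here is that calibration arithmetic in Python
(using the example figures above):

    # calibration factor = true current / (10 * shunt voltage)
    i_true = 0.380     # amps, from the trusted current meter
    v_shunt = 0.040    # volts, across the nominal 0.1 ohm resistor
    cal_factor = i_true / (10 * v_shunt)
    print(cal_factor)  # 0.95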
Now, when we use this resistor (and this resistor
only) in a circuit to measure current, we
multiply by 10 to get the approximate current,
and then multiply the result by 0.95 to get the
more accurate current measurement. With the above
reading of 0.040 volts, we would thus get: 0.040
times 10 equals 0.400 amps, and this times 0.95
equals 380 mA. 380 mA is the real current flowing
through the circuit.
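The same correction as a small Python sketch (the
0.95 is this particular resistor's factor from
above):

    # corrected current = 10 * shunt voltage * calibration factor
    def corrected_current(v_shunt, cal_factor=0.95):
        return 10 * v_shunt * cal_factor

    print(corrected_current(0.040))  # 0.38 amps, i.e. 380 mA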
The 'calibration' factor will usually range
from about 0.85 to about 1.15 for most resistors.
If you end up with a calibration factor of exactly
1.00, you got lucky and have found a resistor
that is exactly 0.10 ohms (or close enough).
It is of course entirely possible to buy several
0.1 ohm resistors and try each one until you find
one with a calibration factor of exactly 1.00.
Or, if you find one whose resistance is slightly
high (a factor slightly below 1.00), you can
parallel a much higher value resistor with it
until the factor comes to exactly 1.00; a way to
compute that trimming resistor is sketched below.
This way you don't have to apply a calibration
factor, which makes measurements a little faster.
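If you would rather calculate the trimming
resistor than hunt for it by trial and error, the
parallel resistance formula gives it directly. A
sketch, assuming the 0.95 example factor from
above:

    # actual resistance implied by the factor: R = 0.1 / cal_factor
    # trim resistor that brings the parallel pair to 0.1 ohms:
    #   r_trim = (r_actual * 0.1) / (r_actual - 0.1)
    cal_factor = 0.95
    r_actual = 0.1 / cal_factor                   # about 0.1053 ohms
    r_trim = (r_actual * 0.1) / (r_actual - 0.1)
    print(r_trim)                                 # 2.0 ohms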
Also, if the cal factor comes out close to 1.00
but not exactly, the small error may not matter
for most measurements anyway.
Good luck with your LED circuits,
Al