I wanted to measure how much current was flowing through an ordinary LED at a given voltage. It should have been around 20 to 30 mA. I set my multimeter to the 200 mA range, wired it up appropriately, and the LED did not light: no current flowed.

I then took the lead out of the red socket and plugged it into the yellow socket, which is rated for up to 10 A, and switched the meter to the 10 A range. The LED lit, and the current registered as "0.01" A, which is of course not very precise, hence my wanting to use the 200 mA range.

Any idea why the 200 mA range isn't working? My guess is some sort of protection against too high a current, but a current this small should not be triggering that protection.
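For context, the current I expected can be sanity-checked with Ohm's law across the series resistor. The supply voltage, forward drop, and resistor value below are illustrative assumptions for a typical LED circuit, not the exact values in my setup:

```python
# Sanity-check the expected LED current with Ohm's law: I = (Vs - Vf) / R.
# All three values are assumptions for illustration only.
V_SUPPLY = 5.0    # volts, assumed supply voltage
V_FORWARD = 2.0   # volts, typical red LED forward drop (assumed)
R_SERIES = 220.0  # ohms, assumed series resistor

def led_current_ma(v_supply, v_forward, r_series):
    """Current through the LED and resistor, in milliamps."""
    return (v_supply - v_forward) / r_series * 1000.0

print(f"Expected current: {led_current_ma(V_SUPPLY, V_FORWARD, R_SERIES):.1f} mA")
```

With numbers anywhere in this ballpark, the current is a few tens of milliamps at most, which is far below the 200 mA the range is supposed to handle.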