An ideal LED would convert all of its electrical energy into light, but in practice a lot of it ends up as heat.
Unlike a filament bulb, which radiates its heat away, an LED needs most of its heat conducted out through its back to keep the die cool - the hotter it runs, the less efficient it becomes.
I've seen a few attempts on CPF at working out how much heat an LED puts out, but I think the only reliable method is substitution - find out how much electrical power into a resistor will raise the same heatsink to the same temperature. 100% of the electrical energy that goes into a resistor gets converted to heat - it's very hard to make an inefficient electrical heater!
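Just to pin down the bookkeeping, the whole calculation comes down to one division. Here's a minimal Python sketch (the function and variable names are just my own labels, not from anywhere official):

```python
def substitution(p_led, p_resistor):
    """Substitution method: the resistor power that holds the heatsink at
    the same temperature as the running LED equals the LED's heat output;
    whatever is left over must have left the LED as light."""
    heat_fraction = p_resistor / p_led   # share of input wasted as heat
    efficiency = 1.0 - heat_fraction     # share emitted as light
    return heat_fraction, efficiency
```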
I found two identical unanodised aluminium finned heatsinks - 10 x 5 x 3.5 cm.
On one, I mounted an XM-L T6 on a 20mm star from www.ledsales.com.au. On the other, I mounted two 10 watt, 1 ohm resistors in extruded aluminium housings with two screw mounts.
LED 8.42 watts (3.12 V, 2.70 A)
- heatsink temperature = 56.7 degC, ambient = 25.5 degC
- resistor power for the same heatsink temperature: 3.71 V, 1.85 A = 6.86 watts
- heat output = 6.86 watts = 82% of input, so efficiency was 18%
LED 2.95 watts (2.95 V, 1.00 A)
- heatsink temperature = 35.5 degC, ambient = 23.7 degC
- resistor power for the same heatsink temperature: 2.02 V, 1.01 A = 2.04 watts
- heat output = 2.04 watts = 69% of input, so efficiency was 31%
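For anyone who wants to check the arithmetic, here's a quick Python sketch that re-derives the percentages straight from the measured voltages and currents (the label strings are mine). It prints 81.5%/18.5% and 69.2%/30.8%, which match the figures above to within rounding:

```python
# Measured values: (LED volts, LED amps, resistor volts, resistor amps)
tests = {
    "8.42 W test": (3.12, 2.70, 3.71, 1.85),
    "2.95 W test": (2.95, 1.00, 2.02, 1.01),
}

for name, (v_led, i_led, v_res, i_res) in tests.items():
    p_led = v_led * i_led    # electrical power into the LED
    p_heat = v_res * i_res   # resistor power that matched the heatsink temperature
    heat_pct = 100 * p_heat / p_led
    print(f"{name}: {p_led:.2f} W in, {p_heat:.2f} W out as heat "
          f"({heat_pct:.1f}%), efficiency {100 - heat_pct:.1f}%")
```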
So over a normal operating range, with heat running from 69% of input at low drive to 82% at high drive, modern high power LEDs waste around 75% of the power going into them.
Of course this testing method ignores radiant heat loss from the LED - when I hold my hand 1cm in front of the LED it gets a lot warmer than it does 1cm in front of a resistor putting out the same heat. EDIT - as pointed out by jtr1962, the warmth I feel would be due to the light being converted to heat when it strikes my skin - the energy has to go somewhere.