A rough but useful guide for cool-white phosphor white LEDs is ~330 lumens per watt of emitted optical power. High-CRI LEDs will be somewhat less than this, perhaps in the high 200s. Anyway, if for example your LED consumes 3.75 watts at one amp and emits 250 lumens, then the light output is 250/330, or about 0.76 watts. Since heat = power input - light emitted, the LED will give off 3.75 - 0.76, or roughly 2.99 watts of heat.
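Here's a minimal sketch of that calculation in Python, just to make the arithmetic concrete. The 330 lm/W figure is the rough efficacy of the emitted light itself (from the paragraph above), and the function name and the 3.75 W / 250 lm example values are only illustrative:

```python
# Rough LED heat estimate: heat = electrical power in - optical power out.
# 330 lm per optical watt is an approximation for cool-white phosphor LEDs;
# high-CRI parts are lower (high 200s).

LUMENS_PER_OPTICAL_WATT = 330.0

def led_heat_watts(electrical_watts: float, lumens: float,
                   lm_per_optical_watt: float = LUMENS_PER_OPTICAL_WATT) -> float:
    """Return the heat dissipated by the LED in watts."""
    optical_watts = lumens / lm_per_optical_watt   # light actually emitted
    return electrical_watts - optical_watts        # the rest becomes heat

# Example from the text: 3.75 W in, 250 lm out
print(led_heat_watts(3.75, 250))   # ~2.99 W of heat
```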
Note that until the Cree XR-E came out a few years ago, most people didn't even bother taking the light output into consideration. Most LEDs were less than 10% efficient, so it didn't affect the calculation by much. Now the best LEDs are 30% efficient or more, so it's starting to make sense to take this into account. It's also interesting to note what happens as efficiency rises further. Suppose we manage to make LEDs which are 80% efficient. That means 80% light, 20% heat. Now one might say it doesn't pay to try to get to 90%, as that would only mean 12.5% more light for a given power input. However, note that this increase cuts your heat in half compared to 80%. And going from 90% to 95% cuts it in half again! Practically speaking, this means smaller heat sinks, or more light output for a given size heat sink.
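A quick sketch to show that halving effect, assuming a fixed 1 W electrical input (the numbers scale linearly with any other input power):

```python
# Heat fraction vs. efficiency for a fixed 1 W electrical input.
for eff in (0.30, 0.80, 0.90, 0.95):
    light = eff          # fraction of input emitted as light
    heat = 1.0 - eff     # fraction of input turned into heat
    print(f"{eff:.0%} efficient: {light:.2f} W light, {heat:.2f} W heat")

# 80% -> 0.20 W heat, 90% -> 0.10 W (half), 95% -> 0.05 W (half again),
# even though the light output only grows by 12.5% and then ~5.6%.
```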