The math itself is fairly straightforward. Basically you add up the thermal resistances (K/W values) of every element between LED die -> ambient air; a quick numerical sketch follows the list below.
PER LED you have:
1) LED die -> LED case (or solder points). See the LED's datasheet.
2) LED case -> heatsink. Hard to say; it depends on the mounting method. You could try to find a K/W value for a similar mounting method in some other application, or use the LED die -> LED case value as a ballpark figure. :thinking:
For all LEDs combined you have:
3) Heatsink -> ambient air. This K/W value should be in the heatsink's datasheet, but it's only valid for a specified mounting orientation (horizontal / vertical / ...), and it assumes a uniform heatsink temperature. That assumption is reasonable with a large number of evenly spaced LEDs, but not when you've got just a few LEDs in different locations on the heatsink.
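Here's a minimal Python sketch of that resistance stack, just to show how the numbers combine. All values are made-up placeholders, not from any real datasheet; substitute your own figures:

```python
# Thermal chain: die -> case -> heatsink -> ambient, for N identical
# LEDs sharing one heatsink. Example numbers only.

N_LEDS = 6            # number of LEDs on the heatsink
P_LED = 2.0           # heat dissipated per LED (W), not electrical power
R_DIE_CASE = 3.5      # K/W, from the LED datasheet (step 1)
R_CASE_HS = 3.5       # K/W, guessed per step 2 (mounting-dependent)
R_HS_AMB = 1.2        # K/W, from the heatsink datasheet (step 3)
T_AMBIENT = 25.0      # °C

# All LEDs together heat the shared heatsink; each die then sits
# above the heatsink temperature by its own per-LED resistances.
t_heatsink = T_AMBIENT + N_LEDS * P_LED * R_HS_AMB
t_die = t_heatsink + P_LED * (R_DIE_CASE + R_CASE_HS)

print(f"Heatsink: {t_heatsink:.1f} °C, LED die: {t_die:.1f} °C")
```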
In each case, power (W) multiplied by the K/W value tells you how big a temperature rise you get from A to B. For example, if an LED is dissipating 2 W and has 3.5 K/W from LED die -> case, then the LED die will sit 7 K (or °C) above the LED case temperature. And so on for the other steps towards ambient air.
Note that the LED's heat dissipation is lower than its electrical power (volts * amperes), since a significant portion of that power is radiated out as light. So 3 W electrical power -> maybe 2~2.5 W dissipated as heat.
The remaining factor is the maximum allowed temperature of the LED die. This may be specified, but you may want to keep the die at a lower temperature (say, 80~100 °C where the specified max is 120 °C).
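You can also run the same budget backwards: pick a die-temperature target and solve for the allowed heat per LED. Again a sketch with assumed numbers:

```python
# Given a die-temperature target, how much heat can each LED dump?
# Example values only; use your own datasheet / measured figures.

T_DIE_MAX = 90.0      # °C, conservative target below a 120 °C spec
T_AMBIENT = 25.0      # °C
N_LEDS = 6
R_DIE_CASE = 3.5      # K/W
R_CASE_HS = 3.5       # K/W
R_HS_AMB = 1.2        # K/W, shared heatsink -> ambient

# Budget: T_DIE_MAX - T_AMBIENT = P*(R_DIE_CASE + R_CASE_HS)
#                                 + N_LEDS*P*R_HS_AMB
p_led = (T_DIE_MAX - T_AMBIENT) / (R_DIE_CASE + R_CASE_HS
                                   + N_LEDS * R_HS_AMB)
print(f"Max heat per LED: {p_led:.2f} W")
# Remember heat is only ~70-85% of electrical power, so the allowed
# electrical input is somewhat higher than this figure.
```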
Personally I'd just mount enough LEDs on that heatsink that you're sure your lumens requirements are covered for the area those LEDs will illuminate. Then drive them at a current for which [each LED remains within its specified limits] and [the heatsink doesn't get hot enough that you can't keep your hand on it]. Or any lower current.
The math may give you a good indication / ballpark figure, but semiconductor cooling isn't really an exact science.