For maximum life you want the junction temperature to stay at or below 80°C. Each emitter has a typical junction-to-thermal-pad resistance of 8°C/W; add maybe another 2°C/W for the thermal epoxy, giving 10°C/W total. If you plan to drive the LEDs at 1000 mA, each one dissipates about 3.7 W, so the junction rises 3.7 × 10 = 37°C above the heat sink temperature. Let's just call it 40°C. Since you want the junction at no more than 80°C, the heat sink can't exceed 80 − 40 = 40°C. Assuming a 25°C ambient, that allows the heat sink a temperature rise of 40 − 25 = 15°C. The only unknown at this point is the thermal impedance of the heat sink; just by looking at it I'd say it's around 0.3°C/W. That means you can safely dissipate 15 / 0.3 ≈ 50 W. At 3.7 W per emitter, you can mount about 12 or 13 emitters and still keep the junction temperature under 80°C.
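If you want to play with the numbers yourself, here's a quick Python sketch of the arithmetic above. All the figures come from the paragraph; the `margin` parameter is my own construct that mimics rounding the 37°C rise up to 40°C.

```python
# Thermal resistances in degC/W, temperatures in degC, power in W.
T_J_MAX   = 80.0   # target junction temperature for long life
T_AMBIENT = 25.0   # assumed ambient temperature
R_JP      = 8.0    # junction-to-thermal-pad resistance, per emitter (typical)
R_EPOXY   = 2.0    # rough allowance for the thermal epoxy
R_SINK    = 0.3    # heat sink thermal impedance, eyeballed

def max_emitters(p_per_emitter, margin=3.0):
    """Max emitters that keep the junction at or below T_J_MAX."""
    r_per_emitter = R_JP + R_EPOXY                     # 10 degC/W per emitter
    t_rise_j = p_per_emitter * r_per_emitter + margin  # junction rise above sink
    t_sink_max = T_J_MAX - t_rise_j                    # hottest allowable sink
    p_total = (t_sink_max - T_AMBIENT) / R_SINK        # safe total dissipation
    return int(p_total / p_per_emitter)

print(max_emitters(3.7))   # 1000 mA case: 13 emitters, about 50 W total
```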
Doing a similar set of calculations for driving the emitters at 350 mA instead of 1000 mA gives a maximum of about 125 emitters. The number is far greater for two reasons: each emitter dissipates far less power at 350 mA (call it 1.1 W), and the junction rise above the thermal pad is correspondingly smaller, which lets the heat sink run that much hotter.
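Using the `max_emitters` sketch above with the lower per-emitter dissipation (the 1.1 W figure assumes a forward voltage near 3.1 V at 350 mA, which is my guess rather than a datasheet number):

```python
print(max_emitters(1.1))   # 350 mA case: about 124 emitters
```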
In all of these calculations I ignored the efficiency of the emitters. Since roughly 20% to 30% of the input power leaves as light rather than heat, the actual temperature rise will be only about 70% to 80% of what I calculated. In other words, my figures give you a built-in safety margin.
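To see roughly how much margin that buys, feed the sketch only the power that actually becomes heat. Picking 25% efficiency from the 20% to 30% range above (so 75% of the input power is dissipated as heat):

```python
# 1000 mA case, counting only the ~75% of input power that becomes heat:
print(max_emitters(3.7 * 0.75))   # about 29 emitters instead of 13
```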