If it actually puts out 9A (it may be somewhat under that), the output power would be about 9A * 3.6V = 32.4W. That's the voltage times current in the LED.
I would expect that driver to have an efficiency of 80-85% (or 0.8 - 0.85), so the input power could be as high as 32.4/0.8 = 40.5W.
If you want it to run for 3 hours, you need a battery with a capacity of 40.5W * 3h = 121.5Wh. This is not to be confused with the Ah (or mAh) rating of the battery, which is a measure of charge rather than energy. To keep it simple however, the term capacity is used almost universally for both the Wh and Ah ratings ;-)
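If it helps, here's the arithmetic above as a quick sketch you can rerun with your own numbers (the current, voltage, and efficiency figures are the assumptions from this thread, not measurements of your driver):

```python
# Power and runtime arithmetic for the LED driver (assumed figures from this thread).
led_current = 9.0       # A, nominal driver output (it may actually put out somewhat less)
led_voltage = 3.6       # V, LED forward voltage at that current
out_power = led_current * led_voltage    # 9A * 3.6V = 32.4 W into the LED

efficiency = 0.80       # assume the low end of the 80-85% driver efficiency range
in_power = out_power / efficiency        # worst-case input power, ~40.5 W from the battery

runtime_h = 3.0
required_wh = in_power * runtime_h       # ~121.5 Wh of battery capacity needed
```

Using the low end of the efficiency range gives you the worst-case (largest) battery requirement, which is the safe way to size the pack.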
The Wh capacity of a battery is the Ah rating times the voltage. Since watts = volts * amps, Wh = V * Ah.
In all cell chemistries, the voltage drops as the cell is discharged. The effect is more dramatic in some chemistries than others, so the "voltage" to use when converting Ah to Wh is somewhat arbitrary. A healthy LiIon cell under moderate load will average around 3.7V, so that's the number most commonly used. For NiMH, 1.1V is a pretty good estimate, though the specified voltage of NiMH battery packs is almost always based on 1.2V per cell. That's pretty optimistic unless loads are very light. Likewise, some LiIon cells and packs are specified at 4.2V per cell. That's the MAX charging voltage, and for all practical purposes the voltage will always be less than that under load.
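To tie the two ideas together, here's a sketch of the Ah-to-Wh conversion using the 3.7V LiIon average from above. The 3.0Ah cell rating is just a hypothetical example, not a spec for any particular cell:

```python
import math

def wh_from_ah(ah, avg_cell_v, cells_in_series=1):
    """Energy capacity in Wh = average pack voltage * Ah rating."""
    return avg_cell_v * cells_in_series * ah

# A hypothetical 3.0 Ah LiIon cell at the 3.7 V average discharge voltage:
cell_wh = wh_from_ah(3.0, 3.7)                 # ~11.1 Wh per cell

# How many such cells' worth of energy covers the 121.5 Wh runtime requirement:
required_wh = 121.5
cells_needed = math.ceil(required_wh / cell_wh)
```

Note this only counts energy; the actual pack arrangement (series vs. parallel) depends on what input voltage your driver wants.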
Hope that helps!
Oh, by the way, welcome to the forum!