Would a voltage regulator have to work harder (less efficiently) at the higher voltage as compared to the 3.7v which is closer to the LED's voltage? I understand the batteries will not have to work as hard (less amps).
In the simple case, the answer to your question is generally "yes".
However, rather than think of modern driver circuitry as a "voltage regulator", try instead thinking of it as a "power converter". Also, its intended purpose is not so much to regulate voltage to the LED as it is to regulate current (amperage). It may:
a) have "boost" capability to take low voltage and high current to deliver to the LED the intended amperage (at whatever voltage results from that)
b) have "buck" capability to take high voltage and low current to deliver to the LED the intended amperage (again at whatever voltage results from that)
c) have both capabilities
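To make the buck/boost trade concrete, here is a rough sketch of the power balance in a switching driver. All numbers (cell voltage, LED forward voltage, converter efficiency) are hypothetical examples, not from any specific driver:

```python
# Rough model of a switching LED driver (buck or boost).
# The driver regulates LED current; power balance then fixes battery current:
#   P_batt = P_led / efficiency,  so  I_batt = (V_led * I_led) / (eff * V_batt)
# All values below are illustrative assumptions.

def battery_current(v_batt, v_led, i_led, efficiency=0.90):
    """Amps drawn from the battery for a given LED operating point."""
    p_led = v_led * i_led          # power delivered to the LED
    p_batt = p_led / efficiency    # power the battery must supply
    return p_batt / v_batt

# Boost case: a 3.7 V cell driving a ~6 V LED string at 1 A
print(round(battery_current(3.7, 6.0, 1.0), 2))   # battery supplies MORE amps than the LED gets

# Buck case: a 12 V pack driving the same 6 V string at 1 A
print(round(battery_current(12.0, 6.0, 1.0), 2))  # battery supplies FEWER amps than the LED gets
```

This is the flip side of the question: the higher-voltage pack does indeed see a lower current draw, because the converter trades voltage for current while keeping LED amperage fixed.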
In general, the less power conversion the driver circuitry needs to perform, the more efficient it will be, so supplying a voltage that is a bit more than the forward voltage of the LED plus the overhead voltage of the driver circuitry can be very efficient. However, this is just a broad generalization, and each different circuit design will have its own range that it is most tuned for and most efficient in.
Having said that, note that not all driver circuits are of the power-converting type. In particular, some drivers use linear current regulators, in which the amperage delivered to the LED is the same as the amperage drawn from the battery (or cells), and the wattage from any excess voltage is simply shed as heat.
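With a linear regulator the arithmetic is simpler and less forgiving: efficiency is just the ratio of LED voltage to supply voltage. A minimal sketch, using hypothetical values rather than any particular part:

```python
# Linear (non-switching) current regulator: battery current equals LED current,
# and the voltage difference is dropped across the regulator as heat.
# Numbers here are illustrative assumptions.

def linear_regulator_losses(v_batt, v_led, i_led):
    """Return (watts shed as heat, overall efficiency) for a linear regulator."""
    p_led = v_led * i_led               # useful power in the LED
    p_heat = (v_batt - v_led) * i_led   # dissipated in the regulator
    efficiency = p_led / (p_led + p_heat)
    return p_heat, efficiency

# 3.7 V cell, 3.1 V LED at 0.7 A: small drop, modest heat
heat_1s, eff_1s = linear_regulator_losses(3.7, 3.1, 0.7)

# 7.4 V pack, same LED: every extra volt becomes heat, efficiency falls
heat_2s, eff_2s = linear_regulator_losses(7.4, 3.1, 0.7)
```

This is why linear drivers pair well with a supply only slightly above the LED's forward voltage, and why feeding one from a much higher voltage mostly makes a heater.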