I recall, a long time ago when I was first studying electronics, having difficulty understanding current sources. We're used to dealing with more-or-less constant voltage sources like mains power and batteries (although regarding batteries as having constant voltage can lead us astray, too). But you've got to get your head around the idea of a constant current source to understand LED drivers.
An LED is very different from a resistor and even more different from an incandescent bulb. I have in front of me a data sheet for a Luminus SST-50 power LED so I'll use it as an example. All LEDs act very much the same way, just at different power levels.
At 3 volts forward, this LED draws about 1 amp. At 3.25 volts it draws about 2 amps. At 3.5 volts it draws about 4.5 amps. So for every quarter volt of forward voltage, the current doesn't change by some fixed amount like it would in a resistor -- it doubles! Furthermore, if you were to fix the voltage, the current and power dissipation would increase about 10% for every 10 degree C temperature increase. The LED would get hotter, draw more current, which would make it get hotter still, and so forth until it either hits a supply limit or self-destructs. This is known as "thermal runaway". A "direct drive" system works only because of internal battery resistance -- not something you can count on if you really care how hard you're driving the LED.
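To make the runaway mechanism concrete, here's a toy model in Python using only the rough numbers above: ~1 A at 3.0 V, current doubling per extra quarter volt, and about +10% current per +10 C at fixed voltage. The thermal resistance value (degrees per watt of heating) is my own assumed illustrative number, not from the data sheet.

```python
# Toy model of LED thermal runaway at fixed voltage. Numbers are the
# rough SST-50 figures quoted above; r_thermal is an assumed value.

def led_current(v_forward, temp_c, temp_ambient=25.0):
    i_room = 1.0 * 2 ** ((v_forward - 3.0) / 0.25)  # amps at room temp
    # roughly +10% current per +10 C at a fixed voltage
    return i_room * 1.1 ** ((temp_c - temp_ambient) / 10.0)

def settle(v_forward, r_thermal=5.0, temp_ambient=25.0, steps=50):
    """Iterate temp = ambient + power * r_thermal until it stabilizes.
    Returns (temp, current) if it settles, None if it runs away."""
    temp = temp_ambient
    for _ in range(steps):
        i = led_current(v_forward, temp)
        new_temp = temp_ambient + v_forward * i * r_thermal
        if new_temp > 150.0:
            return None  # blew past a sane junction temperature limit
        if abs(new_temp - temp) < 0.01:
            return new_temp, i
        temp = new_temp
    return None

print(settle(3.0))  # settles around 43 C at modest drive
print(settle(3.5))  # None -> thermal runaway at the higher fixed voltage
```

With the same fixed point iteration, a modest fixed voltage finds a stable temperature, but a half volt more sends it past any reasonable junction limit -- which is exactly the runaway described above.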
So it's a very bad idea to try to run an LED from a constant voltage the way you do an incandescent lamp. That's where constant current "sources" (usually a more-or-less constant voltage source plus a special regulator) come in. A constant current driver provides a fixed current to the LED, letting the LED voltage be whatever it needs to be. Looking at the same thermal characteristic under constant current drive: if the temperature increases 10 degrees C, the forward voltage decreases by about 25 mV. That actually makes the LED dissipation decrease, but only by about 25/3250, or less than 1%.
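The arithmetic behind that "less than 1%" figure is worth spelling out, using the 2 A operating point from the data sheet numbers above:

```python
# Constant-current drive: at 2 A the SST-50 sits near 3.25 V, and a
# 10 C rise drops the forward voltage by about 25 mV.
i = 2.0          # amps, held fixed by the regulator
v = 3.25         # volts at 2 A, room temperature
dv = 0.025       # volts lost per 10 C temperature rise

p_before = v * i
p_after = (v - dv) * i
change = (p_before - p_after) / p_before
print(f"dissipation change per 10 C: {change:.2%}")  # prints 0.77%
```

Since the current is fixed, the fractional change in dissipation is just dv/v = 25/3250, about 0.77% -- a gentle negative feedback instead of the runaway you get at fixed voltage.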
In order for a constant current regulator to work, it has to provide enough voltage for the LED to consume the desired current. For example, for the SST-50, you'd need it to be able to provide at least 3.25 volts if you want to run the LED at 2 amps. If the driver could provide only 3 volts, the LED would consume only about 1 amp -- at room temperature, that is. It would vary considerably with temperature.
There are two basic classes of regulators: linear (sometimes called passive) and switching (active). The 7135 is a linear regulator. Your idea of how it works is generally correct, but don't think of it as limiting the voltage. What it does is limit the current to a fixed value; the output voltage is simply whatever the LED voltage is at that current. Like any linear regulator, its output voltage must be less than the input voltage, and the greater the difference, the more power the regulator dissipates as heat (V * I, where V is the voltage drop across the regulator and I is the current through it) -- so the less efficient it is. When the input-to-output drop gets too low for the circuitry (that is, when the battery voltage isn't enough above the required LED voltage), the current falls below the regulated value and the 7135 acts pretty much like a low-value resistor. As the supply voltage and LED voltage drop, the LED current drops dramatically -- halving each time the voltage drops a quarter of a volt.

The other type is the switching regulator, which can transform the supply voltage to a higher or lower value in order to furnish the LED with whatever voltage it needs to draw the desired current. Boost regulators produce a higher output voltage than the supply, and buck regulators a lower one. Buck-boost regulators can do either, but are more complex and tend to be a bit less efficient. A well-designed switching regulator can typically achieve an efficiency of 80-90% over a wide range of input-output voltage differentials.
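The efficiency difference between the two classes can be sketched with a few lines of Python. The linear case follows directly from the V * I dissipation described above; for the switching case I've assumed a flat 85% conversion efficiency as a rough stand-in for a real converter, whose efficiency actually varies somewhat with load and voltage.

```python
# Driver efficiency at the 2 A example point. A linear regulator (like
# a stack of 7135s) burns the whole input-output difference as heat; a
# switching regulator is modeled with an assumed flat 85% efficiency.

def linear_efficiency(v_in, v_led):
    # The full LED current also flows through the regulator, so
    # efficiency is just the voltage ratio (quiescent current ignored).
    return v_led / v_in

def switching_efficiency(v_in, v_led, eta=0.85):
    # Assumed roughly flat regardless of input voltage, since a
    # converter trades voltage for current instead of burning the drop.
    return eta

v_led = 3.25  # SST-50 forward voltage at 2 A
for v_in in (3.4, 4.2, 6.0, 8.4):
    print(f"{v_in:4.1f} V in: linear {linear_efficiency(v_in, v_led):.0%}, "
          f"switching {switching_efficiency(v_in, v_led):.0%}")
```

Running this shows the linear driver near 96% efficient just above dropout but falling below 40% from two lithium cells in series, while the switching driver holds its (assumed) 85% throughout -- which is why multi-cell lights almost always use a buck converter.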
Hope this helps.
c_c