Why? Who declared that law?
Constant current drivers have more problems than this forum will admit to, given the amount of mail I've responded to. Current regulation electronics are prone to failure and QC issues, which might explain all the dead bucks out there.
A fixed voltage supply will also not put out more voltage than it's rated for. However, a current regulated supply will fling out its max voltage under less than ideal conditions, including transient shorts, power-ups, etc.
So will a voltage regulated supply if something goes wrong. Constant voltage circuits use a feedback network ( generally a resistor divider ) to determine the output voltage. If one of those resistors goes, then the circuit may think it has zero on the output. Bingo, output keeps increasing until it reaches the input. Yes, I've seen it happen, thankfully only once.
The ideal for driving an LED is a constant current/constant voltage circuit but unfortunately few drivers do it this way. Ideally, the voltage limit is set above whatever the string of LEDs will reach under normal operation. This ensures constant current operation except if the string open circuits. If that happens, voltage rises, but not to the point where it will kill the LEDs if the open circuit somehow reconnects itself.
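A minimal sketch of the CC/CV behavior described above, assuming a purely resistive load and made-up setpoints ( 500 mA, 11V limit ) chosen to match the 3-LED example later in the thread:

```python
# Illustrative model, not a real driver: a CC/CV supply holds current at
# i_set until the load would need more than v_limit, then clamps voltage.
def ccv_output(load_resistance_ohms, i_set=0.5, v_limit=11.0):
    """Return (volts, amps) the supply settles at for a resistive load."""
    v_needed = i_set * load_resistance_ohms   # voltage required for full current
    if v_needed <= v_limit:
        return v_needed, i_set                # constant-current region
    return v_limit, v_limit / load_resistance_ohms  # voltage-limited region

# A 3-LED string around 10 V at 500 mA looks like roughly 20 ohms:
print(ccv_output(20.0))   # (10.0, 0.5) -- normal CC operation
# An open circuit (enormous resistance) only rises to the 11 V limit:
print(ccv_output(1e9))    # voltage clamped at 11.0, current near zero
```

The point is the last line: with the voltage limit set just above the string's normal operating voltage, an open circuit can't push the output anywhere dangerous.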
The problems you mention are poor design. If your constant current circuit doesn't have a voltage limit, then all the components on the output ( i.e. filter caps ) should be able to take whatever voltage the output rises to ( generally something close to the input voltage ). Sadly many don't. The end result, predictably, is failed supplies.
I think the reason you've had a lot better luck with CV supplies might have to do with numbers more than anything else. CV supplies are made in massive numbers, so at any given price point they can simply be designed better thanks to economies of scale. CC supplies mostly cater to the LED market, which is a niche compared to the market for power bricks. You get a lot of one-off designs which really aren't thoroughly tested before coming to market.
As for why operate LEDs on constant current ( besides that the major LED manufacturers all say you should in their design guides ), the reason is to better control heat and light production. LEDs have a wide variance in forward voltage. Operate a string of 3 Crees at 10V, for example, and most of the time forward current might be 500 mA, give or take. However, if you happen to get LEDs with particularly low Vf they might well be pulling over an amp. If your heat sink is barely adequate for 500 mA then you have problems. Temperature goes up, Vf drops further, current goes up. Eventually either the power supply, or the LEDs, or both, destroy themselves.

With a constant current supply of 500 mA you'll have a much smaller variation in heat output ( it would really only vary with Vf ). With constant voltage, your heat here is 10V times whatever current the LEDs happen to pull at 10V ( less light output, of course ). That could easily vary by a factor of 2 or more. With a 500 mA constant current circuit, heat is 500 mA times the total forward voltage of the string ( which might vary from ~9.3V to maybe 10.5V, or only about 13% ). If you want to massively heat sink your LEDs, then constant voltage is just fine. I've done it both ways, but I'm honestly more comfortable with constant current, or better yet CC/CV.

The downside to CC/CV is that you're pretty much limited to a fixed number of LEDs. If you put more LEDs in, they won't be driven at full current. If you put fewer in, they'll get the full current but won't be protected in the event of an open circuit/reconnect. This is why most drivers are simply CC, for flexibility. Not an ideal situation, but not as problematic as you make them out to be if designed well ( most sadly aren't ).
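The heat-spread comparison above works out like this, using the same illustrative numbers from the paragraph ( a 10V / 500 mA string, current swinging up to 2x under CV, string Vf ranging ~9.3V to ~10.5V under CC ):

```python
# Illustrative numbers from the discussion above, not measured data.

# Constant voltage at 10 V: current can swing from ~0.5 A to ~1.0 A
# depending on the Vf bin you happen to get.
cv_power = [10.0 * i for i in (0.5, 1.0)]
print(cv_power)                  # 5.0 to 10.0 watts -- a 2x spread

# Constant current at 500 mA: only the string Vf varies, ~9.3 to ~10.5 V.
cc_power = [0.5 * v for v in (9.3, 10.5)]
print(cc_power)                  # 4.65 to 5.25 watts

spread = cc_power[1] / cc_power[0] - 1
print(f"{spread:.0%}")           # about 13%
```

Same LEDs, same nominal operating point, but the worst-case heat your sink has to handle is far better bounded under constant current.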
For instance, a solder joint breaks on a string of LEDs on a 700 mA current-regulated supply, causing intermittent contact, and hence rapid pulses of dangerous voltage as the circuit keeps discharging. Next thing you know I'm replacing $50 worth of Crees. Never had that happen with a constant voltage supply.
Are you referring to those Xitanium drivers? That shouldn't happen on a well-designed supply. I've made CC circuits where I can connect LEDs to powered drivers all day long without problems. The key is to limit the output capacitance to something reasonable so that the LED junction isn't hit with massive amounts of energy. Apparently Xitanium doesn't do that.
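The output capacitance point comes down to stored energy: a cap charged to the open-circuit voltage dumps E = ½CV² into the LED junction the instant contact is remade. A rough sketch with illustrative values ( the 20V and the cap sizes are assumptions, not figures from any particular driver ):

```python
# Energy stored in the output cap, dumped into the LED on reconnect.
def cap_energy_joules(capacitance_farads, volts):
    return 0.5 * capacitance_farads * volts ** 2

# An oversized 470 uF output cap sitting at 20 V holds ~94 mJ -- a hefty
# spike for a small die to absorb in an instant:
print(cap_energy_joules(470e-6, 20.0))   # 0.094 J
# A modest 4.7 uF cap at the same voltage holds only ~0.94 mJ:
print(cap_energy_joules(4.7e-6, 20.0))   # 0.00094 J
```

A hundred-fold reduction in output capacitance means a hundred-fold smaller energy dump on every intermittent reconnect, which is why a well-designed CC driver can tolerate hot-plugging.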
When a laptop brick shorts or detects a problem, it typically shuts off for a few seconds. When a 700 mA current-regulated driver detects a fault it delivers more power first before stopping.
Shouldn't happen. When a 700 mA CC circuit is shorted, you should get 700 mA going through the short, and near zero power. Or at least that's what happens with the drivers I make.
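The arithmetic behind "near zero power" is just P = I²R: the driver keeps regulating 700 mA, so the power dissipated in the short depends only on the short's tiny resistance. A quick sketch ( the 10 milliohm figure is an assumption for illustration ):

```python
# Power dissipated in a short on a constant-current output: P = I^2 * R.
def short_power_watts(i_amps=0.7, r_short_ohms=0.01):
    return i_amps ** 2 * r_short_ohms

print(short_power_watts())   # roughly 5 mW in a 10-milliohm short
```

Compare that with a constant voltage supply, where a short draws whatever current the supply can source until protection kicks in.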
The driver problems you or your customers experience are mostly to do with designers of CC circuits not knowing what the heck they're doing ( or being forced to do things they shouldn't to contain costs ). I've had the opportunity to examine drivers even on $200 lights. For the most part I'm shocked by what I see. Inadequate current sense resistors, inadequate MOSFETs. Inductors which operate very close to saturation even at the design current. Output caps with voltage ratings far below the input ( the best I saw was a 4V cap on the output of a 12 volt input driver; yes, it was designed to drive 1 LED, but if the output was open then the cap fried ). It's even worse with boost drivers. At least with buck drivers the output voltage is inherently limited to the input voltage less a few tenths. Boost drivers on the other hand MUST be designed with some means of limiting voltage in the event of an open circuit or they WILL destroy themselves. Sadly, a lot aren't.
There's a good reason I don't really bother with commercial drivers when I make projects. They're a crapshoot. Some are great, most aren't. Which you end up with is sadly not necessarily determined by what you pay for them.