Hi there,

I have checked the manufacturer's info.

The specs are as follows:

Power: 10 W

Lumens: 900-1000 lm

Current: 800-900 mA

Voltage: 9-12 V (seller recommended a 1.5 ohm resistor for 12 V)

Each chip contains a 3x3 array of LEDs.

Is a 1 W, 1.5 ohm resistor okay for each LED chip I have?

At 12.6 V they get quite hot to the touch... (over 100 degrees Celsius)

Eklogite

The seller saying to use a 1.5 ohm resistor seems kinda vague to me. The right value will change with slight voltage variance, heat, and other factors. That said, a resistor is a poor way of driving an LED. But let's do it.

First off, this is all calculated using Ohm's Law.

Let's do the math. We need to know the power supply's voltage; I'll give an example using a 12.6 V supply.

The formula for the resistor is R = V / I, where V is the voltage the resistor has to drop (supply voltage minus the LED's forward voltage).

Ex: say you have a 12.6 V supply, and the LED wants 12 V at 0.8 A.

(12.6 - 12) / 0.8 = 0.75 ohm resistor
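That step can be sketched in a few lines of Python (the function name is just for illustration; the 12.6 V / 12 V / 0.8 A figures are the example numbers from this post):

```python
def series_resistor(v_supply, v_forward, current):
    """Ohm's law: the resistor must drop the excess supply voltage
    (supply minus LED forward voltage) at the target current."""
    return (v_supply - v_forward) / current

# Example from above: 12.6 V supply, LED wants 12 V at 0.8 A.
r = series_resistor(12.6, 12.0, 0.8)
print(round(r, 2))  # -> 0.75 ohm
```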

Ex 2: to calculate the wattage rating of the resistor, use the formula P = V x I, where V is again the drop across the resistor.

So, here we take the supply voltage, subtract the LED forward voltage, and multiply by the current: (12.6 - 12) x 0.8 = 0.48 watts dissipated in the resistor.

So, if you had a 12.6 V supply, you would want a 0.75 ohm resistor; it dissipates about 0.48 W, so a 1 W part is a safe choice.
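Putting both formulas together in one sketch (same example numbers from the thread; note the dissipation comes out around 0.48 W, so a 1 W part has roughly 2x headroom):

```python
def series_resistor_specs(v_supply, v_forward, current):
    """Return (resistance in ohms, dissipation in watts) for the series resistor."""
    v_drop = v_supply - v_forward   # voltage the resistor drops
    resistance = v_drop / current   # R = V / I
    power = v_drop * current        # P = V x I, burned as heat in the resistor
    return resistance, power

r, p = series_resistor_specs(12.6, 12.0, 0.8)
print(round(r, 2), "ohm,", round(p, 2), "W")  # -> 0.75 ohm, 0.48 W
```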

It's always better to round up on the wattage of a resistor if you have the room to.

Personally, I'd look on eBay for a "constant current" driver that matches the specs you're after.

I say that because as heat builds up, the LED's forward voltage drops, so with a fixed resistor the current rises, which makes more heat. Rinse and repeat until it fries itself.
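A toy sketch of that feedback loop, using the 0.75 ohm resistor from the earlier example (the 0.05 V-per-step drop in forward voltage is a made-up illustration, not a datasheet figure):

```python
v_supply = 12.6
resistance = 0.75   # ohms, from the earlier example
v_forward = 12.0    # volts, cold

# As the junction heats up, the forward voltage falls, so the resistor
# sees a bigger drop and the current climbs, making yet more heat.
for step in range(4):
    current = (v_supply - v_forward) / resistance
    print(f"Vf = {v_forward:.2f} V -> I = {current:.2f} A")
    v_forward -= 0.05   # hypothetical drop per step as the LED heats
```

Each pass through the loop the current climbs above the intended 0.8 A.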

A constant-current driver will keep the amperage in check.

Also, ensure your heatsink is adequate for the LED.