Solar Charging Batteries

Hi guys,

I'm a bit of a novice when it comes to electronics, so I'm just looking for some pointers on the following problems I'm trying to get over.

1. I am trying to build a solar panel battery charger to charge, say, my 18650 batteries and also a 12V car battery. However, I do not know what voltage/current is required to charge the batteries, or what power, etc. Supposing the 18650 battery is 3.7V, what is the minimum voltage I need across the battery for it to charge, and for how long? Is there a formula I can use?

2. How can I tell if the battery is fully charged? I have a voltmeter and an ammeter but, again, I do not have a formula to use to see if the voltage from the battery is good enough. I know from reading up on the forum that an 18650 battery coming off the charger would be at around 4.25V. So is that about 15% above the voltage rating of the battery (i.e. 1.15 x 3.7)?

3. My final question is in relation to the car battery. I want to run some 3W Luxeon/CREE LEDs in my living room as secondary lighting from a 12V car battery. I want to charge the car battery during the day and use it to light up my living room/hallway at night. But I want to know whether there will be enough charge in the battery to keep the LEDs going for, say, 4 hours a night, and also whether the solar panels will provide enough charge to fully top up the battery each day. I've read that to calculate how long the battery will last, you simply divide the capacity of the battery by the current draw x 0.7. Is this correct?

Thanks for any help in advance!

:)
CTR
 
Hi guys,

I'm a bit of a novice when it comes to electronics, so I'm just looking for some pointers on the following problems I'm trying to get over.

1. I am trying to build a solar panel battery charger to charge, say, my 18650 batteries and also a 12V car battery. However, I do not know what voltage/current is required to charge the batteries, or what power, etc. Supposing the 18650 battery is 3.7V, what is the minimum voltage I need across the battery for it to charge, and for how long? Is there a formula I can use?

2. How can I tell if the battery is fully charged? I have a voltmeter and an ammeter but, again, I do not have a formula to use to see if the voltage from the battery is good enough. I know from reading up on the forum that an 18650 battery coming off the charger would be at around 4.25V. So is that about 15% above the voltage rating of the battery (i.e. 1.15 x 3.7)?
CTR


Hi,

There should be some good info around here or on the web about charging lithium batteries (I assume the 18650 is lithium, based on the 3.7V rating). All I know is that lithiums are very demanding with regard to charge regulation. If you aren't using a suitable charge controller, then you are taking a large risk (i.e. risk of fire or explosion).


3. My final question is in relation to the car battery. I want to run some 3W Luxeon/CREE LEDs in my living room as secondary lighting from a 12V car battery. I want to charge the car battery during the day and use it to light up my living room/hallway at night. But I want to know whether there will be enough charge in the battery to keep the LEDs going for, say, 4 hours a night, and also whether the solar panels will provide enough charge to fully top up the battery each day. I've read that to calculate how long the battery will last, you simply divide the capacity of the battery by the current draw x 0.7. Is this correct?

Thanks for any help in advance!

:)
CTR


This is a relatively simple calculation, assuming that you are just using a resistor to limit the LED current (i.e. not using a switching regulator to drive the LEDs). Well, it is if the battery has an amp-hour rating. As an example, let's say that the LEDs are drawing 2 amps from the battery. Also, let's assume that the battery has a rating of 100 amp-hours. The run time is given by dividing the battery amp-hour rating by the actual current draw, which yields an answer of 50 hours.
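
A quick sketch of that arithmetic in Python (the 2 A draw and 100 Ah rating are just the example figures above, and the "x 0.7" from the original question would be an optional derating for usable capacity):

# Rough run-time estimate for a resistor-limited LED load on a battery.
# Example figures only: a 100 Ah battery and a 2 A load.
battery_capacity_ah = 100.0   # battery rating in amp-hours
load_current_a = 2.0          # actual LED current draw in amps

run_time_h = battery_capacity_ah / load_current_a
print(run_time_h)             # 50.0 hours

# To be conservative about usable capacity (the "x 0.7" in the question),
# derate the result:
print(run_time_h * 0.7)       # about 35 hours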

It doesn't hurt to run an experiment to validate your calculations. A lead-acid battery can be float charged at around 13.7V to 14.4V. Take the fully charged battery, hook up the LEDs, and monitor the voltage. Every hour or so, write down the battery voltage. There will be a slow decrease in the voltage. At some point, the battery voltage will start to decrease quickly. That's a good sign that the available charge has been used up, and that the battery can be damaged by further discharge.

In this sort of set-up, where a battery is charged from the solar panel, you won't always get the battery fully charged each day (it might be a cloudy day). A good practice is to have a low-voltage cutoff circuit, which disconnects the load from the battery once the battery voltage has dropped below a safe limit.

Well, that's a quick answer to your questions.... sort of. In general, I'd stick to charging lead-acid or NiCd batteries if you don't want to use any sophisticated charge controllers. And even with these batteries, some circuitry is recommended to avoid damaging them during discharge.

regards,
Steve K.
 
18650s: charge them at about 1 amp, and with solar you can just put in a voltage regulator that stops at ~4.2V ±0.05V. With a voltage regulator they will charge faster for the first 80% or so, and the rate will taper down as the voltage differential gets smaller. Simple, easy and lame.

If you're going to use any li-ion in quantity for area-lighting solar applications, you could cap the charge at a lower voltage, like a nice 4.1V, and get a better operating lifetime out of them.

Also, there is a cheap trick that uses a protection PCB to get to the 4.25V with solar, but it has issues, like it won't restart until the next day if the trigger voltage is reached. And it would not be great to use the protection circuit for discharge cutoff. I use that method for manual solar charging, where discharge is controlled elsewhere.

There is a voltage chart for li-ions that gives you a fair idea of a li-ion's charge state when RESTED. Under a charge or a load that voltage is different, and dependent on the load and charge. But if you hook up to solar and LEDs, then you can toss a meter in and see the voltages all the time, so you will have a fair idea yourself. The load and the charge rate will vary the voltage highs and lows, but with li-ion you can still gauge the approximate capacity from the voltages, like a gas gauge.

A li-ion is basically fully charged when it's at 4.1V - 4.2V; for all practical purposes it is then charged. It's empty when the capacity is exhausted and the output drops rapidly, which is around 3.1V - 3.5V depending on the age of the cell, the type of cell, and the load at the time. Cutoff circuits are set for 2.4V - 2.7V, and are basically last-chance things. Going over or under is BAD, very bad.
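
As a reference, here is that "gas gauge" idea as a rough Python sketch. The resting-voltage figures are ballpark assumptions of mine, not datasheet values, and they shift with cell age, chemistry and temperature:

# Very rough state-of-charge estimate from a RESTED li-ion cell voltage.
# The table is approximate; treat it as a gas gauge, not a measurement.
REST_VOLTAGE_TO_SOC = [
    (4.20, 100), (4.10, 90), (4.00, 80), (3.90, 70),
    (3.80, 55),  (3.70, 40), (3.60, 25), (3.50, 10), (3.30, 0),
]

def approx_soc(rest_voltage: float) -> int:
    """Return an approximate state of charge (%) for a rested cell."""
    for volts, soc in REST_VOLTAGE_TO_SOC:
        if rest_voltage >= volts:
            return soc
    return 0

print(approx_soc(3.85))   # -> 55, i.e. roughly half full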


To do the math (ooh, my head hurts) on the solar, just ask Darrel :)
Solar panels are rated by MAX watts in full sun, aimed straight on.
They are also rated on voltage when unloaded, and current when shorted.
And of course all that stuff is almost useless, because very few of those factors ever exist at the same time.
The voltage to charge must be higher than the battery's, and the amperage under load is relative to the voltage differential between the battery and the solar panel. With just the right solar voltage, the charge appropriately tapers off when the battery is mostly charged.

A good marine deep-discharge battery of big truck size has about 80-100 amp-hours of total capacity. To charge a 12V lead-acid battery with solar cells rated by open-circuit voltage, you need about 18V of solar cells.

3 amps (spec'd) of cells, properly loaded via the voltage differential, will charge at a good 2.5 amps.
You've got to leave this stuff out IN the sun, so it gets dirty; the top translucent cover reduces output; and picking up solar THROUGH your normal home thermal glass can cut it back another 20%.
The voltage of solar cells does NOT change much when there is way less light, but the amperage changes LOTS. This is very useful when they're used to charge and run stuff.

Solar cells without tracking the sun are 40% less efficient than when tracking the sun (daily). And because most people find it easier to just LOCK the solar cells down, instead of using devices to turn them (which makes the mount more flimsy), they often prefer to just have 2 times as much solar cell area.
Adjusting the solar cells for the SEASON can be done manually, on one azimuth, and that fixes the second direction for tracking.

A cloudy day gives you only 10-25% of full power, ever; flexible plastic cells drop down to completely useless on cloudy days.

Then, because lead-acid batteries don't like to be deep discharged, people like to have about 4-8 times as much battery as a minimum: first, you don't want the battery to deep discharge daily, and second, how do you get through winter?

So I recommend getting way more solar panel and way more battery than you think you will need, and using it in an efficient manner, because everybody adds this stuff in later anyway, and then it looks like a motley assortment.
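
Pulling those rules of thumb together, a back-of-the-envelope Python sketch (the sun hours, load and derating factors are all just illustrative assumptions):

# Rough daily charge budget using the derating rules of thumb above.
real_charge_current_a = 2.5   # what "3 amps (spec'd)" of cells actually push into the battery
sun_hours_per_day = 4.0       # assumed hours of useful sun
behind_glass_factor = 0.8     # ~20% loss charging through a window
cloudy_factor = 0.25          # a cloudy day gives only 10-25% of full power

good_day_ah = real_charge_current_a * sun_hours_per_day * behind_glass_factor
bad_day_ah = good_day_ah * cloudy_factor
print(good_day_ah, bad_day_ah)   # ~8 Ah on a good day, ~2 Ah on a cloudy one

# Lead-acid sizing rule of thumb: 4-8x the daily draw so it never deep-discharges.
daily_load_ah = 3.0              # e.g. ~0.75 A of LEDs for 4 hours
print(daily_load_ah * 4, daily_load_ah * 8)   # 12-24 Ah of battery, minimum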

A single 1W Luxeon uses 1W of power when run at full speed.

The one thing you are going to have to learn is the volts/amps/watts formula. It's rather simple, and will help with all of your calculations and conversions.
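
For instance, a minimal sketch of that formula applied to the living-room lighting idea (three 3W emitters and a 12V battery are assumed numbers, and driver/resistor losses are ignored):

# Power formula: P = V * I, so I = P / V and V = P / I.
# Assumed example: three 3 W LEDs run from a 12 V battery for 4 hours a night.
led_power_w = 3.0
led_count = 3
battery_voltage_v = 12.0
hours_per_night = 4.0

total_power_w = led_power_w * led_count            # 9 W of LED power
current_a = total_power_w / battery_voltage_v      # 0.75 A at 12 V (losses ignored)
amp_hours_per_night = current_a * hours_per_night  # 3 Ah per night from the battery
print(current_a, amp_hours_per_night)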

I (myself) find that self-regulated solar kicks butt, for certain reasons. My small test system runs mostly unregulated. The more sun, the more light each night; the less sun, the less light; and with no sun I have less and less light, until... well, that's never happened :) because when the LED voltage drops the consumption goes WAY down, so I never run out of light.

Just like an unregulated LED flashlight, it does diminish, but it sure doesn't leave you in the dark. Besides, simple, easy, less-regulated stuff is cheap and easier to do, and less can go wrong.
 
Wow, thanks for the advice guys.

By your responses, I gather this is nothing too taxing for you guys. Do you know of any websites that can show me how to go about working out what I need to know and how to calculate the voltages/current/power etc?

Steve K, yes, I was hoping just to use a resistor with the car battery, but I was a bit worried that most resistors would simply burn out as soon as they were connected to the battery.

I need to charge the 18650 batteries for my LED torches as I work night shifts quite often, so I was hoping to save on costs and the environment! It sounds like charging these lithium batteries is more difficult than I first anticipated. This is not too much of a problem as my main "want" is really the LED living room/hallway lighting. But I'm not sure if they will be bright enough?
 
Also, if my solar panels gave out, say, 6V but I needed 12V, how would I go about raising the voltage?

Is there an "inverter" type device that can be sourced easily and cheaply?
 
If you need 12V from 6V solar cells, just use 2 of them in series :)
 
My WF139 Li-ion charger has a 12 volt input jack.

I'm not sure if it (the 12V jack) actually works, but providing it with 12 volts/400mA from a solar panel should be just a matter of plugging the panel into the charger. Then the charger circuits will handle properly charging the cells. Maybe you would need a 12V battery or a large capacitor between the solar panel and the charger to provide a steadier voltage supply to the WF139.

I'd like to hear more people's input on this, as I'm also interested in an "off the grid" charging option for my Li-ion cells.

Paladin
 
Right, most people would use the battery, because if the voltage dropped, the circuit could struggle, and you wouldn't know how that circuit is built, or what happens to it when it gets a sagging input.
Most stuff would stop working or slow down, but with MOSFET gates not being triggered FULLY, they can heat up, because they become much more resistive. So it could fail over time from running poorly at lower input voltages. Again, this is where it is useful that the solar cell doesn't change much in voltage, but in its current output.
If it's something you design yourself, then you can know how it will react.
 
1. I am trying to build a solar panel battery charger to charge, say, my 18650 batteries and also a 12V car battery. However, I do not know what voltage/current is required to charge the batteries, or what power, etc. Supposing the 18650 battery is 3.7V, what is the minimum voltage I need across the battery for it to charge, and for how long? Is there a formula I can use?

a) Start with an assumption. Let's assume you will charge your torch batteries at night.

You should use one of the smart chargers that utilizes 12 volts DC for its power. You need a smart charger, 12v solar panel, 12v battery and a battery charge controller.

b) To size the solar array, first determine the maximum current you will need at 12V during charging, and multiply that number by the number of hours you anticipate you will operate your smart charger. The resulting value in amp-hours is the estimated charge your array must supply during the day and your 12V battery must supply to your charger at night.

c) To size the battery, just assume a 50% maximum depth of discharge. Then the battery capacity must be twice the amp-hour figure calculated in step b. The battery must also be capable of supplying the maximum current you specified in step b. (A worked example follows this list.)

d) The battery charge controller will connect between the solar array and the 12V battery, and will sense when the 12V battery is fully charged by detecting voltage rollover, which shows up as a voltage peak followed by a slight drop, accompanied by an abrupt rise in temperature (for lead-acid). The charge controller will consume a small amount of energy of its own, but a "12V" solar array will actually supply about 14V, so design margin is built in to account for the controller.

e) Your smart charger connects directly to the 12v battery.
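
As promised in step (c), here is a worked example of steps (b) and (c) as a small Python sketch. The charger current and run time are assumed figures, not from any real charger:

# Worked example of steps (b) and (c) for a 12 V smart charger.
# Assumed figures: the charger draws 0.5 A at 12 V and runs 3 hours a night.
charger_current_a = 0.5
charging_hours = 3.0

# (b) charge the array must supply by day and the battery by night, in amp-hours
daily_charge_ah = charger_current_a * charging_hours    # 1.5 Ah

# (c) battery sized for a 50% maximum depth of discharge
battery_capacity_ah = 2 * daily_charge_ah               # at least 3 Ah

print(daily_charge_ah, battery_capacity_ah)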

2. How can I tell if the battery is fully charged? I have a voltmeter and an ammeter but, again, I do not have a formula to use to see if the voltage from the battery is good enough. I know from reading up on the forum that an 18650 battery coming off the charger would be at around 4.25V. So is that about 15% above the voltage rating of the battery (i.e. 1.15 x 3.7)?

With a smart charger you won't have to run the charge cycle manually at all.
 
I see what you mean. So, instead of building my own charger, just get an "off the shelf" in-car charger and connect it to a 12V solar panel?
 
He said the charger, not the panel! :p

But - one word of caution - check the input voltage rating of the charger against the on-load panel voltage. You may or may not need a dropping resistor to protect the charger from excessive voltage.

And - maybe - a blocking diode. I have known chargers, and lead-acid solar regulators, which let a reverse current flow back through the panel at night, which is obviously somewhat counter-productive.
 
Thanks!

I have some high-voltage unidirectional diodes which may do the job, but I am not sure whether, if I wire these up to a car battery, they would just burn out, as the current is high. Any ideas?

Also, solar panels are hopefully going to be one means of generating electricity; I also have other ideas which I will hook up if I can get the solar panels working.

But back to my question about raising the voltage: I know connecting two 6V solar panels in series will give me 12V :p but I was hoping for someone to say that there is an inverter-type device that will also do the job.

I'll go shopping for these bits soon, so hopefully I can start putting one together shortly. Will post pics if I can get it to work!
 
That's right, place the charger somewhere cool, in the shade. Otherwise the cells won't be happy.

 
Check the charge current against the diode's rating. Any cheap silicon diode will have more than enough capacity - I would be surprised if you will be charging at more than an amp.

Another word of caution - either one 12V panel, or two 6V panels in series, will put out anything from sixteen to nineteen volts at peak power.

And more under a light load - anything up to 22 volts.

So - it is vital to check the input voltage rating of your charger, and maybe add a dropping resistor, but I've already said that, come to think of it.
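
If a dropping resistor does turn out to be needed, a minimal Ohm's-law sketch of how one might be sized (the panel voltage, charger input limit and charge current below are assumed figures):

# Rough sizing of a series dropping resistor between panel and charger.
# Assumed numbers: panel around 18 V under load, charger input kept to 14 V,
# charge current about 0.5 A.
panel_voltage_v = 18.0
charger_max_input_v = 14.0
charge_current_a = 0.5

resistance_ohms = (panel_voltage_v - charger_max_input_v) / charge_current_a  # 8 ohms
power_dissipated_w = charge_current_a ** 2 * resistance_ohms                  # 2 W, so fit a 5 W part
print(resistance_ohms, power_dissipated_w)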
 
Raw solar panels don't behave much like a power supply. Compared to a power supply, they have a high internal resistance, making them act a lot more like a current source than a voltage source. As Ictorana says, the open-circuit voltage of a "12 volt" panel is typically around 20 volts. The short-circuit current isn't too much greater than the current it delivers into the 12 - 14 or so volts typical of a lead-acid battery. Maximum power output is at about 12 - 14 volts. At higher output voltages, the current drops rapidly.
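
A crude, purely illustrative sketch of that current-source behaviour, using a piecewise approximation with made-up panel numbers (not a real PV model):

# Crude piecewise I-V curve for a "12 V" panel: roughly constant current up to
# a knee, then a steep fall to zero at the open-circuit voltage.
I_SC = 3.0     # short-circuit current, amps (made up)
V_KNEE = 15.0  # voltage where the current starts to fall off (made up)
V_OC = 20.0    # open-circuit voltage (made up)

def panel_current(v_out: float) -> float:
    """Approximate panel output current at a given output voltage."""
    if v_out <= V_KNEE:
        return I_SC                                 # current-source region
    if v_out >= V_OC:
        return 0.0
    return I_SC * (V_OC - v_out) / (V_OC - V_KNEE)  # steep fall-off near V_OC

for v in (12.0, 14.0, 17.0, 20.0):
    print(v, panel_current(v))   # ~3 A into a battery, 0 A at open circuit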

Some regulator circuits aren't very happy with this large source resistance, and will become unstable or otherwise malfunction. Sometimes a large capacitor across the panel will provide a low enough impedance to keep an otherwise upset regulator happy.

You can find a huge amount of information about solar panels and battery charging on alternative power web sites, although most charging systems are designed for lead-acid batteries.

c_c
 
Apart from the solar panels, I am considering building my own small wind turbine to help with the trickle-charging. But as the hardest part is getting a decent generator/motor, I am going to struggle to find something cheap and suitable.

Also, as the propeller usually rotates at low rpm, I will need some gearing to speed things up. Has anyone made a small wind generator before? If so, what motor did you use and how do you increase the voltage?

Would a small RC Car motor be sufficient to generate the voltage I need?
 