Re: Battery Chargers
Guys,
I am afraid you are confusing things a little because of the common terms. In SLA/AGM batteries, in order to force charge current to flow, the external power supply must have a supply voltage higher than the cell voltage. So when you measure a battery that is connected to a charger, and the charger is supplying current, you are measuring the supply voltage, not the battery voltage. Remember that the battery has an internal resistance, so you are really measuring the battery cell voltage plus the voltage drop across that resistance. That voltage drop increases as the charge current goes up. However, there are limits on how much of that current can be converted into reversing the chemical reaction within the battery. No matter how much current you attempt to force into the battery, any current in excess of what can be used for charging is lost as heat in the internal resistance. Once the battery is fully charged, all excess current is converted to heat.
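To make that concrete, here is a minimal sketch of the measurement described above: with the charger connected, the meter reads the cell EMF plus the I*R drop across the internal resistance. The EMF and current figures here are illustrative assumptions, not measurements from a real battery.

```python
# Terminal voltage while charging = cell EMF + I*R drop across the
# internal resistance. R_INTERNAL is the 0.011 ohm figure quoted later
# in this post; the EMF and current are assumed values for illustration.

R_INTERNAL = 0.011  # ohms

def measured_voltage(cell_emf, charge_current, r_internal=R_INTERNAL):
    """Voltage seen at the terminals while charge current flows INTO the battery."""
    return cell_emf + charge_current * r_internal

# Example: an assumed 12.0 V battery being charged at 6 A
print(round(measured_voltage(12.0, 6.0), 3))  # 12.066
```

Note the drop from the internal resistance alone is tiny; in a real charge cycle the cell EMF itself also rises, which is why the supply voltage climbs much higher than this simple model shows.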
So when charging at 2 amps, the charger initially sets the supply voltage to whatever produces 2 amps of charge current. This will fully charge the battery; it will just take longer and produce less heat. To pick an arbitrary value, say you measure 12.8 volts on the power supply at 2 amps. At 6 amps, that voltage will increase to perhaps 14.5 volts. The battery will still fully charge, but in less time than at the 2 amp setting. However, the increased current will also generate increased heat in the internal resistance. The internal resistance is 0.011 ohms, so it is a simple matter to make a calculation. At 2 amps, 0.044 watts will be dissipated in the internal resistance; at 6 amps, close to 0.4 watts. The remainder is used in the charge chemical reaction (which also produces some heat). If we used a constant current charger, then at the moment the chemical reaction is finished, all power would be dissipated as heat within the battery, or nearly 72 watts (roughly 12 volts times 6 amps). Thankfully, SLA chargers do not use a constant current charge method, and most smart chargers never charge at their rated current for the entire charge cycle.

Please keep in mind that different battery chemistries have different cell voltages. Our batteries are very close to 2 volts/cell. Your car battery is more like 2.3 volts per cell, so it will show 13.8 volts when fully charged. Alkalines are 1.5 volts per cell, NiCad is 1.2, and lithium is 3 volts per cell.
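The I²R figures above are easy to check. Only the 0.011 ohm internal resistance comes from the post; the rest is plain Ohm's law.

```python
# Heat dissipated in the battery's internal resistance: P = I^2 * R.
# R_INTERNAL is the 0.011 ohm figure from the post.

R_INTERNAL = 0.011  # ohms

def heat_watts(current_amps, r=R_INTERNAL):
    """Power lost as heat in the internal resistance, in watts."""
    return current_amps ** 2 * r

print(round(heat_watts(2), 3))  # 0.044  (the 2 amp case)
print(round(heat_watts(6), 3))  # 0.396  ("close to 0.4 watts" at 6 amps)
```

Note the quadratic growth: tripling the charge current from 2 A to 6 A multiplies the heat in the internal resistance by nine.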
As I explained before, there is always the "surface charge" that will confuse your readings if you remove the battery from the charger and then immediately measure the terminal voltage. It is likely to measure as much as 14 or 15 volts, but it is a lie. Leave the battery for a while, or connect it to a load, and the terminal voltage will fall back to 12 volts. Make that an excessive load, like six motor drives, and the terminal voltage will fall well below 12 volts, again because of the voltage drop across the internal resistance. The Battery Beak switches in different loads to make a quick calculation of terminal voltage, capacity, and internal resistance. The CBAIII simply charts the terminal voltage vs. time while connected to a dynamic load. The resulting chart will mimic the discharge graphs supplied by the manufacturers and will make an amp-hour calculation based on the data.
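The voltage sag under load follows the same internal-resistance model, just with the sign flipped: terminal voltage is the rested EMF minus the I*R drop. The 20 amps per motor drive below is a made-up figure purely for illustration; only the 0.011 ohm resistance comes from the post.

```python
# Terminal voltage under discharge = rested EMF - I*R drop.
# R_INTERNAL (0.011 ohms) is from the post; the load currents are
# assumed values chosen only to illustrate the sag.

R_INTERNAL = 0.011  # ohms

def terminal_voltage(rested_emf, load_current, r=R_INTERNAL):
    """Voltage at the terminals while load current flows OUT of the battery."""
    return rested_emf - load_current * r

light = terminal_voltage(12.0, 10.0)      # a modest 10 A load: barely sags
heavy = terminal_voltage(12.0, 6 * 20.0)  # six drives at an assumed 20 A each
print(round(light, 2))  # 11.89
print(round(heavy, 2))  # 10.68
```

This is essentially the calculation a load tester like the Battery Beak automates: apply known loads, measure the sag, and solve for the internal resistance.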
__________________
Good Luck All. Learn something new, everyday!
Al
WB9UVJ
www.wildstang.org
________________________
Storming the Tower since 1996.
Last edited by Al Skierkiewicz : 07-05-2013 at 22:03.