Quote:
Originally Posted by baronep
I read somewhere on the web that to measure the voltage of a battery while charging (because while charging you can't just measure between the battery terminals, since you would be measuring the charging voltage) you subtract the voltage drop between the positive terminal and ground and the voltage drop between the negative terminal and ground.
Batt Voltage = (V Drop:[Pos & Ground]) - (V Drop:[Neg & Ground])
Is this in any way correct?
-Patrick
Patrick,
There are a lot of variables here that make that kind of measurement meaningless. Many charge methods use a pulsed voltage or simply unfiltered DC voltage to charge the battery. For lead-acid cells, chargers most often supply a voltage that is above the battery's standard terminal voltage; that higher voltage is what forces charge current into the battery, as Don suggested above. Simple car chargers can actually have pulses that are 3-4 volts above the maximum voltage of the battery.
In your suggested method, as the charge current varies over the charge cycle, the voltage drop in the wiring changes with it: the greater the charge current, the greater the drop in the wiring. In the last part of the charge cycle the charge current is relatively small, so there is little voltage drop in the wiring. Unfortunately, many smart chargers also switch off the charge current to measure the battery as part of the charge cycle, and if your measurement happened to land at that point, it would likely read lower than normal.
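To make the wiring-drop point concrete, here is a minimal sketch (my illustration, not anything from the charger or the original posts) of how the voltage you read during charging drifts with charge current. The battery voltage, wiring resistance, and charge currents below are all assumed numbers, chosen only to show the trend.

```python
# Sketch: why a voltage reading taken while charging tracks the charge
# current rather than the battery's actual state. All values are hypothetical.

def measured_voltage(battery_emf, charge_current, series_resistance):
    """Voltage seen at the measurement point while charge current flows.

    The charger holds the terminals above the battery's open-circuit voltage,
    and the leads/connectors add a further I*R drop on top of that.
    """
    return battery_emf + charge_current * series_resistance

BATTERY_EMF = 12.4        # volts, assumed open-circuit voltage of the battery
SERIES_RESISTANCE = 0.05  # ohms, assumed leads + clamps + internal resistance

# Charge current is high early in the cycle and small near the end.
for current in (6.0, 2.0, 0.5):  # amps, assumed charge currents
    v = measured_voltage(BATTERY_EMF, current, SERIES_RESISTANCE)
    print(f"charge current {current:4.1f} A -> reads {v:.2f} V")
```

The reading changes with the charge current alone, so subtracting two lead drops as proposed still does not give a stable number for the battery itself.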
As a general rule of thumb that all teams should remember: voltage is not an indication of charge status on our batteries. A nearly dead battery with no load will still measure 12 volts. The Batterybeak mentioned above applies brief loads and uses the readings to calculate the battery's internal resistance, and internal resistance is a good indication of battery charge.
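For anyone curious what that load-test calculation looks like, here is a minimal sketch of the idea (my own illustration, not the Batterybeak's actual firmware or method): apply a known brief load, measure the voltage sag, and divide by the load current. The readings below are hypothetical.

```python
# Sketch: estimating internal resistance from a brief load test.
# Values are assumed for illustration only.

def internal_resistance(v_open, v_loaded, load_current):
    """Estimate internal resistance from voltage sag under a known load.

    v_open:       open-circuit (no-load) voltage, volts
    v_loaded:     voltage while the brief test load is applied, volts
    load_current: current drawn by the test load, amps
    """
    return (v_open - v_loaded) / load_current

# Hypothetical readings from a brief load test:
v_open, v_loaded, load_current = 12.6, 12.2, 20.0
r_int = internal_resistance(v_open, v_loaded, load_current)
print(f"internal resistance ~ {r_int * 1000:.0f} milliohms")
```

A battery in good shape sags only a little under the test load and shows a low value; a tired or discharged battery sags more under the same load and reads noticeably higher.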