# 2015 Beta Testing - The Components are Here.

Interesting question, I’m hoping someone more qualified will step in but I’ll try anyway.

The easiest answer is that 100% of charge is when 204Wh (12V*17Ah) of energy can be drawn from the battery, and 0% is when 204Wh has been drawn. This of course requires a few assumptions, such as ignoring any aging effects and assuming the battery is able to push reasonably large currents at near 12V (say 10.5V and above).

Maybe a more useful answer is that 100% corresponds to what a typical charger will say is charged, and 0% is the point at which most robots will no longer fully function (i.e. have difficulty driving/turning). Perhaps you could go one further and say 0% is when non-motor electronics start failing (such as the RoboRIO or the VRM), and have another point, say 10%, which is when robots stop turning; i.e. 0% is when the robot can no longer “idle”. This one would require characterising some batteries, in that you would need to find the amount of energy you could draw before reaching this point.

An even simpler answer is that state of charge is just whatever a battery analyser such as the Battery Beak or the CBA says it is. Does anyone know how the Battery Beak works out its state of charge?

Unless we can get an answer on how the CBA/Battery Beak defines state of charge, I’d suggest we go with the 0% - idle, 10% - motors, 100% - off chargers model for this discussion. Of course I’d be happy to be corrected by someone with a better understanding of lead acid batteries.

It seems that you are saying that state of charge has no formal definition, but rather that it is defined uniquely for each system to provide a useful value representing the state of a system. Would you agree with that?

If we are defining S for FRC batteries ourselves, then I would suggest the relationship S = Eavailable/Emax where Emax is the difference in energy of the battery between the state that some standard charger says “fully charged” and some standard 0 energy value E0, such as the energy at which non-motor electronics on the robot start to fail. If we just define these two points, the relationship between S and E is linear, but if we also try to define a third point (such as the point at which motors start failing), we will not generally be able to use a linear relationship to describe S in terms of E.
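To make the two-point definition concrete, here is a minimal Java sketch of S = Eavailable/Emax. The Emax figure is an assumed placeholder you would have to measure for your own battery and charger, not a datasheet value:

```java
// Sketch of the two-point linear state-of-charge definition above.
// EMAX_J is a hypothetical value: the energy (joules) between "charger
// says full" and the point where non-motor electronics start to fail.
public class LinearSoc {
    static final double EMAX_J = 400_000.0; // assumed, for illustration

    // S = Eavailable / Emax, clamped to [0, 1].
    static double stateOfCharge(double eAvailableJoules) {
        double s = eAvailableJoules / EMAX_J;
        return Math.max(0.0, Math.min(1.0, s));
    }

    public static void main(String[] args) {
        System.out.println(stateOfCharge(200_000.0)); // 0.5
        System.out.println(stateOfCharge(450_000.0)); // clamped to 1.0
    }
}
```

With only two calibration points the relationship is linear by construction; adding a third point (motors failing) would force a piecewise or nonlinear S(E).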

I too would be interested to learn how the Battery Beak defines state of charge. If the Beak says 78% charged, does that tell me anything quantitative about the battery, or does it just mean I should leave it on the charger longer before using it in a match?

We also had issues with cables coming off easily, both with bought pre-made cables and ones we made ourselves. Older, well-used cables were especially an issue.

We’ve been working on a part to be 3D printed that would (hopefully) help with cables coming off, and also with the large space between the pins. We’ve just published a blog post about it here. We are still iterating on the design, but you’ll get the idea of where we’re headed.

I still don’t see this working well for what you’re trying to accomplish. Your value of Eavailable is not really quantifiable for the battery. If you read the datasheet, you’ll see that the battery’s capacity (which determines Eavailable) varies greatly with current draw.

Basically, this means that your effective energy consumed depends not only on the energy you’re actually using, but also on the rate at which you’re using it (or some weird function of the rate).

Assume that we can estimate state of charge in a battery by timing how long it takes a 10 amp load to cause the battery voltage to drop below 10 volts.

If we play one match where we draw 43,200 joules (12 watt-hours, or 1 amp-hour for our battery), and we then run the test on the battery, we may see that the battery can power the 10 amp load for 15 minutes before dropping below 10 volts.

We may then completely charge the battery, and play another match where we also draw 43,200 joules (again, 1 amp hour for our battery), but the load test will cause the voltage to drop below 10 volts after only 2 minutes.

Our integration of current over time gives the same result both times, but in the second match we ended up depleting a larger portion of the battery’s capacity because we used our energy in short, high-power spikes rather than the slow, steady draw of the first match.

It would be interesting to see if a function for effective energy used (out of the rated 18 amp hours) could be used if we took into account both the current and the integral of the current with respect to time.
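One well-known candidate for that “weird function of the rate” is Peukert’s law, which scales capacity down as discharge current goes up. A rough sketch; the exponent k is an assumption here (AGM lead-acid is typically quoted around 1.1 to 1.2), not a number from the datasheet:

```java
// Rough sketch of Peukert's law as a model for rate-dependent capacity.
// K is an assumed exponent, not a measured or datasheet value.
public class Peukert {
    static final double RATED_AH = 18.0;  // rated capacity
    static final double RATED_H  = 20.0;  // hour rating for that capacity
    static final double K        = 1.15;  // assumed Peukert exponent

    // Amp-hours actually deliverable at a constant current drawI:
    // t = H * (C / (I*H))^k, effective capacity = I * t.
    static double effectiveAh(double drawI) {
        double t = RATED_H * Math.pow(RATED_AH / (drawI * RATED_H), K);
        return drawI * t;
    }

    public static void main(String[] args) {
        System.out.printf("0.9 A draw: %.1f Ah%n", effectiveAh(0.9));  // 18.0 Ah
        System.out.printf("60  A draw: %.1f Ah%n", effectiveAh(60.0)); // ~9.6 Ah
    }
}
```

Note the model only covers constant-current discharge; match play is spiky, so applying it sample-by-sample to PDP current data would itself be an approximation.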

Also, does anybody have specifications for the new PDP’s current sensing (latency, resolution, sampling rate, maximum current)?

There are two common methods of determining the SOC of a lead acid battery. The most accurate is to measure the specific gravity of the electrolyte; unfortunately, with a sealed AGM battery that is not possible. The other method is the resting voltage, meaning the voltage of a battery that has not recently been charged or had a load applied to it. The definition of “recently” varies depending on who you ask: some say as little as 15min, while others say it should be 24hrs. Most do agree on a fairly narrow range of around 11.8v as 0% SOC. I’m pretty certain that the Battery Beak determines SOC based on voltage, particularly since the instructions I saw stated the battery needed to be “at rest” for an accurate reading.
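For illustration, a resting-voltage estimate like the one described might look like the sketch below. The table values are typical figures quoted for 12v AGM batteries and are assumptions; this is not a claim about how the Battery Beak actually computes its number:

```java
// Sketch of a resting-voltage SOC estimate. The voltage/SOC table is an
// assumed set of typical 12 V AGM figures (11.8 V ~ 0%, as noted above).
public class RestingVoltageSoc {
    static final double[][] TABLE = {
        {11.8, 0.00}, {12.0, 0.25}, {12.2, 0.50}, {12.4, 0.75}, {12.7, 1.00}
    };

    // Linear interpolation between table points, clamped at the ends.
    static double soc(double restingVolts) {
        if (restingVolts <= TABLE[0][0]) return 0.0;
        if (restingVolts >= TABLE[TABLE.length - 1][0]) return 1.0;
        for (int i = 1; i < TABLE.length; i++) {
            if (restingVolts <= TABLE[i][0]) {
                double v0 = TABLE[i - 1][0], s0 = TABLE[i - 1][1];
                double v1 = TABLE[i][0],     s1 = TABLE[i][1];
                return s0 + (s1 - s0) * (restingVolts - v0) / (v1 - v0);
            }
        }
        return 1.0; // unreachable given the clamps above
    }

    public static void main(String[] args) {
        System.out.println(soc(12.2)); // 0.5
        System.out.println(soc(11.5)); // 0.0 (below the 11.8 V floor)
    }
}
```

The “at rest” requirement matters: a reading taken right after a match or a charge cycle will be skewed by surface charge and recovery effects.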

Sample rate is 40 Hz, resolution is 1/8 amp. Latency seems to be less than the sample period; we compared it to an analog current sensor we used last year and didn’t see any unexpected latency. I haven’t seen any specs on max current, but it measured 60+ amps per channel on our drivetrain.

I’ve attached data we collected from the PDP during a match at the SCRRF Fall Classic. Note that each channel has a small steady state error, which will be calibrated out in a later firmware update.

330_SCRRF_Match_PDP_Data.xlsx (704 KB)


Well, I can safely say that we did not see any issues with loose PWM connectors at Rumble in the Roads. We had some hard defense played on us too and we got into some real pushing matches. One of them bent a bolt on the robot and the other ripped off our green LED ring lights. Hard stuff. I was happy with the performance of the RoboRIO though and we didn’t have any loose PWM or sensor cables. We aren’t doing anything special to tie them down or apply strain relief to them.

I guess I still don’t fully understand why we wouldn’t be able to easily calculate Eavailable, assuming we account for the internal resistance of the battery. We should be able to integrate the power over time to get the energy. Something like

Eavailable(t) = Emax − ∫₀ᵗ [ V(t′)·I(t′) + I(t′)²·Rinternal ] dt′

where V(t) is the terminal voltage of the battery at time t, I(t) is the current supplied by the battery at time t, and Rinternal is the internal resistance of the battery.

This should be all we need to calculate Eavailable, since the initial energy of the battery either has to turn into electric energy that moves through the circuit or turn into heat due to the internal resistance of the battery. I don’t see anywhere else that the energy of the battery can go. The only thing I might not be considering here would be that Rinternal might not be a constant, but a function of current and/or temperature. Does anyone know if this is the case? Because if so, that could explain why the battery loses charge more quickly than expected at higher currents.
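A discrete-time version of that integral is easy to run against PDP samples. This is a sketch under the stated assumptions (constant Rinternal; the Emax starting value is a made-up placeholder):

```java
// Numeric version of the Eavailable(t) integral above: each sample
// subtracts both the delivered power V*I and the I^2*R heat lost inside
// the battery. R_INTERNAL and the starting Emax are assumed values.
public class EnergyBudget {
    static final double R_INTERNAL = 0.011;  // ohms, assumed constant
    static final double DT = 0.025;          // s, ~40 Hz PDP sample period

    double eAvailable;                       // joules remaining

    EnergyBudget(double eMaxJoules) { this.eAvailable = eMaxJoules; }

    // Call once per sample with terminal voltage and total current.
    void sample(double volts, double amps) {
        double pDelivered = volts * amps;          // through the circuit
        double pHeat = amps * amps * R_INTERNAL;   // lost internally
        eAvailable -= (pDelivered + pHeat) * DT;
    }

    public static void main(String[] args) {
        EnergyBudget b = new EnergyBudget(400_000.0);
        // One second of a 100 A draw at a sagged 10 V terminal voltage:
        for (int i = 0; i < 40; i++) b.sample(10.0, 100.0);
        System.out.printf("%.0f J used%n", 400_000.0 - b.eAvailable);
    }
}
```

If Rinternal really is a function of current and temperature, the fix is to make R_INTERNAL a lookup inside sample() rather than a constant; the structure of the integration doesn’t change.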

Check out the datasheet. The battery’s capacity varies significantly with discharge current. The battery is rated for 18 amp hours, which it can achieve in low-current situations, but when used in FRC situations the capacity is likely closer to 7 amp hours.

I don’t know exactly why this is true, but I’d be willing to bet that the chemical reaction isn’t quite as effective/efficient when it happens really quickly.

I have looked at the datasheet, and I can see that the capacity is different, but I can’t think of a good reason for this besides internal resistance of the battery. A fully charged battery has some energy associated with it, and a discharged battery has some energy associated with it. No matter how you get from this charged state to the discharged state, the energy change must be the same, correct? As far as I can tell, this energy can only turn into either heat or electrical energy.

You are correct. When the battery goes from charged to discharged slowly, most of the energy is released as electricity, and a small part is released as heat. When the battery is discharged quickly, a much larger part is released as heat.

Using a simple model of the battery as a fixed internal resistance of 0.011 ohms in series with a constant 12.7v voltage source, it’s straightforward to compare the energy wasted across the internal resistance for the same ampere-hours drawn at different currents.

But it’s even worse than that. A close look at the battery discharge curves suggests that the internal resistance is not constant, but rather increases substantially with current.
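That comparison is easy to reproduce numerically with the same simple model (0.011 ohms, constant source voltage). For example, drawing the same 1 ampere-hour at 10A versus 100A:

```java
// Heat wasted in the fixed-internal-resistance model described above,
// for the same ampere-hours delivered at different constant currents.
public class WastedEnergy {
    static final double R = 0.011;   // ohms, the simple-model value above

    // Joules dissipated internally while delivering `ah` amp-hours at `amps`.
    static double wastedJoules(double ah, double amps) {
        double seconds = ah / amps * 3600.0;  // time to deliver the charge
        return amps * amps * R * seconds;     // I^2 * R * t
    }

    public static void main(String[] args) {
        System.out.printf("10 A:  %.0f J wasted%n", wastedJoules(1.0, 10.0));   // 396 J
        System.out.printf("100 A: %.0f J wasted%n", wastedJoules(1.0, 100.0));  // 3960 J
    }
}
```

Since t scales as 1/I while heat scales as I², the waste for a fixed charge grows linearly with current: 10x the current, 10x the heat. With a current-dependent (rising) resistance it would be worse still.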


battery_wasted_energy.xls (14.5 KB)


Remember that the battery resistance is a simplistic model of complex chemical reactions in the battery. It is a measure of the battery’s ability to pass current, not necessarily of the number of electrons in the battery.

Has anyone put the PWM signals on an oscilloscope? I assume the PWM signal is still 5v max, but will it change to 3.3v if the internal jumper is set to 3.3v? I also noticed that there is no jumper on the roboRIO to pass 6v to the center PWM conductor. The cRIO had this jumper. Can someone take a measurement and let me know these two voltages?

Thanks

Mr. Ross,
That spreadsheet is exactly what our team wants in order to study and implement power management. Can you detail the programming setup used to capture that data?

I haven’t seen Joe’s code, but I suspect they are reading the power API about every 25ms and storing the results in their own file. This level of logging, along with events where the PDP saw a breaker trip, should soon be built in, though you are always welcome to write your own.

Here is the code that reads the data into a 2D array (for buffering) in teleopPeriodic and disabledPeriodic (Java). The code to handle the buffering, writing the file periodically, and handling exceptions was more code than reading the data from the PDP.

```java
for (int i = 0; i < 16; i++) {
    pdpArray[pdpLine][i] = pdp.getCurrent(i);
}
pdpArray[pdpLine][16] = pdp.getVoltage();
pdpArray[pdpLine][17] = pdp.getTemperature();
```
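For anyone wanting to replicate this, here is a hedged sketch of the buffering-and-writing half. The names pdpArray/pdpLine match the snippet above, but the CSV format, buffer size, and roboRIO file path are assumptions, not Joe’s actual code:

```java
// Sketch of buffering PDP samples and periodically flushing them to CSV.
// Row count, CSV layout, and file path are illustrative assumptions.
import java.io.FileWriter;
import java.io.IOException;

public class PdpLogger {
    static final int COLS = 18;          // 16 currents + voltage + temperature
    final double[][] pdpArray;
    int pdpLine = 0;

    PdpLogger(int rows) { pdpArray = new double[rows][COLS]; }

    // Format the rows filled so far as CSV, one sample per line.
    String toCsv() {
        StringBuilder sb = new StringBuilder();
        for (int r = 0; r < pdpLine; r++) {
            for (int c = 0; c < COLS; c++) {
                if (c > 0) sb.append(',');
                sb.append(pdpArray[r][c]);
            }
            sb.append('\n');
        }
        return sb.toString();
    }

    // Append the buffer to a file (e.g. "/home/lvuser/pdp.csv") and reset.
    void flush(String path) {
        try (FileWriter w = new FileWriter(path, true)) {
            w.write(toCsv());
        } catch (IOException e) {
            e.printStackTrace();  // never crash robot code over logging
        }
        pdpLine = 0;
    }
}
```

Keeping the actual file write out of the fast loops (flushing from disabled, say) avoids stalling teleopPeriodic on disk I/O.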

The 3.3v/5v internal jumper only affects the DIO power output.
The PWM signals are always 5v max.
The PWM power is always 6v. The motor controllers have this line disconnected, so power on it doesn’t affect them.

@Frank: You seem to be implying that the number of electrons in the battery changes as the battery supplies current. Was that your intent?

What about the output impedance of the PWM 5v? Has anyone measured that? Or, is it specified somewhere?

Also, has anyone collected milliamps-vs-voltage-drop data across the input of the new motor controllers?