battery voltage vs current

I am looking for data to better model battery voltage drops during competition.

Are there any teams out there who have battery voltage vs current¹ data they’d be willing to share?

I’m looking for data on fully charged² good FRC-legal batteries, and data on those same batteries after a “typical” FRC match.

Thank you.

¹ volts measured across the battery terminals, and total current through the battery

² fully charged, with a rest period to dissipate surface charge

I’ve been giving this some thought. We do not have this kind of data, unfortunately.

Nevertheless, what you are looking for is effectively a measurement of battery internal resistance in a real-world environment. I’d like to see this recorded dynamically during a match, and indeed across several matches (who draws conclusions from an experiment performed just once?).
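As a sketch of what that measurement could look like: given paired (current, voltage) samples logged during a match, a least-squares fit of V = V_oc − I·R_int yields an estimate of open-circuit voltage and internal resistance. The function and sample values below are illustrative, not from any team’s actual data.

```python
def fit_vi(samples):
    """Least-squares fit of V = V_oc - R_int * I.

    samples: list of (current_amps, voltage_volts) tuples.
    Returns (v_oc, r_int) in volts and ohms.
    """
    n = len(samples)
    sum_i = sum(i for i, _ in samples)
    sum_v = sum(v for _, v in samples)
    sum_ii = sum(i * i for i, _ in samples)
    sum_iv = sum(i * v for i, v in samples)
    # Slope of V vs I comes out negative; R_int is its magnitude.
    slope = (n * sum_iv - sum_i * sum_v) / (n * sum_ii - sum_i ** 2)
    v_oc = (sum_v - slope * sum_i) / n
    return v_oc, -slope

# Made-up samples for illustration only:
samples = [(10, 12.5), (50, 11.9), (100, 11.2), (150, 10.4)]
v_oc, r_int = fit_vi(samples)
print(round(v_oc, 2), round(r_int * 1000, 1))  # open-circuit volts, milliohms
```

Real match data is noisy (the battery is not a fixed resistor), so the fit is only an approximation of the "dynamic" internal resistance over that window.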

Next year we should have tons of data with the new power distribution board’s logging ability. I wish we had it now, pre-season. The programmers’ brains are going to hurt when I bring up the subject of active power management: on the surface it seems easy, but it becomes complex in the implementation. The drive train arms race will bring this to the forefront next year, and teams will need to deal with it or bad things will happen. In my opinion FIRST should be proactive and limit power input to the drive train; the coming problems go beyond popped breakers and battery trouble. That won’t happen, though, in usual FIRST fashion: they’ll let it blow up, then deal with the crisis.

I’m excited for the potential explosion of such data from real matches, broken down motor by motor, in 2015 because of the new PDB.

Just be careful, logging too much data can eat CPU cycles.

Just to document it in public: dynamic battery internal resistance varies as a function of state of charge (it rises as the SoC drops), and as a function of discharge current (it falls as the discharge rate increases).
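A toy model can make those two trends concrete. The numbers below are entirely made up for illustration; real curves depend on battery chemistry, age, and temperature:

```python
def r_int_milliohms(soc, current_amps):
    """Illustrative internal-resistance model (made-up coefficients).

    soc: state of charge, 0.0 (empty) to 1.0 (full).
    current_amps: discharge current.
    Resistance rises as SoC drops and falls as discharge current rises.
    """
    base = 12.0 + (1.0 - soc) * 10.0           # higher resistance at low SoC
    return base / (1.0 + current_amps / 200.0)  # lower at high discharge rates

# Trends only -- the absolute values mean nothing:
print(r_int_milliohms(1.0, 0))    # full charge, no load
print(r_int_milliohms(0.5, 0))    # half charge: higher
print(r_int_milliohms(1.0, 100))  # heavy draw: lower
```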

Can you explain what you mean by teams needing to limit power input to the drive train? I’m not familiar enough yet with the new PDB capabilities I guess. Thanks.

I will take a crack at this:

Let’s say you have 6 CIMs on VexPro gear boxes on your drive train.
http://www.vexrobotics.com/vexpro/gears-and-gearboxes/3cimballshifter.html

Now let’s say you have some issues in your drive train.
Slight shaft misalignment.
Bearings loaded improperly.

Plus let’s make that robot reach the FRC legal weight limit.

Add on some end effectors maybe with a few more CIM or miniCIM.

At this point the battery voltage will start to drop quite a bit during operation because the combined load is high enough.

If the battery voltage gets low enough, the control system will brown out and the robot will lose field connectivity.

In the past, a potential solution was to reduce, in software, the maximum output from the speed controller for a given input from the operator controls. This reduces the total load on the battery even when the operators would otherwise demand too much. However, this is a static degradation of system performance. It can be difficult to predict where the trade-off between performance and system failure lies, especially since you could not previously drive around with test equipment on the competition field, and sometimes things change.
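The static approach described above amounts to scaling every operator command by a fixed, conservatively chosen fraction. A minimal sketch (names and the 0.8 cap are illustrative, not from any FRC library):

```python
# Fixed output cap, chosen before the event: a static performance trade-off.
MAX_OUTPUT = 0.8

def scale_drive(operator_input):
    """Scale a joystick input (-1.0 to 1.0) by a fixed output cap."""
    clamped = max(-1.0, min(1.0, operator_input))
    return MAX_OUTPUT * clamped

print(scale_drive(1.0))   # full stick forward never exceeds the cap
print(scale_drive(-0.5))  # half stick back, scaled proportionally
```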

The new PDB will allow a team to monitor their power usage more continuously so they can tune it dynamically. Of course, they must exploit that information in their software for the full effect. With this sort of information, teams can move further into the performance spectrum and further from the safe zone, which increases the loads on the electrical system in the form of higher surge currents.
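Dynamic tuning could look something like the sketch below: each control loop, read the battery voltage and adjust a drive throttle, backing off quickly when the voltage sags and recovering slowly as it rebounds. The thresholds, step sizes, and structure are all assumptions for illustration, not a recommended tuning:

```python
BROWNOUT_VOLTS = 7.0  # start backing off before hardware protection trips
RECOVER_VOLTS = 9.0   # only restore power once the battery has rebounded

def update_throttle(battery_volts, throttle):
    """Return a new drive-power fraction based on measured battery voltage.

    Backs off aggressively during a sag, recovers gradually afterward,
    and never drops below a minimum drivable fraction.
    """
    if battery_volts < BROWNOUT_VOLTS:
        throttle = max(0.2, throttle - 0.05)   # shed load quickly
    elif battery_volts > RECOVER_VOLTS:
        throttle = min(1.0, throttle + 0.01)   # restore power slowly
    return throttle
```

Called at 50 Hz, a sag would cut drive power within a few loop iterations, while full power returns over a couple of seconds; the asymmetry is the point, since oscillating in and out of a brownout is worse than briefly driving slower.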

However, it appears that there is some sort of inherent protection at work in the underlying software anyway. I noticed Team 11 at RampRiot kept really beating on their battery until it was down to 6V, and they only lost communications for at most 2 seconds each time. As far as I could see when their lead programmer asked for my help getting the RoboRIO on the field, the Java software loaded was very similar to previous years’.

It is entirely possible to avoid overloading any breakers and still create so much system load that the battery can’t keep the entire control system fully powered. In previous years, teams would field robots that were often near the limits of the battery and then fail to grasp the issues that would create. Then again, power measurement was not a tool FIRST previously provided to teams (except via the Jaguar), and FIRST really doesn’t teach electronics. With the new tools, the expectations will shift, and I think for the better.

The control system has been designed so that the connectivity is maintained during a brownout event. Robot outputs are shut off by the roboRIO firmware to preserve battery voltage well before the VRM input voltage falls below its ability to maintain the outputs, and before the roboRIO’s minimum input voltage is reached.

Even the roboRIO’s 5V DIO power is shut off when the robot goes into deep brownout protection. This is something teams need to be aware of if they depend on constant, uninterrupted sensor feedback.

That’s all great info. Is this documented somewhere? I’d like to get our programmers reading up on this stuff asap.

On the roboRIO, that is a welcome improvement over the previous cRIO-based system behavior.
Is there an official source for the technical information yet?

I have quite intentionally avoided meddling in Team 11’s beta test of the roboRIO. The first time I got near it was during RampRiot, and only because I did not know they intended to field it. We found out and had to make sure everything would work on the field without issue.

Details are still being tweaked as Beta Test teams identify areas that need attention. I’m afraid that official documentation will have to wait until after the Beta Test period is over. That basically coincides with Kickoff. We can still answer questions and give useful information; we just can’t give you access to any unpublished documents.

The cRIO’s brownout protection strategy relied on a relatively slow measurement of battery voltage through the analog input module. It often could not react quickly enough for the system to survive a sudden voltage sag. The roboRIO voltage-measurement hardware is much faster, and there is a lot more room for fanciness in the FPGA. For example, the roboRIO’s PWM outputs will actively turn the motors off upon a brownout instead of just stopping the signal to the speed controllers and waiting for them to time out and go into disable mode.

Ether,
I am searching around for the data we collected using “StangSense” many years ago. As I remember, CIM motors were capable of pulling the voltage down into the 4-6 volt range for 10-20 ms at a time. We lost some of our data files when we had to move to a new provider, and then again as we are reconstructing our website. I haven’t found any on my storage devices yet; I will continue to look. The pulses were significant but relatively short-lived if everything was working OK.

Ether,
Will this data help?
http://www.chiefdelphi.com/forums/attachment.php?attachmentid=17363&d=1412564839
Provided by Joe Ross in the NI-rio beta thread. An example of next year’s data-logging ability.

For funsies, here’s a quick plot of battery voltage vs. total current draw from that match. It approximately shows the relationship we’re expecting. It also shows just how approximate the data is … Linear fit included.


http://i.imgur.com/NGsKe5F.png

To get a nice looking curve, you’ll need a much more controlled test configuration, but this is worth something.

This is fantastic data. It is really good to see what is going on during a real match. The spreadsheet does not indicate the measurement interval; could you confirm it? It seems to be reading 50 times per second: there are 7100 readings, and assuming 2:30 total duration, that works out to approximately 47 readings per second.
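The arithmetic behind that estimate, for anyone checking their own logs:

```python
# Estimate the logging rate from the reading count and match length.
readings = 7100
match_seconds = 150  # 2:30 match, autonomous plus teleop
rate_hz = readings / match_seconds
print(round(rate_hz, 1))  # a bit under 50 Hz
```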

http://www.chiefdelphi.com/forums/showpost.php?p=1403002&postcount=153

This seems to suggest 330’s code is also sampling at 40 Hz (25 ms).

Everyone,
The limiting factor in the previous control system was the power supply on the DSC. It would cut out at about 5.5 volts, which would cease PWM outputs. The power supply for the cRIO would shut down at about 4.5 volts. So as long as the brownout wasn’t long-lasting or terribly deep, the cRIO stayed up while the motor drive stopped, allowing the battery to come back to normal levels.

The data was read in the 50 Hz driver station loop in auto and teleop (not disabled). However, as Aren mentioned, the data only updates at about 40 Hz. The last buffer of data wasn’t written, which is why it comes out a little shorter than 2:30.

The attached data is from additional matches (and probably some practice). The name of the file is meaningless. It doesn’t meet all of Ether’s criteria, as it doesn’t have voltage before and after the match.

It would be interesting to perform Aren’s analysis on all the matches and use that to compare the internal resistances of the batteries.

330 SCRRF PDP Data.zip (880 KB)
