# Limiting motor torque

How do you keep the CIMs from putting out so much torque you break traction with the ground? Is there some way of measuring or calculating how much torque you are currently applying? Or of anticipating wheel slip, without actually having it occur?

Yes. Measure the current.

> How do you keep the CIMs from putting out so much torque you break traction with the ground?

You could try limiting the current.

> Or of anticipating wheel slip, without actually having it occur?

For FRC, not really.

Commercial and military aircraft use highly complex software (and hardware) to anticipate the onset of loss of traction (during braking) and prevent it.


This was discussed a lot in 2009. Some teams used a form of Traction Control, some didn’t.

One of the simplest ways to implement a form of TC was to program a ‘ramping’ function so that the driver couldn’t jump from 0% (standstill) to 100% instantly and spin the wheels.
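A minimal sketch of such a ramping (“slew rate”) limiter, assuming a fixed-rate control loop — the class name and the step size are illustrative, not from any particular team’s code:

```java
// Ramping limiter: the driver's input may jump from 0 to 1 instantly,
// but the output only moves by maxDeltaPerLoop each control loop,
// so the wheels are less likely to break traction from a standstill.
public class RampLimiter {
    private final double maxDeltaPerLoop;
    private double output = 0.0;

    public RampLimiter(double maxDeltaPerLoop) {
        this.maxDeltaPerLoop = maxDeltaPerLoop;
    }

    /** Move the stored output toward the requested value, one limited step per call. */
    public double calculate(double requested) {
        double delta = requested - output;
        if (delta > maxDeltaPerLoop) delta = maxDeltaPerLoop;
        if (delta < -maxDeltaPerLoop) delta = -maxDeltaPerLoop;
        output += delta;
        return output;
    }
}
```

With a step of 0.04 per loop at a 50 Hz loop rate, a full 0-to-100% ramp takes about half a second.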

Another method is to have an array of follower wheels that keep track of the robot’s actual speed and compare it to the speed the driven wheels are spinning at. If the two speeds differ, your wheels are slipping and you adjust the motor output accordingly.
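That comparison could be sketched like this — the class, the 20% slip threshold, and the 0.8 back-off factor are all illustrative assumptions:

```java
// Follower-wheel slip detection: an undriven follower wheel reports the
// robot's true ground speed; if the driven wheel spins meaningfully
// faster, we call it slip and scale the motor command back.
public class SlipDetector {
    // Fraction by which drive-wheel speed may exceed follower speed before it counts as slip.
    private final double slipRatioThreshold;

    public SlipDetector(double slipRatioThreshold) {
        this.slipRatioThreshold = slipRatioThreshold;
    }

    /** True if the driven wheel is spinning meaningfully faster than the robot is moving. */
    public boolean isSlipping(double driveWheelSpeed, double followerWheelSpeed) {
        if (driveWheelSpeed < 0.1) return false; // ignore near-standstill sensor noise
        return driveWheelSpeed > followerWheelSpeed * (1.0 + slipRatioThreshold);
    }

    /** Scale the motor command back while slipping, else pass it through. */
    public double adjust(double command, double driveWheelSpeed, double followerWheelSpeed) {
        return isSlipping(driveWheelSpeed, followerWheelSpeed) ? command * 0.8 : command;
    }
}
```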

For what it’s worth, this is almost never an issue in carpeted FRC games. Wheel slip doesn’t hamper performance all that much on the off chance your robot is capable of it.

Current is proportional to torque, so you should be able to measure the current and calculate torque. You can estimate the maximum torque that each motor should provide, and use software to limit the output based on that.
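A minimal sketch of that current-to-torque limit. The torque constant here is an assumption: roughly a CIM’s stall torque (≈2.4 N·m) divided by its stall current (≈133 A); check your motor’s datasheet before relying on it:

```java
// For a permanent-magnet DC motor, torque is approximately kT * current,
// so a measured current gives an estimated torque we can cap in software.
public class TorqueLimiter {
    // ~0.018 N·m per amp, a rough CIM-like torque constant (illustrative).
    static final double KT = 0.018;

    public static double torqueFromCurrent(double amps) {
        return KT * amps;
    }

    /** Scale a motor command down if the measured current implies too much torque. */
    public static double limit(double command, double measuredAmps, double maxTorque) {
        double torque = torqueFromCurrent(measuredAmps);
        if (torque <= maxTorque) return command;
        return command * (maxTorque / torque);
    }
}
```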

From my basic knowledge and a brief reading of the Wikipedia article on ABS, the basic idea is to measure the speed of each wheel and calculate the overall speed of the car. If one of the wheels is significantly slower, it is about to lock up. Conversely, if one of the wheels is going faster, it has broken static friction and is spinning. In the case of a tank-steer robot where each side of the drivetrain is connected (rather than using a differential with independent speeds for each wheel), you would probably need other sensors to measure the robot’s actual speed.
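The classification step described above can be sketched as a single comparison against a reference vehicle speed (from a follower wheel or other sensor); the names and tolerance are illustrative:

```java
// ABS-style wheel check: compare one wheel's measured speed against a
// reference vehicle speed (same units) to decide whether it is about to
// lock up, has broken traction, or is rolling normally.
public class AbsCheck {
    public static String classify(double wheelSpeed, double vehicleSpeed, double tolerance) {
        if (wheelSpeed < vehicleSpeed - tolerance) return "locking";  // about to lock up
        if (wheelSpeed > vehicleSpeed + tolerance) return "spinning"; // broke static friction
        return "ok";
    }
}
```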

I’m sure there’s a lot more going on in modern ABS and traction control systems, but several teams implemented such systems on their robots in 2009 with demonstrable success. Search around and you should find some discussion and video.

All that said, in the context of FRC, having your wheels slip is not necessarily a bad thing. Given the choice between having your wheels slip and tripping a breaker, you definitely want the wheels to slip.

If you are not using CAN, then the torque can be calculated from the voltage you are applying and the motor speed, though this requires an encoder on the motor. Instead of worrying about how to make do with less traction, though, I would focus on getting more traction. This year’s robot was geared for 14 feet per second, and before we switched to roughtop tread it was able to spin the wheels at stall; after the switch, we had no problems.
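The voltage-and-speed approach comes from the DC motor model V = I·R + kE·ω: solve for current, then multiply by the torque constant. The constants below are rough CIM-like values derived from its datasheet figures (≈133 A stall at 12 V, ≈5300 RPM free speed), used here only as an illustration:

```java
// Back-EMF torque estimate: with an encoder giving motor speed and a
// known applied voltage, estimate current from V = I*R + kE*w, then torque.
public class TorqueEstimator {
    static final double R  = 0.09;   // winding resistance, ohms (≈12 V / 133 A stall)
    static final double KE = 0.0216; // back-EMF constant, V per rad/s (≈12 V / free speed)
    static final double KT = 0.018;  // torque constant, N·m per amp

    /** Estimate output torque (N·m) from applied volts and measured speed (rad/s). */
    public static double estimate(double volts, double radPerSec) {
        double current = (volts - KE * radPerSec) / R; // solve V = I*R + kE*w for I
        return KT * current;
    }
}
```

At zero speed this recovers roughly the stall torque; near free speed the estimate drops toward zero, as expected.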

Why does it make a difference what kind of control you are using?

Increasing traction by increasing friction will also increase power draw which is not a good thing (for obvious reasons).

http://mizugaki.iis.u-tokyo.ac.jp/staff/hori/paperPDF/EV_Trans.pdf This paper originally posted to CD by Tom Schindler may be interesting to some of you. It explains how to implement traction control on an electric vehicle.

Jaguars using CAN report their current draw and support a current control mode.

If you’re using PWM, you don’t have access to the internal current sensor and need to find torque by some other path, such as an external current sensor or `Hawiian Cadder`’s method.

Ah, that makes sense, I was assuming he was saying you couldn’t do it with CAN, my mistake.

Increasing friction will increase the amount of power drawn when the robot stalls. However, that is not a bad thing: more grip = more pushing power. Because the robots only run for 3 minutes at most, power draw and efficiency don’t really matter.

No it won’t. If the wheels are stalling, the motors are outputting stall current. There’s no “super duper stall” or whatnot that’s harder than that.

Unless you trip a breaker or the Jag’s overcurrent protection, or brown-out the cRIO.


He said when the “robot” stalls, not when the “wheel” stalls.


In many regional finals, you have a 6-minute match cycle time. If you have a high-current match, then your battery may not have enough time to fully recharge before the next time it is used. In the worst case, it could take 6 matches to make it to finals – most teams (though I’m sure not all) I’ve seen at competition have between 4 & 6 batteries. The question then becomes, can your battery chargers properly keep up with your demand?

Match-for-match in elims (or anything in quals) where high-current situations aren’t common, sure I’d agree with you in most* scenarios. Yet I’d hate for you to put yourself in a bad situation come competition day because you thought something was totally negligible.

• In situations where multiple mechanisms must perform quickly (i.e. high-power) and heavy game objects are involved (2008 is a perfect example), then electrical efficiency should be considered (IMO). Other considerations are PID-hold algorithms, where an arm must move to a position and then draw extra current to hold itself there against back-driving. 2006 also saw at least one team implement a PID-hold software mechanism on their drive train in order to keep from being pushed while they shot at the goal (I’m not sure of the team, but it’s in the “Behind the Design” book). All of the mechanisms on the robot can accrue a large charge draw (mAh) over the course of a match if you’re not careful; so designing the drive train to use a little less current isn’t a bad thing.

I have seen robots that drain a battery in one match; power draw is something teams should think about.

Increasing friction also increases the amount of power required to turn, which can result in a large power draw. If there is a reasonable performance gain to be had by minimizing slipping in software, why not take advantage of it?

Also remember that as your motors transform electrical power into mechanical power, they are generating heat due to inefficiency. The more inefficient your motors (e.g. the further in the power curve from the “max efficiency” point you are), the more heat they generate. What is truly insidious is that a hot motor is less efficient than a cool one - resistivity in metals increases with temperature. So the hot motors get hotter more quickly. In FRC terms, red hot motors will produce less mechanical power before tripping their breakers (or burning themselves out for good).
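The resistivity effect is easy to quantify: copper’s temperature coefficient is about 0.393% per °C, so a 100 °C rise adds roughly 39% to winding resistance. A minimal sketch (the 0.09 Ω reference resistance is an illustrative CIM-like value):

```java
// Winding resistance grows linearly with temperature:
// R(T) = R0 * (1 + alpha * (T - T0)), with alpha ≈ 0.00393/°C for copper.
public class CopperResistance {
    static final double ALPHA = 0.00393; // copper's temperature coefficient, per °C

    /** Resistance at temperature t (°C), given resistance r0 at reference temperature t0. */
    public static double atTemp(double r0, double t0, double t) {
        return r0 * (1.0 + ALPHA * (t - t0));
    }
}
```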

In addition to increasing the electrical resistance of the motor’s coils, the heat also affects the motor’s magnetic properties.


We put Mountain Dew cans over our CIMs, at first because it looked cool, but I think it did help with cooling. The CIMs in this year’s robot never seemed quite as hot as in previous drives.

Unless the Dew cans were adorned with fins, I can’t see why they should help with cooling. Black is a better radiator.
