Line Following with PID

Has anyone ever tried writing line-following code using a PID-style control system?

Right now I am using Lego Mindstorms, programming it with NQC (basically C for Mindstorms with a few built-in functions), and I am only allowed to use two light sensors.

Right now I am successfully able to drive my motors with one function that rapidly turns them on and off for a certain period of time, which is passed to the function (in tens of milliseconds). I take the threshold value of the black material and subtract the current sensor value from it. I multiply that value by a gain and voila. That works well.

But then I tried to implement derivative control and everything went berserk.

Anyone ever do this or have any type of experience with it?

I remember doing this with the old drag-and-drop code blocks of Mindstorms and it was easy. However, that wasn't a derivative control system. Can you post the code for us to look at?

Sometimes in a competition where there are lines on the field, following the line may not be the best technique. What our team has done in the past is to use the lines as waypoints. It can be faster to drive straight by dead reckoning or encoders until the IR sensors detect the line, then turn until another sensor detects the line for alignment, then go on. This only works if the lines are placed right. Detecting waypoints on a field can often be done faster than following the line. It's an important concept to keep in mind for autonomous navigation.

Unless you have a fairly high number of sensors, using PID control can be difficult.

One way to increase the speed is to put the sensors just on the edge of the tape, so that if you're a little off, one will be "on" and one will be "off". You then slow down a motor to correct.

He is right. One way to get high resolution is to use a camera. I have written PID line followers using both the CMUcam with the RC, and a DVT camera linked over Ethernet to a laptop controlling a wheelchair.

If you don't have either a lot of sensors or a camera, there are simpler algorithms.

What are you using for sensors?

FYI, if you're looking for an example of a robot using the CMUcam to follow a line, this one does it pretty well. His video link doesn't have content, but I found video of it in action at the end of this demo video.

I guess it would have been beneficial to say a few things in my original post.

  1. This is not for an FRC competition. I am using the Lego Mindstorms platform for a competition here at my college.

  2. I am limited to using two light sensors and two touch sensors, so any suggestions of a CMUcam to detect the line aren't really applicable.

  3. The competition is line following, so by default that is the technique we have to use.

On top of all of this, the width of the line varies from 1 1/2 inches to 1/8 inch, meaning that waypoint detection can be very difficult.

I will post my code as soon as I am on my own computer.

If you just keep track of the change from white to black or vice versa, the width of the line will not matter if it is used as a waypoint.

Something that has worked well for me: Scale the sensor input to motor speed.
So if the left sensor reads 50 and the right sensor reads 40 (assuming white is high and you are following a black line), then give 100% power to the left motor and 80% power to the right motor.

Totally useless for this particular problem, but I’ve wondered about how to improve basic line following for FIRST bots. I think you could use a relatively fast line sensor and just basically wave the thing around where you think the line should be. And you keep track of where the sensor is with a potentiometer, so you end up with an analog (ish) value for the position of the line instead of a digital value from 3 or so static sensors. I think it just depends on if the sensor would acquire the line in a fast and predictable manner.

I am not sure if this could work, but what if the PID input were time? If the angle of the robot relative to the line is small, it will take longer for an error to occur (one of the line detectors detecting the line). If the angle is large, an error will occur more quickly.
Perhaps the pseudocode would be something like:

loop forever:
    start timer
    while line not detected:
        go forward
    stop timer
    correction = PID(elapsed time, sensor)
    apply correction to steering

Just a thought.

Are you allowed to modify the robot design, or are you stuck with a particular robot design?

The reason I ask is that about 4 years ago I built an incredible line follower where the secret wasn’t in the software but was in the construction of the lego robot. Do you have freedom to choose your own robot design?

What is the design of your robot? Can you post a picture?

There are two ways to do line following with an RCX/NXT.
The first, easier but slower, way is simply: if it sees black go left, if it sees white go right. This approach uses only one light sensor, so if you want to use the other sensor for something else you can.

The second, more complicated way is to have two light sensors mounted (in your case) about 1 1/2-2'' apart. When the left sensor sees black, slow the left motor and increase the speed of the right motor; when the right sensor sees black, slow the right motor and increase the speed of the left motor. This approach is more complicated, but the robot will move faster.

This site has a lot of useful information. I don’t own any of his books, but the line following robot is very fast. In the video it looks like it’s on rails. Simple 2wd differential steering.

http://robotroom.com/Jet.html

Even though I haven’t heard back after my note from yesterday, I decided I’d provide a little more info. In my opinion, the real trick to getting incredible line following ability is to use a “steering robot”, like a car, rather than like a typical FRC robot which typically uses “tank-style” differential drive to turn.

In a line-following steering robot, the light sensor is actually mounted to pivot with the front wheels as they turn. In this way, the light sensor ensures that the front wheels track down the line. Whenever the robot is driving with the light sensor over the edge of the line, the front wheels get centered over the edge of the line, pointing the front tires in the correct direction along the line due to the geometry between the light sensors and the front wheels.

After building a robot like this, play with things you can do to tighten the steering response (you want no slop) and adjust the distance between the light sensor and the front wheels. You’ll want the robot to be as small and light as possible. The programming is actually rather simple – use the steering to keep the sensor centered over the edge of the line, and simply power the drive motor to keep the robot moving. Use of proportional control for the steering would probably be sufficient, but PD would probably improve the steering. PID would probably not be needed in this case. In addition to modifying the steering software, you can adjust the forward speed to be greater when the robot is “tracking well” and slower when it is “tracking poorly.”

The robot that I built as a line follower a few years ago was an enhancement upon an incredible line follower by Gus Jansson – his famous "Steerbot" – http://www.lugnet.com/~726/SteerBot His page describes a lot about how the robot is built, includes detailed photos, and talks about the programming. There are also some great videos of the robot driving at http://www.lugnet.com/org/us/smart/~48/meetings/2001/01

My improvements were never documented, but had to do with decreasing the slop between the steering motor and the front tires, as well as slop between the front tires and the light sensor. (Gus’ robot has a fair bit of play there, which decreases the responsiveness of the system.) I’d love to show you a picture or photo, but the robot was cannibalized for my FLL team before I took the time to take photos.

Let us know how it turns out!