Has anyone posted their autonomous code from 2007?

Has any team posted their “successful” autonomous code from 2007? It would be of immense value for the CD community to see how such success was obtained.

Step 1: Drive forward X Feet. (PID loop with wheel encoders).
Step 2: Wait for Camera Lock. (Kevin Watson’s camera code modified to only search at a specific tilt angle for better speed).
Step 3: Based on the PWM value of the tilt servo we used a lookup table to determine how much we needed to drive.
Step 4: Based on the angle of the pan servo we used a PID loop with the kit Gyroscope to turn to the correct angle
Step 5: We drove forward the amount named in Step 3.
Step 6: Release tube and back up.
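The table lookup in Step 3, combined with the field-calibrated offset trick described later in the post, might look something like this. Every band edge and distance below is a placeholder, not our real calibration data:

```c
/* Hypothetical calibration table: each tilt-servo PWM band maps to a
   drive distance in inches.  Real values would come from measuring
   the robot on the field. */
static const struct band {
    unsigned char pwm_min;   /* lower edge of tilt PWM band */
    unsigned char pwm_max;   /* upper edge of tilt PWM band */
    int inches;              /* how far to drive for this band */
} distance_table[] = {
    {150, 169, 96},          /* target far away */
    {170, 189, 60},
    {190, 209, 36},
    {210, 229, 18},          /* target close in */
};

#define TABLE_LEN (sizeof(distance_table) / sizeof(distance_table[0]))

/* offset_inches is the field-calibration fudge factor described below. */
int lookup_drive_distance(unsigned char tilt_pwm, int offset_inches)
{
    unsigned int i;
    for (i = 0; i < TABLE_LEN; i++)
        if (tilt_pwm >= distance_table[i].pwm_min &&
            tilt_pwm <= distance_table[i].pwm_max)
            return distance_table[i].inches + offset_inches;
    return 0;   /* no band matched: stay put */
}
```

With the placeholder table, a tilt PWM of 175 and a +5 inch offset would command a 65 inch drive.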

You will notice we used a counter for timing some of the events - things like backing up, or making sure the robot stopped. We could have used the encoders to do it, but this was this year’s quick and dirty: we had already started doing it this way before we realized the other way would be more accurate. Next year we’ll just check encoder values and potentiometer values instead of time values - it will help save some time.

Some tricks that we used: We knew that each field would be different, and we knew the camera would get knocked around a bit during the competition. To compensate, we added an “offset” value to both the distance and the turn angle that we would determine by actually running the robot.

Before calibrating on the field was allowed, we used the practice matches to calibrate. If we stopped 5 inches short, we’d add 5 inches to the distance offset.

We adjusted the turning offset on the practice field by having the robot turn left and checking how far we were off, then turn right.
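That calibration rule is simple enough to sketch. The function below is illustrative (units are inches for the distance offset or degrees for the turn offset, whichever quantity you're correcting):

```c
/* Practice-match calibration: if the robot stopped short, grow the
   offset by the shortfall; if it overshot, shrink the offset. */
int update_offset(int current_offset, int commanded, int measured)
{
    return current_offset + (commanded - measured);
}
```

For example, commanding 60 inches but measuring 55 would raise a zero offset to 5, exactly the "stopped 5 inches short, add 5 inches" rule above.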

The key for us to get the camera to sustain a good lock was to shield the servo cables from all power sources, and then get the lock while the robot was stationary.

This was obviously preceded by an enormous amount of work on the robot to adjust gains, work around deadbands, etc to get it to drive the distance we told it and the angle we told it. Our robot couldn’t drive less than 1 foot accurately, and couldn’t turn less than 2 degrees accurately. Anything over those values and we were within an inch and within about 1/2 a degree of being dead on.

Hopefully this code helps some teams. Please - if you do take ideas from it give us some credit. Our egos are in desperate need of help after not making it to Atlanta :wink:

auton.txt (12 KB)



Tom,
Thanks for your post.
I mentored Team 1286 and didn’t get much time for tuning the algorithm on the actual robot itself.
I haven’t touched the code since the Detroit regional, but it is interesting to note that I followed a similar path.

Step 1: Drive forward slowly (base PWM value is large due to friction from tank tread design). (no wheel encoders).
Step 2: (Kevin Watson’s camera code modified to only search at a specific tilt angle for better speed).
Step 3: Based on the PWM value of the tilt servo I used a lookup table to determine how much tilt PWM value to add
to the base PWM value.
(The tilt PWM value was calibrated to slow down as the target was reached - MBC).
Step 4: Based on the PWM value of the pan servo I used a correction to the base PWM value plus tilt PWM value to turn to the correct angle. (One side was sped up and/or the other side slowed down to correct angle - MBC). So now we have the base forward PWM value plus the tilt angle PWM value plus the pan servo PWM value.
Step 5: We drove forward the amount until the total PWM value caused the robot to stop (as tilt angle increased, tilt
angle PWM value decreased, as pan servo angle neared zero, pan angle PWM value approached zero).
Step 6: (no arm on robot - tank tread design only) - No tube to release at “the rack”.
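Under stated assumptions (invented gains and caps; 127 = neutral PWM, as throughout this thread), Steps 3-5 might blend together like this:

```c
/* Everything is 8-bit PWM math with 127 as neutral.  The gains and
   caps are invented for illustration; real values came from bench
   calibration. */

/* Tilt term: a big tilt error (target far away) gives a big push
   forward, tapering toward zero as the target is reached. */
int tilt_term(int tilt_error)
{
    int t = tilt_error * 2;        /* proportional, gain of 2 */
    if (t < 0)  t = 0;
    if (t > 60) t = 60;            /* cap the push */
    return t;
}

/* Pan term: signed; speeds one side up and slows the other. */
int pan_term(int pan_error)
{
    if (pan_error > 30)  return 30;
    if (pan_error < -30) return -30;
    return pan_error;
}

/* Combine base + tilt + pan into left/right drive PWM values. */
void blend_drive(int base_pwm, int tilt_error, int pan_error,
                 int *left, int *right)
{
    int fwd = base_pwm + tilt_term(tilt_error);
    int pan = pan_term(pan_error);
    *left  = fwd + pan;            /* one side sped up ...  */
    *right = fwd - pan;            /* ... the other slowed  */
    if (*left > 254)  *left = 254;
    if (*right > 254) *right = 254;
}
```

As the tilt and pan errors shrink near the rack, both corrections fall away and the drive settles back toward the base value, which is the stopping behavior Step 5 describes.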

Was I successful? Yes, somewhat. After spending many hours at the bench prior to the Detroit Area Regional to get
everything right, I was able to move straight forward and stop. I was also able to start off to one side of the intended target and move forward and stop. I was disappointed that the team didn’t give me multiple opportunities to test at Detroit.

A couple of important comments are in order:

  1. Spent lots of time making software work on the bench prior to getting to the robot

  2. Getting the camera to work and interface to the software the way you want it to work is a job in itself

  3. I used fixed point math (all numbers were between 0 and 255) - I am familiar with using fixed point due
    to past work experience.

  4. No trig functions are necessary. I think this is a difficult idea for high school students to contemplate (using PWM values as a surrogate for angles).

  5. Only proportional logic was used, in conjunction with table lookups and fixed point math. This simplifies
    implementation. The proportional gain changes as a function of the error (important - most PIDs discussed use a
    constant P rather than a varying P). More refinement may have demonstrated the need to provide integral correction
    capability.

  6. A calibration philosophy must accompany the control strategy. Trying to remember back to my high school experiences
    over 30 years ago, I would find it a challenge to A) have a control strategy and then B) have a calibration strategy

  7. Calibration strategy: A) Go straight forward slowly, the entire time with enough power to reach the rack during the autonomous
    period. This establishes your base calibration values for your robot, incorporating friction. With calibrations for each side,
    I also accommodated frictional differences between the two sides. B) Calibrate the camera for tilt angle. Find the camera tilt angle value at the starting position - make this value (179) correspond to the largest output value in your tilt angle PWM calibration table (16?). Find the camera tilt angle value at the end position (195) and make it correspond to zero in your tilt angle PWM calibration table. C) Calibrate the camera for pan angle (not completed). Start in the left or right starting position or equivalent. Find the maximum
    pan angle (80). Associate this with the largest PWM correction in the table lookup to make the robot turn to the desired direction.
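A minimal sketch of the varying-P idea from comment 5, assuming made-up thresholds and integer-only math (the "fixed point" described above):

```c
/* Proportional-only correction where the effective gain changes with
   the size of the error.  Thresholds and gains are placeholders. */
int p_correction(int error)
{
    int mag = (error < 0) ? -error : error;
    int corr;

    if (mag > 40)                  /* far off: full gain     */
        corr = mag;
    else if (mag > 10)             /* mid range: half gain   */
        corr = mag / 2;
    else                           /* close: deadband, stop  */
        corr = 0;

    return (error < 0) ? -corr : corr;
}
```

This is the comment's point in miniature: the "gain" is not one constant; the table (here an if/else ladder) picks a different one for each error range.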

Tom, I like the use of your case statements to indicate your starting position and target height. When I
stopped my development at Detroit, I essentially had the starting position logic operational.
I am considering drafting a white paper “The almost ready for Prime Time autonomous mode for Team 1286
for Rack ‘N’ Roll - 2007” Perhaps it would be a good read for next year’s teams.

As you can see by our use of the “User Byte” variable for that case statement, we actually used the User Interface control board LCD to pick which autonomous mode we were going to use.

We guessed that we could see how the opposing side’s robots were set up and then set it, but there were so few autons this year we didn’t have to worry about it.

The only “problem” we had was during the initial 3 matches at West Michigan, when we decided to try to get middle row scoring working instead of just low row scoring. We had to change our strategy a bit and get our distance and turn amounts prior to raising the arm, because the tube would block the camera angle. This resulted in the rather embarrassing situation of the robot moving up to the rack, raising the arm, then stopping dead. 3 times.

I wanted to hide underneath the bleachers :o

This year we didn’t have an auto mode, but next year I hope to. So far, unless the competition gives us absolutely no reason to use omni wheels, we want to use them. Over the long weekend I was bored and wrote some dead-reckoning omni wheel autonomous code. Some of the functions are not included. Honestly, I don’t even know if this code works. lol

Anyway, the functions that aren’t included:

blend - takes 2 numbers, subtracts 127 from each, adds them together, and spits out the value

speed - subtracts 127 from the values so 0 becomes neutral instead of 127.

EDIT
The robot also uses a gyroscope, and when driving a straight line it should correct itself.

also, the turnLeft and turnRight functions just turn the robot on a pivot point, like tank drive.
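For readers without the attachment, here is one possible reading of the blend and speed helpers described above. The clamping is my guess; the attached file has the real versions:

```c
/* speed(): re-center a 0-255 PWM value so 0 is neutral instead of 127. */
int speed(int pwm)
{
    return pwm - 127;
}

/* blend(): subtract 127 from each input, add them, then re-center the
   sum back into PWM space, clamped to the usual 0-254 output range. */
int blend(int a, int b)
{
    int out = (a - 127) + (b - 127) + 127;
    if (out < 0)   out = 0;
    if (out > 254) out = 254;
    return out;
}
```

So blending two neutral values (127, 127) stays neutral, and blending two hard-forward values saturates at full forward instead of wrapping.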

autonomous omni functions.txt (2.4 KB)



About the user mode byte - isn’t it either ON or OFF? How do you assign a value to it like that?

Do a search for it - I actually got the idea and the implementation off Chief Delphi to begin with.
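Whatever the plumbing, the idea is just a switch on a single byte you set before the match. This generic sketch uses invented mode names and byte values, not the actual Chief Delphi implementation:

```c
/* Autonomous routine selection from one pre-match byte. */
enum auton_mode { AUTON_LEFT, AUTON_CENTER, AUTON_RIGHT, AUTON_NONE };

enum auton_mode pick_auton(unsigned char user_byte)
{
    switch (user_byte) {
    case 0:  return AUTON_LEFT;
    case 1:  return AUTON_CENTER;
    case 2:  return AUTON_RIGHT;
    default: return AUTON_NONE;   /* unrecognized value: sit still */
    }
}
```

Defaulting unknown values to a do-nothing mode is cheap insurance against a mis-set byte driving the robot somewhere unplanned.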

Here is the core function that Team 11 used to drive to the rack. We were very successful with it towards the end of the season.


void approach_rack(void)
{
	static unsigned char approach_state = PIVOT_ROBOT;		// First state of approach rack
	static int temp_pan_servo = 127;
	static int temp_tilt_servo = 127;
	static int servo_pan_error = 0;
	static int servo_tilt_error = 0;
	static int last_servo_tilt = 0;
	static int encoder_error = 0;
	static int combined_encoder_count = 0;
	static int last_right_encoder_count = 0;
	static int break_friction = 0;

	if(Get_Tracking_State() == TARGET_IN_VIEW || Get_Tracking_State() == CAMERA_ON_TARGET) // CHECK IF THE CAMERA SEES THE TARGET, SHOULD BE FASTER
	{																						//	THAN JUST CHECKING TO SEE IF IT'S ON IT,
		temp_pan_servo = PAN_SERVO;															//		WITH JUST A SMALL SACRIFICE IN ACCURACY
		temp_tilt_servo = TILT_SERVO;
		servo_pan_error = (temp_pan_servo - SERVO_PAN_TARGET);
		servo_tilt_error = (temp_tilt_servo - SERVO_TILT_TARGET);
		service_brakes(0);

		switch (approach_state)
		{
			case PIVOT_ROBOT:		//ROBOT PIVOTING

				if(servo_pan_error >= SERVO_PAN_DEADZONE)
				{
					approach_rack_x =  RIGHT_TURN_SPEED + break_friction; //RIGHT TURN SPEED
					//printf("PIVOTING RIGHT...\n");
				}
				else if(servo_pan_error <= -1*SERVO_PAN_DEADZONE)
				{
					approach_rack_x =  LEFT_TURN_SPEED - break_friction;	//LEFT TURN SPEED
					//printf("PIVOTING LEFT...\n");
				}
				else
				{
					approach_rack_x = 127;
				//	printf("PIVOTING DONE...\n");
					approach_state = APPROACH_AND_SWERVE;
				}

	//		printf("friction: %d\n",break_friction);

				if (Get_Encoder_2_Count() >= last_right_encoder_count - 2 && Get_Encoder_2_Count() <= last_right_encoder_count + 2 )
				{
					break_friction += 2;
				}
				else
				{
					break_friction -= 3;
				}
				if (break_friction < 0)
				{
					break_friction = 0;
				}

			break;

			case APPROACH_AND_SWERVE:	//APPROACH AND SWERVE

				if(servo_pan_error >= PAN_SWERVE_DEADZONE || servo_pan_error <= -1*PAN_SWERVE_DEADZONE) // CHECK TO SEE IF WE ARE TOO FAR OFF TO SWERVE
				{
					approach_rack_x = 127;
					approach_rack_y = 127;
					approach_state = PIVOT_ROBOT;
				}
				else
				{

					if(servo_tilt_error >= SERVO_TILT_DEADZONE || servo_tilt_error <= -1*SERVO_TILT_DEADZONE)
					{
						approach_rack_y = 127 - (1.75 * servo_tilt_error);	//multiplier was 1

						if(approach_rack_y >= FORWARD_SPEED)
						{
							approach_rack_y = FORWARD_SPEED;
						//	printf("FORWARD FAST\n");
						}
						else if(approach_rack_y <= MIN_FORWARD_SPEED && approach_rack_y > 127)
						{
							approach_rack_y = MIN_FORWARD_SPEED; //MINIMUM FORWARD SPEED, DON'T WANT TO GO TOO SLOW
						//	printf("FORWARD SLOW\n");
						}
						else if(approach_rack_y <= BACKWARD_SPEED) 	//IF WE OVERSHOT ON THE Y
						{
							approach_rack_y = BACKWARD_SPEED;
						//	printf("BACKWARDS FAST\n");
						}
						else if(approach_rack_y >= MIN_BACKWARD_SPEED && approach_rack_y < 127)
						{
							approach_rack_y = MIN_BACKWARD_SPEED; //MINIMUM BACKWARD SPEED, DON'T WANT TO GO TOO SLOW
						//	printf("BACKWARDS SLOW\n");
						}


					}
					else
					{
						approach_rack_x = 127;
						approach_rack_y = 127;
						service_brakes(1);
						approach_state = AT_TARGET;
					}

					if(servo_pan_error >= SERVO_PAN_DEADZONE || servo_pan_error <= -1*SERVO_PAN_DEADZONE)
					{
						approach_rack_x = (servo_pan_error + 127);
					}
					else
					{
						approach_rack_x = 127;
					}

					if(approach_rack_x <= LEFT_SWERVE_SPEED)
					{
						approach_rack_x = LEFT_SWERVE_SPEED;
						printf("SWERVING LEFT\n");
					}
					else if(approach_rack_x >= RIGHT_SWERVE_SPEED)
					{
						approach_rack_x = RIGHT_SWERVE_SPEED;
						printf("SWERVING RIGHT\n");
					}
				}


			break;

			case AT_TARGET:

				service_brakes(1);
				approach_rack_x = 127;
				approach_rack_y = 127;
				at_target = 1;
				printf("AT TARGET...\n");

			break;

			default:

			break;
		}
	}
	else
	{
		approach_rack_y = 127;
		approach_rack_x = 127;
	}

	last_right_encoder_count = Get_Encoder_2_Count();
}

Wow! Decided to share our code I see :slight_smile: But, to the readers, remember: it’s all relative to the camera. You don’t need everything exact; you just need to know how far you are from where you want to be. Also, it is not worth getting data from the camera the entire run. Because of the shaking as the robot moves across the floor or turns, the camera can move around a lot during tracking, so stopping and getting a snapshot of where you are in relation to the light is the way to go. Once you know you are x feet and x degrees from the light, it’s just a matter of doing it. Also, don’t forget to subtract the distance from the camera to the end of the arm. We fought that for about an hour: we would see we were 10 feet from the light, drive forward that far, and hit the rack and flip over every time, because the camera was 4 feet from the front of the robot. We were really telling the robot to drive the camera under the light, not the end of the arm.
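That arm-offset correction can be sketched like this. The 48-inch setback comes from the "camera was 4 feet from the front" story above; every robot's number will differ:

```c
/* The camera reports range from the *camera* to the light, but the
   robot needs range from the end of the arm.  Placeholder setback. */
#define CAMERA_SETBACK_INCHES 48

int drive_distance(int camera_to_light_inches)
{
    int d = camera_to_light_inches - CAMERA_SETBACK_INCHES;
    return (d < 0) ? 0 : d;   /* at or past the rack: don't ram it */
}
```

With the raw 10-foot (120 inch) reading from the story, the corrected drive is 72 inches instead of 120, which is the difference between scoring and flipping over.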

-Kyle

Yeah, I broke down and decided to share. Since we aren’t going to IRI, and I doubt it will be applicable to next year’s competition, there isn’t any reason not to give some other team some help.

I’ve moved on to trying to figure out how to take the 0-255 x/y values from the joystick and turn them into an angle without floating point, so we can do a swerve next year.
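One floating-point-free approach is an octant-folded arctangent table. This sketch assumes centered inputs (raw joystick value minus 127); the table entries are precomputed as round(atan(i/32) * 180/pi) for i = 0..32:

```c
/* First-octant arctangent lookup, in whole degrees. */
static const unsigned char atan_tab[33] = {
     0,  2,  4,  5,  7,  9, 11, 12, 14, 16, 17, 19, 21, 22, 24, 25,
    27, 28, 29, 31, 32, 33, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45
};

/* Integer atan2: returns degrees in -180..180, where 0 is the +x
   axis and 90 is the +y axis.  Inputs are centered joystick values. */
int iatan2_deg(int y, int x)
{
    int xa = (x < 0) ? -x : x;
    int ya = (y < 0) ? -y : y;
    int a;

    if (xa == 0 && ya == 0)
        return 0;                          /* stick centered: no heading */
    if (ya <= xa)
        a = atan_tab[(ya * 32) / xa];      /* first octant directly      */
    else
        a = 90 - atan_tab[(xa * 32) / ya]; /* reflect about 45 degrees   */
    if (x < 0)
        a = 180 - a;                       /* fold into left half-plane  */
    return (y < 0) ? -a : a;               /* mirror for negative y      */
}
```

A 33-entry table keeps the worst-case error under about a degree, which is well inside the mechanical accuracy numbers quoted earlier in this thread, and the intermediate products stay far below integer limits even on an 8-bit controller.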