Chief Delphi

Chief Delphi (http://www.chiefdelphi.com/forums/index.php)
-   Programming (http://www.chiefdelphi.com/forums/forumdisplay.php?f=51)
-   -   Team 254 Presents: FRC 2016 Code (http://www.chiefdelphi.com/forums/showthread.php?t=151816)

wsh32 20-10-2016 01:59

Re: Team 254 Presents: FRC 2016 Code
 
How do you check your position in auto after you crossed a defense like the moat, where your wheels might be turning more than you're actually moving? Or did you not run into that problem?

Jared Russell 20-10-2016 11:10

Re: Team 254 Presents: FRC 2016 Code
 
Quote:

Originally Posted by wsh32 (Post 1612618)
How do you check your position in auto after you crossed a defense like the moat, where your wheels might be turning more than you're actually moving? Or did you not run into that problem?

1) we went slowly enough that worst case slip was limited.
2) we used closed-loop velocity control on the wheels to ensure that even if one side of the drive train momentarily lost traction, we didn't suddenly lose a ton of encoder ticks.
3) in the end, we didn't need to be that precise - our auto-aim could make the shot from anywhere in the courtyard, and on the way back we either just used a conservative distance (for our one-ball mode) or used a reflective sensor to ensure we didn't cross the center tape (for our two-ball modes).

kingca 20-10-2016 12:46

Re: Team 254 Presents: FRC 2016 Code
 
Oh, OK, so your "traction control" was making sure that your robot remained straight. By any chance, what was the logic behind making the robot stay straight?

Jared Russell 20-10-2016 15:26

Re: Team 254 Presents: FRC 2016 Code
 
Quote:

Originally Posted by kingca (Post 1612689)
Oh, OK, so your "traction control" was making sure that your robot remained straight. By any chance, what was the logic behind making the robot stay straight?

Yeah - the logic for staying straight is here. There's a PID controller that compares our actual heading to the desired heading, and adjusts the desired velocities of the left and right sides of the drive accordingly.
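The shape of that kind of heading controller can be sketched roughly like this. This is illustrative Java with a made-up gain, not the actual 254 code:

```java
// Hedged sketch: a proportional heading-hold controller that biases the
// left/right wheel velocity setpoints to keep the robot driving straight.
// The gain name and value are invented for illustration.
public class HeadingHold {
    static final double kHeadingP = 0.03; // velocity correction per degree of error (assumed gain)

    /** Returns {leftVelocity, rightVelocity} setpoints. */
    public static double[] straightDrive(double baseVelocity,
                                         double desiredHeadingDeg,
                                         double actualHeadingDeg) {
        double error = desiredHeadingDeg - actualHeadingDeg;
        double correction = kHeadingP * error * baseVelocity;
        // If we have drifted clockwise (actual < desired, CCW-positive
        // convention), speed up the right side and slow the left to steer back.
        return new double[] { baseVelocity - correction, baseVelocity + correction };
    }
}
```

The velocity setpoints would then be fed to the per-side closed-loop velocity controllers, so a momentary loss of traction on one side doesn't turn into a heading error.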

frcguy 20-10-2016 15:35

Re: Team 254 Presents: FRC 2016 Code
 
Quote:

Originally Posted by Jared Russell (Post 1612613)
I'm not sure what you're referring to? We did have a "traction control" mode that used closed-loop speed feedback along with a gyro to cross defenses while remaining straight, but this didn't require calculating traction.

What gyro did you guys use? I saw the Spartan Board on your bot at Chezy Champs, so I assume that you used the ADXRS453 that is built in to it, but I didn't have a chance to get a closer look.

Jared Russell 20-10-2016 15:42

Re: Team 254 Presents: FRC 2016 Code
 
Quote:

Originally Posted by frcguy (Post 1612736)
What gyro did you guys use? I saw the Spartan Board on your bot at Chezy Champs, so I assume that you used the ADXRS453 that is built in to it, but I didn't have a chance to get a closer look.

Yes.

kiettyyyy 21-10-2016 02:40

Re: Team 254 Presents: FRC 2016 Code
 
Quote:

Originally Posted by Jared Russell (Post 1611189)
I can't really answer the second question very well - I have only worked with the team for 3 years so all of the programmers are either still on the team or in college. Perhaps someone else can speak to history before that.

I might not be the best person to answer this either; however, one of their students ended up as one of my interns in Qualcomm's Corporate R&D group back in 2013.

He ended up being one of my top interns, able to keep up with the PhD candidates in programming, control theory and hardware concepts... without taking a single college course.

gerthworm 21-10-2016 08:55

Re: Team 254 Presents: FRC 2016 Code
 
Awesome job guys!

I saw in previous years you had a robot-hosted website interface for status & tuning, although I didn't see that this year. I might just be missing it.... Assuming I'm not, did you have a reason for not carrying that forward?

Jared Russell 21-10-2016 11:12

Re: Team 254 Presents: FRC 2016 Code
 
Quote:

Originally Posted by gerthworm (Post 1612819)
Awesome job guys!

I saw in previous years you had a robot-hosted website interface for status & tuning, although I didn't see that this year. I might just be missing it.... Assuming I'm not, did you have a reason for not carrying that forward?

If we had infinite time, I'm sure we would have. Instead, we used a combination of SmartDashboard / our own web interface for Network Tables for monitoring feedback, and used the Talon SRX configuration page for adjusting most of our gains.

ranlevinstein 21-10-2016 15:22

Re: Team 254 Presents: FRC 2016 Code
 
Thank you guys for sharing your amazing code!
I have a couple of questions:
Why did you choose to follow a path instead of a trajectory during auto this year?
Why did you choose the adaptive pure pursuit controller instead of other controllers?

gerthworm 21-10-2016 16:32

Re: Team 254 Presents: FRC 2016 Code
 
Quote:

Originally Posted by Jared Russell (Post 1612854)
If we had infinite time, I'm sure we would have.

The struggle is real. Nice, thanks again for posting and answering questions! I always love going through your stuff; so many great and well-implemented ideas!

Jared Russell 22-10-2016 01:05

Re: Team 254 Presents: FRC 2016 Code
 
Quote:

Originally Posted by ranlevinstein (Post 1612898)
Why did you choose to follow a path instead of a trajectory during auto this year?

Great question! I assume you are referring to (my) definition of path vs. trajectory from the motion profiling talk (these definitions are hardly universal).

Path: An ordered list of states (where we want to go, and in what order). Paths are speed-independent.

Trajectory: A time-indexed list of states (at each time, where we want to be). Because each state needs to be reached at a certain time, we also get a desired speed implicitly (or explicitly, depending on your representation).

In 2014 and 2015, our controllers followed trajectories. In 2016, our drive followed paths (the controller was free to determine its own speed). Why?

Time-indexed trajectories are planned assuming you have a pretty good model of how your robot will behave while executing the plan. This is useful because (if your model is good), your trajectory contains information about velocity, acceleration, etc., that you can feed to your controllers to help follow it closely. This is also nice because your trajectory always takes the same amount of time to execute. But if you end up really far off of the trajectory, you can end up with weird stuff happening...

With a path only, your controller has more freedom to take a bit of extra time to cross a defense, straighten out the robot after it gets cocked sideways, etc. This helps if you don't have a good model of how your robot is going to move - and a pneumatic wheeled robot climbing over various obstacles is certainly hard to model.
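The distinction above can be sketched with a couple of illustrative types (these are not from the 254 repository):

```java
import java.util.List;

// Hedged sketch of the path-vs-trajectory distinction; illustrative types only.
public class PlanTypes {
    /** A pose the robot should pass through. */
    public static class State {
        public final double x, y, headingDeg;
        public State(double x, double y, double headingDeg) {
            this.x = x; this.y = y; this.headingDeg = headingDeg;
        }
    }

    /** Path: an ordered list of states; speed is left up to the follower. */
    public static class Path {
        public final List<State> waypoints;
        public Path(List<State> waypoints) { this.waypoints = waypoints; }
    }

    /** Trajectory sample: the same state, but pinned to a time. */
    public static class TimedState {
        public final double timeSec;
        public final State state;
        public TimedState(double timeSec, State state) {
            this.timeSec = timeSec; this.state = state;
        }
    }

    /** Trajectory: time-indexed states, so speed falls out implicitly. */
    public static class Trajectory {
        public final List<TimedState> samples;
        public Trajectory(List<TimedState> samples) { this.samples = samples; }

        /** Implied speed between consecutive samples (distance / time). */
        public double impliedSpeed(int i) {
            TimedState a = samples.get(i), b = samples.get(i + 1);
            double dx = b.state.x - a.state.x, dy = b.state.y - a.state.y;
            return Math.hypot(dx, dy) / (b.timeSec - a.timeSec);
        }
    }
}
```

The point is that a Trajectory carries timing (and therefore velocity) information that the controller is obligated to track, while a Path leaves the schedule open.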

Quote:

Originally Posted by ranlevinstein (Post 1612898)
Why did you choose the adaptive pure pursuit controller instead of other controllers?

Simplicity. A pure pursuit controller is basically a P-only controller on cross track error, but somewhat easier to tune. Adaptive pure pursuit is sort of akin to a PD controller (the only difference is a fudge factor in how far you look ahead). If you only have a day to get auto mode working, and the robot is being repaired up on a table while you are coding, then pure pursuit requires very little time to get tuned once you are back on the floor :)
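For reference, the core pure pursuit relationship is a single line of geometry. This sketch (not 254's implementation) shows why it acts like a P controller on cross-track error:

```java
// Hedged sketch of the standard pure pursuit steering law, not 254's code.
// Given a lookahead point expressed in the robot frame (+x forward, +y left),
// pure pursuit commands the arc through that point:
//     curvature = 2 * y / L^2, where L is the distance to the point.
// The lateral (cross-track) offset y appears linearly, which is why the
// controller behaves like a P-only controller on cross-track error, with
// the lookahead distance acting as the tuning knob.
public class PurePursuit {
    public static double curvature(double lookaheadX, double lookaheadY) {
        double L2 = lookaheadX * lookaheadX + lookaheadY * lookaheadY;
        return 2.0 * lookaheadY / L2;
    }
}
```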

apache8080 26-10-2016 22:15

Re: Team 254 Presents: FRC 2016 Code
 
Thanks for all of the great resources.

I had a few questions on your vision code:

Are you guys calculating distance from the goal to adjust the hood? If so, how?

If you guys would have used the Jetson TX1, would you have considered using the ZED stereocamera from Stereolabs?

Jared Russell 26-10-2016 23:58

Re: Team 254 Presents: FRC 2016 Code
 
Quote:

Originally Posted by apache8080 (Post 1613744)
Are you guys calculating distance from the goal to adjust the hood? If so, how?

Yep. I'll point you to a few places in the code that help explain how.

First, in the Android app, we find the pixel coordinates corresponding to the center of the goal:
https://github.com/Team254/FRC-2016-...View.java#L131

...and then turn those pixel coordinates into a 3D vector representing the "ray" shooting out of the camera towards the target. The vector has an x component (+x is out towards the goal) that is always set to 1; a y component (+y is to the left in the camera image); and a z component (+z is up). This vector is unit-less, but the ratios between x, y, and z define angles relative to the back of the phone. The math behind how we create this vector is explained here.
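Assuming a standard pinhole camera model, that pixel-to-ray step looks roughly like this (the calibration constants below are invented, not the real phone's):

```java
// Hedged sketch of the pixel-to-ray conversion under a pinhole camera model.
// The principal point and focal length are assumed values for illustration.
public class GoalRay {
    static final double kCenterX = 160.0, kCenterY = 120.0; // assumed principal point (px)
    static final double kFocalLengthPx = 200.0;             // assumed focal length (px)

    /**
     * Converts image coordinates (u right, v down) into the unit-less ray
     * (x, y, z) with x fixed at 1, +y to the left, +z up.
     */
    public static double[] pixelToRay(double u, double v) {
        double y = (kCenterX - u) / kFocalLengthPx; // left of image center -> +y
        double z = (kCenterY - v) / kFocalLengthPx; // above image center -> +z
        return new double[] { 1.0, y, z };
    }
}
```

Scaling the whole vector doesn't change the direction it points, which is why fixing x = 1 loses nothing: only the ratios y/x and z/x matter for the angles.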

The resulting vector is then sent over a network interface to the RoboRIO. The first interesting place it is used is here:
https://github.com/Team254/FRC-2016-...tate.java#L187

In that function, we turn the unit-less 3D vector from the phone into real-world range and bearing. We can measure pitch (angle above the plane of the floor) by using our vector with some simple trig; same thing for yaw (angle left/right). Since we know where the phone is on the robot (from CAD, and from reading the sensors on our turret), we can compensate for the fact that the camera is not mounted level, and the turret may be turned. Finally, we know how tall the goal should be (and how high the camera should be), so we can use more trigonometry to use our pitch and yaw angles to determine distance. We feed these values into a tracker (which smooths out our measurements by averaging recent goal detections that seem to correspond to the same goal).
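A stripped-down version of that trig might look like the following. The mounting constants are invented, and this sketch ignores the turret rotation that the real code compensates for:

```java
// Hedged sketch: camera ray -> range and bearing, with invented mounting
// constants and no turret compensation. Not the actual 254 code.
public class RangeBearing {
    static final double kCameraPitchDeg = 35.0; // assumed upward camera tilt
    static final double kCameraHeight = 1.0;    // assumed lens height (m)
    static final double kGoalHeight = 2.5;      // assumed goal center height (m)

    /** Returns {distance, bearingDeg} given the (1, y, z) ray from the phone. */
    public static double[] solve(double y, double z) {
        // Pitch of the ray above the camera axis, then above the floor.
        double pitchRad = Math.atan2(z, 1.0) + Math.toRadians(kCameraPitchDeg);
        // Known height difference plus pitch gives horizontal distance.
        double distance = (kGoalHeight - kCameraHeight) / Math.tan(pitchRad);
        double bearingDeg = Math.toDegrees(Math.atan2(y, 1.0));
        return new double[] { distance, bearingDeg };
    }
}
```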

The final part is to feed our distance measurement (and bearing) into our auto-aiming code. We do this here:
https://github.com/Team254/FRC-2016-...ture.java#L718

Notice that we use a function to convert between distance and hood angle. This function was tuned (many times throughout the season) by putting the robot on the field, shooting a bunch of balls from a bunch of different spots, and manually adjusting hood angle until the shots were optimized for each range. We'd record the angles that worked best, and then interpolate between the two nearest recorded angles for any given distance we want to shoot from.
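That interpolation scheme can be sketched with a TreeMap; the calibration numbers here are made up, since the real table was tuned on the field:

```java
import java.util.Map;
import java.util.TreeMap;

// Hedged sketch of a distance-to-hood-angle lookup with linear interpolation
// between the two nearest recorded calibration points. Values are invented.
public class HoodMap {
    static final TreeMap<Double, Double> kHoodAngles = new TreeMap<>();
    static {
        kHoodAngles.put(1.0, 60.0); // (distance, hood angle) - illustrative only
        kHoodAngles.put(2.0, 50.0);
        kHoodAngles.put(3.0, 42.0);
    }

    /** Linearly interpolates between the two nearest recorded angles. */
    public static double hoodAngle(double distance) {
        Map.Entry<Double, Double> floor = kHoodAngles.floorEntry(distance);
        Map.Entry<Double, Double> ceil = kHoodAngles.ceilingEntry(distance);
        if (floor == null) return ceil.getValue();  // below calibrated range: clamp
        if (ceil == null) return floor.getValue();  // above calibrated range: clamp
        if (floor.getKey().equals(ceil.getKey())) return floor.getValue();
        double t = (distance - floor.getKey()) / (ceil.getKey() - floor.getKey());
        return floor.getValue() + t * (ceil.getValue() - floor.getValue());
    }
}
```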

Jared Russell 27-10-2016 00:00

Re: Team 254 Presents: FRC 2016 Code
 
Quote:

Originally Posted by apache8080 (Post 1613744)
If you guys would have used the Jetson TX1, would you have considered using the ZED stereocamera from Stereolabs?

Probably not. Stereo was totally unnecessary for estimating range to the goal last year; we were able to estimate our distance to within a few inches using only the method described in the previous post.

