Re: FRC971 Spartan Robotics 2016 Release Video
Quote:
We did a bunch of CAD sketches to try to determine packaging. We quickly discovered that to make it all package like we wanted, a side-by-side wheel shooter helped a lot. That info was then fed to the prototyping team, and they set to work. The first job was to build a CIM + wheels module. By random chance, we picked 1 CIM and 2 wheels. Based on some old math, we picked a reasonably high surface speed, and machined the module on the router. We then attached the module to a bunch of 8020 and started doing a parameter sweep.

The obvious parameters (for us) to try were compression and surface speed. Once the shooter was making shots into the goal reliably, we started collecting ball landing position data. They were using a pulse counter on a Fluke multimeter with a reflectance sensor to measure RPM, so they could dial the RPM in accurately for the tests.

As you increase compression, there is a compression level where the spread shrinks, and then starts to increase. We picked that point. The prototyping team didn't think the standard deviation was quite good enough, so they kept working at it. Someone came up with the idea of offsetting the wheels vertically to put a little spin on the ball. That reduced the spread far enough that they were happy with it, and we shipped it with those numbers exactly. We haven't had to touch any of the shooter parameters all season, which is wonderful.
Re: FRC971 Spartan Robotics 2016 Release Video
Quote:
We machined them using a 5C collet fixture on a HAAS mini mill (PN 3174A42 on McMaster, or equivalent). This puts the shaft vertical. We machined it primarily with a 2" LOC 1/2"D carbide endmill, and designed the shafts so that they could be done in two setups of this. We started with 5/8" precision ground 7075. We would start with a blank of known length (so we could use collet stops), machine one side, flip it over and hold onto a section that was either previously machined or designed to be left un-machined, and then machine the back side. Last year we started using slitting saws to put the snap ring grooves in during the same operation, and that has been a nice time saver. Another trick we learned is that it is really worthwhile to have a finishing endmill that you use just for the finish pass, so that you don't end up with the bottom of each profile being a little tight due to the end of the endmill being more worn. We haven't had any issues with runout, although it is a concern. Overall, I estimate that we spend 1/2 to 3/4 of a day machining all the critical shafts for both robots. The setup and CAM changeover goes relatively quickly once the first part is set up. We find the time savings on the design and the fiddle factor to be worth the effort.
Re: FRC971 Spartan Robotics 2016 Release Video
Your last 3 robots have been extremely ambitious, unique, and well done. Why are your robots so different? What are you doing that's different from other teams? During brainstorming, if someone proposed something like your 2015 robot, most teams would throw it out immediately. How do you encourage such unique thinking, and when you move into the design process, how do you know you'll be able to pull off your crazy robots?
Re: FRC971 Spartan Robotics 2016 Release Video
Quote:
We started by controlling every variable possible. Once we had something that performed well enough, we started looking at the CAD and what the final solution was going to look like. We then measured the prototype and tried to make the model match. Whenever we found a place where we wanted the model to diverge from the prototype, we tweaked the prototype to match the proposed design and re-ran our tests. Our final design has a plate holding the ball until it is grabbed by the wheels, which serves the same purpose. We pulled the bar back until it was just about as long as the final design wanted, and verified that it still worked. Quote:
This year was different in that you didn't need to shoot multiple balls. We chose to take advantage of that by designing a shooter to hold one ball very securely and load it very consistently. That pointed us towards designing something which grabbed the ball in a "cage" with pistons and linkages, and some sort of piston loading mechanism. After a couple conceptual iterations in discussions, someone proposed having 2 links where the links were driven relative to each other to grab and release the ball, and the pair of links rotated together to feed the ball into the flywheels. We very carefully worked through all the geometry in SolidWorks sketches, figured out all the components, and then worked on finalizing the design. We tried at least 4 different piston models before we found a piston that would work. I'm not sure I explained the pistons all that well; ask for clarification where I wasn't clear enough. |
Re: FRC971 Spartan Robotics 2016 Release Video
Quote:
One of the things that makes 971 unique is that we are willing and able to move significant mechanical complexity into software. 2014 and 2015 were good examples of this. 2016 wasn't as crazy from a software point of view, but we also have gotten good at making robots work. We have strong software mentors and students with experience in controls and automation, along with strong mechanical mentors and students to build designs good enough to be accurately controlled. The mechanical bits make the software easy (or at least possible).

We tend to start with a list of requirements for the robot, quickly decide what we know is going to work from previous years (and actually start designing it in parallel), and then sketch out concept sketches of the rest. We do a lot of big picture design with blocky mechanisms and sketches to work out geometry and the big questions. We then start sorting out all the big unknowns in that design, and work our way from there to a final solution. I think that a lot of what makes us different is that we impose interesting constraints on our robots, and then work really hard to satisfy them. We come up with most of our most out-of-the-box ideas while trying to figure out how to make the concept sketch have the CG, range of motion, tuck positions, and speeds that we want, before we get to the detailed design.

In 2014, we wanted to build something inspired by 1114 in 2008. We put requirements on that to be able to quickly intake the ball and also be able to drive around with the claw inside our frame perimeter, ready to grab the ball. We tend to get to the requirements pretty quickly, and then start figuring out the best way to achieve them. After a bunch of challenging packaging questions, someone proposed that rather than using a piston to open and close the claw, we could just individually drive the two claws with motors. That breakthrough let us meet our packaging requirements much more easily, and ended up being a pretty unique design.
That got us comfortable building robots with more and more software. In 2015, we were pretty confident that we didn't need to cut into the frame perimeter. Unfortunately, by the time we had determined that that requirement was hurting us, we had already shipped the drive base. We spent a lot of time working through various alternatives to figure out how to stack on top of the drive base and then place the stack. In the end, the only way we could make it work was what you saw. We knew that we were very close on weight, and again used software to allow us to remove the mechanical coupling between the left and right sides to reduce weight. We were confident from our 2014 experience that we could make that work. I like to tell people that our 2015 robot was a very good execution of the wrong strategy... We wouldn't build it again, so maybe you guys are all smarter than us there ;)

2016 fell together much more easily than I expected. It had more iteration than we've ever done before (lots of V2s on mechanisms), which helps it look polished. Honestly, most of the hard work in 2016 was in the implementation, not the concept. We wanted a shooter that shot from high up, and the way to do that was to put it on an arm. We are getting to the point where we have a lot of knowledge built up around what fails in these robots and what to pay attention to. That part has just taken a lot of time and a lot of hard work. We don't spend much time debating how to do low backlash gearboxes, or figuring out how to control or sense various joints.

Sometimes, I think we design the robots we design because we over-think problems and then come up with solutions to them. We work through a lot of math for gearbox calculations, power usage, etc., and do some basic simulations on some of the more critical subsystems. We also do a gut check to make sure that we think the subsystems will work when we build them, and we have good enough prototypes to prove out anything we are uncertain about.
Re: FRC971 Spartan Robotics 2016 Release Video
Could you explain your vision tracking process this year? I heard you guys used 2 cameras. And what coprocessor did you use, if any?
Re: FRC971 Spartan Robotics 2016 Release Video
Can you reveal the secret of how you dealt with the changing compression of the boulders?
Re: FRC971 Spartan Robotics 2016 Release Video
Quote:
We do have two cameras. There was an early thought to use them to do stereo (do separate target recognition in both cameras, and get depth from the offset distance), and we had a bench prototype of this that had good preliminary results. We ended up not needing accurate depth info, so the two cameras were just used for finding the center of the goal. We could have done that with one camera mounted centered, but that is easier said than done. We were using the Jetson for vision processing and were happy with its performance.
Re: FRC971 Spartan Robotics 2016 Release Video
Quote:
As Travis said, we wanted to do stereo, but didn't get around to verifying that it worked well enough to start using the distance it reported. One of the side effects of the stereo cameras was that we didn't need to deal with the transforms required to handle a camera that isn't centered. Our shooter didn't have any space above or below the ball for a camera; the bottom of the shooter rested on the bellypan, and the top just cleared the low bar.

We did the shape detection on the Jetson TK1, and passed back a list of U shapes found to the roboRIO over UDP in a protobuf, including the coordinates of the 4 corners for each camera. We didn't find that we needed to do color thresholding, just intensity thresholding and then shape detection. This ran at 20 Hz, 1280x1024 (I think), all on the CPU. The roboRIO then matched up the targets based on the angle of the bottom of the U.

We were very careful to record the timestamps through the system: the timestamp that v4l2 reported the image was received by the kernel, the timestamp at which it was received by userspace on the Jetson, the timestamp it was sent to the roboRIO, and the timestamp that the processed image was received on the roboRIO. This let us back out the projected time that the image was captured on the Jetson, in the roboRIO clock, to within a couple ms. We then saved all the gyro headings over the last second and the times at which they were measured, and used those two pieces of data to interpolate the heading when the image was taken, and therefore the current heading of the target. This, along with our well tuned drivetrain control loops, let us stabilize on the target very quickly. Ask any follow-on questions that you need.
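A minimal sketch of the latency-compensation idea described above (the function name and the interpolation details are my own illustration, not 971's actual code): given the last second of gyro samples and the backed-out capture timestamp, linearly interpolate the heading at the moment the image was taken.

```python
import bisect

def heading_at(capture_time, times, headings):
    """Linearly interpolate the gyro heading at the image capture time.

    `times` is a sorted list of gyro sample times, `headings` the matching
    headings (e.g. the last second of samples, as the post describes).
    Clamps to the endpoints if the capture time falls outside the buffer.
    """
    i = bisect.bisect_left(times, capture_time)
    if i == 0:
        return headings[0]
    if i == len(times):
        return headings[-1]
    t0, t1 = times[i - 1], times[i]
    h0, h1 = headings[i - 1], headings[i]
    frac = (capture_time - t0) / (t1 - t0)
    return h0 + frac * (h1 - h0)

# The heading of the target in the field frame is then (roughly)
# heading_at(capture_time, ...) + angle_to_target_from_image.
```

The key point is that the controller aims using the heading at capture time, not the heading now, so camera and network latency stop mattering to first order.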
Re: FRC971 Spartan Robotics 2016 Release Video
Quote:
I have a few questions about your control.

1. I have read about your delta-u controller and I am not sure if I understood it correctly; I would like to know if I got it right. You have 3 states in your state space controller: position, velocity, and error voltage. You model it as dx/dt = Ax + Bu, where u is the rate of change of voltage. Then you use pole placement to find the K matrix, and in your controller you set u to be -Kx. Then you estimate the state from position using an observer. You use an integrator on u and command the motors with it. To the error voltage state you feed in the difference between the estimated voltage from the observer and the integrated u commands.
2. Will the delta-u controller work the same if I command the motors with u and not the integral of it, and instead use the integrated voltage error as a state? Why did you choose this form for the controller and not another?
3. Is the delta-u controller in the end a linear combination of position error, velocity error, and voltage error?
4. Why did you use a Kalman filter instead of a regular observer? How much better was it in comparison to a regular observer?
5. How did you tune the Q and R matrices in the Kalman filter?
6. How do you tune the parameters that transform the motion profile into the feed-forward you feed to your motors?
7. How did you create 2 dimensional trajectories for your robot during auto?
8. How do you sync multiple trajectories in the auto period? For example, how did you make the arm of your robot go up after crossing a defense?

Thank you very much! :)
Re: FRC971 Spartan Robotics 2016 Release Video
Quote:
Delta-U won't work if you command the motors with U, since your model doesn't match your plant (off by an integral). I've recently switched formulations to what we used this year and last, and I think the new formulation is easier to understand.

If you have an un-augmented plant dx/dt = Ax + Bu, you can augment it by adding a "voltage error state":

d[x; voltage_error]/dt = [A, B; 0, 0] [x; voltage_error] + [B; 0] u

You then design the controller to control the original Ax + Bu (u = K * (R - X)), design an observer to observe the augmented X, and then use a controller which is really [K, 1]. We switched over last year because it was easier to think about the new controller. In the end, both it and Delta U controllers will do the same thing. Quote:
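As a sketch of the augmentation described above (a minimal NumPy example with my own toy plant and function names, not 971's code), the plant gains one voltage-error state that has no dynamics of its own but drives the plant through the same B column, and the gain matrix picks up a 1 for that state:

```python
import numpy as np

def augment_with_voltage_error(A, B):
    """Augment dx/dt = A x + B u with a constant "voltage error" state.

    The error state's own rows are zero (it doesn't move on its own), but it
    feeds into the original states through B, exactly like the real input.
    """
    n = A.shape[0]
    m = B.shape[1]
    A_aug = np.block([[A, B],
                      [np.zeros((m, n)), np.zeros((m, m))]])
    B_aug = np.vstack([B, np.zeros((m, m))])
    return A_aug, B_aug

# Toy single-joint plant: states are [position, velocity] (made-up numbers).
A = np.array([[0.0, 1.0],
              [0.0, -5.0]])
B = np.array([[0.0],
              [2.0]])
A_aug, B_aug = augment_with_voltage_error(A, B)

# A gain K designed for the un-augmented plant becomes [K, 1] on the
# augmented state, so the estimated voltage error gets subtracted from u:
# u = K (R - x_hat) - voltage_error_hat.
K = np.array([[10.0, 1.0]])
K_aug = np.hstack([K, np.eye(1)])
```

The controller itself is unchanged; only the observer sees the extra state, and the [K, 1] gain cancels whatever steady voltage the observer decides is being applied that wasn't commanded.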
Quote:
Honestly, I end up tuning it one way and then looking at the poles directly at the end to see how the tuning affected the results. Quote:
Quote:
The equation to minimize is:

(B * U - (R(n+1) - A R(n)))^T * Q * (B * U - (R(n+1) - A R(n)))

This means that you have 3 goals running around: the un-profiled goal, the profiled goal, and the R that the feed-forwards is asking you to go to. I'd recommend you read the code to see how we kept track of it all, and I'm happy to answer questions from there. The end result was that our model defined the feed-forwards constants, so it was free :) We were also able to gain schedule the feed-forwards terms for free.

FYI, this was the first year that we did feed-forwards. Before, we just relied on the controllers compensating. You can see it in some of the moves of the 2015 robot, where it'll try to do a horizontal move but end up with a steady state offset while moving due to the lack of feed-forwards. Quote:
Quote:
Awesome questions, keep them coming! I love this stuff.
Re: FRC971 Spartan Robotics 2016 Release Video
Quote:
Are the A and B matrices here the same as in this pdf? https://www.chiefdelphi.com/forums/a...1&d=1419983380 Quote:
u = constant * integral of position error + constant * integral of velocity error + constant * integral of voltage error

Isn't that PI control plus integral control of the voltage? This controller, as far as I know, should have integral windup. What am I missing? Quote:
This is really smart! I want to make sure I got it: Q is a weight matrix and you are looking for the u vector that minimizes the expression? In which way are you minimizing it? My current idea is to set the derivative of the expression to zero and solve for u. Is that correct? Did you get to this expression by claiming that R(n+1) = A R(n) + B u, where u is the correct feed forward? Can you explain how you observed the voltage? Quote:
Also, how do your students understand this paper? There are a lot of things that need to be known in order to understand it. Who teaches your students all this stuff? My team doesn't have any controls mentor and we are not sure whether to move to model based control or not. Our main problem with it is that there are a lot of things that need to be taught, and it's very hard to maintain the knowledge when we don't have any mentor who knows this. Do you have any advice? Thank you very much!
Re: FRC971 Spartan Robotics 2016 Release Video
Quote:
Quote:
The trick is that the integrator is inside the observer, not the controller. The controller may be commanding 0 volts, but if the observer is observing motion where it shouldn't be, it will estimate that voltage is being applied. This means that the third term will start to integrate the commanded voltage to compensate. If the observer is observing the correct applied voltage, it won't do that. You can show this in simulation a lot more easily than you can reason about it. That's one of the reasons I switched to the new controller formulation with a direct voltage error estimate: I could think about it more easily. Quote:
That is the correct equation, nice! You then want to drive R to be at the profile as fast as possible. Quote:
Intuitively, you can think of the observer estimating where the next sensor reading should be, measuring what it got, and then attributing the error to some amount of error in each state. So, if the position is always reading higher than expected, it will slowly squeeze the error into the voltage error term, where it will finally influence the model to not always read high anymore. You'll then have a pretty good estimate of the voltage required to do what you are currently doing. Quote:
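The intuition above can be shown in a tiny simulation (my own toy numbers and hand-picked observer gains, not 971's actual model): a first-order velocity plant has an unmodeled 1.5 V disturbance, the controller commands 0 V, and the observer slowly squeezes the prediction residual into its voltage-error estimate.

```python
# Discrete-time toy plant: v' = a*v + b*(u + e), where e is an unknown
# extra voltage actually reaching the motor. All constants are made up
# for illustration; the observer gains l1, l2 were picked to be stable.
a, b = 0.9, 0.1
l1, l2 = 0.5, 1.0
true_error = 1.5          # volts applied that the controller never commanded

v = 0.0                   # real plant state
v_hat, e_hat = 0.0, 0.0   # observer's estimates of velocity and voltage error
u = 0.0                   # controller commands 0 volts the whole time

for _ in range(200):
    v = a * v + b * (u + true_error)        # plant moves even though u = 0
    v_pred = a * v_hat + b * (u + e_hat)    # observer's prediction
    r = v - v_pred                          # innovation (measured - predicted)
    v_hat = v_pred + l1 * r                 # correct the velocity estimate
    e_hat = e_hat + l2 * r                  # attribute leftover error to voltage

print(round(e_hat, 3))    # converges to ~1.5: the observer "finds" the error
```

Because the position keeps reading higher than the zero-volt model predicts, the residual is persistently positive and the e_hat integrator walks up until the model stops over-predicting, exactly as described above.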
We have a small number of students who get really involved in the controls on 971. Some years are better than others, but that's how this type of thing goes. There is a lot of software on a robot that isn't controls. I'm going to see if the paper actually gets good results, and then work to involve students to see if we can fix some of the shortcomings with the model that one of the examples in the paper uses and help make improvements. I think that'll let me simplify the concept somewhat for them and get them playing around with the algorithm. I've yet to try to involve students in something this mathematically challenging, so I'll know more once I've pulled it off... I mostly mention the paper as something fun that you can do with controls in this context.

When I get the proper amount of interest and commitment, I sit down with the student and spend a significant amount of time teaching them how and why state space controllers work. I like to do it by rederiving some of the math to help demystify it, and having them work through examples. I've had students take that knowledge pretty far and do some pretty cool things with it. Teaching someone something this tricky is a lot of work. We tend to have about 1 student every year actually take the time to succeed. Sometimes more, sometimes less.

Doing model based controls without good help can be tricky. Most of the time, I honestly recommend focusing on writing test cases with simulations for simpler controllers (PID, for example) before you start looking at model based controls. This gets you ready for what you need to do for more complicated controllers, and if you were to stop there having learned dependency injection and testing, that would already be an enormous success. :) The issue is that most of this stuff is upper division college level material, and sometimes graduate level material. Take a subsystem on your robot, and try to write a model based controller for it over the off-season.
Re: FRC971 Spartan Robotics 2016 Release Video
Quote:
acceleration = a * velocity + b * (voltage error) + b * u, where a and b are constants. I am a bit confused about why this is true, because the voltage error is in volts and u is in volts/second, so you are adding numbers with different units. Quote:
u = constant * position error + constant * velocity error + constant * integral of voltage error

Maybe there is a problem with the velocity error part here, but I still don't understand how there won't be integral windup when you have the integral of position error in your controller. What am I missing? Also, I saw you are using the moment of inertia of whatever is being spun in your model. What units is it in, and how can I find it? Quote:
I was wondering how the delta-u controller works when the commanded voltage gets higher than 12 volts, because then you can't control the rate of change of the voltage anymore. Thank you so much! Your answers helped my team and me a lot! :)
Re: FRC971 Spartan Robotics 2016 Release Video
Quote:
Awesome robot - as always.
Re: FRC971 Spartan Robotics 2016 Release Video
Quote:
u = (B^T * Q * B)^-1 * (r(n+1)^T - r(n)^T * A^T) * Q * B

Is that correct?
Re: FRC971 Spartan Robotics 2016 Release Video
Quote:
Most of our backlash reduction comes from eliminating hex backlash. We also do the standard thing of running as large a chain reduction as you can on the last stage, and then keeping that chain well tensioned. Going forward, we are going to be using #35 whenever we can for these reductions to avoid stiffness issues, which also helps with the controls.
Re: FRC971 Spartan Robotics 2016 Release Video
Quote:
Code:
def TwoStateFeedForwards(B, Q):
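The function above is truncated in this archive. A sketch of what it presumably computes, reconstructed from the minimization equation given earlier in the thread (my reconstruction, not 971's actual implementation): setting the derivative of (B u - (r(n+1) - A r(n)))^T Q (B u - (r(n+1) - A r(n))) to zero gives u = (B^T Q B)^-1 B^T Q (r(n+1) - A r(n)), so the function returns the constant gain in front.

```python
import numpy as np

def two_state_feedforwards(B, Q):
    """Least-squares feed-forward gain.

    Minimizes (B u - d)^T Q (B u - d) over u, with d = r(n+1) - A r(n),
    giving u = Kff @ d where Kff = (B^T Q B)^-1 B^T Q (a Q-weighted
    pseudo-inverse of B).
    """
    return np.linalg.inv(B.T @ Q @ B) @ B.T @ Q

# Toy discrete two-state model (position, velocity); numbers are made up.
B = np.array([[0.005],
              [0.1]])
Q = np.eye(2)
Kff = two_state_feedforwards(B, Q)
# At each step: u_ff = Kff @ (r_next - A @ r)
```

A sanity check on the algebra: Kff @ B is the identity, so if the goal step r(n+1) - A r(n) is exactly achievable by some input, the feed-forward recovers that input exactly.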
Re: FRC971 Spartan Robotics 2016 Release Video
You set your wheel velocity to 640 in your shooter code. I can't figure out what unit of measure this 640 is in; RPM would be too slow, while RPS would be too fast. What unit do you use, and is it related to your model based calculation of everything? (Do you have any tips for starting to learn how to run everything using models?)
Re: FRC971 Spartan Robotics 2016 Release Video
Quote:
Quote:
u = Kp * x + Kv * v + voltage_error

So, if voltage_error = 0 (the model is behaving as expected), then you don't add anything. Ask again if I read too fast. Quote:
Quote:
To solve that all correctly, you'll want to use a Model Predictive Controller. They are able to actually take saturation into account correctly. Unfortunately, they aren't easy to work with. We haven't deployed one to our robot yet. (Go read up on them a bit. They are super cool :) That was one of my favorite classes in college.)

It's been another busy week. Sorry for taking so long. I started a reply a week ago and couldn't find time to finish it.