Team 2910 Jack in the Bot is excited to release the CAD and Tech Binder for our 2023 robot, Phantom!
Please feel free to ask any questions.
Is the Onshape link broken for anyone else, or is it just me?
I can’t access it either
Sorry about that, the Onshape link should work now!
Thanks for releasing this.
We look forward to building the 359 version this summer and competing (if we get in) at Chezy Champs and other off-season events.
Mahalo,
Glenn
Some questions:
Looks like you went all-in on AprilTags (as opposed to reflective tape). Can you explain a little more about your algorithm for automated scoring? Did the two angled cameras try to find a certain XY target for the AprilTags when scoring, or were you relying entirely on the global field pose / localization estimate? How accurate did you find it (how often did the drivers have to correct it)?
We wanted our arm to move in under 1 second like yours, but had trouble with the acceleration causing the robot to tip. We got a bit of extra speed by carefully timing the joint movements to be out of sync with each other, but never quite got all the way down to 1 second. Did you have similar issues, and how did you solve them?
I haven’t seen four motors on a single joint since 1114’s climber in 2019. That’s one way to get things to move fast! No questions, just respect.
It looks like all of your joints used the Falcon incremental encoders, rather than absolute encoders. Is that correct? If so, did you do anything special to reduce backlash in all of the gears (such as shimming the hex bores etc)?
I gotta say, y'all are one of my favorite teams with the small robots. And thank you for making an Onshape document for us this year, much appreciated. Excited to dive into this later.
I’ll let our software folks answer the first two.
Yes, that’s correct; we only used the Falcon internal encoders. We used the REV ION gears, which have a much tighter fit on the hex and rounded-hex shafts than VEX/WCP gears typically do. We did have some gap-filler glue ready, but ultimately didn’t need to do anything there. The tight-tolerance fit helps a lot.
Other than that, it was about making everything as stiff and low-backlash as possible: good tensioning and good structural design of the A-frame (the dead-axle main pivot is part of those design choices).
I can answer the software side of a few of these questions.
During tele-op, our two angled cameras were used constantly to relocalize the robot’s position on the field. Our automated scoring was showing really promising results by the end of the season; see this video.

It works by calculating which cone scoring column the center of the cone is closest to. The center of the cone is derived from a Time of Flight sensor mounted on the inner wall of the intake; this accounts for the cone sitting in an extreme position in the intake, where aligning to the center of the drivebase would cause the cone to miss. The robot then PIDs to the correct Y (horizontal) position and angle, and the driver pulls the joystick forward essentially as a way to “confirm” the alignment. When the robot senses that its X, Y, yaw, and even pitch are correct, it automatically ejects the cone.

To answer your final question about accuracy: we rarely had any false positives because of how strict our thresholding was, but of course that impacted cycle time, and in many cases we found that our driver was able to score faster than the auto-alignment.
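Here is a rough sketch of that column-selection idea (the class, method names, and column coordinates are illustrative assumptions, not our actual code):

```java
// Sketch of picking the nearest cone scoring column, accounting for where the cone
// actually sits in the intake (measured by the ToF sensor). Names and values are illustrative.
public class ConeAlignment {
    // Field-relative Y coordinates of the cone scoring columns for one alliance (example values).
    private static final double[] COLUMN_Y_METERS = {0.51, 1.63, 2.19, 3.31, 3.86, 4.98};

    /**
     * Returns the robot Y setpoint that places the *cone* (not the robot center) over the
     * closest scoring column. The cone center is the robot center plus the lateral offset
     * derived from the Time of Flight reading on the inner wall of the intake.
     */
    public static double selectTargetY(double robotYMeters, double coneOffsetFromCenterMeters) {
        double coneCenterY = robotYMeters + coneOffsetFromCenterMeters;

        // Find the column whose Y coordinate is closest to the cone's current center.
        double best = COLUMN_Y_METERS[0];
        for (double columnY : COLUMN_Y_METERS) {
            if (Math.abs(columnY - coneCenterY) < Math.abs(best - coneCenterY)) {
                best = columnY;
            }
        }

        // Shift the drive setpoint so the cone, not the drivebase center, lines up with the column.
        return best - coneOffsetFromCenterMeters;
    }
}
```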
Yes, tipping was an issue that software tackled. It was important for us to sequence the movement of our joints so that the extension did not move until the shoulder was within some allowable distance of its setpoint. If the arm extends and the shoulder rotates simultaneously and instantly, the arm is fully extended (or close to it) for much of the shoulder’s rotation, which makes the robot much more prone to tipping. I assume this is fairly similar to the out-of-sync joint movements you described. Largely, the rest of the tipping avoidance came from the design side: our bot is fundamentally very hard to tip because of its low CG.
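As a rough sketch of that sequencing (illustrative names and tolerance value, not our actual code):

```java
// Sketch of gating arm extension on shoulder progress. Only the general idea comes from
// the description above; the structure and numbers here are placeholders.
public class ArmSequencer {
    /** Minimal joint abstraction for this sketch. */
    public interface Joint {
        void setSetpoint(double setpoint);
        double getPosition();
    }

    // Only let the extension move out once the shoulder is this close to its goal (example value).
    private static final double SHOULDER_TOLERANCE_RAD = Math.toRadians(10.0);

    private final Joint shoulder;
    private final Joint extension;

    public ArmSequencer(Joint shoulder, Joint extension) {
        this.shoulder = shoulder;
        this.extension = extension;
    }

    /** Called periodically with the desired final arm state. */
    public void run(double shoulderGoalRad, double extensionGoalMeters) {
        shoulder.setSetpoint(shoulderGoalRad);

        double shoulderError = Math.abs(shoulder.getPosition() - shoulderGoalRad);
        if (shoulderError < SHOULDER_TOLERANCE_RAD) {
            // Shoulder is nearly at its goal, so extending now adds little to the tipping moment.
            extension.setSetpoint(extensionGoalMeters);
        } else {
            // Stay retracted while the shoulder swings, keeping the mass close to the pivot.
            extension.setSetpoint(0.0);
        }
    }
}
```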
Let me know if you have any follow-up questions!
Mind sharing what thresholding you ended up using? In addition to that, what kinds of controls did you implement for rejecting tags when localizing (i.e., rejecting a pose that’s X meters different from odometry)? We were pretty happy with what we had by the end of the season, but like you, our driver control was seemingly faster than auto-alignment.
Are you guys still going to use SolidWorks for 2024? If so, what is your plan to replace GrabCAD?
The team is planning on switching to Onshape going forward.
We required the X and Y to be within one inch of the target waypoint, the yaw to be within 1 degree, and the pitch to be within 3 degrees of 0. The pitch check was there to catch edge cases where we come in fast and the robot is tipping forward.
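In rough code form, the check looks something like this (illustrative structure; only the tolerance values above are from our setup):

```java
import edu.wpi.first.math.util.Units;

// Sketch of the "ready to eject" check using the thresholds described above.
public class EjectCheck {
    private static final double XY_TOLERANCE_METERS = Units.inchesToMeters(1.0);
    private static final double YAW_TOLERANCE_RAD = Math.toRadians(1.0);
    private static final double PITCH_TOLERANCE_RAD = Math.toRadians(3.0);

    /** Errors are relative to the target waypoint; pitch is measured against 0 (flat). */
    public static boolean readyToEject(
            double xErrorMeters, double yErrorMeters, double yawErrorRad, double pitchRad) {
        return Math.abs(xErrorMeters) < XY_TOLERANCE_METERS
                && Math.abs(yErrorMeters) < XY_TOLERANCE_METERS
                && Math.abs(yawErrorRad) < YAW_TOLERANCE_RAD
                // Reject the eject if the robot is still rocking forward after a fast approach.
                && Math.abs(pitchRad) < PITCH_TOLERANCE_RAD;
    }
}
```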
Our solution to handling potentially poor vision measurements was to adjust the standard deviations we used for vision in our SwerveDrivePoseEstimator, using this method, which accepts a standard-deviation matrix. We applied a coefficient to the distance of the closest tag to the robot, so vision measurements were trusted much less from far away and much more up close.
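A rough sketch of that distance-based scaling with WPILib’s SwerveDrivePoseEstimator (the coefficient values and the linear scaling here are placeholders, not our tuned numbers):

```java
import edu.wpi.first.math.VecBuilder;
import edu.wpi.first.math.estimator.SwerveDrivePoseEstimator;
import edu.wpi.first.math.geometry.Pose2d;

// Sketch of scaling vision standard deviations by the distance to the closest visible tag,
// so far-away tags are trusted less. Coefficients below are illustrative only.
public class VisionUpdater {
    private static final double XY_STD_DEV_COEFF = 0.05;    // meters of std dev per meter of tag distance (assumed)
    private static final double THETA_STD_DEV_COEFF = 0.05; // radians of std dev per meter of tag distance (assumed)

    private final SwerveDrivePoseEstimator poseEstimator;

    public VisionUpdater(SwerveDrivePoseEstimator poseEstimator) {
        this.poseEstimator = poseEstimator;
    }

    /** Feed a vision pose into the estimator, trusting it less the farther away the closest tag is. */
    public void addVisionMeasurement(
            Pose2d visionPose, double timestampSeconds, double closestTagDistanceMeters) {
        double xyStdDev = XY_STD_DEV_COEFF * closestTagDistanceMeters;
        double thetaStdDev = THETA_STD_DEV_COEFF * closestTagDistanceMeters;

        poseEstimator.addVisionMeasurement(
                visionPose,
                timestampSeconds,
                VecBuilder.fill(xyStdDev, xyStdDev, thetaStdDev));
    }
}
```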
I hope this helps!
- We wanted our arm to move in under 1 second like yours, but had trouble with the acceleration causing the robot to tip. We got a bit of extra speed by carefully timing the joint movements to be out of sync with each other, but never quite got all the way down to 1 second. Did you have similar issues, and how did you solve them?
A big brass bar in the front acted as a counterweight to keep it from tipping. The overall low CG also helped a lot with avoiding tipping, but we did prepare for it as an issue, hence the S.W.A.G. (Super Wide Area of Gravity) bar. In terms of syncing the movements, our design or software people would be better able to answer specifics.
The title page of your tech binder shows a Limelight on the side of the superstructure. Was that later swapped for the Arducam/PhotonVision system?
That is actually our Arducam on the side there. It’s just completely enclosed by 3D printed parts save for, obviously, the lens.
Did you run calculations to check that one #35 chain would be enough to rotate the arm with the torque of the four Falcons? We only had two Falcons on our bottom arm joint, and we still went with two #35 chains for safety.
This? Did you design a Limelight-shaped Arducam mount and just use this as a CAD stand-in? I’m confused.
Oh, that page. Yes, it appears that render still uses an older version of the CAD in which we were experimenting with Limelights. Apologies, I thought you were referring to the cover.
My bad, I thought I was looking at the cover!
Totally not biased, but what motivated the choice of PhotonVision over Limelight, and roughly when was that decision made?