5013 the Trobots 2023 Charged UP Open Alliance build thread

Our arm dimensions are not finalized and will be tightened up a bit more as the horizontal drive system is added, but we do intend to skim the rear frame perimeter. We will not be reaching out in both directions by inches or feet as shown in the first example of Figure 7-3, and we will not be picking up game pieces while any portion of the robot is outside the frame perimeter. We will end up with a momentary protrusion, primarily of electrical wires, only during the transition from the vertical stowed position to the ground pickup position and back. We believe this is well within the rules' definition of momentary and inconsequential, because we will not be completing game actions during the approximate tenth of a second we break the plane. We will absolutely try to minimize this as much as possible, but the value of the counterbalance weight is worth pushing to the frame boundaries.


1-19 Update

We worked with strategy to come up with a uniform naming convention for autonomous paths so that our scouting language and path planning language are consistent.
The Auto Naming

  • Number from 1-9 denotes each column starting at the barrier (field centric)
  • H, M, L denotes High, Medium, and Low placement respectively
  • X represents a failed action (for example 2LX is a failed low, this has some implications because pieces from high can fall in the low nodes)
  • U represents going through the bumper side (opposite side of barrier, which has the wires), O represents going over the charging station (don’t know how difficult this will be), and A represents going through the way close to the barrier (piece that separates the two team zones in community)
  • B represents a cube game piece, while N represents a cone game piece. Pickups are different depending on the game pieces selected at the start of the match, so we need to communicate with our alliance to pick the best possible pieces.
  • At the end of auto, depending on what we want to do, we can either dock or engage. Dock is represented by D, while Engage is represented by E.
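As a quick illustration of how mechanical the convention is, here is a hypothetical helper (not our actual scouting code) that parses a placement token like "2LX":

```python
# Hypothetical sketch: parse one placement token from the auto naming
# convention above, e.g. "2LX" -> column 2, Low placement, failed action.
LEVELS = {"H": "High", "M": "Medium", "L": "Low"}

def parse_placement(token):
    column = int(token[0])        # 1-9, starting at the barrier (field centric)
    level = LEVELS[token[1]]      # H / M / L
    failed = token.endswith("X")  # X marks a failed action
    return {"column": column, "level": level, "failed": failed}
```

The route (U/O/A), game piece (B/N), and end-of-auto (D/E) letters could be handled the same way.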

We also set up the test board to work on PhotonVision calibration.

Calibration did not go well. In November we tested the OV9281 cameras and were happy with them; we were seeing detection at 1280x800 in the 15-25 ft range. Today one camera we were testing was getting maybe 8 ft, and the other under 4 ft. This was in 2D mode, but the results for detection in 3D mode were similar. We noticed the images were blurry and tried cleaning the lenses, but it didn't seem to help much. Even if we can clean or replace the lenses, the dirty nature of competition makes this concerning. Is anyone else having similar issues?

CAD work has continued on the electrical layout. This is the most CAD complete robot we have ever done. We will see returns on this effort for sure.

We did a test print of the arm extension spool. We tried a new filament, DURAMIC PLA Plus Pro, and were impressed with it. We printed the spool at 40% infill. We decided that on the new printer we need to print items with overhangs with the doors to the chamber open, and maybe at a lower temperature, since the overhangs were a bit messy. Since the Duramic claims to be 8 times harder than normal PLA, we introduced our students to the term failure testing. We tried squeezing it in a wood vise and barely dented it. We moved to the metal vise and were able to puncture it a little where it had a pressure point. Significant hammer strikes on the surface only created small dents. I regret not video documenting the process.

With the dimensions decided, mechanical started building the frame. However, the students involved had missed the section of the meeting where the arrival of the grid pattern Max Tube was announced, so they were using standard 2x1. We will rectify that Saturday.


You should be able to adjust the focus of your lens by twisting it in/out. Maybe you knocked it out of focus?


Thanks, this was the case.


1-20-2023 Update

Busy day today assembling the drive train and base electronics.

They had a moment of panic when everything fit together correctly. Have I mentioned this is our most CADed robot ever? So ABC: Always Be CADing.

Mechanical/CAD has also refined the pulley for the telescoping arm to use a Thrifty aluminum insert.

Carpentry finished our field elements and did some drop testing of cones from the substation. After 80 drops only 7 fell over. They did capture a slow motion video of a drop.

Programming had a better day today as we got the calibration on PhotonVision tuned in. We can detect an AprilTag at 28 ft! 1280x800 running at 20-25 fps. Thanks @Richard_Sims for the help on that.

We found that using a music stand to hold the calibration target provided a much better calibration than a human holding it. Also, sometimes a cell phone flashlight helped detect the calibration grid better.

We also tested out the Limelight MegaPose feature and liked it as well. We are inclined to use our Limelight with a Coral Edge for game piece detection and alignment for cross-field pickup.


You might have to change permissions on this - It won’t let me view it as I don’t have the right permissions


28 ft is impressive. We have not dug into fine tuning on our Limelights yet; I would like for us, at a minimum, to get a good pose solution in auto on the game piece side of the charge station. It seems like you guys will have no problem achieving that. For us, grabbing the cone properly is very important.



We have CADed and test printed a case for the AndyMark potentiometer with hardware kit. The reason to design our own was so we could have a longer draw length (48 inches). We also modified the mounting some.

We also 3D printed test fits of our motor bracket and frame brackets for the A-Frame assembly. We do plan on cutting them out of aluminum for competition.

Our students decided to do more destructive failure testing on a part that didn't have the right hole pattern. We are again impressed with the Duramic PLA Plus Pro filament.

Also in mechanical the students painted some Max Tube and replaced the frame of the robot.

Work continued on cutting parts for the Everybot intake.
A side project was started to investigate making a Playing With Fusion-style telescoping arm, in case we end up being unhappy with the CF springs in the Thrifty arm.

Next week we plan to get the A-Frame assembled

We finished the code to update the drive train SwerveDrivePoseEstimator with the values from the PhotonVision cameras.

We experimented with the Path Planner SwerveAutoBuilder instead of the custom factory we had used last year.

We also spent time discussing what was needed for aligning on positions in teleop. We are considering using Path Planner on-the-fly generation. This would require some logic to pick waypoints that go around the charging station and end in front of the desired position. We decided the best way to handle the points was to give them names and store them in a map. One issue with a map is that the field is not symmetric for red and blue. Our answer was to store the blue values in the map and write code that checks the alliance and transforms the coordinates. The x values stay the same, the red y would be 8.2296 - y, and the angles would transform to 2*pi minus the blue angle.
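A minimal sketch of that idea (waypoint names and values here are made up; the 8.2296 m field width is the value from the text):

```python
import math

# Hypothetical sketch: store waypoints in blue-alliance coordinates and
# mirror them for red. Per the scheme above, x is unchanged,
# red y = 8.2296 - blue y, and red angle = 2*pi - blue angle.
FIELD_WIDTH_METERS = 8.2296

BLUE_WAYPOINTS = {
    # (x, y, heading_radians) - illustrative values only
    "charge_station_left": (2.2, 4.7, 0.0),
}

def to_alliance(name, is_red):
    x, y, theta = BLUE_WAYPOINTS[name]
    if not is_red:
        return (x, y, theta)
    return (x, FIELD_WIDTH_METERS - y, (2 * math.pi - theta) % (2 * math.pi))
```

Keeping only blue values in the map means there is a single source of truth for each position, with the mirroring applied in one place.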

We did manage to get a Limelight 3 ordered. You may wonder, if we are detecting pose from AprilTags at 28 ft (yes @Emerson1706, it really is 28 feet), why we would want a new Limelight. We have a Coral Edge from Axon experimentation last year, and we plan to use it with the neural pipeline to detect and line up on game pieces. We are glad to see @Brandon_Hjelstrom making impressive progress with this new feature.

We have a new video series this year on our YouTube Channel called Tiny Mic Interviews

Our Social Media Presence


If you end up making one, I’d be very interested in how easy it is to make.


Hey everyone! I have some updates on our awards, PR, and spirit teams.
Awards: This week we finished our Woodie Flowers essay, and after some final proofreading we plan to submit it this Tuesday. We have also gotten through around 5 of 13 executive summaries for the Impact Award. In the upcoming week we plan to finish a good rough draft of the executive summaries and start on the actual Impact essay. We are also going to submit for Dean's List, but because I am not a mentor, I don't really have any updates on that end.
PR: We have started a talk show, the Trobot Tiny Mic Interview. Each week a member of the PR team holds a short interview with a subteam for an update on what they are doing. These videos are less than two minutes long and are posted on our Instagram, Facebook, and YouTube. I believe that the first two episodes are linked in the thread above.
Spirit: Our seasonal logo for 2023 is almost finished. There are some final adjustments that need to be made before releasing it.
I’m pretty sure that’s everything. Feel free to ask if you have any questions or want clarification. I’ll try to get back to you as soon as possible. :smiley:


The robot assembly can be found here


I have a couple more questions on the bearing blocks for the thriftybot tube.
How are/were you planning on retaining the bearing shafts? Also, are those R4 bearings?


I don’t know the series of the bearings, but the shaft is an interference fit, so friction should hold it in place.


2/6/2023 update
Due to encoder issues we swept away the last remnants of the old republic (SDS swervelib) and switched to 364's BaseFalconSwerve, though it is more of a template than a library.

We did discover a few issues we thought needed addressing.
First, it didn't support CANivore. We addressed that by introducing a CanPort type, which has an id and a bus name, and passing it down to the motor controller and encoder creation instead of an int.
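The shape of that change, sketched in Python rather than our actual Java (names here are hypothetical):

```python
from dataclasses import dataclass

# Hypothetical sketch of the CanPort idea: bundle the device id with the
# CAN bus name so CANivore-attached devices can be constructed correctly,
# instead of passing a bare int down to device creation.
@dataclass(frozen=True)
class CanPort:
    device_id: int
    bus_name: str = "rio"  # default to the roboRIO's built-in CAN bus

def make_motor_controller(port):
    # stand-in for constructing a motor controller from (id, bus name)
    return (port.device_id, port.bus_name)
```

The default bus name keeps existing non-CANivore device declarations working unchanged.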

The second was that upon bootup all wheels would do a 360-degree change. That was due to setting the sensor position on the Falcons to a negative value. So we added a method to SwerveModule.java to make a positive angle between 0 and 360 and modified the resetToAbsolute method to look like the below:

    public double makePositiveDegrees(double anAngle) {
        double degrees = anAngle % 360;
        if (degrees < 0.0) {
            degrees = degrees + 360;
        }
        return degrees;
    }

    public void resetToAbsolute() {
        double absolutePosition = Conversions.degreesToFalcon(
            makePositiveDegrees(getCanCoder().getDegrees() - angleOffset.getDegrees()),
            DrivetrainConstants.angleGearRatio);
        mAngleMotor.setSelectedSensorPosition(absolutePosition);
    }

The third issue was that there was no turn optimization, so we added an optimize method that came almost directly from the old SDS code, along with an overloaded makePositiveDegrees:

    public double makePositiveDegrees(Rotation2d anAngle) {
        return makePositiveDegrees(anAngle.getDegrees());
    }

    public Rotation2d optimizeTurn(Rotation2d oldAngle, Rotation2d newAngle) {
        double steerAngle = makePositiveDegrees(newAngle);
        steerAngle %= 360;
        if (steerAngle < 0.0) {
            steerAngle += 360;
        }

        double difference = steerAngle - oldAngle.getDegrees();
        // Change the target angle so the difference is in the range [-360, 360) instead of [0, 360)
        if (difference >= 360) {
            steerAngle -= 360;
        } else if (difference < -360) {
            steerAngle += 360;
        }
        difference = steerAngle - oldAngle.getDegrees(); // Recalculate difference

        // If the difference is greater than 90 deg or less than -90 deg the drive can be inverted
        // so the total movement of the module is less than 90 deg
        if (difference > 90 || difference < -90) {
            // Only need to add 180 deg here because the target angle will be put back into the range [0, 360)
            steerAngle += 180;
        }

        return Rotation2d.fromDegrees(makePositiveDegrees(steerAngle));
    }

Over the off season we will most likely work on a project to make this more of a library than a template, so we won’t have specifically named constants referenced in the library portion of the code.

We continued work on vision with the Limelight. We will be using a Python/OpenCV pipeline for color detection instead of the neural pipeline; we can detect both cones and cubes in the same pipeline, and we believe this will perform better in terms of both speed and reliability. The script is almost like the example except we have two threshold ranges. We used the Limelight retroreflective pipeline interface and the eyedropper tool to determine the HSV values, spread out the sliders until the whole game piece was included, then copied those values into the HSV thresholds below.

import cv2
import numpy as np

# global variables go here:
testVar = 0

# To change a global variable inside a function,
# re-declare it with the 'global' keyword
def incrementTestVar():
    global testVar
    testVar = testVar + 1
    if testVar >= 200:
        testVar = 0

def drawDecorations(image):
    cv2.putText(image,
        'Limelight python script!',
        (0, 230),
        cv2.FONT_HERSHEY_SIMPLEX,
        .5, (0, 255, 0), 1, cv2.LINE_AA)

# runPipeline() is called every frame by Limelight's backend.
def runPipeline(image, llrobot):
    img_hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
    # Two HSV ranges: one for cones, one for cubes
    cone_threshold = cv2.inRange(img_hsv, (10, 200, 110), (40, 255, 215))
    cube_threshold = cv2.inRange(img_hsv, (110, 120, 40), (150, 205, 190))

    threshold = cone_threshold + cube_threshold
    contours, _ = cv2.findContours(threshold,
        cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    largestContour = np.array([[]])
    llpython = [0,0,0,0,0,0,0,0]

    if len(contours) > 0:
        cv2.drawContours(image, contours, -1, 255, 2)
        largestContour = max(contours, key=cv2.contourArea)
        x,y,w,h = cv2.boundingRect(largestContour)

        llpython = [1,x,y,w,h,9,8,7]

    drawDecorations(image)
    # make sure to return a contour,
    # an image to stream,
    # and optionally an array of up to 8 values for the "llpython"
    # networktables array
    return largestContour, image, llpython

Started wrist/intake code. We determined the wrist angle will need to be calculated against the arm angle, running a PID to keep the wrist at a constant angle to the floor. The wrist setpoint will be 180 minus the arm angle, plus a yet-to-be-determined fixed offset optimal for pickup.
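The relationship described above is just a setpoint calculation; a sketch, with the pickup offset left as a placeholder until we measure it:

```python
# Hypothetical sketch: hold the wrist at a constant angle to the floor by
# computing its setpoint from the arm angle, all in degrees. The PID then
# drives the wrist to this setpoint as the arm moves.
PICKUP_OFFSET_DEGREES = 0.0  # placeholder; the optimal fixed angle is TBD

def wrist_setpoint(arm_angle_degrees):
    return 180.0 - arm_angle_degrees + PICKUP_OFFSET_DEGREES
```

As the arm rotates, recomputing this every loop keeps the wrist level relative to the floor.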

We verified the PathPlanner trajectories still work with the new code, and started working on new PathPlanner trajectories

Finally, we added a GitHub workflow to verify the code compiles on push and on pull request creation.

The link for our robot assembly has changed since our initial post. I mentioned above but want to reiterate the main robot assembly is here.

Arm work continued.

We showed the students how to use heat-set inserts to make fastening locations in 3D printed parts. They seem to be mostly metric sizes, but an M5 insert accepts a 10-32 quite readily.

We discovered and fixed that the hole in the arm bearing blocks was in the wrong orientation. We also made the outside portion of the large block 1/16” wider and the inside of the small block 1/8” narrower, because we decided to use 1/8” wall tube in place of the 1/16” wall.

Designed some pillow blocks for bearings for the arm.

We built a new battery cart that will hold 10 batteries, it has 6 2A charger ports and 4 6A charger ports.

We completed the EveryBot intake

We also would like to thank @DChilson for his generous contribution of a new horizontal band saw. He could no longer stand the cut quality on the radial chop saw.

Worked on the Woodie Flowers essay.
The head mentor and the PR mentor worked on our Dean’s List essays.

Released the Week 3 Tiny Mic Interview.

Finalized the T-shirt design.

Started creating buttons for competition.

Created some fake data using the scouting app and are working on tableau visualization.

Timelines are getting tight and the crunch is starting!


2/20/2023 Update


Started working on on-the-fly path generation to line us up with the grids and nodes for both alliances using the AprilTags.

We’re also working on detecting game pieces with one of our Limelights. We’re using a Python OpenCV color pipeline, and we’ve set a range of colors to detect both cubes and cones.

Finally, worked with the drive team to finalize button mapping on our controllers.


We mounted the arm (again) after we finished cutting/printing/painting a bunch of parts. We also tested arm extension and it worked! Now we’re refining our intake and wrist. Hoping to have a bot ready to practice with very soon!

Also, after hearing about the problems that 3467 experienced with their SDS MK4i modules, we have ordered the bolts they recommended and are planning on reassembling our modules after they arrive.


After creating some imitation pit + match scouting data, we’ve started working in Tableau for our data visualization. We’re aiming for three different dashboards - one for our first 2-3 qualification matches, one for the rest of our match strategy, and one for alliance selection.

PR and Awards

We’ve kept uploading our weekly Trobot Tiny Mic interviews, which can be found on our Instagram page. Additionally, we’ve submitted all of our awards essays for the season! Now we’ve started to prep for the presentations.


One of our alumni donated an Ender 3 V2 for us to use during the season, which helps speed up our part production process!

Additionally, we’ve started working on creating a custom driver station tray this year. We’ll share pictures once we have something more finalized.

That’s all for this update!


This week, a couple weeks later than intended, programming finally got the bot.

We ran SysId on our arm using the integrated Falcon encoders (even though in the end we will use a REV Through Bore for our arm position).

Then we realized we didn't have the REV Through Bore coded properly to get an angle from the duty cycle, and the duty cycle encoder class didn't handle inversion, which we needed since both of our encoders ended up inverted. So we wrote a little class to handle it: RevThroughBoreEncoder.java.
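The core of that conversion, sketched in Python (our actual class is Java wrapping the duty cycle encoder; names here are hypothetical):

```python
# Hypothetical sketch: convert an absolute duty-cycle reading (a fraction
# 0..1 of one rotation) to degrees, handling inversion and a per-mechanism
# zero offset.
def duty_cycle_to_degrees(fraction, inverted=False, offset_degrees=0.0):
    if inverted:
        fraction = 1.0 - fraction  # flip the direction of increasing angle
    return (fraction * 360.0 - offset_degrees) % 360.0
```

Putting the inversion and offset in one place means the rest of the arm and wrist code only ever sees a consistent 0-360 degree angle.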

It turns out that our gains must not be right, or we are doing something wrong, as the arm doesn't even want to stay in position. I suspect the gravity feedforward of 0.051777 was way too low.

We do have the wrist keeping an angle to the floor, somewhat; it needs tuning as well.

We needed to take a break in the middle of the day so mechanical could alter the motor mount to allow for chain tensioning.

The pressure is high at this point, as we don't have a controllable arm, and we don't have camera mounts for our AprilTag positioning yet. We have two and a half weeks.


PR and Awards update
PR: We have started button production for our 2023 regionals. We have made more episodes of Tiny Mic Interview. Our Tiny Mic broke, so we will order a new one.
Awards: We submitted our Impact essay and summaries a few weeks ago. In the meantime, I have been working on our Impact documentation form, presentation, and video.


Programming update 3/12/23
Due to the arm and other issues, most of the week was spent fixing the mechanisms on the bot.
On Saturday, we finally got the bot back from mechanical. Unfortunately, we discovered even more major issues. Time is getting extremely restrictive for the team, especially for us programmers.

The first issue was that our wrist wasn't adjusting automatically as it did in previous driving practices; sometimes the arm would just randomly slam into the ground or move in an uncommanded way. This was fixed by our mechanical and electrical folks.

The second issue was that one of our swerve modules had mysteriously turned 90 degrees with respect to the other modules. We fixed this by re-zeroing the swerve modules.

The third issue was that when we mapped buttons to extend and rotate the arm to a desired location, such as a node or a substation, the extension would work but the rotation never happened. This is still a work in progress, and we hope to fix it by our first competition.

Additionally, we need to get the autonomous routines working, but we haven't had much time to work on them. We do have a PathPlanner autonomous framework set up, so I believe we should be fine; we just need to integrate our commands, like placing game pieces and creating an X with our swerve modules to dock or engage, into the system.
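The wheel "X" command is just four fixed module states; a sketch of the idea (not our actual command code):

```python
# Hypothetical sketch: point each module toward the robot center so the
# wheels form an X, resisting pushes while docked on the charge station.
# Order: front-left, front-right, back-left, back-right.
def x_stance_states():
    angles = [45.0, -45.0, -45.0, 45.0]
    return [(0.0, a) for a in angles]  # (speed_m_per_s, steer_degrees)
```

With zero drive speed and the steer angles locked, the modules brake against sideways pushes from any direction.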


Before Competition

We worked furiously to get everything in working order and to make sure we had enough weight to not tip.

We found our clamping blocks were too weak in Duramic PLA, so we went to Overture Nylon.

Programming had a little time to get setpoints for arm extension and rotation for mid height as well as the double substation.

We didn’t get the constant-force springs double wound, so our extension needs to happen with the arm down, and we didn’t go for high.

The team took a brief moment to finally name the robot Goblin.

On practice day we identified that we were losing connectivity to the arm and wrist encoders. We identified the PWM-style extension cable as the weakness, cut the ends, and soldered the extensions on. We also added code to check whether the sensors were connected before attempting to move the arm or wrist off of encoder data.
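That guard is conceptually simple; a sketch (hypothetical names, not our actual code):

```python
# Hypothetical sketch: only run the arm/wrist off encoder data when the
# sensor reports as connected; otherwise hold the last position and flag
# the fault rather than acting on garbage readings.
def next_command(sensor_connected, desired_angle, last_angle):
    if not sensor_connected:
        return {"angle": last_angle, "fault": True}  # hold and report
    return {"angle": desired_angle, "fault": False}
```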

On practice day our plastic clamping blocks were failing; we replaced them several times, but that ate up time we could have spent on the practice field working on autos.

Our nylon clamping blocks continued failing come Friday. It was a hectic time in the pit, replacing them repeatedly. We ran out of spares, and it was looking pretty rough for us when, riding over the horizon, came Jon Smith (@jon.smith) from the 1987 Bronco Bots. He went to 1730 Team Driven and got a 1/4-inch aluminum plate, then rounded up some help. With our mentor @DChilson and Jon's team, we were able to manufacture aluminum blocks, which held for the rest of the competition. Without Jon's help it would have been a sad competition for us. Jon and all who were involved with helping us are shining examples of Gracious Professionalism and the spirit of Coopertition.

After that, we finally had time to work on and test auto balance. We didn't get it all worked out, but it was good enough that we got 3 of 4 in playoffs, mostly because auto ended and shut us off in the right place while we were oscillating.

We ended up in the 5th captain spot, but after selection we were the captains of the 3rd-ranked alliance. Our scouting alliance with 2357 System Meltdown produced some good data, which led us to choose 5119 and 1769. We think we got a steal on both picks. We fought our way through but took our second loss in the fourth round of playoffs. Congratulations to the winning alliance: 1987, 3184, 2457.

Defense isn’t our usual position, but I want to call out some incredible defensive driving by our drive team.

We would like to apologize for any and all bent frames and robot damage caused by our tiny sledge hammer.

I would like to congratulate our programming lead Tyler Dinh (@tylurr) on his Dean's List Finalist award. Tyler has been an asset to our team and a help to many of our students. Congratulations! We were also thrilled to see our good friends on 1764 also had a Dean's List Finalist in Makena Dickens.

We now have 3 weeks till Tulsa.
Mechanical plans on replacing the chain with #35 chain and doubling the CF springs.
Programming needs time!
Once the CF springs and chain are done we need to recharacterize the arm.
Get vision integrated. We have the individual vision components working: the Limelight can identify game pieces, and the PhotonVision cameras can localize, but those are not tied into any control yet. We need fast alignment on game pieces with the Limelight, and we need lineup on the nodes with PhotonVision and Path Planner.
Get autonomous scoring and balance.

A busy few weeks ahead.


Late Saturday, leads sat down with the mechanical team for a focused After Action Review to help prioritize the tasks between now and our next competition in Tulsa. @Bmongar cannot overstate the importance of maximizing programming's time with the robot; if we fall into the trap of the mechanical team fixing everything we want, we would never hand over the bot. We listed the items we consider "must-fix" and then prioritized them by what needs to be done for programming to do their work versus what we can work on when we have time.

We’ve determined that only #1 and #2 on the list must be done for programming to build accurate routines; for the rest, the parts will be built and made ready, but swapped in only when time allows. We believe this prioritization will help us maximize our precious few build days between now and Tulsa. We all know how quickly they go by.

For anyone wondering, Bamm-Bamm is a ghost in the machine that programming has been trying to find. Every once in a while our arm will smash the intake into the floor with as much force as it can muster, with devastating results. The latest attempt to locate Bamm-Bamm has been mostly successful, as we finished Heartland without a single recurrence on the field.
