FRC 1257 Parallel Universe | 2024 Build Thread | Open Alliance

Hello and welcome to Team 1257 Parallel Universe’s Open Alliance Build Thread! This is our first season on Open Alliance, and we’re excited to join this great community, learn from other teams, and be a resource for them.

So first, a bit about our team. We’re in the Mid-Atlantic District and based at the Union County Vocational Technical Schools. Our rookie year was 2004, but after a few years’ break, the team was revived in 2009. Our team has nearly 150 members, 2 coaches, and 2 mentors.

Team Structure:

Over the offseason, we’ve restructured our team to have better communication and more inter-subteam training.

Our team is divided into Operations, Strategy, Drive, and Technical. We used to have separate Build, Electronics, and Programming subteams, but we’ve now consolidated these three subteams into one, the Technical subteam. Additionally, this is our first year where we are putting a focus on documentation, so we have added documentation roles to our team structure.

Goals for 2024:

  1. To improve the sustainability of our program. This goal is broken up into categories:
  • Financial Sustainability: By forming long-term relationships with sponsors, creating new materials to secure sponsorships, applying to more sponsorship and grant programs, and starting a team member fundraising requirement, we can improve financial sustainability.

  • Knowledge Sustainability: By creating more cohesive documentation and transferring our current documentation into an online repository, we can improve the sustainability of our knowledge.

  • Equipment Sustainability: By replacing and upgrading our current technical equipment and supplies, we will have more resources for prototyping and our robots will last longer (avoiding subsystems breaking at every competition).

  2. To become more involved in the robotics community through platforms like Chief Delphi, and also by participating in more robotics community events.

  3. To make our team safer by redesigning our pit frame, pit tool chests, and robot cart with an eye towards safety.

  4. To have more of an impact on our community by hosting more STEM events, scout troops, summer camps, and possibly beginning to mentor FLL teams.

Impact:

  • RVSTS: Our biggest impact event is the Raritan Valley Science & Technology Showcase, where we dedicate one weekend each year to showcasing STEM at the Bridgewater Commons Mall in New Jersey. We co-hosted a FIRST Robotics Competition showcase and had teams set up STEM booths and rooms. A few examples of these booths include the robot driving station, an FLL exploration room, STEAM Stations hosting interactive mini-projects, and an Investigation Alley featuring booths from local STEM retailers and summer camps.

  • Robotics Summer Camp: This past summer, we ran a robotics summer camp at a local library, where we used Lego robots to teach the engineering design process and block programming, and taught elementary and middle school students how they could become involved in FIRST.

  • Scouts: Additionally, we host various Cub Scout and Girl Scout troops at our makerspace, where we teach them about robotics and teamwork.

  • …And we’re looking forward to having more of an impact in 2024!

We’re really excited to contribute to Open Alliance!

— The snails


Also, here are some of our links to our website, training, and code. We’ll add more as the season progresses.

Old Website (We’re currently making a new website)

New Website: To come

GitHub

Offseason Training:

2024 Robot CAD: To come

Hello robotics community! Team 1257 just hosted our 2024 kickoff to start off the build season. Here are some things we did to make this event memorable and fun as well as efficient and productive!

Pre-livestream

  • Team building: Our operations team did a wonderful job running team-building games such as icebreakers, Blooket quizzes, and more.

  • Team spirit: We made and operated our own 1257 photobooth where members were able to take photos with their friends to share!

Post-livestream

  • Strategy: Before we started brainstorming ideas, our strategy team discussed some things we should take into consideration while approaching the design process.

    • Do we want to focus on scoring points in the amp, speaker, or both?

    • Do we want to be able to climb?

    • Do we want to be able to score on the stage?

  • Brainstorming: Due to the large size of our team, we had to come up with an efficient way to share our ideas and collect them. We hung poster paper on the walls and allowed members to form groups and brainstorm their ideas on the paper.

    • All of these ideas were collected into a brainstorming repository so the whole team can have access to the ideas in one place. If anyone comes up with new ideas over the weekend before our next meeting, they can upload more ideas to the repository.

    • Many alumni and mentors came to help brainstorm ideas as well.

Design Week

  • Next Monday we will be splitting off into formal design groups, each containing build, programming, and electronics members. Each group will design and maybe even prototype an idea for the robot.

  • On Thursday we will conclude our design week by having each group present their ideas to our leadership.

  • Leadership will use these ideas and continue to brainstorm over the weekend and prepare the final design by Tuesday the 16th.

Hello robotics community!

Since Kickoff we have divided our technical subteam members into design brainstorming groups. To keep each group well balanced, every brainstorming group contains a documentation lead, 4-5 build members, a strategy member, 3-4 programming members, 3 electronics members, and 1 person who studied advanced CAD during the offseason. We currently have 8 of these design groups. Each group used Monday’s and Wednesday’s meetings to brainstorm, and tomorrow, Thursday, they will present their ideas to the leadership team.

Field Elements

In addition, we have a group of upperclassmen in charge of creating game elements for us to test our robot with. We will construct our game elements from cardboard and some wood for cost effectiveness.

  1. Amp: The amp will be a wooden rectangular prism with a rectangular hole cut into the front side.
  2. Speaker: The speaker will be made of cardboard. It will be adhered directly to the wall using command strips, so it is easy to assemble and disassemble.
  3. Subwoofer: The subwoofer will be a wooden frame on the ground. The slanted walls of the subwoofer are not necessary for testing so we will not be making those in the interest of time and efficiency.

Robot Design Ideas

Our leadership team has also been brainstorming our own ideas!

  • So far, we are considering using a two stage elevator shooter that can pick up notes from the floor or from the source, then shoot into the speaker. We might use a cam beneath the elevator to adjust the pivot angle of the shooter. On the other side of the robot, we would have another mechanism for the trap and amp which we are still debating. We would also use a one-stage elevator for the chain.

  • Another idea we really like is a mechanism similar to the Unqualified Quokkas Ri3D with a combined intake and a shooter on the other side mounted on a single rotating arm. The arm can also be used for the climb. This design would give us more room in the robot to explore other mechanisms and strategies.

Note for the future:

Tomorrow, after the design groups present their work, we’ll give an update on some of their ideas. Our design deadline is Tuesday, January 16, so we will have a definite direction for our robot by then. Once we have the broad strokes design, we’ll first create a Crayola CAD to see the basic measurements and mechanisms, then start a more complex CAD in Fusion 360 to share in Open Alliance and for documentation reasons.

We’re starting off the season on a high note!


Hello robotics community! Over the weekend our leadership did a lot of brainstorming and talking and we have settled on our final robot design. We were heavily inspired by the Ri3D design by the Unqualified Quokkas. We really liked how they had a combined intake and shooter, and we thought the pivoting arm was very efficient. However, there were still some other things we needed to take into consideration.

External vs. Internal Intake

  • The Quokkas’ design featured an external intake. However, accidents like crashing into walls and other robots risk damaging the intake. Since the intake would double as a shooter, it would be a very critical point of failure. There were two different options we were considering in order to solve this problem:
  1. Create an internal intake which would essentially pull the notes up from under the robot. This intake would then pass the notes into the shooter arm. An issue with this idea is that we would lose height on the arm, having to decrease its size to fit within the boundaries of the chassis. However, we had a few workarounds to this problem, such as adjusting the size of our chassis and the height of our pivot point.

  2. Keep the external intake and focus more time during the season toward training drivers better so we would be less likely to crash on the field. While this still runs the small risk of crashing, we already had plans in motion to train better drivers, so this plan would be more feasible for us to do.

  • Since option 1 would likely lead to more limited space on the robot and make things very difficult to build, we ended up keeping the external intake.

Climb Built into the Pivot vs. Telescoping Climb

  • The Quokkas’ design features a climb hook built into the pivoting arm. While this is very efficient in that it is aligned with the center of mass of the robot, the hook is at a fixed height. If we wanted to achieve Harmony on the climb, this type of climb would make it difficult. If an alliance robot were already hanging from the chain, the height we would have to climb would be higher, and there isn’t much flexibility with the climb being built into the pivot.

  • We were considering using two telescoping arms on either side of the pivot arm as a climb, but this mechanism would be difficult to build and again, limit space on the robot.

  • In the end we decided to keep the climb built into the pivot, but we are still brainstorming different hook locations on the pivot that would make Harmony possible.

Horizontal vs. Vertical Flywheels

  • The Quokkas’ design has vertical flywheels. We noticed that the path of the note was not horizontally consistent because of this; the note had a tendency to flip mid-air. The notes also flew much farther with horizontal or angled flywheels. A potential strategy is to collect notes from the source then launch them towards our alliance partners closer to the amp and speaker. As such, we wanted to incorporate non-vertical rollers so we could launch the notes farther. We’ve decided that we want to incorporate horizontal flywheels in our design, but we are still figuring out the specifics. Stay tuned!

Timing Belts vs. Chain

  • One of our mentors suggested that we use timing belts instead of chains in this year’s design. The main con of chain is that it’s hard to work with and can bind easily; the main pro is that large sprockets are easier to find for it. The con of timing belts is that they can slip and potentially break under high torque. We’re still debating this decision, but we’re currently leaning toward using timing belts instead of chains wherever plausible, and we’ll make a final decision by the 25th.

This Week

  • This Thursday, we’ll present the design to the rest of the team. By Thursday, we’ll make a Crayola CAD to envision the design, and after that we’ll start a more detailed CAD. We are currently ordering supplies and we’ll start robot construction next Monday.

Hello robotics community!

We’ve made a lot of progress this past week, including making key design decisions, making a starter CAD, starting our code, dividing and planning in subsystem teams, and making progress constructing our field elements. Also, our team made a huge financial decision to buy swerve for this season! We have wanted swerve for years to improve our autos and cycle time, and this year we decided that swerve was critical to the game and we made it a priority. We have bought all of the swerve supplies and we plan to have our chassis fully assembled within six days by 1/26.

Our Robot Design (Initial CAD)

  • Amp Scoring

  • Speaker Scoring

Robot Design Explanations

In the end, our team liked and was heavily inspired by the Unqualified Quokkas. We have decided to do a similar design, an intake and shooter attached to a single pivot arm, but with a few key design differences.

  • Intake: Our team decided to use an internal intake where notes will go under the bumper, and then rollers will move the note up inside the chassis. The pivot arm will be shortened so that the notes will be taken in from within the chassis. Two years ago, for Rapid React, our robot had an external intake that was a failure point during competitions whenever another robot hit it. We designed an internal intake to make our robot more resilient.

  • Climb: While the original Quokkas bot used the intake/shooter to pull itself onto the chain, we have been experimenting with other methods. The geometric constraints of the original design allow the Quokkas bot to only climb at the lowest point of the chain, but our team is prioritizing harmony and needs to be able to climb higher. Our current plan utilizes both hooks on the pivoting intake and stationary hooks on the chassis to lift our robot higher.

  • Trap: The original Quokkas Ri3D did not have a trap, but our team plans to implement one. One idea that we’re toying with includes having a scissor arm attached to the back of the pivot arm that places the note in the trap. We currently have a subsystem group of 8 members doing R&D, so more updates soon.

Team Organization

  • We have divided our technical team into subsystem groups including Chassis/Swerve, Field Elements, Bumper, Ground Intake, Intake/Shooter, Pivot Arm, and Trap. Each group contains at minimum 2 leads, 1 documentation liaison, 1 member with CAD training, 2 build members, 2 electronics members, and 2 programming members. One of the goals mentioned in our initial post was to make our technical teams more integrated, and we have done so by merging programming/electronics/build into subsystem groups. Another change we’ve made since last year is forming “subsystem groups” working on projects rather than “subteams,” so that the groups are more fluid and members gain experience working on multiple aspects of the robot.

Prototyping

  • At the latest meeting on Saturday 1/20/2024, we were able to start prototyping our ground intake. The idea is to use three hex shafts with rollers to pull the note in from the ground, bend it, and raise it up at an angle so the shooter can pick it up. One shaft will be controlled by a NEO motor. Gears will help it mesh with the second shaft, and a pulley will be used to connect it with the final shaft.


  • The measurements in the above prototype are still off, so it will need to be rebuilt with more precise measurements. However, this prototype gave us a good visual on what the ground intake may look like.

Now that we have our design chosen, the building is in full swing! Many more updates soon on build progress, prototyping, tests, and subsystem results!


Programming Update

Hello robotics community!

After picking a robot design, our programming team started thinking about how we wanted to organize our robot code, including how to tackle our new challenge: swerve! We also had to plan out driver control methods, vision, autos, and the works. This post will highlight everything our programmers have done so far.

Software Library Decisions

Advantage Kit


Recently, we created our 2024 robot code project! We made the decision to use Advantage Kit to help log data on our robot and improve the real and simulated separation of our code. We hope that this will help us debug problems in our code before we test it on the robot and better understand how our robot works by analyzing log files in Advantage Scope.

In addition, they also have some really good example projects, especially for advanced swerve drive. So we decided to use their template code as the base of our project.
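For teams curious what the “real and simulated separation” buys you, here is a tiny standalone sketch of the IO-layer idea that AdvantageKit’s structure is built around. All class and method names here are hypothetical, and the sim model is deliberately crude; the point is that the subsystem only talks to an interface, so a simulated implementation can be swapped in without touching the subsystem logic.

```java
// Hypothetical sketch of the hardware-abstraction ("IO layer") pattern.
interface FlywheelIO {
    void setVoltage(double volts);
    double getVelocityRadPerSec();
}

// Simulated implementation: a crude first-order model instead of real motors.
class SimFlywheelIO implements FlywheelIO {
    private double velocity = 0.0;

    @Override
    public void setVoltage(double volts) {
        // Made-up gain: 50 rad/s per volt at steady state, approached gradually.
        velocity += 0.1 * (volts * 50.0 - velocity);
    }

    @Override
    public double getVelocityRadPerSec() {
        return velocity;
    }
}

// The subsystem never knows whether it is driving real hardware or the sim.
class Flywheel {
    private final FlywheelIO io;

    Flywheel(FlywheelIO io) {
        this.io = io;
    }

    void run(double volts) {
        io.setVoltage(volts);
    }

    boolean atSpeed(double targetRadPerSec) {
        return io.getVelocityRadPerSec() >= targetRadPerSec;
    }
}
```

On the real robot you would construct `Flywheel` with a hardware-backed IO class instead; the subsystem code stays identical.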

Photon Vision


Last season, our team used Photon Vision to detect AprilTags. We had code that combined the pose estimates from one camera to update our odometry. However, we didn’t end up using it very much.

This year we plan to use two coprocessors: one Raspberry Pi 4 from last year with one camera and a new Orange Pi 5 with two cameras. We will use two cameras for AprilTag detection and one camera on the Orange Pi for note detection. Photon Vision makes it really easy to set up these vision measurements and use them in our code.

The first thing that I worked on was incorporating vision into our project base. I started by using Photon Vision examples to create simulation and real IO classes in our vision subsystem. Then I worked to add dual camera support. We use these values in the pose estimator built in to our drive subsystem. Here is a screenshot of what the simulated camera looks like. Below is a video showcasing vision and pose estimation working in tandem.
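As a rough illustration of what the pose estimator does with those camera measurements, here is a toy model (not the WPILib filter): each vision measurement nudges the odometry pose toward the camera’s estimate by a trust factor, which stands in for the real measurement standard deviations.

```java
// Toy pose estimator: blends wheel odometry with vision corrections.
// All names and the "trust" parameter are illustrative.
class SimplePoseEstimator {
    double x, y; // field-relative position in meters

    SimplePoseEstimator(double x, double y) {
        this.x = x;
        this.y = y;
    }

    // Dead-reckoning update from wheel odometry.
    void addOdometry(double dx, double dy) {
        x += dx;
        y += dy;
    }

    // Vision update: trust in [0, 1], higher means trust the camera more.
    void addVisionMeasurement(double visionX, double visionY, double trust) {
        x += trust * (visionX - x);
        y += trust * (visionY - y);
    }
}
```

The real estimator also handles rotation, timestamps, and latency compensation, but the blending intuition is the same.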

Path Planner


Last season, our team created a customizable auto system using WPILib trajectories. This year, with swerve, we want to use a more powerful system to fully utilize the benefits of our new drivetrain. So we decided to use Path Planner, which was already integrated into our base project.

Path Planner has a nice web UI that we can use to create autos and run commands in between. It also has support for generating trajectories on the fly which we can use to create another customizable auto system for this year. We think this might be really useful in higher levels of play along with our manually created autos. Below is an example of an auto running on the simulated robot.

Features

Advantage Scope

To test the physical movements of our robot in simulation, we wanted to import our robot CAD to Advantage Scope. I separated our robot CAD into drive base and pivot components to use in Advantage Scope. Here is the folder that we created to store our configuration and object files. Below is a picture of the robot in simulation. (the arm isn’t in the correct position but we hope to fix that later)

Another thing we created was a custom Advantage Scope layout for our code. We decided to store this in our Advantage Scope folder as well to allow for easy access for all our programmers.

Auto Aim

We worked on some rudimentary auto-aim code for our drivetrain that controls the angle of our robot as we move around the field. It constantly tries to put the shooting end of our bot at the position of the speaker. It works with a PID controller connected to the robot’s angular speed. Here is a video of it in action.
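The auto-aim idea boils down to: compute the heading to the speaker, wrap the error to the short way around, and feed it to a proportional controller. A minimal standalone sketch, with made-up speaker coordinates and gain (a real version would use a full PID with gyro feedback):

```java
// Sketch of heading auto-aim: point the shooter side at the speaker while
// translating. SPEAKER_X/Y and KP are hypothetical values.
class AutoAim {
    static final double SPEAKER_X = 0.0, SPEAKER_Y = 5.5; // field coords (m)
    static final double KP = 4.0;                          // proportional gain

    // Commanded angular velocity (rad/s) given the robot's pose and heading.
    static double angularSpeed(double robotX, double robotY, double headingRad) {
        double target = Math.atan2(SPEAKER_Y - robotY, SPEAKER_X - robotX);
        double error = wrap(target - headingRad);
        return KP * error;
    }

    // Wrap an angle to [-pi, pi] so the robot always turns the short way.
    static double wrap(double a) {
        while (a > Math.PI) a -= 2 * Math.PI;
        while (a < -Math.PI) a += 2 * Math.PI;
        return a;
    }
}
```

This runs alongside driver translation input: the driver controls x/y, auto-aim owns the rotation axis.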

Path Finding


Initially, we were having some trouble getting path finding with Path Planner working. Path finding allows us to move to positions on the field while avoiding obstacles. This would be useful for moving to specific points of the field like the AMP, SOURCE, or SPEAKER. Instead, we began by generating trajectories to points. The problem with this is that it ignores obstacles on the field, so it can drive through walls and solid objects which is not something we wanted. These trajectories are more customizable than path finding, so we will definitely use them with manually created trajectories to create our customizable auto system.

However, we did end up getting path-finding working this week. The problem was that we were using an older version of the LocalADStarAK.java file. Once we updated that file, we were all set. Below is a video of the robot path finding to the AMP and SOURCE.

Here is a more complete video of our robot in simulation with all the things we can do.

With this drive and vision code down, our team can build on top of this base to accomplish all of our goals.

Testing Swerve

Once our drivetrain was built, we flashed all of our motor controllers and our roboRIO with the latest software. We also zeroed the absolute encoders used for measuring the steering angle and ensured our motor controllers had the right IDs.

The first time we ran our code on the bot the turn motors were jittering and making all sorts of crunching noises. Clearly, something wasn’t working correctly. Thankfully we had a fallback option. We used the REV MAXSwerve template to test our robot and it worked!

Thanks for making this edit, Michael!

Later, we realized the Advantage Kit example code that we copied was meant for a different swerve module, and the encoders the code was looking for were non-existent. So we brought the REV MAXSwerve code into our codebase and prepared to test it on the robot. We’ll test it next week to see if it works. It would really suck if it didn’t :sob:.

Programming Subsystems

We’ve split up our team to work on the Pivot, Intake/Shooter, Ground Intake, Climb, and Trap?!?! mechanisms. Each group has 3-4 programmers on it and they will all contribute to their own subsystem branches in the code. Once they are done, we will merge everything together for our final robot code!

Currently, once we get the basic layout for our subsystems down, we plan to build on top with more advanced and autonomous features in different branches. Hopefully, the code for the subsystems will be done within a week or so and after that we will reconvene to talk about what exactly we want to do for autos. That’s when we’ll create some of our hard-coded Path Planner autos and think about what sort of customizable autos to add in the future.

In addition, we’ll also work together to figure out how to shoot while moving. Right now, we’re going to take it one step at a time, starting with the bare minimum going up to a fancy autonomous system.

Things to do

This week our programming team wants to:

  • Get our swerve bot working with our Advantage Kit code
  • Set up Vision
    • Calibrate April Tag Detection
    • Note Detection
  • Finish Subsystems
  • Test Things we made in simulation
    • Auto-aim
    • Path Finding
    • Pose Estimation
  • Create Advantage Scope Layouts

Here’s the complete list of things we want to do Programming TODO List

Bonus:

Here’s a video I made explaining swerve for our members

I’ll make a higher-quality explanation video later once we get our swerve code working.

Thanks for reading! Have a nice day!


Incredible writeup! Be careful with that music in the YouTube videos… might they get taken down?


I’m not sure how that works. It gave me a copyright warning while I uploaded the videos but still let me upload them. I only added it because I think the music makes the video seem way cooler.

Apparently, the copyright owners allow the content to be used on YouTube which is why it works.


Hello robotics community!

Over the past couple of weeks, our team has worked on the design of the climb. We created a hook that the chain slides into easily, letting the robot get a good hold on the chain. The climb will be attached to either the shooter or the pivot arm; when the pivot arm rotates, the robot can pull itself up off the ground.

Our current design:



Hi everyone!

Here’s a quick update on our intake/shooter:
We finished up the design last Saturday (2/3). The shooter is about 17” wide and takes in notes from the ground intake using rollers. It shoots the note out at -15 degrees with flywheels powered by belts.

Last week, we cut and 3D printed our materials and CNCed 8 plates (4 for each side):


Right now we’re still waiting on the pulleys and for the pivot arm of our robot to be finished, so construction has been a little slow. Here’s our progress so far:


I believe this is possibly just perspective from the photo, but the standoffs you’re using may need to be rechecked. Looks like you might see significant binding. Or is this still just loosely assembled for testing?

Looks good! Figured I’d at least mention it if it wasn’t already obvious.

Standoffs were definitely cut at an angle but we just assembled to have a basic idea of the dimensions

Hello!

Huge programming post incoming. Thank goodness for our snow day today; I couldn’t have finished this post without it lol.

TL;DR generated by ChatGPT: We built subsystems like a pivot, intake, shooter, and LED control for our robot. Used PID loops, tested in simulation, and integrated controller commands. Created a GitHub repo for real logs. Developed a note-shooting visualizer and planned auto routes for our swerve drive. Future plans include finishing the robot, testing, vision tuning, and documentation. Check out our videos and code progress. Thanks for reading!

Programming Update # 2

Making all our Subsystems

Pivot

We created a branch for our pivot subsystem, which uses 4 NEOs following each other to move the entire mechanism. We added PID position and voltage control modes to this system, plus some code to make the PID values Tunable Numbers, which helps while testing. We also added some basic commands and bound them to controller inputs. We ran the robot in simulation to catch some of the bugs before testing on hardware. Below is a video of the robot running in simulation and showing our CAD move in Advantage Scope. It took a while for us to tune our robot configuration in Advantage Scope to look correct.
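For anyone unfamiliar with the Tunable Numbers pattern, here is a rough standalone sketch of the idea: a gain that can be edited live from a dashboard while in tuning mode, with a fixed default used in competition. A plain map stands in for NetworkTables here, and all names are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of a "tunable number": dashboard-editable in tuning mode,
// fixed default otherwise. The map is a stand-in for NetworkTables.
class TunableNumber {
    private static final Map<String, Double> dashboard = new HashMap<>();
    static boolean tuningMode = false; // flip on at the practice field

    private final String key;
    private final double defaultValue;

    TunableNumber(String key, double defaultValue) {
        this.key = key;
        this.defaultValue = defaultValue;
        dashboard.putIfAbsent(key, defaultValue);
    }

    double get() {
        return tuningMode ? dashboard.getOrDefault(key, defaultValue) : defaultValue;
    }

    // Stand-in for an operator editing the value on the dashboard.
    static void put(String key, double value) {
        dashboard.put(key, value);
    }
}
```

A PID gain declared as `new TunableNumber("Pivot/kP", 0.5)` can then be re-tuned between test runs without redeploying code.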

Intake / Ground Intake

Programmatically, both of these subsystems are nearly identical. The only difference is that the intake system has a break beam sensor while the ground intake does not. So, we decided to finish the intake first, then copy-paste the code and remove the break beam for our ground intake code. This allows the ground intake to work in tandem with the intake as commands are executed.

Some notable features are a PID loop for velocity and break-beam-driven commands, like intaking until a note is detected.
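That “intake until it detects a note” behavior is simple enough to sketch standalone. The names below are illustrative, not our actual command classes; the shape mirrors a WPILib command’s `execute()` loop.

```java
// Sketch of "run the rollers until the break beam sees a note, then stop."
class IntakeUntilNote {
    private final java.util.function.BooleanSupplier breakBeam; // true = note present
    private boolean running = false;
    double rollerSpeed = 0.0; // commanded duty cycle, made-up value below

    IntakeUntilNote(java.util.function.BooleanSupplier breakBeam) {
        this.breakBeam = breakBeam;
    }

    void start() {
        running = true;
    }

    // Called once per control loop, like a command's execute().
    void periodic() {
        if (running && breakBeam.getAsBoolean()) {
            running = false; // note acquired, stop intaking
        }
        rollerSpeed = running ? 0.8 : 0.0;
    }
}
```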

We tested these in sim, but didn’t record any videos of it. The only visual way of seeing if it was working was to use Advantage Scope to graph out the intake speed and setpoint.

Much of the code’s structure is based on our virtual 2023 robot.

Shooter

Since we have a very similar shooter design to 6328, we are taking some inspiration from their shooter code. We made sure to include velocity PID loops for both sides and create the relevant commands using it. Currently, we are working on some math to control the speed that our shooter needs to go and the angle of the arm to score shots in the speaker.
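As a first approximation of that aiming math, the arm angle for a straight-line shot from the pivot to the speaker opening is just an atan2. This ignores gravity and note drag, and all dimensions are made up; it is only a sketch of the relationship we are solving for.

```java
// Back-of-the-envelope arm-angle math: treat the shot as a straight line
// from the shooter pivot to the speaker opening. Dimensions hypothetical.
class ShooterAngle {
    static final double SPEAKER_OPENING_HEIGHT = 2.0; // m
    static final double SHOOTER_PIVOT_HEIGHT = 0.3;   // m

    // Arm angle above horizontal (radians) at a given horizontal distance
    // from the speaker wall.
    static double armAngle(double distanceMeters) {
        return Math.atan2(SPEAKER_OPENING_HEIGHT - SHOOTER_PIVOT_HEIGHT, distanceMeters);
    }
}
```

The real calculation additionally has to pick a flywheel speed and compensate for the note’s arc, which is why we are still working through the math.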

LED

We’re also going to have LEDs on our robot controlled by the REV Blinkin. We’ve never used these before, so we were super grateful to find this code with exactly what we were looking for:

We haven’t made any commands or anything yet to control our LEDs, but will do so in the future.

Putting it all together

We put all of our subsystems together in the super structure branch to later combine with our swerve branch and bring into master. A masterfully written post is coming to explain our issues with swerve and how we fixed them.

Here’s a visual of our network graph so you can see the code being merged into the Super Structure branch (blue) and later master (white).

Other Extra Features

Note Shooting Visualizer

Recently, Advantage Scope added support for 3D trajectories, so we thought it would be cool to make a little note-shooting simulator for our robot. Currently, the note moves at a hard-coded speed when launched; later we will pull the velocity from our shooter mechanism. We added a small util section to our code to store all this note visualization stuff. Another feature we need to add is taking the robot’s own speed into account in the note simulation.

Later on, unrelated to simulating shots, we also want to write some more code to simulate notes on the ground for our intake to pick up in simulation.

Here’s a video of it in action

Bonus:

Here’s a video of the trig being messed up in our simulation class.

It was missing code to multiply the horizontal speed by the cosine of the pivot angle.

Here’s a video of it not working when we move the robot around. The code to show the pivot component was also messed up.

The y value needed to be negated. For the arm component, the Pose3d needed to be at the origin of the field not at the position of the robot.
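For reference, the velocity decomposition after the fix looks like this in miniature: a launch speed directed along the pivot arm splits into horizontal and vertical components using the cosine and sine of the pivot angle (the missing cosine on the horizontal component was the bug). Values here are illustrative only.

```java
// The trig fix in miniature: decompose launch speed along the pivot angle.
class NoteLaunch {
    // Returns {horizontalSpeed, verticalSpeed} for a given launch speed (m/s)
    // and pivot angle above horizontal (radians).
    static double[] components(double launchSpeed, double pivotAngleRad) {
        return new double[] {
            launchSpeed * Math.cos(pivotAngleRad), // the cos that was missing
            launchSpeed * Math.sin(pivotAngleRad)
        };
    }
}
```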

Controller Commands

We also added a new Drive Controls util class to our code which will allow us to have different controller modes depending on the driver or operator of the robot. It works by storing our controller and the different inputs we might want from the controller as static variables that are accessed elsewhere.

Then we instantiate these controls depending on the driver or operator.

We added this mainly because we might want to have different controller layouts when we are testing the robot or to customize the controls to what other drivers might want it to be set to.

Then we configure all our commands as normal.
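Here is a rough standalone sketch of that pattern: inputs exposed as static suppliers that get reassigned per driver profile. The profile names and scale factors are made up; the real class binds many more inputs.

```java
import java.util.function.DoubleSupplier;

// Sketch of a Drive Controls util: static suppliers, swapped per profile.
class DriveControls {
    static DoubleSupplier driveForward = () -> 0.0;
    static DoubleSupplier driveStrafe = () -> 0.0;

    // One possible profile: full-speed inputs straight from the sticks.
    static void configureDefault(DoubleSupplier leftY, DoubleSupplier leftX) {
        driveForward = leftY;
        driveStrafe = leftX;
    }

    // Another profile, e.g. for demos or new drivers: same sticks, half speed.
    static void configureSlow(DoubleSupplier leftY, DoubleSupplier leftX) {
        driveForward = () -> 0.5 * leftY.getAsDouble();
        driveStrafe = () -> 0.5 * leftX.getAsDouble();
    }
}
```

Command bindings then read `DriveControls.driveForward` without caring which profile is active.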

Storing Real Logs

Another addition we made was to create a GitHub repository to upload the real robot logs we get to keep them all in one place. Here it is

Autos

With all our subsystems done, we’ve started planning out auto routes. Since it’s our first year with swerve, we spent a while playing around with Path Planner just to see what we could do with it. We knew that we wanted to have as many autos as possible, so for now we split up our group to try to make as many autos during our last meeting as we could. Here’s one example, the rest of our autos are in our robot code repo.

We made sure to use linked waypoints to make sure our paths remained the same everywhere if we changed one point. In addition, we also added some named commands to event markers to run during these paths. These are all the commands we anticipate using for now.

Road Ahead

Our plan

  • Finish the Robot
    • Test Motors/Encoders/Sensors
    • PID Constants and SysId
    • Vision Tuning
    • Run our Autos
  • Shooting
    • Shooting from anywhere on the field
    • Shooting while moving
  • Autos
    • Work on a bunch of autos in Path Planner
    • Work on customizable ones
    • Note following command using vision
  • Documentation
    • Make some nice visuals with all our commands and autos
    • Create and stick to a naming convention for our autos

Onward to the next week of build season! Here’s a little picture of what our robot looks like right now. Thanks for the picture Kevin!

Thanks for reading! Have a wonderful day!

Authors
Akash Dubey
Tyler Nguyen


Open Curtain

Act 1: Satus

In this world filled with divine comedy, individual shows are played out each day. Here begins my progress report for 2/5/2024 regarding the issue of our team’s swerve drive.

Monday: The Rise of Darkness (1/29/2024)

I will begin by defining the problem. Our swerve modules are built and wired dubiously, but that’s beside the point. The problem with our code is the alignment of the wheels. Because the wheels and the motors could not be installed in the same orientation, it is up to us poor programmers to fix the sins of the father and right these wrongs. We use angle offset variables in our code to change the starting orientation of the wheels and apply them constantly, so that each wheel pretends it knows where it is and moves accordingly with respect to where it should be. While this should work in theory, our wheels were not moving (Monday). After sifting through gilded towers filled with 1s and 0s, we discovered that we weren’t actually applying our offsets at all. With our target in sight, we applied the offsets to the optimized angle, and to our getAngle and setAngle methods (I’m sure you guys know what those methods do). With our hearts lightened, we ran the code, expecting victory. What we found was much more shocking.
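For readers following along: the offset math itself is simple, and the hard part is applying it in the right places. A standalone sketch of the two directions of the conversion (names and numbers are illustrative, not our actual module code):

```java
// Sketch of per-module angle offsets: the absolute encoder reads some
// arbitrary angle when the wheel is physically straight, so readings are
// corrected by an offset, and commanded angles get the offset added back.
class ModuleAngle {
    final double offsetRad; // encoder reading when the wheel points forward
    double encoderRad;      // raw absolute-encoder reading

    ModuleAngle(double offsetRad) {
        this.offsetRad = offsetRad;
    }

    // Wheel angle in the robot frame (what odometry and kinematics want).
    double getAngle() {
        return encoderRad - offsetRad;
    }

    // Convert a desired robot-frame angle back into encoder units.
    double toEncoder(double desiredRad) {
        return desiredRad + offsetRad;
    }
}
```

The bugs described in this post are essentially about applying the subtraction or the addition in the wrong place (or twice, or not at all).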

Wednesday: The Unfading Void (1/31/2024)

Our motors started spasming back and forth, like an overstressed student faced with several tests.

Flip flop issue

After doing some investigative journalism, we realized that this was not feasible for the robot and reverted our changes a bit, returning to an age before the darkness. At the same time, we found another team’s code and took inspiration from their work, seeking to fix our problems. Through our relentless efforts, we discovered that we should not try to offset our optimized states, only our current position. However, this all comes to a head today.

Thursday: The Tower’s Fall

Today the kitchen was open, the tools were laid out, and the meat was done marinating. I cooked quite a bit. I started by analyzing the REV code repository. As I looked at where the angle offset was being applied, I realized that it was only being applied in three places, one of which didn’t do anything.



The third place it was being applied was in the getState command, but because that command wasn’t being called outside of where it was defined, I have not included a screenshot of it.

Like a good programmer, I copied their technique and applied the offset in the same way to our code.


This worked out, but in a very strange way. All the wheels now point the same way and drive, but after driving for a bit, they decide to point in all different directions. On top of that, the turn commands on the right joystick didn’t work.
I believe this problem has to do with our PID constants and accumulated error, which may be due to PID or the optimize function not working correctly. However, progress was made today. Despite our obstacles, a match was struck, our path is lit, and our hearts are ready to face the darkest dungeon.

Friday: The Return of The Light (2/2/2024)

I’ve been doing some more investigative journalism, and I’ve noticed something. I don’t think the odometry drive and turn positions matter. I’ve tested Akash’s code, and it doesn’t matter whether or not the offset is there. Rather, I think the offset matters when the setpoint is run. Looking at the REV swerve code, as I said above, the offset is being applied in two places: getPosition itself, and setting the position it wants to go to (remember that getState isn’t used outside of where it gets defined). However, something funny occurs here: the getPosition and setDesiredStates methods interface with each other.

So Akash Dubey and I did some cooking during 5th and 6th periods. After looking at the code again and trying various fixes, we decided to open AdvantageScope (a program that we use to look at the state of the robot). As we stared at the shifting numbers and dials, Akash noticed that some of the swerve modules weren’t matching up with their locations in AdvantageScope, so we changed around the motor IDs a bit. Turns out that fixed the offset problem as well as the turning issue. However, the robot is driving at a bit of a weird angle. Might be because it’s field relative. Further reading of the code is required. (Hi, this is zzhao1 from the future. The code is not in field relative right now. I will touch upon this a bit later in this Act and in Act 3.)

Sliding around and catching - IMG_3960.MOV

Video of it working at the very end - IMG_3961.MOV

INTERMISSION

Use this time to grab a drink and a snack. Our story is just beginning.

INTERMISSION FIN

As we move to Saturday (2/3/2024), our endless struggle continues. As Akash had brought up earlier, the issue we were facing could potentially be a PID issue, so we decided to get SysId working. SysId, aka system identification, is used to find the specific feedforward and PID constants to run our wheels with. For those who do not know what PID is, it stands for Proportional, Integral, Derivative, and is a feedback method used to drive motors accurately to target positions or velocities. We need this for our drivetrain because all of our motors must reach their commanded speeds and angles precisely in order to drive the robot effectively.

Before we could do SysId, we had to ensure our offsets worked, and work they did. Offsetting the desiredState and the getPosition in opposite directions worked, and with our motor controller IDs switched around, we were balling. As we were staring at AdvantageScope, Akash had a startling realization: our odometry wasn’t being properly updated. This means that field-relative driving won’t work (I’ll come back to this).

Our righteous crusade had to come to rest for a moment, though, as our build subteam needed the robot to fix its belly pan. As they toiled with their fancy tools and grinding machines, Akash and Sam combined their minds to derive equations for the angle to shoot from. Using their combined knowledge and the power of Desmos, they created a calculator that relates the shooting angle to the robot’s position relative to the speaker. Linked here, it is very handy (and is a surprise tool that will come in handy later). As time progressed, the robot was no closer to being done. In fact, its status had only regressed, going from a temporary chassis to no chassis at all.
Mobilizing, Jace Lopez, Akash Dubey, William Kimmel, and Michael Sisoev sprang into action, disassembling the swerve modules and swapping in the pinions required for the highest speed, as well as taping up the bars to ensure that no empty space remained. As the meeting wrapped up, we affirmed our vows and returned to our respective abodes, ready to face the horrors of the future.

Wednesday: Shadows (2/7/2024)

Each day brings a fresh new battlefield. After some robot testing, I discovered that the absolute encoder wire connectors had all fallen out due to the robot spinning. The WiFi power port also wasn’t connected, so it was honestly a small miracle that I was able to connect to the robot at all. To remedy the situation, I zip-tied the encoder connectors onto the motor controllers and rewired the WiFi power port, expecting the problem to be fixed, but with a solution came fresh problems. Zip-tying the encoder connections down apparently changed the angle they were plugged in at, which caused the motors to lose track of where they were. With this lesson in mind, I resolved to drive the robot more carefully. After another couple minutes of testing, I realized that the offsets were being applied in the wrong direction, so I decided to re-zero all of the swerve modules to ensure they would point in the right direction. Halfway through the zeroing process, however, I neglected to follow my own labels and zeroed all the modules in the wrong direction. Hastily undoing my error, I resumed testing and discovered that trying to drive the robot forward would only spin the turning motors without actually moving the wheels. Running the code again, the spinning issue fixed itself, but the wheels stopped turning. With time running short, I tactically fell back, leaving the issue for another day.

Thursday: The Birth of Hope (2/8/2024)

All of the swerve problems fixed themselves overnight. Sometimes when logic fails, faith prevails. Though I cannot explain why they’re fixed, what matters is that the swerve drive works. However, one small problem remains: our wheels’ angles in robot-relative drive are slightly off, despite field-relative driving working. This might be a SysId issue, or the wheels themselves could be installed wrong, though that remains to be seen. The robot drives, and that’s all that matters.

The Mountain

Our robot still has a long journey to complete before it can compete, programming-wise. Swerve must be tuned, vision must be calibrated, and physics simulations must be completed. However, a journey of a thousand miles begins with a single step. With the robot’s drivetrain working, we are ready to brave this darkest dungeon.

Act 1: Fin

Written by Zandy Zhao @Swordman51


Really nice archetype!

  • main architecture similar to Quokka, with a separate ground intake
  • pivot similar to 2910
  • shooter similar to 6328 Mechanical Advantage’s

a combination I suggested to my team a few weeks ago as well (sadly we didn’t use it due to some integration concerns).
Good luck!


Hi everyone! Here’s a long overdue update on our robot:

Build Update:

Overview:

After printing new pulleys and ordering belts, we finished assembling most of our intake/shooter and tested it out with screwdrivers. It was a little janky because the screwdrivers were weak, but the motors will have more torque.

The complete shooter

Note being shot with screwdriver

The shooter is split into two sections, each run by a separate motor at a different speed. This is because of a problem we noticed with other teams using a single motor: the note would miss the speaker every 3-4 shots because it didn’t have enough spin. To solve this, we decided to shoot the note like a frisbee, which gives it both longer range and more accuracy.
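In code terms the idea is just two different speed setpoints, something like this sketch (the ratio and numbers are hypothetical, not our tuned values):

```java
// Illustrative sketch: running the two shooter sections at different speeds
// imparts frisbee-like spin on the note. SPIN_RATIO is a made-up tuning
// constant, not our actual value.
class SpinShooter {
    static final double SPIN_RATIO = 0.85; // slow side runs at 85% of the fast side

    /** Returns {fastSideRpm, slowSideRpm} for a requested shot speed. */
    static double[] setpoints(double shotRpm) {
        return new double[] { shotRpm, shotRpm * SPIN_RATIO };
    }
}
```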

The pivot arm has been completely assembled and has been attached to many of the other subsystems.

Thursday:

On Thursday, we followed the CAD to finish our shooter but faced a few problems along the way. Our hex shafts were too long and interfered with the note inside, so we had to cut them. We also had trouble finding the right size for our belts, but this was solved quickly with a few tensioners. The main problem we faced was the pulleys, which were initially printed in TPU. The material was malleable, and the pulleys were misshapen by the belts running around them, which caused skipping and stopped them from moving. To solve this, we decided to print new pulleys out of nylon. We also assembled the NEO 550 and attached it to the shooter.

We assembled the gearboxes and the motors for the pivot arm. We cut the hex shafts to the correct size after two failed attempts. Following the CAD of the pivot arm, we attached the gearboxes to the base of the arm. Unfortunately, one hex shaft was cut a little too perfectly and happened to be just short enough that a snap ring wouldn’t fit. To fix this problem, we attached a shaft collar in the gearbox to prevent the shaft from shifting. After assembling the arm, we had to disassemble it. (T-T) The climb subsystem had a major redesign, so we had to take the arm apart in order to drill holes for bearings at its base. It’s okay though, we got it done.

Afterwards, we attached the arm and shooter to each other. We connected the parts using aluminum plates and tube caps in order to bolt them together, then used temporary rivets to fasten the assembly to the robot.

Saturday:

On Saturday, we replaced the TPU pulleys with the newly printed nylon ones. We also replaced our shooter tensioners from offsets to a hex shaft because the offsets kept rotating with the belts.

We attached the tensioner for the chain for the pivot arm. However, we encountered an issue with the chain skipping. We found that the chain and sprocket were not set in place because a shaft collar bolt was stripped. To fix this problem, we cut a spacer to align the chain with the sprocket. After finishing up the pivot arm, we attached it onto the belly pan and riveted it down on both sides.

Electronics Update:

Since we finished the body of our robot on Saturday, we were able to start electronics! Here’s the general layout that we planned:

We put our PDH in the center, right behind the ground intake, and used a 3D-printed plate bolted to the base to stack the roboRIO on top. We also put the battery across from the PDH on its side to save space.


We also connected our NEOs to the SPARK MAX controllers. Last year, we used Anderson Powerpoles to connect the wires. This year, we replaced them with Wago terminals so we wouldn’t have to worry about the orientation of the connectors. To help with wire organization, we designed custom plates for the Wago terminals to slide into and be secured with zip ties. While wiring, we realized that we didn’t have any more ferrule crimps, so we mostly used scrap wire from previous years. We then ran the CAN chain across the motor controllers. We’re also planning to add LEDs this year, so we wired the LED strip to the VRM. The end result was almost complete but very messy:

Plates for wagos, without zip ties (they'll go in the holes at the top)

Saturday electronics

We started organizing wires on Monday by replacing all of the Anderson Powerpoles with Wagos and attaching them to the plates. We also bought more ferrule crimps to use with the PDH. In the end, we were able to finish wiring most of the motors and start testing them with programming.

Working on wiring on Monday

A problem we found when testing was that the shooter motors would constantly get stuck and needed a concerning amount of torque to run. We found that this was because the belts were the wrong size and the hex shafts were still somewhat crooked from our last update. We started replacing the belts and will cut the hex shafts in between meetings, so hopefully we’ll have videos of a working shooter soon!


Hello!

Instead of me (Akash) writing the programming post this week, I thought I would hand it off to our programming team. Hope you enjoy!

TL;DR generated by ChatGPT:

Hey folks!

Our programming team has achieved significant milestones in robot development. They tackled shooter and pivot complexities, opting for a versatile lookup table approach. LED functionalities were implemented based on dynamic robot states, and a Note Shooting Visualizer class was crafted for simulation and tuning. Compound commands like autoScore and shootSpeaker were explained, showcasing efficient robotic actions. Vision code was enhanced to consider Pose estimates’ averages for improved accuracy. The team also designed multiple autonomous routines, including customizable ones through the Elastic dashboard. NoteChooser facilitates autonomous routine selection, while preparations for testing, PID tuning, vision optimization, and code cleanup are underway.

Cheers,
1257 Programming Subteam

Programming Update #3

Things we worked on

Shooter and Pivot

Darrien, Tyler, and Sam

Shooter code, shoot anywhere, and shoot while moving

Shooting While Moving

Recently, we’ve been playing around with all of the above kinds of shooting commands and created one that can hopefully shoot from any position, both while stationary and while moving. The command waits for the robot to turn towards the speaker, then uses the lookup table to determine the appropriate shooter speed and pivot arm angle to score.

Here’s our code. It doesn’t exactly work yet, as we’ve seen in simulation; however, we’re going to fix the problems and get it working soon.

Lookup Table Rationale:

Instead of using a mathematical equation to calculate angle and RPM, we decided to use a lookup table. Because real-world physics is nonlinear and dynamic, we will inevitably need to adjust values to compensate, and compared to a single math equation, a lookup table lets us adjust values very easily.

We were mainly inspired by this post by Team Rembrandts.

Lookup Table:

The lookup table uses a 2D array list that stores the velocity and angle corresponding to each distance. The Lookup.java file holds the code used to retrieve values from this table: it takes the specified distance, finds the two closest distance entries in the table, and linearly interpolates between them to return a velocity/angle value. This is essentially the same linear remapping that the map() function performs in some environments (e.g., Arduino).
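Here’s a minimal sketch of the interpolation idea, using plain Java collections instead of our Lookup.java (the structure matches what we described; any distances/values you put in would be tuned numbers, and the ones in the test below are placeholders):

```java
import java.util.Map;
import java.util.TreeMap;

// Minimal sketch of an interpolating lookup table: given a distance, find
// the two closest table entries and linearly interpolate between them.
// Values could be shooter RPM or pivot angle.
class ShotLookup {
    private final TreeMap<Double, Double> table = new TreeMap<>();

    void put(double distanceMeters, double value) {
        table.put(distanceMeters, value);
    }

    double get(double distanceMeters) {
        Map.Entry<Double, Double> lo = table.floorEntry(distanceMeters);
        Map.Entry<Double, Double> hi = table.ceilingEntry(distanceMeters);
        if (lo == null) return hi.getValue(); // below the table: clamp
        if (hi == null) return lo.getValue(); // above the table: clamp
        if (lo.getKey().equals(hi.getKey())) return lo.getValue(); // exact hit
        double t = (distanceMeters - lo.getKey()) / (hi.getKey() - lo.getKey());
        return lo.getValue() + t * (hi.getValue() - lo.getValue());
    }
}
```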

We also wrote a class to tune our table with Logged Dashboard Numbers to make testing easier.

We know that WPILib has its own built in InterpolatingDoubleTreeMap and we plan to use it in the future.

TurnSpeakerAngle Command:

The command defines the speaker’s location and creates a vector between it and the robot’s current location. A Rotation2d method is then used to determine how far the robot must rotate to be ready to score into the speaker. Here is the code for it.
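The math underneath is just an atan2 of the robot-to-speaker vector. A tiny sketch (plain doubles standing in for WPILib’s Translation2d/Rotation2d; coordinates are hypothetical):

```java
// Sketch of the TurnSpeakerAngle math: the heading of the vector from the
// robot to the speaker is the field-relative angle the robot should face.
class SpeakerAim {
    /** Field-relative heading (radians) from the robot toward the speaker. */
    static double headingToSpeaker(double robotX, double robotY,
                                   double speakerX, double speakerY) {
        return Math.atan2(speakerY - robotY, speakerX - robotX);
    }
}
```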

ShootWhileMoving Command:

The command runs the TurnSpeakerAngle command in parallel; this is necessary for the robot to shoot accurately and is more efficient than copying the previous command’s logic.

Every tick, the robot gets the ideal angle, sets the pivot arm PID setpoint accordingly, and runs the PID. The pivot gets closer and closer to its setpoint, and once it reaches it, the shooter runs at the ideal RPM. The ideal angle and RPM come from the getAngle and getRPM functions described below.

If interrupted, the command stops the pivot and shooter. The end condition for the command is for the pivot and shooter to reach their setpoint. An andThen is used to make sure the intake is available before the command is over.

ShootAnywhere Prerequisite Functions:

See here for these methods.

getEstimatedTransform:

  • Using the current velocity, it predicts how far the robot will move over 20ms (one tick)

getEstimatedPosition:

  • Returns the estimated position in the next tick. It adds the estimated transform to the robot’s current position.

getEstimatedDistance:

  • Returns the distance between the speaker position and the estimated position.

getAngle / getRPM:

  • Uses the distance from getEstimatedDistance, inputs it into the lookup table, and returns the ideal Angle or RPM.
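Those helpers can be sketched together like this (doubles instead of Pose2d/Transform2d; the 20 ms tick is the standard loop period, everything else is illustrative):

```java
// Sketch of the one-tick lookahead: advance the robot's position by its
// field-relative velocity over one 20 ms loop, then measure the distance
// from that predicted position to the speaker.
class ShotLookahead {
    static final double TICK_SECONDS = 0.02;

    /** Predicted {x, y} one tick from now, given velocity in m/s. */
    static double[] estimatedPosition(double x, double y, double vx, double vy) {
        return new double[] { x + vx * TICK_SECONDS, y + vy * TICK_SECONDS };
    }

    /** Distance from the predicted position to the speaker. */
    static double estimatedDistance(double x, double y, double vx, double vy,
                                    double speakerX, double speakerY) {
        double[] p = estimatedPosition(x, y, vx, vy);
        return Math.hypot(speakerX - p[0], speakerY - p[1]);
    }
}
```

That distance is what gets fed into the lookup table to pick the angle and RPM.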

LED

This system flashes the LEDs different colors based on the state of the robot. The BlinkinLEDController has a number of static variables that describe that state:

  • isEnabled: true if robot is enabled, false otherwise

  • isEndgame: true if match time < 30 seconds, false otherwise

  • noteInIntake: true if there is a note in the intake, false otherwise

  • shooting: true if the robot is currently shooting a note, false otherwise

  • pivotArmDown: true if the pivot arm is down, false otherwise

Unlike a subsystem, the variables are updated in the LEDPeriodic method in RobotContainer.java, which runs periodically, and the LEDs are updated based on these variables in the periodic method of BlinkinLEDController.java.
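One way to turn those flags into a single color choice is a simple priority chain, sketched below (the priorities and color names are hypothetical stand-ins for Blinkin pattern values, not our actual mapping):

```java
// Hypothetical sketch of choosing an LED color from the robot-state flags,
// checking the highest-priority condition first.
class LedChooser {
    static String pickColor(boolean isEnabled, boolean isEndgame,
                            boolean shooting, boolean noteInIntake) {
        if (!isEnabled)   return "RED";      // disabled
        if (shooting)     return "STROBE";   // actively shooting
        if (isEndgame)    return "YELLOW";   // under 30 seconds left
        if (noteInIntake) return "GREEN";    // holding a note
        return "DEFAULT";                    // idle pattern
    }
}
```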

Here’s a picture of us setting up the Blinkin controller and configuring colors.

Note Shooting Visualizer

The Note Visualizer class is designed to help us analyze and iterate on our shoot-from-anywhere code and tune the aforementioned lookup table. It’s important to note that the Note Visualizer is strictly for simulation purposes and has no effect on the robot’s shooting during a match.

setRobotPoseSupplier:

  • Sets the supplier values for robotPoseSupplier, leftSpeed, rightSpeed, and pivotAngle

shoot:

  • Simulates the action of shooting a note using a variety of closely associated classes and methods.

Note Following

We also wrote some code to follow notes, however, we haven’t been able to test it on our robot yet. We have two different approaches: one that estimates the Pose2d of the note and creates a trajectory towards it and another that just takes the angle to the note.

Compound Commands

Carlos and Raghav

Here’s an explanation of some of our compound commands and a nice diagram.

autoScore

  • Create a setpoint using PID for the intake to go to
  • Have the robot move to the amp and the intake go to its setpoint at the same time
  • Once the robot is ready, the handoff happens and the note is released

shootSpeaker

  • Arm moves to the position necessary to shoot the note
  • Sets speeds for the shooter motors and then releases the note

Handoff

  • For releasing the note
  • Sets the speed of the shooter motors to shoot out the note

aimShooter

  • Uses PhotonVision to set the position and angle needed to shoot the note into the speaker from anywhere on the field

Vision

We also redid our vision code to take the average of our pose estimates rather than just the “best” one. In simulation, the output looks slightly better; however, we won’t know for sure until we test it.

Autos

Bowen, Claire, Jase, Kavi, and Mai

So many autos! Here’s some information about our autos!

  • Naming convention:
  1. s(1-3): signifies the starting position (top, middle, or bottom)
  2. n(1-8): signifies the location of each note
  3. sc(1-8): signifies the shooting location corresponding to each note position
  • Types of autos
  1. There are no 1-note autos, as those are fairly simple
  2. 2-note, 3-note, 4-note, 5-note, and 6-note autos pick up that many notes and shoot them in one autonomous period
  • How we made the autos (in steps)
  1. Set 1 starting location of the 3 possible (top, middle, or bottom)
  2. Call upon preset paths (paths typically go to a note location and then shoot)
  3. Make sure the paths are smooth by copying the x and y coordinates of each ending position to the next starting position (you could just guess and check)

Customizable Autos

This year, we wanted to be able to perform any auto action on the field. Our customizable autos are driven by the drive team, who select the note positions for our robot to pick up from and where each note should be scored. The MakeAutos.java file tells the robot to go to the note specified through our Elastic dashboard (basically a nicer version of Shuffleboard), then directs the robot to the specified shooting position and shoots the note. The code can run this action four times, for up to five-note custom autos. These are the methods used:

goToPose: uses path finding to go to a location (specified by Elastic)

getSelected: a method used to retrieve the location from the Network Table (basically what’s in the Elastic dashboard)

deadlineWith: makes the robot operate an action simultaneously with another

  • In this case it is to initiate the intake while the robot goes to the specified note

NoteChooser/AutoChooser

NoteChooser uses the SendableChooser class to present a selection of options on our dashboard. For example, we may want to select between different autonomous routines: by putting every possible command into a SendableChooser, we get a list of options on the programming laptop. We have separate choosers for starting positions, score positions, and note positions. This lets us make our autos more efficient and optimize the autonomous process.

This screenshot of a sim shows the different dropdowns that the NoteChooser created. In this example, the robot first intakes Note 1, then shoots it at the top; next, it chooses Note 2 and shoots it at the top again, and so on.

This same process is displayed in this GIF, with different notes chosen.

(Note 8, Bottom; Note 7, Really Bottom; Note 3, Center; Note 1, Center) (4 Note Auto)

We also added some code to flip our poses in Field Constants to ensure that our code works on both sides of the field.
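For reference, the flip is just a mirror across the field’s centerline. A small sketch of the idea (plain doubles instead of Pose2d; 16.54 m is approximately the 2024 field length, and the exact constants in our Field Constants may differ):

```java
// Sketch of mirroring poses across the field centerline so the same auto
// works for both alliances. 16.54 m is (approximately) the 2024 field length.
class FieldFlip {
    static final double FIELD_LENGTH_METERS = 16.54;

    /** Mirrors an x coordinate across the centerline. */
    static double flipX(double x) {
        return FIELD_LENGTH_METERS - x;
    }

    /** Mirrors a heading (radians) about the field's y-axis. */
    static double flipHeading(double rad) {
        return Math.IEEEremainder(Math.PI - rad, 2.0 * Math.PI);
    }
}
```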

Road Ahead

Our plan

  • Finish the Robot
    1. Test Motors/Encoders/Sensors
    2. PID Constants and SysId
    3. Vision Tuning
    4. Run our Autos
  • Clean up our code
    1. Remove Unused Imports
    2. Reformat Code
    3. Add Comments
  • Advanced Features
    1. Custom Web Dashboard maybe
    2. Autonomous Teleop Cycling
    3. Improving our simulation code to include notes for vision and intaking

Thanks for reading! Have a wonderful day!

Authors
1257 Programming Subteam


Love to see other teams get inspired by our posts!

At what interval are your setpoints placed?

Hi! Thanks for inspiring so many teams like ours from around the world!

Currently, our intervals are set to one meter.

The angle values we used are also guesstimates, which may be part of the problem. Tomorrow, we’re going to tune our table in simulation as much as we can before testing it on the real robot.

Having setpoints closer together is definitely better for accuracy. Do you recommend making our intervals, say, 50 cm instead, or is there not much of a difference?

Thanks again!