Hi, our team was wondering if there are other teams using sensors or cameras to align to the rocket/cargo ship using the white lines on the field?
Our team prototyped using a Pixy Cam 2 to track the lines but didn't have time to implement it in competition. We ran the Pixy into an Arduino that plugged into the roboRIO's USB port.
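A rough sketch of the roboRIO side of that pipeline: once the Arduino is forwarding line data over serial, the robot code mostly needs to parse it into a steering offset. This assumes a hypothetical `LINE,<x0>,<x1>` packet format (endpoint x-coordinates in pixels) and a frame width you'd pass in for your Pixy line-tracking resolution; neither is from the original post.

```java
// Parses a hypothetical "LINE,<x0>,<x1>" serial packet (line-endpoint
// x-coordinates in pixels) and converts it to a normalized steering
// offset in [-1, 1] relative to the camera frame center.
final class PixyLinePacket {
    private PixyLinePacket() {}

    public static double steeringOffset(String packet, double frameWidth) {
        String[] parts = packet.trim().split(",");
        if (parts.length != 3 || !parts[0].equals("LINE")) {
            throw new IllegalArgumentException("bad packet: " + packet);
        }
        double x0 = Double.parseDouble(parts[1]);
        double x1 = Double.parseDouble(parts[2]);
        double mid = (x0 + x1) / 2.0;      // line midpoint in pixels
        double center = frameWidth / 2.0;  // frame center in pixels
        return (mid - center) / center;    // -1 = far left, +1 = far right
    }
}
```

That offset can feed a simple proportional turn or strafe command on the drive side.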
Teams were certainly working on it during build season:
I recall reading in a robot reveal topic that a team was using a mechanical guide which engaged the keyhole cutouts at the top and/or bottom of the hatches to get the final inches of alignment.
Not us - and I was one of the people asking about line sensors early in the build season.
We did get the Allen-Bradley sensors to recognize the tape lines, but from an ops-concept standpoint it didn't turn out to make sense to use them.
Maybe if the tape lines were a bunch longer.
So we gave that robot real estate and weight to a JeVois camera for vision processing instead, plus a HAB 3 climb-mechanism latch.
The vision processing is pretty good for lining things up plus we have two IR sensors used to help ensure we are squared with the target.
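The two-IR-sensor squaring trick above boils down to a little trigonometry: with two forward-facing rangefinders mounted a known distance apart, the difference in their readings gives the robot's skew angle to the wall. A minimal sketch, assuming IR sensors returning distance in meters and a known mounting baseline (the specific geometry is my assumption, not the poster's):

```java
// Squaring up to a flat wall with two forward-facing IR rangefinders
// separated laterally by baselineMeters.
final class SquareUp {
    private SquareUp() {}

    /**
     * Skew angle (radians) relative to the wall.
     * Positive means the left side is farther from the wall.
     */
    public static double skewRadians(double leftMeters, double rightMeters,
                                     double baselineMeters) {
        return Math.atan2(leftMeters - rightMeters, baselineMeters);
    }

    /** True when the robot is square to within toleranceRad. */
    public static boolean isSquared(double leftMeters, double rightMeters,
                                    double baselineMeters, double toleranceRad) {
        return Math.abs(skewRadians(leftMeters, rightMeters, baselineMeters))
                < toleranceRad;
    }
}
```

The skew angle can also be fed straight into a turn controller to square up automatically rather than just gating a "ready" light.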
A spinning LIDAR and some RANSAC would work, but that's kind of a pain.
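For anyone curious what "some RANSAC" would actually look like: a toy 2D line fit over scan points, repeatedly sampling two points, building a candidate line, and keeping the one with the most inliers. This is a generic sketch under my own assumptions (points as `{x, y}` arrays in meters, a fixed iteration count and seed), not anything a team above described running:

```java
import java.util.List;
import java.util.Random;

// Toy RANSAC line fit for a planar LIDAR scan of a wall.
final class RansacLine {
    /** Line in normal form: nx*x + ny*y = d, with (nx, ny) a unit normal. */
    record Line(double nx, double ny, double d) {
        double distanceTo(double x, double y) {
            return Math.abs(nx * x + ny * y - d);
        }
    }

    /** Picks two random points per iteration; keeps the line with the most inliers. */
    public static Line fit(List<double[]> pts, double tolMeters,
                           int iterations, long seed) {
        Random rng = new Random(seed);
        Line best = null;
        int bestInliers = -1;
        for (int i = 0; i < iterations; i++) {
            double[] a = pts.get(rng.nextInt(pts.size()));
            double[] b = pts.get(rng.nextInt(pts.size()));
            double dx = b[0] - a[0], dy = b[1] - a[1];
            double len = Math.hypot(dx, dy);
            if (len < 1e-9) continue; // degenerate sample: same point twice
            // Unit normal is perpendicular to the two-point direction.
            double nx = -dy / len, ny = dx / len;
            Line cand = new Line(nx, ny, nx * a[0] + ny * a[1]);
            int inliers = 0;
            for (double[] p : pts) {
                if (cand.distanceTo(p[0], p[1]) < tolMeters) inliers++;
            }
            if (inliers > bestInliers) {
                bestInliers = inliers;
                best = cand;
            }
        }
        return best;
    }
}
```

The fitted line's normal gives you heading relative to the wall, and `d` gives you standoff distance, which is the "pain" part made concrete.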
Check out 67 - while they do have a Limelight they use, they started this year with a manual alignment plate that they run into the rocket. The edges of it, along with their H-drive, let them slide in pretty well.
2811 is using a Pixy to align to the white lines and then move in to shoot/place.
Nope. We were planning on a Pixy2 but never had enough time to figure anything out past recognizing colors and objects.
We are using one of these as a visual aid to the drivers, along with sideways “nudge” buttons for precise lateral positioning for hatch placement. We originally intended a spaced pair and automated adjustment of both centering and angle, but we had trouble getting two to work at once and ran out of time anyway, so we ended up with just a simple graphic display the drivers can use for manual positioning. It does work very well for that, though.
I really like 2791 and how they use the white tape lines to center themselves (https://www.youtube.com/watch?v=zr0UMJPcuWk). Very simple(ish) way of interacting with the field.
In a similar style, we had a little eureka moment on kickoff that led to our trapezoids. All of day 1 at OC was without limelight and just the trapezoids.
My team used a Pixy2 at our second district and it was awesome. Well, until it decided to scream at us and cause brownout-like conditions. We took the Pixy code out and it was fine. But we're definitely going to fix it for the future.
We have a turret on an elevator, allowing us to side outtake.
When doing cargo ship runs with cargo, we detect the lines on the ground and line up the centre of our turret with the centre of the cargo ship bay, so that we are accurate.
Currently looking into detecting cargo with vision already inside of the cargo ship, so that we can auto stop at a cargo ship bay that doesn’t already have cargo in it.
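The turret-centering step described above amounts to turning a target's lateral offset in the camera image into a turret yaw correction. A minimal sketch, assuming a simple pinhole camera model with a known horizontal FOV (the model and parameters are my assumptions, not this team's actual code):

```java
// Converts a target's pixel x-position into a turret yaw correction
// (radians) under a pinhole camera model with horizontal FOV hFovRadians.
final class TurretAlign {
    private TurretAlign() {}

    /** Positive result = target is right of center, so turn the turret right. */
    public static double yawCorrection(double pixelX, double frameWidth,
                                       double hFovRadians) {
        double normalized = (pixelX - frameWidth / 2.0) / (frameWidth / 2.0);
        // Pinhole model: angle = atan(normalized * tan(FOV/2)).
        return Math.atan(normalized * Math.tan(hFovRadians / 2.0));
    }
}
```

Adding that correction to the current turret angle gives the setpoint to hand to the turret's position controller.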
We use the navX and preset angles to line up flat, but don't have a sensor for lining up laterally. It actually has worked pretty well, but it's not ideal for speed.
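The preset-angle approach above needs one fiddly piece to get right: picking the closest preset with proper angle wrapping, so a robot at 170° snaps to 180° rather than unwinding toward 0°. A hedged sketch (the preset list and degree convention are placeholders, not this team's actual values):

```java
// Snap the current gyro heading to the nearest preset, with shortest-path
// angle wrapping, for use as a turn-to-angle setpoint.
final class HeadingPresets {
    private HeadingPresets() {}

    /** Wraps an angle in degrees to (-180, 180]. */
    public static double wrap(double degrees) {
        double a = degrees % 360.0;
        if (a <= -180.0) a += 360.0;
        if (a > 180.0) a -= 360.0;
        return a;
    }

    /** Returns the preset heading closest to currentDeg by wrapped error. */
    public static double nearestPreset(double currentDeg, double[] presetsDeg) {
        double best = presetsDeg[0];
        double bestErr = Math.abs(wrap(presetsDeg[0] - currentDeg));
        for (double p : presetsDeg) {
            double err = Math.abs(wrap(p - currentDeg));
            if (err < bestErr) {
                bestErr = err;
                best = p;
            }
        }
        return best;
    }
}
```

A driver button can then command the returned heading to a turn PID while the sticks keep control of translation.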
We had one like this on the practice bot but the girls didn’t seem to put much time into learning to use it.
The team I mentored for Steamworks used one very effectively.
Has anyone messed around with PathWeaver?
We played with the idea but the lines weren’t long enough. The vision targets ended up being a way more viable option.
Yeah, that was the catch; we were hoping to align horizontally since we use a mecanum drive… but can you point me to some vision tutorials or something similar?
Down-facing IR sensor.