Team 4480 Presents: WallE

Watch it on YouTube:


Nice job Upsala!

Great robot! Really digging the smooth drive and how well that plays into your autonomous modes.

Looking good!

I’m liking how smooth your ~~Mecanum~~ Vectored Intake Wheels are. A large number of teams seem to have issues creating a ~~Mecanum~~ Vectored Intake Wheel drive base.

Do you have any plans to release a whitepaper or drive code at the end of the competition season, so we can see what you’re doing differently?

Love the 360° cameras! The vectored ground intake seems really smooth. What are those green parts made out of?


They’re 3D printed, using Hatchbox filament from Amazon.

Took a little CAD time, but we got it working out nicely!

Also, the code is written in Python and is currently posted on GitHub!

Yes, we will. We have a PID angle-locked drive: no matter how you strafe, it holds its angle. To change the angle, you rotate with the right stick of the Xbox controller.
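For anyone curious what an angle lock like that might look like, here is a minimal sketch of the idea: a PD controller on heading error that only runs when the driver isn’t commanding a rotation. Names, gains, and the deadband are illustrative assumptions, not the team’s actual code.

```python
def wrap_degrees(angle):
    """Wrap an angle into [-180, 180) so the robot corrects the short way."""
    return (angle + 180.0) % 360.0 - 180.0

class AngleLock:
    """Holds a heading setpoint and outputs a rotation correction."""

    def __init__(self, kp=0.02, kd=0.001):
        self.kp = kp            # proportional gain (illustrative)
        self.kd = kd            # derivative gain (illustrative)
        self.setpoint = 0.0     # target heading, degrees
        self.last_error = 0.0

    def update(self, heading, rotate_input):
        """Return the rotation command to feed the holonomic drive mixer."""
        # Driver rotation input overrides the lock and re-captures the setpoint.
        if abs(rotate_input) > 0.05:   # right-stick deadband (assumed value)
            self.setpoint = heading
            self.last_error = 0.0
            return rotate_input
        error = wrap_degrees(self.setpoint - heading)
        correction = self.kp * error + self.kd * (error - self.last_error)
        self.last_error = error
        return correction
```

The rotation value returned here would be combined with the translation (strafe) inputs in the drive mixing, so strafing never disturbs the held heading.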

Very nice.

When you say keeping angle, do you mean field-centric control or robot-centric?

Robot-centric. We wanted this because we always wanted the driver to be able to look at the camera and know which direction the robot is facing. Without the angle lock, it almost spins in a full circle after about 20 feet of strafing to one side with no adjustment from the driver.

Is the angle lock gyro corrected, or functioning off drive encoders?

Gyro (NavX) corrected. Drive encoders seemed more likely to drift off over time. Also, with our gyro we really don’t deal with gyro drift at all unless the driver decides to never rotate.

The NavX is pretty solid. Appreciate you answering questions, I always find it’s better for everyone the more information there is out there about successful systems.

I like the epic soundtrack, and I especially like the big video screen on the driver station. Is the live video useful, or is it too laggy? Do your drivers watch the screen or ignore it to watch the robot?

The big screen is new for us this year. As a side project at the beginning of the school year, the kids wanted to build a more in-depth driver station.

There is a hint of lag (less than 200 ms), but it’s quite useful, currently mostly for the loading station on the opposite side of the field.

We have been doing both: when the robot is out of sight (especially at loading), the drivers use the video; when it’s close, for placing gears on the peg, they line up using both the video and a direct view.

As a side note, last year we had a smaller screen for Stronghold… That screen was neglected, but mainly because video was brought in later than anticipated.

It’s training and practice mostly!

Cool bot!

P.S. Yasss Python

How do you guys do auto?

In which manner?

There are the basics: drive forward and place gears from all three starting positions. But we’ve also added options, such as backing away so other robots can drive in near us, or clearing our zone and moving into the mid zone to be ready for another gear load, etc.

Trying to be ready for all scenarios :)
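A multi-option auto setup like that is often just a table of routines behind a selector. Here is an illustrative sketch under that assumption; the routine names and steps are made up, not the team’s actual modes.

```python
# Each routine returns an ordered list of step names; a real implementation
# would map these to drive/placement commands.
def peg_center():
    return ["drive_forward", "place_gear"]

def peg_left_then_clear():
    return ["drive_forward", "turn_to_peg", "place_gear",
            "back_away", "clear_zone"]

AUTO_MODES = {
    "center": peg_center,
    "left_clear": peg_left_then_clear,
}

def run_auto(selection):
    """Look up the selected routine; fall back to the safest default."""
    routine = AUTO_MODES.get(selection, peg_center)
    return routine()
```

Keeping the modes in a dictionary like this makes it cheap to add a new scenario: write one function and register it.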

Do you use any computer vision?

Yes, we use a Raspberry Pi 3 onboard the robot, running OpenCV 3 at 57 FPS. If you want to see our vision code, look on our GitHub page here: 2017-Robot-Code/Vision at master · schlumpyj/2017-Robot-Code · GitHub
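The last step of a pipeline like that is usually turning the target’s pixel position into a steering angle. Here is a small sketch of that conversion; the resolution and field-of-view numbers are assumptions for illustration, not the team’s actual camera specs.

```python
import math

IMAGE_WIDTH = 320        # pixels (assumed capture resolution)
HORIZONTAL_FOV = 60.0    # degrees (assumed camera field of view)

def yaw_to_target(target_center_x):
    """Angle in degrees from the camera axis to the target; positive is right.

    Uses the effective focal length rather than a linear pixel-to-degree
    scale, so the mapping stays correct toward the edges of a wide lens.
    """
    focal = (IMAGE_WIDTH / 2.0) / math.tan(math.radians(HORIZONTAL_FOV / 2.0))
    return math.degrees(math.atan2(target_center_x - IMAGE_WIDTH / 2.0, focal))
```

That yaw error can then be fed to the same rotation control the drive already uses, so auto-aim and driver rotation share one code path.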

Love the dashboard.

What are you using for the 360 camera?

I’m trying to convince my team to mount a 360fly on our bot, but it’s a glorified GoPro. Can’t hook it up to our dashboard.