Release:
http://www.youtube.com/watch?v=4p_bFfEBzMw
Build Timelapse:
http://www.youtube.com/watch?v=gI1f_2RK6dY
706 is going to Milwaukee and St. Louis
I love the lighting and the robot. May I ask what motor you used to drive your bridge tilter? We packed the bot up a little incomplete because our bridge tilter refused to work. We used a window motor, but it didn't have nearly enough power to push the bridge down.
We are using pneumatics. The motor that you might be seeing is connected to a custom gearbox just below the cylinder. We are using it as a ball intake roller.
How big of a cylinder did you guys end up needing? Ours is a 2.5" bore, 1.5" stroke monstrosity, so if you're designing a similar mechanism, make sure to do the math before building anything!
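For anyone doing that sizing math, the push force is just working pressure times piston area. A quick Python sketch, assuming the typical 60 psi FRC working pressure and using the bore above only as an example:

import math

def cylinder_push_force_lbf(bore_in, pressure_psi=60.0):
    # Force = pressure x piston area (ignores friction and the rod on the retract side).
    area_sq_in = math.pi * (bore_in / 2.0) ** 2
    return pressure_psi * area_sq_in

# A 2.5" bore at 60 psi pushes with roughly 295 lbf; a 1.5" bore manages only about 106 lbf.
print(cylinder_push_force_lbf(2.5), cylinder_push_force_lbf(1.5))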
Impressive. Most impressive.
Thanks!
@Grim Tuesday: I forget the exact dimensions, but I know they are nowhere near that large! The hinge is self-locking, so we don't have to use air pressure to keep it in place.
From what I saw, it looked like you guys had auto-tracking working at the Sussex Mini Regional. How reliable is it? I'm excited to see this robot at the WI Regional and drive with/against you guys.
We have tracking in a working state. In terms of both distance and speed, the tracking is very accurate. However, we still have a few adjustments to make to the PID loop that controls the turret to get it moving smoothly.
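For anyone curious what that tuning involves, a turret PID of this kind typically takes the camera's horizontal offset to the target as its error and outputs a motor command. The sketch below is a generic illustration, not 1716's code; the gains, loop rate, and clamp range are placeholders:

class TurretPID:
    """Generic PID controller; error is the target's horizontal offset in the camera frame."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        output = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Clamp to a normal motor command range so the turret does not slam back and forth.
        return max(-1.0, min(1.0, output))

# Example: run at 50 Hz with placeholder gains until the motion is smooth.
pid = TurretPID(kp=0.8, ki=0.0, kd=0.05)
motor_command = pid.update(error=0.12, dt=0.02)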
And I have to say it myself: Osprey is looking impressive. 1716, 2826, 706 alliance? :yikes:
While we will not be going to any regionals other than Wisconsin, we have a nearly identical practice bot for driver practice, code debugging, and mechanical improvements. By the time Week 4 rolls around, 1716 will be in tip top shape to play among the best.
That’d be a killer alliance if we could pull it off.
Just posted our Build Timelapse: http://www.youtube.com/watch?v=gI1f_2RK6dY
What does your camera tracking code use to track the targets? Color? Luminance? Magic?
We use a negative image from the camera to find the black/white boundary. Then we compute possible polygons and feed those into a genetic sorting algorithm that chooses the most likely target. The heuristics need a little more tuning, but we can measure distance with ±2 inch accuracy.
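For readers who want a concrete picture of that kind of pipeline, here is a rough OpenCV sketch of the general idea (invert, threshold, find polygon candidates, score them, estimate range). It is not 1716's actual code; the threshold, scoring heuristic, and focal-length constant are made-up placeholders:

import cv2

TARGET_HEIGHT_IN = 18.0     # nominal height of the 2012 backboard vision target
FOCAL_LENGTH_PX = 700.0     # placeholder camera calibration constant

def find_target_distance(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    negative = cv2.bitwise_not(gray)                      # "negative image" step
    _, binary = cv2.threshold(negative, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    best, best_score = None, 0.0
    for contour in contours:
        poly = cv2.approxPolyDP(contour, 0.02 * cv2.arcLength(contour, True), True)
        if len(poly) != 4:                                # keep only rectangular candidates
            continue
        x, y, w, h = cv2.boundingRect(poly)
        aspect = w / float(h)
        fill = cv2.contourArea(poly) / float(w * h)
        score = fill * (1.0 - abs(aspect - 1.33))         # crude "most likely target" heuristic
        if score > best_score:
            best, best_score = (x, y, w, h), score

    if best is None:
        return None
    _, _, _, h = best
    return FOCAL_LENGTH_PX * TARGET_HEIGHT_IN / h         # pinhole-camera range estimate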
On the more technical side, we are using a threaded socket connection from the DS computer to the robot. All vision processing happens off-board, and the results are sent back to the robot. We have used the system in competition without a hitch.
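In case that architecture is unclear, it amounts to: grab frames on the DS laptop, process them there, and stream the results back over a long-lived socket from a worker thread. A minimal hypothetical sketch of the laptop side follows; the IP address, port, and two-float message format are invented for illustration:

import queue
import socket
import struct
import threading

ROBOT_ADDRESS = ("10.17.16.2", 1140)   # hypothetical robot IP/port, not 1716's real settings

def sender(results):
    """Worker thread: push each (distance_in, azimuth_deg) result to the robot over TCP."""
    with socket.create_connection(ROBOT_ADDRESS) as sock:
        while True:
            distance_in, azimuth_deg = results.get()
            sock.sendall(struct.pack("!ff", distance_in, azimuth_deg))

results = queue.Queue()
threading.Thread(target=sender, args=(results,), daemon=True).start()
# The vision loop simply calls results.put((distance_in, azimuth_deg)) after each frame.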