02-02-2017, 18:53
Tom Line
FRC #1718 (The Fighting Pi)
Team Role: Mentor
 
Join Date: Jan 2007
Rookie Year: 1999
Location: Armada, Michigan
Re: What will be the next technical growth leap for the average team?

Quote:
Originally Posted by Chris is me
I don't think this is a super accurate version of the events.

To some extent, machine vision has been a thing since 2006 - teams used cameras effectively that year for the high goal.

The main reason camera tracking was not a thing in 2009 was the new control system combined with the moving, non-lit target. Teams certainly did use camera tracking to some extent, but often / usually, manual tracking was faster.

In 2010 and 2011, your robot started in a known location facing a stationary goal, so what was the camera even for?

In 2012, a similar argument applied, but once teams started moving around, the camera became beneficial. The rules also opened up to allow coprocessors sometime before 2012, which was a big help. 341 was a notable, highly visible example of camera tracking, certainly, and they had an effect, but I think it's a bit simplistic and disingenuous to suggest that camera tracking wasn't taken seriously until 341 demonstrated it being used effectively.

In 2013 and 2014, see 2010 and 2011. 2015 doesn't count.

So 2016 was really the only opportunity since 2012 for camera tracking to be an advantage. You had to cross defenses before shooting, which made the physical position of the robot on the field not nearly as certain. This is why you saw a lot more cameras for autonomous that year - as well as flashlights for teleop.

So really, the use of cameras is much more driven by the game than by the coolest example of it from a particular year - and cameras have been an available option to FRC teams in one way or another for many, many years. There certainly was a dramatic shift in capability between the pre-2009 CMUcam and the 2009 NI system, as well as a shift in 2012, the first year coprocessors were allowed, when a camera was a potential advantage. But camera use is really a response to the conditions of the game more than anything else.
I agree. There are many well-known examples of vision processing being used for over a decade. Even in 2012, I could list a number of robots other than 341 that were highly successful with their vision systems. Vision is still very difficult; for the most part, only teams with prior expertise or a single skilled individual use it.
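For a sense of what the core of a vision pipeline actually does, here's a minimal sketch in plain Python: threshold pixels that look like a lit retroreflective target, then take their centroid to get an aim point. Real teams would use OpenCV or GRIP on live camera frames; the function name, thresholds, and the tiny synthetic frame below are all made up for illustration.

```python
# Minimal sketch of a vision-target pipeline: threshold green pixels,
# then take the centroid of the matches. An image here is a list of
# rows, each row a list of (r, g, b) tuples.

def find_target_centroid(image, min_g=200, max_rb=100):
    """Return the (x, y) centroid of 'green enough' pixels, or None."""
    xs, ys = [], []
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            if g >= min_g and r <= max_rb and b <= max_rb:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# A 4x4 frame with a 2x2 green "target" in the lower-right corner.
BLACK, GREEN = (0, 0, 0), (0, 255, 0)
frame = [
    [BLACK, BLACK, BLACK, BLACK],
    [BLACK, BLACK, BLACK, BLACK],
    [BLACK, BLACK, GREEN, GREEN],
    [BLACK, BLACK, GREEN, GREEN],
]
print(find_target_centroid(frame))  # → (2.5, 2.5)
```

The hard parts in practice - camera exposure, lighting, contour filtering, latency, and turning pixel offsets into robot headings - are exactly why vision has stayed difficult even though the core math is this simple.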

Two or three years ago came the navX, then the Spartan board, and many others with extremely high-quality gyros. Those allowed for much better navigation and vibration rejection - just look at robots crossing the barriers last year. I'd add VersaPlanetary gearboxes to that list too.
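For context on why those gyro boards mattered: boards like the navX fuse a gyro with other sensors so that gyro drift gets corrected without letting vibration noise through. A classic complementary filter is a minimal sketch of that idea; the constants and the fake sensor streams below are invented for illustration, not taken from any board's actual firmware.

```python
def complementary_filter(gyro_rates, accel_angles, dt=0.02, alpha=0.98):
    """Fuse a gyro rate stream (deg/s) with noisy accelerometer tilt
    readings (deg). The gyro term tracks fast motion; the small
    accelerometer weight slowly bleeds off gyro drift while low-pass
    filtering the accelerometer's vibration noise."""
    angle = accel_angles[0]
    out = []
    for rate, acc in zip(gyro_rates, accel_angles):
        angle = alpha * (angle + rate * dt) + (1 - alpha) * acc
        out.append(angle)
    return out

# Stationary robot: the gyro has a constant 1 deg/s bias, and the
# accelerometer reads 0 deg tilt plus alternating vibration noise.
rates = [1.0] * 500
vibes = [5.0 if i % 2 else -5.0 for i in range(500)]
fused = complementary_filter(rates, vibes)
print(round(fused[-1], 2))  # stays small, vs ~10 deg of pure-gyro drift
```

Integrating the biased gyro alone for those 10 seconds would drift about 10 degrees; the fused estimate stays near zero, which is the "better navigation and vibration rejection" in a nutshell.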

The current 'revolution' going on is thanks to CTRE. The CAN bus system seamlessly integrated into the control system was a huge step: built-in current monitoring, pneumatic control integration.

The motor controllers that implemented easy speed control, and now motion control, have been transformative. Many, many teams struggled mightily with speed control for years; it was never a simple thing. Just look at the old discussions about sampling rates versus RPM, which sensors to use, and so on. Last year, for many teams, it was as simple as plug-and-play.
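To illustrate what onboard speed control buys you, here's a toy sketch of a feedforward-plus-proportional velocity loop - the general kind of loop smart controllers run onboard at high rates - closed around a crude first-order motor model. The gains, the plant model, and the function name are all made up for illustration; real controllers add integral/derivative terms, unit scaling, and filtered velocity measurement, which is exactly the fiddly part teams used to get wrong.

```python
def velocity_loop_output(target_rpm, measured_rpm, kF, kP):
    """One iteration of a feedforward + proportional velocity loop.
    kF maps the setpoint straight to motor output; kP trims the error."""
    error = target_rpm - measured_rpm
    out = kF * target_rpm + kP * error
    return max(-1.0, min(1.0, out))  # clamp to full reverse/forward

# Toy first-order plant: speed relaxes toward output * free speed.
FREE_RPM = 5000.0
rpm, target = 0.0, 3000.0
for _ in range(100):
    out = velocity_loop_output(target, rpm, kF=1.0 / FREE_RPM, kP=1e-4)
    rpm += 0.1 * (out * FREE_RPM - rpm)  # crude motor response
print(round(rpm))  # → 3000
```

Running this loop on the roboRIO at 50 Hz with a coarse encoder is where the old sampling-rate-versus-RPM arguments came from; running it onboard the controller at kilohertz rates against a high-resolution sensor is what made it feel plug-and-play.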

The next true revolution? I suspect it will be brushless DC motors: lighter weight, a smaller package, and so on. The minute they are allowed, they will become ubiquitous among top-flight teams.