Thread: New Sensors?
#18, 09-07-2014, 23:57
slibert
Software Mentor
AKA: Scott Libert
FRC #2465 (Kauaibots)
Team Role: Mentor
 
Join Date: Oct 2011
Rookie Year: 2005
Location: Kauai, Hawaii
Posts: 361
Re: New Sensors?

Quote:
Originally Posted by Jared Russell View Post
I would personally love to see an end-of-match autonomous mode, which necessitates very good localization. Up until now, teams could get by on using odometry from a known starting position and be (reasonably) repeatable over 10-15 seconds. Remove the ability to precisely control the initial conditions of the robot and it is a whole different animal.
It's clear the Google car and its brethren will be real very soon, and in fact it's the social, not the technical, challenges which will dominate. I for one don't want to wait for FIRST to design games that will require this technology - I believe we need to develop it now, and I believe once developed it will be both very usable in current games, and future-ready when drivers have to focus on problems of a higher order than vehicular navigation.

Two lynchpin sensors:

A) A $79 LED-based optical ranging system (Lidar-Lite) is being released this month. We're planning on building a 2-D 360-degree scanning lidar; the goal is to range the entire field (at robot height) at 1-2 degree resolution within about 3 seconds.

Lidar Lite (http://www.dragoninnovation.com/proj...-pulsedlight):

- Uses LEDs, not Lasers, w/an optic. Repeat: NO LASERS REQUIRED.
- Measures distance to 20 meters with a 10ms integration period using time-of-flight calculations
- Notably, provides an SNR measurement with each reading. Key point: retroreflective tape should return a much higher SNR than other surfaces.
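The time-of-flight idea above is simple enough to sketch. This is purely illustrative (it is not the Lidar-Lite API; the sensor does this math internally): distance is half the round-trip pulse time multiplied by the speed of light.

```python
# Illustrative time-of-flight ranging math, not the Lidar-Lite interface.
C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Convert a measured round-trip pulse time to a one-way distance (m)."""
    return round_trip_s * C / 2.0
```

At the sensor's 20 m maximum range, the round trip is only about 133 ns, which is why the device integrates many pulses over its 10 ms measurement period.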

B) An auto-calibrating Attitude Heading Reference System (AHRS). The nav6 IMU we developed for FIRST (https://code.google.com/p/nav6) provides this solution; this was used by several teams at nationals last year and includes C++, Java and LabView Libraries for easy integration onto the robot.
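The nav6 performs its sensor fusion onboard, but to give a feel for what gyro/accelerometer fusion means, here is a toy single-axis complementary filter (my own illustrative sketch, not nav6 code): integrate the gyro for short-term accuracy, and lean on the accelerometer's gravity vector to correct long-term drift.

```python
import math

def complementary_pitch(prev_pitch_deg: float, gyro_rate_dps: float,
                        ax: float, az: float, dt: float,
                        alpha: float = 0.98) -> float:
    """One complementary-filter step for pitch.
    prev_pitch_deg: last estimate (degrees)
    gyro_rate_dps:  gyro rate (degrees/sec)
    ax, az:         accelerometer components (g) giving the gravity direction
    dt:             timestep (sec); alpha weights gyro vs. accelerometer."""
    accel_pitch = math.degrees(math.atan2(ax, az))
    return alpha * (prev_pitch_deg + gyro_rate_dps * dt) + (1 - alpha) * accel_pitch
```

A real AHRS fuses all three axes (plus magnetometers) with a proper quaternion filter, but the principle is the same: each 100 Hz update blends fast gyro integration with slow absolute references.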

Given knowledge of the field metrics, these two intelligent sensors together provide:

- Robot Current Position relative to Field (derived from field metrics and 360 degree lidar scan) throughout the match.
- Robot Starting Orientation (measured w/magnetometers before game (and motors) start).
- Instantaneous Orientation (100Hz Motion Fusion of Gyro/Accelerometer).
- Gravity-corrected Linear Acceleration measurements
- Angle to Retro-reflective Tape (based on SNR thresholds of LIDAR data).
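The last item is the easiest to sketch. Assuming each scan sample carries an (angle, distance, SNR) triple, finding tape bearings is a threshold test; the threshold value here is a placeholder that would be tuned empirically.

```python
def tape_bearings(scan, snr_threshold: float = 40.0):
    """Return the bearing angles (degrees) of scan samples whose SNR
    exceeds the threshold -- candidate retroreflective-tape sightings.
    scan: iterable of (angle_deg, distance_m, snr) tuples."""
    return [angle for (angle, dist, snr) in scan if snr > snr_threshold]
```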

And unlike camera-based approaches, we believe this approach should be insensitive to variable lighting conditions.

The nav6 sells for $70. And I'm estimating we can build the Lidar Lite scanner for about $150 in parts; not sure yet what the sale price to the FIRST community might be.

Now of course on top of that we need collision avoidance and waypoint navigation algorithms. But the amount of published research done in this area recently (not to mention some cool Cheesy Poofs navigation code) is rapidly making this an engineering, not a research, task.
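Waypoint navigation falls out almost for free once you have field-frame position and yaw. A minimal sketch, assuming the coordinate conventions above (this is my illustration, not any team's published code): compute the field-frame bearing to the waypoint, and the difference from the AHRS yaw is the turn command.

```python
import math

def heading_to_waypoint(x: float, y: float, wx: float, wy: float) -> float:
    """Field-frame bearing (degrees) from the robot at (x, y) to the
    waypoint (wx, wy). Subtract the robot's current yaw to get the
    heading error to feed a steering controller."""
    return math.degrees(math.atan2(wy - y, wx - x))
```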

It's a good time to be a robotics engineer!

Last edited by slibert : 10-07-2014 at 00:06.