Quote:
Originally Posted by teletype-guy
So that was a lot of preamble to get to my question: within the roboRIO autonomous loop, I can constantly poll for up to eight ranging distances -- what is the best way to use that information to perform obstacle avoidance or field way-point determination? This must be much like code that uses other ranging parts (IR or ultrasonic) so how is that data best fed into the drive/steering code?
Ok, that's pretty cool.

Does it see the plexiglass? That's my biggest worry right now with LIDAR. Also, if there were a way to get 5-meter LIDAR for cheap, that would be perfect for FRC. Unfortunately, the field of view is 20 degrees, which won't give you the nice point cloud that a more expensive LIDAR would.
The industry buzzword you're looking for is localization. You want to use the LIDAR to do localization. I'm pretty certain that the 2-meter range is going to be too short to be useful, but if you can see the perimeter of the field, you should be pretty close to done. One of the classic algorithms is the particle filter.
http://robots.stanford.edu/papers/th...tics-uai02.pdf should be useful, though I haven't read it myself to confirm. Someone like Jared Russell would have more insight if you can get him to chime in.
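To make the particle filter idea concrete, here's a rough Python sketch. This is nothing I've run on a robot: the field dimensions, noise numbers, and the expected_range ray-cast against the perimeter walls are all placeholder assumptions, and a real filter would also have to model the 20-degree beam width.

Code:
# Minimal particle filter for LIDAR localization. Rough sketch only:
# FIELD size, noise sigmas, and the ray-cast sensor model are
# placeholder assumptions, not a tested FRC setup.
import numpy as np

N = 500                       # number of pose hypotheses (particles)
FIELD = (16.46, 8.23)         # assumed rectangular field, meters

# Each particle is a pose hypothesis: (x, y, heading).
particles = np.column_stack([
    np.random.uniform(0, FIELD[0], N),
    np.random.uniform(0, FIELD[1], N),
    np.random.uniform(-np.pi, np.pi, N),
])
weights = np.full(N, 1.0 / N)

def expected_range(pose, beam_angle, max_range=2.0):
    """Distance from this pose to the field perimeter along one beam
    (simple ray-cast against the four walls, clamped to max_range)."""
    x, y, th = pose
    dx, dy = np.cos(th + beam_angle), np.sin(th + beam_angle)
    dists = []
    if dx > 1e-9:
        dists.append((FIELD[0] - x) / dx)
    elif dx < -1e-9:
        dists.append(-x / dx)
    if dy > 1e-9:
        dists.append((FIELD[1] - y) / dy)
    elif dy < -1e-9:
        dists.append(-y / dy)
    return min(min(dists), max_range)

def step(particles, weights, odom_delta, ranges, beam_angles, sigma=0.05):
    # 1. Predict: shift every particle by the odometry delta plus noise.
    particles = particles + odom_delta + np.random.normal(0, 0.02, particles.shape)
    # 2. Update: weight each particle by how well its predicted wall
    #    distances match the measured ranges (Gaussian sensor model).
    for z, b in zip(ranges, beam_angles):
        pred = np.array([expected_range(p, b) for p in particles])
        weights = weights * np.exp(-0.5 * ((z - pred) / sigma) ** 2)
    weights = weights + 1e-300          # guard against all-zero weights
    weights = weights / weights.sum()
    # 3. Resample: clone likely particles, drop unlikely ones.
    idx = np.random.choice(N, N, p=weights)
    return particles[idx], np.full(N, 1.0 / N)

Every autonomous loop you'd feed step() the odometry delta and your eight range readings; the particle cloud collapses toward poses whose predicted wall distances match what the sensor actually saw, and the weighted mean of the cloud is your pose estimate.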
Once you know where you are, you then want to follow a path in the absolute coordinate frame. That's the general problem of driving somewhere without hitting things wicked hard. Algorithms like ELQR and iLQR are pretty good for that, but the math is hairy. You can spend an entire lifetime on this part.
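If you want a taste of that math, here's the finite-horizon LQR backward pass that iLQR re-runs around each linearization of your dynamics. The A, B, Q, R below are a toy double integrator, purely illustrative, not a real drivetrain model.

Code:
# Finite-horizon discrete-time LQR backward pass: the Riccati
# recursion at the core of iLQR. The model below is a toy double
# integrator (position + velocity), not a real drivetrain.
import numpy as np

def lqr_backward_pass(A, B, Q, R, Qf, horizon):
    """Return gains K[t] so that u[t] = -K[t] @ x[t] minimizes
    sum(x'Qx + u'Ru) over the horizon plus a terminal x'Qf x."""
    P = Qf
    gains = []
    for _ in range(horizon):
        # One Riccati step, moving backward in time.
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
        gains.append(K)
    return gains[::-1]        # gains[0] is the first timestep's gain

# Toy model: state = (position, velocity), input = acceleration.
dt = 0.02                     # 50 Hz control loop
A = np.array([[1.0, dt],
              [0.0, 1.0]])
B = np.array([[0.0],
              [dt]])
Q = np.diag([1.0, 0.1])       # penalize position error most
R = np.array([[0.01]])        # control effort is cheap
K0 = lqr_backward_pass(A, B, Q, R, Qf=Q, horizon=200)[0]
u = -K0 @ np.array([0.5, 0.0])   # command for 0.5 m of position error

iLQR then simulates the controller forward, re-linearizes along the new trajectory, and repeats until it converges; that outer loop, plus handling constraints and obstacles, is where the lifetime goes.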
I'm definitely not an expert. I've been working on adding localization and path planning to 971's bots over the last couple years, and am getting closer.