#1
07-18-2018, 04:12 PM
iwilcove
It's not the code
AKA: Isaac Wilcove
FRC #7308 (DeepVision)
Team Role: Programmer
 
Join Date: Jan 2018
Rookie Year: 2017
Location: California, USA
Posts: 63
Most Accurate form of Localization.

Hey Everyone,

I want to get some other people's opinions on an interesting thought experiment/problem a few friends and I tried thinking up solutions for. The problem is as follows:

- The robot needs an accurate estimate of its position on the field at all times during the match (must be accurate to within one robot length).
- The robot must be able to accurately pinpoint where it is on a map of the field at least 5 times per second (5 Hz).
- The robot must be able to act on the localization data with no more than a 500 ms delay.
- Secondary processors ARE allowed.
- Data CAN be transferred between the driver station and the robot.
- All standard FRC rules apply, specifically the rules about part pricing and robot weight.
- Assume that you DO NOT know what objects are outside the field, only the configuration of the field inside the fence.
- Assume there are no reflective tape vision targets on the field.
- Bonus: Be simultaneously cheap, simple, AND effective.

The main issue motivating this problem is that localizing with wheel encoders and a gyro is good for measuring changes in position over short time frames, but drifts too far from reality over longer ones.
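To make that drift concrete, here is a minimal dead-reckoning sketch (Python, purely for illustration; the sensor getters named in the comment are hypothetical). Every loop it projects the incremental encoder distance along the gyro heading, so any wheel slip or gyro bias gets integrated into x/y and never comes back out.

Code:
import math

class DeadReckoning:
    """Minimal encoder + gyro pose integrator (illustrative only).

    x and y are field coordinates in meters; heading comes straight from
    the gyro, so gyro drift and wheel slip accumulate in x/y forever.
    """

    def __init__(self, x=0.0, y=0.0):
        self.x = x
        self.y = y
        self.prev_dist = 0.0

    def update(self, left_dist_m, right_dist_m, gyro_heading_rad):
        # Average of the two sides approximates the distance traveled by
        # the robot center since power-on.
        dist = (left_dist_m + right_dist_m) / 2.0
        delta = dist - self.prev_dist
        self.prev_dist = dist

        # Project the incremental distance along the current heading.
        self.x += delta * math.cos(gyro_heading_rad)
        self.y += delta * math.sin(gyro_heading_rad)
        return self.x, self.y

# Called from the 50 Hz robot loop (hypothetical sensor getters):
#   pose.update(left_encoder.getDistance(), right_encoder.getDistance(),
#               math.radians(gyro.getAngle()))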

Some of the solutions we could think of include:
- SLAM landmarks with LiDAR (has trouble seeing lexan fences).
- SLAM landmarks with LiDAR and vision landmarks with a 360-degree camera (again, trouble with lexan fences, and vision is inconsistent).
- Use encoders and gyro for rough estimation, then bump into and line up with physical corners on the field every time the drift gets too bad (wastes time and is unreliable).
- Use 4 laser distance sensors on the 4 corners of the robot, and use SLAM to find landmarks (cheaper than LiDAR, still can't see lexan).
- Use encoders, swerve drive (so you don't have to rotate), wheels with high traction, and move slowly (wastes time and is unreliable).
- Use cameras on the driver station and CV to detect the robot's bumper and numbers, then transform the camera perspective plane onto the field plane (can't see through objects in the way, inconsistent).
- Just don't ( )

I'm interested to see what some other people think about this, because I can't think of anything better. Maybe somebody has some witty ideas?
__________________
DV8

Last edited by iwilcove : 07-18-2018 at 04:16 PM.

#2
07-18-2018, 04:34 PM
gerthworm
Making the 1's and 0's
FRC #1736 (Robot Casserole)
Team Role: Mentor
 
Join Date: Jan 2015
Rookie Year: 2015
Location: Peoria, IL
Posts: 633
Re: Most Accurate form of Localization.

Quote:
Originally Posted by iwilcove View Post
- Just don't ( )
Ha.

Well, there is something to be said for this. There is a key portion of your problem statement I think you left out:

--For all but 15 seconds of the match, there are multiple humans in the control loop


The key here is that, as awesome as knowing absolute field position would be at all times, having a human to observe and correct is a huge advantage over most autonomous systems. It reduces the required complexity of the problem by orders of magnitude. That's not to say we couldn't try to get by without the human, but it does mean the bar for success (i.e., better than a human) is much higher.

Also, personally, I would say I need sub-foot accuracy (maybe sub inch in certain places - think peg in 2017 or high goal in 2016) throughout the match to really get usage out of it. At a minimum, I'd say the required accuracy is different on different parts of the field, and depends on what you're using the data for.

I promise this won't be my last comment on the thread; let me think of some more ideas, though.

#3
07-18-2018, 04:40 PM
s_forbes
plz give me red dots
FRC #0842 (Falcon Robotics)
Team Role: Engineer
 
Join Date: Jan 2006
Rookie Year: 2006
Location: Phoenix, AZ
Posts: 1,431
Re: Most Accurate form of Localization.

I'm anticipating that someone will stick ping pong balls on their robot and use a driver-station based camera to determine location, similar to how some college research teams track drones.

Until that becomes mainstream, gyros and encoders seem to work well enough.

#4
07-18-2018, 04:42 PM
gerthworm
Making the 1's and 0's
FRC #1736 (Robot Casserole)
Team Role: Mentor
 
Join Date: Jan 2015
Rookie Year: 2015
Location: Peoria, IL
Posts: 633
Re: Most Accurate form of Localization.

One additional source of sensor input I'd like to try at some point is a magnetometer. Absolute angle would be cool to have, and could definitely increase accuracy of positioning.

Measuring displacement with idler omni wheels and encoders will get you better location accuracy than measuring the drive wheels' rotations alone. Having multiple of these would allow you to cross-reference and possibly correct if one of them slips?

I've always wondered about taking a vision camera and pointing it at the ground. If the framerate is high enough, you can track variations in the carpet to get x/y velocity (AKA how optical mice work). Similar to the omni wheel solution, just without the possibility of slip, but add in all the problems of a vision system in general.
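(A quick way to prototype the downward-camera idea is phase correlation between consecutive grayscale frames. Rough OpenCV sketch below; the camera index, frame rate, and pixels-per-meter scale are placeholder numbers you would have to calibrate yourself.)

Code:
import cv2
import numpy as np

PIXELS_PER_METER = 3500.0   # assumed calibration; depends on lens and mounting height
FPS = 120.0                 # assumed camera frame rate

cap = cv2.VideoCapture(0)   # downward-facing USB camera (index is an assumption)
ok, frame = cap.read()
prev = np.float32(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = np.float32(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))

    # Phase correlation gives the (dx, dy) pixel shift between frames, plus
    # a response value that drops when the carpet texture is too blurry.
    (dx_px, dy_px), response = cv2.phaseCorrelate(prev, gray)
    prev = gray

    if response > 0.1:      # reject low-confidence frames (threshold is a guess)
        vx = dx_px / PIXELS_PER_METER * FPS   # m/s in the camera frame
        vy = dy_px / PIXELS_PER_METER * FPS
        # ...rotate by the gyro heading and integrate into the pose estimate...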

A very accurate model of how your drivetrain moves is also key. With this, in effect, you can use the applied motor voltage as another "sensor" to guess at how you're moving. Creating this model will be non-trivial. Perhaps a machine-learning library, trained on localization data gathered from a motion capture system? Note that even with all this, the whole calculation becomes incorrect whenever an external force acts on the robot.

Some students and mentors who are awesome at Kalman filters and the like seem to be in high demand.
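To show the kind of fusion that implies, here is a toy one-dimensional Kalman filter: the predict step uses a simple kV/kA feedforward model of one side of the drivetrain (all the constants below are made up, not characterized values), and the update step corrects with the measured encoder velocity. A real version would carry more state and need honest noise estimates.

Code:
class VelocityKalman1D:
    """Toy 1-D Kalman filter: predict wheel velocity from applied voltage
    using a kV/kA feedforward model, then correct with the encoder.
    Every constant here is illustrative, not a measured value.
    """

    def __init__(self, kv=2.0, ka=0.4, dt=0.02):
        self.kv, self.ka, self.dt = kv, ka, dt
        self.v = 0.0      # velocity estimate (m/s)
        self.p = 1.0      # estimate variance
        self.q = 0.05     # process noise (model mismatch, bumps, defense)
        self.r = 0.01     # measurement noise (encoder quantization)

    def predict(self, volts):
        # Invert V = kv*v + ka*a to get the modeled acceleration, then integrate.
        accel = (volts - self.kv * self.v) / self.ka
        self.v += accel * self.dt
        self.p += self.q

    def update(self, encoder_velocity_mps):
        k = self.p / (self.p + self.r)            # Kalman gain
        self.v += k * (encoder_velocity_mps - self.v)
        self.p *= (1.0 - k)
        return self.v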

#5
07-18-2018, 04:43 PM
gerthworm
Making the 1's and 0's
FRC #1736 (Robot Casserole)
Team Role: Mentor
 
Join Date: Jan 2015
Rookie Year: 2015
Location: Peoria, IL
Posts: 633
Re: Most Accurate form of Localization.

Quote:
Originally Posted by s_forbes View Post
I'm anticipating that someone will stick ping pong balls on their robot and use a driver-station based camera to determine location, similar to how some college research teams track drones.
How well does this work with a single camera? Do you know anyone who's tried it yet? We're almost out of summer to do a new project, but I kinda wanna make one of my students bite it off for fun.

#6
07-18-2018, 04:43 PM
iwilcove
It's not the code
AKA: Isaac Wilcove
FRC #7308 (DeepVision)
Team Role: Programmer
 
Join Date: Jan 2018
Rookie Year: 2017
Location: California, USA
Posts: 63
Re: Most Accurate form of Localization.

Quote:
Originally Posted by gerthworm View Post
The key here is that, as awesome as knowing absolute field position would be at all times, having a human to observe and correct is a huge advantage over most autonomous systems.
That's why I made this just a thought experiment: I know it's probably not viable. It's just a fun problem I've been thinking about, even if it won't actually prove useful. But it's a good point that there are other robots on the field, not just your own, so even if you did get accurate positioning, it would be difficult to navigate around unpredictable robots.

Quote:
Originally Posted by gerthworm View Post
Also, personally, I would say I need sub-foot accuracy (maybe sub inch in certain places - think peg in 2017 or high goal in 2016) throughout the match to really get usage out of it. At a minimum, I'd say the required accuracy is different on different parts of the field, and depends on what you're using the data for.
I agree. I'm just thinking in small steps, because even if it is obviously worse than a human and can't align to most game objects, it would still be cool to see a robot drive around during teleop without the driver doing anything.

Quote:
Originally Posted by gerthworm View Post
I promise this won't be my last comment on the thread; let me think of some more ideas, though.
I can't wait
__________________
DV8

#7
07-18-2018, 04:46 PM
iwilcove
It's not the code
AKA: Isaac Wilcove
FRC #7308 (DeepVision)
Team Role: Programmer
 
Join Date: Jan 2018
Rookie Year: 2017
Location: California, USA
Posts: 63
Re: Most Accurate form of Localization.

Quote:
Originally Posted by s_forbes View Post
I'm anticipating that someone will stick ping pong balls on their robot and use a driver-station based camera to determine location, similar to how some college research teams track drones.
Somebody will eventually do this, and it will be very cool and everybody will praise them for it.

Then somebody else will ruin it by putting ping pong balls on their robot as well.
__________________
DV8

#8
07-18-2018, 04:51 PM
ClayTownR
Registered User
AKA: Clayton
FRC #0100 (The WildHats)
Team Role: CAD
 
Join Date: Dec 2016
Rookie Year: 2015
Location: California
Posts: 176
Re: Most Accurate form of Localization.

Some very smart people studied this for their bachelor's at WPI and wrote a 70-page paper on it. Feel free to read it here. Their findings were generally that individual sensors alone were pretty bad for finding your location, so you had to take as many readings as you could get if you wanted a chance at accuracy. Section 3.1 states that IMUs, encoders, cameras with tags, beacons, and optical flow were the most promising techniques. When they built a sample robot, they used encoders, the NavX, and a camera with tags to find their location. They used an extended Kalman filter to combine this information.

You can access their GitHub here.
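For a feel of how that kind of fusion fits together, here is a heavily simplified sketch in the same spirit (not their code; the noise matrices and the identity measurement model are assumptions for illustration). Odometry drives the prediction, and an absolute pose from a camera tag, when one is visible, pulls the estimate back toward reality.

Code:
import numpy as np

def wrap(angle):
    """Wrap an angle to [-pi, pi) so heading innovations stay small."""
    return (angle + np.pi) % (2 * np.pi) - np.pi

class SimplePoseFilter:
    """Heavily simplified pose filter in the spirit of that EKF: odometry
    predicts, a camera-tag pose corrects. Covariance handling is reduced
    to additive process noise; a real EKF would propagate Jacobians.
    """

    def __init__(self):
        self.x = np.zeros(3)                   # [x, y, theta]
        self.P = np.eye(3)                     # state covariance
        self.Q = np.diag([0.02, 0.02, 0.01])   # per-step odometry noise (assumed)
        self.R = np.diag([0.10, 0.10, 0.05])   # camera-tag noise (assumed)

    def predict(self, d_forward, d_theta):
        theta = self.x[2]
        self.x += np.array([d_forward * np.cos(theta),
                            d_forward * np.sin(theta),
                            d_theta])
        self.x[2] = wrap(self.x[2])
        self.P += self.Q

    def correct(self, tag_pose):
        # Measurement model is the identity: the camera reports a full pose.
        innovation = np.asarray(tag_pose) - self.x
        innovation[2] = wrap(innovation[2])
        K = self.P @ np.linalg.inv(self.P + self.R)   # Kalman gain
        self.x = self.x + K @ innovation
        self.x[2] = wrap(self.x[2])
        self.P = (np.eye(3) - K) @ self.P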
__________________
FTC 4800, 2015-
FRC 100, 2016-

#9
07-18-2018, 04:52 PM
gerthworm
Making the 1's and 0's
FRC #1736 (Robot Casserole)
Team Role: Mentor
 
Join Date: Jan 2015
Rookie Year: 2015
Location: Peoria, IL
Posts: 633
Re: Most Accurate form of Localization.

There are also some forms of "camera-only" SLAM I've heard of:

http://www.roboticsproceedings.org/rss10/p08.pdf

The idea there was that, by just putting a handful of cheap webcams on the bot, hooking them to a coprocessor, and running some fancy-schmancy algorithm, you could get absolute position from a self-trained model.

From what I looked into a while back, I don't think the research had gotten the accuracy or delay/framerate good enough for real-time FRC use, but that's not to say it couldn't ever be an option.

Imagine a Limelight-style coprocessor package with a 360-degree camera. You strap it to your robot and drive it around the field a bunch. Then you put it in a specific spot, press a button to call that the origin, and get absolute position relative to that spot every time afterward.

Until lighting conditions change, or someone who was standing in the same spot during training moves.

#10
07-18-2018, 04:52 PM
iwilcove
It's not the code
AKA: Isaac Wilcove
FRC #7308 (DeepVision)
Team Role: Programmer
 
Join Date: Jan 2018
Rookie Year: 2017
Location: California, USA
Posts: 63
Re: Most Accurate form of Localization.

Quote:
Originally Posted by ClayTownR View Post
Some very smart people studied this for their bachelor's at WPI and wrote a 70-page paper on it. Feel free to read it here.
Thanks for sharing this, I'll check it out.
__________________
DV8

#11
07-18-2018, 04:53 PM
iwilcove
It's not the code
AKA: Isaac Wilcove
FRC #7308 (DeepVision)
Team Role: Programmer
 
Join Date: Jan 2018
Rookie Year: 2017
Location: California, USA
Posts: 63
Re: Most Accurate form of Localization.

Quote:
Originally Posted by gerthworm View Post
running some fancy-schmancy algorithm
Problem solved.
__________________
DV8

#12
07-18-2018, 04:53 PM
gerthworm
Making the 1's and 0's
FRC #1736 (Robot Casserole)
Team Role: Mentor
 
Join Date: Jan 2015
Rookie Year: 2015
Location: Peoria, IL
Posts: 633
Re: Most Accurate form of Localization.

Quote:
Originally Posted by ClayTownR View Post
Some very smart people studied this for their bachelor's at WPI and wrote a 70-page paper on it. Feel free to read it here. Their findings were generally that individual sensors alone were pretty bad for finding your location, so you had to take as many readings as you could get if you wanted a chance at accuracy. Section 3.1 states that IMUs, encoders, cameras with tags, beacons, and optical flow were the most promising techniques. When they built a sample robot, they used encoders, the NavX, and a camera with tags to find their location. They used an extended Kalman filter to combine this information.

You can access their GitHub here.

"Position Hallucination" nice.

Also, "Building on windows". Nice.

#13
07-18-2018, 05:15 PM
Loveless
Registered User
no team
Team Role: Engineer
 
Join Date: Mar 2017
Rookie Year: 2011
Location: St Louis
Posts: 69
Re: Most Accurate form of Localization.

https://i.imgur.com/G2q9ikB.gifv

^ Real-time localization from noisy LiDAR and gyro data through automated feature learning.

The ping pong ball method mentioned in an earlier comment would be the best solution when paired with a solver for the PnP (perspective-n-point) problem. Another variation is to put a target on your DS and have the robot track that, which eliminates network lag.
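For reference, OpenCV's solvePnP is the usual way to close that loop: given the pixel centroids of a few markers and their known 3-D layout on the robot, it recovers the robot's pose relative to the camera. Sketch below; the marker layout and camera intrinsics are placeholder values, not anything measured.

Code:
import cv2
import numpy as np

# Known 3-D positions of four markers on the robot, in the robot frame
# (meters). These coordinates are made up for illustration.
MARKERS_ROBOT = np.array([
    [ 0.40,  0.40, 0.90],
    [ 0.40, -0.40, 0.90],
    [-0.40, -0.40, 0.90],
    [-0.40,  0.40, 0.90],
], dtype=np.float64)

# Camera intrinsics from a one-time calibration (placeholder values).
K = np.array([[900.0,   0.0, 640.0],
              [  0.0, 900.0, 360.0],
              [  0.0,   0.0,   1.0]])
DIST = np.zeros(5)   # assume negligible lens distortion for the sketch

def robot_pose_from_markers(image_points_px):
    """image_points_px: 4x2 array of detected marker centroids, in the same
    order as MARKERS_ROBOT. Returns the robot origin in camera coordinates.
    """
    ok, rvec, tvec = cv2.solvePnP(MARKERS_ROBOT,
                                  np.asarray(image_points_px, dtype=np.float64),
                                  K, DIST)
    if not ok:
        return None
    # tvec is the robot origin expressed in the camera frame; a fixed
    # camera-to-field transform (measured once at setup) turns this into
    # field coordinates.
    return tvec.ravel()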

#14
07-18-2018, 05:26 PM
iwilcove
It's not the code
AKA: Isaac Wilcove
FRC #7308 (DeepVision)
Team Role: Programmer
 
Join Date: Jan 2018
Rookie Year: 2017
Location: California, USA
Posts: 63
Re: Most Accurate form of Localization.

Quote:
Originally Posted by Loveless View Post
https://i.imgur.com/G2q9ikB.gifv

^ Real-time localization from noisy LiDAR and gyro data through automated feature learning.
This assumes LiDAR can see the lexan, which in my testing it can't. The best case is that a dusty piece gives you a point that is about 6-12 inches off. Never mind, I see that this is accounted for.

Also, do you have a link to the code for this?
__________________
DV8

#15
07-18-2018, 05:35 PM
solomondg
Registered User
AKA: Solomon
FRC #2898 (Flying Hedgehogs)
Team Role: Leadership
 
Join Date: Aug 2016
Rookie Year: 2016
Location: Portland, Oregon
Posts: 109
Re: Most Accurate form of Localization.

Honestly, I don’t think localization tech is in a place where it’s practical for FRC. I’m not sure it’s useful either, but that’s a separate discussion.

First off, I think we can establish that encoders+gyro alone won’t work – there’s simply too much drift for them to be useful past the 15 seconds of autonomous.

Field barriers are only one of the issues with LiDAR. With five other robots on the field, there's simply not the signal-to-noise ratio to get useful data from it. While the Zebracorns were able to mount a LiDAR sensor at the height of the steel poles on top of the lexan exterior, the data they got was subpar at best, and that was on a flat field devoid of other robots.

Denser methods, such as optical flow from an RGB-D/stereo camera like the ZED, would be feasible. However, as an integration-based method, this is subject to the same drift you'd experience with a conventional encoder+gyro setup. You'd most likely get marginally better performance, but thanks to the aforementioned six robots, and now also the entire crowd and field surroundings, you're not going to have much luck.

That WPI paper's interesting, but it's essentially just a survey of the available methods. Since the absence of vision tags is a constraint of this thought experiment, that drastically reduces the effectiveness of their approach.

The “right” choice for inside-out absolute localization is usually a Monte-Carlo particle filter. However, as with many other things, this is sabotaged by the high number of robots on the field.
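For anyone who hasn't seen one, here is a stripped-down Monte Carlo localization sketch against an empty rectangular field, fed by odometry deltas and a single forward rangefinder. The field dimensions, noise levels, and particle count are all assumptions, and it ignores exactly the other-robots problem described above.

Code:
import numpy as np

FIELD_LENGTH, FIELD_WIDTH = 16.46, 8.23   # meters, roughly an FRC field
N = 500                                    # particle count (a tuning guess)

def expected_range(x, y, theta):
    """Distance from (x, y) along heading theta to the field perimeter,
    assuming an empty rectangular field (no robots or lexan in the way)."""
    dx, dy = np.cos(theta), np.sin(theta)
    candidates = []
    if dx > 1e-9:  candidates.append((FIELD_LENGTH - x) / dx)
    if dx < -1e-9: candidates.append(-x / dx)
    if dy > 1e-9:  candidates.append((FIELD_WIDTH - y) / dy)
    if dy < -1e-9: candidates.append(-y / dy)
    hits = [c for c in candidates if c > 0]
    return min(hits) if hits else 0.0

# Particles start spread uniformly over the field.
particles = np.column_stack([
    np.random.uniform(0, FIELD_LENGTH, N),
    np.random.uniform(0, FIELD_WIDTH, N),
    np.random.uniform(-np.pi, np.pi, N),
])

def mcl_step(d_forward, d_theta, measured_range, range_sigma=0.15):
    """One motion + measurement + resample cycle; returns the mean pose."""
    global particles
    # Motion update: move every particle by the odometry delta, plus noise.
    particles[:, 2] += d_theta + np.random.normal(0, 0.02, N)
    particles[:, 0] += (d_forward + np.random.normal(0, 0.05, N)) * np.cos(particles[:, 2])
    particles[:, 1] += (d_forward + np.random.normal(0, 0.05, N)) * np.sin(particles[:, 2])

    # Measurement update: weight by how well the rangefinder reading matches
    # the distance each particle expects to the field wall ahead of it.
    expected = np.array([expected_range(x, y, t) for x, y, t in particles])
    weights = np.exp(-0.5 * ((measured_range - expected) / range_sigma) ** 2) + 1e-12
    weights /= weights.sum()

    # Resample proportionally to weight, then report the (naively averaged) pose.
    particles = particles[np.random.choice(N, N, p=weights)]
    return particles.mean(axis=0)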

Quote:
Originally Posted by gerthworm View Post
One additional source of sensor input I'd like to try at some point is a magnetometer. Absolute angle would be cool to have, and could definitely increase accuracy of positioning.
The MPU-9250 used on the NavX and the Pigeon already incorporates magnetometer readings into its heading reports. Unfortunately, magnetometers kind of suck, and you can't get amazing readings from them, especially not around the huge amount of EM interference from a high-power robot's control system.

Quote:
Originally Posted by gerthworm View Post
I've always wondered about taking a vision camera and pointing it at the ground. If the framerate is high enough, you can track variations in the carpet to get x/y velocity (AKA how optical mice work). Similar to the omni wheel solution, just without the possibility of slip, but add in all the problems of a vision system in general.
I’ve done a lot of experiments here. While it sounds like a good idea, in reality, it’s incredibly noisy and generally not worth your time. Ditto with mouse sensors, unfortunately.
Quote:
Originally Posted by gerthworm View Post
A very accurate model of how your drivetrain moves is also key. With this, in effect, you can use the applied motor voltage as another "sensor" to guess at how you're moving. Creating this model will be non-trivial. Perhaps a machine-learning library, trained on localization data gathered from a motion capture system? Note that even with all this, the whole calculation becomes incorrect whenever an external force acts on the robot.
Some students and mentors who are awesome at Kalman filters and the like seem to be in high demand.
I’m not sure that’s the solution, to be honest. Your encoders are going to give you just about every bit of information you need. Any mathematical model is going to be a lot less accurate than the real thing, unfortunately. While Kalman filters are nice, they’re not magic, and adding sensors without much covariance won’t help too much. Machine learning is pretty neat, but deep Kalman filters really only excel on highly complicated, nonlinear systems, as opposed to robots, where you’re simply dealing with a lot of generally unpredictable noise and drift.

Quote:
Originally Posted by Loveless View Post
Real-time localization from noisy LiDAR and gyro data through automated feature learning.
I’ve been following this project for a while, and while it’s certainly promising, it’s very much theoretical, and not something that I would yet trust to work. I’d love to see this in the real world or working on actual match data, though, and be proven wrong.

If you had an unlimited budget, I’d be surprised if it wasn’t possible to maintain reasonably accurate localization throughout a match. However, the “unlimited budget” tends to be an issue.

Personally, I think the outside-in localization approach (tracking the robot from the driver station) is going to be the best bet for absolute localization. There are a lot of issues there, and it would be a massive pain, but it's doable. One thing I've also tested, which showed some promise, is tracking the ceiling lights. It's lower fidelity, but you can easily maintain frame-to-frame accuracy, and it would be effective for keeping track of rotation. Once again, there are a lot of issues there, but it seems promising.

Those two methods combined with some quality 3D imaging sensors, a better-than-average IMU (I’ve been meaning to try Redshift Labs’s UM7-LT), a drivetrain with low scrub forces (2+4 or similar), and some well-written sensor fusion code could very easily result in a usably accurate absolute localization implementation. However, it’s far from easy, and probably not all too feasible for the average team. I’d love to see how sensors improve in the coming years, though. dGPS technology is becoming much cheaper; I don’t think it would be impossible for an FRC game in the next ten years to integrate dGPS-based absolute localization. It’ll be exciting to see what the future holds!