Has anyone thought about doing robot localization this year to help with shooting?
It seems the sensors we're allowed to use are limited to the ultrasonic range sensor, the camera/Kinect, and Class I laser sensors. (Does anyone have more information on Class I laser sensors?)
Here are some problems with each:
- Ultrasonic sensor: Interference from other teams' sensors
- Camera/Kinect: Stereo vision is tough to do, and mounting a Kinect on the robot is just as difficult.
- Laser sensor: Even Class I may be outlawed, since sensors aren't supposed to be able to blind the drivers.
Has anyone thought about solving these problems?
Also, how are you all thinking about programming the robot to localize itself? I'm considering a particle filter with the ultrasonic sensor, but I'm not sure how effective it would be given interference from other teams' sensors.
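To make the particle-filter idea concrete, here's a rough 1D sketch in plain Python. Everything in it is a made-up example, not our actual robot code: I'm assuming a known wall 10 m from the origin, an ultrasonic sensor that reads the range to that wall, and arbitrary noise values. A real version would be 2D (x, y, heading) and would need outlier rejection to cope with interference from other teams' pings.

```python
import random
import math

def particle_filter_step(particles, control, measurement, wall_pos=10.0,
                         motion_noise=0.05, sensor_noise=0.2):
    """One predict/update/resample cycle for 1D localization."""
    # Motion update: shift each particle by the commanded move plus noise.
    moved = [p + control + random.gauss(0, motion_noise) for p in particles]

    # Measurement update: weight each particle by how well its predicted
    # range to the wall matches the ultrasonic reading (Gaussian likelihood).
    weights = []
    for p in moved:
        err = (wall_pos - p) - measurement
        weights.append(math.exp(-err * err / (2 * sensor_noise ** 2)) + 1e-12)
    total = sum(weights)
    weights = [w / total for w in weights]

    # Resample particles in proportion to their weights.
    return random.choices(moved, weights=weights, k=len(moved))

# Simulated run: robot starts 2 m from the origin and drives toward the wall.
random.seed(0)
true_pos = 2.0
particles = [random.uniform(0.0, 10.0) for _ in range(500)]
for _ in range(20):
    true_pos += 0.1                                   # drive 0.1 m per step
    reading = (10.0 - true_pos) + random.gauss(0, 0.2)  # noisy range to wall
    particles = particle_filter_step(particles, 0.1, reading)

estimate = sum(particles) / len(particles)
print(f"true={true_pos:.2f} m, estimate={estimate:.2f} m")
```

Even this toy version shows why interference matters: a spurious reading from another team's sensor would pull the weights toward the wrong particles, so you'd probably want to gate measurements that disagree wildly with the current estimate before the update step.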
Ideas?