I've wanted to see this done forever! I spent all summer trying to come up with a list of characteristics and sensors that would aid in doing this. The best thing I've thought of is using a NIR TOF camera with a 360-degree panoramic lens to obtain a point cloud that surrounds the entire robot. (A 360-degree panorama looks like
this, which can be unfolded to look like
this. Vertical lines are preserved without distortion, making it possible to infer the locations of rectangular targets.) Apparently, it should be relatively easy to calibrate the image-processing algorithm to account for the additional path length the lens adds. Of course, almost all TOF cameras are
way outside of a FIRST budget (I'm currently searching for an affordable monocular TOF camera to build a lens for, and the resources to do so). Beyond that, astral navigation is starting to pique my interest as a sensing option.
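For anyone curious what the "unfolding" step actually involves, it's just a polar-to-rectangular remap: columns of the output correspond to azimuth around the lens, rows to radius from the image center, which is why vertical edges in the scene stay vertical. Here's a rough sketch (the function name, center/radius parameters, and nearest-neighbor sampling are my own illustration, not from any particular library; a real pipeline would also need the lens calibration mentioned above):

```python
import numpy as np

def unwrap_panorama(img, cx, cy, r_min, r_max, out_w=720):
    """Unwrap the donut-shaped image from a panoramic lens into a
    rectangular strip. Each output column samples one ray (azimuth)
    outward from the image center (cx, cy), between radii r_min and
    r_max, so vertical structures in the scene map to vertical lines."""
    thetas = np.linspace(0, 2 * np.pi, out_w, endpoint=False)
    radii = np.arange(r_min, r_max)
    # Nearest-neighbor sample of source pixels along each ray.
    xs = (cx + np.outer(radii, np.cos(thetas))).astype(int)
    ys = (cy + np.outer(radii, np.sin(thetas))).astype(int)
    xs = np.clip(xs, 0, img.shape[1] - 1)
    ys = np.clip(ys, 0, img.shape[0] - 1)
    return img[ys, xs]
```

The same remap applies whether the pixel values are intensity or TOF range, so the unwrapped strip could feed directly into a point-cloud computation.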