Has anyone achieved the perfect velocity sensor platform?
I’ve seen measurements done with mice (x/y), but that wouldn’t help much with rotation. I believe a set of three or four sensors (three would be easier, arranged in an equilateral triangle underneath) would achieve that.
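To show why multiple x/y sensors recover rotation, here's a minimal sketch in pure Python. It assumes a hypothetical geometry of just two sensors mounted a distance L either side of center on the robot's x-axis; rigid-body kinematics says each sensor reads the body translation plus the rotation's contribution at its mounting offset, which gives a closed form for these two points.

```python
def twist_from_two_mice(m1, m2, L):
    """Recover per-step body motion (dx, dy, dtheta) from two x/y
    displacement sensors mounted at (+L, 0) and (-L, 0) on the robot.

    Each sensor sees translation plus omega-cross-r for its offset;
    for this mounting, rotation shows up as equal-and-opposite y
    readings (small angles assumed).
    """
    dx1, dy1 = m1
    dx2, dy2 = m2
    dtheta = (dy1 - dy2) / (2.0 * L)  # opposite y readings => rotation
    dx = (dx1 + dx2) / 2.0            # translation is the average
    dy = (dy1 + dy2) / 2.0
    return dx, dy, dtheta
```

With a third sensor the system becomes redundant, so a least-squares fit can also flag a sensor that is lying (e.g. one mouse lifting off the carpet).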
Encoders? Don’t catch slippage.
Accelerometers? Can be affected by jerk and/or require precise timing and encoding (see rotation).
Gyros? Fix the rotation issue, assuming the lever arms are correctly accounted for.
Ultrasonic? Some great success reported in a whitepaper at 40 mph with a modified set of transducers.
Suggestions would be most welcome. I’ve been wanting to build a sensor for this for years.
Many of these work only if there is no slippage; encoders, for example, fail the moment you're slipping in a pushing match (sometimes with a wall). Follower wheels are arguably the best solution, since they sense your real integrated displacement, assuming they don't slip, and with care and thought you could model most of the turns and dips.
It does require contact, however, and I think back to the “Moon” competition where we were unsure if followers would be allowed. A non-contact type sensor would be ideal, which limits you to optical, sound, or ? But then that’s the problem with all sensor platforms… they must keep getting more and more complex to filter for the undesirable motions.
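If you do get clean per-step displacements out of followers, turning them into an integrated field position is the easy part. Here's a sketch in pure Python, assuming each step provides a small body-frame (dx, dy, dtheta):

```python
import math

def integrate_pose(steps, x=0.0, y=0.0, theta=0.0):
    """Dead-reckon a field-frame pose from per-step body-frame displacements.

    Each step is (dx_body, dy_body, dtheta). Rotating each displacement
    through the mid-step heading is a cheap way to stay accurate when the
    robot is turning while it moves.
    """
    for dx_b, dy_b, dth in steps:
        mid = theta + dth / 2.0
        c, s = math.cos(mid), math.sin(mid)
        x += c * dx_b - s * dy_b
        y += s * dx_b + c * dy_b
        theta += dth
    return x, y, theta
```

The catch is exactly the slippage point above: any step where the followers slip is silently integrated into the pose, and the error never goes away.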
I know of one team that did exactly what I think you’re trying to explain using modified laser mice, sitting half an inch above the ground (no contact = no slipping). I’m still trying to figure out exactly how they did it, though. I think they probably used a coprocessor to interpret the information from the USB cables.
Put an encoder on each of three appropriately placed unpowered omni follower wheels and use those three signals to compute the three independent motions of the vehicle (X, Y, rotation).
I’ve been trying to figure out how to perform the math behind this for two seasons! The idea I’m working with has four unpowered wheels, though, arranged in a square.
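The math is a 3x3 linear solve: each follower's encoder measures the ground velocity at its contact point projected onto its rolling direction, and that projection is linear in (vx, vy, omega). Here's a sketch assuming a hypothetical layout of three tangential omnis spaced evenly on a circle of radius R; with four wheels in a square, the same system is simply overdetermined and you'd solve it by least squares instead.

```python
import math

def wheel_matrix(R, angles):
    """Row i encodes: encoder rate s_i = -sin(a)*vx + cos(a)*vy + R*omega
    for a tangential omni follower at angle a on a circle of radius R."""
    return [[-math.sin(a), math.cos(a), R] for a in angles]

def solve3(A, b):
    """Solve a 3x3 linear system by Cramer's rule (fine at this size)."""
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(A)
    result = []
    for col in range(3):
        M = [row[:] for row in A]
        for r in range(3):
            M[r][col] = b[r]
        result.append(det(M) / d)
    return result  # [vx, vy, omega]

# Round-trip check: project a known motion onto the wheels, then invert.
angles = [math.radians(a) for a in (90, 210, 330)]
A = wheel_matrix(0.2, angles)
s = [row[0] * 1.0 + row[1] * 0.5 + row[2] * 2.0 for row in A]
print(solve3(A, s))  # recovers vx=1.0, vy=0.5, omega=2.0
```

The same `wheel_matrix` rows work for any placement as long as the three rolling directions aren't degenerate (e.g. all parallel), which is why "appropriately placed" matters.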
A camera and 2D optical-flow algorithms will give you x and y. An optical mouse is an example of a sensor that uses optical flow this way. The problems with mouse sensors are the optics and the maximum velocity they can work at.

Recently, some drone hobbyists have taken a good camera and interfaced it to a Cortex-M4 ARM microcontroller, adding an ST Micro gyro as well. There isn't a lot of feedback on ground-based applications, and I believe it works best at more than 12" off the floor. Different optics and some lighting experimentation may yield a very good solution.

Our team has been looking at autonomous navigation this summer. OpenCV has optical-flow algorithms that could be used if these solutions aren't fast and accurate enough. Google "optic flow" and check Wikipedia. A gyro plus an accelerometer like the InvenSense MPU-6050 will give a very good tilt-compensated yaw.
The other possibility is to use astral navigation. Every event has lights in the ceiling that are fixed like stars. Point a camera up at the ceiling. Using some filters, a pixel matrix of dark and light (0 and 1) can be produced, and 2D optical flow can then be run on that matrix. The lights are far enough away that some triangulation can also be done. Every event has different lights, so it would have to be calibrated at each event. This stuff is hard, else everyone would be doing it.
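A sketch of the thresholding step plus the crudest possible tracker, following the centroid of the bright pixels between frames (pure Python; a real version would match individual lights so it could also estimate rotation and survive lights entering or leaving the frame):

```python
def binarize(img, thresh=200):
    """Turn a grayscale frame (rows of 0-255 values) into a 0/1 matrix."""
    return [[1 if p > thresh else 0 for p in row] for row in img]

def centroid(mask):
    pts = [(x, y) for y, row in enumerate(mask)
                  for x, v in enumerate(row) if v]
    n = len(pts)
    return sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n

def light_shift(prev_frame, curr_frame, thresh=200):
    """Estimate the image-plane translation of the ceiling lights between
    two frames. The robot's translation is the negative of this, scaled
    by ceiling height and camera focal length (the per-event calibration)."""
    x0, y0 = centroid(binarize(prev_frame, thresh))
    x1, y1 = centroid(binarize(curr_frame, thresh))
    return x1 - x0, y1 - y0
```

The threshold and the height/focal-length scale are exactly the things that would need recalibrating at each event.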
Velocity relative to what? Whatever your answer, that (thing) needs to be part of the solution.
Even the carpet can slip if you push hard enough.
If a contact solution isn’t desired, then contactless means optical of some type, or maybe ultrasonic.
Astral navigation would work, but what a pain to do. It would have to adapt to the current lighting setup no matter where you are. I have a “Mint” floor cleaner that uses something like that, but the pattern is fixed (it sends an IR pattern to the ceiling, which the Mint uses for navigation). This is still optical, though…