"mouse encoders"

I know it’s possible to convert an optical computer mouse into an optical encoder that you can mount on the bottom of your robot to tell how far you’ve gone, but I can’t seem to find any mouse sensors fast enough. (The fastest I’ve found has a top speed of about 5 ft/sec.) Is there any pre-existing encoder that functions like a mouse, or any optical mouse hack that would achieve a sufficient speed to go on a FIRST robot?

What about actually using a ball that ran on the ground to “gear down” the speed you are moving at? I think that would work… either that or a series of balls to do it. I can’t say how efficient that would be.

As for a larger version of that system, I can’t say.

I’ve thought about this before, and ran into the same problem that you described. Commercial mice are just not made for high speeds. Can it be done? Sure - but you will need a lot of expertise on the subject, not to mention the ability to fabricate optics.

I am, however, working on something similar for next year’s control system using optical flow navigation from a downwards-pointing digital camera. With the processing power of the cRIO, it might just work. Basically, you take a picture of the carpet, then another picture a fraction of a second later. You can solve the optimization problem to determine the vector relating the movement between the two shots. At a high enough speed (I’m hoping for 15-30Hz), you can essentially get optical mouse-like results.
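To make the two-frame idea concrete, here’s a minimal sketch of one common way to estimate the shift between consecutive frames: phase correlation. This is an illustration, not the cRIO implementation — the frame size is made up, and a real setup would still need a calibrated pixels-to-inches scale.

```python
# Rough sketch of two-frame shift estimation via phase correlation.
# NOTE: frame size and scale are assumptions for illustration only.
import numpy as np

def phase_correlation_shift(frame_a, frame_b):
    """Estimate the integer-pixel (dy, dx) translation from frame_a to frame_b."""
    F_a = np.fft.fft2(frame_a)
    F_b = np.fft.fft2(frame_b)
    cross = F_b * np.conj(F_a)              # cross-power spectrum
    cross /= np.abs(cross) + 1e-12          # keep phase only
    corr = np.fft.ifft2(cross).real         # sharp peak marks the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = frame_a.shape
    if dy > h // 2:                         # wrap large shifts to negative
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

Multiply the pixel shift by a calibrated inches-per-pixel factor to get displacement per frame; divide by the frame interval for velocity.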

That could work, but it might have slop or break if it hit a piece of debris on the field. That’s why I was hoping to avoid contact in the first place.

I think he was talking about using an optical mouse. Ball mice can work and I’ve seen them on robots with some modifications.

That was the other option I was considering. However, given the proximity of the camera to the ground if it’s on the bottom of the robot, wouldn’t the size of the picture be so small that the next frame could have none of the old picture?

I know; I was thinking you’d read the ball. Truth be told, that might help you with your camera idea and keep the issue of field debris away. Just put a fine checkerboard pattern on the ball and go from there.

Yeah, that could work. I wonder if it’s possible to avoid the ball altogether, though. That would be ideal.

A couple of years ago some people looked into this. There are gaming mice that are fast enough. The problem is optics, illumination, and the fact that optical mice are designed to work best on a laminate surface. Avago holds most of the patents on the algorithms used to detect the movement and is suing a bunch of other chip companies. Using a 1/2 VGA camera could work if you got the focus and lighting right. Then you would have to develop the algorithm and have enough processing speed. The good algorithms are closely guarded IP.

I need to warn anyone thinking of using a “telephoto mouse” or camera system of a real problem. The field of view, and thus the relationship between image motion and actual velocity, changes with height from the floor. Any bouncing of the system, or even significant vibration, will throw off the results.
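To put a number on that warning: image motion scales inversely with camera height, so a system calibrated at one height but riding at another biases every distance estimate by the same ratio. The mounting height and bounce below are made-up numbers for illustration:

```python
def velocity_scale_error(nominal_height, actual_height):
    """Fractional bias in estimated velocity/distance when the pixel scale
    was calibrated at nominal_height but the camera rides at actual_height.
    Image motion ~ 1/height, so the estimate comes out nominal/actual."""
    return nominal_height / actual_height - 1.0

# Hypothetical: a camera calibrated at 2.0 in that bounces up to 2.25 in
# under-reports every distance by about 11% while it's up there.
error = velocity_scale_error(2.0, 2.25)
```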

Same thing for us.
We tried the optical mouse thing, only to find out they are too slow. Instead we put encoders on omni wheels to accomplish the same task: one encoder for the x axis and one for the y axis. It was accurate and accomplished our task. The only problem was that by the end of everything, the rollers in the omni wheels (the ones for VEX) were not turning very well because of cuts and scrapes.

I think a vision-based system (ie, optical mouse/camera) is probably more trouble than it’s worth. As Mr. Anderson said, the potential for error due to vibration/bumps is pretty high (try bouncing your optical mouse and see what your cursor does).
MORT has used encoders on our drive shafts, and they don’t seem to be a problem. Naturally, any time the wheels slip, there is slop.

With a good Kalman filter, the optical flow path can work quite well. I’ve done it before.

Is it worth the trouble? Probably not :smiley: But yes, it can and does work when well-implemented.

This approach is used by Team 16, the Bomb Squad…

(Photographic illustration of martschr’s post above)

That’s what I suggested when Nick asked me about it after school today :cool:

Our 2005 Triple Play robot had three miniature omniwheel “follower” encoders. There was one in front and one in back to measure sideways motion and turning, and one in the middle to measure forward motion. It worked great on carpet. On vinyl or tile or concrete, not so great.

How did you measure rotation with them? I’ve been looking at a similar concept but couldn’t come up with a way to do it without a ton of floating-point math/lookup tables.

I will have to find out exactly, but I think we went with using the circumference of the wheels times the number of rotations. We used the x and y separately. We measured how far forward we went to change sections in the code. Ex: after x many feet forward, we would then execute our turn. Then we measured how far over we went to know what lane we were in. In our hybrid we gave it the signal of what lane the ball was in, and the rest relied on the encoders and gyro. If needed, I’m sure there is a way of using the x and y together in this same setup, but I don’t know the math needed to do it. I will talk to one of our mentors and try to find a better answer for you.
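For the rotation question: with the three-follower-wheel layout described earlier (sideways wheels at front and back, forward wheel in the middle), rotation falls out of a subtraction — no lookup tables needed. A sketch, with the wheelbase value as a placeholder:

```python
# One odometry update from three follower-wheel encoder deltas.
# Geometry assumed: two sideways wheels spaced `wheelbase` apart
# (front and back), one forward wheel near the center.
def follower_odometry_step(d_front, d_back, d_forward, wheelbase):
    """Encoder distance deltas (same length units) -> robot-frame motion.
    Returns (sideways, forward, dtheta_radians) for this time step."""
    dtheta = (d_front - d_back) / wheelbase   # the two sideways wheels disagree when turning
    sideways = (d_front + d_back) / 2.0       # average sideways travel
    forward = d_forward                       # middle wheel reads forward travel
    return sideways, forward, dtheta
```

The trig only shows up if you then rotate these robot-frame deltas into field coordinates.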

I would be interested in what you guys did for your omni-wheel encoders, I had thought of playing with that.

We used drive wheel encoders and a gyro. It would constantly be calculating our x/y position on the field using trig. Then we used waypoint routines to tell the bot to go to a new x/y, so we could quickly change where we wanted the bot to go; also, if the bot got hit, it could adjust.

It worked well, but I feel we had too many gyro problems and too much slippage of the wheels the encoders were connected to. So a different way to track position would be great.
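For reference, the trig in that kind of setup is just accumulating small encoder steps along the gyro heading. A minimal sketch — the left/right averaging and the heading convention (0° = +x axis, CCW positive) are assumptions, not the team’s actual code:

```python
import math

# Minimal encoder-plus-gyro dead reckoning, one sample at a time.
def update_position(x, y, d_left, d_right, heading_deg):
    """Advance the field x/y estimate by one encoder sample."""
    d = (d_left + d_right) / 2.0          # distance traveled this step
    theta = math.radians(heading_deg)     # gyro heading
    return x + d * math.cos(theta), y + d * math.sin(theta)
```

Call it every control loop with the encoder deltas since the last call; error grows with wheel slip, which is exactly the problem described above.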

We experimented with using two of the new large VEX omni wheels mounted at right angles to one another. Each of the wheels directly drove a 64-count-per-revolution Grayhill shaft encoder. The Y (front/back) wheel was close to the center of the robot, and the X wheel was mounted just inside the back chassis member. The following is an excerpt from the “test report” that is attached. This report also includes ultrasonic and gyro data. The gyro data in this report was with a 150 degree/second 2006 KOP gyro (illegal for 2008); we ended up using the 2008 KOP gyro for our first regional (VCU) and then the AD 300 degree/sec gyro for NY and Nationals.

Oh, this is a mecanum system using the 6" AM wheels. We figured we could not tell with the gyro and drive wheel shaft encoders what direction the robot was really moving without the odometry wheels. When all was said and done, we used the ultrasonic range to the center wall to control strafe to keep/move the robot to the right lane. I have data sets from our matches with some interesting/puzzling ultrasonic data. Not sure how enlightening it would be for others, but most of the time the ultrasonic sensor worked well; other times it just plain gave us bad data, and it is not at all clear if the problem was an intermittent sensor failure or interference.
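For anyone wiring up a similar follower wheel, the counts-to-distance conversion is just circumference per revolution. The 64 counts/rev matches the Grayhill encoder above; the wheel diameter here is an assumption — measure your own wheel:

```python
import math

COUNTS_PER_REV = 64        # Grayhill encoder, driven 1:1 by the wheel
WHEEL_DIAMETER_IN = 4.0    # assumed VEX omni wheel diameter, inches

def counts_to_inches(counts):
    """Convert raw encoder counts to linear travel of the follower wheel."""
    revolutions = counts / COUNTS_PER_REV
    return revolutions * math.pi * WHEEL_DIAMETER_IN
```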

Oh yeah, I have a bunch of optical parts the controls hardware guys ordered in the basement - it will be a summer project. I haven’t had the time to do a sanity check on the design or optics, but I think that it will not keep up as well. I think this is the design reference: http://home.roadrunner.com/~maccody/robotics/mouse_hack/mouse_hack.html With the old controller we would have had to use a microcontroller to do the USB-to-serial conversion, but if this works it might interface more easily to the cRIO?

Hope this data is helpful,


This is the result of the odometry wheels for the linear run. The measured distance for this run was 22’8” whereas the predicted Y distance from the Y odometry wheel was 22’7.44” – almost perfect. The measured X displacement however was 11.5”, but the predicted X displacement from the odometry wheel was only 1.68”, showing that the wheels do not provide accurate displacement information when the direction of motion is almost parallel with their axle. I think this makes sense given the roller pattern on the wheels.

some_data_2_23_08_revA.pdf (163 KB)
