
View Full Version : Optical Mouse Navigation


phrontist
29-12-2004, 17:14
A little after the 2004 build season was over, I started thinking about new ideas for this year's season, particularly autonomous mode. I was poking around CD when I saw a post by Craig Putnam in which he mentioned he was working on an optical-mouse-based navigation system. What a great idea, I thought! I thought about it periodically during the off-season and eventually came up with how I wanted to do it, with almost everything figured out except how to interface the mouse, or mice, to the RC. So I emailed Craig Putnam, and he explained his progress on such a system; much to my delight, much of what he had done was what I was planning to do, so I couldn't be too far off track. He also mentioned something mind-boggling...

But before I talk about that I'd like to mention something else. When I first got this piece of information from Mr. Putnam a few days ago, I didn't tell anyone and hoped he wouldn't either. This was pretty stupid for a variety of reasons, which don't need to be enumerated. Anyway, in the past few days I've seen several posts here that are dancing around the problem, namely the following:


TTL Serial Port and PS/2 Mouse Interface (http://www.chiefdelphi.com/forums/showthread.php?threadid=31656)
External Lens for Optical Mouse (http://www.chiefdelphi.com/forums/showthread.php?threadid=31992)
New Serial Port Driver (http://www.chiefdelphi.com/forums/showthread.php?threadid=31931)
Looking for Mouse Part (http://www.chiefdelphi.com/forums/showthread.php?threadid=31943)


I realized that a lot of people are now working on Optical Mouse Navigation, and it wouldn't be fair not to share everything. So then...

There is a chip that converts PS/2 to RS232, made specifically for interfacing PICs to mice! It's from Al-Williams (http://www.al-williams.com/) and it's called the PAK-XI (http://www.al-williams.com/pak11.htm). It slices, it dices, it makes a mean Crème Brûlée! Mr. Putnam, a mentor from team 42, P.A.R.T.S., tipped me off to it after I emailed him about the project. But wait, there's more! It's also cheap! Hey! So what's the catch? Well, it may or may not be legal.

Here are a few snippets from my correspondence with Mr. Putnam:

Panic!

I'm really hoping I'm wrong, but isn't the PAK-XI illegal for use in
FIRST according to last year's rule:

<R71> Additional electronic components for use on the robot must be
currently available from or equivalent to those available from Newark
InOne (http://www.newarkinone.com), Future Active
(http://future-active.com), Radio Shack (http://www.radioshack.com) or
Digi-Key Corporation (http://www.digikey.com). Additional electronic
components include any object that conducts electricity other than IFI
relays and voltage controllers, wires, connectors and solder. The
total catalogue value of additional electronic components must not
exceed $300.00 USD. This cost is counted as part of the $3,500 limit.
No single electronic component shall have a catalogue value of over
$100.00 USD.

Is the whole system in dire peril, dependent on a rule change? Can it
be argued that their chip is essentially equivalent to a PIC with
special software on it? For that matter couldn't it be argued that a
massive collection of bipolar semiconductors could theoretically do
the same thing ;-)


Oh good grief! Somehow, when I started this project way back when, I had
convinced myself that the PAK chip was OK to use. Given the current rules
though, it would seem problematic.

I suspect that the PAK chip is indeed nothing more than a PIC (or
equivalent) microprocessor with the software already developed and loaded.
Whether we could get a ruling from FIRST on this at this point (before
kickoff) is probably also problematic but worth a try. I'll send them a
message tonight and see what happens.

I will continue on with this project anyway - just because it is a challenge
and useful in teaching our courses. But it will be interesting to see
whether or not it can be used on a FIRST robot. I'd sure hate to see it not
be usable due to the lack of a $5 interface chip!



*fingers crossed*



Well, I sent my question off to FIRST last night and got a response today.
Or, more accurately, a non-response (which didn't really surprise me very
much).

As I expected, they won't comment on any possible changes in rules for the
upcoming season. That's not a surprise as it could give one team a
competitive advantage over another.

Regarding whether use of a PAK chip would have been legal last year or not -
they couldn't say.

While poking around on the Microchip web site I found some application notes
and technical briefs that may be helpful. There must be a way to communicate
with the mouse without using the PAK chip. After all, the PAK chip itself
does that and I suspect it is probably some flavor of PIC microprocessor
itself (or something equivalent). You can see some examples of how Microchip
has done that if you look at TB055 and AN956. In the latter case, they are
using USB, but switching from a PS/2-based mouse to a USB-based mouse should
be a trivial exercise.

Good luck!


(I've already ordered some of those chips, and I'm not quite sure whether I'll go ahead and use them and risk their being ruled illegal; ideally I can find another solution.)


Well, now everything has come full circle!

I initially emailed you after finding nothing on the web about this
task except for those documents on the Microchip site. Oh well.

Were you able to find C18 versions of this kind of thing? There is
little chance that I'll be learning assembler in time to make this
thing work for shipment. Bah.


Mr. Putnam has also put together a great presentation on the topic which can be found here (http://faculty.dwc.edu/putnam/Optical%20Mouse%20Project/Robot%20Navigation%20Using%20Optical%20Mouse-based%20Odometry.ppt).

I'm currently building a prototype mouse assembly, and the problem of the moment is sufficient illumination. Here are some more snippets from my correspondence with Mr. Putnam.


We've been scratching our heads as far as illumination goes; in your
presentation there was a picture of a mighty LED cluster. Now, from
experimentation I've ascertained that the LEDs in mice are modulated,
and that the CMOS sensor will only work with images illuminated by
modulated light. How did you add more LEDs? We've thought about just
making a cluster and soldering it to the old LED terminals, but fear
this will cause too much draw and upset the circuit. Your help is
greatly appreciated.

Oh, and what about using IR LEDs? I've seen sites on which
case-modding folks do this just to be different, but rumor also has it
that the sensor is more sensitive to infrared light.



Modulated huh? Hmmm... It never occurred to me to even consider that!

I simply found and whacked in a "mighty LED cluster" (as you so succinctly
put it!) and it worked just fine. The LED clusters are replacement tail
lights that you can get for cars and trucks. I got mine from
http://www.superbrightleds.com/. They come in various "strengths" (number of
LEDs in the array) and dispersion patterns. Getting them from this source is
also a whole lot less expensive than going to an auto parts store
(something like 1/2 to 1/3 the price for the exact same unit!).

I don't power the cluster from the mouse for the same reason you cited. I
just run it directly from the 12V supply. Even the biggest array I have
(which is the one in the photo you saw) draws very little.

Now you've got me wondering whether I got lucky with the mouse I used. I
don't recall off-hand the manufacturer of the mouse but I'm sure I can
figure it out once I get back to my office (this is my home email address).
I *do* recall that it uses an Agilent chip, but again I can't tell you right
now which one.

Perhaps the LED is not modulated on my mouse. Or perhaps the little clear
plastic piece that fits over the *other* side of the chip (from where the
optics & floor are located) supplies enough light from the board-mounted LED
(which I left in place) to satisfy the chip. I have had a feeling that the
plastic piece was some kind of a light pipe as it seems to direct some of
the LED's light onto the back side of the chip. I think you can see that
piece in one of the shots of the circuit board in the presentation.

The web site for SuperBrightLEDs has specs on their LEDs as does the Agilent
web site for their chip. I have the details at the college, but the
wavelength of the peak output of the LED array I am using is almost exactly
what is expected by the chip. As I recall it is well inside the visible
(red) spectrum.



The plot thickens!

Well, I simply guessed it may be modulated after waving my finger back
and forth and finding that if I did it at just the right speed I could
see several fingers. Ultra-scientific, I know. Maybe dragging out ye
olde O-Scope is in order.

We've modified a mouse chip by resoldering its LED to the same
terminals, but on the bottom of the chip. Our thinking is that when
the mouse detects a change it "brightens" the LED until a period of
non-change occurs. Thus we can see if the CMOS is picking up anything
at all by simply holding it at various heights without any optics on
the CMOS. By carefully adjusting the angle of the LED we've
ascertained that 4 inches (measured from the bottom of the chip to the
ground) is the highest we can go before the LED brightness "times out",
indicating the CMOS is detecting zilch. Now, we're not putting a whole
lot of stock in this, for obvious reasons.

Here's another idea: if it is modulated, connecting the LED cluster
directly isn't the only way to modulate it accordingly. Couldn't
some sort of solution be reached with a sort of "amplifier"? Maybe I'm
just being daft, but couldn't a semiconductor or two, correctly used,
modulate a larger current based on a smaller one?

Hmm, looking at your diagram in the presentation, I'm beginning to
think that you got lucky. I think a single LED is sufficient
illumination to get good data at the distance shown in your diagram.


So this is where things stand at the moment. The purpose of this thread is to have a central place for everyone working on this to talk about it. Please post anything you find that's relevant.

(Innumerable thanks to Mr. Putnam, who doesn't seem to post on CD too often. I hope he doesn't mind me quoting him! Note: these are out of chronological order, for the sake of clarity.)

NoRemorse
29-12-2004, 17:20
I'll admit that I didn't read most of that post, but... I had this idea last year. We ruled out that idea because of one main reason: the maximum speed that the mouse can measure was well under the speed of our robot. That might be something to look into if you're thinking about a mouse for navigation.

phrontist
29-12-2004, 17:22
I'll admit that I didn't read most of that post... The maximum speed that the mouse can measure was well under the speed of our robot.

If you had bothered to read Mr. Putnam's excellent presentation, you would have noticed that the problem is solved by raising the mouse off the ground and adding some optics. Furthermore, if the PAK-XI chip is allowed, it can supposedly adjust the resolution of the mouse it's connected to. The chip isn't anything special either, and any coprocessor-based solution could do this. (Some?) Mice are programmable.

jonathan lall
29-12-2004, 17:30
Well, now that it's out in the open: we've toyed with this concept for some time as a terrain-mapping navigational system, but we never got past the conceptual stages because of the specs on the sensor devices. I was pushing for some research into this the past two years on my team, actually, and we did (we believe) figure out how to get two mouse inputs into the controller (we would place one on either side of the robot and judge distance and pitch from those readings), but the main problem with this was purely technical, on the sensor side. Simply put, your robot would have to be going very slowly for the sensor to be able to position you. A robot that blasts off at, say, 10 feet per second in autonomous mode is by a large margin (I don't have the numbers offhand) too fast for such a device. The framerate is not high enough for the sensor (which is really a camera) to take two frames and compare them. Thus, no movement would be detected. Perhaps we've missed something, but our preliminary research into this led us to that conclusion.

phrontist
29-12-2004, 17:37
Well, now that it's out in the open: we've toyed with this concept for some time as a terrain-mapping navigational system... Simply put, your robot would have to be going very slowly for the sensor to be able to position you.

I'm not sure what sensor you're referring to specifically, but I'm fairly confident that a mouse can resolve at that speed accurately with the correct optics. If it's "zoomed out" by a factor of 5, wouldn't it logically follow that it's seeing 1/5 the movement, and hence its normal 12 inch-per-second max becomes a 5 foot-per-second max? I'm not saying it's a solved problem, but it's certainly not a desperate situation by any means.

Mr. Putnam is using an accelerometer to gauge odometry, and the mouse to measure rotation. I was planning on using a mouse for both, each one going through a PAK-XI and then to a serial port. This is possible through Kevin Watson's new serial port driver. This would make debugging hard, as Max Lobovsky pointed out, because you couldn't use printfs to send data back to your laptop anymore, as all the ports would be used. His idea was to multiplex the two streams...
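
For the multiplexing, something like the following framing could work. This is just a sketch: the sync byte, stream IDs, and Send_Byte() are names I made up, not anything from Kevin's driver.

    /* One possible framing for sharing a single serial line between two
       data streams: tag each record with a stream ID so the laptop can
       demultiplex.  Hypothetical sketch; the byte layout and Send_Byte()
       are assumptions, not part of Kevin's driver. */

    #define SYNC_BYTE 0xA5

    typedef enum { STREAM_MOUSE_A = 0, STREAM_MOUSE_B = 1 } stream_id;

    void Send_Byte(unsigned char b);       /* assumed UART transmit routine */

    void send_record(stream_id id, signed char dx, signed char dy)
    {
        Send_Byte(SYNC_BYTE);              /* marks the start of a record    */
        Send_Byte((unsigned char)id);      /* which mouse this record is from */
        Send_Byte((unsigned char)dx);      /* X delta since the last record  */
        Send_Byte((unsigned char)dy);      /* Y delta since the last record  */
        Send_Byte((unsigned char)(id + dx + dy));  /* crude checksum         */
    }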

Questions Questions!

phrontist
29-12-2004, 17:59
Okay, I think my plan is to go ahead and use the PAK-XIs on the off chance they'll be allowed, while also working on code (in C, INSTEAD OF THIS ASM NONSENSE :ahh: ) to replicate the function of the PAK-XI should it be banned. Does anyone have any idea which PIC would be best suited for simultaneous PS/2 and RS232 communication?

Dave Flowerday
29-12-2004, 18:16
Wildstang experimented with optical mice last year as well. I'm having a hard time telling from that long post just how far others have gotten. Has anyone actually communicated with their mouse?

Last year we were able to get an optical mouse connected to our robot controller and read X/Y positioning from the mouse chip. We did not use PS/2, however. Instead we removed the PS/2 driver chip from inside the mouse and connected the RC directly to the optical sensor chip from Agilent (BTW, Agilent is the only company making optical mouse components, so if you open one up that is what you'll find). The Agilent chip can speak a simple synchronous protocol, which is how we communicated with it. We implemented this using the normal input/output pins on the RC and bit-banging the protocol. This allows us to read out the X/Y deltas as well as obtain the actual image captured by the mouse, the surface quality measurement, and a bunch of other goodies. The chip is pretty nice because it will remember how far it's moved since the last time you queried it, so you can query it at your leisure (as long as you query often enough that the counter inside the sensor doesn't overflow).
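
In rough C, a bit-banged register read looks something like the sketch below. Treat it as a sketch only: the pin macros are platform-specific placeholders, and the edge timing, hold times and read/write bit convention should all be checked against the Agilent datasheet before trusting any of it.

    /* Minimal bit-banged read of one register from an Agilent-style
       two-wire (clock + bidirectional data) sensor interface. */

    #define SCK_HIGH()      /* drive clock pin high (platform-specific) */
    #define SCK_LOW()       /* drive clock pin low  (platform-specific) */
    #define SDIO_OUT(b)     /* drive data pin to b while set as output  */
    #define SDIO_IN() 0     /* read data pin while set as input; stub   */
    #define SDIO_DIR_OUT()  /* make data pin an output                  */
    #define SDIO_DIR_IN()   /* make data pin an input                   */
    #define DELAY_US(n)     /* busy-wait n microseconds                 */

    unsigned char sensor_read_reg(unsigned char addr)
    {
        unsigned char i, data = 0;

        SDIO_DIR_OUT();
        for (i = 0; i < 8; i++) {          /* clock out address, MSB first */
            SCK_LOW();
            SDIO_OUT((addr & 0x80) != 0);  /* MSB = 0 signals a read       */
            addr <<= 1;
            SCK_HIGH();                    /* sensor samples on rising edge */
        }

        SDIO_DIR_IN();
        DELAY_US(100);                     /* address-to-data hold time     */

        for (i = 0; i < 8; i++) {          /* clock in data, MSB first      */
            SCK_LOW();
            SCK_HIGH();
            data = (unsigned char)((data << 1) | SDIO_IN());
        }
        return data;
    }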

We also affixed a different lens to the mouse to change its field of view to accommodate larger speeds. Illumination is a problem, as was already mentioned. We fixed a ring of superbright LEDs around the lens to combat this. However, that is where the bad news starts. With all that out of the way, we mounted the modified mouse to a cart and started doing tests to see if it accurately tracked motion. We found that it did not. When the cart was moved at different speeds over the same distance, the mouse would report different amounts of measured distance. This was disappointing, because before we fitted the new lens we tested the mouse by affixing it to an X-Y table in our model shop with a very accurate readout and found that without the new lens it was good to something like a few thousandths of an inch. At this point it was getting pretty late in the season, so with the mouse concept not looking too promising we had to abandon it and concentrate on reusing our old positioning system from 2003.

Agilent's site (http://www.home.agilent.com/cgi-bin/pub/agilent/Product/cp_ProductComparison.jsp?NAV_ID=-536893734.0.00&LANGUAGE_CODE=eng&COUNTRY_CODE=US) is pretty useful, with datasheets for the various mouse sensors. If you dig around there you will find that they give you lots of info on the optics used as well as what wavelengths the sensors are most sensitive to, etc.

Also something to keep in mind is that even if you get to the point where the mouse can track movement accurately, it does not handle rotation. You'll still need a compass or something to know your heading which needs to be combined with the vector obtained from the mouse. I think you can substitute two mice on opposite sides of the robot instead of the gyro, but I haven't yet worked through the math to prove it to myself. There's some odd cases if you do a tank-style spin and such that I have to think about a little bit to see if it can still work.
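
As a starting point, the rigid-body relations would be something like this (a sketch, assuming both sensors read true ground motion in the robot's body frame and the rotation per reading is small). With the sensors a lateral distance d apart on the left and right sides, and y the forward axis:

    \Delta\theta \approx \frac{\Delta y_R - \Delta y_L}{d}, \qquad
    \Delta x_c = \frac{\Delta x_L + \Delta x_R}{2}, \qquad
    \Delta y_c = \frac{\Delta y_L + \Delta y_R}{2}

Rotate (Δx_c, Δy_c) by the accumulated heading to track field position. In a tank-style spin, Δy_R = -Δy_L, so the center displacement cancels while Δθ doesn't; that suggests the odd cases do work out, but I haven't verified it on hardware.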

phrontist
29-12-2004, 18:22
Wildstang experimented with optical mice last year as well. I'm having a hard time telling from that long post just how far others have gotten. Has anyone actually communicated with their mouse?

Craig Putnam has indeed communicated with his mouse, but using a component that may or may not be legal. The math is the only part I am sure of, and if I can interface two mice, I'll be golden. That little step is proving harder than originally thought. I'll have to look into this Agilent chip business...

Thanks for the info! :D

UPDATE: Looking at some PDFs on their site, it would seem the particular chip I'm looking at outputs PS/2 directly! Was this the case with yours?

Craig Putnam
29-12-2004, 19:39
I guess it's time for me to get back into the conversation...

I'll begin by correcting one error in the thread so far. We are using the mouse to tell us how far we have moved, not the rotation (as it is pretty insensitive to rotations about its optical axis). We are using one of the old gyro chips to measure the rotation rate and then merging the inputs from the two sensors to give us a measure of the robot's motion since the last time we looked. All of that is going into a PID feedback loop (the details of which are still being worked out). The intent is to enable the robot to accurately travel along any mathematically defined path: straight line, circular arc, spline curve, etc.

Using two mice (one on each side of the robot) is indeed another way around the rotation problem as is using one mouse at the center of the robot to measure distance traveled and a second mouse mounted at the edge of the frame to measure turning motion.

Re: the mouse not being able to track the robot's speed. By lifting the mouse board up and inserting the appropriate optics, we have effectively changed the size of the "Mickey". So instead of (for example) getting 200 Mickeys per inch, we are getting a much smaller number. While our resolution has gone down, the speed that we can track has gone up. We expect to eventually have a resolution of about 1/4 inch and be able to easily track the top speed of a FIRST robot.
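
To put numbers on the trade: if the optics demagnify the image of the floor by a factor M, then to first order

    R' = \frac{R}{M}, \qquad v'_{\max} = M \cdot v_{\max}

where R is the native resolution in Mickeys per inch and v_max the native maximum tracking speed. Going from 200 Mickeys per inch down to 4 per inch (the 1/4 inch resolution target) means M = 50; if a stock sensor tops out somewhere around 12 inches per second (a typical figure for mice of that era, not one I've measured), that's on the order of 50 feet per second trackable. Plenty of margin.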

We do indeed speak to the mouse quite well using the PAK-VIa chip. [ If you go to the Al Williams site you will see that there is now a chip specifically designed for communicating with mice - the PAK-XI. We started with the PAK-VIa chip, however, and have found that it works well enough for our needs at the moment. ] As has been pointed out, however, the use of any PAK chip may well be illegal. So I am very interested in hearing from anyone who has successfully communicated directly with either a PS/2 or USB based mouse (or directly communicated with the Agilent chip, as was noted above).

Kevin Watson
29-12-2004, 20:30
The optical mice are going to have a problem because one blurry patch of grey FIRST carpet is going to look like the next. I agree that this is a cool idea, but think it'll be hard to get it to work (but I'm willing to help nonetheless <grin>).

-Kevin

sanddrag
29-12-2004, 21:01
Thinking about this, yes optical mice are cool but what came before them? Ball mice! It may not be practical for all games (like climbing a step) but for a game like 2002's Zone Zeal, maybe you could just stick a ball and a couple wheels under the robot. Think Technokats 2003 Ball robot, but maybe 1/4-1/3 the size and instead of motors powering the little wheels there would be shaft encoders hooked to the little wheels. Yeah, that would work, at least on a flat field. Make a cradle in the center of your robot and put a ball in there with two little rollers at 90 degrees from each other contacting the ball at all times.

Dave Flowerday
29-12-2004, 21:20
UPDATE: Looking at some PDFs on their site, it would seem the particular chip I'm looking at outputs PS/2 directly! Was this the case with yours?
We've seen both types. The ones we mainly used contained an ADNS2610 chip, which only does the sync serial. The other one we saw had two interfaces: either PS/2 or good old quadrature output. If you have trouble with the PS/2 and don't want to try the sync serial to the Agilent chip, you could always use the quadrature output. That's really easy to decode.
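
If you want to roll it yourself, software quadrature decoding is just a 16-entry lookup on the previous and current channel states. A sketch (the pin-read macros are placeholders; poll faster than the fastest edge rate):

    /* The two channels form a 2-bit Gray code; each valid transition
       moves the count by +1 or -1, and illegal transitions (both bits
       changing at once) are ignored. */

    #define READ_A() 0   /* placeholder pin reads - platform-specific */
    #define READ_B() 0

    static const signed char quad_table[16] = {
         0, -1, +1,  0,
        +1,  0,  0, -1,
        -1,  0,  0, +1,
         0, +1, -1,  0
    };

    static unsigned char prev_state;
    static long position;

    void quad_poll(void)
    {
        unsigned char state = (unsigned char)((READ_A() << 1) | READ_B());
        position += quad_table[(prev_state << 2) | state];
        prev_state = state;
    }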

phrontist
29-12-2004, 21:36
Thinking about this, yes optical mice are cool but what came before them? Ball mice!... Make a cradle in the center of your robot and put a ball in there with two little rollers at 90 degrees from each other contacting the ball at all times.

Max Lobovsky said the exact same thing. But there is one problem with this! Try using a ball mouse with a carpet patch as your mouse pad for an hour and I guarantee your enthusiasm for this idea will be curbed.

Craig Putnam
29-12-2004, 21:41
The optical mice are going to have a problem because one blurry patch of grey FIRST carpet is going to look like the next. I agree that this is a cool idea, but think it'll be hard to get it to work (but I'm willing to help nonetheless <grin>).

-Kevin

By zooming out (demagnifying) with the optics we are going to see more of the surface underneath the robot. This can cut both ways - more surface may give us the chance to see more variation. On the other hand, zooming out means we will see less detail in whatever it is that we do see.

My experience testing various surfaces under my prototype optical system seemed to indicate that it will work OK. But I didn't have any of the FIRST carpet to test with, so I can't say for sure that the modified mouse will work well with that. Time will tell...

The other thing that can really help is the angle of the light. Think about how an optical mouse works. The light is shining not directly down but at a very low angle to the surface. The idea is to cast as many shadows from surface variations as possible in order to give the chip something "interesting" to look at.

phrontist
29-12-2004, 21:47
We've seen both types... If you have trouble with the PS/2 and don't want to try the sync serial to the Agilent chip, you could always use the quadrature output. That's really easy to decode.

Quadrature! Sweeeeeeeeeeeeeeeeeeeeeeet! :D :cool: :D

That's the geekiest thing I've ever said.

Kyle Fenton
29-12-2004, 21:50
Max Lobovsky said the exact same thing. But there is one problem with this! Try using a ball mouse with a carpet patch as your mouse pad for an hour and I guarantee your enthusiasm for this idea will be curbed.

Yeah, I was figuring the same thing. Just use a ball mouse. Optical mice are not meant to detect fast movements. You can prove this by moving your optical mouse very fast on the mouse pad: you will see that the cursor on the screen seems to move in random patterns.

The only problem with ball mice that I can figure is the problem with resistance, but that should be negligible.

gnormhurst
30-12-2004, 22:41
Okay, let me play devil's advocate.

Wheel encoders have been done before, and I assume they work. If a robot uses differential (two-wheel) drive and we keep track of each wheel's motion, and the tires are pneumatic and don't ever slip on the carpet, can't we measure not only x and y but rotation as well?

So my question is this: what great advantage does an optical mouse have over wheel encoders that makes us want to make this work? Other than being really cool, that is.

phrontist
30-12-2004, 22:56
Okay, let me play devil's advocate... what great advantage does an optical mouse have over wheel encoders that makes us want to make this work? Other than being really cool, that is.

Well, Astronouth and I were actually talking about this last night. The problem is, there is a ton of slippage, all the time. And even if there weren't, it's less precise in general. You can do it, a lot of people have, but optical will be much more precise and accurate. If you have free-rotating wheels, you gain some more accuracy: see "StangPS".

Optical, if you interface to the chip using quadrature, is, from a code standpoint, just as easy as using quadrature wheel encoders. Easier, in fact. The only issue is optics and illumination, both of which I am near solving.

I'm totally psyched, as I believe I'll be the first to have a working, all-optical nav system. Whether all-optical is even a good idea remains to be seen. :rolleyes:

And once this works, I've got an even cooler idea to work on, Muhahaha!

jonathan lall
30-12-2004, 23:07
Okay, let me play devil's advocate... what great advantage does an optical mouse have over wheel encoders that makes us want to make this work?

The problem is that wheel slippage always occurs, in every turn one makes (not to mention once pushing becomes a factor). Some robots with conventional swerve-style steering have a differential to minimize (but not by any means eliminate) this, but therein lies the problem. An encoder or hall effect sensor is placed somewhere on the drivetrain, which propels the robot, but does not reflect its actual movement as accurately as some would like. You are absolutely right that this usually doesn't prove troublesome to robots that steer as you describe, but they represent a minority in FIRST. This problem is especially the case with tank-style steering robots, i.e. most robots.

Enter terrain-following. Instead of looking at what the propulsion device is doing to estimate where the robot is, we are following the movement of the robot. Assuming the camera doesn't skip a beat and screw up, we get a much more accurate guidance system that opens up possibilities of pinpoint accuracy.

phrontist
30-12-2004, 23:09
The problem is that wheel slippage always occurs, in every turn one makes... Enter terrain-following. Instead of looking at what the propulsion device is doing to estimate where the robot is, we are following the movement of the robot.

What is terrain following, and how is it different than optical mouse based navigation?

jonathan lall
30-12-2004, 23:20
What is terrain following, and how is it different than optical mouse based navigation?

It's not different. Say I took some Agilent optical mouse sensor and placed it on the bottom of my robot. The camera would map the terrain over which the robot passes by comparing frame to frame. Hence the term "terrain following." This isn't to be confused with US Air Force LANTIRN technology, which is completely different... I mean more on the order of a non-INS that doesn't guesstimate position based on what the propulsion system is doing. A robot with a non-drive wheel hooked up to an encoder works by the same principle as an optical mouse, then, and could be classified as "terrain following," though it's of course not as accurate as an optical mouse.

phrontist
30-12-2004, 23:23
It's not different. Say I took some Agilent optical mouse sensor and placed it on the bottom of my robot... though it's of course not as accurate as an optical mouse.

How far along are you guys to a working system? I'm just waiting on parts :ahh:

and I think Mr. Putnam actually has something working

jonathan lall
30-12-2004, 23:30
How far along are you guys to a working system? I'm just waiting on parts :ahh:

and I think Mr. Putnam actually has something working
We have no plans for this year. Perhaps if other teams figure it out we'll brush the dust off our optical sensors. We never wanted to use a lens to zoom out the image because of a lack of interest and many other uncertainties with this technology as it applies to guidance systems (for example, whether we could program a guidance system that's appreciably better, and whether we could get the inputs into the RC at all).

Craig Putnam
31-12-2004, 09:30
How far along are you guys to a working system? I'm just waiting on parts :ahh:

and I think Mr. Putnam actually has something working

Well, there is "working" and then there is "ready to reliably run in autonomous mode match after match". We definitely have not reached the latter state yet.

We have the optical system / circuit board mount done and ready to go. We have the illumination system done and ready to go. We have a presumably illegal interface chip (PAK-VIa from Al Williams, Inc.) between the mouse and the PIC that has to go (sigh...). And we have some PID code taking inputs from the mouse and gyro that is getting there, but isn't quite ready for prime time either.

We have a prototype system using an unmodified mouse mounted to the mini robot controller. If I have to divert a lot of time into eliminating the PAK chip then I have serious doubts as to whether we will get there in time.

gnormhurst
01-01-2005, 15:34
If I have to divert a lot of time into eliminating the PAK chip then I have serious doubts as to whether we will get there in time.

I can certainly understand that.

What we need is somebody to step up and solve the direct-PIC interface (eliminate the [wonderful] PAK chip). Maybe this solution would start with Kevin's new serial port code.

Any volunteers? Or two who want to work together? I imagine Kevin would be happy to consult!

Gdeaver
01-01-2005, 19:05
The solution is easy. The mouse chip that is probably in your mouse is an Agilent 2610. The data sheet can be found at 2610 data sheet (http://cp.literature.agilent.com/litweb/pdf/5988-9774EN.pdf). The chip uses an SPI interface. Kronos Robotics has an app note on this very subject. It can be found at Kronosroboticsmouse (http://kronosrobotics.com/Anotes/AN168%20Athena%20to%20Optical%20Mouse.pdf).

From the data sheet, pin 3 is the I/O and pin 4 is the clock. You need to find the circuit board traces for these 2 pins and follow them to the USB/PS2 interface chip. After you find them, remove the interface chip and re-solder 2 wires to the circuit board as shown in the app note. The mouse I used had a different interface chip. The board was coated; I had to hold the board up to an intense light to see the traces. Next you need to cut the end off the mouse cord and solder on some pins or sockets, depending on whether you're going to use a breadboard or the robot controller. This will allow you to power your mouse and interface it to the PIC.

I used a Kronos Dios microcontroller and the code from their site to get it working. The Dios is a PIC microcontroller very similar to the PIC in our robot controller. It is programmed in BASIC. Kronos BASIC has a software function to send out synchronous serial streams that makes the SPI communications easy. They have code examples to get it working. The nice thing about the Dios BASIC language is that it has high-level functions to communicate with another micro over either RS232 or RS485. You could use the Dios chip to interface to the robot controller using Mr. Watson's serial port libraries.

The other option is to have the mouse talk directly to the robot controller. While the PIC in our RC has hardware SPI, I believe it is tied up with communications between the 2 PICs in the RC. This means the SPI would have to be done with software, or what is called bit banging. Maybe someone on the forum could reference some examples of software SPI code.

While getting the mouse to talk to the RC is doable, I think where this all will fall apart is in adapting optics and illumination to keep the mouse a couple inches off the carpet. Can an optic system be adapted to the mouse that would maintain focus as the robot bounces around? The chip is also very sensitive to the wavelength, intensity and angle of the illumination.

The mouse may also be a good replacement for an encoder. Place a wood disk on the inside of the wheel and mount the mouse as close as possible without touching. The programming could be easier as it wouldn't have to deal with interrupts. The possibility of replacing a gyro and encoder with a cheap mouse is enticing.
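
Back on the SPI side: once a bit-banged read routine exists (like the one sketched earlier in the thread), the polling loop on the RC might look like this sketch. The register addresses are from memory of the 2610 data sheet, so verify them; sensor_read_reg() is assumed, not given.

    #define REG_DELTA_Y 0x02    /* check against the ADNS-2610 data sheet */
    #define REG_DELTA_X 0x03

    extern unsigned char sensor_read_reg(unsigned char addr);

    static long total_x, total_y;   /* accumulated motion in counts */

    /* Call this every pass through the main loop, often enough that the
       chip's internal 8-bit counters can't overflow between reads. */
    void mouse_update(void)
    {
        signed char dy = (signed char)sensor_read_reg(REG_DELTA_Y);
        signed char dx = (signed char)sensor_read_reg(REG_DELTA_X);

        total_x += dx;   /* deltas are signed 8-bit two's complement */
        total_y += dy;
    }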

Kevin Watson
01-01-2005, 20:16
...This means the SPI would have to be done with software, or what is called bit banging. Maybe someone on the forum could reference some examples of software SPI code...

I wrote an example asynchronous serial transmitter a while back. It shows how to implement a serial transmitter using a timer and a state machine. It might get the creative juices flowing (hint: you'll need to tie the SPI clock line to an interrupt line and clock the SPI state machine in the ISR). The code can be found here: http://kevin.org/frc.
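
The receive half of such a state machine might look like the sketch below, assuming the peripheral drives the clock (as a PS/2 mouse does). The pin-read macro is a placeholder, and the address/transmit phase is left out:

    #define READ_SDIO() 0           /* platform-specific pin read; stub */

    static unsigned char bit_count; /* bits received so far (0..7)      */
    static unsigned char shift_reg; /* byte being assembled             */
    volatile unsigned char rx_byte; /* last complete byte               */
    volatile unsigned char rx_ready;/* flag: rx_byte is valid           */

    void sck_edge_isr(void)         /* fires on each active clock edge  */
    {
        shift_reg = (unsigned char)((shift_reg << 1) | READ_SDIO());

        if (++bit_count == 8) {     /* a full byte has arrived          */
            rx_byte = shift_reg;
            rx_ready = 1;
            bit_count = 0;
        }
    }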

-Kevin

Craig Putnam
01-01-2005, 20:37
... While getting the mouse to talk to the RC is doable, I think where this all will fall apart is in adapting optics and illumination to keep the mouse a couple inches off the carpet. Can an optic system be adapted to the mouse that would maintain focus as the robot bounces around? The chip is also very sensitive to the wavelength, intensity and angle of the illumination. ...

I actually have the optics working - at least on a test bed. Focus is a concern, but the purpose is to guide the robot during autonomous mode. I'm hopeful there won't be so much bouncing around during that time that focus would be lost to an extent that seriously hampers navigation. I may be able to get more depth of field if I stop the aperture down a bit. That means I will need more intensity, but I already have way more than I need.

The LED cluster I'm using matches the wavelength requirement of the Agilent chip almost exactly so that's not a problem. I also have the mount for the LED cluster done which will allow us to bring the light in at a fairly low angle. Not as low as it is with a mouse, but it seems to work on the test bed nicely.

I'm assuming the Kronos Dios microcontroller is available from the approved list of suppliers? I have already run afoul of last year's rule R71 with the interface chip I'm presently using (a PAK-VIa from Al Williams, Inc.). That interface chip works fine, but I can't get it from last year's approved list, so (I'm presuming) it's no good to me for use onboard a competition robot. I could hope that R71 gets revisited this year and the possible sources of electronics get opened up, but I can't count on that happening.

Thanks very much for the pointer!

Gdeaver
01-01-2005, 23:06
The Dios or Athena chip was pointed out because the example code is all there. It would be a good option for prototyping because of all the tools; the ability to capture and plot data is part of the environment. However, even though all the parts on a carrier board are available from Digi-Key, it is a special PIC. A strict interpretation of last year's rules would prohibit it. After the setup is debugged, it's just a matter of getting the software into PIC C on the robot controller. Someone out there on the forum must have played with an SPI device. We just need some sample code for bit banging.

phrontist
02-01-2005, 11:34
Hmmm...

I don't think bit banging is necessary at all. In fact, I don't plan to use interrupts at all. If you look at Dave Flowerday's post above, he links to the incredibly useful Agilent site (http://www.home.agilent.com/cgi-bin/pub/agilent/Product/cp_ProductComparison.jsp?NAV_ID=-536893734.0.00&LANGUAGE_CODE=eng&COUNTRY_CODE=US), which has information on all of their chips. Many of the chips have quadrature output, which is really, really easy to interface to! But the following issues came up in a discussion between Max and me:


The mouse would generate interrupts way too fast when the robot was at top speed, even with optics and such.
You need three optical axes, which means six pins to take quadrature input. Some of the digital inputs on the RC are grouped together, so while possibly doable, it becomes harder.


The solution is to use an accumulator chip, which will take in quadrature input and count "rotations" of the pattern, outputting the total as 8 binary outputs, which can in turn be read by the RC without issue. But no! I hear you cry... you need 8 digital inputs per axis! Well, this is where the clever bit comes in, once again courtesy of Max. The output of these chips can be turned on and off (and their counts can be reset) through one of the digital outs on the RC. So you connect all three accumulators to the same 8 pins, and then you poll them every so often (as long as it's often enough that the accumulators don't overflow). So you just turn each chip's output on in turn and read the values in!
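
In code, the polling loop might look like the sketch below. All the macro names are hypothetical stand-ins for whatever I/O the accumulator chips and RC actually provide:

    #define NUM_AXES 3

    /* Hypothetical I/O macros - the real ones depend on the chip chosen. */
    #define OUTPUT_ENABLE(i, on)   /* gate chip i's outputs onto the bus  */
    #define RESET_COUNT(i)         /* zero chip i's internal counter      */
    #define READ_PORT() 0          /* read the shared 8-bit input port    */

    static long axis_total[NUM_AXES];   /* running totals per optical axis */

    /* Poll often enough that no chip's 8-bit count can overflow. */
    void poll_accumulators(void)
    {
        unsigned char i;
        for (i = 0; i < NUM_AXES; i++) {
            OUTPUT_ENABLE(i, 1);                       /* select chip i   */
            axis_total[i] += (signed char)READ_PORT(); /* grab its count  */
            RESET_COUNT(i);                            /* restart at zero */
            OUTPUT_ENABLE(i, 0);                       /* release the bus */
        }
    }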

Pretty cool, huh?

seanwitte
02-01-2005, 11:44
I don't think bit banging is necessary at all... So you just turn each chip's output on in turn and read the values in!

Another option is to use a counter with synchronous serial output. You can tie the data and clock lines from all of the accumulators to two RC pins, then have one dedicated RC pin for the Chip Select on each accumulator. The code to shift in the bits would be easy to implement but I think Kevin Watson has already written it.

Alan Anderson
02-01-2005, 13:14
Vertical bounce is a severe problem for an "optical mouse" robot position tracking system. It's not a matter of focus, it's a matter of scale. When the distance from floor to sensor is not absolutely constant, the measured travel is not sufficiently repeatable for a reliable result.

Collaborating loosely with Wildstang, we worked on making it function well for much of last season. I eventually concluded that the distance problem is a showstopper, and gave up on it. I spent a lot of time working out the trigonometry for full position/rotation tracking using a pair of modified mice, and even implemented most of the code using CORDIC, but the system wasn't going to be physically capable of being any more useful than a more standard wheel/shaft encoder.

phrontist
02-01-2005, 13:49
Vertical bounce is a severe problem for an "optical mouse" robot position tracking system... the system wasn't going to be physically capable of being any more useful than a more standard wheel/shaft encoder.

Could you be more specific? What tolerance are we talking about here? A centimeter, a few millimeters, or more like 3 centimeters? How much distortion are we actually talking about? Thank you very much for pointing this out, and preventing us from repeating mistakes. You may have found the fatal flaw in this plan. C'est la vie.

Alan Anderson
02-01-2005, 15:17
What tolerance are we talking about here? A centimeter, a few millimeters, or more like 3 centimeters?
We used a test jig with adjustable height and a strip of carpet on a moving platform. During one trial, after I adjusted it for the best focus (as determined by the reported image quality from the sensor), the measured travel from one end of the platform's range to the other was on the order of 400 counts. As long as nothing disturbed the height of the mouse it was very repeatable, with the cumulative distance returning to zero over many cycles back and forth. However, raising the mouse by a couple of millimeters decreased the measurement to about 380 counts. Lowering it a similar amount increased it to about 440. That much variation is probably what you'd get just driving across a smooth carpet, and pretty much wipes out any chance of using the system to get an accurate reading of the robot's position on the field.

To improve things much, you'd have to employ three-dimensional imaging. I doubt there are any cheap devices available to do that.
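
To put rough numbers on that (linearizing the figures above, so treat it as an estimate):

    s \approx \frac{440 - 380}{400 \times 4\ \text{mm}} \approx 3.8\%\ \text{per mm}

so holding the scale error to even 1% means holding ride height to roughly a quarter of a millimeter, and a few percent of scale error over a 50-foot run is more than a foot of position error.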

phrontist
02-01-2005, 16:26
We used a test jig with adjustable height and a strip of carpet on a moving platform... To improve things much, you'd have to employ three-dimensional imaging. I doubt there are any cheap devices available to do that.

Could an ultrasonic or infrared range sensor be used to read height, and scale the input values accordingly?
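
The correction itself would be trivial, assuming the count scale varies roughly linearly with height over the working range (consistent with Alan's numbers, though it would need calibration). A sketch, with read_height_mm() and the nominal height as hypothetical placeholders:

    #define NOMINAL_HEIGHT_MM 50L   /* height the optics were calibrated at
                                       (made-up figure - measure your own) */

    extern long read_height_mm(void);   /* hypothetical rangefinder driver */

    /* Lowering the sensor inflates the count scale (Alan saw 440 vs 400),
       so deflate by h/h0 when low and inflate when high. */
    long corrected_delta(long raw_delta)
    {
        return (raw_delta * read_height_mm()) / NOMINAL_HEIGHT_MM;
    }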

Jizvonius
02-01-2005, 16:26
If you guys don’t mind a little mechanical system involved in your navigation, you can control the surface quality that your optical system sees and the distance to that surface.

The system requires an interface between the carpet and your optical sensors. Say, a ball with a textured surface (ball size and material must take into account carpet properties). The sensors and ball are kept a fixed distance from each other by attaching the sensors to the ball cradle. Now, to minimize slippage on the ground, have the entire apparatus slide up and down on a track with downward force provided by a constant-tension spring.

This isn't a completely thought-through design yet, and yes, it is just a higher-resolution encoder. But it is free spinning (deals with pushing and drive train slippage), much higher resolution, and with a little math you may be able to calculate yaw (depending on the placement of the ball and sensors).

It seems that we will have to deal with mechanical or optical slop either way, so instead of having your sensors losing focus, you can just have the minimized slippage of a free-spinning ball (and yell at your mechanical team if it's not working). Anyway, let me know what you guys think.

phrontist
02-01-2005, 16:32
If you guys don't mind a little mechanical system involved in your navigation, you can control the surface quality that your optical system sees and the distance to that surface... Anyway, let me know what you guys think.

Bah! If I get that desperate I'd just make a really low-friction housing for the assembly and put it in direct contact with the ground, with the ability to retract up into the robot's body. I'm nowhere near that desperate yet.

Jizvonius
02-01-2005, 16:44
Bah! If I get that desperate I'd just make a really low-friction housing for the assembly and put it in direct contact with the ground, with the ability to retract up into the robot's body. I'm nowhere near that desperate yet.


That's not a bad idea, as long as the carpet won't mess up the surface of your housing and you have a low-slung robot. But in terms of controlling the surface quality and distance to the surface, I'm not sure that it will be as effective. Either way, it might have been a joke, but don't rule it out.

Craig Putnam
02-01-2005, 17:54
... However, raising the mouse by a couple of millimeters decreased the measurement to about 380 counts. Lowering it a similar amount increased it to about 440. ...

It sounds to me as though you had very little depth of field. Think of a completely manual camera (I know - that sets a lower boundary on my age!). You have to set the f-stop & shutter speed to get a correct exposure. You can trade them off back and forth as necessary to get the effect you desire.

Want more depth of field? Then stop down the lens (increase the f-stop). Because there is less light coming through the smaller aperture, you need to more brightly illuminate your subject (if you can) or decrease the shutter speed (and risk camera shake).

I suspect I've got a much smaller aperture (higher f-stop) and therefore more depth of field than you had in your setup. Now, I don't have a *lot* of headroom vertically - perhaps +/- 5 mm at the moment, but I can increase it if I need to by stopping down further (I have more light than I need and can always add more if necessary).
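
The standard close-up formula makes the trade explicit: for magnification m, f-number N, and acceptable blur circle c, the total depth of field is approximately

    \text{DOF} \approx \frac{2\,N\,c\,(m + 1)}{m^2}

(that's the textbook approximation; I haven't re-derived it for this geometry). Depth of field grows linearly with N, while the light required grows as N squared, hence the appetite for illumination.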

phrontist
02-01-2005, 18:15
It sounds to me as though you had very little depth of field... I suspect I've got a much smaller aperture (higher f-stop) and therefore more depth of field than you had in your setup.

Mr. Putnam, you made my day! I was just thinking about this, but not being as ancient^H^H^H^H^H^H^H experienced as you I couldn't put it into words. ;)

Speaking of which, where did you get your lens(es)? I presume you are using a single biconvex lens...

I haven't talked to our optics guy in a week or so (it's winter break), so I don't know what he's got planned, but I'm curious to hear about your optics implementation.

Gdeaver
02-01-2005, 19:26
I've done some more testing today and noticed that if there are two differently textured surfaces, the mouse changes scale, and it isn't the same each time.

Today I put a stripped-down mouse assembly on a 3/4" drive shaft. It took some work to get the distance right. Wow, instant encoder, and the accuracy and repeatability look good. The really nice thing is no interrupts. I also noticed that there is a software SPI library included with PIC C. If the terrain following doesn't work out, then use it as a 400 dpi encoder.

Craig Putnam
02-01-2005, 19:35
... Speaking of which, where did you get your lens(es)? I presume you are using a single biconvex lens...

Correct - it is a single bi-convex lens that I purchased from Edmund Industrial Optics (a division of Edmund Scientific). See www.edmundoptics.com (http://www.edmundoptics.com/) for details. I don't have the records here of exactly which one it was, but as I recall there weren't all *that* many to choose from.

Conor Ryan
02-01-2005, 19:57
Has anyone ever had a steering wheel as a control system, kind of like the kind of wheel you'd use in a racing game, with a pedal for an accelerator?

Alan Anderson
02-01-2005, 22:48
It sounds to me as though you had very little depth of field...
No, as I said before, it's not a focus problem.

The "telephoto mouse" we constructed has a bit more than a centimeter of variation in range where it will still track the carpet if appropriately illuminated. But it won't track it consistently if the distance isn't kept absolutely constant.

phrontist
02-01-2005, 23:22
No, as I said before, it's not a focus problem.

The "telephoto mouse" we constructed has a bit more than a centimeter of variation in range where it will still track the carpet if appropriately illuminated. But it won't track it consistently if the distance isn't kept absolutely constant.

I'm sorry, I don't know that much about optics; please explain how this is not a focus problem. Logically, if you move the mouse up and down, it will be tracking larger and smaller patches of carpet. Why would a sudden jump in registered distance occur if it were getting a clear image? If it didn't have a large enough depth of field, a sudden jump would make sense, for that would indicate the field of view had been exceeded and the sensor was no longer in focus. Please clarify.

Max Lobovsky
02-01-2005, 23:39
Another option is to use a counter with synchronous serial output. You can tie the data and clock lines from all of the accumulators to two RC pins, then have one dedicated RC pin for the Chip Select on each accumulator. The code to shift in the bits would be easy to implement but I think Kevin Watson has already written it.
My original design was to use an 8-bit counter and shift register for each axis, but I decided that the increased complexity in circuit and software was not worth saving a few pins. (Save the pins for what, anyway?) I did look for a counter with shift register combined, and the only one I (think I) found is http://www.digikey.com/scripts/DkSearch/dksus.dll?Filter The reason I say "think" is that the datasheet is very unclear (no truth tables or anything). Additionally, it is only 4-bit, so I would either have to poll it outside the main loop using a timer interrupt or cascade it. Again, the complexity was prohibitive.

I will soon post pictures of my counters and sensors (which I think should render most of this discussion moot because of their simplicity). Everything has been tested except for the robustness of the sensors on a running robot.
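
For what it's worth, the shift-in routine for the synchronous serial approach really is only a few lines. A sketch, with stub pin macros, assuming active-low chip selects and data valid on the rising clock edge (both assumptions would need checking against the actual part):

/* Reading an accumulator with a synchronous serial output. Data and
   clock from all accumulators share two RC pins; each accumulator gets
   its own chip-select pin. Replace the stubs with real port accesses. */
#define SET_CS(pin, b)   /* stub: drive one chip-select pin */
#define SET_CLOCK(b)     /* stub: drive the shared clock pin */
#define READ_DATA()  0   /* stub: sample the shared data pin */

static unsigned char shift_in_counter(unsigned char cs_pin)
{
    unsigned char i, value = 0;

    SET_CS(cs_pin, 0);                   /* select (assumed active low) */
    for (i = 0; i < 8; i++) {
        SET_CLOCK(1);                    /* assume data valid on rising edge */
        value = (unsigned char)((value << 1) | READ_DATA());
        SET_CLOCK(0);
    }
    SET_CS(cs_pin, 1);                   /* deselect */
    return value;
}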

phrontist
03-01-2005, 00:15
My original design was to use an 8-bit counter and shift register for each axis, but I decided that the increased complexity in circuit and software was not worth saving a few pins. (Save the pins for what, anyway?) I did look for a counter with shift register combined, and the only one I (think I) found is http://www.digikey.com/scripts/DkSearch/dksus.dll?Filter The reason I say "think" is that the datasheet is very unclear (no truth tables or anything). Additionally, it is only 4-bit, so I would either have to poll it outside the main loop using a timer interrupt or cascade it. Again, the complexity was prohibitive.

I will soon post pictures of my counters and sensors (which I think should render most of this discussion moot because of their simplicity). Everything has been tested except for the robustness of the sensors on a running robot.

Just to clarify, Max is referring to omni wheels with encoders on them, not optical mice. The interfacing is the same, though.

the_undefined
04-01-2005, 21:24
Hmm, reading that, I thought the Logitech MX 1000 mouse could be very interesting for a project like this, since it's about 20x better than any optical mouse because it actually uses a laser, but I couldn't find it on the pages of the authorized resellers :( ...

For anyone who wants to look at it: Logitech MX 1000 (http://www.logitech.com/index.cfm/products/details/US/EN,CRID=2135,CONTENTID=9043)

phrontist
04-01-2005, 21:54
Hmm, reading that, I thought the Logitech MX 1000 mouse could be very interesting for a project like this, since it's about 20x better than any optical mouse because it actually uses a laser, but I couldn't find it on the pages of the authorized resellers :( ...

For anyone who wants to look at it: Logitech MX 1000 (http://www.logitech.com/index.cfm/products/details/US/EN,CRID=2135,CONTENTID=9043)

Sure you could use it! It's from the same manufacturer as all the other chips, but I don't see a datasheet for it on their site linked above. I don't think this is possible though, because you'd need to illuminate a huge area by laser standards. It's easy to illuminate a 5mm by 5mm patch, but try 6 by 6 centimeters.

UPDATE: You're in Germany? I wasn't aware that there was a team there yet... that's uber cool.

BrianJennings
05-01-2005, 00:43
Sorry to get off topic, but is this project being done by someone who does stuff at DW college? If so, just wondering, because my dad was talking about you doing it. If not, then you should collaborate with the guy who is doing it, because he seems to know a lot about it... It is posted on these forums somewhere...

the_undefined
05-01-2005, 00:52
Sure you could use it! It's from the same manufacturer as all the other chips, but I don't see a datasheet for it on their site linked above. I don't think this is possible though, because you'd need to illuminate a huge area by laser standards. It's easy to illuminate a 5mm by 5mm patch, but try 6 by 6 centimeters.

UPDATE: You're in Germany? I wasn't aware that there was a team there yet... that's uber cool.

Hmm, I'm not in Germany right now; I'm doing an exchange year in Atlanta, GA, and I joined the robotics club there, since I like programming a lot and it's not every day you can play around with PLCs and all that stuff : ). But as far as I know, Germany has one or two teams too; I'm not quite sure about that, since that's just what somebody told me o_O.

My high school is competing for the first time this year, but we've got some good support/sponsoring on our side, so I hope we can get a decent robot together : ).

You'll probably hear more from me soon, since I'll probably have some questions (especially regarding programming, etc.). So far I've just had a look at the default code from 2004, with no chance to try anything out, but it didn't seem horribly difficult to me (at least the standard functions, etc.). From tomorrow on we'll probably have a little test robot for the club which has the same chip, so I can start playing around with it, which will probably prompt the first question soon (or may not, if it's googleable : ) ...

Ok bye guys,

Felix "the_undefined" Geisendörfer

Alan Anderson
05-01-2005, 08:57
I'm sorry, I don't know that much about optics; please explain how this is not a focus problem. Logically, if you move the mouse up and down, it will be tracking larger and smaller patches of carpet. Why would a sudden jump in registered distance occur if it were getting a clear image?
The "larger and smaller patches of carpet" will appear to be moving by "smaller and larger distances" as seen by the sensor. If you travel three feet, the mouse will record a certain number of counts. If you then raise the mouse and travel the same distance, the mouse will record fewer counts.

An optical mouse works by tracking offsets in the image at the sensor. The image at the sensor is a scaled representation of the carpet being viewed. The scale factor varies with distance from the carpet. If the distance is not kept perfectly consistent, the resulting measurements are likewise not consistent.
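
To put a rough number on it (back-of-the-envelope figures, not measurements; the heights here are examples I made up): the scale error is roughly proportional to the height error, so a small wobble translates directly into a percentage error in distance.

#include <stdio.h>

/* Back-of-the-envelope: if counts-per-inch is calibrated at one height,
   a height error of dh gives a distance error of roughly dh/h (thin-lens,
   small-offset approximation). The numbers here are examples only. */
int main(void)
{
    double h   = 30.0;   /* nominal lens-to-carpet distance, mm (example) */
    double dh  = 1.0;    /* height error, mm */
    double run = 36.0;   /* distance traveled, inches */

    double scale_error = dh / h;
    printf("scale error: %.1f%%\n", scale_error * 100.0);        /* ~3.3% */
    printf("error over %.0f in: %.2f in\n", run, scale_error * run);
    return 0;
}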

gnormhurst
05-01-2005, 19:24
Devil's Advocate here again...

Is there a fundamental problem with using an optical mouse? The mouse reports a delta distance since the previous frame, as best as it can determine by performing a correlation between the current and the previous frame. But any fractional part of that measurement is not carried forward to the next measurement, and is lost.

For example, let's say you were trying to get the robot to move perfectly straight, but it was moving very slightly to the right, so slightly that the X axis change was less than one pixel of the mouse's imager per frame, say 1/3 pixel. The mouse will output zero change (because the best correlation would be "zero" offset).

On the next frame the mouse again compares the frame with the previous frame, and again the motion is so slight (1/3 pixel, not 2/3 because we only compare adjacent frame pairs) that the mouse again reports zero change. But the robot is moving steadily to the right.

Our program accumulates the deltas to measure position, but zero plus zero plus zero will be zero -- the slight drift to the right is never observed.

Compare that with a pair of wheel encoders. The difference between the two wheel encoders might be only 1/3 tick, but the error is "remembered" mechanically, so that the next time the difference is 2/3 tick, and finally the third time one side reports an extra tick compared to the other side, and the drift is detected.

The lack of memory for fractional pixel offsets seems fundamental to the optical mouse. Am I missing something? Is the resolution so high that my example would result in a trivial guidance error over the length of the field? If I knew the effective "size" of one pixel (on the carpet) I could calculate the potential error accumulation.

-Norm

phrontist
05-01-2005, 19:30
Devil's Advocate here again...

Is there a fundamental problem with using an optical mouse? The mouse reports a delta distance since the previous frame, as best as it can determine by performing a correlation between the current and the previous frame. But any fractional part of that measurement is not carried forward to the next measurement, and is lost.

For example, let's say you were trying to get the robot to move perfectly straight, but it was moving very slightly to the right, so slightly that the X axis change was less than one pixel of the mouse's imager per frame, say 1/3 pixel. The mouse will output zero change (because the best correlation would be "zero" offset).

On the next frame the mouse again compares the frame with the previous frame, and again the motion is so slight (1/3 pixel, not 2/3 because we only compare adjacent frame pairs) that the mouse again reports zero change. But the robot is moving steadily to the right.

Our program accumulates the deltas to measure position, but zero plus zero plus zero will be zero -- the slight drift to the right is never observed.

Compare that with a pair of wheel encoders. The difference between the two wheel encoders might be only 1/3 tick, but the error is "remembered" mechanically, so that the next time the difference is 2/3 tick, and finally the third time one side reports an extra tick compared to the other side, and the drift is detected.

The lack of memory for fractional pixel offsets seems fundamental to the optical mouse. Am I missing something? Is the resolution so high that my example would result in a trivial guidance error over the length of the field? If I knew the effective "size" of one pixel (on the carpet) I could calculate the potential error accumulation.

-Norm


While I can't give you an exact figure (after optical modification), I'm using a chip with 800 dpi resolution. Even if that is reduced tenfold, it's still many, many times more precise AND accurate than encoders. Your argument is not logically flawed, but this is happening on such a small scale that I think the path will actually end up straighter than with gyros or encoders. We'll see! :D
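
Some quick numbers to back that up (just the 800 cpi figure above, before and after a hypothetical tenfold loss from the new optics):

#include <stdio.h>

/* Size of one count for an 800 cpi sensor, and for the same sensor
   degraded tenfold by the refitted optics. */
int main(void)
{
    double cpi = 800.0;
    printf("one count at %4.0f cpi = %.5f in (%.3f mm)\n",
           cpi, 1.0 / cpi, 25.4 / cpi);           /* 0.00125 in, 0.032 mm */
    printf("one count at %4.0f cpi = %.5f in (%.3f mm)\n",
           cpi / 10.0, 10.0 / cpi, 254.0 / cpi);  /* 0.0125 in, 0.318 mm */
    return 0;
}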

phrontist
05-01-2005, 19:41
The "larger and smaller patches of carpet" will appear to be moving by "smaller and larger distances" as seen by the sensor. If you travel three feet, the mouse will record a certain number of counts. If you then raise the mouse and travel the same distance, the mouse will record fewer counts.

An optical mouse works by tracking offsets in the image at the sensor. The image at the sensor is a scaled representation of the carpet being viewed. The scale factor varies with distance from the carpet. If the distance is not kept perfectly consistent, the resulting measurements are likewise not consistent.

Right, but I still think this will be more accurate than anything else out there. I think it's entirely possible to keep the mouse within an 8 mm range. And how long is it really going to be out of whack? A few tenths of a second, perhaps... So let's say it misses 300 counts. That's a few inches off, and even if this happens fairly regularly, it's still dramatically better than any other robot navigation solution. It's not going to be perfect, but it will be better than it needs to be.

I think the bigger issue will be different surfaces, which will have to be accounted for by using Banner sensors to detect the changes. This will be interesting.

At the very worst, infrared and ultrasonic sensors are precise enough that they could be mounted near the optical sensor and used to scale data live. I really don't think it will come to that though.

Craig Putnam
05-01-2005, 20:13
Devil's Advocate here again...

Is there a fundamental problem with using an optical mouse? The mouse reports a delta distance since the previous frame, as best as it can determine by performing a correlation between the current and the previous frame. But any fractional part of that measurement is not carried forward to the next measurement, and is lost.

...snip...

-Norm

It certainly is a legitimate concern, but I think in the grand scheme of things it will turn out to be a non-issue.

I am *not* using the mouse to give me lateral distances (call it the x-axis offset distance) in part for the reasons you have stated. Rather, I am using a rate gyro to tell me how much rotation the robot has experienced in the most recent time interval. I use the mouse to give me just the y-axis offset, i.e., distance traveled. A little trig and a couple of approximations later I can calculate a lateral offset which *is* accumulated.
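
The update itself looks something like the following sketch. This is not my actual code, just the shape of the math; treating the heading as constant over each short interval is one of the approximations I mentioned.

#include <math.h>

/* Dead-reckoning sketch: the rate gyro supplies heading, the mouse
   supplies the distance traveled along the robot's y-axis during the
   most recent interval. */
static double heading = 0.0;  /* radians, integrated from the rate gyro */
static double pos_x = 0.0;    /* lateral offset - this IS accumulated */
static double pos_y = 0.0;    /* downfield distance */

void nav_update(double gyro_rate, double dt, double mouse_dy)
{
    heading += gyro_rate * dt;         /* integrate the gyro rate */
    pos_x += mouse_dy * sin(heading);  /* fractional drift accumulates here */
    pos_y += mouse_dy * cos(heading);
}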

The real question is: How much error will accumulate during autonomous mode using the <whatever> navigation system, and is it so much that the <whatever> navigation system is not reliable? Obviously all of these different systems have their individual strengths and weaknesses.

If during the (presumed) 30 seconds of autonomouse mode (good grief, I don't believe my fingers just typed that - but I like it!) - if during that time period the robot is only off by a couple inches then I will be ecstatic. [ I calculate that my refitted mouse should have a 1/4" resolution. We'll see if that holds up or not... ]

The wheel encoder system we used last year had about a 6 inch resolution. I know that much greater resolution is possible using wheel encoders, but that's what we had. So if the optical mouse works better, then great. If it doesn't work better, then maybe we all learned something.

lynca
01-03-2005, 23:24
I have been performing an experiment to validate optical mouse sensing with off-the-shelf robotic components.

My work started with writing a custom PS/2 driver for an HCS12 microcontroller from Motorola. Basically, using a standard SPI interface, I was able to communicate with the mouse successfully.
http://iarc1.ece.utexas.edu/~lynca/multimedia/FIRST/SPI_oscopeTests/

The code released is not meant for a PIC, but it can serve as a guide for implementing communication between a mouse and an SPI port.
Contact me by email about the project, which is still under development:
http://iarc1.ece.utexas.edu/~lynca/multimedia/FIRST/src/

My solution was not simple or easily transferable, so I was determined to find something better for the robotics community.
I discovered Al Williams, here near my area, who had implemented a much more elegant PS/2 driver on a chip he calls the PAK-XI (earlier version: the PAK-VI).
http://www.awce.com/pak6.htm

Testing pictures, http://iarc1.ece.utexas.edu/~lynca/multimedia/FIRST/PAKXI/

I also tested the GP5 (above) with great success. I have misplaced my HyperTerminal outputs, but email me and I'll provide picture evidence if requested.
http://www.awce.com/gp5.htm

In all, the drivers can be implemented with any PIC (including the RC), but an offboard processor streaming x and y as serial port commands is a very quick solution (GP5 through the programming port). I have a Microchip project which does so, but the code is messy at the moment; please email me a code request if curious.


Thanks for your time,

~Andrew Lynch
For an in-depth look at a PS/2 driver for a PIC:
http://www.computer-engineering.org/
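
Once a driver hands you bytes, the movement packet itself is easy to decode. A sketch of the standard 3-byte PS/2 mouse packet (bit layout per the protocol documented at the link above):

/* Decode a standard 3-byte PS/2 mouse movement packet. Byte 0 carries
   buttons (bits 0-2), the X/Y sign bits (4, 5), and overflow flags
   (6, 7); bytes 1 and 2 are the X and Y movements, which combine with
   the sign bits into 9-bit two's-complement deltas. */
typedef struct {
    int dx, dy;
    int overflow;  /* nonzero if the mouse moved too fast to track */
} mouse_delta;

mouse_delta decode_packet(unsigned char b0, unsigned char b1, unsigned char b2)
{
    mouse_delta d;
    d.dx = (int)b1 - ((b0 & 0x10) ? 256 : 0);  /* apply X sign bit */
    d.dy = (int)b2 - ((b0 & 0x20) ? 256 : 0);  /* apply Y sign bit */
    d.overflow = ((b0 & 0xC0) != 0);
    return d;
}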

russell
14-03-2005, 19:33
Well, with my season over, I am now working on some non-FIRST-related robotics projects. The first (and more relevant to this discussion) of these is trying to build a robot that can navigate our school and build a sonar map, which it can use for plotting routes between its location and its destination. The robot will be based around a PC running Linux. Right now I am just toying with the idea of trying to use an optical mouse for measuring travel.

As far as mounting height goes, why don't you just use a regular optical mouse mounted on one of the skinny pneumatic rams, with a regulator regulating the pressure down to, say, 10 psi? Then you mount the ram so that it is not fully extended when the mouse is on the ground, so the mouse will always be gently pressed against the ground. Then if you encounter terrain that is unsuitable for the mouse (put a sonar rangefinder pointing at the ground in front of the mouse to detect bumps), you simply retract the ram until you are back on flat ground. Just a thought.

Alan Anderson
14-03-2005, 21:32
As far as mounting height goes, why don't you just use a regular optical mouse mounted on one of the skinny pneumatic rams, with a regulator regulating the pressure down to, say, 10 psi? Then you mount the ram so that it is not fully extended when the mouse is on the ground, so the mouse will always be gently pressed against the ground...
If you're going very slowly, that might work. The problem is that an optical mouse will lose tracking completely if it moves too fast. One reason for considering a "telephoto mouse" mounted high is to trade resolution for maximum speed.
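
Rough numbers on the tradeoff, assuming a stock sensor rated somewhere around 14 inches per second (that is the figure I recall for the Agilent parts of this era; verify it against the datasheet):

#include <stdio.h>

/* Raising the mouse by an optical scale factor k makes each pixel see
   k times more carpet, multiplying the maximum trackable speed by k
   and dividing the resolution by k. The 14 in/s rating is assumed. */
int main(void)
{
    double v_max  = 14.0;   /* stock max tracking speed, in/s (assumed) */
    double cpi    = 800.0;  /* stock resolution, counts per inch */
    double target = 36.0;   /* desired robot speed: 3 ft/s */

    double k = target / v_max;
    printf("scale factor needed: %.1fx\n", k);            /* ~2.6x */
    printf("effective resolution: %.0f cpi\n", cpi / k);  /* ~311 cpi */
    return 0;
}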

russell
14-03-2005, 21:48
Oh, I get it now. Would a regular mouse work at around 3 feet per second? It doesn't seem like much, but now that I think about it, I am sure I don't usually move my mouse that fast...