Optical Mouse Navigation
A little after the 2004 build season ended, I started thinking about new ideas for this year's season, particularly autonomous mode. I was poking around CD when I saw a post by Craig Putnam in which he mentioned he was working on an optical-mouse-based navigation system. What a great idea, I thought! I thought about it periodically during the off season, and eventually I came up with how I wanted to do this. I had almost everything figured out except how to interface the mouse, or mice, to the RC. So I emailed Craig Putnam, and he explained his progress on making such a system. Much to my delight, I found that much of what he had done was what I was planning to do, so I couldn't be too far off track. He also mentioned something mind-boggling...
But before I talk about that I'd like to mention something else. When I first got this piece of information from Mr. Putnam a few days ago, I didn't tell anyone and hoped he wouldn't either. This was pretty stupid for a variety of reasons, which don't need to be enumerated. Anyway, in the past few days I've seen several posts here that are dancing around the subject, namely the following:
I realized that a lot of people are now working on optical mouse navigation, and it wouldn't be fair not to share everything. So then... There is a chip that converts PS/2 to RS-232, made specifically for interfacing PICs to mice! It's from Al Williams and it's called the PAK-XI. It slices, it dices, it makes a mean Crème Brûlée! Mr. Putnam, a mentor from team 42, P.A.R.T.S., tipped me off to it after I emailed him about it. But wait, there's more! It's also cheap! Hey! So what's the catch? Well, it may or may not be legal. Here are a few snippets from my correspondence with Mr. Putnam: Quote:
Quote:
Quote:
Quote:
Quote:
I'm currently building a prototype mouse assembly, and the problem of the moment is sufficient illumination. Here are some more snippets from my correspondence with Mr. Putnam. Quote:
Quote:
Quote:
(Innumerable thanks to Mr. Putnam, who doesn't seem to post on CD too often. I hope he doesn't mind me quoting him! Note: These are out of chronological order, for the sake of clarity) |
Re: Optical Mouse Navigation
I'll admit that I didn't read much of that post, but... I had this idea last year. We ruled out the idea for one main reason: the maximum speed the mouse can measure was well under the speed of our robot. That might be something to look into if you're thinking about a mouse for navigation.
|
Re: Optical Mouse Navigation
Quote:
|
Re: Optical Mouse Navigation
Well, now that it's out in the open: we've toyed with this concept for some time as a terrain-mapping navigational system, but we never got past the conceptual stages because of the specs on the sensor devices. I was pushing for some research into this the past two years on my team, and we did (we believe) figure out how to get two mouse inputs into the controller (we would place one on either side of the robot and judge distance and pitch from those readings), but the main problem was purely technical on the sensor side. Simply put, your robot would have to be going very slowly for the sensor to be able to position you. A robot that blasts off at, say, 10 feet per second in autonomous mode is by a large margin (I don't have the numbers offhand) too fast for such a device. The frame rate is not high enough for the sensor (which is really a camera) to take two frames and compare them. Thus, no movement would be detected. Perhaps we've missed something, but our preliminary research into this led us to that conclusion.
|
Re: Optical Mouse Navigation
Quote:
Mr. Putnam is using an accelerometer to gauge odometry, and the mouse to measure rotation. I was planning on using a mouse for both, each one going through a PAK-XI and then to a serial port. This is possible through Kevin Watson's new serial port driver. This would make debugging hard, as Max Lobovsky pointed out, because you couldn't use printf's to send data back to your laptop anymore, as all the ports would be used. His idea was to multiplex the two streams... Questions, questions! |
Re: Optical Mouse Navigation
Okay, I think my plan is to go ahead and use the PAK-XIs on the off chance they'll be allowed, while also working on code (in C, INSTEAD OF THIS ASM NONSENSE :ahh: ) to replicate the function of the PAK-XI should it be banned. Does anyone have any idea which PIC would be best suited for simultaneous PS/2 and RS-232 communication?
|
Re: Optical Mouse Navigation
Wildstang experimented with optical mice last year as well. I'm having a hard time telling from that long post just how far others have gotten. Has anyone actually communicated with their mouse?
Last year we were able to get an optical mouse connected to our robot controller and read X/Y positioning from the mouse chip. We did not use PS/2, however. Instead we removed the PS/2 driver chip from inside the mouse and connected the RC directly to the optical sensor chip from Agilent (BTW, Agilent is the only company making optical mice components, so if you open one up that is what you'll find). The Agilent chip can speak in a simple synchronous protocol which is how we communicated with it. We implemented this using the normal input/output pins on the RC and bit-banging the protocol. This allows us to read out the X/Y deltas as well as obtain the actual image captured by the mouse, the surface quality measurement, and a bunch of other goodies. The chip is pretty nice because it will remember how far it's moved since the last time you queried it, so you can query it at your leisure (as long as it's fast enough that the counter inside the sensor doesn't overflow). We also affixed a different lens to the mouse to change its field of vision to accommodate larger speeds. Illumination is a problem, as was already mentioned. We fixed a ring of superbright LEDs around the lens to combat this. However, that is where the bad news starts. With all that out of the way, we mounted the modified mouse to a cart and started doing tests to see if it accurately tracked motion. We found that it did not. When the cart was moved at different speeds over the same distance, the mouse would report different amounts of measured distance. This was disappointing, because before we fitted the new lens we tested the mouse by affixing it to an X-Y table in our model shop with a very accurate readout and found that without the new lens it was good to something like a few thousandths of an inch. 
At this point it was getting pretty late in the season so with the mouse concept not looking too promising we had to abandon it and concentrate on reusing our old positioning system that we used in 2003. Agilent's site is pretty useful, with datasheets for the various mouse sensors. If you dig around there you will find that they give you lots of info on the optics used as well as what wavelengths the sensors are most sensitive to, etc. Also something to keep in mind is that even if you get to the point where the mouse can track movement accurately, it does not handle rotation. You'll still need a compass or something to know your heading which needs to be combined with the vector obtained from the mouse. I think you can substitute two mice on opposite sides of the robot instead of the gyro, but I haven't yet worked through the math to prove it to myself. There's some odd cases if you do a tank-style spin and such that I have to think about a little bit to see if it can still work. |
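As a sketch of the two-mouse idea above (the mounting geometry is an assumption: one mouse on each side of the robot, each reading its forward y-axis delta, separated by a known track width): the average of the two readings gives forward travel, and their difference divided by the track width gives the heading change, so a tank-style spin in place falls out as zero forward travel with a nonzero rotation.

```python
def two_mouse_update(d_left, d_right, track_width):
    """Fuse the forward (y-axis) deltas of two mice mounted on opposite
    sides of the robot into forward travel and heading change (radians)."""
    forward = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / track_width  # small-angle model
    return forward, d_theta

# Tank-style spin in place: the mice move equal amounts in opposite
# directions, so forward travel is zero and only the heading changes.
forward, d_theta = two_mouse_update(-1.0, 1.0, 24.0)
```

Whether the small-angle model holds up during a fast spin is exactly the odd case worried about above; it would need testing on a real chassis.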
Re: Optical Mouse Navigation
Quote:
Thanks for the info! :D UPDATE: Looking at some PDFs on their site, it would seem the particular chip I'm looking at outputs PS/2 directly! Was this the case with yours? |
Re: Optical Mouse Navigation
I guess it's time for me to get back into the conversation...
I'll begin by correcting one error in the thread so far. We are using the mouse to tell us how far we have moved, not the rotation (as it is pretty insensitive to rotations about its optical axis). We are using one of the old gyro chips to measure the rotation rate and then merging the inputs from the two sensors to give us a measure of the robot's motion since the last time we looked. All of that is going into a PID feedback loop (the details of which are still being worked out). The intent is to enable the robot to accurately travel along any mathematically defined path: straight line, circular arc, spline curve, etc. Using two mice (one on each side of the robot) is indeed another way around the rotation problem, as is using one mouse at the center of the robot to measure distance traveled and a second mouse mounted at the edge of the frame to measure turning motion. Re: the mouse not being able to track the robot's speed. By lifting the mouse board up and inserting the appropriate optics, we have effectively changed the size of the "Mickey". So instead of (for example) getting 200 Mickeys per inch, we are getting a much smaller number. While our resolution has gone down, the speed that we can track has gone up. We expect to eventually have a resolution of about 1/4 inch and be able to easily track the top speed of a FIRST robot. We do indeed speak to the mouse quite well using the PAK-VIa chip. [ If you go to the Al Williams site you will see that there is now a chip specifically designed for communicating with mice - the PAK-XI. We started with the PAK-VIa chip, however, and have found that it works well enough for our needs at the moment. ] As has been pointed out, however, the use of any PAK chip may well be illegal. So I am very interested in hearing from anyone who has successfully communicated directly with either a PS/2 or USB based mouse (or directly communicated with the Agilent chip as was noted above). |
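To put rough numbers on the resolution-for-speed trade described above: the sensor can only track while consecutive frames still overlap, so the top trackable speed scales with frame rate and inversely with counts ("Mickeys") per inch. The frame rate and per-frame tracking budget below are illustrative guesses, not datasheet figures; only the 200-counts-per-inch and 1/4-inch examples come from the post.

```python
def max_trackable_speed(counts_per_inch, frames_per_sec, max_counts_per_frame):
    """Inches per second the sensor can follow before consecutive frames
    stop overlapping enough to correlate. Inputs are rough assumptions."""
    return max_counts_per_frame * frames_per_sec / counts_per_inch

# Stock optics, 200 counts/inch (assumed 1500 frames/s, 1 count/frame budget):
stock = max_trackable_speed(200, 1500, 1)   # 7.5 in/s - far too slow
# Refit optics for ~1/4" resolution (4 counts/inch): plenty of headroom.
refit = max_trackable_speed(4, 1500, 1)     # 375 in/s, about 31 ft/s
```

The same arithmetic explains the earlier objections in the thread: at stock resolution the math says a mouse tops out well under walking speed.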
Re: Optical Mouse Navigation
The optical mice are going to have a problem because one blurry patch of grey FIRST carpet is going to look like the next. I agree that this is a cool idea, but think it'll be hard to get it to work (but I'm willing to help nonetheless <grin>).
-Kevin |
Re: Optical Mouse Navigation
Thinking about this: yes, optical mice are cool, but what came before them? Ball mice! It may not be practical for all games (like climbing a step), but for a game like 2002's Zone Zeal, maybe you could just stick a ball and a couple of wheels under the robot. Think Technokats 2003 ball robot, but maybe 1/4-1/3 the size, and instead of motors powering the little wheels, there would be shaft encoders hooked to them. Yeah, that would work, at least on a flat field. Make a cradle in the center of your robot and put a ball in there with two little rollers at 90 degrees from each other contacting the ball at all times.
|
Re: Optical Mouse Navigation
Quote:
|
Re: Optical Mouse Navigation
Quote:
|
Re: Optical Mouse Navigation
Quote:
My experience with testing various surfaces under my prototype optical system seemed to support that it will work OK. But I didn't have any of the FIRST carpet to test with so I can't say for sure that the modified mouse will work well with that. Time will tell... The other thing that can really help is the angle of the light. Think about how an optical mouse works. The light is shining not directly down but at a very low angle to the surface. The idea is to cast as many shadows from surface variations as possible in order to give the chip something "interesting" to look at. |
Re: Optical Mouse Navigation
Quote:
That's the geekiest thing I've ever said. |
Re: Optical Mouse Navigation
Quote:
The only problem with ball mice that I can figure is rolling resistance, but that should be negligible.
Re: Optical Mouse Navigation
Okay, let me play devil's advocate.
Wheel encoders have been done before, and I assume they work. If a robot uses differential (two-wheel) drive and we keep track of each wheel's motion, and the tires are pneumatic and don't ever slip on the carpet, can't we measure not only x and y but rotation as well? So my question is this: what great advantage does an optical mouse have over wheel encoders that makes us want to make this work? Other than being really cool, that is. |
Re: Optical Mouse Navigation
Quote:
Optical, if you interface to the chip using quadrature, is, from a code standpoint, just as easy as using quadrature wheel encoders. Easier, in fact. The only issues are optics and illumination, both of which I am near solving. I'm totally psyched, as I believe I'll be the first to have a working, all-optical nav system. Whether all-optical is even a good idea remains to be seen. :rolleyes: And once this works, I've got an even cooler idea to work on. Muhahaha! |
Re: Optical Mouse Navigation
Quote:
Enter terrain-following. Instead of looking at what the propulsion device is doing to estimate where the robot is, we are following the movement of the robot. Assuming the camera doesn't skip a beat and screw up, we get a much more accurate guidance system that opens up possibilities of pinpoint accuracy. |
Re: Optical Mouse Navigation
Quote:
|
Re: Optical Mouse Navigation
Quote:
|
Re: Optical Mouse Navigation
Quote:
and I think Mr. Putnam actually has something working |
Re: Optical Mouse Navigation
Quote:
|
Re: Optical Mouse Navigation
Quote:
We have the optical system / circuit board mount done and ready to go. We have the illumination system done and ready to go. We have a presumably illegal interface chip (PAK-VIa from Al Williams, Inc.) between the mouse and the PIC that has to go (sigh...). And we have some PID code taking inputs from the mouse and gyro that is getting there, but isn't quite ready for prime time either. We have a prototype system using an unmodified mouse mounted to the mini robot controller. If I have to divert a lot of time into eliminating the PAK chip then I have serious doubts as to whether we will get there in time. |
Re: Optical Mouse Navigation
Quote:
What we need is somebody to step up and solve the direct-PIC interface (eliminate the [wonderful] PAK chip). Maybe this solution would start with Kevin's new serial port code. Any volunteers? Or two who want to work together? I imagine Kevin would be happy to consult! |
Re: Optical Mouse Navigation
The solution is easy. The mouse chip that is probably in your mouse is an Agilent 2610. The data sheet can be found via the 2610 data sheet link. The chip uses an SPI interface. Kronos Robotics has an app note on this very subject; it can be found via the Kronos Robotics mouse link. From the data sheet, pin 3 is the I/O and pin 4 is the clock. You need to find the circuit board traces for these two pins and follow them to the USB/PS2 interface chip. After you find them, remove the interface chip and re-solder two wires to the circuit board as shown in the app note. The mouse I used had a different interface chip, and the board was coated; I had to hold the board up to an intense light to see the traces. Next you need to cut the end off the mouse cord and solder on some pins or sockets, depending on whether you're going to use a breadboard or the robot controller. This will allow you to power your mouse and interface it to the PIC. I used a Kronos Dios microcontroller and the code from their site to get it working. The Dios is a PIC microcontroller very similar to the PIC in our robot controller. It is programmed in BASIC. Kronos BASIC has a software function to send out synchronous serial streams that makes the SPI communications easy, and they have code examples to get it working. The nice thing about the Dios BASIC language is that it has high-level functions to communicate with another micro over either RS-232 or RS-485. You could use the Dios chip to interface to the robot controller using Mr. Watson's serial port libraries. The other option is to have the mouse talk directly to the robot controller. While the PIC in our RC has hardware SPI, I believe it is tied up with communications between the two PICs in the RC. This means the SPI would have to be done in software, or what is called bit banging. Maybe someone on the forum could reference some examples of software SPI code.
While getting the mouse to talk to the RC is doable, I think where this all will fall apart is in adapting the optics and illumination to keep the mouse working a couple of inches off the carpet. Can an optical system be adapted to the mouse that would maintain focus as the robot bounces around? The chip is also very sensitive to the wavelength, intensity, and angle of the illumination. The mouse may be a good replacement for an encoder: place a wood disk on the inside of the wheel and mount the mouse as close as possible without touching. The programming could be easier since it wouldn't have to deal with interrupts. The possibility of replacing a gyro and encoder with a cheap mouse is enticing.
|
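Here is a sketch of the software SPI (bit-banging) structure being asked for above, in Python rather than PIC C so the logic can be shown self-contained. The pin functions are stand-ins for whatever port access the real controller provides, the fake sensor exists only so the loop can be exercised off-hardware, and the actual register addresses and hold times must come from the Agilent data sheet.

```python
def spi_read_byte(set_clock, read_data):
    """Bit-bang one byte from the sensor, MSB first: lower the clock,
    let the sensor drive the data line, sample it, raise the clock."""
    value = 0
    for _ in range(8):
        set_clock(0)               # falling edge: sensor shifts out next bit
        bit = read_data()          # sample the data pin
        set_clock(1)               # rising edge
        value = (value << 1) | bit
    return value

class FakeSensor:
    """Stand-in for the mouse sensor's shift register (illustration only)."""
    def __init__(self, byte):
        self.byte, self.pos = byte, 0
    def set_clock(self, level):
        pass                       # a real pin write would go here
    def read_data(self):
        bit = (self.byte >> (7 - self.pos)) & 1
        self.pos += 1
        return bit
```

On the RC this would drive two digital I/O pins; a real delta-X read would first clock the register address out to the sensor in the same bit-by-bit fashion, then call spi_read_byte after the delay the data sheet specifies.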
Re: Optical Mouse Navigation
Quote:
-Kevin |
Re: Optical Mouse Navigation
Quote:
The LED cluster I'm using matches the wavelength requirement of the Agilent chip almost exactly so that's not a problem. I also have the mount for the LED cluster done which will allow us to bring the light in at a fairly low angle. Not as low as it is with a mouse, but it seems to work on the test bed nicely. I'm assuming the Kronos Dios microcontroller is available from the approved list of suppliers? I have already run afoul of last year's rule R71 with the interface chip I'm presently using (a PAK-VIa from Al Williams, Inc.). That interface chip works fine, but I can't get it from last year's approved list, so (I'm presuming) it's no good to me for use onboard a competition robot. I could hope that R71 gets revisited this year and the possible sources of electronics get opened up, but I can't count on that happening. Thanks very much for the pointer! |
Re: Optical Mouse Navigation
The Dios or Athena chip was pointed out because the example code is all there. It would be a good option for prototyping because of all the tools; the ability to capture and plot data is part of the environment. However, even though all the parts on a carrier board are available from Digi-Key, it is a special PIC, and a strict interpretation of last year's rules would prohibit it. After the setup is debugged, it's just a matter of getting the software into PIC C on the robot controller. Someone out there on the forum must have played with an SPI device.
We just need some sample code for bit banging. |
Re: Optical Mouse Navigation
Hmmm...
I don't think bit banging is necessary at all. In fact, I don't plan to use interrupts at all. If you look at David Flowerday's post above, he links to the incredibly useful Agilent site, which has information on all of their chips. Many of the chips have quadrature output, which is really, really easy to interface to! But the following issues came up in a discussion between Max and me:
The solution is to use an accumulator chip, which will take in quadrature input and count "rotations" of the pattern, outputting the total as 8 binary outputs, which can in turn be read by the RC without issue. But no! I hear you cry... you need 8 digital inputs per axis! Well, this is where the clever bit comes in, once again courtesy of Max. The output of these chips can be turned on and off (and their counts can be reset) through one of the digital outs on the RC. So you connect all three accumulators to the same 8 pins, and then you poll them every so often (as long as it's fast enough that the accumulators don't overflow). So you just turn each chip's output on in turn and read the values in! Pretty cool, huh? |
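The shared-bus polling trick above can be mocked up in a few lines. The accumulator chip here is hypothetical - a real part would have tri-state outputs and a count-reset pin - but the enable/read/release sequence is the scheme described.

```python
class FakeAccumulator:
    """Hypothetical quadrature-count accumulator with an 8-bit
    tri-state output, for illustrating the shared-bus scheme."""
    def __init__(self, count):
        self.count = count & 0xFF  # 8-bit counter: wraps on overflow
        self.enabled = False
    def bus_value(self):
        return self.count if self.enabled else None  # None = high-impedance

def poll_all(chips):
    """Enable each chip onto the shared 8 input pins in turn and read it."""
    readings = []
    for chip in chips:
        chip.enabled = True        # output-enable via an RC digital out
        readings.append(chip.bus_value())
        chip.enabled = False       # release the bus for the next chip
    return readings
```

Note the wrap in the constructor: polling has to run faster than any chip can count 256 edges, or counts are silently lost, which is the overflow caveat in the post.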
Re: Optical Mouse Navigation
Quote:
|
Re: Optical Mouse Navigation
Vertical bounce is a severe problem for an "optical mouse" robot position tracking system. It's not a matter of focus, it's a matter of scale. When the distance from floor to sensor is not absolutely constant, the measured travel is not sufficiently repeatable for a reliable result.
Collaborating loosely with Wildstang, we worked on making it function well for much of last season. I eventually concluded that the distance problem is a showstopper, and gave up on it. I spent a lot of time working out the trigonometry for full position/rotation tracking using a pair of modified mice, and even implemented most of the code using CORDIC, but the system wasn't going to be physically capable of being any more useful than a more standard wheel/shaft encoder. |
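CORDIC, mentioned above, is attractive on a PIC because it computes rotations with only shifts, adds, and a small table of arctangents - no hardware multiply needed. A floating-point illustration of the idea (a real PIC version would use fixed-point integers, where the 2**-i factors become right-shifts):

```python
import math

# Precomputed arctangents of 2**-i and the CORDIC gain for 16 iterations.
ANGLES = [math.atan(2.0 ** -i) for i in range(16)]
GAIN = 1.0
for a in ANGLES:
    GAIN *= math.cos(a)

def cordic_rotate(x, y, angle):
    """Rotate (x, y) by angle (radians) using only add/subtract and
    halving steps; the constant scale factor is folded in at the end."""
    for i, a in enumerate(ANGLES):
        d = 1.0 if angle >= 0 else -1.0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        angle -= d * a
    return x * GAIN, y * GAIN
```

Each iteration adds roughly one bit of angular precision, so 16 iterations is far more than position tracking on a carpet would ever need.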
Re: Optical Mouse Navigation
Quote:
|
Re: Optical Mouse Navigation
Quote:
To improve things much, you'd have to employ three-dimensional imaging. I doubt there are any cheap devices available to do that. |
Re: Optical Mouse Navigation
Quote:
|
Re: Optical Mouse Navigation
If you guys don’t mind a little mechanical system involved in your navigation, you can control the surface quality that your optical system sees and the distance to that surface.
The system requires an interface between the carpet and your optical sensors - say, a ball with a textured surface (ball size and material must take into account carpet properties). The sensors and ball are kept a fixed distance from each other by attaching the sensors to the ball cradle. Now, to minimize slippage on the ground, have the entire apparatus slide up and down on a track with downward force provided by a constant-tension spring. This isn't a completely thought-through design yet, and yes, it is just a higher resolution encoder. But it is free-spinning (which deals with pushing and drive train slippage), much higher resolution, and with a little math you may be able to calculate yaw (depending on the placement of the ball and sensors). It seems that we will have to deal with mechanical or optical slop either way, so instead of having your sensors lose focus, you can have the minimized slippage of a free-spinning ball (and yell at your mechanical team if it's not working). Anyway, let me know what you guys think. |
Re: Optical Mouse Navigation
Quote:
|
Re: Optical Mouse Navigation
Quote:
That's not a bad idea, as long as the carpet won't mess up the surface of your housing and you have a low-slung robot. But in terms of controlling the surface quality and distance to the surface, I'm not sure it will be as effective. Either way, it might have been a joke, but don't rule it out. |
Re: Optical Mouse Navigation
Quote:
Want more depth of field? Then stop down the lens (increase the f-stop). Because there is less light coming through the smaller aperture, you need to illuminate your subject more brightly (if you can) or decrease the shutter speed (and risk camera shake). I suspect I've got a much smaller aperture (higher f-stop), and therefore more depth of field, than you had in your setup. Now, I don't have a *lot* of headroom vertically - perhaps +/- 5 mm at the moment - but I can increase it if I need to by stopping down further (I have more light than I need and can always add more if necessary). |
Re: Optical Mouse Navigation
Quote:
Speaking of which, where did you get your lens(es)? I presume you are using a single biconvex lens... I haven't talked to our optics guy in a week or so (it's winter break), so I don't know what he's got planned, but I'm curious to hear about your optics implementation. |
Re: Optical Mouse Navigation
I've done some more testing today and noticed that if there are two surfaces with different textures, the mouse changes scale, and it isn't the same each time.
Today I put a stripped-down mouse assembly on a 3/4" drive shaft. It took some work to get the distance right. Wow - instant encoder, and the accuracy and repeatability look good. The really nice thing is no interrupts. I also noticed that there is a software SPI library included with PIC C. If the terrain following doesn't work out, then use it as a 400 dpi encoder. |
Re: Optical Mouse Navigation
Quote:
|
Re: Optical Mouse Navigation
Has anyone ever used a steering wheel as a control system - kind of like the wheel you'd use in a racing game - with a pedal for an accelerator?
|
Re: Optical Mouse Navigation
Quote:
The "telephoto mouse" we constructed has a bit more than a centimeter of variation in range where it will still track the carpet if appropriately illuminated. But it won't track it consistently if the distance isn't kept absolutely constant. |
Re: Optical Mouse Navigation
Quote:
|
Re: Optical Mouse Navigation
Quote:
|
Re: Optical Mouse Navigation
Quote:
I will soon post pictures of my counters and sensors (which I think should render much of this discussion moot because of their simplicity). Everything has been tested except the robustness of the sensors on a running robot. |
Re: Optical Mouse Navigation
Quote:
|
Re: Optical Mouse Navigation
Hmm, reading that, I thought the Logitech MX 1000 mouse could be very interesting for a project like this, since it's about 20x better than any optical mouse because it actually uses a laser. But I couldn't find it on the pages of the authorized resellers :( ...
For anyone who wants to look at it: Logitech MX 1000 |
Re: Optical Mouse Navigation
Quote:
UPDATE: You're in Germany? I wasn't aware that there was a team there yet... that's uber cool. |
Re: Optical Mouse Navigation
Sorry to get off topic, but is this project being done by someone who does stuff at DW college? If so, just wondering, because my dad was talking about you doing it. If not, then you should collaborate with the guy who is doing it, because he seems to know a lot about it... It is posted on these forums somewhere...
|
Re: Optical Mouse Navigation
Quote:
My high school is competing for the first time this year, but we have some good support/sponsoring lined up, so I hope we can get a decent robot together : ). You'll probably hear more from me soon, since I'll probably have some questions (especially regarding programming, etc.). I just had a look at the default code from 2004; so far I've had no chance to try anything out, but it didn't seem horribly difficult to me (at least the standard functions, etc.). From tomorrow on we'll probably have a little test robot for the club, which has the same chip, so I can start playing around with it, which will probably prompt the first questions soon (or may not, if they're googleable : ) ... Ok, bye guys, Felix "the_undefined" Geisendörfer |
Re: Optical Mouse Navigation
Quote:
An optical mouse works by tracking offsets in the image at the sensor. The image at the sensor is a scaled representation of the carpet being viewed. The scale factor varies with distance from the carpet. If the distance is not kept perfectly consistent, the resulting measurements are likewise not consistent. |
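The scale dependence described here is just similar triangles: to a first (thin-lens, made-up-numbers) approximation, the reported travel scales with the ratio of the calibration height to the actual height, so a small bounce produces a proportional error:

```python
def reported_travel(actual_inches, calibrated_height, actual_height):
    """First-order model: counts-per-inch was calibrated at one height,
    so reported travel scales with calibrated/actual height."""
    return actual_inches * calibrated_height / actual_height

# A 10% bounce (2.0" calibrated vs 2.2" actual) skews a 12" move by ~1.1":
error = 12.0 - reported_travel(12.0, 2.0, 2.2)
```

Over a 30-foot autonomous run, errors of this size accumulate quickly, which is why the height has to be held essentially constant.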
Accumulation of small deltas?
Devil's Advocate here again...
Is there a fundamental problem with using an optical mouse? The mouse reports a delta distance since the previous frame, as best as it can determine by performing a correlation between the current and the previous frame. But any fractional part of that measurement is not carried forward to the next measurement, and is lost.

For example, let's say you were trying to get the robot to move perfectly straight, but it was moving very slightly to the right - so slightly that the X axis change was less than one pixel of the mouse's imager per frame, say 1/3 pixel. The mouse will output zero change (because the best correlation would be "zero" offset). On the next frame the mouse again compares the frame with the previous frame, and again the motion is so slight (1/3 pixel, not 2/3, because we only compare adjacent frame pairs) that the mouse again reports zero change. But the robot is moving steadily to the right. Our program accumulates the deltas to measure position, but zero plus zero plus zero will be zero - the slight drift to the right is never observed.

Compare that with a pair of wheel encoders. The difference between the two wheel encoders might be only 1/3 tick, but the error is "remembered" mechanically, so that the next time the difference is 2/3 tick, and finally the third time one side reports an extra tick compared to the other side, and the drift is detected.

The lack of memory for fractional-pixel offsets seems fundamental to the optical mouse. Am I missing something? Is the resolution so high that my example would result in a trivial guidance error over the length of the field? If I knew the effective "size" of one pixel (on the carpet) I could calculate the potential error accumulation. -Norm |
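Norm's concern can be made concrete with a toy model. The first function models a hypothetical sensor that compares only adjacent frames and keeps no memory of sub-count motion; the second models an encoder, where position accumulates physically before ticks are read off. Whether a real sensor's correlator is actually this forgetful depends on internals the data sheets don't spell out.

```python
def mouse_style(drift_per_frame, frames):
    """Compare each frame only with the previous one; any sub-count
    offset is rounded away and never carried forward."""
    reported = 0
    for _ in range(frames):
        reported += round(drift_per_frame)  # best whole-count correlation
    return reported

def encoder_style(drift_per_frame, frames):
    """An encoder 'remembers' fractions mechanically: position accumulates
    first, and whole ticks are read off the running total."""
    return int(drift_per_frame * frames)

# 1/4 count of drift per frame for 40 frames: 10 counts of real motion.
lost = mouse_style(0.25, 40)     # reports 0 - the drift is never observed
seen = encoder_style(0.25, 40)   # reports the full 10 ticks
```

The toy model is the worst case; the reply below argues the resolution is high enough that the residual error would be trivial in practice.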
Re: Accumulation of small deltas?
Quote:
While I can't give you an exact figure (after optical modification), I'm using a chip with 800 dpi resolution. Even if that is reduced tenfold, it's still many, many times more precise AND accurate than encoders. Your argument is not logically flawed, but this is happening on such a small scale that I think it will actually run straighter than with gyros or encoders. We'll see! :D |
Re: Optical Mouse Navigation
Quote:
I think the bigger issue will be different surfaces, which will have to be accounted for by using banner sensors to detect changes. This will be interesting. At the very worst, infrared and ultrasonic sensors are precise enough that they could be mounted near the optical sensor and used to scale data live. I really don't think it will come to that though. |
Re: Accumulation of small deltas?
Quote:
I am *not* using the mouse to give me lateral distances (call it the x-axis offset distance) in part for the reasons you have stated. Rather, I am using a rate gyro to tell me how much rotation the robot has experienced in the most recent time interval. I use the mouse to give me just the y-axis offset, i.e., distance traveled. A little trig and a couple of approximations later I can calculate a lateral offset which *is* accumulated. The real question is: How much error will accumulate during autonomous mode using the <whatever> navigation system, and is it so much that the <whatever> navigation system is not reliable? Obviously all of these different systems have their individual strengths and weaknesses. If during the (presumed) 30 seconds of autonomouse mode (good grief, I don't believe my fingers just typed that - but I like it!) - if during that time period the robot is only off by a couple inches then I will be ecstatic. [ I calculate that my refitted mouse should have a 1/4" resolution. We'll see if that holds up or not... ] The wheel encoder system we used last year had about a 6 inch resolution. I know that much greater resolution is possible using wheel encoders, but that's what we had. So if the optical mouse works better, then great. If it doesn't work better, then maybe we all learned something. |
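The "little trig" being described can be sketched as follows: each interval, the mouse supplies forward travel and the integrated gyro supplies heading, and the lateral offset accumulates even when each per-interval contribution is tiny. The sample numbers are invented.

```python
import math

def dead_reckon(samples):
    """samples: (forward_inches, heading_radians) per interval.
    Returns accumulated (lateral, downfield) position."""
    x = y = 0.0
    for forward, heading in samples:
        x += forward * math.sin(heading)   # lateral offset *is* accumulated
        y += forward * math.cos(heading)
    return x, y

# Ten 1" steps with a constant 0.05 rad heading error drift ~0.5" laterally.
x, y = dead_reckon([(1.0, 0.05)] * 10)
```

This is exactly why the gyro side matters: the mouse alone gives distance, but only with a heading does the distance resolve into a field position.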
Re: Optical Mouse Navigation with GP5
I have been performing a experiment to validate optical mouse sensing with off the shelf robotic componets.
My work started with writing a custom PS/2 driver with an HCS12 microcontroller from motorola. Basically using a standard SPI interface I was able to communicate between the mouse successfully. http://iarc1.ece.utexas.edu/~lynca/m...I_oscopeTests/ the code released is not meant for a PIC but can serve as a guide on implementing communication between a mouse and SPI port. contact by email for the project which is still under development, http://iarc1.ece.utexas.edu/~lynca/m...dia/FIRST/src/ My solution was not simple and easily transferable. Therefore I was determined to find something better for the robotic community. I discovered Al William here near my area who had implemented a much more elegant PS/2 driver on a chip he call PAK XI (earlier version pak6). http://www.awce.com/pak6.htm Testing pictures, http://iarc1.ece.utexas.edu/~lynca/m...a/FIRST/PAKXI/ I also tested above, the gp5 with great success, I have misplaced my hyperterm outputs but email and I'll provide picture evidence if requested. http://www.awce.com/gp5.htm In all, the drivers can be implemented with any PIC (including RC) but an offboard processor streaming serial port commands of x and y is a very quick solution (GP5 through programming port). I have a Microchip project which does so but the code is messy at the moment, please email me a code request if curious. thanks for your time, ~Andrew Lynch for in depth look at PS2 driver for a PIC http://www.computer-engineering.org/ |
Re: Optical Mouse Navigation
Well with my season over I am now working on some non FIRST related robotics projects. The first (and more relevant to this discussion) of these is trying to build a robot that can navigate our school, and build a sonar map which it can use for plotting routes between its location and its destination. The robot will be based around a PC running linux. Right now I am just toying with the idea of trying to use an optical mouse for measuring travel.
As far as mounting height goes, why don't you just use a regular optical mouse mounted on one of the skinny pneumatic rams, with a regulator bringing the pressure down to, say, 10 psi? Then you mount the ram such that it is not fully extended when the mouse is on the ground, so the mouse will always be gently pressed against the ground. Then, if you encounter terrain that is unsuitable for the mouse (put a sonar range finder pointing at the ground in front of the mouse to detect bumps), you simply retract the ram until you are back on flat ground. Just a thought. |
Re: Optical Mouse Navigation
Quote:
|
Re: Optical Mouse Navigation
Oh, I get it now. Would a regular mouse work at around 3 feet per second? It doesn't seem like much, but now that I think about it, I'm sure I don't usually move my mouse that fast...
|
Copyright © Chief Delphi