Chief Delphi

Chief Delphi (http://www.chiefdelphi.com/forums/index.php)
-   Technical Discussion (http://www.chiefdelphi.com/forums/forumdisplay.php?f=22)
-   -   The Perfect swerve (http://www.chiefdelphi.com/forums/showthread.php?t=129003)

Tyler2517 22-04-2014 20:17

Re: The Perfect swerve
 
Yeah, you would need 2 mice to have rotation control. As far as I know, my mouse can track on the field's carpet extremely well. I wonder if they would lose calibration over time.

theawesome1730 22-04-2014 20:23

Re: The Perfect swerve
 
Quote:

Originally Posted by Ether (Post 1377880)
Field-centric is about robot angular orientation. How are you planning to measure robot angular orientation with a single mouse? Won't you need two?



I was thinking the same thing. I suppose a team could create a device using a laser and receiver from a gaming mouse and something like a scroll wheel mounted sideways and controlled by the thumb with the purpose of changing angle relative to the field. The programming would be complex but the results could be good.

Ether 22-04-2014 20:34

Re: The Perfect swerve
 
Quote:

Originally Posted by Tyler2517 (Post 1377883)
Yeah, you would need 2 mice to have rotation control. As far as I know, my mouse can track on the field's carpet extremely well. I wonder if they would lose calibration over time.

I wonder what their dynamic response capability is, like if you take a hit.



Gdeaver 22-04-2014 20:36

Re: The Perfect swerve
 
In 2013 we looked at high-end gaming mice. With swerve and the chassis orientation decoupled, there are 2 solutions of the x and y counts coming from the mouse. On competition carpet, with no changes to the optics, we found the accuracy to be less than needed. The mouse's optical flow algorithm is not tuned for this use. With a USB port on the roboRIO in 2015, a heavily filtered USB camera and a more robust optical flow algorithm may yield better results. At minimum, a gyro would need to be fused with the optical flow.

This is all for field-centric control. There are 2 paths to look at: sensing from the robot reference frame (gyro/accelerometer fusion), or adding a world reference frame with a magnetometer or some other sensor that references outside the robot frame. GPS is out. Constellation navigation has grabbed my curiosity. It's hard.

In 2013 we could have reset the gyro every time we went up against the feeder station wall to correct for drift. This year we considered IMU field-centric control not doable because of the constant impacts and never having time for a reset. The last thing our drivers need this year is for the field-centric control to suddenly shift several degrees while being smash defended and trying to roll out. With our low designs the last several years, magnetometer location and calibration issues ruled out that solution.

Fortunately for the future of field-centric swerve, a couple of companies have released affordable IMUs based on gyro, accelerometer, and magnetometer sensors coupled with highly tuned extended Kalman filters that should handle the rough FIRST environment. The key is constant hard- and soft-iron calibration on the magnetometer. I'm hoping to make this an off-season project if I can get some programming students on board. I think a plug-and-play field-centric IMU solution will soon be available for FIRST.
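For what it's worth, here is a minimal sketch of the field-centric transform that all of this heading sensing feeds into (illustrative Java, not any particular vendor's API; it assumes a gyro/IMU that reports the robot heading in degrees, with the sign convention matching how you mount it):

Code:

// Rotate driver translation commands from the field frame into the
// robot frame so the swerve inverse kinematics can use them.
// headingDeg is the robot heading from the gyro/IMU, in degrees.
public class FieldCentric {
    public static double[] toRobotFrame(double fieldX, double fieldY,
                                        double headingDeg) {
        double theta = Math.toRadians(headingDeg);
        double robotX =  fieldX * Math.cos(theta) + fieldY * Math.sin(theta);
        double robotY = -fieldX * Math.sin(theta) + fieldY * Math.cos(theta);
        return new double[] { robotX, robotY };
    }
}

Any drift or sudden jump in headingDeg shows up directly as a skewed translation direction, which is exactly the "suddenly shift several degrees" failure mode I described above.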

Ether 22-04-2014 20:47

Re: The Perfect swerve
 
Quote:

Originally Posted by Gdeaver (Post 1377894)
there are 2 solutions of the x and y counts coming from the mouse

It's not clear what you mean by 2 solutions. With a single mouse, an XY displacement could correspond to a multitude of possible robot motions.

With 2 mice, you could in theory derive all three degrees of freedom of the robot motion, if the XY readings are accurate.
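Here is a sketch of that derivation (illustrative only, with an assumed mounting: both mice on the robot's X axis, a known distance apart, each reporting small per-loop displacements in the robot-fixed frame):

Code:

// For a rigid body, each mouse i at signed position ri along the robot
// X axis sees, per loop (small rotations assumed):
//   xi = dx                 (both mice see the same X motion)
//   yi = dy + dtheta * ri   (Y motion differs by the rotation term)
// Two mice give enough equations to solve for dx, dy, dtheta.
public class TwoMouseOdometry {
    public static double[] update(double x1, double y1,   // mouse 1 counts
                                  double x2, double y2,   // mouse 2 counts
                                  double r1, double r2) { // mouse X positions
        double dtheta = (y2 - y1) / (r2 - r1);
        double dx = 0.5 * (x1 + x2);       // average to reduce noise
        double dy = y1 - dtheta * r1;
        return new double[] { dx, dy, dtheta };
    }
}

This only holds for small per-loop displacements, and any slip or loss of tracking on the carpet corrupts the estimate immediately.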



Dunngeon 22-04-2014 20:54

Re: The Perfect swerve
 
Quote:

Originally Posted by Ether (Post 1377880)
Field-centric is about robot angular orientation. How are you planning to measure robot angular orientation with a single mouse? Won't you need two?



You would, in order to get the data needed.

ekapalka 22-04-2014 21:01

Re: The Perfect swerve
 
It seems to me like follower wheels shouldn't be out of the question for monitoring position, especially when used with other forms of sensing. Here is how you calculate the speeds, and this is a thread all about how to utilize and manage that information (both thanks to Ether).
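Roughly, the idea would look like this (my own sketch, not from the linked threads: one encoder on a fore/aft follower wheel, one on a sideways follower wheel, plus a gyro for heading):

Code:

// Dead-reckon field position from two follower-wheel encoders and a gyro.
// Each loop, rotate the robot-frame displacement into the field frame
// and accumulate.
public class FollowerWheelOdometry {
    private double fieldX = 0.0;
    private double fieldY = 0.0;

    public void update(double dForward, double dStrafe, double headingDeg) {
        double theta = Math.toRadians(headingDeg);
        fieldX += dForward * Math.cos(theta) - dStrafe * Math.sin(theta);
        fieldY += dForward * Math.sin(theta) + dStrafe * Math.cos(theta);
    }

    public double getX() { return fieldX; }
    public double getY() { return fieldY; }
}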

Gdeaver 22-04-2014 21:23

Re: The Perfect swerve
 
There have been several college papers written about this subject, and the ones that had some success used 2 mice and a gyro. There are some crowd-sourced devices on the market based on these. They kind of work on small robots and smooth surfaces at low speed. I haven't seen anybody who has found a mouse solution that would work in the FIRST environment.

Gdeaver 22-04-2014 21:25

Re: The Perfect swerve
 
The problem with FIRST is high-G impacts.

T^2 22-04-2014 22:27

Re: The Perfect swerve
 
Quote:

Originally Posted by Dunngeon (Post 1377900)
You would, in order to get the data needed.

Some mice, such as the Razer Taipan, use both a laser and an optical sensor. Theoretically you could use the combination to detect rotation.

Ether 22-04-2014 23:15

Re: The Perfect swerve
 
Quote:

Originally Posted by T^2 (Post 1377931)
Some mice, such as the Razer Taipan, use both a laser and an optical sensor. Theoretically you could use the combination to detect rotation.

How?



cadandcookies 22-04-2014 23:50

Re: The Perfect swerve
 
Quote:

Originally Posted by Ether (Post 1377953)
How ?



I would assume the idea is that if there are two sensors, there's a displacement between them, and that could somehow be used to derive an angle. However, since the two sensors don't have a fixed reference point, I don't really see how this would actually work (though it's also possible that I'm completely out of my depth here, which is probably the most likely scenario).

RyanCahoon 23-04-2014 00:33

Re: The Perfect swerve
 
Quote:

Originally Posted by Gdeaver (Post 1377894)
a heavily filtered USB camera and a more robust optical flow algorithm may yield better results

Without specialized hardware and/or a lot of custom optimization, I'm doubtful that you'll be able to run an optical flow algorithm much faster than 15 Hz at any decent resolution. At that framerate with a robot traveling 12 ft/s, you'll see displacements of 10 inches/frame. Getting an unobstructed view of that much carpet beneath your robot seems like it would be a challenge, assuming you can even reliably track displacements that large using the texture of the carpet.
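To spell out that arithmetic:

Code:

12 ft/s * 12 in/ft = 144 in/s
144 in/s / 15 frames/s = 9.6 in/frame  (roughly 10 inches)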

s_forbes 23-04-2014 00:50

Re: The Perfect swerve
 
Quote:

Originally Posted by RyanCahoon (Post 1377970)
Without specialized hardware and/or a lot of custom optimization, I'm doubtful that you'll be able to run an optical flow algorithm much faster than 15 Hz at any decent resolution. At that framerate with a robot traveling 12 ft/s, you'll see displacements of 10 inches/frame. Getting an unobstructed view of that much carpet beneath your robot seems like it would be a challenge, assuming you can even reliably track displacements that large using the texture of the carpet.

As an alternative method: what's the average height of a ceiling at a typical venue? The truss structures I've seen in most gymnasiums might impact an optical flow approach, but it may work better than the carpet.

ekapalka 23-04-2014 01:45

Re: The Perfect swerve
 
Quote:

Originally Posted by s_forbes (Post 1377972)
As an alternative method: what's the average height of a ceiling at a typical venue? The truss structures I've seen in most gymnasiums might impact an optical flow approach, but it may work better than the carpet.

Astral navigation :D I really wanted to use this approach to verify the gyro readings when in view of landmarks on the ceiling, and use the gyro for orientation when the landmarks are obscured (by the truss / balls flying overhead), but we never got around to getting a camera for it (not even a stereo camera - just an Axis). If anyone ever makes any progress on this, I would love to know about it. The only real-world product I've seen that boasts using it is the StarGazer localization camera, which needs specific landmarks (so that particular method of astral navigation isn't suited to the playing field).

