#16
Re: The Perfect swerve
Yeah, you would need two mice to have rotation control. As far as I know, my mouse can track on the field's carpet extremely well. I wonder if they would lose calibration over time.
#17
Re: The Perfect swerve
I was thinking the same thing. I suppose a team could create a device using the laser and receiver from a gaming mouse, plus something like a scroll wheel mounted sideways and controlled by the thumb to change the angle relative to the field. The programming would be complex, but the results could be good.
#18
Re: The Perfect swerve
Quote:
#19
Re: The Perfect swerve
In 2013 we looked at high-end gaming mice. With swerve and the chassis orientation decoupled, there are two solutions for the X and Y counts coming from the mouse. On competition carpet, with no changes to the optics, we found the accuracy to be less than needed; the optic-flow algorithm is not tuned for this use. With a USB port on the roboRIO in 2015, a heavily filtered USB camera and a more robust optic-flow algorithm may yield better results. At minimum, a gyro would need to be fused with the optic flow.

This is all for field-centric control. There are two paths to look at: sensing from the robot reference frame (gyro/accelerometer fusion), or adding a world reference frame with a magnetometer or some other sensor that references outside the robot's frame. GPS is out. Constellation navigation has grabbed my curiosity, but it's hard.

In 2013 we could have reset the gyro every time we went up against the feeder station wall to correct for drift. This year we considered IMU field-centric control not doable because of the constant impacts and never having time for a reset. The last thing our drivers need this year is for the field-centric control to suddenly shift several degrees while being smash-defended and trying to roll out. With our low designs the last several years, magnetometer location and calibration issues ruled out that solution.

Fortunately for the future of field-centric swerve, a couple of companies have released affordable IMUs based on gyro, accelerometer, and magnetometer sensors, coupled with highly tuned extended Kalman filters, that should handle the rough FIRST environment. The key is constant hard- and soft-iron calibration of the magnetometer. I'm hoping to make this an off-season project if I can get some programming students on board. I think a plug-and-play field-centric IMU solution will soon be available for FIRST.
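As a rough illustration of the gyro-plus-magnetometer fusion mentioned above, here is a minimal complementary-filter sketch (a simple stand-in, not the tuned extended Kalman filters in the commercial IMUs); the class name, units, and blend constant are assumptions:

```java
// Minimal complementary-filter sketch for a field-centric heading estimate:
// integrate the gyro for smooth short-term response, and lean gently on the
// magnetometer to cancel long-term drift. Sensor units are assumed to be
// degrees and degrees per second; nothing here is a vendor API.
public class HeadingEstimator {
    private double headingDeg = 0.0;          // fused field-relative heading
    private static final double ALPHA = 0.98; // how much to trust the gyro

    /** Wraps an angle difference into the range [-180, 180) degrees. */
    private static double wrapDeg(double angle) {
        angle = (angle + 180.0) % 360.0;
        if (angle < 0) {
            angle += 360.0;
        }
        return angle - 180.0;
    }

    /**
     * @param gyroRateDps   angular rate from the gyro, degrees per second
     * @param magHeadingDeg absolute heading from the magnetometer, degrees
     * @param dt            loop period in seconds
     */
    public void update(double gyroRateDps, double magHeadingDeg, double dt) {
        // Integrate the gyro for the short-term estimate...
        double gyroHeading = headingDeg + gyroRateDps * dt;
        // ...then pull it slowly toward the magnetometer to cancel drift.
        double error = wrapDeg(magHeadingDeg - gyroHeading);
        headingDeg = gyroHeading + (1.0 - ALPHA) * error;
    }

    public double getHeadingDeg() {
        return headingDeg;
    }
}
```

The hard part, as noted above, is that the magnetometer reading is only as good as its hard- and soft-iron calibration; the filter only decides how much to believe it.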
#20
Re: The Perfect swerve
Quote:
With two mice, you could in theory derive all three degrees of freedom of the robot's motion, if the XY readings are accurate.
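To make that concrete, here is a minimal sketch (my own math, not something posted in this thread) of solving for the robot's planar motion from two mouse readings: each sensor at offset r from the robot center sees v + ω × r, so the difference between the two readings isolates ω, and either reading then gives the translation. Sensor offsets, units, and calibration are assumed to be handled elsewhere:

```java
// Sketch: recover planar robot motion (vx, vy, omega) from two optical mouse
// sensors mounted at known offsets r1 and r2 from the robot center.
// Rigid-body kinematics: v_i = v + omega * perp(r_i), with perp(x, y) = (-y, x).
// Counts-per-inch calibration and sensor orientation are assumed to already be
// applied to the velocity readings, and the two sensors must not be co-located.
public class TwoMouseOdometry {
    /** Returns {vx, vy, omega} in the robot frame. */
    public static double[] solve(double v1x, double v1y,   // sensor 1 velocity
                                 double v2x, double v2y,   // sensor 2 velocity
                                 double r1x, double r1y,   // sensor 1 offset
                                 double r2x, double r2y) { // sensor 2 offset
        double dx = r1x - r2x, dy = r1y - r2y;    // vector between the sensors
        double dvx = v1x - v2x, dvy = v1y - v2y;  // difference of the readings
        // The reading difference equals omega * perp(d), so project it onto
        // perp(d) = (-dy, dx) and divide by |d|^2 to get omega.
        double omega = (dvx * -dy + dvy * dx) / (dx * dx + dy * dy);
        // Back out the translational velocity at the robot center.
        double vx = v1x + omega * r1y;
        double vy = v1y - omega * r1x;
        return new double[] { vx, vy, omega };
    }
}
```

The farther apart the two sensors are mounted, the less the rotation estimate is corrupted by noise in the individual readings.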
#21
Re: The Perfect swerve
You would, in order to get the data needed.
#22
Re: The Perfect swerve
It seems to me like follower wheels shouldn't be out of the question for monitoring position, especially when used with other forms of sensing. Here is how you calculate the speeds, and this is a thread all about how to utilize and manage that information (both thanks to Ether).
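As a rough sketch of the kind of calculation those links cover (my own illustrative version with made-up offsets, not Ether's derivation), three omni follower wheels, two reading forward motion on the left and right sides and one reading sideways motion behind the center, are enough to recover the chassis velocities:

```java
// Follower-wheel ("dead wheel") odometry sketch with three omni wheels:
// two forward-reading wheels offset +/-TRACK_HALF to the left and right,
// and one sideways-reading wheel offset LATERAL_OFFSET behind the center.
// The offsets below are placeholders for illustration only.
public class FollowerWheelKinematics {
    private static final double TRACK_HALF = 10.0;     // in, half the wheel spacing
    private static final double LATERAL_OFFSET = 6.0;  // in, strafe wheel behind center

    /** Returns {vx, vy, omega} in the robot frame from wheel surface speeds (in/s). */
    public static double[] chassisSpeeds(double leftInPerSec,
                                         double rightInPerSec,
                                         double strafeInPerSec) {
        // The two forward wheels average to the forward speed; their
        // difference across the track gives the rotation rate.
        double vx = (leftInPerSec + rightInPerSec) / 2.0;
        double omega = (rightInPerSec - leftInPerSec) / (2.0 * TRACK_HALF);
        // The strafe wheel sits behind the center, so rotation subtracts
        // omega * LATERAL_OFFSET from its reading; add that back in.
        double vy = strafeInPerSec + omega * LATERAL_OFFSET;
        return new double[] { vx, vy, omega };
    }
}
```

Integrating these speeds each loop, rotated by the gyro heading, gives a field position estimate, which is the sort of bookkeeping the linked thread is about.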
#23
Re: The Perfect swerve
There have been several college papers written about this subject, and the ones that had some success used two mice and a gyro. There are some crowd-sourced devices on the market based on these. They kind of work on small robots, on smooth surfaces, going slowly. I haven't seen anybody who has found a mouse solution that would work in the FIRST environment.
#24
Re: The Perfect swerve
The problem with FIRST is high-G impacts.
#25
Re: The Perfect swerve
Some mice, such as the Razer Taipan, use both a laser and an optical sensor. Theoretically you could use the combination to detect rotation.
#26
Re: The Perfect swerve
Quote:
#27
Re: The Perfect swerve
I would assume the idea is that if there are two sensors, there's a displacement between them, and that could somehow be used to derive an angle. However, since the two sensors don't have a fixed reference point, I don't really see how this would actually work (though it's also possible that I'm completely out of my depth here, which is actually the most likely scenario).
#28
Re: The Perfect swerve
Without specialized hardware and/or a lot of custom optimization, I'm doubtful that you'll be able to run an optical flow algorithm much faster than 15 Hz at any decent resolution. At that framerate with a robot traveling 12 ft/s, you'll see displacements of 10 inches/frame. Getting an unobstructed view of that much carpet beneath your robot seems like it would be a challenge, assuming you can even reliably track displacements that large using the texture of the carpet.
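For reference, the displacement figure above works out as follows; the speed and frame rate are just the numbers assumed in the post:

```java
// Quick arithmetic check of the optical-flow displacement budget above.
public class FlowBudget {
    public static void main(String[] args) {
        double speedFtPerSec = 12.0;  // robot speed assumed above
        double frameRateHz = 15.0;    // assumed optical-flow update rate
        double inchesPerFrame = speedFtPerSec * 12.0 / frameRateHz;
        System.out.printf("Displacement per frame: %.1f in%n", inchesPerFrame); // ~9.6 in
    }
}
```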
#29
Re: The Perfect swerve
Quote:
#30
Re: The Perfect swerve
Quote:
Last edited by ekapalka : 23-04-2014 at 01:53.