Quote:
Originally Posted by kamocat
The "perceive" discussion entails what sensors should be used for what purposes.
I'll list some things an autonomous 'bot might want to know: - Where am I on the field?
- Where are the robots around me? (what alliance are they?)
- Where are the balls around me? (on the floor, presumably)
- Where are the goals? The bumps? The towers? The walls?
- Have I flipped over?
- What are the other robots doing? Do they need help? (Inter-robot communication)
There are also simpler things, implemented in the "control" section, like "have I completed my kick?" or "is my arm fully extended?" These are usually potentiometers or limit switches, used as feedback to make sure the action is completed. Unless someone is doing something exotic, like using a non-contact thermometer to tell when a motor is stalled, I don't think these need to be discussed with the rest of the sensors.
This is where the GDC may play nice again and bring back something like the two-frequency IR beacons.
But even if they don't, there is a simple way to determine where you are on the field using encoders (assuming the wheels don't slip, which was hard on last year's field).
Using the kinematic formulas for differential-drive dead reckoning:
Code:
S           = (delta_left + delta_right) / 2
delta_theta = (delta_left - delta_right) / wheelbase
theta       = theta + delta_theta
X           = X + S * cos(theta)
Y           = Y + S * sin(theta)
This satisfies one item on the list, but only slightly.
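For anyone who wants to try it, here's a minimal sketch of that update loop in Python. The class name `Odometry` and the units are my own choices; it just integrates the encoder formulas above each time you feed it new wheel distances, and it inherits the same no-slip assumption.

```python
import math

class Odometry:
    """Dead-reckoning pose estimate from left/right wheel encoders.

    Assumes no wheel slip. `wheelbase` is the distance between the
    left and right wheels, in the same units as the encoder deltas.
    """

    def __init__(self, wheelbase, x=0.0, y=0.0, theta=0.0):
        self.wheelbase = wheelbase
        self.x = x
        self.y = y
        self.theta = theta

    def update(self, delta_left, delta_right):
        """Fold in one encoder reading (distance each wheel traveled)."""
        s = (delta_left + delta_right) / 2.0                 # forward distance
        delta_theta = (delta_left - delta_right) / self.wheelbase  # heading change
        self.theta += delta_theta
        self.x += s * math.cos(self.theta)
        self.y += s * math.sin(self.theta)
        return self.x, self.y, self.theta
```

Call `update()` every time the encoders are sampled; the faster you sample, the less error the straight-segment approximation accumulates through turns.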
As far as knowing where other robots are, that would need a lot of DSP, an external observer telling the bots where they are, or all of the bots communicating with each other.