Our vision system used three 120-degree cameras. We removed the IR filter from them, added an IR-pass filter to our 3D-printed camera mount, and added IR LEDs to saturate the targets. It ran, threaded, on an ODROID-XU. The program took data from all three cameras and calculated:
where we were on the field
where a ball of either color was, plus its velocity and heading relative to us
and if a robot was in front of us, it tracked that as well (velocity and heading)
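(For illustration only, since the thread doesn't include code: the velocity-and-heading step above boils down to differencing two successive detections. A minimal sketch in Python; the function name, coordinate frame, and sample numbers are hypothetical, not from the team's actual program.)

```python
import math

def velocity_and_heading(p_prev, p_curr, dt):
    """Estimate a tracked object's speed (units/s) and heading
    (degrees, 0 = +x axis, counterclockwise) from two successive
    field positions p_prev and p_curr, dt seconds apart."""
    dx = p_curr[0] - p_prev[0]
    dy = p_curr[1] - p_prev[1]
    speed = math.hypot(dx, dy) / dt          # distance / time
    heading = math.degrees(math.atan2(dy, dx))
    return speed, heading

# A ball that moved from (0, 0) to (3, 4) feet in 0.5 s:
speed, heading = velocity_and_heading((0, 0), (3, 4), 0.5)
# speed = 10.0 ft/s, heading ≈ 53.13 degrees
```

In practice you'd filter these estimates over several frames, since single-frame differences are noisy.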
15-04-2014 13:25
yash101
Are you using RotatedRects? It seems as though your bounding rectangles are turned, not just boxed!
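(A quick illustration of why yash101's question matters: an axis-aligned bounding box over-covers a tilted target, while a rotated rect, e.g. from OpenCV's cv2.minAreaRect, fits it tightly. The sketch below uses pure Python with a made-up 2x1 target rotated 45 degrees; the corner coordinates are hypothetical.)

```python
import math

def axis_aligned_bbox_area(points):
    """Area of the smallest upright (axis-aligned) box around the points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (max(xs) - min(xs)) * (max(ys) - min(ys))

# Corners of a 2x1 rectangular target, rotated 45 degrees about the origin
theta = math.radians(45)
corners = [(x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))
           for x, y in [(-1, -0.5), (1, -0.5), (1, 0.5), (-1, 0.5)]]

true_area = 2.0 * 1.0                        # a rotated rect fits this exactly
boxed_area = axis_aligned_bbox_area(corners) # ≈ 4.5, more than double
```

The upright box covers more than twice the target's actual area here, which is why tilted targets usually call for minAreaRect-style rotated rects rather than plain bounding boxes.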
15-04-2014 14:08
JesseK
> ... The program took data from all three cameras and calculated: where we were on the field; where a ball of either color was, its velocity and heading relative to us; and if a robot was in front of us, it tracked that as well (velocity and heading)
15-04-2014 15:21
faust1706
We only used hot-goal detection; the rest is still a work in progress. Our team has three programmers, and we have plans to use all the data we gather, but again, it is only a plan. We have had game-piece detection for three seasons now and have not used it on the machine-control side. We're weighing making the switch to C++, which should attract more students than LabVIEW, but who knows. We are also interested in path planning and autonomously traversing the field, but that may not happen. MATLAB has given us licenses, though... so we could port code over to C.
15-04-2014 15:28
JesseK
In May I plan to release a white paper about FRC Tele-Operated HMI that you may be interested in.
It seems we have the opposite problem: we know how to acquire and use the vision data in tele-operated mode, and we have plenty of it, but getting a simple message back to the robot seems to be our Achilles' heel. The programmers worked out the issues on the test platform before we packed everything up last weekend, but we'll see what happens at Champs on the real robot.