Does anyone experience not being able to see their robot because a field element gets in the way during competition? If so, how often and does it inhibit your team’s performance at all?
Please take this survey to answer these questions:
Thank you
Interesting set of questions… what are you hoping to see with this data and what are your plans for the results?
I would recommend adding another question asking which year had the issue. I can guarantee that everyone has had some visibility issue in the last 4 years.
The answer is yes, visibility is an issue in some games. Unless you’re limiting your playing area significantly, you’ll run into this - which means everyone has run into it. That’s part of the challenge of the game design that teams need to overcome. I’m interested to know what you’re trying to show/affect with this survey.
This is why I place Steamworks below Recycle Rush in my compendium of most enjoyed games. It was a major turnoff when, as a spectator, one could not see a good percentage of critical robot gameplay.
2015, 2016, 2017, and 2018 have all had significant visibility issues. I find it hard to imagine any driver or coach from those years saying that they have never had a problem with not being able to see the robot. And unless you’re driving the robot virtually (i.e. not by sight), that is bound to inhibit your team’s performance.
I imagine this survey is preceding either some kind of product release or a petition for FIRST to get rid of tall field-elements. I’ll reserve judgement until that comes out.
2015? I was an operator and had no issues seeing our robot. What situations made visibility bad?
If your robot only played in the space between the driver station and front of the platforms, it probably wasn’t a problem. If you played between the platforms or in the landfill, it became very hard to see the robot behind a wall of capped 6-stacks.
Stacks and stacks, and poorly placed stacks. Or just tons and tons of stacks. With the occasional giant robot in the way.
EDIT: Sniped by AriMB
I wish they’d bring back driver station pole cams from 2016. They were awesome for seeing the field and as a bonus you’d have your match recorded & ready to watch when you got back to the pits.
2012 and 2013 were pretty good, and 2014 was excellent, as there were so few field elements between a few inches and six feet from the floor. 2016 and 2017 were terrible by design; too many opaque and glare-inducing field elements. 2015 started great but got bad really quick unless you carefully coordinated mining the landfill with placing stacks. 2018 wasn’t nearly as bad as the three preceding years; the switch walls were low and mostly transparent, and the scale wasn’t particularly wide at low altitudes.
Another possibility is that this is to justify spending money and/or time implementing a streaming camera. Not a major expense, but one that some teams need to plan in advance if they’re going to have it.
How about a compromise: the field has two “pole cams”, one on each driver station side, that can be accessed via the dashboard while at the field and footage downloaded at the end of the match.
It wouldn’t be that expensive, would be much safer, and would be available to everyone.
My teammate and I are trying to create a module that will go on the robot, “communicate” with the field, and give feedback by displaying the robot’s current position on an aerial view of the field on a monitor.
Sounds great to me. C’mon FIRST, make it happen!
Here’s an example of what we’d have the second we got back to the pit in 2016: https://youtu.be/Z0aSxoOGtEE it was really helpful for evaluating what happened post-match with the drive team. We used a GoPro on an extendable painter’s pole and also fed that to a separate monitor during the match.
What “module” do you use? I think an easy way of doing this is to approximate the robot’s location in space from encoder outputs. However, this can be inaccurate since there is scrubbing between the wheels and the floor, and any sideways pushing would make your bot lose its position. You could potentially overcome this by recalibrating during the match: drive to certain known points and use an accelerometer to detect the impact, but that takes time. Although I would really like to see this done, the simpler approach is to just have your drivers look around the obstacles.
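To make the encoder idea concrete, here’s a minimal dead-reckoning sketch in Java (class and method names are my own for illustration, not from any team’s actual code). It integrates left/right encoder deltas along the gyro heading, and it’s exactly this kind of estimate that drifts once the wheels scrub or the robot gets pushed sideways:

```java
// Minimal dead-reckoning sketch for a differential drive.
// Names and units are illustrative; feed in your own encoder/gyro readings.
public class SimpleOdometry {
    private double x = 0.0;       // meters, field-relative
    private double y = 0.0;       // meters, field-relative
    private double prevLeft = 0.0;
    private double prevRight = 0.0;

    /** Call periodically with cumulative encoder distances (m) and gyro heading (rad). */
    public void update(double leftDistance, double rightDistance, double headingRad) {
        double dLeft = leftDistance - prevLeft;
        double dRight = rightDistance - prevRight;
        prevLeft = leftDistance;
        prevRight = rightDistance;

        // Assume the robot moved straight along its current heading for this small step.
        double dCenter = (dLeft + dRight) / 2.0;
        x += dCenter * Math.cos(headingRad);
        y += dCenter * Math.sin(headingRad);
    }

    /** Re-zero against a known field landmark (e.g., after bumping a wall). */
    public void resetTo(double knownX, double knownY) {
        x = knownX;
        y = knownY;
    }

    public double getX() { return x; }
    public double getY() { return y; }
}
```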
From a sensors standpoint, this is a very difficult problem to solve. Encoders, accelerometers, and gyroscopes can help, but drift, loss of traction, defense, and other disturbances will make such a system less and less accurate as the match goes on. Adding sensors like LIDARs can help, but they present an interesting computational problem: fixing your position from a handful of distance measurements while accounting for non-fixed obstacles like other robots.
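As a rough illustration of what even a single “fix” looks like (purely a sketch with invented numbers, not anyone’s actual code): if you know you’re squared up against a wall whose field coordinate is known, one rangefinder reading can pull one axis of the estimate back into place, but it does nothing for the other axis or for heading error, which is why full LIDAR-based localization gets complicated fast.

```java
// Sketch: snap one axis of a drifting pose estimate using a rangefinder
// pointed at a wall whose field X coordinate is known. The "squared up"
// assumption and the blend factor are illustrative only.
public final class WallFix {
    private WallFix() {}

    /**
     * @param estimatedX   current (drifting) X estimate, meters
     * @param wallX        known field X of the wall, meters
     * @param rangeMeters  rangefinder reading to the wall, meters
     * @return corrected X, assuming the robot is facing the wall head-on
     */
    public static double correctX(double estimatedX, double wallX, double rangeMeters) {
        double measuredX = wallX - rangeMeters; // robot sits rangeMeters short of the wall
        // Blend rather than overwrite, in case the reading itself is noisy.
        double trust = 0.8; // how much to believe the rangefinder vs. odometry
        return trust * measuredX + (1.0 - trust) * estimatedX;
    }
}
```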
There are other methods that can help fix a location that are used in other competitions, however they rely on wireless communication from known, fixed points to determine the current location. That is, regrettably, illegal in FRC (R69 last year). I would love to see FIRST set up such a system on the field for teams to use, coupled with a KoP receiver and some default code to help teams get started. But that’s something that is out of a team’s control.
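For context, the beacon approach described above boils down to trilateration: given measured distances to fixed anchors at known positions, you solve for where you are. Here’s a minimal 2D sketch (anchor coordinates are invented, and again, this kind of robot-to-field wireless link would not be FRC-legal):

```java
// Sketch: 2D trilateration from three fixed anchors at known positions.
// Anchor coordinates below are invented for illustration; a real field
// system would survey these in.
public final class Trilateration {
    private Trilateration() {}

    /** Solve for (x, y) given anchor positions ax/ay and measured ranges r. */
    public static double[] solve(double[] ax, double[] ay, double[] r) {
        // Linearize by subtracting the first circle equation from the others.
        double a1 = 2 * (ax[1] - ax[0]);
        double b1 = 2 * (ay[1] - ay[0]);
        double c1 = r[0] * r[0] - r[1] * r[1]
                  - ax[0] * ax[0] + ax[1] * ax[1]
                  - ay[0] * ay[0] + ay[1] * ay[1];
        double a2 = 2 * (ax[2] - ax[0]);
        double b2 = 2 * (ay[2] - ay[0]);
        double c2 = r[0] * r[0] - r[2] * r[2]
                  - ax[0] * ax[0] + ax[2] * ax[2]
                  - ay[0] * ay[0] + ay[2] * ay[2];

        double det = a1 * b2 - a2 * b1; // zero if the anchors are collinear
        double x = (c1 * b2 - c2 * b1) / det;
        double y = (a1 * c2 - a2 * c1) / det;
        return new double[] { x, y };
    }

    public static void main(String[] args) {
        // Hypothetical anchors at three field corners (meters), robot near (3, 4).
        double[] ax = { 0.0, 16.0, 0.0 };
        double[] ay = { 0.0, 0.0, 8.0 };
        double[] r  = { 5.0, Math.hypot(13.0, 4.0), Math.hypot(3.0, 4.0) };
        double[] pos = solve(ax, ay, r);
        System.out.printf("x=%.2f y=%.2f%n", pos[0], pos[1]);
    }
}
```

Subtracting one circle equation from the others turns the problem into a small linear system, which is part of why these beacon systems can update quickly enough to be useful during a match.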
Been done at a couple of off-seasons at least.
This tracks robot position and orientation.
https://www.chiefdelphi.com/media/img/54e/54e08fa3c0d5ce7ffb8353e5ab7b61d4_l.jpg
It looks like multiple bots had their paths plotted in this image. Were they estimating their own position in real time, or was a video feed processed afterward to determine where each robot was and how it was oriented? Either way the data is neat and could be used strategically outside of a match, but the main goal here is to let the drivers see the robot’s real-time position on the field.
Neither. They were using this system for determining the robots’ location relative to the field. The data was collected by Zebra and distributed well after the event. If you read through the threads they’ve written on the technology in FRC, they say that it could theoretically be used in near-real-time to give position feedback to drivers/audience, but isn’t accurate enough to run auton routines off of.
EDIT: It should be noted that this system requires sensors at known locations on the field to ping the tags on the robots and determine their position. So for now, this technology is not FRC legal. One can hope that someday they or a similar company might partner with FIRST to bring this technology to all teams. But in the meanwhile, a standardized low-latency camera pole will do a lot of the same work for driver visibility with a lot less overhead.