#1
Re: Video Review Needs to Happen Now
Quote:
I broadly agree with your comments about game design, but I agree far less with the assertion above.

One option that doesn't require changing any game rules would be to simply scrounge up 6 volunteers from the crowd and give them the job of watching one robot apiece (there are details to be dealt with, but you get my point). [/EDIT]

When I wrote the two paragraphs above, my brain was stuck in a thinking-about-off-season-events/experiments rut, and it just dawned on me that Chris was almost certainly thinking about regular-season events. Doh! That said, the bigger-picture point is that if off-season experiments, such as having teams attending events supply a small handful of students/adults to do a few low-skill scorekeeping tasks (or rule tweaks, or ...), create a dramatic error reduction during off-season events, the GDC would probably notice (notice both the errors and the simple(ish) way to treat the root cause's symptom). [/EDIT]

I'm not saying in this post that using video evidence is bad or good, wise or foolish, etc. I'm also not saying in this post whether or not I think there is a difference between inspiring someone to try something new, and that person later on being excited or depressed by the way an FRC competition unfolds. I am saying that I don't think video is the only lever that can be pulled.

Now, I'm going back to waiting for the "hard" data posts.

Blake

PS: The one time I got to spend some time with an FRC GDC, they seemed like nice people. I think they would welcome well-organized feedback from event organizers, especially if it took the form of a video-based post-mortem of a game's rules. I'm thinking about the sort of review and analysis that would use a large number of hours of video from multiple events to identify the sorts of calls/rules that are hardest for humans to make/enforce correctly. That sort of info could definitely influence future games (and treat a cause instead of a symptom), especially if it could be put into a simple checklist of things to avoid, or do. Maybe a pro-video person reading this thread will contact the GDC in order to volunteer to do that for the next 1-3 seasons?

Last edited by gblake : 14-11-2016 at 21:57.
#2
Re: Video Review Needs to Happen Now
What would hard data for this subject look like to you? Most implementations will only have anecdotal evidence at this point, because video review probably only gets used once or twice during an event (so far).
#3
|
||||
|
||||
|
Re: Video Review Needs to Happen Now
That lack of usage is a valid data point.
#4
|
|||||
|
|||||
|
Re: Video Review Needs to Happen Now
What's interesting is that some argued it would be used so often that it would slow events down, or at least that this was a risk. So far, that does not seem to be the case at the few events that have tried it out.
#5
|
||||
|
||||
|
Re: Video Review Needs to Happen Now
Have any events actually had an overturned call yet? There was one rescoring that was mentioned, but otherwise no event has had a call overturned or adjusted.

The primary concern is that most video review overturns would result in replaying matches, and thus added schedule risk. It's not fair to penalize alliances for their strategic behavior, based on the score/field conditions presented to them in real time, and then go back and adjust those via video review. For instance, if a team crosses a defense 3 or 4 times to damage it, they can't get back the extra time they wasted crossing that defense. Or if a team makes a call to ensure a capture at ~15 seconds rather than scoring an extra couple of balls, based on the real-time score, that decision is undermined when a review gives the other alliance a breach that didn't exist previously.

Virtually any case where errors are found in video review should mandate a replayed match. Thus the schedule concern.
#6
|
||||
|
||||
|
Re: Video Review Needs to Happen Now
Quote:
- - - - - - - - - - - - - - - - - - - - - -

To prep to answer this, in addition to shooting from the hip, I wanted to refresh my recollection of what has been said so far in the last few months. So, I reviewed this thread and an adjacent thread, and found these posts that I and a few others wrote. There is nothing Earth-shaking in them, but they supply some context.

For me, the outline that follows is the way I would want to approach A) creating a solid understanding of the need (or lack thereof) for adding video to the refs' tools, and B) coming up with a first version of a video system, if developing one is warranted. The "hard data" would be the results (measurements & statistics) produced by the experiments. Obviously this is a back-of-the-napkin, discussion-forum-quality sort of an outline - not even PowerPoint quality yet.

The current system (FIRST) being discussed is a system containing many things, including competition events that contain, at the least, a Playing Field & Game Pieces, the Field Staff (announcers, refs, etc.), the participating Teams, the Match/Game/Robot rules, the Audience, the Matches/Schedule, and the field Computers/Sensors/Software. We are talking about introducing Video Replays into the FRC (and FTC ...) event part of that FIRST system. We need to know the pertinent parts of the current baseline system's status/performance, the current system's purpose, and the sensitivity of the system's ability to achieve its purpose(s) to changes in the independent variables we are going to adjust.

Some useful metrics might be:

In the experiments I would want to:

What's above is a quick-and-dirty outline of what I would *want* to do to produce "hard data". After dealing with real-world constraints, thinking a bit more deeply, and getting some preliminary results, I (or whoever) might decide the experiments could be simplified without violating the integrity of the results, or they might add something.

I know there are folks who firmly believe that the need for (or cost of) video reviews is/isn't so obvious that what I outlined above isn't necessary. I don't disagree that they feel that way. I do say that nothing in this thread so far *proves* that the need does/doesn't exist, and/or that a need would justify the investment (instead of investing in satisfying other needs).

Blake

PS: In the past, I and at least one other person have wished for detailed camera/lens specs and placement info. That would be one example of "hard data", and could be used to answer some important questions; but it's just one part of the bigger picture under the heading of "Video Review Needs to Happen Now".

PPS: Above I have some bullets about identifying which calls could/should/would be affected by reviewing video. Complementing that, I'm not sure whether deciding what the effect of a changed call should be is part of designing each/any experiment (it probably is). Regardless, it is certainly something that would factor into any decision to introduce (or not) video replays into the system.

Last edited by gblake : 17-11-2016 at 15:36.
#7
|
||||
|
||||
|
Re: Video Review Needs to Happen Now
Quote:
#8
|
|||||
|
|||||
|
Re: Video Review Needs to Happen Now
Blake, there's one other data point that I think could be useful in that analysis:
# of matches where video replay was available for use. That is, how many matches at the event in question could have possibly had a review? (Basically, how many matches did the event play? Unless, of course, it's elims-restricted, in which case it's how many elims matches were played.)

The reason that particular data point could be useful is that it gives a broad-spectrum picture of how heavily video replay could be used, if available. Just as an example: if at 5 offseason events (that offer replay review), each with 50 matches, replay is used 5 times and the call on the field is overturned* once, then you could say there's a 2% use rate and, when used, a 20% overturn rate--but overall, the overturn rate is 0.4%, because only one call was overturned out of 250 matches. That sort of data can be used to figure out timing of events (and what the limits for review are) and ref performance. Both those numbers could be pretty important, the latter feeding into better ref crews if need be.

*Change of any sort to the result of the match--points, winner, you get the idea.

I'd suggest that "control" events track the number of scoring changes (team or field staff initiated), and the number of (denied, obviously) requests for review.
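The arithmetic in that example is easy to sanity-check. Here's a minimal sketch (the helper function and its inputs are hypothetical, using only the example numbers from this post, not real event data):

```python
# Back-of-the-napkin replay-review rates, using the example figures above:
# 5 offseason events x 50 matches, 5 reviews requested, 1 call overturned.

def replay_rates(events, matches_per_event, reviews, overturns):
    """Return (use_rate, overturn_rate_when_used, overall_overturn_rate)."""
    total_matches = events * matches_per_event
    use_rate = reviews / total_matches        # how often replay is invoked
    overturn_when_used = overturns / reviews  # reviews that change the result
    overall = overturns / total_matches       # result changes per match played
    return use_rate, overturn_when_used, overall

use, when_used, overall = replay_rates(events=5, matches_per_event=50,
                                       reviews=5, overturns=1)
print(f"use rate: {use:.1%}")                   # 2.0%
print(f"overturn rate when used: {when_used:.1%}")  # 20.0%
print(f"overall overturn rate: {overall:.1%}")  # 0.4%
```

The same three numbers would be worth tracking per event, since the use rate speaks to schedule impact while the overturn rates speak to ref accuracy.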
#9
|
||||
|
||||
|
Re: Video Review Needs to Happen Now
Quote: