#19
Re: Offseason Video Review Pilot-Volunteers?
Quote:
The best way to do this is to capture a bunch of data, and score every match under differing sets of assumptions using independent reviewers. Setting up a camera or two, and only allowing a maximum of 8 samples (one challenge per alliance, playoffs only) won't provide enough data to convince anyone to act.

The question I'm trying to answer is the percentage of time these variations prevent a review from correcting a missed call:

* lack of time sync
* low resolution
* slow shutter speeds (blurry)
* bad camera mounting (blurry)
* lack of depth of field (blurry)
* is 30fps adequate?
* bad point of view

For example, consider the FPS value. A robot moving at 10 feet per second moves 4" per frame at 30 frames per second. Is that enough that, in most situations, the correct call can be made?

I missed one thing on my list -- you need the match time, matched with the video.
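To put the frame-rate question in perspective, here's a minimal back-of-the-envelope sketch (my own illustration, not part of any proposed review tooling; the speeds and frame rates are assumed example values) that computes how far a robot travels between consecutive frames:

```python
# Back-of-the-envelope sketch: per-frame displacement of a robot on video.
# The speed and frame-rate values below are assumed examples, not measurements.

def inches_per_frame(speed_ft_per_s: float, fps: float) -> float:
    """Distance traveled between consecutive frames, in inches."""
    return speed_ft_per_s * 12.0 / fps

# A 10 ft/s robot at a few common frame rates:
for fps in (30, 60, 120):
    print(f"10 ft/s at {fps} fps -> {inches_per_frame(10, fps):.1f} in per frame")
# 30 fps -> 4.0 in, 60 fps -> 2.0 in, 120 fps -> 1.0 in
```

At 30 fps a fast robot covers about a third of a foot between frames, which is exactly why "is 30fps adequate?" is on the list.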