You may have noticed that match times shown on TBA this year are a little more accurate than they’ve been in years past. I wrote a tech talk blog post describing how we’re able to algorithmically predict when a match will start!
It’s cool you are looking into using vision to help with this next year! Have you thought about using audio as well?
(I already posted this comment on Facebook but I’ll post here so others can see it)
Cool stuff!
Will predicted match times be available from the API in the future? (I don't think they're in the Match model yet; at least, the times in the Match model didn't seem to change while I was experimenting with it.)
I was working with my team on a feature for an as-yet unreleased Slack bot for FRC teams and would have loved access to predicted times.
Audio sounds (no pun intended) like a better cue for fog-horned matches and for instances where the Audience Display overlay doesn't show up immediately. Definitely worth looking into.
Yes, scheduled, predicted, and actual match times will be available in APIv3 (docs coming soon, we promise).
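For the Slack bot use case above, here's a minimal sketch of what consuming those times might look like once they land in APIv3. The field names (`time`, `predicted_time`) and the Unix-timestamp format are assumptions on my part, since the docs aren't out yet:

```python
# Hypothetical sketch: assumes the APIv3 Match model will expose the
# scheduled start as "time" and the predicted start as "predicted_time",
# both as Unix timestamps. Field names are guesses pending the docs.

def match_delay_minutes(match):
    """Return how many minutes behind schedule a match is predicted to
    run, or None if either timestamp is missing from the payload."""
    scheduled = match.get("time")            # scheduled start (assumed field)
    predicted = match.get("predicted_time")  # predicted start (assumed field)
    if scheduled is None or predicted is None:
        return None
    return (predicted - scheduled) / 60.0

# Example: a match scheduled for one time but predicted 12 minutes late.
sample = {"key": "2016nyny_qm42", "time": 1460221200, "predicted_time": 1460221920}
print(match_delay_minutes(sample))  # 12.0
```

A bot could poll this and ping a team's channel when their next match's predicted delay changes by more than a few minutes.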
Audio seems really cool in theory, and I think there's a lot you could do with it. For it to be effective, though, we'd need more consistent webcasting (beyond this year's move in that direction). If the system relied on audio and an event was using YouTube or another provider that mutes the audio whenever music plays, you'd be stuck. In an ideal world you'd use a combination of audio and visual cues, with the algorithms falling back from one to the other.
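The fallback idea above could be sketched roughly like this. Everything here is illustrative: the cue names, the idea of each detector reporting a confidence (or nothing at all when its source is muted/missing), and the threshold value are all assumptions, not how TBA actually does it:

```python
# A minimal sketch of cue fusion with fallbacks: try each match-start
# detector in priority order, skipping any whose source is unavailable
# (e.g. a muted YouTube webcast). Names and threshold are illustrative.

def detect_match_start(cues, threshold=0.8):
    """cues: list of (name, confidence) pairs in priority order, where
    confidence is None if that source produced no reading.
    Returns the name of the first cue that clears the threshold,
    or None, meaning fall back to the schedule-based prediction."""
    for name, confidence in cues:
        if confidence is None:        # source unavailable -> fall back
            continue
        if confidence >= threshold:   # confident detection -> use it
            return name
    return None

# Audio is muted on this webcast (None), so the visual cue wins.
print(detect_match_start([("audio_foghorn", None), ("overlay_vision", 0.93)]))
```

The priority ordering is doing the real work: a reliable cue like the fog horn can sit first, with vision and then the pure scheduling model as successive fallbacks.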