Paper: 4536 scouting database 2017

Thread created automatically to discuss a document in CD-Media.

4536 scouting database 2017
by: Caleb Sykes

This is a database which contains component calculated contributions and other metrics for all official FRC events.

This is a scouting database which calculates component calculated contributions (OPRs) and other metrics using data from the TBA API. Each sheet currently contains data from a distinct FRC event. A new database will be published weekly, within a day or two of all of that week’s events being completed. For sheets covering events that have not yet occurred, seed values for each category are provided to aid in pre-scouting.
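For readers unfamiliar with how OPR-style contributions are derived: the conventional approach is a least-squares fit over alliance scores, where each team gets a single number estimating its contribution. The sketch below shows the idea in Python; the team numbers and scores are hypothetical, not taken from the actual workbook, and the real database computes this per component (auto, teleop, etc.) rather than just total score.

```python
import numpy as np

# Hypothetical alliance results: (teams on the alliance, alliance score).
# In the real database these would come from the TBA API for one event.
matches = [
    (("254", "1678", "4536"), 120),
    (("254", "118", "2056"), 135),
    (("1678", "118", "4536"), 110),
    (("2056", "4536", "118"), 105),
    (("254", "1678", "2056"), 140),
    (("118", "254", "4536"), 125),
]

teams = sorted({t for alliance, _ in matches for t in alliance})
index = {t: i for i, t in enumerate(teams)}

# Design matrix: one row per alliance, with a 1 in each participating
# team's column; b holds the alliance scores.
A = np.zeros((len(matches), len(teams)))
b = np.zeros(len(matches))
for row, (alliance, score) in enumerate(matches):
    for t in alliance:
        A[row, index[t]] = 1.0
    b[row] = score

# Least-squares solution: each team's estimated contribution to score.
opr, *_ = np.linalg.lstsq(A, b, rcond=None)
for t in teams:
    print(t, round(float(opr[index[t]]), 1))
```

Because every row of the design matrix has exactly three ones, the fitted alliance scores always sum to the same total as the observed scores, which is a quick sanity check on any OPR implementation.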

4536_scouting_database_2017.0.1.xlsx (4.81 MB)
4536_scouting_database_2017.1.0.xlsx (9.61 MB)
4536_scouting_database_2017.2.0.xlsx (10.4 MB)
4536_scouting_database_2017.3.0.xlsx (10.9 MB)
4536_scouting_database_2017.4.0.xlsx (11.8 MB)
4536_scouting_database_2017.5.0.xlsx (12.7 MB)
4536_scouting_database_2017.5.1.xlsx (12.7 MB)
4536_scouting_database_2017.6.0.xlsx (13.5 MB)
4536_scouting_database_2017.6.1.xlsx (13.7 MB)
4536_scouting_database_2017.7.0.xlsx (14.3 MB)
4536_scouting_database_2017.7.1.xlsx (14.5 MB)
4536_scouting_database_2017.8.0.xlsx (14.8 MB)
4536_scouting_database_2017.9.0.xlsx (15.1 MB)

Primary discussion thread can be found here.

For week 2 and onward events, the sheet names at the bottom don’t seem to match up with the data that is shown on each page.

The Israel 2 page shows data for Arkansas Rock, Arkansas Rock shows data for San Diego, etc.

Good catch, I have uploaded a revised version that fixes these errors.

Caleb
I was looking for a way to use this spreadsheet for scouting at a regional, like the Ed Law Spreadsheet. This would be a powerful tool for team scouting.

Check back in a day or two; I will be uploading a different workbook soon called the “4536 event simulator” that can be used during an ongoing event.

Thanks, friend. I see an admirable dedication in your efforts with statistics and robots.

I downloaded the file about 5 minutes ago, and the results are identical for every team… this isn’t intentional, right?

What you are seeing is the week 0 version, which has no 2017 data in it yet. Week 1 version will be uploaded later tonight.

Thanks for the Wk4 data. It shows the championship will be murderers’ row for an unprepared team. :eek::yikes: :ahh: :smiley:

Hi Caleb, when will the updated file including St. Louis be posted? Thanks, Kathleen

I’ll get it up as soon as I can, but I’m a little busy these next few days. It is normally pretty straightforward to update, but I want to fix the Elo ratings so that they account for MSC finals and Einstein matches. It will be up Wednesday night at the latest.
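For context on what “fixing the Elo ratings” involves: extending Elo through additional matches like MSC finals and Einstein is just more applications of the same update rule. The sketch below is the standard Elo update, not necessarily the exact variant used in this database (Caleb’s FRC Elo incorporates match-margin details not shown here); the ratings are illustrative.

```python
def elo_expected(r_a, r_b):
    """Expected score for side A under the standard Elo logistic model."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))

def elo_update(r_a, r_b, outcome_a, k=32):
    """Return updated ratings after one match.

    outcome_a is 1 for a side-A win, 0.5 for a tie, 0 for a loss.
    The update is zero-sum: whatever A gains, B loses.
    """
    delta = k * (outcome_a - elo_expected(r_a, r_b))
    return r_a + delta, r_b - delta

# Accounting for extra matches (e.g. finals) is just more updates in sequence:
red, blue = 1550.0, 1450.0
red, blue = elo_update(red, blue, 1)  # red alliance wins the match
print(red, blue)
```

With evenly matched sides the expected score is 0.5, so at k=32 the winner gains exactly 16 points; a favorite gains less for winning and loses more for an upset.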

Thank you so much Caleb.

I found the data in your spreadsheets very informative; thank you for posting regularly. This is the first year our team has started to look at the statistics, and we were able to leverage the information when forming strategies with our alliance partners. Alpha Dogs 4946 were finalists in the Curie Division in St. Louis!

Thanks again. Kathleen

I’d like to see a comparison of the predictive ability of OPR and Elo. I noticed the results were markedly different between them.

Week 9 data has been added. Elo ratings on the “seed values” sheet account for MSC finals and Einstein, the Elo ratings on other sheets do not. Unless I find an error, this will be the last update for this year.

Sure, I’ll check more into this and make a post about it sometime in the next few days. My Elo predictions for 2017 were less predictive than they have been for any year since at least 2008. I expect OPR predictions to be comparably poor, and the average of the two to be slightly better than either on its own, but we’ll have to see.
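One common way to compare probability forecasts like these is the Brier score (mean squared error of the predicted win probabilities; lower is better), which also makes it easy to test whether averaging the OPR and Elo forecasts beats either alone. The sketch below uses made-up predictions and outcomes purely to illustrate the comparison, not actual 2017 results.

```python
# Hypothetical match predictions: (OPR-based red win probability,
# Elo-based red win probability, outcome: 1 if red won, 0 if blue won).
predictions = [
    (0.70, 0.60, 1),
    (0.40, 0.55, 0),
    (0.80, 0.75, 1),
    (0.30, 0.45, 1),
    (0.65, 0.50, 0),
]

def brier(pairs):
    """Mean squared error of probability forecasts; lower is better."""
    return sum((p - outcome) ** 2 for p, outcome in pairs) / len(pairs)

opr_score = brier([(p, o) for p, _, o in predictions])
elo_score = brier([(p, o) for _, p, o in predictions])
avg_score = brier([((p1 + p2) / 2, o) for p1, p2, o in predictions])

print(f"OPR Brier: {opr_score:.4f}")
print(f"Elo Brier: {elo_score:.4f}")
print(f"Avg Brier: {avg_score:.4f}")
```

Whether the average actually helps depends on how correlated the two systems’ errors are; when they miss different matches, blending tends to win.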