2022 Chezy Champs Ranking Projection Contest

Announcing the 2022 Chezy Champs Ranking Projection Contest. We’re back again after taking 2020 and 2021 off. This will be a similar contest to the ones I hosted in previous years; see those contests below:
2018
2019

Overview
All entrants will submit an average predicted rank for each team at CC (although if you do have a full probability distribution, I’d love to see it). What I like about average rank is that it provides a way to quantify uncertainty. If you think two teams will seed about equally well, that’s fine; just give them the same predicted rank. If you don’t like dealing with uncertainty, just submit a list of teams from best to worst with predicted ranks 1 to (number of teams). The winner of this contest will receive eternal bragging rights. As of now I have no sponsor for this event since I threw it together at the last minute, but if anyone wants to sponsor it, DM me. If we do get any prizes, my entry will not be eligible to win them.

How to submit

  1. Go to this link
  2. Go to the “Predictions” tab
  3. Add your name to an open column in the top row (ideally your CD username for everyone else’s sake)
  4. Edit the values in your column as you see fit, and make sure your predictions line up correctly with the team list in the first column.

Rules

  • Limit one entry per person
  • Submission deadline is Thursday 9/22 at 11:59 PM Eastern time.
  • You can make changes to your predictions as frequently as you want up to the deadline, but at the deadline I will copy all submissions as they are for official scoring.
  • If there is any prize other than bragging rights, the winner must reply to my PM asking for contact info within one week of the completion of Chezy Champs; otherwise, the prize is forfeited.
  • No making fun of other people’s predictions unless you submit your own
  • I reserve the right to remove any submission for any reason (e.g. if I find out you are throwing matches to improve your predictions).
  • I reserve the right to edit or add any rules at any time.

Scoring
After qual matches are completed at Chezy Champs, I’ll take the Root Mean Square Error (RMSE) of each submission compared to the actual rankings. The submission with the lowest RMSE wins. For example, here is a sheet showing how I would have calculated the RMSE for my 2018 pre-event IRI projections. If your submission is missing a team, I will use a predicted rank of ((total teams) + 1) / 2 for that team. If you include a team that doesn’t end up competing, that team will be ignored in the score.
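For anyone who wants to sanity-check their own entry, here is a minimal Python sketch of how I understand the scoring to work. The teams and ranks below are made up, and the function name is just for illustration; the official scoring happens in the spreadsheet.

```python
import math

def score_submission(predicted: dict[int, float], actual: dict[int, int]) -> float:
    """RMSE of a submission's predicted ranks against the actual qual rankings.

    Teams missing from the submission get a neutral predicted rank of
    (total_teams + 1) / 2; predicted teams that didn't compete are ignored.
    """
    total_teams = len(actual)
    neutral = (total_teams + 1) / 2
    squared_errors = [
        (predicted.get(team, neutral) - rank) ** 2
        for team, rank in actual.items()
    ]
    return math.sqrt(sum(squared_errors) / total_teams)

# Toy example (made-up numbers): team -> rank
predicted = {254: 1, 1678: 2, 973: 5}
actual = {254: 3, 1678: 1, 973: 4, 1690: 2}
print(round(score_submission(predicted, actual), 2))  # 1.25
```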

Other
Feel free to edit anything in the workbook except other people’s submissions; you’re welcome to make cool graphs/analysis, add more tabs to the workbook, and so on. Also, you can use this thread to share whatever you want, so if you want to share match-by-match WLT predictions, bonus RP predictions, team strength estimations, etc., by all means do so in here or in the shared workbook.

Well, have at it. Keep in mind the team list could still change, so keep an eye on that if you submit early. I’m really interested in how people derive their predictions, so at a minimum please also leave a short description of how you derived yours, including whether you were building off of someone else’s work or starting from scratch.

6 Likes

This is gonna be fun

1 Like

One week left! Note that a preliminary schedule has been posted here: Chezy Champs 2022 - #38 by NickE
Entrants may want to consider the effects of the schedule when making predictions.

For anyone curious, my predictions come from 10K simulations of my event simulator using the preliminary schedule.

The only tweaks I made to the raw output were to subtract 0.1 points from each team’s cargo RP ILS and 0.05 points from each team’s hangar RP ILS. These changes reflect the increased RP thresholds at this event. I chose the size of the subtractions based on a quick look at teams’ RP probabilities compared to their average scored cargo or average endgame points. An alliance with a combined average of 30 cargo scored should have an estimated RP probability of around 50%. I found that getting to that point required a 0.2 drop per team in cargo RP ILS and a 0.1 drop per team in hangar RP ILS; I cut that effect in half, to 0.1 and 0.05 respectively, to account for uncertainties (namely, the effects of modified gameplay strategies in response to the new thresholds).
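For anyone who wants to play with this kind of workflow, here is a minimal Python sketch of the general idea: subtract the flat adjustments from each team’s RP ILS values, simulate the event many times, and average each team’s rank. The ratings, the field names, and the “simulator” itself (a random shuffle here) are all stand-ins; my actual simulator is much more involved.

```python
import random
import statistics

# Hypothetical per-team ratings; the real simulator's inputs look different.
team_ratings = {
    254:  {"cargo_rp_ils": 0.85, "hangar_rp_ils": 0.70},
    1678: {"cargo_rp_ils": 0.80, "hangar_rp_ils": 0.65},
    973:  {"cargo_rp_ils": 0.60, "hangar_rp_ils": 0.55},
}

# Flat adjustments for the higher RP thresholds at this event
# (half of the ~0.2 / ~0.1 drops implied by the raw RP-probability comparison).
for ratings in team_ratings.values():
    ratings["cargo_rp_ils"] -= 0.10
    ratings["hangar_rp_ils"] -= 0.05

def simulate_event(ratings):
    """Stand-in for one simulated run of the qual schedule.

    Returns a team -> final rank mapping; here it is just a random ordering.
    """
    order = list(ratings)
    random.shuffle(order)
    return {team: rank for rank, team in enumerate(order, start=1)}

def average_ranks(ratings, n_sims=10_000):
    """Average each team's rank over n_sims simulated events."""
    ranks = {team: [] for team in ratings}
    for _ in range(n_sims):
        for team, rank in simulate_event(ratings).items():
            ranks[team].append(rank)
    return {team: statistics.mean(r) for team, r in ranks.items()}

print(average_ranks(team_ratings))
```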

Feel free to share how you arrived at your predicted rankings!

4 Likes

Note that 1072 has dropped from the event. I have removed their row from the spreadsheet. Due to this and to the preliminary schedule changing, I would advise everyone to review their submissions before the deadline.

2 Likes

Alright, predictions are locked. We had 22 entrants this year, almost double the 2019 total. It’s awesome to see so many.

For any entries that were missing predictions for some teams, I set the prediction for each missing team to 20.

Here are the standard deviations for each entry:

Entry Standard Deviations
Entrant stdev
Person someone 11.86
Andre Sanchez955 11.84
OPR rank 11.78
Sagi346 11.78
Joel340 11.76
J_Beta 11.72
Boganafoganasee 11.72
anshul 11.69
Cirrus 11.66
Taylor 11.62
Matthia 11.6
Britain 11.41
Random McRandomFace 11.4
Yuyu 11.32
HarrisonKaufman1619 11.07
Anthony_Galea 10.93
MichaelBick 10.81
Nobody6502 10.33
justin5026 10.32
Average 9.71
Pompano 9.56
Strategos 9.17
Caleb 9.11
Seth 6.38
Ignoramus 0

A higher standard deviation means a more confident prediction: a confident entry spreads teams across the full range of ranks, while a hedged entry bunches teams toward the middle. In 2019 I found that most submissions were over-confident. We’ll see if that’s the case again this year.
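For a concrete illustration of the two extremes (using a hypothetical 40-team field and assuming the sheet computes the usual sample standard deviation):

```python
import statistics

n_teams = 40  # hypothetical field size

strict_ordering = list(range(1, n_teams + 1))   # ranks 1..N: maximally confident
all_middle = [(n_teams + 1) / 2] * n_teams      # every team tied at the middle

print(round(statistics.stdev(strict_ordering), 2))  # 11.69
print(statistics.stdev(all_middle))                 # 0.0, presumably the Ignoramus approach
```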

Finally, here are all teams sorted by the standard deviation of their predicted ranks:

Summary
Team min mean max stdev
4499 2 17.2 40 8.7
7034 5 16.9 39 7.9
5940 4 13.9 38 7.6
114 14 25.6 41 7.2
8033 15 28.2 40 6.4
6036 8 24.7 40 6.2
3647 10 18.4 37 5.9
3940 8 23.5 34 5.7
696 10 20.2 30 5.6
4255 20 31.5 39 5.6
604 7 16.1 29 5.5
1700 20 34.8 40 5.5
5104 20 29.2 39 5.4
3256 20 31 38 5.4
2046 7 14.2 28 5.4
2813 20 35.1 41 5.4
5507 20 33.3 40 5.4
972 20 32.6 39 5.3
694 14 20.3 33 5.3
7157 7 21.4 33 5.2
2930 20 28.3 40 5.2
359 7 18.2 26 5
3478 4 19.8 25.9 4.9
3175 19 28.3 35 4.8
2486 14 24 36 4.8
2910 2 6.5 20 4.8
649 15 26.1 33 4.7
3476 1 12.9 21 4.6
1619 1 8.5 19 4.6
498 20 34.4 39 4.3
846 10 20.5 30 4.3
971 4 10.6 21 3.9
3310 5 9.8 20 3.9
4414 1.5 6.9 16.5 3.8
6800 1 8.7 14 3.6
254 1 3.8 11.9 3.6
973 1 6.7 17 3.5
1690 1 4.2 11 2.7
1678 1 3.5 8 1.8

4499 has the most discrepancy among their predictions; much of that discrepancy comes from Taylor’s predicted rank of 40 and Yuyu’s predicted rank of 2. At least one of those predictions is going to be very far off. 1678 has the least discrepancy among their predictions, with all of their predicted ranks falling between 1 and 8.

We’ll see where everything lands when the dust settles.

3 Likes

With Saturday complete, we are getting down to the end of the qualification matches. Here are everyone’s scores using the current rankings:

Saturday night RMSE
Entrant End of Saturday RMSE
Caleb 7.1
Britain 7.1
Strategos 7.3
Average 7.7
Anthony_Galea 8.3
Yuyu 8.4
Cirrus 8.5
anshul 8.5
J_Beta 8.5
MichaelBick 8.7
HarrisonKaufman1619 8.8
Nobody6502 9
Joel340 9
Pompano 9.4
OPR rank 9.6
Matthia 9.6
Sagi346 9.6
justin5026 10.2
Boganafoganasee 10.3
Andre Sanchez955 10.3
Person someone 10.6
Seth 10.7
Ignoramus 11.3
Taylor 13.2
Random McRandomFace 15

It’s a tight race for first among me, Strategos, and Britain. We’ll see what happens tomorrow.

3 Likes

Alright, we have some final results now that qualification matches are complete. Here they are:

Final Results
Entrant RMSE
Caleb 6.3
Strategos 6.6
Average 7.6
Britain 7.6
Yuyu 7.7
anshul 8.1
Anthony_Galea 8.2
J_Beta 8.2
MichaelBick 8.3
HarrisonKaufman1619 8.6
Cirrus 8.7
Joel340 8.8
OPR rank 9
Matthia 9.3
Nobody6502 9.4
Sagi346 9.5
Pompano 10
justin5026 10.1
Boganafoganasee 10.1
Andre Sanchez955 10.1
Person someone 10.4
Seth 10.5
Ignoramus 11.3
Taylor 13.9
Random McRandomFace 15

It was a close race between me and @Strategos; I checked a few times today, and there was at least one moment where his score was better than mine, so it really could have gone either way. Thank you to everyone who entered!

Here are the differences between the average predictions and the final results:

Summary
Team   Average prediction   Final rank   Difference (final minus average)
4499 16.4 5 -11.4
359 17.6 8 -9.6
7034 16 7 -9
7157 20.5 12 -8.5
3647 17.5 10 -7.5
3940 22.4 15 -7.4
3310 9.4 4 -5.4
6800 8.2 3 -5.2
6036 23.6 19 -4.6
604 15.3 11 -4.3
2910 6.2 2 -4.2
5940 13.2 9 -4.2
8033 27.1 23 -4.1
498 33 29 -4
649 25 22 -3
2813 33.5 31 -2.5
1678 3.4 1 -2.4
5104 27.9 27 -0.9
4414 6.8 6 -0.8
846 19.8 20 0.2
2046 13.5 14 0.5
2486 23.1 25 1.9
5507 31.7 34 2.3
971 10.2 13 2.8
2930 27.2 30 2.8
3476 12.3 16 3.7
1700 33.3 38 4.7
694 19.2 24 4.8
3256 29.7 35 5.3
4255 30.2 36 5.8
3478 18.8 26 7.2
972 31.2 39 7.8
114 24.4 33 8.6
3175 27.1 37 9.9
973 6.5 17 10.5
696 19.3 32 12.7
1690 4 18 14
254 3.5 21 17.5
1619 8.2 28 19.8

4499, 359, and 7034 top the list of teams that beat their expectations dramatically; congrats to all of those teams!

254 and 1619 were the biggest under-performers relative to expectations. This is likely due in large part to their horrible schedules, and probably also slightly to predictions overestimating them because of their championship wins.

10 Likes