Quote:
Originally Posted by CVR
I think a better method would be to use the following:
Red Score =
Red1_(toteOPR + binOPR + litterOPR)
+ Red2_(toteOPR + binOPR + litterOPR)
+ Red3_(toteOPR + binOPR + litterOPR)
+ MAX(Red1_autoOPR, Red2_autoOPR, Red3_autoOPR)
+ MAX(Red1_coopOPR, Red2_coopOPR, Red3_coopOPR)
I think that method, while slightly more complex, will give more accurate results.
There's an issue with this method that may skew predictions for alliances with more than one team that does a lot of co-op. Let's say:
R1 has a platform OPR of 40 and a co-op OPR of 30
R2 has a platform OPR of 40 and a co-op OPR of 28
R3 has OPRs of 0, for the sake of argument
By the above method, the red alliance would be predicted to score just 110 points (40 + 40 + 0 + 30), since we count R1's co-op OPR but not R2's. But if R2 can usually score 40 platform points *and do co-op most of the time*, surely they wouldn't put up their 40 and then spend the rest of the match twiddling their thumbs. They would use the time they normally spend on co-op to score more platform points instead!
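For concreteness, here's a quick sketch of the quoted method applied to the numbers above (auto OPRs are set to zero since the example doesn't specify them, and "platform OPR" stands in for toteOPR + binOPR + litterOPR):

```python
# Example OPR values from the post above; auto OPRs assumed zero for simplicity.
red_alliance = [
    {"platform": 40, "auto": 0, "coop": 30},  # R1
    {"platform": 40, "auto": 0, "coop": 28},  # R2
    {"platform": 0,  "auto": 0, "coop": 0},   # R3
]

def predict_score(alliance):
    """CVR's proposed method: sum each team's platform OPR,
    then add only the single best auto OPR and best co-op OPR."""
    platform_total = sum(t["platform"] for t in alliance)
    best_auto = max(t["auto"] for t in alliance)
    best_coop = max(t["coop"] for t in alliance)
    return platform_total + best_auto + best_coop

print(predict_score(red_alliance))  # 40 + 40 + 0 + 0 + 30 = 110
```

The MAX over co-op is exactly where R2's 28 points vanish from the prediction, which is the skew being described.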
(There's also the issue of whether the opposing alliance has a high enough Co-op OPR to ensure co-op will be successful, but now we're getting really complicated.)