#1
Re: Twitter decoding program
Quote:
Also, see attachment. Does anyone know how the 205 TeleOp points number was computed for Team 67 at Waterford? Unlike the Hybrid and Bridge points, it does not appear to equal the total of the alliance TeleOp points scored in the 12 qual matches by the alliances in which Team 67 was a member.
#2
Re: Twitter decoding program
Quote:
It would be great if someone with a working Twitter parser could compare the ranking results when QS, Hybrid, and Bridge scores are all tied, to see which field is being used as the tiebreaker. Looking at the Alamo Regional, there were two such cases: 2721 tied with 4162, and 2583 tied with 2969. Unfortunately, the differences between those teams' TPs were large enough that including foul points would be unlikely to change the ranking. I did not see any such ties at Kansas City, BAE, or Smokey Mountain, but there are still a lot of events I didn't look at. I did ask about this on Q&A.
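For anyone attempting that comparison, here is a minimal sketch of the experiment (the TeamRank struct and its fields are hypothetical; substitute whatever structures your parser produces). Sort the standings once per candidate fourth key (e.g. TeleOp with and without foul points) and see which ordering matches FIRST's published ranks.
Code:
#include <algorithm>
#include <string>
#include <vector>

// Hypothetical standings record; swap in your parser's real structures.
struct TeamRank {
    std::string team;
    double qs, hybrid, bridge, candidate;  // candidate = tiebreaker under test
};

// Sort by QS, then Hybrid, then Bridge, then the candidate tiebreaker.
void sortWithCandidate(std::vector<TeamRank>& ranks) {
    std::sort(ranks.begin(), ranks.end(),
              [](const TeamRank& a, const TeamRank& b) {
                  if (a.qs != b.qs) return a.qs > b.qs;
                  if (a.hybrid != b.hybrid) return a.hybrid > b.hybrid;
                  if (a.bridge != b.bridge) return a.bridge > b.bridge;
                  return a.candidate > b.candidate;
              });
}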
#3
Re: Twitter decoding program
I have now added least-squares solving to better estimate the "impact" each team had on the score. The results are now much more accurate and better predict the outcomes of the final matches: the sum of the average contributions of an alliance's members lands within about ±5 points of that alliance's actual total (excluding outliers like Team 93). (I am only using data from the qualifying matches to predict the final rounds.) While hand-recording the individual scores of each team would of course be more accurate, this should be a great help in determining which teams provide the most "positive" points in the finals.
Last edited by Lalaland1125 : 16-03-2012 at 00:11.
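A minimal sketch of the setup, with made-up match data (the real input comes from the parsed Twitter feed; my full parser is not shown here). Each alliance in each qual match becomes one row of a matrix A, with a 1 in the column of every team on that alliance and the alliance's total score in the right-hand-side vector b. Solving A x = b in the least-squares sense gives each team's estimated average contribution, and summing an alliance's three entries gives a predicted score.
Code:
#include <Eigen/Dense>
#include <iostream>
#include <vector>

int main() {
    // Made-up qual data: 3 matches, 6 teams, 3 teams per alliance.
    struct Alliance { std::vector<int> teams; double score; };
    std::vector<Alliance> rows = {
        {{0, 1, 2}, 42.0}, {{3, 4, 5}, 30.0},   // match 1
        {{0, 3, 4}, 38.0}, {{1, 2, 5}, 35.0},   // match 2
        {{0, 2, 4}, 40.0}, {{1, 3, 5}, 33.0},   // match 3
    };
    const int numTeams = 6;

    // A(r, t) = 1 if team t played on row r's alliance;
    // b(r) = that alliance's total score.
    Eigen::MatrixXd A = Eigen::MatrixXd::Zero(rows.size(), numTeams);
    Eigen::VectorXd b(rows.size());
    for (int r = 0; r < (int)rows.size(); ++r) {
        for (int t : rows[r].teams) A(r, t) = 1.0;
        b(r) = rows[r].score;
    }

    // Least-squares solve via SVD; x(t) is team t's estimated
    // average point contribution.
    Eigen::VectorXd x =
        A.jacobiSvd(Eigen::ComputeThinU | Eigen::ComputeThinV).solve(b);
    std::cout << "Estimated contributions:\n" << x << "\n";

    // Predicted score for a hypothetical finals alliance of teams 0, 2, 4:
    std::cout << "Predicted: " << x(0) + x(2) + x(4) << "\n";
}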
#4
Re: Twitter decoding program
Quote:
http://www.chiefdelphi.com/forums/sh....php?p=1144595
http://www.chiefdelphi.com/forums/sh....php?p=1144727

Oh, and a couple of questions: what linear algebra library are you using, and is there a reason you are using SVD?
#5
Re: Twitter decoding program
The math library is Eigen.
I am using SVD because that is how Eigen's tutorial describes performing a least-squares solve (http://eigen.tuxfamily.org/api/Tutor...Leastsquares). I don't think missing scores will be that big a problem; as long as most of the scores are posted, there should be enough data to get a reasonably accurate result. If anything, the main problem with my model is that it is very limited: it does not account for defense, autonomous play, etc.
Last edited by Lalaland1125 : 16-03-2012 at 00:15.
#6
Re: Twitter decoding program
Quote:
For this application, LDLT would be far faster* and plenty accurate.
Quote:
* For computing least squares for a single event, the matrix is small enough that the time difference is probably not even noticeable. But if you ever intend to expand the functionality to compute least squares for a matrix containing all the data from an entire year's worth of events, I believe there would be a very noticeable difference in speed. If you have the time and are so inclined, it would be interesting to try SVD with 2011's data and see what the computation time is. For reference, LDLT takes 12 seconds on my 8-year-old PC to do least squares on a matrix populated with all the qual data from all the events in 2011.
Last edited by Ether : 16-03-2012 at 17:42.
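For concreteness, here is a sketch of the LDLT variant (with A and b built exactly as in the earlier post: one 0/1 row per alliance per match, alliance score on the right-hand side). Form the normal equations and factor the small square matrix A^T A, which is only numTeams x numTeams regardless of how many matches A holds. Squaring the condition number this way is the usual caveat with normal equations, but a 0/1 alliance matrix should be well conditioned enough that it is not a concern here.
Code:
#include <Eigen/Dense>

// Least squares via the normal equations (A^T A) x = A^T b, factored
// with LDLT (a pivoting Cholesky variant). A^T A is numTeams x numTeams,
// so the factorization stays cheap even when A contains a whole season
// of matches, unlike an SVD of the full tall matrix.
Eigen::VectorXd solveNormalEquations(const Eigen::MatrixXd& A,
                                     const Eigen::VectorXd& b) {
    return (A.transpose() * A).ldlt().solve(A.transpose() * b);
}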
#7
Re: Twitter decoding program
Quote:
Quote:
Last edited by Ether : 22-03-2012 at 14:03.
#8
Re: Twitter decoding program
Regarding Sacramento, I talked with the field crew/FTA, and their Twitter posts were apparently being blocked by the firewall. If the same happens at SVR, we will have a team member manually copy down the data so it won't be lost.
#9
Re: Twitter decoding program
Quote:
But in all seriousness, why should that even be necessary? This data has significant statistical value and historical interest, and it is already in electronic form. Does anyone know whether there is a compelling reason why the data is being discarded instead of being saved at the point of origin?