There is sure a lot going on here.

I understand you want to cover all the bases, but you need to keep an eye on extraneous information. Now maybe I'm dead nuts wrong, maybe you can gather, tabulate, and analyze all that data, taking it all into account. But I've found there are only a few categories one can evaluate at a time. That being said, this sheet is built for data acquisition.
For 2017 (before the game was even revealed), I was planning to move toward a more qualitative methodology, for the express purpose of more eyes-on-field time. As long as you have a few unique bases of comparison, the desired result of ranking can easily be accomplished. The trick is to know what you are looking for and to focus any additional scouting resources on the teams that stand out for those criteria. Qualitative to get the big picture, quantitative for close evaluation.
I am glad to see you have a quasi-qualitative implementation for shooter accuracy. I would break it down into 0-33%, 33-66%, 66-95%, and 95%+, but that's just me. The other qualitative fields are also well done. (With too many options your accuracy may go up but your precision goes down; with fewer options the accuracy decreases but the precision rises.) I have found 4-5 options to be the qualitative 'sweet spot'.
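If your sheet ever moves to an app or spreadsheet script, the bins above are trivial to implement. A minimal sketch (the bin edges are the ones I suggested; the function name and labels are my own, not from your sheet):

```python
def accuracy_bin(pct: float) -> str:
    """Map a shooter accuracy percentage (0-100) to a qualitative bin.

    Bins: 0-33%, 33-66%, 66-95%, 95%+ (suggested edges, adjust to taste).
    """
    if pct < 33:
        return "0-33%"
    elif pct < 66:
        return "33-66%"
    elif pct < 95:
        return "66-95%"
    else:
        return "95%+"

# Example: a shooter hitting 7 of 10 shots lands in the third bin.
print(accuracy_bin(70))  # 66-95%
```

Uneven edges like these catch the outliers (the near-perfect shooters) better than four equal-width bins would.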
My big takeaway from this sheet is:
good organization; it's logical and complete. It includes some qualitative data points where appropriate. My suggestions are:
watch for extraneous information (no sense in collecting it if you are not using it),
redefine/adjust your qualitative distributions to catch the outliers more efficiently, and cut down on the large black areas; your printer won't like them and they will be harder to read.
Overall excellent sheet for logging everything a team is doing over the entire match. Good job.
EDIT: Human error is a huge factor, and you will always want to mitigate it with your alliance partners. It may be worth your time to have the user evaluate a team's human components as well.