paper: Team 33 Scouting Sheet 2012

Thread created automatically to discuss a document in CD-Media.

Team 33 Scouting Sheet 2012
by: IKE

This is a filled-out scouting sheet for 2012 (Troy event). This year I decided to put a filled-out sheet up on CD to show how the sheets look going into pick-list formation Friday afternoon/night.

We like having one consistent sheet because we can use them like playing cards and do some quick sorts. The original goal was to ensure that all the relevant and necessary data for a team is easy to find and accessible. This year, several teams showed me the “scouting reports” that they send to their Coach, and I am really impressed and will likely make some changes for next year. 2337 has one of my personal favorites.

33_Scoutsheet_Troy2012.pdf (962 KB)

This particular sheet was from the first 10 rounds at Troy. The blanks at the top of the page are totals for some of the key metrics after the first 8 rounds (Friday). This sheet shows the importance of comments: in several of the matches, the robot died shortly into the match. Whether a robot dies during a match is very important information for scouting. If you are looking for a strong, consistent strategy, a robot that dies could send you home early. If you are an alliance captain that needs a wild card to beat a great alliance, a high-scoring robot that sometimes dies could be just the ticket for pulling off an upset.
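The consistency-versus-ceiling tradeoff above can be made concrete with a quick summary of match-by-match scores. A minimal sketch (the team labels and scores below are made up for illustration, not from the attached sheet), where a 0 marks a match in which the robot died early:

```python
from statistics import mean

# Hypothetical match-by-match scores (made-up numbers); 0 = robot died early.
steady = [30, 32, 28, 31, 29, 30, 31, 30]
wildcard = [55, 0, 60, 0, 58, 52, 0, 57]

for name, scores in [("steady", steady), ("wildcard", wildcard)]:
    deaths = sum(s == 0 for s in scores)
    print(f"{name}: avg {mean(scores):.1f}, died in {deaths} of {len(scores)} matches")
```

Here the wildcard robot actually averages more points despite dying in three matches, which is exactly why it can be the right pick for an underdog alliance chasing an upset but a risky one if you need consistency.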

We have a team of 7+ scouts in the stands taking the data. One lead scout organizes the sheets and ensures quality, reviewing sheets after each match. Six scouts are each dedicated to watching one robot in the match. We have 2 lead scouts on our team that rotate the lead role, and about 20 students rotate through the role of taking match data, though there is a core group that clearly scouts more matches than the others.

Thanks for sharing! :slight_smile:

How do you determine who scouts which matches? Do you have structured shifts or a minimum number of matches per person per event, or something else?

It has mostly been on a voluntary basis. We ask that students help and support the roles. At the District events this is usually not as big an issue as at MSC and the World Championship, due to the number of distractions and the good weather available there. We did have some complaints this year about shifts being too long and about some of the students not participating enough. We are discussing with our students how to improve this. With 7 students, if someone has to leave for an emergency, the lead scout is able to pick up the slack for a match or two.

Motivating students is a tricky subject, and it often requires a solution tailored to each team. On our team we praise good behaviours publicly. This is more than just saying “thanks, guys, for doing a good job.” Being specific and discriminating about which behaviours are praiseworthy has had good results. I suspect that the issues we had in a couple of minor areas were due to not applying the same emphasis and praise as we did the previous couple of years.

Thanks for the info. I know that last year our scouting program was disorganized and too subjective to realize its full potential. For that reason, this year was all about Data-Driven Decisions, meaning no comments or opinions on the scouting sheets. I can’t deny it worked well: it produced a pick list which let us fairly accurately predict most of the first round of alliance selections, and it was quite remarkable for its reliability in predicting match scores. Still, well-organized comments might add the extra push to make up for the loss of our lead strategist when he decides to step down (he could only make it to the occasional meeting and our competitions this year).

Looking into the future, I have a few questions. How do you keep everyone’s rankings, opinions, and comments consistent when some categories appear to be entirely in the eye of the beholder? Another difference I noticed from our sheets is that missed shots are separated by target. How would one determine which hoop a shot was intended for when it is not clear? Also, how would one record an attempted high shot which misses and lands in the middle hoop for two points? Finally, if I may, how do you organize the sheets for each team so that they can all be found in a timely manner for each match?

Sorry for all the questions–especially the seemingly obvious ones–I just like this setup and want to understand it more thoroughly.

The lead scout reinserts the sheets into a folder in numerical order after each match. This is pretty easy with only 6 sheets per match.

There are a lot of subjective categories on our sheets. Subjective stuff is less critical, but it is good to have opinions on it. If 7 out of 8 scouts think the team looks slow, they are probably slow. If a team gets mostly 5s and 4s, then they are pretty good in those categories.
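The “7 out of 8” idea above is just majority aggregation across scouts. A minimal sketch (the helper name, threshold, and sample ratings are my own illustration, not part of Team 33’s actual system):

```python
from statistics import mean

def looks_slow(ratings, threshold=2):
    """True when a majority of scouts rated the team at or below `threshold`
    on a 1-5 speed scale. Illustrative helper, not Team 33's real sheet."""
    return sum(r <= threshold for r in ratings) > len(ratings) / 2

# Seven of eight scouts rate this team a 1 or 2 for speed.
ratings = [2, 1, 2, 2, 2, 1, 2, 4]
print(looks_slow(ratings), round(mean(ratings), 2))  # prints: True 2.0
```

The same pattern works for any subjective category: one noisy opinion matters little, but agreement across most of the stand is a usable signal.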

A made shot is a made shot, even if it was intended to go elsewhere. These were quite few. In general, it was pretty easy to see which hoop a team was shooting at. The most important metric we found was point totals, followed by hybrid points, then shooting percentage, and… Some of these you can figure out beforehand. Some you figure out by watching closely.
In my opinion, the big things we missed were quantifying blocks of inbounders, and we valued shooting position a bit too low at one event.
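That metric priority (point totals first, then hybrid points, then shooting percentage) amounts to a lexicographic sort, which is essentially the quick “playing card” sorts of the sheets. A minimal sketch (team labels and stats below are made up for illustration):

```python
# Hypothetical per-team totals (made-up numbers) for the sheet's key metrics.
teams = [
    {"team": "A", "points": 41.8, "hybrid": 10.3, "shot_pct": 0.58},
    {"team": "B", "points": 45.2, "hybrid": 9.1, "shot_pct": 0.62},
    {"team": "C", "points": 45.2, "hybrid": 11.0, "shot_pct": 0.55},
]

# Sort by point totals first, breaking ties with hybrid points, then shooting %.
ranked = sorted(
    teams,
    key=lambda t: (t["points"], t["hybrid"], t["shot_pct"]),
    reverse=True,
)
print([t["team"] for t in ranked])  # prints: ['C', 'B', 'A']
```

Teams B and C tie on point totals, so hybrid points break the tie, mirroring the priority order described above.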

Our system is far from perfect, but it works pretty well for us, which is the main thing. Since we implemented it a few years ago, our win-loss record and seedings have been better, and the kids have much more realistic views of what good and average performances are.

I liked the idea Karthik and 1114 use with the Sim Bucks. Has anyone else done this?

If you didn’t watch the video, 1114’s scouts have Sim Bucks with which the students bet on the robots in a match and try to make as much money as possible. It’s a great idea, in my opinion. :slight_smile:

I know at least one out-of-state team at MSC that was using this approach.