Hello! I’m Austin and I run programming and scouting on team 5148. I noticed that many teams this year are opting to focus their efforts on shooting into the high goal. The benefits look great on paper! However, after seeing the game piece itself and watching some matches of xRC Simulator, I’m wondering how worthwhile the low goal is in comparison to the high goal.
To quantify this, I created a quick simulation in Unity that randomly spawns robots around the field, all with almost perfect aim (a slight variance is added to each shot). The result is what you see below; a rough sketch of the setup follows these notes:
Robot Count is the number of robots
Shots is the number of shots taken
Contacts is the number of balls that made contact with the inside of the hub
Goals is the number of goals scored
Accuracy is the ratio of goals to shots
Bounce-In is the ratio of contacts to goals
Ball-to-ball and ball-to-robot collisions are disabled.
The ball has 90% bounciness and 60% friction in this simulation, but real-world results will vary.
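For anyone who wants to poke at the setup, here is a minimal sketch of the spawning and shooting logic in Unity. It's the general shape rather than my exact project code: the prefab, field radius, launch angle, CSV file name, and the way the hub reports contacts and goals back to the script are all placeholders.

```csharp
// Sketch only: spawn "robots" at random field positions and lob balls at the
// hub with near-perfect aim plus a small per-shot variance.
using System.IO;
using UnityEngine;

public class HighGoalSim : MonoBehaviour
{
    public Rigidbody ballPrefab;        // cargo prefab (placeholder)
    public Transform hubCenter;         // center of the upper hub opening
    public float fieldRadius = 8f;      // in-game units, not feet
    public float aimVarianceDeg = 1f;   // the "almost perfect aim" variance
    public float launchAngleDeg = 70f;  // fixed launch angle (placeholder)

    public int shots, contacts, goals;  // the counters behind the table above
    StreamWriter log;

    void Start()
    {
        // Ball physics from the post: 90% bounciness, 60% friction.
        var cargo = new PhysicMaterial("Cargo")
        {
            bounciness = 0.9f,
            dynamicFriction = 0.6f,
            staticFriction = 0.6f,
            bounceCombine = PhysicMaterialCombine.Maximum
        };
        ballPrefab.GetComponent<Collider>().sharedMaterial = cargo;

        // Per-shot log that later gets pulled into Tableau.
        log = new StreamWriter("shots.csv") { AutoFlush = true };
        log.WriteLine("distance,scored");

        InvokeRepeating(nameof(FireOneShot), 0f, 0.1f);
    }

    void FireOneShot()
    {
        // Random "robot" position on the field.
        Vector2 p = Random.insideUnitCircle * fieldRadius;
        Vector3 origin = new Vector3(p.x, 0.5f, p.y);

        // Perfect aim: flat-ground ballistic speed toward the hub at a fixed
        // angle (ignores the hub's height, fine for a sketch), then a small
        // random kick so each shot isn't identical.
        Vector3 flat = hubCenter.position - origin;
        flat.y = 0f;
        float angle = launchAngleDeg * Mathf.Deg2Rad;
        float speed = Mathf.Sqrt(flat.magnitude * Physics.gravity.magnitude / Mathf.Sin(2f * angle));
        Vector3 aim = Quaternion.AngleAxis(launchAngleDeg, Vector3.Cross(flat, Vector3.up).normalized) * flat.normalized;
        Quaternion jitter = Quaternion.Euler(Random.Range(-aimVarianceDeg, aimVarianceDeg),
                                             Random.Range(-aimVarianceDeg, aimVarianceDeg), 0f);

        Rigidbody ball = Instantiate(ballPrefab, origin, Quaternion.identity);
        ball.velocity = jitter * aim * speed;
        shots++;
    }

    // Hooked up to trigger colliders on the hub (wiring not shown): one call
    // per touch of the inner funnel, one call once the ball settles.
    public void RegisterContact() => contacts++;

    public void RegisterResult(float distance, bool scored)
    {
        if (scored) goals++;
        log.WriteLine($"{distance},{(scored ? 1 : 0)}");
    }

    void OnDestroy() => log.Close();
}
```

In the sketch the hub is just a pair of trigger volumes calling RegisterContact and RegisterResult; the actual hub geometry is what drives the bounce-out behavior discussed further down.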
After roughly 11,000 goals, I exported the data into Tableau, where I made these charts:
(Scatter plot of all of the goals made)
(Heat map of all of the goals made)
One thing is immediately apparent: the closer you are to the goal, the more likely you are to actually score, even with near-perfect aim (looking at you, 254…).
(Histogram of Distance vs Goals. Note: Measured in In-Game units, not IRL)
(Same Scatter plot but with Distance vs Goals as a color-scale)
Everything inside the green area has roughly a 50-70% chance of going in.
Everything inside the red area has roughly a 20-40% chance of going in.
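If you want to double-check those bands against your own export (or a different bounciness/friction combo), a quick stand-alone pass over the per-shot CSV gives you accuracy per distance band. The file name and the distance,scored column layout are the same assumptions as in the sketch above.

```csharp
// Sketch: bin shots by distance (in-game units) and print accuracy per bin.
// Assumes a shots.csv with a header row and "distance,scored" columns.
using System;
using System.Globalization;
using System.IO;
using System.Linq;

class AccuracyByDistance
{
    static void Main()
    {
        const double binSize = 1.0;  // one in-game unit per bin

        var bins = File.ReadLines("shots.csv")
            .Skip(1)                                   // skip the header
            .Select(line => line.Split(','))
            .Select(f => (dist: double.Parse(f[0], CultureInfo.InvariantCulture),
                          scored: f[1].Trim() == "1"))
            .GroupBy(s => (int)(s.dist / binSize))
            .OrderBy(g => g.Key);

        foreach (var bin in bins)
        {
            double accuracy = bin.Count(s => s.scored) / (double)bin.Count();
            Console.WriteLine(
                $"{bin.Key * binSize:0.0}-{(bin.Key + 1) * binSize:0.0}: " +
                $"{accuracy:P1} over {bin.Count()} shots");
        }
    }
}
```

The bin size is arbitrary; tweak it to match however the histogram is bucketed.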
This pattern held across all of my testing. I believe it is primarily caused by the horizontal velocity the balls are given, which carries them back out of the goal.
From these simulations, I believe robots will need to be on or near the tarmac in order to profit from the high goal.