Teams that played on the far side: How?

Watching matches and looking for places to improve, one thing that stands out is the capability of teams that play well on the other side of the field. Many times there was cargo that couldn’t be seen from the driver’s station, or that was tucked in the hangar.

For teams that played well on the opposite side of the field, how/what did you do for blind cargo? We’ve discussed a driver’s station camera to see cargo, more practice, etc. Curious to hear what other teams did to solve the problem this year!

2 Likes

A strategy I saw used a lot by teams at the Ontario DCMP was having the human player on the far side of the field indicate whether balls were on the far side of the hub. Some teams went as far as hand-signaling where the balls were and roughly how many there were. It seemed like a fairly effective use of an otherwise idle human player.

8 Likes

We had a driver camera with a toggle to swap the controls to robot-oriented mode so we could pick up balls on the far side. You can also use the human player and reflections off the opposing driver station to locate balls.
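For anyone curious what the robot-oriented toggle amounts to, here is a minimal sketch (my own illustration, not this team's code): field-relative joystick commands get rotated into the robot's frame by the gyro heading, which is the same math WPILib's `ChassisSpeeds.fromFieldRelativeSpeeds` performs. The class and method names here are made up for the example.

```java
public class DriveModeHelper {
    /**
     * Rotate a field-relative (vx, vy) drive command into the robot frame.
     * headingRad is the gyro heading in radians, counterclockwise-positive,
     * with 0 meaning the robot faces away from the driver station.
     */
    public static double[] fieldToRobot(double vx, double vy, double headingRad) {
        double cos = Math.cos(headingRad);
        double sin = Math.sin(headingRad);
        return new double[] {
            vx * cos + vy * sin,   // robot-frame forward
            -vx * sin + vy * cos   // robot-frame left
        };
    }

    public static void main(String[] args) {
        // Robot turned 90 degrees counterclockwise: a "drive away from me"
        // field command (vx = 1) becomes a robot-frame rightward command.
        double[] v = fieldToRobot(1.0, 0.0, Math.PI / 2);
        System.out.printf("%.2f %.2f%n", v[0], v[1]); // prints 0.00 -1.00
    }
}
```

The toggle is then just choosing whether to apply this rotation or pass the sticks straight through.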

7 Likes

We mounted a camera atop our intake.

1 Like

We practiced nothing but far-side driving between our last regional and Championships, banking on the fact that whoever we played with in elims would probably be much more effective on the near side, so we wanted to let them do that. We used a combination of a fish-eye camera, human player hand signaling, and getting good at swiping through “likely ball homes” blind.

We were definitely better at it than at our Regionals (especially when defense wasn’t tight), but it was still way harder than playing near.

23 Likes

Driver height is a big advantage. I’m about 6’8 so my depth perception is better and I can see balls closer to the hub on the other side.

17 Likes

We didn’t play it amazingly, since we didn’t use a vision feed or anything, but after our first event we were informed by another team that had scouted us that we hyperfocused on the near fenders too much and needed to play the far fenders more, especially when they were the closer option. So for the majority of the drive practice before our second event, we told our drive team that they weren’t allowed to go to the near fenders, and that they had to force themselves to get comfortable with using the far fenders, no matter how far the bot had to travel to get there. Once they were clearly comfortable, we opened it back up.

To summarize: require drive practice under every scenario. Under defense, far side, near side, playing defense yourself, heck anything game specific.

1 Like

Do you do anything on the technical side to minimize latency on the fish-eye camera stream back to the driver?

3 Likes

We played with a variety of resolution/compression schemes using both the Limelight and the roboRIO as our connection point. We found a combination that worked pretty well using the Limelight (we wanted to avoid the roboRIO due to concern about this issue). I don’t remember the specifics.

Latency was okay, but we mostly used the camera for ball localization rather than actual first-person driving, so it wasn’t super critical.

11 Likes

Our team did drive practice with our drive team far back from our field while a few people tried to distract them, to help practice driving with bad visibility under stress.

2 Likes

We’re not much different than other teams, but for the bits of time we were on the far side of the field:

Good camera: Wide angle, prioritize latency over quality and resolution. Display it on a screen that’s about eye-level with the driver to minimize eye motion when switching between on-field and on-robot.

Robot-centric driving: we had a button to switch between field-centric and robot-centric. While driving on camera, the driver had to switch to robot-centric to make it work.

The camera was secondary to just seeing the robot on the far side. A driver who knows the robot well knows how it moves even without seeing it, and can use that intuition to guide it through the blind spots. Just knowing where the balls are and getting to them is a lot more important than hitting marks with sub-inch accuracy.

Which leads toward the real enabler: tons and tons and tons and tons of driver practice.

1 Like

The 2019 sandstorm meta. The jumbotron is up there already, and it doesn’t take too much thought to get stream-centric driving working.

(we mostly tried to use our camera)

4 Likes

ELP USB camera running through a Raspberry Pi (THANKS 862 Lightning Robotics!) with PhotonVision, stream set to potato resolution (160x120) to maximize frame rate. It was only active while a button was being held (we considered a foot pedal for the driver). Holding the button also enabled “first-person” drivetrain controls (turned off field-centric) and lowered the drivetrain speed. Long cords on the controllers allowed the drivers to move around as much as possible.
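The hold-a-button behavior can be sketched like this (purely illustrative; the names and the 0.4 slow factor are my assumptions, not this team's code): while the button is held, the drivetrain drops to robot-centric control and the stick inputs are scaled down.

```java
public class CameraMode {
    // Assumed slow factor while driving on camera; tune to taste.
    private static final double SLOW_FACTOR = 0.4;

    /** Scale one joystick axis down while camera mode is active. */
    public static double shapeAxis(double input, boolean cameraButtonHeld) {
        return cameraButtonHeld ? input * SLOW_FACTOR : input;
    }

    /** Camera mode also implies robot-centric (ignore-the-gyro) driving. */
    public static boolean useRobotCentric(boolean cameraButtonHeld) {
        return cameraButtonHeld;
    }

    public static void main(String[] args) {
        System.out.println(shapeAxis(1.0, true));  // prints 0.4
        System.out.println(shapeAxis(1.0, false)); // prints 1.0
    }
}
```

In a real robot program the button state would come from the driver controller each loop iteration, and the shaped axes would feed the drivetrain as usual.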

In retrospect, I saw some teams just go up behind the hub and “blow it up” when their human player indicated more than two balls were hidden there. They’d just drive in at full speed and make the balls bounce away.

We also mitigated it with strategy. By defining areas of play on the field, the right driver station played the right side, left played the left, and the middle played the close field. That allowed the side driver stations who had a view of the backfield to clean it up.

Same.

We added the fisheye camera to a Pi with PhotonVision. Once we dropped the stream resolution, the camera latency was adequate. Likewise, you didn’t need a really clear picture to spot the balls you were looking for.
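For a rough sense of why dropping resolution helps so much (my own back-of-envelope numbers, not the poster's measurements): per-frame data for an MJPEG stream scales roughly with pixel count, so going from 640x480 down to the 160x120 mentioned earlier in the thread cuts bandwidth by about a factor of 16 at the same frame rate.

```java
public class StreamMath {
    /** Ratio of pixel counts between two stream resolutions. */
    public static double pixelRatio(int wHi, int hHi, int wLo, int hLo) {
        return (double) (wHi * hHi) / (wLo * hLo);
    }

    public static void main(String[] args) {
        // 640x480 vs the "potato" 160x120 resolution
        System.out.println(pixelRatio(640, 480, 160, 120)); // prints 16.0
    }
}
```

Compression quality and frame rate matter too, but pixel count is the lever with the biggest latency payoff.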

1 Like

Can you go into a little more detail with this? Were you able to detect cargo and then estimate their position on the field from the robot? If so, how was this done and did you use that information to automate moving towards the cargo to intake it?

I think it’s much simpler. I think he means the camera was used so the driver could roughly locate a ball and then, using their own eyes, drive the robot to that location. Their driver didn’t actually do first-person driving through the camera.

2 Likes

Yeah, this. More a mental footnote of “oh, there’s a ball hiding there”.

3 Likes

Though this does sound awesome, and one step closer to a fully autonomous robot in teleop: the driver just watches as the robot automatically locates the nearest ball, collects it, and shoots.

Your reputation seems to bring huge expectations with it.

Thanks for the informative replies. Not having a semblance of a real hub was definitely a struggle in practicing; hopefully future game elements won’t require so much storage space or materials.

We’ll continue practicing and look into the fisheye lens.

1 Like

Do you have a suggested camera and screen? I’ve heard from others that external DSLR screens work well, but wasn’t sure what to buy.

1 Like