Automatic catcher?

Now catching, I believe, is going to be a huge thing (especially if you are a defensive/support robot… like us!), so my team was discussing whether automatic catching would be possible. We thought maybe you would use your camera to track the ball, but that is as far as we got. We were wondering how it would be done and coded.

We use 8 limit switches on our 4-panel wings. When the ball hits a switch, the wings automatically clamp down on it with over 300 lbs of clamping force. If you look at the 604 reveal video, it’s very similar, but better due to the autonomous action.

We also used a limit switch on our bot to clamp down on the ball after a catch.

We just go off of operator reflex!

We just made the “wings” as non-rigid and energy-absorbing as possible so the ball doesn’t bounce out. Beyond that, it’s up to the drive teams to handle the transfer.

Tracking a ball is a feat on its own that not many teams get around to doing, let alone calculating its trajectory and autonomously moving to where it is going to land. It involves some serious geometry if your camera is not perpendicular to the ground. You also have to account for the delay: can you write an efficient enough program that will give an accurate solution in time? It is a very complex problem, but a lot of fun to solve.
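For what it’s worth, the landing-point math itself is just projectile motion once vision has produced a position and velocity estimate in field coordinates; getting that estimate from the camera is the hard part. A minimal sketch, ignoring drag, with all names illustrative:

```python
import math

GRAVITY = 9.81  # m/s^2

def predict_landing(x0, y0, vx, vy, catch_height):
    """Predict where (x) and when (t) a ball in free flight descends
    to catch_height, given an estimated position (x0, y0) in meters
    and velocity (vx, vy) in m/s from vision tracking."""
    # Solve y0 + vy*t - 0.5*g*t^2 = catch_height for the later root
    a, b, c = -0.5 * GRAVITY, vy, y0 - catch_height
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ball never reaches catch_height
    t = (-b - math.sqrt(disc)) / (2 * a)  # later (descending) root
    return x0 + vx * t, t
```

The delay the post mentions would show up as having to extrapolate (x0, y0, vx, vy) forward by the camera latency before calling this.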

I would simply go off driver reflex, and I’m the vision programmer on the team.

Obviously tracking the trajectory of the ball is insanely difficult in the FRC constraints, but I think you’ll see plenty of catching mechanisms automatically actuate when a ball gets near enough. We’re attempting to use an IR range finder for this.

It is. We have done it.

This sensor (we’re using two of them right now):

The sensor outputs an analog signal. To calibrate it we held the ball where we wanted our trigger point to be and recorded that value. The driver holds down a button on the control interface to open our catching mechanisms and when the sensor sees something that crosses the trigger point value the catcher snaps shut.
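The trigger logic being described reduces to a couple of comparisons. A sketch, assuming an IR sensor whose analog voltage rises as the ball approaches (the trigger value and names are illustrative, not the team’s actual calibration; their real code would run on the cRIO):

```python
# Volts recorded while holding a ball at the desired trip point
# (example value, found empirically during calibration)
CATCH_TRIGGER = 2.1

def should_close_catcher(button_held, sensor_volts, trigger=CATCH_TRIGGER):
    """Snap the catcher shut only while the driver is arming it
    (button held to keep the mechanism open) and the IR rangefinder
    reads closer than the calibrated trip point."""
    return button_held and sensor_volts >= trigger
```

The button acts as a safety: the sensor can never fire the catcher unless the driver has deliberately opened it first.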

We added all of the hardware and code in the last two days of build; it’s really simple and straightforward.

Until then we were getting good results with driver reflexes, but wanted to automate it to remove some burden from the driver.

This is what we are currently trying to look into since it eases the burden on the drivers. It does not however eliminate the need to position the robot correctly, and since we are relying on the drivers’ instincts, this will take a lot of practice.

Tracking a ball will be a tough feat, especially with no reflective tape on the balls themselves and considering the lighting that comes from the roof. IMO, it would be better to spend this amount of energy and time elsewhere.

Hopefully you used your MaxBotix voucher to buy one of them.

It’s probably still a waste of time, but I can think of an easier way to do this. It requires an omnidirectional drivetrain. Basically, you use PID. For left to right, it is simply a matter of using PID to center the ball on your camera’s x-axis. Forwards and backwards is a bit more difficult; I would probably use some function that combines the current size and height of the ball and feed that into PID. It seems like with some tuning you could get it to work.
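That two-axis idea might look something like the sketch below. The gains, image width, and the radius-as-range trick are all illustrative assumptions, not tuned values; a real robot would run this in its drivetrain loop:

```python
class PID:
    """Minimal PID controller for the sketch below."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

IMAGE_WIDTH = 320  # px; assumed camera resolution

strafe_pid = PID(kp=0.004, ki=0.0, kd=0.001)   # example gains only
range_pid = PID(kp=0.01, ki=0.0, kd=0.0)

def drive_commands(ball_x_px, ball_radius_px, target_radius_px, dt=0.02):
    """Strafe to center the ball on the camera's x-axis; drive
    forward/back until the ball's apparent radius (a crude range
    proxy) matches the 'close enough' size."""
    strafe = strafe_pid.update(ball_x_px - IMAGE_WIDTH / 2, dt)
    forward = range_pid.update(target_radius_px - ball_radius_px, dt)
    return strafe, forward
```

With an omni or mecanum drive, the two outputs map directly onto strafe and forward velocities without having to rotate the robot.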

Our programming team made something huge: they tracked the ball with the camera and drove to it using the gyro (you know you’re really close when the radius of the ball is pretty high, which can be read from the dashboard), and then we just catch the ball.

Is it able to tell the difference from the opponents’ ball? That’s really cool, though!

Video or ban! :smiley:

We are currently using one LV-EZ4, looking up at a shallow angle, from the base of our catapult. It “works”, but the threshold at which we want to trigger our grabber and hands is varying too much… say from an analog threshold of ~9 to ~30. The result is that we sometimes miss the auto-catch and the operator has to recover manually. That doesn’t sound like a big deal, but time is precious. For now, we’ve decided that the ultrasonic, mounted as we chose, is not as reliable as we want it to be. Unless we calibrate before each match, or come to understand the drift, we’ll likely swap it out for an optical sensor at our first regional in Orlando.

@JamesCH95: have you seen your threshold move around like that?

@JoeRoss: you bet we used that voucher :slight_smile:
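One generic way to tame a jittery analog reading like the one described above (not necessarily what either team did) is to run a small median filter in front of the threshold comparison, which rejects occasional spurious spikes without much lag:

```python
from collections import deque
from statistics import median

class MedianFilter:
    """Report the median of the last n samples from a noisy
    analog rangefinder; isolated spikes never reach the output."""
    def __init__(self, n=5):
        self.samples = deque(maxlen=n)

    def update(self, reading):
        self.samples.append(reading)
        return median(self.samples)
```

This helps with spikes, but not with slow drift of the baseline; drift still needs recalibration or a different sensor, as the post concludes.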

We have an automatic catch function using the same MaxBotix ultrasonic rangefinder (it may be a different model, but it’s certainly the same brand and packaging). One of our students got it running on an Arduino: the threshold “catch” distance is set with a potentiometer, and the cRIO is sent a boolean digital signal (basically, “Catch” or “Don’t Catch”).

LEDs hooked into the arduino let us see where the catch threshold is, so we can make tweaks by powering on, messing with the potentiometer and waving a ball around in the pit. No Arduino code changes, no cRIO code changes.

The issue we’ve found with the autocatch is the timing very much depends on the speed of the ball - a lobbed ball from the human player needs to trigger the catch late (short range) so it doesn’t snap shut before the ball is contained, but a shot with a lot of velocity (over the truss) needs to trigger earlier so the catch is in motion when the ball gets deeper into our robot. We’re messing around trying to find a sweet spot, but we might have to tune for one and operate the other manually. We could add a second digital out for a second threshold, as well - have a “catch” and a “human load” threshold distance.
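The two-threshold idea at the end of that post is small enough to sketch. The team’s actual code runs on an Arduino in C, but the logic is the same in any language; the distance values here are illustrative, not theirs:

```python
# Distances in the ultrasonic sensor's native units (example values)
CATCH_THRESHOLD = 18       # far trip point: fast truss shots trigger early
HUMAN_LOAD_THRESHOLD = 8   # near trip point: slow lobbed balls trigger late

def trigger_lines(distance):
    """Return the two boolean digital outputs the Arduino would
    raise for the cRIO: (catch, human_load). A fast ball trips
    'catch' while still far away; a slow lob only trips
    'human_load' once it is close."""
    return (distance <= CATCH_THRESHOLD,
            distance <= HUMAN_LOAD_THRESHOLD)
```

The cRIO side would then pick which line to obey based on which game mode (truss catch vs. human-player load) the operator has selected.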

Our robot release video has some demos of this - I think many (most?) of these were automatic catches.

In a word: no. We have them aimed near-vertical and they trigger very reliably as the ball drops into our catcher. It sounds like you have a geometry issue as much as anything else… from what you describe it seems like you’re trying to bounce sound off of the ball nearly tangent to its surface.

…hmmmm, good point. Depending on the ball’s entrance, the geometry could be working out to be close to tangential. That’s something to take a closer look at.