G39 - Autonomous

How many robots will get G39 fouls during Autonomous?

G39: ROBOTS are prohibited from launching BOULDERS unless they are in contact with the opponent’s TOWER or carpet in the opponent’s COURTYARD, and not in contact with any other carpet. Violation: TECH FOUL per BOULDER.

A robot gets stuck on the Defense, and then blindly shoots the boulder.

I’ll have to tell the programmers to check their position, and not to shoot if they’re not in the right place.

This is harder than you think. Unless your vision system can accurately tell distance (and therefore derive your position on the field), there isn’t much your programmers can do. Most defences like the Rough Terrain or Ramparts will render encoders basically useless until you’re on flat ground again.

That being said, I think a lot of teams will get stuck on defences, or read their position inaccurately, during autonomous. However, some teams may have a way to mitigate this :wink:

I do not foresee this being an issue with my team, but I do think this will be fairly common and something to keep track of when scouting.

Quote:
Originally Posted by rich2202
I’ll have to tell the programmers to check their position, and not to shoot if they’re not in the right place.
This is harder than you think. Unless your vision system can accurately tell distance (and therefore derive your position on the field), there isn’t much your programmers can do. Most defences like the Rough Terrain or Ramparts will render encoders basically useless until you’re on flat ground again.

That being said, I think a lot of teams will get stuck on defences, or read their position inaccurately, during autonomous. However, some teams may have a way to mitigate this

Everyone’s roboRIO has an accelerometer built into it.

If a team can’t use that to tell the difference between being flat on the floor and sitting on a defense before firing… I will be very sad.

I might even go around at events making sure every team that plans on shooting (at least the ones our team would play with in quals) can use that feature to avoid fouls.

Accelerometer =/= gyro. Although two dimensions’ worth of data can be derived from the accelerometer, for most teams this is not worth the effort. The roboRIO’s accelerometer has a lot of drift associated with it. This was also discussed here.

Even if a team decides to implement this, they still have to reorient to face the goal and shoot, for which they need to determine their distance and angle to the goal. If their vision is capable of this, it negates the need for a gyro in the first place.

Can the camera see the goal in the right place? If not, don’t shoot.

If you are lined up with the goal, then I’d risk taking the shot.

We have a NavX, so hopefully we can get some useful data from that.

You can be on a defence and still have acquired a valid vision tracking target (e.g. the middle defence). We’re trying to avoid that issue as much as possible.

You don’t need a gyro to figure out if you’re on flat ground. You don’t need to integrate at all. When you’re stopped, check the accelerometer. The relative sizes of the X, Y, and Z axes will tell you the RoboRIO’s orientation. Unless the mounting has broken loose, you can then figure out the robot’s orientation.
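For illustration, here’s a minimal sketch of that check in Java using WPILib’s BuiltInAccelerometer. It assumes the roboRIO is mounted flat, so gravity sits on the Z axis when the robot is level; the tolerance is a placeholder that would need tuning on a real robot.

```java
import edu.wpi.first.wpilibj.BuiltInAccelerometer;

// Sketch only: assumes the roboRIO is mounted flat on the robot.
public class LevelCheck {
    private final BuiltInAccelerometer accel = new BuiltInAccelerometer();
    private static final double TILT_TOLERANCE_G = 0.15; // roughly 8-9 degrees of tilt; tune on the robot

    // Returns true when the robot is sitting roughly flat: X and Y read near 0 g
    // and Z reads near 1 g. Tilting up onto a defense shifts weight onto X/Y.
    public boolean isLevel() {
        double x = accel.getX();
        double y = accel.getY();
        double z = accel.getZ();
        return Math.abs(x) < TILT_TOLERANCE_G
            && Math.abs(y) < TILT_TOLERANCE_G
            && Math.abs(z - 1.0) < TILT_TOLERANCE_G;
    }
}
```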

Shooting based on an integrated inertial navigation position is likely to be a waste of time. If you aren’t using vision code or otherwise aligning the robot to field elements, you’re unlikely to hit a high goal. At least practice this to find your shot percentage, so you can figure out whether the extra five points when you make it are worth the time to track the boulder down when you miss.

Don’t just acquire a target; let it tell you where you are. The farther away you are, the smaller it will appear. Don’t shoot if the target is too small.
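As a rough sketch of that idea (not any particular team’s code): with a pinhole camera model, distance is approximately the real target width times the focal length in pixels, divided by the apparent width in pixels. The target width, focal length, and cutoff below are placeholders to be calibrated, not measured values.

```java
// Sketch: gate the shot on apparent target size. All constants are placeholders.
public class TargetSizeGate {
    private static final double TARGET_WIDTH_FT = 1.67;      // real tape width; check the game manual
    private static final double FOCAL_LENGTH_PX = 550.0;     // calibrate for your camera and resolution
    private static final double MAX_SHOT_DISTANCE_FT = 12.0; // farther than this, don't shoot

    // Pinhole model: distance ~ realWidth * focalLength / apparentWidth.
    public static double estimateDistanceFt(double targetWidthPx) {
        return TARGET_WIDTH_FT * FOCAL_LENGTH_PX / targetWidthPx;
    }

    // A small (far-away) target means don't take the shot.
    public static boolean bigEnoughToShoot(double targetWidthPx) {
        return targetWidthPx > 0 && estimateDistanceFt(targetWidthPx) <= MAX_SHOT_DISTANCE_FT;
    }
}
```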

I continue to think we are vastly overestimating the number of teams that will successfully shoot high goals in autonomous. This is one of many, many issues teams will have in doing so.

So this won’t be a big deal, if only because fewer than half a dozen teams per event will be successful in auton anyway, and other teams will stop trying to shoot.

Drift only matters when you’re double integrating for position. Using the accelerometer to determine which way is down will be very accurate.

It’s not quite this simple to tell if you are stuck on a defense though. Teams can still be trapped on defenses when their robots are parallel to the floor. This will be very common on the moat, and could still happen with many other defenses.

In that case, one could monitor the current going to the drive motors… if the drive wheels are spinning freely, they’re not in contact with the carpet.
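A sketch of what that check might look like with the 2016-era WPILib PowerDistributionPanel class; the PDP channel numbers and the current threshold are made up and would have to match your actual wiring and be tuned while driving on carpet.

```java
import edu.wpi.first.wpilibj.PowerDistributionPanel;

// Sketch: while the drive is commanded to move, free-spinning wheels draw far less
// current than wheels loaded by carpet. Channels and threshold are illustrative only.
public class DriveLoadCheck {
    private final PowerDistributionPanel pdp = new PowerDistributionPanel();
    private static final int[] DRIVE_PDP_CHANNELS = {0, 1, 14, 15}; // match your wiring
    private static final double MIN_LOADED_CURRENT_AMPS = 5.0;      // tune empirically

    // Call only while the drivetrain is being commanded at a known output.
    public boolean wheelsLoadedByCarpet() {
        double total = 0.0;
        for (int channel : DRIVE_PDP_CHANNELS) {
            total += pdp.getCurrent(channel);
        }
        double average = total / DRIVE_PDP_CHANNELS.length;
        return average >= MIN_LOADED_CURRENT_AMPS;
    }
}
```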

We’re mounting color sensors to the underbelly of our robot. We’ll be watching for green, then not green, then green again.
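For what that might look like in code, here’s a tiny state machine that latches the green → not green → green sequence. The seesGreen value is a stand-in for whatever that team’s color sensor actually reports; it is not a real WPILib API.

```java
// Sketch: crossing detector driven by a hypothetical color-sensor reading.
public class CrossingDetector {
    private enum State { ON_CARPET, ON_DEFENSE, CROSSED }
    private State state = State.ON_CARPET;

    // Call periodically with the latest "is the sensor seeing green carpet?" reading.
    public void update(boolean seesGreen) {
        switch (state) {
            case ON_CARPET:
                if (!seesGreen) state = State.ON_DEFENSE; // left the carpet
                break;
            case ON_DEFENSE:
                if (seesGreen) state = State.CROSSED;     // back on carpet: crossed
                break;
            case CROSSED:
                break;                                    // latched until reset
        }
    }

    public boolean hasCrossed() {
        return state == State.CROSSED;
    }
}
```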

I don’t necessarily see the accelerometer as a viable option for getting reliable data in autonomous unless you plan on only crossing defenses with level surfaces (unlike the moat or rough terrain). Otherwise, you will find that the inaccuracy of the accelerometer will hinder your tracking of positions on the field.

Unless you just plan to use the accelerometer to sense when your robot has settled back down. You don’t need to know you’re flat, just that you’re not moving all over the place.
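One way to read that: skip orientation entirely and just wait for the total acceleration to sit near 1 g for a short window before shooting. Again a sketch, not a recommendation; the window length and tolerance are guesses.

```java
import edu.wpi.first.wpilibj.BuiltInAccelerometer;

// Sketch: declare the robot "settled" once the acceleration magnitude stays near 1 g.
public class SettleCheck {
    private final BuiltInAccelerometer accel = new BuiltInAccelerometer();
    private static final double TOLERANCE_G = 0.1; // how far from 1 g still counts as quiet
    private static final int LOOPS_REQUIRED = 25;  // about 0.5 s at a 50 Hz loop
    private int quietLoops = 0;

    // Call once per control loop; returns true after readings have been quiet long enough.
    public boolean isSettled() {
        double x = accel.getX();
        double y = accel.getY();
        double z = accel.getZ();
        double magnitude = Math.sqrt(x * x + y * y + z * z);
        if (Math.abs(magnitude - 1.0) < TOLERANCE_G) {
            quietLoops++;
        } else {
            quietLoops = 0;
        }
        return quietLoops >= LOOPS_REQUIRED;
    }
}
```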

Yup, they can.

… another check before firing.
… another check most teams won’t do :wink:

As the OP stated, there will be a number of G39s during autonomous.

Hmm…
Forgot about the built-in roboRIO accelerometer…
Nothing but NavX…
#Pudding

We have developed a great way to tell if we made it to the other side of the outer works, so we don’t rely on encoders. Hint: the outer works are a lot shinier than the carpet.