PhotonVision not working

I have a bit of code basically taken from PhotonVision's own example, and the robot does nothing when the button is pressed.

Here's the Git repo.

The biggest thing that jumps out to me is that you have this set up in the subsystem's periodic(), while the PhotonVision example runs it in teleopPeriodic().

I can’t be sure, but based on a setup we have using PhotonVision for turret tracking, it's all in a command's execute() method, with that command set as the subsystem's default command.

Our tracking code: https://github.com/Cybersonics/2022-Robot/blob/3/5/2022Comp/src/main/java/frc/robot/commands/TurretCommand.java

I’d concur. The examples for photon are written in a TimedRobot architecture, because that’s how WPILib does their documentation, and it’s the most concise way to show the idea of what logic is required.

Command-based programming is designed around the paradigm of splitting up the “what to do” from the “how to do it”.

When I quickly looked at the code, it wasn’t clear to me what the intent was: the drivetrain motors are being controlled from multiple places, and it wasn’t clear to me which control would win out.

If it would be helpful, some command-based examples could be added over the summer. Opening an issue describing where the current gaps are and what sorts of things need to be documented could help.

Ok, so I changed the code according to your example, but I keep getting a fatal error at AimAtBall line 49. I’ve got no clue why. The code's been pushed to origin at the link above. I have also tried some command-based code, but it just keeps ending up in this fatal-error rabbit hole where the person helping me gives up or something and I don’t get a response.

You never define your _controller in RobotContainer. This passes null to the command, so when it tries to get the axis, it breaks.
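To illustrate the failure mode, here is a hedged plain-Java sketch (no WPILib, names hypothetical, not from the posted repo): a field that is declared but never assigned stays null, and the first call through it throws exactly this kind of NullPointerException.

```java
// Hypothetical minimal reproduction: a declared but never-assigned reference
// field defaults to null, so the first method call through it throws a
// NullPointerException at runtime.
public class NullFieldDemo {
    static class Container {
        Runnable controller; // declared, never initialized -> stays null
    }

    public static void main(String[] args) {
        Container c = new Container();
        try {
            c.controller.run(); // throws NullPointerException
        } catch (NullPointerException e) {
            System.out.println("NPE: controller was never constructed");
        }
        // The fix is to construct the object before anything uses it:
        c.controller = () -> System.out.println("controller is live");
        c.controller.run(); // now works
    }
}
```

In the WPILib case the equivalent fix would be constructing the controller (e.g. `_controller = new XboxController(0);`) in RobotContainer before handing it to the command.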

Real quick, before I fix that: here's some timed-robot code I threw together based on the PhotonVision example, and it does nothing when the button is pressed.

 public class Robot extends TimedRobot {
     Spark leftRear = new Spark(0);
     Spark leftFront = new Spark(1);
     Spark rightRear = new Spark(5);
     Spark rightFront = new Spark(6);
     MotorControllerGroup Left = new MotorControllerGroup(leftRear, leftFront);
     MotorControllerGroup Right = new MotorControllerGroup(rightRear, rightFront);
     DifferentialDrive drive = new DifferentialDrive(Left, Right);
     XboxController xboxController = new XboxController(0);

     PhotonCamera camera = new PhotonCamera("Camera");
     final double CAMERA_HEIGHT_METERS = Units.inchesToMeters(24);
     final double TARGET_HEIGHT_METERS = Units.feetToMeters(5);
     final double CAMERA_PITCH_RADIANS = Units.degreesToRadians(0);
     final double GOAL_RANGE_METERS = Units.feetToMeters(3);

     final double LINEAR_P = 0.1;
     final double LINEAR_D = 0.0;
     PIDController forwardController = new PIDController(LINEAR_P, 0, LINEAR_D);

     final double ANGULAR_P = 0.1;
     final double ANGULAR_D = 0.0;
     PIDController turnController = new PIDController(ANGULAR_P, 0, ANGULAR_D);

     @Override
     public void teleopPeriodic() {
         double forwardSpeed;
         double rotationSpeed;

         forwardSpeed = xboxController.getRightX();

         if (xboxController.getRawButton(1)) {
             // Vision-alignment mode
             // Query the latest result from PhotonVision
             var result = camera.getLatestResult();

             if (result.hasTargets()) {
                 // Calculate angular turn power
                 // -1.0 required to ensure positive PID controller effort _increases_ yaw
                 rotationSpeed = -turnController.calculate(result.getBestTarget().getYaw(), 1);
             } else {
                 // If we have no targets, stay still.
                 rotationSpeed = 0;
             }
         } else {
             // Manual Driver Mode
             rotationSpeed = xboxController.getLeftY();
         }

         // Use our forward/turn speeds to control the drivetrain
         drive.arcadeDrive(forwardSpeed, rotationSpeed);
     }
 }

Keep in mind: Everyone here is doing this for free. They have no context to who you are, what your background is, what your robot looks like, what you’re actually struggling with - all they have is the words you type here. Be very gentle when judging others on their ability or desire to help you. Expect to have to work with them - multiple rounds of questions and answering.

First, post the full text of the error. It’s difficult to give concrete advice without knowing what error you’re actually facing.

If you haven’t yet, read the documentation on WPILib that talks through how to interpret runtime errors. It’ll help you have some context when someone gives an answer.

Ask specific, targeted questions. Describe with as much detail as you can muster what you attempted, what you expected to see, and what you actually saw.

Spend time to analyze which parts of an explanation you understand, and which ones you don’t. Don’t be shy about saying a certain phrase or sentence doesn’t make sense. Strangers on the internet need that sort of information to help fill in your knowledge gaps.

Finally - be relentless in your efforts to learn new information. Attitudes involving “I can’t possibly figure this out ever” have the tendency to make unpaid educators walk away from the situation.

Finally…

I concur, this is most likely the root cause of one of the issues you are currently facing.

The reason Jeremy can say this: he’s spent years learning and staring at these problems. It’s not magic. It’s hard work, grit, and determination. It’s getting burned by a problem again and again, enough to recognize the root cause from little more than the symptom.


Some follow-up questions to help myself establish context:

  1. If you look in NetworkTables (via Outline Viewer) or in the PhotonVision web UI, is it reporting a target?
  2. What are the names of your camera and pipeline in the PhotonVision web UI?
  3. When the button is not pressed, does the robot drive normally?
  4. How did you select the ANGULAR_P and LINEAR_P values of 0.1?
  1. Yes
  2. Blue Alliance and Yellow Alliance (Using 2021 Yellow ball atm)
  3. It drives Perfectly
  4. I don’t know if I fully understand that one, but it was just an untuned PID value, something to turn it “on”.

Gotcha. Some thoughts:

First -

The code you posted lists "Camera" as the name of the camera in the PhotonCamera constructor. You may need to update this to align with the setup in the PhotonVision web UI.

This is a gap in understanding.

WPILib has some documentation on PID controllers. The gains (P, I, and D) describe the ratio of how much “control effort” (IE, turn command) you get in response to a certain amount of error. The ideal value depends on your robot’s mechanisms, the units involved in the error and the control effort, and how you define “ideal” for your case.

A common “Hand-tune” method is to start by setting all gains to zero. Then, double the P gain until you see too much motion. Bump the P gain back down slightly, or increase the D gain until the oscillation or overshoot is well-controlled.
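As a hedged sketch of what the P gain alone does (plain Java, not WPILib's actual PIDController), the relationship the hand-tune method manipulates is just a multiplication:

```java
// Sketch of a proportional-only controller: effort = kP * (setpoint - measurement).
// Doubling kP doubles the control effort produced by the same error, which is
// the knob the hand-tune procedure turns.
public class POnlySketch {
    public static double effort(double kP, double setpoint, double measurement) {
        return kP * (setpoint - measurement);
    }
}
```

With kP = 0.1 and a 10-degree yaw error, this already commands a full-scale 1.0 turn effort, so 0.1 is not a neutral "just turn it on" value; whether it is sane depends entirely on your drivetrain and the units of the error.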

Try addressing both of those as best you can, and let’s talk through the results.

I plan on tuning PID eventually, but since the bot I'm using isn't our actual bot and is basically just a chassis, I don't plan on calibrating anything; I want a proof of concept over spring break. So should I change the camera name to Yellow or Blue Alliance accordingly? Or should I just set the pipeline?

Also, I went out and tested the code after setting the controller, and I got some results. The robot, despite nothing being pressed, will stutter in one direction. I can still drive with some hiccups from it trying to drive, and when I press a trigger it stops the robot completely.

Sounds like there is more than one issue at play here.

Vision alignment builds on top of a functional drivetrain. You’ll want to get this fixed first. Given the symptom described, some things I would check:

  1. Verify that your motor controllers are showing the expected light colors when driving forward and backward.
  2. Verify the battery is charged and the robot is not browning out.
  3. Verify the joystick axes being used in code correspond to the ones you’re using on your controller.
  4. Look for possible electrical wiring or mechanical binding issues.

Once the robot is driving smoothly, and not before…

The requirement is the name provided in the UI and the name set in code must match. This is required because you might have multiple cameras, or multiple coprocessors, all running photonvision. You must make your code align to the camera you want to use vision results from.
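A small sketch of that matching rule (the UI name below is an invented example, not from your setup): PhotonVision publishes each camera's results under its exact name string, so the constructor argument in code has to match the UI character for character.

```java
// The name given to new PhotonCamera(name) is used to look up that camera's
// published results, so it must equal the name shown in the PhotonVision UI
// exactly (case-sensitive). A plain-Java check of the matching rule:
public class CameraNameCheck {
    public static boolean namesMatch(String nameInCode, String nameInUi) {
        return nameInCode.equals(nameInUi);
    }
}
```

With the posted code's `"Camera"` against a hypothetical UI name like `"Microsoft_LifeCam_HD-3000"`, `namesMatch` is false, and no targets would ever reach the robot code.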

This is not necessarily a good assumption. “Untuned” and “functional” are usually mutually exclusive when it comes to PID gains. An incorrect P gain can cause any symptom from no motion at all to rapid and uncontrolled motion.

While it’s not the first thing to address, I would include a few iterations of PID tuning in your plan before your proof of concept is functional.

The driving has always been smooth; it's just when trying to vision track. Also, we only have one camera and one coprocessor, which is all we need and all we can really afford. So I should rename it to the LifeCam or whatever?

These statements seem to be in conflict:

Can you elaborate? I am not sure how to advise otherwise.


The requirement is:

I believe the answer to your question is “yes”. However, I leave it as an exercise to the user to fully determine how to apply the requirement to their current situation.

So it appears the function acts in reverse. I believe it was supposed to start vision tracking when a trigger is held. However, it seems to activate when not held and deactivate when held. So when nothing is held it will stutter trying to vision track, and it will drive normally when a trigger is held.

Ok. Let’s continue digging from the symptoms you report, down to root cause.

To confirm: I assume the roboRIO is running exactly the TimedRobot code in 6, nothing more or less?

Assuming so:

The code you provided references xboxController.getRawButton(1) as the input used to change between vision alignment, and full-driver-control mode.

What sort of controller are you using? Which inputs on it are the “triggers” you refer to? And, does that align with xboxController.getRawButton(1)?
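For context, a hedged sketch of why this matters: under WPILib, raw button 1 on a standard Xbox controller is the A button, while the triggers are analog axes rather than digital buttons, so gating on a trigger means thresholding an axis value.

```java
// Triggers on an Xbox controller are analog axes (0.0 released, 1.0 fully
// pulled), not digital buttons, so "is the trigger held?" needs a threshold.
public class TriggerGate {
    public static boolean triggerHeld(double triggerAxisValue) {
        return triggerAxisValue > 0.5; // treat past-half travel as held
    }
}
```

In robot code that axis value would come from something like `xboxController.getRightTriggerAxis()` (from the WPILib XboxController API), whereas `getRawButton(1)` reads the A button; if your drivers are pulling a trigger, that condition never fires.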

What I am referring to is the main code in the Git repo, named GyroBot. The timed-based code just did nothing upon pressing A.

Ok. Let’s throw the brakes on hard here.

Here’s the problem: We spent a half hour discussing possible issues with your drivebase and code, and the whole time I didn’t know you had switched which codebase you were talking about. It’s going to be difficult to make much progress.

The real core issue currently isn’t anything code related: It’s that I don’t know how to ask my questions in a way that gets enough detail out of your situation. Lacking that detail, I can’t provide meaningful advice.

My recommendation: Poke at a student or mentor on your team, or on a nearby team, who has experience implementing vision processing algorithms. 894 and 2337 are both close-by, experienced teams who could likely help. See if one of them can come in and do some walk-through work with you on setting up a functional, basic ball-tracking algorithm. The issue is not just technical: you need someone who can provide real-time, constructive feedback about the actual situation you’re in. I’m unfortunately not able to do that remotely.

Yea, that's been the main issue for our team. I am the only coder, and the only help I really get is from here. I do have a graduate “mentor,” per se, but they don't know vision. I also only brought up the timed-based code once, to try to elaborate on any possible issues, since the example itself didn't seem to work.

Understood. And to be clear - don’t feel bad about it. The vast majority of students I interact with aren’t able to fully express ideas to the extent needed to get concrete, remote help. I’m an engineer by trade - an education professional would be much better than I am at asking the questions the right way. The reason FIRST is an awesome program (and why IMO it really struggled through the past few years) is the power of that in-person mentorship (whether it’s coming from a mentor or a student).

Seriously though - 894 and 2337 both are accomplished teams, and I’m willing to bet at least one of them has someone who could lend a hand to a local team if you asked. We do that kind of thing all the time around my area. Even just a few hours of one of their programmers swinging by your build space could make a world of difference. 2337 especially has done some assistance programs in years past. I’d 1000% recommend reaching out to them.