Use PhotonVision? Give us your feedback here!

If you/your team used PhotonVision in the 2021-22 season, we would love to hear your feedback below! This includes, but is not limited to:

  1. Bug reports
  2. Documentation issues
  3. Feature requests
  4. Overall reviews

Our goal remains to have an accessible and robust solution for teams to use - your feedback is critical in making sure that is achieved!

When providing feedback, we ask that you tell us:

  1. A detailed description of your experience / issue
  2. Software version
  3. Co-processor and camera type
  4. Logs (as needed)
  5. Whether or not you have time for further debugging / testing (as needed)

Either post here, or on our GitHub issues page

As always, contributors are welcome!

Thanks,
PhotonVision Team

12 Likes

This will be more of an overall review than a proper bug report, as I'd prefer to keep the details of those on GitHub, but suffice to say our experience was excellent.

At Kickoff this year, though I knew the general workflow used by Limelight (our previous vision solution), I had never tuned or programmed it for a competition bot before. I was browsing this forum and saw Photon advertised as being faster than Limelight on the same hardware. Despite some minor issues that are elaborated on below, our experience with Photon has been much smoother than my experience with the alternatives. The colored backgrounds on the hue slider, the threshold view, and the ability to directly poll target data instead of having to read from NetworkTables by hand have made debugging and overall usage much easier, especially for our newer programmers.
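For anyone curious what that direct polling looks like, here's a minimal PhotonLib sketch; the camera name "gloworm" and what you do with the yaw are just placeholders for illustration:

```java
import edu.wpi.first.wpilibj.TimedRobot;
import org.photonvision.PhotonCamera;
import org.photonvision.targeting.PhotonPipelineResult;

public class Robot extends TimedRobot {
  // Placeholder name; it must match the camera name configured in the PhotonVision UI.
  private final PhotonCamera camera = new PhotonCamera("gloworm");

  @Override
  public void teleopPeriodic() {
    // Poll the latest pipeline result directly instead of reading NT entries by hand.
    PhotonPipelineResult result = camera.getLatestResult();
    if (result.hasTargets()) {
      double yaw = result.getBestTarget().getYaw();
      // Feed yaw into turret/flywheel control here.
    }
  }
}
```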

The issues we did encounter were: a failure to post the NetworkTables version string (fixed in 2022.1.5 and worked around before that by inserting the string ourselves); a sporadically very blurry threshold view until Driver Mode or the color stream was enabled or toggled (though this did not seem to affect tracking, as our Photon-driven flywheel and turret tracked and adjusted perfectly well during testing and at every event of the season); and a string of failures to connect to the RoboRIO until the Rio's IP address was set to static (which, after re-reading the PhotonVision docs as well as the WPILib Networking docs, was something we should have done from the start, but I digress).
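For anyone hitting the same mismatch on pre-2022.1.5 builds, the workaround amounted to publishing the version entry ourselves, roughly like this; the table/entry names and version string below are assumptions, so check the exact key your PhotonLib release expects:

```java
import edu.wpi.first.networktables.NetworkTableInstance;

public class PhotonVersionWorkaround {
  /**
   * Workaround sketch: publish the version string ourselves so PhotonLib's
   * version check passes. Table/entry names are assumed; verify them against
   * the key your PhotonLib release actually reads.
   */
  public static void publishVersionString() {
    NetworkTableInstance.getDefault()
        .getTable("photonvision")
        .getEntry("version")
        .setString("v2022.1.5");
  }
}
```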

Photon is not perfect, yes, but with a little spit and polish (and some of the teased upcoming features) I feel no shame in recommending it to any team of any skill level.

6 Likes

@connor.worley if you have the time, would you mind sharing the issues you had? I believe you mentioned some in other threads.

I really like the open-source and cross-platform aspects of PhotonVision. It's got a nice interface and pipeline calibration is pretty easy… until you plug in an unsupported USB camera.
Then the whole thing starts crashing. The camera won’t display, the software freezes when switching or creating pipelines, etc.
We experienced this with an ELP camera plugged into PhotonVision running on a Limelight. We had no issues using a Lifecam instead of the ELP, though. However, we found ball tracking not to be worthwhile for other reasons, and the ELP's FOV was better in teleop, so we switched back to the Limelight firmware about halfway through the build season.

We also had issues with the problem described in this issue.

I liked using PhotonVision a lot. We abandoned it mostly due to not needing it, and partially due to a few small bugs. It was working about as well as the Limelight firmware at tracking the vision tape, and if we hadn't had a Limelight already, we would have used it. Thank you to all the contributors who are improving this project; this is already a strong alternative to the Limelight and it's only getting better.

4 Likes

I worked remotely with a team this season and we saw what we believed were incorrect yaw values on our LL2 running Photon at every resolution up to and including 720p. We also saw an issue where Photon would appear to occasionally stop updating NT until we power cycled the LL2. The team made the call to switch back to the stock LL image, so I didn’t get to perform a more thorough investigation. I know it’s not a great bug report, but I may be able to get them to take a second look at Photon if they end up going to any events using AprilTags and we’ll see if the issues reoccur.

We posted a question about this on the PhotonVision Discord during the build season, but nobody answered us… is there a way to turn off the crosshairs in driver mode, or even better, to adjust the position of the crosshairs? Using an LL2, if it matters.

It’s already on our roadmap :slight_smile:

We are tracking the FOV issue at 720p but did not know it was seen at other resolutions.

The NT update issues were seen across dozens of devices, but we were unable to trace them to a PhotonVision-specific cause. Setting both the coprocessor and the RoboRIO to static IPs seemed to resolve it in every case where we were in contact with the team.

Thanks for the reports.

Hello! I just wanted to say thank you for creating such an awesome product! I did run into an issue during the season that I wanted to note:

When running on a Gloworm development board, for some reason the PhotonVision version did not match the PhotonLib version, even though both were on 2022.1.5 (I think that was the version). That itself wasn't the issue, but when a mismatch is detected, an error is logged every loop iteration, which caused the robot to lag horribly and behave unpredictably. However, someone on the PhotonVision Discord was able to guide me through using a custom version of PhotonLib with that check commented out, which I very much appreciated.

I think a more robust solution would be to log the error only once every 5 seconds or so, to make sure the robot doesn't lag too much.
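Something along these lines is what I have in mind; just a rough sketch of a rate-limited reporter (the 5-second window and the class/method names are only illustrative):

```java
import edu.wpi.first.wpilibj.DriverStation;
import edu.wpi.first.wpilibj.Timer;

/** Sketch: report an error at most once per time window instead of every loop. */
public class ThrottledReporter {
  private final double windowSeconds;
  private double lastReportTime = Double.NEGATIVE_INFINITY;

  public ThrottledReporter(double windowSeconds) {
    this.windowSeconds = windowSeconds;
  }

  /** Forwards the message to the Driver Station only if the window has elapsed. */
  public void reportError(String message) {
    double now = Timer.getFPGATimestamp();
    if (now - lastReportTime >= windowSeconds) {
      DriverStation.reportError(message, false);
      lastReportTime = now;
    }
  }
}
```

Calling reportError every loop iteration would then surface the mismatch once every few seconds instead of flooding the Driver Station.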

Thank you again!
Drew

1 Like

The version check as it exists right now isn't ideal. It should for sure be rate limited to avoid lag; we just didn't have the bandwidth to do that this year. You're welcome to take a stab at it, or I believe we have an issue open for someone to get around to if not.

1 Like

I thought PhotonVision was a great addition to our arsenal. I wanted to re-flash the Limelight image to have a true A/B test of the two implementations but never got around to it.

Our high goal tracking was very accurate in each of the venues we competed in. We had a Lifecam plugged in to try to track cargo, and it conditionally worked, but our student's filtering was too easily confused by bumpers and lines on the ground to make it bulletproof.

The only complaints or bugs we experienced with PhotonVision were:

  1. After redeploying code during development, we'd often have to restart the PhotonVision service from the UI because of issues we never tracked down. We never really had that problem from a fresh start of the robot, just in the re-deploy scenarios.
  2. We did have to set our version string in NetworkTables manually to eliminate the mismatched-version exception that gets continually dumped to the Driver Station, even though our versions did match.
1 Like
