So my team decided to use a Pixy2 as a camera, and we have no idea how to read its values into our code in VSCode or how to visualize the image without the Pixy2 being directly connected to a computer. We are struggling to find useful information: it's either pretty old or we just don't understand it.
First, you'll have to decide how to attach it to your robot: either through a coprocessor like an Arduino, with some serial connection from it to the RoboRIO, or with a cable from the Pixy2 directly to the RoboRIO, communicating over SPI or I2C. For the direct route, I'd recommend this library: https://github.com/PseudoResonance/Pixy2JavaAPI (the Pixy2 API ported to Java for FIRST Robotics RoboRIO).
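If you end up rolling your own link (say, on the Arduino side) instead of using the library, the Pixy2's request framing is fairly simple. Here's a hedged sketch of building a "getBlocks" request frame; the byte values are from the Pixy2 serial protocol reference as I remember them, so double-check them against the current docs before relying on this:

```java
// Sketch: hand-building a Pixy2 "getBlocks" request frame.
// Byte values are assumptions based on the Pixy2 serial protocol docs:
// 0xc1ae = no-checksum request sync word, 0x20 = getBlocks request type.
public class GetBlocksRequest {
    // 16-bit sync word 0xc1ae, sent least-significant byte first
    static final int SYNC_LSB = 0xae;
    static final int SYNC_MSB = 0xc1;
    static final int TYPE_GET_BLOCKS = 0x20;

    /**
     * sigmap:    bitmap of signatures to report (bit 0 = signature 1, etc.)
     * maxBlocks: maximum number of blocks the Pixy2 should return
     */
    public static byte[] build(int sigmap, int maxBlocks) {
        return new byte[] {
            (byte) SYNC_LSB,
            (byte) SYNC_MSB,
            (byte) TYPE_GET_BLOCKS,
            (byte) 2,          // payload length: sigmap + maxBlocks
            (byte) sigmap,
            (byte) maxBlocks
        };
    }

    public static void main(String[] args) {
        for (byte b : build(0x01, 5)) { // signature 1 only, up to 5 blocks
            System.out.printf("%02x ", b);
        }
        System.out.println();
    }
}
```

The library does all of this framing (plus checksums and response parsing) for you, which is why I'd use it over SPI rather than reinventing the protocol.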
For what you're likely doing, you'll want to use the Pixy2's Color Connected Components (CCC) mode. You can train objects for the Pixy2 to look for by plugging it into a computer and using PixyMon, or by pressing and holding the button on the camera. Then, on the Arduino/coprocessor or the RoboRIO, you can have the Pixy2 list the objects it has found; it calls these "blocks". From those you can pick the one you want to track, likely the one with the largest area. The other information of interest is each block's x and y coordinates in the camera's view.
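Picking the largest block is just a scan over the list the camera returns. A minimal sketch; note the Block class here is a simplified stand-in for the library's Pixy2CCC.Block (with the real library you'd fill the list from the block cache after requesting blocks):

```java
// Sketch: choosing the largest detected block as the target.
// Block is a simplified stand-in for Pixy2JavaAPI's Pixy2CCC.Block.
import java.util.List;

public class LargestBlock {
    record Block(int x, int y, int width, int height) {
        int area() { return width * height; }
    }

    /** Returns the block with the largest area, or null if none were seen. */
    public static Block largest(List<Block> blocks) {
        Block best = null;
        for (Block b : blocks) {
            if (best == null || b.area() > best.area()) {
                best = b;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        List<Block> seen = List.of(
            new Block(50, 100, 20, 15),  // area 300: far-away distractor
            new Block(160, 90, 40, 35),  // area 1400: probably the real target
            new Block(300, 20, 10, 10)); // area 100: noise
        Block target = largest(seen);
        System.out.println("target x=" + target.x() + " area=" + target.area());
    }
}
```

Filtering by largest area works because the object you care about is usually the closest one, and closer objects cover more pixels.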
If you're looking for something like a ball, you can use the x-coordinate to rotate towards the ball. For example: https://github.com/FRC-4277/2020InfiniteRecharge/blob/master/src/main/java/frc/robot/subsystems/VisionSystem.java and https://github.com/FRC-4277/2020InfiniteRecharge/blob/master/src/main/java/frc/robot/commands/IntakeLineUpCommand.java
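The core of that idea is a simple proportional turn: how far the block's x is from the center of the frame becomes your steering command. A hedged sketch; FRAME_WIDTH and K_P here are assumptions (the Pixy2's CCC frame is roughly 316 px wide, and the gain needs tuning on your robot):

```java
// Sketch: proportional turn toward a target from its x coordinate.
// FRAME_WIDTH (~316 px for Pixy2 CCC) and K_P are assumed values: tune them.
public class TurnToTarget {
    static final double FRAME_WIDTH = 316.0; // Pixy2 CCC frame width in pixels
    static final double K_P = 0.6;           // proportional gain (tune this)

    /**
     * Maps the block's x coordinate to a turn command in [-1, 1]:
     * negative = turn left, positive = turn right, ~0 = centered.
     */
    public static double turnCommand(double blockX) {
        double center = FRAME_WIDTH / 2.0;
        double error = (blockX - center) / center; // normalized to [-1, 1]
        return Math.max(-1.0, Math.min(1.0, K_P * error));
    }

    public static void main(String[] args) {
        System.out.println(turnCommand(158.0)); // centered -> ~0
        System.out.println(turnCommand(316.0)); // far right -> turn right
        System.out.println(turnCommand(0.0));   // far left -> turn left
    }
}
```

On the robot you'd feed this value into your drivetrain's rotation input (e.g. the turn argument of an arcade drive) each loop until the error is small.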