Hello everyone. First of all, I was wondering who has used a Pixy CMUcam5 with an Arduino, who has used it with a Raspberry Pi, and who has connected it directly to the roboRIO.
For those of you who have, were you successful? Also, what did you think of it? What were the pros and cons of the way you hooked up the Pixy CMUcam5?
Our team uses the Pixy camera for vision targeting. It is connected directly to the roboRIO over the I2C bus, and it works very well: our autonomous has an almost 100% success rate. However, it's worth noting that the Pixy's performance depends heavily on how you tune it. It did not perform well in our first two competition events because we had not configured the camera correctly. Once the configuration was fixed, we were 100% accurate.
Since the vision processing is done inside the Pixy camera, you don't get a video stream, only an array of detected rectangles ("blocks"). There is no reason to add an intermediate component such as an Arduino or a Raspberry Pi: it doesn't improve performance, and it adds complexity because you have to program an extra component.
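For reference, each detected block in the legacy Pixy protocol carries roughly the fields below. This is just an illustrative sketch of the data, not the exact class or field names from any particular library:

```java
// Sketch of the data one detected object ("block") reported by the Pixy carries.
// Field names are illustrative.
public class PixyBlock {
    public int signature;   // which trained color signature matched (1-7)
    public int centerX;     // x of the block center in pixels (roughly 0-319)
    public int centerY;     // y of the block center in pixels (roughly 0-199)
    public int width;       // block width in pixels
    public int height;      // block height in pixels

    @Override
    public String toString() {
        return String.format("sig=%d x=%d y=%d w=%d h=%d",
                signature, centerX, centerY, width, height);
    }
}
```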
BTW, just like GRIP vision processing, the Pixy may give you false-positive rectangles, so you do need a good algorithm to filter those out. Once that's done, it is very accurate. Also, a lot of teams process vision once, obtain the angle to the target, and then use the gyro to navigate there. That's like taking a snapshot of the target, determining the target angle, then closing your eyes and hoping you get there. We did it differently: we do real-time navigation using the Pixy camera as a feedback device (i.e., while we are moving towards the target, the Pixy camera is constantly adjusting the heading). A sketch of the idea is below. This is nearly impossible with GRIP-style vision processing because of the lag involved, but it is possible with the Pixy camera.
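Here is a minimal sketch of that feedback idea (not our actual code): use the horizontal offset of the detected block from the image center as the error term of a simple proportional turn correction while driving. The class name, the gain, and how you feed the result into your drive base are all placeholders you would adapt to your own code:

```java
// Minimal sketch of closed-loop heading correction using the Pixy block
// as a feedback sensor. PixyBlock and the gain value are illustrative.
public class PixyHeadingFollower {
    private static final double IMAGE_CENTER_X = 160.0; // legacy Pixy block frame is ~320 px wide
    private static final double KP_TURN = 0.005;        // proportional gain; tune on the robot

    /**
     * Call periodically (every robot loop) while driving toward the target.
     * Returns a turn power in [-1, 1] to add to the drive command.
     */
    public double getTurnCorrection(PixyBlock target) {
        if (target == null) {
            return 0.0; // no target in view; keep the current heading
        }
        double errorPixels = target.centerX - IMAGE_CENTER_X;
        double turnPower = KP_TURN * errorPixels;
        return Math.max(-1.0, Math.min(1.0, turnPower));
    }
}
```

Because the Pixy updates its block list every frame, this correction keeps running all the way to the target instead of committing to a single angle measured at the start.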
We put our Pixy camera support into our library, so it is split into several modules. The main one is a platform-independent class that parses the block info regardless of which bus the Pixy camera is connected to: https://github.com/trc492/Frc2017FirstSteamWorks/blob/master/src/trclib/TrcPixyCam.java
This class is extended by a platform-dependent class (https://github.com/trc492/Frc2017FirstSteamWorks/blob/master/src/frclib/FrcPixyCam.java) that supports either the serial port or I2C.
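The general parsing idea looks something like the sketch below (see TrcPixyCam for the real implementation): scan the byte stream for the 0xaa55 sync word, read the 16-bit little-endian fields of each block, and verify the checksum. This reuses the PixyBlock sketch above and simplifies some details of the protocol:

```java
// Rough sketch of parsing Pixy block frames out of a raw byte stream.
import java.util.ArrayList;
import java.util.List;

public class PixyFrameParser {
    private static final int SYNC_WORD = 0xaa55;
    private static final int BLOCK_SIZE = 14;  // sync + checksum + 5 data words, 2 bytes each

    public List<PixyBlock> parse(byte[] data) {
        List<PixyBlock> blocks = new ArrayList<>();
        int i = 0;
        while (i + BLOCK_SIZE <= data.length) {
            if (readWord(data, i) != SYNC_WORD) {
                i++;                       // not aligned yet; slide forward one byte
                continue;
            }
            int checksum = readWord(data, i + 2);
            if (checksum == SYNC_WORD) {
                i += 2;                    // two sync words in a row mark a new frame
                continue;
            }
            PixyBlock block = new PixyBlock();
            block.signature = readWord(data, i + 4);
            block.centerX   = readWord(data, i + 6);
            block.centerY   = readWord(data, i + 8);
            block.width     = readWord(data, i + 10);
            block.height    = readWord(data, i + 12);
            int sum = block.signature + block.centerX + block.centerY
                    + block.width + block.height;
            if (sum == checksum) {
                blocks.add(block);         // checksum passed; keep the block
            }
            i += BLOCK_SIZE;
        }
        return blocks;
    }

    private int readWord(byte[] data, int offset) {
        // Pixy words are little-endian.
        return (data[offset] & 0xff) | ((data[offset + 1] & 0xff) << 8);
    }
}
```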
Initially we used the serial port, but we soon found that it threw framing-error exceptions, which suggests the serial baud rate was getting out of sync. We lowered the baud rate to 9600 and it improved, but we still occasionally got framing errors. Eventually we gave up on the serial port because of its unreliability and switched to I2C, and we have been fine since.
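If you want to try the I2C route, a bare-bones read looks something like this with WPILib's I2C class. The 0x54 address is the Pixy's default I2C address as I understand it; double-check it in Pixymon if you have changed the camera configuration:

```java
// Sketch of reading raw Pixy bytes over I2C on the roboRIO with WPILib.
import edu.wpi.first.wpilibj.I2C;

public class PixyI2CReader {
    private static final int PIXY_I2C_ADDRESS = 0x54;  // Pixy's default I2C address
    private final I2C pixy = new I2C(I2C.Port.kOnboard, PIXY_I2C_ADDRESS);

    /**
     * Reads up to len raw bytes from the Pixy; hand the result to a frame parser.
     * I2C.readOnly() returns true if the transfer was aborted.
     */
    public byte[] readRaw(int len) {
        byte[] buffer = new byte[len];
        boolean aborted = pixy.readOnly(buffer, len);
        return aborted ? new byte[0] : buffer;
    }
}
```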