Pixy I2C straight to roboRIO issues

Hello, I am using the Pixy this year to deliver gears in autonomous. I am plugging the Pixy directly into the roboRIO through I2C, and I am receiving data for the one signature we are using. The problem is that it is sending the information for both strips of tape mixed together in the same byte stream. I need to know how to change it so each strip of tape that shares the same signature sends its data as a separate block, or some other way to tell them apart…

The code we are using is from the BHSRobotix github page.

You might want to plug the Pixy into a laptop PC and use the monitor app to view the image. If your lens isn’t screwed in to the right focus, a blurred image can result in one particle where your eyes see two.

If that isn’t the issue, compare the I2C implementation to what you see using the monitor. I wrote an I2C handler for the PIXY recently, and it is actually 14 bytes per particle. The porting guide on their FAQ gives the details.
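For reference, those 14 bytes per particle are seven little-endian 16-bit words: a sync word, a checksum, and then signature, x, y, width, and height. A rough Java sketch of decoding one block (the class and field names are my own, not from any team's code):

```java
// Decodes one 14-byte Pixy object block: sync, checksum, signature,
// x, y, width, height -- each a 16-bit little-endian word.
class PixyBlock {
    final int checksum, signature, x, y, width, height;

    // Read one 16-bit little-endian word starting at offset i.
    static int word(byte[] b, int i) {
        return (b[i] & 0xFF) | ((b[i + 1] & 0xFF) << 8);
    }

    PixyBlock(byte[] b, int off) {
        // b[off..off+1] is the 0xAA55 sync word for this block
        checksum  = word(b, off + 2);
        signature = word(b, off + 4);
        x         = word(b, off + 6);
        y         = word(b, off + 8);
        width     = word(b, off + 10);
        height    = word(b, off + 12);
    }

    // The checksum is the 16-bit sum of the five payload words.
    boolean checksumOk() {
        return checksum == ((signature + x + y + width + height) & 0xFFFF);
    }
}
```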

Greg McKaskle

When I use pixy mon, I can see both boxes perfectly. I need to know how to get the data for each box separate. Having information on both blocks, I can find the middle of them were the peg is and make that work.

Study the Porting Guide.


Information comes from the Pixy as bytes. You need to write code to separate, sort, and parse the byte data. The bottom of the link above explains in detail how to get the data for different signatures.
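As a sketch of the separate-and-sort step, assuming the bytes have already been parsed into blocks with signature and position fields (the types here are illustrative, not from any team's repo), you can group one frame's blocks by signature so that two tape strips with the same signature end up together:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Groups the blocks of one frame by signature, so multiple blocks
// with the same signature (e.g. two strips of tape) stay separate
// objects instead of overwriting each other.
class SignatureSorter {
    record Block(int signature, int x, int y, int width, int height) {}

    static Map<Integer, List<Block>> bySignature(List<Block> frame) {
        Map<Integer, List<Block>> out = new HashMap<>();
        for (Block b : frame) {
            out.computeIfAbsent(b.signature(), k -> new ArrayList<>()).add(b);
        }
        return out;
    }
}
```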

Thanks, but that is written in C++ and I use Java. Do you think you can help me with converting it?

I read that you can parse the Sync word, how exactly would you do that?

We use LabVIEW, sorry.

I found the part that I need: parsing the different syncs it receives from the Pixy. It says I need to parse the two syncs to get the information on both blocks. What exactly does it mean by “parse”?
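For concreteness, “parsing the sync word” here means scanning the raw byte stream for the 16-bit value 0xAA55 that marks the start of each block; two sync words in a row mean a new frame has started. A rough Java sketch, with method names of my own choosing:

```java
// Locates Pixy sync words (0xAA55) in a raw byte stream. Words are
// little-endian, so the sync appears in the stream as 0x55 then 0xAA.
class SyncParser {
    static final int SYNC = 0xAA55;

    // Index of the first byte of the next sync word at or after
    // 'start', or -1 if none is found.
    static int findSync(byte[] stream, int start) {
        for (int i = start; i + 1 < stream.length; i++) {
            int w = (stream[i] & 0xFF) | ((stream[i + 1] & 0xFF) << 8);
            if (w == SYNC) return i;
        }
        return -1;
    }

    // A frame boundary is two consecutive sync words
    // (four bytes: 0x55 0xAA 0x55 0xAA).
    static boolean isFrameStart(byte[] stream, int i) {
        return i + 3 < stream.length
            && findSync(stream, i) == i
            && findSync(stream, i + 2) == i + 2;
    }
}
```

Once a sync is found, the following 12 bytes are the block's payload, and you repeat from the next sync.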

We are the team whose code you referenced in your original post. We spent a lot of time today finding out that the code we were using (copied from another FRC team, who copied it from a CD post) isn’t doing what we want. We want to read multiple blocks of the same signature. The code we were using reads data from the Pixy in 32-byte chunks and then looks for one block of each signature. The amount of data to read depends on how many signatures you have taught the Pixy and how many blocks it actually detects. We tried porting the Arduino Pixy code to Java and will see if it works any better than what we had before. Check our repo to see updates. We did the porting tonight and will test tomorrow. I have reasonable hope it will work… another team did something similar… see https://www.chiefdelphi.com/forums/showthread.php?t=154355

Just wanted to update this thread with results. The Arduino code we ported to Java worked great. We can see both strips of reflective tape; in other words, we now get two objects from the Pixy per frame. By averaging the x values of the objects and using a gyro with a PID controller, we can get our robot to center on the peg pretty well.
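For anyone following along, the centering math is simple once you have two blocks of the same signature: average their x values to get the peg's pixel column, then use the offset from the image center as the error term for the controller. A minimal sketch, assuming the Pixy's 320-pixel frame width (the class name is my own):

```java
// Turns the x positions of the two tape-strip blocks into a pixel
// error for a PID/gyro controller: 0 when the peg is centered,
// negative when it is left of center.
class PegCenterError {
    static final double IMAGE_CENTER_X = 160.0; // Pixy frame is 320 px wide

    // xLeft and xRight are the x centers of the two detected blocks.
    static double pixelError(double xLeft, double xRight) {
        double pegX = (xLeft + xRight) / 2.0; // midpoint between the strips
        return pegX - IMAGE_CENTER_X;
    }
}
```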

The Arduino code we ported is here:

Our ported code is here: