Need help with the Pixy

First of all, I will preface this by saying that we already have a few working Autonomous builds, so don’t worry about this being super late in the season - it’s mostly a “add it if we can, scrap it if we can’t” kind of thing.

Anyway, we’re trying to hook up a Pixy cam for tracking the gear pegs, but we have no real clue what that entails. We have read the porting guide, but any further advice (and especially examples) would be much appreciated. To start, the cable that came with the Pixy is much too short and doesn’t appear to plug into the roboRIO, so we need another cable. Should we use I2C or SPI? And, once it is connected, how do we pull the data for two objects in LabVIEW (in order to average the two positions and find their center)?

Again, any help is much appreciated, but don’t spend too much time freaking out over how “late” this is. :stuck_out_tongue:

We use the analog and digital outputs and tune the Pixy over USB from a laptop. You are going to need a custom cable no matter what; it does not plug into the RIO directly, regardless of the comms scheme.

PM me if you need help with the cable.


But can the analog and digital signals convey the locations of the two largest targets? I was under the impression that that was not the case.

If you are looking for a bit of a challenge, look at the porting page. It explains how to make the cables and how to talk to the device. Or you can look for a library that does I2C or SPI for you.

Be sure to learn how to train the PIXY and how to steer the robot using it.

Greg McKaskle

So wait

We need to manually wire the Pixy’s output to whatever board we’re using?

Sounds like something I’ll need to talk to my electrical team on. Was really hoping to get a direct Pixy to rio connection.

Our team had a whole lot of trouble getting it working, but we recently arrived at a solution that uses an Arduino to do the processing and then has the Arduino send the data for each motor over to the roboRIO.

I’m assuming a Raspberry Pi could also be used for such a purpose? I could donate my Pi to my team for the season, but we don’t have an Arduino.

I suppose you could, but it’s so easy with the Arduino: you don’t need a custom cable, and there is already sample code. For our code, we modified the “Hello World” example and simply have a couple of if statements based on the average of the two blocks, which send a digital signal to the RIO indicating whether the target is straight ahead, to the left, or to the right. You also need to configure the Pixy camera properly with PixyMon if you want to do it this way, however.

The Pixy is good to go on finding the targets, we just need a way to get our rio to get that data so we can use a PID loop to align it with the gear peg. :smiley:

A PID loop is kinda overkill; just feed an offset into the calculations for driving direction.

PID loops are easy though

If you haven’t yet solved the data to the rio problem, see the porting guide. We used the I2C port on the roboRio - our electrical team made a connecting wire, and we put a little diagram on the back of the 3D printed case in case the wires came out. You can take a look at our code if you want - we borrowed from others and wrote some ourselves. There are folks out there extending what we have done as well.

Using the analog/digital interface the Pixy will only report the single largest object it sees. Using the I2C or SPI interface it will report all objects it sees in order of decreasing size.

You’re right, but I have yet to come across code that I’ve been able to use to get the pixy to communicate with the rio over a serial interface.

To be honest, I doubt that you could tune a PID loop to be more effective than a good driver. Personally, I recommend just getting a camera feed to the driver. If one were dead set on using the pixy, I would recommend just getting x-axis data and feeding that to the driver. However, that would probably require serial communication unfortunately.

I’m gonna have to disagree. In Stronghold we were scoring 5 high goals per match with an exceptional driver. Once the Pixy was integrated we topped out at 11.

This year our 2 Pixy devices are communicating just fine over the I2C bus (one at address 0x54 and one at 0x5A). I’m sure our programmers will share the code if asked.

Does your team use Java? I looked for a GitHub repo but couldn’t find one. Regarding last year, a large part of that was that you guys had an excellent shooter. Regarding this year, I’m kind of stopping our work on vision because of how weak fuel is. We have a working shooter (as you saw), but the reality is that we will be able to use the key most of the time and use the camera to line up the rest of the time.

We’re using C++.

Last year’s catapult was extremely repeatable, but in the end was only as good as the auto-targeting code (which rocked).

ElectroKen and I are with the same team (FRC 230). We use C++ but based our code on the published code of Team 599 RoboDox that was posted early in the season. A version of this code is posted in this thread:

We can also post our code here, but it has several changes that make it more complicated and that are related to how we use the information… so I’m not sure whether that would be easier or harder to use.

More code can’t hurt, although we’re in LabVIEW.

Also, as far as I can tell, it would be extremely difficult for our team to make a custom cable. We lack the ability to 3D-print, and our Electrical “Team” is just one kid. Given this, any suggestions on how to connect the Pixy to the RIO without using analog/DIO?

If you have the tools to make a PWM cable you have what you need to make a cable to connect the Pixy to a Roborio. Same crimper, contacts, and connector housings.