Using pixy


#1

My team recently bought a Pixy 2 camera and wants to use it for vision programming this year. We want our robots to be able to see the vision targets and drive toward them. We are new to this and are looking for help. If someone could assist us, that would be amazing!


#2

My team is doing something similar. We haven’t finished testing everything yet, but it seems like the easiest way to send information from the Pixy cam to the roboRIO is via a serial port, with an Arduino as an in-between processor. Try using the USB port on the Arduino to connect to the roboRIO (it not only allows data transfer, it also powers the Arduino itself), and you should be able to use a ribbon cable to wire the Pixy cam to the Arduino. As for programming it, check out the documentation on the SerialPort class (http://first.wpi.edu/FRC/roborio/beta/docs/java/edu/wpi/first/wpilibj/SerialPort.html) and set the port to SerialPort.Port.kUSB.

You’ll also have to program your Arduino to send the data back to the RIO. Look for documentation on programming it (this site is very helpful: http://www.cmucam.org/projects/cmucam5/wiki/Hooking_up_Pixy_to_a_Microcontroller_(like_an_Arduino)), and make sure to download and import the Pixy cam library (you can find it at that link). You might see something about “baud rate”; I’ve found that if you’re just sending small numbers, 1200 does a good job, although you may want to play around with that number. Just make sure the baud rate on the Arduino matches the one in the robot code, and keep in mind that the serial output from the Arduino is sent as ASCII characters. Hope this helps, and good luck with it!
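To make the ASCII point concrete, here is a minimal sketch of the roboRIO-side parsing. The actual read would come from WPILib’s SerialPort (set to Port.kUSB as described above); this sketch only shows the plain-Java parsing step, and the `x=<int>,y=<int>` line format and the `ArduinoSerialParser` name are my own illustration, not anything the Arduino library prescribes.

```java
// Sketch: the Arduino prints one target per line as ASCII, e.g. "x=153,y=87\n".
// WPILib's SerialPort.readString() would hand us chunks of that stream; the
// parsing itself is plain Java and can be tested off the robot.
public class ArduinoSerialParser {
    // Parse a line of the form "x=<int>,y=<int>" into a two-element array.
    // Returns null for lines that don't match (partial reads are common over serial).
    public static int[] parseXY(String line) {
        line = line.trim();
        if (!line.startsWith("x=") || !line.contains(",y=")) {
            return null;
        }
        try {
            int comma = line.indexOf(",y=");
            int x = Integer.parseInt(line.substring(2, comma));
            int y = Integer.parseInt(line.substring(comma + 3));
            return new int[] { x, y };
        } catch (NumberFormatException e) {
            return null; // garbled or incomplete line
        }
    }
}
```

Returning null on a bad line matters in practice: at low baud rates a read can easily split a line in half, and the robot loop should just skip that sample rather than crash.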


#3

Hello,

Our team will also use a Pixy, but with a Raspberry Pi as a co-processor. We’ve written a Python class for the Raspberry Pi that gathers data from the Pixy via I2C and then sends it out over the serial port.

If you are interested in using a Raspberry Pi, please let me know and we’ll help you out!


#4

Hi,

We used a Pixy last year and will likely use one this year as well. Instead of using a coprocessor, we plugged it directly into the roboRIO’s I2C port, from which we got all of the data we needed (link to code). We are still figuring out how to do this with the Pixy 2, though. I think the new Pixy 2 API allows it to interface directly with the I2C port, which should make the code easier.
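For the original Pixy that this post describes, the I2C data is simple enough to decode by hand. The sketch below shows one way to do it in Java, based on my reading of the original Pixy’s object-block format: little-endian 16-bit words, a 0xaa55 sync word, then checksum, signature, x, y, width, height, where the checksum is the sum of the five data words. Treat that layout as an assumption to verify against the Pixy docs; the `PixyBlock` class is ours, and the bytes would come from a WPILib I2C read in real robot code. Note this does not apply to the Pixy 2, whose protocol is different.

```java
// Decode one object block from the original Pixy's byte stream.
// All multi-byte fields are little-endian 16-bit words.
public class PixyBlock {
    public int signature, x, y, width, height;

    // Read a little-endian 16-bit word at offset i.
    private static int word(byte[] buf, int i) {
        return (buf[i] & 0xff) | ((buf[i + 1] & 0xff) << 8);
    }

    // Decode a block starting at its sync word; returns null on a bad
    // sync or a checksum mismatch so callers can skip corrupted data.
    public static PixyBlock decode(byte[] buf, int off) {
        if (word(buf, off) != 0xaa55) return null;
        int checksum = word(buf, off + 2);
        PixyBlock b = new PixyBlock();
        b.signature = word(buf, off + 4);
        b.x = word(buf, off + 6);
        b.y = word(buf, off + 8);
        b.width = word(buf, off + 10);
        b.height = word(buf, off + 12);
        int sum = (b.signature + b.x + b.y + b.width + b.height) & 0xffff;
        return (sum == checksum) ? b : null;
    }
}
```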


#5

You don’t need to use an Arduino or Raspberry Pi; it depends on what your team feels comfortable with. You can connect the Pixy2 cam directly to the roboRIO (using I2C or SPI, for example). You will then need to implement the serial protocol to talk to the Pixy2 cam. Our team is doing this.

Here’s a thread about wiring up the Pixy2: Need help with the Pixy. It talks about the original Pixy, but the Pixy2 wiring is the same.
And the official pixy2 docs have info on wiring too: https://docs.pixycam.com/wiki/doku.php?id=wiki:v2:porting_guide#how-to-talk-to-pixy2

If you search on github you can find some code examples of pixy2 serial protocol: https://github.com/search?q=pixy2
Our current effort is here: https://github.com/BHSRobotix/PowerUp2019/tree/pixy2_port
Note that the protocol between pixy and pixy2 is different.
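If you do implement the Pixy2 protocol yourself, the core of it is just framing plus a checksum. Here is a minimal Java sketch of validating one response packet, based on my reading of the Pixy2 porting guide linked above: a 0xc1af sync word (little-endian on the wire), a type byte, a payload-length byte, a 16-bit checksum that is the sum of the payload bytes, then the payload. The `Pixy2Packet` name is ours, and the sync value, field order, and checksum rule should be double-checked against the official docs before you rely on them.

```java
// Validate and unpack one checksummed Pixy2 response packet from a buffer.
public class Pixy2Packet {
    public int type;
    public byte[] payload;

    public static Pixy2Packet parse(byte[] buf, int off) {
        // Checksum-style response sync 0xc1af arrives as 0xaf then 0xc1.
        if ((buf[off] & 0xff) != 0xaf || (buf[off + 1] & 0xff) != 0xc1) {
            return null;
        }
        int type = buf[off + 2] & 0xff;
        int length = buf[off + 3] & 0xff;
        int checksum = (buf[off + 4] & 0xff) | ((buf[off + 5] & 0xff) << 8);
        int sum = 0;
        byte[] payload = new byte[length];
        for (int i = 0; i < length; i++) {
            payload[i] = buf[off + 6 + i];
            sum += payload[i] & 0xff;
        }
        if ((sum & 0xffff) != checksum) return null; // corrupted packet
        Pixy2Packet p = new Pixy2Packet();
        p.type = type;
        p.payload = payload;
        return p;
    }
}
```

A real driver would then interpret the payload by packet type (for example, the getBlocks response carries a list of fixed-size block records), but the framing check above is the part that is easy to get subtly wrong.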

You could also try to cross-compile the libpixyusb code for the roboRIO and use it to talk to the Pixy2. I have no idea how to get started compiling things for the roboRIO, though, so I can’t give any links.

If implementing the Pixy2 protocol in Java/C++/Python/LabVIEW on the roboRIO seems too hard, you can try using an Arduino or Raspberry Pi. Run the already-written Pixy2 code that @thepythonguy linked to above on the Arduino/Raspberry Pi. You will then need to implement your own protocol between your roboRIO code and the code running on the Arduino/Raspberry Pi.
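That roboRIO-to-coprocessor protocol can be very simple; newline-delimited ASCII is the easiest thing that works. This is purely an illustrative format of our own (one `sig,x,y` line per detected block), not anything standard, and the encode side would run on the Arduino/Pi while the decode side runs on the roboRIO:

```java
// A toy message format for a coprocessor link: "sig,x,y\n" per block.
// Keeping encode and decode in one place makes the format easy to test.
public class TargetMessage {
    public int signature, x, y;

    public static String encode(int signature, int x, int y) {
        return signature + "," + x + "," + y + "\n";
    }

    // Returns null on malformed input so the robot loop can skip bad reads.
    public static TargetMessage decode(String line) {
        String[] parts = line.trim().split(",");
        if (parts.length != 3) return null;
        try {
            TargetMessage m = new TargetMessage();
            m.signature = Integer.parseInt(parts[0]);
            m.x = Integer.parseInt(parts[1]);
            m.y = Integer.parseInt(parts[2]);
            return m;
        } catch (NumberFormatException e) {
            return null;
        }
    }
}
```

Text framing is slower than a binary format, but at the handful of blocks per frame a Pixy reports, the difference is negligible and the debuggability (you can watch the stream in a serial monitor) is worth it.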


#6

I am currently at the same stage as you, and I recommend connecting directly to the roboRIO using the SPI or I2C ports provided. If you have an Arduino or Raspberry Pi, you can take that approach too; however, you may find it easier to just use a direct connection from your Pixy2 to the roboRIO.

I recommend you use https://github.com/PseudoResonance/Pixy2JavaAPI and follow the instructions in the README. I am using the SPI (Pixy2) to SPI (roboRIO) method, but this requires you to wire connections pin to pin, since the pin layout is not the same for each interface.

I am currently working on the Pixy2 for our team and you are welcome to see how I use the SPI method here: https://github.com/Shockwave4546/FRC-2019/tree/Pixy2


#7

I’ve been working with Team 1891, the Bullbots, out of Boise, ID, and I have compiled a USB version of the Pixy2 classes, converted into a Java JNI library. It should be fully functional now. Just clone the repo and deploy. Let me know if there are any issues with it.

Currently it is missing the final hookups of the objects to the “Target Reticle” screen on the Shuffleboard because I have that as an assignment to one of the students.


#8

Oh, yeah. You might want to change the team number first before deploying.


#9

:open_mouth: Much wow.


#10

We are currently testing this out; we have no errors, but we also have no camera feed from the Pixy 2. Is there anything we are missing or should remove? https://github.com/Shockwave4546/FRC-2019/tree/Pixy2USB


#11

In robotInit() you need to start the Pixy2 thread to kick the whole thing off:

public static final Pixy2USBJNI pixy2USBJNI = new Pixy2USBJNI();

@Override
public void robotInit() {
    Thread pixy2USBThread = new Thread(pixy2USBJNI);
    pixy2USBThread.setDaemon(true);
    pixy2USBThread.start();
}


#12

We do indeed have the start() in robotInit(). Does it have something to do with the fact that we have another ordinary camera working off the other USB port on the roboRIO? I don’t think that would be the issue, though.


#13

You need to call start() right after the Thread is created; otherwise the Thread object goes out of scope without ever being started, and the Runnable never gets called. The Pixy2USBJNI Runnable is static so that it won’t get destroyed: even though pixy2USBThread goes out of scope, the Runnable does not, so the thread cannot be garbage collected.

One can use other USB cameras as well, no problem. I have tested two additional USB cameras successfully; the code that enables this is included, commented out.
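The thread-lifetime pattern described above, reduced to plain Java (class and field names here are illustrative stand-ins for Pixy2USBJNI, not the real API): keep the Runnable in a static field so it outlives robotInit(), and call start() immediately after constructing the Thread.

```java
// Demonstrates the "static Runnable + start right away" pattern.
public class VisionThreadDemo {
    static volatile boolean ran = false;

    // Stands in for the Pixy2USBJNI Runnable held in a static field.
    static final Runnable visionLoop = () -> ran = true;

    public static void init() throws InterruptedException {
        Thread t = new Thread(visionLoop);
        t.setDaemon(true); // don't keep the JVM alive for this thread
        t.start();         // must happen before t goes out of scope unused
        t.join();          // only for this demo; robot code would not join
    }
}
```

The daemon flag mirrors the robot code above: a daemon thread dies with the JVM instead of blocking shutdown, which is what you want for a background camera loop.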

JNIEXPORT void JNICALL Java_frc_robot_vision_Pixy2USBJNI_pixy2USBStartCameraServer(JNIEnv *env, jobject thisObj)
{
    std::cout << "Starting CameraServer..." << std::endl;
    // Uncomment these to get more regular USB cameras
    // camera = frc::CameraServer::GetInstance()->StartAutomaticCapture(0);
    // camera.SetResolution(640, 480);
    // camera1 = frc::CameraServer::GetInstance()->StartAutomaticCapture(1);
    // camera1.SetResolution(640, 480);
}


#15

I have posted a clunky yet working version of the two mixed together. I am not happy with the results yet because the frame rate is so slow; I just can’t believe the getFrame function cannot reset much faster. Anyway, it is on the "get_blocks" branch if you want to check it out.


#16

Two quick comments about that JNI build setup. Please update to GradleRIO 2019.3.2 (2019.3.1 was released too early and should not be used). Also, use wpilibjni rather than wpilib for the library; wpilib includes things that can easily break if you do the wrong thing, whereas wpilibjni has much less chance of breaking.

One of the biggest differences is that wpilibjni does not include CameraServer. The Java one and the C++ one can interfere with each other, causing some weird and serious issues, especially for any team that might be trying to use it. You can do the PutVideo call from the Java side, grab the handle from the returned object, and pass that over JNI to create the object in the JNI layer. That would be my recommendation, as it will be safer in the future, and the handles can be safely passed across the language boundaries.


#17

I’m trying to make use of this, but I’m getting the following error. What do we need to do to make it build (on Windows)?

Configure project :
INFO: Could not find files for the given pattern(s).
Skipping build: shared library 'pixy2_usb:debug:sharedLibrary': Could not find valid toolchain for platform linuxathena
Skipping build: static library 'pixy2_usb:debug:staticLibrary': Could not find valid toolchain for platform linuxathena
Skipping build: shared library 'pixy2_usb:release:sharedLibrary': Could not find valid toolchain for platform linuxathena
Skipping build: static library 'pixy2_usb:release:staticLibrary': Could not find valid toolchain for platform linuxathena
=============================
No Toolchain Found for roboRio
Run ./gradlew installRoboRioToolchain to install one!

You can ignore this error with -Ptoolchain-optional-roboRio
For more information, run with --info

FAILURE: Build failed with an exception.

  • What went wrong:
    No Toolchain Found! Scroll up for more information.

#18

Were you by any chance disconnected from the internet when you tried to build it for the first time? It needs to download the roboRIO toolchain, just as it does for teams that run C++ for their regular competition code.


#19

I’m pretty sure I was connected to the internet at that time.


#20

Do exactly what it says. For C++, you have to manually install the toolchain; it will never be installed automatically. Run ./gradlew installRoboRioToolchain to install it.


#21

@Thad_House, I never ran that command to install the roboRIO toolchain. Did it perhaps work for me because I have run C++ robot code previously?