Connecting Pixy2 through UART


Hi, my team wants to connect the Pixy over UART. Can someone give me some guidelines and some example code? You could also show me other ways to connect the Pixy, with code as well. It's our first time using the Pixy on a robot, and our first time using vision tracking, so the more detail the better.



There isn't really any Pixy 2 port at all, but there is a Pixy 1 port already done if you want to use it. I have been working on porting the Pixy 2 I2C code to Java. Here's the link for the Pixy 1 I2C code:


Would this code work with the Pixy 2?


Curious about this as well. We are trying SPI, and we can see data on our SmartDashboard for X, Y, height, and width, but the camera seems to have locked onto one signature, meaning it jumps back and forth between the two targets; I don't think it's averaging the two.
Any ideas on how to use the Pixy 2 cam to separate the data for two targets, as with the vision tape this year?
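One common way to handle the two tape strips is to take the two largest blocks the camera reports and aim at the midpoint of their centers instead of letting the code jump between them. Here is a sketch of that logic in plain Java; the Block class below is a stand-in for whatever block type your Pixy wrapper returns (the field names are assumptions, not the real API):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class TwoTargetMidpoint {
    // Stand-in for the block objects your Pixy wrapper returns; the real
    // API object would carry the same kind of fields via getters.
    static class Block {
        final int x, y, width, height;
        Block(int x, int y, int width, int height) {
            this.x = x; this.y = y; this.width = width; this.height = height;
        }
        int area() { return width * height; }
    }

    /**
     * Returns the x midpoint between the centers of the two largest blocks,
     * or -1 if fewer than two blocks were seen this frame.
     */
    static int midpointX(List<Block> blocks) {
        if (blocks.size() < 2) {
            return -1;
        }
        List<Block> sorted = new ArrayList<>(blocks);
        sorted.sort(Comparator.comparingInt(Block::area).reversed());
        Block a = sorted.get(0);
        Block b = sorted.get(1);
        // Aim between the centers of the two biggest blocks
        return ((a.x + a.width / 2) + (b.x + b.width / 2)) / 2;
    }

    public static void main(String[] args) {
        List<Block> blocks = new ArrayList<>();
        blocks.add(new Block(40, 100, 20, 50));   // left tape strip
        blocks.add(new Block(140, 102, 20, 50));  // right tape strip
        System.out.println(midpointX(blocks));    // prints 100
    }
}
```

Sorting by area first also filters out small noise blocks, which is usually what causes the back-and-forth jumping.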


You can try to re-engineer or reuse the Pixy 1 code that I linked two posts above to read the distances.


We haven't tested it yet, but the Pixy 2 has a compatibility mode that might work with code written for the Pixy 1. You can activate it in one of the settings tabs in PixyMon.


From the docs: "Note, this parameter only applies to SPI, UART, and I2C interfaces – not USB. Additionally, it only applies to the color_connected_components program."


I don't think there is a way to directly connect the Pixy (or Pixy 2) to the roboRIO in Java, at least with official libraries. I think most teams just plug the Pixy into an Arduino with the included cable, use the Arduino Pixy (or Pixy 2) library to do the processing on the Arduino, and then send the results to the roboRIO. We have a repository that is just code for sending data from a Pixy to the robot using an Arduino and I2C, and our 2019 repo has command-based code to communicate with the Pixy and Arduino, if that helps.
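On the roboRIO side of that Arduino relay, you end up parsing whatever byte layout your Arduino sketch sends. The packet format below is entirely made up for illustration (header bytes, field order, and endianness are all assumptions you would match to your own sketch), but it shows the decoding pattern in plain Java:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ArduinoPacket {
    // Hypothetical 10-byte packet the Arduino sketch might send:
    // [0xAA][0x55][x lo][x hi][y lo][y hi][w lo][w hi][h lo][h hi]
    // The header bytes and field order are an assumption, not part of any
    // official protocol -- match them to whatever your Arduino code sends.
    static int[] parse(byte[] packet) {
        if (packet.length < 10
                || (packet[0] & 0xff) != 0xAA
                || (packet[1] & 0xff) != 0x55) {
            return null; // bad header or short packet
        }
        ByteBuffer bb = ByteBuffer.wrap(packet, 2, 8).order(ByteOrder.LITTLE_ENDIAN);
        int x = bb.getShort() & 0xffff;
        int y = bb.getShort() & 0xffff;
        int w = bb.getShort() & 0xffff;
        int h = bb.getShort() & 0xffff;
        return new int[] {x, y, w, h};
    }

    public static void main(String[] args) {
        byte[] packet = {(byte) 0xAA, 0x55, (byte) 0x90, 0x00,
                         0x64, 0x00, 0x1e, 0x00, 0x32, 0x00};
        int[] fields = parse(packet);
        System.out.println(fields[0] + " " + fields[1]); // prints "144 100"
    }
}
```

On a real robot, the byte array would come from a WPILib I2C or SerialPort read rather than a literal.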


We directly connected the Pixy to the roboRIO over I2C and then wrote some custom code to communicate with it.

There is a lot of good info in the Pixy docs.


The code that was linked only covers the response from the Pixy 1, not the actual request for the data from the roboRIO. Did the old Pixy model not require two-way communication? The Pixy 2 documentation says we have to send requests from the roboRIO to the Pixy and then capture the data coming back from the Pixy 2.

Also, is the 0x55 0xaa sync sequence on the old Pixy equivalent to the 16-bit 0xc1af sequence on the Pixy 2, or is the Pixy 2 using that for error checking?
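As I understand the Pixy 2 serial protocol docs, 0xc1af plays the same role as the old 0x55 0xaa: it is a frame-sync marker at the start of each checksummed response, sent low byte first (0xaf, then 0xc1). Error checking is done by a separate 16-bit checksum field later in the packet, not by the sync word itself. A minimal standalone sketch of scanning a byte stream for that sync word:

```java
public class Pixy2SyncScan {
    // Pixy 2 checksummed responses start with the 16-bit sync word 0xc1af,
    // transmitted low byte first (0xaf, then 0xc1) -- the Pixy 2 analogue
    // of the old Pixy 1 marker 0x55 0xaa (word value 0xaa55).
    static final int PIXY2_CHECKSUM_SYNC = 0xc1af;

    /** Returns the index where the sync word starts, or -1 if absent. */
    static int findSync(byte[] data) {
        for (int i = 0; i + 1 < data.length; i++) {
            // Assemble a little-endian 16-bit word from two adjacent bytes
            int word = (data[i] & 0xff) | ((data[i + 1] & 0xff) << 8);
            if (word == PIXY2_CHECKSUM_SYNC) {
                return i;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        byte[] stream = {0x00, 0x12, (byte) 0xaf, (byte) 0xc1, 0x21, 0x0e};
        System.out.println(findSync(stream)); // prints 2
    }
}
```

Once you find the sync word, the bytes that follow are the packet type, payload length, checksum, and payload, per the Pixy 2 protocol reference.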


To read from the Pixy 1 over I2C we used the read function with the Pixy's I2C address (in our case that's 54) to get the info.

Here is the code we used to read a word:

public int readWord() {
    ByteBuffer buffer = ByteBuffer.allocate(2);
    // i2c is an edu.wpi.first.wpilibj.I2C opened on the Pixy's address (54 for us);
    // the register-address argument was lost from the original post, so 0 is a guess
    boolean abortedWhileReading =, 2, buffer);
    if (!abortedWhileReading) {
        return getUnsignedInt(buffer.array());
    } else {
        return 0;
    }
}

public static int getUnsignedInt(byte[] data) {
    // The Pixy sends the low byte first, so decode as little-endian
    ByteBuffer bb = ByteBuffer.wrap(data).order(ByteOrder.LITTLE_ENDIAN);
    return bb.getShort() & 0xffff;
}

Note: the getUnsignedInt() function converts two bytes from the Pixy (little-endian) into a Java integer.
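To see why the byte order matters, here is a quick standalone check of that little-endian conversion (pure Java, no robot hardware needed):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class EndianDemo {
    /** Decodes two Pixy bytes (low byte first) into an unsigned 16-bit word. */
    static int toWord(byte lo, byte hi) {
        return ByteBuffer.wrap(new byte[] {lo, hi})
                .order(ByteOrder.LITTLE_ENDIAN)
                .getShort() & 0xffff;
    }

    public static void main(String[] args) {
        // 0x55 followed by 0xaa is the Pixy 1 frame marker; as a word it is 0xaa55
        System.out.printf("0x%04x%n", toWord((byte) 0x55, (byte) 0xaa)); // prints 0xaa55
    }
}
```

Without the ORDER(LITTLE_ENDIAN) call, getShort() defaults to big-endian and would return 0x55aa instead, which is the kind of swapped-byte bug that is easy to miss.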


If you are still not having luck porting the Pixy 2, refer to this post (it offers SPI, I2C, and UART methods): Using Pixy 2 with RoboRIO (No Co-Processor) Java API


Hey, I'm trying to implement the API you listed in that post. Can you help me implement it in a new FRC project? @dilanpace1496. This is what I have:

/* Copyright (c) 2018 FIRST. All Rights Reserved.                             */
/* Open Source Software - may be modified and shared by FRC teams. The code   */
/* must be accompanied by the FIRST BSD license file in the root directory of */
/* the project.                                                               */

package frc.robot;

import edu.wpi.first.wpilibj.TimedRobot;
import io.github.pseudoresonance.pixy2api.Pixy2;
import io.github.pseudoresonance.pixy2api.links.SPILink;

/**
 * The VM is configured to automatically run this class, and to call the
 * functions corresponding to each mode, as described in the TimedRobot
 * documentation. If you change the name of this class or the package after
 * creating this project, you must also update the build.gradle file in the
 * project.
 */
public class Robot extends TimedRobot {

  Pixy2 pixy = Pixy2.createInstance(new SPILink());

  /**
   * This function is run when the robot is first started up and should be used
   * for any initialization code.
   */
  @Override
  public void robotInit() {
    pixy.init(); // initialize the link before using the camera
  }

  @Override
  public void autonomousInit() {}

  @Override
  public void autonomousPeriodic() {}

  @Override
  public void teleopInit() {}

  @Override
  public void teleopPeriodic() {}

  @Override
  public void testInit() {}

  @Override
  public void testPeriodic() {}
}

And this is the build.gradle:

plugins {
    id "java"
    id "edu.wpi.first.GradleRIO" version "2019.1.1"
}

def ROBOT_MAIN_CLASS = "frc.robot.Main"

// Define my targets (RoboRIO) and artifacts (deployable files)
// This is added by GradleRIO's backing project EmbeddedTools.
deploy {
    targets {
        roboRIO("roborio") {
            // Team number is loaded either from the .wpilib/wpilib_preferences.json
            // or from command line. If not found an exception will be thrown.
            // You can use getTeamOrDefault(team) instead of getTeamNumber if you
            // want to store a team number in this file.
            team = frc.getTeamNumber()
        }
    }
    artifacts {
        frcJavaArtifact('frcJava') {
            targets << "roborio"
            // Debug can be overridden by command line, for use with VSCode
            debug = frc.getDebugOrDefault(false)
        }
        // Built in artifact to deploy arbitrary files to the roboRIO.
        fileTreeArtifact('frcStaticFileDeploy') {
            // The directory below is the local directory to deploy
            files = fileTree(dir: 'src/main/deploy')
            // Deploy to RoboRIO target, into /home/lvuser/deploy
            targets << "roborio"
            directory = '/home/lvuser/deploy'
        }
    }
}

// Set this to true to enable desktop support.
def includeDesktopSupport = false

// Maven central needed for JUnit
repositories {
    mavenCentral()
}

// Defining my dependencies. In this case, WPILib (+ friends), and vendor libraries.
// Also defines JUnit 4.
dependencies {
    compile 'pw.otake.pseudoresonance:pixy2-java-api:1.1'
    compile wpi.deps.wpilib()
    nativeZip wpi.deps.vendor.jni(wpi.platforms.roborio)
    nativeDesktopZip wpi.deps.vendor.jni(wpi.platforms.desktop)
    testCompile 'junit:junit:4.12'
}

// Setting up my Jar File. In this case, adding all libraries into the main jar ('fat jar')
// in order to make them all available at runtime. Also adding the manifest so WPILib
// knows where to look for our Robot Class.
jar {
    from { configurations.compile.collect { it.isDirectory() ? it : zipTree(it) } }
    manifest edu.wpi.first.gradlerio.GradleRIOPlugin.javaManifest(ROBOT_MAIN_CLASS)
}

I changed the link type from




I want to know how to get the information about the objects from the camera.


Well, first off, change the version number in build.gradle, since a newer release of the API was published on GitHub (you will have to keep up to date with these updates from time to time). Change:

compile 'pw.otake.pseudoresonance:pixy2-java-api:1.1'

to:

compile 'pw.otake.pseudoresonance:pixy2-java-api:1.3.1'

Are you planning on making a separate file to do the Pixy2 commands in? (Like in a Subsystem folder?)


Yes, I'm programming in the iterative style, like this code. I'm not getting the readings from the camera: the x, y, width, and height.


You would have to use the API documentation here, and something similar to this for extra detail.

I encourage you to have a look at our Team’s current Pixy 2 testing if you have any issues understanding it.

We are also using a normal camera along with our Pixy 2, since an API for a video stream over USB to the roboRIO does not exist yet. (We may look into that after build season, but it isn't important for us yet.)
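For getting x, y, width, and height: with the pixy2-java-api you would ask the camera for blocks through its color-connected-components interface and then read each block's getters (see the API docs for the exact calls). The selection logic after that is plain Java, so here is a sketch of picking the largest block of one signature and computing an aiming error; the Block stand-in class, its field names, and the 158-pixel frame center are assumptions for illustration:

```java
import java.util.List;

public class AimError {
    // Stand-in for the block objects the Pixy 2 API returns; the real API
    // exposes the same data (signature, x, y, width, height) via getters.
    static class Block {
        final int signature, x, y, width, height;
        Block(int signature, int x, int y, int width, int height) {
            this.signature = signature; this.x = x; this.y = y;
            this.width = width; this.height = height;
        }
    }

    /**
     * Picks the largest block matching the wanted signature and returns how
     * far its center sits from the frame center (negative = target is left).
     * Returns null when no matching block is in view.
     */
    static Integer aimErrorX(List<Block> blocks, int wantedSignature, int frameCenterX) {
        Block best = null;
        for (Block b : blocks) {
            if (b.signature == wantedSignature
                    && (best == null || b.width * b.height > best.width * best.height)) {
                best = b;
            }
        }
        if (best == null) {
            return null;
        }
        return (best.x + best.width / 2) - frameCenterX;
    }

    public static void main(String[] args) {
        List<Block> blocks = List.of(
                new Block(1, 100, 80, 40, 30),   // big signature-1 block
                new Block(1, 10, 10, 5, 5),      // small noise block
                new Block(2, 200, 50, 20, 20));  // different signature
        System.out.println(aimErrorX(blocks, 1, 158)); // prints -38
    }
}
```

Feeding that error into a steering loop (even a simple proportional one) is the usual next step for tracking.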


I used your team's code; now let me try this new one. What protocol should I select in PixyMon? The "SPI with SS" one?




How are you doing so far? Got it to work I hope?