Pixy2 Object recognition


#1

Hi everyone! Since I’m a new member here on the Chief Delphi forums, let me start by introducing myself. My name’s Micheal (also, that’s the legal spelling of my first name so… oof) and I’m one of the programmers from FRC Team 6455.

Now onto the reason for my post. I received an email from one of my mentors this week with a link from Amazon for the Pixy2 camera (after a little more digging around, I found it on AndyMark as well). I talked to him at our meeting tonight (the day this is being posted), and he said he came across it while he was looking for (I believe) wiring parts on Amazon.

We haven’t bought the Pixy2 yet, but we’re definitely considering it.

Pixy2 Link: https://www.andymark.com/products/pixy2-smart-vision-camera

I did some searching and found some threads from teams that had used the original Pixy, but I’m still a bit lost/confused.

I was thinking of using a Raspberry Pi to communicate back and forth with the Pixy2 (especially since the Pixy2 uses C++ and we code our robot in Java). Would that be the easiest way to talk to it, especially for object recognition?

Does anyone have any suggestions or tips?


#2

At the end of the day, the Pixy will do both the image capture and object recognition, so you should only have to transfer target location data from the Pixy to the roboRIO.

I’d recommend against using a Raspberry Pi to talk to the Pixy - it’s just an extra handoff point in the stream of data from camera to roboRIO. In vision processing, latency usually matters, so you want a path that is as direct as possible. If I were in your position, I would try to get the Pixy talking directly with the roboRIO.

I’ve never worked with this product line, just looked at the documentation. Like you, I’m not seeing an official pre-made Java library for communicating with the Pixy from a roboRIO. They do have a good set of documentation on choosing an interface, which you could use to write your own Java driver to read data from the Pixy. Usually this is over serial, I2C, SPI, or something similar. If you go this route, they document the format of the data packets sent from the Pixy.
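For a sense of what that driver’s parsing side involves: the Pixy2 porting guide documents each detected object (“block”) in a getBlocks response as a 14-byte little-endian record. Here’s a minimal sketch of decoding one such record in Java (the field layout comes from the porting guide; the class and field names are my own, and the transport-level framing around these records is not shown):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

/**
 * One detected-object record ("block") from a Pixy2 getBlocks response.
 * Per the Pixy2 porting guide, each block is 14 bytes, little-endian:
 * signature (u16), x (u16), y (u16), width (u16), height (u16),
 * angle (s16), tracking index (u8), age (u8).
 */
public class PixyBlock {
    public final int signature, x, y, width, height, angle, trackingIndex, age;

    public PixyBlock(byte[] data, int offset) {
        ByteBuffer buf = ByteBuffer.wrap(data, offset, 14)
                                   .order(ByteOrder.LITTLE_ENDIAN);
        signature     = buf.getShort() & 0xFFFF;
        x             = buf.getShort() & 0xFFFF; // center x, pixels
        y             = buf.getShort() & 0xFFFF; // center y, pixels
        width         = buf.getShort() & 0xFFFF;
        height        = buf.getShort() & 0xFFFF;
        angle         = buf.getShort();          // signed; color-code blocks only
        trackingIndex = buf.get() & 0xFF;
        age           = buf.get() & 0xFF;        // frames this block has been tracked
    }
}
```

You’d read the raw bytes off whichever bus you pick (SPI, I2C, or serial) and then hand each 14-byte slice of the payload to something like this.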


#3

Another vision option similarly priced to the Pixy is the JeVois camera. It can run OpenCV pipelines on board as well as ML algorithms, and I’ve heard much more positive things about it than the Pixy. I’ve never actually used a Pixy, but I’ve heard plenty of griping about it from other programmers who have, so I’m hesitant to recommend it over similar options.

Here are some CD whitepapers on the JeVois:

https://www.chiefdelphi.com/media/papers/3405

https://www.chiefdelphi.com/media/papers/3408


#4

We used a Pixy in 2017 with an IR kit from https://irlock.com/ .
We used a Raspberry Pi to relay data from the Pixy to the robot over NetworkTables, because we couldn’t get serial communication working on the roboRIO quickly enough for our liking.
It wasn’t perfect, but it worked.
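For anyone curious what the robot side of a relay like that looks like: a rough sketch of the roboRIO reading values the Pi would publish over NetworkTables, using WPILib’s NetworkTables API. The table and entry names (“Pixy”, “targetX”, “targetY”) are made up for illustration; the Pi would parse the Pixy data and publish under whatever names you choose.

```java
// Hedged sketch (roboRIO side) - assumes a co-processor is publishing
// parsed Pixy target data to NetworkTables under hypothetical names.
import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

public class PixyRelayReader {
    private final NetworkTable table =
        NetworkTableInstance.getDefault().getTable("Pixy");

    /** Most recent target center x in pixels, or -1.0 if nothing published. */
    public double getTargetX() {
        return table.getEntry("targetX").getDouble(-1.0);
    }

    /** Most recent target center y in pixels, or -1.0 if nothing published. */
    public double getTargetY() {
        return table.getEntry("targetY").getDouble(-1.0);
    }
}
```

The tradeoff, as mentioned above, is an extra hop of latency compared to wiring the Pixy straight to the roboRIO.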


#5

Alright, good to know there are options available, though it does sound like writing a driver would be the best option.

As for the JeVois camera, that’s interesting as well. I’ll share this with the others and see what they think.


#6

You don’t need a Raspberry Pi, Arduino, or any other kind of board to connect Pixy cams to the roboRIO. See Pixy and its different communication protocols

Here are the two wiki pages that describe the communication protocols for the Pixy2 and the Pixy:
https://docs.pixycam.com/wiki/doku.php?id=wiki:v2:porting_guide
vs
https://docs.pixycam.com/wiki/doku.php?id=wiki:v1:porting_guide

I haven’t had a chance to see if the protocol is different between the two yet.


#7

This is a late reply, but if you truly want to keep using Java and not use an Arduino or Raspberry Pi, I suggest you use this API, because it has been working well for our team so far. Refer to this prior post I made: Using pixy. It provides the API instructions with Gradle/Maven from PseudoResonance’s GitHub Pixy2JavaAPI, along with the branch from which our team is currently testing the Pixy2 with this API. It is promising so far, and you may find it an easier way to integrate directly with the roboRIO without needing a co-processor.
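For a feel of what using that library looks like, here is a rough sketch based on the Pixy2JavaAPI README at the time of writing - double-check package and method names against the current repo before relying on this, since they may have changed:

```java
// Hedged sketch using PseudoResonance's Pixy2JavaAPI over SPI.
// Class/method names are taken from the repo's README; verify before use.
import io.github.pseudoresonance.pixy2api.Pixy2;
import io.github.pseudoresonance.pixy2api.Pixy2CCC;
import io.github.pseudoresonance.pixy2api.links.SPILink;

public class PixyVision {
    private final Pixy2 pixy = Pixy2.createInstance(new SPILink());

    public void init() {
        pixy.init(); // uses the roboRIO's onboard SPI port
    }

    /** Largest block matching signature 1, or null if none seen. */
    public Pixy2CCC.Block getBiggestBlock() {
        // Request up to 25 blocks matching signature 1
        int count = pixy.getCCC().getBlocks(false, Pixy2CCC.CCC_SIG1, 25);
        if (count <= 0) {
            return null;
        }
        Pixy2CCC.Block biggest = null;
        for (Pixy2CCC.Block block : pixy.getCCC().getBlockCache()) {
            if (biggest == null || block.getWidth() > biggest.getWidth()) {
                biggest = block;
            }
        }
        return biggest;
    }
}
```

From there you can steer off the block’s x coordinate, which is the “target location data” the earlier replies mention.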