OpenMV Cam M7 vision questions


#1

Our programmer has been working on vision tracking and posted this to an older thread, so I thought I would share it here as well. Any help you can give would be greatly appreciated.

“I’ve been experimenting with the OpenMV M7 for vision tracking this year, but I’ve had some trouble communicating with the roboRIO. I was wondering if anyone had any suggestions, sample code, or ideas on how to proceed. It would be much appreciated!”


#2

Disclaimer: I’ve never used this camera. That being said, the product page does say this:

A full speed USB (12Mbs) interface to your computer. 
Your OpenMV Cam will appear as a Virtual COM Port and a USB Flash Drive when plugged in.

Therefore, you should be able to open a serial port from the device to the RIO using something like this on the RIO side:

// The baud rate must match whatever the camera is configured to use.
SerialPort m7Camera = new SerialPort(9600, SerialPort.Port.kUSB);
...
// Read whatever the camera has printed since the last read.
String cameraData = m7Camera.readString();
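Note that readString() just returns raw text, so you'll typically need to parse it yourself. Here is a minimal sketch of that parsing; the line format is an assumption on my part (the camera printing one integer per line, e.g. a target's pixel offset), not anything the product page specifies:

```java
// Hypothetical parser for camera serial output, assuming the OpenMV
// prints one integer per line (e.g. a target's pixel offset from center).
public class CameraParser {
    // Returns the most recent complete value in the buffer, or null if none.
    public static Integer parseLatestOffset(String raw) {
        Integer latest = null;
        for (String line : raw.split("\r?\n")) {
            line = line.trim();
            if (line.isEmpty()) {
                continue;
            }
            try {
                latest = Integer.parseInt(line);
            } catch (NumberFormatException e) {
                // Ignore partial or non-numeric lines.
            }
        }
        return latest;
    }
}
```

You would call this on the string returned by readString() each loop, keeping the previous value whenever the buffer contains no complete line.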

The product page also mentions SPI and I2C capabilities on the camera, and WPILib has interfaces for both of those as well.

So it looks like you have a few ways to interface with the M7 camera from a Java program on the roboRIO.


#3

I guess my first question would be: what language are you programming in? Regardless, we have also been working on the OpenMV M7, and we actually have it working for vision tracking.

We tried both I2C and the RS-232 port with no success, because the roboRIO uses a different logic level on those ports. I believe that if we had gone through the MXP port we could have gotten I2C or serial to work, but we couldn't use it since we have a RIOduino plugged in there.

After testing those other methods, we decided to try USB (a micro-USB cable from a roboRIO USB port to the OpenMV's micro-USB port), and that is what we got to work. On the OpenMV side we just used the simple print function that you would use when debugging your code or printing out the FPS. Then on the roboRIO side we used the following code.

To explain our code a little better: the OpenMV sends back a number that corresponds to where the target is in relation to the robot, e.g. pixels from the center of the screen. If that does not help, let me know and I can try to explain it a little better.
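To make that pixel-offset idea concrete, here is one way such a value could be turned into a steering correction on the roboRIO side. This is a sketch, not the poster's actual code; the gain, deadband, and clamp values are made-up illustrations you would tune on a real robot:

```java
// Sketch of a proportional steering correction from a pixel offset.
// kP, DEADBAND_PX, and the 0.5 clamp are illustrative, untuned values.
public class VisionSteer {
    static final double kP = 0.005;    // turn power per pixel of error
    static final int DEADBAND_PX = 5;  // ignore tiny offsets near center

    // Positive offset = target right of center = positive turn command.
    public static double turnCommand(int pixelOffset) {
        if (Math.abs(pixelOffset) <= DEADBAND_PX) {
            return 0.0;
        }
        double turn = kP * pixelOffset;
        // Clamp to a safe motor output range.
        return Math.max(-0.5, Math.min(0.5, turn));
    }
}
```

Each loop, you would feed the latest offset parsed from the camera into turnCommand() and pass the result to your drivetrain's turn input.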