Help! How do I code a live-feed vision system?

This is my first year of coding. My team has decided to use a live feed from a camera on our robot during the sandstorm period.

The goal is to use an Axis M1013 with an Arduino board. We are using LabVIEW for code, but from what I’ve read the camera plugs into the Arduino, and then the Arduino plugs into the radio.

My question is where do I start?

WPI published a guide for getting started with an Axis network camera (the 206, M1011, or M1013). In their set of steps, an Arduino is not needed; the camera connects to the robot’s network via Ethernet.

In general, you can find more resources about cameras, streaming, and vision on this page of the ScreenSteps site.

Thanks for the info, I’ll read into it. We plan on using a coprocessor to lighten the load on the DS laptop and the robot, because our laptop isn’t the greatest.
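If you do go the coprocessor route, WPILib's cscore library (robotpy-cscore in Python) is a common way to publish a camera stream from it. Here's a minimal sketch, assuming a USB camera plugged into the coprocessor; the resolution and FPS values are just illustrative picks to keep bandwidth low, not anything from this thread:

```python
# Sketch: publish a USB camera stream from a coprocessor using
# WPILib's cscore library (robotpy-cscore). The import is guarded so
# the sketch is readable even where the library isn't installed.
try:
    from cscore import CameraServer
except ImportError:  # robotpy-cscore not installed; sketch only
    CameraServer = None


def start_stream(width=320, height=240, fps=15):
    """Publish USB camera 0 so the Driver Station dashboard can view it.

    Low resolution and FPS keep the load light on both the robot
    network and the DS laptop.
    """
    if CameraServer is None:
        raise RuntimeError("robotpy-cscore is not installed")
    camera = CameraServer.startAutomaticCapture()  # USB camera 0 by default
    camera.setResolution(width, height)
    camera.setFPS(fps)
    return camera
```

This would run in a small script on the coprocessor itself, started at boot; the dashboard then picks the stream up over the network like any other camera.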

You might also check out the Advanced session here (worked through an example):

This is using LabVIEW to do the vision processing (it can run on the roboRIO or on another processor that can run LabVIEW, like a laptop). In the example, I’m using a USB camera, but if you go into the Vision.vi and change the select at the start from True to False, it will switch the camera address from USB 0 to an IP address (just make sure this IP matches your camera’s).
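Outside of LabVIEW, that same USB-vs-IP switch can be sketched in a few lines of Python with OpenCV. Note the `10.0.0.11` address below is a placeholder you'd swap for your camera's real IP, and `/mjpg/video.mjpg` is the usual Axis MJPEG path (verify it against your camera's docs):

```python
def camera_source(use_usb: bool, ip: str = "10.0.0.11"):
    """Mirror the True/False select in the Vision.vi example:
    True  -> USB camera 0
    False -> the Axis camera's MJPEG stream (placeholder address).
    """
    if use_usb:
        return 0  # first USB camera, like "USB 0" in the example
    # Axis cameras typically serve MJPEG at this path; set ip to match yours
    return f"http://{ip}/mjpg/video.mjpg"


if __name__ == "__main__":
    try:
        import cv2  # opencv-python; only needed for the actual capture
    except ImportError:
        cv2 = None
    if cv2 is not None:
        cap = cv2.VideoCapture(camera_source(use_usb=True))
        ok, frame = cap.read()
        if ok:
            print("got a frame:", frame.shape)
        cap.release()
```

The point is just that the "camera address" is either a USB device index or a stream URL; whichever tool you use, that's the one thing you have to get right.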

The camera should show up in Shuffleboard after you plug it into the roboRIO. You shouldn’t have to use an Arduino or code anything if you’re only trying to get a live camera feed.

When I said Arduino, I meant Raspberry Pi.