Using Raspberry Pi for vision in LabVIEW

So recently we have been trying to use a Raspberry Pi for vision on our robot for the 2022 season and we ran into a few problems. We’ve followed the WPILib guide on vision processing with the Raspberry Pi, but the thing is we use LabVIEW for our robot code, and we’re wondering how to get the camera feed into the LabVIEW dashboard. I’ve had a hard time finding information on LabVIEW vision processing with the Raspberry Pi, so any help would be appreciated.

Thanks in advance. Team 7645


The Raspberry Pi should be able to stream video to any dashboard regardless of which language you use to program your robot. The Raspberry Pi uses NetworkTables to communicate its list of available cameras. Once this list has been published, any dashboard, including LabVIEW, can select that camera for streaming.

Make certain that you use the Raspberry Pi web page to enable the NetworkTables client and set your FRC team number. (Use the button at the top of the screen to set the Raspberry Pi file system to Writable, save your changes, then set the file system back to Read-Only.) You may have to reboot the Raspberry Pi.

Also make certain that you use the Raspberry Pi web page to define a camera. Plug a USB camera into the Raspberry Pi, then define the camera. Make certain to save your changes. Rebooting the Raspberry Pi is a good way to check whether the changes were saved.

It would be helpful if you were more specific about the issue that you are having.

Sorry for the late reply (time zones).
The issue we are having is how to get the data, such as the center X and Y, into LabVIEW so we can let our robot track the object.
We are also having some connection issues: when the Raspberry Pi is plugged into the roboRIO, the dashboard can’t connect to the WPILib local camera server, but when I plug the Raspberry Pi into my computer via Ethernet while also connected to the roboRIO via USB, the camera feed shows up.

How is your computer connected to the roboRIO in this case?

Through the USB port.

The USB port is a separate network interface, so you can’t see Ethernet devices over it. Java/C++ have a utility that can forward the data, but I’m not sure if there’s something easily available in LabVIEW: Port Forwarding — FIRST Robotics Competition documentation
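
For reference, a rough sketch of that Java utility (this only helps teams whose robot code is Java; the host name and ports below assume the default WPILibPi image, and the class’s package has moved between WPILib releases, so check your version’s docs):

    import edu.wpi.first.net.PortForwarder;   // package differs in older WPILib releases
    import edu.wpi.first.wpilibj.TimedRobot;

    public class Robot extends TimedRobot {
        @Override
        public void robotInit() {
            // Forward local port 8888 on the roboRIO to the Pi's web page (port 80),
            // and pass the first cscore camera stream port (1181) straight through,
            // so a PC on the USB link can reach them via the roboRIO's address.
            PortForwarder.add(8888, "wpilibpi.local", 80);
            PortForwarder.add(1181, "wpilibpi.local", 1181);
        }
    }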

You can add a network switch so that you can also connect to the roboRIO over Ethernet, which lets you access the camera data.

A couple of questions:

  • Are you using USB cameras attached to the Raspberry Pi?
  • Are you using the WPILib Raspberry Pi operating system image?
  • Have you configured the Raspberry Pi to use NetworkTables?
  • Did you define a camera using the Raspberry Pi web page interface?

If these are all yes, then:

  • The Raspberry Pi uses Ethernet as its communications medium. The Raspberry Pi is wired to the extra Ethernet port on the robot radio. (This means that communication only goes through the radio; communication with the Raspberry Pi cannot go through the direct USB cable between a PC and the roboRIO.)
  • The Raspberry Pi will publish a specific NetworkTables variable listing its camera streams. Any standard dashboard can see and select these streams for viewing.
  • If you created a custom vision program, it needs to write NetworkTables variables (see the sketch after this list) for things like:
    – Vision target found
    – Target X offset
    – Target Y offset
    – Target distance
  • The robot program reads the network table variables and performs your desired control action.
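
For example, here is a minimal Java sketch of the Pi-side publishing (the table name "Vision", the entry names, and the team number are placeholders, and the WPILibPi example templates normally handle the NetworkTables client startup for you):

    import edu.wpi.first.networktables.NetworkTable;
    import edu.wpi.first.networktables.NetworkTableInstance;

    public class VisionPublisher {
        public static void main(String[] args) {
            NetworkTableInstance inst = NetworkTableInstance.getDefault();
            inst.startClientTeam(7645);           // connect to the robot's NetworkTables server

            NetworkTable table = inst.getTable("Vision");

            // In your per-frame processing loop, publish something on every frame,
            // even when no target is found, so values always show up on the dashboard.
            boolean targetFound = false;          // result of your target detection
            double centerX = 0.0, centerY = 0.0;  // pixel coordinates of the detected target

            table.getEntry("targetFound").setBoolean(targetFound);
            table.getEntry("targetX").setDouble(centerX);
            table.getEntry("targetY").setDouble(centerY);
            inst.flush();                         // push the update immediately
        }
    }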

We’ve done all of these and have a custom vision program on our Pi, but the current issue is how to get the data from NetworkTables into LabVIEW. How do we set up the NetworkTable and read the data in LabVIEW?

I am uncertain about your specific question. I’ll try my best.

  1. The PC will need to communicate with the roboRIO and the Raspberry Pi using Wi-Fi. The USB connection cannot communicate with the Raspberry Pi. (“Port forwarding” is possible with LabVIEW, but this would have to be written.)

  2. Here is the documentation to configure the NetworkTables client and your team number on the Raspberry Pi. Both are needed for NetworkTables communication to work.
    The Raspberry PI — FIRST Robotics Competition documentation

  3. Your custom vision program will need to write the values to NetworkTables variables, using the WPILib calls in C++ and Java to do this.

  4. To see the values being written, use the “Variables” tab of the default dashboard. If values aren’t showing up and they should be, try turning the Raspberry Pi off, waiting a minute, and turning it back on. Also write something to the variable even when a vision target is not found. (If the target is never found and the value is only written when the target is found, nothing will be displayed.)
    [screenshot: the “Variables” tab of the default dashboard]

  5. Once the Raspberry Pi is writing values to NetworkTables, the roboRIO can read them using the LabVIEW Network Table Read VI. Here is a sample. Something like this would have to be put into a loop, probably in “Periodic Tasks.VI”, and executed only when driving to the target is desired.
    [screenshot: LabVIEW sample reading NetworkTables values, feeding a “Drive to target” block]

The “Drive to target” block is code you create to do something with your vision data.
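
(If it helps to see the same read in text form, here is a rough Java sketch of what that Read VI is doing; the table and entry names match the placeholder names used earlier:)

    import edu.wpi.first.networktables.NetworkTable;
    import edu.wpi.first.networktables.NetworkTableInstance;

    public class VisionReader {
        public static void main(String[] args) throws InterruptedException {
            NetworkTableInstance inst = NetworkTableInstance.getDefault();
            inst.startClientTeam(7645);   // placeholder team number
            Thread.sleep(1000);           // give the client a moment to connect; a real program loops

            NetworkTable table = inst.getTable("Vision");
            // The argument is the default returned if the entry was never written.
            boolean found = table.getEntry("targetFound").getBoolean(false);
            double targetX = table.getEntry("targetX").getDouble(0.0);
            System.out.println("found=" + found + " targetX=" + targetX);
        }
    }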

If this wasn’t your question, please be a little more specific about what you have tried and the trouble you are having.

So currently I have everything set up and running, and LabVIEW is getting the center X and Y data (thanks to your help). Now the question is how to write my drive-to-target code. I’ve tried using a PID controller and setting the setpoint to 320 as the center point for my X, and it works, but not in the way we want it to; it’s too erratic. So any help would be nice.

Generally use “Position Control” to spin your robot towards the target. The SecretBookOfFRCLabVIEW has a chapter about position control. (You don’t need to use a PID to do position control…)

The PhotonVision team created some examples. I ported a few of these to LabVIEW in the repository. Just posted a couple of hours ago. (They are UN-TESTED!!)

Use the middle of the image as 0. Any offset from the center is then +/- from that. The offset will be the process variable; the setpoint would be 0.
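
As a concrete illustration of that math (plain Java just to show the arithmetic; in LabVIEW you would wire the same calculation ahead of your drive VI, and the gain and clamp values below are only starting guesses to tune):

    public class AimMath {
        static final double KP = 0.005;       // proportional gain: start small and tune up
        static final double MAX_TURN = 0.4;   // clamp the command so corrections stay gentle

        /** Turn command from the target's pixel X; the setpoint is the image center (offset = 0). */
        static double turnCommand(double targetX, double imageWidth) {
            double offset = targetX - imageWidth / 2.0;   // process variable: +/- pixels from center
            double turn = KP * offset;
            return Math.max(-MAX_TURN, Math.min(MAX_TURN, turn));
        }

        public static void main(String[] args) {
            // A target at x = 480 in a 640-wide image is 160 px right of center: 0.8 before the clamp, 0.4 after.
            System.out.println(turnCommand(480.0, 640.0));
        }
    }

Keeping the gain small and clamping the output is usually what tames the erratic behavior; a small deadband around zero also helps so the robot stops correcting once it is close enough.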
