I’ve seen a lot of discussion about coprocessors with GRIP and OpenCV, but not much discussion about LabVIEW.
We have image processing built in LabVIEW, but we’d like to run it on a coprocessor. We want to send a numeric array to the RoboRIO and a processed image to the Driver Station.
We do not know, however, how to send our information from the coprocessor to the RoboRIO and the Driver Station. I have everything hooked up through a network switch, and from my understanding the RoboRIO is the host of this network. SmartDashboard VIs running on the coprocessor don’t seem to communicate with the RoboRIO or the Driver Station (probably not their intended use, but it was worth a try).
I had partial success running a Driver Station program on the coprocessor, but it seems that only one Driver Station at a time can communicate with the robot.
You have a number of options for communicating between various computers on or off your robot.
Probably the easiest is to run a Network Tables version 3 client. This is what LV implements and what most of the other tools implement, though some are at version 2.
You can also use TCP, UDP, Serial, or other protocols to send data between the computers.
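Not LabVIEW, but to illustrate the UDP option: here is a minimal Python sketch of sending a numeric array from the coprocessor and reading it back on the RoboRIO side. The function names and the idea of packing the array as big-endian doubles are my own choices for illustration, not anything standard.

```python
import socket
import struct

def send_values(sock, addr, values):
    """Pack a list of floats as big-endian doubles into one UDP datagram."""
    payload = struct.pack(">%dd" % len(values), *values)
    sock.sendto(payload, addr)

def recv_values(sock):
    """Receive one datagram and unpack it back into a list of floats."""
    data, _ = sock.recvfrom(65535)
    return list(struct.unpack(">%dd" % (len(data) // 8), data))
```

On the robot side you would bind a UDP socket to an agreed-upon port and call `recv_values` in a loop. UDP is a reasonable fit for sensor-style data because each datagram is one complete array, and a dropped packet just means you use the next one.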
There is no need to run the Driver Station on another computer, and as you saw, multiple instances don’t get along: they will generally stop one another so that only one is up at a time.
Thank you for the reply, Greg. I was able to send some data using the SD variables by starting the NT Client. I’m still stuck on sending the processed image to the Driver Station, though. How would I send an image from the coprocessor to the Driver Station with LabVIEW?
WPILib contains a Send Images to PC Loop. This VI is responsible for getting the image from the camera, flattening it to a buffer, and sending it to the dashboard.
If the coprocessor can run LabVIEW, then this code may be a reasonable starting point. There is a similar multi-camera version in another thread. If it is in another language, then you may be able to translate it or look at a similar implementation in that language.
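To make the idea behind that VI concrete (flatten the image to a byte buffer, then send it with a length header so the receiver knows where each image ends), here is roughly the same pattern sketched in Python. This is an assumption about the framing, not the LabVIEW dashboard’s actual wire format, and the function names are mine.

```python
import socket
import struct

def send_image(sock, image_bytes):
    """Send one image as a 4-byte big-endian length header followed by the data."""
    sock.sendall(struct.pack(">I", len(image_bytes)) + image_bytes)

def recv_exact(sock, n):
    """Read exactly n bytes from a TCP stream (recv may return short reads)."""
    chunks = []
    while n > 0:
        chunk = sock.recv(n)
        if not chunk:
            raise ConnectionError("socket closed mid-message")
        chunks.append(chunk)
        n -= len(chunk)
    return b"".join(chunks)

def recv_image(sock):
    """Read one length-prefixed image off the stream."""
    (length,) = struct.unpack(">I", recv_exact(sock, 4))
    return recv_exact(sock, length)
```

The `recv_exact` loop matters: a TCP `recv` can return fewer bytes than requested, so the receiver has to keep reading until a whole message has arrived.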
Thank you again for your help, Greg. Send Images to PC expects a camera reference as an input, but what I wanted was to send a post-processed image to the Driver Station. I did figure it out, though, and I learned more about sending data over networks as a result. I will be passing this knowledge on to the student programmers during our off-season so they can do this themselves next year.
To anyone who digs up this thread and is interested in doing this for their own team in the future: I have created a guide with many pictures here that details how to wire up this system on your robot, how to get the image-processing examples running, and how to send the data over the robot network.
Glad you figured it out. Nice tutorial by the way.
I’m curious what the modified image contained. I’ve generally annotated images by sending a few points to the DS and having those overlaid on the original image shown on the dashboard.
I’m not sure what the image did that caused Network Tables to stop working between the coprocessor and the RoboRIO. Another thing to note: setting a static IP on the RoboRIO (to avoid having to set the IP again later on the coprocessor) also failed and prevented the robot from working on the field altogether. Quite a few teams disconnected on the field (even in the finals!), so I wonder how many of them had a static IP…
And again, all these errors could just come from my inexperience with networking.
I didn’t really think about doing it that way, but I suppose I could have. I’ll have to get a better handle on the TCP protocol and see what I can manage that way.
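If it helps anyone reading along: the main thing to internalize about TCP is that it delivers a byte stream, not discrete messages, so both ends have to agree on a framing convention. Besides length prefixes, a simple option is newline-delimited text. A minimal Python sketch (all names hypothetical, and the `centerX`/`centerY` strings are just example payloads):

```python
import socket

def send_line(sock, text):
    """Frame each message by appending a trailing newline."""
    sock.sendall(text.encode("utf-8") + b"\n")

def recv_lines(sock):
    """Yield complete newline-terminated messages, buffering partial reads."""
    buf = b""
    while True:
        data = sock.recv(4096)
        if not data:  # peer closed the connection
            break
        buf += data
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            yield line.decode("utf-8")
```

The buffering in `recv_lines` is the whole point: one `recv` call might return half a message, or two messages stuck together, so the reader accumulates bytes and only emits complete lines.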
What is the correct way to do this using just the NT Client and NT writes?
Is there an example out there of how to correctly add and wire the NT Client VI in the Color Processing example?
Our specific problem:
We can use NT Write to send information that can eventually be read from the Variables tab of the dashboard. But after restarting the Color Processing example, the variables (centerX and centerY, for instance) are static on the dashboard and do not respond to writes from the Kangaroo. If we change the names of the NT entries and restart the vision-processing example, we can once again read the data on the dashboard. Basically, the names are “spoiled” after first use.