Hi, second-year programmer here. I’m having trouble getting two USB cameras to work simultaneously. Could someone help by posting some example code or offering suggestions? Thank you!
- Can you open one camera (which camera do you have?)
- Do you want to feed them to the Dashboard or just read them on the roboRIO?
There may be some USB bandwidth limits, and you’ll have to run at lower resolutions/frame rates. To pipe both to the dashboard simultaneously, the easiest way is to go into CameraBackgroundLoop.vi and modify it to allow you to use different ports. Then on the driver station you duplicate the camera-reading code for the 2nd port.
The code for sending USB imagery to the desktop is located in
CameraBackgroundLoop.vi -> WPI_CameraSend Images to PC Loop.vi and the default port it uses is 1180. On the dashboard, you’ll need to modify Loop 2.
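To put rough numbers on the bandwidth point, here’s a back-of-the-envelope estimate of what two MJPEG streams cost. The ~16:1 JPEG compression ratio is my assumption (real ratios vary with scene content and quality settings), and the field bandwidth cap has been around 7 Mbps in recent seasons:

```java
// Back-of-the-envelope MJPEG bandwidth estimate for camera streams.
// The 16:1 compression ratio is an assumption; real JPEG ratios vary
// with scene content and quality settings.
public class BandwidthEstimate {
    // Estimated megabits per second for one MJPEG stream.
    public static double mbps(int width, int height, int fps, double compressionRatio) {
        double rawBitsPerFrame = width * height * 24.0;   // 24-bit color
        double bitsPerSecond = rawBitsPerFrame * fps / compressionRatio;
        return bitsPerSecond / 1e6;
    }

    public static void main(String[] args) {
        // Two 640x480 @ 30 fps streams vs. two 320x240 @ 15 fps streams.
        double high = 2 * mbps(640, 480, 30, 16.0);
        double low  = 2 * mbps(320, 240, 15, 16.0);
        System.out.printf("full-res: %.1f Mbps, reduced: %.1f Mbps%n", high, low);
    }
}
```

Under these assumptions the full-resolution pair lands well above the cap while the reduced pair fits comfortably, which is why dropping resolution and frame rate is usually step one.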
We are trying to have the two USB cameras plugged into the roboRIO. Would that change your directions at all?
No change in directions. I’ve attached the files necessary for this. (We’ll see how much I regret this) I haven’t had a chance to check the changes against a working roboRIO yet. I just patched everything quickly to what “should” work. You’ll still need to modify vision code on the rio to add the 2nd camera (“usb 1”).
Multiple Cameras.zip (801 KB)
I sure hope you don’t regret providing this. It looks like good stuff so far.
It looks like we can use your code to route one camera (usb 0) to the vision processing for targeting. A second camera (usb 1) could then be routed to the dashboard for driving. Is this a fair assessment? We shouldn’t need the targeting video on the dashboard just the driving view.
There’s a joke in programming: make one mistake and you’ll be supporting it for life.
Your approach is reasonable. At this point, I need to sit down and recode some VIs for long-term support of multiple-camera robots. I’m seeing it so much this year that it’s time we had some code properly written for the purpose.
Heh! Don’t I know that feeling.
That would be much appreciated by many of us. In the meantime I am studying your BackgroundLoop code to learn how it works so I can pass this on to the programmers on the team.
Has anyone in this thread actually gotten both cameras to simultaneously stream to the Dashboard?
We worked on this for a solid day in Java, and ran into issues starting a capture on the second camera without first stopping the capture from the first camera.
When we tried to start two captures at the same time, we received an exception from the underlying NI Vision libraries.
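For what it’s worth, the constraint we hit behaves like a one-capture-at-a-time state machine: any camera switch has to be stop-then-start, in that order. A minimal sketch of that rule (hypothetical camera names; plain strings stand in for the real WPILib/NI Vision calls):

```java
import java.util.Arrays;
import java.util.List;

// Models the "only one capture at a time" constraint we ran into:
// switching cameras always means stopping the active capture before
// starting the next one. Camera names here are hypothetical.
public class CameraToggle {
    private String active;   // name of the camera currently capturing

    public CameraToggle(String initial) {
        this.active = initial;
    }

    // Returns the ordered operations needed to switch to `next`.
    public List<String> switchTo(String next) {
        if (next.equals(active)) {
            return Arrays.asList();   // already capturing; nothing to do
        }
        List<String> ops = Arrays.asList(
            "stopCapture(" + active + ")",
            "startCapture(" + next + ")");
        active = next;
        return ops;
    }
}
```

The actual stop/start calls depend on your camera class; the point is only the ordering.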
We’ve gone to using a dedicated Raspberry Pi to stream both USB cameras instead.
One of our students detailed his findings here:
Using a Raspberry Pi is a better way to go anyway. That way you do not have to worry about network traffic and bandwidth limitations. If you figured this out, then you are doing it the best way, in my opinion.
We did it last year in LabView but had issues which are related to IMAQdx. We’re working on developing code this year to support dual streaming again.
The rPi doesn’t remove any bandwidth limits. It will move buffer copying and possibly compression to another CPU, but whether that is worth the extra complexity is up to the team.
I read the other thread about NI Vision throwing an exception and didn’t see much to work off of. It would not surprise me to see NIVision return an error code when asked to do something that doesn’t make sense, and the Java and C++ libs have a habit of throwing exceptions instead of handling errors. If you have more details, please send them. In the meantime, I may find time to get multiple USB streams going. I’m pretty confident it will work, but will require changes within the lower level code to select a different port for transmission.
This is exactly the solution we needed.
The Camera Background Loop VI (located in the Vision Processing VI) does two things: it handles some required processes for the IP camera to reduce lag, and it runs the Send Images to PC VI, which, as the name suggests, sends the camera feed to the dashboard.
In our case, we have two cameras on our robot: one for vision processing and one for driver vision. We were only getting the former on our dashboard, even though both were connected.
Our final solution was to define both cameras in Vision Processing (we spent a while on this part) and run the Camera Background Loop VI on each. To reduce bandwidth consumption, we created an alternative version of the Camera Background Loop with the Send Images to PC VI removed.
Now we can see from the robot with one camera and perform vision processing with the other! Thanks so much; we wouldn’t have figured this out without this post!
Surely there must be some way of specifying which camera feed goes to the dashboard without having to create more work for the roboRIO.
I’ve looked at Adciv’s code for the Background Loop but I am at a loss to figure out how to specify which video feed is sent to the PC.
Anybody have an answer to this?
The short version is that it doesn’t select which camera goes to the dashboard. Instead it sends both cameras to the dashboard simultaneously: it modifies the existing code to allow a 2nd camera to be sent in parallel, just on a different port.
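A rough sketch of the shape of that change, written in Java for readability (the stock code is LabVIEW): each camera gets its own sender loop on its own TCP port. Port 1180 is the stock dashboard port; using 1180 + camera index for the extra stream is my assumption, not something the stock code defines.

```java
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

// Sketch of sending a 2nd camera in parallel on its own port.
// 1180 is the stock dashboard port; the +index scheme is an assumption.
public class ParallelSenders {
    public static int portFor(int cameraIndex) {
        return 1180 + cameraIndex;   // camera 0 -> 1180, camera 1 -> 1181
    }

    // One sender loop per camera, each listening on its own port.
    static Thread senderFor(int cameraIndex) {
        return new Thread(() -> {
            try (ServerSocket server = new ServerSocket(portFor(cameraIndex))) {
                Socket dashboard = server.accept();
                // ... grab frames from "usb <cameraIndex>" and write JPEGs
                //     to dashboard.getOutputStream(), as the stock
                //     "Send Images to PC" loop does ...
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
    }

    public static void main(String[] args) {
        Thread vision = senderFor(0);   // vision camera stream
        Thread driver = senderFor(1);   // driver camera stream
        vision.setDaemon(true);         // daemons so this sketch can exit
        driver.setDaemon(true);
        vision.start();
        driver.start();
    }
}
```

The dashboard side then needs a matching reader for each port, which is the duplication described earlier in the thread.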