Chief Delphi

Chief Delphi (http://www.chiefdelphi.com/forums/index.php)
-   Programming (http://www.chiefdelphi.com/forums/forumdisplay.php?f=51)
-   -   WPILib 2017 Control System (http://www.chiefdelphi.com/forums/showthread.php?t=152929)

PopeRyanI 01-03-2017 10:55 AM

WPILib 2017 Control System
 
I was looking through WPILib documentation and noticed that they added the information for the 2017 control system here: https://wpilib.screenstepslive.com/s/4485/m/13503

I didn't see any other threads discussing this. One part stood out to me immediately: having OpenCV bundled with WPILib will be very nice, since it will automatically be on the roboRIO. It also sounds like we will be able to do vision processing on the RIO more efficiently and then send the processed image to the dashboard, which could eliminate the need for a coprocessor if I'm understanding correctly. I'm very excited to see how this works during build season.
Correct me if I'm wrong on any of this; I'm relatively new to this, so I may be misunderstanding.

Quote:

Computer vision and camera support
For 2017 the most significant features added to the WPILib Suite have been in the area of computer vision. First and foremost, we have moved from the NIVision libraries to OpenCV. OpenCV is an open source computer vision library widely used throughout academia and industry. It is available in many languages; we specifically support C++, Java, and Python. There is a tremendous wealth of documentation, videos, tutorials, and books on using OpenCV in a wide range of applications, with much emphasis on robotics.

OpenCV libraries are now bundled with WPILib and will be downloaded to the roboRIO without teams needing to locate and download them themselves.
There is complete support for USB and Axis cameras in the form of a CameraServer class and specific camera classes that will produce OpenCV images that can be used for further processing. You can either let the CameraServer automatically stream camera video to the SmartDashboard, or you can add processing steps on the robot between capture and sending to the dashboard. All the example programs in Eclipse have been updated to show how the new CameraServer is used.
GRIP, the graphical vision pipeline generator, can be used to quickly and easily create and test computer vision algorithms that run standalone on your Driver Station computer, sending results back to the robot via NetworkTables. New for 2017, GRIP can generate code in C++, Java, or Python for your vision algorithm that can easily be incorporated into robot programs.
The NIVision libraries have been moved out of WPILib into a separately installable package.
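
For a sense of what that grab-process-stream flow looks like, here is a rough Java sketch modeled on the shape of the 2017 CameraServer/cscore API as I understand it (CvSink, CvSource, grabFrame, putVideo). It is an assumption-laden sketch, not official sample code: it needs WPILib and OpenCV on a roboRIO, and would typically run in its own Thread started from robotInit(), so it will not run standalone.

```java
import org.opencv.core.Mat;
import org.opencv.imgproc.Imgproc;
import edu.wpi.cscore.CvSink;
import edu.wpi.cscore.CvSource;
import edu.wpi.cscore.UsbCamera;
import edu.wpi.first.wpilibj.CameraServer;

// Sketch only: runs on a roboRIO, typically in its own Thread from robotInit().
UsbCamera camera = CameraServer.getInstance().startAutomaticCapture();
camera.setResolution(320, 240);

CvSink sink = CameraServer.getInstance().getVideo();        // frames in from the camera
CvSource output = CameraServer.getInstance().putVideo("Processed", 320, 240); // stream out to the dashboard

Mat frame = new Mat();
while (!Thread.interrupted()) {
    if (sink.grabFrame(frame) == 0) {
        continue;                       // grab timed out; skip this iteration
    }
    // Any OpenCV processing step goes between capture and send; grayscale as a placeholder:
    Imgproc.cvtColor(frame, frame, Imgproc.COLOR_BGR2GRAY);
    output.putFrame(frame);             // processed image appears on the dashboard
}
```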

Cr1spyBacon8r 01-03-2017 11:12 AM

Re: WPILib 2017 Control System
 
I don't think we'll run it on the roboRIO anyway... We've had enough CPU troubles before, so just putting extra strain on it wouldn't go very well... Having a coprocessor to do vision would probably be the best way to go still.

BradAMiller 01-03-2017 12:38 PM

Re: WPILib 2017 Control System
 
We just want to give teams the option of a good vision solution on the roboRIO. In addition, GRIP will now generate OpenCV code that can be run on your coprocessor of choice, or you can keep running GRIP itself on your laptop. The GRIP-generated code can also be added to the robot program.
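
For example, using a GRIP-generated pipeline from robot code might look roughly like this. This is a sketch under assumptions: it supposes the pipeline was exported to Java as a class named GripPipeline whose last step is a Filter Contours operation; the class name and the filterContoursOutput() accessor depend on how your particular pipeline is built.

```java
import org.opencv.core.Mat;
import org.opencv.core.MatOfPoint;
import org.opencv.core.Rect;
import org.opencv.imgproc.Imgproc;

// Sketch only: GripPipeline is the hypothetical GRIP-exported class,
// and cameraFrame is a Mat grabbed from a CvSink as in the CameraServer examples.
GripPipeline pipeline = new GripPipeline();
pipeline.process(cameraFrame);

for (MatOfPoint contour : pipeline.filterContoursOutput()) {
    Rect box = Imgproc.boundingRect(contour);
    double centerX = box.x + box.width / 2.0;
    // e.g. publish centerX to NetworkTables or feed it to a turn-to-target controller
}
```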

jlindquist74 01-03-2017 01:11 PM

Re: WPILib 2017 Control System
 
Thank you for that. Our programming team has wanted to do vision work for the last couple of years. The biggest roadblock for them was understanding how the parts fit together. "It's all just the robot code, right?" Well, now it is.

jreneew2 01-03-2017 04:20 PM

Re: WPILib 2017 Control System
 
So, if we run the vision code on, say, a Raspberry Pi, can we stream the output to the driver station computer?

Vannaka 01-03-2017 05:05 PM

Re: WPILib 2017 Control System
 
@jreneew2, you can stream the output to NetworkTables, whether the output is the video stream or the contour report. Then you can read those values from the driver station. Keep in mind the Raspberry Pi is somewhat underpowered, so there may be a delay in processing the image.
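
On the coprocessor side, the publishing end might look roughly like this in Java. This is a sketch against my understanding of the 2017 NetworkTables client API; the "vision" table name and the key names are made up for illustration.

```java
import edu.wpi.first.wpilibj.networktables.NetworkTable;

// Sketch only: runs on the Raspberry Pi, connecting to the roboRIO as a client.
NetworkTable.setClientMode();
NetworkTable.setTeam(1234);             // your team number
NetworkTable.initialize();
NetworkTable table = NetworkTable.getTable("vision"); // table name is arbitrary

// After each processed frame, publish the contour report;
// robot code and the driver station read the same keys back.
table.putNumber("centerX", 160.0);
table.putNumber("targetArea", 450.0);
```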

jreneew2 01-03-2017 05:07 PM

Re: WPILib 2017 Control System
 
Quote:

Originally Posted by Vannaka (Post 1624788)
@jreneew2, you can stream the output to NetworkTables, whether the output is the video stream or the contour report. Then you can read those values from the driver station. Keep in mind the Raspberry Pi is somewhat underpowered, so there may be a delay in processing the image.

Ah, it runs right through NetworkTables. Thank you. We ran our vision code on a Raspberry Pi 3 last year and it worked wonderfully. It's much more powerful than the roboRIO, so I don't see why we would run it there.

bob.wolff68 01-03-2017 05:15 PM

Re: WPILib 2017 Control System
 
Quote:

Originally Posted by Cr1spyBacon8r (Post 1624691)
I don't think we'll run it on the roboRIO anyway... We've had enough CPU troubles before, so just putting extra strain on it wouldn't go very well... Having a coprocessor to do vision would probably be the best way to go still.

I'd like to temper the prior statement with a very different experience. A number of years back, our team (1967) used the OLD cRIO processor, which is far less powerful than the roboRIO, to do live image processing with the NI library, with great success at 13-15 fps and without any disturbance to the rest of the robot's performance.

The biggest key here is making sure your processing loop is structured so that it doesn't HOG the CPU and all of its free resources. I would expect that the roboRIO has more than enough processing power to do quite a bit of realtime image processing.
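
To make that concrete, here's a tiny self-contained Java sketch of one simple way to keep a vision loop from hogging the CPU: cap it at a fixed frame rate and sleep away the spare time each iteration. No WPILib or OpenCV involved; the ~15 fps target is just the figure from our cRIO days, and the frame-grab is a placeholder comment.

```java
public class RateLimitedLoop {
    // Run `frames` iterations capped at one frame per `periodMs`,
    // sleeping away any spare time; returns the achieved fps.
    static long measureFps(int frames, long periodMs) throws InterruptedException {
        long start = System.currentTimeMillis();
        for (int i = 0; i < frames; i++) {
            long frameStart = System.currentTimeMillis();
            // ... grab and process one camera frame here ...
            long spare = periodMs - (System.currentTimeMillis() - frameStart);
            if (spare > 0) {
                Thread.sleep(spare); // yield the CPU instead of spinning
            }
        }
        return frames * 1000 / (System.currentTimeMillis() - start);
    }

    public static void main(String[] args) throws InterruptedException {
        // ~66 ms per frame is roughly the 15 fps we saw on the old cRIO
        System.out.println("achieved fps: " + measureFps(30, 66));
    }
}
```

The sleep is what keeps the rest of the robot code responsive: an uncapped while-loop would happily eat a whole core even when the camera has nothing new to offer.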

Happy Programming!
RoboBob


All times are GMT -5.

Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2017, Jelsoft Enterprises Ltd.
Copyright © Chief Delphi