Go Back   Chief Delphi > Technical > Programming
#1
Yesterday, 10:55
PopeRyanI
Registered User
FRC #2410 (Metal Mustangs Robotics)
Team Role: Programmer
 
Join Date: Dec 2014
Rookie Year: 2012
Location: Kansas
Posts: 18
WPILib 2017 Control System

I was looking through WPILib documentation and noticed that they added the information for the 2017 control system here: https://wpilib.screenstepslive.com/s/4485/m/13503

I didn't see any other threads discussing this, and this part stood out to me immediately. Having OpenCV bundled with WPILib will be very nice, since it will automatically be on the roboRIO. It also sounds like we'll be able to do vision processing on the RIO more efficiently and then send the processed image to the dashboard, which could eliminate the need for a coprocessor if I'm understanding correctly. I'm very excited to see how this works during build season.
Correct me if I'm wrong on any of this; I'm relatively new to all of it.

Quote:
Computer vision and camera support
For 2017 the most significant features added to the WPILib Suite are in the area of computer vision. First and foremost, we have moved from the NIVision libraries to OpenCV, an open source computer vision library widely used throughout academia and industry. It is available in many languages; WPILib specifically supports C++, Java, and Python. There is a tremendous wealth of documentation, videos, tutorials, and books on using OpenCV in a wide range of applications, with much emphasis on robotics.

OpenCV libraries are now bundled with WPILib and are deployed to the roboRIO automatically, without teams needing to locate and install them themselves.
There is complete support for USB and Axis cameras in the form of a CameraServer class and specific camera classes that produce OpenCV images for further processing. You can either let the CameraServer automatically stream camera video to the SmartDashboard, or you can add processing steps on the robot between capture and sending to the dashboard. All the example programs in Eclipse have been updated to show how the new CameraServer is used.
GRIP, the graphical vision pipeline generator, can be used to quickly and easily create and test computer vision algorithms that run standalone on your driver station computer, sending results back to the robot via NetworkTables. New for 2017, GRIP can generate C++, Java, or Python code for your vision algorithm that can easily be incorporated into robot programs.
The NIVision libraries have been moved out of WPILib into a separately installable package.
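In plain OpenCV terms (which the new CameraServer, CvSink, and CvSource classes wrap on the roboRIO), the capture / process / stream flow the quote describes looks roughly like the sketch below. This is a generic OpenCV Python loop, not WPILib's actual API, and the grayscale conversion is just a stand-in for whatever processing you'd really do:

```python
import cv2

# Open the first USB camera; on the roboRIO, the CameraServer classes
# manage the camera and the dashboard stream for you.
camera = cv2.VideoCapture(0)
camera.set(cv2.CAP_PROP_FRAME_WIDTH, 320)
camera.set(cv2.CAP_PROP_FRAME_HEIGHT, 240)

while True:
    ok, frame = camera.read()
    if not ok:
        continue  # dropped frame; grab the next one
    # Example processing step inserted between capture and the dashboard.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # ...hand `gray` to the dashboard stream / further pipeline steps here...
```

The point of the new classes is that the capture and streaming ends of this loop are handled for you; your code only supplies the middle.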
#2
Yesterday, 11:12
Cr1spyBacon8r
Registered User
FRC #5203
 
Join Date: Mar 2016
Location: Michigan
Posts: 4
Re: WPILib 2017 Control System

I don't think we'll run it on the roboRIO anyway. We've had enough CPU trouble before, and putting extra strain on it wouldn't go well. A coprocessor for vision is probably still the best way to go.
#3
Yesterday, 12:38
BradAMiller
Registered User
AKA: Brad
FRC #0190 (Gompei and the Herd)
Team Role: Mentor
 
Join Date: Mar 2004
Location: Worcester, MA
Posts: 586
Re: WPILib 2017 Control System

We just want to give teams the option to have a good vision solution for the roboRIO. In addition, GRIP will now generate OpenCV code that can run on your coprocessor of choice, or you can run GRIP itself on your laptop. The GRIP-generated code can also be added to the robot program.
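Dropping GRIP-generated code into a robot or coprocessor program might look like the sketch below. `GripPipeline`, `process`, and `filter_contours_output` are stand-ins for whatever names your particular generated pipeline uses, so treat this as illustrative rather than the exact generated API:

```python
import cv2

from grip import GripPipeline  # hypothetical name of the GRIP-generated module

pipeline = GripPipeline()

def find_target(frame):
    """Run the generated pipeline and return the largest contour's center."""
    pipeline.process(frame)
    contours = pipeline.filter_contours_output
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(largest)
    return (x + w / 2.0, y + h / 2.0)
```

From there, the center coordinates are what you'd publish back to the robot over NetworkTables.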
__________________
Brad Miller
Robotics Resource Center
Worcester Polytechnic Institute
#4
Yesterday, 13:11
jlindquist74
WOPR Software Integration Lead
FRC #1622
Team Role: Mentor
 
Join Date: Feb 2011
Rookie Year: 1337
Location: Poway, CA
Posts: 35
Re: WPILib 2017 Control System

Thank you for that. Our programming team has wanted to do vision work for the last couple of years. The biggest roadblock for them was understanding how the parts fit together. "It's all just the robot code, right?" Well, now it is.
#5
Yesterday, 16:20
jreneew2
Alumni of Team 2053 Tigertronics
AKA: Drew Williams
FRC #2053 (TigerTronics)
Team Role: Programmer
 
Join Date: Jan 2014
Rookie Year: 2013
Location: Vestal, NY
Posts: 189
Re: WPILib 2017 Control System

So, if we run the vision code on, say, a Raspberry Pi, can we stream the output to the driver station computer?
#6
Yesterday, 17:05
Vannaka
Registered User
FRC #5801
 
Join Date: Jan 2017
Location: Kansas City
Posts: 1
Re: WPILib 2017 Control System

@jreneew2, you can stream the output to NetworkTables, whether that output is the video stream or the contour report, and then read those values from the driver station. Keep in mind the Raspberry Pi is somewhat underpowered, so there may be a delay in processing the image.
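On the Pi side, publishing the contour report might look like this. This is a sketch assuming the pynetworktables package; the "vision" table name and the key names are made up for illustration, and the server address depends on your team number (2410 here, borrowed from the OP, is just an example):

```python
from networktables import NetworkTables

# The roboRIO runs the NetworkTables server; clients point at it.
NetworkTables.initialize(server="roborio-2410-frc.local")
vision = NetworkTables.getTable("vision")

def publish_contour_report(center_x, center_y, area):
    """Push one frame's processed results; robot code reads the same keys."""
    vision.putNumber("centerX", center_x)
    vision.putNumber("centerY", center_y)
    vision.putNumber("area", area)
```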
#7
Yesterday, 17:07
jreneew2
Alumni of Team 2053 Tigertronics
AKA: Drew Williams
FRC #2053 (TigerTronics)
Team Role: Programmer
 
Join Date: Jan 2014
Rookie Year: 2013
Location: Vestal, NY
Posts: 189
Re: WPILib 2017 Control System

Quote:
Originally Posted by Vannaka View Post
@jreneew2, you can stream the output to NetworkTables, whether that output is the video stream or the contour report, and then read those values from the driver station. Keep in mind the Raspberry Pi is somewhat underpowered, so there may be a delay in processing the image.
Ah, it runs right through NetworkTables. Thank you. We ran our vision code on a Raspberry Pi 3 last year and it worked wonderfully. It's much more powerful than the roboRIO, so I don't see why we would run it there.
#8
Yesterday, 17:15
bob.wolff68
Da' Mentor Man
FRC #1967
Team Role: Mentor
 
Join Date: Jan 2012
Rookie Year: 2007
Location: United States
Posts: 151
Re: WPILib 2017 Control System

Quote:
Originally Posted by Cr1spyBacon8r View Post
I don't think we'll run it on the roboRIO anyway. We've had enough CPU trouble before, and putting extra strain on it wouldn't go well. A coprocessor for vision is probably still the best way to go.
I'd like to temper the prior statement with a very different experience. Our team (1967) used the old cRIO processor, which is far less powerful than the roboRIO, to do live image processing with the NI library a number of years back, with great success at 13-15 fps and without any disturbance to the rest of the robot's performance.

The biggest key here is making sure your processing loop doesn't HOG the CPU and all of its free resources. I would expect the roboRIO to have more than enough processing headroom for quite a bit of realtime image processing.
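The "don't hog the CPU" advice above amounts to rate-limiting the vision loop so it yields its leftover time each frame. Here's a generic Python sketch of that idea (not WPILib-specific; `run_pipeline` and `target_fps` are names invented for this example):

```python
import time

def run_pipeline(process_frame, target_fps=15, max_frames=None):
    """Run a vision loop that yields CPU time instead of spinning.

    Sleeping out the remainder of each frame period keeps the
    pipeline from starving the rest of the robot code.
    """
    period = 1.0 / target_fps
    frames = 0
    while max_frames is None or frames < max_frames:
        start = time.monotonic()
        process_frame()
        frames += 1
        # Give the scheduler the leftover slice of this frame period.
        leftover = period - (time.monotonic() - start)
        if leftover > 0:
            time.sleep(leftover)
    return frames
```

Capping the rate at something like 13-15 fps, as described above, leaves the bulk of the CPU free for the rest of the robot program.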

Happy Programming!
RoboBob
__________________
~~~~~~~~~~~~~~~~~~~
Bob Wolff - Software from the old-school
Mentor / C / C++ guy
Team 1967 - The Janksters - San Jose, CA
#9
Yesterday, 19:58
Joe Ross
Unsung FIRST Hero
Registered User
FRC #0330 (Beachbots)
Team Role: Engineer
 
Join Date: Jun 2001
Rookie Year: 1997
Location: Los Angeles, CA
Posts: 8,542
Re: WPILib 2017 Control System

Quote:
Originally Posted by jreneew2 View Post
So, if we run the vision code on, say, a Raspberry Pi, can we stream the output to the driver station computer?
Quote:
Originally Posted by Vannaka View Post
@jreneew2, you can stream the output to NetworkTables, whether that output is the video stream or the contour report.
Using NetworkTables for the video stream is really inefficient. It is a good choice for processed results like angle and distance, but for the video stream you should use something like mjpg-streamer.
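For reference, a typical mjpg-streamer invocation on a Pi looks something like this. The plugin names are mjpg-streamer's stock UVC input and HTTP output; the resolution, framerate, and port are example values (5800 falls in the port range FRC reserves for team use), so adjust them for your setup:

```shell
mjpg_streamer \
  -i "input_uvc.so -d /dev/video0 -r 320x240 -f 30" \
  -o "output_http.so -p 5800"
```

The driver station can then view the stream at `http://<pi-address>:5800/` while the lightweight numeric results still go over NetworkTables.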
#10
Yesterday, 19:59
jreneew2
Alumni of Team 2053 Tigertronics
AKA: Drew Williams
FRC #2053 (TigerTronics)
Team Role: Programmer
 
Join Date: Jan 2014
Rookie Year: 2013
Location: Vestal, NY
Posts: 189
Re: WPILib 2017 Control System

Quote:
Originally Posted by Joe Ross View Post
Using NetworkTables for the video stream is really inefficient. It is a good choice for processed results like angle and distance, but for the video stream you should use something like mjpg-streamer.
OK, thank you. I'll look into that.