#1
WPILib 2017 Control System
I was looking through the WPILib documentation and noticed that they added the information for the 2017 control system here: https://wpilib.screenstepslive.com/s/4485/m/13503
I didn't see any other threads discussing this, and one part stood out to me immediately: having OpenCV bundled with WPILib will be very nice, since it will automatically be on the roboRIO. In addition, it sounds like we will be able to do vision processing on the RIO more efficiently and then send the processed image to the dashboard, which could eliminate the need for a coprocessor, if I'm understanding this correctly. I'm very excited to see how this works during build season. Correct me if I'm wrong on any of this; I'm relatively new to it.
#2
Re: WPILib 2017 Control System
I don't think we'll run it on the roboRIO anyway. We've had enough CPU trouble before, so putting extra strain on it wouldn't go well. Having a coprocessor do the vision is probably still the best way to go.
#3
Re: WPILib 2017 Control System
We just want to give teams the option of a good vision solution on the roboRIO. In addition, GRIP will now generate OpenCV code that can run on your coprocessor of choice, or you can run GRIP on your laptop. The GRIP-generated code can also be added to the robot program.
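For anyone curious what that generated code amounts to conceptually: a GRIP pipeline is a sequence of stages (HSV threshold, find contours, filter contours, and so on). The real generated code calls into OpenCV (e.g. `cv2.inRange` for thresholding), but the thresholding stage boils down to something like this pure-Python sketch; the function and parameter names here are illustrative, not GRIP's actual output:

```python
def hsv_threshold(image, h_range, s_range, v_range):
    """Return a binary mask: 1 where the (h, s, v) pixel falls inside
    all three (lo, hi) ranges (inclusive), 0 elsewhere.
    `image` is a list of rows of (h, s, v) tuples."""
    return [
        [1 if (h_range[0] <= h <= h_range[1]
               and s_range[0] <= s <= s_range[1]
               and v_range[0] <= v <= v_range[1]) else 0
         for (h, s, v) in row]
        for row in image
    ]

# Example: a 1x3 "image" where the first and last pixels fall in a green range.
img = [[(60, 200, 200), (10, 50, 50), (65, 180, 220)]]
mask = hsv_threshold(img, (50, 80), (100, 255), (100, 255))
# mask == [[1, 0, 1]]
```

The later stages (find/filter contours) then operate on that mask to produce the contour report.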
#4
Re: WPILib 2017 Control System
Thank you for that. Our programming team has wanted to do vision work for the last couple of years. The biggest roadblock for them was understanding how the parts fit together. "It's all just the robot code, right?" Well, now it is.
#5
Re: WPILib 2017 Control System
So, if we run the vision code on, say, a Raspberry Pi, can we stream the output to the driver station computer?
#6
Re: WPILib 2017 Control System
@jreneew2, you can stream the output to NetworkTables, whether the output is the video stream or the contour report, and then read those values from the driver station. Keep in mind that the Raspberry Pi is somewhat underpowered, so there may be a delay in processing the image.
#7
Re: WPILib 2017 Control System
#8
Re: WPILib 2017 Control System
The biggest key here is making sure your processing loop is written so that it doesn't hog the CPU and all of its free resources. I would expect that the roboRIO has more than sufficient processing power to do quite a bit of real-time image processing. Happy programming! RoboBob
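One concrete way to keep a vision loop from hogging the CPU is to pace it to the camera's frame rate instead of spinning as fast as possible. A minimal sketch of that idea; the helper and its arguments are hypothetical, not a WPILib API:

```python
import time

def run_pipeline(grab_frame, process, period_s=0.05, max_iterations=None):
    """Call process(grab_frame()) at most once per `period_s` seconds,
    sleeping out the remainder of each cycle so other threads
    (like the main robot loop) get CPU time."""
    done = 0
    while max_iterations is None or done < max_iterations:
        start = time.monotonic()
        process(grab_frame())
        done += 1
        leftover = period_s - (time.monotonic() - start)
        if leftover > 0:
            time.sleep(leftover)
```

At 20 fps (`period_s = 0.05`), a pipeline stage that takes 10 ms per frame spends roughly 80% of each cycle sleeping instead of burning it in a busy loop.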
#9
Re: WPILib 2017 Control System
#10
Re: WPILib 2017 Control System
OK, thank you. I'll look into that.