Teams that have used a Raspberry Pi
Hello, we would like to know if any other teams have used a Raspberry Pi connected to the roboRIO. We are trying to use the Raspberry Pi Camera to handle vision processing.
|
Re: Teams that have used a Raspberry Pi
I wasn't the lead on our RPi vision project; I'll have him come in and help you when he gets back from vacation, but I recommend looking at our GitHub as a place to start. We implemented a vision system on our RPi 2 with a PiCam, using NetworkTables to communicate variables to the roboRIO.
https://github.com/GarnetSquadron4901 https://github.com/GarnetSquadron490...ion-processing |
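For anyone new to the NetworkTables approach mentioned above, here is a minimal sketch of how a Pi-side script might publish detections to the roboRIO with pynetworktables. The table name, entry names, and server address are illustrative assumptions, not 4901's actual values.

```python
# Sketch of publishing vision results from a Raspberry Pi to the roboRIO
# over NetworkTables. Table/entry names ("vision", "centerX", ...) and the
# server hostname are assumptions for illustration.

def target_to_entries(center_x, center_y, area):
    """Flatten one detected target into NetworkTables-friendly key/value pairs."""
    return {
        "centerX": float(center_x),
        "centerY": float(center_y),
        "targetArea": float(area),
        "targetFound": area > 0,
    }

def publish(entries, server="roborio-4901-frc.local"):
    # Import kept inside the function so the pure helper above can be read
    # and tested on machines without pynetworktables installed.
    from networktables import NetworkTables
    NetworkTables.initialize(server=server)
    table = NetworkTables.getTable("vision")
    for key, value in entries.items():
        if isinstance(value, bool):
            table.putBoolean(key, value)
        else:
            table.putNumber(key, value)
```

On the roboRIO side, the robot program would read the same table each loop iteration with the matching `getNumber`/`getBoolean` calls.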
Re: Teams that have used a Raspberry Pi
We are currently using a Pi on both our bots. We run OpenCV with a Microsoft web cam. We send target data to the Rio via UDP. Works great!
|
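The UDP approach described above can be sketched in a few lines of Python. The port number and JSON message format here are assumptions for illustration, not the team's actual protocol.

```python
# Minimal sketch of sending target data from the Pi to the Rio over UDP.
# The port and message format are assumptions; ports 5800-5810 are the
# FRC team-use range.
import json
import socket

RIO_ADDR = ("10.0.0.2", 5800)  # illustrative roboRIO address

def send_target(sock, center_x, center_y, addr=RIO_ADDR):
    """Serialize one detection as JSON and send it, fire-and-forget."""
    payload = json.dumps({"cx": center_x, "cy": center_y}).encode()
    sock.sendto(payload, addr)

def recv_target(sock):
    """Rio-side: block for one datagram and decode it."""
    data, _ = sock.recvfrom(1024)
    return json.loads(data.decode())
```

Because UDP is connectionless, a lost packet just means the robot uses the next frame's data, which is usually fine for a vision loop running tens of times per second.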
Re: Teams that have used a Raspberry Pi
OK, thank you. Also, have you had any problems with communication lag between the two systems?
|
Re: Teams that have used a Raspberry Pi
5968 spent quite a while developing an image-processing algorithm to run on a Raspberry Pi, only to abandon it. I did most of the serial communication between the Pi and the roboRIO; somebody else wrote the actual image-processing algorithm.
Based on my experience, I wouldn't recommend it. First, as far as I could tell, the Raspberry Pi's built-in camera interface library doesn't support sending the image over serial (which would have been slow anyway; we could have used USB, but going from PiCamera to serial is still not very nice). This becomes an issue if you decide you want to stream video to your driver station. Also, the processing algorithm took upwards of 15 seconds per frame on the Pi. That could have been inefficiency in the algorithm, but the same code ran in roughly 3 seconds on a laptop. The serial communication itself worked well once we got it working. Perhaps our issues were a result of trying this in our rookie year, which was probably a bit ambitious. I also haven't tried using a USB camera directly with the roboRIO, though we're going to experiment with that in the next few weeks. |
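A serial link like the one described above usually comes down to agreeing on a small text protocol. Here is a hedged sketch, assuming a simple "cx,cy,distance" line format (the actual format 5968 used is not stated); the pyserial call is included only to show where the framed bytes would go.

```python
# Sketch of a line-based serial protocol between the Pi and the roboRIO.
# The "cx,cy,dist" format, port name, and baud rate are assumptions.

def frame_message(center_x, center_y, distance):
    """Encode one detection as a newline-terminated ASCII line."""
    return f"{center_x},{center_y},{distance:.2f}\n".encode("ascii")

def parse_message(line):
    """Inverse of frame_message; raises ValueError on a corrupt line."""
    cx, cy, dist = line.decode("ascii").strip().split(",")
    return int(cx), int(cy), float(dist)

def send_over_serial(msg, port="/dev/ttyAMA0", baud=115200):
    # pyserial import kept inside the function so the framing helpers
    # above are usable and testable without a serial port attached.
    import serial
    with serial.Serial(port, baud, timeout=1) as link:
        link.write(msg)
```

Sending only small framed results like this (rather than image data) keeps the serial link fast; the bandwidth problem described above comes from trying to push whole frames over it.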
Re: Teams that have used a Raspberry Pi
When we run our processing loop as a single thread, we average better than 40 FPS on a 320×240 matrix. If we run it multithreaded, we can get better than 60 processed frames per second. The RPi has plenty of power for this approach. |
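The single- vs. multithreaded difference described above comes from overlapping capture with processing. Here is a stdlib-only sketch of that producer/consumer structure; the synthetic list "frames" and the `sum()` step stand in for real camera matrices and the real OpenCV pipeline.

```python
# Sketch of a two-thread vision pipeline: one thread grabs frames while
# another processes them, so capture and processing overlap instead of
# running back to back. Frames here are synthetic placeholders.
import queue
import threading

def capture(frames, out_q):
    """Producer: push frames (from a camera in the real system) into a queue."""
    for frame in frames:
        out_q.put(frame)
    out_q.put(None)  # sentinel: no more frames

def process(out_q, results):
    """Consumer: pull frames and run the (placeholder) processing step."""
    while True:
        frame = out_q.get()
        if frame is None:
            break
        results.append(sum(frame))  # stand-in for the real image processing

def run_pipeline(frames):
    q = queue.Queue(maxsize=4)  # small buffer so capture can't run far ahead
    results = []
    worker = threading.Thread(target=process, args=(q, results))
    worker.start()
    capture(frames, q)
    worker.join()
    return results
```

The small bounded queue matters: it keeps the capture thread from buffering stale frames, so the processing thread always works on recent data.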
Re: Teams that have used a Raspberry Pi
This may help, and I would suggest contacting the team for further help via their website: www.prhsrobotics.com
https://www.youtube.com/watch?v=ZNIlhVzC-4g This was a TE Session at Georgia Tech that the programmers held. |
Re: Teams that have used a Raspberry Pi
We use a Raspberry Pi for vision processing. This was very successful this year. I will ask the project lead to join this dialogue.
|
Re: Teams that have used a Raspberry Pi
One of our students wrote a good generic goal finder back in 2013 and we've used it with only minor tweaks since (except 2015 when the only vision targets were in the wrong place): https://github.com/frc3946/PyGoalFinder.
We've found it useful to use separate cameras for targeting (higher resolution, lower brightness) and driver vision (lower resolution, higher frame rate over the network). Our entire control board for Stronghold was "shock mounted": holding the board in place with bungee cord was originally a prototype hack, but after a few dozen trips over the rock wall, we enhanced it rather than replaced it. |
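This is not PyGoalFinder itself (see the repository linked above for the real thing), but a tiny illustration of the "generic goal finder" idea: given bounding boxes of candidate contours (e.g. from OpenCV), keep the one whose aspect ratio best matches the known target. The 2016 Stronghold goal tape was 20 in × 12 in, which is where the assumed ratio comes from.

```python
# Toy goal-candidate filter: score bounding boxes (x, y, w, h) by how
# closely their aspect ratio matches the expected vision target.
# Expected ratio assumes the 2016 goal's 20in x 12in retroreflective tape.

GOAL_ASPECT = 20.0 / 12.0

def aspect_score(box, expected=GOAL_ASPECT):
    """Return a 0..1 score for how closely box matches the expected ratio."""
    _, _, w, h = box
    if h == 0:
        return 0.0
    ratio = w / h
    return min(ratio, expected) / max(ratio, expected)

def best_goal(boxes, min_score=0.5):
    """Return the highest-scoring candidate box, or None if nothing plausible."""
    scored = [(aspect_score(b), b) for b in boxes]
    scored = [(s, b) for s, b in scored if s >= min_score]
    return max(scored)[1] if scored else None
```

A real goal finder would combine several such scores (fill ratio, area, solidity) before trusting a contour, but the structure is the same: score every candidate, then keep the best.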
Re: Teams that have used a Raspberry Pi
I'm lead on the 4901 vision project this year.
Take a look at the repository that John linked above; instructions for installing it are in the repository. I'll have to update the NetworkTables portion due to changes in pynetworktables this past week.

We used this 'hat' for the Raspberry Pi: https://www.amazon.com/Pi-Screw-Term...ry+pi+breakout Unfortunately, it looks like it's no longer made, but any breakout hat would work. We used the proto area for a logic level converter.

One unique aspect of our vision system is that our LED ring is programmable via the Raspberry Pi. The LED ring's logic works at 5V, whereas the RPi operates at 3.3V, which is where a simple logic level converter comes in handy. I recommend taking a look at Sparkfun's logic level converter: https://www.sparkfun.com/products/12009 I had some spare BSS138 N-channel MOSFETs from a different project that I made work with the through-hole mounting, but many alternatives are available. You want one with a low enough gate-source threshold voltage (Vgs) to turn on at 3.3V; the BSS138 turns on at 1.5V, meaning it will work for 1.8V and 2.5V devices as well, as Sparkfun notes.

I'm driving WS2812 LEDs, and I recommend getting genuine Adafruit ones. I tried some cheaper ones from eBay and had overheating issues running them at 5V at their highest brightness. To put that in perspective: I had three cheap rings fail (usually one LED goes out, which causes the rest of the chain to stop working), but I haven't had an Adafruit NeoPixel ring fail yet. |
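For anyone wanting to try the programmable-ring idea described above, here is a hedged sketch using the rpi_ws281x Python bindings. The GPIO pin, LED count, and brightness handling are assumptions, and the hardware calls will only run on an actual Pi (behind the level converter discussed above); the brightness helper also reflects the overheating note by making it easy to stay below full power.

```python
# Sketch of driving a WS2812 (NeoPixel) ring from the Pi. Pin and LED
# count are assumptions; running below full brightness helped avoid the
# overheating described above.

def scaled_green(brightness):
    """Pure helper: green at a 0.0-1.0 brightness, as an (r, g, b) tuple."""
    level = max(0, min(255, int(255 * brightness)))
    return (0, level, 0)

def light_ring(rgb, led_count=16, gpio_pin=18):
    # Import kept inside the function so the helper above can be inspected
    # and tested off-Pi, where rpi_ws281x is unavailable.
    from rpi_ws281x import PixelStrip, Color
    strip = PixelStrip(led_count, gpio_pin)
    strip.begin()
    for i in range(led_count):
        strip.setPixelColor(i, Color(*rgb))
    strip.show()
```

Green is the usual choice here because retroreflective vision targets are typically lit with a green LED ring and filtered on hue.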
Re: Teams that have used a Raspberry Pi
Running GRIP (the Graphically Represented Image Processing engine) on a Raspberry Pi 2: https://github.com/WPIRoboticsProjec...Raspberry-Pi-2
|
Copyright © Chief Delphi