Hello, I’m Alan from team 6145, and I’m currently the main programmer. I was wondering how to use the Raspberry Pi in conjunction with the roboRIO for vision processing. I understand that you have to plug the Pi into the router via Ethernet, but what do I do from there?
- How do I get the camera to send information to the roboRIO?
- How do I fix the lag if I want to use the camera for live viewing at 720p?
- If that is not possible, what coprocessor do I need to view the camera at 720p or higher?
(Please be detailed in your explanations, especially when using programming terms; I am new to more advanced code like this. Thank you!)
Have you considered the bandwidth used at 720p? What led you to a 720p requirement?
We want to be able to use a VR headset to view the field this year. I am the driver as well, and was wondering if achieving 720p with little to no lag is possible; 720p means less of a headache when wearing the headset. I have no idea what the bandwidth usage is, but last year when I tried 720p with the Microsoft LifeCam it was about 2-5 fps. If it’s not possible I will settle for 360p.
Raspberry Pis aren’t all that powerful. Have you looked into an NVIDIA Jetson, or processing it on your laptop?
Check the rules on the bandwidth allotment. It doesn’t seem to me that you’ll be able to have any kind of VR-capable stream
How would I process the video on my laptop? Can you walk me through a step-by-step process on that?
I haven’t done it myself, but I remember we’ve done it in past years. Basically, have a program on the laptop tap into the video coming from the field and process it with the same code you would run on the Raspberry Pi, just compiled for Windows or whatever you use.
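I haven’t tested this exact setup, but the general shape on the laptop side would be something like the sketch below. It assumes the camera is already being published as an MJPEG stream (e.g. by WPILib’s CameraServer or mjpg-streamer); the URL, HSV thresholds, and everything else here are just placeholders to illustrate the idea.

```python
# Untested sketch (Python, OpenCV 4): pull the robot's MJPEG camera stream on the
# driver-station laptop and run the processing there instead of on the Pi.
import cv2

# Hypothetical URL -- substitute whatever your camera server actually publishes.
STREAM_URL = "http://roborio-6145-frc.local:1181/?action=stream"

cap = cv2.VideoCapture(STREAM_URL)
while True:
    ok, frame = cap.read()
    if not ok:
        continue  # dropped frame; keep trying
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Example processing step: isolate a bright green target (LED-lit retroreflective tape).
    mask = cv2.inRange(hsv, (50, 100, 100), (90, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # ...compute the target position from the contours and send it back to the roboRIO,
    # typically over NetworkTables...
    cv2.imshow("processed", mask)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
```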
Check H1 - historically, a full headset may run afoul of the safety rules for drive teams (obstructed vision). Nonetheless, full 360° video is going to be… very… hard to fit in 4 Mbps.
The bandwidth limit is 4 Mbit/s this year. If we can’t use VR, that is fine, but will I at least be able to do 720p at 24 fps with this?
Well, how large is a single 720p frame? Multiply that by 24 frames per second and you get a rough number for how much bandwidth it’ll use
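For a rough sense of scale, assuming raw 24-bit RGB frames (a deliberately pessimistic baseline, since any real stream is compressed):

```python
# Back-of-the-envelope math for an uncompressed 720p stream.
width, height, bytes_per_pixel, fps = 1280, 720, 3, 24
frame_bytes = width * height * bytes_per_pixel        # ~2.76 MB per raw frame
raw_mbps = frame_bytes * 8 * fps / 1_000_000          # ~531 Mbit/s uncompressed
print(f"{raw_mbps:.0f} Mbit/s raw -> needs roughly "
      f"{raw_mbps / 4:.0f}x compression to fit a 4 Mbit/s cap")
```

So whatever you stream has to be compressed by a couple of orders of magnitude no matter what resolution you pick.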
Depends on compression and all of that stuff. VR gives you a headache if it’s less than 60 fps (source: my VR experiences), and 720p is very noticeably pixelated. You might be doing yourself a disservice by trying to do two 720p frames (one per eye) versus one bigger frame.
Getting 24 fps at 720p is definitely doable; you’ll just have to up your compression a little bit. It’s quite the balance. You might find that going to something a little smaller than 720p and giving it less compression works better.
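If you’d rather have numbers than guesses, one quick (and admittedly crude) test is to JPEG-encode a captured frame at a few resolutions and qualities and compare the sizes against your per-frame budget - 4 Mbit/s at 24 fps works out to roughly 20 KB per frame if you stream MJPEG. The camera index and quality values below are just examples.

```python
# Rough sketch: measure how big MJPEG frames get at different resolutions and
# JPEG qualities, using whatever USB camera is attached to the laptop.
import cv2

cap = cv2.VideoCapture(0)            # index 0 = first attached camera
ok, frame = cap.read()
if not ok:
    raise SystemExit("couldn't grab a test frame")

sizes = {"720p": (1280, 720), "360p": (640, 360)}
for name, (w, h) in sizes.items():
    resized = cv2.resize(frame, (w, h))
    for quality in (30, 50, 70):
        ok, buf = cv2.imencode(".jpg", resized, [cv2.IMWRITE_JPEG_QUALITY, quality])
        print(f"{name} @ JPEG quality {quality}: {len(buf) // 1024} KB/frame")
```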
I can get 4K below 4 Mbit/s, but it wouldn’t be anything worth looking at.
From the information I’m gathering now, it looks like 720p is not possible in terms of bandwidth while still getting a good picture without compressing it into blocks. Is there any way to enlarge a 360p video to 720p size?
While I’ll agree the kind of latency, frame rate, and image quality necessary for tolerable VR is probably far out of the question, with good video compression - of which RPi3s with the right cameras are certainly capable - bandwidth limitations don’t put 720p out of the question. The raspivid program has a fixed bit rate option, and 800kbps H.264 gives quite tolerable video at that resolution.
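As a rough illustration of that fixed-bit-rate idea (not our actual code - the IP address and port are placeholders, and you’d still need something on the receiving end that can decode a raw H.264 stream):

```python
# Sketch: run raspivid on the Pi at a fixed 800 kbps and push the H.264
# elementary stream over TCP to the driver station.
import socket
import subprocess

DEST = ("10.61.45.5", 5800)  # placeholder driver-station IP and port; check the FRC port rules

raspivid = subprocess.Popen(
    ["raspivid", "-w", "1280", "-h", "720", "-fps", "24",
     "-b", "800000",   # fixed bit rate: 800 kbps, well under the 4 Mbit/s cap
     "-t", "0",        # run until killed
     "-o", "-"],       # write the stream to stdout
    stdout=subprocess.PIPE,
)

with socket.create_connection(DEST) as sock:
    while True:
        chunk = raspivid.stdout.read(4096)
        if not chunk:
            break
        sock.sendall(chunk)
```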
There are a lot of interesting things you can do with this much video. Here’s our project: 2077 Video/Vision platform code
A couple of warnings: 1) this is fairly ambitious stuff; 2) we got burned with unreliable video feeds using well under half of last year’s legal bandwidth limit. There are risks and unsolved problems.