Anand's Coprocessor Roundup - Which Is Best? (also, an Orange Pi setup guide)

I have summarized my adventures in coprocessors in this document:

The tl;dr is that I highly recommend the Orange Pi 5 to anyone who can get one off Aliexpress (shorter lead time than Amazon right now). The Beelink Mini S with N5095 is a good option for teams willing to put in slightly more work. However, with the newest Photonvision releases, setup for any coprocessor is very easy. The primary differentiator is in hardware setup, and in that game, the Orange Pi 5 beats the Beelink, and the Limelight beats them all.

I have an Orange Pi Photonvision setup guide here: Orange Pi Photonvision Guide - Google Docs

I’m setting aside my coprocessors for now, but if anyone wants me to test something, just send me an Amazon link and I’ll get it done.

22 Likes

Have you found that these numbers scale at 1920x1080 and with, say, 10 tags in the field of view? (see Edit)

Frankly, I’m surprised you got the latency down so low with a USB camera. I assume your Pi4b numbers are with the same USB camera? If so, it would be good to get comparison numbers with a Pi4b running with an ov9281 MIPI CSI camera.

I’ve played with older Orange Pis and had quality issues, but that was a good number of years ago (Allwinner H3, I think).

Edit:

I think I see why your numbers are based on 640x480. I assume the cameras you’re running are USB 2.0, so at 480 Mbps minus USB overhead, you can’t go higher resolution than 640x480 without some sort of image compression (i.e., MJPEG or H.264, and thus much higher latency). I think that translates into about a 3 m maximum detection distance?
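For reference, here’s my back-of-envelope check of that bandwidth math (my own arithmetic, not from any testing doc; it assumes uncompressed YUYV at 2 bytes per pixel and 30 FPS):

```java
public class UsbBandwidth {
    public static void main(String[] args) {
        // Uncompressed YUYV is 2 bytes/pixel; USB 2.0 signals at 480 Mb/s, with less usable after overhead.
        System.out.printf("640x480@30:   %.0f Mb/s%n", 640 * 480 * 2 * 8 * 30 / 1e6);   // ~147, fits
        System.out.printf("1280x720@30:  %.0f Mb/s%n", 1280 * 720 * 2 * 8 * 30 / 1e6);  // ~442, over practical USB 2.0 throughput
        System.out.printf("1920x1080@30: %.0f Mb/s%n", 1920 * 1080 * 2 * 8 * 30 / 1e6); // ~995, needs MJPEG/H.264 or USB 3.0
    }
}
```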

Also, I assume your ‘latency’ numbers are ‘time to return a pose after the image is received by a userland process’ as opposed to ‘latency from photons hitting the lens to returning a pose’.

I expect that the amount of time it takes from when a photon hits the lens to when that image is handed off to libcamera/v4l2 will be higher in the USB case than in the MIPI CSI case, so I would be especially curious to see the comparison between OPi5 w/USB camera and Pi4B w/MIPI CSI camera.

Have you looked at Le Potato? https://libre.computer/products/aml-s905x-cc/

Seems to have decent specs and they are available.

1 Like

Just got my Orange Pi a few days back! Thanks for making a setup paper!

2 Likes

Do you have lens recommendations?

I just want to say that you’ve done a massive lift for teams this year, and many of us will be in a much better place thanks to you, Anand. Thank you

9 Likes

Agreed, this testing was extremely valuable for us, both on the PhotonVision side and on the user side. Thanks again for the hard work and time you’ve invested in this.

1 Like

I am pretty sure that all of Anand’s testing was with USB cameras. Yes, those generally need JPEG compression above 640x480 @ 30 FPS, but that does not limit the range to 3 m. Take a look at his numbers; even at 640x480 you can get over 20 ft, and if you are willing to go to higher frame sizes (presumably at lower FPS), you can get to over 30 ft.

The latency number in PhotonVision is the time from the frame timestamp reported by the camera driver until processing is finished and the results are inserted into NetworkTables. It is unclear (to me at least) what image time any camera reports, so the pessimistic assumption is that the clock starts when the image readout starts, and thus the number does not include the exposure time.
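As a rough illustration of how that number typically gets used on the robot side (a sketch, assuming PhotonLib’s getLatencyMillis() is the value described above; the camera name is just a placeholder):

```java
import edu.wpi.first.wpilibj.Timer;
import org.photonvision.PhotonCamera;

public class VisionTimestamp {
    // "frontcam" is a placeholder for whatever the camera is named in the PhotonVision UI.
    private final PhotonCamera camera = new PhotonCamera("frontcam");

    /** Back-date "now" by the reported pipeline latency to get an approximate capture time. */
    public double latestCaptureTimestamp() {
        var result = camera.getLatestResult();
        return Timer.getFPGATimestamp() - result.getLatencyMillis() / 1000.0;
    }
}
```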

Not sure how you would interpret numbers coming from the USB vs MIPI camera because there is no way to know what is actually included in a timer. The only real way to do that would be with a physical setup which (for instance) turns on and off a light and times the results from that, independent of software timing. (See for instance my measurements of driver image latency: https://ligerbots.org/docs/whitepapers/LigerBots_Camera_Latency_Whitepaper.pdf)

1 Like

According to Geekbench benchmarks, the S905X is somewhere between an RPi3 and an RPi4, so I’m guessing it would get roughly 15 fps at around 50 ms.

2 Likes

With libcamera (i.e., MIPI cameras on Raspberry Pis), we are able to read out the time the first pixel started being exposed. With USB cameras we are left at the mercy of cscore’s “frame time” (CvSink, WPILib API 2022.1.1).
I haven’t dug into the cscore source code to see what it does internally, but I strongly suspect it depends on the camera and its v4l driver.
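For reference, reading that cscore frame time from WPILib looks roughly like this (a sketch; what the returned timestamp actually represents is exactly the open question):

```java
import edu.wpi.first.cameraserver.CameraServer;
import edu.wpi.first.cscore.CvSink;
import org.opencv.core.Mat;

public class FrameTimeProbe {
    private final CvSink sink;
    private final Mat frame = new Mat();

    public FrameTimeProbe() {
        CameraServer.startAutomaticCapture(); // first USB camera
        sink = CameraServer.getVideo();
    }

    /** Waits for the next frame; returns cscore's "frame time" in microseconds (0 on timeout). */
    public long grabFrameTime() {
        // Whether this timestamp marks start of exposure, start of readout, or something else
        // is up to cscore and the camera's v4l driver.
        return sink.grabFrame(frame);
    }
}
```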

2 Likes

Fair warning to everyone: Amazon vendors do not expect hundreds of teams to suddenly start buying this random coprocessor and global shutter cameras, so they will sell out fast and take a while to restock. Order sooner rather than later.

3 Likes

Yes, that is what I had heard (maybe from you ;-).

I believe I looked once, and cscore starts the clock just as it starts the readout, so the “missing time” is from the start of exposure until the readout starts. I would hypothesize that the “worst case” is about 2 frame periods: 1 period for the exposure and then up to 1 period for readout and compression (but this is a guess!!). I guess you could go to 3 periods if you want to be really pessimistic.

The good aspect of that is that the period should be the free-run period; if the AprilTag processing takes more than a period, it will just skip frames, but the delay will not grow. So, if the camera can run at 120 FPS, that would be under ~20 ms. Significant, but not that bad.
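Spelling out that back-of-the-envelope arithmetic (my numbers, using the 2–3 frame-period guess above):

```java
public class LatencyBudget {
    public static void main(String[] args) {
        double periodMs = 1000.0 / 120.0;                        // ~8.3 ms per frame at 120 FPS
        System.out.printf("2 periods: %.1f ms%n", 2 * periodMs); // ~16.7 ms: exposure + readout/compression
        System.out.printf("3 periods: %.1f ms%n", 3 * periodMs); // ~25 ms if you want to be really pessimistic
    }
}
```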

1 Like

All of my personally tested numbers are with MJPEG compression, because that’s the only way to get more than 30fps at 480p. If you check my Beelink testing doc, I have tried a USB 3.0 camera, but the model of camera matters more than the compression. YUYV (raw) is about 2-4ms faster. I’m not sure where you’re getting the distance claim from because my range is good up to about 32ft (the length of my house), and I’m pretty sure I’m just limited by my camera lens/exposure time. I could check the USB 3 camera again and see if it has better quality though.

At lower FPS, the model of camera matters a lot less. I’m pretty sure the Pi 3 tests used a Lifecam. I used the most flattering representative numbers from the supported hardware page.

I can check later, but I’m pretty sure all of the RasPi tests were done with MIPI cameras already. It isn’t a game changer to use a MIPI camera and probably only matters on the Orange Pi to eke out that last bit of performance.

My impression was that the processor on this is weaker than even a Pi, but if that’s not the case I could try it out. I keep getting ads for it now because I’ve looked up so many computers.

Yes, in both the Orange Pi and Beelink guides I linked. I haven’t tried them yet, so if someone does, let me know how it goes. Ideally, just get your camera with a 100-degree no-distortion lens.

You can see the table of performance in the first post. Higher resolution does increase processing time. I wish I could do 50fps at 1280x720 to eke out those last 5ft of range, but it’s just barely too slow. At that resolution it’s more worth it to localize based on tag information and use odometry or gyro PID.
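A rough sketch of what that “localize off the tag and lean on odometry” approach tends to look like with WPILib’s pose estimator (illustrative only; the estimator construction and the tag-to-field-pose math are assumed to live elsewhere, and the camera name is a placeholder):

```java
import edu.wpi.first.math.estimator.SwerveDrivePoseEstimator;
import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.wpilibj.Timer;
import org.photonvision.PhotonCamera;

public class TagAssistedOdometry {
    private final PhotonCamera camera = new PhotonCamera("frontcam"); // placeholder name
    private final SwerveDrivePoseEstimator poseEstimator;             // constructed elsewhere

    public TagAssistedOdometry(SwerveDrivePoseEstimator poseEstimator) {
        this.poseEstimator = poseEstimator;
    }

    /** Call periodically: odometry carries the estimate; tag sightings just correct drift. */
    public void addVisionIfAvailable(Pose2d robotPoseFromTag) {
        var result = camera.getLatestResult();
        if (result.hasTargets()) {
            // Back-date the measurement by the reported pipeline latency so the
            // estimator fuses it at (approximately) the time the image was taken.
            double timestamp = Timer.getFPGATimestamp() - result.getLatencyMillis() / 1000.0;
            poseEstimator.addVisionMeasurement(robotPoseFromTag, timestamp);
        }
    }
}
```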

I have an external OV9281 Arducam vendor linked in one of the guides. The OV9281 module is fairly common so teams should be able to get them from places other than Amazon.

!! ATTENTION !!
I’ve been informed by mdurrani that the Opi 5 is available now with overnight shipping: https://a.co/d/8g6HUDa

2 Likes

Agreed, it’s difficult. Something I’ve done in the past was to start a timer in a window, aim the camera at the screen, open a browser window to that camera (usually mjpg_streamer), and then photograph the result (or screenshot it). It’s very crude, obviously, and doesn’t eliminate any processing time, though I’ve found mjpg_streamer to be pretty good compared to mplayer or cvlc.

I reproduced that experiment with a Pi 1.3 MIPI CSI camera and 2 USB cameras that I happen to have here, albeit with Photonvision driver mode instead of mjpg_streamer. At 640x480, I would agree there is no discernible difference between MIPI CSI and USB (a Razer Kiyo and a Microsoft LifeCam). https://photos.app.goo.gl/Lu27PN7cFF4Hm2AH9

Obviously at 1920x1080 the difference between MIPI CSI and USB becomes greater. It’s approximately 1000 ms of latency with the Razer Kiyo at 1920x1080 MJPEG. This is no surprise and is what I was thinking when I wrote my post.

Razer Kiyo at 1920x1080: https://photos.app.goo.gl/rXjhEwTCDbC84GsCA
Pi 1.3 MIPI CSI cam: https://photos.app.goo.gl/C7eRKs3io6yBmuLW8

Anyway,

@asid61:

I’m not sure where you’re getting the distance claim from because my range is good up to about 32ft (the length of my house), and I’m pretty sure I’m just limited by my camera lens/exposure time.

I have about 22’ of distance between my Pi4 and an AprilTag within the confines of my lab. I was not able to get any recognition at 640x480, while it worked great at 1920x1080 … Prior to my post, I had read this post (Visual Fiducials in Future FRC Games - #353 by signalfade) where someone was reporting 3 m; that stuck in my head, and I figured it explained why I wasn’t getting any distance out of 640x480. After you and @prensing mentioned getting >20 feet at 640x480, I connected everything back up this afternoon (yesterday was busy with this kickoff thing we had) and was able to get it working after playing with the settings in PV.

Anyhow, thanks @prensing and @asid61. This has been a good exercise for me.

1 Like

Really glad to hear about your setup working better! I wonder if it’s worth setting up some kind of latency verification given the 1000ms compression time you’re looking at. I’m definitely only getting latency in the tens of ms at 1280x720 just by eye, but it would be nice to confirm that on a robot.

You just woke up one of my brain cells. I moved my Razer Kiyo from a black USB port to one of them thar fancy blue ones and now my latency:

1920x1080 is more like 45ms: https://photos.app.goo.gl/y2WqBvJ5rNHFxvEy8

1280x720 is 20-35ms: https://photos.app.goo.gl/4ebuK7LGc5HRMbqN6

Who knew color was so important. :stuck_out_tongue_winking_eye:

4 Likes

Great job with the guide for setting up the Orange Pi. One thing that would be nice to add is instructions on how to disable the Wi-Fi and/or Bluetooth. Wi-Fi has to be disabled at competitions.

On the RPi, it can be disabled by adding a line to the /boot/config.txt file. It was pretty easy to find instructions for this for the RPi; I was not able to find any specific instructions for the Orange Pi using a Google search.

If you run rfkill list, what shows up?
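In case it helps, a soft-block via rfkill generally looks like this on boards whose kernel exposes the radios (untested on the Orange Pi 5 specifically):

```
rfkill list                 # show all radios and their block state
sudo rfkill block wifi      # soft-block Wi-Fi
sudo rfkill block bluetooth # soft-block Bluetooth
```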

1 Like

I don’t have an Orange PI yet. One is on order for evaluation. At this point I’m just trying to identify all of the details necessary to use one.

There’s no built in WiFi or Bluetooth. In any case, I wasn’t aware that WiFi had to be disabled, just that we weren’t allowed to use it. Can you cite the rule that requires WiFi functions to be disabled?