I can't seem to enter anything in the poll, and the numbers are all 0.
Also, you might want to add an Orange Pi 5 option; that is one of the most highly recommended processors for PV.
What is an "offbrand Pi"? Do you mean all the various ARM boards with "Pi" in the name? Despite the copying of the naming, I would not say they are "offbrand". If you are trying to capture that category, wording like "small ARM-based SBC" might be more descriptive.
I in no way think of the Orange Pi as an "offbrand Pi". Sure, they stupidly use "Pi" in hopes of capturing Google search hits, but they are not trying to functionally copy the Raspberry Pi. To me, "offbrand" implies a close, drop-in replacement, and the Orange Pi is definitely not that.
The Orange Pi 5 and 5+, which most people here are using, still outperform the RPi 5, while using different images and being more expensive. A competing product to be sure, but not "offbrand".
I get what you are saying, but this is really not surprising. FRC has been pushing teams to put vision on the robot for 10 or more years.
I looked at the 2022 usage stats (could not find 2023), and ~1500 teams reported having a "USB camera". I believe that means the camera is connected directly to the roboRIO, and it does not count all the teams with separate coprocessors. So, I would guess that >90% of all FRC teams have some sort of vision processing.
I'm sorry, but there is no way this is the case. With most regionals falling between 40-60 teams, and most district events being 30-40 teams, this would mean that your typical regional would only have 4-6 teams without vision, and your typical district event would have 3 or 4.
That's what the usage metric measures, but I would argue that what we should conclude from it is that those 1500 teams were not doing vision processing and were instead only using a camera feed for the driver (if that; some may have just left the camera code in even though the camera was disconnected). The Rio doesn't have enough processing power for high-performance vision, so cameras plugged directly into it are used almost exclusively as driver cameras. And since you can also stream a driver cam from a coprocessor, some percentage of those teams are likely not using a coprocessor either.
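To illustrate how low the bar for that usage metric is: a driver-only camera on the roboRIO can be a single WPILib call. This is just a minimal sketch using WPILib's CameraServer (class and method names from recent WPILib Java releases), not anyone's actual robot code:

```java
import edu.wpi.first.cameraserver.CameraServer;
import edu.wpi.first.wpilibj.TimedRobot;

public class Robot extends TimedRobot {
  @Override
  public void robotInit() {
    // Stream the USB camera plugged into the roboRIO straight to the dashboard.
    // No vision processing happens on the Rio here; it is purely a driver feed.
    CameraServer.startAutomaticCapture();
  }
}
```

Any team that ships the equivalent of that would show up in the "USB camera" stat without doing any vision processing at all.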
It's worth keeping in mind that many FRC teams have 0 or 1 mentors (e.g., just a teacher, no outside mentors) and maybe a student who has some prior programming experience in JavaScript.
Is that confirmed, though? I remember FIRST said in the blog post announcing AprilTags that they plan to remove reflective tape in 2024, but I haven't seen anything confirming this will indeed be the case.
Yes, but the question in the above screenshot was "plan on using vision". For me, streaming to the DS counts as "using" vision. Obviously others could interpret it differently, but still.
Maybe the takeaway is to write your poll questions more clearly.
Wouldn't streaming to the DS be "using a webcam" to drive? There's zero need to have any co-processor on board if you are not going to be doing any vision processing.
One of the reasons the reflective tape was going away is that the tape that was in use is no longer being made. Yes, there are other reflective tapes, but HQ has not done any testing to try to standardize on them. My guess is that they want to get away from those tapes because they are too susceptible to interference from the environment.
We know for sure that AprilTags will be available, so we might as well switch to them, since they are easier to process (co-processor/camera dependent, of course).
So why hope/pray that something "potentially uncertain" might still be in existence (I highly doubt it will be) when there's a for-certain alternative that is already better?
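For what it's worth, "easier to process" is not an exaggeration: WPILib ships an AprilTag detector, so a basic tag pipeline on a coprocessor is only a few lines. The sketch below assumes the 2023-era WPILib Java API (AprilTagDetector, the 16h5 tag family, and CameraServer/cscore for frames); it is an illustration of the idea, not a tuned pipeline:

```java
import edu.wpi.first.apriltag.AprilTagDetection;
import edu.wpi.first.apriltag.AprilTagDetector;
import edu.wpi.first.cameraserver.CameraServer;
import edu.wpi.first.cscore.CvSink;
import edu.wpi.first.cscore.UsbCamera;
import org.opencv.core.Mat;
import org.opencv.imgproc.Imgproc;

public class TagPipeline implements Runnable {
  @Override
  public void run() {
    // Pull frames from a USB camera through CameraServer.
    UsbCamera camera = CameraServer.startAutomaticCapture();
    camera.setResolution(640, 480);
    CvSink sink = CameraServer.getVideo();

    AprilTagDetector detector = new AprilTagDetector();
    detector.addFamily("tag16h5"); // the tag family FIRST announced for 2023

    Mat frame = new Mat();
    Mat gray = new Mat();
    while (!Thread.interrupted()) {
      if (sink.grabFrame(frame) == 0) {
        continue; // frame grab timed out or failed; try again
      }
      // The detector expects an 8-bit grayscale image.
      Imgproc.cvtColor(frame, gray, Imgproc.COLOR_BGR2GRAY);
      for (AprilTagDetection det : detector.detect(gray)) {
        System.out.println("Tag " + det.getId() + " at ("
            + det.getCenterX() + ", " + det.getCenterY() + ")");
      }
    }
    detector.close();
  }
}
```

No lighting tuning, no HSV thresholds, and the tag ID tells you exactly which field element you are looking at, which is a big part of why switching makes sense.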