FTC: which vision sensor to buy for AprilTags?

Hi,
I am a novice here and am trying to understand how an FTC robot identifies AprilTags in autonomous mode. During the Centerstage kickoff they talked about having a vision sensor, but I am unclear on where to buy one and how it is used. I have searched REV DUO but could only find a color sensor. I have also looked at AndyMark; they do offer a camera, but I am not sure whether it will work (Microsoft LifeCam HD-3000 Camera, am-3025). Any help or advice would be appreciated.

Thanks,
Jeremy

I haven’t dug into it yet, but start here: AprilTag Introduction — FIRST Tech Challenge Docs 0.2 documentation

You can read AprilTags with a standard camera, so I’m expecting it’ll work with a standard Logitech webcam, and maybe the LifeCam.
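For a rough idea of what that looks like in code, here is a minimal, untested sketch using the SDK's VisionPortal and AprilTagProcessor (available in SDK 8.2+/9.0). The class name, opmode name, and the "Webcam 1" configuration name are placeholders; adjust them to match your own setup.

import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;

@TeleOp(name = "Webcam AprilTag Sketch")
public class WebcamAprilTagSketch extends LinearOpMode {
    @Override
    public void runOpMode() {
        // The default AprilTagProcessor is set up for the 36h11 family FTC uses.
        AprilTagProcessor aprilTag = AprilTagProcessor.easyCreateWithDefaults();
        VisionPortal portal = VisionPortal.easyCreateWithDefaults(
                hardwareMap.get(WebcamName.class, "Webcam 1"), aprilTag);

        waitForStart();
        while (opModeIsActive()) {
            for (AprilTagDetection detection : aprilTag.getDetections()) {
                telemetry.addData("Tag ID", detection.id);
                // ftcPose (range/bearing/etc.) is only populated for tags in the
                // season's tag library, i.e. when metadata is not null.
                if (detection.metadata != null) {
                    telemetry.addData("Range (in)", detection.ftcPose.range);
                    telemetry.addData("Bearing (deg)", detection.ftcPose.bearing);
                }
            }
            telemetry.update();
        }
        portal.close();
    }
}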


I believe the AprilTag library can work with a webcam. We got Logitech C615s last year for vision, and I was intending to have the students use them this year too.

In other words, a “vision sensor” is essentially some form of camera (though there are different kinds). Page 29 of Game Manual 1 talks a bit about them, and rule RE13 (page 38) defines what’s allowed (basically, most USB webcams with a single image sensor are legal).


Thanks for your help and the quick response!

I don’t know the FTC rules or what’s allowed and what’s not, but I’d recommend a global shutter camera if you can get one. I don’t even know if that’s needed, though; I can’t imagine FTC robots move fast enough for motion blur to come into play.

The DFRobot HuskyLens sensor does a pretty good job of recognizing and tracking the 36h11 AprilTag family that FTC is using this year. From a hardware perspective, it should be easy to interface over I2C. From a software perspective, you’d kind of be on your own. I also see that FTC has an Illegal Parts list that bans microprocessors such as an Arduino; the HuskyLens may fall under that rule. The rule seems kind of dumb. You’re probably better off doing whatever the other posters in this thread are recommending.

The HuskyLens (and for that matter, PixyCam) are allowed as “Vision Sensors” under the rules as they are able to interface over i2c. The GDC intends for these sorts of sensors to be legal.

The Control Hub has four Cortex A53s in it, making it (on paper) about on par with a Raspberry Pi 3. I can’t find which Limelight this roughly corresponds to, but onboard processing is very doable.

USB cameras are generally legal if they are UVC-compatible. You could use an OV9281 if you wanted.


The Raspberry Pi 3 is comparable to an LL2+, which can process tags fairly quickly (30+ fps) at low resolution with up to 10 ft of range. That should be fine for FTC.
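If frame rate on the Control Hub becomes an issue, the SDK's AprilTag processor also exposes a decimation setting, and the VisionPortal builder lets you pick the camera resolution; higher decimation or lower resolution trades detection range for speed. A hedged sketch (the "Webcam 1" name and the helper class are again just placeholders):

import android.util.Size;
import com.qualcomm.robotcore.hardware.HardwareMap;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;

public class AprilTagSpeedTuning {
    public final AprilTagProcessor aprilTag;
    public final VisionPortal portal;

    public AprilTagSpeedTuning(HardwareMap hardwareMap) {
        aprilTag = new AprilTagProcessor.Builder().build();

        // Higher decimation = faster detection, but shorter usable range.
        aprilTag.setDecimation(3);

        portal = new VisionPortal.Builder()
                .setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
                .setCameraResolution(new Size(640, 480)) // lower resolution also raises fps
                .addProcessor(aprilTag)
                .build();
    }
}

Roughly speaking, decimation downsamples the image before the detector runs, so 2 or 3 is a reasonable starting point when tags are close, and you can drop it back toward 1 if you need to see tags from farther away.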


Out of pure curiosity, we opted to buy a couple of HuskyLens modules. While the I2C interface should be easy enough to connect, we have no clue what data format it will be sending. Does anyone know?

Any legal webcam works; we have used cheap Logitech ones in the past with great success.

Per the Legal and Illegal Parts doc, Raspberry Pis, Arduinos, and other microprocessors aren’t allowed. (It’s sad, as I’d love to use them too.)

I don’t think asid61 was saying to use a Raspberry Pi 3, but rather that, since the Control Hub has processing power equivalent to one, onboard processing on the Control Hub should be fine.


How was the T265 camera from Intel used by an FTC team?

Oops, you’re right. Thanks! (That’s what I get for reading forums too early on a Monday without caffeine!)


We are trying the HuskyLens camera. This camera is LEGAL for the 2023-2024 FIRST Tech Challenge. The HuskyLens has multiple modes of operation, selected in code with, for example:

huskyLens.selectAlgorithm(HuskyLens.Algorithm.TAG_RECOGNITION);

The available algorithms are:

HuskyLens.Algorithm.NONE
HuskyLens.Algorithm.TAG_RECOGNITION
HuskyLens.Algorithm.COLOR_RECOGNITION
HuskyLens.Algorithm.FACE_RECOGNITION
HuskyLens.Algorithm.LINE_TRACKING
HuskyLens.Algorithm.OBJECT_CLASSIFICATION
HuskyLens.Algorithm.OBJECT_RECOGNITION
HuskyLens.Algorithm.OBJECT_TRACKING

Please note that I have tested this only with our robot configuration; check all wiring and software before use.

Cable the HuskyLens to the REV Control Hub. Note: this requires a custom cable; verify the pin functions for both the HuskyLens and the REV Control Hub before use, and match pins by function (SDA to SDA, SCL to SCL, VCC to VCC, GND to GND), not by pin number.

HuskyLens                                          REV Control Hub
Pin   Label   Function   Description               Pin   Function   Description
1     T       SDA        Serial data line          4     SCL        Serial clock line
2     R       SCL        Serial clock line         3     SDA        Serial data line
3     -       GND        Negative (0V)             2     VCC        Positive (3.3~5.0V)
4     +       VCC        Positive (3.3~5.0V)       1     GND        Negative (0V)

Robot Configuration

  • Select “I2C Bus 2”
  • Select type “HuskyLens”
  • Enter name “huskylens”

HuskyLens Camera Configuration

  • Select “General Setting”
  • Select “Protocol”
  • Select “I2C”
  • Select “Save and Exit”

Software Test Program (in FTC software release 9.0, external.samples)

Make a copy in your org.firstinspires.ftc.teamcode directory.

SensorHuskyLens.java

I did not find the HuskyLens test program in the TeleOp list after building the code, so I changed group = "Sensor" to "Iterative Opmode":

//@TeleOp(name = "Sensor: HuskyLens", group = "Sensor")
@TeleOp(name = "Sensor: HuskyLens", group = "Iterative Opmode")
//@Disabled
public class SensorHuskyLens extends LinearOpMode {

The example program worked. I will release more examples for the HuskyLens as we work on FTC applications. If other users develop applications or usage notes, please share them; there is limited information on using the HuskyLens camera with the FTC REV Control Hub.
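In the meantime, the sample boils down to something like the following untested sketch. It assumes the "huskylens" configuration name from above and the HuskyLens driver bundled with SDK 9.0 (com.qualcomm.hardware.dfrobot.HuskyLens); the class and opmode names are placeholders.

import com.qualcomm.hardware.dfrobot.HuskyLens;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;

@TeleOp(name = "HuskyLens Tag Sketch", group = "Iterative Opmode")
public class HuskyLensTagSketch extends LinearOpMode {
    @Override
    public void runOpMode() {
        HuskyLens huskyLens = hardwareMap.get(HuskyLens.class, "huskylens");

        // knock() checks that the camera is responding over I2C.
        if (!huskyLens.knock()) {
            telemetry.addData("Error", "HuskyLens not responding");
        }
        huskyLens.selectAlgorithm(HuskyLens.Algorithm.TAG_RECOGNITION);

        waitForStart();
        while (opModeIsActive()) {
            // Each Block is one recognized 36h11 tag: its id plus a pixel-space
            // bounding box (x, y, width, height); no pose estimate is reported.
            for (HuskyLens.Block block : huskyLens.blocks()) {
                telemetry.addData("Tag " + block.id, "x=%d y=%d w=%d h=%d",
                        block.x, block.y, block.width, block.height);
            }
            telemetry.update();
        }
    }
}

As far as I can tell, the id and bounding box are all the driver returns in this mode.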

The T265 is not legal anymore, but back when it was, it was used to track the robot's position via its onboard processing, which combined environmental data with its built-in IMU. It didn't look for any tags in particular, but rather tracked features in the environment to localize against.

To be precise, the HuskyLens is considered a Vision Sensor under the rules, not a camera.
Semantics aside, were you able to get more than just tag recognition from the HuskyLens? Getting pose would be fantastic, but that doesn't seem to be available.