STEREO VISION - Stereolabs ZED Camera

If you are interested in using a Stereolabs ZED camera with an NVIDIA Jetson on your robot, you can!

https://i.imgur.com/kW4n56e.png

Wow… I had no idea I would never use this camera until now.

Seriously though, even with a discount, that’s a lot of money for that product. I get that it has a high resolution, but I honestly can’t see who in the FRC community is actually going to use this.

Largely because the alternative is an Xbox 360 Kinect, a device that has had half a decade of community development behind it. Seriously, setting up SLAM for a Kinect in ROS is almost easier than installing the FRC plugins.

Also, when I got another one for myself a few months ago, it was 25 bucks used at GS, so unless the discount is quite literally an order of magnitude, I don’t have much hope for this device.

Ummmm. That awkward moment when you are not allowed to use parts that cost over $400 (per R10).

Edit: I am wrong; see below.

The post linked in the OP has a solution to that:

Well $@#$@#$@#$@# (why is dam censored?). I would really like to have one for our robot, but that’s way too expensive for such a device for our team =\

The good news is that this also works with two USB cameras rigidly mounted a fixed distance apart and with the appropriate calibration and processing. The NI libraries are on the Stereo palette, and the example is in examples/vision/Stereo. The performance of the ZED system is quite impressive for such high resolution images.

Greg McKaskle
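For anyone curious what that two-USB-camera route looks like outside of LabVIEW, here is a minimal OpenCV (Python) sketch, not the NI Stereo palette example Greg mentions. It assumes you have already run your own chessboard stereo calibration; `stereo_calib.npz` and its keys are made-up names for that calibration output:

```python
# Minimal two-USB-camera stereo sketch in OpenCV (Python), NOT the NI
# Stereo palette example. "stereo_calib.npz" and its keys are hypothetical
# names for the output of your own chessboard stereo calibration.
import cv2
import numpy as np

cap_l = cv2.VideoCapture(0)   # left USB camera
cap_r = cv2.VideoCapture(1)   # right USB camera

calib = np.load("stereo_calib.npz")
map1_l, map2_l = calib["map1_l"], calib["map2_l"]
map1_r, map2_r = calib["map1_r"], calib["map2_r"]

# Block matcher; numDisparities must be a multiple of 16.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)

while True:
    ok_l, frame_l = cap_l.read()
    ok_r, frame_r = cap_r.read()
    if not (ok_l and ok_r):
        break
    # Rectify so corresponding points land on the same image row.
    gray_l = cv2.remap(cv2.cvtColor(frame_l, cv2.COLOR_BGR2GRAY),
                       map1_l, map2_l, cv2.INTER_LINEAR)
    gray_r = cv2.remap(cv2.cvtColor(frame_r, cv2.COLOR_BGR2GRAY),
                       map1_r, map2_r, cv2.INTER_LINEAR)
    # Fixed-point disparity map, scaled by 16.
    disparity = stereo.compute(gray_l, gray_r)
```

The hard part is the calibration and the rigid mounting, not this loop; any flex in the camera mount invalidates the rectification maps.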

Yes, not every team will want to use this camera, but for the teams who do it is now an option. The ZED camera and the Kinect tackle the problem of depth maps in two completely different ways.

The Kinect utilizes an IR pattern projector and an IR camera, analyzing the pattern deformation to generate the depth map of the environment. This is great for an indoor environment with constant light and no windows; however, try using a Kinect outside and all of a sudden you have a normal RGB camera with essentially no depth detection. The Kinect has a max range of 4 m (~13 ft).

The ZED, however, utilizes dual RGB cameras, which I bet are hardware-synced, to generate a disparity map. This gives it an advantage in both indoor and outdoor environments where light pollution, such as IR light (the sun), would be a problem. The ZED has a max range of 20 m (~65 ft). There is also the option to run at up to 120 FPS, where the Kinect is limited to a max of 30 FPS.
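To make the disparity-to-depth relationship concrete: depth is focal length (in pixels) times baseline divided by disparity, which is also why a pixel of matching error hurts much more at long range. A quick sketch with ballpark numbers, none of which are actual ZED specs apart from the roughly 12 cm baseline:

```python
# depth = focal_px * baseline / disparity. Ballpark numbers, not ZED specs
# (apart from the ~12 cm baseline).
import numpy as np

focal_px = 700.0    # assumed focal length in pixels
baseline_m = 0.12   # ZED stereo baseline, roughly 12 cm

disparity_px = np.array([42.0, 8.4, 4.2])       # matched pixel offsets
depth_m = focal_px * baseline_m / disparity_px  # -> [ 2. 10. 20.] meters
print(depth_m)  # a 1 px matching error costs far more at 20 m than at 2 m
```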

Yes the ZED is an expensive camera compared to the Kinect, but the Kinect was merely adopted into computer vision for robotics, the ZED was born into it, designed for it. :stuck_out_tongue:

Yes, I am well aware of that, having used them extensively since their release for the exclusive purpose of robotics. It’s what makes them so dang cheap.

While I don’t gamble on principle, I would wager a rather large sum of money that the 2016 FRC game will, in fact, happen indoors, although I have to agree that polycarbonate is a nuisance.

If this years game happens outdoors I’ll eat my hat and post progress pics.

Running rather well-optimized SLAM code with the Kinect (and an Xtion Pro) on the ODROID U3 nets between 5 and 15 FPS. I imagine I could get it higher, but I have other projects I need to work on. I think you are either severely overestimating the capabilities of modern single-board computers, or severely underestimating the amount of work a computer has to do to get anything meaningful out of a pointcloud.
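To put a rough number on that (my back-of-envelope, not the poster’s benchmarks): a 640x480 depth frame is about 300k points, so 30 FPS means roughly 9 million points per second before any SLAM math runs. Even a cheap first step like voxel-grid downsampling has to touch all of them; a minimal numpy sketch:

```python
# Back-of-envelope pointcloud workload, with voxel-grid downsampling as the
# cheapest first step. The cloud here is synthetic; real frames come from
# the depth sensor.
import numpy as np

points = np.random.rand(640 * 480, 3) * 4.0   # fake cloud, meters
voxel = 0.05                                  # 5 cm voxel edge

# Bucket each point into a voxel and keep one point per occupied voxel.
keys = np.floor(points / voxel).astype(np.int64)
_, keep = np.unique(keys, axis=0, return_index=True)
print(len(points), "->", len(keep), "points after downsampling")
```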

We all know how that turned out though. XD

Haha. True, I doubt the game will be outdoors, but from personal experience there are fields hosted at venues with large windows, and throughout the day the sun will illuminate and cast shadows on the field or blind the robot. And yes, this would affect almost any camera system, but I believe an IR camera looking for a pattern would be affected the most versus a normal RGB camera.

Water game confirmed? Am I doing this right? :confused: :confused: :confused:

As this is the FIRST community, I bet everyone here would love to see you share your code rather than just talk about it. I understand that working with the pointcloud does take processing power, but you too might be underestimating how far SoC boards have come since the ODROID U3. The thing about the ZED is that it requires an NVIDIA GPU to function, meaning that to use it on a robot a team will need an NVIDIA Jetson TK1, or even better, the new Jetson TX1. So no, a team will not get 120 FPS on the robot, but the fact that the camera can capture at that speed would reduce motion blur. Last year, with our horribly optimized cascade-classifier code, we were getting ~15-25 FPS for green bin detection on a Jetson TK1 with a USB camera.
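For reference, the general shape of a cascade-classifier loop like the one described looks like this in OpenCV Python. This is a sketch, not the team’s actual code, and `bin_cascade.xml` stands in for a classifier you would have to train yourself:

```python
# General shape of an OpenCV cascade-classifier loop; a sketch, not the
# team's code. "bin_cascade.xml" is a hypothetical classifier you train.
import cv2

cascade = cv2.CascadeClassifier("bin_cascade.xml")
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Multi-scale sliding-window detection; this is the expensive step.
    hits = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4)
    for (x, y, w, h) in hits:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
```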

We really want to help FRC teams this year, so we’re offering a big discount on ZED 3D cameras during the competition. More details can be found here.

Given the complexity of the Stronghold challenge this year, 3D vision should be very helpful! :slight_smile:

We put one on order (yay for that discount!), but come to find out they don’t even expect to ship it for two weeks due to an “unusually high number of orders”, and then there’s however long shipping takes… I contacted them and am waiting to hear back whether anything can be done to expedite the process, but it’s pretty tough if you’re only able to get it in the middle of week 4 of build season… that doesn’t leave much room to write working code, much less fine-tune it as part of an actual robot. Guess we will see how it turns out. This year will be extremely difficult in more than one regard.

Hey sanelss, don’t worry, the shipping process has been accelerated.
We shipped to all FRC teams yesterday by UPS, so you should get your ZED today!

On the subject of the Kinect:

  1. Polycarb is a nuisance for any type of light-based imaging: laser, structured IR, or visible light.
  2. While the events are indoors, that doesn’t mean a healthy amount of sunlight won’t make it into the play spaces. I’ve got video from 2014 from our IR camera in an auditorium with sunlight coming from an open gym door blooming out the targets. Similarly, many popular event lighting chemistries emit large amounts of IR.
  3. During 2012, the first year you could use the Kinect, our team tried using it to track balls in flight. The problem is that the color image and the depth image are slightly out of sync, so an approach of, say, finding the orange region of the image and then figuring out how far away it is becomes non-trivial. Going from a stereo disparity map keeps those in sync (see the sketch below).
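A rough sketch of that “find the orange region, then look up its distance” idea, assuming (as with stereo) the depth image is already registered pixel-for-pixel to the color image. The HSV bounds here are guesses you would tune on real footage:

```python
# "Find the orange thing, then ask the depth map how far it is." Assumes
# depth_m is pixel-registered to the color frame; HSV bounds are guesses.
import cv2
import numpy as np

def ball_distance(bgr, depth_m):
    """bgr: color frame; depth_m: per-pixel depth in meters, same size."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (5, 120, 120), (20, 255, 255))  # orange-ish
    if cv2.countNonZero(mask) == 0:
        return None  # no orange pixels found
    # Median over the orange pixels resists stray bad depth values.
    return float(np.median(depth_m[mask > 0]))
```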