Different camera compatibility?

We are a rookie team and did not have a camera in the KOP. I was wondering if it would be possible to use a different camera on the robot? Perhaps:
A Kinect (cheaper, and it has depth perception)
A PlayStation Eye (high frame rates: 60 and 120 FPS)

Would it be easy to get the image off the camera for processing? Or to the driver station?

I was also wondering why the Axis cameras are so expensive.

If you had a rookie kit, I believe a camera should have been in it. Report this to FIRST.

When cameras were evaluated in 2008, the less expensive ones had latency of more than half a second, while the Axis was more like 60 ms. The Axis cameras have hardware to accelerate image compression and generally have a nice programming API.

While you can use other cameras and sensors on the robot, the WPILib support is specific to Axis. Also keep in mind that the controller doesn’t have a USB port, so Kinect and USB cameras will not work without an additional board.

Greg McKaskle

They just did not have any of these cameras for rookies this year; it was not on the checklist.

Any idea how you could process images from a Kinect or PS Eye? And possibly stream some info (distance, angle, etc.) to the driver station?

Also, how would something like a Raspberry Pi handle image processing? What would the latency be like?

That is a lot of work. You will need a separate computer. I tried this and here’s what I got:

A Raspberry Pi communicating with the Kinect, the cRIO, and the Internet/intranet (CTRL).
To use the Kinect under Linux, there is the libfreenect library. Instructions for getting it are here: http://openkinect.org/wiki/Getting_Started

Communicate with the cRIO via I2C from the Pi’s GPIO pins. Expect a minimum of about 10 hours of hacking.
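
For the Kinect side, here is a minimal sketch of the kind of code involved, assuming the libfreenect Python wrapper (the freenect module) and NumPy are installed on the Pi. The raw-to-meters conversion constants are rough community-quoted values, not something I calibrated, so treat them as placeholders.

```python
# Minimal sketch: grab one depth frame from the Kinect via libfreenect's
# Python wrapper and estimate the distance to whatever is at image center.
import freenect
import numpy as np

def center_distance_m():
    # sync_get_depth() returns a (480, 640) array of raw 11-bit depth values
    depth, _timestamp = freenect.sync_get_depth()
    raw = depth[240, 320]                  # raw reading at the image center
    # Rough raw-to-meters approximation often quoted for the Kinect;
    # calibrate these constants against your own unit before trusting them.
    return 0.1236 * np.tan(raw / 2842.5 + 1.1863)

if __name__ == "__main__":
    print("Distance to center: %.2f m" % center_distance_m())
```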

:yikes: :yikes: :yikes: :yikes: :yikes:

Kinect only works over USB, so you would have to purchase a secondary processing platform like the Raspberry Pi or PandaBoard to make it work, plus do more programming. It ends up being more expensive and more work than I would recommend for a team that’s just starting out.

I’m not familiar with the PS Eye, but 60-120 fps is needless overkill, and streaming anywhere near that rate would quickly saturate your bandwidth limit.
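
To put rough numbers on that (the ~7 Mbit/s field bandwidth cap below is from memory, so check the game manual):

```python
# Back-of-the-envelope data rate for streaming raw 640x480 color video.
width, height, bytes_per_pixel = 640, 480, 3
for fps in (60, 120):
    mbit_per_s = width * height * bytes_per_pixel * fps * 8 / 1e6
    print("%3d fps -> ~%.0f Mbit/s uncompressed" % (fps, mbit_per_s))
# Roughly 442 Mbit/s at 60 fps and 885 Mbit/s at 120 fps, against a field
# bandwidth cap on the order of 7 Mbit/s (assumed; check the manual).
# Even heavily compressed, you cannot stream anywhere near that frame rate.
```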

Remember that cameras aren’t always necessary. Many of the powerhouse teams last year started out with cameras and fancy physics calculations but ended up just using strategies like placing their robots in the same spot and using a flashlight to aim manually. It turned out to be more effective.

(trying to consistently place the robot may not be viable this year, but flashlight aiming certainly should be)

I don’t mean to be rude, but using a Raspberry Pi sounds like a terrible idea… what’s your refresh rate to even capture a frame? I might suggest other, more powerful SoCs such as the ODROID, PandaBoard, BeagleBoard, and Gumstix.
Also: OpenCV is tuned for x86. Even copying frames around on ARM is very expensive, and you’ll also be running without a GPU, which further hurts OpenCV performance.
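
If you want an actual answer to the refresh-rate question, a quick timing loop on whatever board you’re considering will tell you. This sketch assumes a plain USB webcam through OpenCV’s cv2.VideoCapture (not the Kinect) and uses a simple threshold as a stand-in for real vision code:

```python
# Rough measurement of capture + processing time per frame on the board
# under test (Raspberry Pi, ODROID, PandaBoard, ...).
import time
import cv2

cap = cv2.VideoCapture(0)                 # first USB camera
if not cap.isOpened():
    raise RuntimeError("no camera found")

times = []
for _ in range(100):
    start = time.time()
    ok, frame = cap.read()                # grab one frame
    if not ok:
        break
    # stand-in for real vision code: HSV convert + green threshold
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (50, 100, 100), (90, 255, 255))
    times.append(time.time() - start)
cap.release()

if not times:
    raise RuntimeError("no frames captured")
print("mean per-frame time: %.1f ms (~%.1f fps)"
      % (1000 * sum(times) / len(times), len(times) / sum(times)))
```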

Finally, I think interfacing with the cRIO over Ethernet would be easier and faster than over I2C.
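
For getting the results off the coprocessor, a tiny UDP sender is about all it takes; the address and port below are placeholders for your own robot network, and you would read the packet back with the same format on the cRIO or driver station:

```python
# Send the vision result (distance, angle) as a small UDP packet.
import socket
import struct

TARGET_IP = "10.0.0.2"    # placeholder: your cRIO or driver station address
TARGET_PORT = 1130        # placeholder: any agreed-upon port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_target(distance_m, angle_deg):
    # two big-endian doubles; unpack with the same format on the receiver
    sock.sendto(struct.pack(">dd", distance_m, angle_deg),
                (TARGET_IP, TARGET_PORT))

send_target(3.25, -4.0)
```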

By the way, looking at the Axis website, the M1013 has a different lens with a wider viewing angle. Also, the Amazon and Newegg pricing doesn’t make sense to me. The M1031 is a little bit fancier than the M1013, but that is a huge price jump.

If you want to be able to use the examples and libraries in WPILib, purchase an Axis 206, 207, M1011, or M1013. AndyMark lists M1011s in stock, but FIRST Choice shows them out of stock.
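
If you end up doing any processing off the cRIO, the Axis cameras can also be read straight over HTTP. The snapshot URL below is the usual Axis endpoint as best I recall, and the IP address is a placeholder, so verify both (and whether your camera needs viewer credentials) against the Axis documentation for your model:

```python
# Grab a single JPEG from an Axis camera over HTTP for off-board processing.
import urllib.request

CAMERA_URL = "http://10.0.0.11/axis-cgi/jpg/image.cgi"   # assumed snapshot URL

with urllib.request.urlopen(CAMERA_URL, timeout=2) as resp:
    jpeg_bytes = resp.read()

with open("snapshot.jpg", "wb") as f:
    f.write(jpeg_bytes)
print("got %d bytes" % len(jpeg_bytes))
```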

Greg McKaskle

(Rookie first post)

I thought the retro-reflective tape only shines back toward the observer holding the light, so wouldn’t you need a camera to aim via flashlight? Or do you just mean using the flashlight for coarse “where am I vaguely pointing” tracking?

The bright light approach doesn’t rely on the retro-reflective tape. It is the equivalent of a car’s headlights. From a helicopter or hilltop, you can see which way a car is pointing because you can see the shape its headlights make on the ground and surrounding objects. By using a narrow beam of light that casts a streak across the carpet and wall, the driver can determine how the robot is oriented.

Greg McKaskle

Cool, thanks. I’ll definitely pass that along to our team.

I believe the Axis cams were available as part of FIRST Choice; I know our team picked one up this year with our points. (Unfortunately, I don’t handle any of the acquisition of stuff, so I don’t know where you access FIRST Choice or how it works.)