Chief Delphi


oswaldonfire 09-02-2012 23:24

Team 3142: Week 5 preview
 
Here is our robot as of today (2/9). We still have a ton of work to do, but things are beginning to come together! And yes, the black box has a Kinect inside.


akoscielski3 09-02-2012 23:25

Re: Team 3142: Week 5 preview
 
Wow, this is close to how we have ours :)

But still not the same :) Great job!!

Joshuamunson 10-02-2012 01:55

Re: Team 3142: Week 5 preview
 
Does the Kinect on the robot actually work? And if so, what role does it play?

TEAMROCK2000 10-02-2012 05:08

Re: Team 3142: Week 5 preview
 
Like Josh was saying, how does the Kinect actually work on the robot? And if you will, how did you get it to work?

oswaldonfire 10-02-2012 12:37

Re: Team 3142: Week 5 preview
 
Yep, the Kinect works - if you look closely, you can see how we put wax paper over the infrared laser projector, effectively blurring the light into a homogeneous field and taking advantage of the Kinect's infrared camera. When coupled with the retroreflective tape on the targets, it gives us a perfect tracking system, completely immune to any changes in visible light. The Kinect is connected to an onboard computer, which does a huge amount of image processing to send a distance value (accurate to the inch) and information on how to move the turret (preliminary testing shows <1 degree accuracy) to the cRIO.
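
For anyone curious, here's a minimal sketch of the idea in Python with OpenCV - this is not the code we actually run, and the field of view, threshold value, and depth units are just assumptions:

Code:

import cv2

HFOV_DEG = 57.0  # rough Kinect horizontal field of view (assumed)

def aim_from_kinect(ir_image, depth_mm):
    """ir_image: 8-bit grayscale IR frame; depth_mm: depth map in millimetres (both assumed)."""
    # The retroreflective tape lit by the diffused IR projector shows up as a
    # bright blob, so a plain threshold isolates it.
    _, mask = cv2.threshold(ir_image, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    target = max(contours, key=cv2.contourArea)
    m = cv2.moments(target)
    if m["m00"] == 0:
        return None
    cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
    # Horizontal pixel offset from the image centre -> turret angle in degrees.
    width = ir_image.shape[1]
    turret_deg = (cx - width / 2.0) / width * HFOV_DEG
    # Distance comes straight from the depth map at the blob centre, in inches.
    distance_in = depth_mm[cy, cx] / 25.4
    return turret_deg, distance_in

The cRIO only ever receives those two numbers; all of the heavy lifting stays on the onboard computer.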

In addition to running the Kinect, the onboard computer processes a feed from a second webcam which is pointed down at the field in front of the robot (not attached in this picture) and sends an augmented-reality video feed back to the driver station, highlighting the closest ball in green (or any other color) and overlaying information to help the driver line the robot up with the ball to pick it up.
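
The overlay itself is conceptually simple; here's a minimal sketch in Python with OpenCV (the HSV range and the "lowest ball is closest" rule are assumptions, not necessarily what our pipeline does):

Code:

import cv2
import numpy as np

BALL_LOW = np.array([0, 120, 80])     # assumed HSV range for the game ball
BALL_HIGH = np.array([10, 255, 255])

def annotate_frame(frame):
    """Ring the nearest ball in green and add a steering hint for the driver."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, BALL_LOW, BALL_HIGH)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return frame
    # With the camera angled down at the floor, the ball lowest in the frame
    # is the one closest to the robot.
    closest = max(contours, key=lambda c: cv2.boundingRect(c)[1])
    (x, y), r = cv2.minEnclosingCircle(closest)
    cv2.circle(frame, (int(x), int(y)), int(r), (0, 255, 0), 3)
    offset_px = int(x) - frame.shape[1] // 2
    cv2.putText(frame, "steer %+d px" % offset_px, (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
    return frame

Each annotated frame is then sent back to the driver station as the video feed the driver sees.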

Dusk Star 10-02-2012 12:54

Re: Team 3142: Week 5 preview
 
Quote:

Originally Posted by oswaldonfire (Post 1123741)
In addition to running the Kinect, the onboard computer processes a feed from a second webcam which is pointed down at the field in front of the robot (not attached in this picture) and sends an augmented-reality video feed back to the driver station, highlighting the closest ball in green (or any other color) and overlaying information to help the driver line the robot up with the ball to pick it up.

Oh my - how does your team have enough time for this!?! Our 5 programmers (including me) have barely gotten tracking (of the backboard) working! Great job!

And what did you use for an onboard computer?

oswaldonfire 10-02-2012 13:12

Re: Team 3142: Week 5 preview
 
We have a mini-ITX computer running onboard; it has a dual-core 1.8GHz Atom processor, 2GB of RAM, a 4GB SSD, and power regulation and supply circuitry that lets it (and the Kinect) run on anywhere from 6-34V DC. We're all in love with it - it's small (around six inches square), draws maybe 30 watts, and only gets slightly warm while doing all its image processing.

Right now we're running into framerate issues while processing both feeds - we're getting only around 6 fps from the Kinect and 12 at best from the other camera - although I attribute this to the fact that we like our nice high resolutions too much. Today we'll try moving down from 640x480, and we should see a huge gain in performance, although the lower the Kinect's resolution, the coarser our distance readings will be. Right now we have a reliable resolution of around 2-3 inches, which is acceptable.
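
If it helps anyone, dropping a webcam's capture resolution is only a couple of lines with OpenCV (a minimal sketch, assuming a plain USB webcam - the Kinect side is configured through its own driver); each halving of the resolution cuts the pixels per frame by four.

Code:

import cv2

cap = cv2.VideoCapture(0)                  # hypothetical camera index
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 320)     # request 320x240 instead of 640x480
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 240)

ok, frame = cap.read()
if ok:
    print("capturing at %dx%d" % (frame.shape[1], frame.shape[0]))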

I should also note that we're running the excellent RoboRealm software on the computer to do our vision processing - it's an amazingly powerful, GUI-based machine vision platform, and RoboRealm is giving free copies to any FIRST team that's interested (it's normally $50). If nothing else, your team should grab a copy to play around with in the off-season.

I've found that it's a great tool for teaching the basics of machine vision, and it's easy enough to learn that it gets students who otherwise wouldn't set foot near C++ excited about programming and computers. I worked with two freshmen on our team to develop the Kinect targeting software, and neither had any previous programming experience. There is also plenty of room for the experienced programmer - RoboRealm has a full-featured API, built-in HTTP and FTP servers, and you can write custom image processing modules and plugins in C, Python, or Visual Basic. Check it out - http://www.roborealm.com/

JohnSchneider 10-02-2012 13:49

Re: Team 3142: Week 5 preview
 
Quote:

Originally Posted by oswaldonfire (Post 1123741)
Yep, the Kinect works - if you look closely, you can see how we put wax paper over the infrared laser projector, effectively blurring the light into a homogeneous field and taking advantage of the Kinect's infrared camera. When coupled with the retroreflective tape on the targets, it gives us a perfect tracking system, completely immune to any changes in visible light. The Kinect is connected to an onboard computer, which does a huge amount of image processing to send a distance value (accurate to the inch) and information on how to move the turret (preliminary testing shows <1 degree accuracy) to the cRIO.

In addition to running the Kinect, the onboard computer processes a feed from a second webcam which is pointed down at the field in front of the robot (not attached in this picture) and sends an augmented-reality video feed back to the driver station, highlighting the closest ball in green (or any other color) and overlaying information to help the driver line the robot up with the ball to pick it up.

WOW, that's awesome. Any way we could get a writeup of this after the season?

MattC9 10-02-2012 13:57

Re: Team 3142: Week 5 preview
 
Holy!!! How much does that BEAST weigh?!

372 lives on 10-02-2012 20:17

Re: Team 3142: Week 5 preview
 
Looks great, except for the weight distribution. Your CG is probably going to be a foot off the ground :/

Keyreaper 10-02-2012 23:28

Re: Team 3142: Week 5 preview
 
I'm also very interested, as many probably are, in how you implemented the Kinect well enough to put it on your robot! :O I give props to your programmer - he's got some real talent if he could do that.

Grim Tuesday 11-02-2012 19:15

Re: Team 3142: Week 5 preview
 
Our programmers have been working on it as well - we use a little Linux computer called a BeagleBoard. What are you guys using?

Cal578 11-02-2012 19:59

Re: Team 3142: Week 5 preview
 
Impressive-looking robot!

Have you calculated your center of gravity? It does look rather top-heavy, just judging from the picture.

Good luck!

tickspe15 12-02-2012 09:29

Re: Team 3142: Week 5 preview
 
Quote:

Originally Posted by oswaldonfire (Post 1123741)
Yep, the Kinect works - if you look closely, you can see how we put wax paper over the infrared laser projector, effectively blurring the light into a homogeneous field and taking advantage of the Kinect's infrared camera. When coupled with the retroreflective tape on the targets, it gives us a perfect tracking system, completely immune to any changes in visible light. The Kinect is connected to an onboard computer, which does a huge amount of image processing to send a distance value (accurate to the inch) and information on how to move the turret (preliminary testing shows <1 degree accuracy) to the cRIO.

In addition to running the Kinect, the onboard computer processes a feed from a second webcam which is pointed down at the field in front of the robot (not attached in this picture) and sends an augmented-reality video feed back to the driver station, highlighting the closest ball in green (or any other color) and overlaying information to help the driver line the robot up with the ball to pick it up.

We (1318) are doing a very similar thing, but with AXIS cameras instead of a Kinect, because we did not want to run Windows on our onboard computer and Microsoft has rules about using the Kinect with non-Windows devices. We are also using a PICO-ITX P830.

oswaldonfire 16-02-2012 19:27

Re: Team 3142: Week 5 preview
 
I should also note that we were about 8 pounds overweight as it appeared in the original photo; the kids have since redesigned the robot to weigh less while also taking 8 inches off its height to help with the center of gravity. I personally feel that there were much better ways to reduce the weight and CG, because now our robot can only hold two balls, or three in a pinch. The students feel that we can make up for this limitation with our shooting accuracy, however.

