Chief Delphi

Chief Delphi (http://www.chiefdelphi.com/forums/index.php)
-   Programming (http://www.chiefdelphi.com/forums/forumdisplay.php?f=51)
-   -   2011 "Light Sensor" (http://www.chiefdelphi.com/forums/showthread.php?t=88937)

cooldude8181 12-01-2011 18:40

2011 "Light Sensor"
 
Has anyone been trying to program their light sensors for this year yet? Are there any examples for Labview that are relevant besides the "simple digital input" one?

Mark McLeod 12-01-2011 19:44

Re: 2011 "Light Sensor"
 
That's all the light sensor is, a simple digital input.

Did you have a more complex task in mind?

davidthefat 12-01-2011 20:01

Re: 2011 "Light Sensor"
 
I assume it returns a boolean. I have not looked at the libraries yet; I am currently teaching the other programmers last year's library (it's technically the same as this year's). How many do they give you? Make it so that three of them point at different spots in front of the robot: the outer two sensors form an angle (how big depends on where you place the sensors) and the middle one sits on the angle bisector. Then check the middle sensor; if it triggers, you're on track. If the left one triggers, you are too far right, so turn left a little, and likewise with the right side. That should be the basic loop. If all three trigger, you are at a cross point or a stop point.
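The three-sensor decision loop described above could be sketched like this (the function name, return strings, and the "search" fallback for a lost line are my own illustration, not anything from the FRC libraries):

```python
def steering_decision(left, middle, right):
    """Map three boolean line-sensor readings to a drive action.

    Layout (as described above): outer two sensors aimed apart ahead
    of the robot, middle sensor on the angle bisector.
    """
    if left and middle and right:
        return "stop"        # all triggered: cross point or stop point
    if middle:
        return "straight"    # on track
    if left:
        return "turn_left"   # drifted too far right
    if right:
        return "turn_right"  # drifted too far left
    return "search"          # no sensor sees the line
```

In a real robot loop you would call this every iteration and translate the returned action into motor powers.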

mwtidd 12-01-2011 20:10

Re: 2011 "Light Sensor"
 
Will your robot be able to strafe? If so, you may want to consider the camera. The way I look at it, the line sensors are best if you can't strafe, but the camera is better if you can.

If you want more details just ask...

davidthefat 12-01-2011 20:25

Re: 2011 "Light Sensor"
 
Quote:

Originally Posted by lineskier (Post 998515)
Will your robot be able to strafe? If so, you may want to consider the camera. The way I look at it, the line sensors are best if you can't strafe, but the camera is better if you can.

If you want more details just ask...

Details, please. I mean, details on why you believe that. I don't see why the line sensors would be any less efficient on a strafing bot than on a robot that can't strafe. Is it because the strafing bot has to change the orientation of its wheel modules (I think that's called crab drive) or reverse its motors (mecanum or omni) to strafe? The robot has to do those actions even with a camera.

Charlie675 12-01-2011 20:29

Re: 2011 "Light Sensor"
 
How do you follow the line with the camera? That sounds pretty awesome. You would probably have to go slow though because the camera's refresh rate is terrible.

Joe Ross 12-01-2011 20:31

Re: 2011 "Light Sensor"
 
Quote:

Originally Posted by cooldude8181 (Post 998414)
Has anyone been trying to program their light sensors for this year yet? Are there any examples for Labview that are relevant besides the "simple digital input" one?

Have you looked at the autonomous vi in the framework with game specific code?

Mark McLeod 12-01-2011 20:37

Re: 2011 "Light Sensor"
 
NI has published a 2011 line tracking paper too.

davidthefat 12-01-2011 20:39

Re: 2011 "Light Sensor"
 
Quote:

Originally Posted by Joe Ross (Post 998550)
Have you looked at the autonomous vi in the framework with game specific code?

Talking to me? No sir, I have not looked at anything 2011 programming related. Even with last year's library, it would not be hard to write basic line-tracking software using the camera. The NI vision library is very comprehensive; it has everything from edge detection to object and motion detection. It makes it very easy for programmers to just get up and get going.


Quote:

Originally Posted by Charlie675 (Post 998546)
How do you follow the line with the camera? That sounds pretty awesome. You would probably have to go slow though because the camera's refresh rate is terrible.


I would have to disagree with you on that. Done correctly, the camera would be significantly faster than the line-tracking hardware. The line trackers only see a single "pixel" compared to the camera's capabilities. Using trigonometric functions and maybe some probability, it would be easy to navigate along the line. The lines are not going anywhere, so the software can make decisions ahead of time.

Charlie675 12-01-2011 21:17

Re: 2011 "Light Sensor"
 
Quote:

Using trigonometric functions and maybe some probability, it would be easy to navigate along the line. The lines are not going anywhere, so the software can make decisions ahead of time.
If someone gets this to work effectively I have to see it.

davidthefat 12-01-2011 21:20

Re: 2011 "Light Sensor"
 
Quote:

Originally Posted by Charlie675 (Post 998600)
If someone gets this to work effectively I have to see it.

Apparently our team flushed the whole idea of autonomous driving down the toilet; they do not trust me anymore. Last year our autonomous mode was absolutely hideous. It just barely worked, and 90% of the time it missed. Long story short: the team captain came to me literally the day before shipping and said, "David, I want you to get the autonomous to work," and then I saw the IR sensor he had installed in front of the kicker. So I coded and debugged the autonomous code AT the competition... So that person might not be me. But I will certainly bring up to my team that they can indeed trust me. I already got the drive working in three ways: linear, exponential, and tank drive. They might even want a logarithmic one; that's not hard at all, I could finish it in 10 minutes. The autonomous might take a couple of days, but sure, check back with me next week.

mwtidd 12-01-2011 21:40

Re: 2011 "Light Sensor"
 
Quote:

Originally Posted by davidthefat (Post 998539)
Details por favor. I mean details on why you have that belief. I don't see why the line sensors would not be as efficient on a strafing bot than on a robot that can't. Is it because the strafing bot has to change its rotation of the motors(I think these are called crab drive) or reverse the motors (mecanium or omni) to strafe? The robot has to do those actions even with a camera.

No, our chassis right now is all omni wheels, so we can strafe at no disadvantage. The camera returns the x, y, and size of the target, so for a strafing bot centering the target is easy, and then you just drive straight, keeping the target in the center. For a non-strafing bot you have to correct all the way to the target. Granted, I still think the camera is the best choice overall; line trackers will work almost as well as the camera on a non-strafing bot. On a strafing bot, I think the camera is the better choice.
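A minimal sketch of that centering idea for a strafing bot, assuming the camera code hands you the target's x pixel coordinate (the image width, gain, and deadband below are illustrative numbers I made up, not values from any FRC library):

```python
def strafe_command(target_x, image_width=320, k_p=0.005, deadband=10):
    """Proportional strafe power from the target's x pixel coordinate.

    Positive = strafe right, negative = strafe left, 0.0 = target is
    centered and it's safe to just drive straight at it.
    """
    error = target_x - image_width / 2.0    # pixels off-center
    if abs(error) < deadband:
        return 0.0                          # centered: drive straight
    return max(-1.0, min(1.0, k_p * error)) # clamp to motor range
```

Each camera frame you would feed the new target x in and apply the returned power to the strafe axis while driving forward.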

Quote:

Originally Posted by Charlie675 (Post 998546)
How do you follow the line with the camera? That sounds pretty awesome. You would probably have to go slow though because the camera's refresh rate is terrible.

With the camera I look at the target, not the lines. While the refresh rate is not great, capping with the camera will be more accurate. Even the pretty slick line tracker in the video was jerking around quite a bit. If I were to attempt to cap with a line tracker, I would require a gyro for it to be accurate (75% success rate is typically my bar); with the camera you do not need a gyro because you are looking right at the target.

I believe the average automated camera cap will be faster than a human cap, and more accurate than a line tracker cap.

Personally, for this task I would rather use dead reckoning than a line tracker. If you threw a gyro and a range finder on your robot, figured out the distance, and said "drive straight," I think you could make the 75% cutoff for autonomous.
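That gyro-plus-range-finder dead reckoning could look roughly like one iteration of the loop below (the gains, the sign convention of positive heading meaning drifted right, and the sensor units are all my assumptions for illustration):

```python
def drive_outputs(gyro_heading_deg, range_m, target_range_m,
                  cruise=0.6, k_heading=0.03):
    """One control-loop step of a gyro-corrected straight drive.

    Returns (left, right) motor powers in [-1, 1]; stops once the
    range finder says the target distance has been reached.
    """
    def clamp(v):
        return max(-1.0, min(1.0, v))

    if range_m <= target_range_m:
        return (0.0, 0.0)                      # arrived: stop
    correction = k_heading * gyro_heading_deg  # steer back toward 0 deg
    # Drifted right (positive heading): slow the left side won't help --
    # slow the LEFT? No: to turn back left, left side slower, right faster.
    return (clamp(cruise - correction), clamp(cruise + correction))
```

Run it every loop with fresh gyro and range-finder readings until it returns (0.0, 0.0).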

davidthefat 12-01-2011 21:48

Re: 2011 "Light Sensor"
 
Quote:

Originally Posted by lineskier (Post 998626)
No, our chassis right now is all omni wheels, so we can strafe at no disadvantage. The camera returns the x, y, and size of the target, so for a strafing bot centering the target is easy, and then you just drive straight, keeping the target in the center. For a non-strafing bot you have to correct all the way to the target. Granted, I still think the camera is the best choice overall; line trackers will work almost as well as the camera on a non-strafing bot. On a strafing bot, I think the camera is the better choice.



With the camera I look at the target, not the lines. While the refresh rate is not great, capping with the camera will be more accurate. Even the pretty slick line tracker in the video was jerking around quite a bit. If I were to attempt to cap with a line tracker, I would require a gyro for it to be accurate (75% success rate is typically my bar); with the camera you do not need a gyro because you are looking right at the target.

I believe the average automated camera cap will be faster than a human cap, and more accurate than a line tracker cap.

Oh! I totally misunderstood what you meant by camera tracking; I thought you meant tracking the line with the robot. With your method, though, be careful to watch out for other robots so you don't run into them... I know from experience: I was looking straight ahead and did not see the bench right in front of me...

zbanks 12-01-2011 21:49

Re: 2011 "Light Sensor"
 
To address the OP, the light sensors are pretty easy to use... almost too easy.

They're purely digital. If you power them, you can use the indicator LED to calibrate them. The knob sets the sensitivity, so you don't have to set it in code. Two of the wires are for power; the other two are outputs. One is normally open, the other is normally closed. You only have to wire up one: I'd suggest normally open, but do whichever makes more sense for you.

Once it's on your robot, you'll get one boolean value over light areas and the other over dark ones.

Theoretically, you only need one sensor to track a line: dark means go left, light means go right, and you'll end up following the right edge of the line. With two, you could have them straddle the line and react accordingly, giving you a bit more hysteresis.
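The one-sensor bang-bang idea above is about as small as line following gets; a sketch, with motor powers that are purely illustrative:

```python
def edge_follow(sensor_sees_light):
    """Single-sensor bang-bang follower for the right edge of the line.

    Returns (left_power, right_power). Over the light tape we veer
    right, back toward the edge; over dark carpet we veer left, back
    onto the tape.
    """
    base, turn = 0.5, 0.25
    if sensor_sees_light:
        return (base + turn, base - turn)   # on the line: steer right
    return (base - turn, base + turn)       # off the line: steer left
```

Called every loop iteration, this oscillates along the line's right edge; a second sensor straddling the line would let you add a "go straight" state between the two turns.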

If your robot has an unconventional drive system, it will work on the same exact principle. I wouldn't think you would run into any additional problems unless your drive system itself fails.


Strangely enough, I think KISS might apply here. If you can calibrate your encoders correctly, I wouldn't be surprised if you could forgo the line sensors entirely. Encoders aren't sensitive to light or field conditions, and they're extremely simple to mount. Also, the lines on the field help so much: the two end lines put you between two columns of pegs, and you'll have to maneuver a bit more anyway.
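For the encoder route, the calibration boils down to one conversion from ticks to distance; a sketch, where the ticks-per-revolution and 6 in (0.1524 m) wheel diameter are made-up calibration values you'd have to measure on your own drivetrain:

```python
import math

def encoder_distance_m(ticks, ticks_per_rev=360, wheel_diameter_m=0.1524):
    """Convert drive-encoder ticks to distance traveled in meters."""
    return ticks / ticks_per_rev * math.pi * wheel_diameter_m

def drove_far_enough(ticks, target_m):
    """True once the encoders say the planned distance is covered."""
    return encoder_distance_m(ticks) >= target_m
```

With that, autonomous becomes "drive straight until `drove_far_enough` is true," with the field's end lines positioning you between the peg columns.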

mwtidd 12-01-2011 21:54

Re: 2011 "Light Sensor"
 
Going back to '04, Buzz Robotics used a line tracker. Note they had 5+ sensors on the front of their robot. And in that game accuracy was much less of a requirement, which is why line tracking was a good strategy.

