Last minute autonomous help
My team is scrambling to put together a working autonomous routine.
Our code so far is a modified version of the Java sample line tracker. We had to rework it to run off two sensors because, unfortunately, one of our three line sensors arrived broken. We understand that all the sample code does is move the robot forward for 8 seconds, but even with this sample code loaded, our robot does zilch. We're just trying to get our robot to show some sign of life when it's put in autonomous mode. Can anybody help us out? All suggestions are appreciated. :D Here's what we have loaded, autonomous-wise: Code:
public void autonomous() {
Re: Last minute autonomous help
You say two? Good luck with that. It can be done, but you will need either a gyro or encoders. Keep a record of the speeds, or of the angle the robot has veered off to. Mount one line tracker in the front and the other in the back so one is always triggered, and do some calculations from there. I too still have to work on our autonomous; I will be writing mine from scratch.
Re: Last minute autonomous help
We don't use line tracking; we use encoders. We drive a fixed distance straight and place a piece.
Since you're running so late, try code that powers your drive forward at, say, 1/3 power for 5 seconds. Tune it until the robot just bumps the wall, then place a piece. Make sure you use a fresh battery. I think that's your best bet for a last-minute autonomous BEFORE you ship. Alternatively, look up the teams going to your first regional and see if there's anyone like my team who hasn't used their line sensors and wouldn't mind giving you one.
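For anyone wanting to try this, here is a minimal sketch of the timed approach. The `Drive`-loop wiring and both constants are made-up starting points, not tested values; tune them on carpet.

```java
// Sketch of the timed autonomous suggested above: drive forward at ~1/3
// power for a fixed number of seconds, then stop. Everything here is a
// hypothetical stand-in for your real drivetrain code.
public class TimedAuto {
    static final double DRIVE_POWER = 0.33;   // ~1/3 power (assumed starting point)
    static final double DRIVE_SECONDS = 5.0;  // tune until you just bump the wall

    /** Motor power to command, given seconds since autonomous started. */
    public static double powerFor(double elapsedSeconds) {
        return (elapsedSeconds < DRIVE_SECONDS) ? DRIVE_POWER : 0.0;
    }

    public static void main(String[] args) {
        // Simulated 50 Hz loop, like a typical robot control loop.
        for (double t = 0; t < 7.0; t += 0.02) {
            double power = powerFor(t);
            // robotDrive.arcadeDrive(power, 0); // hypothetical call on a real robot
        }
        System.out.println(powerFor(2.0)); // 0.33
        System.out.println(powerFor(6.0)); // 0.0
    }
}
```

The payoff of keeping the time and power in named constants is that "tune it until it just bumps the wall" becomes editing two numbers between practice runs.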
Re: Last minute autonomous help
Are you getting any response when you deploy? If the robot does nothing, the code is probably quitting unexpectedly, which should show up as an error message on your console. As for working with two photosensors, if you are dead set on using them (instead of other sensors or dead reckoning, as has been suggested), I would place them directly adjacent to each other so they can both be tripped by the tape at the same time, and work from that. There may be some interference, however (one sensor receiving the reflected light from the other and returning a false positive), and it obviously won't be ideal.
Re: Last minute autonomous help
Personally I find encoders a more reliable way of building an autonomous routine. You simply drive until you hit a certain number of encoder counts. Then you move your manipulator. Then you drive some more. Then you place. Then you reverse.
If you're using the Iterative Robot template, the easiest way to do this is to create separate state machines for the drive and for each of your mechanisms. Then you create a main state machine that simply feeds those other machines target values and waits for the drive or manipulator to be in place. Once the structure was created, fine-tuning the numbers only took a few hours, and now we can place on any of the high pegs and the middle second-level pegs reliably (as long as we're using a fresh battery, that is).
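The layered state-machine idea can be sketched like this. The state names and the "done" flags are hypothetical illustrations, not this team's actual code; in a real robot the sub-machines for drive and arm would set those flags when they reach their targets.

```java
// A minimal main sequence that waits for the drive or arm to report
// "in place" before advancing, as described above. All names are
// illustrative; wire the flags to your own drive/arm state machines.
public class AutoSequence {
    enum State { DRIVE_TO_RACK, RAISE_ARM, PLACE_TUBE, REVERSE, DONE }

    State state = State.DRIVE_TO_RACK;

    /** Call once per control loop with the sub-machines' "finished" flags. */
    public void step(boolean driveAtTarget, boolean armAtTarget) {
        switch (state) {
            case DRIVE_TO_RACK: if (driveAtTarget) state = State.RAISE_ARM; break;
            case RAISE_ARM:     if (armAtTarget)   state = State.PLACE_TUBE; break;
            case PLACE_TUBE:    state = State.REVERSE; break; // open claw, then back away
            case REVERSE:       if (driveAtTarget) state = State.DONE; break;
            case DONE:          break; // sit still until teleop
        }
    }

    public static void main(String[] args) {
        AutoSequence seq = new AutoSequence();
        seq.step(false, false); // still driving
        seq.step(true, false);  // drive done -> raising arm
        seq.step(true, true);   // arm done -> placing
        seq.step(true, true);   // placed -> reversing
        System.out.println(seq.state); // REVERSE
    }
}
```

Because the main machine only reads "am I there yet?" flags, each mechanism can be tuned in isolation, which is what makes the few-hours tuning time plausible.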
Re: Last minute autonomous help
Quote:
The other thing about encoders is that with some math you can convert counts to feet. You know how far back you start, so drive N feet forward and then place the piece. All the fiddling is with the arm then.
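The counts-to-feet math mentioned above looks like this. The encoder resolution, gearing, and wheel size in the example are illustrative numbers, not from this thread; plug in your own.

```java
// Convert raw encoder counts to feet traveled, as described above.
public class EncoderMath {
    /**
     * counts:          raw encoder counts read since the start of the move
     * countsPerRev:    encoder counts per encoder-shaft revolution
     * gearRatio:       wheel revolutions per encoder revolution
     * wheelDiameterIn: wheel diameter in inches
     */
    public static double countsToFeet(int counts, int countsPerRev,
                                      double gearRatio, double wheelDiameterIn) {
        double wheelRevs = (counts / (double) countsPerRev) * gearRatio;
        return wheelRevs * Math.PI * wheelDiameterIn / 12.0; // inches -> feet
    }

    public static void main(String[] args) {
        // Example: 6 in wheel, direct drive, 250-count encoder, 1000 counts read:
        System.out.println(countsToFeet(1000, 250, 1.0, 6.0)); // ~6.28 ft
    }
}
```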
Re: Last minute autonomous help
Quote:
Any other teams out there willing to share how they use their encoders in autonomous? :D
Re: Last minute autonomous help
Quote:
Here's the relevant code. Code:
leftDist = bot.dt.getLeftDist() - leftStart;
gain is the initial value that gets toned down as you drive (currently using 0.7). Hopefully the rest of the code is clear enough that it makes sense. If you have any more questions, feel free to ask. Also, you can PM me if you want to see our full autonomous code.
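Since the full code isn't shown, here is one guess at how a distance-scaled steering gain like that might be used: correct by the left/right encoder difference, with the gain starting at 0.7 and shrinking as the robot nears its target. The linear decay formula and all names are assumptions, not the poster's actual implementation.

```java
// Hypothetical drive-straight correction in the spirit of the snippet
// above: steer by the encoder difference, with a gain that is "toned
// down" as you approach the target distance. The decay law is assumed.
public class DriveStraight {
    static final double INITIAL_GAIN = 0.7; // the 0.7 mentioned in the post

    /** Steering correction: positive means the left side is ahead. */
    public static double correction(double leftDist, double rightDist,
                                    double targetDist) {
        double traveled = (leftDist + rightDist) / 2.0;
        double remaining = Math.max(0.0, targetDist - traveled);
        double gain = INITIAL_GAIN * (remaining / targetDist); // fades to 0 at target
        return gain * (leftDist - rightDist);
    }

    public static void main(String[] args) {
        // Left side 0.2 ft ahead early in a 19 ft run: strong correction.
        System.out.println(correction(2.2, 2.0, 19.0));
        // Same error near the end of the run: much gentler correction.
        System.out.println(correction(18.2, 18.0, 19.0));
    }
}
```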
Re: Last minute autonomous help
Quote:
If you don't mind, could you please send me the code for your drive base? I'm sure I can learn from it, as I can't find many resources that explain these concepts. :O
Re: Last minute autonomous help
Quote:
Is there any reason you can't overshoot? Driving fast to the wall and then slowly backing up 6 inches may be better, unless that smashes your arm or something. Don't forget, you have 15 seconds: consistent, even if slow, is better than inconsistent and fast. Hope that helps a bit.
Re: Last minute autonomous help
At the Waterford District we used encoders, drove a set number of cycles forward, and then had our arm code place our tube. It only worked 50% of the time, though, because our field people liked to not line the robot up straight, so it went off to the side quite a bit.
At Ann Arbor we used line trackers to keep our robot straight while using encoders to set our distance, but our field people would place the robot too far forward or back, so it still wasn't reliable. At Troy we're looking forward to trying a camera positioned on our arm to locate the top peg and score, with little or no use for the encoders or line trackers. As for using only 2 line trackers, I wouldn't recommend it. Last year we tried that and spent half a competition trying to borrow one after 5 unsuccessful matches.
Re: Last minute autonomous help
Did any other team not use encoders/ultrasonic for their autonomous?
My team only had line sensors, so we're counting on timers to do the rest of the work for us (the arm). If you did the same, how well did the timers work out for your team? Do you think we would have enough time to implement encoders on our robot in the pits?
Re: Last minute autonomous help
Quote:
How long would it take to implement encoders? Assuming you plan for it, it should be quick and doable with one or two runs to the practice field.

BEFORE YOU GET THERE
1. If you have the encoders at your shop, solder the wires so that you can plug them straight into the Jaguars or the Digital Sidecar.
2. Figure out where you're mounting the encoders (CIMple Boxes?) and the gear ratio to the wheels.
3. Figure out the diameter of the wheel and the gear ratio between the encoder and the wheel.
4. Use the wheel diameter and gear ratio to calculate how many feet you travel in one rotation of the encoder.
5. Write code using the above equation and P (and possibly ID) control to get to your target position (I think the starting location is ~19 feet from the driver station).
6. Double-check your code.

WHEN YOU GET THERE
1. Attach the encoders to the robot.
2. Wire the encoders.
3. With the robot on blocks, check that autonomous behaves sanely (have somebody on the E-stop).
4. Test on the practice field (have somebody on the E-stop).
5. Assuming the behavior was close enough to sane...
6. Continue testing on the practice field and in matches until it's consistently scoring.

(Something similar applies to the arm... it might need more math, though.) Hopefully that helps. The great thing about sensors is that you can write a lot of code ahead of time. Good luck if you go through with using encoders.
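Step 5 above (P control on encoder distance) might look roughly like this. The gain, the power cap, and the tolerance are made-up starting values, and the "simulation" in main is only a crude sanity check, not a robot model.

```java
// Proportional drive-to-distance, sketching step 5 of the list above.
// kP, the cap, and the tolerance are assumed starting values to tune.
public class DriveToDistance {
    static final double KP = 0.25;          // motor power per foot of error
    static final double MAX_POWER = 0.5;    // cap so the robot doesn't slam the wall
    static final double TOLERANCE_FT = 0.1; // close enough: stop

    /** Drive power for the current distance traveled, in feet. */
    public static double powerFor(double distanceFt, double targetFt) {
        double error = targetFt - distanceFt;
        if (Math.abs(error) < TOLERANCE_FT) return 0.0;
        double power = KP * error;
        return Math.max(-MAX_POWER, Math.min(MAX_POWER, power));
    }

    public static void main(String[] args) {
        // Crude check: pretend position moves proportionally to power each loop.
        double pos = 0.0;
        for (int i = 0; i < 500; i++) {
            pos += 0.2 * powerFor(pos, 19.0); // fake robot response
        }
        System.out.println(Math.abs(pos - 19.0) < 0.15); // true: settles near target
    }
}
```

Pure P with a deadband like this can stop slightly short; that is one reason the post hedges with "and possibly ID".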
Re: Last minute autonomous help
Quote:
We're using C++. Our autonomous program manages a state machine that gets input from the light sensors and decides what to do based on them. Our algorithm is as follows (sensor values are left/center/right):

000 -> move(): go forward until you see the line
010 -> move(): this is ideal, we're on the line
100 -> correctLeft(): move left to see the line
110 -> also correctLeft()
001 and 011 -> correctRight()
101 -> at the fork; correctLeft() or correctRight() based on a switch on the robot
111 -> placeTube()

After executing placeTube(), the program quits the state machine, retreats back several feet, then turns on the spot (~180 degrees, but it's not perfect since we don't have encoders or a gyro). Generally, getting the tube on the rack takes about ten seconds, and we're only driving the motors at 40% power for driving and 60% for turning.
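That decision table translates almost directly to code. Here it is as a Java sketch (the original robot is C++); the bit encoding and method-name strings are my own rendering of the table, not the team's source.

```java
// The left/center/right sensor table above as a single switch.
// Bit encoding (left<<2 | center<<1 | right) is an illustrative choice.
public class LineFollowStates {
    public static String stateFor(boolean left, boolean center, boolean right,
                                  boolean forkGoesLeft) {
        int bits = (left ? 4 : 0) | (center ? 2 : 0) | (right ? 1 : 0);
        switch (bits) {
            case 0b000:
            case 0b010: return "move";          // off the line, or centered on it
            case 0b100:
            case 0b110: return "correctLeft";   // line is to our left
            case 0b001:
            case 0b011: return "correctRight";  // line is to our right
            case 0b101: return forkGoesLeft ? "correctLeft"
                                            : "correctRight"; // at the fork
            default:    return "placeTube";     // 111: the T at the rack
        }
    }

    public static void main(String[] args) {
        System.out.println(stateFor(false, true, false, true)); // move
        System.out.println(stateFor(true, true, true, true));   // placeTube
    }
}
```

Packing the three booleans into one integer makes every row of the table a switch case, so the code stays a faithful, auditable copy of the table.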