Follow up: How is your auton?

Now that the build is over, what state is your autonomous in?

I’ve written some dead-reckoning code to grab a hanging tetra in autonomous. Unfortunately, I haven’t been able to test it because we “finished” the robot only a few hours before shipping. I guess that’s what Thursday is for. :wink:
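
(For the curious, “dead-reckoning” here just means a pre-scripted sequence of timed moves with no sensing of the tetra itself. The general shape is something like the sketch below; every duration and PWM value in it is a made-up placeholder, not the real tuned numbers.)

```c
/* Sketch of a dead-reckoned "grab the hanging tetra" sequence: a fixed
   script of steps advanced by elapsed loop count, with no feedback from
   the target. All durations and PWM values are placeholders. */
#include <stdio.h>

#define LOOPS_PER_SEC 38            /* roughly the IFI user loop rate (~26 ms/loop) */

typedef struct {
    int end_loop;                   /* step runs until the loop counter reaches this */
    int drive_pwm;                  /* 127 = stop, >127 = forward */
    int arm_pwm;                    /* 127 = hold, >127 = raise   */
} Step;

static const Step script[] = {
    { 2 * LOOPS_PER_SEC, 200, 127 },    /* drive out toward the goal    */
    { 3 * LOOPS_PER_SEC, 127, 200 },    /* stop and raise the arm       */
    { 4 * LOOPS_PER_SEC, 160, 127 },    /* creep forward onto the tetra */
};

/* Pick the outputs for the current loop count; neutral once the script ends. */
void auton_step(int loop, int *drive, int *arm)
{
    for (unsigned i = 0; i < sizeof script / sizeof script[0]; i++) {
        if (loop < script[i].end_loop) {
            *drive = script[i].drive_pwm;
            *arm   = script[i].arm_pwm;
            return;
        }
    }
    *drive = 127;                   /* script finished: everything neutral */
    *arm   = 127;
}

int main(void)
{
    int d, a;
    auton_step(50, &d, &a);         /* ~1.3 s in: still driving out */
    printf("drive=%d arm=%d\n", d, a);
    return 0;
}
```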

You forgot that some teams will go to the loading zones in auto mode.

I am a bit disappointed in our auto mode.

We are able to get ONE of the vision tetras, place it on the middle goal, then find the second one and get it picked up, but we can only score it on one of the side goals. We don’t have time to drive back to the center and get it onto the higher center goal.

I think we need to recruit a new programming team.

:wink: :wink: :wink: :wink:

ps - just kidding / we really can’t get to the side goal with the second tetra :wink:

ps again - really just kidding / I don’t know what our AUTO mode does; I fell asleep in the workroom about 3 AM and didn’t wake up until the FedEx truck was pulling away.

At this point we can pick up vision tetras and go towards goals, but the robot shipped before I could test the capping. I will likely have the alternate program to knock down side-goal tetras done within the next couple days (although it will have no testing on the robot till Rochester next week). We’ll see if I decide to do any more :D.

Our bot just goes straight and we pray that we are a nuisance to some other autonomous robot :slight_smile:

I have written an Autonomous program that will get a tetra from the auto loader and place it on a small goal, then attempt to get another tetra from the auto loader and do it again. Six weeks’ worth of great code that has never been tested; I hope there are not too many bugs in it. I only had time to make sure that the Hall effect sensors were actually working and my distance calculation was not totally useless. But if my code turns out to be as bug-filled as IE6, then I will have to write a not-so-elegant Auton mode to knock down a hanging tetra, rev our cooling fan, and flash the feedback lights feverishly. :smiley:
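
For anyone wondering, a distance calculation like that usually boils down to sensor counts times distance-per-count. A minimal sketch of the idea (the counts-per-revolution and wheel size below are made-up example values, not my actual constants):

```c
/* Sketch: convert Hall effect sensor counts into distance traveled.
   Both constants below are hypothetical example values. */
#include <stdio.h>

#define COUNTS_PER_WHEEL_REV 128.0  /* sensor pulses per wheel revolution (assumed) */
#define WHEEL_DIAMETER_IN    6.0    /* wheel diameter in inches (assumed) */
#define PI                   3.14159265

/* Distance in inches represented by a given number of sensor counts. */
double counts_to_inches(long counts)
{
    double circumference = PI * WHEEL_DIAMETER_IN;
    return ((double)counts / COUNTS_PER_WHEEL_REV) * circumference;
}

int main(void)
{
    /* e.g. 1024 counts -> 8 wheel revolutions -> roughly 150.8 inches */
    printf("1024 counts = %.1f inches\n", counts_to_inches(1024));
    return 0;
}
```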

we go “wheeeeeeeee!”–roundy round in circles…

not really. What the code’s supposed to do is knock off a hanging tetra, then turn and head over to the human loading platform to receive a tetra at the end of the autonomous mode.

unfortunately, testing has been sporadic, and the latest ‘fix’ I added to rework the code never got tested. so… everything’s up in the air as to whether it actually accomplishes everything it’s supposed to. but I guess we’ll find that out at regionals, won’t we? :smiley:

Until then… this kat’s gonna catch up on some LONG-overdue sleep.

'Night all!

~kat

My test robot (using a frame with broken welds) was able to track green and yellow a few weeks ago.

My actual robot is able to grab tetras, assuming the pots are calibrated.

My robot should be able to drop the tetra onto a goal, assuming the code fits together and the arm doesn’t drop the tetra.

The combination of all of this has not been tested, because I could only test in the intervals when the build members weren’t working on the robot.

My team messed around with the pots on my robot’s arm two hours before shipping, depriving me of my needed testing time while probably breaking the calibration of the pots. So I am probably not going to do very well in autonomous mode… but I’ll see on Thursday.

I can at least keep tweaking the code until then, hopefully making it more likely to work…

I’m sure that many teams are in a similar situation.

At least the driving straight code works to some extent!
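
For what it’s worth, one simple way to make “drive straight” work is a proportional trim on the difference between the left and right wheel counts. Roughly the shape below, with made-up names and gains rather than my real code:

```c
/* Sketch: keep a two-motor drive going straight by trimming the PWM
   outputs based on the left/right count difference. Names, the base
   output, and the gain are all assumptions for illustration. */
#include <stdio.h>

#define BASE_PWM 200    /* nominal forward output; 127 is neutral on IFI-style PWMs */
#define KP       2      /* proportional gain on the count difference (assumed)      */

static int clamp_pwm(int value)
{
    if (value < 0)   return 0;
    if (value > 254) return 254;
    return value;
}

/* Given running left/right counts, compute corrected left/right outputs. */
void drive_straight(long left_counts, long right_counts, int *left_pwm, int *right_pwm)
{
    long error = left_counts - right_counts;     /* > 0 means the left side is ahead */
    *left_pwm  = clamp_pwm(BASE_PWM - KP * (int)error);
    *right_pwm = clamp_pwm(BASE_PWM + KP * (int)error);
}

int main(void)
{
    int l, r;
    drive_straight(105, 100, &l, &r);            /* left side is 5 counts ahead */
    printf("left=%d right=%d\n", l, r);          /* left slows down, right speeds up */
    return 0;
}
```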

If everything goes as planned, this programmer here is going to write a simple autonomous today, tomorrow, Thursday, and at regionals to start off with a tetra, cap a corner goal, and knock down a tetra. This is what happens when I only get 2 hours with the bot, one of which was spent creating our drive-by-wire control (which, if I do say so myself, is spiffy and makes driving so much easier. Yaa. Go stock Hall effect sensors). Anywho, here’s to a lot of praying and guessing numbers!

-Tony K

We had a good programming team this year. We can cap the center goal and the side goal with 90% efficiency. It takes us about 8 seconds to cap the center goal and about 7 to cap the goal behind.
We can also go and mess up a robot that has vision capabilities; imagine us driving right into you guys, over and over again in autonomous. No capping for you :stuck_out_tongue: .

Due to an unfortunate mishap with the programming USB-to-serial cable (sabotage by a driving team needing practice?), we had to stop just short of working perfectly. I’m guessing a late Thursday night at the regional.

Due to a slow construction team, school bureaucracy, and FedEx not being nice, we weren’t able to test our autonomous code, so it might work, but most probably not.

After reading these responses I’m feeling better (though not much) about the state of our autonomous modes. We’ve got 9 unique autonomous modes, switchable depending on which side of the field we’re on, but they have never been tested! While we were strapping our robot into the crate yesterday we still had others continuing assembly, so the only autonomous mode that we know works 100% is the one that tells it to sit and do nothing–we can do that reliably. :slight_smile:

well–we’re supposed to have it–um, dunno–I have never seen it–it’ll be an interesting Thursday as usual–perhaps we’ll get it perfect before nationals–hee hee :wink:

Imagine you getting disqualified:

Q: If a team moves to the opposite end of the field in autonomous mode and starts “zooming” back and forth across the field, obviously to disrupt opposing robots’ autonomous modes, can they be DQ’d for high speed ramming per <G25>?
A: <G25> applies in all modes. If your robot is traveling at high speed during autonomous mode and rams another robot, it is very possible that you would be disqualified, although the decision will depend on the particular situation and is at the discretion of the head referee.

Wow, I feel better after reading this thread, too. We tried the camera thing but it didn’t work out… seeing as we only have one programmer and she only knew Java, we discarded the camera and worked with what we had.

Only problem is, this programmer has not seen the auto code actually WORK; the one day it was tested was the day she had a tennis game… so we’ll see on Thursday, won’t we? x.x’ Anticipation…

Our Auton is pretty much untested because I did not get our robot until Monday at about 5:30, and then the mechanical team took it back again at about 6:30. This is not much different than any other year, so hopefully it will all come together during the practice day. I know we will be able to knock off the hanging tetra and move to get in front of another robot. I am hoping we will be able to take a tetra from the automated loading station as well.

I’m kind of the programmer’s assistant. That is, I don’t really do much with the code besides working out the occasional algorithm. Programming our autonomous was a wee bit impossible since we didn’t get a finished robot until a few hours before we put 'er in a big box addressed to Ypsilanti.
We should be able to track down the green tetra, which we tested successfully with the kit chassis (before they ripped the motors out for the real bot). Past that, we might be able to pick it up, but we don’t really have any code for going to the center goal. I worked out the geometry for position tracking using the Hall effect sensors, but I don’t know if Mike ever got around to coding it in (I left about 12 hours before he did because I got some influenza).
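
In case it helps anyone, the usual geometry for that kind of position tracking is plain differential-drive dead reckoning: turn the left/right counts into wheel distances, then update heading and position every loop. A sketch of that idea (not Mike’s code; the track width is an assumed number):

```c
/* Sketch: differential-drive position tracking from per-loop left/right
   wheel distances. The track width and all values are assumptions. */
#include <stdio.h>
#include <math.h>

#define TRACK_WIDTH_IN 24.0   /* distance between left and right wheels (assumed) */

typedef struct {
    double x;                 /* inches */
    double y;                 /* inches */
    double theta;             /* heading, radians, counterclockwise-positive */
} Pose;

/* Update the pose given how far each side traveled since the last call. */
void update_pose(Pose *p, double d_left, double d_right)
{
    double d_center = (d_left + d_right) / 2.0;
    double d_theta  = (d_right - d_left) / TRACK_WIDTH_IN;

    /* Treat the short arc as a straight segment at the mean heading. */
    p->x     += d_center * cos(p->theta + d_theta / 2.0);
    p->y     += d_center * sin(p->theta + d_theta / 2.0);
    p->theta += d_theta;
}

int main(void)
{
    Pose p = {0.0, 0.0, 0.0};
    update_pose(&p, 10.0, 12.0);   /* right side went farther: a gentle left turn */
    printf("x=%.2f y=%.2f theta=%.3f rad\n", p.x, p.y, p.theta);
    return 0;
}
```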

We can do pretty much anything we want… 1403 is going for 2 vision tetras during autonomous mode…

GO 1403!!!