pic: Touchscreen interface, theory of operation



Shows the in and out pins from the driver station to the Arduino touch screen. The path is not sent directly to the robot; it is only a visual suggestion of the path the robot might take. Only the vertices of the lines are sent, at a max of 50 (a limit chosen so that all points could still be sent at the last second, if required).

Original thread that led to this: http://www.chiefdelphi.com/forums/showthread.php?t=77903

This looks like a great and interesting idea, however I would urge caution about designing to the 2009 Driver Station interface specifically. Given some of its problems this year, and Bill Miller’s statement that a Classmates PC would be in next years’ kit, there may be different if not easier ways to interface in the future.

Would you be able to use the touch screen to map a waypoint based autonomous mode on the fly? Because there were multiple times where I would have loved that (mostly while “breaking” air lock loading auto modes from the likes of 1114).

Yes, that is the point of the screen.

This looks like a great and interesting idea, however I would urge caution about designing to the 2009 Driver Station interface specifically. Given some of its problems this year, and Bill Miller’s statement that a Classmates PC would be in next years’ kit, there may be different if not easier ways to interface in the future.

The blog says that the laptop is to be used with the driver station, and there should always be analog inputs on the driver station. If the laptop is to be used as the DS, then drivers may be able to be installed, but I think it may just be for a dashboard.

But the points have to be entered before the robot is enabled, so a limit of 50 points is imposed. Fifty, because the driver station updates at 50 Hz, so sending them would only take 1 second, best case.
The points are sent in the order received on the screen,
and if they aren't entered fast enough the last points may not be received.
So a memory will be implemented on the screen (as it has its own processor),
and on the next power-up it will act as if the points were entered by a person, but much faster.
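The one-point-per-update budget described above can be sketched out in a few lines. This is a minimal simulation, not the actual screen firmware; the frame layout `(index, x, y)` and the names here are assumptions for illustration.

```python
# Hypothetical sketch of the point-transfer budget: the screen streams one
# waypoint per Driver Station update frame, so 50 points fit in one second.

DS_UPDATE_HZ = 50          # the DS updates its I/O state ~50 times per second
MAX_POINTS = 50            # so a full path fits in 1 second, best case

def frames_for_path(points):
    """Yield one (index, x, y) frame per DS update, in the order drawn."""
    if len(points) > MAX_POINTS:
        raise ValueError("path exceeds the 50-point budget")
    for i, (x, y) in enumerate(points, start=1):
        yield (i, x, y)    # each frame would ride on the DS analog inputs

path = [(0, 0), (40, 10), (80, 35)]          # vertices only, as drawn
frames = list(frames_for_path(path))
seconds = len(frames) / DS_UPDATE_HZ
print(frames)    # [(1, 0, 0), (2, 40, 10), (3, 80, 35)]
print(seconds)   # 0.06 -- a full 50-point path would take exactly 1.0 s
```

This also makes the "last second" claim concrete: even a maximal path costs only `50 / 50 Hz = 1 s` of transfer time.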

Why not just resend the points every second? The point number input would always tell which point is being sent, so why not 1, 2, 3, …, 49, 50, 1, 2, …?

–Ryan

I hadn’t thought of that. :]
That would allow for error correction and last-second point changes.
That’s a good idea.
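The cyclic-resend idea can be sketched from the receiver's side. This is a hypothetical illustration (the function and stream format are made up): the robot keeps a table keyed by point number, so a dropped frame is filled in on the next pass, and a redrawn point simply overwrites its old value.

```python
# Sketch of the cyclic-resend scheme: the screen loops 1..N forever, and the
# receiver reassembles the path from (index, x, y) frames, latest value wins.

def receive(frames, expected=3):
    """Rebuild an ordered path from possibly lossy, repeating frames."""
    table = {}
    for idx, x, y in frames:    # frames may have gaps or repeats
        table[idx] = (x, y)     # a resent or changed point overwrites the old
    if len(table) == expected:
        return [table[i] for i in range(1, expected + 1)]
    return None                 # still waiting on missing points

# First pass drops point 2; the second (resent) pass fills it in.
stream = [(1, 0, 0), (3, 80, 35),                  # pass 1, frame 2 lost
          (1, 0, 0), (2, 40, 10), (3, 80, 35)]     # pass 2
print(receive(stream))   # [(0, 0), (40, 10), (80, 35)]
```

The same mechanism gives the last-second changes for free: redrawing point 2 on the screen just changes what the next pass sends for index 2.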

This discussion brings up an interesting rules question.

Presently, you can download code to a robot when it is on the field by plugging your laptop into the other Ethernet port. Doing a soft reboot would then complete the update. A few times this year we have done so to get in “last minute” code changes without having to tether up the bot in the queuing line.

However, if properly written, one could eliminate the need to reboot the robot altogether. Here’s what I’m thinking:

  1. Create your autonomous mode on the laptop.
  2. The software then creates a text file representing the sequence of autonomous commands.
  3. The laptop is connected to the driver station Ethernet port.
  4. Using FTP, the text file is placed on the cRIO.
  5. Upon the initialization of autonomous mode, the software parses the text file and acts upon the stored sequence of commands.

As long as all of these steps are complete by the time the robot is enabled, is there anything in 2009 (or past) rules that would preclude this scheme? As far as I am concerned, this solution is far easier than using an Arduino board and writing a custom communications protocol.
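Step 5 of the scheme above is just a small parser. Here is a hedged sketch: the command names, fields, and file format are invented for illustration, and the real sequence would have to match whatever your robot code actually executes.

```python
# Hypothetical parser for the FTP'd autonomous text file described above.
# Each line is a command name followed by numeric arguments.

def parse_auto_script(text):
    """Parse lines like 'drive 48 0.5' or 'turn 90' into (cmd, args) tuples."""
    commands = []
    for line in text.splitlines():
        line = line.split("#")[0].strip()   # allow comments and blank lines
        if not line:
            continue
        name, *args = line.split()
        commands.append((name, [float(a) for a in args]))
    return commands

script = """
# FTP'd to the cRIO before the match
drive 48 0.5   # inches, throttle
turn  90       # degrees
drive 24 0.3
"""
print(parse_auto_script(script))
# [('drive', [48.0, 0.5]), ('turn', [90.0]), ('drive', [24.0, 0.3])]
```

At autonomous init, the robot-side code would read the file off the cRIO's filesystem, run it through a parser like this, and step through the tuples, so no reboot or redeploy is ever needed.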

I think I have some coding to do.

(Even if the 2010 rules explicitly disallow transmission of packets from a laptop to the cRIO at any point when the robot is on the field, such a scheme would probably still be the easiest way to tweak autonomous modes without ever needing to rebuild or even reboot)

I can’t think of any 2009 rule that prohibits FTPing a file of commands over pre-match. 195 used a similar strategy in 2006 to create waypoints that made them more effective at defending against opponents in autonomous (theirs, as I remember, used a PDA connected to the serial port to dump the data before the match).

I’ll ask a silly question: What if you want to run your manipulator (including specifying a particular position) during your autonomous routine?

This is a general implementation, based on our 2009 Lunacy robot; there are no manipulators that need to be run in auto. If our 2010 robot has a manipulator, then the code will have to adapt.
The screen can display anything we want it to, so we could have it display various options.

Yes, but our drivers didn’t want a laptop up at the driver station stand; they wanted something small that could update the auto mode on the fly without dealing with uploading things to the robot.

The touch screen is only about 2.7" by 2.1";
for comparison, our laptop is 24" by 18".

Wasn’t the concept of this introduced in Atlanta during the Championship Event on the big screen in 2008 when the cRIO was first officially & publicly announced?
Whatever happened to that?
As a person with a CAD background, it seemed really cool to just draw a path & have a robot follow that.
Never saw anyone use anything like that this year though.
So disappointing.
Hopefully in the future more people will attempt this & succeed.

I don’t know; I never saw that video, or pictures of it.

That is basically what I am trying to accomplish. My team at the LA regional didn’t like how the robot kept crashing into the driver station wall, and they wanted something to change the behavior of the bot without having to tweak values in the code (no text file for us :]).
One of them remembered the screen at Atlanta and wanted it put on the robot for last-second strategy changes.
(We used binary switches, but the drivers could never remember what each switch did, or whether it even worked.)

This is similar to what it is going to look like:

http://www.liquidware.com/system/0000/1902/Arduino_TouchShield_Slide_reflection.jpg

Did you model this yourself or did you find it somewhere?

The touch screen is from liquidware.
http://www.liquidware.com/shop/show/TSL/TouchShield+Slide

But the interface to the driver station is my own design.

The blue board under the screen is an Arduino.

There will be another board between the two components carrying the circuitry that I designed.

Today, the touchscreen finally arrived :smiley:

The circuit is almost completely soldered, and the screen can be powered off a single I/O pin on the DS.

Pictures soon, waiting for a photo box

*Heads off to go program :]*