Team 302's Off-Season Robotics Programming Competition

Team 302 is introducing a new kind of off-season competition… A virtual one.

This FIRST year’s challenge (in what we hope will become an annual event) is “Robust Autonomous”.

For complete details, visit the VIRSYS webpage.

This contest specifically targets LabVIEW programmers and emphasizes control. However, the aim is to broaden the scope beyond LabVIEW and programming in future years to include design elements and more.

Good Luck and have fun!!!

This is incredible!

I have a few students who would probably have a ton of fun in this competition.

Also, maybe you can help clear something up - is there any way to get “true” robot position/orientation information from VIRSYS.exe? I’d like to be able to compare where the robot thinks it is against its precise field position.


The values you are looking for are sent to the Image Generator (IG) in exactly the same way messages are sent and received in the LabVIEW code.

If you want to peek at them, you can set up another IG in the VIRSYS config (text) files (physics.txt?). Change the number of IGs from 1 to 2 and duplicate the IG definition using different ports (make the last digit “4”, as I believe 1, 2, and 3 are already in use). The information you’re looking for will go out on this port.

Then all you need to do is set up a listener in LabVIEW exactly the same way the sensors are read (but you must match the port you sent the extra IG message out to). This message SHOULD be <timestamp, x, y, z, heading, pitch, roll, arm angle, wrist angle, clamp on/off>, where x, y, z are in inches: x is down the field, y to the left, and z up. Heading, pitch, and roll are in radians, appear in that order, and are right-handed about “body-fixed” axes. Note that this signal is 11 floats (singles) long, so you also need to update the receive array size.
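For readers outside LabVIEW, the same listen-and-parse step can be sketched in Python. Assumptions not confirmed above: little-endian byte order, and port 50004 as the extra IG port. The post lists ten named fields for an 11-float payload, so the final float is left unassigned here.

```python
import socket
import struct

TRUTH_PORT = 50004   # the extra IG port suggested above (assumption)
NUM_FLOATS = 11      # per the post, the message is 11 singles long

def parse_truth(packet: bytes):
    """Unpack the truth message.  The post names ten fields for an
    11-float payload, so the unnamed 11th float is dropped here."""
    vals = struct.unpack("<%df" % NUM_FLOATS, packet)
    names = ("timestamp", "x", "y", "z", "heading", "pitch",
             "roll", "arm_angle", "wrist_angle", "clamp")
    return dict(zip(names, vals))

def listen(port=TRUTH_PORT):
    """Mimic the LabVIEW sensor-receive loop on the extra IG port."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))
    while True:
        packet, _addr = sock.recvfrom(1024)
        truth = parse_truth(packet)
        print("x=%.1f in, heading=%.3f rad" % (truth["x"], truth["heading"]))
```

If the byte order turns out to be big-endian, only the `"<"` in the format string needs to change.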

This is a good feature to have, but you really don’t NEED it. If you implement it, please send your project file and config file to the VIRSYS e-mail address and I’ll get them onto the website. It’s a great request, so if you really want us to do it, just let us know. Give us a few days to coordinate who will do it and then get it done.

There are still LOTS of questions to answer, so keep them coming!

I’m afraid you’ve sort of lost me - without really knowing your architecture, the concept of an Image Generator is not easy to understand (or how it relates to other components). Do you have a diagram of the overall project?

Do I have these concepts right:
–I need to configure VIRSYS.exe (via an existing text file) to send data out on another channel/port
–I need to configure the LabVIEW program to read data on that channel/port
–I need to add code to the LabVIEW program in order to parse that data

Is this right?

Your interpretation is correct.

My apologies; the discussion was intended to be an extension of the “getting started” documentation (to use the term “documentation” loosely) on the website. The prior version of the getting started page mentioned the configuration files… I just noticed it no longer does, so that would be a problem.

The top level architecture is simple, and being clear about what it is should go a long way.

A VIRSYS simulation is three components. Your software, the physics model, and the 3D visualization (image generator, IG).

These three components run independently and exchange UDP (networking) packets. Your software environment (LabVIEW) and the physics model send data to and receive data from each other. The IG never sends data; it only listens to (receives data from) the physics model. The IG can even be absent; apart from the missing visuals, the simulation should still run.

For convenience, the physics model and IG are packaged in a single multi-threaded executable (VIRSYS.EXE). However, the network-style communication between them is still there. You can configure the executable using the various (commented) text files that ship with VIRSYS.EXE (unzipped into the same directory). For example, you can tell the executable not to run physics, which is useful for various things, including constructing multiple remote views of the same simulation (on remote computers).
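The component wiring can be illustrated with plain UDP sockets in Python. This is a sketch of the pattern, not VIRSYS code; the ports mirror the config-file defaults but are otherwise arbitrary.

```python
import socket

PHYSICS_PORT = 50001   # physics model's local receive port (per config)
IG_PORT = 50003        # image generator listens here (per config)

def send_packet(payload: bytes, port: int):
    """Fire-and-forget UDP send, the way physics publishes to the IG."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(payload, ("127.0.0.1", port))
    sock.close()

def make_listener(port: int, timeout=1.0):
    """Bind a receive socket, the way the IG (or LabVIEW) listens."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.bind(("127.0.0.1", port))
    return sock

# Because UDP is connectionless, send_packet succeeds even when nobody
# has bound IG_PORT -- which is why the simulation runs fine with no
# IG attached.
```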

The comments are not fabulous, so here’s what you need to do to the file “virsysPhysics.txt” so that it sends out an additional IG message:


Original:

VIRSYS (VIrtual Robot SYStem) - Physics model configuration file

50001			<- local receive port
127.0.0.1		<- LabView host
50002			<- LabView host port

1			<- Number of image generators
127.0.0.1		<- IG host (1)
50003			<- IG host port (1) ...  additional hosts repeat like this

robot.txt		<- robot config file

0.001			<- integration time step
0.01			<- I/O time step (data publication frequency)


Modified:

VIRSYS (VIrtual Robot SYStem) - Physics model configuration file

50001			<- local receive port
127.0.0.1		<- LabView host
50002			<- LabView host port

2			<- Number of image generators.  NOW 2!
127.0.0.1		<- IG host (1)
50003			<- IG host port (1)
127.0.0.1		<- IG host (2)  ACTUALLY WE'RE SENDING THIS TO LABVIEW FOR DISPLAY
50004			<- IG host port (2)

robot.txt		<- robot config file

0.001			<- integration time step
0.01			<- I/O time step (data publication frequency)

Note that you can run LabVIEW on a different machine if you wish by changing the “LabView host” from the loopback localhost IP address to an actual remote IP address. Of course LabVIEW then needs to know where to send its data.

Of your original 3 steps, the first is taken care of.

–I need to configure VIRSYS.exe (via an existing text file) to send data out on another channel/port

Yes, done.

–I need to configure the LabVIEW program to read data on that channel/port

I would replace “configure the” with “write/modify the” :slight_smile: Thankfully, there is an example of exactly what you need to do already inside the LabVIEW Trainer. You just need to find it, copy and paste, and change a few parameters. It’s pretty easy to find inside the “HW” block on the Main VI. You want to mimic the Sensor Read/Recv procedure with only minor changes. I recommend performing all of your activities inside a newly created block on the Main VI directly below the HW block; this will give it its own execution thread. Don’t forget to use 50004 as the port number for this, and adjust the size of the array to accommodate the size of the new message (per my last post).

--I need to add code to the LabVIEW program in order to parse that data 

Yes. Parsing is also shown in the Sensor Recv procedure. Your signals are still “singles” (floats)… All you need to do is index into the array and verify which is which.

Adding a front panel for display of this data would be a nice touch.

Looking at the LabVIEW project, I didn’t see anything that looked like Autonomous Independent. I don’t think it would be very hard to add, but I didn’t want to reinvent the wheel if it was already there. For periodic tasks, it looks like additional loops could be added to HW_Interface, or, to minimize messing with the framework, another VI could be added in parallel.


The Autonomous / Tele-op / Disable, etc. mode switch is located on the surrogate Driver Station VI (DSInterface?… sorry, I wanted to reply, but I don’t have it open). It is a pull-down menu above the arm control dials and switch. Ideally the robot code determines what to do based on that signal, and all of the high-level stuff happens in the Robot Control VI.

You are correct: you do need to add your own stuff to the HW interface. Ideally that is just PID (etc.) controllers and doesn’t look at the auton/tele-op state at all. However, if you have different low-level controls for tele-op vs. auton, then you would need to. Alternatively, if you were to have scripted maneuvers within tele-op, then the signal would be something set by both auton and tele-op independently. So you can argue that you never need to look at the mode signal in there.
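The “just PID controllers in the HW interface” idea can be sketched as follows. This is a minimal Python illustration with made-up gains (the real code is LabVIEW), showing why the controller never needs to see the mode signal.

```python
class PID:
    """Minimal PID controller of the kind that belongs in the HW
    interface: it tracks a setpoint and knows nothing about the
    auton / tele-op mode."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Whether the setpoint comes from a joystick (tele-op) or a script
# (autonomous), the controller itself is identical -- which is the
# argument above for keeping mode logic out of the HW interface.
```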

It’s been a few weeks now… I’m curious to know how many students are out there working on this.

Keep the questions coming!

COOL!, one question though, what software did you use to create the field in the picture?

The software in the pictures is constructed in C using GLUT. Googling “GLUT opengl” will get you tons of links.

Alternatively, we have a Unity 3D version of the visuals which was implemented by one of our seniors last year. Development stopped after the championship and it’s not quite ready for prime time. Unity offers the potential to directly import robot geometries and get outstanding visuals with less effort.

Since we are interested in deploying the simulation to a single computer, the GLUT visualization is preferred because the computational resources consumed are very small. That said, we are observing issues on some Windows platforms where the “sleep” function does not properly yield control to other processes.

I’ve attached my changes to the LabVIEW Trainer project to make it more like the LabVIEW robot framework. It adds separate VIs for Teleop, Disabled, Autonomous Independent, Autonomous Iterative, and Periodic Tasks. This makes it easier to move code from our 2011 robot to the trainer. (194 KB)

In Read_sensors, there appears to be an issue with the right distance sensor. Its source is “VIRSYS right_speed”, and I think it should be “VIRSYS right_angle”.


At the moment I have no way of looking at the code or posting an update until the weekend. It sounds like a cut-and-paste error on my part. Can you verify the correction and post the specific steps for it, identifying the exact VI (or VIs) where changes must be applied?


One of our other mentors/recent grads wrote up a description of the exact fix.

I had a few minutes so I looked into the VIRSYS right distance sensor issue.

Under -> ->
The calculation for the R_DIST_SNSR draws from the Global Variable VIRSYS_R_SPD instead of the Global Variable VIRSYS_R_ANGLE; this should be a simple fix.

Left-click the VIRSYS_R_SPD that is being replaced, and from the drop-down menu that appears, select VIRSYS_R_ANGLE.

Good work!

Yes, that’s what I did. I found it by looking at the code, I haven’t tried reading sensor values, yet. Theoretically, there could be an offsetting error somewhere.

I scored 3 tubes on the top row in 17 seconds, without moving the shoulder, and I think it would be easy to get it under 15 seconds.

On my quad core i5, virsys.exe takes approximately 50% of my cpu, and on my dual core laptop, it takes over 90%. Is that expected?


Are you picking the second and third tubes up off the floor in front of the other robots? How are you doing that “without moving the shoulder”? Do you mean that you have yet to implement the shoulder controls? Please observe that the claw is not symmetric: it opens “up” and clamps “down” when the fixed plate is on the floor. It cannot typically pick up tubes by reaching backwards to the floor behind the robot. The real 302 robot actually has a hard stop just beyond the vertical position (but it’s not in there virtually). If a backwards grab looks convincing, we would count it. I expect that the claw would just kick the tube in that configuration rather than grab it, but anything is possible.

On some Windows platforms, the CPU results you are seeing are unfortunately “expected”. VIRSYS is a very lightweight application which runs two threads, one each for physics and visualization. It does not have to be this way, but it’s too late for this kind of change for the competition. One or both of these threads is explicitly synchronized to real time. We check the machine’s clock and then call “sleep(0)”, which is supposed to “surrender the rest of the thread’s time slice” to threads/processes of equal or higher priority. When there is nothing else running, VIRSYS is effectively “busy waiting” and can consume lots of cycles. Once you launch another process like LabVIEW, the CPU usage SHOULD stay about the same (up to your number of cores) and you should not notice a performance hit… Unfortunately, the results are different on each of XP and Vista (32- and 64-bit), (and perhaps HW?), etc. I have no issues on the two older Windows machines I run at home. In the Linux and BSD (MacOS) implementations I call “usleep” for the remaining time and the results are very good. I’m not aware of a reliable sleep function appropriate for real-time synchronization at 50-100 Hz on Windows. A friend pointed me to a method which borrows the socket timer, but that was a while back and I no longer have it.
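The usleep approach described above for Linux/BSD can be sketched in Python (an illustration only, not VIRSYS code; the rate and step count are made up): check the monotonic clock, then sleep for the time remaining in the frame instead of busy-waiting.

```python
import time

def run_realtime(step_fn, rate_hz=100.0, steps=100):
    """Fixed-rate loop: compute the next deadline from the monotonic
    clock and sleep for the remaining slice instead of busy-waiting
    (the usleep approach described for Linux/BSD)."""
    period = 1.0 / rate_hz
    deadline = time.monotonic()
    for _ in range(steps):
        step_fn()
        deadline += period
        remaining = deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)   # yield the CPU; no busy wait
        # if remaining <= 0 we're behind real time: run the next
        # step immediately rather than sleeping
```

The accuracy of this pattern depends entirely on the platform’s sleep granularity, which is exactly the Windows problem discussed above.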

If somebody has a suggestion / example for an accurate Windows sleep function, the substitution can be made very quickly.

I believe what LV does to get more granular sleep timing is to reprogram the multimedia timers. At that point, you have a one ms resolution sleep.

Greg McKaskle

I don’t think you’re going to count any of this… :slight_smile:

See a video of this year’s winner here.

Congratulations to Team 302 for winning the programming challenge this year!

This robot simulation hangs a single tube in autonomous mode. The winning code was independent work submitted by one of our first-year programmers; it uses “open-loop” (scripted) motor torque commands to realize reliable motions of the ideal robot. Unfortunately, testing under variability did not hang any tubes.

Per the contest announcement (here), a second award would have been made to the top finisher from an outside team. However, in this, the inaugural year, Team 302 was unopposed.

We look forward to defending our title next year!!!