Virtual Robot

Is there a way to test code (written in LabVIEW) without having the robot available? If not, does anyone have an idea of how it might be done?

The only way I can think of is to replace the inputs that would come from the robot with front-panel controls, and the outputs with indicators.
If you want to test the full functionality, though, you will have to run it on the robot. :slight_smile:

You would have to do a decent amount of work with the front panel. It mostly depends on what you're doing, so more specifically: what is your goal with this?

The only thing I can think of is to use LED indicators for specific commands, run the code, and check for the right combinations of LEDs.

It sounds like you just want to test the logic of the code. Last year we would create VIs containing just the logic, with no robot-specific VIs inside. We would add whatever nodes were required for inputs and outputs and test from there, using front-panel controls and indicators. Once we were satisfied with the code, we would simply place the VI in our robot code and wire it up to the appropriate motors, joysticks, sensors, etc. I would suggest a similar process until you have a robot on which to test your code.
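Since LabVIEW is graphical, here is a rough text-language sketch of the same idea in Python: keep the logic in a pure function with plain inputs and outputs (the analogue of a logic-only VI with front-panel controls and indicators), so it can be exercised without any robot hardware attached. The `arcade_drive` mixing function and its clamping behavior are illustrative assumptions, not code from any FRC library.

```python
def clamp(value, lo=-1.0, hi=1.0):
    """Limit a motor command to the legal output range [-1.0, 1.0]."""
    return max(lo, min(hi, value))

def arcade_drive(throttle, turn):
    """Mix a throttle and a turn input into (left, right) motor commands.

    This is the 'logic-only VI': no joystick reads, no motor writes,
    just inputs in and outputs out, so it is trivially testable.
    """
    return clamp(throttle + turn), clamp(throttle - turn)

if __name__ == "__main__":
    # "Front-panel" test values standing in for real joystick inputs.
    for throttle, turn in [(0.5, 0.0), (1.0, 0.5), (0.0, -0.3)]:
        left, right = arcade_drive(throttle, turn)
        print(f"throttle={throttle:+.1f} turn={turn:+.1f} -> "
              f"left={left:+.2f} right={right:+.2f}")
```

Once the logic checks out against these stand-in values, the same function (or VI) gets wired to the real joystick and motor outputs with no changes to the logic itself.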

The Virtual Robotics lab gives users the opportunity to work in a robotics lab building and programming a mobile robot. The tasks include: assembling all physical components of the robot, building a robotic arm, writing scripts to direct the arm to pick up a Coke bottle, writing scripts to steer the robot’s wheels to the activity table, loading “beliefs” into the main AI engine (ProtoThinker), and finally watching the Iris.4 robot move through the lab, pick up the Coke bottle, and put it into the recycler (an action it performs because in its “language of thought,” it is a committed environmentalist).

The robot that you build in this lab is a “top-down” robot. That is, the robot’s behavior is controlled by a good old-fashioned artificial intelligence (GOFAI) program, which is capable of having “beliefs” about the world and of making logical inferences, and which serves as the single, centralized control device. This is in contrast to robots with a “bottom-up” design, like the IRRL Virtual Robot, also featured on our website.

The Iris.4 Robot that users construct in the virtual lab is a direct model of the physical Iris.4 robot built by a team of undergraduates at Illinois State University and their partners at the Technical University of Lisbon, Portugal. The Mind Project’s Iris.4 Mobile Robot Group will consider proposals from other teams who would like to add some new capability to our robot or to improve on an existing capability. Each team should have one or more student-researchers and at least one instructor to serve as an advisor. Extensive documentation is available (below) showing how the real Iris robots have been built.