Open Simulation with Dashboard Provider running

Danny,

I think I followed all the instructions to install LabVIEW, Vista, and the newest user.lib. But I am having problems following the 2nd tutorial. I start the Dashboard Provider and see that the DataSocket Server is running. But the tutorial shows opening the Simulation Environment Template vi while the Dashboard Provider is running. I can’t figure out how to get back to the LabVIEW Getting Started screen while the Dashboard Provider is running. If I try to start another instance of LabVIEW, it just pops me back into the Dashboard Provider.

Please Advise, Ray

I had problems with LabVIEW and Vista. I just installed XP SP2…

I didn’t spend more than 2 hours on it, though. It may be possible.

Jacob

From either the Front Panel or the Block Diagram of LabVIEW, click on File, then Open. You should get a standard Windows Open File dialog box from which you can navigate to the Simulation Environment Template.vi.

Barry Lazzer.

Thanks Barry,

That did it. The server, provider, and Simulation Environment are all running at once (slowly). I guess it just doesn’t look like that in tutorial 2.

We’re trying to customize the Dashboard, mainly to rename the data to the same variable names as in the code (more readable) and to group it into functional groups like driving, camera targeting, and manipulator controls.

If there is a template for the Dashboard, similar to the standard one, that we could just customize, rather than build from scratch, that would be very helpful.

I’m hoping we can make a stand-alone executable when we’re done, so we can publish it to multiple (student, mentor, …) laptops. I really don’t want to install all these tool packages on every computer we use.

Thanks, Ray

What exactly would you like? For my team (418), we only have about 4 things we ever look at from a Dashboard perspective, plus the user bytes. For us, starting from a template would take more time because we’d delete 80% of what was on there. Plus, you can change the outputs to any number of different kinds of indicators (graphs, charts, displays, meters, etc…), so modifying a template would mean repositioning everything to make things fit. If you want a “templatized” dashboard I’m sure we can get you one, but I think you’ll be mostly customizing your dashboard anyway.

> I’m hoping we can make a stand-alone executable when we’re done, so we can publish it to multiple (student, mentor, …) laptops. I really don’t want to install all these tool packages on every computer we use.

Unfortunately the Application Builder isn’t built into the Full Development System, only the Professional Development System. Believe it or not, even if you did build an executable, you would still have to install about 70MB of packages to make everything work (LabVIEW runtime engine, VISA runtime, DataSocket runtime, etc…). Of course, at that point you would have “dead code” and would then have to distribute changes back to all the machines. We really do prefer you to have LabVIEW installed on your computer, but depending on what you truly need, you can cut down on the installation footprint by installing only the core components (LabVIEW, NI-VISA, etc…).

-Danny

Danny,

Thanks for the info. We’re up and running custom dashboards.

One thing is really great: we “join” user bytes together so we can display the full resolution of the A/D channels.
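In case it helps anyone, here is a rough sketch of the robot-side half of that trick, assuming 10-bit readings and two spare user bytes; the function and names are made up, not from any team’s actual code:

```c
/* Sketch only: split one 10-bit A/D reading (0..1023) into two bytes so
   the dashboard can rejoin it at full resolution. The caller copies the
   two bytes into whatever its default code calls the dashboard user bytes. */
void split_10bit(unsigned int reading, unsigned char *lo, unsigned char *hi)
{
    *lo = (unsigned char)(reading & 0xFF);         /* low 8 bits */
    *hi = (unsigned char)((reading >> 8) & 0x03);  /* top 2 bits */
}

/* The dashboard side rejoins the value as: value = (hi << 8) + lo */
```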

The gauges really dress up the drive PWM display.

Separate groupings of the camera tracking and drive data are helpful.

Thanks, Ray

Oh yeah, we’re doing the same thing: we pull over four 10-bit numbers using 5 user bytes, and then use the 6th user byte as a “code” indicating which values we’re pulling over, so we can multiplex even more. You send 4 numbers with one code, send 4 more with another code, and so on, and before you know it you have all the information you need updating about 10 times per second! (Just be careful: not all user bytes come over at the same rate, since some user bytes are included in multiple packets.)
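Just to make the packing concrete, here is roughly what the robot-side C looks like in spirit. This is a sketch with made-up names; plug the results into however your code exposes the six user bytes:

```c
/* Sketch: pack four 10-bit values (0..1023) into five bytes, plus a sixth
   "code" byte that tells the dashboard which group of four this is.
   out[] stands in for the six dashboard user bytes. */
void pack_user_bytes(unsigned int v0, unsigned int v1,
                     unsigned int v2, unsigned int v3,
                     unsigned char code, unsigned char out[6])
{
    out[0] = (unsigned char)(v0 & 0xFF);   /* low 8 bits of each value */
    out[1] = (unsigned char)(v1 & 0xFF);
    out[2] = (unsigned char)(v2 & 0xFF);
    out[3] = (unsigned char)(v3 & 0xFF);

    /* The leftover top 2 bits of all four values share the fifth byte. */
    out[4] = (unsigned char)(((v0 >> 8) & 0x03)
                           | (((v1 >> 8) & 0x03) << 2)
                           | (((v2 >> 8) & 0x03) << 4)
                           | (((v3 >> 8) & 0x03) << 6));

    /* The sixth byte identifies the group, so the dashboard can
       demultiplex several groups sent on successive updates. */
    out[5] = code;
}
```

Calling this with a different code (and different values) on each update loop is what gives you the multiplexing.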

The COOLEST thing EVER, that I REALLY wish everyone would look into, is the 3D Picture Control. Team 418 is going to have a 3D picture control on our Dashboard showing us a 3D representation of our robot arm based on the information we get back from the robot, so we can quickly and easily see where the arm is and how it’s positioned at any time (even if the robot arm is occluded by the rack and other robots/objects).

We’d love to integrate a field model in our 3D picture control so we can even see where we are in relation to the field and the rack and everything, but with only one person doing LabVIEW programming on the team (not me, I don’t do anything, I make them do everything) it is waaaaay too much for them to bite off. However, if we had the manpower, it would be an extremely awesome way to get in some more driver practice time if you also simulated the Robot in LabVIEW at the same time… :stuck_out_tongue: Maybe if I can get a few projects off my back I’ll have some time to do a small demo to show everyone the potential of just how useful these tools and LabVIEW really can be!

-Danny

Danny,

I hope 418 will post something, maybe a picture of the 3D map.
A demo would be great.

3D would really be a nice addition to the simulation.

525 is using LabVIEW to solve some of our drive and camera problems, but we’re still new at it.

Ray

After analyzing our current state and the current design of our robot, we’ve actually decided we can focus our efforts more efficiently by leveraging the 2D robotic arm control example that is already built into LabVIEW and uses the 3D picture control. This is not really a spoiler on our design or anything, but our robot arm is eerily similar to the 3D Picture Control robotic arm example in LabVIEW (and I would imagine most arms for this year’s competition are). So we’re going to integrate the robotic arm example into our Dashboard so we can control the arm articulations in the picture control via data from the dashboard (coming from pots on our own robot arm articulations) instead of controls on the screen. Then we’re going to modify the model on the screen to more closely resemble our arm, and also give a reference point for the ground so we know how close our manipulator is to it.
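The one piece of glue logic is mapping raw pot counts to joint angles for the picture control. The real version will live on the LabVIEW block diagram; in C terms it is just a linear interpolation between two calibration points (the names and numbers below are invented for illustration):

```c
/* Sketch: convert a raw 10-bit pot reading into a joint angle in degrees
   by interpolating between two measured calibration points. The
   calibration values here are placeholders, not real measurements. */
#define SHOULDER_POT_AT_0_DEG    112
#define SHOULDER_POT_AT_90_DEG   894

float shoulder_angle_deg(unsigned int raw)
{
    return 90.0f * (float)((int)raw - SHOULDER_POT_AT_0_DEG)
                 / (float)(SHOULDER_POT_AT_90_DEG - SHOULDER_POT_AT_0_DEG);
}
```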

Check out the example at the following relative path in LabVIEW 8.2:
National Instruments\LabVIEW 8.2\examples\picture\robot.llb

The top-level VI you want to look at inside the library is robot.vi.

http://www.texasdiaz.com/temp/robot_arm.jpg

This dashboard control will really only be used during driving when/if the robot arm is occluded from view, but the REAL benefits of having this are:

  1. Robot Arm Visualization
  • It shows us where the Robot Controller code thinks the arm is, in a visualization that is quickly understood by anyone on our team. We’ve had problems in the past where a pot would come loose or something would go wrong, and it would take us way too long to figure out the problem.
  2. Operator driver practice
  • With the USB-DAQ device, we can give a robot operator practice time with the Dashboard and the simulator toolkit (the drivers can practice with previous year’s robots). Unfortunately we don’t have the resources to set something up so that the driver AND the operator can practice together, but some practice is better than no practice if you ask me.
  3. Code development
  • Once we get a good mapping between the pots on our real robot arm and the angles in our visualization, it will let us tweak and visualize where the presets for the arm are. Obviously you want preset positions for the arm to reach the various heights (floor, lower rack, mid rack, upper rack) so you get one-button, press-it-and-forget-it, perfect-every-time operation. We’ll pull a trick from the LabVIEW CMUCam2 code and export a .h file that contains our config information (see the sketch after this list). Once we’re done setting our presets, we’ll just compile the code with the header file, and bam, we don’t have any additional code to muck with.
  4. Wow Factor
  • Obviously we’re itching for the control award at Lone Star for the 2nd year in a row… :stuck_out_tongue:
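For reference, the exported header we have in mind would look something like this. It is only a sketch in the spirit of the header the LabVIEW CMUCam2 code exports; the names and numbers are placeholders until we have actually tuned the presets from the dashboard:

```c
/* arm_presets.h -- sketch of a generated config header. The values are
   placeholders; the real ones would come from tuning on the dashboard. */
#ifndef ARM_PRESETS_H
#define ARM_PRESETS_H

/* Target arm pot readings (0..1023) for each scoring height. */
#define ARM_PRESET_FLOOR        145
#define ARM_PRESET_LOWER_RACK   402
#define ARM_PRESET_MID_RACK     618
#define ARM_PRESET_UPPER_RACK   837

#endif /* ARM_PRESETS_H */
```

The robot code just seeks whichever preset is selected using whatever closed-loop routine it already has, so re-tuning never touches anything but this header.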

I’m investigating ways of putting together a demo of the system before we ship our robot, if we can assemble it all in time; if not, then after the season is over we’ll still put together a demo to show teams how to do it next year. Camtasia lets us take computer-screen recordings at the same time as webcam recordings, so we can have a split-screen of our dashboard and the robot at the same time, and I think I will use that.

-Danny