My laptop is running Windows 8, but I would rather have the Driver Station running in a VM (due to licensing issues with the NI software, apparently triggered by hardware changes); I have Windows 7 Pro running in a VM. The VM's network works fine when connecting to the Internet, and I can browse and download without trouble. However, I am wondering if anyone here has managed to use a VMware Player VM as a Driver Station.
I have seen it done in Parallels with no special configuration, but I don't really want to pay for that. Does one have to configure the IPs of the host's network adapters a certain way for the VM?
I have no qualms about doing in-depth tweaking; I know how to revert and reinstall software if needed.
The Driver Station was not officially supported on Windows 8 last season, but it ran fine. We'll have to wait until kickoff to know about official support for 2014. Why would you want to run the DS in a VM?
While I have not tried the Driver Station under VMware, it should work just fine, provided your hardware is up to the task. I would think any Windows 8 machine is fairly modern and has sufficient CPU and memory, but do some testing before committing to it for competition.
Like I stated in my post, hardware changes mess up the NI licensing (I think it may be my USB headphone amp that is triggering this).
I know my laptop is perfectly capable of running the Driver Station in a VM; I already have. I just need a bit of help configuring the network. The Driver Station doesn't communicate with the robot from within the VM (yes, I set the IP properly), but I can communicate from the host OS fine. Like I said, the licensing issues are what I am trying to avoid so I don't have to reinstall constantly.
I worked it out.
I bridged the two VMware adapters and the WiFi adapter in the Windows network manager on the host, set the bridge's IP and DNS the same way I would set the WiFi adapter's, and set the virtual Ethernet adapter in the guest to the same IP and DNS.
The host's Driver Station doesn't work in this configuration (it works without the bridge), but the VM runs it fine.
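If anyone wants a quick way to confirm the guest can actually reach the robot once the bridge is set up, here's a rough Python sketch. Team number 1234 is just a placeholder, and it assumes the cRIO-era addressing where the robot controller sits at 10.TE.AM.2; the port-80 check only applies if the cRIO's built-in web interface is enabled.

```python
# Quick connectivity check, run from inside the guest OS (Windows guest assumed,
# hence "ping -n"). TEAM is a placeholder; the robot address 10.TE.AM.2 is the
# cRIO-era convention.
import socket
import subprocess

TEAM = 1234  # placeholder team number
te, am = divmod(TEAM, 100)
ROBOT_IP = f"10.{te}.{am}.2"

# 1. Basic reachability: ping works without any open ports on the robot.
ping = subprocess.run(["ping", "-n", "2", ROBOT_IP], capture_output=True, text=True)
print(f"ping {ROBOT_IP}:", "OK" if ping.returncode == 0 else "no reply")

# 2. Optional: try the cRIO's web interface on TCP 80 (only if it is enabled).
try:
    with socket.create_connection((ROBOT_IP, 80), timeout=3):
        print("TCP 80 reachable -- the bridge is passing traffic both ways")
except OSError as exc:
    print("TCP 80 not reachable:", exc)
```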
I have the Classmate E12 2014 developer and driver station image running in VirtualBox (see picture). Essentially, through some Windows PE madness I was able to do it, but it wasn't "easy" to figure out. There are a number of reasons I wanted it in a VM, mostly that I wanted to develop in a *nix environment and test the robot with minimal hardware and effort. What is important to note, though, is that we will not use a VM for competition. I don't think that would be smart: performance aside, it would probably be too complicated because of the networking with the FMS. As usual, we will use a standalone laptop solely dedicated to being a driver station. At any rate, it was a fun exercise and I now have my test machine.
You should make sure you uninstall the Driver Station from the host computer so that it doesn't cause IP conflicts if it is opened.
By default, the Driver Station has an option enabled that automatically changes your IP address to the required 10.TE.AM.5 address.
Doing this on both the host and the VM will cause conflicts.
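For anyone unfamiliar with the scheme, here is a rough sketch of the addressing the Driver Station applies (team 1234 is a placeholder; substitute your own number), which also shows why two machines both trying to take the .5 address will collide:

```python
# Sketch of the 10.TE.AM.x addressing used on the robot network.
# Team 1234 is a placeholder; substitute your own team number.
def team_ip(team: int, host: int) -> str:
    """Return the 10.TE.AM.<host> address for the given team number."""
    te, am = divmod(team, 100)
    return f"10.{te}.{am}.{host}"

TEAM = 1234  # placeholder
print("Driver Station:", team_ip(TEAM, 5))  # 10.12.34.5 -- only one machine may claim this
print("Robot (cRIO):  ", team_ip(TEAM, 2))  # 10.12.34.2
# If the host DS and the VM DS both auto-assign 10.TE.AM.5, Windows reports a
# duplicate-address conflict and robot communication becomes unreliable.
```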
Also, I would recommend using the VM Driver Station for testing purposes only, and not as your competition driver station. You should have a dedicated Driver Station properly configured to run natively on the hardware for competition matches.
Some things to watch out for:
I work primarily with ESX and ESXi, so I don't know if Workstation has this, but ensure USB passthrough is enabled so the VM can access the controllers.
If you use bridging, the field may authenticate traffic based on MAC address. With a bridged VM, the frames carry the VM interface's MAC even though the physical interface is the one actually sending them, so the field radio could reject or drop those packets. I think this is highly unlikely, as the field network last year was a simple configuration.
The same applies if you are using NAT, except replace MAC with IP address everywhere I said it. Because different organizations have produced all of these components at nearly no cost to FIRST, I highly doubt this kind of checking is integrated into the field. (There's a sketch at the end of this post showing how to check which MAC and IP address actually go out on the wire.)
Those are the big ones I can really think of.
Make sure you have VMware Tools installed to optimize your video and the virtual-hardware-to-hardware mapping.
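If you ever want to see what the field-side equipment would actually observe from a bridged or NAT'ed VM, here's a hypothetical sniffing sketch. It assumes scapy plus a capture driver (WinPcap/Npcap) and admin rights on the host, and it assumes the DS-to-robot control traffic is on the cRIO-era UDP port 1110; adjust the filter for your setup.

```python
# Hypothetical check of the sender seen on the wire. Run on the HOST while the
# VM Driver Station is talking to the robot. Requires scapy, a capture driver,
# and admin rights; the UDP 1110 filter is the cRIO-era DS->robot control port.
from scapy.all import sniff, Ether, IP

def show_sender(pkt):
    # With a bridged VM, the source MAC here is the VM's virtual NIC, not the
    # laptop's physical adapter; with NAT, both the MAC and IP are the host's.
    if pkt.haslayer(Ether) and pkt.haslayer(IP):
        print(f"src MAC {pkt[Ether].src}  src IP {pkt[IP].src}  -> {pkt[IP].dst}")

# Capture a handful of outgoing control packets (pass iface=... to pick the
# wireless adapter explicitly).
sniff(filter="udp and dst port 1110", prn=show_sender, count=10)
```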
Hope this helps, and let me know if you have any other questions.
D
I helped a team that had bridged their ethernet and wireless connections. They were not able to connect to FMS until they removed the bridge, even though all other settings were OK.
I don’t think the field network configuration is as simple as you suspect. From the FMS whitepaper:
The FMS software (running on the FMS Server) communicates with each Driver Station via the managed switches in the Scorpion Case and Station Control Cabinets. This communication employs team-specific virtual local area networks (VLANs) which serve to isolate each team's data traffic.
…
The FMS software configures the managed switches and Access Point before each match to ensure the data and communications for each team is kept separate from others.
Hi, Joe.
That passage refers to creating multiple isolated Virtual Local Area Networks (VLANs) in order to keep traffic private per network (or per robot, in this case). This could prevent someone on one robot from spoofing traffic to or from a robot on the opposing alliance.
If your statement is that bridging did not work, then yes, there could be another security mechanism in play. I would have to see the traffic to determine whether that is the case and what it may be.
Thanks for the info! I appreciate it. It will be interesting to see their latest whitepaper. I’m sure I’ll be asking more questions.
D