Virtual Reality 1st Person Driver?
Has anyone ever thought of hooking something like an Oculus Rift or Google Cardboard up to an onboard camera to create a sort of first-person driver view for the robot? You could augment the vision to aid in aiming, driving, etc. I know in some respects it might be cumbersome, but I think it sounds like a good idea. (Is it even allowed in FRC?)
Re: Virtual Reality 1st Person Driver?
Quote:
The problem lies with the infrastructure of FRC itself. There is a bandwidth cap for robot-to-driver-station communication, and to get 3D you would need two cameras streaming to the DS. The problem is that you end up trading off resolution, frame rate, and staying under the cap (pick any two).
I, personally, would get a pretty severe headache from that after even just a few minutes. If we were able to send dual 480p/720p, 20+ fps camera streams, it would definitely be a viable control system, IMO.
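To make the trade-off concrete, here is a back-of-the-envelope bandwidth check. The cap value (~7 Mbit/s) and the MJPEG compression ratio (~0.2 bits per pixel) are assumptions for illustration, not figures from this thread:

```python
# Rough bandwidth estimate for dual-camera MJPEG streaming.
# ASSUMPTIONS (not from the thread): ~7 Mbit/s field bandwidth cap,
# MJPEG compressing to ~0.2 bits per pixel on average.

BANDWIDTH_CAP_MBPS = 7.0   # assumed field limit, Mbit/s
BITS_PER_PIXEL = 0.2       # assumed MJPEG compression ratio

def stream_mbps(width, height, fps, cameras=2):
    """Estimated Mbit/s for `cameras` MJPEG streams at the given size/rate."""
    return width * height * BITS_PER_PIXEL * fps * cameras / 1e6

# Dual 720p at 20 fps blows past the assumed cap...
dual_720p = stream_mbps(1280, 720, 20)   # ~7.4 Mbit/s
# ...while dual 640x480 at 20 fps fits comfortably.
dual_480p = stream_mbps(640, 480, 20)    # ~2.5 Mbit/s

print(f"dual 720p: {dual_720p:.1f} Mbit/s, dual 480p: {dual_480p:.1f} Mbit/s")
```

Under these assumptions, dual 480p is feasible but dual 720p is not, which matches the "pick any two" intuition above.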
Re: Virtual Reality 1st Person Driver?
Quote:
http://www.3dglassesonline.com/learn...d-glasses-work Oddly enough, in my previous suggestions I proposed sending full frames one at a time (i.e., keyframes) instead of a stream of video deltas. It should be easier for the network to handle that kind of load. If you snap pictures from the two cameras, spaced the proper distance apart, process them, and then send them as quickly as you can, that would likely work out. Even a single static image would still exhibit depth.
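One detail the keyframe approach needs is keeping the left/right images paired so the headset never shows mismatched eyes. A minimal sketch of one way to frame a stereo pair for transmission, using hypothetical names and assuming the images are already JPEG-encoded bytes:

```python
# Minimal framing for sending paired stereo keyframes over a socket.
# Hypothetical sketch: each packet carries one matched left/right JPEG
# pair (length-prefixed), rather than a continuous video stream.
import struct

def pack_stereo_frame(left_jpeg: bytes, right_jpeg: bytes) -> bytes:
    """Prefix both image lengths so the receiver can split the pair."""
    header = struct.pack(">II", len(left_jpeg), len(right_jpeg))
    return header + left_jpeg + right_jpeg

def unpack_stereo_frame(packet: bytes) -> tuple:
    """Recover the (left, right) JPEG pair from one packet."""
    left_len, right_len = struct.unpack(">II", packet[:8])
    left = packet[8:8 + left_len]
    right = packet[8 + left_len:8 + left_len + right_len]
    return left, right
```

Because each packet is a complete, self-describing pair, a dropped packet just means a skipped keyframe, not a desynchronized stereo view.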
Re: Virtual Reality 1st Person Driver?
Remember that headsets need to be disconnected from the driver when the match starts, and if the cables get tangled or something, you will lose precious seconds.
Re: Virtual Reality 1st Person Driver?
Quote:
Extra amusement points, though, if someone gets some displays and retrofits a ViewMaster ;).
Re: Virtual Reality 1st Person Driver?
Quote:
I think the OP was talking about integrating the gyroscopic capabilities of the Oculus to control the robot: steering it and raising/lowering a manipulator arm. That would be possible independently (via a TrackIR set or similar).
Re: Virtual Reality 1st Person Driver?
Quote:
Also, as an alternative to using just the tilt sensors in the Rift, you could use a Kinect for the arms.
Re: Virtual Reality 1st Person Driver?
Quote:
Using polarized lenses, you get much better color reproduction, at the cost of needing a compatible display.
Re: Virtual Reality 1st Person Driver?
This form of interface is normally used when you are miles away from the real device/environment and need to immerse yourself in the situation as if you were there. This is true whether it is a visualization cave, 3D glasses, etc.
When you have a direct view of the situation, I think a HUD or secondary monitor is a far better approach. It allows pilots to "see" what radar sees, but doesn't generally remove and try to reproduce the things that are in front of the craft.

If you have a Rift, I'd encourage you to experiment with it using any of the simulators available. Identify beneficial experiments and then try them on the robot. I highly doubt that the driver would benefit from wearing one, but perhaps the operator trying to manipulate the arms/rollers/etc. would. If you don't have a Rift and are looking to justify the team buying one, I'd encourage you to write up the details of how it would be used.

Greg McKaskle
Re: Virtual Reality 1st Person Driver?
Last year, on my vision system (which I never finished), the goal was a driver interface where the image is converted to vector form, keeping only keypoints. That way you get a lot of information in front of your face without it looking bad, and you can run it at a higher rate without exceeding bandwidth restrictions.
One practical idea that I thought of is a...HUD! You get the video feed directly from the glasses, not the robot, so you can look around and not get nauseated by the intense motion blur on the robot. A coprocessor on the robot calculates things such as distance to objects, or maybe even robot position on a minimap. That can be displayed in the corner of the display, and the driver merely has to move his or her eyes to get the measurement.
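The attraction of this approach is how little data it needs. A hypothetical sketch (field names and units are made up for illustration) of the telemetry record a coprocessor might push to a glasses-side HUD instead of video:

```python
# Hypothetical HUD telemetry record sent from a robot coprocessor to
# the glasses. A few numbers per update is ~100 bytes, so even at
# 50 Hz this is roughly 40 kbit/s, negligible next to any video feed.
import json

def hud_update(distance_m: float, field_x: float,
               field_y: float, heading_deg: float) -> bytes:
    """Serialize one HUD update as compact JSON."""
    return json.dumps({
        "dist": round(distance_m, 2),        # distance to target, meters
        "pos": [round(field_x, 2),
                round(field_y, 2)],          # minimap position, meters
        "hdg": round(heading_deg, 1),        # robot heading, degrees
    }).encode()

msg = hud_update(3.456, 1.2, 4.5, 90.0)
print(msg)
```

The glasses-side code would just parse each record and redraw the corner overlay, so the update rate is limited by the coprocessor's vision loop, not the radio.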
Re: Virtual Reality 1st Person Driver?
Quote:
http://www.instructables.com/id/DIY-...A-the-Beady-i/ That's also an idea, but not as fun as a first-person display.
Re: Virtual Reality 1st Person Driver?
Technology issues aside, I think that having a first-person view of the bot would be a huge disadvantage. When you're driving the robot from the sidelines, you're able to see the whole field and what's going on (i.e., map awareness). With a first-person view, you are limited to a small window of visibility. That may help with aiming, but this is a team game* and it's important to know what's going on behind you.
*Going off of A.A. |
Re: Virtual Reality 1st Person Driver?
In the future, virtual reality could be used to make a driver sim which, if done properly, would be just as good for driver training as driving a real robot. It could be difficult to make, though.
Re: Virtual Reality 1st Person Driver?
Quote:
LOL, some of you probably weren't alive back then. It had really poor physics modeling, and I suspect most modern game engines would put it to shame; since it was just for fun, I wasn't putting much into it. That said, a modern game engine plus an Oculus Rift for immersion should be possible, considering what one can do with mere shutter glasses. I'm pretty sure DarkBASIC has Rift integration and real-time accelerated physics support. LOL, I've had a DarkBASIC license since VB6; I bet that's older than some of you as well ;)
Re: Virtual Reality 1st Person Driver?
We had a HUD last year, with a screen on the glasses. It was pretty sweet, but when the camera's image was up it wasn't nearly as useful as you'd think. We won the Innovation in Controls award on Galileo since the Rockwell people were so impressed by the solution we used to comply with the rules while still not losing much time at the beginning of the match. This year we won't use the same setup, since the drivers don't want the cumbersomeness of it, yet they still want some of the info the HUD gave them. We're coming at it from a different angle for this coming year.
A whitepaper for 2014's HUD should be up sometime after Thanksgiving Day.