NASA To Use Xbox One's Kinect and Oculus Rift To Control Robots
Xbox, open the pod bay doors.
Watching developers play around with the Oculus Rift has got to be one of the best marketing strategies out there. It's not even a consumer product yet, and it already has me fingering my credit card in anticipation. I'm not the only one: NASA's Human Interfaces Group has been experimenting with an Oculus Rift and a Kinect sensor to remotely manipulate robotic arms.
As cool as it seems, this is actually off-the-shelf hardware. Using an upgraded Kinect from the Xbox One, the operator can manipulate the JACO robot claw simply by moving his own arm.
The trick is that traditional robotics controls have an inherent lag time in processing the signals. That's where the Oculus Rift comes into play. Besides offering a first-person perspective of the action, it actually shows a predicted future, reducing the perceived lag. In the video, he's not grabbing that block where it is right now, he's grabbing it where it will be in a few seconds.
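The core idea behind that "predicted future" is simple extrapolation: instead of rendering the object's last-known (delayed) position, project it forward by the control latency using its observed velocity. Here's a minimal sketch of that concept; the function name, numbers, and linear model are illustrative assumptions, not NASA's actual implementation.

```python
def predict_position(last_pos, velocity, latency_s):
    """Linearly extrapolate a tracked position forward by latency_s seconds."""
    return tuple(p + v * latency_s for p, v in zip(last_pos, velocity))

# A block last seen at x = 1.0 m, drifting at 0.5 m/s, with 2 s of lag:
last_pos = (1.0, 0.0, 0.0)   # metres
velocity = (0.5, 0.0, 0.0)   # metres/second
predicted = predict_position(last_pos, velocity, 2.0)
print(predicted)  # (2.0, 0.0, 0.0) — where the block will be, not where it was
```

A real system would use a proper motion model (e.g. a Kalman filter) rather than constant velocity, but the operator-facing payoff is the same: the headset shows where to reach, not where the object used to be.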
Robots will be critical in future space exploration. They can operate in dangerous conditions, don't require the same support systems, and don't need years of training. With all the awesome stuff NASA is dreaming up, it's cool to see what they can do using simple gaming hardware.
Ah, the Kinect, a wonderful little piece of technology with so many applications that improve on so many things.
Sadly though, gaming is not one of those things. :P
Seriously, the Kinect has so much potential and has been used amazingly in so many things except for what it was originally meant for: gaming. I mean, in my old Robotics club we got the Kinect to control our F.R.C. robot just fine, albeit in awkward ways. Then there are video uses for it as well, as the famous SFM Practical Problem used a Kinect to mo-cap the movements in that SFM. Yet when it comes to gaming, it's either shoe-horned in as an unnecessary gimmick, or its imprecision distracts from the game as a whole and makes you wish you were using a regular controller. Welp, let's see what Apple does with the tech, since they bought the company that makes the Kinect's underlying tech.
This was done before with monitor screens and motion gloves. It was done to the precision of remote surgery when the doctor was too far away to rush to a dying patient (and it worked).
The problem with this being used by NASA is ping. It works great in a lab, but when you need to wait minutes for a signal response because the robot is on Mars, such direct control is unfeasible.
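The "minutes" figure above is just the speed of light at work, and it's easy to check. A quick back-of-the-envelope calculation, using approximate published Earth-Mars distances (the constants here are rough figures, not mission data):

```python
C_KM_S = 299_792.458      # speed of light, km/s
CLOSEST_KM = 54.6e6       # approx. closest Earth-Mars distance
FARTHEST_KM = 401e6       # approx. farthest Earth-Mars distance

for label, dist_km in (("closest", CLOSEST_KM), ("farthest", FARTHEST_KM)):
    delay_min = dist_km / C_KM_S / 60   # one-way signal delay in minutes
    print(f"{label}: {delay_min:.1f} minutes one-way")
```

That works out to roughly 3 minutes one-way at closest approach and over 22 minutes at farthest, before you even double it for the round trip. So this kind of direct teleoperation is really suited to nearby targets like orbiting stations or the Moon, where predictive display can actually mask the lag.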