Meta’s new hand-tracking feature feels like touching the future
Meta is testing the ability to tap and scroll on virtual objects with just your hands, no controllers required, as a potentially fundamental update to its Quest VR headsets. The idea is that you can perform actions you're already accustomed to on your smartphone, such as swiping up and down a page, tapping a button to activate it, or typing on an onscreen keyboard, using only your fingers in the air.
The new experimental feature, called "Direct Touch," is included in the Quest v50 software update, which is currently rolling out. I finally received the update after several weeks of waiting, so of course I turned it on right away.
When hand tracking is activated, the Quest 2 follows your hands with its outward-facing cameras, and inside the headset, you'll see them rendered as shadowy dark hands. (CEO Mark Zuckerberg's Direct Touch video appears to have been shot on a Quest Pro and shows additional hand and arm detail.) Those shadows let you estimate when your hand will "touch" a menu item or window floating in front of you. As you "make contact" with something via Direct Touch, items will light up or begin to scroll. Although scrolling is jittery, it's typically more responsive than I anticipated.
But Direct Touch typing is terrible. When you tap on an area of the user interface where you can enter text, the Quest's onscreen keyboard appears underneath the window, and you can "push" particular keys to type. It's difficult to know where or what you're actually typing, though, because there is nowhere for your hands or fingers to rest. (Think of an iPad's onscreen keyboard without any feedback, and then imagine there's no glass, either.) When I resort to VR hunt-and-peck to laboriously compose even a single word, the UI occasionally registers a different key than the one I meant to touch. Thankfully, the keyboard does offer word suggestions as you type.
Given the poor typing but adequate scrolling, the Quest web browser is perhaps the best showcase of the Direct Touch controls. If I misspell something in a web search, the search engine will probably correct me. Scrolling up and down and tapping on links works quite well. Strangely, The Verge's homepage in the Quest's browser won't scroll past our list of Top Stories, but tapping any of the six stories I can actually see works better than I anticipated.
Most of the other built-in Quest apps I tried worked with Direct Touch at least somewhat, but many Quest Store apps, including Meta's own Horizon Worlds VR social network, are still incompatible with hands-only input; without a controller, they wouldn't even open. I didn't expect apps like Beat Saber to work any better without a controller, but I at least wanted to be able to play around with them.
It's immediately apparent why Direct Touch is labeled an experiment. Because I can't fully trust that my hand will actually "touch" a virtual element of the interface with every mid-air poke, using the Quest for more than a few minutes at a time quickly becomes frustrating. Holding my arms out in the air to navigate the user interface also gets tiring after a while. Meta's other controller-free hand gestures, which rely on pinching, are generally more reliable, even if I find them less intuitive.