I briefly got my hands on Apple’s new high-tech goggles, which impressed and creeped me out and raised a question: Why do we need these?
Apple unveiled the new Vision Pro on Monday. The device will be available next year.
By Brian X. Chen

Brian X. Chen, who has covered consumer technology for The Times for more than a decade, has tested 11 virtual reality headsets.

I walked away with mixed feelings, including a nagging sense of skepticism.
On one hand, I was impressed with the quality of the headset, which Apple bills as the beginning of an era of “spatial computing,” where digital data blends with the physical world to unlock new capabilities. Imagine wearing a headset to assemble furniture while the instructions are digitally projected onto the parts, for instance, or cooking a meal while a recipe is displayed in the corner of your eye.
But after wearing the new headset to view photos and interact with a virtual dinosaur, I also felt there wasn’t much new to see here. And the experience elicited an “ick” factor I had never had before with an Apple product. More on this later.
Fit and control

The Vision Pro's first-person video-viewing feature.
The Vision Pro, which resembles a pair of ski goggles, has a white USB cable that plugs into a silver battery pack that I slipped into the pocket of my jeans. To put it on my face, I turned a knob on the side of the headset to adjust the snugness and secured a Velcro strap above my head. I pressed down on a metal button toward the front of the device to turn it on. Then I ran through a setup process, which involved looking at a moving dot so the headset could lock in on my eye movements. The Vision Pro has an array of sensors to track eye movements, hand gestures, and voice commands, which are the primary ways to control it.
Looking at an icon is equivalent to hovering over it with a mouse cursor; to press a button, you tap your thumb and index finger together, making a quick pinch that is equivalent to clicking a mouse. The pinch gesture was also used for grabbing and moving apps around the screen. It was intuitive and felt less clunky than waving around the motion controllers that typically come with competing headsets. But it raised questions. What other hand gestures would the headset recognize for playing games? How good will voice controls be when Siri's voice transcription on phones is still unreliable? Apple isn't sure yet what other gestures will be supported, and it didn't let me try voice controls.
All the many uses?

The Vision Pro's first-person view of multiple apps, unveiled on Monday.
Then came time for the app demos to show how the headset might enrich our everyday lives and help us stay connected with one another. Apple first walked me through looking at photos and a video of a birthday party on the headset. I could turn a dial near the front of the Vision Pro counterclockwise to make the photo backgrounds more transparent and see the real world, including the Apple employees around me, or turn it clockwise to make the photo more opaque to immerse myself.
Apple also had me open a meditation app in the headset that showed 3-D animations while soothing music played and a voice instructed me to breathe. But the meditation couldn’t prepare me for what was coming next: a video call. A small window popped up — a notification of a FaceTime call from another Apple employee wearing the headset. I stared at the answer button and pinched to take the call.
The Apple employee in the video call was using a “persona,” an animated 3-D avatar of herself that the headset created using a scan of her face. Apple portrays videoconferencing through personas as a more intimate way for people to communicate and even collaborate in virtual space. The Apple employee’s facial expressions looked lifelike, and her mouth movements synchronized with her speech. But because of how her avatar was digitally rendered, with the uniform texture of her face and the lack of shadows, I could tell it was fake. It resembled a video hologram I had seen in sci-fi movies like “Minority Report.”

In the FaceTime session, the Apple employee and I were supposed to collaborate on making a 3-D model in an app called Freeform. But I stared at it blankly, thinking about what I was seeing. After three years of being mostly isolated during the pandemic, Apple wanted me to engage with what was essentially a deepfake video of a real person. I could feel myself shutting down.

My “ick” sensation was probably what technologists have long described as the uncanny valley, a feeling of unease when a human sees a machine creation that looks too human. A technological feat? Yes. A feature I would want to use with others every day? Probably not anytime soon.
Real people
After the demo, I drove home and processed the experience during rush hour. Over dinner, I talked to my wife about the Vision Pro. The Apple goggles, I said, looked and felt better than the competing headsets. But I wasn’t sure that mattered.

Other headsets from Meta and Sony PlayStation were much cheaper and already quite powerful and entertaining, especially for playing video games. But whenever we had guests over for dinner and they tried the goggles on, they lost interest after less than half an hour because the experience was exhausting and they felt socially disconnected from the group. Would it matter if they could twist the dial on the front of the headset to see into the real world while wearing it? I suspect it would still feel isolating, because they would probably be the only person in a room wearing one.

But more important to me was the idea of connecting with others, including family members and colleagues, through Apple headsets.