2023-06-06 23:33:20
One of the things that drew the most attention during the announcement of the Apple Vision Pro was the concept of “spatial input”, which lets you interact with the system through two primary input methods: hand gestures and eye targeting.
It’s essentially how Apple rethought the “pointer” for its new device. By itself, it isn’t revolutionary, but (as is tradition) it echoes introductions the company has made in the past: the mouse for its computers, the Click Wheel for the iPod and, more recently, the finger for navigating the iPhone.
Israel Pastrana Vicente and Eugene Krivoruchko, two members of Apple’s design team, spoke about the two input methods adopted in the headset and explained how they work together to provide an experience unlike anything seen on other devices of this kind.
Starting with a breakdown of gaze direction, Pastrana Vicente briefly explained how the Vision Pro’s selection mechanism works: just look at a button, tap two fingers together and presto, the magic is done. It’s a comfortable method, as it lets you keep your hands at rest and use them only for small, essential movements.
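For developers, this selection model comes essentially for free: standard SwiftUI controls on visionOS already respond to gaze with a highlight and treat the pinch as a tap. A minimal sketch (the views and counter are just illustrative):

```swift
import SwiftUI

struct SelectionDemo: View {
    @State private var count = 0

    var body: some View {
        VStack(spacing: 24) {
            // A standard control already reacts to gaze: the system
            // highlights it while the user looks at it and treats a
            // pinch as an ordinary tap. No eye-tracking code needed.
            Button("Selected \(count) times") {
                count += 1
            }

            // Custom hit areas opt in with hoverEffect(), gaining the
            // same gaze highlight as system controls.
            Circle()
                .frame(width: 60, height: 60)
                .hoverEffect()
                .onTapGesture { count += 1 }
        }
    }
}
```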
Comfort also shows up in how information is arranged: the main content sits at the center of the user’s field of view, while secondary options (such as the tab bar) go at the sides, which keeps the user focused and reduces eye strain.
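In code, that guidance maps onto familiar SwiftUI containers; on visionOS, for instance, a TabView is rendered as a vertical bar floating beside the window instead of over the content. A small sketch (the tabs are invented for illustration):

```swift
import SwiftUI

struct LibraryView: View {
    var body: some View {
        // On visionOS, TabView is displayed as an ornament on the
        // leading edge of the window, keeping the main content
        // centered in the field of view.
        TabView {
            Text("Now Playing")
                .tabItem { Label("Listen", systemImage: "play.circle") }
            Text("Browse the catalog")
                .tabItem { Label("Browse", systemImage: "square.grid.2x2") }
        }
    }
}
```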
The visual guidelines also include rounded shapes and dynamic scaling for app windows, rather than fixed (physical) scaling: a window keeps the same apparent size even when the user pushes it farther away, so the information stays readable and the eyes don’t have to refocus.
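Apps don’t implement this scaling themselves: a regular visionOS window is dynamically scaled by the system, while a volumetric window keeps a fixed, physical scale. A sketch contrasting the two (the window contents are placeholders):

```swift
import SwiftUI

@main
struct SpatialApp: App {
    var body: some Scene {
        // A plain window uses dynamic scale: pushed farther away, it
        // is rendered larger so it keeps the same apparent size and
        // stays readable.
        WindowGroup(id: "browser") {
            Text("Dynamically scaled content")
        }

        // A volumetric window uses fixed scale: like a physical
        // object, it genuinely looks smaller from farther away.
        WindowGroup(id: "globe") {
            Text("Fixed-scale 3D content")
        }
        .windowStyle(.volumetric)
    }
}
```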
Speaking specifically about the use of hands, Krivoruchko highlighted the gestures that can be used for actions such as scrolling a page or rotating an object. Developers will be able to define custom gestures for their apps, but Apple advises that they not conflict with system gestures or with movements the user would make in a normal conversation.
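On the developer side, these custom interactions are assembled from the usual SwiftUI gesture types targeted at RealityKit entities; the sketch below shows a rotation gesture, with a placeholder box standing in for a real model:

```swift
import SwiftUI
import RealityKit
import Spatial

struct RotatableModel: View {
    var body: some View {
        RealityView { content in
            // Placeholder entity; a real app would load its own model.
            let box = ModelEntity(mesh: .generateBox(size: 0.2))
            box.components.set(InputTargetComponent())
            box.generateCollisionShapes(recursive: true)
            content.add(box)
        }
        .gesture(
            // A rotation gesture aimed at whichever entity the user is
            // targeting. Simplified: applies the gesture's rotation as
            // the entity's absolute orientation.
            RotateGesture3D()
                .targetedToAnyEntity()
                .onChanged { value in
                    let q = value.rotation.quaternion
                    value.entity.orientation = simd_quatf(
                        ix: Float(q.imag.x), iy: Float(q.imag.y),
                        iz: Float(q.imag.z), r: Float(q.real))
                }
        )
    }
}
```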
Gestures work even better when combined with gaze direction, enabling interactions that don’t exist on other platforms. When using a hand gesture to zoom, for example, the system takes the user’s gaze direction as the center point of the zoom. When handwriting, gaze direction likewise determines the point where the writing begins.
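Notably, apps never receive raw eye data (it stays private to the system); a gesture simply arrives already aimed at whatever the user was looking at. A hedged sketch of a pinch-to-zoom that inherits that gaze targeting:

```swift
import SwiftUI
import RealityKit

struct ZoomableModel: View {
    @State private var baseScale: Float = 1.0

    var body: some View {
        RealityView { content in
            let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))
            sphere.components.set(InputTargetComponent())
            sphere.generateCollisionShapes(recursive: true)
            content.add(sphere)
        }
        .gesture(
            // The pinch arrives already targeted at the entity the
            // user was looking at; the app never sees eye positions.
            MagnifyGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    let s = baseScale * Float(value.magnification)
                    value.entity.scale = [s, s, s]
                }
                .onEnded { value in
                    baseScale *= Float(value.magnification)
                }
        )
    }
}
```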
Hands can also be used for tasks that are best done through direct interaction, like scrolling in Safari with your fingertips or typing on a virtual keyboard. In the latter case, since there is no physical feedback from the “keys”, Apple uses a bright highlight and audible feedback to guide the user through typing.
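Apple hasn’t exposed the system keyboard’s internals, but a custom control can imitate the pattern with a gaze highlight plus a click sound. A rough sketch (the sound asset name is hypothetical):

```swift
import SwiftUI
import AVFoundation

struct VirtualKey: View {
    let letter: String
    let onPress: (String) -> Void
    @State private var clickSound: AVAudioPlayer?

    var body: some View {
        Button {
            clickSound?.play()          // audible "keystroke" feedback
            onPress(letter)
        } label: {
            Text(letter)
                .frame(width: 44, height: 44)
        }
        // Gaze highlight while the user looks at the key, standing in
        // for the missing physical feedback.
        .hoverEffect(.highlight)
        .onAppear {
            // "key-click.wav" is a hypothetical bundled sound asset.
            if let url = Bundle.main.url(forResource: "key-click",
                                         withExtension: "wav") {
                clickSound = try? AVAudioPlayer(contentsOf: url)
            }
        }
    }
}
```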
Still in the realm of direct interaction, it is possible to inspect an object up close, simulate real-world interactions (such as playing a soundboard) or even perform tasks that require physical effort, such as a Fruit Ninja-style game in which your hands play the role of swords.
Other input methods
While gaze direction and hand gestures are the Vision Pro’s primary input methods, it also offers other options. One of them is voice, which can be used both to dictate content to the operating system and to ask Siri for something, for example.
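For apps, exposing actions to Siri presumably goes through the same App Intents framework used on Apple’s other platforms. A minimal sketch (the intent and its dialog are invented for illustration):

```swift
import AppIntents

// A hypothetical intent letting the user say something like
// "Start a focus session in <app name>".
struct StartFocusSession: AppIntent {
    static var title: LocalizedStringResource = "Start Focus Session"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Real work would start the session here.
        return .result(dialog: "Focus session started.")
    }
}
```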
If virtual typing or dictation doesn’t suit you, there’s nothing like connecting a Magic Keyboard to the product, which also accepts Bluetooth pairing with other devices such as the Magic Trackpad or even a game controller to help when playing games.
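Game controllers, in turn, are surfaced through the familiar GameController framework, so an app can simply watch for connections. A small sketch:

```swift
import GameController

final class ControllerWatcher {
    init() {
        // Fires when a Bluetooth game controller pairs with the headset.
        NotificationCenter.default.addObserver(
            forName: .GCControllerDidConnect,
            object: nil, queue: .main
        ) { note in
            guard let controller = note.object as? GCController else { return }
            print("Connected:", controller.vendorName ?? "unknown controller")
        }
    }
}
```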
The product can even serve as a great external monitor: open a 13″ MacBook Air near the Vision Pro and it simply projects the computer’s screen, with that magic only the Apple ecosystem can provide.
Interesting, isn’t it?
via 9to5Mac