The new iOS 18 feature to control the iPhone with your eyes has been a failure

At the start of June, Apple showcased the new accessibility features coming to iOS 18 during its developer conference. One in particular caught our attention: Eye Tracking, which lets users control the iPhone with their eyes alone, no hands required. It felt like a glimpse of the future, but how has it turned out?

We tested it for the first time during the initial betas and found it impressive, although it required some refinement, as it wasn’t entirely accurate. However, being a beta version, we didn’t expect perfection; as long as it functioned, we could consider it a success. Unfortunately, months later, it remains unchanged, or perhaps even worse.

Control iPhone with your eyes?

As mentioned, this feature is part of iOS 18, compatible with all devices that have been updated to this version. Regardless of the device, the outcome has been quite similar, and we have to say it is not satisfactory.

We tested it on an iPhone 13 Pro and an iPhone 15, and in both cases, while it does let you control the iPhone with your eyes, doing so is overly complicated. Trying to point at a specific spot between two buttons, for instance, becomes an arduous task, so the results are far from optimal. The feature is designed for people with limited mobility and has real potential, but it still falls short of expectations: users often end up having to tap the screen anyway to go back, redo the setup, or leave screens that only respond to touch.

Thus, despite being a commendable idea, it has not evolved as Apple anticipated. The tool has not been updated since the first beta of iOS 18, and the precision improvements we hoped for have not materialized; we may have to wait longer. At least the technology is patented, and there is a chance that iOS 19, or future devices with better front sensors, will bring a more polished and functional version. For now it offers little: as we said, barely two minutes pass before we have to fall back on our fingers. As a novelty it may be appealing, but it is not yet effective for the people who genuinely need it.

iOS 18.1 is nearing release, bringing with it several Apple Intelligence features that could reshape many experiences. Since eye control does not depend on those updates, however, we don’t expect it to improve; any enhancements will likely arrive in the coming years. As a starting point it is promising, but it simply does not work as well as Apple promised.

Have you had a chance to try it? How was your experience? Performance varies noticeably with lighting conditions and between iPhone models, since better front cameras detect eye movements more reliably. Even so, none of the phones we tested handled the feature perfectly, at least not to our satisfaction.

Exploring iOS 18 Accessibility: Controlling Your iPhone with Your Eyes

At the beginning of June, Apple showcased impressive new accessibility features in iOS 18 during its annual developer conference. Among these innovations was the ability to control your iPhone using only your eyes, a concept that feels like stepping into the future. But how does it perform in the real world, and has it lived up to expectations?

The Eye Control Feature: What Is It?

The eye control feature was touted as a revolutionary enhancement aimed primarily at users with mobility impairments. This innovation allows users to navigate their devices without using their hands, relying solely on their eye movements. Compatible with all devices running iOS 18, the feature’s functionality remains consistent across models. Unfortunately, as of the latest updates, the performance leaves much to be desired.

Screen from iOS 18 beta version

First Impressions: Testing Eye Control

In our initial tests with both the iPhone 13 Pro and the iPhone 15, we found the eye control feature to be an exciting prospect but fraught with challenges. Users can indeed control their iPhone with their eyes; however, aiming accurately between multiple buttons proved to be a frustrating endeavor. We encountered significant issues when trying to select specific icons or options on the screen.

This feature, while groundbreaking, appears to fall short of expectations. For instance:

  • Aiming at closely spaced buttons was often erratic.
  • The need to touch the screen remained prevalent, undermining the hands-free idea.
  • Adjusting for varying lighting conditions produced inconsistent results.

The User Experience: Challenges Faced

The design of the eye control tool aimed to provide a seamless experience, yet the reality is different. Users often find themselves returning to traditional touch methods due to:

  • Precision Issues: Aiming accurately at small icons or buttons can become nearly impossible, making the experience cumbersome.
  • Frustration Levels: Frequent adjustments or taps to correct misselections lead to user frustration rather than the convenience promised.
  • Limited Functionality: Many actions still necessitate a physical touch, such as navigating back or exiting screens.

Current Status of Eye Control Feature

As of the release of iOS 18.1, it appears that no significant enhancements have been made to the eye control feature. Many users are hopeful that future updates will address these shortcomings, but as of now, the feature feels more like a trial than a complete solution.

Expected Improvements and Future Developments

While no substantial improvements have occurred since the initial beta, Apple’s investment in improving its accessibility features remains promising. It’s possible that with advancements in hardware, particularly in future iPhone models with superior front sensors, eye control could evolve into a more precise and functional tool. For now, it serves as a basic introduction to what could eventually become a valuable accessibility asset.
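To give a sense of the signal such a tool works with: Apple does not document the private machinery behind Eye Tracking, but ARKit’s public face-tracking API on TrueDepth-equipped iPhones exposes a comparable gaze estimate. The sketch below is a minimal illustration, not Apple’s actual implementation (the `GazeReader` class is our own invention); it reads `ARFaceAnchor.lookAtPoint`, the point the user’s eyes are estimated to converge on.

```swift
import ARKit

// Minimal sketch (our own, not Apple's implementation): reading the gaze
// estimate from ARKit face tracking on a TrueDepth-equipped iPhone.
final class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires the TrueDepth front camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint: estimated gaze target in the face anchor's
            // coordinate space (metres). Mapping this onto screen targets
            // is the hard part an eye-control layer has to solve.
            print("gaze:", face.lookAtPoint)
        }
    }
}
```

The raw estimate jitters from frame to frame, which is precisely the precision problem described above: any practical eye-control layer has to smooth this signal and snap it onto on-screen targets, and how well that works depends heavily on the front sensor.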

Potential Updates to Anticipate

  • Improved Calibration: enhancements in eye-tracking accuracy across different lighting conditions. Expected impact: a better user experience and less dependence on touch.
  • Feedback Mechanism: haptic feedback to confirm actions. Expected impact: increased confidence while using eye control.
  • Customization Options: adjustable sensitivity and speed of eye tracking. Expected impact: a personalized experience catering to individual needs.

Practical Tips for Users Trying Eye Control

If you’re considering trying this eye control feature, here are some practical tips to enhance your experience:

  • Optimal Lighting: Ensure you are in a well-lit environment to help your iPhone detect your eye movements better.
  • Distance and Positioning: Maintain an appropriate distance from the screen for best recognition.
  • Take Your Time: Allow for slower, deliberate movements when attempting to select options.
  • Regular Updates: Keep your device updated to receive any improvements Apple may roll out for the feature.

User Experiences: How Does It Work?

Have you had the opportunity to test the eye control feature? Feedback varies widely among users, with some citing improved accessibility while others express frustration due to the aforementioned challenges. Notably, the performance can fluctuate depending on:

  • Device Model: newer models may have better sensors, yielding a slightly improved experience.
  • Lighting Conditions: variations in ambient light heavily influence functionality.
  • User Technique: individual comfort with the new method leads to varying degrees of success.

Many users emphasize that while the concept is exciting and filled with potential, the practicality of the eye control feature requires much refinement to be a reliable tool for everyday use.
