Hey, first! Please pin. I love iOS 18
Well, yeah, you are holding it and moving it around. It probably needs to be stationary.
And there’s a bright window behind him. Not ideal conditions for a camera.
Tested it on my phone, and it makes a huge difference. It is absolutely meant for the device to be stationary.
@@quantuminfinity4260 I've been testing it on a stand, 30 cm distance as it asks, bright room, light on my face, etc., and it's a mess. Hope they fix it, because I know a person who needs this to use a tablet/phone, and it would be incredible for her.
@@quantuminfinity4260 It doesn't make a big difference to me; it's still very wobbly and keeps clicking things I don't want to click. They should also find a way to scroll without having to use the corners every time.
As a quadriplegic who uses eye tracking extensively on Windows, I'm just sat here like, well, yeah: people who need eye tracking can't hold their tablets, so they use stationary stands. Very poor video.
Seriously? Holding a device with an eye-movement tracker in your hands? How did you get 800k subs as a tech reviewer?
Try setting the iPad on your desk for the eye tracking, lol... you're making it try to track while your unsteady hand is jiggling the iPad around.
Eye tracking is an accessibility feature meant for people who are paraplegic, have severe motor issues, may be missing a limb, etc. The iPad would most likely be stationary and propped up at eye level.
Have you tried it when the iPad was stationary? It being handheld might have been messing with the tracking a bit.
Regardless, very informative video as always! Didn’t even know it was a feature!
After a couple of quick tests on my phone, it's clear the device is meant to be stable. You can move around with it, but it's clearly intended to be mounted on something, and that makes a massive difference.
Why can’t I see the eye tracking setting?
Same thing
Which iPad models will be supported?
I think everything with an M1 chip and later.
Came back to this video just as Apple announced the new AirPods' hearing-aid functionality. I've had patients who would benefit a lot from this.
Love your videos but come on man, this was a flawed test and doesn’t feel like it was done in good faith. Place the iPad on a stable / stationary surface so the tracking can keep up.
How do you scroll a webpage with eye tracking?
As someone with a spinal injury, who is paralyzed from the neck down, I use my Apple devices (iPhone, iPad Pro, MacBook Pro) completely by voice using Apple's accessibility feature Voice Control. You would think Eye Tracking would be a feature I could really benefit from. Unfortunately, that's not true yet.
I have demoed Apple Vision Pro twice. They encouraged me to use Eye Tracking for navigation and selection. I tried it the first session but for some reason I could not keep my gaze steady enough, or I was impatient, but I never became very successful with it. The second session I used Voice Control for navigation and selection. Using Voice Control on Apple Vision Pro was a much better experience. I quickly navigated the interface and selected things with ease. I'm still interested in Eye Tracking coming to the mobile devices. Unfortunately seeing your experience with it and it being much like my experience with Apple Vision Pro I doubt it's going to work for me. Instead I will stick with Apple accessibility feature Voice Control.
Thank you for sharing your experience. Have you ever considered using a separate eye-tracking camera for iPad? There are several of those; they use multiple cameras and are optimized for eye tracking, not for taking pictures. The most common is probably from Tobii Dynavox, but they certainly aren't cheap. They all work with the iPad and have MFi (Made for iPad) certification.
It's genuinely insanely good for using just the camera. I thought it required all the Face ID sensors, then I saw it was available on my iPad Air 5th gen.
What do I do? I can't find it in Settings 1:16
@@Arab_papa37 What gen iPad do you have? Most probably it's 9th gen or earlier. It also needs Face ID, and 9th gen or earlier has no Face ID, only a fingerprint sensor.
Personally I don't think I would ever use that feature of iOS 18 or iPad OS. But it does seem like a cool feature.🐬
Come on, this test is not serious. Please do it in a proper environment: a static iPad on a stand or on the Magic Keyboard. It's an accessibility feature, and disabled people use it with the device in a static position, and probably stay static themselves too. Besides, it's a beta version, so please take your time.
When it's your main keyboard you have muscle memory, so your criticism there and what you drew were irrelevant.
Man, if this can work, I would use it all the time. No more having to touch the screen. Hopefully this is a start and will be perfected in the future.
It's really buggy for me.
I had my iPad on the table and max screen brightness, and yet it keeps highlighting all the wrong things.
The dot is jittery and not aiming where I'm looking.
I wonder if it's because I have glasses.
How do I get this feature on iPad mini 6? I think it's not possible, right?
The eye tracking option won’t show up for me ;(
Me too (using iPad Pro 2018 with Face ID)
This seems like a really cool feature. I do, however, think the dwell feature for selecting is bad. I would prefer a quick double blink to select, making it faster. And you mentioning Affinity Photo, wow. Can you imagine selecting a brush with a double blink, slowly moving your eyes to paint, and then double blinking again to lift the brush?
what devices are compatible with this feature?? O__O
I downloaded the beta specifically for this feature and I don't have it. Come to find out, my 11-inch iPad Pro is too old, and this feature only starts with the 4th gen.
Eye tracking plus a Bluetooth ring to click is a better option than blinking.
Kinda strange my 2018 11-inch iPad Pro won't support this, yet my sister's 2019 3rd-gen iPad Air with a home button supports it. So the A12X won't get it but the A12 does. This really is something, even though I won't use the feature.
Does it support the iPad 10th gen? (It doesn't have Face ID.)
What if it's sitting on a table, not moving?
Does it work with iPad Air?
How do you confirm what you’re looking at is what you wanna “click” on? By blinking?
By holding the gaze
Why don't I have it on my iPad?
It’s a start for old timers
How do you click???
Look for a while at the thing you want to select. To turn that on you have to enable Dwell Control.
Place it on a static surface. Please
I definitely think Apple needs to develop this further and move it from an "accessibility" feature to a full-fledged native feature focused on touchless interactions. I think it's amazing! 😊
Are you a tech reviewer, or a Face Maker for thumbnails?
“Hey Siri, look at me”…. That should be the wake-up command to have it track your eye commands.
When I’m doing the dishes. (hands wet) That option would be life changing. ;)
For someone not disabled, maybe it could replace the Tobii Eye Tracker 5; that would be awesome. Imagine using your iPhone as an eye tracker and playing a flight simulator.
It's not meant to be held, obviously.
I wish it supported visionOS-style gestures, like tapping two fingers together to perform an action.
Couple it with eye blinks for gestures (double blink = click, etc.).
It's better on the Vision Pro because the cameras are stable and move with the user's head. This ensures greater precision, as the distance and displacement remain constant.
They should make the volume buttons on the iPhone work as right-click / left-click buttons.
Yes I would use this feature. - ie. eye tracking on iPad Pro. ❤
I got better performance when the iPad was stationary. This is a feature for people who can’t use their hands, so holding the iPad isn’t the intended usage.
Don't you mean a hands-off demo?
Eye tracking is proof that Apple even knows the color of your underwear. 😂
I wonder why they did not use blinking for confirmation; maybe it would make us blink too much…
Because we blink many many times per minute. So we’d select everything we look at, all the time.
@@SeanTube2099 ok
Apple could just make an external camera with an R1 chip for tracking hands that connects to their devices (yeah, Xbox Kinect).
Remake this without moving the iPad
Apple should fix VoiceOver before starting to program a new feature that doesn't work either.
It's a beta release; of course the feature isn't quite sorted. That's the purpose of a beta: feedback! It's impossible to judge final quality until iOS 18 is released, so why bother?
You can't really blame it; your hands were shaking like crazy.
Wish Samsung had this feature.
What if you could wink 😉 in order to click things!
Most eye-tracking tech needs to be stable to work.
Another great presentation Fernando 🐬
This is how Apple is gonna sell eye-tracking data :)
It’s not easy to use brother 😢
🐬
It doesn't work great on iPhone, because it's moving.
Maybe🇺🇸🐻
This was pretty embarrassing and disrespectful to the people who developed this feature and to those who need it. First you calibrated it wrong, then you used it wrong without understanding how to use it, and you have the nerve to say it's not working right. I'm sorry, I'm really upset; please delete this video.
If everything he did was "wrong", could you please explain how it was done wrong so that we know from your perspective? And note, this is Beta 1 so this feature would certainly have room for improvement so I doubt this is entirely his fault.
@Kupilainen If you were trying to troll, you facepalmed really well. He is testing a beta version, and maybe he unconsciously forgets the feature is meant for those with disabilities while talking to the majority, but eye tracking is still a work in progress.
There was no eye tracking on my iPad. This video is 100% fake. Also, my iPad version is 17.5.1
Maybe it's a problem with your eyes.
It’s awful
🐬
🐬
🐬
🐬
🐬