Apple is to enable eye tracking on the iPhone and iPad as part of a new range of accessibility tools aimed at helping people with physical disabilities use their devices more easily. Using artificial intelligence, the feature will allow users to navigate their Apple device with just their eyes. It is joined by a new Music Haptics tool, which uses the iPhone's taptic engine – the component that powers the device's vibrations – to let those who are deaf or hard of hearing experience music through vibrations synced to the audio. Apple said it was also introducing new speech features which …