[CCCNFBW] Fwd: [wcb-l] Upcoming accessibility features for phones

Merribeth Greenberg merribeth.manning at gmail.com
Mon May 23 16:03:34 UTC 2022


You may already know about this stuff, but if not...

Coming soon to a phone near you: A new wave of accessibility tools

Think door detection, sound recognition and lots of captions

By Chris Velazco

May 19, 2022 at 7:00 a.m. EDT


Together, Apple and Google are responsible for the software that powers
nearly all of the world’s smartphones. And within the last week, both of
them have outlined plans to make that software more useful for users with
disabilities.


Google was up first, showcasing accessibility updates this month that will
be folded into some of the support apps the company maintains. Apple
followed suit with a bevy of accessibility announcements of its own, which
are widely expected to appear in the iOS 16 software update due later this
year.


Some of the features these companies previewed aim to make the world easier
to navigate for people who are blind or visually impaired. Others were
designed to make music, movies and conversations easier to follow for
people who can’t hear well — or at all.


Here’s our guide to some of the most interesting changes these companies
are planning, and what they may be able to do for you or someone in your
life.


Door detection

Available on: iOS

When is it coming? “Later this year,” Apple says.

iPhones come with a Magnifier app that does pretty much what you’d expect:
it enlarges your view of whatever’s in front of the camera. But in an
upcoming software update, the app will gain a “door detection” tool to help
blind iPhone users more easily get where they’re going.

Here’s how it will work: once someone opens the Magnifier app, they jump
into a new “Detection Mode” and aim the phone at their surroundings. If
there’s a door nearby, the app will audibly tell its user how far away it
is, and whether it’s open or closed. If the app has a clear view of that
door, it’ll also begin to describe aloud specific details — think numbers
and text on and around the door and whether you’re meant to turn a knob or
push a bar to open it.

Door detection is perhaps the most clever accessibility feature Apple has
talked about recently, but it comes with a catch: it only works on a short
list of devices. For now, those include the iPhone 12 Pro, 12 Pro Max, 13
Pro and 13 Pro Max, along with iPad Pros from 2020 onward.

That’s because the feature relies in part on a technology called light
detection and ranging, or lidar for short. It’s a way for devices (and,
more commonly, cars) to very quickly scan the space around them, and Apple
only builds the parts that make this spatial awareness possible into its
more expensive products.
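
For the technically curious: Apple hasn’t said how Detection Mode works
under the hood, but the lidar depth data it relies on is already available
to developers through ARKit. Below is a rough Swift sketch that simply
confirms depth frames are arriving on a lidar-equipped device; the actual
door-finding logic is Apple’s own and isn’t shown here.

    import ARKit

    // A minimal sketch: stream lidar depth frames on a supported
    // iPhone or iPad. Any door-detection model on top of this data
    // is hypothetical.
    final class DepthReader: NSObject, ARSessionDelegate {
        private let session = ARSession()

        func start() {
            // Only lidar-equipped devices support scene depth.
            guard ARWorldTrackingConfiguration
                .supportsFrameSemantics(.sceneDepth) else {
                print("No lidar scanner on this device")
                return
            }
            let config = ARWorldTrackingConfiguration()
            config.frameSemantics = .sceneDepth
            session.delegate = self
            session.run(config)
        }

        func session(_ session: ARSession, didUpdate frame: ARFrame) {
            // depthMap is a CVPixelBuffer of per-pixel distances in meters.
            guard let depth = frame.sceneDepth?.depthMap else { return }
            print("Depth frame:", CVPixelBufferGetWidth(depth), "x",
                  CVPixelBufferGetHeight(depth))
        }
    }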


Automatic image descriptions

Available on: Android

When is it coming? Available as a beta feature in Google’s Lookout app on
June 2.

Three years ago, Google launched an Android app called Lookout as a sort of
catchall toolbox for its blind and low-vision users.

If you open the app and point the camera at a snippet of text, for
instance, the app will begin to read those words out loud. “Explore mode,”
meanwhile, tries its best to describe what’s in front of the camera — it
likes to describe the water bottle that perpetually lives on my desk as
“tableware with text ‘The Washington Post.’ ”

Google’s latest addition to this toolbox is a feature that can describe
what’s going on in certain kinds of photos. (Certain iPhone and iPad models
have had a similar tool, called VoiceOver Recognition, for a few years.)

“Imagine reading a news story with a photo or receiving an image on social
media that people are reacting to, but not knowing what’s in that image,”
said Scott Adams, product manager for Lookout, in a video meant for app
developers. “That’s where the potential of [machine learning] comes in.”

For now at least, you probably shouldn’t expect Lookout to correctly
explain gritty, complicated scenes. Adams said that, beyond offering basic
captions like “dogs playing in the grass,” the tool will be able to
interpret
text — say, on a sign — and identify people in images by their “upper body
clothing and whether they’re an adult or youth.” Even better, all of that
image recognition will happen solely on your phone or tablet — no more
sending info to far-flung servers.
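
Lookout is an Android app and Google hasn’t published its model, but the
basic idea of labeling a photo entirely on the device is easy to sketch.
Here’s a rough Swift example using Apple’s Vision framework, purely to
illustrate on-device image labeling; it is not Lookout’s actual code.

    import UIKit
    import Vision

    // A minimal sketch: generate coarse labels for an image without
    // any server round trip.
    func describeImage(_ image: UIImage) {
        guard let cgImage = image.cgImage else { return }
        let request = VNClassifyImageRequest()
        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        try? handler.perform([request])
        // Keep only reasonably confident labels.
        let labels = (request.results ?? [])
            .filter { $0.confidence > 0.5 }
            .map { $0.identifier }
        print("This image may contain:", labels.joined(separator: ", "))
    }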


Live captions for audio

Available on: iOS, iPadOS, macOS, Android

When is it coming? Available as a beta feature for Apple products later
this year and for some Android devices now.

Certain Android devices have had the ability to automatically transcribe
audio — be it from a podcast, movie or anything else playing on your phone
— for a few years. (If your device uses Android 10 or later, you may
already have it installed.)

Now, Apple is joining the club.

The company confirmed this week that a similar feature will arrive on
iPhones, iPads and Mac computers as a beta later this year. Like on
Android, Apple’s Live Captions will appear on-screen for just about any
audio that a phone can play, but there’s at least one helpful difference
here: group video calls in services like FaceTime will attribute those
transcriptions to specific speakers, so users can keep up with the flow of
conversation more easily.
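
Live Captions is a system-level feature, but the underlying idea of turning
audio into on-screen text on the device itself is something apps can
already approximate with Apple’s Speech framework. A rough Swift sketch,
assuming the user has already granted microphone and speech-recognition
permission:

    import AVFoundation
    import Speech

    // A minimal sketch: transcribe microphone audio, keeping the
    // processing on the device where the hardware supports it.
    final class Captioner {
        private let engine = AVAudioEngine()
        private let recognizer = SFSpeechRecognizer()
        private let request = SFSpeechAudioBufferRecognitionRequest()
        private var task: SFSpeechRecognitionTask?

        func start() throws {
            guard let recognizer = recognizer,
                  recognizer.isAvailable else { return }
            if recognizer.supportsOnDeviceRecognition {
                request.requiresOnDeviceRecognition = true
            }
            let input = engine.inputNode
            input.installTap(onBus: 0, bufferSize: 1024,
                             format: input.outputFormat(forBus: 0)) {
                [request] buffer, _ in
                request.append(buffer)
            }
            engine.prepare()
            try engine.start()
            task = recognizer.recognitionTask(with: request) { result, _ in
                if let text = result?.bestTranscription.formattedString {
                    // A real app would draw this on-screen.
                    print("Caption:", text)
                }
            }
        }
    }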


Apple Watch mirroring

Available on: iOS

When is it coming? “Later this year,” Apple says.

You might think that a wearable like the Apple Watch has to be poked and
prodded to operate it, but that’s not true — you can also interact with one
by clenching your hands and making pinching motions with your fingers.

But even those gestures aren’t manageable for some of Apple’s users, so the
company built a feature that mirrors the action on the Watch’s tiny screen
onto a connected iPhone.

That means people with motor disabilities and limited mobility can control
those Apple Watches using the iPhone’s built-in tools. Voice Control, which
first appeared in 2019, lets people navigate through on-screen menus,
webpages and more by calling out very specific commands. Switch Control,
meanwhile, lets its users interact with an iPhone using accessories like
joysticks and sip-and-puff switches. Since this new feature replicates the
Apple Watch’s screen on the iPhone, all of those controls can be used to
interact with a Watch without ever having to touch it.


Custom sound recognition

Available on: iOS and Android

When is it coming? Later this year for Apple products. Google wouldn’t
confirm a launch date for the new feature.

Deaf and hard-of-hearing users have been able to use their iPhones and
Android devices to receive visual alerts when those devices detect certain
sounds — think breaking glass, crying babies or fire alarms. In time,
though, both companies say people will be able to “teach” their phones to
listen for more specific kinds of audio cues.

Think about it: there’s a very good chance that your microwave’s telltale
ding doesn’t sound anything like your dryer’s. (Mine emits a sort of
sickly, raspy buzz.) Rather than just lumping both of those into a single
broad category of “appliance noises,” you’ll be able to train your phone to
tell them apart and send you the right notification when your food is warm
again.
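
Under the hood, Apple already ships a built-in sound classifier that apps
can reach through its SoundAnalysis framework; recognizing your own custom
sounds would instead mean training a model (in Create ML, for example) and
handing it to the same kind of request. A rough Swift sketch of the
built-in classifier:

    import AVFoundation
    import SoundAnalysis

    // A minimal sketch: listen for Apple's built-in sound classes
    // (iOS 15+). Custom sounds would instead use a model trained in
    // Create ML, passed to SNClassifySoundRequest(mlModel:).
    final class SoundListener: NSObject, SNResultsObserving {
        private let engine = AVAudioEngine()
        private var analyzer: SNAudioStreamAnalyzer?

        func start() throws {
            let input = engine.inputNode
            let format = input.outputFormat(forBus: 0)
            let analyzer = SNAudioStreamAnalyzer(format: format)
            let request = try SNClassifySoundRequest(
                classifierIdentifier: .version1)
            try analyzer.add(request, withObserver: self)
            self.analyzer = analyzer
            input.installTap(onBus: 0, bufferSize: 8192,
                             format: format) { buffer, when in
                analyzer.analyze(buffer,
                                 atAudioFramePosition: when.sampleTime)
            }
            engine.prepare()
            try engine.start()
        }

        func request(_ request: SNRequest, didProduce result: SNResult) {
            guard let result = result as? SNClassificationResult,
                  let top = result.classifications.first,
                  top.confidence > 0.8 else { return }
            // Built-in labels include things like "smoke_detector".
            print("Heard:", top.identifier)
        }
    }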

