A few years ago, backstage at a conference, I spotted a young blind woman using her phone. The phone was speaking everything her finger touched on the screen, allowing her to tear through her apps. My jaw hit the floor. After years of practice, she had cranked the voice’s speed so high, I couldn’t understand a word it was saying.
And here’s the kicker: She could do all of this with the screen turned off. Her phone’s battery lasted forever.
Ever since that day, I’ve been like a kid at a magic show. I’ve wanted to know how it’s done. I’ve wanted an inside look at how blind people can navigate a phone that’s basically a slab of featureless glass.
I finally got my chance. Joseph Danowsky offered to spend a morning with me, showing me the ropes.
Joe majored in economics at the University of Pennsylvania, got a law degree at Harvard, worked in the legal department at Bear Stearns, became head of solutions at Barclays Wealth, and is now a private-client banker at U.S. Trust. He often commutes to New York from his home in New Jersey.
Joe was born with cone-rod dystrophy. He can see general shapes and colors, but no detail. (Only about 10 or 15 percent of visually impaired people don’t see any light or color at all.) He can’t read a computer screen or printed materials, recognize faces, read street signs or building numbers, or drive. And he certainly can’t see what’s on his phone.
Yet Joe spends his entire day on his iPhone. In fact, he calls it “probably the number one assistive device for people who can’t see,” right up there with “a cane and a seeing eye dog.”
The key to all of this is an iPhone feature called VoiceOver. At its heart, it’s a screen reader—software that makes the phone speak everything you touch. (Android’s TalkBack feature is similar in concept, but blind people find it far less complete; for example, it doesn’t work in all apps.)
You turn on VoiceOver in Settings -> General -> Accessibility. The moment you turn it on, a female voice begins reading the names of the controls she sees on the screen; a Speaking Rate slider lets you adjust how fast the synthesized voice talks.
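VoiceOver’s own voice isn’t something you can script, but the same built-in speech engine is available to any iOS app through AVFoundation. Here’s a minimal sketch of speaking a phrase at an accelerated rate; the phrase and the rate value are purely illustrative, not VoiceOver’s actual settings:

```swift
import AVFoundation

// A minimal sketch of iOS's built-in speech synthesizer, the same
// underlying technology VoiceOver uses. The phrase and rate here are
// purely illustrative; practiced users push the speed far higher.
let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "Mail. Fourteen new items.")
utterance.voice = AVSpeechSynthesisVoice(language: "en-US")

// rate runs from AVSpeechUtteranceMinimumSpeechRate (0.0) to
// AVSpeechUtteranceMaximumSpeechRate (1.0); the default is about 0.5.
utterance.rate = 0.75  // noticeably faster than the default
synthesizer.speak(utterance)
```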
There’s a lot to learn in VoiceOver mode; people like Joe have its various gestures committed to muscle memory, so that they can operate with incredible speed and confidence.
But the short version is that you touch anything on the screen—icons, words, even the status icons at the top—and as you go, the voice tells you what you’re tapping. “Messages.” “Calendar.” “Mail—14 new items.” “45 percent battery power.” You can tap the dots on the Home screen, and you’ll hear, “Page 3 of 9.”
You don’t even have to lift your finger; you can just slide it around, getting the lay of the land.
Once you’ve tapped a screen element, you can also flick your finger left or right—anywhere on the screen—to “walk” through everything on the screen, left to right, top to bottom.
Ordinarily, you tap something on the screen to open it. But since a single tap now means “speak this,” you need a new way to open everything. So: To open something you’ve just heard identified, you double-tap. (You don’t have to wait for the voice to finish talking.) In fact, you can double-tap anywhere on the screen; since the phone already knows what’s currently “highlighted,” it’s like pressing the Enter key.
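Those spoken announcements don’t come from magic; apps describe each control to VoiceOver through UIKit’s accessibility properties. A minimal sketch, with the label and value strings made up for illustration:

```swift
import UIKit

// A minimal sketch of how an app describes a control to VoiceOver.
// The strings are hypothetical; a real app would localize them.
let mailTile = UIView()
mailTile.isAccessibilityElement = true        // treat this view as one touchable element
mailTile.accessibilityLabel = "Mail"          // announced first when touched
mailTile.accessibilityValue = "14 new items"  // announced after the label
mailTile.accessibilityTraits = .button        // adds "button" and double-tap activation
```

With that metadata in place, touching the tile speaks “Mail, 14 new items, button,” and a double-tap anywhere activates it.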
There are all kinds of other special gestures in VoiceOver: for making the speaking stop; for reading everything from the top of the screen; for scrolling one page at a time; for going to the next or previous screen (Home, Stocks, and so on); and more.
If you do a three-finger triple-tap, you turn on Screen Curtain, meaning that the screen goes black. You gain visual privacy as well as a heck of a battery boost. (Repeat to turn the screen back on.)
Joe, however, doesn’t see that battery boost, since he’s on the phone all day long. In fact, he’s equipped his phone with one of those backup-battery cases.
The Rotor
Joe also demonstrated for me the Rotor: a brilliant solution to a thorny problem. There are dozens of settings to control in a screen reader like VoiceOver: voice, gender, language, volume, speaking speed, verbosity, and so on. How do you make all of these options available in a concise form that you can call up from within any app—especially for people who can’t see controls on the screen?
The Rotor is an imaginary dial. It appears when you twist two fingers on the screen as if you were turning an actual dial.
Each “notch” around the dial represents a different setting you might want to change: Characters, Words, Speech Rate, Volume, Punctuation, Zoom, and so on.
“Let’s say we want VoiceOver to read word by word, because there’s something there that we want to hear spelled. We bring up the Rotor,” Joe told me. “It’s a deep menu system. And I can choose what I’m putting there, and the order. There are 20 or 30 items that could go on the Rotor.”
Once you’ve dialed up a setting, you can get VoiceOver to move from one item to another by flicking a finger up or down. For example, if you’ve chosen Volume from the Rotor, then you make the playback volume louder or quieter with each flick up or down. If you’ve chosen Zoom, then each flick adjusts the screen magnification.
The Rotor is especially important if you’re reading on the web. It lets you jump among web page elements like pictures, headings, links, text boxes, and so on. Use the Rotor to choose, for example, images—then you can flick up and down from one picture to the next on that page.
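Apps can even add entries of their own to the Rotor. A simplified sketch using UIKit’s UIAccessibilityCustomRotor API, assuming a hypothetical imageViews array that the app keeps in reading order:

```swift
import UIKit

// A simplified sketch of a custom Rotor entry that jumps between the
// image views on a screen. `imageViews` is a hypothetical ordered
// array the app maintains for this example.
func makeImagesRotor(for imageViews: [UIImageView]) -> UIAccessibilityCustomRotor {
    return UIAccessibilityCustomRotor(name: "Images") { predicate in
        // Find where the user currently is, then step forward or
        // backward depending on the direction of the flick.
        let current = predicate.currentItem.targetElement as? UIImageView
        let index = current.flatMap { imageViews.firstIndex(of: $0) } ?? -1
        let nextIndex = predicate.searchDirection == .next ? index + 1 : index - 1

        guard imageViews.indices.contains(nextIndex) else { return nil }
        return UIAccessibilityCustomRotorItemResult(
            targetElement: imageViews[nextIndex],
            targetRange: nil
        )
    }
}

// Attached to the screen's main view, this adds "Images" to the Rotor:
// view.accessibilityCustomRotors = [makeImagesRotor(for: imageViews)]
```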
A day in the life
Joe walked me through a typical day, starting with a check of the weather and the train schedule, followed by a scan of his To Do list and email Inbox; on the train, he might read the news or listen to a podcast, audiobook, or music.
Through the workday, he has a few other tricks:
“There’s another gesture, called scrubbing. You take two fingers, and you kind of Z-line it. It means, ‘Go Back.’”
“Getting a cab is very hard for me to do. I’ll be standing on the street corner, and people look at me and say, ‘What is this guy doing?’ They don’t see that I’m visually impaired, that I can’t see if somebody’s inside the cab! But now, I call a Lyft car or an Uber car, and it’s saying, ‘The car is 2 minutes away’; I just call him. I’m gonna say to the driver, ‘I’m on this corner, I’ve got a blue shirt on, I’ve got a briefcase. I can’t recognize you, so just yell out to me when you get there.’”
“It’s also kinda cool to be able to project my photos to a huge TV screen. There’s a lot I can see if I get in really close to the screen.”
If he needs to read a printed document, Joe uses Kurzweil’s KNFB Reader app. As I watched, he used it to photograph a printed letter; instantly, the app converted the image to text and began reading it aloud with astonishing accuracy.
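The KNFB Reader’s Kurzweil recognition engine is proprietary, but you can sketch the same photograph-to-speech idea with Apple’s own frameworks. This illustrative version uses Vision for the text recognition and AVFoundation for the speech; it is not how the KNFB app actually works:

```swift
import Vision
import AVFoundation
import UIKit

// An illustrative photograph-to-speech pipeline: recognize printed text
// in an image, then read it aloud. A sketch built on Apple's Vision
// framework, not the KNFB Reader's actual implementation.
let synthesizer = AVSpeechSynthesizer()  // kept alive so speech isn't cut off

func readAloud(_ image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }

        // Join the best candidate from each detected line of text.
        let text = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: " ")

        synthesizer.speak(AVSpeechUtterance(string: text))
    }
    request.recognitionLevel = .accurate  // favor accuracy over speed

    try? VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
}
```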
This was very cool: “If I’m in my office and put my headphones on, I’m hearing the phone call and I’m hearing what VoiceOver is saying, all through the headphones. But the person on the other end cannot hear any of the VoiceOver stuff. You don’t know what I’m reading, what I’m doing. I can do all these complicated things without you hearing it. That’s what’s really incredible. If you and I were working together on a three-way call, and you were to text me, ‘Let’s wrap this up’ or ‘Don’t bring that up on this call’—I would know, but the other guy wouldn’t hear it.”
Joe showed me how he takes photos. As he holds up the iPhone, VoiceOver tells him what he’s seeing: “One face. Centered. Focus lock,” and so on. Later, as he’s reviewing his photos in the Camera Roll, VoiceOver once again tells him what he’s looking at: “One face; slightly blurry.”
“If a cab or an Uber lets me off somewhere, and I’m not sure which way is uptown, I open the Compass app. Since NYC is a nice grid, it lets me know which way I’m walking.”
“Or I might just say to Siri, ‘Where am I?’ She tells me exactly where I am.”
Joe uses a lot of text macros. He’s set one up that says, for example, “Where are you?” when he types wru.
He knows the positions of all his apps’ icons—but often, he’ll just say to Siri, “Open Calendar” (or whatever).
The big picture
I asked Joe if there’s anything he’d ask Apple to improve in VoiceOver.
“The biggest problem with the iPhone is when you use it a lot, you need a bigger battery. I’m using it all the time. If the phone were just a little thicker, to accommodate a double battery, that’d be a nice thing. I’m also a little disappointed they did away with the standard headphone jack, because when you use it a lot, you need to charge it all the time [and the new earbuds plug into the Lightning charging jack].”
I pointed out that none of his complaints about the iPhone have anything to do with accessibility. They’re the same complaints we all have.
“I know,” he said, laughing. “VoiceOver is very consistent and it’s extremely good. There’s no problem with VoiceOver.”
(The Associated Services for the Blind and Visually Impaired would undoubtedly agree; last year, it gave Apple its Louis Braille Award.)
And how about society? What don’t we understand? What drives him crazy? “Stop grabbing my arm when I’m crossing the street,” or “Stop talking louder to me”?
“I have to tell you, there aren’t that many anymore, surprisingly,” he replied. “As more visually impaired people enter the workforce, there aren’t too many things, honestly.”
There’s an age gap in awareness of these accessibility features, too. “What I find is, people who are older, in their 70s, who have macular degeneration and could benefit from this, don’t,” Joe says. “I don’t know why. To me, it’s so intuitive and fast and easy.”
Well, here’s the bright side: Maybe Joe’s story will help get the word out.
A version of this story was originally posted in March 2017.
David Pogue, tech columnist for Yahoo Finance, welcomes non-toxic comments in the Comments below. On the Web, he’s davidpogue.com. On Twitter, he’s @pogue. On email, he’s poguester@yahoo.com.