How Will Apple Innovate Beyond the iPhone 7? With Next-Gen Siri

Xconomy National — 

Apple’s iPhone 7 is due out next month. According to the Wall Street Journal, PFWTMs (“people familiar with the matter”) say the new device will look pretty much the same as the iPhone 6s. The biggest change: no more headphone plug. Customers who want to listen to music or podcasts on their iPhones will need to get Bluetooth earbuds or new headphones with Lightning connectors.

Doesn’t sound like much of an innovation. (Or is it? I’ll come back to that below.) The first point to acknowledge is that if the reports are right, it would be the first time Apple has departed from its traditional two-year upgrade cycle. Redesigned iPhones typically come out in even-numbered years. 2015 saw the introduction of the iPhone 6s, a minor makeover of the previous year’s iPhone 6. So, going by the traditional schedule, the 2016 iPhone would have been a major refresh, perhaps helping to reverse this year’s slowdown in sales of iOS devices.

So, what’s going on inside Apple? And what does this portend for the future of mobile interfaces?

One interpretation is that smartphones have reached the Plateau of Near-Perfection.

Already on that plateau are products like the commercial jet airliner, which hasn’t changed much since Boeing introduced the 747 in 1970. Planes have evolved on the inside through the addition of features like all-glass cockpits and fly-by-wire software. But in essence, aircraft builders have concluded that there’s only one efficient design for an aluminum airframe riding on gas-turbine engines, and it’s the one we’ve got.

Maybe we’ve reached the equivalent point in smartphone design, where the devices already do everything we need them to do. An iPhone is an alarm clock, a camera, a pager, a game console, a compass, a map, an audio and video recorder, a phone, and a Dick Tracy two-way video link. It’s hard to imagine what new capabilities could be added to this Victorinox of gadgets.

Or maybe that’s all wrong. It could be that there’s still plenty of room for new features, but Apple is simply taking a longer breather between upgrades.

The company might be planning something splashy for 2017, the 10th anniversary of the original iPhone. According to the same PFWTMs, Apple designer Jony Ive wants to get rid of the iPhone’s bezels and use an edge-to-edge OLED screen to make the device look and act like a single sheet of smart glass. That sounds cool, but tricky. It would be understandable if it took an extra year.

But I suspect something more is going on. Yes, designing an edge-to-edge display might involve some tough hardware challenges. But my bet is that the seeming slowdown in the iPhone iteration cycle isn’t primarily a hardware issue. It’s more like a synchronization problem. Maybe, before phones can get smarter, software engineers have to get smarter.

Let me explain. What made the original iPhone so groundbreaking was a remarkable convergence of hardware and software innovation. It was the first device to feature both a phone-sized touchscreen and a new interface that made the screen intuitive and delightful to use. The combination opened up so many possibilities that app builders have now spent the better part of a decade exploring them.

We may not see another great leap in mobile technology—one that supercharges the whole category, the way the first iPhone did—until someone comes up with a similar synthesis of hardware and software innovation.

The big limitations right now aren’t on the hardware side. Microprocessors have speed to burn. Batteries last all day. Every year, phones get better displays and better cameras. With 5G broadband, wireless connections are about to get a lot faster.

No, what’s missing is an organizing idea—a new way of packaging and interacting with the information on our mobile devices. Something that will make iOS look as primitive as the operating system on a 2003 Treo or BlackBerry.

And here’s my big prediction: that idea is Siri. Or rather, her future offspring. To make the next leap, Apple and its competitors need to figure out how to build more powerful conversational agents, so that computing can finally move off the screen and into our daily lives.

Ever since 1988, when William Gibson published Mona Lisa Overdrive, the final novel in his famous cyberpunk trilogy, I’ve been waiting for a real-world version of Continuity. In the book, Continuity was a conversational expert system, or maybe a self-aware artificial intelligence, plugged into a vast global database. From her networked house, Angie, the book’s protagonist, could call it up by name:
“Hello, Angie.”

“Do you know how to reach Hans Becker?”

“I have his agent’s number in Paris.”

And so forth. Continuity could do research for Angie, make videos, even write books. What I found intoxicating about Gibson’s idea was not just the concept of a global knowledge base (and remember, this was a decade before Google, and 13 years before Wikipedia)—it was the idea that you’d be able to talk to it.

Today, the big players in tech are investing billions in conversational and/or voice-activated virtual assistants. Microsoft has Cortana, Amazon has Alexa, Google has OK Google, and Apple has Siri. In addition, Facebook is helping companies create chatbots for Facebook Messenger, and Siri’s original creators are working on a new, more context-aware system called Viv. Conversational interfaces are hot.

That said, none of these assistants are anywhere near the point where they can hold up their end of an actual conversation. That would require what computer scientists call “strong AI” or “artificial general intelligence,” which is likely decades away.

But that’s okay, because frankly, Continuity-style AI is overkill. We don’t need our laptops or smartphones to be our friends. We just need them to understand our needs, and to be better at tapping into all of the digitally mediated services that surround us.

Once Siri gets smart enough to take over many of the tasks that currently force me to poke at a tiny screen with my fat fingers, you can bet I’ll buy a new iPhone. I’m talking about things like:

  • Reading everything to me out loud—including incoming e-mail, text messages, news reports, and social media—and taking dictated replies.
  • Not just reminding me about upcoming appointments, but changing them if necessary.
  • Taking a picture of whatever I’m looking at, or making an audio recording of whatever I’m hearing.
  • Finding new content or experiences based on my habits and preferences: “Siri, play me something similar to Vijay Iyer.”
  • Saving content that I’ll want later: “Siri, send this to Pocket.” “Siri, store this in Evernote.”
  • Paying for stuff: “Siri, charge this coffee to my Visa.”
  • Relaying instructions to the other networked devices in my home, like my TV, stereo, thermostat, or security system.
  • Answering complex reference questions without sending me to a Web browser.
  • Interfacing with other Web services: “Siri, call me an Uber.” “Siri, get me a table at Legal Sea Foods tomorrow at 12:30.” “Siri, order more AA batteries from Amazon.” (Yes, I know Alexa already does that last one.)
  • Staying aware of history and context, so that I can “stack” inquiries without having to start over every time.
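That last item—stacking inquiries—is really just conversational state. To make the idea concrete, here is a minimal sketch (all names and intents are hypothetical, not any real Siri API): the assistant remembers slots from earlier turns, so a follow-up request can leave out details that were already established.

```python
# A toy context-aware dispatcher: slots mentioned in one turn are
# remembered and reused to fill gaps in later turns, so the user
# doesn't have to "start over every time."

class Assistant:
    def __init__(self):
        self.context = {}  # slots carried between turns

    def handle(self, intent, **slots):
        # Fill any missing slots from remembered context.
        for key, value in self.context.items():
            slots.setdefault(key, value)
        # Remember this turn's slots for later follow-ups.
        self.context.update(slots)
        if intent == "find_restaurant":
            return f"Found {slots['name']} nearby."
        if intent == "book_table":
            return f"Booked a table at {slots['name']} for {slots['time']}."
        return "Sorry, I can't do that yet."

assistant = Assistant()
print(assistant.handle("find_restaurant", name="Legal Sea Foods"))
# The follow-up omits the restaurant name; it resolves from context.
print(assistant.handle("book_table", time="12:30"))
```

The point isn’t the ten lines of Python—it’s that today’s assistants mostly lack even this much memory between turns, which is why every query feels like a cold start.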

At that level of intelligence, it won’t matter as much whether the iPhone has a beautiful edge-to-edge OLED screen, because we won’t need to pull our phones out of our pockets as often.

From this point of view, the alleged disappearance of the headphone jack on the iPhone 7 begins to make sense. If you’re going to be talking with your iPhone a lot more, you’ll need to be able to hear it, and it will need to hear you, even when there’s lots of noise around. That means you’ll need an earbud, and that earbud should work whether your phone is in your pocket or on the other side of the room, which means it needs to be cordless.

If you saw the 2013 movie Her, none of what I’m describing will be novel to you. In the movie, Theodore (Joaquin Phoenix) still carried a little smartphone-like device, about the size of a business card holder. But it acted only as a camera and a wireless conduit to his personal cloud-based AI, Samantha (Scarlett Johansson). Theodore and Samantha communicated mainly through his earpiece, which looked like a stylish hearing aid.

Bluetooth earbuds are a real thing now, so in hardware terms, the movie wasn’t all that prescient. Cutting the cord is the first step toward getting people to think of their smartphones and their earbuds as an always-there, connected ecosystem. (Once you’ve got such a system, you don’t need a separate voice-assistant device like Amazon’s Echo, which is why I’m skeptical about reports that Apple is developing one.)

I wish I could predict how many more steps will be needed before Siri becomes a truly reliable and versatile personal assistant. In a 2014 column called “Welcome to the Seven-Year Technology Pause,” I noted that big technology advances tend to take about 15 years to play out. If the iPhone’s introduction in 2007 kicked off one such cycle, I wrote at the time, then we were still seven or more years away from the next big hardware-software convergence. Given the gradual pace of recent progress on interfaces, that may not be a bad estimate.

So here’s what I’d really like to ask Siri: What do I have to do to keep my iPhone 6 working until 2022?