Why Google’s new Pixel phone is a Trojan horse

Google’s new Pixel phone feels like a shot at Apple’s iPhone. But perhaps more importantly, it shores up Google’s software sovereignty—and its AI future

Rick Osterloh, SVP Hardware at Google, introduces the Pixel Phone by Google during the presentation of new Google hardware in San Francisco, California, U.S. October 4, 2016. (Beck Diefenbach/Reuters)

It was a barely kept secret, with spec details flying out through a cottage industry of insiders and leaks. A glitzy tech event in a California venue—a restored chocolate factory—was live-streamed across the world. An army of tech writers picked apart and reviewed the performance of each of its parts upon its buzzy release.

If all this feels familiar—a launch event for a smartphone—it’s because it is. Maybe as recently as Sept. 7, when Apple launched its iPhone 7. But for Google, a massive company that has largely treated hardware as a hobby, it’s relatively new ground. “We’re really excited for the worst-kept secret ever,” said Darren Seefried, Google Canada’s head of hardware partnerships, to laughter at a launch event in Toronto. And on a day that Google had been heralding as a major announcement—the reveal of the Google Pixel, its first true incursion into the smartphone market—the spectre of the company across whose bow it hopes to fire a shot was hard to miss.

But there was one thing that made it starkly different from an iPhone launch: On a day a new phone was announced, the phone wasn’t really the star of the show. It was the shift to AI.

First, the day’s superficial news itself: after occasional flirtations, Google is plunging fully into the hardware world, and there are a lot of new devices, albeit fewer for Canadians. Google is entering the virtual-reality field with a comfortable, breathable headset, the Daydream View, with promises of partners like J.K. Rowling and The New York Times. It announced Google Wifi, routers set to improve internet connectivity across all corners of your home. Its device-to-TV streaming Chromecast—which Google found enjoyed a 160 per cent increase in overall watch time last year—gets an upgrade, providing 4K resolution at higher speeds, coming later this year. Google Home, an Amazon Echo replica, is a speaker with Google Assistant built in, integrated with your home systems and your pre-existing Google devices (though this won’t be available in Canada for the foreseeable future).

But no hardware news was louder than the Google Pixel, a “made by Google” phone and its foray into the premium smartphone market.

There were the typical boasts: Google preened about the phone’s battery life, promising a quick-charge ability that provides seven hours of battery life from just 15 minutes of charging. Leading camera analyzer DxOMark gave the Pixel’s 12.3-megapixel camera a rating of 89—the highest ever for a smartphone. Pixel purchasers will get unlimited Google cloud storage for those big, high-res photos and videos. The phone will be widely sold by carriers like Bell and Rogers (which owns Maclean’s) starting Oct. 20.

On its surface, it’s hard not to read the release of a premium Android phone as anything but a strike at Apple and its marquee iPhone. Throughout the launch, Google embraced that attitude, too, with more than a few jabs at its rival: “There’s no unsightly camera bump,” pointedly noted Rick Osterloh, Google’s SVP of hardware, at one point. Later, boasting of Pixel’s unlimited cloud storage, a “Storage Full” pop-up in Apple’s trademark bubble appeared on screen. Later still, there was Google’s exultation about “a headphone jack,” actually listed as a Pixel feature, in the wake of Apple’s divisive decision to scrap it from the iPhone 7. And then there was Osterloh’s grand statement about Google’s “magic”: “The next big innovation is going to be at the intersection of hardware and software, with AI at the centre,” he said. If that sounds familiar, it’s because Apple has been talking this way for some time: “The magic of Apple, from a product point of view, happens at this intersection of hardware, software, and services. It’s that intersection,” said Apple chief Tim Cook in a 2015 interview with Fast Company.

That competitive jockeying with Apple is real. After all, tech watchers have long surmised that only Google could mount a truly meaningful rivalry against Apple in an already crowded marketplace. But what makes the Pixel’s launch interesting is what it really means: In a world where the currency is increasingly data rather than coin, Google’s hardware play is a Trojan horse—even if it is a beautiful one. It’s a move into making hardware that bolsters its bread-and-butter software, allows it to further integrate itself into our daily lives, and positions it to win the fight that probably matters more in the long term: over artificial intelligence and the machine-learning algorithms that power it.

South Korean professional Go player Lee Sedol reviews the match after finishing the final match of the Google DeepMind Challenge Match against Google's artificial intelligence program, AlphaGo, in Seoul, South Korea, Tuesday, March 15, 2016. Google's Go-playing computer program again defeated its human opponent in a final match on Tuesday that sealed its 4:1 victory. (Lee Jin-man/AP)

Google has long been using machine learning to improve its services. It’s how Google Images knows what pictures to spit out when you type things in—something it has improved at over the last two years, from 89.6 to 93.9 per cent accuracy. It’s how Google has managed to improve machine translation to the point that it approaches human accuracy. It’s how Google’s AlphaGo beat the world’s best human Go player in March. And it’s how Google is getting better at knowing what individual users want, based on the data they create through how they use their apps.

To do all that, high-powered computers use machine learning, sifting through immense amounts of data to find patterns and learn. Users provide this data to Google through their searches and their activity across its many apps—Mail, Maps, Play, YouTube, and so on. And more data—along with better computing—refines AI and more advanced deep-learning systems. As Geoffrey Hinton, the Canadian godfather of deep-learning algorithms, said in an interview with Maclean’s: “In deep learning, the algorithms we use now are versions of the algorithms we were developing in the 1980s, the 1990s. People were very optimistic about them, but it turns out they didn’t work too well. Now we know the reason they didn’t work too well is that we didn’t have powerful enough computers, and we didn’t have enough data sets to train them.” Imagine the pace of Google’s AI development if it had even more directed information—a phone that learns verbal cues and textual instructions, and uploads detailed photos to an infinite bank of data, in the hands of millions of people.

MORE: What is deep learning?

So it was telling that Google’s Oct. 4 event started not with what consumers would invariably see as the major announcement—the new Pixel—but with new exultations over Google Assistant improvements. “We’re building hardware with the Google Assistant at its core,” said Sundar Pichai, Google’s CEO, and indeed, the Pixel is the first phone with Assistant built in. That matters because Google’s presentation made clear that it envisions a world where you speak commands to Assistant in lieu of typing searches—the same big push behind Google Home, which even lets you automate home functions with verbal cues. In effect, the Pixel is a way for Google’s neural networks to gain even more insight into what people want and how they behave, information that benefits Google tremendously in its software development.

That’s really where the Pixel comes in, and why Google is doing this now: the Pixel is an IV drip that further fuels the AI beast. This is hardware in the service of software. And a phone, to that end, makes eminent sense: In 2015, 68 per cent of Canadians owned a smartphone, and according to a 2014 report, of the 33 hours a month Canadians spend online, 49 per cent, on average, was spent on mobile devices. Those numbers have surely risen since, underscoring how deeply our phones are woven into the warp and weft of our lives. Putting a Google phone running Google products in people’s hands gives Google direct access to more data than ever before—an immense boon to the real prize: data sets for deep learning.

As a bonus, Google gets to claim that the Pixel is a true “Google phone”: if the iPhone is, for many consumers, the product that defines Apple as a whole, then a phone promising to be made entirely by Google—conjuring emotional attachments every day—reflects back on the company at large. Yet for a device that’s “all about control” for Google, according to Google Canada’s Seefried, the Pixel isn’t actually “made by Google,” as the company has crowed in its launch hashtag and website; while Google controls the design, it has farmed out the manufacturing to HTC.

But those are small potatoes. Today isn’t about hardware vs. software, or even a new smartphone. Focusing on the Pixel would be missing the forest for the trees, confusing a short-term play for a long-term mission. Pichai acknowledged it himself: “We’re moving from a mobile-first world, to an AI-first world.”

It’s little wonder, then, that Google scheduled the event for Oct. 4. Sure, there’s the symbolism: its Assistant is jockeying against Apple’s Siri, which was publicly released exactly five years earlier. It could also be because, on this Oct. 4, Google was saying 10-4—radio language for “understood.”

Google understands. And it’s hoping the Pixel’s users will help it understand even more.
