In 1998, a new search engine from an obscure startup launched. It
featured a logo, a text field and buttons for search and “I’m Feeling
Lucky” — and not much else. Its name, as I don’t actually have to tell
you, was Google, and it changed the way that the world interacted with
information.
It’s tempting to think of Google search as something that hasn’t
evolved radically over the years, in part because the Google.com
homepage hasn’t changed much — the logo, search field and dual
buttons are all still there. Even the results, which now weave in
images, videos, Knowledge Graph
summaries and other elements, are still dominated by straightforward
text results of the sort that many a would-be Google rival has derided
as “ten blue links.”
Under the surface, however, Google has changed plenty, in
increasingly profound ways. The way we interact with it has also
evolved. And if the company’s ambitious plans pay off, the Google of
just a few years from now could be a new kind of search engine.
In a recent visit to Google’s Silicon Valley headquarters, I
discussed the future of Google search with Amit Singhal, a 22-year
veteran of the search field and the company’s senior vice president in
charge of search, and some of his colleagues. They didn’t clue me in on
any top-secret projects. (I didn’t even get to try on Google Glass.) But I did leave with a greater understanding of where Google thinks search should go, and the steps it’s taking to get there.
As Singhal stresses, all Google is doing is continuing a journey it’s already on. “Over
the 12 years I’ve been here, we have changed Google every two to four
years,” he says. “There have been four or five huge milestones … Google’s beauty is what hides behind that simple interface: incredibly complex mathematics.”
For search research, Singhal says, “these are supremely interesting
times.” But when he describes his ideal version of Google, it doesn’t
sound all that much like Google as we’ve known it. What he describes is
the omniscient fictional computing device from an old TV program.
“As a little child growing up in India, I watched way too much Star Trek.
That’s the vision that stuck with me,” he says, speaking of the show’s
iconic computer. “You can talk to it naturally, you can ask it whatever
you need to. It fades into the background. It’s just there for you.”
Singhal isn’t the only Google staffer who likes to bring up Star Trek
when talking about the future of search. It’s the company’s romantic
ideal of what it should be aiming for, and even in the 21st century, it
still sounds futuristic.
Then again, the first rough draft of Google’s Star Trek-like vision of conversational, anticipatory search already exists. In fact, you might already be using it.
It’s Google’s search apps for Android and iOS, which include Google Now, a feature Google introduced in 2012 with Android 4.1 Jelly Bean and brought to Apple’s iOS this April.
The apps let you pull up information by talking to your phone, and
thanks to Google Now, they also figure out what you might want to know,
sometimes without you even explicitly asking for it. As Singhal talked
about the future of search, he kept referring to the current mobile app
and demoing it for me — with good reason.
The Art of Conversation
Like Apple’s Siri, Google’s mobile search uses voice recognition,
natural-language technology and voice synthesis to create an interface
that’s spoken as well as visual. It’s the start of the conversational Star Trek interface, and it’s critical to Google, particularly as it puts its services on new types of mobile gadgets.
“If you think about something like Google Glass, there you have to rely
more on the vocals because you just can’t show much,” says Scott
Huffman, the company’s head of mobile search. “You’re not going to use
that device to research your science project, and even for the weather
what you can show is pretty minimal.”
With conversations between two or more flesh-and-blood people, the
parties involved convey information to each other in an ongoing sequence
that’s rich in context. Every new statement reflects the discussion up
until that point. But search engines have never worked that way.
Historically, as Huffman says, “Google assumes that every single thing I
type stands alone.”
That’s already begun to change, whether you’ve noticed it or not. I
hadn’t, to be honest, until Singhal demonstrated it to me by talking to
his Android phone:
“How tall is Justin Bieber?”
“Justin Bieber is five feet seven inches tall.”
“How old is he?”
“Justin Bieber is 19 years old.”
Google understands that the subject of the first query is “Justin
Bieber,” and correctly assumes that Bieber is also the “he” in the
second query. If it were human, you’d expect nothing less. But for a
search engine, it’s a genuine back-and-forth discussion. “Imagine this
taken to the limit,” Singhal says. “You’re having a natural conversation
with the machine.”
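To make the mechanics concrete, here is a toy sketch in Python of that kind of pronoun carry-over. It is my own illustration under simple assumptions (a single remembered topic entity and a fixed pronoun list), not Google's actual implementation:

```python
# Toy sketch of conversational query resolution, not Google's code:
# remember the entity from the previous query and substitute it for
# pronouns in follow-up queries.
PRONOUNS = {"he", "she", "him", "her", "his", "hers", "it", "its"}

def extract_entity(query: str):
    """Naive entity detection: the longest run of capitalized words."""
    words = query.rstrip("?").split()
    runs, current = [], []
    for w in words:
        if w[:1].isupper():
            current.append(w)
        else:
            if current:
                runs.append(current)
            current = []
    if current:
        runs.append(current)
    longest = max(runs, key=len) if runs else []
    return " ".join(longest) or None

class Conversation:
    def __init__(self):
        self.topic = None  # the entity the conversation is "about"

    def resolve(self, query: str) -> str:
        words = query.rstrip("?").split()
        if self.topic and any(w.lower() in PRONOUNS for w in words):
            # Replace each pronoun with the remembered topic entity.
            words = [self.topic if w.lower() in PRONOUNS else w
                     for w in words]
        else:
            self.topic = extract_entity(query) or self.topic
        return " ".join(words) + "?"

convo = Conversation()
print(convo.resolve("How tall is Justin Bieber?"))  # stays as-is
print(convo.resolve("How old is he?"))  # "How old is Justin Bieber?"
```

Real coreference resolution is, of course, vastly harder than string substitution; the point is only that the system has to carry state from one query to the next.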
Even the simple
fact that Google can tell that Justin Bieber is the subject of a query
shows that the search engine has a more sophisticated understanding of
the world than it once did. If you’d used an early version of Google to
search for “Ricky Martin,” it would have provided you with a list of
results containing his name. But all Google would have known about
“Ricky Martin” was that it was a sequence of 12 characters that showed
up on an awful lot of web pages.
Thanks to the
Knowledge Graph, which Google announced a year ago, the search engine
has a much better handle on just what “Justin Bieber” is. Search for Bieber’s name at Google.com, and the right-hand side of the results page is dedicated to a summary box that contains the following:
- A brief biography
- His birth date and place of birth
- His height
- Movies and TV shows he’s appeared in
- His parents
- His upcoming concerts
- His songs and albums
- Other things people who search for “Justin Bieber” also search for: Selena Gomez, Rihanna, Taylor Swift
But
Google couldn’t create the summary unless it knew that Justin Bieber
was a person with a height and age, among other attributes. And that
understanding is why Google Now is able to hold up its end of a
rudimentary conversation about him. “Our Knowledge Graph, from a back-end perspective, is as big a change to search as the original algorithms and PageRank [Google’s original ranking algorithm] are,” says Singhal.
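In data-structure terms, you can think of each Knowledge Graph entry as a typed record of attributes. The sketch below is a loose Python illustration; the field names and helper are invented for this example, not the Knowledge Graph's actual schema:

```python
# Loose illustration of an entity record; field names are invented
# for this example, not the Knowledge Graph's actual schema.
from dataclasses import dataclass, field

@dataclass
class Entity:
    name: str
    kind: str  # e.g. "person", "place", "thing"
    attributes: dict = field(default_factory=dict)

bieber = Entity(
    name="Justin Bieber",
    kind="person",
    attributes={
        "height": "5 ft 7 in",
        "born": "March 1, 1994",
        "occupation": "musician",
    },
)

def answer(entity: Entity, attribute: str) -> str:
    """Answer a factual question if the entity carries that attribute."""
    value = entity.attributes.get(attribute)
    if value is None:
        return f"I don't know the {attribute} of {entity.name}."
    return f"{entity.name}'s {attribute} is {value}."

print(answer(bieber, "height"))  # Justin Bieber's height is 5 ft 7 in.
```

Once “Justin Bieber” is a record with typed attributes rather than a 13-character string, a question like “How tall is he?” becomes a lookup instead of a keyword match.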
To Google, Justin Bieber is an entity — one of millions it knows about, including people, places and things. “Entities are pretty fundamental,” says Emily Moxley, lead product manager for the Knowledge Graph. “The better we understand them, the better we understand search queries that come in, and the better results we can give to you.”
As
Singhal showed me, the summary for astronaut Eileen Collins doesn’t say
how tall she is; nobody cares about that. But it does specify the time
she’s spent in space. The
more you peruse those summary boxes, the clearer it is that the
Knowledge Graph helps Google understand not only what entities are, but
what aspects of them matter. “For an amusement ride,” explains Moxley,
“the interesting things are how many flips it has and the height
requirement and how many feet it drops.”
Even for people within the same profession, the Knowledge Graph
emphasizes different facts. Search for “Barack Obama,” and it doesn’t
bother to tell you his political party. It assumes you already know. For
“Michael Bloomberg,” however, it specifies his affiliation.
If the Knowledge Graph were compiled by human editors, you’d just assume that they were making judgment calls about which facts to stress. But “it’s all algorithms saying, ‘this is what the world searches for’,” says Singhal. Real people ask about Bieber’s height, but not about his time in space, because he hasn’t spent any. They know the president’s party, but are understandably unsure of serial party-switcher Bloomberg’s. As they request specific pieces of information, the Knowledge Graph is paying attention.
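The feedback loop Singhal describes, in which aggregate queries decide which facts get surfaced, can be caricatured in a few lines of Python. The query log below is invented for illustration; the real system is vastly more sophisticated:

```python
# Crude sketch of attribute salience driven by query logs: surface the
# facts the world actually asks about for each entity. The log below
# is invented for illustration.
from collections import Counter

query_log = [
    ("Justin Bieber", "height"),
    ("Justin Bieber", "height"),
    ("Justin Bieber", "age"),
    ("Eileen Collins", "time in space"),
    ("Michael Bloomberg", "political party"),
]

def salient_attributes(entity: str, top_n: int = 2):
    """Rank an entity's attributes by how often people ask about them."""
    counts = Counter(attr for ent, attr in query_log if ent == entity)
    return [attr for attr, _ in counts.most_common(top_n)]

print(salient_attributes("Justin Bieber"))   # ['height', 'age']
print(salient_attributes("Eileen Collins"))  # ['time in space']
```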
Today, Google understands a question which would have flummoxed it a few months ago: “How
far is it from San Luis Obispo to here?” Someday, Huffman says, it may
be able to answer ones such as “How long has Tim Burton been friends
with Johnny Depp?” It may even be able to do something with a follow-up
query such as “Why does he always cast him in his movies?” — even though
there may be no single definitive answer to that one.
Still, there’s plenty of information in the world that no algorithm can grind down into a concise, canonical form. Huffman points out that “Did Natalie Portman do her own dancing in Black Swan?” is one such question. “You might think Google should say yes or no, but really there are
huge arguments between Natalie Portman and her body double.” If human
beings can’t untangle them, Google probably can’t either.
Very, Very Personal Search
The Knowledge Graph is
so powerful because it helps Google understand what the world cares
about. That’s only part of the future-search puzzle, though. The other part is at least as important, and more controversial: What does every
individual Google user care about?
For years, even
conventional Google search results have reflected what Google knows
about you, based on past searches and other clues. For instance, when
I’m logged in to Google, I’m likely to see TIME.com results pop up more
frequently than you do, simply because I visit the site so often. And
searches such as “Whole Foods Market” and “Burmese Food” provide results
which are at least vaguely tailored to Google’s best guess about my
physical whereabouts.
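That kind of tailoring can be pictured as a re-ranking step layered on top of ordinary relevance scores. Here is a toy Python version; the scores, weights and visit counts are all invented:

```python
# Toy sketch of personalized re-ranking: nudge up results from sites
# this user visits often. All scores and weights are invented.
visit_counts = {"time.com": 120, "example.com": 2}

results = [  # (url, base relevance score from ordinary ranking)
    ("example.com/whole-foods", 0.82),
    ("time.com/whole-foods", 0.80),
]

def personalize(results, visits, weight=0.001):
    """Add a small boost proportional to the user's visit history."""
    rescored = [
        (url, score + weight * visits.get(url.split("/")[0], 0))
        for url, score in results
    ]
    return sorted(rescored, key=lambda r: r[1], reverse=True)

print(personalize(results, visit_counts))
# time.com now edges out example.com for this particular user.
```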
But Google Now, and the Star Trek computer approach to search in general, aim to be far more personal than classic Google Search. Already,
Now uses your phone’s GPS to keep tabs on your location — not just
where you are at the moment, but where you’ve been — so it knows where
you live and where you work. It scans your Google Calendar and Google
Contacts to help it figure out what you’re doing and who you’re doing it
with. It peeks in your Gmail to find items such as tracking information
for packages on their way to you. And it checks your search history to
deduce stuff like which sports teams and stock quotes you follow. Then
it displays cards showing information you might be interested in.
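Stripped to its skeleton, the pattern is context signals in, cards out, with no query anywhere. The sketch below is my own simplification; the signal names and the single rule are invented:

```python
# Bare-bones sketch of an anticipatory, Google Now-style card: context
# signals go in, cards come out, with no explicit query. Signal names
# and the single rule here are invented for illustration.
from datetime import datetime, timedelta

context = {
    "next_event": {
        "what": "Dinner reservation",
        "when": datetime(2013, 6, 14, 19, 0),
    },
    "travel_time_min": 40,  # estimated drive from current location
}

def anticipate(ctx: dict, now: datetime) -> list:
    """Emit a 'time to leave' card if the next event is getting close."""
    cards = []
    event = ctx.get("next_event")
    if event:
        leave_by = event["when"] - timedelta(minutes=ctx["travel_time_min"])
        if now >= leave_by - timedelta(minutes=15):
            cards.append(
                f"Leave by {leave_by:%H:%M} to make '{event['what']}' "
                f"({ctx['travel_time_min']} min drive)."
            )
    return cards

print(anticipate(context, datetime(2013, 6, 14, 18, 10)))
# ["Leave by 18:20 to make 'Dinner reservation' (40 min drive)."]
```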
In other words, Google
Now provides a form of search which isn’t search at all, strictly
speaking. It’s anticipatory rather than reactive, and the only reason it
works at all is that it’s so plugged into your life.
It’s not a given that this proposition will appeal to everyone. Much of what Google does doesn’t, particularly when private information is involved: Gmail alarmed a fair number of people when it debuted in 2004, Google Glass is already doing so, and so is Google Now.
In 2010, Google chairman Eric Schmidt — who has a knack for phrasing things in ways that can be unsettling to mere mortals — was referring to Google Now-style anticipatory search when he told the Wall Street Journal
that “I actually think most people don’t want Google to answer their
questions. They want Google to tell them what they should be doing
next.” You don’t have to be a Google cynic to instinctively bristle at
that notion.
When
Singhal talks about the same scenario, it doesn’t sound as creepy — at
least to me. “We’re excited about context becoming the query,” he says,
referring to Google knowing the details of your doings well enough to
proactively tell you stuff it thinks you should know. “The value I get out of Google Now is tremendous. Thus, I’m comfortable with it. It’s
a magical moment when I book a meal with someone, and it figures out
how long it’ll take to get there and gives me a warning when I’m at
work. It lets me lead a better life, in a sense, because I don’t come
off as someone who’s late all the time.”
Even people who relish the idea of Google utilizing their personal information in such a manner might fret about a related privacy issue:
How will the company use its increasingly deep trove of data on users to
make money? All the context Google wants to collect to improve search —
where people are, what they’re doing and who they’re doing it with,
among other data points — will leave advertisers, who pay Google’s
bills, salivating.
When I asked Singhal about the financial side of future Google search
services, he said that the subject doesn’t come up in his world. “We in
our group do not care about monetization,” he says. “There’s a big
Chinese Wall between search and monetization. I thank [Google's
monetization team] every day for paying my salary.”
Time will tell how consumers will react to Google’s Star Trek
computer vision — whether they encounter it on a computer, a phone, a
tablet, Google Glass or some gizmo yet to be invented. And it won’t be
that long until they have a chance to form opinions about new features
which are, at the moment, still undisclosed projects in Google research
labs.
The goals which Singhal and his coworkers shared with me won’t result
in one magical Google upgrade which appears all at once; instead,
little bits and pieces of it will arrive as they’re ready. But he sounds
confident that even the most challenging parts of the dream aren’t just
a dream: “Some of the things we’re working on will not come out for a
year or two,” he says. “Some of it, I don’t see how it’ll come out for
3-5 years.”
“If you just peel back on what search should look like, the answers
are already there. It’s just a question of how we’re going to get
there.”