Google Gets A Second Brain, Changing Everything About Search


call up video content with spoken commands; behind the scenes, it’s the Knowledge Graph that matches those commands with actual shows, channels, and Web content. Google Now, a Siri-like service that displays just-in-time information cards on Android phones, is also powered by the Knowledge Graph. And the applications will only multiply as more teams at Google figure out how their own products can benefit from the huge database, Thakur says. “The Knowledge Graph is a general backbone for representing knowledge, and is a service to the entire company,” he says.

The Perfect Intelligent Assistant

As everyone knows, there’s a single revenue engine that drives Google’s whole universe of activities, from smartphones to robot cars to the $300 million it puts aside every year for Google Ventures, its in-house venture firm. That engine is AdWords, the program that lets advertisers bid to place cost-per-click text ads in the right-hand column of search result pages for specific keywords. AdWords accounts for about 70 percent of Google’s overall advertising revenue, while AdSense, which places ads on partners’ sites, brings in the other 30 percent.

In a situation like this, you might think the company would be allergic to any new idea that changes the way AdWords ads are displayed. But this is exactly what the new knowledge panels do. In fact, they sometimes push the ads so far down the right column that users must scroll or click to the next page to see them.

That has some AdWords advertisers displeased. “The Knowledge Graph released on us by Google seems to take up the right top ad space,” one advertiser complained in an official Google AdWords forum. “I’m paying a lot for AdWords and I think Google should have notified advertisers that search results pages were changing.”

But the Google search engineers I talked to say they have license to improve Google’s core search technology without regard to the potential effects on advertising. “We are building our dream search engine,” says Amit Singhal. “We are guiding the company toward the best experience for our users, and we are not really paying attention to whether people will click more on ads or less on ads.”

Lord knows there’s a good argument for letting Google engineers do their thing. Long before universal search and Google Instant and the Knowledge Graph, Google had superior ranking algorithms, which is how the company stole users away from older search engines and built a user base worth monetizing.

But it’s still breathtaking to visit a company that puts so much faith in innovation. The attitude seems to be: Yes, we might break something, but if we do, we’ll fix it so that it’s even better. After all, it’s possible that the Knowledge Graph could end up improving the algorithms Google uses to rank the relevancy of ads, perhaps leading to higher click-through rates and even greater revenues. “In today’s search system we see today’s design of ads, and tomorrow you will see tomorrow’s design of ads, and we don’t fully know what that is,” says Singhal.

Google’s main goal is to get users to the information they want faster so that they can carry on with their lives, Singhal says. Even with the Knowledge Graph, that isn’t happening fast enough for his taste. “It should be as simple as me saying, ‘Google, drive me to my dinner appointment,’” he says. “Which we can now literally do [thanks to the driverless car project]. But search still has a lot of friction between you and what you need, and it is not as magical as it should be. For it to get there, we will need to solve numerous small problems.”

These “small” problems include better speech recognition, spoken-word interfaces, computer vision, natural language understanding, machine translation, and contextual awareness—all tough computer-science challenges that legions of PhDs at Google are gradually chipping away at. The results will be seen not just in the classic desktop search scenario but also in Google’s mobile interfaces, such as Google Now, which anticipates your questions based on your location (automatically retrieving a boarding pass as you approach the airport, for instance).

“What you are starting to see is the emergence of this dual paradigm,” says Singhal. “Google is becoming the perfect intelligent assistant by your side, which can not only answer when you ask but proactively tell you things that are really important.”

Eventually, Singhal says, Google’s two brains will merge back into one, as the company figures out how to marry the Knowledge Graph with its vast keyword-based index of the Web. At a high level, the task will involve matching listings in the index with entities in the graph, so that Google’s whole picture of the world becomes more entity-based. “I can easily see that future” where there’s only one search going on under the hood at Google, rather than two, Singhal says. “These systems are certainly becoming more interdependent, and as you can imagine a lot of interdependency goes forward and forms integration.”

Singhal and many of Google’s other top engineers are old enough to have grown up watching Star Trek, and you can’t talk with them for very long without hearing a mention of the ship’s computer, which was seemingly ever-present and all-knowing. In fact, when Google Now was in development it went by the code name Majel, a reference to the late actress Majel Barrett, who supplied the voice of the computer systems in Star Trek: The Next Generation and other spinoffs. This science-fiction vision still grips Google today—and Singhal sees the acquisition of Metaweb, taking Google beyond a purely statistical understanding of the world, as a critical step toward achieving it.

“We all want the Star Trek computer, and what we are building is exactly in that direction, where you can ask it anything and it will proactively tell you things,” says Singhal. “That is where this is headed. The Knowledge Graph is one of those key components that are necessary but not sufficient to build it. Likewise with speech recognition. When you put the whole package together, that is how you arrive at the future—and we at Google are far ahead in all of those spaces.”


Wade Roush is a freelance science and technology journalist and the producer and host of the podcast Soonish. Follow @soonishpodcast



10 responses to “Google Gets A Second Brain, Changing Everything About Search”

  1. Prasad says:

    To boldly go where no one has gone before… Google is doing it… without waiting for the 24th century…

  2. foutight says:

    Good guy google!

    • gesster says:

      you obviously don’t know a whole lot about google.

      i’d suggest you go to duckduckgo to see some of the evil that your precious google does

  3. John Ellis says:

    Brilliantly researched article, thanks! I had forgotten how much Google had changed. Google’s ability to buy in change marks it as different. It is by far the safest way to advent Artificial Intelligence. When I started on the internet, no one trusted credit card transactions and there weren’t search engines. That’s only 13 years ago!

    I challenge the idea that memory and data storage is necessary: with enough calculation power and enough speed, you could generate any memory string to order in any context.

    That’s difficult philosophically but it’s true.

    So one possible future of google is calculation on vast scales, possible with super-recursive algorithms or quantum computing.

  4. Sunil says:

    Great article. Google really has changed many things, but it is more comfortable for an ordinary person.

  5. Jacob Varghese says:

    Nice article. Makes in-context content creation that much more important for future online health.

  6. I was contributing to Freebase back in 2007. Thought it was pretty cool then. Went back and did some more just now. Thanks for the article that explains how this stuff is helpful.

  7. rahulanand says:

    Way to go… KG is definitely the future :)

  8. gesster says:

    ai probably isn’t even possible, how does google account for the chinese room? google is evil and terrible… run away, run fast