How can Google search be beaten? Google’s edge is that they do search better than other companies, i.e., they have access to knowledge about search that those other companies don’t, in part because they place a high premium on developing such knowledge in-house.
What happens if Google’s understanding of search starts to saturate, and further research produces only small gains in user experience? The knowledge gap to their competitors will start to close. Other companies will be able to replicate the search experience Google offers. The advantage will then shift to whichever company can manage the operations side of search (e.g., maintaining large teams, large data centers and so on) better. Google’s culture – all those clever people improving search – will then become a liability, not an asset.
This is the classic path to commoditization. A new industry opens up. In the early days, the race goes to those who develop know-how quickly, providing an edge in service. As know-how saturates, everyone can provide the same service, and the edge moves to whoever can manage operations best. The old innovators are actually at a disadvantage at this point, since they have a culture strongly invested in innovation.
In Google’s case, there’s another interesting possibility. Maybe search just keeps getting better and better. It’s certainly a rich enough problem that this may well be possible. But if our knowledge of search ever starts to saturate, Google may find itself needing another source of support for its major business (advertising).
Google is very secretive about their data centers, but from what I know, they are very good at the operations side of things, at least from a technical standpoint. Their one weakness is they’re probably not that good at controlling labor costs, since they’ve so far had no need to do so (they’re swimming in cash), and their hiring model has been to go after very smart (and presumably expensive) people.
I tend to think search will just keep getting better. Ideally, we could use natural language queries, making them as complicated as we want, and get highly accurate results. Actually implementing this is probably AI-complete.
In other words, Google will keep getting better until it becomes sentient, at which point all bets are off.
Google has a secret weapon: computational power. I have heard that their computing clusters are truly massive (reportedly the second largest collection of computational power in the U.S., and probably the world). Anybody who wants to match their performance is either going to have to come up with genuinely better algorithms, or invest a massive amount of money which I think will be virtually impossible to recoup with a search engine, since they will almost certainly lose any competition with Google.
Hi Peter,
About 5 x 10^5 moderately powerful machines is the number I’ve frequently heard bandied about. I don’t know what the source of that number is, since Google appears not to release server numbers any more.
Half a million machines certainly requires a large capital investment, although it’s one that could already be matched by many companies. However, if we assume Moore’s law continues to hold, that server cluster will be matched by a few thousand machines within a decade. That’s a modest capital investment that could easily be made by a large number of companies.
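As a quick sanity check on that claim, here’s a back-of-envelope sketch. The 18-month doubling time and the half-million-server figure are assumptions taken from the discussion above, not hard numbers:

```python
# Back-of-envelope Moore's-law projection (a sketch; the doubling time
# and server count are assumed, not official figures).
servers_today = 5e5      # rumored size of the cluster
doubling_months = 18     # assumed per-machine performance doubling time
years = 10

doublings = years * 12 / doubling_months       # doublings over the decade
speedup = 2 ** doublings                       # per-machine speedup factor
equivalent_servers = servers_today / speedup   # machines needed in a decade

print(f"{doublings:.1f} doublings -> {speedup:.0f}x per machine -> "
      f"{equivalent_servers:.0f} machines")
```

With these assumptions you get roughly a hundredfold per-machine speedup, so today’s half-million-machine cluster shrinks to a few thousand machines, consistent with the estimate above. A slower 24-month doubling time would give closer to fifteen thousand machines, so the conclusion is robust to the exact doubling period.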
Hi Michael,
The real question, then, is whether Google uses the improved computing power available in the future to improve their search engine, or just rests on its laurels (I think there’s still quite a lot of room for improvement). If they rest on their laurels, they’re probably dead (and they may deserve to be), but my best guess is that this is not going to happen.
Microsoft certainly has both the money and the motive to take on Google right now, and I wouldn’t be surprised if they try. But as for search engines becoming commodified, I don’t think that will ever happen. As long as improved computational power improves search results, it’s a natural monopoly.
Hi Peter,
I’ll restate a part of my post that seems especially relevant to your last comment:
“What happens if Google’s understanding of search starts to saturate, and further research produces only small gains in user experience? The knowledge gap to their competitors will start to close. Other companies will be able to replicate the search experience Google offers. The advantage will then shift to whichever company can manage the operations side of search (e.g., maintaining large teams, large data centers and so on) better. Google’s culture – all those clever people improving search – will then become a liability, not an asset.”
Of course, this isn’t inconsistent with what you say. Where we perhaps differ is that I think this type of saturation is somewhat likely to happen. Not because of anything fundamental, but just because improving search may turn out to be harder than we think, and I wouldn’t be surprised if there are periods lasting years where search quality improves only marginally. During those times Google will be vulnerable to competitors who are as good as or better than them at operations, and don’t have the overhead that comes from Google’s heavy focus on innovation. Microsoft doesn’t fit this description, but other companies may arise that do.
The flipside to this is that I really hope Google can use its market cap to quickly blow past a lot of these barriers, in which case my comments will be irrelevant.
As I said earlier, I do think search will keep getting better, but I’m willing to believe that there will be “pauses” of a few years or so along the way. I would still put my money on Google, but let’s think of who might be able to challenge them.
If I were Google, I would be most afraid of telecom companies like AT&T. AT&T clearly has the money and networking expertise to go head-to-head with Google on the computational side of things. If the quality of search becomes comparable, then other factors, such as response speed, become critical. AT&T has two key advantages here:
1) They can make their own search engine faster than Google, by putting hardware closer to the customers.
2) They can make Google search slower by throttling Google’s traffic over their pipes.
To elaborate on (1), AT&T would put a box in each telecom central office (i.e., the first hop from every DSL customer) that would–at a minimum–handle the most common search queries. No matter how many data centers Google builds, they can never get closer to the customers than that.
Why would AT&T want to do this? Think of the tremendous amount of data on and control over user viewing habits they would get by controlling search. Some ISPs are already selling ads targeted based on customer viewing data (i.e., if you’ve been surfing to edmunds.com, they’ll show you a lot of car ads).
Google’s response to this is to buy a lot of their own fiber and build a “shadow network” that could bypass most of the roadblocks AT&T could throw up (at least for non-AT&T customers). Google can also try caching common search query results locally on user computers (think Google Toolbar on steroids), although this is potentially a big drain on end-user computer resources.
Hi Travis – This is exactly what the current battle over net neutrality is about. The telcos would need to win their battle against net neutrality utterly for this to become possible. I just can’t see it happening. In fact, it’s exactly this kind of outcome that is motivating people (myself included) into supporting net neutrality.