Will Personalization Change Search Engine Optimization?
What Is Personalization?
The most basic explanation of personalization is that it is a system by which the search engines extract patterns from previous search behavior and adjust present and future search results based on “learned” preferences. The simplest example can be found in the repeated selection of a single site when it appears in the search results. Ego drives many (present company included) to click on their own site when it shows up in the search results. Once the site is selected multiple times, it will rise in the results when the same or a similar search is run again by the same user. Google has thus learned that you like this site and is now making it easier to get to it again.
The title of this article is “Personalization and the Death of SEO.” Right away, the author, Dave Davies, sets us up for a letdown. We are going to be told, we assume, that SEO will no longer be possible once personalization is in full effect. Instead, Davies takes us down another path. Some of it makes sense, but a lot of it is mere gobbledygook masquerading as expert knowledge.
In the above paragraph Davies explains that personalization is an attempt to ascertain the patterns of searchers, which is true. However, he gives no insight here as to why that is necessary. His comment that “ego drives many to click on their own site when it shows up in the search results” is a bit shallow. Sadly, however, with personalization, anyone clicking on their own website could very well be skewing the results of the personalization algorithm. If personalization means that you are more likely to receive results for the sites you visit most often for particular queries, then it stands to reason that someone who clicks on his own site every time it appears (yes, those egotists) will continue to see his own site ranked highest every time he performs a search for an important keyword. That won’t do you much good if your intent is to check out the competition. What that means is that searchers will have to be very discriminating in their queries and click-throughs while logged in to Google’s personalization feature. That is likely to change the nature of search more than the nature of SEO.
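To make the click-boost feedback loop described above concrete, here is a toy Python sketch. This is not Google's actual algorithm; the scoring formula, function names, and data are invented purely for illustration of the self-reinforcing effect of clicking your own site.

```python
# Hypothetical sketch of click-based personalization: sites the user has
# clicked before get a boost proportional to their past click count, so
# repeated self-clicks keep pushing a site up in that user's results.
from collections import Counter

def personalize(results, click_history, boost_per_click=1.0):
    """Re-rank (url, base_score) pairs using the user's past clicks."""
    clicks = Counter(click_history)
    scored = [(url, score + boost_per_click * clicks[url])
              for url, score in results]
    # Highest adjusted score first
    return [url for url, _ in sorted(scored, key=lambda p: p[1], reverse=True)]

results = [("competitor.com", 3.0), ("mysite.com", 2.0)]
history = ["mysite.com", "mysite.com"]  # the "egotist" clicking his own site
print(personalize(results, history))  # ['mysite.com', 'competitor.com']
```

Note the trap the paragraph warns about: with this kind of loop in place, the egotist's own site overtakes the competitor he actually wanted to find.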
The number of potential factors, as with any algorithm, is virtually endless in theory. However, some key factors come up repeatedly in the patents and are sure to hold weight as personalization evolves. They are:
Your personal search history. What you look for and the sites/ads that you select will affect the results you receive when you search. Right now this seems to be primarily restricted to increasing the position of a site that is selected multiple times when it appears in a set of search engine results. As this technology evolves, however, your past behaviors and the types of sites you select in the results will surely be applied to new searches, increasing the positions of sites that have characteristics similar to ones you have selected in the past for completely different queries.
The big question here is what will happen to new sites that are well optimized and similar to those you have visited in the past? Will they appear in your search results or be discriminated against because you have not visited them before?
As we all know, the Web is constantly changing. If new sites are discriminated against because they have not been visited in the past, then that creates a whole new barrier to entry that didn’t exist before. There is no rationale for that. Bottom line: keywords and links will still rule the ranking scheme unless it is Google’s intent to devalue link popularity. I don’t see keywords ever being devalued; that would be pointless. But if links are devalued in exchange for personal preferences, then what will happen to those sites that have built their reputations on link popularity? Will they fall in the rankings if they are not as popular as other sites with higher link popularity? These are questions no guru can answer at the moment. All we can do is speculate, but it remains to be seen whether Google, which invented link popularity, will turn its back on this ranking scheme.
Your behavior on a selected site. What you do on a site and how long it takes you to return to the search engine is, or soon will be, a factor. The search engines have clearly stated that their main goal is to deliver a positive experience to their users. The more readily a searcher finds the information they are looking for in a set of results, the better the experience and, thus, the more likely that searcher is to continue to use that engine. If Google discovers that when visitors land on a site they are likely to stay for only a couple of seconds, then that site can reasonably be considered less relevant for a specific query than one whose visitors remain for a minute or two. The former site will thus lose position for the phrase and the latter will gain. All indications are that if this is the case for a single phrase, the rankings for other phrases the site ranks for will not be affected; however, I would speculate that if visitors react poorly to a site for multiple phrases, the value of the site as a whole will be reduced and the rankings will be affected globally.
This actually makes a lot of sense. If Google can analyze searchers’ behavior once they land on a site then they could use that information to rank sites for a particular key phrase. However, there is some danger in this.
What if a searcher is looking for specific information on a topic but doesn’t quite know the proper way to research it? For instance, they want more information on the symbolic meaning of black roses. Instead of typing in “symbolic meaning of black roses” or “black rose symbolism,” the searcher just searches for “black rose.” If none of the first ten search results offer the information the searcher is looking for, but one or two appear as if they might, and the searcher clicks on those links only to discover they are not the right sites, we have an interpretation problem. Google’s algorithms would interpret that to mean those sites don’t deserve to rank well for the phrase “black rose” because the visitor didn’t stick around for long, when the actual situation was that the searcher didn’t know how to conduct an effective search.
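The dwell-time signal under discussion can be sketched in a few lines. The thresholds and the three-way mapping below are pure assumptions for illustration; the point is simply that a quick bounce gets read as a vote against relevance, regardless of why the visitor bounced.

```python
# Illustrative (not Google's actual) dwell-time heuristic: a near-instant
# return to the results page penalizes a site for that query, while a long
# stay rewards it. Thresholds are invented for the example.
def dwell_adjustment(dwell_seconds, short=5, long=60):
    """Map time-on-site to a relevance adjustment for the query."""
    if dwell_seconds < short:
        return -1.0   # bounced almost immediately: read as irrelevant
    if dwell_seconds > long:
        return +1.0   # stayed a while: read as a good match
    return 0.0        # inconclusive

print(dwell_adjustment(2))    # -1.0  (the confused "black rose" searcher)
print(dwell_adjustment(120))  # 1.0
```

A rule like this cannot distinguish “the site was bad” from “the query was bad,” which is exactly the interpretation problem described above.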
Your location. Especially important for mobile search, but sure to gain importance for specific, localized phrases: your business location relative to the searcher will gain importance. A search for a phrase such as “seo services” is likely to be unaffected by such factors (unless the searcher has a past history of selecting sites from his/her own region for multiple phrases). However, if a searcher searches for “pizza victoria” and the engine is able to pick up that the searcher is from Victoria, Texas, and not Victoria, BC, the sites that promote a pizza restaurant in Victoria, Texas, will rise in the results.
Again, we have issues. What if a searcher makes a lot of local searches? Will Google return a list of local websites for every search query?
I mean, if I live in Podunk Hills, Pennsylvania, and 85 of my last 100 searches were for local businesses, then I go online and search for “seo services,” will I get a list of local SEO firms, or will Google’s personalization algorithm know that I don’t care whether the list I get is local or not? That could be a problem. Again, personalization could affect how searches are done more than how websites are optimized, simply because the searcher is responsible for ensuring they conduct a properly keyworded search for the information they are looking for, whereas SEOs are simply attempting to help Google rank their websites according to the keywords they find valuable.
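The location disambiguation in the “pizza victoria” example amounts to a simple preference sort. A toy sketch, with all names and data invented for illustration:

```python
# Hypothetical location bias: given an ambiguous place name in the query,
# stably move businesses in the searcher's own region ahead of the rest.
def localize(results, searcher_region):
    """Stable-sort so results matching the searcher's region come first."""
    return sorted(results, key=lambda r: r["region"] != searcher_region)

results = [
    {"name": "Victoria BC Pizzeria", "region": "Victoria, BC"},
    {"name": "Victoria TX Pizza Co", "region": "Victoria, TX"},
]
top = localize(results, "Victoria, TX")[0]
print(top["name"])  # Victoria TX Pizza Co
```

The open question raised above is what feeds `searcher_region`: the current query, or a history of mostly-local searches that may not apply to this one.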
The patterns of similar searchers. And now it gets even more complex. It does not appear that the search engines are yet grouping users together to find common search patterns; however, there are multiple references in their patent applications to Google looking for ways to group users together by search patterns, interests, or memberships in communities, in order to provide personalized results based on what others with similar interests have selected. For example, say I am looking for blue widgets and, after looking at a number of sites, I spend a few minutes on xyz.com, and you do the same. A couple of days later, when I am looking for green gromits, the search engine will reference your search patterns. If you have looked for green gromits in the past, the engine will use your experience (i.e., which sites you visited and for how long) to affect my rankings based on our past similar behavior. Now, when we’re dealing with just two people searching, there isn’t a lot of information to affect the rankings. But when the engines apply global rules across millions of searchers, they are able to determine which types of searchers are selecting which types of results by grouping users with similar interests/patterns together and increasing the position of those sites that the majority of the group has found most desirable.
The engines can also use memberships in communities and bookmarking similarities to establish common interests and patterns, to increase or decrease a site’s position for specific phrases, or to raise the site’s value as a whole.
This wouldn’t be personalized search. It would be more akin to communal search, or social search. I think we’re a long way from that and, quite frankly, I hope it never happens. There are surely a wide range of problems with this.
For one thing, people are individuals. The fact that we belong to different communities with similar interests doesn’t mean that we think like the majority of members in those communities. This type of search would tend to treat search queries like communal festivities and devalue searchers as individuals. It would totally defy personalization.
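For what it’s worth, the grouping mechanism Davies describes resembles what is generally called collaborative filtering. A minimal sketch, assuming overlap of visited sites as the similarity measure (the threshold and all names are invented):

```python
# Toy collaborative-filtering sketch of the "similar searchers" idea:
# users whose past clicks overlap are grouped, and sites favored by the
# group get boosted for each member.
def jaccard(a, b):
    """Overlap between two users' sets of visited sites (0.0 to 1.0)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def group_boosts(me, others, threshold=0.3):
    """Sites visited by users similar to me, weighted by similarity."""
    boosts = {}
    for other in others:
        sim = jaccard(me, other)
        if sim >= threshold:
            for site in other - me:
                boosts[site] = boosts.get(site, 0.0) + sim
    return boosts

me = {"xyz.com", "widgets.com"}
others = [{"xyz.com", "widgets.com", "gromits.com"},  # very similar searcher
          {"unrelated.com"}]                           # dissimilar searcher
print(group_boosts(me, others))  # only gromits.com gets a boost
```

Notice that my results end up shaped by the overlapping user’s choices, which is precisely the “communal search” objection: the group’s behavior overrides the individual’s.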
Your value as a visitor. A colleague of mine and a brilliant reporter on the industry, Jim Hedger, brought a point to my attention that snuck past me the first time I read it but which now jumps out as both interesting and important. An engine can (and likely will) assign users their own PageRank. What this basically translates into is the value that your vote will have when you visit a site and its effect on the overall results of the many. If Google decides that I am a lackluster searcher who seems to select sites that others with similar interests do not, then my personal PageRank will be decreased and, thus, the sites I visit will be given less of a boost than those of a user whose selected sites match those that others find favorable. That user will then receive an increase in their personal PageRank, as their voting power will be deemed higher than others’ and their decisions more reflective of the most desired results.
Now, I’m not sure why Google would place one person’s preferences over another’s. It seems this kind of maneuver would be like trying to get into the mind of the searcher from a distance. If a new website denizen makes an error in their search query, does that mean they are less important than the veteran searcher who has been doing it for 20 years? Will Google also place higher value on the individual who performs 30 or 40 searches a day, as opposed to the individual who performs only one or two? Doing so wouldn’t make any sense. But even if Google did take such an approach, how would that affect how SEOs optimize web pages?
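Mechanically, a “personal PageRank” as described amounts to weighting each user’s click-vote before tallying. A sketch under those assumptions (the weights and function are invented for illustration, not drawn from any patent):

```python
# Hypothetical weighted-vote tally: each user's click counts as a vote,
# scaled by that user's "personal PageRank" weight.
def site_score(votes):
    """Sum weighted votes for a site.

    votes -- list of (user_weight, clicked) pairs, clicked being 1 or 0
    """
    return sum(weight * clicked for weight, clicked in votes)

# A trusted searcher (weight 0.9) versus two "lackluster" ones (weight 0.2)
votes_for_site_a = [(0.9, 1), (0.2, 1)]
votes_for_site_b = [(0.2, 1), (0.2, 1)]
print(round(site_score(votes_for_site_a), 2))  # 1.1
print(round(site_score(votes_for_site_b), 2))  # 0.4
```

Two clicks from low-weight users count for less than one click from a high-weight user, which is exactly the asymmetry questioned above: the novice’s vote is discounted before anyone knows whether their judgment was actually worse.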
I don’t see personalization changing the SEO industry a great deal. Sure, it may cause SEOs to think a little more deeply about how they should optimize their web pages, but the industry insiders, as a whole, are doing that already. I think people like Davies want non-SEOs to believe that things will change so they can justify charging more for their services. But in reality, SEO has always been about keywords and links, and to some degree a few other factors. This will likely not change a great deal. If it does change, it will likely change along the following lines:
- Link popularity will become less important
- On-page factors will become more important
- Social factors may play a greater role in search engine algorithms but only insofar as they are necessary to understand how searchers seek information