Like it or not, SEO has become a brave new two-algorithm world in which search marketers must optimize for both Google’s algorithm and human input, something the search engine is starting to weigh more heavily. But that’s not to say SEOs and digital marketers should shift gears completely and simply make pages for people instead of search engines. Rather, they should optimize for both, says Moz Co-Founder Rand Fishkin. Here’s why.
Over the last three years, Google has launched new algorithms to fight manipulative links and content, and has used fear and uncertainty about penalization to keep sites in line. In the process, it has erased a decade of old-school SEO practices, Fishkin said.
But it isn’t necessarily bad news from a consumer perspective. Google has also figured out searcher intent, started examining language instead of just individual words to provide better results, and learned to recognize when consumers want recent results, such as in a search for “digital marketing conferences.”
Meanwhile, Google’s search quality team has also undergone an evolution, Fishkin said. That includes the incorporation of machine learning to not only predict ad click-through rates (CTRs), but also organic results. As machine learning takes over more of Google’s algorithm, the underpinnings of the rankings change, Fishkin said.
What’s more, with a machine learning system in search – in which potential ranking factors and training data, such as what constitutes good and bad search results, are used to create a learning process and then the best-fit algorithm – it’s sometimes hard to figure out why something ranks the way it does. That’s even more pronounced with deep learning, which takes machine learning a step further in that it’s essentially an algorithm that builds its own algorithm.
What Does Deep Learning Mean For SEO?
For one thing, it means Google won’t know why something ranks the way it does or whether a variable is in the algorithm. That’s because query success metrics – such as long- to short-click ratio, user engagement across the domain and sharing/amplification rate versus other results – will be all that matter to machines.
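To make the searcher-output idea concrete, here is a minimal sketch of one such metric, the long- to short-click ratio. The 30-second dwell threshold and the function name are illustrative assumptions, not Google’s actual definitions:

```python
# Hypothetical sketch: scoring a search result by a searcher-output metric.
# The 30-second threshold separating long from short clicks is an
# illustrative assumption, not a known Google value.

def long_short_click_ratio(dwell_times, threshold=30):
    """Ratio of long clicks (dwell >= threshold seconds) to short clicks."""
    long_clicks = sum(1 for t in dwell_times if t >= threshold)
    short_clicks = sum(1 for t in dwell_times if t < threshold)
    return long_clicks / max(short_clicks, 1)

# Toy dwell times (in seconds) for six clicks on one result
dwell = [5, 120, 300, 8, 45, 600]
print(long_short_click_ratio(dwell))  # 4 long vs. 2 short clicks -> 2.0
```

A machine-learned ranker could consume a metric like this as a feature, which is why it is hard to point at any single "ranking factor" as the cause of a given position.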
“We’ll be optimizing less for ranking inputs and optimizing more for searcher outputs,” Fishkin said.
That means the near future is really about optimizing for two algorithms, Fishkin said.
“The best SEOs have always optimized for where we’re going,” he said. “Today I think we know better than ever where we’re going.”
That means finding balance between classic ranking inputs like keyword targeting, quality, and uniqueness and searcher outputs like relative CTR and short- versus long-click.
So how should search marketers do that? Here’s Fishkin’s advice.
Tip 1: Optimize More For Clicks
Search marketers should optimize the title, meta description, and URL a little for keywords and then a lot for clicks.
“If you rank number three, but have a higher than average CTR for that position, you might get moved up,” Fishkin said.
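That hypothesis can be sketched as a simple check against a positional CTR baseline. The baseline figures below are made-up illustrations, not published averages:

```python
# Hypothetical average CTR by organic position (illustrative numbers only).
BASELINE_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def beats_baseline(position, impressions, clicks):
    """True if a result's observed CTR exceeds the baseline for its position."""
    observed = clicks / impressions
    return observed > BASELINE_CTR[position]

# A result at position 3 earning a 14% CTR outperforms the 10% baseline,
# so under Fishkin's hypothesis it might get moved up.
print(beats_baseline(3, 1000, 140))  # True
```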
Because Google often tests new results briefly on page one, Fishkin said it may also be worth publishing repeatedly on a topic to earn a high CTR.
In addition, Fishkin said driving up CTR through branded searches may give an extra boost, as the percentage of people doing branded searches influences how a result ranks for non-branded searches.
Also, in a category like car insurance, brand spend on TV influences the searches consumers perform, which in turn influences those brands’ success metrics. That’s why Fishkin said he thinks Trivago will start creeping up in travel searches.
Tip 2: Compel Site Visitors To Stay Awhile
Fishkin said pogo-sticking (when users bounce between several sites because highly ranked results don’t satisfy their query) and long clicks (when users perform a search, click on a result, and remain on that site for a long period of time) together may determine where a brand ranks and for how long. SEOs should therefore create content that fulfills both the searcher’s conscious and unconscious needs, and ensure fast load times, to compel visitors to go deeper into a site.
One example is the New York Times, whose interactive graphic asked users to draw their best guess about how income predicts a child’s college chances; readers naturally spent a long time on that page drawing the graph.
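The pogo-stick versus long-click distinction described above can be sketched as a toy session classifier. The 30-second threshold and the event format are hypothetical:

```python
# Hypothetical sketch: classifying a search session as pogo-sticking
# (user bounces between results) or a long click (user settles on one).
# The 30-second dwell threshold is an illustrative assumption.

def classify_session(events, dwell_threshold=30):
    """events: list of (url, dwell_seconds) clicks within one search session."""
    if len(events) == 1 and events[0][1] >= dwell_threshold:
        return "long click"
    if len(events) > 1 and all(d < dwell_threshold for _, d in events):
        return "pogo-sticking"
    return "mixed"

print(classify_session([("nytimes.com/graph", 240)]))                # long click
print(classify_session([("a.com", 5), ("b.com", 8), ("c.com", 4)]))  # pogo-sticking
```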
Tip 3: Be As Comprehensive As Possible
Google is looking for content signals that a page will fulfill all of a searcher’s needs, and machine learning models may note that the presence of certain words, phrases, and topics predicts more successful searches.
In other words, in a search for New York City, a page that mentions each of the five boroughs may rank higher than a page that does not.
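A crude version of that comprehensiveness check, with the borough list hard-coded purely for illustration, might look like:

```python
# Illustrative comprehensiveness check for a page targeting "New York City":
# what fraction of the five boroughs does the copy mention?
BOROUGHS = {"manhattan", "brooklyn", "queens", "the bronx", "staten island"}

def borough_coverage(text):
    """Fraction of the five boroughs mentioned in the page text."""
    lower = text.lower()
    return sum(1 for b in BOROUGHS if b in lower) / len(BOROUGHS)

page = ("Visit Manhattan and Brooklyn, take the ferry to Staten Island, "
        "catch a game in Queens, and see the zoo in the Bronx.")
print(borough_coverage(page))  # 1.0 -- all five boroughs covered
```

Real topic-modeling systems work with far richer signals than keyword presence, but the intuition is the same: broader coverage of related subtopics signals a more comprehensive page.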
Tools like AlchemyAPI and MonkeyLearn can help here, Fishkin said.
Tip 4: Create Content That Inspires Loyalty
Pages that get lots of social activity and engagement, but few links, seem to overperform even for highly competitive keywords like “photos of dogs.”
Fishkin said we see this kind of behavior when a URL becomes hot on social media, which could mean Google is using other metrics that mimic social-share data, such as clickstream activity, engagement, and branded queries.
Google almost certainly classifies search results pages differently and optimizes each to different goals, Fishkin said; a medical query, for example, might be best served by a page that doesn’t have many social shares.
In addition, Google probably wants to see shares that result in loyalty and return visits, Fishkin said. Knowing what the audience and their influencers share is essential, as is knowing what makes them return or prevents them from doing so.
“We don’t need better content. We need 10X content,” Fishkin said.
Tip 5: Solve The User’s Entire Task, Not Just A Query
Google wants searchers to accomplish their tasks faster.
So if Google sees that many users who perform similar types of queries end up on the same site, it might use clickstream data to rank that site higher, even if that site doesn’t have traditional ranking signals.
A page that simply answers the initial query may not be enough because Google wants to send users to websites that resolve their mission, Fishkin said.
What do you think about optimizing for both Google’s algorithm and human input?