Google Algorithm Update History
https://linkgraph.io/blog/google-algorithm-update-history/
Fri, 21 Oct 2022

Learn how Google's algorithm has developed over time, what drove changes, and what it means for search and your own web content.
Intro

The Google algorithm is constantly changing. In 2018 alone, Google ran 15,096 live traffic experiments and launched 3,234 updates to its search algorithm.

 

[Image: three variations of Google search result layouts being tested with users.]


Not all updates have a significant impact on the search results. This page covers the top 150 updates to how search results function, from 2000 onward. Updates are a blend of changes to:

 

  • Algorithms
  • Indexation
  • Data (aka Data Refreshes)
  • Google Search UIs
  • Webmaster Tools
  • Changes to ranking factors and signals

Before we get into the timeline of individual Google updates, it’s helpful to define a handful of things upfront for any SEO newbies out there:

Google’s Core Algorithm

SEO experts, writers, and audiences will often refer to “Google’s Core Algorithm” as though it is a single item. In reality, Google’s Core Algorithm is made up of millions of smaller algorithms that all work together to surface the best possible search results to users. What we mean when we say “Google’s Core Algorithm” is the set of algorithms that are applied to every single search, which are no longer considered experimental, and which are stable enough to run consistently without requiring significant changes.

Google Panda (2011-2016)

The Panda algorithm focused on removing low quality content from search by reviewing on-page content itself. This algorithm focused on thin content, content dominated by ads, poor quality content (spelling/grammar mistakes), and rewarded unique content. Google Panda was updated 29 times before finally being incorporated into the core algorithm in January of 2016.

Google Penguin (2012-2016)

The Penguin algorithm focused on removing sites engaging in spammy tactics from the search results. Penguin primarily filtered sites engaging in keyword stuffing and link schemes out of the search results. Google Penguin was updated 10 times before being integrated into Google’s core algorithm in September of 2016.

RankBrain (2015-Present)

This machine-learning based AI helps Google process and understand the meaning behind new search queries. RankBrain works by being able to infer the meaning of new words or terms based on context and related terms. RankBrain began rolling out across all of Google search in early 2015 and was fully live and global by mid-2016. Within three months of full deployment RankBrain was already the 3rd most important signal contributing to the results selected for a search query.

Matt Cutts

One of the first 100 employees at Google, Matt Cutts was the head of Google’s Web Spam team for many years and interacted heavily with the webmaster community. He spent a lot of time answering questions about algorithm changes and providing webmasters high-level advice and direction.

Danny Sullivan

Originally a Founding Editor, Advisor, and Writer for Search Engine Land (among others), Danny Sullivan now communicates with the SEO community as Google’s Public Search Liaison. Mr. Sullivan frequently finds himself reminding the community that the best way to rank is to create quality content that provides value to users.

Gary Illyes

Google Webmaster Trends Analyst who often responds to the SEO community when they have questions about Google algorithm updates and changes. Gary is known for his candid (and entertaining) responses, which usually have a heavy element of sarcasm.

Webmaster World

Frequently referenced whenever people speak about Google algorithm updates, webmasterworld.com is one of the most popular forums for webmasters to discuss changes to Google’s search results. A popular community since the early 2000s, it still sees webmasters flock in to discuss theories whenever major fluctuations are noticed.


 

2021 Google Search Updates

2021 December – Local Search Update

From November 30th – December 8th, Google runs a local search ranking update. This update rebalances the various factors used to generate local results. Primary ranking factors for local search remain the same: Relevance, Distance, and Prominence. 
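Google does not publish how these three factors are weighted, but the interaction is easy to picture. The toy scorer below is purely illustrative: the weights, the 0-to-1 scoring scale, and the distance decay are all invented for this sketch. It shows how a "rebalancing" update can reorder local results without any business changing anything:

```python
# Purely illustrative: Google's real local ranking formula is not public.
# The weights and the distance decay below are invented for this sketch.
def local_score(relevance, distance_km, prominence,
                weights=(0.4, 0.3, 0.3)):
    """Combine the three local factors into one comparable score.

    relevance and prominence are assumed normalized to 0..1; distance is
    converted to a 0..1 proximity score with a simple invented decay.
    """
    proximity = 1 / (1 + distance_km)  # closer -> higher
    w_rel, w_dist, w_prom = weights
    return w_rel * relevance + w_dist * proximity + w_prom * prominence

# In this sketch, a "rebalancing" update is just a change of weights:
business = dict(relevance=0.9, distance_km=4.0, prominence=0.5)
before = local_score(**business, weights=(0.4, 0.3, 0.3))
after = local_score(**business, weights=(0.3, 0.3, 0.4))  # prominence up
```

With the hypothetical weights shifted toward prominence, the same business scores differently, which is the kind of reshuffling webmasters observed.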

Additional Reading:

2021 November – Core Quality Update

From November 17th – November 30th, Google rolls out another core update. As with all core updates, this one is focused on improving the quality and relevance of search results. 

Additional Reading:

2021 August – Title Tag Update

Starting August 16th, Google starts rewriting page titles in the SERPs. After many SEOs saw negative results from the update, Google rolls back some of the changes in September. Google emphasizes that it still uses the text of the <title> tag over 80% of the time.

Additional Reading:

2021 July – Link Spam Update

Google updates its link spam fighting algorithm to improve the effectiveness of identifying and nullifying link spam. The update is particularly focused on affiliate sites and websites that monetize through links.

Additional Reading:

2021 June – Page Experience Update

Google announced in late 2020 that its upcoming 2021 Page Experience update would introduce core web vitals as new Google ranking factors. Core web vitals are a set of user experience criteria that include page load times, mobile responsiveness, visual responsiveness, and more. Google evaluates these metrics through the following criteria:

  1. Largest Contentful Paint (LCP) – The time it takes a web page to load the largest piece of content on the page
  2. First Input Delay (FID) – The time from a user’s first interaction with the page to when the browser can respond to that interaction
  3. Cumulative Layout Shift (CLS) – A measure of visual stability: how much page content unexpectedly shifts while the page loads

This update makes it so Google will evaluate page experience signals like mobile friendliness, safe browsing, HTTPS security, and intrusive interstitial guidelines when ranking web pages.
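Google publishes "good / needs improvement / poor" thresholds for each metric: 2.5s and 4s for LCP, 100ms and 300ms for FID, 0.1 and 0.25 for CLS. A minimal Python sketch of that bucketing, useful when post-processing field-data exports:

```python
# Thresholds as published for the Core Web Vitals metrics:
# (upper bound of "good", upper bound of "needs improvement") per metric.
THRESHOLDS = {
    "LCP": (2500, 4000),   # milliseconds
    "FID": (100, 300),     # milliseconds
    "CLS": (0.1, 0.25),    # unitless layout-shift score
}

def rate(metric, value):
    """Bucket a Core Web Vitals measurement the way Google reports it."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"
```

For example, `rate("LCP", 2400)` falls in the "good" bucket, while an LCP above 4000ms is rated "poor".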

Additional Reading:

2021 February – Passage Ranking

Google introduces Passage Ranking and starts indexing passages of web content. Google now hones in on specific passages of long-form content and ranks those passages in the SERPs. Google highlights the relevant passage and takes users directly to it after they click the blue link result.

Additional Reading:

2020 Google Search Updates

2020 October – Indexing Bugs

From early September to the beginning of October, Google experienced multiple bugs affecting mobile indexing, canonicalization, news indexing, the Top Stories carousel, and sports scores. The bugs impacted about 0.02% of searches. Google fully resolved all impacted URLs by October 9th.

Additional Reading:

2020 August 11 – Google Glitch

On Tuesday, August 11th, Google experienced a massive, worldwide indexing glitch that impacted search results. Search results were very low-quality or irrelevant to search queries, and ecommerce sites in particular reported significant impacts on rankings. Google resolved the glitch within a few days.

Additional Reading:

2020 June – Google Bug Fix

A Google representative confirmed an indexing bug temporarily impacted rankings. Google was struggling to surface fresh content.

Additional Reading:

2020 May – Core Quality Update

This May 2020 core update was one of the more significant broad core updates, with the introduction of core web vitals and an increased emphasis on E-A-T. This update continued an effort to improve the quality of SERP results for COVID-related searches. The update most significantly impacted sites with low-quality or unnatural links. However, some sites with lower domain authority did appear to see positive ranking improvements for pages with high-quality, relevant content.

Many SEOs reacted negatively, particularly because of the timing of the update, which occurred at the height of economic shutdowns to slow the spread of coronavirus. Concerns about the May 2020 core quality update ranged from social media domination of the SERPs to better results for larger, more dominant brands like Amazon and Etsy. Some analysis noted these changes may have reflected user intent during quarantine, particularly because the update focused on providing better results for queries with multiple search intents. Google responded to the complaints by pointing back to its existing content-quality signals.

Additional Reading:

2020 March – COVID-19 Pandemic

Although not an official update, the coronavirus outbreak led to an unprecedented level of search queries that temporarily changed the landscape of search results. Google made several changes to adjust to the trending searches such as:

  • Increased user personalization to combat misinformation
  • Removed COVID-19 misinformation across YouTube and other platforms
  • Added “Sticky Menu” for COVID related searches
  • Added temporary business closures to the Map Pack
  • Temporarily banned ads for respirators and medical masks
  • Created COVID-19 Community Mobility Reports
  • Temporarily limited certain Google My Business listing features

Additional Reading:

2020 February 7 – Unannounced Update

In February of 2020, many SEOs reported seeing significant changes to rankings, although Google had not announced an update and denied that any broad core update took place. Various analyses of the update showed no clear pattern among the websites that were impacted.

Additional Reading:

2020 January 22 – Featured Snippet De-duplication

Prior to this January 2020 update, those sites that earned the featured snippet, or “position zero,” also appeared as the subsequent organic search result. This update de-duplicated search results to eliminate this double exposure. This impacted 100% of searches worldwide and had significant impacts on rank tracking and organic CTR.
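Conceptually, the change simply removed the organic listing whose URL matched the featured snippet. A small sketch of that de-duplication (the URLs and data structures here are invented for illustration):

```python
def deduplicate(featured_url, organic_results):
    """Drop the organic result that duplicates the featured snippet.

    Before January 2020, the page in "position zero" also appeared in the
    regular organic list; after the update it appears only once.
    """
    return [url for url in organic_results if url != featured_url]

serp = ["a.com/guide", "b.com/post", "c.com/faq"]
# a.com/guide earned the featured snippet, so it leaves the organic list:
organic = deduplicate("a.com/guide", serp)
```

This is why rank trackers broke: a site that was "position 0 plus position 1" suddenly occupied a single slot.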

Additional Reading:

2020 January – Broad Core Update

On January 13th, 2020, Google started rolling out another broad core update. Google did not provide details about the update, but did emphasize existing webmaster guidelines about content quality.

Additional Reading:

2019 Google Search Updates

2019 November – Local Search Update

In November of 2019 Google rolled out an update to how local search results are formulated (ex: map pack results). This update improved Google’s understanding of the context of a search, by improving its understanding of synonyms. In essence, local businesses may find they are showing up in more searches.

 

2019 October 26 – BERT

In October, Google introduced BERT, a deep-learning algorithm focused on helping Google understand the intent behind search queries. BERT (Bidirectional Encoder Representations from Transformers) gives context to each word within a search query. The “bidirectional” in BERT refers to how the algorithm looks at the words that come before and after each term before assessing the meaning of the term itself.

Here’s an example of bi-directional context from Google’s Blog:

In the sentence “I accessed the bank account,” a unidirectional contextual model would represent “bank” based on “I accessed the” but not “account.” However, BERT represents “bank” using both its previous and next context — “I accessed the… account” — starting from the very bottom of a deep neural network, making it deeply bidirectional.

The introduction of BERT marked the most significant change to Google search in half a decade, impacting roughly 1 in 10 search queries.
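The bank-account example above can be made concrete with plain lists. This sketch is not BERT itself (which learns dense vector representations inside a deep neural network); it only illustrates which neighboring words each kind of model is allowed to look at:

```python
# Illustration only: real BERT builds learned vector representations.
# This just shows which context words each model type can "see".
def unidirectional_context(tokens, i):
    """A left-to-right model sees only the words before position i."""
    return tokens[:i]

def bidirectional_context(tokens, i):
    """A BERT-style model sees the words on both sides of position i."""
    return tokens[:i] + tokens[i + 1:]

tokens = "I accessed the bank account".split()
i = tokens.index("bank")
left_only = unidirectional_context(tokens, i)   # no "account": river bank?
both_sides = bidirectional_context(tokens, i)   # "account" disambiguates
```

The left-to-right model never sees "account" when representing "bank", which is exactly the ambiguity the bidirectional approach resolves.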

Additional Reading:

2019 September – Entity Ratings & Rich Results

If you place reviews on your own site (even through a third party widget), and use schema markup on those reviews – the review stars will no longer show up in the Google results. Google applied this change to entities considered to be Local Businesses or Organizations.

The reasoning? Google considers these types of reviews to be self-serving. The logic is that if a site is placing a third party review widget on their own domain, they probably have some control over the reviews or review process.

Our recommendation? If you’re a local business or organization, claim your Google My Business listing and focus on encouraging users to leave reviews with Google directly.
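For context, this is the general shape of the markup affected: a LocalBusiness entity carrying its own aggregateRating, built here as a Python dict for readability. The business name and numbers are made up; the `@type` and property names are standard schema.org vocabulary. After this update, Google still parses markup like this but no longer renders review stars for it:

```python
import json

# Example of "self-serving" review markup: a LocalBusiness describing its
# own rating. The name and numbers are invented for illustration.
self_serving_markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Bakery",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.8",
        "reviewCount": "127",
    },
}

# This JSON-LD would be embedded in a <script type="application/ld+json">
# tag; after September 2019 it no longer earns stars in the SERP.
json_ld = json.dumps(self_serving_markup, indent=2)
```

The same aggregateRating markup on a third-party review site (rather than the business's own domain) was not affected.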

Additional Reading:

2019 September – Broad Core Update

This update included two components.

First, it hit sites exploiting a 301 redirect trick with expired domains. In this trick, site owners would either buy expired domains with good SEO metrics and redirect the entire domain to their site, or pay a third party to redirect a portion of pages from an expired site to their domain. (Note: sites with relevant 301 redirects from expired sites were still fine.)

Second, video content appears to have gotten a boost from this update. June’s update brought an increase in video carousels in the SERPs. Now in September, we’re seeing video content bumping down organic pages that previously ranked above them.

 

We can see this at an even greater scale by comparing a purely video site against a purely text site: YouTube and Wikipedia. For the first time, YouTube eclipsed Wikipedia in the Google search results.

 

Additional Reading:

2019 June – Broad Core Update

This was the first time Google pre-announced an update. Danny Sullivan, Google’s Search Liaison, stated that they chose to pre-announce the changes so webmasters would not be left “scratching their heads” about what was happening this time.

What happened?

  • We saw an increase in video carousels in the SERPs
  • Low quality news sites saw losses

What can sites do to respond to this broad core update? It looks like Google is leaning into video content, at least in the short-term. Consider including video as one of the types of content your team creates.

Additional Reading:

2019 May 22-26 – Indexing Bugs

On Wednesday, May 22nd, Google tweeted that indexation bugs were causing stale results to be served for certain queries; this bug was resolved early on Thursday, May 23rd.

By the evening of Thursday, May 23rd, Google was tweeting again, stating that it was working on a new indexing bug that was preventing new pages from being captured. On May 26th, Google followed up that this indexation bug had also been fixed.

Additional Reading:

2019 April 4-11 – De-Indexing Bugs

In April of 2019, an indexing bug caused about 4% of stable URLs to fall off of the first page. What happened? A technical error de-indexed a massive set of webpages.

Additional Reading:

2019 March 12 – Broad Core Update

Google was deliberately vague about this update, and just kept redirecting people and questions to the Google quality guidelines. However, the webmaster community noticed that the update seemed to have a heavier impact on YMYL (your money or your life) pages.

YMYL sites with low quality content took a nose-dive, and sites with heavy trust signals (well known brands, known authorities on multiple topics, etc) climbed the rankings.

Let’s take two examples:

First, Everydayhealth.com lost 50% of its SEO visibility after this update. Sample headline: “Can Himalayan Salt Lamps Really Help People with Asthma?”

Next, Medicinenet.com saw a 12% increase in its SEO visibility after this update. Sample headline: “4 Deaths, 141 Legionnaires’ Infections Linked to Hot Tubs.”

This update also seemed to factor in user behavior more strongly. Domains where users spent longer on the site, had more pages per visit, and had lower bounce rates saw an uptick in their rankings.

Additional Reading:

2019 March 1 – Extended Results Page

For one day, on March 1st, Google displayed 19 results on the first page of SERPs for all queries, 20 if you count the featured snippet. Many hypothesize it was a glitch related to in-depth articles, a results type from 2013 that has long since been integrated into regular organic search results.

Additional Reading:

2018 Google Algorithm Updates

2018 August – Broad Core Update (Medic)

This broad core update, known by its nickname “Medic” impacted YMYL (your money or your life) sites across the web.

SEOs had many theories about what to do to improve rankings after this update, but both Google and the larger SEO community ended up with the same messaging: make content users are looking for, and make it helpful.

This update sparked a lot of discussion around E-A-T (Expertise, Authoritativeness, Trustworthiness) for page quality, and the importance of clear authorship and bylines on content.

Additional Reading:

2018 July – Chrome Security Warning

Google begins marking all http sites as “not secure” and displaying warnings to users.

 

Google views security as one of their core principles, so this change makes sense as the next step to build on their October 2017 update that began warning users about unsecured forms.

 

Looking forward, Google is planning on blocking mixed content from https sites.

What can you do? Purchase an SSL certificate and make the move from http to https as soon as possible. Double check that all of your subdomains, images, PDFs and other assets associated with your site are also being served securely.
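One concrete step when auditing: scan your rendered HTML for assets still referenced over plain http, since these become mixed content once the page is served over https. A minimal sketch (it only checks src/href attributes in raw HTML; a real audit should also cover CSS, inline scripts, and canonical tags):

```python
import re

# Minimal mixed-content check: find src/href attributes that still point
# at plain http:// URLs. A real audit would also parse CSS and scripts.
INSECURE = re.compile(r'(?:src|href)=["\'](http://[^"\']+)["\']')

def insecure_references(html):
    """Return every http:// asset reference found in an HTML snippet."""
    return INSECURE.findall(html)

page = '<img src="http://example.com/logo.png"><a href="https://example.com">'
flagged = insecure_references(page)
```

Running this across your templates gives a worklist of assets to re-serve over https before browsers start blocking them.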

Additional Reading:

2018 July – Mobile Speed Update

Google rolled out the mobile page speed update, making page speed a ranking factor for mobile results.

Additional Reading:

2018 June – Video Carousels

Google introduces a dedicated video carousel on the first page of results for some queries, and moves videos out of regular results. This change also led to a significant increase in the number of search results displaying videos (+60%).

Additional Reading:

2018 April – Broad Core Update

The official line from Google about this broad core update is that it rewards quality content that was previously under-rewarded. Sites with content that was clearly better than that of their organic competitors saw a boost; sites with thin or duplicative content fell.

2018 March – Broad Core Update

March’s update focused on content relevance (how well does content match the intent of the searcher) rather than content quality.

What can you do? Take a look at the pages Google is listing in the top 10-20 spots for your target search term and see if you can spot any similarities that hint at how Google views the intent of the search.

Additional Reading:

2018 March – Mobile-First Index Starts to Roll Out

After months of testing Google begins rolling out mobile-first indexing. Under this approach, Google crawls and indexes the mobile version of website pages when adding them to their index. If content is missing from mobile versions of your webpages, that content may not be indexed by Google.

To quote Google themselves,

“Mobile-first indexing means that we’ll use the mobile version of the page for indexing and ranking, to better help our – primarily mobile – users find what they’re looking for.”

Essentially, the entire index is going mobile-first. This process of migrating over to indexing the mobile version of websites is still underway. Websites are notified in Search Console when they’ve been migrated to Google’s mobile-first index.

 

Additional Reading:

 

2017 Google Search Updates

2017 December – Maccabees

Google states that a series of minor improvements are rolled out across December. Webmasters and SEO professionals see large fluctuations in the SERPs.

 

[Image: Danny Sullivan’s tweets about the Maccabees update, noting that there are always multiple daily updates, not a single update.]
Barry Schwartz gave this set of updates the Maccabees nickname as he noted the most fluctuation around December 12 (occurring during Hanukkah). However, updates occurred from the very beginning until the very end of December.

 

What were the Maccabees changes?

Webmasters noted that doorway pages took a hit. Doorway pages act as landing pages for users, but don’t contain the real content – users have to get past these initial landing pages to access content of any value. Google considers these pages barriers to a user.

A writer at Moz who dissected a slew of site data from mid-December noted one key observation: when two pages ranked for the same term, the one with better user engagement saw its rankings improve after this update, while the other page saw its rankings drop. In many instances, sites that began to lose traffic had blog pages being shown/ranked where product or service pages should have been displayed.

A number of official celebrity sites fall in the rankings including (notably) Channing Tatum, Charlie Sheen, Kristen Stewart, Tom Cruise, and even Barack Obama. This speaks to how Google might have rebalanced factors around authoritativeness vs. content quality. One SEO expert noted that thin celebrity sites fell while more robust celebrity sites (like Katy Perry’s) maintained their #1 position.

Multiple webmasters reported a slew of manual actions on December 25th and 26th, and some webmasters also reported seeing jumps on the 26th for pages where they had been working on site quality.

Additional Reading:

2017 November – Snippet Length Increased

Google increases the character length of meta descriptions to 300 characters. This update was not long-lived as Google rolled back to the original 150-160 character meta descriptions on May 13, 2018.

2017 May – Quality Update

Webmasters noted that this update targeted sites and pages with:

  • Deceptive advertising
  • UX challenges
  • Thin or low quality content

Additional Reading:

2017 March – Fred

In early March, webmasters and SEOs began to notice significant fluctuations in the SERPs, and Barry Schwartz of Search Engine Roundtable began tweeting at Google to confirm algorithm changes.

The changes seemed to target content sites engaging in aggressive monetization at the expense of users: sites filling the internet with low-value content meant to benefit everyone except the user. This included PBN sites and sites created with the sole intent of generating AdSense income.

Fred got its name from Gary Illyes, who, when an SEO expert asked if he wanted to name the update, suggested that we should start calling all updates without names “Fred.”

 

The joke, for anyone who knows the webmaster trends analyst, is that he calls everything unnamed fred (fish, people, EVERYTHING).

 

The SEO community took this as confirmation of recent algorithm changes (note: literally every day has algorithm updates), validating their digging into the SERP changes.

Additional Reading:

2017 January 10 – Pop Up Penalty

Google announces that intrusive pop ups and interstitials are going to be factored into their search algorithm moving forward.

“To improve the mobile search experience, after January 10, 2017, pages where content is not easily accessible to a user on the transition from the mobile search results may not rank as highly.”

This change caused rankings to drop for sites that forced users to get past an ad or pop up to access relevant content. Not all pop ups or interstitials were penalized; for instance, the following pop ups were still okay:

  • Pop ups that helped sites stay legally compliant (ex: accepting cookies, or verifying a user’s age).
  • Pop ups that did not block content on load.

Additional Reading:

2016 Google Search Updates

2016 September – Penguin 4.0

The Google announcement of Penguin 4.0 had two major components:

  • Penguin had been merged into the core algorithm, and would now have real-time updates.
  • Penguin would be more page-specific moving forward rather than impacting entire domains.

SEOs also noted one additional change: Penguin 4.0 seemed to simply remove the impact of spam links in the SERPs, rather than penalizing sites with spammy links. This appeared to be an attempt by Google to mitigate the impact of negative SEO attacks on sites.

That being said, today in 2019 we still see a positive impact from running disavows for clients who have seen spammy links creep into their backlink profiles.
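Google's disavow file format is plain text: one full URL or one `domain:` entry per line, with `#` for comments. A small helper to generate one (the domain names below are invented for the example):

```python
def build_disavow_file(domains, urls=()):
    """Build the contents of a disavow file in Google's documented format:
    one entry per line, "domain:" prefix for whole domains, "#" comments.
    """
    lines = ["# Disavow file generated after a backlink audit"]
    lines += [f"domain:{d}" for d in domains]
    lines += list(urls)  # individual spammy URLs can be listed as-is
    return "\n".join(lines) + "\n"

content = build_disavow_file(
    ["spammy-links.example", "pbn-network.example"],
    urls=["http://bad.example/paid-links.html"],
)
```

The resulting text file is what gets uploaded through Search Console's disavow links tool.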

Additional Reading:

2016 September – Possum Update

This update targeted duplicate and spammy results in local search (Local Pack and Google Maps), the goal being to provide more diverse results when users search for a local business, product, or service.

Prior to the Possum update Google was filtering out duplicates in local results by looking for listings with matching domains or matching phone numbers. After the Possum update Google began filtering out duplicates based on their physical address.

Businesses who saw some of their listings removed from the local pack may have initially thought their sites were dead (killed by this update), but they weren’t; they were just being filtered (playing possum). The term was coined by Phil Rozek.

SEOs also noted that businesses right outside of city limits also saw a major uptick in local rankings, as they got included in local searches for those cities.

Additional Reading:

2016 May – Mobile Friendly Boost

Google boosts the effect of the mobile-friendly ranking signal in search.
Google took time to stress that sites which are not mobile friendly but which still provide high quality content will still rank.

Additional Reading:

2016 February 19 – Adwords Change

Google removes sidebar ads and adds a fourth ad to the top block above the organic search results.
This move reflects the search engine giant continuing to prioritize mobile-first experiences, where sidebar ads are cumbersome compared to results in the main content block.

2016 January – Broad Core Update + Panda Is Now Core

Google confirms a core algorithm update in January, right after confirming that Panda is now part of Google’s core algorithm.

Few conclusions could be drawn about the update, but SEOs noticed significant fluctuations with news sites and publishers. Longform content with multimedia got a boost, and older articles took a bit of a dive for branded terms. This shift could reflect Google tweaking current-event-related results to show more recent content, but the data was not definitive.

Additional Reading:

2015 Google Search Updates

2015 December – SSL/HTTPS by Default

Google starts indexing the https version of pages by default.

Pages using SSL are also seeing a slight boost. Google holds security as a core component of surfacing search results to users, and this shift becomes one of many security-related search algo changes. In fact, by the end of 2017 over 75% of the page one organic search results were https.

2015 October 26 – RankBrain

RankBrain, in testing since April 2015, was officially introduced on this date. RankBrain is a machine learning algorithm that filters search results to help give users the best answer to their query. Initially, RankBrain was used for about 15 percent of queries (mainly new queries Google had never seen before), but it is now involved in almost every query entered into Google. RankBrain has been called the third most important ranking signal.

Additional Reading:

2015 October 5 – Hacked Sites Algorithm

Google introduces an algorithm specifically targeting spammy results that were gaining search equity from hacked sites.

This change was significant: it impacted 5% of search queries. The algorithm hides sites benefiting from hacked sites in the search results.

 

Interactions with Gary Illyes at #pubcon and on Twitter suggest that this algorithm only applies to search queries traditionally known to be spammy.

 

 

The update came right after a September message from Google about cracking down on repeat spam offenders. Google’s blog post notified SEOs that sites which repeatedly received manual actions would find it harder and harder to have those manual actions reconsidered.

Additional Reading:

2015 August 6 – Google Snack Pack

Google switches from displaying seven results for local search in the map pack to only three.

Why the change? Google is switching over (step-by-step) to mobile-first search results, aka prioritizing mobile users over desktop users.

On mobile, only three local results fit onto the screen before a user needs to scroll. Google seems to want users to scroll to access organic results.

Other noticeable changes from this update:

  • Google only displays the street (not the complete address) unless you click into a result.
  • Users can now filter snack pack results by rating using a dropdown.

Additional Reading:

2015 July 18 – Panda 4.2 (Update 29)

Roll out of Panda 4.2 began on the weekend of July 18th and affected 2-3% of search queries. This was a refresh, and the first one for Panda in about 9 months.

Why does that matter? The Panda algorithm acts like a filter on search results to sort out low quality content. Panda basically gets applied to a set of data – and decides what to filter out (or down). Until the data for a site is refreshed, Panda’s ruling is static. So when a data refresh is completed, sites that have made improvements essentially get a revised ruling on how they’re filtered.

Nine months is a long time to wait for a revised ruling!

2015 May – Quality Update / Phantom II

This change is an update to the quality filters integrated into Google’s core algorithm, and alters how the algorithm processes signals for content quality. This algorithm is real-time, meaning that webmasters will not need to wait for data refreshes to see positive impact from making content improvements.

What kind of pages did we see drop in the rankings?

  • Clickbait content
  • Pages with disruptive ads
  • Pages where videos auto-played
  • How-to sites with thin or duplicative content (this ended up impacting a lot of how-to sites)
  • Pages that were hard to navigate/had UI barriers

In hindsight, this update feels like a precursor to Google’s 2017 updates for content spam and intrusive pop ups.

Additional Reading:

2015 April 21 – Mobilegeddon

Google boosts mobile-friendly pages in mobile search results.

This update was termed Mobilegeddon as SEOs expected it to impact a huge number of search queries, maybe more than any other update ever had. Why? Google was already seeing more searches on mobile than on desktop in the U.S. in May 2015.

In 2018 Google takes this a step further and starts mobile-first indexing.

Additional Reading:

2014 Google Algorithm Updates

2014 December – Pigeon Goes International

Google’s local algorithm, known as Pigeon, expands to international English speaking countries (UK, Canada, Australia) on December 22, 2014.

In December Google also releases updated guidelines for local businesses representing themselves on Google.

Additional Reading:

2014 October – Pirate II

Google releases an “improved DMCA demotion signal in Search,” specifically designed to target and downrank some of the sites most notorious for piracy.

In October, Google also released an updated report on how they fight piracy, which included changes they made to how results for media searches were displayed in search. Most of these user interface changes were geared towards helping users find legal (trusted) ways to consume the media content they were seeking.

 


Additional Reading:

 

2014 October 17 – Penguin 3.0

This update impacted 1% of English search queries, and was the first update to Penguin’s algorithm in over a year. This update was both a refresh and a major algorithm update.

2014 September – Panda 4.1 (Update 28)

Panda 4.1 is the 28th update for the algorithm that targets poor quality content. This update impacted 3-5% of search queries.

To quote Google:

“Based on user (and webmaster!) feedback, we’ve been able to discover a few more signals to help Panda identify low-quality content more precisely. This results in a greater diversity of high-quality small- and medium-sized sites ranking higher, which is nice.”

Major losers were sites with deceptive ads, affiliate sites (thin on content, meant to pass traffic to other monetizing affiliates), and sites with security issues.

2014 September – Known PBNs De-Indexed

This change impacted search, but was not an algorithm change, data refresh, or UI update.

Starting in mid-to-late September 2014, Google de-indexed a massive number of private blog network (PBN) sites, which existed solely to boost other sites and game Google’s search rankings.

Google then followed-up on the de-indexing with manual actions for sites benefiting from the PBN. These manual actions went out on September 18, 2014.

2014 August – Authorship Removed from Search Results

Authors are no longer displayed (name or photo) in the search results along with the pieces that they’ve written.

Almost a year later, Gary Illyes suggested that sites with authorship markup should leave the markup in place because it might be used again in the future. However, Google later suggested that it is perfectly capable of recognizing authorship from bylines alone.

2014 August – SSL becomes a ranking factor

Sites using SSL began to see a slight boost in rankings.

Google would later go on to increase this boost, and eventually warn users when they were trying to access insecure pages.

2014 July 24 – Google Local Update (Pigeon)

Google’s local search algorithm is updated to include more signals from traditional search (knowledge graph, spelling correction, synonyms, etc).

2014 June – Authorship Photos Removed

Photos of Authors are gone from SERPs.

This was the first step towards Google decommissioning Authorship markup.

2014 June – Payday Loan Update 3.0

Where Payday Loans 2.0 targeted spammy sites, Payday Loans 3.0 targeted spammy queries, or more specifically the types of illegal link schemes seen disproportionately within high-spam industries (payday loans, porn, gambling, etc.).

What do we mean by illegal? Link schemes that rely on hacking other websites or infecting them with malware.

This update also included better protection against negative SEO attacks.
2014 May 17-18 – Payday Loan Update 2.0

Payday Loan Update 2.0 was a comprehensive update to the algorithm (not just a data refresh). This update focused on devaluing domains using spammy on-site tactics such as cloaking.

Cloaking is when the content that Google sees for a page is different from the content that a human user sees when they click through to that page from the SERPs.

2014 May – Panda 4.0 (Update 27)

Google had stopped announcing changes to Panda for a while, so when they announced Panda 4.0 we knew it was going to be a larger change to the overall algorithm.

Panda 4.0 impacted 7.5% of English queries, and led to a drastic nose dive for a slew of prominent sites like eBay, Ask.com, and Biography.com.

 

Sites that curated information from other sources without adding information or analysis of their own (e.g. coupon sites, celebrity gossip sites) seemed to take a big hit from this update.

 

 

2014 February 6 – Page Layout 3.0 (Top Heavy 3.0)

This is a refresh of Google’s algorithm that devalues pages with too many above-the-fold ads, per Google’s blog:

We’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away.

So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.

The Page Layout algorithm was originally launched on January 19, 2012, and has only had one other update in October of the same year (2012).


2013 Google Algorithm Updates

2013 December – Authorship Devalued

Authorship gets less of a boost in the search results. This is the first step Google took in beginning to phase out authorship markup.

2013 October – Penguin 2.1

Technically the 5th update to Google’s link-spam fighting algorithm, this minor update affects about 1% of search queries.

 

2013 August – Hummingbird

Hummingbird was a full replacement of the core search algorithm, and Google’s largest update since Caffeine (Panda and Penguin had only been changes to portions of the old algorithm).

Hummingbird helped most with conversational search for results outside of the Knowledge Graph, where conversational search was already running. Hummingbird was a significant improvement in how Google interpreted the text of queries typed into search.

This algorithm was named Hummingbird by Google because it’s “precise and fast.”

2013 July – Expansion of Knowledge Graph

Knowledge Graph Expands to nearly 25% of all searches, displaying information-rich cards right above or next to the organic search results.

2013 July – Panda Dance (Update 26)

Panda begins going through monthly refreshes, also known as the “Panda Dance,” which caused monthly shifts in search rankings.

The next time Google would acknowledge a formal Panda update outside of these refreshes would be almost a year later in May of 2014.

2013 June – Roll Out Anti-Spam Algorithm Changes

Google rolled out an anti-link-spam algorithm in June of 2013 targeting sites grossly violating webmaster guidelines with egregious unnatural link building.

Matt Cutts even acknowledged one target – ‘Citadel Insurance’ – which built 28,000 links from 1,000 low-ranking domains within a single day (June 14th), and managed to reach position #2 for “car insurance” with the tactic.

By the end of June sites were finding it much harder to exploit the system with similar tactics.

 

 

2013 June 11 – Payday Loans

This update impacted 0.3% of queries in the U.S., and as much as 4% of queries in Turkey.

This algorithm targets queries that have an abnormally high incidence of SEO spam (payday loans, adult searches, drugs, pharmaceuticals) and applies extra filters to these types of queries specifically.

 

2013 May 22 – Penguin 2.0

Penguin 2.0 was an update to the Penguin algorithm (as opposed to just a data refresh); it impacted 2.3% of English queries.

What changed?

  • Advertorials will no longer flow PageRank
  • Niches that are traditionally spammy will see more impact
  • Improvements to how hacked sites are detected
  • Link spammers will see links from their domains transfer less value.

One of the biggest shifts with Penguin 2.0 is it also analyzed linkspam for internal site pages, whereas Penguin 1.0 had looked at spammy links specifically pointing to domain home pages.

This marked the first time in 6 months that the Penguin algorithm had been updated, and the 4th update to Penguin that we’ve seen:

  • April 24, 2012 – Penguin 1.0 Launched
  • May 25, 2012 – Penguin 1.1 Data Refresh
  • October 5, 2012 – Another Penguin Data Refresh


2013 May – Domain Diversity

This update reduced the amount of times a user saw the same domain in the search results. According to Matt Cutts, once you’ve seen a cluster of +/- 4 results from the same domain, the subsequent search pages are going to be significantly less likely to show you results from that domain.


2013 May 8th – Phantom I

On May 8th, 2013 SEOs over at Webmaster World noticed intense fluctuation in the SERPs.

Lots of people dove into the data – some commenting that sites that had taken a dive were previously hit by Panda – but there were no conclusive takeaways. With no confirmation of major changes from Google, and nothing conclusive in the data, this anomaly came to be known as the “Phantom” update.

2013 March 14-15 – Panda Update 25

This is the 25th update for Panda, the algorithm that devalues low quality content in the SERPs. Matt Cutts confirmed that moving forward Panda would be part of regular algorithm updates, meaning it would be a rolling update instead of a pushed update process.

2013 January 22 – Panda Update 24

The 24th Panda update was announced on January 22, 2013 and impacted 1.2% of English search queries.

 

2012 Google Algorithm Updates

2012 December 21 – Panda Update 23

The 23rd Panda update hit on December 21, 2012 and impacted 1.3% of English search queries.

2012 December 4 – Knowledge Graph Expansion

On December 4, 2012 Google announced a foreign-language expansion of the Knowledge Graph, their project to “map out real-world things as diverse as movies, bridges and planets.”

 


Variations of Knowledge Graph in Search Results for Different Languages (Russian, Japanese, etc)

2012 November – Panda Updates 21 & 22

In November 2012 Panda had two updates in the same month – one on November 5, 2012 (1.1% of English queries impacted in the US) and one on November 22, 2012 (0.8% of English queries impacted in the US).

2012 October 9 – Page Layout Update

On October 9, 2012 Google rolled out an update to their Page Layout filter (also known as “Top Heavy”) impacting 0.7% of English-language search queries. This update rolled the Page Layout algorithm out globally.

Sites that made fixes after Google’s initial Page Layout Filter hit back in January of 2012 saw their rankings recover in the SERPs.

2012 October – Penguin Update 1.2

This was just a data refresh affecting 0.3% of English queries in the US.

 

2012 September – Panda Updates 19 & 20

Panda update 19 hit on September 18, 2012 affecting 0.7% of English search queries, followed just over a week later by Panda update 20 which hit on September 27, 2012 affecting 2.4% of English search queries.

Panda update 20 was an actual algorithm update rather than just a data refresh, accounting for the higher percentage of affected queries.

2012 September – Exact Match Domains

At the end of September Matt Cutts announced an upcoming change: low quality exact match domains were going to be taking a hit in the search results.

Up until this point, exact match domains had been weighted heavily enough in the algorithms to counterbalance low quality site content.

2012 August 19 – Panda Update 18

Panda version 3.9.1 rolled out on Monday, August 19th, 2012, affecting less than 1% of English search queries in the US.

This update was a data refresh.

 

2012 August – Fewer Results on Page 1

In August Google began displaying 7 results for about 18% of the queries, rather than the standard 10.

Upon further inspection it appeared that Google had reduced the number of organic results so they’d have more space to test a suite of potential top-of-search features including expanded sitelinks, images, and local results.

This change, in conjunction with the knowledge graph, paved the way for the top-of-search rich snippet results we see in search today.

2012 August 10 – Pirate/DMCA Penalty

Google announces they’ll be devaluing sites in the SERPs that repeatedly receive valid accusations of copyright infringement. As of this date, the number of valid copyright removal notices is a ranking signal in Google’s search algorithm.

2012 July 24 – Panda Update 17

On July 24, 2012 Google announced Panda 3.9.0 – a refresh of the algorithm affecting less than 1% of search queries.

2012 July 27 – Webmaster Tool Link Warnings

Not technically an algorithm update, but it definitely affected the SEO landscape.

On July 27, 2012 Google posted an update clarifying topics surrounding a slew of unnatural link warnings that had recently been sent out to webmasters:

  • Unnatural link warnings and drops in rankings are directly connected
  • Google doesn’t penalize sites as much when they’re the victims of 3rd party bad actors

2012 June – Panda Updates 15 & 16

In June Google made two updates to its Panda algorithm fighting low quality content in the SERPs:

  • Panda 3.7 rolled out on June 8, 2012 affecting less than 1% of English search queries in the U.S.
  • Panda 3.8 rolled out on June 25, 2012 affecting less than 1% of queries worldwide.

Both updates were data refreshes.

2012 June – 39 Google Updates

On June 7, 2012 Google posted an update providing insight into search changes made over the course of May. Highlights included:

  • Link Spam Improvements:
    • Better hacked sites detection
    • Better detection of inorganic backlink signals
    • Adjustments to Penguin
  • Adjustments to how Google handles page titles
  • Improvements to autocomplete for searches
  • Improvements to the freshness algorithm
  • Improvements to rankings for news and recognition of major news events.

2012 May 25 – Penguin 1.1

A data refresh for the Penguin algorithm was released on May 25, 2012 affecting less than 0.1% of search queries.

2012 May 16 – Knowledge Graph

On May 16, 2012 Google introduced the knowledge graph, a huge step forward in helping users complete their goals faster.

First, the knowledge graph improved Google’s understanding of entities in Search (what words represented — people, places, or things).

Second, it surfaced relevant information about these entities directly on the search results page as summaries and answers. This meant that users, in many instances, no longer needed to click into a search result to find the information they were seeking.

2012 May 4 – 52 April Updates

On May 4, 2012 Google posted an update providing insight into search changes made over the course of April. Highlights included:

  • 15% increase in the base index
  • Removed the freshness boost for low quality content
  • Increased domain diversity in the search results.
  • Changes to Sitelinks
    • Sub sitelinks
    • Better ranking of expanded sitelinks
    • Sitelinks data refresh
  • Adjustment to surface more authoritative results.

2012 April – Panda Updates 13 & 14

In April Google made two updates to its Panda algorithm fighting low quality content in the SERPs:

  • Panda 3.5 rolled out on April 19, 2012
  • Panda 3.6 rolled out on April 27, 2012 affecting 1% of queries.

Panda 3.5 seemed to target press portals and aggregators, as well as heavily-templated websites. This makes sense as these types of sites are likely to have a high number of pages with thin or duplicative content.

2012 April 24 – Penguin

The Penguin Algorithm was announced on April 24, 2012 and focused specifically on devaluing sites that engage in spammy SEO practices.

The two primary targets of Penguin 1.0? Keyword stuffing and link schemes.


2012 April – Parked Domain Bug

After a number of webmasters reported ranking shuffles, Google confirmed that a data error had caused some domains to be mistakenly treated as parked domains (and thereby devalued). This was not an intentional algorithm change.

2012 April 3 – 50 Updates

On April 3, 2012 Google posted an update providing insight into search changes made over the course of March. Highlights included:

  • Sitelinks Data Refresh
  • Better handling of queries with navigational and local intent
  • Improvements to detecting site quality
  • Improvements to how anchor text contributes to relevancy for sites and search queries
  • Improvements to how search handles synonyms

2012 March – Panda Update 12

On March 23, 2012 we saw the Panda 3.4 update, a data refresh affecting 1.6% of queries.

 

2012 February 27 – Panda Update 11

Panda Update 3.3 was a data refresh that was announced on February 27, 2012.

2012 February 27 – Series of Updates

On February 27, 2012 Google posted an update providing insight into search changes made over the course of February. Highlights included:

  • Travel-related search improvements
  • International launch of shopping rich snippets
  • Improved health searches
  • Google changed how it was evaluating links, dropping a method of link analysis that had been used for the past several years.

2012 February – Venice

The Venice update changed the face of local search forever: local sites now show up even without a geo modifier being used in the keyword itself.

2012 January – Page Layout Update

This update devalued pages in search that had too many ads “above-the-fold.” Google said that ads that prevented users from accessing content quickly provided a poor user experience.

2012 January 10 – Personalized Search

On January 10, 2012 Google announced Search, plus Your World. Google had already expanded search to include content personally relevant to individuals with Social Search, Your World was the next step.

This update pulled in information from Google+ such as photos, profiles, and more.

2012 January 5 – 30 Google Updates

On January 5, 2012 Google posted an update providing insight into search changes made over the course of December of 2011. Highlights included:

  • Landing page quality became a signal for image search, beyond the image itself
  • Soft 404 detection (when a page returns a normal status code but its content is effectively an error page that isn’t accessible to the user)
  • More rich snippets
  • Better infrastructure for autocomplete (ex: spelling corrections)
  • More accurate byline dates
  • Related queries improvements
  • Upcoming events at venues
  • Faster mobile browsing – skipping a redirect step when sending users to a mobile site (m.domain.com)

2011 Google Algorithm Updates

2011 December 1 – 10 Google Updates

On December 1, 2011 Google posted an update providing insight into search changes made the two weeks prior. Highlights included:

  • Refinements to the inclusion of related queries so they’d be more relevant
  • Expansion of indexing to include more long tail keywords
  • New parked domain classifier (placeholder sites hosting ads)
  • More complete (fresher) blog results
  • Improvements for recognizing and rewarding whichever sites originally posted content
  • Top result selection code rewrite to avoid “host crowding” (too many results from a single domain in the search results).
  • New verbatim tool
  • New google bar

2011 November – Panda 3.1 (Update 9)

On November 18th, 2011 Panda Update 3.1 goes live, impacting <1% of searches.

 

2011 November – Automatic Translation & More

On November 14, 2011 Google posted an update providing insight into search changes made over the couple preceding weeks. Highlights included:

  • Cross language results + automatic translation
  • Better page titles in search results by de-duplicating boilerplate anchors (referring to google-generated page titles, when they ignore html title tags because they can provide a better one)
  • Extending application rich snippets
  • Refining official page detection, adjusted how they determine which pages are official
  • Improvements to date-restricted queries

2011 November 3 – Fresher Results

Google puts an emphasis on more recent results, especially on time-sensitive queries.

  • Ex: Recent events / hot topics
  • Ex: regularly occurring/recurring events
  • Frequently updated/outdated types of info (ex: best SLR camera)

2011 October – Query Encryption

On October 18, 2011 Google announced that they were going to be encrypting search data for users who are signed in.

The result? Webmasters could tell that users were coming from Google search, but could no longer see the queries being used. Instead, webmasters began to see “(not provided)” showing up in their analytics reports.

This change followed a January rollout of SSL encryption to Gmail users.

2011 October 19 – Panda Update 8 (“Flux”)

In October Matt Cutts announced there would be upcoming flux from the Panda 3.0 update affecting about 2% of search queries. Flux occurred throughout October as new signals were incorporated into the Panda algorithms and data was refreshed.

2011 September 28 – Panda Update 7

On September 28, 2011 Google released their 7th update to the Panda algorithm – Panda 2.5.

2011 September – Pagination Elements

Google added pagination elements – link attributes to help with pagination crawl/indexing issues.

  • rel="next"
  • rel="prev"

Note: Google no longer uses these attributes as an indexing signal.
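
To illustrate, the attributes lived in the <head> of each page in a paginated series; a minimal sketch with hypothetical URLs:

```html
<!-- On page 2 of a hypothetical three-page article series -->
<head>
  <link rel="prev" href="https://example.com/article?page=1">
  <link rel="next" href="https://example.com/article?page=3">
</head>
```

The first page omits rel="prev" and the last page omits rel="next", so the chain has clear endpoints.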

2011 August 16 – Expanded Site Links

On August 16, 2011 Google announced expanded display of sitelinks from a max of 8 links to a max of 12 links.


2011 August 12 – Panda Update 6

Google rolled out Panda 2.4 expanding Panda to more languages August 12, 2011, impacting 6-9% of queries worldwide.

2011 July 23 – Panda Update 5

Google rolled out Panda 2.3 in July of 2011, adding new signals to help differentiate between higher and lower quality sites.

2011 June 28 – Google+

On June 28, 2011 Google launched their own social network, Google+. The network was sort of a middle ground between LinkedIn and Facebook.

Over time, Google+ shares and +1s (likes) would become a temporary personalized search ranking factor.

Ultimately though, Google+ was decommissioned in 2019.

2011 June 16 – Panda Update 4

According to Matt Cutts, Panda 2.2 improved scraper-site detection.

What’s a scraper? In this context, a scraper is software used to copy content from a website, often to be posted to another website for ranking purposes. This is considered a type of webspam (not to mention plagiarism).

This update rolled out around June 16, 2011.

2011 June 2 – Schema.org

On June 2, 2011 Google, Yahoo, and Microsoft announced a collaboration to create “a common vocabulary for structured data,” known as Schema.org.
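
As an illustration, Schema.org vocabulary is embedded directly in ordinary HTML; at launch the supported syntax was microdata. A minimal sketch with hypothetical values:

```html
<!-- Microdata using the shared Schema.org vocabulary to describe a movie -->
<div itemscope itemtype="http://schema.org/Movie">
  <h1 itemprop="name">Example Movie</h1>
  <div itemprop="director" itemscope itemtype="http://schema.org/Person">
    Director: <span itemprop="name">Jane Doe</span>
  </div>
  <span itemprop="genre">Documentary</span>
</div>
```

The itemtype URL names the entity type, and each itemprop maps a visible value to a property all three search engines agreed to understand.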

2011 May 9 – Panda Update 3

Panda 2.1 rolled out in early May, and was relatively minor compared to previous Panda updates.

2011 April 11 – Panda Update 2

On April 11, 2011 Panda 2.0 rolled out globally to English users, impacting about 2% of search queries.

What was different in Panda 2.0?

  • Better assessment of site quality for long-tailed keywords
  • This update also began to incorporate data about sites that users manually block

2011 March 28 – Google +1 Button

Google introduces the +1 Button, similar to the Facebook “like” button or the Reddit upvote. The goal? Bring trusted content to the top of the search results.

Later in June Google posted a brief update that they made the button faster, and in August of 2011 it also became a share icon.


2011 February – Panda Update (AKA Farmer)

Panda was released to fight thin content and low-quality content in the SERPs. Panda was also designed to reward unique content that provides value to users.

 

Panda impacted a whopping 12% of search results, and virtually wiped out content farms, sites with low-quality content, thin affiliate sites, sites with large ad-to-content ratios, and over-optimized sites.

 

As a result, sites with less intrusive ads started to do better in the search results, sites with “thin” user-generated content went down, as did harder-to-read pages.

Per Google:

“As ‘pure webspam’ has decreased over time, attention has shifted instead to ‘content farms,’ which are sites with shallow or low-quality content.”

2011 January – Attribution Update

This update focused on stopping scraper sites from receiving benefit from stolen content. The algorithm worked to establish which site initially created and posted content, and boost that site in the SERPs over other sites which had stolen the content.

2011 January – Overstock.com & JCPenney Penalty

Overstock and J.C. Penney receive manual actions due to deceptive link building practices.

Overstock offered a 10% discount to universities, students, and parents, as long as they posted anchor-text-rich links on their university websites. A competitor noticed the trend and reported them to Google.

JC Penney had thousands of backlinks built to its site targeting exact match anchor text. After receiving a manual action they removed the spammy links and largely recovered.

2010 Google Algorithm Updates

2010 December – Social Signals Incorporated

Google confirms that they use social signals, including accounting for shares when evaluating news stories, as well as author quality.

2010 December – Negative Reviews

In late November a story broke about how businesses were soaring in the search results, and seeing their businesses grow exponentially – by being as terrible to customers as possible.

Enraged customers were leaving negative reviews on every major site they could, linking back to these bad-actor businesses to warn others. But what was happening in search is that all those backlinks were giving the bad actors more and more search equity, enabling them to show up as the first result for a wider and wider range of searches.

Google responded to the issue within weeks, making changes to ensure businesses could not abuse their users in that manner moving forward.

Per Google:

“Being bad is […] bad for business in Google’s search results.”

Additional Reading:
NYT – Bullies Rewarded in Search

2010 November – Instant Visual Previews

This temporary feature allowed users to see a visual preview of a website in the search results. It was quickly rolled back.

 

Additional Resources:
Google Blog – Beyond Instant Results, Instant Previews

 

2010 September – Google Instant

Building on Google Suggest, Google Instant begins displaying search results before a user actually completes their query.

This feature lived for a long time (in tech-years anyways) but was sunset in 2017 as mobile search became dominant, and Google realized it might not be the optimal experience for on-the-go mobile users.

2010 August – Brand Update

Google made a change to allow some brands/domains to appear multiple times on page one, depending on the search.

This feature ends up undergoing a number of updates over time as Google works to get the right balance of site diversity when encountering host-clusters (multiple results from the same domain in search).

2010 June – Caffeine Roll Out

On June 10, 2010 Google announced Caffeine.

Caffeine was an entirely new indexing system with a new search index. Where before there had been multiple indexes, each updated and refreshed at its own rate, Caffeine enabled continuous updating of small portions of the search index. Under Caffeine, newly indexed content was available within seconds of being crawled.

Per Google:

“Caffeine provides 50 percent fresher results for web searches than our last index, and it’s the largest collection of web content we’ve offered. Whether it’s a news story, a blog or a forum post, you can now find links to relevant content much sooner after it is published than was possible ever before.”

2010 May 3 – MayDay

The May Day update occurred between April 28th and May 3rd 2010. This update was a precursor to Panda and took a shot at combating content farms.

Google’s comment on the update? “If you’re impacted, assess your site for quality.”

2010 April – Google Places

In April of 2010 Local Business Center became Google Places. Along with this change came the introduction of service areas (as opposed to just a single address as a location).

Other highlights:

  • Simpler method for advertising
  • Google offered free professional photo shoots for businesses
  • Google announced another batch of favorite places

By April of 2010, 20% of searches were already location-based.

2009 Google Algorithm Updates

2009 December – Real Time Search

Google announces search features related to newly indexed content: Twitter Feeds, News Results, etc. This real time feed was nested under a “latest results” section of the first page of search results.


2009 August 10 – Caffeine Preview

On August 10 Google begins to preview Caffeine, requesting feedback from users.

2009 February – Vince

Essentially the Vince update boosted brands.

Vince focused on trust, authority and reputation as signals to provide higher quality results which could push big brands further to the top of the SERPs.

Additional Resources:
Watch – Is Google putting more weight on brands in rankings?
Read – SEO Book – Google Branding

2008 Google Search Updates

2008 August – Google Suggest

Google introduces “suggest” which displays suggested search terms as the user is typing their query.

2008 April – Dewey

The Dewey update rolled out in late March/early April. The update was called Dewey because Matt Cutts chose the (relatively uncommon) term as one that would allow comparison between results from different data centers.

2007 Google Algorithm Updates

2007 June – Buffy

The Buffy update caused fluctuations for single-word search results.

Why Buffy? Google Webmaster Central product manager and long-time head of operations, Vanessa Fox, notoriously an avid Buffy fan, announced she was leaving Google.

Vanessa garnered intense respect from webmasters over her tenure, both for her product leadership and for her responsiveness to the community – the people using Google’s products daily. The webmaster community named this update after her interest as a sign of respect.

2007 May – Universal Search

Old school organic search results are integrated with video, local, image, news, blog, and book searches.

2006 Google Search Updates

2006 November – Supplemental Update

An update to how the filtering of pages stored in the supplemental index is handled. Google went on to scrap the supplemental index label in July 2007.

2005 Google Search Updates

2005 November – Big Daddy

This was an update to the Google search infrastructure and took three months to roll out: January, February, and March of 2006. This update also changed how Google handled canonicalization and redirects.

2005 October 16 – Jagger Rollout Begins

The Jagger Update rolled out as a series of October updates.

The update targeted low quality links, reciprocal links, paid links, and link farms. The update helped prepare the way for the Big Daddy infrastructure update in November.

2005 October – Google Local / Maps

In October of 2005, Google merged Local Business Center data with Maps data.

2005 September – Gilligan / False Alarm

A number of SEOs noted fluctuations in September which they originally named “Gilligan.” It turns out there were no algorithm updates, just a data refresh (index update).

Given the news, many SEOs renamed their posts “False Alarm.” However, moving forward many data refreshes are considered updates by the community. So we’ll let the “Gilligan” update stand.

2005 June – Personalized Search

Google relaunches personalized search. This time it shapes future results based on your past searches and selections.

2005 June – XML sitemaps

Google launches the ability to submit XML sitemaps via Google Webmaster Tools. This update bypassed old HTML sitemaps and gave webmasters some influence over indexation and crawling, allowing them to feed pages directly to the index.
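The sitemap format Google began accepting is plain XML. Below is a minimal sketch of generating one in Python; the URLs and dates are purely illustrative:

```python
# Minimal sketch of the XML sitemap format introduced in 2005.
# URLs and lastmod dates are illustrative, not from any real site.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Render a list of (loc, lastmod) pairs as sitemap XML."""
    entries = "\n".join(
        "  <url>\n"
        f"    <loc>{escape(loc)}</loc>\n"
        f"    <lastmod>{lastmod}</lastmod>\n"
        "  </url>"
        for loc, lastmod in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

sitemap = build_sitemap([
    ("https://example.com/", "2005-06-01"),
    ("https://example.com/products", "2005-06-01"),
])
print(sitemap)
```

In practice the file is uploaded to the site root and submitted through the webmaster console rather than generated ad hoc like this.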

2005 May – Bourbon

The May 2005 update, nicknamed "Bourbon," seemed to devalue sites and pages with duplicate content, and affected 3.5% of search queries.

2005 February – Allegra

The Allegra update rolled out between February 2, 2005 and February 8, 2005. It caused major fluctuations in the SERPs. While nothing has ever been confirmed, these are the most popular theories amongst SEOs for what changed:

  • LSI being used as a ranking signal
  • Duplicate content is devalued
  • Suspicious links are somehow accounted for

2005 January – NoFollow

In early January 2005, Google introduced the "nofollow" link attribute to combat spam and control outbound link quality. This change helped clean up spammy blog comments: comments mass-posted to blogs across the internet with links meant to boost the rankings of the target site.
Future Changes:

  • On June 15, 2009, Google changed the way it views nofollow links in response to webmasters manipulating pages with "PageRank sculpting."
  • Google suggests webmasters use “nofollow” attributes for ads and paid links.
  • On September 10, 2019 Google Announced two additional link attributes “sponsored” and “ugc.”
    • Sponsored is for links that are paid or advertorial.
    • UGC is for links which come from user generated content.

2004 Google Algorithm Updates

2004 February – Brandy

The Brandy update rolled out over the first half of February and included five significant changes to Google's algorithmic formulas (confirmed by Sergey Brin).

Over this same time period Google’s index was significantly expanded, by over 20%, and dynamic web pages were included in the index.

What else changed?

  • Google began shifting importance away from PageRank to link quality, link anchors, and link context.
  • Attention is being given to link neighborhoods – how well your site connected to others in your sector or space. This meant that outbound links became more important to a site’s overall SEO.
  • Latent Semantic Indexing increases in importance. Tags (titles, metas, H1/H2) took a back seat to LSI.
  • Keyword analysis gets a lot better. Google gets better at recognizing synonyms using LSI.

2004 January – Austin

Austin followed up on Florida, continuing to clean up spammy SEO practices and push unworthy sites out of the first pages of search results.
What changed?

  • Invisible text took another hit
  • Meta-tag stuffing was a target
  • FFA (Free for all) link farms no longer provided benefit

Many SEOs also speculated that this had been a change to Hilltop, a ranking algorithm that had been around since 1998.

2003 Google Algorithm Updates

2003 November 16 – Florida

Google's Florida update rolled out on November 16, 2003 and targeted spammy SEO practices such as keyword stuffing. Many sites that were trying to game the search engine algorithms instead of serving users fell in the rankings.

 


2003 September – Supplemental Index

Google split their index into main and supplemental. The goal was to increase the number of pages/content that Google could crawl and index. The supplemental index had less restrictions on indexing pages. Pages from the supplemental index would only be shown if there were very few good results from the main index to display for a search.

When the supplemental index was introduced some people viewed being relegated to the supplemental index as a penalty or search results “purgatory”.

Google retired the supplemental index tag in 2007, but has never said that it retired the supplemental index itself. That said, it's open knowledge that Google maintains multiple indices, so it is within the realm of reason that the supplemental index may still be one of them. While the label disappeared, many wonder whether the supplemental index has continued to exist, morphing into what we see today as "omitted results." Sites found they were able to move from the supplemental index to the main index by acquiring more backlinks.

2003 July – Fritz (Everflux)

In July 2003, Google moved away from monthly index updates (often referred to as the Google Dance) to daily updates in which a portion of the index was updated each day. These regular updates came to be referred to as "everflux."

2003 June – Esmerelda

Esmerelda was the last giant monthly index update before Google switched over to daily index updates.

2003 May – Dominic

Google’s Dominic update focused on battling spammy link practices.

2003 April – Cassandra

Google’s Cassandra update launched in April of 2003 and targeted spammy SEO practices including hidden text, heavily co-linked domains, and other low-link-quality practices.

Google began allowing banned sites to submit a reconsideration request after manual penalties in April of 2003.

2003 February – Boston

Google’s first named update was Boston which rolled out in February of 2003. The Google Boston Update improved algorithms related to analyzing a site’s backlink data.

2002 Google Algorithm Updates

2002 September – 1st Documented Update

Google’s first documented search algorithm update happened on September 1, 2002. It was also the kickoff of “Google Dance” – large-scale monthly refreshes of Google’s search index.

SEOs were shocked by the update, with some claiming "PageRank [is] DEAD." The update was a little imperfect and included issues such as 404 pages showing up on the first page of search results.

2000 Google Search Updates

2000 December – Google Toolbar

Google launches its search toolbar for browsers. The toolbar highlighted search terms within webpage copy and allowed users to search within websites that didn't have their own site search.

SEO for Ecommerce
Is your SEO strategy failing to drive enough conversion-oriented organic traffic to your e-commerce site? Or are you just getting started with a new e-commerce store?
This article covers e-commerce-specific SEO strategies and considerations. By applying these on-page SEO tactics and best practices, you can expect your online store to gain search visibility, better organic rankings, higher organic search traffic, and ultimately more online shoppers.

We’ll walk you through the same basic SEO elements you’ve undoubtedly read about countless times before, but address their application for an ecommerce store. These practical SEO tips will help you significantly increase your new customers and online sales by making it easier to reach your target audience without spending a dime on PPC advertising.

How to Set IA for E-commerce Sites

Category-based information architecture (IA) and site architecture are critical for ecommerce sites, and this IA should inform your main navigation.

Online users have no patience. If they can't find what they're looking for quickly, they'll bounce back to the search results and try the next site.

As your product catalogue grows, you will need to put more effort into making it easier for users to find the right products!

Your site must have well-thought-through UX (user experience) elements such as filters, navigation links, breadcrumbs, and product categories and subcategories, as well as clear URL structures and product naming conventions.

Ecommerce IA Is All About Product Categories

Good UX makes it easy for a user to understand where they are in your site and what products and/or services your online store offers. For ecommerce SEO, site structure is typically based on product categories, product collections, and products/product filters.

  • Step 1: Determine Product Categories
  • Step 2: Determine Product Sub-Categories and/or Collections
  • Step 3: Determine Product Names

Product categorization on your e-commerce website may need to differ from your operational product categorization. Your ecommerce SEO strategy needs to reflect how your consumers view your products, not how your business views them.

IA Should Reflect How Your Customers Think

In general, your site structure should reflect how your customers think about your products and services, even down to your actual product names.

A common mistake that e-commerce sites make is organizing their products online the way they view those products from a production or operational perspective.

How everyday shoppers think about your products may be different than how you think about your products.

For example, you may think of a piece your company makes as “Breville part #: BJE510XL/45” but your everyday shopper may search for “Breville Filter Basket Replacement”.

So how do you establish and/or close the gap between how you think about your products, and how your prospective customers search for your products?

When In Doubt, Ask Your Customers!

Pull from a usability best practice — have direct conversations with existing customers. Ask them how they’d describe your products or services, how they mentally categorize your offerings, or how they searched for your business to begin with. Their input will help you understand the language your customers are using, and how they think about your products or services.

If you need a starting place for understanding how your consumers view your products, you can have them complete a card sorting exercise. Card sorting is a UX (user experience) tactic that helps you prioritize and group information based on how your customers see it – by literally giving them all the elements and asking for their feedback.

Account for Broader Market Trends

Next, complete keyword research. Keep in mind a handful of responses can be very helpful for gaining insights, but they may not reflect the broader market. Spot check search volume for keywords using the language your customers use, the language you would use, and be open to discovering additional ways the market overall searches for your products/services.

How to Conduct Keyword Research for an E-commerce Store

Keyword research helps you understand how the market thinks about products and services, and which search terms are likely to convert if you can attract those shoppers to your ecommerce site.

Start With a Keyword Research Tool

There are a number of tools to help you with keyword research. If you already use Google Ads, you could use the Google Keyword Planner tool as a starting point for establishing the best keywords (search terms) to target. You can also use LinkGraph's keyword volume or keyword tracker tools.

Make a Starter List of Product-Specific Keywords

Your goal should be to identify a list of keywords that describe your products and have high search volume. This set of terms represents how your consumer base thinks about your products, and this is the language that you should use throughout your ecommerce website (ex: for product categories or collections).

You need to know which long-tail keywords, meaning the specific terms, people use when searching for the exact products that you sell.

Narrowing Down Keywords By Search Intent

The types of keywords you would use to optimize an e-commerce site aren't necessarily the types you would use in another niche. You want to attract users toward the bottom of the purchase funnel. To do this, you need to identify a user's search intent.

As the name implies, the term “search intent” refers to the reason someone is performing a search. This also influences the words they choose when performing a search. Search intent can be broken down into four categories: informational, navigational, commercial, and transactional.

  • Informational searches use keywords that indicate a user wants to learn about a particular topic.
  • Navigational searches include the domain that results should be surfaced from, such as The New York Times or Twitter.
  • Commercial searches suggest a user is interested in a general product or service, but hasn’t chosen a specific option yet. This type of search might include keywords such as “best guitar for beginners” or “guitar reviews.” The person performing this search is clearly thinking about making a purchase in the future. They’re simply conducting initial research first.
  • Transactional searches are usually performed when a potential customer knows precisely what they wish to buy. As such, words such as “buy” and “for sale” often show up in these searches. To continue with the example above, after conducting research and deciding which guitar to buy, the user in question might search for “buy Yamaha Gigmaker EG Electric Guitar Pack” or “Yamaha EG Electric Guitar Pack for sale.”

It’s important that you select keywords that have commercial or transactional search intent. These are the terms that will convert for your site. Do NOT simply pick keywords that have high search volumes. After all, your core goal is actual ecommerce sales. You want to attract more potential customers, not just generic organic traffic!
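As a rough illustration of sorting a keyword list by intent, modifier words alone can get you a first-pass bucketing. This is only a sketch; the modifier sets below are illustrative assumptions, not an exhaustive taxonomy:

```python
# Sketch: bucket search queries by likely intent using modifier words.
# The modifier sets are illustrative assumptions, not an exhaustive taxonomy.
TRANSACTIONAL = {"buy", "order", "coupon", "discount"}
COMMERCIAL = {"best", "top", "review", "reviews", "cheapest"}
INFORMATIONAL = {"how", "what", "why", "guide", "tutorial"}

def classify_intent(query: str) -> str:
    q = query.lower()
    tokens = set(q.split())
    if tokens & TRANSACTIONAL or "for sale" in q:
        return "transactional"
    if tokens & COMMERCIAL:
        return "commercial"
    if tokens & INFORMATIONAL:
        return "informational"
    return "navigational/other"

print(classify_intent("buy Yamaha Gigmaker EG Electric Guitar Pack"))  # transactional
print(classify_intent("best guitar for beginners"))                   # commercial
print(classify_intent("how to restring a guitar"))                    # informational
```

A first pass like this still needs human review – a query with no modifier at all isn't necessarily navigational.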

Select Keywords with Commercial Search Intent

In general, ecommerce keywords belong to the commercial and transactional categories. It’s easy to understand why you’d want to focus primarily on researching transactional (and, to some degree, commercial) keywords. These are simply the types of keywords people frequently use when they are ready to make a purchase. This makes them ideal for ecommerce websites.

Select Keywords That Are Very Specific

It’s also worth noting that ecommerce keywords are often very specific, or long tail. Long-tail keywords are keywords that have modifier terms around the basic keyword. Identifying these more specific keywords can be extremely valuable.

In the example below we see how more specific searches can have a higher cost per click. This is a market indicator that the term is higher-converting. You may also notice that there is less search volume on long-tailed, or specific, keywords. Broad searches have higher search volume because there is a much wider range of reasons those searches could be performed. Specific searches have much clearer intent.

A very specific search phrase, for example one that includes model type, size, brand name, or location indicates a user knows exactly what they want and they are ready to make a purchase. Very specific searches tend to be more high-converting, and more impactful on your bottom-line.

Select Keywords That Have High CPC

One more indicator that a keyword is likely to be used by potential customers and have a higher conversion rate is a high Pay Per Click (PPC) or Cost Per Click (CPC) value. These are terms that the market has already validated as conversion-oriented. However, you cannot rely on CPC alone; you still need to check the relevancy of terms against your own product list and/or services. Converting for the market will not always mean converting for your specific site or business.


Set Up Your E-commerce Website Architecture

Once you understand the search queries your customers are using, at both a category and product level, you’re ready to finalize your site architecture. Use broad high-volume terms as product categories/product collections, and then more specific terms for sub categories and individual products.

You’ll use this architecture to set up the structure of your site, from your home page, to category-based landing pages, sub category landing pages, collection pages, and finally product pages. This architecture will also inform how your main navigation and sub navigations are structured.

Create Your Category Landing Pages

Once you've determined your product categories and subcategories, consider creating related landing pages for each category and/or subcategory. One study has shown that sites which increase their landing pages from 10 to 15 see a 55% increase in leads. These pages can be used to target keywords that are broader than your product-specific keywords. For example, this page on Guitar Center for electric guitars:

This landing page (and URL) target the broader and higher-volume term “electric guitars.” Category pages help your site capture traffic from higher-volume terms while the individual product pages target much more specific long-tailed keywords (aka higher-converting keywords).

Set Up Your Main Navigation

Category-based site structures also help users navigate quickly to relevant products/services right from the homepage. For ecommerce websites of any size, each category page can be a main nav or sub nav item. This strategy adds the related category keywords to every page on your site, and increases the PageRank of these pages through internal linking, as the main navigation is repeated on every page of your site.

Take Guitar Center's site, for example: all of these sub-category pages for "guitars" are listed (linked) in the main navigation, and therefore all the terms you see here are "read" by search engines on every single page of Guitar Center's site – not just the homepage. This site structure also makes it very easy for users to find the exact product they're looking for and even discover new products. This boosts your pages' odds of appearing in relevant search results.

Implement Category-Based Breadcrumb Navigation

For larger ecommerce sites, adding all product categories or subcategories to the main navigation may not be feasible. In this instance, breadcrumbs can provide an alternative method for leveraging internal linking and product pages to help users navigate deeper sites.

Guitar Center's main nav does not display or internally link to any pages below the "electric guitars" category. However, there are additional product subcategories within the electric guitars product category. To improve the usability and discoverability of these subcategory pages, breadcrumb navigation has been added (highlighted in red below).

Breadcrumbs are especially useful on product pages, as they can help users discover a full product line, clarify the website structure, and provide a secondary navigation link to bounce a user back multiple site levels without having to press the back button multiple times.

Amazon, as another example, uses breadcrumbs to provide users with secondary navigation on almost every product page.
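Breadcrumb trails can also be expressed as BreadcrumbList structured data so that search engines can display the trail in results. A minimal sketch, using a hypothetical category trail and example.com URLs:

```python
import json

# Sketch: BreadcrumbList structured data for a product page.
# The category trail and URLs here are hypothetical.
def breadcrumb_jsonld(trail):
    """trail: list of (name, url) from the top category down to the current page."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }, indent=2)

print(breadcrumb_jsonld([
    ("Guitars", "https://example.com/guitars"),
    ("Electric Guitars", "https://example.com/guitars/electric"),
    ("Left-Handed", "https://example.com/guitars/electric/left-handed"),
]))
```

On the page itself, the generated JSON would sit inside a script tag of type application/ld+json, alongside the visible breadcrumb links.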

Employ Category-Based Redirects

Finally, category pages are helpful for ecommerce sites because you can set up category-based redirects, which send out-of-stock product URLs to the main category page. This improves the user experience and reduces the chances of Google, Bing, or other major search engines reading any of your pages as 404s.
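A category-based redirect layer can be sketched as a simple lookup from retired product URLs to their category pages. The paths below are hypothetical, and in practice the mapping would live in your platform's redirect manager or server configuration:

```python
# Sketch: map retired product URLs to their category pages with a 301.
# Paths are hypothetical; real mappings would come from your catalog data.
CATEGORY_REDIRECTS = {
    "/guitars/electric/discontinued-model-x": "/guitars/electric",
    "/amps/tube/retired-20w-combo": "/amps/tube",
}

def resolve(path: str):
    """Return (status, path) - a 301 to the category page for retired products."""
    if path in CATEGORY_REDIRECTS:
        return 301, CATEGORY_REDIRECTS[path]
    return 200, path

print(resolve("/guitars/electric/discontinued-model-x"))  # (301, '/guitars/electric')
print(resolve("/guitars/electric"))                       # (200, '/guitars/electric')
```

Using a permanent (301) rather than temporary (302) redirect is the conventional choice here, since the retired product page is not coming back.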

Determine Your Site’s URL Structure

A well thought-through site structure will also enable you to programmatically generate custom product URLs that include relevant keywords (describe the product) and are easy for users to read/understand.

Use Plain-Language URLs

Keyword-based URLs are more “clickworthy” than URLs that consist of seemingly random characters such as product SKUs. URLs featuring keywords essentially “tell” a search engine algorithm more about what type of product is featured on the page.

URLs Should Include the Product Name

In ecommerce SEO, the URL slug will typically be the main product keyword – usually the product name. Your main keyword for the product should also be included in your H1. Continuing with the earlier example of a user searching for a beginner guitar set, this page from Guitar Center demonstrates the right way to generate a product URL:

The URL slug is the name of the product. It’s also the H1 (not just stylistically but also with the HTML H1 tag applied). The result? This is the first page to appear in a Google SERP when users search for “yamaha gigmaker eg electric guitar pack.”

Popular Ecommerce URL Formats

The URL also illustrates a popular ecommerce URL format: domain.com/category/product. Other options to consider include domain.com/collection/category/product and simply domain.com/product.

Determining which format to use requires deciding whether your products belong to specific categories and collections, or whether they stand on their own. The chosen format works in this example because the product is an electric guitar pack belonging to a specific brand.

Explore the traditional URL structure of various ecommerce platforms when reviewing your options, but keep in mind that you can often change the structure by choosing the right theme or directly editing the code.
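Whichever format you choose, product URLs are usually generated programmatically from category and product names. A minimal sketch of the domain.com/category/product pattern, with illustrative names:

```python
import re

# Sketch: generate a plain-language product URL in the
# domain.com/category/product format. Domain and names are illustrative.
def slugify(text: str) -> str:
    text = text.lower()
    text = re.sub(r"[^a-z0-9]+", "-", text)  # collapse non-alphanumerics to hyphens
    return text.strip("-")

def product_url(domain: str, category: str, product: str) -> str:
    return f"https://{domain}/{slugify(category)}/{slugify(product)}"

print(product_url("example.com", "Electric Guitars",
                  "Yamaha Gigmaker EG Electric Guitar Pack"))
# https://example.com/electric-guitars/yamaha-gigmaker-eg-electric-guitar-pack
```

Generating slugs from one canonical function like this keeps URLs consistent across the catalog and avoids stray characters that would otherwise need encoding.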

Optimize Individual Product Pages

I’m sure you’ve spent time optimizing your homepage already, but did you know it’s even more crucial to optimize the SEO of your individual product pages? These are the pages that need to appear in relevant search results, and you want them to be strong enough that they convince guests to make a purchase.

Product-Focused Technical SEO Elements

Once you’ve done some preliminary keyword research, begin to optimize your individual product pages by addressing your Technical SEO:

  • Page Title
  • Meta Description
  • H1
  • Clear search intent

E-commerce Title Tags

Page titles, also known as title tags, need to accurately represent what a product is. They also need to feature the primary keyword you most want to rank for. Make sure this keyword or phrase is front-loaded in the title tag so it's more noticeable on a small mobile screen, and so the key information still displays even in rich snippets.

Take a look at the examples below. In the first, only the first two words of the page title are visible. In the second, we see subcategories highlighted.
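One way to sanity-check front-loading is to preview how a title truncates in a narrow result. The sketch below uses a 60-character cutoff, which is a common rule of thumb rather than an official Google limit:

```python
# Sketch: preview how a title tag truncates, to check that the primary
# keyword survives. The 60-character cutoff is a rule of thumb, not an
# official limit.
def preview_title(title: str, limit: int = 60) -> str:
    if len(title) <= limit:
        return title
    # Cut at the limit, then back up to the last whole word.
    return title[:limit].rsplit(" ", 1)[0] + "…"

print(preview_title("Electric Guitars | Solid Body, Hollow Body & More | Guitar Center"))
```

Because the keyword "Electric Guitars" is front-loaded, it survives the truncation; a title ending with the keyword would lose it.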

E-commerce Meta Descriptions

You should also include relevant keywords in the meta description for your product page. Your meta description should encourage the user to click into the search result, clarify what the user can expect from the page, and include product-relevant keywords to help your page rank.

It’s important to keep in mind that Technical SEO is primarily about helping users navigate online. The meta information on your product pages (including page titles, descriptions, and headers) serve much the same purpose as highway signs or signage at an airport: They help users reach their intended destination. Thus, you should attempt to be as informative as possible, while also being brief and direct.

Improve Product Metas with “Modifier” Words

When thinking about what information to include in your title, meta description, and/or H1, it can help to think about product modifiers. These are terms (often included in long-tail keyword searches) that further describe the product. These often include items such as:

  • Price
  • Size(s)
  • Color(s)
  • Material(s)
  • Whether an item is for a particular gender and/or age group
  • Discounts (you may want to include keywords like “as low as”)
  • Shipping options

Remember, many of the users you’re targeting know exactly what they’re looking for in detail. You thus need to provide them with information demonstrating you’re selling exactly what they’re looking for.

Images Sell Products, Alt Tags Help

Including images of your products is crucial to ecommerce SEO.

  • Images dramatically increase product sales, and help users form an idea of what they'll be receiving for their money.
  • Well-optimized images with fast load speeds help send signals to search engines that your site has been optimized for mobile users.
  • Alt tags and image title tags can help your images show up in image searches as well as sending additional keyword-relevancy signals to search engines.
  • Images are a prerequisite for being included in Google’s rich snippets at the top of the search results.

Images Provide Customers Better Context

Images allow you to display your products in dynamic ways. If you’re selling apparel, product images where items are being worn by models provide customers a better sense of how an article of clothing looks when worn.

Images can also provide context for products. For instance, maybe you sell furniture and fixtures. An image of a product in a room (ideally surrounded by a few other items) will give users a better idea of its size, and how it will look in their own homes.

Images Need to Be Size and Speed Optimized

Product images with small file sizes, which do not slow page load speeds and which adjust responsively, display well on mobile. Google is continuing to switch sites over to mobile-first indexing, and both Google and Bing noticeably reward sites with images and rich media in search.

Notes on Drafting Alt Tags

Image alt text tags are simply descriptions of images on your site, and they play a significant role in ecommerce SEO. Alt text tells a user what an image depicts when the image doesn't load or when the user browses with a screen reader, but it also gives search engines more information about what an image is relevant for in search.

Where appropriate, alt tags should feature the keywords you want to rank for without adding confusion to the image description itself. Keep in mind that the point of alt text is still to help people with accessibility needs, so keep alt text relatively short (no more than 125 characters) and try to be specific about any key product features highlighted in the image (such as product name, size, and materials). This is another way to tell a search engine what type of content appears on the page.
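These guidelines are easy to check programmatically across a large catalog. A minimal sketch of an alt-text lint, where the thresholds are rules of thumb rather than search engine requirements:

```python
# Sketch: lint product image alt text against the guidelines above -
# present, under ~125 characters, and mentioning the product name.
# Thresholds are rules of thumb, not search engine requirements.
def lint_alt_text(alt: str, product_name: str):
    issues = []
    if not alt.strip():
        issues.append("alt text is empty")
    if len(alt) > 125:
        issues.append("alt text exceeds 125 characters")
    if product_name.lower() not in alt.lower():
        issues.append("alt text does not mention the product name")
    return issues

print(lint_alt_text("Yamaha Gigmaker EG pack in sunburst red", "Gigmaker EG"))  # []
```

Running a check like this over an export of your image metadata surfaces missing or keyword-stuffed alt text before it reaches the live site.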

Lengthen Product Descriptions

Including a short product description right next to the product image is a smart way to capture a potential customer’s attention and improve the on page SEO. They’ll see both the product and the most important information about it at the same time.

How to Avoid Thin Content

Review this example to understand why this method is effective. The image shows off the product, while the copy provides a user with the basic essential information.

Scroll down, however, and you'll find a lengthier product description. A longer product description section gives you the opportunity to include more keywords in your content and ensures thin content won't hold back your rankings.

Additionally, great content is more likely to engage customers and build brand awareness. With only thin content, users spend less time on the page and less time considering your product. Not only does more time spent on the page improve your rankings in the SERPs, but great descriptions also help guests better understand why a product is valuable.

The right word count for descriptions depends on how much content exists on a blank product page – that is, the sum of text in the navigation, header, footer, etc. before product information is added. Making sure product descriptions are longer than this base page content is a good starting point.

Never Use The Manufacturer’s Product Description

One thing to always note, though: never use the manufacturer’s product description. Ensure yours is unique and not copied from the manufacturer or another site. This is important because search engines won’t show the exact same content (duplicate content) from multiple sites. They instead only display content from the site that is deemed most “trustworthy.” This can end up being the site that has had the content up the longest, the site with the most traffic, the site with the most backlinks, or the site with the most users.

How to Avoid Duplicate Content

If you have multiple pages for essentially the same product (ex: the same product in different colors, or the same product in different sizes), you’ll need to make some choices so that search engines are not confused by what are essentially duplicative pages/duplicative content:

  • Keep the product description on each page relatively the same, but set one page as canonical.
  • Create entirely unique product descriptions/unique content for each page.
  • Vary a percentage of the content under the descriptions using LSI keywords (latent semantic indexing keywords).

To avoid duplicate content when creating product descriptions, try breaking up the content into multiple sections. For example, one section could describe the story behind the product. Another could list its key features. Yet another could feature customer testimonials, just like the Biossance example above.

Include LSI Keywords

LSI keywords are terms that search engines expect to see on a page related to a particular topic. Often called focus terms, LSI keywords help Google understand the focus of a page. Read more about focus terms and content optimization here.

LSI keywords can help you tailor each page to rank better for the longtail keyword or term you’ve selected. LinkGraph even has our own content optimization tool that you can use for free.

Break Product Descriptions Into Sections

You should also consider that different people care about different product features. Breaking your descriptions up into sections makes it easier to appeal to all users. For example, if you were selling a garment, various users might care about such information as size, durability, warranties, shipping options, color options, special features (such as water-resistance), and more. Use your longer product descriptions to provide all information you believe potential buyers would be interested in. If your product descriptions are exceptionally long, you can even use internal links called jump links to help users navigate down to relevant content more quickly.


Implementing Product Schema for an eCommerce Site

Schema markup refers to structured data tags you can add to your content; search engines use them to display enhanced "rich snippet" results. When used correctly, schema can increase CTR by as much as 677% and boost traffic 20-30% by providing users with more valuable information about your content when it shows up in Google search results.


For example, maybe a user’s search results include one of your product pages. With schema markup, you could include customer ratings in your organic search result – or even show up in the product rich snippets that appear at the top of the search results. It’s also worth noting that Google’s own John Mueller has confirmed that schema is important to SEO.

How to Implement Product Schema

Product schema can substantially improve your SEO. It’s also fairly easy to implement. The following are two simple ways to do so:

Install a Plugin

Do you use a major platform like Shopify or WooCommerce to manage your ecommerce site? If so, you can simply install a plugin for schema markup. It will allow you to add the necessary schema with ease.

Other platforms, like WordPress, have their own schema plugins, too. So does Squarespace. While these platforms don't allow for extensive schema markup on their own, plugins can expand their capabilities.

Use a Schema Markup Platform

If you have a custom site that is NOT managed via WordPress, you could instead use SchemaApp, which allows you to organize your schema markup data on one platform. You can also use this tool if you host your e-commerce site through platforms such as Shopify, WooCommerce, BigCommerce, and Squarespace.

There’s also Google’s Structured Data Markup Helper, which you can follow along with after selecting “Products” from the main screen.

Does your organization have substantial in-house technical resources? If so, you can coordinate with a web developer to add schema markup to your site via Schema.org. This allows you to exercise a greater degree of control over the purpose of the schema you wish to add. It’s not an option for all businesses, but it’s worth considering if you have the necessary resources.
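To make this concrete, a minimal Product schema block can be embedded in a page’s HTML as JSON-LD. The product name, price, and rating values below are placeholders for illustration, not real data:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Water-Resistant Jacket",
  "description": "A lightweight, water-resistant jacket available in four colors.",
  "image": "https://www.example.com/images/jacket.jpg",
  "offers": {
    "@type": "Offer",
    "priceCurrency": "USD",
    "price": "79.99",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "89"
  }
}
</script>
```

Before deploying markup like this, you can validate it with Google’s Rich Results Test.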

Build Backlinks

Building a strong backlink profile is part of an effective SEO strategy for any site. Ecommerce sites are no exception. Inbound links (also known as backlinks) are critical for improving your SERP rankings, and can help you bump terms stuck on the second search engine results page up to the first.

What are inbound links or backlinks? A backlink is when another (external) site links back to your site, referencing your products, services, or content. In essence, it’s another site referring its own users to your site because your site provides value. Search engines view these links as a positive reference from a real person, and link signals are weighted heavily in SEO. Each backlink your site receives increases the value of your site in the eyes of Google, and thus improves your rankings.

As more domains link back to your site, your site’s domain authority will increase. As your domain authority increases, so does your site’s SEO value. Search engines use these ranking signals (backlinks) to determine which sites are most relevant for related topics. Added link authority boosts your page’s SEO value and its ranking in the search results.

A strong backlink profile improves brand awareness and captures top of funnel web visitors who may encounter your site/brand via another initial source.

How to Build Backlinks

Link building starts with creating quality content that will be used and shared by people outside of your own site. Securing inbound links from reputable sites tells search engines your site is also reputable. These links may also provide additional opportunities for your products to display in rich snippets, and direct more traffic back to you.

There are several ways you can build backlinks for an ecommerce site. The following are a few methods we’ve found to be successful for e-commerce sites:

Submit Your Products to Product Lists & Pages

Many sites routinely post lists such as “Best Holiday Gifts for College Students,” “25 Life-Changing Products under $25,” or “What to Get the Person Who Has Everything.” There are also sites specifically designed to help users discover new products, such as Uncommon Goods, Product Hunt, Wish, or Pinterest. Submitting your products to both kinds of sites boosts your odds of showing up where shoppers are already browsing.

Post Strong Blog Content

A blog featuring valuable content can be a very useful tool for building backlinks. We recommend starting by identifying a list of ideas for blog posts. Each post should be tailored to a frequently asked question or frequently searched topic relevant to your business and your target consumer. Popular blog content, such as “Top 10 Gift Ideas for Father’s Day 2020,” can encourage others to link back to it, especially if it is fairly comprehensive.

Additionally, you could submit guest blogs to other sites, linking back to your products in the content. For further reading, SEMRush has a great guide to guest blogging as a linking strategy.

Pitch Product Pages

A more advanced approach would involve pitching product pages. Think about the kind of sites and publications that are likely to cover your products. Check their writer profiles and masthead to find their contact information, and submit a product for review. Each site will have its own process, so research publications and influencers or discuss this with an editor or other relevant individual before submitting your products blindly.


Image Above: Example of a Product Write-Up Included in a List

You may also want to coordinate with influencers in your niche. This guide gives an excellent intro to reaching out to influencers. Search for social media influencers in your industry. If a popular Internet personality recommends your products, that will generate more backlinks and drive overall interest in your brand.

Next Steps

Once you’ve optimized the basics (product descriptions, URLs, schema, etc.), make sure that your site has enough trust signals and social proof that consumers feel confident purchasing from your online store:

  • Your site design needs to look professional, from the home page to your checkout page.
  • You need to provide security indicators around payment portals, such as an SSL certificate (e.g., https vs. http).
  • Add product reviews to your site!

Customers typically trust user reviews on third-party platforms (such as reviews on Google My Business or Amazon) more than curated product reviews you post to your own site. However, testimonials that display the customer’s full name can still be a great first step, or a useful addition to pulling third-party customer reviews onto your own site.

Depending on how you’ve set up your product schema, you’ll also be able to display your aggregate rating, or star rating directly in the search results (especially the Google search results).

These points are all important to keep in mind when developing an ecommerce SEO strategy for your site. The right combination of tools and techniques can be the key to ranking higher, attracting more organic traffic, improving your conversion rate, and ultimately increasing your e-commerce sales.


The post SEO for Ecommerce appeared first on LinkGraph.

How to Optimize Internal Links for SEO https://linkgraph.io/blog/internal-links-for-seo/ https://linkgraph.io/blog/internal-links-for-seo/#respond Fri, 09 Sep 2022 16:20:51 +0000 https://linkgraph.io/?p=17897 Internal links allow Google to rank your site more accurately and index your site more effectively.  Your website’s Internal links not only improve the user experience, they […]

The post How to Optimize Internal Links for SEO appeared first on LinkGraph.

Internal links allow Google to rank your site more accurately and index your site more effectively. 

Your website’s internal links not only improve the user experience; they also communicate your site architecture to web crawlers and show how your web content interrelates. 

Without a strong, strategic internal linking structure, your site may lose SEO value and struggle to rank in search engines.

Here is a guide on SEO best practices for internal links, and some mistakes you might be making that could be impacting your organic visibility.

What are Internal Links?

An internal link is a hyperlink that points to a different page on the same website.

a web page with two internal links pointing to two other pages on the same website

They are commonly used to help users navigate between different pages of a website, but can also be used for SEO purposes.

Internal links help to keep visitors on your website longer, which can improve your site’s SEO performance.

What are the Different Types of Internal Links?

There are a few different types of internal links you likely have on your website right now. 

Some of them will bring more SEO value than others, so it’s good to know the difference between each.

Menu/Navigation

The links in your menu/navigation bar are some of the most important internal links. These links remain consistent no matter where a site visitor travels across your website.

a screenshot of a homepage with a red box around the navigation menu

They should point to the most important pages (e.g. product categories, primary services, blog, about, etc.) and should give users a high-level overview of what type of content is on your website.

Because the majority of your link equity is most likely concentrated on your homepage, these internal links will distribute a significant amount of PageRank across your website, so make sure the pages linked there are the most important ones – the ones you want to rank.

The internal links you include here will also communicate to those users visiting your website for the first time where to go next. 

Footer Links

Footer links are at the bottom of your web pages. Like the nav bar, the footer is like an anchor that remains consistent across your website.

screenshot of a web page with a red box around the footer links

There may be some repetition in the links you include in your navigation menu and your footer, and that’s okay. They also will be sending quite a bit of link equity from your homepage to the pages linked there.

If users reach the bottom of a web page and have not found a place to click next, you want them to find what they are looking for in the footer.

Buttons/CTA Links

The internal links that you include on your buttons or CTAs are important for shaping the user or buyer journey across your website and for conversion rate optimization.

screenshot of a web page with a red box around the button or CTA links on the page

Most likely, CTA links are pointing to web pages that push users further down the conversion funnel, whether that is to a web page to book a meeting, request a demo, submit an email address, or add an item to a cart.

The anchor text of these internal links will be primarily user and conversion focused.

Sidebar Internal Links

Sidebar links are often used to offer users relevant content options or suggest which page they could go to next. 

For publishers that feature a lot of content on their website, sidebar links can help site visitors who are browsing your website without necessarily looking for something specific, but are just exploring the various content you offer.

screenshot of a web page with a red box around the sidebar links

Sidebar links are very common on news sites, recipe sites, or those that want the opportunity to show users multiple pages (and thus multiple advertisements).

In-Article Links

In-article links are those that are included in the body of blog posts or long-form articles. They point to relevant content that can provide users with more context or information.

Screenshot of a blog post with red boxes around the two internal links

These types of links are very common because they have loads of SEO value. 

If you are not linking to other relevant articles on your website within each blog post, you’re missing out on opportunities to improve your ranking positions and search engine visibility.

Why are Internal Links Important for SEO?

The SEO benefits of internal links are significant, and can improve your search engine visibility for a variety of reasons.

1. Direct Users & Google to your Most Important Pages

Internal links let Google know the most important content on your website. You can use internal links to help Google understand which pages to promote in the SERPs.

2. Help Google Find and Index your Pages

When indexing sites, search engine crawlers begin on your homepage and spread out from there, using internal links as their navigational guide. 

When you have a strong internal linking system, Google is more likely to find and index all your URLs, so your newest content has ranking potential.

3. Communicate Topical Relevance Through Anchor Text

You may wonder how Google knows what your site and landing pages are about. 

Google’s web crawlers use the anchor text from internal linking to understand the purpose and meaning of your content and its relevance to specific search terms.

Anchor text best practices can improve your SEO.

4. Maximized Crawl Budget

Strategic use of noindex and nofollow tags with your internal links can help you ensure that Google is crawling and indexing your most important pages.

For pages that don’t need to be indexed, like thank-you or confirmation pages, a noindex directive keeps low-value or low-converting pages out of Google’s index, while nofollow attributes on the internal links pointing to them discourage crawlers from following those links at all. 

It also leaves room in your website’s crawl budget for Google to index those pages that you do want to rank.
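As an illustration, the two mechanisms look like this in a page’s HTML. The /thank-you path is a hypothetical example; note that a nofollow attribute only discourages crawlers from following the link, while keeping the page out of the index requires a noindex robots meta tag on the page itself:

```html
<!-- Internal link with a nofollow hint; crawlers are discouraged from following it -->
<a href="/thank-you" rel="nofollow">Thank you</a>

<!-- Placed in the <head> of the low-value page itself; keeps it out of Google's index -->
<meta name="robots" content="noindex">
```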

5. Better User Experience

Internal links also make your website a better place for site visitors.

Navigation links guide users along a conversion journey after they find you in the SERPs, and in-content links can point them to other relevant pages.

6. Displays Topical Depth and Breadth

Interlinking your topically related pages can turn your website into a topical powerhouse.

Having lots of internal links in your blog posts to related topics or subtopics shows Google crawlers that your website has topical authority, and is a go-to expert source in a particular industry niche or topic area.

How to Analyze My Internal Links for SEO

If you are not sure whether or not you have internal link issues on your website, a site crawler or site audit tool can help you identify any issues.

To use SearchAtlas’ free site auditor, register for a trial of our SEO software.

To run a site audit, do the following.

  1. Register for an account with SearchAtlas
  2. Navigate to the Site Audit tool in your dashboard
  3. Enter your homepage URL into the Auditor and click “Audit Site”
  4. Select your preferred User Agent, Crawl Speed, and Crawl Budget
  5. Wait for your audit to generate. Depending on the size of your website, it may take up to a day for the auditor to crawl all of your pages. You’ll receive an email when your site audit is ready.
  6. Look for your homepage in the Sites List and click “View Audit.”

If you are not comfortable using our software on your own, you can also order an Internal Linking Analysis in our order builder. Our technical SEO experts will determine if there are any link issues on your site and provide a roadmap for how to optimize your internal linking profile for better organic visibility.

Common Issues with Internal Links

You can use the SearchAtlas Site Auditor to see whether or not you are utilizing internal linking best practices. 

Our report will flag any internal linking issues that may be preventing your web pages from earning higher keyword rankings in the SERPs.

Not Enough Internal Links

One of the most common mistakes that new or unoptimized websites make is that they do not include enough internal links on their web pages.

If your web pages are failing to include the right amount of internal links, it will be flagged in your SearchAtlas site audit report.

screenshot of the too few internal links issue in the searchatlas site auditor

This may or may not be an easy fix, depending on the number of web pages you have on your website. 

To resolve the issue, do the following:

  • If you already have relevant content on your website but you are just not linking to it, adding internal links to those pages is the first step to resolving this issue.
  • But if you are a newer website, you will need to write and publish relevant content on your website, and it will need to be high-quality in order to bring SEO value. Then, once the content is live on your website, you can take the next step of adding internal links.

Too Many Internal Links

Although you want to include internal links on your web pages, too many outlinks on a page (both external and internal) can look like over-optimization to Google.

screenshot of the too many outlinks issue in searchatlas site audit report

Make sure that you are only including links to relevant, helpful content. And don’t overdo it by stuffing your navigation menu or footer with too many internal links. 

Reserve those links for the most important pages on your website – the ones you really want to rank in the SERPs.

Broken Internal Links

Another very common issue that may be flagged in your site audit report is broken internal links. 

screenshot of the broken internal links issue message in the searchatlas site auditor

A broken internal link occurs when you move or delete a page on your website, and you do not update previous internal links with the new destination url.

As a result, those internal links point to 404 pages. Sending Google crawlers and users to a dead page is not good for SEO or for the user experience.

Broken internal links are very common with large enterprise or ecommerce websites that are constantly updating their content. 

To resolve a broken internal link, take one of the following actions:

  1. Restore the dead/deleted page
  2. Update the internal link with a new destination url

Internal Links with Redirects

Sometimes, webmasters may not be worried about internal links because they use 301 redirects whenever they move or delete a page.

Although 301 redirects are good for SEO in terms of the links from other websites that point to your web pages, internal links with 301 redirects are not considered SEO best practice.

screenshot of internal links with redirects in searchatlas site audit report

Why? Because redirected internal links slow down your website and force Google’s crawlers to move through it at a slower pace.

Whenever you move a page, a part of your website maintenance needs to be updating any internal links with the new destination url.

This shows Google crawlers that you are an attentive webmaster, and thus makes them more likely to promote your pages.

Unoptimized Anchor Text

The anchor text that you use to internally link your pages is also important to your keyword rankings and your user experience.

Anchor text lets Google know what your other web pages are about, how your content interrelates, and displays the many valuable pieces of content that live permanently on your website.

For more details on anchor text best practices, read this anchor text guide.

Final Thoughts on Internal Links

Your website’s internal link profile is essential to optimize if you want to rank for high-value keywords in your industry.

Taking the time to audit your internal links and repair any issues can be all the difference in your ranking positions.

Still not sure how to resolve internal linking problems? Connect with our SEO strategists to see how we can help.


301 Redirects for SEO & Common Redirect Issues https://linkgraph.io/blog/301-redirects-for-seo/ https://linkgraph.io/blog/301-redirects-for-seo/#respond Fri, 02 Sep 2022 16:20:19 +0000 https://linkgraph.io/?p=17405 Learning how to properly use 301 redirects for SEO can make it so your website maintains keyword rankings and organic traffic even as you make changes to […]

The post 301 Redirects for SEO & Common Redirect Issues appeared first on LinkGraph.

Learning how to properly use 301 redirects for SEO can make it so your website maintains keyword rankings and organic traffic even as you make changes to your content or site architecture.

The reality is, our websites are constantly changing. Good and attentive webmasters will add new and updated content over time to make sure they are providing the highest quality content and page experience to users. 

As a result, redirects become necessary to make sure users and search engine crawlers can find your content. But improper use of redirects can result in lost keyword rankings, lost link equity, and a poor user experience for your website visitors.

When implemented with SEO best practices, 301 redirects shouldn’t undermine your SEO efforts, but ensure that your search visibility is maintained. Here’s a guide to 301 redirects and how to implement them correctly.

Types of Redirects

Here are all of the redirects that you might want to know about, particularly if they are mentioned in your SearchAtlas site auditor report.

  • 301 = “Moved Permanently” – best for SEO
  • 302 = “Moved Temporarily” – often used during website redesigns
  • Meta Refresh = page-level redirect that is not recommended for SEO

As a general rule, if a page is important and you want it to rank, then you should use a 301 redirect if the page is ever moved.

What is a 301 Redirect?

301 redirects are used to tell browsers and search engines that a web page has been permanently moved to a new location.

301 redirect definition with flow chart

For example, https://linkgraph.io/why-anchor-text-diversity-is-good-for-your-backlink-profile redirects to https://linkgraph.io/anchor-text-diversity

301 redirects ensure that users and search engines are always directed to the most current and relevant content. A 301 redirect tells the search engine that the page has been moved, and the old page can be safely removed from the search engine’s index, while the new page should be indexed instead.

Why Do 301 Redirects Matter for SEO?

There are a few ways that 301 redirects can impact your web pages’ SEO performance.

  • Ensure the most up-to-date versions of your web pages are indexed and shown to searchers
  • Protect the visibility of your web pages during and after site migrations
  • Help you maintain the majority of the link equity the original page earned through backlinks or previous link building efforts

Here is a video from Google Search Central about how link equity is passed through a redirect.

Common Redirect Issues

There are some common issues that occur with redirects that can impact SEO performance. It’s possible that one or more of these issues will be flagged if you run an SEO audit using the Site Auditor in your SearchAtlas dashboard.

Broken Redirects

Broken redirects are those that point to 404 or dead pages. When this happens, you will often see an error message like this:

screenshot of a 404 error message

The negative impact of a broken page for users and search engines is pretty clear, so you want to avoid sending either to a dead page at all costs.

Unfortunately, broken redirects are hard to detect without the use of a site auditor. But if you’re a webmaster for an ecommerce website with thousands of product pages that are constantly being added or old pages being deleted, broken redirects are more common than you might think.

Here are the two ways you can resolve this issue.

  1. Reinstate the dead page so the redirect is no longer broken
  2. If you want the dead page to stay dead, you need to remove every internal link on your website that points to that dead page

Redirect Chains & Redirect Loops

Although redirects are good for SEO when used sparingly, they can also harm your SEO performance if used excessively. 

A redirect chain occurs whenever more than one redirect sits between the requested URL and the final destination URL.

flowchart of redirect chain with a little robot moving along the redirects

Google does not want to see redirect chains on your website, as they slow down your site and make it take longer for Google to crawl it.

Redirect loops occur when redirects point to URLs that redirect back to an earlier URL in the chain, sending spiders in a circle where they never arrive at a destination page at all.

flowchart of a redirect loop with a robot getting lost along the redirect loop

As a general rule, it should never take more than one redirect to get to a destination page. 

The best way to avoid excessive redirect chains is to use SEO-friendly URLs from the beginning. That means optimizing your URLs from the start and sticking with them after you update the content.

However, if you do need to resolve a redirect chain or loop, take the following steps.

  • For redirect chains: Replace the redirect chain with a single redirect
  • For redirect loops: Fix the final destination url
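To make chains and loops concrete, here is a small illustrative Python sketch (not part of any SEO tool) that walks a map of redirects and classifies each starting URL:

```python
def trace_redirects(redirects, start, max_hops=10):
    """Follow a URL through a redirect map and classify the result.

    redirects: dict mapping source URL -> destination URL.
    Returns (status, path), where status is "ok", "chain", or "loop",
    and path lists every URL visited in order.
    """
    path = [start]
    current = start
    while current in redirects and len(path) <= max_hops:
        current = redirects[current]
        if current in path:  # revisiting a URL means we are stuck in a loop
            return "loop", path + [current]
        path.append(current)
    # More than one hop between the start and the destination is a chain
    status = "chain" if len(path) > 2 else "ok"
    return status, path

redirects = {
    "/old-page": "/newer-page",
    "/newer-page": "/final-page",  # two hops from /old-page: a redirect chain
    "/a": "/b",
    "/b": "/a",                    # /a and /b point at each other: a loop
    "/moved": "/destination",      # a single hop: fine
}

print(trace_redirects(redirects, "/old-page"))  # chain
print(trace_redirects(redirects, "/a"))         # loop
print(trace_redirects(redirects, "/moved"))     # ok
```

A real crawler would issue HTTP requests instead of reading a dict, but the classification logic is the same.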

Redirecting to HTTP instead of HTTPS

It’s important that HTTP pages always redirect to HTTPS protocols. HTTPS provides users with a safer browsing experience and it is a confirmed ranking factor.

screenshot of a compliant HTTPS redirect in the SearchAtlas site auditor

For more info on getting an SSL certificate and redirecting an HTTP site to HTTPs, read our detailed guide on HTTPS.
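On Apache servers, this redirect is commonly implemented with a few lines in the site’s .htaccess file. The following is a minimal sketch assuming mod_rewrite is available; many hosts also offer a one-click HTTPS setting instead:

```apache
RewriteEngine On
# Catch any request that did not arrive over HTTPS...
RewriteCond %{HTTPS} off
# ...and permanently redirect it to the HTTPS version of the same URL
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```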

Internal Links with Redirects

If you have internal links that redirect, they are likely slowing down your website and costing you valuable link equity. 

The site audit report will let you know if this issue is present on any of your pages.

screenshot of internal links with redirects in the searchatlas site auditor

After adding a new version of a page or deleting a page, part of your regular website maintenance needs to be updating any internal links that previously pointed to the old URL so they point to the new destination.

This can take some time, particularly if you have a lot of web pages and are using internal links to elevate your SEO performance.

But it shows Google that you’re an active webmaster that is doing the necessary work to make your website the best place for visitors.

Redirects in XML Sitemaps

There should be no pages in your XML sitemap that redirect to other destination urls. You should be updating your sitemap instead with the new destination so Google crawlers are directed straight to the newest version of the page that you want indexed.

Other Redirect Errors

There are a few other redirect issues that might be flagged in your Site Audit report.

Redirect urls should be lowercase.

screenshot of redirect error in the searchatlas site auditor

And all of the protocol variants (HTTP, HTTPS) should redirect to the same destination URL.

screenshot of compliant redirects message in searchatlas

How to Setup a 301 Redirect

There are many ways to implement a redirect depending on your content management system. Some CMSs, like WordPress, will automatically set up a 301 redirect when you change the URL path of an existing page. 

There are also many plugins that you can add to your WP site that help confirm on-page SEO best practices with redirects.

But to add a redirect manually, you will need to edit your .htaccess file.

An .htaccess file is a powerful website configuration file used by Apache web servers. It is located in the root directory of your website, which may be a folder labeled public_html, www, htdocs, or httpdocs, depending on your hosting provider.

To edit the file, all you need is the old page’s URL path and the new page’s full URL.

  1. Log in to your website’s hosting account and open its file manager (or connect via FTP).
  2. Locate the .htaccess file in your site’s root directory.
  3. Add a redirect rule containing the old page’s path and the new page’s URL.
  4. Save the file and upload it back to your website’s server.
  5. Test the redirect by visiting the old page’s URL. You should be redirected to the new page.
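For a single moved page, the rule itself can be as short as one line. This is a sketch with placeholder paths, assuming an Apache server:

```apache
# Permanently redirect the old path to the new page's full URL
Redirect 301 /old-page https://www.example.com/new-page
```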

Conclusion

If you have a large website and you haven’t been thinking about redirects until recently, there may be quite a bit of technical work you need to do to get your site on track.

Not sure whether you have redirect issues on your website? Run a free site audit.

If you are unable to resolve the redirect issues identified in your Site Audit report on your own, reach out to our technical SEO team. 

We can do the work of optimizing your linking profile so you meet Google’s standards, speed up your website, and create a better experience for your website visitors. 


9 SEO Shopify Tips to Optimize Your E-commerce Website https://linkgraph.io/blog/seo-shopify-tips/ https://linkgraph.io/blog/seo-shopify-tips/#respond Mon, 21 Feb 2022 18:13:54 +0000 https://linkgraph.io/?p=11961 Guest post by Cassandra Highbridge at Unstack.  Many Shopify store owners are not leveraging their full SEO potential. You might think that you don’t need Shopify SEO […]

The post 9 SEO Shopify Tips to Optimize Your E-commerce Website appeared first on LinkGraph.

Guest post by Cassandra Highbridge at Unstack. 

Many Shopify store owners are not leveraging their full SEO potential. You might think that you don’t need Shopify SEO or that it is too difficult to set up, but you could be missing out on a large number of potential customers browsing your online store. If you want to help your e-commerce store show up in Google, here are some SEO Shopify tips to help you get started. 

What is Shopify SEO?

Shopify SEO is the process of making your online store more visible in the search engine results pages (SERPs). This means that when people search for the product that you sell, you want to rank high so that you get more organic traffic and increased sales.

Best SEO Shopify Tips for 2022

In order to optimize your Shopify store effectively, follow these 9 tips.

Tip #1: Categorize Your Products

Category pages allow online shoppers to easily navigate your products. You’ll want to target keywords that your shoppers would be interested in. 

You’ll first want to start with title tags and meta descriptions. As PracticalEcommerce states, “The title tag is the most influential on-page element that sets your page’s keyword theme and, combined with the meta description, influences the search terms the page ranks for.” 

Another element you’ll want to optimize is your heading tags. Heading tags tell the reader what each section of the page contains. 

The Media Captain offers these tips for more effective headlines:

  • Only have a single H1 on the page.
  • H2-H4s should elaborate on the core theme and follow a clear hierarchy of information.
  • Use relevant keywords in your headings.
  • Make your headings relevant and captivating; you want your readers to be intrigued and informed on the content within that section.

Tip #2: Optimize Your Shopify Store’s URL Structure

If you want to rank well, another thing you have to keep in mind is your URL structure. A clear URL makes it easy for visitors and search engines to navigate your site. 

ContentKing states a URL is generally considered good if it’s: 

  • Descriptive and easy to read
  • Brief
  • Consistent
  • Lowercase

This way, users can see the whole URL, including its keywords, in the search results before they click.

You should also try to include keywords in your URLs, so rather than something like this:

example.com/categories/jk13d3

You should have a URL that contains keywords from your product page. So if ID jk13d3 stands for “Levi Jeans” on your site, you could change the URL to:

example.com/categories/levi-jeans 

Another tip is to make sure you separate the words in your URLs with hyphens. Google may not be able to parse your URLs (e.g., example.com/soccershoes) if you don’t separate the words. Keep in mind that Google treats hyphens as word separators (e.g., example.com/soccer-shoes) but does not treat underscores the same way.
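These slug rules (descriptive, lowercase, hyphen-separated) are easy to automate. Here is an illustrative Python helper, not tied to any particular ecommerce platform:

```python
import re

def slugify(name):
    """Convert a product name into a lowercase, hyphen-separated URL slug."""
    slug = name.lower()
    # Replace any run of non-alphanumeric characters with a single hyphen
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    # Trim hyphens left over at either end
    return slug.strip("-")

print(slugify("Levi Jeans"))         # levi-jeans
print(slugify("Soccer Shoes (US)"))  # soccer-shoes-us
```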

Tip #3: Choose Your Keywords Wisely

The keywords that you incorporate into your eCommerce store are incredibly important. That’s why it’s imperative to do research and see what your direct competitors are using. This can be done just by going to their websites and compiling data or using a keyword research tool. Here are just a few tools that might work for you:

  • Google Keyword Planner – get help with keyword research and selecting the right terms.
  • WordStream Keyword Tool – discover and export new keywords and performance data for Google Ads and Bing Ads.
  • SearchAtlas – a popular SEO tool that specializes in keyword research, competitor analysis, and content optimization.

When picking what keywords to target there are a couple of factors you’ll want to look at:

  1. Search Volume: the number of times a keyword is searched within a certain timeframe.
  2. Keyword Difficulty: how hard it will be to rank for a certain keyword.
  3. CPC: cost-per-click, the amount advertisers pay to target the keyword in Google Ads, and a good signal of conversion potential.

Keyword Research in SearchAtlas
Keyword Metrics as seen in the Keyword Researcher from SearchAtlas

Usually, you want to target keywords with high search volume, a high CPC, and low competition. 

Once you know what keywords you’re targeting, start incorporating them into product descriptions, landing pages, and headlines. You’ll want to continue to monitor these keywords and see how they’re performing, and make adjustments when necessary.

Tip #4: Attract Buyers With Your Page Title

Optimizing page titles for SEO in a Shopify Store
Optimizing Page Titles in the Shopify Platform

When searching Google, your page title is the first thing online shoppers see, and it influences whether or not someone clicks through. And what’s the right length for a title tag? As Moz states, “While Google does not specify a recommended length for title tags, most desktop and mobile browsers are able to display the first 50–60 characters of a title tag.”

Here’s a formula for writing a great page title:

Keyword | Additional Keyword | Business Name

Here’s what this looks like when you search “cashmere sweater” on Google:

SERP result for the keyword "cashmere sweater"
Example Page Title in Google

As you can see, these page titles are short and sweet. It’s important not to stuff your titles full of keywords, as this can get you into trouble with search engines.
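To make the formula concrete, here is a small sketch that assembles a title from its three parts and flags it if it exceeds the roughly 60-character display limit Moz mentions. The keywords and business name below are made up for illustration:

```python
def build_title(keyword: str, secondary: str, business: str, max_len: int = 60):
    """Assemble 'Keyword | Additional Keyword | Business Name' and flag overlong titles."""
    title = " | ".join([keyword, secondary, business])
    too_long = len(title) > max_len  # risks truncation in search results
    return title, too_long

title, too_long = build_title("Cashmere Sweater", "Women's Knitwear", "Acme Apparel")
print(title)     # Cashmere Sweater | Women's Knitwear | Acme Apparel
print(too_long)  # False (50 characters, safely under 60)
```

A quick check like this is handy when bulk-editing page titles across a large catalog, where a few overlong titles are easy to miss by eye.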

Tip #5: Develop A Strategic Content Plan

To create a great content strategy you can’t just create blog content, stuff in a bunch of keywords, and call it a day. Instead, you’ll want to start off by thinking about your target audience and what type of content resonates most with them.

List of blog articles in a Shopify store
Example of blog content in a Shopify store

Dive into your analytics and look at things like the age, gender, and location of people who buy your products. Another big factor is how your audience likes to consume your content. Do they use Instagram? Prefer videos or white papers? Short or longer-form content?

Start thinking about what sets you apart from your competition. From there, you can develop original content that aligns with your brand and your audience’s interests. Make sure to do keyword research, SEO competitor analysis, and look at keyword gaps. And finally, get started on writing your unique content! Content can be for:

  • Social Media
  • Email
  • Guides/Ebooks
  • Infographics
  • Videos
  • Blog Posts

Once published on your Shopify website, you’ll want to monitor performance to see what page content your audience is enjoying and interacting with. This will impact your content plan moving forward.

Tip #6: Keyword Density

Keyword density – also called keyword frequency – is the number of times a keyword appears on a webpage relative to the overall word count. In the past, keyword stuffing, or cramming as many keywords as possible into content, was very common. Now Google penalizes the rankings of pages that keyword stuff. 


Want to figure out your keyword density? It’s easy to do! Simply divide the number of times a keyword is used on your page by the total number of words on the page. 

For example, say your keyword appears 30 times in a 2,000-word article:

30 / 2000 = 0.015

Multiply that by 100 to get the percentage: a keyword density of 1.5%.

For keyword density, there’s no perfect amount although “…many SEOs recommend using approximately one keyword for each 200 words of copy.”
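The calculation above is easy to script. Here is a minimal sketch that handles a single-word keyword (the sample sentence is made up; multi-word phrases would need a slightly different counting approach):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Return single-word keyword density as a percentage of total words."""
    # Lowercase and strip trailing punctuation so "warm." counts as "warm"
    words = [w.strip(".,!?;:") for w in text.lower().split()]
    total = len(words)
    count = words.count(keyword.lower())
    return round(count / total * 100, 2) if total else 0.0

sample = "cashmere sweaters are soft and warm. our cashmere is sourced"
print(keyword_density(sample, "cashmere"))  # 20.0 (2 of 10 words)
```

At the "one keyword per 200 words" rule of thumb quoted above, you would be aiming for a density of roughly 0.5%.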

There are also some free tools to help you calculate keyword density if you want to streamline the process:

  • SureOak: offers a free keyword density analysis for any page on your website.
  • SmallSEOTools: analyzes the density of your text just as a search engine would.
  • SEO Content Assistant: recommends a frequency for key topical terms based on your target keyword.

Tip #7: Acquire Trustworthy & Authoritative Backlinks

What’s a backlink? Backlinks are links from one website to another. They’re important because these links tell Google and other search engines that your site has domain authority. 

So when many other websites link to yours, your domain authority with search engines increases, which in turn improves your SEO. 

Not all backlinks hold the same weight, though. You want backlinks from websites that themselves have high domain authority. For example, a backlink from a page on Shopify would be extremely valuable, since Shopify is an authoritative website.

example of a link pointing to another web page and creating a backlink
Image Source

There are a couple of strategies you can use when link building:

  • Guest posting: research blogs or companies that relate to your business and email them to see if they would be interested in a guest post. 
  • Become a source for reporters: HARO (Help a Reporter Out) connects people who need sources, like journalists and publications, with sources. It’s free to sign up, and you can filter by industry. 
  • Write great content: If you write quality content that resonates with your audience others might start linking to it without you even needing to reach out.
  • Infographics: according to Search Engine Journal, an infographic is 30 times more likely to be read than a text article. Companies love including them in blogs and ebooks so if you’re able to create a visually appealing infographic it can go a long way for your e-commerce website.

Tip #8: Optimize Image Alt Text

Today, nearly 38% of Google’s SERPs show images. That’s why it’s important to capitalize on another source of organic traffic – images.

Editing alt text in the Shopify CMS
Optimizing Alt Text in the Shopify CMS

Alt text (also called an alt tag) is the copy that appears if an image fails to load. If someone is using a screen reader, the image will be described using this copy. 

In order to add Alt text to your products Shopify provides these instructions:

  1. From your Shopify admin, go to Products > All products.
  2. Click the name of the product that you want to edit.
  3. From the product details page, click a product media item to see the Preview media page.
  4. Click Add ALT text.
  5. Enter your alt text, and then click Save ALT text.
  6. Click the X to exit the preview page.

Again, you won’t want to keyword stuff your image alt text. Instead, focus on only one or two keywords and describe the product. If a product has text on it, make sure to include that.

Alt Tag in HTML
Example of Alt Text in HTML. Image Source.

A good example is the picture above of a bag of Doritos. In the Alt text they include the product name, flavor, size, and amount of product included. This tells the reader exactly what the image is without going overboard with copy.
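For reference, this is what alt text looks like in raw HTML. The filename and product details below are illustrative, modeled on the Doritos example, not copied from a real page:

```html
<!-- Descriptive, keyword-aware alt text: product name, flavor, and size,
     without stuffing in extra keywords -->
<img src="/images/doritos-nacho-cheese-11oz.jpg"
     alt="Doritos Nacho Cheese flavored tortilla chips, 11 oz bag">
```

Shopify writes this attribute for you when you fill in the ALT text field in the admin, so you never need to touch the HTML directly; the markup is shown here only so you can see where the copy ends up.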

Tip 9: Offer a Great User Experience 

You may have heard the term UX thrown around before – it stands for User Experience. For an eCommerce website, this means your site is easy to navigate, visitors can find the products they’re looking for, and adding items to the cart and checking out is painless.

How do SEO and UX work together? SEO drives traffic to your site, and UX gets that traffic to convert. 

Take these factors into consideration to create a great user experience:

  • Navigation – you want your navigation to be as simple as possible. Visitors should be able to land on any page and know how to reach a different one. A great feature to include is a search function. Providing intuitive, easy-to-use navigation can improve the customer search experience and boost conversion rates.
  • Site Speed – eCommerce consumers say their ideal page load time is two seconds. If your site is too slow, there’s a good chance shoppers will leave for a competitor instead. PageSpeed Insights is a great free tool for checking your site’s speed. 
  • Mobile Friendly – in 2021, 72.9 percent of all retail e-commerce was expected to be generated via m-commerce. That’s why your site needs to work and look great on both desktop and mobile. Make sure your homepage, navigation, images, and copy all display correctly on mobile devices. Go through the process of purchasing a product and ensure it is seamless. Shoppers will immediately switch to a different site if they hit any issues.

Final Thoughts on SEO Shopify Tips

Deploying these tips across your Shopify store can make all the difference in driving real clicks to your website. If you’re not quite ready to take your SEO strategy into your own hands, reach out to an e-commerce SEO expert to get started.

Author Bio: Cassandra Highbridge is the CRM Engagement Marketing Specialist at Unstack. Unstack’s no-code landing page templates and optimized eCommerce page components give merchants control over product pages, collections, and more.

The post 9 SEO Shopify Tips to Optimize Your E-commerce Website appeared first on LinkGraph.
