
10 Essential Priorities For Your Retail SEO Strategy



Ecommerce is expected to reach almost 35% of sales among big-box retailers worldwide by 2023, rising from 23% in 2019, according to Edge Retail Insight.

This growth is expected to continue, with ecommerce taking a nearly 40% share of sales by 2025.

This comes amid strong online growth and stable or declining physical store sales worldwide.

Store-based retail nevertheless continues to account for the majority of sales, and additional research shows consumers prefer a mix of online and offline shopping.

Regardless of how the transaction is carried out, the majority of shoppers use search engines for discovery and comparison shopping.

Image from LSA, February 2022

That means that whether customers are shopping in your store, on your website, or via a social commerce platform, SEO is an area of opportunity retailers cannot afford to miss.

Here are 10 top priorities for your retail SEO strategy.

1. Keyword Research

Keyword research is extremely important for retail brands. Knowing what keywords consumers are searching for and how they are searching is vital to building out your information architecture and content strategy.

It should cover keywords at all stages of the fragmented user journey:

  • Informational.
  • Navigational.
  • Transactional.
  • Intent-based.

There is a plethora of keyword research tools, but always make sure to review your competitors’ keyword strategy, too.

That includes Amazon because of the high purchase intent there. Use Amazon’s keyword tool, as well as tools like Ahrefs.

Once the site is up and running, review paid search data and find keywords that are converting and driving traffic and sales.

Through ongoing optimization, make sure the site ranks on the first page of all the major search engines for those keywords wherever possible.

This will help you augment organic performance initially, and then reallocate your paid budget over time as SEO proves its value.

2. Local Search

Getting found online is key to driving traffic and sales.

It’s a simple truth that the more often you show up for your customers, the more your business thrives and the more people it can serve.

But when it comes to local search, accuracy matters.

So get on a good local search platform and then claim and optimize your listings.

Optimized listings help your retail brand show up at the top of local searches and provide a consistent customer experience to drive acquisition and retention.

Make sure you are taking advantage of Google Business Profiles, a tool that helps businesses manage their presence across Google properties, to share updates with your customers.

Add photos to your GBP listings to improve the customer experience, add attributes so customers know what to expect, display your products and inventory, submit relevant categories, respond to Q&A, and also monitor and respond to reviews.

I can’t tell you how many retail brands still don’t respond to reviews, both good and bad.

To learn more about how to optimize for local search, read the Definitive Guide to Improve Your Local Search Rankings.

3. Structured Data

Structured data can help search engines better understand your content and improve visibility via rich results.

For retail brands, the most important structured data type is product schema.

All your products should be marked up with product schema so Google and other search engines can display richer information about your products and get a better understanding of what your brand sells.
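As a minimal sketch (the product name, URLs, and values below are hypothetical placeholders, not real catalog data), product schema is typically added to the page as JSON-LD:

<!-- Hypothetical example product; replace every value with your own catalog data -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Trailblazer 9.8mm Dry Climbing Rope",
  "image": "https://www.example.com/images/trailblazer-climbing-rope.jpg",
  "description": "A 9.8mm dry-treated single rope for sport and trad climbing.",
  "sku": "ROPE-98-70",
  "brand": { "@type": "Brand", "name": "Example Outfitters" },
  "offers": {
    "@type": "Offer",
    "url": "https://www.example.com/p/trailblazer-climbing-rope",
    "priceCurrency": "USD",
    "price": "189.95",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "132"
  }
}
</script>

Once markup like this is in place, you can validate it with Google’s Rich Results Test before rolling it out across your catalog.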

Another important structured data type for retail stores and local businesses is local business schema, which marks up your address, ratings and reviews, website, geo-coordinates, events, and more.

To learn more, visit How to Use Schema for Local Search.

4. Top Quality Content

Fresh, high-quality content based on intent is very important for retail brands.

That’s in part because 81% of retail shoppers conduct online research before buying.

With so many users doing online research – and over 70% of this research coming from mobile phones – it’s imperative you have content that satisfies their needs.

If I were working with a new retail brand, I would make sure my category pages and product pages are filled with high-quality, unique content.

Additionally, I would make sure I have a blog that helps users solve problems and offers advice, tips, and how-to content that is relevant to the brand.

It’s important to optimize product review pages, as well.

I still come across big retailers that have no content blocks or FAQ content on their category pages and only limited content on their product pages, which is a missed opportunity to rank for upper-funnel and transactional keywords.

For example, on the climbing ropes category page for outdoor retailer REI, there is no content block that describes what a climbing rope is or answers any of the questions surfaced in Google’s People Also Ask (PAA) feature.

Instead, other sites are dominating the featured snippets for content that REI should own.

Image from REI, January 2022: REI climbing ropes category landing page.
Screenshot from search for [climbing ropes], Google, February 2022: People Also Ask results.
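To make that concrete, here is a hedged sketch of a simple category page content block targeting those question-style queries (this is illustrative copy, not REI’s actual markup):

<section class="category-intro">
  <!-- Question-style H2s mirror the queries shoppers ask in People Also Ask -->
  <h2>What is a climbing rope?</h2>
  <p>
    A climbing rope is a dynamic rope designed to stretch and absorb the force of a fall.
    Sport and gym climbers typically use a single rope, while alpine climbers often pair
    thinner half or twin ropes.
  </p>
  <h2>How long should a climbing rope be?</h2>
  <p>
    A 70-meter rope covers most modern sport routes, while 60 meters remains a common
    choice for gyms and shorter crags.
  </p>
</section>

A block like this gives the category page a chance to rank for the informational queries feeding PAA, while the product grid below it still serves transactional intent.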

5. Optimized Images

Humans are very visual. When it comes to retail, you can’t forget about optimizing images for both product and non-product-related keywords.

Shoppers like to see what it is they’re considering purchasing from multiple angles, close up, and even virtually placed in their own environment.

Image by eMarketer, February 2022: digital shopping, younger vs. older millennials.

Always make sure to optimize your image file names, sizes, formats, and alt text so search engines can understand your images and surface them in image search results for relevant keywords.
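As a quick sketch (the file names and dimensions are hypothetical), an optimized product image might be marked up like this, with a descriptive file name, alt text, a modern format fallback, and explicit dimensions:

<picture>
  <!-- Serve a smaller, modern format where the browser supports it -->
  <source srcset="/images/womens-trail-running-shoe-blue.webp" type="image/webp">
  <!-- Descriptive file name and alt text help search engines understand the image -->
  <img
    src="/images/womens-trail-running-shoe-blue.jpg"
    alt="Women's blue trail running shoe, side view"
    width="800"
    height="600"
    loading="lazy">
</picture>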

In addition, platforms like Pinterest and Instagram rely on images and are constantly honing their shopping features, so brands should optimize their images and video assets for those powerhouse discovery channels, as well.

6. Mobile and Core Web Vitals (CWV)

Mobile now accounts for more than half of all ecommerce traffic and has overtaken desktop as the top traffic-driving source.

Since shoppers are searching and buying products using their mobile devices, brands need to ensure their sites are optimized for mobile.

To do so, make sure the site is using easy-to-read text, is user-friendly, and has clear calls to action.

That helps ensure users interact with the main conversion points, i.e., buy products, sign up for rewards, etc.

In 2021, Google updated its algorithm to incorporate page experience as a ranking signal.

You also want to make sure your pages load as quickly as possible – preferably in under three seconds – and are optimized for Core Web Vitals. This can give your pages a boost that could make the difference in highly competitive retail SERPs.
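As a rough sketch (the file paths are placeholders), a few of the most common page experience fixes look like this: a responsive viewport, an early hint for the likely Largest Contentful Paint image, and deferred non-critical scripts.

<head>
  <!-- Responsive layout for mobile devices -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <!-- Preload the hero image so the likely LCP element starts downloading early -->
  <link rel="preload" as="image" href="/images/homepage-hero.webp">
  <!-- Defer non-critical JavaScript so it doesn't block rendering -->
  <script src="/js/analytics.js" defer></script>
</head>

You can then check the impact in PageSpeed Insights or the Core Web Vitals report in Search Console.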

According to a study from cybersecurity firm Radware, 51% of online shoppers in the U.S. claimed if a site was too slow they would not complete a purchase.

7. Backlinks

Backlinks are still an important part of any SEO strategy.

Always monitor your backlink profile to see if you have any links from spammy sites or broken backlinks and make sure your links have a mixture of branded and non-branded anchor text.

Also, remember that having too many exact-match anchor text links can be harmful to your link profile.

In order to earn high-quality links, always make sure you have content that is helpful to end users and solves their problems.

For example, one retailer that does this effectively is The Body Shop.

Since The Body Shop sells foundation, they have a post on How to Apply Foundation.

That attracts links to their site because it helps consumers solve a real problem. It’s educational and people would consider that a helpful share as opposed to an advertisement.

Coupon link building is a great option for retailers, as well.

8. SEO-Friendly Page Templates

When it comes to building page templates for retail brands, it’s important to follow SEO best practices.

Building and designing templates in an SEO-friendly manner ensures search engines can crawl and index your content.

Keep the following in mind as you optimize.

Document Templates

  • Use front-loaded exact-match primary keywords and secondary keywords in your title tags. Be sure to use no more than 65 characters (including spaces).
  • Utilize the SERP Preview Tool to see how the title will appear in the SERP and check for truncation.
  • Maintain uniform branding with a pipe or dash.
  • Provide search engines and searchers with a concise yet captivating description of what the page is about in the meta description.
  • Maintain consistency with brand voice, messaging, and tone.
  • Keep character counts to around 156 to 165 maximum, including spaces. You can utilize the SERP Preview Tool to see how the description will appear in the SERP and check for truncation.
  • Always include a call to action, like “Learn more”, “Find out how…”, or “Browse [offerings]…”. Avoid sounding like an advertisement or too promotional.
  • Use one H1 tag per page with the primary keyword front-loaded. Your H1 should introduce the main topic/theme/title of the page and help provide structure and context.
  • Use keyword-rich H2 tags (there is no limit on the number of H2s per page). Exact-matching long-tail keywords, questions, and voice search queries in the H2s helps target paragraph-type featured snippets in the SERP (see the example markup after this list).
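Putting the document template points above together, here is a hedged sketch for a hypothetical category page (the retailer, copy, and URLs are invented placeholders):

<head>
  <!-- Front-loaded primary keyword, brand separated by a pipe, under 65 characters -->
  <title>Climbing Ropes &amp; Cord | Example Outfitters</title>
  <!-- Concise, on-brand description with a call to action, within the character limit -->
  <meta name="description" content="Shop climbing ropes for sport, trad, and gym climbing. Compare diameters, lengths, and dry treatments. Browse the full range at Example Outfitters.">
</head>
<body>
  <!-- One H1 per page, primary keyword front-loaded -->
  <h1>Climbing Ropes</h1>
  <!-- Keyword-rich H2 phrased as a question to target paragraph-type featured snippets -->
  <h2>What diameter climbing rope do I need?</h2>
</body>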

Body Copy Requirements

  • Build out long-form content of at least 901-1,200 words per page, with exact-match target keywords and internal links to relevant PDPs and category pages wherever it makes sense.
  • Include CTA buttons and high-quality, compressed, optimized images with alt text to improve UX; any text shown on images should also be available as crawlable, indexable HTML text.
  • The topic of the page and body content should align with and serve both informational and commercial search intent (i.e., provide knowledge/article-type content while also making relevant products and shopping easily accessible).
  • Internal links to specific products should be placed strategically to increase the likelihood of conversion and keep the user on the site for as long as possible (see the sketch after this list).
  • Avoid transactional/promotional verbiage or obvious persuasion to gain sales.
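A minimal sketch of how those internal links and CTAs might sit inside the body copy (the URLs, product, and class name are hypothetical):

<p>
  Not sure where to start? Most sport climbers do well with a
  <a href="/p/trailblazer-climbing-rope">9.8mm single rope</a>, while
  <a href="/c/climbing-ropes/half-ropes">half ropes</a> suit alpine and ice routes.
</p>
<!-- A clear call to action that keeps the shopper moving toward relevant category pages -->
<a class="cta-button" href="/c/climbing-ropes">Browse all climbing ropes</a>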

URL Requirements

  • URLs are a minor ranking factor and should be keyword-rich, semantically accurate, and succinct, providing a clear idea of what the page is about.
  • Remove stop words and keep URLs as short as possible to make them look cleaner.
  • Ensure the CMS will create URLs that are all lower case and structured properly.

9. Strong Technical Architecture And Foundation

Perform crawls of your site using Screaming Frog, Botify, DeepCrawl, or whatever crawler you prefer to make sure the site does not have any major technical issues that can harm your search engine rankings.

Always check for things like:

  • Broken links on your site.
  • Missing alt text or metadata.
  • Thin and duplicate content.
  • Your domain resolves to only one version of your site (www or non-www). Other versions should be 301 redirected to the preferred version.
  • Missing HTTPS.
  • That the site does not have a stray noindex and is not blocking pages that should be crawled (see the snippet after this list).
  • Google Analytics and Search Console are set up and verified.
  • All your pages have unique and optimized metadata.
  • Your site has minimal crawl errors, i.e., 404 pages, etc.
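For the indexability items in that list, here is a quick sketch of what to look for in a page’s head (the URL is a placeholder): a stray noindex will keep an otherwise healthy page out of the index, and a canonical tag pointing at the preferred URL helps consolidate duplicate versions.

<head>
  <!-- Pages you want to rank should not carry a noindex directive -->
  <meta name="robots" content="index, follow">
  <!-- Point duplicate versions (http, non-www, parameterized URLs) at the preferred URL -->
  <link rel="canonical" href="https://www.example.com/c/climbing-ropes">
</head>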

10. Measurement

Monitoring your SEO progress is extremely important for measuring how your retail brand is performing over time.

When launching a new brand, you want to make sure you’re ranking for all your brand-related keywords and in the top 30 for non-branded keywords.

As the site starts to age, continue to optimize to make sure you are ranking for high-volume and relevant keywords that are going to drive business value and ROI.

This may take a while for a new site, but it should take less time for an existing site, depending on the state of the site and how competitive your segment of retail is.

In addition, always monitor important KPIs, which can consist of but are not limited to the following:

  • Branded rankings.
  • Non-branded rankings.
  • Golden keyword list, i.e., keywords that you have to own.
  • Time on site.
  • Bounce rate.
  • Conversions.
  • Organic visits.
  • New organic visitors.

Other items to keep a careful watch on include paid search data, which can help build out your content and keyword strategy.

Also, prioritize keywords that perform well on the paid side to maximize efficiencies.

It’s also important to monitor Google Search Console for any manual actions, crawl errors, indexing issues, etc., and to address those issues right away.

Wrapping Up

Optimizing a retail website can help grow your customer base and build trust among an audience looking specifically for the products you sell.

With ecommerce accounting for a large and growing portion of sales, focus on:

  • Building and maintaining an SEO-friendly website that loads quickly.
  • Creating content that satisfies the needs of end users, helps them solve problems, attracts links, is marked up with structured data, and is optimized for local search.

This will drive incremental revenue, traffic, and sales and take valuable search engine real estate away from your retail competitors.



Featured Image: Dilok Klaisataporn/Shutterstock







How to Block ChatGPT From Using Your Website Content


There is concern about the lack of an easy way to opt out of having one’s content used to train large language models (LLMs) like ChatGPT. There is a way to do it, but it’s neither straightforward nor guaranteed to work.

How AIs Learn From Your Content

Large Language Models (LLMs) are trained on data that originates from multiple sources. Many of these datasets are open source and are freely used for training AIs.

Some of the sources used are:

  • Wikipedia
  • Government court records
  • Books
  • Emails
  • Crawled websites

There are entire portals and websites that give away vast amounts of information in the form of datasets.

One of the portals is hosted by Amazon, offering thousands of datasets at the Registry of Open Data on AWS.

Screenshot from Amazon, January 2023

The Amazon portal is just one of many portals that host thousands of datasets.

Wikipedia lists 28 portals for downloading datasets, including the Google Dataset Search and Hugging Face portals, where you can find thousands of datasets.

Datasets of Web Content

OpenWebText

A popular dataset of web content is called OpenWebText. OpenWebText consists of URLs found on Reddit posts that had at least three upvotes.

The idea is that these URLs are trustworthy and will contain quality content. I couldn’t find information about a user agent for their crawler; maybe it’s simply identified as Python, but I’m not sure.

Nevertheless, we do know that if your site is linked from Reddit with at least three upvotes then there’s a good chance that your site is in the OpenWebText dataset.

More information about OpenWebText is here.

Common Crawl

One of the most commonly used datasets for Internet content is offered by a non-profit organization called Common Crawl.

Common Crawl data comes from a bot that crawls the entire Internet.

The data is downloaded by organizations wishing to use the data and then cleaned of spammy sites, etc.

The name of the Common Crawl bot is CCBot.

CCBot obeys the robots.txt protocol, so it is possible to block Common Crawl with robots.txt and prevent your website data from making it into another dataset.

However, if your site has already been crawled then it’s likely already included in multiple datasets.

Nevertheless, by blocking Common Crawl it’s possible to opt out your website content from being included in new datasets sourced from newer Common Crawl data.

The CCBot User-Agent string is:

CCBot/2.0

Add the following to your robots.txt file to block the Common Crawl bot:

User-agent: CCBot
Disallow: /

An additional way to confirm whether a CCBot user agent is legitimate is to check that it crawls from Amazon AWS IP addresses.

CCBot also obeys the nofollow robots meta tag directive.

Use this in your robots meta tag:

<meta name="robots" content="nofollow">

Blocking AI From Using Your Content

Search engines allow websites to opt out of being crawled. Common Crawl also allows opting out. But there is currently no way to remove one’s website content from existing datasets.

Furthermore, research scientists don’t seem to offer website publishers a way to opt out of being crawled.

The article, Is ChatGPT Use Of Web Content Fair? explores the topic of whether it’s even ethical to use website data without permission or a way to opt out.

Many publishers would likely appreciate being given more say, in the near future, over how their content is used, especially by AI products like ChatGPT.

Whether that will happen is unknown at this time.


Featured image by Shutterstock/ViDI Studio






Google’s Mueller Criticizes Negative SEO & Link Disavow Companies


John Mueller recently made strong statements against SEO companies that provide negative SEO and other agencies that provide link disavow services outside of the tool’s intended purpose, saying that they are “cashing in” on clients who don’t know better.

Many frequently say that Mueller and other Googlers are ambiguous, even on the topic of link disavows.

The fact, however, is that Mueller and other Googlers have consistently recommended against using the link disavow tool.

This may be the first time Mueller actually portrayed SEOs who liberally recommend link disavows in a negative light.

What Led to John Mueller’s Rebuke

The context of Mueller’s comments about negative SEO and link disavow companies started with a tweet by Ryan Jones (@RyanJones).

Ryan tweeted that he was shocked at how many SEOs regularly offer disavowing links.

He tweeted:

“I’m still shocked at how many seos regularly disavow links. Why? Unless you spammed them or have a manual action you’re probably doing more harm than good.”

The reason why Ryan is shocked is that Google has consistently recommended the tool only for disavowing paid/spammy links that the sites (or their SEOs) are responsible for.

And yet, here we are, eleven years later, and SEOs are still misusing the tool for removing other kinds of links.

Here’s the background information about that.

Link Disavow Tool

In the mid-2000s, there was a thriving open market for paid links, prior to the Penguin Update in April 2012. The commerce in paid links was staggering.

I knew of one publisher with around fifty websites who received a $30,000 check every month for hosting paid links on his site.

Even though I advised my clients against it, some of them still purchased links because they saw everyone else was buying them and getting away with it.

The Penguin Update caused the link-selling boom to collapse.

Thousands of websites lost rankings.

SEOs and affected websites strained under the burden of having to contact all the sites from which they purchased paid links to ask to have them removed.

So some in the SEO community asked Google for a more convenient way to disavow the links.

Months went by and after resisting the requests, Google relented and released a disavow tool.

Google cautioned from the very beginning to only use the tool for disavowing links that the site publishers (or their SEOs) are responsible for.

The first paragraph of Google’s October 2012 announcement of the link disavow tool leaves no doubt on when to use the tool:

“Today we’re introducing a tool that enables you to disavow links to your site.

If you’ve been notified of a manual spam action based on ‘unnatural links’ pointing to your site, this tool can help you address the issue.

If you haven’t gotten this notification, this tool generally isn’t something you need to worry about.”

The message couldn’t be clearer.

But at some point in time, link disavowing became a service applied to random and “spammy looking” links, which is not what the tool is for.

Link Disavow Takes Months To Work

There are many anecdotes about link disavows that helped sites regain rankings.

They aren’t lying; I know credible and honest people who have made this claim.

But here’s the thing, John Mueller has confirmed that the link disavow process takes months to work its way through Google’s algorithm.

Sometimes things happen that are not related; there’s no correlation, it just looks that way.

John shared how long it takes for a link disavow to work in a Webmaster Hangout:

“With regards to this particular case, where you’re saying you submitted a disavow file and then the ranking dropped or the visibility dropped, especially a few days later, I would assume that that is not related.

So in particular with the disavow file, what happens is we take that file into account when we reprocess the links kind of pointing to your website.

And this is a process that happens incrementally over a period of time where I would expect it would have an effect over the course of… I don’t know… maybe three, four, five, six months …kind of step by step going in that direction.

So if you’re saying that you saw an effect within a couple of days and it was a really strong effect then I would assume that this effect is completely unrelated to the disavow file. …it sounds like you still haven’t figured out what might be causing this.”

John Mueller: Negative SEO and Link Disavow Companies are Making Stuff Up

Context is important to understand what was said.

So here’s the context for John Mueller’s remark.

An SEO responded to Ryan’s tweet about being shocked at how many SEOs regularly disavow links.

The person responding to Ryan tweeted that disavowing links was still important, that agencies provide negative SEO services to take down websites and that link disavow is a way to combat the negative links.

The SEO (SEOGuruJaipur) tweeted:

“Google still gives penalties for backlinks (for example, 14 Dec update, so disavowing links is still important.”

SEOGuruJaipur next began tweeting about negative SEO companies.

Negative SEO companies are those that will build spammy links to a client’s competitor in order to make the competitor’s rankings drop.

SEOGuruJaipur tweeted:

“There are so many agencies that provide services to down competitors; they create backlinks for competitors such as comments, bookmarking, directory, and article submission on low quality sites.”

SEOGuruJaipur continued discussing negative SEO link builders, saying that only high trust sites are immune to the negative SEO links.

He tweeted:

“Agencies know what kind of links hurt the website because they have been doing this for a long time.

It’s only hard to down for very trusted sites. Even some agencies provide a money back guarantee as well.

They will provide you examples as well with proper insights.”

John Mueller tweeted his response to the above tweets:

“That’s all made up & irrelevant.

These agencies (both those creating, and those disavowing) are just making stuff up, and cashing in from those who don’t know better.”

Then someone else joined the discussion:

Mueller tweeted a response:

“Don’t waste your time on it; do things that build up your site instead.”

Unambiguous Statement on Negative SEO and Link Disavow Services

A statement by John Mueller (or anyone) can appear to conflict with prior statements when taken out of context.

That’s why I not only placed his statements into their original context but also the history going back eleven years that is a part of that discussion.

It’s clear that John Mueller feels that those selling negative SEO services and those providing disavow services outside of the intended use are “making stuff up” and “cashing in” on clients who might not “know better.”

Featured image by Shutterstock/Asier Romero






Source Code Leak Shows New Ranking Factors to Consider


On January 25, 2023, Yandex, Russia’s search engine, was hacked.

Its complete source code was leaked online. It might not be the first time we’ve seen hacking happen in this industry, but it is one of the most intriguing, groundbreaking events in years.

But Yandex isn’t Google, so why should we care? Here’s why we do: these two search engines are very similar in how they process technical elements of a website, and this leak just showed us the 1,922 ranking factors Yandex uses in its algorithm. 

Simply put, this information is something that we can use to our advantage to get more traffic from Google.

Yandex vs Google

As I said, a lot of these ranking factors are possibly quite similar to the signals that Google uses for search.

Yandex’s algorithm shows a RankBrain analog: MatrixNet. It also seems that they are using PageRank (almost the same way as Google does), and a lot of their text algorithms are the same. Interestingly, there are also a lot of ex-Googlers working at Yandex.

So, reviewing these factors and understanding how they play into search rankings and traffic will provide some very useful insights into how search engines like Google work. No doubt, this new trove of information will greatly influence the SEO market in the months to come. 

That said, Yandex isn’t Google. The chances of Google having the exact same list of ranking factors are low, and Google may not even give a particular signal the same amount of weight that Yandex does.

Still, it’s information that potentially will be useful for driving traffic, so make sure to take a look at them here (before it’s scrubbed from the internet forever).

An early analysis of ranking factors

Many of their ranking factors are as expected. These include:

  • Many link-related factors (e.g., age, relevancy, etc.).
  • Content relevance, age, and freshness.
  • Host reliability.
  • End-user behavior signals.

Some sites also get preference (such as Wikipedia). FI_VISITS_FROM_WIKI even shows that sites that are referenced by Wikipedia get plus points. 

These are all things that we already know.

But something interesting: there were several factors that I and other SEOs found unusual, such as PageRank being the 17th highest weighted factor in Yandex, and the 19th highest weighted factor being query-document relevance (in other words, how close they match thematically). There’s also karma for likely spam hosts, based on Whois information.

Other interesting factors are the average domain ranking across queries, percent of organic traffic, and the number of unique visitors.

You can also use this Yandex Search Ranking Factor Explorer, created by Rob Ousbey, to search through the various ranking factors.

The possible negative ranking factors:

Here are my thoughts on Yandex’s factors that I found interesting:

FI_ADV: -0.2509284637 — this factor means having tons of adverts scattered around your page and buying PPC can affect rankings. 

FI_DATER_AGE: -0.2074373667 — this one evaluates content age, and whether your article is more than 10 years old, or if there’s no determinable date. Date metadata is important. 

FI_COMM_LINKS_SEO_HOSTS: -0.1809636391 — this can be a negative factor if you have too much commercial anchor text, particularly if the proportion of such links goes above 50%. Pay attention to anchor text distribution. I’ve written a guide on how to effectively use anchor texts if you need some help on this. 

FI_RANK_ARTROZ — outdated, poorly written text will bring your rankings down. Go through your site and give your content a refresh. FI_WORD_COUNT also shows that the number of words matter, so avoid having low-content pages.

FI_URL_HAS_NO_DIGITS, FI_NUM_SLASHES, FI_FULL_URL_FRACTION — URLs shouldn’t have digits or too many slashes (too much hierarchy), and should, of course, contain your targeted keyword.

FI_NUM_LINKS_FROM_MP — always interlink your main pages (such as your homepage or landing pages) to any other important content you want to rank. Otherwise, it can hurt your content.

FI_HOPS — reduce the crawl depth for any pages that matter to you. No important pages should be more than a few clicks away from your homepage. I recommend keeping it to two clicks, at most. 

FI_IS_UNREACHABLE — likewise, avoid making any important page an orphan page. If it’s unreachable from your homepage, it’s as good as dead in the eyes of the search engine.

The possible positive ranking factors:

FI_IS_COM: +0.2762504972 — .com domains get a boost in rankings.

FI_YABAR_HOST_VISITORS — the more traffic you get, the more ranking power your site has. The strategy of targeting smaller, easier keywords first to build up an audience before targeting harder keywords can help you build traffic.

FI_BEAST_HOST_MEAN_POS — the average position of the host for keywords affects your overall ranking. This factor and the previous one clearly show that being smart with your keyword and content planning matters. If you need help with that, check out these 5 ways to build a solid SEO strategy.

FI_YABAR_HOST_SEARCH_TRAFFIC — this might look bad but shows that having other traffic sources (such as social media, direct search, and PPC) is good for your site. Yandex uses this to determine if a real site is being run, not just some spammy SEO project.

The leak also includes a whole host of CTR-related factors.

Screenshot: CTR ranking factors from Yandex.

It’s clear that having searchable and interesting titles that drive users to check your content out is something that positively affects your rankings.

Google is rewarding sites that help end a user’s search journey (as we know from the latest mobile search updates and even the Helpful Content update). Do what you can to answer the query early on in your article. The factor “FI_VISITORS_RETURN_MONTH_SHARE” also shows that it helps to encourage users to return to your site for more information on the topics they’re interested in. Email marketing is a handy tool here.

FI_GOOD_RATIO and FI_MANY_BAD — the percentage of “good” and “bad” backlinks on your site. Getting your backlinks from high-quality websites with traffic is important for your rankings. The factor FI_LINK_AGE also shows that adding a link-building strategy to your SEO as early as possible can help with your rankings.

FI_SOCIAL_URL_IS_VERIFIED — that little blue check has actual benefits now. Links from verified accounts have more weight.

Key Takeaway

Because Yandex and Google are so similar to each other in theory, this data leak is something we must pay attention to.

Several of these factors may already be common knowledge amongst SEOs, but having them confirmed by another search engine reinforces how important they are for your strategy.

These initial findings, and understanding what it might mean for your website, can help you identify what to improve, what to scrap, and what to focus on when it comes to your SEO strategy. 
