Five must-haves of a conversion-worthy ecommerce website

30-second summary:

  • Getting traffic to a website can be difficult, so you need to make sure that visitors are as likely to convert as possible once there
  • Quality site search implementation can increase conversion rates by 5-6x, and including elements like CTAs or a system that accounts for spelling mistakes can have a considerable impact
  • When working with an online store, think about category pages like aisles and sub-categories like shelves within those aisles
  • Breadcrumbs can not only enhance the user experience but also improve rankings, as they help search engines understand your site structure and relevance

By many estimates, there are over twelve million ecommerce websites on the internet. That’s a lot of online stores, covering a lot of different niches. Getting traffic to these sites is one of the main struggles for businesses, so it’s important that once someone does land on the website, they have the best possible chance of converting.

At the end of the day, it doesn’t matter how good the rest of your site is: if the commercial pages are poor, you may be throwing leads away.

By ‘commercial pages’, we mean anything that leads to the generation of revenue, like the product, category, and service pages – even the checkout. What may seem like a minor change can have a huge impact on revenue for these pages.

For example, would you have guessed that simply adding a video to a product page would make users 144 percent more likely to add a product to their cart?

In this article, I take a look at five ways ecommerce websites can take their traffic – but most importantly, their conversions – to the next level. We’ll start with the biggest and most underappreciated one.

1. Prioritise your site search

According to Econsultancy, up to 30 percent of ecommerce visitors use the internal site search available to them. This level of engagement signals a higher level of purchasing intent, which needs to be capitalised on. Why?

Because of that increased purchasing intent, these searchers are known to be 5–6x more likely to convert than the average visitor who doesn’t use the site search.

If someone invented a tool that reliably increased conversion rates by 5x, they’d be incredibly wealthy – and the tool would be very expensive. Instead, this functionality is available on pretty much all site builds but lies unutilized in most cases, even though site search optimization has led to conversion rate increases of 43 percent.

So, how can you optimize your search functionality?

First, include a CTA (call to action) in the search bar by default that encourages users to search or, for less experienced users, simply explains what the bar is for. Below are some examples from major online brands:

Image: Boots search bar CTA (Source: Boots)

Image: Depop search bar CTA (Source: Depop)

Image: eBay search bar CTA (Source: eBay)

The first word of each of these both educates the user on what the bar is for and encourages them to use it. These CTAs also give people an insight into what the brand provides beyond just products, whether that’s services for Boots or styles for Depop. The eBay example is also great copywriting, as it supports the brand’s character: you can buy and sell anything you want there; they’re not limited to brands or styles, so you can search for anything.
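
On most ecommerce platforms this kind of CTA is a one-line change, as it usually lives in the search input’s placeholder attribute. A minimal sketch – the wording here is just an example, not copied from any of the brands above:

<input type="search" name="q" placeholder="Search products, brands and advice">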

A great site search should also handle misspellings and formatting variations. For example, a website may have items listed as “red t-shirt”, but a lot of people will simply search “red tshirt”. If your site search doesn’t show the same products for both, you’re likely losing out on sales.
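
As a rough illustration of the idea (not how any particular search product implements it), you can make matching tolerant by normalising both the query and the product names before comparing them. A minimal Python sketch with hypothetical product names:

# Normalise away case, spaces, and hyphens so "red t-shirt", "red tshirt",
# and "Red T Shirt" all match the same products.
def normalise(text: str) -> str:
    return "".join(ch for ch in text.lower() if ch.isalnum())

products = ["Red T-Shirt", "Blue T-Shirt", "Red Trainers"]

def search(query: str) -> list:
    q = normalise(query)
    return [p for p in products if q in normalise(p)]

print(search("red tshirt"))   # ['Red T-Shirt']
print(search("red t-shirt"))  # ['Red T-Shirt']

Dedicated search tools layer fuzzy matching and synonym dictionaries on top of this, but the principle is the same.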

You also want to make sure that generating new searches and applying filters don’t create new, indexable URLs. To test this, run a search on your website and look at the resulting URL – specifically the search string, basically everything in the URL before your search term. Search for that pattern in Google (for example, with the site: operator) and see whether these pages are being indexed and appearing in the search engine results pages.

It may be that every search is being saved as a new page (which we’ve seen many times before), which can lead to huge crawl bloat. Think of search engines like Google as having a really short attention span: you don’t want to distract them with pointless pages like these, so make sure you noindex them.
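
A minimal sketch of the fix, assuming your internal search results are rendered from a template you can edit: add a robots meta tag to that template so those URLs stay out of the index.

<meta name="robots" content="noindex">

Note that this is different from blocking the URLs in robots.txt, which stops crawling but won’t reliably remove pages that are already indexed.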

Options like FACT-Finder, Doofinder, and SLI Systems are flexible choices that work fairly easily out of the box, making them great for smaller businesses with tighter resources. For larger businesses that need more from this functionality, Elasticsearch and Solr are strong open source options but require a lot of work. They can be made totally bespoke, but they may be overwhelming for businesses without the time and resources.

2. Have a Plan B for when a product is out of stock

Most products sold online are finite. Whether you have a lot of stock or a limited amount, almost every product runs the risk of becoming out of stock. This is the nature of an ecommerce business and is often a sign that something is selling well, but you should have a plan for when this happens. 

It’s easy for a potential sale to end the moment a customer sees that ‘out of stock’ message. However, the truly great ecommerce stores know this isn’t the end of the customer’s journey – just because the product they originally wanted isn’t available doesn’t mean they can’t be sold on another.

After all, if you were doing your online grocery shopping and the usual meat feast pizza you buy isn’t available, that probably doesn’t mean you’re just not eating pizza anymore. Instead, you’d likely look for a similar meaty pizza from a different brand. This mindset works for other products, too. 

First, you should consider related products on out-of-stock pages as absolutely essential. Take this example from John Lewis:

Image: John Lewis out-of-stock product page suggesting similar products (Source: John Lewis & Partners)

In this case, the outdoor set is out of stock, but they are straight away suggesting similar products that would scratch the same itch the customer has. They’re also high up the page, which is important. If people see a product they want is out of stock, they may click away very quickly, so having similar products above the fold means you have a good chance of grabbing their attention before they move away. 

As well as including related products, there should also be a channel for communicating with the customer so you can contact them when the product comes back in stock. You can’t just assume that they’ll remember your website and check again in a few weeks’ time. It’s much more likely they’ll find the product on a different website and give that retailer their money instead.

While you can’t stop them from looking elsewhere, a section asking for their email address means you can contact them directly – both to let them know as soon as the product becomes available and for wider marketing purposes. Not only can you draw the customer back to the page for a purchase, but you could also sell them more products over email!

Finally, if a product is out of stock and you don’t ever plan to restock it again, then consider removing it from your sitemap. For example, if you sell a calendar designed for 2018, this may very well be out of stock and very unlikely to come back in stock. With this in mind, deleting it from your sitemap would mean that search engines don’t bother looking at it and can instead focus on pages of yours that you actually want the likes of Google and Bing to be looking at. 
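
For illustration, the sitemap entry you would delete looks something like this (the URL is hypothetical):

<url>
  <loc>https://www.example.com/products/2018-wall-calendar</loc>
</url>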

3. Build a category structure that makes sense

A considered and effective category/sub-category structure is essential for online stores. Not only does this help search engines understand what it is you sell and what your most important pages are, but it also helps the user.

If there were no aisles in a supermarket, customers would be searching blindly for what they need. There’d be no structure and no space for using initiative. Instead, there are frozen aisles, canned aisles, fresh aisles; if you need some frozen french fries or some fresh peppers, you know where to go. Once you’re in that aisle, there are then shelves which can help you get even more specific. There likely wouldn’t be a tomato aisle, but a tomato shelf in the fresh aisle makes sense. 

When working with an online store, think about category pages like aisles and sub-category pages like shelves within those aisles. Shopping online should be as seamless as this. 

Consider what your biggest categories are and ‘zoom in’ progressively to work out what your sub-categories should be. It may be that you don’t have enough products to necessitate a sub-category.

Toby Dean, the Associate Director of SEO at Add People, believes that “As a rule of thumb, if there are more than 25-30 products in a category, you may want to sub-categorise that down to improve relevance, rankings and UX.” 

Just as people rarely click on page nine of Google search results, customers will rarely look at page nine of a category. Sub-categories give them a better guide as to where they can find the products they want. For a clothing store, this might look like:

Clothing > Men > Jumpers > Roll Neck Jumpers

Not having these is the equivalent of a supermarket having all of their food in one humongous aisle. Good luck trying to find what you need in there! 
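
Translated into URLs, that aisle-and-shelf hierarchy might look something like this (hypothetical paths for illustration):

/clothing/
/clothing/men/
/clothing/men/jumpers/
/clothing/men/jumpers/roll-neck-jumpers/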

4. Include breadcrumbs 

Breadcrumbs aren’t on every category or product page, but they should be. They essentially show the user’s journey from the root category page to whatever page they’re on at that point. Using the example above, if you were on a product page for a roll neck jumper, you might see “Clothing > Men > Jumpers > Roll Neck Jumpers” as a breadcrumb near the top of the page.

Each of these should be clickable, giving the user a chance to go as far back as they would like to in their journey. This massively improves navigation on these pages and means that if they end up down the wrong path, they can quickly ‘turn around’ and go back the way they came. This helps increase conversions and lower bounce rates.

Habitat, an online furniture provider, use this to good effect on their pages:

Image: Habitat product listing showing breadcrumb navigation (Source: Habitat)

From a search engine perspective, breadcrumbs also help pass link equity throughout your pages. The more internal links something like Google detects pointing to a page, the more important it will consider that page. With that in mind, including breadcrumbs means you will be linking to many pages at once. Search engines will also quickly develop an understanding of how your website is structured, which should make ranking for relevant terms even easier.
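
You can also make the trail explicit to search engines with schema.org BreadcrumbList markup. A minimal sketch using the roll neck jumper example from earlier – the URLs are hypothetical:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Clothing", "item": "https://www.example.com/clothing/" },
    { "@type": "ListItem", "position": 2, "name": "Men", "item": "https://www.example.com/clothing/men/" },
    { "@type": "ListItem", "position": 3, "name": "Jumpers", "item": "https://www.example.com/clothing/men/jumpers/" },
    { "@type": "ListItem", "position": 4, "name": "Roll Neck Jumpers" }
  ]
}
</script>

This markup is also what makes breadcrumb trails eligible to appear under your listing in search results.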

Everything else

These tips below don’t need a whole section to explain, but could still be key movers for your traffic and conversions. 

  • Include trust points and reviews on product pages

According to a BrightLocal survey, 91% of 18 to 34-year-old consumers trust online reviews as much as personal recommendations. This means that your product pages should include reviews of the item and the rest of your website should include testimonials from customers alongside your ratings on services like TrustPilot or Google. 
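
If you mark those reviews up with schema.org structured data, your star ratings can also become eligible to appear alongside your listing in search results. A minimal sketch with hypothetical product name and values:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Roll Neck Jumper",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
</script>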

  • Use photos and videos to sell to the customer

Shoppers expect more than one photo per product now. They want to see it from different angles and in use, in both a photo and video format ideally. One study found that those shoppers who saw videos on product pages were 144% more likely to add a product to their cart.

  • Add filters and sorts to pages

While some popular ecommerce platforms have this as a basic feature, plenty still don’t. With that in mind, make sure that you can apply filters that are relevant to your products. If a website sells shoes, it may need a size filter. If a website sells food, it may need a vegetarian-friendly filter. Regardless of the niche, all pages should also have the ability to sort by price and ratings. 

  • Write optimized copy for commercial pages

After a recent Google update saw some websites crash in rankings, it became even more apparent that optimized copy is crucial for ecommerce-focused pages. By including keywords and matching the intent of the typical customer, you can draw in organic traffic and help visitors convert while they are there – all while appeasing search engines and assuring them that you’re relevant to the searches your customers are making.

  • Consider brand-focused pages

If you’re getting a lot of brand-focused searches and interest, you may want to create a dedicated page for that brand and connect all the relevant products to it. This will help establish your relevance for these searches, while also collecting all of the products people are interested in into one place.


Matthew Rogers is Head of Campaign Management at Add People, a top Manchester-based digital marketing agency, and has over 14 years of marketing experience. He is also a long-standing member of the ClickZ Collective Advisory Board.


How to Block ChatGPT From Using Your Website Content

There is concern about the lack of an easy way to opt out of having one’s content used to train large language models (LLMs) like ChatGPT. There is a way to do it, but it’s neither straightforward nor guaranteed to work.

How AIs Learn From Your Content

Large Language Models (LLMs) are trained on data that originates from multiple sources. Many of these datasets are open source and are freely used for training AIs.

Some of the sources used are:

  • Wikipedia
  • Government court records
  • Books
  • Emails
  • Crawled websites

There are portals and websites offering datasets that give away vast amounts of information.

One of the portals is hosted by Amazon, offering thousands of datasets at the Registry of Open Data on AWS.

Image: Registry of Open Data on AWS (screenshot from Amazon, January 2023)

The Amazon portal is just one of many portals that host datasets.

Wikipedia lists 28 portals for downloading datasets, including the Google Dataset Search and Hugging Face portals for finding thousands of datasets.

Datasets of Web Content

OpenWebText

A popular dataset of web content is called OpenWebText. OpenWebText consists of URLs found on Reddit posts that had at least three upvotes.

The idea is that these URLs are trustworthy and will contain quality content. I couldn’t find information about a user agent for their crawler; maybe it’s just identified as Python, I’m not sure.

Nevertheless, we do know that if your site is linked from Reddit with at least three upvotes then there’s a good chance that your site is in the OpenWebText dataset.

More information about OpenWebText is here.

Common Crawl

One of the most commonly used datasets for Internet content is offered by a non-profit organization called Common Crawl.

Common Crawl data comes from a bot that crawls the entire Internet.

The data is downloaded by organizations wishing to use the data and then cleaned of spammy sites, etc.

The name of the Common Crawl bot is CCBot.

CCBot obeys the robots.txt protocol, so it is possible to block Common Crawl with robots.txt and prevent your website data from making it into another dataset.

However, if your site has already been crawled then it’s likely already included in multiple datasets.

Nevertheless, by blocking Common Crawl it’s possible to opt out your website content from being included in new datasets sourced from newer Common Crawl data.

The CCBot User-Agent string is:

CCBot/2.0

Add the following to your robots.txt file to block the Common Crawl bot:

User-agent: CCBot
Disallow: /

An additional way to confirm whether a CCBot user agent is legitimate is to check that it crawls from Amazon AWS IP addresses.
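
If you want to automate that check, here is a minimal Python sketch. It assumes you have pulled the requesting IP address from your server logs, and it only checks the IPv4 ranges Amazon publishes at ip-ranges.amazonaws.com – treat it as an illustration rather than a complete verification workflow.

# Check whether an IP claiming to be CCBot falls inside Amazon's published AWS ranges.
import ipaddress
import json
import urllib.request

AWS_RANGES_URL = "https://ip-ranges.amazonaws.com/ip-ranges.json"

def is_aws_ip(ip: str) -> bool:
    with urllib.request.urlopen(AWS_RANGES_URL) as response:
        prefixes = json.load(response)["prefixes"]
    address = ipaddress.ip_address(ip)
    return any(address in ipaddress.ip_network(p["ip_prefix"]) for p in prefixes)

# Example with a documentation-range IP taken from a hypothetical log line:
print(is_aws_ip("203.0.113.10"))  # False, so this "CCBot" would be suspect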

CCBot also obeys the nofollow robots meta tag directives.

Use this in your robots meta tag:

<meta name="robots" content="nofollow">

Blocking AI From Using Your Content

Search engines allow websites to opt out of being crawled. Common Crawl also allows opting out. But there is currently no way to remove one’s website content from existing datasets.

Furthermore, research scientists don’t seem to offer website publishers a way to opt out of being crawled.

The article, Is ChatGPT Use Of Web Content Fair? explores the topic of whether it’s even ethical to use website data without permission or a way to opt out.

Many publishers would appreciate being given more say in the near future over how their content is used, especially by AI products like ChatGPT.

Whether that will happen is unknown at this time.


Google’s Mueller Criticizes Negative SEO & Link Disavow Companies

John Mueller recently made strong statements against SEO companies that provide negative SEO and other agencies that provide link disavow services outside of the tool’s intended purpose, saying that they are “cashing in” on clients who don’t know better.

Many frequently say that Mueller and other Googlers are ambiguous, even on the topic of link disavows.

The fact, however, is that Mueller and other Googlers have consistently recommended against using the link disavow tool.

This may be the first time Mueller actually portrayed SEOs who liberally recommend link disavows in a negative light.

What Led to John Mueller’s Rebuke

The context of Mueller’s comments about negative SEO and link disavow companies started with a tweet by Ryan Jones (@RyanJones).

Ryan tweeted that he was shocked at how many SEOs regularly offer disavowing links.

He tweeted:

“I’m still shocked at how many seos regularly disavow links. Why? Unless you spammed them or have a manual action you’re probably doing more harm than good.”

The reason Ryan is shocked is that Google has consistently recommended the tool only for disavowing paid or spammy links that the sites (or their SEOs) are responsible for.

And yet, here we are, eleven years later, and SEOs are still misusing the tool for removing other kinds of links.

Here’s the background information about that.

Link Disavow Tool

In the mid-2000s, prior to the Penguin Update in April 2012, there was a thriving open market for paid links. The commerce in paid links was staggering.

I knew of one publisher with around fifty websites who received a $30,000 check every month for hosting paid links on his site.

Even though I advised my clients against it, some of them still purchased links because they saw everyone else was buying them and getting away with it.

The Penguin Update caused the link-selling boom to collapse.

Thousands of websites lost rankings.

SEOs and affected websites strained under the burden of having to contact all the sites from which they purchased paid links to ask to have them removed.

So some in the SEO community asked Google for a more convenient way to disavow the links.

Months went by and, after resisting the requests, Google relented and released a disavow tool.

Google cautioned from the very beginning to only use the tool for disavowing links that the site publishers (or their SEOs) are responsible for.

The first paragraph of Google’s October 2012 announcement of the link disavow tool leaves no doubt on when to use the tool:

“Today we’re introducing a tool that enables you to disavow links to your site.

If you’ve been notified of a manual spam action based on ‘unnatural links’ pointing to your site, this tool can help you address the issue.

If you haven’t gotten this notification, this tool generally isn’t something you need to worry about.”

The message couldn’t be clearer.

But at some point, link disavowing became a service applied to random and ‘spammy-looking’ links, which is not what the tool is for.
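
For context, the disavow file itself is just a plain text file uploaded through Google Search Console, with one URL or domain per line and comments starting with “#”. A short illustration with made-up domains:

# Links we are responsible for and could not get removed
http://spammy-example.com/paid-link-page.html
domain:link-network-example.net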

Link Disavow Takes Months To Work

There are many anecdotes about link disavows that helped sites regain rankings.

They aren’t lying; I know credible and honest people who have made this claim.

But here’s the thing: John Mueller has confirmed that the link disavow process takes months to work its way through Google’s algorithm.

Sometimes things happen that are simply unrelated – there’s no correlation; it just looks that way.

John shared how long it takes for a link disavow to work in a Webmaster Hangout:

“With regards to this particular case, where you’re saying you submitted a disavow file and then the ranking dropped or the visibility dropped, especially a few days later, I would assume that that is not related.

So in particular with the disavow file, what happens is we take that file into account when we reprocess the links kind of pointing to your website.

And this is a process that happens incrementally over a period of time where I would expect it would have an effect over the course of… I don’t know… maybe three, four, five, six months …kind of step by step going in that direction.

So if you’re saying that you saw an effect within a couple of days and it was a really strong effect then I would assume that this effect is completely unrelated to the disavow file. …it sounds like you still haven’t figured out what might be causing this.”

John Mueller: Negative SEO and Link Disavow Companies are Making Stuff Up

Context is important to understand what was said.

So here’s the context for John Mueller’s remark.

An SEO responded to Ryan’s tweet about being shocked at how many SEOs regularly disavow links.

The person responding to Ryan tweeted that disavowing links was still important, that agencies provide negative SEO services to take down websites and that link disavow is a way to combat the negative links.

The SEO (SEOGuruJaipur) tweeted:

“Google still gives penalties for backlinks (for example, 14 Dec update, so disavowing links is still important.”

SEOGuruJaipur next began tweeting about negative SEO companies.

Negative SEO companies are those that will build spammy links to a client’s competitor in order to make the competitor’s rankings drop.

SEOGuruJaipur tweeted:

“There are so many agencies that provide services to down competitors; they create backlinks for competitors such as comments, bookmarking, directory, and article submission on low quality sites.”

SEOGuruJaipur continued discussing negative SEO link builders, saying that only high trust sites are immune to the negative SEO links.

He tweeted:

“Agencies know what kind of links hurt the website because they have been doing this for a long time.

It’s only hard to down for very trusted sites. Even some agencies provide a money back guarantee as well.

They will provide you examples as well with proper insights.”

John Mueller tweeted his response to the above tweets:

“That’s all made up & irrelevant.

These agencies (both those creating, and those disavowing) are just making stuff up, and cashing in from those who don’t know better.”

Then someone else joined the discussion:

Mueller tweeted a response:

“Don’t waste your time on it; do things that build up your site instead.”

Unambiguous Statement on Negative SEO and Link Disavow Services

A statement by John Mueller (or anyone) can appear to conflict with prior statements when taken out of context.

That’s why I not only placed his statements into their original context but also laid out the eleven years of history that are part of that discussion.

It’s clear that John Mueller feels that those selling negative SEO services and those providing disavow services outside of the intended use are “making stuff up” and “cashing in” on clients who might not “know better.”


Source Code Leak Shows New Ranking Factors to Consider

January 25, 2023, the day that Yandex—Russia’s search engine—was hacked. 

Its complete source code was leaked online. It might not be the first time we’ve seen hacking happen in this industry, but it is one of the most intriguing, groundbreaking events in years.

But Yandex isn’t Google, so why should we care? Here’s why we do: these two search engines are very similar in how they process technical elements of a website, and this leak just showed us the 1,922 ranking factors Yandex uses in its algorithm. 

Simply put, this information is something that we can use to our advantage to get more traffic from Google.

Yandex vs Google

As I said, a lot of these ranking factors are possibly quite similar to the signals that Google uses for search.

Yandex’s algorithm shows a RankBrain analog: MatrixNet. It also seems that they are using PageRank (almost the same way as Google does), and a lot of their text algorithms are the same. Interestingly, there are also a lot of ex-Googlers working at Yandex.

So, reviewing these factors and understanding how they play into search rankings and traffic will provide some very useful insights into how search engines like Google work. No doubt, this new trove of information will greatly influence the SEO market in the months to come. 

That said, Yandex isn’t Google. The chances of Google having the exact same list of ranking factors are low – and Google may not even give a particular signal the same amount of weight that Yandex does.

Still, it’s information that potentially will be useful for driving traffic, so make sure to take a look at them here (before it’s scrubbed from the internet forever).

An early analysis of ranking factors

Many of their ranking factors are as expected. These include:

  • Many link-related factors (e.g., age, relevancy, etc.).
  • Content relevance, age, and freshness.
  • Host reliability.
  • End-user behavior signals.

Some sites also get preference (such as Wikipedia). FI_VISITS_FROM_WIKI even shows that sites that are referenced by Wikipedia get plus points. 

These are all things that we already know.

But here’s something interesting: there were several factors that I and other SEOs found unusual, such as PageRank being the 17th highest weighted factor in Yandex, and the 19th highest weighted factor being query-document relevance (in other words, how closely they match thematically). There’s also karma for likely spam hosts, based on Whois information.

Other interesting factors are the average domain ranking across queries, percent of organic traffic, and the number of unique visitors.

You can also use this Yandex Search Ranking Factor Explorer, created by Rob Ousbey, to search through the various ranking factors.

The possible negative ranking factors:

Here are my thoughts on Yandex’s factors that I found interesting:

FI_ADV: -0.2509284637 — this factor means having tons of adverts scattered around your page and buying PPC can affect rankings. 

FI_DATER_AGE: -0.2074373667 — this one evaluates content age, and whether your article is more than 10 years old, or if there’s no determinable date. Date metadata is important. 

FI_COMM_LINKS_SEO_HOSTS: -0.1809636391 — this can be a negative factor if you have too much commercial anchor text, particularly if the proportion of such links goes above 50%. Pay attention to anchor text distribution. I’ve written a guide on how to effectively use anchor texts if you need some help on this. 

FI_RANK_ARTROZ — outdated, poorly written text will bring your rankings down. Go through your site and give your content a refresh. FI_WORD_COUNT also shows that the number of words matter, so avoid having low-content pages.

FI_URL_HAS_NO_DIGITS, FI_NUM_SLASHES, FI_FULL_URL_FRACTION — URLs shouldn’t have digits or too many slashes (too much hierarchy), and should, of course, contain your targeted keyword.

FI_NUM_LINKS_FROM_MP — always interlink your main pages (such as your homepage or landing pages) to any other important content you want to rank. Otherwise, it can hurt your content.

FI_HOPS — reduce the crawl depth for any pages that matter to you. No important pages should be more than a few clicks away from your homepage. I recommend keeping it to two clicks, at most. 

FI_IS_UNREACHABLE — likewise, avoid making any important page an orphan page. If it’s unreachable from your homepage, it’s as good as dead in the eyes of the search engine.
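
If you want a quick way to audit click depth and spot orphan pages, here is a minimal Python sketch. It assumes a small site and the third-party requests and beautifulsoup4 packages, and it deliberately skips robots.txt handling, rate limiting, and JavaScript rendering – an illustration, not a production crawler.

# Breadth-first crawl from the homepage to measure click depth for each internal page.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def click_depths(homepage: str, max_pages: int = 500) -> dict:
    domain = urlparse(homepage).netloc
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue and len(depths) < max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue  # skip pages that time out or error
        for link in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            target = urljoin(url, link["href"]).split("#")[0]
            if urlparse(target).netloc == domain and target not in depths:
                depths[target] = depths[url] + 1
                queue.append(target)
    return depths

# Flag anything deeper than two clicks, per the recommendation above.
for page, depth in click_depths("https://www.example.com/").items():
    if depth > 2:
        print(depth, page)

Any known page (for example, one listed in your sitemap) that never shows up in the results is effectively an orphan page.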

The possible positive ranking factors:

FI_IS_COM: +0.2762504972 — .com domains get a boost in rankings.

FI_YABAR_HOST_VISITORS — the more traffic you get, the more ranking power your site has. The strategy of targeting smaller, easier keywords first to build up an audience before targeting harder keywords can help you build traffic.

FI_BEAST_HOST_MEAN_POS — the average position of the host for keywords affects your overall ranking. This factor and the previous one clearly show that being smart with your keyword and content planning matters. If you need help with that, check out these 5 ways to build a solid SEO strategy.

FI_YABAR_HOST_SEARCH_TRAFFIC — this might look bad but shows that having other traffic sources (such as social media, direct search, and PPC) is good for your site. Yandex uses this to determine if a real site is being run, not just some spammy SEO project.

The leak also includes a whole host of CTR-related factors.

Image: CTR-related ranking factors from the Yandex source code

It’s clear that having searchable and interesting titles that drive users to check your content out is something that positively affects your rankings.

Google is rewarding sites that help end a user’s search journey (as we know from the latest mobile search updates and even the Helpful Content update). Do what you can to answer the query early on in your article. The factor “FI_VISITORS_RETURN_MONTH_SHARE“ also shows that it helps to encourage users to return to your site for more information on the topics they’re interested in. Email marketing is a handy tool here.

FI_GOOD_RATIO and FI_MANY_BAD — the percentage of “good” and “bad” backlinks on your site. Getting your backlinks from high-quality websites with traffic is important for your rankings. The factor FI_LINK_AGE also shows that adding a link-building strategy to your SEO as early as possible can help with your rankings.

FI_SOCIAL_URL_IS_VERIFIED — that little blue check has actual benefits now. Links from verified accounts have more weight.

Key Takeaway

Because Yandex and Google are, in theory, so similar to each other, this data leak is something we must pay attention to.

Several of these factors may already be common knowledge amongst SEOs, but having them confirmed by another search engine reinforces how important they are for your strategy.

These initial findings, and understanding what it might mean for your website, can help you identify what to improve, what to scrap, and what to focus on when it comes to your SEO strategy. 
