
A Complete Google Search Console Guide For SEO Pros


Google Search Console provides the data necessary to monitor website performance in search and improve search rankings, information that is available exclusively through Search Console.

This makes it indispensable for online businesses and publishers that are keen to maximize success.

Taking control of your search presence is easier to do when using the free tools and reports.

What Is Google Search Console?

Google Search Console is a free web service hosted by Google that provides a way for publishers and search marketing professionals to monitor their overall site health and performance relative to Google search.

It offers an overview of metrics related to search performance and user experience to help publishers improve their sites and generate more traffic.

Search Console also provides a way for Google to communicate when it discovers security issues (like hacking vulnerabilities) and if the search quality team has imposed a manual action penalty.

Important features:

  • Monitor indexing and crawling.
  • Identify and fix errors.
  • Overview of search performance.
  • Request indexing of updated pages.
  • Review internal and external links.

Using Search Console isn’t necessary to rank better, nor is it a ranking factor.

However, its usefulness makes it indispensable for improving search performance and bringing more traffic to a website.

How To Get Started

The first step to using Search Console is to verify site ownership.

Google provides several different ways to accomplish site verification, depending on whether you’re verifying a website, a domain, a Google site, or a Blogger-hosted site.

Domains registered with Google Domains are automatically verified when added to Search Console.

The majority of users will verify their sites using one of four methods:

  1. HTML file upload.
  2. Meta tag.
  3. Google Analytics tracking code.
  4. Google Tag Manager.

Some site hosting platforms limit what can be uploaded and require a specific way to verify site owners.

But, that’s becoming less of an issue as many hosted site services have an easy-to-follow verification process, which will be covered below.

How To Verify Site Ownership

There are two standard ways to verify site ownership with a regular website, like a standard WordPress site.

  1. HTML file upload.
  2. Meta tag.

When verifying a site using either of these two methods, you’ll be choosing the URL-prefix properties process.

Let’s stop here and acknowledge that the phrase “URL-prefix properties” means absolutely nothing to anyone but the Googler who came up with that phrase.

Don’t let that make you feel like you’re about to enter a labyrinth blindfolded. Verifying a site with Google is easy.

HTML File Upload Method

Step 1: Go to the Search Console and open the Property Selector dropdown that’s visible in the top left-hand corner on any Search Console page.

Screenshot by author, May 2022

Step 2: In the pop-up labeled Select Property Type, enter the URL of the site then click the Continue button.

Screenshot by author, May 2022

Step 3: Select the HTML file upload method and download the HTML file.

Step 4: Upload the HTML file to the root of your website.

Root means https://example.com/. So, if the downloaded file is called verification.html, then the uploaded file should be located at https://example.com/verification.html.
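If you want to double-check where the file should end up, the root placement can be computed with Python’s standard urllib; the file name is the example from above, and the page URL is hypothetical:

```python
from urllib.parse import urljoin

# The verification file must sit at the site root: joining with a leading "/"
# discards any subdirectory from the page URL.
def verification_url(site_url, filename):
    return urljoin(site_url, "/" + filename)

print(verification_url("https://example.com/blog/post", "verification.html"))
# -> https://example.com/verification.html
```

Whatever page of your site you start from, the file belongs directly under the domain.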

Step 5: Finish the verification process by clicking Verify back in the Search Console.

Verification of a standard website with its own domain on website platforms like Wix and Weebly is similar to the above steps, except that you’ll be adding the verification meta tag through the platform’s settings.

Duda has a simple approach that uses a Search Console App that easily verifies the site and gets its users started.

Troubleshooting With GSC

Ranking in search results depends on Google’s ability to crawl and index webpages.

The Search Console URL Inspection Tool warns of issues with crawling and indexing before they become major problems and pages start dropping from the search results.

URL Inspection Tool

The URL inspection tool shows whether a URL is indexed and is eligible to be shown in a search result.

For each submitted URL, a user can:

  • Request indexing for a recently updated webpage.
  • View how Google discovered the webpage (sitemaps and referring internal pages).
  • View the last crawl date for a URL.
  • Check if Google is using a declared canonical URL or is using another one.
  • Check mobile usability status.
  • Check enhancements like breadcrumbs.

Coverage

The coverage section shows Discovery (how Google discovered the URL), Crawl (shows whether Google successfully crawled the URL and if not, provides a reason why), and Enhancements (provides the status of structured data).

The coverage section can be reached from the left-hand menu:

Coverage section (screenshot by author, May 2022)

Coverage Error Reports

While these reports are labeled as errors, it doesn’t necessarily mean that something is wrong. Sometimes it just means that indexing can be improved.

For example, in the following screenshot, Google is showing a 403 Forbidden server response to nearly 6,000 URLs.

The 403 error response means that the server is telling Googlebot that it is forbidden from crawling these URLs.

Coverage report showing 403 server error responses (screenshot by author, May 2022)

The above errors are happening because Googlebot is blocked from crawling the member pages of a web forum.

Every member of the forum has a member page that has a list of their latest posts and other statistics.

The report provides a list of URLs that are generating the error.

Clicking on one of the listed URLs reveals a menu on the right that provides the option to inspect the affected URL.

There’s also a contextual menu to the right of the URL itself in the form of a magnifying glass icon that also provides the option to Inspect URL.

Inspect URL option (screenshot by author, May 2022)

Clicking on the Inspect URL reveals how the page was discovered.

It also shows the following data points:

  • Last crawl.
  • Crawled as.
  • Crawl allowed?
  • Page fetch (if failed, provides the server error code).
  • Indexing allowed?

There is also information about the canonical used by Google:

  • User-declared canonical.
  • Google-selected canonical.

For the forum website in the above example, the important diagnostic information is located in the Discovery section.

This section tells us which pages are showing Googlebot the links to member profiles.

With this information, the publisher can now code a PHP statement that will make the links to the member pages disappear when a search engine bot comes crawling.
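The article mentions a PHP statement; the same idea, a crawler check keyed off the user-agent string, can be sketched in Python instead. The bot names are illustrative, and real verification should also confirm the requester’s IP, since user-agent strings are easily faked:

```python
import re

# Illustrative bot-name pattern; extend it for whichever crawlers you want to detect.
BOT_PATTERN = re.compile(r"googlebot|bingbot|duckduckbot", re.I)

def is_search_bot(user_agent):
    """Return True when the request appears to come from a search engine crawler."""
    return bool(BOT_PATTERN.search(user_agent or ""))
```

The template would then skip rendering the member-profile links whenever `is_search_bot()` returns True.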

Another way to fix the problem is to write a new entry to the robots.txt to stop Google from attempting to crawl these pages.
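A robots.txt entry for that second fix might look like the rule below, which can be sanity-checked with Python’s standard urllib.robotparser before deploying. The /members/ path is an assumption about the forum’s URL structure:

```python
from urllib import robotparser

# Hypothetical rule blocking the forum's member pages.
RULES = """\
User-agent: *
Disallow: /members/
"""

rp = robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/members/jane"))   # False: blocked
print(rp.can_fetch("Googlebot", "https://example.com/forum/topic-1"))  # True: still crawlable
```

Checking the rules locally like this avoids accidentally disallowing pages you want indexed.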

By making this 403 error go away, we free up crawling resources for Googlebot to index the rest of the website.

Google Search Console’s coverage report makes it possible to diagnose Googlebot crawling issues and fix them.

Fixing 404 Errors

The coverage report can also alert a publisher to 404 and 500 series error responses, as well as communicate that everything is just fine.

A 404 server response is called an error only because the request itself was made in error: the browser or crawler asked for a webpage that does not exist.

It doesn’t mean that your site is in error.

If another site (or an internal link) links to a page that doesn’t exist, the coverage report will show a 404 response.

Clicking on one of the affected URLs and selecting the Inspect URL tool will reveal what pages (or sitemaps) are referring to the non-existent page.

From there, you can decide whether the link is broken and needs to be fixed (in the case of an internal link) or should be redirected to the correct page (in the case of an external link from another website).

Or, it could be that the webpage never existed and whoever is linking to that page made a mistake.

If the page doesn’t exist anymore or it never existed at all, then it’s fine to show a 404 response.
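That decision logic can be sketched as a tiny routing rule; the paths and the redirect map below are hypothetical:

```python
# Hypothetical redirect map: fix broken internal links at the source where possible,
# 301-redirect external links to the right page, and let genuinely dead URLs stay 404.
REDIRECTS = {"/old-guide": "/guide"}

def respond(path):
    if path in REDIRECTS:
        return 301, REDIRECTS[path]  # moved: send visitors to the correct page
    return 404, None                 # never existed or intentionally gone

print(respond("/old-guide"))      # (301, '/guide')
print(respond("/never-existed"))  # (404, None)
```

Only URLs that genuinely moved belong in the map; padding it with every dead URL just hides useful 404 signals.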

Taking Advantage Of GSC Features

The Performance Report

The top part of the Search Console Performance Report provides multiple insights on how a site performs in search, including in search features like featured snippets.

There are four search types that can be explored in the Performance Report:

  1. Web.
  2. Image.
  3. Video.
  4. News.

Search Console shows the web search type by default.

Change which search type is displayed by clicking the Search Type button:

Default search type (screenshot by author, May 2022)

A menu pop-up will display allowing you to change which kind of search type to view:

Search Types menu (screenshot by author, May 2022)

A useful feature is the ability to compare the performance of two search types within the graph.

Four metrics are prominently displayed at the top of the Performance Report:

  1. Total Clicks.
  2. Total Impressions.
  3. Average CTR (click-through rate).
  4. Average position.

Top section of the Performance page (screenshot by author, May 2022)

By default, the Total Clicks and Total Impressions metrics are selected.

By clicking within the tabs dedicated to each metric, one can choose to see those metrics displayed on the bar chart.

Impressions

Impressions are the number of times a website appeared in the search results. As long as a user doesn’t have to click a link to see the URL, it counts as an impression.

Additionally, if a URL is ranked at the bottom of the page and the user doesn’t scroll to that section of the search results, it still counts as an impression.

High impressions are great because it means that Google is showing the site in the search results.

But the impressions metric only becomes meaningful alongside the Clicks and Average Position metrics.

Clicks

The clicks metric shows how often users clicked from the search results to the website. A high number of clicks in addition to a high number of impressions is good.

A low number of clicks and a high number of impressions is less good but not bad. It means that the site may need improvements to gain more traffic.

The clicks metric is more meaningful when considered with the Average CTR and Average Position metrics.

Average CTR

The average CTR is a percentage representing how often users clicked from the search results to the website.

A low CTR means that something needs improvement in order to increase visits from the search results.

A higher CTR means the site is performing well.

This metric gains more meaning when considered together with the Average Position metric.

Average Position

Average Position shows where in the search results the website tends to appear.

An average in positions one to 10 is great.

An average position in the twenties (20 – 29) means that the site is appearing on page two or three of the search results. This isn’t too bad. It simply means that the site needs additional work to give it that extra boost into the top 10.

Average positions of 30 or worse could (in general) mean that the site may benefit from significant improvements.

Or, it could be that the site ranks for a large number of keyword phrases that rank low and a few very good keywords that rank exceptionally high.

In either case, it may mean taking a closer look at the content. It may be an indication of a content gap on the website, where the content that ranks for certain keywords isn’t strong enough and may need a dedicated page devoted to that keyword phrase to rank better.

All four metrics (Impressions, Clicks, Average CTR, and Average Position), when viewed together, present a meaningful overview of how the website is performing.
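How the four metrics hang together can be sketched from per-query rows. The numbers are invented and the exact internal aggregation is Google’s, so treat this as an approximation:

```python
# Toy per-query rows: (clicks, impressions, position).
rows = [(120, 2400, 3.2), (10, 5000, 18.7), (0, 900, 42.0)]

total_clicks = sum(c for c, _, _ in rows)
total_impressions = sum(i for _, i, _ in rows)
avg_ctr = 100 * total_clicks / total_impressions

# Weight each query's position by its impressions, mirroring how often it was seen.
avg_position = sum(i * p for _, i, p in rows) / total_impressions

print(total_clicks, total_impressions, round(avg_ctr, 2), round(avg_position, 1))
```

The sketch makes the dependency explicit: CTR and position are derived from the same click and impression counts, which is why the metrics should always be read together.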

The big takeaway about the Performance Report is that it is a starting point for quickly understanding website performance in search.

It’s like a mirror that reflects back how well or poorly the site is doing.

Performance Report Dimensions

Scrolling down to the second part of the Performance page reveals what are called the Dimensions of a website’s performance data.

There are six dimensions:

1. Queries: Shows the top search queries and the number of clicks and impressions associated with each keyword phrase.

2. Pages: Shows the top-performing web pages (plus clicks and impressions).

3. Countries: Top countries (plus clicks and impressions).

4. Devices: Shows the top devices, segmented into mobile, desktop, and tablet.

5. Search Appearance: This shows the different kinds of rich results that the site was displayed in. It also tells if Google displayed the site using Web Light results and video results, plus the associated clicks and impressions data. Web Light results are results that are optimized for very slow devices.

6. Dates: The dates tab organizes the clicks and impressions by date. The clicks and impressions can be sorted in descending or ascending order.

Keywords

Keywords are displayed in the Queries dimension of the Performance Report (as noted above). The Queries report shows the top 1,000 search queries that resulted in traffic.

Of particular interest are the low-performing queries.

Some of those queries show low quantities of traffic because they are rare; this is known as long-tail traffic.

But others come from webpages that could use improvement: perhaps the page needs more internal links, or the keyword phrase deserves a dedicated webpage of its own.

It’s always a good idea to review the low-performing keywords because some of them may be quick wins that, when the issue is addressed, can result in significantly increased traffic.
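One common way to surface those quick wins from a Queries export is to filter for queries that already rank well and earn impressions but few clicks. The thresholds and data below are arbitrary illustrations, not recommended values:

```python
# Hypothetical query export rows: (query, clicks, impressions, position).
data = [
    ("blue widgets", 5, 9000, 8.4),        # ranks well, barely clicked: quick win
    ("widget repair", 40, 800, 2.1),       # already performing
    ("buy widgets online", 2, 150, 35.0),  # long-tail, low position
]

# "Quick win": plenty of impressions, page-one position, CTR under 1%.
quick_wins = [q for q, c, i, p in data if i >= 1000 and p <= 10 and c / i < 0.01]
print(quick_wins)
```

Queries caught by a filter like this often just need a better title or meta description to lift their CTR.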

Links

Search Console offers a list of all links pointing to the website.

However, it’s important to point out that the links report does not represent links that are helping the site rank.

It simply reports all links pointing to the website.

This means that the list includes links that are not helping the site rank. That explains why the report may show links that have a nofollow link attribute on them.

The Links report is accessible from the bottom of the left-hand menu:

Links report (screenshot by author, May 2022)

The Links report has two columns: External Links and Internal Links.

External Links are links from outside the website that point to the website.

Internal Links are links that originate within the website and link to somewhere else within the website.

The External links column has three reports:

  1. Top linked pages.
  2. Top linking sites.
  3. Top linking text.

The Internal Links report lists the Top Linked Pages.

Each report (top linked pages, top linking sites, etc.) has a link to more results that can be clicked to view and expand the report for each type.

For example, the expanded report for Top Linked Pages shows Top Target pages, which are the pages from the site that are linked to the most.

Clicking a URL will change the report to display all the external domains that link to that one page.

The report shows the domain of the external site but not the exact page that links to the site.

Sitemaps

A sitemap is generally an XML file that is a list of URLs that helps search engines discover the webpages and other forms of content on a website.

Sitemaps are especially helpful for large sites, sites that are difficult to crawl, and sites that add new content on a frequent basis.

Crawling and indexing are not guaranteed. Things like page quality, overall site quality, and links can have an impact on whether a site is crawled and pages indexed.

Sitemaps simply make it easy for search engines to discover those pages and that’s all.

Creating a sitemap is easy because most are automatically generated by the CMS, plugins, or the website platform where the site is hosted.
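For a hand-rolled sitemap, the format is just a list of URL entries in an XML wrapper; a minimal sketch using Python’s standard library (URLs are placeholders):

```python
from xml.etree.ElementTree import Element, SubElement, tostring

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal sitemap: one <url><loc> entry per page URL."""
    urlset = Element("urlset", xmlns=SITEMAP_NS)
    for loc in urls:
        SubElement(SubElement(urlset, "url"), "loc").text = loc
    return tostring(urlset, encoding="unicode")

print(build_sitemap(["https://example.com/", "https://example.com/about"]))
```

Optional tags such as <lastmod> can be added per entry, but the <loc> list alone is a valid sitemap.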

Some hosted website platforms generate a sitemap for every site hosted on its service and automatically update the sitemap when the website changes.

Search Console offers a sitemap report and provides a way for publishers to upload a sitemap.

To access this function, click the Sitemaps link in the left-side menu.

The sitemap section will report on any errors with the sitemap.

Search Console can be used to remove a sitemap from the reports. However, it’s important to also remove the sitemap from the website itself; otherwise, Google may remember it and visit it again.

Once submitted and processed, the Coverage report will populate a sitemap section that will help troubleshoot any problems associated with URLs submitted through the sitemaps.

Search Console Page Experience Report

The page experience report offers data related to the user experience on the website relative to site speed.

Search Console displays information on Core Web Vitals and Mobile Usability.

This is a good starting place for getting an overall summary of site speed performance.

Rich Result Status Reports

Search Console offers feedback on rich results through the Performance Report. It’s one of the six dimensions listed below the graph that’s displayed at the top of the page, listed as Search Appearance.

Selecting the Search Appearance tab reveals clicks and impressions data for the different kinds of rich results shown in the search results.

This report communicates how important rich results traffic is to the website and can help pinpoint the reason for specific website traffic trends.

The Search Appearance report can help diagnose issues related to structured data.

For example, a downturn in rich results traffic could be a signal that Google changed structured data requirements and that the structured data needs to be updated.

It’s a starting point for diagnosing a change in rich results traffic patterns.

Search Console Is Good For SEO

In addition to the above benefits of Search Console, publishers and SEOs can also upload link disavow reports, resolve penalties (manual actions), and security events like site hackings, all of which contribute to a better search presence.

It is a valuable service that every web publisher concerned about search visibility should take advantage of.



Featured Image: bunny pixar/Shutterstock





How To Uncover Traffic Declines In Google Search Console And How To Fix Them


Google Search Console is an essential tool that offers critical insights into your website’s performance in Google search results.

Occasionally, you might observe a sudden decline in organic traffic, and it’s crucial to understand the potential causes behind this drop. The data stored within Google Search Console (GSC) can be vital in troubleshooting and understanding what has happened to your website.

Before troubleshooting GSC traffic declines, it’s important to first understand what Google says about assessing traffic graphs in GSC and how it reports on different metrics.

Understanding Google Search Console Metrics

Google’s documentation on debugging Search traffic drops is relatively comprehensive (compared to the guidance given in other areas) and can, for the most part, help prevent any immediate or unnecessary panic should there be a change in data.

Despite this, I often find that Search Console data is misunderstood both by clients and by those in their first few years of learning the craft of SEO.

Image from Google Search Central, May 2024

Even with these definitions, if your clicks and impressions graphs begin to resemble any of the above graph examples, there can be wider meanings.

For each of Google’s Search Central descriptions, the graph could also be a sign of something else:

  • Large drop from an algorithmic update, site-wide security, or spam issue: This could also signal a serious technical issue, such as accidentally deploying a noindex onto a URL or returning the incorrect status code. I’ve seen it before where the URL renders content but returns a 410.

  • Seasonality: You will know your seasonality better than anyone, but if this graph looks inverse, it could be a sign that during peak search times, Google is rotating the search engine results pages (SERPs) and choosing not to rank your site highly. This could be because, during peak search periods, there is a slight intent shift in the queries’ dominant interpretation.

  • Technical issues across your site, changing interests: This type of graph could also represent seasonality (either a gradual decline or a gradual increase).

  • Reporting glitch ¯\_(ツ)_/¯: This graph can represent intermittent technical issues as well as reporting glitches. Similar to the alternate reasons for the seasonality graph, it could represent a short-term shift in the SERPs and what meets the needs of an adjusted dominant interpretation of a query.

Clicks & Impressions

Google filters Click and Impression data in Google Search Console through a combination of technical methods and policies designed to ensure the accuracy, reliability, and integrity of the reported data.

Reasons for this include:

  • Spam and bot filtering.
  • Duplicate data removal.
  • User privacy/protection.
  • Removing “invalid activities.”
  • Data aggregation and sampling.

One of the main reasons I’ve seen GSC change the numbers shown in the UI and API comes down to the setting of thresholds.

Google may set thresholds for including data in reports to prevent skewed metrics due to very low-frequency queries or impressions. For example, data for queries that result in very few impressions might be excluded from reports to maintain the statistical reliability of the metrics.
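The effect of such a threshold can be imitated in a few lines; the cutoff value below is a made-up illustration, not Google’s actual threshold:

```python
# Drop very low-frequency rows before aggregating, loosely imitating why
# reported totals can disagree with raw logs. min_impressions is an arbitrary cutoff.
def filter_reportable(rows, min_impressions=3):
    return [r for r in rows if r["impressions"] >= min_impressions]

rows = [
    {"query": "rare query", "impressions": 1},
    {"query": "common query", "impressions": 120},
]
print(filter_reportable(rows))  # only the common query survives
```

Summing clicks over the filtered rows versus the raw rows shows how small the reported totals can drift from reality.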

Average Position

Google Search Console produces the Average Position metric by calculating the average ranking of a website’s URLs for a specific query or set of queries over a defined period of time.

Each time a URL appears in the search results for a query, its position is recorded. For instance, if a URL appears in the 3rd position for one query and in the 7th position for another query, these positions are logged separately.
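The averaging described above can be shown in miniature using the same two positions from the example:

```python
# Positions recorded for one URL across two separate queries.
positions = [3, 7]

# GSC averages the logged positions over the reporting period.
average_position = sum(positions) / len(positions)
print(average_position)  # 5.0
```

A URL that never actually ranked fifth can therefore still report an average position of 5, which is one reason the metric misleads people.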

As we enter the era of AI Overviews, John Mueller has confirmed via Slack conversations that appearing in a generative snapshot will affect the average position of the query and/or URL in the Search Console UI.

Source: John Mueller via The SEO Community Slack channel

I don’t rely on the average position metric in GSC for rank tracking, but it can be useful in trying to debug whether or not Google is having issues establishing a single dominant page for specific queries.

Understanding how the tool compiles data allows you to better diagnose the reasons as to why, and correlate data with other events such as Google updates or development deployments.

Google Updates

A Google broad core algorithm update is a significant change to Google’s search algorithm intended to improve the relevance and quality of search results.

These updates do not target specific sites or types of content; they alter the systems that make up the “core” to an extent significant enough for Google to announce that an update is happening.

Google makes updates to the various individual systems all the time, so the lack of a Google announcement does not disqualify a Google update from being the cause of a change in traffic.

For example, the website in the below screenshot saw a decline from the March 2023 core update but then recovered in the November 2023 core update.

GSC: the website saw a decline from the March 2023 core update (screenshot by author from Google Search Console, May 2024)

The following screenshot shows another example of a traffic decline correlating with a Google update, and it also shows that recovery doesn’t always occur with future updates.

Traffic decline correlating with a Google update (screenshot by author from Google Search Console, May 2024)

This site is predominantly informational content supporting a handful of marketing landing pages (a traditional SaaS model) and has seen a steady decline correlating with the September 2023 helpful content update.

How To Fix This

Websites negatively impacted by a broad core update can’t fix specific issues to recover.

Webmasters should focus on providing the best possible content and improving overall site quality.

Recovery, however, may occur when the next broad core update rolls out, if the site has improved in quality and relevance or if Google adjusts specific systems and signal weightings back in favor of your site.

In SEO terminology, we also refer to these traffic changes as an algorithmic penalty, which can take time to recover from.

SERP Layout Updates

Given the launch of AI Overviews, I feel many SEO professionals will conduct this type of analysis in the coming months.

In addition to AI Overviews, Google can choose to include a number of different SERP features ranging from:

  • Shopping results.
  • Map Packs.
  • X (Twitter) carousels.
  • People Also Ask accordions.
  • Featured snippets.
  • Video thumbnails.

All of these not only draw attention and users away from the traditional organic results, but they also cause pixel shifts.

From our testing of SGE/AI Overviews, we see traditional results being pushed down anywhere between 1,000 and 1,500 pixels.

When this happens you’re not likely to see third-party rank tracking tools show a decrease, but you will see clicks decline in GSC.

The impact of SERP features on your traffic depends on two things:

  • The type of feature introduced.
  • Whether your users predominantly use mobile or desktop.

Generally, SERP features are more impactful to mobile traffic as they greatly increase scroll depth, and the user screen is much smaller.

You can establish your dominant traffic source by looking at the device breakdown in Google Search Console:

Device breakdown by users: clicks and impressions (image from author’s website, May 2024)

You can then compare the two graphs in the UI, or by exporting data via the API with it broken down by devices.
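A quick way to establish the dominant device is to compute each device’s share of clicks from the exported breakdown; the totals below are invented:

```python
# Hypothetical click totals from the GSC device breakdown.
device_clicks = {"MOBILE": 6400, "DESKTOP": 2100, "TABLET": 150}

total = sum(device_clicks.values())
shares = {device: round(100 * clicks / total, 1) for device, clicks in device_clicks.items()}
print(shares)  # mobile dominates in this made-up example
```

A mobile-heavy split like this one suggests SERP-feature pixel shifts will hit the site’s traffic hardest.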

How To Fix This

When Google introduces new SERP features, you can adjust your content and site to become “more eligible” for them.

Some are driven by structured data, and others are determined by Google systems after processing your content.

If Google has introduced a feature that results in more zero-click searches for a particular query, you need to first quantify the traffic loss and then adjust your strategy to become more visible for similar and associated queries that still feature in your target audience’s overall search journey.

Seasonality Traffic Changes

Seasonality in demand refers to predictable fluctuations in consumer interest and purchasing behavior that occur at specific times of the year, influenced by factors such as holidays, weather changes, and cultural events.

Notably, a lot of ecommerce businesses will see peaks in the run-up to Christmas and Thanksgiving, whilst travel companies will see seasonality peaks at different times of the year depending on the destinations and vacation types they cater to.

The below screenshot is typical of a business that has a seasonal peak in the run-up to Christmas.

Seasonal peaks as measured in GSC (screenshot by author from Google Search Console, May 2024)

You will see these trends in the Performance Report section and likely see users and sessions mirrored in other analytics platforms.

During a seasonal peak, Google may choose to alter the SERPs in terms of which websites are ranked and which SERP features appear. This occurs when the increase in search demand also brings with it a change in user intent, thus changing the dominant interpretation of the query.

In the travel sector, the shift is often from a research objective to a commercial objective. Out-of-season searchers are predominantly researching destinations or looking for deals, and when it is time to book, they’re using the same search queries but looking to book.

As a result, webpages with a value proposition that caters more to the informational intent are either “demoted” in rankings or swapped out in favor of webpages that (in Google’s eyes) better cater to users in satisfying the commercial intent.

How To Fix This

There is no direct fix for traffic increases and decreases caused by seasonality.

However, you can adjust your overall SEO strategy to accommodate this, working to create visibility for the website outside of peak times with content that meets the needs of users in a more research- and information-gathering phase.

Penalties & Manual Actions

A Google penalty is a punitive action taken against a website by Google, reducing its search rankings or removing it from search results, typically due to violations of Google’s guidelines.

As well as receiving a notification in GSC, you’ll typically see a sharp decrease in traffic, akin to the graph below:

Google traffic decline from a penalty (screenshot by author from Google Search Console, May 2024)

Whether the penalty is partial or sitewide will be reflected in how bad the traffic decline is, and the type of (or reason for) the penalty will determine what efforts are required and how long recovery will take.

Changes In PPC Strategies

A common issue I encounter working with organizations is a disconnect in understanding that, sometimes, altering a PPC campaign can affect organic traffic.

An example of this is branded search. If you start running a paid search campaign on your brand terms, you can often expect to see a decrease in branded organic clicks and CTR. As most organizations have separate vendors for paid and organic, it often isn’t communicated that this will be the case.

The Search results performance report in GSC can help you identify whether or not you have cannibalization between your SEO and PPC. From this report, you can correlate branded and non-branded traffic drops with the changelog from those in command of the PPC campaign.
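One simple way to quantify the overlap is to compare branded organic clicks before and after the campaign start date; all numbers below are hypothetical:

```python
# Weekly branded-organic clicks before and after a brand PPC campaign began (made-up data).
before = [900, 950, 920, 940]
after = [700, 680, 690, 710]

pct_change = 100 * (sum(after) - sum(before)) / sum(before)
print(round(pct_change, 1))  # -25.1
```

A drop like this, aligned with the PPC changelog, points to cannibalization rather than an organic ranking problem.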

How To Fix This

Ensure that all stakeholders understand why there have been changes to organic traffic, and that the traffic (and the user) isn’t lost: it is now being attributed to Paid.

Understanding if this is the “right decision” or not requires a conversation with those managing the PPC campaigns, and if they are performing and providing a strong ROAS, then the organic traffic loss needs to be acknowledged and accepted.

Recovering Site Traffic

Recovering from Google updates can take time.

Recently, John Mueller has said that sometimes, to recover, you need to wait for another update cycle.

However, this doesn’t mean you should be passive, relying on Google to reverse previous signal weighting changes; you should actively work to improve your website and better align with what Google wants to reward.

It’s critical that you start doing all the right things as soon as possible. The earlier you identify and begin to solve problems, the earlier you open up the potential for recovery.

The time it takes to recover depends on what caused the drop in the first place, and there might be multiple factors to account for. Building a better website for your audience that provides them with better experiences and better service is always the right thing to do.



Featured Image: Ground Picture/Shutterstock



Barriers To Audience Buy-In


This is an excerpt from the B2B Lead Generation ebook, which draws on SEJ’s internal expertise in delivering leads across multiple media types.

People are driven by a mix of desires, wants, needs, experiences, and external pressures.

It can take time to get it right and convince a person to become a lead, let alone a paying customer.

Here are some nuances of logic and psychology that could be impacting your ability to connect with audiences and build strong leads.

1. Poor Negotiations & The Endowment Effect

Every potential customer you encounter values their own effort and information. And due to something called the endowment effect, they value that time and data much more than you do.

In contrast, the same psychological effect means you value what you offer in exchange for people’s information more than they will.

If the value of what you’re offering fails to match the value of what consumers are giving you in exchange (read: their time and information), the conversions will be weak.

The solution? You can increase the perceived value of the thing you’re offering, or reduce the value of what the user “pays” for the thing you offer.

2. Delayed & Uncertain Rewards

Humans evaluate rewards in multiple dimensions, including the reward amount, the time until the reward is received, and the certainty of the reward.

The more time before a reward occurs, and the less certain its ultimate value, the harder you have to work to get someone to engage.

Offering value upfront – even if you’re presenting something else soon after, like a live event, ebook, or demo – can help entice immediate action as well as convince leads of the long-term value of their investment.

It can even act as a prime for the next step in the lead gen nurturing process, hinting at even more value to come and increasing the effectiveness of the rest of your lead generation strategy.

It’s another reason why inbound content is a critical support for lead generation content. The short-term rewards of highly useful ungated content help prepare audiences for longer-term benefits offered down the line.

3. Abandonment & The Funnel Myth

Every lead generation journey is carefully planned, but if you designed it with a funnel in mind, you could be losing many qualified leads.

That’s because the imagery of a funnel suggests that all leads engage with your brand or offer in the same way, but this simply isn’t true, particularly for high-value products or services.

Instead, these journeys are more abstract. Leads tend to move back and forth between stages depending on their circumstances. They might change their minds, encounter organizational roadblocks, switch channels, or their needs might suddenly change.

Instead of limiting journeys to audience segments, consider optimizing for paths and situations, too.

Optimizing for specific situations and encounters creates multiple opportunities to capture a lead while they’re in certain mindsets. Every opportunity is a way to engage with varying “costs” for time and data, and align your key performance indicators (KPIs) to match.

Situational journeys also create unique opportunities to learn about your various audience segments, including what they’re most interested in, which offers grab their attention, and which aspects of your brand, product, or service they’re most concerned about.

4. Under-Pricing

Free trials and discounts can be eye-catching, but they don’t always work to your benefit.

Brands often think consumers will always choose the product with the lowest possible price. That isn’t always the case.

Consumers work within something referred to as the “zone of acceptability,” which is the price range they feel is acceptable for a purchasing decision.

If your brand falls outside that range, you’ll likely get the leads – but they could fail to buy in later. The initial offer might be attractive, but the lower perception of value could work against you when it comes time to try and close the sale.

Several elements play into whether consumers are sensitive to pricing discounts. The overall cost of a purchase matters, for example.

Higher-priced purchases, such as SaaS or real estate, can be extremely sensitive to pricing discounts. They can lead to your audience perceiving the product as lower-value, or make it seem like you’re struggling. A price-quality relationship is easy to see in many places in our lives. If you select the absolute lowest price for an airline ticket, do you expect your journey to be timely and comfortable?

It’s difficult to offer specific advice on these points. To find ideal price points and discounts, you need good feedback systems from both customers and leads – and you need data about how other audiences interact. But there’s value in not being the cheapest option.


5. Lead Roles & Information

In every large purchasing decision, there are multiple roles in the process. These include:

  • User: The person who ultimately uses the product or service.
  • Buyer: The person who makes the purchase, but may or may not know anything about the actual product or service being purchased.
  • Decider: The person who determines whether to make the purchase.
  • Influencer: The person who provides opinions and thoughts on the product or service, and influences perceptions of it.
  • Gatekeeper: The person who gathers and holds information about the product or service.

Sometimes, different people play these roles, and other times, one person may hold more than one of these roles. However, the needs of each role must be met at the right time. If you fail to meet their needs, you’ll see your conversions turn cold at a higher rate early in the process.

The only way to avoid this complication is to understand who it is you’re attracting when you capture the lead, and make the right information available at the right time during the conversion process.

6. Understand Why People Don’t Sign Up

Many businesses put significant effort into lead nurturing and understanding the qualities of potential customers who fill out lead forms.

But what about the ones who don’t fill out those forms?

Understanding the values and traits that drive their purchasing decisions is paramount.

Your own proprietary and customer data, like your analytics, client data, and lead interactions, makes an excellent starting place, but don’t make the mistake of basing your decisions solely on the data you have collected about the leads you have.

This information creates a picture based solely on people already interacting with you. It doesn’t include information about the audience you’ve failed to capture so far.

Don’t fall for survivorship bias, which occurs when you only look at data from people who have passed your selection filters.

This is especially critical for lead generation because there are groups of people you don’t want to become leads. But you need to make sure you’re attracting as many ideal leads as possible while filtering out those that are suboptimal. You need information about the people who aren’t converting to ensure your filters are working as intended.

Gather information from the segment of your target audience that uses a competitor’s products, and pair them with psychographic tools and frameworks like “values and lifestyle surveys” (VALS) to gather insights and inform decisions.

In a digital world of tough competition and even more demands on every dollar, your lead generation needs to be precise.

Understanding what drives your target audience before you capture the lead and ensuring every detail is crafted with the final conversion in mind will help you capture more leads and sales, and leave your brand the clear market winner.

Featured Image: Pasuwan/Shutterstock


SEO

Google Answers Question About Toxic Link Sabotage


Google’s Gary Illyes answered a question about how to notify Google that someone is poisoning a site’s backlink profile with “toxic links,” a problem that people have been discussing for at least fifteen years.

Question About Alerting Google To Toxic Links

Gary narrated the question:

“Someone’s asking, how to alert Google of sabotage via toxic links?”

And this is Gary’s answer:

“I know what I would do: I’d ignore those links.

Generally Google is really, REALLY good at ignoring links that are irrelevant to the site they’re pointing at. If you feel like it, you can always disavow those ‘toxic’ links, or file a spam report.”

Disavow Links If You Feel Like It

Gary linked to Google’s explainer about disavowing links where it’s explained that the disavow tool is for a site owner to tell Google about links that they are responsible for in some way, like paid links or some other link scheme.

This is what it advises:

“If you have a manual action against your site for unnatural links to your site, or if you think you’re about to get such a manual action (because of paid links or other link schemes that violate our quality guidelines), you should try to remove the links from the other site to your site. If you can’t remove those links yourself, or get them removed, then you should disavow the URLs of the questionable pages or domains that link to your website.”

Google suggests that a link disavow is only necessary when two conditions are met:

  1. “You have a considerable number of spammy, artificial, or low-quality links pointing to your site,
    AND
  2. The links have caused a manual action, or likely will cause a manual action, on your site.”

Both of the above conditions must be met in order for a link disavow to be warranted.
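If both conditions do apply and you decide to file, the disavow tool accepts a plain text file with one URL or domain per line: a `domain:` prefix disavows every link from that domain, and lines beginning with `#` are comments. A minimal sketch, using made-up domains:

```text
# Individual spammy pages pointing at your site
https://spam.example.net/links/page1.html
https://spam.example.net/links/page2.html

# Disavow every link from an entire domain
domain:low-quality-directory.example.org
```

The file is uploaded through Search Console's disavow links tool and applies only to the property you select.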

Origin Of The Phrase Toxic Links

As Google became better at penalizing sites for low quality links and paid links, some in the highly competitive gambling industry started creating low quality links to sabotage their competitors. The practice was called negative SEO.

The phrase “toxic link” was virtually unheard of until after the Penguin link updates of 2012, which required penalized sites to remove all the paid and low-quality links they had created and then disavow the rest. An industry grew up around disavowing links, and it was that industry that coined the phrase “toxic links” for use in its marketing.

Confirmation That Google Is Able To Ignore Links

I have shared this anecdote before and I’ll share it here again. Someone I knew contacted me and said that their site lost rankings from negative SEO links. I took a look and their site had a ton of really nasty looking links. So out of curiosity (and because I knew that the site was this person’s main income), I emailed someone at Google Mountain View headquarters about it. That person checked it and replied that the site didn’t lose rankings because of the links. They lost rankings because of a Panda update related content issue.

That was around 2012, and it showed me how good Google was at ignoring links. If Google was that good at ignoring really bad links back then, it’s probably even better now, twelve years later, with the SpamBrain AI.

Listen to the question and answer at the 8:22 minute mark:

Featured Image by Shutterstock/New Africa
