HTTP Status Codes: The Complete List

HTTP status codes are server responses to client (typically browser) requests. Each status code is found in the server response and consists of a three-digit number, usually accompanied by a short description of the status. The codes and their meanings are defined by the Internet Engineering Task Force (IETF) in the HTTP specifications.

The status codes are how your client and a server communicate with each other. You can view any page’s HTTP status codes for free using Ahrefs’ SEO Toolbar by clicking the toolbar icon.

You can also click and expand this to see the full header response, which helps with troubleshooting many technical issues.
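If you'd rather script this check than use a browser extension, here's a minimal Python sketch using the third-party requests library (the URL is a placeholder):

```python
import requests

def check_status(url: str) -> None:
    # Follow redirects the way a browser would, keeping each hop.
    response = requests.get(url, allow_redirects=True, timeout=10)
    for hop in response.history:
        print(hop.status_code, hop.url)
    print(response.status_code, response.url)
    # Dump the full header response, which helps with troubleshooting.
    for name, value in response.headers.items():
        print(f"{name}: {value}")

check_status("https://example.com/")
```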

There are five ranges for the codes:

  • 1xx – Informational responses.
  • 2xx – Successful responses.
  • 3xx – Redirects.
  • 4xx – Client errors.
  • 5xx – Server errors.

Keep reading to learn what the status codes mean and how Google handles them.

1xxs – Provide some kind of additional information

1xx status codes indicate the server has received the request and the processing will continue.

100 Continue – Everything is OK right now. Keep going.

101 Switching Protocols – The server is switching to a different protocol, as requested by the client via an Upgrade header.

102 Processing – Things are happening but are not done yet.

103 Early Hints – Lets you preload resources, which can help improve Largest Contentful Paint for Core Web Vitals.

2xxs – Show that a request is successful

2xx status codes mean that a client request has been received, understood, and accepted.

200 OK – All good. Everything is successful.

201 Created – Similar to 200, but the measure of success is that a new resource has been created.

202 Accepted – A request has been accepted for processing, but it hasn’t been completed yet. It may not have even started yet.

203 Non-Authoritative Information – The response was modified by an intermediary (such as a proxy) after the origin server sent it.

204 No Content – The request succeeded, but there’s no content in the response body.

205 Reset Content – Resets the document to the original state, e.g., clearing a form.

206 Partial Content – Only some of the content has been sent.

207 Multi-Status – The response body contains multiple status codes for multiple resources (used in WebDAV); these could each be 2xx, 3xx, 4xx, or 5xx.

208 Already Reported – The server tells the client that the same resource was already reported earlier in the response (used in WebDAV).

218 This is fine – Unofficial use by Apache.

226 IM Used – This allows the server to send changes (diffs) of resources to clients.

How Google handles 2xx

Most 2xxs will allow pages to be indexed. However, 204s will be treated as soft 404s and won’t be indexed.

Soft 404s may also be URLs where the server says it is successful (200), but the content of the page says it doesn’t exist. The code should have been a 404, but the server says everything is fine when it isn’t. This can also happen on pages with little or no content.

You can find these soft 404 errors in the Coverage report in Google Search Console.

Soft 404s excluded in GSC's Coverage report
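You can also approximate this check yourself outside of GSC. Below is a rough heuristic sketch: the "not found" phrase list and the 512-byte thin-content threshold are arbitrary assumptions for illustration, not anything Google publishes.

```python
import requests

# Phrases that often appear on "not found" pages served with a 200.
NOT_FOUND_PHRASES = ("page not found", "doesn't exist", "no longer available")

def looks_like_soft_404(url: str) -> bool:
    response = requests.get(url, timeout=10)
    if response.status_code != 200:
        return False  # Real 4xx/5xx responses aren't *soft* 404s.
    body = response.text.lower()
    # A 200 with thin content or "not found" language suggests a soft 404.
    return len(body) < 512 or any(p in body for p in NOT_FOUND_PHRASES)

print(looks_like_soft_404("https://example.com/some-missing-page"))
```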

3xxs – Mostly related to redirects, with a few exceptions

3xx status codes indicate the client needs to take further action before the request can be completed.

300 Multiple Choices – There’s more than one possible response, and you may have to choose one of them.

301 Moved Permanently – The old resource now redirects to the new resource.

302 Found – The old resource now redirects to the new resource temporarily.

302 Moved Temporarily – The old resource now redirects to the new resource temporarily.

303 See Other – This is another redirect that indicates the resource may be found somewhere else.

304 Not Modified – Says the page hasn’t been modified. Typically used for caching.

305 Use Proxy – The requested resource is only available if you use a proxy.

306 Switch Proxy – Your next requests should use the proxy specified. This code is no longer used.

307 Temporary Redirect – Has the same functionality as a 302 redirect, except the request method can’t change (e.g., a POST stays a POST).

307 HSTS Policy – Forces the client to use HTTPS when making requests instead of HTTP.

308 Permanent Redirect – Has the same functionality as a 301 redirect, except the request method can’t change (e.g., a POST stays a POST).

How Google handles 3xx

301s and 302s are canonicalization signals. They pass PageRank and help determine which URL is shown in Google’s index. A 301 consolidates forward to the new URL, and a 302 consolidates backward to the old URL. If a 302 is left in place long enough or if the URL it’s redirected to already exists, a 302 may be treated as a 301 and consolidated forward instead.

302s may also be used for redirecting users to language- or country-specific homepages, but the same logic shouldn’t be used for deeper pages.

303s have an undefined treatment from Google. They may be treated as 301 or 302, depending on how they function.

A 307 has two different cases. In cases where it’s a temporary redirect, it will be treated the same as a 302 and attempt to consolidate backward. When web servers require clients to only use HTTPS connections (HSTS policy), Google won’t see the 307 because it’s cached in the browser. The initial hit (without cache) will have a server response code that’s likely a 301 or a 302. But your browser will show you a 307 for subsequent requests.

308s are treated the same as 301s and consolidate forward.

Google will follow up to 10 hops in a redirect chain. It typically follows five hops in one session and resumes where it left off in the next session. After this, signals may not consolidate to the redirected pages.
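To see whether any of your URLs sit behind a chain like that, a small script can walk the hops itself. A minimal sketch (the 10-hop cap mirrors the limit described above; the URL is a placeholder):

```python
import requests
from urllib.parse import urljoin

def trace_redirects(url: str, max_hops: int = 10):
    """Follow a redirect chain hop by hop (Google follows up to ~10 hops)."""
    chain = []
    for _ in range(max_hops):
        response = requests.get(url, allow_redirects=False, timeout=10)
        chain.append((response.status_code, url))
        if response.status_code not in (301, 302, 303, 307, 308):
            return chain  # Reached a non-redirect response.
        # Location may be relative, so resolve it against the current URL.
        url = urljoin(url, response.headers["Location"])
    chain.append(("too many hops", url))  # Signals may stop consolidating here.
    return chain

for status, hop in trace_redirects("http://example.com/"):
    print(status, hop)
```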

You can find these redirect chains in Ahrefs’ Site Audit or our free Ahrefs Webmaster Tools (AWT).

Redirect chains shown in Ahrefs' Site Audit

4xxs – Errors on the client’s side

4xx status codes mean the client has an error. The error is usually explained in the response.

400 Bad Request – Something with the client request is wrong. It’s possibly malformed, invalid, or too large. And now the server can’t understand the request.

401 Unauthorized – The client hasn’t identified or verified itself when needed.

402 Payment Required – This doesn’t have an official use, and it’s reserved for the future for some kind of digital payment system. Some merchants use this for their own reasons, e.g., Shopify uses this when a store hasn’t paid its fees, and Stripe uses this for potentially fraudulent payments.

403 Forbidden – The client is known but doesn’t have access rights.

404 Not Found – The requested resource isn’t found.

405 Method Not Allowed – The request method used isn’t supported, e.g., a form needs to use POST but uses GET instead.

406 Not Acceptable – The accept header requested by the client can’t be fulfilled by the server.

407 Proxy Authentication Required – The authentication needs to be done via proxy.

408 Request Timeout – The server timed out waiting for the request and decided to close the connection.

409 Conflict – The request conflicts with the state of the server.

410 Gone – Similar to a 404 where the request isn’t found, but this also says it won’t be available again.

411 Length Required – The request doesn’t contain a content-length field when it is required.

412 Precondition Failed – The client puts a condition on the request that the server doesn’t meet.

413 Payload Too Large – The request is larger than what the server allows.

414 URI Too Long – The URI requested is longer than the server allows.

415 Unsupported Media Type – The format requested isn’t supported by the server.

416 Range Not Satisfiable – The client asks for a portion of the file that can’t be supplied by the server, e.g., it asks for a part of the file beyond where the file actually ends.

417 Expectation Failed – The expectation indicated in the “Expect” request header can’t be met by the server.

418 I’m a Teapot – Happens when you try to brew coffee in a teapot. This started as an April Fool’s joke in 1998 but is actually standardized. With everything being smart devices these days, this could potentially be used.

419 Page Expired – Unofficial use by Laravel Framework.

420 Method Failure – Unofficial use by Spring Framework.

420 Enhance Your Calm – Unofficial use by Twitter.

421 Misdirected Request – The server that a request was sent to can’t respond to it.

422 Unprocessable Entity – There are semantic errors in the request.

423 Locked – The requested resource is locked.

424 Failed Dependency – This failure happens because it needs another request that also failed.

425 Too Early – The server is unwilling to risk processing a request that might be replayed (e.g., one sent in TLS early data).

426 Upgrade Required – The server refuses the request until the client uses a newer protocol. What needs to be upgraded is indicated in the “Upgrade” header.

428 Precondition Required – The server requires the request to be conditional.

429 Too Many Requests – This is a form of rate-limiting to protect the server because the client sent too many requests to the server too fast.

430 Request Header Fields Too Large – Unofficial use by Shopify.

431 Request Header Fields Too Large – The server won’t process the request because the header fields are too large.

440 Login Time-out – Unofficial use by IIS.

444 No Response – Unofficial use by nginx.

449 Retry With – Unofficial use by IIS.

450 Blocked by Windows Parental Controls – Unofficial use by Microsoft.

451 Unavailable For Legal Reasons – This is blocked for some kind of legal reason. You’ll see it sometimes with country-level blocks, e.g., blocked news or videos, due to privacy or licensing. You may see it for DMCA takedowns. The code itself is a reference to the novel Fahrenheit 451.

451 Redirect – Unofficial use by IIS.

460 – Unofficial use by AWS Elastic Load Balancer.

463 – Unofficial use by AWS Elastic Load Balancer.

494 Request header too large – Unofficial use by nginx.

495 SSL Certificate Error – Unofficial use by nginx.

496 SSL Certificate Required – Unofficial use by nginx.

497 HTTP Request Sent to HTTPS Port – Unofficial use by nginx.

498 Invalid Token – Unofficial use by Esri.

499 Client Closed Request – Unofficial use by nginx.

499 Token Required – Unofficial use by Esri.

How Google handles 4xx

4xxs will cause pages to drop from the index.

404s and 410s have a similar treatment. Both drop pages from the index, but 410s are slightly faster. In practical applications, they’re roughly the same.

421s can be used to tell Google to opt a site out of crawling over HTTP/2.

429s are a little special because they are generally treated as server errors and will cause Google to slow down crawling. But eventually, Google will drop these pages from the index as well.

You can find 4xx errors in Site Audit or our free Ahrefs Webmaster Tools.

Pie chart showing HTTP status codes distribution

Another thing you may want to check is if any of these 404 pages have links to them. If the links point to a 404 page, they don’t count for your website. More than likely, you just need to 301 redirect each of these pages to a relevant page.

Here’s how to find those opportunities:

  1. Paste your domain into Site Explorer (also accessible for free in AWT)
  2. Go to the Best by links report
  3. Add a “404 not found” HTTP response filter

I usually sort this by “Referring domains.”

404s with links in the Best by links report that you can redirect
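Once you’ve matched each linked 404 to a relevant destination, a short script can turn that mapping into redirect rules. A sketch that emits nginx-style rules; the CSV file name and column names are assumptions:

```python
import csv

# Assumed input: a CSV with "broken_url" and "target_url" columns.
with open("redirects.csv", newline="") as f:
    for row in csv.DictReader(f):
        # One nginx-style permanent (301) redirect per broken URL.
        print(f"rewrite ^{row['broken_url']}$ {row['target_url']} permanent;")
```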

5xxs – Errors on the server’s side

5xx status codes mean the server has an error, and it knows it can’t carry out the request. The response will contain a reason for the error.

500 Internal Server Error – The server encounters some kind of issue and doesn’t have a better or more specific error code.

501 Not Implemented – The request method isn’t supported by the server.

502 Bad Gateway – A server acting as a gateway or proxy received an invalid response from the upstream server it was routing to.

503 Service Unavailable – The server is overloaded or down for maintenance and can’t handle the request right now. It will probably be back up soon.

504 Gateway Timeout – A server acting as a gateway or proxy did not receive a timely response from the upstream server it was routing to.

505 HTTP Version Not Supported – This is exactly what it says: The HTTP protocol version in the request isn’t supported by the server.

506 Variant Also Negotiates – The server has an internal configuration error: transparent content negotiation for the request ends in a circular reference.

507 Insufficient Storage – The server can’t store what it needs to store to complete the request.

508 Loop Detected – The server found an infinite loop when trying to process the request.

509 Bandwidth Limit Exceeded – Unofficial use by Apache and cPanel.

510 Not Extended – More extensions to the request are required before the server will fulfill it.

511 Network Authentication Required – The client needs authentication before the server allows network access.

520 Web Server Returned an Unknown Error – Unofficial use by Cloudflare.

521 Web Server is Down – Unofficial use by Cloudflare.

522 Connection Timed Out – Unofficial use by Cloudflare.

523 Origin is Unreachable – Unofficial use by Cloudflare.

524 A Timeout Occurred – Unofficial use by Cloudflare.

525 SSL Handshake Failed – Unofficial use by Cloudflare.

526 Invalid SSL Certificate – Unofficial use by Cloudflare.

527 Railgun Error – Unofficial use by Cloudflare.

529 Site is overloaded – Unofficial use by Qualys.

530 – Unofficial use by Cloudflare.

530 Site is frozen – Unofficial use by Pantheon.

561 Unauthorized – Unofficial use by AWS Elastic Load Balancer.

598 (Informal convention) Network read timeout error – Unofficial use by some HTTP proxies.

How Google handles 5xx

5xx errors will slow down crawling. Eventually, the pages will be dropped from Google’s index. You can find these in Site Audit or Ahrefs Webmaster Tools, but they may be different from the 5xxs that Google sees. Since these are server errors, they may not always be present.

4xx and 5xx errors in Ahrefs' Site Audit
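Because Google treats 5xxs as temporary at first, a 503 with a Retry-After header is the conventional way to signal planned maintenance without having pages dropped immediately. A minimal sketch using Python's standard library (the port and message are placeholders):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 503 signals a temporary outage; Retry-After hints when to come back.
        self.send_response(503)
        self.send_header("Retry-After", "3600")  # seconds until retry
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Down for maintenance. Please check back soon.")

HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```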

How To Uncover Traffic Declines In Google Search Console And How To Fix Them

Google Search Console is an essential tool that offers critical insights into your website’s performance in Google search results.

Occasionally, you might observe a sudden decline in organic traffic, and it’s crucial to understand the potential causes behind this drop. The data stored within Google Search Console (GSC) can be vital in troubleshooting and understanding what has happened to your website.

Before troubleshooting GSC traffic declines, it’s important to first understand what Google says about assessing traffic graphs in GSC and how it reports on different metrics.

Understanding Google Search Console Metrics

Google’s documentation on debugging Search traffic drops is relatively comprehensive (compared to the guidance given in other areas) and can, for the most part, help prevent any immediate or unnecessary panic should there be a change in data.

Despite this, I often find that Search Console data is misunderstood by both clients and those in their first few years of SEO, still learning the craft.

Image from Google Search Central, May 2024

Even with these definitions, if your clicks and impressions graphs begin to resemble any of the above graph examples, there can be wider meanings.

Search Central description → It could also be a sign that…

• Large drop from an algorithmic update, site-wide security, or spam issue → This could also signal a serious technical issue, such as accidentally deploying a noindex onto a URL or returning the incorrect status code – I’ve seen it before where the URL renders content but returns a 410.

• Seasonality → You will know your seasonality better than anyone, but if this graph looks inverse, it could be a sign that during peak search times, Google is rotating the search engine results pages (SERPs) and choosing not to rank your site highly. This could be because, during peak search periods, there is a slight intent shift in the queries’ dominant interpretation.

• Technical issues across your site, changing interests → This type of graph could also represent seasonality (as either a gradual decline or increase).

• Reporting glitch ¯\_(ツ)_/¯ → This graph can represent intermittent technical issues as well as reporting glitches. Similar to the alternate reasons for “Seasonality” graphs, it could represent a short-term shift in the SERPs and what meets the needs of an adjusted dominant interpretation of a query.
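A quick way to rule out the technical-issue case in the first row is to spot-check both the status code and the robots directives on a sample of affected URLs. A rough Python sketch; the URL list is a placeholder, and the string match is deliberately crude:

```python
import requests

urls = ["https://www.example.com/"]  # placeholder: a sample of affected URLs

for url in urls:
    response = requests.get(url, timeout=10)
    # Crude string check: flags both an X-Robots-Tag header and an on-page
    # robots meta tag (it can false-positive on pages that merely mention
    # "noindex" in their content).
    noindex = ("noindex" in response.headers.get("X-Robots-Tag", "").lower()
               or "noindex" in response.text.lower())
    print(url, response.status_code, "noindex!" if noindex else "ok")
```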

Clicks & Impressions

Google filters Click and Impression data in Google Search Console through a combination of technical methods and policies designed to ensure the accuracy, reliability, and integrity of the reported data.

Reasons for this include:

  • Spam and bot filtering.
  • Duplicate data removal.
  • User privacy/protection.
  • Removing “invalid activities.”
  • Data aggregation and sampling.

One of the main reasons I’ve seen GSC change the numbers shown in the UI and API comes down to data thresholds.

Google may set thresholds for including data in reports to prevent skewed metrics due to very low-frequency queries or impressions. For example, data for queries that result in very few impressions might be excluded from reports to maintain the statistical reliability of the metrics.

Average Position

Google Search Console produces the Average Position metric by calculating the average ranking of a website’s URLs for a specific query or set of queries over a defined period of time.

Each time a URL appears in the search results for a query, its position is recorded. For instance, if a URL appears in the 3rd position for one query and in the 7th position for another query, these positions are logged separately.
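As a worked example of how those separately logged positions roll up (the queries and numbers below are hypothetical):

```python
# Hypothetical impression log for one URL over the period: (query, position)
impressions = [("blue widgets", 3), ("buy blue widgets", 7), ("blue widgets", 4)]

# Average position is the mean of every logged position, across all queries.
average_position = sum(pos for _, pos in impressions) / len(impressions)
print(round(average_position, 1))  # 4.7
```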

As we enter the era of AI Overviews, John Mueller has confirmed via Slack conversations that appearing in a generative snapshot will affect the average position of the query and/or URL in the Search Console UI.

Source: John Mueller via The SEO Community Slack channel

I don’t rely on the average position metric in GSC for rank tracking, but it can be useful in trying to debug whether or not Google is having issues establishing a single dominant page for specific queries.

Understanding how the tool compiles data allows you to better diagnose the reasons why, and correlate data with other events such as Google updates or development deployments.

Google Updates

A Google broad core algorithm update is a significant change to Google’s search algorithm intended to improve the relevance and quality of search results.

These updates do not target specific sites or types of content but alter specific systems that make up the “core” to an extent it is noteworthy for Google to announce that an update is happening.

Google makes updates to the various individual systems all the time, so the lack of a Google announcement does not disqualify a Google update from being the cause of a change in traffic.

For example, the website in the below screenshot saw a decline from the March 2023 core update but then recovered in the November 2023 core update.

GSC: the website saw a decline from the March 2023 core update. Screenshot by author from Google Search Console, May 2024

The following screenshot shows another example of a traffic decline correlating with a Google update, and it also shows that recovery doesn’t always occur with future updates.

Traffic decline correlating with a Google update. Screenshot by author from Google Search Console, May 2024

This site is predominantly informational content supporting a handful of marketing landing pages (a traditional SaaS model) and has seen a steady decline correlating with the September 2023 helpful content update.

How To Fix This

Websites negatively impacted by a broad core update can’t fix specific issues to recover.

Webmasters should focus on providing the best possible content and improving overall site quality.

Recovery, however, may occur when the next broad core update is rolled out, if the site has improved in quality and relevance or Google adjusts specific systems and signal weightings back in favour of your site.

In SEO terminology, we also refer to these traffic changes as an algorithmic penalty, which can take time to recover from.

SERP Layout Updates

Given the launch of AI Overviews, I feel many SEO professionals will conduct this type of analysis in the coming months.

In addition to AI Overviews, Google can choose to include a number of different SERP features, including:

  • Shopping results.
  • Map Packs.
  • X (Twitter) carousels.
  • People Also Ask accordions.
  • Featured snippets.
  • Video thumbnails.

All of these not only distract users from the traditional organic results, but they also cause pixel shifts.

From our testing of SGE/AI Overviews, we see traditional results being pushed down anywhere between 1,000 and 1,500 pixels.

When this happens, you’re not likely to see third-party rank-tracking tools show a decrease, but you will see clicks decline in GSC.

The impact of SERP features on your traffic depends on two things:

  • The type of feature introduced.
  • Whether your users predominantly use mobile or desktop.

Generally, SERP features are more impactful to mobile traffic as they greatly increase scroll depth, and the user screen is much smaller.

You can establish your dominant traffic source by looking at the device breakdown in Google Search Console:

Device by users: clicks and impressions. Image from author’s website, May 2024

You can then compare the two graphs in the UI, or export the data via the API broken down by device.
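If you'd rather pull that breakdown programmatically, here's a minimal sketch using the Search Console API via google-api-python-client. The property URL, date range, and token file are placeholders, and it assumes you've already completed the OAuth flow:

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Placeholder: OAuth credentials previously authorized for Search Console.
creds = Credentials.from_authorized_user_file("token.json")
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",  # placeholder property
    body={
        "startDate": "2024-04-01",
        "endDate": "2024-05-01",
        "dimensions": ["date", "device"],  # clicks/impressions per device per day
    },
).execute()

for row in response.get("rows", []):
    date, device = row["keys"]
    print(date, device, row["clicks"], row["impressions"])
```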

How To Fix This

When Google introduces new SERP features, you can adjust your content and site to become “more eligible” for them.

Some are driven by structured data, and others are determined by Google systems after processing your content.

If Google has introduced a feature that results in more zero-click searches for a particular query, you need to first quantify the traffic loss and then adjust your strategy to become more visible for similar and associated queries that still feature in your target audience’s overall search journey.

Seasonality Traffic Changes

Seasonality in demand refers to predictable fluctuations in consumer interest and purchasing behavior that occur at specific times of the year, influenced by factors such as holidays, weather changes, and cultural events.

Notably, a lot of ecommerce businesses will see peaks in the run-up to Christmas and Thanksgiving, whilst travel companies will see seasonality peaks at different times of the year depending on the destinations and vacation types they cater to.

The below screenshot is typical of a business that has a seasonal peak in the run-up to Christmas.

Seasonal peaks as measured in GSC. Screenshot by author from Google Search Console, May 2024

You will see these trends in the Performance Report section and likely see users and sessions mirrored in other analytics platforms.

During a seasonal peak, Google may choose to alter the SERPs in terms of which websites are ranked and which SERP features appear. This occurs when the increase in search demand also brings with it a change in user intent, thus changing the dominant interpretation of the query.

In the travel sector, the shift is often from a research objective to a commercial objective. Out-of-season searchers are predominantly researching destinations or looking for deals; when it’s time to book, they use the same search queries but with the intent to purchase.

As a result, webpages with a value proposition that caters more to the informational intent are either “demoted” in rankings or swapped out in favor of webpages that (in Google’s eyes) better cater to users in satisfying the commercial intent.

How To Fix This

There is no direct fix for traffic increases and decreases caused by seasonality.

However, you can adjust your overall SEO strategy to accommodate this and work to create visibility for the website outside of peak times by creating content to meet the needs and intent of users who may have a more research and information-gathering intent.

Penalties & Manual Actions

A Google penalty is a punitive action taken against a website by Google, reducing its search rankings or removing it from search results, typically due to violations of Google’s guidelines.

As well as receiving a notification in GSC, you’ll typically see a sharp decrease in traffic, akin to the graph below:

Google traffic decline from a penalty. Screenshot by author from Google Search Console, May 2024

How severe the traffic decline is will indicate whether the penalty is partial or sitewide, and the type of (or reason for) the penalty will determine what recovery efforts are required and how long recovery will take.

Changes In PPC Strategies

A common issue I encounter working with organizations is a disconnect in understanding that, sometimes, altering a PPC campaign can affect organic traffic.

An example of this is branded search. If you start running a paid search campaign on your brand terms, you can often expect to see a decrease in branded organic clicks and CTR. As most organizations have separate vendors for paid and organic, it often isn’t communicated that this will be the case.

The Search results performance report in GSC can help you identify whether or not you have cannibalization between your SEO and PPC. From this report, you can correlate branded and non-branded traffic drops with the changelog from those in command of the PPC campaign.
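One lightweight way to make that comparison is to split an exported query report into branded and non-branded buckets and track each over time. A sketch; the file name, column names, and brand term are assumptions:

```python
import csv
from collections import defaultdict

BRAND_TERMS = ("acme",)  # hypothetical brand name(s)

clicks = defaultdict(int)
with open("queries.csv", newline="") as f:  # assumed GSC query export
    for row in csv.DictReader(f):
        branded = any(term in row["Query"].lower() for term in BRAND_TERMS)
        clicks["branded" if branded else "non-branded"] += int(row["Clicks"])

print(dict(clicks))
```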

How To Fix This

Ensure that all stakeholders understand why there have been changes to organic traffic, and that the traffic (and the user) isn’t lost; it is now simply being attributed to paid.

Understanding if this is the “right decision” or not requires a conversation with those managing the PPC campaigns, and if they are performing and providing a strong ROAS, then the organic traffic loss needs to be acknowledged and accepted.

Recovering Site Traffic

Recovering from Google updates can take time.

Recently, John Mueller has said that sometimes, to recover, you need to wait for another update cycle.

However, this doesn’t mean you should sit back, rely on Google reversing previous signal-weighting changes, and not actively try to improve your website to better align with what Google wants to reward.

It’s critical that you start doing all the right things as soon as possible. The earlier that you identify and begin to solve problems, the earlier that you open up the potential for recovery. The time it takes to recover depends on what caused the drop in the first place, and there might be multiple factors to account for. Building a better website for your audience that provides them with better experiences and better service is always the right thing to do.

Barriers To Audience Buy-In

This is an excerpt from the B2B Lead Generation ebook, which draws on SEJ’s internal expertise in delivering leads across multiple media types.

People are driven by a mix of desires, wants, needs, experiences, and external pressures.

It can take time to get it right and convince a person to become a lead, let alone a paying customer.

Here are some nuances of logic and psychology that could be impacting your ability to connect with audiences and build strong leads.

1. Poor Negotiations & The Endowment Effect

Every potential customer you encounter values their own effort and information. And due to something called the endowment effect, they value that time and data much more than you do.

In contrast, the same psychological effect means you value what you offer in exchange for peoples’ information more than they will.

If the value of what you’re offering fails to match the value of what consumers are giving you in exchange (read: their time and information), the conversions will be weak.

The solution? You can increase the perceived value of what you’re offering, or reduce the value of what the user “pays” for it.

Want an exclusive peek into tactics we use when developing our own lead gen campaigns? Check out our upcoming webinar.

2. Delayed Rewards & Uncertainty

Humans evaluate rewards in multiple dimensions, including the reward amount, the time until the reward is received, and the certainty of the reward.

The more time before a reward occurs, and the less certain its ultimate value, the harder you have to work to get someone to engage.

Offering value upfront – even if you’re presenting something else soon after, like a live event, ebook, or demo – can help entice immediate action as well as convince leads of the long-term value of their investment.

It can even act as a prime for the next step in the lead gen nurturing process, hinting at even more value to come and increasing the effectiveness of the rest of your lead generation strategy.

It’s another reason why inbound content is a critical support for lead generation content. The short-term rewards of highly useful ungated content help prepare audiences for longer-term benefits offered down the line.

3. Abandonment & The Funnel Myth

Every lead generation journey is carefully planned, but if you designed it with a funnel in mind, you could be losing many qualified leads.

That’s because the imagery of a funnel might suggest that all leads engage with your brand or offer in the same way, but this simply isn’t true – particularly for products or services with high values.

Instead, these journeys are more abstract. Leads tend to move back and forth between stages depending on their circumstances. They might change their minds, encounter organizational roadblocks, switch channels, or their needs might suddenly change.

Instead of limiting journeys to audience segments, consider optimizing for paths and situations, too.

Optimizing for specific situations and encounters creates multiple opportunities to capture a lead while they’re in certain mindsets. Every opportunity is a way to engage with varying “costs” for time and data, and align your key performance indicators (KPIs) to match.

Situational journeys also create unique opportunities to learn about the various audience segments, including what they’re most interested in, which offers to grab their attention, and which aspects of your brand, product, or service they’re most concerned about.

4. Under-Pricing

Free trials and discounts can be eye-catching, but they don’t always work to your benefit.

Brands often think consumers will always choose the product with the lowest possible price. That isn’t always the case.

Consumers work within something referred to as the “zone of acceptability,” which is the price range they feel is acceptable for a purchasing decision.

If your brand falls outside that range, you’ll likely get the leads – but they could fail to buy in later. The initial offer might be attractive, but the lower perception of value could work against you when it comes time to try and close the sale.

Several elements play into whether consumers are sensitive to pricing discounts. The overall cost of a purchase matters, for example.

Higher-priced purchases, such as SaaS or real estate, can be extremely sensitive to pricing discounts. They can lead to your audience perceiving the product as lower-value, or make it seem like you’re struggling. A price-quality relationship is easy to see in many places in our lives. If you select the absolute lowest price for an airline ticket, do you expect your journey to be timely and comfortable?

It’s difficult to offer specific advice on these points. To find ideal price points and discounts, you need good feedback systems from both customers and leads – and you need data about how other audiences interact. But there’s value in not being the cheapest option.

Get more tips on how we, here at SEJ, create holistic content campaigns to drive leads in this exclusive webinar.

5. Lead Roles & Information

In every large purchasing decision, there are multiple roles in the process. These include:

  • User: The person who ultimately uses the product or service.
  • Buyer: The person who makes the purchase, but may or may not know anything about the actual product or service being purchased.
  • Decider: The person who determines whether to make the purchase.
  • Influencer: The person who provides opinions and thoughts on the product or service, and influences perceptions of it.
  • Gatekeeper: The person who gathers and holds information about the product or service.

Sometimes, different people play these roles, and other times, one person may hold more than one of these roles. However, the needs of each role must be met at the right time. If you fail to meet their needs, you’ll see your conversions turn cold at a higher rate early in the process.

The only way to avoid this complication is to understand who it is you’re attracting when you capture the lead, and make the right information available at the right time during the conversion process.

6. Understand Why People Don’t Sign Up

Many businesses put significant effort into lead nurturing and understanding the qualities of potential customers who fill out lead forms.

But what about the ones who don’t fill out those forms?

Understanding the values and traits that drive their purchasing decisions is paramount.

Your own proprietary and customer data, like your analytics, client data, and lead interactions, makes an excellent starting place, but don’t make the mistake of basing your decisions solely on the data you have collected about the leads you have.

This information creates a picture based solely on people already interacting with you. It doesn’t include information about the audience you’ve failed to capture so far.

Don’t fall for survivorship bias, which occurs when you only look at data from people who have passed your selection filters.

This is especially critical for lead generation because there are groups of people you don’t want to become leads. But you need to make sure you’re attracting as many ideal leads as possible while filtering out those that are suboptimal. You need information about the people who aren’t converting to ensure your filters are working as intended.

Gather information from the segment of your target audience that uses a competitor’s products, and pair them with psychographic tools and frameworks like “values and lifestyle surveys” (VALS) to gather insights and inform decisions.

In a digital world of tough competition and even more demands on every dollar, your lead generation needs to be precise.

Understanding what drives your target audience before you capture the lead and ensuring every detail is crafted with the final conversion in mind will help you capture more leads and sales, and leave your brand the clear market winner.

Google Answers Question About Toxic Link Sabotage

Google’s Gary Illyes answered a question about how to notify Google that someone is poisoning a backlink profile with “toxic links,” a problem that people have been talking about for at least fifteen years.

Question About Alerting Google To Toxic Links

Gary narrated the question:

“Someone’s asking, how to alert Google of sabotage via toxic links?”

And this is Gary’s answer:

I know what I would do: I’d ignore those links.

Generally Google is really, REALLY good at ignoring links that are irrelevant to the site they’re pointing at. If you feel like it, you can always disavow those “toxic” links, or file a spam report.

Disavow Links If You Feel Like It

Gary linked to Google’s explainer about disavowing links, which explains that the disavow tool is for site owners to tell Google about links they are responsible for in some way, such as paid links or other link schemes.

This is what it advises:

“If you have a manual action against your site for unnatural links to your site, or if you think you’re about to get such a manual action (because of paid links or other link schemes that violate our quality guidelines), you should try to remove the links from the other site to your site. If you can’t remove those links yourself, or get them removed, then you should disavow the URLs of the questionable pages or domains that link to your website.”

Google suggests that a link disavow is only necessary when two conditions are met:

  1. “You have a considerable number of spammy, artificial, or low-quality links pointing to your site,
    AND
  2. The links have caused a manual action, or likely will cause a manual action, on your site.”

Both of the above conditions must be met for a link disavow to be valid.

Origin Of The Phrase Toxic Links

As Google became better at penalizing sites for low quality links and paid links, some in the highly competitive gambling industry started creating low quality links to sabotage their competitors. The practice was called negative SEO.

The phrase “toxic links” was unheard of until after the Penguin link updates of 2012, which required penalized sites to remove all the paid and low-quality links they had created and then disavow the rest. An industry grew around disavowing links, and it was that industry that invented the phrase “toxic links” for use in its marketing.

Confirmation That Google Is Able To Ignore Links

I have shared this anecdote before, and I’ll share it again here. Someone I knew contacted me and said their site had lost rankings because of negative SEO links. I took a look, and the site had a ton of really nasty-looking links. So out of curiosity (and because I knew the site was this person’s main income), I emailed someone at Google’s Mountain View headquarters about it. That person checked and replied that the site didn’t lose rankings because of the links; it lost rankings because of a content issue related to a Panda update.

That was around 2012, and it showed me how good Google was at ignoring links. If Google was that good at ignoring really bad links back then, it’s probably even better now, twelve years later, with its SpamBrain AI spam-prevention system.

Listen to the question and answer at the 8:22 minute mark.
