How To Use SEO To Target Your Audience Throughout The Funnel

The marketing funnel, also known as the sales funnel or customer journey, is the path potential customers take from initial awareness of a product or service to making a purchase decision.

Without fully defining and understanding that customer journey or marketing funnel, businesses can waste a lot of SEO tactics, investment, and effort.

I’ve seen it happen far too many times, and I talk a lot about the importance of having a defined, objective plan for SEO.

Regardless of how new or established your brand or organization is – or where you are in your efforts – it is always a worthy exercise to go through on a regular basis.

Customer behaviors change, competitors change, and even your own organization’s goals and objectives can shift in ways that reshape the marketing funnel you map your efforts to.

Stages Of The Marketing Funnel

Before I get into targeting your SEO audience through the funnel, let me take a moment to recap how I define the stages of the funnel:

Top Of The Funnel

At the top of the funnel, the main goal is to create awareness for your target audience.

Potential customers become aware of your brand, product, or service through various channels, such as advertising, social media, content marketing, and word-of-mouth referrals.

Keep in mind that this stage should be informational and educational, aimed at addressing the needs of your audience rather than directly promoting products.

Middle Of The Funnel

In the middle of the funnel, individuals have now shown interest in your product or service and are actively seeking more information.

They might do this by engaging with your content in a deeper way, signing up for newsletters, following your social media accounts, or exploring product pages on your website.

In this stage of the funnel, your content should focus on detailed information about your products or services, including benefits, features, and how they address specific customer needs.

Bottom Of The Funnel

In the final stage of the funnel, users are evaluating their options to make informed decisions on whether or not to make a purchase.

They may compare your offerings with those of competitors, reading reviews and seeking reassurance that your offer is the right choice for them.

It’s important that you provide these individuals with content that builds trust and loyalty to your brand, driving them toward converting and leaving a positive, lasting impression.

Aligning Your SEO Efforts To The Funnel

By aligning your SEO strategy with the user’s search intent at each stage of the marketing funnel, you are able to deliver the right content to the right audience at the right time.

Understanding Search Intent

The most commonly accepted search intents are informational, navigational, commercial, and transactional.

SEO is intrinsically tied to understanding user intent and delivering content that speaks to your audience’s needs.

When we have the right content for the wrong moment, or vice versa, we spend a lot of time and effort on lower-than-ideal conversion opportunities.

Below, I’ll detail how search intent can vary by stage of the marketing funnel:

Awareness Stage

At the top of the funnel, search intent is typically informational.

Users are searching for information or answers to questions in the form of educational or insightful content. Searches often include who, what, when, where, why, and how.

To target top-of-the-funnel users, I recommend that you:

  • Identify relevant keywords and phrases that your target audience might use when searching for information related to your industry, products, or services. At this stage, focus on broader, informational keywords.
  • Develop high-quality, informative, and engaging content that addresses the key components, questions, and interests of your target audience. This can include blog posts, infographics, videos, and social media content.
  • Target keywords that often trigger featured snippets in search results. These can help you gain more visibility and establish authority.

Consideration Stage

As your audience enters the consideration phase, their intent shifts towards commercial and transactional. They intend to make a decision on a product or service at some point and are evaluating options.

In order to reach mid-funnel searchers, you should:

  • Start targeting more specific and long-tail keywords that indicate user intent to learn more or make a decision. These keywords often include terms like “best,” “reviews,” “comparison,” or “how to choose.”
  • Develop comprehensive guides, product reviews, and comparison articles that provide valuable insights and help users make informed decisions.
  • Create location-specific content that targets geo-modified keywords to drive people to your physical location, if applicable. Local SEO can help send searchers to your location over a nearby competitor’s.

Conversion Stage

Finally, we reach the conversion stage, where users have navigational and transactional intent.

They know what they want and may even be searching for your specific brand or product.

Pay attention to navigational and transactional keywords that often include brand names, product names, or specific action-related phrases like “buy,” “order,” “sign up,” or “contact.”
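
To make this intent mapping concrete, below is a minimal Python sketch that buckets a keyword list by funnel stage using the modifier terms discussed above. The modifier lists and sample keywords are illustrative assumptions, not a complete taxonomy:

```python
# Rough funnel-stage bucketing based on the intent modifiers discussed in this
# article. The modifier lists are illustrative, not exhaustive; tune them for
# your own niche.
INTENT_MODIFIERS = {
    "awareness": ["who", "what", "when", "where", "why", "how"],
    "consideration": ["best", "review", "reviews", "comparison", "how to choose"],
    "conversion": ["buy", "order", "pricing", "discount", "sign up", "contact"],
}

def classify_funnel_stage(keyword: str) -> str:
    """Return the deepest funnel stage whose modifiers appear in the keyword."""
    text = keyword.lower()
    words = text.split()
    # Check bottom-of-funnel modifiers first: a query like "how to buy a crm"
    # signals purchase intent despite the informational "how".
    for stage in ("conversion", "consideration", "awareness"):
        for mod in INTENT_MODIFIERS[stage]:
            # Multi-word modifiers match as substrings, single words as tokens.
            if (" " in mod and mod in text) or mod in words:
                return stage
    return "unclassified"

for kw in ["what is crm software", "best crm software comparison",
           "acme crm pricing", "acme crm alternatives"]:
    print(f"{kw!r} -> {classify_funnel_stage(kw)}")
```

Keywords that fall through as “unclassified” are good candidates for manual review, since intent isn’t always visible in the query string itself.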

In order to reach bottom-funnel users, you need to:

  • Optimize product pages and service descriptions to target keywords that signal user intent to purchase, such as “buy,” “order,” “discount,” or “pricing.”
  • Use schema markup to provide rich product information in search results, making it easier for users to compare and decide.
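
To illustrate the schema markup point above, here is a small sketch that builds a schema.org Product snippet as JSON-LD. The product details are invented for illustration only; always validate your own markup with Google’s Rich Results Test:

```python
import json

# Hypothetical product data, for illustration only.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Trail Running Shoe",
    "description": "Lightweight trail running shoe with a grippy outsole.",
    "sku": "EX-123",
    "offers": {
        "@type": "Offer",
        "price": "89.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
        "url": "https://www.example.com/products/ex-123",
    },
}

# The <script> tag you would embed in the product page's HTML.
print(f'<script type="application/ld+json">{json.dumps(product, indent=2)}</script>')
```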

Bonus: Tools for Tracking

Google Analytics 4

GA4 is a tracking tool that provides comprehensive insights into website traffic, user behavior, and conversions.

Utilize it to track where users are landing on your site, how they are engaging with your content, and the paths they are taking prior to converting.

We recommend setting up goals and funnels in order to measure specific events, such as form starts and form completions.
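
If you also need to record events like those form completions from your own server, GA4 exposes a Measurement Protocol. Below is a minimal sketch; the measurement ID, API secret, and event name are placeholders, and most sites will simply configure these events through gtag.js or Google Tag Manager instead:

```python
import json
import urllib.request

# Placeholders: substitute your own GA4 measurement ID and API secret.
MEASUREMENT_ID = "G-XXXXXXXXXX"
API_SECRET = "your-api-secret"

def send_event(client_id: str, name: str, params: dict) -> int:
    """Send one event to GA4 via the Measurement Protocol; returns HTTP status."""
    url = ("https://www.google-analytics.com/mp/collect"
           f"?measurement_id={MEASUREMENT_ID}&api_secret={API_SECRET}")
    payload = {"client_id": client_id, "events": [{"name": name, "params": params}]}
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # GA4 returns 204 on success

# Hypothetical usage: record a completed contact form as a conversion event.
# send_event(client_id="555.1234567890", name="form_complete",
#            params={"form_id": "contact", "funnel_stage": "conversion"})
```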

Google Search Console

Search Console provides valuable data for tracking search queries and page performance.

You can use it to monitor which keywords are driving traffic to which pages, using core KPIs like clicks, impressions, and click-through rate.

Search Console’s reports are also an invaluable resource for learning how Google is crawling and indexing your site, as well as discovering any Core Web Vitals issues.
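
If you would rather pull that query and page data programmatically, Search Console also offers an API. Here is a minimal sketch using the google-api-python-client library; the site URL and date range are placeholders, and you would first need OAuth or service account credentials with Search Console access:

```python
from googleapiclient.discovery import build

def top_queries(credentials, site_url="https://www.example.com/"):
    """Print the top query/page pairs with their core KPIs."""
    service = build("searchconsole", "v1", credentials=credentials)
    response = service.searchanalytics().query(
        siteUrl=site_url,
        body={
            "startDate": "2024-01-01",  # placeholder date range
            "endDate": "2024-03-31",
            "dimensions": ["query", "page"],
            "rowLimit": 25,
        },
    ).execute()
    for row in response.get("rows", []):
        query, page = row["keys"]
        print(f"{query} -> {page}: {row['clicks']} clicks, "
              f"{row['impressions']} impressions, CTR {row['ctr']:.1%}")
```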

Keyword Research Tools

You can use SEO tools like Semrush, Ahrefs, and Moz to help you discover and track keywords for SEO specifically.

All of these tools allow you to identify the intent of a specific keyword, as well as its estimated search volume and ranking difficulty.

Conclusion

Knowing your marketing funnel and mapping your SEO focus to the user intents at each phase is critical. SEO requires a lot of time and effort.

Plus, it can take time before we see a measurable return on investment (ROI).

The more you know about how you engage prospects and move them through the funnel, and the more you pair that knowledge with an ongoing optimization process, the better you can minimize wasted tactics that don’t produce results.

How Compression Can Be Used To Detect Low Quality Pages

The concept of Compressibility as a quality signal is not widely known, but SEOs should be aware of it. Search engines can use web page compressibility to identify duplicate pages, doorway pages with similar content, and pages with repetitive keywords, making it useful knowledge for SEO.

Although the following research paper demonstrates a successful use of on-page features for detecting spam, the deliberate lack of transparency by search engines makes it difficult to say with certainty if search engines are applying this or similar techniques.

What Is Compressibility?

In computing, compressibility refers to how much a file (data) can be reduced in size while retaining essential information, typically to maximize storage space or to allow more data to be transmitted over the Internet.

TL;DR Of Compression

Compression replaces repeated words and phrases with shorter references, reducing the file size by significant margins. Search engines typically compress indexed web pages to maximize storage space, reduce bandwidth, and improve retrieval speed, among other reasons.

This is a simplified explanation of how compression works:

  • Identify Patterns:
    A compression algorithm scans the text to find repeated words, patterns, and phrases.
  • Replace With Shorter References:
    Each repeated word or phrase is replaced by a short code or symbol that uses fewer bits than the original.
  • Smaller File Size:
    Because the codes and symbols use less storage space than the words and phrases they replace, the result is a significantly smaller file.

A bonus effect of using compression is that it can also be used to identify duplicate pages, doorway pages with similar content, and pages with repetitive keywords.
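
To see that effect concretely, here is a small Python sketch (my illustration, not from the research paper discussed below) comparing how well gzip compresses a keyword-stuffed string versus more varied text:

```python
import gzip

def compression_ratio(text: str) -> float:
    """Uncompressed size divided by gzip-compressed size."""
    raw = text.encode("utf-8")
    return len(raw) / len(gzip.compress(raw))

# A keyword-stuffed page body: one phrase repeated over and over.
stuffed = "buy cheap widgets best cheap widgets cheap widgets online " * 100

# Templated-but-varying prose: less redundancy, so a lower ratio.
varied = " ".join(
    f"Paragraph {i} covers a distinct aspect of widget design and care."
    for i in range(100)
)

print(f"keyword-stuffed ratio: {compression_ratio(stuffed):.1f}")  # much higher
print(f"varied-text ratio:     {compression_ratio(varied):.1f}")
```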

Research Paper About Detecting Spam

This research paper is significant because it was authored by distinguished computer scientists known for breakthroughs in AI, distributed computing, information retrieval, and other fields.

Marc Najork

One of the co-authors of the research paper is Marc Najork, a prominent research scientist who currently holds the title of Distinguished Research Scientist at Google DeepMind. He’s a co-author of the papers for TW-BERT, has contributed research for increasing the accuracy of using implicit user feedback like clicks, and worked on creating improved AI-based information retrieval (DSI++: Updating Transformer Memory with New Documents), among many other major breakthroughs in information retrieval.

Dennis Fetterly

Another of the co-authors is Dennis Fetterly, currently a software engineer at Google. He is listed as a co-inventor in a patent for a ranking algorithm that uses links, and is known for his research in distributed computing and information retrieval.

Those are just two of the distinguished researchers listed as co-authors of the 2006 Microsoft research paper about identifying spam through on-page content features. Among the several on-page content features the research paper analyzes is compressibility, which they discovered can be used as a classifier for indicating that a web page is spammy.

Detecting Spam Web Pages Through Content Analysis

Although the research paper was authored in 2006, its findings remain relevant today.

Then, as now, people attempted to rank hundreds or thousands of location-based web pages that were essentially duplicate content aside from city, region, or state names. Then, as now, SEOs often created web pages for search engines by excessively repeating keywords within titles, meta descriptions, headings, internal anchor text, and within the content to improve rankings.

Section 4.6 of the research paper explains:

“Some search engines give higher weight to pages containing the query keywords several times. For example, for a given query term, a page that contains it ten times may be higher ranked than a page that contains it only once. To take advantage of such engines, some spam pages replicate their content several times in an attempt to rank higher.”

The research paper explains that search engines compress web pages and use the compressed version to reference the original web page. The researchers note that excessive amounts of redundant words result in a higher level of compressibility. So they set about testing whether there is a correlation between a high level of compressibility and spam.

They write:

“Our approach in this section to locating redundant content within a page is to compress the page; to save space and disk time, search engines often compress web pages after indexing them, but before adding them to a page cache.

…We measure the redundancy of web pages by the compression ratio, the size of the uncompressed page divided by the size of the compressed page. We used GZIP …to compress pages, a fast and effective compression algorithm.”

High Compressibility Correlates To Spam

The results of the research showed that web pages with a compression ratio of at least 4.0 tended to be low-quality web pages, i.e., spam. However, the results at the highest compression ratios were less consistent because there were fewer data points, making them harder to interpret.

Figure 9: Prevalence of spam relative to compressibility of page.

The researchers concluded:

“70% of all sampled pages with a compression ratio of at least 4.0 were judged to be spam.”
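
As an illustration (mine, not the paper’s implementation), a naive filter built on that 4.0 threshold could look like the sketch below; the page data passed in would be hypothetical crawl output:

```python
import gzip

SPAM_RATIO_THRESHOLD = 4.0  # the paper found pages at >= 4.0 were ~70% spam

def compression_ratio(text: str) -> float:
    raw = text.encode("utf-8")
    return len(raw) / len(gzip.compress(raw))

def flag_suspect_pages(pages: dict) -> list:
    """Return URLs whose body text compresses at a suspiciously high ratio.

    pages maps URL -> page body text. As the paper notes, a single-signal
    filter like this also flags some legitimate pages (false positives).
    """
    return [url for url, body in pages.items()
            if compression_ratio(body) >= SPAM_RATIO_THRESHOLD]
```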

But they also discovered that using the compression ratio by itself still resulted in false positives, where non-spam pages were incorrectly identified as spam:

“The compression ratio heuristic described in Section 4.6 fared best, correctly identifying 660 (27.9%) of the spam pages in our collection, while misidentifying 2,068 (12.0%) of all judged pages.

Using all of the aforementioned features, the classification accuracy after the ten-fold cross validation process is encouraging:

95.4% of our judged pages were classified correctly, while 4.6% were classified incorrectly.

More specifically, for the spam class 1,940 out of the 2,364 pages were classified correctly. For the non-spam class, 14,440 out of the 14,804 pages were classified correctly. Consequently, 788 pages were classified incorrectly.”

The next section describes an interesting discovery about how to increase the accuracy of using on-page signals for identifying spam.

Insight Into Quality Rankings

The research paper examined multiple on-page signals, including compressibility. The researchers discovered that each individual signal (classifier) was able to find some spam, but that relying on any one signal on its own resulted in flagging non-spam pages as spam, commonly referred to as false positives.

The researchers made an important discovery that everyone interested in SEO should know, which is that using multiple classifiers increased the accuracy of detecting spam and decreased the likelihood of false positives. Just as important, the compressibility signal only identifies one kind of spam but not the full range of spam.

The takeaway is that compressibility is a good way to identify one kind of spam, but there are other kinds of spam that aren’t caught with this one signal.

This is the part that every SEO and publisher should be aware of:

“In the previous section, we presented a number of heuristics for assaying spam web pages. That is, we measured several characteristics of web pages, and found ranges of those characteristics which correlated with a page being spam. Nevertheless, when used individually, no technique uncovers most of the spam in our data set without flagging many non-spam pages as spam.

For example, considering the compression ratio heuristic described in Section 4.6, one of our most promising methods, the average probability of spam for ratios of 4.2 and higher is 72%. But only about 1.5% of all pages fall in this range. This number is far below the 13.8% of spam pages that we identified in our data set.”

So, even though compressibility was one of the better signals for identifying spam, it still was unable to uncover the full range of spam within the dataset the researchers used to test the signals.

Combining Multiple Signals

The above results indicated that any individual signal of low quality is less accurate on its own. So the researchers tested using multiple signals, and discovered that combining multiple on-page signals for detecting spam resulted in a better accuracy rate, with fewer pages misclassified as spam.

The researchers explained that they tested the use of multiple signals:

“One way of combining our heuristic methods is to view the spam detection problem as a classification problem. In this case, we want to create a classification model (or classifier) which, given a web page, will use the page’s features jointly in order to (correctly, we hope) classify it in one of two classes: spam and non-spam.”

These are their conclusions about using multiple signals:

“We have studied various aspects of content-based spam on the web using a real-world data set from the MSNSearch crawler. We have presented a number of heuristic methods for detecting content based spam. Some of our spam detection methods are more effective than others, however when used in isolation our methods may not identify all of the spam pages. For this reason, we combined our spam-detection methods to create a highly accurate C4.5 classifier. Our classifier can correctly identify 86.2% of all spam pages, while flagging very few legitimate pages as spam.”
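
As an illustration of the combined-classifier idea, the sketch below trains scikit-learn’s decision tree (which implements CART rather than the paper’s C4.5) on a synthetic feature matrix. The features and labels are invented to mimic a multi-signal pattern and are not taken from the paper’s data set:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 1000

# One row per page, combining several on-page signals in the spirit of the
# paper: compression ratio, fraction of popular words, average word length,
# and fraction of visible text that appears in the title.
X = np.column_stack([
    rng.normal(2.5, 1.2, n),   # compression ratio
    rng.uniform(0, 1, n),      # fraction of globally popular words
    rng.normal(5.0, 0.8, n),   # average word length
    rng.uniform(0, 0.3, n),    # fraction of text in the title
])

# Toy labels: spam when two signals are jointly high, plus some label noise.
y = ((X[:, 0] > 4.0) & (X[:, 1] > 0.5)) | (rng.uniform(0, 1, n) < 0.05)

clf = DecisionTreeClassifier(max_depth=5, random_state=0)
# Ten-fold cross-validation, mirroring the paper's evaluation setup.
print(f"mean 10-fold accuracy: {cross_val_score(clf, X, y, cv=10).mean():.3f}")
```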

Key Insight:

Misidentifying “very few legitimate pages as spam” was a significant breakthrough. The important insight that everyone involved with SEO should take away from this is that one signal by itself can result in false positives. Using multiple signals increases the accuracy.

What this means is that SEO tests of isolated ranking or quality signals will not yield reliable results that can be trusted for making strategy or business decisions.

Takeaways

We don’t know for certain if compressibility is used by search engines, but it’s an easy-to-use signal that, combined with others, could catch simple kinds of spam, like thousands of city-name doorway pages with similar content. Yet even if search engines don’t use this signal, it shows how easy it is to catch that kind of search engine manipulation, and that it’s something search engines are well able to handle today.

Here are the key points of this article to keep in mind:

  • Doorway pages with duplicate content are easy to catch because they compress at a higher ratio than normal web pages.
  • Groups of web pages with a compression ratio above 4.0 were predominantly spam.
  • Negative quality signals used by themselves to catch spam can lead to false positives.
  • In this particular test, on-page negative quality signals only caught specific types of spam.
  • When used alone, the compressibility signal only catches redundancy-type spam, fails to detect other forms of spam, and leads to false positives.
  • Combining quality signals improves spam detection accuracy and reduces false positives.
  • Search engines today detect spam with higher accuracy through the use of AI, such as Google’s SpamBrain.

Read the research paper, which is linked from the Google Scholar page of Marc Najork:

Detecting spam web pages through content analysis

New Google Trends SEO Documentation

Google Search Central published new documentation on Google Trends, explaining how to use it for search marketing. The guide serves as an easy-to-understand introduction for newcomers and a helpful refresher for experienced search marketers and publishers.

The new guide has six sections:

  1. About Google Trends
  2. Tutorial on monitoring trends
  3. How to do keyword research with the tool
  4. How to prioritize content with Trends data
  5. How to use Google Trends for competitor research
  6. How to use Google Trends for analyzing brand awareness and sentiment

The section about monitoring trends advises that there are two kinds of rising trends, general and specific, both of which can be useful for developing content to publish on a site.

Using the Explore tool, you can leave the search box empty to view the current rising trends worldwide, or use a drop-down menu to focus on trends in a specific country. Users can further filter rising trends by time period, category, and type of search. The results show rising trends by topic and by keyword.

To search for specific trends, users enter their queries and then filter them by country, time period, category, and type of search.

The section called Content Calendar describes how to use Google Trends to understand which content topics to prioritize.

Google explains:

“Google Trends can be helpful not only to get ideas on what to write, but also to prioritize when to publish it. To help you better prioritize which topics to focus on, try to find seasonal trends in the data. With that information, you can plan ahead to have high quality content available on your site a little before people are searching for it, so that when they do, your content is ready for them.”
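
If you want to pull that kind of seasonal data programmatically, many practitioners use the unofficial pytrends library. It is not a Google-supported API and can break without notice, so treat the following as a rough sketch with placeholder keywords:

```python
from pytrends.request import TrendReq

# Unofficial Google Trends client; the term and geo are placeholders.
pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(["christmas gifts"], timeframe="today 5-y", geo="US")

interest = pytrends.interest_over_time()  # weekly interest, indexed by date

# Averaging interest by calendar month reveals the seasonal peak, so content
# can be published ahead of it, as the guidance above suggests.
monthly = interest["christmas gifts"].groupby(interest.index.month).mean()
print(monthly.round(1))
```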

Read the new Google Trends documentation:

Get started with Google Trends

All the best things about Ahrefs Evolve 2024

Hey all, I’m Rebekah and I am your Chosen One to “do a blog post for Ahrefs Evolve 2024”.

What does that entail exactly? I don’t know. In fact, Sam Oh asked me yesterday what the title of this post would be. “Is it like…Ahrefs Evolve 2024: Recap of day 1 and day 2…?” 

Even as I nodded, I couldn’t get over how absolutely boring that sounded. So I’m going to do THIS instead: a curation of all the best things YOU loved about Ahrefs’ first conference, lifted directly from X.

Let’s go!

OUR HUGE SCREEN

CONFERENCE VENUE ITSELF

It was recently named the best new skyscraper in the world, by the way.

OUR AMAZING SPEAKER LINEUP – SUPER INFORMATIVE, USEFUL TALKS!

GREAT MUSIC

AMAZING GOODIES

SELFIE BATTLE

Some background: Tim and Sam have a challenge going on to see who can take the most selfies with all of you. Last I heard, Sam was winning – but there is room for a comeback yet!

THAT BELL

Everybody’s just waiting for this one.

STICKER WALL

AND, OF COURSE…ALL OF YOU!

There’s a TON more content on LinkedIn – click here – but I have limited time to get this post up and can’t quite figure out how to embed LinkedIn posts so…let’s stop here for now. I’ll keep updating as we go along!


