Google’s “Information Gain” Patent For Ranking Web Pages

Google was recently granted a patent on an information gain score for ranking web pages

Google was recently granted a patent on ranking web pages, which may offer insights into how AI Overviews ranks content. The patent describes a method for ranking pages based on what a user might be interested in next.

Contextual Estimation Of Link Information Gain

The patent, Contextual Estimation Of Link Information Gain, was filed in 2018 and granted in June 2024. It describes how to calculate a ranking score called information gain, which is used to rank a second set of web pages that are likely to interest a user as a slightly different follow-up topic related to a previous question.

The patent starts with general descriptions and then adds layers of specifics over the course of paragraphs. An analogy: it's like a pizza. It starts out as a mozzarella pizza, then they add mushrooms, so now it's a mushroom pizza. Then they add onions, so now it's a mushroom and onion pizza. There are layers of specifics that build up to the entire context.

So if you read just one section of it, it’s easy to say, “It’s clearly a mushroom pizza” and be completely mistaken about what it really is.

There are layers of context but what it’s building up to is:

  • Ranking a web page that is relevant for what a user might be interested in next.
  • The context of the invention is an automated assistant or chatbot
  • A search engine plays a role in a way that seems similar to Google’s AI Overviews

Information Gain And SEO: What’s Really Going On?

A couple of months ago I read a comment on social media asserting that “Information Gain” was a significant factor in a recent Google core algorithm update.  That mention surprised me because I’d never heard of information gain before. I asked some SEO friends about it and they’d never heard of it either.

What the person on social media asserted was something like this: Google uses an "Information Gain" score to boost the ranking of web pages that have more information than other web pages, so it's important to create pages with more information than competing pages, something along those lines.

So I read the patent and discovered that "Information Gain" is not about ranking pages that have more information than other pages. It's about something more consequential for SEO because it may help explain one dimension of how AI Overviews ranks web pages.

TL;DR Of The Information Gain Patent

What the information gain patent is really about is even more interesting because it may give an indication of how AI Overviews (AIO) ranks web pages that a user might be interested in next. It's sort of like introducing personalization by anticipating what a user will be interested in next.

The patent describes a scenario where a user makes a search query and the automated assistant or chatbot provides an answer that's relevant to the question. The information gain scoring system works in the background to rank a second set of web pages that are relevant to what the user might be interested in next. It's a new dimension in how web pages are ranked.

The Patent’s Emphasis on Automated Assistants

There are multiple versions of the Information Gain patent dating from 2018 to 2024. The first version is similar to the last version with the most significant difference being the addition of chatbots as a context for where the information gain invention is used.

The patent uses the phrase "automated assistant" 69 times but the phrase "search engine" only 25 times. As with AI Overviews, search engines do play a role in this patent, but generally in the context of automated assistants.

As will become evident, there is nothing to suggest that a web page containing more information than the competition is likelier to be ranked higher in the organic search results. That’s not what this patent talks about.

General Description Of Context

All versions of the patent describe the presentation of search results within the context of an automated assistant and natural language question answering. The patent starts with a general description and progressively becomes more specific. This is typical of patents: they seek protection for the widest contexts in which the invention can be used and then become progressively more specific.

The entire first section (the Abstract) doesn’t even mention web pages or links. It’s just about the information gain score within a very general context:

“An information gain score for a given document is indicative of additional information that is included in the document beyond information contained in documents that were previously viewed by the user.”

That is a nutshell description of the patent, with the key insight being that the information gain scoring happens on pages after the user has seen the first search results.

More Specific Context: Automated Assistants

The second paragraph of the section titled "Background" is slightly more specific and adds another layer of context because it mentions links. Specifically, it's about a user who makes a search query and receives links to search results, with no information gain score calculated yet.

The Background section says:

“For example, a user may submit a search request and be provided with a set of documents and/or links to documents that are responsive to the submitted search request.”

The next part builds on top of a user having made a search query:

“Also, for example, a user may be provided with a document based on identified interests of the user, previously viewed documents of the user, and/or other criteria that may be utilized to identify and provide a document of interest. Information from the documents may be provided via, for example, an automated assistant and/or as results to a search engine. Further, information from the documents may be provided to the user in response to a search request and/or may be automatically served to the user based on continued searching after the user has ended a search session.”

That last sentence is poorly worded.

Here’s the original sentence:

“Further, information from the documents may be provided to the user in response to a search request and/or may be automatically served to the user based on continued searching after the user has ended a search session.”

Here’s how it makes more sense:

“Further, information from the documents may be provided to the user… based on continued searching after the user has ended a search session.”

The information provided to the user is "in response to a search request and/or may be automatically served to the user."

It’s a little clearer if you put parentheses around it:

Further, information from the documents may be provided to the user (in response to a search request and/or may be automatically served to the user) based on continued searching after the user has ended a search session.

Takeaways:

  • The patent describes identifying documents that are relevant to the “interests of the user” based on “previously viewed documents” “and/or other criteria.”
  • It sets a general context of an automated assistant “and/or” a search engine
  • Information from the documents that are based on “previously viewed documents” “and/or other criteria” may be shown after the user continues searching.

More Specific Context: Chatbot

The patent next adds an additional layer of context and specificity by mentioning how chatbots can “extract” an answer from a web page (“document”) and show that as an answer. This is about showing a summary that contains the answer, kind of like featured snippets, but within the context of a chatbot.

The patent explains:

“In some cases, a subset of information may be extracted from the document for presentation to the user. For example, when a user engages in a spoken human-to-computer dialog with an automated assistant software process (also referred to as “chatbots,” “interactive personal assistants,” “intelligent personal assistants,” “personal voice assistants,” “conversational agents,” “virtual assistants,” etc.), the automated assistant may perform various types of processing to extract salient information from a document, so that the automated assistant can present the information in an abbreviated form.

As another example, some search engines will provide summary information from one or more responsive and/or relevant documents, in addition to or instead of links to responsive and/or relevant documents, in response to a user’s search query.”

The last sentence sounds like it’s describing something that’s like a featured snippet or like AI Overviews where it provides a summary. The sentence is very general and ambiguous because it uses “and/or” and “in addition to or instead of” and isn’t as specific as the preceding sentences. It’s an example of a patent being general for legal reasons.

Ranking The Next Set Of Search Results

The next section is called the Summary and it goes into more detail about how the information gain score represents how likely the user is to be interested in the next set of documents. It's not about ranking the first search results; it's about ranking the next set of search results (based on a related topic).

It states:

“An information gain score for a given document is indicative of additional information that is included in the given document beyond information contained in other documents that were already presented to the user.”

Ranking Based On Topic Of Web Pages

It then talks about presenting the web page in a browser, audibly reading the relevant part of the document or audibly/visually presenting a summary of the document (“audibly/visually presenting salient information extracted from the document to the user, etc.”)

But the part that's really interesting is where it next explains using a topic of the web page as a representation of the content, which is used to calculate the information gain score.

It describes many different ways of extracting a representation of what the page is about. What's important is that it describes calculating the information gain score based on a representation of what the content is about, such as the topic.

“In some implementations, information gain scores may be determined for one or more documents by applying data indicative of the documents, such as their entire contents, salient extracted information, a semantic representation (e.g., an embedding, a feature vector, a bag-of-words representation, a histogram generated from words/phrases in the document, etc.) across a machine learning model to generate an information gain score.”
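To make the idea concrete, here is a minimal sketch of how a score like this could be computed. It is purely illustrative: it assumes a bag-of-words representation and a cosine-dissimilarity heuristic, while the patent only says that a semantic representation is applied across a machine learning model, so the function names and scoring logic below are my assumptions, not Google's implementation.

```python
from collections import Counter
import math

def bag_of_words(text: str) -> Counter:
    """Build the simplest representation the patent lists (a bag of words);
    a stand-in for richer embeddings or feature vectors."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    shared = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in shared)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def information_gain_score(candidate: str, already_seen: list[str]) -> float:
    """Heuristic score: how different is the candidate from the most similar
    document the user has already viewed? (The patent instead runs such
    representations through a trained machine learning model.)"""
    if not already_seen:
        return 1.0
    c = bag_of_words(candidate)
    return min(1.0 - cosine(c, bag_of_words(seen)) for seen in already_seen)

already_viewed = ["how to brew espresso at home with a machine"]
candidates = [
    "how to brew espresso at home step by step",               # mostly repeats what was seen
    "choosing a burr grinder for espresso at home",             # adds some new information
    "how to steam and froth milk for lattes and flat whites",   # adds mostly new information
]
# Rank the second set of pages by how much new information each one adds.
for doc in sorted(candidates, key=lambda d: information_gain_score(d, already_viewed), reverse=True):
    print(round(information_gain_score(doc, already_viewed), 2), doc)
```

Run as written, the page about milk steaming scores highest and the near-duplicate brewing page scores lowest, which is the behavior the patent describes: pages the user has effectively already seen get demoted.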

The patent goes on to describe ranking a first set of documents and using the Information Gain scores to rank additional sets of documents that anticipate follow up questions or a progression within a dialog of what the user is interested in.

The automated assistant can in some implementations query a search engine and then apply the Information Gain rankings to the multiple sets of search results (that are relevant to related search queries).

There are multiple variations of doing the same thing but in general terms this is what it describes:

“Based on the information gain scores, information contained in one or more of the new documents may be selectively provided to the user in a manner that reflects the likely information gain that can be attained by the user if the user were to be presented information from the selected documents.”

What All Versions Of The Patent Have In Common

All versions of the patent share general similarities over which more specifics are layered in over time (like adding onions to a mushroom pizza). The following are the baseline of what all the versions have in common.

Application Of Information Gain Score

All versions of the patent describe applying the information gain score to a second set of documents that contain additional information beyond the first set. Obviously, there are no criteria or information for guessing what the user is going to search for when they start a search session, so information gain scores are not applied to the first search results.

Examples of passages that are the same for all versions:

  • A second set of documents is identified that is also related to the topic of the first set of documents but that have not yet been viewed by the user.
  • For each new document in the second set of documents, an information gain score is determined that is indicative of, for the new document, whether the new document includes information that was not contained in the documents of the first set of documents…

Automated Assistants

All four versions of the patent refer to automated assistants that show search results in response to natural language queries.

The 2018 and 2023 versions of the patent both mention search engines 25 times. The 2018 version mentions "automated assistant" 74 times and the latest version mentions it 69 times.

They all make references to “conversational agents,” “interactive personal assistants,” “intelligent personal assistants,” “personal voice assistants,” and “virtual assistants.”

It’s clear that the emphasis of the patent is on automated assistants, not the organic search results.

Dialog Turns

Note: In everyday language we use the word dialogue. In computing it is spelled dialog.

All versions of the patent refer to a way of interacting with the system in the form of a dialog, specifically a dialog turn. A dialog turn is the back and forth that happens when a user asks a question using natural language, receives an answer, and then asks a follow-up question or another question altogether. The dialog can happen as typed text, text-to-speech (TTS), or spoken audio.

The main aspect the patents have in common is the back and forth in what is called a “dialog turn.” All versions of the patent have this as a context.

Here’s an example of how the dialog turn works:

“Automated assistant client 106 and remote automated assistant 115 can process natural language input of a user and provide responses in the form of a dialog that includes one or more dialog turns. A dialog turn may include, for instance, user-provided natural language input and a response to natural language input by the automated assistant.

Thus, a dialog between the user and the automated assistant can be generated that allows the user to interact with the automated assistant …in a conversational manner.”

Problems That Information Gain Scores Solve

The main feature of the patent is to improve the user experience by understanding the additional value that a new document provides compared to documents that a user has already seen. This additional value is what is meant by the phrase Information Gain.

There are multiple ways that information gain is useful, and one that all versions of the patent describe is in the context of an audio response, where a long-winded answer is a poor experience, including in a TTS (text-to-speech) context.

The patent explains the problem of a long-winded response:

“…and so the user may wait for substantially all of the response to be output before proceeding. In comparison with reading, the user is able to receive the audio information passively, however, the time taken to output is longer and there is a reduced ability to scan or scroll/skip through the information.”

The patent then explains how information gain can speed up answers by eliminating redundant (repetitive) responses and by avoiding answers that are insufficient and force the user into another dialog turn.

This part of the patent refers to the information density of a section in a web page, a section that answers the question in the fewest words. Information density is about how “accurate,” “concise,” and “relevant” the answer is, both for relevance and for avoiding repetitiveness. Information density is especially important for audio/spoken answers.

This is what the patent says:

“As such, it is important in the context of an audio output that the output information is relevant, accurate and concise, in order to avoid an unnecessarily long output, a redundant output, or an extra dialog turn.

The information density of the output information becomes particularly important in improving the efficiency of a dialog session. Techniques described herein address these issues by reducing and/or eliminating presentation of information a user has already been provided, including in the audio human-to-computer dialog context.”

The idea of “information density” is important in a general sense because dense content communicates better for users, but it's probably extra important in the context of chatbot search results, whether spoken or not. Google AI Overviews shows snippets from a web page, but perhaps more importantly, communicating concisely is the best way to stay on topic and make it easy for a search engine to understand content.

Search Results Interface

All versions of the Information Gain patent are clear that the invention is not in the context of organic search results. It’s explicitly within the context of ranking web pages within a natural language interface of an automated assistant and an AI chatbot.

However, there is a part of the patent that describes showing users the second set of results within a “search results interface.” The scenario is that the user sees an answer and then becomes interested in a related topic. The second set of ranked web pages is shown in a “search results interface.”

The patent explains:

“In some implementations, one or more of the new documents of the second set may be presented in a manner that is selected based on the information gain stores. For example, one or more of the new documents can be rendered as part of a search results interface that is presented to the user in response to a query that includes the topic of the documents, such as references to one or more documents. In some implementations, these search results may be ranked at least in part based on their respective information gain scores.”

“…The user can then select one of the references and information contained in the particular document can be presented to the user. Subsequently, the user may return to the search results and the references to the document may again be provided to the user but updated based on new information gain scores for the documents that are referenced.

In some implementations, the references may be reranked and/or one or more documents may be excluded (or significantly demoted) from the search results based on the new information gain scores that were determined based on the document that was already viewed by the user.”

What is a search results interface? I think it’s just an interface that shows search results.

Let’s pause here to underline that it should be clear at this point that the patent is not about ranking web pages that are comprehensive about a topic. The overall context of the invention is showing documents within an automated assistant.

A search results interface is just an interface; it's never described as being the organic search results.

There’s more that is the same across all versions of the patent but the above are the important general outlines and context of it.

Claims Of The Patent

The claims section is where the scope of the actual invention is described and for which legal protection is sought. It is mainly focused on the invention and less so on the context. Thus, there is no mention of search engines, automated assistants, audible responses, or TTS (text-to-speech) within the claims section. What remains is the context of a search results interface, which presumably covers all of those contexts.

Context: First Set Of Documents

It starts out by outlining the context of the invention. This context is receiving a query, identifying the topic, and ranking a first group of relevant web pages (documents) and selecting at least one of them as being relevant and either showing the document or communicating the information from the document (like a summary).

“1. A method implemented using one or more processors, comprising: receiving a query from a user, wherein the query includes a topic; identifying a first set of documents that are responsive to the query, wherein the documents of the set of documents are ranked, and wherein a ranking of a given document of the first set of documents is indicative of relevancy of information included in the given document to the topic; selecting, based on the rankings and from the documents of the first set of documents, a most relevant document providing at least a portion of the information from the most relevant document to the user;”

Context: Second Set Of Documents

Then what immediately follows is the part about ranking a second set of documents that contain additional information. This second set of documents is ranked using the information gain scores to show more information after showing a relevant document from the first group.

This is how it explains it:

“…in response to providing the most relevant document to the user, receiving a request from the user for additional information related to the topic; identifying a second set of documents, wherein the second set of documents includes at one or more of the documents of the first set of documents and does not include the most relevant document; determining, for each document of the second set, an information gain score, wherein the information gain score for a respective document of the second set is based on a quantity of new information included in the respective document of the second set that differs from information included in the most relevant document; ranking the second set of documents based on the information gain scores; and causing at least a portion of the information from one or more of the documents of the second set of documents to be presented to the user, wherein the information is presented based on the information gain scores.”
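Read as a procedure, the claim describes a two-stage flow: rank the first set by relevance, then rank what's left by information gain once the user asks for more. The sketch below walks through that flow with made-up data; the canned relevance scores and the word-overlap gain metric are assumptions for illustration, not anything specified in the patent.

```python
def new_token_share(candidate: str, seen_docs: list[str]) -> float:
    """Toy information-gain proxy: the share of the candidate's words that
    do not appear in anything the user has already been shown."""
    seen = set()
    for doc in seen_docs:
        seen.update(doc.lower().split())
    words = candidate.lower().split()
    return sum(w not in seen for w in words) / len(words) if words else 0.0

def two_stage_ranking(relevance_by_doc: dict[str, float]) -> None:
    """Walk through the two stages of claim 1 with canned data; the relevance
    scores here are made up rather than produced by a real search engine."""
    # Stage 1: rank the first set by relevance and present the most relevant document.
    first_set = sorted(relevance_by_doc, key=relevance_by_doc.get, reverse=True)
    most_relevant = first_set[0]
    print("First answer:", most_relevant)

    # Stage 2: the user asks for more on the topic, so rank the remaining
    # documents by information gain relative to what they have already seen.
    second_set = [d for d in first_set if d != most_relevant]
    second_set.sort(key=lambda d: new_token_share(d, [most_relevant]), reverse=True)
    print("Next answer :", second_set[0])

two_stage_ranking({
    "espresso brewing basics for beginners": 0.95,
    "a beginners guide to espresso brewing basics": 0.90,  # relevant but mostly redundant
    "choosing and dialing in an espresso grinder": 0.80,   # relevant and adds new information
})
```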

Granular Details

The rest of the claims section contains granular details about the concept of Information Gain, which is a ranking of documents based on what the user already has seen and represents a related topic that the user may be interested in. The purpose of these details is to lock them in for legal protection as part of the invention.

Here’s an example:

“The method of claim 1, wherein identifying the first set comprises:
causing to be rendered, as part of a search results interface that is presented to the user in response to a previous query that includes the topic, references to one or more documents of the first set;
receiving user input that that indicates selection of one of the references to a particular document of the first set from the search results interface, wherein at least part of the particular document is provided to the user in response to the selection;”

To make an analogy, it’s describing how to make the pizza dough, clean and cut the mushrooms, etc. It’s not important for our purposes to understand it as much as the general view of what the patent is about.

Information Gain Patent

An opinion was shared on social media that this patent has something to do with ranking web pages in the organic search results. I saw it, read the patent, and discovered that's not how the patent works. It's a good patent and it's important to understand it correctly. I analyzed multiple versions of the patent to see what they had in common and what was different.

A careful reading of the patent shows that it is clearly focused on anticipating what the user may want to see based on what they have already seen. To accomplish this the patent describes the use of an Information Gain score for ranking web pages that are on topics that are related to the first search query but not specifically relevant to that first query.

The context of the invention is generally automated assistants, including chatbots. A search engine could be used as part of finding relevant documents but the context is not solely an organic search engine.

This patent could be applicable to the context of AI Overviews. I would not limit the context to AI Overviews as there are additional contexts such as spoken language in which Information Gain scoring could apply. Could it apply in additional contexts like Featured Snippets? The patent itself is not explicit about that.

Read the latest version of Information Gain patent:

Contextual estimation of link information gain


WordPress Gives WP Engine Users A Reprieve


Matt Mullenweg posted on WordPress.org that WP Engine users have been granted a reprieve from the block on the WordPress plugin and theme repository until October 1st, allowing them to access updates as usual.

WordPress Versus WP Engine

Matt Mullenweg and popular web host WP Engine have been locked in a conflict for the past week over a commercial licensing fee that other web hosts pay but WP Engine does not. The issue stems from Mullenweg's frustration with the perception that WP Engine is not giving back to WordPress as much as it should. Prominent figures in the WordPress industry, like Joost de Valk, agree with Mullenweg that companies, including WP Engine, should give back more to WordPress.

WP Engine has offered its side of the story and has gone as far as to send a formal cease and desist letter over what it perceives as an unfair attack on its business.

Regardless of who is right or wrong, WordPress users on WP Engine are caught in the middle of this conflict, with their businesses disrupted by Mullenweg’s decision to block WP Engine from accessing the WordPress.org plugin and theme repository, preventing them from updating plugins and themes.

Temporary Reprieve

Mullenweg posted on WordPress.org that he has heard from WordPress users and has decided to give WP Engine a chance to set up a solution so that its customers won't be inconvenienced. WP Engine has until October 1st to engineer a workaround.

He wrote:

“I’ve heard from WP Engine customers that they are frustrated that WP Engine hasn’t been able to make updates, plugin directory, theme directory, and Openverse work on their sites. It saddens me that they’ve been negatively impacted by Silver Lake‘s commercial decisions.

WP Engine was well aware that we could remove access when they chose to ignore our efforts to resolve our differences and enter into a commercial licensing agreement. Heather Brunner, Lee Wittlinger, and their Board chose to take this risk.

…We have lifted the blocks of their servers from accessing ours, until October 1, UTC 00:00. Hopefully this helps them spin up their mirrors of all of WordPress.org’s resources that they were using for free while not paying, and making legal threats against us.”

Read more at WordPress.org:

WP Engine Reprieve


Total Addressable Market (TAM): How to Estimate It and Source Data


Total addressable market (TAM) is an estimation of how much you could earn if you could sell your product or service to every possible customer in your market.

The basic formula for calculating TAM is:

TAM = (Total Number of Potential Customers) × (Average Annual Revenue per Customer)

Understanding TAM helps you figure out the size of your market and the amount of money you could make if you captured all of it.

TAM is also a key metric for startup investors. It shows whether a business idea has a big enough opportunity. Investors often look for a TAM that is “just right” — not too big or too small. A TAM that’s too large might mean the market is crowded with tough competition, while a TAM that’s too small could mean limited room for growth.

In this guide, you’ll learn how to estimate TAM using three methods, where people often make mistakes, and how to refine your estimations to make them plausible to investors or stakeholders and actionable for your business.

There are three approaches to calculating TAM. Depending on the available market data, your business model, and your stakeholders/investors, you should consider using the top-down, bottom-up, or value theory approach.

1. Top-down approach

The top-down approach starts with broad market data and narrows it down to estimate the market size for your specific product or service.

This approach is useful when there’s reliable, broad industry data available.

How to use

  1. Estimate the overall market size in which your product operates, usually obtained from industry reports or research.
  2. Apply a percentage that represents the portion of the market your product can realistically capture.

Example

If the global smartphone market is valued at $500 billion, and you are launching a new smartphone accessory, you might estimate that your product could target 5% of the market, which gives you a TAM of $25 billion.
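As a quick sketch, the arithmetic looks like this (the figures are the example numbers above, not real market data):

```python
# Top-down TAM for the smartphone-accessory example above (illustrative figures).
global_market = 500_000_000_000   # global smartphone market, $500B
target_share = 0.05               # assumed slice your accessory could target, 5%

tam = global_market * target_share
print(f"Top-down TAM: ${tam:,.0f}")   # Top-down TAM: $25,000,000,000
```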

2. Bottom-up approach

The bottom-up approach builds the TAM by starting with specific, individual data related to your business and scaling it up.

TAM: bottom-up approach.

This method is great when you have detailed knowledge of your customer base and pricing. As far as I know, investors prefer this method, which offers the most accurate and actionable TAM estimation.

 

A few birds in the hand is worth billions in the TAM. Early-stage (pre-Series-B) startups shouldn’t worry too much about calculating a precise TAM. As long as it’s in the right ballpark for their thesis, investors care a lot more about the traction you can show with paying customers. That’s why bottom-up is far more convincing than hand-wavy top-down methods that only rely on finding a big enough pie to claim as your market. 

Rob Cheng

How to use

  1. Estimate how many potential customers there are in your target market. You can do this by using sources like industry reports, census data, or research from trusted organizations (more data sources at the end of the article).
  2. Multiply this number by the average revenue you expect to earn from each customer (ARPU – Average Revenue Per User).

Tip

To calculate ARPU, consider the pricing of your product or service, how frequently customers will purchase, and the churn rate.

For example, if you charge $100 per month for a subscription service and your monthly churn rate is 15%, a customer might stay subscribed for around 6-7 months on average (roughly 1 ÷ churn rate), meaning your average revenue per customer would be around $600-700.

Example

Let’s say you have subscription-based software that helps small businesses manage their finances. You identify that 2 million small businesses could benefit from your software. If your ARPU is $600, your TAM would be 2 million customers × $600 = $1.2 billion.
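Here is that bottom-up arithmetic as a short sketch, combining the ARPU estimate from the tip above with the customer count from this example. Treating average customer lifetime as 1 ÷ monthly churn is a simplifying assumption, and all figures are the example numbers, not real data.

```python
# Step 1 (from the tip above): estimate ARPU from price and churn.
monthly_price = 100        # subscription price in dollars per month
monthly_churn = 0.15       # assumed monthly churn rate
avg_lifetime_months = 1 / monthly_churn             # ~6.7 months (simplifying assumption)
estimated_arpu = monthly_price * avg_lifetime_months
print(f"Estimated ARPU: ${estimated_arpu:,.0f}")    # ~$667 per customer

# Step 2: bottom-up TAM = potential customers x ARPU.
potential_customers = 2_000_000   # small businesses that could use the software
arpu = 600                        # rounded ARPU used in the example above
print(f"Bottom-up TAM: ${potential_customers * arpu:,.0f}")   # $1,200,000,000
```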

3. Value theory approach

The value theory approach estimates TAM based on the value your product provides to customers and how much they might be willing to pay for it.

TAM: value-based approach.

This approach is especially useful if you’re introducing a product or service that disrupts existing markets; traditional market size calculations may not accurately reflect the potential.

How to use

  1. Assess the value or cost savings that your product delivers to the customer.
  2. Estimate how much customers would be willing to pay for that value and scale it across the entire market.

Example

Suppose you have developed a new energy-efficient lighting system that saves companies $10,000 per year in energy costs.

If 100,000 companies could use your lighting system, and each is willing to pay $5,000 for it (because they’ll save $10,000), your TAM would be 100,000 companies × $5,000 = $500 million.

There’s also a fourth option — a middle ground mentioned by quite a few people who offered their insights for this article.

 

I’d say the best method to estimate TAM is usually a combination of top-down and bottom-up approaches. The top-down method gives you a big picture view using industry reports and market research, while bottom-up lets you build from the ground up using your own data and customer insights. This combined approach helps balance out the weaknesses of each method. 

Aaron Whittaker

You may encounter the TAM, SAM, and SOM terminology and need to apply it if an investor requests it.

People who prefer this approach treat TAM as a “pie in the sky” number and further refine it with SAM and SOM portions of it.

  • TAM (Total Addressable Market) is the total market if you could sell to everyone, everywhere. Your biggest possible opportunity.
  • SAM (Serviceable Addressable Market) is the portion of the TAM you can actually target based on where you operate and who your product is for. For example, if you’re a local coffee shop in New York City, your SAM might be coffee drinkers in NYC, not every coffee drinker worldwide.
  • SOM (Serviceable Obtainable Market) is the realistic piece of the SAM that you can actually win over, considering the competition and your strengths. Continuing with the coffee shop example, your SOM might be the number of customers you can realistically attract in your neighborhood, given factors like nearby competitors, your unique offerings, and marketing efforts.
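A quick sketch of how the three numbers narrow down, using hypothetical shares rather than figures from any real market:

```python
# Narrowing a "pie in the sky" TAM into SAM and SOM (all figures hypothetical).
tam = 1_200_000_000    # everyone who could conceivably buy, everywhere
sam_share = 0.10       # the slice you can actually serve (geography, segment, fit)
som_share = 0.05       # the slice of the SAM you can realistically win near-term

sam = tam * sam_share
som = sam * som_share
print(f"TAM ${tam:,.0f} -> SAM ${sam:,.0f} -> SOM ${som:,.0f}")
# TAM $1,200,000,000 -> SAM $120,000,000 -> SOM $6,000,000
```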

TAM is typically used to tell a compelling story about the potential for growth, so it's easy to be over-optimistic and make mistakes that make your TAM look better than it is.

Here’s an example. I used a tool that calculates TAM automatically based on a URL to find the market size for netflix.com. The tool told me that there are 7B people who “need it (…) even if they’re not willing or able to make a purchase” and 6.3B ready to make a purchase, which I find hard to believe since there are an estimated 5.3B people with internet access worldwide.

Also, the way that the tool defines my potential customers doesn’t sound convincing to me, either, let alone logical.

Example of mistakes in calculating TAM.

Other mistakes you should avoid:

  • Falling into the “everything trap”. This is when businesses assume that their product or service could appeal to everyone in the market, leading them to calculate TAM based on an overly broad audience.
  • Sizing the problem instead of the market. This happens when businesses focus on the total number of people who might benefit from their solution without considering how many are realistically willing to pay for it.
  • Overlooking market trends and dynamics. The market can grow or contract, consumer preferences can change, government regulations can influence the market, etc.

The basic data sources for TAM calculations are industry reports you can find on platforms like Statista and census data (like the US census data). However, there are other places where you can look for more detailed data.

Explore the market using search data

Search data is information about what people are looking for online. It can help you understand what customers want, where interest is growing, and what regions are most active.

Google Trends provides some of that data for free. For example, you can check if interest in a plant-based diet is still strong and where in the US you could find the most customers.

Using Google Trends for TAM.

But that’s as far as this tool goes. You don’t know what terms are “inside” the topic or how popular a keyword is (the numbers in Google Trends are relative). Also, sometimes Google won’t have the data at all, as is the case for the term “baby food subscription”.

Using Google Trends for TAM.

Alternatively, you can use Ahrefs. I’m sure you’ll find more search terms there and a lot more data points. Let me take you through three examples.

Gauge demand with search volume

Search volume is an estimation of the average monthly number of searches for a keyword over the latest known 12 months of data.

High search volumes suggest a larger potential market. Low search volumes suggest a smaller market (or that you will need to be more creative to find customers).

For example, while Google Trends didn’t have any data on “baby food subscription”, Ahrefs’ Keywords Explorer shows an estimated 1.2K searches per month for that term in the US. Plus, it shows you the forecast for that keyword.

Example of keyword data useful for calculating TAM.

If you were planning to start a new business in this niche, you’d need compelling arguments to justify a high TAM estimate, because the current demand for this type of service appears to be relatively low.

Learn what people want and how they look for it

Keyword research can tell you what people want in which countries. All you need to know is a few broad terms related to your product.

For example, for plant-based products, you could just type in “plant-based, vegan” and then go to the Matching terms report to see the popularity of certain types of products. You can also see if the demand for these products has grown or fallen over the last three months.

A selection of keywords with growth metrics.

So, if you find that the demand for most vegan products has increased, you could assume that your TAM is going to expand in the near future because more people seem to be interested.

You can also use the tool to automatically translate these keywords and see what search terms people use to find the same products around the world and how popular they are.

Automatic keyword translations in Ahrefs.

And if you’re unsure what keywords people could use to find a product or service like yours, just use the AI suggestion feature.

Using AI in Keywords Explorer to find more ways people could look for a product or service online.

Learn from your competitors

By studying the keywords your competitors are targeting, you can uncover untapped niches or areas where demand is high but competition is lower.

For example, say you’re a SaaS company offering a project management tool. If you used Ahrefs’ Site Explorer, you would find that one of your competitors ranks for terms like “engineering project management software”. This could indicate a niche market with unique needs, where there’s considerable demand but less competition.

Using competitive keyword research to find new niches.

While you’re at it, go to the Organic Competitors tab to see who else competes for the same audience. Chances are, you may find some new potential competitors.

Identifying organic competitors to refine TAM.

Use S-1 filings and quarterly reports from public companies

Public companies’ quarterly reports (10-Q) and S-1/F-1 filings offer rich data for estimating TAM. They provide detailed breakdowns of revenue by product line, geographic region, and market segment, along with insights into market share and growth potential.

For example, if a company generates $500 million from a particular service and claims 10% of the market, you can estimate the TAM at $5 billion.

Both reports can also provide guidance on future growth trends, helping forecast your TAM’s evolution.
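As a quick sketch of that calculation, using the example figures above rather than numbers from any real filing:

```python
# TAM implied by a public company's disclosed revenue and claimed market share.
segment_revenue = 500_000_000   # revenue reported for the product line, $500M
claimed_share = 0.10            # market share the filing claims, 10%

tam = segment_revenue / claimed_share
print(f"Implied TAM: ${tam:,.0f}")   # Implied TAM: $5,000,000,000
```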

You can use AI like ChatGPT to analyze the documents for you (they can be quite complex). Here’s a sample analysis of an over 500-page F-1 filing by an Esports company.

AI used to analyze an S1 document.

Interview potential customers

While reports give you big numbers, talking to real people gives you the practical insights needed to adjust those estimates.

  • By speaking directly to customers, you can gauge whether they actually need your product and how likely they are to adopt it.
  • Interviews help you narrow down the customer segments most interested in your solution. Maybe not everyone is a fit, but if certain industries or company sizes show more interest, you can focus your TAM on those segments.
  • Asking customers what they’d actually pay for your product gives you real data. If you know what your target customers are willing to spend, you can multiply that by the number of similar customers to estimate your revenue potential and refine your TAM.

Use PitchBook for investment and market data

PitchBook offers broader market data and investment trends. It provides reliable information on market valuations, funding rounds, and industry growth, which helps you gauge the overall size and growth potential of a market.

PitchBook also helps identify key players, making it easier to estimate how much of the market is currently being captured and what remains untapped.

For example, based on Stripe’s post-money valuation of $152 billion and an assumed 30% market share, Stripe’s TAM would be approximately $506.67 billion (TAM = valuation / market share).

Example of PitchBook data useful for estimating TAM.

Other tools for SaaS companies

If you’re in SaaS, there are a couple more sources of data you may find especially useful: BuiltWith and Latka SaaS Database.

BuiltWith is a tool that shows you what technologies websites are using. This tool is great for identifying your ideal customer because you can see which companies use certain tools or platforms that align with your product.

Sidenote.

An ideal customer profile (ICP) is a detailed description of the type of company or person who would benefit most from your product or service. It’s helpful mostly for a bottom-up approach to calculating market size, as it helps you focus on the specific segments of the market that are most relevant to your business.

Enter a competitor into BuiltWith, and look for the list of their customers. For example, here are some of the sites that use Salesforce. You can sort the list by employees or traffic to find the size of the company you think you could get on board. 

Example of BuiltWith data useful for estimating TAM.

The next one is Latka SaaS Database. If you can’t find a SaaS company on PitchBook or BuiltWith, there’s a chance you will find it on Latka. It’s a SaaS-specific database that tracks metrics like revenue, customer growth, churn rates, and funding for thousands of companies.

Example of Latka's data useful for estimating TAM.

Knowing your competitors’ revenue and the number of customers they serve can help you better estimate the size of your potential market.

  • Use competitors’ ARPU or ACV (Annual Contract Value) to estimate your own future metrics.
  • Use the competitor’s revenue or valuation and apply a market share estimation to calculate TAM.

Final thoughts

Remember, TAM is ultimately an estimation. It’s natural to be slightly off, and you’ll probably need to reevaluate every year, after significant changes in the market or after introducing new products.

 

Generally, TAM calculations are not very accurate. At best, you’re relying on partially known variables (number of potential customers and average lifetime customer value). Industries also change so quickly that TAM calculations can become irrelevant within a matter of months.

James Oliver

What’s perhaps more important than the exact number is the methodology behind your TAM calculation. A well-thought-out approach demonstrates how seriously you take the business and the effort you’ve put into understanding the market.

Got questions or comments? Find me on LinkedIn.


9 Successful PR Campaign Examples, According to the Data


From Barbie-themed ketchup to exploding owl butts, these PR campaign examples prove that with the right data, timing, and a bit of creativity, you can win coverage and drive real, measurable results. 

In this post, you’ll see the data behind nine successful PR campaigns, and hopefully get some inspiration for your next press idea.

9 popular PR campaign examples

This list is a real mixed bag of PR examples – from newsjacking to content repurposing, exclusive research, and kooky brand stunts – but one thing they all have in common is measurable success.

In each section, I’ll do a post-mortem of campaign performance, share some analysis tactics, and round off with a couple of quick tips.

Sound good? Let’s jump in.

1. Heinz + Mattel “Barbiecue” PR campaign

Campaign 🍅👱🏼‍♀️🎀 Heinz Barbiecue
Brand(s) 🏷️ Heinz + Mattel
Links earned 🔗 62
Campaign type 📰 Newsjacking/brand collab/product release
Global search volume 🔎 600 for “barbie ketchup”
Search growth (YoY) 📈 200% for “barbie ketchup”

Back in August 2023, when Barbiecore was all the rage, Heinz teased a mockup of two Barbie-themed sauces: Kenchup and Barbiecue sauce.

Eight months later, for Barbie’s 65th anniversary in April 2024, Heinz and Mattel dropped the official Barbiecue special edition sauce.

A screenshot of Barbiecue launch from Heinz on TikTok highlighting video engagement

Heinz first conceived of the PR stunt to build intrigue around the product months before it hit the shelves, then used public response as a litmus test for its success.

According to their submission in the Shorty Awards, they carefully coordinated their initial “teaser” drop to coincide with an uptick in audience discussions, following the film’s release.

Heinz Shorty Awards quote on the timing of their Barbiecue PR campaign

To date, the Barbiecue PR campaign has earned Heinz 118 relevant mentions in top-tier media outlets like Bloomberg, Yahoo, CBS News, and The Standard, according to Content Explorer.

A screenshot of Ahrefs Content Explorer highlighting 118 mentions of Heinz Barbiecue

With zero dollars in paid promotion, it also generated 38 million organic social impressions and doubled average engagement rates.

Quick learnings

  • Hijack trending cultural “moments”
  • Time your PR campaign launch with peak online conversation
  • Use teaser PR to gauge consumer demand and fuel future R&D decisions 

Campaign 🛀 Saltbomb
Brand 🏷️ Lush
Links earned 🔗 142
Campaign type 📰 Newsjacking/product release
Global search volume 🔎 1.3K for “lush saltburn bath bomb”
Search growth (YoY) 📈 37K% for “lush saltburn bath bomb”

This is another great PR example of a brand capitalizing on a film, and waiting for post-event discussion to pick up before newsjacking.

Following a veerryy controversial bath scene in the film Saltburn, UK cosmetics retailer, Lush, jumped on the opportunity to insert their brand into a cultural moment.

In February 2024, three months after the film’s release, they released the “Saltbomb”, a special edition, Saltburn-themed bath bomb.

Parodying some of the film’s most risqué moments, Lush didn’t hold back with their product marketing.

A screenshot of Lush's Saltbomb product page

And we loved it.

The campaign led to 135 links, many coming from high DR (Domain Rating) publications – including Global News, the New York Times, Pop Sugar, and the BBC – driving real, tangible organic traffic.

Ahrefs Backlinks report showing 135 press links for Lush's Saltbomb newsjacking PR campaign

Press coverage actually went above and beyond this, because Lush’s products are part of a few publisher affiliate programs – but affiliate links are a little trickier to track.

Here’s an example of what I mean.

The site Allure wrote up a feature piece on the Lush bath bomb, but their affiliate link navigates to a third-party platform before redirecting to Lush’s product page.

A screenshot of an affiliate link from Allure to Lush

For that reason, the link doesn’t show up in Ahrefs’ Backlinks Report.

Ahrefs Backlinks report showing zero links from Allure to Lush

Instead, I found it by monitoring campaign-specific keywords in Content Explorer.

Ahrefs Content Explorer showing press coverage from Allure for Lush Saltburn Bath Bomb

Beyond press and affiliate publicity, the Lush PR campaign was a winner on social media.

The photography and product descriptions made it perfect for meme-ification, which added thousands of views and impressions.

A landing page on TikTok for "Saltbomb Lush" with high view-count videos highlighted

It also won big in search, with global keyword volume reaching 1.3K…

Ahrefs Keywords Explorer showing 1.3K Global Search Volume for "Lush Saltburn Bath Bomb"

And the product landing page earned up to 800 monthly organic visits in its first month.

Organic traffic in Site Explorer for Lush's Saltbomb product page

Traffic has remained steady since, averaging between 500 and 600 monthly visits despite the product having been archived – pretty good going for a bit of trendjacking.

Quick learnings

  • Scout for affiliate links – you won’t always know when a publisher plans to use an affiliate link, so searching for mentions of campaign keywords can help you find any affiliate coverage that has flown under the radar.
  • Think about how your brand and its tangential topics can tie into cultural moments. 

Campaign 👴🏻 Eclectic Grandpa
Brand 🏷️ Pinterest
Links earned 🔗 98
Campaign type 📑 Report
Global search volume 🔎 4.8K for “eclectic grandpa”
Search growth (YoY) 📈 215K% for “eclectic grandpa”

Every year, Pinterest taps into their internal platform search data to post their trend forecasts in what is known as “Pinterest Predicts”.

Posting on Pinterest For Business (the company’s commercial arm), they categorize related high-growth searches, and assign them novel trend names like “Eclectic grandpa”, “Bow stacking” or “Cafe core”.

A screenshot of Pinterest Predicts Eclectic Grandpa trend page

I took a look at the Site Structure report, and found that Pinterest’s most linked trend was in fact the “Eclectic Grandpa” which – in Pinterest’s words – is all about:

“Embracing ‘grandpa core’ and bringing eccentric and expressive elements for the ages to wardrobes. Think retro streetwear, chic cardigans and customised clothing. Because the coastal grandma aesthetic is so last year.”

To date, the trend has earned citations from 98 separate domains.

Ahrefs Site Explorer screenshot of Pinterest Business showing 98 links for trend "Eclectic Grandpa"

A look at the Backlinks report revealed coverage from Vogue, Elle, Who What Wear, New York Post, and Business Insider.

Ahrefs Backlinks report showing 121 press links for Pinterest's Eclectic Grandpa trend

And it didn’t end there. The “Eclectic Grandpa” gets about a bit, cropping up 340 times in the articles I discovered via Content Explorer.

Ahrefs Content Explorer showing Eclectic Grandpa mentions in top press publications

A considerable number of those DR 50+ mentions (150, to be precise) went unlinked based on Ahrefs’ Unlinked Mentions filter/export – links which could still be claimed by the Pinterest team.

Ahrefs Unlinked Mentions Export for Pinterest "Eclectic Grandpa" mentions

Given the far-reaching coverage, searches for “Eclectic Grandpa” keywords have shot up in the last year, growing to 4.8K global search volume (GSV).

By creating link magnet content, Pinterest has managed to drum up huge publicity – whether they pitched for it or not – making it a great example of a successful PR campaign.

Quick learnings

  • Mine company data to publish new, unseen trends and insights.
  • Come up with a unique name for self-discovered trends and/or theories so it’s easier to monitor uptake and keep track of press coverage.
  • Track mentions – not just links – and claim any unlinked mentions to enhance SEO and brand authority.

Real estate marketplace Zillow surveyed 1,815 homeowners and found that those with lower mortgage rates are twice as likely to stay put as to sell their home.

By creating firsthand research tackling an issue close to their audience’s heart, Zillow earned 235 backlinks from the likes of Bloomberg, Yahoo, FoxBusiness, and Money.com.

Ahrefs Backlinks report showing 235 press links for Zillow's Rate-locked Homeowners report

Sites referenced the survey for multiple reasons, not just quoting one stat but a whole variety, as evidenced in the anchor text of their backlinks.

Ahrefs Backlinks report showing anchor text variety for Zillow's Rate-locked Homeowners report

Quick learnings

  • Conduct your own surveys, asking questions which address a key problem in your industry, then quantitatively analyze the responses.
  • Tease out multiple hard-hitting stats to drive more coverage and link variety.

Campaign 🔮👨‍💻 Future of Work
Brand 🏷️ LinkedIn
Links earned 🔗 383
Campaign type 📑 Report
Global search volume 🔎 300 for “LinkedIn report”

LinkedIn tends to keep their data under lock and key, but in their Future of Work report they released proprietary insights on the growth of AI conversations on the platform, plus the impact of AI on careers.

A great example of exclusive PR, LinkedIn’s report made a splash, landing 383 links in Forbes, Microsoft, Harvard Business Review, and CNET.

Ahrefs Backlinks report showing 383 press links for LinkedIn's Future of Work report

Quick learnings

  • Think about what unseen or underground data you can harvest to generate exclusive research for your next PR campaign.
  • If you have internal data, analyze patterns and trends to carve out a totally unique angle.

Tip

Look at your internal site search data in GA4 (thanks to Mark Williams-Cook and Julius Fedorovicius for the tip!) or any other firsthand information at your disposal to mine insights. If you can’t make use of primary data, tap into third-party sources. PR expert Matt Seabridge routinely shares some great data sources and PR campaign examples on LinkedIn – I really recommend giving him a follow.

Personal finance company, WalletHub, compared the 150 largest metropolitan statistical areas, or MSAs, across 11 key metrics.

Combining primary data with third-party sources like the U.S. Census Bureau, GreatSchools.org, and Yelp, WalletHub created an interactive study ranking the most and least educated cities in America.

A screenshot of WalletHub’s Most and Least Educated Cities Study

This is an example of a PR campaign that doubles as great content marketing.

It snagged 604 unique backlinks from heavy hitters like Wikipedia, Forbes, Business Insider, Bloomberg, and Yahoo – as well as tons of state publications.

Ahrefs Backlinks report showing 604 press links for WalletHub’s Study

Location-based PR campaigns are an especially powerful form of PR, since they have both local and national appeal.

Here’s Tom Chivers, PR Expert and Founder of Sabot, explaining why localization really works for public relations campaigns – with a great additional point made by Co-Founder of Journo Finder, Veronica Fletcher.

Tom Chivers’ LinkedIn post on the power of localized PR campaigns

Quick learnings

  • Use superlatives in headlines (e.g. “Most”, “Least”, “Best”).
  • Embrace ranking formats – comparisons make readers want to click to see how they size up.
  • Slice and dice your data by location to get your campaign syndicated in both national and local publications.

Campaign 🍩 “Go USA” and “Passport to Paris” doughnuts
Brand(s) 🏷️ Krispy Kreme
Links earned 🔗 95
Campaign type 📰 Newsjacking/brand collab/product release
Global search volume 🔎 45K for “Olympics Krispy Kreme Doughnuts”
Search growth (YoY) 📈 4.4M% for “Olympics Krispy Kreme Doughnuts”

Krispy Kreme rode the wave of Olympic interest this year by developing two special edition doughnuts: “Go USA” and “Passport to Paris”.

As we’ve seen already, popular PR campaigns don’t always neatly track back to the sources you’d expect them to.

Krispy Kreme earned only 11 links to their USA doughnut press release, and 20 to their Paris doughnut launch announcement. Not exactly groundbreaking.

But when you filter for mentions of campaign keywords (e.g. “Go USA” and “Paris”) at the domain level, you find a whole lot more coverage: 95 links, to be precise, from major publications like Yahoo, USA Today, People, and the Food Network.

Ahrefs Backlinks report showing 86 press links for Krispy Kreme’s Olympic Doughnuts campaign

The special edition doughnuts also drive a cool 45K monthly searches, according to the Matching Terms report in Keywords Explorer.

Ahrefs Matching Terms report showing 45K search volume for Olympic Krispy Kreme doughnuts

Quick learnings

  • Capitalize on high demand around recurring events.
  • For campaigns that can’t be neatly tracked (e.g. no specific landing page or product page), pay closer attention to homepage or domain-level links through clever filtering.

Campaign 🍟👞 McDonald’s + Crocs Collaboration
Brand(s) 🏷️ Crocs + McDonald’s
Links earned 🔗 516
Campaign type 📰 Brand collab/product release
Global search volume 🔎 18K for “mcdonalds crocs”
Search growth (YoY) 📈 838% for “mcdonalds crocs”

This next PR example is a campaign of multiple parts. It began with a pair of McDonald’s-themed Crocs and has extended to a full-blown footwear collection…

A screenshot of the Crocs and McDonald’s product collection page

And a novelty product: a McDonald’s Happy Meal mini-Crocs keyring.

Press photo of the McDonald’s and Crocs Happy Meal toy

The coordinated PR campaign has generated huge awareness for both brands, but tracking all the fragmented assets is no mean feat.

To get a better idea of overall brand awareness, I opted instead to search for co-citations at the domain level.

Searching in the Backlinks report, I applied filters for each brand name in the other’s backlink profile.

A side-by-side comparison of press mentions for Crocs and McDonald’s

McDonald’s earned 260 links for “Crocs” related content, but Crocs was the real winner, landing 416 links for “McDonald’s” related press from media goliaths like Business Insider, Fast Company, and Entrepreneur.

From studying the campaign’s individual assets, I noticed something interesting: social posts can attract links in their own right.

Take, for instance, this UGC post by Instagram food account Snackolater. It landed 24 backlinks after sharing news of the Happy Meal mini-Croc launch.

Ahrefs Backlinks report showing 24 press links for an Instagram post on the McDonald’s and Crocs collab

It had never occurred to me to track social media posts for links, but you can never tell how a journalist is going to reference your campaign, so it’s worthwhile setting up a backlink alert for all your assets just in case!

The growth of brand searches is a real testament to the success of a PR campaign, and this collaboration definitely delivers on that front.

Audiences search for relevant McDonald’s + Crocs keywords a total of 37K times a month on average, based on data in Keywords Explorer.


Quick learnings

  • Sometimes, the “side” brand in a collaboration can snag more links. Keep that in mind for your next PR partnership.
  • With two brands there are double the assets to track, including product pages, press releases, landing pages, and various social posts – make sure you have visibility into the performance of every moving part so you can track public relations campaigns holistically.
  • Don’t forget to report on social posts, not just for impressions/engagement but for links.

Campaign ❎🍑 Do your lesson, no buts
Brand(s) 🏷️ Duolingo
Links earned 🔗 130
Campaign type 📽️ Advert
Global search volume 🔎 250 for “Duolingo commercial”
Search growth (YoY) 📈 376% for “Duolingo commercial”

Duolingo leaned into their weird yet wonderful brand of marketing with a hilarious Super Bowl ad featuring Duo, the brand’s menacing owl character.

In 5 (wild) seconds, we witness the explosion of Duo’s butt and the growth of a mini Duo in its place, accompanied by a reminder to do our Duolingo lesson.

The ad creative was repurposed from a widget design that went semi-viral – Duolingo knew it worked, so they built on it.

And in a stroke of coordinated PR genius, they simultaneously sent out a push notification to app users as soon as the ad went live.

A photo of a Duolingo reminder

 

“We decided to pair the ad with a coordinated push notification, which would hit learners’ phones right after the commercial aired, reinforcing the idea that Duo is always watching 👀.”

The YouTube commercial has earned 5M views and 130 links from Gizmodo, Lifehacker, and Indy100.

Ahrefs Backlinks report showing 118 press links for Duolingo’s Super Bowl advert

Plus 24M plays on TikTok.

Screenshot of Duolingo’s “no buts” PR campaign views on TikTok

The Duolingo team have written up a seriously funny play-by-play of the PR campaign – they talk about everything from the lengths they went to to get the right “shine” on Duo’s buttocks, to carefully selecting the perfect fart sound effect. I recommend reading it, for a giggle if nothing else.

Quick learnings

  • Upcycle owned content that has worked well in the past for your next PR campaign.
  • Try a mixed-message approach to really drive the point of your campaign home.

How I found these PR campaign examples (and you can too)

I spent a lot of time researching and combing through the data. There were so many awesome examples of PR, but I narrowed it down to the ones that drove either press mentions, links, search volume, traffic, or all of the above.

Final thoughts

The best PR campaigns aren’t just about links. They’re about creating conversations, driving awareness, and making a lasting impact on your audience.

Here’s a quick recap of some of the top takeaways:

  • Time it right: Launch your campaigns when conversations peak
  • Tap into unique data: Use exclusive insights to stand out
  • Track holistically: Monitor links, mentions, searches, and social
  • Rank and compare: Engage multiple audience “tribes” through rankings
  • Take a local angle: Analyze multiple locations to win more press
  • Collaborate creatively: Brand partnerships can amplify your reach
  • Repurpose winners: Turn successful content into new campaigns

Success is predicated on a campaign meeting its goal(s), and while we don’t know exactly what these brands set out to achieve, their campaigns have enjoyed results that most of us would be pretty happy with.

Hopefully they’ve given you some inspiration for your next project.

 


