
Look At The Past Before You Fear The New Year


Hello, my dear fellow search marketer, and welcome to 2023.

It’s time to make some New Year’s resolutions, or at the very least, be prepared to make some changes for the new year.

Unlike my New York Jets, there is ample opportunity to drop the crappy “guru” you’ve hired, forecast out a budget (even in a recession), play with a new bid strategy, make memes about Performance Max/GA4 and give Bing (I still refuse to call it Microsoft Advertising) the fighting chance it deserves.

Also, don’t forget to migrate your Twitter ad budget to something actually stable.

So, let’s discuss what you should be doing now, what you went through in 2022, and what you need to do in 2023.


Think of this as a really nerdy and “snarkastic” visitation of three ghosts.

What Should You Be Doing Right Now?

It’s the beginning of 2023, so you’re running a bit late – but you can still make up for lost time.

Forecasting A 2023 Budget

You’ve seen how to forecast search budgets year after year: the old “determine impression share (IS) lost due to budget and add a 3%-5% increase in CPC, assuming the strategy stays the same” method.

Then the pandemic came along, and forecasting got a little iffier. Now, that method carries less weight.

The reality is, if you keep that approach, fine – it’s not the end of the world – but understand that cost per click (CPC), especially on brand terms, saw some obscene growth in 2022 (starting around April).

Why? There are a variety of theories, but for now, let’s just call it “inflation.”


If you keep the typical approach, expect to add anywhere from 10%-15% brand CPC growth YoY in Q1 and, likely, something more along the lines of 4%-7% growth on non-brand. This comes from our own in-house estimate – yours will vary.
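If it helps to see the math, here’s a minimal sketch of that classic forecast in Python, assuming you first reclaim the impression share lost to budget and then layer CPC growth on top. The function name and every input figure below are illustrative, not benchmarks.

```python
# A minimal sketch of the "IS lost to budget + CPC growth" forecast.
# All inputs are illustrative assumptions, not benchmarks.

def forecast_budget(last_year_spend: float,
                    is_lost_to_budget: float,
                    cpc_growth: float) -> float:
    """Scale last year's spend to recapture the impression share lost
    to budget, then inflate by the expected CPC growth."""
    full_coverage_spend = last_year_spend / (1.0 - is_lost_to_budget)
    return full_coverage_spend * (1.0 + cpc_growth)

# Example: $50k brand spend, 10% IS lost to budget, 12% brand CPC growth.
print(f"${forecast_budget(50_000, 0.10, 0.12):,.0f}")  # -> $62,222
```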

Next, the ugly elephant in the room – Performance Max – appears. It gets even more complicated if you migrated Smart Shopping over to Performance Max as well.

There are two ways to forecast this, and honestly, neither will be all that accurate or insightful – I apologize in advance.

  • Look at Google’s recommendation tool, see what it says for growth on a budget (because we all know it never says less), take 15%-25% off that growth level (kill off the buffer), and try that.
  • Or, gradually scale up 5%-10% from your current budget, assuming you hit budget caps consistently, while flexing up and down for seasonality.

As I said, neither option is great.
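If you want the arithmetic spelled out anyway, here’s a minimal sketch of both options; the percentages are the rules of thumb above, and the dollar figures are invented.

```python
# A minimal sketch of the two rough Performance Max forecasts above.
# Percentages are rules of thumb; the figures are invented.

def option_a(google_recommended_growth: float, buffer_cut: float = 0.20) -> float:
    """Google's recommended growth with 15%-25% of it shaved off."""
    return google_recommended_growth * (1.0 - buffer_cut)

def option_b(current_budget: float, scale: float = 0.075) -> float:
    """The current budget gradually scaled up by 5%-10%."""
    return current_budget * (1.0 + scale)

print(f"Option A: plan for {option_a(0.30):.0%} growth")   # Google says +30%
print(f"Option B: budget ${option_b(20_000):,.0f}/month")  # from $20k/month
```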

If you want to adjust your search strategy (not applicable for Performance Max), look at your IS lost to rank and work the fancy formula that PPC Hero posted a while back.

It’ll help you understand where your current strategy/bids are causing you to miss opportunities.

This is a good time to pace out your budget (if you’re like me, you have a planned budget to spend for literally every day of the year, which will vary based on anticipated demand).
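If you’ve never built one, a demand-weighted daily plan is simple to rough out; here’s a minimal sketch where the monthly weights are invented placeholders for your own seasonality data.

```python
# A minimal sketch of demand-weighted daily budget pacing.
# Monthly weights are invented; substitute your own seasonality data.
from datetime import date, timedelta

annual_budget = 365_000.0  # hypothetical yearly budget

# Hypothetical relative demand per month (1.0 = an average month).
monthly_demand = {1: 0.8, 2: 0.8, 3: 0.9, 4: 1.0, 5: 1.0, 6: 0.9,
                  7: 0.9, 8: 1.0, 9: 1.1, 10: 1.2, 11: 1.4, 12: 1.1}

days = [date(2023, 1, 1) + timedelta(days=i) for i in range(365)]
weights = [monthly_demand[d.month] for d in days]
total_weight = sum(weights)

daily_plan = {d: annual_budget * w / total_weight for d, w in zip(days, weights)}
print(f"{days[0]}: ${daily_plan[days[0]]:,.2f}")                        # quiet January day
print(f"{date(2023, 11, 24)}: ${daily_plan[date(2023, 11, 24)]:,.2f}")  # peak-season day
```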


Content Calendar/Seasonal Flighting Planning

Often this is not as applicable if you’re new to a piece of business, but it should 100% be part of your plan.

If you aren’t new to the business and you haven’t done this, then you are Mr. Wilson of the Jets and deserve to be benched.

Make sure you know your deals, seasonality for peaks and lows, and everything you want to do creatively and budget-wise.

It allows you to get all of your assets built way in advance, approved, and scheduled for deployment.

Screenshot from author, December 2022

Assessing What You Didn’t Do

Life and work get busy. This happens to all of us. Odds are, you had laid out some plans for 2022 that you could not execute.

Now is the time to determine what builds, testing, flighting plans, etc., you never got around to doing last year and reprioritize them to determine if you should try them out in 2023.

I like to use this thought process when doing that evaluation:


Was this for “fun” or a necessity (i.e., is this effort something that would’ve definitely made a business impact, or something just to try out to see if it could help or hurt)?

  • If it was a necessity, then I hope you have a good excuse for why it wasn’t done and put it on the books for 2023.
  • If it was for “fun,” file it away for a rainy day.

Was there a business implication (positive or negative) by not doing this?

  • If no, then no harm/no foul, and you can try it eventually.
  • If yes, then get it ready for 2023, and have a good explanation as to why it wasn’t done.

Consider what you’ve been through.

Much like dealing with your strange aunt/uncle who said something grossly inappropriate during the holidays, you need to sit down and process what did happen to your SEM campaigns in 2022.

This helps you decide if it was all good, all bad, or somewhere in between and what you need to consider carefully in 2023.

Look at both the big things and the small things.

Performance Max

If you migrated into Performance Max by choice or by force (anyone using Smart Shopping or local search), it likely had both negative and positive impacts on your year.

Negative: You literally have no idea when/where your ad is showing, and all you can think (and you’re probably right) is that Google has thrown some of your direct-to-consumer (DTC) funds away on a really bad Google Display Network placement.


At the same time, you have very little information, or ability to explain to your boss why Google has basically relaunched the SMB-targeted AdWords Express as a 2.0 version and just ruined your transparency.

Negative: You did the auto upgrade of a local campaign to Performance Max and discovered how many bugs there are, or you let Google create your YouTube video, and the music makes it far more cringe than you had hoped.

Positive: Especially for those running foot traffic campaigns, you’ve (hopefully) seen cost per store visit become somewhat more cost-efficient, and your ecommerce (for those running Smart Shopping) has seen an improvement in cost per action (CPA).

Positive: Performance Max is slowly becoming more reliable, and the ability to move it into other lead-driven verticals has become a real opportunity.

Google Analytics 4 (GA4)

I’ll go ahead and say what we’re all thinking (and it has been published multiple times already):

My god, this analytics platform was clearly made by someone with a vision who only interacts with barnyard animals, and not by someone who ran a user focus group.


If you somehow managed to survive the implementation of GA4, you’re now, more than likely, cursing it out over its lack of intuitiveness, or frustrated that it rolled out without a bounce rate, or even a conversion rate, until months later.

All is not lost, though; I highly recommend deploying it immediately (if you haven’t already) and running it concurrently with GA UA, so you can work out the kinks and learn the platform while accruing historical data.

You may feel like Google woke up and chose chaos with this platform, and you probably lost a few weeks of your life trying to understand it – so keep that in mind when you evaluate what you didn’t get around to doing in 2022.

Bing Multimedia Ads

You saw the hype for them in September, especially on the video side, and thought: Finally, Bing is getting into the video ad game.

But then you realized you needed a raw video file to upload, and saw how little it would actually rotate.

Big hopes, big opportunity, but just no volume.


Twitter

I know this article is SEM-focused, but I would be remiss if I didn’t address this, as it is still biddable media.

Every brand has different views on brand association, but if you have even a hint of brand safety concerns on GDN, MSAN, YouTube, etc., then do not advertise on Twitter until it gets itself straightened out.

Some of these changes in 2022 impacted you in different ways, good or bad.

The question is, can you learn from them, use them, and progress in 2023, with or without them?

What You Need to Do In 2023

I’ve done several of these “What to Expect in the New Year for SEM” articles over the years, but the last two of these could never have anticipated what is going on now… again.

With that being said, I will go with what I believe is mostly going to happen, and you can take it with a grain of salt:

  • The NY Jets will not make the big game – just accept it.
  • CPCs, especially in Q1, will be higher than any other Q1 on record (especially for brand terms), so be prepared to explain why, and for your money to become less cost-efficient.
  • There will not be a decline in demand/search volume until there is an increase in unemployment (à la the 2007-2009 recession), so be prepared to address the uptick in volume.
  • Google will become less transparent, somehow.
  • Bing will eventually do whatever Google does.
  • If you work with healthcare brands, prepare to get rid of GA UA quickly due to HIPAA compliance.
  • Absolutely most important: use first-party data as long as you can – but you need to get extremely good, fast, at building in-market audience segment groups, and go full Criminal Minds/FBI-profiler mentality on targeting.

Have I scared you yet? Good.

2023 will be a wild year in search, and you must be prepared for it.

But you cannot move forward until you evaluate and process the past. Once that is done, you can plan out the future.

Best of luck, search marketers. We’re all going to need it.

Featured Image: 3rdtimeluckystudio/Shutterstock






How Compression Can Be Used To Detect Low Quality Pages


The concept of Compressibility as a quality signal is not widely known, but SEOs should be aware of it. Search engines can use web page compressibility to identify duplicate pages, doorway pages with similar content, and pages with repetitive keywords, making it useful knowledge for SEO.

Although the following research paper demonstrates a successful use of on-page features for detecting spam, the deliberate lack of transparency by search engines makes it difficult to say with certainty if search engines are applying this or similar techniques.

What Is Compressibility?

In computing, compressibility refers to how much a file (data) can be reduced in size while retaining essential information, typically to maximize storage space or to allow more data to be transmitted over the Internet.

TL;DR Of Compression

Compression replaces repeated words and phrases with shorter references, reducing the file size by significant margins. Search engines typically compress indexed web pages to maximize storage space, reduce bandwidth, and improve retrieval speed, among other reasons.

This is a simplified explanation of how compression works:

  • Identify Patterns:
    A compression algorithm scans the text to find repeated words, patterns, and phrases.
  • Replace Them With Shorter Codes:
    The algorithm swaps each repeated word or phrase for a short code or symbol that stands in for it.
  • Shorter Codes Take Up Less Space:
    The codes use less storage space than the original words and phrases, which results in a smaller file size.
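To see the effect directly, here is a tiny demonstration using Python’s built-in zlib module (the DEFLATE algorithm, the same family GZIP uses); the sample strings are invented.

```python
# A tiny demonstration: repetitive text compresses far better than varied text.
import zlib

varied = ("The quick brown fox jumps over the lazy dog. "
          "Pack my box with five dozen liquor jugs. "
          "How vexingly quick daft zebras jump. ") * 5
repetitive = "buy best cheap widgets online best cheap widgets " * 12

for label, text in [("varied", varied), ("repetitive", repetitive)]:
    raw = text.encode("utf-8")
    packed = zlib.compress(raw)
    print(f"{label}: {len(raw)} -> {len(packed)} bytes "
          f"(ratio {len(raw) / len(packed):.1f})")
```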

A bonus effect of using compression is that it can also be used to identify duplicate pages, doorway pages with similar content, and pages with repetitive keywords.

Research Paper About Detecting Spam

This research paper is significant because it was authored by distinguished computer scientists known for breakthroughs in AI, distributed computing, information retrieval, and other fields.


Marc Najork

One of the co-authors of the research paper is Marc Najork, a prominent research scientist who currently holds the title of Distinguished Research Scientist at Google DeepMind. He’s a co-author of the papers for TW-BERT, has contributed research for increasing the accuracy of using implicit user feedback like clicks, and worked on creating improved AI-based information retrieval (DSI++: Updating Transformer Memory with New Documents), among many other major breakthroughs in information retrieval.

Dennis Fetterly

Another of the co-authors is Dennis Fetterly, currently a software engineer at Google. He is listed as a co-inventor in a patent for a ranking algorithm that uses links, and is known for his research in distributed computing and information retrieval.

Those are just two of the distinguished researchers listed as co-authors of the 2006 Microsoft research paper about identifying spam through on-page content features. Among the several on-page content features the research paper analyzes is compressibility, which they discovered can be used as a classifier for indicating that a web page is spammy.

Detecting Spam Web Pages Through Content Analysis

Although the research paper was authored in 2006, its findings remain relevant today.

Then, as now, people attempted to rank hundreds or thousands of location-based web pages that were essentially duplicate content aside from city, region, or state names. Then, as now, SEOs often created web pages for search engines by excessively repeating keywords within titles, meta descriptions, headings, internal anchor text, and within the content to improve rankings.

Section 4.6 of the research paper explains:


“Some search engines give higher weight to pages containing the query keywords several times. For example, for a given query term, a page that contains it ten times may be higher ranked than a page that contains it only once. To take advantage of such engines, some spam pages replicate their content several times in an attempt to rank higher.”

The research paper explains that search engines compress web pages and use the compressed version to reference the original web page. They note that excessive amounts of redundant words result in a higher level of compressibility. So they set about testing whether there’s a correlation between a high level of compressibility and spam.

They write:

“Our approach in this section to locating redundant content within a page is to compress the page; to save space and disk time, search engines often compress web pages after indexing them, but before adding them to a page cache.

…We measure the redundancy of web pages by the compression ratio, the size of the uncompressed page divided by the size of the compressed page. We used GZIP …to compress pages, a fast and effective compression algorithm.”

High Compressibility Correlates To Spam

The results of the research showed that web pages with a compression ratio of at least 4.0 tended to be low-quality web pages – spam. However, the highest rates of compressibility became less consistent because there were fewer data points, making the extremes harder to interpret.

Figure 9: Prevalence of spam relative to compressibility of page.

The researchers concluded:


“70% of all sampled pages with a compression ratio of at least 4.0 were judged to be spam.”
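The measurement itself is trivial to reproduce. Here’s a minimal sketch using GZIP, as the paper did; the sample pages are invented, and 4.0 is the paper’s threshold.

```python
# A minimal sketch of the paper's heuristic: compression ratio =
# uncompressed size / gzip-compressed size, flagging ratios of 4.0+.
import gzip

def compression_ratio(html: str) -> float:
    raw = html.encode("utf-8")
    return len(raw) / len(gzip.compress(raw))

normal_page = ("<p>Our store opened in 2004.</p>"
               "<p>We ship orders within two business days.</p>"
               "<p>Returns are accepted for thirty days.</p>")
stuffed_page = "<p>best cheap widgets, buy best cheap widgets today</p>" * 200

for name, page in [("normal", normal_page), ("stuffed", stuffed_page)]:
    ratio = compression_ratio(page)
    verdict = "possible spam" if ratio >= 4.0 else "looks fine"
    print(f"{name}: ratio {ratio:.1f} -> {verdict}")
```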

But they also discovered that using the compression ratio by itself still resulted in false positives, where non-spam pages were incorrectly identified as spam:

“The compression ratio heuristic described in Section 4.6 fared best, correctly identifying 660 (27.9%) of the spam pages in our collection, while misidentifying 2,068 (12.0%) of all judged pages.

Using all of the aforementioned features, the classification accuracy after the ten-fold cross validation process is encouraging:

95.4% of our judged pages were classified correctly, while 4.6% were classified incorrectly.

More specifically, for the spam class 1,940 out of the 2,364 pages were classified correctly. For the non-spam class, 14,440 out of the 14,804 pages were classified correctly. Consequently, 788 pages were classified incorrectly.”

The next section describes an interesting discovery about how to increase the accuracy of using on-page signals for identifying spam.

Insight Into Quality Rankings

The research paper examined multiple on-page signals, including compressibility. They discovered that each individual signal (classifier) was able to find some spam, but that relying on any one signal on its own resulted in flagging non-spam pages as spam, commonly referred to as false positives.


The researchers made an important discovery that everyone interested in SEO should know, which is that using multiple classifiers increased the accuracy of detecting spam and decreased the likelihood of false positives. Just as important, the compressibility signal only identifies one kind of spam but not the full range of spam.

The takeaway is that compressibility is a good way to identify one kind of spam, but there are other kinds of spam that aren’t caught with this one signal.

This is the part that every SEO and publisher should be aware of:

“In the previous section, we presented a number of heuristics for assaying spam web pages. That is, we measured several characteristics of web pages, and found ranges of those characteristics which correlated with a page being spam. Nevertheless, when used individually, no technique uncovers most of the spam in our data set without flagging many non-spam pages as spam.

For example, considering the compression ratio heuristic described in Section 4.6, one of our most promising methods, the average probability of spam for ratios of 4.2 and higher is 72%. But only about 1.5% of all pages fall in this range. This number is far below the 13.8% of spam pages that we identified in our data set.”

So, even though compressibility was one of the better signals for identifying spam, it still was unable to uncover the full range of spam within the dataset the researchers used to test the signals.

Combining Multiple Signals

The above results indicated that individual signals of low quality are less accurate. So they tested using multiple signals. What they discovered was that combining multiple on-page signals for detecting spam resulted in a better accuracy rate, with fewer pages misclassified as spam.


The researchers explained that they tested the use of multiple signals:

“One way of combining our heuristic methods is to view the spam detection problem as a classification problem. In this case, we want to create a classification model (or classifier) which, given a web page, will use the page’s features jointly in order to (correctly, we hope) classify it in one of two classes: spam and non-spam.”
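As an illustration of that framing (not the researchers’ actual model or data), here is a sketch using scikit-learn’s DecisionTreeClassifier as a stand-in for the C4.5 decision tree they mention later; every feature and training row below is invented.

```python
# A sketch of combining several on-page signals in one classifier.
# DecisionTreeClassifier stands in for C4.5; all data is invented.
from sklearn.tree import DecisionTreeClassifier

# Each row: [compression_ratio, avg_word_length, title_word_fraction]
X = [
    [2.1, 5.2, 0.01],  # typical page
    [2.4, 4.9, 0.02],  # typical page
    [4.6, 4.1, 0.15],  # highly redundant page
    [5.3, 3.8, 0.22],  # highly redundant page
]
y = [0, 0, 1, 1]  # 0 = non-spam, 1 = spam

clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(clf.predict([[4.8, 4.0, 0.18]]))  # -> [1], judged jointly on all features
```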

These are their conclusions about using multiple signals:

“We have studied various aspects of content-based spam on the web using a real-world data set from the MSNSearch crawler. We have presented a number of heuristic methods for detecting content based spam. Some of our spam detection methods are more effective than others, however when used in isolation our methods may not identify all of the spam pages. For this reason, we combined our spam-detection methods to create a highly accurate C4.5 classifier. Our classifier can correctly identify 86.2% of all spam pages, while flagging very few legitimate pages as spam.”

Key Insight:

Misidentifying “very few legitimate pages as spam” was a significant breakthrough. The important insight that everyone involved with SEO should take away from this is that one signal by itself can result in false positives. Using multiple signals increases the accuracy.

What this means is that SEO tests of isolated ranking or quality signals will not yield reliable results that can be trusted for making strategy or business decisions.

Takeaways

We don’t know for certain whether search engines use compressibility, but it’s an easy-to-use signal that, combined with others, could be used to catch simple kinds of spam, like thousands of city-name doorway pages with similar content. Yet even if search engines don’t use this signal, it shows how easy it is to catch that kind of search engine manipulation, and that it’s something search engines are well able to handle today.

Here are the key points of this article to keep in mind:

  • Doorway pages with duplicate content are easy to catch because they compress at a higher ratio than normal web pages.
  • Groups of web pages with a compression ratio above 4.0 were predominantly spam.
  • Negative quality signals used by themselves to catch spam can lead to false positives.
  • In this particular test, they discovered that on-page negative quality signals only catch specific types of spam.
  • When used alone, the compressibility signal only catches redundancy-type spam, fails to detect other forms of spam, and leads to false positives.
  • Combining quality signals improves spam detection accuracy and reduces false positives.
  • Search engines today detect spam with higher accuracy through the use of AI, like SpamBrain.

Read the research paper, which is linked from the Google Scholar page of Marc Najork:

Detecting spam web pages through content analysis

Featured Image by Shutterstock/pathdoc



New Google Trends SEO Documentation


Google Search Central published new documentation on Google Trends, explaining how to use it for search marketing. This guide serves as an easy-to-understand introduction for newcomers and a helpful refresher for experienced search marketers and publishers.

The new guide has six sections:

  1. About Google Trends
  2. Tutorial on monitoring trends
  3. How to do keyword research with the tool
  4. How to prioritize content with Trends data
  5. How to use Google Trends for competitor research
  6. How to use Google Trends for analyzing brand awareness and sentiment

The section about monitoring trends advises that there are two kinds of rising trends, general and specific, which can be useful for developing content to publish on a site.

Using the Explore tool, you can leave the search box empty and view the current rising trends worldwide, or use a drop-down menu to focus on trends in a specific country. Users can further filter rising trends by time period, category, and type of search. The results show rising trends by topic and by keyword.

To search for specific trends, users just need to enter their queries and then filter them by country, time, category, and type of search.
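The documentation covers the web interface only, but if you want similar data programmatically, the unofficial pytrends library (a third-party package, not a Google product) exposes a comparable workflow; a sketch, assuming pytrends is installed:

```python
# A sketch of the Explore workflow via the unofficial pytrends library.
# Install first: pip install pytrends
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)
pytrends.build_payload(["running shoes"], timeframe="today 12-m", geo="US")

# Rising related queries, similar to the rising-trends view in the UI.
related = pytrends.related_queries()
print(related["running shoes"]["rising"].head())
```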

The section called Content Calendar describes how to use Google Trends to understand which content topics to prioritize.


Google explains:

“Google Trends can be helpful not only to get ideas on what to write, but also to prioritize when to publish it. To help you better prioritize which topics to focus on, try to find seasonal trends in the data. With that information, you can plan ahead to have high quality content available on your site a little before people are searching for it, so that when they do, your content is ready for them.”

Read the new Google Trends documentation:

Get started with Google Trends

Featured Image by Shutterstock/Luis Molinero



All the best things about Ahrefs Evolve 2024


Hey all, I’m Rebekah and I am your Chosen One to “do a blog post for Ahrefs Evolve 2024”.

What does that entail exactly? I don’t know. In fact, Sam Oh asked me yesterday what the title of this post would be. “Is it like…Ahrefs Evolve 2024: Recap of day 1 and day 2…?” 

Even as I nodded, I couldn’t get over how absolutely boring that sounded. So I’m going to do THIS instead: a curation of all the best things YOU loved about Ahrefs’ first conference, lifted directly from X.

Let’s go!

OUR HUGE SCREEN

CONFERENCE VENUE ITSELF

It was recently named the best new skyscraper in the world, by the way.

 

OUR AMAZING SPEAKER LINEUP – SUPER INFORMATIVE, USEFUL TALKS!

 


GREAT MUSIC

 

AMAZING GOODIES

 

SELFIE BATTLE

Some background: Tim and Sam have a challenge going on to see who can take the most selfies with all of you. Last I heard, Sam was winning – but there is room for a comeback yet!

 

THAT BELL

Everybody’s just waiting for this one.

 

STICKER WALL

AND, OF COURSE…ALL OF YOU!

 


There’s a TON more content on LinkedIn – click here – but I have limited time to get this post up and can’t quite figure out how to embed LinkedIn posts so…let’s stop here for now. I’ll keep updating as we go along!


