
8 Free SEO Reporting Tools


There’s no shortage of SEO reporting tools to choose from—but what are the core tools you need to put together an SEO report?

In this article, I’ll share eight of my favorite SEO reporting tools to help you create a comprehensive SEO report for free.

1. Google Search Console

Price: Free

Google Search Console, often called GSC, is one of the most widely used tools to track important SEO metrics from Google Search.

Most common reporting use case

GSC has a ton of data to dive into, but the performance indicator most SEOs look at first is Clicks on the main Overview dashboard.

As the data comes from Google, SEOs consider it a good barometer for tracking organic search performance. Besides clicks, you can also track the following in the Performance report:

  • Total Impressions
  • Average CTR
  • Average Position

Tip

If you’ve signed up for AWT using Google Search Console, you can view your GSC performance data in Ahrefs by clicking “GSC Performance” from the main dashboard.

But for most SEO reporting, GSC clicks data is exported into a spreadsheet and turned into a chart to visualize year-over-year performance.
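If you prefer to script that step, here's a minimal sketch of the export-and-chart workflow in Python. It assumes you've downloaded the Performance report as a CSV with "Date" and "Clicks" columns; adjust the file and column names to match your own export.

```python
# Minimal sketch: chart GSC clicks year over year from a Performance report export.
# Assumes a CSV with "Date" and "Clicks" columns (adjust names to match your export).
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("Dates.csv", parse_dates=["Date"])
df["Year"] = df["Date"].dt.year
df["DayOfYear"] = df["Date"].dt.dayofyear

# One line per year makes it easy to compare this year's clicks with last year's.
for year, group in df.groupby("Year"):
    plt.plot(group["DayOfYear"], group["Clicks"], label=str(year))

plt.xlabel("Day of year")
plt.ylabel("Clicks")
plt.title("Organic clicks, year over year")
plt.legend()
plt.show()
```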


Favorite feature

One of my favorite reports in GSC is the Indexing report. It’s useful for SEO reporting because you can share the ratio of indexed to non-indexed pages in your report.


If the website has a lot of non-indexed pages, then it’s worth reviewing the pages to understand why they haven’t been indexed.

2. Google Looker Studio

Price: Free

Google Looker Studio (GLS), previously known as Google Data Studio (GDS), is a free tool that helps visualize data in shareable dashboards.

Most common reporting use case

Dashboards are an important part of SEO reporting, and GLS allows you to get a total view of search performance from multiple sources through its integrations.

Out of the box, GLS allows you to connect to many different data sources, such as:

  • Marketing products – Google Ads, Google Analytics, Display & Video 360, Search Ads 360
  • Consumer products – Google Sheets, YouTube, and Google Search Console
  • Databases – BigQuery, MySQL, and PostgreSQL
  • Social media platforms – Facebook, Reddit, and Twitter
  • Files – CSV file upload and Google Cloud Storage

Sidenote.

If you don’t have the time to create your own report manually, Ahrefs has three Google Looker Studio connectors that can help you create automated SEO reporting for any website in a few clicks.


Here’s what a dashboard in GLS looks like:

Ahrefs Google Looker Studio integration

With this type of dashboard, you can share easy-to-understand reports with clients or other stakeholders.

Favorite feature

The ability to blend and filter data from different sources, like GA and GSC, means you can build an overview of your total search performance tailored to your website.

3. Screaming Frog

Price: Free for 500 URLs

Screaming Frog is a website crawler that helps you audit your website.

Screaming Frog’s free version of its crawler is perfect if you want to run a quick audit on a bunch of URLs. The free version is limited to 500 URLs—making it ideal for crawling smaller websites.


Most common reporting use case

When it comes to reporting, the Reports menu in Screaming Frog SEO Spider has a wealth of information covering the technical aspects of your website, such as redirects, canonicals, pagination, hreflang, structured data, and more.

Once you’ve crawled your site, it’s just a matter of downloading the reports you need and working out the main issues to summarize in your SEO report.

Favorite feature

Screaming Frog can pull in data from other tools, including Ahrefs, using APIs. 

If you already have access to a few SEO tools’ APIs, you can pull data from all of them directly into Screaming Frog. This is useful if you want to combine crawl data with performance data or other third-party data.


Even if you’ve never configured an API, connecting other tools to Screaming Frog is straightforward.

4. Ahrefs’ free SEO tools

Price: Free

Ahrefs has a large selection of free SEO tools to help you at every stage of your SEO campaign, and many of these can be used to provide insights for your SEO reporting.


For example, you could use many of them to pull specific data points into your report.

Most common reporting use case

One of our most popular free SEO tools is Ahrefs Webmaster Tools (AWT), which you can use for your SEO reporting.

With AWT, you can:

  • Monitor your SEO health over time by setting up scheduled SEO audits
  • See the performance of your website
  • Check all known backlinks for your website

Favorite feature

Of all the Ahrefs free tools, my favorite is AWT. Within it, site auditing is my favorite feature—once you’ve set it up, it’s a completely hands-free way to keep track of your website’s technical performance and monitor its health.

If you already have access to Google Search Console, it’s a no-brainer to set up a free AWT account and schedule a technical crawl of your website(s).

5. Ahrefs’ SEO Toolbar

Price: Free

Ahrefs’ SEO Toolbar is a free Chrome and Firefox extension useful for diagnosing on-page technical issues and performing quick spot checks on your website’s pages.

Most common reporting use case

For SEO reporting, it’s useful to run an on-page check on your website’s top pages to ensure there aren’t any serious on-page issues.


With the free version, you get the following features:

  • On-page SEO report
  • Redirect tracer with HTTP Headers
  • Outgoing links report with link highlighter and broken link checker
  • SERP positions
  • Country changer for SERP

The SEO toolbar is excellent for spot-checking issues with pages on your website. If you are not confident with inspecting the code, it can also give you valuable pointers on what elements you need to include on your pages to make them search-friendly.

If anything is wrong with the page, the toolbar highlights it, with red indicating a critical issue.


Favorite feature

The section I use the most frequently in the SEO toolbar is the Indexability tab. In this section, you can see whether the page can be crawled and indexed by Google.


Although you can do this by inspecting the code manually, using the toolbar is much faster.

6. Google Analytics

Price: Free

Like GSC, Google Analytics (GA) is another tool you can use to track your website’s performance, including sessions, conversions, and much more.


Most common reporting use case

GA gives you a total view of website traffic from several different sources, such as direct, social, organic, paid traffic, and more.

Favorite feature

You can create and track up to 300 events and 30 conversions with GA4. Previously, with Universal Analytics, you could only track 20 conversions. This makes conversion and event tracking easier within GA4.

7. Google Slides

Price: Free

Google Slides is Google’s version of Microsoft PowerPoint. If you don’t have a dashboard set up to report on your SEO performance, the next best thing is to assemble a slide deck.

Many SEO agencies present their reports through dashboard insights and PowerPoint presentations. However, if you don’t have access to PowerPoint, then Google Slides is an excellent (free) alternative.


Most common reporting use cases

The most common use of Google Slides is to create a monthly SEO report. If you don’t know what to include in a monthly report, use our SEO report template.

Favorite feature

One of my favorite features is the ability to share your presentation on a video chat directly from Google Slides. You can do this by clicking the camera icon in the top right.


This is useful if you are working with remote clients and makes sharing your reports easy.

8. Google Trends

Price: Free

Google Trends allows you to view a keyword’s popularity over time in any country. The data shown is the relative popularity ratio scaled from 0-100, not the direct volume of search queries.

Most common reporting use cases

Google Trends is useful for showing how the popularity of certain searches can increase or decrease over time. If you work with a website that often has trending products, services, or news, it can be useful to illustrate this visually in your SEO report.

Google Trends makes it easy to spot seasonal trends for product categories. For example, people want to buy BBQs when the weather is sunny.

Using Google Trends, we can see that peak demand for BBQs usually happens in June-July every year.


Using this data across the last five years, we could be fairly sure when the BBQ season would start and end.
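If you want to pull that seasonality into a report programmatically, here's a minimal sketch using the unofficial pytrends library (not something the article itself relies on); the keyword, region, and timeframe are purely illustrative.

```python
# Minimal sketch: pull five years of Google Trends interest for "bbq" and find
# the months where demand usually peaks. Uses the unofficial pytrends library
# (pip install pytrends); keyword, region, and timeframe are illustrative.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)
pytrends.build_payload(["bbq"], timeframe="today 5-y", geo="GB")

interest = pytrends.interest_over_time()  # weekly relative interest, scaled 0-100

# Average the score by calendar month to see when BBQ season tends to start and end.
monthly = interest["bbq"].groupby(interest.index.month).mean()
print(monthly.sort_values(ascending=False).head(3))  # typically the summer months
```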

Favorite feature

Comparing two or more search terms against each other over time is one of my favorite uses of Google Trends, as the comparison often tells a story on its own.


Supplementing your report with trends data gives you further insight into the wider market.

You can even dig into trends at a regional level if you need to.


Final thoughts

These free tools will help you put together the foundations for a well-rounded SEO report.

The tools you use for SEO reporting don’t always have to be expensive—even large companies use many of the free tools mentioned here to create insights for their clients’ SEO reports.

Got more questions? Ping me on X 🙂


How Compression Can Be Used To Detect Low Quality Pages


Compression can be used by search engines to detect low-quality pages. Although not widely known, it's useful foundational knowledge for SEO.

The concept of Compressibility as a quality signal is not widely known, but SEOs should be aware of it. Search engines can use web page compressibility to identify duplicate pages, doorway pages with similar content, and pages with repetitive keywords, making it useful knowledge for SEO.

Although the following research paper demonstrates a successful use of on-page features for detecting spam, the deliberate lack of transparency by search engines makes it difficult to say with certainty if search engines are applying this or similar techniques.

What Is Compressibility?

In computing, compressibility refers to how much a file (data) can be reduced in size while retaining essential information, typically to maximize storage space or to allow more data to be transmitted over the Internet.

TL;DR Of Compression

Compression replaces repeated words and phrases with shorter references, reducing the file size by significant margins. Search engines typically compress indexed web pages to maximize storage space, reduce bandwidth, and improve retrieval speed, among other reasons.

This is a simplified explanation of how compression works:

  • Identify Patterns:
    A compression algorithm scans the text to find repeated words, patterns, and phrases.
  • Shorter Codes Take Up Less Space:
    The codes and symbols use less storage space than the original words and phrases, which results in a smaller file size.
  • Shorter References Use Fewer Bits:
    The “code” that essentially symbolizes the replaced words and phrases uses less data than the originals.

A bonus effect of using compression is that it can also be used to identify duplicate pages, doorway pages with similar content, and pages with repetitive keywords.

Research Paper About Detecting Spam

This research paper is significant because it was authored by distinguished computer scientists known for breakthroughs in AI, distributed computing, information retrieval, and other fields.

Marc Najork

One of the co-authors of the research paper is Marc Najork, a prominent research scientist who currently holds the title of Distinguished Research Scientist at Google DeepMind. He’s a co-author of the papers for TW-BERT, has contributed research for increasing the accuracy of using implicit user feedback like clicks, and worked on creating improved AI-based information retrieval (DSI++: Updating Transformer Memory with New Documents), among many other major breakthroughs in information retrieval.

Dennis Fetterly

Another of the co-authors is Dennis Fetterly, currently a software engineer at Google. He is listed as a co-inventor in a patent for a ranking algorithm that uses links, and is known for his research in distributed computing and information retrieval.

Those are just two of the distinguished researchers listed as co-authors of the 2006 Microsoft research paper about identifying spam through on-page content features. Among the several on-page content features the research paper analyzes is compressibility, which they discovered can be used as a classifier for indicating that a web page is spammy.

Detecting Spam Web Pages Through Content Analysis

Although the research paper was authored in 2006, its findings remain relevant today.

Then, as now, people attempted to rank hundreds or thousands of location-based web pages that were essentially duplicate content aside from city, region, or state names. Then, as now, SEOs often created web pages for search engines by excessively repeating keywords within titles, meta descriptions, headings, internal anchor text, and within the content to improve rankings.

Section 4.6 of the research paper explains:

“Some search engines give higher weight to pages containing the query keywords several times. For example, for a given query term, a page that contains it ten times may be higher ranked than a page that contains it only once. To take advantage of such engines, some spam pages replicate their content several times in an attempt to rank higher.”

The research paper explains that search engines compress web pages and use the compressed version to reference the original web page. They note that an excessive amount of redundant words results in a higher level of compressibility. So they set about testing whether there’s a correlation between a high level of compressibility and spam.

They write:

“Our approach in this section to locating redundant content within a page is to compress the page; to save space and disk time, search engines often compress web pages after indexing them, but before adding them to a page cache.

…We measure the redundancy of web pages by the compression ratio, the size of the uncompressed page divided by the size of the compressed page. We used GZIP …to compress pages, a fast and effective compression algorithm.”
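To make that ratio concrete, here is a minimal sketch of the same calculation using Python's gzip module; the two sample strings are made up for illustration and are not taken from the paper.

```python
# Minimal sketch of the compression ratio described above:
# uncompressed size divided by GZIP-compressed size.
import gzip

def compression_ratio(text: str) -> float:
    raw = text.encode("utf-8")
    return len(raw) / len(gzip.compress(raw))

# Made-up examples: a varied paragraph vs. a keyword-stuffed doorway page.
normal_page = (
    "Our plumbing team handles boiler repairs, leak detection, bathroom "
    "installations, and emergency callouts. Read customer reviews and book online."
)
spammy_page = "best plumber cheap plumber emergency plumber " * 200

print(round(compression_ratio(normal_page), 2))  # low ratio: little repetition
print(round(compression_ratio(spammy_page), 2))  # much higher ratio: repeated keywords
```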

High Compressibility Correlates To Spam

The results of the research showed that web pages with a compression ratio of at least 4.0 tended to be low-quality pages, i.e., spam. However, the trend became less consistent at the highest compression ratios because there were fewer data points, making those results harder to interpret.

Figure 9: Prevalence of spam relative to compressibility of page.

The researchers concluded:

“70% of all sampled pages with a compression ratio of at least 4.0 were judged to be spam.”

But they also discovered that using the compression ratio by itself still resulted in false positives, where non-spam pages were incorrectly identified as spam:

“The compression ratio heuristic described in Section 4.6 fared best, correctly identifying 660 (27.9%) of the spam pages in our collection, while misidentifying 2,068 (12.0%) of all judged pages.

Using all of the aforementioned features, the classification accuracy after the ten-fold cross validation process is encouraging:

95.4% of our judged pages were classified correctly, while 4.6% were classified incorrectly.

More specifically, for the spam class 1,940 out of the 2,364 pages were classified correctly. For the non-spam class, 14,440 out of the 14,804 pages were classified correctly. Consequently, 788 pages were classified incorrectly.”

The next section describes an interesting discovery about how to increase the accuracy of using on-page signals for identifying spam.

Insight Into Quality Rankings

The research paper examined multiple on-page signals, including compressibility. They discovered that each individual signal (classifier) was able to find some spam, but that relying on any one signal on its own resulted in flagging non-spam pages as spam, commonly referred to as false positives.

The researchers made an important discovery that everyone interested in SEO should know, which is that using multiple classifiers increased the accuracy of detecting spam and decreased the likelihood of false positives. Just as important, the compressibility signal only identifies one kind of spam but not the full range of spam.

The takeaway is that compressibility is a good way to identify one kind of spam, but there are other kinds of spam that aren’t caught with this one signal.

This is the part that every SEO and publisher should be aware of:

“In the previous section, we presented a number of heuristics for assaying spam web pages. That is, we measured several characteristics of web pages, and found ranges of those characteristics which correlated with a page being spam. Nevertheless, when used individually, no technique uncovers most of the spam in our data set without flagging many non-spam pages as spam.

For example, considering the compression ratio heuristic described in Section 4.6, one of our most promising methods, the average probability of spam for ratios of 4.2 and higher is 72%. But only about 1.5% of all pages fall in this range. This number is far below the 13.8% of spam pages that we identified in our data set.”

So, even though compressibility was one of the better signals for identifying spam, it still was unable to uncover the full range of spam within the dataset the researchers used to test the signals.

Combining Multiple Signals

The above results indicated that individual signals are less accurate when used on their own. So they tested using multiple signals. What they discovered was that combining multiple on-page signals for detecting spam resulted in a better accuracy rate, with fewer pages misclassified as spam.

The researchers explained that they tested the use of multiple signals:

“One way of combining our heuristic methods is to view the spam detection problem as a classification problem. In this case, we want to create a classification model (or classifier) which, given a web page, will use the page’s features jointly in order to (correctly, we hope) classify it in one of two classes: spam and non-spam.”
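As an illustration of that framing, here is a minimal sketch of a multi-signal classifier. It is not the paper's C4.5 model or data; it uses scikit-learn's decision tree and made-up feature values purely to show how several on-page signals can be combined into one spam/non-spam decision.

```python
# Minimal sketch of combining several on-page signals in one classifier.
# Not the paper's C4.5 setup or data: scikit-learn's decision tree and
# made-up feature values, purely to illustrate the idea.
from sklearn.tree import DecisionTreeClassifier

# Each row: [compression_ratio, avg_word_length, title_keyword_count]
X = [
    [1.8, 5.1, 1],   # typical page
    [2.1, 4.9, 2],   # typical page
    [4.6, 4.2, 9],   # keyword-stuffed doorway page
    [5.3, 4.0, 12],  # keyword-stuffed doorway page
]
y = [0, 0, 1, 1]     # 0 = non-spam, 1 = spam

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(clf.predict([[4.8, 4.1, 10]]))  # -> [1], flagged as likely spam
```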

These are their conclusions about using multiple signals:

“We have studied various aspects of content-based spam on the web using a real-world data set from the MSNSearch crawler. We have presented a number of heuristic methods for detecting content based spam. Some of our spam detection methods are more effective than others, however when used in isolation our methods may not identify all of the spam pages. For this reason, we combined our spam-detection methods to create a highly accurate C4.5 classifier. Our classifier can correctly identify 86.2% of all spam pages, while flagging very few legitimate pages as spam.”

Key Insight:

Misidentifying “very few legitimate pages as spam” was a significant breakthrough. The important insight that everyone involved with SEO should take away from this is that one signal by itself can result in false positives. Using multiple signals increases the accuracy.

What this means is that SEO tests of isolated ranking or quality signals will not yield reliable results that can be trusted for making strategy or business decisions.

Takeaways

We don’t know for certain whether search engines use compressibility, but it’s an easy-to-use signal that, combined with others, could be used to catch simple kinds of spam, like thousands of city-name doorway pages with similar content. Yet even if search engines don’t use this signal, it does show how easy it is to catch that kind of search engine manipulation, and that it’s something search engines are well able to handle today.

Here are the key points of this article to keep in mind:

  • Doorway pages with duplicate content are easy to catch because they compress at a higher ratio than normal web pages.
  • Groups of web pages with a compression ratio above 4.0 were predominantly spam.
  • Negative quality signals used by themselves to catch spam can lead to false positives.
  • In this particular test, they discovered that on-page negative quality signals only catch specific types of spam.
  • When used alone, the compressibility signal only catches redundancy-type spam, fails to detect other forms of spam, and leads to false positives.
  • Combining quality signals improves spam detection accuracy and reduces false positives.
  • Search engines today have a higher accuracy of spam detection with the use of AI like SpamBrain.

Read the research paper, which is linked from the Google Scholar page of Marc Najork:

Detecting spam web pages through content analysis

Featured Image by Shutterstock/pathdoc


New Google Trends SEO Documentation


Google publishes new documentation for how to use Google Trends for search marketing

Google Search Central published new documentation on Google Trends, explaining how to use it for search marketing. This guide serves as an easy-to-understand introduction for newcomers and a helpful refresher for experienced search marketers and publishers.

The new guide has six sections:

  1. About Google Trends
  2. Tutorial on monitoring trends
  3. How to do keyword research with the tool
  4. How to prioritize content with Trends data
  5. How to use Google Trends for competitor research
  6. How to use Google Trends for analyzing brand awareness and sentiment

The section about monitoring trends explains that there are two kinds of rising trends, general and specific, both of which can be useful for developing content to publish on a site.

Using the Explore tool, you can leave the search box empty and view the current rising trends worldwide, or use a drop-down menu to focus on trends in a specific country. Users can further filter rising trends by time period, category, and type of search. The results show rising trends by topic and by keyword.

To search for specific trends, users just need to enter their queries and then filter them by country, time, category, and type of search.

The section called Content Calendar describes how to use Google Trends to understand which content topics to prioritize.

Google explains:

“Google Trends can be helpful not only to get ideas on what to write, but also to prioritize when to publish it. To help you better prioritize which topics to focus on, try to find seasonal trends in the data. With that information, you can plan ahead to have high quality content available on your site a little before people are searching for it, so that when they do, your content is ready for them.”

Read the new Google Trends documentation:

Get started with Google Trends

Featured Image by Shutterstock/Luis Molinero


All the best things about Ahrefs Evolve 2024


Hey all, I’m Rebekah and I am your Chosen One to “do a blog post for Ahrefs Evolve 2024”.

What does that entail exactly? I don’t know. In fact, Sam Oh asked me yesterday what the title of this post would be. “Is it like…Ahrefs Evolve 2024: Recap of day 1 and day 2…?” 

Even as I nodded, I couldn’t get over how absolutely boring that sounded. So I’m going to do THIS instead: a curation of all the best things YOU loved about Ahrefs’ first conference, lifted directly from X.

Let’s go!

OUR HUGE SCREEN

CONFERENCE VENUE ITSELF

It was recently named the best new skyscraper in the world, by the way.

 

OUR AMAZING SPEAKER LINEUP – SUPER INFORMATIVE, USEFUL TALKS!

 

GREAT MUSIC

 

AMAZING GOODIES

 

SELFIE BATTLE

Some background: Tim and Sam have a challenge going on to see who can take the most selfies with all of you. Last I heard, Sam was winning – but there is room for a comeback yet!

 

THAT BELL

Everybody’s just waiting for this one.

 

STICKER WALL

AND, OF COURSE…ALL OF YOU!

 

There’s a TON more content on LinkedIn – click here – but I have limited time to get this post up and can’t quite figure out how to embed LinkedIn posts so…let’s stop here for now. I’ll keep updating as we go along!


