Screaming Frog SEO Spider Version 20.0: AI-Powered Features

What’s New with Screaming Frog SEO Spider 20.0?

For SEO experts, our toolkit is crucial. It’s how we make sure we can quickly and effectively assess how well our websites are performing. Using the best tools can put you way ahead of other SEOs. One example (and one tool I’ve personally been using for years) is Screaming Frog. It’s a powerful, straightforward, and insightful website crawler that’s indispensable for finding technical issues on your website.

And the good news is that it keeps getting better. Screaming Frog just released its 20th major version of the software, which includes new features based on feedback from SEO professionals.

Here are the main updates:

  1. Custom JavaScript Snippets
  2. Mobile Usability
  3. N-Grams Analysis
  4. Aggregated Anchor Text
  5. Carbon Footprint & Rating

Custom JavaScript Snippets

One of the standout features in this release is the ability to execute custom JavaScript snippets during a crawl. This functionality expands the horizons for data manipulation and API communication, offering unprecedented flexibility.

Use Cases:

  • Data Extraction and Manipulation: Gather specific data points or modify the DOM to suit your needs.
  • API Communication: Integrate with APIs like OpenAI’s ChatGPT from within the SEO Spider.

Setting Up Custom JS Snippets:

  • Navigate to `Config > Custom > Custom JavaScript`.
  • Click ‘Add’ to create a new snippet or ‘Add from Library’ to select from preset snippets.

Setting up custom JS snippets in Screaming Frog 20

  • Ensure JavaScript rendering mode is set via `Config > Spider > Rendering`.
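
To get a feel for what a snippet can do, here is a minimal, hypothetical sketch that collects every H1 on a page using only standard DOM calls. How a snippet hands its result back to the SEO Spider is defined by the tool’s own snippet API, so check the preset library snippets for the exact return pattern; a mock `document` stands in below so the sketch runs outside a browser:

```javascript
// Hypothetical snippet logic: collect the text of every H1 on the page.
// In a real crawl this would run against the live page's `document`.
function collectH1Text(doc) {
  return Array.from(doc.querySelectorAll('h1'))
    .map((el) => el.textContent.trim())
    .filter((text) => text.length > 0);
}

// Minimal stand-in for a browser document, so the sketch is runnable:
const mockDocument = {
  querySelectorAll: (selector) =>
    selector === 'h1'
      ? [{ textContent: ' Welcome ' }, { textContent: 'Latest News' }]
      : [],
};

console.log(collectH1Text(mockDocument)); // → [ 'Welcome', 'Latest News' ]
```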

Crawl with ChatGPT:

  • Leverage the `(ChatGPT) Template` snippet, add your OpenAI API key and tailor the prompt to your needs.
  • Follow our tutorial on ‘How To Crawl With ChatGPT’ for more detailed guidance.

Sharing Your Snippets:

  • Export/import snippet libraries as JSON files to share with colleagues.
  • Remember to remove sensitive data such as API keys before sharing.

Introducing Custom JavaScript Snippets to Screaming Frog SEO Spider version 20.0 significantly enhances the tool’s flexibility and power. Whether you’re generating dynamic content, interacting with external APIs, or conducting complex page manipulations, these snippets open a world of possibilities. 

Mobile Usability

In today’s mobile-first world, ensuring a seamless mobile user experience is imperative. Version 20.0 introduces extensive mobile usability audits through Lighthouse integration. 

With an ever-increasing number of users accessing websites via mobile devices, ensuring a seamless mobile experience is crucial. Google’s mobile-first indexing highlights the importance of mobile usability, which directly impacts your site’s rankings and user experience.

Mobile Usability Features:

  • New Mobile Tab: This tab includes filters for regular mobile usability issues such as viewport settings, tap target sizes, content sizing, and more.
  • Granular Issue Details: Detailed data on mobile usability issues can be explored in the ‘Lighthouse Details’ tab.
  • Bulk Export Capability: Export comprehensive mobile usability reports via `Reports > Mobile`.

Setup:

  • Connect to the PSI API through `Config > API Access > PSI` or run Lighthouse locally.

Example Use Cases:

  • Identify pages where content does not fit within the viewport.
  • Flag and correct small tap targets and illegible font sizes.

Mobile usability analysis in Screaming Frog 20

With these new features, Screaming Frog SEO Spider version 20.0 streamlines the process of auditing mobile usability, making it more efficient and comprehensive. By integrating with Google Lighthouse, both via the PSI API and local runs, the tool provides extensive insights into the mobile performance of your website. Addressing these issues not only enhances user experience but also improves your site’s SEO performance.

N-grams Analysis

N-grams analysis is a powerful new feature that allows users to analyze phrase frequency across web pages. This can greatly enhance on-page SEO efforts and internal linking strategies.

Setting Up N-grams:

  • Activate HTML storage by enabling ‘Store HTML’ or ‘Store Rendered HTML’ under `Config > Spider > Extraction`.
  • View the N-grams in the lower N-grams tab.

N-grams analysis in Screaming Frog 20

Example Use Cases:

  • Improving Keyword Usage: Adjust content based on the frequency of targeted N-grams.
  • Optimizing Internal Links: Use N-grams to identify unlinked keywords and create new internal links.
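
Conceptually, counting n-grams boils down to sliding a window of n words across the page text and tallying each phrase. This is a simplified sketch of the idea, not Screaming Frog’s implementation:

```javascript
// Count every n-word phrase (n-gram) in a block of text.
function ngramCounts(text, n) {
  const words = text.toLowerCase().match(/[a-z0-9']+/g) || [];
  const counts = new Map();
  for (let i = 0; i + n <= words.length; i++) {
    const gram = words.slice(i, i + n).join(' ');
    counts.set(gram, (counts.get(gram) || 0) + 1);
  }
  return counts;
}

const counts = ngramCounts('link building tips and more link building tips', 2);
console.log(counts.get('link building')); // → 2
console.log(counts.get('building tips')); // → 2
```

A phrase that occurs often on a page but is never used as anchor text elsewhere on the site is a natural candidate for a new internal link.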

Internal Linking Opportunities:

The N-grams feature provides a nuanced method for discovering internal linking opportunities, which can significantly enhance your SEO strategy and site navigation.

The introduction of N-grams analysis in Screaming Frog SEO Spider version 20 provides a tool for deep content analysis and optimization. By understanding the frequency and distribution of phrases within your content, you can significantly improve your on-page SEO and internal linking strategies.

Aggregated Anchor Text

Effective anchor text management is essential for internal linking and overall SEO performance. The aggregated anchor text feature in version 20.0 provides clear insights into how anchor texts are used across your site.

Using Aggregated Anchor Text:

  • Navigate to the ‘Inlinks’ or ‘Outlinks’ tab.
  • Utilize the new ‘Anchors’ filters to see aggregated views of anchor text usage.

Aggregated anchor text report in Screaming Frog 20

Practical Benefits:

  • Anchor Text Diversity: Ensure a natural distribution of anchor texts to avoid over-optimization.
  • Descriptive Linking: Replace generic texts like “click here” with keyword-rich alternatives.
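
The aggregation itself is easy to picture: group inlinks by normalized anchor text and count them. A rough sketch of the idea (the link objects here are hypothetical, not the SEO Spider’s export format):

```javascript
// Tally how often each anchor text is used across a set of inlinks.
function aggregateAnchors(links) {
  const totals = {};
  for (const { anchor } of links) {
    const key = anchor.trim().toLowerCase(); // normalize for grouping
    totals[key] = (totals[key] || 0) + 1;
  }
  return totals;
}

const inlinks = [
  { from: '/blog/a', anchor: 'click here' },
  { from: '/blog/b', anchor: 'Click here' },
  { from: '/blog/c', anchor: 'seo audit guide' },
];
console.log(aggregateAnchors(inlinks));
// → { 'click here': 2, 'seo audit guide': 1 }
```

A view like this makes it obvious when a generic phrase dominates your internal links.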

The aggregated anchor text feature provides powerful insights into your internal link structure and optimization opportunities. This feature is essential if you are looking to enhance your site’s internal linking strategy for better keyword relevance, user experience, and search engine performance.

Carbon Footprint & Rating

Aligning with digital sustainability trends, Screaming Frog SEO Spider version 20.0 includes features to measure and optimize your website’s carbon footprint.

Key Features:

  • Automatic CO2 Calculation: The SEO Spider now calculates carbon emissions for each page using the CO2.js library.
  • Carbon Rating: Each URL receives a rating based on its emissions, derived from the Sustainable Web Design Model.
  • High Carbon Rating Identification: Pages with high emissions are flagged in the ‘Validation’ tab.

Practical Applications:

  • Resource Optimization: Identify and optimize high-emission resources.
  • Sustainable Practices: Implement changes such as compressing images, reducing script sizes, and using green hosting solutions.
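
For a sense of how per-page emissions can be estimated from transfer size, here is a rough sketch in the style of the Sustainable Web Design Model that CO2.js implements. The two coefficients are illustrative assumptions for this sketch; CO2.js encodes the model’s actual figures:

```javascript
// Assumed coefficients for illustration only – see CO2.js for real values.
const KWH_PER_GB = 0.81;        // assumed energy used per GB transferred
const GRID_G_CO2_PER_KWH = 442; // assumed grid carbon intensity (gCO2/kWh)

// Estimate grams of CO2 emitted to serve a page of a given byte size.
function estimateGramsCO2(pageBytes) {
  const gb = pageBytes / 1e9;
  return gb * KWH_PER_GB * GRID_G_CO2_PER_KWH;
}

// A 2 MB page emits roughly 4x what a 500 KB page does – emissions
// scale linearly with bytes transferred in this model.
console.log(estimateGramsCO2(2_000_000) / estimateGramsCO2(500_000));
```

The takeaway is the linearity: every byte you shave off a page shaves emissions in proportion.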

The integration of carbon footprint calculations in Screaming Frog SEO Spider signifies a growing recognition of digital sustainability. As more businesses adopt these practices, we can collectively reduce the environmental impact of the web while driving performance and user satisfaction.

Other Updates

In addition to major features, version 20.0 includes numerous smaller updates and bug fixes that enhance functionality and user experience.

Rich Result Validation Enhancements:

  • Google Rich Result validation errors are now split from Schema.org validation errors.
  • New filters and columns provide detailed insights into rich result triggers and errors.

Enhanced File Types and Filters:

  • Internal and external filters include new file types such as Media, Fonts, and XML.

Website Archiving:

  • A new option to archive entire websites during a crawl is available under `Config > Spider > Rendering > JS`.

Viewport and Screenshot Configuration:

  • Customize viewport and screenshot sizes to fit different audit needs.

API Auto Connect:

  • Automatically connect APIs on start, making the setup process more seamless.

Resource Over 15MB Filter:

  • A new validation filter flags resources over 15MB, which is crucial for performance optimization.

Page Text Export:

  • Export all visible page text through the new `Bulk Export > Web > All Page Text` option.

Lighthouse Details Tab:

  • The ‘PageSpeed Details’ tab has been renamed ‘Lighthouse Details’ to reflect its expanded role.

HTML Content Type Configuration:

  • An ‘Assume Pages are HTML’ option helps accurately classify pages without explicit content types.

Bug Fixes and Performance Improvements:

  • Numerous small updates and fixes enhance stability and reliability. 

Screaming Frog SEO Spider version 20.0 is a comprehensive update packed with innovative features and enhancements that cater to the evolving needs of SEO professionals like us. From advanced data extraction capabilities with Custom JavaScript Snippets to environmental sustainability with Carbon Footprint and Rating, this release sets a new benchmark in SEO auditing tools.

Key Takeaway

Add this to your toolbox, or update to version 20 to explore the rich array of new features from Screaming Frog to optimize your website’s SEO, usability, and sustainability. It’s a no-fuss tool with tons of features that will help you stay ahead of your competitors, and ensure your websites perform optimally in terms of user experience and search engine visibility.

Why Search (And The User) Is Still Important To SEO


Throughout every technological change, the one constant always seems to be people calling for the death of SEO and search engines.

While pundits have been quick to call for the death of SEO, SEO itself has been all too reluctant to die. This article will look at how SEO evolves and why that makes it even more important.

Sure, we could just spout some facts about how most online purchases begin with a search and how a majority of online sessions include search – but there is a much bigger case to be made.

To fully grasp the importance of SEO and search, we first need to go back and understand both user intent (why people search) and how search engines have changed.

SEO Isn’t Dead

The “SEO is dead” articles always seem to follow a change to search that makes information easier to access for consumers. We saw it with featured snippets, we saw it with instant answers, and we’re seeing it again with AI.

We’ve also seen the “death of SEO” articles pop up around new and emerging social media platforms like Meta, TikTok, and X – but the fact remains that overall web searches on search engines have continued to increase every year for more than a decade.

Search isn’t dying, and new social networks or technology like AI aren’t cutting into search – they’re just making people search more. Search is becoming ingrained in (if not defining) our everyday online behavior.

While often associated, SEO is more than just building links or tricking search engines with spammy tactics. That stuff can work – temporarily – but not long-term for a real business or a brand. Sustained SEO growth needs to focus on more than keywords and tricks.

From Keywords To Intent

There’s a great quote from Bill Gates back in 2009 where he said “the future of search is verbs.”

This quote really summarizes the heart of “why” people search. People are searching to accomplish a task or do something.

It’s important that we consider this search intent when evaluating SEO and search. Not all searchers want websites. In the early days of search, links to websites were the best thing we had.

Today, however, search engines and AI are getting better at answering common questions.

For a search like [how old is taylor swift] or [when is the NHL trade deadline?] users just want an answer – without having to click over to a website, accept the cookie consent notice, close the alert popup, decline to subscribe to the newsletter, stop the auto-play video ad, and scroll past three irrelevant paragraphs to get the answer.

If creating thin ad-rich pages to answer public domain questions was your idea of SEO, then yes SEO is dead – however, SEO is much more than that now.

SEO Is Marketing

When many say search and SEO are dying, those factoid searches are the SEO they’re talking about – but there’s an entire section of search that’s thriving: The verbs!

This shift makes SEO even more important because search is no longer about the word the user typed and is all about doing actual marketing.

SEOs can help understand user intents and personas.

A good SEO professional can help you understand not only what users are searching for but “why” they’re searching – and then help marketers build something that meets the users’ needs.

Just as search engines have evolved, so, too, has SEO.

The days of keyword density and meta tags are gone. Search engines don’t really work like that anymore.

They’ve moved on to a semantic model that uses vectors to try to understand meaning – and marketers would do well to make the same moves by understanding their user’s intent.

Evolution Of The Consumer Journey

We typically think of the consumer journey as a funnel – but that funnel in every business school textbook doesn’t really exist. Today’s consumer journey is more like one of those crazy straws you got in a cereal box as a kid, with lots of bends and loops and turns in it.

Consumers are searching more than ever across multiple devices, platforms, networks, apps, and websites. This spread-out user behavior makes having an experienced SEO pro even more important.

It’s not just about getting the right words on the page anymore, and understanding user intent isn’t enough – we also have to understand where our users are acting on each of those intents.

Technical Still Matters, Too

Despite many platforms and frameworks claiming to be SEO-friendly, technical SEO issues and opportunities still remain abundant.

Most of today’s most popular website frameworks aren’t very SEO-friendly out of the box and still require customization and tweaking to really drive results.

There still isn’t a one-size-fits-all solution, and I’m not sure there ever will be.

A good SEO will help you ensure that there aren’t confusing duplicate versions of pages, that the pages you want to be seen are all easily understood by search engines, and that your re-design or re-platform won’t hurt your existing traffic.

So Why Is Search Still Important?

Search is important because users are important.

Sure, users are going to different platforms or using apps/AI – but those things are still technically a search and we still need to make sure that they’re surfacing our brands/products.

It doesn’t matter if the user is typing into a web form, talking to a device, asking an AI, using their camera, or even talking into a smart pin – they’re still trying to “do” something – and as long as users have tasks to accomplish, SEO pros will be there to influence them.


Featured Image: Accogliente Design/Shutterstock

Understanding & Optimizing Cumulative Layout Shift (CLS)

Cumulative Layout Shift (CLS) is a Google Core Web Vitals metric that measures a user experience event.

CLS became a ranking factor in 2021 and that means it’s important to understand what it is and how to optimize for it.

What Is Cumulative Layout Shift?

CLS is the unexpected shifting of webpage elements while a user is scrolling or interacting with the page.

The kinds of elements that tend to cause shift are fonts, images, videos, contact forms, buttons, and other kinds of content.

Minimizing CLS is important because pages that shift around can cause a poor user experience.

A poor CLS score (above 0.1) is indicative of coding issues that can be solved.

What Causes CLS Issues?

There are five reasons why Cumulative Layout Shift happens:

  • Images without dimensions.
  • Ads, embeds, and iframes without dimensions.
  • Dynamically injected content.
  • Web Fonts causing FOIT/FOUT.
  • CSS or JavaScript animations.

Images and videos must have the height and width dimensions declared in the HTML. For responsive images, make sure that the different image sizes for the different viewports use the same aspect ratio.

Let’s dive into each of these factors to understand how they contribute to CLS.

Images Without Dimensions

Browsers cannot determine an image’s dimensions until they download it. As a result, upon encountering an `<img>` HTML tag without dimensions, the browser can’t allocate space for the image. The example video below illustrates that.

Once the image is downloaded, the browser needs to recalculate the layout and allocate space for the image to fit, which causes other elements on the page to shift.

By providing width and height attributes in the `<img>` tag, you inform the browser of the image’s aspect ratio. This allows the browser to allocate the correct amount of space in the layout before the image is fully downloaded and prevents any unexpected layout shifts.
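
The space the browser reserves follows directly from the declared aspect ratio. As a quick sketch of that calculation, an image declared with `width="1200" height="800"` and rendered in a 600px-wide column gets 400px of height reserved before a single image byte arrives:

```javascript
// Height the browser can reserve for an image before it downloads,
// given the declared width/height attributes and the rendered width.
function reservedHeight(attrWidth, attrHeight, renderedWidth) {
  return (renderedWidth * attrHeight) / attrWidth; // scales by aspect ratio
}

console.log(reservedHeight(1200, 800, 600)); // → 400
```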

Ads Can Cause CLS

If you load AdSense ads in the content or leaderboard on top of the articles without proper styling and settings, the layout may shift.

This one is a little tricky to deal with because ad sizes can be different. For example, it may be a 970×250 or 970×90 ad, and if you allocate 970×90 space, it may load a 970×250 ad and cause a shift.

In contrast, if you allocate a 970×250 ad and it loads a 970×90 banner, there will be a lot of white space around it, making the page look bad.

It is a trade-off: either you load ads of the same size and benefit from increased inventory and higher CPMs, or you load multiple-sized ads at the expense of user experience and your CLS metric.

Dynamically Injected Content

This is content that is injected into the webpage.

For example, posts on X (formerly Twitter), which load in the content of an article, may have arbitrary height depending on the post content length, causing the layout to shift.

Of course, those usually are below the fold and don’t count on the initial page load, but if the user scrolls fast enough to reach the point where the X post is placed and it hasn’t yet loaded, it will cause a layout shift and contribute to your CLS metric.

One way to mitigate this shift is to set an average min-height CSS property on the tweet’s parent div tag: it is impossible to know the height of the post before it loads, but this way we can pre-allocate a reasonable amount of space.

Another way to fix this is to apply a CSS rule to the parent div tag containing the tweet that caps its height:

```css
#tweet-div {
  max-height: 300px;
  overflow: auto;
}
```

However, it will cause a scrollbar to appear, and users will have to scroll to view the tweet, which may not be best for user experience.

If none of the suggested methods works, you could take a screenshot of the tweet and link to it.

Web-Based Fonts

Downloaded web fonts can cause what’s known as a Flash of Invisible Text (FOIT).

A way to prevent that is to preload the fonts and use the `font-display: swap;` CSS property in the `@font-face` at-rule.

```css
@font-face {
  font-family: Inter;
  font-style: normal;
  font-weight: 200 900;
  font-display: swap;
  src: url('https://www.example.com/fonts/inter.woff2') format('woff2');
}
```

With these rules, you are loading web fonts as quickly as possible and telling the browser to use the system font until it loads the web fonts. As soon as the browser finishes loading the fonts, it swaps the system fonts with the loaded web fonts.

However, you may still have an effect called Flash of Unstyled Text (FOUT), which is impossible to avoid when using non-system fonts because it takes some time until web fonts load, and system fonts will be displayed during that time.

In the video below, you can see how the title font changes, causing a shift.

Even when the recommended font-loading mechanism is implemented, how visible FOUT is depends on the user’s connection speed.

If the user’s connection is sufficiently fast, the web fonts may load quickly enough and eliminate the noticeable FOUT effect.

Therefore, using system fonts whenever possible is a great approach, but it may not always be possible due to brand style guidelines or specific design requirements.

CSS Or JavaScript Animations

Animating an HTML element’s height via CSS or JavaScript – for example, expanding an element vertically and then shrinking it – pushes the content below it around, causing a layout shift.

To prevent that, use CSS transforms instead and allocate space for the element being animated. You can see the difference between CSS animation, which causes a shift (on the left), and the same animation done with a CSS transform.

CSS animation example causing CLS

How Cumulative Layout Shift Is Calculated

This is the product of two metrics called “Impact Fraction” and “Distance Fraction.”

CLS = Impact Fraction × Distance Fraction

Impact Fraction

Impact fraction measures how much space an unstable element takes up in the viewport.

A viewport is what you see on the mobile screen.

When an element downloads and then shifts, the impact fraction is the total space that the element occupies in the viewport – combining the location it occupied when it was first rendered and the final location after the page is rendered.

The example that Google uses is an element that occupies 50% of the viewport and then drops down by another 25%.

When added together, the 75% value is called the Impact Fraction, and it’s expressed as a score of 0.75.

Distance Fraction

The second measurement is called the Distance Fraction. The distance fraction is the amount of space the page element has moved from the original to the final position.

In the above example, the page element moved 25%.

So now the Cumulative Layout Score is calculated by multiplying the Impact Fraction by the Distance Fraction:

0.75 x 0.25 = 0.1875
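
That multiplication is the entire formula, as a tiny sketch:

```javascript
// CLS for a single shift: impact fraction times distance fraction.
function layoutShiftScore(impactFraction, distanceFraction) {
  return impactFraction * distanceFraction;
}

console.log(layoutShiftScore(0.75, 0.25)); // → 0.1875
```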

The calculation involves some more math and other considerations. What’s important to take away from this is that the score is one way to measure an important user experience factor.

Here is an example video visually illustrating what impact and distance factors are:

Understand Cumulative Layout Shift

Understanding Cumulative Layout Shift is important, but it’s not necessary to know how to do the calculations yourself.

However, understanding what it means and how it works is key, as this has become part of the Core Web Vitals ranking factor.


Featured image credit: BestForBest/Shutterstock

Google Gives 5 SEO Insights On Google Trends


Google published a video that disclosed five insights about Google Trends that could be helpful for SEO, topic research and debugging issues with search rankings. The video was hosted by Daniel Waisberg, a Search Advocate at Google.

1. What Does Google Trends Offer?

Google Trends is an official tool created by Google that shows how often people search with certain keyword phrases and how those searches have changed over time. It’s not only helpful for discovering time-based changes in search queries; it also segments queries by geographic popularity, which is useful for learning who to focus content on (or even which geographic areas may be the best ones to get links from).

This kind of information is invaluable for debugging why a site may have issues with organic traffic as it can show seasonal and consumer trends.

2. Google Trends Only Uses A Sample Of Data

An important fact about Google Trends that Waisberg shared is that the data that Google Trends reports on is based on a statistically significant but random sample of actual search queries.

He said:

“Google Trends is a tool that provides a random sample of aggregated, anonymized and categorized Google searches.”

This does not mean that the data is less accurate. The phrase statistically significant means that the data is representative of the actual search queries.

The reason Google uses a sample is that they have an enormous amount of data and it’s simply faster to work with samples that are representative of actual trends.

3. Google Cleans Noise In The Trends Data

Daniel Waisberg also said that Google cleans the data to remove noise and data that relates to user privacy.

“The search query data is processed to remove noise in the data and also to remove anything that might compromise a user’s privacy.”

An example of private data that is removed is the full names of people. An example of “noise” in the data is search queries made by the same person over and over – Waisberg used the example of a trivial search for how to boil eggs that a person makes every morning.

That last one, about people repeating a search query, is interesting because back in the early days of SEO, before Google Trends existed, SEOs used a public keyword volume tool from Overture (owned by Yahoo). Some SEOs poisoned the data by making thousands of searches for keyword phrases that were rarely queried by real users, inflating the query volume so that competitors would waste effort optimizing for useless keywords.

4. Google Normalizes Google Trends Data

Google doesn’t show actual search query volume like a million queries per day for one query and 200,000 queries per day for another. Instead Google will select the point where a keyword phrase is searched the most and use that as the 100% mark and then adjust the Google Trends graph to percentages that are relative to that high point. So if the most searches a query gets in a day is 1 million, then a day in which it gets searched 500,000 times will be represented on the graph as 50%. This is what it means that Google Trends data is normalized.
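
The normalization step can be sketched as follows: each data point is expressed as a rounded percentage of the series’ peak.

```javascript
// Normalize raw query counts the way Google Trends presents them:
// the busiest point becomes 100 and everything else is relative to it.
function normalizeTrend(counts) {
  const peak = Math.max(...counts);
  return counts.map((c) => Math.round((c / peak) * 100));
}

console.log(normalizeTrend([1_000_000, 500_000, 250_000])); // → [ 100, 50, 25 ]
```

This is why Trends values for two different queries can’t be compared as absolute volumes – each series is scaled to its own peak.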

5. Explore Search Queries And Topics

SEOs have focused on optimizing for keywords for over 25 years. But Google has long moved beyond keywords and has been labeling documents by the topics and even by queries they are relevant to (which also relates more to topics than keywords).

That’s why, in my opinion, one of the most useful offerings is the ability to explore the topic related to the entity behind the search query. Exploring the topic shows the query volume of all the related keywords.

The “explore by topic” tool arguably offers a more accurate idea of how popular a topic is, which is important because Google’s algorithms, machine learning systems, and AI models create representations of content at the sentence, paragraph, and document level – representations that correspond to topics. I believe that’s one of the things being referred to when Googlers talk about Core Topicality Systems.

Waisberg explained:

“Now, back to the Explore page. You’ll notice that, sometimes, in addition to a search term, you get an option to choose a topic. For example, when you type “cappuccino,” you can choose either the search term exactly matching “cappuccino” or the “cappuccino coffee drink” topic, which is the group of search terms that relate to that entity. These will include the exact term as well as misspellings. The topic also includes acronyms, and it covers all languages, which can be very useful, especially when looking at global data.

Using topics, you also avoid including terms that are unrelated to your interests. For example, if you’re looking at the trends for the company Alphabet, you might want to choose the Alphabet Inc company topic. If you just type “alphabet,” the trends will also include a lot of other meanings, as you can see in this example.”

Related: 12 Ways to Use Google Trends

The Big Picture

One of the interesting facts revealed in this video is that Google isn’t showing actual search trends; it’s showing a normalized, “statistically significant” sample of the actual search trends. A statistically significant sample is one in which random chance is not a factor and which therefore represents the actual search trends.

The other noteworthy takeaway is the reminder that Google Trends is useful for exploring topics, which in my opinion is far more useful than Google Suggest and People Also Ask (PAA) data.

I have seen evidence that slavish optimization with Google Suggest and PAA data can make a website appear to be optimizing for search engines and not for people, which is something that Google explicitly cautions against. Those who were hit by the recent Google updates should think hard about the implications of their SEO practices in relation to keywords.

Exploring and optimizing with topics won’t leave behind the statistical footprints of optimizing for search engines, because the authenticity of content based on topics will always shine through.

Watch the Google Trends video:

Intro to Google Trends data

Featured Image by Shutterstock/Luis Molinero
