

How To Empower Content Teams With Real-Time Log File Insights



Too many organizations still publish content without clear objectives and KPIs.

For organizations to move beyond “just publishing content,” they need to adopt a different mindset.

They need to reflect on their past work, think critically, and ask for access to performance data they can use to assess content performance in terms of traffic, crawls, and links generated.

I know what you may be thinking: “Wait, are you hinting at content teams asking for log file data?”

Yes, but I’ll do you one better: I want content teams to start asking for real-time log file insights.

For those familiar with traditional, time-consuming log file analysis, let me tell you this is different.

Times have changed, and content teams can now tap into the valuable insights log files hold.

Let’s change that mindset with the four steps below.

Step 1: Content Teams Start Thinking Critically

Rarely do content teams say, “I want to get the content piece discovered by search engines the same day, crawled within three days after publishing, indexed within a week, and driving 200 organic visits and two leads a month three weeks after publishing.”

Unfortunately, many organizations still publish a set number of content pieces each month because “That’s the way we’ve always done things,” or “We need fresh content to keep up our SEO performance.”

After publishing, they quickly move on to the next piece. And at the end of the month, they’ve achieved their objective to publish four content pieces and are “done.”

They don’t reflect on how long it took for search engines to crawl their newly published or updated content, how long it took to get indexed, and how long it took before the article started to rank and drive organic traffic.

And that’s a terrible shame.


Because it’s highly unlikely that this old way of doing things is really moving the needle.

Sure, everyone’s keeping very busy and I’m sure it’ll do some good, but the content will never live up to its potential. That’s a waste of money.

Don’t get me wrong. I get why it’s happening.

It’s a combination of doing what’s worked (or may have worked) in the past and a lack of a centralized place where content teams can find all the insights they need to reflect on their work’s performance effectively.

Thinking critically means content teams are asking themselves:

  • Why did article X start driving meaningful organic traffic nearly instantly after publishing? Why was it crawled so fast? Was it picked up by the press? Did it go viral on social media?
  • Are we seeing very different behavior when comparing the performance of content in site section A compared to section B? Does it get recrawled more often? If so, why?
  • Does section A have much more internal and external links? Does it have better-performing content in general?

Where can they find the answers to these questions?

Step 2: Getting Your Hands On Log File Analysis Insights

Getting your hands on log files has been notoriously difficult. There are all sorts of challenges.

For starters, they may not be available anymore. Even if they are, they can be a pain to get because of red tape around PII (personally identifiable information) concerns.

You’ll see that it’s a slow and painful process in most cases. There’s a reason most organizations perform a traditional log file analysis only once or twice a year.

This is where Content Delivery Networks (CDNs) such as Cloudflare, CloudFront, and Akamai come in.

Nowadays, many sites use CDNs to provide fast-loading sites to both visitors and crawlers.

And the beauty of CDNs is that they provide log files in real time, and you can easily pull the logs and make sure they don’t include any PII data.
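As a rough sketch of what that pull looks like: assuming JSON-lines logs in the style of Cloudflare Logpush (the field names below follow that schema but should be checked against your CDN’s documentation), scrubbing PII-bearing fields and keeping only crawler hits takes just a few lines of Python.

```python
import json

# Fields that may carry PII; adjust for your CDN's actual log schema.
PII_FIELDS = {"ClientIP", "ClientRequestUserAgent", "Cookie"}

def scrub_and_filter(raw_lines, bot_token="Googlebot"):
    """Keep only crawler requests and strip PII-bearing fields from each entry."""
    for line in raw_lines:
        entry = json.loads(line)
        ua = entry.get("ClientRequestUserAgent", "")
        if bot_token not in ua:
            continue  # drop regular visitor traffic
        yield {k: v for k, v in entry.items() if k not in PII_FIELDS}

sample = [
    '{"ClientIP": "203.0.113.7", "ClientRequestURI": "/blog/post-a", '
    '"ClientRequestUserAgent": "Mozilla/5.0 (compatible; Googlebot/2.1)", '
    '"EdgeStartTimestamp": "2024-06-01T09:00:00Z"}',
    '{"ClientIP": "198.51.100.2", "ClientRequestURI": "/pricing", '
    '"ClientRequestUserAgent": "Mozilla/5.0 (Windows NT 10.0)", '
    '"EdgeStartTimestamp": "2024-06-01T09:00:01Z"}',
]

clean = list(scrub_and_filter(sample))
# Only the Googlebot hit survives, with IP and user agent removed.
```

In production you would also verify crawler IPs (user-agent strings can be spoofed), but the shape of the pipeline stays the same.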

Step 3: Provide Content Teams With Easily Digestible Insights

Log files also hold valuable, non-technical insights for content teams, even though their information needs differ from those of technical SEO teams.

Content teams need easily digestible, content-focused insights, and they need them in real time because they’re making changes all day and touch a lot of different content.

It needs to be a walk in the park so they can answer questions like:

  • Has Google crawled these newly published pages? And what about these pages that we recently updated?
  • How frequently does Google crawl pages in website section X? How does that compare to section Y?
  • Did Google crawl pages when they had the wrong title tags? Or that time when they contained broken links?
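Questions like these reduce to simple aggregation once you have the crawl events. A minimal sketch, assuming you already have a list of URLs Googlebot requested (e.g., pulled from scrubbed CDN logs) and treating the first path segment as the site section:

```python
from collections import Counter
from urllib.parse import urlparse

def crawls_per_section(crawl_events):
    """Count crawler hits by top-level site section, e.g. /blog/ vs /docs/."""
    counts = Counter()
    for url in crawl_events:
        path = urlparse(url).path
        # First path segment is the section; bare domain or "/" counts as root.
        section = path.split("/")[1] if path.count("/") >= 1 and path != "/" else ""
        counts[section or "(root)"] += 1
    return counts

events = [
    "https://example.com/blog/post-a",
    "https://example.com/blog/post-b",
    "https://example.com/blog/post-a",
    "https://example.com/docs/setup",
]
print(crawls_per_section(events))  # Counter({'blog': 3, 'docs': 1})
```

The same grouping by date instead of section answers “has Google crawled these newly published pages yet?”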

Knowing what crawl behavior search engines are exhibiting is essential to improving your SEO performance because having pages (re)crawled is the first step in Google’s crawling, indexing, and ranking pipeline after discovery.

When content teams can answer the questions above, they can start connecting the dots and will learn how their work has influenced search engine behavior on the site.

They can even calculate and improve:

  • Average time to crawl.
  • Average time to index.
  • Average time to rank.
  • Average time to traffic.

Zooming out, this makes for great input for SEO traffic forecasts too!
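Given publish timestamps from your CMS and first-crawl timestamps from the logs, “average time to crawl” is simple date arithmetic. A minimal sketch, assuming both are available as URL-to-datetime mappings (the other three metrics follow the same pattern with index, rank, and traffic dates):

```python
from datetime import datetime

def avg_time_to_crawl(published, first_crawled):
    """Average hours between publish time and first crawl,
    over URLs present in both mappings."""
    deltas = [
        (first_crawled[url] - ts).total_seconds() / 3600
        for url, ts in published.items()
        if url in first_crawled
    ]
    return sum(deltas) / len(deltas) if deltas else None

published = {
    "/blog/a": datetime(2024, 6, 1, 9, 0),
    "/blog/b": datetime(2024, 6, 2, 9, 0),
}
first_crawled = {
    "/blog/a": datetime(2024, 6, 1, 21, 0),  # crawled 12 hours after publishing
    "/blog/b": datetime(2024, 6, 3, 9, 0),   # crawled 24 hours after publishing
}
print(avg_time_to_crawl(published, first_crawled))  # 18.0
```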

Step 4: Mapping Insights To Content Inventory

The last piece of the puzzle is mapping these useful insights to your content inventory, which also tracks all of your changes to the content.

And we want to stay far away from assembling this manually in spreadsheets – you want an always up-to-date content inventory to which your log file insights are automatically tied.

Off-the-shelf solutions offer all this, or you could build your own custom solution.

Both are fine. What matters is that you empower your content team!

Pro tip: You could even integrate with Google Search Console’s URL Inspection API to determine whether the content is indexed!
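As a sketch of that integration, using Google’s URL Inspection API via google-api-python-client: the request body and the `inspectionResult.indexStatusResult` fields follow the documented API schema, but treat the `service` object, credentials, and the canned response here as illustrative assumptions.

```python
def is_indexed(response: dict) -> bool:
    """Read the indexing verdict out of a URL Inspection API response."""
    status = response.get("inspectionResult", {}).get("indexStatusResult", {})
    return status.get("verdict") == "PASS"

def inspect_url(service, site_url: str, page_url: str) -> dict:
    # service = googleapiclient.discovery.build("searchconsole", "v1",
    #                                           credentials=creds)
    body = {"inspectionUrl": page_url, "siteUrl": site_url}
    return service.urlInspection().index().inspect(body=body).execute()

# The parsing works on a canned response without any network access:
canned = {
    "inspectionResult": {
        "indexStatusResult": {
            "verdict": "PASS",
            "coverageState": "Submitted and indexed",
        }
    }
}
print(is_indexed(canned))  # True
```

Note the API is quota-limited (a few thousand calls per property per day), so check recently changed URLs rather than the whole inventory on every run.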

Wrapping Things Up

When content teams ask the right questions, reflect on their work, and have everything they need at their fingertips to answer those questions, their efforts will go a long way.

You’ll see that improving a site’s SEO performance becomes much more accessible for everyone involved. It’ll be more fun, and management will likely buy in faster.

Empower your content team, and be amazed by their contribution to the site’s SEO performance!


Featured Image: The KonG/Shutterstock



Holistic Search 2.0: Optimizing and Measuring Organic and Paid Performance

Silos don’t cut it anymore. User journeys are too complex for you to view and track channels separately.

To improve your campaign performance, you need a holistic view of your marketing activities and how they intertwine. This is especially true for organic and paid search strategies. 

You need to be front and center with your ideal customers at multiple touchpoints, including active interactions and passive awareness. An ideal marketing strategy has paid and organic campaigns working in tandem, and it’s becoming harder to succeed without doing both.

If you’re looking to drive quality growth in your own campaigns, iQuanti can help.

Join us live on July 24 as we delve into the intricate relationship between organic and paid search channels. You’ll get actionable insights for measuring success and maximizing their combined potential.

You’ll gain a comprehensive, data-driven understanding of how to measure, analyze, and optimize holistic search marketing efforts, ensuring sustainable growth and superior ROI for your business.

You’ll walk away with:

  • Integrated Metrics and KPIs: Learn how to define and track key metrics to capture the performance of your organic and paid search campaigns, so you can make informed strategic decisions that work.
  • Attribution Models: You’ll see firsthand how strong attribution models are crucial to understanding your customers’ journeys, allowing you to identify influential touchpoints and allocate budget effectively for maximum ROI.
  • Optimization Strategies: You’ve gathered data from your campaigns…now what? Take the data and leverage it to further optimize your paid and organic search campaigns, increasing conversions along the way.

Shaubhik Ray, Senior Director of Digital Analytics Solutions at iQuanti, is an expert at crafting holistic search strategies that reach more of your ideal audiences at relevant stages in their journeys. Now, he’s ready to share his insights with you.

You’ll walk away equipped with the knowledge and tools necessary to execute a combined organic and paid strategy that improves the performance of each channel.  You’ll gain data-driven insights on how to align a combined strategy with business goals and lead your organization to success.

Sign up now and prepare to maximize the potential of combining your organic and paid campaigns.

At the end of the presentation, you’ll get a chance to ask Shaubhik your burning questions in our live Q&A, so be sure to attend.

And if you can’t make it that day, register here and we’ll send you a recording following the webinar. 



Screaming Frog SEO Spider Version 20.0: AI-Powered Features



What’s New with Screaming Frog SEO Spider 20.0?

For SEO experts, our toolkit is crucial. It’s how we make sure we can quickly and effectively assess how well our websites are performing. Using the best tools can put you way ahead of other SEOs. One example (and one tool I’ve personally been using for years) is Screaming Frog. It’s a powerful, straightforward, and insightful website crawler that’s indispensable for finding technical issues on your website.

And the good news is that it keeps getting better. Screaming Frog just released its 20th major version of the software, which includes new features based on feedback from SEO professionals.

Here are the main updates:

  1. Custom JavaScript Snippets
  2. Mobile Usability
  3. N-Grams Analysis
  4. Aggregated Anchor Text
  5. Carbon Footprint & Rating

Custom JavaScript Snippets

One of the standout features in this release is the ability to execute custom JavaScript snippets during a crawl. This functionality expands the horizons for data manipulation and API communication, offering unprecedented flexibility.

Use Cases:

  • Data Extraction and Manipulation: Gather specific data points or modify the DOM to suit your needs.
  • API Communication: Integrate with APIs like OpenAI’s ChatGPT from within the SEO Spider.

Setting Up Custom JS Snippets:

  • Navigate to `Config > Custom > Custom JavaScript`.
  • Click ‘Add’ to create a new snippet or ‘Add from Library’ to select from preset snippets.

[Screenshot: Setting up custom JS snippets in Screaming Frog SEO Spider 20.0]

  • Ensure JavaScript rendering mode is set via `Config > Spider > Rendering`.

Crawl with ChatGPT:

  • Leverage the `(ChatGPT) Template` snippet, add your OpenAI API key and tailor the prompt to your needs.
  • Follow our tutorial on ‘How To Crawl With ChatGPT’ for more detailed guidance.

Sharing Your Snippets:

  • Export/import snippet libraries as JSON files to share with colleagues.
  • Remember to remove sensitive data such as API keys before sharing.

Introducing Custom JavaScript Snippets to Screaming Frog SEO Spider version 20.0 significantly enhances the tool’s flexibility and power. Whether you’re generating dynamic content, interacting with external APIs, or conducting complex page manipulations, these snippets open a world of possibilities. 

Mobile Usability

In today’s mobile-first world, ensuring a seamless mobile user experience is imperative. Version 20.0 introduces extensive mobile usability audits through Lighthouse integration. 

With an ever-increasing number of users accessing websites via mobile devices, ensuring a seamless mobile experience is crucial. Google’s mobile-first indexing highlights the importance of mobile usability, which directly impacts your site’s rankings and user experience.

Mobile Usability Features:

  • New Mobile Tab: This tab includes filters for regular mobile usability issues such as viewport settings, tap target sizes, content sizing, and more.
  • Granular Issue Details: Detailed data on mobile usability issues can be explored in the ‘Lighthouse Details’ tab.
  • Bulk Export Capability: Export comprehensive mobile usability reports via `Reports > Mobile`.


Setting Up Mobile Audits:

  • Connect to the PSI API through `Config > API Access > PSI` or run Lighthouse locally.

Example Use Cases:

  • Identify pages where content does not fit within the viewport.
  • Flag and correct small tap targets and illegible font sizes.

[Screenshot: Mobile usability analysis in Screaming Frog SEO Spider 20.0]

With these new features, Screaming Frog SEO Spider version 20.0 streamlines the process of auditing mobile usability, making it more efficient and comprehensive. By integrating with Google Lighthouse, both via the PSI API and local runs, the tool provides extensive insights into the mobile performance of your website. Addressing these issues not only enhances user experience but also improves your site’s SEO performance.

N-grams Analysis

N-grams analysis is a powerful new feature that allows users to analyze phrase frequency across web pages. This can greatly enhance on-page SEO efforts and internal linking strategies.

Setting Up N-grams:

  • Activate HTML storage by enabling ‘Store HTML’ or ‘Store Rendered HTML’ under `Config > Spider > Extraction`.
  • View the N-grams in the lower N-grams tab.

[Screenshot: N-grams analysis in Screaming Frog SEO Spider 20.0]

Example Use Cases:

  • Improving Keyword Usage: Adjust content based on the frequency of targeted N-grams.
  • Optimizing Internal Links: Use N-grams to identify unlinked keywords and create new internal links.
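Conceptually, N-grams analysis is phrase-frequency counting over a page’s visible text. A minimal Python sketch of the idea (illustrative only, not Screaming Frog’s implementation):

```python
import re
from collections import Counter

def top_ngrams(text, n=2, k=3):
    """Return the k most common n-word phrases in a block of text."""
    words = re.findall(r"[a-z']+", text.lower())
    # Slide an n-word window over the token list to form the n-grams.
    grams = zip(*(words[i:] for i in range(n)))
    return Counter(" ".join(g) for g in grams).most_common(k)

page_text = ("link cloaking keeps affiliate links tidy. "
             "link cloaking also tracks affiliate links.")
print(top_ngrams(page_text))
# [('link cloaking', 2), ('affiliate links', 2), ('cloaking keeps', 1)]
```

Running this across every crawled page is what makes it possible to spot pages that mention a target phrase often but never link to the page ranking for it.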

Internal Linking Opportunities:

The N-grams feature provides a nuanced method for discovering internal linking opportunities, which can significantly enhance your SEO strategy and site navigation.

The introduction of N-grams analysis in Screaming Frog SEO Spider version 20 provides a tool for deep content analysis and optimization. By understanding the frequency and distribution of phrases within your content, you can significantly improve your on-page SEO and internal linking strategies.

Aggregated Anchor Text

Effective anchor text management is essential for internal linking and overall SEO performance. The aggregated anchor text feature in version 20.0 provides clear insights into how anchor texts are used across your site.

Using Aggregated Anchor Text:

  • Navigate to the ‘Inlinks’ or ‘Outlinks’ tab.
  • Utilize the new ‘Anchors’ filters to see aggregated views of anchor text usage.

aggregated anchor text report on screamingfrog 20aggregated anchor text report on screamingfrog 20

Practical Benefits:

  • Anchor Text Diversity: Ensure a natural distribution of anchor texts to avoid over-optimization.
  • Descriptive Linking: Replace generic texts like “click here” with keyword-rich alternatives.

The aggregated anchor text feature provides powerful insights into your internal link structure and optimization opportunities. This feature is essential if you are looking to enhance your site’s internal linking strategy for better keyword relevance, user experience, and search engine performance.

Carbon Footprint & Rating

Aligning with digital sustainability trends, Screaming Frog SEO Spider version 20.0 includes features to measure and optimize your website’s carbon footprint.

Key Features:

  • Automatic CO2 Calculation: The SEO Spider now calculates carbon emissions for each page using the CO2.js library.
  • Carbon Rating: Each URL receives a rating based on its emissions, derived from the Sustainable Web Design Model.
  • High Carbon Rating Identification: Pages with high emissions are flagged in the ‘Validation’ tab.

Practical Applications:

  • Resource Optimization: Identify and optimize high-emission resources.
  • Sustainable Practices: Implement changes such as compressing images, reducing script sizes, and using green hosting solutions.

The integration of carbon footprint calculations in Screaming Frog SEO Spider signifies a growing recognition of digital sustainability. As more businesses adopt these practices, we can collectively reduce the environmental impact of the web while driving performance and user satisfaction.

Other Updates

In addition to major features, version 20.0 includes numerous smaller updates and bug fixes that enhance functionality and user experience.

Rich Result Validation Enhancements:

  • Google Rich Result validation errors are now split out on their own.
  • New filters and columns provide detailed insights into rich result triggers and errors.

Enhanced File Types and Filters:

  • Internal and external filters include new file types such as Media, Fonts, and XML.

Website Archiving:

  • A new option to archive entire websites during a crawl is available under `Config > Spider > Rendering > JS`.

Viewport and Screenshot Configuration:

  • Customize viewport and screenshot sizes to fit different audit needs.

API Auto Connect:

  • Automatically connect APIs on start, making the setup process more seamless.

Resource Over 15MB Filter:

  • A new validation filter flags resources over 15MB, which is crucial for performance optimization.

Page Text Export:

  • Export all visible page text through the new `Bulk Export > Web > All Page Text` option.

Lighthouse Details Tab:

  • The ‘PageSpeed Details’ tab has been renamed ‘Lighthouse Details’ to reflect its expanded role.

HTML Content Type Configuration:

  • An ‘Assume Pages are HTML’ option helps accurately classify pages without explicit content types.

Bug Fixes and Performance Improvements:

  • Numerous small updates and fixes enhance stability and reliability. 

Screaming Frog SEO Spider version 20.0 is a comprehensive update packed with innovative features and enhancements that cater to the evolving needs of SEO professionals like us. From advanced data extraction capabilities with Custom JavaScript Snippets to environmental sustainability with Carbon Footprint and Rating, this release sets a new benchmark in SEO auditing tools.

Key Takeaway

Add this to your toolbox, or update to version 20 to explore the rich array of new features from Screaming Frog to optimize your website’s SEO, usability, and sustainability. It’s a no-fuss tool with tons of features that will help you stay ahead of your competitors, and ensure your websites perform optimally in terms of user experience and search engine visibility.



Google Simplifies Adding Shipping & Return Policies For Online Stores





Google has introduced a Search Console feature that lets online stores easily manage their shipping and return policies.

  • Google now allows online stores to manage shipping and return policies via Search Console.
  • This simplifies providing vital information to customers.
  • The feature can potentially boost sales for retailers.
