The 6 Biggest SEO Challenges You’ll Face in 2024

Seen any stressed-out SEOs recently? If so, that’s because they’ve got their work cut out this year.

Between navigating Google’s never-ending algorithm updates, fighting off competitors, and getting buy-in for projects, there are many significant SEO challenges to consider.

So, which ones should you focus on? Here are the six biggest ones I think you should pay close attention to.

1. Google’s algorithm updates

Make no mistake—Google’s algorithm updates can make or break your site.

Core updates, spam updates, helpful content updates—you name it, they can all impact your site’s performance.

The frequency of Google updates has increased in recent years, meaning that the likelihood of being impacted by a Google update has also increased.

How to deal with it:

Recovering from a Google update isn’t easy—and sometimes, websites that get hit by updates may never fully recover.

For the reasons outlined above, most businesses try to stay on the right side of Google and avoid incurring its wrath.

SEOs do this by following Google’s Search Essentials and SEO best practices, and by avoiding risky black-hat SEO tactics. But sadly, even if you think you’ve done all of this, there is no guarantee that you won’t get hit.

If you suspect a website has been impacted by a Google update, the fastest way to check is to plug the domain into Ahrefs’ Site Explorer.

Ahrefs Site Explorer screenshot

Here’s an example of a website likely affected by Google’s August 2023 Core Update. The traffic drop started on the update’s start date.

Website impacted by Google's August 2023 Core Update
Hover over the G circles on the X axis to get information about each update.

From this screen, you can see if a drop in traffic correlates with a Google update. If there is a strong correlation, then that update may have hit the site. To remedy it, you will need to understand the update and take action accordingly.

Follow SEO best practices

It’s important that your website follows SEO best practices; that way, you can rule out the basics when working out why it has been affected and what you need to do to fix things.

For example, you might have missed significant technical SEO issues impacting your website’s traffic. To rule this out, it’s worth using Site Audit to run a technical crawl of your website.

Site Audit screenshot, via Ahrefs’ Site Audit

Monitor the latest SEO news

In addition to following best practices, it’s a good idea to monitor the latest SEO news. You can do this through social media channels like X or LinkedIn, but I find dedicated SEO news sites to be some of the most reliable sources.

2. Your competitors

Even if you escape Google’s updates unscathed, you’ve still got to deal with competitors vying to steal your top-ranking keywords from right under your nose.

This may sound grim, but it’s a mistake to underestimate them. Most of the time, they’ll be trying to improve their website’s SEO just as much as you are.

How to deal with it:

If you want to stay ahead of your competitors, you need to do these two things:

Spy on your competitors and monitor their strategy

OK, so you don’t have to be James Bond, but by using a tool like Ahrefs’ Site Explorer and our Google Looker Studio integration (GLS), you can extract valuable information and keep tabs on your competitors, giving you a competitive advantage in the SERPs.

In Site Explorer, you can use the Organic competitors report to understand the competitive landscape:

Organic competitors screenshot, via Ahrefs' Site Explorer

You can check out their Organic traffic performance across the years:

Year on Year comparison of organic traffic, via Ahrefs' Site Explorer

You can use Calendar to see on which days changes in Positions, Pages, Referring domains, and Backlinks occurred:

Screenshot of Ahrefs' Calendar, via Ahrefs' Site Explorer

You can see their Top pages’ organic traffic and Organic keywords:

Top pages report, via Ahrefs' Site Explorer

And much, much more.

If you want to monitor your most important competitors more closely, you can even create a dashboard using Ahrefs’ GLS integration.

Google Looker Studio integration screenshot

Acquire links and create content that your competitors can’t recreate easily

Once you’ve done enough spying, it’s time to take action.

Links and content are the bread and butter for many SEOs. But much of the time, the links acquired and the content created just aren’t that great.

So, to stand the best chance of maintaining your rankings, you need to work on getting high-quality backlinks and producing high-quality content that your competitors can’t easily recreate.

It’s easy to say this, but what does it mean in practice? The best way to create content that competitors can’t easily recreate is to create deep content.

At Ahrefs, we do this by running surveys, getting quotes from industry experts, running data studies, creating unique illustrations or diagrams, and generally fine-tuning our content until it is the best it can be.

3. Google stealing your clicks

As if fending off your competitors wasn’t enough, you must also compete against Google itself for clicks.

As Google not-so-subtly transitions from a search engine to an answer engine, it’s becoming more common for it to supply the answer to search queries—rather than the search results themselves.

The result is that even the once top-performing organic search websites have a lower click-through rate (CTR) because they’re further down the page—or not on the first page.

Whether you like it or not, Google is reducing traffic to your website through two mechanisms:

  • AI overviews – Where Google generates an answer based on sources on the internet
  • Zero-click searches – Where Google shows the answer in the search results

With AI Overviews, the traditional organic search results are often pushed out of view entirely.

And with zero-click searches, Google supplies the answer directly in the SERP, so the user doesn’t have to click anything unless they want to know more.

Zero-click searches example, via Google.com

These features have one thing in common: They are pushing the organic results further down the page.

And even when links are included, Kevin Indig’s AI Overviews traffic impact study suggests that AI Overviews reduce organic clicks.

In the example below, shared by Aleyda, we can see that even ranking organically in the number one position doesn’t mean much when ads and an AI Overview sit above you, especially when the AI Overview answer contains no links; it just perpetuates the zero-click model in a new format.

How to deal with it:

You can’t control how Google changes the SERPs, but you can do two things:

Make your website the best it can be

If you focus on making your website the best it can be, it will naturally become more authoritative over time. This isn’t a guarantee that your website will be included in the AI overview, but it’s better than doing nothing.

Prevent Google from showing your website in an AI Overview

If you want to be excluded from Google’s AI Overviews, Google says you can use the nosnippet rule to prevent your content from appearing in them.

nosnippet code explanation screenshot, via Google's documentation
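Per Google’s documentation, the nosnippet rule can be set page-wide with a robots meta tag, or applied to individual sections with the data-nosnippet attribute. A quick sketch:

```html
<!-- Option 1: no snippets (including AI Overviews) for the whole page -->
<meta name="robots" content="nosnippet">

<!-- Option 2: exclude only a specific section of the page -->
<p data-nosnippet>This text won't be used in snippets or AI Overviews.</p>
```

Bear in mind that nosnippet also removes your regular search result snippets, not just your presence in AI Overviews.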

4. It’s harder for smaller websites to compete

One of the reasons marketers gravitated towards Google in the early days was that it was relatively easy to set up a website and get traffic.

Recently, there have been a few high-profile examples of smaller websites being hit hard by Google’s updates.

Apart from the algorithmic changes, I think there are two reasons for this:

  • Large authoritative websites with bigger budgets and SEO teams are more likely to rank well in today’s Google
  • User-generated content sites like Reddit and Quora have been given huge traffic boosts by Google, displacing smaller sites that used to rank for these types of queries

Here’s Reddit’s traffic increase over the last year:

Reddit's organic traffic increase, via Ahrefs' Site Explorer

And here’s Quora’s traffic increase:

Quora's organic traffic increase, via Ahrefs' Site Explorer

How to deal with it:

There are three key ways I would deal with this issue in 2024:

Focus on targeting the right keywords using keyword research

Knowing which keywords to target is really important for smaller websites. Sadly, you can’t just write about a big term like “SEO” and expect to rank for it in Google.

Use a tool like Keywords Explorer to do a SERP analysis for each keyword you want to target. Use the effort-to-reward ratio to ensure you are picking the right keyword battles:

Effort-to-reward ratio illustration

If you’re concerned about Reddit, Quora, or other UGC sites stealing your clicks, you can also use Keywords Explorer to target SERPs where these websites aren’t present.

To do this:

  • Enter your keyword in the search bar and head to the Matching terms report
  • Click on the SERP features drop-down box
  • Select Not on SERP, then select Discussions and forums

Example of removing big UGC sites from keyword searches using filters in Ahrefs' Keywords Explorer

This method can help you find SERPs where these types of sites are not present.

Build more links to become more authoritative

Another approach you could take is to double down on the SEO basics and start building more high-quality backlinks.

Write deep content

Most SEOs aren’t churning out 500-word blog posts and hoping for the best; but equally, the content they’re creating often isn’t deep or the best it can possibly be.

This is often due to time constraints, budget, or inclination. But to be competitive in the AI era, deep content is exactly what you should be creating.

5. Maintaining the performance of your content

As your website grows, maintaining the performance of your content portfolio becomes increasingly difficult.

And what may have been an “absolute banger” of an article in 2020 might not be such a great article now—so you’ll need to update it to keep the clicks rolling in.

So how can you ensure that your content is the best it can be?

How to deal with it:

Here’s the process I use:

Steal this content updating framework

And here’s a practical example of this in action:

Use Page Inspect with Overview to identify pages that need updating

Here’s an example of an older article Michal Pecánek wrote that I recently updated. Using Page Inspect, we can pinpoint the exact date of the update (May 10, 2024), with no other major changes in the last year.

Ahrefs Page Inspect screenshot, via Ahrefs' Site Explorer

According to Ahrefs, this update almost doubled the page’s organic traffic, underlining the value of updating old content. Before the update, the content had reached its lowest performance ever.

Example of a content update and the impact on organic traffic, via Ahrefs' Site Explorer

So, what changed to casually double the traffic? Clicking on Page Inspect gives us our answer.

Page Inspect detail screenshot, via Ahrefs' Site Explorer

I was focused on achieving three aims with this update:

  • Keeping Michal’s original framework for the post intact
  • Making the content as concise and readable as it can be
  • Refreshing the template (the main draw of the post) and explaining how to use the updated version in a beginner-friendly way to match the search intent

6. Getting buy-in for SEO projects

Getting buy-in for SEO projects has never been easy compared to other channels. Unfortunately, this meme perfectly describes my early days of agency life.

SEO meme, SEO vs PPC budgets

SEO is not an easy sell—either internally or externally to clients.

With companies hiring fewer SEO roles this year, the appetite for risk seems lower than in previous years.

SEO can also be slow to show impact, which makes getting buy-in for projects harder than for other channels.

How long does SEO take illustration

How to deal with it:

My colleague Despina Gavoyannis has written a fantastic article about how to get SEO buy-in. Here is a summary of her top tips:

  • Find key influencers and decision-makers within the organization, starting with cross-functional teams before approaching executives. (And don’t forget the people who’ll actually implement your changes—developers.)
  • Adapt your language and communicate the benefits of SEO initiatives in terms that resonate with different stakeholders’ priorities.
  • Highlight the opportunity costs of not investing in SEO by showing the potential traffic and revenue being missed, using metrics like Ahrefs’ traffic value.
  • Collaborate cross-functionally by showing how SEO can support other teams’ goals, e.g. helping the editorial team create content that ranks for commercial queries.

And perhaps most important of all: build better business cases and SEO opportunity forecasts.

If you just want to show the short-term trend for a keyword, you can use Keywords Explorer:

Forecasting feature for keywords, via Ahrefs' Keywords Explorer
The forecasted trend is shown in orange as a dotted line.

If you want to show the traffic potential of a particular keyword, you can use the Traffic potential metric in the SERP overview to gauge this:

Traffic potential example, via Ahrefs' Site Explorer

And if you want to go the whole hog, you can create a full SEO forecast. You can use a third-party tool for this, but I recommend following Patrick Stox’s SEO forecasting guide.

Final thoughts

Of all the SEO challenges mentioned above, the one keeping SEOs awake at night is AI.

It’s swept through our industry like a hurricane, presenting SEOs with many new challenges. The SERPs are changing, competitors are using AI tools, and the bar for creating basic content has been lowered, all thanks to AI.

If you want to stay competitive, you need to arm yourself with the best SEO tools and search data on the market—and for me, that always starts with Ahrefs.

Got questions? Ping me on X.



Source link

Keep an eye on what we are doing
Be the first to get latest updates and exclusive content straight to your email inbox.
We promise not to spam you. You can unsubscribe at any time.
Invalid email address

SEO

Client-Side Vs. Server-Side Rendering

Published

on

By

Client-Side Vs. Server-Side Rendering

Faster webpage loading times play a big part in user experience and SEO, with page speed being a key ranking factor for Google’s algorithm.

A front-end web developer must decide the best way to render a website so it delivers a fast experience and dynamic content.

Two popular rendering methods include client-side rendering (CSR) and server-side rendering (SSR).

All websites have different requirements, so understanding the difference between client-side and server-side rendering can help you render your website to match your business goals.

Google & JavaScript

Google has extensive documentation on how it handles JavaScript, and Googlers offer insights and answer JavaScript questions regularly through various formats – both official and unofficial.

For example, in a Search Off The Record podcast, it was discussed that Google renders all pages for Search, including JavaScript-heavy ones.

This sparked a substantial conversation on LinkedIn, and a couple of other takeaways from both the podcast and the ensuing discussions are that:

  • Google doesn’t track how expensive it is to render specific pages.
  • Google renders all pages to see content – regardless of whether they use JavaScript or not.

The conversation as a whole has helped to dispel many myths and misconceptions about how Google approaches JavaScript and allocates rendering resources.

Martin Splitt’s full comment on LinkedIn covering this was:

“We don’t keep track of “how expensive was this page for us?” or something. We know that a substantial part of the web uses JavaScript to add, remove, change content on web pages. We just have to render, to see it all. It doesn’t really matter if a page does or does not use JavaScript, because we can only be reasonably sure to see all content once it’s rendered.”

Martin also confirmed that there is a queue and a potential delay between crawling and indexing, but this isn’t simply because a page uses JavaScript, and the presence of JavaScript is not some “opaque” root cause of URLs not being indexed.

General JavaScript Best Practices

Before we get into the client-side versus server-side debate, it’s important that we also follow some general best practices for either of these approaches to work (a quick sketch of the first two follows the list):

  • Don’t block JavaScript resources through robots.txt or server rules.
  • Avoid render blocking.
  • Avoid injecting JavaScript in the DOM.
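Here’s a minimal sketch of the first two points in practice (the file paths are placeholders):

```html
<!-- In robots.txt (served at /robots.txt), don't disallow the JS/CSS
     that Googlebot needs in order to render the page:

     User-agent: *
     Allow: /assets/
-->
<head>
  <!-- "defer" downloads the script in parallel with HTML parsing and runs it
       after parsing finishes, so it doesn't block rendering -->
  <script defer src="/assets/app.js"></script>

  <!-- A render-blocking version of the same script, for contrast: -->
  <!-- <script src="/assets/app.js"></script> -->
</head>
```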

What Is Client-Side Rendering, And How Does It Work?

Client-side rendering is a relatively new approach to rendering websites.

It became popular when JavaScript libraries started integrating it, with Angular and React.js being some of the best examples of libraries used in this type of rendering.

It works by rendering a website’s JavaScript in your browser rather than on the server.

Instead of the browser getting all of the content from the HTML document itself, the server responds with a bare-bones HTML document that references the JS files.

While the initial load time is a bit slow, subsequent page loads will be rapid, as they aren’t reliant on a different HTML page per route.

From managing logic to retrieving data from an API, client-rendered sites do everything “independently.” The page only becomes available after the code has executed, because every page the user visits, and its corresponding URL, is created dynamically.

The CSR process is as follows (a minimal sketch follows the list):

  • The user enters the URL they wish to visit in the address bar.
  • A data request is sent to the server at the specified URL.
  • On the client’s first request for the site, the server delivers the static files (CSS and HTML) to the client’s browser.
  • The client browser downloads the HTML content first, followed by the JavaScript. The HTML links out to the JavaScript, and while it loads, the user sees whatever loading indicators the developer has defined. At this stage, the website is still not visible to the user.
  • After the JavaScript is downloaded, content is dynamically generated on the client’s browser.
  • The web content becomes visible as the client navigates and interacts with the website.
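To make this concrete, here’s a minimal CSR sketch (the API endpoint and element ID are placeholders): the server returns an empty shell, and all rendering happens in the browser after the JavaScript runs.

```html
<!-- index.html: the bare-bones document the server returns -->
<!DOCTYPE html>
<html>
  <body>
    <!-- Empty shell; the user sees this loading state until the JS executes -->
    <div id="app">Loading…</div>
    <script defer src="/app.js"></script>
  </body>
</html>
```

```javascript
// app.js: fetch the data and build the page entirely in the browser
async function render() {
  const res = await fetch("/api/products"); // placeholder endpoint
  const products = await res.json();
  document.getElementById("app").innerHTML = products
    .map((p) => `<article><h2>${p.name}</h2><p>${p.price}</p></article>`)
    .join("");
}
render();
```

Note that anything requesting index.html sees only the “Loading…” shell until the JavaScript executes, which is exactly why rendering matters for SEO.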

What Is Server-Side Rendering, And How Does It Work?

Server-side rendering is the more common technique for displaying information on a screen.

The web browser submits a request for a page, and the server fetches any user-specific data, populates the page, and sends a fully rendered HTML page back to the client.

Every time the user visits a new page on the site, the server will repeat the entire process.

Here’s how the SSR process goes step-by-step (with a minimal sketch after the list):

  • The user enters the URL they wish to visit in the address bar.
  • The server serves a ready-to-be-rendered HTML response to the browser.
  • The browser renders the page (now viewable) and downloads JavaScript.
  • The browser downloads and executes the JavaScript (e.g. React), making the page interactive.
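And here’s a minimal SSR sketch using Express (the route and data are placeholders): the server builds the full HTML, so the browser’s very first response is already renderable.

```javascript
// server.js: every request gets fully rendered HTML built on the server
const express = require("express");
const app = express();

// Placeholder data source; a real app would hit a database or API here
async function getProduct(id) {
  return { name: `Product ${id}`, price: "$20" };
}

app.get("/products/:id", async (req, res) => {
  const product = await getProduct(req.params.id);
  // The full page is assembled server-side before anything is sent
  res.send(`<!DOCTYPE html>
<html>
  <head><title>${product.name}</title></head>
  <body>
    <h1>${product.name}</h1>
    <p>${product.price}</p>
    <!-- JS is still downloaded afterwards to make the page interactive -->
    <script defer src="/hydrate.js"></script>
  </body>
</html>`);
});

app.listen(3000);
```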

What Are The Differences Between Client-Side And Server-Side Rendering?

The main difference between the two approaches is where the rendering work happens. CSR shows an empty page before loading, while SSR displays a fully rendered HTML page on the first load.

This gives server-side rendering a speed advantage over client-side rendering, as the browser doesn’t need to process large JavaScript files before showing content; content is often visible almost immediately.

Search engines can crawl the fully rendered HTML, making it easy to index your webpages: the text a crawler sees is precisely what an SSR site displays in the browser.

However, client-side rendering is a cheaper option for website owners.

It relieves the load on your servers, passing the responsibility of rendering to the client (the bot or user trying to view your page). It also offers rich site interactions by providing fast website interaction after the initial load.

Fewer HTTP requests are made to the server with CSR, unlike with SSR, where each page is rendered from scratch, resulting in slower transitions between pages.

SSR can also buckle under a high server load if the server receives many simultaneous requests from different users.

The drawback of CSR is the longer initial loading time. This can impact SEO; crawlers might not wait for the content to load and may exit the site.

Googlebot’s two-phased approach (crawling and indexing the raw HTML first, then rendering JavaScript later) also raises the possibility of it seeing empty content on your page, because JavaScript-dependent content can be missed on the first pass. Remember that, in most cases, CSR requires an external library.

When To Use Server-Side Rendering

If you want to improve your Google visibility and rank high in the search engine results pages (SERPs), server-side rendering is the number one choice.

E-learning websites, online marketplaces, and applications with a straightforward user interface (fewer pages and features, and less dynamic data) all benefit from this type of rendering.

When To Use Client-Side Rendering

Client-side rendering is usually paired with dynamic web apps like social networks or online messengers. Because the information in these apps constantly changes, they must handle large volumes of dynamic data and perform fast updates to meet user demand.

The focus here is on a rich site with many users, prioritizing the user experience over SEO.

Which Is Better: Server-Side Or Client-Side Rendering?

When determining which approach is best, you need to take into consideration not only your SEO needs but also how the website works for users and delivers value.

Think about your project and how your chosen rendering will impact your position in the SERPs and your website’s user experience.

Generally, CSR is better for dynamic websites, while SSR is best suited for static websites.

Content Refresh Frequency

Websites that feature highly dynamic information, such as gambling or FOREX websites, update their content every second, so you’d likely choose CSR over SSR in this scenario – or use CSR for specific landing pages rather than all pages, depending on your user acquisition strategy.

SSR is more effective if your site’s content doesn’t require much user interaction. It positively influences accessibility, page load times, SEO, and social media support.

On the other hand, CSR is excellent for providing cost-effective rendering for web applications, and it’s easier to build and maintain; it’s better for First Input Delay (FID).

Another CSR consideration is that meta tags (description, title), canonical URLs, and hreflang tags should be rendered server-side or presented in the initial HTML response, so crawlers can identify them as soon as possible, and not only in the rendered HTML.
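As an illustration, here’s what that initial HTML response should already contain before any JavaScript runs (the URLs are placeholders):

```html
<head>
  <title>Blue Widget – Example Store</title>
  <meta name="description" content="A short, crawlable description of the page." />
  <link rel="canonical" href="https://www.example.com/products/blue-widget" />
  <link rel="alternate" hreflang="en" href="https://www.example.com/products/blue-widget" />
  <link rel="alternate" hreflang="de" href="https://www.example.com/de/products/blue-widget" />
</head>
```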

Platform Considerations

CSR technology tends to be more expensive to maintain because the hourly rate for developers skilled in React.js or Node.js is generally higher than that for PHP or WordPress developers.

Additionally, there are fewer ready-made plugins or out-of-the-box solutions available for CSR frameworks compared to the larger plugin ecosystem that WordPress users have access to.

For those considering a headless WordPress setup, such as using Frontity, it’s important to note that you’ll need to hire both React.js developers and PHP developers.

This is because headless WordPress relies on React.js for the front end while still requiring PHP for the back end.

It’s important to remember that not all WordPress plugins are compatible with headless setups, which could limit functionality or require additional custom development.

Website Functionality & Purpose

Sometimes, you don’t have to choose between the two, as hybrid solutions are available. Both SSR and CSR can be implemented within a single website or webpage.

For example, in an online marketplace, pages with product descriptions can be rendered on the server, as they are static and need to be easily indexed by search engines.

Staying with ecommerce: if you have high levels of personalization for users on a number of pages, you won’t be able to server-side render that personalized content for bots, so you will need to define some form of default content for Googlebot, which crawls cookieless and stateless.

Pages like user accounts don’t need to be ranked in the SERPs, so a CSR approach might be better for UX. The sketch below shows how these two approaches can coexist.
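Here’s one way that hybrid setup might look, again sketched with Express (the routes and rendering helper are placeholders): indexable product pages are rendered on the server, while personalized account pages get the client-side app shell.

```javascript
// Hybrid rendering: SSR for indexable pages, CSR for personalized ones
const express = require("express");
const path = require("path");
const app = express();

// Placeholder SSR helper; a real app would render templates or React here
async function renderProductPage(id) {
  return `<!DOCTYPE html><html><body><h1>Product ${id}</h1></body></html>`;
}

// Product pages: server-rendered so crawlers get full HTML immediately
app.get("/products/:id", async (req, res) => {
  res.send(await renderProductPage(req.params.id));
});

// Account pages: serve the SPA shell; the browser renders the rest after login
app.get("/account/*", (req, res) => {
  res.sendFile(path.join(__dirname, "spa", "index.html"));
});

app.listen(3000);
```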

Both CSR and SSR are popular approaches to rendering websites. You and your team need to make this decision at the initial stage of product development.

HubSpot Rolls Out AI-Powered Marketing Tools

HubSpot announced a push into AI this week at its annual Inbound marketing conference, launching “Breeze.”

Breeze is an artificial intelligence layer integrated across the company’s marketing, sales, and customer service software.

According to HubSpot, the goal is to provide marketers with easier, faster, and more unified solutions as digital channels become oversaturated.

Karen Ng, VP of Product at HubSpot, tells Search Engine Journal in an interview:

“We’re trying to create really powerful tools for marketers to rise above the noise that’s happening now with a lot of this AI-generated content. We might help you generate titles or a blog content…but we do expect kind of a human there to be a co-assist in that.”

Breeze AI Covers Copilot, Workflow Agents, Data Enrichment

The Breeze layer includes three main components.

Breeze Copilot

An AI assistant that provides personalized recommendations and suggestions based on data in HubSpot’s CRM.

Ng explained:

“It’s a chat-based AI companion that assists with tasks everywhere – in HubSpot, the browser, and mobile.”

Breeze Agents

A set of four agents that can automate entire workflows like content generation, social media campaigns, prospecting, and customer support without human input.

Ng added the following context:

“Agents allow you to automate a lot of those workflows. But it’s still, you know, we might generate for you a content backlog. But taking a look at that content backlog, and knowing what you publish is still a really important key of it right now.”

Breeze Intelligence

Combines HubSpot customer data with third-party sources to build richer profiles.

Ng stated:

“It’s really important that we’re bringing together data that can be trusted. We know your AI is really only as good as the data that it’s actually trained on.”

Addressing AI Content Quality

While prioritizing AI-driven productivity, Ng acknowledged the need for human oversight of AI content:

“We really do need eyes on it still…We think of that content generation as still human-assisted.”

Marketing Hub Updates

Beyond Breeze, HubSpot is updating Marketing Hub with tools like:

  • Content Remix to repurpose videos into clips, audio, blogs, and more
  • AI video creation via integration with HeyGen
  • YouTube and Instagram Reels publishing
  • Improved marketing analytics and attribution

The announcements signal HubSpot’s AI-driven vision for unifying customer data.

But as Ng tells us, “We definitely think a lot about the data sources…and then also understand your business.”

HubSpot’s updates are rolling out now, with some in public beta.


Featured Image: Poetra.RH/Shutterstock

Source link

Keep an eye on what we are doing
Be the first to get latest updates and exclusive content straight to your email inbox.
We promise not to spam you. You can unsubscribe at any time.
Invalid email address
Continue Reading

SEO

Holistic Marketing Strategies That Drive Revenue [SaaS Case Study]

Published

on

By

Holistic Marketing Strategies That Drive Revenue [SaaS Case Study]

Brands are seeing success driving quality pipeline and revenue growth. It’s all about building an intentional customer journey, aligning sales and marketing, and measuring ROI.

Check out this executive panel on-demand, as we show you how we do it. 

With Ryann Hogan, senior demand generation manager at CallRail, and our very own Heather Campbell and Jessica Cromwell, we chatted about driving demand, lead gen, revenue, and proper attribution.

This B2B leadership forum provided insights you can use in your strategy tomorrow, like:

  • The importance of the customer journey, and the keys to matching content to your ideal personas.
  • How to align marketing and sales efforts to guide leads through an effective journey to conversion.
  • Methods to measure ROI and determine if your strategies are delivering results.

While the case study is SaaS, these strategies are for any brand.

Watch on-demand and be part of the conversation. 

Join Us For Our Next Webinar!

Navigating SERP Complexity: How to Leverage Search Intent for SEO

Join us live as we break down all of these complexities and reveal how to identify valuable opportunities in your space. We’ll show you how to tap into the searcher’s motivation behind each query (and how Google responds to it in kind).
