Google Cautions On Improper 404 Handling

Google explains how to manage 404 status codes for SEO

Google’s John Mueller addressed whether numerous 404 errors negatively impact rankings and provided a clear explanation of the best practices for handling them.

404 (Not Found) Status Code

404 is the code that a server sends when a browser or a crawler requests a web page that the server couldn’t find. It only means that the page was not found.

The official HTTP specification (RFC 9110) doesn’t use the word “error” in its definition of 404. That said, the 400 series of codes (400, 404, 410, etc.) are classified as client error responses. A client is a browser or a crawler, so a client error response means that the server is telling the browser or crawler that their request is in error. It doesn’t mean that the website is in error.

This is the official definition of a 404 (Not Found) response from the HTTP specification:

“The 404 (Not Found) status code indicates that the origin server did not find a current representation for the target resource or is not willing to disclose that one exists. A 404 status code does not indicate whether this lack of representation is temporary or permanent; the 410 (Gone) status code is preferred over 404 if the origin server knows, presumably through some configurable means, that the condition is likely to be permanent.”

Will 404 Errors Affect Rankings?

The person asking the question wanted to know if a lot of 404 responses will affect rankings. Google’s John Mueller answered the question then he explained the right way to “fix” 404 error responses and cautioned about when not to “fix” them. I put “fix” in quotation marks because 404 responses are not always something that needs fixing.

Here’s the question:

“My website has a lot of 404s. Would I lose my site’s rankings if I don’t redirect them?”

John Mueller answered:

“First off, the 404s wouldn’t affect the rest of your site’s rankings.”

Addressing 404s With Redirects

Mueller next discussed the use of redirects for stopping 404 responses from happening. A redirect is a server response that tells the client that the web page they are requesting has been moved to another URL. A 301 redirect tells the browser or crawler that the URL has permanently moved to another URL.
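
For illustration, here is a minimal sketch of a 301 redirect in a Node.js/Express route handler. Express and the URLs are assumptions for the example; the same redirect can be configured in Apache, Nginx, or a CDN.

```typescript
import express from "express";

const app = express();

// Hypothetical example: the old product page has a genuine replacement,
// so a permanent (301) redirect is appropriate.
app.get("/products/old-cup", (req, res) => {
  res.redirect(301, "/products/new-cup");
});

app.listen(3000);
```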

When To Use Redirects For 404s

Redirecting a web page that no longer exists to another web page is sometimes the right way to handle 404 page not found responses.

This is how Mueller explains the proper use of redirects for “fixing” 404 responses:

“Redirects can play a role in dealing with old pages, but not always. For example, if you have a genuine replacement product, such as a new cup that functionally replaces a cup which is no longer produced, then redirecting is fine.”

When Not To Use Redirects For 404s

Next he explained when not to use redirects for 404s, explaining that it’s a crummy experience to show a web page that is irrelevant to what the site visitors are expecting to see.

Mueller explains:

“On the other hand, if you just have similar pages, then don’t redirect. If the user clicked on your site in search of a knife, they would be frustrated to only see spoons. It’s a terrible user-experience, and doesn’t help in search. “

It’s Okay To Show 404 Responses

Mueller next explained that it’s okay to show 404 responses because it’s the right response for when a browser or crawler asks for a page that doesn’t exist on a server anymore.

He explained:

“Instead, return an HTTP 404 result code. Make a great 404 page. Maybe even make a 404 page that explains why spoons are superior to knives, if you can make that argument. Just don’t blindly redirect to a similar page, a category page, or your homepage. If you’re unsure, don’t redirect. Accept that 404s are fine, they’re a normal part of a healthy website.”
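
As a minimal sketch of what that looks like in practice (again assuming an Express-based Node server; the copy is illustrative), a catch-all handler can return a real 404 status with a helpful page instead of a blind redirect:

```typescript
import express from "express";

const app = express();

// ...normal routes are registered above this handler...

// Anything that didn't match a route gets a real 404 response,
// not a redirect to the homepage or a "similar" page.
app.use((req, res) => {
  res.status(404).send(
    "<h1>Page not found</h1><p>This page no longer exists. Try the search box or browse the categories instead.</p>"
  );
});

app.listen(3000);
```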

Always Investigate Error Responses

Something that Mueller didn’t mention is that 404 responses should always be investigated. Don’t stop investigating just because the page doesn’t exist and there’s no other page to redirect it to. Sometimes there’s a real problem that needs solving.

404 By Internal Links

For example, some 404s are caused by broken internal linking where a URL is misspelled. You can “fix” that by redirecting the wrong URL to the correct URL but that’s not fixing the problem because the real problem is the broken link itself.

404 Caused By Outgoing Links

Some 404s are caused by linking to pages that no longer exist. Linking to pages that don’t exist makes it look like the page is abandoned. It’s a poor user experience to link to a non-existent web page, and it is never a “normal part of a healthy website.” So either link to the right page, link to something else, or don’t link to anything at all.

404s Caused By Inbound Links

There is another type of 404 response that Mueller didn’t talk about that needs looking into. Sometimes other sites misspell a URL when linking to you, and when that happens the right response is a 301 redirect from the misspelled URL to the correct one. You can try contacting the site to ask them to fix their mistake, but it’s easier to just add the redirect and move on with your life.

Listen to the question and answer at the 2:08 minute mark:

Featured Image by Shutterstock/Krakenimages.com


Google’s Revamped Documentation Shows 4 Reasons To Refresh Content


Google significantly revamped its documentation about ranking pages that contain video content. While the changelog lists three areas that changed, a review of the content provides a case study of four considerations for digital marketers and publishers when refreshing content to improve relevance for site visitors—and Google.

What Changed

The documentation that was updated relates to ranking web pages that contain videos. The purpose of the documentation is to communicate best practices for optimizing videos for higher visibility in Google’s search results.

Google’s changelog indicated that there were three major changes to the Video SEO best practices page.

  • Clarified video indexing criteria
  • Updated technical requirements
  • Added a new section about dedicated watch pages for each video

This is what the changelog shows was changed:

“Improving the Video SEO documentation

What: Overhauled the video SEO best practices. Notably, we clarified the video indexing criteria and technical requirements, added a new watch page section, and expanded our examples.

Why: Based on feedback submissions, we revisited our video SEO guidance to clarify what’s eligible for a video result and how site owners can make it easier for Google to find their videos.”

Four Reasons To Refresh Content

There’s a common misinterpretation that encourages changing content annually because “Google loves fresh content,” which is a gross misunderstanding of the Freshness Algorithm. Content shouldn’t be changed without purpose—otherwise, it’s just “rearranging the furniture” instead of truly “redesigning the space.”

Google’s reasons for updating the content offer a mini case study of four things publishers and businesses should consider when freshening up their content.

These are the four reasons for changing the Video SEO content:

  1. Remove Outdated Content
  2. Improve Information Density
  3. Add Fresh Information
  4. Update For Brevity And Clarity

1. Remove Outdated Content

The old version of the documentation was written when video as web content was a “growing format” and the changes reflect that the times have changed, rendering the old content out of date.

“Video is a growing format for content creation and consumption on the web, and Google indexes videos from millions of different sites to serve to users. “

Video is no longer a growing format; it’s an established one. The editors of the web page were right to remove that passage because it no longer made sense.

Takeaway: Always keep up to date with how your readers perceive the topic. Failure to do this will make the content look less authoritative and trustworthy.

2. Improve Information Density

Information density in this context describes the ability of content to communicate ideas and topics with the least amount of words and with the highest amount of clarity.

An opening sentence should reflect what the entire web page is about, but the original opening sentence did a poor job of communicating that. It led with “Video is a growing format,” a statement that did not reflect the topic of the page.

This is the new opening sentence:

“If you have videos on your site, following these video SEO best practices can help more people find your site through video results on Google.”

The new sentence accurately describes what the entire web page is about in only 23 words. Here’s something really cool: the second sentence remains exactly the same between the old and revised versions.

Takeaway: The lesson here is to revise what needs to be revised and don’t make changes when the original works just fine.

3. Add Fresh Information

An important change that all publishers should consider is to update content with fresh content that reflects how topics evolve over time. Products, laws, how consumers use services and products, everything undergoes some kind of change over time.

Google added content about tools available in Google Search Console that enable publishers to monitor the performance of their video content pages.

4. Update For Brevity And Clarity

The fourth reason for changing some of the content was to make it more concise and easier to read, with simplified language. One of the subtle changes was swapping the phrase “landing page” for “watch page.” This seemingly small change clarifies the meaning of the sentence by making it clear that they are referring to a page where videos are watched. Previously, the documentation made no reference to “watch page”; now it uses that phrase 21 times, introducing consistency in the message of the web page.

Many Reasons To Update Content

Every publisher should consider reviewing their content on a regular basis, whether that’s once a year for a smaller site or by chunking it up and tackling different sections on a monthly basis. A content review is a great way to keep content relevant to users and to discover new topics for content. Sometimes it’s better to break out a topic from a web page and create a dedicated page for it.

Read the updated documentation:
Video SEO best practices

Compare it to the old documentation at Archive.org:
Video SEO best practices

Featured Image by Shutterstock/Cast Of Thousands


What Is Largest Contentful Paint: An Easy Explanation


Largest Contentful Paint (LCP) is a Google user experience metric integrated into ranking systems in 2021.

LCP is one of the three Core Web Vitals (CWV) metrics that track technical performance metrics that impact user experience.

Core Web Vitals exist paradoxically, with Google providing guidance highlighting their importance but downplaying their impact on rankings.

LCP, like the other CWV signals, is useful for diagnosing technical issues and ensuring your website meets a base level of functionality for users.

What Is Largest Contentful Paint?

LCP is a measurement of how long it takes for the main content of a page to download and render in the user’s viewport.

Specifically, the time it takes from page load initiation to the rendering of the largest image or block of text within the user viewport. Anything below the fold doesn’t count.

Images, video poster images, background images, and block-level text elements like paragraph tags are typical elements measured.

LCP consists of the following sub-metrics:

  • Time to first byte (TTFB).
  • Resource load delay.
  • Resource load duration.
  • Element render delay.

Optimizing for LCP means optimizing for each of these metrics, so it takes less than 2.5 seconds to load and display LCP resources.

Here is the threshold scale for your reference: an LCP of 2.5 seconds or less is good, between 2.5 and 4 seconds needs improvement, and more than 4 seconds is poor.

Let’s dive into what these sub-metrics mean and how you can improve.

Time To First Byte (TTFB)

TTFB is the server response time: it measures how long it takes for the user’s browser to receive the first byte of data from your server. This includes DNS lookup time, the time the server takes to process the request, and redirects.

Optimizing TTFB can significantly reduce the overall load time and improve LCP.
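
For a quick check of your own TTFB, the Navigation Timing API exposes it in the browser. Here is a minimal sketch; the type assertion is TypeScript-only and can be dropped if you paste the code directly into the console:

```typescript
// responseStart marks when the first byte of the response arrived, measured
// from the start of navigation, so it includes redirects and DNS lookup.
const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
console.log("TTFB (ms):", nav.responseStart);
```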

Server response time largely depends on:

  • Database queries.
  • CDN cache misses.
  • Inefficient server-side rendering.
  • Hosting.

Let’s review each:

1. Database Queries

If your response time is high, try to identify the source.

For example, it may be due to poorly optimized queries or a high volume of queries slowing down the server’s response time. If you have a MySQL database, you can log slow queries to find which queries are slow.

If you have a WordPress website, you can use the Query Monitor plugin to see how much time SQL queries take.

Other great tools are Blackfire and New Relic, which do not depend on the CMS or stack you use but require installation on your hosting/server.

2. CDN Cache Misses

A CDN cache miss occurs when a requested resource is not found in the CDN’s cache, and the request is forwarded to fetch from the origin server. This process takes more time, leading to increased latency and longer load times for the end user.

Usually, your CDN provider has a report on how many cache misses you have.

Example of a CDN cache report

If you observe a high percentage (>10%) of cache misses, you may need to contact your CDN provider, or your hosting support if you have managed hosting with caching integrated, to solve the issue.

One thing that can cause cache misses is a search spam attack.

For example, a dozen spammy domains link to your internal search pages with random spammy queries like [/?q=甘肃代], which are not cached because the search term is different each time. The issue is that Googlebot aggressively crawls them, which may cause high server response times and cache misses.

In that case, and in general, it is good practice to block search or facet URLs via robots.txt. But once you block them via robots.txt, you may find those URLs indexed anyway because they have backlinks from low-quality websites.

However, don’t be afraid. John Mueller said it would be cleared in time.

Here is a real-life example from the search console of high server response time (TTFB) caused by cache misses:

Crawl spike of 404 search pages that have high server response time

3. Inefficient Server-Side Rendering

You may have certain components on your website that depend on third-party APIs.

For example, you’ve seen reads and shares numbers on SEJ’s articles. We fetch those numbers from different APIs, but instead of fetching them when a request is made to the server, we prefetch them and store them in our database for faster display.

Imagine if we connected to the share count and GA4 APIs when a request is made to the server. Each request takes about 300-500 ms to execute, so we would add roughly 1,000 ms of delay due to inefficient server-side rendering. So, make sure your backend is optimized.
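
As an illustration of that prefetch-and-store pattern, here is a minimal sketch assuming a hypothetical share-count endpoint and a simple in-memory cache refreshed on a schedule; the URL, field names, and interval are illustrative:

```typescript
// Refresh share counts in the background so page requests never wait
// on the third-party API at render time.
const shareCounts = new Map<string, number>();

async function refreshShareCounts(articleUrls: string[]): Promise<void> {
  for (const url of articleUrls) {
    // Hypothetical endpoint; swap in the real API you depend on.
    const res = await fetch(`https://api.example.com/shares?url=${encodeURIComponent(url)}`);
    const data = await res.json();
    shareCounts.set(url, data.count ?? 0);
  }
}

// Refresh every 10 minutes; server-side rendering reads from the cache instantly.
setInterval(() => {
  refreshShareCounts(["https://example.com/article-1"]).catch(console.error);
}, 10 * 60 * 1000);
```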

4. Hosting

Be aware that hosting is highly important for low TTFB. By choosing the right hosting, you may be able to reduce your TTFB by two to three times.

Choose hosting with CDN and caching integrated into the system. This will help you avoid purchasing a CDN separately and save time maintaining it.

So, investing in the right hosting will pay off.


Now, let’s look into other metrics mentioned above that contribute to LCP.

Resource Load Delay

Resource load delay is the time it takes for the browser to locate and start downloading the LCP resource.

For example, if you have a background image on your hero section that requires CSS files to load to be identified, there will be a delay equal to the time the browser needs to download the CSS file to start downloading the LCP image.

In the case when the LCP element is a text block, this time is zero.

By optimizing how quickly these resources are identified and loaded, you can improve the time it takes to display critical content. One way to do this is to preload the LCP image and set fetchpriority=”high” on it, so the browser starts downloading the image without waiting for the CSS file to be downloaded and parsed first.

But a better approach – if you have enough control over the website – is to inline the critical CSS required for above-the-fold content, so the browser doesn’t spend time downloading the CSS file. That saves bandwidth, and you only need to preload the image.

Of course, it’s even better if you design webpages to avoid hero images or sliders, as those usually don’t add value, and users tend to scroll past them since they are distracting.

Another major factor contributing to load delay is redirects.

If you have external backlinks with redirects, there’s not much you can do. But you have control over your internal links, so try to find internal links with redirects, usually because of missing trailing slashes, non-WWW versions, or changed URLs, and replace them with actual destinations.

There are a number of technical SEO tools you can use to crawl your website and find redirects to be replaced.
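
If you only want to spot-check a handful of internal URLs without a full crawl, here is a minimal sketch for Node 18+; the URLs are illustrative. It reports any link that ends up redirected, along with the final destination the internal link should point to instead:

```typescript
// Fetch each URL and report the ones that were redirected, along with
// the final destination that the internal link should point to instead.
async function findRedirectedLinks(urls: string[]): Promise<void> {
  for (const url of urls) {
    const res = await fetch(url);
    if (res.redirected) {
      console.log(`${url} -> ${res.url}`);
    }
  }
}

findRedirectedLinks([
  "https://example.com/blog",      // missing trailing slash?
  "http://example.com/old-page",   // non-HTTPS or changed URL?
]).catch(console.error);
```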

Resource Load Duration

Resource load duration refers to the actual time spent downloading the LCP resource.

Even if the browser quickly finds and starts downloading resources, slow download speeds can still affect LCP negatively. It depends on the size of the resources, the server’s network connection speed, and the user’s network conditions.

You can reduce resource load duration by implementing:

  • WebP format.
  • Properly sized images (make the intrinsic size of the image match the visible size).
  • Load prioritization.
  • CDN.

Element Render Delay

Element render delay is the time it takes for the browser to process and render the LCP element.

This metric is influenced by the complexity of your HTML, CSS, and JavaScript.

Minimizing render-blocking resources and optimizing your code can help reduce this delay. However, it may happen that you have heavy JavaScript scripting running, which blocks the main thread, and the rendering of the LCP element is delayed until those tasks are completed.

Here is where low values of the Total Blocking Time (TBT) metric are important, as it measures the total time during which the main thread is blocked by long tasks on page load, indicating the presence of heavy scripts that can delay rendering and responsiveness.

One way you can improve not only load duration and delay but all CWV metrics for subsequent navigations within your website is to implement the Speculation Rules API. By prerendering the pages users are most likely to visit next, for example as they hover over a link, you can make those pages load almost instantly.
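
Here is a minimal sketch of adding document-level speculation rules from a script; the URL pattern is illustrative, and browsers that don’t support the API simply ignore the injected rules:

```typescript
// Ask supporting browsers to prerender likely next pages when the user
// hovers over or starts clicking their links ("moderate" eagerness).
const rules = {
  prerender: [
    {
      source: "document",
      where: { href_matches: "/articles/*" },
      eagerness: "moderate",
    },
  ],
};

const script = document.createElement("script");
script.type = "speculationrules";
script.textContent = JSON.stringify(rules);
document.head.append(script);
```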

Beware These Scoring “Gotchas”

All elements in the user’s screen (the viewport) are used to calculate LCP. That means that images rendered off-screen and then shifted into the layout, once rendered, may not count as part of the Largest Contentful Paint score.

On the opposite end, elements starting in the user viewport and then getting pushed off-screen may be counted as part of the LCP calculation.

How To Measure The LCP Score

There are two kinds of scoring tools. The first is called Field Tools, and the second is called Lab Tools.

Field tools are actual measurements of a site.

Lab tools give a virtual score based on a simulated crawl using algorithms that approximate Internet conditions that a typical mobile phone user might encounter.

One way you can find LCP resources and measure the time to display them is via the DevTools > Performance report.
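
Another quick way to see the LCP element and its timing directly in the browser is the PerformanceObserver API. Here is a minimal sketch; the inline type is only there for TypeScript:

```typescript
// Log each largest-contentful-paint candidate; the last entry reported
// before the user interacts with the page is the final LCP.
const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    const lcpEntry = entry as PerformanceEntry & { element?: Element };
    console.log("LCP candidate (ms):", lcpEntry.startTime, lcpEntry.element);
  }
});
observer.observe({ type: "largest-contentful-paint", buffered: true });
```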

You can read more in our in-depth guide on how to measure CWV metrics, where you can learn how to troubleshoot not only LCP but other metrics altogether.

LCP Optimization Is A Much More In-Depth Subject

Improving LCP is a crucial step toward improving CWV, but it can be the most challenging CWV metric to optimize.

LCP consists of multiple layers of sub-metrics, each requiring a thorough understanding for effective optimization.

This guide has given you a basic idea of improving LCP, and the insights you’ve gained thus far will help you make significant improvements.

But there’s still more to learn. Optimizing each sub-metric is a nuanced science. Stay tuned, as we’ll publish in-depth guides dedicated to optimizing each sub-metric.

Featured image credit: BestForBest/Shutterstock


Data Confirms Disruptive Potential Of SearchGPT


Researchers analyzed SearchGPT’s responses to queries and identified how it may impact publishers, B2B websites, and e-commerce, discovering key differences between SearchGPT, AI Overviews, and Perplexity.

What is SearchGPT?

SearchGPT is a prototype natural language search engine created by OpenAI that combines a generative AI model with the most current web data to provide contextually relevant answers in a natural language interface that includes citations to relevant online sources.

OpenAI has not offered detailed information about how SearchGPT accesses web information. But the fact that it uses generative AI models means that it likely uses Retrieval Augmented Generation (RAG), a technology that connects an AI language model to indexed web data to give it access to information that it wasn’t trained on. This enables AI search to provide contextually relevant answers that are up to date and grounded with authoritative and trustworthy web sources.
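
OpenAI hasn’t published these details, so purely as an illustration of the RAG pattern described above, here is a minimal sketch; `searchIndex` and `generate` are hypothetical placeholders for a web index lookup and a language model call:

```typescript
// Retrieval Augmented Generation, sketched:
// 1) retrieve relevant documents for the query from an index,
// 2) pass them to the language model as grounding context,
// 3) return the answer together with the sources used.
interface Doc {
  url: string;
  text: string;
}

async function answerWithRag(
  query: string,
  searchIndex: (q: string) => Promise<Doc[]>,    // hypothetical index lookup
  generate: (prompt: string) => Promise<string>  // hypothetical model call
): Promise<{ answer: string; sources: string[] }> {
  const docs = await searchIndex(query);
  const context = docs
    .map((d, i) => `[${i + 1}] ${d.url}\n${d.text}`)
    .join("\n\n");
  const prompt =
    `Answer the question using only the sources below and cite them by number.\n\n` +
    `${context}\n\nQuestion: ${query}`;
  const answer = await generate(prompt);
  return { answer, sources: docs.map((d) => d.url) };
}
```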

How BrightEdge Analyzed SearchGPT

BrightEdge used a pair of search marketing research tools developed for enterprise users to help identify search and content opportunities and emerging trends, and to conduct deep competitor analysis.

They used their proprietary DataCube X and the BrightEdge Generative Parser™ to extract data points from SearchGPT, AI Overviews and Perplexity.

Here’s how it was done:

“BrightEdge compared SearchGPT, Google’s AI Overviews, and Perplexity.

To evaluate SearchGPT against Google’s AI Overviews and Perplexity, BrightEdge utilized DataCube X alongside BrightEdge Generative Parser ™ to identify a high-volume term and question based on exact match volumes. These queries were then input into all three engines to evaluate their approach, intent interpretation, and answer-sourcing methods.

This comparative study employs real, popular searches within each sector to accurately reflect the performance of these engines for typical users.”

DataCube X was used to identify high-volume keywords and questions; all volumes were based on exact matches.

Each search engine was analyzed for:

  1. Approach to the query
  2. Ability to interpret intent
  3. Method of sourcing answers

SearchGPT Versus Google AI Overviews

Research conducted by BrightEdge indicates that SearchGPT offers comprehensive answers, while Google AI Overviews (AIO) provides answers that are more concise and also has an edge in surfacing current trends.

The difference is that SearchGPT, in its current state, is better for deep research, while Google AIO excels at giving quick answers that are also aware of current trends.

Strength: BrightEdge’s report indicates that SearchGPT answers rely on a diverse set of authoritative web resources that reflect academic, industry-specific, and government sources.

Weakness: The results of the report imply that SearchGPT’s weakness in a comparison with AIO is in the area of trends, where Google AIO was found to be more articulate.

SearchGPT Versus Perplexity

The researchers concluded that Perplexity offers concise answers that are tightly focused on topicality. This suggests that Perplexity, which styles itself as an “answer engine,” shares strengths with Google’s AIO in terms of providing concise answers. If I were to speculate, I would say that this might reflect a focus on satisfaction metrics that are biased toward more immediate answers.

Strength: Because SearchGPT seems to be tuned more for research and for high-quality information sources, it could be said to have an edge over Perplexity as a more comprehensive and potentially more trustworthy research tool.

Weakness: Perplexity was found to be a more concise source of answers, excelling at summarizing online sources of information for answers to questions.

SearchGPT’s focus on facilitating research makes sense because the eventual context of SearchGPT is as a complement to ChatGPT.

Is SearchGPT A Competitor To Google?

SearchGPT is not positioned as a standalone competitor to Google: OpenAI’s stated plan is to incorporate it into ChatGPT rather than operate it as a standalone search engine.

This is how OpenAI explains it:

“We also plan to get feedback on the prototype and bring the best of the experience into ChatGPT.

…Please note that we plan to integrate the SearchGPT experience into ChatGPT in the future. SearchGPT combines the strength of our AI models with information from the web to give you fast and timely answers with clear, relevant sources.”

Is SearchGPT then a competitor to Google? The more appropriate question is if ChatGPT is building toward disrupting the entire concept of organic search.

Google has done a fair job of exhausting and disenchanting users with ads, tracking and data mining their personal lives.  So it’s not implausible that a more capable version of ChatGPT could redefine how people get answers.

BrightEdge’s research discovered that SearchGPT’s strength was in facilitating trustworthy research. That makes even more sense with the understanding that SearchGPT is currently planned to be integrated into ChatGPT, not as a competitor to Google but as a competitor to the concept of organic search.

Takeaways: What SEOs And Marketers Need To Know

The major takeaways from the research can be broken down into five ways SearchGPT is better than Google AIO and Perplexity.

1. Diverse Authoritative Sources
The research shows that SearchGPT consistently surfaces answers from authoritative and trustworthy sources.

“Its knowledge base spans academic resources, specialized industry platforms, official government pages, and reputable commercial websites.”

2. Comprehensive Answers
BrightEdge’s analysis showed that SearchGPT delivers comprehensive answers on any topic, simplifying them into clear, understandable responses.

3. Proactive Query Interpretation
This is really interesting: the researchers discovered that SearchGPT was not only able to understand the user’s immediate information need, but also answered questions with an expanded breadth of coverage.

BrightEdge explained it like this:

“Its initial response often incorporates additional relevant information, illustrative examples, and real-world applications.”

4. Pragmatic And Practical
SearchGPT tended to provide practical answers that were good for ecommerce search queries. BrightEdge noted:

“It frequently offers specific product suggestions and recommendations.”

5. Wide-Ranging Topic Expertise
The research paper noted that SearchGPT correctly used industry jargon, even for esoteric B2B search queries. The researchers explained:

“This approach caters to both general users and industry professionals alike.”

Read the research results on SearchGPT here.

Featured Image by Shutterstock/Khosro
