10 Steps To Boost Your Site’s Crawlability And Indexability

Keywords and content may be the twin pillars upon which most search engine optimization strategies are built, but they’re far from the only ones that matter.

Less commonly discussed but equally important – not just to users but to search bots – is your website’s discoverability.

There are roughly 50 billion webpages on 1.93 billion websites on the internet. This is far too many for any human team to explore, so these bots, also called spiders, perform a significant role.

These bots determine each page’s content by following links from website to website and page to page. This information is compiled into a vast database, or index, of URLs, which are then put through the search engine’s algorithm for ranking.

This two-step process of navigating and understanding your site is called crawling and indexing.

As an SEO professional, you’ve undoubtedly heard these terms before, but let’s define them just for clarity’s sake:

  • Crawlability refers to how well these search engine bots can scan and index your webpages.
  • Indexability measures the search engine’s ability to analyze your webpages and add them to its index.

As you can probably imagine, these are both essential parts of SEO.

If your site suffers from poor crawlability (for example, many broken links and dead ends), search engine crawlers won’t be able to access all of your content, and whatever they can’t reach will be excluded from the index.

Indexability, on the other hand, is vital because pages that are not indexed will not appear in search results. How can Google rank a page it hasn’t included in its database?

The crawling and indexing process is a bit more complicated than we’ve discussed here, but that’s the basic overview.

If you’re looking for a more in-depth discussion of how they work, Dave Davies has an excellent piece on crawling and indexing.

How To Improve Crawling And Indexing

Now that we’ve covered just how important these two processes are, let’s look at some elements of your website that affect crawling and indexing – and discuss ways to optimize your site for them.

1. Improve Page Loading Speed

With billions of webpages to catalog, web spiders don’t have unlimited time or resources to spend on any one site; the attention they allocate to yours is sometimes referred to as your crawl budget.

If your pages take too long to load, crawlers will move on, which means your site may remain uncrawled and unindexed. And as you can imagine, this is not good for SEO purposes.

Thus, it’s a good idea to regularly evaluate your page speed and improve it wherever you can.

You can use Google Search Console or tools like Screaming Frog to check your website’s speed.

If your site is running slow, take steps to alleviate the problem. This could include upgrading your server or hosting platform, enabling compression, minifying CSS, JavaScript, and HTML, and eliminating or reducing redirects.

Figure out what’s slowing down your load time by checking your Core Web Vitals report. If you want more refined information about your goals, particularly from a user-centric view, Google Lighthouse is an open-source tool you may find very useful.
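If you want to monitor this from a script rather than a dashboard, the PageSpeed Insights API exposes the same Lighthouse data programmatically. Below is a minimal sketch, assuming the requests library is installed and using a placeholder URL; the exact response fields can shift between API versions, so treat it as a starting point rather than a finished monitor.

```python
import requests

# PageSpeed Insights v5 endpoint (an API key is optional for light usage).
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def get_performance_score(page_url: str, strategy: str = "mobile") -> float:
    """Fetch the Lighthouse performance score (0-1) for a page."""
    response = requests.get(
        PSI_ENDPOINT,
        params={"url": page_url, "strategy": strategy},
        timeout=60,
    )
    response.raise_for_status()
    data = response.json()
    return data["lighthouseResult"]["categories"]["performance"]["score"]

if __name__ == "__main__":
    # Placeholder URL - swap in the templates you care about.
    print(get_performance_score("https://www.example.com/"))
```

Running this against a handful of key templates (homepage, category, article) is usually enough to spot regressions without auditing every URL.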

2. Strengthen Internal Link Structure

A good site structure and internal linking are foundational elements of a successful SEO strategy. A disorganized website is difficult for search engines to crawl, which makes internal linking one of the most important things you can do for your site.

But don’t just take our word for it. Here’s what Google’s search advocate John Mueller had to say about it:

“Internal linking is super critical for SEO. I think it’s one of the biggest things that you can do on a website to kind of guide Google and guide visitors to the pages that you think are important.”

If your internal linking is poor, you also risk orphaned pages, that is, pages that aren’t linked to from any other part of your website. Because nothing points to these pages, the only way for search engines to find them is through your sitemap.

To eliminate this problem and others caused by poor structure, create a logical internal structure for your site.

Your homepage should link to subpages supported by pages further down the pyramid. These subpages should then have contextual links where it feels natural.

Another thing to keep an eye on is broken links, including those caused by typos in the URL. These send visitors (and bots) to the dreaded 404 error: page not found.

The problem is that broken links don’t just fail to help; they actively harm your crawlability.

Double-check your URLs, particularly if you’ve recently undergone a site migration, bulk delete, or structure change. And make sure you’re not linking to old or deleted URLs.

Other best practices for internal linking include having a good amount of linkable content (content is always king), using anchor text instead of linked images, and using a “reasonable number” of links on a page (whatever that means).

Oh yeah, and ensure you’re using follow links for internal links.
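One practical way to catch orphaned pages is to compare the URLs you expect to exist (for example, from your sitemap) against the URLs your internal links actually point to. The sketch below is a minimal illustration with made-up data; in practice you would feed it the export from a crawler such as Screaming Frog.

```python
# URLs you expect to be discoverable (e.g., pulled from your sitemap).
sitemap_urls = {
    "https://example.com/",
    "https://example.com/blog/",
    "https://example.com/blog/post-1",
    "https://example.com/legacy-landing-page",  # nothing links here
}

# Crawl data: each page mapped to the internal URLs it links out to.
internal_links = {
    "https://example.com/": {"https://example.com/blog/"},
    "https://example.com/blog/": {"https://example.com/blog/post-1"},
    "https://example.com/blog/post-1": {"https://example.com/"},
}

linked_to = set()
for outbound in internal_links.values():
    linked_to.update(outbound)

# The homepage is the crawl entry point, so exclude it from the orphan check.
orphans = sitemap_urls - linked_to - {"https://example.com/"}
print("Orphaned pages:", orphans or "none")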

3. Submit Your Sitemap To Google

Given enough time, and assuming you haven’t told it not to, Google will crawl your site. And that’s great, but it’s not helping your search ranking while you’re waiting.

If you’ve recently made changes to your content and want Google to know about it immediately, it’s a good idea to submit a sitemap to Google Search Console.

A sitemap is a file that lives in your root directory. It serves as a roadmap for search engines with direct links to every page on your site.

This is beneficial for indexability because it allows Google to learn about multiple pages simultaneously. Whereas a crawler may have to follow five internal links to discover a deep page, by submitting an XML sitemap, it can find all of your pages with a single visit to your sitemap file.

Submitting your sitemap to Google is particularly useful if you have a deep website, frequently add new pages or content, or your site does not have good internal linking.
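If your CMS doesn’t generate a sitemap for you, producing a basic one is straightforward. Here is a minimal sketch using Python’s standard library; the URL list is a placeholder for whatever your site actually publishes.

```python
import xml.etree.ElementTree as ET

# Placeholder list of canonical URLs to include.
urls = [
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/blog/crawlability-checklist/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc in urls:
    url_element = ET.SubElement(urlset, "url")
    ET.SubElement(url_element, "loc").text = loc

# Write sitemap.xml, upload it to your site root, then submit its URL
# in Google Search Console under the Sitemaps report.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```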

4. Update Robots.txt Files

You probably want to have a robots.txt file for your website. While it’s not required, as a rule of thumb most websites use one. If you’re unfamiliar with it, it’s a plain text file that lives in your website’s root directory.

It tells search engine crawlers how you would like them to crawl your site. Its primary use is to manage bot traffic and keep your site from being overloaded with requests.

Where this comes in handy in terms of crawlability is limiting which pages Google crawls and indexes. For example, you probably don’t want pages like directories, shopping carts, and tags in Google’s index.

Of course, this helpful text file can also negatively impact your crawlability. It’s well worth looking at your robots.txt file (or having an expert do it if you’re not confident in your abilities) to see if you’re inadvertently blocking crawler access to your pages.

Some common mistakes in robots.txt files include:

  • Robots.txt is not in the root directory.
  • Poor use of wildcards.
  • Noindex in robots.txt.
  • Blocked scripts, stylesheets and images.
  • No sitemap URL.

For an in-depth examination of each of these issues, and tips for resolving them, read this article.
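For a quick spot check of your own rules, Python’s built-in urllib.robotparser can test specific URLs against your live robots.txt. A minimal sketch, using a placeholder domain and a couple of example paths:

```python
from urllib import robotparser

# Parse the live robots.txt (placeholder domain) and test URLs against
# the rules that apply to Googlebot.
rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

for url in [
    "https://www.example.com/blog/important-post/",
    "https://www.example.com/cart/",
]:
    allowed = rp.can_fetch("Googlebot", url)
    print(("ALLOWED " if allowed else "BLOCKED "), url)
```

If a page you want indexed shows up as blocked, that’s the first thing to fix.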

5. Check Your Canonicalization

Canonical tags consolidate signals from multiple URLs into a single canonical URL. This can be a helpful way to tell Google to index the pages you want while skipping duplicates and outdated versions.

But this opens the door for rogue canonical tags. These point to old versions of a page that no longer exist, which can lead search engines to index the wrong pages and leave your preferred pages invisible.

To eliminate this problem, use a URL inspection tool to scan for rogue tags and remove them.

If your website is geared towards international traffic, i.e., if you direct users in different countries to different canonical pages, you need to have canonical tags for each language. This ensures your pages are being indexed in each language your site is using.
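If you want to audit canonicals in bulk, the core task is just pulling the rel="canonical" link out of each page’s HTML and flagging anything that points somewhere unexpected. Below is a minimal sketch using the standard library’s HTML parser, with a hypothetical page as input; in practice you would fetch each URL first.

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects href values from <link rel="canonical"> tags."""

    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "link" and attributes.get("rel", "").lower() == "canonical":
            self.canonicals.append(attributes.get("href"))

# Hypothetical page source and URL.
page_url = "https://example.com/new-page/"
html = '<html><head><link rel="canonical" href="https://example.com/old-page/"></head></html>'

finder = CanonicalFinder()
finder.feed(html)
for canonical in finder.canonicals:
    if canonical != page_url:
        print(f"Review: {page_url} declares canonical {canonical}")
```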

6. Perform A Site Audit

Now that you’ve performed all these other steps, there’s still one final thing you need to do to ensure your site is optimized for crawling and indexing: a site audit. And that starts with checking the percentage of pages Google has indexed for your site.

Check Your Indexability Rate

Your indexability rate is the number of pages in Google’s index divided by the number of pages on your website.

You can find out how many pages are in Google’s index by going to the “Pages” report in Google Search Console, and you can check the total number of pages on your website from your CMS admin panel.

There’s a good chance your site will have some pages you don’t want indexed, so this number likely won’t be 100%. But if the indexability rate is below 90%, then you have issues that need to be investigated.
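As a worked example of the calculation, assuming the indexed count comes from the Pages report and the total from your CMS:

```python
# Hypothetical counts.
indexed_pages = 1840   # from Search Console's Pages report
total_pages = 2000     # from your CMS admin panel

indexability_rate = indexed_pages / total_pages
print(f"Indexability rate: {indexability_rate:.0%}")  # 92%, above the ~90% line

if indexability_rate < 0.9:
    print("Investigate: pull the non-indexed URLs from Search Console and audit them.")
```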

You can export your non-indexed URLs from Search Console and run an audit on them. This could help you understand what is causing the issue.

Another useful site auditing tool included in Google Search Console is the URL Inspection Tool. This allows you to see what Google spiders see, which you can then compare to real webpages to understand what Google is unable to render.

Audit Newly Published Pages

Any time you publish new pages to your website or update your most important pages, you should make sure they’re being indexed. Go into Google Search Console and make sure they’re all showing up.

If you’re still having issues, an audit can also give you insight into which other parts of your SEO strategy are falling short, so it’s a double win. Scale your audit process with tools like:

  1. Screaming Frog
  2. Semrush
  3. Ziptie
  4. Oncrawl
  5. Lumar

7. Check For Low-Quality Or Duplicate Content

If Google doesn’t view your content as valuable to searchers, it may decide it’s not worth indexing. This thin content, as it’s known, could be poorly written content (e.g., filled with grammar mistakes and spelling errors), boilerplate content that’s not unique to your site, or content with no external signals about its value and authority.

To find this, determine which pages on your site are not being indexed, and then review the target queries for them. Are they providing high-quality answers to the questions of searchers? If not, replace or refresh them.

Duplicate content is another reason bots can get hung up while crawling your site. Basically, what happens is that your coding structure has confused the crawler, and it doesn’t know which version to index. This could be caused by things like session IDs, redundant content elements, and pagination issues.

Sometimes, this will trigger an alert in Google Search Console, telling you Google is encountering more URLs than it thinks it should. If you haven’t received one, check your crawl results for things like duplicate or missing tags, or URLs with extra characters that could be creating extra work for bots.

Correct these issues by fixing tags, removing pages or adjusting Google’s access.
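One lightweight way to surface exact duplicates during an audit is to hash the main content of each URL and group identical hashes; near-duplicates need a fuzzier comparison, but this catches the session-ID and parameter cases. The crawl data below is hypothetical:

```python
import hashlib
from collections import defaultdict

# Hypothetical crawl export: URL -> extracted main content.
pages = {
    "https://example.com/product?id=1": "Blue widget, 10cm, free shipping.",
    "https://example.com/product?id=1&session=abc": "Blue widget, 10cm, free shipping.",
    "https://example.com/product?id=2": "Red widget, 12cm, free shipping.",
}

groups = defaultdict(list)
for url, content in pages.items():
    digest = hashlib.sha256(content.strip().lower().encode("utf-8")).hexdigest()
    groups[digest].append(url)

for urls in groups.values():
    if len(urls) > 1:
        print("Exact duplicates:")
        for url in urls:
            print("  ", url)
```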

8. Eliminate Redirect Chains And Internal Redirects

As websites evolve, redirects are a natural byproduct, directing visitors from one page to a newer or more relevant one. But while they’re common on most sites, if you’re mishandling them, you could be inadvertently sabotaging your own indexing.

There are several mistakes you can make when creating redirects, but one of the most common is redirect chains. These occur when there’s more than one redirect between the link clicked on and the destination. Google doesn’t look on this as a positive signal.

In more extreme cases, you may initiate a redirect loop, in which a page redirects to another page, which redirects to another page, and so on, until it eventually links back to the very first page. In other words, you’ve created a never-ending loop that goes nowhere.

Check your site’s redirects using Screaming Frog, Redirect-Checker.org or a similar tool.
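If you prefer to script the check, the requests library records every hop it follows in response.history, which makes chains easy to spot (and it raises TooManyRedirects if it hits a loop). A minimal sketch with a placeholder URL:

```python
import requests

def report_redirect_chain(url: str) -> None:
    """Follow redirects and list each hop; more than one hop is a chain."""
    response = requests.get(url, allow_redirects=True, timeout=30)
    hops = response.history  # earlier responses in the chain, in order
    if not hops:
        print(f"{url}: no redirects")
        return
    print(f"{url}: {len(hops)} redirect hop(s)")
    for hop in hops:
        print(f"  {hop.status_code} {hop.url} -> {hop.headers.get('Location')}")
    print(f"  final: {response.status_code} {response.url}")

# Placeholder URL - run this over your list of redirected URLs.
report_redirect_chain("https://www.example.com/old-page/")
```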

9. Fix Broken Links

In a similar vein, broken links can wreak havoc on your site’s crawlability. You should regularly be checking your site to ensure you don’t have broken links, as this will not only hurt your SEO results, but will frustrate human users.

There are a number of ways to find broken links on your site: you can manually evaluate each and every link (header, footer, navigation, in-text, etc.), or you can use Google Search Console, Analytics, or Screaming Frog to surface 404 errors.

Once you’ve found broken links, you have three options for fixing them: redirecting them (see the section above for caveats), updating them or removing them.
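For a simple automated pass, request each internal URL and flag anything that returns a 4xx status. A minimal sketch, assuming the requests library is installed and using a placeholder list of links pulled from a crawl export:

```python
import requests

# Placeholder list of internal links to check.
links = [
    "https://www.example.com/",
    "https://www.example.com/blog/deleted-post/",
]

for url in links:
    try:
        # HEAD is usually enough to get the status code without the body.
        status = requests.head(url, allow_redirects=True, timeout=15).status_code
    except requests.RequestException as error:
        print(f"ERROR {url} ({error})")
        continue
    if status >= 400:
        print(f"{status} {url} <- redirect, update, or remove this link")
```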

10. IndexNow

IndexNow is a relatively new protocol that allows you to submit URLs to participating search engines simultaneously via an API. It works like a super-charged version of submitting an XML sitemap by alerting search engines about new URLs and changes to your website.

Basically, what it does is provide crawlers with a roadmap to your site upfront. They enter your site with the information they need, so there’s no need to constantly recheck the sitemap. And unlike XML sitemaps, it allows you to inform search engines about non-200 status code pages.

Implementing it is easy, and only requires you to generate an API key, host it in your directory or another location, and submit your URLs in the recommended format.
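A basic submission looks roughly like the sketch below, assuming the requests library is installed; the key, key file location, and URL list are placeholders you would replace with your own values (check the IndexNow documentation for the exact requirements).

```python
import requests

# Placeholder key; the same value must be served from the keyLocation file.
payload = {
    "host": "www.example.com",
    "key": "your-indexnow-key",
    "keyLocation": "https://www.example.com/your-indexnow-key.txt",
    "urlList": [
        "https://www.example.com/new-page/",
        "https://www.example.com/updated-page/",
    ],
}

response = requests.post(
    "https://api.indexnow.org/indexnow",
    json=payload,
    headers={"Content-Type": "application/json; charset=utf-8"},
    timeout=30,
)
print(response.status_code)  # 200 or 202 generally means the submission was accepted
```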

Wrapping Up

By now, you should have a good understanding of your website’s indexability and crawlability. You should also understand just how important these two factors are to your search rankings.

If Google’s spiders can’t crawl and index your site, it doesn’t matter how many keywords, backlinks, and tags you use – you won’t appear in search results.

And that’s why it’s essential to regularly check your site for anything that could be waylaying, misleading, or misdirecting bots.

So, get yourself a good set of tools and get started. Be diligent and mindful of the details, and you’ll soon have Google’s spiders swarming all over your site.


Google’s AI Overviews Shake Up Ecommerce Search Visibility

An analysis of 25,000 ecommerce queries by Bartosz Góralewicz, founder of Onely, reveals the impact of Google’s AI overviews on search visibility for online retailers.

The study found that 16% of ecommerce queries now return an AI overview in search results, accounting for 13% of total search volume in this sector.

Notably, 80% of the sources listed in these AI overviews do not rank organically for the original query.

“Ranking #1-3 gives you only an 8% chance of being a source in AI overviews,” Góralewicz stated.

Shift Toward “Accelerated” Product Experiences

International SEO consultant Aleyda Solis analyzed the disconnect between traditional organic ranking and inclusion in AI overviews.

According to Solis, for product-related queries, Google is prioritizing an “accelerated” approach over summarizing currently ranking pages.

She commented on Góralewicz’s findings, stating:

“… rather than providing high level summaries of what’s already ranked organically below, what Google does with e-commerce is “accelerate” the experience by already showcasing what the user would get next.”

Solis explains that for queries where Google previously ranked category pages, reviews, and buying guides, it’s now bypassing this level of results with AI overviews.

Assessing AI Overview Traffic Impact

To help retailers evaluate their exposure, Solis has shared a spreadsheet that analyzes the potential traffic impact of AI overviews.

As Góralewicz notes, this could be just an initial rollout; he speculates that “Google will expand AI overviews for high-cost queries when enabling ads,” based on data showing they are currently excluded for high cost-per-click keywords.

An in-depth report across ecommerce and publishing is expected soon from Góralewicz and Onely, with additional insights into this search trend.

Why SEJ Cares

AI overviews represent a shift in how search visibility is achieved for ecommerce websites.

With most overviews currently pulling product data from non-ranking sources, the traditional connection between organic rankings and search traffic is being disrupted.

Retailers may need to adapt their SEO strategies for this new search environment.

How This Can Benefit You

While unsettling for established brands, AI overviews create new opportunities for retailers to gain visibility without competing for the most commercially valuable keywords.

Ecommerce sites can potentially circumvent traditional ranking barriers by optimizing product data and detail pages for Google’s “accelerated” product displays.

The detailed assessment framework provided by Solis enables merchants to audit their exposure and prioritize optimization needs accordingly.


FAQ

What are the key findings from the analysis of AI overviews & ecommerce queries?

Góralewicz’s analysis of 25,000 ecommerce queries found:

  • 16% of ecommerce queries now return an AI overview in the search results.
  • 80% of the sources listed in these AI overviews do not rank organically for the original query.
  • Ranking positions #1-3 only provides an 8% chance of being a source in AI overviews.

These insights reveal significant shifts in how ecommerce sites need to approach search visibility.

Why are AI overviews pulling product data from non-ranking sources, and what does this mean for retailers?

Google’s AI overviews prioritize “accelerated” experiences over summarizing currently ranked pages for product-related queries.

This shift focuses on showcasing directly what users seek instead of traditional organic results.

For retailers, this means:

  • A need to optimize product pages beyond traditional SEO practices, catering to the data requirements of AI overviews.
  • Opportunities to gain visibility without necessarily holding top organic rankings.
  • Potential to bypass traditional ranking barriers by focusing on enhanced product data integration.

Retailers must adapt quickly to remain competitive in this evolving search environment.

What practical steps can retailers take to evaluate and improve their search visibility in light of AI overview disruptions?

Retailers can take several practical steps to evaluate and improve their search visibility:

  • Utilize the spreadsheet provided by Aleyda Solis to assess the potential traffic impact of AI overviews.
  • Optimize product and detail pages to align with the data and presentation style preferred by AI overviews.
  • Continuously monitor changes and updates to AI overviews, adapting strategies based on new data and trends.

These steps can help retailers navigate the impact of AI overviews and maintain or improve their search visibility.



Google’s AI Overviews Go Viral, Draw Mainstream Media Scrutiny

Google’s rollout of AI-generated overviews in US search results is taking a disastrous turn, with mainstream media outlets like The New York Times, BBC, and CNBC reporting on numerous inaccuracies and bizarre responses.

On social media, users are sharing endless examples of the feature’s nonsensical and sometimes dangerous output.

From recommending non-toxic glue on pizza to suggesting that eating rocks provides nutritional benefits, the blunders would be amusing if they weren’t so alarming.

Mainstream Media Coverage

As reported by The New York Times, Google’s AI overviews struggle with basic facts, claiming that Barack Obama was the first Muslim president of the United States and stating that Andrew Jackson graduated from college in 2005.

These errors undermine trust in Google’s search engine, which more than two billion people rely on for authoritative information worldwide.

Manual Removal & System Refinements

As reported by The Verge, Google is now scrambling to manually remove the bizarre AI-generated responses and improve its systems.

A Google spokesperson confirmed that the company is taking “swift action” to remove problematic responses and using the examples to refine its AI overview feature.

Google’s Rush To AI Integration

The flawed rollout of AI overviews isn’t an isolated incident for Google.

As CNBC notes in its report, Google made several missteps in a rush to integrate AI into its products.

In February, Google was forced to pause its Gemini chatbot after it generated inaccurate images of historical figures and refused to depict white people in most instances.

Before that, the company’s Bard chatbot faced ridicule for sharing incorrect information about outer space, leading to a $100 billion drop in Google’s market value.

Despite these setbacks, industry experts cited by The New York Times suggest that Google has little choice but to continue advancing AI integration to remain competitive.

However, the challenges of taming large language models, which ingest false information and satirical posts, are now more apparent.

The Debate Over AI In Search

The controversy surrounding AI overviews adds fuel to the debate over the risks and limitations of AI.

While the technology holds potential, these missteps remind everyone that more testing is needed before unleashing it on the public.

The BBC notes that Google’s rivals face similar backlash over their attempts to cram more AI tools into their consumer-facing products.

The UK’s data watchdog is investigating Microsoft after it announced a feature that would take continuous screenshots of users’ online activity.

At the same time, actress Scarlett Johansson criticized OpenAI for using a voice that resembled her own without permission.

What This Means For Websites & SEO Professionals

Mainstream media coverage of Google’s erroneous AI overviews brings the issue of declining search quality to public attention.

As the company works to address inaccuracies, the incident serves as a cautionary tale for the entire industry.

Important takeaway: Prioritize responsible use of AI technology to ensure the benefits outweigh its risks.




New Google Search Ads Resemble AI Assistant App

A keynote at Google’s Marketing Live event showed new AI-powered visual search results that feature advertisements engaging users within the context of an AI-assisted search, blurring the line between AI-generated search results and advertisements.

Google Lens is a truly helpful app, but it becomes unconventional when it blurs the line between an assistant helping users and leading them to a shopping cart. This new way of engaging potential customers with AI is so far out there that the presenter never calls it advertising; he doesn’t even use the word.

Visual Search Traffic Opportunity?

Google’s Group Product Manager, Sylvanus Bent, begins the presentation with an overview of the next version of Google Lens visual search, which will be useful for surfacing information about products and for finding where to buy them.

Sylvanus explained how it will be an opportunity for websites to receive traffic from this new way to search.

“…whether you’re snapping a photo with lens or circling to search something on your social feed, visual search unlocks new ways to explore whatever catches your eye, and we recently announced a newly redesigned results page for Visual search.

Soon, instead of just visual matches, you’ll see a wide range of results, from images to video, web links, and facts about the knowledge graph. It gets people the helpful information they need and creates new opportunities for sites to be discovered.”

It’s hard to say whether or not this will bring search traffic to websites and what the quality of that traffic will be. Will they stick around to read an article? Will they engage with a product review?

Visual Search Results

Sylvanus shares a hypothetical example of someone at an airport baggage claim who takes a liking to someone else’s bag. He explains that all the person needs to do is snap a photo of the bag, and Google Lens will take them directly to shopping options.

He explains:

“No words, no problem. Just open Lens, take a quick picture and immediately you’ll see options to purchase.

And for the first time, shopping ads will appear at the very top of the results on linked searches, where a business can offer what a consumer is looking for.

This will help them easily purchase something that catches their eye.”

These are image-heavy shopping ads at the top of the search results, and as annoying as that may be, it’s nowhere near the “next level” advertising that is coming to Google’s search ads, where Google presents a paid promotion within the context of an AI assistant.

Interactive Search Shopping

Sylvanus next describes an AI-powered form of advertising that happens directly within search. But he doesn’t call it advertising; he doesn’t even use the word. He suggests this new form of AI search experience is more than an offer, saying that “it’s an experience.”

He’s right not to use the word advertisement, because what he describes goes far beyond advertising: it blurs the boundary between search and advertising by delivering paid suggestions within the context of AI-powered recommendations.

Sylvanus explains how this new form of shopping experience works:

“And next, imagine a world where every search ad is more than an offer. It’s an experience. It’s a new way for you to engage more directly with your customers. And we’re exploring search ads with AI powered recommendations across different verticals. So I want to show you an example that’s going live soon and you’ll see even more when we get to shopping.”

He uses the example of someone who needs to store their furniture for a few months and who turns to Google to find short term storage. What he describes is a query for local short term storage that turns into a “dynamic ad experience” that leads the searcher into throwing packing supplies into their shopping cart.

He narrated how it works:

“You search for short term storage and you see an ad for extra space storage. Now you can click into a new dynamic ad experience.

You can select and upload photos of the different rooms in your house, showing how much furniture you have, and then extra space storage with help from Google, AI generates a description of all your belongings for you to verify. You get a recommendation for the right size and type of storage unit and even how much packing supplies you need to get the job done. Then you just go to the website to complete the transaction.

And this is taking the definition of a helpful ad to the next level. It does everything but physically pick up your stuff and move it, and that is cool.”

Step 1: Search For Short Term Storage

[Screenshot: a search ad appears for the short-term storage query]

The above screenshot shows an advertisement that, when clicked, takes the user to what looks like an AI-assisted search but is really an interactive advertisement.

Step 2: Upload Photos For “AI Assistance”

[Screenshot: the ad invites the user to upload photos for “AI assistance”]

The above image is a screenshot of an advertisement that is presented in the context of AI-assisted search. Masking an advertisement within a different context is the same principle behind an advertorial, where an advertisement is hidden in the form of an article. The phrases “Let AI do the heavy lifting” and “AI-powered recommendations” create the context of AI search that masks the true context of an advertisement.

Step 3: Images Chosen For Uploading

[Screenshot: images selected for uploading]

The above screenshot shows how a user uploads an image to the AI-powered advertisement within the context of an AI-powered search app.

The Word “App” Masks That This Is An Ad

[Screenshot: the interactive advertisement identifies itself as an “app”]

Above is a screenshot of how a user uploads a photo to the AI-powered interactive advertisement within the context of a visual search engine, using the word “app” to further the illusion that the user is interacting with an app and not an advertisement.

Upload Process Masks The Advertising Context

[Screenshot: the upload flow presents the advertisement in the context of an AI assistant]

The phrase “Generative AI is experimental” contributes to the illusion that this is an AI-assisted search.

Step 4: Upload Confirmation

[Screenshot: upload confirmation]

In step 4, the “app” asks the user to confirm that the AI correctly identified the furniture that needs to be put into storage.

Step 5: AI “Recommendations”

[Screenshot: AI “recommendations” that resemble search results]

The above screenshot shows “AI recommendations” that look like search results.

The Recommendations Are Ad Units

[Screenshot: the recommendations are ad units]

Those recommendations are actually ad units that, when clicked, take the user to the “Extra Space Storage” shopping website.

Step 6: Searcher Visits Advertiser Website

[Screenshot: the searcher arrives on the advertiser’s website]

Blurring The Boundaries

What the Google keynote speaker describes is the integration of paid product suggestions into an AI-assisted search. This kind of advertising is so far out there that the Googler doesn’t even call it advertising, and rightfully so, because it blurs the line between AI-assisted search and advertising. At what point does a helpful AI search become just a platform for using AI to offer paid suggestions?

Watch The Keynote At The 32 Minute Mark

