SEO
10 Steps To Boost Your Site’s Crawlability And Indexability

Keywords and content may be the twin pillars upon which most search engine optimization strategies are built, but they’re far from the only ones that matter.
Less commonly discussed but equally important – not just to users but to search bots – is your website’s discoverability.
There are roughly 50 billion webpages on 1.93 billion websites on the internet. That is far too many for any human team to explore, so search engine bots, also called spiders, play a significant role.
These bots determine each page’s content by following links from website to website and page to page. This information is compiled into a vast database, or index, of URLs, which are then put through the search engine’s algorithm for ranking.
This two-step process of navigating and understanding your site is called crawling and indexing.
As an SEO professional, you’ve undoubtedly heard these terms before, but let’s define them just for clarity’s sake:
- Crawlability refers to how easily search engine bots can access and crawl the pages on your website.
- Indexability measures the search engine’s ability to analyze your webpages and add them to its index.
As you can probably imagine, these are both essential parts of SEO.
If your site suffers from poor crawlability (many broken links and dead ends, for example), search engine crawlers won't be able to access all your content, and it will be excluded from the index.
Indexability, on the other hand, is vital because pages that are not indexed will not appear in search results. How can Google rank a page it hasn’t included in its database?
The crawling and indexing process is a bit more complicated than we’ve discussed here, but that’s the basic overview.
If you’re looking for a more in-depth discussion of how they work, Dave Davies has an excellent piece on crawling and indexing.
How To Improve Crawling And Indexing
Now that we've covered just how important these two processes are, let's look at some elements of your website that affect crawling and indexing, and discuss ways to optimize your site for them.
1. Improve Page Loading Speed
With billions of webpages to catalog, web spiders don't have all day to wait for your pages to load. The amount of time and resources a search engine is willing to spend crawling your site is sometimes referred to as a crawl budget.
If your pages take too long to load, crawlers will move on before they've seen everything, which means parts of your site will remain uncrawled and unindexed. And as you can imagine, this is not good for SEO purposes.
Thus, it’s a good idea to regularly evaluate your page speed and improve it wherever you can.
You can use Google Search Console or tools like Screaming Frog to check your website’s speed.
If your site is running slow, take steps to alleviate the problem. This could include upgrading your server or hosting platform, enabling compression, minifying CSS, JavaScript, and HTML, and eliminating or reducing redirects.
Figure out what’s slowing down your load time by checking your Core Web Vitals report. If you want more refined information about your goals, particularly from a user-centric view, Google Lighthouse is an open-source tool you may find very useful.
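If you want to pull the same lab data programmatically, for example to monitor a set of page templates over time, the PageSpeed Insights API exposes Lighthouse results over HTTP. Here is a minimal sketch, assuming you've created an API key in Google Cloud; the example URL is a placeholder, and the response field names reflect the v5 API at the time of writing:

```python
import json
import urllib.parse
import urllib.request

API_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_lab_metrics(page_url: str, api_key: str, strategy: str = "mobile") -> dict:
    """Query the PageSpeed Insights API and return a few Lighthouse lab metrics."""
    params = urllib.parse.urlencode({"url": page_url, "key": api_key, "strategy": strategy})
    with urllib.request.urlopen(f"{API_ENDPOINT}?{params}") as response:
        data = json.load(response)

    audits = data["lighthouseResult"]["audits"]
    return {
        "performance_score": data["lighthouseResult"]["categories"]["performance"]["score"],
        "largest_contentful_paint": audits["largest-contentful-paint"]["displayValue"],
        "cumulative_layout_shift": audits["cumulative-layout-shift"]["displayValue"],
    }

if __name__ == "__main__":
    print(fetch_lab_metrics("https://www.example.com/", api_key="YOUR_API_KEY"))
```

Run this on a handful of representative templates (homepage, category page, article page) rather than every URL; the scores rarely differ within a template.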
2. Strengthen Internal Link Structure
A good site structure and internal linking are foundational elements of a successful SEO strategy. A disorganized website is difficult for search engines to crawl, which makes internal linking one of the most important things you can work on.
But don’t just take our word for it. Here’s what Google’s search advocate John Mueller had to say about it:
“Internal linking is super critical for SEO. I think it’s one of the biggest things that you can do on a website to kind of guide Google and guide visitors to the pages that you think are important.”
If your internal linking is poor, you also risk orphaned pages: pages that no other page on your website links to. Because nothing points to these pages, the only way for search engines to find them is from your sitemap.
To eliminate this problem and others caused by poor structure, create a logical internal structure for your site.
Your homepage should link to subpages supported by pages further down the pyramid. These subpages should then have contextual links where it feels natural.
Another thing to keep an eye on is broken links, including those caused by typos in the URL. A mistyped URL leads to the dreaded 404 error: page not found.
Broken links don't just fail to help; they actively harm your crawlability.
Double-check your URLs, particularly if you’ve recently undergone a site migration, bulk delete, or structure change. And make sure you’re not linking to old or deleted URLs.
Other best practices for internal linking include having a good amount of linkable content (content is always king), using anchor text instead of linked images, and using a “reasonable number” of links on a page (whatever that means).
Oh yeah, and ensure you’re using follow links for internal links.
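Before moving on, it's worth checking for those orphaned pages mentioned above. One quick way is to compare the URLs in your sitemap against the URLs a crawler actually discovered by following internal links; anything in the first set but not the second has no internal links pointing to it. Here is a minimal sketch, assuming you've exported both lists to plain text files (the file names are placeholders, for example from a Screaming Frog crawl export):

```python
def load_urls(path: str) -> set[str]:
    """Read one URL per line, ignoring blanks, and normalize trailing slashes."""
    with open(path, encoding="utf-8") as f:
        return {line.strip().rstrip("/") for line in f if line.strip()}

sitemap_urls = load_urls("sitemap_urls.txt")   # every URL you want indexed
crawled_urls = load_urls("crawled_urls.txt")   # URLs reachable via internal links

orphan_candidates = sitemap_urls - crawled_urls
for url in sorted(orphan_candidates):
    print("Possible orphan page:", url)
```

Anything that shows up only in the sitemap deserves an internal link from a relevant page higher up in your structure.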
3. Submit Your Sitemap To Google
Given enough time, and assuming you haven’t told it not to, Google will crawl your site. And that’s great, but it’s not helping your search ranking while you’re waiting.
If you’ve recently made changes to your content and want Google to know about it immediately, it’s a good idea to submit a sitemap to Google Search Console.
A sitemap is a file that lives in your root directory. It serves as a roadmap for search engines, with direct links to every page on your site.
This is beneficial for indexability because it allows Google to learn about multiple pages simultaneously. Whereas a crawler may have to follow five internal links to discover a deep page, by submitting an XML sitemap, it can find all of your pages with a single visit to your sitemap file.
Submitting your sitemap to Google is particularly useful if you have a deep website, frequently add new pages or content, or your site does not have good internal linking.
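If your CMS doesn't already generate a sitemap for you, building a basic one is straightforward. Here is a minimal sketch following the sitemaps.org protocol, with placeholder URLs standing in for however you enumerate your pages:

```python
from xml.etree.ElementTree import Element, SubElement, ElementTree

def build_sitemap(urls: list[str], out_path: str = "sitemap.xml") -> None:
    """Write a minimal <urlset> sitemap with one <url><loc> entry per page."""
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        loc = SubElement(SubElement(urlset, "url"), "loc")
        loc.text = url
    ElementTree(urlset).write(out_path, encoding="utf-8", xml_declaration=True)

build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/blog/",
    "https://www.example.com/contact/",
])
```

Once the file is live at your root (for example, /sitemap.xml), submit it under the Sitemaps report in Google Search Console.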
4. Update Robots.txt Files
You probably want to have a robots.txt file for your website. It's not required, but the vast majority of websites use one. If you're unfamiliar with it, it's a plain text file that lives in your website's root directory.
It tells search engine crawlers how you would like them to crawl your site. Its primary use is to manage bot traffic and keep your site from being overloaded with requests.
Where this comes in handy for crawlability is limiting which pages Google crawls and indexes. For example, you probably don't want pages like directories, shopping carts, and tags in Google's index.
Of course, this helpful text file can also negatively impact your crawlability. It’s well worth looking at your robots.txt file (or having an expert do it if you’re not confident in your abilities) to see if you’re inadvertently blocking crawler access to your pages.
Some common mistakes in robots.txt files include:
- Robots.txt is not in the root directory.
- Poor use of wildcards.
- Noindex in robots.txt.
- Blocked scripts, stylesheets and images.
- No sitemap URL.
For an in-depth examination of each of these issues, and tips for resolving them, read this article.
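A quick programmatic sanity check can also catch accidental blocking. Python's standard library ships a robots.txt parser, so you can test whether specific URLs are crawlable for a given user agent; here is a minimal sketch with placeholder URLs:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # fetch and parse the live file

for url in [
    "https://www.example.com/blog/my-post/",
    "https://www.example.com/cart/",
]:
    allowed = rp.can_fetch("Googlebot", url)
    print(("ALLOWED " if allowed else "BLOCKED ") + url)
```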
5. Check Your Canonicalization
Canonical tags consolidate signals from multiple URLs into a single canonical URL. This can be a helpful way to tell Google to index the pages you want while skipping duplicates and outdated versions.
But this opens the door for rogue canonical tags: tags that point to older versions of a page that no longer exist, leading search engines to index the wrong pages and leaving your preferred pages invisible.
To eliminate this problem, use a URL inspection tool to scan for rogue tags and remove them.
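If you'd rather script that check across many pages, you can extract each page's rel="canonical" link and compare it with the URL you expect to be indexed. Here is a minimal sketch using only the standard library; the page URL is a placeholder, and a real audit would loop over your full URL list:

```python
import urllib.request
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Collects the href of any <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href")

def get_canonical(url: str):
    with urllib.request.urlopen(url) as response:
        html = response.read().decode("utf-8", errors="replace")
    parser = CanonicalParser()
    parser.feed(html)
    return parser.canonical

page = "https://www.example.com/blog/my-post/"
canonical = get_canonical(page)
if canonical and canonical.rstrip("/") != page.rstrip("/"):
    print(f"Check this one: {page} declares canonical {canonical}")
```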
If your website is geared toward international traffic, i.e., if you direct users in different countries to different canonical pages, you need to have canonical tags for each language. This ensures your pages are indexed in each language your site uses.
6. Perform A Site Audit
Now that you’ve performed all these other steps, there’s still one final thing you need to do to ensure your site is optimized for crawling and indexing: a site audit. And that starts with checking the percentage of pages Google has indexed for your site.
Check Your Indexability Rate
Your indexability rate is the number of pages in Google's index divided by the number of pages on your website.
You can find how many pages are in the Google index under the "Pages" tab of Google Search Console's indexing report, and check the total number of pages on your website from your CMS admin panel.
There’s a good chance your site will have some pages you don’t want indexed, so this number likely won’t be 100%. But if the indexability rate is below 90%, then you have issues that need to be investigated.
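The calculation itself is trivial; the value comes from tracking it over time and flagging when it dips. Here is a minimal sketch with placeholder numbers:

```python
def indexability_rate(indexed_pages: int, total_pages: int) -> float:
    """Share of your site's pages that Google has indexed."""
    return indexed_pages / total_pages

# Example: 1,840 pages indexed (Search Console) out of 2,000 pages in the CMS.
rate = indexability_rate(1840, 2000)
print(f"Indexability rate: {rate:.0%}")  # 92%
if rate < 0.90:
    print("Below 90%: investigate non-indexed URLs.")
```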
You can pull your non-indexed URLs from Search Console and run an audit on them. This could help you understand what is causing the issue.
Another useful site auditing tool included in Google Search Console is the URL Inspection Tool. This allows you to see what Google spiders see, which you can then compare to real webpages to understand what Google is unable to render.
Audit Newly Published Pages
Any time you publish new pages to your website or update your most important pages, you should make sure they’re being indexed. Go into Google Search Console and make sure they’re all showing up.
If you're still having issues, an audit can also give you insight into which other parts of your SEO strategy are falling short, so it's a double win. You can scale your audit process with dedicated site auditing tools.
7. Check For Low-Quality Or Duplicate Content
If Google doesn't view your content as valuable to searchers, it may decide it's not worthy of indexing. This thin content, as it's known, could be poorly written content (e.g., filled with grammar and spelling errors), boilerplate content that's not unique to your site, or content with no external signals about its value and authority.
To find this, determine which pages on your site are not being indexed, and then review the target queries for them. Are they providing high-quality answers to the questions of searchers? If not, replace or refresh them.
Duplicate content is another reason bots can get hung up while crawling your site. Essentially, your coding structure has confused them, and they don't know which version to index. This can be caused by things like session IDs, redundant content elements, and pagination issues.
Sometimes, this will trigger an alert in Google Search Console, telling you Google is encountering more URLs than it thinks it should. If you haven’t received one, check your crawl results for things like duplicate or missing tags, or URLs with extra characters that could be creating extra work for bots.
Correct these issues by fixing tags, removing pages or adjusting Google’s access.
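For exact duplicates, a rough programmatic check is to hash the response body of each URL and group URLs that share a hash. Here is a minimal sketch with placeholder URLs; near-duplicates need a fuzzier comparison than this:

```python
import hashlib
import urllib.request
from collections import defaultdict

urls = [
    "https://www.example.com/product/widget/",
    "https://www.example.com/product/widget/?sessionid=123",
]

pages_by_hash = defaultdict(list)
for url in urls:
    with urllib.request.urlopen(url) as response:
        body = response.read()
    pages_by_hash[hashlib.sha256(body).hexdigest()].append(url)

for digest, group in pages_by_hash.items():
    if len(group) > 1:
        print("Identical content served at:", ", ".join(group))
```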
8. Eliminate Redirect Chains And Internal Redirects
As websites evolve, redirects are a natural byproduct, directing visitors from one page to a newer or more relevant one. But while they’re common on most sites, if you’re mishandling them, you could be inadvertently sabotaging your own indexing.
There are several mistakes you can make when creating redirects, but one of the most common is redirect chains. These occur when there’s more than one redirect between the link clicked on and the destination. Google doesn’t look on this as a positive signal.
In more extreme cases, you may initiate a redirect loop, in which a page redirects to another page, which redirects to another page, and so on, until it eventually links back to the very first page. In other words, you've created a never-ending loop that goes nowhere.
Check your site’s redirects using Screaming Frog, Redirect-Checker.org or a similar tool.
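If you want to script that check, the third-party requests library records every hop it followed in response.history, which makes chains easy to spot, and it raises an exception when it hits a loop. Here is a minimal sketch with a placeholder URL:

```python
import requests  # third-party: pip install requests

def report_redirects(url: str) -> None:
    """Print the full redirect chain (if any) that a URL resolves through."""
    try:
        response = requests.get(url, allow_redirects=True, timeout=10)
    except requests.exceptions.TooManyRedirects:
        print(f"Redirect loop detected for {url}")
        return
    chain = [f"{r.status_code} {r.url}" for r in response.history]
    if len(chain) > 1:
        print(f"Redirect chain ({len(chain)} hops) for {url}:")
        for hop in chain + [f"{response.status_code} {response.url}"]:
            print("  ->", hop)
    elif chain:
        print(f"Single redirect: {url} -> {response.url}")
    else:
        print(f"No redirect: {url}")

report_redirects("https://www.example.com/old-page/")
```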
9. Fix Broken Links
In a similar vein, broken links can wreak havoc on your site’s crawlability. You should regularly be checking your site to ensure you don’t have broken links, as this will not only hurt your SEO results, but will frustrate human users.
There are a number of ways you can find broken links on your site, including manually evaluating each and every link on your site (header, footer, navigation, in-text, etc.), or you can use Google Search Console, Analytics or Screaming Frog to find 404 errors.
Once you’ve found broken links, you have three options for fixing them: redirecting them (see the section above for caveats), updating them or removing them.
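To check links in bulk, for example from a crawl export, a short script can flag anything that returns an error status. Here is a minimal sketch, again using the third-party requests library, with placeholder URLs:

```python
import requests  # third-party: pip install requests

links_to_check = [
    "https://www.example.com/pricing/",
    "https://www.example.com/old-landing-page/",
]

for url in links_to_check:
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException as exc:
        print(f"ERROR {url}: {exc}")
        continue
    if status >= 400:
        print(f"{status} {url}  <- fix, update, or remove this link")
```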
10. IndexNow
IndexNow is a relatively new protocol that lets you submit URLs to multiple participating search engines simultaneously via an API. It works like a super-charged version of submitting an XML sitemap by alerting search engines about new URLs and changes to your website.
Basically, it provides crawlers with a roadmap to your site upfront. They enter your site with the information they need, so there's no need to constantly recheck the sitemap. And unlike XML sitemaps, it allows you to inform search engines about non-200 status code pages.
Implementing it is easy, and only requires you to generate an API key, host it in your directory or another location, and submit your URLs in the recommended format.
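In practice, that submission is a single JSON POST to the IndexNow endpoint. Here is a minimal sketch, assuming you've already generated a key and hosted the matching key file at the keyLocation shown; the host, key, and URLs are placeholders:

```python
import json
import urllib.request

payload = {
    "host": "www.example.com",
    "key": "your-indexnow-key",
    "keyLocation": "https://www.example.com/your-indexnow-key.txt",
    "urlList": [
        "https://www.example.com/new-post/",
        "https://www.example.com/updated-page/",
    ],
}

request = urllib.request.Request(
    "https://api.indexnow.org/indexnow",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json; charset=utf-8"},
    method="POST",
)
with urllib.request.urlopen(request) as response:
    print("IndexNow response status:", response.status)
```

Per the protocol, a 200 or 202 response means the submission was accepted; it doesn't guarantee the URLs will be crawled or indexed.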
Wrapping Up
By now, you should have a good understanding of your website’s indexability and crawlability. You should also understand just how important these two factors are to your search rankings.
If Google's spiders can't crawl and index your site, it doesn't matter how many keywords, backlinks, and tags you use: you won't appear in search results.
And that’s why it’s essential to regularly check your site for anything that could be waylaying, misleading, or misdirecting bots.
So, get yourself a good set of tools and get started. Be diligent and mindful of the details, and you'll soon have Google's spiders swarming all over your site.
SEO
TikTok Updated Community Guidelines To Include AI Content

TikTok has updated its Community Guidelines, which will go into effect on April 21, 2023.
The updated guidelines introduce TikTok’s Community Principles, which guide content moderation to uphold human rights and international legal frameworks.
TikTok worked with over 100 organizations globally to strengthen its rules to address new threats and reduce potential user harm.
Key changes to Community Guidelines apply to synthetic media, tribes, and civic and election integrity.
AI-Generated Content
TikTok defines “synthetic media” as content created or modified by AI. While AI and related technologies allow creators to express themselves in many new ways, they can also blur the line between fact and fiction for viewers.
Creators must label synthetic or altered media as such to mitigate the potential risks of spreading misinformation.
To reduce potential harm, synthetic media featuring real private individuals is prohibited. Private individuals include anyone under 18 and adults who are not public figures. The use of public figures over 18 – government officials, politicians, business leaders, and celebrities – is permitted, but with restrictions.
Creators must not use synthetic media to violate policies against hate speech, sexual exploitation, and severe harassment. They must also clearly disclose synthetic media and manipulated content that depict realistic scenes with fake people, places, or events.
Public figures cannot be used in synthetic audio or video for political or commercial endorsements to mislead users about financial or political issues.
You can, however, use synthetic media in artistic and educational content.
Protection Of Tribes
TikTok policies already include rules meant to protect people and groups with specific attributes from hateful behavior, hate speech, and hateful ideologies.
With the new guidelines, the platform added tribe to the list of protected attributes, which already includes ethnicity, gender, race, religion, and sexual orientation.
While TikTok allows critical content about public figures, as defined above, it prohibits language that harasses, humiliates, threatens, or doxxes anyone.
Users can consult resources and tools provided by TikTok to identify bullying behavior and configure their settings to prevent it from affecting them further.
Civic And Election Integrity
Noting that elections are essential to community dialogue and upholding societal values, TikTok emphasized its efforts to encourage topical discussions while maintaining unity.
To achieve this goal, paid political promotion, advertising, and fundraising by politicians or parties are prohibited. This policy applies to traditional ads and compensated creator content.
TikTok claims to support informed civic idea exchanges to promote constructive conversations without allowing misinformation about voting processes and election outcomes. Content that includes unverified claims about election results will not be eligible to appear in the For You Feed.
Before these changes go into effect next month, moderators will receive additional training on enforcing them effectively.
Will Recent Changes Prevent More TikTok Bans?
TikTok's refreshed Community Guidelines and explanation of Community Principles appear to be an attempt to increase transparency and foster a safe, inclusive, and authentic environment for all users.
TikTok plans to continue investing in safety measures to encourage creativity and connection within its global community of one billion users.
TikTok’s latest changes to improve transparency, reduce harm, and provide higher-quality content for users may be part of efforts to prevent the app from being banned in the U.S.
This week, the House Energy and Commerce Committee will hold a full committee hearing with TikTok CEO Shou Chew on how Congress can protect the data privacy of U.S. users and protect children from online harm.
Organizations like the Tech Oversight Project have also expressed concerns about risks that big tech companies like Amazon, Apple, Google, and Meta pose.
SEO
Google Launches Bard AI Chatbot To Compete With ChatGPT

Google has unveiled Bard, an AI chatbot designed to compete with OpenAI's ChatGPT and Microsoft's chatbot in its Bing search engine.
In a blog post, Google describes Bard as an early AI experiment to enhance productivity, accelerate ideas, and foster curiosity.
You can use Bard to get tips, explanations, or creative assistance with tasks such as outlining blog posts.
With Bard, Google aims to solidify its presence in the AI chatbot space while maintaining its dominance in the search engine market.
Bard's Technical Details
Bard is powered by a research large language model (LLM), a lightweight and optimized version of LaMDA.
It will be updated with more advanced models over time. As more people use LLMs, they become better at predicting helpful responses.
Bard is designed as a complementary experience to Google Search, allowing users to check its responses or explore sources across the web.
Operating as a standalone webpage, Bard consists of a single question box and is not integrated into Google's search engine.
This strategic move allows Google to adopt new AI technology while preserving the profitability of its search engine business.
Cautious Rollout Amid Unpredictability Concerns
Google's cautious approach to Bard's release is in response to concerns over unpredictable and sometimes unreliable chatbot technology, as demonstrated by competitors.
Google recognizes LLMs can sometimes produce biased, misleading, or false information.
To mitigate these issues, Google allows you to choose from a few drafts of Bard's response.
You can continue collaborating with Bard by asking follow-up questions or requesting alternative answers.

Google’s Race to Ship AI Products
Since OpenAI’s release of ChatGPT and Microsoft’s introduction of chatbot technology in Bing, Google has prioritized AI as its central focus.
The company’s internal teams, including AI safety researchers, are working collaboratively to accelerate approval for a range of new AI products.
Google's work on Bard is guided by its AI Principles, focusing on quality and safety.
The company uses human feedback and evaluation to enhance its systems. It has implemented guardrails, such as capping the number of exchanges in a dialogue, to keep interactions helpful and on-topic.

In Development Since 2015
Google has been developing the technology behind Bard since 2015.
However, similar to the technology behind OpenAI's and Microsoft's chatbots, it was not released to a broader audience sooner due to concerns about generating untrustworthy information and potential biases against certain groups.
Google acknowledges these issues and aims to bring Bard to market responsibly.
Bard Availability
You can sign up to try Bard at bard.google.com.
Access is initially rolling out in the US and UK, with plans to expand to more countries and languages over time. It’s possible to get around the limited rollout with a VPN.
Google requires users to have a Gmail address to sign up and doesn’t accept Google Workspace email accounts.
Sources: Google, The New York Times
SEO
Will AI Kill SEO? We Asked ChatGPT

It happens every couple of years.
First, it was Jason Calacanis and Mahalo, then the early social platforms.
We saw it again with voice search and smart assistants. For a minute, it was TikTok’s turn. Then the metaverse jumped the line.
Now, it’s ChatGPT and AI.
I’m talking, of course, about “SEO killers.”
Every now and then, a new technology comes along, and three things inevitably happen:
- Thousands of SEO professionals publish posts and case studies declaring themselves experts in the new thing.
- Every publication dusts off its “SEO is dead” article, changes the date, and does a find and replace for the new technology.
- SEO continues to be stronger than ever.
Rinse, repeat.
It would seem that search has more lives than a cartoon cat, but the simple truth is: Search is immortal.
How we search, what devices we use, and whether the answer is a link to a website will forever be up for debate.
But as long as users have tasks to complete, they’ll turn somewhere for help, and digital marketers will influence the process.
Will AI Replace Search?
There’s a ton of hype right now about AI replacing both search engines and search professionals – I don’t see that happening. I view ChatGPT as just another tool.
Much like a knife: You can butter bread or cut yourself. It’s all in how you use it.
Will AI replace search engines? Let’s ask it ourselves!
That’s a pretty good answer.
Many SEO professionals (including me) have been saying for years that the days of tricking the algorithm are long gone.
SEO has been slowly morphing into digital marketing for a long time now. It’s no longer possible to do SEO without considering user intent, personas, use cases, competitive research, market conditions, etc.
Ok, but won’t AI just do that for us? Is AI going to take my job? Here’s a crazy idea: Let’s ask ChatGPT!

AI Isn’t Going To Take Your Job. But An SEO Who Knows How To Use AI To Be More Efficient Just Might
Why? Let’s dive in.
I still see a lot of SEO pros writing articles that ask AI to do things it's simply incapable of, and this comes from a lack of basic understanding of how large language models actually work.
AI tools, like ChatGPT, aren’t pulling any information from a database of facts. They don’t have an index or a knowledge graph.
They don’t “store” information the way a search engine does. They’re simply predicting what words or sentences will come next based on the material they’ve been trained on. They don’t store this training material, though.
They’re using word vectors to determine what words are most likely to come next. That’s why they can be so good and also hallucinate.
AI can’t crawl the internet. It has no knowledge of current events and can’t cite sources because it doesn’t know or retain that information. Sure, you can ask it to cite sources, but it’s really just making stuff up.
For really popular topics that were discussed a lot, it can get pretty close – because the probabilities of those words coming next are really high – but the more specific you get, the more it will hallucinate.
Given the extreme amount of time and resources it takes to train the model, it will be a long time before AI can answer any queries about current events.
But What About Bing, You.com, And Google’s Upcoming Bard? They Can Do All Of This, Can’t They?
Yes and no. They can cite sources, but that's based on how they're implementing it. To vastly oversimplify: Bing isn't just handing your question to a pure chatbot.
Bing is searching for your query/keyword. It’s then feeding in all the webpages that it would normally return for that search and asking the AI to summarize those webpages.
You and I can’t do that on the public-facing AI tools without hitting token limits, but search engines can!
Ok, Surely This Will Kill SEO. AI Will Just Answer Every Question, Right?
I disagree.
All the way back in 2009 (when we were listening to the Black Eyed Peas on our iPhone 3Gs and updating our MySpace top 8 on Windows Vista), a search engine once called Live was being renamed to Bing.
Why? Because Bing is a verb. This prompted Bill Gates to declare, “The future of search is verbs.”
I love to share this quote with clients every chance I get because that future is now.
Gates wasn’t talking about people typing action words into search engines. He meant that people are trying to “do” something, and the job of search is to help facilitate that.
People often forget that search is a form of pull marketing, where users tell us what they want – not push marketing like a billboard or a TV ad.
As digital marketers, our job is simple: Give users what they want.
This is where the confusion comes in, though.
For many queries that have simple answers, a link to a website with a popup cookie policy, a notification alert, a newsletter sign-up popup, and ads was never what the user wanted.
It’s just the best thing we had back then. Search engines never set out with the end goal of providing links to websites. They set out to answer questions and help users accomplish tasks.
Even from the earliest days, Google talked about how its goal was to be the Star Trek computer; it just didn’t have the technology to do it then. Now, it does.
For many of these queries, like [how old is Taylor Swift?] or [how many megabytes in a gigabyte?], websites will lose traffic – but it’s traffic they were probably never entitled to.
Who owns that answer anyway? These are questions with simple answers. The user’s task is simply to get a number. They don’t want a website.
Smart SEO pros will focus on the type of queries where a user wants to do something – like buy Taylor Swift tickets, get reviews of her album or concerts, chat with other Swifties, etc. That’s where AI won’t be able to kill SEO or search.
What ChatGPT Can Do Vs. What It Can’t
ChatGPT can accomplish a lot of things.
It’s good at showing me how to write an Excel formula or MySQL query, but it will never teach me MySQL, sell me a course, or let me talk with other developers about database theory.
Those are things a search engine can help me do.
ChatGPT can also help answer many “common knowledge” questions, as long as the topic isn’t contested and is old and popular enough to have shown up in the training data.
Even then, it’s still not 100% accurate – as we’ve seen in countless memes and with one famous bank being called out for its AI-written article not knowing how to calculate interest properly.
AI might list the most talked about bars in NYC, but it can’t recommend the best place to get an Old Fashioned like a human can.
Honestly, all SEO pros talking about using AI to create content are starting to bore me. Answering questions is neat, but where ChatGPT really excels is in text manipulation.
At my agency, we’re already using ChatGPT’s API as an SEO tool to help create content briefs, categorize and cluster keywords, write complicated regular expressions for redirects, and even generate XML or JSON-LD code based on given inputs.
These rely on tons of inputs from various sources and require lots of manual reviews.
We’re not using it to create content, though. We’re using it to summarize and examine other pieces of content and then use those to glean insights. It’s less of an SEO replacement and more of a time saver.
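To give a sense of what that kind of text-manipulation call looks like, here is a minimal sketch that sends a keyword list to OpenAI's chat completions endpoint and asks for topical clusters. This is purely illustrative, not the workflow described above; the model name, prompt, and environment variable are assumptions, and any output still needs the manual review mentioned earlier:

```python
import json
import os
import urllib.request

keywords = ["crawl budget", "robots.txt disallow", "xml sitemap", "core web vitals", "page speed"]

payload = {
    "model": "gpt-3.5-turbo",  # assumed model; swap in whichever model you use
    "messages": [
        {"role": "system", "content": "You are an SEO assistant. Reply with JSON only."},
        {
            "role": "user",
            "content": "Group these keywords into topical clusters and name each cluster: "
            + ", ".join(keywords),
        },
    ],
}

request = urllib.request.Request(
    "https://api.openai.com/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",  # assumed env var
    },
    method="POST",
)
with urllib.request.urlopen(request) as response:
    reply = json.load(response)
print(reply["choices"][0]["message"]["content"])
```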
SEO Is Here To Stay
What if your business is built around displaying facts you don’t really “own”? If so, you should probably be worried – not just about AI.
Boilerplate copy tasks may be handled by AI. Recent tests I’ve done on personal sites have shown some success here.
But AI will never be capable of coming up with insights or creating new ideas, staying on top of the latest trends, or providing the experience, expertise, authority, or trust that a real author can.
Remember: It’s not thinking, citing, or even pulling data from a database. It’s just looking at the next-word probabilities.
Unlike thousands of SEO pros who recently updated their Twitter bios, I may not be an expert on AI, but I have a computer science degree. I also know what it takes to understand user needs.
So far, no data shows people would prefer auto-generated, re-worded content over unique curated content written by a real human being.
People want fresh ideas and insights that only people can provide. (If we add an I to E-E-A-T, where should it go?)
If your business or content delivers value through insights, curation, current trends, recommendations, solving problems, or performing an action, then SEO and search engines aren’t going anywhere.
They may change shape from time to time, but that just means job security for me – and I’m good with that.