64.2% Of Sites Use WordPress

WordPress continues to dominate the content management systems (CMS) market and is currently used by 64.2% of websites that have a CMS, according to data from W3Techs.com.
Shopify is a distant second for June 2022 and accounts for 6.3% of the CMS market.
Wix, Squarespace, and Joomla round out the top five with less than 3.5% market share each.
W3Techs notes that 33.1% of websites do not use any of the content management systems they monitor.
WordPress is therefore used by roughly 43% of all websites (64.2% of the 66.9% that do use an identifiable CMS).
WordPress Plans To Continue Working On Security, Stability
WordPress shows no signs of slowing down and is currently about five years into a ten-year project that involves rewriting its entire codebase.
In a recent interview, Josepha Haden Chomphosy, Executive Director of WordPress, told SEJ,
“…the next year, as with all of the years in a project like that, is making sure we are still as stable and capable as a CMS as people have come to expect while also still pushing forward with a newer more modern way to manage your content online.”
WordPress rolled out version 6.0, “Arturo,” this month, and within two weeks 36.2% of WordPress sites had updated to it.
Roger Montti also reported that WordPress shared a proposal for a plugin checker that would improve security and site performance by proactively vetting plugins.
Shopify Enters B2B Marketplace With June Update
Shopify released its Summer ’22 Edition in June, adding more than 100 new features for users.
A new feature simply and aptly called “B2B” will connect Shopify Plus merchants with wholesalers and offer integrations with NetSuite, Brightpearl, Acumatica, and others for a more seamless experience.
See Brian Frederick’s coverage here to learn more.
Wix Publishes Structured Data Guide For SEO Pros
Wix, in third place for CMS market share this month, released “Wix Structured Data Guide: How To Use Standard & Custom Markup” in June.
Contributing author Mordy Oberstein, Head of SEO Branding at Wix, shared his comprehensive guide to implementing structured data on Wix here at Search Engine Journal.
“In fewer than three years, Wix went from supporting little by way of structured data to offering SEO pros and site owners the ability to do nearly whatever they want with relative ease,” Oberstein wrote.
He also noted that due to recent platform updates, any content elsewhere on the internet around this topic is now out of date.
See his guide above to learn more about applying structured data to your Wix site.
Stay tuned for next month’s CMS Market Share Monthly report.
Featured image: Paulo Bobita/Search Engine Journal
Data source: W3Techs.com, Usage statistics of content management systems, as of June 27, 2022.
9 Common Technical SEO Issues That Actually Matter


In this article, we’ll see how to find and fix technical SEO issues, but only those that can seriously affect your rankings.
If you’d like to follow along, get Ahrefs Webmaster Tools and Google Search Console (both are free) and check for the following issues.
Indexability is a webpage’s ability to be indexed by search engines. Pages that are not indexable can’t be displayed on the search engine results pages and can’t bring in any search traffic.
Three requirements must be met for a page to be indexable:
- The page must be crawlable. If you haven’t blocked Googlebot from crawling the page in robots.txt, or you have a website with fewer than 1,000 pages, you probably don’t have an issue here.
- The page must not have a noindex tag (more on that in a bit).
- The page must be canonical (i.e., the main version).
Solution
In Ahrefs Webmaster Tools (AWT):
- Open Site Audit
- Go to the Indexability report
- Click on issues related to canonicalization and “noindex” to see affected pages


For canonicalization issues in this report, you will need to replace bad URLs in the link rel="canonical" tag with valid ones (i.e., returning an “HTTP 200 OK”).
As for pages marked by “noindex” issues, these are the pages with the “noindex” meta tag placed inside their code. Chances are, most of the pages flagged there should stay as they are. But if you see any pages that shouldn’t be there, simply remove the tag. Do make sure those pages aren’t blocked by robots.txt first.
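If you want to double-check an individual page outside AWT, here’s a minimal Python sketch. It assumes the third-party requests library, uses a naive regex that expects typical attribute order, and the URL is a placeholder:

```python
import re
import requests

def spot_check(url: str) -> None:
    """Flag a noindex directive and a canonical target that doesn't return HTTP 200."""
    html = requests.get(url, timeout=10).text

    # <meta name="robots" content="...noindex..."> (naive regex; assumes name comes before content)
    robots = re.search(r'<meta[^>]*name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']', html, re.I)
    if robots and "noindex" in robots.group(1).lower():
        print(f"{url}: page carries a noindex directive")

    # <link rel="canonical" href="..."> -- the href should return an HTTP 200
    canonical = re.search(r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']', html, re.I)
    if canonical:
        status = requests.get(canonical.group(1), timeout=10).status_code
        if status != 200:
            print(f"{url}: canonical target {canonical.group(1)} returns HTTP {status}")

spot_check("https://example.com/some-page/")  # placeholder URL
```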
Recommendation
A sitemap should contain only pages that you want search engines to index.
When a sitemap isn’t regularly updated, or an unreliable generator was used to create it, it may start to include broken pages, pages that have become “noindexed,” pages that were de-canonicalized, or pages blocked in robots.txt.
Solution
In AWT:
- Open Site Audit
- Go to the All issues report
- Click on issues containing the word “sitemap” to find affected pages
Depending on the issue, you will have to:
- Delete the pages from the sitemap.
- Remove the noindex tag on the pages (if you want to keep them in the sitemap).
- Provide a valid URL for the reported page.
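If you’d like to double-check the sitemap itself, here’s a minimal Python sketch. It assumes a plain (non-index) sitemap, uses the third-party requests library, and the sitemap URL is a placeholder:

```python
import re
import requests

SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder; assumes a plain, non-index sitemap

xml = requests.get(SITEMAP_URL, timeout=10).text
for url in re.findall(r"<loc>(.*?)</loc>", xml):
    resp = requests.get(url, timeout=10)
    if resp.history:                       # the listed URL redirects somewhere else
        print(f"{url} redirects to {resp.url}")
    if resp.status_code != 200:            # broken page listed in the sitemap
        print(f"{url} returns HTTP {resp.status_code}")
    elif re.search(r'<meta[^>]*content=["\'][^"\']*noindex', resp.text, re.I):
        print(f"{url} appears to be noindexed but is still listed in the sitemap")
```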
Google uses HTTPS encryption as a small ranking signal. This means you can experience lower rankings if you don’t have an SSL or TLS certificate securing your website.
But even if you do, some pages and/or resources on your pages may still use the HTTP protocol.
Solution
Assuming you already have an SSL/TLS certificate for all subdomains (if not, do get one), open AWT and do these:
- Open Site Audit
- Go to the Internal pages report
- Look at the protocol distribution graph and click on HTTP to see affected pages
- Inside the report showing pages, add a column for Final redirect URL
- Make sure all HTTP pages are permanently redirected (301 or 308 redirects) to their HTTPS counterparts, as in the spot-check sketch below
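Here’s a minimal Python sketch of that spot-check, using the third-party requests library; the URLs are placeholders to swap for pages from the report:

```python
import requests

# Placeholder URLs -- swap in HTTP pages reported by Site Audit.
HTTP_URLS = [
    "http://example.com/",
    "http://example.com/blog/",
]

for url in HTTP_URLS:
    resp = requests.get(url, timeout=10, allow_redirects=True)
    if not resp.history:
        print(f"{url} does not redirect at all")
        continue
    first_hop = resp.history[0].status_code          # status code of the first redirect
    ends_on_https = resp.url.startswith("https://")  # final URL after following all redirects
    ok = first_hop in (301, 308) and ends_on_https
    verdict = "OK" if ok else "needs a permanent redirect to HTTPS"
    print(f"{url} -> {resp.url} (first hop {first_hop}): {verdict}")
```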
Finally, let’s check if any resources on the site still use HTTP:
- Inside the Internal pages report, click on Issues
- Click on HTTPS/HTTP mixed content to view affected resources
You can fix this issue by one of these methods:
- Link to the HTTPS version of the resource (check this option first)
- Include the resource from a different host, if available
- Download and host the content on your site directly if you are legally allowed to do so
- Exclude the resource from your site altogether
Learn more: What Is HTTPS? Everything You Need to Know
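If you’d rather spot-check a single page by hand, here’s a minimal Python sketch that scans its HTML for resources still loaded over plain HTTP. The URL is a placeholder, and the regex is naive (it assumes src/href attributes written in the usual way):

```python
import re
import requests

PAGE = "https://example.com/"  # placeholder: any HTTPS page flagged in the report

html = requests.get(PAGE, timeout=10).text

# Resource tags whose src/href still points at plain HTTP (mixed content).
pattern = r'<(?:img|script|link|iframe|source|video|audio)[^>]*(?:src|href)=["\'](http://[^"\']+)["\']'
for resource in re.findall(pattern, html, re.I):
    print(f"Mixed content on {PAGE}: {resource}")
```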
Duplicate content happens when exact or near-duplicate content appears on the web in more than one place.
It’s bad for SEO mainly for two reasons: It can cause undesirable URLs to show in search results and can dilute link equity.
Content duplication isn’t always a case of intentionally or unintentionally creating similar pages. There are other, less obvious causes, such as faceted navigation, tracking parameters in URLs, or trailing and non-trailing slashes.
Solution
First, check whether your website is available under only one URL. If your site is accessible as:
- http://domain.com
- http://www.domain.com
- https://domain.com
- https://www.domain.com
Then Google will see all of those URLs as different websites.
The easiest way to check if users can browse only one version of your website: type in all four variations in the browser, one by one, hit enter, and see if they get redirected to the master version (ideally, the one with HTTPS).
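The same check can be scripted. Here’s a minimal Python sketch (using the third-party requests library; swap the placeholder domain for your own) that requests all four variations and reports where each one ends up:

```python
import requests

DOMAIN = "domain.com"  # placeholder: swap in your own domain

variants = [
    f"http://{DOMAIN}",
    f"http://www.{DOMAIN}",
    f"https://{DOMAIN}",
    f"https://www.{DOMAIN}",
]

# All four variants should end up on the same final URL (ideally the HTTPS one).
for variant in variants:
    final_url = requests.get(variant, timeout=10, allow_redirects=True).url
    print(f"{variant} -> {final_url}")
```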
You can also go straight into Site Audit’s Duplicates report. If you see 100% bad duplicates, that is likely the reason.
In this case, choose one version that will serve as canonical (likely the one with HTTPS) and permanently redirect other versions to it.
Then run a New crawl in Site Audit to see if there are any other bad duplicates left.
There are a few ways you can handle bad duplicates depending on the case. Learn how to solve them in our guide.
Learn more: Duplicate Content: Why It Happens and How to Fix It
Pages that can’t be found (4XX errors) and pages returning server errors (5XX errors) won’t be indexed by Google so they won’t bring you any traffic.
Furthermore, if broken pages have backlinks pointing to them, all of that link equity goes to waste.
Broken pages are also a waste of crawl budget—something to watch out for on bigger websites.
Solution
In AWT, you should:
- Open Site Audit.
- Go to the Internal pages report.
- See if there are any broken pages. If so, the Broken section will show a number higher than 0. Click on the number to show affected pages.
In the report showing pages with issues, it’s a good idea to add a column for the number of referring domains. This will help you make the decision on how to fix the issue.
Now, fixing broken pages (4XX error codes) is quite simple, but there is more than one possibility. The right fix depends on whether the page should still exist and whether it has backlinks pointing to it, which is why the referring domains column comes in handy.
Dealing with server errors (the ones returning a 5XX code) can be tougher, as there are many possible reasons for a server to be unresponsive. Read this short guide for troubleshooting.
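As a starting point for triage, here’s a minimal Python sketch (placeholder URLs, third-party requests library) that separates hard 4XX errors from 5XX responses and retries the latter once, since server errors are sometimes transient:

```python
import time
import requests

# Placeholder URLs -- swap in broken pages from the Site Audit report.
BROKEN_URLS = [
    "https://example.com/old-page/",
    "https://example.com/flaky-page/",
]

for url in BROKEN_URLS:
    status = requests.get(url, timeout=10).status_code
    if 400 <= status < 500:
        print(f"{url}: HTTP {status} -- restore the page, redirect it, or remove links to it")
    elif status >= 500:
        time.sleep(5)  # wait briefly and retry: 5XX errors are sometimes transient
        retry = requests.get(url, timeout=10).status_code
        print(f"{url}: HTTP {status} (retry returned {retry}) -- investigate the server if it persists")
    else:
        print(f"{url}: HTTP {status} -- looks fine now")
```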
Recommendation
- Go to Site Explorer
- Enter your domain
- Go to the Best by links report
- Add a “404 not found” filter
- Then sort the report by referring domains from high to low
If you’ve already dealt with broken pages, chances are you’ve fixed most of the broken links issues.
Other critical issues related to links are:
- Orphan pages – These are pages without any internal links. Web crawlers have limited ability to access them (only from the sitemap or backlinks), and there is no link equity flowing to them from other pages on your site. Last but not least, users won’t be able to reach these pages from the site navigation.
- HTTPS pages linking to internal HTTP pages – If an internal link on your website brings users to an HTTP URL, web browsers will likely show a warning about a non-secure page. This can damage your overall website authority and user experience.
Solution
In AWT, you can:
- Go to Site Audit.
- Open the Links report.
- Open the Issues tab.
- Look for the following issues in the Indexable category. Click to see affected pages.
Fix the first issue by changing the links from HTTP to HTTPS or simply deleting those links if they’re no longer needed.
For the second issue, an orphan page needs to be either linked to from some other page on your website or deleted if it holds no value to you.
Sidenote.
Ahrefs’ Site Audit can find orphan pages as long as they have backlinks or are included in the sitemap. For a more thorough search for this issue, you will need to analyze server logs to find orphan pages with hits. Find out how in this guide.
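As a rough illustration of that log-based approach, here’s a minimal Python sketch. It assumes a common/combined-format access log at a placeholder path and a placeholder set of internally linked paths (in practice, exported from a crawl), and prints paths that received hits but have no internal links pointing to them:

```python
import re

LOG_FILE = "access.log"               # placeholder: a common/combined-format server log
INTERNALLY_LINKED = {                 # placeholder: paths your crawler found internal links to
    "/", "/blog/", "/about/",
}

hit_paths = set()
with open(LOG_FILE, encoding="utf-8", errors="ignore") as log:
    for line in log:
        # e.g. ... "GET /some-page/ HTTP/1.1" 200 ...
        match = re.search(r'"(?:GET|HEAD) (\S+) HTTP', line)
        if match:
            hit_paths.add(match.group(1).split("?")[0])  # drop query/tracking parameters

for path in sorted(hit_paths - INTERNALLY_LINKED):
    print(f"Possible orphan page getting hits: {path}")
```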
Having a mobile-friendly website is a must for SEO. Two reasons:
- Google uses mobile-first indexing – It’s mostly using the content of mobile pages for indexing and ranking.
- Mobile experience is part of the Page Experience signals – While Google will allegedly always “promote” the page with the best content, page experience can be a tiebreaker for pages offering content of similar quality.
Solution
In GSC:
- Go to the Mobile Usability report in the Experience section
- View affected pages by clicking on issues in the Why pages aren’t usable on mobile section
You can read Google’s guide for fixing mobile issues here.
Performance and visual stability are other aspects of Page Experience signals used by Google to rank pages.
Google has developed a special set of metrics to measure user experience called Core Web Vitals (CWV). Site owners and SEOs can use those metrics to see how Google perceives their website in terms of UX.
While page experience can be a ranking tiebreaker, CWV is not a race. You don’t need to have the fastest website on the internet. You just need to score “good” ideally in all three categories: loading, interactivity, and visual stability.
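To make “good” concrete, here’s a small Python sketch that classifies example measurements against the commonly documented “good” thresholds (roughly: LCP within 2.5 seconds, FID within 100 milliseconds, and CLS of 0.1 or less); the measurements themselves are placeholders:

```python
# Commonly documented "good" thresholds for the three Core Web Vitals.
GOOD_THRESHOLDS = {
    "LCP": 2.5,   # Largest Contentful Paint, seconds (loading)
    "FID": 100,   # First Input Delay, milliseconds (interactivity)
    "CLS": 0.1,   # Cumulative Layout Shift, unitless (visual stability)
}

# Placeholder field measurements for one page.
measurements = {"LCP": 2.1, "FID": 80, "CLS": 0.24}

for metric, value in measurements.items():
    verdict = "good" if value <= GOOD_THRESHOLDS[metric] else "needs work"
    print(f"{metric}: {value} -> {verdict}")
```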
Solution
In GSC:
- First, click on Core Web Vitals in the Experience section of the reports.
- Then click Open report in each section to see how your website scores.
- For pages that aren’t considered good, you’ll see a special section at the bottom of the report. Use it to see pages that need your attention.
Optimizing for CWV may take some time. This may include things like moving to a faster (or closer) server, compressing images, optimizing CSS, etc. We explain how to do this in the third part of this guide to CWV.
Bad website structure in the context of technical SEO is mainly about having important organic pages too deep into the website structure.
Pages that are nested too deep (i.e., users need more than six clicks from the homepage to get to them) will receive less link equity from your homepage (likely the page with the most backlinks), which may affect their rankings. This is because link value diminishes with every link “hop.”
Sidenote.
Website structure is important for other reasons too such as the overall user experience, crawl efficiency, and helping Google understand the context of your pages. Here, we’ll only focus on the technical aspect, but you can read more about the topic in our full guide: Website Structure: How to Build Your SEO Foundation.
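To make “click depth” concrete, here’s a minimal Python sketch that computes how many clicks each page sits from the homepage using a breadth-first search over a toy internal-link graph (the graph is a placeholder; in practice it would come from a crawl):

```python
from collections import deque

# Placeholder internal-link graph: page -> pages it links to (built from a crawl in practice).
links = {
    "/": ["/blog/", "/services/"],
    "/blog/": ["/blog/post-a/", "/blog/post-b/"],
    "/blog/post-a/": ["/blog/post-c/"],
    "/services/": [],
}

# Breadth-first search from the homepage gives each page's click depth.
depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for target in links.get(page, []):
        if target not in depth:
            depth[target] = depth[page] + 1
            queue.append(target)

for page, clicks in sorted(depth.items(), key=lambda item: item[1]):
    flag = "  <- too deep" if clicks > 6 else ""
    print(f"{clicks} clicks from the homepage: {page}{flag}")
```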
Solution
In AWT:
- Open Site Audit
- Go to Structure explorer, switch to the Depth tab, and set the data type to Data table
- Configure the Segment to only valid HTML pages and click Apply
- Use the graph to investigate pages that are more than six clicks away from the homepage
The way to fix the issue is to link to these deeper nested pages from pages closer to the homepage. More important pages could find a place in the site navigation, while less important ones can simply be linked from pages a few clicks closer to the homepage.
It’s a good idea to weigh user experience and the business role of your website when deciding what goes into sitewide navigation.
For example, we could probably give our SEO glossary a slightly higher chance to get ahead of organic competitors by including it in the main site navigation. Yet we decided not to because it isn’t such an important page for users who are not particularly searching for this type of information.
We’ve moved the glossary only up a notch by including a link inside the beginner’s guide to SEO (which itself is just one click away from the homepage).
Final thoughts
When you’re done fixing the more pressing issues, dig a little deeper to keep your site in perfect SEO health. Open Site Audit and go to the All issues report to see other issues regarding on-page SEO, image optimization, redirects, localization, and more. In each case, you will find instructions on how to deal with the issue.
You can also customize this report by turning issues on/off or changing their priority.
Did I miss any important technical issues? Let me know on Twitter or Mastodon.
New Google Ads Feature: Account-Level Negative Keywords


Google Ads Liaison Ginny Marvin has announced that account-level negative keywords are now available to Google Ads advertisers worldwide.
The feature, which was first announced last year and has been in testing for several months, allows advertisers to add keywords to exclude traffic from all search and shopping campaigns, as well as the search and shopping portion of Performance Max, for greater brand safety and suitability.
1/3 Some have noticed Account level negative keywords are starting to roll out globally. From Account Settings, you can add keywords to exclude traffic from all Search and Shopping campaigns, and the Search and Shopping portion of PMax for brand safety: https://t.co/B0VBApPVCm
— AdsLiaison (@adsliaison) January 27, 2023
Advertisers can access this feature from the account settings page to ensure their campaigns align with their brand values and target audience.
This is especially important for brands that want to avoid appearing in contexts that may be inappropriate or damaging to their reputation.
In addition to the brand safety benefits, the addition of account-level negative keywords makes the campaign management process more efficient for advertisers.
Instead of adding negative keywords to individual campaigns, advertisers can manage them at the account level, saving time and reducing the chances of human error.
You no longer have to worry about duplicating negative keywords in multiple campaigns or missing any that are vital to your brand safety.
Additionally, account-level negative keywords can improve the accuracy of ad targeting by excluding irrelevant or low-performing keywords that may adversely impact campaign performance. This can result in higher-quality traffic and a better return on investment.
Google Ads offers a range of existing brand suitability controls, including inventory types, digital content labels, placement exclusions, and negative keywords at the campaign level.
Marvin added that Google Ads is expanding account-level negative keywords to address various use cases and will have more to share soon.
This rollout is an essential step in giving brands more control over their advertising and ensuring their campaigns target the appropriate audience.
Featured Image: Primakov/Shutterstock
Google’s Gary Illyes Answers Your SEO Questions On LinkedIn


Google Analyst Gary Illyes offers guidance on large robots.txt files, the SEO impact of website redesigns, and the correct use of rel-canonical tags.
Illyes is taking questions sent to him via LinkedIn direct message and answering them publicly, offering valuable insights for those in the SEO community.
It’s already newsworthy for a Google employee to share SEO advice. This is especially so given it’s Illyes, who isn’t as active on social media as colleagues like Search Advocate John Mueller and Developer Advocate Martin Splitt.
Throughout the past week, Illyes has shared advice and offered guidance on the following subjects:
- Large robots.txt files
- The SEO impact of website redesigns
- The correct use of rel-canonical tags
Considering the engagement his posts are getting, there’s likely more to come. Here’s a summary of what you missed if you’re not following him on LinkedIn.
Keep Robots.Txt Files Under 500KB
Regarding a previously published poll on the size of robots.txt files, Illyes shares a PSA for those with a file size larger than 500kb.


Illyes advises paying attention to the size of your website’s robots.txt file, especially if it’s larger than 500kb.
Google’s crawlers only process the first 500kb of the file, so it’s crucial to ensure that the most important information appears first.
Doing this can help ensure that your website is properly crawled and indexed by Google.
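If you want to check your own file against that limit, here’s a minimal Python sketch (placeholder URL, third-party requests library; it treats 500kb as 500 × 1,024 bytes):

```python
import requests

ROBOTS_URL = "https://example.com/robots.txt"  # placeholder

body = requests.get(ROBOTS_URL, timeout=10).content
print(f"{ROBOTS_URL} is {len(body) / 1024:.1f} KB")

if len(body) > 500 * 1024:
    print("Over 500KB: Google's crawlers only process the first 500KB, "
          "so move the most important rules to the top or trim the file.")
```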
Website Redesigns May Cause Rankings To Go “Nuts”
When you redesign a website, it’s important to remember that its rankings in search engines may be affected.
As Illyes explains, this is because search engines use the HTML of your pages to understand and categorize the content on your site.
If you make changes to the HTML structure, such as breaking up paragraphs, using CSS styling instead of H tags, or adding unnecessary breaking tags, it can cause the HTML parsers to produce different results.
This can significantly impact your site’s rankings in search engines. Or, as Illyes phrases it, it can cause rankings to go “nuts.”
Illyes advises using semantically similar HTML when redesigning the site and avoiding adding tags that aren’t necessary to minimize the SEO impact.
This will allow HTML parsers to better understand the content on your site, which can help maintain search rankings.
Don’t Use Relative Paths In Your Rel-Canonical
Don’t take shortcuts when implementing rel-canonical tags. Illyes strongly advises spelling out the entire URL path.
Saving a few bytes using a relative path in the rel-canonical tag isn’t worth the potential issues it could cause.
Using relative paths may result in search engines resolving them to a different URL than intended, which can cause confusion.
Spelling out the full URL path eliminates potential ambiguity and ensures that search engines identify the correct URL as the preferred version.
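Here’s a small Python sketch that illustrates the ambiguity: the same relative canonical value resolves to different absolute URLs depending on the page it appears on (the URLs are placeholders):

```python
from urllib.parse import urljoin

relative_canonical = "page.html"  # a relative path in rel-canonical: the shortcut Illyes advises against

# The same relative value resolves differently depending on where it appears.
for page_url in ["https://example.com/blog/page.html", "https://example.com/news/page.html"]:
    print(f"On {page_url}: canonical resolves to {urljoin(page_url, relative_canonical)}")

# Spelling out the full URL removes the ambiguity:
absolute_canonical = "https://example.com/blog/page.html"
print(f"Absolute canonical always points to {absolute_canonical}")
```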
In Summary
By answering questions sent to him via direct message and offering his expertise, Illyes is giving back to the community and providing valuable insights on various SEO-related topics.
This is a testament to Illyes’ dedication to helping people understand how Google works. Send him a DM, and your question may be answered in a future LinkedIn post.
Source: LinkedIn
Featured Image: SNEHIT PHOTO/Shutterstock