What’s New In SE Ranking’s On-Page SEO Checker?

When it comes to SEO, on-page elements are factors that directly affect a site’s performance. This is why, over the years, more and more tech companies have launched their own on-page SEO checker tools. These tools make it easier for people who want to jumpstart SEO for their businesses. One of our favorite SEO tools, SE Ranking, takes this to another level by developing unique features that work cohesively with the other datasets in its toolbox.

SE Ranking for SEO Optimization

Currently, SE Ranking’s rank tracking tool is widely used among SEO professionals. To make the platform more cohesive, they have developed a broad range of much-needed functionalities to analyze and organize your SEO efforts. 

Their new On-Page SEO Checker is an awesome addition that helps SEO specialists all over the world build an actionable plan to rank in search engines like Google.

The tool aims to analyze how a specific webpage can be optimized for its target keyword. So, I took a look at what it could monitor on a page, what makes it different from other on-page SEO tools out there, and how it helps improve the on-page SEO of a website.

Today, I’m going to share my experience with SE Ranking’s new on-page SEO tool.

SE Ranking’s On-Page SEO Checker in Action

To fully see how accurate and extensive the tool is, I used it to check a new website my team is currently working on. I entered the URL of the webpage and the keyword I aim to rank it for to get the audit started.

[Image: SE Ranking On-Page SEO Checker audit launch]

One thing to note is that the checker only accommodates one search query per analysis, so the results the tool shows will differ for each query you enter.

Here’s a rundown of some of our interesting finds:

It Scores Based On A Comprehensive Metric

[Image: SE Ranking On-Page SEO Checker overall scoring system]

The first thing SE Ranking’s on-page SEO checker shows is the webpage’s overall quality score. This is based on over 70 parameters Google considers when it ranks pages. The quality score is more than just an estimation of the webpage’s performance: I found that SE Ranking’s scoring system has a way of gauging which factors are prioritized over others.

According to SE Ranking, “metrics that have a strong impact on rankings have a greater impact on the page’s overall quality score, while metrics that are not decisive are given less weight” (source: SE Ranking).

This means that the score is based on the number of checks, warnings, and errors the tool detects across the different metrics included in the analysis.

To help you prioritize your optimizations better, you can explore these categories one by one. For example, the URLs listed as ‘Errors’ require more urgent solutions, so you can start with them.

It Provides Suggestions to Optimize Metadata

Some of the most rigid diagnostics I found from the tool are the ones pointing to the webpage’s metadata. The page titles, URLs, meta descriptions, headings, and alt attributes are arguably the heart of SEO. These are some of the most important factors Google looks at to understand the content of your webpage. 

What makes SE Ranking’s on-page SEO checker tool stand out is that it can detect issues that carry weight when it comes to keyword rankings. This is because the tool focuses on giving an in-depth analysis of on-page elements.

While other tools only look at the number of characters in your metadata, SE Ranking can assess if you have integrated your keywords within them. This ensures that your webpage is optimized in a way that caters to the keyword you want to rank it for.
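To illustrate the kind of check this involves, here’s a minimal Python sketch of a metadata audit. It isn’t SE Ranking’s actual logic: the URL and keyword are placeholders, and the length ranges are common guidelines rather than official limits.

import requests
from bs4 import BeautifulSoup

def audit_metadata(url: str, keyword: str) -> dict:
    """Fetch a page and run basic keyword and length checks on its metadata."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    description = (desc_tag.get("content") or "").strip() if desc_tag else ""
    kw = keyword.lower()
    return {
        "title_has_keyword": kw in title.lower(),
        "description_has_keyword": kw in description.lower(),
        "title_length_ok": 10 <= len(title) <= 60,  # guideline, not a hard rule
        "description_length_ok": 50 <= len(description) <= 160,
    }

print(audit_metadata("https://example.com/", "on-page seo"))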

[Image: SE Ranking On-Page SEO Checker metadata findings]

Another thing I noticed is that instead of just listing warnings and errors in your data, it also suggests how to fix them. This saves time and effort that can go toward optimizing other areas of your webpage.

It Evaluates for Content Uniqueness

Duplicates in page content are red flags in SEO. The last thing you want to do is publish unoriginal content that lowers your chances of ranking in the search engines. Regularly checking if your content is unique for each page is important. 

The checker detects and lists the URLs of the pages your content is similar to, how similar they are, and which exact parts of the content are not unique. This gives you a clearer idea of what and where you should revise or improve.
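SE Ranking doesn’t publish how it computes similarity, but as a rough illustration, Python’s standard difflib can produce both an overall similarity score and the exact shared passages between two pieces of page text:

import difflib

def similarity_report(text_a: str, text_b: str, min_block: int = 30):
    """Return a similarity ratio (0.0 to 1.0) plus the longer shared passages."""
    matcher = difflib.SequenceMatcher(None, text_a, text_b)
    shared = [
        text_a[m.a : m.a + m.size]
        for m in matcher.get_matching_blocks()
        if m.size >= min_block  # skip short incidental matches
    ]
    return matcher.ratio(), shared

a = "Our checker audits page titles, meta descriptions, and headings in depth."
b = "Our checker audits page titles, meta descriptions, and image alt texts."
ratio, shared = similarity_report(a, b)
print(f"{ratio:.0%} similar; shared passages: {shared}")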

[Image: SE Ranking On-Page SEO Checker page content uniqueness rate findings]

Another unique capability you can take advantage of is that the tool also checks the uniqueness of the images found on your webpage. With this feature, you can help ensure that every aspect of your content is original.

[Image: SE Ranking On-Page SEO Checker image uniqueness analyzer]

Overall, this category will help make your webpage attractive to users, especially since what it offers is distinct and not repetitive.

It Assesses Keyword Usage

Your keyword usage will make or break your SEO efforts for a webpage. This is why on top of using the appropriate keyword, you have to integrate it into your webpage properly. SE Ranking’s checker also analyzes this by looking at the number of times a keyword is used on a webpage. At the same time, it detects which part of the content your keyword is placed in. 

[Image: SE Ranking On-Page SEO Checker keyword usage assessment]

The “T” in the green circle stands for the title, “D” for the meta description, and “H1” for the Heading 1. This feature can guide you on how to place your keyword in the areas where you need it to be.

Another good thing about this feature is it recognizes keyword variations and keyphrases in case you want to diversify your keyword usage. 
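As a rough sketch of what a placement check like this might look like under the hood (again, the URL and keyword below are placeholders, and a real implementation would also catch keyword variations, which a plain regex won’t):

import re
import requests
from bs4 import BeautifulSoup

def keyword_placement(url: str, keyword: str) -> dict:
    """Count how often a keyword appears in the title, description, H1s, and body."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    pattern = re.compile(re.escape(keyword), re.IGNORECASE)
    desc = soup.find("meta", attrs={"name": "description"})
    texts = {
        "T (title)": soup.title.get_text() if soup.title else "",
        "D (meta description)": desc.get("content", "") if desc else "",
        "H1": " ".join(h.get_text() for h in soup.find_all("h1")),
        "body": soup.get_text(),
    }
    return {label: len(pattern.findall(text)) for label, text in texts.items()}

print(keyword_placement("https://example.com/", "on-page seo"))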

It Provides A Page Speed Report and Recommendations

Good page speed provides good user experience. And when Google rolled out its Core Web Vitals update last year, the speed, responsiveness, and stability of a webpage became an important ranking factor. 

The tool makes this easier by skipping the setup process and directly showing the results.

[Image: SE Ranking On-Page SEO Checker page loading speed report]

Not only does SE Ranking give the numbers for loading speed, but it also presents specific suggestions that you can work on to improve these numbers. 

In the image below, you’ll see specific fields where fixes can be applied. The checker also separates its results for each device. The preview of how the page looks on each device, shown at the right, is just the cherry on top.

[Image: SE Ranking On-Page SEO Checker page preview on a mobile device]

When you click ‘Resolve’ on the right side of each category, it will show the files that can be edited to contribute to a better loading speed. The suggestions show which elements you need to compress and how much space will be saved if you do so.

[Image: SE Ranking On-Page SEO Checker ‘Resolve’ button for page speed optimizations]

With these recommendations from the tool, you can make your webpage load faster and its navigation swifter. This allows your visitors to see the information they need in a shorter amount of time, which creates a good on-page user experience.

It Can Showcase Off-Page Data

This tool already provides a comprehensive diagnosis of on-page elements, but what makes it more special is it also gives an overview of the webpage’s backlink profile. After all, backlinks also carry a lot of weight in page ranking. 

The checker shows the backlink record along with the outbound and internal links found on a webpage. 

[Image: SE Ranking On-Page SEO Checker link analysis report]

I think the presentation of the data could be better if it also showed the anchor text and landing page used for each backlink, giving you a more complete overview of your backlinks.

And although it’s good to show how many of the backlinks are dofollow links, SE Ranking could refine this further by indicating which ones are contributing to your webpage’s ranking.

Aside from looking at the backlinks, the tool also takes a look at how a webpage and its domain are performing on social media, particularly on Facebook and VK (a Russian social networking platform).

[Image: SE Ranking On-Page SEO Checker report for social media popularity]

Consistent with the other metrics, SE Ranking provides suggestions for improvement such as making social media accounts accessible on the webpage. This will let people on social media see and engage with your content.

Moving forward, it would be useful if SE Ranking could include data from other major social media platforms, such as Twitter, Instagram, and LinkedIn, in the assessment. That way, it could better evaluate webpage performance across different audiences and help determine which platform you should put more effort into to make your business and content more visible.

On-Page Tool Comparison: SE Ranking vs. Xenu’s Link Sleuth vs. Screaming Frog

Xenu’s Link Sleuth is one of the older SEO tools used by professionals, and Screaming Frog is another popular program I love using for my on-site crawls. To get a better grasp of what SE Ranking’s On-Page Checker can do, I compared it to these two tools in terms of how each presents its data and findings.

Findings For Page Speed Data

Xenu’s Link Sleuth doesn’t directly analyze page speed. It came into operation in 2009, and its features don’t seem up to date compared to a lot of the on-page SEO tools my team uses.

The only thing I found that has an impact on a webpage’s loading speed is the ‘Size’ category, which points to large files found on a page. The trouble is, it doesn’t indicate the unit of measurement it refers to.

[Image: Xenu’s Link Sleuth results]

Screaming Frog, on the other hand, can look into how fast your site is across a large number of a webpage’s backend elements. For this tool, you have to connect your Google account and enter your API key to get certain datasets, such as your page speed findings.

Here’s what it shows before the setup process:

[Image: Screaming Frog page speed report results]

The same data can be found in SE Ranking without setting up an API key. This helps if you’re not familiar with the more technical aspects of SEO.

Presentation of Content Data

Just like SE Ranking, Xenu’s Link Sleuth can also read the content of a webpage. However, this is limited to the page title alone. When it comes to the originality of the content, it doesn’t contribute much to ensuring that a webpage is unique across its domain and the internet.
[Image: Xenu’s Link Sleuth content data]

Since Xenu’s Link Sleuth provides a limited number of categories for its on-page audit, the optimizations you can make to step up your webpage’s SEO are also limited.

Similar to SE Ranking, Screaming Frog has a feature that detects duplicate content issues on a webpage. Here’s how simply Screaming Frog presents duplicate content warnings:

[Image: Screaming Frog duplicate content findings]

SE Ranking presents this data a bit differently since it also checks if your webpage is unique from other ones available on the internet. It also highlights which specific lines among the pages are similar. 

This is a recurring theme that I noticed in my comparison. For each category, SE Ranking provides a brief description of what it is, how it factors in SEO, and what ways you can optimize it. 

This makes SE Ranking a beginner-friendly tool that will allow you to enhance your webpage like an expert. 

Key Takeaway 

SE Ranking’s on-page tool helps optimize your webpage to be unique, accessible, and attractive to visitors. It evaluates even the smallest on-page details that affect your ranking and shows ways to enhance them, so you won’t have to figure them out yourself.

The final touch? The results the checker generates can be exported via email or as a PDF, and they include a checklist of the areas of your webpage that need improvement. This way, you can keep working on them without having to log back into the SE Ranking website.

If you’re working on giving your website visibility in the search engines, try out SE Ranking’s On-Page SEO Checker tool for a smooth and comprehensive on-page diagnosis. 

Try out SE Ranking’s on-page checker by signing up today.

How to Block ChatGPT From Using Your Website Content

There is concern about the lack of an easy way to opt out of having one’s content used to train large language models (LLMs) like ChatGPT. There is a way to do it, but it’s neither straightforward nor guaranteed to work.

How AIs Learn From Your Content

Large Language Models (LLMs) are trained on data that originates from multiple sources. Many of these datasets are open source and are freely used for training AIs.

Some of the sources used are:

  • Wikipedia
  • Government court records
  • Books
  • Emails
  • Crawled websites

There are actually portals and websites that give away vast amounts of information in the form of datasets.

One of the portals is hosted by Amazon, offering thousands of datasets at the Registry of Open Data on AWS.

[Screenshot from Amazon, January 2023]

The Amazon portal is just one of many portals that host datasets.

Wikipedia lists 28 portals for downloading datasets, including the Google Dataset and the Hugging Face portals for finding thousands of datasets.

Datasets of Web Content

OpenWebText

A popular dataset of web content is called OpenWebText. OpenWebText consists of URLs found on Reddit posts that had at least three upvotes.

The idea is that these URLs are trustworthy and will contain quality content. I couldn’t find information about a user agent for their crawler; maybe it’s just identified as Python, I’m not sure.

Nevertheless, we do know that if your site is linked from Reddit with at least three upvotes then there’s a good chance that your site is in the OpenWebText dataset.

More information about OpenWebText is here.

Common Crawl

One of the most commonly used datasets for Internet content is offered by a non-profit organization called Common Crawl.

Common Crawl data comes from a bot that crawls the entire Internet.

The data is downloaded by organizations wishing to use the data and then cleaned of spammy sites, etc.

The name of the Common Crawl bot is CCBot.

CCBot obeys the robots.txt protocol, so it is possible to block Common Crawl with robots.txt and prevent your website data from making it into another dataset.

However, if your site has already been crawled then it’s likely already included in multiple datasets.

Nevertheless, by blocking Common Crawl it’s possible to opt out your website content from being included in new datasets sourced from newer Common Crawl data.

The CCBot User-Agent string is:

CCBot/2.0

Add the following to your robots.txt file to block the Common Crawl bot:

User-agent: CCBot
Disallow: /

An additional way to confirm whether a CCBot user agent is legit is that it crawls from Amazon AWS IP addresses.
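One way to sanity-check that yourself is to test whether an IP from your server logs falls inside Amazon’s published AWS ranges. Here’s a minimal Python sketch; note it only confirms the request came from AWS, not from Common Crawl specifically:

import ipaddress
import requests

def is_aws_ip(ip: str) -> bool:
    """Check an IPv4 address against Amazon's published AWS IP ranges."""
    prefixes = requests.get(
        "https://ip-ranges.amazonaws.com/ip-ranges.json", timeout=10
    ).json()["prefixes"]
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(p["ip_prefix"]) for p in prefixes)

# Example: an IP pulled from your access logs for a request claiming to be CCBot.
print(is_aws_ip("52.0.0.1"))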

CCBot also obeys the nofollow robots meta tag directives.

Use this in your robots meta tag:

<meta name="robots" content="nofollow">

Blocking AI From Using Your Content

Search engines allow websites to opt out of being crawled. Common Crawl also allows opting out. But there is currently no way to remove one’s website content from existing datasets.

Furthermore, research scientists don’t seem to offer website publishers a way to opt out of being crawled.

The article Is ChatGPT Use Of Web Content Fair? explores whether it’s even ethical to use website data without permission or a way to opt out.

Many publishers would likely appreciate being given more say in how their content is used, especially by AI products like ChatGPT.

Whether that will happen is unknown at this time.

Featured image by Shutterstock/ViDI Studio



Google’s Mueller Criticizes Negative SEO & Link Disavow Companies

John Mueller recently made strong statements against SEO companies that provide negative SEO and other agencies that provide link disavow services outside of the tool’s intended purpose, saying that they are “cashing in” on clients who don’t know better.

Many frequently say that Mueller and other Googlers are ambiguous, even on the topic of link disavows.

The fact however is that Mueller and other Googlers have consistently recommended against using the link disavow tool.

This may be the first time Mueller actually portrayed SEOs who liberally recommend link disavows in a negative light.

What Led to John Mueller’s Rebuke

The context of Mueller’s comments about negative SEO and link disavow companies started with a tweet by Ryan Jones (@RyanJones).

Ryan tweeted that he was shocked at how many SEOs regularly offer disavowing links.

He tweeted:

“I’m still shocked at how many seos regularly disavow links. Why? Unless you spammed them or have a manual action you’re probably doing more harm than good.”

The reason Ryan is shocked is that Google has consistently recommended the tool only for disavowing paid or spammy links that the sites (or their SEOs) are responsible for.

And yet, here we are, eleven years later, and SEOs are still misusing the tool for removing other kinds of links.

Here’s the background information about that.

Link Disavow Tool

In the mid-2000s, prior to the Penguin Update in April 2012, there was a thriving open market for paid links. The commerce in paid links was staggering.

I knew of one publisher with around fifty websites who received a $30,000 check every month for hosting paid links on his site.

Even though I advised my clients against it, some of them still purchased links because they saw everyone else was buying them and getting away with it.

The Penguin Update caused the link-selling boom to collapse.

Thousands of websites lost rankings.

SEOs and affected websites strained under the burden of having to contact all the sites from which they purchased paid links to ask to have them removed.

So some in the SEO community asked Google for a more convenient way to disavow the links.

Months went by and after resisting the requests, Google relented and released a disavow tool.

Google cautioned from the very beginning to only use the tool for disavowing links that the site publishers (or their SEOs) are responsible for.

The first paragraph of Google’s October 2012 announcement of the link disavow tool leaves no doubt on when to use the tool:

“Today we’re introducing a tool that enables you to disavow links to your site.

If you’ve been notified of a manual spam action based on ‘unnatural links’ pointing to your site, this tool can help you address the issue.

If you haven’t gotten this notification, this tool generally isn’t something you need to worry about.”

The message couldn’t be clearer.

But at some point in time, link disavowing became a service applied to random and “spammy looking” links, which is not what the tool is for.

Link Disavow Takes Months To Work

There are many anecdotes about link disavows that helped sites regain rankings.

They aren’t lying, I know credible and honest people who have made this claim.

But here’s the thing, John Mueller has confirmed that the link disavow process takes months to work its way through Google’s algorithm.

Sometimes things happen that are not related, no correlation. It just looks that way.

John shared how long it takes for a link disavow to work in a Webmaster Hangout:

“With regards to this particular case, where you’re saying you submitted a disavow file and then the ranking dropped or the visibility dropped, especially a few days later, I would assume that that is not related.

So in particular with the disavow file, what happens is we take that file into account when we reprocess the links kind of pointing to your website.

And this is a process that happens incrementally over a period of time where I would expect it would have an effect over the course of… I don’t know… maybe three, four, five, six months …kind of step by step going in that direction.

So if you’re saying that you saw an effect within a couple of days and it was a really strong effect then I would assume that this effect is completely unrelated to the disavow file. …it sounds like you still haven’t figured out what might be causing this.”

John Mueller: Negative SEO and Link Disavow Companies are Making Stuff Up

Context is important to understand what was said.

So here’s the context for John Mueller’s remark.

An SEO responded to Ryan’s tweet about being shocked at how many SEOs regularly disavow links.

The person responding to Ryan tweeted that disavowing links was still important, that agencies provide negative SEO services to take down websites and that link disavow is a way to combat the negative links.

The SEO (SEOGuruJaipur) tweeted:

“Google still gives penalties for backlinks (for example, the 14 Dec update), so disavowing links is still important.”

SEOGuruJaipur next began tweeting about negative SEO companies.

Negative SEO companies are those that will build spammy links to a client’s competitor in order to make the competitor’s rankings drop.

SEOGuruJaipur tweeted:

“There are so many agencies that provide services to down competitors; they create backlinks for competitors such as comments, bookmarking, directory, and article submission on low quality sites.”

SEOGuruJaipur continued discussing negative SEO link builders, saying that only high trust sites are immune to the negative SEO links.

He tweeted:

“Agencies know what kind of links hurt the website because they have been doing this for a long time.

It’s only hard to down for very trusted sites. Even some agencies provide a money back guarantee as well.

They will provide you examples as well with proper insights.”

John Mueller tweeted his response to the above tweets:

“That’s all made up & irrelevant.

These agencies (both those creating, and those disavowing) are just making stuff up, and cashing in from those who don’t know better.”

Then someone else joined the discussion, and Mueller tweeted a response:

“Don’t waste your time on it; do things that build up your site instead.”

Unambiguous Statement on Negative SEO and Link Disavow Services

A statement by John Mueller (or anyone) can appear to conflict with prior statements when taken out of context.

That’s why I not only placed his statements into their original context but also included the eleven years of history that are part of that discussion.

It’s clear that John Mueller feels that those selling negative SEO services and those providing disavow services outside of the intended use are “making stuff up” and “cashing in” on clients who might not “know better.”

Featured image by Shutterstock/Asier Romero



Source Code Leak Shows New Ranking Factors to Consider

January 25, 2023, the day that Yandex—Russia’s search engine—was hacked. 

Its complete source code was leaked online. It might not be the first time we’ve seen hacking in this industry, but it is one of the most intriguing, groundbreaking events in years.

But Yandex isn’t Google, so why should we care? Here’s why we do: these two search engines are very similar in how they process technical elements of a website, and this leak just showed us the 1,922 ranking factors Yandex uses in its algorithm. 

Simply put, this information is something that we can use to our advantage to get more traffic from Google.

Yandex vs Google

As I said, a lot of these ranking factors are possibly quite similar to the signals that Google uses for search.

Yandex’s algorithm shows a RankBrain analog: MatrixNet. It also seems that they are using PageRank (almost the same way Google does), and a lot of their text algorithms are the same. Interestingly, there are also a lot of ex-Googlers working at Yandex.

So, reviewing these factors and understanding how they play into search rankings and traffic will provide some very useful insights into how search engines like Google work. No doubt, this new trove of information will greatly influence the SEO market in the months to come. 

That said, Yandex isn’t Google. The chances of Google having the exact same list of ranking factors are low, and Google may not even give any given signal the same amount of weight that Yandex does.

Still, it’s information that potentially will be useful for driving traffic, so make sure to take a look at them here (before it’s scrubbed from the internet forever).

An early analysis of ranking factors

Many of their ranking factors are as expected. These include:

  • Many link-related factors (e.g., age, relevancy, etc.).
  • Content relevance, age, and freshness.
  • Host reliability.
  • End-user behavior signals.

Some sites also get preference (such as Wikipedia). FI_VISITS_FROM_WIKI even shows that sites that are referenced by Wikipedia get plus points. 

These are all things that we already know.

But something interesting: there were several factors that I and other SEOs found unusual, such as PageRank being the 17th highest weighted factor in Yandex, and query-document relevance (in other words, how closely they match thematically) being the 19th. There’s also karma for likely spam hosts, based on Whois information.

Other interesting factors are the average domain ranking across queries, percent of organic traffic, and the number of unique visitors.

You can also use this Yandex Search Ranking Factor Explorer, created by Rob Ousbey, to search through the various ranking factors.

The possible negative ranking factors:

Here are my thoughts on the Yandex factors I found interesting:

FI_ADV: -0.2509284637 — this factor means having tons of adverts scattered around your page and buying PPC can affect rankings. 

FI_DATER_AGE: -0.2074373667 — this one evaluates content age, and whether your article is more than 10 years old, or if there’s no determinable date. Date metadata is important. 

FI_COMM_LINKS_SEO_HOSTS: -0.1809636391 — this can be a negative factor if you have too much commercial anchor text, particularly if the proportion of such links goes above 50%. Pay attention to anchor text distribution. I’ve written a guide on how to effectively use anchor texts if you need some help on this. 

FI_RANK_ARTROZ — outdated, poorly written text will bring your rankings down. Go through your site and give your content a refresh. FI_WORD_COUNT also shows that the number of words matter, so avoid having low-content pages.

FI_URL_HAS_NO_DIGITS, FI_NUM_SLASHES, FI_FULL_URL_FRACTION — URLs shouldn’t have digits or too many slashes (too much hierarchy), and should, of course, contain your targeted keyword. A quick check along those lines is sketched below.
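As a hypothetical example (the URL and keyword are placeholders):

from urllib.parse import urlparse

def audit_url(url: str, keyword: str) -> dict:
    """Check a URL path for digits, depth, and the target keyword."""
    path = urlparse(url).path.lower()
    segments = [s for s in path.split("/") if s]
    return {
        "has_digits": any(ch.isdigit() for ch in path),
        "path_depth": len(segments),
        "contains_keyword": keyword.lower().replace(" ", "-") in path,
    }

print(audit_url("https://example.com/blog/on-page-seo-checker/", "on page seo checker"))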

FI_NUM_LINKS_FROM_MP — always interlink your main pages (such as your homepage or landing pages) to any other important content you want to rank. Otherwise, it can hurt your content.

FI_HOPS — reduce the crawl depth for any pages that matter to you. No important pages should be more than a few clicks away from your homepage. I recommend keeping it to two clicks, at most. 

FI_IS_UNREACHABLE — likewise, avoid making any important page an orphan page. If it’s unreachable from your homepage, it’s as good as dead in the eyes of the search engine.
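To audit both of these factors on your own site, a simple breadth-first crawl from the homepage reports each page’s click depth, and any page in your sitemap that never shows up in the result is an orphan candidate. A minimal sketch, with the homepage URL as a placeholder:

from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def crawl_depths(homepage: str, max_pages: int = 200) -> dict:
    """Breadth-first crawl recording each internal page's click depth."""
    site = urlparse(homepage).netloc
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue and len(depths) < max_pages:
        page = queue.popleft()
        try:
            html = requests.get(page, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(page, a["href"]).split("#")[0]
            if urlparse(link).netloc == site and link not in depths:
                depths[link] = depths[page] + 1
                queue.append(link)
    return depths

depths = crawl_depths("https://example.com/")
print([url for url, depth in depths.items() if depth > 2])  # deeper than two clicks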

The possible positive ranking factors:

FI_IS_COM: +0.2762504972 — .com domains get a boost in rankings.

FI_YABAR_HOST_VISITORS — the more traffic you get, the more ranking power your site has. The strategy of targeting smaller, easier keywords first to build up an audience before targeting harder keywords can help you build traffic.

FI_BEAST_HOST_MEAN_POS — the average position of the host for keywords affects your overall ranking. This factor and the previous one clearly show that being smart with your keyword and content planning matters. If you need help with that, check out these 5 ways to build a solid SEO strategy.

FI_YABAR_HOST_SEARCH_TRAFFIC — this might look bad but shows that having other traffic sources (such as social media, direct search, and PPC) is good for your site. Yandex uses this to determine if a real site is being run, not just some spammy SEO project.

The leak also includes a whole host of CTR-related factors.

[Image: CTR ranking factors from Yandex]

It’s clear that having searchable and interesting titles that drive users to check your content out is something that positively affects your rankings.

Google is rewarding sites that help end a user’s search journey (as we know from the latest mobile search updates and even the Helpful Content update). Do what you can to answer the query early on in your article. The factor FI_VISITORS_RETURN_MONTH_SHARE also shows that it helps to encourage users to return to your site for more information on the topics they’re interested in. Email marketing is a handy tool here.

FI_GOOD_RATIO and FI_MANY_BAD — the percentage of “good” and “bad” backlinks on your site. Getting your backlinks from high-quality websites with traffic is important for your rankings. The factor FI_LINK_AGE also shows that adding a link-building strategy to your SEO as early as possible can help with your rankings.

FI_SOCIAL_URL_IS_VERIFIED — that little blue check has actual benefits now. Links from verified accounts have more weight.

Key Takeaway

Yandex and Google being so similar to each other in theory means that this data leak is something we must pay attention to.

Several of these factors may already be common knowledge amongst SEOs, but having them confirmed by another search engine reinforces how important they are for your strategy.

These initial findings, and understanding what it might mean for your website, can help you identify what to improve, what to scrap, and what to focus on when it comes to your SEO strategy. 
