
First Input Delay – A Simple Explanation via @sejournal, @martinibuster


First Input Delay (FID) is a user experience metric that Google uses as a small ranking factor.

This article offers an easy-to-understand overview of FID to help make sense of the topic.

Improving First Input Delay is about more than pleasing Google. Improvements to site performance generally lead to increased sales, ad revenue, and leads.

What is First Input Delay?

FID measures the time from a site visitor’s first interaction with a page, while the page is still loading, to the moment the browser is able to begin responding to that interaction. This is sometimes called Input Latency.

An interaction can be tapping a button or a link, or pressing a key. Text input areas, dropdowns, and checkboxes are other kinds of interaction points that FID will measure.

Scrolling and zooming do not count as interactions because there’s no response expected from the site itself.

The goal for FID is to measure how responsive a site is while it’s loading.
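If you want to see the value for yourself, browsers expose the data FID is computed from through the standard Event Timing API. The following is a minimal sketch, not an official Google tool: it observes the first-input performance entry and logs the delay as the difference between when the browser started processing the event and when the visitor actually interacted.

<script>
  // Minimal sketch: log the first input delay using the Event Timing API.
  // Supported in Chromium-based browsers; the value varies per visitor.
  new PerformanceObserver((entryList) => {
    for (const entry of entryList.getEntries()) {
      // Delay = when the browser began handling the event minus when the user interacted.
      const fid = entry.processingStart - entry.startTime;
      console.log('First Input Delay (ms):', fid);
    }
  }).observe({ type: 'first-input', buffered: true });
</script>

Field tools like the Chrome User Experience Report aggregate this same value across real visitors.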


The Cause of First Input Delay

First Input Delay is generally caused by images and scripts that download in a non-orderly manner.

This disordered coding causes the web page download to pause and restart excessively, which results in unresponsive behavior for site visitors attempting to interact with the web page.

It’s like a traffic jam caused by a free-for-all where there are no traffic signals. Fixing it is about bringing order to the traffic.

Google describes the cause of input latency like this:

“In general, input delay (a.k.a. input latency) happens because the browser’s main thread is busy doing something else, so it can’t (yet) respond to the user.

One common reason this might happen is the browser is busy parsing and executing a large JavaScript file loaded by your app.

While it’s doing that, it can’t run any event listeners because the JavaScript it’s loading might tell it to do something else.”


How to Fix Input Latency

Since the root cause of First Input Delay is the disorganized download of scripts and images, the way to fix the problem is to thoughtfully bring order to how those scripts and images are presented to the browser for download.

Solving the problem of FID generally consists of using HTML attributes to control how scripts download, optimizing images (the HTML and the images), and thoughtfully omitting unnecessary scripts.

The goal is to optimize what is downloaded to eliminate the typical pause-and-start downloading of unorganized web pages.

Why Browsers Become Unresponsive

Browsers are software that complete tasks to show a web page. The tasks consist of downloading code, images, fonts, style information, and scripts, and then running (executing) the scripts and building the web page according to the HTML instructions.

This process is called rendering. The word render means “to make,” and that’s what a browser does by assembling the code and images to render a web page.

The browser carries out these rendering tasks on threads, short for “threads of execution.” A thread is an individual sequence of actions (in this case, it works through the many individual tasks done to render a web page).

In a browser, there is one thread called the Main Thread and it is responsible for creating (rendering) the web page that a site visitor sees.

The main thread can be visualized as a highway in which cars are symbolic of the images and scripts that are downloading and executing when a person visits a website.

Some code is large and slow. This causes the other tasks to stop and wait for the big and slow code to get off the highway (finish downloading and executing).

The goal is to code the web page so that it controls which code is downloaded first and when that code is executed, in an orderly way, so that the page downloads as fast as possible.

Don’t Lose Sleep Over Third-Party Code

When it comes to Core Web Vitals, and especially First Input Delay, you’ll find there is some code you just can’t do much about. However, this is likely to be the case for your competitors as well.


For example, if your business depends on Google AdSense (a big render-blocking script), the problem is going to be the same for your competitor. Solutions like lazy loading using Google Ad Manager can help.

In some cases, it may be enough to do the best you can because your competitors may not do any better either.

In those cases, it’s best to take your wins where you can find them. Don’t sweat the losses where you can’t make a change.

JavaScript Impact on First Input Delay

JavaScript is like a little engine that makes things happen. When a name is entered on a form, JavaScript might be there to make sure both the first and last names are entered.

When a button is pressed, JavaScript may be there to tell the browser to spawn a thank you message in a popup.

The problem with JavaScript is that it not only has to download but also has to run (execute). So those are two things that contribute to input latency.
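As a simple, hypothetical illustration (the button and message below are made up for this example), the click handler can only run when the main thread is free. If a large script is still downloading and executing at the moment the visitor taps the button, the response is delayed, and that delay is what FID captures.

<button id="subscribe">Subscribe</button>

<script>
  // Hypothetical example: the click is registered immediately,
  // but the handler cannot run until the main thread finishes its current task.
  document.getElementById('subscribe').addEventListener('click', () => {
    alert('Thank you for subscribing!');
  });
</script>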


If a big JavaScript file is located near the top of the page, that file is going to block the rest of the page beneath it from rendering (becoming visible and interactive) until that script is finished downloading and executing.

This is called blocking the page.

The obvious solution is to relocate these kinds of scripts from the top of the page and put them at the bottom so they don’t interfere with all the other page elements that are waiting to render.

But this can be a problem if, for example, it’s placed at the end of a very long web page.

This is because once the long page has loaded and the user is ready to interact with it, the browser may still be busy downloading and executing the big JavaScript file at the end. The page may appear to load faster, but it can then stall while waiting for that JavaScript to execute.

There’s a solution for that!


Defer and Async Attributes

The Defer and Async HTML attributes are like traffic signals that control the start and stop of how JavaScript downloads and executes.

An HTML attribute is something that transforms an HTML element, kind of like extending the purpose or behavior of the element.

It’s like when you learn a skill; that skill becomes an attribute of who you are.

In this case, the Defer and Async attributes tell the browser to not block HTML parsing while downloading. These attributes tell the browser to keep the main thread going while the JavaScript is downloading.

Async Attribute

A JavaScript file with the async attribute will download without blocking and then execute as soon as the download finishes. The moment it begins to execute is the point at which the JavaScript file blocks the main thread.

Normally, the file would block the main thread when it begins to download. But not with the async (or defer) attribute.

This is called an asynchronous download, where it downloads independently of the main thread and in parallel with it.


The async attribute is useful for third-party JavaScript files like advertising and social sharing — files where the order of execution doesn’t matter.

Defer Attribute

JavaScript files with the “defer” attribute will also download asynchronously.

But the deferred JavaScript file will not execute until the HTML document has finished being parsed. Deferred scripts also execute in the order in which they appear on the web page.

Scripts with the defer attribute are useful for JavaScript files that depend on page elements being loaded and whose order of execution matters.

In general, use the defer attribute for scripts that aren’t essential to the rendering of the page itself.
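Here is a minimal sketch of the three approaches side by side (the file names are placeholders, not real files): a plain script tag blocks HTML parsing while it downloads and runs, async downloads in parallel and runs as soon as it arrives, and defer downloads in parallel but waits to run until the HTML has been parsed.

<!-- Blocks parsing: the browser stops building the page to fetch and run this file. -->
<script src="critical.js"></script>

<!-- async: downloads in parallel, runs as soon as it finishes (execution order not guaranteed). -->
<script async src="ads-widget.js"></script>

<!-- defer: downloads in parallel, runs in document order after HTML parsing is complete. -->
<script defer src="enhancements.js"></script>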

Input Latency is Different for All Users

It’s important to be aware that First Input Delay scores are variable and inconsistent. The scores vary from visitor to visitor.

This variance in scores is unavoidable because the score depends on interactions that are particular to the individual visiting a site.


Some visitors might be distracted and not interact until a moment where all the assets are loaded and ready to be interacted with.

This is how Google describes it:

“Not all users will interact with your site every time they visit. And not all interactions are relevant to FID…

In addition, some users’ first interactions will be at bad times (when the main thread is busy for an extended period of time), and some users’ first interactions will be at good times (when the main thread is completely idle).

This means some users will have no FID values, some users will have low FID values, and some users will probably have high FID values.”

Why Most Sites Fail FID

Unfortunately, many content management systems, themes, and plugins were not built to comply with this relatively new metric.

This is the reason why so many publishers are dismayed to discover that their sites don’t pass the First Input Delay test.


But that’s changing as the web software development community responds to demands for different coding standards from the publishing community.

And it’s not that the software developers making content management systems are at fault for producing products that don’t measure up against these metrics.

For example, WordPress addressed a shortcoming in the Gutenberg website editor that was causing it to score less well than it could.

Gutenberg is a visual way to build sites using the interface, or metaphor, of blocks. There’s a widgets block, a contact form block, a footer block, and so on.

So the process of creating a web page becomes more visual: a page is literally assembled from different blocks.

There are different kinds of blocks that look and behave in different ways. Each individual block has a corresponding style code (CSS), with much of it being specific and unique to that individual block.

The standard way of coding these styles is to create one style sheet containing the styles that are unique to each block. It makes sense to do it this way because you have a central location where all the code specific to blocks exists.


The result is that on a page that might consist of (let’s say) twenty blocks, WordPress would load the styles for those twenty blocks plus the styles for all the other blocks that aren’t being used.

Before Core Web Vitals (CWV), that was considered the standard way to package up CSS.

Since the introduction of Core Web Vitals, that practice is considered code bloat.

This is not meant as a slight against the WordPress developers. They did a fantastic job.

This is just a reflection of how rapidly changing standards can hit a bottleneck at the software development stage before being integrated into the coding ecosystem.

We went through the same thing with the transition to mobile-first web design.

Gutenberg 10.1 Improved Performance

WordPress Gutenberg 10.1 introduced an improved way to load styles: only the styles for blocks actually used on the page are loaded, and the styles for unused blocks are skipped.

This is a huge win for WordPress, the publishers who rely on WordPress, and of course, the users who visit sites created with WordPress.


Time to Fix First Input Delay is Now

Moving forward, we can expect that more and more of the software developers responsible for CMS platforms, themes, and plugins will transition to First Input Delay-friendly coding practices.

But until that happens, the burden is on the publisher to take steps to improve First Input Delay. Understanding it is the first step.

Citations

Chrome User Experience Report

PageSpeed Insights

Chrome Dev Tools Lighthouse

Google Search Console (Core Web Vitals report)

Optimize First Input Delay

First Input Delay

User-centric Performance Metrics

GitHub Script for Measuring Core Web Vitals


Top 5 Essential SEO Reporting Tools For Agencies

Your clients trust you to create real results and hit KPIs that drive their businesses forward.

Understanding the intricacies of how that works can be difficult, so it’s essential to demonstrate your progress and efforts.

SEO reporting software showcases important metrics in a digestible, visual way. These tools save guesswork and manual referencing, highlighting achievements over a specified time.

A great tool can also help you formulate action items, gauge the performance of campaigns, and see real results that can help you create new and innovative evaluations.

The latest and allegedly greatest tools hit the market all the time, promising to transform how you conduct reports.

Certainly, you have to weigh a few factors when deciding which software to implement. Price, features, and ease of use are the most important to consider.

A cost-effective tool with a steep learning curve might not be worth it for the features. Similarly, an expensive tool might be more appealing because it is user-friendly, but it could quickly run up costs.

Just like any transformational business decision, you’ll have to weigh the pros and cons carefully to determine the right one for you.

Key Takeaways

  • Cost, accessibility, and features are the common thread of comparison for SEO reporting tools.
  • To truly get the best use out of an SEO reporting tool for your agency, you’ll need to weigh several details, including scalability, customization, integrations, and access to support.
  • What might be considered a subpar tool could be a game-changer for an agency. Due diligence and research are the keys to knowing what will work for your team.

What To Look For In SEO Reporting Tools

It can be tough to make heads or tails of the available tools and choose which will benefit your agency the most.

Here are the 10 essential requirements of SEO reporting tools.

1. Accurate And Current Regional Data

SEO reporting is all about data. The software must have access to accurate and current data localized to your client’s targeted region.

Search data from the U.S. is meaningless if your client tries to rank for [London plumbing services], so localization matters.

The tool must update data regularly and with reliable accuracy so you can make informed decisions about where your client stands against the competition.

2. Integration With Third-Party Tools

Especially for full-scale digital marketing campaigns, the ability to report on all KPIs in one place is essential.

The more available integrations with third-party tools (e.g., Google Analytics, Google Business Profile, Majestic), the better.

Some tools even allow you to upload custom data sets.

3. Scalability

You don’t want to have to retrain or reinvest in new software every time your agency reaches a new tier.

The right SEO reporting tool should work well for your current business size and leave room for expansion as you onboard more clients.

4. Strong Suite Of Features

A great SEO reporting tool should include:

  • Position tracking.
  • Backlink monitoring.
  • Competitor data.
  • Analytics.

It is a bonus if the tool has reporting features for social media, email marketing, call tracking, and/or paid ads to make it a full-suite digital marketing software.

5. Continually Improving And Updating Features

SEO is constantly evolving, and so should SEO reporting tools.

As we continue the transition from website optimization to web presence optimization, a tool’s ability to integrate new features is essential.

6. Ability To Customize Reports

Each client will have different KPIs, objectives, and priorities.

Presenting the information that clients want to see is paramount to successful campaigns and retention.

Your reporting software of choice should be able to emphasize the correct data at the right times.

7. Client Integration

A good SEO reporting tool must have the client in mind.

It should offer a simple bird’s-eye overview of the basics but also make it easy for clients to dig into the data at a deeper level.

This can mean automated summary reports or 24/7 client access to the dashboard.

8. Ability To White Label Reports

While white labeling is not essential (no client will sniff at receiving a report with a Google logo in the top corner), it helps keep branding consistent and gives a professional sheen to everything you send a client’s way.

9. Access To Support Resources

Quality support resources can help you find a detour when you encounter a roadblock.

Whether it’s detailed support documentation, a chat feature/support desk, or responsive customer support on social media, finding the help you need to solve the issue is important.

10. Cost-To-Value Ratio

With a proper process, time investment, and leveraging support resources, it is possible to get better results from a free reporting tool than one that breaks the bank.


Top 5 SEO Reporting Tools

Here is how five of the most popular SEO reporting tools stack up when evaluated against the above criteria:

1. AgencyAnalytics

My Overall Rating: 4.7/5


AgencyAnalytics is a quality introductory/intermediate reporting tool for agencies.

Among the tools on this list, it is one of the easiest to use for small to mid-sized agencies.

It starts at $12 per month, per client, with unlimited staff and client logins, a white-label dashboard, and automated branded reports. The minimum purchase requirements mean the first two tiers work out to $60 per month and $180 per month, respectively. But your ability to change the payment based on the number of clients could help keep costs lean.

AgencyAnalytics comes with 70+ supported third-party data integrations.

However, this reliance on third-party data means you may have incomplete reports when there is an interruption in the transmission.

Though new integrations are always being added, they can be glitchy at first, making them unreliable to share with clients until stabilized.

With the ability for clients to log in and view daily data updates, it provides real-time transparency.

Automated reports can be customized, and the drag-and-drop customized dashboard makes it easy to emphasize priority KPIs.

2. SE Ranking

My Overall Rating: 4.5/5

SE Ranking has plans starting at $39.20 per month, although the $87.20 per month plan is necessary if you need historical data or more than 10 projects.

Setup is a breeze, as the on-screen tutorial guides you through the process.

SE Ranking features a strong collection of SEO-related tools, including current and historical position tracking, competitor SEO research, keyword suggestion, a backlink explorer, and more.

SE Ranking is hooked up with Zapier, which allows users to integrate thousands of apps and provides a high level of automation between apps like Klipfolio, Salesforce, HubSpot, and Google Apps.

SE Ranking is an effective SEO reporting tool at a beginner to intermediate level.

However, you may want to look in a different direction if your agency requires more technical implementations or advanced customization.

3. Semrush

My Overall Rating: 4.4/5

Semrush is one of the most SEO-focused reporting tools on the list, which is reflected in its features.

Starting at $229.95 per month for the agency package, it’s one of the more expensive tools on the list. But Semrush provides a full suite of tools that can be learned at an intermediate level.

A major downside of Semrush, especially for cost-conscious agencies, is that an account comes with only one user login.

Having to purchase individual licenses for each SEO analyst or account manager adds up quickly, and the users you can add are limited by the plan features. This makes scalability an issue.

Semrush has both branded and white-label reports, depending on your subscription level. It uses a proprietary data stream, tracking more than 800 million keywords.

The ever-expanding “projects” feature covers everything from position tracking to backlink monitoring and social media analysis.

Though it doesn’t fall specifically under the scope of SEO reporting, Semrush’s innovation makes it a one-stop shop for many agencies.

Project features include Ad Builder, which helps craft compelling ad text for Google Ads, and Social Media Poster, which allows agencies to schedule client social posts.

Combining such diverse features under the Semrush umbrella offsets its relatively high cost, especially if you can cancel other redundant software.

4. Looker Studio

My Overall Rating: 3.6/5


Formerly known as Google Data Studio, Looker Studio is a Google service that has grown considerably since its initial launch.

Though it is much more technical and requires more time investment to set up than most other tools on this list, it should be intuitive for staff familiar with Google Analytics.

If you’re on the fence, Looker Studio is completely free.

A major upside to this software is superior integration with other Google properties like Analytics, Search Console, Ads, and YouTube.

Like other reporting tools, it also allows third-party data integration, but the ability to query data from databases, including MySQL, PostgreSQL, and Google’s Cloud SQL, sets it apart.

With proper setup, you can customize reports around important KPIs, pulling from lead and customer information. For eCommerce clients, you can even integrate sales data.

Though the initial setup will be much more technical, the ability to import templates saves time and effort.

You can also create your own templates that better reflect your processes and can be shared across clients. Google also has introductory video walk-throughs to help you get started.

5. Authority Labs

My Overall Rating: 3.2/5


Authority Labs does the job if you’re looking for a straightforward position-tracking tool.

Authority Labs is $49 per month for unlimited users, though you will need to upgrade to the $99 per month plan for white-label reporting.

You can track regional ranking data, get insights into “(not provided)” keywords, track competitor keywords, and schedule automated reporting.

However, lacking other essential features like backlink monitoring or analytic data means you will have to supplement this tool to provide a full SEO reporting picture for clients.

Conclusion

There are many quality SEO reporting tools on the market. What makes them valuable depends on their ability to work for your clients’ needs.

SE Ranking has a fantastic cost-to-value ratio, while Looker Studio has advanced reporting capabilities if you can withstand a higher barrier to entry.

AgencyAnalytics prioritizes client access, which is a big deal if transparency is a core value for your agency.

Authority Labs keeps it lean and clean, while Semrush always adds innovative features.

These five are simply a snapshot of what is available. New and emerging tools might have features more appealing to your current clients, or fill gaps left by software that is otherwise a great solution.

Ultimately, you need to consider what matters most to your agency. Is it:

  • Feature depth?
  • Scalability?
  • Cost-to-value ratio?

Once you weigh the factors that matter most for your agency, you can find the right SEO reporting tool. In the meantime, don’t shy away from testing out a few for a trial period.

If you don’t want to sign up for a full month’s usage, you can also explore walkthrough videos and reviews from current users. The most informed decision requires an understanding of the intricate details.






How to Block ChatGPT From Using Your Website Content

There is concern about the lack of an easy way to opt out of having one’s content used to train large language models (LLMs) like ChatGPT. There is a way to do it, but it’s neither straightforward nor guaranteed to work.

How AIs Learn From Your Content

Large Language Models (LLMs) are trained on data that originates from multiple sources. Many of these datasets are open source and are freely used for training AIs.

Some of the sources used are:

  • Wikipedia
  • Government court records
  • Books
  • Emails
  • Crawled websites

There are portals and websites offering datasets that give away vast amounts of information.

One of the portals is hosted by Amazon, offering thousands of datasets at the Registry of Open Data on AWS.


The Amazon portal, with its thousands of datasets, is just one of many portals that host even more datasets.

Wikipedia lists 28 portals for downloading datasets, including the Google Dataset and the Hugging Face portals for finding thousands of datasets.

Datasets of Web Content

OpenWebText

A popular dataset of web content is called OpenWebText. OpenWebText consists of URLs found on Reddit posts that had at least three upvotes.

The idea is that these URLs are trustworthy and will contain quality content. I couldn’t find information about a user agent for their crawler; maybe it’s just identified as Python, but I’m not sure.

Nevertheless, we do know that if your site is linked from Reddit with at least three upvotes, then there’s a good chance that your site is in the OpenWebText dataset.

More information about OpenWebText is here.

Common Crawl

One of the most commonly used datasets for Internet content is offered by a non-profit organization called Common Crawl.

Common Crawl data comes from a bot that crawls the entire Internet.

The data is downloaded by organizations wishing to use the data and then cleaned of spammy sites, etc.

The name of the Common Crawl bot is CCBot.

CCBot obeys the robots.txt protocol, so it is possible to block Common Crawl with robots.txt and prevent your website data from making it into another dataset.

However, if your site has already been crawled then it’s likely already included in multiple datasets.

Nevertheless, by blocking Common Crawl it’s possible to opt out your website content from being included in new datasets sourced from newer Common Crawl data.

The CCBot User-Agent string is:

CCBot/2.0

Add the following to your robots.txt file to block the Common Crawl bot:

User-agent: CCBot
Disallow: /

An additional way to confirm whether a CCBot user agent is legitimate is to check that it crawls from Amazon AWS IP addresses.

CCBot also obeys the nofollow robots meta tag directives.

Use this in your robots meta tag:

<meta name="robots" content="nofollow">

Blocking AI From Using Your Content

Search engines allow websites to opt out of being crawled. Common Crawl also allows opting out. But there is currently no way to remove one’s website content from existing datasets.

Furthermore, research scientists don’t seem to offer website publishers a way to opt out of being crawled.

The article “Is ChatGPT Use Of Web Content Fair?” explores whether it’s even ethical to use website data without permission or a way to opt out.

Many publishers may appreciate it if, in the near future, they are given more say in how their content is used, especially by AI products like ChatGPT.

Whether that will happen is unknown at this time.





Google’s Mueller Criticizes Negative SEO & Link Disavow Companies


John Mueller recently made strong statements against SEO companies that provide negative SEO services and against agencies that provide link disavow services outside of the tool’s intended purpose, saying that they are “cashing in” on clients who don’t know better.

Many frequently say that Mueller and other Googlers are ambiguous, even on the topic of link disavows.

The fact, however, is that Mueller and other Googlers have consistently recommended against using the link disavow tool.

This may be the first time Mueller actually portrayed SEOs who liberally recommend link disavows in a negative light.

What Led to John Mueller’s Rebuke

The context of Mueller’s comments about negative SEO and link disavow companies started with a tweet by Ryan Jones (@RyanJones).

Ryan tweeted that he was shocked at how many SEOs regularly offer disavowing links.

He tweeted:

“I’m still shocked at how many seos regularly disavow links. Why? Unless you spammed them or have a manual action you’re probably doing more harm than good.”

The reason Ryan is shocked is that Google has consistently recommended the tool only for disavowing paid or spammy links that the sites (or their SEOs) are responsible for.

And yet, here we are, eleven years later, and SEOs are still misusing the tool to remove other kinds of links.

Here’s the background information about that.

Link Disavow Tool

From the mid-2000s until the Penguin Update in April 2012, there was a thriving open market for paid links. The commerce in paid links was staggering.

I knew of one publisher with around fifty websites who received a $30,000 check every month for hosting paid links on his site.

Even though I advised my clients against it, some of them still purchased links because they saw everyone else was buying them and getting away with it.

The Penguin Update caused the link-selling boom to collapse.

Thousands of websites lost rankings.

SEOs and affected websites strained under the burden of having to contact all the sites from which they purchased paid links to ask to have them removed.

So some in the SEO community asked Google for a more convenient way to disavow the links.

Months went by and after resisting the requests, Google relented and released a disavow tool.

Google cautioned from the very beginning to only use the tool for disavowing links that the site publishers (or their SEOs) are responsible for.

The first paragraph of Google’s October 2012 announcement of the link disavow tool leaves no doubt on when to use the tool:

“Today we’re introducing a tool that enables you to disavow links to your site.

If you’ve been notified of a manual spam action based on ‘unnatural links’ pointing to your site, this tool can help you address the issue.

If you haven’t gotten this notification, this tool generally isn’t something you need to worry about.”

The message couldn’t be clearer.

But at some point in time, link disavowing became a service applied to random and “spammy looking” links, which is not what the tool is for.

Link Disavow Takes Months To Work

There are many anecdotes about link disavows that helped sites regain rankings.

They aren’t lying; I know credible and honest people who have made this claim.

But here’s the thing, John Mueller has confirmed that the link disavow process takes months to work its way through Google’s algorithm.

Sometimes things happen around the same time that are not related at all; there is no correlation. It just looks that way.

John shared how long it takes for a link disavow to work in a Webmaster Hangout:

“With regards to this particular case, where you’re saying you submitted a disavow file and then the ranking dropped or the visibility dropped, especially a few days later, I would assume that that is not related.

So in particular with the disavow file, what happens is we take that file into account when we reprocess the links kind of pointing to your website.

And this is a process that happens incrementally over a period of time where I would expect it would have an effect over the course of… I don’t know… maybe three, four, five, six months …kind of step by step going in that direction.

So if you’re saying that you saw an effect within a couple of days and it was a really strong effect then I would assume that this effect is completely unrelated to the disavow file. …it sounds like you still haven’t figured out what might be causing this.”

John Mueller: Negative SEO and Link Disavow Companies are Making Stuff Up

Context is important to understand what was said.

So here’s the context for John Mueller’s remark.

An SEO responded to Ryan’s tweet about being shocked at how many SEOs regularly disavow links.

The person responding to Ryan tweeted that disavowing links was still important, that agencies provide negative SEO services to take down websites and that link disavow is a way to combat the negative links.

The SEO (SEOGuruJaipur) tweeted:

“Google still gives penalties for backlinks (for example, 14 Dec update, so disavowing links is still important.”

SEOGuruJaipur next began tweeting about negative SEO companies.

Negative SEO companies are those that will build spammy links to a client’s competitor in order to make the competitor’s rankings drop.

SEOGuruJaipur tweeted:

“There are so many agencies that provide services to down competitors; they create backlinks for competitors such as comments, bookmarking, directory, and article submission on low quality sites.”

SEOGuruJaipur continued discussing negative SEO link builders, saying that only high trust sites are immune to the negative SEO links.

He tweeted:

“Agencies know what kind of links hurt the website because they have been doing this for a long time.

It’s only hard to down for very trusted sites. Even some agencies provide a money back guarantee as well.

They will provide you examples as well with proper insights.”

John Mueller tweeted his response to the above tweets:

“That’s all made up & irrelevant.

These agencies (both those creating, and those disavowing) are just making stuff up, and cashing in from those who don’t know better.”

Then someone else joined the discussion:

Mueller tweeted a response:

“Don’t waste your time on it; do things that build up your site instead.”

Unambiguous Statement on Negative SEO and Link Disavow Services

A statement by John Mueller (or anyone) can appear to conflict with prior statements when taken out of context.

That’s why I placed his statements not only into their original context but also into the eleven years of history that are part of that discussion.

It’s clear that John Mueller feels that those selling negative SEO services and those providing disavow services outside of the intended use are “making stuff up” and “cashing in” on clients who might not “know better.”



