
How HTTP/3 Helps Feed SEO’s Need For Speed


The evolution of the web never stands still.

As new technologies are developed, consumer behaviors change and the core infrastructure that underpins the internet is forced to adapt.

The HTTP protocol – used to transfer data between client and server – has gone through a number of different iterations, all of which have enhanced the core functionality with new and exciting features.

After an 18-year gap between the adoption of HTTP/1.1 in 1997 and HTTP/2 in 2015, development has picked up the pace, with the draft proposal for HTTP/3 submitted merely three years later.

What Is HTTP/3?

At its core, HTTP/3 is an overhaul of the underlying transport layer used to manage file transfers.

It represents a move away from TCP (Transmission Control Protocol) to UDP (User Datagram Protocol), addressing several TCP limitations and improving performance and security for users.

Although it’s still waiting for final review before publication, 73% of web browsers already support the protocol.


This number will significantly increase once Safari makes it a core feature; currently, it’s experimental and has to be enabled via the developer menu.

Screenshot from HTTP/3 support, Caniuse.com, April 2022

The HTTP/3 protocol is already used by 25% of the top 10 million websites, including Google and Facebook.

In fact, if you’re using technologies like Google Analytics, Tag Manager, or Fonts, you’re already partially utilizing the protocol.

What Are HTTP/3’s Main Advantages Over HTTP/2 And HTTP/1?

To fully appreciate the advantages of HTTP/3, it’s worth stepping back to understand how HTTP/1.1 worked, and the problems HTTP/2 was designed to solve.

When being sent, files (HTML, JS, CSS, images, etc.) are broken down into smaller, individual packets with the data transmitted over time.

HTTP/1.1 was designed to give each file its own connection. As websites became increasingly complex, more files were needed to load each page.

Image from HTTP Archive, April 2022

Browsers limit the number of parallel connections available, creating a bottleneck and slowing loading times. This resulted in several necessary workarounds to maximize performance, such as domain sharding and image sprites.

By introducing multiplexing, HTTP/2 solved the problem caused by connection limits, allowing the transfer of multiple files over a single connection.

The other major improvement was the introduction of better header compression, alongside a few other features that have proved less successful in practice (see Ruth’s excellent HTTP/2 guide for more details).

Yet these enhancements didn’t fix all of the problems with the TCP protocol.


TCP transfers packets chronologically, meaning that if a packet is missed, the entire connection is held up until the packet is successfully received. This problem, known as head-of-line blocking, negates some of the benefits of multiplexing.


Another challenge with TCP is that it's entirely detached from the TLS protocol.

This is by design, as sites can be both secure and insecure.

As a result, a server and client must make multiple round trips to negotiate a connection before transmitting data.

How Does HTTP/3 Solve These Problems?

By moving from TCP to UDP, HTTP/3 introduces three main features that set it apart from HTTP/1.1 and HTTP/2.

Independent Byte Streams

HTTP/3 solves head-of-line blocking by introducing independent byte streams for individual files. Only the data for an individual stream is blocked while the lost packet is resent, not the entire connection.

To illustrate this further, it’s worth thinking back to the fantastic truck analogy Tom Anthony used in his seminal presentation on HTTP/2 (now updated for HTTP/3).


The basic premise is that with HTTP/1.1, you end up with multiple trucks queuing to go on the same road (connection).

Screenshot from @TomAnthonySEO, An introduction to HTTP/3, April 2022

In contrast, HTTP/2 allows multiple trucks to be in the same lane simultaneously.

Screenshot from @TomAnthonySEO, An introduction to HTTP/3, April 2022

Unfortunately, with TCP, if a truck stalls, the entire road is blocked until the truck starts moving again.

Screenshot from @TomAnthonySEO, An introduction to HTTP/3, April 2022

With HTTP/3 and UDP, the other trucks can just drive around it.

TLS Integration

By incorporating TLS 1.3 into HTTP/3 itself, rather than having two distinct protocols operating independently, only a single handshake is required, reducing the number of round trips from two (or three if using TLS 1.2) to one.

This change means faster – and more secure – connections for users.

One consequence of this change is that HTTP/3 can only be used on a secure site because TLS and UDP are closely intertwined. Interestingly, this wasn’t the case with HTTP/2, which can technically be used on an insecure site – although none of the major browsers allow you to do so.

Connection Migration

Rather than using IPs to route packets, HTTP/3 instead uses connection IDs.

By doing so, it can handle network changes without the need to re-establish a connection.


This is hugely advantageous in a mobile-first world, where users often swap between Wi-Fi and cellular networks, both in terms of speed and connection stability.

Going back to our truck analogy, switching networks with TCP is like coming to a junction and having to queue again before you can move on to the next road.


With HTTP/3, there’s a slip road, allowing you to switch between the two seamlessly.

Does HTTP/3 Have Any Disadvantages?

Although HTTP/3 has some clear performance benefits, its detractors have emphasized several disadvantages.

First, the protocol will provide limited benefit to users on fast connections, with the slowest 1% to 10% of connections seeing most of the gains.

But, as far as Core Web Vitals go, this could actually be very beneficial.

CWV scores are aggregated globally, so it’s entirely possible for them to be pulled down by a specific subset of users in a distant geographic location.

Equally, in a mobile-first world, even users with fast devices and close geographic proximity can suffer from temporary network issues, which may have an adverse effect on CWV.


The more mobile your users, the higher the probability of this having an impact.

Another complaint is that switching to HTTP/3 requires a fairly major server upgrade because it fundamentally changes how the transport layer works.

Additionally, the usage of UDP also introduces higher CPU requirements, which may put more pressure on servers.

Both arguments are fair, but CPU usage is currently being optimized.

Also, as we’ll see in the implementation section below, many CDN providers are already providing relatively simple HTTP/3 solutions that can easily be deployed at the edge.

Does HTTP/3 Matter For SEO?

While Googlebot has supported HTTP/2 since November 2020, with half of all URLs now crawled using the protocol, it does not currently support HTTP/3.

HTTP/2 is only used when there is a clear benefit to doing so, i.e., when using HTTP/2 will lead to significant resource savings for both servers and Googlebot.

This will undoubtedly continue to ramp up over time, but given the five-year gap between the publication of the HTTP/2 protocol and Googlebot support, HTTP/3 is likely a way off still.


That said, implementing HTTP/3 could still have an indirect SEO impact – if supporting the protocol leads to better Core Web Vitals scores.

Upgrading your server infrastructure to support HTTP/3 – or, for that matter, HTTP/2 – is just one of many potential enhancements that you can leverage to ensure your website is as performant as possible.

And the benefits of having a performant website, including reduced bounce rates, increased time on site, and higher conversion rates, extend beyond SEO.

To see what protocol Googlebot is using to crawl a site, you can look for a notification in GSC or check Googlebot requests within your server access logs.

While formats vary, the protocol used is commonly listed in the HTTP request found within quotation marks, alongside the request method and URL path.

50.56.92.47 [18/Apr/2022:10:00:00 -0100] "GET /seo/technical-seo-auditing/ HTTP/1.1" 200 684 "https://moz.com/" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

Example of an Apache request (Combined Log Format).
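To get a quick summary of which protocol versions Googlebot is using across a whole log, you could filter it from the command line. This is only a sketch, assuming a combined-format log saved as access.log:

grep "Googlebot" access.log | awk -F'"' '{print $2}' | awk '{print $3}' | sort | uniq -c

The first awk pulls out the quoted request line, the second keeps its protocol field, and sort | uniq -c tallies how many requests used each version.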


How To Check If A Website Supports HTTP/3

If you’re unsure whether or not a website supports HTTP/3, you can check using an online tool like https://http3check.net/.

Screenshot of http3check.net, April 2022

Alternatively, both Chrome and Firefox display the protocol per request within the dev tools network tab.

The Protocol column isn’t visible by default but can be enabled by right-clicking on the column headers in the Network tab and selecting “Protocol.” HTTP/3 requests are labeled “h3.”

Screenshot of the Chrome DevTools Network tab, April 2022

It’s also possible to check using the command line and curl.

curl --http3 https://website.com/
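Note that the --http3 flag is only available in curl builds compiled with HTTP/3 support. If your build lacks it, a rough alternative is to look for the Alt-Svc response header, which servers use to advertise HTTP/3 (listed as h3):

curl -sI https://website.com/ | grep -i alt-svc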

As many sites will only have HTTP/3 enabled for page resources (usually those hosted on a CDN), using dev tools will give a more accurate picture and allow you to assess the opportunities available better.

How Can I Implement HTTP/3?

The easiest way by far to enable HTTP/3 is via a CDN.

Several major providers, including Cloudflare, Google Cloud, and Fastly, already support the protocol.

According to W3Techs, 22% of the top 10 million websites use Cloudflare, where you can easily enable HTTP/3 in the dashboard.

Screenshot of Cloudflare dashboard, April 2022
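If you prefer to script the change, Cloudflare also exposes an HTTP/3 toggle in its zone settings API. The sketch below is illustrative only: the zone ID and API token are placeholders, and it’s worth confirming the endpoint against Cloudflare’s current API documentation before relying on it.

curl -X PATCH "https://api.cloudflare.com/client/v4/zones/ZONE_ID/settings/http3" \
  -H "Authorization: Bearer API_TOKEN" \
  -H "Content-Type: application/json" \
  --data '{"value":"on"}'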

If you’re unsure what tech stack you’re dealing with, use Builtwith or Wappalyzer and see if a CDN is listed.

Screenshot of Wappalyzer, April 2022

If a site is using Cloudflare and all of the requests are HTTP/2, you’ve found an easy and impactful recommendation to make.

If implementation via a CDN isn’t possible, a server change is required.

Various implementations are available, depending on the language used, but web servers haven’t universally adopted these.

Therefore, the feasibility of implementing HTTP/3 is likely to depend on the type of software you’re using.

Server HTTP/3 support, April 2022

Unfortunately, Apache, which is used by 32% of web servers, has yet to begin work on support due to limited developer resources.

Similarly, enabling the protocol on Node requires a workaround due to the lack of OpenSSL support.


Windows (IIS) is the latest server platform to offer the protocol natively, but it requires Windows Server 2022 or Windows 11 and later.

Wrapping Up

HTTP/3 is another significant step forward for the web and will provide a much-needed performance boost to support its continuing evolution.

As SEO and digital marketing professionals, we should be aware of the benefits the protocol brings ahead of its imminent publication, so we can start recommending its use and allow our users to reap the benefits for years to come.



Featured Image: VectorHot/Shutterstock



A Complete Google Search Console Guide For SEO Pros


Google Search Console provides data necessary to monitor website performance in search and improve search rankings, information that is exclusively available through Search Console.

This makes it indispensable for online businesses and publishers that are keen to maximize success.

Taking control of your search presence is easier to do when using the free tools and reports.

What Is Google Search Console?

Google Search Console is a free web service hosted by Google that provides a way for publishers and search marketing professionals to monitor their overall site health and performance relative to Google search.

It offers an overview of metrics related to search performance and user experience to help publishers improve their sites and generate more traffic.

Search Console also provides a way for Google to communicate when it discovers security issues (like hacking vulnerabilities) and if the search quality team has imposed a manual action penalty.

Important features:

  • Monitor indexing and crawling.
  • Identify and fix errors.
  • Overview of search performance.
  • Request indexing of updated pages.
  • Review internal and external links.

It’s not necessary to use Search Console to rank better, nor is it a ranking factor.

However, the usefulness of the Search Console makes it indispensable for helping improve search performance and bringing more traffic to a website.


How To Get Started

The first step to using Search Console is to verify site ownership.

Google provides several different ways to accomplish site verification, depending on if you’re verifying a website, a domain, a Google site, or a Blogger-hosted site.

Domains registered with Google Domains are automatically verified by adding them to Search Console.

The majority of users will verify their sites using one of four methods:

  1. HTML file upload.
  2. Meta tag.
  3. Google Analytics tracking code.
  4. Google Tag Manager.

Some site hosting platforms limit what can be uploaded and require a specific way to verify site owners.

But, that’s becoming less of an issue as many hosted site services have an easy-to-follow verification process, which will be covered below.

How To Verify Site Ownership

There are two standard ways to verify site ownership with a regular website, like a standard WordPress site.

  1. HTML file upload.
  2. Meta tag.

When verifying a site using either of these two methods, you’ll be choosing the URL-prefix properties process.

Let’s stop here and acknowledge that the phrase “URL-prefix properties” means absolutely nothing to anyone but the Googler who came up with that phrase.

Don’t let that make you feel like you’re about to enter a labyrinth blindfolded. Verifying a site with Google is easy.


HTML File Upload Method

Step 1: Go to the Search Console and open the Property Selector dropdown that’s visible in the top left-hand corner on any Search Console page.

Screenshot by author, May 2022

Step 2: In the pop-up labeled Select Property Type, enter the URL of the site then click the Continue button.

Screenshot by author, May 2022

Step 3: Select the HTML file upload method and download the HTML file.

Step 4: Upload the HTML file to the root of your website.

Root means https://example.com/. So, if the downloaded file is called verification.html, then the uploaded file should be located at https://example.com/verification.html.
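Before clicking Verify, it can help to request the uploaded file yourself and confirm it returns a 200 status; the filename below is a placeholder for the one Google generates for you.

curl -I https://example.com/google1234abcd.html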

Step 5: Finish the verification process by clicking Verify back in the Search Console.

Verification of a standard website with its own domain in website platforms like Wix and Weebly is similar to the above steps, except that you’ll be adding a site verification meta tag to your site instead.
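The verification meta tag Google provides follows a standard pattern and belongs inside the page’s head element; the content value below is a placeholder for the unique token Search Console generates for your property.

<meta name="google-site-verification" content="your-unique-token" />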

Duda has a simple approach that uses a Search Console app to verify the site and get its users started quickly.

Troubleshooting With GSC

Ranking in search results depends on Google’s ability to crawl and index webpages.

The Search Console URL Inspection Tool warns of any issues with crawling and indexing before they become major problems and pages start dropping from the search results.


URL Inspection Tool

The URL inspection tool shows whether a URL is indexed and is eligible to be shown in a search result.

For each submitted URL a user can:

  • Request indexing for a recently updated webpage.
  • View how Google discovered the webpage (sitemaps and referring internal pages).
  • View the last crawl date for a URL.
  • Check if Google is using a declared canonical URL or is using another one.
  • Check mobile usability status.
  • Check enhancements like breadcrumbs.

Coverage

The coverage section shows Discovery (how Google discovered the URL), Crawl (shows whether Google successfully crawled the URL and if not, provides a reason why), and Enhancements (provides the status of structured data).

The coverage section can be reached from the left-hand menu:

Screenshot by author, May 2022

Coverage Error Reports

While these reports are labeled as errors, it doesn’t necessarily mean that something is wrong. Sometimes it just means that indexing can be improved.

For example, in the following screenshot, Google is showing a 403 Forbidden server response to nearly 6,000 URLs.

The 403 error response means that the server is telling Googlebot that it is forbidden from crawling these URLs.

Coverage report showing 403 server error responses. Screenshot by author, May 2022

The above errors are happening because Googlebot is blocked from crawling the member pages of a web forum.

Every member of the forum has a member page that has a list of their latest posts and other statistics.

The report provides a list of URLs that are generating the error.


Clicking on one of the listed URLs reveals a menu on the right that provides the option to inspect the affected URL.

There’s also a contextual menu to the right of the URL itself in the form of a magnifying glass icon that also provides the option to Inspect URL.

Screenshot by author, May 2022

Clicking on the Inspect URL reveals how the page was discovered.

It also shows the following data points:

  • Last crawl.
  • Crawled as.
  • Crawl allowed?
  • Page fetch (if failed, provides the server error code).
  • Indexing allowed?

There is also information about the canonical used by Google:

  • User-declared canonical.
  • Google-selected canonical.

For the forum website in the above example, the important diagnostic information is located in the Discovery section.

This section tells us which pages are showing Googlebot the links to member profiles.

With this information, the publisher can now code a PHP statement that will make the links to the member pages disappear when a search engine bot comes crawling.
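As a rough illustration of that approach, a PHP template could check the user agent and skip rendering the member-profile links for known crawlers. This is only a sketch: the /members/ path, the helper name, and the bot list are assumptions, not details from the site in the example.

<?php
// Hypothetical helper: treat the request as a search engine bot if the
// user agent matches a known crawler.
function is_search_bot(): bool {
    $ua = $_SERVER['HTTP_USER_AGENT'] ?? '';
    return (bool) preg_match('/Googlebot|Bingbot|DuckDuckBot/i', $ua);
}

// Only output the member-profile link for regular visitors.
if (!is_search_bot()) {
    echo '<a href="/members/example-user/">example-user</a>';
}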

Another way to fix the problem is to add a new entry to the robots.txt file to stop Google from attempting to crawl these pages.
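Assuming the member pages all sit under a single directory (the /members/ path below is hypothetical), the entry could be as simple as:

User-agent: Googlebot
Disallow: /members/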

By making this 403 error go away, we free up crawling resources for Googlebot to index the rest of the website.

Google Search Console’s coverage report makes it possible to diagnose Googlebot crawling issues and fix them.


Fixing 404 Errors

The coverage report can also alert a publisher to 404 and 500 series error responses, as well as communicate that everything is just fine.

A 404 server response is called an error only because the browser or crawler’s request was made in error: the page it asked for does not exist.

It doesn’t mean that your site is in error.

If another site (or an internal link) links to a page that doesn’t exist, the coverage report will show a 404 response.

Clicking on one of the affected URLs and selecting the Inspect URL tool will reveal what pages (or sitemaps) are referring to the non-existent page.

From there you can decide if the link is broken and needs to be fixed (in the case of an internal link) or redirected to the correct page (in the case of an external link from another website).
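On an Apache server, for example, an externally linked URL that no longer exists can be pointed at its replacement with a single 301 redirect directive; both paths below are placeholders.

Redirect 301 /old-page/ https://example.com/new-page/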

Or, it could be that the webpage never existed and whoever is linking to that page made a mistake.

If the page doesn’t exist anymore or it never existed at all, then it’s fine to show a 404 response.


Taking Advantage Of GSC Features

The Performance Report

The top part of the Search Console Performance Report provides multiple insights on how a site performs in search, including in search features like featured snippets.

There are four search types that can be explored in the Performance Report:

  1. Web.
  2. Image.
  3. Video.
  4. News.

Search Console shows the web search type by default.

Change which search type is displayed by clicking the Search Type button:

Screenshot by author, May 2022

A menu pop-up will display allowing you to change which kind of search type to view:

Screenshot by author, May 2022

A useful feature is the ability to compare the performance of two search types within the graph.

Four metrics are prominently displayed at the top of the Performance Report:

  1. Total Clicks.
  2. Total Impressions.
  3. Average CTR (click-through rate).
  4. Average position.
Screenshot of the top section of the Performance Report by author, May 2022

By default, the Total Clicks and Total Impressions metrics are selected.


By clicking within the tabs dedicated to each metric, one can choose to see those metrics displayed on the bar chart.

Impressions

Impressions are the number of times a website appeared in the search results. As long as a user doesn’t have to click a link to see the URL, it counts as an impression.

Additionally, if a URL is ranked at the bottom of the page and the user doesn’t scroll to that section of the search results, it still counts as an impression.


High impressions are great because it means that Google is showing the site in the search results.

But the impressions metric only becomes meaningful when considered alongside the Clicks and Average Position metrics.

Clicks

The clicks metric shows how often users clicked from the search results to the website. A high number of clicks in addition to a high number of impressions is good.

A low number of clicks and a high number of impressions is less good but not bad. It means that the site may need improvements to gain more traffic.

The clicks metric is more meaningful when considered with the Average CTR and Average Position metrics.

Average CTR

The average CTR is a percentage representing how often users clicked from the search results to the website.
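For example, 50 clicks from 1,000 impressions works out to an average CTR of 5%.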


A low CTR means that something needs improvement in order to increase visits from the search results.

A higher CTR means the site is performing well.

This metric gains more meaning when considered together with the Average Position metric.

Average Position

Average Position shows the average position in search results the website tends to appear in.

An average in positions one to 10 is great.

An average position in the twenties (20 – 29) means that the site is appearing on page two or three of the search results. This isn’t too bad. It simply means that the site needs additional work to give it that extra boost into the top 10.

Average positions beyond 30 could (in general) mean that the site may benefit from significant improvements.


Or, it could be that the site ranks for a large number of keyword phrases that rank low and a few very good keywords that rank exceptionally high.

In either case, it may mean taking a closer look at the content. It may be an indication of a content gap on the website, where the content that ranks for certain keywords isn’t strong enough and may need a dedicated page devoted to that keyword phrase to rank better.

All four metrics (Impressions, Clicks, Average CTR, and Average Position), when viewed together, present a meaningful overview of how the website is performing.

The big takeaway about the Performance Report is that it is a starting point for quickly understanding website performance in search.

It’s like a mirror that reflects back how well or poorly the site is doing.

Performance Report Dimensions

Scrolling down to the second part of the Performance page reveals what are called Dimensions of a website’s performance data.

There are six dimensions:

1. Queries: Shows the top search queries and the number of clicks and impressions associated with each keyword phrase.


2. Pages: Shows the top-performing web pages (plus clicks and impressions).

3. Countries: Top countries (plus clicks and impressions).

4. Devices: Shows the top devices, segmented into mobile, desktop, and tablet.

5. Search Appearance: This shows the different kinds of rich results that the site was displayed in. It also tells if Google displayed the site using Web Light results and video results, plus the associated clicks and impressions data. Web Light results are results that are optimized for very slow devices.

6. Dates: The dates tab organizes the clicks and impressions by date. The clicks and impressions can be sorted in descending or ascending order.

Keywords

Keywords are displayed in the Queries dimension of the Performance Report (as noted above). The Queries report shows the top 1,000 search queries that resulted in traffic.

Of particular interest are the low-performing queries.


Some of those queries display low quantities of traffic because they are rare, what is known as long-tail traffic.

But others are search queries generated by webpages that could need improvement; perhaps the page needs more internal links, or the keyword phrase deserves its own dedicated webpage.


It’s always a good idea to review the low-performing keywords because some of them may be quick wins that, when the issue is addressed, can result in significantly increased traffic.

Links

Search Console offers a list of all links pointing to the website.

However, it’s important to point out that the links report does not represent links that are helping the site rank.

It simply reports all links pointing to the website.

This means that the list includes links that are not helping the site rank. That explains why the report may show links that have a nofollow link attribute on them.


The Links report is accessible from the bottom of the left-hand menu:

Screenshot by author, May 2022

The Links report has two columns: External Links and Internal Links.

External Links are the links from outside the website that point to the website.

Internal Links are links that originate within the website and link to somewhere else within the website.

The External links column has three reports:

  1. Top linked pages.
  2. Top linking sites.
  3. Top linking text.

The Internal Links report lists the Top Linked Pages.

Each report (top linked pages, top linking sites, etc.) has a link to more results that can be clicked to view and expand the report for each type.

For example, the expanded report for Top Linked Pages shows Top Target pages, which are the pages from the site that are linked to the most.

Clicking a URL will change the report to display all the external domains that link to that one page.

The report shows the domain of the external site but not the exact page that links to the site.


Sitemaps

A sitemap is generally an XML file that is a list of URLs that helps search engines discover the webpages and other forms of content on a website.
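A minimal sitemap following the sitemaps.org protocol looks like this (the URLs and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2022-05-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/about/</loc>
  </url>
</urlset>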

Sitemaps are especially helpful for large sites, sites that are difficult to crawl, or sites that have new content added on a frequent basis.

Crawling and indexing are not guaranteed. Things like page quality, overall site quality, and links can have an impact on whether a site is crawled and pages indexed.

Sitemaps simply make it easy for search engines to discover those pages and that’s all.

Creating a sitemap is easy because most are automatically generated by the CMS, plugins, or the website platform where the site is hosted.

Some hosted website platforms generate a sitemap for every site hosted on its service and automatically update the sitemap when the website changes.

Search Console offers a sitemap report and provides a way for publishers to upload a sitemap.


To access this function, click on the Sitemaps link in the left-side menu.

The sitemap section will report on any errors with the sitemap.

Search Console can be used to remove a sitemap from the reports. However, it’s important to actually remove the sitemap from the website itself; otherwise, Google may remember it and visit it again.

Once submitted and processed, the Coverage report will populate a sitemap section that will help troubleshoot any problems associated with URLs submitted through the sitemaps.

Search Console Page Experience Report

The page experience report offers data related to the user experience on the website relative to site speed.

Search Console displays information on Core Web Vitals and Mobile Usability.

This is a good starting place for getting an overall summary of site speed performance.

Rich Result Status Reports

Search Console offers feedback on rich results through the Performance Report. It’s one of the six dimensions listed below the graph that’s displayed at the top of the page, listed as Search Appearance.


Selecting the Search Appearance tabs reveals clicks and impressions data for the different kinds of rich results shown in the search results.

This report communicates how important rich results traffic is to the website and can help pinpoint the reason for specific website traffic trends.

The Search Appearance report can help diagnose issues related to structured data.

For example, a downturn in rich results traffic could be a signal that Google changed structured data requirements and that the structured data needs to be updated.

It’s a starting point for diagnosing a change in rich results traffic patterns.

Search Console Is Good For SEO

In addition to the above benefits of Search Console, publishers and SEOs can also upload link disavow reports, resolve penalties (manual actions), and address security events like site hackings, all of which contribute to a better search presence.

It is a valuable service that every web publisher concerned about search visibility should take advantage of.


Featured Image: bunny pixar/Shutterstock


