W3C Validation: What It Is & Why It Matters For SEO

You may have run across the W3C in your web development and SEO travels.

The W3C is the World Wide Web Consortium, and it was founded by the creator of the World Wide Web, Tim Berners-Lee.

This web standards body creates coding specifications for web standards worldwide.

It also offers a validator service to ensure that your HTML (among other code) is valid and error-free.

Making sure that your page validates is one of the most important things you can do to achieve cross-browser and cross-platform compatibility and to provide an accessible online experience for all users.

Invalid code can result in glitches, rendering errors, and long processing or loading times.

Simply put, if your code doesn’t do what it was intended to do across all major web browsers, this can negatively impact user experience and SEO.


W3C Validation: How It Works & Supports SEO

Web standards are important because they give web developers a standard set of rules for writing code.

If all code used by your company is created using the same protocols, it will be much easier for you to maintain and update this code in the future.

This is especially important when working with other people’s code.

If your pages adhere to web standards, they will validate correctly against W3C validation tools.

When you use web standards as the basis for your code creation, you ensure that your code is user-friendly with built-in accessibility.

When it comes to SEO, validated code is always better than poorly written code.

According to John Mueller, Google doesn’t care how your code is written. That means a W3C validation error won’t cause your rankings to drop.

You won’t rank better with validated code, either.


But there are indirect SEO benefits to well-formatted markup:

  • Eliminates Code Bloat: Validating your code helps you avoid code bloat. Validated markup is generally leaner and more compact than invalid, bloated markup.
  • Faster Rendering Times: Leaner code can translate to faster render times because the browser has less processing to do, and we know that page speed is a ranking factor.
  • Indirect Contributions to Core Web Vitals Scores: When you pay attention to coding standards, such as adding the width and height attributes to your images (see the markup sketch below), you eliminate steps the browser must take to render the page, because it can reserve space for the image before it loads. Faster, more stable rendering can contribute to your Core Web Vitals scores, improving these important metrics overall.
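
To make the width and height point concrete, here is a minimal markup sketch; the file name, alt text, and pixel dimensions are hypothetical:

    <!-- Declaring width and height lets the browser reserve space for
         the image before it downloads, which reduces layout shift. -->
    <img src="hero-banner.jpg"
         alt="Team reviewing a site audit on a whiteboard"
         width="1200"
         height="630">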

Roger Montti compiled these six reasons Google still recommends code validation, because it:

  1. Could affect crawl rate.
  2. Affects browser compatibility.
  3. Encourages a good user experience.
  4. Ensures that pages function everywhere.
  5. Is useful for Google Shopping Ads.
  6. Prevents invalid HTML in the head section from breaking hreflang (see the sketch after this list).
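
To illustrate that last point, here is a sketch of hreflang annotations sitting inside a clean head section; the URLs and language codes are hypothetical:

    <head>
      <meta charset="utf-8">
      <title>Example page</title>
      <!-- hreflang annotations only work if the head parses cleanly;
           an unclosed or invalid tag above them can cause the parser to
           end the head early, so these links are never picked up. -->
      <link rel="alternate" hreflang="en" href="https://example.com/en/">
      <link rel="alternate" hreflang="de" href="https://example.com/de/">
      <link rel="alternate" hreflang="x-default" href="https://example.com/">
    </head>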

Multiple Device Accessibility

Valid code also translates into better cross-browser and cross-platform compatibility because it conforms to the latest W3C standards, so browsers know exactly how to process it.

This leads to an improved user experience for people who access your sites from different devices.

If you have a site that’s been validated, it will render correctly regardless of the device or platform being used to view it.

That is not to say that unvalidated code can never render consistently across multiple browsers and platforms, but without validation there can be deviations in rendering across various applications.

Common Reasons Code Doesn’t Validate

Of course, validating your web pages won’t solve all problems with rendering your site as desired across all platforms and all browsing options. But it does go a long way toward solving those problems.

In the event that something does go wrong with validation on your part, you now have a baseline from which to begin troubleshooting.

You can go into your code and see what is making it fail.

It will be easier to find these problems and troubleshoot them with a validated site because you know where to start looking.


Having said that, there are several reasons pages may not validate.


Browser Specific Issues

It may be that something in your code will only work on one browser or platform, but not another.

This problem would then need to be addressed by the developer of the offending script.

This would mean having to actually edit the code itself in order for it to validate on all platforms/browsers instead of just some of them.

You Are Using Outdated Code

The W3C's validation tests have only come into use over the past couple of decades.

If your page was created to render in a browser that predates this period (IE 6 or earlier, for example), it will not pass these newer standards because it was written with older technologies and formats in mind.

While this is a relatively rare issue, it still happens.

This problem can be fixed by reworking the code to make it W3C compliant, but if you want to maintain compatibility with older browsers, you may need to keep using the code that works and forgo 100% complete validation.


Both problems could potentially be solved with a little trial and error.

With some work and effort, both types of sites can validate across multiple devices and platforms without issue – hopefully!

Polyglot Documents

In this context, polyglot documents are documents that carry over code from an older version of a site and were never reworked to be compatible with the newer version.

In other words, it's a document that mixes markup written for one document type with markup written for another (say, an HTML 4.01 Transitional document type compared to an XHTML document type).

Make no mistake: Even though both may be “HTML” per se, they are very different languages and need to be treated as such.

You can’t copy and paste one over and expect things to be all fine and dandy.

What does this mean?

For example, you may have seen situations where you validate a page and nearly every single line of the document has something wrong with it in the W3C validator.


This could be due to somebody transferring over code from another version of the site, and not updating it to reflect new coding standards.

Either way, the only way to repair this is to rework the code line by line (an extraordinarily tedious process).

How W3C Validation Works

The W3C validator is this author’s validator of choice for making sure that your code validates across a wide variety of platforms and systems.

The W3C validator is free to use, and you can access it at validator.w3.org.

With the W3C validator, it's possible to validate your pages by URL, by file upload, or by direct input.

  • Validate Your Pages by URL: This is relatively simple. Just copy and paste the URL into the address field and click the Check button to validate your code.
  • Validate Your Pages by File Upload: When you validate by file upload, you upload the HTML files of your choice one file at a time. Caution: If you're using Internet Explorer or certain versions of Windows XP, this option may not work for you.
  • Validate Your Pages by Direct Input: With this option, all you have to do is copy and paste the code you want to validate into the editor, and the W3C validator will do the rest.

While some professionals claim that some W3C errors have no rhyme or reason, in 99.9% of cases there is one.

If errors seem to have no rhyme or reason throughout the entire document, you may want to refer back to the section on polyglot documents above as a potential cause.

HTML Syntax

Let’s start at the top with HTML syntax. Because it’s the backbone of the World Wide Web, this is the most common coding that you will run into as an SEO professional.


The W3C has created a specification for HTML 5 called “the HTML5 Standard”.


This document explains how HTML should ideally be written so that popular browsers can process it.

If you go to their site, you can utilize their validator to make sure that your code is valid according to this spec.

They even give examples of some of the rules that they look for when it comes to standards compliance.

This makes it easier than ever to check your work before you publish it!
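
For reference, here is a sketch of a minimal HTML5 document that passes the validator without errors; the title and content are placeholders:

    <!DOCTYPE html>
    <html lang="en">
      <head>
        <meta charset="utf-8">
        <title>Minimal valid HTML5 page</title>
      </head>
      <body>
        <h1>Hello, standards</h1>
        <!-- The doctype, lang attribute, character encoding, and title
             are among the first things the validator checks for. -->
        <p>This skeleton validates as written.</p>
      </body>
    </html>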

Validators For Other Languages

Now let’s move on to some of the other languages that you may be using online.

For example, you may have heard of CSS3.

The W3C has standards documentation for CSS3 as well, called "the CSS3 Standard."

This means that there is even more opportunity for validation!


You can validate your HTML against the HTML standard and then validate your CSS against the CSS standard to ensure conformity across platforms.

While it may seem like overkill to validate your code against so many different standards at once, remember that each standard you validate against is another chance to ensure conformity across platforms.

And for those of you who only work in one language, you now have the opportunity to expand your horizons!

It can be incredibly difficult if not impossible to align everything perfectly, so you will need to pick your battles.

You may also just need something checked quickly online without having the time or resources available locally.

Common Validation Errors

You will need to be aware of the most common validation errors as you go through the validation process, and it’s also a good idea to know what those errors mean.

This way, if your page does not validate, you will know exactly where to start looking for possible problems.

Some of the most common validation errors (and their meanings) include:

  • Type Mismatch: When your code tries to make one kind of data object appear as another (e.g., submitting a number as text), you run the risk of getting this message. This error usually signals that a coding mistake has been made. The solution is to figure out exactly where that mistake was made and fix it so that the code validates successfully.
  • Parse Error: This error tells you that there was a mistake in the code somewhere, but it does not tell you where that mistake is. If this happens, you will have to do some serious sleuthing to find where your code went wrong.
  • Syntax Errors: These errors involve (mostly) careless mistakes in coding syntax. Either the syntax is typed incorrectly, or it is used in the wrong context. Either way, these errors will show up in the W3C validator (see the example after this list).
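
As a hypothetical example of the kind of syntax error the validator flags, compare a mismatched closing tag with its corrected form:

    <!-- Invalid: the <p> element is closed with </div>, so the validator
         reports a stray end tag and an unclosed element. -->
    <p>Read our validation checklist.</div>

    <!-- Valid: every opening tag is closed by its matching end tag. -->
    <p>Read our validation checklist.</p>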

The above are just some examples of errors that you may see when you’re validating your page.

Unfortunately, the list goes on and on – as does the time spent trying to fix these problems!

More Specific Errors (And Their Solutions)

You may find more specific errors that apply to your site. They may include errors that reference “type attribute used in tag.”

This refers to tags such as the JavaScript declaration tag, for example: <script type="text/javascript">.

The type attribute of this tag is not needed anymore and is now considered legacy coding.

If you use that kind of coding now, you may end up unintentionally throwing validation errors all over the place in certain validators.
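
Here is a sketch of the legacy pattern next to its modern equivalent (the file name is hypothetical):

    <!-- Legacy: the type attribute is unnecessary for JavaScript and is
         flagged by some validators. -->
    <script type="text/javascript" src="main.js"></script>

    <!-- Modern HTML5: JavaScript is the default scripting language, so
         the attribute can simply be omitted. -->
    <script src="main.js"></script>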

Did you know that not using alternative text (alt text) – also called alt tags by some – is a W3C issue? It does not conform to the W3C rules for accessibility.

Alternative text is descriptive text that is coded into an image's markup.


It is primarily used by screen readers for the blind.


If a blind person visits your site, and you do not have alternative text (or meaningful alternative text) in your images, then they will be unable to use your site effectively.

The way these screen readers work is that they read aloud the alternative text coded into images, so blind users can use their sense of hearing to understand what's on your web page.
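
A brief sketch of an image without and with meaningful alternative text (the file name and description are hypothetical):

    <!-- Poor: no alt attribute, so a screen reader has nothing useful to
         announce, and the validator reports an error. -->
    <img src="store-front.jpg">

    <!-- Better: the alt text describes what the image shows. -->
    <img src="store-front.jpg"
         alt="Customers entering the downtown storefront at opening time">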

If your page is not very accessible in this regard, this could potentially lead to another sticky issue: that of accessibility lawsuits.

This is why it pays to pay attention to your accessibility standards and validate your code against these standards.

Other types of common errors include using tags out of context.

For code context errors, you will need to make sure they are repaired according to the W3C documentation so these errors are no longer thrown by the validator.

Preventing Errors From Impacting Your Site Experience

The best way to prevent validation errors from happening is by making sure your site validates before launch.

It’s also useful to validate your pages regularly after they’re launched so that new errors do not crop up unexpectedly over time.


If you think about it, validation errors are the equivalent of spelling mistakes in an essay – once they're there, they don't fix themselves, and they need to be corrected as soon as humanly possible.

If you adopt the habit of always using the W3C validator in order to validate your code, then you can, in essence, stop these coding mistakes from ever happening in the first place.

Heads Up: There Is More Than One Way To Do It

Sometimes validation won't go as planned when measured against every standard.

And there is more than one way to accomplish the same goal.

For example, if you create a button with the <button> element and then nest a link inside it using an <a> element with an href attribute, that markup is not allowed according to W3C standards.

But the same goal is perfectly achievable in JavaScript, because there are ways to do this within the language itself.

This is an example of how we create this particular code and insert it into the direct input of the W3C validator:

Screenshot from W3C validator, February 2022

In the next step, during validation, we find that there are at least four errors within this snippet alone, indicating that it is not a particularly well-coded line:

Screenshot showing errors in the W3C validator tool. Screenshot from W3C validator, February 2022
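
One valid alternative, sketched here with a hypothetical class name and URL, is to drop the nesting and use a styled anchor on its own:

    <!-- Invalid: interactive content such as <a> is not allowed inside
         a <button>, which is what the validator flags. -->
    <button><a href="https://example.com/signup">Sign up</a></button>

    <!-- Valid: an anchor styled to look like a button achieves the same
         effect without nesting interactive elements. -->
    <a class="button" href="https://example.com/signup">Sign up</a>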

While validation, on the whole, can help you immensely, it is not always going to be 100% complete.

This is why it's important to familiarize yourself with the validator by running your code through it as often as you can.


Some adaptation will be needed. But it takes experience to achieve the best possible cross-platform compatibility while also remaining compliant with today’s browsers.

The ultimate goal here is improving accessibility and achieving compatibility with all browsers, operating systems, and devices.

Not all browsers and devices are created equal, and validation gives you a cohesive set of instructions and standards that can make your page render consistently enough for all browsers and devices.

When in doubt, always err on the side of proper code validation.

By making sure that you work to include the absolute best practices in your coding, you can ensure that your code is as accessible as it possibly can be for all types of users.

On top of that, validating your HTML against W3C standards helps you achieve cross-platform compatibility between different browsers and devices.

By working to always ensure that your code validates, you are on your way to making sure that your site is as safe, accessible, and efficient as possible.


Featured Image: graphicwithart/Shutterstock





A Complete Google Search Console Guide For SEO Pros


Google Search Console provides data necessary to monitor website performance in search and improve search rankings, information that is exclusively available through Search Console.

This makes it indispensable for online businesses and publishers that are keen to maximize success.

Taking control of your search presence is easier to do when using the free tools and reports.

What Is Google Search Console?

Google Search Console is a free web service hosted by Google that provides a way for publishers and search marketing professionals to monitor their overall site health and performance relative to Google search.

It offers an overview of metrics related to search performance and user experience to help publishers improve their sites and generate more traffic.

Search Console also provides a way for Google to communicate when it discovers security issues (like hacking vulnerabilities) and if the search quality team has imposed a manual action penalty.

Important features:

  • Monitor indexing and crawling.
  • Identify and fix errors.
  • Overview of search performance.
  • Request indexing of updated pages.
  • Review internal and external links.

It’s not necessary to use Search Console to rank better nor is it a ranking factor.

However, the usefulness of the Search Console makes it indispensable for helping improve search performance and bringing more traffic to a website.


How To Get Started

The first step to using Search Console is to verify site ownership.

Google provides several different ways to accomplish site verification, depending on if you’re verifying a website, a domain, a Google site, or a Blogger-hosted site.

Domains registered with Google Domains are automatically verified once they are added to Search Console.

The majority of users will verify their sites using one of four methods:

  1. HTML file upload.
  2. Meta tag.
  3. Google Analytics tracking code.
  4. Google Tag Manager.

Some site hosting platforms limit what can be uploaded and require a specific way to verify site owners.

But, that’s becoming less of an issue as many hosted site services have an easy-to-follow verification process, which will be covered below.

How To Verify Site Ownership

There are two standard ways to verify site ownership with a regular website, like a standard WordPress site.

  1. HTML file upload.
  2. Meta tag.

When verifying a site using either of these two methods, you’ll be choosing the URL-prefix properties process.

Let’s stop here and acknowledge that the phrase “URL-prefix properties” means absolutely nothing to anyone but the Googler who came up with that phrase.

Don’t let that make you feel like you’re about to enter a labyrinth blindfolded. Verifying a site with Google is easy.


HTML File Upload Method

Step 1: Go to the Search Console and open the Property Selector dropdown that’s visible in the top left-hand corner on any Search Console page.

Screenshot by author, May 2022

Step 2: In the pop-up labeled Select Property Type, enter the URL of the site then click the Continue button.

Step 2. Screenshot by author, May 2022

Step 3: Select the HTML file upload method and download the HTML file.

Step 4: Upload the HTML file to the root of your website.

Root means https://example.com/. So, if the downloaded file is called verification.html, then the uploaded file should be located at https://example.com/verification.html.

Step 5: Finish the verification process by clicking Verify back in the Search Console.

Verification of a standard website with its own domain on website platforms like Wix and Weebly is similar to the above steps, except that you'll be adding a verification meta tag to your Wix site (see the sketch below).
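
For reference, the meta tag method places a tag like the following in the head section of the home page; the content token shown here is a made-up placeholder for the unique value Search Console generates for your property:

    <head>
      <!-- Google Search Console verification tag -->
      <meta name="google-site-verification" content="abc123ExampleTokenXYZ">
      <title>Example home page</title>
    </head>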

Duda has a simple approach that uses a Search Console App that easily verifies the site and gets its users started.

Troubleshooting With GSC

Ranking in search results depends on Google’s ability to crawl and index webpages.

The Search Console URL Inspection Tool warns of any issues with crawling and indexing before they become a major problem and pages start dropping from the search results.


URL Inspection Tool

The URL inspection tool shows whether a URL is indexed and is eligible to be shown in a search result.

For each submitted URL a user can:

  • Request indexing for a recently updated webpage.
  • View how Google discovered the webpage (sitemaps and referring internal pages).
  • View the last crawl date for a URL.
  • Check if Google is using a declared canonical URL or is using another one.
  • Check mobile usability status.
  • Check enhancements like breadcrumbs.

Coverage

The coverage section shows Discovery (how Google discovered the URL), Crawl (shows whether Google successfully crawled the URL and if not, provides a reason why), and Enhancements (provides the status of structured data).

The coverage section can be reached from the left-hand menu:

Coverage report. Screenshot by author, May 2022

Coverage Error Reports

While these reports are labeled as errors, it doesn’t necessarily mean that something is wrong. Sometimes it just means that indexing can be improved.

For example, in the following screenshot, Google is reporting a 403 Forbidden server response for nearly 6,000 URLs.

The 403 error response means that the server is telling Googlebot that it is forbidden from crawling these URLs.

Coverage report showing 403 server error responses. Screenshot by author, May 2022

The above errors are happening because Googlebot is blocked from crawling the member pages of a web forum.

Every member of the forum has a member page that has a list of their latest posts and other statistics.

The report provides a list of URLs that are generating the error.


Clicking on one of the listed URLs reveals a menu on the right that provides the option to inspect the affected URL.

There's also a contextual menu to the right of the URL itself, in the form of a magnifying glass icon, which provides the option to Inspect URL.

Inspect URL. Screenshot by author, May 2022

Clicking on Inspect URL reveals how the page was discovered.

It also shows the following data points:

  • Last crawl.
  • Crawled as.
  • Crawl allowed?
  • Page fetch (if failed, provides the server error code).
  • Indexing allowed?

There is also information about the canonical used by Google:

  • User-declared canonical.
  • Google-selected canonical.

For the forum website in the above example, the important diagnostic information is located in the Discovery section.

This section tells us which pages are showing the links to member profiles to Googlebot.

With this information, the publisher can now code a PHP statement that will make the links to the member pages disappear when a search engine bot comes crawling.

Another way to fix the problem is to write a new entry to the robots.txt to stop Google from attempting to crawl these pages.
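
As a sketch, assuming the member pages all live under a path such as /members/ (the path is hypothetical), that robots.txt entry might look like this:

    User-agent: *
    Disallow: /members/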

By making this 403 error go away, we free up crawling resources for Googlebot to index the rest of the website.

Google Search Console’s coverage report makes it possible to diagnose Googlebot crawling issues and fix them.


Fixing 404 Errors

The coverage report can also alert a publisher to 404 and 500 series error responses, as well as communicate that everything is just fine.

A 404 server response is called an error only because the browser or crawler's request for a webpage was made in error: the page being requested does not exist.

It doesn’t mean that your site is in error.

If another site (or an internal link) links to a page that doesn’t exist, the coverage report will show a 404 response.

Clicking on one of the affected URLs and selecting the Inspect URL tool will reveal what pages (or sitemaps) are referring to the non-existent page.

From there you can decide if the link is broken and needs to be fixed (in the case of an internal link) or redirected to the correct page (in the case of an external link from another website).

Or, it could be that the webpage never existed and whoever is linking to that page made a mistake.

If the page doesn’t exist anymore or it never existed at all, then it’s fine to show a 404 response.


Taking Advantage Of GSC Features

The Performance Report

The top part of the Search Console Performance Report provides multiple insights on how a site performs in search, including in search features like featured snippets.

There are four search types that can be explored in the Performance Report:

  1. Web.
  2. Image.
  3. Video.
  4. News.

Search Console shows the web search type by default.

Change which search type is displayed by clicking the Search Type button:

Default search type. Screenshot by author, May 2022

A menu pop-up will display allowing you to change which kind of search type to view:

Search Types menu. Screenshot by author, May 2022

A useful feature is the ability to compare the performance of two search types within the graph.

Four metrics are prominently displayed at the top of the Performance Report:

  1. Total Clicks.
  2. Total Impressions.
  3. Average CTR (click-through rate).
  4. Average position.
Top section of the Performance page. Screenshot by author, May 2022

By default, the Total Clicks and Total Impressions metrics are selected.


By clicking within the tabs dedicated to each metric, one can choose to see those metrics displayed on the bar chart.

Impressions

Impressions are the number of times a website appeared in the search results. As long as a user doesn’t have to click a link to see the URL, it counts as an impression.

Additionally, if a URL is ranked at the bottom of the page and the user doesn’t scroll to that section of the search results, it still counts as an impression.


High impressions are great because it means that Google is showing the site in the search results.

But the impressions metric only becomes meaningful when considered alongside the Clicks and Average Position metrics.

Clicks

The clicks metric shows how often users clicked from the search results to the website. A high number of clicks in addition to a high number of impressions is good.

A low number of clicks and a high number of impressions is less good but not bad. It means that the site may need improvements to gain more traffic.

The clicks metric is more meaningful when considered with the Average CTR and Average Position metrics.

Average CTR

The average CTR is a percentage representing how often users clicked from the search results to the website.
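
For example (using hypothetical numbers), a page that earned 50 clicks from 1,000 impressions has a CTR of 50 ÷ 1,000 = 5%.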


A low CTR means that something needs improvement in order to increase visits from the search results.

A higher CTR means the site is performing well.

This metric gains more meaning when considered together with the Average Position metric.

Average Position

Average Position shows the average position in search results the website tends to appear in.

An average in positions one to 10 is great.

An average position in the twenties (20 – 29) means that the site is appearing on page two or three of the search results. This isn’t too bad. It simply means that the site needs additional work to give it that extra boost into the top 10.

Average positions of 30 or beyond could (in general) mean that the site may benefit from significant improvements.


Or, it could be that the site ranks for a large number of keyword phrases that rank low and a few very good keywords that rank exceptionally high.

In either case, it may mean taking a closer look at the content. It may be an indication of a content gap on the website, where the content that ranks for certain keywords isn’t strong enough and may need a dedicated page devoted to that keyword phrase to rank better.

All four metrics (Impressions, Clicks, Average CTR, and Average Position), when viewed together, present a meaningful overview of how the website is performing.

The big takeaway about the Performance Report is that it is a starting point for quickly understanding website performance in search.

It’s like a mirror that reflects back how well or poorly the site is doing.

Performance Report Dimensions

Scrolling down to the second part of the Performance page reveals several of what are called Dimensions of a website's performance data.

There are six dimensions:

1. Queries: Shows the top search queries and the number of clicks and impressions associated with each keyword phrase.


2. Pages: Shows the top-performing web pages (plus clicks and impressions).

3. Countries: Top countries (plus clicks and impressions).

4. Devices: Shows the top devices, segmented into mobile, desktop, and tablet.

5. Search Appearance: This shows the different kinds of rich results that the site was displayed in. It also tells if Google displayed the site using Web Light results and video results, plus the associated clicks and impressions data. Web Light results are results that are optimized for very slow devices.

6. Dates: The dates tab organizes the clicks and impressions by date. The clicks and impressions can be sorted in descending or ascending order.

Keywords

The keywords are displayed in the Queries report as one of the dimensions of the Performance Report (as noted above). The Queries report shows the top 1,000 search queries that resulted in traffic.

Of particular interest are the low-performing queries.


Some of those queries display low quantities of traffic because they are rare, what is known as long-tail traffic.

But others are search queries that result from webpages that could need improvement; perhaps a page needs more internal links, or the keyword phrase deserves its own dedicated webpage.


It’s always a good idea to review the low-performing keywords because some of them may be quick wins that, when the issue is addressed, can result in significantly increased traffic.

Links

Search Console offers a list of all links pointing to the website.

However, it’s important to point out that the links report does not represent links that are helping the site rank.

It simply reports all links pointing to the website.

This means that the list includes links that are not helping the site rank. That explains why the report may show links that have a nofollow link attribute on them.


The Links report is accessible from the bottom of the left-hand menu:

Links report. Screenshot by author, May 2022

The Links report has two columns: External Links and Internal Links.

External Links are the links from outside the website that point to the website.

Internal Links are links that originate within the website and link to somewhere else within the website.

The External links column has three reports:

  1. Top linked pages.
  2. Top linking sites.
  3. Top linking text.

The Internal Links report lists the Top Linked Pages.

Each report (top linked pages, top linking sites, etc.) has a link to more results that can be clicked to view and expand the report for each type.

For example, the expanded report for Top Linked Pages shows Top Target pages, which are the pages from the site that are linked to the most.

Clicking a URL will change the report to display all the external domains that link to that one page.

The report shows the domain of the external site but not the exact page that links to the site.


Sitemaps

A sitemap is generally an XML file that is a list of URLs that helps search engines discover the webpages and other forms of content on a website.

Sitemaps are especially helpful for large sites, sites that are difficult to crawl, and sites that add new content on a frequent basis.

Crawling and indexing are not guaranteed. Things like page quality, overall site quality, and links can have an impact on whether a site is crawled and pages indexed.

Sitemaps simply make it easy for search engines to discover those pages and that’s all.
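
For reference, here is a sketch of a minimal XML sitemap with a single hypothetical URL and date:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want search engines to discover. -->
      <url>
        <loc>https://example.com/</loc>
        <lastmod>2022-05-01</lastmod>
      </url>
    </urlset>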

Creating a sitemap is easy because most are automatically generated by the CMS, plugins, or the website platform where the site is hosted.

Some hosted website platforms generate a sitemap for every site hosted on its service and automatically update the sitemap when the website changes.

Search Console offers a sitemap report and provides a way for publishers to upload a sitemap.


To access this function click on the link located on the left-side menu.


The sitemap section will report on any errors with the sitemap.

Search Console can be used to remove a sitemap from the reports. It's important, however, to actually remove the sitemap from the website itself; otherwise, Google may remember it and visit it again.

Once submitted and processed, the Coverage report will populate a sitemap section that will help troubleshoot any problems associated with URLs submitted through the sitemaps.

Search Console Page Experience Report

The page experience report offers data related to the user experience on the website relative to site speed.

Search Console displays information on Core Web Vitals and Mobile Usability.

This is a good starting place for getting an overall summary of site speed performance.

Rich Result Status Reports

Search Console offers feedback on rich results through the Performance Report. It’s one of the six dimensions listed below the graph that’s displayed at the top of the page, listed as Search Appearance.


Selecting the Search Appearance tab reveals clicks and impressions data for the different kinds of rich results shown in the search results.

This report communicates how important rich results traffic is to the website and can help pinpoint the reason for specific website traffic trends.

The Search Appearance report can help diagnose issues related to structured data.

For example, a downturn in rich results traffic could be a signal that Google changed structured data requirements and that the structured data needs to be updated.

It’s a starting point for diagnosing a change in rich results traffic patterns.

Search Console Is Good For SEO

In addition to the above benefits of Search Console, publishers and SEOs can also upload link disavow reports, resolve penalties (manual actions), and security events like site hackings, all of which contribute to a better search presence.

It is a valuable service that every web publisher concerned about search visibility should take advantage of.


Featured Image: bunny pixar/Shutterstock


