
How to Improve SEO With User Experience Factors


Google’s algorithm has consistently taken the user experience into account. For example, Google doesn’t rank directories because sending users from a page of 10 links to a page of 20 links is a poor user experience.

Thinking in terms of user experience can help with SEO because resultant strategies tend to align with how Google ranks websites.

Here are a few specific ways you can improve your SEO performance with user experience factors, including Natural Language Processing, content creation, and web design.

Natural Language Processing

Google’s recent technological breakthroughs like RankBrain and BERT are designed to help Google better understand what people expect to see when they type a search query. They also help Google understand what web pages mean.

One example is a shortcoming of the algorithm that was recently addressed: Google introduced its Passages ranking update, which allows it to send searchers straight to the relevant section of a long web page that contains the answer.

Before this update, Google was unable to adequately rank long web pages.

This is an example of Google using machine learning to provide better answers based on what a web page is about. This is a huge step away from sending users to web pages that contain the keywords in a search query.

Google is working to understand web pages so that it can match their content, as an answer, to the question posed by a search query.

It’s not matching questions to keywords. Google is matching questions to answers.

Content Creation for User Experience

This has a profound impact on how web content is planned, with the focus shifting from creating content around keywords to creating content for users.

This is an example of imposing a user experience point of view on the content creation process.

One has to ask, “What does a site visitor want from this page? What are they trying to accomplish? What is it that they aspire to do?”

Literally, ask those questions and the answers become your content. This will then line up with how Google understands web pages and ranks those pages.


Of course, it’s important to first look at the top one to three positions in the search results and read the content to tease out what question those pages are answering.

Once you find a pattern, you can begin to understand what users mean when they type a particular search query. Once you know that, you can begin the process of writing content.

Content writing that tries to extract meaning from the top ten to the top thirty search results will produce an unreliable analysis because too many mixed search intents are represented.

Analysis of the top ten with a subsequent segmentation of the positions by search intent is a better way to understand what users mean when they type a search query.

Don’t try to mimic the words on search results. Remember, Google is only ranking the best of what it feels satisfies a query.

By copying the keywords used in a top-ranked webpage, you’re missing out on the opportunity to find a better way to satisfy a search query.

Old Way:
Research top-ranked sites to extract keywords and write content with those keywords.

New Way:
Research top-ranked sites to understand the latent question being asked and then provide a better answer.

What’s a Better Answer?

The better answer is the one that tells and shows the user the how, why, what or when that they are looking for.

Sometimes that means creating custom images to illustrate your message. Sometimes that means communicating the message with a graph that presents the data visually.

Use your imagination and ask yourself: How can I make this message any clearer to people visiting my site?

That’s the process of creating content with the user experience in mind.

Web Page Experience

Google is introducing a small ranking boost for pages that pass its Core Web Vitals assessment. Core Web Vitals (CWV) are metrics that measure a site visitor’s experience of a page.


In an ideal world, most publishers would already be optimizing web pages for a fast user experience.

But in the real world, publishers are limited by the bloated content management systems available to them.

Providing a fast user experience takes more than a fast server, too. The page speed bottleneck happens on the site visitor’s end where they’re downloading your page on a mobile phone through a 4G wireless network with limited bandwidth.

Creating a site with a fast download is good for users and better for publishers. More conversions, more page views, and higher earnings happen when a website optimizes its web pages for speed.

What can you do to create a better user experience on the page?

The first thing to do is visit your own site and read your articles in one sitting all the way through.

Then ask yourself if you feel like clicking through to read some more. If there’s a feeling of fatigue, there are reasons for that and they all relate to user experience.

How to Create a Better Web Page Experience

  • Break up your content into smaller paragraphs.
  • Use meaningful heading tags that accurately describe the content that follows.
  • Use bullet points and ordered lists.
  • Use more images that illustrate what you’re trying to say.
  • Choose images that are inherently lightweight (light shades, fewer colors, fewer micro details like gravel or leaves).
  • Optimize your images.
  • Replace images that cannot be compressed to less than 50 KB (or at least no more than 100 KB); see the sketch after this list.
  • Do not require a minimum word count from your writers.
  • Write content that provides useful answers.
  • Use graphs.
  • Test your pages on different mobile devices.
  • Minimize CSS and JavaScript, especially third-party scripts.
  • Remove CSS and JavaScript that provide functionality for things like sliders and contact forms when those features are not on the page.
  • If possible, reconsider the use of sliders.
  • Consider using fonts that are already on visitors’ devices, or simply switch to a system sans-serif font.
  • Run your URLs through the PageSpeed Insights tool and follow its recommendations for improvements.
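If you want to automate the image budget mentioned in the list above, the following is a minimal sketch, assuming Python with the Pillow library installed; the file names, the 1,200px width cap, and the 100 KB ceiling are illustrative assumptions, not figures from this article.

```python
# Hypothetical helper: re-encode an image at decreasing JPEG quality until it
# fits under a byte budget. The budget and width cap are illustrative only.
from io import BytesIO

from PIL import Image  # pip install Pillow

TARGET_BYTES = 100 * 1024  # aim for under 100 KB, ideally closer to 50 KB


def compress_image(src_path: str, dst_path: str, max_width: int = 1200) -> int:
    """Save a compressed copy of src_path to dst_path and return its size in bytes."""
    img = Image.open(src_path).convert("RGB")

    # Downscale very wide images first; the width cap is an assumption.
    if img.width > max_width:
        ratio = max_width / img.width
        img = img.resize((max_width, int(img.height * ratio)))

    # Step the JPEG quality down until the encoded size fits the budget
    # (falls back to the lowest quality tried if nothing fits).
    for quality in range(85, 30, -5):
        buffer = BytesIO()
        img.save(buffer, format="JPEG", quality=quality, optimize=True)
        if buffer.tell() <= TARGET_BYTES:
            break

    with open(dst_path, "wb") as f:
        f.write(buffer.getvalue())
    return buffer.tell()


if __name__ == "__main__":
    size = compress_image("hero-original.png", "hero-optimized.jpg")
    print(f"Optimized image size: {size / 1024:.1f} KB")
```

Images that still cannot fit the budget after a pass like this are good candidates for replacement, as the list suggests.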

Acknowledge & Mirror Your Site Visitors

Always seek out the opportunity to mirror your customer and site visitor in the images that you use.

Be diverse in your image choice. If your visitors tend to skew older and middle-aged, use images that reflect those users.

Do not make the mistake of mirroring yourself or those within your cultural bubble. Make your web pages welcoming for every segment of society that needs your information.

People tend to see themselves in the images that you use, and it makes them comfortable to see people like themselves reflected in your web pages (when it’s appropriate to use images of people at all).

How Does User Experience Impact SEO?

Google tends to rank sites that are relevant to user queries.

Google also tends to rank popular webpages that users expect to see because the goal is to satisfy users.

Creating a site that is frictionless and that people enjoy is one of the fundamental ways of building popularity with users. When people share a site, what they’re really sharing is the experience they had with it.

And those are the kinds of pages that people feel enthusiastic enough about to tell their friends about, link to, and recommend. Sites that rank well naturally tend to be exactly those kinds of sites.

Creating a positive user experience is one of the building blocks of creating good search performance.

From attracting links and increasing page views to improving conversion rates and earnings, a site can’t lose by focusing on the user experience.


Image Credits

Featured Image: Paulo Bobita / Searchenginejournal.com


Are Local Citations (NAP) A Google Ranking Factor?



In local SEO, a citation is a mention of key business information – your name, address, and phone number (NAP) – anywhere else on the web.

Local citations might appear in directories, on social networking or review sites, in apps, and on all kinds of other websites.

Clearly, these are an important part of a searcher’s experience; NAP info is how a local consumer will find their way to your store or give you a call.

But do citations help you rank higher in Google Search results?

The Claim: Local Citations As A Ranking Factor

Some citations allow only for the location’s name, address, and phone number.

However, you may be able to add a website link, business description, photos, and more, depending on the directory or platform.

The idea here is that each of these optimizations will help you rank higher in local search results:

  • Having your NAP info appear on more external sites.
  • Ensuring the accuracy of your citations.
  • Optimizing each one by adding as much supporting detail as the fields on that site allow.

WhiteSpark’s industry survey on local ranking factors provides a good framework that illustrates the variety of considerations in play when we talk about local citation signals. Citations are evaluated based on:

  • Consistency.
  • Quality/authority.
  • Quantity.
  • Enhancement/completeness.

The Evidence For Citations As A Ranking Factor

Citations have long been widely accepted by SEO professionals as a key local ranking factor.

“Consistency of citations” came in at #5 in Moz’s 2020 industry survey of what SEO pros believe are local ranking factors. (It ranked fifth in the 2018 survey as well, for both Local Pack/Finder and localized organic search results.)


However, exactly what it is about citations that matters most has been the subject of debate over the years.

When BrightLocal surveyed the industry in 2016, 90% of respondents said citation accuracy was “very important” to “critical” for local search rankings. What’s more, 86% said the quality of those citations was more important than quantity.

In a video on the topic, Google confirms that local results are based primarily on relevance, distance, and prominence.

And while you cannot control all of these factors, they say:

“First, make sure all of your business information is complete. It’s important to have accurate information including your phone number, address, and business category.”

Google also recommends that in order to ensure the accuracy of your GMB listing and “help you stand out”, you should:

  • Double-check that hours of operation are accurate.
  • Use special hours for holidays.
  • Add photos of your location, services, or merchandise.
  • Verify your location to tell Google you are the correct owner of the business.

In their “Improve your local ranking on Google” help resource, the advice is clear:

“Local results favor the most relevant results for each search. Businesses with complete and accurate information are easier to match with the right searches.”

The Evidence Against Local Citations As A Ranking Factor

You could argue that citations are too difficult to maintain and therefore not a reliable signal.

And you would be right.

It’s incredibly difficult to ensure that all citations across the local search ecosystem are kept up to date.

With so many aggregators, user suggestions, manual errors, and other elements wreaking havoc with citation information, how can Google trust that the information they’re finding about any one business location is accurate?


This is precisely why local listings management is so important, and providing Google a single source of truth through your GMB profile is key.

Monitoring for citation errors is essential so you can correct them before the wrong information is picked up by aggregators and more widely distributed.
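As a rough illustration of what that monitoring could look like, here is a minimal sketch in Python. The canonical record and the directory listings are invented for the example; real listings management tools handle far more variation (street abbreviations, suite numbers, international phone formats) than this simple normalization does.

```python
# Hypothetical NAP consistency check: normalize name, address, and phone
# strings before comparing each citation against your canonical record.
import re


def normalize(value: str) -> str:
    """Lowercase, strip punctuation, and collapse whitespace for comparison."""
    value = re.sub(r"[^\w\s]", "", value.lower())
    return re.sub(r"\s+", " ", value).strip()


def normalize_phone(value: str) -> str:
    """Keep digits only so '(555) 123-4567' matches '555.123.4567'."""
    return re.sub(r"\D", "", value)


# Invented canonical record and directory listings for illustration.
canonical = {
    "name": "Acme Plumbing",
    "address": "123 Main St, Springfield",
    "phone": "(555) 123-4567",
}

citations = [
    {"source": "directory-a", "name": "ACME Plumbing",
     "address": "123 Main Street, Springfield", "phone": "555-123-4567"},
    {"source": "directory-b", "name": "Acme Plumbing LLC",
     "address": "123 Main St, Springfield", "phone": "555.123.4567"},
]

for listing in citations:
    mismatches = []
    if normalize(listing["name"]) != normalize(canonical["name"]):
        mismatches.append("name")
    if normalize(listing["address"]) != normalize(canonical["address"]):
        mismatches.append("address")
    if normalize_phone(listing["phone"]) != normalize_phone(canonical["phone"]):
        mismatches.append("phone")
    status = "OK" if not mismatches else "check " + ", ".join(mismatches)
    print(f"{listing['source']}: {status}")
```

A crude check like this will flag harmless variations alongside real errors, which is exactly why the flagged listings still need human review.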

Citation inconsistencies can happen for countless reasons:

  • Businesses move to new locations.
  • Brands open and close stores.
  • Staff and owners create listings without documenting them, and they grow outdated as the business evolves.
  • Consumers create duplicate listings by making spelling mistakes when trying to leave a review.
  • Google searchers suggest listing edits with the best of intentions but the wrong information.
  • And more. A lot more.

Google recognizes that all of these issues can impact citation accuracy, which is why it relies on such a wide array of sources to determine whether the information is trustworthy.

Local Citations As A Ranking Factor: Our Verdict

Bottom line: It is all but officially confirmed that Google uses local citations as a ranking signal in Local Pack/Finder and localized organic search results.

Google’s aim is to provide the best, most trustworthy answers to every searcher.

Citations are an important signal of whether key business information is correct and whether a location is the best answer to a relevant local search query.

If you’re just getting started, check out John McAlpin’s Citations & Local SEO: The Ultimate Beginner’s Guide.

Ready to get more advanced? Make sure your citations are accurate and complete on as many relevant sources as possible. WhiteSpark’s free Top Local Citation Sources by Country finder enables you to pull a list of the top directories, networks, websites, etc. in 15 countries.


And if you really want to step up your local strategy, you’ll want to download Local SEO: The Definitive Guide to Improve Your Local Search Rankings.


Featured Image: Paolo Bobita/Search Engine Journal






Link Velocity: Is It A Ranking Factor?



Quickly gaining a lot of links from other sites sounds like it should be a positive thing for any website.

But could it actually hurt, rather than help, your rankings?

Or does link velocity not matter at all to Google? Is it, in fact, just some made-up SEO term?

Read on as we investigate the origins of link velocity and whether it’s something you need to be genuinely concerned about in SEO.

The Claim: Link Velocity As A Ranking Factor

Link velocity refers to a theory that the speed at which a website gains links has the potential to impact rankings, either positively or negatively.

Link Velocity = Good

Years ago, a high link velocity (gaining a large number of links in a short period of time) was viewed by some in the SEO industry as a good thing, one that could positively influence your Google rankings.

Link velocity was mentioned in articles and during conference sessions – because in those days link building was more about quantity than quality.

Want to get a webpage to rank quickly? Build a whole bunch of links to it fast.

But the idea of quantity over quality changed after Google launched the Penguin algorithm.

Link Velocity = Bad

The belief here is that gaining links too fast can cause a website to get penalized or demoted in search results.

It is based on the idea that Google will interpret a quick increase in inbound links as a sign that the website is trying to manipulate its search rankings.

Understandably, the idea of link velocity can be concerning for anyone worried about being inadvertently penalized simply for acquiring links.


The growth of a website’s link profile is largely out of its control.

If a site publishes a great piece of content, for example, many other sites may reference it within a short time frame, resulting in a number of links gained all at once.

Were link velocity to work as SEO experts claim, the website in the above example could receive a penalty because it gained an influx of inbound links through no fault of its own.

The Evidence: Link Velocity As A Ranking Factor

The origins of link velocity in the SEO community can be traced back to the discovery of a Google patent filed in 2003.

The patent, Information Retrieval Based on Historical Data, includes ideas about how a search engine should treat a website based on the growth of its link profile.

In particular, the idea of link velocity can be traced back to this passage:

“While a spiky rate of growth in the number of backlinks may be a factor used by search engine 125 to score documents, it may also signal an attempt to spam search engine 125. Accordingly, in this situation, search engine 125 may actually lower the score of a document(s) to reduce the effect of spamming.”

Search Engine Journal’s Roger Montti has picked apart SEO experts’ interpretation of this patent, noting how they ignore parts of the patent which disprove their own theory.

For instance, the patent goes on to define what a “spiky rate of growth” is and how it can be the defining characteristic of unnatural link building.

The patent isn’t about penalizing websites that see a rapid increase in inbound links.


It’s about demoting websites that exhibit a pattern of unusual spikes in inbound links over extended periods.

According to Montti:

“What that patent is really talking about is the smooth natural rate of growth versus a spiky and unnatural rate of growth.

A spiky rate of growth can manifest over the course of months. That’s a big difference from the link velocity idea that proposes that a large amount of links acquired in a short period will result in a penalty.”
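To make that distinction concrete, here is a hedged sketch in Python of what flagging a sustained, unusual spike in monthly link growth could look like in principle. The monthly counts, the three-month window, and the threshold are all invented for illustration; the patent does not describe an implementation, and this is not Google’s method.

```python
# Illustrative only: flag months whose new-link count far exceeds the
# trailing average. The data and threshold below are invented.
SPIKE_MULTIPLIER = 5  # arbitrary illustrative threshold

# Hypothetical counts of new inbound links per month.
monthly_new_links = [12, 15, 14, 18, 16, 240, 230, 19, 17, 20]


def flag_spiky_months(counts: list[int], window: int = 3) -> list[int]:
    """Return indexes of months that far exceed the trailing average."""
    spiky = []
    for i in range(window, len(counts)):
        trailing_avg = sum(counts[i - window:i]) / window
        if counts[i] > SPIKE_MULTIPLIER * trailing_avg:
            spiky.append(i)
    return spiky


print(flag_spiky_months(monthly_new_links))  # [5]: the first month of the spike
```

The point is only that a pattern of growth judged over time is different from the raw number of links gained in a short window.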

The evidence simply doesn’t support what experts claim about link velocity.

Link Velocity As A Ranking Factor: Our Verdict

There is no evidence to suggest that Google uses a signal known as link velocity that can negatively impact rankings.

Link velocity is not a term Google officially recognizes.

When asked about it, Google search representatives say a website’s links are assessed on their own merits, not by how many are gained in a given period of time.

Here’s an example of such a response from Google’s John Mueller:

“It’s not so much a matter of how many links you get in which time period. It’s really just… if these are links that are unnatural or from our point of view problematic then they would be problematic. It’s like it doesn’t really matter how many or in which time.”

Google’s Gary Illyes put it more bluntly in a Reddit AMA, calling link velocity a made-up term.

Whether links are gained fast or slow, what really matters is the quality of the individual links and the manner in which they were acquired (naturally or unnaturally).


Featured Image: Paolo Bobita/Search Engine Journal






Google’s Help Documents Aren’t Always Up To Date



Google admits its help documents aren’t always up to date and says it’s worthwhile doing your own research on recommended best practices.

This topic is discussed during the latest episode of Google’s SEO & Devs web series on YouTube, which is all about whether official help documents can be trusted.

Martin Splitt of Google’s Developer Relations team, and Michael King, founder and managing director of iPullRank, get together to talk about how Google’s documentation can lead developers to not trust SEO professionals.

SEOs provide recommendations to developers based on the information in Google’s official documents.

Google aims to keep those documents accurate and trustworthy, but the information sometimes lags behind what’s actually working in SEO, and what’s no longer relevant.

A specific example they addressed is a situation that came up in 2019, when Google revealed it had stopped supporting rel=”next” and rel=”prev” years before telling the search community.

That meant SEOs were telling developers to use pieces of code that were no longer relevant to Google Search.

Rather than making an official announcement about it, Google simply removed the documentation related to rel=”next” and rel=”prev”.

It wasn’t until Google’s Search Advocate John Mueller received a question about it on Twitter that anyone from the company told the search community about this change.

Some SEO professionals and developers may have come to that conclusion on their own after noticing Google understood pagination just fine without the use of rel=”next” and rel=”prev”.
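For readers who never worked with the markup, the sketch below shows what the deprecated link elements looked like and one rough way pagination might be inferred from ordinary on-page links instead. It assumes Python with beautifulsoup4 installed; the HTML is invented, and the inference shown is an assumption for illustration, not a description of how Google actually detects pagination.

```python
# Illustrative only: the old rel="next"/rel="prev" hints versus a crude
# inference of pagination from ordinary anchor links.
import re

from bs4 import BeautifulSoup  # pip install beautifulsoup4

html = """
<head>
  <link rel="prev" href="https://example.com/articles?page=1">
  <link rel="next" href="https://example.com/articles?page=3">
</head>
<body>
  <a href="/articles?page=1">1</a>
  <a href="/articles?page=2">2</a>
  <a href="/articles?page=3">3</a>
</body>
"""

soup = BeautifulSoup(html, "html.parser")

# The deprecated, now-unsupported hints.
explicit = []
for link in soup.find_all("link"):
    rel = link.get("rel") or []
    rel = rel.split() if isinstance(rel, str) else rel
    if "next" in rel or "prev" in rel:
        explicit.append(link.get("href"))

# A crude inference from anchor links that carry a page-number parameter.
inferred = sorted({a["href"] for a in soup.find_all("a", href=re.compile(r"[?&]page=\d+"))})

print("explicit hints:", explicit)
print("inferred pagination:", inferred)
```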

That’s one example where doing your own research could help you learn how Google Search works, rather than relying solely on official documentation.


Splitt shares background information about this situation, and the difficult choices Google had to make when it came to communicating the changes to the search community.

Why Aren’t Google’s Help Documents Always Up To Date?

Google Search changes quickly, so Splitt cautions against looking at the company’s documentation as the single source of truth.

Regarding the rel=”next” and rel=”prev” situation, Splitt says:

“The docs are not always in phase. We’re doing our best to work with the teams and help them to keep their documentation updated, but it does every now and then happen in this case like a bunch of engineers in search quality figured out — ‘hey, hold on, we actually don’t really need the rel-next and rel-prev links anymore to figure out that there’s like a pagination going on. We can figure that out from other things on the page by themselves.’”

When it was discovered the code was no longer needed, Google’s engineers removed support for it.

Splitt explains the decision making process behind communicating this change to SEOs and developers.

“… What do you do? Do you either update the docs to just quietly remove that part because it no longer is relevant?

Or do you go like ‘Hey, by the way, this is no longer necessary. And truthfully speaking it hasn’t been necessary in the last six months.’

Knowing very well that people are reading the documentation, making recommendations based on it to other people, and then these people then invest work and time and money into making that happen.”

Splitt goes on to say the choice was either to remove the documentation and come clean about rel=”next” and rel=”prev” being obsolete, or keep the documents up knowing the code wasn’t necessary anymore.

“And the alternative would be to let it live there in the documentation, even though it’s wrong it doesn’t hurt.

So we went with the full frontal way of going like — ‘Okay, here’s the thing. This has been removed a while ago and we’re sorry about that, but now our docs are updated.’

And I think none of the choices are easy or necessarily perfectly good, but it’s just what happens. So I think we’re trying to keep the docs updated as much as possible.”

So there’s the story behind rel=”next” and rel=”prev” and why Google handled the situation the way it did.


The key takeaway from this story is to always be testing and doing your own research.

Figuring out what works on your own may be more reliable than Google’s help documents.

If you believe something isn’t necessary, even though Google recommends it, your instincts may be correct.

The full episode is available on the Google Search Central YouTube channel.


Featured Image: Screenshot from YouTube.com/GoogleSearchCentral




