Are Internal Links A Ranking Factor: What You Need To Know

Serious question: Are internal links a ranking factor?

Too often, the chatter around internal links as a ranking factor feels more like it’s coming from a never-ending game of telephone rather than the true source, the search engines.

Certain mythical SEO tales about internal links have been passed down through generations of SEO professionals. It can be hard to tell fact from fiction.

In an effort to set the record straight, I’ve tapped our resources to fact-check whether internal links are a confirmed ranking factor. Drumroll, please: You’ll find the truth about internal links ahead.

The Claim: Internal Links Are A Ranking Factor

An internal link is a hyperlink from a page on a domain to another page on the same domain. Internal links help people navigate websites and create a site architecture for hierarchy.

Okay, but what about the more nitty-gritty questions, like:

  • Does the total number of internal links pointing to a page matter?
  • Does the quality of those internal links pointing to the page have a strong effect?
  • What about the anchor text of those internal links – is that another relevancy signal? Does longer anchor text add more value?
  • Is there such a thing as too many internal links on a page?

The Evidence for Internal Links as a Ranking Factor

There are still plenty of internal link questions to answer, and I want you to have all the facts straight, so here they are.

Are Internal Links A Ranking Factor?

Google confirms internal links are a ranking factor in its Search Engine Optimization (SEO) Starter Guide. Google states:

Create a naturally flowing hierarchy.

Make it as easy as possible for users to go from general content to the more specific content they want on your site. Add navigation pages when it makes sense and effectively work these into your internal link structure. Make sure all of the pages on your site are reachable through links, and that they don’t require an internal “search” functionality to be found. Link to related pages, where appropriate, to allow users to discover similar content.

And, Google’s “How Search Engines Work” establishes internal links as a ranking factor.

Some pages are known because Google has already crawled them before. Other pages are discovered when Google follows a link from a known page to a new page.

This is also why Google Search Console features the “Top linked pages” report. It is used to “Confirm that the core site pages (home page, contact page) are properly linked within your site.”

The SEO Starter Guide also recommends using internal links in your breadcrumb structured data markup, stating:

“A breadcrumb is a row of internal links at the top or bottom of the page that allows visitors to quickly navigate back to a previous section or the root page. Many breadcrumbs have the most general page (usually the root page) as the first, leftmost link and list the more specific sections out to the right. We recommend using breadcrumb structured data markup when showing breadcrumbs.”
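The breadcrumb markup Google recommends is typically emitted as schema.org BreadcrumbList JSON-LD. A minimal sketch in Python (the example.com URLs and page names are hypothetical):

```python
import json

def breadcrumb_jsonld(crumbs):
    """Build BreadcrumbList structured data from (name, url) pairs, root first."""
    return {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {
                "@type": "ListItem",
                "position": i,  # positions are 1-based, leftmost crumb first
                "name": name,
                "item": url,
            }
            for i, (name, url) in enumerate(crumbs, start=1)
        ],
    }

# Hypothetical breadcrumb trail for a page on example.com
markup = breadcrumb_jsonld([
    ("Home", "https://example.com/"),
    ("SEO", "https://example.com/seo/"),
    ("Internal Links", "https://example.com/seo/internal-links/"),
])

# The JSON-LD goes in a <script type="application/ld+json"> block on the page
print(json.dumps(markup, indent=2))
```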

The PageRank algorithm itself, and the internal flow of PageRank through a site, relies on internal links.
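That internal flow can be illustrated with a toy PageRank computation. The following is a simplified power-iteration sketch over a hypothetical five-page site, not Google's production algorithm:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank. links maps each page to the pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform rank
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                # each page splits its rank evenly among its outgoing links
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
            else:
                # dangling page: distribute its rank evenly to all pages
                for t in pages:
                    new_rank[t] += damping * rank[page] / n
        rank = new_rank
    return rank

# Hypothetical internal link structure
site = {
    "/": ["/products", "/blog", "/contact"],
    "/products": ["/", "/products/widget"],
    "/blog": ["/", "/products/widget"],
    "/contact": ["/"],
    "/products/widget": ["/"],
}
ranks = pagerank(site)
```

In this toy example the root page, which every other page links to, accumulates the most rank, which is why internal linking choices shape how authority flows through a site.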

Does Your Webpage Rank Faster If You Have Internal Links From High Traffic Pages?

Since Bill Slawski shared his analysis of Google’s Reasonable Surfer patent, there have been arguments in the SEO community as to whether pages with or without traffic affect the ranking signals from internal links.

Slawski wrote that the patent weights links “…based upon a probability that a person following links at random on the web might end up upon a particular page.”

The patent talks about the position of a link on a page.

Essentially, it’s about giving more weight to links it believes people will actually click, including links placed higher up on the page.

Matt Cutts confirmed this at PubCon in 2010.

The patent does not reference traffic.

Slawski also dives into the Page Segmentation patent that explains more about the placement of internal links on a page. And, he shares further insights on how search engines use internal links to understand a webpage.

Is Anchor Text In An Internal Link A Ranking Factor?

The SEO Starter Guide clears up the confusion about whether internal link anchor text is a ranking factor, stating:

“Think about anchor text for internal links, too.

You may usually think about linking in terms of pointing to outside websites, but paying more attention to the anchor text used for internal links can help users and Google navigate your site better.”

Google’s John Mueller also addressed this claim on Twitter, where he said:

“Most links do provide a bit of additional context through their anchor text. At least they should, right?”

And, in 2019, Mueller talked more about how internal links help your rankings in a Google Webmaster Hangout.

However, the claim that longer anchor text within your internal links adds more value is merely speculation at this time. Search engines have not verified this myth.

In fact, the SEO Starter Guide explicitly recommends avoiding “using excessively keyword-filled or lengthy anchor text just for search engines.”

Rand Fishkin also dives into his anchor text experiments to prove the value of quality anchor text.

And, Search Engine Journal’s Roger Montti digs into Mueller’s response on whether anchor text helps improve rankings.

Are Internal Links Used As A Ranking Signal In Your Site Architecture?

Internal linking can have positive or negative effects:

  • NinjaOutreach increased their site traffic by 50% in three months with their internal link structure.
  • The Daily Mail failed to outrank its competitors because of its weak internal linking.

Google’s patent on Ranking documents based on user behavior or feature data explores site architecture more in-depth.

So, What Happens If Your Internal Links Are Broken?

Broken internal links make it hard for search engines to index your pages and for users to navigate your site. Broken links are a sign of a low-quality site and could affect your rankings.

Google’s Web Page Decay patent validates this claim as it states,

“If the web page has a relatively large number of dead links, it is assessed as being a stale web page.”

Now, How Many Internal Links Are Too Many?

Back in 2009, Matt Cutts stated there was a limit of 100 internal links per page.

In the past, Google would not download more than the first 100KB of a page (this is no longer the case), so limiting the number of links among which your PageRank would be distributed made sense.

In 2013, Matt Cutts retracted this statement, advising site owners to “keep it at a reasonable number.” So, the rule of 100 internal links is no longer valid.

Internal Links As A Ranking Factor: Our Verdict

Yes, there is truth to the claim that internal links and your rankings in search engines are connected.

Think about it this way, as Cutts said:

“…if there’s a page that’s important or that has great profit margins or converts really well – escalate that. Put a link to that page from your root page that’s the sort of thing where it can make a lot of sense.”


Featured image: Paulo Bobita/SearchEngineJournal




OpenAI To Show Content & Links In Response To Queries

ChatGPT takes step toward becoming a search engine

OpenAI’s content deal will enhance ChatGPT with the ability to show real-time content with links in response to queries. OpenAI quietly took a step toward gaining more search engine-like functionality as part of a content licensing deal that may have positive implications for publishers and SEO.

Content Licensing Deal

OpenAI agreed to content licensing with the Financial Times, a global news organization with offices in London, New York, across continental Europe and Asia.

Content licensing deals between AI organizations and publishers are generally about getting access to high quality training data. The training data is then used by language models to learn connections between words and concepts. This deal goes far beyond that use.

ChatGPT Will Show Direct Quotes With Attribution

What makes this content licensing deal between The Financial Times and OpenAI notable is that there is a reference to giving attribution to content within ChatGPT.

The announced licensing deal explicitly mentions the use of the licensed content so that ChatGPT could directly quote it and provide links to the licensed content.

Further, the licensing deal is intended to help improve ChatGPT’s “usefulness”, which is vague and can mean many things, but it takes on a slightly different meaning when used in the context of attributed answers.

The Financial Times agreement states that the licensing deal is for use in ChatGPT when it provides “attributed content” which is content with an attribution, commonly a link to where the content appeared.

This is the part of the announcement that references attributed content:

“The Financial Times today announced a strategic partnership and licensing agreement with OpenAI, a leader in artificial intelligence research and deployment, to enhance ChatGPT with attributed content, help improve its models’ usefulness by incorporating FT journalism, and collaborate on developing new AI products and features for FT readers. “

And this is the part of the announcement that mentions ChatGPT offering users attributed quotes and links:

“Through the partnership, ChatGPT users will be able to see select attributed summaries, quotes and links to FT journalism in response to relevant queries.”

The Financial Times Group CEO was even more explicit about OpenAI’s intention to show content and links in ChatGPT:

“This is an important agreement in a number of respects,” said FT Group CEO John Ridding. “It recognises the value of our award-winning journalism and will give us early insights into how content is surfaced through AI. …this partnership will help keep us at the forefront of developments in how people access and use information.

OpenAI understands the importance of transparency, attribution, and compensation…”

Brad Lightcap, COO of OpenAI, directly referenced showing real-time news content in ChatGPT, but more importantly, he referenced OpenAI exploring new ways to show content to its user base.

Lastly, the COO stated that they embraced disruption, which means innovation that creates a new industry or paradigm, usually at the expense of an older one, like search engines.

Lightcap is quoted:

“We have always embraced new technologies and disruption, and we’ll continue to operate with both curiosity and vigilance as we navigate this next wave of change.”

Showing direct quotes of Financial Times content with links to that content is very similar to how search engines work. This is a big change to how ChatGPT works and could be a sign of where ChatGPT is going in the future, a functionality that incorporates online content with links to that content.

Something Else That Is Possibly Related

Someone on Twitter recently noticed a “search”-related change connected to ChatGPT.

This change involves an SSL security certificate that was added for a subdomain of ChatGPT.com. ChatGPT.com is a domain name that was snapped up by someone to capitalize on the 2022 announcement of ChatGPT by OpenAI. OpenAI eventually acquired the domain and it’s been redirecting to ChatGPT.

The change that was noticed is to the subdomain: search.chatgpt.com.

Big News For SEO and Publishers

This is significant news for publishers and search marketers: ChatGPT will become a source of valuable traffic if OpenAI takes ChatGPT in the direction of providing attributed summaries and direct quotes.

How Can Publishers Get Traffic From ChatGPT?

Questions remain about attributed quotes with links in response to relevant queries. Here are some of the unknowns about ChatGPT attributed links.

  • Does this mean that only licensed content will be shown and linked to in ChatGPT?
  • Will ChatGPT incorporate and use most web data without licensing deals in the same way that search engines do?
  • Will OpenAI incorporate an opt-in model where publishers can use a notation in robots.txt or in meta data to opt in to receiving traffic from ChatGPT?
  • Would you opt in to receiving traffic from ChatGPT in exchange for allowing your content to be used for training?
  • How would the calculus for SEOs and publishers change if their competitors are all receiving traffic from ChatGPT?

Read the original announcement:

Financial Times announces strategic partnership with OpenAI

Featured Image by Shutterstock/Photo For Everything

Google’s John Mueller On Website Recovery After Core Updates

John Mueller, a Google Search Advocate, provided guidance this week regarding the path forward for websites impacted by recent search algorithm updates.

The discussion started on X (formerly Twitter) by SEO professional Thomas Jepsen.

Jepsen tagged Mueller, asking:

“Google has previously said Google doesn’t hold a grudge and sites will recover once issues have been solved. Is that still the case after HCU?”

Mueller’s response offered hope to site owners while being realistic about the challenges ahead.

Addressing Recovery Timelines

Mueller affirmed Google’s stance on not holding grudges, stating, “That’s still the case.”

However, he acknowledged the complexity of rankings, saying:

“…some things take much longer to be reassessed (sometimes months, at the moment), and some bigger effects require another update cycle.”

Mueller pointed to a Google help document explaining the nuances. The document reads:

“Broad core updates tend to happen every few months. Content that was impacted in Search or Discover by one might not recover—assuming improvements have been made—until the next broad core update is released.

Do keep in mind that improvements made by site owners aren’t a guarantee of recovery, nor do pages have any static or guaranteed position in our search results. If there’s more deserving content, that will continue to rank well with our systems.”

The Comments Sparking Debate

Jepsen probed further, asking, “Is a core update what’s needed for HCU-affected sites to recover (assuming they’ve fixed their issues)?”

Mueller’s response highlighted how situations can differ:

“It depends on the situation… I realize there’s a big space between the situations, but generalizing doesn’t help. Sometimes it takes a lot of work on the site, a long time, and an update.”

The thread grew as user @selectgame raised concerns about Google Discover traffic, to which Mueller replied:

“Google Discover is affected by core updates as well as other parts of Search (and there are more policies that apply to Discover).”

Growing Frustrations

Prominent industry figure Lily Ray voiced mounting frustrations, stating,

“…many HCU-affected websites – which have been making all kinds of improvements over the last 7 months – have only seen further declines with the March Core Update.

I have seen some sites lose 90% or more of their SEO visibility since the HCU, with the last few weeks being the nail in the coffin, despite making significant improvements.”

Ray continued:

“And in my professional opinion, many of these sites did not deserve anywhere near that level of impact, especially the further declines over the past month.”

Mueller hasn’t responded to Ray’s tweet at this time.

Looking Ahead

As the search community awaits Google’s next moves, the path to recovery appears arduous for many impacted by recent algorithm reassessments of “Helpful Content.”

Site improvements don’t guarantee immediate recovery, so publishers face an uphill battle guided only by Google’s ambiguous public advice.

Why SEJ Cares

The March 2024 core update has proven disastrous for many websites, with severe traffic losses persisting even after sites try to improve low-quality content, address technical issues, and realign with Google’s guidelines.

Having clear, actionable guidance from Google on recovering from core updates is invaluable.

As evidenced by the frustrations expressed, the current communications leave much to be desired regarding transparency and defining a straightforward recovery path.

How This Can Help You

While Mueller’s comments provide some insights, the key takeaways are:

  • Regaining previous rankings after an algorithm hit is possible if sufficient content/site quality improvements are made.
  • Recovery timelines can vary significantly and may require a future core algorithm update.
  • Even with enhancements, recovery isn’t guaranteed as rankings depend on the overall pool of competing content.

The path is undoubtedly challenging, but Mueller’s comments underscore that perseverance with substantial site improvements can eventually pay off.


FAQ

Can SEO professionals predict recovery time for a website hit by core updates?

SEO professionals can’t pinpoint when a site will recover after a core Google algorithm update.

Reasons for this include:

  • Google releases core updates every few months, so sites may need to wait for the next one.
  • It can take months for Google to reassess and adjust rankings.
  • How competitive the query is also impacts if and when a site recovers.

Does making site improvements after a core update ensure recovery in rankings and visibility?

After making improvements following a Google algorithm update, regaining your previous rankings isn’t guaranteed.

Reasons why include:

  • Your impacted content may not recover until the next core update, provided you’ve implemented enough site improvements.
  • Google’s search results are dynamic, and rankings can fluctuate based on the quality of competitor content.
  • There’s no fixed or guaranteed position in Google’s search results.

What is the relationship between Google Discover traffic and core search updates?

Google’s core algorithm updates that impact regular search results also affect Google Discover.

However, Google Discover has additional specific policies that determine what content appears there.

This means:

  • Improving your content and website quality can boost your visibility on Google Discover, just like regular searches.
  • You may see changes in your Discover traffic when Google rolls out core updates.
  • Your SEO and content strategy should account for potential impacts on regular searches and Google Discover.

Featured Image: eamesBot/Shutterstock



5 Things To Consider Before A Site Migration

One of the scariest SEO tasks is a site migration because the stakes are so high and there are pitfalls at every step. Here are five tips that will help keep a site migration on track to a successful outcome.

Site Migrations Are Not One Thing

Site migrations are not one thing; they cover different scenarios, and the only thing those scenarios have in common is that there is always something that can go wrong.

Here are examples of some of the different kinds of site migrations:

  • Migration to a new template
  • Migrating to a new web host
  • Merging two different websites
  • Migrating to a new domain name
  • Migrating to a new site architecture
  • Migrating to a new content management system (CMS)
  • Migrating to a new WordPress site builder

There are many ways a site can change and more ways for those changes to result in a negative outcome.

The following is not a site migration checklist. It’s five suggestions for things to consider.

1. Prepare For Migration: Download Everything

Rule number one is to prepare for the site migration. One of my big concerns is that the old version of the website is properly documented.

These are some of the ways to document a website:

  • Download the database and save it in at least two places. I like to have a backup of the backup stored on a second device.
  • Download all the website files. Again, I prefer to save a backup of the backup stored on a second device.
  • Crawl the site, save the crawl and export it as a CSV or an XML site map. I prefer to have redundant backups just in case something goes wrong.

An important thing to remember about downloading files by FTP is that there are two formats for downloading files: ASCII and Binary.

  1. Use ASCII for downloading files that contain code, like CSS, JS, PHP and HTML.
  2. Use Binary for media like images, videos and zip files.

Fortunately, most modern FTP software has an automatic setting that should be able to distinguish between the two kinds of files. A sad thing that can happen is downloading image files in ASCII format, which results in corrupted images.
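If you want to make the mode choice explicit rather than trusting the client's auto-detection, it can be scripted with Python's standard ftplib. A rough sketch, where the extension list and server details are assumptions, not a definitive mapping:

```python
import os
from ftplib import FTP

# Extensions treated as text (ASCII transfer); everything else goes binary.
# This list is an assumption for illustration; extend it for your own site.
TEXT_EXTENSIONS = {".html", ".htm", ".css", ".js", ".php", ".txt", ".xml"}

def transfer_mode(filename):
    """Pick the FTP transfer mode based on the file extension."""
    ext = os.path.splitext(filename)[1].lower()
    return "ascii" if ext in TEXT_EXTENSIONS else "binary"

def download(ftp, filename, dest_dir="."):
    """Download one file using the appropriate transfer mode."""
    dest = os.path.join(dest_dir, filename)
    if transfer_mode(filename) == "ascii":
        with open(dest, "w", encoding="utf-8", newline="") as f:
            ftp.retrlines("RETR " + filename, lambda line: f.write(line + "\n"))
    else:
        with open(dest, "wb") as f:
            ftp.retrbinary("RETR " + filename, f.write)

# Usage (hypothetical host and credentials):
# ftp = FTP("ftp.example.com")
# ftp.login("user", "password")
# for name in ftp.nlst():
#     download(ftp, name)
```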

So always check that your files are all properly downloaded and not in a corrupted state. Always consider downloading a copy for yourself if you have hired a third party to handle the migration or a client is doing it and they’re downloading files. That way if they fail with their download you’ll have an uncorrupted copy backed up.

The most important rule about backups: You can never have too many backups!

2. Crawl The Website

Do a complete crawl of the website. Create a backup of the crawl. Then create a backup of the backup and store it on a separate hard drive.

After the site migration, this crawl data can be used to generate a list of the old URLs for recrawling, to identify any URLs that are missing (404), failing to redirect, or redirecting to the wrong webpage. Screaming Frog also has a list mode that can crawl a list of URLs saved in different formats, including as an XML sitemap, or pasted directly into a text field. This is a way to crawl a specific batch of URLs as opposed to crawling a site from link to link.
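The recheck of old URLs can also be sketched with the standard library. This is an illustration of the classification logic, not a substitute for a crawler like Screaming Frog, and the example URL is hypothetical:

```python
import urllib.error
import urllib.request

def check_url(url, timeout=10):
    """Return (status_code, final_url) after following any redirects."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status, resp.geturl()
    except urllib.error.HTTPError as e:
        return e.code, url

def classify(old_url, status, final_url, expected=None):
    """Bucket an old URL by what happened to it after the migration."""
    if status == 404:
        return "missing"
    if final_url == old_url:
        return "no redirect"
    if expected is not None and final_url != expected:
        return "wrong target"
    return "redirected ok"

# Usage (hypothetical URL; run against your exported crawl list):
# for old in ["https://example.com/old-page"]:
#     status, final = check_url(old)
#     print(old, classify(old, status, final))
```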

3. Tips For Migrating To A New Template

Website redesigns can be a major source of anguish when they go wrong. On paper, migrating a site to a new template should be a one-to-one change with minimal issues. In practice, that’s not always the case. For one, no template can be used off the shelf; it has to be modified to conform to what’s needed, which can mean removing and/or altering the code.

Search marketing expert Nigel Mordaunt (LinkedIn), who recently sold his search marketing agency, has experience migrating over a hundred sites and has important considerations for migrating to a new WordPress template.

This is Nigel’s advice:

“Check that all images have the same URL, alt text and image titles, especially if you’re using new images.

Templates sometimes have hard-coded heading elements, especially in the footer and sidebars. Those should be styled with CSS, not with H tags. I had this problem with a template once where the ranks had moved unexpectedly, then found that the Contact Us and other navigation links were all marked up to H2. I think that was more of a problem a few years ago. But still, some themes have H tags hard coded in places that aren’t ideal.

Make sure that all URLs are the exact same, a common mistake. Also, if planning to change content then check that the staging environment has been noindexed then after the site goes live make sure that the newly uploaded live site no longer contains the noindex robots meta tag.

If changing content, then be prepared for the site to perhaps be re-evaluated by Google. Depending on the size of the site, even if the changes are positive, it may take several weeks to be rewarded, and in some cases several months. The client needs to be informed of this before the migration.

Also, check that analytics and tracking codes have been inserted into the new site, review all image sizes to make sure there are no new images that are huge and haven’t been scaled down. You can easily check the image sizes and heading tags with a post-migration Screaming Frog crawl. I can’t imagine doing any kind of site migration without Screaming Frog.”
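The staging-noindex check Nigel mentions can be automated with Python's built-in HTML parser. A minimal sketch; note that real pages can also set noindex via the X-Robots-Tag HTTP header, which this does not cover:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            # attrs is a list of (name, value); values can be None
            a = {k: (v or "") for k, v in attrs}
            if a.get("name", "").lower() == "robots":
                self.directives.append(a.get("content", "").lower())

def has_noindex(html):
    """True if the page carries a robots meta tag containing noindex."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

# Usage: fetch the newly launched page's HTML and confirm
# has_noindex(html) is False before declaring the migration done.
```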

4. Advice For Migrating To A New Web Host

Mark Barrera (LinkedIn), VP SEO, Newfold Digital (parent company of Bluehost), had this to say about crawling before a site migration in preparation for a migration to a new web host:

“Thoroughly crawl your existing site to identify any indexing or technical SEO issues prior to the move.

Maintain URL Structure (If Possible): Changing URL structures can confuse search engines and damage your link equity. If possible, keep your URLs the same.

301 Redirects: 301 Redirects are your friend. Search engines need to be informed that your old content now lives at a new address. Implementing 301 redirects from any old URLs to their new counterparts preserves link equity and avoids 404 errors for both users and search engine crawlers.

Performance Optimization: Ensure your new host provides a fast and reliable experience. Site speed is important for user experience.

Be sure to do a final walkthrough of your new site before doing your actual cutover. Visually double-check your homepage, any landing pages, and your most popular search hits. Review any checkout/cart flows, comment/review chains, images, and any outbound links to your other sites or your partners.

SSL Certificate: A critical but sometimes neglected aspect of hosting migrations is the SSL certificate setup. Ensuring that your new host supports and correctly implements your existing SSL certificate—or provides a new one without causing errors is vital. SSL/TLS not only secures your site but also impacts SEO. Any misconfiguration during migration can lead to warnings in browsers, which deter visitors and can temporarily impact rankings.

Post migration, it’s crucial to benchmark server response times not just from one location, but regionally or globally, especially if your audience is international. Sometimes, a new hosting platform might show great performance in one area but lag in other parts of the world. Such discrepancies can affect page load times, influencing bounce rates and search rankings. “
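Part of the SSL check Barrera describes can be scripted. A sketch that reports how many days a host's certificate has left; the hostname is a placeholder, and this checks expiry only, not the full TLS configuration:

```python
import socket
import ssl
from datetime import datetime, timezone

def days_left(not_after, now=None):
    """Days until a certificate's notAfter date (format used by ssl.getpeercert)."""
    expires = datetime.strptime(
        not_after, "%b %d %H:%M:%S %Y %Z"
    ).replace(tzinfo=timezone.utc)
    now = now or datetime.now(timezone.utc)
    return (expires - now).days

def cert_days_remaining(hostname, port=443, timeout=10):
    """Connect over TLS and report how many days the certificate has left."""
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    return days_left(cert["notAfter"])

# Usage (hypothetical host):
# print(cert_days_remaining("example.com"))
```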

5. Accept Limitations

Ethan Lazuk, SEO Strategist & Consultant, Ethan Lazuk Consulting, LLC, (LinkedIn, Twitter) offers an interesting perspective on site migrations: anticipate the limitations a client may impose on what you are able to do. It can be frustrating when a client pushes back on advice, so it’s important to listen to their reasons for doing so.

I have consulted over Zoom with companies whose SEO departments had concerns about what an external SEO wanted to do. Seeking a third party confirmation about a site migration plan is a reasonable thing to do. So if the internal SEO department has concerns about the plan, it’s not a bad idea to have a trustworthy third party take a look at it.

Ethan shared his experience:

“The most memorable and challenging site migrations I’ve been a part of involved business decisions that I had no control over.

As SEOs, we can create a smart migration plan. We can follow pre- and post-launch checklists, but sometimes, there are legal restrictions or other business realities behind the scenes that we have to work around.

Not having access to a DNS, being restricted from using a brand’s name or certain content, having to use an intermediate domain, and having to work days, weeks, or months afterward to resolve any issues once the internal business situations have changed are just a few of the tricky migration issues I’ve encountered.

The best way to handle these situations that require working around client restrictions is to button up the SEO tasks you can control, set honest expectations for how the business issues could impact performance after the migration, and stay vigilant with monitoring post-launch data and using it to advocate for resources you need to finish the job.”

Different Ways To Migrate A Website

Site migrations are a pain and should be approached with caution. I’ve done many different kinds of migrations for my own sites and have assisted clients with theirs. I’m currently moving thousands of webpages from a folder to the root, and it’s complicated by multiple redirects that have to be reconfigured; I’m not looking forward to it. But migrations are sometimes unavoidable, so it’s best to step up to them after careful consideration.

Featured Image by Shutterstock/Krakenimages.com


