11 Disadvantages Of ChatGPT Content

ChatGPT produces content that is comprehensive and plausibly accurate.

But researchers, artists, and professors warn of shortcomings that degrade the quality of the content.

In this article, we’ll look at 11 disadvantages of ChatGPT content. Let’s dive in.

1. Phrase Usage Makes It Detectable As Non-Human

Researchers studying how to detect machine-generated content have discovered patterns that make it sound unnatural.

One of these quirks is how AI struggles with idioms.

An idiom is a phrase or saying with a figurative meaning attached to it, for example, “every cloud has a silver lining.” 

A lack of idioms within a piece of content can be a signal that the content is machine-generated – and this can be part of a detection algorithm.

This is what the 2022 research paper Adversarial Robustness of Neural-Statistical Features in Detection of Generative Transformers says about this quirk in machine-generated content:

“Complex phrasal features are based on the frequency of specific words and phrases within the analyzed text that occur more frequently in human text.

…Of these complex phrasal features, idiom features retain the most predictive power in detection of current generative models.”

This inability to use idioms contributes to making ChatGPT output sound and read unnaturally.
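
To illustrate the general idea behind an idiom-frequency feature (this is a simplified sketch, not the paper's actual method, and the idiom list is a made-up placeholder), a detector might compute how often known idioms appear per 1,000 words and feed that number into a classifier alongside other signals:

    import re

    # Hypothetical idiom list for illustration; real detectors use much larger lexicons.
    IDIOMS = [
        "every cloud has a silver lining",
        "bite the bullet",
        "under the weather",
        "hit the nail on the head",
    ]

    def idiom_rate(text: str) -> float:
        """Return idiom occurrences per 1,000 words as a simple stylistic feature."""
        lowered = text.lower()
        hits = sum(lowered.count(idiom) for idiom in IDIOMS)
        words = len(re.findall(r"\w+", lowered))
        return 1000 * hits / words if words else 0.0

    # A low rate alone doesn't prove text is machine-generated; it is one weak
    # signal among many that a detection model could weigh.
    print(idiom_rate("It was rough, but I decided to bite the bullet and rewrite the page."))

A low idiom rate by itself proves nothing; the research simply found that, of the complex phrasal features tested, idiom features retained the most predictive power.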

2. ChatGPT Lacks Ability For Expression

An artist commented on how the output of ChatGPT mimics what art is, but lacks the actual qualities of artistic expression.

Expression is the act of communicating thoughts or feelings.

ChatGPT output doesn’t contain expressions, only words.

It cannot produce content that touches people emotionally on the same level as a human can – because it has no actual thoughts or feelings.

Musical artist Nick Cave, in an article posted to his Red Hand Files newsletter, commented on a ChatGPT lyric that was sent to him, which was created in the style of Nick Cave.

He wrote:

“What makes a great song great is not its close resemblance to a recognizable work.

…it is the breathless confrontation with one’s vulnerability, one’s perilousness, one’s smallness, pitted against a sense of sudden shocking discovery; it is the redemptive artistic act that stirs the heart of the listener, where the listener recognizes in the inner workings of the song their own blood, their own struggle, their own suffering.”

Cave called the ChatGPT lyrics a mockery.

This is the ChatGPT lyric that resembles a Nick Cave lyric:

“I’ve got the blood of angels, on my hands
I’ve got the fire of hell, in my eyes
I’m the king of the abyss, I’m the ruler of the dark
I’m the one that they fear, in the shadows they hark”

And this is an actual Nick Cave lyric (Brother, My Cup Is Empty):

“Well I’ve been sliding down on rainbows
I’ve been swinging from the stars
Now this wretch in beggar’s clothing
Bangs his cup across the bars
Look, this cup of mine is empty!
Seems I’ve misplaced my desires
Seems I’m sweeping up the ashes
Of all my former fires”

It’s easy to see that the machine-generated lyric resembles the artist’s lyric, but it doesn’t really communicate anything.

Nick Cave’s lyrics tell a story that resonates with the pathos, desire, shame, and willful deception of the person speaking in the song. It expresses thoughts and feelings.

It’s easy to see why Nick Cave calls it a mockery.

3. ChatGPT Does Not Produce Insights

An article published in The Insider quoted an academic who noted that academic essays generated by ChatGPT lack insights about the topic.

ChatGPT summarizes the topic but does not offer a unique insight into the topic.

Humans create through knowledge, but also through their personal experience and subjective perceptions.

Professor Christopher Bartel of Appalachian State University is quoted by The Insider as saying that, while a ChatGPT essay may exhibit high-quality grammar and sophisticated ideas, it still lacks insight.

Bartel said:

“They are really fluffy. There’s no context, there’s no depth or insight.”

Insight is the hallmark of a well-done essay and it’s something that ChatGPT is not particularly good at.

This lack of insight is something to keep in mind when evaluating machine-generated content.

4. ChatGPT Is Too Wordy

A research paper published in January 2023 discovered patterns in ChatGPT content that make it less suitable for critical applications.

The paper is titled, How Close is ChatGPT to Human Experts? Comparison Corpus, Evaluation, and Detection.

The research showed that humans preferred ChatGPT’s answers to more than 50% of the questions related to finance and psychology.

But ChatGPT failed at answering medical questions because humans preferred direct answers – something the AI didn’t provide.

The researchers wrote:

“…ChatGPT performs poorly in terms of helpfulness for the medical domain in both English and Chinese.

The ChatGPT often gives lengthy answers to medical consulting in our collected dataset, while human experts may directly give straightforward answers or suggestions, which may partly explain why volunteers consider human answers to be more helpful in the medical domain.”

ChatGPT tends to cover a topic from different angles, which makes it inappropriate when the best answer is a direct one.

Marketers using ChatGPT must take note of this because site visitors requiring a direct answer will not be satisfied with a verbose webpage.

And good luck ranking an overly wordy page in Google’s featured snippets, where a succinct and clearly expressed answer that can work well in Google Voice may have a better chance to rank than a long-winded answer.

OpenAI, the maker of ChatGPT, acknowledges that giving verbose answers is a known limitation.

The announcement article by OpenAI states:

“The model is often excessively verbose…”

The ChatGPT bias toward providing long-winded answers is something to be mindful of when using ChatGPT output, as you may encounter situations where shorter and more direct answers are better.

5. ChatGPT Content Is Highly Organized With Clear Logic

ChatGPT has a writing style that is not only verbose but also tends to follow a template that gives the content a unique style that isn’t human.

This inhuman quality is revealed in the differences between how humans and machines answer questions.

The movie Blade Runner has a scene featuring a series of questions designed to reveal whether the subject answering the questions is a human or an android.

These questions were part of a fictional test called the “Voight-Kampff test.”

One of the questions is:

“You’re watching television. Suddenly you realize there’s a wasp crawling on your arm. What do you do?”

A normal human response would be to say something like they would scream, walk outside and swat it, and so on.

But when I posed this question to ChatGPT, it offered a meticulously organized answer that summarized the question and then logically laid out multiple possible outcomes – failing to answer the actual question.

Screenshot of ChatGPT answering a Voight-Kampff test question (Screenshot from ChatGPT, January 2023)

The answer is highly organized and logical, giving it a highly unnatural feel, which is undesirable.

6. ChatGPT Is Overly Detailed And Comprehensive

ChatGPT was trained in a way that rewarded the machine when humans were happy with the answer.

The human raters tended to prefer answers that had more details.

But sometimes, such as in a medical context, a direct answer is better than a comprehensive one.

What that means is that the machine needs to be prompted to be less comprehensive and more direct when those qualities are important.

From OpenAI:

“These issues arise from biases in the training data (trainers prefer longer answers that look more comprehensive) and well-known over-optimization issues.”

7. ChatGPT Lies (Hallucinates Facts)

The above-cited research paper, How Close is ChatGPT to Human Experts?, noted that ChatGPT has a tendency to lie.

It reports:

“When answering a question that requires professional knowledge from a particular field, ChatGPT may fabricate facts in order to give an answer…

For example, in legal questions, ChatGPT may invent some non-existent legal provisions to answer the question.

…Additionally, when a user poses a question that has no existing answer, ChatGPT may also fabricate facts in order to provide a response.”

The Futurism website documented instances where machine-generated content published on CNET was wrong and full of “dumb errors.”

CNET should have had an idea this could happen, because OpenAI published a warning about incorrect output:

“ChatGPT sometimes writes plausible-sounding but incorrect or nonsensical answers.”

CNET claims to have submitted the machine-generated articles to human review prior to publication.

A problem with human review is that ChatGPT content is designed to sound persuasively correct, which may fool a reviewer who is not a topic expert.

8. ChatGPT Is Unnatural Because It’s Not Divergent

The research paper, How Close is ChatGPT to Human Experts? also noted that human communication can have indirect meaning, which requires a shift in topic to understand it.

ChatGPT is too literal, which causes the answers to sometimes miss the mark because the AI overlooks the actual topic.

The researchers wrote:

“ChatGPT’s responses are generally strictly focused on the given question, whereas humans’ are divergent and easily shift to other topics.

In terms of the richness of content, humans are more divergent in different aspects, while ChatGPT prefers focusing on the question itself.

Humans can answer the hidden meaning under the question based on their own common sense and knowledge, but the ChatGPT relies on the literal words of the question at hand…”

Humans are better able to diverge from the literal question, which is important for answering “what about” type questions.

For example, if I ask:

“Horses are too big to be a house pet. What about raccoons?”

The above question is not asking if a raccoon is an appropriate pet. The question is about the size of the animal.

ChatGPT focuses on the appropriateness of the raccoon as a pet instead of focusing on the size.

Screenshot of an overly literal ChatGPT answer (Screenshot from ChatGPT, January 2023)

9. ChatGPT Contains A Bias Towards Being Neutral

The output of ChatGPT is generally neutral and informative. That neutrality is itself a bias in the output, one that can appear helpful but isn’t always.

The research paper we just discussed noted that neutrality is an unwanted quality when it comes to legal, medical, and technical questions.

Humans tend to pick a side when offering these kinds of opinions.

10. ChatGPT Is Biased To Be Formal

ChatGPT output has a bias that prevents it from loosening up and answering with ordinary expressions. Instead, its answers tend to be formal.

Humans, on the other hand, tend to answer questions with a more colloquial style, using everyday language and slang – the opposite of formal.

ChatGPT doesn’t use abbreviations like GOAT or TL;DR.

The answers also lack instances of irony, metaphors, and humor, which can make ChatGPT content overly formal for some content types.

The researchers write:

“…ChatGPT likes to use conjunctions and adverbs to convey a logical flow of thought, such as “In general”, “on the other hand”, “Firstly,…, Secondly,…, Finally” and so on.”

11. ChatGPT Is Still In Training

ChatGPT is currently still in the process of training and improving.

OpenAI recommends that all content generated by ChatGPT should be reviewed by a human, listing this as a best practice.

OpenAI suggests keeping humans in the loop:

“Wherever possible, we recommend having a human review outputs before they are used in practice.

This is especially critical in high-stakes domains, and for code generation.

Humans should be aware of the limitations of the system, and have access to any information needed to verify the outputs (for example, if the application summarizes notes, a human should have easy access to the original notes to refer back).”

Unwanted Qualities Of ChatGPT

It’s clear that there are many issues with ChatGPT that make it unfit for unsupervised content generation. It contains biases and fails to create content that feels natural or contains genuine insights.

Further, its inability to feel or author original thoughts makes it a poor choice for generating artistic expressions.

Users should apply detailed prompts in order to generate content that is better than the default content it tends to output.

Lastly, human review of machine-generated content is not always enough, because ChatGPT content is designed to appear correct, even when it’s not.

That means it’s important that human reviewers are subject-matter experts who can discern between correct and incorrect content on a specific topic.

Featured image by Shutterstock/fizkes



OpenAI To Show Content & Links In Response To Queries

ChatGPT takes step toward becoming a search engine

An OpenAI content deal will enhance ChatGPT with the ability to show real-time content with links in response to queries. OpenAI quietly took a step toward gaining more search engine-type functionality as part of a content licensing deal that may have positive implications for publishers and SEO.

Content Licensing Deal

OpenAI agreed to a content licensing deal with the Financial Times, a global news organization with offices in London, New York, and across continental Europe and Asia.

Content licensing deals between AI organizations and publishers are generally about getting access to high quality training data. The training data is then used by language models to learn connections between words and concepts. This deal goes far beyond that use.

ChatGPT Will Show Direct Quotes With Attribution

What makes this content licensing deal between The Financial Times and OpenAI notable is that it references giving attribution to content within ChatGPT.

The announced licensing deal explicitly mentions using the licensed content so that ChatGPT can directly quote it and provide links back to it.

Further, the licensing deal is intended to help improve ChatGPT’s “usefulness”, which is vague and can mean many things, but it takes on a slightly different meaning when used in the context of attributed answers.

The Financial Times agreement states that the licensing deal is for use in ChatGPT when it provides “attributed content” which is content with an attribution, commonly a link to where the content appeared.

This is the part of the announcement that references attributed content:

“The Financial Times today announced a strategic partnership and licensing agreement with OpenAI, a leader in artificial intelligence research and deployment, to enhance ChatGPT with attributed content, help improve its models’ usefulness by incorporating FT journalism, and collaborate on developing new AI products and features for FT readers. “

And this is the part of the announcement that mentions ChatGPT offering users attributed quotes and links:

“Through the partnership, ChatGPT users will be able to see select attributed summaries, quotes and links to FT journalism in response to relevant queries.”

The Financial Times Group CEO was even more explicit about OpenAI’s intention to show content and links in ChatGPT:

“This is an important agreement in a number of respects,” said FT Group CEO John Ridding. “It recognises the value of our award-winning journalism and will give us early insights into how content is surfaced through AI. …this partnership will help keep us at the forefront of developments in how people access and use information.

OpenAI understands the importance of transparency, attribution, and compensation…”

Brad Lightcap, COO of OpenAI, directly referenced showing real-time news content in ChatGPT, but more importantly, he referenced OpenAI exploring new ways to show content to its user base.

Lastly, the COO stated that they embraced disruption, which means innovation that creates a new industry or paradigm, usually at the expense of an older one, like search engines.

Lightcap is quoted:

“We have always embraced new technologies and disruption, and we’ll continue to operate with both curiosity and vigilance as we navigate this next wave of change.”

Showing direct quotes of Financial Times content with links to that content is very similar to how search engines work. This is a big change to how ChatGPT works and could be a sign of where ChatGPT is headed: functionality that incorporates online content with links back to the sources.

Something Else That Is Possibly Related

Someone on Twitter recently noticed a change related to “search” for ChatGPT.

This change involves an SSL security certificate that was added for a subdomain of ChatGPT.com. ChatGPT.com is a domain name that was snapped up by someone to capitalize on the 2022 announcement of ChatGPT by OpenAI. OpenAI eventually acquired the domain and it’s been redirecting to ChatGPT.

The change that was noticed is to the subdomain: search.chatgpt.com.

Big News For SEO and Publishers

This is significant news for publishers and search marketers: ChatGPT will become a source of valuable traffic if OpenAI takes it in the direction of providing attributed summaries and direct quotes.

How Can Publishers Get Traffic From ChatGPT?

Questions remain about attributed quotes with links in response to relevant queries. Here are five unknowns about ChatGPT attributed links.

  • Does this mean that only licensed content will be shown and linked to in ChatGPT?
  • Will ChatGPT incorporate and use most web data without licensing deals in the same way that search engines do?
  • Will OpenAI adopt an opt-in model where publishers can use a notation in robots.txt or in metadata to opt in to receiving traffic from ChatGPT? (See the sketch after this list.)
  • Would you opt in to receiving traffic from ChatGPT in exchange for allowing your content to be used for training?
  • How would the calculus for SEOs and publishers change if their competitors are all receiving traffic from ChatGPT?
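
On the robots.txt point above: OpenAI already publishes a crawler user agent, GPTBot, that publishers can allow or block in robots.txt today, although that works as an opt-out rather than the opt-in notation speculated about here. As a rough sketch (the domain and URLs are placeholders), a publisher could check what their current robots.txt tells GPTBot using Python’s standard library:

    from urllib.robotparser import RobotFileParser

    # Placeholder domain; point this at your own site's robots.txt.
    rp = RobotFileParser("https://www.example.com/robots.txt")
    rp.read()

    # GPTBot is OpenAI's published crawler user agent. Blocking it looks like:
    #   User-agent: GPTBot
    #   Disallow: /
    for url in ("https://www.example.com/", "https://www.example.com/premium/"):
        print(url, "->", "allowed" if rp.can_fetch("GPTBot", url) else "blocked")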

Read the original announcement:

Financial Times announces strategic partnership with OpenAI

Featured Image by Shutterstock/Photo For Everything

Google’s John Mueller On Website Recovery After Core Updates

John Mueller, a Google Search Advocate, provided guidance this week regarding the path forward for websites impacted by recent search algorithm updates.

The discussion was started on X (formerly Twitter) by SEO professional Thomas Jepsen.

Jepsen tagged Mueller, asking:

“Google has previously said Google doesn’t hold a grudge and sites will recover once issues have been solved. Is that still the case after HCU?”

Mueller’s response offered hope to site owners while being realistic about the challenges ahead.

Addressing Recovery Timelines

Mueller affirmed Google’s stance on not holding grudges, stating, “That’s still the case.”

However, he acknowledged the complexity of rankings, saying:

“…some things take much longer to be reassessed (sometimes months, at the moment), and some bigger effects require another update cycle.”

Mueller pointed to a Google help document explaining the nuances. The document reads:

“Broad core updates tend to happen every few months. Content that was impacted in Search or Discover by one might not recover—assuming improvements have been made—until the next broad core update is released.

Do keep in mind that improvements made by site owners aren’t a guarantee of recovery, nor do pages have any static or guaranteed position in our search results. If there’s more deserving content, that will continue to rank well with our systems.”

The Comments Sparking Debate

Jepsen probed further, asking, “Is a core update what’s needed for HCU-affected sites to recover (assuming they’ve fixed their issues)?”

Mueller’s response highlighted how situations can differ:

“It depends on the situation… I realize there’s a big space between the situations, but generalizing doesn’t help. Sometimes it takes a lot of work on the site, a long time, and an update.”

The thread grew as user @selectgame raised concerns about Google Discover traffic, to which Mueller replied:

“Google Discover is affected by core updates as well as other parts of Search (and there are more policies that apply to Discover).”

Growing Frustrations

Prominent industry figure Lily Ray voiced mounting frustrations, stating,

“…many HCU-affected websites – which have been making all kinds of improvements over the last 7 months – have only seen further declines with the March Core Update.

I have seen some sites lose 90% or more of their SEO visibility since the HCU, with the last few weeks being the nail in the coffin, despite making significant improvements.”

Ray continued:

“And in my professional opinion, many of these sites did not deserve anywhere near that level of impact, especially the further declines over the past month.”

Mueller hasn’t responded to Ray’s tweet at this time.

Looking Ahead

As the search community awaits Google’s next moves, the path to recovery appears arduous for many impacted by recent algorithm reassessments of “Helpful Content.”

Site improvements don’t guarantee immediate recovery, so publishers face an uphill battle guided only by Google’s ambiguous public advice.

Why SEJ Cares

The March 2024 core update has proven disastrous for many websites, with severe traffic losses persisting even after sites try to improve low-quality content, address technical issues, and realign with Google’s guidelines.

Having clear, actionable guidance from Google on recovering from core updates is invaluable.

As evidenced by the frustrations expressed, the current communications leave much to be desired regarding transparency and defining a straightforward recovery path.

How This Can Help You

While Mueller’s comments provide some insights, the key takeaways are:

  • Regaining previous rankings after an algorithm hit is possible if sufficient content/site quality improvements are made.
  • Recovery timelines can vary significantly and may require a future core algorithm update.
  • Even with enhancements, recovery isn’t guaranteed as rankings depend on the overall pool of competing content.

The path is undoubtedly challenging, but Mueller’s comments underscore that perseverance with substantial site improvements can eventually pay off.


FAQ

Can SEO professionals predict recovery time for a website hit by core updates?

SEO professionals can’t pinpoint when a site will recover after a core Google algorithm update.

Reasons for this include:

  • Google releases core updates every few months, so sites may need to wait for the next one.
  • It can take months for Google to reassess and adjust rankings.
  • How competitive the query is also impacts if and when a site recovers.

Does making site improvements after a core update ensure recovery in rankings and visibility?

After making improvements following a Google algorithm update, regaining your previous rankings isn’t guaranteed.

Reasons why include:

  • Your impacted content may not recover until the next core update, provided you’ve implemented enough site improvements.
  • Google’s search results are dynamic, and rankings can fluctuate based on the quality of competitor content.
  • There’s no fixed or guaranteed position in Google’s search results.

What is the relationship between Google Discover traffic and core search updates?

Google’s core algorithm updates that impact regular search results also affect Google Discover.

However, Google Discover has additional specific policies that determine what content appears there.

This means:

  • Improving your content and website quality can boost your visibility on Google Discover, just like regular searches.
  • You may see changes in your Discover traffic when Google rolls out core updates.
  • Your SEO and content strategy should account for potential impacts on regular searches and Google Discover.

Featured Image: eamesBot/Shutterstock



5 Things To Consider Before A Site Migration

How to successfully do a site migration

One of the scariest SEO tasks is a site migration because the stakes are so high and there are pitfalls at every step. Here are five tips that will help keep a site migration on track to a successful outcome.

Site Migrations Are Not One Thing

Site migrations are not one thing; they are actually different scenarios, and the only thing they have in common is that there is always something that can go wrong.

Here are examples of some of the different kinds of site migrations:

  • Migration to a new template
  • Migrating to a new web host
  • Merging two different websites
  • Migrating to a new domain name
  • Migrating to a new site architecture
  • Migrating to a new content management system (CMS)
  • Migrating to a new WordPress site builder

There are many ways a site can change and more ways for those changes to result in a negative outcome.

The following is not a site migration checklist. It’s five suggestions for things to consider.

1. Prepare For Migration: Download Everything

Rule number one is to prepare for the site migration. One of my big concerns is making sure the old version of the website is properly documented.

These are some of the ways to document a website:

  • Download the database and save it in at least two places. I like to have a backup of the backup stored on a second device.
  • Download all the website files. Again, I prefer to save a backup of the backup stored on a second device.
  • Crawl the site, save the crawl and export it as a CSV or an XML site map. I prefer to have redundant backups just in case something goes wrong.

An important thing to remember about downloading files by FTP is that there are two formats for downloading files: ASCII and Binary.

  1. Use ASCII for downloading files that contain code, like CSS, JS, PHP and HTML.
  2. Use Binary for media like images, videos and zip files.

Fortunately, most modern FTP software has an automatic setting that should be able to distinguish between the two kinds of files. A sad thing that can happen is downloading image files in ASCII format, which results in corrupted images.
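
If you are scripting the transfer rather than using an FTP client, the same ASCII/binary distinction applies. Here is a minimal sketch using Python’s standard ftplib (the hostname, credentials, and file names are placeholders):

    from ftplib import FTP

    # Placeholder credentials and file names for illustration only.
    with FTP("ftp.example.com") as ftp:
        ftp.login(user="username", passwd="password")

        # ASCII (text) mode for code files: CSS, JS, PHP, HTML.
        with open("style.css", "w", encoding="utf-8") as f:
            ftp.retrlines("RETR style.css", lambda line: f.write(line + "\n"))

        # Binary mode for media: images, videos, and zip archives.
        with open("logo.png", "wb") as f:
            ftp.retrbinary("RETR logo.png", f.write)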

So always check that your files downloaded properly and are not corrupted. If you have hired a third party to handle the migration, or a client is downloading the files, consider downloading a copy for yourself as well. That way, if their download fails, you’ll still have an uncorrupted copy backed up.

The most important rule about backups: You can never have too many backups!

2. Crawl The Website

Do a complete crawl of the website. Create a backup of the crawl. Then create a backup of the backup and store it on a separate hard drive.

After the site migration, this crawl data can be used to generate a new list for crawling the old URLs to identify any URLs that are missing (404), are failing to redirect, or are redirecting to the wrong webpage. Screaming Frog also has a list mode that can crawl a list of URLs saved in different formats, including as an XML sitemap, or pasted directly into a text field. This is a way to crawl a specific batch of URLs as opposed to crawling a site from link to link.
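
For a quick scripted spot check to complement a Screaming Frog list crawl, a small script can flag old URLs that return 404, fail to redirect, or redirect to the wrong destination. This is a minimal sketch using the third-party requests library, with a placeholder mapping of old URLs to their expected new URLs:

    import requests

    # Placeholder mapping: old URL -> the new URL it should redirect to.
    EXPECTED = {
        "https://www.example.com/old-page/": "https://www.example.com/new-page/",
        "https://www.example.com/old-post/": "https://www.example.com/blog/new-post/",
    }

    for old_url, expected_url in EXPECTED.items():
        resp = requests.get(old_url, allow_redirects=True, timeout=15)
        if resp.status_code == 404:
            print(f"MISSING      {old_url}")
        elif not resp.history:  # request was answered without any redirect
            print(f"NO REDIRECT  {old_url} (status {resp.status_code})")
        elif resp.url != expected_url:
            print(f"WRONG TARGET {old_url} -> {resp.url}")
        else:
            print(f"OK           {old_url} -> {resp.url} ({resp.history[0].status_code})")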

3. Tips For Migrating To A New Template

Website redesigns can be a major source of anguish when they go wrong. On paper, migrating a site to a new template should be a one-to-one change with minimal issues. In practice, that’s not always the case. For one, no template can be used off the shelf; it has to be modified to conform to what’s needed, which can mean removing and/or altering the code.

Search marketing expert Nigel Mordaunt (LinkedIn), who recently sold his search marketing agency, has experience migrating over a hundred sites and has important considerations for migrating to a new WordPress template.

This is Nigel’s advice:

“Check that all images have the same URL, alt text and image titles, especially if you’re using new images.

Templates sometimes have hard-coded heading elements, especially in the footer and sidebars. Those should be styled with CSS, not with H tags. I had this problem with a template once where the ranks had moved unexpectedly, then found that the Contact Us and other navigation links were all marked up to H2. I think that was more of a problem a few years ago. But still, some themes have H tags hard coded in places that aren’t ideal.

Make sure that all URLs are the exact same, a common mistake. Also, if planning to change content, check that the staging environment has been noindexed; then, after the site goes live, make sure that the newly uploaded live site no longer contains the noindex robots meta tag.

If changing content, be prepared for the site to perhaps be re-evaluated by Google. Depending on the size of the site, even if the changes are positive it may take several weeks to be rewarded, and in some cases several months. The client needs to be informed of this before the migration.

Also, check that analytics and tracking codes have been inserted into the new site, review all image sizes to make sure there are no new images that are huge and haven’t been scaled down. You can easily check the image sizes and heading tags with a post-migration Screaming Frog crawl. I can’t imagine doing any kind of site migration without Screaming Frog.”
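
One of the checks Nigel mentions, confirming that the live site no longer carries a noindex directive, is also easy to script. The following is a rough sketch (using the third-party requests library; the URL is a placeholder) that looks for noindex in both the X-Robots-Tag header and the robots meta tag:

    import re
    import requests

    def has_noindex(url: str) -> bool:
        """Return True if the page sends a noindex signal via header or meta tag."""
        resp = requests.get(url, timeout=15)
        if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
            return True
        # Crude regex check of the robots meta tag; a full audit would parse the HTML.
        meta = re.search(
            r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
            resp.text,
            re.IGNORECASE,
        )
        return bool(meta and "noindex" in meta.group(1).lower())

    print(has_noindex("https://www.example.com/"))  # should print False after go-live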

4. Advice For Migrating To A New Web Host

Mark Barrera (LinkedIn), VP SEO at Newfold Digital (parent company of Bluehost), had this to say about preparing for a migration to a new web host, starting with a pre-migration crawl:

“Thoroughly crawl your existing site to identify any indexing or technical SEO issues prior to the move.

Maintain URL Structure (If Possible): Changing URL structures can confuse search engines and damage your link equity. If possible, keep your URLs the same.

301 Redirects: 301 Redirects are your friend. Search engines need to be informed that your old content now lives at a new address. Implementing 301 redirects from any old URLs to their new counterparts preserves link equity and avoids 404 errors for both users and search engine crawlers.

Performance Optimization: Ensure your new host provides a fast and reliable experience. Site speed is important for user experience.

Be sure to do a final walkthrough of your new site before doing your actual cutover. Visually double-check your homepage, any landing pages, and your most popular search hits. Review any checkout/cart flows, comment/review chains, images, and any outbound links to your other sites or your partners.

SSL Certificate: A critical but sometimes neglected aspect of hosting migrations is the SSL certificate setup. Ensuring that your new host supports and correctly implements your existing SSL certificate—or provides a new one without causing errors is vital. SSL/TLS not only secures your site but also impacts SEO. Any misconfiguration during migration can lead to warnings in browsers, which deter visitors and can temporarily impact rankings.

Post migration, it’s crucial to benchmark server response times not just from one location, but regionally or globally, especially if your audience is international. Sometimes, a new hosting platform might show great performance in one area but lag in other parts of the world. Such discrepancies can affect page load times, influencing bounce rates and search rankings. “
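
One way to sanity-check the SSL setup Mark describes after the cutover is to pull the certificate from the new host and confirm that it validates and hasn’t expired. A minimal sketch using only Python’s standard library (the hostname is a placeholder):

    import socket
    import ssl

    HOST = "www.example.com"  # placeholder: your migrated domain

    # create_default_context() verifies the certificate chain and hostname, so a
    # handshake failure here usually means browsers will show a warning too.
    context = ssl.create_default_context()
    with socket.create_connection((HOST, 443), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=HOST) as tls:
            cert = tls.getpeercert()
            print("Issued to:", dict(pair[0] for pair in cert["subject"]).get("commonName"))
            print("Issued by:", dict(pair[0] for pair in cert["issuer"]).get("commonName"))
            print("Expires  :", cert["notAfter"])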

5. Accept Limitations

Ethan Lazuk, SEO Strategist & Consultant at Ethan Lazuk Consulting, LLC (LinkedIn, Twitter), offers an interesting perspective on site migrations: anticipating the limitations a client imposes on what you are able to do. It can be frustrating when a client pushes back on advice, and it’s important to listen to their reasons for doing so.

I have consulted over Zoom with companies whose SEO departments had concerns about what an external SEO wanted to do. Seeking a third party confirmation about a site migration plan is a reasonable thing to do. So if the internal SEO department has concerns about the plan, it’s not a bad idea to have a trustworthy third party take a look at it.

Ethan shared his experience:

“The most memorable and challenging site migrations I’ve been a part of involved business decisions that I had no control over.

As SEOs, we can create a smart migration plan. We can follow pre- and post-launch checklists, but sometimes, there are legal restrictions or other business realities behind the scenes that we have to work around.

Not having access to a DNS, being restricted from using a brand’s name or certain content, having to use an intermediate domain, and having to work days, weeks, or months afterward to resolve any issues once the internal business situations have changed are just a few of the tricky migration issues I’ve encountered.

The best way to handle these situations that require working around client restrictions is to button up the SEO tasks you can control, set honest expectations for how the business issues could impact performance after the migration, and stay vigilant with monitoring post-launch data and using it to advocate for resources you need to finish the job.”

Different Ways To Migrate A Website

Site migrations are a pain and should be approached with caution. I’ve done many different kinds of migrations for myself and have assisted clients with theirs. I’m currently moving thousands of webpages from a folder to the root, and it’s complicated by multiple redirects that have to be reconfigured; I’m not looking forward to it. But migrations are sometimes unavoidable, so it’s best to step up to the task after careful consideration.

Featured Image by Shutterstock/Krakenimages.com


