The Role of SEO in Mergers and Acquisitions

SEOs have a lot to offer companies during the merger and acquisition (M&A) process. They can help identify acquisition targets, perform due diligence, assist with valuation and risk identification, spot future opportunities, work with teams on website migrations, monitor migration progress, and train new teams in best practices.

Acquisitions can have a dramatic impact on your search visibility. Long ago, one of the main competitors of an engineering company I was working with shut down. I asked about acquiring their website and managed to get it for a few hundred dollars. Needless to say, this led to a significant amount of new leads and business growth as we merged the two websites.

Another time, I managed to snag the expired domain of what was the number one HVAC company in my local market. They were consolidating several service companies into one new brand and they let the domain expire. I redirected this to the website of a client who was fairly new in the market, and they saw top rankings for many of their main terms practically overnight.

While this type of acquisition isn’t common for small companies, it is business as usual for larger companies. If you’ve done enterprise SEO, you might have helped with a few of these mergers and acquisitions.

Large companies acquire lots of other companies. Many companies have dedicated Wikipedia pages that tell you about all their acquisitions. For example, Alphabet (Google’s parent company) has a list of 257 acquired companies.

Let’s look at how SEOs can help with mergers and acquisitions.

Due diligence

Once you enter a period of exclusivity, where the companies are negotiating only with each other, it’s time to take a more in-depth look and do due diligence.

SEOs will evaluate the target company’s online presence and SEO strategies. Many of the things we looked at before, like traffic, rankings, backlinks, forecasts, etc., will all be looked at. Any positive or negative items can be noted to help determine the value, potential, and risks of a website.

One additional report you may want to look at is the Opportunities report. You can go through this to see what kind of potential a website has to rank better.

Use the Opportunities report to see the SEO potential of sites

Companies may have more than one domain, so you might have to check a few different websites during this process.

Whether you choose to merge domains usually comes down to whether you want more listings or one listing that potentially ranks higher. That can depend a lot on your current rankings and the resources you have available to maintain your web presence. Or company policy may simply require a merge.

Many news sites choose to run the websites on separate domains. Both sites can show in Google News and in organic search results multiple times for the same stories or affiliate content targeting the same terms.

Businesses will often run the websites separately for a while but tend to merge the websites eventually. You may see this happen in several stages:

  1. The acquired company adds a tag of “a xxx company” on the current domain.
  2. The acquired domain is migrated to the main company’s domain with the same branding.
  3. The acquired company is rebranded and more integrated with the product or offering of the main company.

Check out our guide on website migrations to see what it takes to migrate a site successfully.

Some of the main things that can cause traffic loss during migrations are failing to do redirects and killing off content. I’ll show you how to check these in the next section.

You also need to make sure that you support older brand names in some way. People in the market may keep searching for these terms for years, and you don’t want to lose that valuable search traffic to another website that ranks instead of your own!

You’ll also want to make sure your TLS certificate (what allows HTTPS to work) covers both domains. If it doesn’t, users may get an error and never be forwarded to the new site, even if you have a redirect in place.
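As a rough illustration of why this matters, here’s a minimal sketch of the hostname check browsers perform against a certificate’s Subject Alternative Names (SANs). The domain names and `san_entries` below are hypothetical, not from any real certificate:

```python
# Simplified sketch: check whether a hostname is covered by a certificate's
# Subject Alternative Names. A cert issued only for the new domain won't
# cover the old one, so the redirect's first HTTPS hop fails with a cert error.

def hostname_covered(hostname: str, san_entries: list[str]) -> bool:
    """Return True if hostname matches any SAN entry (supports single-label
    wildcards like *.example.com, per common browser behavior)."""
    host_labels = hostname.lower().split(".")
    for san in san_entries:
        san_labels = san.lower().split(".")
        if san_labels == host_labels:
            return True  # exact match
        # A wildcard covers exactly one leading label: *.example.com matches
        # www.example.com but not example.com or a.b.example.com.
        if (
            san_labels[0] == "*"
            and len(san_labels) == len(host_labels)
            and san_labels[1:] == host_labels[1:]
        ):
            return True
    return False

# A cert issued only for the new brand won't cover the old domain:
new_cert_sans = ["newbrand.com", "*.newbrand.com"]
print(hostname_covered("www.newbrand.com", new_cert_sans))  # True
print(hostname_covered("oldbrand.com", new_cert_sans))      # False
```

If the old domain isn’t in the SAN list, you’ll need either a certificate that covers it or a separate certificate served on the old domain for the redirect hop.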

The easiest way to check for any major drops is to create a Portfolio with the old domain and the new path or pages on the site. Then you can use the Site Explorer Overview report to look for any major traffic drops and use the compare mode in any of the other reports, like Organic Keywords, to zero in on where traffic may have been lost.

Portfolio view of two websites that merged

Depending on the setup, you may be able to just add the old domain as a competitor in the overview report to see how the migration went.

Add merged website as a competitor to check for any traffic drops

You’ll want to check the old URLs to make sure redirects were done, and all the content was migrated successfully.

To get a list of your most linked URLs, you can use the Best by links report in Site Explorer.

Best by Links report in Ahrefs' Site Explorer

You can upload that list as a custom list in Site Audit in the URL sources tab. Alternatively, you could just select Backlinks as the source in this tab. I would remove any other crawl sources for this use case.

Adding most linked URLs as a custom URL list in Site Audit

We’ll then crawl all the URLs with links. In Page Explorer, you can customize the table to include things like Redirect URL, Redirect status code, Final redirect URL, and Final redirect status code to get an easy view of all the redirects that are happening.

Redirects in Site Audit's Page Explorer

Make sure your redirects are 301 or 308 rather than 302 or 307 status codes if you are doing a permanent move and want URLs indexed on the new website instead of the old one.
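The status-code rule above can be sketched as a small audit helper. The URLs and observed codes below are hypothetical; in practice you’d collect real status codes with a crawler or HTTP client:

```python
# Minimal sketch: flag temporary redirects (302/307) that should be
# permanent (301/308) after a migration, plus pages with no redirect at all.

PERMANENT = {301, 308}
TEMPORARY = {302, 307}

def audit_redirect(status_code: int) -> str:
    """Classify an observed status code for a permanent site move."""
    if status_code in PERMANENT:
        return "ok: permanent redirect"
    if status_code in TEMPORARY:
        return "warn: temporary redirect - old URL may stay indexed"
    if status_code == 200:
        return "warn: no redirect in place"
    return f"check: unexpected status {status_code}"

# Hypothetical results from crawling the old domain's most linked URLs:
observed = {
    "https://old.example.com/services": 301,
    "https://old.example.com/about": 302,
    "https://old.example.com/contact": 200,
}
for url, code in observed.items():
    print(url, "->", audit_redirect(code))
```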

You should monitor the renewal of the old domains as well. You wouldn’t want a competitor registering them or for the site to be repurposed into something more nefarious.

SEOs can also help with the transfer of knowledge and best practices between companies, from documentation to training sessions for the new teams.

Final thoughts

Even if you weren’t involved in the original migration process, it’s worth auditing past company acquisitions to see if any value was left on the table. Look for redirects that weren’t done, content that wasn’t migrated, and so on. In my experience, there’s a lot of value in cleaning up after these old acquisitions.

If you have questions, message me on X or LinkedIn.




Google Hints At Improving Site Rankings In Next Update

Google’s John Mueller says the Search team is “explicitly evaluating” how to reward sites that produce helpful, high-quality content when the next core update rolls out.

The comments came in response to a discussion on X about the impact of March’s core update and September’s helpful content update.

In a series of tweets, Mueller acknowledged the concerns, stating:

“I imagine for most sites strongly affected, the effects will be site-wide for the time being, and it will take until the next update to see similar strong effects (assuming the new state of the site is significantly better than before).”

He added:

“I can’t make any promises, but the team working on this is explicitly evaluating how sites can / will improve in Search for the next update. It would be great to show more users the content that folks have worked hard on, and where sites have taken helpfulness to heart.”

What Does This Mean For SEO Professionals & Site Owners?

Mueller’s comments confirm Google is aware of critiques about the March core update and is refining its ability to identify high-quality sites and reward them appropriately in the next core update.

For websites, clearly demonstrating an authentic commitment to producing helpful and high-quality content remains the best strategy for improving search performance under Google’s evolving systems.

The Aftermath Of Google’s Core Updates

Google’s algorithm updates, including the September “Helpful Content Update” and the March 2024 update, have far-reaching impacts on rankings across industries.

While some sites experienced surges in traffic, others faced substantial declines, with some reporting visibility losses of up to 90%.

As website owners implement changes to align with Google’s guidelines, many question whether their efforts will be rewarded.

There’s genuine concern about the potential for long-term or permanent demotions for affected sites.

Recovery Pathway Outlined, But Challenges Remain

In a previous statement, Mueller acknowledged the complexity of the recovery process, stating that:

“some things take much longer to be reassessed (sometimes months, at the moment), and some bigger effects require another update cycle.”

Mueller clarified that not all changes would require a new update cycle but cautioned that “stronger effects will require another update.”

While affirming that permanent changes are “not very useful in a dynamic world,” Mueller adds that “recovery” implies a return to previous levels, which may be unrealistic given evolving user expectations.

“It’s never ‘just-as-before’,” Mueller stated.

Improved Rankings On The Horizon?

Despite the challenges, Mueller has offered glimmers of hope for impacted sites, stating:

“Yes, sites can grow again after being affected by the ‘HCU’ (well, core update now). This isn’t permanent. It can take a lot of work, time, and perhaps update cycles, and/but a different – updated – site will be different in search too.”

He says the process may require “deep analysis to understand how to make a website relevant in a modern world, and significant work to implement those changes — assuming that it’s something that aligns with what the website even wants.”

Looking Ahead

Google’s search team is actively working on improving site rankings and addressing concerns with the next core update.

However, recovery requires patience, thorough analysis, and persistent effort.

The best way to spend your time until the next update is to remain consistent and produce the most exceptional content in your niche.


FAQ

How long does it generally take for a website to recover from the impact of a core update?

Recovery timelines can vary and depend on the extent and type of updates made to align with Google’s guidelines.

Google’s John Mueller noted that some changes might be reassessed quickly, while more substantial effects could take months and require additional update cycles.

Google acknowledges the complexity of the recovery process, indicating that significant improvements aligned with Google’s quality signals might be necessary for a more pronounced recovery.

What impact did the March and September updates have on websites, and what steps should site owners take?

The March and September updates had widespread effects on website rankings, with some sites experiencing traffic surges while others faced up to 90% visibility losses.

Publishing genuinely useful, high-quality content is key for website owners who want to bounce back from a ranking drop or maintain strong rankings. Stick to Google’s recommendations and adapt as they keep updating their systems.

To minimize future disruptions from algorithm changes, it’s a good idea to review your whole site thoroughly and build a content plan centered on what your users want and need.

Is it possible for sites affected by core updates to regain their previous ranking positions?

Sites can recover from the impact of core updates, but it requires significant effort and time.

Mueller suggested that recovery might happen over multiple update cycles and involves a deep analysis to align the site with current user expectations and modern search criteria.

While a return to previous levels isn’t guaranteed, sites can improve and grow by continually enhancing the quality and relevance of their content.




Google Reveals Two New Web Crawlers


Google revealed details of two new crawlers that are optimized for scraping image and video content for “research and development” purposes. Although the documentation doesn’t explicitly say so, it’s presumed that blocking these new crawlers has no impact on rankings.

It should be noted that the data scraped by these crawlers is not explicitly for AI training; that’s what the Google-Extended token is for.

GoogleOther Crawlers

The two new crawlers are variants of Google’s GoogleOther crawler, which launched in April 2023. The original GoogleOther crawler was also designated for use by Google product teams for research and development in one-off crawls, and its description offers clues about what the new GoogleOther variants will be used for.

The purpose of the original GoogleOther crawler is officially described as:

“GoogleOther is the generic crawler that may be used by various product teams for fetching publicly accessible content from sites. For example, it may be used for one-off crawls for internal research and development.”

Two GoogleOther Variants

There are two new GoogleOther crawlers:

  • GoogleOther-Image
  • GoogleOther-Video

The new variants are for crawling binary data, which is data that’s not text. HTML is a text format (ASCII or Unicode): if a file can be read in a text editor, it’s a text file. Binary files are files that can’t be opened in a text viewer, such as image, audio, and video files.

The new GoogleOther variants are for image and video content. Google lists user agent tokens for both of the new crawlers which can be used in a robots.txt for blocking the new crawlers.

1. GoogleOther-Image

User agent tokens:

  • GoogleOther-Image
  • GoogleOther

Full user agent string:

GoogleOther-Image/1.0

2. GoogleOther-Video

User agent tokens:

  • GoogleOther-Video
  • GoogleOther

Full user agent string:

GoogleOther-Video/1.0
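Using the tokens above, a robots.txt opt-out might look like the following sketch (which paths you disallow is up to you):

```
# Block only the new image and video research crawlers:
User-agent: GoogleOther-Image
Disallow: /

User-agent: GoogleOther-Video
Disallow: /

# Or block the whole GoogleOther family with the shared token:
# User-agent: GoogleOther
# Disallow: /
```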

Newly Updated GoogleOther User Agent Strings

Google also updated the user agent strings for the regular GoogleOther crawler. For blocking purposes, you can continue using the same user agent token as before (GoogleOther). The new user agent strings are just the data sent to servers to identify the crawler in full, in particular the browser technology used. In this case that technology is Chrome, with the version number periodically updated to reflect which version is in use (W.X.Y.Z is a Chrome version number placeholder in the examples listed below).

The full list of GoogleOther user agent strings:

  • Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; GoogleOther)
  • Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; GoogleOther) Chrome/W.X.Y.Z Safari/537.36

GoogleOther Family Of Bots

These new bots may show up in your server logs from time to time. This information will help you identify them as genuine Google crawlers, and it will help publishers who want to opt out of having their images and videos scraped for research and development purposes.

Read the updated Google crawler documentation

GoogleOther-Image

GoogleOther-Video



ChatGPT To Surface Reddit Content Via Partnership With OpenAI


Reddit partners with OpenAI to integrate content into ChatGPT.

  • Reddit and OpenAI announce a partnership.
  • Reddit content will be used in ChatGPT.
  • Concerns about accuracy of Reddit user-generated content.
