NEWS
Google’s Mueller on Keys to a Successful Site Migration
In a Google SEO Office-hours hangout, John Mueller was asked about handling a site migration in which one site had acquired another and the two websites were being merged into a single site on a new domain name.
The person asking the question wanted to know Mueller's top considerations for what to look out for.
John Mueller answered with two tips for how to conduct a site migration.
1. Track the “Before and After” URLs
His first suggestion is a good one. He said to track all URLs of the current websites before commencing the site migration.
Having a map of both sites gives you a list of URLs that can be uploaded to a tool like Screaming Frog. That list can then be used to find pages that don't have a 301 redirect to a new URL, as well as URLs that were overlooked and now return a 404 error response code.
John Mueller answered:
“I think the most important part is really to track the individual URLs, so that you have a clear map of what previously was and what it should be in the future.
And based on that, on the one hand to make sure that you have all of the redirects set up properly.
So the various tools that you can use to kind of submit the list of the old URLs and check to see that they redirect to the right new ones.”
Screaming Frog handles this task easily. In the top navigation bar, select Mode > List. A new set of buttons will appear to the right, including an Upload button from which you can upload a file, enter URLs manually, paste URLs, or download an XML sitemap.
Screaming Frog will then report on the redirects and other factors just for those URLs. I’ve used this for corporate site migrations for a similar scenario where a multinational organization purchased another company then absorbed the URLs into the bigger company’s website.
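The same before-and-after check can also be scripted outside of a crawler. Below is a minimal sketch that compares an old-to-new URL map against the status codes and final destinations observed in a list-mode crawl export; the data structures and URLs are invented for illustration.

```python
# Minimal sketch: audit a site-migration redirect map.
# `redirect_map` and `crawled` stand in for real crawl-export data;
# all URLs used with this function are invented for illustration.

def audit_redirects(redirect_map, crawled):
    """Sort old URLs into buckets of migration problems.

    redirect_map: {old_url: intended_new_url}
    crawled:      {old_url: (status_code, final_url)} from a list-mode crawl
    """
    issues = {"missing": [], "not_found": [], "wrong_target": []}
    for old_url, intended in redirect_map.items():
        if old_url not in crawled:
            issues["missing"].append(old_url)       # overlooked in the crawl
            continue
        status, final_url = crawled[old_url]
        if status == 404:
            issues["not_found"].append(old_url)     # no redirect in place
        elif final_url != intended:
            issues["wrong_target"].append(old_url)  # redirects to the wrong page
    return issues
```

Feeding the before-and-after map into a check like this quickly surfaces pages that were overlooked, left returning 404s, or redirected to the wrong destination.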
2. Migrating Internal Linking
This second tip is a fundamental aspect of a site migration and something that Screaming Frog can be useful for tracking before a site migration.
A full site crawl can reveal the internal linking structure for every web page and this information can be exported into a spreadsheet.
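That internal-link inventory can also be approximated with a small script. Here is a minimal sketch using only the Python standard library that pulls internal links out of a page's HTML; the example URLs are placeholders, and a real audit would run this over every fetched page.

```python
# Sketch: extract internal links from one page's HTML so the internal
# linking structure can be reviewed before a migration. Pages are
# assumed to be fetched elsewhere; URLs here are placeholders.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class InternalLinkParser(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.site = urlparse(base_url).netloc
        self.internal_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        # Resolve relative links, then keep only same-site URLs.
        absolute = urljoin(self.base_url, href)
        if urlparse(absolute).netloc == self.site:
            self.internal_links.append(absolute)

def internal_links(html, base_url):
    parser = InternalLinkParser(base_url)
    parser.feed(html)
    return parser.internal_links
```

Exporting the results per page gives the same kind of spreadsheet view a crawler would produce.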
In general, though, old content from one site is often redirected to (merged with) existing content. In that scenario, the internal linking structure of the page that remains is preserved (unless new pages are added).
I think the key to making the transition work is that old content is redirected to pages that are substantially similar. The old page that is going away should redirect to a page on the new site that covers substantially the same topic.
If there's no suitable match for the old page to redirect to, then in general do not redirect that page to the home page. Google will treat that as a soft 404. In that circumstance it's best to just let the page return a 404.
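A simple automated check for this anti-pattern is to flag any old URL whose redirect chain ends at the home page. A minimal sketch, where the home URL is a placeholder for your actual site:

```python
from urllib.parse import urlparse

def likely_soft_404(final_url, home_url="https://new.example/"):
    """Flag a redirect target that lands on the home page.

    Google tends to treat old-page-to-home-page redirects as soft 404s,
    so flagged URLs are usually better served with a real 404 or 410.
    home_url is a placeholder for the site's actual home page.
    """
    final = urlparse(final_url)
    home = urlparse(home_url)
    # Same host and an empty or root path means the redirect landed home.
    return final.netloc == home.netloc and final.path in ("", "/")
```

Running this over the final destinations from a crawl export highlights redirects that should probably be replaced with 404/410 responses.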
John Mueller on Twitter About 404s to Home Pages
Google’s Search Central page outlines the best practice for handling a web page that no longer exists.
Google describes how to handle a page that has no clear replacement:
“If your page is no longer available, and has no clear replacement, it should return a 404 (not found) or 410 (Gone) response code. Either code clearly tells both browsers and search engines that the page doesn’t exist. You can also display a custom 404 page to the user, if appropriate: for example, a page containing list of your most popular pages, or a link to your home page.”
This is what John Mueller said:
“The other thing I would watch out for is all of the internal linking, so that you really make sure that all of the internal signals that you have as well that they’re forwarded to whatever new URLs.
Because what sometimes happens or what I’ve sometimes seen with these kind of restructurings is that you redirect the URLs, you move them over but you forget to set the rel canonical, you forget to set the links in the navigation, or in the footer somewhere.
And all of those other signals there, they wouldn’t necessarily break the navigation. But they make it a lot harder for us to pick the new URLs as canonicals.
So that’s kind of the effect that you would see there. It’s not so much that it would stop ranking but it’s more that we would just keep the old URLs for much longer than we actually need to.”
Site Migrations Can Feel Scary
There are many anecdotes of site migrations resulting in lost rankings. Usually there is a blip in traffic as Google figures out where everything should rank. But a site migration will generally turn out fine as long as web pages on the old site are redirected to pages on the new site that are substantially the same.
Trying to trick Google into sending PageRank to a page that is substantially different might actually confuse Google about what the page is about and backfire. So if there is no suitable page to redirect to then it’s best to return a 404 error response code.
Site migrations can turn out well as long as they’re done in a sensible manner that preserves the meanings of the old pages within the similar new pages.
Citation
Watch Google's Mueller discuss site migrations at about the 18-minute mark.
Site Migrations – 18:26
OpenAI Introduces Fine-Tuning for GPT-4, Enabling Customized AI Models
OpenAI has today announced the release of fine-tuning capabilities for its flagship GPT-4 large language model, marking a significant milestone in the AI landscape. This new functionality empowers developers to create tailored versions of GPT-4 to suit specialized use cases, enhancing the model’s utility across various industries.
Fine-tuning has long been a desired feature for developers who require more control over AI behavior, and with this update, OpenAI delivers on that demand. The ability to fine-tune GPT-4 allows businesses and developers to refine the model’s responses to better align with specific requirements, whether for customer service, content generation, technical support, or other unique applications.
Why Fine-Tuning Matters
GPT-4 is a very flexible model that can handle many different tasks. However, some businesses and developers need more specialized AI that matches their specific language, style, and needs. Fine-tuning helps with this by letting them adjust GPT-4 using custom data. For example, companies can train a fine-tuned model to keep a consistent brand tone or focus on industry-specific language.
Fine-tuning also offers improvements in areas like response accuracy and context comprehension. For use cases where nuanced understanding or specialized knowledge is crucial, this can be a game-changer. Models can be taught to better grasp intricate details, improving their effectiveness in sectors such as legal analysis, medical advice, or technical writing.
Key Features of GPT-4 Fine-Tuning
The fine-tuning process leverages OpenAI’s established tools, but now it is optimized for GPT-4’s advanced architecture. Notable features include:
- Enhanced Customization: Developers can precisely influence the model’s behavior and knowledge base.
- Consistency in Output: Fine-tuned models can be made to maintain consistent formatting, tone, or responses, essential for professional applications.
- Higher Efficiency: Compared to training models from scratch, fine-tuning GPT-4 allows organizations to deploy sophisticated AI with reduced time and computational cost.
Additionally, OpenAI has emphasized ease of use with this feature. The fine-tuning workflow is designed to be accessible even to teams with limited AI experience, reducing barriers to customization. For more advanced users, OpenAI provides granular control options to achieve highly specialized outputs.
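To make the workflow concrete: fine-tuning data for OpenAI's chat models is supplied as a JSONL file in which each line is one example conversation. Below is a minimal sketch that writes such a file; the brand-voice conversations are invented placeholders, not a real dataset.

```python
import json

# Each line of the JSONL file is one training conversation in the
# chat format (system / user / assistant messages).
# All content below is invented placeholder data.
examples = [
    {"messages": [
        {"role": "system", "content": "You are Acme's support assistant. Be concise and friendly."},
        {"role": "user", "content": "How do I reset my password?"},
        {"role": "assistant", "content": "Sure! Go to Settings > Security and choose 'Reset password'."},
    ]},
    {"messages": [
        {"role": "system", "content": "You are Acme's support assistant. Be concise and friendly."},
        {"role": "user", "content": "Do you offer refunds?"},
        {"role": "assistant", "content": "Yes, within 30 days of purchase. I can start one for you now."},
    ]},
]

with open("training_data.jsonl", "w", encoding="utf-8") as f:
    for example in examples:
        f.write(json.dumps(example) + "\n")
```

The resulting file would then be uploaded via OpenAI's Files API and referenced when creating a fine-tuning job; the exact endpoints and parameters are covered in OpenAI's documentation.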
Implications for the Future
The launch of fine-tuning capabilities for GPT-4 signals a broader shift toward more user-centric AI development. As businesses increasingly adopt AI, the demand for models that can cater to specific business needs, without compromising on performance, will continue to grow. OpenAI’s move positions GPT-4 as a flexible and adaptable tool that can be refined to deliver optimal value in any given scenario.
By offering fine-tuning, OpenAI not only enhances GPT-4’s appeal but also reinforces the model’s role as a leading AI solution across diverse sectors. From startups seeking to automate niche tasks to large enterprises looking to scale intelligent systems, GPT-4’s fine-tuning capability provides a powerful resource for driving innovation.
OpenAI announced that fine-tuning GPT-4o will cost $25 for every million tokens used during training. After the model is set up, it will cost $3.75 per million input tokens and $15 per million output tokens. To help developers get started, OpenAI is offering 1 million free training tokens per day for GPT-4o and 2 million free tokens per day for GPT-4o mini until September 23. This makes it easier for developers to try out the fine-tuning service.
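Those figures make cost estimates straightforward arithmetic. A quick sketch using the per-million-token rates quoted above; the token counts in the example are invented for illustration.

```python
# Rates quoted above, in dollars per million tokens.
TRAIN_PER_M = 25.00
INPUT_PER_M = 3.75
OUTPUT_PER_M = 15.00

def estimate_cost(training_tokens, input_tokens, output_tokens, free_training_tokens=0):
    """Rough dollar cost; free_training_tokens models the daily free allowance."""
    billable_training = max(0, training_tokens - free_training_tokens)
    return (billable_training * TRAIN_PER_M
            + input_tokens * INPUT_PER_M
            + output_tokens * OUTPUT_PER_M) / 1_000_000

# Example: 3M training tokens (1M covered by the free allowance),
# then 2M input and 0.5M output tokens of usage.
cost = estimate_cost(3_000_000, 2_000_000, 500_000, free_training_tokens=1_000_000)  # 65.0
```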
As AI continues to evolve, OpenAI’s focus on customization and adaptability with GPT-4 represents a critical step in making advanced AI accessible, scalable, and more aligned with real-world applications. This new capability is expected to accelerate the adoption of AI across industries, creating a new wave of AI-driven solutions tailored to specific challenges and opportunities.
This Week in Search News: Simple and Easy-to-Read Update
Here’s what happened in the world of Google and search engines this week:
1. Google’s June 2024 Spam Update
Google finished rolling out its June 2024 spam update over a period of seven days. This update aims to reduce spammy content in search results.
2. Changes to Google Search Interface
Google has removed the continuous scroll feature for search results. Instead, it’s back to the old system of pages.
3. New Features and Tests
- Link Cards: Google is testing link cards at the top of AI-generated overviews.
- Health Overviews: There are more AI-generated health overviews showing up in search results.
- Local Panels: Google is testing AI overviews in local information panels.
4. Search Rankings and Quality
- Improving Rankings: Google said it can improve its search ranking system but will only do so on a large scale.
- Measuring Quality: Google’s Elizabeth Tucker shared how they measure search quality.
5. Advice for Content Creators
- Brand Names in Reviews: Google advises that there is no need to avoid mentioning brand names in review content.
- Fixing 404 Pages: Google explained when it’s important to fix 404 error pages.
6. New Search Features in Google Chrome
Google Chrome for mobile devices has added several new search features to enhance user experience.
7. New Tests and Features in Google Search
- Credit Card Widget: Google is testing a new widget for credit card information in search results.
- Sliding Search Results: When making a new search query, the results might slide to the right.
8. Bing’s New Feature
Bing is now using AI to write “People Also Ask” questions in search results.
9. Local Search Ranking Factors
Menu items and popular times might be factors that influence local search rankings on Google.
10. Google Ads Updates
- Query Matching and Brand Controls: Google Ads updated its query matching and brand controls, and advertisers are happy with these changes.
- Lead Credits: Google will automate lead credits for Local Service Ads. Google says this is a good change, but some advertisers are worried.
- tROAS Insights Box: Google Ads is testing a new insights box for tROAS (Target Return on Ad Spend) in Performance Max and Standard Shopping campaigns.
- WordPress Tag Code: There is a new conversion code for Google Ads on WordPress sites.
These updates highlight how Google and other search engines are continuously evolving to improve user experience and provide better advertising tools.
Facebook Faces Yet Another Outage: Platform Encounters Technical Issues Again
Updated: It seems that today’s issues with Facebook haven’t affected as many users as the last time. A smaller group of people appears to be impacted this time around, which is a relief compared to the larger incident before. Nevertheless, it’s still frustrating for those affected, and hopefully, the issues will be resolved soon by the Facebook team.
Facebook had another problem today (March 20, 2024). According to Downdetector, a website that shows when other websites are not working, many people had trouble using Facebook.
This isn’t the first time Facebook has had issues. Just a little while ago, another problem stopped people from using the site. Today, when people tried to use Facebook, it didn’t work as it should. People couldn’t see their friends’ posts, and sometimes the website wouldn’t even load.
Downdetector, which watches out for problems on websites, showed that lots of people were having trouble with Facebook. People from all over the world said they couldn’t use the site, and they were not happy about it.
When websites like Facebook have problems, it affects a lot of people. It’s not just about not being able to see posts or chat with friends. It can also impact businesses that use Facebook to reach customers.
Since Facebook owns Messenger and Instagram, the problems with Facebook also meant that people had trouble using these apps. It made the situation even more frustrating for many users, who rely on these apps to stay connected with others.
This recent problem makes one thing obvious: the internet is always changing, and even big websites like Facebook can have problems. While people wait for Facebook to fix the issue, it shows how easily things online can go wrong. It’s a good reminder to have backup plans for staying connected online, just in case something like this happens again.