Google Penalties: The Newbie-Friendly Guide

A penalty from Google is every webmaster’s worst nightmare. But even if you get one, all is not lost.
Thankfully, we have never dealt with penalties on Ahrefs’ blog, so for this post I reached out to SEO experts who deal with Google penalties in their professional work.
Let’s rock!
In simple terms, a penalty is a “punishment” manually imposed on a website by Google’s webspam team.
This generally happens when a website violates Google’s quality guidelines. A penalty results in a dramatic drop in rankings and a loss of organic traffic. It’s worth mentioning that the negative effects of Google’s algorithm updates must not be mistaken for a penalty.
Google does not use the term “penalties” in its documentation. Instead, it calls them manual actions and algorithmic actions.
If your organic rankings and traffic suddenly drop while your website has no downtimes or technical SEO issues, you may be facing one of two things:
- Your website was reviewed by a human and got a manual action.
- A Google update resulted in an algorithmic action demoting your website.
Google’s mission is to present the best results to its users. Its quality guidelines explain what it expects from websites. Any attempt to manipulate its ranking factors or abuse its quality guidelines is called spam.
Every day, Google discovers around 40 billion spammy pages. Here’s how it fights spam.
Google has systems that can detect spam at the “crawling” stage. As a result, a large number of pages considered spammy won’t even make it to Google’s index.
The content included in its index is double-checked for spam. Such low-quality content may be shown in the search results, but its search rankings will be low.
As explained by Google, these AI-aided automated systems provide 99% protection against spam. Manual actions come into play for the remaining 1%.
Google has a vast army of human reviewers who can impose manual actions on websites.
In 2020, they sent 2.9 million messages to site owners in Search Console related to manual webspam actions.
Sidenote.
Manual actions are imposed and lifted by an in-house webspam team at Google. Don’t confuse this team with the quality raters, who only help Google evaluate changes to its algorithms and are not privy to any insights regarding the inner workings of Google Search.
Whenever a website gets a manual action, the issue will always be visible under “Security & Manual Actions” in Google Search Console. It looks something like this.

With such a message, you know the problem and can address it directly.
Manual actions imposed on a website will result in pages being ranked lower or omitted from the search results completely.
But diagnosing the effect of an algorithmic adjustment is much more challenging, as you will get zero notifications from Google.
The best way to identify an algorithmic action is to look at your Google organic traffic and see if you have a drop that coincides with a known or suspected algorithm update.
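If you keep a daily export of your organic traffic (from Google Search Console or Ahrefs), you can run this check programmatically. Below is a minimal Python sketch; the CSV column names and the update list are assumptions, so adjust them to your own data:

```python
import csv
from datetime import date, timedelta

# Known or suspected update dates to test against (assumed list).
KNOWN_UPDATES = [date(2020, 12, 3)]  # e.g., the December 2020 core update

def traffic_by_day(path: str) -> dict[date, int]:
    """Load a daily export with 'date' and 'organic_traffic' columns (assumed names)."""
    with open(path, newline="", encoding="utf-8") as f:
        return {date.fromisoformat(row["date"]): int(row["organic_traffic"])
                for row in csv.DictReader(f)}

def change_around(traffic: dict[date, int], day: date, window: int = 7) -> float:
    """Percent change in total traffic for the week after an update vs. the week before."""
    before = sum(traffic.get(day - timedelta(d), 0) for d in range(1, window + 1))
    after = sum(traffic.get(day + timedelta(d), 0) for d in range(1, window + 1))
    return (after - before) / max(before, 1) * 100

traffic = traffic_by_day("organic_traffic.csv")
for update in KNOWN_UPDATES:
    print(f"{update}: {change_around(traffic, update):+.1f}%")
```

A sharp negative change clustered around a known update date is the signal to investigate further.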
You can search through webmaster forums or Twitter to see if other webmasters are facing similar issues. Google Search Central Help Community, for example, is an excellent place to start.
With Ahrefs, you can check if other websites in your industry are losing their rankings too.
Let’s use our blog as an example:

On Dec. 3, 2020, a new core algorithm update was rolled out. Notably, Ahrefs’ blog has the strongest focus on content quality; we have never bought a single backlink. So this update was a slap in the face for us.
You should also note that algorithm updates don’t only demote low-quality or spammy websites; they also promote high-quality sites. So even if there’s nothing “wrong” with your website, you can find other sites outranking you after the next core update.
If you’re ‘hit’ by a core update, you shouldn’t think of it as a penalty. You might not be doing anything wrong at all; it might just be that someone else is doing something better.
There are two types of actions that can be displayed on the “Manual actions” page. These are:
- Manual actions that affect the entire site.
- Manual actions that only affect a specific URL or section of a site.
Manual actions can be anything from quite broad to quite specific, and very fluid in between. The goal is always to neutralize the issue, and sometimes that’s easy to isolate & do, other times it’s much harder, so we end up taking broader action.
Every “manual action” notification is accompanied by “Reason” and “Effects” information.
Today, the list of manual actions includes the following:
- Site abused with third-party spam
- User-generated spam
- Spammy free host
- Structured data issue
- Unnatural links to your site
- Unnatural links from your site
- Thin content with little or no added value
- Cloaking and/or sneaky redirects
- Pure spam
- Cloaked images
- Hidden text and/or keyword stuffing
- AMP content mismatch
- Sneaky mobile redirects
- News and Discover policy violations
Whenever Google rolls out a new spam update, it’s basically targeting the same issues.
The core updates address the relevance and quality of content.
Also, you should note that manual actions can cover different Google products. But if you somehow get a manual action related to Google Discover, your regular rankings should not be affected.
News and Discover are entirely separate products.
News includes newsy content, not just content from news publishers.
Discover includes content from across the web, not just news content.
Manual actions can happen for either; if issued, they only affect the particular product they were issued for.
You can read more about the Manual Actions report in this Search Console Help article.
Unfortunately, there’s no data on the share of different penalties from Google, so I searched through recent threads on different webmaster forums and Twitter.
The most common manual actions that webmasters care to resolve these days are related to attempts to manipulate Google’s search results. This may be a coincidence, but these penalties even sit next to each other on the list of manual actions. These are:
- Unnatural links to your site.
- Unnatural links from your site.
- Thin content with little or no added value.
I would say that almost all Google penalties now are given because the site owner was trying too hard to manipulate Google. Five or six years ago I did see a lot of penalties that came as a result of good, honest business owners hiring poor SEO companies who built unnatural links. But, now, most of that type of link is just ignored by Penguin. As such, if you get a link penalty, you usually know that you deserved it.
So stop doing shady stuff, folks! 🙂
Links are still one of the most important ranking factors, so “unnatural links” are the reason behind a large number of penalties these days.
Now, let’s be honest. If you see this notification in Search Console, you most likely participated in a link scheme and know what you’re hit for.
But if you purchased a website or are working on someone else’s website, you’ll need to do a comprehensive link audit. Here are a few quick things that you can work on.
Check your anchor texts
Pay attention to whether irrelevant or over-optimized anchor texts from poor-quality websites dominate your link profile.
The first step you should take is to plug your website into Ahrefs’ Site Explorer (or another backlink checker you like) and navigate to the Anchors report. We’ve recently enhanced ours with various helpful filters. Now it’s even easier to diagnose issues related to anchor texts.
With a natural backlink profile, most links are URL-anchored or brand-anchored. But if a site has been heavily involved in manipulative link building, you’ll usually see a lot of keyword-stuffed anchors from multiple referring domains:

If the Anchors report looks something like this, that’s a bad sign.
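You can also get a rough, scriptable read on your anchor distribution. The sketch below assumes you’ve exported the Anchors report to a CSV with “anchor” and “referring_domains” columns (hypothetical names; match them to your actual export), and the classification rules are deliberately crude:

```python
import csv
from collections import Counter

BRAND_TERMS = {"ahrefs"}  # replace with your own brand name(s)
GENERIC = {"", "here", "click here", "read more", "website"}

def classify(anchor: str) -> str:
    a = anchor.lower().strip()
    if a.startswith(("http://", "https://", "www.")):
        return "url"
    if any(term in a for term in BRAND_TERMS):
        return "brand"
    if a in GENERIC:
        return "generic"
    return "keyword-rich"  # the bucket that deserves scrutiny

counts = Counter()
with open("anchors_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        counts[classify(row["anchor"])] += int(row["referring_domains"])

total = sum(counts.values())
for bucket, n in counts.most_common():
    print(f"{bucket:>12}: {n} ({n / total:.0%})")
```

If “keyword-rich” dominates across many referring domains, that mirrors the bad sign described above.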
As for “Unnatural links from your site,” you should look at the anchor texts of the outgoing links. Ahrefs’ Site Explorer has a special report for that.

Check your referring domains
The Referring domains report in Site Explorer can help you identify websites explicitly built for selling links. The main giveaway is the large number of sites they link to relative to their own size.
On top of that, these websites usually get little to no traffic from Google, even though they may have a pretty high Domain Rating (DR) score.
This is why we added “Dofollow linked domains” and “Traffic” columns straight into the Referring domains report.
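If the export is too large to eyeball, a rough programmatic filter can shortlist domains for manual review. This sketch assumes a CSV with “domain”, “dr”, “traffic”, and “dofollow_linked_domains” columns (hypothetical names), and the thresholds are illustrative, not official guidance:

```python
import csv

MIN_TRAFFIC = 100          # little to no organic traffic is a warning sign
MAX_LINKED_DOMAINS = 1000  # linking out to a huge number of sites is another

with open("referring_domains.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        low_traffic = int(row["traffic"]) < MIN_TRAFFIC
        links_out_a_lot = int(row["dofollow_linked_domains"]) > MAX_LINKED_DOMAINS
        if low_traffic and links_out_a_lot:
            print(f"Review manually: {row['domain']} (DR {row['dr']}, "
                  f"traffic {row['traffic']}, links out to "
                  f"{row['dofollow_linked_domains']} domains)")
```

Anything this flags still needs a human look; the filter only narrows the haystack.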

For example, here at Ahrefs’ blog, we’ve never been stingy with external links. If a page or resource deserves a reference, we link to it.
Our blog has approximately 2K pages indexed in Google.

And Ahrefs’ Linked Domains report will show you that we link to around 1.5K websites.

So the ratio is close to 1:1. I’m not saying it’s a standard for every industry, but I hope you get the idea. A website with 500 pages linking out to 5K different websites should raise red flags.
You should note that a low DR and poor organic traffic do not always indicate a website’s low quality. It could be that the website is new and could grow into a strong one over time.
But when a linking website has low DR, ugly design and UI, low-quality content, and no organic traffic, there’s no way a link from it will be considered natural.
Google seems to have a well-maintained list of websites that sell links. And it may be using link-selling emails to expand that list. Check out this article from Barry Schwartz for more information.
To get a good sense of how serious the link problem is for your site, you can access Marie Haynes’ Disavow Blacklist Bulk Upload tool. Then export the list of your referring domains from Ahrefs and see how many of them are on Marie’s blacklist.
You should also evaluate the linking pages visually.
Websites created for the sake of linking out are usually easy to spot: the content on their pages won’t make much sense, and the images will be a total mess.
Recommended reading: How to Deal With Unnatural Links & Google Manual Actions
To lift a manual action from your website, you must take action to rectify the problems specified in the GSC Manual Actions message(s) and then select “Request Review” for the particular issue.
“Unnatural links to your site” is the only penalty that has its roots outside your website.
To rehabilitate, you must get rid of those links.
To do that, you need to clean up your link profile as much as possible. Simply disavowing these links may not suffice.
Here’s what Google’s document says:
You should make a good-faith effort to remove backlinks before using the disavow tool. It can also be helpful to document the effort involved in removing those links. Simply disavowing all backlinks without attempting to remove them might lead to rejection of your request.
If you have control over the link, remove it.
Then you should send link removal requests to webmasters. But most likely, this is going to be tough. It will often take you more than one attempt, as you’ll need to look for different contacts before finding the right person.
And even if you reach them, some simply won’t care. And sometimes, they’ll even ask you to pay to remove links.
If you fail to remove a link to your website, disavow the linking page or domain. But do explain why you did so in your reconsideration request.
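If you do end up disavowing, the file format itself is simple: a plain-text file with one domain or URL per line, where lines starting with # are comments. Here’s a minimal example with placeholder domains; documenting your removal attempts in the comments also supports your reconsideration request:

```
# Contacted the owner of spamdomain1.com on 6/1/2023 and 6/15/2023
# to ask for link removal, but got no response.
domain:spamdomain1.com

# Single spammy page we could not get taken down.
https://spamdomain2.com/paid-links-page.html
```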
Make sure to document every step you take and every message you send to fix the issues. This log will make your reconsideration request more convincing.
If the penalty results from thin content, provide evidence of your improvements. Show what content you took down and what you added.
Request a review only when you have fixed the issues on all affected pages, as the manual action can’t be lifted partially.
It is essential to demonstrate your effort to address all the issues on your website, as well as any results, when you send a reconsideration request. Remember that your website will be reconsidered by humans, not machine algorithms.
There is no limit on the number of times you can apply for reconsideration.
With a manual action, you want to show to a knowledgable person that you understand what the problem was, that you’ve taken all of the necessary steps to resolve it and ideally that you won’t do it again. Just disavowing some links someone else placed doesn’t seem complete to me.
— 🐐 John 🐐 (@JohnMu) June 30, 2021
Algorithmic actions don’t allow for reconsideration requests. All you can do is fix the issues that may be demoting your website and wait for Google to recrawl and reindex your site. Only after that will you find out whether your site is still affected.
If you fail to lift your penalties on the first try, repeat the above process by being more thorough this time. If you’re lost, find an SEO professional who can assess the damage and find the solution.
Is there a penalty for duplicate content?
25%–30% of the web is duplicate content.
Google demystified the “duplicate content penalty” back in 2008. Still, people keep asking this question.
The short answer to this question is “no.” There’s no penalty for duplicate content.
If you scrape or steal content from other websites, Google will simply rank the original.
As for the duplicate pages on your website, Google will try to determine the primary URL as the canonical version. Use the rel=canonical attribute to help Google choose the most appropriate page.
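For reference, the canonical hint is a single tag in the <head> of each duplicate page, pointing at the version you want Google to index (the URLs here are placeholders):

```html
<!-- On https://example.com/shoes?sort=price (a parameter-driven duplicate) -->
<head>
  <link rel="canonical" href="https://example.com/shoes" />
</head>
```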
Recommended reading: Duplicate Content: Why It Happens and How to Fix It
Can a website get a penalty for bad or outdated design?
Sometimes minimal, old, simplified, or even ugly pages rank well, sometimes that also changes over time (it feels like you’d be leaving open room for competition by being suboptimal).
Can a spam report from my competitor result in a penalty for my website?
Before 2020, Google’s documentation indicated that spam reports could be used to take manual actions against websites.
Today, these reports are used only to improve their spam detection algorithms. According to Google:
While Google does not use these reports to take direct action against violations, these reports still play a significant role in helping us understand how to improve our spam detection systems that protect our search results.
Should I disavow any low-quality links to prevent a manual action?
If you haven’t been building links, you probably have next to nothing to worry about.
Random links collected over the years aren’t necessarily harmful, we’ve seen them for a long time too and can ignore all of those weird pieces of web-graffiti from long ago. Disavow links that were really paid for (or otherwise actively unnaturally placed), don’t fret the cruft.
How long does it take to get a response to the reconsideration request?
There’s no exact answer to this question. Obviously, it depends on the severity of the issue and the current load on the webspam team.
Here’s what Google says:
Most reconsideration reviews can take several days or weeks, although in some cases, such as link-related reconsideration requests, it may take longer than usual to review your request.
In rare cases, it can take months.
In the past 18 months, I’ve experienced anywhere between 3 weeks and in one extreme case 5 months.
I believe that Google processes reconsideration requests in batches versus a conveyor belt method, and depending on the severity of the penalty/offence and its potential impact on user welfare (of lifting the penalty), they are treated with different levels (and batched differently) of importance. Which given how Google weights resources in favour of the user majority, this makes sense.
Can negative SEO attacks lead to manual actions?
It’s highly unlikely the spam links built as a negative SEO attack by your competitors will demote your site. Google says it’s smart enough to ignore those links.
But if you’re worried, simply disavow them.
In my opinion, it is rare that negative SEO attempts will lead to a manual action. If a site does get a manual action after a wave of negative SEO links are pointed at them, it almost always turns out that the site had also been involved in a lot of their own link building as well.
So if you do some black-hat SEO, negative SEO from your black-hat competition may bring unnecessary attention to your website. Isn’t that another signal to quit black hat?
Can a penalty expire?
To my surprise, manual actions do expire. Algorithmic actions don’t, and the algorithms behind them only get better.
Yes, manual actions expire after time. Often things change over the years, so what might have required manual intervention to solve / improve back then, might be handled better algorithmically nowadays.
— 🐐 John 🐐 (@JohnMu) September 7, 2018
Will my website performance on search become normal once the manual action is lifted?
The general answer is “yes.” Google does not hold a grudge against websites with lifted manual actions.
Site being ‘tainted’ after a manual action is a SEO myth. You request a RR and if successful, you’re off the hook.
John Mueller even recorded a dedicated video on this topic.
But things may be a bit more complicated than that.
Over the years I’ve worked with a number of websites that have had manual actions, link penalties, and once lifted the website has been able to rebuild and progress. When lifting a link penalty, there is oftentimes a misconception that ‘performance will return to normal,’ but people forget that they will have had some benefit from the backlinks originally, impacting performance — otherwise why would Google see them as manipulations if they didn’t work?
A website usually gets a manual action because the techniques it was using were actually working. So even after the manual action is lifted, the site only gets back to where it was before those black hat/gray hat techniques were implemented and started to work.
I did some research and came across John’s AMA session on Reddit, where he took the time to answer tons of interesting questions:
There’s no ‘reset button’ for a domain, we don’t even have that internally, so a manual review wouldn’t change anything there. If there’s a lot of bad history associated with that, you either have to live with it, clean it up as much as possible, or move to a different domain. I realize that can be a hassle, but it’s the same with any kind of business, cleaning up a bad name/reputation can be a lot of work, and it’s hard to say ahead of time if it’ll be worth it in the end.
Final thoughts
As you saw from this post, websites mostly suffer from Google’s penalties because of low-quality content and shady SEO techniques.
If you want to make your website bulletproof, make sure it meets Google’s quality guidelines, follows the E-A-T principles, and has a natural link profile.
Apart from that, monitor your site for hacking and remove hacked content as soon as possible to prevent user-generated spam on your site.
Have more thoughts to share? Got questions? Ping me on Twitter.
Firefox URL Tracking Removal – Is This A Trend To Watch?

Firefox recently announced that it is offering users a choice on whether or not to include tracking information in copied URLs, which comes on the heels of iOS 17 blocking user tracking via URLs. The movement to remove tracking information from URLs appears to be gaining momentum. Where is this all going, and should marketers be concerned?
Is it possible that blocking URL tracking parameters in the name of privacy will become a trend industrywide?
Firefox Announcement
Firefox recently announced that beginning in the Firefox Browser version 120.0, users will be able to select whether or not they want URLs that they copied to contain tracking parameters.
When users right-click a link to bring up its contextual menu, Firefox now gives them a choice as to whether to copy the URL with or without any tracking parameters that might be attached to it.
Screenshot Of Firefox 120 Contextual Menu
According to the Firefox 120 announcement:
“Firefox supports a new “Copy Link Without Site Tracking” feature in the context menu which ensures that copied links no longer contain tracking information.”
Browser Trends For Privacy
All browsers, including Google’s Chrome and Chrome variants, are adding new features that make it harder for websites to track users online through referrer information embedded in a URL when a user clicks from one site and leaves through that click to visit another site.
This trend for privacy has been ongoing for many years but it became more noticeable in 2020 when Chrome made changes to how referrer information was sent when users click links to visit other sites. Firefox and Safari followed with similar referrer behavior.
Whether the current Firefox implementation is disruptive, or whether its impact is overblown, is somewhat beside the point.
The point is whether what Firefox and Apple did to protect privacy is a trend, and whether that trend will extend to stronger blocking of URL parameters than what Firefox recently implemented.
I asked Kenny Hyder, CEO of online marketing agency Pixel Main, what his thoughts are about the potential disruptive aspect of what Firefox is doing and whether it’s a trend.
Kenny answered:
“It’s not disruptive from Firefox alone, which only has a 3% market share. If other popular browsers follow suit it could begin to be disruptive to a limited degree, but easily solved from a marketer’s perspective.
If it became more intrusive and they blocked UTM tags, it would take a while for them all to catch on if you were to circumvent UTM tags by simply tagging things in a series of sub-directories, i.e. site.com/landing/<tag1>/<tag2> etc.
Also, most savvy marketers are already integrating future-proof workarounds for these exact scenarios.
A lot can be done with pixel-based integrations rather than cookie-based or UTM tracking. When set up properly they can actually provide better and more accurate tracking and attribution. Hence the name of my agency, Pixel Main.
I think most marketers are aware that privacy is the trend. The good ones have already taken steps to keep it from becoming a problem while still respecting user privacy.”
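To illustrate the sub-directory approach Kenny mentions: the campaign data lives in the URL path itself, so there are no query parameters for a browser to strip. A minimal Python sketch (the URL structure and tag names are hypothetical):

```python
from urllib.parse import urlparse

def path_tags(url: str, prefix: str = "landing") -> list[str]:
    """Extract campaign tags from a path like /landing/<tag1>/<tag2>."""
    segments = [s for s in urlparse(url).path.split("/") if s]
    return segments[1:] if segments[:1] == [prefix] else []

print(path_tags("https://site.com/landing/summer-sale/newsletter"))
# -> ['summer-sale', 'newsletter']
```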
Some URL Parameters Are Already Affected
For those who are on the periphery of what’s going on with browsers and privacy, it may come as a surprise that some tracking parameters are already affected by actions meant to protect user privacy.
Jonathan Cairo, Lead Solutions Engineer at Elevar shared that there is already a limited amount of tracking related information stripped from URLs.
But he also explained that there are limits to how much information can be stripped from URLs because the resulting negative effects would cause important web browsing functionality to fail.
Jonathan explained:
“So far, we’re seeing a selective trend where some URL parameters, like ‘fbclid’ in Safari’s private browsing, are disappearing, while others, such as TikTok’s ‘ttclid’, remain.
UTM parameters are expected to stay since they focus on user segmentation rather than individual tracking, provided they are used as intended.
The idea of completely removing all URL parameters seems improbable, as it would disrupt key functionalities on numerous websites, including banking services and search capabilities.
Such a drastic move could lead users to switch to alternative browsers.
On the other hand, if only some parameters are eliminated, there’s the possibility of marketers exploiting the remaining ones for tracking purposes.
This raises the question of whether companies like Apple will take it upon themselves to prevent such use.
Regardless, even in a scenario where all parameters are lost, there are still alternative ways to convey click IDs and UTM information to websites.”
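To make the distinction concrete, here’s a minimal Python sketch of what this kind of selective stripping looks like: known click IDs are removed while UTM parameters are left alone. The parameter list is illustrative; each browser maintains its own.

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Illustrative list of click-ID parameters; real browser lists differ.
TRACKING_PARAMS = {"fbclid", "gclid", "msclkid", "mc_eid"}

def strip_tracking(url: str) -> str:
    """Return the URL with known click-ID parameters removed; UTM params are kept."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(strip_tracking("https://example.com/page?utm_source=news&fbclid=abc123"))
# -> https://example.com/page?utm_source=news
```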
Brad Redding of Elevar agreed about the disruptive effect from going too far with removing URL tracking information:
“There is still too much basic internet functionality that relies on query parameters, such as logging in, password resets, etc, which are effectively the same as URL parameters in a full URL path.
So we believe the privacy crackdown is going to continue on known trackers by blocking their tracking scripts, cookies generated from them, and their ability to monitor user’s activity through the browser.
As this grows, the reliance on brands to own their first party data collection and bring consent preferences down to a user-level (vs session based) will be critical so they can backfill gaps in conversion data to their advertising partners outside of the browser or device.”
The Future Of Tracking, Privacy And What Marketers Should Expect
Elevar raises good points about how far browsers can realistically go with blocking. Its view is that it’s down to brands to own their first-party data collection and adopt other strategies that accomplish analytics without compromising user privacy.
Given all the laws governing privacy and Internet tracking that have been enacted around the world, it looks like privacy will continue to be a trend.
However, at this point in time, the advice is to keep monitoring how far browsers are going, but there is no expectation that things will get out of hand.
How To Become an SEO Expert in 4 Steps

With 74.1% of SEOs charging clients upwards of $500 per month for their services, there’s a clear financial incentive to get good at SEO. But with no colleges offering degrees in the topic, it’s down to you to carve your own path in the industry.
There are many ways to do this; some take longer than others.
In this post, I’ll share how I’d go from zero to SEO pro if I had to do it all over again.
Understanding what search engine optimization really is and how it works is the first order of business. While you can do this by reading endless blog posts or watching YouTube videos, I wouldn’t recommend that approach for a few reasons:
- It’s hard to know where to start
- It’s hard to join the dots
- It’s hard to know who to trust
You can solve all of these problems by taking a structured course like our SEO course for beginners. It’s completely free (no signup required), consists of 14 short video lessons (2 hours total length), and covers:
- What SEO is and why it’s important
- How to do keyword research
- How to optimize pages for keywords
- How to build links (and why you need them)
- Technical SEO best practices
Here’s the first lesson to get you started:
It doesn’t matter how many books you read about golf, you’re never going to win a tournament without picking up a set of clubs and practicing. It’s the same with SEO. The theory is important, but there’s no substitute for getting your hands dirty and trying to rank a site.
If you don’t have a site already, you can get up and running fairly quickly with any major website platform. Some will set you back a few bucks, but they handle SEO basics out of the box. This saves you time sweating the small stuff.
As for what kind of site you should create, I recommend a simple hobby blog.
Here’s a simple food blog I set up in <10 minutes:


Once you’re set up, you’re ready to start practicing and honing your SEO skills: doing keyword research to find topics, writing and optimizing content about them, and (possibly) building a few backlinks.
For example, according to Ahrefs’ Keywords Explorer, the keyword “neopolitan pizza dough recipe” has a monthly traffic potential of 4.4K as well as a relatively low Keyword Difficulty (KD) score:


Even better, there’s a weak website (DR 16) in the top three positions, so this should be a relatively easy topic to rank for.


Given that most of the top-ranking posts have at least a few backlinks, a page about this topic would also likely need at least a few backlinks to compete. Check out the resources below to learn how to build these.
It’s unlikely that your hobby blog is going to pay the bills, so it’s time to use the work you’ve done so far to get a job in SEO. Here are a few benefits of doing this:
- Get paid to learn. This isn’t the case when you’re home alone reading blog posts and watching videos or working on your own site.
- Get deeper hands-on experience. Agencies work with all kinds of businesses, which means you’ll get to build experience with all kinds of sites, from blogs to ecommerce.
- Build your reputation. Future clients or employers are more likely to take you seriously if you’ve worked for a reputable SEO agency.
To find job opportunities, start by signing up for SEO newsletters like SEO Jobs and SEOFOMO. Both of these send weekly emails and feature remote job opportunities:


You can also go the traditional route and search job sites for entry-level positions. The kinds of jobs you’re looking for will usually have “Junior” in their titles or at least mention that it’s a junior position in their description.


Beyond that, you can search for SEO agencies in your local area and check their careers pages.
Even if there are no entry-level positions listed here, it’s still worth emailing and asking if there are any upcoming openings. Make sure to mention any SEO success you’ve had with your website and where you’re at in your journey so far.
This might seem pushy, but many agencies actually encourage this—such as Rise at Seven:


Here’s a quick email template to get you started:
Subject: Junior SEO position?
Hey folks,
Do you have any upcoming openings for junior SEOs?
I’ve been learning SEO for [number] months, but I’m looking to take my knowledge to the next level. So far, I’ve taken Ahrefs’ Beginner SEO course and started my own blog about [topic]—which I’ve had some success with. It’s only [number] months old but already ranks for [number] keywords and gets an estimated [number] monthly search visits according to Ahrefs.
[Ahrefs screenshot]
I checked your careers page and didn’t see any junior positions there, but I was hoping you might consider me for any upcoming positions? I’m super enthusiastic, hard-working, and eager to learn.
Let me know.
[Name]
You can pull all the numbers and screenshots you need by creating a free Ahrefs Webmaster Tools account and verifying your website.
SEO is a broad industry. It’s impossible to be an expert at every aspect of it, so you should niche down and hone your skills in the area that interests you the most. You should have a reasonable idea of what this is from working on your own site and in an agency.
For example, link building was the area that interested me the most, so that’s where I focused on deepening my knowledge. As a result, I became what’s known as a “t-shaped SEO”—someone with broad skills across all things SEO but deep knowledge in one area.


Marie Haynes is another great example of a t-shaped SEO. She specializes in Google penalty recovery. She doesn’t build links or do on-page SEO. She audits websites with traffic drops and helps their owners recover.
In terms of how to build your knowledge in your chosen area, here are a few ideas:
Here are a few SEOs I’d recommend following and their (rough) specialties:
Final thoughts
K. Anders Ericsson famously theorized that it takes 10,000 hours of practice to master a new skill. Can it take less? Possibly. But the point is this: becoming an SEO expert is not an overnight process.
I’d even argue that it’s a somewhat unattainable goal because no matter how much you know, there’s always more to learn. That’s part of the fun, though. SEO is a fast-moving industry that keeps you on your toes, but it’s a very rewarding one, too.
Here are a few stats to prove it:
- 74.1% of SEOs charge clients upwards of $500 per month for their services (source)
- $49,211 median annual salary (source)
- ~$74k average salary for self-employed SEOs (source)
Got questions? Ping me on X (formerly Twitter).
A Year Of AI Developments From OpenAI

Today, ChatGPT celebrates one year since its launch in research preview.
Try talking with ChatGPT, our new AI system which is optimized for dialogue. Your feedback will help us improve it. https://t.co/sHDm57g3Kr
— OpenAI (@OpenAI) November 30, 2022
From its humble beginnings, ChatGPT has continually pushed the boundaries of what we perceive as possible with generative AI for almost any task.
a year ago tonight we were probably just sitting around the office putting the finishing touches on chatgpt before the next morning’s launch.
what a year it’s been…
— Sam Altman (@sama) November 30, 2023
In this article, we take a journey through the past year, highlighting the significant milestones and updates that have shaped ChatGPT into the versatile and powerful tool it is today.
a year ago tonight we were placing bets on how many total users we’d get by sunday
20k, 80k, 250k… i jokingly said “8B”.
little did we know… https://t.co/8YtO8GbLPy
— rapha gontijo lopes (@rapha_gl) November 30, 2023
ChatGPT: From Research Preview To Customizable GPTs
This story unfolds over the course of the past year, beginning on November 30, 2022, when OpenAI announced the launch of ChatGPT as a research preview.
As users began to offer feedback, improvements soon followed.
Before the holiday, on December 15, 2022, ChatGPT received general performance enhancements and new features for managing conversation history.

As the calendar turned to January 9, 2023, ChatGPT saw improvements in factuality, and a notable feature was added to halt response generation mid-conversation, addressing user feedback and enhancing control.
Just a few weeks later, on January 30, the model was further upgraded for enhanced factuality and mathematical capabilities, broadening its scope of expertise.
February 2023 was a landmark month. On February 9, ChatGPT Plus was introduced, bringing new features and a faster ‘Turbo’ version to Plus users.
This was followed closely on February 13 with updates to the free plan’s performance and the international availability of ChatGPT Plus, featuring a faster version for Plus users.
March 14, 2023, marked a pivotal moment with the introduction of GPT-4 to ChatGPT Plus subscribers.


This new model featured advanced reasoning, complex instruction handling, and increased creativity.
Less than ten days later, on March 23, experimental AI plugins, including browsing and Code Interpreter capabilities, were made available to selected users.
On May 3, users gained the ability to turn off chat history and export data.
Plus users received early access to experimental web browsing and third-party plugins on May 12.
On May 24, the iOS app expanded to more countries with new features like shared links, Bing web browsing, and the option to turn off chat history on iOS.
June and July 2023 were filled with updates enhancing mobile app experiences and introducing new features.
The mobile app was updated with browsing features on June 22, and the browsing feature itself underwent temporary removal for improvements on July 3.
The Code Interpreter feature rolled out in beta to Plus users on July 6.
Plus customers enjoyed increased message limits for GPT-4 from July 19, and custom instructions became available in beta to Plus users the next day.
July 25 saw the Android version of the ChatGPT app launch in selected countries.
As summer progressed, August 3 brought several small updates enhancing the user experience.
Custom instructions were extended to free users in most regions by August 21.
The month concluded with the launch of ChatGPT Enterprise on August 28, offering advanced features and security for enterprise users.
Entering autumn, limited language support arrived in the web interface on September 11.
Voice and image input capabilities in beta were introduced on September 25, further expanding ChatGPT’s interactive abilities.
An updated version of web browsing rolled out to Plus users on September 27.
The fourth quarter of 2023 began with the integration of DALL·E 3 in beta on October 16, allowing for image generation from text prompts.
The browsing feature moved out of beta for Plus and Enterprise users on October 17.
Customizable versions of ChatGPT, called GPTs, were introduced for specific tasks on November 6 at OpenAI’s DevDay.


On November 21, the voice feature in ChatGPT was made available to all users, rounding off a year of significant advancements and broadening the horizons of AI interaction.
And here, we have ChatGPT today, with a sidebar full of GPTs.


Looking Ahead: What’s Next For ChatGPT
The past year has been a testament to continuous innovation, but it is merely the prologue to a future rich with potential.
The upcoming year promises incremental improvements and leaps in AI capabilities, user experience, and integrative technologies that could redefine our interaction with digital assistants.
With a community of users and developers growing stronger and more diverse, the evolution of ChatGPT is poised to surpass expectations and challenge the boundaries of today’s AI landscape.
As we step into this next chapter, the possibilities are as limitless as generative AI continues to advance.
Featured image: photosince/Shutterstock