An SEO audit is where you find opportunities to improve a site’s search performance. It involves finding technical, on-page, content, and link-related issues to fix or improve.
Everyone’s SEO audit process differs, as there’s no universal approach. But there are a handful of basic issues all site owners should look for.
You’ll learn how to check for 14 of them in this guide.
Manual actions are when a human reviewer at Google decides that your site doesn’t comply with their webmaster guidelines. The result is that some or all of your site won’t be shown in Google’s search results.
You’re unlikely to have a manual action unless you’ve done something drastically wrong. But it’s still arguably the best first thing to check because if you have one, you’re dead in the water before you even start.
To check for manual actions, go to the Manual actions report in Google Search Console.
If it says anything other than “No issues detected,” read our Google penalties guide.
Google updates its search algorithms all the time. Many of these updates target specific things like link spam or content quality.
For that reason, it’s important to check for organic traffic drops coinciding with known Google updates, as these may point to specific issues.
For example, the core update in August 2018 appeared to largely affect health, fitness, and medical sites that failed to demonstrate expertise, authoritativeness, and trust (E-A-T). In fact, Barry Schwartz, a prominent blogger, dubbed it the “Medic” update.
The update all but destroyed some sites, like this one:
You can check your organic traffic trend for free in Google Search Console. Just go to the Search results report and set the period to the past year or two.
You can also see an estimated traffic graph in Ahrefs’ Site Explorer, where you can also overlay known Google updates to more easily diagnose issues.
For example, we can see that this site’s traffic drop coincided with a core update:
If you spot a big traffic drop coinciding with a Google update, check our Google Algorithm Updates History page to see the focus of the update.
HTTPS is a secure protocol for transferring data to and from visitors. It helps to keep things like passwords and credit card details secure, and it’s been a small Google ranking factor since 2014.
You can check if your website uses HTTPS by visiting it. If there’s a “lock” icon in the address bar, it’s secure.
However, some websites face issues where certain pages load securely, but other pages and resources don’t. So we recommend digging a bit deeper to make sure there are no HTTPS-related issues. Here’s how:
- Sign up for a free Ahrefs Webmaster Tools account
- Crawl your site with Site Audit
- Go to the Internal pages report
From here, check the “Protocols distribution” graph to see whether any pages are using HTTP. Ideally, you want to see an all-green graph.
- Click on “HTTP”
- Sort the report by status code from low to high
- Add a column for “Final redirect URL”
If the lowest HTTP status code is “301” and the final redirect URLs all begin with HTTPS, everything is fine.
Next, hit the “Issues” tab and look for the “HTTPS/HTTP mixed content” issue. This indicates that while your initial HTML is loading over a secure HTTPS connection, some resource files like images load over an unsecure one.
If you see either of these issues, read our HTTPS guide to learn more about dealing with them.
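If you want a quick supplementary spot check for a single page, the sketch below (assuming Python’s standard library only) lists `http://` resource URLs referenced by a page that should load over HTTPS. Links in `<a>` tags are deliberately excluded, since they aren’t mixed content:

```python
# A minimal mixed-content check for one page's HTML.
from html.parser import HTMLParser

# Tags whose src/href pull in subresources over the network.
RESOURCE_TAGS = {"img", "script", "iframe", "source", "audio", "video", "link"}

class MixedContentParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        if tag not in RESOURCE_TAGS:
            return
        for name, value in attrs:
            # Any resource loaded over plain HTTP is mixed content.
            if name in ("src", "href") and value and value.startswith("http://"):
                self.insecure.append(value)

def insecure_resources(html: str) -> list:
    parser = MixedContentParser()
    parser.feed(html)
    return parser.insecure
```

This only inspects one page’s HTML, so a crawl-based tool is still the right way to cover a whole site.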
People should only be able to access one of these four versions of your website:
http://domain.com http://www.domain.com https://domain.com https://www.domain.com
The other three variations should redirect to the canonical (master) version.
This is important because Google sees all four of these as separate site versions. Having more than one accessible can cause crawling and indexing issues. In some cases, it can even dilute link equity and, thus, may negatively impact rankings.
To check that everything works as it should, install Ahrefs’ SEO Toolbar, type each URL version into your browser, then check the HTTP headers to make sure they all redirect to the same “master” version.
For example, if we visit http://ahrefs.com, it redirects to the secure version at https://ahrefs.com. The same happens if we visit the secure www version (https://www.ahrefs.com).
If this doesn’t happen, you’ll need to implement redirects.
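The check above can also be scripted. Here’s a minimal sketch using only Python’s standard library (the domain is a placeholder): it builds the four variations and follows each one’s redirects; if everything is configured correctly, they all end at the same final URL.

```python
# Check that all four URL variations of a site resolve to one canonical version.
from urllib.request import urlopen

def site_versions(domain: str) -> list:
    """Build the four URL variations Google treats as separate sites."""
    return [
        f"http://{domain}/",
        f"http://www.{domain}/",
        f"https://{domain}/",
        f"https://www.{domain}/",
    ]

def final_urls(domain: str) -> set:
    """Follow redirects for each variation and collect the final URLs.
    A correctly configured site yields a set with exactly one member."""
    return {urlopen(url).geturl() for url in site_versions(domain)}

if __name__ == "__main__":
    # Hypothetical usage; replace with your own domain.
    finals = final_urls("example.com")
    print("OK" if len(finals) == 1 else f"Multiple versions live: {finals}")
```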
Google search results come from its index, which is a database of hundreds of billions of webpages. Your pages need to be in this index to stand any chance at ranking.
It’s also important to keep pages that aren’t valuable for searchers out of Google’s index, as this can also cause SEO issues.
Indexing issues can get quite complicated, but you can check for basic issues fairly easily.
First, check the Indexability report in Site Audit for “Noindex page” warnings.
Google can’t index pages with this warning, so it’s worth checking they’re not pages you want indexed. If they are, remove or edit the meta robots tag.
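If you want to spot-check a page by hand, a noindex directive lives in a meta robots tag like `<meta name="robots" content="noindex, follow">`. Here’s a small sketch (standard library only) that detects it in a page’s HTML:

```python
# Detect a "noindex" meta robots directive in a page's HTML.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() == "robots":
            # The content attribute holds comma-separated directives.
            if "noindex" in attrs.get("content", "").lower():
                self.noindex = True

def has_noindex(html: str) -> bool:
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex
```

Note that noindex can also be sent in an `X-Robots-Tag` HTTP header, which this HTML-only check won’t see.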
Second, check the number of indexable URLs in the same report.
Investigate further if this looks abnormally high.
For example, given that we only have around 500 published blog posts, 2,164 indexable URLs seem high for the Ahrefs blog. But if we click the number, we see that it’s because it includes versions of our blog in other languages.
If we exclude those pages, along with author, category, and pagination pages, the number of indexable URLs looks pretty much spot on.
Mobile-friendliness has been a ranking factor everywhere since Google moved to mobile-first indexing in 2019.
Checking for mobile-friendliness is easily done. Just go to the Mobile Usability report in Google Search Console. It tells you whether any URLs have errors that affect mobile usability.
If you don’t have access to Google Search Console, plug any page from your website into Google’s Mobile-Friendly Test tool.
In general, assuming that other pages on your website use the same design and layout, the result should apply to most, if not all, of your pages.
Page speed has been a small ranking factor on desktop since 2010 and mobile since 2018. However, there’s no official threshold for how fast a page should load, and there are a confusing number of metrics you can use as a proxy.
For example, Google’s PageSpeed Insights tool shows all kinds of metrics:
The other downside of this tool is that you can only test one page at a time.
For that reason, it’s better to start with a tool that’ll give you speed metrics on all your pages. You can do this in Ahrefs’ Site Audit, which you can use for free with an Ahrefs Webmaster Tools account. Here’s how:
- Crawl your website with Site Audit
- Go to the Performance report
- Check the “Time to first byte” and “Load time distribution” graphs
As a general rule, the more green you see here, the better. If you see lots of red, you may want to work on improving your page speed.
Core Web Vitals are metrics that Google uses to measure user experience. They measure a page’s load time, interactivity, and the stability of the content as it loads.
As they’re currently a weak ranking signal, you shouldn’t obsess over them. But it’s still worth taking a quick look at your site’s performance.
To do this, check the Core Web Vitals report in Google Search Console.
As this report is based on Chrome User Experience Report (CrUX) data, there’s a chance you may see a “Not enough data collected” or “Not enough recent usage data” message instead of data.
If that happens, head over to the Performance report in Ahrefs’ Site Audit and check the Lighthouse scores. As this is lab data, it doesn’t rely on user experience data from Google.
Having broken pages on your site is never good. If these pages have backlinks, they are effectively wasted because they point to nothing.
To find broken pages on your website, head to the Internal pages report in Site Audit and click the number under “Broken.”
If you want to see the number of backlinks to each of these pages, add the “No. of referring domains” column to the report.
You can also find broken URLs with backlinks in Site Explorer. Just plug in your domain, go to the Best by links report, add a “404 not found” filter, then sort the report by referring domains from high to low.
The benefit of using Site Explorer is that it shows URLs that people linked to accidentally.
For example, we have links from three referring domains to this URL:
This page never existed. The linkers just linked to the wrong URL. It should have an “s” at the end.
Here’s our recommended process for dealing with broken links:
A sitemap lists the pages that you want search engines to index. It shouldn’t list things like redirects, non-canonicals, or dead pages because those send mixed signals to Google.
To check for sitemap issues, head to the All issues report in Site Audit and scroll to the “Other” section.
You’ll see any issues here relating to:
- Dead or inaccessible pages in the sitemap.
- Noindexed pages in the sitemap.
- Non-canonical pages in the sitemap.
If you have any of these issues, hit the caret and follow the advice on fixing them.
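If you prefer to audit a sitemap yourself, the first step is extracting its URLs so each one can be spot-checked for redirects, noindex tags, or dead pages. Here’s a minimal sketch (standard library only; the sample sitemap is a placeholder):

```python
# Extract every URL from an XML sitemap.
import xml.etree.ElementTree as ET

def sitemap_urls(xml_text: str) -> list:
    """Return every <loc> value, ignoring the sitemap XML namespace."""
    root = ET.fromstring(xml_text)
    return [el.text.strip() for el in root.iter() if el.tag.endswith("loc")]

SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
</urlset>"""
```

From there, you could fetch each URL and flag anything that isn’t a 200 status code or that carries a noindex directive.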
Every indexable page on your site should have a title tag, meta description, and H1 tag. These basic on-page elements help Google understand your content and help you to win more clicks from your rankings.
To check for issues, head to the “Issues” tab in the Content report in Site Audit.
For example, the website above has 724 pages with a missing or empty title tag. This isn’t ideal because Google shows titles in the search results, so the site could be missing out on clicks as a result.
It also has the same number of pages with an empty or missing meta description, and thousands with a missing or empty H1 tag.
Google often shows meta descriptions in the search results, so you should try to write an enticing one for every important page. Missing H1 tags, on the other hand, usually point to bigger issues like an improperly coded theme.
You can see which URLs are affected by clicking an issue and hitting “View affected URLs.”
If you want to prioritize fixes, sort the report by estimated organic traffic from high to low.
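The same check can be run on a single page by hand. Here’s a small sketch (standard library only) that flags a missing or empty title tag, meta description, or H1:

```python
# Flag missing or empty title, meta description, and H1 on one page.
from html.parser import HTMLParser

class OnPageParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.h1 = ""
        self._reading = None  # element whose text we're currently collecting

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._reading = tag
        elif tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "description":
                self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == self._reading:
            self._reading = None

    def handle_data(self, data):
        if self._reading == "title":
            self.title += data
        elif self._reading == "h1":
            self.h1 += data

def missing_elements(html: str) -> list:
    p = OnPageParser()
    p.feed(html)
    return [name for name, value in [
        ("title", p.title),
        ("meta description", p.meta_description),
        ("h1", p.h1),
    ] if not value.strip()]
```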
Rankings rarely last forever. As content becomes outdated, its search traffic will often start to drop off. But you can often solve this by refreshing and republishing the content.
For example, our list of top Google searches declined massively in 2021.
This is because we didn’t update the post for over a year, so the content became outdated. The recent spike in traffic is a result of us updating and republishing the piece.
Here’s an easy way to find declining content in Google Search Console:
- Go to the Search results report
- Set the date filter to compare mode
- Choose “Compare last 6 months to previous period”
- Click the “Pages” tab
- Sort the table by “Clicks Difference” from low to high
For example, this shows us that our list of the most visited websites has declined massively over the last six months. So this is probably ripe for an update.
If you’re a WordPress user, you can automate this process with our free SEO plugin. It monitors for pages that no longer perform well and gives recommendations on how to fix them.
For example, it’s suggesting that we rewrite our list of the best keyword tools because it used to rank in the top three for its target keyword but now doesn’t even rank in the top 100.
Content gaps occur when you miss important subtopics in your content. The result is that you don’t rank for as many long-tail keywords and potentially not as high as you could for your main target keyword.
Here’s an easy way to find content gaps:
- Paste one of your page’s URLs into Site Explorer
- Go to the Content Gap report
- Paste in the URLs of a few similar pages outranking you
Hit “Show keywords.” You’ll see all of the keywords that these pages rank for where yours don’t.
Many of these will just be different ways of searching for the same thing, but some may represent subtopics you’ve missed.
This is an interesting case because we kind of covered this in our definition on the page:
However, we didn’t explicitly state that this is what SEO stands for. Many of our competitors did.
For that reason, it may be worth us stating this in a more explicit way.
Many other technical issues can hinder your rankings. That’s why it’s always worth crawling your site with a tool like Ahrefs’ Site Audit to check for other SEO issues.
For example, if we do this for Ahrefs’ blog, we find a redirect loop:
Redirect loops are something you’re unlikely to spot by chance. So this issue would have likely gone unnoticed without a crawl-based audit.
It looks like we also have missing alt text on over 2,400 images:
This is arguably not a huge problem, but the sheer number of affected images in this instance points to a likely hole in our processes.
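A missing-alt-text check is also easy to script for individual pages. Here’s a minimal sketch (standard library only) that counts `<img>` tags with a missing or empty `alt` attribute:

```python
# Count <img> tags with a missing or empty alt attribute.
from html.parser import HTMLParser

class AltTextAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = 0

    def handle_starttag(self, tag, attrs):
        # Decorative images should still carry an explicit alt="",
        # but for audit purposes we flag any empty or absent value.
        if tag == "img" and not dict(attrs).get("alt", "").strip():
            self.missing += 1

def images_missing_alt(html: str) -> int:
    auditor = AltTextAudit()
    auditor.feed(html)
    return auditor.missing
```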
Running this SEO audit gives you three things to take action on to improve SEO.
- Technical SEO issues – Fixing these may boost your site’s overall search performance.
- On-page SEO issues – Fixing these may increase your organic clicks.
- Content opportunities – Pursuing these may rank pages higher and for more keywords.
If you want to run a deeper audit, read our guide to running a technical SEO audit.
Got questions? Ping me on Twitter.
What It Means For SEO
Google not only changes how it presents information to users and updates algorithms, but the way users search is also changing.
SEO best practices are changing every year, so it’s best to keep up with what it means to properly optimize a website today.
Signals Of Authenticity And Usefulness
Google has released five Product Review Updates since April 2021.
The associated guidelines that Google published for writing product reviews recommend specific on-page factors that must exist in order for the page to be ranked for product review-related search queries.
This is an extraordinary change in how sites are ranked. Google has redefined what it means for a webpage to be relevant for a search query.
Previously, the definition of relevance simply meant that a webpage had to be about what the user was searching for; in this case, product reviews.
Product reviews were commonly thought of as expressing an opinion about a product, comparing the features of the product to the cost, and expressing a judgment if something is worth purchasing or not.
But now, it’s not enough for a webpage to review a product. It must also be authentic and useful. That’s a big change in how sites are ranked.
Here are two product review Google ranking factors introduced in December 2021:
“…we are introducing two new best practices for product reviews, to take effect in a future update.
- Provide evidence such as visuals, audio, or other links of your own experience with the product, to support your expertise and reinforce the authenticity of your review.
- Consider including links to multiple sellers to give the reader the option to purchase from their merchant of choice, if it makes sense for your site.”
Google calls them “best practices” but also says they will “take effect,” which implies that it’s something in the algorithm that is looking for these two qualities.
The first signal is about the authenticity of the product review.
The second signal is specific to sites that don’t sell the reviewed products, and it’s about being useful to site visitors by giving them multiple stores to purchase a product.
Authenticity and usefulness as signals of relevance is a huge shift for SEO.
Search Is Increasingly About Context
Context is the setting in which something is said or done, which provides meaning to those actions or settings.
The context of a search can influence the search results.
What’s happening is that Google is redefining what it means to be relevant by understanding the user context.
When a user searches for [pizza], Google does not show recipes for pizza; it shows local pizza restaurants.
Google defines the meaning of the keyword phrase “pizza” according to the context of the user, which includes the geographic location of that user.
Another context that influences search results is current events, which can change the meaning of a search phrase. This is a part of what is known as the Freshness algorithm.
The Freshness algorithm takes into account time-based factors that can change the meaning of a search phrase, and this influences what websites are shown.
So, those are the contexts of geography and time influencing what it means to be relevant for a search query.
Search Is Increasingly About Topics
As noted in the discussion of the 2013 Hummingbird update, Google is increasingly moving away from keywords and more toward understanding the multiple meanings inherent in search queries.
Google is also redefining relevance through the concept of topics.
When someone searches with the keyword [mustang], the likeliest meaning is the automobile, right?
In the above screenshot, Google lists multiple topics related to the Ford Mustang automobile.
- For sale.
Clicking on any of the above-listed topics results in a different search result.
Some of the top-ranked sites appear on different topics because they are relevant to multiple topics. Something to think about, right?
Back in 2018, Google’s Danny Sullivan tweeted about a way to change the search results by topic, which are the topic buttons we just reviewed above.
“A new dynamic way to quickly change results is coming, such as how you can toggle to quickly change results about dog breeds.
This is powered by the Topic Layer, a way of leveraging how the Knowledge Graph knows about people, places and things into topics.”
Google published a blog post about these changes and discussed them in the section titled, Dynamic Organization of Search Results.
In the article, Google explained that it is organizing some searches by topics and subtopics.
“Every search journey is different, and especially if you’re not familiar with the topic, it’s not always clear what your next search should be to help you learn more.
So we’re introducing a new way of dynamically organizing search results that helps you more easily determine what information to explore next.”
People Also Ask (PAA) is a way for Google to help users navigate to the information they’re looking for, particularly when the user searches with a vague keyword phrase, like CBD.
The queries listed in the PAA are topics.
People like to think of them as keyword phrases, but they are more than keywords. They are topics for entire webpages of content.
Clicking the first topic, “Does CBD do anything?” reveals an article on the topic of whether CBD products work.
Some people and tools like to use every single People Also Ask suggestion box as keywords for use in a single comprehensive article.
But what is missed in that approach is that every individual suggestion is a single topic for one article.
Because Google likes to rank precise content, one would have better luck creating content for each topic rather than a giant page of content on multiple topics since a giant page is not particularly precise.
Google’s focus on topics continues.
On September 28, 2022, Google introduced more ways to craft search queries by topic.
As you start typing in the search box, we’ll provide keyword or topic options to help you craft your question. Say you’re looking for a destination in Mexico. We’ll help you specify your question, so you can navigate to more relevant results for you https://t.co/oWeCGjhevS pic.twitter.com/ywoseDKOWa
— Google SearchLiaison (@searchliaison) September 28, 2022
Takeaway: Google’s Focus On Topics
Keywords are important because the proper use of the correct keyword phrases will help the content connect with users who use those keywords when searching for answers or information.
Advanced users tend to use more jargon, and less advanced users who have less knowledge will use more general terms.
Given that understanding, it’s important to keep in mind that Google understands the world in terms of topics and not keyword phrases.
When Google looks at a page, it’s understanding the page at the level of, “What’s this page about? What is the topic?”
Content can appear unnatural when the content author focuses on keywords, in my opinion.
This happens because a keyword-focused article tends to meander as the author tries to stuff the article with the targeted keyword phrases, sometimes repeating them.
Keyword-focused content feels unnatural because the author is struggling to create sentences that include the keywords.
A better way to create content, in my opinion, is to focus on topics (as well as usefulness!).
Relevance And Topic Category
For some types of search queries, Google may be ranking sites that belong to a category of sites.
There is a 2015 patent named Re-ranking resources based on categorical quality that describes a way to rank webpages based on whether the category of the content matches the category implied by the search query.
I believe this patent may be related to the August 2018 Google update known as the Medic Update.
It was called the Medic Update because it noticeably affected the category of health websites.
This patent represents a revolutionary change in how Google determines what is relevant for certain queries and discusses how it will re-rank the search results according to whether a website belongs to a topic category.
Google’s patent first describes two kinds of searches: informational and navigational.
An informational search is one that can be answered by multiple kinds of sites. Google uses examples of queries about football and space travel as the kinds of searches that are informational.
It then notes that navigational queries are when users search using the name of a site, like YouTube.
Then it gets to the point of the patent, which is a type of search query that is relevant to a category of information.
The patent says:
“Sometimes, however, users may have a particular interest in a category of information for which there are a number of well-served resources.”
That’s why the patent is called “Re-ranking resources based on categorical quality,” and in the abstract (the description of the patent), it states that it’s about “re-ranking resources for categorical queries.”
The word “categorical” is used in the sense of something belonging to a category.
A simple description of this patent is that it will rank a search query and then apply a filter to the search results that are based on categories that a search query belongs to. That’s what is meant by the word “re-rank.”
Re-ranking is a process of ranking websites for a search query and then selecting the top results by re-ranking the results based on additional criteria.
The following passage from the patent uses the words “quality condition” and “resources.”
In the context of this patent, the “quality condition” means the quality of being a part of a category.
A “resource” is just a webpage.
It first describes two ranking scenarios. A regular ranking of websites (“search ranking”) and another ranking called a “quality ranking” that ranks pages that belong to a “category.”
Remember, resources mean a webpage, and the quality condition is the quality of belonging to a category.
Here’s the important passage from the patent:
“By re-ranking search results for a proper subset of resources that satisfy a quality condition, the search system provides a set of search results that lists resources that belong to a category according to a quality ranking that differs from a search ranking of a received query.”
Next, it explains the benefit of re-ranking search results based on the “quality with respect to the category.”
“Because the search results are provided according to a ranking that is based, in part, on quality with respect to the category, the search results are more likely to satisfy a user’s informational need when the users issues a query that is categorical for the category.”
Lastly, I call attention to the section titled, Detailed Description, where the patent goes into more detail.
First, it notes that when users don’t know much about a category, they will tend to not use the jargon that is typical for that category and instead use “broader” or more general phrases.
“…when a user knows very little about the category, the queries are more likely to be broader queries.
This is because a user may not have developed an understanding of the category, and may not be aware of the websites and resources that best serve the category.”
Next, the patent says that it will take that general query that is related to a category and match it to sites that fit into that category.
As an example, if someone searches on the topic of pain in the stomach, Google might match that query to the category of medical websites and re-rank the top-ranked search results to only show websites that belong to the medical category of websites.
The patent explains:
“The systems and methods described below re-rank resources for a broad categorical query by their corresponding quality in the category to which the categorical query corresponds.
The set of re-ranked search results are more likely to show the websites and resources that best serve the category.”
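The two-stage process the patent describes can be illustrated with a toy sketch. This is entirely illustrative (the scores, field names, and threshold are assumptions, not Google’s implementation): rank normally first, then, for a categorical query, keep only the results that belong to the query’s category and satisfy a quality condition, ordered by their quality ranking.

```python
# Toy illustration of the patent's re-ranking idea: a normal search ranking,
# followed by a category-quality re-ranking of the qualifying subset.
def rerank(results, query_category, min_quality=0.5):
    """results: list of dicts with 'url', 'search_score', 'category', 'quality'."""
    # 1. The regular "search ranking."
    ranked = sorted(results, key=lambda r: r["search_score"], reverse=True)
    # 2. Keep the subset that belongs to the query's category and meets
    #    the quality condition, ordered by the "quality ranking" instead.
    subset = [r for r in ranked
              if r["category"] == query_category and r["quality"] >= min_quality]
    return sorted(subset, key=lambda r: r["quality"], reverse=True)
```

In this toy model, a page that ranks well on traditional signals but sits in the wrong category (or below the quality threshold) simply drops out of the re-ranked results.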
To Be Relevant Means To Fit Into A Category
The point of that patent from 2015 is that Google likely changed what it means to be relevant.
For example, for medical queries, Google ranks websites with traditional ranking factors like links and content.
But then Google re-ranks those search results by filtering out all the sites that don’t fit into the right category for that search query.
This change was a radical departure for Google in 2018 because it meant that alternative-health sites that used to rank for medical queries stopped ranking for those queries.
Those sites were not a part of the medical category; they were a part of the alternative-health category.
Google said that the 2018 update was not targeting health sites; it was simply more noticeable in that vertical.
That means that this change applies to a wide range of other categories as well.
This means that the meaning of relevance for some queries has changed. It’s not enough to have certain keywords in the content for certain verticals, the content must also fit into the right category, described by the patent as the “quality with respect to the category.”
Precise Search Results And Keywords
Google’s search ranking algorithms have progressively become more precise.
Precision in search results is something that took off in a big way after Google’s Hummingbird update in 2013.
What made search more precise after the Hummingbird update was that Google wasn’t using all the keywords in a search query to match what is on a webpage.
Instead, what was happening is that Google was ignoring some words, particularly in natural language type searches, and focusing on what that query actually means and then using that understanding to match the search query to a webpage.
Precision is something important to think about when considering how to SEO a webpage.
Google engineer (at the time) Matt Cutts explained:
“Hummingbird is a rewrite of the core search algorithm.
Just to do a better job of matching the users queries with documents, especially for natural language queries, you know the queries get longer, they have more words in them and sometimes those words matter and sometimes they don’t.”
Cutts is quoted again in the above article expanding on the idea of precision:
“…the idea behind Hummingbird is, if you’re doing a query, it might be a natural language query, and you might include some word that you don’t necessarily need…
…Some of those words don’t matter as much.
And previously, Google used to match just the words in the query.
Now, we’re starting to say which ones are actually more helpful and which ones are more important.”
This was the beginning of Google evolving to understand topics and what users really want.
Most importantly, Google’s focus on precision remains and can be seen in their increasingly sophisticated ranking technologies like Google Lens, where Google can rank webpages based on users searching with images from their cell phones.
For example, one can take a snapshot of a bug that’s on the ground and search with that.
Precision In User Intent
A change in search engines dating to approximately 2012/2013 is Google’s increasing use of user intent in search results.
Google didn’t announce the introduction of user intent into the search results.
And when Cutts discussed user intent in a June 2011 Q&A with Danny Sullivan, the significance went over the heads of the people reporting on it.
In the Q&A, Cutts talks about how Larry Page came to him and asked why the search results for [warm mangoes] weren’t so good.
Cutts wondered what the user intent was for that search and discovered some facts about how warm mangoes ripen in a box.
I was there during the Q&A, and I was blown away by Google’s ambition to integrate user intent into the search results.
But none of the reporting in 2011 understood how the [warm mangoes] search fit into what Cutts was talking about, even though he mentioned the phrase “user intent.”
So, it was just reported as an amusing anecdote about warm mangoes.
Over 10 years later, everyone is talking about user intent.
But there’s a new understanding of intent that goes beyond the current understanding of it.
It’s the understanding that user intent is more than just informational, transactional, etc.
Those categories are very general; there is a more nuanced way to understand user intent by analyzing the verbs used in search queries.
“Verbs fundamentally change keyword research.
My best practice recommendation is to abandon the notion of “User intent” being described as “Informational/Navigational/Transactional/commercial or Local Intent”.
Boxing user intent into only four vague descriptions is not entirely accurate.
A user’s intent when they search is far more nuanced than trying to do one of four things, it is more specific.
User intent is much better described by analyzing verbs.
Most keyword research data focuses on words or phrases, without understanding user intent, which can lead to fundamental errors.
For example, a site about horses might do keyword research that finds search volumes around phrases like “Mustang” or even “Horse power” which are entirely different topics and concepts, which may or may not be relevant to a website’s topic.
Here is the key point: Words generated through keyword research are not specifically relevant to what anyone searches for without a verb in the search query to give the search context.
The verb “ride” and “mustang” together suggest an entirely different meaning and audience than the verb “drive” and “mustang.”
Further, a phrase like “buy a Mustang” probably isn’t relevant to a horse website because the most popular intent is related to an automobile.
Without any other information about the user, you cannot know for sure other than to make a guess based on the most popular intent.
But it’s still just a guess.
Google may well know more about the user, based on their search history, but all you can do as an SEO is to be true to your website’s topic and purpose.
If you start writing content around a keyword phrase simply because the search volumes are high, it’s possible for the site to lose context, rather than improve context.
Analyzing verbs in keyword research is one of the ideas that we have been researching at InLinks.net.
Using NLP algorithms can help weed out irrelevant keyword suggestions when the entities and verbs in the user queries are checked for proximity to topics in your own content.”
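The verb idea can be made concrete with a toy sketch. The verb-to-topic mapping below is an illustrative assumption, not an NLP model: it simply shows how the verb accompanying an ambiguous entity like “mustang” shifts the likely topic.

```python
# Toy verb-based disambiguation of an ambiguous query entity.
# The mapping is illustrative only; a real system would use NLP.
VERB_TOPICS = {
    "ride": "horses",
    "groom": "horses",
    "drive": "cars",
    "buy": "cars",  # assumes the dominant purchase intent is the automobile
}

def likely_topic(query: str, default: str = "ambiguous") -> str:
    """Return the topic implied by the first recognized verb in the query."""
    for word in query.lower().split():
        if word in VERB_TOPICS:
            return VERB_TOPICS[word]
    return default
```

So “ride a mustang” maps to horses, “drive a mustang” maps to cars, and a bare “mustang” stays ambiguous, which is exactly why verb-free keyword data lacks context.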
Search Queries Have Evolved
It’s important to note that Google continues to evolve what it means to search. Initially, searching meant typing words into a desktop or laptop computer.
Then, it involved speaking those queries into a mobile phone.
Now, it’s changing to include searching with images through the Google Lens app.
For example, I wanted more information about a bottle of wine at the store. I took a photo of it and submitted it to Google Lens, which returned search results about that wine.
What’s notable about evolving search queries is that it’s Google that is driving the evolution by creating new ways for users to search, such as Google Lens.
On September 28, 2022, Google announced nine new ways for users to conduct shopping searches.
“Today at our annual Search On event, we announced nine new ways we’re transforming the way you shop with Google, bringing you a more immersive, informed and personalized shopping experience.
Powering this experience is the Shopping Graph, our AI-enhanced model that now understands more than 35 billion product listings — up from 24 billion just last year.”
And then there is multi-search, a new way to search:
With multisearch, you can take a pic *and* ask a question to get the look you want or fix something. 🤯 We’re bringing this new way to search to 70+ languages. And soon, you’ll be able to add “near me” to your image to find what you’re looking for nearby. #SearchOn pic.twitter.com/RHxRQm42EU
— Google (@Google) September 28, 2022
Each change to how users can search and how Google presents information is an opportunity for businesses to claim a share of the new ways of searching and being discovered.
The old way of 10 blue links is long behind us, swept aside by changes in technology.
It’s a new era for search. Are you up to date?
Featured Image: Masson/Shutterstock