How To Perform An SEO SWOT Analysis

For most organizations, implementing an effective SEO (search engine optimization) strategy involves collecting and analyzing significant amounts of keyword, content, analytics, and competitive data from various sources.
SEO professionals then need to use this data to prioritize keyword, content, structural, and/or linking tasks to address issues or build on existing organic search authority.
One familiar prioritization method, which lends itself well to focusing attention and maximizing often-limited SEO and marketing resources, is the SWOT (Strengths, Weaknesses, Opportunities, Threats) framework.
A SWOT, by definition, is geared to help identify items with the biggest potential impact on growth – or the most dangerous threats.
The following breakdown of organizational SEO priorities assumes keyword research has already been done and is being used to gather the website, SERP (search engine results page), and competitive data that will form the foundation of an effective SWOT.
Keyword research alone is often deserving of its own SWOT process.
Strengths
One of the primary factors search engines use in determining your organic search visibility is the organization’s relative strength and authority for a topical group of keywords.
Identifying the keywords for which the organization already has some authority – or, as some like to call it, “momentum” in the eyes of the search engines – is an excellent place to begin focusing your attention.
Authority is generally difficult to come by and takes time to establish, so why not build on what you already have?
Your first question should be, “Which pieces of content do I have that rank well (let’s say in the top 20 results) in the search engines for my primary keyword groups?”
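If you export your Search Console performance data, answering this question can be scripted. Below is a minimal sketch in Python; the CSV filename, column names, and keyword group are assumptions you would adapt to your own export.

```python
# Minimal sketch: find queries from a target keyword group that already
# rank in the top 20, from a Search Console performance export.
# The filename, column names, and keyword set are assumptions.
import csv

PRIMARY_KEYWORDS = {"seo audit", "keyword research"}  # hypothetical keyword group

with open("gsc_performance.csv", newline="", encoding="utf-8") as f:
    strengths = [
        (row["Query"], row["Page"], float(row["Position"]))
        for row in csv.DictReader(f)
        if row["Query"].lower() in PRIMARY_KEYWORDS and float(row["Position"]) <= 20
    ]

# Sort by position so the pages closest to page one surface first.
for query, page, position in sorted(strengths, key=lambda r: r[2]):
    print(f"{position:5.1f}  {query}  ->  {page}")
```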
Recognizing where you have existing strength can be leveraged in three ways:
- Look for opportunities to link out from or to your strongest pieces of content. This can have a dual effect: reinforcing the original piece by linking out to more comprehensive answers to your audience’s questions, and borrowing authority from your strongest pages.
- Perform full-page keyword, technical, and link audits on all webpages that rank between positions five and 20 to see where improvements can be made to move them higher in the SERPs. This may mean adjusting title tags or headings, or updating links to more current or relevant sources.
- Determine whether the “right” landing pages rank for the keywords you want to be found for. While it may seem great to have your homepage rank for several of your keywords, this is not optimal.
Searchers who land on your homepage looking for something specific will have to spend more time clicking or searching again to find the exact answer to their question.
Identify the pages you have that provide answers, and focus on having them usurp the position currently maintained by the homepage.
If you determine such pages don’t exist, then it’s time to create them.
Be sure to also pay attention to the types and characteristics of your strongest content pieces as signals to what content to create moving forward.
For example, if you have videos ranking well on Google and/or YouTube, by all means, create more videos.
If long-form blog posts dominate the top of the search results for your primary keywords, this is your cue to publish and share more of the same.
Weaknesses
We all have our weaknesses; when it comes to SEO, recognizing and admitting them early on can save us a great deal of effort, time, money, and lost business.
Keywords And Content
While there are undoubtedly keyword groups we feel we must be found for, it’s important to let go of those which will require too much time and/or effort to establish authority for.
Generally, a quick review of the search engine results will reveal keywords that are out of reach based on your competitors’ size, age, reputation, and quality of content.
In this case, it may be necessary to target more specific long-tail and intent-driven keyword alternatives, or to consider other avenues (including paid ones) for generating visibility, traffic, and conversions.
Sometimes, the best strategy is to employ complementary paid search tactics until you can establish organic search authority.
Technical Audit
Another area of weakness, and one you can more readily control, may be the quality of your own website and content from a technical/structural, keyword relevance, or content depth perspective.
You can begin identifying areas of weakness by conducting an SEO audit.
There are several excellent free and paid tools available, including Google Lighthouse and Search Console (specifically the Core Web Vitals Report and Mobile-Friendly Test), which will provide a prioritized list of issues and/or errors found in the title and heading tags, internal and external links, website code, keyword usage/density, and a myriad of mobile-friendly factors.
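As one illustration, Lighthouse audits can also be pulled programmatically through Google’s PageSpeed Insights API, which runs Lighthouse remotely. The sketch below is a rough starting point, not a complete audit: the page URL is a placeholder, and an API key (omitted here) is recommended for anything beyond occasional use.

```python
# Rough sketch: fetch a Lighthouse report for one page via the
# PageSpeed Insights API and list its weakest audits.
import json
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = urllib.parse.urlencode({
    "url": "https://www.example.com/",  # placeholder: the page to audit
    "strategy": "mobile",               # or "desktop"
})

with urllib.request.urlopen(f"{PSI_ENDPOINT}?{params}") as resp:
    report = json.load(resp)

# Overall Lighthouse performance score, reported on a 0.0-1.0 scale.
score = report["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Performance score: {score:.2f}")

# Audits scoring below 0.9 make a rough, prioritized to-do list.
for audit in report["lighthouseResult"]["audits"].values():
    if audit.get("score") is not None and audit["score"] < 0.9:
        print(f"- {audit['title']} (score: {audit['score']})")
```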
As noted above, you should start by focusing on and fixing any issues found on those pages for which you already have some authority based on search engine results.
Optimizing these pages can only help improve their chances of moving up the SERPs.
You can move on to other priority web pages based on website analytics data or strategic importance.
Backlinks
Organically obtained, relevant, quality backlinks (aka inbound links) are still a search engine ranking factor as they speak to, and can enhance, the authority of the site to which they link.
As with site auditing, many good third-party backlink tools can reveal where you maintain backlinks. These are particularly useful for looking at the backlink sources of your strongest-known competitors.
Where appropriate, you may want to reach out to obtain links from the same relevant sources to leverage their authority.
Opportunities
In SEO, opportunities abound for those who know how and where to look – and who take the time to do so.
SEO is really about moving from one opportunity to the next.
Once optimization is deemed successful for one group of keywords or pieces of content, it’s time to move to the next topic upon which authority can be established or reinforced.
Keywords And Content
Keyword research tools such as Ahrefs, Semrush, and others can uncover keyword and content opportunities or gaps when provided with your website domain, the domains of your known competitors, or a targeted list of keywords.
Most provide prioritized lists of potentially high-value keywords based on estimated monthly search volumes, organic traffic, and/or relative competition.
In other words: Which high-value keywords are your competitors ranking for which you are not?
As with the Weaknesses above, part of this analysis should consider the level of effort required to obtain authority relative to the potential return on establishing organic visibility.
Is it a worthwhile opportunity?
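At its core, this gap analysis is a set difference, which is easy to script once you have exported ranking-keyword lists from your preferred tool. Here is a minimal sketch, assuming two CSV exports with a "Keyword" column (the filenames and column name are hypothetical):

```python
# Minimal sketch: keywords a competitor ranks for that we do not.
# Filenames and the "Keyword" column name are assumptions about the export.
import csv

def keywords(path: str) -> set[str]:
    with open(path, newline="", encoding="utf-8") as f:
        return {row["Keyword"].strip().lower() for row in csv.DictReader(f)}

ours = keywords("our_keywords.csv")
theirs = keywords("competitor_keywords.csv")

# The difference is the candidate opportunity list.
for kw in sorted(theirs - ours):
    print(kw)
```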

A more manual process for discovering keyword and content opportunities is to run a reverse website audit on competitors’ websites.
Or, spend some time simply reviewing your top competitors’ primary pages, paying particular attention to the keywords used in title tags, headings, and internal link anchor text.
These are presumably the keywords that matter most to them.
However, be careful, as this strategy assumes the competition has conducted their own keyword research and has been following SEO best practices, which may or may not always be the case.
Focusing on those competitors who rank well for your primary keywords should single out the ones who are intentionally optimizing for search.
Content Refresh
Another opportunity within a web presence is the refresh of top-performing or complementary content.
First, scan the SERPs or a preferred keyword tool to identify older content that is ranking for target keywords or serves to support other primary content pages.
Then, review this content to see where there may be opportunities to update text, images, internal/external links, or any other components.
Perhaps there’s an opportunity to enhance the piece by creating and adding images or videos.
Finally, re-share this content via appropriate channels, and perhaps consider identifying new avenues – as a previously popular piece of content will likely perform well again.
Existing content offers an excellent opportunity to build authority, often with just a little extra effort.
Backlinks
While seeking out backlinks is typically a manual, time-intensive process, it offers long-term value.
Ideally, you want to identify relevant, authoritative websites/domains from which high-quality inbound links can be obtained.
There are several sources you can use to start looking for inbound links:
1. The SERPs for your primary keywords are a natural starting point for backlink research, as the websites found there are, by definition, considered “relevant” and “authoritative” by the search engines. Of particular interest are those sites that rank ahead of yours, because they presumably have higher authority upon which you can piggyback. Look for any non-competitive backlinking opportunities such as directories, association listings, or articles and blog posts that you may be able to contribute to, get mentioned in, or comment on.
2. The Google Search Console Links report is the next best resource for backlink research, as it indicates which domains Google recognizes as linking to your content. Here you can validate the quality and accuracy of the links you already have, as well as determine whether there are opportunities to obtain additional links from these same domains.
3. Referral sources in Google Analytics represent external sites that send you traffic but may or may not be providing an organic search boost. Review these domains/sites regularly to spot other linking opportunities.
4. As noted under Weaknesses, several third-party backlink tools can identify potential backlink sources where links to your competitors can be found. Some will even rank and prioritize the authority value of each existing and potential source, which can save significant time.
Threats
Whether created intentionally or not, more than a few things can threaten your organic authority in the eyes of the search engines, and these should be prioritized to avoid potentially damaging penalties.
Content
The primary content threat most are familiar with is duplicate content, which, as the name suggests, is content repurposed on a website without proper attribution to the original source.
To avoid being penalized for using this type of content, be sure to include rel="canonical" tags referencing the source content in the head of any pages containing the duplicated content.
In other words: It’s okay to have some duplicate content on a website, as long as the original source is properly identified.
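For example, a page carrying duplicated content could declare the original source with a canonical link element like this (the URL is illustrative):

```html
<!-- In the <head> of the page containing the duplicated content -->
<link rel="canonical" href="https://www.example.com/original-article/" />
```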
Backlinks
While relevant, high-quality backlinks can help boost your authority, irrelevant, low-quality inbound links from non-reputable sites (particularly those that are part of paid link schemes) can do long-lasting harm and even get you tagged with a manual penalty.
The threat here is a potential loss of organic visibility and traffic.
Further, recovering from a manual penalty is not an easy or quick process.
Simply put, you should never pay for backlinks, and you should ensure that no backlinks have been purchased on your behalf by a third party, like a marketing agency.
As such, you should regularly review the Google Search Console Links report or other backlink reporting sources for questionable domains or those you don’t recognize as relevant.
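One lightweight way to make that review routine is to compare an exported linking-sites report against the domains you have already vetted. A sketch, assuming a CSV export with a "Site" column (the filename, column name, and vetted list are placeholders):

```python
# Sketch: flag referring domains that haven't been vetted yet so they
# can be reviewed by hand. Filename, column name, and the vetted list
# are assumptions to adapt.
import csv

KNOWN_GOOD = {"example-association.org", "industry-directory.com"}  # vetted domains

with open("linking_sites.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        domain = row["Site"].strip()
        if domain not in KNOWN_GOOD:
            print("review:", domain)
```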
Competitors
All online competitors creating their own content represent threats to your authority.
Even if you maintain strong organic visibility and traffic relative to your “known” competitors, there is always the potential for new, aggressive, or unknown competitors to come onto the scene.
Many of the aforementioned SEO tools provide competitor discovery features to help quickly identify domains that consistently appear in the search results for your primary keywords.
Oftentimes, there may be competitors here you’ve never considered. You’ll naturally want to pay attention to these competitors and use the tactics noted above to see what you can learn from them.
Search engines love and reward fresh, relevant content, and Google even has a freshness algorithm to identify it.
As such, you should regularly monitor the search engine results for new entrants, which may, over time, challenge your authority and position.
Of course, the best way to combat this type of threat is by continuing to publish and update your own comprehensive content, which will give the search engines less reason to question your authority.
Actioning On The SWOT
The detailed SWOT output will map out prioritized actions to protect and/or improve online authority, visibility, and the resulting traffic, leads, and revenue.
Proactive search marketers should conduct these analyses on at least a semi-annual, if not quarterly, basis, depending on how competitive the industry is and how active the competitors are.
A well-structured SWOT can provide an excellent roadmap for where, when, and how often action needs to be taken or content needs to be created and shared to boost your organization’s primary SEO goals.
Google To Curb Microtargeting In Consumer Finance Ads

Google is updating its policy limiting personalized advertising to include more restrictions on ads related to consumer financial products and services.
Google’s personalized ads policy prohibits targeting users based on sensitive categories like race, religion, or sexual orientation.
Over the years, Google has continued updating the policy to introduce new limitations. The latest update to restrict consumer finance ads is part of Google’s ongoing efforts to refine its ad targeting practices.
What’s Changing?
Google will update its personalized ads policy in February 2024 to prevent advertisers from targeting audiences for credit and banking ads based on sensitive factors like gender, age, parental status, marital status, or zip code.
Google’s current policy prohibiting “Credit in personalized ads” will be renamed “Consumer finance in personalized ads” under the changes.
Google’s new policy will state:
“In the United States and Canada, the following sensitive interest categories cannot be targeted to audiences based on gender, age, parental status, marital status, or ZIP code.
Offers relating to credit or products or services related to credit lending, banking products and services, or certain financial planning and management services.”
Google provided examples, including “credit cards and loans including home loans, car loans, appliance loans, short-term loans,” as well as “banking and checking accounts” and “debt management products.”
When Does The New Policy Take Effect?
The updated limitations on personalized advertising will take effect on February 28, 2024, with full enforcement expected within six weeks.
Google said advertisers in violation will receive a warning at least seven days before any account suspension.
According to Google, the policy change aims to better protect users’ privacy and prevent discrimination in financial services advertising.
However, the company will still allow generalized ads for credit and banking products that do not use sensitive personal data for targeting.
What Do Advertisers Need To Do?
Google will begin enforcing the updated restrictions in late February 2024 but advises advertisers to review their campaigns for compliance issues sooner.
Advertisers should carefully check their ad targeting settings, remove improper personalization based on sensitive categories, and adhere to the revised policy requirements.
Failure to follow the rules could lead to account suspension after an initial warning. Google will work with advertisers to ensure a smooth transition during the roughly six-week enforcement ramp-up period.
Google Discusses Fixing 404 Errors From Inbound Links

Google’s John Mueller responded to a thread on Reddit about finding and fixing inbound broken links, offering the nuanced insight that some broken links are worth finding and fixing, while others are not.
Reddit Question About Inbound Broken Links
Someone asked on Reddit if there’s a way to find broken links for free.
This is the question:
“Is it possible to locate broken links in a similar manner to identifying expired domain names?”
The person asking the question clarified that they were asking about inbound broken links from external sites.
John Mueller Explains How To Find 404 Errors To Fix
John Mueller responded:
“If you want to see which links to your website are broken & “relevant”, you can look at the analytics of your 404 page and check the referrers there, filtering out your domain.
This brings up those which actually get traffic, which is probably a good proxy.
If you have access to your server logs, you could get it in a bit more detail + see which ones search engine bots crawl.
It’s a bit of technical work, but no external tools needed, and likely a better estimation of what’s useful to fix/redirect.”
In his response, John Mueller answers the question on how to find 404 responses caused by broken inbound links and identify what’s “useful to fix” or to “redirect.”
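For those comfortable with light scripting, here is a minimal sketch of the server-log approach Mueller describes. It assumes the common "combined" Apache/Nginx access log format; the log path and domain are placeholders.

```python
# Sketch: surface 404s that arrive via external links, per Mueller's
# suggestion. Assumes a "combined" format access log; the path and
# domain are placeholders.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # adjust to your server
OWN_DOMAIN = "www.example.com"          # used to filter out internal referrers

# combined log: ip - - [time] "METHOD /path HTTP/x" status size "referrer" "agent"
LINE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3}) \S+ "(?P<ref>[^"]*)"')

hits = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        m = LINE.search(line)
        if not m or m.group("status") != "404":
            continue
        ref = m.group("ref")
        # "-" means no referrer; skip internal links so only inbound ones remain.
        if ref not in ("-", "") and OWN_DOMAIN not in ref:
            hits[(m.group("path"), ref)] += 1

# The most-hit broken URLs are the best candidates to fix or redirect.
for (path, ref), count in hits.most_common(20):
    print(f"{count:4d}  {path}  <-  {ref}")
```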
Mueller Advises On When Not To “Fix” 404 Pages
John Mueller next offered advice on when it doesn’t make sense to fix a 404 page.
Mueller explained:
“Keep in mind that you don’t have to fix 404 pages, having things go away is normal & fine.
The SEO ‘value’ of bringing a 404 back is probably less than the work you put into it.”
Some 404s Should Be Fixed And Some Don’t Need Fixing
John Mueller said that there are situations where a 404 error generated from an inbound link is easy to fix and suggested ways to find those errors and fix them.
Mueller also said that there are some cases where it’s basically a waste of time.
What wasn’t mentioned was the difference between the two, which may have caused some confusion.
Inbound Broken Links To Existing Webpages
There are times when another site links to your site but uses the wrong URL. Traffic arriving through the broken link on the outside site will generate a 404 response code on your site.
These kinds of links are easy to find and useful to fix.
There are other situations when an outside site will link to the correct webpage but the webpage URL changed and the 301 redirect is missing.
Those kinds of inbound broken links are also easy to find and useful to fix. If in doubt, read our guide on when to redirect URLs.
In both of those cases the inbound broken links to the existing webpages will generate a 404 response and this will show up in server logs, Google Search Console and in plugins like the Redirection WordPress plugin.
If the site is on WordPress and it’s using the Redirection plugin, identifying the problem is easy because the Redirection plugin offers a report of all 404 responses with all the necessary information for diagnosing and fixing the problem.
In the case where the Redirection plugin isn’t used one can also hand code an .htaccess rule for handling the redirect.
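A minimal illustration of such a rule, with hypothetical paths:

```apache
# Permanently redirect the old URL that external sites still link to.
Redirect 301 /old-page/ https://www.example.com/new-page/
```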
Lastly, one can contact the other website that’s generating the broken link and ask them to fix it. There’s always a small chance that the other site might decide to remove the link altogether. So it might be easier and faster to just fix it on your side.
Whichever approach is taken to fix the external inbound broken link, finding and fixing these issues is relatively simple.
Inbound Broken Links To Removed Pages
There are other situations where an old webpage was removed for a legitimate reason, like an event passed or a service is no longer offered.
In that case it makes sense to just show a 404 response code because that’s one of the reasons why a 404 response should be shown. It’s not a bad thing to show a 404 response.
Some people might want to get some value from the inbound link and create a new webpage to stand in for the missing page.
But that might not be useful because the link is for something that is irrelevant and of no use because the reason for the page no longer exists.
Even if you create a new page, it’s possible that some of that link equity might flow to it, but it’s useless because the topic of the inbound link is totally irrelevant to anything but the expired reason.
Redirecting the missing page to the home page is a strategy some people use to benefit from a link to a page that no longer exists. But Google treats those redirects as soft 404s, which pass no benefit.
These are the cases that John Mueller was probably referring to when he said:
“…you don’t have to fix 404 pages, having things go away is normal & fine.
The SEO ‘value’ of bringing a 404 back is probably less than the work you put into it.”
Mueller is right, there are some pages that should be gone and totally removed from a website and the proper server response for those pages should be a 404 error response.
Site Quality Is Simpler Than People Think

Google’s John Mueller, Martin Splitt and Gary Illyes discussed site quality in a recent podcast, explaining the different ways of thinking about site quality and at one point saying it’s not rocket science. The discussion suggests that site quality could be simpler than most people know.
Site Quality Is Not Rocket Science
The first point they touched on was a recommendation to read the available site quality documentation, insisting that site quality is not especially difficult to understand.
Gary Illyes said:
“So I would go to a search engine’s documentation.
Most of them have some documentation about how they function and just try to figure out where your content might be failing or where your page might be failing because honestly, okay, this is patronizing, but it’s not rocket science.”
No Tools For Site Quality – What To Do?
Gary acknowledged that there’s no tool for diagnosing site quality, not in the same way there are tools for objectively detecting technical issues.
Traffic metrics that show a downward movement don’t explain why; they just show that something changed.
Gary Illyes:
“I found the up-down metric completely useless because you still have to figure out what’s wrong with it or why people didn’t like it.
And then you’re like, “This is a perfectly good page. I wrote it, I know that it’s perfect.”
And then people, or I don’t know, like 99.7% of people are downvoting it. And you’re like, ‘Why?’”
Martin Splitt:
“And I think that’s another thing.
How do I spot, I wrote the page, so clearly it is perfect and helpful and useful and amazing, but then people disagree, as you say.
How do you think about that? What do you do then?
How can I make my content more helpful, better, more useful? I don’t know.
…There’s all these tools that I can just look at and I see that something’s good or something’s bad.
But for quality, how do I go about that?”
Gary Illyes:
“What if quality is actually simpler than at least most people think?
…What if it’s about writing the thing that will help people achieve whatever they need to achieve when they come to the page? And that’s it.”
Martin Splitt asked if Gary was talking about reviewing the page from the perspective of the user.
Illyes answered:
“No, we are reframing.”
Reframing generally means to think about the problem differently.
Gary’s example is to reframe the problem as whether the page delivers what it says it’s going to deliver (like helping users achieve X,Y,Z).
Something I see a lot with content is that the topic being targeted (for example, queries about how to catch a trout) isn’t matched by the content (which might actually be about tools for catching trout), which is not what the site visitor wants to achieve.
Quality In Terms Of Adding Value
There are different kinds of things that relate to site and page quality and in the next part of the podcast John Mueller and Gary Illyes discuss the issue about adding something of value.
Adding something of value came up in the context of SERPs that already offer good answers from websites that people not only enjoy but also expect to see for those queries.
You can tell users expect specific sites for individual search queries when Google Suggest shows the brand name alongside the keyword.
That’s a clue that probably a lot of people are turning keywords into branded searches, which signals to Google what people want to see.
So, the problem of quality in those situations isn’t about being relevant for a query with the perfect answer.
For these situations, like for competitive queries, it’s not enough to be relevant or have the perfect answer.
John Mueller explains:
“The one thing I sometimes run into when talking with people is that they’ll be like, “Well, I feel I need to make this page.”
And I made this page for users in air quotes…
But then when I look at the search results, it’s like 9,000 other people also made this page.
It’s like, is this really adding value to the Internet?
And that’s sometimes kind of a weird discussion to have.
It’s like, ‘Well, it’s a good page, but who needs it?’
There are so many other versions of this page already, and people are happy with those.”
This is the type of situation where competitive analysis to “reverse engineer” the SERPs works against the SEO.
It’s a stale approach because using what’s in the SERPs as a template for what to rank is feeding Google what it already has.
As an example, let’s represent everything already ranked in the SERPs with a baseline score of zero. Less than zero is poor quality; higher than zero is higher quality. Zero is not better than zero – it’s just zero.
The SEOs who think they’re reverse engineering Google by copying entities and topics are really just achieving that same score of zero.
So, according to Mueller, Google responds with, “it’s a good page, but who needs it?”
What Google is looking for in this situation is not the baseline of what’s already in the SERPs, zero.
According to Mueller, they’re looking for something that’s not the same as the baseline.
So in my analogy, Google is looking for something above the baseline of what is already in the SERPs, a number greater than zero, which is a one.
You can’t add value by feeding Google back what’s already there. And you can’t add value by doing the same thing ten times bigger. It’s still the same thing.
Breaking Into The SERPs By The Side Door
Gary Illyes next discusses a way to break into a tough SERP, saying the way to do it is indirectly.
This is an old strategy but a good one that still works today.
So, rather than bringing a knife to a gunfight, Gary Illyes suggests choosing more realistic battles to compete in.
Gary continued the conversation about competing in tough SERPs.
He said:
“…this also is kind of related to the age-old topic that if you are a new site, then how can you break into your niche?
I think on today’s Internet, like back when I was doing ‘SEO’, it was already hard.
For certain topics or niches, it was absolutely a nightmare, like ….mesothelioma….
That was just impossible to break into. Legal topics, it was impossible to break into.
And I think by now, we have so much content on the Internet that there’s a very large number of topics where it is like 15 years ago or 20 years ago, that mesothelioma topic, where it was impossible to break into.
…I remember Matt Cutts, former head of Web Spam, …he was doing these videos.
And in one of the videos, he said try to offer something unique or your own perspective to the thing that you are writing about.
Then the number of perspectives or available perspectives, free perspectives, is probably already gone.
But if you find a niche where people are not talking too much about, then suddenly, it’s much easier to break into.
So basically, this is me saying that you can break into most niches if you know what you are doing and if you are actually trying to help people.”
What Illyes is suggesting as a direction is to “know what you are doing and if you are actually trying to help people.”
That’s one of my secrets to staying one step ahead in SEO.
For example, before the reviews update, before Google added Experience to E-A-T, I was telling clients privately to do that for their review pages and I told them to keep it a secret, because I knew I had it dialed in.
I’m not psychic, I was just looking at what Google wants to rank and I figured it out several years before the reviews update that you need to have original photos, you need to have hands-on experience with the reviewed product, etc.
Gary’s right when he advises to look at the problem from the perspective of “trying to help people.”
He next followed up with this idea about choosing which battles to fight.
He said:
“…and I think the other big motivator is, as always, money. People are trying to break into niches that make the most money. I mean, duh, I would do the same thing probably.
But if you write about these topics that most people don’t write about, let’s say just three people wrote about it on the Internet, then maybe you can capture some traffic.
And then if you have many of those, then maybe you can even outdo those high-traffic niches.”
Barriers To Entry
What Gary is talking about is how to get around the barrier to entry represented by established sites. His suggestion is to stay away from offering what everyone else is offering (which is a quality consideration).
Creating content that the bigger sites can’t or don’t know to create is an approach I’ve used with a new site.
Weaknesses can be things the big sites do poorly, like an inability to resonate with a younger or older audience, and so on.
Those are examples of offering something different that makes the site stand out from a quality perspective.
Gary is talking about picking the battles that can be won, planting a flag, then moving on to the next hill.
That’s a far better strategy than going toe to toe with a bigger opponent.
Analyzing For Quality Issues
It’s a lot easier to analyze a site for technical issues than it is for quality issues.
But a few of the takeaways are:
- Be aware that the people closest to the content are not always the best judges of content quality.
- Read Google’s search documentation (for on-page factors, content, and quality guidelines).
- Content quality is simpler than it seems. Just think about knowing the topic well and being helpful to people.
- Being original is about looking at the SERPs for things that you can do differently, not about copying what the competitors are doing.
In my experience, it’s super important to keep an open mind and avoid getting locked into one way of thinking, especially when it comes to site quality; a fixed point of view can keep you from seeing the true cause of ranking issues.