Is your SEO performance a dumpster fire? Here’s how to salvage it
30-second summary:
- A failing SEO strategy can happen to the best of us
- No doubt it’s disheartening when your competitors are miles ahead and your business is struggling to bring in new leads
- Founder of LSEO and best-selling author, Kristopher (Kris) Jones provides comprehensive steps and advice on how you can salvage your SEO performance
Dumpster fires: surely they can’t happen to you. Right? But before you know it, your website’s traffic has tanked, your competitors are getting all the organic love, and you couldn’t get a conversion if your life depended on it. Folks, if your SEO performance sounds like that, you might just have a dumpster fire on your hands.
A failing SEO strategy can happen to the best of us. There’s no doubt it’s disheartening when your competitors are all miles ahead of you and your business isn’t bringing in new leads.
The good news is that it’s never too late to turn things around.
When is the best time to plant a tree? 20 years ago.
When’s the second-best time? Right now, so let’s get to it.
Here’s how to salvage your dumpster fire of an SEO strategy.
1. Review and optimize all your current content
I’m going to talk about content a few times in this post.
That’s because content has long been and remains the most important element to focus on in your overall SEO strategy.
Websites are nothing without content.
You can see a website getting by with no meta descriptions, and you can see one getting by without optimized images, but without content, what do you have?
Not a website!
But if you’re focusing on content first to turn around your SEO strategy, where do you start?
You start by optimizing everything you already have.
You don’t want to get ahead of yourself by constantly creating new content when you have a whole slew of old pages and posts that may have fallen into SEO disrepair.
Google treats freshly optimized content much like new content, so to start out, you’ll want to audit your existing content to see what’s good, what’s bad, and what you can fix up to be good again.
You can use a content audit tool like that found in Semrush, or, if you have a more manageable load of content to work with, checking things out manually would work well, too.
This is about more than just deciding what content you like or do not like, although you should be able to tell at a glance which topics are still relevant to your website.
But to check out the SEO performance of each page and post, you can use Semrush as I said, or go manual with Google Search Console.
What I like to do is to put each URL into Search Console and check out how it’s doing as far as impressions versus clicks, click-through rate, and the average positions of its ranking keywords.
That gives me a decent snapshot of which pages need attention.
For example, a page with 10,000 impressions in a 30-day period but only 100 clicks has a CTR of just one percent (not great).
I would then go to that page to figure out what is causing the low CTR.
The page is obviously being ranked for the keyword, given its high impressions, but if few people are clicking, then maybe the page isn’t as relevant for the term as it once was.
If that’s the case, then optimizing the page for SEO could be a matter of creating new sections of content around that keyword, and certainly retooling what’s there already.
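If it helps, here is a minimal sketch of that triage in code, flagging pages whose CTR falls below a cutoff using numbers exported from Search Console. The sample data and the two percent threshold are my own illustrative assumptions, not a standard:

```python
# Sketch: flag pages whose CTR falls below a cutoff, using metrics
# exported from Google Search Console. The sample data and the 2%
# threshold are illustrative assumptions, not a standard.

def flag_low_ctr(pages, threshold=0.02):
    """Return (url, ctr) pairs for pages whose CTR is below the threshold."""
    flagged = []
    for page in pages:
        impressions = page["impressions"]
        if impressions == 0:
            continue  # no impressions means no meaningful CTR
        ctr = page["clicks"] / impressions
        if ctr < threshold:
            flagged.append((page["url"], round(ctr, 4)))
    return flagged

pages = [
    {"url": "/guide", "impressions": 10_000, "clicks": 100},   # 1% CTR
    {"url": "/pricing", "impressions": 5_000, "clicks": 250},  # 5% CTR
]
print(flag_low_ctr(pages))  # [('/guide', 0.01)]
```

Pages that come back flagged are the ones worth opening up and retooling first.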
Optimizing your website’s content is a major part of improving your SEO strategy because it involves so many things that are going to help you.
For this first point, I focused only on the writing and editing part of the content optimization.
Let’s now move on to some other parts of an SEO strategy you could update (parts that could nonetheless still be involved in content optimization).
2. Assess and update all meta tags
Your pages’ meta tags play an important role in your website’s overall SEO health.
Meta tags are also one of the easiest things to let slip by as you work on your website, because they’re so brief and simple, and there are so many of them.
The thing is, meta tags can go out of date as the landscape shifts around your industry and the keywords for which you were optimizing are no longer relevant.
Meta tags are a classic example of why you can’t set it and forget it with SEO.
Meta tags are another element to look at as you go through your content pages to improve their CTR.
Sure, a lot of your content itself could use updates, but retool the meta titles and descriptions, as well.
Remember, the meta information is what organic users see as they scroll a SERP.
If your title and description aren’t interesting or urgent enough to draw in audiences that are in the awareness stage, then those people will keep on scrolling.
Redoing meta tags could include using a new target keyword, rewriting the call to action, or making everything more concise.
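To make “more concise” concrete, here is a rough sketch that flags titles and descriptions likely to be cut off on a SERP. The 60- and 155-character limits are common rules of thumb, not official cutoffs, since Google actually truncates by pixel width:

```python
# Sketch: flag meta titles and descriptions at risk of truncation.
# The 60/155 character limits are rules of thumb only; Google truncates
# snippets by pixel width, so treat these as rough guides.

TITLE_LIMIT = 60
DESCRIPTION_LIMIT = 155

def audit_meta(pages):
    """Return (url, issue) pairs for meta tags worth rewriting."""
    issues = []
    for url, title, description in pages:
        if len(title) > TITLE_LIMIT:
            issues.append((url, "title may be truncated"))
        if len(description) > DESCRIPTION_LIMIT:
            issues.append((url, "description may be truncated"))
        if not description:
            issues.append((url, "description missing"))
    return issues

pages = [
    ("/blog/seo-tips", "10 SEO Tips", "Short and punchy description."),
    ("/blog/guide",
     "The Complete, Exhaustive, Absolutely Definitive Guide to Everything About SEO",
     ""),
]
print(audit_meta(pages))
```

Run it over an export of your URLs, titles, and descriptions, and the flagged pages become your rewrite queue.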
Maybe start with a handful of pages only, say 20 or 30, and A/B test the old and new titles and descriptions to see how traffic and CTR change after your edits.
Doing that will confirm for you whether the updating you’re doing is worth it, and whether you should continue down this road with the rest of your pages.
3. Work on your technical performance
When you have to turn around your entire SEO strategy, you have to think about your website holistically.
That means focusing not just on your keywords and content, but also on how your pages perform technically.
I’m grouping issues such as image compression, site speed, mobile responsiveness, and Core Web Vitals all together under the umbrella of “technical performance.”
Although these factors are less “creative” and open-ended than keyword research or content optimization, they matter just as much.
When people get to your website and are greeted with slow pages, a messy mobile appearance, and content elements that jump around as they load, their trust in you drops.
In a world as competitive as ours, you can’t afford to give people cause for distrust, because you can bet that there are a hundred competitors waiting in line to market to those customers if you can’t do so successfully.
If development work isn’t your forte, look into contracting out to someone who can clean up your website’s coding and otherwise speed things up while also optimizing for mobile.
Images should be compressed so they take up less space but don’t lose any of their quality, and each image should have optimized alt text.
Compressing and optimizing images is something you can definitely do yourself, either through a plugin (on WordPress) or manually if it’s feasible.
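As a starting point, a small script like this can surface the images worth compressing. The 200 KB budget is an arbitrary assumption, and the script only finds candidates; the actual compression happens in your plugin or image editor:

```python
# Sketch: find images over a size budget so you know what to compress.
# The 200 KB budget is an arbitrary assumption; pick one that suits
# your pages. This only locates candidates; compression itself happens
# in a plugin or image tool.

from pathlib import Path

IMAGE_EXTS = {".jpg", ".jpeg", ".png", ".gif", ".webp"}

def oversized_images(root, budget_bytes=200_000):
    """Yield (path, size) for image files exceeding the byte budget."""
    for path in sorted(Path(root).rglob("*")):
        if path.suffix.lower() in IMAGE_EXTS and path.is_file():
            size = path.stat().st_size
            if size > budget_bytes:
                yield str(path), size
```

Point it at your site’s media directory and you get a prioritized hit list instead of guessing which images are dragging down load times.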
Even though page speed and load times aren’t always the most accessible kind of work for business owners and website owners, they’re important issues to keep in mind as you labor toward turning around your dumpster fire of an SEO strategy.
4. Resume creating new content
You can turn around even the worst SEO strategy in the world.
Google isn’t going to hold your feet to the fire forever just because your SEO has been in the dumps for the last few years.
Google crawls your site every so often whether you’re doing something with it or not, and as it sees that your SEO is improving, it can start to rank some of those pages higher.
So here is where we get into creating all-new, high-quality content.
Content in 2023 can mean a whole range of things, from blog posts to infographics to videos and podcasts and webinars and slide decks.
Whatever makes sense for your business and your industry is what you should do. Whatever types of media you know your audiences like to consume, give that to them.
In 2023, however, you have to be incredibly mindful of being comprehensive and useful for people.
If there’s anything that we’ve learned from 2022’s helpful content update, it’s that you just cannot skimp on content creation (not that you ever could, but Google is smarter than it was 10 years ago).
Gone are the days of skirting by on SEO-centric content, created just to score some ranking for this or that keyword.
Google is paying much more attention now to the intent and usefulness of a piece, and rewarding those web pages featuring actually helpful content (get it?) with higher rankings.
A perfect example of how Google is thinking these days is the product review update, also from 2022.
Google is now deprioritizing the ranking of low-quality product reviews in favor of more expert-level reviews where the reviewer has actually used the product or service and can speak to its pros and cons.
Why? Because Google wants to direct users to content they can actually trust to help them.
When you take the product reviews update and helpful content update together, you can see why content marketing has gotten so much harder over the years.
You can’t just rank after spending an hour on a 400-word blog post anymore.
You have to be a real expert, or at least put in the time and effort to create deep content if you serve a portfolio of clients.
These are all things you must keep in mind as you create new content for your website in the name of putting out your dumpster fire of an SEO strategy.
Now, of course, there are the nuts and bolts you have to remember, as well, when it comes to new content.
You have to mine the SERPs, develop the proper keyword strategy, and understand the correct intent behind those keywords to be sure you’re creating what people expect to see when they search that keyword.
That stuff you can all learn.
What I want you to take from this section is the idea that you have to work to create that new content. You have to put in that time and dedication to do it well.
5. If you’re local, focus on reviews
I don’t want to leave out the local businesses here: if you’re a local business, did you know that one of the largest single factors in helping your SEO is getting positive Google reviews?
Now, local businesses need to perform all the on-page SEO work that anyone else does, but what do you do as an ongoing SEO strategy?
The play here isn’t keyword-driven SEO content so much, because your local audience isn’t really going to find you that way.
Local audiences find local businesses by performing local searches and checking out the reviews in the map pack.
In fact, 77 percent of local buyers always read online reviews while checking out local businesses.
Your reviews affect the level of trust the public has in you. More people are likely to visit your website and use your business when they see that others have had a positive experience with you.
The cycle goes on when you encourage your customers to leave positive Google reviews.
The more reviews you have, and the more positive they are, the better off your chances will be of rising to the top of your local map pack.
Being at the top should translate into more traffic and better SEO overall.
6. Build natural backlinks
Finally, I want to mention another pillar of Google’s list of known ranking factors: natural backlinks.
Links are what tie everything on the internet together.
They’re also vital in keeping the ranking juices flowing to your web pages when it comes to your SEO strategy.
Backlinks to your website from other websites show Google that you’re an authority in your market niche since people want to reference what you have to say.
Link building, then, is really about building relationships to get your name out there as a trustworthy resource for others.
When Google sees your links coming from relevant, authoritative websites, it will assign more trust to your own site.
Just remember to keep the links coming from websites that make sense to your own.
The quality matters much more than the quantity here.
To do it, create content that people would want to link to, something with a lot of useful stats and other data.
You can also scout other websites in your niche to see where they may have content gaps, and then create content to fill that gap and ask for a link back.
It takes time and effort, and you’re not guaranteed anything, but it’s the natural way to earn backlinks that will actually help your SEO.
Give your SEO time to turn around
You can put out even the biggest dumpster fire when you know what to do and how to do it.
I’ll say again that SEO dumpster fires can happen to the best of us. Sometimes we go all-in on things we think will work, and they don’t.
Sometimes we get lazy and let our SEO go for years.
But it’s never too late to correct things.
It will definitely take time to see things start to shift for you, though; SEO isn’t an overnight solution. It needs anywhere from three to six months or longer to start showing a difference.
If you keep in mind both the broad strokes and the specifics of everything I’ve described here, you truly can reinvent your SEO strategy and be on your way to business growth.
Kris Jones is the founder and former CEO of digital marketing and affiliate network Pepperjam, which he sold to eBay Enterprises in 2009. Most recently Kris founded SEO services and software company LSEO.com and has previously invested in numerous successful technology companies. Kris is an experienced public speaker and is the author of one of the best-selling SEO books of all time called, ‘Search-Engine Optimization – Your Visual Blueprint to Effective Internet Marketing’, which has sold nearly 100,000 copies.
How To Stop Filter Results From Eating Crawl Budget
Today’s Ask An SEO question comes from Michal in Bratislava, who asks:
“I have a client who has a website with filters based on a map locations. When the visitor makes a move on the map, a new URL with filters is created. They are not in the sitemap. However, there are over 700,000 URLs in the Search Console (not indexed) and eating crawl budget.
What would be the best way to get rid of these URLs? My idea is keep the base location ‘index, follow’ and newly created URLs of surrounded area with filters switch to ‘noindex, no follow’. Also mark surrounded areas with canonicals to the base location + disavow the unwanted links.”
Great question, Michal, and good news! The answer is an easy one to implement.
First, let’s look at what you’re trying to do and apply it to other situations, like ecommerce and publishers, so more people can benefit. Then we’ll go into your strategies above and end with the solution.
What Crawl Budget Is And How Parameters Are Created That Waste It
If you’re not sure what Michal is referring to with crawl budget, this is a term some SEO pros use to explain that Google and other search engines will only crawl so many pages on your website before they stop.
If your crawl budget is used on low-value, thin, or non-indexable pages, your good pages and new pages may not be found in a crawl.
If they’re not found, they may not get indexed or refreshed. If they’re not indexed, they cannot bring you SEO traffic.
This is why optimizing a crawl budget for efficiency is important.
Michal shared an example of how URLs that are “thin” from an SEO point of view are created as customers use filters.
The experience for the user is value-adding, but from an SEO standpoint, a location-based page would be better. This applies to ecommerce and publishers, too.
Ecommerce stores will have searches for colors like red or green and products like t-shirts and potato chips.
These create URLs with parameters just like a filter search for locations. They could also be created by using filters for size, gender, color, price, variation, compatibility, etc. in the shopping process.
The filtered results help the end user but compete directly with the collection page, and the collection would be the “non-thin” version.
Publishers have the same. Someone might be on SEJ looking for SEO or PPC in the search box and get a filtered result. The filtered result will have articles, but the category of the publication is likely the best result for a search engine.
These filtered results can be indexed because they get shared on social media or someone adds them as a comment on a blog or forum, creating a crawlable backlink. It might also be that an employee in customer service responded to a question on the company blog, or any number of other ways.
The goal now is to make sure search engines don’t spend time crawling the “thin” versions so you can get the most from your crawl budget.
The Difference Between Indexing And Crawling
There’s one more thing to learn before we go into the proposed ideas and solutions – the difference between indexing and crawling.
- Crawling is the discovery of new pages within a website.
- Indexing is adding a page to the search engine’s database of pages that are worthy of being shown to searchers.
Pages can get crawled but not indexed. Indexed pages have likely been crawled and will likely get crawled again to look for updates and server responses.
But not all indexed pages will bring in traffic or hit the first page because they may not be the best possible answer for queries being searched.
Now, let’s go into making efficient use of crawl budgets for these types of solutions.
Using Meta Robots Or X-Robots-Tag
The first solution Michal pointed out was an “index,follow” directive. This tells a search engine to index the page and follow the links on it. This is a good idea, but only if the filtered result is the ideal experience.
From what I can see, this would not be the case, so I would recommend making it “noindex,follow.”
Noindex would say, “This is not an official page, but hey, keep crawling my site, you’ll find good pages in here.”
And if you have your main menu and navigational internal links done correctly, the spider will hopefully keep crawling them.
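For reference, a “noindex, follow” directive can be delivered either in the page’s markup or as an HTTP response header; both forms below are standard:

```html
<!-- Option 1: a robots meta tag in the page's <head> -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: an X-Robots-Tag HTTP response header, sent by the
     server rather than the markup (handy for non-HTML files):
     X-Robots-Tag: noindex, follow -->
```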
Canonicals To Solve Wasted Crawl Budget
Canonical links are used to help search engines know what the official page to index is.
If a product exists in three categories on three separate URLs, only one should be “the official” version, so the two duplicates should have a canonical pointing to the official version. The official one should have a canonical link that points to itself. This applies to the filtered locations.
If the location search would result in multiple city or neighborhood pages, the result would likely be a duplicate of the official one you have in your sitemap.
Have the filtered results point a canonical back to the main page of filtering instead of being self-referencing if the content on the page stays the same as the original category.
If the content pulls in your localized page with the same locations, point the canonical to that page instead.
In most cases, the filtered version inherits the page you searched or filtered from, so that is where the canonical should point to.
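In markup, that is a single link element in the head of the filtered URL. The URLs below are illustrative, not from Michal’s actual site:

```html
<!-- On a filtered URL such as /hotels/?filter=downtown, point the
     canonical back at the base page it was filtered from: -->
<link rel="canonical" href="https://www.example.com/hotels/">

<!-- The base page itself carries a self-referencing canonical. -->
```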
If you apply noindex and also keep a self-referencing canonical, which is overkill, you send a conflicting signal.
The same applies to when someone searches for a product by name on your website. The search result may compete with the actual product or service page.
With this solution, you’re telling the spider not to index this page because it isn’t worth indexing, but it is also the official version. It doesn’t make sense to do this.
Instead, use a canonical link, as I mentioned above, or noindex the result and point the canonical to the official version.
Disavow To Increase Crawl Efficiency
Disavowing doesn’t have anything to do with crawl efficiency unless the search engine spiders are finding your “thin” pages through spammy backlinks.
The disavow tool from Google is a way to say, “Hey, these backlinks are spammy, and we don’t want them to hurt us. Please don’t count them towards our site’s authority.”
In most cases, it doesn’t matter, as Google is good at detecting spammy links and ignoring them.
You do not want to add your own site and your own URLs to the disavow tool. You’re telling Google your own site is spammy and not worth anything.
Plus, submitting backlinks to disavow won’t prevent a spider from seeing what you want and do not want to be crawled, as it is only for saying a link from another site is spammy.
Disavowing won’t help with crawl efficiency or saving crawl budget.
How To Make Crawl Budgets More Efficient
The answer is robots.txt. This is how you tell specific search engines and spiders what to crawl.
You can include the folders you want them to crawl by marking them as “allow,” and you can say “disallow” on filtered results by disallowing the “?” or “&” symbol, or whichever one you use.
If some of those parameters should be crawled, add the main word like “?filter=location” or a specific parameter.
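A robots.txt along these lines would express that. The paths, the parameter name, and the sitemap URL are examples, not Michal’s actual setup; note that for Googlebot, the longer and more specific “Allow” rule wins over the shorter “Disallow”:

```text
User-agent: *
# Block parameterized filter URLs from being crawled
Disallow: /*?*

# But keep crawling one specific parameter you do want crawled
Allow: /*?filter=location

Sitemap: https://www.example.com/sitemap.xml
```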
Robots.txt is how you define crawl paths and work on crawl efficiency. Once you’ve optimized that, look at your internal links: the links from one page on your site to another.
These help spiders find your most important pages while learning what each is about.
Internal links include:
- Breadcrumbs.
- Menu navigation.
- Links within content to other pages.
- Sub-category menus.
- Footer links.
You can also use a sitemap if you have a large site, and the spiders are not finding the pages you want with priority.
I hope this helps answer your question. It is one I get a lot – you’re not the only one stuck in that situation.
Ad Copy Tactics Backed By Study Of Over 1 Million Google Ads
Mastering effective ad copy is crucial for achieving success with Google Ads.
Yet, the PPC landscape can make it challenging to discern which optimization techniques truly yield results.
Although various perspectives exist on optimizing ads, few are substantiated by comprehensive data. A recent study from Optmyzr attempted to address this.
The goal isn’t to promote or dissuade any specific method but to provide a clearer understanding of how different creative decisions impact your campaigns.
Use the data to help you identify higher profit probability opportunities.
Methodology And Data Scope
The Optmyzr study analyzed data from over 22,000 Google Ads accounts that have been active for at least 90 days with a minimum monthly spend of $1,500.
Across more than a million ads, we assessed Responsive Search Ads (RSAs), Expanded Text Ads (ETAs), and Demand Gen campaigns. Due to API limitations, we could not retrieve asset-level data for Performance Max campaigns.
Additionally, all monetary figures were converted to USD to standardize comparisons.
Key Questions Explored
To provide actionable insights, we focused on addressing the following questions:
- Is there a correlation between Ad Strength and performance?
- How do pinning assets impact ad performance?
- Do ads written in title case or sentence case perform better?
- How does creative length affect ad performance?
- Can ETA strategies effectively translate to RSAs and Demand Gen ads?
As we evaluated the results, it’s important to note that our data set represents advanced marketers.
This means there may be selection bias, and these insights might differ in a broader advertiser pool with varying levels of experience.
The Relationship Between Ad Strength And Performance
Google explicitly states that Ad Strength is a tool designed to guide ad optimization rather than act as a ranking factor.
Despite this, marketers often hold mixed opinions about its usefulness, as its role in ad performance appears inconsistent.
Our data corroborates this skepticism. Ads labeled with an “average” Ad Strength score outperformed those with “good” or “excellent” scores in key metrics like CPA, conversion rate, and ROAS.
This disparity is particularly evident in RSAs, where the ROAS tends to decrease sharply when moving from “average” to “good,” with only a marginal increase when advancing to “excellent.”
Interestingly, Demand Gen ads also showed a stronger performance with an “average” Ad Strength, except for ROAS.
The metrics for conversion rates in Demand Gen and RSAs were notably similar, which is surprising since Demand Gen ads are typically designed for awareness, while RSAs focus on driving transactions.
Key Takeaways:
- Ad Strength doesn’t reliably correlate with performance, so it shouldn’t be a primary metric for assessing your ads.
- Most ads with “poor” or “average” Ad Strength labels perform well by standard advertising KPIs.
- “Good” or “excellent” Ad Strength labels do not guarantee better performance.
How Does Pinning Affect Ad Performance?
Pinning refers to locking specific assets like headlines or descriptions in fixed positions within the ad. This technique became common with RSAs, but there’s ongoing debate about its efficacy.
Some advertisers advocate for pinning all assets to replicate the control offered by ETAs, while others prefer to let Google optimize placements automatically.
Our data suggests that pinning some, but not all, assets offers the most balanced results in terms of CPA, ROAS, and CPC. However, ads where all assets are pinned achieve the highest relevance in terms of CTR.
Still, this marginally higher CTR doesn’t necessarily translate into better conversion metrics. Ads with unpinned or partially pinned assets generally perform better in terms of conversion rates and cost-based metrics.
Key Takeaways:
- Selective pinning is optimal, offering a good balance between creative control and automation.
- Fully pinned ads may increase CTR but tend to underperform in metrics like CPA and ROAS.
- Advertisers should embrace RSAs, as they consistently outperform ETAs – even with fully pinned assets.
Title Case Vs. Sentence Case: Which Performs Better?
The choice between title case (“This Is a Title Case Sentence”) and sentence case (“This is a sentence case sentence”) is often a point of contention among advertisers.
Our analysis revealed a clear trend: Ads using sentence case generally outperformed those in title case, particularly in RSAs and Demand Gen campaigns.
(Figures: RSA, ETA, and Demand Gen data)
ROAS, in particular, showed a marked preference for sentence case across these ad types, suggesting that a more natural, conversational tone may resonate better with users.
Interestingly, many advertisers still use a mix of title and sentence case within the same account, which counters the traditional approach of maintaining consistency throughout the ad copy.
Key Takeaways:
- Sentence case outperforms title case in RSAs and Demand Gen ads on most KPIs.
- Including sentence case ads in your testing can improve performance, as it aligns more closely with organic results, which users perceive as higher quality.
- Although ETAs perform slightly better with title case, sentence case is increasingly the preferred choice in modern ad formats.
The Impact Of Ad Length On Performance
Ad copy, particularly for Google Ads, requires brevity without sacrificing impact.
We analyzed the effects of character count on ad performance, grouping ads by the length of headlines and descriptions.
(Figures: RSA, ETA, and Demand Gen data)
Interestingly, shorter headlines tend to outperform longer ones in CTR and conversion rates, while descriptions benefit from moderate length.
Ads that tried to maximize character counts by using dynamic keyword insertion (DKI) or customizers often saw no significant performance improvement.
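As an illustration of the kind of analysis behind this finding, here is a sketch that buckets ads by headline length and compares average CTR per bucket. The sample ads, their CTRs, and the bucket boundaries are all made up for the example:

```python
# Sketch: group ads by headline length and compare average CTR per
# bucket. The sample data and bucket boundaries are illustrative only.

def ctr_by_headline_length(ads, buckets=((0, 20), (21, 30), (31, 100))):
    """Return {bucket: average CTR} over ads = (headline, clicks, impressions)."""
    totals = {b: [0, 0] for b in buckets}  # bucket -> [clicks, impressions]
    for headline, clicks, impressions in ads:
        for low, high in buckets:
            if low <= len(headline) <= high:
                totals[(low, high)][0] += clicks
                totals[(low, high)][1] += impressions
                break
    return {b: round(c / i, 4) for b, (c, i) in totals.items() if i}

ads = [
    ("Buy Shoes Online", 50, 1_000),                            # 16 chars
    ("Shop Our Huge Shoe Selection", 40, 1_000),                # 28 chars
    ("Discover the Best Shoes for Every Occasion", 30, 1_000),  # 42 chars
]
print(ctr_by_headline_length(ads))
```

Run over real account data, a pattern like this (CTR falling as headline length grows) is what the study reports for RSAs.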
Moreover, applying ETA strategies to RSAs proved largely ineffective.
In almost all cases, advertisers who carried over ETA tactics to RSAs saw a decline in performance, likely because of how Google dynamically assembles ad components for display.
Key Takeaways:
- Shorter headlines lead to better performance, especially in RSAs.
- Focus on concise, impactful messaging instead of trying to fill every available character.
- ETA tactics do not translate well to RSAs, and attempting to replicate them can hurt performance.
Final Thoughts On Ad Optimizations
In summary, several key insights emerge from this analysis.
First, Ad Strength should not be your primary focus when assessing performance. Instead, concentrate on creating relevant, engaging ad copy tailored to your target audience.
Additionally, pinning assets should be a strategic, creative decision rather than a hard rule, and advertisers should incorporate sentence case into their testing for RSAs and Demand Gen ads.
Finally, focus on quality over quantity in ad copy length, as longer ads do not always equate to better results.
By refining these elements of your ads, you can drive better ROI and adapt to the evolving landscape of Google Ads.
Read the full Ad Strength & Creative Study from Optmyzr.