How to Learn SEO (Complete Roadmap)

Learning SEO can seem overwhelming. It’s a complex topic, and the industry is rife with misinformation. But with a bit of time, effort, and the right roadmap, it’s something that anyone can learn.

Here’s the roadmap we’ll cover in this guide:

Learn SEO flowchart

  1. Learn SEO fundamentals
  2. Put your knowledge into practice
  3. Deepen your SEO knowledge
  4. Keep your finger on the pulse
  5. Teach others what you know

Let’s get to it.

1. Learn SEO fundamentals

If you’re not already familiar with the basics of SEO, this is where you should start. Specifically, you need to understand how search engines work and the four main facets of SEO. Let’s go through these real quick.

How search engines work

Search engines work by finding content and storing it in a big index. They then use complex processes, also known as search algorithms, to rank content from the index when a user performs a search. In other words, when you search for something on Google, you’re not searching the entire web—you’re searching Google’s index.

This means that if Google can’t find and index your content, it can’t rank it: pages that aren’t in the index never show up in search results.

Google builds its index from two main sources:

  1. Sitemaps – A sitemap is a file listing all the important pages on your website that you want search engines to index. You can submit your sitemap to Google to tell it that your pages exist.
  2. Links from known webpages – Google already has billions of pages in its index. If you get a link from one of those pages, Google can “follow” the link to discover your page.

Note that Google can also discover new pages by “following” links from known pages on your own website.

For example, if Google already has your blog homepage in its index, you can link internally to newly published blog posts from there. Google would be able to “follow” these links to discover newly published posts on your blog.
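
To make this concrete, here’s a minimal sketch of what a sitemap file looks like (the example.com URLs and dates are placeholders; a real sitemap lists your own pages and typically lives at /sitemap.xml):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want search engines to find and index -->
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2023-01-15</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/blog/new-post/</loc>
        <lastmod>2023-01-20</lastmod>
      </url>
    </urlset>

You can submit this file through Google Search Console or reference it from your robots.txt file.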

Learn more: How Do Search Engines Work and Why Should You Care?

Keyword research

Keyword research is the process of finding out what your customers are searching for. It’s important because if nobody is searching for the keywords you target, you won’t get any search traffic no matter how well you rank.

Because Google doesn’t exactly make this information accessible, the best way to find keywords is with a keyword research tool like Ahrefs’ Keywords Explorer. To use it, enter one or a few broad topics related to your industry, hit search, then go to one of the keyword ideas reports. You’ll see the keywords’ monthly search volumes and a few other SEO metrics.

Matching terms report results

In Keywords Explorer, we also show the “Traffic Potential” metric for each keyword. This estimates how much traffic the current top-ranking page for the keyword gets, which is usually a good indicator of how much traffic you can get by ranking #1.

Keywords Explorer overview

As pages tend to rank for more than one keyword, “Traffic Potential” usually gives a more accurate estimate of a keyword’s potential than its search volume.

Learn more: How to Do Keyword Research for SEO

On-page SEO

On-page SEO is where you optimize the content on your page to rank higher on search engines. It revolves heavily around understanding what searchers want and giving it to them—a process known as optimizing for search intent.

For example, if we look at the top results for the keyword “best protein powder,” we see that they’re all blog posts comparing top picks:

Google SERP of "best protein powder"

This tells us that although searchers are in the market for a protein powder, they’re still weighing up their options and aren’t quite ready to buy. As a result, it would be tough to rank an e‑commerce product page for this query. That’s not what searchers want.

Learn more: On-Page SEO: Complete Beginner’s Guide

Link building

Link building is the process of acquiring backlinks from other websites to your site. It’s important because backlinks are one of Google’s top three ranking factors.

This is probably why there’s a clear correlation between linking websites and organic traffic:

Line graph showing clear correlation between referring domains and search traffic

Not all links are created equal, however. Links from relevant and high-quality websites usually move the needle more than links from irrelevant and low-quality websites. In other words, if your site is about Bitcoin, a link from a website about cryptocurrencies will likely positively impact rankings more than one from a website about travel.

Building high-quality links to your website is arguably one of the most challenging aspects of SEO and one of the most in-demand SEO skills.

Learn more: The Beginner’s Guide to Link Building

Technical SEO

Technical SEO ensures that search engines like Google can find, crawl, and index your content. Unless they can do all three of these things, it’s unlikely that your pages will show up in the search results.

Let’s take a look at these three things in more detail.

  1. Find – Google first needs to know that your page exists and where to find it.
  2. Crawl – Google then needs permission to crawl the page, i.e., to have a computer program (a crawler) download the page’s content.
  3. Index – Google then needs permission to add your page to its index.

You can solve the first part of the process by ensuring that your page has links from other known pages on your website and is in a sitemap that you’ve submitted to Google.

As for crawling and indexing, you need to ensure that you’re not blocking Google from doing either of these things. This is done using a file called robots.txt (crawling) and a meta tag called meta robots (indexing).
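
For reference, here’s roughly what those two controls look like. First, robots.txt controls crawling (this sketch assumes a hypothetical /admin/ area you don’t want crawled):

    # robots.txt – served at https://example.com/robots.txt
    User-agent: *
    Disallow: /admin/

    Sitemap: https://example.com/sitemap.xml

And a meta robots tag controls indexing; a page carrying this tag in its <head> asks search engines not to index it:

    <meta name="robots" content="noindex, follow">

The main thing at this stage is to make sure directives like these aren’t accidentally applied to pages you want ranked.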

Learn more: The Beginner’s Guide to Technical SEO

2. Put your knowledge into practice

Here’s an apt quote:

Knowing SEO theory is one thing; applying that knowledge to rank a website is another thing entirely. You’ll learn more about SEO in the trenches than any other way.

Will Critchlow

For example, when I was getting started in SEO, I created a bodybuilding website, as I was interested in the topic at the time. First, I made sure my technical SEO was on point and that Google could find, crawl, and index any content I published. I then did some keyword research to find topics to cover. After that, I began publishing optimized content.

Here’s the first post I published in August 2012:

Excerpt of article about best types of protein powder

Finally, I built some links.

Here’s one of the links I built with a guest post (it’s still live today… 10 years later!):

Excerpt of article about motivating yourself to go to the gym

This website ended up doing quite well, which validated that the SEO theory I’d learned made sense. However, I made some mistakes too. For example, I distinctly recall the rankings for a page tanking after randomly deciding to rewrite the copy. This taught me a valuable lesson that I didn’t learn elsewhere: If it ain’t broke, don’t fix it! 

3. Deepen your SEO knowledge

It’s impossible to learn absolutely everything about every facet of SEO. The topic is just too broad. So now that you’ve spent some time in the trenches and learned which aspects of SEO you enjoy, it’s time to niche down and deepen your knowledge in one area.

This is known as becoming a t‑shaped SEO.

T-shaped SEO

Being a t‑shaped SEO means that you have a broad knowledge of all things SEO but excel in one particular area. The area you choose to specialize in should be one that you’re best at and most enjoy.

For me, this is link building—which is why I’ve written much of our content about this topic.

Here’s one more example of a t‑shaped SEO: Marie Haynes.

Notice how her specialty is hyperspecific? Instead of choosing one broad facet of SEO (e.g., keyword research or link building), she decided to specialize in the niche area of Google penalty recovery. As a result, there’s probably no SEO on the planet who knows more about this topic than Marie.

Going hyperspecific like this is a good idea if you’re learning SEO to become an in-demand SEO expert. But if you’re looking to rank websites, it’s probably better to keep things slightly broader and stick with one of the four main facets of SEO.

Either way, you should always test what you learn on your website. This is where true learning happens.

4. Keep your finger on the pulse

Despite what many people say, the fundamentals of SEO barely change. But small things are constantly changing. There are Google updates multiple times a year, changes to how search engines handle aspects of technical SEO, smart folks coming up with new tactics, etc.

With this in mind, while you shouldn’t spend all day every day reading SEO news, it’s important to keep your finger on the pulse.

Here are a few ways to do that:

Attend SEO conferences and meetups

SEO is a big industry with big conferences. For example, BrightonSEO attracts more than 4,000 attendees. There are numerous smaller meetups too, which you can find on meetup.com, such as this one in my hometown. These are all places where like-minded people doing SEO share insights and tactics, so there’s a lot to learn from getting involved.

Learn more: 7 SEO Conferences (Online and Offline) to Attend

Listen to SEO podcasts

Podcasters often interview smart SEOs about their successes, failures, and experiences, making podcasts a great way to keep your finger on the pulse while on the go. For example, in this episode of the Authority Hacker podcast, link building extraordinaire Bibi shares her creative approach to link outreach emails.

Learn more: 15 Podcasts to Boost Your SEO Game

Join SEO Facebook groups

Facebook has an active community of SEOs who are always willing to answer questions and offer advice should you need it. In fact, our Facebook group, Ahrefs Insider, has almost 17K members and is very active.

Learn more: 4 Best Facebook Groups for SEOs (Most Voted For)

Join SEO Slack communities

If you’d prefer not to be distracted by Facebook, consider joining an SEO Slack community. Some are free, whereas others charge a monthly subscription. Traffic Think Tank (TTT) is a good choice if you’re open to paid communities.

Learn more: 11 Slack Communities for SEOs and Digital Marketers

Read SEO blogs

… Like the one you’re reading, where we often publish unique ideas, processes, and studies. For example, when Google switched to relying less on title tags to generate SERP titles, we studied almost a million pages and published the results for the community.

Learn more: 29 Awesome SEO Blogs to Follow (Graded and Ranked)

Watch SEO YouTube videos

… Like our YouTube channel, where we publish similar content to our blog.

Follow official Google sources

Google publishes official algorithm updates and announcements on the Search Console Blog and hosts weekly “office hours” hangouts on its YouTube channel. You can also follow Google search representatives like John Mueller and Gary Illyes on Twitter.

Read SEO news

Search Engine Roundtable describes itself as the pulse of search marketing and publishes daily updates on everything search. Search Engine Land and Search Engine Journal also frequently publish news.

5. Teach others what you know (optional)

Look back at the roadmap, and you’ll see a recommendation to share what you learn with others.

Learn SEO flowchart

This may seem counterintuitive, given that you want to learn more about SEO, but I find that teaching others helps me retain and assimilate knowledge. I think it’s because it forces me to articulate things, which often leads me to conclude that I don’t know as much as I thought I knew.

While you can do this publicly on a blog or YouTube channel, you can also do it semi-privately (in groups and communities) or privately (direct messages, face-to-face).

If you’re thick-skinned enough, doing it publicly often provides an extra line of defense against misinformation because people are usually kind enough to call you out when you get things wrong.

For example, here’s Bill Slawski pointing out an inaccurate claim in one of my articles on Twitter:

Bill's tweet about Google not ranking pages on the basis of content accuracy

This leads me to an important point…

Don’t try to teach others SEO unless one of these things is true:

  • You’ve thoroughly researched and understood what you’re teaching.
  • You’re teaching something based on personal experience and testing (and you’ve made that fact clear).

The last thing you want to do is contribute more misinformation to an industry already rife with it.

Final thoughts

The psychologist K. Anders Ericsson’s research on deliberate practice, later popularized as the “10,000-hour rule,” suggests that true mastery of a skill takes thousands of hours of practice. You’ll certainly gain a good understanding of SEO in that time, but the truth is that you never stop learning. I’ve been involved in SEO for 11+ years, and I learn new things all the time.

But remember, learning isn’t only about reading and retaining information. It’s also about putting what you read into practice, testing things for yourself, and finding ways to improve on conventional wisdom over time.

Got questions? Ping me on Twitter.





YouTube Extends Shorts To 3 Minutes, Adds New Features

YouTube expands Shorts to 3 minutes, adds templates, AI tools, and the option to show fewer Shorts on the homepage.

  • YouTube Shorts will allow 3-minute videos.
  • New features include templates, enhanced remixing, and AI-generated video backgrounds.
  • YouTube is adding a Shorts trends page and comment previews.


How To Stop Filter Results From Eating Crawl Budget

Today’s Ask An SEO question comes from Michal in Bratislava, who asks:

“I have a client who has a website with filters based on a map locations. When the visitor makes a move on the map, a new URL with filters is created. They are not in the sitemap. However, there are over 700,000 URLs in the Search Console (not indexed) and eating crawl budget.

What would be the best way to get rid of these URLs? My idea is keep the base location ‘index, follow’ and newly created URLs of surrounded area with filters switch to ‘noindex, no follow’. Also mark surrounded areas with canonicals to the base location + disavow the unwanted links.”

Great question, Michal, and good news! The answer is an easy one to implement.

First, let’s look at what you’re trying to do and apply it to other situations, like ecommerce and publishers, so more people can benefit. Then we’ll go through the strategies you listed above and end with the solution.

What Crawl Budget Is And How Parameters Are Created That Waste It

If you’re not sure what Michal is referring to with crawl budget, this is a term some SEO pros use to describe the fact that Google and other search engines will only crawl so many pages on your website before they stop.

If your crawl budget is used on low-value, thin, or non-indexable pages, your good pages and new pages may not be found in a crawl.

If they’re not found, they may not get indexed or refreshed. If they’re not indexed, they cannot bring you SEO traffic.

This is why optimizing a crawl budget for efficiency is important.

Michal shared an example of how URLs that are “thin” from an SEO point of view get created as customers use filters.

The experience for the user is value-adding, but from an SEO standpoint, a location-based page would be better. This applies to ecommerce and publishers, too.

Ecommerce stores will have searches for colors like red or green and products like t-shirts and potato chips.

These create URLs with parameters just like a filter search for locations. They could also be created by using filters for size, gender, color, price, variation, compatibility, etc. in the shopping process.

The filtered results help the end user but compete directly with the collection page, and the collection would be the “non-thin” version.

Publishers have the same issue. Someone might be on SEJ looking for “SEO” or “PPC” in the search box and get a filtered result. The filtered result will have articles, but the publication’s category page is likely the better result for a search engine.

These filtered results can get indexed because they’re shared on social media, or because someone adds them as a comment on a blog or forum, creating a crawlable backlink. It might also be a customer service employee responding to a question on the company blog, or any number of other ways.

The goal now is to make sure search engines don’t spend time crawling the “thin” versions so you can get the most from your crawl budget.

The Difference Between Indexing And Crawling

There’s one more thing to learn before we go into the proposed ideas and solutions – the difference between indexing and crawling.

  • Crawling is the discovery of new pages within a website.
  • Indexing is adding the pages that are worthy of showing to a person using the search engine to the database of pages.

Pages can get crawled but not indexed. Indexed pages have likely been crawled and will likely get crawled again to look for updates and server responses.

But not all indexed pages will bring in traffic or hit the first page because they may not be the best possible answer for queries being searched.

Now, let’s go into making efficient use of crawl budgets for these types of solutions.

Using Meta Robots Or X Robots

The first solution Michal pointed out was an “index,follow” directive. This tells a search engine to index the page and follow the links on it. This is a good idea, but only if the filtered result is the ideal experience.

From what I can see, this would not be the case, so I would recommend making it “noindex,follow.”

Noindex would say, “This is not an official page, but hey, keep crawling my site, you’ll find good pages in here.”

And if you have your main menu and navigational internal links done correctly, the spider will hopefully keep crawling them.
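
For reference, here’s what that “noindex,follow” directive looks like in both forms. As a meta robots tag, it goes in the filtered page’s <head>:

    <meta name="robots" content="noindex, follow">

As an X-Robots-Tag, it’s an HTTP response header your server or application sends for the filtered URLs (how you attach it depends on your stack):

    HTTP/1.1 200 OK
    Content-Type: text/html
    X-Robots-Tag: noindex, follow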

Canonicals To Solve Wasted Crawl Budget

Canonical links are used to help search engines know what the official page to index is.

If a product exists in three categories on three separate URLs, only one should be “the official” version, so the two duplicates should have a canonical pointing to the official version. The official one should have a canonical link that points to itself. This applies to the filtered locations.

If the location search would result in multiple city or neighborhood pages, the result would likely be a duplicate of the official one you have in your sitemap.

If the content on the filtered page stays the same as the original category, have the filtered results point a canonical back to the main page they were filtered from instead of being self-referencing.

If the content pulls in your localized page with the same locations, point the canonical to that page instead.

In most cases, the filtered version inherits the page you searched or filtered from, so that is where the canonical should point to.

Doing both noindex and a self-referencing canonical on the same page is overkill and sends a conflicting signal.

The same applies to when someone searches for a product by name on your website. The search result may compete with the actual product or service page.

With this solution, you’re telling the spider not to index this page because it isn’t worth indexing, but it is also the official version. It doesn’t make sense to do this.

Instead, use a canonical link, as I mentioned above, or noindex the result and point the canonical to the official version.
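
As a quick sketch (the example.com URLs are hypothetical), a filtered page that duplicates its parent category would carry a canonical pointing back to that category, while the official category page points to itself:

    <!-- On the filtered page: https://example.com/locations/?area=north -->
    <link rel="canonical" href="https://example.com/locations/">

    <!-- On the official category page: https://example.com/locations/ -->
    <link rel="canonical" href="https://example.com/locations/">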

Disavow To Increase Crawl Efficiency

Disavowing doesn’t have anything to do with crawl efficiency unless the search engine spiders are finding your “thin” pages through spammy backlinks.

The disavow tool from Google is a way to say, “Hey, these backlinks are spammy, and we don’t want them to hurt us. Please don’t count them towards our site’s authority.”

In most cases, it doesn’t matter, as Google is good at detecting spammy links and ignoring them.

You do not want to add your own site and your own URLs to the disavow tool. You’re telling Google your own site is spammy and not worth anything.

Plus, submitting backlinks to the disavow tool won’t stop a spider from crawling what you do or don’t want crawled; it’s only a way of saying that a link from another site is spammy.

Disavowing won’t help with crawl efficiency or saving crawl budget.

How To Make Crawl Budgets More Efficient

The answer is robots.txt. This is how you tell specific search engines and spiders what to crawl.

You can include the folders you want them to crawl by marking them as “Allow,” and you can block filtered results with “Disallow” rules that match the “?” or “&” symbol, or whichever parameter separator your URLs use.

If some of those parameters should still be crawled, add an “Allow” rule for the specific parameter, like “?filter=location.”
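
A minimal sketch of what that could look like in robots.txt (the “filter=location” parameter name is hypothetical; match the patterns to your site’s actual URL structure):

    User-agent: *
    # Block crawling of parameterized filter URLs
    Disallow: /*?
    # But keep a parameter you do want crawled (Google follows the most specific matching rule)
    Allow: /*?filter=location

    Sitemap: https://example.com/sitemap.xml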

Robots.txt is how you define crawl paths and work on crawl efficiency. Once you’ve optimized that, look at your internal links: the links from one page on your site to another.

These help spiders find your most important pages while learning what each is about.

Internal links include:

  • Breadcrumbs.
  • Menu navigation.
  • Links within content to other pages.
  • Sub-category menus.
  • Footer links.
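
As a quick illustration (hypothetical example.com URLs), breadcrumbs and in-content links are just ordinary anchor tags pointing at the pages you want spiders to reach:

    <!-- Breadcrumb navigation -->
    <nav>
      <a href="https://example.com/">Home</a> &gt;
      <a href="https://example.com/locations/">Locations</a> &gt;
      <a href="https://example.com/locations/bratislava/">Bratislava</a>
    </nav>

    <!-- A contextual link inside page content -->
    <p>Browse all of our <a href="https://example.com/locations/bratislava/">Bratislava listings</a>.</p>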

You can also use a sitemap if you have a large site, and the spiders are not finding the pages you want with priority.

I hope this helps answer your question. It is one I get a lot – you’re not the only one stuck in that situation.


Ad Copy Tactics Backed By Study Of Over 1 Million Google Ads

Mastering effective ad copy is crucial for achieving success with Google Ads.

Yet, the PPC landscape can make it challenging to discern which optimization techniques truly yield results.

Although various perspectives exist on optimizing ads, few are substantiated by comprehensive data. A recent study from Optmyzr attempted to address this.

The goal isn’t to promote or dissuade any specific method but to provide a clearer understanding of how different creative decisions impact your campaigns.

Use the data to help you identify higher profit probability opportunities.

Methodology And Data Scope

The Optmyzr study analyzed data from over 22,000 Google Ads accounts that have been active for at least 90 days with a minimum monthly spend of $1,500.

Across more than a million ads, we assessed Responsive Search Ads (RSAs), Expanded Text Ads (ETAs), and Demand Gen campaigns. Due to API limitations, we could not retrieve asset-level data for Performance Max campaigns.

Additionally, all monetary figures were converted to USD to standardize comparisons.

Key Questions Explored

To provide actionable insights, we focused on addressing the following questions:

  • Is there a correlation between Ad Strength and performance?
  • How do pinning assets impact ad performance?
  • Do ads written in title case or sentence case perform better?
  • How does creative length affect ad performance?
  • Can ETA strategies effectively translate to RSAs and Demand Gen ads?

As we evaluated the results, it’s important to note that our data set represents advanced marketers.

This means there may be selection bias, and these insights might differ in a broader advertiser pool with varying levels of experience.

The Relationship Between Ad Strength And Performance

Google explicitly states that Ad Strength is a tool designed to guide ad optimization rather than act as a ranking factor.

Despite this, marketers often hold mixed opinions about its usefulness, as its role in ad performance appears inconsistent.


Our data corroborates this skepticism. Ads labeled with an “average” Ad Strength score outperformed those with “good” or “excellent” scores in key metrics like CPA, conversion rate, and ROAS.

This disparity is particularly evident in RSAs, where the ROAS tends to decrease sharply when moving from “average” to “good,” with only a marginal increase when advancing to “excellent.”

Data for Demand Gen Ad Strength

Interestingly, Demand Gen ads also showed a stronger performance with an “average” Ad Strength, except for ROAS.

The metrics for conversion rates in Demand Gen and RSAs were notably similar, which is surprising since Demand Gen ads are typically designed for awareness, while RSAs focus on driving transactions.

Key Takeaways:

  • Ad Strength doesn’t reliably correlate with performance, so it shouldn’t be a primary metric for assessing your ads.
  • Most ads with “poor” or “average” Ad Strength labels perform well by standard advertising KPIs.
  • “Good” or “excellent” Ad Strength labels do not guarantee better performance.

How Does Pinning Affect Ad Performance?

Pinning refers to locking specific assets like headlines or descriptions in fixed positions within the ad. This technique became common with RSAs, but there’s ongoing debate about its efficacy.

Some advertisers advocate for pinning all assets to replicate the control offered by ETAs, while others prefer to let Google optimize placements automatically.

Data on pinning

Our data suggests that pinning some, but not all, assets offers the most balanced results in terms of CPA, ROAS, and CPC. However, ads where all assets are pinned achieve the highest relevance in terms of CTR.

Still, this marginally higher CTR doesn’t necessarily translate into better conversion metrics. Ads with unpinned or partially pinned assets generally perform better in terms of conversion rates and cost-based metrics.

Key Takeaways:

  • Selective pinning is optimal, offering a good balance between creative control and automation.
  • Fully pinned ads may increase CTR but tend to underperform in metrics like CPA and ROAS.
  • Advertisers should embrace RSAs, as they consistently outperform ETAs – even with fully pinned assets.

Title Case Vs. Sentence Case: Which Performs Better?

The choice between title case (“This Is a Title Case Sentence”) and sentence case (“This is a sentence case sentence”) is often a point of contention among advertisers.

Our analysis revealed a clear trend: Ads using sentence case generally outperformed those in title case, particularly in RSAs and Demand Gen campaigns.

Data on title vs. sentence casing (RSA, ETA, and Demand Gen)

ROAS, in particular, showed a marked preference for sentence case across these ad types, suggesting that a more natural, conversational tone may resonate better with users.

Interestingly, many advertisers still use a mix of title and sentence case within the same account, which counters the traditional approach of maintaining consistency throughout the ad copy.

Key Takeaways:

  • Sentence case outperforms title case in RSAs and Demand Gen ads on most KPIs.
  • Including sentence case ads in your testing can improve performance, as it aligns more closely with organic results, which users perceive as higher quality.
  • Although ETAs perform slightly better with title case, sentence case is increasingly the preferred choice in modern ad formats.

The Impact Of Ad Length On Performance

Ad copy, particularly for Google Ads, requires brevity without sacrificing impact.

We analyzed the effects of character count on ad performance, grouping ads by the length of headlines and descriptions.

RSA headline character count and description length data; ETA data; Demand Gen creative length data

Interestingly, shorter headlines tend to outperform longer ones in CTR and conversion rates, while descriptions benefit from moderate length.

Ads that tried to maximize character counts by using dynamic keyword insertion (DKI) or customizers often saw no significant performance improvement.

Moreover, applying ETA strategies to RSAs proved largely ineffective.

In almost all cases, advertisers who carried over ETA tactics to RSAs saw a decline in performance, likely because of how Google dynamically assembles ad components for display.

Key Takeaways:

  • Shorter headlines lead to better performance, especially in RSAs.
  • Focus on concise, impactful messaging instead of trying to fill every available character.
  • ETA tactics do not translate well to RSAs, and attempting to replicate them can hurt performance.

Final Thoughts On Ad Optimizations

In summary, several key insights emerge from this analysis.

First, Ad Strength should not be your primary focus when assessing performance. Instead, concentrate on creating relevant, engaging ad copy tailored to your target audience.

Additionally, pinning assets should be a strategic, creative decision rather than a hard rule, and advertisers should incorporate sentence case into their testing for RSAs and Demand Gen ads.

Finally, focus on quality over quantity in ad copy length, as longer ads do not always equate to better results.

By refining these elements of your ads, you can drive better ROI and adapt to the evolving landscape of Google Ads.

Read the full Ad Strength & Creative Study from Optmyzr.
