
How Can We Improve Rankings For Older Content? Ask An SEO


How can you help existing webpages get new traction and move up in search rankings?

That’s the question posed by Faith in this edition of Ask An SEO. She wrote:

“I have a few keywords ranking on the fourth or fifth page of Google.

It’s been a year ranking at this position. What should I do to improve the rankings now?”

Adam Riemer from Adam Riemer Marketing shares his response with Miranda Miller, Writer & Editor, in this edition of Ask An SEO.

Evaluating internal pages that may be competing against your candidates for optimization is an important first step, he says.

Improving Page Speed and Core Web Vitals may also give you new opportunities to improve rankings.

Adam shares a step-by-step process for finding opportunities to improve existing content with local schema, improving a user’s on-page experience, getting links from relevant media sources, and more.

You can watch the full video here and find the full transcript below.

Ask An SEO: Improving Rankings With Adam Riemer [Full Transcript]

Miranda Miller: Hello, and welcome to Ask An SEO. … This week, we have with us Adam Riemer from Adam Riemer Marketing, AdamRiemer.me.

The question that people have for you this week comes from Faith.

Faith has a few keywords ranking on the fourth or fifth page of Google. They’ve been stable there for about a year, and she would like to know: What can she do to improve those rankings now?

Adam Riemer: Okay. That’s a good question and comes up way too often. I have to deal with that with a lot of clients. Well, not deal with it, but I get to solve that problem for a lot of clients.

Improving Rankings For Existing Content, Step By Step

Adam Riemer: And basically, the very first thing I do is, I’ll take a tool, whether it’s Authority Labs or Semrush; I think Ahrefs does this too.

And I’ll look to see: Do we have competing pages in those positions?

And is there one with an indent after it, maybe? And from there, I’ll be like, Okay, well… do both of these pages need to exist?

If there is nothing competing and it’s just one page there, I start to look at the page experience, and I say, Okay, why is this not the best experience for the user or for the search query?

And then we start to address it. You can look at: Do we properly explain the concept?

Is the article as good as it could be?

Is it formatted correctly? Could it use some bulking up?

Sometimes, one thing I’ve had to do a lot recently (there’s a case study on my website about it right now) is delete most of the copy, because people just wrote copy to hit a minimum word count.

By actually reducing the copy and just sharing the actual information, we’ve been able to pop our clients up to the top positions from there.

Another option you can do, if everything’s perfect and your copy’s great… you can start to look at Page Speed and Core Web Vitals.

That’s not going to move the needle much, but when it does, it’s going to help you convert more traffic and decrease your bounce rate.

Another thing you can try to do is build some internal links from contextually relevant content.

You don’t want to just link to that page off of keywords for the sake of doing it.

Build out your content strategy. Look for previous articles.

If you’re on WordPress, for example, you can log in, click on Posts, and then click on Pages.

You do this for both, typing in the keyword or a similar version of the keyword, and that’ll pull up a list of the actual pages that mention it. And you can start building internal links that way.

You can also use search operators. You’ll type “site:”, put your URL in, and then in quotation marks, you’ll put in the keyword phrase, and it’ll scan through your website for mentions of that specific keyword or phrase throughout the site.
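
For example, a query along these lines (the domain and phrase are placeholders) surfaces the indexed pages on your site that mention the exact phrase:

  site:yourdomain.com "improve rankings for older content"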

And now you have a list of pages you can build internal links from.

You can also try doing PR work. So if your content or if your page is genuinely good enough, then you can probably attract backlinks from major media, possibly bloggers.

Tips For Local Businesses

Adam Riemer: If you’re local, go for local websites and complementary companies, and try to do it that way. It won’t be an immediate result, but you will start to see rankings climb if the content is good.

If it’s a product page and you’re not the manufacturer, it doesn’t make sense for them to give you an anchor or a backlink.

So what you want to do then is you want to create content that’s worth linking to and get backlinks that way, and pass the authority to the page.

Those are all different ways you can pop up from page four or five to the front page of Google and possibly overtake it.

Don’t Forget About Schema

The last thing to look at, and it probably should be done earlier, is the schema.

A lot of people forget that schema.org does update its libraries regularly. So you’ll want to go in and say, Do I have everything here? Did I add a video?

And is there video object schema?

Do I have FAQs on here?

Or did I add some, and is there FAQ schema?

If it’s listed as an Article, because maybe you’re a publisher, there’s probably a part, like the video or the FAQs, that you can nest in the hasPart portion of the schema.

And those are always… you can actually take your page from the fourth and fifth page of Google and bump it up to page one while achieving some featured rich results.
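
As a rough sketch of that nesting (every value below is a placeholder, and the properties your page actually needs depend on what is on it), an Article with a video and FAQs nested under hasPart could look something like this inside a script type="application/ld+json" tag:

  {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How To Improve Rankings For Older Content",
    "hasPart": [
      {
        "@type": "VideoObject",
        "name": "Improving rankings for existing content",
        "description": "Walkthrough of the optimization steps",
        "thumbnailUrl": "https://example.com/thumbnail.jpg",
        "uploadDate": "2024-01-01",
        "contentUrl": "https://example.com/video.mp4"
      },
      {
        "@type": "FAQPage",
        "mainEntity": [{
          "@type": "Question",
          "name": "How long until rankings move?",
          "acceptedAnswer": { "@type": "Answer", "text": "Placeholder answer text." }
        }]
      }
    ]
  }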

Evaluating A Visitor’s Page Experience

Miranda Miller: That is awesome. That’s great information, Adam.

I have a couple of follow-up questions for you.

I was wondering – when you’re evaluating page experience, the experience that any given user is having on that website and on that specific page, do you use tools to help you with that?

Or is it a largely manual process, and what are you looking for?

Adam Riemer: Depends on what I’m looking at on the page, specifically.

If I notice it’s just going really slow, then I’ll use webpagetest.org. That’s my first go-to tool because the waterfall is very easy to dissect, and they’ve now added Core Web Vitals – that’s similar to what you’ll see in Search Console.

So that way, I can say, Okay, this is rendering first. This is coming, or this is being pulled in first before we actually start to render the page, and we can move it to the end. It doesn’t need to be there.

We can identify all the fonts and everything else that’s slowing down the page.

We can also look for scripts and code that aren’t being used anymore – because it’s all just right there in front of you.
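
As one generic example of that kind of fix (the script path is a placeholder, not something Adam prescribes for every site), a non-critical script can be deferred so it stops blocking the first render:

  <script src="/js/chat-widget.js" defer></script>

The browser then fetches it in the background and runs it after the HTML has been parsed.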

Another thing I’ll do is, a lot of the time, branding teams will come in and say, “No, this is the messaging that we have to use.

And this is what we want for our thing, for our product or our service or our content.”

When in reality, that’s what they want; it’s not what the end user wants or what the search engines think.

And if you’re not going to give the proper words and the proper message to your users, then you’re not going to get those users.

So what I do is I make that same branding team go outside on a video call, both of us, and we will start reading the H1 tag and the top blurb to random people or students, asking: What does this mean?

What do we offer? What do we do?

Nine times out of ten, people can’t answer, and they have no idea. And it really drives it home.

I’ve made a Fortune 500 CEO actually stand outside and say what his branding team made us put on the website – it did not go well.

But it drove the point home: Nobody knows what it is we do or sell, or what the content of the article is supposed to be.

And this is a great way to start to make it resonate: okay, let’s fix the messaging while keeping the branding intact, so there’s a good balance. So really, it just depends on what the goal is and what we’re looking at for page experience.

Tips For Getting Noticed By Busy News Media Professionals

Miranda Miller: That’s great. And the other thing I wondered about is when we’re talking about link building and getting in front of news media and, you know, people who might give you a relevant link.

What tips do you have to stand out in a jam-packed inbox?

Adam Riemer: Avoid “MeWe” syndrome and compliment them.

And I actually did this with a client yesterday. I said, “Your email was not the best it could have been.”

They were like, “Why? We covered everything.”

That’s the problem. Let’s go through and read this.

And every time a sentence starts with “I,” “We,” “Ours,” or “My,” I put a finger up. And if those words appear in the sentence again, they get two fingers for each one.

So within the first three sentences, we had already hit 10 fingers pointing up.

How is this about the journalist? They were like: “Because they write about this topic.”

But it’s not about the journalist. “About the journalist” means you’ve read two or three of their articles and probably visited their social media.

So what I do is I look for an older article that they’re probably proud of and a recent one that are both topically relevant.

And then I say,

“Hey, thank you for your article about this, this and this. The point about halfway down where you mention WordPress versus Wix versus GoDaddy, for example, and the way that you called out the brand new features that launched, I had no idea that you could do this with X, Y, and Z CRM systems or CMS systems.”

So then you want to say, “I also noticed you updated here where you have WordPress versus Squarespace. Have you considered doing a comparison chart and maybe adding X, Y, Z in?” (X, Y, Z being the new client.) Just to introduce them and say they have these features, including the ones you personally enjoyed in your review under the pros and cons list here.

And now what you’re doing is you’re showing you actually read it, you’re giving them a reason to include you, and you’re saying, this is the only one.

Or you can say, “X, Y, Z company has this feature just like this company. And just like that one, but it’s not available there. And they’re actually doing this. I work with them. I would be happy to give you a complimentary account if you’d like to review it.”

If it’s just a product page… like, we’re both wearing T-shirts. So maybe it’s the top 30 T-shirts or the best 30 T-shirts for interviewing on Search Engine Journal.

So we go in, and we see Cosmopolitan and Refinery 29 and Rolling Stone and all these other publications – one, you’re going to need an affiliate program because they’re all affiliate sites now.

And two, you’re also going to need to cater to the journalist. Well, that’s actually not entirely true, because the journalists at those publications do have editorial control, and not everything on those lists has to be an affiliate link. It just helps, which means you don’t get your backlink, because it’s going to go through a 307 redirect.

But this is me rambling. And I’m sorry, please keep me focused.

Miranda Miller: You’re good.

Adam Riemer: Good for hours.

Miranda Miller: That is a lot of great advice. And as an editor, I can tell you, we can smell it a mile away if you’ve just dropped our name in there and didn’t actually, like, put any homework into what the publication is about and why we would link to you.

And yeah. If… what did you call it? “MeWe” syndrome – if you’re just talking about yourself. You’re just that guy in the corner at the party. Nobody wants to talk to you, nobody wants to give you a link.

Well, thank you, Adam. I really appreciate your time.

Adam Riemer: Can I finish the one part real quick? Sorry. So, yeah.

So when you’re going through that list, it’s not enough. You can click on the author’s name, and you’ll see all of the articles they’ve written.

So what you want to do then, because we’re going to be pitching our T-shirt, is we want to say: “Okay, your article here, I had no idea that Lululemon produces T-shirts.”

And then say, “In your recent one, the third one down, where you featured the green T-shirt with the XYZ print, it’s stunning. Thank you for the link off to Nordstrom. Our company offers this type of T-shirt, which is missing. It’s made from an eco-friendly material, which I notice may be a big topic for you because you wrote about eco-friendly hair ties and eco-friendly telephones.”

I’m just looking at stuff that’s on their page. And it sounds weird. I have hair ties. I just bought them for my neighbor.

That’s how you get in front of them: You show that you actually paid attention.

You thank them for their advice, and you cater to what matters to them and take out the mentions of “I, we, and our,” and talk about them to them and compliment their work. That’s how you do it.

We get about… out of every five emails we send, we get about three responses, and usually, at least one of those turns into a yes, because we take the time. We don’t have as much outreach, but it’s more effective outreach.

Miranda Miller: Nice. I love that. There’s no spray-and-pray happening. Exactly. Well, thank you, Adam.

Adam Riemer: I’m sorry for interrupting.

Miranda Miller: No, no, you’re good. And thank you, Faith, for the great question.

We will have a transcript and some highlights from Adam’s advice and the tips that he shared on searchenginejournal.com. So check that out, and you’ll find a link there to submit your own questions for Ask An SEO. Until next time. Thank you.

Adam Riemer: Bye, thanks for having me.


Editor’s note: Ask an SEO is a weekly SEO advice column written by some of the industry’s top SEO experts, who have been hand-picked by Search Engine Journal. Got a question about SEO? Fill out our form. You might see your answer in the next #AskanSEO post!


8% Of Automattic Employees Choose To Resign


WordPress co-founder and Automattic CEO Matt Mullenweg announced today that he offered Automattic employees the chance to resign with severance pay, and 8.4 percent of them took it. Mullenweg offered $30,000 or six months of salary, whichever is higher, with a total of 159 people accepting the offer.

Reactions Of Automattic Employees

Given the recent controversies created by Mullenweg, one might be tempted to view the walkout as a vote of no confidence in him. But that would be a mistake: some of the employees announcing their resignations either praised Mullenweg or simply announced their departure, while many others tweeted how happy they are to stay at Automattic.

One former employee tweeted that he was sad about recent developments but also praised Mullenweg and Automattic as an employer.

He shared:

“Today was my last day at Automattic. I spent the last 2 years building large scale ML and generative AI infra and products, and a lot of time on robotics at night and on weekends.

I’m going to spend the next month taking a break, getting married, and visiting family in Australia.

I have some really fun ideas of things to build that I’ve been storing up for a while. Now I get to build them. Get in touch if you’d like to build AI products together.”

Another former employee, Naoko Takano, a 14-year employee, an organizer of WordCamp conferences in Asia, a full-time WordPress contributor, and Open Source Project Manager at Automattic, announced on X (formerly Twitter) that today was her last day at Automattic.

She tweeted:

“Today was my last day at Automattic.

I’m actively exploring new career opportunities. If you know of any positions that align with my skills and experience!”

Naoko’s role at WordPress was working with the global WordPress community to improve contributor experiences through the Five for the Future and Mentorship programs. Five for the Future is an important WordPress program that encourages organizations to donate 5% of their resources back into WordPress. Five for the Future is one of the issues Mullenweg raised against WP Engine, asserting that it didn’t donate enough back into the community.

Mullenweg himself found it bittersweet to see those employees go, writing in a blog post:

“It was an emotional roller coaster of a week. The day you hire someone you aren’t expecting them to resign or be fired, you’re hoping for a long and mutually beneficial relationship. Every resignation stings a bit.

However now, I feel much lighter. I’m grateful and thankful for all the people who took the offer, and even more excited to work with those who turned down $126M to stay. As the kids say, LFG!”

Read the entire announcement on Mullenweg’s blog:

Automattic Alignment

Featured Image by Shutterstock/sdx15


YouTube Extends Shorts To 3 Minutes, Adds New Features


YouTube expands Shorts to 3 minutes, adds templates, AI tools, and the option to show fewer Shorts on the homepage.

  • YouTube Shorts will allow 3-minute videos.
  • New features include templates, enhanced remixing, and AI-generated video backgrounds.
  • YouTube is adding a Shorts trends page and comment previews.


How To Stop Filter Results From Eating Crawl Budget


Today’s Ask An SEO question comes from Michal in Bratislava, who asks:

“I have a client who has a website with filters based on a map locations. When the visitor makes a move on the map, a new URL with filters is created. They are not in the sitemap. However, there are over 700,000 URLs in the Search Console (not indexed) and eating crawl budget.

What would be the best way to get rid of these URLs? My idea is keep the base location ‘index, follow’ and newly created URLs of surrounded area with filters switch to ‘noindex, no follow’. Also mark surrounded areas with canonicals to the base location + disavow the unwanted links.”

Great question, Michal, and good news! The answer is an easy one to implement.

First, let’s look at what you’re trying to do and apply it to other situations, like ecommerce and publishers, so more people can benefit. Then, we’ll go into your strategies above and end with the solution.

What Crawl Budget Is And How Parameters Are Created That Waste It

If you’re not sure what Michal is referring to with crawl budget, this is a term some SEO pros use to explain that Google and other search engines will only crawl so many pages on your website before they stop.

If your crawl budget is used on low-value, thin, or non-indexable pages, your good pages and new pages may not be found in a crawl.

If they’re not found, they may not get indexed or refreshed. If they’re not indexed, they cannot bring you SEO traffic.

This is why optimizing a crawl budget for efficiency is important.

Michal shared an example of how URLs that are “thin” from an SEO point of view are created as customers use filters.

The experience for the user is value-adding, but from an SEO standpoint, a location-based page would be better. This applies to ecommerce and publishers, too.

Ecommerce stores will have searches for colors like red or green and products like t-shirts and potato chips.

These create URLs with parameters just like a filter search for locations. They could also be created by using filters for size, gender, color, price, variation, compatibility, etc. in the shopping process.

The filtered results help the end user but compete directly with the collection page, and the collection would be the “non-thin” version.
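
For illustration (the domain and parameters are made up), the two versions might look like this, with the collection page being the one you want to rank:

  Collection page (the "non-thin" version): https://example.com/t-shirts/
  Filtered result (thin, parameterized): https://example.com/t-shirts/?color=red&size=m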

Publishers have the same. Someone might be on SEJ looking for SEO or PPC in the search box and get a filtered result. The filtered result will have articles, but the category of the publication is likely the best result for a search engine.

These filtered results can get indexed when they are shared on social media, or someone adds them as a comment on a blog or forum, creating a crawlable backlink. It might also be that an employee in customer service responded to a question on the company blog with a filtered link, or any number of other ways.

The goal now is to make sure search engines don’t spend time crawling the “thin” versions so you can get the most from your crawl budget.

The Difference Between Indexing And Crawling

There’s one more thing to learn before we go into the proposed ideas and solutions – the difference between indexing and crawling.

  • Crawling is the discovery of new pages within a website.
  • Indexing is adding the pages that are worth showing to a searcher into the search engine’s database of pages.

Pages can get crawled but not indexed. Indexed pages have likely been crawled and will likely get crawled again to look for updates and server responses.

But not all indexed pages will bring in traffic or hit the first page because they may not be the best possible answer for queries being searched.

Now, let’s go into making efficient use of crawl budgets for these types of solutions.

Using Meta Robots Or X Robots

The first solution Michal pointed out was an “index,follow” directive. This tells a search engine to index the page and follow the links on it. This is a good idea, but only if the filtered result is the ideal experience.

From what I can see, this would not be the case, so I would recommend making it “noindex,follow.”

Noindex would say, “This is not an official page, but hey, keep crawling my site, you’ll find good pages in here.”

And if you have your main menu and navigational internal links done correctly, the spider will hopefully keep crawling them.
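
As a quick sketch of the two options the heading refers to, the directive can sit in the page’s HTML head as a meta tag, or be sent as an HTTP response header (generic syntax, not specific to Michal’s stack):

  <meta name="robots" content="noindex, follow">
  X-Robots-Tag: noindex, follow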

Canonicals To Solve Wasted Crawl Budget

Canonical links are used to help search engines know what the official page to index is.

If a product exists in three categories on three separate URLs, only one should be “the official” version, so the two duplicates should have a canonical pointing to the official version. The official one should have a canonical link that points to itself. This applies to the filtered locations.

If the location search would result in multiple city or neighborhood pages, the result would likely be a duplicate of the official one you have in your sitemap.

If the content on the filtered page stays the same as the original category, have the filtered results point a canonical back to the main page they were filtered from instead of being self-referencing.

If the content pulls in your localized page with the same locations, point the canonical to that page instead.

In most cases, the filtered version inherits the page you searched or filtered from, so that is where the canonical should point to.
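
For example (placeholder URLs), a filtered page would declare the category it was filtered from as its canonical. On https://example.com/t-shirts/?color=red the head would include:

  <link rel="canonical" href="https://example.com/t-shirts/">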

If you do both noindex and have a self-referencing canonical, which is overkill, it becomes a conflicting signal.

The same applies to when someone searches for a product by name on your website. The search result may compete with the actual product or service page.

With this solution, you’re telling the spider not to index this page because it isn’t worth indexing, but it is also the official version. It doesn’t make sense to do this.

Instead, use a canonical link, as I mentioned above, or noindex the result and point the canonical to the official version.

Disavow To Increase Crawl Efficiency

Disavowing doesn’t have anything to do with crawl efficiency unless the search engine spiders are finding your “thin” pages through spammy backlinks.

The disavow tool from Google is a way to say, “Hey, these backlinks are spammy, and we don’t want them to hurt us. Please don’t count them towards our site’s authority.”
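
For reference (the domains below are placeholders), the disavow file itself is just a plain-text list of URLs or domains, one per line, uploaded through the tool:

  # Spammy links pointing at our site
  domain:spammy-directory.example
  https://spam-blog.example/post-linking-to-us/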

In most cases, it doesn’t matter, as Google is good at detecting spammy links and ignoring them.

You do not want to add your own site and your own URLs to the disavow tool. You’re telling Google your own site is spammy and not worth anything.

Plus, submitting backlinks to the disavow tool won’t tell a spider which pages you do and don’t want crawled; it’s only for saying a link from another site is spammy.

Disavowing won’t help with crawl efficiency or saving crawl budget.

How To Make Crawl Budgets More Efficient

The answer is robots.txt. This is how you tell specific search engines and spiders what to crawl.

You can include the folders you want them to crawl by marking them as “allow,” and you can say “disallow” on filtered results by disallowing the “?” or “&” symbol, or whichever one you use.

If some of those parameters should be crawled, add an allow rule for the specific parameter, like “?filter=location.”
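
A minimal sketch of what that could look like, assuming a “filter” parameter (your real parameter names and paths will differ):

  User-agent: *
  # Block crawling of parameterized filter URLs
  Disallow: /*?
  # Still allow the location filter that should be crawled
  Allow: /*?filter=location

Google applies the most specific (longest) matching rule, so the Allow line wins over the broader Disallow for those location URLs.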

Robots.txt is how you define crawl paths and work on crawl efficiency. Once you’ve optimized that, look at your internal links: links from one page on your site to another.

These help spiders find your most important pages while learning what each is about.

Internal links include:

  • Breadcrumbs.
  • Menu navigation.
  • Links within content to other pages.
  • Sub-category menus.
  • Footer links.

You can also use a sitemap if you have a large site, and the spiders are not finding the pages you want with priority.

I hope this helps answer your question. It is one I get a lot – you’re not the only one stuck in that situation.


Featured Image: Paulo Bobita/Search Engine Journal
