8 SEO Problems Solved By This SEO Artificial Intelligence Tool
This post was sponsored by Market Brew. The opinions expressed in this article are the sponsor’s own.
Wish you could rank higher than your competitors in search results?
Feel like your tried-and-true SEO tools no longer give you a true advantage?
It’s time to move past the tools that your competitors are using. Stop using historical data and start looking at SEO tools that accurately predict Google algorithm update impacts.
Gain your ultimate SEO advantage by accurately predicting future SERPs so you can truly rank higher.
You can do this with a comprehensive SEO intelligence tool, like Market Brew.
SEO intelligence uses AI to determine the future of your search by helping you at every step of the optimization process.
Today, we’ll dive into how to solve hiccups at each step, but first, let’s learn more about what SEO intelligence is.
What Is SEO Intelligence?
SEO intelligence uses Artificial Intelligence (AI) to help guide search engine optimization for better rankings and organic traffic by assisting with:
- Keyword research.
- SERP analysis.
- Content writing.
- Optimization, and more.
The most successful SEO professionals prioritize testing new tools to find exciting new intelligence features that boost their search engine optimization.
Market Brew sets a new standard for forecasting SEO like no other software tool has before.
1. Predict Exactly How Algorithm Changes Will Affect Your Search Visibility
With Market Brew’s sophisticated AI process, you can confidently predict the future of search for your site.
Yes, with AI, you can predict SEO rankings months before those changes show up in your rank trackers.
Instead of spending hours upon hours researching what could happen in the future, and putting your own personal credibility on the line, Market Brew can show you exactly what to expect from a volatile Google algorithm update.
How? By creating an SEO testing platform, or predictive model, based on real algorithmic data.
When your SEO team completes a task in Market Brew, the changes are tied directly back to the predictive model that the task was based on.
Users can immediately see the effect of even the smallest optimization, leading to a deeper understanding of how each SERP is calculated.
Predict your ranking shifts, now →
2. Learn & Observe Decisions Search Engines Make
Wondering why a competitor’s page is ranking higher than yours?
Need to explain to clients why Google is shaking things up again?
Using AI, you will be able to quickly and accurately get the SEO analysis you are looking for.
With its ability to use a genetic algorithm to self-calibrate each search engine model, Market Brew helps you understand how each SERP change happened: it compares the learned algorithmic bias/weight changes from before and after Google’s algorithm update.
Market Brew uses Particle Swarm Optimization to adjust the algorithmic settings on its standard model so that Market Brew’s results look exactly like the target SERP results.
Each time this calibration process runs, the bias/weight settings are updated and stored.
When Google rolls out its next algorithmic update, Market Brew automatically re-calibrates its search engine models.
With the new calibrated settings, users can easily see which algorithms are now more or less important.
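Market Brew has not published its calibration code, but the general idea of Particle Swarm Optimization is easy to illustrate. Here is a minimal, purely illustrative Python sketch, in which the pages, the two-factor scoring model, and every constant are invented for the example; it tunes two algorithm weights until a toy model’s ranking matches a target SERP ordering:

```python
import random

# Toy illustration only: Market Brew's real calibration is proprietary.
# PSO tunes two hypothetical algorithm weights so a model's scores
# reproduce a target SERP ordering as closely as possible.

def rank_error(weights, pages, target_order):
    """Distance between the model's ranking (by weighted score) and the target."""
    scored = sorted(pages, key=lambda p: -(weights[0] * p["content"] + weights[1] * p["links"]))
    model_order = [p["url"] for p in scored]
    return sum(abs(model_order.index(url) - i) for i, url in enumerate(target_order))

pages = [
    {"url": "/a", "content": 0.9, "links": 0.2},
    {"url": "/b", "content": 0.4, "links": 0.8},
    {"url": "/c", "content": 0.6, "links": 0.5},
]
target_order = ["/b", "/c", "/a"]  # the live SERP the model should match

particles = [[random.random(), random.random()] for _ in range(20)]
velocities = [[0.0, 0.0] for _ in particles]
personal_best = [list(p) for p in particles]
personal_err = [rank_error(p, pages, target_order) for p in particles]
global_best = list(personal_best[personal_err.index(min(personal_err))])

for _ in range(100):
    for i, p in enumerate(particles):
        for d in range(2):  # standard PSO update: inertia + cognitive + social pull
            velocities[i][d] = (0.7 * velocities[i][d]
                                + 1.5 * random.random() * (personal_best[i][d] - p[d])
                                + 1.5 * random.random() * (global_best[d] - p[d]))
            p[d] += velocities[i][d]
        err = rank_error(p, pages, target_order)
        if err < personal_err[i]:
            personal_err[i], personal_best[i] = err, list(p)
            if err < rank_error(global_best, pages, target_order):
                global_best = list(p)

print("calibrated bias/weight settings:", global_best)
```

Each particle is a candidate bias/weight setting; the swarm converges on the setting whose ranking best matches the target SERP, which mirrors the calibration loop described above.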
3. Conduct Truly Accurate Competitive Research With AI
With Market Brew, you can start by looking at your competitors’ sites in a detailed side-by-side comparison.
You can quickly create an SEO pros and cons list for websites in the same field.
Here, you’ll be able to pay attention to key comparison details, such as:
- Elements of your competitor’s content.
- Your competitors’ internal page links and backlinks, sorted by either link flow distribution, anchor text distribution, or both.
- Keywords with high search volume or transaction value.
- Your competitor’s top-performing pages.
Looking at your competition will guide you on how to improve your site. Competitor comparisons save time, money, and resources when taking over market share in search.
4. Measure Total Link Value
The links on your pages are an essential part of gaining traffic to your site, specifically the number of links and the links’ authority.
The types of links on your page can significantly affect the value and ranking authority from one page to another. Measuring links is the most straightforward method of determining a site’s backlink value.
Market Brew scores each link on your site with the most sophisticated link algorithms to create the most accurate overall depiction of how powerful each link is.
When viewing your total link value, here are things to consider:
- Is the link relevant?
- Is the linking site authoritative?
- Is the link followed?
- Where is the link on the page?
- How many links are on the page?
- Is the link reciprocal?
- Does this link share anchor text with another link?
You should always be intentional with your linking. Remember:
- Cluttered and excessive links are unlikely to create a seamless user experience, and oversaturation devalues links.
- When placing your links, consider the visitor’s intent and how to help them achieve their goals.
- Links from trusted sites that have built up their authority pass more link equity than links from brand-new sites just starting.
- Links at the bottom of a page do not hold as much weight or authority in Google’s eyes.
- Links that are reciprocal do not pass as much link flow as links that are not.
- Links that appear on every page, and share anchor text and target, do not pass as much link flow as those that appear uniquely.
If you link to a restaurant business’s grand opening from an article about where to trade in your car, the link is irrelevant — Google will note this, and your SEO link software should, too. Irrelevant links do not provide authority or value.
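Market Brew’s actual link algorithms are proprietary, but the checklist above can be sketched as a toy scoring function. In the following Python sketch, every factor and weight is an illustrative assumption, not Market Brew’s or Google’s real math:

```python
# Toy link-value score under assumed weights. Every factor and weight here
# is illustrative, not a real search engine formula.
def link_value(relevant, authority, followed, position_ratio, links_on_page, reciprocal):
    """Score a single link from 0 to 1.

    authority: 0-1 score of the linking site.
    position_ratio: 0.0 = top of the page, 1.0 = bottom.
    """
    if not followed:
        return 0.0                              # unfollowed links pass no equity here
    score = authority
    score *= 1.0 if relevant else 0.2           # irrelevant links pass little value
    score *= 1.0 - 0.5 * position_ratio         # links lower on the page count less
    score /= max(links_on_page, 1) ** 0.5       # dilution from cluttered pages
    if reciprocal:
        score *= 0.5                            # reciprocal links pass less link flow
    return score

print(link_value(relevant=True, authority=0.8, followed=True,
                 position_ratio=0.1, links_on_page=25, reciprocal=False))
```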
5. Measure & Evaluate Keyword Link Value
Market Brew uses targeted keywords as anchor texts for internal linking and link building from other websites to help certain pages rank on search engines.
This calculation, combined with Market Brew’s sophisticated link scoring, determines an anchor text distribution that other tools can’t see and assigns each anchor text a specific importance, based on the quantity and quality of backlinks sharing this anchor text.
SEO tools can help you find your highest-ranking keywords. Vary your anchor text distribution using your primary and secondary anchor texts.
Varied anchor text performs better in search results for both primary and secondary keywords. The goal is to rank for specific keywords while focusing your anchor text distribution on the right landing pages.
6. Determine Real E-A-T Scores
E-A-T stands for Expertise, Authoritativeness, and Trustworthiness. According to Google, E-A-T is one of the most important factors when considering a website’s overall Page Quality Rating.
Google states you need these things to be trustworthy and have authority:
- A satisfying amount of high-quality content, including a descriptive or helpful title.
- Satisfying website information and/or information about who handles the website. For example, an ecommerce site should have satisfying customer service information.
- Maintaining a positive website reputation across the internet.
Before Market Brew’s insights, SEO professionals had no way to measure E-A-T scores with credibility and confidence.
In 2021, Market Brew deployed its first expertise algorithm (the E in E-A-T) which measures the coverage of content for each page’s topic cluster.
In certain search results, this expertise algorithm is now showing a correlation with ranking positions.
Market Brew is the world’s first statistical modeling tool for search engines.
Market Brew is a search engine built by search engineers that can calibrate itself to behave like any search engine in the world. Its Artificial Intelligence-powered task system provides seamless navigation through each relevant part of the search engine model, uncovering SERP-specific prioritized tasks from off-page to on-page and everything in between.
A successful SEO campaign begins and ends with Market Brew’s search engine models.
7. Establish Topic Authority
Imagine being able to submit every change to Google to see what would happen. With Market Brew’s SEO testing platform features, you can.
Market Brew’s Spotlight algorithm allows users to establish the perfect Topic Authority and understand exactly what the outperforming site’s topic cluster is, and even which expert topics the content should be talking about.
Each Market Brew account comes standard with the following:
- Evergreen Googlebot Crawler.
- Blink JavaScript Rendering Engine.
- Particle Swarm Optimization for all of your search engine models.
- Proven search engine algorithms.
- Rapid testing capabilities.
- SEO Teams tool to help manage optimization.
- The ability to compare and contrast how your changes fared.
8. Identify Keyword Stuffing
Topic clusters are a distillation of keywords, and keywords are essential for SEO.
But stuffing your copy with an overkill of keywords gives the impression that you focus on rankings rather than readers – and search engines notice.
Instead, Google wants to see intentional content that adds value to its readers. That is why Google frowns on pages that stuff their keywords.
So, focus on responsible keyword optimization.
Using two to five keywords per page is a safe range to stick with for your SEO.
Instead of using the same keyword repeatedly on your site, find related topics for each keyword to prevent the overuse of one keyword and build a topic cluster.
Google rewards sites with focused topic clusters, and pages built around a variety of keywords and related topics will usually have richer, more substantive content.
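As a quick self-check against keyword stuffing, you can count how often any single word dominates your copy. Here is a minimal Python sketch; the 3% threshold is an illustrative rule of thumb, not a number Google publishes:

```python
import re
from collections import Counter

def keyword_density(text, top_n=5, threshold=0.03):
    """Return the most frequent words that exceed the density threshold."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    counts = Counter(words)
    return [(word, count / total)
            for word, count in counts.most_common(top_n)
            if count / total > threshold]

copy = "widgets widgets are great widgets for widget lovers who buy widgets"
print(keyword_density(copy))  # [('widgets', 0.36...)] flags likely stuffing
```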
Simplify Your SEO Process
If you want to simplify your enterprise SEO process, stick with Market Brew.
Each Market Brew account comes with Market Focus capability, which begins with a keyword-based approach. Then, with Market Brew’s robust link-scoring layer, the technology calculates a basket of keyword clusters for each page.
Market Brew’s automated discovery system uses this data to guide you to the exact algorithms that are the deciding factors in the target search engine environment, shown on the Top Optimizations screen.
In conclusion, manipulating search rankings with repeated words or phrases will only cause a site to rank lower in Google’s search results.
Make your life simpler and let Market Brew take your keyword optimization to the next level with our software.
SEO professionals find our predictability analysis to be the closest thing you can get to an SEO services guarantee.
Through our search engine replication process, we quickly and clearly show you how best to optimize your site, making it a walk in the park for you and your business.
Are you interested in signing up for a Market Brew account? Book a demo today, and see how valuable our software can be for your business.
Image Credits
Featured Image: Image by Market Brew. Used with permission.
The Expert SEO Guide To URL Parameter Handling
In the world of SEO, URL parameters pose a significant problem.
While developers and data analysts may appreciate their utility, these query strings are an SEO headache.
Countless parameter combinations can split a single user intent across thousands of URL variations. This can cause complications for crawling, indexing, and visibility and, ultimately, lead to lower traffic.
The issue is we can’t simply wish them away, which means it’s crucial to master how to manage URL parameters in an SEO-friendly way.
To do so, we will explore what URL parameters are, the SEO issues they cause, how to assess the extent of the problem, and the solutions available to tame them.
What Are URL Parameters?
URL parameters, also known as query strings or URI variables, are the portion of a URL that follows the ‘?’ symbol. They consist of a key and a value pair, separated by an ‘=’ sign. Multiple parameters can be added to a single page when separated by an ‘&’.
The most common use cases for parameters are:
- Tracking – For example ?utm_medium=social, ?sessionid=123 or ?affiliateid=abc
- Reordering – For example ?sort=lowest-price, ?order=highest-rated or ?so=latest
- Filtering – For example ?type=widget, ?colour=purple or ?price-range=20-50
- Identifying – For example ?product=small-purple-widget, ?categoryid=124 or ?itemid=24AU
- Paginating – For example, ?page=2, ?p=2 or ?viewItems=10-30
- Searching – For example, ?query=users-query, ?q=users-query or ?search=drop-down-option
- Translating – For example, ?lang=fr or ?language=de
SEO Issues With URL Parameters
1. Parameters Create Duplicate Content
Often, URL parameters make no significant change to the content of a page.
A re-ordered version of the page is often not so different from the original. A page URL with tracking tags or a session ID is identical to the original.
For example, the following URLs would all return a collection of widgets.
- Static URL: https://www.example.com/widgets
- Tracking parameter: https://www.example.com/widgets?sessionID=32764
- Reordering parameter: https://www.example.com/widgets?sort=latest
- Identifying parameter: https://www.example.com?category=widgets
- Searching parameter: https://www.example.com/products?search=widget
That’s quite a few URLs for what is effectively the same content – now imagine this over every category on your site. It can really add up.
The challenge is that search engines treat every parameter-based URL as a new page. So, they see multiple variations of the same page, all serving duplicate content and all targeting the same search intent or semantic topic.
While such duplication is unlikely to cause a website to be completely filtered out of the search results, it does lead to keyword cannibalization and could downgrade Google’s view of your overall site quality, as these additional URLs add no real value.
2. Parameters Reduce Crawl Efficacy
Crawling redundant parameter pages distracts Googlebot, reducing your site’s ability to index SEO-relevant pages and increasing server load.
Google sums up this point perfectly.
“Overly complex URLs, especially those containing multiple parameters, can cause problems for crawlers by creating unnecessarily high numbers of URLs that point to identical or similar content on your site.
As a result, Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all the content on your site.”
3. Parameters Split Page Ranking Signals
If you have multiple permutations of the same page content, links and social shares may be coming in on various versions.
This dilutes your ranking signals. When you confuse a crawler, it becomes unsure which of the competing pages to index for the search query.
4. Parameters Make URLs Less Clickable
Let’s face it: parameter URLs are unsightly. They’re hard to read. They don’t seem as trustworthy. As such, they are slightly less likely to be clicked.
This may impact page performance. Not only because CTR influences rankings, but also because it’s less clickable in AI chatbots, social media, in emails, when copy-pasted into forums, or anywhere else the full URL may be displayed.
While this may only have a fractional impact on a single page’s amplification, every tweet, like, share, email, link, and mention matters for the domain.
Poor URL readability could contribute to a decrease in brand engagement.
Assess The Extent Of Your Parameter Problem
It’s important to know every parameter used on your website. But chances are your developers don’t keep an up-to-date list.
So how do you find all the parameters that need handling? Or understand how search engines crawl and index such pages? Know the value they bring to users?
Follow these five steps:
- Run a crawler: With a tool like Screaming Frog, you can search for “?” in the URL (the sketch after this list shows one way to tally parameter keys from an exported URL list).
- Review your log files: See if Googlebot is crawling parameter-based URLs.
- Look in the Google Search Console page indexing report: In the samples of index and relevant non-indexed exclusions, search for ‘?’ in the URL.
- Search with site: inurl: advanced operators: Know how Google is indexing the parameters you found by putting the key in a site:example.com inurl:key combination query.
- Look in the Google Analytics all pages report: Search for “?” to see how each of the parameters you found is used by users. Be sure to check that URL query parameters have not been excluded in the view settings.
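As referenced in the first step, once you have a list of URLs from a crawl export or parsed log files, tallying the parameter keys takes only a few lines. A minimal Python sketch, assuming a hypothetical crawled_urls.txt file with one URL per line:

```python
from collections import Counter
from urllib.parse import urlsplit, parse_qsl

# crawled_urls.txt is a hypothetical export: one URL per line.
with open("crawled_urls.txt") as f:
    urls = [line.strip() for line in f if "?" in line]

# Tally every parameter key found across the site.
keys = Counter(key for url in urls for key, _ in parse_qsl(urlsplit(url).query))
for key, count in keys.most_common():
    print(f"{key}: {count} URLs")
```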
Armed with this data, you can now decide how to best handle each of your website’s parameters.
SEO Solutions To Tame URL Parameters
You have six tools in your SEO arsenal to deal with URL parameters on a strategic level.
Limit Parameter-based URLs
A simple review of how and why parameters are generated can provide an SEO quick win.
You will often find ways to reduce the number of parameter URLs and thus minimize the negative SEO impact. There are four common issues to begin your review.
1. Eliminate Unnecessary Parameters
Ask your developer for a list of all the website’s parameters and their functions. Chances are, you will discover parameters that no longer perform a valuable function.
For example, users can be better identified by cookies than sessionIDs. Yet the sessionID parameter may still exist on your website as it was used historically.
Or you may discover that a filter in your faceted navigation is rarely applied by your users.
Any parameters caused by technical debt should be eliminated immediately.
2. Prevent Empty Values
URL parameters should be added to a URL only when they have a function. Don’t permit parameter keys to be added if the value is blank.
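For example, in this hypothetical URL, two keys are appended with blank values:

https://www.example.com/widgets?key1=value1&key2=&key3=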
In the above example, key2 and key3 add no value, both literally and figuratively.
3. Use Keys Only Once
Avoid applying multiple parameters with the same parameter name and a different value.
For multi-select options, it is better to combine the values after a single key.
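For example, a multi-select colour filter is better expressed as:

https://www.example.com/widgets?colour=purple,pink

Rather than:

https://www.example.com/widgets?colour=purple&colour=pink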
4. Order URL Parameters
When the same URL parameters are rearranged (for example, ?colour=purple&sort=latest versus ?sort=latest&colour=purple), search engines interpret the pages as equal.
As such, parameter order doesn’t matter from a duplicate content perspective. But each of those combinations burns crawl budget and splits ranking signals.
Avoid these issues by asking your developer to write a script to always place parameters in a consistent order, regardless of how the user selected them.
In my opinion, you should start with any translating parameters, followed by identifying, then pagination, then layering on filtering and reordering or search parameters, and finally tracking.
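A minimal sketch of such a script in Python, using the illustrative parameter names from earlier (your own precedence list would differ):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative precedence: translating, identifying, paginating,
# filtering/reordering/searching, and tracking parameters last.
PARAM_ORDER = ["lang", "category", "product", "page",
               "colour", "sort", "query", "utm_medium", "sessionid"]

def normalize_params(url: str) -> str:
    """Rewrite a URL so its query parameters always appear in a fixed order."""
    parts = urlsplit(url)
    params = parse_qsl(parts.query)  # blank values are dropped by default
    params.sort(key=lambda kv: (
        PARAM_ORDER.index(kv[0]) if kv[0] in PARAM_ORDER else len(PARAM_ORDER),
        kv[0],
    ))
    return urlunsplit(parts._replace(query=urlencode(params)))

print(normalize_params("https://www.example.com/widgets?sort=latest&lang=fr&page=2"))
# -> https://www.example.com/widgets?lang=fr&page=2&sort=latest
```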
Pros:
- Ensures more efficient crawling.
- Reduces duplicate content issues.
- Consolidates ranking signals to fewer pages.
- Suitable for all parameter types.
Cons:
- Moderate technical implementation time.
Rel=”Canonical” Link Attribute
The rel=”canonical” link attribute calls out that a page has identical or similar content to another. This encourages search engines to consolidate the ranking signals to the URL specified as canonical.
You can rel=canonical your parameter-based URLs to your SEO-friendly URL for tracking, identifying, or reordering parameters.
But this tactic is not suitable when the parameter page content is not close enough to the canonical, such as pagination, searching, translating, or some filtering parameters.
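For example, the tracking URL from earlier would declare the clean version as canonical in its <head>:

```html
<!-- Served on https://www.example.com/widgets?sessionID=32764 -->
<link rel="canonical" href="https://www.example.com/widgets" />
```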
Pros:
- Relatively easy technical implementation.
- Very likely to safeguard against duplicate content issues.
- Consolidates ranking signals to the canonical URL.
Cons:
- Wastes crawling on parameter pages.
- Not suitable for all parameter types.
- Interpreted by search engines as a strong hint, not a directive.
Meta Robots Noindex Tag
Set a noindex directive for any parameter-based page that doesn’t add SEO value. This tag will prevent search engines from indexing the page.
URLs with a “noindex” tag are also likely to be crawled less frequently, and if the tag is present for a long time, it will eventually lead Google to nofollow the page’s links.
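The directive goes in the <head> of the parameter-based page (for non-HTML resources, the same value can be sent as an X-Robots-Tag HTTP header):

```html
<meta name="robots" content="noindex" />
```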
Pros:
- Relatively easy technical implementation.
- Very likely to safeguard against duplicate content issues.
- Suitable for all parameter types you do not wish to be indexed.
- Removes existing parameter-based URLs from the index.
Cons:
- Won’t prevent search engines from crawling URLs, but will encourage them to do so less frequently.
- Doesn’t consolidate ranking signals.
- Interpreted by search engines as a strong hint, not a directive.
Robots.txt Disallow
The robots.txt file is what search engines look at first before crawling your site. If they see something is disallowed, they won’t even go there.
You can use this file to block crawler access to every parameter-based URL (with Disallow: /*?*) or only to specific query strings you don’t want to be crawled.
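For example:

```
User-agent: *
# Block every URL that contains a query string
Disallow: /*?*

# Or, instead, block only a specific parameter key (illustrative)
# Disallow: /*sessionID=
```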
Pros:
- Simple technical implementation.
- Allows more efficient crawling.
- Avoids duplicate content issues.
- Suitable for all parameter types you do not wish to be crawled.
Cons:
- Doesn’t consolidate ranking signals.
- Doesn’t remove existing URLs from the index.
Move From Dynamic To Static URLs
Many people think the optimal way to handle URL parameters is to simply avoid them in the first place.
After all, subfolders surpass parameters in helping Google understand site structure, and static, keyword-based URLs have always been a cornerstone of on-page SEO.
To achieve this, you can use server-side URL rewrites to convert parameters into subfolder URLs.
For example, the URL:
www.example.com/view-product?id=482794
Would become:
www.example.com/widgets/purple
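How the rewrite is implemented depends on your server stack. As a hedged illustration, a hardcoded Apache mod_rewrite rule for this single URL might look like the following; a real site would map IDs to slugs dynamically rather than rule by rule:

```apache
RewriteEngine On
# Map /view-product?id=482794 to the static path, dropping the query string.
RewriteCond %{QUERY_STRING} ^id=482794$
RewriteRule ^/?view-product$ /widgets/purple? [R=301,L]
```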
This approach works well for descriptive keyword-based parameters, such as those that identify categories, products, or filters for search engine-relevant attributes. It is also effective for translated content.
But it becomes problematic for non-keyword-relevant elements of faceted navigation, such as an exact price. Having such a filter as a static, indexable URL offers no SEO value.
It’s also an issue for searching parameters, as every user-generated query would create a static page that vies for ranking against the canonical, or worse, presents crawlers with low-quality content pages whenever a user has searched for an item you don’t offer.
It’s somewhat odd when applied to pagination (although not uncommon due to WordPress), which would give a URL such as
www.example.com/widgets/purple/page2
Very odd for reordering, which would give a URL such as
www.example.com/widgets/purple/lowest-price
And is often not a viable option for tracking. Google Analytics will not acknowledge a static version of the UTM parameter.
More to the point: Replacing dynamic parameters with static URLs for things like pagination, on-site search box results, or sorting does not address duplicate content, crawl budget, or internal link equity dilution.
Having all the combinations of filters from your faceted navigation as indexable URLs often results in thin content issues. Especially if you offer multi-select filters.
Many SEO pros argue it’s possible to provide the same user experience without impacting the URL, for example, by using POST rather than GET requests to modify the page content, thus preserving the user experience and avoiding SEO problems.
But stripping out parameters in this manner would remove the possibility for your audience to bookmark or share a link to that specific page – and is obviously not feasible for tracking parameters and not optimal for pagination.
The crux of the matter is that for many websites, completely avoiding parameters is simply not possible if you want to provide the ideal user experience. Nor would it be best practice SEO.
So we are left with this. For parameters that you don’t want to be indexed in search results (paginating, reordering, tracking, etc.), implement them as query strings. For parameters that you do want to be indexed, use static URL paths.
Pros:
- Shifts crawler focus from parameter-based to static URLs which have a higher likelihood to rank.
Cons:
- Significant investment of development time for URL rewrites and 301 redirects.
- Doesn’t prevent duplicate content issues.
- Doesn’t consolidate ranking signals.
- Not suitable for all parameter types.
- May lead to thin content issues.
- Doesn’t always provide a linkable or bookmarkable URL.
Best Practices For URL Parameter Handling For SEO
So which of these six SEO tactics should you implement?
The answer can’t be all of them.
Not only would that create unnecessary complexity, but often, the SEO solutions actively conflict with one another.
For example, if you implement robots.txt disallow, Google would not be able to see any meta noindex tags. You also shouldn’t combine a meta noindex tag with a rel=canonical link attribute.
Google’s John Mueller, Gary Illyes, and Lizzi Sassman couldn’t even decide on an approach. In a Search Off The Record episode, they discussed the challenges that parameters present for crawling.
They even suggested bringing back a parameter handling tool in Google Search Console. Google, if you are reading this, please do bring it back!
What becomes clear is there isn’t one perfect solution. There are occasions when crawling efficiency is more important than consolidating authority signals.
Ultimately, what’s right for your website will depend on your priorities.
Personally, I take the following plan of attack for SEO-friendly parameter handling:
- Research user intents to understand what parameters should be search engine friendly, static URLs.
- Implement effective pagination handling using a ?page= parameter.
- For all remaining parameter-based URLs, block crawling with a robots.txt disallow and add a noindex tag as backup.
- Double-check that no parameter-based URLs are being submitted in the XML sitemap.
No matter what parameter handling strategy you choose to implement, be sure to document the impact of your efforts on KPIs.
Featured Image: BestForBest/Shutterstock
SEO Experts Gather for a Candid Chat About Search [Podcast]
Wix just celebrated its 100th podcast episode! Congrats, Wix. To quote Mordy Oberstein, Head of SEO Brand at Wix: “we talk a lot.”
You sure do! It’s a good thing you have a lot of interesting stuff to say.
The 100th episode of “SERPs Up” was full of awesome guests. Here’s a summary of the action.
Apart from the usual faces, Oberstein and Crystal Carter, Head of SEO Communications, it was a powerhouse guest list:
- Chima Mmeje.
- Darren Shaw.
- Joy Hawkins.
- Eli Schwartz.
- Kevin Indig.
- Barry Schwartz.
Just How Broken Are The SERPs?
The first guest was Chima Mmeje from Moz. She dove into the frustrations that many SEOs have been feeling and spoke plainly about the flaws in Google’s updates.
Mordy Oberstein: “Is the SERP broken?”
Chima Mmeje: “The helpful content update, and I’m saying this here, live, is a farce. There was nothing helpful about that update. … Yes, the SERP is 1,000% broken. … How does anybody even use Google in the U.S.? … I don’t think they are going to release any update that will fix these issues.”
Mordy Oberstein: “There’s no update. … Plopping Reddit all over the SERP was because they saw the content trends … and they said ‘we don’t have any so we’re just going to throw Reddit there’.”
Chima Mmeje: “It was lazy to have Reddit there … Nobody uses their real names. Anybody can go on Reddit and answer questions and then you see these answers populating in People Also Ask, populating in featured snippets, populating all over the SERPs as correct information. It is dangerous, at worst.”
Crystal Carter: “Do you think that one of the reasons why we’ve seen so much upheaval and so much volatility in the SERPs, which I certainly agree with, in the last year … is lots and lots of variables, like lots of new features coming in, so the alignment with Reddit, the AI overviews, the SGE … Do you think it is just too many things being thrown in at the same time and it messing up lots of SERPs as a result? Or do you think it’s something else?”
Chima Mmeje: “… releasing too many features that they did not test properly. Features that were rushed. SGE [testing] did not even last a year, and now they brought in Google AI Overviews. I still don’t understand why we have AI Overviews and featured snippets on the same SERP. I feel like it’s like pick one, make a choice.”
Mordy Oberstein’s next question was about what we can do. “As an SEO, how are you supposed to do this? I’ve heard things from people … Yeah, I don’t know what to do. I can’t produce the kind of results that I’ve always wanted to. Can you still be effective as an SEO in an environment like this?”
Chima Mmeje: “I’m going to be honest, we are suffering … It feels like we are trying our best with what we are seeing … because there is no clear guidance. And to be honest, a lot of us are playing a guessing game right now and that is the best that we can do. It’s all a guessing game based on what we’ve seen one or two variables work. And this is not a long-term strategy. If we’re going to be realistic, it’s not going to work in the long-term. I honestly, I don’t know what the answer is … you’re fighting against Reddit. How do you compete against Reddit? Nobody has figured that out yet.”
Crystal Carter: “Thanks for saying it out loud, Chima.” Crystal was reflecting the sentiment of the commenters, who appreciated her candor and willingness to say: we don’t know, but we’re trying our best.
Mordy Oberstein: “The most honest take I’ve heard on that in quite a long time.”
Mmeje also recounted examples of small website owners and small businesses that have had to shut down. She also talked about the pervasive feeling in the SEO community that there is no rhyme or reason to how the algorithms handle websites and content.
What’s Going On In Local SEO?
The next guests were Darren Shaw from Whitespark and Joy Hawkins, owner of Sterling Sky for a segment called “It’s New.” They talked about new developments in local SEO.
Hawkins talked about a new feature in Google Business Profile.
Joy Hawkins: “… There’s a little services section inside the Google business profile dashboard that’s easy to miss, but you can add anything you want in there. … We’ve done a lot of testing on it and they do impact ranking, but I should clarify, it’s like a small impact. So usually we see it for longer-tailed queries that maybe don’t match a category or things that are not super competitive. … So it is a small ranking factor, but still one that is worth filling out.”
Darren Shaw: “… this is the question that a lot of people ask. We know that if you go into the services section of your Google business profile, Google will suggest predefined services … And so Joy’s original research was focused on those predefined ones, and it definitely identified that when you do put those on your profile, you now rank better for those terms, depending on how competitive they are, as Joy had mentioned. … There is a place where you can add your own custom services. Have you done any testing around that? Will you rank better with the custom services?”
Joy Hawkins: “Yes. They both work. In custom services … I’m trying to remember the keyword that Colin tested it on. It was something super niche like vampire facials. I was Googling, what the hell is that? … Really, really niche … But he just wanted to know if there was any impact whatsoever and there was. [Custom services fields are a] good way to go after longer tail keywords that don’t have crazy high search volume or aren’t super competitive.”
Darren Shaw: “You want to make sure that you’re telling Google what you do … that’s basically what the services section provides. And it’s not a huge ranking factor, but it’s just another step in the local optimization process. … a tip for custom services because custom services often get pulled into the local results as justifications. It’ll say this business provides vampire facials, right? Well, did you know there’s a vampire emoji? So if you put the vampire emoji in the title … Then in the local results you’ll see a whole panel of businesses that all provide that service, but yours has that little vampire emoji which will draw people in.”
There was tons more in this section, including questions from the audience and some great jokes.
The Obligatory AI Section
Eli Schwartz and Kevin Indig were next up to talk about AI. Oberstein, professional rabble-rouser, tried to get them to argue, but despite their very different posting habits, they found a lot to agree on about AI.
Mordy Oberstein: “It wouldn’t be an SEO podcast if we didn’t talk about AI. Where do we currently stand with AI? What can it do? What can’t it do?”
Kevin Indig: “… We’re at a stage where AI basically has the capability to create content, analyze some basic data. It still hallucinates here and there and it still makes mistakes. … If you compare that to when this AI hype started in November, 2022, so it’s almost two years now and we’ve come a really long way, these models are getting exponentially better. … It means different things based on whether you look at it as a tool for yourself to make your work more efficient. And of course, what does it mean from an SEO perspective? How does it change search, not just Google, but also how people search. And I think these are all different questions that are exciting to dive into. … So there is a lot of objective data that indicates efficiencies and benefits from AI. There’s also a lot of hype that promises a little too much about what AI can do. And so I’m generally AI bullish, but I’m not in the camp of AI is going to replace us all the next two years.”
Mordy Oberstein: “I’m setting the stage here a little bit because while your LinkedIn posts are generally pro-AI, a lot of Eli’s posts are a little more skeptical about AI. So Eli, what do you think about what Kevin just said? By the way, for those who are listening or watching this, I’m pitting them against each other. They’re friends and they do a podcast together. So it’s cool.”
Eli Schwartz: “I think AI is great. I think that there’s a lot of great things you can get out of AI. You can, again, like Kevin said, it can be your thought partner. … I’m anti-AI in the way people are using it. And I don’t think people have necessarily changed their behaviors because before … they outsourced [content] on Fiverr and Upwork and they bought very cheap content, and now they’re getting very free content. So then that’s coming from AI. That behavior hasn’t really changed. The challenge is that now there are more people that think they can copy them.
So I talk to CMOs all the time who are like, well, I just let go of my SEO team. A big company reached out to me recently. They wanted to gut check themselves after they already fired their SEO team. So I can’t really help there, but they’re like, AI can do everything. … Well, I’ll see them in a year from now when they have whatever sort of penalty. AI is a very powerful tool. Any tool, like a drill, is a very powerful tool. But if you just hold it in the air and just let it go, it’s going to make holes. But if you use it appropriately, it does the thing it’s supposed to do. … We’re humans and we buy stuff, and it has to come to a point where humans are talking to humans.”
Crystal Carter: “… Most of the gains are coming from productivity. The stuff like Kevin was talking about with being able to write product descriptions more quickly, being able to write lots of posts more quickly and being able to finish your things more quickly, brainstorm, et cetera, in terms of the quality, the quality is still not there. It’s getting there rapidly, but it’s still not there.”
There was lots more AI talk, so you should listen to the whole episode if you want to hear the full range of opinions.
Snappy News About The Google August Update
“The Snappy News” segment featured Barry Schwartz, Contributing Editor to Search Engine Land. It also featured the dreaded SEO phrase “it depends.”
Mordy Oberstein: “So the article of the day is from Search Engine Land, basically written by Barry, that the core update, the August 2024 core update, is done. It is complete. … The issue is with folks who are trying to figure out, will they see a reversal of their fortunes from the 2023 helpful content update, the September 2023 helpful content update. It’s a mouthful, to be honest with you. And my question for you, since you’re here, did that happen? Was the August update a reversal?”
Barry Schwartz: “It depends on the site. I think the number, I don’t have the exact data, obviously I don’t think anybody does, but I’ve seen examples of some very few sites see complete reversals. … There are a number of sites that saw maybe a 20% bump, a 30% bump, maybe a 5% bump. But very few sites saw a complete reversal, if you want to even call it that. … I’ve been through a lot of Google updates over the years, and it’s sometimes sad to see the stories, but at the same time, if you keep at it and you are true to the content, your audience, generally, you’ll do well in the long run. Not every site, there’s plenty of sites that have been hit, went out of business, and they couldn’t come back. That’s business in general. And things change, like seasonalities and times change. You’re writing about the railroad business a hundred years ago and you keep writing about it today. There’s not many people investing a lot of money in railroads these days. So I dunno, it’s, it’s hard to read those stories, but not everybody deserves to go back to where they were. And then at the same time, Google’s not perfect either, which is why they keep on releasing new updates.”
That’s a wrap!
If you haven’t experienced a SERPs Up episode before, you should absolutely take a listen to experience the full effect of Mordy and Crystal’s banter.
The SERPs Up podcast is brought to you by Wix Studio.
OpenAI Claims New “o1” Model Can Reason Like A Human
OpenAI has unveiled its latest language model, “o1,” touting advancements in complex reasoning capabilities.
In an announcement, the company claimed its new o1 model can match human performance on math, programming, and scientific knowledge tests.
However, the true impact remains speculative.
Extraordinary Claims
According to OpenAI, o1 can score in the 89th percentile on competitive programming challenges hosted by Codeforces.
The company insists its model can perform at a level that would place it among the top 500 students nationally on the elite American Invitational Mathematics Examination (AIME).
Further, OpenAI states that o1 exceeds the average performance of human subject matter experts holding PhD credentials on a combined physics, chemistry, and biology benchmark exam.
These are extraordinary claims, and it’s important to remain skeptical until we see open scrutiny and real-world testing.
Reinforcement Learning
The purported breakthrough is o1’s reinforcement learning process, designed to teach the model to break down complex problems using an approach called the “chain of thought.”
By simulating human-like step-by-step logic, correcting mistakes, and adjusting strategies before outputting a final answer, OpenAI contends that o1 has developed superior reasoning skills compared to standard language models.
Implications
It’s unclear how o1’s claimed reasoning could enhance understanding of queries—or generation of responses—across math, coding, science, and other technical topics.
From an SEO perspective, anything that improves content interpretation and the ability to answer queries directly could be impactful. However, it’s wise to be cautious until we see objective third-party testing.
OpenAI must move beyond benchmark boasting and provide objective, reproducible evidence to support its claims. Adding o1’s capabilities to ChatGPT in planned real-world pilots should help showcase realistic use cases.
Featured Image: JarTee/Shutterstock