SEARCHENGINES
James Pate On SEO At IBM & Using Airtable For Managing Enterprise SEO Data
James Pate is the Technical SEO Manager at IBM. IBM has a whole group of respected and well-known SEOs: I've spoken with Patrick Stox while he was at IBM, and I spoke with Tanu Javeri at IBM's NYC offices a while back, so James is the third IBM SEO I've interviewed. He is based in Brooklyn, which is not too far from my office and yet, in the wonderful world of New York, feels super far.
James ventured into SEO by developing software, such as WordPress plugins, including a rich snippets stars plugin. So I jokingly blamed James for Google showing fewer stars in the search results over the years. He also coded plugins for videos and other areas; this is how technical SEOs think, which is wonderful. He did a lot of affiliate marketing when he first got into SEO, which is how many SEOs start. He ran a small company doing mostly WordPress work for local businesses, moved to a larger company to work on bigger projects, and then joined IBM.
SEO At IBM:
At IBM he is involved in many different projects, but he does a lot of work with SEO data: he looks at ways to improve inbound search specific to IBM products, and he takes on a lot of technical SEO projects. He built a redirect engine, a crawler, and similar tools. IBM is obviously a very old company, with a ton of old content spread across many platforms and subdomains.
Using Airtable For Managing Enterprise SEO Data:
He spoke to me about his framework for managing enterprise SEO data, specifically how he was able to centralize IBM's SEO marketing data in Airtable with various automations. This helped IBM discover linking opportunities, find content gaps, manage a topic taxonomy, and more, all to cope with a huge number of pages, keywords, and URLs. In Airtable, IBM stores not just keywords but also keyword intent, which is super interesting. He also tags keywords by topic, page category, topic relationships, and so on, so he can build a map of it all.
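To make the idea concrete, here is a minimal sketch of what pushing a keyword row with intent and topic tags into Airtable might look like. The field names (Keyword, Intent, Topics, Page Category) and the base/table identifiers are hypothetical; IBM's actual schema is not public. Airtable's REST API accepts records as JSON posted to a base/table endpoint:

```python
import json

def build_keyword_record(keyword, intent, topics, page_category):
    """Build an Airtable-style record payload for one keyword row.

    All field names here are hypothetical examples, not IBM's
    actual Airtable schema.
    """
    return {
        "records": [
            {
                "fields": {
                    "Keyword": keyword,
                    "Intent": intent,            # e.g. informational / commercial
                    "Topics": topics,            # a multi-select field
                    "Page Category": page_category,
                }
            }
        ]
    }

# This payload would be POSTed to
#   https://api.airtable.com/v0/<BASE_ID>/<TABLE_NAME>
# with an "Authorization: Bearer <API_TOKEN>" header.
payload = build_keyword_record(
    "enterprise seo platform", "commercial", ["SEO", "Tools"], "Product"
)
print(json.dumps(payload, indent=2))
```

With automations layered on top, records like this can be cross-referenced by topic to surface internal linking opportunities and content gaps, which is the kind of mapping James described.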
If you want help with this, reach out to James Pate on Twitter @jamesfpate.
You can subscribe to our YouTube channel by clicking here so you don’t miss the next vlog, where I interview SEOs and SEMs. I have a nice lineup of interviews scheduled, many of which you don’t want to miss, and I promise to keep making these vlogs better over time. If you want to be interviewed, please fill out this form with your details.
Forum discussion at YouTube.
Source: www.seroundtable.com
Google Search Console Verification Does Not Impact Your Ranking In Google Search

Gary Illyes of Google said in the Google SEO office-hours from yesterday that verifying your website in Google Search Console won’t impact your Google Search indexing or ranking whatsoever.
Gary said, “Having your site verified in Search Console or changing the verification code and method has no effect on indexing or ranking whatsoever.”
John Mueller of Google previously said that Search Console verification doesn’t help with crawling either.
Gary added later that Search Console gives you data and analytics that can help you make improvements to your site to help you rank better in Google Search potentially. “You can use the data that Search Console gives you to improve your site and thus potentially do better in Search with your site, but otherwise has no effect on search whatsoever,” he added.
Here is the video embed at the 15:27 mark:
Forum discussion at Twitter.
Google Says Most Common Reason For Blocking Googlebot Is Firewall Or CDN Issues

Gary Illyes from Google posted a new PSA on LinkedIn saying that the most common reason a site unexpectedly blocks Googlebot from crawling is due to a misconfiguration of a firewall or CDN.
Gary wrote, “check what traffic your firewalls and CDN are blocking,” adding, “By far the most common issue in my inbox is related to firewalls or CDNs blocking googlebot traffic. If I reach out to the blocking site, in the vast majority of the cases the blockage is unintended.”
So what can you do? Gary said, “I’ve said this before, but want to emphasize it again: make a habit of checking your block rules. We publish our IP ranges so it should be very easy to run an automation that checks the block rules against the googlebot subnets.”
Gary linked to this help document for more details.
In short, do what you can to test whether your site is accessible to Googlebot. One method is the URL inspection tool in Google Search Console. Also, confirm with your CDN or firewall company that they are allowing Googlebot, and ask them to prove it.
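The automation Gary describes can be sketched simply: fetch Google's published Googlebot IP ranges and check them against your block rules. A minimal illustration using Python's `ipaddress` module is below; the hard-coded CIDR list is just a small sample standing in for the full, regularly updated list Google publishes, which a real check should fetch instead:

```python
import ipaddress

# Sample Googlebot IPv4 ranges. In practice, download the full,
# current list that Google publishes rather than hard-coding it.
GOOGLEBOT_RANGES = [
    "66.249.64.0/27",
    "66.249.64.32/27",
    "66.249.66.0/27",
]

def is_googlebot_ip(ip: str) -> bool:
    """Return True if the IP falls inside a known Googlebot subnet."""
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(cidr) for cidr in GOOGLEBOT_RANGES)

# A firewall deny-list could be screened the same way: flag any
# blocked IP or range that overlaps a Googlebot subnet.
print(is_googlebot_ip("66.249.66.1"))   # inside 66.249.66.0/27
print(is_googlebot_ip("203.0.113.9"))   # outside all listed ranges
```

Running a check like this against your firewall or CDN deny rules on a schedule is exactly the kind of habit Gary is recommending.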
Forum discussion on LinkedIn.
Google Search Testing More Card & Box-Like Buttons In Search

Google is testing more card and box-like elements in the search results. We covered this with the product results interface a few weeks back, but now we are seeing these designs applied to other elements.
Here are screenshots shared with me of these designs:
Via @b4k_khushal:
Via @mblumenthal via @b4k_khushal:
I kind of like them but I might be wrong. 🙂
Forum discussion at Twitter.