SEARCHENGINES
Google Search Central Unconference 2022: Quick Recap
Mihai Aperghis (@mihaiaperghis), an SEO we reference here from time to time, is also a Google Product Expert and attended the Google Search Central Unconference the other week. He has written this blog post and I am posting it here as a super rare guest post on this site. Why? (1) Mihai rocks and (2) this site is about community and Product Experts are the essence of the Google community. I (Barry) personally was unable to attend due to a conflict. Note: Mihai did not ask for a link or a mention, but I added this so it is clear that he wrote this.
Google hosted the 2022 edition of the Search Central Virtual Unconference on April 27, making it the third global Google Unconference and fourth such event so far (counting the Japanese one that took place a few weeks earlier).
Quick Overview
For those unfamiliar with it, the Google Unconferences aim to provide discussion-focused sessions where participants (SEOs, developers, business owners, etc.) join online and share their experiences under a slightly more ‘informal’ structure, as opposed to traditional speaker events or even to Google office hours. The “facilitators” for each session (typically two people) have the sole responsibility of guiding discussions within the topic bounds, making sure all participants get heard and, since the sessions are not recorded, taking notes of the conversations that take place.
This year, the event was once again run by Martin Splitt, the Googler behind the Unconference idea, together with Cherry Sireetorn Prommawin from Google APAC.
Just as with editions from previous years (2021 and 2020), Product Experts such as myself were invited ahead of the event to propose session topics that they would then facilitate, should Martin and Cherry accept their proposals.
I won’t get into more details regarding the Unconference format, since I would probably just repeat what I covered in last year’s recap, so feel free to check that out if you’re curious.
One key difference for this year’s edition, however, was that the facilitator role was now open to everyone, thus allowing Product Experts to send the session proposal form to anyone they might see fit to moderate.
Sessions
Since this year’s proposal form was opened up to more people, the number of proposed sessions was likely much higher than in previous years. Martin and Cherry ultimately selected 25 of them, which were then voted on by people who wanted to attend, based on their topics of interest.
Also different from last year was the number of participants, which this time was capped at around 14 people (as opposed to 20-25 people). This made it easier for more people to speak up during the 45-minute session, as well as to break the ice or have everyone introduce themselves without taking too much time.
All 25 sessions were ultimately kept, given that there were at least 5-6 people interested in each one. As usual, the sessions were split into two 45-minute blocks, which meant people could only attend one session in each slot:
Session slot A:
- Tech SEO Q&A, all your questions answered.
- E-commerce SEO Challenges
- Making the best use of Search Intent Optimization (SIO)
- International SEO
- Schema: JSON Successes and Challenges
- Organic and Paid Growth Collaborations
- How can SEOs and Web developers work better together?
- Exploring Google Search Console APIs
- Video SEO – Best practices for optimizing videos on Google
- Let’s discuss spam and low quality results
- Let’s talk about Product review sites
- Webmaster & Podcaster
- Where do you find help?
Sessions slot B:
- SEO A/B split testing ideas
- Content: It’s All About Trust, Transparency and (Human) Typing
- Can Google See This? Rendering Q&A
- Core Web Vitals and how to approach it
- Project Management for Digital and SEO
- User Journey R&D Discussion
- Is Search Console working for you?
- Site Troubleshooting
- Google Business Profile: Myths and Guidance.
- Localization and its Peculiarities
- Working with Images on the Web
- A positive thing in 2022.. Unconference in Spanish!
There were also two conclusion blocks, one after each slot, in which facilitators presented the main conclusions from each session (which is why it’s a good idea for one of the facilitators in each session to take notes). Everything was book-ended by a 15-minute intro and a quick wrap-up at the end.
If you’re curious about the full description for each session, you can find everything on the official event page.
The E-commerce SEO Challenges Session
Since people outside Google or the Product Experts program were able to join as facilitators, I decided to ask the wonderful Aleyda Solis to co-facilitate one of the sessions with me. Together we landed on a list of three potential topics, out of which the E-commerce SEO Challenges one was ultimately chosen and included in slot A.
With such a popular topic, we managed to fill the room with outstanding people from highly diverse backgrounds: in-house SEOs, agency owners, webmasters and marketers, ranging from highly experienced technical people to folks who only recently started learning the ropes.
With her vast experience running conversations on SEO topics, such as with the weekly #SEOFOMO Twitter chat, Aleyda skillfully guided the discussion around some of the popular E-commerce SEO issues, such as product variations and structured data, facets and navigation indexing, but also dealing with SEO implementation costs and getting buy-in from management or leadership people. I took the note-taking job this time, focusing on getting everyone’s opinion down in order to draft a few takeaways.
Since there will likely be an official Google blog post providing more details on each session’s conclusions, I won’t delve deeper here.
What I can say, though, is that it was an excellent session. Almost everyone who joined had a story, a perspective or an opinion they wanted to share, which made for a really pleasant conversation.
After the session timer ran out, everybody was moved back to the main room to listen to all of the slot A session conclusions, where I was happy to present our own.
Slot B and Wrap Up
Since I had no session to facilitate in slot B, I happily chose to attend the Working with Images on the Web one, which was masterfully moderated by Roxana Stingu and Olesia Korobka.
The topics discussed ranged from image indexing and metadata, to AI, MUM and other cutting-edge info that I was completely unaware of up until then (it seems it’s harder to keep up with everything in SEO nowadays!).
After the session ended, facilitators from all slot B sessions presented their takeaways, after which we bade farewell to everyone and called it an evening (or morning, or night, depending on where everyone was joining from).
All in all, I’m really happy with how the event turned out and very grateful to Martin and the Google team for giving SEO enthusiasts the opportunity to facilitate sessions. If you haven’t joined any of the Unconference events so far, I highly recommend you keep an eye out for the next one. Something tells me there will be (hopefully many) more editions coming soon.
I am so happy that everyone at the #scunconf seems to have had a great time. Now I shall catch up on a lot of sleep 🤣
— Martin Splitt (@g33konaut) April 27, 2022
Forum discussion at Twitter.
Source: www.seroundtable.com
Microsoft Testing Clear Distinction Between Free & Paid Search Results On Bing
Microsoft’s disclosure of search ads on Bing has not been great; honestly, in many cases it is worse than Google’s disclosures. Recently, however, Bing has been testing a clearer distinction between its ads and its organic, free listings.
Frank Sandtmann spotted this and posted about it on Mastodon and after fiddling with it enough, I was able to replicate it.
Look at how the ads are on a white background while the free organic listings are on a gray background:
I wonder if this will go live, or if Microsoft, after seeing the results, will go back to making the distinction between ads and free results almost impossible to see.
Frank posted more examples on Mastodon.
Forum discussion at Mastodon.
Google Started Enforcing The Site Reputation Abuse Policy
Google said it began to enforce its new site reputation abuse policy last night. The policy went into effect on Sunday, May 5th, but Google did not announce it would take action until last night. As a reminder, this should target sites doing what some call “Parasite SEO.”
It seems some large “reputable” sites were hit by this update, including CNN, USA Today, LA Times, Fortune, Daily Mail, Outlook India, TimesUnion, PostandCourier, SFGATE and many more. Google specifically targeted these sites with manual actions, notifying them with a message in Google Search Console. These are not algorithmic actions.
As a reminder, on March 5th, Google released new spam policies and a spam update, including scaled content and expired domain abuse, but said the site reputation abuse policy would go into effect only after May 5th. That date has come, and Danny Sullivan, Google’s Search Liaison, said on X yesterday:
It’ll be starting later today. While the policy began yesterday, the enforcement is really kicking off today.
Sullivan later told me on X, “we’re only doing manual actions right now.” “The algorithmic component will indeed come, as we’ve said, but that’s not live yet,” he added.
And it seems Google has already started to drop this type of content from these sites. CNN, USA Today, LA Times and others all had those coupon directories open to Google as of last night, and then all saw those pages stop ranking in Google Search overnight.
I am not seeing a lot of people share screenshots of manual actions, but I did spot one site owner saying they received this manual action. They posted in the Google Webmaster Help forum saying:
We have a section on the website for brands to promote.
Nofollow attribute is already implemented on these articles which falls under brand category.
Still we got manual action: Site Reputation Abuse for this category.
How to fix that?
Brodie Clark also secured a screenshot of this manual action:
Here are examples of sites hit by this site reputation abuse enforcement from last night:
You’re right. I’m seeing the same thing. USA Today, CNN, and LA Times are gone for “subway coupons” and other queries. Sure seems like the update is underway. 🙂 First screenshot is now and second is as of yesterday. pic.twitter.com/f46B5h2ccP
— Glenn Gabe (@glenngabe) May 6, 2024
Here’s another example. The query “uber promos codes” yielded CNN as #2 yesterday and Fortune at #4. Both are now gone. I can’t even find them. Wow. pic.twitter.com/0Oc48ggYeh
— Glenn Gabe (@glenngabe) May 6, 2024
As Glenn wrote, “Google has already released the Kraken.”
Has ‘Vouchergeddon’ begun? I can no longer see the Daily Mail discount code website ranking in the UK for brand and non-branded queries that the Daily Mail was previously ranking for? cc @rustybrick pic.twitter.com/2T8ffmgCFI
— Carl Hendy (@carlhendy) May 7, 2024
Seeing the “parasite” directories from Outlook India, TimesUnion, PostandCourier, and SFGATE completely deindexed from Google right now, to name a few. Note: Post and Courier added a NoIndex tag on all of its pages nested in their parasite directory: @glenngabe @rustybrick
— Vlad Rappoport (@vladrpt) May 7, 2024
This is what the rankings looked like for “Walmart coupon code” during the first half of the day today.
The SERPs are COMPLETELY different now, hours later.
It’s still early, but it seems like Google is NOT messing around with site reputation abuse. pic.twitter.com/eawsCxUeL4
— Lily Ray 😏 (@lilyraynyc) May 7, 2024
Google has already started taking action for the new site reputation abuse policy 👀👇 See the before/after for many of the most popular “promo code(s)” queries:
* carhartt promo code
* postmates promo code
* samsung promo code
* godaddy promo code
Sites that were ranking… pic.twitter.com/Byw8DZmkQP
— Aleyda Solis 🕊️ (@aleyda) May 7, 2024
Site abuse: Google has confirmed it has taken manual actions.
I took 2,500 of the most popular search queries for discount and voucher codes from the UK and Australian markets. The data was sourced from @semrush. Using these queries, I created a ‘Share of Search’ report for each… pic.twitter.com/00zMCdeW5g
— Carl Hendy (@carlhendy) May 7, 2024
Good Morning Google Land! Well, we had a pretty exciting end to yesterday as Google released the Kraken with the “Site reputation abuse” update — starting with a flurry of manual actions on some of the most authoritative sites on the web. The manual actions were pattern-based,… pic.twitter.com/xDEpm2sdCE
— Glenn Gabe (@glenngabe) May 7, 2024
Google said it will take action on this policy abuse both algorithmically and through manual actions. Many sites, though not all, removed sections of their sites that would get hit by this penalty before Google began enforcing it. This includes sites like Forbes coupons, but many more big brands removed these types of sections from their websites.
As a reminder, site reputation abuse “is when third-party pages are published with little or no first-party oversight or involvement, where the purpose is to manipulate Search rankings by taking advantage of the first-party site’s ranking signals,” Chris Nelson from the Google Search Quality team wrote. This includes sponsored, advertising, partner, or other third-party pages that are typically independent of a host site’s main purpose or produced without close oversight or involvement of the host site, and provide little to no value to users, he explained.
I am not posting the aggregate Google tracking tools because I posted them in my previous story, and this is a targeted hit that only impacts sites that rent out sections of their domain. So this would not hit a huge number of websites the way big algorithmic updates do.
If you got hit by this, follow the instructions in the manual action notice you received in Google Search Console. There is also more documentation on this penalty over here.
I am not sure if Google will notify us of when algorithmic action will take place on this policy…
Forum discussion at X.
Google Says Again, Sites Hit By The Old Helpful Content Update Can Recover
Google’s John Mueller said again this morning that sites hit by the old September helpful content update, or even newer core updates, can recover. He said on X and on LinkedIn that it is possible to recover, but it is not a simple change you can tweak on your website; rather, it takes a lot of effort, over time, to recover.
John said that not only can you recover but you can grow. He said this morning, “Yes, sites can grow again after being affected by the “HCU” (well, core update now).”
Last week we covered how John said it may just take a lot of time to recover from that helpful content update. This is despite Google telling some people it can take weeks (and later saying several months) to recover.
I know the helpful content update is no more; it is now part of the core update. But many were expecting some of those hit by the September helpful content update to recover with the March 2024 core update – but that did not happen.
John Mueller from Google said on LinkedIn, “It’s just that some kinds of changes take a long time to build up, and that applies to all kinds of systems & updates in Google & in any other larger computer system.”
He wrote on LinkedIn fully:
I realize this is from the title of Barry’s post, but to be clear, it’s not that “helpful content update” “recoveries” take longer than other updates. It’s just that some kinds of changes take a long time to build up, and that applies to all kinds of systems & updates in Google & in any other larger computer system. Saying that this is specific to the helpful content system, or to core updates would be wrong & misleading.
There is, however, the additional aspect of the “core update” being about how our systems assess content overall, how we consider it to be helpful, reliable, relevant to users’ queries. This does not map back to a single change that you can make on a website, so – in my experience – it’s not something that a website can just tweak overnight and be done with it. It can require deep analysis to understand how to make a website relevant in a modern world, and significant work to implement those changes — assuming that it’s something that aligns with what the website even wants. These are not “recoveries” in the sense that someone fixes a technical issue and they’re back on track – they are essentially changes in a business’s priorities (and, a business might choose not to do that).
He added on LinkedIn:
making a site more helpful (assuming that’s what you’re aiming for) doesn’t mean you have to add more content. There’s a lot that goes into making a helpful site – content is one part, and more content is not necessarily more helpful. Think about how you use the web.
He also posted this morning on X, “Yes, sites can grow again after being affected by the “HCU” (well, core update now). This isn’t permanent. It can take a lot of work, time, and perhaps update cycles, and/but a different – updated – site will be different in search too.” He added, “Permanent changes are not very useful in a dynamic world, so yes. However, “recover” implies going back to just-as-before, and IMO that is always unrealistic, since the world, user-expectations, and the rest of the web continues to change. It’s never “just-as-before”.”
Here are some of the new posts on this topic from John over the weekend:
Permanent changes are not very useful in a dynamic world, so yes. However, “recover” implies going back to just-as-before, and IMO that is always unrealistic, since the world, user-expectations, and the rest of the web continues to change. It’s never “just-as-before”.
— John 🧀 … 🧀 (@JohnMu) May 6, 2024
Yes, sites can grow again after being affected by the “HCU” (well, core update now). This isn’t permanent. It can take a lot of work, time, and perhaps update cycles, and/but a different – updated – site will be different in search too.
— John 🧀 … 🧀 (@JohnMu) May 6, 2024
It’s because not all changes require another update cycle. In practice, I’d assume that stronger effects will require another update. Core updates can include many things.
— John 🧀 … 🧀 (@JohnMu) May 4, 2024
(“thresholds” is a simplification for any numbers that need a lot of work and data to be recalculated, reevaluated, reviewed)
— John 🧀 … 🧀 (@JohnMu) May 4, 2024
So keep working on your site and maybe you will recover in the long run?