
How To Automate Ecommerce Category Page Creation With Python


Clustering product inventory and automatically aligning SKUs to search demand is a great way to find opportunities to create new ecommerce categories.

Niche category pages are a proven way for ecommerce sites to align with organic search demand while simultaneously assisting users in purchasing.

If a site stocks a range of products and there is search demand, creating a dedicated landing page is an easy way to align with the demand.

But how can SEO professionals find this opportunity?

Sure, you can eyeball it, but you’ll usually leave a lot of opportunity on the table.

This problem motivated me to script something in Python, which I’m sharing today in a simple-to-use Streamlit application. (No coding experience required!)

The app linked above created the following output automatically using nothing more than two crawl exports!

A CSV export showing new subcategories generated automatically using Python. (Screenshot from Microsoft Excel, May 2022.)

Notice how the suggested categories are automatically tied back to the existing parent category?

A CSV export showing that the new subcategories have been tied back to their parent category. (Screenshot from Microsoft Excel, May 2022.)

The app even shows how many products are available to populate the category.

The number of products available to populate the new subcategories is highlighted. (Screenshot from Microsoft Excel, May 2022.)

Benefits And Uses

  • Improve relevancy to high-demand, competitive queries by creating new landing pages.
  • Increase the chance of relevant site links displaying underneath the parent category.
  • Reduce CPCs to the landing page through increased relevancy.
  • Potential to inform merchandising decisions. (If there is high search demand vs. low product count, there is potential to widen the range.)

A mock-up displaying the new categories as sitelinks within Google Search results. (Mock-up screenshot from Google Chrome, May 2022.)

Creating the suggested subcategories for the parent sofa category would align the site to an additional 3,500 searches per month with relatively little effort.

Features

  • Create subcategory suggestions automatically.
  • Tie subcategories back to the parent category (cuts out a lot of guesswork!).
  • Match to a minimum of X products before recommending a category.
  • Check similarity to an existing category (X % fuzzy match) before recommending a new category.
  • Set minimum search volume/CPC cut-off for category suggestions.
  • Supports search volume and CPC data from multiple countries.

Getting Started/Prepping The Files

To use this app, you need two things.

At a high level, the goal is to crawl the target website with two custom extractions.

The internal_html.csv report is exported, along with an inlinks.csv export.

These exports are then uploaded to the Streamlit app, where the opportunities are processed.

Crawl And Extraction Setup

When crawling the site, you’ll need to set two extractions in Screaming Frog – one to uniquely identify product pages and another to uniquely identify category pages.

The Streamlit app understands the difference between the two types of pages when making recommendations for new pages.

The trick is to find a unique element for each page type.

(For a product page, this is usually the price or the returns policy, and for a category page, it’s usually a filter sort element.)

Extracting The Unique Page Elements

Screaming Frog allows for custom extractions of content or code from a web page when crawled.

This section may be daunting if you are unfamiliar with custom extractions, but it’s essential for getting the correct data into the Streamlit app.

The goal is to end up with something looking like the below image.

(A unique extraction for product and category pages with no overlap.)

Two custom extractions to uniquely identify product and category pages. (Screenshot from Screaming Frog SEO Spider, May 2022.)

The steps below walk you through manually extracting the price element for a product page.

Then, repeat for a category page afterward.

If you’re stuck or would like to read more about the web scraper tool in Screaming Frog, the official documentation is worth your time.

Manually Extracting Page Elements

Let’s start by extracting a unique element only found on a product page (usually the price).

Highlight the price element on the page with the mouse, right-click and choose Inspect.

Using the Inspect feature of Google Chrome to extract a CSS selector. (Screenshot from Google Chrome, May 2022.)

This will open up the elements window with the correct HTML line already selected.

Right-click the pre-selected line and choose Copy > Copy selector. That’s it!
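If you’d like to sanity-check a copied selector outside of Screaming Frog, you can test it with a few lines of Python. This is just a sketch using BeautifulSoup; the markup and selector below are made-up examples, not from a real site:

```python
from bs4 import BeautifulSoup

# Hypothetical product-page markup; in practice you'd use the real page HTML.
html = """
<div class="product-summary">
  <h1>Grey 3 Seater Fabric Sofa</h1>
  <span id="product-price" class="price">£499</span>
</div>
"""

soup = BeautifulSoup(html, "html.parser")

# Paste the selector copied from Chrome DevTools here.
price = soup.select_one("#product-price")
print(price.get_text(strip=True))  # £499
```

If the selector returns the element you expect here, it should behave the same way as a CSSPath extraction in Screaming Frog.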

Copying the CSS selector for use in Screaming Frog. (Screenshot from Google Chrome, May 2022.)

Open Screaming Frog and paste the copied selector into the custom extraction section. (Configuration > Custom > Extraction).

Using a custom extractor in Screaming Frog. (Screenshot from Screaming Frog SEO Spider, May 2022.)

Name the extractor “product,” select CSSPath from the drop-down, and choose Extract Text.

Repeat the process to extract a unique element from a category page. It should look like this once completed for both product and category pages.

The custom extractor correctly populated. (Screenshot from Screaming Frog SEO Spider, May 2022.)

Finally, start the crawl.

The crawl should look like this when viewing the Custom Extraction tab.

Unique extractions for product and category pages. (Screenshot from Screaming Frog SEO Spider, May 2022.)

Notice how the extractions are unique to each page type? Perfect.

The script uses the extractor to identify the page type.

Internally the app will convert the extractor to tags.

(I mention this to stress that the extractors can be anything as long as they uniquely identify both page types.)

How the app interprets the custom extractions to tag each page. (Screenshot from Microsoft Excel, May 2022.)

Exporting The Files

Once the crawl has been completed, the last step is to export two types of CSV files.

  • internal_html.csv.
  • inlinks to product pages.

Go to the Custom Extraction tab in Screaming Frog and highlight all URLs that have an extraction for products.

(You will need to sort the column to group it.)

Selecting the inlinks report in Screaming Frog, ready for exporting. (Screenshot from Screaming Frog SEO Spider, May 2022.)

Lastly, right-click the product URLs, select Export, and then Inlinks.

Right-clicking in Screaming Frog to export the inlinks report. (Screenshot from Screaming Frog SEO Spider, May 2022.)

You should now have a file called inlinks.csv.

Next, export the internal_html.csv file.

Click the Internal tab, select HTML from the dropdown menu below and click on the adjacent Export button.

Finally, choose the option to save the file as a .csv.

Exporting the internal_html.csv report from Screaming Frog. (Screenshot from Screaming Frog SEO Spider, May 2022.)

Congratulations! You are now ready to use the Streamlit app!

Using The Streamlit App

Using the Streamlit app is relatively simple.

The various options are set to reasonable defaults, but feel free to adjust the cut-offs to better suit your needs.

I would highly recommend using a Keywords Everywhere API key (although it is not strictly necessary, as search volume can be looked up manually later with an existing tool if preferred).

(The script pre-qualifies opportunity by checking for search volume. If the key is missing, the final output will contain more irrelevant words.)

If you want to use a key, this is the section on the left to pay attention to.

The area to paste in the optional Keywords Everywhere API key. (Screenshot from Streamlit.io, May 2022.)

Once you have entered the API key and adjusted the cut-offs to your liking, upload the inlinks.csv crawl export.

Uploading the inlinks.csv report. (Screenshot from Streamlit.io, May 2022.)

Once complete, a new upload box will appear adjacent to it, asking for the internal_html.csv crawl file.

Uploading the internal_html.csv report. (Screenshot from Streamlit.io, May 2022.)

Finally, a new box will appear asking you to select the product and category column names from the uploaded crawl file so they can be mapped correctly.

Mapping the column names from the crawl. (Screenshot from Streamlit.io, May 2022.)

Click submit and the script will run. Once complete, you will see the following screen and can download a handy .csv export.

The Streamlit app after it has successfully run a report. (Screenshot from Streamlit.io, May 2022.)

How The Script Works

Before we dive into the script’s output, it will help to explain what’s going on under the hood at a high level.

At a glance:

  • Generate thousands of candidate keywords by creating n-grams from product page H1 headings.
  • Qualify keywords by checking whether each one appears, as an exact or fuzzy match, in a product heading.
  • Further qualify keywords by checking for search volume using the Keywords Everywhere API (optional but recommended).
  • Check whether an existing category already exists using a fuzzy match (which can find words out of order, different tenses, etc.).
  • Use the inlinks report to assign suggestions to a parent category automatically.

N-gram Generation

The script creates hundreds of thousands of n-grams from the product page H1s, most of which are completely nonsensical.

In my example for this article, n-grams generated 48,307 words – so this will need to be filtered!

An example of the script generating thousands of nonsensical n-gram combinations. (Screenshot from Microsoft Excel, May 2022.)
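The n-gram step itself is simple to sketch in plain Python. This is a minimal illustration rather than the app’s exact code, and the H1 values are invented:

```python
def ngrams(text: str, n: int) -> list[str]:
    """Return all n-word sequences from a heading, lowercased."""
    words = text.lower().split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

# Hypothetical product H1s, as pulled from the internal_html.csv export.
h1s = ["Grey 3 Seater Fabric Sofa", "Blue 2 Seater Velvet Sofa Bed"]

candidates = set()
for h1 in h1s:
    for n in (2, 3):  # bigrams and trigrams
        candidates.update(ngrams(h1, n))

# Most combinations ("grey 3", "seater velvet") are nonsense and are
# removed by the qualification steps described below.
```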

The first step in the filtering process is to check whether the keywords generated via n-grams are found at least X times within the product name column.

(This can be in an exact or fuzzy match.)

Anything not found is immediately discarded, which usually removes around 90% of the generated keywords.
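That first filtering pass can be sketched with nothing but the standard library. Here difflib stands in for whichever fuzzy-matching library the app actually uses, and the threshold and product names are illustrative:

```python
from difflib import SequenceMatcher

def matches(keyword: str, product_name: str, threshold: float = 0.9) -> bool:
    """Exact substring match, falling back to a fuzzy similarity ratio."""
    if keyword in product_name:
        return True
    return SequenceMatcher(None, keyword, product_name).ratio() >= threshold

def qualify(candidates, product_names, min_products=2):
    """Keep only keywords found in at least min_products product names."""
    kept = {}
    for kw in candidates:
        hits = sum(matches(kw, name) for name in product_names)
        if hits >= min_products:
            kept[kw] = hits
    return kept

products = ["grey fabric sofa", "blue fabric sofa", "oak dining table"]
print(qualify({"fabric sofa", "velvet armchair"}, products))
# {'fabric sofa': 2}
```

“fabric sofa” survives because two product names contain it; “velvet armchair” matches nothing and is discarded.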

The second filtering stage is to check whether the remaining keywords have search demand.

Any keywords without search demand are then discarded too.

(This is why I recommend using the Keywords Everywhere API when running the script, which results in a more refined output.)
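The search-volume check then reduces the shortlist further. The exact Keywords Everywhere request and response format is an assumption here (check their API documentation), but the filtering logic looks roughly like this:

```python
def filter_by_search_volume(api_response: dict, min_volume: int = 50,
                            min_cpc: float = 0.0) -> dict:
    """Keep keywords meeting the volume/CPC cut-offs.

    `api_response` mimics an ASSUMED Keywords Everywhere response shape:
    {"data": [{"keyword": ..., "vol": ..., "cpc": {"value": ...}}, ...]}
    """
    kept = {}
    for item in api_response.get("data", []):
        volume = item.get("vol", 0)
        cpc = float(item.get("cpc", {}).get("value", 0))
        if volume >= min_volume and cpc >= min_cpc:
            kept[item["keyword"]] = volume
    return kept

# Canned response for illustration only.
response = {"data": [
    {"keyword": "fabric sofa", "vol": 2400, "cpc": {"value": "1.20"}},
    {"keyword": "sofa grey 3", "vol": 0, "cpc": {"value": "0"}},
]}
print(filter_by_search_volume(response))
# {'fabric sofa': 2400}
```

Anything with zero volume, like the nonsense n-gram “sofa grey 3,” drops out at this stage.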

It’s worth noting you can do this manually afterward by searching Semrush, Ahrefs, etc., discarding any keywords without search volume, and running a VLOOKUP in Microsoft Excel.

This is cheaper if you have an existing subscription.

Recommendations Tied To Specific Landing Pages

Once the keyword list has been filtered, the script uses the inlinks report to tie the suggested subcategory back to the landing page.

Earlier versions did not do this, but I realized that leveraging the inlinks.csv report meant it was possible.

It really helps you understand the context of the suggestion at a glance during QA.

This is the reason the script requires two exports to work correctly.
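The join itself is a straightforward merge: product URLs behind each suggestion are matched against the Destination column of inlinks.csv, and the Source column gives the parent category. A sketch with pandas, assuming each suggestion carries the product URL its keyword came from (the column names follow Screaming Frog’s inlinks export; the URLs are invented):

```python
import pandas as pd

# Simplified stand-ins for the two exports.
inlinks = pd.DataFrame({
    "Source": ["https://example.com/sofas", "https://example.com/sofas"],
    "Destination": [
        "https://example.com/p/grey-fabric-sofa",
        "https://example.com/p/blue-fabric-sofa",
    ],
})
suggestions = pd.DataFrame({
    "keyword": ["fabric sofa"],
    "product_url": ["https://example.com/p/grey-fabric-sofa"],
})

# Tie each suggested subcategory back to the page that links to its product.
merged = suggestions.merge(
    inlinks, left_on="product_url", right_on="Destination", how="left"
)
parents = merged[["keyword", "Source"]].rename(columns={"Source": "parent_category"})
print(parents.iloc[0]["parent_category"])  # https://example.com/sofas
```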

Limitations

  • Not checking search volumes will result in more results for QA. (Even if you don’t use the Keywords Everywhere API, I recommend shortlisting by filtering out 0 search volume afterward.)
  • Some irrelevant keywords will have search volume and appear in the final report, even if keyword volume has been checked.
  • Words will typically appear in their singular form in the final output (because product names are singular, while categories are usually pluralized). It’s easy enough to add an “s” to the end of the suggestion, though.

User Configurable Variables

I’ve selected what I consider to be sensible default options.

But here is a rundown if you’d like to tweak and experiment.

  • Minimum products to match to (exact match) – The minimum number of products that must exist before suggesting the new category in an exact match.
  • Minimum products to match to (fuzzy match) – The minimum number of products that must exist before suggesting the new category in a fuzzy match (words can be found in any order).
  • Minimum similarity to an existing category – This checks whether a category already exists in a fuzzy match before making the recommendation. The closer to 100, the stricter the matching.
  • Minimum CPC in $ – The minimum dollar amount of the suggested category keyword. (Requires the Keywords Everywhere API.)
  • Minimum search volume – The minimum search volume of the suggested category keyword. (Requires Keywords Everywhere API.)
  • Keywords Everywhere API key – Optional, but recommended. Used to pull in CPC/search volume data. (Useful for shortlisting categories.)
  • Set the country to pull search data from – Country-specific search data is available. (Default is the USA.)
  • Set the currency for CPC data – Country-specific CPC data is available. (Default USD.)
  • Keep the longest word suggestion – With similar word suggestions, this option will keep the longest match.
  • Enable fuzzy product matching – This will search for product names in a fuzzy match. (Words can be found out of order, recommended – but slow and CPU intensive.)
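To make the “minimum similarity to an existing category” option concrete, here is a rough sketch of a fuzzy, word-order-insensitive comparison. difflib again stands in for the app’s actual matching library, and the threshold of 80 is an assumed default:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Word-order-insensitive similarity score from 0 to 100."""
    norm = lambda s: " ".join(sorted(s.lower().split()))
    return SequenceMatcher(None, norm(a), norm(b)).ratio() * 100

def already_covered(suggestion: str, existing_categories, threshold: float = 80) -> bool:
    """True if an existing category is close enough to the suggestion."""
    return any(similarity(suggestion, cat) >= threshold for cat in existing_categories)

existing = ["Sofa Beds", "Corner Sofas"]
print(already_covered("beds sofa", existing))        # True (same words, different order)
print(already_covered("velvet armchairs", existing))  # False (no close existing category)
```

A suggestion that is “already covered” by an existing category is dropped, which is why a higher threshold means stricter matching and more suggestions surviving.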

Conclusion

With a small amount of preparation, it is possible to tap into a large amount of organic opportunity while improving the user experience.

Although this script was created with an ecommerce focus, according to feedback, it works well for other site types such as job listing sites.

So even if your site isn’t an ecommerce site, it’s still worth a try.

Python enthusiast?

I released the source code for a non-Streamlit version here.

Featured Image: patpitchaya/Shutterstock


Twitter Will Share Ad Revenue With Twitter Blue Verified Creators

Elon Musk, owner and CEO of Twitter, announced that starting today, Twitter will share ad revenue with creators. The new policy applies only to ads that appear in a creator’s reply threads.

The move comes on the heels of YouTube launching ad revenue sharing for creators through the YouTube Partner Program in a bid to become the most rewarding social platform for creators.

Social networks like Instagram, TikTok, and Snapchat have similar monetization options for creators who publish reels and video content. For example, Instagram’s Reels Play Bonus Program offers eligible creators up to $1,200 for Reel views.

The catch? Unlike other social platforms, creators on Twitter must have an active subscription to Twitter Blue and meet the eligibility requirements for the Blue Verified checkmark.

The following is an example of a Twitter ad in a reply thread (Promoted by @ASUBootcamps). It should generate revenue for the Twitter Blue Verified creator (@rowancheung), who created the thread.

Screenshot from Twitter, January 2023.

To receive the ad revenue share, creators would have to pay $8 per month (or more) to maintain an active Twitter Blue subscription. Twitter Blue pricing varies based on location and is available in the United States, Canada, Australia, New Zealand, Japan, the United Kingdom, Saudi Arabia, France, Germany, Italy, Portugal, and Spain.

Eligibility for the Twitter Blue Verified checkmark includes having an active Twitter Blue subscription and meeting the following criteria.

  • Your account must have a display name, profile photo, and confirmed phone number.
  • Your account has to be older than 90 days and active within the last 30 days.
  • Recent changes to your account’s username, display name, or profile photo can affect eligibility. Modifications to those after verification can also result in a temporary loss of the blue checkmark until Twitter reviews your updated information.
  • Your account cannot appear to mislead or deceive.
  • Your account cannot spam or otherwise try to manipulate the platform for engagement or follows.

Did you receive a Blue Verified checkmark before the Twitter Blue subscription? That will not help creators who want a share of the ad revenue. The legacy Blue Verified checkmark does not make a creator account eligible for ad revenue sharing.

When asked about accounts with a legacy and Twitter Blue Verified checkmark, Musk tweeted that the legacy Blue Verified is “deeply corrupted” and will sunset in just a few months.

Regardless of how you gained your checkmark, it’s important to note that Twitter can remove a checkmark without notice.

In addition to ad revenue sharing for Twitter Blue Verified creators, Twitter Dev announced that the Twitter API would no longer be free in an ongoing effort to reduce the number of bots on the platform.

While speculation looms about a loss in Twitter ad revenue, the Wall Street Journal reported a “fire-sale” Super Bowl offer from Musk to win back advertisers.

The latest data from DataReportal shows a positive trend for Twitter advertisers. Ad reach has increased from 436.4 million users in January 2022 to 556 million in January 2023.

Twitter is also the third most popular social network based on monthly unique visitors and page views globally, according to SimilarWeb data through December 2022.


Featured Image: Ascannio/Shutterstock




AI Content Detection Software: Can They Detect ChatGPT?

We live in an age when AI technologies are booming, and the world has been taken by storm with the introduction of ChatGPT.

ChatGPT is capable of accomplishing a wide range of tasks, but one that it does particularly well is writing articles. And while there are many obvious benefits to this, it also presents a number of challenges.

In my opinion, the biggest hurdle that AI-generated written content poses for the publishing industry is the spread of misinformation.

ChatGPT, or any other AI tool, may generate articles that contain factual errors or are just flat-out incorrect.

Imagine someone who has no expertise in medicine starting a medical blog and using ChatGPT to write content for their articles.

Their content may contain errors that can only be identified by professional doctors. And if that blog content starts spreading over social media, or maybe even ranks in Search, it could cause harm to people who read it and take erroneous medical advice.

Another potential challenge ChatGPT poses is how students might leverage it within their written work.

If one can write an essay just by running a prompt (and without having to do any actual work), that greatly diminishes the quality of education – as learning about a subject and expressing your own ideas is key to essay writing.

Even before the introduction of ChatGPT, many publishers were already generating content using AI. And while some honestly disclose it, others may not.

Also, Google recently changed its wording regarding AI-generated content, so that it is not necessarily against the company’s guidelines.

Image from Twitter, November 2022

This is why I decided to try out existing tools to understand where the tech industry is when it comes to detecting content generated by ChatGPT, or AI generally.

I ran the following prompts in ChatGPT to generate written content and then ran those answers through different detection tools.

  • “What is local SEO? Why it is important? Best practices of Local SEO.”
  • “Write an essay about Napoleon Bonaparte invasion of Egypt.”
  • “What are the main differences between iPhone and Samsung galaxy?”

Here is how each tool performed.

1. Writer.com

For the first prompt’s answer, Writer.com fails, identifying ChatGPT’s content as 94% human-generated.

Writer.com results. (Screenshot from Writer.com, January 2023.)

For the second prompt, it worked and detected it as AI-written content.

Writer.com test result. (Screenshot from Writer.com, January 2023.)

For the third prompt, it failed again.

Sample result. (Screenshot from Writer.com, January 2023.)

However, when I tested real human-written text, Writer.com did identify it as 100% human-generated very accurately.

2. Copyleaks

Copyleaks did a great job in detecting all three prompts as AI-written.

Sample result. (Screenshot from Copyleaks, January 2023.)

3. Contentatscale.ai

Contentatscale.ai did a great job detecting all three prompts as AI-written, although it gave the first prompt a 21% human score.

Contentatscale.ai results. (Screenshot from Contentatscale.ai, January 2023.)

4. Originality.ai

Originality.ai did a great job on all three prompts, accurately detecting them as AI-written.

Also, when I checked with real human-written text, it did identify it as 100% human-generated, which is essential.

Originality.ai results. (Screenshot from Originality.ai, January 2023.)

You will notice that Originality.ai doesn’t detect any plagiarism issues. This may change in the future.

Over time, people will use the same prompts to generate AI-written content, likely resulting in a number of very similar answers. When these articles are published, they will then be detected by plagiarism tools.

5. GPTZero

This non-commercial tool was built by Edward Tian, and specifically designed to detect ChatGPT-generated articles. And it did just that for all three prompts, recognizing them as AI-generated.

GPTZero results. (Screenshot from GPTZero, January 2023.)

Unlike other tools, it gives a more detailed analysis of detected issues, such as sentence-by-sentence analyses.

Sentence-by-sentence text perplexity analysis. (Screenshot from GPTZero, January 2023.)
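GPTZero’s headline metric is perplexity: roughly, how “surprised” a language model is by each sentence, with AI-generated text tending to score low. As an illustration of the idea only, here is a toy unigram version; real detectors use a full neural language model, not word counts:

```python
import math
from collections import Counter

# A tiny "training corpus" standing in for the model's knowledge of language.
corpus = "the cat sat on the mat and the dog sat on the rug".split()
counts = Counter(corpus)
total = len(corpus)
vocab = len(counts) + 1  # extra bucket for unseen words (add-one smoothing)

def perplexity(sentence: str) -> float:
    """Higher = more 'surprising' to the toy model."""
    words = sentence.lower().split()
    log_prob = sum(
        math.log((counts.get(w, 0) + 1) / (total + vocab)) for w in words
    )
    return math.exp(-log_prob / len(words))

# Familiar wording scores lower perplexity than unusual wording.
print(perplexity("the cat sat on the mat") < perplexity("quantum flux capacitor"))
```

Detectors flag text whose perplexity stays uniformly low sentence after sentence, which is characteristic of model-generated prose.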

6. OpenAI’s AI Text Classifier

And finally, let’s see how OpenAI detects its own generated answers.

For the first and third prompts, it detected AI involvement, classifying the text as “possibly AI-generated.”

AI Text Classifier: likely AI-generated.

But surprisingly, it failed on the second prompt, classifying it as “unlikely AI-generated.” I played with different prompts and found that, at the time of checking, a few of the above tools detect AI content with higher accuracy than OpenAI’s own tool.

AI Text Classifier: unlikely AI-generated.

At the time of this check, the tool had been released only a day earlier. I expect it will be fine-tuned over time and work much better.

Conclusion

Current AI content detection tools are in good shape and are able to detect ChatGPT-generated content (with varying degrees of success).

It is still possible for someone to generate copy via ChatGPT and then paraphrase that to make it undetectable, but that might require almost as much work as writing from scratch – so the benefits aren’t as immediate.

If you think about ranking an article in Google written by ChatGPT, consider for a moment: If the tools we looked at above were able to recognize them as AI-generated, then for Google, detecting them should be a piece of cake.

On top of that, Google has quality raters who will train their system to recognize AI-written articles even better by manually marking them as they find them.

So, my advice would be not to build your content strategy on ChatGPT-generated content, but use it merely as an assistant tool.

Featured Image: /Shutterstock




5 Things You Need To Know About Optimizing Content in 2023

30-second summary:

  • As the content battleground goes through tremendous upheaval, SEO insights will continue to grow in importance
  • ChatGPT can help content marketers get an edge over their competition by efficiently creating and editing high-quality content
  • Making sure your content ranks high enough to engage the target audience requires strategic planning and implementation

Google is constantly testing and updating its algorithms in pursuit of the best possible searcher experience. As the search giant explains in its ‘How Search Works’ documentation, that means understanding the intent behind the query and bringing back results that are relevant, high-quality, and accessible for consumers.

As if the constantly shifting search landscape weren’t difficult enough to navigate, content marketers are also contending with an increasingly technology-charged environment. Competitors are upping the stakes with tools and platforms that generate smarter, real-time insights and even make content optimization and personalization on the fly based on audience behavior, location, and data points.

Set-it-and-forget-it content optimization is a thing of the past. Here’s what you need to know to help your content get found, engage your target audience, and convert searchers to customers in 2023.

AI automation is going to be integral to content optimization

Image: technologies B2B organizations use to optimize content.

As the content battleground heats up, SEO insights will continue to grow in importance as a key source of intelligence. We’re optimizing content for humans, not search engines, after all – we had better have a solid understanding of what those people need and want.

While I do not advocate automation for full content creation, I believe that next year, as resources become stretched, automation will have a bigger impact on helping to optimize existing content.

ChatGPT

ChatGPT, developed by OpenAI, is a powerful language generation model that leverages the Generative Pre-trained Transformer (GPT) architecture to produce realistic, human-like text. With ChatGPT’s wide range of capabilities – from completing sentences and answering questions to generating content ideas or powering research initiatives – it can be an invaluable asset for any natural language processing project.

Image: ChatGPT for content.

The introduction of ChatGPT has caused considerable debate and explosive amounts of content on the web. With ChatGPT, content marketers can achieve an extra edge over their competition by efficiently creating and editing high-quality content. It offers assistance with generating titles for blog posts, summaries of topics or articles, as well as comprehensive campaigns when targeting a specific audience.

However, it is important to remember that this technology should be used to enhance human creativity rather than completely replacing it.

For many years now AI-powered technology has been helping content marketers and SEOs automate repetitive tasks such as data analysis, scanning for technical issues, and reporting, but that’s just the tip of the iceberg. AI also enables real-time analysis of a greater volume of consumer touchpoints and behavioral data points for smarter, more precise predictive analysis, opportunity forecasting, real-time content recommendations, and more.

With so much data in play and recession concerns already impacting 2023 budgets in many organizations, content marketers will have to do more with less this coming year. You’ll need to carefully balance human creative resources with AI assists where they make sense to stay flexible, agile, and ready to respond to the market.

It’s time to look at your body of content as a whole

Google’s Helpful Content update, which rolled out in August, is a sitewide signal targeting a high proportion of thin, unhelpful, low-quality content. That means the exceptional content on your site won’t rank to its greatest potential if it’s lost in a sea of mediocre, outdated assets.

It might be time for a content reboot – but don’t get carried away. Before you start unpublishing and redirecting blog posts, lean on technology for automated site auditing and see what you can fix up first. AI-assisted technology can help sniff out on-page elements, including page titles and H1 tags, and off-page factors like page speed, redirects, and 404 errors that can support your content refreshing strategy.

Focus on your highest-trafficked and most visible pages first, i.e., those linked from the homepage or main menu. Google’s John Mueller confirmed recently that if the important pages on your website are low quality, it’s bad news for the entire site. There’s no percentage by which this is measured, he said, urging content marketers and SEOs to instead think of what the average user would think when they visit your website.

Take advantage of location-based content optimization opportunities

Consumers crave personalized experiences, and location is your low-hanging fruit. Seasonal weather trends, local events, and holidays all impact your search traffic in various ways and present opportunities for location-based optimization.

AI-assisted technology can help you discover these opportunities and evaluate topical keywords at scale so you can plan content campaigns and promotions that tap into this increased demand when it’s happening.

Make the best possible use of content created for locally relevant campaigns by repurposing and promoting it across your website, local landing pages, social media profiles, and Google Business Profiles for each location. Google Posts, for example, are a fantastic and underutilized tool for enhancing your content’s visibility and interactivity right on the search results page.

Optimize content with conversational & high-volume keywords

Look for conversational and trending terms in your keyword research, too. Top-of-funnel keywords that help generate awareness of the topic and spur conversations in social channels offer great opportunities for promotion. Use hashtags organically and target them in paid content promotion campaigns to dramatically expand your audience.

Conversational keywords are a good opportunity for enhancing that content’s visibility in search, too. Check out the ‘People Also Ask’ results and other featured snippets available on the search results page (SERP) for your keyword terms. Incorporate questions and answers in your content to naturally optimize for these and voice search queries.

Image: SEO and creating content in 2023.

It’s important that you utilize SEO insights and real-time data correctly; you don’t want to be targeting what was trending last month and is already over. AI is a great assist here, as well, as an intelligent tool can be scanning and analyzing constantly, sending recommendations for new content opportunities as they arise.

Consider how you optimize content based on intent and experience

The best content comes from a deep, meaningful understanding of the searcher’s intent. What problem were they experiencing or what need did they have that caused them to seek out your content in the first place? And how does your blog post, ebook, or landing page copy enhance their experience?

Look at the search results page as a doorway to your “home”. How’s your curb appeal? What do potential customers see when they encounter one of your pages in search results? What kind of experience do you offer when they step over the threshold and click through to your website?

The best content meets visitors where they are at with relevant, high-quality information presented in a way that is accessible, fast loading, and easy to digest. This is the case for both short and long form SEO content. Ensure your content contains calls to action designed to give people options and help them discover the next step in their journey versus attempting to sell them on something they may not be ready for yet.


Conclusion

The audience is king, queen, and the entire court as we head into 2023. SEO and content marketing give you countless opportunities to connect with these people but remember they are a means to an end. Keep searcher intent and audience needs at the heart of every piece of content you create and campaign you plan for the coming year.
