
Technical SEO Audits: Tips For Successful Implementation


According to BrightEdge, 68 percent of all online experiences start with a search engine, and 53 percent of website traffic comes from organic search.

Yet, only 0.78 percent of these searchers click results on the second page of Google.

That means if you’re not showing up on the front page of the SERPs, you aren’t getting traffic.

It’s clear why SEO is a top priority for marketers. In fact, 61 percent say it’s their main focus when it comes to inbound marketing.

If you can get your client’s website to the top of the search results, you’ll have a much easier time improving their ROI.

Of course, mastering SEO is no easy task.

There are so many things to consider, from on-page SEO to off-page SEO, copy, content, and even technical SEO.

A truly effective SEO strategy includes all of these elements, plus regular optimization to ensure consistent returns.

This post dives deep into technical SEO and how you can use it to increase organic traffic, show up in search, and improve your overall site experience.

What Is Technical SEO?

Technical SEO is the area of SEO covering optimizations that improve search engine rankings by making your site easier for search engines to crawl, such as improving site load time, checking robots.txt files, and making redirects work properly.

Essentially, it’s the process of ensuring your website can be seen, crawled, and ranked by search engines.

Search engines, such as Google, give preference to websites that meet their webmaster guidelines. The basic principles state your website content should be accurate, easy to access, and user-friendly.

If your website loads slowly, has an unresponsive design, or lacks a secure connection, your content will not meet these guidelines.

This is where technical SEO comes in, as it can help you improve the technical characteristics of your website to improve organic traffic.

Why Is Technical SEO Important?

Imagine you wrote the most amazing content in the world. It’s content that everyone should read.

People would pay buckets of money just to read it. Millions are eagerly waiting for the notification that you’ve posted this new, incredible content.

Then, the day finally comes, and the notification goes out. Customers excitedly click the link to read your article.

Then, it takes over 10 seconds for your web page to load. Readers are annoyed and they don’t want to wait.

For every second that it takes for your web page to load, you’re losing readers and increasing your bounce rate.

It doesn’t matter how great that piece of content is—your site isn’t functioning well, and you’re losing precious traffic.

That’s just one example of why technical SEO is so critical.

Without it, Google and other search engines are incapable of finding, crawling, and indexing your site.

If search engines can’t access your site, you can’t rank and you become one of the 90.63 percent of websites that get no organic search traffic from Google. Yikes.

Even if your site can be found, user experience issues, like page load times and confusing navigation, can still negatively impact SEO.

Other issues like mobile optimizations, duplicate content, and site security can cause search engines to rank your site lower.

Elements of Technical SEO

While crawling and indexing are important factors in SEO, there are many more aspects to consider when performing a technical SEO audit. These include:

  • mobile optimization
  • page load speed
  • link health
  • duplicate content
  • schemas
  • crawl errors
  • image issues
  • site security
  • URL structure
  • 404 pages
  • 301 redirects
  • canonical tags
  • XML sitemaps
  • site architecture

At a minimum, a technically sound website should be secure, quick to load, easy to crawl, have clear and actionable navigation, and not contain any duplicate links or content.

It should also have systems in place to engage users even if they do hit a dead end, such as a helpful custom 404 page and 301 redirects for moved content.

Finally, a site should have structured data to help search engines understand the content. This can come in the form of schema graphs and XML sitemaps.
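
To make that concrete, here is a minimal, hypothetical JSON-LD structured data snippet you could place in a page’s <head>; the organization name and URLs are placeholders, not taken from any real site:

<!-- Hypothetical example: JSON-LD Organization markup; name and URLs are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png"
}
</script>

You can validate markup like this with Google’s Rich Results Test before rolling it out site-wide.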

When conducting a technical SEO audit, be wary of over-optimizing your website. Too many improvements can work against your best intentions and actually damage your SEO rankings.

What Is an SEO Audit?

An SEO audit is the process of evaluating your website to see how well it is performing on search engines.

SEO audits are a great way to create actionable plans to outperform your competitors, identify opportunities within your website, find and fix exit points, and create better customer experiences.

You should perform technical SEO audits, on-page SEO audits, and off-page SEO audits regularly.

As you go through your audit, you’ll find places where you can optimize your website to improve performance and keep site visitors happy.

You may not be able to fix every error at once, but you can figure out what’s going wrong and make a plan to fix it.

What Are the Key Elements of a Technical SEO Audit?

There are three key factors to look at during an SEO audit:

  • back-end factors, such as hosting and indexing
  • front-end factors, such as content, keywords, and metadata
  • link quality and outside references

Sometimes, you won’t have the time to address every pain point. So, when deciding which audit insights are worth acting on, use the 80/20 rule: focus on the 20 percent of fixes likely to deliver 80 percent of the improvement.

The most important part of your site’s SEO is the part that your incoming traffic actually sees.

That’s all washed away if your site isn’t mobile-friendly, though.

With the introduction of the mobile-first index, you need to make sure you understand how your site performs on mobile to ensure proper placement on SERPs.

What does the mobile-first index mean?

Because 52.2 percent of global web traffic comes through mobile, Google has adjusted its algorithm to crawl the mobile version of websites first.

It boils down to this—if your site doesn’t perform well on mobile devices, you are not just losing traffic; your site also looks bad to Google. That can result in lower rankings, and even less traffic.
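
One basic check: a responsive, mobile-friendly page declares a viewport in its <head> so it scales properly on phones. This is a standard snippet rather than anything specific to the tools mentioned here; if it’s missing, mobile rendering is almost certainly broken:

<meta name="viewport" content="width=device-width, initial-scale=1">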

How to Perform a Technical SEO Audit

SEO guidelines are constantly changing. Every time a major search engine significantly updates its algorithm, SEO has to adapt.

The good news is the frequency of changes in technical SEO tends to be lower.

After all, it’s not like search engines or readers will suddenly decide they’re okay with slower speeds.

If anything, you will see the average acceptable speed continue to drop. Your site simply has to be faster if you want to keep up with SEO demands.

Your website has to be mobile-friendly. This is only going to become more important over time, too.

It has to be free of errors, duplicate content, and broken or poorly optimized images.

Search engines also have to be able to crawl it successfully.

These things are all crucial to your success with both search engines and site visitors. If you want to prioritize your SEO efforts, make sure you tackle the technical aspects first.

1. Crawl Your Website

The most important part of the SEO audit is the crawl.

Before you do anything else, start a crawl of your website. You can use Ubersuggest to make it a simple process. Here’s how you do it:

  • Step 1: Enter your URL and click “Search.”
  • Step 2: Click “Site Audit” in the left sidebar.
  • Step 3: Run the scan and review the audit overview once the crawl completes.

Crawling is useful for identifying problems such as duplicate content, low word count, unlinked pagination pages, and excess redirects. Ubersuggest will even rank issues in order of importance, so you can focus on what matters most.

If you find anything here, click on it for more information and advice on how to fix it. For example, our website has 32 pages with a low word count.

You can then review these pages to determine if you need to add more content.

What does this all mean?

In short, it gives you a glimpse into how the Googlebot is crawling your site.

If you don’t use Ubersuggest for your technical SEO audit, you can also search your site manually. We’ll explain that below.

2. Perform a Manual Google Search

A few Google searches can tell you approximately how well your website is ranking. This will help you figure out where to start your technical SEO audit.

How many of your pages appear in relevant search results?

Does your site appear first when you search for it by name?

Overall, where does your site appear in the results?

To figure out which pages are actually being crawled, you can use a “site:rootdomain” search to see what shows up.

Here’s what this looks like in action: a search for site:yourdomain.com returns only the pages from that domain that Google has indexed.

Missing pages don’t automatically mean that your site is un-crawlable, but it’s useful to understand what’s happening behind the scenes.

Your website doesn’t need to be at the very top of these searches, either. The site: search simply limits results to pages on your own domain.

3. Make Sure Only One Version of Your Site Is Browseable 

If your website has multiple “versions” of itself, you send search engines a mixed message about how to crawl your site.

Basically, the crawlers don’t know which one is the right one.

If search engines don’t even know how to show your site to prospective traffic, your site’s SEO ranking will be negatively impacted.

This could be a mobile and desktop version warring with each other, or a duplicate “https” version and a non-”https” version.

The impact of HTTP vs. HTTPS on a site’s SEO is debated in the SEO community. Some sites using AdSense saw a decrease in revenue after making the switch to HTTPS.

For example, Crunchify’s revenue decreased 10 percent after switching to an HTTPS site.

However, it seems that websites without SSL protection will increasingly be deprioritized in Google search moving forward.

Google is even taking steps to make it more obvious which sites have SSL protection and which do not: Chrome marks non-HTTPS pages as “Not secure.”

With this change from Google, it seems you will need to make sure that your website only uses “https.”
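
In practice, that usually means 301-redirecting the other versions to your preferred “https” URLs and reinforcing the choice with a canonical tag on every page. A minimal sketch, assuming a placeholder domain:

<!-- Placed in the <head>; example.com stands in for your preferred https URL -->
<link rel="canonical" href="https://www.example.com/sample-page/">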

4. Conduct On-Page Technical SEO Checks

When evaluating your site and the results from your crawl, there are tons of things to check. Don’t get overwhelmed! Start by looking for duplicate pages, headers, and title tags.

If you’ve published a lot of content with similar themes, as I have, some seemingly unrelated content will show up in your crawl.

That’s okay. You’re looking for duplicates of the same content.

You can use a tool like Copyscape to assess potential technical SEO problems arising from duplicate content.

From there, closely examine a few key criteria that Google evaluates in their rankings.

Page Titles and Title Tags

A title tag is an HTML element that tells search engines the title of a page. This information is displayed on SERPs.

In the SERPs, it appears as the blue, clickable headline of your listing.

You’ll want to make sure these are relevant to the content on your page. The content should also answer the questions your users are asking as fast as possible.

The optimal length for title tags is between 56 and 60 characters. You can use a pixel-width checker to make sure your title isn’t truncated.
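
For reference, the title tag is a single element in the page’s <head>. A hypothetical example, with placeholder wording and brand:

<title>Technical SEO Audit Checklist | Example Brand</title>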

Meta Descriptions

Meta descriptions don’t directly impact rankings, but they are still incredibly important because they’re among the first things a user sees in the SERPs.

Meta descriptions should be compelling, engaging, and give a taste of what the user will find on the page.

Google recently expanded the limit for descriptions from 160 to 320 characters, which provides even more real estate to draw in a click.
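
The meta description is also just a <head> element. A hypothetical example, with placeholder copy:

<meta name="description" content="Learn how to run a technical SEO audit, from crawling your site to fixing broken links, slow pages, and duplicate content.">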

Clear Hierarchy

You’ll want to make sure your content is organized, with a clear hierarchy on the page.

This makes it easy for Google to analyze your site and index it for search.

Essentially, you want to make sure the placement of pages makes sense—all your service pages should be under your “services” tab, for example. Make sure users don’t have to click through four levels of pages to find best-selling products. The goal is to make it easy for Google—and users—to find the information or products they are looking for.

Keyword Placement

Every page on your site should have a focus keyword included in the first 100 words.

For example, in this post about social proof, it’s included twice in the first 100 words.

This helps Google understand what the post focuses on—but don’t stop there.

While keyword stuffing will penalize you, you should be strategic about keyword placement.

Where possible, include them in the following (a short sketch follows this list):

  • title
  • alt tags
  • URL
  • subheadings (h2, h3, etc.)
  • meta description
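
Here is a hedged sketch of how a focus keyword, say “technical SEO audit,” might appear in the URL, a subheading, and an image alt tag; all paths and names are hypothetical, and the title and meta description examples above follow the same idea:

<!-- Hypothetical page at https://www.example.com/technical-seo-audit/ -->
<h2>How to Run a Technical SEO Audit</h2>
<img src="/images/audit-report.png" alt="technical SEO audit report">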

Overall, on-page SEO checks are incredibly important, but they are only one part of your overarching technical SEO strategy. There are also other SEO checks to consider.

5. Manage Your Internal and External Links

Sites with logical hierarchies have improved SEO rankings. That’s why it’s important to check your internal and external links—to make sure visitors can navigate your site intuitively.

Pages might be deleted or moved, which can result in broken links and annoyed site visitors.

Don’t worry; you don’t have to do this manually.

Integrity and Xenu Sleuth can help you identify broken links on your site. (Note: Integrity only works on Mac.)

While both tools are straightforward to use, I’ll use Integrity as an example.

Once you download it, add your URL in the text bar at the top of the page and click “Go.”

Then the tool will begin testing all the links found on your site and provide you with the results.

In the top-left corner of the results, you’ll see a snapshot of your links and how many are bad.

Depending on the size of your site and how many links you have, you might consider viewing the results by link, page, status, or flat view to understand the results.

You’ll want to fix or remove any links marked in red with the “404 not found” label. These dead ends can negatively impact your technical SEO.

Google does weigh internal and external links differently, although both have their purpose in improving your SEO.

6. Check Your Site Speed

People are impatient. Google knows this.

Your customers don’t want to wait around. The longer your page takes to load, the higher the chance your customer will bounce.

You need to check your site speed, and Ubersuggest can help. Here’s how to get started:

  • Step 1: Enter your URL and click “Search.”
  • Step 2: Click “Site Audit” in the left sidebar.
  • Step 3: Scroll down to “Site Speed.”

Ubersuggest displays loading time for both desktop and mobile devices. In this case, my site scores in the “excellent” range for both.

In addition to loading time, it also tests:

  • First Contentful Paint
  • Speed Index
  • Time to Interactive
  • First Meaningful Paint
  • First CPU Idle
  • Est. Input Latency

Take action if your website scores less than excellent or good.

You might need to optimize your images, minify JavaScript, leverage browser caching, or more.

Ubersuggest will outline just what you need to do to improve site speed.
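
Some of those fixes come down to simple markup changes. As a hedged sketch (file names are placeholders): give images explicit dimensions, lazy-load anything below the fold, and defer non-critical JavaScript so it doesn’t block rendering:

<img src="/images/speed-report.png" width="800" height="450" loading="lazy" alt="site speed report">
<script src="/js/non-critical-widgets.js" defer></script>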

7. Leverage Your Analytics and Compare Site Metrics

This step determines whether your analytics service (e.g., Google Analytics or Kissmetrics) is reporting live metric data.

If it is, your code is installed correctly.

If not, your code is not installed correctly and needs to be fixed.

If you’re using Google Analytics, the tracking code should be placed in the <head> of each web page, ideally right after the opening <head> tag.
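
For reference, Google’s gtag.js snippet looks roughly like this; the G-XXXXXXX measurement ID is a placeholder you would replace with your own property’s ID:

<!-- Google Analytics (gtag.js); G-XXXXXXX is a placeholder ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXX');
</script>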

Once you have an analytics service up and running, compare the metric data to the results of your earlier “site:rootdomain” search.

The number of pages showing in your metric data should be roughly similar to the number of pages from the “site:rootdomain” search.

If not, certain pages either aren’t being crawled properly or aren’t firing your tracking code.

Check Your Bounce Rate

Google Analytics can be helpful when assessing your page’s bounce rate.

A high bounce rate means that people aren’t finding what they are looking for on your site. This means you might have to go back and make sure the content is optimized for your audience.

You can check your bounce rate by logging into your Google Analytics account and clicking on Audience > Overview.

Compare Metrics With the MozBar

In addition, you can use Moz’s MozBar to benchmark between pages.

The MozBar is a tool that provides various SEO details of any web page or search engine results page.

The toolbar adds an overlay to your browser and offers a number of features.

For example, MozBar can highlight different types of links on any page you view.

This is useful on its own, but it also lets you compare link metrics on or between pages.

It also comes with robust search tools to make your life easy.

With it, you can create custom searches by location, down to the city.

The MozBar also displays Page Authority, which scores each page from 1 to 100 based on how likely it is to rank well on search engine results pages.

When doing a technical SEO audit, tools like this help you quickly take the temperature of your site’s relationship with search engines.

The less guesswork you have to do, the better quality your SEO audit will be.

8. Check Your Off-Site SEO and Perform a Backlink Audit

Backlinks are critical for SEO success.

When reputable sites link to your pages, Google and other search engines take it as a signal that your content is relevant and that other users will find it useful.

Remember that hyperlinks are not the only thing crawlers look for in off-site SEO.

Your site is also crawled for brand mentions. This is why it’s crucial to pay attention to what’s happening both on and off your site.

Perform Your Backlink Audit

Use a tool such as Ubersuggest to perform a backlink audit and assess the kind of backlinks pointing to your site. Here’s how:

  • Step 1: Enter your URL and click “Search.”
  • Step 2: Click “Backlinks” in the left sidebar.
  • Step 3: Review the report.

Backlink audits are helpful because:

  • You can assess your current link profile and see how it is affecting your site.
  • You can identify areas where you can focus on getting more high-value links.
  • You can assess your competitors’ number of backlinks and work to outperform them.

Don’t just stop with your site’s backlink audit—you’ll also want to see what the competition is up to.

Analyze Competitor Keywords

Your competitors were busy upping their SEO capability while you were sleeping. Now, they rank higher for your most important search terms.

Ubersuggest can also help with this.

It allows you to see what keywords other sites are ranking for. It also shows what backlinks are going to those sites.

Basically, you want to explore your competitors’ backlinks and see how they compare to your own. Here’s how you do it:

  • Step 1: Enter your competitor’s URL and click “Search.”
  • Step 2: Click “Keywords” in the left sidebar.
  • Step 3: Review the results.

This provides a clear overview of what your competitor’s site is ranking for. In addition to a list of keywords, you can review:

  • Volume: the average number of monthly searches for the keyword
  • Position: where the URL ranks in Google search
  • Estimated Visits: estimated monthly traffic the webpage gets for the keyword
  • SD: estimated competition in organic search; the higher the number, the more competitive the term

Engage on Social Media

Social media is a conduit for consistent backlinks and engagement. You can use it to support your technical SEO efforts.

You want to figure out which additional social media platforms are frequented by your target audience.

Simply put, social media can improve your SEO by:

  • Increasing the number of your backlinks. Those who discover your content on social media might be more likely to link to it.
  • Increasing brand awareness, which can help with search queries including your brand’s name.

Social media is an opportunity to increase traffic and mentions beyond what people are searching for on a search engine.

Building social media visibility is also simpler than putting together a full link-building campaign.

Use the Facebook Sharing Debugger to see what your web content looks like when shared on Facebook.

This tool also allows you to check your Open Graph tags.
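
Open Graph tags are ordinary meta tags in the <head> that tell Facebook which title, description, and image to show when a page is shared. A minimal, hypothetical example with placeholder URLs:

<meta property="og:title" content="Technical SEO Audits: Tips For Successful Implementation">
<meta property="og:description" content="How to audit and improve the technical health of your site.">
<meta property="og:image" content="https://www.example.com/images/share-card.png">
<meta property="og:url" content="https://www.example.com/technical-seo-audit/">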

Technical SEO: Final Thoughts

There are three different aspects of SEO, and technical SEO is the most important of the three.

It won’t matter how amazing your on-page SEO is if you fail at technical SEO.

It also won’t matter how great you are at off-page SEO if you’re horrible at the technical stuff.

Don’t get overwhelmed by the idea of it being “technical” or complex. Start with the big, critical aspects discussed above and tackle them one problem at a time.

How have you found success with technical SEO on your site?



Sustaining A SaaS Brand & Organic Channel During A Recession


During an economic recession, marketing budgets and ROAS typically come under much more scrutiny.

There are plenty of reasons why you should not cut your SEO spending during a recession.

The next question will be about ROI and what you can do to mitigate the oncoming issues.

During an economic downturn, the objective of reducing churn becomes even more pressing. Your sales pipelines may see less activity, and the C-suite may focus more on MRR (monthly recurring revenue) and ARR (annual recurring revenue).

In this article, I will look at subscription-model-based businesses and some methods and strategies that can pivot their SEO efforts toward maintaining performance and SEO ROI (return on investment).

Understanding Why Accounts Cancel

Customers cancel their subscriptions for myriad reasons, but during an economic downturn, reasons tend to gravitate toward costs and perceived value.

Other reasons include not receiving enough value from the subscription, difficulty canceling their subscription, or feeling that customer support is unresponsive or unhelpful.

You can identify these issues before customers raise them in an exit survey. Create opportunities for conversations and feedback loops with the sales and customer service teams. This lets you address customers’ concerns before they cancel.

Targeting Disengagement & Value Shortfalls

To demonstrate value, we can pivot our content and messaging to highlight opportunity costs and show how the upfront cost prevents a more significant shortfall in the long run.

Encountering usage friction with the software is an identifiable problem.

Within the organization, teams should be able to provide you access to DAU (daily active user) and MAU (monthly active user) data.

Companies often boast about having high numbers of each, but the data can also be used to identify accounts with below-average or sparse login frequency, which can then be collated and contacted.

  • Put accounts on low and mid-tier subscriptions into an email gauntlet and reach out. Offer a consultation with an accounts person. You could also ask them to fill out a feedback form to identify pain points to help build a content strategy.
  • Reach out to accounts on high-tier subscriptions with existing account managers.

Addressing customer issues could be as simple as rewording elements of commercial product pages, adding additional sections, or reinforcing the value proposition with case studies.

You can also address these issues with traditional blog content. Add more support articles to your support center and build out existing ones with media such as video to address common friction points.

Developing Content Against Competitor Value Pitfalls

Price is likely the most challenging churn reason to predict and manage. Price is informed and dictated by other business needs and costs. While it might make sense to offer deals to high-value accounts, reducing prices across the board likely isn’t an option.

Price and cost are relative to the value your solution provides, so demonstrating your benefits can help customers justify the expenditure.

Any solution’s cost must, at minimum, balance out the problem or provide additional value.

This is known as a cost-benefit analysis. A vital part of a cost-benefit analysis is comparing the costs of the solution versus the benefits and determining a net present value.

During this assessment, your messaging can leverage and demonstrate additional benefits, or benefit enhancements, against your competitors.

In SaaS, you could break this down as comparisons between both product elements and overall “package” elements:

  • Direct product features and performance of those features.
  • Indirect product features and “add ons” that supplement the core product.
  • The bandwidth of the solution on a monthly or annual basis.
  • The number of user seats/sub-accounts per main account.
  • Speed of customer support response (and level of customer support).

A typical approach to highlighting competitor pitfalls is with comparison tables and our-brand-v-competitor-brand URLs and blogs.

These pages will then compete for clicks with your competitors’ own versions, as well as with independent websites, affiliates, and other reviews, to sway consumer opinion.

You must also explain these benefits and competitive advantages on the product pages themselves.

Bullet listing the product features is commonplace. But make sure the benefits are explained directly against your competitors. This can help these competitive advantages better resonate with your target audience.

Reinforcing Brand Solution Compounds

A brand compound search term is a query made up of two or more words that references a specific brand.

For example, the brand compound search term “Decathlon waterproofs” would highlight users wanting to find waterproofs specifically from the brand Decathlon.

Users performing searches like this also reaffirm the connection between topics and brands, helping Google further understand relationships and relevancy.

To optimize brand compound search terms, you need to understand the concept of semantic marketing. This means knowing how different words, phrases, and ideas relate in terms of meaning.

You should research how your target audience searches for information related to your product or service and use those search terms in your content.

Another strategy you can use is to add modifiers to your search terms.

These can be words like “best,” “how,” or any other qualifier that will make the search more specific. This will help you get more targeted traffic that will likely convert better than generic search terms.

Summary

While these are uncertain times and competition for users and recurring revenue is becoming fiercer, pivoting your SEO and content strategy to focus on value propositions and consumer friction points can help you better qualify leads and address the objection questions that consumers would otherwise take to competitors.

In this strategy, the keyword search volumes and other values might not be high. When you’re addressing user friction points and concerns, the value is qualitative, not quantitative.


Where Are The Advertisers Leaving Twitter Going For The Super Bowl?


Since Elon Musk’s takeover of Twitter on October 27, 2022, things at the social media company have gone from bad to worse.

You probably saw this coming from a mile away – especially if you had read about a study by Media Matters that was published on November 22, 2022, entitled, “In less than a month, Elon Musk has driven away half of Twitter’s top 100 advertisers.”

If you missed that, then you’ve probably read Matt G. Southern’s article in Search Engine Journal, which was entitled, “Twitter’s Revenue Down 40% As 500 Top Advertisers Pull Out.”

This mass exodus creates a challenge for digital advertising executives and their agencies. Where should they go long term?

And what should they do in the short term – with Super Bowl LVII coming up on Sunday, February 12, 2023?

Ideally, these advertisers would follow their audience. If they knew where Twitter users were going, their ad budgets could follow them.

But it isn’t clear where Twitter users are going – or if they’ve even left yet.

Fake Followers On Twitter And Brand Safety

According to the latest data from Similarweb, a digital intelligence platform, there were 6.9 billion monthly visits to Twitter worldwide during December 2022 – up slightly from 6.8 billion in November, and down slightly from 7.0 billion in October.

So, if a high-profile user like Boston Mayor Michelle Wu has taken a step back from the frequent posts on her Twitter account, @wutrain, which has more than 152,000 followers, then it appears that other users have stepped up their monthly visits.

This includes several accounts that had been banned previously for spreading disinformation, which Musk unbanned.

(Disinformation is defined as “deliberately misleading or biased information,” while misinformation may be spread without the sender having harmful intentions.)

It’s also worth noting that SparkToro, which provides audience research software, also has a free tool called Fake Follower Audit, which analyzes Twitter accounts.

This tool defines “fake followers” as ones that are unreachable and will not see the account’s tweets either because they’re spam, bots, and propaganda, or because they’re no longer active on Twitter.

On Jan. 24, 2023, I used this tool and found that 70.2% of the 126.5 million followers of the @elonmusk account were fake.

According to the tool, accounts with a similar-sized following to @elonmusk have a median of 41% fake followers. So, Elon Musk’s account has more fake followers than most.

Screenshot from SparkToro, January 2023

By comparison, 20.6% of the followers of the @wutrain account were fake. So, Michelle Wu’s account has fewer fake followers than accounts with a similar-sized following.

Screenshot from SparkToro, January 2023

In fact, most Twitter accounts have significant numbers of fake followers.

This underlines the brand safety concerns that many advertisers and media buyers have, but it doesn’t give them any guidance on where they should move their ad dollars.

Who Are Twitter’s Top Competitors And What Are Their Monthly Visits?

So, I asked Similarweb if they had more data that might help. And they sent me the monthly visits from desktop and mobile devices worldwide for Twitter and its top competitors:

  • YouTube.com: 34.6 billion in December 2022, down 2.8% from 35.6 billion in December 2021.
  • Facebook.com: 18.1 billion in December 2022, down 14.2% from 21.1 billion in December 2021.
  • Twitter.com: 6.9 billion in December 2022, up 1.5% from 6.8 billion in December 2021.
  • Instagram.com: 6.3 billion in December 2022, down 3.1% from 6.5 billion in December 2021.
  • TikTok.com: 1.9 billion in December 2022, up 26.7% from 1.5 billion in December 2021.
  • Reddit.com: 1.8 billion in December 2022, down 5.3% from 1.9 billion in December 2021.
  • LinkedIn.com: 1.5 billion in December 2022, up 7.1% from 1.4 billion in December 2021.
  • Pinterest.com: 1.0 billion in December 2022, up 11.1% from 0.9 billion in December 2021.

The most significant trends worth noting: monthly visits to TikTok are up 26.7% year over year from a smaller base, while monthly visits to Facebook are down 14.2% from a bigger base.

So, the short-term events at Twitter over the past 90 days may have taken the spotlight off the long-term trends at TikTok and Facebook over the past year for some industry observers.

But based on Southern’s article in Search Engine Journal, “Facebook Shifts Focus To Short-Form Video After Stock Plunge,” which was published on February 6, 2022, Facebook CEO Mark Zuckerberg is focused on these trends.

In a call with investors, Zuckerberg said back then:

“People have a lot of choices for how they want to spend their time, and apps like TikTok are growing very quickly. And this is why our focus on Reels is so important over the long term.”

Meanwhile, there were 91% more monthly visits to YouTube in December 2022 than there were to Facebook. And that only counts the visits that Similarweb tracks from mobile and desktop devices.

Similarweb doesn’t track visits from connected TVs (CTVs).

Measuring Data From Connected TVs (CTVs) And Co-Viewing

Why would I wish to draw your attention to CTVs?

First, global viewers watched a daily average of over 700 million hours of YouTube content on TV devices, according to YouTube internal data from January 2022.

And Insider Intelligence reported in 2022 that 36.4% of the U.S. share of average time spent per day with YouTube came from connected devices, including Apple TV, Google Chromecast, Roku, and Xfinity Flex, while 49.3% came from mobile devices, and 14.3% came from desktops or laptops.

Second, when people watch YouTube on a connected TV, they often watch it together with their friends, family, and colleagues – just like they did at Super Bowl parties before the pandemic.

There’s even a term for this behavior: Co-viewing.

And advertisers can now measure their total YouTube CTV audience using real-time and census-level surveys in over 100 countries and 70 languages.

This means Heineken and Marvel Studios can measure the co-viewing of their Super Bowl ad in more than 100 markets around the globe where Heineken 0.0 non-alcoholic beer is sold, and/or 26 countries where “Ant-Man and The Wasp: Quantumania” is scheduled to be released three to five days after the Big Game.

It also enables Apple Music to measure the co-viewing of their Super Bowl LVII Halftime Show during Big Game parties worldwide (except Mainland China, Iran, North Korea, and Turkmenistan, where access to YouTube is currently blocked).

And, if FanDuel has already migrated to Google Analytics 4 (GA4), then the innovative sports-tech entertainment company can not only measure the co-viewing of their Big Game teasers on YouTube AdBlitz in 16 states where sports betting is legal, but also measure engaged-view conversions (EVCs) from YouTube within 3 days of viewing Rob Gronkowski’s attempt to kick a live field goal.

 

Advertisers couldn’t do that in 2022. But they could in a couple of weeks.

If advertisers want to follow their audience, then they should be moving some of their ad budgets out of Facebook, testing new tactics, and experimenting with new initiatives on YouTube in 2023.

Where should the advertisers leaving Twitter shift their budgets long term? And how will that change their Super Bowl strategies in the short term?

According to Similarweb, monthly visits to ads.twitter.com, the platform’s ad-buying portal, dropped 15% worldwide from 2.5 million in December 2021 to 2.1 million in December 2022.

So, advertisers were heading for the exit weeks before they learned that 500 top advertisers had left the platform.

Where Did Their Ad Budgets Go?

Well, it’s hard to track YouTube advertising, which is buried in Google’s sprawling ad business.

And we can’t use business.facebook.com as a proxy for interest in advertising on that platform because it’s used by businesses for other purposes, such as managing organic content on their Facebook pages.

But monthly visits to ads.snapchat.com, that platform’s ad-buying portal, jumped 88.3% from 1.6 million in December 2021 to 3.0 million in December 2022.

Monthly visits to ads.tiktok.com are up 36.6% from 5.1 million in December 2021 to 7.0 million in December 2022.

Monthly visits to ads.pinterest.com are up 23.3% from 1.1 million in December 2021 to 1.4 million in December 2022.

And monthly visits to business.linkedin.com are up 14.6% from 5.7 million in December 2021 to 6.5 million in December 2022.

It appears that lots of advertisers are hedging their bets by spreading their money around.

Now, most of them should probably continue to move their ad budgets into Snapchat, TikTok, Pinterest, and LinkedIn – unless the “Chief Twit” can find a way to keep his microblogging service from becoming “a free-for-all hellscape, where anything can be said with no consequences!”

How will advertisers leaving Twitter change their Super Bowl plan this year?

To double-check my analysis, I interviewed Joaquim Salguerio, who is the Paid Media Director at LINK Agency. He’s managed media budgets of over eight figures at multiple advertising agencies.

Below are my questions and his answers.

Greg Jarboe: “Which brands feel that Twitter has broken their trust since Musk bought the platform?”

Joaquim Salguerio: “I would say that several brands will have different reasonings for this break of trust.

First, if you’re an automaker, there’s suddenly a very tight relationship between Twitter and one of your competitors.

Second, advertisers that are quite averse to taking risks with their communications because of brand safety concerns might feel that they still need to be addressed.

Most of all, in a year where we’re seeing mass layoffs from several corporations, the Twitter troubles have given marketing teams a reason to re-evaluate its effectiveness during a time of budget cuts. That would be a more important factor than trust for most brands.

Obviously, there are some famous cases, such as the Lou Paskalis case, but it’s difficult to pinpoint a brand list that would have trust as their only concern.”

GJ: “Do you think it will be hard for Twitter to regain their trust before this year’s Super Bowl?”

JS: “It’s highly unlikely that any brand that has lost trust in Twitter will change its mind in the near future, and definitely not in time for the Super Bowl. Most marketing plans for the event will be finalized by now and recent communications by Twitter leadership haven’t signaled any change in direction.

If anything, from industry comments within my own network, I can say that comments from Musk recently (“Ads are too frequent on Twitter and too big. Taking steps to address both in coming weeks.”) were quite badly received. For any marketers that believe Twitter advertising isn’t sufficiently effective, this pushes them further away.

Brand communications should still occur on Twitter during Super Bowl though – it will have a peak in usage. And advertising verticals that should dominate the advertising space on Twitter are not the ones crossing the platform off their plans.”

GJ: “How do you think advertisers will change their Super Bowl plans around Twitter this year?”

JS: “The main change for advertising plans will likely be for brand comms amplification. As an example, the betting industry will likely be heavily present on Twitter during the game, and I would expect little to no change in plans.

In the FMCG category, though, time sensitivity won’t be as important, which means that social media teams will likely be making an attempt at virality without relying as much on paid dollars.

If budgets are to diverge, they will likely be moved within the social space and toward platforms that will have user discussion/engagement from the Super Bowl (TikTok, Reddit, etc.)”

GJ: “What trends will we see in advertising budget allocation for this year’s Super Bowl?”

JS: “We should see budget planning much in line with previous years in all honesty. TV is still the most important media channel on Super Bowl day.

Digital spend will likely go towards social platforms, we predict a growth in TikTok and Reddit advertising around the big day for most brands.

Twitter should still have a strong advertising budget allocated to the platform by the verticals aiming to get actions from users during the game (food delivery/betting/etc.).”

GJ: “Which platforms will benefit from this shift?”

JS: “Likely, we will see TikTok as the biggest winner from a shift in advertising dollars, as the growth numbers are making it harder to ignore the platform as a placement that needs to be in the plan.

Reddit can also capture some of this budget as it has the right characteristics marketers are looking for around the Super Bowl – it’s relevant to what’s happening at the moment and similar demographics.”

GJ: “Do you think advertisers that step away from Twitter for this year’s Big Game will stay away long term?”

JS: “That is impossible to know, as it’s completely dependent on how the platform evolves and the advertising solutions it will provide. Twitter’s proposition was always centered around brand marketing (their performance offering was always known to be sub-par).

Unless brand safety concerns are addressed by brands that decided to step away, it’s hard to foresee a change.

I would say that overall, Super Bowl ad spend on Twitter should not be as affected as it’s been portrayed – it makes sense to reach audiences where audiences are.

Especially if you know the mindset. The bigger issue is what happens when there isn’t a Super Bowl or a World Cup.”


Is ChatGPT Use Of Web Content Fair?


Large Language Models (LLMs) like ChatGPT train on multiple sources of information, including web content. That data forms the basis of summaries and articles produced without attribution or benefit to those who published the original content used for training.

Search engines download website content (called crawling and indexing) to provide answers in the form of links to the websites.

Website publishers have the ability to opt-out of having their content crawled and indexed by search engines through the Robots Exclusion Protocol, commonly referred to as Robots.txt.

The Robots Exclusion Protocol is not an official Internet standard, but it is one that legitimate web crawlers obey.

Should web publishers be able to use the Robots.txt protocol to prevent large language models from using their website content?

Large Language Models Use Website Content Without Attribution

Some who are involved with search marketing are uncomfortable with how website data is used to train machines without giving anything back, like an acknowledgement or traffic.

Hans Petter Blindheim (LinkedIn profile), Senior Expert at Curamando, shared his opinions with me.

Hans commented:

“When an author writes something after having learned something from an article on your site, they will more often than not link to your original work because it offers credibility and as a professional courtesy.

It’s called a citation.

But the scale at which ChatGPT assimilates content and does not grant anything back differentiates it from both Google and people.

A website is generally created with a business directive in mind.

Google helps people find the content, providing traffic, which has a mutual benefit to it.

But it’s not like large language models asked your permission to use your content, they just use it in a broader sense than what was expected when your content was published.

And if the AI language models do not offer value in return – why should publishers allow them to crawl and use the content?

Does their use of your content meet the standards of fair use?

When ChatGPT and Google’s own ML/AI models trains on your content without permission, spins what it learns there and uses that while keeping people away from your websites – shouldn’t the industry and also lawmakers try to take back control over the Internet by forcing them to transition to an “opt-in” model?”

The concerns that Hans expresses are reasonable.

In light of how fast technology is evolving, should laws concerning fair use be reconsidered and updated?

I asked John Rizvi, a Registered Patent Attorney (LinkedIn profile) who is board certified in Intellectual Property Law, if Internet copyright laws are outdated.

John answered:

“Yes, without a doubt.

One major bone of contention in cases like this is the fact that the law inevitably evolves far more slowly than technology does.

In the 1800s, this maybe didn’t matter so much because advances were relatively slow and so legal machinery was more or less tooled to match.

Today, however, runaway technological advances have far outstripped the ability of the law to keep up.

There are simply too many advances and too many moving parts for the law to keep up.

As it is currently constituted and administered, largely by people who are hardly experts in the areas of technology we’re discussing here, the law is poorly equipped or structured to keep pace with technology…and we must consider that this isn’t an entirely bad thing.

So, in one regard, yes, Intellectual Property law does need to evolve if it even purports, let alone hopes, to keep pace with technological advances.

The primary problem is striking a balance between keeping up with the ways various forms of tech can be used while holding back from blatant overreach or outright censorship for political gain cloaked in benevolent intentions.

The law also has to take care not to legislate against possible uses of tech so broadly as to strangle any potential benefit that may derive from them.

You could easily run afoul of the First Amendment and any number of settled cases that circumscribe how, why, and to what degree intellectual property can be used and by whom.

And attempting to envision every conceivable usage of technology years or decades before the framework exists to make it viable or even possible would be an exceedingly dangerous fool’s errand.

In situations like this, the law really cannot help but be reactive to how technology is used…not necessarily how it was intended.

That’s not likely to change anytime soon, unless we hit a massive and unanticipated tech plateau that allows the law time to catch up to current events.”

So it appears that the issue of copyright laws has many considerations to balance when it comes to how AI is trained, there is no simple answer.

OpenAI and Microsoft Sued

An interesting case that was recently filed is one in which OpenAI and Microsoft used open source code to create their CoPilot product.

The problem with using open source code is that the Creative Commons license requires attribution.

According to an article published in a scholarly journal:

“Plaintiffs allege that OpenAI and GitHub assembled and distributed a commercial product called Copilot to create generative code using publicly accessible code originally made available under various “open source”-style licenses, many of which include an attribution requirement.

As GitHub states, ‘…[t]rained on billions of lines of code, GitHub Copilot turns natural language prompts into coding suggestions across dozens of languages.’

The resulting product allegedly omitted any credit to the original creators.”

The author of that article, who is a legal expert on the subject of copyrights, wrote that many view open source Creative Commons licenses as a “free-for-all.”

Some may also consider the phrase free-for-all a fair description of how datasets comprised of Internet content are scraped and used to generate AI products like ChatGPT.

Background on LLMs and Datasets

Large language models train on multiple data sets of content. Datasets can consist of emails, books, government data, Wikipedia articles, and even datasets created of websites linked from posts on Reddit that have at least three upvotes.

Many of the datasets related to the content of the Internet have their origins in the crawl created by a non-profit organization called Common Crawl.

Their dataset, the Common Crawl dataset, is available free for download and use.

The Common Crawl dataset is the starting point for many other datasets that are created from it.

For example, GPT-3 used a filtered version of Common Crawl (Language Models are Few-Shot Learners PDF).

This is how  GPT-3 researchers used the website data contained within the Common Crawl dataset:

“Datasets for language models have rapidly expanded, culminating in the Common Crawl dataset… constituting nearly a trillion words.

This size of dataset is sufficient to train our largest models without ever updating on the same sequence twice.

However, we have found that unfiltered or lightly filtered versions of Common Crawl tend to have lower quality than more curated datasets.

Therefore, we took 3 steps to improve the average quality of our datasets:

(1) we downloaded and filtered a version of CommonCrawl based on similarity to a range of high-quality reference corpora,

(2) we performed fuzzy deduplication at the document level, within and across datasets, to prevent redundancy and preserve the integrity of our held-out validation set as an accurate measure of overfitting, and

(3) we also added known high-quality reference corpora to the training mix to augment CommonCrawl and increase its diversity.”

Google’s C4 dataset (Colossal, Cleaned Crawl Corpus), which was used to create the Text-to-Text Transfer Transformer (T5), has its roots in the Common Crawl dataset, too.

Their research paper (Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer PDF) explains:

“Before presenting the results from our large-scale empirical study, we review the necessary background topics required to understand our results, including the Transformer model architecture and the downstream tasks we evaluate on.

We also introduce our approach for treating every problem as a text-to-text task and describe our “Colossal Clean Crawled Corpus” (C4), the Common Crawl-based data set we created as a source of unlabeled text data.

We refer to our model and framework as the ‘Text-to-Text Transfer Transformer’ (T5).”

Google published an article on their AI blog that further explains how Common Crawl data (which contains content scraped from the Internet) was used to create C4.

They wrote:

“An important ingredient for transfer learning is the unlabeled dataset used for pre-training.

To accurately measure the effect of scaling up the amount of pre-training, one needs a dataset that is not only high quality and diverse, but also massive.

Existing pre-training datasets don’t meet all three of these criteria — for example, text from Wikipedia is high quality, but uniform in style and relatively small for our purposes, while the Common Crawl web scrapes are enormous and highly diverse, but fairly low quality.

To satisfy these requirements, we developed the Colossal Clean Crawled Corpus (C4), a cleaned version of Common Crawl that is two orders of magnitude larger than Wikipedia.

Our cleaning process involved deduplication, discarding incomplete sentences, and removing offensive or noisy content.

This filtering led to better results on downstream tasks, while the additional size allowed the model size to increase without overfitting during pre-training.”

Google, OpenAI, even Oracle’s Open Data are using Internet content, your content, to create datasets that are then used to create AI applications like ChatGPT.

Common Crawl Can Be Blocked

It is possible to block Common Crawl and subsequently opt-out of all the datasets that are based on Common Crawl.

But if the site has already been crawled, then the website data is already in the datasets. There is no way to remove your content from the Common Crawl dataset or from any of the derivative datasets built on it, such as C4.

Using the Robots.txt protocol will only block future crawls by Common Crawl; it won’t stop researchers from using content that is already in the dataset.

How to Block Common Crawl From Your Data

Blocking Common Crawl is possible through the Robots.txt protocol, within the limitations discussed above.

The Common Crawl bot is called CCBot.

It identifies itself with the most up-to-date user-agent string: CCBot/2.0

Blocking CCBot with Robots.txt works the same way as blocking any other bot.

Here is the code for blocking CCBot with Robots.txt.

User-agent: CCBot
Disallow: /

CCBot crawls from Amazon AWS IP addresses.

CCBot also follows the nofollow Robots meta tag:

<meta name="robots" content="nofollow">

What If You’re Not Blocking Common Crawl?

Web content can be downloaded without permission; that is how browsers work: they download content.

Neither Google nor anyone else needs permission to download and use content that is published publicly.

Website Publishers Have Limited Options

Whether it is ethical to train AI on web content doesn’t seem to be part of any conversation about the ethics of how AI technology is developed.

It seems to be taken for granted that Internet content can be downloaded, summarized and transformed into a product called ChatGPT.

Does that seem fair? The answer is complicated.
