

Content Assets: Score for Long-Term Success


Updated January 11, 2022

Want a balanced and actionable way to know whether your content is doing what it’s supposed to do?

Create a content scorecard.

A content scorecard allows for normalized scoring based on benchmarks determined by the performance of similar content in your industry or your company’s content standards.

It marries both qualitative and quantitative assessments. Quantitative scores are based on performance metrics such as views, engagement, SEO rank, etc. Qualitative scores are derived from predetermined criteria, such as readability, accuracy, and voice consistency (more on that in a bit).



Let’s get to work creating a content scorecard template you can adapt for your situation.

Establish your quantitative success indicators

First, you must measure what matters. What is the job for that piece of content?

For example, an index or landing page is rarely designed to be the final destination. If a reader spends too long on that kind of page, it’s likely not a good sign. On the other hand, a long time spent on a detailed article or white paper is a positive reflection of user engagement. Be specific with your content goals when deciding what to measure.

What should you measure based on the content’s purpose? Here are some ideas:

  • Exposure – content views, impressions, backlinks
  • Engagement – time spent on page, clicks, rating, comments
  • Conversion – purchase, registration for gated content, return visits, click-throughs
  • Redistribution – shares, pins

After you’ve identified your quantitative criteria, you need to identify the benchmarks. What are you measuring against? Industry standards? Internal standards? A little of both?

A good starting point for researching general user behavior standards is the Nielsen Norman Group. If you want to focus on your industry, look at your industry marketing groups or even search for something like “web metrics for best user experience in [INDUSTRY].”



Below is a sample benchmark key. The left column identifies the metric, while the top row indicates the resulting score on a scale of 1 to 5. Each row lists the parameters for the metric to achieve the score in its column.

Sample Quantitative Content Score 1-5 *

Score | 1 | 2 | 3 | 4 | 5
Page Views/Section Total | <2% | 2–3% | 3–4% | 4–5% | >5%
Return Visitors | <20% | 20–30% | 30–40% | 40–50% | >50%
Trend in Page Views | Decrease of >50% | Decrease | Static | Increase | Increase of >50%
Page Views/Visit | <1.2 | 1.2–1.8 | 1.9–2.1 | 2.2–2.8 | >2.8
Time Spent/Page | <20 sec | 20–40 sec | 40–60 sec | 60–120 sec | >120 sec
Bounce Rate | >75% | 65–75% | 35–65% | 25–35% | <25%
Links | 0 | 1–5 | 5–10 | 10–15 | >15
SEO | <35% | 35–45% | 45–55% | 55–65% | >65%

*Values should be defined based on industry or company benchmarks.

Using a 1-to-5 scale makes it easier to analyze content that may have different goals and still identify the good, the bad, and the ugly. Your scorecard may look different depending on the benchmarks you select.
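If you keep the benchmark key in a script rather than only in a spreadsheet, the lookup can be automated. Below is a minimal Python sketch that encodes a few of the sample ranges above as assumed thresholds and converts a raw metric value into a 1-to-5 score. Swap in your own benchmarks, and note that inverted metrics such as bounce rate (where lower is better) would need their thresholds reversed.

    # Assumed benchmark key based on the sample table above; replace with your own values.
    # Each metric maps to four upper bounds: below the first bound scores 1,
    # below the second scores 2, and so on; anything above the last bound scores 5.
    BENCHMARKS = {
        "page_views_pct_of_section": [2, 3, 4, 5],       # percent of section total
        "return_visitors_pct":       [20, 30, 40, 50],   # percent
        "page_views_per_visit":      [1.2, 1.9, 2.2, 2.8],
        "time_on_page_sec":          [20, 40, 60, 120],  # seconds
        "backlinks":                 [1, 5, 10, 15],
    }

    def quantitative_score(metric: str, value: float) -> int:
        """Convert a raw metric value into a 1-to-5 score using the benchmark key."""
        for score, upper_bound in enumerate(BENCHMARKS[metric], start=1):
            if value < upper_bound:
                return score
        return 5

    print(quantitative_score("time_on_page_sec", 95))  # 60-120 sec -> 4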

How to document it

You will create two quantitative worksheets.

Label the first one as “Quantitative benchmarks.” Create a chart (similar to the one above) tailored to identify your key metrics and the ranges needed to achieve each score. Use this as your reference sheet.


Label a new worksheet as “Quantitative analysis.” Your first columns should be content URL, topic, and type. Label the next columns based on your quantitative metrics (i.e., page views, return visitors, trend in page views).

After adding the details for each piece of content, add the score for each one in the corresponding columns.

Remember, the 1-to-5 rating is based on the objective standards you documented on the quantitative reference worksheet.
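If you prefer to script the worksheet rather than build it by hand, here is a small illustrative sketch that writes a “Quantitative analysis” sheet as a CSV. The column names and the example row are hypothetical placeholders; the 1-to-5 values stand for the scores read off your benchmark reference sheet.

    # Illustrative "Quantitative analysis" worksheet written as a CSV.
    # Column names and the sample row are hypothetical placeholders.
    import csv

    columns = ["url", "topic", "type",
               "page_views_score", "return_visitors_score", "trend_score",
               "time_on_page_score", "bounce_rate_score", "backlinks_score"]

    rows = [
        # Each score is the 1-to-5 value taken from the quantitative reference worksheet.
        ["/blog/article-a", "Onboarding", "How-to", 3, 4, 3, 4, 2, 3],
    ]

    with open("quantitative_analysis.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(columns)
        writer.writerows(rows)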

Determine your qualitative analytics

It’s easy to look at your content’s metrics, shrug, and say, “Let’s get rid of everything that’s not getting eyeballs.” But if you do, you risk throwing out great content whose only fault may be it hasn’t been discovered. Scoring your content qualitatively (using a different five-point scale) helps you identify valuable pieces that might otherwise be buried in the long tail.

In this content scorecard process, a content strategist or someone equally qualified on your team/agency analyzes the content based on your objectives.

TIP: Have the same person review all the content to avoid any variance in qualitative scoring standards.


Here are some qualitative criteria we’ve used:

  • Consistency – Is the content consistent with the brand voice and style?
  • Clarity and accuracy – Is the content understandable, accurate, and current?
  • Discoverability – Does the layout of the information support key information flows?
  • Engagement – Does the content use the appropriate techniques to influence or engage visitors?
  • Relevance – Does the content meet the needs of all intended user types?

To standardize the assessment, use yes-no questions. One point is earned for every yes; no point is earned for a no. The category score is then determined by adding up the yes points, dividing the total by the number of questions in the category, and multiplying the result by five to put it on the same 1-to-5 scale.


The following illustrates how this would be done for the clarity and accuracy category as well as for discoverability; in this example, three of the four clarity and accuracy questions earned a yes.

Clarity and accuracy: Is the content understandable, accurate, and current?

  • Is the content understandable to all user types?
  • Does it use appropriate language?
  • Is content labeled clearly?
  • Do images, video, and audio meet technical standards so they are clear?

Score: 3/4 * 5 = 3.8

Discoverability: Does the layout of information on the page support key information flows? Is the user pathway to related answers and next steps clear and user-friendly?

Score: 1/5 * 5 = 1.0


TIP: Tailor the questions in the relevance category to the information the reviewer can access. For example, if the reviewer knows the audience, the question “Is it relevant to the interests of the viewers?” is valid. If the reviewer doesn’t know the audience, don’t ask it. But almost any reviewer can tell whether the content is current, so that would be a valid question to include.

How to document it

Create two qualitative worksheets.

Label the first worksheet “Qualitative questions.”

The first columns are the content URL, topic, and type. Then create a group of columns for each category and its questions. Add the average formula to the cell under each category label.

Let’s illustrate this using the example above:

After the content details, label the next column “Clarity and accuracy,” and add a column for each of the four corresponding questions.


Then go through each content piece and question, inputting a 1 for yes and a 0 for no.

To calculate the clarity and accuracy score for the first piece of content, enter a formula such as “=(B5+B6+B7+B8)/4*5” into the category cell; it averages the four yes/no answers and scales the result to the five-point range.

For simpler viewing, create a new worksheet labeled “Qualitative analysis.” Include only the content information accompanied by the category averages in each subsequent column.
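The same arithmetic is easy to express outside a spreadsheet. The sketch below uses hypothetical question labels and computes a category score as the share of yes answers scaled to the five-point range, matching the 3/4 × 5 = 3.8 example above.

    # Hypothetical yes/no answers for the clarity and accuracy category.
    clarity_answers = {
        "understandable_to_all_user_types": 1,
        "appropriate_language": 1,
        "clearly_labeled": 1,
        "media_meets_technical_standards": 0,
    }

    def category_score(answers: dict) -> float:
        """Share of yes answers, scaled to the 1-to-5 range used in the scorecard."""
        return round(sum(answers.values()) / len(answers) * 5, 1)

    print(category_score(clarity_answers))  # 3/4 * 5 -> 3.8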

Put it all together

With your quantitative and qualitative measurements determined, you now can create your scorecard spreadsheet.

Here’s what it would look like based on the earlier example (minus the specific content URLs).

Qualitative Scores

Qualitative criterion | Article A | Article B | Article C | Article D | Article E
Brand voice/style | 5 | 1 | 2 | 3 | 1
Accuracy/currency | 4 | 2 | 3 | 2 | 2
Discoverability | 3 | 3 | 3 | 3 | 3
Engagement | 4 | 2 | 4 | 2 | 2
Relevance | 3 | 3 | 5 | 3 | 3
Average Qualitative Score | 3.8 | 2.2 | 3.4 | 2.6 | 2.2

Quantitative Scores

Quantitative metric | Article A | Article B | Article C | Article D | Article E
Exposure | 3 | 1 | 3 | 3 | 3
Engagement | 2 | 2 | 2 | 2 | 2
Conversion | 1 | 3 | 3 | 1 | 3
Backlinks | 4 | 2 | 2 | 4 | 2
SEO % | 2 | 3 | 3 | 2 | 3
Average Quantitative Score | 2.4 | 2.2 | 2.6 | 2.4 | 2.6
Average Qualitative Score | 3.8 | 2.2 | 3.4 | 2.6 | 2.2
Recommended Action | Review and improve | Remove and avoid | Reconsider distribution plan | Reconsider distribution plan | Review and improve

On the scorecard, “average” rows have been added. Each is calculated by totaling an article’s scores across the categories and dividing by the number of categories.


Now you have a side-by-side comparison of each content URL’s average quantitative and qualitative scores. Here’s how to analyze the numbers and then optimize your content (a short code sketch of this decision logic follows the list):

  • Qualitative score higher than a quantitative score: Analyze your distribution plan. Consider alternative times, channels, or formats for this otherwise “good” content.
  • Quantitative score higher than a qualitative score: Review the content to identify ways to improve it. Could its quality be improved with a rewrite? What about the addition of data-backed research?
  • Low quantitative and qualitative scores: Remove this content from circulation and adapt your content plan to avoid this type of content in the future.
  • High quantitative and qualitative scores: Promote and reuse this content as much as feasible. Update your content plan to replicate this type of content in the future.
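Here is the sketch referenced above: a small Python function that applies these four outcomes. The 3.0 “high score” threshold is an assumption for illustration, not a rule from the scorecard itself; pick whatever cut-off matches your benchmarks.

    # Assumed decision logic for the four outcomes above; the 3.0 threshold is illustrative.
    def recommended_action(quant_avg: float, qual_avg: float, high: float = 3.0) -> str:
        if quant_avg >= high and qual_avg >= high:
            return "Promote and reuse"
        if quant_avg < high and qual_avg < high:
            return "Remove and avoid"
        if qual_avg > quant_avg:
            return "Reconsider distribution plan"  # good content, weak performance
        return "Review and improve"                # performing, but quality lags

    print(recommended_action(2.2, 2.2))  # both low -> "Remove and avoid"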

Of course, there are times when the discrepancy between quantitative and qualitative scores may indicate that the qualitative assessment is off. Use your judgment, but at least consider the alternatives.


Get going

When should you create a content scorecard? While it may seem like a daunting task, don’t let that stop you. Don’t wait until the next big migration. Take bite-size chunks and make it an ongoing process. Start now and optimize every quarter, then the process won’t feel quite so Herculean.

Selecting how much and what content should be evaluated depends largely on the variety of content types and the consistency of content within the same type. You need to select a sufficient number of content pieces to see patterns in topic, content type, traffic, etc.

Though there is no hard-and-fast science to sample size, in our experience 100 to 200 content assets have been sufficient. Your number will depend on:

  • Total inventory size
  • Consistency within a content type
  • Frequency of audits

Review in batches so you don’t get overwhelmed. Set evaluation cycles and look at batches quarterly, revising, retiring, or repurposing your content based on the audit results every time. And remember to select content across the performance spectrum. If you only focus on high-performing content, you won’t identify the hidden gems.



Cover image by Joseph Kalinowski/Content Marketing Institute







YouTube Ad Specs, Sizes, and Examples [2024 Update]


Introduction

With billions of users each month, YouTube is the world’s second largest search engine and top website for video content. This makes it a great place for advertising. To succeed, advertisers need to follow the correct YouTube ad specifications. These rules help your ad reach more viewers, increasing the chance of gaining new customers and boosting brand awareness.

Types of YouTube Ads

Video Ads

  • Description: These play before, during, or after a YouTube video on computers or mobile devices.
  • Types:
    • In-stream ads: Can be skippable or non-skippable.
    • Bumper ads: Non-skippable, short ads that play before, during, or after a video.

Display Ads

  • Description: These appear in different spots on YouTube and usually use text or static images.
  • Note: YouTube does not support display image ads directly on its app, but these can be targeted to YouTube.com through Google Display Network (GDN).

Companion Banners

  • Description: Appears to the right of the YouTube player on desktop.
  • Requirement: Must be purchased alongside In-stream ads, Bumper ads, or In-feed ads.

In-feed Ads

  • Description: Resemble videos with images, headlines, and text. They link to a public or unlisted YouTube video.

Outstream Ads

  • Description: Mobile-only video ads that play outside of YouTube, on websites and apps within the Google video partner network.

Masthead Ads

  • Description: Premium, high-visibility banner ads displayed at the top of the YouTube homepage for both desktop and mobile users.

YouTube Ad Specs by Type

Skippable In-stream Video Ads

  • Placement: Before, during, or after a YouTube video.
  • Resolution:
    • Horizontal: 1920 x 1080px
    • Vertical: 1080 x 1920px
    • Square: 1080 x 1080px
  • Aspect Ratio:
    • Horizontal: 16:9
    • Vertical: 9:16
    • Square: 1:1
  • Length:
    • Awareness: 15-20 seconds
    • Consideration: 2-3 minutes
    • Action: 15-20 seconds

Non-skippable In-stream Video Ads

  • Description: Must be watched completely before the main video.
  • Length: 15 seconds (or 20 seconds in certain markets).
  • Resolution:
    • Horizontal: 1920 x 1080px
    • Vertical: 1080 x 1920px
    • Square: 1080 x 1080px
  • Aspect Ratio:
    • Horizontal: 16:9
    • Vertical: 9:16
    • Square: 1:1

Bumper Ads

  • Length: Maximum 6 seconds.
  • File Format: MP4, Quicktime, AVI, ASF, Windows Media, or MPEG.
  • Resolution:
    • Horizontal (16:9): 640 x 360px
    • Standard (4:3): 480 x 360px

In-feed Ads

  • Description: Show alongside YouTube content, like search results or the Home feed.
  • Resolution:
    • Horizontal: 1920 x 1080px
    • Vertical: 1080 x 1920px
    • Square: 1080 x 1080px
  • Aspect Ratio:
    • Horizontal: 16:9
    • Square: 1:1
  • Length:
    • Awareness: 15-20 seconds
    • Consideration: 2-3 minutes
  • Headline/Description:
    • Headline: Up to 2 lines, 40 characters per line
    • Description: Up to 2 lines, 35 characters per line

Display Ads

  • Description: Static images or animated media that appear on YouTube next to video suggestions, in search results, or on the homepage.
  • Image Size: 300×60 pixels.
  • File Type: GIF, JPG, PNG.
  • File Size: Max 150KB.
  • Max Animation Length: 30 seconds.

Outstream Ads

  • Description: Mobile-only video ads that appear on websites and apps within the Google video partner network, not on YouTube itself.
  • Logo Specs:
    • Square: 1:1 (200 x 200px).
    • File Type: JPG, GIF, PNG.
    • Max Size: 200KB.

Masthead Ads

  • Description: High-visibility ads at the top of the YouTube homepage.
  • Resolution: 1920 x 1080 or higher.
  • File Type: JPG or PNG (without transparency).
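If you automate creative QA, specs like the ones above can be encoded as data and checked programmatically. The sketch below covers only two formats, with values taken from this article; treat it as an illustration and confirm against Google’s current documentation before relying on it.

    # Illustrative subset of the ad specs above; not an official reference.
    from math import gcd

    SPECS = {
        "skippable_in_stream": {"aspect_ratios": {(16, 9), (9, 16), (1, 1)},
                                "max_length_sec": 180},   # "Consideration: 2-3 minutes"
        "bumper":              {"aspect_ratios": {(16, 9), (4, 3)},
                                "max_length_sec": 6},
    }

    def violations(ad_type: str, width: int, height: int, length_sec: float) -> list:
        """Return a list of spec problems for a creative (empty list = passes)."""
        spec = SPECS[ad_type]
        problems = []
        divisor = gcd(width, height)
        if (width // divisor, height // divisor) not in spec["aspect_ratios"]:
            problems.append(f"unsupported aspect ratio for {width}x{height}")
        if length_sec > spec["max_length_sec"]:
            problems.append(f"too long: {length_sec}s (max {spec['max_length_sec']}s)")
        return problems

    print(violations("bumper", 640, 360, 7))  # ['too long: 7s (max 6s)']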

Conclusion

YouTube offers a variety of ad formats to reach audiences effectively in 2024. Whether you want to build brand awareness, drive conversions, or target specific demographics, YouTube provides a dynamic platform for your advertising needs. Always follow Google’s advertising policies and the technical ad specs to ensure your ads perform their best. Ready to start using YouTube ads? Contact us today to get started!




A deeper dive into data, personalization and Copilots


Salesforce launched a collection of new, generative AI-related products at Connections in Chicago this week. They included new Einstein Copilots for marketers and merchants and Einstein Personalization.

To better understand not only the potential impact of the new products but also the evolving Salesforce architecture, we sat down with Bobby Jania, CMO, Marketing Cloud.

Dig deeper: Salesforce piles on the Einstein Copilots

Salesforce’s evolving architecture

It’s hard to deny that Salesforce likes coming up with new names for platforms and products (what happened to Customer 360?), and this can sometimes make the observer wonder whether something is brand new or just old with a brand-new name. In particular, what exactly is Einstein 1, and how is it related to Salesforce Data Cloud?

“Data Cloud is built on the Einstein 1 platform,” Jania explained. “The Einstein 1 platform is our entire Salesforce platform and that includes products like Sales Cloud, Service Cloud — that it includes the original idea of Salesforce not just being in the cloud, but being multi-tenancy.”

Data Cloud — not an acquisition, of course — was built natively on that platform. It was the first product built on Hyperforce, Salesforce’s new cloud infrastructure architecture. “Since Data Cloud was on what we now call the Einstein 1 platform from Day One, it has always natively connected to, and been able to read anything in Sales Cloud, Service Cloud [and so on]. On top of that, we can now bring in, not only structured but unstructured data.”

That’s a significant progression from the position, several years ago, when Salesforce had stitched together a platform around various acquisitions (ExactTarget, for example) that didn’t necessarily talk to each other.

“At times, what we would do is have a kind of behind-the-scenes flow where data from one product could be moved into another product,” said Jania, “but in many of those cases the data would then be in both, whereas now the data is in Data Cloud. Tableau will run natively off Data Cloud; Commerce Cloud, Service Cloud, Marketing Cloud — they’re all going to the same operational customer profile.” They’re not copying the data from Data Cloud, Jania confirmed.

Another thing to know is that it’s possible for Salesforce customers to import their own datasets into Data Cloud. “We wanted to create a federated data model,” said Jania. “If you’re using Snowflake, for example, we more or less virtually sit on your data lake. The value we add is that we will look at all your data and help you form these operational customer profiles.”

Let’s learn more about Einstein Copilot

“Copilot means that I have an assistant with me in the tool where I need to be working that contextually knows what I am trying to do and helps me at every step of the process,” Jania said.

For marketers, this might begin with a campaign brief developed with Copilot’s assistance, the identification of an audience based on the brief, and then the development of email or other content. “What’s really cool is the idea of Einstein Studio where our customers will create actions [for Copilot] that we hadn’t even thought about.”

Here’s a key insight (back to nomenclature). We reported on Copilot for marketers, Copilot for merchants, Copilot for shoppers. It turns out, however, that there is just one Copilot, Einstein Copilot, and these are use cases. “There’s just one Copilot, we just add these for a little clarity; we’re going to talk about marketing use cases, about shoppers’ use cases. These are actions for the marketing use cases we built out of the box; you can build your own.”

It’s surely going to take a little time for marketers to learn to work easily with Copilot. “There’s always time for adoption,” Jania agreed. “What is directly connected with this is, this is my ninth Connections and this one has the most hands-on training that I’ve seen since 2014 — and a lot of that is getting people using Data Cloud, using these tools rather than just being given a demo.”

What’s new about Einstein Personalization

Salesforce Einstein has been around since 2016 and many of the use cases seem to have involved personalization in various forms. What’s new?

“Einstein Personalization is a real-time decision engine and it’s going to choose next-best-action, next-best-offer. What is new is that it’s a service now that runs natively on top of Data Cloud.” A lot of real-time decision engines need their own set of data that might actually be a subset of data. “Einstein Personalization is going to look holistically at a customer and recommend a next-best-action that could be natively surfaced in Service Cloud, Sales Cloud or Marketing Cloud.”

Finally, trust

One feature of the presentations at Connections was the reassurance that, although public LLMs like ChatGPT could be selected for application to customer data, none of that data would be retained by the LLMs. Is this just a matter of written agreements? No, not just that, said Jania.

“In the Einstein Trust Layer, all of the data, when it connects to an LLM, runs through our gateway. If there was a prompt that had personally identifiable information — a credit card number, an email address — at a minimum, all that is stripped out. The LLMs do not store the output; we store the output for auditing back in Salesforce. Any output that comes back through our gateway is logged in our system; it runs through a toxicity model; and only at the end do we put PII data back into the answer. There are real pieces beyond a handshake that this data is safe.”
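To make the described flow concrete, here is a purely illustrative Python sketch of that kind of gateway: mask PII before the prompt leaves, log the response for auditing, and restore the PII only at the end. This is not Salesforce’s code; the regex patterns, function names, and the placeholder toxicity step are assumptions for illustration only.

    # Illustrative gateway flow only -- not Salesforce's Einstein Trust Layer.
    import re

    PII_PATTERNS = {
        "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
        "CARD":  re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    }

    def mask_pii(text):
        """Replace PII with placeholder tokens and remember the originals."""
        replacements = {}
        for label, pattern in PII_PATTERNS.items():
            for i, match in enumerate(pattern.findall(text)):
                token = f"<{label}_{i}>"
                replacements[token] = match
                text = text.replace(match, token)
        return text, replacements

    def gateway_call(prompt, llm, audit_log):
        masked, replacements = mask_pii(prompt)      # PII stripped before the LLM sees it
        response = llm(masked)                       # external model; nothing is stored there
        audit_log.append(response)                   # output retained locally for auditing
        # ... a toxicity/safety check would run on `response` here ...
        for token, original in replacements.items():
            response = response.replace(token, original)  # PII restored only at the end
        return response

    # Example with a stand-in "LLM" that just echoes the prompt.
    log = []
    print(gateway_call("Email jane@example.com about her order", lambda p: f"Draft: {p}", log))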


