

Content Assets: Score for Long-Term Success




Updated January 11, 2022

Want a balanced and actionable way to know whether your content is doing what it’s supposed to do?

Create a content scorecard.

A content scorecard allows for normalized scoring based on benchmarks determined by the performance of similar content in your industry or your company’s content standards.

It marries both qualitative and quantitative assessments. Quantitative scores are based on performance metrics such as views, engagement, SEO rank, etc. Qualitative scores are derived from predetermined criteria, such as readability, accuracy, and voice consistency (more on that in a bit).


Let’s get to work to create a content scorecard template you can adapt for your situation.


Establish your quantitative success indicators

First, you must measure what matters. What is the job for that piece of content?

For example, an index or landing page is rarely designed to be the final destination. If a reader spends too long on that kind of page, it’s likely not a good sign. On the other hand, a long time spent on a detailed article or white paper is a positive reflection of user engagement. Be specific with your content goals when deciding what to measure.

What should you measure based on the content’s purpose? Here are some ideas:

  • Exposure – content views, impressions, backlinks
  • Engagement – time spent on page, clicks, ratings, comments
  • Conversion – purchase, registration for gated content, return visits, click-throughs
  • Redistribution – shares, pins

After you’ve identified your quantitative criteria, you need to identify the benchmarks. What are you measuring against? Industry standards? Internal standards? A little of both?

A good starting point for researching general user behavior standards is the Nielsen Norman Group. If you want to focus on your industry, look at your industry marketing groups or search for something like “web metrics for best user experience in [INDUSTRY].”


Below is a sample benchmark key. The left column identifies the metric, while the top row indicates the resulting score on a scale of 1 to 5. Each row lists the parameters for the metric to achieve the score in its column.

Sample Quantitative Content Score 1-5 *

| Metric                   | 1                | 2         | 3         | 4          | 5                |
|--------------------------|------------------|-----------|-----------|------------|------------------|
| Page Views/Section Total | <2%              | 2–3%      | 3–4%      | 4–5%       | >5%              |
| Return Visitors          | <20%             | 20–30%    | 30–40%    | 40–50%     | >50%             |
| Trend in Page Views      | Decrease of >50% | Decrease  | Static    | Increase   | Increase of >50% |
| Page Views/Visit         | <1.2             | 1.2–1.8   | 1.9–2.1   | 2.2–2.8    | >2.8             |
| Time Spent/Page          | <20 sec          | 20–40 sec | 40–60 sec | 60–120 sec | >120 sec         |
| Bounce Rate              | >75%             | 65–75%    | 35–65%    | 25–35%     | <25%             |
| Links                    | 0                | 1–5       | 5–10      | 10–15      | >15              |
| SEO                      | <35%             | 35–45%    | 45–55%    | 55–65%     | >65%             |

*Values should be defined based on industry or company benchmarks.

Using a 1-to-5 scale makes it easier to analyze content that may have different goals and still identify the good, the bad, and the ugly. Your scorecard may look different depending on the benchmarks you select.
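As a minimal sketch of how benchmark bands translate into 1-to-5 scores, here is a small Python helper. The metric names and band boundaries are assumptions taken from the sample table above; replace them with your own industry or company benchmarks.

```python
def score_metric(value, thresholds):
    """Return a 1-5 score: one point for each threshold the value meets or exceeds."""
    score = 1
    for t in thresholds:
        if value >= t:
            score += 1
    return score

# Upper bounds separating the five score bands (4 thresholds -> 5 bands),
# mirroring the sample table. For metrics where lower is better (e.g.,
# bounce rate), negate both the value and the thresholds.
BENCHMARKS = {
    "return_visitors_pct": [20, 30, 40, 50],     # <20% scores 1; >50% scores 5
    "time_on_page_sec": [20, 40, 60, 120],
    "page_views_per_visit": [1.2, 1.9, 2.2, 2.9],
}

print(score_metric(45, BENCHMARKS["return_visitors_pct"]))  # 4 (falls in the 40-50% band)
```

The same lookup works for any metric you add, as long as each list holds the four ascending boundaries between its score bands.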


How to document it

You will create two quantitative worksheets.

Label the first one as “Quantitative benchmarks.” Create a chart (similar to the one above) tailored to identify your key metrics and the ranges needed to achieve each score. Use this as your reference sheet.

Label a new worksheet as “Quantitative analysis.” Your first columns should be content URL, topic, and type. Label the next columns based on your quantitative metrics (i.e., page views, return visitors, trend in page views).

After adding the details for each piece of content, add the score for each one in the corresponding columns.

Remember, the 1-to-5 rating is based on the objective standards you documented on the quantitative reference worksheet.

Determine your qualitative analytics

It’s easy to look at your content’s metrics, shrug, and say, “Let’s get rid of everything that’s not getting eyeballs.” But if you do, you risk throwing out great content whose only fault may be it hasn’t been discovered. Scoring your content qualitatively (using a different five-point scale) helps you identify valuable pieces that might otherwise be buried in the long tail.

In this content scorecard process, a content strategist or someone equally qualified on your team/agency analyzes the content based on your objectives.

TIP: Have the same person review all the content to avoid any variance in qualitative scoring standards.


Here are some qualitative criteria we’ve used:

  • Consistency – Is the content consistent with the brand voice and style?
  • Clarity and accuracy – Is the content understandable, accurate, and current?
  • Discoverability – Does the layout of the information support key information flows?
  • Engagement – Does the content use the appropriate techniques to influence or engage visitors?
  • Relevance – Does the content meet the needs of all intended user types?

To standardize the assessment, use yes-no questions. One point is earned for every yes. No point is earned for a no. The average qualitative score is then determined by adding up the yes points and dividing the total by the number of questions for the category.


The following illustrates how this works for the clarity and accuracy category as well as discoverability. In this example, three of the four clarity and accuracy questions earned a yes.

Clarity and accuracy: Is the content understandable, accurate, and current?

  • Is the content understandable to all user types?
  • Does it use appropriate language?
  • Is content labeled clearly?
  • Do images, video, and audio meet technical standards so they are clear?

Score: 3/4 * 5 = 3.8

Discoverability: Does the layout of information on the page support key information flows? Is the user pathway to related answers and next steps clear and user-friendly?

Score: 1/5 * 5 = 1.0
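The arithmetic above can be sketched in a few lines of Python. The answer lists are hypothetical stand-ins for your own category questions, assuming one boolean per yes-no question.

```python
def category_score(answers):
    """Average qualitative score for one category on a 5-point scale.

    answers: list of booleans, one per yes-no question (True = yes = 1 point).
    """
    return sum(answers) / len(answers) * 5

# Clarity and accuracy: 3 of 4 questions answered yes.
print(round(category_score([True, True, True, False]), 1))  # 3.8

# Discoverability: 1 of 5 questions answered yes.
print(category_score([True, False, False, False, False]))  # 1.0
```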

TIP: Tailor the questions in the relevance category based on the information you can access. For example, if the reviewer knows the audience, the question “Is it relevant to the interests of the viewers?” is valid. If the reviewer doesn’t know the audience, don’t ask that question. But almost any reviewer can answer whether the content is current, so that would be a valid question to include.

How to document it

Create two qualitative worksheets.


Label the first worksheet “Qualitative questions.”

The first columns are the content URL, topic, and type. Then section the columns for each category and its questions. Add the average formula to the cell under each category label.

Let’s illustrate, continuing the example above:

After the content details, label the next column “Clarity and accuracy,” and add a column for each of the four corresponding questions.

Then go through each content piece and question, inputting a 1 for yes and a 0 for no.

To calculate the average rating for clarity and accuracy for the first piece of content, enter the formula “=(B5+B6+B7+B8)/4” (or, equivalently, “=AVERAGE(B5:B8)”) into the cell under the category label.

For simpler viewing, create a new worksheet labeled “Qualitative analysis.” Include only the content information accompanied by the category averages in each subsequent column.

Put it all together

With your quantitative and qualitative measurements determined, you now can create your scorecard spreadsheet.


Here’s what it would look like based on the earlier example (minus the specific content URLs).

Qualitative Scores

| Category                  | Article A | Article B | Article C | Article D | Article E |
|---------------------------|-----------|-----------|-----------|-----------|-----------|
| Brand voice/style         | 5         | 1         | 2         | 3         | 1         |
| Accuracy/currency         | 4         | 2         | 3         | 2         | 2         |
| Discoverability           | 3         | 3         | 3         | 3         | 3         |
| Engagement                | 4         | 2         | 4         | 2         | 2         |
| Relevance                 | 3         | 3         | 5         | 3         | 3         |
| Average Qualitative Score | 3.8       | 2.2       | 3.4       | 2.6       | 2.2       |

Quantitative Scores

| Category                   | Article A          | Article B        | Article C                    | Article D                    | Article E          |
|----------------------------|--------------------|------------------|------------------------------|------------------------------|--------------------|
| Exposure                   | 3                  | 1                | 3                            | 3                            | 3                  |
| Engagement                 | 2                  | 2                | 2                            | 2                            | 2                  |
| Conversion                 | 1                  | 3                | 3                            | 1                            | 3                  |
| Backlinks                  | 4                  | 2                | 2                            | 4                            | 2                  |
| SEO %                      | 2                  | 3                | 3                            | 2                            | 3                  |
| Average Quantitative Score | 2.4                | 2.2              | 2.6                          | 2.4                          | 2.6                |
| Average Qualitative Score  | 3.8                | 2.2              | 3.4                          | 2.6                          | 2.2                |
| Recommended Action         | Review and improve | Remove and avoid | Reconsider distribution plan | Reconsider distribution plan | Review and improve |

On the scorecard, an average row has been added for each score type. Each is calculated by totaling the scores across the categories and dividing by the number of categories.

Now you have a side-by-side comparison of each content URL’s average quantitative and qualitative scores. Here’s how to analyze the numbers and then optimize your content:

  • Qualitative score higher than a quantitative score: Analyze your distribution plan. Consider alternative times, channels, or formats for this otherwise “good” content.
  • Quantitative score higher than a qualitative score: Review the content to identify ways to improve it. Could its quality be improved with a rewrite? What about the addition of data-backed research?
  • Low quantitative and qualitative scores: Remove this content from circulation and adapt your content plan to avoid this type of content in the future.
  • High quantitative and qualitative scores: Promote and reuse this content as much as feasible. Update your content plan to replicate this type of content in the future.

Of course, there are times when the discrepancy between quantitative and qualitative scores may indicate that the qualitative assessment is off. Use your judgment, but at least consider the alternatives.
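The four decision rules above can be sketched as a small function. The 3.0 cutoff separating a “high” from a “low” average is an assumption; set it from your own benchmarks, and treat the output as a starting point for the judgment call just mentioned, not a final verdict.

```python
def recommend(quant_avg, qual_avg, high=3.0):
    """Map a content piece's average scores to a recommended action.

    Assumes averages >= `high` count as "high" scores; 3.0 is an
    illustrative midpoint on the 1-5 scale, not an official threshold.
    """
    if quant_avg >= high and qual_avg >= high:
        return "Promote and reuse"
    if quant_avg < high and qual_avg < high:
        return "Remove and avoid"
    if qual_avg > quant_avg:
        # Good content that isn't being found: revisit timing/channels/formats.
        return "Reconsider distribution plan"
    return "Review and improve"

print(recommend(2.6, 3.4))  # Reconsider distribution plan
print(recommend(2.2, 2.2))  # Remove and avoid
```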


Get going

When should you create a content scorecard? While it may seem like a daunting task, don’t let that stop you. Don’t wait until the next big migration. Take bite-size chunks and make it an ongoing process. Start now and optimize every quarter, then the process won’t feel quite so Herculean.

Selecting how much and what content should be evaluated depends largely on the variety of content types and the consistency of content within the same type. You need to select a sufficient number of content pieces to see patterns in topic, content type, traffic, etc.

Though there is no hard-and-fast science to sample size, in our experience 100 to 200 content assets have been sufficient. Your number will depend on:

  • Total inventory size
  • Consistency within a content type
  • Frequency of audits

Review in batches so you don’t get overwhelmed. Set evaluation cycles and look at batches quarterly, revising, retiring, or repurposing your content based on the audit results every time. And remember to select content across the performance spectrum. If you only focus on high-performing content, you won’t identify the hidden gems.


Raise your qualitative and quantitative content marketing initiatives with helpful insight from experts in the field. Subscribe to the free CMI weekday newsletter.

Cover image by Joseph Kalinowski/Content Marketing Institute



PGA TOUR transforms fan experience, analytics and customer feedback




This week, the PGA TOUR announced a partnership with experience management (XM) technology company Qualtrics to begin a multiyear transformation of fan experience across all touchpoints for tour events.

The PGA TOUR will use Qualtrics’ XM suite, which includes the Qualtrics Social Connect and Qualtrics Customer and Employee XM products, to draw insights from how fans engage with digital platforms at tournaments and determine ways to improve the experience. This, in turn, will help meet the goal of cultivating new fans as well, according to Travis Trembath, vice president of fan engagement for the PGA TOUR.

Improving the fan’s journey. “There are several stages in a tournament attendee’s journey, each of which can make or break someone’s overall experience — from parking, to food and beverage, to restrooms and venue sight lines,” said Trembath. “Our goal is to improve all aspects to provide fans a best-in-class experience from start to finish.”

The journey also includes different levels of engagement through digital experience (DX) touchpoints. For instance, some fans use the PGA TOUR app on the go to check scores and tournament news. Other fans want a more engaged DX that complements a tour telecast on a second screen. And there are also fans who seek out stats and other content relating to fantasy sports and betting.

“The partnership with Qualtrics will enable us to gain a deeper understanding of fan preferences across all of these channels and allow us to begin to optimize the experience on our existing platforms for different types of fans that consume the tour in different ways,” Trembath said.


Feedback front and center. What will fuel the transformation? Feedback from fans. The tour already sends surveys to ticket buyers and a fan panel following an event, and it uses social media listening tools. The PGA TOUR will use the XM products to build out holistic fan profiles to make the feedback, and the eventual improvements, more comprehensive.


“One potential outcome of getting closer to our fans is that we may very well uncover new ways to engage fans that we had not previously contemplated or implemented,” said Trembath.

For current touchpoints, insights from customer preferences will inform the kind of content that is produced and distributed on digital channels.

“We understand our fans are looking for more immersive on and offline experiences; they want more behind-the-scenes access and content from our world-class athletes,” Trembath explained. “Qualtrics XM products will allow us to dig deeper and use a more systematic approach to learning about our fans’ preferences and experiences, and enable us to connect the dots to build more holistic profiles of our fans’ behaviors across multiple touchpoints.”


Implementing Qualtrics XM. “The initial rollout will be focused on identifying macro insights that can be used to improve the overall experience for fans around the world,” said Trembath.

As a phase two, PGA TOUR will integrate fan preferences into their first-party fan database, resulting in more personalized experiences.

“The Qualtrics platform will be complementary to our Adobe digital marketing products and AWS data and analytics tools, enhancing our overall capabilities when it comes to learning and engaging our fans,” Trembath said.

Timeline for rollout. Some Qualtrics XM elements will be deployed this fall, including collecting feedback from websites, apps and social media.


Real-time feedback through the Qualtrics XM platform will be incorporated into some tournaments in Q1 2023. The tour will use the insights to improve the experience on the fly. Additionally, post-event feedback will be used to improve the experience at specific annual tournaments the following year.

“As we uncover opportunities to improve the fan experience, we will act on them immediately,” said Trembath.


Why we care. Golf tournaments have many of the same touchpoints as stadium sports, even if the golf course is a more open, outdoor venue. You have parking, ticketing, concessions and, of course, the game itself. With a lot of downtime between swings there’s also more opportunity for fans to consume content on a mobile device. So there is definitely a need to make sure that the experience is first rate. And who is a better authority on how to improve the experience than the fans themselves?


Golf fans who also play the game have seen more technology at many high-end courses, so it’s reasonable to assume that they expect the experience at tournaments to continue to improve.

About The Author

Chris Wood draws on over 15 years of reporting experience as a B2B editor and journalist. At DMN, he served as associate editor, offering original analysis on the evolving marketing tech landscape. He has interviewed leaders in tech and policy, from Canva CEO Melanie Perkins, to former Cisco CEO John Chambers, and Vivek Kundra, appointed by Barack Obama as the country’s first federal CIO. He is especially interested in how new technologies, including voice and blockchain, are disrupting the marketing world as we know it. In 2019, he moderated a panel on “innovation theater” at Fintech Inn, in Vilnius. In addition to his marketing-focused reporting in industry trades like Robotics Trends, Modern Brewery Age and AdNation News, Wood has also written for KIRKUS, and contributes fiction, criticism and poetry to several leading book blogs. He studied English at Fairfield University, and was born in Springfield, Massachusetts. He lives in New York.


