This interview is part of our Content Visionaries series, asking content leaders across industries for their insights into findings from our 2021 State of Content Operations Study.   

Content measurement is key to the success of any content strategy. Content Science has found that very successful organizations are more than twice as likely as slightly successful ones to evaluate content impact and success regularly. 

But many organizations struggle with where to start and what to measure. To get expert answers to pressing content measurement questions, Content Science spoke with AT&T’s Cory Bennett, Associate Director for CX Content Strategy, Tracking, and Measurement.

In this interview, Bennett gives his views on how companies can overcome content measurement challenges, approach setting up a measurement system, and the type of measurement tools he finds helpful.

In our 2021 Content Operations study, we found that 65% of organizations do not regularly evaluate content impact or success. A lack of time and vague objectives and goals are the most common reasons respondents cited for not evaluating content. Why do you think so many organizations struggle with content evaluation?

BENNETT: Getting started with content evaluation—setting up the process—takes a significant amount of time. To do it right, you need to be very intentional about what it is you’re evaluating against. You need to be mindful of the content goals for your customer/end-user. You also have business or organization goals to keep in mind, and your content will be a key driver in meeting those…or not. 

For your customers

Think about:

  • What pillars do you keep in mind when you’re designing experiences? 
  • How well do you know your customers, their needs, and their expectations? 
  • What is the purpose of the overall experience, each individual page, or each module of content? 
  • Who are your competitors? 
  • What experiences are your customers engaging in that are the true benchmarks for how they view your customer experience? 

Without that framework in place, evaluating content is a subjective exercise. With those details defined and with industry leaders or marketplace influencers as competitive benchmarks, you can better understand if your content is meeting your customers’ needs. Are they engaging with content as intended? Are they able to easily understand your products and services? Can they complete tasks with minimal frustration? 

For your business

These goals should be influenced by your customers’ needs, and they can and should impact how you evaluate the effectiveness of content for your customers. Think about:

  • Are you tracking progress through the experience and conversions? 
  • Are customers getting stuck at a specific point in their journey? 
  • Are there industry- or company-specific terms and jargon that might be confusing users and distracting them from the task at hand? 
  • Are you attempting to build long-term engagement, or is shorter-term acquisition more important? 

Knowing the answers to these questions will help you optimize the experience for your customers and can impact your bottom line. 

How do you approach content evaluation at your organization?

BENNETT: I’m fortunate to work in an organization where understanding the effectiveness of our digital experience and its content is a priority. Our team is still relatively new and we’re growing our capabilities, but we have leadership buy-in, budget for staffing, and a small team of experts committed to the tracking and measurement of our digital experience. 

We use a number of tools to evaluate content in our digital experience. Our team collaborated with several other key partners to define a set of pillars that we consider when building or evaluating experiences. On every project we work on, whether the scope is evaluating an experience and recommending enhancements or conducting a full customer experience design exercise, we view the work through the lens of these CX pillars.

When it comes to our process, it’s evolved pretty organically. Early last year, one of our content strategists, Robin Japar, was looking for a new way to document points of friction for a project, one that brought together all of the qualitative and quantitative data the team had gathered. It was then that she created the first of what we now call friction maps. A friction map is a deep dive into a task, flow, or sub-journey with the intent of identifying points of friction and opportunities to improve the experience. 

We start by identifying the funnel or sub-journey, and then we leverage recorded user sessions to observe users’ behaviors as they attempt to complete tasks on the website or mobile app. We use the recorded sessions to document the user flow from the perspective of the customer and how they engage with the experience, as opposed to defining the flow in the way it was originally designed. We document URLs and assign screenshots to each step of the user flow to enable more thorough analysis. We gather a variety of data including, but not limited to, traditional web analytics, Voice of the Customer survey feedback, SEO, site search terms, and engagement data. The relevant data points are overlaid alongside the specific points in the flow so they can provide quantitative context. 
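
To make that overlay concrete, here is a minimal sketch, in Python, of how one step of a friction map might be modeled. The class names, fields, and figures are hypothetical illustrations, not AT&T’s actual tooling.

```python
from dataclasses import dataclass, field

@dataclass
class FrictionPoint:
    """One observed point of friction and where the evidence came from."""
    description: str   # e.g., "users scroll past the comparison table"
    source: str        # e.g., "session replay", "VoC survey", "site search terms"
    severity: str      # team-assigned: "low", "medium", or "high"

@dataclass
class FlowStep:
    """One step of the user flow, documented as customers actually traverse it."""
    name: str
    url: str
    screenshot: str    # path to the annotated screenshot for this step
    metrics: dict[str, float] = field(default_factory=dict)      # overlaid quantitative context
    frictions: list[FrictionPoint] = field(default_factory=list)

# Hypothetical example: one step of a sub-journey with data overlaid alongside it
step = FlowStep(
    name="Select plan",
    url="https://www.example.com/plans",
    screenshot="shots/select-plan.png",
    metrics={"visits": 120_000, "exit_rate": 0.34, "survey_csat": 3.1},
)
step.frictions.append(FrictionPoint(
    description="Users scroll past the plan-comparison table without engaging",
    source="session replay",
    severity="high",
))
```

Keeping the qualitative observations and the quantitative context attached to the same step is what lets the later synthesis happen in one place.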

This is where things get interesting, because experts from our Content Strategy, Experience Strategy, Experience Design, and Research teams audit the experience and provide insights that reflect their respective disciplines and align with the pillars I mentioned earlier. Our Research team provides “known knowns” from past studies as well as marketplace insights and results from competitive analyses to further inform the analysis.

From there, the team synthesizes all of the data gathered and the frictions identified. We develop hypotheses and conduct guerrilla user studies to validate them. That data is used to finalize and benchmark the friction map so we can report on trends. It’s also a handy deliverable for project teams to inform journey map development and “how might we” design sprint exercises. 

Setting up a content measurement system can take time. Have you had a specific example of success in overcoming a measurement challenge your team has faced?

BENNETT: The friction maps I mentioned previously are some of the best examples of success we’ve had this year. We’ve received lots of positive feedback from partners and stakeholders as we’ve conducted a soft rollout of these processes. Possibly the most exciting development has been the eagerness of these partner teams to get involved and help evangelize the capabilities. We’re hosting regular training and partnering sessions with a number of teams to enable them either to use the tools we create or to build their own in alignment with the pillars we’ve defined, so each organization is tracking and measuring similarly and able to tell a unified story. 

These partnerships have also introduced new tools and data sets for our team to include in the friction mapping process to help us have a more well-rounded understanding of the customer experience. At the beginning of 2021, most of the data we had access to was specific to the digital experience. As we kick off 2022, we’re getting access to data from our chat and call center reps that can help us understand points of friction in those channels so we can design digital experiences that address those needs and opportunities.

We’re developing a thorough communications and evangelization plan for these processes that we’ll expand in the new year to include more regular reporting, brown bag training sessions, and regular partner discussions so we can more holistically understand opportunities to improve experiences for our customers.

Our Content Operations study also found that very successful organizations are more likely to be using ROI to evaluate content. How does your organization assess content ROI?

BENNETT: Defining the ROI of content for our customer experience is a capability we’re building toward. Right now, we have an understanding of the opportunity cost of not getting the experience right, in the form of potential revenue lost. This helps us identify the portions of the experience where focusing our time will yield the most lift. We continue to evolve our experimentation process, which not only provides quick insights into the effectiveness of changes to content and design elements but also helps to define further areas of exploration for marketplace insights.
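
As a rough, hypothetical illustration of that opportunity-cost framing (not AT&T’s actual model), the potential revenue lost at one step of a flow can be estimated from its traffic, its excess drop-off versus a benchmark, and the value of a completed conversion:

```python
def potential_revenue_lost(visits: int,
                           drop_off_rate: float,
                           benchmark_drop_off: float,
                           downstream_conversion_rate: float,
                           value_per_conversion: float) -> float:
    """Estimate monthly revenue lost to excess friction at one step of a flow.

    All parameters are illustrative assumptions; excess drop-off is measured
    against a benchmark such as a comparable flow or an industry reference.
    """
    excess_drop_off = max(0.0, drop_off_rate - benchmark_drop_off)
    lost_visitors = visits * excess_drop_off
    return lost_visitors * downstream_conversion_rate * value_per_conversion

# Example: 120k monthly visits, 34% drop-off against a 22% benchmark,
# 40% of retained visitors convert downstream, $60 per conversion
print(potential_revenue_lost(120_000, 0.34, 0.22, 0.40, 60.0))  # 345600.0
```

Ranking steps by an estimate like this is one simple way to decide where focusing your time will yield the most lift.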

We’re also improving our capacity modeling and content operations processes, the result of which will be a clearer understanding of the ROI of content—from the effort it takes to create or change a component, to the impact of that change on the customer, and to our bottom line.

What content measurement tools do you use? 

BENNETT: We have a number of tools at our disposal right now. We use Mural as our collaborative canvas where we build journeys and friction maps. It allows each discipline to audit the experience and provide feedback and insights, and it provides an organized area for us to synthesize the information and prepare it for sharing and reporting.

We use Quantum Metric (QM) to understand how customers are engaging with the experience. Its ever-improving capabilities have made it an incredibly useful tool. Its session replays help us understand how customers actually use our digital experiences. We’re using it to understand engagement throughout the customer journey through access to web analytics, traffic data, Voice of the Customer insights and survey responses, heat mapping, and other details.

We use Adobe Analytics to get a deeper understanding of engagement, abandonment, progress, and conversions. We use Google Analytics to understand what customers are searching for, how they describe products and services, how they prefer to manage their account, and what they look for from support. 

Our analysis also includes customer sentiment from Voice of the Customer surveys and third-party benchmarking and competitive studies.

How often does your team review content measurement data? 

BENNETT: We review and report on content measurement data monthly, quarterly, and on a project-by-project basis. For projects, we review data pre-kickoff, during the discovery phase, and then again as we hand off our deliverables at the end of the project. Depending on the goals of the project, we fold reporting relevant to those goals into our regular monthly and quarterly reporting.

How does content evaluation impact future content decisions? 

BENNETT: Our evaluations, and the insights and data they’re built on, are part of the foundation for our content strategy.

At a macro level, we use these insights to inform prioritization of projects we should embark on as a team and organization based on what we expect will have the biggest impact on the customer experience. 

At a project level, each project starts with a discovery phase, which is kicked off with the relevant insights and data, and an analysis of what additional data is needed before our research, tracking, and metrics teams pull in additional information. That data informs the strategy we define for the project, which in turn guides the experience we design, test, and iterate on. The resulting definitions are incorporated into our regular reporting.

For organizations that aren’t sure where to begin when it comes to content measurement, what are your suggestions for getting started?

BENNETT: It’s OK to be scrappy. While it may seem like you’re starting from the ground floor, there’s almost always data available to help you get up and running.

A good starting point is defining your pillars. What are the values you want to measure against? What’s important to your customers or end users that you need to keep tabs on? Then assess the marketplace to understand both the competition in your industry and best-in-class experiences from non-traditional competitors. What do those competitors do well that sets customer expectations for how they’ll interact with your experience?

Identify those attributes while at the same time getting a baseline understanding of what data you have available to you. Does another organization or partner team already have tools in place to gather data and provide you with insights? Anything from traditional web analytics to tools freely available through most major search engines will give you something to start with. If you have access to more advanced analytics tools and teams willing to share data or partner with you on insights, that can take your evaluation up another notch.

The Authors

Content Science partners with the world’s leading organizations to close the content gap in digital business. We bring together the complete capabilities you need to transform or scale your content approach. Through proprietary data, smart strategy, expert consulting, creative production, and one-of-a-kind products like ContentWRX and Content Science Academy, we turn insight into impact. Don’t simply compete on content. Win.


Cory Bennett is a St. Louis-based digital and content strategist, and Associate Director for CX Content Strategy, Tracking, and Measurement at AT&T.
