This interview is part of our Content Visionaries series, asking content leaders across industries for their insights into findings from our 2021 State of Content Operations Study.
Content measurement is key to the success of any content strategy. Content Science has found that very successful organizations are more than twice as likely as slightly successful ones to evaluate content impact and success regularly.
But many organizations struggle with where to start and what to measure. To get expert answers to pressing content measurement questions, Content Science spoke with AT&T’s Cory Bennett, Associate Director for CX Content Strategy, Tracking, and Measurement.
In this interview, Bennett gives his views on how companies can overcome content measurement challenges, approach setting up a measurement system, and the type of measurement tools he finds helpful.
BENNETT: Getting started with content evaluation—setting up the process—takes a significant amount of time. To do it right, you need to be very intentional about what it is you’re evaluating against. You need to be mindful of the content goals for your customer/end-user. You also have business or organization goals to keep in mind, and your content will be a key driver in meeting those…or not.
For your customers
Think about:
Without that framework in place, evaluating content is a subjective exercise. With those details defined and with industry leaders or marketplace influencers as competitive benchmarks, you can better understand if your content is meeting your customers’ needs. Are they engaging with content as intended? Are they able to easily understand your products and services? Can they complete tasks with minimal frustration?
For your business
These goals should be influenced by your customers’ needs, and they can and should impact how you evaluate the effectiveness of content for your customers. Think about:
Knowing the answers to these questions will help you optimize the experience for your customers and can impact your bottom line.
BENNETT: I’m fortunate to work in an organization where understanding the effectiveness of our digital experience and its content is a priority. Our team is still relatively new and we’re growing our capabilities, but we have leadership buy-in, budget for staffing, and a small team of experts committed to the tracking and measurement of our digital experience.
We use a number of tools to evaluate content in our digital experience. Our team collaborated with several other key partners to define a set of pillars that we consider when building or evaluating experiences. Whatever the scope of a project, whether it’s evaluating an experience and recommending enhancements or a full customer experience design exercise, we view it through the lens of these CX pillars.
When it comes to our process, it’s evolved pretty organically. Early last year, one of our content strategists, Robin Japar, was looking for a new way to document points of friction for a project, one that included all of the qualitative and quantitative data the team had gathered. It was then that she created the first of what we now call Friction Maps. A Friction Map is a deep dive into a task/flow/sub-journey with the intent of identifying points of friction and opportunities to improve the experience.
We start by identifying the funnel or sub-journey, and then we leverage recorded user sessions to observe users’ behaviors as they attempt to complete tasks on the website or mobile app. We use the recorded sessions to document the user flow from the customer’s perspective, how they actually engage with the experience, as opposed to defining the flow the way it was originally designed. We document URLs and assign screenshots to each step of the user flow to enable more thorough analysis. We gather a variety of data including, but not limited to, traditional web analytics, voice of the customer survey feedback, SEO, site search terms, and engagement data. The relevant data points are then overlaid on the specific points in the flow to provide quantitative context.
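To make the overlay step concrete, here is a minimal sketch in Python of how one step of a friction map might pair the documented flow with its quantitative context. The class, field names, and example metrics are hypothetical illustrations, not AT&T’s actual schema or tooling.

```python
from dataclasses import dataclass, field

@dataclass
class FrictionMapStep:
    """One step in a documented user flow, with quantitative context overlaid."""
    name: str                                      # e.g., "Select plan"
    url: str                                       # page or screen for this step
    screenshot: str                                # screenshot assigned to the step
    metrics: dict = field(default_factory=dict)    # overlaid data points
    frictions: list = field(default_factory=list)  # observed points of friction

# Hypothetical example: overlaying analytics and survey data on one step.
step = FrictionMapStep(
    name="Select plan",
    url="https://example.com/plans",
    screenshot="flows/upgrade/step-2.png",
    metrics={
        "exit_rate": 0.34,                   # traditional web analytics
        "csat": 3.1,                         # voice-of-customer survey score
        "top_search_term": "compare plans",  # site search data
    },
)
step.frictions.append("Users search for plan comparisons the page does not offer")
```

The point of the structure is simply that each documented step carries its own slice of quantitative evidence, so observed frictions can be argued from data rather than opinion.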
This is where things get interesting, because experts from our Content Strategy, Experience Strategy, Experience Design, and Research teams audit the experience and provide insights from their respective disciplines, aligned to the pillars I mentioned earlier. Our Research team provides “known knowns” from past studies as well as marketplace insights and results from competitive analyses to further inform the analysis.
From there, the team synthesizes all of the data gathered and the frictions identified. We develop hypotheses and conduct guerrilla user studies to validate them. That data is used to finalize and benchmark the friction map so we can report on trends. It’s also a handy deliverable for project teams to inform journey map development and “how might we” design sprint exercises.
BENNETT: The friction maps I mentioned previously are some of the best examples of success we’ve had this year. We’ve received lots of positive feedback from partners and stakeholders as we’ve conducted a soft rollout of these processes. Possibly the most exciting development has been the eagerness of these partner teams to get involved and help evangelize the capabilities. We’re hosting regular training and partnering sessions with a number of teams to enable them either to use the tools we create or to build their own in alignment with the pillars we’ve defined, so each organization tracks and measures similarly and can tell a unified story.
These partnerships have also introduced new tools and data sets for our team to include in the friction mapping process, giving us a more well-rounded understanding of the customer experience. At the beginning of 2021, most of the data we had access to was specific to the digital experience. As we kick off 2022, we’re getting access to data from our chat and call center reps that can help us understand points of friction in those channels so we can design digital experiences that address those needs and opportunities.
We’re developing a thorough communications and evangelization plan for these processes. In the new year, we’ll expand it to include more regular reporting, brown-bag training sessions, and recurring partner discussions so we can more holistically understand opportunities to improve experiences for our customers.
BENNETT: Defining the ROI of content for our customer experience is a capability we’re building toward. Right now, we have an understanding of the opportunity cost of not getting the experience right, in the form of potential revenue lost. That helps us identify where focusing our time will yield the most lift. We continue to evolve our experimentation process, which provides not only quick insights into the effectiveness of changes to content and design elements but also helps define further areas of exploration for marketplace insights.
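As a hedged illustration of that opportunity-cost framing, the back-of-envelope sketch below estimates the revenue a friction point might be costing. Every figure is an assumption invented for the example, not AT&T data.

```python
# Hypothetical back-of-envelope estimate of revenue lost to friction.
sessions = 100_000       # monthly sessions entering the flow (assumed)
abandon_rate = 0.34      # abandonment observed at the friction point (assumed)
recoverable = 0.25       # share of abandons a content fix might recover (assumed)
avg_order_value = 60.00  # average revenue per completed task (assumed)

potential_revenue_lost = sessions * abandon_rate * recoverable * avg_order_value
print(f"Estimated recoverable revenue: ${potential_revenue_lost:,.0f}/month")
```

Ranking flows by an estimate like this is one way to decide where focused content work is likely to yield the most lift.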
We’re also improving our capacity modeling and content operations processes, the result of which will be a clearer understanding of the ROI of content—from the effort it takes to create or change a component, to the impact of that change on the customer, and to our bottom line.
BENNETT: We have a number of tools at our disposal right now. We use Mural as our collaborative canvas where we build journeys and friction maps. It allows each discipline to audit the experience and provide feedback and insights, and it provides an organized area for us to synthesize the information and prepare it for sharing and reporting.
We use Quantum Metric (QM) to understand how customers are engaging with the experience. Its ever-improving capabilities have made it an incredibly useful tool. Its session replays help us understand how customers actually use our digital experiences, and we use it to understand engagement throughout the customer journey via web analytics, traffic data, Voice of the Customer insights and survey responses, heat mapping, and other details.
We use Adobe Analytics to get a deeper understanding of engagement, abandonment, progress, and conversions. We use Google Analytics to understand what customers are searching for, how they describe products and services, how they prefer to manage their account, and what they look for from support.
Our analysis also includes customer sentiment from Voice of the Customer surveys and third-party benchmarking and competitive studies.
BENNETT: We review and report on content measurement data monthly, quarterly, and on a project-by-project basis. For projects, we review data pre-kickoff, during the discovery phase, and then again as we hand off our deliverables at the end of the project. Depending on the goals of the project, we fold reporting relevant to those goals into our regular monthly/quarterly reporting.
BENNETT: Our evaluations—the insights and data they’re built on—are part of the foundation for our content strategy.
At a macro level, we use these insights to inform prioritization of projects we should embark on as a team and organization based on what we expect will have the biggest impact on the customer experience.
At a project level, each project starts with a discovery phase, which is kicked off with the relevant insights and data and an analysis of what else is needed before our research, tracking, and metrics teams pull in additional information. That data informs the strategy we define for the project, which in turn guides the experience we design, test, and iterate on. The resulting definitions are incorporated into our regular reporting.
BENNETT: It’s OK to be scrappy. While it may seem like you’re starting from the ground floor, there’s almost always data available to help you get up and running.
A good starting point is defining your pillars. What are the values you want to measure against? What’s important to your customers or end users that you need to keep tabs on? Then assess the marketplace and understand the competition in your industry and best-in-class experiences from non-traditional competitors. What do those competitors do well that sets customer expectations for how they’ll interact with your experience?
Identify those attributes while getting a baseline understanding of what data is available to you. Does another organization or partner team already have tools in place to gather data and provide you with insights? Anything from traditional web analytics to tools freely available through most major search engines will give you something to start with. If you have access to more advanced analytics tools and teams willing to share data or partner with you on insights, that can take your evaluation up another notch.
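As a hypothetical illustration of how scrappy that first pass can be, the sketch below scores pages against two self-defined pillars using nothing more than a standard analytics export. The pillar names, weights, column names, and file are all assumptions for the example.

```python
import csv

# Pillars to measure against; names and weights are illustrative assumptions.
PILLARS = {"findability": 0.5, "task_completion": 0.5}

def score_page(row: dict) -> float:
    """Blend basic exported analytics into a single pillar-weighted score."""
    findability = 1.0 - float(row["bounce_rate"])    # proxy: lower bounce suggests easier-to-find content
    task_completion = float(row["conversion_rate"])  # proxy: share of visits completing the task
    return (PILLARS["findability"] * findability
            + PILLARS["task_completion"] * task_completion)

# Hypothetical CSV export from any web analytics tool,
# with columns: url, bounce_rate, conversion_rate (as decimals).
with open("analytics_export.csv", newline="") as f:
    pages = sorted(csv.DictReader(f), key=score_page)

# The lowest-scoring pages are candidates for a closer friction review.
for row in pages[:5]:
    print(f"{row['url']}: {score_page(row):.2f}")
```

Even a crude score like this yields a ranked list of pages to investigate, which is enough to start the kind of evaluation Bennett describes.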