At first glance, a distributed publishing model, more than 1 million pages, and diverse audiences present an intimidating content challenge for National Instruments. But Web Content Producer Lauren Moler tackled this challenge by mapping mental models (inspired by Indi Young’s book Mental Models) that visualize the connection between specific user tasks and content. For Lauren, using these mental models as maps helped her quickly identify content gaps and reinforced the importance of meeting user needs.
In this interview with Content Science, Lauren discusses mental models in more detail and explains how this tool helps her tackle complex enterprise content problems. Her insights help reinforce why backing up your content decisions with data is so important—not only with quantitative analytics but also by providing a qualitative analysis of user needs.
A mental model maps user tasks or questions to the content we publish on our site. The visual becomes a series of boxes (see below) that each represents a user task. Those tasks are stacked up in towers and grouped together at the top of the mental model. At the bottom are pages grouped to match the tasks.
We map the mental model directly to the buying cycle, so this visualization shows us whether we’re creating content focused on what users need. It also helps us build a roadmap for creating new content to fill the gaps. That way, we’re not creating duplicate content when we already have answers for users on our site. Now we’re trying to use the mental model to understand which content needs refreshing or curating.
The key element is focusing content on user tasks. Before, content for a particular project varied depending on who was in the room or where the business was at the time. We’d have different ideas of what content needed to be published, and those ideas weren’t based on research. The mental model helps us make sure that we’re focusing on user tasks and filling the right content gaps. We learned a lot in talking to sales and got great insights into what’s really important in the buying cycle.
The mental model also helps us understand potential areas for reusing content. While we started out using mental models on just a particular part of our site, we’ve expanded that to include almost the entire National Instruments site. With that kind of overall view, we started to see content patterns where the user needs were the same. That suggested areas where we could reuse content.
Understanding user needs helps us with this issue. We convinced people to write less technical and feature-focused content by showing them our users’ questions and needs. We support our user insights with search data, analytics data, and qualitative data that we’ve collected through surveys and usability testing. When you show stakeholders the data, it’s really easy to demonstrate what happens when you write an article one way versus another way. Also, I like to show what people search for after they read a technical piece of content to better understand what information we failed to provide.
I suggest using case studies about companies that have had similar problems with redesigns, and cite internal examples as well. When companies redesign a website, they often focus on the top page layer and ignore what’s underneath. The content problem beneath that top layer is simply too large to handle in a redesign. For us, that’s more than 1 million pages, so completely redesigning everything is impossible.
Of course, there are still times when people think they must redesign. In those cases, I suggest trying to slow the process down enough to do some user research and define a content strategy upfront. Try to get a good handle on what content you already have. When enterprises see a lot of problems with a website, the natural inclination is to tear it down and start over from scratch. But through the mental model process, we learned that we have a lot of really good content on our site. If you redesign, let decision makers know that they may risk losing a lot of really good content that currently answers user questions.
Create clearly defined roles. In the case of the mental model, we involved everyone but had a core team of about 4–5 decision makers. If we brought new people into the process, we helped them understand the role we wanted them to play and clearly set expectations. We also have a distributed publishing model at National Instruments. That makes it impossible to gatekeep all of the content that goes up onto our site. It’s something we’re hoping to change, but we currently have to work around that reality. If everyone can publish, that means getting as many people in front of our content strategy as possible so they can see it and become a part of it.
Given these parameters, the process turned out much better than I expected. The mental model process sped things up a lot. We spent less time having ambiguous conversations about what our content should be doing. Instead, we have something concrete that makes it easier to get a large number of people behind a content idea. I was also surprised by how willing people were to participate. But it makes sense. Once you talk to people about user needs, you realize everyone cares about your customers as much as you do.
At the end of our interview, Lauren added, “Any amount of content strategy you can implement within your organization or into a project, do it. Even if it’s not 100%.” If you need inspiration to start somewhere, Lauren’s progress at National Instruments illustrates the importance of studying user needs, using that data to justify content strategy, and filling in content gaps to help shore up critical parts of the buying cycle. Lauren’s experience also reinforces the need to use data as much as possible in order to help people understand why content matters.
Visit National Instruments to explore its content.
Originally published on the now-archived Content Science blog in November 2013.