Content Research (or How to Tell If Your Content is Broken)

Georgina Laidlaw

How well is your content working?

In the age of data-worship, most site owners answer that question by turning to their site analytics package. Pageviews, bounces, conversions, goals – these are the vital statistics by which most assess their content’s health.

But if you publish or manage content in almost any form – from a simple sales website to a full-blown digital publication like SitePoint – sooner or later you’ll want more information than this.

Much more.


You don’t just want to know what works to achieve an end-goal. You want to understand how users are actually using the content you’re providing.

I ran into this recently with a support site I write and manage. I didn’t want to know about ad conversions or video shares and leave it at that.

What I wanted to know were things like:

  • Were those print buttons we put on each help article a waste of space?
  • Did people think the presenters in our videos were entertaining, friendly, professional, or annoying?
  • How much of each article were people reading?

My questions didn’t stop with the digital experience, though. Content research, I’ve found, is the ultimate can of worms. The more you know, the more you want to know.

Soon I found myself asking what role our content played in the bigger picture of problem-solving that our users face as they use our software. What were the gaps – and how could we best help to fill them? And so on (and on).

These kinds of questions can easily be recast for other content-heavy sites, where you want to know if people make it to the end of the content, if they feel good about your stuff, and if, dare I say it, their lives are enriched or made easier somehow by what your site offers.

Your analytics package won’t tell you all that. This is where content research comes in.

What is content research?


When I first had these problems, I didn’t know the answer to that question. All I knew was that I had content – most of which I hadn’t created – and I wanted to understand more about how that content actually worked for users.

So I needed to research my content. Simple, right?!

Well, sort of.

There’s a plethora of tools and techniques out there, so you’ll want to pick the right tool for the job. The good news is that there’s probably more than one capable tool for any given job. The bad news is that if you use that tool poorly, you’ll get low-quality information – and possibly even be completely misled by it.

So, put simply, content research is the process of finding out the best answers you can to the questions you have about how your content works for the people who are using it.

It’s about digging deeper than the stats we all know and love to ask non-standard questions in ways that enable you to understand, trust, and act on the answers you get.

Why would you do it?

From where I’m standing, having embarked on the content research path, it’s easy to see the benefits. For me, the question is, “Why wouldn’t you do it?”

But if you’re basically happy with what your analytics are telling you, there are probably quite a few reasons not to bother.

Chief among them is the fact that there aren’t any benchmarks here – we’re talking about bespoke research designed by you, to answer your questions about your content. For some, that’ll be both intimidating and time-consuming.

Then there’s the fact that this kind of research forces you to challenge your own perceptions of how your users approach your content, consume it, and act on it.

In my case, I was deeply certain that those print buttons on my articles were a complete waste of space, a distraction. So deeply certain, in fact, that this was something I probably wouldn’t have questioned unless I’d actually witnessed real users using them as part of some other content research I was doing.

Another likely objection is the cost. Because if you’re thinking along traditional digital content lines, you’re probably thinking I’m talking about UX-style user research, and that costs big bucks, right?

In fact, we’re not talking about UX-style user research. And the things we did to understand how our content was performing for our users didn’t cost us much – most of them were free.

But the content research we completed has already delivered a great deal:

  • We understand our users, and the place our content has in their lives, much better.
  • We’ve been able to quash internal stereotypes and misperceptions of the kinds of people who use the site, and what they’re doing there.
  • My team has exploded its own long-held, subconscious perceptions of how “people” use “content” “online”.
  • Perhaps most unexpectedly, we’ve avoided making several mistakes that would have reduced the usefulness of the content – and by extension, the site – for our users.

So, if you’re at least curious about content research, let’s have a look at the process of creating a rough research plan that might start to help you get a better handle on how your content works.

Creating a research plan

Three things prompted me to start researching our content.

The first was that I kept coming up against ingrained, company-centric perceptions of the site’s users that, as an outsider, I was skeptical of. The stories about users varied greatly depending on who I spoke to. Surely they couldn’t ALL be right?!

At the same time, I understood that the people my content was serving knew the product and the industry far better than I did. It was nice for me to add content to a topic section on the site, and assign it a location in a content hierarchy … but I had to rely on others to tell me if that location was right.

It felt like I needed a much better understanding of the processes the content explained, and of the people using it.

Finally, I thought I had a basic understanding of how people used our content, based on my broader understanding of how users generally use any web content. But there were a few gaps I thought I needed to fill.

So I started writing down the stories I heard about the site’s users from different parts of the business. I started writing down questions about tagging, hierarchy and categorization, page layouts and pageflow logic. I started writing down the things I thought needed to change, based on my understandings and assumptions.

So I had a list of questions. That was step 1.

Step 2 was to work out how I could find answers to those questions. This was a long and convoluted process that this series of articles hopes to shorten for you.


The thing to understand here, though, is that you won’t get the answers you seek from any one stats product or testing service.

You can’t farm this thinking out to on-demand, prepackaged services. It’s like asking a supermarket checkout clerk for cooking tips. They just aren’t set up to answer those types of questions.

You have to get creative, and you have to be prepared to change tack – and tools – as you find out new things.

Step 3 is to put together a loose plan that outlines what you’ll do to answer the questions you have. We didn’t start big. In the beginning, I really only had three questions that I wanted answers to:

  • Which, if any, of the many landing page layouts on our site was most effective?
  • Users didn’t need that tag cloud on the right-hand side of the article page, right? It was so 2001.
  • No one was ever going to click on our FAQ panels, which show 10 topic-relevant FAQs randomly selected from pools as large as 20, were they? Surely we could ditch that, couldn’t we?

Looking at these questions, we thought the quickest way to answer them was to get some good click tracking software installed and have a look at what it told us.
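
If you’re curious what lightweight instrumentation like this can look like, here’s a minimal sketch that fires a custom analytics event whenever someone clicks a tag-cloud link or an FAQ panel. It assumes a Google Analytics 4 property with gtag.js already loaded on the page, and the data-track attributes and event names are hypothetical – this isn’t the tooling we used (more on that next week), just an illustration of the kind of signal click tracking can capture.

```typescript
// A minimal click-instrumentation sketch (illustration only, not the setup we used).
// Assumes Google Analytics 4 (gtag.js) is already loaded on the page, and that the
// elements being measured carry a hypothetical data-track attribute, e.g.
//   <a href="/tags/invoicing" data-track="tag-cloud">invoicing</a>
//   <a href="/faq/123" data-track="faq-panel">How do I…?</a>

// gtag.js exposes a global gtag() function; declare it for TypeScript.
declare function gtag(
  command: 'event',
  eventName: string,
  params?: Record<string, unknown>
): void;

document.addEventListener('click', (event) => {
  // Ignore clicks that don't originate from an element (e.g. the document itself).
  if (!(event.target instanceof Element)) return;

  // Walk up from the clicked node to the nearest element we've marked for tracking.
  const tracked = event.target.closest<HTMLElement>('[data-track]');
  if (!tracked) return;

  // Record which feature was clicked, and on which page, so we can later ask
  // questions like "does anyone actually use the tag cloud?"
  gtag('event', 'content_feature_click', {
    feature: tracked.dataset.track,       // e.g. "tag-cloud" or "faq-panel"
    page_path: window.location.pathname,  // the article the click happened on
  });
});
```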

You’re probably thinking that’s one heck of a “loose plan”. That’s my point: to kick off your content research adventure, you just need a few useful questions, and a decision on how you’ll try to get the answers.

That’s where the magic begins…

Next week, we’ll look at click tracking tools. We’ll see what click tracking is good for, and learn its limits. I’ll also show you how it answered our questions … and posed a few others.

But in the meantime, let us know in the comments what kinds of content research you’ve done – and on which kinds of sites.