
Thursday, 18 January 2018

How can you learn if you don’t test?

Even in my few years working in fundraising and marketing here, I’ve noticed a real shift in mindset from the very top. We’ve stopped assuming we know everything! Instead of believing we’re right without question, we increasingly base our decisions on insight: we’re listening to our supporters more than ever, and they inform the direction we move in.

Stop being so vague and give us an example

Ok, ok. I’ll cut to the chase. I joined a Digital Spoke in 2016 to help design and develop a personal account for our supporters to use. Without going into the exact details of the product, it changed our team’s way of working beyond just this project. We worked closely with the Digital team, learned a huge amount about taking an agile approach to tasks, and gained hands-on experience with a whole range of digital tools!

But I don’t want to talk to you about all of that. As exciting as it was, I think the way our team worked independently post-Spoke is the most impressive part of the project.

What’s so great then? Spit it out already!

The first thing we did was to maintain our daily stand-ups, which helped keep everyone focused. Each morning we huddled around a screen showing all of the bite-size cards on our Trello board and assessed their progress. Reviewing this frequently kept things front of mind while also holding people accountable for completing tasks.

Both before and during our series of recruitment communications we held testing plan meetings where we could vote for which ideas we wanted to test using an effort vs. impact matrix. Where possible we’d go for what we felt were low effort, high impact tests! By holding these meetings in between each recruitment drive they also acted as an evaluation of the previous tests which was very useful context for planning.
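To make that prioritisation concrete, here’s a small Python sketch of an effort-vs-impact sort. The test ideas and 1–5 scores below are invented for illustration (they’re not our actual backlog; our scores came from team votes):

```python
# Hypothetical test ideas scored by the team: effort 1-5 (lower is cheaper),
# impact 1-5 (higher is more valuable). All entries are made up.
ideas = [
    ("Shorten the registration form", 2, 5),
    ("Rewrite the welcome email subject line", 1, 3),
    ("Rebuild the account dashboard", 5, 4),
    ("Add a progress bar to sign-up", 2, 4),
]

# Favour low effort, high impact: sort by impact descending, then effort ascending.
prioritised = sorted(ideas, key=lambda t: (-t[2], t[1]))

for idea, effort, impact in prioritised:
    print(f"impact={impact} effort={effort}  {idea}")
```

The tests at the top of the sorted list are the “quick wins” an effort vs. impact matrix is designed to surface.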

What about when people signed up?

During the Spoke we gained a lot of experience at creating and manipulating dashboards and custom reports on Google Analytics. This allowed us to understand in real time how users were interacting with the product, what was working well and where we should be focusing our attention. Of course we sought expert advice for more difficult queries, but for the most part we were self-sufficient which was very encouraging.

Some of the trends we spotted through analytics caused us to implement real time optimisation tests. For example, when we spotted a dip in the conversion rate on the registration form we A/B tested different wording and infographics using Optimizely in an attempt to bring this up again.
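As a rough illustration of how an A/B result like that gets read, here’s a minimal two-proportion z-test in Python. The visitor and conversion numbers are invented, and in practice Optimizely computes significance for you; this just shows the underlying idea:

```python
import math

def ab_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test: is B's conversion rate significantly different from A's?"""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, p_value

# Invented example: original form wording (A) vs reworded form (B).
p_a, p_b, p_value = ab_test(120, 2400, 156, 2400)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  p-value: {p_value:.3f}")
```

A p-value below the conventional 0.05 threshold suggests the difference is real rather than noise; otherwise you keep the test running or call it inconclusive.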

How did people know what you were doing?

One of the key principles of agile working is to share progress and learnings with stakeholders and the wider organisation. Although we did this regularly with key stakeholders, we wanted to improve how we communicated to the wider organisation. It required an adjustment to how we approached the project but it definitely helped improve our skills and confidence at presenting. We’d give engaging updates every two weeks or so where anyone could show up and listen. The best thing about them was the questions we’d get asked which provoked thoughts and ideas we hadn’t previously considered.

To make this more accessible, we’d record the presentations and upload them to our Wiki page which increased our reach. This also allowed us to re-watch each one retrospectively to hone and improve our public speaking skills.

So what’s happening next?

This is a really exciting time to be working for Cancer Research UK. The direction we’re moving in as a charity is inevitably becoming more digital, and we’re facing this challenge head on by devolving skills, knowledge and expertise into the product and marketing teams. The success of this Spoke is just one example. Carrying these skills forward post-Spoke and continuing this way of working has taken the project from strength to strength – and this process is becoming more prevalent in all areas of the charity.

Graham Goodings
Digital Producer


Tuesday, 4 July 2017

10 top tips for awesome remote usability testing


Usability testing is fundamental to how we do digital projects at Cancer Research UK, helping us towards our ambitious target that by 2034, 3 in 4 people diagnosed with cancer will survive. Testing our ideas out early and often on the people who use our products and services saves a lot of time and money and gives our users what they need in the most intuitive way possible.

We do all sorts of different types of testing (A/B testing, guerrilla testing and lab testing, to name a few!). But what if you don’t have the time or budget to meet people face-to-face? That’s where remote usability testing comes in. With remote usability testing, you can share what you're testing with people in different locations and ask them questions about it. It’s a great way to talk to people who may not have the time to come in and see you, and it means you can speak to people who don’t just live near where you work. As a national charity, this is especially important to us at Cancer Research UK.

We recently did some remote usability testing on a new version of our publications website, which allows health professionals to order leaflets and other resources to help educate the public about cancer.
Here are 10 top tips that helped us run a great session:

1. Assemble an ace team

User testing is a team sport so it’s useful to get a small team of people together to help organise all the tasks you’ll need to do. Our team included me (UX designer), Hayley (Product manager) and Becky (Digital producer). We were all quite flexible in the tasks we picked up and all very willing to get stuck in!

2. Know your research goals

Before you start, it’s important to work out what you want to get out of your testing. For this session we wanted to know more about our users and to see whether they could complete specific tasks when ordering resources. From these research goals we created a list of questions about participants’ day-to-day working lives, plus some more specific task-based questions where we asked participants to use a prototype of the new website so we could test whether they could use it.

3. Recruit the right people

This is really important for getting meaningful results. Luckily we had a list of active users from the old website who had already signed up to hear from us, so we emailed them to ask if they’d be willing to help us improve the new site. A few people responded, and we were able to get 6 willing participants (6 is usually a good number to aim for: enough to start seeing patterns without too much repetition). There are lots of other ways to find people, though. In other projects we’ve asked colleagues and friends if they know the kinds of people we’re looking for, and we’ve put shout-outs on social media.

4. Write the script

Based on your research goals, you’ll know what you want to get out of the session. Now you can write the questions and tasks that will help you get the answers you need. For each task, think of a goal, the question you will ask and what a successful result looks like. Here’s an example from our session:

5. Do a dry run

Sometimes the questions you’ve written will make total sense on paper but won’t work when you ask them on the day. To minimise the risk of this, practise all your questions and tasks on a member of your team who isn’t too close to the project. This is a great way to find out whether the questions make sense to someone else, and an opportunity to get your timings right.

6. Prepare the tech in advance

This bit is super-important. It’s very easy for a perfectly planned testing session to be cruelly foiled by a dodgy microphone or an out-of-date screen-sharing subscription. For this test we used join.me to share our screens with participants. We made sure we tested both the software and our own hardware (computers, microphones etc.) the day before we did the real thing.

7. Meet with the team to make sure everyone knows their roles

Once you’ve assembled your team, make sure everyone understands their role, both in preparing for the tests and on the day itself. You’ll need to work out who will be facilitating (asking the questions) and who will be taking notes. (Pro tip: if possible, the facilitator shouldn’t be someone who’s been too involved in the design, as this can bias the testing.) We used Google Sheets for note taking, as it’s the easiest way for the whole team to share and add their notes.

8. Invite the whole team along

Invite anyone involved in the project, including your development team and any stakeholders you have. This is incredibly useful as the project progresses, because you’ll all have a shared understanding of your users and their challenges.

9. During the session – Don't forget the notes

Now’s the interesting part, where you actually get to hear from your users! Make sure the note taker captures both qualitative notes (describing how the person approaches each task) and quantitative ones (usually a score between 1 and 3 for each task, based on how easily the participant completed it). Both will be very useful after the session for understanding where the pain points were.
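As a hypothetical sketch of what that quantitative summary can look like afterwards, here are a few lines of Python that rank tasks by their average score. The task names and scores are invented for illustration, not from our actual session:

```python
# One 1-3 score per participant per task (3 = completed easily, 1 = struggled).
# All task names and scores below are made up.
scores = {
    "Find a leaflet":       [3, 3, 2, 3, 3, 2],
    "Add it to the basket": [3, 2, 3, 3, 3, 3],
    "Complete the order":   [2, 1, 2, 1, 2, 2],
}

# Rank tasks by average score: the lowest-scoring tasks are the pain points.
ranked = sorted(scores.items(), key=lambda kv: sum(kv[1]) / len(kv[1]))

for task, task_scores in ranked:
    avg = sum(task_scores) / len(task_scores)
    print(f"{avg:.2f}  {task}")
```

The tasks at the top of the list are where your qualitative notes will usually explain what went wrong.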

10. After the session – get into the detail

Well done, you made it through all those interviews in one piece. Take a well-deserved break, but not for too long. Now’s your opportunity to take stock of what happened. Review the session with your team, look through all your qualitative and quantitative notes, and make a list of what can be improved. Don’t leave it too long, or you may start to forget what happened in the session.

Remote usability testing – one piece of the testing puzzle

Remote usability testing is a great way of testing your ideas early, and in this project it helped us discover insights we would never have uncovered otherwise. But it’s just one of the many tools we use at Cancer Research UK to understand our supporters better, making them happier and helping us beat cancer sooner.