How we built Option Grid

A product design case study
15 minute read

A little context

Choosing a treatment option following a diagnosis can often be stressful, worrisome and confusing. Very often, patients are left feeling they haven't been given the full picture, or that they are being pushed towards a treatment they aren't comfortable with. The Option Grid project aims to change this by empowering patients with the knowledge they need to make an informed choice.

The Option Grid project aims to empower patients with the knowledge they need to make an informed choice of healthcare treatment.

What is an Option Grid?

An Option Grid is a simple table of information where treatment options for an illness or condition run along the top, and frequently-asked-questions relating to that treatment run down the side. Answers to those FAQs fill the cells, according to each treatment option.
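
The structure described above can be sketched as a simple data model. This is purely illustrative; the class and field names here are assumptions for the sketch, not the project's actual schema:

```python
# A minimal, illustrative model of an Option Grid: treatment options run
# along the top, frequently asked questions down the side, and each cell
# holds the answer for one (question, option) pair.
from dataclasses import dataclass, field


@dataclass
class OptionGrid:
    condition: str
    options: list[str]       # column headers: the treatment options
    questions: list[str]     # row headers: the FAQs
    answers: dict[tuple[str, str], str] = field(default_factory=dict)

    def answer(self, question: str, option: str) -> str:
        """Look up the cell for a given FAQ and treatment option."""
        return self.answers[(question, option)]


# Example usage; "Option B"/"Option C" and the cell value are placeholders.
grid = OptionGrid(
    condition="Plaque Psoriasis",
    options=["Acitretin", "Option B", "Option C"],
    questions=["How quickly will it start working?"],
)
grid.answers[("How quickly will it start working?", "Acitretin")] = "2-4 weeks"
```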

The take-away on the original site

Let's say a person has been diagnosed with Plaque Psoriasis and is looking at treatment options. There are three medicines available, but which one is going to work best? One deciding factor might be how quickly the medicine will start working, so they look down the list on the left for that question. The Grid shows that Acitretin should work the fastest, which might help steer the decision towards that treatment option.

Using the static printed Option Grid

The project was conceived by The Dartmouth Centre for Healthcare Delivery Science, and while working at Pumkin in 2015, I led a team that was asked to bring the digital experience up to date.

The brief

Two distinct audiences benefit from the Option Grid project: The system serves both as a decision-aid for patients, and as a research tool for the college staff.

Two distinct audiences benefit from the Option Grid project

The concerns for the patient were;

  • It wasn't inviting, engaging or accessible enough for the patient
  • There was virtually no contextual information or explanation of the take-away

The concerns for the research team were;

  • Creating and maintaining the Grids was a slow and cumbersome exercise
  • There was no way of tracking the use of the Grids, which options patients were selecting, and whether or not they were helping in a meaningful way

The new system therefore needed to;

  • Enable patients to confidently select a treatment option
  • Speed up the process of creating the Option Grids
  • Gather reliable data about their use

Our approach

We ran the project with four people: Alex (me), a digital designer; Ben, a front-end developer; Gints, a back-end developer; and Jay, our project manager.

I like to run projects with a mix of disciplines involved from the start, especially developers, who can bring great ideas and insights to a brief, as well as being able to help steer a solution that's feasible to build.

We also had frequent contact with Dartmouth; on their team was an experienced GP, two medical researchers, and a project coordinator.

Those involved were spread across Europe and the US, with our team located in London and Latvia. Whilst we talked a lot on Skype, we also relied heavily on Confluence for documenting discussions, and used decision boards to allow everyone across both teams to have their say; this was really helpful when working across different timezones. We also used JIRA for our Agile workflow, and Bitbucket for version control.


Defining personas

Our first step was to identify who we were designing for. We grouped our users into two camps according to which area of the system they'd be engaged with;

The Decision Aid

  • Patient
  • Carer
  • Doctor or medical practitioner

The Research tool

  • Data researcher
  • Content editor

To define the patient and carer personas we spoke to friends and neighbours, asking questions about previous experiences in choosing treatments, whether people had felt actively involved in selecting their treatment, how informed they felt in making a selection, their levels of trust in medical practitioners and the level of influence of healthcare insurers.

We turned to Dartmouth for help in defining the Researcher and Doctor personas; as many of their team and contributors were active medical professionals, we were able to get a detailed insight into these roles.

Using the decision aid we had Raajhu...

Raajhu has been diagnosed with carpal tunnel syndrome. He's worried and is confused about the treatment options available, which range from surgery to simple exercises. He needs help to decide which is going to work best for him.

  • 52 years old, lives in the north of England
  • average technical ability
  • skeptical that patients are always given the full picture when it comes to treatment

...using the content editor we had Megan...

Megan is a content editor. She has a lot of data input to do and wants the most efficient and accurate way of doing it. Misinformation could be disastrous, so she needs to be diligent, and the system needs to support her in this.

  • 28 years old, lives in Boston and Paris
  • good technical ability
  • has to produce accurate work to tight deadlines

And using the research tool, Arianne.

Arianne is a medical researcher at Dartmouth. She wants to find out which aspects (eg. side-effects, time to recovery, long-term risks) are considered the most important factor in a patient's choice of treatment.

  • 41 years old, lives near Boston, US
  • good technical ability
  • loves data; the more the better

User stories

We then came up with some storyboards to illustrate the existing pain points.

The proposed solution

Together with the Dartmouth team we developed a concept for an interactive journey that would guide the patient through the available treatment options step-by-step, and record their activity along the way.

On the other side, we would create an interface to be used by project staff to expedite the creation of Option Grids, and to allow exploration of the data recorded from the interactive.

Design studio

User flows

We focused on two main journeys, starting with the interactive decision aid. We needed to create a journey that would feel easy and manageable, but thorough enough that the patient leaves feeling properly informed. For legal reasons the system couldn't make a recommendation for them, but it should present their choices back to them in a way which would help them select a treatment on their own.

The editor journey needed to be all about speed and accuracy. The interface for adding content should be as intuitive as possible, as well as allowing for version control and collaboration with other editors.

The researcher wanted to be able to easily look up data gathered from use of the interactive. They'd want to be able to spot patterns and understand whether the decision aid was making a meaningful difference to patients. This meant we'd have to consider these needs while we designed the interactive, and make sure the system could capture the information the researchers needed without putting too much of a burden on the patient.
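
To make that concrete, here's a minimal sketch of the kind of per-step record such a system might capture: which option the patient chose at each question, and how important they rated it. The class and field names are assumptions for illustration, not the project's real data model:

```python
# Illustrative per-step record of a patient's journey through a Grid,
# the raw material a researcher would aggregate to spot patterns.
from dataclasses import dataclass, asdict


@dataclass
class StepResponse:
    grid: str            # which Option Grid, e.g. "Carpal Tunnel Syndrome"
    question: str        # the FAQ shown at this step
    chosen_option: str   # the treatment the patient selected
    importance: int      # the patient's rating of how much this question matters


# Example usage: one recorded step, serialised for storage or analysis.
r = StepResponse("Carpal Tunnel Syndrome", "Will I need surgery?", "Exercises", 5)
print(asdict(r))
```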

Paper prototypes

We started with wireframes on paper, exploring them within our team and inviting comments from Dartmouth.

Sketching the interface helped us ensure that we were capturing all the needs of the Dartmouth research team

Clickable wireframe prototype

Dartmouth were satisfied we'd captured the main requirements in the paper prototypes, but paper wasn't telling us much beyond that: we needed to be able to test the length of the journey and proper interactions on the page.

Using Axure, we built a full journey as a low-fi clickable wireframe. We did the same with the editing / research tool, running the two designs in parallel.

The wireframe for the content editor

We put the prototypes in front of the Dartmouth team first, and then invited a few of our neighbours from around the building to come in and give us some feedback.

Findings from wireframe testing

The response to the wireframes was encouraging; there were no major issues with sequence or navigation, but there were concerns around the amount of information shown on the page at once. There were also two actions per page; the patient was asked to choose a treatment option, and indicate how important this particular question was to them. It was a lot to take in.

A busy page and two required actions led to confusion as to how to move forwards in the journey

Another concern was with the final review page in the journey; it didn't improve enough on the standard Grid format, and it wasn't easy to immediately draw a pattern in the treatment options that had been selected at each step.

Moving to a high-fidelity prototype

To improve the heavy pages, we did two things:

  • We split the two in-page actions into separate steps
  • We looked at ways to structure the layout more distinctly to add clarity through better grouping and separation

We developed the design from work that we'd done early in the project to establish a visual identity and tone of voice

Colour was an important tool in achieving this structure, and so we decided to move to a high-fidelity prototype using Illustrator and InVision, developing the design from work that we'd done early in the project to establish a visual identity and tone of voice.

Colour and better structure helped to bring clarity

The importance rating then became a step of its own, immediately following the question it related to. This meant we had much cleaner pages, but we did have more steps in the journey.

We also revised the final Grid page to give it a more coherent layout, more engaging visual and better feedback on the options the patient had selected during the journey.

Final review page, in wireframe
After a complete redesign, information was consolidated, and choices were more clearly visible

Findings from hi-fi prototypes

With a more fully-featured and 'real' prototype, we started to get much more detailed feedback from our testers.

We were still getting some negative feedback to do with the flow of information on the page; though we felt we'd achieved the right hierarchy of information, our early versions didn't seem to read right.

We tried a few different layouts and again asked for input from our testers, and a winner quickly emerged; one which worked by combining the explanation with the title for each treatment option, and presenting the entire element as a clickable object.

Combining the title and explanation as a clickable object made things more intuitive for the user

Data vs. Experience

We also still had an issue with the length of the journey. We knew we were asking for a lot of info from the patient, and splitting the page actions across more steps had made things feel tedious overall.

The Dartmouth Centre is a research facility, and the team was very reluctant for us to cut down or even to allow the user to skip any of the stages in the questionnaire, because this would put holes in their data and make it difficult to draw up patterns. The challenge therefore was balancing the need to collect data with trying to maintain a friction-free experience for the patient.

The Dartmouth Centre is a research facility, so one of our biggest challenges was balancing the need to collect data with a friction-free experience for the patient

Interface-wise we felt we'd already used techniques to make the journey feel more manageable; breaking down the questions into distinct steps, providing lots of progress indication, auto-advancing the page to the next element, etc.

We felt however that we could still:

  • make the language more conversational
  • add friendlier visual elements
  • consider reversing our earlier decision to split the actions into two steps

Together with the Dartmouth team, we revised the language throughout the journey, switching tedious-sounding things like "Step 11" to a more motivating "Almost done!", and adding short phrases explaining why we were asking the questions we were. We added illustrative icons to break up the repetitive parts of the interface, and optional interactive elements to add interest, like a data-viewer showing a visual representation of things like '1 in 100 people'.
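
The "1 in 100 people" data-viewer could be thought of as a pictograph or icon array. The sketch below is one illustrative way such a visual might work, using text characters in place of the real icons; it is not the project's actual code:

```python
# Illustrative "1 in N people" icon array: '#' marks affected people,
# '.' marks the rest, laid out in rows so the proportion is easy to see.
def icon_array(affected: int, total: int = 100, per_row: int = 10) -> str:
    """Return a text pictograph for 'affected in total' people."""
    cells = ["#"] * affected + ["."] * (total - affected)
    rows = ["".join(cells[i:i + per_row]) for i in range(0, total, per_row)]
    return "\n".join(rows)


# Example usage: "1 in 100 people" as ten rows of ten.
print(icon_array(1, 100))
```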

Adding more conversational language, friendlier visuals and additional interactive items added interest and reduced fatigue

Going back on our earlier decision to split the steps was a tricky move too, but we'd had specific comments from our testers about this step, and we felt that with the new question/answer design that we'd introduced, we may be able to safely bring the importance rating back onto the same page and cut down the journey length significantly. Our final testing round would tell us if we were right.

Along the way, we also revised the final display of the patient's treatment selections, the device used to rate the importance of a question, pop-ups, colours, typography, and lots more.

Meanwhile on the back-end

Working on both sides of the build and having developers right there on our team provided big benefits;

  • We'd find out really fast if we were designing anything that wasn't technically feasible
  • We made shared decisions in the design that made things simpler for the devs to build and kept things clean code-wise
  • We could try things out for real and iterate changes really quickly
  • Technical input in the design led to some really nice solutions, particularly on the admin panel

Designing the flow for capturing data

The back-end design was in many ways more involved than the front-end, and the user journey, though not as linear, was just as important. Creating Option Grids requires input from several different experts, reliability and accuracy are paramount, and in order for the team to quickly publish new Grids and maintain existing ones, the process needed to be as straightforward as possible.

The back-end user journey was just as important as the front-end one

After looking at different ways of laying out the page and sharing ideas with Gints and the Dartmouth team, we managed to come up with a design that mimicked the single-page Grid, making for a very intuitive tool for adding content into the right places. We created a prototype of this and passed it to the Dartmouth team for discussion and refinement.

A wireframe of the admin panel mimicking the Grid layout. This made it easy for editors to place content in the right place

And the working system

The back-end design and build presented numerous challenges. Among these were:

  • User access controls and on-boarding for admin staff
  • Version control of Grids
  • Management of Grids in development or under review, future Grids, and out-of-date Grids
  • Layout-adjustment tools to ensure the generated Grids always fit on a single page
  • Support for multiple translations
  • Inspection of data gathered from the interactive

Solving these issues required thorough and detailed discussion between both teams, and relied on input from everyone involved, particularly the developers.

Building the real thing

As soon as both teams were confident we had a good working prototype for both back and front-end, we moved into the build.

Our process was very much like that of the prototyping; we shared progress and working versions on a staging server as we moved forwards, and continued to respond to feedback as we went along. The Dartmouth staff jumped into the CMS the moment it was semi-functional and started adding Grid content so that early on we could test with live data and make improvements to the experience as we moved forwards.

Final testing

The complete Option Grid system was too complex to test fully at prototype stage, so we had planned for a product testing stage and reviews.

We built a focus group, casting our net far and wide and inviting responses from as broad a range of people as we could.

We then narrowed this to a base of 12 people, which included ages from 22 to 66, a person with cognitive impairments, those speaking English as a second language, those in locations from London to Vancouver, and using devices from iPads to iMacs.

We started by defining a task that we wanted our audience to be able to complete: Find the Grid for "Carpal Tunnel Syndrome" and select your preferred treatment option. We produced a questionnaire which accompanied our demo link, and asked our testers to complete this as they went along, providing comments where they could.

We recorded the testing sessions using Silverback, and collected general observations and specific comments. We chatted to some testers by Skype to allow them to elaborate on some of these, and then collated issues according to the level of concern they raised.

Recording user sessions let us see where people were clicking, where they went wrong and what slowed them down

Among the outcomes were:

  • Issues with on-boarding. Following email confirmation, testers were returned to a welcome page rather than the point at which they had left to create their account (usually a specific Grid landing page), and this was jarring
  • A couple of testers weren't sure what outcome to expect at the end of the journey, i.e. why they were being asked to complete the interactive in the first place
  • The search facility didn't allow for non-specific terms, and returned no results upon mis-spellings and typos
  • There were some issues with the language in some of the Grids; some of the pages didn't read all that well
  • The final page recapping the chosen options was still not clear enough
  • People felt the text was too large on titles and sub-headings

Preparing to launch

Following the usability feedback, we made a number of improvements; first changing the on-boarding journey so that upon account creation the user would be returned to the exact page they'd left at, with a notification confirming their account was ready to use.

We then worked with Dartmouth to introduce extra contextual information and graphics at the start of the interactive to help explain the outcome the user could expect, and to improve the flow of language within the Grids themselves.

We revised the final page once again, to make the chosen options and importance ratings clearer, and introduced the ability to jump back to a step and revise a choice.

The revised recap page made drawing a conclusion even easier

We also added smaller functional improvements like an auto-suggest feature to the keyword search. We revised the text sizes and adjusted the layout in places to make better use of the space.
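
A typo-tolerant auto-suggest like the one described could be approximated by ranking known Grid titles by similarity to the partial query. This is a rough sketch under my own assumptions (the `suggest` function is hypothetical, not the project's API):

```python
# Rough sketch of typo-tolerant auto-suggest: rank Grid titles by
# string similarity to what the user has typed, so misspellings like
# "carpel tunel" still surface the right Grid.
from difflib import SequenceMatcher


def suggest(query: str, titles: list[str], limit: int = 5) -> list[str]:
    """Return the titles closest to the (possibly misspelled) query."""
    q = query.lower()
    scored = sorted(
        titles,
        key=lambda t: SequenceMatcher(None, q, t.lower()).ratio(),
        reverse=True,
    )
    return scored[:limit]


# Example usage: the best match survives the typos.
titles = ["Carpal Tunnel Syndrome", "Plaque Psoriasis", "Hearing Loss"]
print(suggest("carpel tunel", titles, limit=1))
```

A production system would more likely use an indexed search engine with fuzzy matching, but the ranking idea is the same.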

Developing the design to resolve final issues and across different devices

After re-presenting the system back to some of our original testers, and numerous bug-fixes later, we launched the site 🎉

A living product

Since then, the system has been used by patients, doctors and carers to help people choose treatments for conditions ranging from indigestion to Parkinson's disease. It has helped expectant parents decide whether to undergo amniocentesis checks, and helped explain options for dealing with hearing loss.

Attention and visitor numbers are growing, and as a contributor to Pumkin and the main contact for the Option Grid project, I continue to work on improvements and new features. In 2016, a major update was made to the system to enable more data to be collected from the interactive. Some of these changes increased the length of the journey, so we are monitoring user behaviour and drop-out rates to see whether they have a detrimental effect on the number of users completing the journey.

We monitor user behaviour and drop-out rates to see if any changes have a detrimental effect

Since launching, the system has been put to use by clinical practitioners at the University of Oxford and at Yale, and is referred to by the NHS and The Health Foundation, among many others. Earlier in 2016, we were excited to learn that a Dartmouth student had begun a PhD based on data gathered from the Option Grid interactive.

Working in the sphere of healthcare on this project was an exciting experience. The sensitivities involved, level of accuracy and reliability required, and the need to design a highly accessible system were all welcome challenges, and working on a product which is actively making a difference in people's lives was extremely rewarding.

Alex Charlton is a digital product and service designer
