19th December 2016 Rebecca Moore

The Architecture of Digital Experimentation #DMA

Tom Beardshaw, digital marketing expert and a mentor for the Digital Marketing Academy, offers some guidance on the technical architecture needed for digital experimentation.

Much of my work this year as a Mentor within the CultureHive DMA 3.0 project has been about helping fellows to understand an experimental approach to digital marketing, and to establish the technical infrastructure they need to run – and assess – the experiments they want to carry out. Here are the key components of that technical architecture, and the logic of an experimental approach.

Hypotheses
At the heart of many conversations about digital marketing work are hypotheses – your ideas about what will work. Your hypothesis might be a particular way of designing a webpage, an email, an advertising campaign, or a media promotion. When you formulate a hypothesis, there are two elements you’ll need to detail: firstly, the alternative that you think will work better, and secondly, the criteria that you expect it to improve.

Articulating a theory of change and unpacking your ideas so that you can identify their precise, testable components allows you to test them individually; e.g. “Moving the ‘Buy now’ button to this new position on the product page will increase sales”. Formulating a hypothesis is all about arriving at a proposition that you can go on to test.
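To make that concrete, here’s a minimal sketch (in TypeScript) of one way to record a hypothesis so that its testable parts stay explicit – the structure and field names are purely illustrative, not taken from any particular tool:

```typescript
// Illustrative only: a simple record shape for a testable hypothesis.
// Field names are hypothetical, not from any specific product or framework.
interface Hypothesis {
  change: string;        // the alternative you think will work better
  metric: string;        // the criterion you expect it to affect
  expectedDirection: "increase" | "decrease";
  deadline: string;      // when the experiment ends (ISO date)
}

const buyNowTest: Hypothesis = {
  change: "Move the 'Buy now' button above the fold on the product page",
  metric: "Completed purchases per session",
  expectedDirection: "increase",
  deadline: "2017-01-31",
};
```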

Goals
The foundation of a digital marketing experiment is working towards meaningful goals. Charles Duhigg, in his book Smarter Faster Better, talks about two types of goals that people can articulate in order to improve performance.

1. STRETCH goals, which encapsulate the bigger picture
2. SMART goals, which specify the concrete steps that contribute to STRETCH goals.

When designing an experiment, it’s useful to develop a SMART goal that specifies the criteria for testing a hypothesis, sets out how it will be measured, sense-checks that it is achievable, and specifies a time frame for the experiment – for example, “increase newsletter sign-ups from the homepage by 10% within six weeks”.

Goals also refer to a particular setup in your data analytics which allows you to check your data against conversions. If you’re able to set up goals in, say, Google Analytics, you will then be able to check the performance of your experiment against them.
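Goals in Google Analytics are configured in the admin interface rather than in code, but event-based goals depend on your pages sending events. As a rough illustration only – assuming the older analytics.js library and a hypothetical newsletter form – a conversion event might be sent like this:

```typescript
// Sketch: recording a conversion as a Google Analytics event via analytics.js.
// The goal itself (e.g. an event goal matching this category/action) is then
// set up in the Analytics admin interface, not in code.
declare function ga(command: string, ...fields: unknown[]): void;

function trackNewsletterSignup(): void {
  // Category, action and label are your own naming choices.
  ga("send", "event", "Newsletter", "signup", "footer-form");
}

// "#newsletter-form" is a hypothetical element id used for this example.
document
  .querySelector("#newsletter-form")
  ?.addEventListener("submit", trackNewsletterSignup);
```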

Experimental design
Once you know what you want to do, and how you’re going to measure it, you’ll need to design an experiment that can test your hypothesis effectively. Because people often react differently at different times, comparing before-and-after figures can be misleading; the most effective approach is to set up an A/B test, which provides a proportion of your audience with a variation on your standard digital offering.

Email management platforms such as MailChimp have A/B testing baked into the product, so it’s easy to set up two versions of your newsletter and send each to a proportion of your audience. With simple websites, you can do this easily with Google Analytics experiments; with other configurations, you’ll benefit from dedicated A/B testing software such as VWO or Optimizely. These will help you configure your experiments, and provide data reporting to help you make decisions when your experiment comes to an end.
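Those tools handle the mechanics for you, but it can help to see what “sending a proportion of your audience to a variation” actually involves. Here’s a minimal, illustrative sketch of hash-based bucketing, which assigns each visitor consistently to version A or B – it isn’t any particular product’s implementation:

```typescript
// Minimal sketch of A/B bucketing: each visitor is assigned to variant "A" or
// "B" based on a hash of their identifier, so they see the same version on
// every visit. Purely illustrative, not any specific tool's implementation.
function hashString(input: string): number {
  let hash = 0;
  for (let i = 0; i < input.length; i++) {
    hash = (hash * 31 + input.charCodeAt(i)) >>> 0; // keep as unsigned 32-bit
  }
  return hash;
}

function assignVariant(visitorId: string, proportionB = 0.5): "A" | "B" {
  // Map the hash to [0, 1] and compare against the share of traffic
  // you want to send to the variation.
  const bucket = hashString(visitorId) / 0xffffffff;
  return bucket < proportionB ? "B" : "A";
}

console.log(assignVariant("visitor-12345")); // "A" or "B", stable per visitor
```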

Data-driven decision making
Probably the most exciting part of this approach is the ability to begin making decisions based on data, instead of opinion. Once you’ve tested your hypothesis against your carefully measured criteria, you’ll have results that (usually) show you an improvement in conversions, a decrease, or no change. Whichever way your data moves, you’ll have a solid basis for deciding whether to implement the idea you’ve tested, or whether to move on and focus on something else.
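Before acting on a difference, it’s worth asking whether it’s bigger than random variation could explain. A/B testing tools report this for you; purely as an illustration of the underlying idea, here’s a sketch of a two-proportion z-test on made-up conversion figures:

```typescript
// Rough sketch: compare conversion rates from an A/B test with a
// two-proportion z-test. Dedicated A/B tools report this kind of figure for
// you; this only shows the underlying idea. The numbers below are made up.
function zTest(convA: number, totalA: number, convB: number, totalB: number): number {
  const pA = convA / totalA;
  const pB = convB / totalB;
  const pooled = (convA + convB) / (totalA + totalB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / totalA + 1 / totalB));
  return (pB - pA) / se;
}

const z = zTest(120, 4000, 155, 4000); // hypothetical results: version A vs B
// |z| > 1.96 corresponds to roughly 95% confidence that the difference is real.
console.log(z.toFixed(2), Math.abs(z) > 1.96 ? "significant" : "not significant");
```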

Whatever the outcome, you’ll be in a position to move ahead confidently, instead of holding discussions based on personal preferences, which all too often result in changes being made on the opinion of the loudest, most persuasive or most powerful voice in the room. Data-driven decision making lets you move ahead with confidence, grounded in knowledge of how your users have actually responded to the changes you’re considering.

It’s this data-driven decision making that lies at the heart of the experimental, iterative approach that the fellows of the DMA are learning, and it’s always a challenge of the programme for fellows to grasp it conceptually, set it up within their systems, and then put it into action to change the culture of their organisation.

 

Header Image courtesy of Big Pit National Coal Museum © National Museum Wales
