Riding the wave #DigiLab

In Emilie Barton’s second Digital Lab blog, she explains how data analysis of audience digital behaviour has helped Pavilion Dance South West (PDSW) to implement improvements to its digital work. Emilie also shares insights into her three #DigiLab experiments.

In the first stage of the Digital Lab, I was collating data on PDSW’s main groups of digital customers, which were split into:

  • Audiences who look at Live Performance
  • Audiences who look at Classes

After gaining some very useful data, I was able to build a picture of how these different groups are using our website, as well as the different traffic referrals and on-site behaviour.

General improvements have been implemented in our digital work after analysing our current digital efforts. The new developments are very exciting, simple to use and maintain, and enable further data gathering and insight into audience behaviour.

  • SEO plugin (Yoast) added to WordPress — very useful for controlling our SEO and social media links (e.g. pulling images through to sites such as Facebook).
  • Google Search Console (Webmaster Tools) launched — giving us greater visibility of search terms.
  • Increased use of Google Analytics.
  • Google AdWords — currently investigating its free advertising offer for charities.
  • Klear.com plugin — useful social media stats and a sense of how we stand against other organisations.

It is also worth noting that these developments, though useful for any marketeer, come at no cost and anyone can use them.
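
A quick aside on that first bullet: plugins like Yoast help images ‘pull through’ to Facebook because they write Open Graph meta tags (such as og:image) into each page’s head. Purely as an illustration, and not part of our actual setup, here is a minimal sketch that lists the og: tags a given page emits; the URL is just a placeholder.

```typescript
// Small sketch (placeholder URL): fetch a page and list its Open Graph meta
// tags (og:title, og:image, etc.). These are what Facebook reads when it
// builds a link preview, and what an SEO plugin such as Yoast writes for you.
async function listOpenGraphTags(pageUrl: string): Promise<Record<string, string>> {
  const html = await (await fetch(pageUrl)).text();
  const tags: Record<string, string> = {};
  // Very simple pattern: <meta property="og:..." content="...">
  const pattern = /<meta[^>]+property="(og:[^"]+)"[^>]+content="([^"]*)"/gi;
  let match: RegExpExecArray | null;
  while ((match = pattern.exec(html)) !== null) {
    tags[match[1]] = match[2];
  }
  return tags;
}

// Example usage with a placeholder address, not a specific PDSW page:
listOpenGraphTags('https://example.org/whats-on/').then((tags) => console.log(tags));
```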

Experiment 1 — Riding the wave
Experiments are also underway. After finding out that PDSW sees a spike in website engagement in January/February, I wanted to ride the wave and create a coherent digital plan to tackle my main aim: ‘to improve PDSW awareness’. For me, this has led to including the whole PDSW message and picture in all areas:

  • Emails — adding PDSW messaging into emails with designed buttons that lead audiences to different areas of our website. It also reinforces the message that PDSW is a charity with a wide remit of great work. I have summarised and evaluated the first test, which provided overwhelmingly positive click data for the designed buttons.
  • Social — adding PDSW messaging to posts which emphasise that we are a charity and that our work covers Classes, performances, Youth work, Dancing for Dementia, Parkinson’s Dance, Artist Development and much more.
  • Social — I have also separated our classes into weekly topics/themes which allow audiences to gradually absorb the whole spectrum of our classes offer.
  • Season listings have gone live.
  • Design work — along with the buttons used in our emails, I am also testing a ‘Did you know?’ campaign.

Measure: To measure this experiment I hope to improve this website stat: the proportion of traffic who look at ‘Classes’ pages who also look at ‘What’s on’ pages, currently 3%.

Experiment 2 — ‘Did you know?’ campaign
Along with ‘riding the wave’, I am designing some simple e-images which include a series of basic facts about PDSW. PDSW holds thousands of dance sessions and provides opportunities to the community all year round. Releasing these facts will enable our audiences and the general public to understand us better.

  • Designed e-images with PDSW branding which will include ‘Did you know?’ facts about PDSW, to be used mainly on Instagram, Facebook, email and Twitter.
  • Posts on social media to link to different parts of the website.
  • The e-image template has been designed and set up.
  • Planning a schedule and strategy for release.

Measure: I hope to see an improvement in traffic to areas of the site, other than ‘Classes’ and ‘Live Performance’.

Experiment 3 — Home page
My last experiment requires further data gathering which will enable us to see how our home page is being used in more detail. Once further information comes in, I plan to make some small copy and image based edits which will reflect the data that I am gathering now.

  • Google Tag Manager has been implemented and synced with Google Analytics. I have applied a trigger and tag to our home page buttons, so data now feeds into Google Analytics telling us which buttons get clicked and how often (see the sketch after this list).
  • I hope this information will provide insight into what content audiences are interested in and what content needs a bit of development.
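
For illustration only, here is a minimal sketch of how this kind of button-click tracking could be wired up directly in page script with gtag.js, instead of through the Tag Manager interface; the CSS selector and event names below are hypothetical stand-ins rather than our real setup.

```typescript
// Minimal sketch (hypothetical selector and event names): send a Google
// Analytics event each time a home page button is clicked, so the clicks
// show up in Analytics reports. Assumes the standard gtag.js snippet is
// already loaded on the page.
declare function gtag(command: 'event', eventName: string, params: Record<string, string>): void;

document.querySelectorAll<HTMLAnchorElement>('.home-page-button').forEach((button) => {
  button.addEventListener('click', () => {
    gtag('event', 'homepage_button_click', {
      event_category: 'Home page buttons',                   // groups these clicks together
      event_label: button.textContent?.trim() ?? 'unknown',  // records which button was clicked
    });
  });
});
```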

Measure: I plan to compile stats on the current use of the home page. Once I have implemented some changes I will compile the same data again and look for any improvements.

The Digital Lab is proving very beneficial to those looking at their digital efforts as a whole picture. I hope to keep gathering as much data as possible, for a continuous understanding of how our audiences are behaving, which is crucial when developing a new digital strategy. I also hope to provide some data-backed evidence of what works for our audiences and their engagement.

Read Emilie’s first #DigiLab blog — Audience data research #DigiLab.

Image courtesy of Artsadmin. Passage for Par, Rosemary Lee (Groundwork 2018). Produced by Artsadmin and Dance Republic 2. Photo by Graham Gaunt.

Reflections on #DigiLab

Fresh from completing the Digital Lab, Rachael Williams from the British Library takes a few minutes to look back on her top five takeaways from the Fellowship.

Rachael has been catching up with the amazing people who work at the British Library to start a mini Curators on Camera series. Follow the British Library on Facebook, Twitter and Instagram to see, in true Blue Peter style, one (or two) videos the team and Rachael made earlier — going live in Spring 2019.

Read Rachael’s two previous #DigiLab blogs — Finding that golden nugget among a hoard of treasure #DigiLab and To takeover or not to takeover? #DigiLab

Keeping the dialogue open #DigiLab

In her second #DigiLab blog, Ruth Selwyn-Crome from University of East Anglia (UEA) reflects on the key takeaways she’s gained from recent AMA training events and the impact these have had on her work.

One of the benefits of taking part in the Digital Lab has been access to the online workshops and being invited to attend AMA events. These would not have been on my radar in my current role in a higher education institution.

I’ve really appreciated the opportunity to hear about thinking from the arts and cultural sector and have been pleasantly surprised by synergies in thinking and practices. Many of the events offered to the public by UEA are free of charge — there’s no financial gain — but we strive to attract and encourage as wide an audience as possible to hear about brand new research in an accessible way. Purpose and aims shared by Digital Lab colleagues often resonate.

Three months on from blog no.1 and I’ve participated in:

  • Two online workshops: Producing a Podcast and Content is All
  • Digital Marketing Day — Future Now

I’ve written:

  • Two blog posts for UEA’s Alumni newsletter (open rate c. 15K)

Uploaded:

  • Two new memory boxes onto the Gig History site
  • And one brand new venue list to publicise and play with
  • AND taken over the foyer of the wonderful Regent St Cinema for a UEA London lecture: Pop and the Disabled Body

A project with an original completion date of December 2017 is still attracting interest from all areas of the university — with requests for The Gig List publication for visitors and invitations from University partners like the East Anglian Film Archive to commission a short film for UEA students’ Do Something Different week. When thinking about the brief to the students, I was reminded of the original aim for the project: to create an exciting, mysterious, online venue.

There’s been some serious distraction (see above) from setting up the test website. But UEA Gig History now has a domain and a WordPress site, together with a special template. Tester gigs have been identified, material has been gathered, and my mentor is working on setting up some interesting widgets.

I had a number of takeaways from Future Now, which I’ve shared with colleagues here at UEA. The opening keynote by Anne Lise Kjaer really hit home: being a good storyteller isn’t enough these days — you have to keep the dialogue open. The Gig History project was designed to be a short, fun campaign within a temporary online mini-exhibition. There was never space for true interaction, and the new website should — ideally — allow for this functionality.

Another key takeaway was from the wonderful Lucy Sinclair from the Royal Opera House: that everyone in a modern organisation should be digitally literate. As someone who has always been first in line to try new methods of communication (I still believe I was one of the first people at UEA to have a Twitter account…) there’s always been that sneaky suspicion that it’s not “my job”, so Lucy’s words were heartening.

The inclusive digital experiences session with Robin Christopherson was also incredibly inspiring. The key takeaway was that everyone is (almost) on a level playing field when it comes to new technology. We mustn’t associate accessibility with disability as if it were something outside our own experience. I will bear this in mind when designing those first pages of the new site.

Suhair Kahn from Google Arts and Culture was last to speak. The scanned Life magazines of the ‘lifetags’ project caught my eye. I could imagine the gig archive scans being tagged and searchable. Would that be too simple?

If nothing else I need to make one small, measurable experiment at a time. But for now, I’m excited to see what my mentor makes of the theme I’ve chosen.

Read Ruth’s first #DigiLab blog Shrinking changes for the Sex Pistols #DigiLab

Image courtesy of University of East Anglia (UEA). Photographer Mark Hodgson via University of East Anglia (UEA) — The Passions performing in 1981. 

Everything is an experiment #DigiLab

Rachael Norris explains how her experience of the Digital Lab has been a lesson in being adaptive and responsive to changing information.

My initial idea for a digital experiment was quite ambitious in terms of time and resources — and it turns out it wasn’t exactly an experiment either.

Working with my mentor Tom, I set a stretch goal for my experiment. This means that I can never fail. As long as I am working towards my stretch goal — even if something doesn’t work or goes wrong — I am able to learn and adapt from the information I gain.

My main goal, or stretch vision, is to meet people new to the theatre and grow a relationship with them online by giving people an experience that motivates them to connect with Octagon by liking us on Facebook or following us on Twitter or Instagram. This element of my experiment won’t change, however I reach this end goal.

The Setting Up Scrappy Experiments online workshop taught me that my experiment might look like a series of experiments rather than one experiment, and that it must have measurable outcomes.

To achieve my end goal I need to ask questions about what I want to learn from the experiment.

What I want to learn from this experiment is which methods are most effective at prompting people new to us to engage with us on Facebook, Twitter or Instagram. I can measure this by tracking the metrics of how many people like/follow us over a certain time period and how many people use a certain #hashtag so we can know what prompted them.

So my experiment has become three different experiments. This means I am able to contrast the results of different methodologies and learn which method is most effective. I am going to test which method is most effective in driving engagement by asking audiences to:

  • Create something
  • Express an opinion
  • Take part

My experiments will follow the cycle of trying, reflecting, learning, planning and then trying something different. The experiments must be designed so that they could prove my hypothesis wrong. Science says that you can never prove a hypothesis is correct, but you can prove that it is wrong!

The hypothesis for my digital experiment is that new people will engage with us online if we give them the opportunity to. What I will discover is which method works best. It is important that I approach the experiment with an open mind and eagerness to learn. I am not looking for information to support an assumption I already believe but I intend to impartially collect and reflect on the data to learn something new.

I have a bad habit of procrastinating through planning. Experimenting is almost the opposite of planning. Experiments are about doing and learning as you go: assessing where failures are and how to improve, rather than planning to avoid failure completely. It can be very disheartening if things don’t go completely to plan, and I have learnt that it is inevitable that this will happen. You really cannot account for everything through planning, and if there is one thing you can rely on it is that things will change.

Being adaptive to change is the most valuable lesson I have learnt in the process so far.

I have already adapted and changed my ideas to make things happen. Getting things done is half the battle. (My mentor Tom recommended reading the Manifesto of Done.) As long as you are travelling towards your stretch goal or strategy, whatever you do to get there are your tactics, and by changing tactics you can learn best practice.

Listening to the data, noticing when information changes and adapting to work wiser is a good start!

Image courtesy of Octagon Theatre Bolton — #BoltonOzComp — design competition on social media (Facebook, Twitter and Instagram).

 

Creating a Culture of Evaluation #Digilab

If we can see the end of a project, examine it from a 360° perspective and then bring all of that learning into the next project, organisational growth occurs. Based on her experiences at the Brooklyn Museum, Digital Lab Mentor Sara Devine blogs about the importance of project evaluation processes.

The concept of using data to drive decision-making is definitely taking hold in the arts and cultural sector. And let me say, hallelujah to that! However, it can be a tricky thing to execute. To really do it well, you need a culture of evaluation in your institution. This means setting aside the necessary time during the planning stages of projects to define measurements of success, setting up ways to take those measurements, monitoring them, and gathering results. Then to really bring it home, you have to do something with the results. That’s the point, after all, of gathering the data. To make decisions.

It can be a bit of an uphill battle trying to institute this. People (leadership, for example) might realise we need data to make decisions, but not have a full picture of what it takes to get that data. It takes time and resources. Rob Stein’s chapter in the Manual of Digital Museum Planning (Rowman & Littlefield 2017), ‘Making It Personal: Putting Data at the Heart of Your Museum’, offers guidance on just how to go about accomplishing this goal. I encourage you to check it out if you’re struggling in this arena. He gives helpful advice on how to set and measure KPIs (key performance indicators) and even references the OODA (Observe, Orient, Decide, Act) loop, a decision-making approach born of aerial combat, as a useful tool.

In my own work, I find that starting in my proverbial backyard is a good approach. I regularly have to make decisions or provide insight to inform others’ decisions, and have become a broken record about either needing data or using data to make decisions. We struggle a bit at the Brooklyn Museum because we lack a CRM (customer relationship management database) and rely on multiple systems to do our work. But that doesn’t mean we can’t gather data. In fact, we’ve recently partnered with Frontier7 to build a baseline visitor study I hope to run once a year. We’ll also work with them on market research for specific exhibitions and ongoing evaluation of outreach efforts. I recently sat down with our Director of Technology and Senior Marketing Manager to map out our ongoing evaluation needs and begin to put a timeline in place for executing these studies.

We used giant sticky sheets to map our evaluation plans for the next year, including what questions we want for a baseline visitor study and for an exhibition-specific version.

We constantly find ourselves wishing we had more information. So we’re going to get it. I don’t mean to be flippant here — it’s going to take work to ensure these studies take place regularly, to gather the info, analyse the data, and report on it. But, honestly, there’s only one way to build a culture of evaluation and that’s to get started evaluating. I’m trying to help create that culture by practicing what I preach. I’m confident as my colleagues see the usefulness, they’ll get on board with helping create that culture building-wide.

Image courtesy of Sara Devine.

To takeover or not to takeover? #DigiLab

That is the question Digital Lab Fellow Rachael Williams explores in her second #DigiLab blog post, as she considers the pros and cons of website homepage takeovers now that new website functionality allows the British Library to further test how its users like to consume key brand messages and launches.

Homepage takeovers are a big statement for a brand or company. What do they want to shout about so much that they’ll give it pride of place on their homepage — often overruling all other messaging available to users landing on the page? A big decision when your organisation serves a multitude of different users trying to reach very different outcomes. In our case, over one million people use our website every year to do things like research their family history, find primary resource material on typographical techniques in Germany, seek out tips for starting their own hair care business, peruse our shop’s stationery offerings, or buy tickets for a blockbuster exhibition.

In short: we have a lot of users to guide and content to provide.

At the request of our Head of Digital Content Strategy, our Technology team developed the functionality for our brand to showcase homepage takeovers. Once ready, it was down to us to test what type of content was to be served and find out more about what our web visitors reacted well (and not so well) to.

A quick look at our everyday www.bl.uk homepage — our events and exhibitions are featured prominently, as is the ability to search the Library’s catalogue. Scroll past the fold and you’ll see news items, blog, ways to join us, featured collection items and a shop carousel.

The takeovers give prominence to one key cultural/marketing message but retain the top drop-down menus and the option to search the ‘Main Catalogue’ (as many users will be coming to our site solely for research purposes).

Over a five-week period across October and November 2018 we tested a number of content types in the homepage takeover platform. We chose content that served Library values and purposes and that would help to deliver KPIs (key performance indicators) such as driving traffic to web spaces (curated micro-websites on specific subject areas), increasing the number of digitally consulted collection items, and selling tickets to exhibitions.

Varying content type, imagery and calls to action, we served ‘takeovers’ for a maximum of seven days — for our major exhibition Anglo-Saxon Kingdoms — and three days for all other tested content, to ensure we could measure results and behaviour for a substantial period of time.

Some other examples
Following the success of our Windrush exhibition we wanted to promote the launch of its accompanying website — Windrush Stories — using the same striking lead artwork:


Do big names sell? Our launch of the Winter Events featured some of the key speakers taking part in the season ahead:

Cats and the internet. The purrfect combination. Paw prints prowled onto the homepage for the launch of our free Cats on the Page exhibition:

We also ran tests linking to our World War One webspace for the centenary weekend of the Armistice.

My hypothesis, based on user behaviour in email, was that a playful or unexpected call to action would get more clicks than a ‘Book now’ — as it’s thought that people are put off by the idea of parting with their money so quickly. Using my awareness of how posts have gone down with our social media users, I also thought a purely events-focused message wouldn’t be as popular as other content, such as free web spaces that offer users a wealth of amazing collection items and stories to explore.

The results: a snapshot

  • Anglo-Saxon Kingdoms was the top-performing takeover, receiving 3,487 clicks in total on the call to action (CTA) button over seven days (avg. 489 per day), followed by Winter Events, which received an average of 291 clicks on the CTA per day. Windrush Stories was the least successful, receiving 325 CTA clicks over three days.
  • All of the takeovers performed better in terms of driving revenue from the homepage than the same number of days in the previous week.
  • The best takeover topic in terms of clicks, click-through rate and revenue was Anglo-Saxons. Even though it ran for seven days rather than the three days of the other takeovers, its daily metrics are still greater.
  • ‘Book now’ was the strongest call to action of the five used — interesting, as we tend to steer away from such a definite CTA in email so as not to ‘scare’ people off.

So without making things sound too essay-like
Here’s my analysis, or at least food for thought, from this experiment so far:

  • Our World War One web space is one of our top-performing microsites, so I would have expected more people to click through. Were some users confused by seeing this on the homepage? Unsure of what they were being offered, so they abandoned rather than pursued?
  • Expect the unexpected: Windrush was one of our most-visited free exhibitions ever, so we should try to play with text and calls to action if we feature this on the homepage again. We know there’s an appetite for this content (as the exhibition visitor figures demonstrate) and just need to think of better ways to serve it to get those clicks up.
  • The diversity of our audience should never be underestimated. Our Anglo-Saxons exhibition has attracted over 90,000 visitors and the reaction to seeing this on the homepage was astonishing; this exhibition appeals to a very Library-loyal audience. But we should be experimental too — Cats on the Page is playful, bold and different to our ‘expected’ themes and material, and if our social media followers are anything to go by, they love moggies on- and offline.
  • Our Anglo-Saxons takeover was the first one we tested — were people especially curious about this new landing page experience, and did that help to generate the very high click rate?
  • While successes like a spike in ticket sales may be the result of a takeover, we should continue to monitor the impact on other areas of the website that are reliant on homepage traffic, such as blogs and vacancies. Customer Services can also provide us with qualitative feedback from users who have landed on a homepage they were not expecting (and perhaps didn’t like).
  • Are our users bolder and braver than we think? They don’t seem to have been put off by clicking ‘Book now’.
  • Higher click rates on the ‘takeovers’ may be due to the design itself rather than content alone. As mentioned in the beginning of this blog, we have a lot of users to serve and our homepage tries to tick a lot of boxes when it comes to what people are looking for. This bolder, more prominent ‘singular choice’ approach seems to have a positive effect. Are people deterred by too many options on our standard homepage?
  • Short bursts of carefully considered takeover content are working. Our team have drawn up criteria for homepage takeover proposals; these will evolve as we find out more about our users and their reaction to different types of content.
  • How do the takeovers affect user behaviour otherwise, I hear you cry (or should that be meow)? As expected, unique page views on the content being directed to are significantly higher (around 80% more) on the days of a takeover, compared to the same days the following week. Interestingly, however, the dwell time and bounce rate on those landing pages are not that dissimilar when comparing corresponding days the week after a takeover. Obviously with something like an exhibition, a spike in ticket sales is no doubt down to more people going to that web page — leading them on a journey to purchase — all stemming from the start point of a homepage takeover. But with takeovers where we’re not ‘selling’, it’s hard to fully interpret the results at this early stage. I guess just because we’re putting the content more prominently in front of someone, they still may not be wholly interested in it. Maybe if we time certain content types at more opportune moments, we’ll capture people’s attention — and hold on to it — a little better. Which leads me on to …

Next steps?
We have some exciting exhibitions and webspaces launching over the coming months which will be given homepage takeover slots. It’s likely that we’ll also re-try events season launches and exhibitions with longer runs, so we can compare how Cats performs with a ‘final weeks’ push against its opening message. With LGBT History Month and International Women’s Day in February and March, it would also be topical for us to try more ‘timely’ takeovers by giving our LGBTQ histories and Sisterhood and After webspaces prominence on the homepage.

We shall continue to test these mini experiments and learn from the results, and I shall purr-sistently try to weave cat puns into my work at every opportunity for the duration of our Cats on the Page exhibition.

Pawsome.

Read Rachael’s first #DigiLab blog Finding that golden nugget among a hoard of treasure #DigiLab

Images courtesy of the British Library ©.

 

New Ways of Working: Airtable #Digilab

If it ain’t broke, don’t fix it, goes the general idiom. At the Shakespeare Birthplace Trust, Digital Lab Fellow Nicky Hand explains in this detailed blog how implementing new digital tools is helping her organisation try new ways of working.

For my first Digital Lab experiment I decided to focus on an internal process that was basically working, but I knew could be so much better. We’ve been using a content calendar to plan our social media and blog posts here at the Trust for several years now. During that time we’ve been working hard to skill up colleagues across the organisation and enable more people to contribute content to our outgoing feeds.

We’ve made great strides in empowering our staff to share their stories, but there’s still more we can do to ensure that the overall cohesion of our digital channels isn’t compromised. I was keen to explore how our content schedule could work harder to help us towards that aim.

During my research I came across Airtable: a web-based tool that can be used to create spreadsheets (much like Excel or Google Sheets) but also allows you to view the data in a calendar format. It’s this extra layer of flexibility — along with plenty of filtering and sorting options — that I was most intrigued by. I knew from feedback that some of our content champions found it hard to draw useful insights from information in a table format, so the idea of being able to manipulate the data to suit different learning styles seemed like something worth exploring.

How Airtable works
I’ve found Airtable straightforward to navigate and intuitive to use, so I was able to set up our demo database without much trouble. Airtable offers a suite of templates to inspire you, or you can start from scratch and just set up the fields that you need.

To help users along (and keep things neat) you can dictate the type of data that can be entered in different fields, such as date/time, single choice, multiple choice, free text etc. You can also link records between sheets, which gives us an easy way to note when a blog post has been shared through social.

There are different levels of user access, so you can tailor that to suit the needs of your teams. I’ve found it helpful to restrict the ability to create new tags in drop-down lists so that people need to ask me for help creating new labels. This ensures we don’t end up with multiple variants of the same thing and helps me to retain my overview of what people are working on.

The most exciting potential I can see is the ability to manipulate the data and use it alongside other campaign analysis. I hope to be able to draw insights that will help us to improve our content in line with what goes down well with our audiences. Airtable makes it easy to switch between different views such as ‘grid’ and ‘calendar’ to see the data in different formats. Records can be grouped, sorted and filtered in a variety of ways so that we can see what went out on a particular channel, or within a campaign but across all channels, or across all content but aimed at a particular audience.
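
As a rough sketch of where this could go next, the same records can also be pulled out programmatically through Airtable’s REST API and combined with other data sources; the base ID, table name and field names below are hypothetical stand-ins rather than our actual schema.

```typescript
// Rough sketch (hypothetical base, table and field names): fetch everything
// scheduled for one channel from an Airtable base via the REST API.
// Pagination via the API's offset parameter is omitted for brevity.
const AIRTABLE_TOKEN = process.env.AIRTABLE_TOKEN;   // API key / personal access token
const BASE_ID = 'appXXXXXXXXXXXXXX';                 // hypothetical base ID
const TABLE_NAME = 'Social schedule';                // hypothetical table name

async function postsForChannel(channel: string) {
  const url = new URL(`https://api.airtable.com/v0/${BASE_ID}/${encodeURIComponent(TABLE_NAME)}`);
  // filterByFormula narrows results server-side, much like a saved filtered view
  url.searchParams.set('filterByFormula', `{Channel} = '${channel}'`);
  url.searchParams.set('sort[0][field]', 'Publish date');
  url.searchParams.set('sort[0][direction]', 'asc');

  const response = await fetch(url.toString(), {
    headers: { Authorization: `Bearer ${AIRTABLE_TOKEN}` },
  });
  const data = (await response.json()) as {
    records: Array<{ id: string; fields: Record<string, unknown> }>;
  };
  return data.records;
}

// Example: count what is currently scheduled for Twitter
postsForChannel('Twitter').then((records) => console.log(records.length, 'Twitter posts'));
```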

By looking at the data from different angles we’ve already started to notice patterns as well as anomalies in our outgoing content, which threw up opportunities to experiment with different ways of posting. Next up I’m eager to work with our marketing team to incorporate Airtable into their campaign analysis workflow and see what other insights can be drawn from it then.

Working within limits
We’ve started off using the free version of Airtable, which allows you to invite an unlimited number of users but does have some restrictions in functionality. Almost everything we need is covered in the free version, but there are a couple of extras that might come in handy if budget were available for a paid subscription:

  • Up to 50,000 records in the database – the free version is capped at 1,200 so I plan to make regular exports to archive anything over nine months old.
  • Colour coding in calendar view – this would make it easier to see at-a-glance which platform a post is going out on (for example). We’ve been working round this using filters to narrow things down to a particular channel.
  • Plotting a record across a date range – to plot the duration of full campaigns as well as just individual posts.

These features require a Pro Plan which costs $20 (USD) per user per month. Airtable offers a 50% discount to non-profits and educational organisations that meet the eligibility criteria listed on their website.

Rolling out a new tool
I was keen to involve our content champions in the process of trialing any new tool, and to take the opportunity to review the way our content schedule was working — and where it could be made to work harder.

With the help of some expert advice from my mentor, I mapped out some additional information that could be useful to us but wasn’t being captured in the current social media schedule. I’ve added new fields to record things like the wider campaign that a post fits into, the audience segment being targeted and the scheduling tool that’s been used (we have a couple of different tools in circulation, so it’s useful to know where to look if something needs to be amended).

After giving our content champions a demo to get their thoughts on the viability of the tool, I followed up with written guidance and a trial period for users to access and play with Airtable for themselves. I invited feedback and offered guidance throughout this time and finally took the plunge, migrating both our social media schedule and our blog schedule over to Airtable in December.

Since then I’ve continued to offer support and welcome suggestions for how things could be amended. We’ve worked through a few teething problems and made some adjustments to make sure the data capture is as useful as possible without the whole document becoming unwieldy.

It’s still early days, but user feedback has been positive so far and we’ve already identified another area of the Trust where Airtable could be put to good use (I’m working with another team to set that up next). I’m really looking forward to working alongside my colleagues to further develop their confidence and hope to see Airtable being adopted in increasingly strategic ways. I believe that as our bank of data grows, the extra insights to be gained through Airtable could have a real impact on our understanding of why certain content performs better than others and will have a genuine influence on the way we approach every aspect of content creation.

Image courtesy of The Shakespeare Birthplace Trust © Stewart Writtle.

A question of bias and balance #Digilab

How many times is too many times when it comes to testing an idea until it ‘succeeds’?  Ron Evans, a mentor on the Digital Lab, examines the layers that encapsulate our attitudes and approaches to being flexible when we are giving things a ‘good go’.

After working with hundreds of nonprofit cultural organisations, I’ve been in a lot of conversations about marketing strategy. “Oh, we tried that, and it didn’t work” is something I hear often. When I probe more deeply, I find out that the strategy was tried perhaps a couple of times, and then abandoned, and the person implementing the idea wasn’t enthusiastic about it in the first place.

This is a particularly nasty form of confirmation bias, which is defined as the tendency to interpret new evidence as confirmation of one’s existing beliefs or theories. When we don’t believe something will work, it’s easy to find evidence to that effect. But there are a variety of other factors at play. Was the idea implemented correctly? Was there an objective method of measuring effectiveness? Was the time that the experiment ran long enough to get usable results?

As Digital Lab researchers, one of our objectives is to combat personal bias. We do that by recognising and correcting for emotions we feel about a particular experiment, and actively exploring ways we may be influencing the outcomes. For example, it’s useful for us to frequently ask ourselves: “Are my feelings about this experiment influencing its execution, or my interpretation of the results?” Another way of looking at this question, which is often asked in academia, is: “Is another researcher going to see any issues of personal bias that may affect my results?” This type of introspection will often lead to dramatically improved experiments.

The science of experimentation that mentees get to practise in the Digital Lab is a primary benefit of the programme. But equally important is the exploration of how we mentally approach experimental design, execution, and interpretation of results. A good researcher learns to balance both.

Audience data research #DigiLab

Emilie Barton, Marketing Coordinator at Pavilion Dance South West, explains how the Digital Lab has been an invaluable learning opportunity that’s helped develop her knowledge of Google Analytics and SEO.

I’m currently focusing on three areas as part of devising a new digital strategy for Pavilion Dance South West (PDSW).

  1. Raising general PDSW brand awareness
  2. How can we improve and develop our use of film across platforms?
  3. Developing a PDSW template for social media campaigns. What content works?

Meeting with my mentor Devon Smith was super helpful, as she helped me launch my ideas and hone my thoughts. We decided that my first experiment should receive the most attention, as taking on audience data research will supplement any further experiments. We also delved into how I understand my audiences, how I could group them, and which groups could be developed in their general awareness of the spectrum of work that PDSW does.

We found that there were two groups I want to focus on: Classes and Live Performance. I also became aware of the data gaps I had on these groups. So we set up some research questions to explore in Google Analytics:

  • How do these audiences behave?
  • What do these groups have in common and how do they differ from each other?
  • Do audiences who look at Classes web pages also look at Live Performance pages and vice versa?
  • What do these audiences search for/want to find?

After asking these questions and delving into Google Analytics (GA), I started to build some great data analysis. As an emerging marketeer, I wouldn’t say I was advanced in GA. So I’m finding the Digital Lab an invaluable learning opportunity to develop my knowledge.

I have discovered that one of the best ways to learn is just to start poking around and experimenting!

PDSW went through a box office and website change this year, so this also proved to be the perfect time to understand any differences in the way our data is reported in GA. Initial findings have led to overcoming some small reporting problems. Other initial findings have led to instant change. For example, one website referral had much higher click-throughs than I was expecting. My response was to make sure all details on this site were correct and looking great. This type of information is critical in knowing what actions are worth the time and resources.

Some findings

  • The Classes audience made up 30% of all traffic in the last year.
  • Most classes traffic comes through organic search — a very useful SEO online session proved that our SEO is very good and people who are searching can find us easily. I can now work to keep up our SEO levels and allow our content to reflect what is being searched.
  • Live Performance audiences are referred roughly equally by organic search, direct traffic and social.
  • Facebook brings in 60% of all referrals for audiences looking at Live Performance webpages.
  • User journeys from social media are predominantly on a mobile device.

Many of these stats are achieved through segmentation on GA. I picked up some great tips from Devon Smith in her Google Analytics online session.
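
For anyone curious what sits behind figures like these, here is a rough sketch of the kind of query that could be sent to the Google Analytics Reporting API (v4) to pull a similar number programmatically; the view ID and page path are hypothetical, and in practice segments like ‘Classes visitors who also view Live Performance’ are far easier to build in the GA interface itself.

```typescript
// Rough sketch (hypothetical view ID and page path): ask the Google Analytics
// Reporting API v4 how many users viewed our Classes pages over the last year,
// broken down by device category. Assumes an OAuth access token has already
// been obtained (e.g. via a service account).
const ACCESS_TOKEN = process.env.GA_ACCESS_TOKEN;

const requestBody = {
  reportRequests: [
    {
      viewId: '123456789',                                        // hypothetical GA view ID
      dateRanges: [{ startDate: '365daysAgo', endDate: 'today' }],
      metrics: [{ expression: 'ga:users' }],
      dimensions: [{ name: 'ga:deviceCategory' }],                // e.g. mobile vs desktop
      dimensionFilterClauses: [
        {
          filters: [
            {
              dimensionName: 'ga:pagePath',
              operator: 'PARTIAL',
              expressions: ['/classes/'],                         // hypothetical path for Classes pages
            },
          ],
        },
      ],
    },
  ],
};

fetch('https://analyticsreporting.googleapis.com/v4/reports:batchGet', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${ACCESS_TOKEN}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify(requestBody),
})
  .then((response) => response.json())
  .then((report) => console.log(JSON.stringify(report, null, 2)));
```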

Now that I have gathered a good amount of behaviour stats, I can make some evaluations of our audiences. This allows room for some basic surveys to be developed, which I hope will point to clear actions. Basic action is already happening as a result of my findings, which is great. I aim to experiment with some small developments (potentially very simple ones) which can help raise awareness of PDSW’s whole organisational picture.

Further ideas and plans

  • Explore the digital tools in Daniel Rowles’ ‘Digital Marketing Toolkit’, e.g. Klear.com / answerthepublic.com / analytics.google.com/analytics/academy/
  • Usability test anything new, especially website content.
  • Adding Google Surveys to the website.
  • Ask our audiences what they think.
  • Create content that answers the public’s questions.
  • Understand what user journeys our audiences are going on.

Image courtesy of Pavilion Dance South West. Photograph Red Manhattan.
