Sara Devine is a Mentor on ADA 2.0. Here she shares her thoughts on her latest agile experiment at Brooklyn Museum where she is Director of Digital Engagement.
As you might expect in an academy focused on diversifying audiences, visitor studies comes up in conversation from time to time. After all, in order to change the makeup of your audience, you first need to know something about it. This is where audience research and evaluation come into play.
It so happens that I just finished teaching a course in audience research and evaluation for Pratt Institute, and for this course the students ran the Brooklyn Museum’s triennial visitor survey. One of the things we spent a good amount of time discussing in class was the demographic data so often included in such surveys. While this information can be necessary (often required) for institutions to gather and report back to funders, it is deeply personal information with a fraught history, particularly if you consider the legacy of population surveys such as the U.S. Census and government categories of race specifically*.
We decided to do two things a little differently when gathering demographic data for this survey. The first was to separate the demographic data questions from the rest of the survey and to offer an explanation as to why they were included:
The demographic questions below help us get a clearer picture of our visitors, so that we may better plan for the future and report to our funders as required. These questions are optional and all responses are anonymous. Thank you.
The second thing was to offer respondents complete control over how, or if, they answered certain demographic questions by providing more type-in response options for demographic information. This allowed respondents to better self-identify or to bypass the questions altogether. We did this for two reasons, the most important of which was to be inclusive and thoughtful and to make people feel comfortable. The secondary reason was the hope of soliciting more responses based on this (assumed) higher comfort level. Not all demographic data was collected this way; total household income, for example, was a check-box option. Below are three questions for which we provided a type-in response option:
- How would you identify your ethnicity and/or cultural heritage?
- How would you identify your race?
- Please fill in your gender.
For each of these, we instructed survey takers to leave the question blank if they preferred not to answer, and none of these questions were required. The response by survey takers offers some interesting insight, I think, into people’s feelings about such questions.
Surprisingly (to me anyway), more people answered the race question than the ethnicity/cultural heritage question. Most (53%) of those who chose to answer the race question gave what could be considered a “standard” response, mostly following current U.S. Census categories such as “White/Caucasian,” “Black/African American,” or “Asian.” Some respondents chose to offer a bit of pushback about the question, typing in things like: “White—race is a construct,” “Human race,” and even “Never answer that.” For the ethnicity/cultural heritage question, only about a third of respondents (36%) typed in a “standard” response. Many saw this as an opportunity to be more specific and descriptive, with answers such as “Russian gal/NYC born,” “Macedonian/Swedish/Russian Jewish,” and “American Muslim Pakistani.” We even had one respondent share “privileged white man” while another described themself as “black as hell.”
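For readers curious about the mechanics behind figures like these, here is a minimal sketch of how free-text responses might be tallied into “standard” (census-style) versus self-described answers, with blanks counted as skips. The category list, function names, and sample responses are illustrative assumptions, not the museum’s actual data or method.

```python
# Illustrative sketch: classify type-in demographic responses against a
# (hypothetical, abbreviated) list of U.S. Census-style categories.
CENSUS_CATEGORIES = {
    "white", "caucasian", "white/caucasian",
    "black", "african american", "black/african american",
    "asian", "american indian", "alaska native",
    "native hawaiian", "pacific islander",
}

def is_standard(response: str) -> bool:
    """True if a type-in answer matches a census-style category."""
    return response.strip().lower() in CENSUS_CATEGORIES

def tally(responses):
    """Split responses into skips, standard answers, and self-described answers."""
    answered = [r for r in responses if r.strip()]
    standard = sum(1 for r in answered if is_standard(r))
    return {
        "answered": len(answered),
        "skipped": len(responses) - len(answered),
        "standard": standard,
        "self_described": len(answered) - standard,
    }

# Toy data only — blanks represent skipped questions.
sample = ["White/Caucasian", "", "black as hell", "Asian", "Human race", ""]
print(tally(sample))
# → {'answered': 4, 'skipped': 2, 'standard': 2, 'self_described': 2}
```

In practice, matching free-text answers is messier than an exact-match lookup (spelling variants, multiple categories in one answer), which is part of why coding such responses takes real analyst time.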
It was important to include both questions since ethnicity and race often mean different things to different people, and we didn’t want to limit people’s responses. Our hypothesis was that the ethnicity/cultural heritage question might be less threatening and less loaded, and that more people would therefore answer it than the race question. We were wrong. Most respondents (85%) skipped the ethnicity/cultural heritage question, compared to a little more than half (54%) skipping the race question. It’s possible that question order had something to do with this: the ethnicity/cultural heritage question came first, and perhaps people skipped it thinking it was the only such question. But all of the demographic questions were visible on the screen at once, so this may not be a factor.
Nearly three-quarters of respondents (71%) answered the gender question, and most typed in either “male” or “female.” We did have a few people key in different responses, including “n/a,” “mixed,” “non-binary,” and “person.”** It was really for the comfort of those who opted to provide a non-binary response that we made this a fill-in question.
It turns out that the opportunity to self-identify does not encourage responses. Compared to the 2014 survey (which has a much larger data set), far fewer people answered the ethnicity question in this recent survey. In the 2014 survey, we asked only about ethnicity/cultural heritage, providing check boxes with “check all that apply” instructions and a “prefer not to answer” option. Only 6% preferred not to answer and only 3% skipped the question. For gender, in 2014 we offered only male/female/prefer not to answer as options. Similarly, a small percentage preferred not to answer (2%) or skipped the question (0.2%).
So if the goal of offering type-in responses was to garner more results, it did not work. However, if the goal was to offer a thoughtful approach to survey takers that reflects the way we as an institution strive to be—inclusive and welcoming—I hope we were successful.
* For more on this history, see: http://www.pewsocialtrends.org/interactives/multiracial-timeline/ and http://www.smithsonianmag.com/smart-news/how-us-census-defines-race-america-180957189/
** Huge shout out to my students for crunching all this data, in particular Tyler Dennis who evaluated the demographic data I share here.