Waging Culture 2017: Methodology
Back in 2008, when we initiated our first Waging Culture survey, we had little idea of what we were getting ourselves into. Intensive research into survey methodologies led us to a relatively unknown and (in our estimation) underused sampling method known as Respondent-Driven Sampling (RDS). Developed by Douglas Heckathorn to study so-called hidden populations (demographic groups that are not easily captured using standard randomized sampling), the technique mimics the more popular, yet statistically suspect, snowball sampling to build a pool of respondents, and then uses statistical weighting algorithms to model the target population from those responses.
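To give a flavour of how RDS weighting works, here is a minimal sketch of one widely cited estimator, the Volz-Heckathorn (RDS-II) estimator, which weights each respondent by the inverse of their reported network size so that well-connected, easily recruited people do not dominate the estimate. The function and toy numbers below are purely illustrative, not the actual weighting code used for Waging Culture:

```python
def rds_estimate(responses):
    """Volz-Heckathorn-style estimate of the proportion of a
    population with some trait.

    responses: list of (has_trait: bool, degree: int) tuples,
    where degree is the respondent's reported network size.
    Each respondent is weighted by 1/degree, correcting for the
    higher recruitment probability of well-connected people.
    """
    weights = [1.0 / degree for _, degree in responses]
    trait_weight = sum(
        w for (has_trait, _), w in zip(responses, weights) if has_trait
    )
    return trait_weight / sum(weights)

# Toy sample: three respondents with the trait, two without.
sample = [(True, 10), (True, 5), (False, 20), (True, 2), (False, 4)]
print(round(rds_estimate(sample), 3))  # prints 0.727
```

Note how the naive sample proportion (3 of 5, or 0.6) differs from the degree-weighted estimate: respondents with small networks count for more, since they were less likely to be recruited in the first place.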
We took this method, adjusted it slightly, and applied it to the professional visual artist community in Canada. With that first survey, the full report for which is available here, we spent a lot of time outlining the methodology we were using, as cultural researchers were for the most part unfamiliar with it. With our second iteration of the survey, from 2012, available as mini-reports on this website, we spent less time discussing methodology and more time discussing results. And now, as we prepare to present the results of our third survey over the coming months, we are in the enviable position of having our methodology explained for us in reports such as Guy Bellavance’s The Visual Arts in Canada (2011) and Kelly Hill’s Artist Career Research Methods (2019).
2017 Methodology in short
In order to keep our research internally consistent, which allows us to undertake diachronic analyses of the three iterations, our methodology has remained the same throughout. For details, we would refer you to the 2007 report’s methodology section, the 2012 methodology mini-report, and, additionally, to Heckathorn’s, Bellavance’s, and Hill’s websites and reports.
The major differences between the three surveys are: the inclusion of a benchmarking exercise in the first, dropped in favour of brevity in subsequent iterations; a rethink of how we asked our question about gender for 2017; and, finally, the inclusion of two new exploratory questions, the first on secondary-market sales and the second on social-class background. The first survey also used a two-stage survey instrument, which was subsequently revamped into a single-stage questionnaire.
Our working definition of what constitutes a professional artist remained the same, cribbed from the Canada Council for the Arts definition in use in 2007, as follows:
“a professional artist [is] someone who has specialized training in the field (not necessarily in academic institutions), who is recognized as such by her or his peers (artists working in the same artistic tradition), and who has a history of public presentation or publication.”
The advantage of this definition is its elasticity: it can remain the same while incorporating contextual changes within the field (such as the recognition of alternate forms of training and alternate peer groups) without losing specificity.
Historical context of responses to the survey
For context, Canadian news cycles in the summer of 2008 were dominated by the federal election. Federal funding to the arts played a huge role in the election coverage, as the incumbent Conservatives had cut arts funding over their two years in government. Also in the news were recent changes to the Census by the Conservatives, which compromised Statistics Canada’s ability to represent the population at large. Finally, that summer was also the period when the full financial crisis of 2008 was building, climaxing (at least narratively) with the collapse of Lehman Brothers in September. The combination of all of these factors, along with an intuitive understanding of the paucity of information on the socio-economic status of visual artists, meant that the first Waging Culture survey was perfectly timed.
That survey was delivered to a different media universe. In 2008, Netflix was still mailing DVDs, MySpace dominated social media, Facebook was still an upstart, and Twitter existed, but only just. Over the course of the first iteration of the survey, the second generation of iPhone was released. All of this is to say that the primary mode of digital communication was not social media but rather email, and the survey was designed for this mode of communication.
The first survey was broken into two distinct stages: an initial survey asking demographic questions was sent out starting in early July, and a second survey asking economic questions followed in October. The response to the initial survey was incredible: over 3,700 invites were sent out in fifteen waves, with an initial response rate of 34%. Even more impressive, in hindsight, was the relatively short period in which these responses were received, a mere two months. The second stage, which was sent to all respondents of the first survey, was successfully completed by just under half of them, ending up with around 560 workable responses (in all three surveys, responses were vetted for obvious errors, and those deemed unreasonable were discarded). In the end, the response rate from initial invites to total valid responses was about 15%.
Five years later, the technical context for the second survey had changed drastically. MySpace was all but gone, and Facebook had reached one billion users. More importantly, the spread of smartphones had led to new communication habits, with various forms of text messaging, Facebook posts, et cetera starting to supplant email as the go-to for quick communications.
Politically, the Conservative Party under Stephen Harper was (still) in power, having been elected to a majority two years prior to the survey being sent. The distance from the election likely lessened the urgency to respond. There was also a general feeling that the economy was finally recovering from the 2008 crash (a full understanding of how that recovery took shape as a jobless one was not yet clear), so the urgency behind the first survey’s success was not felt as much. We still managed to send out just over 1,900 invitations through 12 waves, ending up with 390 valid responses, or about a 20% response rate. Unlike the fast turnaround of the first survey, this one took from July to November, five months in total, as opposed to the two months of the first implementation. This, it should be noted, is not out of line with the general understanding that RDS can take months of recruitment to reach a large enough sample.
And now, with this third iteration, things seem to have changed even more drastically. Facebook now has over two billion users, and SMS and related communication channels have come to dominate how we interact (and, yes, this includes WhatsApp). We have shifted to a new phase of communication patterns, in which email addresses are no longer as key to contacting someone as they once were. Social media contacts are more ephemeral than email addresses and are not easily shareable (not to mention that anonymous contact via social media is likely to be spam-trapped along the way). Add to this an increased distrust of online communication, fuelled by a (reasonable) fear of being phished and the growing circulation of fake news.
With all this, it’s no wonder responses were affected. Despite having the survey open for almost six months, we sent only 960 invites. This resulted in around 275 usable responses, which works out to a response rate of just shy of 30%. So we had a very positive response rate (the best of all three surveys), but the number of referrals was down drastically. We also just managed to break into the ninth wave of responses, which got us to the bare minimum of waves needed to be able to say our numbers are reliable.
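For the record, the invite and valid-response figures quoted above work out as follows (counts as cited in the text; the decimal rounding is ours):

```python
# Invite and valid-response counts as quoted in the text.
surveys = {
    "2007 (first)":  {"invites": 3700, "valid": 560},
    "2012 (second)": {"invites": 1900, "valid": 390},
    "2017 (third)":  {"invites": 960,  "valid": 275},
}

for label, s in surveys.items():
    rate = s["valid"] / s["invites"]
    print(f"{label}: {s['valid']}/{s['invites']} = {rate:.1%}")
# 2007 (first):  560/3700 = 15.1%
# 2012 (second): 390/1900 = 20.5%
# 2017 (third):  275/960  = 28.6%
```

The trend is visible at a glance: each iteration converted a larger share of invites into valid responses, even as the pool of invites (and hence referrals) shrank.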
As a result, though, we will have to double down on our limits to subdividing our responses, as we did for the 2012 results. (We will also have to revise the next survey to better account for changing communication patterns.) Our mini-reports, which we will deliver over the fall, will deal with very strict subdivisions, with little combining of variables. They will still provide a very useful snapshot of where we are, tracking the changes rooted in the jobless recovery since the 2008 crash, as well as the early implications of the increase to the Canada Council for the Arts budget in 2016 (amongst other shifts, generally upwards, in arts funding since 2012).
These mini-reports are not the endgame for this survey, however. This iteration of Waging Culture has received direct support from a Canada Council Sector Innovation and Development project grant, which has in turn enabled us to engage the analytical wizardry of Kelly Hill (of Hill Strategies Research) to come up with some new and exciting takes on our data. Using all three data sets, we will be modelling the various multivariate correlations within the data. We started looking at this in the 2012 data, particularly questions of equity with regard to grants versus sales revenues. While that analysis expanded our understanding of various inequities within the field (particularly gender discrepancies), we weren’t able to fully comprehend what we were seeing without deeper modelling of the dataset. It did present some rather interesting questions, though, and so as we move towards more complex statistical analysis, we should be able to better understand the underlying forces at play in studios across Canada.