Tech tools and tryability
Posted: Sat 12.17.2011
Recently, I blogged about analyzing qualitative interview data. Using a constant comparative method, themes emerged through multiple readings of the data. Done by hand, the process was laborious, time-consuming, and highly instructive. In other words, I learned a lot. However, as in other areas of life (electric screwdrivers, anyone?), having the right tool for the job can save an enormous amount of time, not to mention muscle power. That's time and energy that might be better spent, say, in writing up the analysis, catching up on your professional reading, or even polishing off the last of your online Christmas shopping. So when I received over 50 pages of newly transcribed interview data, I decided it might be time to investigate tech tools that could do the job more efficiently, perhaps even more precisely.
Last year, fellow blogger SES and I interviewed 5 qualitative researchers on campus. We were interested to learn, among other questions, whether practitioners used tech tools in their field research and subsequent data analysis. Most knew of such tools but didn't use them, preferring to work by hand. Transcription software, such as Dragon Dictation, had to be "trained" in voice recognition over time. Analysis software, although promising, took time to learn well. We live in a time-crunched world: from students to tenured faculty, academics face multiple and often competing demands on their time. So deciding to invest in learning something new, when you already know how to do it a simple albeit time-consuming way, is no small commitment. Web designer and blogger Joshua Porter coined the term tryability, or "the pain of trying something new." He argues that overcoming it takes not only effort but also attention, in the midst of other distractions, to learn new ways of doing business.
Thinking of its potential utility for the rather large mixed-methods study I'm proposing, and having a little time while waiting for a freshly red-inked version of Chapter 2 to be returned to my inbox, I decided to investigate NVivo. According to the company, NVivo is software that will analyze "unstructured" data and move you from "questions to outcomes." A tall order, but how long would it take to learn? The good news is that anyone can download a free 30-day trial. There's a PDF Quick Start guide, which I used. There's a sample project already set up for those who like to crash around in the program rather than read a lot of instructions. (OK, I'll admit, I'm one of those.) Finally, there are online tutorials for the more advanced features of the program, which I started to watch after I had already uploaded and coded the interview data. So far, so good. By the end of Day 1, it hadn't really taken that much extra time to set up the program and get started with the basic features. On Day 2, with a little help from the video tutorials, I'd started to use simpler features like word frequency counts, tag clouds, and charts. However, the really cool stuff, like aggregating codes into hierarchies, assigning attributes to interviewees, and cross-referencing responses by attributes, will have to wait until I have the time to learn it. Tryability factor? Not so painful.
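For the curious, the simplest of those features, a word frequency count, is easy to sketch yourself. This is not how NVivo does it, just a minimal illustration in Python, with a made-up transcript snippet and an abbreviated stop-word list of my own choosing:

```python
from collections import Counter
import re

# A made-up snippet standing in for interview transcript text.
transcript = """
I code my interviews by hand, reading each transcript several times.
Software could help, but learning new software takes time I don't have.
"""

# A small, illustrative stop-word list; real tools ship much longer ones.
STOP_WORDS = {"i", "my", "by", "each", "but", "new", "could",
              "the", "a", "to", "don't", "have", "takes"}

def word_frequencies(text, top_n=5):
    """Return the top_n most frequent non-stop words in a block of text."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS)
    return counts.most_common(top_n)

print(word_frequencies(transcript))
# "software" appears twice in the snippet, so it tops the list.
```

The same counts feed a tag cloud directly: scale each word's font size by its frequency and you have the visual NVivo draws for you.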