Kindle Notes & Highlights
by Sam Ladner
Read between December 7 and December 26, 2021
Intuitively, many researchers know that video clips of participants seem to move their stakeholders. But keep in mind that if you don’t do this with care, you run the risk of using participants as props for your own agenda, rather than actually taking the participant’s perspective.
you should have some qual data that make those dry numbers come alive.
Consider selecting quotes that feature participants trying to grapple with paradoxical or ambiguous situations. One participant may tell you he recognizes it’s not good for him to keep his work relationships superficial, and that he should cultivate a sense of camaraderie with his coworkers, but he doesn’t seem to know how, and it bothers him. This will give more detail to the statistic: “64% of men at this company feel their relationships with coworkers are not deep.” The quote pushes stakeholders to see more than just the statistic, and the counterintuitive nature of the story conveys the …
Today, I work hard to avoid doing that because my agenda is not my participants’ agenda. They don’t care about a particular product and its future. Their paycheck doesn’t rely on my desire to have a VP change her mind about something. Participants have different concerns than I do. I work hard to bring their concerns to the top of my list.
Even better than diagrams are metaphors. Using cohering metaphors provides stakeholders with a deep understanding of a thing, without the temptation to ask for deductive logic.
In their wonderful book on doing anthropological research, Denny and Sunderland suggest asking a simple question in inductive reporting: “What is ‘X’?” Coffee, literally, is a hot brown liquid, they note, but metaphorically, coffee is a social lubricant, or a morning ritual, or a business tool. Metaphorical language is a good way to explain the hidden, social complexities of a thing and its cultural meaning.
But I often tell people to tamp down their excitement about data exhaust because none of these data are actually designed with falsifiability in mind; they are simply the detritus of our digital lives. Just because we have more data doesn’t mean we are doing better research.
Web analytics data today are problematic for two reasons. First, they do not measure intentionality on the part of the user. Just because a computer records a click does not mean that a human intended to make that click.
Second, web analytics require a great deal of non-standardized interpretation on the part of the analyst; because analysts hold differing beliefs about what “counts,” one analyst’s interpretation can be wildly different from another’s even when both are analyzing the exact same data set. Anyone who has spent time with the raw data generated by analytics tools will attest to this challenge.
Data science as a discipline does not focus on a subject area, like human-computer interaction or even online consumer behavior, but on the tools and techniques of data management (such as SQL, Python, and R). The sheer volume of data means data scientists spend much of their time creating data sets so that questions can be asked at all, rather than asking challenging or innovative research questions.
We just have to whip data exhaust into actual data. How might we do that? The challenge is not to create more data, but to be purposeful and strategic in choosing (or better yet, creating) the data most likely to yield easily falsifiable claims.
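As one concrete illustration of what “purposeful and strategic” could look like in practice, here is a minimal Python (pandas) sketch that reduces a raw click log to exactly the data set one falsifiable question needs, say, “Do returning users reach checkout in fewer clicks than new users?” The file name, column names, and question are hypothetical assumptions for the sketch, not output from any particular analytics tool.

    import pandas as pd

    # Hypothetical raw exhaust: one row per click the tool happened to record.
    clicks = pd.read_csv("raw_clicks.csv", parse_dates=["timestamp"])

    # Be purposeful: keep only the fields the question actually needs.
    clicks = clicks[["user_id", "timestamp", "page", "is_returning_user"]]

    # Each user's first checkout click.
    first_checkout = (
        clicks[clicks["page"] == "checkout"]
        .groupby("user_id")["timestamp"]
        .min()
        .rename("first_checkout")
        .reset_index()
    )

    # Clicks each user made up to and including that first checkout.
    per_user = (
        clicks.merge(first_checkout, on="user_id")
        .query("timestamp <= first_checkout")
        .groupby(["user_id", "is_returning_user"])
        .size()
        .rename("clicks_to_checkout")
        .reset_index()
    )

    # The claim can now be tested (and falsified) against a constructed data set,
    # rather than eyeballed from undifferentiated exhaust.
    print(per_user.groupby("is_returning_user")["clicks_to_checkout"].mean())

The point is not the code itself but the order of operations: the question comes first, and the data set is constructed so that the question has a real chance of failing.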