Kindle Notes & Highlights
Read between December 11, 2019 and March 3, 2022
42.2.3 Task Analysis Is Relevant at All Stages of the Process
Our third principle is that task analysis belongs everywhere in the process of planning, designing, developing, and evaluating a product. Task analysis, like so much else in the user-centered
• High-level task analysis: The work needed to accomplish a large goal broken down into subgoals and major tasks
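One way to picture a high-level task analysis is as a hierarchy: a large goal decomposed into subgoals, each carrying its major tasks. The following is a minimal sketch with an invented example goal (not taken from the text):

```python
# Hypothetical high-level task analysis: a large goal broken down
# into subgoals, each with its major tasks.
task_analysis = {
    "goal": "file an expense report",
    "subgoals": [
        {"name": "gather receipts",
         "tasks": ["photograph paper receipts", "export card statement"]},
        {"name": "submit the report",
         "tasks": ["enter line items", "attach receipts", "route for approval"]},
    ],
}

def major_tasks(analysis):
    """Flatten the hierarchy into the full list of major tasks."""
    return [t for sg in analysis["subgoals"] for t in sg["tasks"]]

print(major_tasks(task_analysis))
```

The flat task list that falls out of the hierarchy is exactly what later feeds the user/task matrix and the communication plan.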
The cardinal rule of all documentation is to give users what they need in the form they need it when they need it. That is why most technical communicators have moved from writing extensive tomes that people do not open to helping teams bring communication into the interface.
Tables are an excellent way to show comparisons and so are useful for presenting many types of analysis. Technical communicators, for example, have traditionally used a user/task matrix to understand which tasks are done by which types of users. The user/task matrix becomes a major input to a communication plan—to answer the question of what tasks to include in documentation for people in different roles (e.g., system administrators, end users).
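A user/task matrix of this kind can be sketched as a simple table of tasks against roles; the tasks and role names below are invented for illustration:

```python
# Hypothetical user/task matrix: rows are tasks, columns are user roles,
# and each cell records whether that role performs the task.
user_task_matrix = {
    "install software":   {"system administrator": True,  "end user": False},
    "configure accounts": {"system administrator": True,  "end user": False},
    "run reports":        {"system administrator": False, "end user": True},
}

def tasks_for_role(matrix, role):
    """Tasks to cover in documentation aimed at a given role."""
    return [task for task, roles in matrix.items() if roles.get(role)]

print(tasks_for_role(user_task_matrix, "end user"))  # ['run reports']
```

Querying the matrix by role answers the communication-plan question directly: which tasks belong in the documentation for system administrators versus end users.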
Kujala, Kauppinen, and Rekola (2001) developed what they call a “user needs table”
Often we get lost in the methods themselves and forget the journey we are on: the goals of the research and the reason we are doing the research in the first place. It’s important not to treat user research as a siloed or singular activity, independent of other research or insights that may exist to inform design and product strategy going forward.
• Don’t do user research for the sake of research— understand what you want to find out.
We first made a list of all the research questions that we had on the topic areas, then grouped and prioritized the questions.
One cornerstone of CD is that it acts as a framework or scaffolding for putting structure into a part of the engineering process that is typically unclear: how to get to the most important requirements to drive the next version of a product or system.
But as we have discussed, designers are not usually familiar with or experienced in the user activities they are supporting. If they operate from their gut feelings, they rely on their own experiences as users.
You cannot simply ask people for design requirements, in part because they do not understand wha...
This highlight has been truncated due to consecutive passage length restrictions.
but more because they are not aware of what they really do. Because the everyday things people do become habitual and unconscious, people are usually unable to
* Some products, systems, and websites support the way people work, keep businesses running, or help users find needed information. Other products, systems, and websites address games, other entertainment, or consumer information to support life decisions. To gather data for these consumer products we have to look at people’s life practice. To simplify language, this chapter will use practice to mean both work and life practice.
The Contextual Interview starts like a conventional interview, but after a brief overview of the practice, it transitions to ongoing observation and discussion with the user about that part of the practice that is relevant to the design focus.
Dumas and Loring (2008) have provided a systematic rationale for how to moderate a test session. They describe 10 rules for interacting with participants that put the first stake in the ground on the topic.
• The gracious host, who is responsible for making participants feel welcome from the moment they arrive to the moment they leave and who attends to their physical comfort, ensuring that the session goes smoothly and that they have a positive experience overall
• The leader, who respects participants but who is clearly in charge of the direction a...
• The neutral observer, who is unbiased and objective and who keeps interactions to a minimum while providing support and encourage...
55.2 SURVEY DESIGN IN HCI
55.2.1.2 Evaluation Surveys
Most common in HCI, evaluation surveys are administered after participants have completed a number of tasks in controlled computer environments.
Reliability of a survey is the measure of whether the survey is measuring things consistently,
The most commonly used measure for internal reliability of surveys is called “Cronbach’s Alpha Internal Reliability Coefficient” (Cronbach 1990).
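Cronbach's alpha for a k-item survey is α = (k / (k − 1)) · (1 − Σ σ²ᵢ / σ²ₜ), where σ²ᵢ are the variances of the individual items and σ²ₜ is the variance of respondents' total scores. A minimal sketch of the computation, with invented example responses:

```python
# Cronbach's alpha: alpha = (k / (k - 1)) * (1 - sum(item variances) / variance(totals))
# The example responses below are hypothetical Likert-scale ratings.

def cronbach_alpha(scores):
    """scores: one row per respondent, each row a list of k item ratings."""
    k = len(scores[0])

    def pvar(xs):
        # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [pvar([row[i] for row in scores]) for i in range(k)]
    total_var = pvar([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Example: 4 respondents answering 3 Likert items
responses = [[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2]]
print(round(cronbach_alpha(responses), 2))  # → 0.94
```

High alpha (conventionally above about 0.7) indicates that the items measure the same underlying construct consistently; in practice a statistics package would be used rather than a hand-rolled function like this one.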