Kindle Notes & Highlights
Using elements of design thinking helps us really understand the problems we’re solving, so we don’t solve problems we imagine people have. Prototyping and testing solutions before we invest big in building full-featured scalable solutions helps us validate that we’re building solutions people really value and can use. But design thinking alone can lead to some problems.
Fail to focus on a specific problem, and instead try to solve lots of problems for lots of people. The more problems you solve, the better, right? Except that big problems often result in big solutions. And trying to solve a problem for people with conflicting needs can result in a solution none of them likes.
Steve Blank described a process for progressively validating that you’ve found customers who are interested in a solution, and for then validating that the solutions you have in mind are the solutions they’ll buy, use, and tell others about. Blank referred to this as a validated learning process.
Eric Ries’s biggest contribution to product development is simplifying and “productizing” that thinking into this simple mantra: build-measure-learn. Eric emphasized reducing the time it takes to get through this simple learning cycle. One of the biggest flaws in traditional design processes is spending a very long time learning and designing—so long that you become very attached to the solutions—and then failing to validate that those solutions really do bring about the outcomes you intended. Where a typical design process may take weeks or months to validate a solution idea, a Lean Startup …
“There’s not much risk or uncertainty in this project,” you need to remember that those are famous last words.
I capture my assumptions and guesses about who my users are, usually by sketching simple prototypes. I’ll describe how I think they work today by building simple “now” story maps. I’ll do this collaboratively with other people who have firsthand experience with users and customers. And in some situations, I’ll do this with customers and users directly involved. So, in fact, many of our team’s guesses aren’t guesses at all. But it’s not exactly research like we used to do, either. We’ll spend hours to a couple of days doing stuff like this—never weeks or months.
We’ll form a hypothesis in our minds about how we think they’ll behave with our solution. We’ll also discuss technical risks—things that would threaten the feasibility of our solution.
Given a list of risks and assumptions about our customers, our users, and our solution, we’ll identify what we think the few biggest risks are.
But using a Lean Startup approach, where our goal is to learn something as quickly as possible, we’ll do our best to make the smallest prototype possible. In lots of cases, it’ll be hard to call it a prototype at all.
That smallest possible solution to test is what Lean Startup refers to as a minimum viable product. Yes, Eric Ries knows it’s not a whole product. But, when your goal is learning, it is the smallest product you could build to learn.
Put the test in front of customers and users. In early work this usually means scheduling interviews and spending time with people. If you’re creating a consumer solution, you can do customer intercepts, which is a technical way of saying go to where your customers and users are, stop them, and talk to them.
Rethink Your Solution and Your Assumptions
After running your test a few times, you’ll begin to get predictable results. If you’re dead wrong, you’ll often learn that pretty quickly. Take back what you’ve learned. Roll those facts back into what you thought you knew about your users and the way they work today. Use that to rethink your solution. Then rethink your assumptions about users and solutions. Then design your next test. After the JSTOR folks ran their tests, they learned that some students didn’t have the problems they thought they did. Normally this would be disappointing news …
In a Lean Startup approach, failing to learn is frequently the biggest failure.
In a Lean Startup approach, build means build the smallest possible experiment you can. Measure may be analytics gathered from working software, direct observations from interviews and face-to-face testing of prototypes, or both. Learning is what we do with the information. It’s the rethinking of our assumptions and reforming what we believe to be a best solution going forward.
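Read as a loop, that build-measure-learn cycle can be sketched in code. This is a minimal, illustrative sketch, not anything the book prescribes: the names (`build_measure_learn`, `run_experiment`, `risk_floor`) and the idea of a numeric risk score per assumption are all my own assumptions here.

```python
def build_measure_learn(assumptions, run_experiment, risk_floor=0.2, max_cycles=10):
    """Repeatedly test the riskiest remaining assumption with a small experiment.

    Each assumption is a dict with a "name" and a toy "risk" score (illustrative).
    run_experiment stands in for the whole build-and-measure step: it returns
    True if the evidence confirmed the assumption, False if it invalidated it.
    """
    for _ in range(max_cycles):
        if not assumptions:
            break
        # Focus each cycle on the single biggest risk.
        riskiest = max(assumptions, key=lambda a: a["risk"])
        if riskiest["risk"] < risk_floor:
            break  # remaining risks are tolerable; stop experimenting
        confirmed = run_experiment(riskiest)  # build + measure: observe real behavior
        if confirmed:
            riskiest["risk"] = 0.0            # learn: belief validated
        else:
            assumptions.remove(riskiest)      # learn: assumption invalidated; rethink
    return assumptions


assumptions = [
    {"name": "students lack library access", "risk": 0.9},
    {"name": "the search UI is learnable", "risk": 0.1},
]
# Pretend every experiment invalidates what it tests:
remaining = build_measure_learn(list(assumptions), lambda a: False)
# only the low-risk assumption survives
```

The point of the sketch is the shape of the loop: the smallest experiment targets the single biggest risk, and what you learn rewrites the list of things you believe before the next cycle.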
Throughout a validated learning approach, you’ll be constantly telling stories about who your users are, what they’re doing, and why. You’ll use story maps to tell bigger stories about how people work today, and how you imagine they’ll use your solution. When it comes time to build prototypes, you’ll use stories and story conversations to agree specifically on what the prototype you’re building should look like, and what you’ll check to confirm the prototype is done. After you understand that stories are a way of working, you’ll find it’s difficult to tell when you are or aren’t using them.
We hope to build simple prototypes in hours, not days. Even the prototypes we build using code and live data we hope will take days, not weeks. We’re building to learn, and we expect most of our ideas to fail, or at minimum, need some adjustment to be successful. So we focus on working together quickly, agreeing quickly, and minimizing the formality.
During discovery and validated learning, you may be telling stories constantly, breaking ideas and work down into small buildable pieces, and agreeing on exactly what to build. You’ll be doing it so fast that it won’t be clear you’re using stories. But you are.
But now it gets real. It’s time to have our best, last conversations.
We’d like to get to work building these things, and we know that building the software that our storytelling describes will go smoothly and predictably if we can concisely describe exactly what we’d like to build.
I want you to picture an elegantly designed little machine. We’ll drop jagged, rough stories from our release backlog into a big funnel on the left side. Then, inside the machine, we’ll hear a little grinding and clattering. But then, out of a little spout on the right side, come small, polished nuggets. These nuggets are the things that team members can pick up and use to predictably build high-quality software.
The special, secret mechanism hidden inside this machine is a story workshop.
Story workshops are small, productive conversations where the right people work together to tell the stories one last time and, in the process, make all the tough decisions about exactly what they’ll choose to build. These are the deep story conversations that result in confirmation. Finally, we’re getting to that third C in the card-conversation-confirmation flow. And it’s this C that helps us cut and polish these rocks.
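One way to picture the card-conversation-confirmation flow is as a simple data structure that accumulates what each C produces. This is a sketch of my own; the field names are illustrative assumptions, not terminology from the book:

```python
# A sketch of the three Cs as a data structure: the card that starts the
# conversation, the conversation notes, and the confirmation criteria.
# All field names and example strings here are invented for illustration.
from dataclasses import dataclass, field


@dataclass
class StoryCard:
    card: str                                        # short title that sparks the talk
    conversation: list = field(default_factory=list)  # notes captured while talking
    confirmation: list = field(default_factory=list)  # acceptance criteria agreed on


story = StoryCard(card="Patron searches the journal archive by title")
story.conversation.append("Search should match partial titles")
story.confirmation.append("A title keyword returns every matching journal")
```

The structure makes the asymmetry visible: the card stays small, while the conversation and confirmation lists grow as the workshop does its grinding and polishing.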
You’ll need a small group that includes a developer, a tester, and people who understand users and how the UI will look and behave—UI designers or business analysts in some organizations. This goes best when the group is small enough to work together effectively in front of a whiteboard. That’s usually three to five people.
We’ve got to come out of this conversation with solid shared understanding, and we’ll need space to have those productive words-and-pictures conversations.
exactly what will we build?
Happily, you have exactly the right people in the room to help you break down this story into smaller stories that can be delivered, tested, and demonstrated in the growing product the team is working together to build.
the acceptance criteria for what we’ll choose to build.
Someone who understands users and how the user interface could or should work—often a product owner, user experience professional, or business analyst
One or two developers who understand the codebase you’ll be adding the software into, because they’ll best understand what’s feasible to build
A tester who’ll help test the product—because he’ll help ask the tough questions so that we consider the “what abouts” that others...
Exactly how the software behaves underneath that user interface—those sticky business rules and data validation stuff
What will we check to confirm this software is done? How will we demonstrate this software later when we review it together?
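Those confirmation questions often end up written down as concrete, checkable acceptance criteria. Here is a minimal sketch of that idea using an invented sign-up story; the feature, its rules, and every name below are hypothetical, not from the text:

```python
# Hypothetical acceptance criteria for an imagined "sign-up" story, written
# as executable checks. Only the idea of confirmation-as-checks comes from
# the text; the feature and its rules are invented for illustration.

def sign_up(email, password):
    """Toy implementation of the story under test."""
    if "@" not in email:
        return {"ok": False, "error": "invalid email"}
    if len(password) < 8:
        return {"ok": False, "error": "password too short"}
    return {"ok": True, "error": None}


# Acceptance criteria agreed in the story workshop:
assert sign_up("sam@example.com", "s3cretpass")["ok"]                     # valid input succeeds
assert sign_up("not-an-email", "s3cretpass")["error"] == "invalid email"  # bad email rejected
assert sign_up("sam@example.com", "short")["error"] == "password too short"
```

Written this way, the same criteria answer both questions at once: they confirm the software is done, and they double as the script for demonstrating it later in the review.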
My first foray into the world of an Agile project team in my role as a business analyst was a cold, hard lesson in the power of collaboration over the written word. — Nicola Adams