Inspired: How to Create Tech Products Customers Love (Silicon Valley Product Group)
12%
The reality of startup life is that you're in a race to achieve product/market fit before you run out of money.
13%
The technology infrastructure that was created to meet the needs of the initial product is often bursting at the seams, and you start to hear the term “technical debt” from every engineer you speak with. This stage is also tough on leaders because the leadership style and mechanisms that worked while the company was a young startup often fail to scale. Leaders are forced to change their roles and, in many cases, their behaviors.
14%
Once an idea makes it to the top of the list, the first thing that's done is for a product manager to talk to the stakeholders to flesh out the idea and to come up with a set of “requirements.” These requirements might be user stories, or they might be more like some form of a functional specification. Their purpose is to communicate with the designers and engineers what needs to be built.
15%
projects are output and product is all about outcome.
15%
The biggest flaw of the old waterfall process has always been, and remains, that all the risk is at the end, which means that customer validation happens way too late.
16%
Risks are tackled up front, rather than at the end. In modern teams, we tackle these risks prior to deciding to build anything. These risks include value risk (whether customers will buy it), usability risk (whether users can figure out how to use it), feasibility risk (whether our engineers can build what we need with the time, skills, and technology we have), and business viability risk (whether this solution also works for the various aspects of our business—sales, marketing, finance, legal, etc.).
17%
The purpose of product discovery is to quickly separate the good ideas from the bad. The output of product discovery is a validated product backlog. Specifically, this means getting answers to four critical questions: Will the user buy this (or choose to use it)? Can the user figure out how to use this? Can our engineers build this? Can our stakeholders support this?
33%
A strong product culture means that the team understands the importance of continuous and rapid testing and learning. They understand that they need to make mistakes in order to learn, but they need to make them quickly and mitigate the risks.
34%
The main obstacle to rapid delivery is often technical debt,
41%
Before we jump into the alternative, however, we need to remind ourselves that roadmaps have existed for so long because they serve two purposes, and these needs don't go away: The first purpose is because the management of the company wants to make sure that teams are working on the highest‐business‐value items first. The second purpose is because—since they're trying to run a business—there are cases where they need to make date‐based commitments, and the roadmap is where they see and track those commitments (even though in most companies, they rarely trust the dates anymore). So, to be ...more
41%
The idea behind business objectives is simple enough: tell the team what you need them to accomplish and how the results will be measured, and let the team figure out the best way to solve the problems.
41%
There are several benefits to this way of working: First, the teams are much more motivated when they are free to solve the problem the best way they see fit. It's the missionary versus mercenary thing again. Moreover, the teams are designed to be in the best position to solve these problems. Second, the team is not off the hook just by delivering a requested feature or project. The feature must solve the business problem (as measured by the key results);
44%
Most important, the product vision should be inspiring, and the product strategy should be focused.
44%
Fall in love with the problem, not with the solution.
45%
Evangelize continuously and relentlessly. There is no such thing as over‐communicating when it comes to explaining and selling the vision. Especially in larger organizations, there is simply no escaping the need for near‐constant evangelization. You'll find that people in all corners of the company will at random times get nervous or scared about something they see or hear. Quickly reassure them before their fear infects others.
46%
Objectives should be qualitative; key results need to be quantitative/measurable. Key results should be a measure of business results, not output or tasks.
46%
Agree as an organization on how you will be evaluating or scoring your key results. There are different approaches to this, and it's in large part a reflection of your particular company culture. What's important here is consistency across the organization, so that teams know when they can depend on one another. It's common to define a score of 0 (on a scale from 0 to 1.0) if you essentially make no progress, 0.3 if you just did the bare minimum—what you know you can achieve, 0.7 if you've accomplished more than the minimum and have really done what you'd hoped you would achieve, and 1.0 if ...more
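As a rough illustration of that 0-to-1.0 scale, here is a minimal sketch of turning a measured key result into a score. The linear interpolation between the 0.3 (commit) and 0.7 (target) levels, and all names and numbers, are assumptions for illustration, not something the book prescribes.

```typescript
// Minimal sketch of the 0-to-1.0 key-result scoring scale described above.
// The thresholds (commit ~= 0.3, target ~= 0.7), the linear interpolation,
// and all names are illustrative assumptions.

interface KeyResult {
  description: string; // quantitative and measurable, e.g. a business result
  baseline: number;    // where the metric started
  commit: number;      // the bare minimum you know you can achieve (scores ~0.3)
  target: number;      // what you hoped to achieve (scores ~0.7)
  actual: number;      // the measured result at the end of the period
}

// Map the actual result onto the 0-1.0 scale, capping at 1.0.
function scoreKeyResult(kr: KeyResult): number {
  const { baseline, commit, target, actual } = kr;
  if (actual <= baseline) return 0;
  if (actual <= commit) return 0.3 * (actual - baseline) / (commit - baseline);
  if (actual <= target) return 0.3 + 0.4 * (actual - commit) / (target - commit);
  return Math.min(1.0, 0.7 + 0.3 * (actual - target) / (target - commit));
}

// Example: 30-day retention moved from 35% to 44% against a 40% commit and 45% target.
const score = scoreKeyResult({
  description: "Raise 30-day retention to 45%",
  baseline: 35, commit: 40, target: 45, actual: 44,
});
console.log(score.toFixed(2)); // ~0.62
```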
50%
the product manager and product designer do need to ensure that they're available to answer questions from the engineers that arise during delivery activities. Normally, answering these delivery questions is on the order of half an hour to an hour of time per day.
51%
The purpose of product discovery is to address these critical risks: Will the customer buy this, or choose to use it? (Value risk) Can the user figure out how to use it? (Usability risk) Can we build it? (Feasibility risk) Does this solution work for our business? (Business viability risk)
62%
One of the most common techniques for assessing product/market fit is known as the Sean Ellis test. This involves surveying your users (those in your target market that have used the product recently, at least a couple times, and you know from the analytics that they've at least made it through to the core value of the product) and asking them how they'd feel if they could no longer use this product. (The choices are “very disappointed,” “somewhat disappointed,” “don't care,” and “no longer relevant because I no longer use.”). The general rule of thumb is that if more than 40 percent of the ...more
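For concreteness, here is a minimal sketch of tallying such a survey, assuming the common reading that the 40 percent rule of thumb applies to the share of respondents answering "very disappointed"; the function names and sample responses are illustrative only.

```typescript
// Minimal sketch of scoring a Sean Ellis product/market fit survey.
// Assumes the rule of thumb is applied to the "very disappointed" share;
// all names and sample data are illustrative.

type SeanEllisAnswer =
  | "very disappointed"
  | "somewhat disappointed"
  | "don't care"
  | "no longer relevant because I no longer use";

function veryDisappointedShare(answers: SeanEllisAnswer[]): number {
  if (answers.length === 0) return 0;
  const count = answers.filter((a) => a === "very disappointed").length;
  return count / answers.length;
}

// Only survey users who recently reached the product's core value,
// per the sampling criteria described above.
const responses: SeanEllisAnswer[] = [
  "very disappointed",
  "very disappointed",
  "somewhat disappointed",
  "don't care",
  "very disappointed",
];

const share = veryDisappointedShare(responses);
console.log(`${(share * 100).toFixed(0)}% answered "very disappointed"`); // 60%
```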
64%
How does the customer solve this problem today?
65%
allow, and even encourage, our customers to use our products to solve problems other than what we planned for and officially support.
Alexandra Baker
"Customer misbehavior" technique - e.g. Ebay's everything else category
65%
If you find your customers using your product in ways you didn't predict, this is potentially very valuable information. Dig in a little and learn what problem they are trying to solve and why they believe your product might provide the right foundation. Do this enough and you'll soon see patterns and, potentially, some very big product opportunities.
71%
If you remember the key questions from the Customer Interview Technique, we want to learn whether the user or customer really has the problems we think they have, and how they solve those problems today, and what it would take for them to switch.
71%
When you first start the actual usability test, make sure to tell your subject that this is just a prototype, it's a very early product idea, and it's not real. Explain that she won't be hurting your feelings by giving her candid feedback, good or bad. You're testing the ideas in the prototype, you're not testing her. She can't pass or fail—only the prototype can pass or fail.
71%
One more thing before you jump into your tasks: See if they can tell from the landing page of your prototype what it is that you do, and especially what might be valuable or appealing to them. Again, once they jump into tasks, you'll lose that first‐time visitor context, so don't waste the opportunity. You'll find that landing pages are...
71%
In general, you'll want to avoid giving any help or leading the witness in any way. If you see the user scrolling the page up and down and clearly looking for something, it's okay to ask the user what specifically she's looking for, as that information is very valuable to you. Some people ask users to keep a running narration of what they're thinking, but I find this tends to put people in critique mode, as it's not a natural behavior.
73%
The demand‐testing technique is called a fake door demand test. The idea is that we put the button or menu item into the user experience exactly where we believe it should be. But, when the user clicks that button, rather than taking the user to the new feature, it instead takes the user to a special page that explains that you are studying the possibility of adding this new feature, and you are seeking customers to talk to about this. The page also provides a way for the user to volunteer (by providing their e‐mail or phone number, for example).
Alexandra Baker
Painted Door!
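A minimal sketch of wiring up such a fake door in the browser: the element id, the /api/events endpoint, and the /interest-in-reports page are hypothetical stand-ins; the point is just that the click is recorded as the demand signal and the user lands on the explanation/volunteer page.

```typescript
// Minimal sketch of a fake door (painted door) demand test.
// The element id, event endpoint, and interest page are hypothetical.

// The button sits exactly where the real feature's entry point would be.
const fakeDoorButton = document.getElementById("export-reports-button");

fakeDoorButton?.addEventListener("click", () => {
  // Record the demand signal: these clicks are the data we are really after.
  fetch("/api/events", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ event: "fake_door_click", feature: "export-reports" }),
  });

  // Instead of the feature, send the user to a page explaining that we are
  // studying the idea and letting them volunteer an e-mail to talk about it.
  window.location.assign("/interest-in-reports");
});
```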
75%
Using Time to Demonstrate Value
Especially with businesses, you can also ask the person if they'd be willing to schedule some significant time with you to work on this (even if we don't need it). This is another way people pay for value.
76%
Optimization testing is where we experiment with different calls to action, different color treatments on a button, and so forth. Conceptually they are the same, but in practice there are some differences. Optimization testing is normally surface‐level, low‐risk changes, which we often test in a split test (50:50). In discovery A/B testing, we usually have the current product showing to 99 percent of our users, and the live‐data prototype showing to only 1 percent of our users or less. We monitor the A/B test more closely.
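A minimal sketch of how that traffic allocation might be assigned deterministically per user; the hashing scheme, names, and exposure percentages in the example calls are assumptions for illustration.

```typescript
// Minimal sketch of the traffic splits described above: a 50:50 optimization
// test versus a discovery test exposing ~1% of users to the live-data prototype.
// The hashing scheme and names are illustrative assumptions.

function hashToUnitInterval(userId: string): number {
  // Simple deterministic hash so a given user always lands in the same bucket.
  let h = 0;
  for (const ch of userId) h = (h * 31 + ch.charCodeAt(0)) >>> 0;
  return (h % 10_000) / 10_000;
}

function assignVariant(userId: string, prototypeShare: number): "current" | "prototype" {
  return hashToUnitInterval(userId) < prototypeShare ? "prototype" : "current";
}

// Optimization (split) test: 50% of users see the variant.
assignVariant("user-123", 0.5);

// Discovery A/B test: ~1% of users see the live-data prototype, and we monitor
// that small group much more closely.
assignVariant("user-123", 0.01);
```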
77%
Here is the core set for most tech products:
User behavior analytics (click paths, engagement)
Business analytics (active users, conversion rate, lifetime value, retention)
Financial analytics (ASP, billings, time to close)
Performance (load time, uptime)
Operational costs (storage, hosting)
Go‐to‐market costs (acquisition costs, cost of sales, programs)
Sentiment (NPS, customer satisfaction, surveys)
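For concreteness, one possible way to capture that core set as a single record; the field names, groupings, and units are assumptions, not a standard schema.

```typescript
// Minimal sketch of the core analytics set above as one snapshot record.
// All field names and units are illustrative assumptions.

interface ProductAnalyticsSnapshot {
  userBehavior: { clickPaths: string[]; engagementMinutesPerDay: number };
  business: { activeUsers: number; conversionRate: number; lifetimeValue: number; retention30Day: number };
  financial: { averageSellingPrice: number; billings: number; daysToClose: number };
  performance: { pageLoadMs: number; uptimePercent: number };
  operationalCosts: { storageUsd: number; hostingUsd: number };
  goToMarketCosts: { acquisitionUsd: number; costOfSalesUsd: number; programsUsd: number };
  sentiment: { nps: number; csat: number; surveyResponses: number };
}
```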
78%
When we talk about validating feasibility, the engineers are really trying to answer several related questions: Do we know how to build this? Do we have the skills on the team to build this? Do we have enough time to build this? Do we need any architectural changes to build this? Do we have on hand all the components we need to build this? Do we understand the dependencies involved in building this? Will the performance be acceptable? Will it scale to the levels we need? Do we have the infrastructure necessary to test and run this? Can we afford the cost to provision this?
80%
If what you are proposing would represent a departure from what the sales channel has proven their ability to sell, sit down with the sales leadership and show them what you are proposing before you build anything, and see if together you can figure out a way to effectively sell this.
80%
Some tech companies have what's referred to as a high‐touch model of helping their customers, and some have a low‐touch model. You need to understand what your company's customer success strategy is, and you need to ensure that your products are aligned with that strategy. Again, if you are proposing something that would represent a change, you'll want to sit down with leadership and discuss the options.
82%
A discovery sprint is a one‐week time box of product discovery work, designed to tackle a substantial problem or risk your product team is facing.
Alexandra Baker
Aka "design sprint"
86%
a group setting is not the forum for designing strong products. It results in design by committee, which yields mediocre results at best. Instead, meet privately with each stakeholder, show them the high‐fidelity prototype, and give them the chance to raise any concerns.
87%
As part of my advisory work, I'm often on the interview team for senior positions, and when the person is coming from one of these types of companies, I'm up front with the prospective hire. We'll talk about the reasons why their former company has not produced successful new products in many years, and I'll emphasize to them that the new company is interested in them because of their mind and their talents, and of course they wouldn't want to bring with them the bad practices of their former company.
Alexandra Baker
Note for interviewing senior candidates.
88%
Good teams get their inspiration and product ideas from their vision and objectives, from observing customers' struggle, from analyzing the data customers generate from using their product, and from constantly seeking to apply new technology to solve real problems. Bad teams gather requirements from sales and customers.
89%
Good teams are skilled in the many techniques to rapidly try out product ideas to determine which ones are truly worth building. Bad teams hold meetings to generate prioritized roadmaps.
89%
Focused product strategy. One of the surest paths to product failure is to try to please everyone at once. Yet large companies often forget this reality. The product strategy needs to spell out a logical and intentional sequence of target markets for the product teams to focus on.