Kindle Notes & Highlights
Read between February 27 - March 17, 2022
legacy modernization is less about technical implementation and more about the morale of the team doing the modernizing.
People are often too quick to equate morale issues with character flaws. Incentives play a much larger role in who’s effective at an organization than some fanciful notion of their character.
Organizations that refuse to take responsibility for the situations in which they put their own employees struggle to achieve operational excellence.
Remember, no one wants to suck at their job. Popular culture sells the myth about lazy, stupid, uncaring bureaucrats. It’s easy to dismiss people that way. Buying into the idea that those kinds of problems are really character flaws means you don’t have to recognize that you’ve created an environment where people feel trapped. They are caught between conflicting incentives with no way to win.
When a leader has lost the room, it is usually because the organization has pushed the engineering team back into a place where it is not possible to succeed.
People without confidence self-sabotage. They create self-fulfilling prophecies and display signs of learned helplessness.
Confidence comes before success. Success rarely creates confidence.
Confidence problems are always compounding. The only thing that convinces people to stop belittling themselves is knowing they have the trust and acceptance of their peers.
The way a murder board works is you put together a panel of experts who will ask questions, challenge assumptions, and attempt to poke holes in a plan or proposal put in front of them by the person or group the murder board exercise is intended to benefit. It’s called a murder board because it’s supposed to be combative. The experts aren’t just trying to point out flaws in the proposal; they are trying to outright murder the ideas.
Murder boards have two goals. The first is to prepare candidates for a stressful event by making sure they have an answer for every question, a response to every concern, and a mitigation strategy for every foreseeable problem. The second goal of a murder board is to build candidates’ confidence. If they go into the stressful event knowing that they survived the murder board process, they will know that every aspect of their plan or testimony has been battle-tested.
Legacy modernization projects do not fail because one mistake was made or something went wrong once. They fail because the organization deploys solutions that actually reinforce unsuccessful conditions.
“Planning is problem solving, while design is problem setting.”
Problem-solving versus problem-setting is the difference between being reactive and being responsive.
the process of managing a major modernization is all about manipulating scope.
During a normal team conversation, individual members are looking either to increase or to maintain their status among the group. And, what increases their status? Shooting down the ideas of others.
A good rule of thumb is questions that begin with why produce more abstract statements, while questions that begin with how generate answers that are more specific and actionable.
The Surprising Power of Liberating Structures: Simple Rules to Unleash a Culture of Innovation by Henri Lipmanowicz and Keith McCandless.
asking your team to play saboteur. If you wanted to guarantee that the project fails, what would you do? How can you achieve the worst possible outcome?
Nothing produces out-of-scope digressions more effectively than having people in meetings who don’t need to be there.
Probabilistic outcome-based decision-making is better known as betting.
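As an illustration of what probabilistic, outcome-based decision-making looks like in practice, here is a minimal sketch that weighs two hypothetical modernization options by expected value. The option names, probabilities, and payoffs are invented for the example, not taken from the book.

```python
# Sketch: framing a modernization decision as a bet.
# The options, probabilities, and payoffs below are hypothetical,
# made up purely to illustrate probability-weighted outcomes.

options = {
    "incremental refactor": [
        (0.7, 100),   # 70% chance of a modest payoff
        (0.3, -20),   # 30% chance of a small loss
    ],
    "full rewrite": [
        (0.2, 400),   # 20% chance of a big payoff
        (0.8, -150),  # 80% chance of a significant loss
    ],
}

def expected_value(outcomes):
    """Sum of each payoff weighted by its probability."""
    return sum(p * payoff for p, payoff in outcomes)

for name, outcomes in options.items():
    print(f"{name}: expected value = {expected_value(outcomes):.1f}")
```

The point of the sketch is only that each option is treated as a set of possible outcomes with probabilities attached, rather than as a single guaranteed result.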
In 1968, Melvin Conway published a paper titled “How Do Committees Invent?”
Conway’s law has become a voodoo curse—something that people believe only in retrospect. Few engineers attribute their architecture successes to the structures of their organizations, but when a product is malformed, the explanation of Conway’s law is easily accepted.
When a manager’s prestige is determined by the number of people reporting up to her and the size of her budget, the manager will be incentivized to subdivide design tasks that in turn will be reflected in the efficiency of the technical design—or
When an organization has no clear career pathway for software engineers, they grow their careers by building their reputations externally. This means getting drawn into the race of being one of the first to prove the production-scale benefits of a new paradigm, language, or technical product.
Organizations end up with patchwork solutions because the tech community rewards explorers.
Left to their own devices, software engineers will proliferate tools, ignoring feature overlaps for the sake of that one thing tool X does better than tool Y that is relevant only in that specific situation.
Well-integrated, high-functioning software that is easy to understand usually blends in. Simple solutions do not do much to enhance one’s personal brand. They are rarely worth talking about. Therefore, when an organization provides no pathway to promotion for software engineers, they are incentivized to make technical decisions that emphasize their individual contribution over integrating well into an existing system.
engineers are motivated to create named things. If something can be named, it can have a creator. If the named thing turns out to be popular, the engineer’s prestige increases, and her career will advance.
The folly of engineering culture is that we are often ashamed of signing up our organization for a future rewrite by picking the right architecture for right now, but we have no misgivings about producing systems that are difficult for others to understand and therefore impossible to maintain.
Conway argued against aspiring for a universally correct architecture. He wrote in 1968, “It is an article of faith among experienced system designers that given any system design, someone someday will find a better one to do the same job. In other words, it is misleading and incorrect to speak of the design for a specific job, unless this is understood in the context of space, time, knowledge, and technology.”
Systems do not generally fail all at once; they “drift” into failure via feedback loops caused by a desire to prevent failure.
A legacy system that throws a hundred errors is not failure-prone if it handles two million requests over the same period.
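To make that arithmetic concrete, here is a minimal sketch (with the same illustrative numbers) showing why a raw error count means little without the request volume behind it. The tolerance threshold is an assumption added for the example.

```python
# Sketch: error counts only mean something relative to traffic volume.
# The counts and the 0.1% tolerance below are hypothetical.

errors = 100
requests = 2_000_000

error_rate = errors / requests          # 0.00005, i.e. 0.005%
print(f"error rate: {error_rate:.5%}")  # -> 0.00500%

# A hundred errors against two million requests is a very different
# signal than a hundred errors against two thousand requests.
assert error_rate < 0.001  # comfortably under a 0.1% tolerance
```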
you shouldn’t hire managers who want to reorg because they read a blog post that said engineering teams work better when structured this particular way or that particular way.
Conway’s law is a tendency, not a commandment. Large, complex organizations can develop fluid and resilient communication pathways; it just requires the right leadership and the right tooling.
To find the right leadership, look for people who have been successful in a wide variety of different contexts—old systems, new systems, big bureaucracies, and small startups. Do not hire aspirationally.
This is a typical problem with legacy modernizations: the ideal solution is dependent on conditions that are either not present or not possible.
yak shaving. It’s when every problem has another problem that must be solved before it can be addressed.
A leader with low tolerance for ambiguity either doesn’t see these blockers or will not acknowledge them, so she sends a top-down directive mandating the new solution.
you don’t want to start product development with everything designed up front. Your concept of what the new system will look like will be wrong in some minor ways you can’t possibly foresee.
Don’t design the organization; let the organization design itself by choosing a structure that facilitates the communication teams will need to get the job done.
“The only thing the government hates more than change is the way things are.”
Risk is not a static number on a spreadsheet. It’s a feeling that can be manipulated, and while we may justify that feeling with statistics, probabilities, and facts, our perception of level of risk often bears no relationship to those data points.
Fear of change is all about perception of risk.
Positive reinforcement in the form of social recognition tends to be a more effective motivator than the traditional incentive structure of promotions, raises, and bonuses.
Punishment and rewards are two sides of the same coin. Rewards have a punitive effect because they, like outright punishment, are manipulative. “Do this and you’ll get that” is not really very different from “Do this or here’s what will happen to you.” In the case of incentives, the reward itself may be highly desired; but by making that bonus contingent on certain behaviors, managers manipulate their subordinates, and that experience of being controlled is likely to assume a punitive quality over time.
Given a choice between a monetary incentive and a social one, people will almost always choose the behavior that gets them the social boost.
If you want people to do the right thing despite the risk, you need to accept failure.
Sidney Dekker published an article titled “Just Culture: Who Gets to Draw the Line?”
Prescribed safety, security, and reliability processes are useful only if operators can exercise discretion when applying them. When organizations take the ability to adapt away from the software engineers in charge of a system, any gap in what the procedures cover becomes an Achilles’ heel.
Google has repeatedly promoted the notion that when services are overperforming their SLOs, teams are encouraged to create outages to bring the performance level down. The rationale for this is that perfectly running systems create a false sense of security that leads other engineering teams to stop building proper fail-safes.
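This passage is describing the error-budget logic behind SLOs. As a rough sketch of that arithmetic (not Google’s actual tooling, and with invented numbers), it looks like this:

```python
# Sketch: how an availability SLO translates into an error budget.
# The SLO target and the measurements below are hypothetical.

slo_target = 0.999            # 99.9% availability objective
window_requests = 10_000_000  # requests served in the measurement window
failed_requests = 1_200       # requests that failed in that window

error_budget = (1 - slo_target) * window_requests  # 10,000 allowed failures
budget_spent = failed_requests / error_budget      # 12% of the budget used

print(f"allowed failures: {error_budget:.0f}")
print(f"budget consumed:  {budget_spent:.0%}")

# A service that chronically spends almost none of its budget is
# "overperforming" its SLO, which is the false sense of security
# the passage warns about.
```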