Kindle Notes & Highlights
Read between June 13, 2023 - January 1, 2024
Digital work, which in our larger society commands so much attention (whether it’s lionized or vilified), in government is reduced to an afterthought. It’s not what important people do, and important people don’t do it.
At times it almost seems that status in government is dependent on how distant one can be from the implementation of policy.
The temporal, organizational, structural, and cultural gaps between policy and tech teams, and between tech teams and the users of that tech, make it hard to try out strategies, learn what works, resolve ambiguities, and readjust.
Their job is simply to meet a predetermined list of requirements. These are often exceedingly specific, but nowhere in government documents will you find a requirement that the service actually works for the people who are supposed to use it.
Even the process of getting a construction permit, registering a vehicle, or just filing taxes can erode faith in our system of government.
We can’t afford this downward spiral of poor service leading to alienation and decreased political participation, which in turn lead to poorer service.
“A revolution doesn’t happen when society adopts new technologies—it happens when society adopts new behaviors.”
We saw dozens of idiosyncrasies like this one.
When you’re given a near-impossible task like clearing a backlog of 1.2 million unemployment claims, an obstacle like an angry turkey feels like a fitting warm-up act.
Perhaps some law changed and programmers coded a new work item to fit the new rules, but the original one persisted, most likely because it was still attached to active claims. Everything accumulates.
Lawmakers often have good intentions, but they continually add policy layers with too little understanding of (and, sometimes, regard for) how what they add will interact with the layers that are already cluttering the delivery environment.
Kevin had defined it away.
Finally, a bit confused, I asked why he was so reticent to talk about what the system was designed to do. “I’ve spent my entire career training my team not to have an opinion on business requirements,” he told me. “If they ask us to build a concrete boat, we’ll build a concrete boat.” Why? I asked. “Because that way, when it goes wrong, it’s not our fault.”
And here was one of the people charged with helping them, a senior official at the department, embracing the disempowerment.
He was proudly abdicating responsibility in order to avoid blame.
In the end, he could say he’d just been following the established process. He’d just been doing what he was told.
For people stuck in waterfall frameworks, data is not a tool in their hands. It’s something other people use as a stick to beat them with.
Today, if there’s any relationship between inaccuracy of the applicant data and fraud, it’s an inverse correlation. Our world is awash in databases of stolen identities from breaches at credit monitoring services, retailers, and employers, and these stolen identities are freely traded on the dark web.
I am much more likely to get my own Social Security number wrong than a sophisticated criminal enterprise is, especially if I’m two-fingering it into a wonky, hard-to-see web form on a tiny keyboard on my mobile phone.
The status quo was safer. That’s how the waterfall’s pledge not to learn anything while doing the work operates.
General Stanley McChrystal put it this way: “I tell people, ‘Don’t follow my orders. Follow the orders I would have given you if I were there and knew what you know.’”
I credit Marina and her team with superpowers—both the empathy to build trust with people like Carl and the analytical acumen to build the first accurate picture of what was going wrong.
And it wasn’t newer and sexier tools to replace legacy technology, since the infamous COBOL code chugged along just the same the whole time.
But Paula was the product of a system that values deference to the hierarchy and punishes risk taking.
State and federal civil service rules are a big part of that system, but they are simply the expression of a culture in which fidelity to flawed rules and practices is valued more than solving problems.
What we need has to do less with updating rigid 1950s code than with updating rigid 1950s thinking.
In 2010, the US Air Force awarded the defense contractor Raytheon a $1.5 billion contract to develop the Next Generation GPS Operational Control System, known as OCX.
Defense Department officials had already revised their project budget from $1.5 billion to $3.7 billion and would soon revise it again to $5.5 billion.
Requirements are the foundation of software development processes in government, and the source of many of its failures.
Yet when the team approached VA management with a plan to redo the form, they were told there was no need. The vendor that had been hired to build the application had fulfilled all the specified requirements. The contracting officer in the VA had signed off on the completed project. Officially, there was nothing to fix.
By any reasonable definition, the form didn’t work. But the way we build government technology is to specify the requirements and fulfill the requirements. That had happened. There had been no requirement to test the software outside the building and no requirement that the software actually work.
What can look from the outside like solving a problem can feel for those on the inside like creating one...
The outcome mattered more than the process.
As with so much detective work, it can be satisfying to solve the puzzle but truly unsettling to see the answer.
The accountability trap is a damned-if-you-do, damned-if-you-don’t situation.
On the other hand, violations of policy, process, and procedure—real or perceived—can do all of that, even if there is no hearing.
These discussions tend to function as a vetocracy, in which it takes all thumbs up in order to accept the risk, and only one thumbs down to stick with the less risky option.
In the business world, they say that culture eats strategy for breakfast—meaning that the people implementing the strategy, and the skills, attitudes, and assumptions they bring to it, will make more difference than even the most brilliant plan.
Government’s obsession with requirements—voluminous, detailed requirements that can take so long to compile that the software is obsolete before it’s even bid out—stems from a delusion that it’s possible to make a work plan so specific that it requires no further decision-making. You hand it off and the developers just do exactly what they’re told. Why not let those developers choose the best tool or platform for the job? In part, because they sit at the bottom of the waterfall.
But the goal in government seems to be to drain the job of software development of any opportunity to exercise judgment.
What I do know is that the perverse effects of glorifying process are far greater in technology, for the simple reason that there are so few people in government who understand tech.
So why do we have so little tech expertise in government? And why do we treat the experts we do have with so little regard?
Somehow, the US government went from being a technology pioneer to seeming digitally incompetent. What happened? In a certain sense, the digital revolution had bad timing.
But Mike’s biggest problem wasn’t the deadline or the procurement rules themselves—it was the prevailing operating model that says that government staff manage but they don’t implement. Especially not when it comes to digital. They rely on contractors for that.
A big reason for the lag is that our government spent the first several decades of the digital revolution treating tech systems like steel—as a commodity to be bought, not a capability to be developed.
Our modern-day Grace Hoppers and Alphonse Chapanises are spending their days on paperwork instead of programming and design.
The lack of technological know-how within government doesn’t only make it hard to get projects started, as Mike Byrne found with his broadband map.
Though government should buy commodity products for commodity functions, when it’s not accounting or payroll but your agency’s mission, the technology needs to be your product. It can’t just be a project that was contracted for, developed, tested, and declared “done.”
You need to own the code, and you need to be able to change it to meet your needs. This doesn’t mean that you can’t use contractors at all—in government, you will almost certainly use them. It means that you must have the core competencies to support a living, ever-adapting system.
Government knows how to acquire technology. What we need to acqui...