Kindle Notes & Highlights (read May 4 - October 27, 2022)
Restoring legacy systems to operational excellence is ultimately about resuscitating an iterative development process, so that the systems are maintained and continue to evolve as time goes on.
Like pottery sherds, old computer programs are artifacts of human thought.
To understand legacy systems, you have to be able to define how the original requirements were determined. You have to excavate an entire thought process and figure out what the trade-offs look like now that the options are different.
Because we don’t talk about modernizing old tech, organizations fall into the same traps over and over again.
It is easy to build things, but it is difficult to rethink them once they are in place.
The first mistake software engineers make with legacy modernization is assuming technical advancement is linear.
Adopting new practices doesn’t necessarily make technology better, but doing so almost always makes technology more complicated, and more complicated technology is hard to maintain and ultimately more prone to failure.
Changing technology should be about real value and trade-offs, not faulty assumptions that newer is by default more advanced.
Whether Big Data as a Service saves you any money depends on how big your big data actually is, where it is centralized, and how long it took to get that big in the first place. Having petabytes of data collected over a five-year period is a different situation from having petabytes generated over the course of a few hours.
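To make the contrast concrete, here is a small sketch (with illustrative numbers, not figures from the book) of the sustained ingest rate implied by accumulating the same volume over five years versus over a few hours. The volumes and durations are assumptions chosen only to show the arithmetic:

```python
# Compare the average ingest rate implied by the same total data volume
# accumulated over very different time spans (illustrative numbers).

PETABYTE_GB = 1_000_000  # 1 PB expressed in gigabytes (decimal units)

def ingest_rate_gb_per_sec(total_pb: float, hours: float) -> float:
    """Average ingest rate (GB/s) needed to accumulate total_pb in the given time."""
    return (total_pb * PETABYTE_GB) / (hours * 3600)

# Same total volume, very different operational profiles:
slow = ingest_rate_gb_per_sec(total_pb=3, hours=5 * 365 * 24)  # 3 PB over five years
burst = ingest_rate_gb_per_sec(total_pb=3, hours=3)            # 3 PB over three hours

print(f"five-year accumulation: {slow:.3f} GB/s")
print(f"three-hour burst:       {burst:.0f} GB/s")
```

The five-year case works out to a trickle of under 0.02 GB/s, while the three-hour burst demands sustained ingest of hundreds of GB/s; the infrastructure (and cost) appropriate for one is wildly wrong for the other, even though the total volume is identical.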
Value propositions are often complicated questions for this reason. It’s hard enough for a purely technical organization to get it right; it’s even harder at organizations where the only people with enough knowledge to advise on these issues are vendors.

