Kindle Notes & Highlights
Read between August 30, 2023 and May 20, 2024
Disruptive Technologies

Harvard Business School professor Clayton M. Christensen first used the term disruptive technology in his 1997 book The Innovator's Dilemma. Christensen separated emerging technologies into two categories: sustaining and disruptive. Sustaining technology relied on incremental improvements to an already-established technology, while a disruptive technology was one that displaced an established technology. A disruptive technology also had the potential to transform an industry or develop a novel product to create a completely new industry.
Defining whether a technology is disruptive is not an exact science; it is largely open to interpretation. One study proposed four factors that could be used in making this determination: (1) the technology is rapidly advancing or experiencing breakthroughs, (2) the potential scope of the impact is broad, (3) significant economic value could be affected, and (4) the potential for disruptive economic impact is substantial.
The same study identified twelve technologies that were considered disruptive: mobile internet, automation of knowledge work, Internet of Things, cloud computing, advanced robotics, autonomous vehicles, next-generation genomics, energy storage, 3-D printing,
advanced materials, advanced oil and gas exploration, and renewable energy. It also highlighted several important technologies that were potentially disruptive that did not make the list, including fusion power, carbon sequestration, advanced water purification, and quantum computing.18 Another such list of disruptive technologies retrospectively identified the personal computer, Windows operating systems, email, cell phones, mobile computing, smartphones, cloud computing, and social networking as disruptive technologies.19
Democratization of Technology

The democratization of technology refers to the increasing accessibility of technologies across the world for individuals, industries, and nations. Through democratization of technology, more people have access to sophisticated capabilities, tools, and techniques that were once the exclusive purview of a few. History provides countless examples of this technology democratization, which both accompanies and fuels the globalization that has been undergoing exponential growth over the past several millennia.
TECHNOLOGY DEVELOPMENT IS NOT WITHOUT RISK

History demonstrates that technology development is never without risk. Accidents do occur, and development does not proceed unimpeded, with one success leading to another and another until the technology is mature.
What are the acceptable limits for use of these types of systems, and who should decide these limits?
Getting technology wrong can be catastrophic. Failure to understand the physical phenomena governing a technology can result in erroneous and potentially dangerous conclusions. Not understanding the operational context within which the technology will be employed can also lead to catastrophic outcomes. Finally, inadequate design specifications can result in technologies that lack the resiliency to operate outside of the narrow operational envelopes for which they have been designed and built. Indeed, history provides numerous examples of getting it wrong.
A common element in these cases—from the Maginot Line to the electrical grid—was that developers failed to understand the relevant operational environments and design their technologies to have the structure and resilience to operate within them.
Another major pitfall to be avoided is the “valley of death” that frequently occurs in technology development. This is very much related to the question of connectedness versus disconnectedness addressed previously. If a technology does not have a willing customer on the other side of the developmental process, the chances that it will cross the “valley” and become something that serves a practical purpose diminish greatly. The phrase “willing customer” is not just about the desire for the technology but includes such factors as technology feasibility, cost, and amount of training time
...more
Even where the customer has an expressed need and the resources are adequate, obstacles may appear. In the development of vaccines, heavy regulatory burdens for either human or animal vaccines translate to a lengthy period for demonstrating safety and efficacy. The normal developmental life cycle for medical countermeasures (including vaccines) is approximately a decade unless “emergency use authorization” can be obtained due to operational necessity.
In effect, the future portends the continuing convergence of technology such that the whole becomes far greater—perhaps even exponentially greater—than the sum of the parts. Through this transformative process, we come to see that technologies and combinations of technologies lose their original forms as they inherit other technologies.
First and foremost, the structured way of thinking about technology in the form of the S-curve and Brian Arthur’s evolutionary technology principles provide the bedrock for thinking about future technology development. Second, risks are inherent in any research and development. Scientific discovery is not without dangers. The path to developing successful technologies is not linear and may result in many twists and turns before a technology can be put to practical use. Along the way, technologists would do well to consider the effects of the three D’s (disruptive, democratized, and dual-use
...more
Rather, in this chapter our focus will be on understanding the evolution of the “modern” age of technology, which we will loosely define as beginning in the World War II period.
Bush stated, “A nation which depends upon others for its new basic scientific knowledge will be slow in its industrial progress and weak in its competitive position in world trade, regardless of its mechanical skill”3—thus indicating his belief in an inextricable link between scientific discovery and economic vitality.
To promote the connected R&D model and give agency officials a foundation from which to think through and evaluate proposed research programs, George H. Heilmeier, DARPA director from 1975 to 1977, developed a set of questions that became known as the Heilmeier Catechism:

What are you trying to do? Articulate your objectives using absolutely no jargon.
How is it done today, and what are the limits of current practice?
What is new in your approach and why do you think it will be successful?
Who cares? If you are successful, what difference will it make?
What are the risks?
How much will it cost?
How long will it take?
With an annual budget of $7.5 billion in 2017, the NSF provides funding for 24 percent of the federally supported basic research conducted by America’s colleges and universities.
The NSF promotes research through a competitive grant process, awarding approximately eleven thousand of the forty thousand proposals received annually.
The Defense Innovation Unit Experimental (DIUx) was established in 2015 to focus on five areas: artificial intelligence, autonomy, human systems, information technology, and space.
In response to this increasing complexity of technology, companies changed their approaches to conducting R&D. One can think of these changes in terms of four generations:

Generation 1.0: internal company assets used for all R&D
Generation 2.0: increasing complexity requiring use of specialists for certain technologies
Generation 3.0: conscious management decision to focus on core competencies and outsource for specialties to save money
Generation 4.0: decision to use strategic technology sourcing for understanding of and access to complex and specialized technologies that can be
...more
The first is the investment community composed of venture capitalists, angel investors, and strategic partners. The investment community provides funding for small businesses and entrepreneurs. It also can be important for identifying cutting-edge technologies and tracking investments to calculate rates of return for the R&D that has been conducted. In-Q-Tel, discussed previously, is such a venture capital firm, with a focus on the US Intelligence Community. These firms can be a source of funding for small start-ups and are willing to take a longer view of their investment, but they will
...more
First, the United States must become a better consumer of technology and get used to living in a world where it might not lead in all technology areas. This will likely be necessary, as the cost for leading in all critical areas (and the subcomponent technologies) would undoubtedly be unaffordable. We have already seen this occur in several areas, such as biotechnology, where China leads in high-throughput sequencing capacity and the United States (at least for now) leads in synthetic biology. The same can be said of cybersecurity, where the United States, Russia, China, and Israel all have
...more
The democratization and dual-use nature of technology combine to make more capabilities available to a wider array of people who might seek to misuse them. However, this is not to say that someone with malicious intent now has the upper hand.
Technology development is an ongoing process, requiring strategic thinking about potential risks (including threats, vulnerabilities, and consequences), dedicated R&D resources to address those risks, and recurring assessments to ensure that risks have been identified and appropriately considered.
visionary leaders
Technology development does not simply happen without purposeful effort and the proper allocation of resources. And leadership is a key component of both.
Developing a vision for the future is not too dissimilar from developing a hypothesis, as called for in the application of the scientific method. Based on an idea, concept, or need to fill a practical void, a leader, scientist, or technology developer conceives of a concept for addressing the issue under consideration. This vision or hypothesis then serves as a starting point for investigation. Just as in other endeavors, strong leadership is required to see a technology development project through and ensure that the resources expended on these efforts come to fruition through a successful
...more
the Manhattan Project, putting a man on the moon, and the development of the conceptual and technological underpinnings of the Joint Warfighting force.
President John F. Kennedy announced on May 25, 1961, before a joint session of Congress that the United States had the goal of sending an American to the moon and returning him safely to Earth by the end of the decade.4 The catalyzing speech, corresponding resource commitments, and focused R&D could not have occurred without Kennedy's leadership at this critical moment in history.
Technology opportunism implies strategic scouting of technologies to identify where work is ongoing or may be reaching maturity to allow for incorporation as a subcomponent of a larger technology or even as a stand-alone system.
The goal of becoming technology opportunistic is to link up technology with operators.
The DoD has become keenly interested in industry R&D, both in gaining access to the results of previously funded internal R&D (IRAD) and in exerting "directive authority over how these companies employed their independent research and development funds."7 The goal is to get industry to spend its R&D resources developing next-generation products that align with the DoD's highest priorities.
Becoming technology opportunistic also means that technologists must be aware of who might be developing such technologies of interest.
Part of being technology opportunistic is strategic teaming with partners to bring in key knowledge, technologies, assemblages, and capabilities.
In some instances, the government would also benefit by becoming a better consumer. Learning to make use of commercial off-the-shelf technology—where it is appropriate and a high percentage of the requirements can be satisfied with little or no modification—could be beneficial for government. While tailored solutions are likely necessary for the software to run weapons systems, they are likely unnecessary for enterprise software for government financial management and human capital systems. In a related concern, many government acquisition programs (which entail developmental activities) have
...more
These same concepts work for other technologies as well. For example, cellular telephone networks and individual backbone-system technologies are upgraded when new capabilities become available.
Private-sector strategies often employ exactly the opposite of this updatability in commercial products. Abandoning planned obsolescence offers a huge opportunity for increased public and institutional benefit and welfare.
DOTMLPF—doctrine, organization, training, materiel, leadership, personnel, facilities—a framework that focuses on building capabilities more broadly using S&T, R&D, and I&T methods.
S-curve model of technology development.
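The S-curve named above is commonly modeled with a logistic function: performance grows slowly during early research, accelerates through maturation, and plateaus as the technology nears its limits. A minimal sketch, in which the function and parameter names are illustrative assumptions rather than the book's notation:

```python
import math

def s_curve(t, limit=1.0, rate=1.0, midpoint=0.0):
    """Logistic S-curve: slow start, rapid middle growth, plateau at `limit`.

    Parameter names are illustrative, not from the book:
    `limit` is the performance ceiling the technology approaches,
    `rate` sets how steep the middle growth phase is, and
    `midpoint` is the time of the inflection point.
    """
    return limit / (1.0 + math.exp(-rate * (t - midpoint)))

# Early in development, performance sits far below the ceiling...
early = s_curve(-4.0)
# ...while late in development it approaches the ceiling (maturity).
late = s_curve(4.0)
```

The flat early portion of the curve is where the "valley of death" discussed earlier tends to claim technologies: progress looks meager precisely when sustained investment matters most.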
One technique for conserving resources and becoming more technologically opportunistic is to go from R&D to “little r, big D.” In this paradigm, one would rely on others for the bulk of the research and focus on acquiring technologies that could be developed for the desired operational use case.12
An r&D approach does imply limiting internal resources for basic and applied research while focusing funding on technology development. However, such a strategy does not imply that no research will be done but, rather, that it will be done sparingly in key areas where others are not working or on specific operational issues.
In this chapter, a framework for looking into the future will be offered to the reader for assessing individual technologies or technology fields. The concept will rely on looking to the future to assess how the interplay among five attributes—(1) science and technology maturity; (2) use case, demand, and market forces; (3) resources required; (4) policy, legal, ethical, and regulatory impediments; and (5) technology accessibility—will combine to affect the technology's development.
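One hypothetical way to operationalize the five-attribute framework is a simple scoring rubric. The 0-10 scale, the equal weighting, and all names below are my assumptions for illustration; the chapter does not prescribe a scoring method:

```python
# The five attributes come from the chapter; everything else here
# (scale, weighting, function names) is an illustrative assumption.
ATTRIBUTES = [
    "science and technology maturity",
    "use case, demand, and market forces",
    "resources required",
    "policy, legal, ethical, and regulatory impediments",
    "technology accessibility",
]

def outlook(scores):
    """Average the five attribute ratings into one rough outlook score.

    `scores` maps each attribute name to a 0-10 rating, where higher
    always means more favorable for development (so "resources
    required" should be rated as resource *availability*, and
    "impediments" as the *absence* of impediments).
    """
    missing = [a for a in ATTRIBUTES if a not in scores]
    if missing:
        raise ValueError(f"unrated attributes: {missing}")
    return sum(scores[a] for a in ATTRIBUTES) / len(ATTRIBUTES)

# A hypothetical middle-of-the-road technology scores 5.0 overall.
example = {a: 5 for a in ATTRIBUTES}
print(outlook(example))  # -> 5.0
```

Requiring every attribute to be rated, rather than silently skipping missing ones, mirrors the chapter's point that the attributes matter through their interplay, not in isolation.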
In other words, this analysis should not just consider tomorrow but tomorrow’s tomorrow when looking at future technologies. It requires developing the capacity to think multigenerationally in examining those technologies likely to dominate, how they are likely to evolve, and how they are likely to shape their respective fields and even humankind.
In the 1960s, the RAND Corporation developed the Delphi method, a structured way to address difficult questions by polling experts over multiple rounds and narrowing their opinions until consensus judgments could be identified.
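The iterative narrowing at the heart of the Delphi method can be sketched in code. This toy simulation is not RAND's actual procedure: the `pull` factor, the tolerance, and the use of the median are all illustrative assumptions. Each round, every expert revises their estimate partway toward the group median, and polling stops once the spread of opinion is small:

```python
import statistics

def delphi_rounds(estimates, tolerance=1.0, pull=0.5, max_rounds=10):
    """Toy sketch of Delphi-style convergence toward consensus.

    Each round, every expert moves their estimate a fraction `pull`
    of the way toward the group median; rounds stop once the spread
    (max - min) is within `tolerance` or `max_rounds` is reached.
    Returns the consensus value and the number of rounds used.
    """
    rounds = 0
    while max(estimates) - min(estimates) > tolerance and rounds < max_rounds:
        center = statistics.median(estimates)
        estimates = [e + pull * (center - e) for e in estimates]
        rounds += 1
    return statistics.median(estimates), rounds

# Four hypothetical expert forecasts converge after a few rounds.
consensus, rounds = delphi_rounds([2.0, 10.0, 6.0, 8.0])
```

In practice Delphi anonymizes feedback and lets experts justify outlier positions between rounds; the mechanical pull-toward-the-median here only captures the narrowing dynamic, not the deliberation.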
So how are these predictions done today, and could a method be designed to focus more clearly on the future rather than relying on the past? In evaluating technology, one often does side-by-side comparisons. The military uses this approach to compare its fighter aircraft to those of a potential adversary. For example, such an analysis might entail comparing the United States' F-35 Lightning II with China's J-31.
However, comparing attributes such as range, weapons, avionics, and speed can be misleading when trying to understand the overall contribution to warfare.
In fact, comparison of these aircraft should be less about the individual systems than how they are combined with other key sys...