HBR Year in Tech 2022 covers more than a few interesting topics. It is more focused on future tech and trends (there are a lot more thought-provoking ideas in the second part of the book than in the first), or on what is about to come and hit big.
Starting with ongoing (or already known) technology:
- Discussions of "data ethics" and "AI ethics" - an interesting and educational article
- Blockchain in supply chain management (very well discussed, with all the open questions, possible trends, and technology limitations). Great article. I actually got stuck on this part of the book for such a long time, since I was looking into blockchain on the side.
- An interesting new law in Germany designed to promote digital healthcare tools. I think that is a great step in our "post-COVID" world. Hopefully we will hear more about the outcomes of this strategy on a wider scale.
- Using VR to train salespeople and other personnel whose job descriptions involve a lot of communication
The second part of the book, the one about future tech, was much more impactful for me.
The most disturbing and troubling article for me was definitely the one about the development of brain-computer interfaces (BCIs). As the book says: "Imagine if your manager could know whether you actually paid attention in your last Zoom meeting. Or, imagine if you could prepare your next presentation using only your thoughts. These scenarios might soon become a reality <..>. The development of BCI technology was initially focused on helping paralyzed people control assistive devices using their thoughts. For example, BCIs can now be used as a neurofeedback training tool to improve cognitive performance."
Looking at these statements, my thoughts stagger. With the possibility of monitoring and evaluating levels of attention, do we actually want to do that? We already expect knowledge workers to perform like machines by being productive 8 hours a day. With time, can we expect humans to stop being "human"? Will kids want to boost their brain's "GPU performance" with whatever tools are available? Will everyone want a computer in their head? Is that maybe OK? Am I just afraid of change? It is definitely a benefit that a car will stop because you are tired. Save the driver, right? But who is going to control the manual override? What if your car stops 20 meters away from the point of destination? Or you picked up a person who needs medical help? What does it mean for countries and regions that are still poor or still developing? It was such a short article, but such a huge topic. Why was AI ethics discussed in detail while this topic is left without questions? Maybe because it's still so new. But this should definitely be a topic for future investigation and monitoring.
What else is there?
- Ways to go green in tech, and a message "to take energy measurements from the system as it executes specific workloads within their application and determine its efficiency". Set that as a KPI for tech performance (see the sketch after this list). I kinda think this field could be very interesting to investigate and maybe even work in.
- Quantum computing
- Space flight and the commercial space age (curious fact here: "In 2015, for example, Argotec and Lavazza collaborated to build an espresso machine that could function in the zero-gravity environment of the ISS, delivering a bit of everyday luxury to the crew."). Somehow, when reading about space and how governments should start regulating ownership there, I was thinking about waste. A place where people will want to take waste from Earth (even though there is nothing about such an idea in the book, it kinda seems very "human-like" to fight over what is not theirs or to find a place to put all the rubbish.)
- Black people in tech
- Tech companies' self-regulation
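To make the "energy as a KPI" idea from the green-tech article a bit more concrete, here is a minimal sketch of what measuring a specific workload's energy use could look like. It assumes a Linux machine that exposes Intel RAPL counters through the powercap sysfs interface, and run_workload is just a hypothetical stand-in for whatever application code you want to measure; none of this comes from the book itself.

```python
# Minimal sketch: estimate how much energy a specific workload consumes.
# Assumes Linux with Intel RAPL exposed via the powercap sysfs interface
# (reading energy_uj may require root). Paths and the workload are illustrative.
import time

RAPL_ENERGY_FILE = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package 0 counter, microjoules


def read_energy_uj() -> int:
    """Read the cumulative package energy counter in microjoules."""
    with open(RAPL_ENERGY_FILE) as f:
        return int(f.read().strip())


def run_workload() -> None:
    """Hypothetical placeholder for the application code being measured."""
    sum(i * i for i in range(10_000_000))


start_energy = read_energy_uj()
start_time = time.monotonic()
run_workload()
elapsed = time.monotonic() - start_time
used_uj = read_energy_uj() - start_energy  # note: the hardware counter wraps around eventually

joules = used_uj / 1_000_000
print(f"Workload took {elapsed:.2f}s and roughly {joules:.2f} J "
      f"(~{joules / elapsed:.1f} W average) -- a number you could track as a KPI.")
```

Tracking that joules-per-workload figure over releases would be one simple way to turn "go green" from a slogan into a measurable performance indicator.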