Kindle Notes & Highlights
by
Salim Ismail
Read between
September 2, 2023 - January 7, 2024
The word “exponential” signifies not just a theory but also a warning: the business world is beginning to move so quickly and so purposefully that if you’re not already changing your business model and thinking differently (and way bigger), you are going to be fatally left behind.

ExO: Second Generation
The original Exponential Organizations book was published in 2014.
“The world’s biggest problems are the world’s biggest business opportunities,” and “If you want to become a billionaire, help a billion people.”
In April 2023, Jeremiah Owyang, Ben Parr, and Chris Saad tweeted a prediction: “The next billion-dollar startup will only have three employees.” The culture of that startup would be “AI first,” and it would use autonomous AI agents to get work done. All marketing and sales would be automated via AI bots, and the three employees would be:
The CEO, who would handle vision and purpose and lead public-facing marketing. She would also code and be involved in engineering.
The Product Lead, who would interface with customers and the team to manage the product roadmap and drive development.
The
The Great Resignation was big. Millions of people around the world quit their jobs rather than return to the status quo of their working lives before the COVID-19 pandemic and the resulting global lockdown. The pandemic only accelerated trends that had been building for most of the century. Over the last four decades, the half-life of learned skills has dropped from 30 years to fewer than four, in large part because of the accelerating pace of change driven by the tech revolution. According to noted business visionary John Seely Brown, this trend will continue to accelerate in the years ahead.
How did GPT-4 originate? GPT-4 is the fourth generation of the GPT (Generative Pre-trained Transformer) language processing model developed by OpenAI. The first generation of GPT, known as GPT-1, was released in 2018 and was followed by GPT-2 in 2019. Both of these models were based on the Transformer architecture, which was originally introduced in the paper “Attention Is All You Need” by Vaswani et al. in 2017. GPT-3 was released in 2020. The GPT models were designed to be trained using large amounts of unannotated text data, using a technique known as unsupervised learning. This