Modern AI systems are essentially black boxes. Data goes in at one end, and an answer comes out the other. It can be impossible to understand how the system reached its conclusion, even if you are the system’s designer and can examine the code. Researchers don’t know precisely how an AI image-classification system differentiates turtles from rifles, let alone why one of them mistook one for the other.
— A Hacker's Mind: How the Powerful Bend Society's Rules, and How to Bend Them Back