A chatbot that confidently responds with made-up information is said by some AI developers to be “hallucinating.” Author and cultural critic Naomi Klein observes that the term hallucination is a clever way to market product failures: it sounds better than admitting the system makes factual mistakes or presents nonsense as fact. These frequent errors demonstrate the AI functionality fallacy and serve as a reminder that appearing knowledgeable isn’t the same as being factual.
Unmasking AI: My Mission to Protect What Is Human in a World of Machines