Anything that requires exact recall is likely to result in a hallucination, though giving AI the ability to use outside resources, like web searches, might change this equation. And you can't figure out why an AI is generating a hallucination by asking it. It is not conscious of its own processes. So if you ask it to explain itself, the AI will appear to give you the right answer, but it will have nothing to do with the process that generated the original result. The system has no way of explaining its decisions, or even knowing what those decisions were. Instead, it is (you guessed it) merely …
— Co-Intelligence: The Definitive, Bestselling Guide to Living and Working with AI