the irrelevance of thinking

Mary Harrington:

As I argued here, it would be more accurate (if less snappy) to describe AI as “powerful modelling and prediction tools based on pattern recognition across very large datasets”. It is, in other words, not a type of cognition in its own right, but – to borrow a term from Marshall McLuhan – one of the “extensions of man”: specifically a means of extending cognition itself.

I don’t think this is correct; what LLMs do is not the extension of cognition but rather the simulation and commodification of the palpable products of cognition.

The people who make LLMs have little discernible interest in cognition itself. Some of them may believe that they’re interested in cognition, but what they’re really focused on is product — that is, output, what gets spat out in words or images or sounds at the conclusion of an episode of thinking.

Seeing those products, they want to simulate them so that they can commodify them: package them and serve them up in exchange for money.

This doesn’t mean that LLMs are evil, or that it’s wrong to sell products for money; only that thinking itself is irrelevant to the whole business.

Published on April 25, 2025 03:41


Alan Jacobs's Blog