Naive Epistemology
One of the things I learned from studying maths and philosophy is an appreciation of what things follow from what other things: identifying and understanding which assumptions are implicit in a given argument, and which axioms are required to establish a given proof.
So when I see or hear something that I disagree with, I feel the need to trace where the disagreement comes from. Is there a difference in fact or value or something else? Am I missing some critical piece of knowledge or understanding that might lead me to change my mind? And if I want to correct someone's error, is there some piece of knowledge or understanding that I can give them that will bring them around to my way of thinking?
(By the way, this skill would seem important for teachers. If a child struggles with simple arithmetic, exactly which step in the process has the child failed to grasp? However, teachers don't always have time to work this out.)
There is also an idea of the economy of argument. What is the minimum amount of knowledge or understanding needed in a given context, and how can I avoid complicating the argument by bringing in other material that may be fascinating but not strictly relevant? (I acknowledge that I don't always follow this principle myself.) And when I'm wrong about something, how can other people help me see this without requiring me to wade through far more material than I have time for?
There was a thread on Twitter recently, prompted by some weak thinking by a certain computer scientist. @jennaburrell noted that computer science has never been very strong on epistemology – either recognizing that it implicitly has one, that there might be any other, or interrogating its weaknesses as a way of understanding the world.
Some people suggested that the solution involves philosophy.
People in CS and machine learning have been haphazardly trying to reinvent epistemology while universities make cuts to philosophy departments. Instead of getting more STEM majors we might be better off if we figured out how to send more funding to the humanities.—
Published on July 03, 2020 10:05