Political Philosophy and Ethics discussion
Reason, Informal Logic, Evidence, and Critical Thinking
Feliks wrote: "This may be a site which helps me out!
https://theprogressivecynic.com/2014/..."
Interesting. Thanks for posting.

overview of where the entire idea of 'liberal media' came from originally
https://www.theatlantic.com/politics/...
Feliks wrote: "from The Atlantic Monthly
overview of where the entire idea of 'liberal media' came from originally
https://www.theatlantic.com/politics/...-..."
Interesting article. Thanks for the link.
Heather A. Butler, a psychology professor, has just published a brief but very interesting article on the difference between intelligence and critical thinking: "Why Do Smart People Do Foolish Things?: Intelligence Is Not the Same as Critical Thinking and the Difference Matters," Scientific American Mind 29, no. 1 (January 2018): 40-41, https://www.scientificamerican.com/in....
Butler discusses the popular understanding of "intelligence" as follows:
"What does it mean to be smart or intelligent? Our everyday use of the term is meant to describe someone who is knowledgeable and makes wise decisions, but this definition is at odds with how intelligence is traditionally measured. The most widely known measure of intelligence is the intelligence quotient, more commonly known as the IQ test, which includes visuospatial puzzles, math problems, pattern recognition, vocabulary questions, and visual searches." (40)
She contrasts the IQ definition of intelligence with "critical thinking":
"The ability to think critically, on the other hand, has been associated with wellness and longevity. Though often confused with intelligence, critical thinking is not intelligence. Critical thinking is a collection of cognitive skills that allow us to think rationally in a goal-orientated fashion, and a disposition to use those skills when appropriate. Critical thinkers are amiable skeptics. They are flexible thinkers who require evidence to support their beliefs and recognize fallacious attempts to persuade them. Critical thinking means overcoming all sorts of cognitive biases (e.g., hindsight bias, confirmation bias)." (40)
She concludes the article as follows:
"Intelligence and improving intelligence are hot topics that receive a lot of attention. It is time for critical thinking to receive a little more of that attention. . . . Reasoning and rationality more closely resemble what we mean when we say a person is smart than spatial skills and math ability. Furthermore, improving intelligence is difficult. Intelligence is largely determined by genetics. Critical thinking, though, can improve with training and the benefits have been shown to persist over time. Anyone can improve their critical thinking skills: Doing so, we can say with certainty, is a smart thing to do." (41)
Butler says what I have thought for decades. She also claims that critical thinkers "experience fewer bad things in life" because they make wiser life choices. (40) She supports her conclusions with data from the Halpern Critical Thinking Assessment. That may be true in general. But I would guess that at least some critical thinkers do not fit that profile, because their negative life experiences may be the result of being under the control (in employment or otherwise) of people who do not possess such critical thinking skills. We perhaps all have experienced the unhappiness of working under an irrational employer or supervisor, for example. And changing jobs is not always the answer: sometimes it is going from the devil to the deep blue sea (an American colloquial expression, for those who live in other countries). However this may be, Butler's analysis is worth consideration.


Her article sounds like a good one and I intend to read it. Certainly I myself always struggled in school, until I found the niche I was best situated for; and if anything my K-12 matriculation was a hindrance. Entering college was almost like starting from scratch.
And in some quarters these days, it's almost like 'smart' or 'intelligent' only means one thing to people: how well you can use your phone or computer. I've noticed a marked decrease in respect for anyone who gets through the day confidently relying on their own brain to solve problems, as if that is now "passé".
It's one more reason why I refuse to have anything to do with such a culture. There's no accomplishment I can see in 'how fast I can surf to the right website'.
Cary wrote: "The book: "Thinking: Fast and Slow" by Kahneman also explores this topic and makes a case for critical thinking too! See my review!!"
Thanks. Nice review.
Feliks wrote: "Certainly I myself always struggled in school, until I found the niche I was best situated for; and if anything my K-12 matriculation was a hindrance. Entering college was almost like starting from scratch."
They say that Einstein also had problems with the regimented German educational system. So you're in good company.

Feliks wrote: "Another observation I'm noting of late: chatting with co-workers or neighborhood residents about any random topic and they'll occasionally punctuate their remarks with extra, extra, 'weight' on a p..."
I would probably be equally lost. I only watch PBS News Hour, some political satire programs on HBO and Comedy Central, and the occasional Amazon Prime original series (heavy political-historical-futuristic drama). In a recent email, someone referenced something "Sheldon" said. Upon inquiry, this turned out to be a character on the sitcom "The Big Bang Theory," which I've never watched, though I feel I should just on the basis of the title. I don't let this stuff bother me.
Come to think of it, my wife and I also watched, years ago, the entire "Seinfeld" series (she's a Seinfeld freak), the entire "Soprano" series (which I found oddly funny), the entire "Breaking Bad" series (which gave me nightmares), and the entire "Star Trek" (first generation) series (which was great). So I get a lot of the allusions on TV to those and the other programs I mentioned above. But current sitcoms are not my cup of tea.
You are clearly in a younger generation and hang out with different kinds of people than I (indeed, any New Yorker would be outside my cultural experience, except when I was in college and grad school). My wife and I hang out with old retired folks like us--mostly with her friends from college and former jobs. We live a very sedate existence, as befits our age.



https://en.wikipedia.org/wiki/The_Pap...
Re posts 116 (Cary) and 117 (Feliks):
Having written and edited social science textbooks as well as attended law school back in the 1970s, I have some thoughts on this.
The elementary and secondary school social science textbooks prepared by my company were under the overall rubric of "Concepts and Inquiry." We attempted to get students to think—not just to memorize facts. It was a great idea but not very popular with teachers at that time. Many of them preferred the rote memorization method of ancient vintage. Indeed, although my boss and the company generally were quite conservative in their politics (much more so than I), local school boards often considered us crazy hippies because of these educational principles and methods. The series was published by a famous publishing company in Boston, and some of the editors there actually were crazy hippies. There was a lot of conflict between my boss, a former chair of the Kenyon political science department, and the Boston elitists, who seemed to want to follow the latest educational fad, no matter how insubstantial or ridiculous it was. So we fought the battle on both our right and left flanks.
As a result of funding insecurities at the nonprofit at which I worked (not part of the Boston publisher's business or legal structure), I began attending law school in the evenings and eventually graduated and passed the bar in 1979. The "Socratic method" (a misnomer if there ever was one) of legal training was, at least, calculated to make one "think like a lawyer" (or, to be more precise, to think like a law professor or appellate judge—another long story about which I could write a book or two). Yes, I saw the movie "Paper Chase" when it first came out (about a year before I started law school), and, yes, it does portray accurately the sadistic (and, to me, counterproductive) methods of some law professors.
Law school does not lend itself well to rote memorization of facts, because one is trying to learn legal concepts and principles. It was somewhat successful in that regard, though I had some reservations about the concrete details of this pedagogic approach at the time. Suffice it to say that evidentiary facts, to the practicing litigation lawyer, are every bit as important as the law, and law professors generally recognize this. Law professors and others argue endlessly about the best way to teach. I won't discuss this further, as it would take a tome (which some have produced) to cover all the issues.
Each field is somewhat different in such matters, and much depends on what one is trying to accomplish. My book on Roger Williams was heavy on both principles and facts, because one could not understand the principles without the facts, and the facts themselves had been distorted by centuries of misrepresentation. And Roger Williams's life was unusual in that the facts in his environment interacted very clearly with the principles he espoused as well as the contrary theocratic principles. The facts, correctly identified, were quite important for understanding the genesis of the principles, and people have misunderstood Williams over the centuries in part because they have operated from erroneous factual premises. But I would not expect anyone other than a professional historian to give that much attention to the facts surrounding Roger Williams. Certainly, if I were a teacher, I would not test students on factual details: the principles, not the facts, are ultimately the most important thing.
With regard to pedagogy, it is necessary to memorize the eternally important facts, for example the different timeframes of the American Revolution, the Articles of Confederation, the 1787 Constitutional Convention, the ratification of the Constitution, and the main points of the executive and legislative history of the first several presidential administrations. But testing on minute factual details that are not highly relevant to the constitutional concepts and principles being explicated is an error, especially in introductory courses.
Professional training in a given field, for example medicine, must necessarily be both fact intensive and principle intensive. One wouldn't want to have surgery performed on one's brain by someone who knows only the principles of neurosurgery and not the factual details of what the brain is about. The same applies in many other fields.
So, if I may say so, it all depends . . . . But, no, memory is not the best indication of intelligence.

I recall that in a very thorough IQ test, there was some general knowledge.
You can argue that IQ tests, insofar as they involve any testing of vocabulary, are testing rote knowledge.
But that's true for math, too. You have to remember the rules.
So what's the line between recall of facts vs. vocabulary vs. math principles?
Mark wrote: "Here's something to know: Working memory is one of 5 sections of IQ. It's distinctly different from general knowledge.
I recall that in a very thorough IQ test, there was some general knowledge.
..."
Thanks, Mark, for this information. I don't know the answer to the question in your last paragraph. Perhaps we have one or more group members who are experts on this.


It's one of the few American films I can name which even treats such a 'cerebral' topic at all. 'Intellectualism' as the basis for a plot--this hum-drum little flick inhabits a small territory all to itself.
If we had even a dozen such movies (of this exact same caliber) per year in this era, I might be much more willing to take it to task for its flaws. It's shocking to remember that this was somewhat of a 'dog' for its year. Looks pretty golden from where we stand now.
But I don't want to stray into movie reviewing so ...back to topic!
In his introductory remarks to his course on the Symposium, Leo Strauss remarked on the ad populum fallacy:
"[T]hough in all practical matters it is indispensable, either always or mostly, to follow custom, to do what is generally done, in theoretical matters it is simply untrue. In practical matters there is a right of the first occupant: what is established must be respected. In theoretical matters this cannot be. Differently stated: The rule of practice is 'let sleeping dogs lie,' do not disturb the established. In theoretical matters the rule is 'do not let sleeping dogs lie.' Therefore, we cannot defer to precedent . . . ."
Leo Strauss, On Plato's Symposium, ed. Seth Benardete (Chicago: University of Chicago Press, 2001), 1.
The ad populum fallacy is also discussed in the Plato (427-347 BCE) topic at posts 165-202 passim.
"[T]hough in all practical matters it is indispensable, either always or mostly, to follow custom, to do what is generally done, in theoretical matters it is simply untrue. In practical matters there is a right of the first occupant: what is established must be respected. In theoretical matters this cannot be. Differently stated: The rule of practice is 'let sleeping dogs lie,' do not disturb the established. In theoretical matters the rule is 'do not let sleeping dogs lie.' Therefore, we cannot defer to precedent . . . ."
Leo Strauss, On Plato's Symposium, ed. Seth Benardete (Chicago: University of Chicago Press, 2001), 1.
The ad populum fallacy is also discussed in the Plato (427-347 BCE) topic at posts 165-202 passim.
Feliks wrote (in another topic): "I have a question and I do not know where--in this discussion zone--I ought place it.
Alan and I have debated afore, on 'emotion' vs 'reason' and the subsequent arbitration of human behavior via social institutions such as law.
I'm pondering all this once again, tonight. I have only faint contact with the US legal system--a sprinkling of varietal incident such as a traffic accident, an injury case, a grand jury trial in which I was an expert witness--nothing much to speak of.
Alan has repeatedly insisted to me that reason is the supreme means to sort out right and wrong. Yet--is not the entire legal system of this country built not at all upon reason and fairness, but instead upon the notion of 'hurt' and 'harm'?
When someone experiences pain and we (their peers) deem the pain to have been inflicted by another party, is it not the degree of suffering and pain which is entirely the order of fineness, leading to the deliberation upon punishment or civil penalty?
Don't we place the utmost weight upon the element of suffering which one of our fellows has borne, this determining our verdict and little else? Isn't it the plaintiveness of the 'wound' rather than the wisdom of the 'remedy' which speaks loudest?
Any injury can have a dozen means of redress adjudicated for it by a jurist, or prescribed for it by a medico. These verdicts can be too little or too much, as happens. But the pain of the sufferer carries the most weight, doesn't it? Doesn't that pain answer all?
Isn't suffering the most 'just' arbiter and doesn't the degree of pain inflicted, always call upon a corresponding degree of redress to be extracted, from our legal system, from the malefactor?
If I am struck, and fly into rage, striking my opponent in return--isn't that that righteousness, that mammalian humanness, recognized as a response being in accord with nature and thus given more priority over merely what courts determine is 'correct'? Pain is always given more recognition, no?
When an injured party acts punitively--but with deliberation and the opportunity for forethought-- isn't it usual for vindication of his actions to be withheld (by the rest of us)?
Just musing aloud. "
Feliks, as usual you have a very cynical view of institutions and customs. I was a civil litigation lawyer (usually on the defense side) for more than three decades. It is true that some lawyers appeal to the emotional side of jurors. However, it is also true that many lawyers rely primarily on reason and evidence, and the rules structure of litigation encourages the latter. Moreover, the judge is supposed to be above merely emotional influences, though it doesn't always work that way. The entire structure of the rules of evidence is designed to reduce emotional decisionmaking. Ditto many of the procedural rules. The trial judge enforces the rules of evidence and the rules of civil (or criminal) procedure. In the event the judge fails to do so, it is an appealable issue, and the appellate court will reverse the decision of the trial court if the rules violation amounts to prejudicial error.

Hey, that book sounds good. Gonna see if I can come across a copy sometime. Thanks~!
This article in the Winter 2018 issue of The University of Chicago Magazine contains an interesting discussion of quantitative and qualitative thinking. It raises some questions concerning quantitative studies about which I have often wondered. The article seems to conclude that both types of thinking are necessary.
Feliks wrote (in another topic): "Refreshing my familiarity with logical fallacies tonight. Poking around various web pages of dubious scholarship.
Depending on the authorship of the fallacies being written about, I find myself casting a jaundiced eye over some of the interpretations offered.
Certain fallacies (ad hominem, or the Argument from Ignorance) seem much more obvious and basic (and acceptable) than others, which seem more to me like nit-picking and hair-splitting.
Does it seem to anyone else but me that a few of the classical fallacies seem written as if to advocate a path to logical behavior on some other planet than Earth? This is my reaction to reading some of these.
But then I'm naturally reminded that logic is often said (by its designers) to be intended to serve mankind in whatever environment we may find ourselves. We are admonished to let logic govern our rationality even if for instance, our species had (at any point in time) developed somewhere else in the universe. Even on some other planet.
Well. This leads me to then ask this: what if there ever had been another planet within our experience? What if we had ever stepped someplace where extremely different physical or chemical or biological laws applied? If we were raised with different gravity or different reproduction cycles?
Some formal fallacies seem to take many Earth-centric factors for granted as their basis (for their explanation), and yet at the same time say "don't fall back on nature, because it doesn't apply everywhere".
Example: the average citizen usually believes in a 'principle' like, say, 'revenge' or 'retribution' without being able to say exactly why. He just knows it; he has seen it all his life. He lives by it. It's not logical, but it is how the world generally works. Yet, logic tells this man never to use this 'worldliness' to validate his actions.
It's implied that he's allowing too much awareness of concrete everyday life to permeate his thinking. Instead (to be fully circumspect and logical) he should behave as if inhabiting an abstract planet in a theoretical, non-corporeal universe.
But under a different definition of nature and evolution (from which over millennia, our current society developed) wouldn't all logical fallacies themselves, be grossly altered? Wouldn't even the logic from which logical fallacies are drawn, be undercut?
Imagine a world where gravity did not permit a harmful or violent blow to be struck. Or a planet where nothing could 'crash' into anything else, resulting in 'harm'. How many fallacies would have to be re-written?
Just musing. I realize I may have asked this question in a different form, some time previously (this strikes me now as I read what I have written). If so, I ask for indulgence. :D "
If there are other universes (i.e., a multiverse), different laws of physics and logic may apply to those other universes. However, according to Richard Wolfson, Professor of Physics at Middlebury College, in his Great Courses lecture series entitled "Physics and Our Universe: How It All Works," the same laws of physics apply throughout our knowable universe (and always have). For example, the speed of light, a constant, is everywhere the same in our universe. Accordingly, I don't see the "logic" of talking about human beings basing human thought (including logic) and action on different laws of physics. As Aristotle famously wrote (if I recall his Metaphysics correctly), "A" is "A."
Now, Randal and others speak of "multivalent" logic, which I don't really understand, but I think they are still talking about logical principles that arguably apply within the universe that we know. I'm not certain of this, however, because this approach may be based on a radical skepticism and perhaps a profound subjectivism or relativism, but, as I say, I am not knowledgeable about this way of thinking and may not be describing it accurately.
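For concreteness, here is a minimal sketch of one well-known multivalent system, Kleene's strong three-valued logic, in Python. Whether this is the kind of multivalence Randal means is only my assumption, and all the names in the code are my own illustration, not anything from his posts. The point it shows is that once a third truth value ("unknown") is admitted, classical laws such as the excluded middle can fail:

from enum import Enum

class V(Enum):
    F = 0  # false
    U = 1  # unknown / indeterminate
    T = 2  # true

def k_not(a: V) -> V:
    # negation flips true and false; unknown stays unknown
    return V(2 - a.value)

def k_and(a: V, b: V) -> V:
    # conjunction takes the "least true" operand (F < U < T)
    return V(min(a.value, b.value))

def k_or(a: V, b: V) -> V:
    # disjunction takes the "most true" operand
    return V(max(a.value, b.value))

p = V.U
print(k_or(p, k_not(p)))  # prints V.U -- "p or not p" is no longer always true

Classically, "p or not p" is a tautology; in this three-valued setting it comes out unknown when p is unknown. That is one way such a logic can still state principles applicable in our universe while relaxing a classical assumption.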
As for replacing rational thinking (i.e., logic in the broad sense) with emotion (e.g., "revenge" or "retribution," as you say), I am already on record in earlier responses to you on that matter. Although it is unquestionable that many people operate on the basis of such irrational impulses, the statistical average does not define proper thinking. Otherwise, we have chaos and anarchy, the "war of all against all," as Hobbes said of his infamous "state of nature."
You are, of course, justified in not taking statements of logical principles as holy writ just on the basis of one "authority" or another. Some of the alleged logical fallacies are loosely defined (e.g., the so-called "naturalistic fallacy") and should at least be more specifically described instead of being bandied about for decades in rarified philosophical circles as if everyone knew what everyone else was talking about (I suspect different people mean different things by the "naturalistic fallacy"). This kind of inquiry gets into an interesting area that I wish I had time to explore further and may, in fact, do so if I live long enough.

I believe I came across a specific fallacy-definition last night called the 'acts have consequences' fallacy. The author described this as a fallacy because a naturalistic observation like "stormy skies mean it will rain" is different from a municipality admonishing us that "three strikes means prison". The latter statement is a punishment which the town has a choice to levy or not. It does not automatically follow, because different towns can have different strictness in their legal codes.
But to me (with a pragmatist's hat on) the axiomatic nature of man's society is nigh-invariable in its 'automatic' qualities. Commit a crime...you are very likely to suffer punishment; insult a neighbour, you are likely to get the same back; kick a dog and you will likely get bit, etc. For the common man, our world is very "tit for tat". Do such-and-such, and it will bring such-and-such down on your head. "Those to whom evil is done, do evil in return". Revenge, or even just principles of good luck/bad luck. Making your own bed and then (of course) lying in it. Take risks with life or property and you will not enjoy either for very long.
This kind of reasoning is what I mean; in the modern world we all live with mindfulness of 'consequences'. From this usually stems (in most people) a common, everyday morality. Being proactive to avoid harm to yourself or others. Not tossing a banana peel over your shoulder onto the sidewalk as you eat it. Shoveling ice off your driveway so that no one slips. Using your turn signal properly as you drive your vehicle. Living with awareness of liability and hurt and damage, and taking steps in advance to reduce accident and mishap. Using forethought.
But the definition of the 'acts have consequences' fallacy (which I read last night) seemed to take no account of this, and only presented it as a fallacy in argument because it can be misused by people in authority when speaking of 'punishment'. But the natural world and the man-made world are full of consequences, and recognizing this is how we actually live.
Maybe I simply mis-read the passage. But if I were in a debate with someone--and if the argument seemed to call for it--I would certainly cite the very obvious principle that yes, acts DO have consequences in this life. Am I incorrect?
Feliks wrote: "Thanks Alan. Let me see if I can make my point plainer.
I believe I came across a specific fallacy-definition last night called the 'acts have consequences' fallacy. The author described this as a ..."
I'm in a rush right now, but it strikes me that the so-called "acts-have-consequences fallacy" is not really within the purview of logical fallacies. It rather gets into legal and practical matters that are outside the realm of logic (even what is called "practical logic"). Perhaps the author you were reading was using the word 'fallacy' in a very extended sense. Gotta go now.
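To illustrate the distinction Feliks's author seems to be drawing, here is a toy sketch of my own in Python (all names hypothetical, nothing from the source): a naturalistic conditional holds regardless of anyone's decisions, whereas a penal conditional is a convention that varies with the code a jurisdiction happens to adopt.

def rain_likely(stormy_skies: bool) -> bool:
    # descriptive regularity: the consequence follows from the weather itself
    return stormy_skies

# normative rules: each town *chooses* its own threshold
PENAL_CODES = {
    "Town A": lambda strikes: strikes >= 3,  # a three-strikes rule
    "Town B": lambda strikes: strikes >= 5,  # a laxer code
}

for town, imprisons in sorted(PENAL_CODES.items()):
    print(town, imprisons(3))  # Town A: True; Town B: False

The same three offenses lead to prison in one town and not in the other, which is why "acts have consequences," said of legal penalties, describes a chosen policy rather than a logical or natural necessity.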

"American business owners who hire cheap Mexican immigrant labor off the books, paying them at less-than-minimum wage, granting them no rights and no benefits, no medical care, sweatshop conditions--these businessmen are actually liberal because they support the open border policy, the spirit of 'give us your tired, your poor, your hungry'..."
Groan.
The University of Chicago Magazine recently reprinted excerpts from a 1967 essay by Wayne C. Booth, then Dean of the University of Chicago College and professor of English, entitled "'Now Don't Try to Reason with Me': Rhetoric Today, Left, Right, and Center," originally published in the November and December 1967 issues of the magazine. The current issue reprints these remarks under the title "#FakeNews and Free Speech—in 1967." I was an undergraduate at the University of Chicago at exactly that time, and my thoughts then (as now) were exactly as Booth stated them in his essay, of which I was totally unaware. Had I read the essay at that time, I would have realized that he was a kindred spirit. I strongly recommend reading the current article, which applies as much to today's situation as it did to that of 1967—an era that I remember all too well.
Wayne Booth (1921-2005) was the author of The Rhetoric of Fiction (the first edition of which I partially read in the 1960s and/or 1970s) and Now Don't Try to Reason with Me: Essays and Ironies from a Secular Age (1970), a book of which I was unaware until today and have now put on my "to read" list.
If I recall correctly, Robert Wess, Professor Emeritus at Oregon State University and a member of this Goodreads group, may have known Wayne Booth when he was at the University of Chicago. Bob can correct me if my recollection is inaccurate.

Booth tells a story about a campus controversy over the dismissal of a popular teacher in which, as he describes it, "things got so bad that each side found itself reduplicating broadsides produced by the other side, and distributing them, in thousands of copies, without comment; to each side it seemed as if the other side's rhetoric was self-damning" (MODERN DOGMA AND THE RHETORIC OF ASSENT, U Chicago P, 1974, pp. 8-9).
Booth was dean of the College while I was in graduate school. His time for students was thus limited and I never took a course from him or worked with him in any way. He did, however, become a Kenneth Burke fan and came to conferences of the Kenneth Burke Society. He came to a seminar I gave at one of these conferences, so I got to know him a bit. An amicable gentleman, he was a Mormon, grew up in Utah, and went to Brigham Young, before going to graduate school at the University of Chicago.
THE RHETORIC OF FICTION is the book that made him famous in literary studies.
Bob
Robert wrote: "Alan, your memory accords with mine.
Booth tells a story about a campus controversy over the dismissal of a popular teacher in which, as he describes it, "things got so bad that each side found it..."
Thanks, Bob, for your comment and for the reference to Modern Dogma and the Rhetoric of Assent, which I have now put on my "To Read" list. It looks like this book may be directly relevant to my forthcoming book on reason and ethics. I was not heretofore aware of the books Wayne Booth had published after I left Chicago, and this looks like one that will be of great interest to me.
Booth's story about the campus controversy probably involves (to my recollection, which is now some fifty years old) a sociology professor who was denied tenure (or who had a temporary position, not renewed). I recall that she had sympathy for the New Left or Marxism generally, which many of the undergraduates (themselves heavily influenced by the New Left) felt was the cause of her being cashiered from the faculty. The controversy played out in real time in the pages of the Maroon (or "the Moron," as it was colloquially called), which was the student newspaper. I took no side in that controversy. I was not sympathetic to the New Left or to Marxism generally, but I had no knowledge of the specifics of the dispute. Looking at it fifty years later, one might suspect that male chauvinism, as well as politics, had something to do with it, but all I knew about the dispute was what was reported in the Maroon, which was hardly an unbiased source.
Our respective times at the University of Chicago overlapped, but, of course, we never met, considering the many thousands of undergraduate and graduate students there at the time and the fact that we were in different fields.
Alan

Who do university administrations 'answer to' for their policies? There's always a university president of course and then usually a board of university trustees. Everyone knows that much.
Beyond that, where does decision-making rest? Surely, not with alumni (although no one wants to offend wealthy alumni)?
For example, I see University of Chicago was founded by the Rockefellers. In this or any other such case, does it mean that a corporate tint colors the water of these institutions?
What about State Universities? For convenience, let's say the University of Delaware. Does the University of Delaware respond --or steer itself in any way--to the ideology of the State of Delaware or the Duponts?
Feliks wrote: "Was that Marlene Dixon who was let go?"
I didn't (and don't) recall the name, but per this Wikipedia article (section on Marlene Dixon), she is definitely the one. Feliks, you really know your left-wing trivia! And my memory from five decades ago is not so bad, though I've never been able to remember names.
Feliks wrote (post 134): "For example, I see University of Chicago was founded by the Rockefellers. In this or any other such case, does it mean that a corporate tint colors the water of these institutions?"
The University of Chicago of the mid-twentieth century was much, much different from the Baptist institution that John D. Rockefeller funded, probably due in large part to the influence of the "boy wonder" Robert Maynard Hutchins (1899-1977), who, as president (1929–1945) and chancellor (1945–1951) of the University, transformed the College into a Great Books type of program (see the Wikipedia article on him here). Hutchins hired Leo Strauss as a professor in the political science department in 1949 at the prompting (I have read) of Hans Morgenthau, and Strauss held the Robert Maynard Hutchins Distinguished Service Professorship there until he left in 1969 due to the University's mandatory retirement policies at that time. When I was an undergraduate at the University of Chicago from 1964 to 1968 (later also obtaining my A.M. degree there), there was still a substantial, though attenuated, Hutchins influence, including a two-year core curriculum based largely (in the humanities and social sciences) on the Great Books. Hutchins was, in the view of many Chicago graduates, the greatest of the leaders of the University. Among other things, he eliminated football and opposed fraternities. When I was there, fraternities were a very low-level operation which no self-respecting student would join. Football, however, began making a comeback during my last years there, though hardly anyone attended football games. According to the Wikipedia article, the business community and donors frowned on Hutchins's revolutionary ideas, and football, fraternities, and sororities are now, I understand, well established. I am glad to have been there during the last of the College's glory days, albeit marred by the upheavals of the late 1960s. Today, it may be more like a typical neoliberal university, though I cannot speak either through knowledge or experience of its present situation. So perhaps (I don't really know) John D. Rockefeller is once again exerting influence, this time from the grave. However, the University of Chicago was definitely not a Baptist institution at the time I was there; it had no religious affiliation whatsoever.
I have insufficient knowledge to speak to your more general questions.

1) A joke: UC is where atheist professors teach Thomas Aquinas to Jewish students.
2) Undergraduates are a minority. This had a big impact on me because I spent my first year in college at Michigan State, then transferred to UC in my second year. The change in atmosphere was dramatic. If I'm not mistaken, back in the 1960s there were roughly 7500 graduate students and 2500 undergraduates. Undergraduates are still a minority. I checked online for 2018 winter quarter enrollment numbers:
15377 total enrollment
6150 undergraduates
5208 masters
955 professional
3064 doctoral
3) One sign of the "Great Books" emphasis is that required readings were often the Great Books themselves, not textbook accounts of what they said. You can find an example of this in Richard McKeon, ON KNOWING: THE NATURAL SCIENCES (University of Chicago P). This is a transcription of one of McKeon's undergraduate courses (students often taped McKeon's classes because they were so difficult). The assigned readings in this course were selections from Plato's TIMAEUS, Aristotle's PHYSICS, Galileo's TWO NEW SCIENCES, Newton's PRINCIPIA MATHEMATICA, and Maxwell's MATTER AND MOTION. The book also includes McKeon's notes for a section on Einstein that was sometimes included in the course. Another McKeon course is also in print, but I haven't gotten a copy yet. I know that there are current efforts to publish more of his courses.
In partial answer to Feliks's question, state universities like Oregon State ultimately answer to state legislatures. Legislatures have a regular impact on a university's budget, which is crucial, and can intervene in university affairs as much or as little as they like. Oregon has a state board of education that mediates between the legislature and the universities in the state system. Private universities like UC have much more autonomy, and, generally speaking, the great universities in the country are private. I imagine that each private university has its own system of self-governance, subject to whatever laws may be relevant.
No doubt donors have influence. Conservatives object to affirmative action but usually don't like to talk about affirmative action for the children of the rich. Evidently Jared Kushner's GPA and SAT scores didn't measure up to Harvard's standards, but he nonetheless got admitted around the time his father pledged to donate $2.5 million to Harvard. Maybe just a coincidence, right?
George Mason is a public university, but Koch money is trying to shape its policies, so much so that students are evidently protesting. Such money will limit the number of faculty who would want to go to George Mason, but no doubt it would attract some.
To a large extent, then, the situation varies from university to university.
Bob

Still, I think this topic (accidentally raised by yours truly) could use some more exhumation.
For instance, when students 'took over' campuses in the 1960s, what powers ultimately governed the adjudication of those uprisings? Has anyone ever asked?

Things no doubt worked out differently in different cases. I know of one case, similar to Marlene Dixon's, that turned out differently. At the University of Texas, the English department was going to terminate a popular teacher (Joe Malof, if I remember the name correctly). Students protested. The Dean sided with the students and overruled the English department, and even reorganized the governance of the department, I was told (this happened a few years before I taught there and before I later moved to Oregon State). Was gender a factor in this difference? Maybe, but feminism was among the things that got going in the 1960s. Betty Friedan's famous THE FEMININE MYSTIQUE appeared in 1963. Sexist decisions began encountering pushback.
As I said, I haven't researched the subject, but I don't think you're going to find some "powers" that determined "the adjudication" in all these cases.
Bob

Nevertheless, I feel there must be some summary statement that can be made about State Education and chartered university administration (both) in the USA.
Who determines the stance of universities in general? I've seen one report which suggests that university administrations and faculties (except for business schools) are overwhelmingly 'liberal'.
However that may be, hearing this reminds me of Chomsky's statement about the news media.
Let's imagine that the news media are overwhelmingly liberal ('based on their voter records'). Well, so what? The real question is: does the system allow them any empowerment of their views? In media, no.
But what about in universities? How much political power is wielded? How much suppression?
Robert wrote (post 137): "Three notes to supplement Alan's fine account of what UC was like in the 1960s:
1) A joke: UC is where atheist professors teach Thomas Aquinas to Jewish students.
2) Undergraduates are a minority..."
Thanks, Bob, for your additional remarks about the University of Chicago, all of which are quite accurate (as of fifty years ago and perhaps, to some extent, today, except as updated by the enrollment numbers).
Feliks wrote (post 140): "Valuable insight, BOB. Thanks.
Nevertheless, I feel there must be some summary statement that can be made about State Education and chartered university administration (both) in the USA.
Who de..."
I would guess (but don't really know) that universities, private and public, are generally governed by a board of trustees (analogous to a corporate board of directors) and, day to day, by their officers, i.e., presidents/chancellors, deans, provosts, etc. But each state has its separate laws regarding nonprofits and public universities/colleges, and I don't think much of a generalization can be made. We live in a decentralized republic, and each state has its own laws, rules, etc.

...and the opposing party says "well, any information is subject to error, all facts can be falsified and skewed depending on where they originate...a German might interpret a fact one way and an Italian might interpret the same fact completely differently, etc etc etc"
You probably know this off the top of your head, so it's to you I'm turning to save time! :D
Feliks wrote: "Alan, what is the name of the logical fallacy invoked during a debate when one party says "here's the source for the point I just stated about (such-and-such topic), it's an authoritative, first-le..."
The first would be an argument from authority (not necessarily fallacious as such in all circumstances, as no one can be an expert on all matters, but the burden of the argument then shifts to the expert's arguments and evidence), and the second would be an argument based on relativism (which may not be a named fallacy as such but which I would contend is fallacious). I have just encountered an essay that appears to be on this kind of issue but which I have not yet read: Neil Levy, "Science Gone Wild: Explaining Irrational Beliefs". Levy is the author of numerous books and articles on neuroethics, ethics generally, critical thinking, and related subjects. I am reading one of Levy's books right now (Neuroethics: Challenges for the 21st Century) and own or have ordered additional ones (see his Amazon page). His academia.edu website, which contains a multitude of his papers, is here. I am quite impressed with what I have read so far of his work. He is a professor of philosophy at Macquarie University (Australia) and a senior research fellow at the Oxford Uehiro Centre for Practical Ethics.

I discovered some basic facts to start with. If I didn't mention it in passing (above), it's easily seen that business is by far the most popular college major, ahead of any other subject field.
Just one link for convenience (but there are obviously many sources for this):
https://www.washingtonpost.com/news/g...
Next:
How many colleges & universities in America?
Approximately 4,168 (including even the most minor 2-year colleges, for-profits, etc.)
Approximately 2,169 (just the 4-year, nonprofit institutions).
How many business schools in America?
Approximately 633.
What does 'accreditation' mean?
It means that, in addition to (or perhaps in spite of) the state and local governance that Alan mentions above, a university must meet educational standards approved by the US Dept of Education. (The Dept of Education doesn't administer the standards itself, but it approves the accrediting bodies that do.) If a college fails to meet these standards, the Dept of Education will deny federal financial aid to students attending it. That's a pretty serious requirement that no college is going to flout if it can help it.
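If it helps to see that gatekeeping logic spelled out, here is a minimal sketch in Python. The accreditor names and the function are hypothetical illustrations of the rule just described, not any real Department of Education system or data.

# Toy model of the accreditation gate described above (hypothetical names).
APPROVED_ACCREDITORS = {"Accreditor A", "Accreditor B"}  # Dept.-approved bodies

def students_eligible_for_federal_aid(college_accreditors):
    """Students qualify for federal aid only if their college holds
    accreditation from at least one Department-approved body."""
    return bool(set(college_accreditors) & APPROVED_ACCREDITORS)

print(students_eligible_for_federal_aid({"Accreditor A"}))   # True
print(students_eligible_for_federal_aid({"Diploma Mill X"}))  # False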
So now the question would be, "well, what are the standards of accreditation as prescribed by the accrediting bodies?"
To be continued...

I'm trying to paint a picture of how most of the schools reacted to these troubles when they struck. It seems to me that most boards of trustees accommodated a small margin of student demands but resisted any attempt by students to seriously alter university policies. And of course any violent protests were simply shut down. Is that fair to say?
Feliks wrote: "Does anyone know--off the top of their head--what were the most violent, agitated, and persistently troubled campuses in the '60s? Berkeley, I assume. U of Chicago (as we read earlier in this threa..."
I believe Cornell was another. There are undoubtedly books on this, but I have not bothered to read them as I lived through it. The trouble is that my memory of these events is not so good after five decades. There are some things one likes to remember and some things one would prefer to forget. This topic belongs to the latter category.
I have deleted my earlier comment at this location but am adding this note as a placeholder so that the numerical designations for this topic will not change.
This article discusses how Amazon is now allowing "peer" review of scholarly papers, even by people who have no background or expertise in the subject matter of the paper. The article quotes Amazon CEO Jeff Bezos as praising this development as follows: "Why shouldn’t the public comment on a research paper on kidney disease? I have two kidneys, and so do most of my friends."
Trump may want to reconsider his attacks against Bezos. Sounds like they have exactly the same attitude toward knowledge and expertise.

For example, 'consistency' might be one characteristic. It's the first trait I usually think of.
Another might be, 'not based on a small sample size'.
Another might be, 'must not be self-contradicting'.
Or, 'must not exhibit gaps' (cohesiveness; uniformity).
Must be rational.
Should be simple?
Must be 'congruent'? (In 'alignment' with the rest of modern practice?)
Should be 'impersonal' (free from a vested interest)?
'Must apply to historical circumstances', just as well as it applies to present ones. Flexible enough for all time-periods.
Must not be based purely on circumstance, observation alone, or anecdotal evidence alone (?)
'Must not rely on exceptions or outliers'?
(similarly --I would say-- a strong argument must be able to account for outliers and exceptions. It can't 'fold up' in the face of singularities).
'Must be able to contain all previous theories and account for them'. For instance, this is how Einstein supersedes Newton, who in turn unified the results of Galileo and Kepler, who built on Copernicus. Right? (Concretely: relativistic momentum p = mv/√(1 − v²/c²) reduces to Newton's p = mv when v is small compared with c.)
'Must not be hierarchical or "handed-down" knowledge; ought to be testable'
And of course a good argument must not commit logical fallacies, either. But all these 'other traits' I have in mind seem to go beyond even fallacy tests.
Maybe I am mixing things up. Is there a concise list of obvious hallmarks that distinguish a strong argument from a weak one, even at just a glance?
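For what it's worth, the traits above can at least be gathered into a rough first-glance checklist. Below is a minimal sketch in Python; the wording of each trait is my own illustrative encoding of the list above, not an established rubric in informal logic.

# A toy encoding of the hallmarks listed above (illustrative phrasing only).
HALLMARKS = [
    "consistent",
    "adequate sample size",
    "not self-contradicting",
    "no gaps (cohesive, uniform)",
    "rational",
    "simple",
    "congruent with current practice",
    "impersonal (no vested interest)",
    "applies across time periods",
    "not purely anecdotal",
    "does not rest on outliers",
    "can account for outliers",
    "subsumes prior theories",
    "testable, not handed down",
    "commits no logical fallacies",
]

def checklist_score(traits_present):
    """Return the fraction of hallmarks satisfied: a rough first-glance
    screen, not a substitute for analyzing the argument itself."""
    return sum(t in traits_present for t in HALLMARKS) / len(HALLMARKS)

# Example: an argument that is consistent and testable but little else.
print(round(checklist_score({"consistent", "testable, not handed down"}), 2))

A score like this is only a screening device, of course; each trait still has to be judged against the actual argument.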