Daniel M. Russell's Blog, page 39

July 11, 2019

SearchResearch Challenge (7/7/2019): A couple of questions about Polynesia! (Why so long? What are those clear patches?)


I managed to find wifi! 

As I mentioned last week, I'm touring through French Polynesia for the next two weeks.  It's kind of a long way to go, but it's completely worth it.  Lots of long stretches as we sail from one island to the next.  Many of these are coral atolls, and look a bit like this as we sail by.  They're all low-slung, just barely out of the water.  You wonder how they survive when a big storm comes through.  

Rangiroa seen from the sea.

Or like this, from a satellite image:  



As I've said before, traveling is an endless source of SRS questions. Here, in this place, there are SO many things I've had to look up--my SRS skills are getting a great workout!  What kind of tree is that?  Does the nut from that tree really have fish-stupefying properties?  Really? 

Many of the things I've been seeing need a bit of research to help me understand what I'm seeing.  

This week I've got two Challenges; on the next cycle, I'll add two more.  For today, let's start with one slightly difficult Challenge and a simpler one.  

1.  In researching the dates of initial colonization of Polynesian islands, I noticed a VERY strange incongruity.  Look at the map below.  The blue pins are all island nations that were first colonized around 1000AD.  The red pins (to the left of the long green line) were all colonized around 1000BCE or before.  What happened here between 1000BCE and 1000AD?  Why are all of the blue pins MUCH later than the red-pinned locations?  It's not that far from Samoa to Niue--why didn't anyone colonize that island until 900AD or so?  Generally--why didn't the Polynesians go beyond the green line for such a long time?  





2.  As we're sailing from place to place, it's not uncommon to see large patches of water without any ripples on the surface.  It's something you see nearly everywhere--it's a common effect on lakes, ponds, and oceans.  But what causes these ripple-free regions on the water?   (See below for an image that has a large Y-shaped blank area in the middle. What causes this?)  



As always, be sure to tell us not JUST the answer, but how you figured it out!  What searches worked for you, and if you spend a lot of time on a rathole that doesn't work out, be sure to leave us a comment to that effect.  We can learn a lot from strategies that don't work out.  

Search on!  


 •  0 comments  •  flag
Share on Twitter
Published on July 11, 2019 10:37

July 6, 2019

Answer (Part 2): What DO we know about history, math, and geography?

Last time we talked about history...  
Now, let's talk about math and geography and how much people know about each.   


In this excerpt from Raphael's fresco The School of Athens, Pythagoras is shown writing in a book as a young man presents him with a tablet.  The tablet shows a diagrammatic representation of a lyre above a drawing of the sacred tetractys.  Do you know Pythagoras and his contributions to mathematics? Do you know why the lyre is significant here? 
1.  Can you find a high-quality (that is, very credible) study of how well the citizens of the United States (or your country, if you're from somewhere else) understand (A) history, (B) geography, (C) mathematics?  

Let's try repeating what we did last time for history.  

     [ American knowledge of math ] 


And we see a similar result: 


Here you go: Lots of results telling us that Americans are terrible at math.  Once again I'll open up the top 10 results in parallel browsing and take a look. 

Even the New York Times has an article with the headline "Why Do Americans Stink at Math?" from 2014.  It's a compellingly dismal story about Americans' ability to do math and why the education system isn't working, but it refers to the results of studies (without giving any citation) for its data.  

We must dig deeper, looking at the articles AND who publishes each, AND where they get their data from.  

1. US News and World Report "Why students are bad at math" -- points us to the 2017 National Assessment of Educational Progress.  We've seen this data source before in our previous post.  This org is also called NAEP, and this report, called "The Nation's Report Card," summarizes the results of testing across a wide spectrum of US schools for grades 4, 8 and 12.  (I'm always encouraged by a data source when you can download the data for yourself.  Open data is a sign of a reputable organization, one that's willing to let you look at the raw data source.)  Here's their 2017 Math data set in PDF form.  Here's the top line of that report.  (If you're interested, it's worth looking through the data for all of the metadata about their testing methods, and all of the data exceptions--which all data sets have, but which give me confidence that they took good care collecting this data.)  

Click to see this figure at full-size.  
Summary of this data?  There's been a substantial rise in math test scores between 1990 and 2017, almost across the board, for grades 4 and 8.  

2.  The Quartz.com article "Americans are spectacularly bad at answering even the most basic math questions." is another dismal headline.  This article points to the PISA studies done by the OECD (Organisation for Economic Co-operation and Development).  As they say on their website, "PISA is the OECD's Programme for International Student Assessment. Every three years it tests 15-year-old students from all over the world in reading, mathematics and science. The tests are designed to gauge how well the students master key subjects in order to be prepared for real-life situations in the adult world."  

This is an interesting comparison source that I hadn't thought about:  How can we measure one country's math understanding?  By comparing test scores with other countries!  

What does this test show?  

"Shanghai-China has the highest scores in mathematics, with a mean score of 613 points – 119 points, or the equivalent of nearly three years of schooling, above the OECD average. Singapore, Hong Kong-China, Chinese Taipei, Korea, Macao-China, Japan, Liechtenstein, Switzerland and the Netherlands, in descending order of their scores, round out the top ten performers in mathematics..."  

Uh oh, this means the US isn't even in the top 10.  Where are we?  You can look at their test data overview here.  And this is the key chart... 

Click to see full size.  

As the overview reports: 

Among the 34 OECD countries, the United States performed below average in mathematics in 2012 and is ranked 27th (this is the best estimate, although the rank could be between 23 and 29 due to sampling and measurement error). Performance in reading and science are both close to the OECD average. The United States ranks 17 in reading (range of ranks: 14 to 20) and 20 in science (range of ranks: 17 to 25). There has been no significant change in these performances over time.
Meanwhile, mathematics scores for the top-performer, Shanghai-China, indicate a performance that is the equivalent of over two years of formal schooling ahead of those observed in Massachusetts, itself a strong-performing U.S. state. 
Just over one in four U.S. students do not reach the PISA baseline Level 2 of mathematics proficiency – a higher-than-OECD average proportion and one that hasn’t changed since 2003. At the opposite end of the proficiency scale, the U.S. has a below-average share of top performers.... 


3.  The Pew Research Center's report, "U.S. Students' academic achievement still lags that of their peers in many other countries" also points to the OECD / PISA study AND several others, giving a nicely integrated overview of the data.  They put a slightly more optimistic spin on the data.  They tell us that American students' math skills have increased over the past few decades according to the NAEP scores from 1990 - 2015, although there seems to be a small tailing-off in 2015... 

Chart from the Pew study. Credit: Pew Research Center.  
They also looked at the PISA data (from above) and show the results slightly differently: 

The US position in world math test scores.  Data from PISA, chart by Pew Research.  


I could go on here, but you get the point.  All of the top 10 results on the SERP had bad news about the state of math education in the US.  Many of the results are from reputable sources that expose their testing methods and share their data sets.  The evidence is pretty overwhelming--the US is not doing a great job of teaching mathematics to its students, and there's much work to do here.  


Our other SearchResearch Challenge was about geographic knowledge.  

How is the US doing there? 

Let's use our same approach as before: 


And we do the same analysis as before (who wrote the article?  What's their bias?  Why did they write it?).  

The first article is from National Geographic, a well-known (and highly reputable) source of geographic information.  They cite a survey done for them by the Council on Foreign Relations about "What College-Aged Students Know About the World: A Survey on Global Literacy."  The upshot? 

The average score was 55% correct. Just 29% of respondents earned a minimal pass—66% correct or better. And just over 1 percent—17 out of the 1,203 surveyed—earned an A, 91% or higher. 
Respondents exhibited limited knowledge of issues critical to the United States. Only 28 percent of respondents knew that the United States is bound by treaty to protect Japan if it is attacked. 

This doesn't really surprise me.  I live in a United States that is profoundly inward-looking.  Just out of curiosity I asked [ how many US citizens have a passport ] and found that about 37% of the population has one, compared to Canada’s 60% and the United Kingdom’s 75%. This means that nearly 2 out of 3 Americans can’t even fly to Canada, let alone travel to anywhere else in the world (according to a report from the geography department at UC Santa Barbara).  

But it's distressing.  While doing the research for this article I ran across a 2017 New York Times story, If Americans Can Find North Korea on a Map, They’re More Likely to Prefer Diplomacy, which includes this sobering image.  With North Korea in the news on a daily basis, wouldn't you expect a more accurate hit rate?

Data collected by the New York Times. From "If Americans Can Find North Korea on a Map..."
Out of 1,746 US adults who were asked to click on the location of North Korea (on an unlabelled map), only 36% got it right.  The light blue dots are all of the incorrect locations.  This is crazy.  

This has a real-world consequence.  As the author, Kevin Quealy writes: 

"An experiment led by Kyle Dropp of Morning Consult from April 27-29, conducted at the request of The New York Times, shows that respondents who could correctly identify North Korea tended to view diplomatic and nonmilitary strategies more favorably than those who could not."

The only factor (e.g., gender, age, education, etc.) that seemed to make much of a difference in locating North Korea on a map was "Do you know someone of Korean ancestry?"  

Once again, we have much to do to help our students (and ourselves) understand the world at large.  We live in an international web of countries and businesses--it's useful to at least know where they are!  


Search Lessons 
There's an obvious point here about the remarkable lack of knowledge in mathematics and geography, but that's not the goal of SearchResearch (although I personally feel this is a terrible state of affairs).  

The SRS Lessons are: 

1.  To find reliable data, look for data sets.   Reliable publishers tend to link to their open data.  If an author isn't showing you the data, be skeptical.  

2.  Our query pattern [ American knowledge of X ] seems to work pretty well.  I'd be curious to hear from SRS readers if this works well in other countries.   What did YOU find worked? 

3. Parallel browsing (by opening tabs from the SERP within the window), and then going deep on a topic in a new window, is a remarkably efficient way to do quick, broad-brush research.   




Note:  I'm about to set out on two weeks of travel in a place that might (or might not) have an internet connection.  I'll try to post next week, but if I don't post, don't worry--I'm just having too much fun diving in some exotic corner of the world!  



Search on! 

Published on July 06, 2019 06:28

July 4, 2019

Answer: How much DO we know about history / math / geography?


HA!  


You thought I'd gone away.  But no, it was just another busy couple of weeks.  Got the chance to give an invited talk at the American Library Association (in DC) all about the book, and then I was a discussant for a wonderful paper about interruptions at the Human Computer Interaction Consortium conference at Pajaro Dunes (near Monterey, CA).  Those were both a lot of work, but also inspiring and extraordinarily interesting.  

But it put me behind schedule.  So, here I am, back with you again to see what SRS we can do on the question of how much people actually DO know about history, about math, or about geography.  

The key question was this: how would you assess "our" level of knowledge in these three areas?  What does the "public" really know? 


Figure 1.  How many Americans can describe the Declaration of Independence and what role it played in the US Revolutionary war?  Does it matter if you know what year this document was signed?  (Painting by John Trumbull, 1817-1819) 

Our Challenge: What DO we know, and how do we know what we know? 


1.  Can you find a high-quality (that is, very credible) study of how well the citizens of the United States (or your country, if you're from somewhere else) understand (A) history, (B) geography, (C) mathematics?  

As always, I'm always looking for new ways to answer questions like this.  (That is, really difficult questions to search for.)  It's easy and short to ask this type of question, but what do you DO?  

I realize that this is going to take a bit of explaining--so I'm going to break up my answer into 2 separate posts.  This is Part 1: "How much do we understand about history?"  I'll do part 2 later this week.  

As I started thinking about this, it became obvious that there are a couple of key questions that we need to list out.  In the book I call these "Research Questions," and that's what they are.  I recommend to searchers that they actually write these down--partly to help organize your notes, but also partly to make it VERY clear what you're searching for!  In essence, they help to frame your research.  

A. "How much do we...?" Who is "we"?  For my purposes, I'm going to limit "we" to currently living people in the US.  We'll touch on global understanding later, but for this round, just US.  (Of course, if you live in another country, you should do your place!)  I'm hoping we can find data to measure this across the range of ages, although it might be simpler to find just student data to begin with.  

B. ".. know about history?"  How are we going to measure this?  Ideally, we'd give some kind of history test to everyone in the US--but that's not going to happen.  An important question for us is what will count as a proxy measurement of historical knowledge?  (That is, what's a good way to measure historical knowledge?  What organization is giving the survey/test/exam?)  

Also, another underspecified part of this question is "..about history?"  Are we trying to measure World History, or just US History knowledge?  

C.  "How well..."   What does it mean to measure "how well the citizens .. understand..."?  All tests implicitly have a standard, an expectation that they're measuring against.  In this case, how should we measure "how well"?  We'll have to figure this out when we learn how "citizen history understanding" is gauged.  



I started with the obvious query: 

     [ US knowledge of history ] 

I wasn't sure if this would work, but it gave some pretty interesting results, including a big hint that "American" is probably a useful search term: 


Figure 2. 

For this kind of topic (that is, one where I'm not sure where to begin) I opened a bunch of tabs in parallel (on a Mac, you CMD+click on the link; on Windows it's Ctrl+left-click).

This is called parallel browsing [1] [2] and is a great way to look at a topic across its range without getting stuck in one particular interpretation.  When browsing in parallel, your goal is to see the spectrum of opinions on a topic.  In particular, you'll want to pay attention to how many different sources you're seeing, and which sources you're reading.  

Note how I've opened all of the top 7 search results in parallel:


Figure 3

Now, I can look at a bunch of these results and compare them.  But, as always, you want to scan the result AND check for the organization (and author).  For instance, in the above SERP there are results from TheHill.com, NationalReview.com, NAS.org, Historians.org, TheAtlantic.com, VOANews.com, and SmithsonianMag.com

Let's do a quick rundown of these sources.  The best way I know to do this is to (1) go to the organization's home page and do a quick overview scan; (2) search for the name of the organization, looking for articles about the org from other sources (and points of view); (3) search for the name of the org along with the keyword "bias."  Here's an example of what my screen looks like when I'm in mid-review, in this case, I'm checking out the American Historical Association (that is, Historians.org)...

Figure 4.  Click on this window to see it full size--that's the only way you can read the text! 
In the bottom window you can see the AHA article about "Chapter 2: Why Should Americans Study History"  (that's link #4 in Figure 3).  In the right window you can see my query:  [ "American Historical Association" bias ] -- this is a quick way to see if anyone else has written about possible biases in that org. In this case, the AHA org seems pretty kosher.  There are articles about AHA that discuss their attempts to fight bias in various forms, but nobody seems to have written about their bias.  (If you try this bias context term trick on the other orgs in this SERP, you'll find very different results.) 

An important SRS point to make:  I open tabs in parallel as I'm exploring the main SERP, but I open a new window when I'm going to go in depth on a topic (and then open parallel tabs there, rather than in the first window).

In the lower left window you'll see the Wikipedia article about AHA.  You can see that it's been around for quite a while (chartered in 1884) as an association to promote historical studies, teaching, and preservation.  The Wiki version of AHA is that it's a scholarly org with an emphasis on collaboration as a way of doing history.  That's important, as it suggests that it's a reasonably open organization.

Now.. back to our task of checking on the stance of each of these sources.

I'll leave it to you to do all of the work, but here's my summary of these sources:


TheHill.com - a political news newspaper/magazine that claims "nonpartisan reporting on the inner workings of Congress and the nexus of politics and business."  AllSides.com (a bias-ranking org) finds it a bit conservative. 
NationalReview.com - shows up consistently as very conservative.  (AllSides.com and Wikipedia agree.) 
NAS.org (National Association of Scholars) - pretty clearly opposes multiculturalism and affirmative action, and seeks to counter what it considers a "liberal bias" in academia.
Historians.org (American Historical Association) - a multi-voice, collaborative institution of long standing that tries to represent an unbiased view of history.
TheAtlantic.com - news magazine with a slightly left-of-center bias.
VOANews.com (Voice of America) - is part of the U.S. Agency for Global Media (USAGM), the government agency that oversees all non-military, U.S. international broadcasting. Funded by the U.S. Congress.
SmithsonianMag.com - rated by Media Bias Fact Check as a "pro-science" magazine with a good reputation for accuracy.


NOW, with that background, what do we have in that first page of results?

TheHill.com reports that
"...Only one in three Americans is capable of passing the U.S. citizenship exam. That was the finding of a survey recently conducted by the Woodrow Wilson National Fellowship Foundation of a representative sample of 1,000 Americans. Respondents were asked 20 multiple choice questions on American history, all questions that are found on the publicly available practice exam for the U.S. Citizenship Test."
Okay, now we have to go still deeper and do the same background check on the Woodrow Wilson National Fellowship Foundation.  Using the method above, I found that it's a nonprofit founded in 1945 for supporting leadership development in education.  As such, they have a bit of an interest in finding that they're needed--for instance, to help teach history and civics. 

But the survey mentioned above was actually conducted by Lincoln Park Strategies, a well-known political survey company that leans fairly Democratic, but also writes extensively on the reliability of surveys. (So while I might tend to be a little skeptical, a survey about historical knowledge is likely to be accurate.) 

The key result from this survey is that only 36% of those 1,000 citizens who were surveyed could pass the citizenship test.  (See a sample US citizenship test and see if you could pass!)  Among their findings, only 24 percent could correctly identify something that Benjamin Franklin was famous for, with 37 percent believing he invented the lightbulb. 

Note that this survey implicitly answers Research Questions B and C (from above):  How do we measure historical knowledge?  Answer: By using the Citizenship Test.  And, How well do people do on the test?  Answer: A "good" grade would be passing, that is, the passing grade for a new citizen. 


What about the other sources? 

The National Review article reports on a 2016 American Council of Trustees and Alumni report that historical knowledge is terrible ("... less than a quarter of twelfth-grade students passed a basic examination [of history] at a 'proficient' level.").  

Now we have to ask again, who/what is the "American Council of Trustees and Alumni"?  The short answer:  part of a group of very conservative "think tanks" and non-profits that are closely linked to far-right groups (e.g., the Koch Brothers).  

So, while that information could well be true, we realize that there's an agenda at work here.  (I did look at their survey method as reported in the report above, and it seems reasonable.) 

Meanwhile, the National Association of Scholars points to the US Education Department’s National Assessment of Educational Progress quadrennial survey, The Nation’s Report Card: U.S. History 2010.  Looking at the original report shows that the NAS article accurately reflects the underlying data.  While average scores on the test have improved over the past several years, the absolute scores are terrible.  As they write: "...20 per cent of fourth grade students, seventeen per cent of eighth graders, and twelve per cent of high school seniors performed well enough to be rated “proficient.”   It looks even worse when you invert those positive figures: eighty per cent of fourth graders, eighty-three per cent of eighth graders and eighty-eight per cent of high school seniors flunked the minimum proficiency rating." 

Wow.  That's pretty astounding. 

Continuing onward:

The Historians.org article ("Chapter 2: Why Should Americans Know Their Own History") is an argument for teaching history, but has no data in it.  However, Chapter 1 of the same text at the same site talks about the data, but the crucial figure is MISSING.  (And I couldn't find it.)  So this doesn't count for much of anything.  

In that same vein, The Atlantic's article "Americans vs. Basic Historical Knowledge" is really a reprint from another (now defunct) journal, "The Wire." This article decries the state of American students with a bunch of terrifying examples, but it points to yet another set of data that's missing-in-action.  

The VOA article, "Poll: Americans’ Knowledge of Government, History in ‘Crisis'" is also ANOTHER reference to the American Council of Trustees and Alumni  survey of 2016 (referred to as the data source for the National Review article).  This article is basically a set of pull quotes from that report.  

What about the pro-science magazine, Smithsonian?  Their article, "How Much U.S. History Do Americans Actually Know? Less Than You Think" says that the 2014 National Assessment of Educational Progress (NAEP) report found that only 18 percent of 8th graders were proficient or above in U.S. History and only 23 percent in Civics.  (See the full report here, or the highlights here.)  

Figure 5. NAEP history test scores for 8th graders, 1994 - 2014.

Figure 5 shows an excerpt from the full report, and when I saw it I thought it looked awfully familiar. 

Remember the National Association of Scholars article from a few paragraphs ago?  Yeah, that one.  Turns out that this article and that article both point to the same underlying data.  That is, the National Assessment of Educational Progress (NAEP)!  This article points to the updated 2014 report (while the earlier article's data is from 2010).  This doesn't count as a really new data set, it's just an update of what we saw earlier.  What's more, the update in four years isn't statistically different. It doesn't count as a separate reference!  

Sigh.  

So what we have here, in the final analysis of the 7 web pages, are: 

     a. the NAEP data set (from 2010 and 2014)
     b. the American Council of Trustees data set  (2016)
     c. the Woodrow Wilson survey (which has a summary, but not much real data)  

Everything else is either missing or a repeat.  

I went through the next couple of SERP pages and while I found lots of articles, I found that almost all of them basically repeat the data from this handful of studies.  

As it turns out, these three (and the few other studies I found that were about specific historical questions, rather than history broadly speaking)  all agree:  We're not doing well.  In particular, on normed tests, or the Citizenship test, Americans don't seem to know much about their history.  

Of course, this alarm has been raised every few years since at least 1917, when Carleton Bell and David McCollum tested 668 Texas high school students and found that only one third of these teens knew that 1776 was the date of the Declaration of Independence. [3] 

It's a sobering thought to consider this on the July 4th holiday.  (Which is, coincidentally, our agreed-upon celebration date of the signing--even though it took several days to actually sign the document, as the signatories were scattered across several states!)   Like Bell and McCollum, I worry... but perhaps this is an inevitable worry.  To me, it suggests that teaching and education need to remain permanent features of our intellectual landscape.  

As should search.  


Search on!   





Search Lessons 
There's much to touch on here... 

1.  You have to go deep to look for redundancy.   Just because we found 10 separate articles does NOT mean that there were 10 different studies that all agree.  In this set of articles, there are really 3 common pieces of data.  

2.  Use parallel browsing to open up tabs that branch off the same SERP, and then use different windows to go deep on a particular topic.  That's what I do, and it certainly makes window management simpler!  

3.  Beware of lots of 404 errors (page not found).  If a publication can't keep references to its own pages up to date, you have good reason to be skeptical of its work overall.  Some link-rot is inevitable, but it shouldn't be common, as it was on some sites I visited here.  (Hint:  If you want to write scholarly text that lasts, make sure you keep a copy of the data your article depends upon.) 





[1] Huang, Jeff, and Ryen W. White. "Parallel browsing behavior on the web." Proceedings of the 21st ACM Conference on Hypertext and Hypermedia. ACM, 2010.
[2] Lehmann, Janette, et al. "Online multitasking and user engagement." Proceedings of the 22nd ACM International Conference on Information and Knowledge Management. ACM, 2013.
[3] Bell, J. Carleton, and David F. McCollum. "A study of the attainments of pupils in United States history." Journal of Educational Psychology 8.5 (1917): 257.


Published on July 04, 2019 14:43