Trudy J. Morgan-Cole's Blog, page 46

October 9, 2017

Corners Without Corner Stores: A Walking Tour of Rabbittown

Here’s a video I made in my neighbourhood to introduce readers to some of the places that inspired my upcoming novel. You may enjoy this, especially if you have nostalgic memories of corner stores.



Published on October 09, 2017 05:50

September 18, 2017

Also, twenty years ago…

I don’t mean to turn my blog into a memory-fest, but a lot of stuff happened in the summer and fall of 1997. I was, as you’ll recall, pregnant with my firstborn. The school I taught at closed down (I just blogged about that). Princess Diana died (I didn’t blog about that). Mother Teresa died (didn’t blog about that either). And then, on September 19, a far less attention-grabbing celebrity death: singer/songwriter Rich Mullins was killed in a car accident. He was 42 years old. 


I never got to see him in concert. I’ve never gotten over his death.


I know some folks will disagree with me, but I believe that people who say “contemporary Christian music” is shallow, banal, and musically/lyrically/theologically vapid either have not listened to Rich Mullins, or possibly have not listened to Rich Mullins enough. There is probably no one except Jesus — not even C.S. Lewis or Anne Lamott — whose work has had a bigger influence on my faith than the songs of Rich Mullins. Sometimes his music was all that kept me hanging onto any kind of faith at all.


When Rich made money from his music, he turned it over to his church. They paid him whatever the average salary was for a worker in the US that year, and gave the rest to charity. On his Wikipedia page you can find this fact coupled with one of my favourite Rich Mullins quotes:


Jesus said whatever you do to the least of these my brothers you’ve done it to me. And this is what I’ve come to think. That if I want to identify fully with Jesus Christ, who I claim to be my Savior and Lord, the best way that I can do that is to identify with the poor. This I know will go against the teachings of all the popular evangelical preachers. But they’re just wrong. They’re not bad, they’re just wrong. Christianity is not about building an absolutely secure little niche in the world where you can live with your perfect little wife and your perfect little children in a beautiful little house where you have no gays or minority groups anywhere near you. Christianity is about learning to love like Jesus loved and Jesus loved the poor and Jesus loved the broken-hearted.


He would have been 62 this year. It’s impossible to imagine the songs he would have written, the directions his faith and his art would have taken him. I think he probably would have drifted even farther from the centre of American Evangelical Christianity than he already had, and would probably be shocked and horrified by the political/cultural directions that branch of Christianity has taken in these last 20 years … but who’s to say? It’s only guesswork. We never know what could have been, only what was. What was, and what remains, are the songs.


Here are five of my favourite Rich Mullins songs (some are videos and some just audio), with my comments.



I think I probably first heard Rich Mullins sing either “First Family” or “Boy Like Me, Man Like You” because those were his early Christian radio hits. But when I bought my first Rich Mullins album (The World as Best as I Remember It, Volume One), the first lines of “Jacob and Two Women” jumped out at me as being completely unlike anything I’d heard in Christian music and most of what I’d heard in church. When you hear a guy sing “Jacob he loved Rachel, and Rachel she loved him, and Leah was just there for dramatic effect/ Well it’s right there in the Bible so it must not be a sin, but it sure does seem like an awful dirty trick” … well, you know you’re in the presence of a songwriter who is not playing around with a bunch of Christian cliches and putting on a holy face. This was real stuff.



Outside of his hardcore fans, Rich is known for writing the praise-and-worship anthems “Awesome God” and “Sing Your Praise to the Lord.” While I like those songs, I don’t like them as well as his more singer-songwritery stuff. If I want to hear Rich Mullins do a praise and worship song, something to make me lift my hands and go all Pentecostal, it’ll be “Sometimes by Step.” It’s a song about one of my favourite spiritual themes — how God leads us only one step at a time — we can’t ever see the whole way, and the only way to know where the path is going is just to step out and follow it.




Another of my favourite themes: grace. This one contains the lines: “Stuff of earth competes for the allegiance I owe only to the Giver of all good things.” Stuff of earth is always competing for my allegiance, and this song reminds me where that allegiance belongs.



This is one of my favourite hymns. Rich’s tune is different from the one in my hymnbook, and possibly better. Again, this one is about following step by step. Being humble; being led.



Finally … if you know me really well, you know not only that I’ve always loved Rich Mullins’s music but also that it was a fandom I shared with my friend Jamie. Jamie did get to see Rich in concert, and I was so envious of that. Fourteen years after Rich died, Jamie, too, died at age 42 … another loss I’ve never gotten over. The last time I visited him before he died, the last thing we did together was sit for an hour or so listening to Rich Mullins songs. We listened to “Hold Me Jesus” and analyzed the lyrics, like we always did with songs. Listening to “Hold Me Jesus,” possibly my very favourite Rich song, while sitting with my dying friend, putting whatever faith we had between us into that prayer … it’s one of my loveliest and most painful memories.


Published on September 18, 2017 16:48

September 10, 2017

Twenty Years Ago ….

The late summer/early fall of 1997 was a strange and life-changing time for me, for a lot of reasons. I was pregnant with my first child, Chris, who was due to make an appearance in January of 1998. I was also changing jobs, as the job I’d been in for the last five years was coming to an end due to a cataclysmic upheaval in the Newfoundland school system. In September 1997 I started teaching English at Beaconsfield Senior High, a position I would hold for only five months before going on the world’s longest maternity leave (in some senses, it still hasn’t ended).


Before that, I’d been teaching English at the St. John’s Seventh-day Adventist Academy — the same school I attended from Kindergarten through high school graduation in 1982; the school both my parents attended. Even some family members of my grandparents’ generation attended the school. To say I and my family had a lot of history with the St. John’s SDA Academy would be an understatement. The baby I was carrying in summer 1997 would not carry on the tradition; our school would no longer exist by the time he started Kindergarten in 2003.



St. John’s SDA Academy, early 1940s. Both my parents are somewhere in this picture.


My fellow Seventh-day Adventists will know something about the history and the emotional weight (both good and bad, depending on your experiences) of Adventist education. But you can’t really know what our experience here in Newfoundland was like, because it wasn’t much like anything anywhere else in the world. Adventists have always been big believers in church-run, Christian education for their kids, and in almost every place (at least in North America) this has meant starting small private schools supported by tuition and by the generosity and hard work of the local church, where Adventist kids could get an education without having to rub shoulders with “The World” too much.


Our situation here in Newfoundland was different. At the time the early Adventists decided to start their own school in 1905, the Newfoundland government operated no public schools; almost all schools were run by churches. There were Roman Catholic schools and Anglican schools and Methodist schools and eventually schools run by smaller Christian groups — Salvation Army schools and Pentecostal schools and Seventh-day Adventist schools. Over the twentieth century, this evolved into a system where the government fully funded all these schools — paying for teachers’ salaries and other expenses of running a school system, and establishing a provincial curriculum and government exams for high school graduates — but left most of the day-to-day running of the schools to the churches. Churches could hire teachers, set their own religious ed curriculum, have whatever religiously-oriented extra-curricular activities they wanted such as chapels and worships, all without charging tuition. All schools were free and open to everyone.



As a result of this, I was educated in a unique system, unlike anything either my public-school or church-school peers elsewhere experienced. My education had many of the hallmarks of Adventist education — Adventist teachers, morning worships, Weeks of Prayer, Bible classes. But our student body was never more than about 10-20% Adventist kids. Since there were no tuition fees, anyone and everyone was welcome. The student body was largely kids from the neighbourhood whose parents just sent them to the nearest school — and, over the years, as our neighbourhood became more of a lower-income, centre-city neighbourhood, the student body reflected that. There was also a sizeable contingent of kids from other “non-mainstream” religious groups that weren’t numerous enough to run their own school system but wanted the semi-religious atmosphere of the Adventist school, even though they opted out of Bible classes. When I was in school myself, these were mainly Jehovah’s Witness kids; during the years I taught there, that demographic had shifted to Bible Believers and a few Christadelphians.


I went to an Adventist school with Adventist kids and United Church kids and Anglican kids and Jehovah’s Witness kids and agnostic and atheist kids and a sprinkling of Catholics (mostly they stuck to their own large and well-run school system, but occasionally a Catholic kid who’d had a rough time at St. Pat’s or Presentation or wherever would find their way to us). I went to school with a lot of kids whose families were poorer than mine was, and a handful whose families were better off (I had two working parents, so I was definitely in the upper echelon socio-economically). Our school was small by the standards of other schools in St. John’s — my graduating class had 19 students and was the largest in the school’s history — but it was as diverse as St. John’s got in those days.


Look, I don’t want to idealize my old school, much as I loved it. Like any school, there were people who had great experiences there and people who had terrible experiences. Though most of my classmates and my former students remember the place fondly, it had its flaws. I think of it as a great place to have gone to school despite the fact that I was relentlessly bullied by a clique of Mean Girls for my entire Grade 7 and 8 years. (I was keenly aware that being thrown into a larger junior high would only have meant even more Mean Girls to torment me.) It was flawed and messy but it was ours. To me, as both a student and a teacher, it provided the best aspects of attending both a small private Christian school and a neighbourhood public school.


Lots of other people felt the same way. When we saw changes coming during my teaching years in the mid-90s, I and my extended family and most of the other staff there fought as hard as we could to save our school, petitioning the government to make an exception for us.


It didn’t work. By the mid-90s, Newfoundland’s school system was clearly archaic in the context of the rest of Canada. Though our population was not yet very diverse in terms of non-Christian religions, it had become, with the rest of the world, much more secular. The Roman Catholic church and its school system had been shaken to its core, and had lost much of its popular support, by a series of sexual abuse scandals. The mainstream Protestant denominations — Anglican, United, and Salvation Army — had long since amalgamated their schools into consolidated schools that were in many ways indistinguishable from public schools. Parents who weren’t religious chafed against having to choose a “religious” school to send their kids to and support it with their taxpayer dollars. Teachers were unhappy with a system in which a teacher at a Catholic or Pentecostal school could lose their job if their personal life wasn’t in line with church teaching. And the government promised that by dismantling the old denominational system, they would eliminate waste and redundancy and save millions of dollars that they could pour into newer, larger, more effective public schools.


It was controversial enough that it required a referendum, and the majority voted to scrap the old system. In June of 1997 I directed our last school drama production (Narnia), taught my last English classes at the old St. John’s SDA Academy, and went where I was re-assigned, to Beaconsfield, a formerly Catholic school that, like all the other big schools, was being reborn as a public school. (More or less. At Beaconsfield, the vice-principal, a former nun, used to read prayer requests for the “repose of the soul” of various dead folks over the PA system during homeroom announcement time. Newfoundland schools were slow to completely secularize, but the trappings of denominational education were gone).


The Adventist school, stripped of the government funding that paid its teachers’ salaries, was slated to close, its students and staff re-assigned to other schools. In fact, it struggled on for five years as a much smaller private school, attended almost entirely by the children of church members. The private school was crippled by the fact that most of those church members were either unable or unwilling to pay tuition fees for something they’d previously been getting for free. The two local Adventist churches, weighed down by the burden of supporting the school, voted to close it in the spring of 2003, just before my eldest child would have started Kindergarten.


My kids grew up going to the Adventist church and the neighbourhood public school. I’m happy with the education they got. As for me, I’ve moved on to other teaching challenges in a different environment. And over the years, I’ve changed my views on a lot of things.


I’ve come to believe that what I and others fought for in 1997 was not only unsustainable but philosophically wrong. I don’t think it’s right for taxpayer funds to support religious, church-run schools. Adventists have always had a strong belief in the separation of church and state, and here in Newfoundland we overlooked that belief for a long time because it enabled our schools to function. Today I believe that allowing government to fund church schools and other religious organizations compromises both sides of the equation: if you want to run a school that teaches and conforms to the beliefs of your religious group, and excludes other beliefs, then you should be able to raise the money to do that yourself, and not rely on taxpayer funds. Reluctantly, I’ve come to feel that the Newfoundland people made the right decision in that referendum (even though the promised millions of dollars of savings to be poured into schools never quite materialized, as is often the case with government promises).


I say “reluctantly” because I do feel that something of value was lost in 1997, something more than just a place I’m nostalgic about. One thing that I still think was valuable about our unique public-private hybrid was that it allowed smaller schools to flourish with the same government funding as big ones. Even if you take religion out of the equation, this kind of flexibility is inefficient — governments can always save money by putting more and more students into larger and larger institutions. But that doesn’t always best serve the needs of the students.


Bigger schools can and do offer a wider range of courses and of extracurricular activities. My own kids have certainly benefited from that. But a big school is not an ideal environment for every learner. Over the years, many of the students who left other schools to attend the Adventist school, both while I was a student and while I was a teacher, came because they or their parents felt they would learn better in smaller classrooms, with a lower student-teacher ratio and a more “family” feeling to the school community. And it’s no coincidence that after leaving the Adventist school and getting piped into the new public system, I gave up teaching for seven years and only returned to it in a small alternative school outside the regular school system. In the place where I teach now, we have about the same student-teacher ratio, and much the same feeling of community, as we had back in the old Adventist Academy.


I didn’t write this piece to make any particular point, just to mark a milestone for myself and others who loved the old school. I’m not sure what my take-aways are from this. Certainly I’ve discovered that I’m capable of believing something is theoretically wrong, yet recognizing that it had good elements and that I and a lot of others benefited from it. That’s one piece of my nostalgia as I think back to what we lost 20 years ago. Another is that I still believe an ideal school system would not be one-size-fits-all; it would have flexibility. While I don’t believe that flexibility should include paying churches to run schools, I do think it should involve the option of smaller schools, smaller class sizes for those who need them, different learning environments. Bigger is not always better, and what works for the majority doesn’t work for everyone.


Mostly, though, I just miss a place I used to love, even on the days when I didn’t love it. I miss it the way you miss home, a home to which you can never return.


Published on September 10, 2017 15:05

August 16, 2017

Some Thoughts on Punching Nazis

I need to make two things clear at the start of this blog post.



First, I am a pacifist. I believe absolutely and without reservation that for me as a Christian, it is against God’s will to ever use violence against another person. More broadly, I believe conflicts in general are better resolved, and oppression better resisted, through nonviolent direct action than through violence.


Second: it costs me nothing to be a pacifist, and therefore my opinion about pacifism isn’t worth much. (You know that’s not going to stop me from writing a blog post).


It costs me nothing to be a pacifist because I am not oppressed and I have never been a victim of violence. I’ve even been lucky enough (and it is sheer luck) to avoid the kind of casual sexual assault (unwanted touching/groping, etc) that many if not most women experience at some point in their lives. It’s easy for me to be a theoretical pacifist when I have never been in a situation where violence would be a likely or necessary response. I’m an extremely privileged person in this conversation and I get no points for theoretically renouncing a weapon I’ll probably never need to use.


Not only am I not a victim of violence, I probably wouldn’t be any good at using it if I had to. I don’t know how to shoot a gun. I’m not athletic and have never taken a self-defense class. My college boyfriend tried one time to teach me how to kick someone effectively and punch someone in the face without breaking my hand, but as I never practiced those skills I have no confidence I could do either of those things effectively.


When a person with a black belt, or a person who’s a deadly aim with a gun, or a person who’s six-foot-five and three hundred pounds of sheer muscle, renounces the use of violence to solve problems, their renunciation means something. Mine means nothing. Giving up violence, for me, would be like giving up liver for Lent — it’s just not my thing.


When a person who is the victim of systemic oppression — who, because of their skin colour, their social class, their gender identity, the place where they live, is in constant danger of physical harm — when that person renounces violence, it means something. It means nothing when I renounce it.


All that being said, I am still a pacifist. You may disagree with me. A lot of people do. A lot of my fellow Christians read the same Bible I do and come away convinced that Jesus would be fine with them defending their home with a gun, serving in the military, or punching a Nazi in the face (don’t worry, we’ll get back to the Nazis). What can I say? I read Walter Wink at an impressionable age (the age was 35, but still, I was impressionable). I have immense admiration for the tactics and commitment of Gandhi, Martin Luther King, Jr., and others who have led highly disciplined and courageous groups of people into nonviolent direct conflict with oppressive powers. Anyone who marches into a line of armed police or military willing to take a beating without lifting a hand to fight back is a hero in my book.


All of which was pretty theoretical, living the safe and comfortable life of privilege I live, until the last week or so. In the wake of the white nationalist march in Charlottesville, Virginia, the question of whether or not to resist evil with violence is suddenly much more relevant. While I, personally, may never be called upon to punch a Nazi, should I cheer for the person who does? Should I cheer at the sight of a flamethrower burning a Confederate flag (bearing in mind that the person holding the flag could be harmed by the flamethrower)? Should protests against fascists, white supremacists, neo-Nazis, and their ilk (which, we’ve been promised, we’ll see more and more of, and don’t think we haven’t got them in Canada) be met solely with nonviolent resistance, like those lines of clergy in their vestments and many other peaceful resisters marching down the streets of Charlottesville? Or should there be room for the antifascist protesters who come armed and ready to fight back?



The presence of counter-protesters who were willing to fight back made it possible for Donald Trump to claim that there was violence on “both sides,” and “two hate groups” present, just as the fact that some Black Lives Matter protests erupted into violence has led some Trump supporters to complain that the far left is just as violent as the far right. This is obviously false equivalency: there is every difference in the world between a group marching to declare hate against others and try to take away their rights, and a group marching to defend their own and others’ rights. You can’t pretend “antifa” [anti-fascist demonstrators] are the same as fascists: we wouldn’t have antifa if there weren’t any fa.


But apart from the fact that the motivations of both groups are quite different, the question of tactics remains. It’s obvious that if all the counter-protesters in Charlottesville had been completely committed to non-violence (which generally requires organization and considerable training as well as a lot of courage), there would have been no grounds for Trump and his supporters to make their claim of false equivalency. The very thing that made Gandhi’s movement, Dr. King’s movement, and other non-violent protests effective is that there was no room to say, “Well, one side is just as violent as the other.”


True non-violent protest takes away any claim of moral high ground from the oppressor and displays evil in the starkest possible light: as the club that beats the person lying prone on the ground. There’s no room, in the face of true non-violent action, for the oppressor to claim any false moral equivalency.


That’s a powerful argument in favour of keeping anti-Nazi protests, or anti-racism protests, non-violent. A counter-argument (made well in this article) is that some kinds of evil require a violent response and will not yield to non-violent resistance. I’m not sure this is true, but it’s a point worth considering, and obviously it’s a position embraced by some of the anti-fascist counter-protesters who show up to right-wing rallies.


It’s clear that the “Unite the Right” Charlottesville marchers, many of whom came to the protest heavily armed, embraced the possibility of violence from the beginning. So did some of the counter-protesters. A great many more counter-protesters (including the guy with the flamethrower, who picked up his improvised weapon after gunshots were fired at his feet and says he was later beaten with metal rods while running from protesters) went out that day intending to make a peaceful protest, but fought back in self-defense or in defense of others. Without the leadership and commitment of a Dr. King or a Gandhi, many non-violent resisters will resort to violence when they feel threatened. It’s obvious that the Nazis came intending to threaten, as Nazis always do.


Can you imagine being attacked — whether with guns, or sticks, or rocks — and not picking up a weapon to defend yourself? Not fighting back, but taking the beating or even the gunshot (or, as ultimately happened in Charlottesville, the vehicular homicide), knowing that the cameras are rolling and your suffering will be a powerful testament to the evil you are fighting?


I can’t really imagine it. I’ve never been hit in anger, never had to take a beating. And the very fact that I haven’t been in that position — that I don’t have to deal with that threat on an everyday basis — means that I don’t have the right to tell someone else to protest non-violently. 


Gandhi and King and other famous non-violent resisters were able to lead those movements because they themselves represented their own communities, communities under threat. A white preacher telling Southern blacks to take a beating from police, a British government official telling Indians not to fight back against British Army brutality — these people would have had no moral right to preach non-violence, any more than I do.


If you are the person on the front lines facing racism, police brutality, hatred and oppression, you have your own decision to make about whether you will fight to defend yourself and others, or choose the path of non-violent resistance. I am a pacifist, but my pacifism is meaningless until it’s tested, and it may never be.


It should go without saying that there were not two violent hate groups in Charlottesville on Saturday: there was one violent hate group, and there were people gathered to resist them. Some resisted non-violently; some fought back. My belief that the non-violent resisters were more effective is irrelevant: the people on the ground are the only ones who can decide how they will resist evil.


All this to say: I will never punch a Nazi. I might waste some breath trying to argue with one, but for a number of reasons I will not be throwing punches.


I will not cheer at, or laugh at, pictures or videos or gifs of Nazis being punched (even if I secretly believe they deserve it).


But if you end up in a situation where you need to punch a Nazi, I’m not here to judge you.


Published on August 16, 2017 16:20

August 7, 2017

The Kids May or May Not Be All Right, and it May or May Not Be the Fault of Their Phones

It’s rare, in this polarized world of ours, that voices on the left and on the right sound off on the same side of an issue. Rare enough that when it happens, it’s probably worth paying attention.


I noticed just such a phenomenon on Facebook last week, when one of my more politically right-leaning friends shared a blog post by Professional Angry Christian Person Matt Walsh, while at the same time one of my more leftward-tilted friends shared an article from that generally progressive-ish magazine, The Atlantic. Both articles were ranting about the same thing: the use of smartphones by children and youth is not simply affecting, not merely changing, but literally destroying a generation. Today’s tweens and teens are weak, passive, immobile, unable to cope with the outside world – because they spend all their time on their phones.



This wasn’t, it turned out, some rare confluence of independently-arrived-at opinions. Matt Walsh was responding directly to the Atlantic article, filtering author and psychologist Jean M. Twenge’s research and concerns through his own particular prisms. When we do find that rare agreement between people and parties who usually disagree – when Matt Walsh reads an article in The Atlantic and responds with agreement instead of venom – it’s either because the thing is so incontrovertibly true that it transcends our divisions, or because it agrees with some preconceived biases we hold (such as, Technology Is Evil or Everything Was Better in the Good Old Days).


On the surface of it, the agreement here seems to fall into the “incontrovertibly true” category. Kids today get smartphones at what seems to me ridiculously young ages (seriously, why would you put a $700 piece of electronics into the hands of the seven-year-old who just pulled off Barbie’s head??). They spend a lot of time on them, and this has changed both the kids and the culture. Our kids are having experiences online that those of us who grew up when there was no “online” cannot fully understand, and we don’t know what the consequences might be. We can see that change is happening quickly, and it scares us, regardless of whether we’re pre-programmed to think that change is usually a good thing or that change is the Devil’s calling card.


Probing a little more deeply into the original article, I began to question some of the panic it engendered. It starts, as such pieces always do, with an anecdote: Twenge talked to a 13-year-old girl and found that this young teenager is in the habit of going to the mall with her family, rather than hanging out there unsupervised with her friends; she’s more likely to spend time with her friends online than in real life. Twenge continues to pile anecdotal evidence alongside research, creating the impression that smartphone use has spawned a generation of children who spend all their time locked in their rooms staring at screens, unable to interact with the world outside in any meaningful way.



There is some genuine hard data backing Twenge’s article, and I’m sure if I were to read her book iGen I would find more: she has made a career-long study of generational differences. She cites, for example, the fact that more than one in four kids today does not have a driver’s license by the time they graduate from high school (a higher rate, it seems, than in the previous generation, though she doesn’t cite the numbers for earlier generations), and that the number of sexually active 9th-graders has dropped by 40% since 1991, with average teens now having sex for the first time in Grade 11, rather than in Grade 10 as the previous generation did. From these, and from the agreed-upon and undeniable fact that kids have access to a technology that no previous generation has had, she draws the conclusion that smartphones are the cause, and that everything from less teen sex to fewer driver’s licenses is the effect.


You’d think that some of the changes in teen behavior that Twenge cites would be ones that a conservative like Matt Walsh would celebrate: fewer fourteen-year-olds having sex, for example, or declining alcohol use among teens (another statistic Twenge cites). But because the cause of these behavioral changes must be smartphone use, even these apparently positive changes must be the result of something dark and sinister: the siren-like call of the screen luring kids away from both the positive and the negative “normal behaviors” of youth. You know, normal, risk-taking and independence-seeking behaviors like getting your driver’s license and then using it to drive up to Signal Hill with your date and explore some fumbling attempt at intercourse in the back seat. Just like we used to do in the good old days, kids.


Look, I’m not trying to deny that smartphones (and similar technologies – tablets, etc) have changed and are changing us, our kids, and our culture. We’ve all seen the phenomenon of kids glued to their phones when people are trying to have conversations with them, or show them amazing views on vacation. (We’ve also all seen kids take beautiful phone pictures of the amazing views and post them on Instagram, or jump into the adult conversation with an interesting and relevant fact they just Googled. But these behaviors don’t fit the paradigm of “kids and their darned phones” so they rarely get included in the conversation about kids and smartphones).


There are real dangers online, like sexual predators, and less fully-understood dangers too. We don’t know what the long-term effects will be of kids carrying out some of their social interaction through text and emojis rather than face to face. Heck, we don’t know what the long-term effects of doing that ourselves will be. Every new technology brings changes, and most of society – the non-Amish part of it anyway — rushes headlong into embracing cool new stuff without fully understanding what those changes will involve.


And I am not, for one second, arguing that parents don’t need to be concerned about or set limits on kids’ screen time. I am, after all, infamous in certain circles as the parent who, right up to nearly the end of high school, used to turn off the wifi in the house at 10:00 p.m. (I also didn’t believe that teens needed data on their phones, so the loss of the wifi meant they were CUT OFF from the online world at bedtime. For this cruelty I was once called, by one of my offspring, “Nazi Mom,” leading me to reply, “Yes, that was the Nazis’ real crime – the way they turned the wifi off in the concentration camps at bedtime.” An informal online poll of my acquaintances conducted the same night I got called Nazi Mom revealed that while no other household had our exact same rules, every household with teenagers imposed some kind of rules or limits on kids’ use of phones and internet).


I absolutely believe that young teens should not have unlimited access to wifi and data, that parents should know and discuss with their kids what they’re doing online, and that kids should be taught basic phone etiquette, like put the darn thing away at the table or when your grandparents are trying to talk to you.


All that being said, I think the Atlantic article, and many of the responses and thinkpieces it spawned, struck an unnecessarily alarmist note. Nowhere in Twenge’s article was research used to show causation as well as correlation between kids’ use of smartphones and the other behavioral changes observed (it’s possible that the data may be there in her book, but that wasn’t reflected in the article). Even when the behavioral changes were backed up by data (not all were), there was no study showing that a decreased likelihood of getting a summer job (another generational change Twenge mentions) was linked to increased smartphone use, for example.


As a parent and a teacher, I’m certainly concerned about kids’ use of the internet for lots of reasons, but I also don’t see the claims put forth in Twenge’s article lining up with the behaviors I observe in real life. Yes, I see kids – mine and other people’s – on their phones All. The. Danged. Time. (I see adults doing this almost as much, btw – including, sometimes, myself). But I also see these kids engaging in lots of normal, independent teen behaviors. This includes positive behaviors, like working part-time jobs and singing in the school musical, and negative behaviors, like having sex and smoking weed. Behaviors which, both positive and negative, involve getting your face out of the screen and interacting with the real world.


Admittedly, I’m going with anecdotal evidence here, not data, only looking at what I personally have observed. I think of my 17-year-old daughter and her four closest friends, whom I’ve watched grow up together since elementary school. Of that group of five teenagers who will graduate from high school next year – yes, they all have phones (though they vary in when or whether they have data available) and they all use the internet a lot, in many different ways. Two of them have their drivers’ licenses; three don’t. Three of them have summer jobs and part-time jobs during the school year; two don’t. These numbers seem roughly similar to what I remember from my own friends at the same age, but the hazy memory of a middle-aged mom isn’t data, so don’t quote me on that.


The relevant thing to me, observing this small group of young people, is that I have a pretty good idea of the reasons why the three who don’t have their drivers’ licenses don’t have them, and why the two who don’t have jobs aren’t working. They’re very specific reasons related to family or personal issues or to socio-economic factors. There is absolutely no correlation, that I can see, between phone or internet use and the likelihood of getting your license or a job, in the group of kids that my kids know.


Obviously five teens is a tiny sample size, but when I think about the kids my daughter knows, the kids my son knows, the children of my friends, and my own students, I see the same variations. Some are risk-takers, some are cautious kids. Some are progressing to adult independence in what we might think of as “normal,” healthy ways; others are held back for a variety of reasons. While access to the internet may make it easier for a shy, socially anxious kid to hide in her room and avoid confrontation with the outside world, I don’t see much evidence that normally-developing kids are choosing to be on their phones rather than go to parties or to their shift at McDonald’s. Rather, they are bringing their phones along on all these adventures, and using them to record and respond to those experiences – for better and for worse.


If the most alarmist generalizations in the Atlantic article were true, then we should also see a dramatic fall in the rates of participation in teenagers’ extracurricular activities, since any time they’re not mandated to be in school is being spent in a darkened room in front of a screen. High school sports teams, choirs, and bands should be shutting down for lack of participation; high school musicals and plays failing to go onstage because not enough kids can be lured away from their phones to try out. If such things are happening, I certainly haven’t seen evidence of it in my community, nor have I seen data suggesting this is the case. In fact, when such programs do falter, it’s more frequently because those darned adults at the school board or in government have cut funding for extracurricular programs, rather than because the darned kids can’t be bothered to show up.


Things like whether a kid gets a driver’s license or a summer job are far more likely, I’d argue, to be affected by socio-economic factors than by smartphone use. Getting your license, for example, requires a family car, a parent who can drive and has time free to teach their child to drive, as well as enough money to take the test – often more than once! – and possibly also to attend driving school. Not to mention money to insure a teenage driver on that family car once they have the license. When we see shifts and changes occurring, it can be quick and easy to blame the phone, because it’s a possible cause we can easily observe, but the studies haven’t been done (and probably won’t be for a long time) to actually demonstrate whether or not the kids who don’t have jobs, or haven’t had sex yet, are the same ones using their phones excessively, or whether broader societal factors might be at play.


Twenge notes that the changes she has seen in the behavior of this generation coincide not only with the widespread popularity of the smartphone but also with one of the biggest economic downturns in the US in recent decades. But she doesn’t explore whether economic factors such as greater insecurity and instability in families might contribute to some of these trends at least as much as smartphones do. Similarly, she notes rising rates of mental illnesses such as anxiety and depression among youth, but implies that the correlation with smartphone use suggests causality, without exploring what other factors might be at play. (Again, I’m referring to the Atlantic article; there is likely more sustained and thoughtful analysis in her book, but I haven’t read it).


As I read blogs and opinion posts about how we’re “losing an entire generation” to smartphones, I can’t help remembering that I, too, am a survivor of a lost generation. We kids who were born in the 60s and grew up in the 70s – and those who came just before and after us, those late-Boomers and Gen Xers – were seduced by the siren call of a new technology: the television. The older generation shook their heads in dismay – well-documented dismay – at how kids of my era sat in front of the TV for hours, lured by the flickering boob tube, rather than going out swimming in the creek and committing petty crimes like the healthy, normal kids of the 1930s and 40s used to do. We were all going to be lazy, obese, unmotivated, incapable of focusing on anything for more than 30 minutes, unable to cope with the real world.


And, ok, some of us are these things. And the introduction of television into every family home did have far-reaching consequences, as every new technology does. But we all grew up. We peeled ourselves away from the screens and got jobs and had sex and took risks and married and had families of our own and went on to create both the smartphone, and the kids who we now worry are spending too much time on their smartphones.


And, perhaps most tellingly, in these same thinkpieces and blogs where people of my age (and those a bit older and a bit younger) moan and wring their hands about kids being online all the time, the writers flash back to the halcyon days of their own childhood: the unstructured outdoor play, the creativity, the risks they took. They do not seem to recall the 107,000 hours of Gilligan’s Island reruns they watched. But anyone today who is the right age to have a blog and rant on it was definitely told at least 107,000 times to “turn off that TV and get outside and play!!!!” when they were growing up.


It’s too facile to say, of course, that the kids are alright and will be alright. Some of the kids will be and some of them won’t, and as adults we all – including parents – have roles to play in helping them be more alright. And monitoring and limiting the use of technology is one of those roles, as is helping to create a society with socio-economic structures that allow more young people to have education, jobs and opportunities.


The latter, in case you didn’t notice, is a danged sight harder than the former, which may be why we want to rant about phones rather than having productive discussions about, say, how to reduce the burden of student debt, or how to encourage more kids to go into the skilled trades, or how to prepare them for a world in which automation may take over their jobs before they’ve even finished training for those jobs.


I also know that we live in the Golden Age of Clickbait, and that an article that tells us we are “Destroying An Entire Generation!!!!!!” is more likely to get clicks than one that says “Some Data Suggests Smartphone Use May Have Damaging Effects on Kids, but the Correlation and Causality Have Not Been Fully Explored Yet, However It’s Probably Still a Good Idea to Set Some Reasonable Limits on Your Children’s Use of Technology.” Most people are only going to read the headline anyway – certainly very few people are going to read to the end of this 2500-word-plus blog post I’ve just penned. But then, we all had our attention spans destroyed by half-hour sitcoms back in the 70s, so that’s not really surprising, is it?


If you made it this far, leave me a comment and let me know: did you survive your TV-soaked childhood and adolescence? And are your own kids, or the kids you see around you, with their faces glued to their iPhones, progressing semi-normally into adulthood? I’d love to know how other people’s observations line up with my own.


 •  0 comments  •  flag
Share on Twitter
Published on August 07, 2017 07:20

July 24, 2017

“Why isn’t all your underwear good?” Or, the lesson I learned from Sofia Vergara

In one of my favourite lines ever from the sitcom Modern Family, Jay Pritchett, a salt-of-the-earth type of guy in his 60s, asks his attractive younger wife Gloria (played by Sofia Vergara) if she knows where his “good underwear” is. Her reply is a funny sitcom one-liner, but it’s also become sort of my guiding principle moving into what I presume is the last third of my life. (This line is funnier if you can hear it in Sofia/Gloria’s Latina accent, but I couldn’t find a clip of it).


“The question is, why isn’t all your underwear good, Jay? You make a nice living.”


This is the question that has cut to the heart of my approach to “midlife and beyond.” Why is not all my underwear good?






If you were hoping this post was going to be mainly about my underwear … well, that’s weird. Sorry to disappoint. I am taking the question literally, throwing out old underwear as soon as they get holes or the elastic starts to go and immediately buying new ones in my favourite colours and styles, which I wouldn’t have done a few years ago. But I’m not going to post pictures or anything. (Jockey for Her French Cut, though, if you really want to know).


No, I’m thinking about the broader implications. Why are not all my T-shirts comfortable T-shirts? Why are not all the books on my shelf books that I love? And so on.


Although so far, I’ve really only gotten around to dealing with the books and the T-shirts (and the underwear). But given how much I love both books and T-shirts and how many of each I have, that’s a good place to start.



Maybe for some people it’s Feng Shui, or minimalism, or the Life Changing Magic of Tidying Up, or the Life Changing Magic of Not Giving a F#@k, or whatever, that guides their personal organizational principles. But for me, it’s Gloria’s question to Jay that I steer by. If I don’t consider it “good,” why is it here?


Let’s take T-shirts, for example. Anyone who knows me in real life knows I’m most comfortable in jeans and a T-shirt. And since I’m lucky enough to have a job where I can wear jeans and a T-shirt to work, this is the outfit I’ll most often be seen in, day in and day out, unless I’m going to church or to dinner at a nice restaurant. As a result, I have a lot of T-shirts. I’ve amassed this great collection of T-shirts with clever slogans, or images from favourite TV shows or books or music or whatever. Or shirts I cherish because they were a gift from someone.


But I realized this summer that I have a short list of about five shirts that I cycle through, and the twenty or so other T-shirts, no matter how amusing I find the designs, only get worn when all of those five are in the laundry. The reason is simple: those five are comfortable. Like super-comfortable, both in fit and in fabric. The others, even the ones where I deeply love the image on them, are either not a great fit, or made of a fabric that doesn’t feel great against my skin. I know I’m being insanely picky about this but … if this is the thing I wear most, why shouldn’t I be?


So this summer I decided to take a big step. Last week I did a purge of my T-shirt drawers. Some of the old shirts with great, fun designs whose fit or fabric I just don’t love are in the process of finding new homes. And I went to the source of some of my most comfortable T-shirts, and ordered a few more designs I liked in the same fit and fabric as the ones I love, so I’ll have a new stable of comfortable shirts.


This is such basic stuff, I can’t believe I’m only figuring it out at the age of almost 52. And obviously it’s dependent on the fact that, like Jay Pritchett, I make a good living — if I didn’t have the money to replace my shirts I’d have to make do with the ones that aren’t so comfy. But in the grand, universal scheme of things, T-shirts are not that big an expense for an adult who makes a decent salary and spends very little on clothes. Why not have things that I truly love and am really comfortable with?


I’m also in the process of applying this principle to the vast metres of bookshelves that sprawl all around my house, though obviously in that case I’m not as much inspired by comfort. In a future blog post I’ll tell you how I’m applying “Why aren’t all your books good books?” to my shelf space, but for now, I’m going to go back to admiring my new, smaller T-shirt collection.


Published on July 24, 2017 08:09

April 30, 2017

I Refuse to Call it a “Faceversary”

One day late in April 2007, one of my students uttered four fateful words. “Are you on Facebook?”


I snorted my disdain. “No, because I’m an adult.”


I was pretty internet-savvy: I had had a personal website since 1995; I’d started this blog in 2006; I spent a good bit of my spare time back then on internet discussion boards (the late lamented ParentsPlace and Television Without Pity, and the still-going-strong Ship of Fools).


But Facebook? I’d heard of it, of course — heard that it was going to be the new MySpace and that all the college-aged kids and some of the high schoolers were hanging out there. It just didn’t seem like something I’d be interested in.


Just a couple of days after my snarky comeback to my student, I had coffee with a few other adults — my friends the Strident Women, also still going strong 10 years later — and found that a couple of them were on Facebook. And we agreed that if we all joined, and created a private discussion group, we could use Facebook to carry on the kind of snarky conversations we usually had over once-a-month Sunday coffee.


So I did it. I joined Facebook, and the rest, as they say, is history. So much history that yesterday, Facebook attempted to wish me a happy 10-year “Faceversary.”


No. Just no. I am not going to say that word.


But it’s probably worth a few moments’ reflection to think about the impact of a website that has played such a big part in my life, and the lives of others, over the past 10 years.



Facebook itself has changed in those 10 years, of course. Instead of being the purview of the young and cool, it’s now the hangout of the middle-aged and older. The kids, while they still may maintain rarely-updated Facebook pages (mostly for the purpose of getting birthday wishes from Nan and Pop, one feels), have decamped to Snapchat and Instagram and probably a bunch of other sites I haven’t even heard of. And Facebook’s influence in the broader culture has deepened immensely as it has moved from being just a purveyor of social contacts to being a purveyor of information — and misinformation (aka “fake news”).


It’s also popular for people to decry Facebook: as a time-waster, as a perpetrator of the social-media bubble where we all listen to people who agree with us, as a flimsy substitute for real face-to-face contact. While many of the people I’ve been connected with over the years on Facebook have quietly slipped away from the site due to lack of interest in it, others have made highly public departures, announcing they are quitting Facebook because there’s too much drama, too much controversy, too much stress. Some return; others don’t. Some make it a personal discipline to self-impose limits: I’ve known many who gave up Facebook for Lent. (In the wake of the recent US election, I observed a Sabbath break from Facebook for a few months: for a while there the news and opinions were so stressful that I needed a once-weekly break from my news feed just to feel mentally well).


I’m a big believer in the principle that very few things are inherently good or evil in and of themselves: most things, including media, are just tools. The good and evil exist in the use we make of them. So I don’t believe Facebook is good OR evil. However, on this ten-year anniversary, it seems worth taking a moment to examine both the good and bad things that have come out of this particular tool, for me.


The Bad:



It’s definitely a time-waster. The whole internet is a time-waster; Facebook just distills all the potential distractions into one easy page with an infinite number of links to click on. There’s clearly an addictive quality here: I’ve succumbed to both the urge to spend hours mindlessly scrolling through my feed, as well as the urge to interrupt another activity to “just check my Facebook.”
Related to that, there are definitely times when I have scrolled my Facebook feed when I could have been engaging face-to-face with the real people in front of me. I try not to do this, but sometimes, like most of us, I do. (Of course, pre-internet, there were plenty of times I ignored real-life people to read a book, so the real issue here is that Facebook takes a bad behavior I was already prone to, and makes it easier).
The monolithic growth of Facebook has killed off good discussion in a lot of other online spaces. I used to have a lively community of people commenting on this blog; now, when I do post, I’m lucky to get a comment or two. Even if people are interested in something I’ve posted, they’re more likely to comment on the link I put on Facebook rather than on the blog itself. Dedicated discussion sites like the ones I used to belong to are too often gone or at least struggling. It’s not that people have stopped talking about stuff, but more and more they’re talking about it on Facebook, in a forum that doesn’t always allow the same kind of thoughtful reflection that a good, well-moderated discussion board does. While everything from my blog to discussion boards would have inevitably changed anyway over a decade, there’s no doubt that much of the change that has happened has been driven by the mass exodus of virtually everyone to Facebook.

The Good:



Connection. For me, Facebook still does best what it did originally: makes it possible for me to keep in contact with people I’ve known over decades, many of whom are now far away. Sadly, I’ve lost two dear friends to cancer over these 10 years, and Facebook helped highlight an important difference for me. When my friend Jamie died, I felt the loss of someone who was still part of my daily life, because we had connected so much online — over email, on our blogs, but particularly in later years on Facebook — even though we saw each other rarely. When my friend Linda, who didn’t use Facebook or any other social media, passed away, I felt not only the loss of her but also the missed opportunities, the connections we DIDN’T have, the things I didn’t know about her life. Linda liked for people to write hand-written letters and print off photos and send them through the mail, and while that’s lovely, the reality is that the quick Facebook post, the “like” when a friend posts pictures of her kid or pet or vacation, keeps that feeling of connection much more alive and immediate. Now, as another old friend from my college years battles cancer, I keep updated on her “cancer journey” Facebook page, where I can see pictures and updates on how she’s doing, and offer good wishes. I feel as if, in her illness, I’m still a part of her life and aware of what’s happening with her.
While connection with old friends has been the greatest blessing of Facebook for me, there have been others. For someone like me who hates to use the phone, Facebook augments email and text as a quick and convenient way to get in touch with people. Sending a Facebook message is my first go-to to make contact with someone, as long as the other person is also a Facebook user, and it’s especially good if you want to make plans with several people at once.
A deeper benefit, for me, has been the diversity of opinion Facebook exposes me to. Yes, as I noted above, websites that promoted thoughtful, reflective online discussion have been hurt by Facebook. But the discussion that does happen on Facebook includes a wider range of people than I’d normally find on a dedicated discussion site, and I’ve often been informed and sometimes startled by insights from unexpected sources — a friend of a friend, or an old acquaintance I haven’t heard from in years. Despite the accepted wisdom about social media “bubbles,” the truth is that my circle of Facebook friends exposes me to far more diverse ideas and vigorous discussion than my real-life circles do — both because the people I know online are a more diverse group, and because I find it easier to discuss complicated and controversial issues in a written forum than face to face. So while, for example, I don’t know anyone in “real life” who voted for Donald Trump, I’m connected with several such people on Facebook. The fact that at least a few of them are folks that I knew long before the 2016 election as intelligent, thoughtful, compassionate people has challenged a lot of my prejudices and exposed me to new ideas. Even if I still think these people are horrendously wrong in their politics — and I do — at least I’m learning to listen.

Have my ten years on Facebook been well-spent? Some of the hours that have passed in those years have definitely been wasted, while others have been spent in life-changing, wonderful connection. Yes, it’s a tool. I try to use it for good more than evil; I don’t always succeed, but I succeed enough that it feels worthwhile. 


Will Facebook still exist in 10 more years? Will I still be using it? Only time will tell. But on the whole, I can’t say I have any regrets about joining … even though I am an adult. I guess we’ll see how things look on my twenty-year … nope, still not going to say that word.


Published on April 30, 2017 05:18

February 19, 2017

Meanwhile, in Sweden …

It’s one of those days when an unhappy coincidence between the fiction I’m reading and the real world I’m living in has led to some troubling thoughts.


For the last couple of days I’ve been reading Ariana Franklin’s Mistress of the Art of Death, the first in a series of medieval murder mysteries. In this book, the heroine, a female doctor from Salerno who specializes in examining corpses (i.e., a coroner before that was a job description) is called upon to investigate the death of “Little Saint Peter” in Cambridge, England — the latest in a series of mysterious disappearances of young children. This being the 1140s, the deaths have been blamed on the Jews of Cambridge, who are reputed to have crucified at least one Christian child, possibly more. While the novel is fictional, some of the details of Little Saint Peter’s death are based on the death of William of Norwich in 1144, one of the earliest examples of Jewish blood libel, of which there were many instances in medieval Europe. (The “blood libel” link above goes to the Wikipedia article which gives a good overview; as always with Wikipedia, there are several more specific links available in the reference list at the bottom of the page).


I was interested in the character and the story, and not thinking particularly deeply about the blood libel aspect of the novel (which I knew about from history anyway), until I woke up this morning, finished the book, and went online to find that Swedish people were making fun of Donald Trump on the internet.


Not that Europeans, or anyone for that matter, making fun of Trump is particularly newsworthy. But this latest round of fun was based on something Trump said at a rally in Florida yesterday. Amid the usual round of incoherent ramblings aimed at assuring his supporters the world is a terrifying place and only he can protect them from Islamic terrorists disguised as refugees, he threw in the comment: 


“You look at what’s happening. We’ve got to keep our country safe. You look at what’s happening in Germany, you look at what’s happening last night in Sweden. Sweden, who would believe this?”


As it turns out, nobody (except Trump supporters at a rally) would believe “this,” because there’s no “this” to believe. No terrorist attack, no act of violence at all, carried out by refugees, terrorists, or anyone else, occurred in Sweden on Friday. There has not been a terrorism-related crime in Sweden since 2010, although it seems the US President (who allegedly gets much of his information from watching TV) may have watched a Fox News piece linking crime in Sweden to the increased refugee population. Maybe. But nothing was “happening” in Sweden the night before Trump made that statement.


It’s telling, of course, that only us enraged liberal snowflakes and the “left-wing media” who Trump recently labelled enemies of the people (and, of course, the Swedes) got up in arms about this. I haven’t seen any Trump supporters calling him out on this, any more than they were upset when Sean Spicer thrice referenced Atlanta as a site of a terror attack by immigrants, or Kellyanne Conway blamed refugees and immigrants for the non-existent “Bowling Green Massacre” and then claimed that she misspoke. (Here’s a tip: if your job is being a spokesperson for the most powerful man in the world, maybe be a little careful about words like “massacre,” as “massacres” are things people tend to get upset about).


For months, since long before he won the election, Trump has been grooming his supporters to ignore the line between facts and lies — by attacking the mainstream media, changing the definition of the term “fake news,” and making obviously false statements about things that only matter to his swollen ego. A case in point occurred at Thursday’s bizarre press conference, when Trump claimed he had won the biggest electoral college victory since Ronald Reagan. When it was pointed out that wasn’t true, his response was, “I don’t know, I was given that information.” (In fact, Trump’s electoral college victory was the third-lowest since Reagan; only George W. Bush managed to do worse — twice).


Does anyone (other than Trump) care, now that he’s president, how big his electoral college win was? Of course not. The only purpose of blatantly false claims like that is to destabilize the entire notion of “facts,” to remind Trump’s base that the only thing that matters is what the President says, and the only source he needs is “something I heard somewhere.” Don’t trust the mainstream media; they’re all fake news. Truth is whatever the leader says it is.


Why does this matter? Any of us can google how many electoral college votes every president has won and confirm for ourselves that the US president made a false statement and didn’t care about it. We can also check and confirm that there was no terrorist attack (or indeed nothing unusual at all) in Sweden on Friday night, no terrorist attack in Atlanta since 1996 (by a white right-winger) and no massacre, ever, in either Bowling Green, Kentucky, or for that matter Bowling Green, Ohio.


It matters because such false claims may be the modern equivalent of the medieval anti-semitic blood libel. It matters because twelfth-century English peasants could be led to believe that their Jewish neighbours were crucifying children, or mixing murdered children’s blood into Passover bread. In the same way modern, educated Westerners, surrounded by more sources of information than the world has ever imagined, can be led to believe that countless crimes are being committed by immigrants and refugees, even though almost no evidence of such crimes exists. And if someone comes forward with the evidence? It’s “fake news.” Or the mainstream media is not reporting all the attacks that are taking place. Or we misspoke, but the underlying idea is still true and shouldn’t be discounted just because we got some pesky little facts wrong.



You can almost imagine Kellyanne Conway or Sean Spicer as a medieval abbess and bishop at an 1144 press conference, assuring the press that there are dozens, maybe hundreds, of these poor kids being murdered by Jews every year. You can’t trust the mainstream press to tell you the truth — just listen to the Church, we’ve got the real story. By the way, would you like to pray at the shrine of the little martyred saint while we hang a bunch of Jews?


Nobody is denying that refugees and immigrants sometimes commit crimes, though they generally do so in smaller numbers than native-born citizens do. (Note that I went with a conservative source for that link, just to be fair).


Nobody is denying that Islamic terrorist groups like ISIS and Boko Haram are real, horrible and deadly — and though the vast majority of their targets are other Muslims, ISIS in particular has carried out deadly attacks on non-Muslims in Western countries. Paris in 2015 and Brussels in 2016 have been the most notable such attacks in recent years. There have also, of course, been mass murders carried out by criminals who were Muslim and who appear to have acted alone but expressed support for ISIS (San Bernardino, the Berlin market, Nice, etc.).


People kill other people. It’s an unfortunate fact of human nature (I would say “sinful human nature” but of course that’s my own religious bias showing through). But there’s a particular kind of crime that the current right-wing populist politicians in the US, in England, in Europe, and yes, even here in Canada, are interested in highlighting: the mass murder carried out by a Muslim, ideally an immigrant or refugee, against white, ideally Christian victims. If there’s a direct or indirect connection between the murderer and a terrorist group, so much the better for the politicians’ purposes. But even if no such link turns up, one can be hinted at or suggested by the spin doctors.


The Paris attacks of November 2015 fit the template: so, as I’ve noted above, have several other killings in the last couple of years. What’s extremely fortunate for American citizens but inconvenient for the current US administration, which needs to bolster support for their immigration ban, is that very few attacks that fit that template have occurred on American soil in the years since 9/11.


Everybody agrees that mass killings are bad (of course, so are individual killings, another point I hope we all agree on). But mass killings that don’t fit the template do not get talked about or highlighted by the Trump administration. In fact, there was a terrorist attack, a deadly one, last week — but it didn’t occur in Sweden. It took place in Pakistan, where more than 80 people were killed and over 300 injured (according to most recent estimates) when ISIS carried out a suicide bombing at a Sufi shrine. I can’t find any record of Trump or anyone else in the US administration addressing this tragedy at all — because it doesn’t fit the narrative. Yes, “radical Islamic terrorists” are unquestionably to blame — but the victims were Muslims at prayer, and to draw attention to this attack would be to remind people that Muslims are far more often the victims of terrorism than its perpetrators (a point which also underscores why mostly-Muslim refugees are so badly in need of safe homes in other countries). The Trump administration, for obvious reasons, does not want to draw attention to the suffering of refugees.


Mass shootings carried out by non-Muslims also don’t fit the narrative. The Quebec mosque attack was briefly of interest to the US administration — Sean Spicer mentioned it early on the day after it happened, when it was still a possibility the perpetrator might have been a man named Muhammed, detained by the police for questioning. When police confirmed that the killer was a white, Quebecois right-winger (and the officially anonymous man possibly named Muhammed was an innocent bystander), the US administration went silent and has had nothing else to say about a mass killing just north of their border.


Nor have they had much to say about the shooter who killed five people in Fort Lauderdale airport shortly before Trump took office. Once the killer was revealed to be a non-Muslim, mentally ill US military veteran, the tragedy remained a tragedy — but a tragedy without a political point to be made.


So if your goal is to convince the population that Muslim immigrants and refugees pose a security risk so great that a wealthy nation must close its doors to people in desperate need, but Muslim immigrants and refugees aren’t co-operating by carrying out enough terrorist attacks to fit your narrative — what do you do? You do what medieval Jew-haters did in England — you make up attacks. You claim that lots of things are happening that the media isn’t covering, or you make vague allusions to things that never happened in Sweden, or Atlanta, or Bowling Green, or wherever.


Liberals, the mainstream media, and all of us who are fans of facts will cry out and rage about these lies, but our protests don’t matter. You’ve already defined and dismissed us as purveyors of “fake news.” Your base support group, already carefully trained to believe that you, the leader, are telling them the truth while the media lies, will not question whether anything really happened.


After all, most Trump supporters will reason, if confronted with their leader’s lies, that if it didn’t happen in Sweden Friday night, it probably happened somewhere, sometime. We all “know” Muslim terrorists who got into our country disguised as refugees are carrying out crimes all over the place, just as medieval Christians in Europe “knew” Jews did terrible things to Christian children. And that “knowing,” unsupported by facts or evidence, becomes the basis for scapegoating and targeting of vulnerable groups.


Medieval peasants had an excuse. Most of them were illiterate; their lives were filled with backbreaking labour, and even if they had time to investigate the truth behind the blood libel, information was hard to come by. We, living in the information age, have no such excuse. 


Perhaps as Canadians we think we can sit back and shake our heads at all the Trump craziness in the US and believe it has nothing to do with us. Don’t fool yourself. In Toronto this week, about a dozen people rallied outside a mosque holding anti-Islamic signs. Who cares about 12 people? Maybe nobody. But a much larger group — over a thousand, including among their speakers some Conservative party leadership candidates — gathered inside a “Christian” college (yes, as a practicing Christian I have to put the term in quotes there) asserting the importance of their right to free speech (again, I chose to link to an article from a right-wing source supportive of the rally). Shouts of “Ban Islam!” were heard in the crowd, and a Muslim journalist who attended the rally wearing her hijab claims that at least one speaker explicitly defined “freedom of speech” as the “right to hate.”


People who want a right to hate, a right to stir up fear, need frightening stories to whip up support. And not much has changed since 1144. If enough frightening stories don’t exist, there are always people ready to make them up.


Published on February 19, 2017 09:35

January 2, 2017

Top Ten Books of 2016

It’s that time again! I’ve made a video talking about ten of the books I liked best this year. Watch the video for a short description of each book, a little info about a new project I have coming up, and a chance to win a book from my favourites list. Check out my book blog for a more detailed discussion of what I read this year. 



Published on January 02, 2017 05:58

December 27, 2016

May the Force be with You (and also with you)


One weekend in the winter or spring of 1986, when I was a senior at Andrews University, my college boyfriend Rob spoke the words that college boyfriends have been saying to their girlfriends ever since:


“What? You’ve never seen any of the Star Wars movies?”


I had not. I didn’t think of myself as liking science fiction, back then. I liked fantasy — Narnia and Lord of the Rings — but I wouldn’t have gone to see a movie with spaceships and blasters and explosions in the outer reaches of the galaxy. However, when he proposed watching the videos to remedy this gap in my education, I was up for it.


This was 1986, and catching up with movies you hadn’t seen (apart from just randomly catching them when they aired on TV years later) was only beginning to be possible. It wasn’t yet easy. We had to find a friend (my cousin Jennifer) with an apartment and a TV, lug a borrowed or rented VCR over from someplace (actually, it may have been Rob’s own Betamax, brought over from his dorm — yes, I think I originally watched Star Wars on a Betamax; how’s that for ancient?) We had to hook it all up and watch the movies, one cassette at a time over three nights, if I recall correctly. And that was how I met Carrie Fisher, the iconic Princess Leia, who died today at age 60.


I loved her and I loved the movies. Yes, Luke Skywalker set off to “save the princess,” and along the way joined up with the roguish Han Solo (she loved him because he was a scoundrel) who became her love interest, but Princess Leia never passively sat around waiting for rescue. She was an active participant — in the Rebellion, in her own rescue, in every conflict that played out over those three movies. For many female movie-goers like me, unlike our boyfriends and brothers, the important thing was not that Return of the Jedi showed Leia in chains in a gold bikini; it was that while wearing that outfit, SHE FRICKIN’ STRANGLED JABBA THE HUTT.


As everybody has noticed by now, 2016 has been a bad year for celebrity deaths, and given that every year the stars of our youth get a year older, this trend may well continue into 2017. I was unpleasantly startled when I heard that Alan Rickman had died, having enjoyed many of his film roles. I wasn’t particularly moved by the deaths of David Bowie, Prince, or George Michael, as I wasn’t a fan of any of their music, though I recognized the impact they had on the culture and on the people who did love them. I loved Leonard Cohen’s songwriting, but at 82 I felt he had had a good long life and career. The 2016 celebrity death that actually hit me, that brought tears to my eyes, was the death of Carrie Fisher, just as the year is drawing to a close.


In the summer of 1977 — the same year the first Star Wars movie came out and I didn’t see it — I remember coming in from playing out on the street on a summer day and my mother saying, “Elvis Presley is dead.” I was only vaguely aware of who Elvis was, and I wouldn’t have considered my mother a fan of his — I’d never heard an Elvis record played in our house, and she didn’t really like pop music generally. But she was knocked back by Elvis’s untimely death, because they were the same age — forty-two that summer, which seems so young to me now. The celebrities of your own generation, especially those exactly your age (for me it’s Brooke Shields, a child star when I was just an ordinary child), hold up a weird mirror to your own life, and their deaths are a shock — I guess that’s why my mother grieved for Elvis.


The thing is, celebrities, like ordinary people, are always dying. In 2016 and in every other year. Why do some celebrity deaths grab our attention, interrupt our own lives, so far distant from their star-studded sphere? Because something in them echoes our own lives, or shines a light on those lives. Poets and spiritual seekers recognized a kindred spirit in Leonard Cohen. People who refused to be crammed into the narrow boxes of gender norms celebrated the same refusal in Prince and Bowie. If a musician’s songs were playing, or an actor’s movies dominated the screen, at the crucial moments of your own life, you feel you’ve lost a piece of yourself, of your own history, when they go. We still have the music, the movies, the books, though there’s the loss of knowing they won’t make any more. But what makes us grieve is that kinship, that sense that their lives were somehow being lived parallel to, commenting on and illuminating, our own.


Young Carrie Fisher was Leia, the princess who didn’t need the guys to come rescue her, who wielded a blaster and weird hair-buns instead of a tiara and a sparkly dress. Post-Star-Wars Carrie was kind of a mess, with the drinking and drug use and failed relationships, along with real and good work — Postcards from the Edge, all the script doctoring we never knew about. And then middle-aged Carrie was mouthy and honest and real, talking about addiction and mental illness and the impossible, crazy demands that the entertainment industry makes on women. She was on stage and page playing her pain for laughs in Wishful Drinking, and then there she was doing the talk show circuit for The Force Awakens with her dog tucked under her arm and her hilarious willingness to say anything, especially the unvarnished truth. Then there she was as General Leia Organa on screen again, older and graying and luminous and beautiful, grieving in Han Solo’s arms for all they’d lost, and still a tough, no-nonsense, revolutionary leader.


There are a lot of things wrong with our culture’s obsession with entertainers-as-celebrities, but at their best we relate to them because they tell us something about our own lives. Women, particularly geeky women, I think, of my generation — we loved young Leia for her strength, and we loved aging Carrie for her vulnerability. Both touched and changed us; both are gone now.


May the Force be with her, and may she rest in peace. And may we all have a little of Leia’s courage and Carrie’s relentless honesty as we journey through this galaxy not so long ago, and not so far away.


Published on December 27, 2016 21:09