
Of Skills, Content and Concepts: The Trouble with Ontario’s Curriculum

I have written before on this blog about the English curriculum and what some of its problems are; specifically, I highlighted the way it embeds antiquated and classist cultural values within its organization of core competencies, all of which are hidden beneath the veneer of a skill-based curriculum. Yet, if we take a wider view of the Ontario curriculum across subjects, it becomes clear that even the glaring flaws of the pseudo-skill-focused English curriculum are minor compared to the content-heavy, skill-poor and conceptually vacant documents that dominate many other fields.

Take any curriculum document from the social sciences, humanities, family studies, science, math or Canadian and world studies and you will see the same thing. There is one strand about skills and three or four strands focused on specific content. In some cases it gets as specific as listing which isotopes students must memorize or which events they need to be aware of.

The trouble is that skill and content expectations miss the point. They emphasize decontextualized processes or rote memorization, respectively. A conceptual curriculum is what we need, as it implies the other two without balkanizing either. If we organized our English curriculum around communication, representation, style and narrative, it would force students and teachers to work towards developing the skills and content needed to understand these concepts deeply.



 

'Life is Strange' and the Postmodern Condition

In a rare aspirational literary flourish in the midst of the typically dour prose of a thesis, I wrote the following:

"[T]he fractured post-modern self is comprised of shards blending production and consumption; the aesthetic and the industrial; fantasy and material life; childhood and adulthood…[Each has become an] increasingly unstable categor[y], threatening to melt before our eyes and disintegrate into a series of fragmentary, contested and uncertain moments in space and time.” 

I was writing about children’s toys and their connection to our present culture, but I may as well have been writing about the recent episodic video game Life is Strange. In the mystery-time-travel-mash-up, gamers are presented head-on with the twin pillars of postmodernity - authenticity and nostalgia - and forced to watch them crumble through the time-warped camera lens of a 17-year-old girl.


First, it is important to note that calling out nostalgia and authenticity as the pillars of postmodernity is not a particularly ground-breaking statement. Indeed, Hutcheon and Valdez have highlighted the significance of irony and nostalgia to postmodernity. Similarly, Goulding has hit on the importance of nostalgia and authenticity to postmodern capitalism and identity. In more concrete terms, we need look no further than the remix culture that dominates Hollywood cinema. Nirvana and Kurt Cobain paraphernalia abound in high schools. Forget the age of mechanical reproduction; we inhabit the world of digital reproduction, whose speed and pervasiveness defy even Benjamin's wildest predictions. What is old is (almost) new again. It is the age of the reboot. Video games are acutely dominated by these forces, as games are re-re-released and Kickstarters abound for IPs long forgotten by the mainstream. A reskin of Call of Duty is released on a yearly basis for the cadres of unimaginative, escapist gamers. So it is perhaps not surprising that it is a video game that has launched one of the most effective interventions on this topic.

For those unfamiliar with the game, players are asked to take on the role of Maxine “Max” Caulfield as they try to unravel the threads connecting Max’s manifesting time travel ability with the disappearance of a student from the elite Blackwell Academy and the increasingly unnatural weather. Each of these threats offers up a metaphysical or physical challenge to the peaceful and idyllic veneer of the fictional seaside town of Arcadia Bay, Oregon.

The premise is clever. In contrast, the stylistic and functional aspects of the game are aggressively mediocre. The dialogue is clunky in too many places. The character models move like drawing mannequins with joint deficiencies. And the female lead characters sound and act like they were written by a middle-aged white man with a voyeuristic lesbian fetish and a limited understanding of contemporary youth culture.

Despite its numerous and conspicuous faults, Life is Strange is one of those creations in the video game world whose whole is much greater than the sum of its parts. There is a deep river of narrative meaning beneath the barren rock of its banal aesthetic and technical execution. Life is Strange maps a far more compelling and fertile intellectual landscape when we consider it as an allegory for contemporary life. It is at this level that the game functions most effectively as a criticism of the postmodern condition.

In particular, it is the camera and photography that showcase the greatest faults of post-modernity. The trope of a teenaged girl with a photography hobby is a well-worn cliche. Max runs around her world snapping photos of people, places, flora and fauna on an old Polaroid camera, ostensibly for a competition she will never enter. The love that Max expresses for her old Polaroid, handwritten letters and other obsolete technologies shows nostalgia for what it is: a yearning for a permanence that never existed. The act of photography in particular offers the illusion of capturing and freezing time. However, when the photos allow the character to shift through time and reality, the promise of nostalgia is turned on its head. Especially ironic is the moment when Max’s carefully curated collection of nostalgia is burnt by the game’s antagonist, leaving her little choice but to rely on a digital selfie on a friend’s cell phone to undo the most recent turn of unpleasant events. In the world of Arcadia Bay, the ability to change the past through a photograph or rewind time and make decisions again without having to reload a save is more than a clever game mechanic; it lets the player see experiences as ephemeral, along with our attempts to capture them. It works as a subtle criticism of those around us in contemporary culture who are too busy Instagramming their food or watching concerts through their smartphone screens to actually experience life. More and more we find people who act like archivists of their own narrow and silly lives. Life is Strange shines a light on these behaviours and undermines them by destabilizing the moments which are captured, rather than the ones which are fully lived.

Additionally, the time travel mechanic calls into question the pursuit of authenticity which characterizes the postmodern malaise. As Max experiences different timelines, moving forward and backward, deleting and re-writing, the player and the protagonist begin to question what is authentic about Max or those around her. Indeed, Max is not the only photographer interested in “capturing” authenticity. Her photography teacher and the game’s principal antagonist, Mr. Jefferson, is in the habit of drugging and forcibly photographing teenage girls in order to “capture” their authentic purity and innocence right before it is stripped away. Who Max is, or who anyone is that Max admires, is thrown into serious doubt throughout the game. Unlike similar story- and decision-based adventure games like those from Telltale Games, players are not so much asked to define a character through choice as to play with choice as a means to slowly help the protagonist unravel into an uncertain heap of compromises and imperfect options. Indeed, the only characters who actually are authentic are frequently damaged, scared, or trapped by the circumstances of their lives. The PTSD-suffering former soldier and head of Blackwell security David Madsen jumps immediately to mind. We leave Life is Strange not so much with a sense of the inauthenticity of everyone, but with the sheer impossibility of authenticity and the terrifying outcomes that result from an over-emphasis on finding the authentic.

In short, if you are the kind of gamer with the right kind of eyes, Life is Strange is more than an interesting detective story layered atop mediocre technical and aesthetic execution. It is a potent and thoughtful narrative about the postmodern condition and the impossibility of our escape from it. The game is no manifesto - it offers no attractive alternative to our present state. Instead, it is a kind of survival manual about the importance of irony for self-awareness under conditions such as these. Like Max at the end of the game, we have no more photographs. Only a single moment that means nothing and everything.

 

On Teaching the Rehtaeh Parsons Case

My Canadian Individuals and Families in Diverse Societies class (CanFam for short) wrapped up their unit on gender and sexuality today. The unit covered gender and sexual identity, gender expression, patriarchy, feminism, and systems of oppression and violence targeted at gender and sexual minorities. As I usually do in the social sciences, I spent some time thinking about a case study that would highlight all the things we had focused on in this unit. I tried avoiding the obvious, but eventually, and reluctantly, settled on the Rehtaeh Parsons case.

For those of you unfamiliar with the disturbing details, a quick summary is warranted. In November 2011, at a small house party, then 15-year-old Parsons consumed at least eight vodka shots. Two older males from her high school, also inebriated, proceeded to take turns having sex with her and snapping photos; one showed a boy giving a thumbs up as he entered Parsons from behind while she vomited out the window. After seventeen months of these photos circulating and Parsons being subject to harassment, including being banned from returning to her school and grilled by RCMP officers on two separate occasions, she attempted to take her own life by hanging. Three days later she was taken off life support. At least one of the males, in a widely publicized interview, maintains that the sex was consensual despite the above evidence. No sexual assault charges were ever laid, though both males were charged with distributing child pornography.

These facts are not in dispute. What is in dispute is the meaning attached to them.

On the one side sit the two men, their parents, segments of the Cole Harbour, NS community where the assault took place, and right-wing journalists like Christie Blatchford. They have subtly used the codes of patriarchy to engage in a kind of character assassination of Parsons. They use small tidbits of information regarding her family life, her casual use of marijuana and her previous sexual behaviour to imply (though not outright say) that Parsons was in essence consenting by virtue of her reputation as a "fast girl” or “slut". In contrast, Blatchford, the parents and the males emphasize their traditional families and stress their own innocence as “boys” who didn’t know any better and are incapable of controlling their urges. Indeed, the male interviewee's parents were more concerned that their son had engaged in a threesome and made the “mistake” of taking a picture than with any of the other disturbing events surrounding Parsons' bullying and death. Blatchford described this stance as “brave”.

On the other side are Parsons' family and feminists of numerous genders and convictions who highlight that the law is quite clear: consuming alcohol, especially to the point of vomiting, makes it impossible to consent. Additionally, they have highlighted the way Parsons was repeatedly revictimized by the police, medical practitioners, the school administration and teachers, and of course other students at Cole Harbour High School. All of these instances eventually snowballed, leading to Parsons' death.

Facts do not speak for themselves. People speak for the facts.

This is heavy stuff for the seventeen-year-olds in my CanFam class. It is heavy stuff for adults. I’m not a crier. There are very few times I have choked up during a lesson. Teaching about Parsons is one of that handful of times. I didn’t like teaching about it, or talking about it. I don’t like thinking about it. Teaching about it was profoundly uncomfortable for me. It was hard for my students, too.

But real education is supposed to be uncomfortable. No one learns much of value by playing it safe and sticking to their comfort zone.

When we had a discussion afterward there were certainly expressions of outrage, especially at Blatchford’s abysmal apologia for a patriarchal order and misogynistic behaviours that demand the degradation and objectification of women to the point of their annihilation. In short, Parsons' case stands as a frightening distillation of the intersection of slut-shaming, sexual violence, sexism, and the pervasive impact of patriarchy on our daily lives - all the issues we had spent the unit talking about. But what my students took away - what they learned - is that patriarchy isn’t a set of things people do. Patriarchy is the thing that permeates institutions from the family, to the school, on up to the police force and medical profession. It is the thing that condones misogyny and sexism in advance and excuses it afterwards. It is the purview of none, but the sad and ruthless burden of all - a burden borne more heavily on the shoulders of women and girls than others.

As I encounter more and more post-feminists of all genders and sexual orientations, the Parsons case stands as a tragic and heartbreaking reminder that women and girls are still not equal. That even if progress has been made, there has not been nearly enough. Feminism, or the radical notion that men and women are equal and deserving of respect and dignity, is still an aspiration in many corners of Canada, including the column inches of the National Post and the homes of self-described “good” or “traditional” families. 

I don’t want to teach about Rehtaeh Parsons again. But I know I have to. Not because I want anyone’s son or daughter to grow up too fast, but because I want them to grow up in a world that loves, values and respects them, regardless of who they choose to be.

We Need a New English Curriculum in Ontario


Just about anyone you speak to either inside or outside of education will happily confirm the importance of English. Indeed, being able to understand and produce texts in various mediums is something of a prerequisite for any employment in the contemporary economic world. However, our English curriculum in this province is so stuck in the past that it has become disconnected from teaching the kind of skills that make English relevant.

If you read through the curriculum, you will find a great deal of language designed to create the impression that it is innovative and focused on contemporary literacy. Yet, when we consider the structure of the curriculum, what becomes clear is that it is overwhelmingly old-fashioned, myopically inward-looking, and obsessed with leveraging the so-called “canon” in order to reinforce long-standing distinctions between high and low culture. For the sake of brevity, then, I won’t be dealing with all the minutiae of the curriculum. My focus is on the structure.

Ontario’s English curriculum is divided into four strands and three to four “overall expectations” per strand. The strands are oral communication, reading and literature studies, writing, and media studies. This all sounds fine until you unpack what it means. Being able to access a novel in oral format via audiobook places you outside the curricular competence for strand two. This is important for English language learners working to master their new language, but the English curriculum places them at a disadvantage by default. The division between media and reading and literature/writing serves no purpose other than to reinforce the superiority of the novel and play (interesting, but mostly useless forms of writing) over and above other forms which sometimes offer greater insights. It also perpetuates the distinction between high and low culture, between the canon and newer, hybrid texts - a distinction educational scholars have long been advocating a departure from. Furthermore, the concept of media studies is ill-defined, including items as varied as print advertisements, video games and pop songs.

All this aside, what I find most offensive is the general disassociation of this curriculum from critical thinking and transferable skills. There is a throw-away metacognitive expectation in each strand that equates critical thinking with reflecting on how badly you edited your essay. This isn’t metacognition. Metacognition is reflexivity about your subject position, which means it should be connected to critical pedagogy about race, class, gender and other forms of positionality. 

Like this blog, a great deal of my writing and that of others is done in digital mediums. However, as soon as I ask a student to write something other than an essay or reflection, it ceases to be writing and becomes “creating media texts”. In-depth analysis of a hybrid or multimedia text is “analysing media texts” or "understanding codes and conventions”, not “reading for meaning”. In short, the curriculum implies that meaning is only evident in the canon, filled as it is with forgettable, racist, sexist and dated titles like To Kill A Mockingbird, Lord of the Flies and any number of Shakespeare's plays, which, despite needing to be performed, still somehow count as literature on most teachers' syllabi. On that note, an entire overall expectation, "reading with fluency”, all but begs English teachers to ask students to recite the bard from memory, as if this cocktail-party knowledge has some innate value.



If this sounds odd and arbitrary to you, it is even worse for educators, aside from those who drank the Kool-Aid in their English undergrads. As someone who is by training a historian of popular children’s culture, and so thinks little of high-low cultural distinctions, I find this troubling. Where are the thinkers that animate our life and world? Where is the intertextuality that should punctuate a good liberal arts education?

As an alternative, I would be inclined to point to the Saskatchewan curriculum which has three strands:

Reading and Analysing Texts
   a. oral
   b. written
   c. visual

Producing Texts
   a. oral
   b. written
   c. visual

Reflecting on Skills and Strategies
   a. oral
   b. written
   c. visual

This makes sense insofar as it reflects the reality of the texts our students find themselves encountering and encourages educators to make their course content meaningful and useful across mediums. It also matters insofar as someone who is strong in oral and visual literacy, but weak in print literacy, still has their skills acknowledged in a fair and equitable way. Let’s make this the model for Ontario’s next iteration of the English curriculum.


Doctors, Teachers and Hockey Players

A few weeks ago I was having a Facebook discussion with a friend of mine about physician pay. I felt it was too high (I was wrong). She argued that in fact it was barely fair and probably much too low. “Doctors should be paid like professional athletes,” she argued. This seemed crazy to me at the time. It also made me a little curious, so I wanted to run the numbers over the entire lifespan of your average doctor and average hockey player.

If you are comparing superstars, then NHL players certainly look very overpaid. Sidney Crosby will make $16.5 million this year including endorsements. Endorsements will decline over time. Superstars also have longer careers. Let’s say 20 years for someone who starts at 20. At the start and end of their careers superstars are paid considerably less, so let us say a total compensation of around $200 million. In contrast, the highest paid physicians make about $550,000 a year. Let’s assume they make peak earnings for 20 years with slower years on the front end and back end of their work life. This puts them at a comparatively modest $15.5 million for their career.

However, Sidney Crosby is not the norm among NHL players, nor is the $550,000 physician. Since the average NHL career is 5.5 years and the average NHL player makes about $2.5 million a year for those 5.5 years, they make around $13.2 million. After age 25 or 26 they retire and, assuming they do nothing further (which is unlikely) and just collect their $50,000 pension, they make another $2.6 million before they die at the average age of 78.

This is a lot of cash. So let’s look at how doctors stack up. Your average doctor has a long career, averaging around 35 years. The average doctor in Ontario makes $340,000 annually. So, running the rough numbers, the total work revenue for a doctor is $11.9 million over their career. Since those in private practice must pay for their own pension, while those who work in hospitals have some form of pension, let us err on the side of caution and say the average doctor's pension is $30,000 (excluding CPP and other forms of support) for the 13 years they would live in retirement, giving a total retirement income of $390,000.

In total, then, NHL players and doctors are paid $15.8 million and $12.3 million for their troubles, respectively. This means the average doctor makes 78% of the income of the average NHL player.
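If you want to poke at these back-of-the-envelope numbers yourself, here is a minimal Python sketch of the lifetime-income arithmetic above. The inputs are simply the assumptions stated in this post (average salary, career length, pension, retirement span), not verified data, and small differences from the totals quoted come down to rounding in those inputs.

```python
# A minimal sketch of the lifetime-income arithmetic above.
# All inputs are the assumptions stated in the post, not verified data.

def lifetime_income(annual_pay, working_years, annual_pension, retirement_years):
    """Career earnings plus pension income, with no inflation or discounting."""
    return annual_pay * working_years + annual_pension * retirement_years

# Average NHL player: ~$2.5M a year over a 5.5-year career, then a $50,000
# pension from roughly age 26 until death at 78 (about 52 years).
nhl_player = lifetime_income(2_500_000, 5.5, 50_000, 52)

# Average Ontario doctor: $340,000 a year over a 35-year career, then an
# assumed $30,000 pension for about 13 years of retirement.
doctor = lifetime_income(340_000, 35, 30_000, 13)

print(f"NHL player: ${nhl_player / 1e6:.1f} million")
print(f"Doctor:     ${doctor / 1e6:.1f} million")
print(f"Doctor earns {doctor / nhl_player:.0%} of what the NHL player earns")
```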

Now some would object that doctors have a much larger economic impact than hockey players. This is definitely true over the course of an entire career, but not necessarily on a year-by-year basis. Recent studies from the University of Ottawa show the Senators add $204 million to the economy annually. Assuming a player making the average salary on the Senators (Chris Phillips made $2.5 million this year), and that his share of the overall payroll is proportional to the direct and indirect economic impact he has, Phillips is responsible for 4% of that $204 million, or $9 million annually. In contrast, doctors add $3.3 million to the economy annually, according to the American Medical Association. There is little reason to assume this would be significantly different in Canada. So the average economic benefit over an entire working career would look like this:

As you can see, the average NHL player has a total career benefit of $50 million to the economy. Doctors, because they work longer, have a larger overall benefit that more than doubles that of NHL players. On a per-year basis as well, doctors' pay is only around 12% of their total annual economic benefit, whereas hockey players make about 28% of their total annual economic benefit. In short, doctors are in no way overpaid, but it isn’t clear that they are underpaid either, as many economic benefits are indirect. In contrast, our NHL player seems to be making pretty great money as a proportion of indirect and direct economic benefits.

So doctors are paid almost like NHL players, as it turns out, and produce an even greater economic impact. However, one might argue that both the average doctor and the average NHL player produce significant benefits that exceed the compensation they receive. This means we are certainly getting a fair return for money spent.

So what about other professions, like teachers? This is where things get even more interesting. When we look at conservative statistics regarding the impact of the average teacher on the economy, our graph looks like this:

Your average teacher produces $52 million, slightly more than the average NHL player, in economic benefit over the course of their career. On a yearly basis this is conservatively around $1.5 million. 

Now, when we take the average teacher salary in Ontario ($53,000 a year) and assume a career length of 35 years and 18 years of retirement, we get a lifetime income of $2.75 million. Here is what our second graph now looks like:


Teachers are not paid like NHL players or doctors, that is for sure. However, their economic benefit is greater. In fact, on an annualized basis, teachers are paid on average only 3.5% of their overall economic benefit. Compare that to the ratio of doctors' pay (12%) and NHL players' pay (28%) to their annualized economic impact and it becomes pretty clear that teachers are the best deal going from the standpoint of economics.
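Again, for those who want to check the ratios, here is a small Python sketch using the assumed annual pay and annual economic benefit figures from this post. With these inputs the doctor comes out closer to 10% than the 12% quoted above, which suggests my original calculation used slightly different inputs; the NHL and teacher ratios match.

```python
# A small sketch of the pay-to-economic-benefit comparison above,
# using the post's assumed figures rather than verified data.

professions = {
    # name: (assumed annual pay, assumed annual economic benefit)
    "NHL player": (2_500_000, 9_000_000),
    "Doctor":     (340_000, 3_300_000),
    "Teacher":    (53_000, 1_500_000),
}

for name, (pay, benefit) in professions.items():
    ratio = pay / benefit
    print(f"{name}: paid {ratio:.1%} of their annual economic benefit")
```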

Tests Aren’t Objective. They are Just Easy to Grade.

I get why you give tests. Trust me, I do. You have a family, a social life, and other obligations that need attention, and tests - let’s just be honest - are really easy to mark. I marked a mid-term exam today for my grade 12 social science course and it took me around an hour and a half. This is way easier than the effort required to properly assess an essay. In fact, ease of grading is the only honest defense I’ve ever heard for bothering with frequent (read: unit-based) testing.

This raises the question: how do tests benefit students? The answer is, they don’t. They aren’t meaningful exercises that enhance understanding. Doing more tests has not been shown to increase your ability to do well on tests. In short, they show some detail on where students are with the course content, but little regarding how well they understand it. At best, the act of studying for the test may improve retention and memory, but this isn’t a function of the test itself.

However, educators don’t have to spend much time defending their preference for testing. It is, by and large, an accepted rite of passage in the education system. This is in spite of the fact that the research on testing shows only limited benefits and numerous potential pitfalls. Multiple choice tests can actually generate a tendency for students to internalize incorrect answers. Other studies have shown significant harm to students’ willingness to engage with new material, and tests only exacerbate low performance as students decide that they are “a 60% student”. Furthermore, “the real world” has very few tests. What it has is a large number of projects and a few certification exams related to acquiring credentials. This is why, if I may be so blunt, you need to stop testing.

Keep your exam, by all means. It serves an appropriate socializing function that, while it will never improve accuracy through practice, at least helps students acclimate themselves to the reality of exams and practice their studying skills. This reflects and honours reality in important ways.

Tests, on the other hand, serve little educational purpose, especially as a unit capstone exercise. I certainly think tests can be useful as an assessment for learning - a quick snapshot showing what you, as the educator, need to revisit or spend more time on. They don’t, however, capture much in the way of student learning.

Tests are often rudimentary recall activities. Some overly optimistic educators talk of designing tests with "rich questions.” To be blunt, this is an absurdity. A question is only as rich as the answers it solicits. The time constraints of testing militate against rich answers, regardless of the question. Turn that "rich question" into a rich, multi-part assignment instead, one that brings together the content from the unit and forces students to apply it or test it in a meaningful context.

My students don’t write tests. My grade tens write papers on the meaning of gender identity at different times in Canada. My grade twelves analyze current events from a particular disciplinary perspective that asks them to decontextualize and recontextualize their learning within the world they live in. They write exams and score at grade level or above, because they understand the disciplinary logic and conventions they are working with. Even if they haven’t memorized specific content, many can use their deeper understanding to solve the problem in front of them come exam season.

I know it is hard, but do your students a favour: stop giving them tests as an assessment of learning. If you must test, use it instead as an assessment for learning - a diagnostic to show you where you need to spend some additional time reviewing concepts.

Bloodborne and Hardcore Gaming: Why Miyazaki’s Latest Creation Isn’t as Good as You Think It Is.

[Screenshot: a lamp save point in Bloodborne]

As a relative newcomer to From Software’s punishing creations, I committed what Souls and Bloodborne fans consider a cardinal sin: I pointed out that for all its merits, Bloodborne has some serious flaws. I foolishly went over to the Bloodborne Facebook page to post a short comment about the lack of pre-boss-fight checkpoints. Those who have gamed on any level are familiar with the long-standing convention of offering players an autosave before a big fight. I mentioned that it would have been helpful to have lanterns (the save points in Bloodborne) in a few more locations. At present they are few and far between, located only after boss fights and then at the start of new areas, which come directly after any confrontation with a major foe. In short, two lamp posts are usually crammed close together, followed by a lot of enemy-infested space without a save point in sight. To me, as well as to other gamers and reviewers, this seemed like a poor design choice.

The reaction was swift. Someone, who I'm sure meant to be helpful, told me to unlock shortcuts (which I already had) and concluded with the following bizarre remark: "don't blame the game just because you aren't playing it right." This is possibly one of the most ridiculous comments I have ever heard regarding a game, but from reading more about the fanboys and fangirls who rave about Hidetaka Miyazaki and his uber-challenging games, it is clearly a common logic among cadres of so-called "hardcore" gamers.

The idea that there are "right" ways to play and "wrong" ways to play is like suggesting there are correct and incorrect ways to enjoy or provide constructive criticism on books, films and other media. We don't tell people who find House of Cards melodramatic or those who see Game of Thrones as too convoluted because of the many, many characters that enter and exit that they aren't watching these shows right. We acknowledge that for those individuals any positive meaning or experience they derive from viewing is outweighed by their concerns and issues with the program. This is entirely reasonable and a basic principle of free speech. Intelligent people can disagree.

So aside from being patronizing and annoying, comments about how to play a game right reveal something rotten at the core of hardcore gaming's subculture. Many of the issues become strikingly clear when we consider the major flaws in what is an important, but ultimately mediocre, game.

Narrative With No Gaming, Gaming with No Narrative

Certain segments of the hardcore gaming population have been vocal about what they refer to as "non-games". By this term, critics are referring to narrative-driven interactive experiences like Gone Home. While these kinds of games can be rich, engaging and interesting experiences, their lack of death, combat, or a male protagonist qualifies them as not really games in the minds of some. In fact, these most certainly are games. The player controls a character and interacts with the world. Gone Home simply sits at the extreme end of the spectrum between narrative-driven and gameplay-driven video game experiences. It is this very tension which game criticism has dubbed ludonarrative dissonance. In short, ludonarrative dissonance describes the tension inherent between interactive gameplay (ludo) and attempts to tell a story (narrative). The Grand Theft Auto series and other sandbox games are some of the clearer examples, where players can behave in ways that undermine attempts to tell a story and thereby fall out of the game’s supposedly immersive world. Instead they allow the violent gameplay to transport them uncritically into a violent world and absorb messages that run counter to the story. You can read the post that started this discussion by game designer Clint Hocking here.

When games are done right, designers try to use gameplay and narrative to reinforce one another. This tension between play and story can help to generate brilliant experiences. The Last of Us stands out as a game which provided meaningful gameplay decisions that worked with the narrative to create a fabulously rich experience. The self-interest of gameplay is ultimately reflected in the narrative and vice versa.

While Bloodborne and Gone Home are both enjoyable and interesting games, both try to do an end-run around the problem by minimizing either narrative elements or gameplay elements. Rather than making them great, this frightened retreat from the major tension of contemporary game design shows the weakness of both games. Indeed, when one of the few articles about story describes Bloodborne's and other Miyazaki games' stories as "vague and bizarre", it raises serious questions about the quality of the game. Gone Home tells an interesting story that unfortunately does little more than bring the player along for the ride. Bloodborne has the opposite problem: engaging gameplay that is fundamentally meaningless because of a convoluted and empty narrative. Certainly Bloodborne has things to say about the futility of existence, the absurdity of life and other existential issues, but it doesn't explore these subjects enough to even develop them into themes. It is like a version of Beckett's Waiting For Godot where Vladimir and Estragon repeatedly hang themselves from the tree - over and over again until death loses all meaning in a Sisyphean blood-orgy. Thus, Ben Kuchera's attempt to suggest that Bloodborne is about mindfulness, while an appealing defense of the game on the surface, only highlights with extreme clarity this game's major shortcoming: mindfulness becomes the only coherent subject to be pulled from this lengthy game. Furthermore, when we consider that the lack of a meaningful narrative to enrich gameplay was one of the main reasons for Destiny's well-deserved mediocre ratings, it is clear that Bloodborne may be an enjoyable game, but it is not a great game. It also seems to have been given a free ride by the mainstream of the critical establishment. It wants players to be challenged, but, as if it were a mirror image of Gone Home, it opts for the conservative design choice of pretending narrative doesn't really matter to games. If, as many in the hardcore community suggest, Gone Home doesn't count as a real game, then neither does Bloodborne. For the record, I think both count as games, but neither deserves the praise it has received.

What's With All the Praise?

Why then does another good (but not great) game receive such stellar scores from reviewers? This is partly a product of the political economy of gaming and game reviews at the moment; however, it is also about the (mostly ugly) identity politics being played by the hardcore gaming community. Dave Thier's excellent article for Forbes highlights both of these issues. On the one hand, Bloodborne seems better than it is because the vast majority of PS4 and Xbox One games have been so aggressively mediocre since the consoles' releases that the Wii U is threatening to make a comeback from the edge of oblivion. Self-described hardcore elements within the gaming community are troubled by this, as it implies that “serious” games are failing a wide majority of players. Hardcore gamers desperately needed a hit AAA game, and so have latched onto the first halfway decent game published for core- and hardcore-friendly consoles. When coupled with the fact that most of the reviews were written by so-called "experts" (read: fanboys) of Miyazaki games, inflated praise was partially to be expected.

Yet, the praise also reveals something about the culture of game journalism and its intersection with hardcore gamer identity. What hardcore gamers (among whom many journalists may number themselves) love most about Bloodborne is its inaccessibility. This is where comments, like the one that started this post, come from. Like a film buff who only watches experimental art house flicks so they can ridicule those who prefer less abstruse media texts, Bloodborne players like that most people will hate the game and give up quickly. It reinforces their own notions of a division between hardcore and casual gamers that helps them to claim special authority and expertise. It allows them to dismiss voices that offer alternative and less naive thoughts on the world of video games. What is perhaps most interesting is the way the punishing gameplay of Bloodborne and the Souls games feeds into very old ideals about masculinity that are central to hardcore gamers' sense of identity. Specifically, the mostly male reviewers have hit upon the satisfaction one feels, after repeated deaths, of beating a foe. This is the same kind of valorization of sacrifice and machismo that is part and parcel of the honour ethic of identity. Bloodborne is not alone in this. Rather unsurprisingly, many core and hardcore games, especially those given the inflated title of e-sports, draw on these same notions. The implication is that hardcore gaming and love of hardcore games are evidence of real masculinity, and casual gaming is, by extension, synonymous with femininity or male effeteness. Game reviewer Gennevieve Leblanc suggested as much in a recent radio interview where she offered a mostly positive review of Bloodborne. Given the recent gender trouble that has plagued game culture and the absolutely disgusting activities of so-called “GamerGate” trolls, we should be very critical of these subtle slippages around gender, even among well-intentioned reviewers.

Consequently, the confluence of the current political economy in gaming as well as the male-centric category of hardcore gaming has created an environment where a good, but not great, video game can become a candidate for game of the year. The reality, however, is that this has nothing to do with From Software's recent creation, which is plagued by poor design choices and a superficial narrative, and everything to do with the culture war ongoing at the heart of the gaming community between "hardcore" gamers and the rest of us with enough basic intelligence to recognize that things in the virtual world also matter in the real world. Nothing, not even Bloodborne, is just a game. Everything happens in real life.

Let’s Talk About Cursive

Here is a quick test to begin. Can you read the document below?


[Image: a document written in secretary hand]

Source: University of Toronto Thomas Fisher Rare Book Library

Unless you have been well trained in paleography, I assume its contents, aside from a few words here and there, were totally indecipherable. It may surprise you to know that this is in fact written in English, in a form of writing called secretary hand. Secretary hand was used for official documents for much of the early modern period and survived in some cases up until the late nineteenth century. This little anecdote illustrates quite clearly the absurdity of our continued obsession with cursive writing.

As the above sample amply demonstrates, cursive writing was created for speed, not accuracy. In fact, the preponderance of block letters in Latin and vernacular-language public notices from the classical and medieval periods highlights this fact quite clearly. Finally, cursive is generally considered the product of adapting writing technologies like the quill and the calligraphy pen to the need to record information. If you have ever used these items, you would know that pen lifts create huge messes unless done with great care. Indeed, part of the reason Gutenberg’s press used moveable block letters instead of cursive letter forms is precisely accuracy. Cursive is in no way superior to block-letter printing in any material or technical sense. In fact, given our technological surroundings and the need for widespread literacy, it is an archaic writing form just like secretary hand.

So why are some educators and parents still so obsessed with children mastering it?

As with many things, the answer is a matter of social class, rather than social or cultural necessity. As more and more documents were typed, or printed using block letters, cursive became a mark of distinction for the wealthy classes. While technical schools taught block printing because it was easier to read and more accurate, collegiate institutes and grammar schools continued to focus on cursive. Cursive, then, is, and has been for some time, little more than an affectation of class distinction. In the present material and technological reality, it is a poorly masked attempt to obfuscate meaning by those intent on projecting an image of themselves as the social betters of others. Cursive has no bearing whatsoever on the quality of your ideas.

All of this is to say, if you and your pen pal want to swap letters in your secret code of flowing handwriting, or take notes in cursive because it is faster for you, that is fine. However, do not mistake your preference for marks of class distinction as evidence of cursive's superiority. It is a lovely skill to have, I suppose, but far from necessary, or relevant. That is, unless by some chance you happen to be a professional historian like me. In which case, it is a frequent occupational hazard.

As teachers we have enough important content to cover. Gestures oriented towards the vanity of certain social classes should never be a core educational component. After all, we haven't been in the business of producing little ladies and lordlings for quite some time.

Critical Thinking and Creativity

Unless you’ve been living under a rock, you have had some sort of PD focused on the importance of critical thinking and creative thinking to long-term academic success. As a gesture, I am fully on board. One of my biggest annoyances as a university instructor was the often limited analytical skills my students possessed. They could drown you in quotes and evidence, but offered little in the way of exploring their meaning in order to progressively develop a thesis through analysis. With few exceptions, the best they could hope for was to “prove” their thesis, which, aside from being weirdly positivist, is the academic equivalent of writing stereo instructions.

Yet, much as I support the general tenor of this discussion, I have serious pedagogical concerns about the conflation going on here between critical and creative thinking. Both are essential, but treating them as interchangeable, as many do, means you are likely to foster the latter without any of the former.

Allow me to demonstrate. One of the most common assessment tasks given in intermediate history classes these days is the soldier’s letter home. In this scenario, students pretend they are a soldier writing back home and describe the conditions of their existence on the front as well as their feelings. Certainly this assignment leads to some absolutely lovely journals. Some students even rub tea on the paper to give it a weathered look. It’s all so amazing. That is, until we consider the fact that all the student has done is written a short story using facts and information supplied by a textbook. Creative it may be, in some limited sense, but it is little better than a fill-in-the-blanks worksheet. Unless we think history is all about memorizing established facts, which it very much isn’t, then this is a pretty poor example of a rich task.

So how could we make this better? It begins with drawing clear distinctions between creative cognitive processes and critical cognitive processes. Creative thinking is about conveying meaning in new forms. However, the hard work of determining what that meaning is and deciding what is meaningful belongs to the realm of critical thinking. Creative thinking is about representation. Critical thinking is about analysis. Critical thinking must occur first to enrich the creative process; otherwise you are getting work that merely parrots the meanings made by others in a new form or style. Creativity as an end in itself is nothing but vapid aesthetics.

Let us revisit our sample assignment and revise it in light of the above paragraph. In practice we could still have the student write the letters home. However, rather than have those letters follow one perspective, it would be better to have the correspondence occur between two individuals such as the soldier and their offspring, a battlefield nurse and their spouse or offspring, or the soldier and their spouse. This allows students to begin to consider conflicting perspectives on, and experiences of, the war.

In order to write these letters, students should have to work with primary evidence, not their textbook. They need to see that the experience of soldiers and civilians was not uniform or easily reducible to a coherent narrative. Thus, it forces them to make choices about who they will represent and how they will represent them. This is where we can insert some research and analysis into the process of developing the letters.

Finally, students need to respond to their own production. They should explicitly analyze what they chose to include and what they chose to leave out and discuss the impact this has on the version of the past they present. They should comment on the conflicting perspectives of different groups during the war and how these are faithfully (or not) conveyed by their series of letters. They should compare their letters and primary research to what their textbook says and discuss how their understanding of the war is similar to and/or different from the formal version their ministry-sponsored tertiary source provides. The format of the response doesn’t matter. What matters is that they are evaluating their own reconstruction of the past in light of competing evidence and competing accounts. This is critical thinking.

Many teachers will complain, I am sure, that there simply is no time for this kind of depth. My response would be that less is almost always more. With an assignment like this, do you really still need that unit test? Do you still need so many paragraphs regurgitating basic facts?

To put it succinctly - should your students be spending their time learning history or being historians?

Breadth and Depth in Education

The breadth versus depth debate is a pretty common one. Throughout secondary education you will hear the refrain over and over again that we should study fewer concepts, but study them more deeply. I have no objection to studying a few concepts in great depth, but what I have noticed is that there seems to be a great deal of confusion over three things: (i) what a "concept" actually is in a humanities course; (ii) what the connection is between texts and the development of conceptual knowledge; and (iii) the distinction between deep reading, by which I mean sustained intratextual criticism, and conceptual depth.

There has been a lot of writing on this issue lately, both in the popular media and elsewhere. One study receiving a great deal of attention declared that depth is unequivocally more important than breadth. It identified a strong correlation between the amount of time spent studying a topic and the performance of students in first-year university. It determined the breadth versus depth of their science education in senior secondary school by surveying the students about how many instructional days they spent on specific concepts in high school. To my mind there is a major problem with this study's conceptual design and the extent of its bold claims: the conflation of depth with time spent on a subject is highly problematic. One could spend four weeks surface learning, drilling, re-drilling, memorizing and regurgitating to ensure student mastery and high results. One might ask relatively simple first-order questions during that time. There are just too many unknowns in this confusion of terms. Setting aside informant error and the possibly problematic conflation of time with depth, the study raises important questions about studying a lot of different things rather than studying a few. But note what it refers to: concepts, not texts.

The Province of Alberta's Ministry of Education tackled the problem head-on in a curriculum document in the mid-2000s. After a sustained and comprehensive survey of a number of academic studies and curriculums from around the world, the document concludes by encouraging educators to move from "surface learning, which tends to focus on fact finding and rote memorization, to concept-based inquiry [which] allows students to develop abstract thinking which causes them to think more deeply and in an inductive fashion. The universality of conceptual learning also has value beyond school, as students can see the relevance it has to them personally, to their community, to their country and to the world as a whole." Again, notice that the discussion of depth focuses on the concepts investigated, not the texts used in their investigation.

It does seem that less is more when we are speaking about conceptual knowledge and understanding, but it is as yet unclear how the number of texts or resources used fits into this equation. Admittedly, I am a historian, and we have a curious understanding of the uses of texts when compared with our brethren in the humanities. We don't tend to wax poetic or philosophical over a single text. We use texts to access deeper understanding of overarching concepts and their individual multiplicity. This doesn't mean avoiding deep reading skills; it means that deep reading is only one tool available to us. This tendency is most explicit in historical methodologies, but should you venture to read any academic article in the humanities you will actually find the same implicit process at work. Most academic articles in English, for instance, will focus on one small portion or aspect of a text, like Gatsby's parties. Typically, though, they do so to further understanding and knowledge about how Fitzgerald constructs gender, or how whiteness is performed. Without the ability to relate their investigation to broader conceptual questions that transcend disciplinary and institutional boundaries, these studies would be relatively superficial and of limited interest. It is the conceptual self-awareness of the study that provides the depth befitting academic merit. It is what allows the study to go beyond being interesting and to become an instance of knowledge creation.

Yet, I am sometimes worried about the way we emphasize depth in secondary classrooms. Frequently it is framed in terms of core texts rather than overarching concepts. In my mind, this is a troubling misunderstanding of the issue. The depth versus breadth article widely quoted in the media didn't ask how many textbooks or resources the students used when studying concepts. In fact, what research we do have on that issue suggests that a variety of conceptually related, but diverse, texts fosters greater depth of understanding than a single text. To me, this suggests that we should be giving our students more texts, not fewer, but connecting them back to a small handful of overarching concepts, like a few specific schools of literary criticism and an overarching issue for the course as a whole (e.g. isolation and community). Thus, the concept in English classes isn't The Great Gatsby; rich as that text may be, it isn't conceptual. It warrants a close reading, to be sure, but it is not the basis of conceptual depth for students. Reading Gatsby and connecting it to a series of smaller texts that emphasize different parts of the book is the proper way to deepen conceptual knowledge. Core texts are the anchors we tie our concepts to, lest they float away; yet, all too often they are confused for the concepts themselves.

Consequently, a rigid focus on intratextual analysis at the expense of intertextual analysis prior to grade twelve paradoxically ensures that students do not develop a depth of understanding even while engaged in close reading activities. A better approach is to decide what concepts you want to anchor within all of your core texts and to focus on reading them for those specific purposes, while acknowledging that there are many other issues one could explore. This also means providing a rich array of conceptual texts to help model and develop student analysis along those chosen lines. For instance, one could just look at Hamlet for the tragic hero, or one might wish to explore it as a text that anticipates the numerous modern derivatives of the tragic hero, specifically the antihero and the absurd hero. If studying the tragic hero alone, you would need many conceptualizations of it, not just the Aristotelean, to achieve any real depth of understanding through intertextual analysis. Studying the triumvirate I outlined immediately forces the student to read closely, weigh evidence and engage with overarching concepts. It is deeper because it understands the difference between texts and concepts and avoids conflating the two.

So while I know we can all get behind depth over breadth in matters of conceptual knowledge, let's also acknowledge that depth in a particular concept involves encountering it in numerous iterations. This means that conceptual depth is partly a function of textual breadth. The issue isn't breadth versus depth, but how to use textual breadth properly to foster conceptual depth in the humanities.


© Braden Hutchinson 2014