Some Thoughts on March Madness (and I Don’t Mean Basketball)


The New York State Common Core English Language Arts Assessments will be upon us in a few weeks, and this year they arrive against a backdrop of controversy over the use of standardized tests. More parents than ever have joined the opt-out movement, refusing to allow their children to submit to tests whose validity they question. Diane Ravitch has called for congressional hearings on the misuse and abuse of high-stakes standardized tests. And many states, including New York, have decided to slow down implementation of the Common Core and its tests, because as a Huffington Post education blog post states, “in far too many states, implementation has been completely botched.”

Whatever the ultimate fate of the Common Core assessments may be, this year's tests are going on as scheduled, and teachers are struggling over how best to prepare the students in their care, which has not been easy. Many schools around the country, for instance, adopted packaged reading programs that claimed to be aligned to the Standards and the tests as a way of hedging their bets, with New York City going so far as to commission a few key publishers to develop programs to the City's specifications. Yet having now seen some practice tests, many teachers feel that these programs haven't adequately prepared students for these tests. And they're not alone in thinking this.

According to a recent Education Week blog post—whose title, "Boasts about Textbooks Aligned to the Common Core a 'Sham,'" says it all—these programs should be viewed with caution, as few, if any, live up to their claims. Many, as the blog post points out, have recycled material from older, non-Common-Core-aligned programs, such as Pearson's ReadyGen, which uses the magazine Sleuth from its old Reading Street program for close reading practice on texts that don't really seem worthy of close reading. Others, such as Scholastic Codex, are so overly scaffolded—with teachers repeatedly directed to "assist students in understanding"—that it's hard to see how students are being prepared for higher-order independent thinking.

Meanwhile the practice tests provided by Curriculum Associates' Ready test prep program, which most city schools are using, are insanely hard. Sixth graders, for example, most of whom have had no exposure to chemistry, must read a speech given by Madame Curie about the discovery of radium. The passage contains a great deal of content-specific science vocabulary, and while some of the words are defined for students, as you'll see below (underlining mine), the definitions seem as incomprehensible as the words they're meant to explain.

Madame Curie Speech

Meanwhile seventh graders are subjected to an excerpt from Charles Dickens's Oliver Twist, poems by Keats and Yeats, and a speech by Ronald Reagan commemorating the 40th anniversary of D-Day—an event seventh graders won't learn about until eighth grade (provided, of course, that amid all this test prep, there's still room for social studies).

With these texts, traditional test prep strategies don't really seem to help. Process of elimination, for instance, will only take you so far on tests where more than one multiple-choice answer seems completely plausible. And telling students to "make sure you understand the question before choosing an answer" seems almost laughable when the questions and answer choices are like the following:

Hybrid word question

But what’s really disturbing is that the Ready instructional test prep workbook doesn’t seem to help either. It’s organized in sections that correlate to individual Standards and skills—summarizing informational texts, analyzing text structure, determining point of view, etc.—but the workbook’s texts, questions and tips seem absurdly simplified when compared to the company’s practice tests. Here, for instance, is how the test prep workbook for seventh grade talks about point of view:

Analyzing Point of View

And here is a point of view question from a seventh grade practice test on a text called "Country Cousin/City Cousin," which consists of two sections with different narrators who, through dialogue, express not only their own perspective but their cousin's as well:

Narrator POV Question

The workbook suggests that point of view is synonymous with a character's perspective, which can be conveyed through dialogue, thoughts and actions; yet this test question requires students to think of point of view only as a narrative stance, which isn't covered in the workbook. And even if they did get that, every answer except A seems plausible, since the remaining choices more or less say the same thing. Yet only D is correct.


From Open House for Butterflies by Ruth Krauss and Maurice Sendak

So, once again, what's a teacher to do? Aware of the problems inherent in both the packaged programs and the test prep materials, the teachers at a middle school I work with and I decided to take a different tack. At each grade level, we invited a small group of students who'd just finished a few passages from a practice test to talk with us about how it went. The point was not to discover who had the right answers, but to hear specifically what the students found challenging and how they, as readers and test takers, tried to deal with those challenges.

What the students said was enormously enlightening, as it gave us a window into how students were thinking, not just what they thought. (The confusion over what was meant by point of view, for instance, emerged during one of these talks.) And after listening carefully and considering the instructional implications, we were able to come up with a few tips and strategies that specifically addressed what students found challenging and how some of them had overcome those challenges.

Test Prep Strategies

We also noticed that the students were fascinated by how their classmates thought through their answers, so we also designed a new test prep practice. Rather than having the students practice simplified skills in the workbook or go over the answers to a practice test to find out which answer was right, we broke the students into groups, assigned each group a multiple-choice passage from a practice test they'd taken, and gave them a piece of chart paper. Their task was to first talk about the passage itself—what was easy, what was hard and why—then compare their answers, looking for questions on which they'd made different choices. Next each student explained to the group how and why they'd chosen the answer they had—in effect, making a claim for an answer and supporting it with evidence from the text. And after listening to each other, they debated and voted on an answer, recording their thinking on the chart paper. Then, and only then, did we consult the answer key.

Not only did the students find this more engaging than the worksheets and reviews, they also benefited from hearing how their classmates figured things out, which they could then try to do, too. Of course, it will be a while before we know how successful this approach was. But I have to believe that sharing the various ways different students solved the challenges these passages and questions posed was better than just reviewing the right answers. And in the meantime, I'll keep my fingers crossed that the powers that be will listen to parents and teachers as attentively as we listened to these students and bring an end to all this testing madness.

Stop the Madness

Keeping It Real in Test Prep Season: Some Thoughts about Nonfiction Text Structure

After an amazing weekend at the Dublin Literacy Conference, which was all about real reading and writing, I arrived back home to find many schools plunging into test prep. The New York State tests aren't until April, but many schools are already worried about this year's ELA test, which supposedly has been aligned to the Standards. The New York City Schools Chancellor has already said that he expects scores to plummet, and the sample tests the state has posted on its EngageNY website have done nothing to allay fears. Third graders are expected to read a story by Tolstoy, which a parent of a city third grader called "excruciatingly dull and confusing." And fifth graders are asked to compare two passages written from an animal's point of view—one from The Secret Garden, the other from Black Beauty—and discuss how "the animal's perspectives influence how events are described."

Given that teachers are being evaluated by test scores in New York and other states, the apprehension seems justified. And so the test prep workbooks have come out. These workbooks, too, have supposedly been aligned to the Common Core, and at least in the ones I've seen, a whole new crop of questions is being asked about the text structure of nonfiction texts in order to assess whether students are meeting Reading Informational Texts Standard 5. These include questions not just about the structure of the entire passage, but also about the structure of individual paragraphs and sentences, as can be seen below.

Here, for instance, is a fourth grade text-structure question about an article on the history of filmmaking:

History of Film Making Question 2

And here is another on an excerpt from the autobiography of one of the first climbers to reach the top of Mount Everest:

Tiger in the Snow Question

Each of these questions asks students to identify or match a sentence with a text-structure type, which, in terms of Webb's Depth of Knowledge, is only Level 1 thinking. Each can also be answered without actually reading the passage, which surely is not what the Standards intended. And all this has led to a new crop of test-taking strategies being taught—such as looking for text-structure signal words—which, in turn, is taking time away from authentic reading.

Ironically, these text-structure questions also fly in the face of some of the pronouncements of David Coleman, chief architect of the Common Core. I rarely agree with Coleman's solutions to the problems he sees in classrooms, especially when it comes to overly prompted models of close reading, but I often agree with his diagnoses. Here, for instance, in a presentation he gave to the New York State Department of Education, he comes down hard on what he calls "the strategy of the week"—i.e., using texts to practice a skill or strategy, such as identifying cause and effect—which I, too, believe is problematic in the way he describes:

“Nothing could be more lethal to paying attention to the text in front of you than such a hunt and seek mission. . . . When have you read a difficult text ever in your life and said, ‘I’ve got it now. It’s a cause and effect text not a problem and solution text.’ We lavish too much attention on these strategies in the place of reading. I would urge us to instead read.”

But all this does raise the question: Does knowing about concepts such as cause and effect, problem and solution, and compare and contrast actually help us, as authentic readers, understand what an author of a nonfiction text might be trying to say? I think it can, but not as reflected in the above kind of questions. To see how, let's look at one of the 'one-page wonders' Harvey Daniels and Nancy Steineke share in their great resource Texts and Lessons for Content-Area Reading: "Vampire Bat Debate: To Kill or Not to Kill" by Chris Kraul.

Vampire Bat Debate

If identification is the name of the game, the title alone lets us know that this is a compare-and-contrast piece. But if we want to truly understand the complexity of the debate, not just identify the text structure, we need to remember what we instinctively know as readers: that nonfiction authors frequently explore problems and solutions, causes and effects, and different perspectives in the pieces they write. And so as readers, we enter the text on the lookout not only for the different points of view alluded to in the title but also for the problems that sparked the debate, the causes and effects of those problems, and the real and possible effects of whatever solutions have been undertaken or proposed.

In this way, we use our understanding of those concepts to dig deeper into the text; they expand our understanding rather than reduce it, which is what happens when we try to fit a text that explores virtually anything complicated into a text-structure vise. And so beyond test prep, I don't spend a lot of time explicitly teaching text structures. Instead, with the vampire bat article, I've been asking students to consider how each paragraph adds to their understanding of the title's debate and how each is connected to the next. This has allowed them to construct their understanding of the complexity of the issue as they make their way through the text—and for problem and solution and cause and effect to rise up naturally as they read and discuss it, not because I've sent them on a hunt and seek mission.

I’ve also been asking students whether they think the author has an opinion, and many have said that they think he does—that he sides with the scientists, not the cattlemen, because he devotes more words and space to the scientists’ side and lets them have the last word. That seems a far more insightful analysis of the text’s structure than anything the workbook questions ask for. And it involves much higher levels of thinking than those multiple choice questions demand.

I truly believe that this kind of real reading can ultimately prepare students for the test as well as any shortcut strategies, such as hunting for signal words, can. And it produces none of the negative effects—the narrowing of curriculum, the stressful climate in classrooms, and the lack of critical thinking—that a coalition of Massachusetts college professors recently cited as reasons why their state should abandon high-stakes standardized testing. And so I find myself in the surprising position of echoing David Coleman: Let's try as much as humanly possible to keep it real by really reading.