Toward a Saner View of Text Complexity


As happened a few years ago, when eighth grade students took to Facebook to share reactions to a nonsensical passage about a talking pineapple from the New York State ELA test, this year’s Common Core-aligned test made it into the news again for another Facebook incident. Somehow a group called Education is a Journey Not a Race got their hands on a copy of the fourth grade test and posted over three dozen images of passages and questions on their Facebook page. Facebook quickly took the page down, but they couldn’t stop the articles that soon appeared, such as “New York State Tests for Fourth-Graders Included Passages Meant for Older Students” from the Wall Street Journal and “Educators alarmed by some questions on N.Y. Common Core test” from The Washington Post. 

As their titles suggest, these pieces took a hard look at the kinds of questions and concerns teachers have been raising since the Standards first appeared. And while it’s great that the press is finally reporting on what students really face on these tests, it seems they haven’t completely grasped that these exceedingly hard and often age-inappropriate texts—and the convoluted, picayune questions that come with them—are precisely what the authors of the Common Core had in mind.

As I write in my new book (which Katie Wood Ray, my editor extraordinaire, assures me I’m closing in on), the Common Core seems to have ushered in an age where third grade has become the new middle school, middle school is the new high school, and high school is the new college. And that’s all because of the particular vision the Common Core authors have about what it means to be college and career ready.

According to the Common Core, students need to build knowledge through content-rich nonfiction plus have regular practice with academic language to be ready for college and careers. And as many of us know by now, a text’s complexity is supposedly determined using a three-part model that considers the following:

  • A text’s Quantitative dimensions, as measured by Lexile levels;
  • Its Qualitative dimensions, which score the complexity of a text’s meaning, structure, language features and knowledge demands through a rubric;
  • And the Reader and the Task, which supposedly involves “teachers employing their professional judgment, experience and knowledge of their students” to determine if a particular text and/or task is appropriate for students.

I say supposedly because if you look at the texts and tasks on the test as well as those in many Common Core-aligned packaged programs, you’ll see some patterns emerge. First there seems to be a preference for texts with high quantitative Lexile levels, regardless of the other two factors. And when it comes to the qualitative dimension, tests, packaged programs and even home-grown close reading lessons seem to favor texts that score high in terms of their language features and knowledge demands—i.e., texts with lots of hard vocabulary and references to things students might not know.

These preferences are why a text like Minfong Ho’s The Clay Marble—which recounts the story of a Cambodian brother and sister who flee to a refugee camp in Thailand in the wake of the Khmer Rouge’s genocide and comes with a grade equivalent reading level of 6.8—was on New York State’s fourth grade test. And it’s why a text like Behind Rebel Lines—which tells the true-life story of a young woman who disguised herself as a man to join the Union Army during the Civil War and comes with a grade reading level of 7.2—is part of Pearson’s ReadyGen third grade curriculum.

You may have noticed that I didn’t mention the Reader and the Task, and that’s because it’s often not considered when it comes to choosing texts. On tests, in packaged programs and even in many home-grown close reading lessons, every child is expected to read the same text and perform the same tasks, which usually consist of answering questions aligned to individual standards. The only adjustment that seems to be made is the amount of scaffolding a teacher provides—and the Common Core Standards specifically direct teachers to “provide appropriate and necessary scaffolding and supports so that it is possible for students reading below grade level [to achieve] the required ‘step’ of growth on the ‘staircase’ of complexity.”

As I said last year at NCTE, the problem with this is that some children need so much support in order to read those required complex texts that we can barely see the student beneath all that scaffolding. In fact, when we adopt that “Do whatever it takes” approach to getting kids through those complex texts, we not only risk losing sight of them, but all that scaffolding inevitably limits the amount of thinking we’re letting students do. And in this way, I fear we’ve traded in complex thinking for getting through complex texts—and the ability to think complexly is surely as needed to succeed in college as possessing content knowledge and vocabulary.

And so, in the new book, I propose an alternate route up that staircase of complexity. It’s one that truly takes the reader into account and seeks a different balance between the complexity of a text, as determined by its Lexile level and high scores for its language and knowledge demands, and the complexity of thinking we ask students to do. And I spell out what that could look like in the following chart:

Alternate Complexity Route

Following this alternate route, for example, would mean not choosing a text like Behind Rebel Lines for third grade because, as you can see below, the vocabulary is so daunting, it’s hard to imagine a third grader making much of it without the teacher handing over the meaning (and, as a parent of a third grader writes, its meaning isn’t always age appropriate).

Behind Rebel Lines cover and excerpt

Instead, you could choose something more like Patricia Polacco’s Pink and Say, which is also set during the Civil War and explores similar themes. But because it’s far more accessible at the language-features level, students who were invited to read closely and deeply could actually think about and construct those themes for themselves. They could even figure out what the Civil War was without the teacher explaining it, because the book is full of clues that, if connected, could allow students to actually build that knowledge.

Pink and Say excerpt

Finally, it’s worth noting that I’m not the only one advocating for an alternate route. In a postscript to his book Holding on to Good Ideas in a Time of Bad Ones, Tom Newkirk makes a case for what he calls “a more plausible road map for creating readers who can handle difficulty”: giving students “abundant practice with engaging contemporary writing that does not pose a constant challenge,” which can help them build the “real reading power” needed to tackle challenging texts. And more recently, in the final post from his great series on literacy, Grant Wiggins advocated making what he called “a counter-intuitive choice of texts,” that is, choosing “texts that can be easily read and grasped literally by all students” but which require complex thinking at the level of themes and ideas.

Those seem like incredibly sane ideas to me. And as for what’s insane, I’ll leave that to Einstein:

Einstein Insanity Quote

Looking at Complex Texts More Complexly (or What’s Wrong with this Picture?)

Clifford Loves Me and The Sun Also Rises covers

By now many of us have experienced or heard about the effects of using Lexile levels as the sole arbiter of text complexity. In her wonderful post “Guess My Lexile,” for instance, Donalyn Miller looks at the absurdity of putting books with widely different reader appeal and age appropriateness in the same book bin because they share a Lexile level (as my own favorite Lexile odd couple, Clifford and Hemingway, do, with both clocking in at 610L). And for those of us who strongly believe in the power of choice and interest-based reading, young adult writer Mike Mullin shares a chilling story in a blog post about a mother frantically searching for a book that her dystopian-loving 6th grade daughter, whose Lexile level was 1000, would be allowed to read for school. The Giver—out. Fahrenheit 451—out. Margaret Atwood’s The Handmaid’s Tale—out, all because of Lexile levels, which, in their arbitrariness and control, seem like something out of those dystopian books.

While I can’t vouch for the intentions of the Common Core authors (as I can’t for any writer without direct communication), this is not what’s stated in the Standards themselves. In Appendix A’s “Approach to Text Complexity,” the Common Core authors offer a three-part model for measuring text complexity, which they capture with a now familiar graphic. This model, they clearly state, “consists of three equally important parts”—the qualitative dimensions, the quantitative dimensions, and the reader and the task—all of which must be considered when determining a text’s complexity in order to address “the intertwined issues of what and how students read.” Yet how often does that actually happen?

The sad fact is that too many schools, reading programs and test makers rely on quantitative measures such as Lexiles to make text selections for students because it’s simple and easy. Lexiles can be found with a click of a mouse, while assessing the qualitative measures is harder and much more time consuming, even when we use rubrics. That’s because the rubrics are often filled with abstract words that are open to interpretation, and they use what seems like circular logic—e.g., saying that “a text is complex if its structure is complex”—which doesn’t seem terribly helpful. And how do you deal with a wordless book like Shaun Tan’s The Arrival, which I recently explored with teachers from two schools that were looking at text complexity? Ban it from classrooms because, without words, there’s nothing to quantitatively measure?

Like the other shortcuts and quick fixes I’ve shared, dismissing a book like The Arrival based on a non-existent Lexile level risks short-changing students. The book requires an enormous amount of thinking, as the teachers I worked with discovered. And interestingly enough, their thinking mirrored that of the students of fourth grade teacher Steve Peterson, who wrote about his class’s journey through the book on his blog Inside the Dog. Both the fourth graders and the teachers had to make sense of what the author presented them by attending carefully to what they noticed and what they made of that. And while some of the initial ideas they came up with were different (the teachers thought the portraits on the page below were of immigrants, not terrorists, as some of Steve’s kids first did), the process was the same.


Both students and teachers had to constantly revise their understanding as they encountered new details and images that challenged or extended their thinking. And both debated the meaning of certain details in very similar ways. The teachers, for instance, argued whether the dragon-like shadow that first appeared in the picture below was real or a metaphor for something like oppression, while in a second post, Steve recounts how his kids debated whether the bird-like fish that appear later in the book were real or a metaphor for wishes.


The teachers only read the first part of the book, after which I passed out the rubric below, which many states seem to be using, and asked them how they’d qualitatively assess this text. Being wordless, the text couldn’t be scored for its Language Features, but for every other attribute on the rubric—Meaning, Text Structure and Knowledge Demands—the teachers all decided it was very complex, especially in terms of meaning.

Literary Text Complexity Rubric

If we give equal weight to both the qualitative and quantitative dimensions of this text, we have to say that even with a zero Lexile level, it’s at least moderately complex. And what happens when we add in the Reader and the Task, which sometimes feels like the forgotten step-child in text complexity discussions?

Steve and I used the text for different purposes—Steve to launch a unit on immigration, me for a workshop on text complexity. But we each set up our readers to engage in critical thinking, which the National Council of Teachers of English defines as “a process which stresses an attitude of suspended judgment, incorporates logical inquiry and problem solving, and leads to an evaluative decision or action.” Both the teachers and students engaged in this process not because they’d had a lesson on suspending judgment or logical inquiry, but because they were curious about what the writer might be trying to show them. And to answer that question, both the students and the teachers automatically and authentically engaged in the work of the Common Core’s Reading Standards 1-6.

Unfortunately many of the tasks we set for students aim much lower than that, including some of those found in the Common Core’s Appendix B, such as the following:

Students ask and answer questions regarding the plot of Patricia MacLachlan’s Sarah, Plain and Tall, explicitly referring to the book to form the basis for their answers. (RL.3.1)

Students provide an objective summary of F. Scott Fitzgerald’s The Great Gatsby wherein they analyze how over the course of the text different characters try to escape the worlds they come from, including whose help they get and whether anybody succeeds in escaping. (RL.11-12.2)

Each of these tasks is aimed at a particular standard, and frequently the instruction that supports them (plus the worksheets, graphic organizers and sentence starters) focuses the students’ attention on that single standard, rather than on a more holistic way of reading, which would naturally involve multiple standards. And while the Gatsby task is certainly harder than the third grade one, the prompt takes care of the hardest thinking by handing over a central idea instead of asking students to determine one.

But what if the reading task we set for students in every text they read were to think critically about what the writer is trying to explore or show them through the details, story elements, word choice, structure—all those words that litter the Standards? Wouldn’t that, in addition to a complex qualitative measure, offset a high Lexile level, if all three truly held equal weight?

I’ll share more thoughts on the reader and the task in an upcoming post. But for now I can’t stop thinking that if instead of ramping up the complexity of texts, we ramped up the complexity of thinking we aim for—trading in, say, some of the hardness of texts for deeper and more insightful thinking—we might, in fact, prepare students better for colleges, careers and life.

Preparation of Life Quote

Just the Facts, Ma’am: Setting Students Up to Solve Problems in Nonfiction

As part of the Close Reading Blog-a-Thon that Chris Lehman and Kate Roberts hosted to kick off their new book, Falling in Love with Close Reading, Kate reminded us that not every nonfiction text warrants a close reading. In particular she noted texts whose word choice and details don’t reveal an authorial point of view—or as Kate so wonderfully put it, “aren’t rippling with nuance.” Many of those texts are purely factual—i.e., they don’t use facts to explore a question, issue or event that the writer may have a stance on. And many are content area texts that provide social studies or science information without much of a discernible viewpoint.

I agree completely that not every text deserves close point-of-view scrutiny, but there are other reasons to read those texts closely, as I think they pose many problems for students and offer many problem-solving opportunities. The title of this week’s post, for instance, alludes to something that not every reader might know—in this case, a TV show that was popular before some of you were born. References and allusions like this abound in all sorts of nonfiction, from Nicholas Carr‘s intriguing piece “Is Google Making Us Stupid?“, which begins with a reference to Stanley Kubrick’s movie 2001: A Space Odyssey, to Sy Montgomery‘s grade 4-5 text exemplar Quest for the Tree Kangaroo, which in passing mentions hobbits, trolls, Sponge Bob and Stuart Little. Most of these references are kid-friendly and add to the fun of the book. But like the old TV show Dragnet, I imagine that there are students out there who’ve never heard of Stuart Little. So what’s a fourth or fifth grader to do when reading a section that begins like this:

“Stuart Little, the small mouse with big parents, had nothing on baby marsupials. Marsupials (“mar-SOUP-ee-ulz”) are special kinds of mammals. Even the biggest ones give birth to babies who are incredibly small. A two-hundred-pound, six-foot mother kangaroo, for instance, gives birth to a baby as small as a lima bean. That’s what makes marsupials marsupials.”

The easiest way to solve the problem of what Stuart Little means would be for a teacher to tell the students who Stuart Little is. No doubt that might be entertaining and even lead some students to the book. But given that, just like vocabulary words, it’s simply impossible for a teacher to provide explanations for every allusion or reference students might encounter in a text, we might want to think twice about solving the problems that allusions and references pose and instead let students try to solve them on their own, at least some of the time. Some students, for instance, might solve the problem here by skipping right over Stuart Little and focusing instead on what they can understand: that marsupials are mammals whose babies are super small. Others, instead, might create what I call a “place holder”: they figure out that whoever Stuart Little is, the difference in size between him and his parents isn’t nearly as great as the difference between marsupial babies and their moms.

I believe that providing students with opportunities to wrestle with problems like these helps them become confident and resourceful readers. But for that to happen, we, as teachers, need to be more aware of the problem-solving opportunities that specific texts hold. We can do that by recognizing that many of the items that frequently appear in text complexity rubrics, such as allusions, vocabulary and complicated syntax, can be thought of as problems to solve, as can the kind of “holes in the cheese” I discussed in an earlier post—those places where a nonfiction writer hasn’t explicitly spelled out how the facts are connected. We can also better see the problems a text poses if we ask students what they’re confused about, as I wrote about last year and did as well with two groups of fourth graders that looked at this excerpt from Samuel de Champlain: From New France to Cape Cod by Adrianna Morganelli:

Trade & Exploration

Both groups of students had studied explorers earlier in the year, and so I began by asking each group to think about what they had learned. In both cases, the students shrugged more than spoke, which gave their teachers pause. Interestingly enough, though, as they made their way through the first paragraph, which was filled with things that confused them—”thirst for wealth”, “the spice trade” and “commodities”, which they solved by checking out the glossary—they started to remember more.

I think it’s important to note here that the call to activate schema before reading yielded virtually nothing, but the students automatically started pulling information without prompting from their memory banks in order to resolve their confusion. Problem solving, thus, gave them a purpose for strategically drawing on their background knowledge in a way that years of deliberately practicing the strategy of activating schema hadn’t. And with that paragraph mostly solved they moved on to the next.

The first group I read this passage with helped me better see the problems that the second part posed, as students were once again confused. In particular, they were confused by the references to trade routes, both overland and sea ones, as well as by the glut of place names and the different types of people. In fact, who controlled and discovered what where, along with why and how, were all problems that needed solving. And while I ran out of time with the first group, I came more prepared for the second, offering them this map to look at and use as a problem solving tool:

Age of Exploration Map

Using the map helped them figure out the difference between overland and sea routes as well as who controlled which and why. It also allowed them to understand what the first group hadn’t: that the New World was discovered almost by accident, as explorers sought to find the Moluccas, and that furs, fish, gold and silver were the new commodities mentioned in the first paragraph, which again were discovered through what had originally been a search for spices and silk. And here again, they automatically inferred in order to solve those problems.

Arriving at these understandings definitely took longer than it would have if I’d solved the problems for the students by pre-teaching or explaining what had confused them or modeling a think-aloud. But as I debriefed the lesson with the teachers, we all thought that in addition to helping students become stronger independent readers, they were also more likely to remember the content because they’d figured it out for themselves and it now belonged to them. And as some of the teachers who attended the session I did last month in New Hampshire said, putting students in problem-solving mode helped them “see themselves as ‘figuring-it-out’ kind of kids.” And that, I think, is well worth the time, both for us and for students.

Thinking (Please be Patient)

What’s the Difference Between a Teacher & a Packaged Program?

Now that most of us have settled into the new school year, my corner of the blogosphere is buzzing with the first student responses to curricula designed to meet the Common Core through a steady diet of close reading. Last week, for instance, Chris Lehman shared some of the trove of tweets he discovered, like the one I found below, from students who were flummoxed, frustrated and furious with their close reading assignments. Clare Landrigan and Tammy Mulligan over at Teachers for Teachers shared the notebook entry of a student who confessed, “I find myself so focused on how to annotate that I’m not really thinking about what I’m reading.” And Kim Yaris, of Burkins & Yaris, shared a cautionary tale of her own after her fifth grade son came home from school, brought to the brink of despair and tears by a two-week-long close reading of a document that Kim’s research suggests is actually more college than lower school fare.

Close Reading Tweet2

I can’t verify that all these tweets and confessions are connected with packaged programs, though Kim’s son’s story definitely is. But I seriously suspect that in one way or another they reflect the effects and consequences of a document written by the authors of the Standards known as the “Publishers’ Criteria.” According to the authors, “These criteria [were] designed to guide publishers and curriculum developers” in creating Common Core-aligned instructional material “to ensure that teachers receive effective tools.” And it’s here that some of the ideas and language that have taken over classrooms first appear, such as:

  • Whole class instruction should be focused on short texts on or above a grade’s complexity band throughout the year 
  • Students should be engaged in close reading of those texts, which include multiple readings
  • Those close readings should be guided by a set and sequence of text-dependent questions

What’s important to remember is that these criteria weren’t aimed at teachers, only those in the business of marketing products. Yet many a teacher has been forced, persuaded or enticed to follow them, having been told, perhaps, that they’re the only way to raise test scores or meet the Standards. That’s not to say that teachers shouldn’t expose their students to challenging texts, nor have some text-dependent questions up their sleeves that encourage reading closely for deeper meaning. But providing texts, questions to ask, answers to look for and worksheets to pass out is pretty much all a program can do. And because teachers are living human beings, with active minds and hearts, they can do things programs cannot, beginning with the most obvious: A program cannot know the students; only a teacher can.

Teachers know which students come from families who struggle and which come to school sleepless or hungry. They know which ones are wizards at math but feel defeated by reading and which are precocious but avoid taking risks. They know which don’t talk because they’re shy and which don’t because they’re lost. And knowing all this, they also know that not every student needs every question the program tells them to ask, nor will every student manage to read complex texts by the end of the year (which is what the Standards actually say) by constantly being thrown into the deep end of the pool.

Teachers know all this because they watch and listen to their students, which leads to another critical distinction between a program and a teacher: A program can tell you what to say, but it cannot tell you what you’ll hear if you’re listening for more than what the program deems an acceptable answer. I think this kind of listening is just like the way we want students to read, attending closely to the details of the text to think about what the author might be trying to show them, not just to ‘get’ a particular answer but to understand more deeply. And that close, attentive listening allows teachers to make all sorts of moves that programs simply can’t capture in scripts, let alone actually make. Teachers can, for instance, do all of the following, none of which a program can:

  • Seize a teaching moment when it presents itself
  • Tuck what you’ve heard into your pocket to consider its instructional implications
  • Probe student thinking to better understand what’s behind their responses
  • Respond in a way that helps students build identity and agency as readers
  • Welcome and value out-of-the-box thinking

Several of these moves were visible in a classroom I worked in last week, where teachers were using some of the scaffolds that Dorothy Barnhouse and I share in What Readers Really Do. A seventh grade ICT class, for instance, began reading Shirley Jackson’s story “The Lottery”—in which a community engages in an annual tradition of stoning the winner of a lottery to death—by asking students to fill out their own text-based Know/Wonder charts. As the teachers and I walked around the room, we were thrilled to see how many students had noted the odd details about stones in the first three paragraphs and had wondered why characters were putting them in their pockets and stacking them in piles.

But I also saw this: One of the boys had copied a sentence from the second paragraph in full: “School was recently over for the summer, and the feeling of liberty sat uneasily on most of [the children].” Aware of how many of these students had plucked lines from texts for evidence on the test without seemingly understanding them, I asked him what he thought that meant. “They’re not comfortable with the freedom they have now that school’s out,” he said in a way that allayed my concern. And when I then asked him what he thought about that, he said he thought it was weird. No kids he knew were uneasy with summer. And thinking that, he decided to add a new question to his chart: “Why did most of the kids feel uneasy when school was out?”

Probing this student’s thinking this way not only revealed that he understood more than I first suspected—he was, in fact, the only one who picked up on the current of unease that runs throughout the story. It also allowed me to name for him both the way that texts operate and the work he’d done as a reader, which increased his confidence.

And on the other end of the spectrum, there was this: As the teacher asked the class to share what they’d learned and wondered about, one student said she learned that the children were talking about planting, rain, tractors and taxes, from this line in the text:

“Soon the men began to gather, watching their children, speaking of planting and rain, tractors and taxes.”

Clearly she’d miscomprehended the sentence because of its construction, and initially I saw this as an opportunity to seize a teaching moment by asking the class who they thought was speaking and why. But when I debriefed that moment with the teacher, who’d noticed the mistake as well, she said she didn’t want to call her out in front of the class because it was the very first time that student had shared her thinking. Rather, in what I thought was a wise move, she wanted to think about how to address it in a way that would empower, not deflate, the student, and so she tucked what she’d heard in her pocket to think about her next steps.

And this leads to one final difference between a program and a teacher: Packaged programs teach curricula and texts. Teachers teach real, live students. And I wonder, if we kept that distinction in mind, whether we’d stop feeling as frazzled and frustrated as the students sometimes do when we march them through a series of pre-determined questions in an achingly hard text.

On Shortcuts, Quick Fixes and Why They Often Don’t Work

Short Cut Sign

This spring I found myself in many classrooms—from third grade right up to twelfth—working on content area nonfiction. In each school, teachers were worried that students weren’t comprehending what they were reading, even when the information was stated explicitly. And without understanding the basic facts, it was nearly impossible for them to engage with whatever less explicit ideas the writer might be exploring or with any of the essential questions the teachers had framed their units around.

Initially many teachers saw this as a problem of the students’ background knowledge—i.e., students couldn’t comprehend what the writer was saying because they didn’t have enough prior knowledge for the information to make sense. Or they saw it as a vocabulary issue, especially in those cases where the students were either English Language Learners or were working with texts that matched someone’s insane notion of text complexity (such as the third-grade-is-the-new-seventh-grade example I shared in a recent post).

I don’t want to minimize the need to help students build larger and more sophisticated word banks or to have more background knowledge. But I’m also reminded of what I wrote in a post last summer: that too much emphasis on vocabulary or gaps in background knowledge may actually undermine students’ ability to become stronger, active readers by implying that we can’t make meaning if we don’t know all the words and references. Plus obsessing about what students lack sometimes blinds us to what they can do, and so before I started making suggestions, I asked the teachers I was working with what kind of instruction they’d offered students and how they had done with that—which opened up another can of worms.

In almost every case, the teachers had offered students strategies for summarizing or finding the main idea, which often involved looking for topic sentences or repeated key words, as many a classroom chart advises. Some also taught students how to use text features to predict what information they’d find, which we could also call a strategy. These strategies, however, were in fact shortcuts; they offered students ways of synthesizing a text without actually reading it carefully and thoughtfully. And as the teachers shared anecdotes and student work, what seemed clear was that too often those strategies simply wound up backfiring.

In the case of using text features, for instance, students frequently became wedded to predictions they'd made based on pictures and headings, and with those in mind, they ignored any parts that didn't match their predictions. Main idea and summarizing strategies, on the other hand, often sent students on scavenger hunts—or what SmartBrief blogger Fred Ende calls "Seek & Find" missions in a great post on readers versus scavengers—with students searching for key words or topic sentences without really thinking about how those words or sentences were connected.

Recognizing that the very strategies they'd offered might actually be interfering with real understanding, many of the teachers agreed to change tack and focus on questioning instead—not the kind that would send students back to the text on more scavenging expeditions, but questions that would invite them to wrestle with the concepts and information an author presents. We also wanted them to become more aware of what I started calling 'the holes in the cheese'—that is, the places where a nonfiction author doesn't spell everything out, but rather relies on us, as readers, to connect the dots of facts together to figure something out. And to do this, we needed to study the texts we were giving to students, like this one from a fourth grade science textbook that I looked at with an ESL teacher named Cybi, to better understand how the author presented concepts and where the holes in the cheese were.

Mineral Textbook Page 1

In terms of concepts, we saw that the author explicitly described what a mineral was in the second paragraph. But by focusing on repeated or highlighted words, as Cybi had taught them to do, she wasn’t sure if her students would fully grasp the relationship or connection between minerals and rocks—i.e., that minerals were in rocks—which was exactly what happened when I modeled the shared reading later that day. Using the text features to predict the chapter’s content, the students concluded that minerals must be kinds of rocks. Acknowledging that they didn’t know that for sure, they agreed to let me reframe that as a question, which I asked them to hold in their heads as we read. But even with that, they glossed over the word ‘in’ until the very end when, with the question still unanswered, they went back and reread the beginning. At that point hands shot up around the room, and after they shared what they’d discovered, I noticed and named for them how paying attention to small words like ‘in’ had really helped them understand the connection and relationship between the more prominent words. And understanding how those words and facts were connected was really, really important.

We also wanted them to understand the concept of properties and how they helped scientists classify and differentiate minerals. Drawing on her knowledge of her students once again, Cybi thought they might be able to understand that based on the examples on this page and the next. But we both thought we detected a hole in the cheese in this page’s last two sentences where a reader would need to connect the information about hardness and scratching and apply the concept of properties to infer that calcite is harder than gypsum. And so we decided that this would be a good place to stop and ask a question, which I framed during the shared reading this way:

I want to pause here for a moment because I think there’s something the author’s not telling us that we might need to figure out. We know that hardness is a property and that properties help scientists tell minerals apart. We also know that scratching is a way of testing hardness and that gypsum is easier to scratch than calcite. But the author doesn’t come right out and say which mineral is harder, gypsum or calcite. I think he’s left that for us to figure out. So turn and talk. What do you think? Based on what the author has told us, which mineral do you think is harder and why?

This kind of question asked students to synthesize and apply information, not to simply retrieve it. And it asked them to actually think in a way that allowed them to construct understanding, not just consume and regurgitate information, as scavenger hunts often do. Ultimately, though, we wanted the students to be in charge of the questioning, and to that end we combined teacher-created questions, like the one above, that put students in problem-solving mode, with open invitations for the students to share whatever they found confusing or curious. And after I shared my holes-in-the-cheese metaphor, we began asking students if they thought there were things the writer hadn’t fully explained—i.e., holes in the cheese—then gave them time to figure those things out based on what the writer did say.

And as for those shortcuts: In the end, they weren’t so short after all, as they often took students away from real reading and real understanding, helping them, perhaps, to practice a skill but not really engage in deep thinking.

No Shortcuts

A Close Look at Close Reading

As teachers and schools continue to wrestle with implementing the Common Core Standards, I hear more and more talk—and more and more questions—about the term 'close reading'. Interestingly enough, the term doesn't appear in the actual Standards, though it crops up repeatedly in much Standards-related material, including the now famous—or infamous—videos of Standards author David Coleman dissecting Martin Luther King's "Letter from Birmingham Jail." And Text Complexity co-author Douglas Fisher has said that close reading is "the only way we know how students can . . . really learn to provide evidence and justification," as the Common Core requires.

So what exactly do we mean by 'close reading'? According to Timothy Shanahan, who's become something of a spokesman for the Standards, close reading is "an intensive analysis of a text in order to come to terms with what it says, how it says it and what it means." I agree completely that close reading allows a reader to understand what a text says and what it means, with what it means directly related to the author's decisions about detail and language and structure—i.e., how it says what it says. But for me, analysis is an offshoot of close reading, something I can produce, if I'm asked to do so, after I've read closely.

I think this because, by definition, analysis involves thinking about how the parts contribute to the whole, which presupposes an understanding or vision of the whole. Putting analysis in front of understanding seems a bit like putting the cart before the horse. And asking students through a text-dependent question to analyze a part before they’ve had a chance to consider the whole risks putting them in the position of the blind men in the old Indian tale who sought to understand what an elephant was by attending to its parts. One man touched the trunk and thought an elephant was like a snake; another felt the tail and concluded it was like a rope; while a third stroked the ear and thought it was a fan. None was able to make sense of the whole when asked only to consider a part.

My own vision of close reading is better captured in some of the guidelines colleges provide students. The Purdue Online Writing Lab, for instance, advises 'tracking' your understanding of a text through margin notes that often consist of questions, with an example that bears more than a passing resemblance to the kind of questions that come up when students are using a Know/Wonder chart, noticing patterns across a text, and wondering what the writer might be trying to tell them through the details he's chosen.

Example of close reading annotation using Doris Lessing’s short story “A Woman on a Roof,” from the Purdue Online Writing Lab

Harvard also provides a "How to Do a Close Reading" guide to students, which breaks close reading down into a two-part process: First the reader observes facts and details in the text, then he interprets what he's observed through inductive reasoning—that is, he builds an interpretation bottom-up from the details, rather than by deductively starting with a claim and then finding evidence to support it. And they offer the following tips, which sound similar to the kind of thinking the fifth graders I described in a recent post engaged in (with the teacher transcribing their thoughts in lieu of annotating the text):

1. Read with a pencil in hand, and annotate the text, noting anything that strikes you as surprising or significant, or that raises questions.

2. Look for patterns in the things you’ve noticed about the text—repetitions, contradictions, similarities.

3. Ask questions about the patterns you’ve noticed—especially the how and why.

This two-pronged process has always seemed to me a lot like the scientific method. The reader attends to the details an author gives just as a scientist attends to the details of whatever phenomena he's studying. And from those observations, each develops a hunch that attempts to explain what they've noticed, which in science we call a hypothesis. Then just like the scientist, the reader continues to probe and observe, testing her hunch out as she encounters new details and looks back on ones she's read, revising, refining and developing her ideas until all the pieces fit—at which point she comes to a final understanding, which is like a scientist's theory. Only then, I would argue, can the reader's thinking be turned into a claim whose validity can be proved in a deductive fashion using many of the same details that helped her understand as evidence.

Unfortunately, however, some of the approaches that aim to support close reading rob students of the opportunity to notice and to develop ideas of their own—which, as Harvard says, "is central to the whole academic enterprise." Take Achieve the Core's 8th grade Close Reading Exemplar for "Long Night of the Little Boats" by Basil Heatter, which recounts an incident from the Battle of Dunkirk when a ragtag flotilla crossed the English Channel to rescue soldiers who were stranded on a beach during World War II.

My hunch is that the exemplar writers followed a process similar to Harvard’s to arrive at their own understanding of the piece (noticing, questioning, and interpreting, perhaps, automatically in their heads). They then rephrased their understanding as a question for the final writing task: “How did shared human values, both on the part of the little boat rescuers and the soldiers, play a part in the outcome of Dunkirk?” With that in place they then designed a series of questions and steps that would focus the students’ attention on details that were key to their own understanding’s development, such as:

The students own neither the noticings here nor the development of the ideas. And the 'help' that teachers are asked to provide in order that students 'see' what they're supposed to runs the risk of being as much an act of spoon-feeding as some of the pre-teaching practices that have come under fire are. Of course, it does increase the likelihood that students will meet the Standards. But they'll do so by plugging in someone else's language about details someone else has noticed to support an idea someone else has formulated. And that's a far cry from the independent thinking that colleges want students to have.

To support that kind of independence, we have to design instruction that engages students in both components of the close reading process: to first be observers and questioners and then to use their observations and questions to, as Harvard puts it, "reason toward our own ideas." That may, indeed, involve asking students questions, but those questions need to be open enough for students to engage in real close reading, not an overly-prompted knockoff.

And so to ensure that we don’t put the cart before the horse, let’s remember this when it comes to close reading:

Questions before Answers

Hunch before Claim

Understanding before Analysis

More Thoughts on the Journey: Helping Students—and Ourselves—Understand Nonfiction

Recently I looked at how inviting students to notice patterns across a nonfiction text can help them consider the large and often invisible—i.e., not explicitly stated—ideas a writer is exploring. Raising students’ awareness of patterns and how writers use them to explore and develop ideas can ultimately help students meet many of the Reading Informational Text Standards of the Common Core, especially RI2 and RI5. It also helps students reap the full benefits of reading nonfiction, which is not always just about learning new facts but considering a writer’s unique take or perspective on those facts in a way that can deepen a reader’s understanding of the world and the people in it.

To introduce your students to how writers use patterns to develop their ideas—and how readers, in turn, build their ideas about a text by noticing the patterns the writer’s laid down and considering what they might mean—you’d follow the same process that I engaged in to plan the blog post on patterns: I pulled out a handful of books from my shelves and asked myself the following questions as I looked through and read each book:

  • Does this seem like a text in which the writer is using facts to explore one or more ideas—or put another way, is it a text that I’d want students not just to comprehend but also understand?
  • Do I notice patterns in the book—words, images, events, even structural devices that somehow keep repeating?
  • Does asking myself what the writer might be trying to show me through those patterns help me dig deeper into the text?

My hunch is that we don’t always ask ourselves the last two questions—and we might not ask the first one either because of the way we’ve traditionally used nonfiction in the classrooms. We have students read nonfiction, for instance, to learn facts about specific content, whether it’s to know the names of the great explorers or the process of photosynthesis. We have them read nonfiction to learn about text features or different text structures, or to find facts for research projects. But unless we’re looking squarely at bias, we may not think about the writer at all when we’re reading nonfiction, at least not to think about why she’s chosen and arranged whatever facts she’s sharing in a certain way.

I think that all this has to change in light of the Common Core Standards, which, in standard after standard, ask students to think about how the parts of a text are related to the whole. Noticing patterns and thinking about what the writer might be trying to show us through them automatically helps students do that—without the kind of teacher-directed prompting that comes with the text-dependent questions approach. And again and again I've discovered that, just as with students, we, too, as teachers start noticing patterns when we look for them, as an instructional coach and teacher in Georgia attests in her blog post "Confessions of a Plot Junkie."

But what happens when you don't notice patterns, which certainly happens to me sometimes when I read nonfiction? As experienced readers, we know that nonfiction writers often use facts to explore ideas they sometimes have an opinion about, and they unfold those ideas in more complicated, subtle and indirect ways than thesis-driven five-paragraph essays do. Because of this we enter a text on the look-out for glimmers of ideas and opinions, asking ourselves, consciously or not, what the writer might want us to understand, as we both read forward and think backwards to draft and revise our ideas.

Unless we’re in a text outside our comfort zone, we tend to do this work automatically, barely aware of how we process and arrive at our sense of what the writers is up to. But to make this more visible for students, we can ask them to be trackers, reading the text paragraph by paragraph to sniff out possible ideas, and then reading forward and thinking backwards to consider how those might—or might not—be connected to what came before and comes after.

To show a group of K-12 educators what this approach could look like during a workshop on reading nonfiction, I searched for another text that, like many of the Common Core exemplars, took readers on a journey of thought that couldn't be fully anticipated or deeply understood by most of the strategies we currently give students. To provide some common ground across grades, we decided to focus on a single topic, food, and for this activity I chose a short piece from The New Yorker called "The Big Heat," by Elizabeth Kolbert, which begins with the attention-grabbing, provocative lead, "Corn sex is complicated," before taking all sorts of twists and turns whose purpose and logic aren't immediately apparent.

I asked the participants to read it with a partner (as I invite you to do, too, on your own or with a colleague), stopping at every paragraph to share both what they thought the writer might want them to understand and how that might or might not be connected to whatever had come before. Interestingly enough, in the beginning several found the piece so disjointed they were tempted to deem it 'bad' writing. But by the middle of the second page, everyone began to see that there was a method to Kolbert's seeming madness. And at that point they had to revise their understanding of what the piece was 'about,' which they had to do yet again as they reached the final two paragraphs.

The participants left the workshop that day with a deeper understanding of both what, beyond obvious measures like Lexile levels, makes a text complex and what readers need to do to navigate that complexity in a way that allows them to understand the ideas and the train of thought that holds the facts together. They also came away understanding that food has many implications beyond health and nutrition. And when, after reading this text together, they explored texts at their students' grade levels (some of which are linked below), they were far more aware that there were ideas and opinions lurking in them. They saw more because they were looking for more—and they were eager to invite their students to look for more than text features and facts when they got back to their classrooms, as well.