DIBELS Draws Doers & Doubters

Dynamic Indicators of Basic Early Literacy Skills monitor reading progress but raise questions.

When kindergarten students in Cincinnati-area schools were tested on their basic reading skills last fall, examiners found that 13.5 percent of the children needed intensive instructional support in phoneme segmentation fluency, the ability to divide a word into its individual sounds, or phonemes. When tested at the end of kindergarten, only 2.7 percent of the children still needed that help.

Administrators credit the improvement to the tool they used to assess the children: Dynamic Indicators of Basic Early Literacy Skills (DIBELS), a set of tests that focus on the various skills necessary for learning to read. Ohio authorities say it helped them identify students who needed specialized instruction, and that after receiving that instruction, more students scored better on the later DIBELS test.

"DIBELS has been fantastic. It has been the most powerful tool for changing student outcomes that I have ever encountered," says Stephanie Stollar, an educational consultant with the Southwest Ohio Special Education Regional Resource Center. One of 16 regional centers in the state, it serves school districts in Cincinnati and four surrounding counties.

But not all educators share Stollar's enthusiasm. DIBELS is "the worst thing to happen to the teaching of reading since the development of flash cards," asserts P. David Pearson, dean of the Graduate School of Education at the University of California at Berkeley. Pearson claims that it shapes instruction in bad ways for students and teachers alike. It's bad for students because they are "held accountable to the indicators" rather than to whether or not they really learn to read, he says.

It's bad for teachers, he continues, because it requires them to teach and judge students based on criteria that are "not consistent with our best knowledge about the nature of reading development," including whether students understand what they read.

Some teachers and parents agree. "I get nothing out of it. It tells me how fast they can read and nothing else. It is a waste of my time," declares Melissa Pares, who teaches fifth grade at Bullard Talent School in the Fresno (Calif.) Unified School District.

"I question how reliable and valid the test scores are," adds Sandra Blackburn, a veteran second grade teacher at Longstreet Elementary School in the Volusia County Schools in DeLand, Fla.

Lisa Laser is home schooling her second grade son, Ellis, after she and her husband removed him from Joseph Elementary School in Oregon because of "the horror of DIBELS" he experienced in first grade. His teacher suggested he be held back after first grade. "The test itself is harmful, and the curriculum essentially was to teach to the test," Laser says.

Controversy and Scrutiny

These contrasting perspectives define the parameters of a debate in the education community about the effectiveness of DIBELS, which 45 states have approved for use, according to the U.S. Government Accountability Office.

They also enter into an ongoing congressional investigation of how DIBELS became so widely used in the country's public schools following establishment of the $1 billion Reading First program, a key provision of the federal No Child Left Behind law. Reading First, which funds kindergarten through third grade reading programs, requires the use of reading assessments. Investigators are looking at ties between DIBELS and three former members of a committee established by the U.S. Department of Education to review assessment products.

"It has been the most powerful tool for changing student outcomes that I have ever encountered." -Stephanie Stollar, educational consultant, Southwest Ohio Special Education Regional Resource Center

One of them, Edward J. Kame'enui, a co-creator of DIBELS and a faculty member at the University of Oregon, was named the first commissioner for special education research in the Department of Education in 2005 but resigned that job in June. According to the U.S. House of Representatives' Committee on Education and Labor, Kame'enui and the other former committee members, Deborah Simmons and Roland H. Good III, also a co-author of DIBELS, "benefited financially" from the sale of DIBELS.

In written testimony last April to the House committee, John P. Higgins Jr., inspector general of the Department of Education, said his office found that some department activities had led to "a perception that there was an approved list" of assessments. He cited a handbook given to participants at three Reading Leadership Academies that the Education Department and the National Institute for Literacy sponsored to assist states in preparing Reading First grant applications.

Higgins said that while other assessments were listed, only DIBELS was featured in an article in the handbook and in a guidebook that the Education Department published later. "Not surprisingly, 43 states indicated that they would use DIBELS as one of their assessments," Higgins testified.

He added that his office found instances where Education Department officials "intervened" and "worked to influence states" to select DIBELS.

Higgins called the department's actions "inappropriate" and said they "created an appearance" that the agency may have violated laws that prohibit it from influencing schools' curricula.

Some states mandate use of DIBELS. In inviting funding applications for full day kindergarten literacy readiness programs for the 2007-2008 school year, for example, the New Mexico Public Education Department states that all such programs "are required to administer" DIBELS, although schools that apply can list additional assessments that they intend to use.

Assessment Tool

DIBELS grew out of research in education testing that a group of school psychologists and other education specialists began at the University of Oregon in 1988, says one of the specialists, Ruth A. Kaminski. With a background in speech-language pathology and early childhood special education, she now runs Dynamic Measurement Group (DMG), a for-profit company in Eugene, Ore., that she founded with Good. DMG sells DIBELS to state education agencies and school districts and uses the proceeds to fund ongoing research and development of the reading assessment tools, now in their sixth edition, Kaminski says.

She emphasizes that DIBELS is not designed to improve students' reading abilities by itself. "It is an assessment tool, designed to monitor children's progress in acquiring important early literacy skills and informing intervention or instruction," she explains.

"Reading readiness is a multifaceted issue, and teachers have found a relationship between students who start at very low levels and by using DIBELS are able to get a higher level of reading readiness," says Douglas B. Reeves, founder of the Leadership and Learning Center, formerly the Center for Performance Assessment, an Englewood, Colo.-based organization dedicated to improving student achievement and educational equity.

DIBELS tests are basically one-minute benchmark assessments that teachers, reading specialists and other examiners in schools give one-on-one to students at the beginning, middle and end of the school year. When a test indicates that a child needs additional instruction, examiners might test the child as frequently as weekly to monitor progress, Kaminski says.

Like using a thermometer to take a child's temperature as an indicator of overall health, each DIBELS test is an indicator of how well a student is learning a particular basic reading skill. Test scores indicate whether a student is likely to be on track for learning to read or is likely to need help in developing particular skills, such as learning how to sound out unknown words.

Teachers use stopwatches, which DIBELS sells for roughly $7 to $8 each. Tests are timed, Kaminski says, because one indicator of fluency is "not just how accurately children can do things but how effortlessly they can do them without having to think really hard about what they are doing."

At the end of a test, the examiner totals the score and records it on paper or a Palm Pilot, then enters the results into a computerized database at school, which generates reports for later analysis.

Although teachers are usually the DIBELS testers, others can evaluate results and determine how to adjust instruction to serve students who need help. Kaminski says some schools have grade-level teams that meet monthly to review how children are doing. Teams might include a speech-language pathologist, school psychologist, special educator and Title I teacher.

And sometimes principals join the teams. "Principals are the administrative and instructional leaders of a school, and their understanding of and support for DIBELS can make the difference between effective and ineffective implementation," Kaminski says.

Nonsense Words

One issue critics have with DIBELS is the use of made-up "nonsense" words in testing a student's ability to blend letters and sounds. In the test, kindergarten and first grade students are given a sheet of paper with randomly ordered nonsense words written on it, such as "sig," "rav" and "ov." Students are asked to say the individual sound of each letter or the whole word. A student's score is the number of letter sounds produced correctly in one minute.

Blackburn, who plans to retire next year after 44 years of teaching, says she does not use nonsense words in her testing because they are not part of the curriculum. "Using nonsense words when children know real words and are already reading at grade level or better doesn't make a lot of sense to me," she says.

Even with nonsense word tests, however, school psychologists rate DIBELS as "very useful," says Laurice M. Joseph, associate professor in the College of Education and Human Ecology at the Ohio State University and a member of the National Association of School Psychologists. With DIBELS, "you can readily screen an entire classroom of students, then use the data to target students who are not performing comparably to their peers or to benchmark standards," Joseph says.

"It's the worst thing to happen to the teaching of reading since the development of flash cards." -P. David Pearson, dean of the Graduate School of Education, University of California at Berkeley

That's how it worked at Rosemont Elementary School, a Reading First school in the Orange County Public Schools in Florida, where teachers identified 60 struggling third-, fourth- and fifth-grade readers through a DIBELS assessment at the beginning of the school year last fall. Teachers began using an interactive, speech-enabled software program to help the students improve their reading fluency. The students, who initially tested at the lowest levels of literacy, had advanced to the next level on the next DIBELS test during the winter, and teachers looked for even more improvement to show on the final DIBELS test late in the spring.

Jacqueline Oester and seven other reading support teachers at Rosemont give the DIBELS test to all 900 students at the school in one week three times a year. "We take them into separate areas outside the classrooms so they can concentrate on reading the passages," says Oester. The children "remember their last score and are intent on beating it," she says.

"We are not encouraging them to think of it as a contest or race, because we don't want them to just speed through it. We want them to read fluently, and reading fast and fluently are two different things. But I think every child has a competitiveness inside," says Oester, who believes DIBELS is useful in indicating a child's progress in learning how to read.

Speed and Comprehension

But some teachers think timing DIBELS tests leads students to read rapidly. "I don't encourage that. I want them to read at their own speed; whatever they are comfortable with," says Blackburn. "I don't think speed-reading should be taught until middle or high school."

Some critics of DIBELS also contend that even if students read fast and fluently, tests do not measure whether they understand what they are reading. "I count how fast they read and how many mistakes they make and that tells me their fluency. It doesn't tell me whether they comprehend what they read, and we're not encouraged to ask them," says Pares.

"The actual DIBELS comprehension measure is a retelling, which is OK but probably not the best in terms of what teachers usually have students do in the classroom, which is to answer questions after they read a text," says Joseph.

She cites research findings that "kids who score high on the fluency measures tend to be better at comprehending texts than kids who don't score high on fluency." So DIBELS can be a good predictor of what "we might expect a student to do on comprehension, although you probably would want to supplement it with another test that has the student read a passage and then answer literal and inferential questions," Joseph says.

"Whatever reading instruction you are using, DIBELS indicates whether it works or not." -Ruth A. Kaminski, co-founder, DIBELS

Reeves says administrators shouldn't get "seduced" by a high DIBELS score. "You also have to teach kids how to summarize and get the main idea-all the elements of reading comprehension," he adds. "DIBELS does not do that, but that doesn't mean that DIBELS is worthless. It's just part of the reading process."

While Reading First extends only through third grade, DIBELS is designed to be used through sixth grade, according to Kaminski. The Metropolitan School District of Wayne Township in Indiana experimented with DIBELS from kindergarten through sixth grade but will use it only in kindergarten and first grade in 2007-2008, because that's where it works best, says Lisa Lantrip, assistant superintendent for curriculum and instruction and a former school principal. DIBELS doesn't work as well in higher grades, she says, because "you can't really get into whether or not the children understand the story. Their retell is very low, meaning they have no clue what the story is about."

Laser says her family's main issue with DIBELS was her son's school's "overreliance" on the test and teaching solely designed to support test success, with the child's "well-being be damned."

Because of her son's results on two DIBELS measures, his teacher suggested he either repeat kindergarten or be held back after first grade, Laser says. Laser thought Ellis would be ahead of his class in math and science because his Portland school had focused on them, but those subjects were not considered. "It was clear to us that DIBELS is double punishment: DIBELS in and of itself, and the expectation that kindergartners must enter first grade with a specific and narrow range of reading skills in spite of their other skills and knowledge," Laser declares.

Phonics vs. Whole Language

Another issue DIBELS has raised is whether phonetic reading instruction is better than teaching "whole language" reading. Kaminski says DIBELS is neutral on the issue. "Whatever reading instruction you are using, DIBELS indicates whether it works or not," she says.

The International Reading Association supports phonics as part of a whole language program. "The teaching of phonics is an important aspect of beginning reading instruction," IRA declares in a position statement. But Reeves claims reading is multifaceted. "You have to do both, and anybody who has spent any time with kids should not take an either/or position," he says. "You need letter recognition and sounds skills, but nobody has ever said that prevents you from reading rich, wonderful literature." Reeves compares it to music. "The fact that you get wonderful vibes from playing a Chopin nocturne doesn't mean you are prohibited from playing scales and doing hand exercises," he says.

Meanwhile, debate over DIBELS itself continues. "It digs too deeply into the infrastructure of reading skill and process and comes up with a lot of bits and pieces but not the orchestrated whole of reading as a skilled human process," Pearson says.

But Kaminski and Stollar defend it. "It was never intended to be the be-all and end-all of reading assessment but to work pretty darned well as an indicator of a core skill," Kaminski says. Stollar adds that criticism comes from a lack of knowledge. "There is no other reading assessment that can do what DIBELS does," she says.

Alan Dessoff is a freelance writer based in Maryland.

One Minute of Nonsense

The truth behind DIBELS, according to Ken Goodman, author of The Truth About DIBELS: What It Is and What It Does.

One anonymous teacher quoted in my book says her district is not using DIBELS because administrators and teachers want to use it, or because it gives helpful information; it doesn't, she claims. "We're using it because Reading First requires it," she says. "Some schools are posting fluency scores of children ... and then the students have race cars, in the form of bulletin boards, where they are trying to race to the speed goal. On the phoneme segmentation part, some kindergarten classrooms have been known to drill and practice the segmentation while kids are in line waiting for the restroom."

DIBELS is not just an early literacy test. Teachers are required to group learners and build instruction around the scores. They're evaluated on the DIBELS scores their pupils achieve. Publishers are tailoring programs to DIBELS. And academic and life decisions for children, starting in kindergarten, are being made according to DIBELS scores.

I believe this period in American education will be characterized as the pedagogy of the absurd.

Roland Good, a DIBELS developer, told the U.S. House of Representatives' Education Committee during a hearing last April that three million children are tested with DIBELS at least three times a year from kindergarten through third grade. New Mexico provides every teacher with a DIBELS Palm Pilot so the pupils' scores can be sent directly to Oregon for processing.

Kentucky's associate education commissioner testified at the hearing that the state's Reading First proposal was rejected repeatedly until the state agreed to use DIBELS. The DOE inspector general cited conflicts of interest by Good and his Oregon colleagues in promoting DIBELS.

Another teacher, quoted in my book, claims that while the DIBELS test is used throughout the school year, any child who receives the label "Needs Extensive Intervention" as a result of the first testing must be monitored with a "fluency passage" every other week.

No test of any kind for any purpose has ever had this kind of status. In my book, I analyzed each of the subtests in depth. Here are my conclusions:

- DIBELS reduces reading to a few components that can be tested in one minute. Tests of naming letters or sounding out nonsense syllables are not tests of reading. Only the misnamed Reading Fluency test involves reading a meaningful text, and that is scored by the number of words read correctly in one minute.

- DIBELS does not test what it says it tests. Each test reduces what it claims to test to an aspect tested in one minute.

-What DIBELS does, it does poorly, even viewed from its own criteria. Items are poorly constructed and inaccuracies are common.

- DIBELS cannot be scored consistently. The tester must time responses (three seconds on a stopwatch), mark a score sheet, and listen to the student, whose dialect may differ from the tester's, all at the same time.

- DIBELS does not test the quality of reading. No test evaluates what the reader comprehends. Even the "retelling fluency test" is scored by counting the number of words used in a retelling.

- The focus on improving performance on DIBELS is likely to contribute little or nothing to reading development and could actually interfere. It just has children do everything fast.

- DIBELS misrepresents pupil abilities. Children who already comprehend print are undervalued, and those who race through each test with no comprehension are overrated.

- DIBELS demeans teachers. It must be used invariantly. It leaves no place for teacher judgment or experience.

- DIBELS is a set of silly little tests. It is so bad in so many ways that it could not pass review for adoption in any state or district without political coercion. Little can be learned about something as complicated as reading development in one-minute tests.

Pedagogy of the Absurd

I believe this period in American education will be characterized as the pedagogy of the absurd. Nothing better illustrates this than DIBELS. It never gets close to measuring what reading is really about-making sense of print. It is absurd that self-serving bureaucrats in Washington have forced it on millions of children. It is absurd that scores on these silly little tests are used to judge schools, teachers and children. It is absurd that use of DIBELS can label a child a failure the first week of kindergarten. And it is a tragedy that life decisions are being made for 5- and 6-year-olds on the basis of such absurd criteria.

Ken Goodman is professor emeritus in the Language, Reading and Culture Department at the University of Arizona in Tucson. He is former president of the International Reading Association, a bestselling author and a contributing editor to The Pulse: Education's Place for Debate.

