Archives for category: Direct instruction

Today, IUFSD teachers are encouraged to “take risks.” 

Neither students nor parents need be told that a risk is being taken; no plan to evaluate the results of teacher risk-taking is required; nor is the board informed. And, of course, it’s not the teacher taking the risk. It’s the student. Teachers have tenure and a union to protect them from risk.

There is no realm apart from public schools in which taking risks with other people’s children would be acceptable. At the university level, all research involving human subjects, including projects as benign as interviewing people about their experiences, must be vetted and approved by an Institutional Review Board. Even teaching autistic children grammar using a software program must undergo thorough review prior to implementation.

Yet here in IUFSD, teachers are expected to “take risks.” 

Below are Siegfried Engelmann’s principles for making changes to curriculum and teaching:

Principles for school boards to follow when authorizing changes to curriculum and teaching practices

1. Don’t adopt any teaching method or curriculum unless you have substantial reason to believe that it will result in improvement of student performance;
2. Don’t adopt any approach without making projections about student learning;
3. Don’t adopt any practice without monitoring it and comparing performance in the classroom with projections;
4. Don’t adopt an approach without having a back-up plan;
5. Don’t maintain practices that are obviously not working as planned;
6. Don’t blame parents, students, or other extraneous factors if the plan fails.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Don’t adopt any teaching method or curriculum unless you have substantial reason to believe that it will result in improvement of student performance.

A good plan is to require the administration to show that an approach works on a small scale before using it across the board.

Even though failure in a small-scale tryout is more humane than failure in an entire school district, children should not be guinea pigs for mindless experiments that have little hope of working. The small-scale tryout is not to be a learning experience for the administration as it discovers facts that it should already know. Therefore, the board should limit the number of tryout programs that are permitted, and should establish contingencies for failure.

The board, however, should require the administration to contact successful teachers within the district and solicit their advice and guidance before installing any approach. (These are teachers who consistently produce results that are above the demographically predicted level.)

Don’t adopt any approach without making projections about student learning.

Unless the benefits of the approach can be readily measured in terms of student outcomes, and unless they are outcomes we are concerned with, the administration should not be permitted to adopt the approach.

Don’t adopt any practice without monitoring it and comparing performance in the classroom with projections.

Monitoring is necessary for the administration that wants the program to succeed. . . Weekly evaluations indicate whether the projected material is presented on schedule, whether the teachers need significant help, and whether they are faithfully following the program.

Don’t maintain practices that are obviously not working as planned, and don’t stick with a failed plan.

The initial plan should include a “pull-the-plug” criterion and a back-up plan. The criterion should be expressed in a way that permits some flexibility, but that requires an empathic response to kid problems. . . . What we don’t want the administrators to do is to leave students in the approach all year long and then at the end of the year conclude that it was a bomb.

Don’t blame parents, kids, or other extraneous factors if the plan fails.

The only factor that affects the plan is whether the kids and the teacher are in attendance on a regular basis. Aside from unusual situations, this is the only consideration that should be used to qualify the results of the implementation. If the teaching failed, it was because the teaching failed, not because the parents didn’t get involved.

Adapted from: War Against the Schools’ Academic Child Abuse by Siegfried Engelmann | Halcyon House | Portland, Oregon 1992

AND SEE:
Super’s plan: replace college prep with “workplace” prep
Teacher-centered v. learner-centered classrooms 
“Fast trends”
Teachers “taking risks”
Project Follow-Through

Our Failure To Follow Through by Billy Tashman

Reprinted from New York Newsday, November 15, 1994, with permission

Project Follow Through, America’s longest, costliest, and perhaps most significant study of public school teaching methods, quietly concluded this year. The good news is that after 26 years, nearly a billion dollars, and mountains of data, we now know which are the most effective instructional tools. The bad news is that the education world couldn’t care less.

Started in 1968, Follow Through was intended to help kids, from kindergarten through the third grade, continue the progress they had made in Head Start. But the Feds also wanted to find out which instructional methods delivered the most bang for the bucks. So they funded 22 vastly different educational programs in 51 school districts with a disproportionate number of poor children. Standardized test results were collected from almost 10,000 Follow Through children, as well as from kids not in the Follow Through program.

Abt Associates in Cambridge, Mass., analyzed the numbers, then issued the verdict. When it came to academic performance, children who participated in the Direct Instruction method blew their peers out of the classroom. More important, later evaluations of 1,000 Direct Instruction graduates showed that they were still ahead of their cohorts in their senior year of high school.

If something works this well, why aren’t public schools using it? One reason is that Direct Instruction, at first glance, looks dated. Indeed, teachers who treat their jobs as a cross between stand-up comedy and the Super Bowl halftime show might, after peeking into a Direct Instruction classroom, disappear faster than a spare textbook at the Board of Ed.

To make matters worse, these methods owe a lot to the late B. F. Skinner, the Harvard behaviorist some recklessly called a fascist. That’s unfortunate and unfair, because Skinner demanded a scientific approach to classroom instruction, which is lacking from almost every hot reform idea du jour.

Direct Instruction stresses basic skills, breaking them down into mini-components. Children learn to read, for example, by learning the sounds of the letters before the letter names. They master each skill before moving on to the next one. Teachers track each student’s progress on daily charts. They also track behavior, encouraging good conduct with praise, while ignoring bad behavior for the most part. In short, if you can’t measure it, you probably shouldn’t teach it. This kind of micro-management is almost unheard of in most classrooms.

But Direct Instruction’s most controversial feature is a script from which teachers conduct lessons. Picture this: A first-grade teacher, reading from her script, makes the “m” sound. The pupils respond in unison. After a word of praise, the teacher, prompted by her script, tells them to repeat the sound.

This may sound a bit like a “Road to Wellville” approach to education, but Direct Instruction has had stunning success at scores of schools. One of the original sites in the early ’70s was P.S. 77 in the South Bronx. After five years, DI “significantly raised the reading, writing and arithmetic performance and scores of the participating children,” said one report. Federal budget cuts eventually gutted the program but, interestingly, P.S. 77 old-timers still cling lovingly to the teaching methods.

It may come as a shock to the layperson, but school policymakers haven’t adopted Direct Instruction because they have an aversion to scientific research. Educators throw their weight behind the latest fad, then refuse to abandon it when it doesn’t work. In fact, the federal oversight panel for Follow Through cut the Direct Instruction program even as it continued other models that were spectacular flops. Eschewing basic skills, the failed programs tried to teach kids how to learn on their own, or tried to raise students’ self-esteem (both categories, by the way, in which Direct Instruction students excelled). In these failed programs, students had even lower reading and math scores than the control groups that had no Follow Through program. Yet these failed programs have spread through America like fire through dry corn.

Follow Through demonstrated that scientific research and the classroom are still strangers to one another. Until they join forces, American schoolchildren will continue to receive a second-class education.

AND SEE:
Super’s plan: replace college prep with “workplace” prep
Teacher-centered v. learner-centered classrooms 
“Fast trends”
Teachers “taking risks”
Project Follow-Through