Over on the Huffington Post blog, in a fine remembrance of the founder of the American Federation of Teachers, Bob Slavin reminded readers that Al Shanker championed both professionalism and evidence.
Back in the day, I knew Al Shanker, the founder of the American Federation of Teachers. No one has ever been more of an advocate for teachers’ rights – or for their professionalism. At the same time, no one was more of an advocate for evidence as a basis for teaching. He saw no conflict between evidence-based teaching and professionalism. In fact, he saw them as complementary.
Professor Slavin continues his remembrance with a sage argument about the importance of teachers embracing evidence-based practices. Ultimately, he cites the value of the Every Student Succeeds Act (ESSA) evidence standards.
I applaud Professor Slavin’s argument and amplify it with an additional call for educators not only to embrace evidence-based practices, but to do even more. As professionals, we educators have a duty to
- Identify evidence-based practices that have the strongest effects. Some evidence-based practices are more effective than others (in fact, Professor Slavin has been among the leaders in helping refine research methods to compare instructional practices). Professionals need to know how to find those practices.
- Implement evidence-based practices with fidelity (see Cook & Odom, 2013). Having a great recipe for a soufflé is no guarantee that one will serve a gourmet dish; one has to use the right ingredients and execute the steps in preparing it, as well. That’s a big part of professionalism.
- Monitor the effects of programs on individual students and groups of students so that, when needed, adjustments can be made. It is inevitable that evidence-based practices will move too slowly or too rapidly for some learners. Professionals need to adjust instruction for them. These adjustments should, of course, employ evidence-based practices. See this post for an illustration.
There may be other important relationships between professionalism in education and evidence-based practices. Readers are welcome to suggest them in comments.
Cook, B. G., & Odom, S. L. (2013). Evidence-based practices and implementation science in special education. Exceptional Children, 79, 135-144. doi: 10.1177/001440291307900201
In “How to Improve Student Learning in Every Classroom Now,” Janet Twyman and Bill Heward (in press) provide excellent accounts of effective practices, complete with descriptions, examples, links, and more. If you follow Teach Effectively, you'll be familiar with the practices they describe, but their article brings them together in one clear summary. Here's their abstract:
This paper is our attempt to help any of the world’s 60 million teachers who ask, “What can I do right now to improve learning in my classroom?” We describe three easy-to-use teaching tactics derived from applied behavior analysis that consistently yield measurably superior learning outcomes. Each tactic is applicable across curriculum content and students’ age and skill levels. Considerations for using digital tools to support and extend these “low-tech” tactics are also discussed.
Oh, about that evidence stuff: They provide plenty of references to solid research.
Twyman, J. S., & Heward, W. L. (in press). How to improve student learning in every classroom now. International Journal of Educational Research. doi:10.1016/j.ijer.2016.05.007
Professor Kerry Hempenstall wrote a literature review on teaching reading for Australia’s Centre for Independent Studies. It is an excellent resource, because it is true to the scientific evidence, but it is written in a way that is accessible to lay readers.
Parents, teachers, administrators, and interested others: You don’t have to put up with the statistics-ese and mumbo-jargon that we researchers often use when discussing scientific evidence. In Read About It: Scientific Evidence for Effective Teaching of Reading, Professor Hempenstall clearly explains the five fundamental features of reading competence and how to foster them in learners. Download a copy of this excellent PDF or follow this link to learn how to purchase a hard copy.
Many of us have probably heard anecdotes about accommodations that failed or even backfired. A summary of an NEA survey of Washington state teachers indicated those teachers’ concern about students losing mandated IEP services because of the administration of the Smarter Balanced Assessment, that state’s Common Core-aligned test.
A pair of articles in the LA Times covers this topic, too. The first, “How new tools meant to help special education students take standardized tests actually made it harder,” discusses one teacher’s experiences and some larger issues, with references to Washington and Oregon. The second summarizes anecdotes from teachers about problems they encountered in administering California’s Common Core-aligned tests: “These are the problems some California teachers had when they tested students with disabilities.”
I encourage readers to be cautious about presuming that these stories and others like them indict the Common Core State Standards. There are many other players in the mix in these stories. Note how poorly designed or executed Universal Design for Learning may be at play in how the assessment materials are represented, how the technologies themselves may contribute to the difficulties, and, of course, how these reports are only anecdotal. We have no idea how many other stories there are or how representative these may be of all the stories that could be told.
That does not mean educators should not try to address and fix these problems. Indeed, it’s important to examine them carefully. Perhaps the National Center on Educational Outcomes, a respected US research and development group that provides technical assistance about the participation of students with disabilities and English language learners (and a collaborator with the Smarter Balanced Assessment folks), could study these issues.
One interesting way to study the problems might be to collect the anecdotes systematically…sort of crowd-source them into a database: state test, student disability category, student age, testing area, accommodation, problem encountered, and so on.
If there were a few thousand examples, maybe some consistent patterns would become clear.
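To make the crowd-sourcing idea concrete, here is a minimal sketch in Python of what such a database record and a pattern tally might look like. The schema, field names, and sample reports are all invented for illustration; they are not drawn from any real data set.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical record schema for a crowd-sourced report of a testing problem.
# Field names mirror the list in the post: state test, disability category,
# student age, testing area, accommodation, and the problem encountered.
@dataclass(frozen=True)
class AccommodationReport:
    state_test: str           # e.g., "Smarter Balanced"
    disability_category: str  # e.g., "SLD" (specific learning disability)
    student_age: int
    testing_area: str         # e.g., "reading", "math"
    accommodation: str        # e.g., "text-to-speech"
    problem: str              # brief description of what went wrong

# A few invented reports standing in for crowd-sourced submissions.
reports = [
    AccommodationReport("Smarter Balanced", "SLD", 9, "reading",
                        "text-to-speech", "audio failed to load"),
    AccommodationReport("Smarter Balanced", "SLD", 10, "reading",
                        "text-to-speech", "audio failed to load"),
    AccommodationReport("CAASPP", "ASD", 11, "math",
                        "extended time", "timer not extended"),
]

# Tally (test, accommodation, problem) combinations so that recurring
# patterns rise to the top once there are thousands of reports.
patterns = Counter((r.state_test, r.accommodation, r.problem) for r in reports)
for combo, count in patterns.most_common():
    print(count, combo)
```

With a few thousand such records, the same tally (or a pivot over any pair of fields) would show whether, say, one accommodation fails repeatedly on one state’s test or the problems are scattered.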
In the summary for a recently released policy analysis, John Stone of the Education Consumers Foundation argued that developmentally appropriate practice (DAP), the widely promoted approach to early childhood education, has effectively prevented struggling students from achieving what educational policy makers have sought since 1983: the chance to close the gap. In the statement, Misdirected Teacher Training, Mr. Stone details the ways that DAP has hindered young children’s progress.
Does it actually help to monitor students’ progress and adjust instruction on the basis of how they are doing? Deborah Simmons and her colleagues provided compelling evidence that, within a tier-2 implementation of the Early Reading Intervention (ERI) program at the Kindergarten level, it surely does.
Although it was published online earlier, the study appeared in the May 2015 issue of the Journal of Learning Disabilities. In it, Professor Simmons and her team compared the reading performance of children for whom teachers had adjusted the pacing of instruction, either providing additional practice on lessons or skipping lessons, with the reading performance of children who had not received the adjustments. The adjustments were based on frequent assessments of students’ progress through the ERI program.
Among the children who received the adjustment, they identified four different groups. The graphic at the right, taken from Simmons et al. (2015) Figure 2, depicts the four groups, as described in the following list.
I am a presenter at the conference of the Division of International Special Education and Services (DISES) of the Council for Exceptional Children in Braga, Portugal, on 15 July 2014. There are a lot of people here who will be, in keeping with the conference theme of “Embracing Inclusive Approaches,” talking a lot about where special education happens.
In many ways, on the international scene, simply having students with disabilities included—as in not excluded from—education is important. As faithful readers will know, I think it’s great to include students with disabilities, but I think that what happens with them when they’re included is incredibly important. The instruction that occurs in schools is critical. Why send them to school to be defeated by lousy instruction? This is especially true for students with high-incidence disabilities, for whom being “included” very often means being assigned to a regular or general education setting full time.
So I’m talking about including science about effective teaching…just taking the opportunity to enter another plea for teaching effectively. A PDF copy of my slide deck and a couple of pages of the references to which I refer are available.
How does one know whether one’s teaching is working? That’s a dang important question. Over on myIGDIs, Scott McConnell provides a quick and clear introduction to the answer. In How Do I Know if My Classroom Practices Are Working?, Professor McConnell explains that one needs (a) goals or standards, (b) points of comparison against which to assess change or difference, and (c) trustworthy ways of measuring students’ performance, if one is to assess the effects of one’s teaching.
Although Professor McConnell’s analysis is aimed primarily at early childhood education, its basis is general enough to be applicable across age groups. He’s talking about Individual Growth and Development Indicators, or IGDIs. Those are important tools in an effective educator’s apron. myIGDIs, which provides research-based preschool language and literacy measures, looks like a valuable site. These measures link nicely to RtI, CBM, and other models that align with monitoring progress systematically.
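Professor McConnell’s three ingredients can be sketched as a tiny progress check: a goal, a baseline comparison point, and repeated measurements. The numbers below are invented for illustration (think words read correctly per minute); this is a generic aim-line comparison, not the IGDI scoring procedure itself.

```python
# Minimal sketch of the three ingredients: a goal, a baseline comparison
# point, and repeated trustworthy measurements. All numbers are invented
# for illustration (e.g., words read correctly per minute).
baseline = 10.0   # score at the start of instruction
goal = 40.0       # target score at the end of the monitoring period
weeks = 12        # length of the monitoring period

# Aim line: the weekly growth needed to move from baseline to goal.
needed_growth_per_week = (goal - baseline) / weeks  # 2.5 per week

# Weekly progress-monitoring scores for one student (weeks 1-4).
scores = [11.5, 13.0, 16.0, 18.5]
observed_growth_per_week = (scores[-1] - scores[0]) / (len(scores) - 1)

# Compare observed growth to the aim line; if the student is below it,
# that is the signal to adjust instruction.
on_track = observed_growth_per_week >= needed_growth_per_week
print(f"needed {needed_growth_per_week:.2f}/week, "
      f"observed {observed_growth_per_week:.2f}/week, on track: {on_track}")
```

Here the student is gaining about 2.33 points per week against an aim line of 2.5, so the check flags the need for an instructional adjustment, which is exactly the decision the Simmons study examined.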