September 27, 2005 Features

Show Me the Data

Finding the Evidence for School-Based Clinical Decision Making

In an era of greater accountability in education, 2005 Schools Conference participants got a look at a step-by-step approach for clinical decision making to select interventions that incorporate the current best evidence, student-parent factors, and the context of the school culture.

"Evidence-based practice is part of the total clinical process. It is our responsibility to students; it's our ethical responsibility to look at the basis of evidence," said Barbara Ehren, a research associate with the University of Kansas Center for Research on Learning. "Students don't have time to waste with practices that may not be proven."

Ehren, along with Marc Fey, a professor at the University of Kansas Medical Center, and Ron Gillam, a professor at the University of Texas at Austin, presented a review of key elements in evidence-based decision making in a plenary session on "SLPs, Start Your Engines: Evidence-Based Practice in Schools."

Evidence-based practice (EBP) is not just a trend moving from medicine to education; it also has roots in No Child Left Behind, which emphasizes scientifically based methods, and in the Individuals with Disabilities Education Improvement Act of 2004, which calls on clinicians to use "scientific, research-based interventions."

"The goal is to use the literature in a savvy process that draws on a number of different factors in which evidence plays a key role," Fey said.

Six-Step Process

The presenters proposed a six-step process for school-based decision making:

Step 1: Frame the question. To begin making a decision based on evidence, clinicians must ask the right questions. Fey suggested that clinicians transform the clinical problem into a three- or four-part question that considers: relevant student characteristics and problems; the leading intervention; an alternate intervention; and clinical outcomes or goals.

One popular format used to frame the question is known as PICO: Population, Intervention, Comparison, Outcome. For example, one clinical question might be, "Do primary grade school children with language impairment and learning disabilities improve more after a computerized intervention or traditional school treatment?"

"We need to ask, 'What's the theory behind the intervention?' Show me the data," said Ehren.

Step 2: Consider the internal evidence. This is knowledge that is acquired through professional training and experience as well as personal clinical assessment and experience in working with the student. You may already have data on that particular student, or district-wide data that apply to the clinical question, Gillam said.

Step 3: Consider the external evidence. Research data can be gathered from a variety of sources, Gillam said, encouraging participants to check out the resources on the ASHA Web site, including the EBP section and articles from ASHA journals and guidelines. Other sources of education-related research include ERIC, MEDLINE, PsycLIT, ComDisDome (the Communication Sciences and Disorders Information Service), FindArticles, university databases, and textbooks.

Gathering evidence does not have to be a solitary endeavor, but can be done as a team, according to Bonnie Singer, who offered practical tips to participants as part of the closing plenary session. "Take turns reading the evidence," she suggested. "Form study groups in your school and district to review the literature and earn CEUs. People have different ideas and you can share information."

Step 4: Critically appraise the evidence. Look at the quality of the study by considering questions such as these:

  • Are the results valid? Has the study been peer reviewed? Has it been replicated? Is there a logical rationale for the study? Were the groups equivalent before treatment? Were the coders and evaluators blind to the treatment group and control group?
  • Are the results clinically important? Gillam reminded participants to look at more than just whether the treatment produced a statistically significant improvement. "Move beyond 'does this work?' to 'how well does this work?'" he said.
  • Do the results apply to the student?

Fey encouraged participants to read studies with an eye toward the design being used, because different study designs offer different levels of evidence. The level of evidence provided by different designs can be ranked in a hierarchy (a chart is available to ASHA members).

Step 5: Integrate external and internal evidence. Internal and external sources of evidence may be supportive, non-supportive, or conflicting. The ultimate decision depends on weighing multiple factors, including student/parent factors and clinician/school factors. For example, the student and parents may have strong cultural beliefs or limited resources. Parents may favor a particular intervention, or they may be uninvolved in the process. Other factors include the school's culture and policies and district-wide data.

"When evidence indicates that outcomes are similar, the district should use the most cost-effective intervention, and the parent should be told this," Ehren said.

Step 6: Evaluate the decision-making process. Reflect on the process, the outcome, and any opportunities for improvement. "EBP is part of the total clinical process," Ehren said. "We have to continually re-evaluate our path." 

Susan Boswell, an assistant managing editor of The ASHA Leader, can be reached at sboswell@asha.org. 

cite as: Boswell, S. (2005, September 27). Show Me the Data: Finding the Evidence for School-Based Clinical Decision Making. The ASHA Leader.

Evidence-Based Practice Resources

Organizations

What Works Clearinghouse. Funded and administered by the U.S. Department of Education, the Clearinghouse develops reports characterizing the effects of educational interventions. Although most of its work to date has focused on mathematics, it has stated an interest in the following topic areas of potential interest to speech-language pathologists:

  • Interventions for beginning reading
  • English language learning
  • Interventions for improving pre-school children's school readiness
  • Peer-assisted learning interventions in elementary schools

The Campbell Collaboration. The Campbell Collaboration (frequently referred to as C2) is an international non-profit organization devoted to the development of systematic reviews of interventions in a number of non-medical fields, including education. Among the reviews they have conducted are:

  • Effects of Sesame Street Television Programming on Cognitive and Affective Outcomes of Pre-School Children
  • Effectiveness of Phonemic Awareness Training on Reading and Spelling Achievement in Elementary School Children
  • Effectiveness of Behavioral-Based Interventions for Stuttering in Children and Adolescents

Articles

Research in Special Education: Scientific Methods and Evidence-Based Practices [PDF]

The Use of Single-Subject Research to Identify Evidence-Based Practices in Special Education [PDF]

Web sites

Introduction to Evidence-Based Practice (members only). The EBP section of the ASHA Web site contains educational material and links to sources of further information.

Education Resources Information Center (ERIC). Online database of journal and non-journal literature on education issues, sponsored by the U.S. Department of Education

Document

ASHA's 2005 Focused Initiative on Evidence-Based Practice

-Compiled by Gretchen Gould



  
