Does a student's high grade-point average in her speech-language pathology graduate program mean that the student has the skills and knowledge to be an effective clinician?
Not necessarily, because it likely doesn't take into account such important competencies as diagnostic, treatment, and oral presentation skills. To address this issue, our program—the University of Kansas master's program in speech-language pathology—requires students to keep portfolios, online repositories of their graded course assignments and critiqued materials from clinical experiences.
Using these portfolios, we can assess students' knowledge and skills more fully and improve their learning in response to the Council on Academic Accreditation in Audiology and Speech-Language Pathology (CAA) standards requiring graduate programs to use formative assessments (see sidebar). The CAA defines formative assessment as "ongoing measurement during educational preparation for the purpose of improving student learning." Formative assessment provides qualitative data that complement the quantitative data provided by the grade-point average. Moreover, formative assessment provides a means of identifying areas for improvement that are common across diverse experiences, even though these particular areas might not strongly affect a final grade in any one experience.
Although the CAA standards change was the original impetus, the faculty also wanted to find ways to identify and measure learning goals as a more effective means of evaluating program success and identifying areas for revision. This motivation is common at the program level, according to the National Institute for Learning Outcomes Assessment. We identified four broad skill areas (as well as more specific skills) related to evidence-based practice for evaluation:
- Foundational knowledge (e.g., understanding basic concepts, terminology, and theory; ability to find evidence).
- Application and use (e.g., developing assessment and treatment plans).
- Analytical processes (e.g., analyzing and integrating assessment findings; monitoring treatment progress).
- Communication skills.
We identified where in our curriculum we taught these skills [PDF], and found a variety of ways for students to acquire skills in different areas of practice. Although the offerings seemed sufficient, the question of whether students were actually learning the skills still remained.
To address this question, we adopted a student portfolio system in which students archive graded course assignments and critiqued materials from clinical experiences (with client names removed) in an individual online repository each semester.
We conducted two pilot studies to develop the portfolio guidelines [PDF], which have been in use since 2009.
Developing New Assessments
Each student's portfolio serves as the foundation for formative and summative assessment. Formative assessment takes place after the student has completed two to three semesters of graduate work; summative assessment takes place in the student's last semester.
Both types of assessment begin with student self-reflection and self-evaluation. We developed two self-reflection rubrics, one for diagnostic skills [DOC] and one for treatment skills [DOC], with four levels of performance identified for each skill. Students rate themselves on each rubric and also complete an initial action plan [DOC] to reflect on their progress to date.
The student's advisor reviews these self-reflections and the contents of the student's portfolio. The advisor adds his or her reflections on the student's progress to the action plan.
The formative assessment process culminates with a meeting between the student and advisor to discuss the student's progress and ways to enhance learning during the remainder of the program. The formative assessment chart [PDF] shows frequent strengths, weaknesses, and recommended action plans from assessments conducted in 2010.
The summative assessment process culminates in a formal oral exam by a three-person faculty committee. For this final assessment, the students choose three artifacts to present [PDF] and respond to faculty questions related to the skill areas. We designed a rubric [XLS] to prompt questions and evaluate student performance in each area.
At the conclusion of the summative exam, the committee reaches a consensus rating of the student on the rubric and discusses this rating with the student. The committee completes the action plan, with particular focus on recommending continuing education activities and discussion topics for the student's clinical fellowship supervisor to facilitate the student's successful transition to the workforce.
Feeding Continuous Program Improvement
Copies of each student's summative rubric (with names removed) are archived for review by the program's Curriculum Committee. Every year, the data are summarized by computing the percentage of students who earned a particular rating in each skill area (see chart [PDF]). Because only one round of summative exams has been completed to date, the data have been used primarily to refine the portfolio and assessment procedures.
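The yearly summary the Curriculum Committee reviews amounts to a simple tally. As a minimal sketch of that computation (the skill areas, rating labels, and data here are hypothetical, not our actual rubric categories), the percentages could be derived as follows:

```python
from collections import Counter

# Hypothetical summative rubric data: one dict per student, mapping each
# skill area to the consensus rating the faculty committee assigned.
ratings = [
    {"foundational knowledge": "proficient", "application and use": "proficient"},
    {"foundational knowledge": "developing", "application and use": "proficient"},
    {"foundational knowledge": "proficient", "application and use": "developing"},
    {"foundational knowledge": "proficient", "application and use": "proficient"},
]

def summarize(ratings):
    """For each skill area, return the percentage of students at each rating."""
    counts_by_area = {}
    for student in ratings:
        for area, rating in student.items():
            counts_by_area.setdefault(area, Counter())[rating] += 1
    return {
        area: {rating: 100 * n / len(ratings) for rating, n in counts.items()}
        for area, counts in counts_by_area.items()
    }

print(summarize(ratings))
# With the sample data above: 75% "proficient" and 25% "developing" in each area.
```

In practice, a spreadsheet accomplishes the same tally; the point is simply that each skill area's distribution of ratings is computed over all students in that year's cohort.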
We already have noticed benefits to our program. The summative assessment has given us a shared body of student work, which has fostered faculty discussions about program goals, pedagogical practices, and ways to support student success. In addition, we've evaluated faculty and student satisfaction with this system of accountability through anonymous online surveys. Results generally have been positive:
- 89% of faculty felt that the portfolio and assessment process captured student learning and that the summative exam was an efficient and accurate means of evaluating student learning.
- 82% of students felt that the portfolios helped them learn about their own skills and set goals for their studies.
Student impressions of the summative exam, however, were more mixed. Although some found it stressful, others found it valuable. One student noted, "I found the experience to be very rewarding and [it] gave me additional confidence going into the interview process. I think it is beneficial for us as students to gain the experience in public speaking and thinking on our feet."
Assessing student learning is challenging, and it has taken us several years to develop our current process. The rewards, however, are worth the effort. As educators of the next generation of clinicians, we find it gratifying to see evidence that students are mastering the skills they need to succeed in practice.