American Speech-Language-Hearing Association

EBP Compendium: Summary of Systematic Review


Identification of Aphasia Post Stroke: A Review of Screening Assessment Tools

Salter, K., Jutai, J., et al. (2006).
Brain Injury, 20(6), 559-568.

Indicators of Review Quality:

The review addresses a clearly focused question No
Criteria for inclusion of studies are provided Yes
Search strategy is described in sufficient detail for replication Yes
Included studies are assessed for study quality Yes*
Quality assessments are reproducible Yes*

Description: This is a review of published research literature regarding the psychometric and administrative properties of 6 screening tools for aphasia.

Question(s) Addressed:

Question not specifically stated.

Population: Individuals who have had a stroke.

Intervention/Assessment: Aphasia screening tools.

Number of Studies Included: 6 screening devices reviewed. Number of articles/studies not stated.

Years Included: 1960 - 2005

Findings:

Conclusions:

  • Assessment/Diagnosis
    • Assessment Instruments
      • Aphasia Screening Tests
        • For most of the tools evaluated, information pertaining to measurement properties and clinical utility was limited. With the exception of the Reitan-Indiana Aphasia Screening Examination (ASE), all screening tools require minimal training and experience.
        • Only two screening tools (Frenchay Aphasia Screening Test and The Ullevaal Aphasia Screening Test) provided sensitivity and specificity data.
        • The sensitivity and specificity of the Frenchay Aphasia Screening Test (FAST) are 87% and 80%, respectively. Evidence suggests that the FAST is reliable, valid, and demonstrates sufficient diagnostic accuracy in terms of sensitivity and specificity, although its specificity of 80% should be interpreted with caution during implementation (see the note on these measures following this list).
        • The sensitivity and specificity of the Ullevaal Aphasia Screening Test are 75% and 90%, respectively.
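
For reference in interpreting the figures above, sensitivity and specificity are conventionally defined from true positives (TP), false negatives (FN), true negatives (TN), and false positives (FP); these are the standard definitions and are not drawn from the review itself:

$$\text{Sensitivity} = \frac{TP}{TP + FN}, \qquad \text{Specificity} = \frac{TN}{TN + FP}$$

On these definitions, the reported FAST values mean that approximately 87% of patients with aphasia would screen positive, while roughly 20% of patients without aphasia would also screen positive, which is the basis for the caution regarding the 80% specificity.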


Keywords: Aphasia, Stroke

Note:

*The included studies were not assessed for study quality; however, the included tools were assessed for quality. "Identified tools were assessed for reliability and validity as well as for practical, administrative qualities such as interpretability, acceptability and feasibility based upon the evaluation criteria recommended by Fitzpatrick et al." (p. 560).

Added to Compendium: January 2012
