Scoring Overview

Quality Assurance

The Professional Educator Standards Board (PESB) contracted with ETS because of its standards for quality assurance. The scoring process includes controls to manage the quality of the scores produced, and test validity, reliability and fairness are of foremost importance. Scoring adheres to the highest industry standards for evidence-based assessments required for high-stakes licensure. Throughout the scoring process, steps are taken to control for inherent biases that may affect scoring. Scorers are trained to recognize when a personal, societal or professional bias might interfere with their ability to score a response fairly. Through adherence to these industry standards, candidates can be confident that portfolio scoring is conducted to achieve the highest levels of fairness and reliability.

Fair and Unbiased Scoring

There are numerous checks in place to provide fair and valid scores. Scorers are required to complete a rigorous training program in which they must demonstrate understanding of the standards, criteria, entry directions, rubrics and more. Scorers must demonstrate mastery of the scoring process through multiple practice sessions conducted by experts trained in qualifying scorers, and they must then take and pass a test verifying that they can apply the scoring process accurately.

All identifying information is removed from responses so that scorers cannot know a candidate's identity. For resubmitted entries, scorers are not aware of any previous scores: they will not know that an entry is a resubmission, nor will they have access to any prior scores the candidate received.

Inter-rater Agreement

Scorer agreement is a desirable goal for all evidence-based assessments scored using rubrics. The ProTeach Portfolio is scored using a 4-point rubric, and it is important that different scorers who have been trained to score candidate responses closely agree in the scores they assign to the same response. In other words, the score a candidate receives should not depend on which particular trained scorer happens to score the response. ETS will compute the extent of scorer agreement on the ProTeach Portfolio using a Kappa statistic, which corrects for chance agreement and ranges from 0 (agreement no better than chance) to 1 (complete rater agreement). The significance of the Kappa value is evaluated by testing whether it exceeds 0, the value expected when raters agree only at chance levels. If the Kappa value significantly exceeds what would be expected by chance, it may be concluded that the extent of rater agreement is statistically significant.
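For illustration only, the sketch below shows how a simple Cohen's Kappa could be computed for two raters who have scored the same set of responses on a 4-point rubric. The function and the scores are hypothetical and do not describe ETS's actual statistical procedures.

```python
# Minimal Cohen's Kappa sketch (illustrative only; not ETS's actual procedure).
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two equal-length lists of ratings."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)

    # Observed agreement: proportion of responses given the same score by both raters.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement: probability both raters would assign the same score
    # if each scored independently according to their own score distribution.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    chance = sum(counts_a[s] * counts_b[s] for s in counts_a) / (n * n)

    # Kappa = 0 means agreement no better than chance; 1 means complete agreement.
    return (observed - chance) / (1 - chance)

# Hypothetical scores for ten responses on a 4-point rubric.
rater_a = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
rater_b = [3, 2, 4, 2, 1, 2, 3, 4, 3, 3]
print(round(cohens_kappa(rater_a, rater_b), 2))  # 0.71 for these hypothetical scores
```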

A Word of Caution

Each entry in your ProTeach Portfolio, whether initial or resubmitted, must be entirely your own work. While we encourage collaborative teaching and the use of a support provider, each entry must be distinctly your own. Software is used to scan all written commentary for overlap with previous submissions or with another candidate's submissions. If such overlap is detected, scores will be voided and an investigation by the ETS Office of Testing Integrity may be initiated. Results of all investigations conducted are forwarded to PESB (WAC 181-87-050, Misrepresentation or Falsification in the Course of Professional Practice).
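For illustration only, the sketch below shows one common way overlap scanning can work, by comparing the word n-grams of two pieces of commentary. The function names, sample texts and n-gram length are hypothetical; this is not a description of the software ETS actually uses.

```python
# Illustrative n-gram overlap check (hypothetical; not ETS's actual software).
import re

def word_ngrams(text, n=5):
    """Return the set of lowercase n-word sequences appearing in a text."""
    words = re.findall(r"[a-z']+", text.lower())
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_ratio(text_a, text_b, n=5):
    """Proportion of text_a's n-grams that also appear somewhere in text_b."""
    grams_a, grams_b = word_ngrams(text_a, n), word_ngrams(text_b, n)
    return len(grams_a & grams_b) / len(grams_a) if grams_a else 0.0

# Hypothetical commentary excerpts; a high ratio would flag the pair for review.
text_a = ("My students analyzed primary sources in small groups and then "
          "presented their findings to the whole class for discussion.")
text_b = ("The students analyzed primary sources in small groups and then "
          "presented their findings during a gallery walk.")
print(round(overlap_ratio(text_a, text_b), 2))  # 0.53 for these hypothetical excerpts
```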

You can learn more about the specifics of scoring by clicking on the following links:


Become a ProTeach Portfolio Scorer

Find out more about becoming a 2014 Washington ProTeach Portfolio scorer and submit an application.