Intra-rater Reliability

          Intra-rater reliability refers to the degree of agreement among multiple repetitions of a diagnostic test performed by a single rater. In test marking, a marker should be consistent: he or she should award the same score to the same performance under different circumstances, so that judgments are not altered by external factors. For this reason, a routine of double marking is recommended, with the two scores averaged to produce the final result of the performance.

          A practical way to check intra-rater reliability is to have the examiner re-mark scripts that he or she has already marked. The examiner should not be told that the scripts are being re-marked, so that this knowledge does not become a factor affecting the second round of marking. Once both sets of scores are available, the Team Leader for the examiners can determine the means and standard deviations of the two rounds and correlate them. The reliability of the examiner can then be analyzed for further action.
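The re-marking check described above can be sketched in code. The scores below are invented for illustration, and `pearson_r` is a hypothetical helper; the idea is simply that the Team Leader computes the mean and standard deviation of each round of marking and then correlates the two rounds, with a correlation near 1.0 indicating a consistent marker:

```python
from statistics import mean, stdev

# Invented example: the same ten scripts marked twice by one
# examiner, the second round done without knowing it is a re-mark.
first_marking  = [14, 12, 17, 9, 15, 11, 18, 13, 10, 16]
second_marking = [15, 12, 16, 10, 14, 11, 17, 14, 10, 15]

def pearson_r(x, y):
    """Pearson correlation between two rounds of marking."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / ((len(x) - 1) * stdev(x) * stdev(y))

# Mean and standard deviation for each round of marking
print(mean(first_marking), round(stdev(first_marking), 2))
print(mean(second_marking), round(stdev(second_marking), 2))

# Correlation between the two rounds: close to 1.0 means the
# examiner marked consistently on both occasions.
print(round(pearson_r(first_marking, second_marking), 3))
```

The same correlation could be obtained from a statistics package; the point is only that intra-rater reliability is estimated from the agreement between the two rounds, not from either round alone.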

