Sampling Variability of Performance Assessments. Report on the Status of Generalizability Performance [microform] : Generalizability and Transfer of Performance Assessments. Project 2.4: Design Theory and Psychometrics for Complex Performance Assessment in Science / Richard J. Shavelson and Others
- Bib ID:
- 5559453
- Format:
- Book and Microform
- Author:
- Shavelson, Richard J
- Online Version:
- https://eric.ed.gov/?id=ED359229
- Description:
-
- [Washington, D.C.] : Distributed by ERIC Clearinghouse, 1993
- 32 p.
- Summary:
-
In this paper, performance assessments are cast within a sampling framework. A performance assessment score is viewed as a sample of student performance drawn from a complex universe defined by a combination of all possible tasks, occasions, raters, and measurement methods. Using generalizability theory, the authors present evidence bearing on the generalizability (reliability) and convergent validity of performance assessments sampled from a range of measurement facets, measurement methods, and databases. Results at both the individual and school levels indicate that rater-sampling variability is not an issue: raters (e.g., teachers, job incumbents) can be trained to judge performance on complex tasks consistently. Rather, task-sampling variability is the major source of measurement error. Large numbers of tasks are needed to obtain a reliable measure of mathematics and science achievement at the elementary level, or a reliable measure of job performance in the military. With respect to convergent validity, results suggest that methods do not converge; performance scores, then, depend on both the task and the method sampled. (Contains 36 references.) (Author)
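The task-sampling finding rests on standard generalizability-theory logic: averaging a score over more sampled tasks shrinks task-related error variance relative to true person variance, while adding raters helps little when rater variance is already small. The sketch below is illustrative only and is not taken from the report; the persons x tasks x raters design, the simulated variance components, and the D-study sizes are assumptions chosen to mirror the pattern the abstract describes.

import numpy as np

rng = np.random.default_rng(0)
n_p, n_t, n_r = 60, 5, 2  # persons, tasks, raters (hypothetical G-study sizes)

# Simulated scores with a large person-by-task component (task-sampling error)
# and small rater components, mirroring the pattern the abstract reports.
person = rng.normal(0.0, 1.0, (n_p, 1, 1))
task = rng.normal(0.0, 0.3, (1, n_t, 1))
rater = rng.normal(0.0, 0.1, (1, 1, n_r))
p_by_t = rng.normal(0.0, 0.9, (n_p, n_t, 1))
resid = rng.normal(0.0, 0.4, (n_p, n_t, n_r))
X = person + task + rater + p_by_t + resid

def variance_components(X):
    """Estimate variance components for a fully crossed p x t x r design."""
    n_p, n_t, n_r = X.shape
    g = X.mean()
    mp, mt, mr = X.mean((1, 2)), X.mean((0, 2)), X.mean((0, 1))
    mpt, mpr, mtr = X.mean(2), X.mean(1), X.mean(0)
    ss_p = n_t * n_r * np.sum((mp - g) ** 2)
    ss_t = n_p * n_r * np.sum((mt - g) ** 2)
    ss_r = n_p * n_t * np.sum((mr - g) ** 2)
    ss_pt = n_r * np.sum((mpt - mp[:, None] - mt[None, :] + g) ** 2)
    ss_pr = n_t * np.sum((mpr - mp[:, None] - mr[None, :] + g) ** 2)
    ss_tr = n_p * np.sum((mtr - mt[:, None] - mr[None, :] + g) ** 2)
    ss_e = np.sum((X - g) ** 2) - ss_p - ss_t - ss_r - ss_pt - ss_pr - ss_tr
    ms_p = ss_p / (n_p - 1)
    ms_pt = ss_pt / ((n_p - 1) * (n_t - 1))
    ms_pr = ss_pr / ((n_p - 1) * (n_r - 1))
    ms_e = ss_e / ((n_p - 1) * (n_t - 1) * (n_r - 1))
    # Expected-mean-square equations for a random-effects crossed design.
    return {
        "p": max((ms_p - ms_pt - ms_pr + ms_e) / (n_t * n_r), 0.0),
        "pt": max((ms_pt - ms_e) / n_r, 0.0),
        "pr": max((ms_pr - ms_e) / n_t, 0.0),
        "e": ms_e,
    }

def g_coefficient(var, n_tasks, n_raters):
    """Relative generalizability coefficient for a D-study design."""
    rel_error = var["pt"] / n_tasks + var["pr"] / n_raters + var["e"] / (n_tasks * n_raters)
    return var["p"] / (var["p"] + rel_error)

var = variance_components(X)
for k in (1, 3, 5, 10, 20):
    # Reliability rises mainly with the number of tasks, not the number of raters.
    print(f"{k:2d} tasks, 2 raters: G = {g_coefficient(var, k, 2):.2f}")

Under these assumed variance components, adding raters barely changes the generalizability coefficient, while increasing the number of tasks raises it substantially, which is the reliability argument the abstract summarizes.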
- Notes:
-
- Sponsoring Agency: California Univ., Berkeley. Office of the President.
- Sponsoring Agency: Office of Educational Research and Improvement (ED), Washington, DC.
- Sponsoring Agency: National Science Foundation, Washington, DC.
- Contract Number: NSF-SPA-8751511.
- Contract Number: NSF-TPE-9055443.
- Contract Number: R117G10027.
- May also be available online. Address as at 14/8/18: https://eric.ed.gov/
- Reproduction:
- Microfiche. [Washington D.C.]: ERIC Clearinghouse microfiches : positive.
- Subject:
-
- Academic Achievement
- Educational Assessment
- Error of Measurement
- Evaluators
- Generalizability Theory
- Interrater Reliability
- Job Performance
- Mathematics Achievement
- Measurement Techniques
- Performance Based Assessment
- Sampling
- Science Achievement
- Science Instruction
- Scores
- Scoring
- Student Evaluation
- Test Reliability
- Test Validity
- Training
- Other authors/contributors:
- National Center for Research on Evaluation, Standards, and Student Testing, Los Angeles, CA
- Available From:
- ERIC
- Copyright:
-
In Copyright
Copyright status was determined using the following information:
- Material type:
- Literary Dramatic Musical
- Published status:
- Published
- Publication date:
- 1993
Similar items
- Error Sources Influencing Performance Assessment Reliability or Generalizability [microform] : A Meta Analysis / Ying Hong Jiang and Others
- Creating Meaningful Performance Assessments [microform] / Stephen N. Elliott
- A Construct-Centered Generalizability Model [microform] : Analyzing Underlying Constructs of Cognitively Complex Performance Assessments / Ying Hong Jiang and Philip L. Smith
- Statistical Test Specifications for Performance Assessments [microform] : Is This an Oxymoron? / Mark D. Reckase
- Optimal Designs for Performance Assessments [microform] : The Subject Factor / Jay Parkes