Test Scoring and Analysis Using SAS

Book description


Develop your own multiple-choice tests, score your students' tests, produce student rosters (in printed form or in Excel), and explore item response theory (IRT).

Aimed at nonstatisticians working in education or training, Test Scoring and Analysis Using SAS explains item analysis and test reliability in easy-to-understand terms and teaches you the SAS programming needed to score tests, perform item analysis, and estimate reliability. For maximum flexibility, the scoring and analysis programs let you analyze tests with multiple versions, define alternate correct responses for selected items, and repeat the scoring with selected items deleted.
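
To give a flavor of the kind of scoring programs the book builds up, here is a minimal sketch (not taken from the book, and using hypothetical names and a hypothetical file layout): it assumes a text file quiz.txt whose first record is the answer key and whose remaining records each hold a nine-character student ID starting in column 1, followed by ten one-character answer choices starting in column 11.

    data score;
       infile 'c:\mytests\quiz.txt';                 * hypothetical file name and layout;
       array Ans[10] $ 1 Ans1-Ans10;                 * student answer choices;
       array Key[10] $ 1 Key1-Key10;                 * answer key;
       array Score[10] Score1-Score10;               * 1 = correct, 0 = incorrect;
       retain Key1-Key10;                            * keep the key across all students;
       if _n_ = 1 then input (Key1-Key10)($1.);      * first record is the answer key;
       input @1 ID $9. @11 (Ans1-Ans10)($1.);        * one record per student;
       do Item = 1 to 10;
          Score[Item] = (Key[Item] eq Ans[Item]);    * comparison yields 1 or 0;
       end;
       Raw = sum(of Score1-Score10);                 * number of items answered correctly;
       Percent = 100*Raw/10;                         * percentage score;
       keep ID Raw Percent;
    run;

    proc print data=score noobs;                     * a bare-bones roster;
       var ID Raw Percent;
    run;

The book's own programs generalize this idea to an arbitrary number of items, multiple test versions, alternate correct answers, and more polished rosters produced with PROC REPORT or exported to Excel.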

You will be guided step by step on how to design multiple-choice items, use item analysis to improve your tests, and even detect cheating on students' submitted multiple-choice tests. Other subjects addressed include reading data from a variety of sources (text files and Excel workbooks, for example), detecting errors in the input data, and producing class rosters in printed form or as Excel workbooks. Also included is a chapter on IRT, which is widely used in education to calibrate and evaluate items in tests such as the SAT and GRE, with instructions for running the new SAS procedure PROC IRT.
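
The centerpiece of the IRT chapter is PROC IRT. As a rough illustration only (not a program from the book), assuming a hypothetical data set named SCORED that holds the 0/1 scored responses in variables Item1-Item30, a minimal call looks like this:

    ods graphics on;
    proc irt data=scored plots=icc;   * hypothetical data set of 0/1 scored items;
       var Item1-Item30;              * the items to calibrate;
    run;

PROC IRT estimates item parameters such as difficulty and discrimination, and the PLOTS=ICC option requests item characteristic curves. The chapter itself walks through preparing the data set, running the procedure, and fitting other models.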

This book is part of the SAS Press program.

Table of contents

  1. List of Programs
  2. About This Book
  3. About These Authors
  4. Acknowledgments
  5. Chapter 1: What This Book Is About
    1. Introduction
    2. An Overview of Item Analysis and Test Reliability
    3. A Brief Introduction to SAS
  6. Chapter 2: Reading Test Data and Scoring a Test
    1. Introduction
    2. Reading Data from a Text File and Scoring a Test
    3. Explanation of Program 2.1
    4. Reading Space-Delimited Data
    5. Reading Comma-Delimited Data (CSV File)
    6. Reading Data Directly from an Excel Workbook
    7. Reading an Answer Key from a Separate File
    8. Modifying the Program to Score a Test of an Arbitrary Number of Items
    9. Displaying a Histogram of Test Scores
    10. Matching Student Names with Student IDs
    11. Creating a Fancier Roster Using PROC REPORT
    12. Exporting Your Student Roster to Excel
    13. Conclusion
  7. Chapter 3: Computing and Displaying Answer Frequencies
    1. Introduction
    2. Displaying Answer Frequencies (in Tabular Form)
    3. Modifying the Program to Display the Correct Answer in the Frequency Tables
    4. Developing an Automated Program to Score a Test and Produce Item Frequencies
    5. Displaying Answer Frequencies in Graphical Form
    6. Conclusion
  8. Chapter 4: Checking Your Test Data for Errors
    1. Introduction
    2. Detecting Invalid IDs and Answer Choices
    3. Checking for ID Errors
    4. Using “Fuzzy” Matching to Identify an Invalid ID
    5. Checking for and Eliminating Duplicate Records
    6. Conclusion
  9. Chapter 5: Classical Item Analysis
    1. Introduction
    2. Point-Biserial Correlation Coefficient
    3. Making a More Attractive Report
    4. The Next Step: Restructuring the Data Set
    5. Displaying the Mean Score of the Students Who Chose Each of the Multiple Choices
    6. Combining the Mean Score per Answer Choice with Frequency Counts
    7. Computing the Proportion Correct by Quartile
    8. Combining All the Item Statistics in a Single Table
    9. Interpreting the Item Statistics
    10. Conclusion
  10. Chapter 6: Adding Special Features to the Scoring Program
    1. Introduction
    2. Modifying the Scoring Program to Accept Alternate Correct Answers
    3. Deleting Items and Rescoring the Test
    4. Analyzing Tests with Multiple Versions (with Correspondence Information in a Text File)
    5. Analyzing Tests with Multiple Versions (with Correspondence Information in an Excel File)
    6. Analyzing Tests with Multiple Versions (with Correspondence Information and Student Data in an Excel File)
    7. Conclusion
  11. Chapter 7: Assessing Test Reliability
    1. Introduction
    2. Computing Split-Half Reliability
    3. Computing Kuder-Richardson Formula 20 (KR-20)
    4. Computing Cronbach’s Alpha
    5. Demonstrating the Effect of Item Discrimination on Test Reliability
    6. Demonstrating the Effect of Test Length on Test Reliability
    7. Conclusion
  12. Chapter 8: An Introduction to Item Response Theory - PROC IRT
    1. Introduction
    2. IRT basics
    3. Looking at Some IRT Results
    4. What We Aren’t Looking At!
    5. Preparing the Data Set for PROC IRT
    6. Running PROC IRT
    7. Running Other Models
    8. Classical Item Analysis on the 30-Item Physics Test
    9. Conclusion
    10. References
  13. Chapter 9: Tips on Writing Multiple-Choice Items
    1. Introduction
    2. Getting Started/Organized
    3. Types of Items for Achievement Tests
    4. Conclusion
    5. References
  14. Chapter 10: Detecting Cheating on Multiple-Choice Tests
    1. Introduction
    2. How to Detect Cheating: Method One
    3. How to Detect Cheating: Method Two
    4. Searching for a Match
    5. Conclusion
    6. References
  15. Chapter 11: A Collection of Test Scoring, Item Analysis, and Related Programs
    1. Introduction
    2. Scoring a Test (Reading Data from a Text File)
    3. Scoring a Test (Reading Data From an Excel File)
    4. Printing a Roster
    5. Data Checking Program
    6. Item Analysis Program
    7. Program to Delete Items and Rescore the Test
    8. Scoring Multiple Test Versions (Reading Test Data and Correspondence Data from Text Files)
    9. Scoring Multiple Test Versions (Reading Test Data from a Text File and Correspondence Data from an Excel File)
    10. Scoring Multiple Test Versions (Reading Test Data and Correspondence Data from Excel Files)
    11. KR-20 Calculation
    12. Program to Detect Cheating (Method One)
    13. Program to Detect Cheating (Method Two)
    14. Program to Search for Possible Cheating
    15. Conclusion
  16. Index

Product information

  • Title: Test Scoring and Analysis Using SAS
  • Author(s): Ron Cody, EdD; Jeffrey Smith
  • Release date: December 2014
  • Publisher(s): SAS Institute
  • ISBN: 9781629594958