Date of Award

January 2014

Degree Type

Open Access Dissertation

Degree Name

Doctor of Education (EdD)

Department

Educational Leadership and Policy Studies

First Advisor

Charles S. Hausman

Department Affiliation

Educational Leadership and Policy Studies

Second Advisor

James R. Bliss

Department Affiliation

Educational Leadership and Policy Studies

Third Advisor

Robert Biggin

Department Affiliation

Educational Leadership and Policy Studies

Abstract

Substantial increases in online education since the start of the 21st century require investigation into how online courses differ from traditional face-to-face courses. It is particularly important to discover how online students learn and which assessment methods they prefer and see as most beneficial to online learning. Using online assessment techniques that correspond with those rated highly by online students can lead to better student experiences in online courses and improved persistence rates, which have traditionally been lower than in face-to-face courses.

The study sample included online students majoring in the Bachelor of Science degree programs in Criminal Justice, Police Studies, Homeland Security, and Correctional and Juvenile Justice Studies within the College of Justice and Safety at Eastern Kentucky University. This quantitative study examined these online students' attitudes toward fifteen assessment techniques commonly used in online courses. Participants completed an online survey in which they rated each assessment technique from 1 to 6 based on their personal preference for the technique and then rated each technique from 1 to 6 based on its learning value. The mean ratings were rank ordered; the top five assessment techniques for personal preference were Multiple Choice Questions, Matching Questions, Reflections/Issue Papers, True-False Questions, and Short Answer Questions. The top five for learning value were Reflection/Issue Papers, Multiple Choice Questions, Short Answer Questions, Matching Questions, and Discussion Boards.

A series of paired-samples t-tests was conducted comparing each assessment technique's personal preference mean rating to its corresponding learning value mean rating. Significant differences were found between the personal preference and learning value mean ratings of Journals, Research Papers, Group Papers/Portfolios, Journal Article Reviews, Wikis, Multiple Choice Questions, Fill-in-the-Blank Questions, and Essay/Discussion Questions. Finally, the data were tested to determine whether a significant correlation existed between each assessment technique's personal preference mean rating and its corresponding learning value mean rating. These tests revealed positive correlations between the personal preference and learning value mean ratings of each assessment technique.
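The paired-samples comparison described above can be sketched as follows. This is a minimal illustration of the test statistic using hypothetical 1-to-6 ratings, not the study's actual data; the function name and sample values are invented for the example.

```python
import math
from statistics import mean, stdev

def paired_t(pref, value):
    """Paired-samples t statistic for two related sets of ratings:
    t = mean of paired differences / (sd of differences / sqrt(n))."""
    diffs = [p - v for p, v in zip(pref, value)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Hypothetical ratings of one assessment technique by five students:
# personal preference vs. perceived learning value.
pref = [5, 4, 6, 5, 4]
value = [4, 4, 5, 3, 4]
t = paired_t(pref, value)
print(round(t, 3))  # compare |t| to a t distribution with n - 1 df
```

A significant result indicates that students' preference for a technique differs reliably from their judgment of its learning value, which is the comparison the study ran for each of the fifteen techniques.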
