CARS Newsletter Spring 2021

Spring Assessment Day: Results

Spring 2021 Assessment Day occurred remotely, with students completing their assigned assessments within a 24-hour period. The Data Management team compared scores on some of the frequently administered assessments across multiple Spring administrations. For the majority of tests, there has been a slight decrease in average percent correct scores over the past 3 to 4 Spring Assessment Days. This trend continued for Spring 2021; however, the decrease in average percent correct scores was slightly larger than observed in previous years.

The average percent correct for the Spring 2020 and Spring 2021 administrations was compared for 7 tests. Average percent correct scores from the Spring 2021 administration were lower than those from the Spring 2020 administration for all tests. For some tests, the decrease in scores was larger than what might be expected based on the decreasing trend over the years, particularly for long tests and/or tests on topics that may not interest many students. The lower scores from the Spring 2021 administration might be attributable to the remote, unproctored administration format. However, lower levels of learning due to COVID-19 could also be a contributing factor.

On average, motivation scores from the Spring 2021 administration were slightly lower than in previous administrations. However, there was a noticeable difference in the distributions of motivation scores across the Spring administrations.

There was a much larger bump at the midpoint of the motivation scale for Spring 2021 effort scores. The distributional differences could be attributed to 1) more students with lower motivation in Spring 2021 or 2) more students selecting a response of “3” for all effort items in Spring 2021. This pronounced “bump” for the Spring 2021 cohort was also observed at the lower end of total correct scores for many tests. The Data Management team is conducting further analyses using testing times and scores on some tests to identify examinees who may be unmotivated, exploring whether removing them from the data (motivation filtering) is warranted.
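As a rough sketch of what such motivation filtering might look like, the example below flags examinees whose testing time and self-reported effort both fall below chosen cutoffs, then compares average percent correct with and without them. The column names, cutoff values, and data here are hypothetical illustrations, not the team's actual analysis.

# A minimal motivation-filtering sketch, assuming hypothetical column
# names, cutoffs, and data.
import pandas as pd

# Hypothetical data: one row per examinee for a given test.
df = pd.DataFrame({
    "percent_correct": [72, 65, 38, 80, 41, 77, 35, 69],
    "testing_minutes": [24, 30, 6, 27, 5, 22, 4, 26],             # time on test
    "effort_score":    [4.2, 3.8, 3.0, 4.5, 3.0, 4.0, 3.0, 3.9],  # 1-5 scale
})

# Flag examinees as possibly unmotivated when they finish very quickly
# AND report midpoint-or-lower effort (both cutoffs are assumptions).
MIN_MINUTES = 10
MAX_EFFORT = 3.0
df["flagged"] = (df["testing_minutes"] < MIN_MINUTES) & (df["effort_score"] <= MAX_EFFORT)

# Compare average percent correct with and without the flagged examinees
# to see how much filtering changes the picture.
unfiltered = df["percent_correct"].mean()
filtered = df.loc[~df["flagged"], "percent_correct"].mean()
print(f"Unfiltered mean: {unfiltered:.1f}%")
print(f"Filtered mean:   {filtered:.1f}% ({df['flagged'].sum()} examinees removed)")

In practice, cutoffs would be grounded in response-time research, and filtered and unfiltered results would be compared before deciding whether filtering is defensible.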

CARS Talk - Alumni Panels

During the spring semester, CARS held two virtual alumni panel CARS Talks. The first panel, held on March 19th, featured alumni from our Assessment & Measurement program who currently work in higher education assessment. The panelists discussed their roles at academic institutions and gave our current students insight into what a career in higher education assessment might entail. Panelists for this CARS Talk were Katie Busby (University of Mississippi), Megan Rodgers Good (Auburn University), Ross Markle (DIA Higher Education Collaborators), and Sue Pieper (Northern Arizona University).

The second alumni panel CARS Talk featured Assessment & Measurement alumni who currently work in the testing industry and who offered our students guidance about careers in that field. This panel was held on April 16th; panelists were Carol Barry (American Board of Surgery), Bo Bashkov (IXL Learning), Madison Holzman (Curriculum Associates), Dan Jurich (National Board of Medical Examiners), Becca Marsh-Runyon (Educational Testing Service), Javarro Russell (MCAT), and Kristen Herrick (MacMillan Learning). A huge thank you to all of our alumni who took the time to participate in these CARS Talks. Their experiences are invaluable to our current students as they begin to think about the next step in their careers.

NILOA Occasional Paper No. 51

On their own, student learning and development outcomes assessment data have limited utility for improving programming. We believe outcomes data should not be collected until two fundamental questions can be answered: “Why should this programming result in the desired outcome?” (i.e., program theory) and “Was the intended programming actually experienced by students?” (i.e., implementation fidelity). Some assessment professionals may find this proclamation radical. Our call is fueled by the creation of unjustified programming and curriculum, coupled with the collection of outcomes data that are not used for improvement efforts. We contend that it is only after program theory is articulated that faculty and student affairs professionals can collect relevant, useful outcomes data. Moreover, valid inferences from outcomes data are contingent on knowing what programming students experienced. This “expanded” assessment practice has the potential to afford better-designed, more impactful, research-informed programming to students. As our students have opportunities to engage in well-implemented, should-be-effective programming, their learning should demonstrably improve. Thus, we call for professional standards and professionals themselves to integrate program theory and implementation fidelity into outcomes assessment practice.

Graduate Showcase

Graduate students in the Assessment & Measurement and Quantitative Psychology programs participated in the 2021 Spring Graduate Showcase. The showcase was held in a virtual format again this year. Students recorded videos of their presentations, and the videos were posted on the JMU website for attendees to view. The students are listed below along with their presentation titles.

Large-Scale Assessment during a Pandemic: Results from James Madison University's Assessment Day - Sarah Alahmadi

The Accuracy of Online Testing: A Reliability and Validity Investigation - Kate Schaefer and Samantha Harmon

A More Efficient Path to Learning Improvement: The Utility of Evidence-Informed Programming - Holly Buchanan

The Power of Historical Data: An Accurate Method in Estimation? - Kathryn Thompson

Differential Item Functioning Analysis on the NIH Toolbox Picture Vocabulary Test on Black and White Participants (Poster) - Jaylin Nesbitt

Awards

Sara Finney - Ford Faculty Support For Excellence in Teaching

Kathryn Thompson - Outstanding Thesis award from The Graduate School

Kate Schaefer & Kathryn Thompson - Outstanding Service award from Graduate Psychology

Nikole Gregg - Outstanding Teaching award from Graduate Psychology

Caroline Prendergast - Outstanding Scholarship award from Graduate Psychology

Class of 2021

The most unusual academic year came to a close with The Graduate School's Commencement on May 6th, 2021. Though there was an in-person ceremony, CARS honored our four graduates virtually with a Zoom reception. This year's graduates are listed below.

Nikole Gregg, Ph.D. - Nikole graduated as a triple Duke, receiving her Ph.D. in Assessment and Measurement. Her dissertation is titled "Getting Caught Up in the Process: Does it Really Matter?" Dr. Brian Leventhal was her advisor and dissertation committee chair. Nikole also won the Outstanding Teaching Award from the Dept. of Graduate Psychology for the 2020-2021 year. Nikole has accepted a position as a psychometrician at Cambium Assessment.

Beth Perkins, Ph.D. - Beth received her Ph.D. in Assessment and Measurement. Her dissertation is titled "Does Coding Method Matter? An Examination of Propensity Score Methods when the Treatment Group is Larger than the Comparison Group." Dr. Jeanne Horst was her advisor and dissertation committee chair. Beth is currently job searching and hopes to find a position where she can work remotely so she and her family can stay in the Shenandoah Valley a little while longer!

Yelisey Shapovalov, M.A. - Yelisey graduated from our Psychological Sciences, Quantitative Psychology program with an M.A. His thesis is titled "Identifying Rater Effects for Writing and Critical Thinking: Applying the Many-Facets Rasch Model to the VALUE Institute"; Dr. John Hathcoat was his advisor and thesis committee chair. Yelisey has accepted admission into the Assessment and Measurement Ph.D. program and will continue his work as a GA in CARS while he pursues that degree.

Samantha Boddy - Samantha hopes to graduate this summer from our Psychological Sciences, Quantitative Psychology program. She is working on her thesis under the supervision of her advisor and thesis chair, Dr. Debbi Bandalos. Upon graduation, Samantha hopes to find a job and take a break before eventually pursuing a doctoral degree.

Summer Events

CARS will be a sponsor at the annual conference of the Association for the Assessment of Learning in Higher Education (AALHE). CARS faculty and students will also present a few sessions and workshops during this virtual conference. If you're attending AALHE, stop by our virtual booth!

CARS will be hosting two Assessment 101 sessions this summer. These week-long workshops will be held virtually with a combination of asynchronous activities and live synchronous meeting times. Both the June and July sessions are full, and we anticipate approximately 35 people in each session. Participants include JMU faculty and staff from both Academic and Student Affairs, new incoming graduate assistants, summer undergraduate interns, and faculty from institutions and organizations external to JMU. Doctoral student Sarah Alahmadi will lead the workshops under the supervision of Dr. Jeanne Horst.

APT rating will also be held virtually in late July. Dr. Yu Bao and the graduate assistants on the PASS team will lead a two-day rater training, after which a group of JMU faculty and graduate student raters will rate all academic APT reports over a three-day period. Following the rating session, the PASS team will review all reports before sending feedback to academic programs in the fall.

CARS Student Spotlight

Samantha Harmon is entering her second year in the Psychological Sciences M.A. program with a concentration in Quantitative Psychology. A local, Samantha grew up in the Harrisonburg area and attended JMU for her undergraduate studies. After receiving a degree in Psychology, Samantha joined our program. In her first year in CARS, Samantha served as a graduate assistant on the Assessment Day/Data Management team, where she helped Dena Pastor and other graduate assistants implement JMU's first-ever remote fall Assessment Day. Samantha will resume her role on the Assessment Day team for her second year starting in the fall. Here are Samantha's responses to our interview questions - you can hear about her experience in her own words!

What made you choose JMU, and this program specifically? I had taken an advanced psychological statistics course in my undergraduate studies and was introduced to a new realm of possibilities within the psychological sciences. I fell in love with the idea of contributing to research by revising existing techniques and developing new ways to measure human attributes.

What topics/areas of study and research are you most interested in? I am interested in Diagnostic Cognitive Modeling and in gathering validity evidence for the attributes used within this framework.

What has your favorite experience been so far in this program or in your GA? Creating a website for the recruiting initiative to recruit undergrads to the internship as well as to the program. It was amazing to cultivate my skills on something I had no prior experience with.

Samantha also tells us that her favorite class so far has been Data Management (604) because she feels the class provided her with skills she will use for the rest of her life. Outside of her work at JMU, Samantha has an impressive side business making and selling gourmet cupcakes!

Created By
Paula Melester

Credits:

JMU Creative Media