Could Rubric-Based Grading Be the Assessment of the Future?

So, apparently the Association of American Colleges and Universities has been piloting rubric-based assessments of "cross-cutting skills." They call their rubrics the Valid Assessment of Learning in Undergraduate Education, or VALUE, rubrics.

According to Katrina Schwartz's reporting on the pilot last month, the professors involved were surprised by what they, themselves, learned by doing assessments in this way:

Professors began realizing how much the language of their assignment prompts communicated what they expected from students. That might seem obvious, but without other samples to compare to, professors just thought their students didn’t have the skills.

You don't get this type of reflection from multiple-choice tests.

Predicting College Success

I spent my morning analyzing the grades of the sixty-seven juniors and seniors who dual enrolled from my school this past semester. Of the 464 college credits attempted, 440 were earned, giving us a pass rate just a hair under ninety-five percent. Half the group had a college GPA above a 3.43. I'd say this is pretty good news for our first cohort of New Tech students taking college classes.
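
If you'd like to reproduce this kind of bookkeeping yourself, the summary numbers fall out of a few lines of Python. The file and column names below are placeholders I made up for illustration, not our actual student records.

```python
# A minimal sketch of the cohort summary above. The file and column
# names are placeholders, not our actual student records.
import pandas as pd

# One row per student: credits attempted, credits earned, college GPA
students = pd.read_csv("dual_enrollment.csv")

attempted = students["credits_attempted"].sum()   # 464 for our cohort
earned = students["credits_earned"].sum()         # 440 for our cohort
print(f"pass rate: {earned / attempted:.1%}")     # just under 95%

print(f"median college GPA: {students['college_gpa'].median():.2f}")  # 3.43
```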

One of the goals of my analysis was to assess how well we predicted college readiness among these young, advanced students. While only four of the sixty-seven students who dual enrolled failed a course, some still performed worse than expected. Pushing students into college too early could blemish their college transcripts. Defining "ready" has therefore become a really big deal.

Aligning our thinking with both our college partner and the state, we placed the greatest weight on students' college entrance exam scores last year. In deciding who got to go, we let test scores trump all other valid readiness indicators, such as high school GPA and teacher perception.

So, how did that work out for us?

The worst predictor of student success for us was the COMPASS, taken by our current juniors who had not yet taken the ACT. The COMPASS is used by our community college partner to place students into courses at appropriate levels. For us, it turned out that the COMPASS provided only weak predictive power for college success (r=+0.25).

The correlation between student COMPASS scores and college GPA was a low r=+0.25.

Coming in second was the ACT, taken by all juniors in the state of Michigan. The ACT proved to be a fair predictor of college success (r=+0.44).

The correlation between student ACT scores and college GPA was a moderate r=+0.44.

The best predictor of college success turned out to be students' high school GPA (r=+0.74).

The correlation between student high school GPA and college GPA was a high r=+0.74.
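
For anyone curious how numbers like these are produced: each is a Pearson correlation between a readiness indicator and first-semester college GPA. Here's a minimal sketch, again using made-up column names for illustration.

```python
# Pearson correlation between each readiness indicator and college GPA.
# Column names are placeholders for illustration.
import pandas as pd

students = pd.read_csv("dual_enrollment.csv")

for predictor in ["compass", "act", "hs_gpa"]:
    r = students[predictor].corr(students["college_gpa"])  # Pearson by default
    print(f"{predictor}: r = {r:+.2f}")
```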

While the state of Michigan allows schools to use varied methods of determining college readiness before allowing students to dual enroll, it is interesting that they will not allow GPA to be a primary determining factor, given its apparent ability to predict student success.

What we will most likely do in the future, given this data, is create a single numerical value for each student that combines their college entrance exam score and their high school GPA. A composite like that would appear to provide additional predictive power (r=+0.82 to r=+0.86) not possible using test scores alone.
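
One straightforward way to build such a composite, assuming the same placeholder data as above, is to fit a simple linear model on a past cohort and use the fitted value as the readiness index. The sketch below is my illustration of the idea, not a finished policy.

```python
# A sketch of a two-predictor composite: ordinary least squares on
# entrance exam score and high school GPA. Column names are placeholders.
import numpy as np
import pandas as pd

students = pd.read_csv("dual_enrollment.csv")

X = np.column_stack([
    np.ones(len(students)),       # intercept
    students["act"],              # entrance exam score
    students["hs_gpa"],           # high school GPA
])
y = students["college_gpa"].to_numpy()

coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # least-squares fit
composite = X @ coef                          # one number per student

# Multiple correlation between the composite and actual college GPA
r = np.corrcoef(composite, y)[0, 1]
print(f"composite: r = {r:+.2f}")
```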

UPDATE—January 30, 2015: Looking at this with fresh eyes, I think it's important to point out that we used the minimum COMPASS and ACT scores required for college-level coursework placement with our community college partner as our cutoff for allowing students to dual enroll. We did not use the state minimum scores, which are higher. It is logical that using the higher scores would have increased these assessments' predictive ability. We are choosing to use the lower scores to increase access with the hope of keeping risk to a minimum for our students.

My Reaction to Michigan's Switch to the SAT? Carry On.

I do not deny that we have a lot to learn about Michigan's switch to a new "college-readiness" assessment and the impact this switch will have on students' admission to colleges and universities. Any time we spend learning about the SAT is time we could spend learning about something else, like teaching practices that positively impact student achievement. At the same time, I can't help but feel indifferent about the news of the change.

High schools exist to teach students to be successful in the world they graduate into. College readiness exams measure a narrow band of that world. Assuming we are focused on teaching students the knowledge and skills they need (rather than just those that are assessed), the brand of exam should have little impact on how we work with students.

My message for teachers about this week's news:

Carry on.

College Board President Key Figure in Development of Common Core

Nick Anderson, writing for the Washington Post back in March 2014:

Coleman’s vision for the SAT, with emphasis on analysis of texts from a range of disciplines as well as key math and language concepts, appears to echo the philosophy underlying the Common Core and could help the test track more closely with what students are learning in the nation’s classrooms.

Differing Points of View

The Detroit News:

Michigan’s high school juniors will be required to take the SAT college assessment exam instead of the ACT next spring ...

Quoted in the article, here's Wendy Zdeb-Roper, executive director of the Michigan Association of Secondary School Principals (MASSP):

Colleges and universities have not even seen the test yet and will need to re-norm their acceptance standards, since it will include a new scoring scale ...

Later in the article:

Jim Cotter, Michigan State University’s director of admissions, said he expects the impact on the admission review process will be minimal.

By my measure, the gap between "re-norm their acceptance standards" and "the impact ... will be minimal" is pretty huge.