New Tech GPA Stronger Predictor of College Success

A few weeks ago, I shared that my dual enrollment students' high school GPA was the strongest predictor of college success — stronger even than scores on college placement exams. Last week, it struck me that half of the students we sent (our juniors) were taught in 100% New Tech courses before dual enrolling in college. The other half were seniors who were taught in traditional classes the year before our New Tech initiative began. What a great opportunity for data comparison!

For those unfamiliar with New Tech, let me explain: 

Three years ago, my district contracted with the New Tech Network to support change in our high school in three key areas:

  1. Empowering students through increased voice and choice in their learning.
  2. Engaging students in deeper learning of course content through wall-to-wall implementation of project- and problem-based learning as our instructional model.
  3. Enabling students to foster their own learning by providing them with 1-to-1 technology and teaching them to use it effectively.

As part of this initiative, we spent over 2.5 million dollars renovating spaces, buying furniture and technology, and training teachers and leaders. As a result, our staff is now working collaboratively to design authentic projects. We've moved our teacher desks into one of two "Bullpens" where teachers meet between classes and during prep. We integrate courses whenever integration makes sense. Our students take classes like "GeoDesign," "BioLit," "American Studies," and "Civic Reasoning." Each of these classes has two teachers and more time for students to learn from their work. We are doing a lot of things differently. And better.

To put things back into perspective, we have two groups of students dual enrolling this year: seniors and juniors. Both were educated by the same teachers in the same school. The juniors are part of our New Tech initiative. The seniors are not. The circumstances are begging for further analysis!

To start, let me describe the students. Last semester, we had 67 students dual enroll: thirty-nine juniors and twenty-eight seniors. Both groups represent what we would consider our "top third" performers (more juniors dual enrolled because their class size was larger). The average high school GPAs for the two groups were close: 3.39 for juniors and 3.32 for seniors.

They were also demographically similar. Both groups had a few more boys than girls. Free and reduced lunch students were underrepresented at about a third of their overall rate (18% of dual enrolled students vs. 55% of total high school enrollment). The groups were racially similar at 99% white, which is consistent with our district and community makeup.

The one demographic difference that stands out to me is the obvious one: seniors are, on average, one year older than juniors. They also have one more year of high school experience and are one year closer to entering college full-time. While I cannot say that this information is statistically significant, after working in high schools for the past ten years, it feels anecdotally significant.

In college, they also performed similarly on average. Seniors passed 96% of their college classes with a GPA of 3.01; juniors passed 92% with a GPA of 2.90. Just three students failed a class: one senior and two juniors.

One other comparison that seems notable is that juniors and seniors took similar courses in college, with one potentially significant exception: because seniors are farther along in the curriculum, more of them took advanced math than juniors did (46% vs. 13%).

Where performance differences become noticeable is in how individual GPAs are distributed across each group. The graphs below demonstrate that difference by overlaying the distributions of high school and college GPAs for each group independently.

[Figure: overlapping distributions of high school and college GPA, shown separately for juniors and seniors]

Generally speaking, it is clear that both groups performed better at the top of the GPA range in high school than they did in college; both groups saw fewer individual students with a college GPA in the 3.0–4.0 range. It is notable, however, that the gap between high school GPA and college GPA at the top of the range is smaller for the New Tech juniors than it is for the seniors (this will be highlighted later). And, while that gap continues to exist — albeit in the opposite direction — for seniors in the middle of the GPA range (1.5–3.0), it seems to disappear for juniors. At the bottom of the range, of course, more juniors than seniors earned a GPA below a 1.5.

The degree to which high school GPA and college GPA move together can be further illustrated in the following two scatterplots:

Seniors: N=28, r=+0.65, r^2=0.418

Juniors: N=39, r=+0.84, r^2=0.705

As previously reported, there was a strong positive correlation between high school GPA and college GPA for all dual enrolled students (r=+0.74). As this data shows, the correlation was higher for juniors (r=+0.84) than it was for seniors (r=+0.65). And while I do not yet have the mathematical chops to tell you whether this difference of 0.19 is groundbreaking, I can tell you that I find it encouraging.
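For anyone who does want to check a difference like this, here is a minimal sketch (not part of my original analysis) of Fisher's r-to-z test for comparing two independent correlations, plugging in the r and N values from the scatterplots above:

```python
# Minimal sketch: Fisher's r-to-z test for the difference between two
# independent Pearson correlations. Only the r and N values come from the
# post; the test itself is a standard textbook formula.
from math import atanh, sqrt
from scipy.stats import norm

def compare_correlations(r1, n1, r2, n2):
    """Two-tailed p-value for the difference between two independent Pearson r values."""
    z1, z2 = atanh(r1), atanh(r2)            # Fisher r-to-z transform
    se = sqrt(1 / (n1 - 3) + 1 / (n2 - 3))   # standard error of the difference
    z = (z1 - z2) / se
    return 2 * (1 - norm.cdf(abs(z)))

# Juniors: r = +0.84, N = 39; Seniors: r = +0.65, N = 28
print(compare_correlations(0.84, 39, 0.65, 28))
```

With these numbers the test returns a p-value of roughly 0.09, which reads as promising but not conclusive with groups this small.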

As an educator, I strive to give students accurate information about their potential to succeed after high school. I find it satisfying to learn that our New Tech initiative may be increasing that accuracy. 

Time will tell whether or not this trend will continue. I don't want to make any broad claims about why our New Tech educated students' GPAs are better predictors of college success. I will, however, close with some wonders:

  1. I wonder what effect our measurement of skills (collaboration, agency, oral & written communication) in addition to content is having on high school success as it relates to college success? 
  2. I wonder if this trend will continue with our next group of New Tech students who dual enroll? Specifically, I wonder if the model will apply equally to lower high school GPA-earning students?
  3. I wonder if other New Tech high schools have found similar results. 
  4. I wonder if I will be satisfied if the only quantifiable difference between our New Tech educated students' college success and those students taught in our traditional high school is this increase in our ability to predict said success? I wonder if our community would be satisfied?
  5. I wonder what questions I'm not asking that may have compelling answers in this data?

Our New Tech students are taking the ACT for the first time next week. We will also begin scheduling our second group of Early College participants. I can't wait to add this data to the mix for further analysis to see how they compare.

Predicting College Success

I spent my morning analyzing the grades of the sixty-seven juniors and seniors who dual enrolled from my school this past semester. Of the 464 college credits attempted, 440 were earned, giving us a pass rate just a hair under ninety-five percent. Half the group had a college GPA above a 3.43. I'd say this is pretty good news for our first cohort of New Tech students taking college classes.

One of the goals of my analysis was to assess how well we predicted college readiness among these young, advanced students. While only four of the sixty-seven students who dual enrolled failed a class, some students still performed worse than expected. Pushing students to college too early could blemish their college transcripts. Defining "ready" has therefore become a really big deal.

Aligning our thinking with both our college partner and the state, we placed the greatest weight on students' college entrance exam scores last year. In deciding who got to go, we let test scores trump other valid readiness indicators such as high school GPA and teacher perception.

So, how did that work out for us?

The worst predictor of student success for us was the score earned on the COMPASS, taken by our current juniors who had not yet taken the ACT. The COMPASS is used by our community college partner to place students into courses at appropriate levels. For us, it turned out that the COMPASS only weakly predicted college success (r=+0.25).

The correlation between student COMPASS scores and college GPA was a low r=+0.25.

Coming in second was the ACT assessment, taken by all juniors in the state of Michigan. The ACT proved to be a fair predictor of college success (r=0.44).

The correlation between student ACT scores and college GPA was a moderate r=+0.44.

The best predictor of college success turned out to be student high school GPA (r=+0.74).

The correlation between student high school GPA and college GPA was a high r=+0.74.
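For readers who want to reproduce numbers like these with their own data, here is a minimal sketch using scipy; the GPA lists below are placeholders, not our actual student records:

```python
# Minimal sketch of how a correlation like r=+0.74 is computed.
from scipy.stats import pearsonr

# Placeholder data -- not our actual students
hs_gpa      = [3.9, 3.5, 3.2, 3.7, 2.9, 3.4]   # high school GPAs
college_gpa = [3.7, 3.1, 2.8, 3.5, 2.4, 3.0]   # college GPAs for the same students

r, p = pearsonr(hs_gpa, college_gpa)
print(f"r = {r:+.2f}, r^2 = {r * r:.2f}, p = {p:.3f}")
```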

While the state of Michigan allows schools to use varied methods of determining college readiness before allowing students to dual enroll, it is interesting that it will not allow GPA to be a primary determining factor, given its apparent ability to correctly predict student success.

What we will most likely do in the future, given this data, is create a single numerical value for each student that takes into account their college entrance exam score and their high school GPA. This would appear to provide some additional predictive ability (r=+0.82 to r=+0.86) not possible using test scores alone.
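Here is a minimal sketch of one way such a composite could be built, standardizing each measure and averaging them. The equal weighting and the data are illustrative assumptions, not the formula we will actually adopt:

```python
# Minimal sketch: combine an entrance exam score and high school GPA into a
# single readiness number, then compare predictive ability against either
# measure alone. All data and the 50/50 weighting are illustrative.
import numpy as np

act     = np.array([24.0, 19, 28, 21, 17, 26])       # hypothetical exam scores
hs_gpa  = np.array([3.6, 3.1, 3.9, 3.3, 2.8, 3.7])   # hypothetical HS GPAs
col_gpa = np.array([3.4, 2.6, 3.8, 3.0, 2.2, 3.5])   # hypothetical college GPAs

def zscore(x):
    return (x - x.mean()) / x.std()

# Equal-weight composite of the two standardized predictors
composite = 0.5 * zscore(act) + 0.5 * zscore(hs_gpa)

def r(x, y):
    return np.corrcoef(x, y)[0, 1]

print(f"exam alone: r = {r(act, col_gpa):+.2f}")
print(f"GPA alone:  r = {r(hs_gpa, col_gpa):+.2f}")
print(f"composite:  r = {r(composite, col_gpa):+.2f}")
```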

UPDATE—January 30, 2015: Looking at this with fresh eyes, I think it's important to point out that we used the minimum COMPASS and ACT scores required for college-level coursework placement with our community college partner as our cutoff for allowing students to dual enroll. We did not use the state minimum scores, which are higher. It is logical that using the higher scores would have increased these assessments' predictive ability. We are choosing to use the lower scores to increase access with the hope of keeping risk to a minimum for our students.

College Board president key figure in development of Common Core

Nick Anderson, writing for the Washington Post back in March 2014:

Coleman’s vision for the SAT, with emphasis on analysis of texts from a range of disciplines as well as key math and language concepts, appears to echo the philosophy underlying the Common Core and could help the test track more closely with what students are learning in the nation’s classrooms.

Differing Points-of-View

The Detroit News:

Michigan’s high school juniors will be required to take the SAT college assessment exam instead of the ACT next spring ...

Quoted in the article, here's Wendy Zdeb-Roper, Executive Director of the MASSP:

Colleges and universities have not even seen the test yet and will need to re-norm their acceptance standards, since it will include a new scoring scale ...

Later in the article:

Jim Cotter, Michigan State University’s director of admissions, said he expects the impact on the admission review process will be minimal.

By my measure, the gap between "re-norm their acceptance standards" and "the impact ... will be minimal" is pretty huge.