Optimism about the future of education in a world of AI

Ethan Mollick, writing for One Useful Thing:

I actually think the opposite is true: education will be able to adapt to AI far more effectively than other industries, and in ways that will improve both learning and the experience of instructors.

During a recent meeting with my district's Literacy Coaches, I saw an opportunity to introduce ChatGPT and help them understand how it could generate assessment prompts using a new technical format they had recently learned. The coaches identified a grade level and subject area, and I used ChatGPT to generate multiple assessment prompts. The prompts were not only coherent and grammatically correct, but they also perfectly fit the specifications learned at the training.

The coaches were amazed at the speed and accuracy of ChatGPT's responses. This was their first exposure to the technology, and the demonstration helped them see the potential benefits AI could bring to planning instruction.

With AI, teachers can quickly generate materials and assessments, saving time and allowing them to focus on individualized student support. Additionally, AI can analyze student data and provide personalized recommendations, helping teachers better understand each student's strengths and needs. By showing teachers these benefits through hands-on experiences early on, we can build their confidence and encourage more integration in the classroom.

Could Rubric-Based Grading Be the Assessment of the Future?

So, apparently the Association of American Colleges and Universities has been piloting the use of rubric assessments of "cross-cutting skills." They call their rubrics Valid Assessment of Learning in Undergraduate Education, or VALUE.

According to Katrina Schwartz's reporting on the pilot last month, the professors involved were surprised by what they, themselves, learned by doing assessments in this way:

Professors began realizing how much the language of their assignment prompts communicated what they expected from students. That might seem obvious, but without other samples to compare to, professors just thought their students didn’t have the skills.

You don't get this type of reflection from multiple-choice tests.

The Problem with Boys

As previously mentioned, my high school is now dual enrolling more students than ever — about ten times more. A quarter of all juniors and seniors took half their classes at the community college last semester as part of our early college efforts.

By most measures, these students did very well. As a group, they earned over 95% of the credits they attempted with an average GPA over 3.0. They were, after all, able to dual enroll because of their past performance on standardized tests and high school coursework. They went to college because we thought they were "ready."

Yet, unsurprisingly, not all students performed equally well. About 15% of our dual enrolled students ended the semester with a college GPA below a 2.0.  A few students even experienced their first academic failure in college. So, even within our high average of success, not all students shared the same experience. 

First Semester 2014-15 Dual Enrollment GPA Distribution (N=67)

We consider this fact — that some students didn't do as well as expected — to be a really big deal. It means that our algorithm for credentialing students for college readiness isn't yet perfect. To be clear, we didn't expect it to be, and while we acknowledge that reaching "perfect" isn't probable, wanting perfect gives us reason to dig into our data in hopes of finding some clues that will help us identify relative risk in the future.

Our biggest takeaway?

Boys did much worse in college coursework than girls — a whole grade point worse, on average.

Girls earned college GPAs that were 1.05 points higher than boys, on average.

This is despite the fact that girls and boys performed equally well on both the COMPASS and ACT assessments, which we use to determine eligibility for college-level coursework. The difference between boys and girls on these tests was less than 0.01 points.

Being a boy had a stronger negative effect on student success than any other factor, including free/reduced lunch status and high school GPA. At the same time, these factors still compounded the risk: boys receiving free lunch with a high school GPA below 3.0 clearly had it tough, earning an average college GPA below 1.5.

The average college GPA for girls receiving free lunch with a high school GPA below 3.0: a respectable 2.5.  
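This kind of subgroup digging is straightforward to script. Here's a minimal sketch in Python, using made-up records rather than our actual roster, that groups college GPAs by sex and by a (sex, free lunch, sub-3.0 high school GPA) risk profile:

```python
from statistics import mean
from collections import defaultdict

# Hypothetical dual-enrollment records; the real student data isn't published here.
records = [
    {"sex": "M", "free_lunch": True,  "hs_gpa": 2.7, "college_gpa": 1.4},
    {"sex": "M", "free_lunch": False, "hs_gpa": 3.4, "college_gpa": 2.9},
    {"sex": "F", "free_lunch": True,  "hs_gpa": 2.8, "college_gpa": 2.5},
    {"sex": "F", "free_lunch": False, "hs_gpa": 3.6, "college_gpa": 3.7},
    {"sex": "M", "free_lunch": True,  "hs_gpa": 2.9, "college_gpa": 1.6},
    {"sex": "F", "free_lunch": True,  "hs_gpa": 2.6, "college_gpa": 2.4},
]

# Group college GPAs by sex, then by a (sex, free-lunch, below-3.0 HS GPA) profile.
by_sex = defaultdict(list)
by_profile = defaultdict(list)
for r in records:
    by_sex[r["sex"]].append(r["college_gpa"])
    profile = (r["sex"], r["free_lunch"], r["hs_gpa"] < 3.0)
    by_profile[profile].append(r["college_gpa"])

gap = mean(by_sex["F"]) - mean(by_sex["M"])
print(f"Girl-boy college GPA gap: {gap:+.2f}")
for profile, gpas in sorted(by_profile.items()):
    print(profile, f"mean={mean(gpas):.2f}")
```

With a real roster export in place of the invented rows, the same few lines surface the gender gap and the highest-risk profiles at once.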

What now?

We certainly can't increase our requirements for boys above that of girls without raising some eyebrows. What we can do is educate parents and students on the relative risks of going to college and how our data should inform that risk. While hope will likely spring eternal for most, some students may delay college entry in hopes of better results down the road.

We can also raise our expectations overall, since doing so would mean sending fewer students with high school GPAs below 3.0 to college. Even though most boys saw their GPA decline in college, the decline was less detrimental for students who started college with a high school GPA above 3.0. This seems obvious, but it is good to finally have data to back it up.

Lastly, I think it's crucial that we think of new ways to support students, specifically these struggling boys, while in college. To do this appropriately, we're going to have to get to know our boys a bit better to start to decipher what is going on. Is it maturity? Is it social expectations? Is it video games? We need to learn more about what is going on with them so that we can build in better supports for them to be successful.

Predicting College Success

I spent my morning analyzing the grades of the sixty-seven juniors and seniors who dual enrolled from my school this past semester. Of the 464 college credits attempted, 440 were earned, giving us a pass rate just a hair under ninety-five percent. Half the group had a college GPA above a 3.43. I'd say this is pretty good news for our first cohort of New Tech students taking college classes.

One of the goals of my analysis was to assess how well we predicted college readiness amongst these young advanced students. While only four of the sixty-seven students who dual enrolled experienced failure, some students still performed worse than expected. Pushing students to college too early could potentially blemish their college transcript. Defining "ready" has therefore become a really big deal.

Aligning our thinking with both our college partner and the state, we placed the greatest weight on students' college entrance exam scores last year. In deciding who got to go, we let test scores trump all other valid readiness indicators such as high school GPA, teacher perception, etc.

So, how did that work out for us?

The worst predictor of student success for us was their score earned on the COMPASS, taken by our current juniors who had not yet taken the ACT. The COMPASS is used by our community college partner to place students into courses at appropriate levels. For us, it turned out that the COMPASS provided only a minor ability to predict college success (r=0.25).

The correlation between student COMPASS scores and college GPA was a low r=+0.25.

Coming in second was the ACT assessment, taken by all juniors in the state of Michigan. The ACT proved to be a fair predictor of college success (r=0.44).

The correlation between student ACT scores and college GPA was a moderate r=+0.44.

The best predictor of college success turned out to be student high school GPA (r=0.74).

The correlation between student high school GPA and college GPA was a high r=+0.74.

While the state of Michigan allows schools to use varied methods of determining college readiness before allowing students to dual enroll, it is interesting that it will not allow GPA to be a primary determining factor, given its apparent ability to predict student success.

What we will most likely do in the future, given this data, is create a single numerical value for each student that takes into account their college entrance exam score and their high school GPA. This would appear to provide some additional predictive ability (r=+0.82 to r=+0.86) not possible using test scores alone.
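As a sketch of how such a composite could be built, the snippet below (Python, with invented numbers standing in for our student data) computes the Pearson correlation for each predictor alone and for a simple composite that averages the two predictors' z-scores. The equal weighting is an assumption for illustration, not our settled formula:

```python
from statistics import mean, pstdev

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical records: (ACT composite, high school GPA, college GPA).
students = [
    (19, 2.8, 2.4), (24, 3.6, 3.5), (21, 3.1, 2.9),
    (28, 3.9, 3.8), (17, 2.5, 1.9), (22, 3.3, 3.2),
    (26, 3.7, 3.4), (20, 2.9, 2.6),
]
act = [s[0] for s in students]
hs_gpa = [s[1] for s in students]
college_gpa = [s[2] for s in students]

# Standardize each predictor, then average the z-scores into one composite.
def z_scores(xs):
    m, sd = mean(xs), pstdev(xs)
    return [(x - m) / sd for x in xs]

composite = [(a + g) / 2 for a, g in zip(z_scores(act), z_scores(hs_gpa))]

print(f"ACT alone:    r={pearson_r(act, college_gpa):+.2f}")
print(f"HS GPA alone: r={pearson_r(hs_gpa, college_gpa):+.2f}")
print(f"Composite:    r={pearson_r(composite, college_gpa):+.2f}")
```

Standardizing first matters because ACT scores and GPAs sit on different scales; without it, the predictor with the larger numeric range would dominate the average.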

UPDATE—January 30, 2015: Looking at this with fresh eyes, I think it's important to point out that we used the minimum COMPASS and ACT scores required for college-level coursework placement with our community college partner as our cutoff for allowing students to dual enroll. We did not use the state minimum scores, which are higher. It is logical that using the higher scores would have increased these assessments' predictive ability. We are choosing to use the lower scores to increase access with the hope of keeping risk to a minimum for our students.

Parent Communication 3.0

Imagine it's Sunday night. You're sitting on the couch with your spouse watching a reality show. During a commercial break, you pick up your smartphone to check Facebook and notice you have three new emails from your son's school. Each one is an automated message about his grades.

He's failing.

Without much understanding of the school's online gradebook or new grading policies, you call him into the room and ask for an explanation about what's going on. You're concerned.

He's dumbfounded. There is no way he could be failing. This must be a mistake.

In fact, he asserts, it may not even be his fault. His teachers don't even teach anymore; they call themselves facilitators now and expect the students to do all the learning on their own. It's very stressful, and everyone is failing.

This hits a nerve. You know that the school is going through a change process; you attended a meeting about the changes at the beginning of the year. Could they be expecting too much too soon? Is your child, who has always received high marks in school, being harmed by these new changes at school?

You call a few other parents to hear how their kids are doing. The perception is mixed: some are doing better, some the same, and some worse. During each of these conversations, you share your concern for what's going on at school. Could it be the school's fault that your kid is failing?

In desperation, you send a terse email to the teachers and carbon copy the school principal.


While the actual conversations that play out may vary, the theme of the messages I receive is almost always the same:

  1. My child is not doing as well at school as he has in the past.
  2. The school has changed a lot since I was in school.
  3. My child can't articulate what he is doing wrong.
  4. The problem must be the school, program, or teachers.

This is understandable. When the only information parents have comes from their prior experience, an automated email, and a struggling child, it's tough to argue with their logic.

There's no solution to this problem; in fact, it's not even a "problem" to be solved. Rather, it's a complex circumstance of communication that needs to be addressed in multiple simple ways.

1. Reduce the default number of notifications sent out to parents by online gradebooks.

I've heard from parents that our online gradebook sends out as many as ten emails per day, by default, depending on teacher activity. Simply from a signal vs. noise point-of-view, that's far too many. While some parents tune out all emails they receive, others anxiously look into each one in fear that they are going to miss something important if they don't.

I feel sorry for the students with parents in either case.

The truth is that in a challenging school environment, grades fluctuate, especially at the start of each grading term. To notify parents of each fluctuation as it occurs is unreasonable: it causes anxiety and leads to inaccurate assessments of reality.

Schools need the capability to change these default settings to meet their school's particular needs. Or, the default ought to be set to zero; motivated parents who want notifications are more likely to turn them on than they are to turn them off.

2. Involve parents well before problems ever arise.

It's a common misconception that parents should become less involved in their children's education as they grow older. This is harmful. Teenagers need their parents' support and understanding as much as they did in elementary school. The only thing that should change is what parental involvement looks like over the years.

A group of parent leaders recently told me that I need to hold a mandatory "parent bootcamp" every summer. While the logistics of this alone scare me to death, the need behind the sentiment is quite real: the high school their kids are going to is a lot different from the high school they attended. The building might be the same, but the culture, instruction, and resources are not.

Parents of 21st-century educated kids need more than a handbook and an hour-long orientation. They need consistent, quality, flexible, and varied opportunities to become involved in their children's educational development.

3. Educate students on training their parents to use and understand the school's online gradebook.

There's a very practical reason why students should want to do this: to get their parents off their backs when grades do fluctuate. As long as the training comes well before problems arise, parents will know what to expect and when.

Parents may still send teachers emails (and we want them to), but with a more thorough understanding of the school's online gradebook, and after a practical conversation with their child, the tone of that email should change from terse to inquisitive.

4. Urge teachers to email parents before posting grades that may have a negative impact on overall scores.

There are times when a teacher's lessons do not go as planned. Student writing, for example, does not always live up to expectations. Sometimes, students do not test as well as we'd like them to.

Without getting into an entirely different challenge, let's just acknowledge that when teachers raise performance expectations beyond that which students are accustomed to reaching, the scaffolding can sometimes fall apart.

When it does, it is essential that teachers email parents before grades are posted. And, it's equally important that this email is positive.

Here's an example email that was sent by one of my American Studies facilitators to learners (CC'ing parents) regarding a recent draft:

Learners,

We have published the Background Information grades. As we stated in class, the paper was graded on each person's individual section of this portion of the White Paper. We were looking for citations, in-text citations, proper use of mechanics, and organization. We also looked to make sure you were contributing equally to your group.

We acknowledge that many of you will be disappointed by your grade, but ask that you understand this is just one phase of the project. As you take our feedback and apply it to your paper, the paper will improve and so will your overall grade for it. Remember, we are all learning and a big part of learning is struggling with the concepts until we can grasp them. I have already seen how much many of you have learned as a result of your struggles and everyday we learn more and more. Please keep that in mind as we push toward completing our rough draft in the week to come.

Have a wonderful week!

This email informs learners and parents that:

  • expectations on the project are high and will not be lowered.
  • many struggled to meet those expectations.
  • there is no reason for alarm.
  • lessons learned through feedback can be applied to future phases of the project.
  • with effort, everything will be okay.

Imagine the same Sunday night again. This time, something is different.

Getting emails from the school's gradebook is rare, so when one comes in, you investigate it. Recalling the parent workshops you attended, you check for grade comments, look at the posted agendas, and find the rubric used to assess your son's writing.

The conversation you have with your son is more productive because you have talked about the online gradebook before and you have a common understanding about what's going on at school. Moreover, you already had a similar conversation with your son about his grades when the teacher emailed his concern earlier in the week.

Feeling a sense of relationship with the school and the teachers there, you still send an email. This time, though, it's more to learn about after-school study hours and to ask if there is anything you can do to help at home.

Your son is still failing, but you understand why and know what steps he needs to take to improve.

What more could you ask for?