On a pretty regular basis we see reports like this—this one apparently prompted by a College Board press release. I wish we saw fewer of them.
If there were a metric that could accurately show that our educational policy choices were enabling more kids to succeed at challenging coursework, that would be worth celebrating. But the number of students taking AP courses, and the number of students getting a 3 or above on an AP exam, are awful, utterly useless proxies for anything worth measuring.
First, whether AP courses—courses geared entirely toward preparing for tests created by the College Board—are the most valuable type of advanced coursework is entirely debatable. That debate is less likely to happen if people take “honors” like these at face value.
Second, there are ways to raise AP participation and score numbers that are academically unsound—but become incentivized when people start valuing this kind of “honor.” For example, a district can simply open the doors to AP courses regardless of whether the students are ready for the course. The district can then encourage only the most successful students to take the AP test in that subject. Many students might be poorly served by that kind of course, but the College Board gets more business and the district gets an “honor”!
Our district has celebrated this kind of accolade in the past (examples here, here, and here). In our district, some students—more than just the rare outlier—are invited to take AP courses in the first semester of their freshman year in high school. Are those really “college-level” courses? If so, are they really right for high school freshmen? The answer might well be “no” to both questions.
The fact is, we don’t know what the “right” amount of AP participation is. The last thing we should do is start chasing isolated numerical indicators, which is a recipe for unintended consequences. That’s all the more true when those indicators are in service of thinly disguised advertising for companies like the College Board.