There is sometimes a great awkwardness when students see their first practice test scores. It’s usually worse when the parents are there.

“This doesn’t make any sense. She’s on the Honor Roll at school. Why would her scores be this low?” they ask, as though I personally jinxed the test or graded it incorrectly just to trick them. A discrepancy between a student’s grades in school and how they perform on the test is not uncommon. This can be due to a number of factors:

1. Grade inflation in private schools. Many private schools get the same pushback from parents that my business does—they can’t understand why they’re paying so much money for their child to have a shitty GPA. The response is to pressure teachers not to give anything below a B, which causes an unsurprising wave of grade inflation.

2. Grade inflation in public schools. While this is not the case in all public schools (just as the above problem is not endemic to all private schools), it happens in some underfunded, under-supported schools. As these schools struggle to keep their students enrolled and working at grade-level material, mean comprehension and what’s often referred to as “achievement” sink, meaning that the top students at some “underperforming” public schools would fall in the lower fifty percent at other, better-funded schools. An Honors student at one of these schools might still find himself or herself underprepared for the SAT.

3. Testing anxiety, which can be hard to anticipate and even harder to combat. Anxiety is the curveball: sometimes it can be beaten, and sometimes it proves to be the mountain that cannot be moved, the stone you can’t squeeze blood from. A student who shines in the classroom might freeze at the very sight of a timer or get sick to his or her stomach; we’ve even seen anxiety-induced mid-test nosebleeds from our students.

4. The ACT and SAT don’t reflect a lot of school curricula. This last one is sort of the dirty secret. Bright students who work hard and get justifiably good grades still might not rock the ACT or SAT out of the gate. These tests not only measure a very distinct knowledge base and set of skills, but they test them in a very specific way. The tests are timed. They’re multiple choice. The ACT tests facility with data analysis. The SAT tests not only math content, but critical mathematical aptitude. While there are certain standards that all schools are held to—the ability to read, basic algebra concepts, etc.—the truth is that schools enjoy a certain amount of latitude in what any given student learns. I’ve worked with students who claim never to have taken a science class in high school. I’ve worked with students who haven’t studied grammar since the sixth grade, and some who have never studied it at all. I’ve met with a straight-A prep school student who could not for the life of her tell me what a “noun” was.

On top of that there are language-based high schools, arts-based schools, STEM schools, project-based learning experiences, and homeschoolers. Each state has education standards and graduation requirements, but there is so much elbow room within those for students to either pursue their passion or fall through some rather sizable cracks.

And what does it say when our curricula, even when they are functioning as planned, don’t align with our schema for measurement? What does it mean that what schools elect to teach isn’t the same thing we’re testing to make sure all students know? Which is wrong—the school or the test?

In the minds of many of our clients, it’s neither. We are the ones who are wrong—those who work in test prep. We’re the ones they’re paying to close that gap. And why shouldn’t any parent expect that if their son or daughter has only ever gotten A’s in English that he or she will crush the Critical Reading portions of the SAT? Why should that seem unreasonable? This goes beyond the cliché that every mother thinks her son is the smartest boy on earth—she’s essentially been told by his school that he is, and then he comes to us and we can’t even get his ACT scores above the fiftieth percentile?

Here rears its head the ugly dichotomy between paying for an education and paying for straight A’s. All these vague accusations I’ve leveled at the test prep industry about paying for a score rather than an education—the same can be said of any part of the education industry. Paying for academic tutors, private schools, summer camps, any of it. It can all be used as a method to throw money at a college application rather than educate a student. And to be clear—I don’t think that all of these things are bad all of the time. As much as I cast aspersions on my own industry, I do believe that we teach important concepts that these students would otherwise lack. That Honors student who didn’t know what a noun was left our office that day with a working definition of every part of speech. I truly believe the top-notch educations students receive at many of the country’s prep schools are great and amazing things. I simply also believe that in many cases the virtue of these resources gets warped and perverted in the race to Get Into College.

I’ve never had a conversation with a parent on this subject in which they expressed interest in finding out exactly what their son or daughter really knows. I can count on one hand the number of parents who have seriously accused their child’s prep school of under-teaching them, but I have lost track of the number of clients who have reacted with outrage that their honors student received an unsatisfactory score on a standardized test. The differences between test content and curricular content are, in my opinion, under-scrutinized. The grade-inflation talk is an even worse cherry bomb to throw at our parents. Not in all cases, but in perhaps too many, the primary concern is that their child has a 4.0.

The content of that 4.0? Less worth knowing, it seems.