Thursday, January 17, 2013

ACT exam as a high school graduation requirement?

By Dr. Darrin Hanson

Louisiana recently made the ACT exam a high school graduation requirement.  Many other states, including Wisconsin, are considering requiring high school students to take the ACT.

The Louisiana program has a two-tier track for graduation.  The "academic endorsement track" requires an ACT score of 23 and the "career/technical endorsement track" requires an ACT score of 20 in order to graduate.  Having taught at the college level in Louisiana, I was quite surprised by these criteria.  In Louisiana, the average ACT composite score in 2011 was 20.2.  Since the cutoff for the lower track sits almost exactly at the state average, I would estimate that at least 40% of Louisiana students wouldn't qualify for even the career/technical endorsement.  I know for a fact that I taught college students from Louisiana who had ACT scores under 20, and I'd say at least half had scores under 23.  Many of these students ended up doing reasonably well in college.  While they generally weren't our star pupils who went on to medical school, they did graduate from college and found reasonable jobs afterwards.
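As a rough check on that estimate, here is a minimal sketch in Python, assuming ACT composites are approximately normally distributed around Louisiana's 2011 mean of 20.2 with a standard deviation of about 5 points (an assumption based on national norms, not a published state figure):

    from statistics import NormalDist

    # Assumed score distribution: the 20.2 mean is from the 2011 Louisiana
    # data; the 5-point standard deviation is an assumption on my part.
    act = NormalDist(mu=20.2, sigma=5.0)

    print(f"Below 20: {act.cdf(20):.0%}")  # roughly 48% miss the career/technical cutoff
    print(f"Below 23: {act.cdf(23):.0%}")  # roughly 71% miss the academic cutoff

Under those assumptions, nearly half of all test-takers would fall short of even the lower cutoff, which is why I consider "at least 40%" a conservative estimate.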

So, how effective is the ACT as a measure of high school academic achievement?  Honestly, I don't think it's very good.  It certainly is no substitute for standard core curriculum requirements.

First, let me state up front that these new requirements are probably beneficial to me financially.  In addition to my consulting business and teaching the occasional course at the University of Wisconsin, I also co-own a business with my wife that specializes in tutoring for college prep exams.  I help students at a variety of academic levels prepare for the ACT exam.  From that experience, and my experience as a college professor, I can tell you that the ACT does not do a great job of measuring what you learned in high school.  You see, the ACT is designed to measure skills more than knowledge.

Let me illustrate this with the ACT science section.  I can tell you without any hesitation that the science section of the ACT does not measure the amount of science you have learned.  I would estimate that only about 4% of the questions in the science section require any knowledge of science at all.  What the section really measures is a student's ability to critically read scientific charts and graphs.  While that is an important skill, and one that is very useful in college, it does not tell us how much science a student knows.  In fact, I haven't studied any science since I was a freshman in college (and I have forgotten almost all of it), yet I could easily get a perfect score on the ACT science section.

Then there is the ACT reading section.  I actually like its layout.  There are always four passages, one fiction, one social science, one humanities, and one natural science, and each passage has 10 questions measuring comprehension.  So far, so good.  My problem is that they give you 35 minutes to answer those 40 questions, reading included.  That works out to 52.5 seconds per question before you read a single word.  I have a Ph.D., and I couldn't do that if I were actually reading the passages.  Critical reading requires more time than that.  I have to teach my clients how to answer the questions without actually reading the passages.  That is not a good measure of a student's reading comprehension.

The ACT math section is probably the closest the exam comes to measuring content comprehension.  The problem is that the content includes trigonometry.  I can say for a fact that the only reason I know any trig is that I need to be able to explain it to the students I work with.  I've never used it otherwise.  I've used algebra, geometry, and a lot of statistics (which, incidentally, is not on the ACT in any meaningful way).  While I can see the value of having trigonometry on a college entrance exam, I do not see its value on a high school graduation exam.  Students who aren't expected to go to college are never told to take trig.  In fact, I didn't take trig in high school because I already knew that learning statistics would be far more beneficial for what I was going to study in college.  So why is this content in a high school graduation requirement?

(For the record, I don't have any problem with the English section of the ACT.  I think it does a good job of measuring what high school students should know for graduation.)

I recognize the dilemma state education departments find themselves in, which motivates the desire to use the ACT.  I don't even mind requiring all students to take the ACT, because it does provide a reasonable barometer of skill acquisition.  What I do mind is using it as a proxy for measuring student content attainment in high school.  That is not what the test was designed for, and it does a very poor job of it.

But I suppose I shouldn't complain as these changes could bring me a lot of money.  If you are a high school administrator in southern Wisconsin and you would like to hire me to help your students game the ACT system, I would be happy to help.  I can show your students how they don't have to actually know much to score well on the ACT.

However, if you are a policymaker in Wisconsin or anywhere else and would like help developing a more meaningful measure of student achievement, I would be even happier to assist with that superior goal.  Please contact me.

My LinkedIn Profile

Here is my LinkedIn profile for anyone who is interested.  Feel free to share this with anyone else.

http://www.linkedin.com/pub/darrin-hanson/47/47b/6a2

Friday, September 14, 2012

Is the SAT accurate in predicting student performance?

By Dr. Darrin Hanson

I come at this question from an unusual perspective.  I used to serve on a university admissions committee.  For three years I was an assistant professor with access to the SAT/ACT scores of my advisees.  I am a social scientist who has studied much of the literature on standardized testing.  And I currently co-own a side business, Madison Learns, that tutors high school students to score higher on the SAT.  Given this background, I've given a lot of thought to how effectively standardized tests such as the SAT predict how well a student will perform in college.

Today I'm writing because a recent study came out arguing that the SAT predicts college success nearly as well for lower socioeconomic-status (SES) students as it does for higher-SES students.  The study has a relatively large data set and the statistical analysis isn't crazy, but I am somewhat skeptical of the findings.

Overall, from my experience as a college instructor and a member of an admissions committee, I have to say that by and large the SAT and ACT do a relatively good job of predicting success in college.  There are outliers on both sides (such as one student with an SAT combined score of 1270 who got straight A's, or another with a combined score of 2250 who flunked out his first semester), but in aggregate both tests predict college performance relatively well.  The academic studies largely support this finding.

The concern the recent study tries to address is whether the SAT is a better predictor for students from higher-SES backgrounds.  For purposes of the study, SES was measured by both household income and parental educational attainment (the highest level of schooling completed).  The study found a slightly smaller correlation between SAT scores and college success for lower-SES students than for higher-SES students, but the difference was not statistically significant.
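For readers curious how that kind of comparison is usually made, here is a minimal sketch of the standard Fisher z-test for the difference between two independent correlations.  The correlations and sample sizes below are hypothetical, chosen only for illustration; they are not taken from the study:

    import math
    from statistics import NormalDist

    def fisher_z_test(r1, n1, r2, n2):
        # Two-sided test for the difference between two independent correlations.
        z1, z2 = math.atanh(r1), math.atanh(r2)      # Fisher z-transform of each r
        se = math.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))  # standard error of the difference
        z = (z1 - z2) / se
        p = 2 * (1 - NormalDist().cdf(abs(z)))       # two-sided p-value
        return z, p

    # Hypothetical numbers: r = .35 for 2,000 higher-SES students versus
    # r = .31 for 800 lower-SES students (illustrative only, not from the study).
    z, p = fisher_z_test(0.35, 2000, 0.31, 800)
    print(f"z = {z:.2f}, p = {p:.3f}")  # p comes out around .29, i.e. not significant

With numbers like these, a gap of .04 between the two correlations is well within what sampling error alone could produce, which is the sense in which such a difference is "not statistically significant."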

It's at this point that I start to wonder what is going on.  My own suspicion, honestly, is that SAT scores significantly under-predict performance for low-SES students.  My reasoning is this: higher-SES students frequently get special training to prepare for the exams.  Looking at my current test-prep students, every one of them has at least one parent with a graduate degree and a household income in the six figures.  To put it bluntly, parents who earn less can't afford my rates, and parents who never went to college might not realize the value of getting extra help preparing for these exams.  High-SES students have a built-in advantage in taking the test because their parents understand the value of extra help and can pay for it.
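One way a researcher could test for that kind of under-prediction, sketched below with simulated data rather than anything from the study, is to regress college GPA on SAT scores for the pooled sample and then compare average residuals by SES group.  A positive average residual for low-SES students would mean the test under-predicts their actual performance.  The simulation builds in my assumption that test prep inflates high-SES scores without changing underlying ability:

    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated students: ability drives GPA, but high-SES students' SAT
    # scores get a test-prep bump unrelated to ability (my assumption).
    n = 1000
    high_ses = rng.random(n) < 0.5
    ability = rng.normal(0, 1, n)
    sat = 1500 + 150 * ability + 60 * high_ses + rng.normal(0, 50, n)
    gpa = 3.0 + 0.4 * ability + rng.normal(0, 0.3, n)

    # Ordinary least squares fit of GPA on SAT for the pooled sample.
    slope, intercept = np.polyfit(sat, gpa, 1)
    residuals = gpa - (intercept + slope * sat)

    print(f"mean residual, low SES:  {residuals[~high_ses].mean():+.3f}")  # positive: under-predicted
    print(f"mean residual, high SES: {residuals[high_ses].mean():+.3f}")   # negative: over-predicted

If the real data looked like this simulation, low-SES students would systematically outperform what their SAT scores predict, which is exactly the pattern I would expect test prep to create.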

But, to be fully honest, the skills I teach to raise scores on standardized tests are not the same as the study skills I teach to help students do better in school.  If you use my SAT test-taking techniques in your college classes, they will actually hurt your grades.  If you use my grade-improving study skills while taking the SAT, you will do poorly on the test (largely because you will run out of time).  I have yet to figure out how doing well on the SAT could truly be an accurate measure of success in college when the two demand such contrasting skills.  (Perhaps someone from the College Board can comment and explain it to me.)

The final thing that makes me suspicious of the recent study is that it was funded by the College Board, the company that owns the SAT.  That doesn't necessarily mean the study was corrupted, but it does make me wonder.  I am one of the few social scientists who will confirm the old saying: "Lies, damned lies, and statistics."  By asking certain questions in certain ways, you can manipulate results.  Again, let me emphasize that I have not found anything like that (yet) in the study that just came out.  But when a study's findings don't match up with common sense, it does make me want to sniff around a bit more.