James Dargan


Aspiring Data Scientist

Seattle, WA
Python, SQL, R

Actively searching


Pandas, BeautifulSoup, Requests, Plotly, Dash, nltk, Sklearn, Keras
Regression, PCA, Time Series, Trees, RFs, SVMs, Boosting, Neural Networks

View My GitHub Profile

Participation Rates & State Averages (Repo)

Project Component: I used Python and fundamental linear regression methods to demonstrate a strong negative association between a state's participation rate on the SAT or ACT and its average score.

My Medium Blogs on This Project:
Visualizing SAT & ACT Averages
Participation Rates Skew State Averages
Opt-In Bias Drives SAT & ACT State Averages
Estimating Classroom Impact on State Averages

Insight 1: SAT and ACT state averages correlate negatively. Because most states favor universal or near-universal participation on only one of the two college admissions exams, the students taking the preferred exam represent the broader student body in that state. Only the highest-performing students self-select into taking both exams, leading states to post extremely high averages on the less popular exam and creating the appearance of a negative relationship between the two tests.
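The selection effect above can be sketched in a few lines. This is a minimal illustration with synthetic data (not the project's real state figures): SAT participation drives the SAT average down while the small self-selected ACT pool posts higher scores, so the two averages end up negatively correlated.

```python
import numpy as np
import pandas as pd

# Synthetic illustration: 51 "states" where SAT participation pushes the
# SAT average down, while the self-selected ACT group scores higher.
rng = np.random.default_rng(2)
sat_participation = rng.uniform(5, 100, 51)
df = pd.DataFrame({
    "sat_avg": 1250 - 2.0 * sat_participation + rng.normal(0, 20, 51),
    "act_avg": 15 + 0.07 * sat_participation + rng.normal(0, 0.5, 51),
})

# Participation drives both averages, so they correlate negatively
# even though neither exam "causes" the other.
corr = df["sat_avg"].corr(df["act_avg"])
print(f"correlation: {corr:.3f}")
```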

Insight 2: SAT and ACT averages trend negatively with increased participation. I find robust evidence of a strong linear association between a state's participation rate on a given college admissions exam and its average score on that exam, statistically significant at the 99% confidence level. Participation rates alone explain over 70 percent of the variation in state averages on each exam.
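A regression like the one behind Insight 2 can be sketched as follows. This is a hedged example on synthetic data (the column names and numbers are assumptions, not the project's dataset): fit average score on participation rate and report the slope, R², and p-value.

```python
import numpy as np
from scipy.stats import linregress

# Synthetic illustration: participation rates and average scores for
# 51 "states" with a built-in negative relationship plus noise.
rng = np.random.default_rng(0)
participation = rng.uniform(5, 100, size=51)   # percent of grads tested
avg_score = 1250 - 2.0 * participation + rng.normal(0, 20, size=51)

# Simple linear regression of average score on participation rate.
fit = linregress(participation, avg_score)

print(f"slope: {fit.slope:.2f}")       # negative slope expected
print(f"R^2:   {fit.rvalue**2:.3f}")   # share of variance explained
print(f"p:     {fit.pvalue:.2e}")      # significance of the slope
```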

Insight 3: Changes in participation rate explain over 90% of the change in state ACT averages between cohorts. Using the change in participation rate (in percentage points) as X and the percent change in a state's average score from the class of 2017 to the class of 2018 as y, a simple linear regression produced an adjusted R-squared of 0.916 and reduced mean absolute error to 0.77 points from a baseline of 1.78 points. A 10-percentage-point increase in the share of a graduating class taking the ACT correlated with a 2.4 percent drop in that state's average ACT composite score.
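The baseline comparison in Insight 3 can be sketched like this. Again this uses synthetic data (the coefficients and noise are assumptions, not the 2017-2018 cohort figures): fit a simple linear regression, compare its mean absolute error to a predict-the-mean baseline, and compute adjusted R-squared.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error

# Synthetic illustration: X = change in participation rate (pct points),
# y = percent change in a state's average score between cohorts.
rng = np.random.default_rng(1)
delta_participation = rng.uniform(-5, 95, size=(51, 1))
pct_change_score = -0.24 * delta_participation[:, 0] + rng.normal(0, 1, 51)

model = LinearRegression().fit(delta_participation, pct_change_score)
pred = model.predict(delta_participation)

# Baseline: always predict the mean percent change.
baseline = np.full_like(pct_change_score, pct_change_score.mean())

mae_model = mean_absolute_error(pct_change_score, pred)
mae_base = mean_absolute_error(pct_change_score, baseline)

# Adjusted R^2 for n observations and k = 1 predictor.
n, k = len(pct_change_score), 1
r2 = model.score(delta_participation, pct_change_score)
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)

print(f"model MAE:    {mae_model:.2f}")
print(f"baseline MAE: {mae_base:.2f}")
print(f"adjusted R^2: {adj_r2:.3f}")
```

The point of the baseline is interpretability: a low MAE only matters relative to what a naive predictor achieves on the same data.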

Insight 4: Further research into Colorado, Illinois, and Rhode Island strengthens the argument for causality. By treating state-level policy changes that dramatically altered participation rates as natural experiments, I provide additional evidence from the SAT for a causal relationship. Illinois increased SAT participation by 90 points and saw a 96-point drop in its SAT average, lowering its relative ranking to 43. Colorado increased SAT participation by 89 points and saw a 176-point drop in its SAT average, lowering its relative ranking to 42.

Why It Matters: When reporting relative state rankings on the SAT or ACT by average exam score, blogs and news sites fail to recognize that these rankings derive primarily from differences in participation rates rather than from any underlying feature of a state's education system or the quality of its students. States with extremely high or low averages should not be praised or criticized without additional evidence, given how biased these rankings are.

Previous Project Page: Visualizing State Differences
Next Project Page: Modeling Impact of State Education System Features