Who Killed Reading?
A whodunnit-style analysis.
The Economist says EdTech. Jonathan Haidt says social media. The evidence points somewhere closer to home.
The Crime Scene
The evidence is hard to ignore.
In 2012, 27% of American 13-year-olds said they read for fun almost every day. By 2023, that number had fallen to 14%. Over the same period, the share who said they “never or hardly ever” read for pleasure more than doubled.
National reading scores tell the same story. By 2022, fourth-grade reading on the National Assessment of Educational Progress (NAEP) had fallen below every assessment since 2005 and returned roughly to its 1992 level.
The decline is most severe among struggling readers. In 2024, 34% of eighth-graders scored “below basic” on NAEP reading — the highest share in 32 years.
“This is not just a pandemic story,” said Peggy Carr, commissioner of the National Center for Education Statistics. “Our nation is facing complex challenges in reading.”
Something clearly happened to reading. The question is: what?
Suspect 1: EdTech
In January 2026, The Economist published a widely discussed investigation titled “Ed tech is profitable. It is also mostly useless.” It concluded that educational technology is a $165bn global industry that delivers little improvement in learning.
The scale is enormous. American schools spent roughly $30bn on educational technology in 2024. Today, 90% of high school students and 84% of elementary students have school-issued devices.
The correlations are striking. Fourth-graders who reported using tablets in “all or almost all” classes scored 14 points lower on NAEP reading tests than those who never used tablets — roughly a year of learning.
Research points in the same direction. A Stanford meta-analysis led by Rebecca Silverman examined 119 literacy technology studies and found modest gains on narrow skills such as decoding and vocabulary, but far smaller effects on standardized reading tests.
On the surface, the case looks strong.
But there is a problem. The timeline does not fit.
The largest expansion of classroom devices came after 2020, when schools purchased an estimated 37 million laptops and tablets during the pandemic. NAEP reading scores had already begun flattening nearly a decade earlier.
Suspect 2: Social Media
In The Anxious Generation (2024), Jonathan Haidt argues that between roughly 2010 and 2015, childhood underwent a “great rewiring,” as smartphones and social media moved rapidly into the center of adolescent life. In 2011, only 23% of American teenagers owned a smartphone. By 2015, 73% did.
Except that the timing is off here too. The algorithm-driven short-video feeds that dominate attention today arrived well after 2010–2015: Instagram launched Reels in 2020, Facebook followed in 2021, and TikTok did not become mainstream among American teenagers until around the same time. Meanwhile, NAEP reading scores had already begun flattening in the early 2010s, years before the rise of social video.
Social media made a plausible suspect. But the evidence casts it as an accomplice, not the killer.
The Alibi
Whoever (or whatever) killed reading had a convenient alibi: the tests kids took in school said everything was fine.
American (and British) schools rely on interim reading tests administered several times a year — Star Reading (Renaissance), MAP Growth (NWEA, now owned by HMH), and the i-Ready Diagnostic (Curriculum Associates). These tests were designed to identify struggling students quickly and track short-term progress. Used as lightweight screeners, they arguably work.
But schools have never had a shared definition of what reading proficiency at each grade level should actually mean. Does it mean you can pass minimum grade-level benchmarks? Or does it mean you are on track for college-ready reading?
A validation study of MAP Growth commissioned by the Virginia Department of Education concluded that the test “does not assess the full range of content, nor the highest levels of cognitive demand.” Yet over time, these assessments quietly became the benchmark.
Parents saw the word “proficient” and assumed their child was reading at grade level. In many cases, this simply wasn’t true.
The Alibi Cracks
The alibi has a name: the “proficiency illusion.” And it cracked.
For years, state reading tests suggested that large numbers of students were on track. But when researchers compared those results to NAEP, the numbers told a different story. A mapping study by the National Center for Education Statistics of 2019 state reading tests found that in 44 of 49 states, the score labeled “proficient” corresponded to NAEP’s Basic level rather than NAEP Proficient. Separate analyses by FutureEd indicate that proficiency rates on state reading tests often run about 15 to 20 percentage points higher than on NAEP.
Researchers identified the problem long ago. In 2007, analysts at the Fordham Institute and NWEA coined the term “proficiency illusion” to describe a testing system that creates “a false impression of success, especially in reading.”
The illusion becomes visible when students encounter harder assessments. NAEP and the SAT are often the first tests students take that demand genuinely complex reading.
The difference is not just difficulty — it is what the tests ask students to do. Star Reading and MAP Growth are primarily multiple-choice assessments built around short passages; they measure whether a student can locate information and make straightforward inferences. NAEP requires students to analyse longer texts, compare perspectives, and justify their interpretations in written responses. The SAT, according to the College Board’s own cognitive lab research, measures “cognitively complex thinking required for postsecondary readiness” — synthesis, evaluation, and reasoning across extended passages.
A student can score “proficient” on a screening test and still lack the skills to be on track for high school or college-ready reading. That is the gap the alibi conceals.
Two states show what that looks like in practice.
In Texas, the state’s STAAR exam reports that 49% of fourth-graders meet grade-level reading standards. NAEP says only 28% are proficient — a 21-point gap. Every test in the system signals “on track.” Then the SAT arrives, with a benchmark tied to college-level reading.
Connecticut offers a revealing contrast. Every public-school junior takes the SAT. NAEP shows roughly the same 20-point inflation gap, yet 58% of students meet the SAT’s college-readiness benchmark. The inflation is similar; the difference is that the reading instruction underneath it actually works.
The Motive
The alibi didn’t appear by accident. It was driven by policy.
When the No Child Left Behind Act took effect in 2002, schools faced a single overwhelming mandate: raise reading and math scores or face consequences. A national survey by the Center on Education Policy found that 71% of districts reduced instructional time in at least one non-tested subject. Social studies was cut in 36% of districts, and science in 28%, while reading instruction was increased by an average of 141 minutes per week.
On paper it sounded sensible. In practice it created the perfect conditions for the crime.
The pressure reshaped every classroom. Sustained independent reading gave way to short passages and test-prep drills. The subjects that supply critical background knowledge needed for NAEP- and SAT-style comprehension — history, science, and the arts — lost ground.
The pressure also reshaped the profession. In an NEA survey, nearly half of teachers said they had considered leaving the profession because of the emphasis on standardized testing. Richard Ingersoll’s longitudinal research at the University of Pennsylvania found that 44% of new teachers quit within five years — and dissatisfaction with testing and accountability pressures was among the top reasons cited.
What happened instead? Students spent more time in reading class and less time actually reading. And many of the teachers who cared most about books left.
By the early 2010s, the trade was largely complete, and NAEP reading scores stopped rising at almost exactly the same moment.
The Method
Reading wasn’t killed by outside forces such as social video, although they may have contributed. The real suspect was hiding in plain sight — inside the classroom.
Schools cut the things that actually build readers — independent reading of books, guided reading in small groups, and subjects like history, science and the arts that build the background knowledge needed for NAEP- and SAT-style comprehension. In their place came short passages, comprehension strategies and practice questions designed to prepare students for standardized tests.
The consequences show up in what students actually read — and in what classrooms actually teach.
Data from Renaissance’s What Kids Are Reading report shows the average book read by twelfth-graders has an ATOS level of about 5.2 — roughly a fifth-grade reading level. College textbooks (and the SAT) typically fall closer to a twelfth-grade reading level. Fewer books, easier books, every grade, every year.
A Harvard systematic review led by Peter Capin, examining 66 classroom studies spanning four decades, found that only 23% of instructional time in reading and language arts classrooms is devoted to comprehension — far below what research recommends.
“The obvious problem,” Capin told the Hechinger Report, “is that it’s hard to support reading comprehension if students are not reading.”
The alibi hid the decline. The method was a generation of reading instruction built on fragments instead of books.
The Verdict
Blame EdTech and you’re arresting the wrong suspect. Blame social media and you’re charging the accomplice.
But the system itself created the conditions for the decline.
Benchmark tests hid — and continue to hide — the problem. The word “proficient” drifted away from any real standard, and the tools used to measure reading progress were never designed for the job they were given.
The consequences are now visible.
Only 35% of twelfth-graders are academically ready for college-level reading, according to NAEP. At community colleges, more than 40% of students enrol in remedial reading or writing courses — and about 70% of those students never complete a degree.
University professors report the same pattern. Daniel Shore, chair of Georgetown University’s English department, says students struggle to remain focused even on short poems. Anthony Grafton, a historian at Princeton, says students arrive with narrower vocabularies than in previous generations.
Maybe Reading Is Not Fully Dead
Maybe it’s just hobbling down a dark, lonely street, looking for the things that once sustained it: books, knowledge and classrooms built for reading rather than testing.
Students need books at the edge of their ability — not the easiest text they can finish, but the most challenging one they can understand with support.
The stakes extend beyond universities. Economic forces may even help rekindle reading: prompting AI is easy, but evaluating its output well enough to be paid for the result requires deep reading. In an AI-driven economy, the advantage will belong to people who can evaluate arguments, detect weak reasoning and judge whether an answer is reliable.
Students who realise this will take agency over their own education — whether schools ask them to or not.
Reading was never supposed to be fast, fragmented or easily measured.
Until we stop rewarding easy benchmarks and fragmented reading, the decline will continue.
The author is the founder of StudyHall.AI and CollegeCopilot.me, which build AI reading tools for students and SAT preparation.
Bibliography
ACT. (2023, October). The Condition of College & Career Readiness 2023: National.
https://www.act.org/content/act/en/research/condition-of-college-and-career-readiness.html
ACT. (2024, October). Profile Report — National Graduating Class 2024.
Barshay, J. (2025, March 10). Reading comprehension loses out in the classroom. The Hechinger Report.
https://hechingerreport.org/proof-points-reading-comprehension-classroom/
Berman, A. I., Haertel, E. H., & Pellegrino, J. W. (2020). Comparability of Large-Scale Educational Assessments: Issues and Recommendations. National Academy of Education.
https://doi.org/10.31094/2020/1
Bone, J. K., Bu, F., Sonke, J. K., & Fancourt, D. (2025). The decline of reading for pleasure in the United States. iScience.
https://www.cell.com/iscience/fulltext/S2589-0042(25)01549-4
Capin, P., Dahl-Leonard, K., Hall, C., Yoon, N. Y., Cho, E., Chatzoglou, E., Reiley, S., Walker, M., Shanahan, E., Andress, T., & Vaughn, S. (2025). Reading comprehension instruction: Evaluating our progress since Durkin’s seminal study. Scientific Studies of Reading, 29(1), 85–114.
https://doi.org/10.1080/10888438.2024.2418582
Center on Education Policy. (2006, March). From the Capital to the Classroom: Year 4 of the No Child Left Behind Act.
https://www.cep-dc.org/
College Board. (2023). The Cognitively Complex Thinking Required by Select SAT Suite Questions.
https://satsuite.collegeboard.org/media/pdf/digital-sat-cognitive-lab-report-sldr.pdf
College Board. (2024). 2024 Connecticut SAT Suite of Assessments Annual Report.
College Board. (2024). 2024 Texas SAT Suite of Assessments Annual Report.
https://reports.collegeboard.org/media/pdf/2024-texas-sat-suite-of-assessments-annual-report-ADA.pdf
Cronin, J., Dahlin, M., Adkins, D., & Kingsbury, G. G. (2007). The proficiency illusion. Thomas B. Fordham Institute and Northwest Evaluation Association.
https://fordhaminstitute.org/national/research/proficiency-illusion
Curriculum Associates. (2024, August). Curriculum Associates Celebrates Schools Nationwide for Remarkable Growth in Reading and Mathematics.
Durkin, D. (1978). What classroom observations reveal about reading comprehension instruction. Reading Research Quarterly, 14(4), 481–533.
FutureEd. (2025). The new NAEP scores highlight a standards gap in many states. Georgetown University McCourt School of Public Policy.
https://www.future-ed.org/the-new-naep-scores-highlight-a-standards-gap-in-many-states/
Gareis, C. R., McMillan, J. H., Smucker, A., & Huang, K. (2021). MAP Growth Validation Study: An Evaluation of the Alignment of Selected MAP Growth Assessments to the Virginia Standards of Learning and an Exploration of the Utility of MAP Growth Reports for Determining Student Performance Relative to Grade Level. William & Mary / Virginia Commonwealth University.
https://eric.ed.gov/?id=ED618690
Haidt, J. (2024). The Anxious Generation: How the Great Rewiring of Childhood Is Causing an Epidemic of Mental Illness. Penguin Press.
Harvard Graduate School of Education. (2023). Building background knowledge in science improves reading comprehension. Usable Knowledge.
Harvard Graduate School of Education. (2025, March). How far have we come in supporting children’s reading comprehension? Usable Knowledge.
HMH. (2023, May 1). HMH Completes Acquisition of NWEA.
https://www.hmhco.com/about-us/press-releases/hmh-completes-acquisition-of-nwea
Horowitch, R. (2024, October). The elite college students who can’t read books. The Atlantic.
Horvath, J. (2024). The Digital Delusion: What We Get Wrong About Technology in Schools.
Ingersoll, R. M. (2012). Beginning Teacher Attrition and Mobility: Results from the First Through Third Waves of the 2007–08 Beginning Teacher Longitudinal Study (BTLS). National Center for Education Statistics.
https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2012049
National Center for Education Statistics. (2021). Mapping State Proficiency Standards Onto the NAEP Scales: Results From the 2019 NAEP Reading and Mathematics Assessments (NCES 2021‑036). U.S. Department of Education.
https://nces.ed.gov/nationsreportcard/subject/publications/studies/pdf/2021036.pdf
Meta. (2021, September 29). Launching Reels on Facebook in the US. Meta Newsroom.
https://about.fb.com/news/2021/09/launching-reels-on-facebook-us/
McLaughlin, D. H., Bandeira de Mello, V., Blankenship, C., Chaney, K., Esra, P., Hikawa, H., Rojas, D., William, P., & Wolman, M. (2008). Comparison Between NAEP and State Reading Assessment Results: 2003 (NCES 2008‑474). National Center for Education Statistics.
https://nces.ed.gov/nationsreportcard/pubs/studies/2008474.aspx
National Assessment Governing Board. (2017). NAEP Reading Assessment Framework.
https://www.nagb.gov/naep/reading/naep-reading-assessment-framework.html
National Assessment Governing Board. (2025, January 29). Nation’s Report Card: Decline in reading, progress in math.
National Center for Education Statistics. (2022). Digest of Education Statistics, Table 311.40: Percentage of first-year undergraduate students who reported taking remedial education courses, by selected student and institution characteristics: Selected academic years, 2003–04 through 2019–20.
https://nces.ed.gov/programs/digest/d22/tables/dt22_311.40.asp
National Center for Education Statistics. (2022). NAEP Reading: Highlights 2022. The Nation’s Report Card.
https://www.nationsreportcard.gov/highlights/reading/2022/
National Center for Education Statistics. (2023). NAEP Long-Term Trend: Highlights 2023. The Nation’s Report Card.
https://www.nationsreportcard.gov/highlights/ltt/2023/
National Center for Education Statistics. (2024). 2024 NAEP Reading State Snapshot Report — Connecticut, Grade 4.
https://nces.ed.gov/nationsreportcard/subject/publications/stt2024/pdf/2024220CT4.pdf
National Center for Education Statistics. (2024). 2024 NAEP Reading State Snapshot Report — Connecticut, Grade 8.
https://nces.ed.gov/nationsreportcard/subject/publications/stt2024/pdf/2024220CT8.pdf
National Center for Education Statistics. (2024). 2024 NAEP Reading State Snapshot Report — Texas, Grade 4.
https://nces.ed.gov/nationsreportcard/subject/publications/stt2024/pdf/2024220TX4.pdf
National Center for Education Statistics. (2024). 2024 NAEP Reading State Snapshot Report — Texas, Grade 8.
https://nces.ed.gov/nationsreportcard/subject/publications/stt2024/pdf/2024220TX8.pdf
National Center for Education Statistics. (2024). 2024 NAEP Reading Assessment: Grade 12 Results. The Nation’s Report Card.
https://www.nationsreportcard.gov/reports/reading/2024/g12/
NWEA. (2024). Predicting Proficiency on the State of Texas Assessments of Academic Readiness (STAAR) 3–8 Using NWEA MAP Growth. Linking Study Report.
https://www.nwea.org/uploads/2024-TX-STAAR-3-8-MAP-Growth-Linking-Study-Report.pdf
Pew Research Center. (2021, November 12). Among many U.S. children, reading for fun has become less common, federal data shows.
PRiME Center, St. Louis University. (2025, June). A Tale of Two Tests: NAEP vs. MAP Results in Missouri, 2019–2024.
https://www.primecenter.org/prime-blog/two-tests
Renaissance. (2024, April 9). Renaissance report highlights 179 million books read in 2022–2023 school year.
Renaissance. (2025). What Kids Are Reading: 2025 edition.
https://www.renaissance.com/resources/what-kids-are-reading/
Scholastic. (2017). Teacher & Principal School Report: Focus on Literacy.
https://www.scholastic.com/site/teacher-principal-school-report/key-findings/focus-on-literacy.html
Shanahan, T. (n.d.). If you really want higher test scores: Rethink reading comprehension instruction. Shanahan on Literacy.
Silverman, R. D., Keane, K., Darling-Hammond, E., & Khanna, S. (2024). The effects of educational technology interventions on literacy in elementary school: A meta-analysis. Review of Educational Research, 95, 972–1012.
https://doi.org/10.3102/00346543241261073
Texas Education Agency. (2024, June). TEA Releases Results for 2024 STAAR 3–8 Assessments.
Texas Education Agency. (2024). SAT/ACT Texas and U.S. Class of 2024.
World Economic Forum. (2025). The Future of Jobs Report 2025.
https://www.weforum.org/publications/the-future-of-jobs-report-2025/
Yee, D., & Cramer, B. (2026, February). Mapping State Proficiency Standards Onto the NAEP Scales: Results From the 2022 NAEP Reading and Mathematics Assessments (NCES 2026‑014). U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics.
Connecticut State Department of Education. (2024). Connecticut SAT School Day Results. EdSight.
https://public-edsight.ct.gov/performance/connecticut-school-day-sat
Connecticut State Department of Education. (2024). Smarter Balanced Achievement/Participation. EdSight.
https://public-edsight.ct.gov/performance/smarter-balanced-achievement-participation