What Everyone Should Know About Massachusetts’ High-Stakes Testing Diploma Requirement


Louis J. Kruger, Psy.D.
Northeastern University

1. One of the primary purposes of using a passing score on statewide exams as a requirement for a high school diploma is to close achievement gaps. However, research indicates that high school exit exams (HSEEs) have no effect on closing the racial achievement gap (Lee, 2008; Warren, Grodsky, & Kalogrides, 2009). Despite consistently high results on the National Assessment of Educational Progress (NAEP), Massachusetts (MA) is tied with Connecticut for the largest achievement gap in eighth-grade reading between English language learners (ELLs) and non-ELLs (National Center for Education Statistics, 2013). Moreover, Reardon, Arshan, Atteberry, and Kurlaender (2010) found that Black students, Hispanic students, and English language learners tend to score lower when higher stakes are attached to tests.

2. The most rigorous studies on HSEEs indicate that these tests have no relationship to high school students’ academic achievement (Holme, Richards, Cohen, & Jimerson, 2010; Hout & Elliott, 2011; Lee, 2008; Warren, Grodsky, & Kalogrides, 2009). Furthermore, Papay, Murnane, and Willett (2014) and Reardon et al. (2010) found that failing a HSEE does not increase a student’s academic motivation.

3. Another purported purpose of using the MCAS as a requirement for a high school diploma is to better prepare traditionally low-achieving students for the labor force and postsecondary education. Despite the MCAS requirement, approximately two-thirds of community college students need to take remedial college courses (Massachusetts Department of Education, 2008a). Also, a nationwide longitudinal study found no evidence that HSEEs have a positive impact on employment status, wage earnings, or enrollment in postsecondary education (Warren, Grodsky, & Lee, 2008).

4. High-stakes testing in Massachusetts has been exacerbating instead of diminishing the opportunity gap for ELLs and students with disabilities. In 2010, 69% of the high school seniors who failed to pass the MCAS had a disability and 11.5% were English language learners. In 2014, 75% of the high school seniors who failed to pass the MCAS had a disability and 18.5% were English language learners (MA DESE, 2015). Although Massachusetts has an alternate assessment for students with disabilities, it is a dead end for almost all the students who take it. During the 2013-14 school year, only 1 of 964 high school students in special education who attempted the English language arts alternate assessment was able to earn a passing score, and only 3 of 979 were able to pass the math version. As of 2014, more than 38,000 high school students had failed to pass the MCAS requirements by the end of their senior year, including nearly 25,000 students in special education.

5. The denial of a high school diploma has serious financial consequences for both the individual and society. The median yearly income for a young adult without a high school diploma is $22,900 (National Center for Education Statistics, 2012). Individuals without diplomas are denied entrance into the military and many trade unions and postsecondary programs. The net cost to society of each person without a high school diploma is estimated to be more than $250,000. This estimate includes the loss of taxes, as well as the considerable costs of social and correctional services (Center for Labor Market Studies, 2009).

6. The high school completion rate is about 2% lower in states that have HSEEs (Warren, Jenkins, & Kulick, 2006). This negative effect is particularly large for African-American students (Hemelt & Marcotte, 2013). High school students who fail the MCAS are 13 times more likely to drop out of school (Massachusetts Department of Education, 2014). Only 50% of 10th-grade students who fail the MCAS graduate on time (Papay, Murnane, & Willett, 2010). In addition, failing the MCAS has long-term negative effects on students’ schooling decisions (Papay, Murnane, & Willett, 2014).

7. HSEEs divert resources and time away from valuable school activities. They reduce instructional time and dominate a significant part of the high school schedule. In Massachusetts, mandated testing occurs on 28 of 180 school days at public high schools (Croes & Morgenstern, 2008; Massachusetts Department of Education, 2008b). In addition, the intense pressure to pass HSEEs in academically weak high schools limits the time and resources devoted to college preparation activities, such as college advising (Perna & Thomas, 2009).


References

Center for Labor Market Studies. (2009). Left behind: The nation’s dropout crisis. Retrieved September 7, 2009, from http://www.clms.neu.edu/publication/documents/CLMS_2009_Dropout_Report.pdf

Croes, J., & Morgenstern, M. (2008, December 25). Too much testing cuts into learning. The Boston Globe. Retrieved December 29, 2008, from http://www.boston.com/bostonglobe/editorial_opinion/oped/articles/2008/12/25/too_much_testing_cuts_into_learning/

Hemelt, S. W., & Marcotte, D. E. (2013). High school exit exams and dropout in an era of increased accountability. Journal of Policy Analysis and Management, 32(2), 323-349.

Holme, J. J., Richards, M., Cohen, R., & Jimerson, J. (2010). Assessing the effects of high school exit exams. Review of Educational Research, 80(4), 476-526.

Hout, M., & Elliott, S. W. (Eds.). (2011). Incentives and test-based accountability in education. Washington, DC: National Academies Press.

Lee, J. (2008). Is test-driven external accountability effective? Synthesizing the evidence from cross-state causal-comparative and correlational studies. Review of Educational Research, 78, 608-644.

Massachusetts Department of Education. (2008a). Progress report on students attaining the competency determination statewide and by school and district: Classes of 2008 and 2009. Retrieved November 27, 2008, from http://www.doe.mass.edu/mcas/2008/results/CD.doc

Massachusetts Department of Education. (2008b). 2008-2009 schedule for MCAS and MEPA testing. Retrieved December 21, 2008, from http://www.doe.mass.edu/mcas/0809schedule.pdf

Massachusetts Department of Elementary and Secondary Education. (2014). E-mail communication, September 22, 2014.

National Center for Education Statistics. (2013). Data retrieved April 19, 2015, from


Papay, J. P., Murnane, R. J., & Willett, J. B. (2010). The consequences of high school exit examinations for low-performing urban students: Evidence from Massachusetts. Educational Evaluation and Policy Analysis, 32(1), 5-23.

Papay, J. P., Murnane, R. J., & Willett, J. B. (2014). High-school exit examinations and the schooling decisions of teenagers: Evidence from regression-discontinuity approaches. Journal of Research on Educational Effectiveness, 7(1), 1-27. doi:10.1080/19345747.2013.819398

Perna, L. W., & Thomas, S. L. (2009). Barriers to college opportunity: The unintended consequences of state-mandated testing. Educational Policy, 23, 451-479.

Reardon, S. F., Arshan, N., Atteberry, A., & Kurlaender, M. (2010). Effects of failing a high school exit exam on course taking, achievement, persistence, and graduation. Educational Evaluation and Policy Analysis, 32(4), 498-520.

Warren, J. R., Grodsky, E., & Kalogrides, D. (2009). State high school exit examinations and NAEP long-term trends in reading and mathematics, 1971-2004. Educational Policy, 23(4), 589-614.

Warren, J. R., Grodsky, E., & Lee, J. C. (2008). State high school exit examinations and postsecondary labor market outcomes. Sociology of Education, 81(1), 77-107.

Warren, J. R., Jenkins, K. N., & Kulick, R. B. (2006). High school exit examinations and state-level completion and GED rates, 1975 through 2002. Educational Evaluation and Policy Analysis, 28(2), 131-152.