‘Keepin’ it REAL’: The Costs of a Drug Prevention Program
December 01, 2013
Author: Theodore Caputi, W’17
Today, I’m going to ask you to flash back to sixth grade. If you attended public school in the United States, you likely remember the Drug Abuse Resistance Education (D.A.R.E.) program: specifically the “just say no” campaign, wearing drunk goggles, and D.A.R.E. graduation ceremonies.
Now think back to 12th grade. Even though students had gone through D.A.R.E. and signed “D.A.R.E. contracts” promising not to use drugs or alcohol, most of your classmates had already started drinking alcohol (Eaton, et al.). So where’s the gap? Why doesn’t D.A.R.E. work?
Despite D.A.R.E.’s popularity, there was virtually no sign that it actually worked, or that it was evidence-based. In fact, several studies have shown that the original D.A.R.E. program (in use until 2009) was ineffective and, in some cases, counterproductive (Vincus, Ringwalt, Harris, & Shamblen; Rosenbaum; Rosenbaum & Hanson).
This is not to say that all prevention programs are ineffective. For example, Project ALERT and Botvin’s Life Skills were both recognized by the National Registry of Evidence-Based Programs and Practices (NREPP) as evidence-based initiatives, and both continue to provide cost-effective prevention programs that reduce drug and alcohol use among adolescents.
So why was the original D.A.R.E. program so popular?
The evidence suggests that D.A.R.E. was chosen over other prevention programs because it was “the default” program. Sarah Birkeland, Erin Murphy-Graham, and Carol Weiss of the Harvard Graduate School of Education studied sixteen school districts that used D.A.R.E. and found that districts kept the program because it fostered positive relationships between police officers and students. But $1.3 billion is a pretty hefty price tag for making young people more familiar with police officers. Moreover, with the federal costs of drug addiction hovering above $500 billion, the opportunity cost of not implementing a successful prevention program is mind-boggling.
In 2009, D.A.R.E. took a step that seemed promising. Realizing its efficacy was in question, D.A.R.E.’s leadership adopted an evidence-based program called Keepin’ it REAL.
Maybe not. After reviewing the available literature, I noticed that we may be falling into the same trap that allowed us to: A) spend billions of dollars on a program that didn’t work and B) deprive our country’s youth of effective prevention programming.
Keepin’ it REAL is a prevention program developed by Penn State researchers and given evidence-based recognition in 2006. However, when D.A.R.E. adopted the program in 2009, it changed the strategy: D.A.R.E. offered Keepin’ it REAL, which was initially designed for high school students, to 5th and 6th graders. Further, because D.A.R.E.’s leadership continues to place greater value on branding than on scientific evidence, it failed to implement a long-term evaluation system. Relative to its cost and popularity, few studies have been performed on D.A.R.E.’s Keepin’ it REAL program. Consequently, the Substance Abuse and Mental Health Services Administration (SAMHSA) ranks its “readiness for dissemination” at just 1.5 out of 4.
It’s not clear why policymakers are using a suboptimal prevention program. It’s even less clear why we are ignoring the opportunity costs of non-evidence-based programs when scientifically proven alternatives are available.
It seems we are facing many of the same problems we faced with the original D.A.R.E. program. Again, the new D.A.R.E. program appears to be “the default” program because of its brand-name recognition. Unfortunately, that recognition is not correlated with positive results.
As the U.S. faces a major budget crunch, we can no longer afford to spend billions on programs that are ineffective. Even if the new D.A.R.E. program is taking some steps in the right direction, it is not the best use of taxpayer dollars. Our country needs to utilize a more scientific method for choosing prevention-based programming.
As a result, I recommend that states establish a centralized agency to both fund and oversee school-based prevention programming. School boards, which typically choose school-based prevention programs, often lack the time to navigate the market and determine the best options for their students. Consequently, many schools have relied on D.A.R.E.’s brand-name recognition in the hope that it will deliver scientifically proven results. A centralized agency focused on prevention programming could help schools select scientifically based programs. Moreover, it could minimize misspending and ensure that students participate in the best prevention programs possible.
Drug prevention may seem like a “little problem” in the scheme of things. But with billions of taxpayer dollars at stake, it deserves more than a passing glance.
- Birkeland, Sarah, Erin Murphy-Graham, and Carol Weiss. “Good reasons for ignoring good evaluation: The case of the Drug Abuse Resistance Education (DARE) program.” Evaluation and Program Planning 28.3 (2005): 247-256.
- Eaton, Danice K., et al. “Youth risk behavior surveillance - United States, 2011.” MMWR Surveillance Summaries 61.4 (2012): 1-162.
- Rosenbaum, Dennis P. “Just say no to DARE.” Criminology & Public Policy 6.4 (2007): 815-824.
- Rosenbaum, Dennis P., and Gordon S. Hanson. “Assessing the effects of school-based drug education: A six-year multilevel analysis of Project DARE.” Journal of Research in Crime and Delinquency 35.4 (1998): 381-412.
- Shepard, Edward M., III. “The economic costs of DARE.” Institute of Industrial Relations, Research Paper 22 (2001).
- Vincus, Amy A., et al. “A short-term, quasi-experimental evaluation of DARE’s revised elementary school curriculum.” Journal of Drug Education 40.1 (2010): 37-49.
Student Blog Disclaimer
The views expressed on the Student Blog are the author’s opinions and don’t necessarily represent the Penn Wharton Public Policy Initiative’s strategies, recommendations, or opinions.