Local Measures: The Need for Neighborhood-Level Data in Youth Violence Prevention Initiatives
by Jeffrey A. Butts and Alana M. Henninger
January 17, 2017
This project was supported by Grant No. 2010-MU-FX-0007 awarded by the Office of Juvenile Justice and Delinquency Prevention, Office of Justice Programs, U.S. Department of Justice. Points of view expressed in this document are those of the authors and do not necessarily represent the official positions or policies of OJJDP or the U.S. Department of Justice.
Acknowledgements
This report was prepared with support from the Office of Juvenile Justice and Delinquency Prevention (OJJDP), part of the U.S. Department of Justice and its Office of Justice Programs (OJP). The authors are grateful to the staff and leadership of OJJDP for their guidance and support during the development of this project. We are also grateful for the support and advice of the Office for the Advancement of Research and the Office of Sponsored Programs at John Jay College of Criminal Justice, as well as assistance received from all current and former colleagues from the Research and Evaluation Center who contributed to this report and to the Center’s assessment of the National Forum on Youth Violence: Michelle Cubellis, Douglas Evans, Marissa Mandala, Laura Negredo, Jennifer Peirce, Emily Pelletier, Cassandra Ramdath, Rhoda Ramdeen, Pamela Ruiz, and Kathleen Tomberg. Finally, we would like to thank everyone we met with and interviewed in the cities participating in the National Forum on Youth Violence Prevention.
Summary
The data infrastructures available for tracking youth violence in the United States do not provide a clear view of neighborhood-level change. Effective strategies for dealing with youth violence inevitably focus on small areas like neighborhoods, and they involve partnerships with community organizations, local schools, hospitals, housing agencies, and organizations in the cultural and recreational sectors. This small-area focus makes it essential to measure the effects of violence prevention efforts at the neighborhood level. At best, however, national data systems track violence at the level of entire cities.
Violent crime in the U.S. fell sharply after the mid-1990s and it remains at historically low levels. Some cities and specific neighborhoods within cities, however, are still beset with violence. In an attempt to assist local jurisdictions with violence prevention, the U.S. Department of Justice and a number of other federal agencies launched the National Forum on Youth Violence Prevention in 2010. More than a dozen cities participated in the National Forum, collaborating to increase the effectiveness of their local strategies for reducing youth violence. The Department of Justice asked John Jay College of Criminal Justice to monitor and assess the outcomes of the National Forum beginning in 2011.
The assessment was not designed to attribute cause-and-effect relationships to activities undertaken by participating cities. The study mainly investigated the accomplishments and perceptions of the leadership networks in each city. Conducting a more rigorous evaluation of the National Forum was not feasible because a multi-city network of neighborhood-level data about youth violence and its correlates does not exist in the United States. Steps are being taken, however, that may eventually lead to better data resources. This report describes some of the most promising resources and suggests the type of work needed to provide communities with accurate, localized crime trend data with which to judge the effects of multi-jurisdictional violence prevention initiatives.
Cities Participating in the National Forum on Youth Violence Prevention
Introduction and Background
Between 2011 and 2016, the John Jay College Research and Evaluation Center (JohnJayREC) in New York City assessed the implementation of the National Forum on Youth Violence Prevention. The National Forum was a network of communities across the United States that received support and assistance from several federal agencies led by the U.S. Department of Justice (DOJ) and its Office of Juvenile Justice and Delinquency Prevention (OJJDP). The National Forum began in 2010 as a partnership between the Departments of Justice and Education (DOJ and DOE 2010). Launched by the White House, the initiative organized stakeholders from the federal, state, and local levels to discuss common challenges, prepare strategies, and coordinate actions to reduce youth violence. Within the first year, the initiative expanded to include the Office of National Drug Control Policy (ONDCP), the Department of Health and Human Services (HHS), and the Department of Housing and Urban Development (HUD).
The first ten communities to participate in the National Forum were Boston, Camden, Chicago, Detroit, Memphis, Minneapolis, New Orleans, Philadelphia, Salinas, and San Jose. Later, the initiative expanded to include Long Beach, Cleveland, Louisville, Seattle, and Baltimore. Each city assembled a team of individuals from varying backgrounds to deploy community-based strategies to prevent youth violence as well as analytic tools to gauge their effects. The teams met regularly during the implementation of the National Forum, receiving training and technical assistance from the federal partners, consultants, and contractors in the non-profit and academic sectors.
Initially, cities involved in the National Forum received only small amounts of technical assistance funding (Consolidated and Further Continuing Appropriations Act 2012). In 2011, the DOJ requested additional funding to “[create] a context for participating localities to share challenges and promising strategies with each other and to explore how federal agencies can better support local efforts” (Department of Justice 2011: 26).
The federal budget for FY2012 included the DOJ request as an allocation for grants and technical assistance in support of the National Forum (Office of Management and Budget 2011). In its justification for the request, DOJ cited the need to help cities create comprehensive plans to prevent youth violence and to spur innovation at the local level. The measure included support for more intensive technical assistance on violence prevention strategies as well as for training on topics such as data collection, data analysis, and the coordination of information systems (Department of Justice 2011).
Congress responded by funding the National Forum with an initial appropriation of $2,000,000 (Consolidated and Further Continuing Appropriations Act 2012). The legislation authorized support for participating cities to “share challenges and promising strategies, and develop or enhance effective comprehensive plans to prevent youth and gang violence in their cities, using multidisciplinary partnerships, balanced approaches, and data-driven strategies,” with the aim to “reduce violence, improve opportunities for youth and improve public safety, and encourage innovation at the local level and Federal levels” (S. REP. NO. 112-78, 2011). Federal agencies used the funds to prepare and deliver technical assistance to the cities and to convene cross-site meetings. The funding also supported travel which enabled local teams to visit each other’s cities and to learn about the development and implementation of key strategies.
Next, OJJDP issued a solicitation for a provider of training and technical assistance. Development Services Group of Bethesda, Maryland won the competition and soon began helping National Forum cities (Department of Justice 2012). Over time, funding grew and the mission of the federal partners expanded to include direct support through operational improvement grants to the cities (Consolidated and Further Continuing Appropriations Act 2013; Consolidated Appropriations Act 2014; Consolidated and Further Continuing Appropriations Act 2015). Finally, OJJDP provided funding for JohnJayREC to assess the efforts of the National Forum.
Results of the Assessment
The JohnJayREC team visited each of the first ten National Forum cities (often more than once) to interview local officials and to observe the strategies being used to accomplish the goals of the initiative. Researchers also conducted a series of surveys to measure the perceptions, opinions, and attitudes of the members of local partnerships in the National Forum cities. The first survey was administered in 2011. The last survey was completed in 2016.
The assessment project produced several reports. The first report described the strategies being pursued by cities involved in the National Forum (Butts, Tomberg, Evans, Ramdeen, Roman and Taylor 2012). A second report analyzed changes over the initial three waves of the project’s surveys in five of the earliest cities to join the National Forum (Butts, Roman and Tomberg 2012). The findings suggested the initiative was generating important changes at the community level and respondents believed the improvements were associated with the launch of the National Forum. Participating cities reported more opportunities for youth and stronger local collaborations. In addition, there were indications that the cities were developing better capacity to reduce youth violence and that local perceptions of law enforcement were improving.
A third report analyzed the final iteration of the survey (Tomberg and Butts 2016). Respondents in 2016 believed the initiative continued to be a successful approach for improving collaboration and for helping cities to implement more effective strategies to reduce violence. Organizational networks in the cities appeared to be moving in positive directions and the individuals involved in those networks were confident that their efforts had improved public safety. Respondents believed their communities were stronger as a result of participation in the National Forum. The findings again indicated increases in youth opportunities, better violence prevention approaches, improved perceptions of law enforcement, and broader engagement of community members.
The results of the stakeholder surveys were encouraging, but even the most accurate survey simply reflects the perceptions of individuals whose answers are inevitably affected by bias, misunderstanding, and subjective self-interest. Surveys are never totally sufficient for answering the most important question about a complicated crime prevention initiative—i.e., did it work as intended?
Two critical questions about the National Forum are: 1) did participation in the National Forum really change each city’s approach to preventing youth violence; and, 2) was youth violence actually lower than it would have been had cities not participated in the National Forum? Answering the first question in one city is challenging; answering the second question across a range of different cities is impossible at this time. Local data infrastructures in the United States do not measure youth violence consistently and with enough detail to support evaluation research across jurisdictional boundaries. Current data resources focus on state-level and city-level changes. No existing data source is capable of tracking crime trends in multiple areas at the neighborhood level.
Measuring Crime
The amount of crime in a community is traditionally measured in one of three ways. First, researchers may ask people directly about the criminal acts they have committed (i.e., self-reported offending). This measure has the advantage of capturing data about law violations that are not reported to police. Researchers have found high levels of agreement (80%) between self-reported arrests and official arrests (Pollock, Menard, Elliott and Huizinga 2015). Self-reported measures, however, are complex and costly to administer, which makes them impractical for routine monitoring. They require the involvement of professional researchers, and they are most feasible with small samples. Few cities could continuously field the number of neighborhood samples required to generate micro-level, longitudinal estimates of self-reported violence.
Second, researchers can effectively measure the level of crime in an area by asking residents about their victimization experiences. Victimization surveys typically track the time and place at which crime incidents occurred, the types of crimes involved, and whether victims were able to describe the offender(s) in terms of age, sex, and race. Like self-reported measures, victim surveys are able to detect violent crimes that never come to the attention of police, but they are also expensive to maintain and often require professional research organizations to implement. Federal justice agencies only recently began to invest in the creation of subnational victimization estimates (Fay and Diallo 2015). The U.S. is decades away from having enough victimization data to evaluate violence reduction efforts, especially at the level of individual neighborhoods.
The third and most common method of measuring crime is to rely on administrative data from state and local agencies. Hospitals, for example, keep data about patients with injuries due to violence. Police agencies collect data about reported crimes and arrests, often including the age, sex, and racial/ethnic background of each person arrested, as well as the time and location of crimes, whether weapons were involved, and any known relationships between victims and offenders. Court systems share information about the cases they receive from law enforcement and how they are handled—whether offenders are diverted or prosecuted, pleaded out or tried, acquitted or found guilty, and what actions were taken as a result. Justice agencies invest heavily in information gathering, analysis, and dissemination.
As policymakers recognize the growing importance of geography in crime prevention, some justice systems have begun to collect data at smaller geographic levels, such as patrol areas, police districts, and precincts (Weisburd and Telep 2014). A few cities even publish fully geocoded data that pinpoint the exact coordinates of every violent crime. Dissemination is also improving. Many cities once published crime data in annual paper reports, but it is increasingly common for police departments to release crime data online and to update the files monthly, weekly, or even on a constant, real-time basis. As in other sectors of government, the movement toward “open data” has begun to change the way communities monitor the activities and effectiveness of justice systems (Tashea 2016). Perhaps the best expression of this growing movement was the formation of the Police Data Initiative, inspired and organized by the Obama Administration’s Task Force on 21st Century Policing.
Innovations in National Forum Cities
Cities involved in the National Forum developed innovative approaches to tracking neighborhood trends in violence. With the encouragement of their Mayors, several cities worked closely with public health agencies to combine violence indicators from health and justice. Larger cities (e.g., Baltimore, Detroit, New Orleans, Philadelphia, San Jose, and Seattle) and some smaller cities (e.g., Camden and Louisville) even participated in the Police Data Initiative. In all National Forum cities, the federal partners supported local efforts to improve the measurement and assessment of violence prevention.
The City of Chicago, for example, provides public, online access to geocoded crime data. The dataset includes all reported crimes occurring in the city since 2001. The website advises users that new data are added within seven days of the occurrence of crimes. Data are extracted from the Chicago Police Department’s CLEAR (Citizen Law Enforcement Analysis and Reporting) system and include codes for the date and time of the incident, the mid-block coordinates of the incident’s location, the offense type, whether an arrest was made, and the date of the most recent update to each record (in cases where more accurate information was added subsequent to the initial posting). Data availability in Chicago is excellent for researchers who want to plot the location and time of all violent crimes. If the focus of an investigation is “youth violence,” however, data resources in Chicago are not as rich. Researchers can access information about incidents, but not about arrests organized by age of the arrestee. In addition, many arrest reports available from the Chicago Police Department website have not been updated since 2010.
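To illustrate what this level of access makes possible, the sketch below retrieves geocoded homicide records through the portal’s public Socrata API. It is a minimal example, not an official workflow: the dataset identifier (ijzp-q8t2) and field names (date, primary_type, arrest, latitude, longitude) are drawn from the public “Crimes - 2001 to Present” dataset and should be verified against the portal’s current documentation.

```python
# Minimal sketch: query Chicago's open crime data via the Socrata API.
# Dataset ID and field names are assumptions to verify against the portal.
import requests

API_URL = "https://data.cityofchicago.org/resource/ijzp-q8t2.json"

params = {
    "$select": "date,primary_type,arrest,latitude,longitude",
    "$where": "primary_type = 'HOMICIDE' AND date >= '2015-01-01T00:00:00'",
    "$limit": "50000",
}
records = requests.get(API_URL, params=params, timeout=60).json()

# Keep only geocoded records; mid-block coordinates support
# neighborhood-level mapping without identifying exact addresses.
points = [
    (float(r["latitude"]), float(r["longitude"]))
    for r in records
    if r.get("latitude") and r.get("longitude")
]
print(f"{len(points)} geocoded homicide records since 2015")
```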
New Orleans, which the website FiveThirtyEight once dubbed the “queen of open police data,” opened its entire “calls for service” database to the public in 2011, soon followed by the launch of a comprehensive homicide reduction initiative, NOLA FOR LIFE. The initiative is led by the Mayor’s Office but incorporates the efforts and leadership of dozens of other entities throughout New Orleans, including the police department and health department. It also involves an elaborate and well-coordinated strategy for using data analytics to design, deliver, and evaluate the initiative’s effects.
New Orleans is one of the best examples of a local government that fights violence by drawing on several strategies. Rather than choosing or endorsing one particular approach, New Orleans recognizes that several of the most prominent approaches for combating violence have something to offer. It uses the focused deterrence approach (“Group Violence Reduction Strategy”), the Cure Violence model (known as “Ceasefire New Orleans”), and “Project Safe Neighborhoods,” among others. Coordinating these diverse efforts creates serious measurement challenges. The Mayor’s Innovation Delivery Team manages a series of output and outcome measures to track the effectiveness of the overall initiative. One tactic monitors the number of murders reported each month for several years before and after the launch of NOLA FOR LIFE.
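The sketch below shows the simplest version of that monitoring tactic: comparing mean monthly murder counts before and after a launch date. It is illustrative only; the input file and its columns are hypothetical, and a formal evaluation would use an interrupted time-series model that accounts for trend and seasonality rather than a simple comparison of means.

```python
# Illustrative before/after comparison of monthly murder counts.
# The CSV layout ("month", "murders") is hypothetical.
import pandas as pd

df = pd.read_csv("monthly_murders.csv", parse_dates=["month"])
launch = pd.Timestamp("2012-05-01")  # NOLA FOR LIFE launched in 2012

pre = df.loc[df["month"] < launch, "murders"]
post = df.loc[df["month"] >= launch, "murders"]

print(f"Mean monthly murders before launch: {pre.mean():.1f}")
print(f"Mean monthly murders after launch:  {post.mean():.1f}")
```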
Minneapolis partnered with Hennepin County and the Minneapolis Foundation to create a violence prevention strategy called the Blueprint for Action. The strategy relies on a multi-component approach that views youth violence as a public health issue and not simply as a matter for law enforcement and criminal justice. Officials began the strategy by recognizing that cities tend to embrace a problem-solving approach in other areas of public concern (e.g., water and air quality, childhood immunizations, prenatal care). In other policy areas, the focus is not on controlling and punishing the people most to blame for a problem. Rather, public policy works to identify whatever conditions and incentives are producing an unwanted behavior, and then it implements strategies to reverse those conditions and incentives. Minneapolis’ problem-solving approach involves broad participation and community collaboration, ongoing measurement, and frequent communication.
The City of Minneapolis began to track key indicators immediately upon the launch of the Blueprint and it made the results available to the public by posting regular updates about youth violence and factors that are often correlated with youth violence, including high school graduation rates and teen pregnancy rates. The Minneapolis Police Department continues to publish weekly crime reports at the neighborhood level and it maps crime locations. The information is available online and publicly accessible through the LexisNexis Community Crime Map platform.
Beyond Policing
Most cities involved in the National Forum significantly improved their use of law enforcement data, but they did not restrict their efforts to justice information. As suggested above, justice data are inherently incomplete. Much of the actual violence in a community never comes to the attention of the justice system. One study comparing official crime data with victimization data estimated that more than half (52%) of all violent victimizations are never reported (Langton et al. 2012). Moreover, reported crimes do not always result in an arrest. In 2015, for example, police agencies nationwide made arrests in just over half of all serious assault cases (FBI 2016). Even in cases of homicide, more than a third do not result in arrests, which means the age of the offender is never known. For some offenses, only one in ten reported crimes results in an arrest.
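A back-of-the-envelope calculation shows how sharply these gaps compound. If 52 percent of violent victimizations are never reported, a police count captures less than half of the underlying violence. The sketch below applies that adjustment to a hypothetical neighborhood count.

```python
# Adjusting a reported count for the "dark figure" of crime, using the
# Langton et al. (2012) estimate that 52% of violent victimizations are
# never reported. The reported count (480) is hypothetical.
reported = 480                  # violent incidents known to police
reporting_rate = 1 - 0.52       # share of victimizations reported
estimated_total = reported / reporting_rate
print(round(estimated_total))   # roughly 1,000 underlying incidents
```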
Law enforcement information is also difficult to obtain at the local level. For more than six decades, policymakers and the public have become accustomed to judging changes in violence by reviewing the data gathered and disseminated by the Federal Bureau of Investigation’s Uniform Crime Reporting program (UCR). Jurisdictions across the United States participate in the UCR program by providing state and local crime data that the FBI aggregates into a national database and releases to the public. Thousands of police agencies participate in the program representing more than 90 percent of the U.S. resident population.
The UCR program has been an effective organizing force in the measurement of crime and violence. State and local governments must comply with certain reporting requirements in order to participate in the UCR. Over time, the requirements inspired greater consistency and comparability in data file structures among disparate cities and states. Many challenges remain, however, and the FBI data series is not yet a sufficient source of information for evaluating the impact of city-specific—and especially neighborhood-specific—crime initiatives. The weaknesses of FBI data reflect the heterogeneity of U.S. justice systems. Offense definitions vary across states and sometimes within states. Arrest practices vary. Cities make different choices about how to count crimes and when to consider a crime to be resolved by arrest. City boundaries vary. Some cities include extensive suburban or semi-rural areas with low-density populations; others are highly dense with large low-income populations. Due to many variations in police department reporting practices, the FBI advises researchers not to rank cities with its data.
For all these reasons, cities must expand the type of information they use to track the incidence of youth violence across neighborhoods. One alternative is the information that hospitals maintain about patients with violent injuries, often with exact street addresses. Another alternative could be data about violent incidents reported by schools. In both healthcare and education, governments have been improving the availability of data. In these systems as well, however, variations in policy and practice tend to complicate cross-jurisdictional comparisons. When a violence prevention initiative is implemented in multiple locations, it is still very difficult to answer the simple question: “Did it work?”
Existing Models for Improved Data
The United States does not have the capacity to track violence—much less “youth violence”—across jurisdictions and over time at a sufficient level of geographic detail to inform rigorous evaluations of violence prevention strategies. There are, however, a number of programs and data initiatives that may serve as models for future efforts to develop such indicators.
Morbidity and Mortality Weekly Report (MMWR)
The Morbidity and Mortality Weekly Report (MMWR) is a series of brief research papers reporting findings from data submitted to the Centers for Disease Control and Prevention (CDC) by all 50 state health departments and the District of Columbia. Unfortunately, many violence indicators disseminated through the MMWR are not available at the state level, and the series provides even less data at the local level. CDC publications, however, are still an excellent model for a readily accessible system of nationally distributed data on violence prevention. The purpose of the MMWR series is to disseminate timely public health information along with recommendations for how the information should be used. The reports are considered provisional; they change as more information is received and revised.
A 2015 MMWR, for example, discussed youth violence prevention using several different CDC data sources. David-Ferdon et al. (2015) used the CDC’s Web-based Injury Statistics Query and Reporting System (WISQARS) to report the total number of homicide victims (4,481) between the ages of 10 and 24 in a single year. Unintentional injury, suicide, and homicide were the top three causes of death among this age group. The report estimated that 547,260 youths between the ages of 10 and 24 were treated for nonfatal physical assault-related injuries in U.S. emergency departments. Using a tool that measures the cost of injuries, the authors estimated that the combined medical and lost-productivity costs resulting from youth homicides and nonfatal physical assault-related injuries total $19.5 billion per year. Other MMWR publications, such as the Recommendations and Reports series, provide detailed discussions concerning policy, prevention, and treatment. MMWR Supplements provide additional information that may not meet the content and format requirements of the regular MMWR series (e.g., proceedings from national conferences, descriptions of historic events). The Summary of Notifiable Diseases presents official statistics submitted to the National Notifiable Diseases Surveillance System (NNDSS) by state and territorial health departments.
Academic Centers of Excellence in Youth Violence Prevention (ACE)
In 2000, the Division of Violence Prevention at the CDC established Academic Centers of Excellence in Youth Violence Prevention (ACE). The information collected in ACE studies does not constitute a national or even a consistent multi-jurisdictional effort to track youth violence, but the Centers are a good example of how public health and criminal justice approaches can be used in tandem for tracking and preventing youth violence.
The ACE program partnered with high-risk communities and local health departments to implement and evaluate a number of strategies. Each of the Centers focused on an individual community, but their efforts were coordinated and standardized whenever possible. During the first ten years, the Centers focused on developing multidisciplinary and community partnerships, establishing surveillance systems at the local level, building capacity in local communities, researching risk and protective factors for youth violence, training the ACE workforce, developing and evaluating various youth prevention programs, and informing larger policy initiatives.
Six of the ACE Centers were funded to reduce youth violence through the implementation and evaluation of existing strategies: Johns Hopkins University in Baltimore, University of Chicago, University of Colorado, University of Michigan, University of North Carolina, and Virginia Commonwealth University. As in the National Forum, however, communities were not required to use identical strategies or to collect comparable data. Each team designed and assessed the effectiveness of its own approach.
STRYVE
In 1999, the White House convened a Council on Youth Violence to coordinate youth violence prevention efforts across federal agencies. In 2004, the Council and other federal partners developed the National Youth Violence Prevention Resource Center (NYVPRC) to provide public, web-based access to information about youth violence in user-friendly formats. The NYVPRC was later expanded to provide information and tools that could be used within communities to implement prevention strategies. These efforts evolved into the STRYVE national initiative (Striving To Reduce Youth Violence Everywhere), which provides communities with frameworks and tools to prevent youth violence.
STRYVE focuses on the use of scientific evidence, program expertise, and collaborations to prevent violence before it occurs. Various professional sectors and disciplines are brought together to share the responsibility of youth violence prevention. This extends beyond criminal justice and public health to include education, labor, and social services. Recognizing that communities experience violence differently, STRYVE seeks to build the capacity of health departments, government agencies, and community-based organizations by providing tools and information that can be used for the specific needs of individual communities. STRYVE also focuses on preventing violence comprehensively at the individual level (e.g., strengthening interpersonal skills) and societal level (e.g., changing the perception that violence is normal), as well as through positive role models and safer communities. Other components of STRYVE focus on building partnerships to prevent youth violence and to raise awareness by providing information on successful approaches that can be replicated. This includes providing training, resources (e.g., research, educational materials, and databases), and tools that can be customized to individual communities.
STRYVE helps communities determine which practices may work best and the CDC provides suggestions on how to use data to plan and implement youth violence interventions. The first step is defining the boundaries of the community where prevention efforts will take place using zip codes, neighborhood blocks, and school catchment areas. This leads to more focused compilations of data at the neighborhood level and helps to indicate where youth violence is occurring. STRYVE provides local jurisdictions with direct access to resources and tools that can be used to examine neighborhood characteristics, risk factors, and protective factors. Local officials may learn how to obtain key measures from the U.S. Census, including the percentage of families living below the poverty line, a community’s unemployment rate, and protective factors such as commitment to school, educational attainment, and school enrollment. A wide range of measures already available by census tract can be used in tandem with justice data to identify neighborhood-level factors associated with the effectiveness of violence reduction efforts.
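As one illustration, the sketch below pulls two such tract-level measures from the Census Bureau’s American Community Survey API. The variable codes (B17001_002E for persons below the poverty line; B23025_005E for unemployed civilians) and the geography codes (state 22, county 071, i.e., Orleans Parish, Louisiana) are assumptions to verify against the Census API documentation.

```python
# Minimal sketch: tract-level poverty and unemployment counts from the
# American Community Survey API. Variable and geography codes are
# assumptions to verify against Census documentation.
import requests

URL = "https://api.census.gov/data/2015/acs/acs5"
params = {
    "get": "NAME,B17001_002E,B23025_005E",  # below poverty; unemployed
    "for": "tract:*",
    "in": "state:22 county:071",            # Orleans Parish, Louisiana
}
rows = requests.get(URL, params=params, timeout=60).json()

# First row is the header; remaining rows are one record per tract.
header, data = rows[0], rows[1:]
for name, poverty, unemployed, *codes in data[:5]:
    print(name, poverty, unemployed)
```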
WISQARS
The CDC’s WISQARS tool provides several methods of tracking and measuring changes in youth violence. An online database provides an interactive format for users to search data about violent deaths, fatal and non-fatal injuries, and the costs of injuries. Detailed information is available about intent (e.g., homicide or accident) and the relationship between victims and perpetrators, but WISQARS does not provide demographics about perpetrators. The WISQARS platform provides only state-level data about violent deaths and it does not include all 50 states.
Data are provided by a variety of sources. For example, the National Violent Death Reporting System (NVDRS) was created to track changes in violence in thirty-two states. Data are assembled from vital statistics records, law enforcement, crime labs, medical examiners, and coroners. Mortality data may be filtered by intent, including unintentional, homicide, legal intervention, suicide, undetermined intent, and homicide followed by suicide. Researchers may select all categories or specify groups. Violent deaths may also be searched by the type of mechanism (cause) of injury, including firearm, cut/pierce/stab, struck by/against, fall, drowning, fire, motor vehicle, hanging/suffocation, poisoning, or other. Users of WISQARS may also search within firearm type (e.g., handgun, shotgun, rifle, other firearm, or combination).
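For researchers working with an extract of such records, the same filters reduce to simple subsetting. The sketch below is hypothetical: the column names (intent, mechanism, age) mirror the WISQARS query categories but do not represent an official file layout.

```python
# Hypothetical NVDRS-style extract; column names mirror WISQARS filter
# categories but are not an official layout.
import pandas as pd

deaths = pd.read_csv("violent_deaths.csv")

# Youth firearm homicides: the intent/mechanism/age combination most
# relevant to the initiatives discussed in this report.
youth_firearm_homicides = deaths[
    (deaths["intent"] == "homicide")
    & (deaths["mechanism"] == "firearm")
    & deaths["age"].between(10, 24)
]
print(f"{len(youth_firearm_homicides)} youth firearm homicides in extract")
```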
WISQARS also provides data about fatal injuries, including the total number of deaths according to intent (e.g., unintentional, homicide, suicide), cause of injury (e.g., firearm, cut/pierce), and the race/ethnicity, sex, and age of the injured person. Fatal injury data are provided at the national, regional (i.e., northeast, south, midwest, west), and state level since 1999. Some data are drawn from the National Vital Statistics System (NVSS), which is the oldest inter-governmental data sharing system in the country. WISQARS provides national data about nonfatal injuries, which would be helpful in adjusting state and local-level estimates for all forms of violence. Finally, researchers use WISQARS to estimate the total (national) costs associated with violent deaths, other fatal injuries, and nonfatal injuries treated in hospitals or emergency departments.
Youth Risk Behavior Surveillance System
Six types of health-risk behaviors are monitored by the Youth Risk Behavior Surveillance System (YRBSS) because they are considered contributors to the leading causes of death and disability. They include behaviors that contribute to unintentional injuries and violence, sexual behaviors that may result in unintended pregnancy or sexually transmitted infections, substance use (including alcohol), tobacco use, unhealthy dietary behaviors, and inadequate physical activity. The first category, of course, is of greatest interest to youth violence prevention practitioners and researchers (Brener et al. 2013).
Information for the YRBSS is collected through surveys. Ongoing surveys are conducted at the national, state, tribal, and large urban school district level. For example, the Youth Risk Behavior Survey (YRBS) provides data about a representative sample of U.S. high school students (grades 9 through 12). One-time national surveys also contribute data to the YRBSS, such as the Youth Risk Behavior Supplement (an addition to the 1992 National Health Interview Survey of 12 to 21 year olds), the National College Risk Behavior Survey (undergraduates), the National Alternative High School Youth Risk Behavior Survey (9th to 12th graders in alternative high schools), and the National Youth Physical Activity and Nutrition Study (grades 9 through 12). Other, special-population surveys are conducted periodically as short-term federal initiatives are established (Brener et al. 2013).
The CDC itself has noted key limitations of the YRBSS; three are particularly relevant to tracking changes in youth violence. First, not all 50 states participate in collecting state-level data. Second, the focus of the YRBSS is on health and risk behaviors that lead to morbidity and mortality, rather than on all types of violence. Third, data collected from national, state, territorial, tribal, and large urban school districts are not representative of all school-aged youths because data are collected only from youths currently attending school (Brener et al. 2013).
School Crime and Safety
The Bureau of Justice Statistics and the National Center for Education Statistics produce a series of reports examining indicators of school crime and safety. The purpose of the Indicators of School Crime and Safety series is to provide accurate information for the development of effective violence and school crime prevention programs. Indicators are drawn from a wide variety of school, academic, criminal justice, and public health data sources that may be used to track youth violence in school settings.
Indicators are set out in categories: violent death; nonfatal student and teacher victimization; school environment; fights, weapons, and illegal substances; fear and avoidance; discipline, safety, and security measures; and postsecondary campus safety and security (Robers et al. 2015). Each indicator is analyzed using varying methods of sampling and data collection. The resulting reports also summarize findings from ten key datasets: the School-Associated Violent Deaths Study, the Supplementary Homicide Reports, the Web-based Injury Statistics Query and Reporting System Fatal (WISQARS Fatal), the National Crime Victimization Survey, the School Crime Supplement to the National Crime Victimization Survey, the Youth Risk Behavior Surveillance System, the Schools and Staffing Survey, the School Survey on Crime and Safety, the Campus Safety and Security Survey, and EdFacts (Robers et al. 2015).
Limitations of Existing Resources
Various data resources are used to monitor the incidence and implications of youth violence, but all have serious limitations. Many of the resources described here are available only as publications and users cannot access the original data. Most of the resources provide information only at the national or state level, which offers no help for communities tracking the effects of violence prevention initiatives in specific neighborhoods. Few data resources examine youth violence specifically and most focus on general data about health and criminal justice for the entire population. Most importantly, very few data sources pinpoint the times and locations of violent acts.
Most data used to track youth violence are collected from surveys or administrative sources (e.g., police departments, courts, hospitals, medical examiners). Administrative data are convenient, but they often fail to include detailed information about the context of violence and the characteristics of victims and offenders. Police records usually provide information on offenders; health records tend to focus on victims. Official records never include data about unreported crimes (i.e., the “dark figure” of crime) (Lynch 2013; Piquero, Schubert and Brame 2014).
Many non-police data sources are tied to the provision of services by healthcare and social service agencies. This may degrade the quality of the reported data (and introduce bias from missing data) because data collection is not the primary mission of provider agencies and only some violence requires services. Furthermore, service providers are not always objective reporters because they have a vested interest in how indicators reflect on their organizations or on their ability to recoup the costs of their services (Lynch 2013).
Victimization surveys are unlikely to become standard tools for monitoring youth violence. Victimization data were developed in part as an effort to address the well-known problems with police data (Lynch 2013; Piquero et al. 2014), but conducting household surveys is increasingly expensive—especially for small areas and neighborhoods. Telephone surveys are becoming more difficult as well with the rising number of households that lack landline phones. Victimization surveys are also forced to define some crimes in different ways for different areas, which affects accuracy (Lynch 2013).
A critical limitation of many systems for tracking youth violence is the lack of detail about age. Criminal justice records often provide the exact age of offenders and victims, but many data sources do not. Health resources and school surveys typically report age in ranges (ages 10 to 14, 15 to 19, etc.). Other resources provide data that cannot be separated into the age ranges necessary to examine “juvenile” violence, the definition of which varies from state to state.
Finally, the time frames for many data resources are insufficient. Some provide data only on an annual basis. More importantly, few data sets provide information at the local or neighborhood level. This is a fatal flaw for communities trying to design and evaluate violence prevention strategies.
Conclusions and Recommendations
The efforts inspired by the National Forum on Youth Violence Prevention may have led to important innovations at the local level. The collaborative networks in National Forum cities reported strong commitments to the initiative and high degrees of confidence in their prevention strategies. All the cities faced difficulties, however, in measuring changes in youth violence. Local police data and health data were sometimes accessible, but not in sufficient detail to support evaluation. None of the cities involved in the National Forum tracked youth violence trends in a way that would have allowed researchers to evaluate the effects of the initiative over time and in specific neighborhoods.
Cities committed to preventing and reducing youth violence must have access to detailed information about youth violence in small geographic areas and over significant periods of time. Some of the data series described here could serve as models and perhaps inspire new initiatives to integrate different types and sources of data for assessing the effects of violence reduction programs.
Policymakers and practitioners at the federal, state, and local levels should continue to collaborate with researchers to craft new approaches so that cities may target future violence interventions more effectively. To evaluate youth violence prevention initiatives at the neighborhood level and in varying jurisdictions, researchers should pursue the following goals and strategies.
- Before launching any multi-jurisdictional initiative, researchers should collaborate with local officials to convene a cross-disciplinary data committee charged with identifying the strongest possible set of identical violence indicators that can be generated by all jurisdictions at the neighborhood level.
- Researchers must be able to measure violence at the smallest sub-city level of geography (i.e., census tracts, street/block segments, or fully geocoded X/Y coordinates). City-wide indicators are not sufficient (see the aggregation sketch after this list).
- Violence indicators must be coded for time—ideally the day and hour of the incident but month at the very least. Annual indicators are not sufficient.
- Whenever possible, violence indicators should measure all violent acts and not merely those reported to law enforcement. Where feasible, data from the healthcare, housing, social services and education sectors should be used to complement law enforcement data.
- Surveys may be used to complement administrative data, but they should be professionally designed and administered, paying special attention to sampling frames and representativeness. Surveys may focus on crime victimization or citizen perceptions of justice and safety.
- When comparing violence across neighborhoods, the area with the least amount of data and the least detailed data should determine the core analytic approach. In other words, one common set of indicators must be available across all areas subject to comparison. For special projects, researchers may capitalize on the more complex or detailed measures available in a subset of areas, but the primary evaluation questions must be answered using measures common to all areas.
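A minimal sketch of this common-denominator approach appears below, using the geopandas library to reduce each city’s geocoded incidents to one shared indicator: violent incidents per census tract per month. All file and column names are hypothetical.

```python
# Minimal sketch: aggregate geocoded incidents to tract-month counts.
# File and column names are hypothetical.
import geopandas as gpd
import pandas as pd

incidents = pd.read_csv("incidents.csv", parse_dates=["date"])
points = gpd.GeoDataFrame(
    incidents,
    geometry=gpd.points_from_xy(incidents["longitude"], incidents["latitude"]),
    crs="EPSG:4326",
)
tracts = gpd.read_file("census_tracts.shp").to_crs("EPSG:4326")

# Assign each incident to the tract containing it, then count incidents
# by tract and month, the indicator available in every comparison area.
joined = gpd.sjoin(points, tracts[["GEOID", "geometry"]], predicate="within")
joined["month"] = joined["date"].dt.to_period("M")
tract_month = joined.groupby(["GEOID", "month"]).size().rename("incidents")
print(tract_month.head())
```

Because every area can produce this same table, a multi-city evaluation can proceed on common ground even when some cities also hold richer data.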
References
Brener, Nancy D., Laura Kann, Shari Shanklin, Steve Kinchen, Danice K. Eaton, Joseph Hawkins and Katherine Flint (2013). Methodology of the Youth Risk Behavior Surveillance System-2013. Morbidity and Mortality Weekly Report: March 1, 2013. Recommendations and Reports, Volume 62, Number 1. Atlanta, GA: Centers for Disease Control and Prevention, U.S. Department of Health and Human Services.
Butts, Jeffrey A., Caterina Roman, and Kathleen A. Tomberg (2012). Teaming up for Safer Cities. A Report from the Implementation Assessment of the National Forum on Youth Violence Prevention. New York, NY: Research and Evaluation Center, John Jay College of Criminal Justice, City University of New York.
Butts, Jeffrey A., Kathleen Tomberg, Douglas Evans, Rhoda Ramdeen, Caterina Roman, and Caitlin Taylor (2012). Interim Report 1: National Forum on Youth Violence Prevention. New York, NY: Research and Evaluation Center, John Jay College of Criminal Justice, City University of New York.
Consolidated and Further Continuing Appropriations Act, 2012 (2011). Public Law No. 112-55, 125 Stat. 617. Washington, DC: 112th Congress of the United States.
Consolidated and Further Continuing Appropriations Act, 2013 (2013). Public Law No. 113-6. Washington, DC: 113th Congress of the United States.
Consolidated Appropriations Act, 2014 (2014). Public Law No. 113-76. Washington, DC: 113th Congress of the United States.
Consolidated and Further Continuing Appropriations Act, 2015 (2014). Public Law No. 113-235. Washington, DC: 113th Congress of the United States.
David-Ferdon, Corrine, Thomas R. Simon, Howard Spivak, Deborah Gorman-Smith, Sheila B. Savannah, Robert L. Listenbee and John Iskander (2015). CDC Grand Rounds: Preventing Youth Violence. Morbidity and Mortality Weekly Report, 64(07). Atlanta, GA: Centers for Disease Control and Prevention, U.S. Department of Health and Human Services.
Department of Justice (2011). FY 2012 Performance Budget. Office of Justice Programs. Washington, DC: U.S. Department of Justice.
Department of Justice and Department of Education (October 5, 2010). Federal Agencies Host National Forum on Youth Violence. Press Release. Washington, DC: U.S. Department of Justice and U.S. Department of Education.
Department of Justice (2012). OJJDP FY 2012 National Forum on Youth Violence Prevention Expansion Project. OMB No. 1121-0329, OJJDP-2012-3314. Washington, DC: U.S. Department of Justice, Office of Juvenile Justice and Delinquency Prevention.
Fay, Robert E. and Mamadou Diallo (2015). Developmental Estimates of Subnational Crime Rates Based on the National Crime Victimization Survey. Washington, DC: Bureau of Justice Statistics, U.S. Department of Justice [NCJ 249238].
FBI, Federal Bureau of Investigation (2016). Crime in the United States 2015. Washington, DC: U.S. Department of Justice.
Langton, Lynn, Marcus Berzofsky, Christopher Krebs and Hope Smiley-McDonald (2012). Victimizations Not Reported to the Police, 2006-2010. Special Report: National Crime Victimization Survey. Washington, DC: Bureau of Justice Statistics, U.S. Department of Justice [NCJ 238536].
Lynch, James (2013). The Evolving Role of Self-report Surveys of Criminal Victimization in a System of Statistics on Crime and the Administration of Justice. Proceedings of the 59th ISI World Statistics Congress, 25-30 August 2013, Hong Kong.
Office of Management and Budget (2011). Budget of the U.S. Government Appendix, Fiscal Year 2012. Washington, DC: Author.
Office of Juvenile Justice & Delinquency Prevention (2010). Get SMART- Mapping Resources for Crime and Delinquency Prevention. Washington, DC: Office of Juvenile Justice & Delinquency Prevention, U.S. Department of Justice [NCJ 222504].
Piquero, Alex R., Carol A. Schubert and Robert Brame (2014). Comparing Official and Self-report Records of Offending Across Gender and Race/ethnicity in a Longitudinal Study of Serious Youthful Offenders. Journal of Research in Crime & Delinquency, 51(4), 526-556.
Pollock, Wendi, Scott Menard, Delbert S. Elliott and David H. Huizinga (2015). It’s Official: Predictors of Self-reported vs. Officially Recorded Arrests. Journal of Criminal Justice, 43(1), 69-79.
Robers, Simone, Anlan Zhang, Rachel E. Morgan and Lauren Musu-Gillette (2015). Indicators of School Crime and Safety: 2014. (NCES 2015-072/NCJ 248036). National Center for Education Statistics: U.S. Department of Education, Bureau of Justice Statistics: Office of Justice Programs, U.S. Department of Justice. Washington, D.C.
Senate Report 112-78 (2011). Departments of Commerce and Justice, and Science, and Related Agencies Appropriations Bill, 2012. Washington, DC: 112th Congress of the United States.
Tashea, Jason (2016). Emerging Justice Technologies and the Need for Evaluation. New York, NY: Research and Evaluation Center, John Jay College of Criminal Justice, City University of New York.
Tomberg, Kathleen and Jeffrey A. Butts (2016). Durable Collaborations: The National Forum on Youth Violence Prevention. New York, NY: Research and Evaluation Center, John Jay College of Criminal Justice, City University of New York.
Weisburd, David and Cody W. Telep (2014). Police and the Microgeography of Crime: Scientific Evaluations on the Effectiveness of Hot Spots and Places [Technical Note IDB-TN-630]. Washington, DC: Inter-American Development Bank.
________________________
Copyright by the Research and Evaluation Center at John Jay College of Criminal Justice, City University of New York (CUNY), and the Office of Juvenile Justice and Delinquency Prevention (OJJDP), U.S. Department of Justice.
John Jay College of Criminal Justice Research and Evaluation Center
524 West 59th Street, Suite 605BMW
New York, NY 10019
www.JohnJayREC.nyc
About the Authors
Jeffrey A. Butts is director of the Research and Evaluation Center at John Jay College of Criminal Justice, City University of New York (CUNY). Previously, he was a research fellow with Chapin Hall at the University of Chicago, director of the Program on Youth Justice at the Urban Institute in Washington, DC, and senior research associate at the National Center for Juvenile Justice. He earned the Ph.D. from the University of Michigan.
Alana Henninger is a research criminologist with the Center for Justice, Safety & Resilience at RTI International. Before joining RTI, she was a researcher with ICF International and Johns Hopkins University. She graduated with honors in psychology from York College of Pennsylvania and earned the Ph.D. from John Jay College and the CUNY Graduate Center.
Recommended Citation
Butts, Jeffrey A. and Alana M. Henninger (2017). Local Measures: The Need for Neighborhood-Level Data in Youth Violence Prevention Initiatives. New York, NY: Research and Evaluation Center, John Jay College of Criminal Justice, City University of New York.