[Shot Detection Systems]

An analysis of ShotSpotter in Detroit, Michigan

 

Mission

The goal of this analysis is to examine the effectiveness of ShotSpotter in crime prevention.

 

Introduction

Despite its growing popularity, ShotSpotter, a gunfire detection system aimed at reducing crime and making communities safer, has faced heavy criticism in recent years. For example, Green Light Black Futures (GLBF), a Black-centered coalition dedicated to fighting the use of hyper-surveillance, over-policing, and facial recognition technology in Detroit, asked residents in a survey, "What does safety mean to you?" The answers varied greatly, but one response stood out: not surveillance. Instead, residents offered many suggestions for making their neighborhoods safer while avoiding what they feel is an invasion of privacy by police. These ideas included building city infrastructure that helps foster interpersonal relationships, placing green chairs (instead of Green Lights) in local parks to encourage people to sit outside and interact with fellow citizens, and dedicating more city funding to housing, education, healthcare, employment, and streetlights. Residents agree that "safety" should be defined not only in terms of "crime" but in other ways as well. Despite these concerns, the Detroit Police Department (DPD) and City Council consider these surveillance systems highly successful. This report provides an analysis of those "success" rates. While ShotSpotter has been used in Detroit since 2015, it did not receive much attention until DPD brought a new funding proposal to City Council in 2020. The project focuses on the impacts of the surveillance devices since their expanded implementation in early 2021.

 

To learn more about ShotSpotter, please visit the developer's website.

 

Background

For more details on the reports mentioned in this section, please visit the Data and Resources pages.

 

Methodology

The full data set on 9-1-1 calls can be found on the Data page.

To properly analyze the impact of ShotSpotter within the city of Detroit, data collection focused solely on 9-1-1 calls to DPD. Since ShotSpotter has only been installed for roughly two years, the research has significant limits. The bulk of the data for this analysis comes from the City of Detroit Open Data file titled "911 Calls For Service." Dating back to 2016, the table and map list all 9-1-1 police emergency response and officer-initiated calls for service in Detroit.

Prior to any filtering, the dataset contains over 5.2 million cases. To view only calls related to shots fired since the implementation of ShotSpotter, the data were filtered to include only calls under the codes "Shots IP," "Shots JH," and "Shot Spt" from 2019 to 2021. These results are broken down by year and by council district. Although ShotSpotter has only been installed in two precincts, all seven council districts contained at least one call coded "Shot Spt." It could not be determined whether this occurred due to an error in data entry or whether another explanation exists.
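The filtering and breakdown described above can be sketched as follows. This is a minimal illustration, not the code used for the report (which was run in RStudio); the column names `call_code`, `year`, and `council_district` are assumptions, not the actual schema of the "911 Calls For Service" dataset.

```python
# Sketch of the filtering step described above. Column names
# ("call_code", "year", "council_district") are assumed, not the
# actual schema of the "911 Calls For Service" dataset.
import pandas as pd

SHOT_CODES = {"Shots IP", "Shots JH", "Shot Spt"}

def filter_shot_calls(calls: pd.DataFrame) -> pd.DataFrame:
    """Keep only shot-related calls from 2019 through 2021."""
    mask = calls["call_code"].isin(SHOT_CODES) & calls["year"].between(2019, 2021)
    return calls.loc[mask]

def calls_by_year_and_district(calls: pd.DataFrame) -> pd.DataFrame:
    """Break the filtered calls down by year and council district."""
    return (filter_shot_calls(calls)
            .groupby(["year", "council_district"])
            .size()
            .rename("n_calls")
            .reset_index())
```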

Furthermore, following data cleaning, the nature of the variables in the data should be noted: many are categorical, which complicates the analysis further. That said, the main focus is DPD's response time to incidents resulting in 9-1-1 calls. The following sections detail the analysis separated by both district and call type. Both analysis of variance (ANOVA) tests and two-sample t-tests are carried out.

Data

A link to the data used for this section can be found on the Data page, and a link to interactive maps can be found on the Resources page.

Figure 1

Figure 2

Figure 3

The current population for each council district is as follows:

District 1: 101,183

District 2: 100,723

District 3: 83,017

District 4: 75,840

District 5: 88,587

District 6: 93,893

District 7: 95,906

Dividing the number of calls per district by its respective population, the volume of calls is calculated:

District 1: 0.064 calls/person

District 2: 0.056 calls/person

District 3: 0.093 calls/person

District 4: 0.121 calls/person

District 5: 0.035 calls/person

District 6: 0.034 calls/person

District 7: 0.051 calls/person
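The per-capita figures above follow from a simple division. Here is a minimal sketch using the populations listed above; the call counts passed in are hypothetical placeholders, since the report lists only the resulting ratios.

```python
# Per-capita call volume: shot-related calls in a district divided by
# that district's population. Populations are from the text; any call
# counts supplied are hypothetical examples.
POPULATION = {1: 101_183, 2: 100_723, 3: 83_017, 4: 75_840,
              5: 88_587, 6: 93_893, 7: 95_906}

def call_volume(calls_per_district: dict[int, int]) -> dict[int, float]:
    """Return calls per person for each district, rounded to 3 places."""
    return {d: round(n / POPULATION[d], 3)
            for d, n in calls_per_district.items()}
```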

The graph below gives an overview of the total response time for calls in each category.

Figure 4

The next graph gives an overview of the total response time to a call in each district.

Figure 5

Additionally, a comparison between response times for each call type in each district is shown in the tables below.

Figure 6

Figure 7

An analysis of variance test is conducted to determine whether the difference in response time between districts is significant or due to chance. The output is displayed below.

Figure 8

Several two-sample t-tests are carried out to determine the difference in response time between each pair of districts. 95% confidence intervals for each are as follows:

Difference between District 1 and District 2: (-1.302, 2.657)

Difference between District 1 and District 3: (-0.805, 2.817)

Difference between District 1 and District 4: (0.895, 4.211)

Difference between District 1 and District 5: (5.585, 8.735)

Difference between District 1 and District 6: (5.165, 8.413)

Difference between District 1 and District 7: (2.894, 6.186)

Difference between District 2 and District 3: (-1.449, 2.104)

Difference between District 2 and District 4: (0.254, 3.497)

Difference between District 2 and District 5: (4.946, 8.018)

Difference between District 2 and District 6: (4.525, 7.698)

Difference between District 2 and District 7: (2.254, 5.472)

Difference between District 3 and District 4: (0.139, 2.957)

Difference between District 3 and District 5: (4.845, 7.465)

Difference between District 3 and District 6: (4.416, 7.153)

Difference between District 3 and District 7: (2.140, 4.931)

Difference between District 4 and District 5: (3.516, 5.697)

Difference between District 4 and District 6: (4.416, 7.153)

Difference between District 4 and District 7: (2.140, 4.931)

Difference between District 5 and District 6: (-1.409, 0.667)

Difference between District 5 and District 7: (-3.692, -1.547)

Difference between District 6 and District 7: (-1.111, 1.111)
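The tests behind these intervals can be sketched in Python as follows (the report itself used RStudio). The one-way ANOVA checks whether mean response time differs across districts, and the Welch two-sample confidence interval mirrors the pairwise intervals listed above. The function names and input shapes are illustrative assumptions; the inputs would be the filtered per-district response-time samples.

```python
# Sketch of the significance tests above: one-way ANOVA across districts,
# then a Welch two-sample 95% CI for the difference in mean response time.
# Names and input shapes are illustrative, not the report's actual code.
import math
import statistics
from scipy import stats

def anova_pvalue(samples_by_district: dict[int, list[float]]) -> float:
    """One-way ANOVA p-value for equal mean response times across districts."""
    return stats.f_oneway(*samples_by_district.values()).pvalue

def welch_ci(x: list[float], y: list[float], level: float = 0.95):
    """Welch confidence interval for mean(x) - mean(y)."""
    nx, ny = len(x), len(y)
    mx, my = statistics.fmean(x), statistics.fmean(y)
    vx, vy = statistics.variance(x), statistics.variance(y)
    se = math.sqrt(vx / nx + vy / ny)
    # Welch-Satterthwaite approximation to the degrees of freedom
    df = (vx / nx + vy / ny) ** 2 / (
        (vx / nx) ** 2 / (nx - 1) + (vy / ny) ** 2 / (ny - 1)
    )
    t = stats.t.ppf(0.5 + level / 2, df)
    return (mx - my) - t * se, (mx - my) + t * se
```

An interval that excludes zero, such as (5.585, 8.735) for districts 1 and 5, indicates a statistically significant difference in mean response time between the two districts.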

 

Results

Based on Figure 1, which lists the number of calls per year for each category, it is important to note that 2020 had the highest overall number of 9-1-1 calls related to shots fired. This is noteworthy because many believe that crime decreased during this period due to the pandemic and lockdowns. While that may still be true for crime overall, it does not appear to be the case for incidents related to gunfire.

Next, in Figure 2, which shows the breakdown of all calls by year and district, it is interesting to see that all seven districts have at least one ShotSpotter call, even though the system is only installed in two areas. It could not be determined whether this information is accurate or a data-collection error. Also seen in these tables, district 4 had the highest number of calls over the three-year period, while district 5 had the lowest. In districts one through four, Shots JH had the fewest calls. In district four, ShotSpotter accounted for the highest number of 9-1-1 calls, while in districts one, two, and three, Shots IP accounted for the highest number.

Figure 3, depicting all shot-related 9-1-1 calls by district, is rather striking, as it shows how widespread gunfire is throughout the city of Detroit. Whether ShotSpotter is in place or not, gunfire occurs frequently, as can be seen from the limited amount of white space on the map. Note that the gaps in the middle of the map are Hamtramck and Highland Park, which are not governed by Detroit's City Council and have their own police departments rather than DPD.

Based on the calculated volume of calls per district, as expected, district 4 has the highest volume of calls. While district 6 has the lowest volume of calls, district 5 is very close to this number. Given the above graphs for the number of calls per district, these numbers make sense.

Figures 4 and 5 show the total response time for a shot-related call by category and by council district, respectively. Both graphs show a negative response time for some elements. Although it could not be determined what a negative time means in this context, the points were left in to remain faithful to the data. It is also important to notice that there appears to be considerable variability across both call category and council district. The variability in response time for each district is analyzed further in the ANOVA and two-sample t-tests. Figure 7 shows there is variability in response time for each category of call within each district as well. However, this may be due to the different priorities of each category of call. For example, Shots JH has priority 3, while Shots IP and ShotSpotter have priority 1. This means that Shots IP and ShotSpotter should typically have quicker response times than Shots JH.

To determine whether the difference in response time by district is statistically significant or due to chance, an ANOVA test is conducted. The output from RStudio is shown in Figure 8. Here, since the p-value is very small (< 2.2e-16), we reject the null hypothesis that mean total response time is the same across all districts in favor of the alternative hypothesis that at least one district's mean differs. To then determine which districts differ in total response time, several two-sample t-tests are carried out comparing each district to the others.

The 95% confidence intervals represent the difference in response time between the first district and the second. For example, the confidence interval for district 1 and district 2 is (-1.302, 2.657). This means that we are 95% confident that the true difference between total response time in district 1 and total response time in district 2 is between -1.302 and 2.657 minutes. While it would not normally make sense for time to be negative, in this context, since the original data included negative values of time, a negative difference can be logical. While this interpretation is not provided for every confidence interval listed above, it is important to note that some intervals sit much further from zero and show a much greater difference in response time between the given districts. One example is the confidence interval for district 1 and district 5, which is (5.585, 8.735). Here, we are 95% confident that the true difference between total response time in district 1 and total response time in district 5 is between 5.585 and 8.735 minutes. As stated, this is a much greater difference than that between district 1 and district 2. Along these lines, there seems to be a pattern in these data: districts with ShotSpotter already installed appear to have lower overall response times (including for calls that are not ShotSpotter) than districts without ShotSpotter. This can be seen in the confidence interval between district 1, which has ShotSpotter, and district 5, which does not. The ANOVA and t-test results are consistent with this pattern.

 

Discussion

Due to the "success" of ShotSpotter, DPD will likely continue to expand the system throughout the city of Detroit. According to DPD's findings, nearly 90% of shot-related crimes go unreported to police without the surveillance systems. Additionally, response times for ShotSpotter calls are significantly lower than response times for other 9-1-1 calls, which, in DPD's view, shows the need for the system. For those districts with ShotSpotter already in place, overall response times to shot-related calls are quicker than in districts without ShotSpotter.

 

Conclusion and Further Work

While this report provides significant insight into the current success rates of ShotSpotter in Detroit, it is important to note that, as with all social justice issues, the data are constantly changing. This analysis, written in April 2023, may not remain accurate as the implementation of ShotSpotter continues and expands.

Additionally, future work should consider analyzing data on shots fired prior to the installation of ShotSpotter to serve as a comparison. This report did not consider such data due to time constraints and the inability to access the information publicly. Shortly after the publication of this project, DPD implemented the new ShotSpotter systems (from the proposed $7 million contract). While these new data points were not considered in this report, the analysis may be useful as a base for future work using the new systems. It would also be important to determine what "Shots IP" and "Shots JH" mean in common terms. Clearly, some difference exists, as Shots IP, like ShotSpotter, are priority-one calls, while Shots JH are priority-three calls. This may explain the large gap in response times between these types of calls. Finally, it may be of interest to discover the meaning of the negative response-time values in the dataset. Obviously, DPD cannot arrive on scene in a negative amount of time. However, due to a lack of knowledge on the matter, these values were left alone and included in the analysis.

 

References

References for this analysis can be found on the Data and Resources pages.