Learning from the Dallas Crime Stat Quality Crisis

Grits for Breakfast points out that Dallas is still in hot water over the accuracy of their crime stats. An editorial in the Dallas Morning News gets at the fundamental problem:

No doubt [Police Chief David] Kunkle had good intentions when he urged the department to alter its reporting procedures to make the city’s crime-reporting procedures more accurate. However, several criminal justice experts say the change appears to violate the FBI’s standardized, nationwide classification systems and may also have inflated the improvement in the overall crime rate.

Kunkle, who will retire next year, brought strong leadership to the Dallas Police Department and solid progress in fighting crime. The city’s crime trend is moving in the proper direction.

Nonetheless, doubt has been cast over the integrity of the crime stats, and an independent third-party review is the best way to clear up these questions. Citizens can then have confidence in the city’s crime-counting policy – or learn where reform is needed.

Police departments adopt the procedures by which they self-report crime stats to the FBI. This is the case in Austin as well. Police chiefs want the data to reflect actual underlying crime, both to avoid unnecessary public fear and to allocate policing resources optimally. Unfortunately, in an attempt to filter out noise, procedures can go too far and eliminate actual signal. A lot of police departments are going to come under fire for their procedures as the Gov 2.0 movement asks for more public safety data. Austin should be proactive and tackle this problem by setting up continuous, detailed independent review.
