New Haven Police Department to Test AI Software for Writing Police Reports

The New Haven Police Department is preparing to launch a pilot program that uses artificial intelligence to draft police reports, with the goal of freeing officers to spend more time in the field.

Christian Bruckhart, the public information officer for the New Haven Police Department, said the department is exploring Draft One, software developed by police technology company Axon, to streamline report writing and reduce the time officers spend on paperwork so they can devote more time to patrolling and engaging with the community.

“By harnessing the power of artificial intelligence, we aim to make our reporting process more efficient, allowing officers to spend less time typing away and more time out on the streets, where they can make a meaningful impact,” Bruckhart said.

According to Bruckhart, writing an incident report can be time-consuming, sometimes taking hours to complete, which is why the department is eager to see whether the technology can improve efficiency and productivity.

Reports generated with Draft One would be based on audio recordings from officers’ body cameras, and each draft would then be reviewed and verified by the officer who responded to the incident to ensure it is accurate and reliable, Bruckhart noted.
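
A minimal sketch of how such a pipeline might fit together is shown below in Python. This is a hypothetical illustration of the workflow described above (transcribe body camera audio, generate a draft, require officer sign-off before filing), not Axon’s Draft One API; the function names and data fields are assumptions made for the example.

```python
from dataclasses import dataclass


@dataclass
class ReportDraft:
    """A draft report awaiting officer review (hypothetical data model)."""
    incident_id: str
    draft_text: str
    reviewed_by_officer: bool = False


def transcribe_audio(audio_path: str) -> str:
    """Placeholder for a speech-to-text step on body camera audio."""
    return f"[transcript of {audio_path}]"


def generate_draft(transcript: str) -> str:
    """Placeholder for an AI model turning a transcript into a narrative draft."""
    return f"Draft report based on: {transcript}"


def file_report(draft: ReportDraft) -> None:
    """Refuse to file any draft the responding officer has not verified."""
    if not draft.reviewed_by_officer:
        raise ValueError("Report cannot be filed until the responding officer reviews it.")
    print(f"Filed report for incident {draft.incident_id}")


if __name__ == "__main__":
    transcript = transcribe_audio("bodycam_clip.wav")
    draft = ReportDraft(incident_id="NH-0001", draft_text=generate_draft(transcript))
    draft.reviewed_by_officer = True  # officer checks the draft against the incident
    file_report(draft)
```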

The software would not be used for serious crimes such as homicides or shootings, but for lower-level offenses like larceny and theft, where the reporting process is more routine and less complex.

The pilot program is slated to begin in February and run for three to six months, after which the department will review its effectiveness and identify areas for improvement.

“This is essentially an experiment to see how well the technology works and whether it can help us make our officers more efficient and effective in their roles,” Bruckhart said.

Local residents had mixed reactions to the proposal, with some expressing enthusiasm and others voicing concerns about the potential risks and limitations of relying on AI-generated reports.

“While I think the idea is intriguing, I do worry that AI can sometimes misinterpret or misrepresent certain words or phrases, which could lead to inaccuracies in the reports,” said Anttwon Brown, a New Haven resident.

Others, including a New Haven native who asked to remain anonymous, voiced deeper doubts about the role of AI in report writing, questioning whether computers can capture the nuances and complexities of human interactions.

“I’m not convinced that a computer can do a better job than a human when it comes to writing up people’s histories or incidents,” the resident said.

Vahid Behzadan, an assistant professor of computer science and data science at the University of New Haven, cautioned that while AI-generated reports may offer some benefits, it is essential to be aware of the technology’s potential pitfalls and limitations.

“One of the main concerns is that officers may become too reliant on this technology and miss out on small factual errors or mistakes in their reporting, which could have serious consequences,” Behzadan said.

Behzadan also noted that audio transcription alone may not be sufficient, since it misses other modalities of information, such as visual cues and contextual details, that are essential for a full understanding of an incident.

Behzadan credited the department with taking a cautious, measured approach and emphasized the importance of careful evaluation and benchmarking to ensure the system works effectively and accurately.

“Pilot programs like this should always be accompanied by rigorous evaluation and testing to ensure that the technology is working as intended and that any potential errors or biases are identified and addressed,” Behzadan said.

The American Civil Liberties Union (ACLU) also weighed in on the issue, expressing concerns about the potential consequences of introducing AI-generated reports into the criminal justice system.

The organization said the proposal raises significant concerns about the erosion of human judgment and empathy in the reporting process, as well as the risk of introducing biases and errors into the system.

It also argued that the decision raises important questions about transparency and accountability, including how the technology works, how data is stored and used, and whether it could be sold to third-party vendors. The ACLU said there must be clear insight into how AI-generated reports might reflect or amplify biases within the underlying machine learning models, and that the public deserves a say in how the technology is developed and deployed.
