
Crime scene reports generated in 8 seconds: US police use AI tools to write documents more accurately than human memory

2024-08-31


Editor: Ear, Qiao Yang

【New Wisdom Introduction】American police have begun using the artificial intelligence tool Draft One to assist with clerical work. Crime reports are generated in seconds and are more accurate than an officer's recollection from memory.

AI is finding its way into every line of work.

Policing is no exception: recently, AI powered by GPT-4 has begun to be used to write crime reports and file paperwork.

In April, Axon launched a new tool called Draft One that transcribes audio from body camera footage and automatically turns it into a police report.

Draft One is built on the same family of generative AI models as ChatGPT, with the cloud service provided by Microsoft.
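Axon has not published Draft One's internals, but the workflow described here, transcribing body-camera audio and then having a GPT-4-class model draft a narrative report, can be sketched roughly as follows. The model names, prompt, and OpenAI client calls are illustrative assumptions, not Axon's implementation.

```python
# Illustrative sketch only: Axon has not published Draft One's internals.
# Assumes the OpenAI Python SDK (openai>=1.0) and an API key in OPENAI_API_KEY.
from openai import OpenAI

client = OpenAI()

def draft_report(audio_path: str) -> str:
    # Step 1: transcribe the body-camera audio. Whisper is an assumption here;
    # Axon may use a different speech-to-text system.
    with open(audio_path, "rb") as f:
        transcript = client.audio.transcriptions.create(
            model="whisper-1",
            file=f,
        ).text

    # Step 2: have a GPT-4-class model turn the transcript into a first-draft
    # narrative report for the officer to review and edit.
    response = client.chat.completions.create(
        model="gpt-4-turbo",
        messages=[
            {"role": "system",
             "content": "Draft a factual police incident report using only "
                        "details present in the transcript. Do not speculate."},
            {"role": "user", "content": transcript},
        ],
    )
    return response.choices[0].message.content

print(draft_report("bodycam_audio.wav"))
```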

The launch of Draft One was quickly welcomed by police officers: routine data collection, recalling events on scene, and writing them up are time- and energy-consuming, and now there is an "AI police assistant" to help.

The Fort Collins Police Department in Colorado, one of the first testers, said that using Draft One reduced the time it took to write reports by 82%.

Axon CEO Rick Smith estimates that if an officer spends half of each workday writing reports, Draft One can cut that workload by at least half, freeing up roughly 25% of the officer's total time for active police work.

Field testing

It sounds as if Draft One can take over the tedious paperwork on its own, but how does it perform on actual police work?

The Oklahoma City Police Department is one of a handful of departments that have begun piloting AI chatbots to write first drafts of case reports.

Officer Matt Gilmore and his sniffer dog Gunner searched for a group of suspects for nearly an hour, with body cameras capturing every word spoken by the officer and the suspects for Draft One to process.

Normally, after a shift like this, an Oklahoma City police officer would pick up a laptop and spend 30 to 45 minutes writing up the search. This time, Gilmore had an artificial intelligence tool write the first draft.

Draft One pulled all of the audio and radio chatter from Gilmore's microphone and generated a report in eight seconds.

"It was a better report than I could have written, and it was 100 percent accurate," Gilmore said. It even captured a detail he himself did not remember: another officer mentioning the color of the car the suspects fled in.

In one case, an officer loaded video of a traffic stop into the system, pressed a button, and the program generated a narrative report in conversational language from the audio of the car's camera, complete with date and time, much as the officer would have written it from notes.

The whole process took just a few seconds, and after reviewing the draft, the officer found nothing that needed to be changed.

At the end of the report, the officer must check a box acknowledging that it was generated using AI.

Officers who have tried the technology are enthusiastic about how much time the tool saves and how well it performs.

Some prosecutors, police oversight officials, and legal scholars are concerned about AI-written case reports entering the criminal justice system or serving as key evidence: who can guarantee their accuracy, when large language models can hallucinate, fabricate facts, or exhibit racial bias?

For example, a district attorney prosecuting a criminal case wants reports written by the police officers themselves, not just by an AI chatbot, because officers are accountable for the truth of what they witnessed.

If an officer were to say on the witness stand, "Well, the AI wrote this. I didn't write it, so I don't know," it would be absurd and would undermine the legal process.

AI technology is not new to police agencies, which already employ algorithmic tools to read license plates, recognize suspects' faces, detect gunfire, and predict where crimes might occur.

Many of these applications raise privacy and civil-rights concerns, and legislatures have been asked to establish laws and regulations to keep AI tools within reasonable limits.

AI-generated case reports, however, have only just been introduced, and there are almost no standards or thresholds for their use.

Causes for concern

Because Draft One is being applied in such a sensitive domain, many people are skeptical about introducing the new technology.

How should the inherent biases of LLMs be addressed? How can correct use of the tool be ensured? Who will set the limits and thresholds on its use?

Racial bias

Aurelius Francisco, a community activist in Oklahoma City, said the potential for racial bias in AI technology is just one of the reasons he is "deeply disturbed" by the new tool.

Law professor Andrew Ferguson worries that the ease of automation could lead police officers to be less careful in their writing.

The large language models behind AI chatbots can fabricate information, a problem known as "hallucination" that could slip hard-to-detect falsehoods into police reports.

Axon senior principal AI product manager Noah Spitzer-Williams told Forbes that Draft One was built on OpenAI's state-of-the-art GPT-4 Turbo model and configured to avoid racial and other biases.

The simplest, bluntest approach is to turn off the tool's creativity entirely and make it a strictly faithful recording machine, which greatly reduces hallucinations and errors.
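In practice, "turning off creativity" usually means sampling with temperature 0 and constraining the prompt to the transcript. The snippet below is a hedged sketch of such a configuration against an OpenAI-style chat API; the model name, prompt, and sample transcript are placeholders, not Axon's actual settings.

```python
# Hedged sketch of "creativity off" sampling for an OpenAI-style chat API.
# The model name, prompt, and sample transcript are placeholders, not Axon's settings.
from openai import OpenAI

client = OpenAI()
transcript = "Dispatch reported a blue sedan. Subject detained at 21:40."

response = client.chat.completions.create(
    model="gpt-4-turbo",
    temperature=0,  # deterministic-leaning decoding: no creative sampling
    top_p=1,
    messages=[
        {"role": "system",
         "content": "Write a report using only facts stated in the transcript. "
                    "Do not add, infer, or embellish anything."},
        {"role": "user", "content": transcript},
    ],
)
print(response.choices[0].message.content)
```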

Axon ran a test in which it took real body-camera recordings and changed only the suspect's race in each case, for example replacing the word "white" with "Black" or "Latino," and then had the model write a report.

Spitzer-Williams said the reports generated in the test showed "no statistically significant differences by race" across hundreds of samples.
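Axon has not published its test protocol, but the kind of counterfactual check described above can be sketched as follows. The draft_report callable, the choice of report length as a metric, and the paired t-test are illustrative assumptions.

```python
# Hedged sketch of a counterfactual bias check along the lines described above:
# swap only the race term in each transcript, regenerate the report, and test
# whether a report statistic shifts. The draft_report() callable, the word-count
# metric, and the paired t-test are illustrative assumptions, not Axon's protocol.
import re
from scipy.stats import ttest_rel

def swap_race(transcript: str, original: str = "white", replacement: str = "Black") -> str:
    # Replace only the race descriptor; everything else stays unchanged.
    return re.sub(rf"\b{original}\b", replacement, transcript, flags=re.IGNORECASE)

def counterfactual_race_test(transcripts: list[str], draft_report) -> float:
    base_lengths, swapped_lengths = [], []
    for t in transcripts:
        base_lengths.append(len(draft_report(t).split()))
        swapped_lengths.append(len(draft_report(swap_race(t)).split()))
    # Paired test: did changing only the race term change the generated reports?
    _, p_value = ttest_rel(base_lengths, swapped_lengths)
    return p_value  # p >= 0.05 is consistent with "no statistically significant difference"
```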

Verbal retelling

For some officers who have tried the new technology, Draft One has changed the way they handle cases.

For example, officers now narrate what happened aloud, as if telling a story, so that the camera captures the details they want highlighted in the case report.

As the technology becomes more common, Oklahoma City police captain Jason Bussert said, he expects these verbal retellings to make officers "more and more eloquent" in describing what they see.

Scope of use

Axon has advised police not to use AI to write reports for serious criminal cases such as shootings, where the facts are complex and the stakes are high.

Early users applied Draft One only to misdemeanor reports, but more and more customers are now using it for more serious cases, including violent crimes.

However, Axon only provides the AI tool; it has no control over how individual police departments use it.

For example, Scott Galloway, the police chief in Lafayette, Indiana, told the Associated Press that his officers can use Draft One on any type of case, and that it has been well received since the pilot began earlier this year.

Robert Younger, a police officer in Fort Collins, Colorado, said officers there are free to use it on any type of report, but found that it doesn't work well on patrols in the downtown bar district because "there's too much noise."

Beyond analyzing and summarizing audio, Axon has also experimented with multimodal vision systems to summarize what appears in video footage, but the technology is not yet mature.

Because policing demands such care, visual recognition will need extensive testing before it is introduced.

Axon would not say how many police departments are using the technology, and it is not the only vendor: startups such as policereports.ai and Truleo offer similar products.

It is also worth mentioning that, in addition to the new Draft One technology, Axon supplies Tasers, a type of electroshock weapon, to American police.

Axon therefore has deep working relationships with police departments, which has made it their first choice when partnering on and purchasing AI tools.