The Malta Independent 17 May 2024, Friday

AI’s potential in domestic violence cases

Mark Said Thursday, 2 May 2024, 07:27 Last update: about 15 days ago

The latest data from the National Statistics Office (NSO) shows that the number of people seeking support services for domestic violence is increasing, while the vast majority of victims remain women. This increase may well be attributable to a series of effective legislative and institutional reforms that have encouraged more victims to come forward. Still, the fight against domestic violence remains a formidable challenge for our authorities, and one should never refrain from considering additional, modern and effective tools.


One such tool could well be AI. Artificial intelligence is the science of making machines do things that would require intelligence if done by humans. While I am no AI expert, unlike Professor Alexiei Dingli, I have come across various systems that have effectively applied AI to preventing domestic abuse.

Domestic violence victims, unfortunately, continue to live in the equivalent of a constant real-life thriller, never knowing what will happen to them or their children at any moment.

Machine-learning methods are far more effective at assessing which victims of domestic violence are most at risk than conventional risk assessments.

It transpires that, of the domestic abuse calls received by police, around one in ten people will call again within a year about a repeat violent attack. The police service needs to assess the risk that domestic abuse victims will be targeted again to keep victims safe and prevent future violence. Currently, this is done using a standardised set of questions.

But there is a way to predict such repeat attacks more accurately, giving the police a better chance of preventing serious harm. It is precisely here that machine-learning systems that analyse existing information, including criminal records, calls made to the police, and reported incidents of violence, can identify the risk of repeat incidents more accurately than the current standardised questionnaires used by our police.

Such systems can prove vital: in more than one case, the police have been too slow to reach domestic abuse incidents, and there were delays in responding. In a small number of cases, the delays arose because the force did not have enough officers available to attend.

At the moment, police officers responding to a domestic abuse call are instructed to complete the DASH (Domestic Abuse, Stalking, Harassment, and Honour-Based Violence) form. DASH is a checklist of around 27 questions, which is then used, alongside any other relevant information, to inform an officer’s assessment of the case as standard, medium, or high risk. Assessing a case as high-risk implies that an incident causing serious harm could take place at any time and trigger resources aimed at keeping the victim safe.
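Purely as an illustration of that checklist-to-grade step (the question names and thresholds below are invented for the example; real DASH grading also relies on an officer's professional judgment and other case information), the logic can be sketched as:

```python
# Illustrative sketch of a DASH-style checklist grading step.
# The 27 question labels and the numeric cut-offs are invented
# placeholders, not the actual DASH scoring rules.

def grade_dash(answers: dict) -> str:
    """Map yes/no checklist answers to a standard/medium/high grade."""
    ticked = sum(answers.values())  # number of risk factors present
    if ticked >= 14:
        return "high"
    if ticked >= 7:
        return "medium"
    return "standard"

# Example: 27 answers, with every third question answered "yes".
answers = {f"q{i}": (i % 3 == 0) for i in range(1, 28)}
print(grade_dash(answers))  # 9 of 27 factors present -> "medium"
```

A "high" grade in this sketch plays the role described above: it signals that serious harm could occur at any time and should trigger protective resources.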

Unfortunately, we have had repeat attacks in situations that the DASH system had not classified as high-risk. Still, we can improve on this false-negative rate. By applying machine learning, a basic form of artificial intelligence, to the DASH information, the number of missed high-risk cases can be reduced further.

By replacing the DASH data with other existing information about the people involved, such as criminal convictions, incidents of violence, or the number of calls made to the police about domestic abuse, the machine-learning system could become even more accurate.

DASH data is available only after an officer has appeared on the scene, but information about someone’s criminal history is potentially available as soon as the call comes in and the call handler has identified the parties involved. This means that an initial prediction of violent recidivism could be made while the caller is on the line. Indeed, it could be used to set the priority score of the call, meaning police would have a better chance of responding quickly to high-risk cases.
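A minimal sketch of that call-time idea follows. The feature names, weights and priority thresholds are invented for illustration; a real system would learn the weights from historical outcome data rather than hard-coding them:

```python
import math

# Invented weights over counts that a call handler could already know
# once the parties are identified (criminal history, prior calls).
WEIGHTS = {
    "prior_violent_convictions": 0.9,
    "prior_domestic_calls_12m": 0.5,
    "prior_reported_incidents": 0.4,
}
BIAS = -2.0

def risk_score(features: dict) -> float:
    """Logistic risk score in (0, 1) from call-time features."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

def call_priority(features: dict) -> str:
    """Turn the risk score into a dispatch priority for the call."""
    score = risk_score(features)
    if score >= 0.5:
        return "immediate"
    if score >= 0.2:
        return "prompt"
    return "routine"

print(call_priority({"prior_violent_convictions": 2,
                     "prior_domestic_calls_12m": 3}))  # "immediate"
```

The point of the sketch is the timing: because these inputs exist before any officer reaches the scene, the score can shape the priority of the call itself.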

One way to reduce the cost of dealing with false positive predictions—the prediction of an attack that then does not occur—would be through developing a second, more sensitive screening procedure. A two-part procedure would do better than the DASH risk assessment, both in prioritising calls for service and in providing protective resources to victims with the greatest need for them.
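The two-part procedure can be sketched as follows: a cheap first pass flags broadly so that few genuine cases are missed, and a second, more discriminating check runs only on flagged cases to weed out false positives. Both scoring rules here are invented placeholders, not any force's actual criteria:

```python
# Sketch of a two-stage screen for prioritising protective resources.

def first_screen(case: dict) -> bool:
    # Broad net: any prior call or conviction triggers a flag.
    return case.get("prior_calls", 0) > 0 or case.get("convictions", 0) > 0

def second_screen(case: dict) -> bool:
    # Narrower, more sensitive check applied only to flagged cases.
    return case.get("convictions", 0) >= 1 and case.get("prior_calls", 0) >= 2

def needs_protective_resources(case: dict) -> bool:
    return first_screen(case) and second_screen(case)

cases = [
    {"prior_calls": 3, "convictions": 1},  # flagged by both stages
    {"prior_calls": 1, "convictions": 0},  # first-stage flag filtered out
    {"prior_calls": 0, "convictions": 0},  # never flagged
]
print([needs_protective_resources(c) for c in cases])  # [True, False, False]
```

The middle case shows the cost saving: the second stage absorbs a would-be false positive before scarce protective resources are committed to it.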

On a somewhat uplifting note, technology and AI can enable us to help domestic abuse victims in many effective and creative ways. Technology has bridged gaps in data, documentation, reporting, and policy and has provided faster, more efficient tools for victims.

Several exemplary projects and solutions deserve a campaign to raise awareness and help them scale nationally.

Crowdsourcing can be a powerful way to understand domestic violence and sexual harassment, as well as trigger policy-making and institutional change. Crowdsourcing sites, such as #StopFemicides, can not only raise awareness of the scale and seriousness of the issue, but they can also bring the much-needed transparency required to accelerate change by calling on the government and policymakers to take action.

On an equally exciting level, various AI and NLP (natural language processing) tools can be developed to spot trends in domestic violence and online harassment. Tools like these, powered by artificial intelligence, can help protect victims of domestic violence by identifying patterns, enabling subject-matter experts, local authorities and the police service to predict potential domestic violence cases and to prevent them or act in time.

Furthermore, there are numerous safety apps and SMS-based services that can educate users about domestic violence and help them in times of need. Kitestring, Circle of 6, Watch Over Me, Safetipin and Saahas are a few examples worldwide. Responsible design could ensure vulnerable groups are not at risk from using these tools.

In addition to phone lines, chatbots and personal agents can be implemented to address increased call volumes. On another level, several wearable technologies, such as Safelet (an SOS bracelet) or another form of workplace or home panic button, can help potential victims stay safe.

Perhaps Professor Dingli might be tempted to delve deeper into the potential of AI in combating domestic violence in our country and provide us with a more learned insight, as he has successfully done in the road network and education sectors.

 

Dr Mark Said is a lawyer
