Emma Pickering, Head of Technology-Facilitated Abuse and Economic Empowerment, Refuge
Forensic Focus
Emma Pickering of Refuge warns of a 62% surge in tech-facilitated abuse—from stalkerware and spy cams to deepfakes and wearable surveillance—and explains why policing and digital forensics must urgently catch up.
Emma Pickering is Head of Technology-Facilitated Abuse and Economic Empowerment at Refuge, where she leads national strategy on tech and economic abuse and oversees the National Domestic Abuse Helpline. With more than 15 years in the VAWG sector, she works at the intersection of technology, policy and survivor safety, shaping frontline responses and influencing national practice.
Emma, it’s been a while since we last spoke to you on the Forensic Focus Podcast. How has technology-enabled abuse evolved since then?
Refuge’s frontline domestic abuse services have seen a significant rise in survivors reporting tech-facilitated and economic abuse, highlighting how perpetrators are increasingly weaponising technology to exert coercive control – from advanced stalkerware which involves tracking a survivor’s every move, to the creation of harmful deepfakes.
Broadly speaking, almost every survivor we support has been subjected to some form of tech-facilitated abuse. Referrals to Refuge’s specialist Technology-Facilitated Abuse and Economic Empowerment team rose by more than 62% in 2025 compared to 2024, with the final three months of the year the highest on record for a single quarter, reflecting the increasing complexity of tech-facilitated abuse cases presenting to frontline services.
Refuge has also seen a 24% increase in referrals involving survivors under the age of 30, highlighting the worrying prevalence of digital control and surveillance in younger people’s relationships.
We know that stalking is one of the most common forms of abuse, often forming part of a perpetrator’s web of coercive control. The proliferation of advanced and accessible tracking devices means that it’s worryingly easy for perpetrators to stalk survivors using tech – from phone monitoring and using image geotags on social media, to the weaponisation of items such as AirTags.
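For practitioners less familiar with geotag leakage: a photo shared with its EXIF metadata intact carries GPS coordinates stored as degree/minute/second rationals plus a hemisphere reference, which resolve to a precise map location. A minimal stdlib-only sketch of that conversion (the function name and example values are illustrative, not taken from any real case):

```python
# Illustrative sketch: how the GPS fields in a photo's EXIF metadata
# resolve to a precise decimal coordinate. EXIF stores latitude and
# longitude as degrees/minutes/seconds plus an "N"/"S"/"E"/"W" reference.

def dms_to_decimal(degrees: float, minutes: float, seconds: float, ref: str) -> float:
    """Convert EXIF-style degrees/minutes/seconds to a signed decimal coordinate."""
    value = degrees + minutes / 60 + seconds / 3600
    # Southern and western hemispheres are negative in decimal notation
    return -value if ref in ("S", "W") else value

# Hypothetical values, as an EXIF GPSInfo block might record them
lat = dms_to_decimal(51, 30, 26.0, "N")
lon = dms_to_decimal(0, 7, 39.0, "W")
print(f"{lat:.5f}, {lon:.5f}")  # a plottable point in central London
```

Real extraction tools parse these fields directly from the image file; the point here is simply that a single shared image can reveal a survivor’s location to this precision.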
Refuge is extremely concerned by the use of surveillance devices, disguised as household items, which listen to and record survivors without their knowledge.
Since we last spoke, AI development has also accelerated significantly, resulting in an increase in harms from deepfake technology, including spoofing apps being used to impersonate survivors.
This has all been compounded by the rapid pace of technological development, which makes it difficult for law, policy and the criminal justice system to keep up with emerging technologies and the ways in which they are being weaponised.
We consistently see a significant lag when it comes to the law catching up with new harms – and this has direct consequences for survivors’ safety.
From your experience supporting survivors, what are the most common forms of digital evidence that are overlooked or misunderstood by investigators in cases involving tech-enabled abuse?
Domestic abuse has incredibly low charge and conviction rates in general. Although the law has expanded to cover different forms of tech abuse in recent years, we frequently hear from survivors that police do not always take tech-facilitated abuse seriously, or prosecute it effectively.
Often, police lack the training needed to fully understand the nature or impact of certain forms of tech abuse. This is particularly the case with intimate image abuse. More generally, investigators may not realise how incidents of tech abuse are not isolated acts, but rather form part of a perpetrator’s web of coercive control.
Survivors often tell us that because tech abuse is classified as ‘online abuse’, police fail to recognise the true scale of harm, grading the abuse as ‘low’ to ‘medium’ risk. This obscures the fact that such abuse is rarely an isolated incident, and usually signals an escalation of a wider pattern of domestic abuse.
On top of this, survivors frequently tell us that their devices are removed for analysis after reporting abuse – even though we know the evidence typically sits on the perpetrator’s device.
There is also an ongoing failure by police to use specific legislation that can be applied to charge and convict perpetrators of tech abuse. For example, we are still not seeing police utilise the Computer Misuse Act, which covers unauthorised access to a survivor’s online accounts. This is a kind of abuse that Refuge sees every day, and yet the legislation is almost never applied to survivors’ cases.
Sadly, the police are stretched, and they need more capacity to dedicate time to investigations and gathering evidence, of which there is plenty: from recording devices in the home, to tracking devices on survivors’ personal belongings and digital trails showing that a perpetrator has gained access to a survivor’s email account.
Many digital forensic practitioners focus on traditional artefacts such as mobile extractions or cloud data. What new or emerging technologies are creating potential risks for survivors?
It is currently far too easy for perpetrators to weaponise smart accessories, and our sector-leading Technology-Facilitated Abuse and Economic Empowerment Team is seeing the devastating consequences of this every day. In January, Refuge announced that the team are increasingly seeing cases where AI and wearable technology – including smart glasses and watches – are being misused by abusers to stalk, surveil and control survivors.
We are aware of cases involving perpetrators using wearable tech, such as smartwatches, Oura rings and Fitbits, to track and stalk women, by monitoring data, such as step counts, and using this to infer survivors’ movements or tracking them through linked cloud accounts.
Refuge is also seeing a notable rise in surveillance, with more survivors reporting concerns about hidden microphones and cameras in their homes, in tandem with an explosion in the highly concerning ‘spycam’ market – which is woefully under-regulated.
In the last year or so, we have also seen a huge rise in survivors being targeted and harmed by horrific deepfake abuse. As exemplified by the recent scandal involving Grok and the related proliferation of images across X, AI-generated image abuse is escalating at an unprecedented rate, with devastating implications for survivors.
Spoofing apps are another key concern, and time and time again we have seen perpetrators use these to impersonate survivors in order to cause them harm.
In your work at the intersection of technology, policy and survivor safety, where do you see the biggest gaps in forensic capability when responding to tech-enabled abuse?
There are concerns that new iPhone settings and features can make it more difficult for digital investigators to gain access to devices for analysis. We need Apple and other tech companies to work with police to help with evidence gathering, rather than creating barriers.
We also have significant concerns about perpetrators remotely wiping data from the cloud after their devices have been seized by police. Delays in evidential analysis buy perpetrators time to manipulate the evidence, preventing survivors from accessing justice and putting them at risk of further harm.
Ultimately, we need more dedicated funding for police to invest in tackling digital abuse effectively and in a trauma-informed way.
We also need a long-term Government plan setting out an approach to tackling tech-facilitated abuse and online harms. This must involve specialist training and be backed by financial support and resources for police to ensure they are equipped to tackle this kind of abuse properly.
We need to ensure that police understand both the relevant technical equipment, and how perpetrators are misusing it. This is where training in conjunction with specialist services like Refuge is crucial, to ensure police stay ahead of emerging forms of technology, like spoofing apps.
It is vital this training comes from the VAWG sector and the agencies that are working directly with survivors, as we have firsthand knowledge about trends and the impact of tech abuse, and can advise on survivor-centred evidence gathering and solutions.
How can law enforcement better engage with specialist domestic abuse services to ensure investigations prioritise victim safety and minimise unintended harm?
The recent policing white paper rightly recognised that most crimes, including domestic abuse, now involve some form of technology. It is therefore vital that police training equips all officers with the knowledge and practical skills required to effectively identify, investigate and gather evidence relating to tech-facilitated abuse.
Refuge already works with some police forces to provide trauma-informed training, and it is critical that any police training is delivered by specialists and informed by survivor insights. Without using these insights, the criminal justice system risks re-traumatising survivors and allowing perpetrators to evade justice.
There is currently a huge issue with the enforcement of laws to tackle tech abuse, and we know conviction and charge rates for intimate image abuse offences are woefully low. Engaging with specialist services is absolutely critical to changing this and ensuring that survivors get the justice they deserve.
Poor conviction and charge rates can result from inadequate evidence gathering practices, a misunderstanding of what constitutes image abuse, and harmful victim-blaming attitudes – particularly where abuse involves intimate images sent by the survivor. This is unacceptable, and for the law to have the impact it needs, specialist training is key.
Women have the right to use technology without fear of abuse, and when that right is violated, survivors must be able to access swift justice and robust protections.