Advancements and innovation in tech have made our lives easier and more accessible than ever. Modern tech allows us to make international money transfers at the touch of a screen, helps us monitor fertility and menstruation cycles from our phone, and even see who’s knocking at the front door when we aren’t home.
But while tech is often well-intentioned and the work of committed, thoughtful designers, in the wrong hands it can cause significant harm. When it comes to domestic violence, the products and platforms many of us use to enrich our daily lives can become a key tool for control. Statistics from ITV and domestic abuse charity Refuge suggest that this is a growing practice.
With this in mind, technology company IBM has proposed a series of five design principles that it believes should be considered to reduce the risk of “tech-enabled coercive control”. The report in which these principles are set out comes from IBM’s Policy Lab, and was prompted by the uncomfortable fact that global lockdown in the face of the coronavirus pandemic has exacerbated domestic violence across the world.
The necessity of diversity
The first principle IBM offers to designers is diversity: considering a diverse user base, using a diverse development team and preparing for a diverse set of use cases. As the report states: “Designing technology for yourself is simple… But it’s important to remember that you alone cannot represent all users”.
Children, for example, are not usually considered as potential users. Though many children know how to use photo-sharing apps, they may not grasp the privacy issues involved, such as the location data stored in images (the case mentioned by the report), and this poses a threat when an abuser is involved.
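To illustrate how revealing that embedded data can be, here is a minimal sketch (in Python, and assuming only the standard EXIF convention of storing GPS coordinates as degrees, minutes and seconds) of the conversion that turns a photo's metadata into the decimal coordinates anyone could paste into a map:

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style GPS degrees/minutes/seconds to decimal degrees.

    `ref` is the hemisphere reference stored alongside the value:
    'N'/'S' for latitude, 'E'/'W' for longitude.
    """
    value = degrees + minutes / 60 + seconds / 3600
    # Southern and western hemispheres are negative in decimal notation.
    return -value if ref in ("S", "W") else value


# Example: EXIF values for central London
latitude = dms_to_decimal(51, 30, 26.0, "N")
longitude = dms_to_decimal(0, 7, 39.0, "W")
print(latitude, longitude)
```

The point of the sketch is that no special skill is needed: once a shared photo carries this metadata, recovering a precise location is a few lines of arithmetic.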
While designers and developers don’t envisage their products being used maliciously, the report goes on to say that exploring the “unhappy paths” is a necessity for avoiding tech-enabled coercive control.
“Empowering” users to make informed privacy choices
Beyond diversity, IBM also suggests privacy settings need to be rethought, in order to “empower” users to make decisions that are right, and safe, for them. Certain design decisions, like small red buttons or the phrase “advanced settings”, can intimidate users, according to IBM UK’s Lesley Nuttall, one of the authors of the report, in a write-up. This often causes users to opt for default settings without understanding the choice they are making.
Instead, privacy settings should be simple to understand and to set according to need, and designers should be sure not to influence users’ choices with “big green accept default buttons”, she says. Moreover, periodic notifications prompting users to review their settings should be considered.
Gaslighting, when a person makes someone doubt their memories or judgement through psychological manipulation, is another important consideration, according to IBM. In an age when a conversation, a picture or a post can be deleted with the press of a button, removing evidence of abuse is all too easy.
To combat this, Nuttall explains that “timely and pertinent notifications, as well as auditing are essential for making it obvious who has done what and when”. In short, tech platforms and apps must make it clear where changes to a timeline or conversation thread have been made, so as to make it difficult for abusers to obscure and distort the truth.
Data and security
Most designers will send their projects into the world in good faith – but that’s not to say all do. The prevalence and popularity of spyware today is evidence of this, and it poses a real threat to the security of domestic abuse sufferers.
Such circumstances require designers to think beyond traditional security threat models, Nuttall says. Telling users when spyware has been detected is perhaps the most obvious solution, but designers of platforms and apps should also consider the data they collect and how it could be intercepted, says the report.
Maps data, for example, can be intercepted if an app is connected to an email account, but is recording such data necessary in the first place? The answer, according to the report, should come from carefully weighing the risk of the data collected against its value.
Making tech “intuitive for everyone”
The last principle put forward refers to technical ability, and is perhaps one of the most pressing. As explained in the report, abusers don’t necessarily have to be able to hack phones or accounts; they just have to make their victim think they can. For those without a confident grasp of a given piece of tech, it’s easy to fear its potential in the hands of someone else.
Designers should be aware of this, Nuttall claims, saying “If all end user technology was intuitive to use and understand, this could help reduce the risk of abusers dominating with their greater technical confidence”.