Legal experts from Harbottle & Lewis suggest how developers can help stamp out bad and inappropriate behaviour
The issues of bullying and harassment in online gaming faced by women and others have been under the spotlight, particularly following Gamergate and other widely reported incidents, which have helped to raise awareness.
YouTube and Twitch have recently spoken out about the extent of the problem and both have made commitments to combat abuse on their platforms. More generally, the cross-party campaign – ‘Reclaim the Internet’ – launched recently by MPs including Maria Miller and Yvette Cooper has as its mission statement “to stamp out violent threats, misogyny and sexist abuse online” and reflects an increasing public awareness and concern.
Amongst other things, the campaign has called for a review of our laws tackling online abuse. Whilst there is no overarching legal framework in the UK regulating use of the internet, online abuse is illegal under both the criminal and civil law and can give rise to legal action for:
- Various criminal communications offences
One of the driving forces of the campaign is that the laws which currently exist to tackle online abuse focus on locating perpetrators and holding them accountable. The campaign instead calls for more victim-centric laws, and one likely area of focus is the obligations and responsibilities of website hosts and platform providers to take a more active role in monitoring, preventing and addressing online abuse.
This will be particularly relevant to games platforms and studios, as the use of online games to target, bully and harass others can take place in a myriad of forms and can be extremely difficult to detect and to monitor.
Whilst there is no legal obligation to act when a website becomes aware of inappropriate or even abusive conduct, a failure to act raises serious issues. A responsible games platform or studio does not want to expose its users to abuse, and it may suffer adverse PR consequences if it fails to prevent inappropriate behaviour.
Games with online voice chat and messaging features present the greatest risk of ‘traditional’ cyberbullying and harassment as they can be used to send sexist and misogynistic communications, of which women are usually the victims.
However, the format and very nature of online games also give rise to a unique opportunity for bullying within the parameters and experience of the game itself. “Griefing” is the use of game mechanics as a means of targeting and bullying others through actions taken within the game, and it perhaps comes as no surprise that the victims of griefing are quite often women.
One solution is for games platforms themselves to adopt more creative measures which target the problem at its root, such as:
- Improving reporting facilities for players to report bad behaviour, including abuse which takes place through play within the game itself;
- The imposition of sanctions including bans on repeat offenders;
- A peer review approach, such as the League of Legends Tribunal feature, which allows other players to review negative play and hold offenders accountable, with consequences within the game itself; and
- Designing games with built-in features and objectives which encourage collaborative play and support positive rather than negative interaction.
Games platforms should have in place clear policies and procedures for dealing with online abuse, which are not only enshrined in the platform's terms and conditions but actively promoted by the platform and its players. These should be backed by swift action as soon as an issue is identified, with a view not only to avoiding liability but ultimately to creating an environment where the issue is eradicated.
Jo Sanders is a partner at law firm Harbottle & Lewis. Natasha Brierley is an associate. Find out more at www.harbottle.com.
Image Credit: Stella Stig