Crackdown on ‘cyberflashing’: Social media firms face call for action as it emerges that some children are sent sexual images daily
- Girls are increasingly targeted on platforms such as Snapchat and Instagram
- A lack of accountability and identity-checking fuels the online sexual harassment
- Report calls on companies to create ‘clearer and more extensive’ privacy settings
Social media giants have been urged to clamp down on ‘cyberflashing’, with a study revealing that children are being sent sexual pictures every day.
Girls are increasingly targeted on the likes of Snapchat and Instagram, as a growing bombardment of unwanted images becomes ‘dangerously normalised’.
Researchers claim that a lack of thorough accountability and identity-checking measures is helping to fuel the online sexual harassment of young people.
The study, led by the UCL Institute of Education, found that non-consensual image-sharing practices were ‘particularly pervasive, and consequently normalised and accepted’. This is contributing to ‘shockingly low’ rates of reporting online sexual abuse.
The report calls on tech companies to create ‘clearer and more extensive’ privacy settings and introduce rigorous identification procedures to protect children from adult predators.
The shake-up could involve checking a user’s identity with passports, as well as putting verified ages on social media profiles.
Snapchat is also urged to keep a record of images, videos and messages to identify perpetrators and aid the reporting of incidents.
Researchers quizzed 144 boys and girls aged from 12 to 18 in focus groups, and a further 336 in a survey about digital image-sharing. Thirty-seven per cent of the 122 girls surveyed had received an unwanted sexual picture or video online.
Three in four of the girls in the focus groups had also been sent an explicit photo of male genitals, with the majority of these ‘not asked for’.
This form of harassment was ‘often experienced on a regular, sometimes daily basis’. Girls described ‘getting used’ to receiving this unwanted content and no longer seeing it as a ‘big deal’.
Snapchat was the most common platform used for image-based sexual harassment, according to the survey findings. But reporting on Snapchat was deemed ‘useless’ by young people because the images disappear.
Half (51 per cent) of the respondents who had received unwanted sexual content online or had their image shared without their consent admitted doing nothing about it. A third claimed they didn’t think reporting ‘works’.
The report’s lead author, Professor Jessica Ringrose, of the UCL Institute of Education, said: ‘Young people in the UK are facing a crisis of online sexual violence.’
Instagram said: ‘Keeping the young people who use our apps safe is our top priority, and we have measures in place to protect them.’ Snapchat said: ‘Any sexual harassment is deplorable and we work with the police and industry partners like Childnet to keep it off Snapchat.’
The Children’s Charities’ Coalition on Internet Safety has demanded that age verification for pornography websites be introduced in the UK.

The coalition has told the data watchdog, the Information Commissioner’s Office, that if it does not act, it will face a High Court challenge.