INSTAGRAM’S chief has admitted that the app is failing to track down dangerous “self-harm and suicide” images.
It comes after the suicide of a 14-year-old British schoolgirl was blamed on Instagram – which is seen by some as a safe haven for posts that encourage harmful behaviour.
Instagram, like its parent company Facebook, has been struggling to contain the flow of dodgy images on the site.
And writing in the Telegraph, Instagram boss Adam Mosseri admitted his team weren’t doing a good enough job.
“The bottom line is we do not yet find enough of these images before they’re seen by other people,” he explained.
“We have put in place measures to stop recommending related images, hashtags, accounts, and typeahead suggestions.”
Mosseri, who is due to meet UK health secretary Matt Hancock on Thursday to discuss these issues, said that there are plans to introduce “sensitivity screens” on the app.
These screens will appear on any content that involves cutting.
This will mean that the images won’t be “immediately visible” to other users, but will still be accessible on the app.
According to Mosseri, Instagram won’t be removing self-harm images automatically, however.
“We don’t allow people to promote self-injury, but because of the advice we’ve received, we don’t want to stigmatise mental health by deleting images that reflect the very hard and sensitive issues people are struggling with.
“Instead, for images that don’t promote self-harm, we let them stay on the platform.
“But moving forward, we won’t recommend them in search, hashtags or the Explore tab.”
But the age-old Facebook excuse of “we need to do more” is wearing thin.
Speaking to The Sun, Andy Burrows, Associate Head of Child Safety Online at the NSPCC, said: “We have had over a decade of social networks like Instagram saying they will do better but they have consistently failed to keep children safe.
“Time and time again harmful content is left up on social networks that no parent would want their child to see.
“It is clear now that the only way forward for the Government is to regulate, which will force social networks to abide by rules to keep children safe and to punish them when they fail to do so.”
Writing on Twitter, tech investor Malcolm Evans wrote of Instagram: “All of them ‘who aren’t there yet’ or who ‘need to do more’ – just get there. Or do more. By tomorrow.”
And marketing expert, author and professor Stephen Waddington slammed Instagram on Twitter, suggesting that pop-up warnings on the app simply aren’t good enough.
“Takes seconds to find content related to eating disorders, porn and self-harm on Instagram and Twitter,” Waddington tweeted.
“And a pop-up ain’t gonna stop your kids finding it.”
Mosseri’s comments come just days after a grieving dad accused Instagram of “helping” his schoolgirl daughter take her own life.
Heartbroken Ian Russell recently told how 14-year-old Molly died in 2017, after viewing scores of images glorifying self-harm and suicide.
She was found dead just hours after handing in her homework – and packing a schoolbag for the next day.
Her devastating suicide note read: “I’m sorry. I did this because of me.”
Molly – who went to Hatch End High School in Harrow, Middlesex – had started viewing the disturbing posts without her family’s knowledge.
Ian told the BBC: “I have no doubt that Instagram helped kill my daughter. She had so much to offer and that’s gone.”
“She seemed to be a very ordinary teenager. She was future-looking. She was enthusiastic.
“She handed her homework in that night. She packed her bags and was preparing to go to school the next day and then when we woke up the next morning, she was dead.”
Among the posts were accounts from people who were depressed, self-harming or suicidal.
One haunting image shows a blindfolded girl hugging a teddy bear, captioned “This world is so cruel, and I don’t wanna see it any more.”
Mr Russell – who directed the BBC coverage of the Queen’s 90th birthday service – said Molly had access to “quite a lot of content” that sparked concern.
He said: “Quite a lot of that content was quite positive. Perhaps groups of people who were trying to help each other out, find ways to remain positive to stop self-harming.
“But some of that content is shocking in that it encourages self-harm, it links self-harm to suicide and I have no doubt that Instagram helped kill my daughter.”
We’ve asked Instagram for comment and will update this story with any response.
How to delete your Instagram account – permanent and temporary options revealed
Here’s what you need to know…
- To permanently delete your Instagram account, you first need to log in via the website, as you cannot delete it via the app
- Go to the ‘delete your account’ page and select a reason from the ‘Why are you deleting your account?’ drop-down menu
- You need to select an option in order to progress to the next step, and you’ll need to re-enter your password
- You should see ‘permanently delete my account’, which you can click or double tap
- For security reasons, Instagram can’t delete an account on your behalf
- You can also temporarily disable your account, which means your photos, comments and profile will be hidden
- But your account will be reactivated when you next log in, and your followers will still be there, apart from anyone who may have unfollowed you
- To temporarily disable an account, log into the website, not the app
- Select ‘edit profile’ and scroll down until you see ‘temporarily disable my account’
- Choose a reason, re-enter your password, and then click or double tap ‘temporarily disable my account’
Instagram has a long and troubled history when it comes to keeping the app clean.
In December, The Sun reported on how paedos were using “toddler bikini” hashtags to steal photos for their sick porn sites.
Earlier in 2018, we reported on dangerous “eating disorder hashtags” that were circulating on Instagram without any warnings.
And in July last year, a Sun Online investigation found that secret sex hashtags were being used to share hundreds of smutty videos.
The popular image-sharing app has a strict “zero tolerance” policy on sexual content – but had failed to crack down on porn hashtags that help users find smut online.
The Sun tracked down more than a dozen different hashtags that lead directly to smut.
We found hundreds of inappropriate posts in a matter of minutes, just by entering rogue hashtags easily searchable online.
Some videos depicted full sex with genitals in clear view, while others showed oral sex or masturbation.
One clip even showed a bestiality scene involving an adult woman and a horse – which is illegal to distribute in the UK.
Others didn’t necessarily depict nudity, but included male ejaculation or close-up crops on hardcore sex scenes â€“ leaving genitals just out of shot.
Facebook was recently exposed for paying children up to £15 a month to install a “spying app” that monitored everything they did online.
Here’s how to find out exactly what Facebook knows about you.
And here’s why your Facebook posts can tell Mark Zuckerberg how much money you earn.
Do you think Instagram should be doing more to clean up its act? Let us know in the comments!