Instagram is facing ad suspensions from companies concerned about child sexual abuse material. Two dating app companies have paused their ads on the platform, claiming Instagram breached an agreement to show their ads only in safe environments, free of inappropriate content such as sexually explicit material or hate speech.
According to a report published by The Wall Street Journal on Monday (the 27th), Instagram’s recommendation algorithm can surface unsuitable content. In tests, accounts received content suggestions featuring child and teen influencers alongside sexually suggestive videos.
The journalists began investigating after noticing adult men among the followers of young gymnasts, cheerleaders, and other underage influencers. They found that Instagram served “shocking doses” of such content, including images of children and sexually explicit adult videos, alongside ads from major US companies.
In one test of Instagram’s feed, an ad for the dating app Bumble appeared between a video of a person caressing a life-sized latex doll and a video of a young person lifting their shirt to expose their abdomen.
Another instance highlighted by the newspaper featured a Pizza Hut commercial following a video of a man lying on a bed with a caption claiming the person beside him was a 10-year-old girl.
Investigation Exposes Pedophilia Network on Instagram
Meta defended itself, claiming that The Wall Street Journal’s tests produced results inconsistent with what billions of other users see. A spokesperson pointed to new ad-control tools introduced in October to address such issues.
Ads from companies including Disney, Walmart, Match Group (owner of Tinder), and Hims (a telehealth company that sells erectile dysfunction treatments) appeared alongside inappropriate content in the tests.
Match Group spokesperson Justine Sacco said the company refuses to pay Meta to market its brands to predators or to place its ads near such content. Bumble spokesperson Robbie McKay said the company would suspend its ads across Meta’s platforms, emphasizing that Bumble would never intentionally advertise next to inappropriate content.
A separate Wall Street Journal investigation, published in June, found Instagram to be the main social network connecting pedophiles who share child sexual abuse material, and suggested that recommendation algorithms play a key role in this. Large accounts openly advertised the material, often through hashtags related to pornographic content, and used direct messages to connect buyers and sellers.