Sick sexualized “face swapping” apps aimed at children nine and older
VIDEO apps, which can instantly create sexualized fake likenesses, are targeting children as young as nine, a Sun on Sunday investigation has found.
Teens simply enter a photo of a boy or girl’s face and within seconds it’s transferred to a scantily clad body in a provocative pose.
One of the artificial intelligence-based apps, Facemega, was removed from both the Apple App Store and Google Play this week following our investigation.
But app analytics firm AppMagic estimates that the app has been downloaded more than a million times since its launch last year.
The use of AI-driven fake videos and photos has increased by 900 percent since 2019.
Carolyn Bunting, CEO of children’s online safety organization Internet Matters, told us: “The creation of sexualised, non-consensual deepfakes in these apps is incredibly troubling, as is the impact this type of content can have on children.
“The sexual use of someone’s image without their consent will be extremely harmful to the child. It can lead to complex and lasting problems that affect their well-being.”
Before Facemega was pulled from the app stores, it had climbed to number 77 on the entertainment charts – ahead of Lego. It cost £7.49 a week and was rated as suitable for children aged nine and over.
While putting young lives on the line, it has earned millions of pounds for both the app stores and developer Ufoto Ltd, owned by Chinese parent company Wondershare.
Users are never asked to verify their age when accessing the image-altering technology, but the selection of videos to have a face grafted onto includes scantily clad women in bikinis and a section titled Hot.
Within ten seconds of uploading your chosen mugshot, the AI grafts it onto another body, with often startling results.
Following our investigation, Facemega’s developer removed the Hot and For Women categories – which contained sexually provocative videos – from their app.
Apps similar to Facemega remain on mainstream platforms, weaving their worrying web of twisted reality.
Deep Fake: Face Swap Video, an app created by US company Deepfaker LLC, was promoted on the App Store this week as suitable for children aged four and older. Its ad shows a young woman’s face being swapped onto someone else’s social media image, albeit not in a sexual manner.
Then a video starts playing, making it difficult to tell the difference between what is real and what is fake.
A three-day free trial results in a £7.99 per week subscription.
Faceswap, which is also listed on the App Store for children aged nine and over, gives kids free access to deepfakes before charging a £19.99 annual subscription.
Our revelations come just months after ministers announced that deepfake pornography would be targeted, making the unauthorized creation and sharing of such images illegal under the Online Safety Act, which has been passed but has not yet come into force.
The children’s charity NSPCC has called on the government to legally oblige the major app stores to help protect those targeted by these apps, especially women and girls.
Rani Govender of the NSPCC added: “App stores play an important role in preventing the risks of deepfake technology at source. The government can also act through its Online Safety Act, legally obliging businesses to combat violence against women and girls online.”
Communications regulator Ofcom said last year that fake or misleading images and videos were among the top 20 potential online harms faced by UK internet users.
Education platform Safer Schools says the number of deepfakes online rose from about 14,000 to 145,000 between 2019 and 2021 — a 900 percent increase. Of these, 96 percent contained pornographic material, while about 90 percent contained indecent images of young women.
The NSPCC’s Ms Govender added: “Deepfake technology is already having an insidious impact on children as it becomes easier and easier to create and share this degrading and harmful material.
“This rapidly evolving technology is quickly becoming a child abuse risk as it is introduced without proper consideration of the way in which it encourages the abuse of intimate images.
“Girls and women suffer most from apps like this, which exist in a toxic online culture of misogyny that’s growing at a worrying rate.”
Apple confirmed it removed Facemega from the App Store and said it has no specific rules for deepfake apps, but that it bans apps with pornographic, defamatory or discriminatory content.
A Google Play spokeswoman confirmed that Facemega was removed from its platform, but did not comment on other apps.
Tory MP Siobhan Baillie called the deepfake technology appalling, adding: “Clearly age verification and additional safeguards need to be considered.
“I applaud The Sun on Sunday for getting this app removed from the Apple App Store and Google Play. Our children must be protected from the deepfake threat.”
Three victims tell their story
CHILDLINE has shared details of three teenagers who were threatened with fake videos and photos, showing how traumatic the experience can be.
A 14-year-old girl told how an online abuser threatened to make a fake video of her if she refused to send nude photos.
She said: “I was being friendly, just making small talk with someone on Snapchat. They asked how I looked, so I sent a picture of my face, then they kept asking me for nudes.
“I told them no, but they said if I don’t they will edit my face on nudes and sell them. I know I should report them but it won’t change anything as they will still have my photos on their camera roll. Please help me, I’m really worried.”
A terrified 13-year-old said: “Someone I know is threatening to post a fake nude and claim it’s me unless I send her real nudes.
“She says she’ll tag my friends and show them it’s me. I’ve never sent nudes before and I worry that my real friends will judge me if this happens.
“I met this person online and we used to be friends but we haven’t spoken to each other for a while. I don’t understand why she is doing this to me. I do not know what to do.”
A third teenager told Childline he had reported fake pornography of himself to the police.
He said: “I feel so embarrassed and angry. Someone created a fake account on Instagram under my name and posted inappropriate pictures (porn) with my face on them.
“I reported the account and the police tried to trace the person. I feel a little safer knowing that, but I worry my friends will find out and I’ll be bullied for it.”