In an international operation spanning 38 countries, German investigators said on Wednesday that they had dismantled “KidFlix”, a major online platform for the distribution of child sex abuse images with 1.8 million users worldwide.

Cyberlockers continue to be exploited by criminals sharing child sexual abuse imagery. These high-storage sites can be used to share one image or video at a time, or one or more folders potentially containing hundreds of images or videos under a single URL.
- The government is seeking accountability from the platform, akin to measures taken in the United States.
- Analysts upload URLs of webpages containing AI-generated child sexual abuse images to a list that is shared with the tech industry so the sites can be blocked.
While it is illegal to post or share explicit images of someone under the age of 18, Mr Bailey says the police are extremely reluctant to criminalise children for such offences. He says he is more concerned about the risks children expose themselves to by appearing on the site. It may seem that the best solution is to restrict or remove access to digital media, but this can actually increase the risk of harm: a young person may become more secretive about their digital media use and may then not reach out when something concerning or harmful happens. Instead, it is crucial that children and young people have the tools and the education to navigate social media, the internet, and other digital media safely.
Understanding more about why someone may view CSAM can help identify what can be done to address and stop this behaviour – but it is not enough. Working with a counsellor, preferably a specialist in sexual behaviours, can help individuals who view CSAM take control of their illegal viewing behaviour and be accountable, responsible, and safe. Adults looking at this abusive content need to be reminded that it is illegal, that the images they are looking at are documentation of a crime being committed, and that there is a real survivor being harmed by these images.

The site, named Welcome to Video, was run from South Korea and held nearly eight terabytes of content involving child abuse – enough to store hundreds or even thousands of hours of video footage. Other measures allow people to take control even if they cannot tell anybody about their worries, such as when the original images or videos remain on a device they hold, like a phone, computer or tablet.
Children and teenagers are being sexually abused in order to create the images or videos being viewed. Excuses such as “they’re smiling so they must be okay” ignore that these children and young people are being told what to do by adults, may be threatened into doing it, and are not legally able to consent. Having CSAM available online means that children are re-victimised each time it is viewed. The U.S. Department of Justice defines CSAM, or child pornography, as any sexually explicit images or videos involving a minor (children and teens under 18 years old). The legal definition of sexually explicit does not mean that an image or video has to depict a child or teen engaging in sex.
It said it is now liaising with the police, but had not previously been contacted about the account. Aaron was 17 when he started making videos on the site with his girlfriend in Nevada, US. The site requires applicants to pose next to an ID card and then submit a photograph holding it up to their face. But the age verification system failed to distinguish between the two of them at any stage of the process, despite the age gap.