
Schools are increasingly on the frontline of AI generated child abuse material

Source link : https://photovideomag.com/news/schools-are-increasingly-on-the-frontline-of-ai-generated-child-abuse-material/

RHIANA WHITSON, REPORTER: The ability to create fake pornographic images of everyone from celebrities to children using artificial intelligence has become frighteningly easy. 

SUSAN MCLEAN, CYBER SAFETY EXPERT: There are hundreds of apps that promote creating a nude image or taking someone’s clothes off. There’s a variety of different terms for these apps or platforms; it is as simple as accessing them.

JULIE INMAN GRANT, ESAFETY COMMISSIONER: Apps that blatantly say you just choose the age of a child you want and a body type, and within seconds, we will create an image for you.

RHIANA WHITSON: And schools are increasingly on the frontline of AI generated child abuse material.

JULIE INMAN GRANT: We’ve given these incredibly powerful tools to kids on their smartphones, where they’re harvesting images of their classmates on social media and they’re creating deep faked image-based abuse, which is incredibly distressing, and when it deals with under 18s, it’s also synthetic child sexual abuse material.

RHIANA WHITSON: Today these students are focusing on normal teenage things like their school formal.

But earlier this month they were shocked to learn AI deepfake nude images were created of 50 of their female classmates and shared on social media.

JAKE PARTON: It was gross. It was disgusting to see something like that happen to people in and around our communities.

TANSY SEYMOUR: You do worry about what’s being put online and what you put out online.

ANDREW NEAL, BACCHUS MARSH GRAMMAR PRINCIPAL: This is something that affects the 50-odd girls. But the reality is, it’s reverberated throughout the community as well.

RHIANA WHITSON: Bacchus Marsh Grammar principal Andrew Neal has been working with Victoria Police. So far, no charges have been laid against the teenage boy, who is a former student of the school. 

ANDREW NEAL: This is not a Bacchus Marsh Grammar issue. This is not a school X or Y issue. It’s an issue of every school in the country, and indeed every school anywhere. 

SUSAN MCLEAN: Three words that will help you make good choices online – respect, responsibility and reputation.

RHIANA WHITSON: Former Victoria police officer Susan McLean agrees. She travels around the country educating schools and parents about cyber safety.

SUSAN MCLEAN: It’s extremely widespread. I’ve been dealing with AI generated nudes for probably 18 months. Over the last 18 months, it has got more, it has got worse. 

ANDREW NEAL: We can educate children, we can provide them with information, and we can tell them about the dangers, their parents can do that. Ultimately, we need the support of the third arm, which is government to put some boundaries in. 

JULIE INMAN GRANT (reading): Nudify any girl with the power of AI, just choose body type and get results in a few seconds.

RHIANA WHITSON: Australia’s eSafety Commissioner is asking federal parliament to support new powers to force social media companies and other online cloud and messaging services to detect and remove child abuse material like AI deepfakes. 

JULIE INMAN GRANT: The companies are, frankly, turning a blind eye on file storage platforms and photo sharing platforms. So that’s really the holy grail for paedophiles.

These standards will put the onus back on the platform libraries and the AI apps themselves to stop this kind of content being created.

RHIANA WHITSON: Susan McLean is supportive of the commissioner’s new powers, which include fining tech companies up to $800,000 for every day an image remains online, but she’s also concerned they won’t stop the offending.

SUSAN MCLEAN: We know that fines alone don’t always change business practices of platforms because they have a truckload of money and they can simply pay the fine, and they can keep doing the same thing over and over again. 

RHIANA WHITSON: The cyber safety expert says anyone caught producing online abuse material should feel the full force of the law. 

SUSAN MCLEAN: It is absolutely 100 per cent the offender who’s in the wrong, then we have to look at well, are there things that platforms can do and app stores can do to reduce the likelihood of this offence taking place?

RHIANA WHITSON: She says examples of online misogyny aren’t restricted to AI deepfakes. Last month two students were expelled from Yarra Valley Grammar in Victoria for creating a list of female students and ranking them from “wifeys” to “cuties” to “un-rapeable”.

HONOR THORNTON: Everyone’s hearts basically went out to like those group of girls and just sort of, you know, poor things. 

NOAH CAMERON: Everyone was able to see just how kind of objectifying and humiliating it was. 

RHIANA WHITSON: How common is this kind of behaviour? 

MARK MERRY, YARRA VALLEY GRAMMAR PRINCIPAL: Very common, almost universal, in schools, to be honest, almost universal. 

I’m not sort of absolving ourselves of what happened here by any stretch. But I can say that if we believe that it’s a problem only in certain schools or certain types of students, then we’re missing the point.

RHIANA WHITSON: Yarra Valley principal Mark Merry says so far his school has avoided being caught up in a scandal involving AI deepfake sexually explicit images.

MARK MERRY: Could it happen here? Absolutely. It’d be a very brave principal anywhere in Australia, who says these things can’t happen here because they can. 

RHIANA WHITSON: He says while schools have a role to play in teaching respect, most online abuse material is not created during school hours. 

MARK MERRY: The primary responsibility has to be the family. Now, I’m not blaming families, because the technology, to be honest, has got ahead so quickly that a lot of parents just aren’t familiar with it.

SUSAN MCLEAN: We have to work more on education because a lot of young men are growing up with a culture of pornography and that’s what they’re using as their sex education tool. They’re using it to work out how to have a relationship and of course, that then translates to their willingness to create these nude images because they don’t actually see anything wrong with that.

Source : https://www.abc.net.au/news/2024-06-24/schools-on-the-frontline-of-ai-generated-child-abuse-material/104016762

Author :

Date : 2024-06-24 10:22:19

—-

Author : PhotoVideoMag

Publish date : 2024-06-28 11:29:00

Copyright for syndicated content belongs to the linked Source.
