New legislation in the UK would punish those who digitally alter and spread sexually explicit images with unlimited fines or prison sentences.
The problem is that technology has made it possible to invade someone’s privacy without even being able to see them. In the old days you’d have to cut out a picture of someone’s head and Pritt Stick it onto a rude photo from a magazine you found in a bush. That didn’t really embarrass the victim so much as it made you look weird.
With AI, they can take a picture of your face and somehow seamlessly create an adult video of you, and let’s be honest, it looks better than the real thing would. The AI version of me hasn’t eaten all those pizzas.
This new law makes sense. Spreading these images is an act that victimises someone. The fact that, to break the law, you have to spread the images or use them to intimidate the person featured means you can still make them for your own personal use, you wrong ‘un.
The government will table an amendment to a criminal justice bill that is before parliament, but officials would not say when the law would be implemented.
That feels like it’s inviting people to hurry up and make all the deepfakes they need before the cut-off.
Reality star Cally Jane Beech is quoted in the article saying the law is a good thing, and her picture is credited to “Splash News”. I didn’t know there was such a specialist site.
» Read the source story
'So now I actually have to film those scenes!'

@TheSimonEvans, @LeoKearse and @MrSteveNAllen jokingly react to a story from Wednesday's Times about deepfake pornography. pic.twitter.com/jkDrFLBbYN
— GB News (@GBNEWS) April 17, 2024