The dangers of AI in the porn industry | DW Documentary
Millions of women are victims of fake sexual content on the internet, created with AI. Deepfakes can look deceptively real, but the images, audio, or video recordings are manipulated.
When Kate Isaacs clicks on the video, she is thoroughly confused: She and a stranger are having sex. Her first thoughts are: “Who’s that man? Who’s recording…
The world is going crazy day by day. This is a serious issue, and there is no law protecting the victims. It's not just women; it can be anyone, you or me.
good luck going up against SATAN
This is defamation of character, isn't it? You should be able to file a lawsuit against the industry that allows this and possibly incriminate them
How is this even legal‼️
It's not possible to ban this. What if someone made a video but did a slight modification of the person's face?
Disturbing
These are women who inspire me. We will win!
The "His Hand" in Daniel 11:41 am understanding to be the ideology of Patriarchy.
This "activist" talks about a private video of a female friend and her partner ending up on PornHub but her activism only addresses the woman. Isn't the man as important?
People will have to get used to that. It won't go away and there's nothing anybody can do about it.
Deepfakes are easy to distinguish. Only dumb people would fall for them. If you feel you're in danger, you can easily ditch social media and delete your entire internet presence.
Go check how well they have made these deepfake videos. I hope they are as good as described 😮
I knew something like this would happen with people’s faces, and I was always scared of it. I never knew about AI or deepfakes years ago, but I am just so thankful I don’t post my face on social media.
Don't put your face into the public domain. You have to accept this kind of thing is a possibility when you are using the same tool to promote yourself and are gaining from using the internet and social media.
As a man, I have to take offense at this doc. Why is this problem framed as another horrible thing that only men do to women? Do you suggest that all men are so inherently physically uninteresting to the world that not one male victim exists? Or that women have such high morals, or such poor tech skills, that none of them have ever made deepfake porn of someone they know, out of spite, jealousy, or personal sexual desire?
Don't worry, soon, thanks to the new AI regulations, there will be no porn industry, as watching porn will be a privilege reserved for the wealthiest.
I'm an older Millennial. We were taught to never put our image up on the internet. Glad social media came after I was done with school. I stayed off Social Media for my whole life. There is no image of me out there on the internet. And I look for some every couple of weeks – I only once had to use the right to my own image to take down an image of me. I feel sympathy, especially for women who used social media – but I will never sympathize with politicians lol. Nice try.
Also what's up with 19:40?
Women's problems are the only ones being taken seriously. Just imagine a guy complaining about anything; nobody cares. What world is that woman living in? This world is and always has been gynocentric; that's how we as a species survived. Only recently, with feminism, has that started to change. Because gynocentrism is basically only useful when women have children, women who don't want children are as useful to society as men. Welcome to "equality feels like oppression," Mrs. Time to learn that nobody cares about your problems and that you need to solve them yourself, like men always have. Man up, feminist.
Why are women worried about their identity being used in AI porn? Why are only women so worried? 😂
😂🤣🤣🤣🤣🤣🤣🌋🌋🌋🌋🌋🌋🌋🌋🌋🌋🌋🌋🌋🌋
"…of all the pictures on the internet 96% of them are pornographic…." I suspect that this quote from the video is accurate.
Great film on a very new challenge to society. AI deepfakes have such a wide range of applications; we need to keep them in check.
The only thing I didn't like about the film is the nudging towards "men have always exploited women" at the end. Can we please keep it to "bad, immoral people have always exploited others"? There is no point in antagonizing the sexes. It's a serious issue, and good people should face and solve it together for healthy relationships and a healthy society.
Even if she did make a porn video… what about that is wrong??
Do men ever contact you to create a fictional "porn" scene? Altering their endowment and/or physique via deepfake. This all feels problematic 😕 Let's just unplug for a bit
False accusations against men should then be considered violence as well.
Thank God I’m ugly, so no one will deepfake me, but this is horrific. I feel so sorry for these women
I'm very sorry for the victims of this and i support their fight but giiirl, you speak uglyyyy
It's not violence to any degree. But it is defamatory and clearly subject to tort action. If I owned a law firm, I'd be all over this.
Women FINALLY understand what happens to men when they are falsely accused of rape or sexual assault. There is no difference between deepfakes and false accusations, only in the vehicle used. Yet when it happens to men, they don't care one iota. When it happens to them (deepfakes), the world had BETTER take notice and make it stop. Oh, the hypocrisy is deafening.
16:24 exactly. porn "actresses" in many cases choose to lose their dignity themselves by consenting to this. on the other hand, women whose likeness was stolen to put into these deepfakes have never consented to it, therefore it is violence and abuse, and violators and abusers should be held responsible for their crimes. this surely must be considered a crime, and if it isn't, that's lawlessness, which is not a good thing to have, is it?
sometimes medicine that fights fire with fire can be really bitter, so let's not go that way and just be sensible in making and adjusting laws to actually protect women and men (to whom it also might happen, e.g., gay porn, various nasty fetish porn, etc.) of all ages, whose likeness is stolen and abused, and to hold abusers, thieves, and violators responsible.
this should extend to images of celebs tbh, as it breaks my heart a little to see their images stolen and (even not talking about such malicious things as this video is about) "innocently" "collaged" or republished as is, obviously without permission, and nothing is being done about it. but something should be done, as celebs are also people, human beings, not objects to be thrown around on the web, despite their looks being well known. (i have always felt this way but haven't actually said it before.)
13:27 i think this signals that it's time to make laws that make this a seriously punishable offense.
countries' laws "catch up" to sense eventually, e.g., Japan finally banned porn involving children in 2015, or 9 years ago.
but let's not wait that long to make obvious lawmaking decisions.
Perhaps making a deepfake is not a crime yet. But harassing a woman using the deepfake as a pretext is surely against the law. In fact, it would be illegal even if the video were 100% real. So the police should, at least, act on that.