The dangers of AI in the porn industry | DW Documentary

DW Documentary


Millions of women are victims of fake sexual content on the internet, created with AI. Deepfakes can look deceptively real, but the images, audio, or video recordings are manipulated.

When Kate Isaacs clicks on the video, she is thoroughly confused: She and a stranger are having sex. Her first thoughts are: “Who’s that man? Who’s recording…



30 Comments

  1. This is defamation of character, isn't it? You should be able to file a lawsuit against the industry that allows this and possibly prosecute them.

    How is this even legal‼️

  2. This "activist" talks about a private video of a female friend and her partner ending up on PornHub but her activism only addresses the woman. Isn't the man as important?

  3. Deepfakes are easy to distinguish; only dumb people would fall for them. If you feel you're in danger, you can easily ditch social media and delete your entire internet presence.

  4. I knew something like this would happen with people's faces, and I was always scared of it. I didn't know about AI or deepfakes years ago, but I'm just so thankful I don't post my face on social media.

  5. Don't put your face into the public domain. You have to accept that this kind of thing is a possibility when you use the internet and social media to promote yourself and benefit from them.

  6. As a man, I have to take offense at this doc. Why is this problem framed as another horrible thing that only men do to women? Are you suggesting that all men are so inherently physically uninteresting to the world that not one male victim exists? Or that women have such high morals, or such poor tech skills, that none of them has ever made deepfake porn of someone they know, out of spite, jealousy, or personal sexual desire?

  7. I'm an older Millennial. We were taught never to put our image up on the internet. I'm glad social media came after I was done with school. I have stayed off social media my whole life; there is no image of me out there on the internet, and I look for one every couple of weeks. Only once did I have to use the right to my own image to take down a picture of me. I feel sympathy, especially for the women who used social media, but I will never sympathize with politicians lol. Nice try.

    Also, what's up with 19:40?
    Women's problems are the only ones being taken seriously. Just imagine a guy complaining about anything; nobody cares. In what world is that woman living? This world is and always has been gynocentric; that's how we as a species survived. Only recently, with feminism, has that started to change, because gynocentrism is basically only useful when women have children; women who don't want children are as useful to society as men. Welcome to "equality feels like oppression", Mrs. Time to learn that nobody cares about your problems and you need to solve them yourself, like men always did. Man up, feminist.

  8. Great film on a very new challenge to society. AI deepfakes have such a wide range of applications that we need to keep them in check.
    The only thing I didn't like about the film is the nudging towards "men have always exploited women" at the end. Can we please keep it to "bad, immoral people have always exploited others"? There is no point in antagonising the sexes. It's a serious issue, and good people should face and solve it together for healthy relationships and a healthy society.

  9. Do men ever contact you to create a fictional "porn" scene, altering their endowment and/or physique via deepfake? This all feels problematic 😕 Let's just unplug for a bit.

  10. Women FINALLY understand what happens to men when they are falsely accused of rape or sexual assault. There is no difference between deepfakes and false accusations, only in the vehicle used. Yet when it happens to men, they don't care one iota. When it happens to them (deepfakes), the world had BETTER take notice and make it stop. Oh, the hypocrisy is deafening.

  11. 16:24 Exactly. Porn "actresses" in many cases choose to lose their dignity themselves by consenting to this. On the other hand, women whose likeness was stolen and put into these deepfakes never consented to it; therefore it is violence and abuse, and the violators and abusers should be held responsible for their crimes. This surely must be considered a crime, and if it isn't, that kind of lawlessness is not a good thing to have, is it?
    Sometimes medicine that fights fire with fire can be really bitter, so let's not go that way and just be sensible in making and adjusting laws to actually protect women and men (to whom it might also happen, e.g., gay porn, various nasty fetish porn, etc.) of all ages whose likeness is stolen and abused, and to hold the abusers, thieves, and violators responsible.

    This should extend to images of celebs, tbh. It breaks my heart a little to see their images stolen and, even setting aside malicious things like those this video is about, "innocently" "collaged" or republished as is, obviously without permission, while nothing is being done about it. But something should be done, as celebs are also people, human beings, not objects to be thrown around on the web, despite their looks being well known. (I have always felt this way but haven't actually said it before.)

  12. 13:27 I think this signals that it's time to make laws that make this a seriously punishable offense.
    Countries' laws "catch up" to sense eventually; for example, Japan finally banned porn involving children in 2015, 9 years ago.
    But let's not wait that long to make obvious lawmaking decisions.

  13. Perhaps making a deepfake is not a crime yet. But harassing a woman using the deepfake as a pretext is surely against the law; in fact, it would be illegal even if the video were 100% real. So the police should at least act on that.