
We literally already had AI fake porn of Taylor Swift making the rounds a while ago. Prepare for women in public positions to face that kind of bullshit more frequently.


Eh, once it's ubiquitous, nobody will care.


Once fakes in politics are ubiquitous, people will stop trusting the real evidence.


That appears to have already happened, no AI required.


Trust in video evidence can certainly fall much lower than it is now.


It's more an issue of indifference than trust. For instance, you can show Trump supporters any number of legitimate videos that depict Trump and his associates saying, doing, and promising all kinds of outrageous, offensive, and destructive things, and they won't care in the slightest. It's not that they don't trust the video, it's that they've been programmed not to care. The leader cannot fail.

That's the ultimate purpose of disinformation -- it's not to make you believe false things, it's to make you believe nothing.

So yes, AI fakery will contribute to that phenomenon on behalf of numerous bad actors, but it was always going to happen anyway. You don't need Hinton and Sutskever on your side if you have Ailes and Murdoch.


> So yes, AI fakery will contribute to that phenomenon on behalf of numerous bad actors, but it was always going to happen anyway.

That's like saying: "Yes, crime might increase, but we will always have crime anyway." What would have happened anyway is irrelevant precisely because it happens anyway. What's relevant is the expected increase in media distrust once anything might be a fake.





