
They literally have a demonstration of the tool being used to evade an intrusive "send us a video of your face, turn your head both ways, blink".

That isn't a defensive use!?



Definitely useful for scammers.


Then build a cultural revolution that instills distrust in what you see. We already teach the elderly not to trust a link that says, "you are the millionth visitor, click to win your prize". Teach them also to look past the video call and what looks (and sounds) like their grandson to what is actually being said. He's asking for your bank password, that behavior doesn't match!

The cat is already out of the bag. Going "nooo, unpublish this" doesn't protect people from scammers, who will continue to use it.

Society must unlearn that seeing is believing. That is what is morally irresponsible, not this software.


I don't think anyone so far has called for this to be unpublished, just for it to be more carefully considered and for the ethical aspect to be addressed when publishing. That's part of 'building a culture of distrust'.

Knee-jerk refusal to even consider thinking/talking about the use of technology is far more irresponsible than any other stance.


Ignoring human nature is far worse. There are people who will release this simply because you tell them no.

Even beyond that, if you're an authoritarian nation with a rather good lockdown on your news and internet, what's the downside of throwing tools like this at democracies?

In my view, the age of tools like this has been here for some time and they are now taking their next evolutionary step, and there is very little we can do to prevent their distribution without affecting other pieces of software and their distribution.


But you are making the argument that releasing it is protective, and there is no basis for that argument.

> Society must unlearn that seeing is believing.

A total lack of trust of our shared perception? Let me know how that works out. Might as well have everyone on acid 24/7.


>A total lack of trust of our shared perception? Let me know how that works out. Might as well have everyone on acid 24/7.

It's no different from religion, which was the backbone of Western civilisation for quite some time.


Religion is a contemplation of free will, not sure if contemplating that is like being on acid :)


Religion claims that certain things are true when they contradict observable reality. There's no fundamental difference between claimed gods and deepfaked ones.


Many of the claims of religion can’t be contradicted by observed reality because they are pointing to things beyond the observable (i.e. non-falsifiable, like the proposed existence of a multiverse).

For example, religion claims you have free will. For a materialist, any interesting definition of free will is impossible, because we are made of quantum particles with defined statistical behavior.

Of course, we cannot prove or disprove the existence of free will in other people either…


The deepfaked ones actually speak.


Religion is a very complicated and touchy subject - it means drastically different things to different people.


Yeah I can’t believe that people think it’s possibly a desirable future for there to be no such thing as trust, and furthermore that we should accelerate our march in that direction. What an insane way to live.


It's not desirable at all, but it is inevitable and in many ways already here. While I'm not sure I agree with the argument, I think the idea is that holding this stuff back just makes it more powerful and nefarious for those who do have it, because the population at large will remain unaware for longer that trust is already gone. And that means innocent people are going to get hurt while that knowledge asymmetry persists.


I honestly am puzzled: both of you (if I understand you right), do you really think some fake videos will cause there to "be no such thing as trust"? Why?

Videos aren't the only way to understand the world, and there was never any such thing as "direct observation" in the first place.


Videos are considered one of the more authoritative sources of factual information. E.g. imagine how disruptive it would have been if a deepfake of Biden announcing support for Putin’s invasion of Ukraine had circulated during the first hours of the invasion. Sure, there are other communications channels that would be able to dispute the authenticity of that video, but would those channels be as fast and as (apparently) authoritative as video of Biden “himself” declaring his support?

Now consider a few months after this hypothetical fake-and-refutation: what if Biden actually does announce an emergency threat to America, and people end up waiting a few days (or weeks, or months) to see whether a refutation comes before believing it?

The “no such thing as trust” is not coming from this innovation in particular but rather the confluence of two general trends: 1) heavy reliance on digital media and 2) ever-increasing ability to fake digital media.

Note that you don’t even need large, successful falsified media campaigns to create distrust. All you need is for people to know that the capability exists, and trust is immediately damaged. Even trivially-detectable falsifications can end up being difficult to sort through if there are a billion fake pieces of information crowding out a hundred real pieces of information. This is true even if all billion fakes are trivially detectable by a human (which they won’t be).


All of this is not "no such thing as trust".

I too dread a little all the mischief that's going to happen while the world learns about this. But people don't have to uncritically believe "the evidence of their own eyes", even though that has been a very useful corrective to many mistakes, and will, I guess, be less so now. Evidence doesn't work that way: people always have to interpret everything in terms of their understanding of the world, and they can get better at that in response to changes like this.


I agree. I don't think this in particular will land us in "land of no trust." I think it, and innovations like it, move us clearly towards that land though.

The advent of widespread photo/video evidence was a big help for people trying to interpret what was real and what wasn't. Then photography was knocked out of service by e.g. Photoshop (more generally, "easy photo manipulation"). Now video is getting knocked out of service as well. So, yes, it removes a useful tool in interpreting the world's evidence, and it's not clear what will replace that tool (if anything).

To be clear, the "no such thing as trust" was specifically a response to GP's assertion that people will "have to unlearn that seeing is believing." Okay, fine, but then what can you believe, if not even what you see? Again, this specific tech doesn't get us all the way there, but it gets us closer.


Trust in photos wasn't destroyed by the advent of Photoshop, and the same goes for videos.

There are ways to prove authenticity, e.g. with cryptographic signing.
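As a minimal sketch of what that could look like (assuming the publisher, e.g. a camera vendor or news outlet, holds an Ed25519 key pair, and using a hypothetical file clip.mp4 with the Python `cryptography` library; this is an illustration, not anything from the tool being discussed):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # The publisher holds the private key; viewers only need the public key.
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    # Sign the raw bytes of the clip at publication time.
    with open("clip.mp4", "rb") as f:
        clip_bytes = f.read()
    signature = private_key.sign(clip_bytes)

    # Anyone with the public key can later check the clip hasn't been altered.
    try:
        public_key.verify(signature, clip_bytes)
        print("signature valid: bytes unchanged since signing")
    except InvalidSignature:
        print("signature invalid: the clip was modified or isn't the original")

Of course this only proves the bytes match what the key holder signed; it says nothing about whether the recording itself was staged, so key distribution and deciding whose signatures to trust remain the hard part.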


Why weren't people this concerned about the existence of photoshop? To this day people continue to try to fool and manipulate others by using it. You don't even need video or photos to trick someone. Plenty of people lie in text, but text editors aren't considered dangerous tools.

People have been aware that lies can be typed for as long as we've had typing, but it still works. The fact that it still works a lot of the time hasn't ended the world. People are more aware that they can't trust a photo, yet people still fall for edited photos, and the world still turns. I don't think deep fakes are any different. Videos were not very trustworthy before deep fakes. Special effects have been a thing for many, many decades. The world is still learning exactly how untrustworthy video is, and there will always be people who fall for tricks and manipulations, but eventually the majority will adjust and society won't crumble.

We'll put our trust in what we see with our own eyes and in people and organizations who have proven themselves to be trustworthy. We'll worry about how real something is in proportion to what it would mean to us if it were real. When it really matters we'll trust experts to authenticate a photo or video as legitimate, just like we do currently. If the world becomes less dependent on information presented in formats that we know can be convincingly manipulated, we'll all be better off. History tells us technology like this in the hands of the public won't cause people to abandon faith in everything and anything they see, or destroy civilization. We all just get a little more savvy, more industries will take advantage of the cool tech, lots of people will have fun and games with it, and not much really changes for most people.


Really really knocking down these straw men one by one eh? Anyway, people do complain about Photoshopping, e.g. the psychological effects of digitally-generated beauty standards. No, photography isn't broken, but yes there is less trust in what a photograph is saying. No, this won't "break" video, whatever that means, but yes of course it will mean people will be less trusting of video.

No one said it'll destroy civilization, don't worry.



