
> Scholar Thomas Doherty describes how the HUAC hearings swept onto the blacklist those who had never even been particularly active politically, let alone suspected of being Communists...

I was referring more to the original Hollywood Ten. Although, I'm curious to hear how the witch hunt you describe would not, in fact, happen with a similarly concerted attempt targeted at far-right views.

> I think a perfect example of this is the Unite the Right rally 1 year ago, vs the reunion this past year. After the events a year ago, many white supremacists were doxxed, shamed, and fired. This year, no one came out to the reunion. If the rally a year ago was a success, do you think that would be the case?

If anything, that's an argument that we don't have to ban Nazis, because all of that happened without any kind of laws in the first place.

Although, I think this also demonstrates what complete rank amateurs the Charlottesville crowd really was. If they had followed the most basic precautions taken by the antifa/black bloc crowd and just covered their faces, it would have been a lot harder to dox them.

The only reason these people are more visible is because the media is putting a spotlight on them to drive the narrative that they're a growing and increasingly dangerous group. It's very reminiscent of the opening stages of the McCarthyist witch hunts.



> Although, I'm curious to hear how the witch hunt you describe would not, in fact, happen with a similarly concerted attempt targeted at far-right views.

This is a concern of mine, which is why I took time to differentiate Jones and Damore. I see one as having a legitimate cause for deplatforming, and the other as reactionary.

> If anything, that's an argument that we don't have to ban Nazis, because all of that happened without any kind of laws in the first place.

I'm curious, do you see a difference between social deplatforming, and a technological deplatforming? Those Nazis are no longer able or willing to express their ideas in physical space because of pressure exerted by private citizens and institutions. As a society, we've limited these people's free speech, and I view that as A Good Thing because it has resulted in lower attendance at subsequent rallies. In the same vein, big tech companies may also be limiting these people's access to free speech, but I'm okay with that given the views of the people they are limiting.


> This is a concern of mine, which is why I took time to differentiate Jones and Damore. I see one as having a legitimate cause for deplatforming, and the other as reactionary.

Well, isn't that the thing? If you already accept as a given that censorship is legally and morally justified and you turn it into a mere policy decision of who is worthy of censorship, there is a real and present danger of that mere policy decision turning into outright and explicit oppression.

It's like a lot of the classic civil liberties scenarios, like how the "ticking time bomb scenario" can justify torture. So let's talk about that case, by way of analogy. If there's a ticking time bomb, and you can verify within minutes where the bomb is, you can just club someone with a wrench or something until he tells you, and if it turns out the bomb isn't where he told you, you keep torturing him until he tells the truth. In that single, specific, narrow circumstance, you might be able to justify torture. But once you've made a policy of "torture is sometimes justified", how the hell are you going to make sure they only use it in the extreme edge cases where it's called for? More likely, from a rule-utilitarian standpoint, you just end up causing a lot more pain and suffering by letting a bunch of people get tortured unnecessarily.

So, even if there are individual cases where you can have a net reduction in expected totalitarianism and state murder by censoring totalitarian, murderous ideologies, allowing such censorship risks those exact ideologies sneaking in through the back door.

In the free speech case, it's even more dire, because the standard propaganda narrative of murderous, totalitarian ideologies seems to be, "$BOOGEYMAN is murderous and dangerous and we need to restrict civil liberties in order to protect you from them". Nazis never campaigned for universal liberties; they just campaigned for installing themselves as the oppressors and their perceived oppressors as the victims, while pooh-poohing anyone who did advocate for universal liberties.

> I'm curious, do you see a difference between social deplatforming, and a technological deplatforming?

I'm actually idealistic enough that I don't think anyone, even literal Nazis, should be fired for their political views as long as they don't bring those views into the workplace.

I think the pattern of tabooing certain forms of extremism causes more problems than it solves. If you don't actually let racist people say overtly racist things, they're going to say racist-adjacent things that are moderate enough to be held in good faith, like "unrestricted immigration is culturally disruptive" or "the Muslim world doesn't seem to share our cultural values when it comes to respecting women", and then you can't tell the racist trolls apart from people who genuinely hold those moderate views. Now we're at the point of absurdity and indirection where the racist trolls themselves post slogans like "it's OK to be white", which makes it really, really awkward for those of us who aren't racist trolls but don't have any rhetorical space left to argue against the scores of far-leftists who regularly state that, actually, it isn't OK to be white.


> It's like a lot of the classic civil liberties scenarios, like how the "ticking time bomb scenario" can justify torture...

That's all good, but we still have a bomb to find. If we say censorship is off the table of acceptable methods to combat Nazis, what are our tools? I'm going to stress again that I haven't seen any convincing argument that a passive approach will work here. The arguments all sound great ("Just ignore them and let them fizzle out!") but I'd like some historical precedents to look at before I can believe that's true. Additionally, the passive approaches don't mention that these groups are active and recruiting. They have plans and playbooks for radicalizing more members. I feel it is legitimate to worry that they will continue to galvanize and grow in power unless we stop them. But I am open to hearing alternative solutions to censorship, as long as they are more thought out and convincing.

> If you don't actually let racist people say overtly racist things...

I've got almost the same question here. In your ideal world, how do we respond when someone says something racist? Do we ignore it? Do we try to teach them why they're wrong? Do we denounce them and call it a day? These are serious questions.


> In your ideal world, how do we respond when someone says something racist? Do we ignore it? Do we try to teach them why they're wrong? Do we denounce them and call it a day? These are serious questions.

Well, there are two options:

If a lot of people seem to give credence to their ideas, then we absolutely have to debate those ideas in the public sphere, as overtly as possible, and because we happen to be right, we will prevail in an open debate.

If, as seems to be the case today, almost nobody gives credence to their ideas, then we just let them make fools of themselves, kind of like how no one minds David Icke's claims about how the British royal family are secretly reptilians from outer space.

> Additionally, the passive approaches don't mention that these groups are active and recruiting. They have plans and playbooks for radicalizing more members. I feel it is legitimate to worry that they will continue to galvanize and grow in power unless we stop them. But I am open to hearing alternative solutions to censorship, as long as they are more thought out and convincing.

I would actually question the notion that these groups are growing in size and influence. While it's hard to have a perfectly controlled experiment, here's an interesting data point. One of the most politically successful overt white supremacists, David Duke, has run for political office on numerous occasions, including two campaigns for a US Senate seat in Louisiana. Duke went from polling at 11.5% (141,489 votes) in Louisiana's 1996 US Senate election to 3% (58,581 votes) in the 2016 race.


> If, as seems to be the case today, almost nobody gives credence to their ideas, then we just let them make fools of themselves, kind of like how no one minds David Icke's claims about how the British royal family are secretly reptilians from outer space.

Is that really the same, though? The key differences as I see them: 1) History. Racism has been taught for many years (for a long time as "scientifically proven"); I'm guessing Icke's views are rather new. This leads to 2) Cultural relevancy. There are plenty of racist people out there, each generation teaching the next; I'm not sure how many people believe that about the royal family, but I'm guessing not as many. This matters because 3) Racists are harmful. Unless Icke is about to break into the palace to prove his theory, he seems pretty harmless, and we can leave him alone to make a fool of himself. Discrimination and hate crimes are not things I want to leave alone; I'd like them to end.

> I would actually question the notion that these groups are growing in size and influence.

While size may be debatable, I think influence is less so. The data I've seen shows hate crimes are up. The president refuses to call these groups terrorists. Social media is filled with their propaganda and troll mobs. This doesn't feel like a group that is losing power or influence.



