I'd be more than happy to have an anonymized hiring process. If you're right that in-group preference is what drives the gender disparity, we should expect an anonymized hiring process to produce an employee base that's closer to gender parity. Some companies have experimented with this [1]. But interestingly, no tech DEI advocate I've met in real life has been supportive of anonymized hiring. More than a few have actively disapproved, saying that anonymization tends to make representation worse.
Of course, because the problem being addressed is that the tech industry has default, implicit biases in its hiring processes, which tend to favor the majority. Anonymization acts as a force multiplier for those defaults/biases.
I don't understand. If gender discrimination is the cause of the disparity, anonymization should eliminate the disparity. Under an anonymous hiring process, you can't know the gender of the applicants and so you can't discriminate on the basis of gender.
If coding interviews were done with cameras off, and voice masked so gender can't be known, how would that be more subject to bias than with the camera on and the gender known to the interviewer?
When orchestras put a veil between the auditioner and the evaluators, that made the process more biased? That's new to me.
Anonymization wouldn't work. It turns out people are very, very good at picking up subtle signals that hint at what the anonymized identity likely is. What ends up happening is more discrimination rather than less. If you don't anonymize, people will still discriminate, but they are always aware their decisions may be illegal or unacceptable. But when you anonymize, they just discriminate against anyone who shows any hint of belonging to the discriminated class. And now you have given them plausible deniability by anonymizing.
This has been demonstrated to be true over and over again. Anonymizing is an elementary-school solution to a complex PhD-level problem.
Some interviews I've encountered consisted of uploading code that gets executed on a remote server. Grading is done exclusively on correct output, runtime, and memory usage. This is a truly anonymous interview that cannot be biased with regard to protected class. How does such an interview pick up on the protected classes of the candidate?
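To make that concrete, here's a minimal sketch of what such a grader reduces to (hypothetical: the grade function, the Python-submission assumption, and the Unix memory accounting are my illustration, not any real company's pipeline). The score is a pure function of output, runtime, and memory; there is no input through which gender or any other protected class could enter:

    # Blind autograder sketch: the score depends only on the
    # submission's observable behavior, never on who wrote it.
    import resource
    import subprocess
    import time

    def grade(submission_path, stdin_data, expected_stdout, time_limit_s=2.0):
        start = time.perf_counter()
        try:
            proc = subprocess.run(
                ["python3", submission_path],
                input=stdin_data, capture_output=True,
                text=True, timeout=time_limit_s,
            )
        except subprocess.TimeoutExpired:
            return {"correct": False, "runtime_s": time_limit_s, "peak_kb": None}
        runtime_s = time.perf_counter() - start
        # Peak RSS of child processes (Unix; reported in kilobytes on Linux).
        peak_kb = resource.getrusage(resource.RUSAGE_CHILDREN).ru_maxrss
        return {
            "correct": proc.stdout.strip() == expected_stdout.strip(),
            "runtime_s": runtime_s,
            "peak_kb": peak_kb,
        }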
The hostility to anonymized interviews stems from the fact that one cannot discriminate in favor of particular demographics. This is the goal of the above commenter, as stated here: https://news.ycombinator.com/item?id=42830509
I think we're talking past each other. Tech hiring is implicitly biased toward what's considered culturally normative. That's not just about gender labels or how someone looks; it's also about how the applicant phrases and delivers answers to questions. The high-confidence, authoritative tone used by many western white male engineers tends to be implicitly preferred over, for example, the more nuanced, lower-confidence response a non-western woman engineer might deliver.
Every company I've worked at grades interviews based on correctness and performance. A candidate who fails to produce a working solution at all receives a worse score than one who produces a working but inefficient solution, which in turn gets a worse score than a working and optimal solution.
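That ordering is just a monotone scoring function. A minimal sketch, assuming the three tiers above (the function name and point values are illustrative, not any company's actual scale):

    # Ordinal rubric: not working < working but inefficient < working and optimal.
    def rubric_score(works: bool, optimal: bool) -> int:
        if not works:
            return 0                # no working solution
        return 2 if optimal else 1  # optimal beats merely working

Note that nothing about the candidate appears in the inputs.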
And again, if the bias comes from people's tone, then the interview can be conducted over text. Or the hiring committee can work from a transcript of the interview, to ensure that a "high confidence and authoritative tone" doesn't introduce bias. Bias can be eliminated. And if the disparity remains the same, the disparity is not due to bias.
You continue to focus very narrowly on the specific details of the hiring process. I'm trying to make points about higher-level stuff, related to the intent and scope of DEI-type initiatives. From these few comments, I gather that you're not really interested in talking about any of those higher-level things, so I'll stop trying to explain them.
The specific details of the hiring process are in question. You are running away from grappling with the (increasingly likely) possibility that bias wasn't the (only) driver of the hiring disparity.
The point I'm trying to make is that the details of any specific hiring process aren't really germane to the overall discussion. Hiring disparity is a metric that's measured at a much higher level than any individual organization.
> Hiring disparity is a metric that's measured at a much higher level than any individual organization.
And how do we know if the hiring disparity is due to bias? The details of the hiring process are absolutely relevant, because the notion that the hiring disparity is due to bias is a claim about the details of the hiring process.
My main issue with a lot of DEI programs is that they don't try to eliminate bias. They just assume disparities are due to bias and work towards "fixing" those assumed biases with explicit discrimination. The problem is that when you actually try to measure and quantify bias in tech, the results often aren't what DEI advocates assume. E.g. https://www.pnas.org/doi/10.1073/pnas.1418878112
This is why there's such a strong pushback against anonymizing interviews and other bias mitigation measures. What happens if your interviews and applications are all anonymized and the hiring disparity remains? The justification for "fixing" the representation is now a lot weaker, since it's harder to claim it's due to bias. "Let's address bias in our hiring process" is a lot more popular than "let's set quotas". So the people who want quotas try to claim that they're just fixing bias by setting quotas.
You don't eliminate biases by focusing on the gender ratio of the orchestra. You eliminate biases by putting a veil between the auditioner and the evaluators. We all know this, but some people feel compelled to pretend that they're working towards eliminating biases when in reality they're working towards achieving certain demographic outcomes.
You’ve explained your position and OP exposed the holes in your logic. Please don’t pretend to take the high road when someone has engaged with good faith discussion that didn’t end the way you hoped.
This silly solution is often suggested by tech bros who have a really rudimentary understanding of social problems. I guess it makes sense to someone who doesn't fully understand social issues and thinks about how they would solve it if it were a technical problem. It's not. Anonymizing does not reduce discrimination; it demonstrably makes it worse. The only things that reduce discrimination are, in the short term, rules that deliberately give a relative advantage to the discriminated class, and in the long term, socio-cultural shift.
It's not that the "tech bros" don't understand the problem. It's that DEI advocates tend to misstate the problem they're trying to solve.
> The only things that reduce discrimination are, in the short term, rules that deliberately give a relative advantage to the discriminated class, and in the long term, socio-cultural shift.
This isn't reducing discrimination. This is deliberately engaging in discrimination to change demographic outcomes.
We could have avoided a lot of confusion if DEI advocates were honest that their goal is not to eliminate discrimination but rather to employ affirmative action to achieve more equitable outcomes. If that's your goal, then of course anonymized hiring doesn't work: if you can't tell which candidate belongs to which demographic, then you don't know whom to give the advantage to.
1. https://interviewing.io/blog/voice-modulation-gender-technic...