
As an interviewer at one of the FAANG companies I can confirm this is one tactic we use as well. A lot of people are very confident spewing trash for as long as you'll let them.


Likewise. I open my interviews with the explicit statement that I’d like them to disclaim when they don’t know, and, furthermore, that the discussion is arranged around challenging them until we reach that point. We have less than an hour together and I need to judge your technical abilities in an intrinsically imperfect medium. Help me help you - I can only work with what I’m given. If you bullshit, what I’m given isn’t good.

I don’t really get it. I can perhaps understand that some candidates might have gotten good feedback from blatantly guessing in the past, but that’s why I now explicitly tell them to disclaim guesses. If anything it looks more impressive when you honestly don’t know something but intuit the substantially correct answer (as long as it’s something that could be realistically intuited).

Yet even with my disclaimer, I’ve still conducted phone screens and onsite interviews where the candidate eventually started bullshitting. It’s one thing to say you don’t know and give a wildly incorrect answer - at least then I can try to steer the interview towards another of the candidate’s strengths. It’s even okay to preface your wild guess with an “I think...”. But the cavalier way in which people will just spout nonsense is disturbing. Even if you’ve been performing well up to that point, engaging in bullshit is nearly immediate grounds for me to discount you as a candidate.


Obviously that is part of the point of the interview, though, right?

You are able to identify these over-confident people and prevent them from being a toxic influence on a team.


Well yeah, that's why I continue to ask. I guess what I was getting at is that I'm sort of shocked people will still bullshit despite my explicit declaration ahead of time.


After they said "I don't know", I'd then encourage (or nudge) them to derive a solution. It gave me very good insight into their thought process and their ability to apply their knowledge to solve an unseen problem.

For instance, students (during campus interviews) would at some point reach a dead end while explaining how a process scheduler works. However, after being encouraged to work it out, about 70% ended up with some version of a timer interrupt. It was fascinating to watch them go through the process.
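
To make the punchline concrete: the insight those students work their way towards is that preemption needs an event the kernel controls - a periodic timer interrupt that yanks control back from whatever is running so the scheduler can pick the next task. Here's a rough user-space sketch (my own illustration, not anything from the interviews; it fakes the "interrupt" with POSIX setitimer/SIGALRM and a trivial round-robin policy):

    /* Minimal sketch: a periodic timer "interrupt" preempts the current
     * task and lets a round-robin scheduler pick the next one.
     * Simulated in user space with setitimer/SIGALRM for illustration. */
    #include <signal.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/time.h>
    #include <unistd.h>

    #define NTASKS 3

    static volatile sig_atomic_t current_task = 0;  /* task "on the CPU" */

    /* The "timer interrupt" handler: preempt and pick the next task. */
    static void timer_tick(int sig)
    {
        (void)sig;
        current_task = (current_task + 1) % NTASKS;  /* round-robin */
    }

    int main(void)
    {
        struct sigaction sa;
        memset(&sa, 0, sizeof sa);
        sa.sa_handler = timer_tick;
        sigemptyset(&sa.sa_mask);
        sigaction(SIGALRM, &sa, NULL);

        /* Arm a periodic timer: a 100 ms time slice. */
        struct itimerval quantum;
        memset(&quantum, 0, sizeof quantum);
        quantum.it_interval.tv_usec = 100000;
        quantum.it_value.tv_usec = 100000;
        setitimer(ITIMER_REAL, &quantum, NULL);

        /* Whatever task the last tick selected is the one that "runs". */
        for (int i = 0; i < 30; i++) {
            printf("running task %d\n", (int)current_task);
            usleep(20000);  /* pretend to do 20 ms of work */
        }
        return 0;
    }

In a real kernel the tick handler would also save the preempted task's registers and restore the next one's, but the shape of the idea - timer fires, scheduler runs, next task gets the CPU - is the same.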


It depends on what you mean by "derive a solution". If a candidate doesn't know an algorithm, they can't be expected to derive it on the spot.

In many cases the popular algorithms were carefully designed by computer scientists as part of their research. People don't pass that type of test because of an ability to derive solutions on the spot; they pass based on their skill at rote memorisation and application.

(I absolutely agree that if despite not knowing they show signs of being able to reason their way towards something sensible - robust, likely to be efficient, etc - that is a very good sign.)



