This is definitely true. But, let's be honest here: anyone who's done at least 10 or so standard, ~1-hour technical interviews has probably run into a candidate who looks great on paper, but just can't demonstrate enough basic skill to do the job.
One such candidate I interviewed seemed like they'd be really great for the role: PhD in graph theory, publications, projects listed on the résumé, a couple of different programming languages (including ones we used). To me, this person's résumé screamed "solid mid-level developer." Though I'd probably have been willing to pass them at a junior level, had they been able to perform at that level.
The interview itself was a pretty familiar story. For the technical portion, I introduced the problem (not a LeetCode-type problem, a more practically-oriented problem), we talked about requirements, drew some stuff on the board, and then got to coding.
I had a feeling when we were going through the requirements discussion that this might not go as smoothly as I'd hoped, but I pushed that feeling aside and did my best to let them shine.
We let people code in any reasonable programming language, but they must write actual code. They can fill in stuff like dummy helper functions, if necessary, but we want to see some kind of running, syntactically correct, and, preferably, at least lightly tested code.
They chose to code in Java, which, while not a terrible choice, seemed to me like they were handicapping themselves when languages like (IIRC) JavaScript and Ruby were listed on their résumé.
To make a long-ish story a bit shorter: we muddled through trying to implement the requirements we'd talked about earlier in Java, while the candidate showed a distinct lack of familiarity with basic facilities of the language, stumbling on prompts like "what sort of methods do lists have that might be helpful here?"
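For context, the "basic facilities" I had in mind were nothing exotic, roughly the level of java.util.List fluency sketched below. (This class and these values are my own illustration, not the actual interview problem.)

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class ListBasics {
    // Removes every occurrence of `drop`, sorts descending, returns the result.
    static List<Integer> cleanAndSortDesc(List<Integer> input, int drop) {
        List<Integer> nums = new ArrayList<>(input); // mutable copy of the input
        nums.removeIf(n -> n == drop);               // bulk removal by predicate
        nums.sort(Comparator.reverseOrder());        // in-place sort
        return nums;
    }

    public static void main(String[] args) {
        List<Integer> nums = List.of(3, 1, 4, 1, 5);
        System.out.println(nums.contains(4));          // membership test -> true
        System.out.println(nums.indexOf(1));           // first occurrence -> 1
        System.out.println(cleanAndSortDesc(nums, 1)); // [5, 4, 3]
    }
}
```

Nothing here is gotcha material; it's the sort of thing you'd reach for constantly in day-to-day work on an existing code base.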
Needless to say, this person did not pass my interview, and we did not end up hiring them. But, I really, really wanted them to succeed. Like I said, on paper, they looked great. And, I'm sure they could have gotten through a culture fit interview just fine. I'm just not sure how well they would have done on our team, working on our rather large, pre-existing, and somewhat crufty code bases.
If you can figure out a good way to automate the task of "filter these developers down to the ones who can write some semblance of code," in a way that goes deeper than just "Write some code and run it against our automated test cases," I'd like to hear about that. And, I'm not doubting that it could be done, in theory. For instance, maybe something like the engine behind GitHub's Copilot could provide a way to analyze and grade the candidate's code on things like style, testability, test coverage, modularity, &c.
But, AFAIK, there's nothing like that out there now, so, a structured process consisting of ~1-hour technical interview sessions, one-on-one with the candidate, attempting as best as possible to simulate the real work environment, is about the best I can think of.