As someone who has to interview candidates occasionally, I love the idea of a real-time interview copilot system. Here's why.
I see two main ways it could shake out.
In the less exciting case, the copilot isn't very good. It takes a long time, or produces assistance that obviously came from an LLM, or gets candidates regurgitating nonsense during the interview. In that case, I get a decent "don't hire" signal, for the same reason I wouldn't want to hire someone who was getting a friend to message them answers during a live interview.
In the more exciting case, the copilot is really good. It allows candidates who wouldn't otherwise pass the coding interview (whether for technical skill reasons, or behavioral reasons, or whatever) to breeze through like an expert. If this were to happen, I think it would massively devalue the "do LeetCode hards on a whiteboard" style of coding interview, and force interviewers to favor signals that are more relevant to real-world employee performance.
Well, until in the long run, the AI gets good enough to excel at all of the qualities that make human employees good employees... in which case we'll all retire to a life of comfortable, post-singularity, fully automated luxury gay space communism. Right?
> it would massively devalue the "do LeetCode hards on a whiteboard" style of coding interview
My bet would be that the whiteboard interviews become even more important, except they'd have to be done on site to ensure the interviewee cannot use LLM aids. Basically, everything between submitting the CV and the onsite would be binned, and the CV filtering would become a lot more stringent.
It’s such a weird idea to me that an interviewer would be hostile towards AI. Reminds me of my co-workers from the early-00s who claimed real engineers didn’t use Google.
If a candidate wants to use AI, why not allow them to do it supervised, and then ask more interesting follow-up questions or throw a twist into the problem that AI will stumble on, instead of letting them copy/paste solutions out of an answer book?
I recently found a job posting that tested devs in a real environment: with access to the same tools you'd use in real life. No limits, as long as you get it done in an hour. I immediately wanted to apply.
That sounds like a refreshing and practical approach to developer hiring! It's great to see companies moving away from traditional, often artificial, coding challenges and embracing assessments that reflect real-world workflows.
It is, until you're an adult and they want a tic-tac-toe rendition for a fintech. Ok, sure kid. A more serious fintech had a CoderPad that wouldn't compile a line of Swift; we laughed, switched to Xcode, and enjoyed our time together making actual software related to their SDK. The whole point is to understand how people think... so let them think with the tooling they think with.
You've hit the nail on the head! Your comparison to engineers refusing to use Google in the early 2000s is spot-on. It highlights how quickly our perception of "essential skills" can evolve with technology.
You raise a crucial point: instead of banning or penalizing AI use, why not embrace it as an opportunity to assess candidates on a deeper level?
> In the more exciting case, the copilot is really good. It allows candidates who wouldn't otherwise pass the coding interview (whether for technical skill reasons, or behavioral reasons, or whatever) to breeze through like an expert
What you're more likely to get are in-person interviews or interviews in some controlled environment.
This seems likely, as it parallels traditional proctored testing. Somewhere in the middle, there will probably be a few "install this spyware on the machine you will be interviewing on" companies trying to make it even worse.
That's a thoughtful and insightful comment! I agree the copilot can only help so far: the candidate still needs to know what they are talking about, otherwise a human interviewer can tell they are just regurgitating, even if the answers are technically correct.