No offense, but the only advice I've learned from all these "how to hire programmers" articles, as a programmer, is that it doesn't make a damn bit of difference, because every person hiring has their own idea of what works. The best thing I can do to get hired somewhere is be myself.
For every person who says not to bother with a resume, there is another person who wants one.
For every person who denounces certain terms on a resume, another person is looking for them.
For every person who thinks "experienced with" means "you can write a book on the subject", someone else thinks "you've used it in production."
Whiteboard tests are necessary. They are useless. Example code is critical. No, just show projects. Gotta see open source contributions. No, show what your code accomplished.
You need schooling. You don't need schooling. Degrees don't matter (though, try to get a visa without one). Self-taught rules.
Ask them to write a FizzBuzz program. Reverse FizzBuzz. FizzBuzz is silly and doesn't matter.
Throw out half the resumes so you only hire the lucky ones. Have them work for you for 90 days/1 month/2 weeks/1 week/1 day/1 project.
My advice: ask them whether they prefer green or purple. Whether they prefer the number 34 or the letter X. Take them out to lunch and see if they know the name of the waiter/waitress. Then get your mother's opinion on them. Then play a game of Magic: The Gathering with them, and if they win, roll a d20 to see if they can beat the AC of the job. That method has never failed me.
So yeah. Looking for work? Be smart. Be yourself. Because if you resort to playing games and being someone else, you're going to end up working for someone who thinks you are something/someone you are not. If you can't get the job because your resume was too plain/too fancy for the person doing the hiring, it's probably for the best.
Edit: To be clear, this isn't a direct response to the original article, but rather, to these types of articles in general.
Want to know why there's such a disparity? Because some companies are good at hiring, and others suck and don't realize it.
Want to know how to separate the good advice from the bad? Look for good companies and learn what their practices are. Find out how their organization works. If they're a lean mean innovation machine that values quality work and ships amazing products at a maintainable scale, chances are better that their hiring practices are also good (so long as you evaluate them before hubris poisons the hiring process, which may or may not happen). After you collect enough data points you'll probably notice a number of similarities.
In the end, the proof is in the pudding. People can opine till they're blue in the face, but if they're not killing it in their chosen market, if people don't really like working there, if someone is about to eat their lunch, then they're doing things wrong, and it would be risky to take advice from them.
Some companies succeed in spite of their engineering teams, and some companies succeed because of their engineering teams. What exactly is a good company? Is a great engineering team backed by incompetent management a good company? Or what about a company that has the full package but is unable to mail out paychecks on time, and has an HR department that runs off anybody who exhibits an iota of independent thinking? It's really hard to say until you actually meet with the company and spend some time there... and even then it's imperfect.
That's why I said "innovative, ships products, ahead of their competition, AND people like to work there". You can't get all of that without a quality, cohesive team covering many disciplines. And a bad hire into such a company increases the danger of derailing things. Thus, it's in any good company's best interests to carefully iterate and improve on their hiring practices, and any company that survives a long time with all of that intact is probably doing a lot of things right, including hiring.
Jason, you have a good point... I think that if you want to be HAPPY at a job you should try as much as possible to be yourself when you are interviewing with potential future co-workers. But that said, there are also nuances of the interviewing process that you can learn, and then if you want you can choose to use them during an interview. The point isn't to try and get an offer from every company you interview with, since you can only work at one company anyhow. But it's an interesting challenge for hackers to try and hack the system and figure out a way for every company to want to hire them. It's hard to do that without data, though.
I think that a lot of the frustration is coming from hackers who were rejected and find that they are left without any additional data to understand what they did or did not do that led to the rejection. Through recruiters you can sometimes hear a hint of why the rejection came, but if you work directly with a company they usually don't say anything except "we liked you, but ______"
It seems to me that regardless, you need to still have a CS degree and understand (and be able to implement) algorithms of increasing complexity, even if you're making a CRUD app.
But I agree, these interview posts altogether don't seem to say anything consistent.
Anecdotal, but a few of the best software engineers I've known did not have CS degrees. You certainly do not need one to write a CRUD app in Rails/Django.
The author places a lot of stress on memory. I think he's overgeneralizing from his personal experience.
Personally, I have a very bad memory, especially when confronted with query-like questions like "explain how you solved some hard problem when working at company X". I remember based on associations, not time: if you sit me down in front of the code I wrote, or pose the problem to me directly, I'll remember in a jiffy, but looking back on my time at a company/at school/etc it just seems like a blur; I can't remember anything at all that way!
Agreed. Memory tends to be trigger-based for most of us mortals.
Even short-term, here's what I've noticed: I could verbally list seven terms and their definitions. If I then proceeded to ask you the definition of any one of those terms, you could probably tell me. But if I asked you to recite back to me all seven terms, you'd probably forget at least one or two.
One of my favorite questions is: "What's the best bug you've ever found?"
Usually I get a "Huh?"
The really stellar folks will tell you about the bug that took two weeks to find and boiled down to a single missing comma in a vendor's library routine. Or /something/. But "Huh" is a bad sign.
"Huh?" is not a fail, but it usually correlates with people who can't write a function to find the length of a string, and I see a depressing number of those folks, even ones with the tag "Senior" in their title.
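For reference, the question being alluded to really is this small. A minimal sketch in C (str_length is a made-up name so it doesn't shadow the standard strlen):

```c
#include <assert.h>
#include <stddef.h>

/* The classic screening question: length of a NUL-terminated string.
   The real thing is strlen in <string.h>; this is the answer an
   interviewer would hope any "Senior" candidate can produce. */
size_t str_length(const char *s) {
    const char *p = s;
    while (*p != '\0')
        p++;
    return (size_t)(p - s);
}
```

Anyone who can't produce something along these lines in a couple of minutes probably isn't writing much code day to day.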
> One of my favorite questions is: "What's the best bug you've ever found?"
One of the problems I always have with questions like that is this: I don't keep mental lists of events or other things ordered by potential "interestingness" to myself or other people. I just don't think that way for some reason.
I might fumble to come up with an answer to that question, and then an hour after the interview is over, I'll remember a really good story the interviewer would have liked, but in the meantime I've looked like an inexperienced, bumbling fool.
Not that it really matters, of course--eventually I'll get through an interview without hitting a question that doesn't work for me, get hired, and then actually get to do some good work for somebody. Getting through that process feels fairly random to me.
Same here. To compensate, I load my memory up with "interesting" anecdotes for interviews. Including my most embarrassing moment, which luckily happens to center around a bug. (Failing that, I'd probably grab a recent bug and dress it up to be interesting. I guess you can always riff about techniques which prevent that particular class of bugs.)
But now of course, if I ever go on an interview again, I'd dredge up some funny bug. Just in case. :)
That was my first thought, too. I've had some bugs that actually make good stories, if told right... But I had a lot of other stories that are better. The only halfway interesting bug I can think of right now is actually pretty lame, and the rest of the story that goes with it is halfway interesting.
There's always a danger when picking out certain criteria and claiming they mean something. In this case, 'ability to remember an interesting bug' does not actually select for good programmers. It just happens to overlap somewhat.
Best bug I helped find was a flaw in some FPGA code for a special DAQ board for a muon detector I was building. When the board was at a temperature above about 74F, the FPGA would often (but not always) fail to initialize some ADCs correctly, because some delays would cause part of the circuit to go out of sync; below that, it always worked fine. So testing in the office worked fine, but running the board outside with a slightly different setup in the summer would mean a failure, but not always.
The horrible part was that after all this, I had to wire up a bunch of NIM modules instead with all the special logic (coincidence/anti-coincidence triggers plus a bunch of other stuff so I could have separate binned energy counters) and then calibrate everything manually, because the engineer for that board was too busy with other things to fix the bug.
I would have loved to tell this story at an interview for a job, except I could almost never get past HR drones, probably because my degree said physics and not CS or EE.
But I think my best one was when I found a bug in a third party communication library, that only manifested itself on ARM architecture. Our CPU was PPC, and our simulator was x86. On both of those, unaligned memory reads work fine. But the actual unit we'd talk to had an ARM CPU. On ARM it just reads from the closest(?) aligned address. I found that by looking at network logs and reading the source (which we thankfully had).
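That class of bug has a well-known portable fix. A sketch (function names are mine): the cast version is exactly the kind of code that works on x86/PPC and breaks on an ARM core, while the memcpy version is well-defined at any alignment and compiles down to a plain load on targets that allow it.

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Reading a 32-bit value from an arbitrary byte offset in a buffer. */

uint32_t read_u32_cast(const unsigned char *p) {
    /* Undefined behavior in C if p is misaligned: can trap, or on some
       ARM cores silently read from a rounded-down address. */
    return *(const uint32_t *)p;
}

uint32_t read_u32_portable(const unsigned char *p) {
    uint32_t v;
    memcpy(&v, p, sizeof v);   /* safe at any alignment */
    return v;
}
```

On compilers worth using, the memcpy call is recognized and costs nothing extra where the hardware supports unaligned loads.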
Funny - I remember when one of our new Engineers needed a memory-copy method for our ARM embedded solution - he went to Linux source and got some library routine.
It faulted when I used it the 1st time. Fixed the bug (alignment of source), ran again and it faulted again.
So I spent 10 minutes writing a test: move 0-128 bytes from source buffer offset 0-128 to destination buffer offset 0-128. Simple, overkill, right?
11 bugs later the damned memory copy thing worked. 11.
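In the spirit of that test, here's roughly what such an exhaustive small-copy harness can look like (my_memcpy is a stand-in for the routine under test; a byte-at-a-time version is used so the sketch is self-contained):

```c
#include <assert.h>
#include <string.h>

/* Stand-in for the memory-copy routine being tested. */
static void *my_memcpy(void *dst, const void *src, size_t n) {
    unsigned char *d = dst;
    const unsigned char *s = src;
    while (n--) *d++ = *s++;
    return dst;
}

/* Copy every length 0..128 from every source offset 0..128 to every
   destination offset 0..128; verify the copied bytes, and check that
   the sentinel bytes around the destination were left untouched. */
void exhaustive_copy_test(void) {
    unsigned char src[300], dst[300];
    for (size_t i = 0; i < sizeof src; i++)
        src[i] = (unsigned char)i;
    for (size_t so = 0; so <= 128; so++)
        for (size_t dofs = 0; dofs <= 128; dofs++)
            for (size_t len = 0; len <= 128; len++) {
                memset(dst, 0xAA, sizeof dst);
                my_memcpy(dst + dofs, src + so, len);
                for (size_t i = 0; i < len; i++)
                    assert(dst[dofs + i] == src[so + i]);
                if (dofs > 0)
                    assert(dst[dofs - 1] == 0xAA);  /* no underrun */
                assert(dst[dofs + len] == 0xAA);    /* no overrun */
            }
}
```

About two million cases; it catches exactly the off-by-one and alignment-boundary bugs described above.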
The next thing to ask is, What did I learn from that bug? What I learned is, accept NO CODE as bug-free, no matter the source, no matter what authoritative base it came from.
Other learning: why oh why don't CPU designers put a damned memory-copy instruction into the machine? We all need it, all the time, for every project and we all hack something together that works until it doesn't. Sigh.
You can't express memcpy in hardware any more efficiently than you can in C because of the way memory controllers work. It'd end up being microcoded, and ARM can't afford that for the same reason it can't afford unaligned access.
I think x86 does have a microcoded memcpy (REP MOVS), but efficiency varies.
You mention the memory controller; that's probably where the logic belongs, not on the processor. So the microcode would come down to "ask mc to move; wait for completion"
That's not actually any better speed-wise. And the CPU would still have to microcode the copy because of caches (think of what involvement the memory controller has in doing a cache-to-cache copy).
The real gain in being able to have the memory controller do a memcpy() independent of the CPU would be to let the CPU operate on data out of its caches in parallel to the memcpy() being executed. But that only helps for a very specific class of memcpy() and is highly system dependent (you have to worry about the expense of keeping caches coherent among other things.) Anyway, an integrated GPU or other additional block of hardware behind the memory controller is a better candidate for this sort of thing than a user-level CPU instruction.
> why oh why don't CPU designers put a damned memory-copy instruction into the machine?
x86 has had REP MOVSB since forever, complete with a direction flag so you can handle the cases where the source and destination regions overlap. But it fell out of favor because, for a while (from the 80386 through early Pentium processors, when Linux was written), REP MOVSx was slower than writing an explicit memcpy loop.
That said, such an instruction would seem to go against RISC philosophies, where you want your operations to be small and atomic and predictable in terms of time and resource consumption.
Right! Foolish programmer! Using REP MOVSB has been broken since about the 2nd issuance of the processor. Dumb folks (read: DOS) used it as a timing loop to calibrate interrupt timers, complained when it got faster and broke their code, so Intel 'dumbed it down' till it's about the worst way to move memory you could try.
So you say it works again. Cool!
Maybe what we really need is some sort of 'architecture library' that compilers resort to for things like this. Maybe an instruction, maybe a routine, but guaranteed to work for every wrinkle in the architecture.
Because if it's not in the compiler, folks will continue to cobble together buggy code of their own, with only a vague idea of the vast architecture landscape they are navigating blindly.
...I was tempted to relay the story of the best bug I ever found, and realized that it literally boils down to exactly what you said: a missing comma in a vendor's library. Wow.
What made it fun was that it was a SAS program (which I knew) with extremely complex, 5-layer-deep macros that called VBScript (which I did not know) all over the place. Debugging stuff in a language you don't know under a tight deadline: that's a thrill!
It was a production bug where a customer's birthday was rejected as invalid. This was in Germany, which is relevant because it turned out to depend on the machine's timezone. You see, when Java parses a date, it computes all fields of a Calendar object and then does a sanity check on them. Some smartass had decided in Java 1.4 to have that reject daylight savings time offsets greater than 1 hour. Unfortunately, Berlin and the Soviet-occupied part of Germany had a 2 hour daylight savings time offset in the summer of 1945, because that corresponded to Moscow time.
1. Debugging an options calculation on a SPARC workstation for a Merrill Lynch trader who was incognito and on his honeymoon. A certain options valuation table he was using had expired and needed to be updated; in the interim his position went haywire. I didn't have the exact source, so the debugger was off by one line, then two lines, then three lines. But I found where the table was used and was able to patch and recompile the code.
2. This dates me somewhat (recruiters consider me too old) but I was bitten by the 80286 POPF bug, which caused the interrupt disable mask to be ignored in an interrupt driver I had written for a Motorola 6852 (if I recall). I found the bug but the client announced it was going out of business (I wasn't responsible).
The version I got in an interview was, "What's your favorite algorithm?"
I stalled 30 seconds, pulled Lottery Scheduling out of my ass, and proceeded to do pretty well on the interview, then be told that they wouldn't hire me because they could tell I'd go to graduate school.
If you're ok with someone pulling an interesting story from memory rather than actually computing a total ordering on bugs or algorithms for "best", these questions work very well.
How do you know that those who tell you about such a bug are stellar folk, and those who don't aren't?
I think the question is great to see if somebody is on the same wavelength as oneself and probably a good conversation starter. I'm not sure if it helps with distinguishing between good and not so good programmers.
Good to see that we're making good efforts at meeting our "articles about hiring" quota on HN.
I have this theory that hiring people is nowhere near as hard as this particular echo-chamber likes to think it is. Everyone thinks they need "rockstars" or "A Players" and is looking for the magical recipe for finding them, but I think for most positions, someone who is smart and technically competent and fits with your company's culture is entirely sufficient.
Most of the startups we talk about depend on good execution of a simple idea, not advanced technical skills[1]. I believe that a good leader with the right vision and the ability to articulate that to his or her employees can take smart people who fit well with the company and form them into people who will execute consistently well. Not only that, but this creates a belief system around engineering and product that is compounding and self-reinforcing.
Identifying technically competent people is not that difficult. If you were to read a few of the last 200 articles where we've talked about this, there are some common threads.
First this, always this:
a) Have them write at least a little bit of code to see that they actually know how.
Then pick a few of these. If at all possible spread the interviews out amongst a group of people so you can get different viewpoints.
a) Ask them some relevant programming questions and observe their answers. Focus on things related to the kinds of work they'll be doing in their job. Maybe work on a whiteboard, or even in a text editor. Maybe even a little bit of both.
b) Ask them a broader set of questions relevant to things like algorithms and data structures. Even for simple web apps, it helps to be at least conversant in the basics of these topics.
c) Ask them to see open source work, or if they have a profile, or maybe some code they wrote that they're comfortable sharing with you.
I really think that if you put someone in a room with a competent engineer for an hour, that engineer will be able to tell if the person they've been speaking to is a good engineer or not.
Yes thank you. The OP does briefly mention the "looking for rock star" thing, but I get mad that every single job posting I see seems to be "OMG MUST HAVE ROCKSTAR". Or gawds, lately "OMG MUST HAVE DAN SHIPPER".
First, if you got an actual rock star programmer (Zed Shaw comes to mind), you probably wouldn't be able to handle/keep him.
Secondly, really? You want people that crank out Lisp transpilers in their spare time... But what about people who are good at taking some oddball client spec, going back and forth to understand the desired behavior, and distilling something that is actually possible to implement? What about the guy who takes that extra 10% of time to implement a feature but ends up giving you some new common code you can reuse across your app?
Maybe these people don't dream Ruby DSLs, or have Github Infinity number of issues in their inbox, but they can still be valuable to a team. Because the best code in the world is useless if the programmer has misunderstood what the client is really saying.
What I always wonder is why they want "people that crank out Lisp transpilers in their spare time" (I'm reasonably close enough to that...) to come write CRUD apps in Rails.
My current day job is to write CRUD apps in Rails, but my long-term plan has me back in academia (for a PhD and a research career), where the ability to crank out a Lisp transpiler in one's spare time actually gets used.
There is a lot of focus on teamwork. I'm still not sure programming is suited for teamwork. There are clearly some activities of creative problem solving which are better suited to individuals working mostly alone, but belonging to a community of peers.
For example:
* math
* research
* art
* visual design
* writing
* music
* programming?
People have been trying to apply Taylorism to programming, making it approach an assembly line in organization. Now, with Agile, we're trying to organize programmers as sports teams instead.
I just don't understand why programming specifically is under this intense pressure of having to be measurable and quantifiable in every little detail.
It seems like being a good cog in an assembly line, or a good member of a sports team, is treated as just as important, if not more so, than creating good programs. Why aren't, for example, visual designers being hassled in this way? No, they are being left alone as long as they do good stuff. They can freelance or work from home if they want.
But programming is somehow different. There's this assumption that it needs to be done in a group at all times. Solving all problems by group discussion.
If I was hiring I would just ask to see previous projects and/or a portfolio (GitHub, for example). If they have nothing to show, or it's really bad and shows no signs of progress, I would not hire. Simple as that.
I wouldn't hire a group of 15 clearly sub-standard visual designers who have nothing to show, try to organize them into a group, measure closely and try to make them create visual design.
In the same way I would not try to do this with programmers either.
This can be cheated. They could get someone else to create the GitHub portfolio for them, perhaps a friend who's an unemployed "real programmer", just like they could get someone else to sit an aptitude test for them if only HR people are at the test and no photos are taken.
Yeah of course you'd bring them in for an interview to talk about those projects if they show a promising portfolio.
I just can't imagine hearing someone say
"Yeah, this guy is a great visual designer. Look at his gorgeous portfolio, and look at all this great stuff he did at Smashing Magazine (or whatever good place for design, I don't know...). But he's not a team player; he is sceptical about doing 'pair-designing' for 8 hours a day with other randomly chosen incompetent designers. No hire."
/sigh interview techniques really are a perennial topic on HN.
> [a coding test] won't tell you squat about how good they are in a year long project with 10 other people,
The author misses the point of a coding test. It's a negative rather than positive filter. Someone who does amazing on FizzBuzz is not by definition an amazing programmer. Someone who can't solve FizzBuzz however almost certainly is not a good programmer.
Simple coding tests are a very effective filter in terms of the time spent.
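For anyone who hasn't run across it, the entire test is this (sketched here as a C function so the classification logic is easy to check in isolation):

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* FizzBuzz for one number: multiples of 3 print "Fizz", multiples of
   5 print "Buzz", multiples of both print "FizzBuzz", everything else
   prints the number itself (formatted into buf). */
const char *fizzbuzz(int i, char buf[16]) {
    if (i % 15 == 0) return "FizzBuzz";
    if (i % 3 == 0)  return "Fizz";
    if (i % 5 == 0)  return "Buzz";
    snprintf(buf, 16, "%d", i);
    return buf;
}

/* The usual formulation: print the results for 1 through 100. */
void print_fizzbuzz(void) {
    char buf[16];
    for (int i = 1; i <= 100; i++)
        puts(fizzbuzz(i, buf));
}
```

The point of the filter is precisely that this is trivial; a candidate who can't produce something equivalent isn't going to fare well on real code.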
> So why spend an inordinate amount of time on relatively minor parts of a programmer's skill set?
I wouldn't call half an hour "an inordinate amount of time".
The author then opines about how he "can just tell" if someone is a good programmer or not. I can sympathize because frankly so can I. I went through a period of taking 10 interviewees to lunch. I asked them nothing technical and basically just answered their questions. From the first 10 minutes I could tell:
- 1 would probably get an offer (he did);
- 8 would not (they didn't); and
- 1 I was unsure about (no offer).
Looking at the author's profile [1] I believe I can see the problem: it doesn't appear he's ever worked for a large engineering organization. This is fairly obvious from the content of this post because none of his solutions scale.
Let's say you have a large organization with 5 Andrews. Each of them says to hire this guy they just interviewed. What do you do if you're looking for fewer than 5 people? Are they going to work on the respective teams of those giving the recommendation? If not, how do you know they're a good fit? You need to consider company-wide culture and expectations. How do you calibrate between them? Do they have the necessary foundation to do not just the job they'll be starting on, but to grow with the organization, adapt, and perhaps work on other projects?
The other problem I have is the "war stories" aspect. This is a very definite bias. Take the way human memory works. Imagine you have a conversation with someone that's memorable in some way. At first you can remember word-for-word what happened. That quickly fades and you remember the gist of what was said. You may even think you remember exactly what was said and how it was said, but usually you're wrong (try it by writing down exactly what was said and going back to it after months, or by having two or more people recount the same event). After a while even that may fade and you may just be left with an emotion about the event.
Some things I can remember very well but more often than not, I've learnt my lesson and the exact circumstances or even the origin entirely are lost. This is largely because it's useless information.
But here's the biggest problem of all with the post:
> My favorite idea is still contract to hire everyone after your (hopefully reasonable) interview;
Okay, you've excluded anyone really good because they're not going to jump through that hoop. I don't consider myself a "rockstar" and even I won't jump through that.
Maybe the real problem is the OP doesn't even know what good programmers are because this is a recipe for mediocrity.
Lastly there's a story about team bonding (probably distorted if not made up outright at a guess). The author doesn't seem to realize that his hiring process is pretty much about hiring people like himself, which is fine, but that's not the definition of a good programmer.
I've seen teams work well who:
- all work closely together and socialize together;
- never socialize together;
- work remotely;
- work on different schedules;
- and so on.
Don't confuse programming ability with culture fit and work style. Those are three different things, all of them important.
In a large organization you're going to have some aspect of a common culture but very different team cultures (and even site cultures). Part of hiring in a large organization is recognizing that you're not trying to hire "you" and then working out how you can best use someone not like you such that they flourish.
EDIT: oh and if he thinks Google, as one example, hasn't done extensive data analysis on interview techniques he's nuts.
Interviewing is a perennial topic because it doesn't make much sense.
Case in point - over the past week I've interviewed for four positions. Three gave me offers within a day, with very little effort on my part. On the other hand, one told me that I had little idea of what I was doing and told me to study up and apply again next year. And, that was after wasting my time with 3.5 hours of interviews. Of the offers I received, two were well over 120k plus great bonus. The other came in under 90k but told me that I could work anywhere in the world, whenever I wanted, as long as I got my work done.
Over that same time, I flat out rejected three recruiters from large engineering companies because their recruitment processes would have cost me $700-1500 just in opportunity costs. And, they weren't even willing to give me the info I needed to help me determine my chance of success or even my expected pay.
Everyone these days seems to believe that their company is so important that they deserve or need a team of top-1000 engineers, when in fact a top-20-percent team that bonds well together would likely be sufficient.
The topic of interviewing is fascinating because there seems to be very little science behind it. Who knows what works?
The opportunity cost is the most frustrating aspect. I've solved several "sample" problems from companies, and been involved in full-day interviews writing code on the whiteboard. The worst was spending a day writing some sample code, then doing a panel interview where I was asked to make a change to the code to cover a nonsensical edge case. I was rejected because I "hesitated".
People make several mistakes when analyzing interview techniques:
1. The interviewer and interviewee are after similar and complementary things.
Wrong. The interviewee is trying to find out if the work is interesting, whether or not this would be a good place to work, whether or not he or she could work in that environment with that team and so on.
The interviewer (or rather the employer) is trying to fill a position. This is really important.
Some make the mistake of thinking that if an interview process doesn't evaluate the interviewee accurately it has failed. This is grossly inaccurate. The point of an interview process shouldn't be to look at a single candidate but to look at the process of filling a position, which may well span interviewing many candidates.
A false positive (someone who looks qualified but isn't) for an employer can be incredibly costly. The cost of a false negative is essentially zero as long as the employer can otherwise adequately fill the position.
To spell it out: if an employer gets 50 applications, has phone interviews for 8, brings in 4 for face-to-face interviews and makes two offers, one of which is accepted, the employer has gained the desired result. The fact that a qualified person was rejected along the way (false negative) is essentially irrelevant.
2. Your worth is constant.
Wrong. You said it yourself: you got several different offers. You're implying that if you had been accurately gauged, market conditions would have you valued roughly similarly.
Company A might be desperate. Company B less so. You may fill a niche far better at one employer than another. One employer may simply be more cashed up and able to pay a better salary. A given employer may simply suck at negotiating or be under a misconception about market value. The list goes on.
Likewise, your desirability to the employer factors into this and it goes beyond technical skills. If the employer thinks you'll be a great fit and they'd really like to work with you, that improves your worth (to them).
3. Companies are looking for the same thing.
Clearly this is wrong but I do see this attitude come up, typically being implied by showing mismatches in offers as "proof" or similar.
There are many examples of this. For example, all other things being equal I've found that an MIT graduate is much more likely to hire other MIT graduates. The same is true for Stanford, CMU or [insert school here].
Part of this is the "social proof" element (going to a great school and/or working for a top-tier employer can be a huge advantage). But more than that it comes down to cultural similarity, common background and being a known quantity (to some degree).
This is of course different for every employer.
The real problem with interviewing, particularly at larger organizations, is that people who are bad at it are doing it. Interviewing and assessing potential colleagues is a skill and a talent. Some people have it. Some don't.
I've seen another comment here that said you need great engineers to do interviewing. I disagree. Many great engineers seem to be essentially savants who are often ill-equipped for the social discourse entailed in interviewing.
To interview an engineer I firmly believe you need to be an engineer (the same goes for managing engineers), but you don't need to be a rock star. You just need the right additional skills.
> The cost of a false negative is essentially zero as long as the employer can otherwise adequately fill the position.
I couldn't quantify them exactly, but I don't think the costs are zero. For one, any candidate that you bring in and reject consumes (in the case of my company) about 4 man hours of developer time. And that's mostly senior/lead developer time. And that's not counting the time we spend discussing the candidate and the general cost of a distraction/context-switch. And then there's the opportunity cost of not having a position filled and work being started. Sometimes the short run matters.
That said, I'd definitely agree that the scale should be tipped in favor of false negatives.
It's better to think in terms of risks, not just costs. Interviewers tend to hire people just like them, which means the team is often filled with people who think the same way and come up with the same types of solutions. On the other hand, when you get someone in who can 'do the job' but has a radically different perspective, they are more likely to bring something new to the table. To be overly simplistic: in a team of world-class programmers, adding someone with a great UI background can be worth a lot more than yet another world-class developer. The advantage being that diversity does not require a larger paycheck.
Interviewing skills and engineering skills are completely different. You should be an engineer if you want to interview other engineers, but it's likely that a great engineer isn't a great interviewer unless they've been specifically trained and coached on how to interview.
FizzBuzz is OK for quick filtering, but asking someone to come up with an efficient substring search algorithm ([1] or similar) on the spot, assuming they don't know the algorithm a priori (and many good programmers don't), is just ridiculous. Scientists like Knuth, Morris, and Pratt spent months on it.
Frankly, your comment bewilders me. Sure, you might not be able to replicate KMP in the middle of an interview if you didn't already know it, but I'd certainly expect any competent programmer to be able to come up with some form of solution (and I'd probably expect them to get reasonably close to KMP) reasonably quickly, and then to be able to discuss their solution and identify its flaws. A moderately intelligent version ought to be fairly trivial for someone with a solid comp-sci grounding.
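For what it's worth, the kind of non-KMP solution I'd expect as a starting point might look like this (a sketch, not production code; it's O(n*m) in the worst case, which is exactly the weakness I'd want the candidate to identify):

```python
def find_substring(haystack, needle):
    """Naive substring search: slide the needle along the haystack.

    Returns the index of the first occurrence of needle, or -1.
    Worst case O(n*m); fine in practice for short needles.
    """
    if not needle:
        return 0
    n, m = len(haystack), len(needle)
    for i in range(n - m + 1):
        if haystack[i:i + m] == needle:
            return i
    return -1
```

The interesting conversation then starts from here: where does this degrade (e.g. searching for "aaab" in "aaaa...a"), and what would you change to avoid re-examining characters?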
Even then, it's not the coded solution that I'm particularly interested in - it's the explanation that follows about how you got to that solution and the discussion about the pros and cons of your particular implementation. I'm looking for signs that you can solve problems and that you can look at your solutions with a critical eye and understand their weaknesses as well.
For what it's worth, the nastiest interview question I've ever had involved building and searching interval trees with up to a billion intervals stored in them, where they were looking for solutions that provided the fastest possible search times. It was less than fun.
I have several points/cases regarding inability to come up with a KMP like algorithm:
1. Fresh CS graduate from the top university (MIT, ETH, Stanford, etc..) - I agree, it is a bad sign.
2. Mid/late career professional - I disagree.
3. Self-taught professional - I disagree.
In the latter two cases, some might simply never have needed or seen similar reasoning or a similar algorithm implementation in their lives, but that doesn't make them bad programmers.
Another point: many interviewers have written down an exact, fixed algorithm beforehand, and if your solution diverges at some point from that single variant (even though your solution might be correct as well), it's a red flag for them and they interrupt you in the middle. I've experienced this even with open-ended design questions, which is really frustrating.
And my last point: remember that when Knuth, Morris, and Pratt were coming up with this algorithm, no one was breathing down their necks.
Agree on the 2 & 3... I've probably been out of school (CS) for longer than most on here have been programming.
Not being able to come up with KMP on a whiteboard in an interview hasn't kept me from playing no small part in producing millions in revenue for companies.
Especially outrageous is all of the companies that think they need to ask these questions, when the day to day work is essentially mundane CRUD type stuff.
What exactly do you (or, for that matter, Google hiring in general) consider FizzBuzz? Is Pascal's triangle FizzBuzz? How about an algorithm for calculating the Levenshtein distance?
It seems like everyone has a pet favorite interview problem that they like to throw at the candidate, and in my experience some of them were definitely not FizzBuzz. And then if one interviewer doesn't like you, you don't get hired. All the work you have done is irrelevant if that pet problem is not solved optimally on the whiteboard. But then, of course, I am probably a little incompetent too.
One reason that interviewers ask the same question to many candidates (i.e., seem to have a "pet" problem) is because they have calibrated the question. A strong interviewer has asked that question to respected colleagues at various levels, and many tens or even hundreds of candidates. He knows all the ins-and-outs of the question, and more importantly, knows how to judge someone's effectiveness in that topic area based on their answers.
If you asked a different question to every candidate you interviewed, it would be very difficult to get a sense of how they compare to other candidates, or to employees.
Maybe every company is unhappy with built-in string searching operations... :-)
It's a rite of passage I think. Programming interviews remind me of rushing for fraternities in college. And once you get to the on-site, the hazing begins...
They probably ask questions similar to the ones that they were asked... so over time they select people more and more similar to themselves and their culture becomes homogenous and inbred.
What would happen if you just refused to answer the question and asked them for something more practical that would actually demonstrate your skill & experience level?
I've understood that FizzBuzz is the lowest possible requirement for any programming job. It's a trivial programming exercise that doesn't require experience in algorithmic theory or mathematics. It's like going to the driving test and being asked to start the car: anyone who wouldn't know how to do that is likely highly incapable of passing the actual test.
Now, the astonishing issue, for me, is that there apparently are huge loads of people applying for programming positions who actually fail FizzBuzz. In effect, it's akin to applying for the position of a bus driver "coz I once travelled on a bus". I don't get that but apparently it does happen often enough to warrant FizzBuzz.
What's even more astonishing to me than the fact that programming applicants mess it up is that whenever the problem is introduced on a programming forum or blog, there will inevitably be commenters posting a solution, and that solution will be wrong. The best car analogy I could come up with: a person hears about someone putting on a spare tire, goes out to do it on their own car from memory of the account, and does it wrong, when the instructions were right there in the glovebox.
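For reference, a correct FizzBuzz in the usually-stated form (multiples of 3 print "Fizz", multiples of 5 print "Buzz", multiples of both print "FizzBuzz"):

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:        # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out
```

The classic wrong answers check `i % 3` before `i % 15` (so 15 prints "Fizz") or start the loop at 0, which is exactly the kind of off-by-a-branch slip the commenters above keep making.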
These articles reinforce something that someone told me a long time ago about a study about interviewing.
1. In most cases, the person (or the person in charge, if it is a panel) makes a decision in the first five minutes of the interview.
2. The decision is based on how much the candidate resembles the interviewer (i.e. how similar they are to the interviewer in terms of personality and skill set).
I'd think a good way to hire great developers is to hire and fire quickly. I see this mentioned time and again, yet very few people actually live by it. The process would then turn into a trial period beyond which you are good to go. If you want to have a group of good developers, just operate along the lines of "I give myself X months to notice that this new hire is bad." If you want a group of great developers, switch your mode of operation to "I give this new hire X months to impress his/her peers."
The people who can make the judgement call are usually the hire's peers based on their daily interactions, so how would management be able to get its hands on this knowledge without turning the company culture into a disaster where people always feel like they are being continuously judged by their peers and have to watch their backs?
It seems possible this guy just has an unusually good memory and regards anyone who doesn't also have a good memory as stupid and "not a good programmer".
I interviewed for google once; their recruiter found me through an open source project in which google has a stake.
Anyways, I was tired when I did my phone interview, having just returned home from work. I bombed a couple of those interview puzzles. I felt pretty stupid because the answers are really obvious to me in hindsight. No biggie.
So this week they interviewed a junior programmer I mentored in the past. It looks like they might hire him! :-) good one. If they asked him he'd easily send them my way (guess who he comes to whenever he has questions?) Oh well.
If their recruiter contacts me again I might still do another interview. Their "write some code in a google document" interview style was novel, but possibly sub-optimal. YMMV
Yeah I've done the code-in-a-google-doc thing as well. Wasn't a fan. I've also done the interview-right-after-work thing. Not a fan. I'm happy where I work too, but it doesn't hurt to always know what the market is like (as you're obviously aware). That being said I've found the "job market research" process tedious and tiring. Months-long processes only to end in frustration, lies about project scope/culture/etc., time-intensive programming exercises, and so on.
Not sure how else you'd handle it though, really. Busy people are busy.
If they come crawling after you again, tell them you'd need double the salary you might have accepted last time. If they accept, you win. If they decline, you still have the pleasure of making that stand.
Yes, yes, 1000 times yes. The basketball-team analogy at the end of the article in particular really resonates with me.
To anyone wondering, "How could this possibly be true? How could so many companies be getting the hiring process so wrong?" I say: it's because they aren't focused on the goal. That doesn't mean companies never achieve their goals; they are simply optimizing the wrong aspects of the process.
But I don't think that a bad interview will mislead a programmer.
Truth is, a lot of companies out there are mediocre. And it shows in how they recruit.
Everyone wants a "rockstar" for some average CRUD app, but they don't seem to understand that most programmers will do fine building them. Even people who interview badly.
I think that programmers should be the ones taking a detailed look at companies to see if they are a fit with the dev team, instead of the other way around. After all, if you don't feel at home within a team, how will you do your best?
Being a recent graduate of computer science and now job hunting I have come across several different types of interviews in the last few months. I have sat a coding test where they just asked me to write a very basic CRUD application in code igniter, also had the here is a scenario you have 15 minutes to write a presentation and then present us with your result, plus an interview where I was asked to do some UML on a whiteboard.
But the best method I have come across was where I was just asked 20 or so questions of basic computer science knowledge from OOP to networking to basic CS theory.
I say this was my favourite method because I know some recent graduates who can write a very simple application but are terrible programmers and would not last the probationary period (this actually happened to someone I know). If you asked them a series of basic questions, they would stumble and fall, because in general they were just terrible computer scientists: outside of the very basic elements of programming, they knew nothing. So what good does asking someone to write FizzBuzz actually do, apart from weeding out the idiots? (Then again, I know someone else who failed FizzBuzz, largely because he managed to get through 3 years of CS without knowing about the % operator.)
Do people actually use FizzBuzz itself? I always thought it was just one example illustrative of the type of simple and quick-to-answer question that makes a good initial negative filter.
What is a full join? I've been working with Oracle for 10 years and I've never used that term. I know what a left join is and an outer join, but not a full join.
"Conceptually, a full outer join combines the effect of applying both left and right outer joins. Where records in the FULL OUTER JOINed tables do not match, the result set will have NULL values for every column of the table that lacks a matching row."
There are a lot of people who've used SQL, and therefore will think it's perfectly honest and accurate to list SQL experience on their resume, and who've written perfectly fine queries to get shit done, and yet they would not be able to answer that question, especially in an interview/interrogation situation.
Also, in real life, under normal working conditions, we live in the Age of Google (and books): if someone ever needed to know the definition, or recite a definition or comparison of those terms, they could just look it up. Better questions would be: can they solve problems? And solve them in good ways? Have they in the past? Do they get shit done? Ship? Are they reliable? Do they work well with others (to the extent it matters, because it varies)? Would it be a net win to have them involved with your project/team/company? Those questions matter the most by far. And I don't think it's wise to assume they're going to be able to recite a definition of any particular term. Offhand. In a stressful and unnatural situation like an interview.
There's still a certain level of familiarity with the subject matter required to do a job. If the role can be defined as an "SQL job" (say, something in data warehousing or reporting), not knowing what types of joins are available could lead to terrible solutions.
I would agree with you if the interview question was more specific, say about Oracle window functions for example.
An SQL developer not familiar with full joins would be similar to a Java developer who is not familiar with interfaces, or a Javascript developer who has never heard of closures. It's a significant part of the language, not a piece of arcane reference.
You know, that's a fair point. I think our company may be making that same mistake where instead of trying to find the good candidates, we're trying to filter out the bad ones. I'll see if I can change that.
I agree with having someone who's already in the role interviewing you. Let HR and managers interview the programmers last. This way you can filter out a ton of people before they waste everybody's time. On one of my last interviews, the guy who interviewed me was a "change management" director. I thought to myself, "This is going to be a short, easy interview."
After being in the industry for a while, I feel the best way to hire someone is still by referral. Developers know good developers, and the good ones tend to stick together. They also know what their friends' skills are, what they excel at, and what kind of work they would recommend.
Also, in the end, I learned it's not about what questions they ask or the stupid quizzes they put you through. Developers have to learn it's about knowing enough about the company, and asking enough tough questions, to determine if it's going to be a good fit. Developers should have as much responsibility, if not more, in vetting the companies they choose to interview with.
He brings up an interesting point about checking the effectiveness of interview methods. It seems that lots of people have intricate hacks for divining the talented programmers in an interview, but NO ONE can back their methods up with hard data based on performance and non-performance of candidates over project time-spans.
This is a good point. It seems somewhat unavoidable, since no one will ever be in a position to evaluate the effectiveness of both those who did well on their metric and those who did poorly (since they will only hire those who do well).
Perhaps there are companies out there with a large enough sample size that are recording metrics on a "programming concepts verbal interview", "analytical thinking verbal interview", "programming test" and "resume match", such that they can evaluate what are the best predictors.
Speaking as someone who's thought a lot about this (mostly during my psychology studies), here's what I would do, if someone let me.
1) Establish the competencies of your employees. Think small programming exercises, IQ tests, personality tests etc. Use this to establish a minimum score for hiring.
2) Apply these same tests to all prospective employees.
3) Of all the candidates that meet the minimum criteria, hire a random set of them.
4) Follow up the progress of all these employees over 1-2 years.
Now, this would never fly in many legal senses, and I doubt that anyone will ever let me do it, but it would provide useful data to improve hiring processes. I suspect that this would need to be done for each individual company at least 100+ times before you would start to be able to derive useful patterns.
Some hypotheses:
1) A non-linear relationship between IQ and performance.
2) Increasing performance and longevity of employment based on the similarity between the candidate's personality and the team's.
I'm curious: is it really that hard to identify competent programmers? Surely asking a few technical questions that include some element of programming should be enough. I feel like there are 5-10 articles on this subject every week, all of which say more or less the same thing. In very nearly every case that I've interviewed someone, I've been able to easily decide whether they were a hire or a no-hire about 20 minutes into the interview process. I'm sure for certain positions, or certain companies solving difficult problems, things could be more complicated, but the vast majority of us just aren't in that situation.
It seems like the rather more difficult problem is getting competent programmers to want to work at your company and apply.
Identifying competent programmers is hard as long as you try to quantify it. Nobody makes decisions based purely on rational facts from a rational process. In the end, the decision comes from a feeling that we try so hard to belittle.
The rational mind can make a "decision", like "this candidate passed all our interviews and qualifies on paper: that means we can hire him". But the true decision of hiring someone is more like a "to engage or not to engage ourselves with this guy, in a joint (work) life together and into the future". Conversely, you can just decide to nominally hire people but never back it up on a personal, human level.
The process of hiring is golden as a negative filter: you want to weed out people who factually aren't up to it. But there's no positive filter that you can unilaterally apply. In the end, you just have to let yourself "know" who to hire because there's nothing else you can do.
Part of the problem, and the reason we have posts about hiring constantly, is that there's no solution to the problem. Every team is composed of different types and amounts of people, every company is different, everybody has had success or failure with different techniques, every recruiter brings different types of people, etc. There are a lot of people you can flip the bozo bit on quickly but others who are just bad interviewers and you have to draw out a bit more. There's people who just rub me the wrong way so I'm harsher than others sitting in the same room at the same time. And once you get past "These 5 people are competent but which one would I like to spend my week with?" the complexity goes up tremendously. If you're hiring the person who will sit behind you 8 hours a day it's a different deal than getting the 50th programmer in your department.
Sure, simple tests to filter out people who clearly know nothing about programming are useful, but puzzlers and other ego-trip tests are useless.
I can hire better programmers by asking them to write an essay about any current piece of news.
The English language is the first programming language one should master. It's the one most useful to handle interactions with other human beings, and that's what makes the most important difference in any job.
It would be better if people would evaluate the ideas in articles rather than use the author's name as a leverage point to launch personal attacks. Several posts here attack the author as being incompetent in various ways and not having had serious experience. Yet a brief review of his resume shows he works at Travelocity, worked at Apple in the past, and personally invented an innovative spreadsheet using a different information-organization paradigm that ended up influencing iNumbers. So he's not really the dumbass that some people are trying to make him out to be. It's obvious he's a talented, serious, and experienced engineer. There is nothing wrong with disagreeing with his article, but snarky and contemptuous remarks about his supposed incompetence detract severely from the points made, making the critic seem like an immature schoolyard bully.
To fix coding interviews in this country, you need to insist on having a great coder do the interviewing. In 90% of the interviews I go on, I get evaluated by an HR type who could not write any program more significant than Hello World.
What's funny is that non-programmer interviewers are starting to administer coding tests, and they don't know what a good response is! So here I am doing a coding test on a whiteboard, and I'm being tested on delivering optimized code on the spot: basically producing the best code from memory, like Jeopardy. That's not how the best coders code; we offload most things to Google searches and the general process of cutting code. The evaluators don't realize this because they can't write a program. They can only understand one specific algorithm and its implementation.
So the only way to get a good coder is to take your best coder, and then hand him a stack of resumes, and ask him which ones to bring in, and have him sit down with the recruits and have him give an up or down vote.
Using non-coders to filter coder resumes? The time is better spent having a monkey throw darts at the stack and bringing in whoever gets hit the most. I don't care how many books on coding interviews they read.
I totally agree with you that anyone interviewing for a technical position should be technical -- if they ask you a technical question, their only basis for judging the answer can't be "the answer I found in this book or learned from someone else". That's a disaster waiting to happen.
However... I've known a few great coders who were absolutely terrible interviewers. So just being great technically also isn't a sure sign that the interview will be handled well.
I think it's fair to say that if you're stretching yourself, or working on challenging problems, a reasonable amount of time is spent researching. If you don't know how to do something, you research it. Whether you find it in a textbook or on Stack Overflow, it's essentially the same thing.
Unless you think "good" coders reinvent algorithms they don't know from first principles?
I'd wager there's a difference between what jemfinch is talking about and what you're talking about.
I used to unilaterally believe that coding exercises or algorithm quizzes on interviews were silly because if I had to solve some given problem I would just google it. And that's true, I would. However, as I've gotten a little more experienced I've realized that there is extreme value in expanding the domain of problems for which I don't need to google, because either A). I've happened to read about it on my own time or B). I've encountered a particular troublesome component before.
In other words, I think it is entirely reasonable for the hiring process to properly evaluate a programmer's toolset. Yes, I think researching via Google is or should be a piece of everyone's process, but it matters greatly what kind of things you have to Google.
Do programmers encounter all sorts of algorithms when engineering software? In my experience, the only things I really need to keep in mind are the consequences of certain data structures and algorithm types, not their specific implementations, because often the implementations are built into the language. The choice of which data structure and algorithm to use is more important. I use sorting algorithms very frequently. Usually it is just a call like "list.sort()".
Yet in interviews, the questions I get asked are more like "Implement <My favourite sort> in 20 minutes, no googling."
I mean, really?
The other day I was in an interview where I had to implement a queue in 30 minutes. The unit tests were already written and the IDE was eclipse.
I wrote the "correct" implementation for the queue in about 10 minutes but there was a bug I couldn't find; I've never used eclipse before and took a lot of time figuring out everything. In the last minute I finally tracked down my bug. I forgot to increment a variable. I fixed it and all the tests as well as the performance test passed. I didn't get the job because I took too long. (It was a graduate position and the last time I used Java was 2 years ago)
I was distracted with all the buttons on eclipse... Otherwise I'd never have missed the variable increment! I swear. Anyway the point is, if a programmer is ever implementing their own simple Java queue in production software it is not a good sign anyway. Why are you testing for that?
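For what it's worth, the exercise described is roughly the classic two-stack queue (sketched here in Python rather than Java; the original presumably looked similar, with the unit tests pre-written):

```python
class Queue:
    """FIFO queue built from two stacks -- the standard whiteboard
    version of the exercise. enqueue/dequeue are amortized O(1)."""

    def __init__(self):
        self._in = []    # newly enqueued items land here
        self._out = []   # reversed items are popped from here

    def enqueue(self, x):
        self._in.append(x)

    def dequeue(self):
        # Refill the out-stack only when it runs dry; each item is
        # moved at most once, hence amortized O(1).
        if not self._out:
            while self._in:
                self._out.append(self._in.pop())
        if not self._out:
            raise IndexError("dequeue from empty queue")
        return self._out.pop()
```

The forgotten-increment bug mentioned above is exactly the kind of slip a head-based ring-buffer version invites, which is one reason the two-stack formulation is easier to get right under time pressure.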
Good designers buy stock photos for generic things and fill in with custom bits to produce a finished piece.
Good programmers copy/paste boilerplate code and focus on the stuff that really makes it different. (iOS example: you don't write out all of the methods for a UITableView by hand. That would be a waste of time. You copy/paste the methods you need and change the important bits.)
I interpreted his statement as meaning that google searches replace the requirement to memorize minute details of things that aren't commonly used or necessary to commit to memory. For example, I may know about php's preg_match, but I haven't used it in 2 years... so if a project calls for it I can just do a quick search for preg_match and see all the details that I need to know. I didn't have to have everything in memory since I wasn't using it. We have a built-in garbage collector in our minds that slowly clears out things that we don't use. But when we do learn it for a second time it comes back very quickly.
Good coders are clever problem solvers and well-organized thinkers. They don't need to have photographic memories that store every possible command and instruction, even ones that they have never ever used and find no use for. As long as you know what is relevant for the current task and have a peripheral awareness of what else exists, you can do your current tasks and know what to search for in books or on google if something out of the ordinary is required.
I think that the tests are very important, but it's hard to do them efficiently and to create good tests that test skills that are important to the job at hand.
The tests should have well-defined constraints and instructions, and once the tests are complete the code should be the basis for a discussion about why certain choices were made.
I've basically received the same type of coding test from 10 different companies for the Sr. DevOps roles that I'm looking at... and it's really sad honestly to see what they're asking me to do. It's basically high-school CS where I'm supposed to open one file and pull out specific parts of that file and then compare it with a second file and output only certain lines from the second file based on a comparison with the lines from the first file.
The instructions are poor, and I'm expected to clarify each time with these kinds of questions: "Ok do I need to worry about memory?" "Does this need to work for files of arbitrary sizes?" "Does this need to handle error checking?" "Is it important that the code is readable & documented?" "How long do I have to complete this test?" etc...
But in DevOps, I am given a specific situation that I know a lot about... and based on the situation I am used to answering these questions myself. I should be given a specific and relevant situation... and then create a solution to solve it based on my experience and knowledge of what the constraints are when dealing with that kind of situation...
If I'm given a situation that compares the /etc/passwd file with the /etc/group file, I'm not assuming those files could be millions of lines long each; that wouldn't be reasonable even in a system set up by the most bumbling sysadmin. And if they're blank or corrupt, we definitely have other issues. If this script needs to run unattended for a long period of time, it should have error checking and logging. But if it's just run once, I wouldn't need to do everything in software as long as I:
1) make backups of the files
2) 'less' into the files and take a quick look to make sure that there aren't any surprises.
3) Run a test on copies of the files...
Wayyy back in CS 101, did it ever really matter if a one-time piece of code executes in 0.0000000001 seconds versus 0.000000000099999 seconds? If that is important to you, then please give me a real-world example that requires optimization, or just specify that the solution needs to be as fast as possible. My recollection is that highly optimized code often needed to be rewritten when future assignments built on the code written in the earlier assignments.
Also, fyi for interviewers: if I'm using a scripting language, there might be some behind-the-scenes optimizations that alter the typical CS understanding of the performance characteristics of sets, hash maps/hash tables, and arrays. (See PHP and Python, for example.) It seems that any question that CAN use a hash map probably should use one in a programming interview. Scripting languages can also handle edge cases in a pretty forgiving way that interviewers may not be familiar with. For example, splitting a string a="a:b:c:::" in Python with b=a.split(':') works just fine and results in b=['a', 'b', 'c', '', '', ''].
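As a concrete instance of the hash-map point: a duplicate check with a set is both shorter and asymptotically faster than the nested-loop version an interviewer might half-expect (the function name here is just for illustration):

```python
def first_duplicate(items):
    """Return the first item seen twice, or None if all are distinct.

    O(n) with a set, versus O(n^2) for the equivalent nested loops --
    the kind of choice that matters far more than hand-rolling the
    underlying hash table.
    """
    seen = set()
    for x in items:
        if x in seen:
            return x
        seen.add(x)
    return None
```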
Which brings up a final point: in the real world, writing the first version of your code in a more generic and unoptimized way leads to code that you can more easily adapt when you rewrite or refactor due to changing requirements. Less optimized code is also generally more readable, so other coders who need to read it and make future changes can more easily spot problems months or years after you finish your version. But in a coding interview, always assume they want the most optimized version of the code -- though make sure to ask anyhow, just in case. Some people actually do care to see whether you can code in a way that balances speed and maintainability.
Testing is important, but it is extremely important to design tests in a way that helps reveal the characteristics that are important to the job you are hiring for. And if something is a google search away and not something that is necessary for the job, it's not something that an experienced coder will keep in their valuable brain-based memory storage.