We’d like to double the size of our team by the end of the year.
My boss told me that a couple of years ago, and that’s when I was first introduced to the question in the title. As a company goes through an accelerated growth phase, this question inevitably pops into its engineering managers' minds.
In this article, I'll attempt to answer that question based on my learnings and experience.
Right from the start, I'd like to tackle the most obviously unhelpful answer: lower the quality bar. We all want to hire smart people who can get the job done and are a good fit for our team and company. Lowering the quality bar is not an option, and it’s not what this article suggests.
We’ll look into reducing false negatives instead.
So, we want to hire rapidly, we can't wait until hundreds of people apply, and we don't want to lower the bar. That leaves only one option: we must reduce false negatives. We have to make sure every step of the hiring process rejects candidates for the right reasons.
Why would perfectly good candidates be rejected? Hint: it has everything to do with human bias.
While we can't avoid bias, we can still put shock absorbers in place to reduce the impact it has on the hiring process.
Let's look at the first, the most important, and most overlooked filter of the process: job descriptions.
You might not have seen it as such, but the job description is a filter and arguably the most important one.
There is a fundamental difference between the job ad and the rest of the hiring funnel: the job description is a step where candidates filter themselves out.
Hiring managers and interviewers have zero say about who gets filtered out by the job description. That is the candidate's decision, one hundred percent. They read the ad and say, "10 years of Python experience required? Oops, I only have 8, this isn't for me".
That example carries a lesson: don't add years of experience to the job description unless it truly matters. That person could have been the ideal candidate, but they won't even apply because of an ill-considered requirement.
A great job ad helps candidates filter themselves out for the right reasons.
Here are some tips to improve common job description issues:
Keep requirements to a minimum
3-4 bullet points should be enough. The more requirements you add, the more reasons you're giving suitable candidates to drop out before applying.
That's also true for job titles. Avoid seniority descriptors, as there is no agreed-upon definition of "senior". Plenty of candidates might fit your definition of senior but won't apply because they don't think they deserve the title.
Be careful with language
Another reason strong candidates might not be applying to your position is biased language in the ad.
Be as careful as possible with word choice. Studies show some words are gender-coded: they're more commonly used to describe one gender than another. Examples: words that evoke competitiveness and analytical work appeal more to men, while those that evoke teamwork and trust will speak more to women. Women tend not to apply for positions if the ad is masculine-coded.
It's not only about gender, however. If your ad boasts about the beer tap in the office or the "work hard play hard" culture, it will drive away people with families who want to be home in time for dinner.
Likewise, if you include requirements about academic programming languages and computational complexity theory, you will be biasing towards CS grads. The ad will be unappealing for those without traditional education in the field.
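As a first pass, a simple script can flag coded words in a draft ad. Here is a minimal sketch in Python; the two word lists are tiny illustrative samples I made up for this example, while real gender-decoder tools rely on much longer, research-backed lexicons:

```python
import re

# Tiny illustrative samples; real tools use research-backed lexicons.
MASCULINE_CODED = {"competitive", "dominant", "aggressive", "analytical", "independent"}
FEMININE_CODED = {"collaborative", "supportive", "trust", "together", "community"}

def coded_words(ad_text):
    """Return (masculine-coded, feminine-coded) words found in the ad."""
    words = set(re.findall(r"[a-z]+", ad_text.lower()))
    return sorted(words & MASCULINE_CODED), sorted(words & FEMININE_CODED)

masc, fem = coded_words("We want a competitive, analytical engineer "
                        "to join a supportive team.")
# masc == ['analytical', 'competitive'], fem == ['supportive']
```

A flagged word isn't automatically wrong; the point is to make the word choice a conscious decision rather than an accident.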
Volumes have been written about how crucial “the first five minutes” of an interview are. Several studies, from 1958 to 2012, reported that’s how long it takes an interviewer to make their initial assessment; they then spend the rest of the 1-hour interview asking questions to confirm that biased judgement.
One study reported in 2000 that judgements made in the first 20 seconds of an interview could predict the outcome of the interview. That's right, twenty seconds.
Here are a few strategies we can apply to reduce the impact of unconscious biases during interviews.
Do multiple interviews
To avoid dropping a perfectly good candidate because of one interviewer's biases, it's essential to have multiple people interview the candidate. Bonus points if the interviewers themselves are of diverse genders, races, cultures, and educational backgrounds.
Fewer interviews conducted in pairs are better than several one-on-one ones. Having one person lead while another takes notes and chips in occasionally is a great strategy.
Immediately after the interview, split up and fill out the scorecard (you have one, right?). Then, get together again with your interview partner and compare notes. Agree upon a decision and report back.
Have a set of questions
Another essential strategy is a fixed set of questions, each with a scoring rubric, designed to assess candidates for the position. It helps prevent interviewers from going easy on candidates they like and being harsh on those they don't.
It's also crucial to write down and report back the answers the candidates gave to each question. Yes, it's a lot of work. That's why I recommend doing interviews in pairs, so one person can focus on note-taking.
Reasons why note-taking is important:
- You can fairly compare candidates based on the answers they gave to the same questions.
- Someone who wasn't in the room during the interview can read the responses and help make a decision.
- If we hire a candidate that doesn't work out, we can go back to the interview records, learn from them, and fine-tune the questions to avoid the same mistake in the future.
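To make those comparisons concrete, the scorecard can be a small structure keyed by the shared question list. A minimal sketch follows; the questions and the 1–4 rubric scale are invented for illustration, not a recommended rubric:

```python
from statistics import mean

# Hypothetical rubric: each shared question is scored 1 (poor) to 4 (strong).
QUESTIONS = [
    "Describe a production incident you debugged.",
    "How would you design a rate limiter?",
]

def scorecard(candidate, scores, notes):
    """Bundle one interviewer's scores and verbatim notes per question."""
    assert len(scores) == len(notes) == len(QUESTIONS)
    return {"candidate": candidate,
            "answers": list(zip(QUESTIONS, scores, notes)),
            "average": mean(scores)}

a = scorecard("Candidate A", [4, 3],
              ["Walked through a real outage end to end",
               "Token bucket, discussed trade-offs"])
b = scorecard("Candidate B", [2, 4],
              ["Vague on detail",
               "Strong, covered the distributed case"])

# Because both answered the same questions, a side-by-side read is fair.
ranked = sorted([a, b], key=lambda s: s["average"], reverse=True)
```

The averages are only a starting point for the debrief; the verbatim notes attached to each question are what let someone outside the room weigh in.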
I think it's an industry consensus that interviews aren't enough to assess the quality of an engineer's work, so we should test candidates. However, there is absolutely no consensus about which type of testing is best.
When looking to reduce false negatives, I prefer to use take-home tests. They're low stress for the candidate and can be paired with an interview to review and maybe expand on the solution. Take-homes give me a lot more information to work with than a 1-hour live coding test.
Some people might not be able to take time out of their personal lives to do tests. Offer them alternatives, such as doing the test in your office or submitting an existing piece of code that is representative of their current abilities.
Some notes about tests:
- Design a test that is comparable to the work they'll perform daily. But make sure candidates know you're not asking for free labor (you're not, RIGHT?).
- If you require advanced knowledge of esoteric data structures during the test, people with a non-traditional educational background will be less likely to succeed. This point is related to the previous one. Make sure whatever you ask for in the test is relevant to the work.
- Unless the job entails writing code on a whiteboard, don't ask your candidates to do so during the test.
- Anonymize a candidate's submission and have it reviewed by other engineers that aren't otherwise involved in hiring this candidate.
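Anonymization doesn't need heavy tooling. For text submissions, even a small redaction pass over known identifiers helps; here is a rough sketch, where the identifier list would come from the candidate's application (it won't catch identifying details buried in coding style or git metadata):

```python
import re

def anonymize(submission_text, identifiers):
    """Replace each identifying string (name, email, username) with a tag."""
    redacted = submission_text
    for ident in identifiers:
        redacted = re.sub(re.escape(ident), "[REDACTED]",
                          redacted, flags=re.IGNORECASE)
    return redacted

code = '# Author: Jane Doe <jane@example.com>\nprint("hello")\n'
clean = anonymize(code, ["Jane Doe", "jane@example.com"])
# clean == '# Author: [REDACTED] <[REDACTED]>\nprint("hello")\n'
```

For repositories rather than single files, the same idea applies, but remember to strip commit history before handing the code to reviewers.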
Ideally, everyone involved has logged their final assessment (hire/no hire) somewhere by now. The next step is for everyone to read everyone else's report and make a collective decision.
The main risk at this stage is giving in to power dynamics, i.e., everyone in the room agreeing with whoever has the most authority. A few tips:
- People in positions of power/authority should speak last. Gather everyone else's thoughts first.
- People should have a good reason to change their hire/no hire decision.
- People should be speaking truth to power, i.e., they feel comfortable disagreeing with those more senior. If that rarely happens, it's a problem, and it needs to be addressed.
This article is a non-exhaustive list of common biases I see hindering our ability to hire engineers fast. It's in fact just the tip of the iceberg. I hope this guide helps you become aware of how pernicious those biases are.
If you suspect one step in your hiring funnel rejects too many candidates, look for ways to put shock absorbers in place for bias.