Hire based on the conversation about code, not the code itself
30 points by brbash
A mini-product that a person can actually build in a few days with an AI agent, but that has enough surfaces for making decisions: architecture, data, styles, API, error handling, tests.
I don't like this approach. It's not only that they think it's acceptable for applicants to develop a proof-of-concept app "in a few days" of their free time for their interview process, but they also require the use of AI for this task. It seems they expect applicants to only apply to one job at a time. The "few days" soon accumulate, leading to a few weeks. They could create a demo project themselves and then engage in a live discussion about it with the interviewer. You should be able to draw the same conclusions from this method.
Totally! I usually aim for a ready-made toy repository that reflects our internal code quality, with an already-open PR. The applicant has to do a code review, and maybe make some suggestions for where the author of the original PR had challenges. Later, in the synchronous interview, we go through the PR, I let them explain their review, and we might do some small exercise together.
I feel like organizations that hire by searching for extremely tidy coders with a lot of attention to detail, but then hold a much lower standard internally, are dishonest and set both the newly hired person and themselves up for struggles.
I like this idea. The only time I had a code review interview, they gave me a file and asked me what I'd improve or change or comment on, and it was so devoid of context that it was difficult to see anything meaningful. Having a PR to understand would make that a lot more concrete and give you a place to start.
You took everything too literally. It sounds as if all companies suddenly decided to conduct interviews exactly this way, which is unrealistic. If you're not ready to invest time in a task that shows interest in the project and solves a more realistic problem demonstrating a candidate's real skills, and instead just want to do a leetcode-style interview, then that's fine -- if it works for you.
But hiring the wrong candidate who gets fired after 3 months will cost the company more.
All these approaches can be flexible depending on the type of company, its size, and so on.
And thank you for the comment :)
You’re literally saying this is how companies should interview. Given that position you can’t then say “the practicality of this depends on companies not doing this”.
It’s not just demanding uncompensated work; it’s demanding work that actually costs the candidate.
And of course by including “ai” in this you’re essentially saying “and the code won’t be representative of the candidate’s skill or understanding”, making the entire exercise pointless. There are myriad problems with this that I raise in my top-level comment.
Also: repeatedly (and only) posting your own blog is a breach of lobsters rules.
Honestly the point I took away from this train of thought is that if the discussion of the code is what matters, don't have them write any code at all. Give them an hour to analyze a piece of existing code, and ask them what they would do to improve it. This will give you a much better signal and maps more closely to what the candidate will be spending most of their effort doing anyway !
I do ask people to review code in interviews. The "find the bug" ones are questionable; "find the potential security bugs" is more practical. You do need to ensure that you don't have unintended "potential problems" in the code, as even if you try to tightly scope the kinds of problems you care about, they can easily cause tunnel vision.
Or you ask them to whiteboard-code a solution to a problem. I do not care about it being perfect, syntactically correct code, and I'm perfectly happy with a "just write some pseudo-API for some library function that you want to use, and then use that".
The point of whiteboard coding is to ask questions along the way: why X? What about Y? Obviously general correctness matters, but seriously, I don't care if the actual code is not strictly correct - you're writing on a whiteboard, not to a compiler. An interviewer expecting "writing at a keyboard" coding on a whiteboard is a bad interviewer, as that's not useful information.
<short rant on "puzzle" interview questions> This is the problem with those dumb puzzle questions: either someone knows the answer and writes it down (and may not know the why of the answer), or they don't and it's a 50/50 if they can work it out, and then write a solution. But by their very nature someone not working out the answer provides no information, and someone "working out" the answer may already know it (there are literally books about these dumb questions) and you can't distinguish "I worked it out" from "I knew the answer but also knew to hide that I knew the answer". The result is that those questions give you no information (but this is not news to anyone on lobsters) <end rant>
IMO, even though such questions are ostensibly banned, Google has a bunch of interview questions that bounce right up against the puzzle-question barrier.
Hello, I’m not an admin, but I noticed all your submissions and comments are exclusively to your own blog and within the last few days. It’d be nice if you engaged with the larger community. I suspect people might start reporting you as spam if you continue at this rate.
Self-promotion: It's great to have authors participate in the community, but not to exploit it as a write-only tool for product announcements or driving traffic to their work. As a rule of thumb, self-promo should be less than a quarter of one's stories and comments.
^ From our about page
I also like the "conversation about code" approach, though this sentence ruffled my feathers a bit:
A mini-product that a person can actually build in a few days
Sorry for the harsh tone, but I think that asking a candidate to spend a few days of their time, without compensation, on a throwaway project, is egregiously disrespectful.
A few years back I was hiring at a startup and we also used a take home assignment as part of the process. In the instructions we gave candidates I really stressed how we didn't expect them to spend more than 4 hours on the assignment, and that it was completely OK to send it in unfinished in case that was not enough.
That made the time investment much more symmetrical (though still not completely equal).
Sorry for the harsh tone, but I think that asking a candidate to spend a few days of their time, without compensation, on a throwaway project, is egregiously disrespectful.
It also won't work with senior enough candidates, who'll just reject it and find someone else who won't require days of free labor.
In the instructions we gave candidates I really stressed how we didn't expect them to spend more than 4 hours on the assignment
This sounds like you never did this yourself, because if you think about it, it does not make sense to follow this kind of limit. Imagine I'm unemployed and I'm assigned a take-home coding challenge for an interview: why should I limit myself this way and submit less-than-ideal code when I can spend all the time I have perfecting my code and maximizing my chances of getting the job? No one who really wants or needs the job will do that.
It also won't work with senior enough candidates, who'll just reject it and find someone else who won't require days of free labor.
Yes, exactly.
Imagine if I'm unemployed and I'm assigned a take-home coding challenge for an interview [...]
That's a fair point, but (assuming the legitimacy of the take home assignment as an evaluation method) how can you "forbid" the candidate to spend more than 4 hours? A sort of time-trial maybe, but that'd be horrible for a whole other host of reasons.
In that case I tried to make it clear that the code they'd produce would just be a starting point for the interview, not something that needed to be shipped to production. I also made the task big enough that it was obviously unreasonable to expect someone to implement it all, and told candidates to just focus on one aspect, though I can also see how a candidate could still think to go above and beyond to demonstrate their talent. (Here's one of the assignments we were giving, btw.)
But the main point, I'd say, is, as a company, trying to respect the candidate's time, and trying to make the tedious-but-necessary process as smooth as possible.
how can you "forbid" the candidate to spend more than 4 hours?
I don't think you can, that's one of the many reasons why I think this method of candidate evaluation is also not useful.
In that case I tried to make it clear that the code they'd produce would just be a starting point for the interview, not something that needed to be shipped to production
It doesn't matter what the company says. If I'm a candidate who really wants or needs the position, I'll do my absolute best, spending days and nights delivering the best code I can possibly deliver, simply to maximize my chances. I think most people who want or need the position will do the same.
But the main point, I'd say, is, as a company, trying to respect the candidate's time, and trying to make the tedious-but-necessary process as smooth as possible.
So there's really no other way to evaluate a candidate while respecting their time? Even the usual 1-3-1-1 setup(*) is more respectful, as the company and the candidate spend an equal amount of engineer-hours on the process. (Actually, the company sometimes spends more, when each interview is attended by two engineers instead of one, which is not uncommon.)
You can also just sit down with the candidate with a few of your engineers and do pair programming or design. You can be creative and find other ways I'm sure.
(*) 1 hour of preliminary, 3 hours of coding, 1 hour of system design/architecture, 1 hour of culture/behavior/just chatting
It doesn't matter what the company says. If I'm a candidate who really wants or needs the position, I'll do my absolute best, spending days and nights delivering the best code I can possibly deliver, simply to maximize my chances. I think most people who want or need the position will do the same.
Yes, but at that point it's the candidate's choice, like they might choose to spend time on writing personalized cover letters, or on doing other things they think will give them an edge.
And live coding has drawbacks as well. I've had candidates "panic" during live coding and perform in a way that didn't reflect at all their abilities.
I don't think there's a best / correct way of doing interviews. And I accept that when I am the candidate, as long as I see that the company is actually trying to respect me and my time, and not telling me to "jump through hoops, or we'll get any of the other 100 people that applied". (Also because, as you pointed out, it most likely ends up determining who is applying and who is getting hired -> the people you'd be working with.)
You have no limits. The limits are only in the initial conditions. If you communicate that you need more time and explain why, it already says something about you as a person who asks themselves critical questions.
You always have the option to say a hard no to such an interview process, which other people wrote about in the comments, or to invest time in getting into a company that interests you.
I said "a few days", but anyone in their company can choose any other deadline for this. It can be 1 hour, 2 hours, 4 hours, 1 day, a week, 2 weeks, or even... a "super day", where you just pay the candidate for a day of work and they work for you. So it can be absolutely flexible in your organization.
I’m sorry, but if a job is saying “spend a few days on a project”, you’re charging the candidate money in the form of vacation time. You likely also can’t pay them (see the addendum below). Even if it’s a “throwaway project”, you’re saying “take time off from your job”.
That said throwing in “ai” here also demonstrates that there’s no interest in knowing anything about whether a candidate can design or engineer software, so it’s useless as well. Having someone talk about a project they didn’t write, that is built on the theft of code from actual competent engineers is worse than useless.
In any interview the important thing isn’t exact correctness of code; I’ve never been interested in exact code or library correctness (I see people complaining about how they’d have to google to look up APIs in interviews, and I would never consider that necessary - it’s weird to hear people saying it’s required). I guess/assume bad interviewers do bad things, but a bad interviewer isn’t going to become better after demanding free labor.
Edit:
There’s also the other super fun thing: you would be prohibited from using anything at all (design, ideas, etc.) from the project presented (in the “if you do this you can be sued and will almost certainly lose” sense). So your interviewers would need to have an absolutely clear understanding that nothing they see or talk about can be used. The entire IP - again, design, UI, architecture, etc. - belongs to either the candidate or their current employer. You can’t pay them for this time: almost all engineering employment contracts prohibit work for other companies.
If it's so important to you that a person can write code without an AI agent or some other helpers, ask yourself a question: do you meet this requirement yourself?
Yes.
On a more serious note, there's a big imbalance here. Writing code without using an LLM means that you can write code, as otherwise you're just reading machine-generated text and passing it off as your own, sometimes post-review.
Now, writing code without some other helpers means that you just get the code from the top of your head, without access to documentation of any kind. That's preposterous except for the most basic language features. No one wants a new hire searching for how to write a for-loop, but if an interviewee resolves a doubt about some particular case in <algorithm> in under 30 seconds, that's an almost instant hire.
Still, I like your interviewing process ideas. Thank you for the good read :-)
Discussing code in conversation is absolutely valid during an interview, but requiring AI for the process is a hard no from me and I won't be doing it in my hiring.
Does your company use any AI tools or optimizations in its infrastructure? Are the engineers actually using them?
We use some in our infrastructure for specific use cases, but only one of our engineers uses AI tools for actual development. I'm the one who has to clean up the subtle, or not so subtle, issues that result from said tools. As I am the one who does the interviews for new engineering team members, I'm certainly not keen on making it a prerequisite.
I have done a huge amount of technical interviewing and trained others in the art. This article is absolutely on the money.
For the last ten years I have been asking a fairly simple but loosely specified problem; I explain that I do not expect the interviewee to solve it by themselves but this will be a collaboration between us. I then invite them to pitch a design and discuss the different approaches.
We then work on the code, mostly them but with me offering suggestions. Then I ask them to suggest unit tests, possibly discovering bugs in the process (no shame in having bugs, but the unit tests should be good enough to detect them). Finally, I ask them to explain what would need to change if the code needed to be thread-safe, and possible performance improvements.
At no point during the interview do we actually run the code. I don’t even care if it compiles so long as it looks reasonable. The meat of the interview is in the conversation - how quick are they to grasp the problem, do they admit to mistakes or bluster through, can they take instruction, what questions do they ask, etc.
So much of software engineering is collaboration, and I have worked with too many people who were amazing at writing code but terrible to work with; selecting for straight coding ability seems counterproductive to me.
Also, I have met several people who are fantastic team members but who suck at LeetCode-style problems, and vice versa.
There was a time (around 2017) when we adopted coderpad (or something like it) at work, and so for a while I gave candidates the option of coding on the whiteboard or using the tool (and being able to run their code).
What surprised me is that the whiteboard interviews seemed to go better. The people using coderpad kept getting hung up on debugging minor syntax errors (e.g. missing parenthesis, etc) that prevented the code from running, and that on the whiteboard I would have just ignored. (I tried to just point those out as I saw them, but sometimes they were subtle.)
The people using coderpad kept getting hung up on debugging minor syntax errors (e.g. missing parenthesis, etc) that prevented the code from running, and that on the whiteboard I would have just ignored
This problem is real, one time I almost botched an entirely trivial coding problem because of this.
I don't use syntax highlighting; I only highlight comments and code differently - all code is highlighted the same(*). One time we used a coderpad-like platform where we not just type-checked but also ran the code, with tests. This platform required selecting the language you're writing in (instead of writing plain text) to be able to check and run the code, which made it almost impossible for me to focus. I think I must've stared at the screen for a minute or so to adjust.
Another time we selected one language but wrote in another (I was free to use whatever language I want but the interviewer copy/pasted some signatures etc. in a language and used the language's syntax highlighting), and I also got stuck for a minute or so before I could adjust.
(*): My Zed theme to do this is here, VSCode config is here. If you know a better way to do this in VSCode let me know!
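For anyone wanting to approximate this in VS Code without a custom theme, one way is `editor.tokenColorCustomizations` in `settings.json` - a rough sketch only, not the commenter's actual config (their linked theme isn't reproduced here); the scope names are standard TextMate scopes, and the colors are placeholders:

```json
// settings.json - flatten most token colors to one foreground,
// keep only comments visually distinct (colors are illustrative)
{
  "editor.semanticHighlighting.enabled": false,
  "editor.tokenColorCustomizations": {
    "textMateRules": [
      {
        "scope": [
          "keyword", "string", "constant", "variable",
          "entity.name.function", "entity.name.type", "support"
        ],
        "settings": { "foreground": "#d4d4d4" }
      },
      {
        "scope": "comment",
        "settings": { "foreground": "#6a9955", "fontStyle": "italic" }
      }
    ]
  }
}
```

This overrides whichever theme is active, so any scope not listed will still fall back to the theme's own color.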
At least among the large companies I don't think it's common to hire without evaluating collaboration/communication skills, though some have better methods for that than others.
IME this kind of thing is most commonly evaluated in a dedicated 1-hour behavior/culture/etc. interview, but IMHO that's not the right way, as those interviews are easily faked by making up stories that give the right signals.
Interviewers may be able to tell fake from real in some cases, but I'd think (and I've also heard from others, even on SWE podcasts) that the worst candidates - the ones comfortable with lying/role-playing/storytelling - would pass these.
A better approach that I've experienced myself as a candidate is doing all interviews in a collaborative way. E.g. back and forth discussions, discussing each other's code/making changes, clarifying ambiguities etc. and just chatting during the whole process instead of just watching the candidate code and explain their thought process.
Also, if the interview is on-site, anything that you say or do any time while you're there (e.g. during small talk at lunch) can be a positive or negative signal about various things. I actually think that this can be a good way to evaluate. At least for me, interviews are very stressful and after 3 hours of them I get extremely tired, and when I'm tired I usually get brutally honest about things. So if you want to know what I think about e.g. AI assisted coding, that's a good time to ask me about it and get any signals you may be looking to get or not get :-)
This is essentially what we do as well in my team. The best way to see if someone is a good fit is to listen to them talk about their own code and ideas.
If you look at a time period of several years, do you see positive signals that the right decisions were made during hiring? Or have you only recently started doing it this way?
Good idea in general, but I have to point out that social engineering and face-to-face cons are a longstanding tradition for influencing people's opinions. At worst you just select for a different variety of bullshit.
Maybe, but given how much awful software exists, I’m not convinced any of the practices from the last 20 years are good.
And people have been valuing soft skills over technical skills increasingly over that time.
(Horrible FAANG leetcode-style things notwithstanding - but there are more soft-skills interviews at Google than technical ones, “Googliness” being my personal fav.)
I have done the same. With the increase in AI usage, I think it is becoming more important to see if the candidate can reason about the virtues or failings of what is produced.
Honestly I don’t think there’s much point to a take home that asks people to write code. You can just game those with an agent now, and frankly I don’t see a reason people shouldn’t.
The ability to read and understand code is much more important, and has become even more so. Agents can support that activity but they can’t understand the code for you and their ability to reason about code is still limited.
Edit: and talk about why code is as it is
With this mindset we could start conducting system design interviews based on the candidate's blog post on a previously agreed topic.
What would this test? Claude and the candidate's short-term memory? Does this help extrapolate whether the engineer is good at designing and supporting elegant systems? Do they communicate ideas clearly when requirements change? Can they trace their decisions back and iterate on assumptions in case of a design bottleneck? And last but not least, how do they react to unexpected technical questions IRL?
The best interviews I've ever gone through were the ones where I was asked to present work I'd done before. They were a great opportunity to present work that I'm proud of and provide signals about what I'm interested in and who I am. It was also great to see the kind of feedback and questions I got from potential coworkers, to get a sense of what it would be like to work with them. But the key thing here is that I could bring work I'd already done.
At $work, we do ask for a project. We analyze the result to find the right things for the interview:
- Explaining what is done
- Explaining what may be improved in the future
- Saying "I don't know"
- Refuting something we are saying (we genuinely craft wrong assertions in our code understanding to let the candidate oppose something)
In the end, the code itself is not important, merely a component to support the technical knowledge