All companies build software differently. Some have automatic deployment, some don't. Some have strong testing procedures, some don't. A company not using CI isn't necessarily "worse"; it may just be indifferent to the indoctrination of the "SV mindset".
The more important answer is not a binary yes/no but rather the answer to "why aren't you using CI?". Common answers are:
- I'm not sure what CI is
- We don't have enough unit tests to justify it
- We're a small team and it doesn't really justify the effort to set up
- We're working on setting it up and should be live in the next 3 months
You can tell a lot about engineering competency and leadership from those answers.
This doesn't contradict the idea that practices which are more or less prerequisites for CI, such as really high test coverage (typically a requirement for that kind of software anyway) and automated builds (which can improve productivity a lot), often also make sense in such an environment.
One nice factor of the Joel Test (not that I saw it being used in reality - but as a mental model anyways), was that you could easily categorize companies into places you want to work or places you don't want to work. A perfect score? You want to work there. More than 2 things they don't do? You don't want to work there. 1 thing? Maybe look into it and see how important it is to you.
With this, your scores could be all over the map. What's more, the questions a company misses on might be ones that aren't that important to you (having a library), or even where a 'no' might be preferable to you (daily stand-up).
Even once you get past all that, many questions aren't easily answered objectively. What's a short iteration? I've worked in places that touted 2 weeks as a remarkably short iteration, and others who bemoaned how long that was.
That's all well and good if you have the luxury of picking and choosing from multiple offers. Here in the real world (i.e., not in SV), getting a decent offer (if you're not entry level) that's at least equal to your current pay generally takes 6 months - 1 year of hard interviewing. If I demanded a prospective employer scored even 50% on the Joel Test, I'd be perpetually unemployed. Which is probably why employers generally get away with providing sucktastic working environments for software developers.
I find it especially disheartening that the only one of Joel's 12 "tests" that's pretty much a universal "yes" these days is "do you use source control?".
How's that too many?
B) Weirdly, all of them are heavily process-focused, and none of them shows much interest in the ability to write functioning code, because
C) Software engineering, as a field, is built on the belief that anyone can write software if properly managed.[4]
[2] http://www.sei.cmu.edu/certification/
[3] https://www.computer.org/web/education/certifications
[4] and that would make software development much cheaper.
I taught martial arts for many years, and I can honestly say that I can teach martial arts to anyone. However, 98% of those learning will suck at it. They don't have the aptitude, the dedication, or the pain tolerance.
YMMV here. I've much more often encountered situations where I had to interview the maintainer of a repo (if there even was an official one) to find out how to contribute, whether they were even interested in contributions, and how to make sure my change didn't break anything.
A lot of these tools, processes, and special words are as much about good passive communication as anything else. That being said, tooling that doesn't fit development use cases is often worse than no tooling (presuming devs can make their own productivity scripts as needed).
When the business is willing to let it work, Agile can be kinda nice. But if the business doesn't buy in, if they keep interrupting, changing priorities and tasks mid-sprint, then it's just going to make everyone miserable.
The most important thing you need to make sure of is that developers talk with end users. That is one of the biggest points of Agile.
Now, if you look again at the article, you'll notice this one interaction is completely missing. Not only that, but Scrum de-emphasizes it too by creating middle-men, and most formal "Agile" methodologies don't even think about it.
After all, most of the CI/CD stuff in this list is covered by his original one-step-builds rule. If you have that, then scripting the CI/CD stuff is trivial. Similarly, his testers question covers testing.
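To make that concrete, here is a minimal, entirely hypothetical one-step build driver in Python. The point is that the same script a developer runs locally is all a CI service needs to invoke on each push; the step commands below are placeholders, not a real build.

```python
"""Hypothetical one-step build driver (build.py).

If the whole build is one command, "scripting CI/CD" mostly reduces to
having the CI service run this same command on every push.
"""
import subprocess
import sys

# Placeholder steps; a real project would invoke its compiler and test
# runner here (e.g. ["make"], ["make", "test"]).
STEPS = [
    [sys.executable, "-c", "print('compiling...')"],
    [sys.executable, "-c", "print('running tests...')"],
]

def build() -> int:
    """Run every step in order, stopping at the first failure."""
    for step in STEPS:
        result = subprocess.run(step)
        if result.returncode != 0:
            return result.returncode  # fail fast, like any CI job would
    print("BUILD OK")
    return 0

if __name__ == "__main__":
    build()  # a real script would use sys.exit(build()) so CI sees the code
```

A developer runs `python build.py` at their desk; the CI config then shrinks to a one-line invocation of the same script.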
Everything else in here is just disguised Scrum.
Isn't this a very good indicator that the Joel test is still sufficient?
Only once almost all companies pass the Joel test with 12/12 points [1] would a modernization be needed to distinguish further between multiple offers.
[1] or 11/12 points, see https://news.ycombinator.com/item?id=14078069
EDIT: Fix typo.
Does your software work?
These criteria are all rituals and processes, rather than the end result.
"Does your software work?" is almost impossible to answer objectively, and doesn't help you determine if the software is going to work 2 years from now (which a good deal of development best practices work to achieve). You might as well replace the test with "Is this company awesome?"
In some areas of software development, such as heavy-duty algorithmic/mathematical programs (encryption, video compression, computer graphics), there are pretty rigorous, objective ways to measure whether the program works and how well, without human testers or QA people. It's usually best to do both even in these cases, just in case there are subtle issues that the performance metrics don't capture.
On the other hand, predicting the future is noted for being rather hard. Predicting correctly whether a program will need to be changed in two years, in what way, and whether the program can in fact be changed easily at that point is speculation: a matter of opinion, rarely objective at all.
A well functioning development team is wonderful...for developers.
That's sort of my concern with these indirect questions about tools - it's very common to have them but not have them doing their job.
All of those practices can help, if used appropriately. But they're not going to magically make everything better.
If you have everything on the list in software that doesn't work well, I can spend the next 5 years fixing bugs and eventually get to useful working software; it won't be the most fun job, but it won't be a job I hate. Every item missing from the list makes it that much more likely that the missing thing will frustrate me until I just want to quit.
Your stringent definition of "working software" seems to fall in this category.
- Do you fix bugs before writing new code?
- Do programmers have quiet working conditions?
- Do you use the best tools money can buy?
This one is the most important one for me, and the absolute hardest to find. I believe we (as programmers) let ourselves get overwhelmed with extreme programming, daily standups, burndown charts, and other mostly meaningless stuff. We forgot the basics: peace and quiet. Everything else is just extra.
> 9. Do you use the best tools money can buy?
I hope I don't do Joel too much injustice, as I always loved reading his writings back when his "Joel on Software" blog was active.
However, this item on the list always sounded to me like an attempt to promote their FogBugz tool, not objective advice.
Recognizing that there are many excellent Free Software tools, especially in the software development area, I'd rephrase it as:
> 9a. Do you use the best tools available?
or maybe:
> 9b. Do you invest in your tools?
which means, depending on the exact tool, one or more of:
- buying a proprietary tool
- using a Free Software tool, and donating money to the project
- using a Free Software tool, and providing bug fixes and/or new features
- having one or more team members dedicated to improve the tooling and infrastructure
I always took this as a criticism of companies where developers who earn multiple thousands per month are pecking away at old PCs and squinting at 15" CRT screens, waiting 5-10 minutes for the OS to start up and more than 5 minutes to build a project.
Hiring someone to exclusively babysit a Jenkins instance is incredibly expensive. Paying for Travis CI/Codeship/Gitlab CI is really cheap in comparison. Having developers fill out purchasing orders and waiting for software or hardware is very expensive.
I like to call it the "IntelliJ test", can I requisition IntelliJ ($499) and have it the same day (week? month?) or is the company going to flinch, hem and haw at the absolutely inconsequential price of the software in comparison to the expensive developer time they're paying for.
One thing Joel's list is not: a guide to developing software, or to managing software projects, etcetera. There are many books about that. Joel's list is about recognizing ineffective leadership that will waste your time, which is finite, and limit your career and your earning power. And it is a checklist that you should be able to move through during a single on-site interview, not something that requires a four-hour conversation and an in-depth analysis of their decisions.
I'm thinking desktops/laptops with lots of RAM and SSD, fast internet, a proper chair, standing desks, a 4k monitor, ...
The quote "Those who only have a hammer tend to see every problem as a nail" is a good summary of the junior/mid-level mindset.
I don't know the author, but based on the rigidity of the article, I would guess that they've only worked for big companies. I would argue that most of these rules are only effective in the context of a very large company; in literally every other context, many of these rules are inefficient.
Big companies are all about risk mitigation; they are willing to sacrifice speed and agility in exchange for stability, certainty, and visibility. But this is actually a luxury that only big companies can afford, and it should not be taken as a rule of thumb.
My experience is that it's very hard to keep untested code at a high quality level. Any modification that isn't directly justified by a customer feature or a bug fix is frowned upon, because it's hard to tell whether it breaks anything; which means you pile up new features, but you can never modify their design so they fit better together.
When the philosophy is "now it works, let's never touch this module again!", code quality goes down to the toilet.
Better to just do real TDD in the first place.
#1 and #2 (CI/CD) are fair additions to Joel's list, but I'd argue they're already encapsulated by "do you make daily builds?". In most shops, if you make daily builds, then you CI/CD.
#3 = Joel's #4
#4,#5,#8,#10,#11,#12,#14,#15 are all "Do you SCRUM/TDD?". If that's the kind of place you're looking for, great. But there are many competent code-oriented organizations that do not SCRUM. So these don't really belong on a Joel List. (Also, "We don’t know the better way to make sure that code does what it’s supposed to, then to have another code [author means unit tests] that runs it and check results" just isn't true. We know better ways, and sometimes they're even relevant to a list like this. "Do you use any form of static or dynamic analysis (e.g., types, valgrind, quick-check style tools, linters, etc.)" is on my personal "Joel Test".)
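To give a flavor of the "quick-check style tools" mentioned above, here is a hand-rolled property test in plain Python; a real project would reach for a library such as Hypothesis, and the run-length coder is just an invented example. Instead of a few fixed cases, it asserts an invariant over many random inputs.

```python
import random

def run_length_encode(s: str) -> list[tuple[str, int]]:
    """Toy function under test: run-length encoding of a string."""
    out: list[tuple[str, int]] = []
    for ch in s:
        if out and out[-1][0] == ch:
            out[-1] = (ch, out[-1][1] + 1)
        else:
            out.append((ch, 1))
    return out

def run_length_decode(pairs: list[tuple[str, int]]) -> str:
    return "".join(ch * n for ch, n in pairs)

def check_roundtrip(trials: int = 200) -> None:
    """Property: decode(encode(s)) == s for any s, checked on random
    inputs rather than a handful of hand-picked examples."""
    rng = random.Random(0)  # fixed seed so failures are reproducible
    for _ in range(trials):
        s = "".join(rng.choice("ab") for _ in range(rng.randrange(0, 20)))
        assert run_length_decode(run_length_encode(s)) == s, repr(s)

check_roundtrip()
```

A company that can answer "yes, we run something like this (plus types, linters, valgrind, etc.) on every change" is telling you something a daily-standup question never will.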
That leaves "do you have a library?". IMO work-place libraries are close to useless as signals (everyone has one), and rarely useful in practice (unless you're curious how PHP code was written in 2003 or really want to brush up on complexity theory).
As an aside, it's kind of depressing to me that we still make these lists. Back in the 90's, software engineering was still a relatively young craft with relatively few experts. Joel was part of a surprisingly small group of people who: 1) had a career's worth of experience developing software for micro-computers in high level languages; and 2) had deep and successful experiences across several organization roles in different types of organizations (coder, manager at MSFT, CEO at Fog Creek). The existence of managers who were in charge of software engineers but had no engineering experience wasn't surprising at all, given the youth of the field. Hence the Joel Test.
The world is a very different place today. There are a lot of people with this level of experience. Joel Tests aren't ubiquitous in other engineering domains, and hopefully they'll eventually die out in software as well. Not because the items on them aren't important, but because experienced Engineers manage Engineers.
Of all the Merits, this is the one I disagree with. Its merit is primarily "joining like-minded folks" (i.e., a cult) rather than any inherent merit in and of itself.
How many great developers do you know who do not have this "merit"?
If employers encouraged or at least allowed it, you'd see a lot more great developers with this merit. But for most medium-to-large organizations it's a one-way street with OSS: they prohibit their engineers from open sourcing projects or contributing to projects the organization depends on.
I did like the questions around OSS and sharing expertise. I'd like to see more questions that address recruitment anti-patterns (diversity, ageism, disclosing previous salary, etc.) and tech organization anti-patterns (an actual career path on par with management, non-transparent equity grants, etc.).
Like, what would the questions be such that even, say, Google wouldn't look so good answering them?
It is not unlikely to find companies that fulfill all requirements, although they will likely know how attractive their working environments are and will filter candidates accordingly. An interview I had with such a small-sized software house two months ago confirmed this. I gave them the Joel test, which they had never heard of before, and they scored perfect. Dedicated testers, usability testing, quiet working environments (like a library, the team lead said; no need for headphones). Predictably, they were extremely picky as to who they let in.
The ones much less likely to get good test scores? 1) Government IT, by and large. 2) IT for any non-tech company below a certain size. 3) Non-tech corporations (and even some tech ones). One notable one I was aware of used Excel for bug tracking, was full of red tape, and its main technical test was a 20-question multiple-choice quiz.
Seriously? What is considered a DSL?
Then again, the client likes that project, because their customers also like it. It solves a problem for them that no-one had solved in a similar way before. I don't think we've had a single major bug reported against that part of the system by any customer in over five years, and typically that includes a multi-month lab evaluation by each customer before deployment.
So, does the development process on that project suck or not? :-)
Full disclosure: I lead the project.
Constructive comments very welcome!