It makes me think that we're training the wrong people in college by making CS a very difficult, math-heavy field, which often causes the more human-skilled people to drop out. Programming doesn't have to be any more math-heavy than building a house, yet we force undergrads to implement algorithms on paper. The amount of wasted potential talent due to college is staggering.
I disagree. I think instead there is a category error being made: that CS is an appropriate degree (on its own) to become a software engineer. It's like suggesting a BS in Physics qualifies somebody to work as an engineer building a satellite. It doesn't, but that doesn't mean "physics is too math heavy." In fact, engineering a satellite requires almost as much basic mathematics education as a BS in physics requires (some exceptions might include the specialized mathematics required for upper-level theoretical physics concepts that may not apply at an engineering firm).
Completely agree here. I often find myself wondering why 'Software Engineering' isn't the degree required to be a Software Engineer, and why it doesn't really exist as a major, whereas 'Mechanical Engineering', 'Electrical Engineering', 'Civil Engineering', 'Chemical Engineering', etc. are the degrees associated with those professional titles. To your point, I don't think it's a simple matter of nomenclature (i.e. that CS and Software Engineering are synonymous). I'm not a CS major myself, but among my friends who did their undergrad in the US, I don't think any of them had classes that covered requirements gathering, putting together schedules, etc. Any CS majors here have a class/classes that covered those topics?
Really, this goes back to the purpose of an undergraduate degree. Some people think they should be job training programs while others think they should teach more fundamental skills / topics so students can learn what they need to know for a job.
In the end, CS is a technical degree about a technical topic. There may be an argument for a less technical CS degree, but CS without mathematical rigor is not CS.
Musicians generally are. But all the talented ones are drawn to more fun, lower paying careers.
Even developers with good interpersonal skills or business sense are discouraged from participating in more producty discussion. They may be encouraged by way of somebody taking them aside early in their career and saying "Hey kid, you really _get it_. You're not like the rest of these nerds. How about you start calling shots on what to build and switch to a product role?" But they'll rarely be encouraged to stay as technical while simply getting listened to more by management. Management may argue that after 2 or 3 years of getting one's hands dirty, you understand programming as well as you need to, and that for the rest of your career, persuasion trumps skill acquisition. It's somewhat taboo to cultivate both skill sets at the same time. So, I think even if you got more "people people" to study CS/whatever, a reverence for specialization will silence a lot of voices.
However, I've been having doubts about how long I should still stay there, because of this 'digital nomad' lifestyle which seems to be so popular now (and which appeals to me also). So, that article was a little relief for me, seeing that there are indeed still people who value a willingness to immerse oneself deep into a topic and come up with novel and simple solutions to problems your users face in that field (something which I've liked about software engineering from the start).
Also, to add to your point about the CS courses: I think the math-heavy courses, although they really may be overdoing it in a lot of CS courses, can at least help sharpen your analytical thinking skills. This can help tremendously when you're thrown into something new where the most important thing you need to do is figure out what the problem actually is and what people need from you to solve it.
But I don't see anyone suggesting that we remove math from engineering curriculums because the design programs can do the math for them.
The solution is to train more experts in human computer interaction, the same way we train architects and interior designers to work with civil engineers.
Ugh.
If that's true (and I've never seen any actual evidence that it is), my guess is that it's because bootcamp grads are generally older than CS grads.
They're usually career switchers and they have experience from that previous career. You really shouldn't be comparing a 35 year old ex-teacher, or a 28 year old law school grad to a 22 year old CS grad on the soft skills front.
I always say the customer/client does not have requirements, they have problems. You will not discover all the requirements until you start solving some of the problems and providing solutions. Only then will they say "oh but...." and drop more requirements on you that they didn't think of up front.
Back to that quote. It's not that software development is exploratory in itself. It's that the development is intertwined with an exploration of the problem being solved.
I think one of the important qualities of an architect is to anticipate what these requirements are going to be and define solutions to them ahead of time.
I have this conversation all the time with our client-facing team.
Me: "What is supposed to happen if this data changes?"
Colleague: "Well the customer didn't give us a requirement for that so we don't have to worry about it"
Me: Screams inside
> the software development process is exploratory by nature
>> customer/client does not have requirements, they have PROBLEMS. You will NOT DISCOVER all the requirements until you start SOLVING SOME of the problems and providing solutions.
> "The most valuable asset in the software industry is the synthesis of programming skill and deep context in the business problem domain, in one skull."
> But If Someone Else Knows the Business, Why Can't They Just Give Me a Spec?
> The Unmapped Road
>Miles and miles of a software project are spent roving this vast territory that is not exactly computer related, nor exactly business related. Rather, it is a new and emergent problem space that explodes into existence while bridging the middle distance between the business problem and the technology.
> In reality, this vast field of micro-problems was there all along. But the only way to discover them was to encounter them one at a time, like boulders in the path of a road-laying crew
> What is Deep Context?
> Deep context is the state of having achieved a kind of mental fluency in some large percentage of this immense field of micro-problems that appears in the space between technology and a business domain.
And those problems change as the business adapts to changing market forces.
We started with a simple problem that plagues HR departments in every conceivable industry with unions: finding substitute personnel. We erroneously assumed it would be a simple fix. Over the past year and a half we accumulated a great deal of knowledge by interacting with as many people as possible, and we have finally released a version that meets our original criteria (and much more). It was obviously not a simple fix.
If I have one thing to tell anyone who is looking for business ideas to try out their new programming skills on, I strongly suggest taking the time to learn as much as possible about the people to whom you want to provide a solution, then recruiting one of them to help you build it, lest you become another project that solves a non-issue beautifully.
We built systems that literally had people say, "Oh, thank God!" when demoed. I haven't seen any other development methodology that matches it. You really have to understand a problem at a deep level in order to reason well about it.
We would have spent at least 10 times longer trying to get these insights otherwise, if there was even a chance we would have arrived at the same conclusions at all.
It takes a little time to get over the fact that you are no longer building this product for yourself (unless you are building dev tools), but seeing customers use your product happily and telling you how much they value it is well worth the investment.
In the consulting world, we call this job "enterprise architecture". It does, in fact, pay very well: it requires someone with both a sharp business mind and comprehensive technical skills, and those are very difficult to find in one person. I personally am more of a "jack of all trades" type; but you can be a successful architect by focusing on specific technologies as well.
I honestly find that it's easier to take someone who's a hacker type, and teach them the business. You look at the business itself as a large, complex system and model your application development around that. But you also have to be a good enough technologist yourself so that you can tell your dev team when their designs don't match up to the business problem (this is a common problem when requirements are not clearly communicated).
A good architect is the person who understands both the business context and the technology implementation. You don't have to be in-the-weeds building the product, but often you do have to build quick POCs to prove out an approach before handing off the designs to development - so being able to code is a necessity IMO.
Put like this, it's hard to agree with the above statement.
Following the spirit of the article, I assume the author means that the problem domain is pretty stable. But I've been in this for more than 20 years, and I know that requirements always change. Not only our understanding, but also the customer/user's understanding of their needs and priorities change.
And not only does understanding change, but the business requirements actually change too depending on market conditions.
The business domain constrains the field over which the requirements can change, and the sort of deep context which the author mentions will also range over that field.
- intuitiveness, i.e. how easy it is to communicate with this person (language fluency, etc...)
- quality, i.e. how well this person understands not only the requirements but also the actual goals
- 'latency', i.e. how convenient and how fast you can communicate with this person (time zone, can you both see facial expressions, hear changes in voice, etc...)
* intuitiveness, i.e. how easy it is to communicate with this person (language fluency, etc...)
Native English speakers have an advantage here.
* quality, i.e. how well this person understands not only the requirements but also the actual goals
Experience, empathy, critical thinking, intelligence. Not necessarily common or easy but on site vs remote doesn't affect this really.
* 'latency', i.e. how convenient and how fast you can communicate with this person (time zone,
Hire people from your country or even your time zone.
* can you both see facial expressions, hear changes in voice, etc...)
Use video chat constantly.
Remote work is a skill like any other. It makes sense that most employers who offer it require 5+ years of prior remote experience. The article's author makes a good point: a great way to get this experience is to work at a company with many remote employees and start on site before transitioning to full-time remote.
Maybe some people are just cut out for remote too. I remember at the beginning of my career running a business where I talked to the CEO of a mid sized company regularly about his needs, and always delivered. He was thrilled and amazed. It was just good listening, communication, programming skill and hard, applied work. Nothing fancy.
The weird thing about the Bay Area is if you want to live on 5 acres in a home built in the last five years somewhere quiet and pretty that is 20 minutes from the office in traffic, you're basically looking at Woodside. On the low end, those houses start at around 3 million. Good luck paying that mortgage on the income of even two software engineers.
Whereas you could buy the same house somewhere else in California for 300k.
So even though it is indeed pleasant to have coworkers to talk to in person for social needs, the compensation to housing cost math just flat out doesn't come close to working unless you are willing to make some serious housing quality sacrifices.
Also, experience may vary. In my admittedly not-so-long career (less than a decade), I have seen teams where business rules are the major source of complexity, and other teams with fewer business rules (for example, database/data-warehouse/build-system teams). Admittedly there are fewer teams of the second type in the world, so the general perception is that the hardest part is communication and understanding of the business context.
As for domain knowledge: even the mainframe and COBOL chaps make a lot of money, while smart freelancing open-source contributors often don't. Money is not the only way to judge the hardest problem in software development.
> multiple if-elses encoding the business rules
A good architect would add a linter with a cyclomatic complexity check that fails the build. There's always a way to avoid these hideous nested else-ifs. Code review is helpful here for training people who don't yet know the abstractions, such as polymorphism, that avoid this. This is also the code where unit tests have high value.
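As a minimal sketch of the polymorphism point (using hypothetical discount rules invented for illustration, not anything from the article), each branch of a nested if-else can become its own small policy object, so the dispatch site stays flat and each rule is testable on its own:

```python
from abc import ABC, abstractmethod

# Nested if/elif encoding of hypothetical business rules -- the shape
# a complexity linter would flag as the branches multiply.
def discount_nested(customer_type: str, total: float) -> float:
    if customer_type == "vip":
        if total > 100:
            return total * 0.8
        else:
            return total * 0.9
    elif customer_type == "regular":
        if total > 100:
            return total * 0.95
        else:
            return total
    else:
        return total

# The same rules, one small class per rule.
class DiscountPolicy(ABC):
    @abstractmethod
    def apply(self, total: float) -> float: ...

class VipPolicy(DiscountPolicy):
    def apply(self, total: float) -> float:
        return total * (0.8 if total > 100 else 0.9)

class RegularPolicy(DiscountPolicy):
    def apply(self, total: float) -> float:
        return total * 0.95 if total > 100 else total

POLICIES = {"vip": VipPolicy(), "regular": RegularPolicy()}

def discount(customer_type: str, total: float) -> float:
    # Unknown customer types fall back to "no discount" instead of
    # adding another branch to a deep else chain.
    policy = POLICIES.get(customer_type)
    return policy.apply(total) if policy else total
```

Adding a new customer type then means adding a new class and a dictionary entry, which keeps the cyclomatic complexity of any one function low enough to pass the build check.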
Solving well defined algorithm puzzles has zero bearing on the skills described by the author.
That, ironically, is part of the authors' point: software engineering hasn't changed that much. They were saying that in the late 70s, looking back on the past two decades (all the way back to the late 50s, when software was typically written in assembler or machine language and had to be rewritten if you bought a new computer!), but it hasn't become much less true. The technology changes, sure, but the failure of software development projects has never been mostly the result of technology. It's always been on the human side -- failure to understand the requirements, failure to meet them, failure to estimate the effort appropriately, failure to work with the customer, failure of the customer to understand what they were paying for, etc. There's a long list of things that go wrong, and I suspect anyone who has been in software for a while has seen many of them.
It was eye-opening for me, the first time I read it, to realize that people had been dealing with the same issues I was dealing with, for longer than I'd been alive. (And a bit depressing, too, that we seemingly haven't gotten much better.) Languages and project-management methodologies come and go, and the tech skills and understanding are certainly necessary, but they are not sufficient. The business knowledge and human factors seem to be the difference, or at least the largest controllable variable that leads to a difference, in a successful or failed outcome.
Ha! We need a "tribal leadership" discussion forking off this point.
Rather than just a "wisdom dump" it becomes more of a story with a purpose.
Case in point: I've been working remotely for 2 years now, and I've gathered as much domain knowledge (if not more) as I had in previous roles over the same timespan.
What's important is to get your software out there, so you can map the field.
- Here's a pool of knowledge about software development: hardware, operating systems, memory, disks, file formats, databases, networks, protocols, languages, debuggers, design patterns, security, accessibility, UI/UX, distributed systems, paradigms, typical algorithms & data structures, and CS problems
- There's a pool of knowledge about whatever industry you get into as a developer: user demands, existing workflows, existing infrastructure, previous decisions, legal regulation & compliance, physical laws, profitability, and practical limits.
Your software development skills should reach a point where you don't write "Bad Code" -- anything that's plainly wrong, like loading an entire database table into memory when you could read individual rows, storing passwords in cleartext, or doing nothing for accessibility (this is not about design patterns or space/tab debates). These mistakes have been made hundreds of times by new and 'experienced' people.
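To make the first of those anti-patterns concrete, here's a small sketch using an in-memory SQLite table (the schema is invented for illustration; the point is the access pattern, not the data):

```python
import sqlite3

# Set up a throwaway in-memory table for demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [(i, f"user{i}") for i in range(1000)])

# Anti-pattern: materialize the whole table in memory at once.
# Memory use grows with table size.
all_rows = conn.execute("SELECT * FROM users").fetchall()

# Better: iterate the cursor so rows are fetched incrementally
# and memory stays bounded regardless of table size.
count = 0
for row in conn.execute("SELECT * FROM users"):
    count += 1  # process one row at a time
```

With a thousand rows either version works; with a hundred million, only the second one does, and that's exactly the kind of thing experience has to teach because the small-scale demo never complains.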
It takes time to get to this point. More time than anyone likes to admit because the pool of knowledge grows and shrinks daily, but has undoubtedly had a net expansion since computers were a thing.
It takes time to get deep knowledge about whatever industry you get into. This is different for every industry. There's a practical minimum that you need to work on solutions or do maintenance on software within this industry. This is to avoid "Bad Code" which will hurt you, other people, or your business.
You can gain industry knowledge by just being given problems and being shown. This is probably how most of us know our industries from the get-go. A minority of us came from those industries and transitioned to programming later, so we already had a base level of knowledge of our problems.
If I've got the definition of Deep Context right from this article, it means to get to that point, you have to spend a good amount of time within the industry. It's not something you can gain completely by reading out of a book.
If you're to gain deep context within an industry, you have to devote some time away from software. You can't do both at the same instant (but certainly within a day). When you study an industry, there's an opportunity cost to not learning something new about software and vice versa.
When you add more requirements to a single job, it increases the time we have to spend before we're employable. Not every industry changes as fast as software does, but some certainly do, possibly catalyzed by software.
If you increase the time requirements, it's going to reduce the available pool of engineers as long as all of the engineers are honest and don't apply for jobs or remote contracts until they're ready.
If you don't want the time requirements to increase, you have to pay the opportunity cost from one of your pools of knowledge.
So really, we need a much better "good enough" for employing developers and for career development, including teaching software and industry knowledge. Eventually the time requirements are going to become steeper and steeper; they can't go up forever.
How about instead of becoming the expert at whichever business domain the software is for, we become experts at helping business domain experts find and express the business rules that need to be implemented?
You can fit a decent amount of that skill and the technical knowledge you mentioned in one skull, and it still lets you be quite effective in more than one domain.
That's what I'm going for anyway. Wish me luck.
It's really an argument about who's going to spend the time, and it appears simple:
- Devs learning software and industry knowledge
- Business experts learning software knowledge (incl. technical writing) and industry knowledge
Product Managers with some technical knowledge and writing skills are best at being a middle layer between raw customer requests and development specs in my experience. PMs and customers struggle when they don't have a good vocabulary to use to describe features that they want. That's when a dev has to translate or teach the person. Then again, that's asking a PM to learn industry knowledge and technical knowledge and product management knowledge. This is especially true if you have a good QA pipeline.
I've seen analysts and PMs that didn't have a good UI/UX vocabulary or weren't exposed to different UI/UX's, and usually their requests were the most vague and resulted in the most unspoken details.
I've also had PMs that knew how to write a good technical spec down to quick UI mock-ups and error handling. They also had technical writers to pose questions about some of the details.
Pretending I could be as good as the latter is foolish, and if I could, my salary should have been a combined one for doing three jobs well. I think one-man-army, $250k/yr full-time positions are rare though. We seem to be inching closer to it, maybe without the salary.
I'm working with an intern who's implementing a "proof-of-concept" for voice control using both Alexa and API.AI (Google). He's a CS freshman. It's amazing what he's gotten working so far, but his code is unreadable noise. Like he never learned what a subroutine is for.
But he's giving a demo to the CEO today, and said CEO will undoubtedly say "Nice, finish it and let's ship!" when it's probably not even usable code in any way. The hard part -- connecting thousands of users through a db and thousands of persistent cloud connections from individual IoT devices, hasn't even been sketched yet.
So he looks like the hero (with a demo that does something amazing), and I'm going to look like the can't-get-it-done idiot because no one in the organization understands the complexity of going from that proof-of-concept to a working product, or even a next-level demo that uses actual connections from actual devices.
Of course, that's when you're supposed to quit, I guess.
Of course both viewpoints are valid, I'm just really feeling the commenter's point today because of my particular circumstance.
So you could read this essay as just an expanded version of the saying.
As the article suggests, though, getting there is not easy!