But in some other ways, the author is too self-critical:
> I'd spent months sitting alone in libraries and cafes, blindly installing tools from the command line, debugging Linux driver problems, and banging my head over things as trivial as missing parentheses.
All these things have value, and I doubt that this was time truly wasted. Although some people can get by for a while without it, becoming intimately familiar with the command line is extremely valuable. Also, although this is by no means a requirement for being an effective programmer, most good programmers I know use a traditional text editor such as emacs or vim.
Ultimately, I think the point is that expending time is an investment, and you should prioritise based on the expected future utility from your time investment. More often than not, this means finding the most important thing and putting almost all of your energy into it.
> All these things have value, and I doubt that this was time truly wasted. Although some people can get by for a while without it, becoming intimately familiar with the command line is extremely valuable. Also, although this is by no means a requirement for being an effective programmer, most good programmers I know use a traditional text editor such as emacs or vim.
I agree. I think the prime takeaway is: do what you need to do to solve a problem. Most of the things the OP described came organically for me: I started on Windows, learned programming via Python, and used a wide variety of text editors. Eventually I grew dissatisfied with the tools Windows provided and installed Linux. I spent many hours debugging weird issues with my then laptop, until one issue (a kernel bug, I think) forced me to abandon that laptop and switch to a ThinkPad. When I started my first software job, my coworkers recommended vim to me. After using it for a while, I grew fond of its key bindings, so now I use Sublime with Vintageous, and straight-up vim when necessary, for my development.
Along the way, I've learned Unix toolsets, dealt with init systems, learned how to find and interpret logs sensibly, figured out where to get help when needed, and so forth. That experience is invaluable when I look at a problem and can quickly identify at least where it's coming from, based on what I've seen in the past.
At the end of the day it all depends on what you're interested in and what problems you have or need to solve. I'm interested in the stuff I've done, so I found it a valuable exercise (albeit with some head-banging every now and then). If someone is only interested in doing iOS or Windows development, they might find the things I've fiddled around with not useful, and I think they would be right.
Of course, from a professional perspective, you want to minimize how often that happens, but there will be plenty of blind stumbling as your work takes you into less-traveled territory.
But I'm privileged in that I was never pressured to learn; my livelihood never depended on it, so that's a factor.
Learning without casting a wide net may mean you end up with the kind of tunnel vision that used to lead poor souls to 'learn Dreamweaver', or 'learn Crystal Reports', or ... ASP.NET controls, or something.
This guy ended up with a sense of the tech landscape, and knowledge that will serve him well. In one year!
Honestly, if people copied him and just avoided editor/keyboard silliness, they'd do well.
As you learn more, you'll find things that annoy you about your editor/environment. Fix those as you see fit. Eventually you'll have an environment that fits you "like a glove".
It's absolutely detrimental to try to get a fully customized environment set up before you know what your workflow is going to be like. It took me years to get to my environment (tmux, vim+plugins, ack, a VirtualBox environment with a proxy). I experimented a lot. But getting "the perfect environment" was never a goal; it was just a lot of "this is annoying, there has to be a better way -> google -> environment change".
If only it were as simple as keeping a small toolset and a dev team committed to it. You can certainly build a business from scratch with a small toolset. But as the business adapts to customers, you inevitably find things customers want that the tools don't support. In initial development you don't necessarily care, but once you're live, it's better to implement than to say 'no' to executive management and sales in most (not all) cases.
Once you modify the tools a bit to support the unique business, you can no longer keep new developers focused on a small toolset, because you've customized. New devs may not agree and may look for ways to work around the customizations. Soon you have a unique codebase that is valuable, even central, to the business, and your changes have crept in to the point that off-the-shelf devs, tools, and upgrades no longer work.
Now you're in maintenance mode. Code schools don't teach maintenance, but it is the lifeblood of the software business. Developers avoid maintenance jobs. They try to find new, greenfield work. That's why we end up with so many disposable business models, 3 year dev-to-acquisition cycles, and ridiculous amounts of abandoned code.
Maintenance is hard. Much harder than development. But it's much more important. Anyone can launch an app that builds a business. Not everyone can adapt and grow that business with code changes that require getting out of the dev comfort zone.
Code schools don't teach this because they don't want to expose aspiring coders who just want to get rich to the grimy dirty details of a real profession. But ask yourself this- would we have cars and highways if all we trained were new car designers (not mechanics or road builders)?
How a reasonably balanced individual nearly went insane
I was just a guy in a suit in an office with a vague healthcare idea. Then I decided to learn to fix teeth.
I overheard some guy at a happy hour bragging about how easily he was able to automate his overbite by using a technique called "4 Handed Dentistry". I thought, "huh, 4 Handed Dentistry." I went home, googled it, and within 15 seconds, I was working through a random 4 Handed Dentistry tutorial.
A week later, I went to my first dentalspace meeting. Everyone was talking about techniques like orthodontics, endodontics, and maxillofacial surgery. There was so much to learn. I borrowed three O'Reilly books and got about 50 pages into each of them.
Most dentistry books start off nice and easy before making big assumptions about your prior knowledge.
A friend told me I should get good at drilling, and gave me his drill bits. I spent a few hours learning basic foot controls so I could further configure it.
Then some guy walked by and saw me drilling. "Why are you drilling?" he asked me. "Don't you know lasers are better?" "Hm. Lasers." So I started memorizing dozens of laser shortcuts.
Most arguments about restoration techniques are what dentists call "religious wars" - rooted more in historical differences than practical merit.
At the time, it seemed reasonable to think that the faster I could drill, the faster I could fix teeth. I switched to a hydraulic drillset because, hey, it was objectively the most efficient method a dentist could use.
Can you count how many letters, numbers and teeth are in their original oral positions? I'll give you a hint - it's in the low single digits.
On the days I could actually get my netbook to successfully boot DentalCAD - and that I was able to drill more than 10 teeth per minute - I studied oral surgery by working through books and Udacity courses.
After 7 months of grueling self-study and going to dental events, I landed my first Dental HMO job...
You kinda get the idea by now? I dunno, I think what we do is every bit as professional as dentistry. So why don't posts like OP's seem as absurd as mine?
[UPDATE: For those of you who have suggested that programmers don't have others' well-being in their hands, actually quite a few of us have, just not as directly as dentists. We are a profession and we do do important work. A few examples in an old post: https://news.ycombinator.com/item?id=2882620]
Because one can read universally-acknowledged figures explaining how a large number of people with a software engineering degree can't code their way out of a paper bag or pass the most basic "fizz buzz test". (1)
Because we can see non-genius 17-year-olds writing apps that are bought by top tech companies for $30MM, month in and month out. (2)
Because we call it "software engineering", but it still isn't engineering at all. (3)
Software development is still a very, very young field. The fundamentals are not properly understood yet. It will still take decades before they are, possibly over a century. We won't be able to put proper education in place before the fundamentals are well-determined.
Agriculture, livestock breeding and cloth making are many millennia old. Architecture, engineering, and the law are 2-3 thousand years old. Book printing and complex music are about 1,000 years old. Dentistry is centuries old. Cinema is over a century old. We know a lot about how to do those properly, and schools are pretty good at teaching the important parts. Software development is less than 50 years old, and schools are still dismal at figuring out the important parts (practitioners are only so-so most of the time too). That makes it different.
It would be hard to get more misguided advice than what the OP received (pro tip: don't learn vim, Emacs, configure Linux or switch to Dvorak before you can write functional, working code). That doesn't mean teaching yourself is a bad way to learn.
(1) Why can't programmers program, by Jeff Atwood (http://blog.codinghorror.com/why-cant-programmers-program/)
(2) Summly
(3) Just a random sample, but very representative: about 20 years ago I developed a software package for structural calculations of buildings, helping an architect with the software part. There are manuals enumerating the exact steps to follow and the extra safety margins to add: assume 50% extra weight, 40% wind-load force (with higher percentages for taller structures), twice the amount of iron in the concrete when certain conditions are met, etc. Those manuals are the law. If you are an architect or an engineer and you follow those rules, two things happen: (1) you are not legally liable, and (2) the building doesn't fall down! Software projects fall down all the time (read: Obamacare). That is engineering; software projects with today's tools and techniques are not. This will happen some day in software. We are not there yet, by far.
Sure we are, at least pretty close.
Commercial avionics software developed to DO-178B standards calls for reams of requirements, verification tests, compliance to process, internal quality reviews, external audits, and sign-off by FAA representatives.
A one-line code change can take days to implement, and might not be released to "users" for months or years.
But the software is extremely robust.
If we wanted to engage in the same level of software engineering for all software, we could. But we don't want to. Developers don't want to, and users don't demand it. If an iPhone game crashes, who cares? If a productivity application crashes, you might have lost an hour's work, but it's probably not so annoying as to warrant a couple orders of magnitude more cost associated with the software.
But if a software failure could kill people, well, that's different. It's worth spending a huge amount of time to make it perfect.
Avionics software can be so thoroughly tested because it is thoroughly designed up front. You know exactly what it's supposed to do. Much less-critical software is designed in a more ad hoc fashion; or there might not even be a design at all! How much software has been organically grown, starting with an idea and hacking on it until it seemed to work?
If you want to thoroughly test that, you have to go back and thoroughly state what it's supposed to do.
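To make the point concrete, here is a minimal Python sketch (the function and its rules are entirely hypothetical) of what "stating what it's supposed to do" looks like in practice: the assertions are the design that organically grown code never wrote down.

```python
# A hypothetical example: turning an informal idea, "normalize a
# username", into an explicit, testable statement of what the code
# is supposed to do.

def normalize_username(raw):
    """Spec: strip surrounding whitespace, lowercase, reject empty results."""
    name = raw.strip().lower()
    if not name:
        raise ValueError("username must be non-empty")
    return name

# The assertions ARE the stated design: each one pins down a behavior
# that previously lived only in the author's head.
assert normalize_username("  Alice ") == "alice"
assert normalize_username("BOB") == "bob"
try:
    normalize_username("   ")
except ValueError:
    pass  # whitespace-only input is explicitly an error, not a silent ""
```

Writing even this much down is the step most organically grown software skips, and it is exactly the step that makes thorough testing possible.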
It's quite possible, but by and large it's not desired enough to make it worth actually doing. I'm not sure how this could change, or even if it should change. Instant bug fixes on web applications are cool, even though they come with the risk of having broken something else...
This statement is based on exactly the same fallacy as the featured blog post: ignorance. You are ignoring all the tools and techniques available for software engineering today: garbage collection, lexical closures, Hindley–Milner type systems, purely functional programming, MISRA-C, unit testing, code coverage tools, bug tracking systems, distributed version control systems and the code review practices around them, etc.
People developing widely used tools are still making idiotic mistakes that had widely known solutions 50 years ago: https://twitter.com/vsedach/status/527904732145537025
Engineering is about learning from previous designs. When you shrug your shoulders and say "software engineering isn't engineering yet" and "the field is too young" you are just discouraging people from learning from past mistakes.
Just because some idiot can pick up an oxy-acetylene torch and cut and weld some metal together doesn't make them an aerospace engineer. What's the difference with PHP developers?
Stop making abstract excuses and start treating software and software practices as tools and techniques. Tools and techniques have a history, a learning curve, and areas for improvement.
People have changed less (ed: more slowly) than you might think. Consider: people were applying makeup 6,000 years ago, and there is some evidence the practice is ~100,000 years old.
Technical people are, necessarily, very adamant about the technologies they use. When you're first starting out, you just want the "best."
One of the most common misunderstandings among non-technical people is what a programming "language" is. They don't realize that almost all programming languages are made up of very similar constructs. For example, knowing what I can do with a string in JavaScript and transferring that knowledge over to Ruby is a thirty-second, syntactical exercise. Non-technical people might imagine that you have to learn everything over from scratch. That's probably a result of etymology: when you hear the word "language" you think Russian vs. Spanish, with completely different alphabets, concepts, grammar structures, etc. It would make more sense for those who can't program to think of it in terms of "dialects" or something like that, instead of "languages."
It doesn't actually matter what language you learn first, even if it's (god forbid) PHP. But we don't tell that to would-be programmers enough.
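As a rough illustration of how little actually changes between languages, here are a few everyday string operations in Python; the comments note the equivalent JavaScript and Ruby spellings of the same ideas.

```python
# The same string concepts appear in almost every mainstream language;
# only the spelling changes. Python shown below; near-identical
# one-liners exist in JavaScript (s.toUpperCase(), s.split(","),
# s.includes("world")) and Ruby (s.upcase, s.split(","),
# s.include?("world")).

s = "hello,world"

assert s.upper() == "HELLO,WORLD"                        # uppercase
assert s.split(",") == ["hello", "world"]                # split on a delimiter
assert "world" in s                                      # substring test
assert s.replace("hello", "goodbye") == "goodbye,world"  # substitution
```

Learning a second language is mostly a matter of relabeling concepts like these, not relearning them.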
I probably wasted a month of my time because I started using zsh and oh-my-zsh on the recommendation of some guy I talked to at a meetup. He loved a tiny aspect of the flexibility of the prompt's highlighting. I barely knew how to use the command line, so heaven knows I didn't understand what happened to my $PATH when I dropped whatever the GitHub repo told me to into ~/.bashrc instead of ~/.zshrc.
The time I really started to learn was when my company got into an accelerator, and all of a sudden I was the de facto front-end guy. The only CSS I had ever written was tweaking colors on my blog, and I really had no idea what I was doing. But that didn't matter: I had to build, and it was a real project, one seen by tens of thousands of people on day one. It doesn't have to be that stressful to learn, but you have to build something and solve challenges, learning along the way. There's really no other path that works.
So, my advice to budding programmers or those who may learn to code: pick a language/framework and don't move on until you are fairly adept with that stack. Your tech buddies may mock your technology choices, and someone will say you're an idiot because "this would be so much cooler in Lisp," but you don't have to be writing functional Haskell when you're learning to program. Take things from beginning to end, start to finish, and start changing technologies once you are well versed enough to understand the shortcomings of what you're currently using.
I really wish someone had sat me down and told me that when I started. I'd probably have saved six months of after-hours and early morning struggling.
OK, start with Prolog. Now move to Ruby. Then Haskell, and include some SQL in that as well, somehow.
Now write me a program in APL.
Languages within the same paradigm are mostly similar. But there are a lot of paradigms, and some concepts don't transfer well at all. (Quick, what's the equivalent of an anonymous inner class in Prolog?)
Now I spend most of my time in Clojure, Ruby and C++, but, if it wasn't for PHP, I might not have ever felt that itch that led me to pursue a career in software development.
The people elements of communication are not easy to parse correctly. You can never truly know how other people think; you can only assume based on experience collected a priori, which may be altogether constructed on a false premise that initialized the pattern of thought construction.
I wish I had known the things I know now, 15 years ago. But I don't. That's part of living. We learn and grow because of our experiences. It can't be learned in any other way.
Personally, I just ignore assholes, or if I choose to engage, I learn to play their own game. Then I typically stop judging them.
There is no such thing as "freelance amateur dentistry". You can't start tinkering with teeth and get a job the way you can tinker with programs as a hobby before getting a junior position at a programming gig. To compare the two professions and suggest the paths to engagement and skill are the same is obtuse.
This article resonated with me more than most HN pieces, because it describes my path as well. I've not had professional guidance, and with the absolute flood of options, of data, of debates and educational material, it's REALLY difficult to know where to start. I don't understand how you find it acceptable to mock someone's interest and lack of guidance.
The difference isn't just that most programmers don't directly have others' physical health in their hands, although that is true. Some programmers do, but all dentists do.
The difference is that you don't get do-overs. Have you ever written code that didn't even compile the first time? Or code that failed a test, or code that did something wrong in a test environment, or code that did something buggy in production?
If you'd made a mistake like that as a dentist, a patient might well have lost a tooth that they're never getting back. Much worse could happen: you could cause someone to lose feeling in their face, or even kill someone.
There isn't any low-stakes dentistry, but there is low-stakes programming. That makes it easier to start as an entry-level programmer, because even at a low level of skill and experience you can potentially do useful work for someone.
He shouldn't have been taking random advice from people to learn emacs/vim, or use a new keyboard, or (to a lesser extent) a new operating system. He needed a guiding instructor to tell him to learn how to code first and focus on building projects, instead of trying to turn a 3-5 year education into a 1-year accelerated course so he could be like the cool kids.
I'm self-taught as well, so I'll never say university or professional accreditation is the right answer. But that doesn't mean we should be without effective mentors.
Speaking of which, I've been meaning to look into mentoring...
On the other hand, I do suggest you look at hobbyist/quasi-professional magazines. It's... not terribly dissimilar.
Example cover: http://popularwoodworking.woodworkingplansplans.com/images/w...
Conceptually, it's the exact same approach.
As long as we don't kill the goose which laid a lot of us golden eggs.
If we destroy the ability of newbie programmers to come up outside the university-professional path, we've just irreparably damaged the whole field.
This is also why I don't like the idea of unionizing programmers: Even if we come up with a union which isn't based on the wage-and-hour, put-in-your-time model, unions are still based on seniority and coming up the "right" way as opposed to being able to strike out on your own in your own little company, without needing to pay dues, literal or metaphorical.
Most new webdevs do really basic things which are more akin to them being receptionists anyway (I've seen tickets as ridiculous as "change this while loop to a for loop", "add this css class to this button", etc.). And let's face it, most people who learn programming in 6 months are probably not doing anything more complicated than web dev, with near-zero knowledge of devops (disclaimer: I too know very little of devops)
What is interesting to me is that computers have gone from giant, obscure machines in rooms, leased by large corporations, to being all over your house. There is a gradation of expertise and capability, as there is with cars or other complex systems: from 'tinkerer' (usually a hobbyist) to 'mechanic' (who earns money adjusting and fixing) to 'engineer' (who earns money designing from scratch). They also come with different financial liabilities.
And that last bit is something that computers have largely avoided by consistently disclaiming all warranties. When that changes, and programmers (or their employers) are held liable for the incidental or consequential damage caused by their bugs, you will see a much stricter code for hiring and employing people who write code that runs on other people's computers.
Your example from hobbyist to professional has real physical constraints that are huge to overcome for an autodidact like "where do I get a constant influx of patients to practice on" and that's solved with university. Building software only has "own a computer and have internet access".
As a self-taught programmer, my reasons for not attending university are purely financial. University is basically a subtle way to gatekeep lower-class "peasants" from following their own "brain craves" for knowledge and ascending the societal ladder, under the guise of a higher moral cause, specifically the appeal to professional worth. Tough luck for you and for anyone else from your social stratum: that's going away sooner than you think, regardless of how loudly you bark.
But what is sadder is the fact that there are, at this time, thousands of people trying to disrupt education, and yet the highest-rated comment in a place that should be focused on disrupting things actually tries to grip even tighter. That's what's sad.
When dentistry started out, there likely were more than a few dentists who learned this way. There are still, on occasion, unlicensed dentists found practicing dentistry with decently successful businesses.
Software development doesn't have the Software Development Association protecting the practice like dentistry and medicine do, for their respective associations.
You need a lot of clout to draw that line, you are going to upset a lot of people on the bottom half of the stratification, and you really need to work hard to convince a lot of programmers that creating a professional organization is a good idea, for some unknown reason.
Yes, there is a professional pathway to follow to be allowed to meddle with someone's teeth.
Once you've got that far, however, you're free to carry on much as you want.
Dentistry has its own religious wars, the acronym TMJ springs to mind.
I think the Fine Article does seem absurd. And while he doesn't come right out and say it, I think he would agree that in hindsight his approach was absurd. The whole second half of the post is about how actual programmers get actual work done, and it's nothing like the first half.
The interesting question is why he thought his original approach was reasonable. Did he not talk to an actual software engineer before embarking on his journey? Sure, something like his original path can be successful for some people, but is that a reasonable expectation?
2. Because you can probably learn enough software development to do something useful in several months, maybe even enough to succeed in an entry level job. I think learning enough dentistry to actually practice it will take longer.
3. Because programming is much more lax about credentials. Sure, many employers require a CS degree for anyone they hire, but many do not.
Except that you're not dealing directly with a human being's health and anatomy? You must be trolling right?
As such, it's a really useful preparation for learning other languages.
If you have the basic concepts from Python, then when you learn Ruby, you just learn some syntax and the special things that make Ruby distinctive (metaprogramming, monkeypatching, ubiquitous gemmery, Rails). When you learn JavaScript, you just learn some syntax and the special things that make JavaScript distinctive (callbacks, artisanal object systems, the interplay of JS and the DOM). You never have to learn to do without some special thing you've learned to depend on, because Python has no special things.
Well, that and it's incredibly easy. And it has a really strong batteries-included standard library. And it has a really friendly, helpful community. But mostly the plainness.
If you're coming to Javascript from pretty much any other language, prototypal inheritance is probably going to be the biggest difference in paradigm to wrap your head around.
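For readers coming from a class-based language, here is a rough Python sketch of the prototypal idea: attribute lookup delegates from object to object along a chain, and mutating a prototype is immediately visible through every object that delegates to it. The class and attribute names are purely illustrative.

```python
# A rough Python emulation of JavaScript-style prototypal inheritance:
# objects delegate missing lookups to another *object* (the prototype),
# not to a class. Names here are illustrative, not any real API.

class ProtoObject:
    def __init__(self, proto=None, **attrs):
        self.proto = proto        # the object we delegate to, if any
        self.attrs = dict(attrs)  # this object's "own" properties

    def get(self, name):
        # Walk the prototype chain, much as JS property lookup does.
        if name in self.attrs:
            return self.attrs[name]
        if self.proto is not None:
            return self.proto.get(name)
        raise AttributeError(name)

animal = ProtoObject(noise="...")
dog = ProtoObject(proto=animal, noise="woof")
puppy = ProtoObject(proto=dog)        # no own attributes at all

assert puppy.get("noise") == "woof"   # found one link up the chain
dog.attrs["noise"] = "WOOF"           # mutate the prototype at runtime...
assert puppy.get("noise") == "WOOF"   # ...and every delegator sees it
```

The last two lines are the part that tends to surprise class-trained programmers: there is no class to instantiate, just live objects borrowing behavior from other live objects.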
One thing I wish I knew starting out was how to create a basic VM. I shudder to think of all the time wasted thinking I was a sysadmin genius for dual booting Linux and later doing something stupid to my hard drive. VMs give me my Linux environment without the pain from those moments when I think to myself "maybe I want to be a kernel developer."
Ding ding ding.
You could replace "startups" with "companies", though, because there are loads of companies of different ages that would hire someone with only 7 months of self-study of programming.
Because the thing is that the industry is starving for software developers. Those who are not employed either live in rural areas without jobs in general, aren't trying, or are just unbelievably bad at job interviews or at work overall.
I too bounced between vim and Emacs until recently, when I decided to stick with Emacs. I too switched to Dvorak, later switching to Colemak. I used typing-tutor software with which I was actually able to get up to a respectable 50-60 WPM, if I remember correctly. However, I eventually switched back to QWERTY after growing tired of keyboard shortcuts never working the way they were supposed to. Sure, I could rebind them in my favorite editor (and even that was a pain in the ass), but each time I installed something new I'd have to do it again. One thing I am grateful to Colemak for is rebinding Caps Lock to Control: what a great idea.
I've also been dabbling in C, Lisp, C++, Python, Bash, and a few others, but I never became really good at any of them. It was more than just the OP's 50-page dabble (something like 400 pages into C++ Primer), but I feel like I can relate. Sure, I can write basic programs in all of them, but I didn't have a "default", so to speak, that I'd mastered. Only recently did I decide that Python would be that default. The reasoning is simple: I already kinda knew it and it fit my use case well. I just wanted to do stuff with the language, and Python makes that easy with its wealth of libraries and (in my opinion) intuitive structure. Stuff like Lisp and Haskell still has a place in my heart because of how elegant it is, but I just feel more productive in Python.
That's my sort of ongoing story of getting at what I feel is the same kind of focus the OP was talking about. Now if only I could settle on a Linux distro instead of hopping around every few months (currently messing around in Slackware, though I suspect I'd be better off switching back to Ubuntu, which I was using before).
But if you're a hunt-and-peck QWERTY typist who decides to pick up touch typing, choosing an ergonomic layout like Dvorak or Colemak makes sense. You're essentially starting from zero anyway, so you're not wasting much more time on the new layout than you would on QWERTY.
This way, you retain your two-finger QWERTY skills while learning a layout that minimizes the odds of developing RSI.
Being someone who did that, yes I would say so and recommend against it. However one thing I can say about it is that it (Colemak to be specific) felt legitimately more comfortable, having all the most frequently pressed keys on the home row. If you're suffering from RSI it might help.
I had the same issue with emacs/vim, and while I use vim now, I don't consider myself a vim power user.
Lastly, the OS... I started with Ubuntu, wasted a week trying to get CentOS minimal up and running (I couldn't), then ran with Kubuntu and have used it for about a year. Fedora is growing on me right now, since I just spent the past week downloading/destroying/upgrading VMs, but I think that was more about trying to find the right flavor of Linux for what I need than anything else (and with the Faience theme, GNOME 3 is pretty good, although slow in a VirtualBox VM, since you can't have more than 256 MB of VRAM).
Great article.
I personally think every programmer should know some C. Basic things like what the stack and the heap are.
Edit: Ok, probably not in the case of absolute beginners, but after a year or two...
Even things like memory alignment and cache lines can bite you really badly if you don't know about them and order your loop the wrong way around :)
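The loop-order point can be sketched as an experiment. In C the row-major vs. column-major gap can be several-fold; in Python, interpreter overhead hides most of the cache effect, so treat this as the shape of the experiment rather than a faithful benchmark.

```python
# The classic loop-order experiment, sketched in Python. Row-major
# traversal walks each row's contiguous list front to back; column-major
# traversal jumps between rows on every access. In C the difference is
# dramatic; in Python it is largely masked by interpreter overhead.
import time

N = 500
grid = [[1] * N for _ in range(N)]  # each row is one contiguous list

def sum_row_major(g):
    total = 0
    for row in g:            # finish one row before moving to the next
        for x in row:
            total += x
    return total

def sum_col_major(g):
    total = 0
    for j in range(N):       # hop to a different row on every access
        for i in range(N):
            total += g[i][j]
    return total

for f in (sum_row_major, sum_col_major):
    t0 = time.perf_counter()
    assert f(grid) == N * N  # both orders compute the same answer
    print(f.__name__, round(time.perf_counter() - t0, 4), "s")
```

The point is that two loops with identical results can have very different access patterns; in a language close to the hardware, that pattern is what the cache sees.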
Date and time handling, Unicode, and data structures are also common areas of misconception and sources of error. Of course nobody needs to know everything, but a lively thirst for knowledge always helps, especially once you get over the initial confusion of learning the basics. There's just so much interesting stuff out there, and a lot of it will help you improve even if you don't end up using it right away, or at all.
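Two of those misconceptions are cheap to demonstrate in Python: characters are not bytes, and naive datetimes silently carry no timezone at all.

```python
# Two classic misconceptions, in runnable form.
from datetime import datetime, timezone

# 1. "One character == one byte" is false outside ASCII.
s = "naïve"
assert len(s) == 5                  # five characters...
assert len(s.encode("utf-8")) == 6  # ...but six UTF-8 bytes ("ï" takes two)

# 2. Naive datetimes silently drop timezone information.
naive = datetime(2015, 1, 1, 12, 0)
aware = datetime(2015, 1, 1, 12, 0, tzinfo=timezone.utc)
assert naive.tzinfo is None              # no zone attached at all
assert aware.utcoffset() is not None     # this one knows where it stands
```

Both bugs tend to surface far from where they were introduced, which is exactly why these areas reward a little up-front study.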
Even if you're just swimming on the surface, it's always a good thing to know at least the 1-3 meters beneath you, just in case something happens or you get stuck in some seaweed and struggle to get out on your own.
I've actually had quite good experiences teaching people a few things about ASTs right when they start writing code. Even though I only gave them some very basic lessons about how the "text" is eventually transformed, it really helped them a lot in understanding why certain text does certain things.
In the end, it's always very hard to play the game successfully if you don't know the rules by which you have to play. And a little can go a long way.
I've got 4 years of professional software experience and the only C I've written is a couple hours on an Arduino device. It's just not that needed or relevant to my work as a web developer. I don't think the article is too focused on web dev, that is where the self-taught developers can most easily insert themselves.
FWIW, dynamic scripting language is redundant.
Reading this was kind of the reminder I needed. Focus. Pick one, become proficient (with the language and just the concepts of programming), and I guess somewhere down the road if I feel compelled try another.
So for now, I'll focus.
Or, converge onto interesting problems and fail gloriously; the only way to do anything of worth.
The post comparing this to dentistry is precisely what I was thinking, except nowhere near as hilarious. I wish I was intelligent enough to write a Markov algorithm for logic that paralleled reasoning in such a way automatically. Thank you for that.
Stop trying to cram n years of experience into a blog post seeking some form of social validation. You are guaranteed to contradict yourself more than half the time. That's because you can't include all the important bits that helped you learn. Every stupid mistake is really your future best friend.
But maybe I am just trying to make excuses for myself, maybe if I learn a new language/framework/IDE every month I will get better at learning new languages/frameworks/IDEs.
Yet this illustrates a very important thing that most proponents of "learning to code" neglect entirely.
Writing code isn't an isolated black box. To write code is to interact with the extremely entangled software environments that we have cumulatively been building up for over 60 years now.
Anything beyond Fibonacci sequences will require you to spill out into lots and lots of domain-specific areas and subdisciplines. A proficient programmer in general will also need at least basic skills in system administration.
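For scale, the "Fibonacci baseline" really is only a few lines; everything past it is where the spill into other domains begins. A minimal sketch:

```python
def fib(n):
    """Return the n-th Fibonacci number, with fib(0) == 0 and fib(1) == 1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print([fib(i) for i in range(8)])  # → [0, 1, 1, 2, 3, 5, 8, 13]
```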
To build a real useful application intersects with areas such as network protocols (which is immeasurably vast, depending on what layer you pick and how much you abstract), widget toolkits (and the wider quandaries of computer graphics, windowing, displays, etc.), cryptography (which pretty much intersects with most bodies of computer science), the workings of the kernel, dynamic linker (thus object files and libraries), the C library...
Profiling an application will likely require you to learn some complexity theory. I/O-bound applications will require you to learn how file systems, I/O schedulers and disks work. System programming is a rabbit hole of its own, with POSIX alone just about being an independent branch.
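As a concrete first step (the function name here is just illustrative), Python's built-in cProfile makes collecting profile data cheap; it's interpreting the numbers, and knowing whether you're CPU-bound or I/O-bound, that pulls in the theory:

```python
import cProfile
import io
import pstats

def slow_sum(n):
    # Deliberately naive loop to give the profiler something to measure.
    total = 0
    for i in range(n):
        total += i
    return total

profiler = cProfile.Profile()
profiler.enable()
result = slow_sum(100_000)
profiler.disable()

# Print the most expensive calls, sorted by cumulative time.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```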
The bottom line is that programming can mean just about anything. And since programming itself is not innate, but a member of its own in the tangled web of computing, programming is thus a vast body of theory itself, a lot of which one will need to learn, unless they intend on staying frozen.
Paradoxically, the more we try to make things so that people won't have to be specialized in order to use them, the more we necessitate huge drifts of specialization in the people that want to do more than cursory tasks.
While picking a few components and sticking with them is probably necessary at least in professional environments to maintain sanity and interoperability, it is not very realistic for hobbyists and learners.
Of course, if you just want to write macros to automate tedious tasks, you can get by with the basics. But that is pretty much something that will end up being born from necessity, and likely self-discovered rather than taught through compulsory means.
I had come up with an idea for a niche product - I reckoned at the time that something like this would sell. All I needed to do was to code it up. At the time I had been reading about Python and had tentatively prodded and poked at it, as I wanted to find out why a lot of my acquaintances were always scoffing at this language... "I friggin' hate whitespace!" seemed to be the number one reason for not bothering with it, which was an instant reason for ME to get to know it, as I'm the kind of guy who likes to see what all the fuss is about.
So I had an idea, and I had chosen the language to learn and implement the idea in. That was the easy part. The hard part was to learn enough Python in order to implement it.
I basically started off with nothing and worked my way up from there by asking "How do I do X with Python?"
I knew in my head what the program should be and what it should look like - "it should be a GUI", for example. I'd already used the Gtk bindings for Python, but had gone off Gtk (as I've written before in a post on HN). So I decided to give Qt a try. The answer there was to use either PyQt or PySide. I decided on PySide, as I couldn't afford the PyQt commercial license (I want to sell my application after all, and my budget is next to nothing).
After a while, I was getting used to how Qt Designer works for designing layouts.
Next question was "How do I get the GUI widgets to send signals to my Python code?" followed shortly after by "How do I see signals from dynamically created tab widgets?" - I ended up both asking AND answering my own question on StackOverflow. ( http://stackoverflow.com/questions/17344805/how-to-see-signa...)
And so as the days, weeks, and months progressed, I needed to Do Things. For me it was a matter of asking the How Do I Do That question, researching the answer, then implementing it.
Before I knew it, I was getting increasingly proficient in Python, in Qt, in parsing configuration files using ConfigParser, in loading and saving files, and in using QtWebKit in sneaky ways so I could display my company's animated logo using CSS. And so on.
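That config-parsing step, for what it's worth, is one of the gentler rabbit holes: Python's standard-library configparser handles the INI format directly. A minimal sketch (the section name, keys, and values below are made up for illustration, not from the actual product):

```python
import configparser

# Build a config in memory; in a real application this would come
# from disk via config.read("app.ini").
config = configparser.ConfigParser()
config["display"] = {"logo": "animated.css", "width": "800"}

# Raw values come back as strings; the typed getters do the conversion.
width = config.getint("display", "width")
print(config["display"]["logo"], width)  # → animated.css 800
```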
Then I needed a web site. So it took me 4 months to learn enough Django to implement a basic site. I recently launched that site and the product. ( https://xrdpconfigurator.com ) - that's all done in Python, including the application, which was converted to C via Cython and then compiled to object code.
[ And now I've learned another important lesson after all that time and effort - no one seems interested in my product - perhaps I should have open sourced it and asked for donations instead ;) Or perhaps I just need to patiently market the thing better. Or perhaps it is just TOO niche! ]
Once you go from "all debian and ubuntu users" subset "those that use xrdp" subset "those that need to customize their configuration" subset "those that are willing to pay money", you probably don't have a lot of people left ;-)
http://www.amazon.com/gp/product/B004J4XGN6?btkr=1 - The Lean Startup is received wisdom on MVP. On the positive side, you've built a product and shipped. Now you can do it again, but with a product people actually will pay money for.
And yes, I do have one or two other ideas that aren't as niche and more to do with real-world applications needed by many more people - something which I was intending to move on to after I'd completed this one.
Thanks for the link to that book. :)