Deciding What to Learn Next as a Programmer

Or not to?

The fields of web and mobile software development appear to change very quickly: new languages (and new versions of existing languages) and frameworks appear all the time, and there are countless named methodologies for how to go about software development.

New features are appearing in languages all the time, from specific updates to existing languages (for example, lambdas and first-class functions in Java 8, or generators in ECMAScript 6) to more general trends like the increasing inclusion of functional programming concepts in modern languages, particularly in new languages like Swift where there is the opportunity to start completely afresh. (Of course, it's not that functional programming ideas themselves are new!)
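
As a concrete illustration of that trend, here is a minimal sketch in Swift (a hypothetical example of my own, not taken from any language documentation) showing functions treated as first-class values and passed to higher-order operations such as map and filter:

    // A closure stored in a constant: a function treated as a first-class value.
    let double: (Int) -> Int = { $0 * 2 }

    // Higher-order functions chained in a declarative, functional style.
    let numbers = [1, 2, 3, 4, 5]
    let result = numbers
        .map(double)        // [2, 4, 6, 8, 10]
        .filter { $0 > 4 }  // [6, 8, 10]

    print(result)  // prints "[6, 8, 10]"

Much the same idea can be written with Java 8 streams and lambdas or with ES6 arrow functions, which is part of why these concepts increasingly feel like common ground across languages.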

To a curious-minded developer, languages and even frameworks are fun to learn almost for their own sake, particularly new ones. Each language has its own character and subtly leads you into doing things in particular ways. Sometimes these new approaches solve problems you've often come across whilst using other languages, so there is excitement at the prospect of benefiting from the new features or from the approach itself. There is also an emotional response to the creative aspects of a language: to things as simple as its appearance, or empathy with particular decisions its designers have made.

When a language update or new language makes a big deal about a computer science concept that you're not familiar with, it can serve as a nice introduction and an extra motivator to go and learn about it, or to refresh your knowledge about it.

As you climb up the stack, the new things become higher and higher-level and more and more numerous, so it becomes increasingly necessary to be pragmatic about what you spend your time on. The hierarchy of knowledge and skills that I need, as I see it, goes roughly as follows (from low-level to high-level):

  • Mathematics
  • Creativity
  • Computer science
  • Programming languages
  • Software design
  • Databases
  • Web and network protocols (e.g. HTTP, SSH)
  • Cloud technologies
  • Web frameworks
  • Libraries

All of the above areas of skill and knowledge are required in order to make complex web applications. As you go down the list, the items become higher and higher-level, they become easier to create from scratch, and examples of them become more and more numerous.

Things like computer science knowledge and software design are skills that a developer needs to be able to apply in all software development situations. A practical and thorough understanding of these comes from actually making applications, in diverse scenarios, so to some extent it is necessary to delve into multiple programming languages and application frameworks.

At some point, though, you have to start being pragmatic about what is worth spending time learning. Once a developer has learned several languages or frameworks, acquiring new ones becomes quite easy and inexpensive. For me, following a book and building a moderately complex personal project usually gives me ~95% of the knowledge and fluency I need in any particular language or framework in order to be really effective with it professionally. Different projects use different aspects of a language or framework to a greater or lesser extent, so as long as you have knowledge of a bit of everything, any codebase is going to be recognisable.

My personal emphasis is on spending more time actually making things. Given an initial concept, I am confident that I can create what I set out to, choosing the tool that feels best for the particular project at that particular time. However, it is still important to keep up with trends: reading blogs or Hacker News, going to technology meet-ups, and anything else that helps keep you up to date.

It can feel as though a lot of emphasis gets placed on the virtues of having experience with a given new web framework as it suddenly becomes popular. And of course, new ways of doing things are healthy: they gradually optimise how we work and keep up with the changing demands of web software as the underlying infrastructure becomes more powerful.

But I do think it's important to keep the focus on the fundamentals: the skills of choosing the right tool or language for the job, writing software that is structured nicely, and making usable applications. I think these are the real differentiators between developers; they are the most valuable things to aspire to and take the most time to learn in the first place. A word of warning: this approach does tend to make you best suited to hipster startup environments.

Therefore, for me, the natural conclusion is to spend good time at every level of the 'knowledge stack' on an ongoing basis (always staying humble and open to new ideas and knowledge), but to prioritise the fundamentals. A great way to encourage all of that is to keep making things, incorporating the new things you learn and making each codebase a bit better than the last.