Modern Web Development: The Agony of Choice
Some thoughts on today's seemingly endless flood of web development tools and frameworks
Ye Olden Days...
Except for a short period of coding in QBASIC/QuickBasic around 1997, and some toying around with very early web technologies at school around the same time, I started my programming career as a hobby and freelance web designer and developer in the year 2001. Back then, web development was easy and fun. I think it was one of the most accessible and motivating ways to enter the world of software development. On the client side, there was (X)HTML, CSS and optionally a bit of JavaScript. On the server, PHP and MySQL were all the rage. You could gradually step up from writing plain static HTML and CSS documents to building dynamic web sites by adding pieces of code on the client (JavaScript) or on the server side (PHP). Especially exciting for me was the close integration of visual design and programming, and the fact that your work could be viewed by everybody around the world. Back then, most experienced web workers were what would be called "full-stack developers" today.
It was the time when Mozilla Firefox was brand new, Internet Explorer 6 was the de-facto standard browser, and some people even still used Netscape Navigator 4.9. Google Chrome wouldn't exist for many years to come, and neither would mobile web devices, except for a few early, rather exotic experiments. It was also the time when the web was slowly but surely maturing, at least as seen from the perspective of these days. People began to care more and more about standards-compliance and accessibility. <table>-layouts were going out of style in favor of purely CSS-driven design, and "barrier-free" was a huge buzzword and quality criterion by which avant-garde designers and developers compared their works. Back then, CSS Zen Garden went online to showcase what could be done with modern web standards. Ironically, JavaScript - today the big driver of the web - was widely considered "evil" due to inconsistent browser support and (perceived?) accessibility and security issues. Everybody who considered her- or himself an expert would build web pages that still worked with JS disabled, if they contained any JS at all.
When I went to university to study computer science in 2005, I considered myself an experienced web developer. I had written thousands of postings on a popular web design and programming forum. I had developed my own PHP content management system and "application framework" with an architecture that I'm still proud of and that I'd design quite similarly even today. And I had used it to build a series of very different web sites that proved its practicability and great flexibility.
But then, lack of time due to my studies forced me to stop all this for many years. Also, university introduced me to new programming languages and fields of application, so when I had time to do some hobby development, I spent it on things other than the web. It was only during the last three to four years that I began to slowly bring my web development skills up to date again, mostly due to my company job that involves - among other things - the building of web-based geographic information systems. And oh boy, how everything has changed!
...and what it's like today
Back in 2005, you could count the number of technologies a typical web developer had to master on one hand: HTML, CSS, PHP, MySQL, JavaScript (if at all...). What a contrast to the year 2017, where new and experienced developers alike see themselves confronted with a seemingly endless flood of tools: For general-purpose client-side development alone, there are now Grunt, Gulp, npm, Bower, Browserify, Babel, Angular, React, TypeScript, CoffeeScript, Sass, Less, Bootstrap, jQuery (not even cool anymore), Lodash, Vue, Ember, Webpack, Require.js, ECMAScript 5/6/7, and countless others I didn't mention or haven't ever heard of. There are build tools and package managers, preprocessors and transpilers, frameworks, polyfills, and more.
I don't know about you, but to me, this is intimidating. It is strange to see how the field that once got me started with programming and set the foundation of my career became so complicated that it now feels almost "alien" to me. While I could still build any sort of web application today (after all, I'm doing it every day), I'm doing it with a constant feeling of "perhaps not doing it right", of perhaps missing out on essential technologies that make something faster/better/easier - you get the idea.
It is important to realize that programming itself has remained essentially the same. Most of the coding concepts and patterns that we use today already existed 15 years ago, and longer. The really new thing is the tools - more precisely, the sheer amount of tools - and how they try to force you to spend a lot of time on them alone and do things "their" way. I am a programmer. I love coding in the sense of being creative, solving hard problems and building cool things. I am not a "dev-ops" person. I hate strictly predefined ways of doing things and having to learn and set up tons of boilerplate stuff just to get started with a project. I hate things that require more discipline than mind, and cause distractions that push me out of "flow" state or prevent me from getting into it.
That said, I don't doubt that all these tools do have their purpose. I believe that they can indeed help you save a lot of time. But the question is when, under which circumstances? Am I using something because it really helps me, or am I using it only because I (or my boss or customer) believe that it helps? As you might already know from other articles of mine, I'm a strong supporter of small, elegant solutions. I am convinced that keeping things as simple as possible is the only way to keep control over a project. This means that in each new scenario, you have to decide anew which additional dependencies to pull in or not, and not just "do it because everybody does it this way" or because it is considered "modern". Each time, you need to find that sweet spot where the benefits of using some tool really begin to outweigh the costs, and find out which of multiple alternatives solves your problem best. In the dizzying zoo of present-day web technologies, this is no easy task.
P.S.:
One of all those fancy new toys that really gave me a tremendous productivity boost is TypeScript. Among all the programming languages that I use more or less frequently, JavaScript has always been the one that I enjoyed least. First, its paradigms are so different from the ones I'm used to from other languages that it took me a long time to understand how things work there. Second, even now that I understand it much better, I'm still frustrated by its horrible lack of support for code maintenance and refactoring.
I'm spoiled by the powerful code navigation and editing functionality that IDEs like Eclipse, NetBeans or Visual Studio provide for languages like Java, C#, C/C++, or even Python and PHP (to a lesser extent, but still much better than for JS). I'm so used to features like code auto-completion, "rename this variable in the entire project" or "show me all usages of this method in the entire project" that I'm always having a hard time with JavaScript, where such things simply don't exist due to the characteristics of the language. TypeScript solves exactly this problem, and finally enables me to reach a level of productivity in client-side web development that is similar to other programming environments. And this is a truly great thing.
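To illustrate the point, here is a minimal sketch (the names MapLayer and describeLayer are invented for this example, not taken from any real project): an explicit interface gives the compiler and the editor enough information to offer reliable auto-completion, project-wide renames and "find all usages" - and to catch typos that plain JavaScript would only reveal at runtime.

```typescript
// Hypothetical example: a typed data shape for a GIS-style layer object.
interface MapLayer {
  name: string;
  visible: boolean;
  opacity: number; // 0.0 to 1.0
}

function describeLayer(layer: MapLayer): string {
  // Accessing a misspelled property like layer.opcity here would be a
  // compile-time error in TypeScript; plain JavaScript would silently
  // yield undefined and fail much later.
  return `${layer.name} (${layer.visible ? "visible" : "hidden"}, opacity ${layer.opacity})`;
}

const roads: MapLayer = { name: "Roads", visible: true, opacity: 0.8 };
console.log(describeLayer(roads));
```

Because every usage of MapLayer is known to the compiler, renaming a field such as opacity updates every reference in the project at once - exactly the kind of refactoring support described above.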