How developer copilots, no-code, and app-generating LLMs might impact product development
Throughout the history of computer science, software development has become increasingly accessible. It is much easier to make complex software today than in 1980.
Abstraction is the primary mechanism by which software development has become easier. Abstraction is the process of simplifying complex systems by layering simpler systems on top. When you abstract away complexity, you make it easier to talk to your computer’s hardware, but you reduce the specificity of what you can tell it.
For example, you can be extremely specific if you write instructions for your computer hardware using binary code. Binary code is a series of 1s and 0s directly representing instructions for the computer’s processor. These instructions control the flow of electricity through the processor’s circuits1. So, you have a great deal of control over what the computer will do, but very little chance of understanding the binary code yourself. It is practically impossible to write complex software this way because no human can conceive of how the flow of electricity through a processor amounts to the rules and logic of a web application. So, on top of binary, computer science has built layer after layer of abstraction.
In a scripting language like JavaScript, which lives many layers of abstraction above binary code, you have a plethora of tools that make writing software very easy. You don’t need to tell the computer how to do arithmetic because JavaScript has arithmetic built in, among many other conveniences. The benefit of abstraction is that writing software is easier and quicker. The downside is that, with each abstraction layer, you lose some control over precisely what the computer is doing. Generally, this loss is unimportant, though it does have costs. For example, it is easier to optimise software for high performance and low power consumption with lower-level languages (assuming you know what you are doing).
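To make the contrast concrete, here is a small illustrative sketch (my own, not from any particular engine or processor): at the JavaScript layer, arithmetic is a single expression, while the comments gesture at the kind of register-level steps that happen many layers below.

```javascript
// At a high layer of abstraction, arithmetic is one expression;
// the JavaScript engine, the operating system, and the CPU handle
// everything beneath it.
const total = 19 * 3 + 4;

// Several conceptual layers down, a processor carries out something
// closer to these register-level steps (illustrative pseudo-assembly):
//   LOAD r1, 19
//   LOAD r2, 3
//   MUL  r1, r2   ; r1 = 57
//   ADD  r1, 4    ; r1 = 61

console.log(total); // prints 61
```

The point is not that one layer is better, but that each layer trades control for convenience: the JavaScript programmer never sees the registers at all.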
Another constant throughout the history of computer science is skepticism from engineers directed at new layers of abstraction. Suppose you are an expert with the popular programming technology of the day, and along comes a new technology, built on top, that is easier to learn and develop with, but less performant and more bug-ridden. There is a good chance you will be skeptical of this new technology, the developers who adopt it (especially those who have yet to learn how to write the more complex code you’re proficient in), and the slow and buggy software built with it.
When JavaScript first gained traction as a language for developing software applications, it was met with significant skepticism from established programmers. JavaScript was never designed for this purpose, had legitimate, obvious, and serious flaws, and wasn’t even that user-friendly2.
But it was also familiar to many people with little programming experience. Before JavaScript became a language for software creation, it was used by web designers and developers to lightly enhance websites. When it suddenly became possible to build complex software with JavaScript, many web designers and developers became software engineers. Like programming languages before it, JavaScript went from unserious and problematic to ubiquitous. Much more software exists today thanks to the ratification of JavaScript as a software development tool.
The thing about abstraction is, when you build software at one layer, the lower layers can improve without much (if any) need for you to change your software. So, the popularity of JavaScript led to improvements to the layers beneath it, which ironed out the creases. Additionally, things got better at the lowest layer: computer processors have become dramatically more efficient (in fact, some modern CPUs even include instructions designed specifically to speed up JavaScript).
When a new layer of software development tooling is ratified, what happens to the developers proficient in lower-level languages? Many eventually adopt the new layer, while others continue to use their low-level languages for the purposes that remain viable. Not everything can or should be built using JavaScript or PHP, so low-level programmers who used to produce consumer software may instead work on other types of software like compilers or more performance-sensitive applications. Even when a low-level language dies in terms of new software projects, many engineers lucratively maintain legacy software that still depends on it.
I tell this story because the very same thing is happening today. First, with no-code and low-code tools, and now much more significantly with GitHub Copilot, ChatGPT, and the plethora of other LLM-backed development tools now emerging. The software development world is having the greatest existential crisis in its history. Will everyone be able to build software soon? Will software developers still have jobs?
Nobody knows how this technology will impact the industry or labour market. But, if history is anything to go by, I think software developers will survive the AI revolution, at least for the foreseeable future.
Many more people will be able to build high-level software in the very near future. At first, this software will suck, so many engineers will resist this new abstraction layer. But, soon, this software will get good, and many engineers will move to this new way of working. They’ll be way better at it than laypeople, so they might become 10x engineers. As with any technology, there is no fixed quota of software that must be created before we’re finished and can move on to something else. There will always be economic processes to digitise. The demand for software development is currently dramatically greater than the supply of software developers3.
Some engineers will stick with, or even move to, lower-level programming languages, and we’ll be grateful because we will need their help to optimise the plumbing of these new abstractions. In fact, LLMs will likely lead to more rapid improvements to low-level technology. Just because new layers exist does not mean development on old layers halts. Every layer needs maintenance and improvement, and LLMs will empower more engineers to dive deep to make these improvements. This technology will make every layer of the stack more accessible, which will ultimately lead to major changes at every layer.
I expect that a dramatic improvement in the accessibility of software creation will lead to a considerable increase in the amount of bespoke software in the technium4. Today, most businesses run on standardised software built by centralised product companies (i.e., B2B SaaS companies). Even the most consolidated markets are highly fragmented, though (look at how many competitors Shopify has). This is because there is no one-size-fits-all way to digitise an entire industry. Soon, it will be much more affordable for a business to digitise on its own terms by building a considerable portion of its software stack. This bespoke approach to digitisation, which is a significant competitive advantage for large businesses like Amazon, will once again become a potential competitive advantage for mid-sized companies. Within enterprises, functions typically dependent on engineering teams for technological innovation will be liberated by the ability to build their own software: operations teams, customer service teams, sales teams, and more. SaaS companies might be disrupted more by their customers than by new SaaS companies.
Eventually, AI will be so good it can own the entirety of our digital world. I think it is safe to say that software engineers will no longer exist as a profession when this time comes. Creating software by hand will be more irrelevant than making horseshoes by hand is today. When this time comes, we software engineers will have much bigger things to worry about — namely, the end of the world as we know it, for better or worse.
Footnotes
3. Will software engineers make less money? On average, of course. But many of today’s engineers will still have superpowers compared to laypeople adopting these new layers, so their skills will remain more valuable than the mean. ↩︎
4. The technium is the human-made system of all technologies working together, as defined by Kevin Kelly. ↩︎