GPT-4: Software Engineering's iPhone Moment
Musings on GPT-4 and how it can shape software engineering in the coming years!
Hello!
The iPhone's introduction changed software engineering: touch interfaces and constant connectivity led to important innovations, ranging from the much-derided Instagram/TikTok to the much-underrated Google Maps.
GPT similarly transforms software engineering by combining a natural language interface with code generation. It accelerates development by converting natural language instructions into code, reducing the syntax- and language-related effort required from developers. In effect, GPT is a very early prototype of a natural-language-to-machine compiler.
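To make that "compiler" idea concrete, here is a minimal sketch of how such a loop might be wired up against a chat-style LLM API. The function name, model identifier, and payload shape below are my own assumptions for illustration (modeled on the common chat-completions format), not anything from this post:

```python
# Sketch (assumptions, not the author's code): shaping a natural-language
# instruction into a code-generation request for a chat-style LLM API.

def build_codegen_request(instruction: str, language: str = "python") -> dict:
    """Return a chat-completion payload that asks the model for code only."""
    return {
        "model": "gpt-4",  # assumed model identifier
        "messages": [
            {
                "role": "system",
                "content": (
                    f"You are a compiler from English to {language}. "
                    "Reply with code only, no prose."
                ),
            },
            {"role": "user", "content": instruction},
        ],
        "temperature": 0,  # keep generations stable for code
    }

payload = build_codegen_request("Read a CSV file and print the row count")
# A real client (e.g. the `openai` package) would send this payload to a
# chat-completions endpoint and return the generated code as text.
```

The interesting part is how little of this is "programming" in the traditional sense: the system prompt does the work a language grammar used to do.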
For new or early-career developers, cheap, mostly accurate code generation has a big impact.
With GPT handling many tasks that used to be assigned to entry-level programmers, there may be fewer such positions. Young developers may need to learn new, specialized skills that complement GPT's strengths. Experienced folks who have built careers on programming-language-specific skills will become less important, not because of new programming languages but because GPT-4 will be able to translate between most of them![1]
This is not a reason to be discouraged. It is a chance for engineers to learn from GPT and explore new challenges: ranging from building guardrails for LLM behavior to LLM security. I wrote about getting started with this here on Twitter.
Further out, a software dev career might resemble an influencer's, in that you're rewarded for understanding how to make the most of your algorithm: your GPT assistant. And similarly, you have to adapt your work format when there is a new release or a major new feature, like Reels.[2][3]
Natural Language → Formal Logic
As devs, we spend a lot of time translating human thought and desire into executable logic, but the programming language (or framework) constrains how that logic can be expressed.
A fun question to ask about GPT-4's code generation capabilities is: why even bother with, say, Python or JS? Why not something else entirely?
Could a more mathematical language, similar to APL, be easier for GPT-4 to write in? Leslie Lamport was perhaps two decades ahead of his time when he designed a specification language called TLA+. Humans use natural language to describe what we want, and GPT-4 writes the code/logic. (Of course, describing what we want in enough detail is an unsolved problem.)
But taking this a bit further, why does this logic have to be human-readable, or even optimal for humans? Perhaps one scenario is that humans and machines write logic/code side by side, with GPT-4 translating seamlessly back and forth between human-readable and machine-optimal forms!
Input Beyond Text-as-Code
OpenAI’s demo showed that we can now scribble on paper or an iPad and get working code generated from it.[4] Soon, we should be able to use video, speech, music and more as inputs, and modify, translate, and generate between them!
With the introduction of GPT-4 and its ability to automate many aspects of software development, we may see a temporary drop in demand for traditional devs over the next 3-5 years, and that is okay. As machine wizards, we must embrace this new magic and invent and adapt our way out of it!
On that cheerful note, I’d love to hear what you think, and take your leave till we meet next weekend!
Till we meet again,
Natkhat Nirant
1. I’ve used GPT-4 and GPT interchangeably here, but I mean GPT-like LLMs, including adjacent adaptations like Microsoft Copilot, Toolformer, ReAct or MRKL agents, and Llama/Alpaca, under this “GPT” umbrella.
2. h/t Ravi Theja for convincing me that programming-language-specific skills are worthless.
3. h/t Yash Pandya for the influencer analogy.
4. This is something famous, popular projects like LangChain have to do! They’re still trying to make ChatGPT work for some use cases, while text-davinci-003 is what they rely on widely.
5. Thanks to Yash Pandya for reminding me of this! Livestream link of that demo!