From the author:
>ai-generated code is throw-away code
Mate, most code I've ever written across my career has been throwaway code. The only exception is some embedded code that's most likely still out on the streets to this day. But most of my desktop and web code has been thrown away by now by my previous employers, or replaced by someone else's throwaway code. There's no shame in that and no pride in it either; I'm just paid to "put the fries in the bag", that's it.
Most of us aren't building DOOM, the Voyager probe or the Golden Gate Bridge here, epic feats of art and engineering designed to last 30-100+ years. We're just plumbers hacking something together quickly to hold things over until the music stops in the game of musical chairs, and I have no issue offloading that to a clanker if I can, so I can focus on the things I enjoy doing. Do you think I grew up dreaming about writing GitHub Actions YAML files for a living?
Oh, and BTW, code being throwaway is the main reason demand and pay for web software engineers have been so high. In industries where code is one-and-done, pay tends to scale down accordingly, since a customer is more than happy to keep using your C app on a Windows XP machine down in the warehouse instead of paying you to rewrite it every year in a fancier framework in the cloud.
It's kind of amazing that the most mainstream jobs create and pitch throwaway code, while a few key niche jobs, with little demand, can create truly enduring products.
Kind of like designing a better social media interface probably pays 100x what a toilet designer would be paid, but a better toilet would benefit the world 1000x.
The difference between economic value and social value.
Which is why I dislike GDP being thrown around in discussions as the ultimate dick-measuring metric. High economic value activities don't necessarily translate into, or trickle down to, high social value.
For example, I went to visit SF as a young lad expecting to be blown away given the immense wealth that area generates, but I was severely disappointed with what I saw on the street. I used to think my home area of Eastern Europe was kind of a shithole, but SF beats it hands down. There are dozens of places on this planet that are way nicer to live in than SF despite being far poorer by comparison.
The RAG / LLM-pipeline industry just continues in the same fashion, throwing on even more glue: insanely slow and expensive, but it works because companies somehow have money to waste, perpetually. Not that much different from the whole Apache stack or similar gluey, expensive, and slow software.
There is similar mindless glue in all tech stacks. LLMs are trained on it, and successfully do more of it.
Even AI companies wastefully run massive experiments with suboptimal data and compute bandwidth.
Yeah, this is what kills me. Most of the problems we solve are pretty simple. We just made the stacks really painful, and now LLMs look sensible because they are trained to reproduce that same old crap mindlessly.
What the hell are we really doing?
What looked sensible to me was designing a table, form and report in Microsoft Access in 30 minutes, without requiring 5 engineers, 50k lines of React, and fucking around with Kubernetes and microservices to get there.
LLMs just paste over the pile of shit we build on.
cold take speculation: the architecture astronautics of the Java era probably destroyed a lot of the desire for better abstractions, for thinking over copy-pasting, and for minimalism and open standards
hot take speculation: we base a lot of our work on open source software and libraries, but a lot of that software is cheaply made, or made for the needs of a company that happens to open-source it. the gravitational pull of these low-quality "standardized" open source foundations is preventing further progress.
I feel like a lot of code is pretty sticky, actually. I spend two weeks working on a feature, and most likely that code will live for a period measured in years. Even the deprecation period for a piece of software can be measured in years.
Lately I have had the cursed vision as I'm building a new IoT product. I have to learn _so_ much, so I have stopped using Claude Code; I find having it directly alter my code too hands-off.
Instead I still use Claude in the browser, mainly for high-level thinking/architecture > generating small chunks of code > copy-pasting it over. I always make sure I'm reading the relevant library/code docs as well and asking Claude to clarify anything I'm unsure of. This is akin to when I started development using Stack Overflow, just 10x more productive. And I still feel like I'm learning along the way.
The post is clearly very heavily glued together and formatted by an LLM, but it's sort of fascinating how bits and spurts of the author's lowercase style made it through unscathed.
I get where the author is coming from, but (I promise from an intellectually honest place) does it really matter?
Modeling software greatly reduced the ability of engineers to compute 3rd, 4th and 5th order derivatives by hand when working on projects, and it also eroded their ability to create technical drawings by hand. Both of those were arguably proof of a master engineer in their field, yet today both would be mostly irrelevant when hiring.
Are they lesser engineers for it? Or was it never really about derivatives and drawings, and all about building bridges, engines, software that works?
I can't believe I took a mandatory technical drawing class.
How many people could, from scratch, build a ballpoint pen?
Do we have to understand the 100 years of history behind the tool, or just have the ability to use it? Some level of repair knowledge is great. Knowing the spring vs. the ink level is also helpful.
Following up - I am the most excited about using computers because the barriers between intent and product are being dropped. At this point my children can 'code' software without knowing anything other than intent. Reality is being made manifest. Building physics into a game used to take a decade of experience, but today we can say "allow for collision between vehicles".
If you have ever gone running, the ability to coordinate four limbs, maintain balance, assert trajectory, negotiate uneven terrain, and modify velocity and speed at will is completely unknown to 99.9% of mortals who ever lived, and yet it is possible because 'biological black box hand wave'.
I respect this choice, but also I feel like one might need to respect that it may end up not being particularly "externally" valuable.
Which is to say, if it's a thing you love spending your time on and it tickles your brain in that way, go for it, whatever it is.
But (and still first takeaways) if the goal is "making good and useful software," today one has to be at least open to the possibility that "not using AI" will be like an accountant not using a calculator.
Has anyone measured whether doing things with AI leads to any learning? One way to do this is to measure whether subsequent related tasks show improvements in time-to-functional-result with and without AI, as a % improvement. Additionally, two more data points can be taken: with-AI -> without-AI, and without-AI -> with-AI.
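To make the proposal concrete, here is a minimal sketch of how such a comparison might be tabulated, assuming a hypothetical setup where each participant does a task and then a related follow-up task under one of the four condition sequences; all timings and condition names below are made up for illustration, not data from any study.

```python
# Hypothetical experiment sketch: does working with AI on task 1 improve
# time-to-functional-result on a related task 2? All numbers are invented.
from statistics import mean

# seconds to a functional result, as (task1_time, task2_time) pairs,
# keyed by (task1_condition, task2_condition)
timings = {
    ("with_ai", "with_ai"):       [(1200, 900), (1500, 1100)],
    ("without_ai", "without_ai"): [(2400, 1600), (2000, 1500)],
    ("with_ai", "without_ai"):    [(1300, 2100), (1100, 1900)],
    ("without_ai", "with_ai"):    [(2300, 1000), (2600, 1200)],
}

def pct_improvement(t1: float, t2: float) -> float:
    """Percent reduction in time from the first task to the second."""
    return 100.0 * (t1 - t2) / t1

for (cond1, cond2), pairs in timings.items():
    avg = mean(pct_improvement(t1, t2) for t1, t2 in pairs)
    print(f"{cond1:>10} -> {cond2:<10} avg improvement: {avg:6.1f}%")
```

The interesting cell is with-AI -> without-AI: if learning happened during the AI-assisted task, that sequence should still show a positive improvement rather than a regression.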
> What scares me most is an existential fear that I won’t learn anything if I work in the “lazy” way.
You're basically becoming a manager. If you're wondering what AI will turn you into, just think of that manager.
The missing step seems to be identifying what is worth learning and what your goals are. Will learning X actually benefit you? We already do this with libraries: they save us a great deal of time, partly by freeing us from having to learn everything required to implement them, and we use them despite those libraries often being less than ideal for the task.