I think it's always important to consider incentives when thinking about what institutional leaders are saying.
> In a productivity boom such as this, a rise in unemployment may not indicate increased slack. As such, our normal demand-side monetary policy may not be able to ameliorate an AI-caused unemployment spell without also increasing inflationary pressure
I'm not saying AI isn't impacting the employment market, but this statement isn't really about AI so much as it is an advance warning that inflationary monetary policy is unavoidable if all the people saying that software engineering is dead are correct.
Or that the Fed is preparing us to expect higher levels of unemployment for the same level of inflation.
Perhaps, but her phrasing seems to imply they will act to reduce unemployment, and we can also assume Trump will mandate that they do so once he takes control of the Fed.
As any president should in that scenario? I'm sorry, we're going to nuke professional class workers and let tech executives keep their 2026 money from the proceeds and let the losers go jobless? Not likely if you don't want a bloodbath. Let me be clear: fuck Trump, but any president who doesn't do that is out of their mind.
Monetary policy isn’t inflationary if it’s on par with real production gains. More money chasing even more goods is deflationary.
What goods?
Software is a "good", as far as economic statistics go.
AI is helping produce more software, right? Including more software that is for sale?[1] Or more online services that are for sale?
[1] One of the interesting things here is going to be liability. You can vibecode an app. You can throw together a corporation to sell it. But if it malfunctions and causes damage, your thrown-together corporation won't have the resources to pay for it. Yeah, you can just have the company declare bankruptcy and walk away, leaving the user high and dry.
After that happens a few times, the commercial market for vibecoded apps may get kind of thin. In fact, the market for software sold by any kind of startup may also get thin.
Software stopped being a good when it no longer came in a box with finite inventory that you paid for only once. It's part of the services economy, the same as insurance or car rental services, regardless of how the Fed classifies it.
So is the premise here that making more software is going to have a deflationary effect on the entire economy of material goods? If so then that's obviously nonsensical.
I’d been a naysayer up until two weeks ago, when I actually sat down and coded a non-trivial feature with AI and was blown away. The problem was one for which there may have been at most 5 open source versions the models had been trained on.
Today I am planning an exit strategy. Anyone else considering what they’ll do in a post AI software engineering world?
I have no workable exit strategy because I have small children and our family has been continuously under layoffs and terminations.
If the doom really comes to pass then what future is there for us? I fear a life of impecunious servitude and poverty more than death.
Regardless of the future you have to plan for the worst and hope for the best.
I don't have time to post significantly about it but I'd love to trade thoughts and figure this out.
Email?
Save up enough money to be safe for many years after I get laid off so I have plenty of time to figure out what to do.
Writing the code has never been what has held companies back.
A lot of companies will use the speed of AI to wallpaper over the fact that they don't know what to make or how to prioritize.
We used to call the job Analyst Programmer. I am not sure what thing your code did, but I am pretty sure you needed it - but who could understand that the gap existed? Who could explain to an AI that it needed to create this obscure code to solve that problem? And now comes the hard part - persuading your organisation to adopt it.
AI can code - but can it understand what is missing from the organisation and persuade it to change - to spend years at industry conferences?
Look at Starliner. NASA just announced that Boeing stuffed up, not with an engineering mistake (no one still knows exactly what broke) but because the whole organisation is so screwed up and so political that NASA just doesn’t believe Boeing can fix it.
AI cannot fix our turf wars. That’s not an intelligence problem (humans know going to war is bad, but Putin still exists). It’s the systems we live in, and work in.
Changing those is feasible - once they are coded, transparent, and open to inspection in a democracy.
We need programmable introspective systems of organisation - democracies in other words.
The engineering was not the problem - the problem was that the organisation was more or less toxic and incapable of doing engineering. Writing code that won’t get used because of politics is a job both we and AI can do.
How do you plan an exit strategy for something that may or may not obsolete a whole field in a matter of months? Not sure there's a real way to do such a thing.
I would emphasize self reliance and sustainable living. Things like home ownership, dollar-resistant assets like gold, and self education in topics that matter to you and your existence.
Another way to frame it is what would you do in a low trust environment where corporations and the government were not to be trusted. You would likely avoid things like bubble bursting AI stock investments, jostling for rank in a company, etc.
If post AI software engineering is a thing then this is NOT a bubble.
I've been using these tools on a daily basis for nearly a year and a half. They've become an integral part of my toolbox for solving problems.
But writing code was never much more than 35-40% of my job while working for companies/others. Most of my time has always gone toward communication, design, and validation. None of those three are particularly vulnerable to mass AI automation except in the most trivial of scenarios, and I have not seen evidence that has changed in over 2 years of so-called "improvements".
My "exit plan" ultimately is to be one of the engineers capable of using these tools to scale my impact accordingly so I can focus on higher order problem solving, which ultimately is what is most valuable. I would be more concerned if I was in marketing/sales or frankly middle management.
Maybe this is just "copium" on my part, who knows, this sector is moving fast.
Yeah, it's pretty wild, huh? I'm at one of the big tech companies and the strategy that I've fallen into is to basically just mostly coast, save / invest as much as possible, and hope that this gets me to a financial threshold of "permanently comfortable" before the job losses kick in.
As for what happens after that, I'd really prefer not to have to do physical labor or trades. And it doesn't seem like any other white collar occupation is really going to be insulated, other than perhaps medical. So my strategy is to basically wait and see what society looks like after the transition and I guess I'll try and decide on something then?
I'm not saying it is totally untrue, but this is the same generic, hedged statement that every business and political leader has been repeating for a couple years now. Unless there's anything new or noteworthy added (and looking at the article there isn't) what is really worth discussing?
They linked to the economic report article:
https://www.reuters.com/world/us/us-third-quarter-productivi...
Productivity up 5%.
Productivity per dollar up 3% in Q2 and 2% in Q3, even as labor costs rose 1%.
Another source that has been finding labor productivity increases in national-level data, since 2024 now:
https://www.stlouisfed.org/on-the-economy/2025/nov/state-gen...
> ... on average, industries with 1 percentage point higher time savings experienced 2.7 percentage points higher productivity growth relative to their prepandemic trend. We stress that this correlation cannot be interpreted as causal, and that labor productivity is determined by many factors. However, the current results are suggestive that generative AI may already be noticeably affecting industry-level productivity.
What's new is who is saying it, i.e. people with better understanding of the economy than you or I.
People in much more important and powerful positions than her (presidents, CEOs of multi-trillion-dollar companies, top economists, heck Jerome Powell himself) have been saying this exact thing in countless interviews and business forums. She is hardly the first.
> people with better understanding of the economy
> People in much more important and powerful positions than her
I said "understanding," you said "power." There's a difference: presidents and CEOs say lots of dumb stuff.
> While AI will offer "new opportunities,"
It would be helpful if this were articulated in depth. It's used as a shibboleth alongside "productivity", but it's rarely followed by concrete details.
An important thing to remember is that "new opportunities", whatever they are, are neither for you nor your children to partake in - they're for the people just entering the workforce. Those whose careers suddenly disappear, as well as their families and children, are too busy dealing with the consequences of being suddenly thrown down a rung or three on the socioeconomic ladder.
It’s not articulated in depth because nobody knows what opportunities there are on the other side.
Prediction is hard, especially of the future.
This isn't the first time that new technology has reshaped society, or even just the economy. How well were the results of prior shifts predicted ahead of time?
“Thanks to AI (and new immigration policies), there are tons of exciting new opportunities for Americans that want to pursue careers in: harvesting crops, installing roofs, or home health care”
Is it AI that's causing problems or is the US govt finally acknowledging how high the bar is in general for getting employment? It's not just AI that has influenced that.
More importantly, are they planning to do anything about it?
Which means we'll all work on expanding the welfare state? Which exists to take care of the unemployed portions of our population. Right?
Just yesterday, Goldman and MS said the exact opposite.
https://www.washingtonpost.com/technology/2026/02/23/ai-econ...
Two different things. My understanding is that the Goldman Sachs take was about the effect of AI investments (largely the humongous CapEx spends by the hyperscalers) showing up in the GDP.
This is about labor productivity, a standard national-level economic indicator (see https://www.bls.gov/news.release/pdf/prod2.pdf and https://fred.stlouisfed.org/series/OPHNFB) going up 4.9%, as reported in this article linked in TFA: https://www.reuters.com/world/us/us-third-quarter-productivi...
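For anyone unfamiliar with the indicator: labor productivity is output per hour worked, so its growth rate is approximately output growth minus hours growth. A toy sketch (the 5.3% output and 0.4% hours figures are hypothetical inputs, chosen only to land near the 4.9% productivity number in the linked article; this is not BLS methodology):

```python
# Labor productivity = output / hours worked, so its growth rate is the
# ratio of the two growth factors (roughly output growth minus hours growth).
output_growth = 0.053  # hypothetical annualized nonfarm output growth
hours_growth = 0.004   # hypothetical annualized hours-worked growth

# Exact growth-factor arithmetic rather than the subtraction approximation.
productivity_growth = (1 + output_growth) / (1 + hours_growth) - 1
print(f"{productivity_growth:.1%}")  # 4.9%
```

The point being: a 4.9% print can come from output rising while hours barely move, which is exactly the "output holds up while employment doesn't" scenario the article is worried about.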
Nah, the articles are non-contradicting. That article focuses on how the spend mostly goes to imports, which decreases GDP. This one focuses on the effects on unemployment. It's very plausible that a decrease in interest rates right now would lead to more imports and AI spending, not increased employment.
If productive output really stays the same while employment (and tax revenue) drops, there are only two ways for the government to stay solvent:
1. Print money
2. Increase taxation
That’s it. An eroding tax base necessitates one of those or a combination.
Sure, but also looking at the pattern of who is involved - they could also declare bankruptcy (even if federal gov’t bankruptcy isn’t actually a thing).
> "Monetary policymakers would face tradeoffs between unemployment and inflation. ... Education, workforce, and other policy that is non-monetary may be better suited to address these challenges in a more targeted way."
How comforting. Sounds to me like "ZIRP won't fix this one folks, it's gonna take something other than money to fix what's coming."
The only question is how ugly it's going to get. Globally.
The closest thing we've seen in terms of scope/velocity is probably the introduction of the web in the late 90s to the broader world. Very few jobs were killed by that, though, relatively speaking.
The global effect at my company is that we fired a bunch in the US, hired a bunch in India, and are probably gonna do the same again in a year or so, if I'm reading the tea leaves right. We're heavily pushing "AI" but we're not cutting headcount, just shifting them overseas.
The closest thing is probably workers displaced by machines in the Industrial Revolution. Some people took it into their own hands to smash machines, and it didn’t go well for them.
Today we use Luddite as an epithet, but they were right about the effect of automation on their jobs.
This whole thing stinks.