Programmatic usage of LLMs has gone up astronomically over the past year. There are so many startups that are mere LLM wrappers. It’s the latest tech hotness, and I get it.
Ever since 2007/08, tech has seen wave after wave of innovation:
- Explosion of socials
- Modernization of web
- 8 million new JavaScript libraries
- Mobile Apps
- No-code
- Crypto
- Machine Learning
- and now, LLMs
The one way LLMs differ from the others is the learning curve. With the others, one could easily hop on the trend. With LLMs, there’s a lot to learn before you can even understand what everyone’s talking about. That can create a sense of being left out, and I think that can be demotivating.
I don’t know about you guys, but I’m exhausted by all of the tech feeds being filled up with the same topics around LLMs.
To add to it, I’ve been seeing CEOs saying “it’s so over” when in reality that’s far from true. Also, it’s always the CEOs, and not the actual engineers behind these LLMs, making such claims.
It’s toxic af. It really is.
I get it though. These CEOs need to sell their products and that’s one way to sell it. Fucking with mass paranoia is one way to get the needed publicity for your product.
I’ve seen CEOs claim that their AI is going to overtake the world. Years later, we are still here. ChatGPT launched in late 2022, and its CEO has been making some version of that claim roughly once a month ever since. In reality though, the damn thing can’t generate a basic fucking test case even when it has access to the entire code base.
As a person reading news and being worried, you’ll continue to worry. You worried in 2022. You worried in 2023. You worried in 2024. You continue to worry in 2025. These CEOs know that you’re worried and continue to make claims that it’s GG for everyone.
First, it was AIs replacing junior roles. Now it’s the mid roles. Again, all of this while the damn turds can’t code a test case.
I’ve seen CEOs claim that a significant percentage of new code is being written by AIs. Tell you what: before that, how much of the code was copied from StackOverflow? These CEOs claim AIs are writing the code. Who’s reviewing it? Who’s rewriting it? Humans. Then would it really make sense to claim that the code is AI generated? See how it’s always these CEOs, selling such tools, making those claims?
I’ve been saying “CEOs” instead of being specific about a single CEO ’cause it applies to a broader category of the tech CEOs right now.
During the pandemic, most companies over-hired, then laid people off later. If a company had layoffs during the same period as everyone else, they’d be fine in the public eye. But if a startup isn’t doing well and needs to lay off employees, they’d want to control the narrative, so they’d say the staff were replaced by LLMs. This is what some startups did.
A tech company (read LLM wrapper) launched last year, claiming that it’d be a software engineer. After their early beta proved to be bad (I was gonna say dogshit, but I chose to be polite :D), it’s now been demoted to a “collaborative AI teammate”.
At one point, for a very very brief period – 1 day – I was worried too. I even told a couple of my friends about it. Then, I snapped to reality. I’m embarrassed.
AI exhaustion is real. Don’t get caught in it. Do what you enjoy and try to have fun. Use LLMs and build cool shit. Just don’t worry about AI taking away your jobs. 🤙