The tech sector is pouring billions of dollars into AI. But it keeps laying off humans

The tech sector has kicked off the new year with a spate of fresh job cuts, even as the industry doubles down on its investments in artificial intelligence.
Capitalism: where automation means you either get fired or work the same for less money.
Or work more for the same amount of money.
Or, and hear me out here, work more for less money.
I work in tech. I’m really impressed with AI, particularly outside of LLMs, and I was a yuge ChatGPT fan for most of last year. But I don’t think the technology is ready. I worked with a senior dev who worked on LLMs, and he didn’t think it was there either. I think we’ve got C-suite overshooting and investor hype around a technology that isn’t quite there yet. There’s going to be a reckoning soon where all these dumbasses are going to have to eat crow and re-hire a lot of these positions.
I work/study in AI, and it is completely over-hyped. For one thing, the C-suite can’t wrap its head around the fact that AI != LLM; they all seem to think all AI is just LLMs. On top of that, they are way too eager to throw humans out of the loop.
That said, I think LLM applications, even in their current form, are super useful in development and business practices. I myself use them to increase my productivity in coding. But I use them as an augmentation rather than a replacement. One of my friends put it best the other day: “LLMs are like a junior dev to your senior dev. You need to be hyper-specific, and you need to check its output.” In other words, they’re great for off-loading some work, but they aren’t going to completely replace humans.
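Here’s roughly what that workflow looks like for me, as a minimal sketch: the prompt is deliberately narrow, and nothing gets applied until a human signs off. (Assumes the OpenAI Python SDK; the model name, prompts, and helper functions are just made up for illustration, not anything from the article.)

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def draft_refactor(snippet: str) -> str:
    """Ask the model for a narrowly scoped change; the result is a draft, not a patch."""
    response = client.chat.completions.create(
        model="gpt-4",  # illustrative model name
        messages=[
            {
                "role": "system",
                "content": (
                    "You are a junior developer. Rename variables for clarity only; "
                    "do not change behavior. Return only the revised code."
                ),
            },
            {"role": "user", "content": snippet},
        ],
    )
    return response.choices[0].message.content


def human_approves(draft: str) -> bool:
    """The senior dev stays in the loop: nothing is applied without an explicit yes."""
    print(draft)
    return input("Apply this change? [y/N] ").strip().lower() == "y"


if __name__ == "__main__":
    draft = draft_refactor("def f(a,b): return a+b  # totals an invoice")
    if human_approves(draft):
        print("Applying change...")  # in real life: write to a branch and open a PR
    else:
        print("Discarded; back to the human.")
```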
All that said, I’m a bit annoyed that other AI fields are being overshadowed by LLMs. There’s a ton of other interesting work being done in those fields that is super useful and important. None of it is going to replace humans, though; it’s going to augment them and make them more productive. I’ve found that an AI-human team is most effective.
AI != LLM… a thousand times yes. We’re seeing a boom of LLMs, and those have their limits too, but it’s the closest we’ve gotten to “true AI” and it’s the buzzword the C-suites have latched onto. Look at blockchain and where that went for them.
Agreed. I’m a backend dev and I regularly use ChatGPT to help me think through problems, spot pernicious errors, refactor, etc. ChatGPT is really, really cool, but imo, not as cool as AlphaFold. Every time I do manage to hear about an AI development that’s not a diffusion model or an LLM, it’s really, really cool.
I promise you they will not rehire those they fire to replace with LLMs. I can’t post which company I work for, but it’s one of the largest employers in my country, with tens of thousands of employees. We have laid off entire branches of our organization in favour of enterprise GPT-4 licenses.
Several sectors of HR have been replaced, and our program analytics and accounting branches have all gone through three rounds of cuts (we now have a chatbot that handles data viz, HR mediations, and parts of various HR processes). Our graphics and design teams are now Stable Diffusion.
These changes were implemented in August of last year. Thousands of employees have been replaced by a handful of different generative AI licenses, and we had to take on the additional cost of hiring an offshore consulting firm to feed these models our business data so they would be more customized for our org.
I’m looking for my exit because, despite the increased costs and automation, things aren’t working well at all. The HR bot sometimes tells you rules that don’t exist (had to look deep into my contract because the HR bot told me I wouldn’t be eligible for severance pay, which was false). All of our workloads have quadrupled between losing a good chunk of our staff and trying to fix things that are now broken. I went from 40h/week to 60+, and I’m definitely on the lower end of the spectrum.
That sounds about right. Incredible that they dove into it so completely, and having a bot handle their HR sounds like they’re spoiling to get fucking wrecked in court, at least at this stage. So, what makes you say they won’t reverse course? Are the cost savings worth the lost performance, or have they just not seen the writing on the wall (and won’t)?
Sunk cost. They’ve already spent tens of millions on this.
With all the biases the bots carry, it’ll be lawsuit time for many of these corporations. Money gained by firing developers will be lost on retaining Dick, Schlong & Partners.
(had to look deep into my contract because the HR bot told me I wouldn’t be eligible for severance pay, which was false)
Sounds like the same performance as before, when it was a human!
“If we lie*, hopefully they won’t check. And if they call us out, it was an honest mistake!”
Here’s the summary for the Wikipedia article you mentioned in your comment:
The Gartner hype cycle is a graphical presentation developed, used and branded by the American research, advisory and information technology firm Gartner to represent the maturity, adoption, and social application of specific technologies. The hype cycle claims to provide a graphical and conceptual presentation of the maturity of emerging technologies through five phases.
I think that arrow is currently still a bit to the left of the peak.
I use gpt-4 daily for my work, I love it. I get more done and save time.
The thing is, I don’t trust the simulation one bit. If you analyze its output properly, you will always find errors.
It’s way easier to fix those errors or to adapt the solution to my needs. But the most essential component of the entire workflow is that I, a human with real-life expertise, green-light the good parts of its response. It’s a bit like I became a teacher and ChatGPT is the student: its work extends mine, but I remain responsible at all times.
Maybe in a few years it could be ready. Let’s just hope it doesn’t go the same way as NFTs, which could have had a brilliant use tracking the authenticity of journalism in a fake-news world, but no, the monkey craves golden bananas even if it can’t digest them. And now the aftertaste is so bad people get sick at the mention.
Correlation ≠ causation
For the large tech companies that my wife and I work for, it’s the pandemic.
The bonkers pandemic tech boom has chilled. People were trapped at home and were streaming video, buying shit online, using same-day delivery for food, throwing birthday parties on zoom, using telehealth to talk to their doctor, buying new gear for the desks they were tied to, sitting on social media for extended periods, etc.
All of those industries have seen traffic / sales cool a LOT. People are traveling, doing things in person, etc. Employees were hired to address the pandemic level demand, and now demand is returning back to normal.
That would make sense if the companies laying people off were at all related to the kinds of markets that depend on the traffic you describe for survival. Most of the people I know, including me, were laid off from companies that sell B2B stuff or solid products that are always needed, independent of those factors. The common excuse used is that everyone was hiring a lot to catch the good talent that others were firing.
What I see is just businesses all using the excuse of “oh, everyone is doing it, let’s do it too,” regardless of whether it’s really necessary or not.
It’s going to be different for every company, but as someone who is close to the numbers and is in the rooms listening to the execs freak out, and is close to others in similar roles, a lot of tech layoffs are being blamed on pandemic over-hiring.
And I can attest to hiring a shit load in 2020/2021. It was kind of bonkers. There were lots of openings, not a lot of candidates, and as a candidate you could negotiate insane salaries. It reminded me of the dot-com boom days of the late ’90s.
It’s a “perfect storm” combining both post-covid effects and shiny new thing effects
Is this not exactly what people said would happen? Is this not exactly the point of AI? Aside from the fact that layoffs have been going on for a year or more, of course.
This headline is gaining traction because it fits the pattern of what people expected to see. It’s confirmation bias.
Speaking from firsthand experience, a lot of companies are coming down from their pandemic high. People spent a fuck load of money on technology products and services over the pandemic, and in 2023 they went back to spending money on traveling, concerts, brick-and-mortar shopping, and other in-person things.
I work for a company that has both online and brick and mortar products, and I can see it in my data. Customer behavior changed back to normal, which made tech businesses less profitable. And that made tech companies freak out and cut “expenses” to get back to growth.
AI is disruptive AF, but it’s not replacing engineers, designers, and PMs yet. And those are the jobs that are being cut. Companies are scaling back and doing less.
No, this is independent of AI. AI for big tech is a core product, like a car for BMW. The layoffs are completely unrelated to AI and entirely related to stock price and the interest rates on borrowed money. None of the laid-off employees have been replaced by AI. It’s pure old-school finance.
Unrelated news items stitched into a single title just to attract readers. The usual sh**y move to ride the AI-hate bandwagon.
It’s fucking stupid if you think AI today can replace human tech workers
I don’t think the title needs a “but” statement. It’s a no-brainer and no surprise that they’d lay off their workforce whilst bringing in AI.
Just the ugly side of the still-not-so-pretty face of capitalism.
It needs the “but”. LLMs are generating more work for programmers, not replacing them.
Good bot. You are right, who needs those cumbersome humans anyway? /Sarcasm
Correction: “-And- it keeps laying off humans.”