Your problem is thinking that hype artists, professionals and skeptics are all the same voice with the same opinion. Because of that, you can't recognize when sentiment is changing among the more skeptical.
Functional illiteracy and the inability to hold context longer than two sentences have long been a plague on HN. Now that we've outsourced our entire thinking process to "@grok is this true", it has claimed almost the entirety of the human race.
soulofmischief: complains that AI skeptics would say the Wright brothers were idiots because they didn't immediately implement a supersonic jet
ares623: we were promised supersonic jets today or very soon (translation: AI hype and scam artists have already promised a lot now)
eru: The passive voice is doing a lot of work in your sentence. (Translation: he questions the validity of ares623's statement)
me: Here are just three examples of hype and scam artists promising the equivalent of a supersonic jet today, with some companies already having been burned by these promises.
Apply your own "functional literacy". I clarified that those outside an industry have to separate the opinions of professionals from those of hype artists.
The irony of your comment would be salient if it didn't feel like I was speaking with a child. This conversation is over; there's no reason to continue speaking with you as long as you maintain this obnoxious attitude coupled with bad reading comprehension.
Here's Ryan Dahl, cofounder of Deno, creator of Node.js tweeting today:
--- start quote ---
This has been said a thousand times before, but allow me to add my own voice: the era of humans writing code is over. Disturbing for those of us who identify as SWEs, but no less true. That's not to say SWEs don't have work to do, but writing syntax directly is not it.
--- end quote ---
They have everything to gain by saying those things. It doesn’t even need to be true. All the benefits arrive at the point of tweeting.
If it turns out to be not true then they don’t lose anything.
So we are in a state where people can just say things all the time. Worse, they _have_ to say them. To them, not saying anything is just as bad as being directly against the hype. Zero accountability.
Yes, my point is that industry professionals are re-calibrating based on the last year of agentic coding advancements, and that this is different from hype men on YouTube from 1-2 years ago claiming that they don't have to write code anymore.
Congratulations, now you're starting to understand! :)
Last one is irrelevant. Of course some companies are miscalculating.
OpenAI never claimed they had achieved AGI internally. Sam was very obviously joking, and despite the joke being so obvious, he even clarified hours later.
>In a post to the Reddit forum r/singularity, Mr Altman wrote “AGI has been achieved internally”, referring to artificial general intelligence – AI systems that match or exceed human intelligence.
>Mr Altman then edited his original post to add: “Obviously this is just memeing, y’all have no chill, when AGI is achieved it will not be announced with a Reddit comment.”
Dario has not said "we are months away from software jobs being obsolete". He said:
>"I think we will be there in three to six months, where AI is writing 90% of the code. And then, in 12 months, we may be in a world where AI is writing essentially all of the code"
He may be off by a few months, but it's not at all a bad prediction.
Arguing with AI skeptics reminds me of debating other very zealous ideologues. It's such a strange thing to me.
Like, just use the stuff. It's right there. It's mostly the people using the stuff vs. the people who refuse to use it because they feel it'll make them ideologically impure, or they used it once two years ago when it was way worse and haven't touched it since.
The insecurity is mind-boggling. So many engineers afraid to touch this stuff for one reason or another.
I pride myself on being an extremely capable engineer who can solve any problem when given the right time and resources.
But now, random unskilled people can do in an afternoon what might have taken me a week or more before. Of course, I know their work might be filled with major security issues, terrible architectural decisions, and hidden tech debt that will eventually grind development to a complete halt.
I can be negative and point out these issues, or I can adopt these tools myself, and have the skilled hand required to keep things on rails. Now what I can do in a week cannot be matched by an unskilled engineer in an afternoon, because we have the same velocity multipliers.
I remember being such a purist in my youth that I didn't even want autocomplete or intellisense, because I feared it would affect my recall or stunt my growth. How far we have come. How I code has changed completely in the last year.
I code 8-20 hours a day, every day. I actively work on several projects at once, flipping between contexts to check results, review code, iterate on design/implementation, and hand off new units of work to various agents. It is not a perfect process; I am constantly screaming and pulling my hair out over how stupid, forgetful, and stubborn these tools can sometimes be. My output has still dramatically increased, and I have plenty of extra time to ensure the code is secure and good enough.
I've given up on expecting perfection from code I didn't write myself, but what else is new? Any skilled individual who has managed engineers before knows you have to get over this quickly and accept that code from other engineers will not match your standards 100%.
Your role is to develop and enforce guidelines and processes which ensure that any code hitting production has been thoroughly reviewed and made secure and performant. There might be some stupid inline metacomments from the LLM that slip through, but if your processes are tight enough, you can produce much more code with correct interfaces, even if the insides aren't perfect. Even then, targeted refactors are more painless than ever.
Engineers who only know how to code, and only at a relatively mediocre level (which I imagine describes the majority of engineers now in the field, many of whom got into it for the money), are probably feeling the heat and worried that they won't be employable. I do not share that fear, provided that anyone at all is employable.
When running a business, you'll still need to split the workload, especially as keeping pace with competition becomes an increasingly brutal exercise. The money is still in the industry, and people with money will still find ways to use it to develop an edge.
AGI was achieved internally at OpenAI a year ago.
Multiple companies have already re-hired staff they had fired and replaced with AI.
etc.