> I'm a bit nervous where I'd end up though - with code I'd "written" but wasn't familiar with
This does seem like quite a big downside. It turns every new feature into “implement this in someone else’s code base”. I imagine you’d very quickly have complete dependency on the AI. Maybe that’s an inevitability in this new world?
It sounds fine as long as you can fully trust the AI to do good work right?
I don't think there's any current AI that is fully trustworthy this way though.
I wouldn't even put them at 50% trustworthy.
I think we are going to see a cliff where they become 80% good, and every tiny bit of improvement past that point will be exponentially more difficult and expensive to achieve. I don't think we'll reach 100% reliable AI in any of our lifetimes.
I think we are going to reach a point where a certain type of old-school developer keeps saying, "it just can't write code like I can," while wondering why they can't land a job.
Current AI is likely already beyond 50% trustworthiness, whatever that means.
> "it just can't write code like I can" while at the same time wondering why they can't land a job
People had this same prediction about offshore development
Those old-school devs are able to find well-paying work fixing broken software churned out by overseas code sweatshops.
I predict that if you can read and understand code without the help of AI models, you will be in even higher demand, fixing the endless broken software built by AI-assisted coders who cannot function without AI help.