Hacker News | logrot's comments

But surely if it's artificial intelligence then it'd know its limits and would respond appropriately? Oracle use, no problem?

Or is it because it's actually shit, but it's the best thing we've seen yet and everyone is just in denial?


People constantly misevaluate their own limits though. Why should AI not be allowed to do that?


Professionals don't constantly misevaluate their limits. If the AI is to replace a professional, it has to know its limits.


Current AI is a productivity boost, not a replacement. And automation of certain use cases, but not all. It's already really good at those things.


It depends on who you mean.

Most normal people look at AI like ChatGPT as an amazing tool and have used it effectively as a replacement for Google, Grammarly etc. And for them it's fine because any mistakes are localised to them.

The problem is those building products on LLMs (e.g. legal, customer service) who are knowingly misrepresenting the capabilities of what it can do to companies who don't know any better. I would argue this is fraudulent, and it's where we will see most of the problems.




The dream is collapsing.


As Something of a Vagueposter Myself, I’ll bite.

Which dream is collapsing? I’m not disputing it, legitimately curious which of several collapsing dreams you mean.


It's the title of a Hans Zimmer song from the Inception soundtrack, pretty intense!


Cool sidebar, thanks for reminding me. I find Zimmer soundtracks to be some of the best coding music and many if not most of them are on e.g. Spotify.


I have a title ready:

From loved to loathed in 25 years.

Subtitled: How Prabhakar Raghavan, armed with greed and ignorance, tanked the world's biggest search engine for the second time.

(Yep, see https://www.wheresyoured.at/the-men-who-killed-google/ )


This code yellow thing is so childish. It's almost unbelievable that one of the biggest companies in the world is managed by a bunch of finance guys who think they're cool tech kids (a concept that doesn't exist anymore, in large part thanks to Google becoming IBM).


Would that not increase the chances of things getting burnt at the base?


The article says it does the opposite, as it effectively stirs the bottom.


Not really: since milk contains water, the max temperature you can reach is 100 degrees (given it has a path to atmospheric pressure). So the bottom can only burn once all the water there boils off.


A beautiful demonstration of the difference between theory and practice.


"contains water" is very obviously insufficient for this to be anywhere close to true.


Milk is mainly water, as I'm sure you know. If you think it's useful to know which mixtures won't be held to 100 degrees, perhaps you could contribute some information to that effect, as opposed to just snark.


You should really put your theory to the test.


Put some milk on high and see if your theory holds true.


Sounds interesting! I wonder what happens when milk gets burned!

- - -

I gave it some thought, and a possible explanation is that as water evaporates from the bottom of the pot, it leaves some waterless milk solids there, and those burn.

But this assumes that the inside of the pot can be hotter than 100°C. I wonder if that's true, or whether the milk keeps it at ~100°C (water is very good at transferring heat, after all).
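A back-of-envelope check suggests it can: once a dry film of milk solids forms on the pot bottom, heat has to conduct through that film to reach the boiling liquid above it, and the pot-side of the film can run much hotter than 100°C. The numbers below are assumptions for illustration (burner heat flux, film thickness, and the film's thermal conductivity are rough guesses), not measured values:

```python
# Sketch: steady-state 1-D conduction through a dried milk-solids film
# on the pot bottom (Fourier's law: delta_T = q * d / k).
# All three inputs are assumed, order-of-magnitude values.

q = 50_000.0   # heat flux from a hot burner, W/m^2 (assumption)
d = 0.001      # film thickness, m (~1 mm of dried solids, assumption)
k = 0.2        # thermal conductivity of the dry film, W/(m*K) (assumption)

delta_t = q * d / k  # temperature drop across the film, K
print(delta_t)       # -> 250.0
```

So even while the bulk liquid sits at ~100°C, a millimetre of insulating dried solids can let the surface in contact with the pot run hundreds of degrees hotter, which is plenty to scorch.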


I envy kids today, being able to look up pretty much anything they want to learn. It wasn't like that back in the day.


I (not the programmer in the video) started programming two years or so ago, during the pandemic, in my sophomore year of high school. I can affirm there is a lot to wade through; it's hard to know what's relevant, easy to learn, and useful in real life.

That said, despite the availability of content online, I tend to only watch the videos where something is a) summarized or b) something very advanced is broken down (e.g. the micro-gpt from scratch series). Imo, the best way to learn programming is to get excited about bringing your ideas to life, and to choose ideas small enough (at the start) to accomplish.

I am worried though about the rise of machine learning, which may increase the barrier to entry for industry jobs and require more expertise to get one's "foot in the door". Additionally, I find it hard to resist learning new technologies that are not as widely used / changing quickly due to their ease of use (Svelte, Tauri, V lang, etc).

Anyways, just thought I'd chime in as a kid learning to code


> I find it hard to resist learning new technologies that are not as widely used / changing quickly due to their ease of use

This isn’t a bad thing. I’ve never regretted trying something new, even if it’s just to learn that I don’t like it or whatever. “Normal” software engineering stuff (day to day) is so much more about building and maintaining what you have already.


> Additionally I find it hard to resist learning new technologies that are not as widely used / changing quickly due to their ease of use (Svelte, Tauri, V lang, etc).

One thing I'd suggest is to be somewhat critical of what it means to be learning something, and find a way to apply whatever that is multiple times, without spreading yourself too thin.

When I was getting started around the same age, I never bothered repping things out like at a gym, or going deeper into how things worked and piecing them back together manually from scratch (such as the micro-gpt project), but those are two ways you can keep things interesting long term and develop useful, practical knowledge. Instead, what I did was read a book or follow a tutorial and figure that was enough to learn, but that's often just what was possible to teach, or a good jumping-off point, and it's basically what I've been getting from GPT lately.

Nothing really gets learned, imo, unless there's an input and output to the process, such as watching the video and writing all the code from scratch or on paper, and then hopefully building something with it where you'll be forced to push yourself mentally. Very little of the depth might apply in a job, but you'll be better at troubleshooting and understanding what's happening, and repping out gradually more abstract and complex projects using the same technology will make you more efficient.

Especially with the ADHD I didn't realize I had back then, it would have really helped to rep things out more, because only through repetition and expanding complexity do you see the gritty bits that aren't complete, the areas where you're likely to need another tool, the useful shortcuts and edge cases, etc.

Incidentally, NAND2TETRIS is a great example of an end to end course that is totally worth trying and slowly grinding through.


> I tend to only watch the videos where something is a) summarized or

I hope you don't give views to those asinine videos that take 30 minutes to 'summarize' a 90 minute movie.


Begging my parents to drive me to every library in the county so I could try and find books on computer graphics and AI in the 1980s. Not very many around then :(


Alternatively, our routes back in the day were relatively linear, whereas today kids have to wade through so much more to find their paths.


I just wrote out a reply to the parent comment about learning to code but forgot to reply to this one, my previous comment is here: https://news.ycombinator.com/item?id=40484956


I feel this is a sentiment shared by every generation. “Now with printing presses, kids have access to tomes of knowledge.”

Conversely, you used to look at a line of OS code and know how that translated to registers within a processor. Now I barely know which Kubernetes cluster the code is currently deployed in.


Sure. On the other hand, the mere fact that it was difficult to find the information means you might have developed skills and perseverance by going out for it.


But it's certainly no longer the golden era of the Internet either, with mass-produced human clickbait content already on the rise, and now we've got the AI variant to contend with as well :(


It used to be called an executive summary. It's brilliant, but the kids found the phrase too formal.

IMHO almost every article should start with one.


Executive summary?


A Google employee scheduled the deletion of UniSuper’s resources and Google (ironically) did not cancel it.


there's literally a tl;dr in the linked article


There should be an Executive Summary


8% of men are colour blind. Not taking that into account when designing UI is ... interesting, to put it lightly.


Afaik, there is no black/white/blue variant of color blindness.

