Most people treat AI like ChatGPT as an amazing tool and have used it effectively as a replacement for Google, Grammarly, etc. For them it's fine, because any mistakes are localised to them.
The problem is those building products on LLMs (e.g. legal, customer service) who are knowingly misrepresenting its capabilities to companies that don't know any better. I would argue this is fraudulent, and it's where we will see most of the problems.
This code yellow thing is so childish; it is almost unbelievable that one of the biggest companies in the world is managed by a bunch of finance guys who think they are cool tech kids (a concept that doesn't exist anymore, in large part thanks to Google becoming IBM).
Not really: since milk contains water, the maximum temperature it can reach is 100 degrees (given it has a path to atmospheric pressure). So the bottom can only burn once the water there has boiled off.
Milk is mainly water, as I'm sure you know. If you think it's useful to know which mixtures won't be held to 100 degrees, perhaps you could contribute some information to that effect, as opposed to just snark.
Sounds interesting! I wonder what happens when milk gets burned!
- - -
I gave it some thought, and a possible explanation is that as water evaporates from the bottom of the pot, it leaves some waterless milk there, and that burns.
But this assumes that the inside of the pot can be hotter than 100°C. I wonder if that's true, or whether the milk keeps it at ~100°C (water is very good at transferring heat, after all).
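To get a feel for the "waterless layer" explanation, here's a back-of-envelope sketch: if a thin, stagnant film of milk sits at the pot bottom and all the burner's heat goes into it, how quickly could its water boil away? Every number (pot size, film thickness, burner power, milk's water fraction) is an assumption for illustration only.

```python
import math

# Rough assumptions, not measured values:
pot_radius_m = 0.10            # 20 cm diameter pot
film_thickness_m = 1e-4        # 0.1 mm stagnant layer at the bottom
water_fraction = 0.87          # milk is roughly 87% water
latent_heat_j_per_kg = 2.26e6  # latent heat of vaporization of water
burner_power_w = 1500          # typical stove burner, all heat assumed to hit the film

# Mass of water in the film (milk density taken as ~1000 kg/m^3)
area_m2 = math.pi * pot_radius_m ** 2
film_mass_kg = area_m2 * film_thickness_m * 1000.0
water_mass_kg = film_mass_kg * water_fraction

# Energy to vaporize that water, and time at full burner power
energy_j = water_mass_kg * latent_heat_j_per_kg
seconds_to_dry = energy_j / burner_power_w

print(f"~{energy_j / 1000:.1f} kJ to dry the film, ~{seconds_to_dry:.0f} s at full power")
```

Under these (generous) assumptions the film dries in seconds, which is consistent with the idea that milk scorches fast if it isn't stirred: convection normally replaces the film with fresh liquid, but wherever it stalls, the water is gone almost immediately and the leftover solids can climb past 100°C.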
I (not the programmer in the video) started programming two years or so ago, during the pandemic, in my sophomore year of high school. I can affirm: there is a lot to wade through, and it's hard to know what's relevant, easy to learn, and useful in real life.
That said, despite the availability of content online I tend to only watch the videos where something is a) summarized or b) something very advanced is broken down (e.g. the micro-gpt from scratch series). Imo, the best way to learn programming is to get excited about bringing your ideas to life, and choosing ideas small enough (at the start) to accomplish.
I am worried, though, about the rise of machine learning, which may increase the barrier to entry for industry jobs and require more expertise to get one's "foot in the door". Additionally, I find it hard to resist learning new technologies that are not as widely used / are changing quickly, due to their ease of use (Svelte, Tauri, V lang, etc).
Anyways, just thought I'd chime in as a kid learning to code.
> I find it hard to resist learning new technologies that are not as widely used / changing quickly due to their ease of use
This isn’t a bad thing. I’ve never regretted trying something new, even if it’s just to learn that I don’t like it or whatever. “Normal” day-to-day software engineering is much more about building and maintaining what you already have.
> Additionally I find it hard to resist learning new technologies that are not as widely used / changing quickly due to their ease of use (Svelte, Tauri, V lang, etc).
One thing I'd suggest is to be somewhat critical of what it means to be learning something, and to find a way to apply whatever that is multiple times without spreading yourself too thin.
When I was getting started around the same age, I never bothered repping things out like at a gym, or going deeper into how things worked and piecing them back together manually from scratch (such as the micro-gpt project), but those are two ways to keep things interesting long term and develop useful, practical knowledge.

Instead, what I did was read a book or follow a tutorial and figure that was enough to learn. But that's often just what was possible to teach, or a good jumping-off point, and it's basically what I've been getting from GPT lately. Nothing really gets learned, imo, unless there's an input and an output to the process: watching the video and writing all the code from scratch or on paper, then hopefully building something with it where you'll be forced to push yourself mentally. Very little of that depth might apply in a job, but you'll be better at troubleshooting and understanding what's happening, and repping out gradually more abstract and complex projects using the same technology will make you more efficient.

Especially with the ADHD I didn't realize I had earlier, it would have really helped to rep things out more, because only through repetition and expanding complexity do you come to understand the gritty bits that aren't complete, the areas where you're likely to need another tool, the useful shortcuts, the edge cases, etc.
Incidentally, NAND2TETRIS is a great example of an end-to-end course that is totally worth trying and slowly grinding through.
Begging my parents to drive me to every library in the county so I could try and find books on computer graphics and AI in the 1980s. Not very many around then :(
I feel this is a sentiment shared by every generation. “Now with printing presses, kids have access to tomes of knowledge.”
Conversely, you used to be able to look at a line of OS code and know how it translated to registers within a processor. Now I barely know which Kubernetes cluster the code is currently deployed in.
Sure. On the other hand, the mere fact that the information was difficult to find means you might have developed skills and perseverance by going out for it.
But it's certainly no longer the golden era of the Internet either, with mass-produced human clickbait content already on the rise, and now we've got the AI variant to contend with as well :(
Is it because it's actually shit, but it's the best thing we've seen yet and everyone is just in denial?