
When I ask ChatGPT a question, it explains its reasoning and gives me concepts I can follow up on by googling to learn more.

When I use Google for research, I get articles written for SEO to push products, and I often have to refine and refine and refine to get something useful — which I can then, with difficulty, follow up on by googling to learn more.

Honestly I don't know how much I'd use ChatGPT if I had the internet of 2016 and Google.



Careful: it explains, but both the answer and the explanation are sometimes completely hallucinated. It can look like a plausible answer while actually being completely made up. And this happens way too often for me to take it seriously for now.


It's probably got as good a success rate as many of my colleagues do.



