
All they have to do is not reproduce. This is already happening.


> Automerge/Yjs require learning CRDTs

I've been using Automerge for a while and haven't had to look at any CRDTs. To me this looks very similar to Automerge.

Neat project!


I think the comments here have been overly harsh. I have friends in the community and have visited the LessWrong "campus" several times. They seemed very welcoming, sincere, and were kind and patient even when I was basically asserting that several of their beliefs were dumb (in a hopefully somewhat respectful manner).

As for the AI doomerism, many in the community have more immediate and practical concerns about AI; however, the most extreme voices are often the most prominent. I also know there has been internal disagreement about the kind of messaging they should use to raise concern.

I think rationalists get plenty of things wrong, but I suspect that many people would benefit from understanding their perspective and reasoning.


> They seemed very welcoming, sincere, and were kind and patient even when I was basically asserting that several of their beliefs were dumb

I don't think LessWrong is a cult (though certainly some of their offshoots are) but it's worth pointing out this is very characteristic of cult recruiting.

For cultists, recruiting cult fodder is of overriding psychological importance. They are sincere, yes, but the consequences are not what you and I would expect from sincere people. Devotion is not always advantageous.


Do insincerity, cruelty, unfriendliness, and impatience make a community less likely to be a cult?


> They seemed very welcoming, sincere, and were kind and patient even when I was basically asserting that several of their beliefs were dumb

I mean, I'm not sure what that proves. A cult which is reflexively hostile to unbelievers won't be a very effective cult, as that would make recruitment almost impossible.


> How much would a 10cm sphere of gold be worth in GBP?

Is there some trick to this? Or do you have to input it like:

You have: 4/3 pi (10 cm)^3 * 19320 kg/m^3 * 45000 GBP/kg

(What ChatGPT gave me)


units has (I assume room temp/pressure) densities for all elements, as well as some precious metal prices and currency exchange rates (you need to run the units_cur program regularly to update the database for these). It also has tab completion to make discovering these a bit easier.

The invocation is

You have: goldprice * golddensity * spherevol(10cm/2)

You want: GBP


Neat! Thank you!


TIL -- thank you, brother!


You can just save a step and ask ChatGPT the answer. It can google the current spot price of gold.


Sure, but then I need to do all the math to verify the answer it gives me isn’t gibberish anyway.
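The verification math is short enough to do by hand. A minimal sketch, using the density and price figures quoted earlier in the thread (the 45,000 GBP/kg price is the number from the example above, not a live quote) and reading "10cm sphere" as a 10 cm diameter, matching the `spherevol(10cm/2)` invocation:

```python
import math

# Figures taken from the thread above; the gold price is an assumed
# snapshot, not a live spot price.
density_kg_per_m3 = 19320      # density of gold at room conditions
price_gbp_per_kg = 45000       # assumed price from the example
radius_m = 0.10 / 2            # "10cm sphere" read as diameter 10 cm

volume_m3 = 4 / 3 * math.pi * radius_m ** 3   # sphere volume
mass_kg = volume_m3 * density_kg_per_m3
value_gbp = mass_kg * price_gbp_per_kg

print(f"{mass_kg:.2f} kg -> GBP {value_gbp:,.0f}")
# about 10.12 kg, roughly £455,000
```

Note the ChatGPT expression above plugs 10 cm in as the radius, which would inflate the answer by a factor of eight; the `spherevol(10cm/2)` form treats it as the diameter.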


You can just save a step of double-checking everything by using WolframAlpha

https://www.wolframalpha.com/input?i=%2810cm+sphere+of+gold+...


What if it's wrong?

