I ran a test yesterday on ChatGPT and Copilot: I first asked whether each knew of a specific paper, which both confirmed, and then asked them to derive simple results from it, which they were completely incapable of doing. I know this paper is not widely referenced (i.e. few known results in the public domain), but it has been available for over 15 years with publicly accessible code written by humans. The training set was so sparse the models had no ability to "understand" or even regurgitate anything past the summary text, which they listed almost verbatim.
I went into WASM assuming it was simple and minimal, but there are a lot of runtime edge cases that are quite surprising. I built an interpreter that passes all the non-SIMD test cases from the official spec. It would be interesting to clean up the spec into a real minimal execution-only core; as it stands, there is a real barrier to entry for producing new lightweight implementations.
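To give a flavor of those edge cases, here's a minimal sketch (in Python, names mine) of WASM's `i32.div_s` semantics: the spec requires a trap both on division by zero and on INT32_MIN / -1 (the result would be unrepresentable), and division truncates toward zero, which differs from Python's floor division.

```python
# Sketch of WASM i32.div_s semantics (illustrative, not from any real runtime).

INT32_MIN = -2**31

class Trap(Exception):
    """A WASM runtime trap."""

def i32_div_s(a: int, b: int) -> int:
    if b == 0:
        raise Trap("integer divide by zero")
    if a == INT32_MIN and b == -1:
        # 2**31 doesn't fit in i32, so the spec mandates a trap here
        raise Trap("integer overflow")
    # WASM truncates toward zero; Python's // floors, so do it manually.
    q = abs(a) // abs(b)
    return -q if (a < 0) != (b < 0) else q

print(i32_div_s(-7, 2))  # -3, whereas Python's -7 // 2 gives -4
```

Get a few of these wrong and the spec test suite catches you immediately, which is exactly why passing it all is non-trivial.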
What's a good simple calculator for programmers? Easy entry and conversion of hex and binary, simple arithmetic, and maybe shifts. I don't need to do trigonometry.
I have one, bought it new many decades ago. If the SwissMicros clones have keys as good as the old HP originals, I wouldn't hesitate to buy one of those if you're interested in a 16C.
SwissMicros make really good devices if you’re willing to pay the price. I carry my HP42 clone everywhere.
They make a clone of the programmer's HP-16C that is pretty nice. Everything is open source too, so there's modding potential if you're into that kind of thing.
Calculators are a relic of when we used to require separate single-function devices to accomplish anything, like VCRs, cameras, GPS receivers, music samplers, or transistor radios.
There is nothing a physical calculator can do that an IPython session can't. And a physical calculator can't copy-paste the result where you want it (though modern phone-camera OCR has, I guess, got to the point where you could probably grab the result off a calculator screen).
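For the programmer-calculator use case asked about upthread, a plain Python/IPython session does cover hex and binary entry, conversion, and shifts out of the box:

```python
# Hex and binary literals are first-class (underscores are just separators)
x = 0xDEAD_BEEF
y = 0b1010_0110

# Conversion back to hex/binary strings
print(hex(x))   # '0xdeadbeef'
print(bin(y))   # '0b10100110'

# Shifts and masks; mask to emulate fixed-width 32-bit registers
print(hex(x >> 8))                  # '0xdeadbe'
print(hex((x << 4) & 0xFFFFFFFF))   # '0xeadbeef0'

# format() gives zero-padded output, handy for register-style views
print(format(y, '#010b'))           # '0b10100110'
```

In an actual IPython session you'd skip the print() calls and just evaluate expressions, with `_` holding the last result.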
Outside of cases where you need dedicated pieces of hardware to accomplish a task, like motors or heating elements or something, there’s very little reason for preferring a dedicated device these days over a software or app equivalent. And even then, a version of the hardware that hooks up to a general purpose device to use its input and screen is likely the most useful form factor.
Modern ‘single function devices’ are most likely actually just low power multifunction computers running most of the functionality in software anyway.
Dedicated physical buttons are far superior to 'universal' touchscreen interfaces, if the thing you need fits on them.
And having a separate dedicated device helps maintain a workflow that doesn't invite distraction or context switching on a universal device, so a dedicated device is often a reasonable intentional choice even if the same functionality is already available on another, non-dedicated device.
Some numpads even come with an attached specialized screen to show you what you entered and a bunch of other features - they're called "calculators" :)
I use a computer, and, as you mentioned later, one with a keyboard connected. However, except for simple stuff I still reach for my phone, which runs an HP emulator, to do calculations with something that looks like an actual calculator. Except that it doesn't have those very special old HP calculator buttons, so it's much worse than the real thing (I have an HP 16C lying around, but not a general-purpose HP). But still preferable to the computer. So here I am, having used computers daily for close to half a century, still grabbing a dedicated thing for certain types of calculations.
Take a look at https://github.com/tibru/tibru for an explanation of why the underlying technology is poorly chosen and how systems of this type don't need to be esoteric