Kernel developers don't use an IDE for the Linux kernel, and they are arguably some of the best programmers in the world. Great programmers use vim or emacs, because the imagery in their mind is far more powerful than anything an IDE could display. Besides, any screen real estate used for "interpretive" purposes is just an annoyance.
Are they the best developers, and if so how do you measure that?
> Great programmers use vim or emacs
This is another form of No True Scotsman, and not a valid argument.
> because the imagery in their mind is far more powerful than anything an IDE could display
How do you know, and how do you measure this? Surely, you would admit that Vi and Emacs are better than pen and paper, or punch cards, right? So does it not follow that Vi and Emacs could be improved upon? Or are they the pinnacle of inputting instructions into a computer? If they are which one is better? Why? How do we measure that?
The steering wheel could be improved upon in some theoretical sense, but the chances are that any new car steered by, say, an iPad is much worse and definitely isn't going to be used in races any time soon.
These kinds of generalizations aren't helpful, simply because they aren't true. Are vi and emacs still going to be in wide use 500 years from now? Likely not. Then it stands to reason that there might be a way to improve upon them.
I use an IDE because I don't have to sink an inordinate amount of time into customizing my environment, since that activity doesn't deliver any value to the folks that pay me. I'd prefer to let a really smart team of engineers set those tools up for me. I do not, however, assume that anybody that doesn't use an IDE must be inferior.
Is Visual Studio going to be in use 500 years from now? Light Table? No.
The people who made Visual Studio have not made something that is smarter for everyone's work. Maybe your work just doesn't require any customization. That doesn't mean nobody should ever want customization.
> That doesn't mean nobody should ever want customization.
I fail to see where I made that claim. IDEs are also customizable - I would argue that they are, in fact, much more customizable than either vim or emacs, simply because of the breadth of features one may customize. I was addressing the parent's claim that "great programmers use vim or emacs."
If you run "stty -ixon", it will prevent Ctrl-S from pausing the terminal. I understand that feature in the days of slow connections, but it feels rather silly at this point.
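For anyone bitten by this: a frozen terminal can usually be resumed with Ctrl-Q, and the flow-control feature can be disabled persistently by putting the command in your shell startup file. A minimal sketch (the choice of `~/.bashrc` is an assumption; use whichever rc file your shell reads):

```shell
# Disable XON/XOFF software flow control so Ctrl-S no longer
# freezes the terminal (and Ctrl-S becomes available for rebinding).
# Until you do this, Ctrl-Q un-freezes a terminal paused by Ctrl-S.
# Add to ~/.bashrc (or your shell's equivalent):
stty -ixon
```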
That is fascinating to me, in a train-wreck sort of way.
We had a discussion a few days ago about the ways in which some interfaces (command line in particular) can be user-hostile. (https://news.ycombinator.com/item?id=9831429) Vim's Ctrl-S appears to be a function, keyboard-adjacent to several commonly-used functions, whose main effect for many users is "cause the program to fail immediately with no indication of how to fix it." I don't think I could make up a better example of user-hostile design if I tried.
Ah, my mistake. Same criticism applies, though; possibly more so, as a terminal emulator running within a GUI would find it even easier to display a useful status message or similar.
I've used vim pretty much every work day in the last 15 years or so, and I can count the number of times it's crashed on one hand. If you can actually verify that it crashed (i.e. core dump file), then you've got the source available, and you can send in the patch.
Isn't complaining that (INT_MAX + 1) is undefined the same as complaining that (8/0) is undefined? What meaningful functionality could be gained by defining overflow behavior? If you really want to know what INT_MAX + 1 is, try:
I'm only speaking for my personal preferences here.
My two problems with undefined behavior are that 1) it is not always obvious that your program is guilty of undefined behavior and 2) when a C program contains undefined behavior the compiler can generate surprising machine code.
I don't have any problem with 8/0 being undefined mathematically, but I do have a problem with a C compiler generating code that will delete my hard drive if it ever occurs (an unlikely but theoretically possible outcome for a conforming C compiler).
INT_MAX + 1 is problematic because most people expect wrap around (and in fact they may be using a compiler that does this). It's more problematic because UINT_MAX + 1 does get wrap around. And finally there's a bunch of difficult to remember auto integer conversion rules. Taken together it's not surprising that many people will find it difficult to determine if their integer usage is correct.
One solution is of course to use a safer language than C (which, unfortunately, is where I keep ending up). But I personally object to the idea that you have to have undefined behavior in order to get the performance of C (I'm waiting to see how the development of languages like Rust goes to determine if I'm right).
There exists an enormous body of reliable and fast code written in "C" (e.g. the Linux kernel). I've been coding in C for 27 years, C++ for 20. I think the idea that a language can permit only useful work to occur is naive. In order to code well, the programmer must understand how the computer works, and have a great deal of focus. If you have those, "helpful" languages only seem to get in the way.
I think a good middle-ground would be to have more narrowly scoped behavior. Not quite implementation-defined (as I believe that requires the implementation to be consistent in its treatment?) but for instance "Integer overflow will result in the integer taking an unspecified (and potentially changeable) valid value for that integer". I'm not sure whether that particular example sufficiently covers the requirements of the optimiser, but defining it as "May cause any of the following results, but not nasal demons" would perhaps be an improvement?
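For what it's worth, mainstream compilers already offer opt-in versions of roughly this middle ground; whether they satisfy the optimiser's requirements in general is a separate question. A sketch of the relevant flags (GCC/Clang; `overflow.c` is a placeholder filename):

```shell
# -fwrapv   : signed integer overflow is defined to wrap (two's complement)
# -ftrapv   : signed integer overflow aborts the program instead
# -fsanitize=signed-integer-overflow : diagnose overflow at runtime (UBSan)
gcc -fwrapv -O2 overflow.c -o overflow
```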
DISCLAIMER: I haven't done any C programming in years, and I wasn't particularly good at it, even then.