I have the exact opposite experience. For me, part of the problem is that in math, you have options. If you forget the quadratic formula, you can:
- Re-derive it by completing the square on the general quadratic
- Solve a specific quadratic you're interested in by factoring or graphing
- Try to find zeros with numerical methods
The point is, everything's connected to everything else, so it doesn't matter if you forget any part of it as long as you remember some critical mass.
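For instance, the first option takes only a few lines; the formula falls out of nothing but algebra you already know:

```latex
\begin{align*}
ax^2 + bx + c &= 0, \qquad a \neq 0 \\
x^2 + \tfrac{b}{a}x &= -\tfrac{c}{a} \\
\left(x + \tfrac{b}{2a}\right)^2 &= \tfrac{b^2 - 4ac}{4a^2} \\
x &= \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
\end{align*}
```

Forget the formula and you can rebuild it; forget a Latin conjugation and there's nothing to rebuild it from.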
To be fully fluent in a programming language, you have to memorize two or three dozen keywords and a handful of operators (mostly the same as standard math). To be fluent in a natural language, you have to memorize two or three dozen words a week for years. Not to mention understanding all the insane grammatical stuff.
With human languages, sure, it's easy enough to remember that "nec-" means "dead" (especially if you consume fantasy novels / games where necromancy is a school of magic), but you're still totally screwed if you can't remember whether the word you need is "necare" or "necire", "necobitis" or "necabitis" or "necibetas". There's no rhyme or reason to any of it; your knowledge won't help you figure out whether that letter's supposed to be an "o" or an "i", or the difference between past tense and future tense and perfect tense and imperfect tense and tensile strength and gaaah, why is this all so complicated!?
It's like if a programming language had 20 different versions of each keyword, all spelled one or two letters apart, usually differing only in the vowels. They all compile; getting the wrong one just creates subtle semantic bugs at runtime. And there's a maze of rules telling you which version of the keyword to use depending on whether you're inside a loop body, a function body, or a conditional body. Oh, and there are six different versions of the "if" keyword, all two letters long, so good luck getting the right one of those. And if your code is going to execute two or more times, all the names of your variables suddenly change. And when you refer to a struct outside the function that created it, all the field names are transformed. And whoever wrote the compiler put in ten different special cases for transforming your variable names, just because they could.
That is such an interesting and...low-level way to think of it.
To me it feels a little deeper than just there being a "maze of rules". Your earlier mention of "everything's connected to everything else" resonated with me more.
In a sense, the expressive power of programming languages, and mathematical notation, feels very small. I liked the board game metaphor brought up in the top comment: programming languages and mathematical notation feel like mere arrangements of board game pieces (drawn from an infinite box, with "context-free" rules for how they can be arranged). They don't have meaning except what we impose (ideally, assisted by comments).
A metaphor I brought up in another comment is that it feels "easy" to systematically translate an arbitrary program into an equivalent diagram, such that the diagram would contain 100% of the operational information in the program, and could be runnable as-is. (Would be cumbersome to input into the computer, ofc.) Whereas the idea of systematically translating an arbitrary natural sentence into a diagram just seems...nonsensical to me. I wouldn't even know where to begin.
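The program-to-diagram direction really is mechanical: every compiler does it when parsing source into a syntax tree, which is already a diagram (nodes and edges) carrying all of the operational information. As a minimal sketch (in Python, purely for illustration), the standard `ast` module will show you the tree for a one-line program:

```python
import ast

# A tiny program. Parsing it yields a tree whose nodes and edges could be
# drawn as a diagram and, walked in order, re-executed as-is.
source = "total = price * quantity + tax"

tree = ast.parse(source)

# dump() prints the tree structure: an Assign node whose value is a BinOp
# (Add) combining another BinOp (Mult) with the name "tax".
print(ast.dump(tree, indent=2))
```

Each parent-child edge here is unambiguous, which is exactly what makes the translation systematic; nobody has ever produced anything comparably mechanical for arbitrary English sentences.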
> Whereas the idea of systematically translating an arbitrary natural sentence into a diagram just seems...nonsensical to me. I wouldn't even know where to begin.
We actually, literally, did this in high school. It was very informative for me.
This is a really interesting thought. Would it surprise you if I said that, for me, it works the opposite way? I first think in terms of diagrams and then construct sentences from them, whether in a programming language, a natural language, or math.