Was nodding along as I was reading this. I was recently given a paper and spoke with the engineer implementing it. The paper was incredibly dense and hard to parse, but through talking with the engineer and rewriting some terms to more common names, the math turned out to be quite simple. Echoing your sentiment, I wish more mathematicians would use simple terminology. My personal theory as to why this isn't done is the same reason overengineering happens: the writer tries to cover every base and ends up turning the hottest path into a jumbled mess.
My personal theory is that (notation+terms of art) is incredibly information dense even if inscrutable to outsiders.
What you wish for is more akin to coding like this:
declaring a function whose name is "max" and its arguments are "a" (of type number) and "b" (of type number) that returns a number:
statement: if a is greater than b, the function returns a
statement: the function returns b
But programmers don't bat an eye at {}[](),.!&^| (and I just realized I used the term "function", which outsiders might wish were replaced by simpler terminology!)
// This is more readable if you're "in the know"
// even if it looks like a jumbled mess to outsiders
fn max(a: num, b: num): num => a > b ? a : b
Math uses terms of art like "group", "field", "modulo" and "multiplicative inverse"; and notation like "∑"; because they are short and communicate very specific (and common) things, many of which are implicit and we probably wouldn't even notice.
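For instance, here is a rough sketch in TypeScript (my own toy example, not from any particular paper) of what the four words "multiplicative inverse of a modulo n" compress into a single noun phrase:

// Find the b in [1, n) such that (a * b) leaves remainder 1 when divided by n.
// That b is "the multiplicative inverse of a modulo n" (null if a and n share a factor).
function modInverse(a: number, n: number): number | null {
  for (let b = 1; b < n; b++) {
    if ((a * b) % n === 1) return b;
  }
  return null;
}

console.log(modInverse(3, 7)); // 5, because 3 * 5 = 15 = 2 * 7 + 1

The term of art is barely longer than the function name, and anyone who knows it also gets the bundled facts for free (an inverse exists exactly when a and n share no common factor, and it is unique modulo n).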
> Math uses terms of art like "group", "field", "modulo" and "multiplicative inverse"; and notation like "∑"; because they are short and communicate very specific (and common) things, many of which are implicit and we probably wouldn't even notice.
I don't have anything against introducing new words. If your concept can be adequately described by existing language, that seems like a good way to allow people to learn and talk about it. As someone who has studied philosophy, the Greek alphabet is technically no big hurdle to me either. But it is to others. Try googling some weird sign you found in a formula. First, you don't know what it is called or how to type it; second, any given sign might have been used in 100 different formulae, so even if you know how to search for it (there are applications people use to identify mathematical symbols), good luck finding any meaningful answer.
I know for mathematicians these signs are arbitrary and they would say you could just use emojis as well. But then it turns out mathematicians ascribe meaning to which alphabet they are using and whether it is upper- or lowercase. Except sometimes they will break that convention for what appears to be mostly historical reasons.
I know mathematicians get used to this just fine, but mathematical notation has incredibly bad UX, and the ideals embedded in it are more about density and opacity (only the genius mathematician knows what is going on) than about rigorous precision and understanding.
When I studied philosophy, there were philosophers like Hegel who had to expand the German language to express their new thoughts. And there were philosophers, who shall remain unnamed, who would use nearly unparseable, dense, and complex language to express trivial thoughts. The latter always felt like an attempt to paper over their own deficiencies with the convoluted language they had learned to express themselves in.
Mathematicians can also have a degree of the latter at times. If your notation is more complex than the problem it describes, your notation sucks, and you waste collective human potential by using it.