Most adults give up on maths around the level where the number of symbols starts to grow, or slightly before. To be clear, I'm not claiming notation is the only cause of this - it certainly isn't - but it does mean we really don't know whether most people would otherwise manage to pick maths up; personally I think notation is a major factor in making people struggle.
For my part, the notation was certainly part of it. It matters. I used to love maths, but I started finding it impossible to read even while the concepts themselves were still easy enough to understand. I'd write things out as programs instead, so I could follow the ideas without being hampered by the notation. I learned symbolic differentiation that way, for example.
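Not the program I wrote back then, of course - just a minimal sketch in Python of the kind of thing I mean: expressions as nested tuples, with the sum and product rules applied recursively.

    # Minimal sketch of symbolic differentiation over a tiny expression tree.
    # An expression is a number, a variable name, or ('+', a, b) / ('*', a, b).
    def diff(expr, var):
        if isinstance(expr, (int, float)):   # derivative of a constant is 0
            return 0
        if isinstance(expr, str):            # d var / d var = 1, any other variable is 0
            return 1 if expr == var else 0
        op, a, b = expr
        if op == '+':                        # sum rule
            return ('+', diff(a, var), diff(b, var))
        if op == '*':                        # product rule
            return ('+', ('*', diff(a, var), b), ('*', a, diff(b, var)))
        raise ValueError('unknown operator: ' + str(op))

    # d/dx of x*x + 3*x, printed as an unsimplified tree
    print(diff(('+', ('*', 'x', 'x'), ('*', 3, 'x')), 'x'))

Running it prints the (unsimplified) derivative tree for x*x + 3*x.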
But I quickly realised that this basically closed maths off to me as a viable subject, and I opted out of all optional maths courses other than boolean logic for my CS studies.
I get that for those who find mathematical notation easy to work with, it seems indispensable, but don't underestimate the number of people for whom the notation is the barrier that makes mathematics essentially inaccessible.
I've picked up quite a bit of maths since, but always by understanding the concepts through code rather than trying to parse mathematical notation.
I've had - and still have, to a degree - similar trouble understanding math notation. Some 9 years ago a friend, very much a mathematician, gave me a good piece of advice: at least try to pronounce the math expression aloud. That forces you to pay attention to each symbol, instead of trying to immediately spot word-like "phrases", failing at that, and losing the expression as a whole.
APLs are - in the style Arthur Whitney himself likes to employ - meant to be read symbol by symbol, since each symbol represents a whole operation. That's why APL programs are so dense - a lot of work can fit on one short line (though many APL writers prefer to keep lines short, to help with understanding). A screenful of APL can be a moderately sized program, with no need for scrolling... When you write J, you might be tempted to keep adding operations on the left (APL and J evaluate from right to left), until it's hard to "explain" what the whole expression does - because it does so much. The alternative - short, understandable expressions - brings another problem, that of naming :) - giving a traditional, long-worded name to an intermediate result doesn't fit well with the rest of the style... So a good sense of balance is very valuable.
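A rough illustration of that tradeoff - in Python rather than APL or J, with made-up data and names, but the shape of the problem is the same:

    # One dense chain that "does too much": it reads inside-out, the way a long
    # right-to-left J line stops being explainable in one breath.
    data = [3, 1, 4, 1, 5, 9, 2, 6]
    dense = sum(x * x for x in sorted(set(data))[:3])

    # The same work as short steps: easier to explain, but now every
    # intermediate result needs a long-worded name.
    distinct = set(data)
    smallest_three = sorted(distinct)[:3]
    named = sum(x * x for x in smallest_three)

Both compute the same number; the question is just where you pay - in deciphering the chain or in inventing the names.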
Another saying that may help with APL's mindset is that APLers spend 5 minutes writing a program, then spend the next hour writing the same program better and cleaner. That's what I understand as refactoring; the idea is to make the code more readable, the expression more obvious, more obviously correct, more reusable... Properly done, the expression is a pleasure to look at and well worth the effort of deciphering it symbol by symbol. Another great option is documentation - lots of comments, just as in good math texts, where the idea is explained in plain English and then succinctly put into code.
This way I can better see the mathematical background behind the problem. I certainly have an easier time changing something here and there - the changes required are so small. I also see similarities between different pieces of code when they are written as similar expressions close to each other, on the same screen. Yes, the notation is terse... but it has its good sides too.
Musicians learn to read music. Yes, it is hard to learn the notation first, before playing or listening - but that ordering is exactly how it has been taught badly. The same goes for mathematics. Learning mathematics by coding is perfectly fine, but when you move up to more abstract levels, you need abstract symbols. That can, and should, come later, I believe. Still, I would rather deal with C=2πr standing for the circumference of a circle than write it out longhand. Not to mention that diagrams help too, for understanding what a 'radius' or 'pi' is. The longhand version would be at least a paragraph; you just wouldn't write many pages of longhand to avoid symbols. Fear of mathematics is a result of poor teaching, not of symbols.
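To make the comparison concrete (my own longhand rendering, nothing official): the handful of symbols in

    C = 2πr

replace something like "the distance around a circle is equal to two times the ratio of any circle's circumference to its diameter, multiplied by the distance from the circle's centre to its edge" - and that is about the simplest formula there is.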