> [P]eople are taught to think in strings[, b]ut what people actually need are grammars.
I don’t actually disagree with you for the most part, but I feel that an important caveat has gone unacknowledged.
Grammar formalisms have the same weakness compared to dealing with raw strings as sound static type systems do compared with dynamic typing: there are small, mostly isolated islands of feasibility in a sea of intractable (often undecidable) generality, and if your problem doesn’t fit inside those borders things start to get nasty (cf. how even GCC’s handwritten recursive-descent parser didn’t get its lexer-hack interactions correct in all cases[1]).
I still agree that we spend criminally little time on syntax. Starting with the simplest cases: with how much time is spent in school on “order of operations” you’d think we could take a moment to draw[2] a damn syntax tree! But nooo. There are in fact working mathematicians who don’t know what that is. (On the other hand, there are mathematicians who can explain that, in a sense, the core of Gödel’s incompleteness is not being able to reason about arithmetic—it’s being able to reason about CONS[3], which arithmetic happens to be able to very awkwardly do.)
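To make that concrete, here’s the kind of tree I mean (a throwaway sketch; the type and the names are mine, purely illustrative): “order of operations” is really a statement about tree shape, not about the order you push buttons in.

```typescript
// A minimal expression-tree sketch. "*" binding tighter than "+"
// just means it sits lower in the tree.
type Expr =
  | { kind: "num"; value: number }
  | { kind: "op"; op: "+" | "*"; left: Expr; right: Expr };

// 2 + 3 * 4 parses as (+ 2 (* 3 4)):
const tree: Expr = {
  kind: "op", op: "+",
  left: { kind: "num", value: 2 },
  right: {
    kind: "op", op: "*",
    left: { kind: "num", value: 3 },
    right: { kind: "num", value: 4 },
  },
};

function evaluate(e: Expr): number {
  if (e.kind === "num") return e.value;
  return e.op === "+"
    ? evaluate(e.left) + evaluate(e.right)
    : evaluate(e.left) * evaluate(e.right);
}

console.log(evaluate(tree)); // 14: the tree, not a mnemonic, decides
```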
I feel like nobody ever ends up having this discussion about JSON.
Generating JSON data using string interpolation or templating is clearly wildly insane, right? You don’t do it.
Maybe for some config file generation scenarios you might just run a template JSON file through a token substitution or env var interpolation or something. But you’d feel bad about it, because it’s so easy to NOT do it that way. And even then you’re not interpolating in JSON fragments like ‘“age”: 25’ - you’d have the decency to only interpolate in values like ‘25’.
In the node ecosystem it’s so easy to switch from a .json file to a .js file, too, if you want to build the json dynamically.
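To be concrete, the sane version is a few lines (field names invented for illustration):

```typescript
// Build the data as a real structure; the serializer owns the syntax.
const user = { name: 'Conan O\'Brien, "the quotable"', age: 25 };

// JSON.stringify handles quoting, escaping, and nesting, so the output
// is valid JSON no matter what characters the values contain.
console.log(JSON.stringify(user, null, 2));

// The .js escape hatch mentioned above: a module can compute its values
// dynamically and still end up as plain JSON at the end.
const config = { generatedAt: new Date().toISOString(), retries: 3 };
console.log(JSON.stringify(config));
```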
For some reason people feel more willing to attempt it with YAML. And then regret it when they realize how significant indentation has screwed them.
And then with HTML people just give up and go ‘yup, it’s all text, even the angle brackets’
> Generating JSON data using string interpolation or templating is clearly wildly insane, right? You don’t do it.
I'm sorry to report that I've seen a lot of JSON generated by string concatenation and templating, in different projects.
Often using 'printf' or 'echo' in various languages. Sometimes using whatever's used for HTML string templating if the JSON is embedded in HTML or served as a resource similar to HTML.
Yes, it's horrible and breaks if fed variable values that contain characters like quotation marks. People do it anyway.
Even in languages that have perfectly good data structures and JSON libraries.
I've seen a fair amount of parsing values out of JSON using regexes too, assuming specific formatting of the supplied JSON.
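For anyone who hasn't been bitten yet, the failure mode in miniature (values invented, obviously):

```typescript
// String-templated "JSON" silently breaks on ordinary data.
const name = 'O"Brien';               // a perfectly normal value with a quote in it
const bad = `{"name": "${name}"}`;    // produces {"name": "O"Brien"}, not valid JSON
// JSON.parse(bad) throws a SyntaxError.

// The same value through a real serializer round-trips fine:
const good = JSON.stringify({ name }); // {"name":"O\"Brien"}
console.log(JSON.parse(good).name);    // O"Brien
```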
Well yeah! Of course people do that because JSON is just text!
It's less common, because JSON is simpler, so the tradeoff point for using a grammar is lower, but it still makes sense in things like shell scripts, and other cases where the equivalent of `print(obj)` (or `eval(totally_not_rce)`, but let's pretend that's not available anyway) doesn't happen to produce (or consume) valid JSON by coincidence.
String munging is a general-purpose solution that can be adapted to pretty much any text-based format just by looking at examples, without having to cross-reference an external specification (which the thing you're feeding input to or pulling output from may not even implement correctly anyway), so obviously people do that!
See also various discussions under the heading "Worse is Better". Whether it's the right thing or the wrong thing, it very clearly is a thing.
This file generates a feed of events (rehearsals for my high school play) to be rendered by the FullCalendar JS plugin. FullCalendar required a particular data schema that didn't match the format of my MySQL table, which meant I couldn't just json_encode() the MySQL results. I guess I just didn't conceptualize that I could create a new object that matched the FullCalendar format, and then call json_encode(). So, I generated JSON with strings.
Honestly it's a toss-up whether the JSON generation is the worst thing about this file. It looks like I also made a separate database query for every single row to get the username, because I apparently didn't know how to do joins. Could probably spend an hour listing some of the other little nuggets of awful in there. But hey, it got the job done! :)
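In hindsight the fix was only a few lines, something like this (sketched in TypeScript rather than the original PHP, with invented row field names; title/start/end are FullCalendar's basic event fields):

```typescript
// Map rows from the database shape into the shape the consumer wants,
// then serialize. Never hand-assemble the JSON text.
interface RehearsalRow {
  event_name: string;
  starts_at: string;
  ends_at: string;
}

function toFullCalendarFeed(rows: RehearsalRow[]): string {
  const events = rows.map((r) => ({
    title: r.event_name,
    start: r.starts_at,
    end: r.ends_at,
  }));
  return JSON.stringify(events); // the serializer does all the JSON work
}
```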
It knows when to do escaping and how. It can also detect, though only dynamically, when fragments have been combined into an illegal sequence that would be rejected by the full grammar. It cannot, however, guarantee that the result will parse; it can only report that it did not detect a failure.
Was that a question? In that case, now I have :) The authors include both of the Eelcos I’ve ever heard of, no less!
The ugly quasiquoting seems unfortunate (I’ve a half-serious suspicion the reason Template Haskell never got popular is that it looks so bad), and the GLR sledgehammer precludes ever having a lightweight implementation, but otherwise it seems like an interesting entry in the extensible languages story.
This ironically brings us back to the point of the original article: if spending that much time teaching people to do it right didn’t help, spending even more time doing it the same way is hardly going to help.
Also, respectfully, it doesn’t matter. Not having learned maths in English, I don’t know the mnemonic, I don’t care to know it, and I find even the concept of it completely asinine. (For eighteenth-century mathematicians, addition and subtraction bound tighter than multiplication and division, and they could calculate perfectly fine.) You can look up the precedence table if you need to—as long as you understand the idea of precedence (and not order of operations, for goodness’ sake). You won’t then be able to calculate fluently, but fluency is a different problem with a tedious and time-consuming solution, and given the time crunch I’d rather talk about some actual Maths as She Is Spoke instead.
It's the more ambiguous ones that get people arguing about how PEDMAS should be interpreted. Your example is unambiguously -4. But consider:
6 / 2 (1 + 2) = ?
I approach the problem the same as I would 6 / 2(x + y). When the multiplication sign is missing, 2(x + y) is a single term. The implicit multiplication is part of the parenthetical group and reduces the problem to 6 / 6. People who argue that you have to strictly apply PEDMAS left to right will divide 6 / 2 first and get 9.
Neither way is wrong as long as you can explain the process, but everyone wants to argue and have there be a single answer.
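Spelled out as the two possible groupings (this is just my rendering of both camps, not a ruling):

```typescript
// The dispute is purely about which tree "6 / 2 (1 + 2)" denotes.
const juxtapositionBindsTighter = 6 / (2 * (1 + 2)); // 6 / 6 = 1
const strictLeftToRight = (6 / 2) * (1 + 2);         // 3 * 3 = 9
console.log(juxtapositionBindsTighter, strictLeftToRight); // 1 9
```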
I’ll stick my chin out, and claim that nobody who knows anything about math would ever write the expression like that. Either use parentheses or a long, horizontal division bar making it obvious what groups with what. As given, it looks like some small-minded school teacher (I’m not saying all teachers are small-minded, just a few of them!) who has taught a set of rules, with little regard for actual practice outside the classroom, and then tests to exactly those rules.
When calculating by hand, it’s useful to have multiplication × (not dot, if you value your sanity at all) binding as tight as division / and multiplication-by-juxtaposition binding tighter than that. On the other hand, this only comes in handy when your intermediate results are so large that one or two levels of (unambiguous) fractions still aren’t enough, and if at all possible you shouldn’t be communicating results that unwieldy. If you really need to, don’t confuse your readers and use some bloody parens. (You’ll probably need more than one kind.)
I see it as more important to realize that not all questions are well formed enough to have a right solution.
A test question like "6 / 2 (1 + 2) = ?" is not asking for the mathematical meaning of those symbols; it is asking "guess what I was thinking when I wrote this".
(Unless it is a programming class and you are learning how the compiler reads your code)
Are schools failing to teach it, or do some lessons fail to stick through the decades, depending on how much math you’re doing? Schools have no control over what your older relatives have been up to.
Today’s grumpy Facebook posters would be the same kids doing “New Math” in the 1960s: https://en.wikipedia.org/wiki/New_Math - just something to consider
My school taught PEMDAS but failed to teach that multiplication and division are equal in priority, or that ambiguity is resolved by working left to right.
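For example, here is that convention as most programming languages implement it:

```typescript
// "/" and "*" have equal precedence and associate left to right,
// so 8 / 4 * 2 is (8 / 4) * 2, not 8 / (4 * 2).
console.log(8 / 4 * 2);   // 4
console.log(8 / (4 * 2)); // 1, only with explicit parentheses
```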
In math notation, division is indicated using fractions, which removes the ambiguity. The idea of equal priority of multiplication and division is a programming thing.
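Concretely, written with fraction bars there is nothing left to argue about (my rendering of the two readings):

```latex
% The two readings are visually distinct once fraction bars carry the grouping:
\[ \frac{6}{2(1+2)} = 1 \qquad \text{versus} \qquad \frac{6}{2}\,(1+2) = 9 \]
```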
The thing schools don’t do a great job of is explaining when you transition from doing ‘arithmetic’ to doing ‘algebra’. Many of the symbols you use in arithmetic continue to be used in algebraic notation, but what they mean changes subtly. Arithmetic is a procedural activity - performing a series of operations to get to an answer. Algebra is a declarative activity - making truthful statements about the world.
For example, in arithmetic
x + y
means ‘add y to x’. But in algebra it means ‘the sum of x and y’. In arithmetic ‘=’ means ‘gives the result:’; in algebra it means ‘is equal to’.
The failure of teaching to explain that you’re moving on to use those symbols to do something fundamentally different is, I think, one of the things that leaves some kids behind and dooms them to always annoy their relatives in Facebook comment threads about operator precedence.
[1] https://gcc.gnu.org/bugzilla/show_bug.cgi?id=67784
[2] https://mlochbaum.github.io/BQN/tutorial/expression.html
[3] https://dx.doi.org/10.1215/00294527-2008-028