Author of the blog post here. It's amusing to see this submitted by Don Stewart, since it was his explanation[0] of algebraic data types that first made them "click" for me! I think I forgot to mention him in this post, but I remembered in the talk[1].
If anyone's interested in this and where else it might go, I recommend reading Don's SO answer[0], this SO question and its related answers[2], and these papers[3,4] on derivatives and dissections by Conor McBride. There are also several good blog posts[5] and a paper[6] by Brent Yorgey on the topic of combinatorial species, and of course Dan Piponi's blog[7] is a treasure trove of interesting insights at the intersection of math, physics and computer science.
Edit: It should go without saying, but because a few people last time around seemed to think I was trying to pass this off as my own ideas, let me state that there is nothing original in this blog post. It's a repackaging of the ideas and work of lots of other people, all of whom are way smarter than I am.
I don't mean to be negative, and it's probably the material as much as you, but as I read the article I find a voice repeating "what is the point, where is this leading, why, why?". My impression is: here is verbose syntax that does describe types, but whose relation to the tasks of an ordinary programming language, to getting things done, is left unexplained for longer than my attention span, if it is explained at all.
And I'm fairly mathematically sophisticated (MA a while back with some effort to keep current).
It seems like the constructions "throw on the floor" everything that happens in the creation of a type, but I can't understand what that accomplishes except making simple operations look like a giant mess.
I would add that the Stack Overflow answer mentions that the construction is a way to build an algebraic type with "polymorphic parameterization". I.e., Haskell uses the laws of algebra and some supplied primitives (AddL and AddR, which you mention but don't define!) to calculate A + B. Perhaps if you made that explicit, the article wouldn't have the "floating in the clouds and never coming down" quality that it has for me now.
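(For anyone else missing the context: going by the linked SO question, AddL and AddR appear to be the constructors of a binary sum type, something like the sketch below. That's my reading, not a definition taken from the article.)

    -- Presumably a binary sum type: a value of Add a b is either
    -- an a or a b, so the number of values is |a| + |b|.
    data Add a b = AddL a | AddR b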
The most obvious use for derivative types is the automated creation of "zipper" types for functional data structures. Among other uses, zipper types make certain purely functional data structures much more efficient. This is important both for Haskell and ML programmers, and also in situations where you need to leave older versions of a data structure intact, such as when implementing rollback or creating append-only file formats.
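To make that concrete, here's a minimal sketch (my own names, nothing canonical). The derivative of L(a) = 1/(1-a) is 1/(1-a)^2, a pair of lists, which is exactly the one-hole context of a list; pair it with an element to fill the hole and you get the classic list zipper:

    -- Elements before the focus (reversed), the focus, elements after.
    data Zipper a = Zipper [a] a [a]

    -- O(1) edit at the focus, however deep in the list we are.
    set :: a -> Zipper a -> Zipper a
    set x (Zipper before _ after) = Zipper before x after

    -- Move the focus one step right, if possible.
    right :: Zipper a -> Maybe (Zipper a)
    right (Zipper before x (y:after)) = Just (Zipper (x:before) y after)
    right _                           = Nothing

The efficiency win is that edits and local moves near the focus don't rebuild the whole structure, which matters when older versions must stay intact.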
Aside from the practical interest in defining zippers as noted in another response, isn't it enough that there be this interesting and (by many, anyway) unsuspected correspondence, one that holds fairly deep down? I mean---taking the Taylor series of a type! After all, we are told that "anything that gratifies one's intellectual curiosity" is suitable for an HN submission, no?
Well, the correspondence sounds interesting, but if it remains at the level of the unexplained, it is hard to see it really being interesting.
If you define a function, call it a "type", and then take the Taylor series of that function, how mind-blowing is that really?
My point is that without a motivation for these constructions, they are just constructions. It may be that everyone in the know already understands the motivation, and knows why this thing labeled a type really has a "deep" relation to an abstract type-thing. Fair enough, but I'm just saying that if one omits this motivation, your whole exercise doesn't look interesting to those not in the know, OK?
Yes, this is a great article series, thank you for it!! (I only had a vague intuition of why Either is a sum type or why pairs are a product type.)
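(To spell out the counting for anyone else who was fuzzy on it, a throwaway example of my own: a sum type gets one value from each side, a product type one per combination.)

    -- All 2 + 1 = 3 values of a sum type.
    sums :: [Either Bool ()]
    sums = [Left False, Left True, Right ()]

    -- All 2 * 2 = 4 values of a product type.
    products :: [(Bool, Bool)]
    products = [(x, y) | x <- [False, True], y <- [False, True]]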
If you find this way of looking at algebraic data types strange, or want to understand why it is sound, pick up a textbook on analytic combinatorics (a great one is free [1]), because the parallels are very close. (In analytic combinatorics, the goal is to count the objects in various combinatorial classes.)
Part 2 of the linked-to article, for example, shows that the list data type
data List a = Nil | Cons a (List a)
can be mapped to the algebraic form
L(a) = 1 + a * L(a),
having the solution
L(a) = 1 / (1 - a).
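Expanding that as a geometric series makes the correspondence concrete: L(a) = 1 + a + a^2 + a^3 + ..., i.e. a list is empty, or has one element, or two, and so on, with a^n standing for the n-tuples of values of type a.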
In analytic combinatorics, the sequence operator F(a), representing sequences of the underlying class a, is given by
F(a) = 1 / (1 - f(a))
where f(a) is the ordinary generating function (OGF) representing the class a. It's basically the same as the list data type's representation, which is what we ought to expect since a list is just a sequence.
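For a concrete count: a class of two atoms (think Bool) has OGF 2z, so the sequence OGF is 1/(1 - 2z) = 1 + 2z + 4z^2 + 8z^3 + ..., whose coefficient of z^n correctly counts the 2^n lists of length n.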
The topic is incredible. Basically all the math you learn in Undergraduate mathematics gets pulled in to solve counting problems and perform algorithm analysis. It's inspiring. :)
For me, this was a great article. I deepened my admittedly limited understanding of algebra, and picked up a little of the flavor of Haskell in the process.
"Every little bit helps," said the old lady as she pissed in the ocean. I now own a little less dumbass.
The idea that types are the same because the cardinalities of their sets of values are the same is weird. So my type for the days of the week is "equal" to my enumeration of the Seven Dwarves, and the sum of Maybe Bool and Bool->Bool? How is that a useful definition?
They aren’t the same, they’re isomorphic—there exists some isomorphism between them, which is to say a bijective homomorphism, which is to say a one-to-one mapping between two algebraic structures, which is to say types. Category theory is too damn abstract.
In any case, having isomorphic structures means that any transformation you apply to one can be applied to the other by way of a particular isomorphism. There is a “next” day of the week, in the same way that there is a “next” Dwarf by order of introduction[1], in the same way that there is a “next” value in this partial ordering I just made up for (Either (Maybe Bool) (Bool -> Bool)) which you can’t program in Haskell:
Left Nothing < Left (Just False) < Left (Just True) < Right (const False) < Right id < Right not < Right (const True)
[1]: Doc, Bashful, Sleepy, Sneezy, Happy, Dopey, and Grumpy.
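To make the transport-along-an-isomorphism point concrete, a quick sketch (my own pairing of days to Dwarves, nothing canonical):

    data Day = Mon | Tue | Wed | Thu | Fri | Sat | Sun
      deriving (Show, Eq, Enum, Bounded)

    data Dwarf = Doc | Bashful | Sleepy | Sneezy | Happy | Dopey | Grumpy
      deriving (Show, Eq, Enum, Bounded)

    -- One isomorphism out of the 7! possible: match values by position.
    toDwarf :: Day -> Dwarf
    toDwarf = toEnum . fromEnum

    fromDwarf :: Dwarf -> Day
    fromDwarf = toEnum . fromEnum

    -- Define "next" once, on Day...
    nextDay :: Day -> Day
    nextDay d = toEnum ((fromEnum d + 1) `mod` 7)

    -- ...and it comes for free on Dwarf, transported along the isomorphism.
    nextDwarf :: Dwarf -> Dwarf
    nextDwarf = toDwarf . nextDay . fromDwarf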
[0] http://stackoverflow.com/a/5917133/546084
[1] http://www.youtube.com/watch?v=YScIPA8RbVE (shitty audio for the first minute or 2 - it gets better).
[2] http://stackoverflow.com/questions/9190352/abusing-the-algeb...
[3] http://strictlypositive.org/diff.pdf
[4] https://personal.cis.strath.ac.uk/conor.mcbride/Dissect.pdf
[5] http://byorgey.wordpress.com/
[6] http://www.cis.upenn.edu/~byorgey/papers/species-pearl.pdf
[7] http://blog.sigfpe.com/2010/08/divided-differences-and-tomog...