originally posted by Jonathan Loesberg:
fb,
I expect I got my misbegotten notions about mathematical language from Leibniz, further confirmed by a number of other philosophers. I'd be happy to find out I'm wrong, but neither your rant (which I can't decipher), nor the site you refer to, which is about the understanding of words for math numerals, gets anywhere near to making things clear to me. Here is the usual issue. The following sentence (and I use the word advisedly):
2 + 2 = 4
can be understood in that form by native speakers of any number of different languages who have been instructed in the language of math equations. It clearly isn't "parasitic" on any given spoken language, nor does it come from any of them. And this is true regardless of whether it was originally invented by a native speaker of some given language. It is its own language. Now it may be the case that we learn its meaning, each through our own language, in various ways that may make our translations go back through, for me, for instance, an English sentence such as:
two plus two equals four.
But that hardly shows that the sentence 2 + 2 = 4 is parasitic on the English sentence.
this is a half-assed reply. i apologize. it's late here and I'm running out the door.
but if you go back to the paper i pointed you to and read the brief section on how we actually come to learn the meaning of 2 and 4: it requires not only a language, but a language that is organized in such a way that it allows the speakers of a community to situate the relationship between "two" and two in a given system of sound / meaning discriminations. if the speakers of a language can't rouse themselves to do that, then any talk of math is off the menu. if they can organize themselves to do that (because they have a speech system and have developed the notions of orthography that allow them to situate all the math talk they want to devise in a system of conventions that can support it), then they are free to proceed.
give them long enough, and they might even start to come to grips with probability. might.
they might even proceed to develop a system of conventions that they share with other languages, as indeed we do with math, html and porn site search queries. and while you are right that math isn't "parasitic" on any given spoken language, and nor does it come from any one of them -- it takes its characters from the middle east, its logic from the anglo saxons, and those few conventions that seek to understand probability from a rag bag of card sharps, religious fanatics and nerdy geniuses -- but this simply underlines the point that a language is what a community makes it. in the case of math, what this tells us is that for a long time, mathematicians have preferred to talk to each other, whatever their native language. but this only means that math is the language of mathematicians, just as there are some people whose understanding of english sort of approximates one another closely enough that we can call them a community, and their shared tongue a language.
the residue to all this blurb is simply that if you think that math exists independently of a community that agrees on mathematical usage conventions, you are far closer to the greeks than any eductaterized citizen of the 21st century ought to be (and hey, we now know what happens to the greeks in act VII, and it ain't pretty).
a final consideration: lots of people who don't use math in their everyday lives have a very distorted idea about the content of mathematical notation (viz. otto's story about the precision of math above). rather than wax philosophical, i'll simply point you at two of the most illustrative counter-examples to this that i personally know of (given the denizens of the disorderly, i have no doubt that we will soon have a whole database of this shit).
from my limited perspective, two of the more interesting bits of math from the last century are shannon's definition of the entropy in a signal, and rescorla & wagner's definition of a simple error-based learning rule that could capture a huge array of behavioral findings in the animal and human literature.
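and to show how little is actually on the page, here is shannon's definition of entropy, H = -Σ p log2(p), in a few lines of python. a minimal sketch, not anyone's canonical implementation; the function name and the bits-per-symbol convention are my choices:

```python
import math

def entropy(probs):
    """shannon entropy in bits: H = -sum(p * log2(p)), over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# a fair coin carries exactly one bit per flip:
print(entropy([0.5, 0.5]))   # 1.0
# a biased coin is more predictable, so each flip tells you less:
print(entropy([0.9, 0.1]))   # about 0.47 bits
```

that's the whole definition. everything else -- channels, codes, redundancy -- is built on top of it, and that gap between the code and its interpretation is exactly where people go wrong.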
these are two of the very coolest ideas of the last century: shannon basically invented the information age -- along with card counting -- and though you may not realize it, pretty much every aspect of the electronic interaction we are having is being facilitated by shannon's equations; rescorla & wagner opened up a tractable way of thinking about learning that ultimately has its echoes in every successful engagement with the interweb that you have. they also share an interesting trait: outside of a very small field of specialists, pretty much everything you read about information theory (shannon's baby) and learning theory (rescorla & wagner's) is based on such a catastrophic misunderstanding of the math as to be the opposite of informative: put simply, the vast majority of mathematically literate people who look at the shannon equations and the rescorla & wagner equations fuck.it.up.
and they don't fuck.it.up. because they don't know math. they fuck.it.up. because the semantics of mathematical equations are much like the semantics of language: the code is an abstraction, and your understanding depends on the interpretative knowledge that you bring to it. (in a very real sense, you can only understand something when you are on the cusp of being able to generate it yourself).
yeah yeah, it's a nice story. but why should you believe my claim that everyone gets the semantics of the equations wrong? well, hopefully here's one reason why you should. shannon and rescorla were both so totally annoyed by the wankers jizzing all over their intellectual legacy that they wrote about it. their papers are not subtle.
shannon's was written in 56 (less than a decade after he published "a mathematical theory of communication"). its title is, unambiguously, "the bandwagon," and in it, he laments the widespread misapplication of his math. rescorla's paper was written nearly 20 years after the first article, but its message is the same. it is called "pavlovian conditioning: it's not what you think it is," and it systematically takes down the ridiculous nonsense taught about learning in textbooks (here's a neat twist: claims about what this misunderstood conception of learning can and can't do dominate modern-day linguistics, and are used to justify phlogistonic horseshit like phonemes, morphemes and the like...)
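and again, the rule that gets mangled is tiny. the rescorla & wagner update says that learning is driven by prediction error: ΔV = α(λ - V) for a single cue. a minimal single-cue sketch in python (the parameter names `alpha` and `lam` follow the usual α and λ notation; the function itself is my illustration, not theirs):

```python
def rescorla_wagner(trials, alpha=0.1, lam=1.0):
    """single-cue rescorla-wagner: on each trial, V moves toward the outcome
    in proportion to the prediction error (lam - V) when reinforced, (0 - V) when not."""
    v = 0.0
    history = []
    for reinforced in trials:
        target = lam if reinforced else 0.0
        v += alpha * (target - v)   # error-driven update: learn only from surprise
        history.append(v)
    return history

# consistent pairing: associative strength climbs toward lam, fast at first,
# then slower as the outcome becomes less surprising
acquisition = rescorla_wagner([True] * 50)
```

the point being: nothing in that update rule says anything about "association by contiguity" or the other textbook caricatures rescorla was complaining about. the learning is driven entirely by the error term.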
so where are we? is math parasitic on speech? if you can't get to math without speech (and perhaps, even orthography) -- and if speech and linguistic convention learning are a necessary precondition to math -- i'd say yes. i'd also add that i'd rather take my theory of math from shannon than derrida, because i trust that shannon actually knew what he was talking about. your mileage may differ. but if so, why?
as for sign languages, it gets harder and more speculative. but there are good reasons to believe that speech is a better medium for getting language off the ground than gesture (for one, the dimensional properties of our auditory cortex are far more conducive to the learning of low dimensional symbols than those of the visual cortex, which is why you never see anything as dramatic as the loss of l2 sound contrast discriminations in vision). as ever the story is complex and empirical, but hey, when you build math models of this shit, you get predictions, and the predictions turn out to be right, which always feels like understanding (to me, anyway).
my phone says i'm keeping people waiting.
fb.