no free lunch

What seems like a complimentary act in a restaurant often has a profit-maximizing motive behind it. Three examples:

1- At a nice sushi restaurant, I was offered a free glass of chilled champagne before I even had a chance to sit down. Dry champagne is highly acidic and therefore quite appetite-stimulating. After drinking a glass of it, you end up ordering more food than you otherwise would have.

2- Many Turkish restaurants serve pickles before the meal. In fact, some of them keep pickles on their tables all the time. Pickles increase your appetite.

3- At an all-you-can-eat restaurant, right after we settled into our chairs, we were served four different varieties of bread and dips. The waiter explained to us how each of these breads was of a different cultural origin, and so on. Sophisticated? Yes. Interesting? No! They just wanted to get us full before we took a look at the buffet. Eating bread decreases your appetite. You end up eating less than you otherwise would have. That, of course, is in the interest of an all-you-can-eat restaurant.

thoughts on intuitionism

The following is a polished and expanded version of an email that I sent to Umut Eldem. You can read more about intuitionism here.

The faculties we consider "a priori" have developed out of a long string of interactions between a developing organism and its environment. In other words, there can be no a priori concept without environmental inputs prior to the full development of the faculty in question.

Space and time help us survive and reproduce. They are certainly based on some aspects of reality, but determining which aspects is beyond our reach. Why? Because the only way we can access reality is through these lenses.

Sometimes things go wrong in the developmental process. People end up having "distorted" senses of space and time. We call them mentally ill.

On the other hand, even cells have a sense of time. You do not need a huge, over-complicated brain to host a biological timekeeping mechanism. There are many metabolic processes in our body that operate on pulsations of such cellular origin.

Bacteria also have a sense of their surroundings. Through signalling mechanisms they can keep track of the local population of their kin. They can also sense food, light, etc. Moreover, they can skilfully navigate within the space relevant to their existence. Nevertheless, we do not consider bacteria to possess a sense of space and time. Why? Because our notions carry an implicit prejudice toward "awareness", whatever the hell that means. Space and time seem to be based on very basic and fundamental features of reality, so basic that even the simplest life forms possess mechanisms that exploit them. But these life forms are not "aware" that they are exploiting something. That capability belongs only to certain vertebrates like us.

Make no mistake: we do not really know what we are exploiting either. Our intuitive notions of space and time are based on the local environment that is relevant to our existence. They stop making any sense in environments that were irrelevant to our evolutionary history. Don't you get a funny feeling when a physicist utters sentences such as these:

"Expansion of the universe is stretching of the space-time fabric."

"The age of our universe is blablabla."

"Extremely massive objects tear the space-time fabric."

Bear in mind that there can be no non-classical mathematics. Yes, there is non-classical physics (e.g. quantum phenomena), but the tools that scientists use to explain these phenomena are grounded in classical mathematics. This should not be surprising. Explanation means that a phenomenon is narrated in a language that is accessible to us. And what is accessible to us is mathematics born out of notions of space and time, which were in turn born out of the part of physics that was relevant to our evolutionary history.

From these experiments it is seen that both matter and radiation possess a remarkable duality of character, as they sometimes exhibit the properties of waves, at other times those of particles. Now it is obvious that a thing cannot be a form of wave motion and composed of particles at the same time - the two concepts are too different... As a matter of fact, it is experimentally certain only that light sometimes behaves as if it possessed some of the attributes of a particle, but there is no experiment which shows that it possesses all the properties of a particle; similar statements hold for matter and wave motion. The solution of this difficulty is that the two mental pictures which experiments lead us to form - the one of particles, the other of waves - are both incomplete and have only the value of analogies which are accurate only in limiting cases... yet they may be justifiably used to describe things for which our language has no words. Light and matter are both single entities, and the apparent duality arises in the limitations of our language.

- Heisenberg as quoted on page 80 of Heisenberg by K. Camilleri

Evolution endowed us with intuition only for those aspects of physics that had survival value for our distant ancestors, such as the parabolic trajectories of flying rocks. Darwin's theory thus makes the testable prediction that whenever we look beyond the human scale, our evolved intuition should break down. We have repeatedly tested this prediction, and the results overwhelmingly support it: our intuition breaks down at high speeds, where time slows down; on small scales, where particles can be in two places at once; and at high temperatures, where colliding particles change identity. To me, an electron colliding with a positron and turning into a Z-boson feels about as intuitive as two colliding cars turning into a cruise ship. The point is that if we dismiss seemingly weird theories out of hand, we risk dismissing the correct theory of everything, whatever it may be. 

Tegmark - Mathematical Cosmos

What I am claiming is more radical than what Tegmark is saying. I am claiming that even the weird theories, such as those put forth by Tegmark, will never be sufficient, because that weirdness is weirdness of classical descent. What physicists call non-classical is a phenomenon that a) is observable by mental faculties whose development was moulded within classical physics, and b) takes place at scales irrelevant to the survival of those faculties. The weirdness stems from b), not from a).

Similar morphological structures, such as the eye, have emerged out of different evolutionary lineages, because we are all part of the same reality. (See biological structuralism.) For the purpose of its survival, an organism needs to make sense of its environment. That process involves certain simplifications and amplifications. Nevertheless, what is filtered through cannot be a hugely distorted version of reality, because an exceptionally delusional organism sooner or later dies.

In some sense, it is not surprising that physics was so successful at explaining classical phenomena. After all, this is what our faculties, from which our mathematical constructs and analogies stemmed, evolved for. (Of course, the fact that we could explain so much classical stuff using so few classical principles is still startling.)

Mathematics is effective in characterizing and making predictions about certain aspect of the real world as we experience it. We have evolved so that everyday cognition can, by and large, fit the world as we experience it. Mathematics is a systematic extension of the mechanisms of everyday cognition. Any fit between mathematics and the world is mediated by, and made possible by, human cognitive capacities. Any such "fit" occurs in the human mind, where we cognize both the world and mathematics.

- Lakoff & Nunez, Where Mathematics Comes From (Page 352)

I do not think that you need an in-built sense of time to be able to construct and make sense of mathematics. Whenever a time parameter is introduced in physics, its conceptual origin lies in geometric structures. A line is pictured as a continuum of points, and each point is pictured as an instant. In the discrete case, you don't even need the continuum construction.

On the other hand, the idea of space transformations has its origins in time perception. This origin is not immediately apparent since the idea itself is stated in a "timeless" language.

I think the fact that all of mathematics looks "timeless" has misled a lot of adventurous minds who now believe that the next big discovery in physics will involve the removal of time from our theories. That will not be possible. Again, this is simply because we can only explain what we experience.

The great irony in our discussion is that we forget that we are expressing ourselves in a language which is far more vague and ambiguous than what is being examined. Just as one cannot measure physical distances with apparatuses whose resolution is cruder than the distances themselves, one cannot hope to understand a language by examining it with an inferior language. Mathematics precedes English, and ideas precede all. (The fact that I can successfully convey the meaning of this sentence to you using English is just tantalizing.)

Kant's notion of the "a priori" has conceptual origins in time perception (i.e. something that precedes). This is why I do not think that the "a priori" is an enlightening concept. You cannot explain something using that very thing.

Let's now touch on the deepest and most fundamental of all our ideas, namely the idea of contradiction/non-contradiction. Well... this, too, is a fiction, although quite a useful one. (See this blogpost.)

Several specific points regarding Brouwer's Intuitionism and Formalism:

1) Brouwer's conception of science is out of date. The following extract sounds too simplistic after the discovery of quantum physics:

"And that man always and everywhere creates order in nature is due to the fact that he not only isolates the causal sequences of phenomena (i. e., he strives to keep them free from disturbing secondary phenomena) but also supplements them with phenomena caused by his own activity, thus making them of wider applicability. Among the latter phenomena the results of counting and measuring take so important a place, that a large number of the natural laws introduced by science treat only of the mutual relations between the results of counting and measuring."

2) Within ZFC (currently the most widely accepted set-theoretic axiom system), the Burali-Forti "paradox" is no longer a paradox, for the same reason that Russell's "paradox" is no longer a paradox. (See this Wikipedia article.)

3) Brouwer says:

"In this way the apriority of time does not only qualify the properties of arithmetic as synthetic a priori judgments, but it does the same for those of geometry, and not only for elementary two- and three-dimensional geometry, but for non-euclidean and n-dimensional geometries as well. For since Descartes we have learned to reduce all these geometries to arithmetic by means of the calculus of coordinates."

True. But the passage from geometry to arithmetic involves the introduction of the real numbers, namely the continuum. Depicting a line by an equation is possible only if you view the line as a set of points, and if you let the function admit real numbers as inputs. Since the continuum is a highly controversial object from the point of view of constructivism, I do not understand how Brouwer can be in favor of the apriority of time for the above-mentioned reason.

4) Brouwer says:

"But the most serious blow for the Kantian theory was the discovery of noneuclidean geometry, a consistent theory developed from a set of axioms differing from that of elementary geometry only in this respect that the parallel axiom was replaced by its negative."

No, it is not a blow to Kant. The so-called non-Euclidean geometries are built on hyperbolic or elliptic surfaces. In other words, they are embedded in the usual three-dimensional Euclidean space. Moreover, in these new geometries, the words "line" and "parallel" possess different meanings. People find it amazing when you tell them that there are geometries in which two parallel lines always meet. But if they bother to look up the precise mathematical definitions, they will discover that there is nothing special going on. The actual definition of two lines being parallel is very local in nature: two lines are called parallel if they share a common perpendicular. (Look at the first picture in this Wikipedia article.)

Does Brouwer really understand what Kant means by a priori? Even the most contrived geometry that we can think of is a product of our mental faculties. It is just the meaning of the word "geometry" that changes from theory to theory. There is still only "one" intuitive notion of space.

5) Brouwer says that he cannot embrace the first limit ordinal as an object. Infinity is a process, and only processes can embrace other processes. He views the number 5 as an object, but omega as an unfolding process.

The notion of a procedure is temporal in nature, and Brouwer openly favours the apriority of time perception over the apriority of space perception. I wonder whether he would have been better off manipulating Turing machines rather than playing with numbers. He could have treated the Turing machine that generates five consecutive 1s as the number five, and the Turing machine that prints 1s without halting as omega. Unlike the process that it is capable of unfolding, a Turing machine is a perfectly tangible object that can be embraced as a whole.
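
Here is a minimal sketch of that idea in Python (my own illustration, not anything Brouwer or Turing wrote; ordinary generator functions stand in for machine descriptions):

```python
from itertools import islice

# A "machine" here is just a finite description: a zero-argument generator
# function whose unfolding emits 1s. The description is a tangible object
# even when the process it unfolds never terminates.

def five():
    # unfolds into exactly five 1s: plays the role of the number 5
    for _ in range(5):
        yield 1

def omega():
    # unfolds into 1s without ever halting: plays the role of omega
    while True:
        yield 1

# Both descriptions can be embraced as wholes: stored, passed around, printed.
machines = {"five": five, "omega": omega}

# And we can safely inspect a finite prefix of either unfolding.
print(list(five()))              # [1, 1, 1, 1, 1]
print(list(islice(omega(), 3)))  # [1, 1, 1]
```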

I find it weird to debate whether certain ordinals can be granted existence or whether numbers are fundamental for human perception, while there is still an Amazonian tribe that does not have even the most basic number system. (See this blogpost.) Since this tribe is doing perfectly fine, I am wondering whether something is wrong with us instead... I think we are brainwashed into believing that the notion of "absolute" existence has an actual meaning. There is a big difference between the following:

a) When the red light in my right hand blinks, tell me whether there exists an X on the wall in front of you.

b) Does X exist?

The difference is that the second does not make any sense.

Update (November 2010): Apparently Konrad Lorenz was the first person to bring an evolutionary perspective to epistemology.

axiom of choice

Informally put, the axiom of choice says that given any collection of bins, each containing at least one object, it is possible to make a selection of exactly one object from each bin, even if there are infinitely many bins and there is no "rule" for which object to pick from each. The axiom of choice is not required if the number of bins is finite or if such a selection "rule" is available. (Source)
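
For reference, here is the usual formal statement behind the bins-and-selections picture (standard ZFC phrasing, added by me rather than taken from the source above):

```latex
% Every family X of nonempty sets admits a choice function f:
\forall X \,\Big[ \varnothing \notin X \;\Rightarrow\;
  \exists f \colon X \to \textstyle\bigcup X \;\;
  \forall A \in X \; \big( f(A) \in A \big) \Big]
% X is the collection of bins; f picks exactly one element f(A) from each bin A.
```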

The Axiom of Choice is controversial because it leads to some weird theorems, such as the following:

The Banach–Tarski paradox is a theorem in set theoretic geometry which states that a solid ball in 3-dimensional space can be split into a finite number of non-overlapping pieces, which can then be put back together in a different way to yield two identical copies of the original ball. The reassembly process involves only moving the pieces around and rotating them, without changing their shape. However, the pieces themselves are complicated: they are not usual solids but infinite scatterings of points... Unlike most theorems in geometry, this result depends in a critical way on the axiom of choice in set theory. This axiom allows for the construction of nonmeasurable sets, collections of points that do not have a volume in the ordinary sense and require an uncountably infinite number of arbitrary choices to specify... The existence of nonmeasurable sets, such as those in the Banach–Tarski paradox, has been used as an argument against the axiom of choice. Nevertheless, most mathematicians are willing to tolerate the existence of nonmeasurable sets, given that the axiom of choice has many other mathematically useful consequences... It was shown in 2005 that the pieces in the decomposition can be chosen in such a way that they can be moved continuously into place without running into one another. (Source).

This counter-intuitive result is partly due to the bizarreness of set theory itself. Our intuition is based on classical physics. However, the category of sets has nothing remotely to do with physics: its morphisms are the least structured possible. Unlike their set-theoretic counterparts, "physical points" stand in cohesive relationships. Therefore, unlike set-theoretic functions, physical transformations exhibit a substantial amount of structure. In particular, they cannot perform the reassembly process required by the Banach-Tarski paradox, because moving a complicated subset of a solid ball without affecting the other subsets is physically impossible.

Of course, it is the Axiom of Choice that allows one to select the pathological subsets in the first place; in other words, the Axiom of Choice is the true source of the paradox. However, the feeling of paradox arises only when the selected subsets are spatially reassembled into two identical copies of the original ball.
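
To see concretely how choice does the selecting, here is the classical Vitali construction in outline. It is the standard textbook route to a nonmeasurable set, included here only as an illustration:

```latex
% Vitali's nonmeasurable subset of [0,1).
% Declare x ~ y whenever x - y is rational; this partitions [0,1) into classes.
% The Axiom of Choice picks one representative from every class:
\[ V \subseteq [0,1), \qquad V \text{ contains exactly one point of each class of } \sim. \]
% The rational shifts of V (mod 1) are pairwise disjoint and cover [0,1):
\[ [0,1) \;=\; \bigsqcup_{q \,\in\, \mathbb{Q} \cap [0,1)} \big( V + q \ (\mathrm{mod}\ 1) \big). \]
% If V had Lebesgue measure m, translation invariance and countable additivity would give
\[ 1 \;=\; \sum_{q \,\in\, \mathbb{Q} \cap [0,1)} m \;\in\; \{0, \infty\}, \]
% which is absurd. The pathological selection is exactly what choice licenses.
```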

(At a deeper level, the real culprit is the Space-as-a-Set-of-Points metaphor. This mathematical metaphor, like any other metaphor, does not exactly capture our intuitive notion of space. The feeling of paradox arises from this deficiency.)

symmetry and ignorance

Confronted with total ignorance, we expect symmetry: if we do not know anything about a process except its set of possible outcomes, then we assume that all outcomes are equally likely. In other words, we use the most symmetrical probability distribution as an expression of total ignorance.
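
One way to make "most symmetrical" precise (my own illustration, via Shannon entropy) is to note that the uniform distribution is the one carrying the least information about the outcome. A small Python check:

```python
import math

def entropy(p):
    """Shannon entropy, in bits, of a discrete distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Three distributions over the same four outcomes.
uniform = [0.25, 0.25, 0.25, 0.25]   # the "most symmetrical" choice
skewed  = [0.70, 0.10, 0.10, 0.10]   # partial knowledge
certain = [1.00, 0.00, 0.00, 0.00]   # full knowledge

for name, p in [("uniform", uniform), ("skewed", skewed), ("certain", certain)]:
    print(f"{name:8s} entropy = {entropy(p):.3f} bits")

# The uniform distribution attains the maximum, log2(4) = 2 bits:
# it is the distribution that commits to the least knowledge, i.e. total ignorance.
```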

Confronted with symmetry, we expect ignorance: the presence of symmetry, by definition, means that there are certain transformations of the underlying substratum (their kind depending on the nature of the symmetry) that leave everything observable unchanged. In other words, we have no hope of being able to tell whether the substratum is currently being subjected to such a transformation.

Hence, in some sense, the use of symmetric designs in the interior decoration of mosques is very appropriate. They remind us of the state of ignorance that all mortals are in.

snobbery and meritocracy

Like many phenomena, snobbery is easier to recognise than to define. The definition of a snob in The Shorter Oxford English Dictionary is inadequate. The Penguin English Dictionary does much better. It defines a snob as ‘Someone who tends to patronize or avoid those regarded as social inferiors; someone who blatantly attempts to cultivate or imitate those admired as social superiors; someone who has an air of smug superiority in matters of knowledge or taste.’ The same dictionary defines ‘inverted snob’ as one ‘who sneers indiscriminately at people and things associated with wealth and high society.’ One possible derivation of the word snob is from the Latin sine nobilitate, without nobility.

I doubt whether there is anyone in a modern society who is entirely free of snobbery of some sort, straight or inverted. After all, everyone needs someone to look down on, and the psychological need is the more urgent the more meritocratic a society becomes. This is because, in a meritocracy, a person’s failure is his own, whether of ability, character or effort. In a society in which roles are ascribed at birth and are more or less unchangeable, failure to rise by one’s own achievement is nothing to be ashamed of. To remain at, or worse still to sink down to, the bottom of the pile is humiliating only where a man can go from log cabin to White House. Of course, no society is a pure meritocracy and none allows of absolutely no means of social ascent either; thus my typology is a very rough one, and is not meant to suggest that there is ever a society in which the socially subordinate are perfectly happy with their lot or are universally discontented with it. But it does help to explain why justice, of the kind according which everyone receives his deserts, might not necessarily conduce to perfect contentment. It is obviously more gratifying to ascribe one’s failure to injustice than to oneself, and so there is an inherent tendency in a meritocracy for men to perceive injustice where none has been done.

Theodore Dalrymple - Of Snobbery and Soccer

Let's for a moment leave aside the caveat that there is no absolute measure of meritocracy...

Some CEOs believe that success is due more to effort and intelligent planning than to luck and market-related unpredictable factors. They tend to be more snobbish than the CEOs who acknowledge the overwhelming importance of unpredictable factors. The latter group recognizes that the marketplace can never be meritocratic. Hence, they feel less of a need to find people to look down upon.

Some academic fields, such as mathematics, are as meritocratic as a system can practically get. Of course, there are trends in mathematics too: your work may be unappreciated today but hailed later on. Nevertheless, it is easier to separate the good mathematicians from the bad ones than to separate the good CEOs from the bad ones. Hence, in the world of mathematics, one would expect more snobbish behaviour. Is this true?

I am not sure. There seem to be other factors at work here. For instance, I find it hard to imagine someone devoting his life to such an abstract and difficult subject without really loving it for its own sake. Those who seek fame, status, money and power usually prefer other endeavours.

Update (November 2011): Apparently the link between humiliation and meritocracy was first pointed out in the 18th century:

The long line of German authors who radically rejected "Western" ideas of the Enlightenment and the social philosophy of rationalism, utilitarianism and laissez faire as well as the policies advanced by these schools of thought was opened by Justus Möser. One of the novel principles which aroused Möser's anger was the demand that the promotion of army officers and civil servants should depend on personal merit and ability and not on the incumbent's ancestry and noble lineage, his age and length of service. Life in a society in which success would exclusively depend on personal merit would, says Möser, simply be unbearable. As human nature is, everybody is prone to overrate his own worth and deserts. If a man's station in life is conditioned by factors other than his inherent excellence, those who remain at the bottom of the ladder can acquiesce in this outcome and, knowing their own worth, still preserve their dignity and self-respect. But it is different if merit alone decides. Then the unsuccessful feel themselves insulted and humiliated. Hate and enmity against all those who superseded them must result. 
Ludwig von Mises - Anti-Capitalistic Mentality (Pages 13-14)

principle of noncontradiction

The traditional source of the Principle of Contradiction is Aristotle’s Metaphysics where he gives three different versions.

ontological: “It is impossible that the same thing belong and not belong to the same thing at the same time and in the same respect.”
psychological: “No one can believe that the same thing can (at the same time) be and not be.”
logical: “The most certain of all basic principles is that contradictory propositions are not true simultaneously.”

(Source)

The ontological version is just wordplay. The word "belonging" is, well, the word "belonging". It is obnoxious to ascribe physicality to it.

The psychological version is not interesting. Moreover, most of us have an immense capacity to believe outright contradictory statements. (Yes, even when we are aware of them.)

The only proper domain for this principle is logic. (In fact, this is the most basic principle of logic.)

Some physicists claim that the quantum world operates according to a logic that is different from that of the classical world. By ascribing physicality to logic, they are effectively endorsing the ontological version of the principle of non-contradiction. I sincerely think these people are very confused. Neither the classical nor the quantum world operates according to any logic. Logic is too contingency-rich to be physically embedded. The sentence (P & not P) is deemed false by the principle of non-contradiction. Nevertheless, it is a legitimate sentence, and there is no physical counterpart to it. There is no such thing as a physical contradiction. In fact, the problem is even more basic than this disparity: you cannot even write a physical sentence P. In which language would that sentence be written? What makes you think that reality is revealing itself in its full nakedness before your eyes?
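
To underline that the principle lives entirely at the level of symbols, here is the routine truth-table check that ¬(P ∧ ¬P) comes out true in every row; nothing physical enters the computation:

```latex
\begin{array}{c|c|c|c}
P & \neg P & P \land \neg P & \neg(P \land \neg P) \\ \hline
\mathrm{T} & \mathrm{F} & \mathrm{F} & \mathrm{T} \\
\mathrm{F} & \mathrm{T} & \mathrm{F} & \mathrm{T}
\end{array}
% The last column is determined purely by the rules for the connectives;
% the table says nothing about what, if anything, P describes.
```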

There is a reason why "non-contradiction" is called a principle. It is taken as an axiom because it cannot be proven from anything more primitive:

As is true of all axioms, the law of non-contradiction is alleged to be neither verifiable nor falsifiable, on the grounds that any proof or disproof must use the law itself prior to reaching the conclusion.
- Source

Like all other principles, this one is also human, all too human. Its humanness stems from the fact that only stories can be wrong.

Perhaps we should stop using the phrase "Not Even Wrong" as a derogatory expression. Who are we to insult nature?

being the last guy

Herding is a pervasive social phenomenon. The last guy who joins the herd typically loses the most. Here are two examples:

Finance:

If you are the last guy to invest in a financial bubble, then you will lose the most, because you have bought in at exactly the highest price level. When the bubble collapses, the fall will be measured from the pinnacle of the herd's stupidity, namely the moment of your transaction.

Physics:

My colleagues Ed Witten and Juan Maldacena and others who created string theory are birds, flying high and seeing grand visions of distant ranges of mountains. The thousands of humbler practitioners of string theory in universities around the world are frogs, exploring fine details of the mathematical structures that birds first saw on the horizon. My anxieties about string theory are sociological rather than scientific. It is a glorious thing to be one of the first thousand string theorists, discovering new connections and pioneering new methods. It is not so glorious to be one of the second thousand or one of the tenth thousand. There are now about ten thousand string theorists scattered around the world. This is a dangerous situation for the tenth thousand and perhaps also for the second thousand. It may happen unpredictably that the fashion changes and string theory becomes unfashionable. Then it could happen that nine thousand string theorists lose their jobs. They have been trained in a narrow specialty, and they may be unemployable in other fields of science.

Dyson - Birds and Frogs

algebra as cognitive science

There exists a zoo of algebraic structures in mathematics. All have emerged naturally from different areas of geometry and analysis. Yet they have profound similarities. In fact, most fall into a few general categories, some of which are so ubiquitous that one cannot help but speculate that they emanate from certain human cognitive biases.

Of course, simpler structures are more likely to emerge in numerous places. But this is not the phenomenon I am referring to here... There is more to ubiquity than the prevalence of the simple. There is an infinitude of possible algebraic structures out there, but human mathematicians seem to consistently pick some in favour of others.

The discovery of the ubiquity of adjunctions is not solely a mathematical achievement. An account of it should have been published in cognitive science journals as well.
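
For readers who have not met the term, here is the standard shape an adjunction boils down to, with the usual free/forgetful example; this is textbook material, included only for orientation:

```latex
% An adjunction F \dashv G between functors F : \mathcal{C} \to \mathcal{D}
% and G : \mathcal{D} \to \mathcal{C} is a bijection, natural in A and B:
\[ \mathrm{Hom}_{\mathcal{D}}(F A,\, B) \;\cong\; \mathrm{Hom}_{\mathcal{C}}(A,\, G B) \]
% Stock example: F = free monoid on a set, G = underlying set of a monoid.
% A monoid homomorphism out of the free monoid on A is the same data as
% an ordinary function from A into the underlying set of B.
```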

Some further observations:

1) If there are intelligent alien civilizations, then their catalogue of ubiquitous algebraic structures may be different from ours due to our cognitive differences.

2) Those who claim that computers can generate mathematics of interest to human mathematicians will first have to understand the biases and particularities of the human mind.

It must be admitted that the use of geometric intuition has no logical necessity in mathematics, and is often left out of the formal presentation of results. If one had to construct a mathematical brain, one would probably use resources more efficiently than creating a visual system. But the system is there already, it is used to great advantage by human mathematicians, and it gives a special flavour to human mathematics.

Ruelle as quoted in Leinster, Higher Operads, Higher Categories (Page 1)

Recent computer analysis of chess, working backwards, has revealed surprising results. There are positions which result in a win for one of the players in over 90 moves, none of which involves the capture of a piece or the movement of a pawn. The winning "line", moreover, is almost impossible to memorize, because there seems to be no rhyme or reason for the specific winning moves. Seemingly random moves results in a win, without anyone being able to characterize the moves as a "strategy" of one kind or the other. This is the opposite of what we find in mathematics.

Steiner, The Applicability of Mathematics as a Philosophical Problem (Page 64)

Proving human theorems via inhuman means... This is what future computers should be instructed to do. Using non-geometric and pedantic proof techniques that humans are not good at, they should seek out theorems that humans will like.

3) It is true that, after the discovery of category theory, we realized that most of the mathematical structures in use are natural (i.e. adjunctional) in the precise categorical sense. But these are the structures that have managed to survive centuries of mathematical discourse. Those that failed to survive probably died off due to their categorical unnaturalness. Evolution selects for optimality. What this means is that, now that we are equipped with the guidance of category theory, the development of mathematics will accelerate. We can eliminate unnatural structures outright rather than waiting for time and public discourse to filter them out.

creation myths and numbers

Recursion is the first cognitive step in the process of abstraction. It helps us overcome the restrictions imposed by our innately empiricist view of the world. Number systems emerge when one asks the question "What comes right after number N?" Creation myths arise when one asks the question "What comes right before year N?"
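
As a toy illustration of the "what comes right after N?" move (my own sketch, unrelated to the quoted material below), an entire number system can be generated from a single successor step:

```python
# Numbers bootstrapped from one recursive question: "what comes right after n?"
# Peano-style: a number is just the nesting depth of a tuple.
ZERO = ()

def successor(n):
    return (n,)          # the answer to "what comes right after n?"

def to_int(n):
    # for display only: unwrap the nesting into an ordinary integer
    return 0 if n == ZERO else 1 + to_int(n[0])

three = successor(successor(successor(ZERO)))
print(to_int(three))     # 3

# The creation-myth question runs the same step in the other direction:
# "what comes right before year n?" regresses without end,
# until a first year is simply posited by fiat.
```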

One prediction that this makes in Pirahã follows from the suggestions of people who worked on number theory and the nature of number in human speech: that counting systems—numerical systems—are based on recursion, and that this recursion follows from recursion in the language. This predicts in turn that if a language lacked recursion, then that language would also lack a number system and a counting system. I've claimed for years that the Pirahã don't have numbers or accounting, and this has been verified in two recent sets of experiments, one of which was published in Science three years ago by Peter Gordon, arguing that the Pirahã don't count, and then a new set of experiments which was just carried out in January by people from Brain and Cognitive Sciences at MIT, which establishes pretty clearly that the Pirahã have no numbers, and, again, that they don't count at all.

...Peter Gordon and I were colleagues at the University of Pittsburgh, and Peter did his Ph.D. at MIT in psychology, with a strong concern for numerosity. We were talking, and I said, there's a group that doesn't count—I work with a group that doesn't count—and he found that very difficult to believe, so he wanted to go do experiments. He went, and I helped him get going; he did the experiments, but his explanation for the reason that the Pirahã don't count is that they don't have words for numbers. They only have one to many. I claim that in fact they don't have any numbers. His idea is that the absence of counting in Pirahã has a Whorfian explanation—that there's a linguistic determinism: if you lack numbers, you lack counting—that is, that the absence of the words causes the absence of the concepts. But this really doesn't explain a lot of things. There are a lot of groups that have been known not to have more than one to many—as soon as they got into a relationship where they needed it for trade, they borrowed the numbers from Portuguese or Spanish or English or whatever other language.

The crucial thing is that the Pirahã have not borrowed any numbers—and they want to learn to count. They asked me to give them classes in Brazilian numbers, so for eight months I spent an hour every night trying to teach them how to count. And it never got anywhere, except for a few of the children. Some of the children learned to do reasonably well, but as soon as anybody started to perform well, they were sent away from the classes. It was just a fun time to eat popcorn and watch me write things on the board. So I don't think that the fact that they lack numbers is attributable to the linguistic determinism associated with Benjamin Lee Whorf, i.e. that language determines our thought—I don't really think that goes very far. It also doesn't explain their lack of color words, the simplest kinship system that's ever been documented, the lack of recursion, and the lack of quantifiers, and all of these other properties. Gordon has no explanation for the lack of these things, and he will just say, "I have no explanation, that's all a coincidence".

...When I began to tell them the stories from the Bible, they didn't have much of an impact. I wondered, was I telling the story incorrectly? Finally one Pirahã asked me one day, well, what color is Jesus? How tall is he? When did he tell you these things? And I said, well, you know, I've never seen him, I don't know what color he was, I don't know how tall he was. Well, if you have never seen him, why are you telling us this?

...The Pirahã, who in some ways are the ultimate empiricists—they need evidence for every claim you make—helped me realize that I hadn't been thinking very scientifically about my own beliefs.

(Source)
 
Because of their culture's ingrained emphasis on referring only to immediate, personal experiences, the tribesmen do not have words for any abstract concept, from colour to memory and even to numbers. There is no past tense, he says, because everything exists for them in the present. When it can no longer be perceived, it ceases, to all intents, to exist. "In many ways, the Pirahã are the ultimate empiricists," Professor Everett says. "They demand evidence for everything."

Life, for the Pirahã, is about seizing the moment and taking pleasure here and now. "I suddenly noticed how excited they were whenever planes crossed the sky then disappeared. They just love sitting around watching people coming around the bend in the jungle. Whenever I came into the village then left, they were amazed."

The linguistic limitations of this "carpe diem" culture explain why the Pirahã have no desire to remember where they come from and why they tell no stories.

(Source)
 
Over the course of three weeks, Everett wrote what would become his Cultural Anthropology article, twenty-five thousand words in which he advanced a novel explanation for the many mysteries that had bedevilled him. Inspired by Sapir’s cultural approach to language, he hypothesized that the tribe embodies a living-in-the-present ethos so powerful that it has affected every aspect of the people’s lives. Committed to an existence in which only observable experience is real, the Pirahã do not think, or speak, in abstractions—and thus do not use color terms, quantifiers, numbers, or myths. Everett pointed to the word xibipío as a clue to how the Pirahã perceive reality solely according to what exists within the boundaries of their direct experience—which Everett defined as anything that they can see and hear, or that someone living has seen and heard. “When someone walks around a bend in the river, the Pirahã say that the person has not simply gone away but xibipío—‘gone out of experience,’ ” Everett said. “They use the same phrase when a candle flame flickers. The light ‘goes in and out of experience.’ ”

(Source)

Living in the moment... Demanding evidence for every claim... Don't these sound like familiar aspirations? I feel like these people are light years ahead of us when it comes to cultural advancement.

Some languages predispose their speakers to leave the present. Although our minds are innately capable of going back and forth in time, our tendency to do so increases immensely once the capability becomes institutionalized through language acquisition.

Here is an interview about a man who learned his first language when he was 27 years old. SS refers to the person who taught him language.

RW: Was Ildefonso able to tell you anything about his life before? I take it he did become ordinarily competent in language.

SS: He did. I'm going to guess that at about six or seven years old he was herding sheep and goats and begging. Think about that. His brain was kept alive with problem solving. He had to be taught somehow that you go to this street and put your hand out, or your cup out, and try to get money even though he didn't know what money was.

Back to your question. Of course, you and I are interested in learning what was it like? It's another frustration that Ildefonso doesn't want to talk about it. For him, that was the dark time. Whenever I ask him, and I've asked him many, many times over the years, he always starts out with the visual representation of an imbecile: his mouth drops, his lower lip drops, and he looks stupid. He does something nonsensical with his hands like, "I don't know what's going on." He always goes back to "I was stupid."

It doesn't matter how many times I tell him, no, you weren't exposed to language and... The closest I've ever gotten is he'll say, "Why does anyone want to know about this? This is the bad time." What he wants to talk about is learning language.

RW: What does he say about that?

SS: He uses "dark" and "light." That's when everything "became light." That's when he understood that he could ask questions and get answers. He learned that there were explanations. Before, he'd had to figure out everything on his own. He could get macho behavior. You could see that. But if he couldn't see it, like history... He had no clue why he was picked up by people wearing green and put back in brown-skin land. So he wanted to talk about what you could do with language. Did you know that you could talk to anyone in the world! Did you know that you could read!

The only thing he said, which I think is fascinating and raises more questions than answers, is that he used to be able to talk to his other languageless friends. They found each other over the years. He said to me, "I think differently. I can't remember how I thought." I think that's phenomenal!

RW: It is. It's quite... well, I don't know what it is.

SS: As far as the critical language learning period goes, of course, his adult brain couldn't learn language as well as a child's brain. But he's got a lot of language! Where he gets lost is especially with too many references to time in one sentence. He can't handle too many tenses in one sentence. But he can handle more than one reference and he can handle any amount of information. He did learn language. The few problems he has are nothing compared to not having language.

But the second thing is the psychological slash philosophical things with language. He says he thinks differently. However, there are a few things he doesn't think differently about. I try to meet him once a year and I always ask him, "When was the last time we saw each other?" I ask him a "when" question because it tickles me. Time was the hardest thing for him to learn. And he always prefers to say "the winter season" or "the Christmas time." He wants to point to a season or to a holiday. It's not a cognitive problem. To this day, he thinks it's weird that we count time the way we do. He can do it, but he doesn't like it. Think about it. For twenty-seven years, he followed the sun. He followed cows. He followed the seasons. It's that rain-time of the year.

RW: And this is how people have done it in more traditional societies.

SS: There's some sanity in it. We care too much about counting time. Every single religion and spiritual practice that I've ever seen has something in it about living in the present. So there's some sanity in it.

(Source)

codifiable knowledge

While a practitioner over-emphasizes the importance and the share of non-codifiable knowledge in his field, an academician does the exact opposite. The latter's behaviour is understandable since:

1) Only what is codifiable can be transferred to students inside a classroom environment, within a short span of time.

2) Due to its higher level of generality and explicitness, codifiable knowledge has greater scientific value.

Practitioners, on the other hand, ignore successful academic codifications at their own peril. Since most cannot help making their own attempts at codification, they often end up reinventing the wheel or coining awkward terminology.

Here are two examples from the finance world. I am sure one can find plenty of examples from other fields as well:

- McComish coins the term "anti-cash" in his book Anti-Logic

- Soros invents the term "reflexivity" in his book Alchemy of Finance

The former concept is awkward and unenlightening. The latter was codified in academia a long time ago. It was sad to hear Soros describe his finding as a major philosophical discovery.