public intellectual vs academic intellectual

Does academia have a monopoly over the world of ideas? Does an intellectual need to be an academician to be taken seriously?

The answer to both questions is no. One reason stems from a trend taking place outside academia, the other from a trend taking place inside it.

  • Outside. Thanks to the rise of digital technologies, it has become dramatically easier to access and distribute information. You do not need to be affiliated with any university to attend high-quality lectures, freely access almost any journal or book, and exchange ideas.

  • Inside. Einstein considered Goethe to be “the last man in the world to know everything.” Today academia has become so specialized that most academicians have no clue what even their next-door colleagues are working on. This has had the side effect of pushing public intellectuals, and therefore a portion of intellectual activity, outside academia.

I have written a lot about the rise of the digital before. In this post I will be focusing on the second point.

Many of you probably do not even know what it means to be a public intellectual. Don’t worry, neither did I. After all, we have all gone through the same indoctrination during our education, which subtly instilled in us the belief that academia has a monopoly over the world of ideas, and that the only true intellectuals are those residing within it.

Before we start, note that the trends mentioned above are not some short-term phenomena. They are both reflections of metaphysical principles that govern the evolution of information, and have nothing to do with us whatsoever.

  • The first trend is unstoppable because information wants to be free.

  • The second trend is unstoppable because information wants to proliferate.


A Personal Note

A few readers asked me why I have not considered pursuing an academic career. I actually did, and by doing so, learned the hard way that academia is a suffocating place for people like me, who would rather expand their range than increase their depth.

This is the main reason why I wanted to write this piece. I am pretty sure that there are young folks out there, going through similar dilemmas, burning with intellectual energy but also suffering from extreme discomfort in their educational environments. They should not go through the same pains to realize that the modern university has turned into a cult of experts.

The division of labor is the very organizational principle of the university. Unless that principle is respected, the university simply fails to be itself. The pressure, therefore, is constant and massive to suppress random curiosity and foster, instead, only a carefully channeled, disciplined curiosity. Because of this, many who set out, brave and cocky, to take academe as a base for their larger, less programmed intellectual activity, who are confident that they can be in academe but not of it, succumb to its culture over time.

… It takes years of disciplined preparation to become an academic. It takes years of undisciplined preparation to become an intellectual. For a great many academics, the impulse to break free, to run wild, simply comes too late for effective realization.

Jack Miles - Three Differences Between an Academic and an Intellectual

There is of course nothing wrong with developing deep expertise in a narrow subject. But societies need the opposite type of intellectual as well, for a variety of reasons that will become clear by the end of this post.

When I look back in time to see what type of works had the greatest impact on my life, the pattern is very clear. Without exception, all such works were produced by public intellectuals with great range and tremendous communication skills. In fact, if I knew I was going to be stranded on a desert island, I would not even bring a single book by an academic intellectual. (Of course, without the inputs of hundreds of specialists, there would be nothing for the generalist to synthesize. Nevertheless, it is the synthesis people prefer to carry in their minds at all times, not the original inputs.)

This post is a tribute to the likes of David Brooks (Sociology), Noam Chomsky (Politics), Nassim Nicholas Taleb (Finance), Kevin Kelly (Technology), Ken Wilber (Philosophy), Paul Davies (Physics) and Lynn Margulis (Biology). Thank you for being such great sources of inspiration.

Anyway, enough on the personal stuff. Let us now start our analysis.

We will cycle through five different characterizations, presenting public intellectuals as

  • Amorphous Dilettantes,

  • Superhuman Aspirants,

  • Obsessive Generalists,

  • Metaphor Artists, and

  • Spiritual Leaders.

Thereby, we will see how they

  • enhance our social adaptability,

  • push our individual evolutionary limits,

  • help science progress,

  • communicate the big picture to us, and

  • lead us in the right direction.


Public Intellectuals as Amorphous Dilettantes
Enhancing Our Social Adaptability

Every learning curve faces diminishing returns. So why become an expert at all? Why not settle for 80 percent competence? Just extract the gist of a subject and then move on to the next. Many fields are so complex that they are not open to complete mastery anyway.

Also, the world is such a rich place. Why blindly commit yourself to a single aspect of it? Monolithic ambitions are irrational.

Yes, it may be the experts who do the actual work of carrying society to greater heights. But while doing so, they fail to elevate themselves high enough to see the progress at large. That voyeuristic pleasure belongs only to the dilettantes.

Dilettantes are jacks of all trades, and their amorphousness is their asset.

  • They are very useful in resource-stricken and fast-changing environments like an early-stage startup, which faces an extremely diverse set of challenges with a very limited hiring budget. Just like stem cells, dilettantes can specialize on demand and then revert to their initial general state when there are enough resources to replace them with experts. (Good dilettantes do not multi-task. They serially focus on different things.)

  • They can act as the weak links inside innovation networks and thereby lubricate into existence a greater number of multidisciplinary efforts and serendipities. Just like people conversant in many languages, they can act as translators and unify otherwise disparate groups.

  • They are like wild bacteria that can survive freely on their own at the outer edges of humanity. An expert, on the other hand, can function only within a greater cooperative network. Thus, evolution can always fall back on the wild types if the environment changes at a breakneck speed and destroys all such networks.

It is a pity that the status of dilettantes has plummeted in the modern age, whose characteristic collective flexibility has enabled more efficient deployment of experts. After all, as humans, we did not win the evolutionary game because we were the fastest or the strongest. We won because we were overall better than average, because we were versatile and better at adaptation. In other words, we won because we were true dilettantes.

Every 26 million years, more or less, there has been an environmental catastrophe severe enough to put down the mighty from their seat and to exalt the humble and meek. Creatures which were too successful in adapting themselves to a stable environment were doomed to perish when the environment suddenly changed. Creatures which were unspecialized and opportunistic in their habits had a better chance when Doomsday struck. We humans are perhaps the most unspecialized and the most opportunistic of all existing species. We thrive on ice ages and environmental catastrophes. Comet showers must have been one of the major forces that drove our evolution and made us what we are.

Freeman Dyson - Infinite in All Directions (Page 32)

Similarly, only generalist birds like robins can survive in our most urbanized locations. Super-dynamic environments always weed out the specialists.


Public Intellectuals as Superhuman Aspirants
Pushing Our Individual Evolutionary Limits

Humans were enormously successful because, in some sense, they contained a little bit of every animal. Their instincts were literally a synthesis.

Now what is really the truth about these soul qualities of humans and animals? With humans we find that they can really possess all qualities, or at least the sum of all the qualities that the animals have between them (each possessing a different one). Humans have a little of each one. They are not as majestic as the lion, but they have something of majesty within them. They are not as cruel as the tiger but they have a certain cruelty. They are not as patient as the sheep, but they have some patience. They are not as lazy as the donkey—at least everybody is not—but they have some of this laziness in them. All human beings have these things within them. When we think of this matter in the right way we can say that human beings have within them the lion-nature, sheep-nature, tiger-nature, and donkey-nature. They bear all these within them, but harmonized. All the qualities tone each other down, as it were, and the human being is the harmonious flowing together, or, to put it more academically, the synthesis of all the different soul qualities that the animal possesses.

Rudolf Steiner - Kingdom of Childhood (Page 43)

Now, just as animals can be viewed as “special instances” of humans, we can view humans as special instances of what a dilettante secretly aspires to become, namely a superhuman.

Human minds could integrate the instinctive (unconscious) aspects of all animal minds, thanks to the evolutionary budding of a superstructure called consciousness, which allowed them to specialize their general-purpose unconsciousness into any form necessitated by changing circumstances.

Dilettantes try to take this synthesis to the next level, and aim to integrate the rationalistic (conscious) aspects of all human minds. Of course, they utterly fail at this task, since they lack the next-level superstructure necessary to control a general-purpose consciousness. Nevertheless they try and try, in an incorrigibly romantic fashion. I guess some do it just for the sake of a few precious voyeuristic glimpses of what it feels like to be a superhuman.

Note that it will be silicon-based life - not us - that completes the next cycle of differentiation-integration in the grand narrative of evolution. As I said before, our society is getting better at deploying experts wherever they are needed. This increased fluidity of labor is entirely due to technological developments which enable us to govern ourselves more efficiently. What is emerging is a superconsciousness that is coordinating our consciousnesses, and pushing us in the direction of a single unified global government.

“Opte Project visualization of routing paths through a portion of the Internet. The connections and pathways of the internet could be seen as the pathways of neurons and synapses in a global brain” - Wikipedia

Nevertheless there are advantages to internalizing portions of the hive mind. Collaboration outside can never fully duplicate the effects of collaboration within. As a general rule, the closer the “neurons”, the better the integration. (The “neuron” could be an entire human being or an actual neuron in the brain.)

Individual creators started out with lower innovativeness than teams - they were less likely to produce a smash hit - but as their experience broadened they actually surpassed teams: an individual creator who had worked in four or more genres was more innovative than a team whose members had collective experience across the same number of genres.

David Epstein - Range (Pages 209-210)

Notice that there is a pathological dimension to the superhuman aspiration, aside from the obvious narcissistic undertones. As one engulfs more of the hive mind, one inevitably ends up swallowing polar opposite profiles.

“The wisest human being would be the richest in contradictions, who has, as it were, antennae for all kinds of human beings - and in the midst of this his great moments of grand harmony.”

- Friedrich Nietzsche

“The test of a first-rate intelligence is the ability to hold two opposing ideas in mind at the same time and still retain the ability to function.”

- F. Scott Fitzgerald

In a sense, reality is driven by insanity. It owes its “harmony” and dynamism to the embracing of the contradictory tensions created by dualities. We, on the other hand, feel a psychological pressure to choose sides and break the dualities within our social texture. Instead of expanding our consciousness horizontally, we choose to contract it to maintain consistency and sanity.

“A foolish consistency is the hobgoblin of little minds, adored by little statesmen and philosophers and divines.”

- Ralph Waldo Emerson

“Do I contradict myself? Very well. Then I contradict myself. I am large. I contain multitudes.”

- Walt Whitman

Recall that humans are an instinctual synthesis of the entire animal kingdom. This means that, while we strive for consistency at a rational level, we are often completely inconsistent at an emotional level, roaming wildly around the whole spectrum of possibilities. In other words, from the perspective of an animal, we probably look utterly insane, since it can not tell that there is actually a logic to this insanity that is internally controlled by a superstructure.

“A human being is that insane animal whose insanity has invented reason.”

- Cornelius Castoriadis

Public Intellectuals as Obsessive Generalists
Helping Science Progress

If a specialist is someone who knows more and more about less and less, a generalist is unapologetically someone who knows less and less about more and more. Both forms of knowledge are genuine and legitimate. Someone who acquires a great deal of knowledge about one field grows in knowledge, but so does someone who acquires a little knowledge about many fields. Knowing more and more about less and less tends to breed confidence. Knowing less and less about more and more tends to breed humility.

Jack Miles - Three Differences Between an Academic and an Intellectual


The difference between science and philosophy is that the scientist learns more and more about less and less until she knows everything about nothing, whereas a philosopher learns less and less about more and more until he knows nothing about everything.

Dorion Sagan - Cosmic Apprentice (Page 2)

What separates good public intellectuals from bad ones is that the good have a compass which guides them while they sail through the infinite sea of knowledge. Those without a compass display no humility at all. Instead, they suffer from gluttony, a sin just as deadly as the pride that plagues bad academic intellectuals, whose expertise-driven egos easily spill over into areas they have no competence in.

The compass I am talking about is analogical reasoning, the kind of reasoning needed for connecting the tapestry of knowledge. Good public intellectuals try to understand the whole geography rather than wander around mindlessly like a tourist. They have a pragmatic goal in mind, which is to understand the mind of God. They venture horizontally in order to lift themselves up to a higher plateau by discovering frameworks that apply to several subject areas at once.

By definition, one can not generalize if one is stuck inside a single silo of knowledge. But jumping around too many silos does not help either. Good public intellectuals dig deep enough into a subject area to bring their intuition to a level sufficient to make the necessary outside connections. Bad ones spread themselves too thin, and eventually become victims of gluttony.

As I explained in a previous blog post, science progresses via successful unifications. The banishing of generalists from academia has therefore had the effect of slowing science down by drowning it in complete incrementalism. In the language of Freeman Dyson, today academia is breeding only “frogs”.

Birds fly high in the air and survey broad vistas of mathematics out to the far horizon. They delight in concepts that unify our thinking and bring together diverse problems from different parts of the landscape. Frogs live in the mud below and see only the flowers that grow nearby. They delight in the details of particular objects, and they solve problems one at a time.

Freeman Dyson - Birds and Frogs (Page 37)

Without the “birds” doing their synthesizing and abstracting, we can not see where the larger paradigm is heading, and without this higher-level map, we can not accelerate the right exploratory paths or cut off the wrong ones. More importantly, losing sight of the unity of knowledge creates an existential lackluster that sooner or later wears down everyone involved in the pursuit of knowledge, including the academic intellectuals.

Consciousness discriminates, judges, analyzes, and emphasizes the contradictions. It's necessary work up to a point. But analysis kills and synthesis brings back to life. We must find out how to get everything back into connection with everything else.

- Carl Gustav Jung, as quoted in The Earth Has a Soul (Page 209)

True, academic intellectuals are occasionally allowed to engage in generalization, but they are forbidden from obsessing too much about it and venturing too far from their area of expertise. This prevents them from making the fresh connections that could unlock their long-standing problems. That is why most paradigm shifts in science and technology are initiated by outsiders who can bring brand new analogies into the field. (Generalists are also great at taming the excessive enthusiasm of specialists, who often over-promote the few things they are personally invested in.) For instance, both Descartes and Darwin were revolutionaries who addressed the general public directly (and eloquently), without any university affiliations.

Big picture generalities are also exactly what the public cares about:

There are those who think that an academic who sometimes writes for a popular audience becomes a generalist on those occasions, but this is a mistaken view. A specialist may make do as a popularizer by deploying his specialized education with a facile style. A generalist must write from the full breadth of a general education that has not ended at graduation or been confined to a discipline. If I may judge from my ten years' experience in book publishing, what the average humanities academic produces when s/he sets out to write for "the larger audience" is a popularizer's restatement of specialized knowledge, while what the larger audience responds to is something quite different: It is specialized knowledge sharply reconceptualized and resituated in an enlarged context.

Jack Miles - Three Differences Between an Academic and an Intellectual

While the nitty-gritty details change all the time, the big picture evolves very slowly. (This is related to the fact that it becomes harder to say new things as one moves up in generality.) Hence the number of good public intellectuals needed by society is actually not that great. But finding and nurturing one is not easy, for the same reason that finding and nurturing a potential leader is not easy.

Impostors are another problem. While bad academic intellectuals are quickly weeded out by their community, bad public intellectuals are not, because they do not form a true community. Their ultimate judge is the public, whose quality determines the quality of who becomes popular, in a fashion not too dissimilar to how the quality of leaders correlates with the quality of followers.


Public Intellectuals as Metaphor Artists
Communicating the Big Picture to Us

As discussed in a previous blog post, generalizations happen through analogies and result in further abstraction. Metaphors, on the other hand, result in further concretization through the projection of the familiar onto the unfamiliar. That is why they are such great tools for communication, and why it is often pedagogically necessary to follow a generalization up with a metaphor to ground the abstract in the familiar.

While academic intellectuals write for each other, a public intellectual writes for the greater public and therefore has no choice but to employ spot-on metaphors to deliver his message. He is lucky in the sense that, compared to the academic intellectual, he has knowledge of many more fields and therefore enjoys a larger metaphor reservoir.

Bad academic intellectuals mistake depth for obscurity, as if something expressed with clarity can not be of any significance. They are often proud of being understood by only a few other people, and invent unnecessary jargon to keep the generalists at bay and to create an air of originality. (Of course, an extra bit of jargon is inevitable, since as one zooms in, more phenomena become distinguishable and worth attaching new names to.)

The third difference between an intellectual and an academic is the relative attachment of each to writing as a fine rather than a merely practical art. "If you happen to write well," Gustave Flaubert once wrote, "you are accused of lacking ideas."

… An academic is concerned with substance and suspicious of style, while an intellectual is suspicious of any substance that purports to transcend or defy style.

Jack Miles - Three Differences Between an Academic and an Intellectual

While academic intellectuals obsess about discovery and originality, public intellectuals obsess about delivery and clarity.

  • Academic intellectuals worry a lot about attaching their names to new ideas. So, in some sense, it is natural for them to lack lucidity. After all, it takes a long time for a newborn idea to mature and find its right spot in the grand tapestry of knowledge.

“To make a discovery is not necessarily the same as to understand a discovery.”

- Abraham Pais

It is also not surprising that professors prefer to teach from (and refer to) the original texts rather than the clearer secondary literature. Despite the fact that only a minuscule number of students end up staying in academia, professors design their courses as if the goal were to train future professors who, like themselves, will value originality over clarity. Students are asked to trace all ideas back to their originators, and are given the implicit guarantee that they too will be treated with the same respect if they successfully climb the greasy pole.

It is actually quite important for a future academician to witness the chaotic process behind an idea’s birth (inside a single mind) and its subsequent maturation (out in the community). In formalistic subjects like mathematics and physics, where ideas reach their peak clarity at a much faster speed, the pedagogical pressure to choose the conceptual route (rather than the historical route) for teaching is great. So the students end up reading only the most polished material, never referring back to the original papers which contain at least some traces of battle scars. They are accelerated to the research frontier, but with much less of an idea about what it actually means to be at the frontier. Many, expecting a clean-cut experience, leave academia disillusioned.

  • Public intellectuals do not get their names attached to certain specific discoveries. Their main innovation lies in building powerful bridges and coining beautiful metaphors, and ironically, the better they are, the more quickly they lose ownership over their creations.

Effective metaphors tend to be easily remembered and transmitted. This is, in fact, what enables them to become clichés.
 
James Geary - I is an Other (Page 122)

Hence, while academic intellectuals are more like for-profit companies engaged in extractable value creation, public intellectuals are more like non-profit companies engaged in diffused value creation. They inspire new discoveries rather than make new discoveries themselves. In other words, they are more like artists, who enrich our lives in all sorts of immeasurable ways, and get paid practically nothing in return.

All ideas, including those generated by academic intellectuals, either eventually die out, or pass the test of time and prove to be so foundational that they reach their final state of maturity by becoming totally anonymized. Information wants to be free, not just in the sense of being accessible, but also in the sense of breaking the chains tied to its originator. No intellectual can escape this fact. For public intellectuals, the anonymization process happens much faster, because the public does not really care much about who originated what. What about the public intellectuals themselves - do they really care? Well, good ones do not, because their main calling has always been public impact (rather than private gain) anyway.

The dichotomy between those who obsess about “discovery and originality” and those who obsess about “delivery and clarity” has been very eloquently characterized by Rota within the sphere of mathematics, as the dichotomy between problem solvers and theorizers:

To the problem solver, the supreme achievement in mathematics is the solution to a problem that had been given up as hopeless. It matters little that the solution may be clumsy; all that counts is that it should be the first and that the proof be correct. Once the problem solver finds the solution, he will permanently lose interest in it, and will listen to new and simplified proofs with an air of condescension suffused with boredom.

The problem solver is a conservative at heart. For him, mathematics consists of a sequence of challenges to be met, an obstacle course of problems. The mathematical concepts required to state mathematical problems are tacitly assumed to be eternal and immutable.

... To the theorizer, the supreme achievement of mathematics is a theory that sheds sudden light on some incomprehensible phenomenon. Success in mathematics does not lie in solving problems but in their trivialization. The moment of glory comes with the discovery of a new theory that does not solve any of the old problems but renders them irrelevant.

The theorizer is a revolutionary at heart. Mathematical concepts received from the past are regarded as imperfect instances of more general ones yet to be discovered. Mathematical exposition is considered a more difficult undertaking than mathematical research.

Gian-Carlo Rota - Problem Solvers and Theorizers

Public Intellectuals as Spiritual Leaders
Leading Us in the Right Direction

Question: Who are our greatest metaphor artists?
Answer: Our spiritual leaders, of course.

Reading sacred texts too literally is a common rookie mistake. They are the most metaphor-dense texts produced by human beings, and this vagueness is a feature, not a bug.

  • Longevity. Thanks to their deliberately vague language, these texts have much higher chances of survival by being open to continuous re-interpretation through generations.

  • Mobilization. Metaphors are politically subversive devices, useful for crafting simple, illuminating narratives that can mobilize the masses.

“A good metaphor is something even the police should keep an eye on."

- Georg Christoph Lichtenberg

  • Charisma. Imagine a sacred text written like a dry academic paper, citing other authors for trivially obvious facts and over-contextualizing minute shit. Who would be galvanized by that? Nobody, of course. Charismatic people anonymize mercilessly, and both fly high and employ plenty of metaphors.

Question: Who are our most obsessive generalists?
Answer: Again, our spiritual leaders.

Spiritual people care about the big picture, literally the biggest picture. They want to probe the mind of God, and as we explained in a previous post, the only way to do that is through generalizations. This quest for generalization is essentially what makes spiritual leaders so humble, visionary and wise.

  • Humble. It suffices to recall the second Jack Miles quote: “Knowing more and more about less and less tends to breed confidence. Knowing less and less about more and more tends to breed humility.”

  • Visionary. Morgan Housel says that “the further back in history you look, the more general your takeaways should be.” I agree a hundred percent. In fact, the dual statement is also correct: The further you venture into the future, the more general your predictions should be. In other words, the only way to venture into the far future is by looking at big historical patterns and transforming general takeaways into general predictions. That is why successful visionaries and paradigm shifters are all generalists. (There is now an entire genre of academicians trying to grasp why academicians are so bad at long-term forecasts. In a nutshell, experts beat generalists in short-term forecasting through incorporation of domain-specific insights, but this advantage turns into a disadvantage when it comes to making long-term forecasts because, in the long run, no domain can be causally isolated from another.)

Kuhn shows that when a scientific revolution is occurring, books describing the new paradigm are often addressed to anyone who may be interested. They tend to be clearly written and jargon free, like Darwin's Origin of Species. But once the revolution becomes mainstream, a new kind of scientist emerges. These scientists work on problems and puzzles within the new paradigm they inherit. They don't generally write books but rather journal articles, and because they communicate largely with one another, a specialized jargon develops so that even colleagues in adjacent fields cannot easily understand them. Eventually the new paradigm becomes the new status quo.

Norman Doidge - The Brain’s Way of Healing (Page 354)

  • Wise. The dichotomy between academic and public intellectuals mirrors the dichotomy between genius and wisdom. Sudden flashes of insight always help, but there is no short-cut to the big picture. You need to accumulate a ton of experience across different aspects of life. Academic culture, on the other hand, is genius-driven and revolves around solving specific hard technical problems. That is why academic intellectuals get worse as they age, while public intellectuals get better. This, by the way, poses a huge problem for the future of academia:

As our knowledge deepens and widens, so it will take longer to reach a frontier. This situation can be combated only by increased specialization, so that a progressively smaller part of the frontier is aimed at, or by lengthening the period of training and apprenticeship. Neither option is entirely satisfactory. Increased specialization fragments our understanding of the Universe. Increased periods of preliminary training are likely to put off many creative individuals from embarking upon such a long path with no sure outcome. After all, by the time you discover that you are not a successful researcher, it may be too late to enter many other professions. More serious still is the possibility that the early creative period of a scientist’s life will be passed by the time he or she has digested what is known and arrived at the research frontier.

John D. Barrow - Impossibility (Page 108)

Question: Who are our best superhuman aspirants?
Answer: Yet again, our spiritual leaders.

I guess this answer requires no further justification, since most people treat their spiritual leaders as superhumans anyway. But do they treat them in the same sense as we have defined the term? Now that is a good question!

Remember, we defined a superhuman as an entity possessing a superconsciousness that can specialize a general-purpose consciousness into any form necessitated by changing circumstances. In other words, a superhuman can simulate any form of human consciousness on demand. According to Carl Gustav Jung, Christ was close to such an idealization.

For Jung, Christianity represented a necessary stage in the evolution in consciousness, because the divine image of Christ represented a more unified image of the autonomous human self than did the multiplicity of earlier pagan divinities.

David Fideler - Restoring the Soul of the World (Page 79)

Jesus also seems to have transcended the social norms of his time, and showcased the typical signs of insanity that come with the territory, due to the internalization of too much multiplicity in the psychic domain.

… all great spiritual teachers, including Jesus and Buddha, challenged social norms in ways that could have been judged insane. Throughout the history of spirituality, moreover, some spiritual adepts have acted in especially unconventional, even shocking ways. This behavior is called holy madness, or crazy wisdom.

Although generally associated with Hinduism and Buddhism, crazy wisdom has cropped up in Western faiths, too. After Saul became Saint Paul, he preached that a true Christian must “become a fool that he may become wise.” Paul’s words inspired a Christian sect called Fools for Christ’s Sake, members of which lived as homeless and sometimes naked nomads.

John Horgan - Rational Mysticism (Page 53)

Was Jesus some sort of early, imperfect, carbon-based version of the newly emerging silicon-based hive mind? A bizarre question indeed! But what is clear is that any superhuman we can create out of flesh, no matter how imperfect, is our best hope for disciplining the global technological layer that is now emerging all over us and controlling us to the point of suffocation.

Technology is a double-edged sword with positive and negative aspects.

  • Positive. Gives prosperity. Increases creative capabilities.

  • Negative. Takes away freedom. Increases destructive capabilities.

What is strange is that we are not allowed to stop its progression. (This directionality is a specific manifestation of the general directionality of evolution towards greater complexity.) There are two main reasons.

  • Local Reason. If you choose not to develop technology yourself, then someone else will, and they will eventually use their newly discovered destructive capabilities to engulf you.

  • Global Reason. Even if we somehow manage to stop developing technology in a coordinated fashion, we will eventually be punished for this decision when we get hit by the next cosmic catastrophe and perish like the dinosaurs for not building the right defensive measures.

So we basically need to balance power with control. And, just as all legal frameworks rest on moral ones, all forms of self-governance ultimately rest upon spiritual foundations. As pointed out in an earlier post, technocratic leadership alone will eventually drive us towards self-destruction.

Today, what we desperately need is a new generation of spiritual leaders who can synthesize for us a new big-picture mythology, one that conforms to the latest findings of science. (Remember, as explained in an earlier post, science helps religion discover its inner core by both limiting the domain of exploration and increasing the efficacy of exploration.) Only such a mythology can convince the new breed of meritocratic elites to discipline themselves and keep tabs on our machines, and galvanize the necessary public support to give these elites sufficient breathing room to tackle the difficult challenges.

Of course, technocratic leadership is exactly what academic intellectuals empower, and spiritual leadership is exactly what public intellectuals stand for. (Technocratic leaders may be physically distant, operating from faraway secluded buildings, but they are actually very easy to relate to on a mental level. Spiritual leaders on the other hand are physically very close, leading from the ground so to speak, but they operate from such an advanced mental level that they are actually very hard to relate to. That is why good spiritual leaders are trusted while good technocratic leaders are respected.)

As technology progresses and automates more and more capabilities away from us, the chasm between the two types of intellectuals will widen.

  • Machines have already become quite adept at vertical thinking and have started eating into the lower extremities of the knowledge tree, forcing the specialists (i.e. academic intellectuals) to collaborate with them. (Empowerment by the machines is partially ameliorating the age problem we talked about.) Although machines look like tools at the moment, they will eventually become the dominant partner, making their human partners strive more and more to preserve their relevancy.

  • Despite being highly adaptable dilettantes, public intellectuals are not safe either. As machines become more adept at lateral thinking, public intellectuals will feel pressure from below, just as their academic counterparts are feeling pressure from above.

Of course, our entire labor force (not only the intellectuals) will undergo the same polarization process and thereby split into two discrete camps with a frantic and continually diminishing gray zone in between:

  • Super generalists who are extremely fluid.

  • Super specialists who are extremely expendable.

This distinction is analogous to the distinction between generalized stem cells and specialized body cells, which are not even allowed to replicate.

“The spread of computers and the Internet will put jobs in two categories. People who tell computers what to do, and people who are told by computers what to do.”

- Marc Andreessen

In a sense, Karl Marx (who thought economic progress would allow everyone to be a generalist) and Herbert Spencer (who thought economic progress would force everyone to become a specialist) were both partially right.

We need generalist leaders with range to exert control and point us (and increasingly our machines) in the right direction, and we need specialist workers with depth to generate growth and do the actual work. Breaking this complementary balance, by letting academic intellectuals take over the world of ideas and technocratic leaders take over the world of action, amounts to being on a sure path to extinction via a slow loss of fluidity and direction.

analogies vs metaphors

“The existence of analogies between central features of various theories implies the existence of a general abstract theory which underlies the particular theories and unifies them with respect to those central features.”
- Eliakim Hastings Moore

Conceptual similarities manifest themselves as analogies, where one recognizes that two structures X and Y have a common meaningful core, say A, which can be pulled up to a higher level. The resulting relationship is symmetric in the sense that the structure A specializes to both X and Y. In other words, one can say either “X is like Y via A” or “Y is like X via A”.

[Figure: Analogy - the general structure A specializing to X and Y]

The analogy gets codified in the more general structure A, which in turn is mapped back onto X and Y. (I say “onto” because A represents a bigger set than both X and Y.) Discovering A is revelatory in the sense that one recognizes that X and Y are special instances of a more general phenomenon, not disparate structures.
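
To make the schema concrete, here is a minimal sketch in Python. This is my own toy illustration, not anything from the original text: integer addition and string concatenation play the roles of X and Y, and an abstract monoid interface plays the role of the general structure A that both specialize.

```python
# Toy illustration of an analogy: two concrete structures (X and Y)
# turn out to be special instances of one general structure (A).
from abc import ABC, abstractmethod

class Monoid(ABC):                     # the general structure A
    @abstractmethod
    def combine(self, a, b): ...
    @abstractmethod
    def identity(self): ...

class IntAddition(Monoid):             # specialization X
    def combine(self, a, b): return a + b
    def identity(self): return 0

class StringConcat(Monoid):            # specialization Y
    def combine(self, a, b): return a + b
    def identity(self): return ""

# "X is like Y via A": both obey the same abstract laws of A.
for m, x, y in [(IntAddition(), 2, 3), (StringConcat(), "ab", "cd")]:
    assert m.combine(m.identity(), x) == x     # identity law holds in both
    print(m.combine(x, y))                     # 5, then "abcd"
```

Note the symmetry: nothing in the Monoid interface privileges integers over strings; A maps down onto X and Y equally well.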

Metaphors play a role similar to that of analogies. They too increase our total understanding, but unlike analogies, they are not symmetric in nature.

Say there are two structures X and Y, where X is more complex but also more familiar than Y. (In practice, X often happens to be an object we have an intuitive grasp of due to repeated daily interaction.) Discovering a metaphor, say M, involves finding a way of mapping X onto Y. (I say “onto” because X - via M - ends up subsuming Y inside its greater complexity.)

[Figure: Metaphor - M mapping the familiar X onto the unfamiliar Y]

The explanatory effect comes from M pulling Y up to the familiar territory of X. All of a sudden, in an almost magical fashion, Y too starts to feel intuitive. Many paradigm shifts in the history of science were due to such discrete jumps. (e.g. Maxwell characterizing the electromagnetic field as a collection of wheels, pulleys and fluids.)
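
Here is a minimal sketch of the same idea in Python, again my own toy illustration with hypothetical names: the classic hydraulic metaphor maps the familiar domain of water in pipes (X) onto the less familiar domain of electric circuits (Y). Unlike the analogy above, the mapping M runs in one direction only.

```python
# Toy illustration of a metaphor M: intuition flows one way,
# from the familiar domain X (water in pipes) to the unfamiliar
# domain Y (electric circuits).

M = {                       # the metaphor: familiar -> unfamiliar
    "pressure":   "voltage",
    "flow rate":  "current",
    "narrowness": "resistance",
}

def explain(familiar_claim: str) -> str:
    """Project a statement about pipes onto circuits via M."""
    claim = familiar_claim
    for x_term, y_term in M.items():
        claim = claim.replace(x_term, y_term)
    return claim

print(explain("more pressure pushes more flow rate through the same narrowness"))
# -> "more voltage pushes more current through the same resistance"
# i.e. Ohm's law, made intuitive by projecting the familiar onto the unfamiliar.
```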

Notice that you want your analogy A to be as faithful as possible, capturing as many of the essential features of X and Y as it can. If you generalize too much, you will end up with a useless A with no substance. Similarly, for each given Y, you want your metaphor pair (X,M) to be as tight as possible, while not letting X stray from the domain of the familiar.

You may be wondering what happens if we dualize our approaches in the above two schemes.

  • Analogies. Instead of trying to rise above the pair (X,Y), why not try to go below it? In other words, why not consider specializations that both X and Y map onto, rather than focus on generalizations that map onto X and Y?

  • Metaphors. Instead of trying to approach Y from above, why not try to approach it from below? In other words, why not consider metaphors that map the simple into the complex rather than focus on those that map the complex onto the simple?

The answer to both questions is the same: We do not, because the dual constructions do not require any ingenuity, and even if they turn out to be very fruitful, the outcomes do not illuminate the original inputs.

Let me expand on what I mean.

  • Analogies enhance our analytic understanding of the world of ideas. They are tools of the consciousness, which can not deal with the concrete (specialized) concepts head on. For instance, since it is insanely hard to study integers directly, we abstract and study more general concepts such as commutative rings instead. (Even then the challenge is huge. You could devote your whole life to ring theory and still die as confused as a beginner.)

    In the world of ideas, one can easily create more specialized concepts by taking conjunctions of various X’s and Y’s. Studying such concepts may turn out to be very fruitful indeed, but it does not further our understanding of the original X’s and Y’s. For instance, the study of Lie groups is exceptionally interesting, but it does not further our understanding of manifolds or groups.

  • Metaphors enhance our intuitive understanding of the world of things. They are tools of the unconsciousness, which is familiar with what is more immediate, and what is more immediate also happens to be what is more complex. Instruments allow us to probe what is remote from experience, namely the small and the big, and both turn out to be stranger but also simpler than the familiar stuff we encounter in our immediate daily lives.

    • What is smaller than us is simpler because it emerged earlier in the evolutionary history. (Compare atoms and cells to humans.)

    • What is bigger than us is simpler because it is an inanimate aggregate rather than an emergent life. (Those galaxies may be impressive, but their complexity pales in comparison to ours.)

    In the world of things, it is easy to come up with metaphors that map the simple into the complex. For instance, with every new technological paradigm shift, we go back to biology (whose complexity is way beyond anything else) and attack it with the brand new metaphor of the emerging Zeitgeist. During the industrial revolution we conceived the brain as a hydraulic system, which in retrospect sounds extremely naive. Now, during the digital revolution, we are conceiving it as - surprise, surprise - a computational system. These may be productive endeavors, but the discovery of the trigger metaphors itself is a no-brainer.

Now is a good time to make a few remarks on a perennial mystery, namely the mystery of why metaphors work at all.

It is easy to understand why analogies work since we start off with a pair of concepts (X,Y) and use it as a control while moving methodically upwards towards a general A. In the case of metaphors, however, we start off with a single object Y, and then look for a pair (X,M). Why should such a pair exist at all? I believe the answer lies in a combination of the following two quotes.

"We can so seldom declare what a thing is, except by saying it is something else."
- George Eliot

“Subtle is the Lord, but malicious He is not.”
- Albert Einstein

Remember, when Einstein characterized gravitation as curvature, he did not really tell us what gravity is. He just stated something unfamiliar in terms of something familiar. This is how all understanding works. Yes, science is progressing, but all we are doing is just making a bunch of restatements with no end in sight. Absolute truth is not accessible to us mere mortals.

“Truths are illusions which we have forgotten are illusions — they are metaphors that have become worn out and have been drained of sensuous force, coins which have lost their embossing and are now considered as metal and no longer as coins.”
- Friedrich Nietzsche

The reason why we can come up with metaphors of any practical significance is because nature subtly keeps recycling the same types of patterns in different places and at different scales. This is what Einstein means when he says that the Lord is not malicious, and is why nature is open to rational inquiry in the first place.

Unsurprisingly, Descartes himself, the founder of rationalism, was also a big believer in the universality of patterns.

Descartes followed this precept by liberal use of scaled-up models of microscopic physical events. He even used dripping wine vats, tennis balls, and walking-sticks to build up his model of how light undergoes refraction. His statement should perhaps also be taken as evidence of his belief in the universality of certain design principles in the machinery of Nature which he expects to reappear in different contexts. A world in which everything is novel would require the invention of a new science to study every phenomenon. It would possess no general laws of Nature; everything would be a law unto itself.

John D. Barrow - Universe That Discovered Itself (Page 107)

Of course, universality does not make it any easier to discover a great metaphor. It still requires a special talent and a trained mind to intuit one out of the vast number of possibilities.

Finding a good metaphor is still more of an art than a science. (Constructing a good analogy, on the other hand, is more of a science than an art.) Perhaps one day computers will be able to completely automate the search process. (Currently, as I pointed out in a previous blog post, they are horrible at the horizontal type of thinking required for spotting metaphors.) This will result in a disintermediation of mathematical models. In other words, computers will simply map reality back onto itself and push us out of the loop altogether.

Let us wrap up all the key observations we made so far in a single table:

[Table: analogies vs metaphors - summary of the key observations above]

Now let us take a brief detour into metaphysics before taking one last look at the above dichotomy.

Recall the epistemology-ontology duality:

  • An idea is said to be true when every body obeys it.

  • A thing is said to be real when every mind agrees to it.

This is a slightly different formulation of the good old mind-body duality.

  • Minds are bodies experienced from inside.

  • Bodies are minds experienced from outside.

While minds and bodies are dynamic entities evolving in time, true ideas and real things reside inside a static Platonic world.

  • Minds continuously shuffle through ideas, looking for the true ones, unable to hold onto any for long. Nevertheless, truth always seems to be within reach, like a carrot dangling in front.

  • Minds desperately attach names to phenomena, seeking permanency within the constant flux. Whatever they refer to as a real thing eventually turns out to be unstable and ceases to be.

Hence, the dichotomy between true ideas and real things can be thought of as the (static) Being counterpart of the mind-body duality, which resides in (dynamic) Becoming. In fact, it would not be inappropriate to call the totality of all true ideas the God-mind and the totality of all real things the God-body.

Anyway, enough metaphysics. Let us now go back to our original discussion.

In order to find a good metaphor, our minds scan through the X’s that we are already experientially familiar with. The hope is to be able to pump up our intuition about a thing through another thing. Analogies on the other hand help us probe the darkness, and bring into light the previously unseen. Finding a good A is like pulling a rabbit out of a hat, pulling something that was out-of-experience into experience. The process looks as follows.

  1. First you encounter a pair of concepts (X,Y) in the shared public domain, literally composed of ink printed on paper or pixels lighting up on a screen.

  2. Your mind internalizes (X,Y) by turning it back to an idea form, hopefully in the fashion that was intended by its originator mind.

  3. You generalize (X,Y) to A within the world of ideas through careful reasoning and aesthetic guidance.

  4. You share A with other minds by turning it into a thing, expressed in a certain language, on a certain medium. (An idea put in a communicable form is essentially a thing that can be experienced by all minds.)

  5. The end result is one more useful concept in the shared public domain.

Analogies lift the iceberg, so to speak, by bringing completely novel ideas into existence and revealing more of the God-mind. In fact, the entirety of our technology, including the technology of reasoning via analogies, can be viewed as a tool for accelerating the transformation of ideas into things. We, and other intermediary minds like us, are the means through which God is becoming more and more aware of itself.

Remember, as time progresses, the evolutionary entities (i.e. minds) decrease in number and increase in size and complexity. Eventually, they get

  • so good at modeling the environment that their ideas start to resemble more and more the true ideas of the God-mind, and

  • so good at controlling the environment that they become increasingly indistinguishable from it and the world of things starts to acquire a thoroughly mental character.

In the limit, when the revelation of the God-mind is complete, the number of minds finally dwindles down to one, and the One, now synonymous with the God-mind, dispenses with analogies or metaphors altogether.

  • As nothing seems special any more, the need to project the general onto the special ceases.

  • As nothing feels unfamiliar any more, the need to project the familiar onto the unfamiliar ceases.

Of course, this comes at the expense of time stopping altogether. Weird, right? My personal belief is that revelation will never reach actual completion. Life will hover over the freezing edge of permanency for as long as it can, and at some point, will shatter in such a spectacular fashion that it will have to begin from scratch all over again, just as it had done so last time around.

digital vs physical businesses

In the first part, I will analyze how digital businesses and physical businesses are complementary to each other via the following dualities:

  1. Risk of Death vs Potential for Growth

  2. Controlling Supply vs Controlling Demand

  3. Scale Effects vs Network Effects

  4. Mind vs Body

  5. Borrowing Space vs Borrowing Time

In the second part, I will analyze how the rise of digital businesses against physical businesses is triggering the following trends:

  1. Culture is Shifting from Space to Time

  2. Progress is Accelerating

  3. Science is Becoming More Data-Driven

  4. Economy is Getting Lighter

  5. Power is Shifting from West to East

Duality 1: Risk of Death vs Potential for Growth

Since information is frictionless, every digital startup has the potential for fast growth. But since the same fact holds for every other startup as well, there is also the potential for a sudden downfall. That is why defensibility (i.e. the ability to survive after reaching success) is often mentioned as the number one criterion by investors in such companies.

Physical businesses face the inverse reality: they are harder to grow but easier to defend, due to factors like high barriers to entry, limited real estate, and hard-to-set-up distribution networks. That is why the competitive landscape is the issue most scrutinized by investors in such companies.

Duality 2: Controlling Supply vs Controlling Demand

In the physical world, limited by scarcity, economic power comes from controlling supply; in the digital world, overwhelmed by abundance, economic power comes from controlling demand.
- Ben Thompson - Ends, Means and Antitrust

Although Ben’s point is quite clear, it is worth expanding on it a little.

In the physical world, supply is much more limited than demand and therefore whoever controls the supply wins.

  • Demand. Physical consumption is about hoarding in space, which is for all practical purposes infinite. Since money is digital in nature, I can buy any object in any part of the world at the speed of light, and that object will immediately become mine.

  • Supply. Extracting new materials and nurturing new talents take a lot of time. In other words, in the short run, supply of physical goods is severely limited.

In the digital world, demand is much more limited than supply and therefore whoever controls the demand wins:

  • Demand. Digital consumption is information based and therefore cognitive in nature. Since one can pay attention to only so many things at once, it is restricted mainly to the time dimension. For instance, for visual information, daily screen time is the limiting factor on how much can be consumed.

  • Supply. Since information travels at the speed of light, every bit in the world is only a touch away from you. Hence, in the short run, supply is literally unlimited.

Duality 3: Scale Effects vs Network Effects

The physical economy is dominated by geometric dynamics, since distances matter. (The keyword here is space.) The digital economy, on the other hand, is information-based, and information travels at the speed of light, which is for all practical purposes infinite. Hence distances do not matter; only connectivities do. In other words, the dynamics is topological, not geometric. (The keyword here is network.)

Side Note: Our memories too work topologically. We remember the order of events (i.e. temporal connectivity) easily but have a hard time situating them in absolute time. (Often we just remember the dates of significant events and then try to date everything else relative to them.) But while we are living, we focus on the continuous duration (i.e. the temporal distance), not the discrete events themselves. That is why the greater the number of things we are preoccupied with and the less we can feel the duration, the more quickly time seems to pass. In memory though, the reverse happens: since the focus is on events (everything else is cleared out!), the greater the number of events, the less quickly time seems to have passed.

This nicely ties back to the previous discussion about defensibility. Physical businesses are harder to grow because that is precisely how they protect themselves. They reside in space and scale effects help them make better use of time through efficiency gains. Digital businesses on the other hand reside in time and network effects help them make better use of space through connectivity gains. Building protection is what is hard and also what is valuable in each case.

Side Note: Just as economic value continuously trickles down to the space owners (i.e. land owners) in the physical economy, it trickles down to “time owners” in the digital economy (i.e. companies who control your attention throughout the day).

Scale does not correlate with defensible value in the digital world, just as connectivity does not correlate with defensible value in the physical world. Investors are perennially confused about this, since scale is so easy to see and our reptilian brains are so susceptible to being impressed by it.
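
As a stylized numeric illustration of the contrast (my own sketch, assuming Metcalfe's law for network value and a simple power-law cost curve, neither of which is a model this post commits to):

```python
# Stylized contrast: scale effects lower the unit cost of a physical
# business, while network effects raise the per-user value of a
# digital one. Both curve shapes and all constants are illustrative
# assumptions, not data.

def unit_cost(n_units, base=100.0, alpha=0.3):
    """Scale effects: cost per unit falls as volume grows (power law)."""
    return base / n_units ** alpha

def network_value(n_users, k=0.01):
    """Network effects (Metcalfe's law): total value grows ~ n^2."""
    return k * n_users ** 2

for n in (10, 100, 1000):
    print(f"n={n:5d}  unit cost: {unit_cost(n):7.2f}  "
          f"value per user: {network_value(n) / n:7.2f}")
# Unit cost falls with volume (efficiency gains in space), while value
# per user grows with connectivity (gains in time) - two different moats.
```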

Of course, at the end of the day, all digital businesses thrive on physical infrastructures and all physical businesses thrive on digital infrastructures. This leads to an interesting mixture.

  • As a structure grows, it suffers from internal complexities, which arise from increased interdependencies between an increased number of parts.

  • Similarly, greater connectivity requires greater internal scale. In fact, scalability is a huge challenge for fast-growing digital businesses.

Hence, physical businesses thrive on scale effects but suffer from negative internal network effects (which are basically software problems), and digital businesses thrive on network effects but suffer from negative internal scale effects (which are basically hardware problems). In other words, these two types of businesses are dependent on each other to be able to generate more value.

  • As physical businesses get better at leveraging software solutions to manage their complexity issues, they will break scalability records.

  • As digital businesses get better at leveraging hardware solutions to manage their scalability issues, they will break connectivity records.

Note that we have now ventured beyond the world of economics and entered the much more general world of evolutionary dynamics. Time has two directional arrows:

  • Complexity. Correlates closely with size. Increases over time, as in plants being more complex than cells.

  • Connectivity. Manifests itself as “entropy” at the lowest complexity level (i.e. physics). Increases over time, as evolutionary entities become more interlinked.

Evolution always pushes for greater scale and connectivity.

Side Note: "The larger the brain, the larger the fraction of resources devoted to communications compared to computation." says Sejnowski. Many scientists think that evolution has already reached an efficiency limit for the size of the biological brain. A great example of a digital entity (i.e. the computing mind) whose growing size is limited by the accompanying growing internal complexity which manifests itself in the form of internal communication problems.

Duality 4: Mind vs Body

All governments desire to increase the value of their economies but also feel threatened by the evolutionary inclination of economic units to push for greater scale and connectivity. Western governments (e.g. the US) tend to be more sensitive about size. They monitor and explicitly break up physical businesses that cross a certain size threshold. Eastern governments (e.g. China), on the other hand, tend to be more sensitive about connectivity. They monitor and implicitly take over digital businesses that cross a certain connectivity threshold. (Think of the strict control of social media in China versus the near-total freedom of digital networks in the US.)

Generally speaking, the Western world falls on the right-hand side of the mind-body duality, while the Eastern world falls on the left-hand side.

  • As mentioned above, Western governments care more about the physical aspects of reality (like size) while Eastern governments care more about the mental aspects of reality (like connectivity).

  • Western sciences equate the mind with the brain, and thereby treat software as hardware. Eastern philosophies are infused with panpsychic ideas, ascribing consciousness (i.e. mind-like properties) to the entirety of the universe, and thereby treat hardware as software.

We can think of the duality between digital and physical businesses as the social version of the mind-body duality. When you die, your body gets recycled back into the ecosystem. (This is no different than the machinery inside a bankrupt factory getting recycled back into the economy.) Your mind on the other hand simply disappears. What survive are the impressions you made on other minds. Similarly, when digital businesses die, they leave behind only memories in the form of broken links and cached pages, and therefore need “tombstones” to be remembered. Physical businesses on the other hand leave behind items which continue to circulate in the second-hand markets and buildings which change hands to serve new purposes.

Duality 5: Borrowing Space vs Borrowing Time

Banking too is moving from the space dimension to the time dimension, and this is happening in a very subtle way. Yes, banks are becoming increasingly digital, but this is not what I am talking about at all. Digitalized banks are more efficient at delivering exactly the same services, continuing to serve the old banking needs of the physical economy. What I am talking about is the unique banking needs of the new digital economy. What do I mean by this?

Remember, physical businesses reside in space and scale effects help them make better use of time through efficiency gains. Digital businesses on the other hand reside in time and network effects help them make better use of space through connectivity gains. Hence, their borrowing needs are polar opposites: Physical businesses need to borrow time to accelerate their defensibility in space, while digital businesses need to borrow space to accelerate their defensibility in time. (What matters in the long run is only defensibility!)

But what does it mean to borrow time or space?

  • Lending time is exactly what regular banks do. They give you money and charge you an interest rate, which can be viewed as the cost of moving (discounting) the money you will be making in the future to now. In other words, banks are in the business of creating contractions in the time dimension, not unlike creating wormholes through time. (See the sketch after this list.)

  • The definition of space for a digital company depends on the network it resides in. This could be a specific network of people, businesses, etc. A digital company does not defend itself through scale effects; it defends itself through network effects. Hence its primary goal is to increase the connectivity of its network. In other words, a digital company needs the creation of wormholes through space, not through time. Whatever facilitates the further stitching of its network satisfies its “banking needs”.
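
To make the wormhole-through-time idea concrete, here is a minimal sketch of the standard present-value discounting a regular bank performs. The figures are made up for illustration.

```python
def present_value(future_cash: float, annual_rate: float, years: float) -> float:
    """Discount a future cash flow back to today at a given annual rate."""
    return future_cash / (1 + annual_rate) ** years

# A factory expects $1M in revenue five years from now. At a 10% rate,
# a bank "contracts time" by handing over roughly $620K today instead.
print(present_value(1_000_000, 0.10, 5))  # ~620,921
```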

Bankers of the digital economy are the existing deeply-penetrated networks like Alibaba, WeChat, LinkedIn, Facebook, Amazon etc. What masquerades as a marketing expense for a digital company to rent the connectivity of these platforms is actually in part a “banking” expense, not unlike the interest payments made to a regular bank.

Trend 1: Culture is Shifting from Space to Time

Culturally we are moving from geometry to topology, more often deploying topological rather than geometric language while narrating our lives. We meet our friends in online networks rather than physical spaces.

The correlation between the rise of the digital economy and the rise of the experience economy (and its associated cultural offshoots like the hipster and decluttering movements) is not a coincidence. Experiential goods (not just those that are information based) exhibit the same dynamics as digital goods. They are completely mental and reside in the time dimension.

Our sense of privacy too is shifting from the space dimension to the time dimension. We are growing less sensitive about sharing objects and more sensitive about sharing experiences. We are participating in a myriad of sharing economies, but also becoming more ruthless about time optimization. (What is interpreted as a general decline in attention span is actually a protective measure erected by digital natives, forcing everyone to cut their narratives short.) Increasingly we are spending less time with people, although we look more social from the outside since we share so many objects with each other.

Our sense of aesthetics has started to incorporate time rather than banish it. We leave surfaces unfinished and prefer raw, natural-looking materials to polished, new-looking ones. Everyone has become a wabi-sabi fan, preferring to buy things on which time has taken (or seems to have taken) its toll.

Even physics is caught up in the Zeitgeist. The latest theories all claim that time is fundamental and space is emergent. Popular opinion among physicists used to be the opposite. Einstein had put the final nail in the coffin by completely spatializing time into what is called spacetime, an unchanging four-dimensional block universe. He famously said that “the distinction between past, present, and future is only a stubbornly persistent illusion.”

Trend 2: Progress is Accelerating

As economies and consumption patterns shift to time dimension, we feel more overwhelmed by the demands on our time, and life seems to progress at a faster rate.

Let us dig deeper into this seemingly trivial observation. First recall the following two facts:

  1. In a previous blog post, I talked about the effect of aging on the perception of time. As you accumulate more experience and your library of cognitive models grows, you become more adept at chunking experience and shifting into an automatic mode. What used to be processed consciously now starts getting processed unconsciously. (This is no different from stable software patterns eventually trickling down and hardening into hardware patterns.)

  2. In a previous blog post, I talked about how the goal of education is to learn how not to think, not how to think. In other words, “chunking” is the essence of learning.

Combining these two facts we deduce the following:

  • Learning accelerates perception of time.

This observation in turn is intimately related to the following fact:

  • Progress is accelerating.

What exactly is this relation?

Remember, at the micro level, both learning and progress suffer from the diminishing returns of S-curves. However, at the macro level, both overcome these limits via sheer creativity and manage to stack S-curves on top of each other to form a (composite) exponential curve that shoots toward infinity.
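
To see how stacked S-curves can mimic an exponential, here is a minimal numerical sketch (all parameters are made up for illustration): each logistic curve saturates on its own, but launching a new one with a higher ceiling whenever the previous one plateaus keeps the composite accelerating.

```python
import numpy as np

def s_curve(t, midpoint, ceiling):
    """A single logistic S-curve: slow start, rapid growth, saturation."""
    return ceiling / (1 + np.exp(-(t - midpoint)))

t = np.linspace(0, 50, 501)
# Each new paradigm launches where the last one saturates, with a higher ceiling.
composite = sum(s_curve(t, midpoint=m, ceiling=2**i)
                for i, m in enumerate(range(5, 50, 10)))
# 'composite' rises roughly exponentially even though every component saturates.
```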

This structural similarity is not a coincidence: progress is simply the social version of learning. However, progress happens out in the open, while learning takes place internally within each of our minds and therefore cannot be seen. That is why we cannot see learning in time, but can nevertheless feel its acceleration by reflecting it off time.

Side Note: For those of you who know about Ken Wilber’s Integral Theory, what we found here is that “learning” belongs to the upper-left quadrant while “progress” belongs to the lower-right quadrant. The infinitary limiting point is often called Nirvana in personal learning and Singularity in social progress.

Recall how we framed the duality between digital and physical businesses as the social version of the mind-body duality. True, from the individual’s perspective, progress seems to happen out in the open. However, from the perspective of the mind of the society (represented by the aggregation of all things digital), progress “feels” like learning.

Hence, going back to the beginning of this discussion, your perception of time accelerates for two dual reasons:

  1. Your data processing efficiency increases as you learn more.

  2. Data you need to process increases as society learns more.

Time is about change. Perception of time is about processed change, and how much change your mind can process is a function of both your data processing efficiency (which defines your bandwidth) and the speed of data flow. (You can visualize bandwidth as the diameter of a pipe.) As society learns more (i.e. progresses further), you are bombarded with more change. Thankfully, as you learn more, you also become more capable of keeping up with it.
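
As a toy illustration of this pipe metaphor (the quantities and units here are invented), perceived pace can be modeled as the ratio of incoming change to processing bandwidth:

```python
def perceived_pace(change_per_day: float, bandwidth: float) -> float:
    """How rushed time feels: incoming change relative to processing capacity."""
    return change_per_day / bandwidth

# Social progress grows the numerator; personal learning grows the denominator.
# Time feels faster whenever the numerator outpaces the denominator.
print(perceived_pace(change_per_day=120, bandwidth=100))  # 1.2 -> feels fast
```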

There is an important caveat here though.

  1. Your mind loses its plasticity over time.

  2. The type of change you need to process changes over time.

The combination of these two facts is very problematic. Data processing efficiency is sustained by the cognitive models you develop through experience, based on past data sets. Hence, their continued efficiency is guaranteed only if the future is similar to the past, which of course is increasingly not the case.

As mentioned previously, the exponential character of progress stems from the stacking of S-curves on top of each other. Each new S-curve represents a discontinuous creative jump, a paradigm shift that requires a significant revision of existing cognitive models. As progress becomes faster and life expectancy increases, individuals encounter a greater number of such challenges within their lifetimes. This means they are increasingly at risk of being left behind as the plasticity of their minds declines over time.

This is exactly why the elderly enjoy nostalgia and wrap themselves inside time capsules like retirement villages. Their desire to stop time creates a demographic tension that will become increasingly palpable in the future, as the elderly become increasingly irrelevant while still clinging to their positions of power and keeping the young at bay.

Trend 3: Science is Becoming More Data-Driven

The rise of the digital economy can be thought of as the maturation of the social mind. Society as a whole is aging, not just us. You can tell this also from how science is shifting from being hypothesis-driven to being data-driven, thanks to digital technologies. (Take a look at the blog post I have written on this subject.) The social mind is moving from conscious thinking to unconscious thinking, becoming more intuitive and getting wiser in the process.

Trend 4: Economy is Getting Lighter

As software is taking over the world, information is being infused into everything and our use of matter is getting smarter.

Automobiles weigh less than they once did and yet perform better. Industrial materials have been replaced by nearly weightless high-tech know-how in the form of plastics and composite fiber materials. Stationary objects are gaining information and losing mass, too. Because of improved materials, high-tech construction methods, and smarter office equipment, new buildings today weigh less than comparable ones from the 1950s. So it isn’t only your radio that is shrinking, the entire economy is losing weight too.

Kevin Kelly - New Rules for the New Economy (Pages 73-74)

Energy use in the US has stayed flat despite enormous economic growth. We now make less use of atoms, and the share of tangibles in total equity value is continuously decreasing. As R. Buckminster Fuller said, our economies are being ephemeralized thanks to technological advances that allow us to do "more and more with less and less until eventually [we] can do everything with nothing."

This trend will probably, in a rather unexpected way, ease the global warming problem. (Remember, it is the sheer mass of what is excavated and moved around that drives much of the generation of greenhouse gases.)

Trend 5: Power is Shifting from West to East

Now I will venture even further and bring religion into the picture. There are some amazing historical dynamics at work that can be recognized only by elevating ourselves and looking at the big picture.

First, let us take a look at the Western world.

  • Becoming. The West chose a pragmatic, action-oriented attitude towards Becoming and did not directly philosophize about it.

  • Being. Western religions are built on the notion of Being. Time is deemed to be an illusion and God is thought of as a static all-encompassing Being, not too different from the entirety of Mathematics. There is believed to be an order behind the messy unfolding of Becoming, an order that is waiting to be discovered by us. It is with this deep conviction that Newton managed to discover the first mathematical formalism to predict natural phenomena. There is nothing in the history of science that is comparable to this achievement. Only a religious zeal could have generated the sort of tenacity that is needed to tackle a challenge of this magnitude.

This combination of applying intuition to Becoming and reason to Being eventually led to a meteoric rise in technology and economy.

Side Note: Although an Abrahamic religion itself, Islam did not fuel a similar meteoric rise, because it was practiced more dogmatically. Christianity, on the other hand, reformed itself into a myriad of sub-religions. Although not absolute, there was enough intellectual freedom to allow people to seek unchanging patterns in reality, signs of Being within Becoming. Islam persecuted any such aspirations; even allegorical paintings about Being were not allowed.

The East did the opposite, applying reason to Becoming and intuition to Being.

  • Becoming. The East based its religions on Becoming, and this instilled a fundamental suspicion of any attempt to mathematically model the unfolding reality or to seek absolute knowledge. Of course, reasoning about Becoming without an implicit belief in unchanging absolutes is not an easy task. In fact, it is so hard that one has no choice but to be imprecise and poetic, and of course that is exactly what Eastern religions did. (Think of Taoism.)

  • Being. How about applying intuition to Being? How can you go about experiencing Being directly, through the “heart” so to speak? Well, through non-verbal silent meditation of course! That is exactly what Eastern religions did. (Think of Buddhism.)

Why could the East not reason directly about Becoming in a formal fashion, the way the West reasoned directly about Being using mathematics? Remember Galileo's saying that "Mathematics is the language in which God has written the universe." What would the corresponding statement have been for the East? In other words, what is the formal language of Becoming? It is computer science, of course, which was born out of mathematics in the West around the 1930s.

Now you understand why the West was so lucky. Even if the East had managed to discover computer science first, it would have been useless for understanding Becoming, because without actual hardware to run simulations, you cannot create computational models. A model needs to be run on something. It is not like a mathematical theory in a book, waiting for you to play with it. Historically speaking, mathematics had to come first, because it is the cheaper, more basic technology. All you need is literally a pen, paper and a trash bin.

Side Note: Here is a nerdy joke for you… The dean asks the head of the physics department to see him. “Why are you using so many resources? All those labs and experiments and whatnot; this is getting expensive! Why can’t you be more like mathematicians – they only need pens, paper, and a trash bin. Or philosophers – they only need pens and paper!”

But now is different. We have tremendous amounts of cheap computation and storage at our disposal, allowing us to finally crack the language of Becoming. Our entire economy is shifting from physical to digital, and our entire culture is shifting from space to time. An extraordinary period indeed!

It was never a coincidence that Chinese mathematicians chose to work in (and subsequently dominated) statistics, the most practical field within mathematics. (They are culturally oriented toward Becoming.) Now all these statisticians are turning into artificial intelligence experts, while the West is still being paranoid about the oncoming Singularity, the exponential rise of AI.

Why have the Japanese always loved robots while the West has always been afraid of them? Why is the adoption of digital technologies happening faster in the East? Why are the kids and their parents in the East less worried about being locked into digital screens? As we elaborated above, the answer is metaphysical. Differences in metaphysical frameworks (often inherited from religions) are akin to the hard-to-notice (but exceptionally consequential) differences in the low-level code sitting right above the hardware.

Now guess who will dominate the new digital era? Think of the big picture. Do not extrapolate from the recent past; think of the vast historical patterns.

I believe that people are made equal everywhere and that in the long run whoever is more zealous wins. The East is more zealous about Becoming than the West, and will therefore sooner or later dominate the digital era. Our kids will learn its languages and find its religious practices more attractive. (Meditation is already spreading like wildfire.) What is “cool” will change, and all these things will happen effortlessly, in a mindless fashion, due to the fundamental shift in Zeitgeist and the strong structural forces of economics.

Side Note: Remember, in Duality 4, we said that the East has an intrinsic tendency to regulate digital businesses rather than physical businesses. And here we just claimed that the East has an intrinsic passion for building digital businesses rather than physical ones. Combining these two observations, we can predict that the East will unleash both greater energy and greater restraint in the digital domain. This actually makes a lot of sense, and is in line with the famous marketing slogan of the tire manufacturer Pirelli: “Power is Nothing Without Control.”

Will the pendulum eventually swing back? Will cover pages again feature physical businesses as they did a decade ago? The answer is no. Virtualization is one of the main trends in evolution. Units of evolution are getting smarter and becoming increasingly governed by information dynamics rather than energy dynamics. (Information is substrate independent; hence the term “virtualization”.) Nothing can stop this trend, barring some temporary setbacks here and there.

It seems the West has only two choices in the long run:

  1. It can go through a major religious overhaul and adopt a Becoming-oriented interpretation of Christianity, like that of Teilhard de Chardin.

  2. It can continue as is, and be remembered as the civilization that dominated the short intermediary period which began with the birth of mathematical modeling and ended with the birth of computational modeling. (Equivalently, one could say that the West dominated the industrial revolution and the East will dominate the digital revolution.)


If you liked this post, you will probably enjoy the older post Innovative vs Classical Businesses as well. (Note that digital does not mean innovative and physical does not mean classical. You can have a classical digital or an innovative physical business.)

covid-19 as an agent of progress

Crises are periods of acceleration. The reason why all of us feel so overwhelmed today is simply because time is progressing at a much faster rate than it used to.

It may seem improper for me to use the word “progress” here. After all we are going through a massive health crisis with equally massive economic, social and psychological consequences. What is so progressive about this?

Well, if we leave our anthropocentric framework for a moment and stop thinking about ourselves, focusing instead on the evolution of life in general, what looks like a regression is in fact a progression. In other words, we as humans may be regressing, but nature itself is progressing. In fact, nature never regresses. What seems like a step backwards always eventually turns out to be a precursor to a bigger step forwards. To see this, all we need to do is zoom out in time.

So what happens when we zoom out? We see that the entire evolutionary history is characterized by a series of dialectic progressions through differentiation and integration, an alternating sequence of creation and synthesis of dualities.

Here, the word “synthesis” is very important. Nature does not break and asymmetrically choose one side of the dualities it creates, it transcends them instead, and this transcendence step requires the dualities to stay unbroken and functioning. In other words, nature stands on the shoulders of old dualities to build entirely new, higher-level ones.

What has all this got to do with SARS-CoV-2?

Long story short, SARS-CoV-2 came out of nowhere, dealt a heavy blow to many fault lines, and is now responsible for directly (or indirectly) restoring (or accelerating the formation of) the following six dualities. (Dominant sides are placed on the left. We will delve into each topic later in the post.)

[Figure: the six dualities restored or accelerated by SARS-CoV-2 (dominant sides on the left): Old vs Young, Men vs Women, Humanity vs Environment, West vs East, Local vs Global, Physical vs Digital.]

But how can a small virus do all this? It is not even alive, right? Besides, why should we care about such metaphysical interpretations?

First of all, SARS-CoV-2 too is a life form and deserves the respect that every other life form commands. True, in its inert form, it looks like a simple encapsulation of 30,000 letters, but in action, its complexity is utterly mind-boggling. (Thousands of research papers have been published to date.) Remember, a tree is a seed-in-action. Life is all about information, but information itself can only be recognized when it is in action. (The same can be said for computer programs.) A virus is no different from a seed. It just grows within you rather than out in the open.

Secondly, SARS-CoV-2 is not sadistically killing for fun. (As far as I know, only humans do that.) Like every other living being, it just wants to replicate. Death is collateral damage. It is currently mutating and trying to adapt itself to its new host after crossing over from another species. (Such viruses are called zoonotic viruses.) Over time, it will likely increase in transmissibility and decrease in lethality, and eventually join the harmless community of human coronaviruses that have been co-evolving with us for thousands of years. (Yes, there are lots of viruses that have been co-evolving with us. In fact, some of the technologies in our bodies have direct viral origins, the most dramatic example being the placenta.)

Thirdly, SARS-CoV-2 is of course just minding its own business. The duality restorations themselves are happening because the virus is stressing our systems (biological, sociological, political, economic) to their limits and exposing all the underlying weaknesses. (Generally speaking, the malfunctioning of a duality becomes immediately apparent upon a test of robustness.) To think of this crisis solely in terms of its health effects is dangerously naive, and not deriving the right lessons from a crisis of this magnitude would be a massive waste. What is at stake is the survival of humanity. We need to stop being so myopic and start thinking about the far future rather than the next electoral cycle.

You may say that it is still too early to think in big-picture terms. (As the Chinese premier Zhou Enlai famously remarked, it is still too early to draw final conclusions from the French Revolution.) But in matters of life and death it is always better to be early than late.

Before we delve into the dualities, since there is a lot of misinformation in circulation, I want to first make sure that we are on the same page with respect to a few important background items.

We will inevitably be touching some controversial topics. So now is a great time to drop the legal disclaimer:

All postings on this site, including this one, are my own and do not necessarily represent the strategies or opinions of the organizations I am affiliated with.

We Could Have Been a Lot More Prepared

In Turkey, we say earthquakes do not kill people, bad buildings do. There is a lot of wisdom in this.

Was SARS-CoV-2 an entirely unique, unanticipatable event? Did it catch everyone by surprise? Of course not. Even Bill Gates has been shouting for years that it is only a matter of time before we get hit by another big pandemic and that we are utterly unprepared for it.

Currently, we are suffering from three major bottlenecks:

  1. Hospital Beds. This is particularly easy to solve. China built a 1,000-bed prefabricated hospital in a month. Maybe you cannot do it today in such a short period of time, but you definitely could have if you had thought about it well in advance.

  2. Medical Ventilators. These machines do not require rocket science to build. We could have easily stockpiled hundreds of thousands of them in a decentralized fashion.

  3. Trained Critical-Care Personnel. We could have trained people beforehand, just in case the need arose, focusing on the processes for handling severe pneumonia and assuming that such trainees would always be supervised by doctors managing the tricky cases.

If this is indeed a “war”, then why are we so ill-prepared for it? We routinely allocate trillions of dollars to military defense budgets. Why did we not channel a minuscule amount of that against the risk of a pandemic?

What we have is a case of bad leadership, not some kind of misfortune. We even had an opportunity to lay the scientific foundations for the current frantic vaccine development efforts well in advance, but missed it due to bad risk management practices. (Remember, technology can be developed in a frantic fashion, as we do during wartime, but science cannot be rushed.)

The best-case scenario, as Schwartz sees it, is the one in which this vaccine development happens far too late to make a difference for the current outbreak. The real problem is that preparedness for this outbreak should have been happening for the past decade, ever since SARS. “Had we not set the SARS-vaccine-research program aside, we would have had a lot more of this foundational work that we could apply to this new, closely related virus,” he said. But, as with Ebola, government funding and pharmaceutical-industry development evaporated once the sense of emergency lifted.

James Hamblin - You’re Likely to Get the Coronavirus

We Will Probably Be Unable to Stop This Pandemic

Now that the genie is out of the bottle and the outbreak has reached a pandemic status, we are very unlikely to be able to fully contain this virus. (Asymptomatic "silent spreaders" have made the job particularly hard.)

Since COVID-19 is now so widespread, within countries and around the world, the Imperial model suggests that epidemics would return within a few weeks of the restrictions being lifted. To avoid this, countries must suppress the disease each time it resurfaces, spending at least half their time in lockdown. This on-off cycle must be repeated until either the disease has worked through the population or there is a vaccine which could be months away, if one works at all.

The Economist - Paying to Stop the Pandemic

It seems that, unless the highly speculative mRNA technologies with their very fast development cycles miraculously pay off, there will not be a vaccine around for at least another year or two. Remember, even if something works in the lab, chances are it will fail in the real world and not pass the necessary efficacy and toxicity tests. (The average success rate of new infectious disease medicines entering clinical trials is just 20 percent.)

If we are lucky, the virus will mutate into a more infectious but less lethal form. (It has already branched into several strains, but the mutations so far seem to be trivial.) However, the structural mutations of the sort needed for such a change may render any herd immunity built up against the old version of the virus meaningless and unleash brand new waves of contagion.

There are also question marks about the duration of immunity. In other words, even if we manage to develop a vaccine, it may not be a permanent solution.

OK. Now that we are in sync, we can go back to the dualities.

Duality 1: Old vs Young

SARS-CoV-2 miraculously spares children, killing them only in very rare cases. We should be grateful for this. Although we do not currently have a full grasp of the underlying causal mechanisms, the general patterns of lethality are clear. One obvious correlation is between lethality and age: the risk of death increases exponentially with age.

On the other hand, if you look at who is getting most screwed by the measures taken by governments, it is disproportionately the young.

  • Universities and schools were closed first.

  • Many low-paying entry-level jobs (held mostly by the young) were eliminated first.

  • In times of uncertainty, people fall back on existing connections and trust networks. (In other words, those who have not had any time to build up social capital have nothing to fall back on.)

  • “All generations suffer during an economic crisis. But the consequences last longer for the young. Economic misery has a tendency to compound. Low wages now beget low wages later, and meagre pensions after that.” (Source)

  • Governments may be freely dispensing money today in order to ease the economic pain, but it will be the young who will need to pay off the accumulated government debt in the future.

So, the damage caused by the threat is mostly absorbed by the old, while the damage caused by the reaction against the threat is mostly absorbed by the young. This is clearly not sustainable, but also not that surprising since the decision makers themselves are mostly old people as well.

Prestige, wealth and power have always been concentrated in the hands of the old, but the recycling frequency of this concentration has slowed down significantly owing to advances in life expectancy. As our leaders get older and older, in literally all spheres of life, including academia, politics and business, our society is losing its evolutionary dynamism. This is a dangerous situation since, as technology advances and accelerates the pace of progress, we will need even more cognitive plasticity, not less.

Remember, what stands in the way of progress eventually gets wiped out. Humanity itself owes its own existence to a series of mass extinctions. Evolution is a cold-hearted ruthless bastard.

Will we be able to deal with the catastrophes waiting for us? The answer hinges on how fast we can develop the right (hard and soft) technologies.

Duality 2: Men vs Women

The countries that handled the first wave of infections best are mostly led by women. And who is putting their lives at stake, fighting on the front lines of this outbreak? Health workers, of course, the substantial majority of whom are again women. And who bears more of the burden when preschools and schools stay closed, and nannies and maids can no longer show up to work? Women, of course.

SARS-CoV-2, on the other hand, has a clear preference for men. (The same was true for SARS-CoV-1.) At first, everyone thought this was due to the greater prevalence of smoking among males, but now it looks like smoking may actually decrease the risk of infection. (Apparently nicotine also binds to ACE2, the same cell-membrane protein that the virus binds to in the lungs.) Some say the gender difference could be related to differences in estrogen levels. (Estrogen is now being injected into male patients in clinical trials.) Others say it could be related to differences in ACE2 expression levels. The science is still unsettled.

In any case, what is clear is that this virus hits old men the hardest. This is a particularly interesting group, since it happens to contain the substantial majority of the most powerful people on earth. Look around you. Who is running your country? Who is currently competing to run the US, the most powerful country in the world? (Hint: males over 70.) Have you ever wondered who sits on the boards of the S&P 500 companies? (80 percent male, average age over 60.)

Modern women have some breathing room, yes. But there are glass ceilings everywhere. We have nowhere near enough feminine (empathy-driven, “mother nature” focused) thinking in our power nodes. Our world is still very much a masculine world and we are clearly suffering from this imbalance.

Duality 3: Humanity vs (Rest of the) Environment

Historically speaking, we used to die a lot more often from viruses. Over time we learned how to develop vaccines and keep outbreaks at bay, but recently, largely due to the emergence of zoonotic viruses, the frequency of outbreaks has started to pick up again.

Remember, avian influenza jumped to us from birds, HIV from chimpanzees, Ebola from bats, MERS (also a coronavirus) from camels, and SARS-CoV-2 (to the best of our knowledge) from pangolins. There are many new, potentially far more severe zoonotic events waiting for us in the future.

Deforestation is bringing us into more contact with wild animals. Giant industrial poultry farms are triggering avian influenza outbreaks on an annual basis. Are these really necessary in this age of quantum computers? Do we really need to systematically massacre tens of billions of animals in slaughterhouses while there are so many other dietary options open to us? (I personally prefer pescatarianism, which is basically vegetarianism plus seafood, dairy products and eggs. It is very easy to transition to.)

And what about the exotic animal markets around the world, catering to rich folks who want to spice up their boring lives? Remember, the pangolin is an endangered animal, in fact the most illegally traded mammal in the world.

It is almost as if we brought this crisis onto ourselves. Left unchecked, our disregard for the environment and our boundless appetite for indulgence are going to destroy us. In order to prevent another outbreak, we need to restrain ourselves and reduce what is called the attack surface in cyber security. In other words, we should simply stay away from living beings with which we share so much of our DNA, and therefore so many of our diseases. Thankfully, this has already started happening in the form of closures of animal markets and meat processing plants, most of which suffer from extremely unhygienic working conditions. (Yes, meat prices are going up and there will be shortages, but meat should never have been cheap anyway.)

Duality 4: West vs East

It is easy to forget the fact that we shape our world largely after our ideas. Something as tangible as the maltreatment of the environment can be directly traced back to the sharp object-subject separation promoted by the currently dominant Western worldview. We treat nature as if it is meant to serve our needs, as if it is an object, not a subject. This attitude is actually encouraged in explicit form by all Abrahamic religions, including Christianity, as in the biblical instruction to “subdue the earth and have dominion over the fish of the sea, and over the fowl of the air, and over every living thing.” (Almost all Eastern philosophies, on the contrary, are infused with panpsychic ideas, ascribing consciousness to the entirety of the universe.)

Recent world history can basically be characterized as the rise of the West in all its aspects. This played out well for a while. Technology increased our productive (and destructive) capabilities, and we have become rich beyond belief. However, this exponential rise in living standards has come at the cost of massive externalities in the form of environmental disasters and social inequalities. Overall, our societies have become overly individualist and distrusting, our economies have become overly competitive and efficiency-oriented, and (perhaps most importantly) our worldview has become overly analytical and reductionist.

Clearly, our lopsided philosophy is not sustainable. (This will become even more evident in the next section when we discuss the global nature of the challenges waiting for us.) But how can you balance the mainstream culture? After all, cultural evolution occurs at a very high level and is not independent of the more fundamental, tangible levels of social dynamics lying underneath it. (This was one of the most important observations of Karl Marx.) Long story short, cultural influence requires political and economic influence. In other words, a major cultural shift necessitates a major power shift first.

A magnificent (and potentially very dangerous) power shift has been taking place in front of our eyes for a while. It acquired its most legible form in Donald Trump’s popular campaign slogan “Make America Great Again” and his dramatic fight against the Chinese technology company Huawei. Then SARS-CoV-2 came out of nowhere and accelerated this power shift further.

People are blaming China for all sorts of things today, much of it quite unjustly. Yes, it made some big mistakes during the first few weeks of the outbreak. But look at the situation in the US today. Do you think the world would have been better off if the outbreak had started in the US instead?

China’s centralized government (once it realized the gravity of the situation) swiftly sealed itself off from the world and took draconian social measures (that would be unimaginable in the Western world) in a very short period of time. As a result, it managed to significantly slow down the virus at its source and buy the world at least 2-3 months to prepare. What did the rest of the world do during this time? Nothing. More importantly, thanks to the data shared about the structure, virality and lethality of the virus, the rest of the world never had to operate in complete darkness. This data played a vital role in the formation of initial policy decisions in the Western world.

People are angry at China today, essentially because they feel they are bearing a disproportionate amount of the suffering. But is it really China’s fault that it has managed to bounce back in such a short period of time, and that it enjoys significant structural advantages in handling a crisis of this sort?

  • Cultural differences matter. It is not a coincidence that Eastern countries (China, South Korea, Singapore, Taiwan) have all handled the crisis well, while Western countries, where people have low trust in each other and their governments, are all having a hard time containing the panic and mobilizing a harmonious front against the virus.

  • Generally speaking, centralized systems perform better under stressful conditions, and decentralized systems perform better under relaxed conditions. I do not know if you have noticed, but the world has turned completely communist in a matter of weeks. Big corporations are begging for help, and governments are postponing taxes, indiscriminately extending credit lines to everybody, guaranteeing bank loans and even helicoptering money around. While the federal government in the US is failing to establish coordination across states, China’s centralized government can at any time instantly mobilize even its tiniest capillaries.

  • Privacy is a huge issue in individualist Western countries. Meanwhile, China is reaping the rewards of its years of investment in surveillance technologies, tracking everyone and collecting all relevant data in one place where it becomes actionable. It can instantly detect and isolate any new local outbreaks. I know, China is bad, in the sense that there is no freedom of speech there. But the US is bad too. Nearly 1 out of every 100 Americans is in prison or jail, an incredibly high ratio by world standards. Being a superpower seems to correlate with tyrannical internal control, either in a “preventive” form (as in China) or in a “therapeutic” form (as in the US).

  • China has become a world unto itself, with its giant interconnected population and diminishing reliance on external demand to prop up its economy. Remember, the US emerged as the world leader after World War II primarily because it managed to stay out of the mayhem that ravaged everyone else. China seems to be in the exact same position today, with its ability to seal itself off from the pandemic. At some point, it will no doubt consider deploying something similar to the Marshall Plan. In fact, this has already started happening in some form with the high-profile deliveries of medical equipment.

The most important thing that China has demonstrated to the world is that there is an alternative way of becoming a superpower based on a radically different philosophy of governance. This is exactly what is scaring the shit out of Western leaders and what has shocked me on a personal level as well. When someone shatters your worldview and wakes you up to the dual nature of truth, it really hurts. You feel enlightened, but also duped and angry.

“The opposite of a fact is falsehood, but the opposite of one profound truth may very well be another profound truth.”
- Niels Bohr

China has lifted a billion people out of poverty in a spectacular growth story, and is today making colossal bets on revolutionary technologies like AI and blockchain, while the US is pathetically cutting back its R&D spending. China’s super-efficient bureaucracy, run by the top brains in the country in a meritocratic tradition, is exhibiting the kind of long-term planning we desperately need, while the US can no longer think beyond the next election cycle and has proven itself utterly incapable of leading us on global challenges like climate change.

Duality 5: Local vs Global

Thanks to the unstoppable march of globalization, the world has become interconnected in so many different ways. Ideas spread quickly thanks to vast social media platforms with billions of users. Viruses spread quickly thanks to the vast number of flights between hundreds of cities. I mean, think about it. One person eating an exotic animal in China eventually causes the stock market in the US to collapse. How amazing is that? (It is also interesting how social media is playing a non-trivial role in this drama.)

So, in some sense, globalization reinforces itself by quickly amplifying local problems to a scale that requires a global approach, which in turn requires better global governance. Today we have a pandemic on our hands, but the world has completely failed to act in unison. This means we have a lot more work to do, which of course is not a surprise to anybody. We have already seen a slow version of the same film. It was called the Climate Change Fiasco.


Remember, the West did not even lift a finger while China was crumbling for two months. No pharmaceutical company was willing to develop a vaccine back then; look how many are racing today. The world’s novel drug development capacity is almost entirely concentrated in the US and Europe, and vaccine manufacturing know-how is concentrated in just four companies. Should we feel lucky that Americans and Europeans are dying along with the rest of us?

Even the developed countries cannot agree among themselves on what actions need to be taken. Not only do the responses of each country differ, but their timings do as well, causing the virus to slow down here and accelerate there. This lack of uniformity and synchrony implies that even China’s own declaration of victory was premature. As long as the virus is still circulating around the globe, it will eventually find its way back into every single country.

I hope wealthy nations include poorer ones in these preparations, especially by devoting more foreign aid to building up their primary health-care systems. Even the most self-interested person—or isolationist government—should agree with this by now. This pandemic has shown us that viruses don’t obey border laws and that we are all connected biologically by a network of microscopic germs, whether we like it or not. If a novel virus appears in a poor country, we want its doctors to have the ability to spot it and contain it as soon as possible.

The Economist - Bill Gates on How to Fight Future Pandemics

Of course, it is ridiculously naive to expect global coordination in an unequal world. Developing countries with barely functional health systems and already fragile economies cannot afford the radical actions taken by developed countries. The inequality is drastic. For instance, Italy has 41 doctors per 10,000 people, while Africa has only 2. (Source) Millions will die in Africa and will probably get less global media coverage than Italy alone received.

Similar coordination issues popped up during the climate change debate. A substantial portion of the carbon dioxide stock causing global warming today is due to the past economic activities of developed countries. Putting a cap on this stock literally amounts to asking developing countries to stop developing simply because they are late to the game. Of course they cannot comply; what do we expect?

The lesson is simple: if you ignore inequality, it will eventually bite you back, because everything is interconnected. We are living on the same goddamn globe, breathing the same goddamn air, drinking the same goddamn water.

Of course, there is inequality not just among countries, but also within them. Poor people everywhere are a lot more likely to suffer from obesity, malnutrition, poor hygiene, air pollution and high population density, all of which increase the risk of death from Covid-19. They are also affected the most by the drastic measures taken by governments, since they often have no savings, no safety nets, no access to proper healthcare, no private cars, no spaces in which to self-isolate and no jobs that can be done remotely.

Duality 6: Physical vs Digital

Do you know what is truly global, by birth? Digital businesses. (That is why they find it difficult to localize themselves and why governments find it difficult to regulate them.)

Do you know which businesses are completely unaffected by, and are even benefiting from, the current crisis? Again, digital businesses. They effortlessly adjusted to work-from-home conditions, and consumption of all things digital has skyrocketed. Microsoft, Apple, Amazon, Alphabet and Facebook alone now account for more than 20 percent of the market capitalization of the S&P 500.

We have been witnessing the rise of the digital for a while now. (A topic very dear to my heart!) This trend was best articulated by Marc Andreessen, who presciently observed that software is eating the world. We are infusing information into everything we use, deploying more bits and fewer atoms. Matter is getting smarter and products are getting lighter. Our entire economy is slowly being virtualized and ephemeralized.

A higher level of complexity is emerging above us, a higher form of life so to speak, based on silicon + light rather than carbon + water. (Silicon is the new abundant element facilitating construction, and light is the new fluid environment facilitating communication.) This is the next step in the grand narrative of life, which is evolving towards an enigmatic singularity. We are collectively giving birth to something whose complexity will be categorically beyond our comprehension, and just like every other birth, the process itself will be full of trauma and pain. In this particular case, it will require social reform and a restoration of all the dualities we have been talking about.

Again, as we pointed out at the very beginning of this post, nature does not create dualities for no reason. The newly emerging one between digital businesses and physical businesses is no exception. (Think of it as the society-level version of the mind-body duality where the mind is maturing late in the game just as it matured late in the evolution of biology.) Time unfolds through the dynamisms unleashed by such dualities and nature progresses to higher level complexities by synthesizing these dualities in a dialectical fashion. (You are the synthesis of your mind and body.)

So what exactly is SARS-CoV-2?

  • If you really zoom in, it is a simple string of 30,000 letters wrapped inside a spiky sphere less than 100 nanometers in diameter.

  • If you really zoom out, it is a dialectical agent, speeding up a traumatic birth process, inflicting pain but also pushing in the right direction.

In other words, the answer depends on how you want to look at the question.

hypothesis-driven vs data-driven science

Science progresses in a dualistic fashion. You can either generate a new hypothesis out of existing data and conduct science in a data-driven way, or generate new data for an existing hypothesis and conduct science in a hypothesis-driven way. For instance, when Kepler was looking at the astronomical data sets to come up with his laws of planetary motion, he was doing data-driven science. When Einstein came up with his theory of General Relativity and asked experimenters to verify the theory’s prediction for the anomalous rate of precession of the perihelion of Mercury's orbit, he was doing hypothesis-driven science.

Similarly, technology can be problem-driven (the counterpart of “hypothesis-driven” in science) or tool-driven (the counterpart of “data-driven” in science). When you start with a problem, you look for what kind of (existing or not-yet-existing) tools you can throw at the problem, and in what kind of combination. (This is similar to thinking about what kind of experiments you can run to generate data relevant to a hypothesis.) Conversely, when you start with a tool, you try to find a use case in which to deploy it. (This is similar to starting off with a data set and digging around to see what kind of hypotheses you can extract from it.) Tool-driven technology development is much more risky and stochastic. It is taboo for most technology companies, since investors do not like random tinkering and prefer funding problems with high potential economic value and entrepreneurs who “know” what they are doing.

Of course, new tools allow you to ask new kinds of questions of existing data sets. Hence, problem-driven technology (by developing new tools) leads to more data-driven science. And this is exactly what is happening now, at a massive scale. With the development of cheap cloud computing (and storage) and deep learning algorithms, scientists are equipped with some very powerful tools to attack old data sets, especially in complex domains like biology.
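
As a toy illustration of the data-driven mode (the data set and parameter choices here are just stand-ins), one can start from an existing data set with no hypothesis at all and let an algorithm surface candidate structure:

```python
from sklearn.datasets import load_iris
from sklearn.cluster import KMeans

# Data-driven mode: no hypothesis up front, just look for structure.
X, _ = load_iris(return_X_y=True)
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# The discovered clusters become candidate hypotheses ("there seem to be
# three kinds of flowers here") to be tested later in hypothesis-driven mode.
print(clusters[:10])
```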


Higher Levels of Serendipity

One great advantage of data-driven science is that it involves tinkering and “not really knowing what you are doing”. This leads to fewer biases and more serendipitous connections, and thereby to the discovery of more transformative ideas and hitherto unknown interesting patterns.

Hypothesis-driven science has a direction from the beginning. Hence, surprises are hard to come by unless you have exceptionally creative intuitive capabilities. For instance, the theory of General Relativity was based on one such intuitive leap by Einstein. (There has not been such a great leap since; it is extremely rare.) Quantum Mechanics, on the other hand, was literally forced on us by experimental data. It was so counterintuitive that people refused to believe it. All they could do was turn their intuition off and listen to the data.

Previously, data sets were not huge, so scientists could literally eyeball them. Today this is no longer possible. That is why scientists now need computers, algorithms and statistical tools to help them decipher new patterns.

Governments do not give money to scientists so that they can tinker around and do whatever they want. A scientist applying for a grant needs to know what they are doing. This forces everyone into a hypothesis-driven mode from the beginning and thereby leads to fewer transformative ideas in the long run. (Hat tip to Mehmet Toner for this point.)

Science and technology are polar opposite endeavors. Governments funding science the way investors fund technology is a major mistake, and also an important reason why some of the most exciting science today is being done inside closed private companies rather than open academic communities.


Less Democratic Landscape

There is another good reason why the best scientists are leaving academia. You need good quality data to do science within the data-driven paradigm, and since data is so easily monetizable, the largest data sets are being generated by private companies. So it is not surprising that the most cutting-edge research in fields like AI is being done inside companies like Google and Facebook, which also provide the compute power necessary to play around with these data sets.

While hypothesis generation gets better when it is conducted in a decentralized, open manner, the natural tendency of data is to be centralized under one roof, where it can be harmonized and maintained consistently at high quality. As they say, “data has gravity”. Once you pass certain critical thresholds, data starts generating strong positive feedback effects and thereby attracts even more data. That is why investors love it. Using smart data strategies, technology companies can build a moat around themselves and render their business models a lot more defensible.

In a typical private company, what data scientists do is throw thousands of different neural networks at some massive internal data sets and simply observe which one gets the job done best. This is empiricism in its purest form, no different from blindly screening millions of compounds during drug development. As they say, just throw it against the wall and see what sticks.
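
Here is a minimal sketch of that throw-it-against-the-wall empiricism, assuming a stand-in public data set and a few off-the-shelf models; nothing about the choice of candidates is principled, which is precisely the point:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

X, y = load_breast_cancer(return_X_y=True)
candidates = {
    "logistic": LogisticRegression(max_iter=5000),
    "forest": RandomForestClassifier(random_state=0),
    "boosting": GradientBoostingClassifier(random_state=0),
}
# Pure empiricism: no theory about which model "should" win, just scores.
scores = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in candidates.items()}
print(max(scores, key=scores.get), scores)
```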

This brings us to a major problem about big-data-driven science.


Lack of Deep Understanding

There is now a better way. Petabytes allow us to say: "Correlation is enough." We can stop looking for models. We can analyze the data without hypotheses about what it might show. We can throw the numbers into the biggest computing clusters the world has ever seen and let statistical algorithms find patterns where science cannot.

Chris Anderson - The End of Theory

We cannot understand the complex machine learning models we are building. In fact, we train them the same way one trains a dog. That is why they are called black-box models. For instance, when the stock market experiences a flash crash, we blame the algorithms for getting into a stupid loop, but we never really understand why they do so.

Is there any problem with this state of affairs if these models get the job done, make good predictions and (even better) earn us money? Can scientists not adopt the same pragmatic attitude as technologists, focus on results only, suffice with the successful manipulation of nature, and leave true understanding aside? Are the data sizes not already too huge for human comprehension anyway? Why do we expect machines to be able to explain their thought processes to us? Perhaps they are the beginnings of a higher-level life form, and we should learn to trust them with the activities they are better at than us?

Perhaps we have been under an illusion all along, and our analytical models have never really penetrated that deeply into nature anyway?

Closed analytic solutions are nice, but they are applicable only for simple configurations of reality. At best, they are toy models of simple systems. Physicists have known for centuries that the three-body problem or three-dimensional Navier-Stokes do not afford closed-form analytic solutions. This is why all calculations about the movement of planets in our solar system or turbulence in a fluid are performed by numerical methods using computers.

Carlos E. Perez - The Delusion of Infinite Precision Numbers
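To make Perez’s point concrete, here is a minimal sketch of how such calculations are actually done: a planar three-body system integrated numerically with SciPy. The unit masses, G = 1, and the initial conditions are arbitrary illustrative choices.

    # No closed-form solution exists, so we integrate Newton's
    # equations step by step and settle for numbers.
    import numpy as np
    from scipy.integrate import solve_ivp

    G = 1.0
    masses = np.array([1.0, 1.0, 1.0])

    def gravity(t, state):
        pos = state[:6].reshape(3, 2)   # (x, y) of each body
        vel = state[6:].reshape(3, 2)
        acc = np.zeros_like(pos)
        for i in range(3):
            for j in range(3):
                if i != j:
                    r = pos[j] - pos[i]
                    acc[i] += G * masses[j] * r / np.linalg.norm(r) ** 3
        return np.concatenate([vel.ravel(), acc.ravel()])

    state0 = np.array([-1.0, 0.0, 1.0, 0.0, 0.0, 0.5,    # positions
                       0.0, -0.5, 0.0, 0.5, 0.5, 0.0])   # velocities
    sol = solve_ivp(gravity, (0, 5), state0, rtol=1e-9)
    print(sol.y[:2, -1])  # final position of body 1: numbers, not a formula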

Is it a surprise that as our understanding gets more complete, our equations become harder to solve?

To illustrate this point of view, we can recall that as the equations of physics become more fundamental, they become more difficult to solve. Thus the two-body problem of gravity (that of the motion of a binary star) is simple in Newtonian theory, but unsolvable in an exact manner in Einstein’s Theory. One might imagine that if one day the equations of a totally unified field are written, even the one-body problem will no longer have an exact solution!

Laurent Nottale - The Relativity of All Things (Page 305)

It seems like the entire history of science is a progressive approximation to an immense computational complexity via increasingly sophisticated (but nevertheless quite simplistic) analytical models. This trend obviously is not sustainable. At some point we should perhaps just stop theorizing and let the machines figure out the rest:

In new research accepted for publication in Chaos, they showed that improved predictions of chaotic systems like the Kuramoto-Sivashinsky equation become possible by hybridizing the data-driven, machine-learning approach and traditional model-based prediction. Ott sees this as a more likely avenue for improving weather prediction and similar efforts, since we don’t always have complete high-resolution data or perfect physical models. “What we should do is use the good knowledge that we have where we have it,” he said, “and if we have ignorance we should use the machine learning to fill in the gaps where the ignorance resides.”

Natalie Wolchover - Machine Learning’s ‘Amazing’ Ability to Predict Chaos
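The hybrid recipe Ott describes can be caricatured in a few lines: let the imperfect physical model do what it can, and train a statistical model to correct its residual error. This is only a cartoon of the idea (not the actual reservoir-computing setup used in the paper), and the toy dynamics below are invented for illustration.

    # "Use the good knowledge that we have where we have it" = the model;
    # "use the machine learning to fill in the gaps" = the corrector.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)

    def true_system(x):        # the real dynamics (unknown to us)
        return np.sin(x) + 0.3 * np.sin(3 * x)

    def imperfect_model(x):    # our incomplete physical knowledge
        return np.sin(x)

    x = rng.uniform(0, 2 * np.pi, 500)
    residual = true_system(x) - imperfect_model(x)  # where the ignorance resides

    corrector = RandomForestRegressor(random_state=0)
    corrector.fit(x.reshape(-1, 1), residual)

    x_new = np.array([[1.0], [2.5]])
    hybrid = imperfect_model(x_new.ravel()) + corrector.predict(x_new)
    print(hybrid)                      # hybrid prediction
    print(true_system(x_new.ravel()))  # ground truth, for comparison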

Statistical approaches like machine learning have often been criticized for being dumb. Noam Chomsky has been especially vocal about this:

You can also collect butterflies and make many observations. If you like butterflies, that's fine; but such work must not be confounded with research, which is concerned to discover explanatory principles.

- Noam Chomsky as quoted in Colorless Green Ideas Learn Furiously

But these criticisms are akin to calling reality itself dumb, since what we feed into the statistical models are basically virtualized fragments of reality. Analytical models conjure up abstract epiphenomena to explain phenomena, while statistical models use phenomena to explain phenomena, turning reality directly onto itself. (The reason why deep learning is so much more effective than its peers among machine-learning models is that it is hierarchical, just like reality is.)

This brings us to the old dichotomy between facts and theories.


Facts vs Theories

Long before the computer scientists came onto the scene, prominent humanists (and historians) were fiercely defending fact against theory.

The ultimate goal would be to grasp that everything in the realm of fact is already theory... Let us not seek for something beyond the phenomena - they themselves are the theory.

- Johann Wolfgang von Goethe

Reality possesses a pyramid-like hierarchical structure. It is governed from the top by a few deep high-level laws, and manifested in its utmost complexity at the lowest phenomenological level. This means that there are two strategies you can employ to model phenomena.

  • Seek the simple. Blow your brains out, discover some deep laws and run simulations that can be mapped back to phenomena.

  • Bend the complexity back onto itself. Labor hard to accumulate enough phenomenological data and let the machines do the rote work.

One approach is not inherently superior to the other, and both are hard in their own ways. Deep theories are hard to find, and good quality facts (data) are hard to collect and curate in large quantities. Similarly, a theory-driven (mathematical) simulation is cheap to set up but expensive to run, while a data-driven (computational) simulation (of the same phenomena) is cheap to run but expensive to set up. In other words, while a data-driven simulation is parsimonious in time, a theory-driven simulation is parsimonious in space. (Good computational models satisfy a dual version of Occam’s Razor. They are heavy in size, with millions of parameters, but light to run.)
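The trade-off can be sketched in a few lines of Python (the toy differential equation and all numbers are invented for illustration). The theory-driven route stores one line of physics but pays at every query by integrating it; the data-driven route pays a heavy one-off training cost and then answers each query with a cheap forward pass.

    import numpy as np
    from scipy.integrate import solve_ivp
    from sklearn.neural_network import MLPRegressor

    def theory_driven(t_query):
        # parsimonious in space: one equation; expensive in time per query
        sol = solve_ivp(lambda t, y: -y + np.sin(3 * t), (0, t_query),
                        [1.0], rtol=1e-8)
        return sol.y[0, -1]

    # parsimonious in time, expensive in space: train a parameter-heavy
    # surrogate once, up front
    ts = np.linspace(0.1, 10, 400).reshape(-1, 1)
    ys = np.array([theory_driven(t[0]) for t in ts])
    surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000,
                             random_state=0).fit(ts, ys)

    # afterwards, a query is a single cheap forward pass
    print(theory_driven(7.3), surrogate.predict([[7.3]])[0])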

Some people try to mix the two philosophies, injecting our causal models into the machines in the hope of enjoying the best of both worlds. I believe this approach is fundamentally mistaken, even if it proves fruitful in the short run. Rather than biasing the machines with our theories, we should just ask them to economize their own thought processes and thereby come up with their own internal causal models and theories. After all, abstraction is just a form of compression, and when we talk about causality we (in practice) mean causality as it fits into the human brain. In the actual universe, everything is completely interlinked with everything else, and causality diagrams are unfathomably complicated. Hence, we should be wary of pre-imposing our theories on machines whose intuitive powers will soon surpass ours.

Remember that, in biological evolution, the development of unconscious (intuitive) thought processes came before the development of conscious (rational) thought processes. It should be no different for the digital evolution.

Side Note: We suffered an AI winter for mistakenly trying to flip this order, asking machines to develop rational capabilities before developing intuitional ones. When a scientist comes up with a hypothesis, it is a simple, effable distillation of an unconscious intuition which is of an ineffable, complex statistical form. In other words, it is always “statistics first”. Sometimes the progression from the statistical to the causal takes place out in the open, among a community of scientists (as happened in the smoking-causes-cancer research), but more often it takes place inside the mind of a single scientist.


Continuing Role of the Scientist

Mohammed AlQuraishi, a researcher who studies protein folding, wrote an essay exploring a recent development in his field: the creation of a machine-learning model that can predict protein folds far more accurately than human researchers. AlQuraishi found himself lamenting the loss of theory over data, even as he sought to reconcile himself to it. “There’s far less prestige associated with conceptual papers or papers that provide some new analytical insight,” he said, in an interview. As machines make discovery faster, people may come to see theoreticians as extraneous, superfluous, and hopelessly behind the times. Knowledge about a particular area will be less treasured than expertise in the creation of machine-learning models that produce answers on that subject.

Jonathan Zittrain - The Hidden Costs of Automated Thinking

The role of scientists in the data-driven paradigm will obviously be different but not trivial. Today’s world-champions in chess are computer-human hybrids. We should expect the situation for science to be no different. AI is complementary to human intelligence and in some sense only amplifies the already existing IQ differences. After all, a machine-learning model is only as good as the intelligence of its creator.

He who loves practice without theory is like the sailor who boards ship without a rudder and compass and never knows where he may cast.

- Leonardo da Vinci

Artificial intelligence (at least in its current form) is like a baby. Either it is spoon-fed data or it gorges on everything. But, as we know, what makes great minds great is what they choose not to consume. This is where the scientists come in.

Deciding which experiments to conduct and which data sets to use are no trivial tasks. Choosing which portion of reality to “virtualize” is an important judgment call. Hence all data efforts are inevitably hypothesis-laden and therefore non-trivially involve the scientist.

For 30 years quantitative investing started with a hypothesis, says a quant investor. Investors would test it against historical data and make a judgment as to whether it would continue to be useful. Now the order has been reversed. “We start with the data and look for a hypothesis,” he says.

Humans are not out of the picture entirely. Their role is to pick and choose which data to feed into the machine. “You have to tell the algorithm what data to look at,” says the same investor. “If you apply a machine-learning algorithm to too large a dataset often it tends to revert to a very simple strategy, like momentum.”

The Economist - March of the Machines

True, each data-generation effort is hypothesis-laden, and each scientist comes with a unique set of biases generating a unique set of judgment calls. But at the level of society, these biases eventually get washed out through (structured) randomization via sociological mechanisms and historical contingencies. In other words, unlike the individual, society as a whole operates in a non-hypothesis-laden fashion and eventually figures out the right angle. The role (and the responsibility) of the scientist (and of scientific institutions) is to cut this search period as short as possible by simply being smart about it, in a fashion not too different from how enzymes speed up chemical reactions by lowering activation-energy costs. (A scientist’s biases are actually his strengths, since they implicitly contain lessons from eons of evolutionary learning. See the side note below.)

Side Note: There is a huge misunderstanding that evolution progresses via chance alone. Pure randomization is a sign of zero learning. Evolution, on the other hand, learns over time and embeds this knowledge at all complexity levels, ranging all the way from genetic to cultural forms. As evolutionary entities become more complex, the search becomes smarter and the progress becomes faster. (This is how protein synthesis and folding happen incredibly fast within cells.) Only at the very beginning, in its simplest form, does evolution try out everything blindly. (Physics is so successful because its entities are so stupid and comparatively much easier to model.) In other words, the commonly raised argument against the possibility of evolution achieving so much by pure chance alone is correct. As mathematician Gregory Chaitin points out, “real evolution is not at all ergodic, since the space of all possible designs is much too immense for exhaustive search”.

Another venue where scientists keep playing an important role is in transferring knowledge from one domain to another. Remember that there are two ways of solving hard problems: diving into the vertical (technical) depths or venturing across horizontal (analogical) spaces. Machines are horrible at venturing horizontally precisely because they do not get to the gist of things. (This was Noam Chomsky’s criticism, quoted above.)

Deep learning is kind of a turbocharged version of memorization. If you can memorize all that you need to know, that’s fine. But if you need to generalize to unusual circumstances, it’s not very good. Our view is that a lot of the field is selling a single hammer as if everything around it is a nail. People are trying to take deep learning, which is a perfectly fine tool, and use it for everything, which is perfectly inappropriate.

- Gary Marcus as quoted in Warning of an AI Winter


Trends Come and Go

Generally speaking, there is always a greater appetite for digging deeper for data when there is a dearth of ideas. (Extraction becomes more expensive as you dig deeper, as in mining operations.) Hence, the current trend of data-driven science is partially due to the fact that scientists themselves have run out of sensible falsifiable hypotheses. Once the hypothesis space becomes rich again, the pendulum will inevitably swing back. (Of course, who will be doing the exploration is another question. Perhaps it will be the machines, and we will be doing the dirty work of data collection for them.)

As mentioned before, data-driven science operates stochastically in a serendipitous fashion, and hypothesis-driven science operates deterministically in a directed fashion. Nature, on the other hand, loves to use stochasticity and determinism together, since optimal dynamics reside, as usual, somewhere in the middle. (That is why there are tons of natural examples of structured randomness, such as Lévy flights.) Hence we should learn to appreciate the complementarity between data-drivenness and hypothesis-drivenness, and embrace the duality as a whole rather than trying to break it.
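As a minimal sketch of what structured randomness looks like, compare Gaussian steps with heavy-tailed Lévy-type steps (the Pareto tail index of 1.5 is an arbitrary illustrative choice):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000

    # pure diffusion: all steps are of comparable size
    gauss_steps = rng.normal(size=n)

    # Levy-type flight: mostly small steps, occasionally huge jumps
    levy_steps = rng.pareto(1.5, size=n) * rng.choice([-1, 1], size=n)

    print(np.abs(gauss_steps).max())  # a few standard deviations at most
    print(np.abs(levy_steps).max())   # orders of magnitude larger

The mixture of many small, local moves with rare long-range jumps is exactly the blend of directed exploitation and serendipitous exploration described above.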


If you liked this post, you will also enjoy the older post Genius vs Wisdom where genius and wisdom are framed respectively as hypothesis-driven and data-driven concepts.

science vs technology

  • Science (as a form of understanding) gets better as it zooms out. Technology (as a form of service) gets better as it zooms in. Science progresses through unifications and technology progresses through diversifications.

  • Both science and technology progress like a jellyfish moves through the water, via alternating movements of contractions (i.e. unifications) and relaxations (i.e. diversifications). So neither science nor technology can be pictured as a simple linear trend of unification or diversification. Technology goes through waves of standardization for the sake of efficiency and de-standardization for the sake of a better fit. Progress happens because each new wave of de-standardization (magically) achieves a better fit than the previous wave, thanks to an intermittent period of standardization. The opposite happens in science, where each new wave of unification (magically) reaches a higher level of accuracy than the previous wave, thanks to an intermittent period of diversification.

  • Unification is easier to achieve in a single mind. Diversification is easier to achieve among many minds. That is why the scientific world is permeated by the lone genius culture and the technology world is permeated by the tribal team-work culture. Scientists love their offices, technologists love their hubs.

“New scientific ideas never spring from a communal body, however organised, but rather from the head of an individually inspired researcher who struggles with his problems in lonely thought and unites all his thought on one single point which is his whole world for the moment.”
- Max Planck

  • Being the originator of widely adopted scientific knowledge makes the originator powerful, while being the owner of privately kept technological knowledge makes the owner powerful. Hence, the best specimens of unifications quickly get diffused out of the confined boundaries of a single mind, and the best specimens of diversifications quickly get confined from the diffused atmosphere of many minds.

  • Unifiers, standardizers tend to be more masculine types who do not mind being alone. Diversifiers, de-standardizers tend to be more feminine types who can not bear being alone. That is why successful technology leaders are more feminine than the average successful leader in the business world, and successful scientific leaders are more masculine than the average successful leader in the academic world. Generally speaking, masculine types suffer more discrimination in the technology world and feminine types suffer more discrimination in the scientific world.

  • Although unifiers play the more important role in science, we usually give the most prestigious awards to the diversifiers who deployed the new tools invented by the unifiers on famous, tangible problems. Although diversifiers play the more important role in technology, we usually remember and acknowledge only the unifiers who crystallized the vast efforts of diversifiers into tangible, popular formats.

  • Technological challenges lie in efficient specializations. Scientific challenges lie in efficient generalizations. You need to learn vertically and increase your depth to come up with better specializations. This involves learning-to-learn-new, meaning that what you will learn next will be built on what you learned before. You need to learn horizontally and increase your range to come up with better generalizations. This involves learning-to-relearn-old, meaning that what you learned before will be recast in the light of what you will learn next.

  • Technology and design are forms of service. Science and art are forms of understanding. That is why the intersection of technology and art, as well as the intersection of science and design, is full of short-lived garbage. While all our “external” problems can be traced back to a missing tool (technological artifact) or a wrong design, all our “internal” problems can be traced back to a missing truth (scientific fact) or a wrong aesthetics (i.e. a wrong way of looking at the world).

  • Scientific progress contracts the creative space of religion by outright disproval of certain ideas and increases the expressive power of religion by supplying it with new vocabularies. (Note that the metaphysical part of religion can be conceived as “ontology design”.) Technological progress contracts the creative space of art by outright trivialization of certain formats and increases the expressive power of art by supplying it with new tools. (Think of the invention of photography rendering realistic painting meaningless, and the invention of synthesizers leading to new types of music.) In other words, science and technology aid religion and art, respectively, in discovering their inner cores, by both limiting the domain of exploration and increasing the efficacy of exploration. (Notice that artists and theologians are on the same side of the equation. We often forget this, but as Joseph Campbell reminds us, contemporary art plays an important role in updating our mythologies and keeping the mysteries alive.)

  • Scientific progress replaces mysteries with more profound mysteries. Technological progress replaces problems with more complex problems.

  • Both science and technology progress through hype cycles, science through how much phenomena the brand new idea can explain, technology through how many problems the brand new tool can solve.

  • Scientific progress slows down when money is thrown at ideas rather than people. Technological progress slows down when money is thrown at people rather than ideas.

  • Science progresses much faster during peacetime, technology progresses much faster during wartime. Scientific breakthroughs often precede new wars, technological breakthroughs often end ongoing wars.

where extremes meet

Here are five examples where extremes meet and result in sameness despite the diametrically opposed states of mind.

Happiness

The pathologically happy do not worry because they do not realize that there is anything worth worrying about. The severely depressed do not give a shit about anything either, but theirs is a wise apathy that knows itself.


Knowledge

Knowledge has two extremes which meet; one is the pure natural ignorance of every man at birth, the other is the extreme reached by great minds who run through the whole range of human knowledge, only to find that they know nothing and come back to the same ignorance from which they set out, but it is a wise ignorance which knows itself.

- Blaise Pascal

Reality

That Nirvana and Samsara are one is a fact about the nature of the universe; but it is a fact which cannot be fully realized or directly experienced, except by souls far advanced in spirituality.

Aldous Huxley - The Perennial Philosophy (Page 70)

Empathy

One study found that the most empathetic nurses were most likely to avoid dying patients early in their training, before they had learned to deal with the distress caused by empathizing too much. Overempathy can look from the outside like selfishness - and even produce selfish behavior.

Bruce D. Perry - Born for Love (Page 44)

Sense of Heat

The human sense of hot or cold exhibits the queer feature of ‘les extremes se touchent’: if we inadvertently touch a very cold object, we may for a moment believe that it is hot and has burnt our fingers.

Erwin Schrödinger - Mind and Matter (Page 158)

genius vs wisdom

Genius maxes out at birth and gradually diminishes. Wisdom displays the opposite dynamics: it is nonexistent at birth and gradually builds up until death. That is why genius is often seen as a potentiality and wisdom as an actuality. (It is the young who have potentiality, not the old.)

Midlife crises tend to occur around the time when wisdom surpasses genius. That is why earlier maturation correlates with earlier “mid” life crisis. (On the other hand, greater innate genius does not result in a delayed crisis since it entails faster accumulation of wisdom.)


"Every child is an artist. The problem is how to remain an artist once we grow up."
- Pablo Picasso

Here Picasso is actually asking you to maintain your genius at the cost of accumulating less wisdom. That is why creative folks tend to be quite unwise (and require the assistance of experienced talent managers to succeed in the real world). They methodically wrap themselves inside protective environments that allow them to pause or postpone their maturation.

Generally speaking, the greater control you have over your environment, the less wisdom you need to survive. That is why the wisest people originate from tough, low-survival-rate conditions, and rich families have a hard time raising unspoiled kids without simulating artificial scarcities. (Poor folks have the opposite problem and therefore simulate artificial abundances by displaying more love, empathy, etc.)


"Young man knows the rules and the old man knows the exceptions."
- Oliver Wendell Holmes Sr.

Genius is hypothesis-driven and wisdom is data-driven. That is why mature people tend to prefer experimental (and historical) disciplines, while young people tend to dominate theoretical (and ahistorical) disciplines.

The old man can be rigid but he can also display tremendous cognitive fluidity because he can transcend the rules, improvise and dance around the set of exceptions. In fact, he no longer thinks of the exceptions as "exceptions" since an exception can only be defined with respect to a certain collection of rules. He directly intuits them as unique data points and thus is not subject to the false positives generated by operational definitions. (The young man on the other hand has not explored the full territory of possibilities yet and thus needs a practical guide no matter how crude.)

Notice that the old man can not transfer his knowledge of exceptions to the young man because that knowledge is in the form of an ineffable complex neural network that has been trained on tons of data. (Apprentice-master relationships are based on mimetic learning.) Rules on the other hand are much more transferable since they are of linguistic nature. (They are not only transferable but also a lot more compact in size, compared to the set of exceptions.) Of course, the fact that rules are transferable does not mean that the transfers actually occur! (Trivial things are deemed unworthy by the old man and important things get ignored by the young man. It is only the stuff in the middle that gets successfully transferred.)

Why is it much harder for old people to change their minds? Because wisdom is data-driven, and in a data-driven world, bugs (and biases) are buried inside large data sets and therefore much harder to find and fix. (In a hypothesis driven world, all you need to do is to go through the much shorter list of rules, hypotheses etc.)


The Hypothesis-Data duality highlighted in the previous section can be recast as young people being driven more by rational thinking vs. old people being driven more by intuitional thinking. (In an older blog post, we had discussed how education should focus on cultivating intuition, which leads to a superior form of thinking.)

We all start out life with a purely intuitive mindset. As we learn we come up with certain heuristics and rules, resulting in an adulthood that is dominated by rationality. Once we accumulate enough experience (i.e. data), we get rid of these rules and revert back to an intuitive mindset, although at a higher level than before. (That is why the old get along very well with kids.)

Artistic types (e.g. Picasso) tend to associate genius with the tabula-rasa intuitive fluidity of the newborn. Scientific types tend to associate it with the rationalistic peak of adulthood. (That is why they start to display insecurities after they themselves pass through this peak.)

As mentioned in the previous section, rules are easily transferable across individuals. Results of intuitive thinking on the other hand are non-transferable. From a societal point of view, this is a serious operational problem and the way it is overcome is through a mechanism called “trust”. Since intuition is a black box (like all machine learning models are), the only way you can transfer it is through a wholesome imitation of the observed input-outputs. (i.e. mimetic learning) In other words, you can not understand black box models, you can only have faith in them.

As we age and become more intuition-driven, our trust in trust increases. (Of course, children are dangerously trusting to begin with.) Adulthood, on the other hand, is dominated by rational thinking and therefore corresponds to the period when we are most distrustful of each other. (No wonder economists are such distrustful folks. They always model humans as ultra-rationalistic machines.)

Today we vastly overvalue the individual over the society, and the rational over the intuitional. (Just look at how we structure school curriculums.) We decentralized society and trivialized the social fabric by centralizing trust. (Read the older blogpost Blockchain and Decentralization) We no longer trust each other because we simply do not have to. Instead we trust the institutions that we collectively created. Our analytical frameworks have reached an individualist zenith in Physics which is currently incapable of guaranteeing the reality of other peoples’ points of view. (Read the older blogpost Reality and Analytical Inquiry) We banished faith completely from public discourse and have even demanded God to be verifiable.

In short, we seem to be heading to the peak adulthood phase of humanity, facing a massive mid-life crisis. Our collective genius has become too great for our own good.

In this context, the current rise of data-driven technological paradigms is not surprising. Humanity is entering a new intuitive post-midlife-crisis phase. Our collective wisdom is now being encoded in the form of disembodied black-box machine-learning models which will keep getting more and more sophisticated over time. (At some point, we may dispense with our analytical models altogether.) Social fabric on the other hand will keep being stretched as more types of universally-trusted centralized nodes emerge and enable new forms of indirect intuition transfer.

Marx was too early. He viewed socialism in a human way, as a rationalistic inevitability, but it will probably arrive in an inhuman fashion, via intuitionistic technologies. (Still calling such a system socialism would be vastly ironic, since it will rest on a complete absence of trust among individuals.) Of course, not all decision-making will be centralized. Remember that the human mind itself emerged to address non-local problems. (There is still a lot of local decision-making going on within our cells, etc.) The “hive” mind will be no different, and, as usual, deciding whether a problem in the gray zone is local or non-local will be determined through a tug-of-war.

The central problem of ruler-ship, as Scott sees it, is what he calls legibility. To extract resources from a population the state must be able to understand that population. The state needs to make the people and things it rules legible to agents of the government. Legibility means uniformity. States dream up uniform weights and measures, impress national languages and ID numbers on their people, and divvy the country up into land plots and administrative districts, all to make the realm legible to the powers that be. The problem is that not all important things can be made legible. Much of what makes a society successful is knowledge of the tacit sort: rarely articulated, messy, and from the outside looking in, purposeless. These are the first things lost in the quest for legibility. Traditions, small cultural differences, odd and distinctive lifeways … are all swept aside by a rationalizing state that preserves (or in many cases, imposes) only what can be understood and manipulated from the 2,000 foot view. The result, as Scott chronicles with example after example, are many of the greatest catastrophes of human history.

Tanner Greer - Tradition is Smarter Than You

reality and analytical inquiry

What is real and out there? This question is surprisingly hard to answer.

The only way we seem to be able to define ontology is as shared epistemology. (Every other definition suffers from an immediate deficiency.) In other words, what is real is what every possible point of view agrees upon, and vice versa. There is no such thing as your reality. (Note that this definition breaks the duality between ontology and epistemology. The moment you make inferences about the former, it gets subsumed by the latter. Is this surprising? Epistemology is all about making inferences. In other words, the scientific method itself is what is breaking the duality.)

Now we have a big problem: ontological changes can not be communicated to all points of view instantaneously. This is outlawed by the finite speed of the fastest causation propagator, which is usually taken to be light. In fact, according to our current understanding of physics, there seems to be nothing invariant across all points of view. (e.g. the firewall paradox, the twin paradox, etc.) Whenever we get our hands on some property, it slips away with the next advance in our theories.

This is a weird situation, an absolute mind-fuck to be honest. If we endorse all points of view, we can define ontology, but then nothing seems to be real. If we endorse only our own point of view, we can not define ontology at all and get trapped in a solipsistic world where every other point of view becomes unreal and other people turn into zombies.

Could all the different points of view be part of a single point of view, that of (for lack of a better term) “God”? In this case, our own individual point of view becomes unreal. This is a bad sacrifice indeed, but could it help us salvage reality? Nope… Can the universe observe itself? The question does not even make sense!

It seems like physics can not start off without assuming a solipsistic worldview, adopting a single coordinate system which can not be sure about the reality of other coordinate systems.

In an older blog post, I had explained how dualities emerge as byproducts of analytical inquiry and thereby artificially split the unity of reality. Here we have a similar situation. The scientific method (i.e. analytical inquiry) is automatically giving rise to solipsism and thereby artificially splitting the unity of reality into considerations from different points of views.

In fact, the notions of duality and solipsism are very related. To see why, let us assume that we have a duality between P and not-P. Then

  • Within a single point of view, nothing can satisfy both P and not-P.

  • No property P stays invariant across all points of views.

Here, the first statement is a logical necessity and the second statement is enforced upon us by our own theories. We will take the second statement as the definition of solipsism.

Equivalently, we could have said

  • If property P holds from the point of view of A and not-P holds from the point of view of B, then A can not be equal to B.

  • For every property P, there exists at least one pair (A,B) such that A is not equal to B and P holds from the point of view of A while not-P holds from the point of view of B.

Now let X be the set of pairs (A,B) such that P holds from the point of view of A and not-P holds from the point of view of B. Also let △ stand for the diagonal set consisting of pairs (A,A). Then the above statements become

  • X can not hit △.

  • X can not miss the complement of △.

Using just mathematical notation we have

  • X ∩ △ = ∅

  • X ∩ △’ ≠ ∅

In other words, dualities and solipsism are defined using the same ingredients! Analytical inquiry gives rise to both at the same time. It supplies you with labels to attach to reality (via the above equality) but simultaneously takes reality away from you (via the above inequality). Good deal, right? After all, (only) nothing comes for free!
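Restating the two conditions compactly in LaTeX notation (this is only a reformulation of the bullets above):

    X \cap \triangle = \emptyset \;\text{(duality)}, \qquad
    X \cap \triangle' \neq \emptyset \;\text{(solipsism)}
    \quad\Longrightarrow\quad \emptyset \neq X \subseteq \triangle'

Together the two conditions say that the witnesses to the duality form a nonempty set of strictly off-diagonal pairs: at least one pair of genuinely distinct points of view must disagree about P.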

Phenomena are the things which are empty of inherent existence, and inherent existence is that of which phenomena are empty.

Jeffrey Hopkins - Meditations on Emptiness (Page 9)


Recall that at the beginning of this post we defined ontology as shared epistemology. One can also go the other way around and define epistemology as shared ontology. What does this mean?

  • To say that some thing exists we need every mind to agree to it.

  • To say that some statement is true we need every body to obey it.

This is actually how truth is defined in model theory. A statement is deemed true if and only if it holds in every possible embodiment.
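In standard notation, the model-theoretic definition of validity reads:

    \models \varphi \quad\Longleftrightarrow\quad
    \mathcal{M} \models \varphi \ \text{ for every structure } \mathcal{M}

That is, a sentence is true simpliciter precisely when it holds in every structure, i.e. in every possible embodiment of the language.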

In this sense, the epistemology-ontology duality mirrors the mind-body duality. (For a mind, reality consists of bodies and what is fundamental is existence. For a body, reality consists of minds and what is fundamental is truth.) For thousands of years, Western philosophy has been trying to break this duality, which has popped up in various forms. Today, for instance, physicists are still debating whether “it” arose from “bit” or “bit” arose from “it”.

Let us now do another exercise. What is the epistemological counterpart of the ontological statement that there are no invariances in physics?

  • Ontology. There is no single thing that every mind agrees to.

  • Epistemology. There is no single statement that every body obeys.

Sounds outrageous, right? How can there be no statement that is universally true? Truth is absolute in logic, but relative in physics. We are not allowed to make any universal statements in physics, no matter how trivial.

politics, economics and naturality

The combination of liberalism and capitalism forms a nice balance. The former fights against nature in the political domain and destroys outliers by eliminating actual differences in a discrete fashion. The latter fights for nature in the economic domain and creates outliers by amplifying potential differences in a continuous fashion. (Fighting against nature results in discrete, as opposed to continuous, change.)

 
[Figure: Politics, Economics, Naturality]
 

Similarly, the combination of conservatism (fighting for nature in the political domain) and communism (fighting against nature in the economic domain) forms a nice balance as well. But both combinations are hard to maintain due to their conflictual nature.