plane vs train travel

I love train journeys. Flying somewhere simply does not offer the same pleasure.

  • You board a plane and you get off; what happens in between is a mystery. You can only follow your journey abstractly on a digital map, and that is it. On a train, by contrast, you can watch the outside world roll by through big windows, like a film strip. You get a full visual sense of actually moving towards somewhere.

  • The sound of the rails is like a heartbeat; you can gauge your speed from its rhythm. On a plane there is only an unnatural, uninterrupted hum, accompanying your unnaturally constant speed.

  • On a plane everyone faces the same direction, or rather everyone stares at the screen in front of them. There is no shared experience to speak of. On a train you can sit facing one another. The shared tables create a strange warmth.

  • On a plane even turbulence is joyless; everything is announced in advance. On a train you get jolted without warning. There is a pleasant state of unpreparedness that everyone shares. Moreover, every jolt has a reason. The curves of a railway are shaped by the landscapes they pass through. So your jolts arise from harmonies you can easily observe through the window. Airplane turbulence, on the other hand, happens in a meaningless abstraction. Why it comes and why it goes is never clear. Pure uncertainty breeds pure tension.

  • Air travel is an uptight experience overall. Airports are outside the city, and getting there is a hassle. Once there, boarding the plane is a hassle. Once aboard, what follows is a thousand and one ordeals.

  • The freedom to be able to get off at the next stop is such a beautiful freedom! Intermittent as it is, it is a source of breath. Even watching people light up a cigarette at each station and then return to the train is a pleasure.

reality and analytical inquiry

What is real and out there? This question is surprisingly hard to answer.

The only way we seem to be able to define ontology is as shared epistemology. (Every other definition suffers from an immediate deficiency.) In other words, what is real is what every possible point of view agrees upon, and vice versa. There is no such thing as your reality. (Note that this definition breaks the duality between ontology and epistemology. The moment you make inferences about the former, it gets subsumed by the latter. Is this surprising? Epistemology is all about making inferences. In other words, the scientific method itself is what is breaking the duality.)

Now we have a big problem: ontological changes can not be communicated to all points of view instantaneously. This is outlawed by the finite speed of the fastest propagator of causation, which is usually taken to be light. In fact, according to our current understanding of physics, there seems to be nothing invariant across all points of view. (e.g. the firewall paradox, the twin paradox, etc.) Whenever we get our hands on some property, it slips away with the next advance in our theories.

This is a weird situation, an absolute mind-fuck to be honest. If we endorse all points of view, we can define ontology, but then nothing seems to be real. If we endorse only our own point of view, we can not define ontology at all and get trapped in a solipsistic world where every other point of view becomes unreal and other people turn into zombies.

Could all the different points of view be part of a single - for lack of a better term, “God’s” - point of view? In this case, our own individual point of view becomes unreal. This is a bad sacrifice indeed, but could it help us salvage reality? Nope… Can the universe observe itself? The question does not even make sense!

It seems like physics can not get off the ground without assuming a solipsistic worldview, adopting a single coordinate system which can not be sure about the reality of other coordinate systems.

In an older blog post, I explained how dualities emerge as byproducts of analytical inquiry and thereby artificially split the unity of reality. Here we have a similar situation. The scientific method (i.e. analytical inquiry) automatically gives rise to solipsism and thereby artificially splits the unity of reality into considerations from different points of view.

In fact, the notions of duality and solipsism are very related. To see why, let us assume that we have a duality between P and not-P. Then

  • Within a single point of view, nothing can satisfy both P and not-P.

  • No property P stays invariant across all points of view.

Here, the first statement is a logical necessity and the second statement is enforced upon us by our own theories. We will take the second statement as the definition of solipsism.

Equivalently, we could have said

  • If property P holds from the point of view of A and not-P holds from the point of view of B, then A can not be equal to B.

  • For every property P, there exists at least one pair (A,B) such that A is not equal to B and P holds from the point of view of A while not-P holds from the point of view of B.

Now let X be the set of pairs (A,B) such that P holds from the point of view of A and not-P holds from the point of view of B. Also let △ stand for the diagonal set consisting of pairs (A,A). Then the above statements become

  • X can not hit △.

  • X can not miss the complement of △.

Using just mathematical notation we have

  • X ∩ △ = ∅

  • X ∩ △’ ≠ ∅

In other words, dualities and solipsism are defined using the same ingredients! Analytical inquiry gives rise to both at the same time. It supplies you with labels to attach to reality (via the above equality) but simultaneously takes reality away from you (via the above inequality). Good deal, right? After all, (only) nothing comes for free!
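The two set-theoretic conditions can be checked mechanically. Here is a minimal Python sketch with a hypothetical toy configuration (three points of view and one non-invariant property P, none of which come from the text):

```python
views = ["A", "B", "C"]
# A toy property P that fails to be invariant across points of view,
# e.g. "the clock ticks at the proper rate".
holds_P = {"A": True, "B": False, "C": True}

# X: pairs (a, b) such that P holds from a's point of view
# and not-P holds from b's point of view.
X = {(a, b) for a in views for b in views if holds_P[a] and not holds_P[b]}

# The diagonal: pairs of the form (a, a).
diagonal = {(a, a) for a in views}

# Duality: within a single point of view nothing satisfies both P and not-P,
# so X misses the diagonal entirely.
assert X & diagonal == set()

# Solipsism: since P is not invariant, X necessarily hits
# the complement of the diagonal.
assert X - diagonal != set()
```

The first assertion holds for any assignment of P; the second holds precisely because P differs between at least two points of view.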

Phenomena are the things which are empty of inherent existence, and inherent existence is that of which phenomena are empty.

Jeffrey Hopkins - Meditations on Emptiness (Page 9)


Recall that at the beginning of this post we defined ontology as shared epistemology. One can also go the other way around and define epistemology as shared ontology. What does this mean?

  • To say that some thing exists we need every mind to agree to it.

  • To say that some statement is true we need every body to obey it.

This is actually how truth is defined in model theory. A statement is deemed true if and only if it holds in every possible embodiment.
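As a rough sketch of that definition, here is a hypothetical Python fragment for the special case of propositional logic, where "every possible embodiment" becomes "every truth assignment" (the helper `valid` is my own illustrative name, not a standard API):

```python
from itertools import product

def valid(formula, variables):
    """True iff `formula` holds under every truth assignment (every "body")."""
    return all(
        formula(dict(zip(variables, values)))
        for values in product([False, True], repeat=len(variables))
    )

# "p or not p" holds in every possible embodiment, so it is deemed true.
assert valid(lambda m: m["p"] or not m["p"], ["p"])

# "p" alone holds in some embodiments but not in others, so it is not.
assert not valid(lambda m: m["p"], ["p"])
```

Full model theory quantifies over structures rather than truth assignments, but the shape of the definition is the same: truth is what survives every embodiment.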

In this sense, the epistemology-ontology duality mirrors the mind-body duality. (For a mind, reality consists of bodies and what is fundamental is existence. For a body, reality consists of minds and what is fundamental is truth.) For thousands of years, Western philosophy has been trying to break this duality, which has popped up in various forms. Today, for instance, physicists are still debating whether “it” arose from “bit” or “bit” arose from “it”.

Let us now do another exercise. What is the epistemological counterpart of the ontological statement that there are no invariances in physics?

  • Ontology. There is no single thing that every mind agrees on.

  • Epistemology. There is no single statement that every body obeys.

Sounds outrageous, right? How can there be no statement that is universally true? Truth is absolute in logic, but relative in physics. We are not allowed to make any universal statements in physics, no matter how trivial.

nonsensically high valuation of uber

Uber will be the single largest value collapse in technology history. Here are the reasons why:

  • The service is a commodity. Users do not care about which driver or car picks them up as long as the driver is not crazy and the car is not filthy. (It is hard to preserve even such basic qualities at large scale. Generally speaking, you can not perform above average when you become almost as big as the market itself. This in turn increases your insurance cost per transaction.)

  • The technology is a commodity. It is no longer hard to build the basic application from scratch. (Even municipalities have started doing it themselves.)

  • Neither drivers nor users are loyal to the company. Uber is fundamentally a utility app with very low switching costs. A lot of drivers and users utilize the rival apps as well. (Drivers are looking for more rides and users are looking for cheaper prices.) In fact, there are even aggregator apps that help drivers juggle more easily between the different networks.

  • It is not the winner-takes-all market it was imagined to be. Any network dense enough that the average waiting time for the user is below 5 minutes is good enough. Killing competitors through predatory pricing does not change the basic market structure. If the market allows an oligopolistic structure, it will sooner or later (i.e. once Uber runs out of all the stupid money in the world to finance every ride) converge on one.

  • Unit economics is not improving. (In fact, as mentioned above, insurance cost per transaction is getting worse.) Rides do not scale, since most of the costs are variable. (Cars are owned by the drivers, and efficiency gains from their greater utilization quickly max out, especially since they are used for personal purposes as well. So there is not much for Uber to suck away.) The underlying (evil) hope is that once Uber becomes a monopoly, it will be able to relax and dictate prices. This is a false hope, however, since governments do not tolerate in-your-face physical monopolies, especially if they create negative externalities like luring people away from public transportation and increasing congestion. (They seem to be more lenient with abstract digital ones.)

  • With the arrival of autonomous cars, the whole industry will change. Any gains from building a driver network will be gone, making it easier to launch Uber-like services with sheer capital. (Make no mistake, there will be a LOT of new capital coming in. The Germans will hit especially hard, old money will form alliances, etc.) Autonomy itself will quickly become commoditized and centralized around a few intermediary technology companies who will be training all the models and centralizing all the data. Also, at some point, (as they do in the hospitality industry) users will start placing greater value on consistency of quality. Hence, instead of riding in random cars, they will prefer to become members of the fleets owned by the car manufacturers themselves. God knows what else will happen… Autonomy will be a big disruptive wave with a lot of currently-unforeseeable consequences.

Nevertheless, do not short Uber when it IPOs this year. It is backed by an aggressive giant called SoftBank, which is ruled by an old man who is backed by an infinite amount of blood money. As Keynes said, markets can remain irrational longer than you can remain solvent, and this will surely be the case for Uber.

politics, economics and naturality

The combination of liberalism and capitalism forms a nice balance. The former fights against nature in the political domain and destroys outliers by eliminating actual differences in a discrete fashion. The latter fights for nature in the economic domain and creates outliers by amplifying potential differences in a continuous fashion. (Fighting against nature results in discrete, as opposed to continuous, change.)

 
[Image: Politics Economics Naturality.png]
 

Similarly, the combination of conservatism (fighting for nature in the political domain) and communism (fighting against nature in the economic domain) forms a nice balance as well. But both combinations are hard to maintain due to their conflictual nature.

necessity of dualities

All truths lie between two opposite positions. All dramas unfold between two opposing forces. Dualities are both ubiquitous and fundamental. They shape both our mental and physical worlds.

Here are some examples:

Mental

objective | subjective
rational | emotional
conscious | unconscious
reductive | inductive
absolute | relative
positive | negative
good | evil
beautiful | ugly
masculine | feminine


Physical

deterministic | indeterministic
continuous | discrete
actual | potential
necessary | contingent
inside | outside
infinite | finite
global | local
stable | unstable
reversible | irreversible

Notice that even the above split between the two groups itself is an example of duality.

These dualities arise as an epistemological byproduct of the method of analytical inquiry. That is why they are so thoroughly infused into the languages we use to describe the world around us.

Each relatum constitutive of dipolar conceptual pairs is always contextualized by both the other relatum and the relation as a whole, such that neither the relata (the parts) nor the relation (the whole) can be adequately or meaningfully defined apart from their mutual reference. It is impossible, therefore, to conceptualize one principle in a dipolar pair in abstraction from its counterpart principle. Neither principle can be conceived as "more fundamental than," or "wholly derivative of" the other.

Mutually implicative fundamental principles always find their exemplification in both the conceptual and physical features of experience. One cannot, for example, define either positive or negative numbers apart from their mutual implication; nor can one characterize either pole of a magnet without necessary reference to both its counterpart and the two poles in relation - i.e. the magnet itself. Without this double reference, neither the definiendum nor the definiens relative to the definition of either pole can adequately signify its meaning; neither pole can be understood in complete abstraction from the other.

- Epperson & Zafiris - Foundations of Relational Realism (Page 4)


Various lines of Eastern religious and philosophical thought intuited how languages can hide underlying unity by artificially superimposing conceptual dualities (the foremost of which is the almighty object-subject duality) and posited the nondual wholeness of nature several thousand years before the advent of quantum mechanics. (The analytical route to enlightenment is always longer than the intuitive route.)

Western philosophy, on the other hand,

  • ignored the mutually implicative nature of all dualities and denied that the wholeness of nature is inaccessible to analytical inquiry.

  • got fooled by the precision of mathematics, which is, after all, just another language invented by human beings.

  • confused partial control with understanding and engineering success with ontological precision. (Understanding is a binary parameter, meaning that either you understand something or you do not. Control on the other hand is a continuous parameter, meaning that you can have partial control over something.)

As a result, Western philosophers mistook representation for reality and tried to confine truth to one end of each duality in order to create a unity of representation matching the unity of reality.

Side Note: Hegel was an exception. Like Buddha, he too saw dualities as artificial byproducts of analysis, but unlike him, he suggested that one should transcend them via synthesis. In other words, for Buddha unity resided below and for Hegel unity resided above. (Buddha wanted to peel away complexity to its simplest core, while Hegel wanted to embrace complexity in its entirety.) While Buddha stopped theorizing and started meditating instead, Hegel saw the salvation through higher levels of abstraction via alternating chains of analyses and syntheses. (Buddha wanted to turn off cognition altogether, while Hegel wanted to turn up cognition full-blast.) Perhaps at the end of the day they were both preaching the same thing. After all, at the highest level of abstraction, thinking probably halts and emptiness reigns.

It was first the social thinkers who woke up and revolted against the grand narratives built on such discriminative pursuits of unity. There was just way too much politically and ethically at stake for them. The result was an overreaction, replacing unity with multiplicity and considering all points of views as valid. In other words, the pendulum swung the other way and Western philosophy jumped from one state of deep confusion into another. In fact, this time around the situation was even worse since there was an accompanying deep sense of insecurity as well.

The cacophony spread into hard sciences like physics too. Grand narratives got abandoned in favor of instrumental pragmatism. Generations of new physicists got raised as technicians who basically had no clue about the foundations of their own disciplines. The most prominent of them could even publicly make an incredibly naive claim such as “something can spontaneously arise from nothing through a quantum fluctuation” and position it as a non-philosophical and non-religious alternative to existing creation myths.

Just to be clear, I am not trying to argue here in favor of Eastern holistic philosophies over Western analytic philosophies. I am just saying that the analytic approach requires us to embrace dualities as two-sided entities, including the duality between the holistic and analytic approaches.


Politics experienced a similar swing from conservatism (which hailed unity) towards liberalism (which hailed multiplicity). During this transition, all dualities and boundaries got dissolved in the name of more inclusion and equality. The everlasting dynamism (and the subsequent wisdom) of dipolar conceptual pairs (think of magnetic poles) got killed off in favor of an unsustainable burst in the number of ontologies.

Ironically, liberalism resulted in more sameness in the long run. For instance, the traditional assignment of roles and division of tasks between father and mother got replaced by equal parenting principles applied by genderless parents. Of course, upon the dissolution of the gender dipolarity, the number of parents one can have became flexible as well. Having one parent became as natural as having two, three or four. In other words, parenting became a community affair in its truest sense.

 
[Image: Duality.png]
 

The even greater irony was that liberalism itself forgot that it represented one extreme end of another duality. It was in a sense a self-defeating doctrine that aimed to destroy all discriminative pursuits of unity except for that of itself. (The only way to “resolve” this paradox is to introduce a conceptual hierarchy among dualities where the higher ones can be used to destroy the lower ones, in a fashion that is similar to how mathematicians deal with Russell’s paradox in set theory.)


Of course, at some point the pendulum will swing back to the pursuit of unity again. But while we swing back and forth between unity and multiplicity, we keep skipping the only sources of representational truths, namely the dualities themselves. For some reason we are extremely uncomfortable with the fact that the world can only be represented via mutually implicative principles. We find “one” and “infinity” tolerable but “two” arbitrary and therefore abhorrent. (The prevalence of “two” in mathematics and “three” in physics was mentioned in a previous blog post.)

I am personally obsessed with “two”. I look out for dualities everywhere and share the interesting finds here on my blog. In fact, I go even further and try to build my entire life on dualities whose two ends mutually enhance each other every time I visit them.

We should not collapse dualities into unities for the sake of satisfying our sense of belonging. We need to counteract this dangerous sociological tendency using our common sense at the individual level. Choosing one side and joining the groupthink is the easy way out. We should instead strive to carve out our identities by consciously sampling from both sides. In other words, when it comes to complex matters, we should embrace the dualities as a whole and not let them split us apart. (Remember, if something works very well, its dual should also work very well. However, if something is true, its dual has to be wrong. This is exactly what separates theory from reality.)

Of course, it is easy to talk about these matters, but who said that pursuit of truth would be easy?

Perhaps there is no pursuit to speak of unless one is pre-committed to choosing a side, and swinging back and forth between the two ends of a dualism is the only way nature can maintain its neutrality without sacrificing its dynamism? (After all, there is no current without a polarity in the first place.)

Perhaps we should just model our logic after reality (like Hegel wanted to) rather than expect reality to conform to our logic? (In this way we can have our cake and eat it too!)

formalism, consciousness and understanding

In a formal (deductive) subject, the level of competency correlates with the depth of non-formalism one can display around the subject. (For instance, the mastery of a mathematician can only be gauged when he stops scribbling down mathematical notation, dives into conceptual vagueness and starts using real words.) In a non-formal (intuitive) subject, the level of competency correlates with the depth of formalism one can display around the subject.

Similarly, one can only understand unconscious things using the consciousness and conscious things using the unconsciousness. Due to the architecture of our brains, we typically find the latter much easier to do. Our education system does not balance the scale either. (Practicing lucid dreaming, meditation and improvisation can help.) We generally do not know how to open up and let our non-verbal, intuitive brain reign, and we do not care about the unconscious until it breaks down.

classical vs innovative businesses

As you move away from zero-to-one processes, economic activities become more and more sensitive to macroeconomic dynamics.

Think of the economy as a universe. Innovative startups correspond to quantum mechanical phenomena rendering something from nothing. The rest of the economy works classically within the general relativity framework where everything is tightly bound to everything else. To predict your future you need to predict the evolution of everything else as well. This of course is an extremely stressful thing to do. It is much easier to exist outside the tightly bound system and create something from scratch. For instance, you can build a productivity software that will help companies increase their profit margins. In some sense such a software will exist outside time. It will sell whether there is an economic downturn or an upturn.


In classical businesses, forecasting the near future is extremely hard. The noise clears out when you look a little further into the future. But the far future is again quite hard to talk about, since there you start feeling the long-term effects of the innovation being made today. So the difficulty hierarchy looks as follows:

near future > far future > mid future

In innovative businesses, forecasting the near future is quite easy. In the long run, everyone agrees that transformation is inevitable, so forecasting the far future is hard but still possible. However, what is going to happen in the mid term is extremely hard to predict. In other words, the above hierarchy gets flipped:

mid future > far future > near future

Notice that what counts as the mid future is actually quite hard to define. It can move around with the wind, so to speak, just as intended by the goddesses of fate in Greek mythology.

In Greek mythology the Moirae were the three Fates, usually depicted as dour spinsters. One Moira spun the thread of a newborn's life. The other Moira counted out the thread’s length. And the third Moira cut the thread at death. A person’s beginning and end were predetermined. But what happened in between was not inevitable. Humans and gods could work within the confines of one's ultimate destiny.

Kevin Kelly - What Technology Wants

I personally find it much more natural to just hold onto near future and far future, and let the middle inflection point dangle around. In other words I prefer working with innovative businesses.

Middle zones are generally speaking always ill-defined, presenting another high level justification for the barbell strategy popularized by Nassim Nicholas Taleb. Mid-term behavior of complex systems is tough to crack. For instance, short-term weather forecasts are highly accurate and long-term climate changes are also quite foreseeable, but what is going to happen in mid-term is anybody’s guess.

The far future always involves “structural” change. Things will definitely change, but the change is not of a statistical nature. As mentioned earlier, innovative businesses are not affected by short-term statistical (environmental / macroeconomic) noise. Instead they suffer from mid-term statistical noise of the type that phase-transition states exhibit in physics. (Think of the turbulence phenomenon.) So the above two difficulty hierarchies can be seen as particular manifestations of the following master hierarchy:

statistical unpredictability > structural unpredictability > predictability


Potential entrepreneurs jumping straight into tech without building any experience in traditional domains are akin to physics students jumping straight into quantum mechanics without learning classical mechanics first. This jump is possible, but also pedagogically problematic. It is much more natural to learn things in the historical order that they were discovered. (Venture capital is a very recent phenomenon.) Understanding the idiosyncrasies and complexities of innovative businesses requires knowledge of how the usual, classical businesses operate.

Moreover, just like quantum states decohere into classical states, innovative businesses behave more and more like classical businesses as they get older and bigger. The word “classical” just means the “new” that has passed the test of time. Similarly, decoherence happens via entanglements, which is basically how time progresses at quantum level.

By the way, this transition is very interesting from an intellectual point of view. For instance, innovative businesses are valued using a revenue multiple, while classical businesses are valued using a profit multiple. When exactly do we start to value a mature innovative business using a profit multiple? How can we tell that it has matured? When exactly does a blue ocean become a red one? With the first blood spilled by the death of competitors? Is that an objective measure? After all, it is the investors’ expectations themselves which sustain innovative businesses, which burn tons of cash all the time.

Also, notice that, just as all classical businesses were once innovative businesses, all innovative businesses are built upon the stable foundations provided by classical businesses. So we should not think of the relationship as one way. Quantum may become classical, but quantum states are always prepared by classical actors in the first place.


What happens to classical businesses as they get older and bigger? They either evolve or die. Combining this observation with the conclusions of the previous two sections, we deduce that the combined predictability-type timeline of an innovative business becoming a classical one looks as follows:

  1. (Innovative) Near Future: Predictability

  2. (Innovative) Mid Future: Statistical Unpredictability (Buckle up. You are about to go through some serious turbulence!)

  3. (Innovative) Far Future: Structural Unpredictability (Congratulations! You successfully landed. Older guys need to evolve or die.)

  4. (Classical) Near Future: Statistical Unpredictability (Wear your suit. There seems to be radiation everywhere on this planet!)

  5. (Classical) Mid Future: Predictability

  6. (Classical) Far Future: Structural Unpredictability (New forms of competition landed. You are outdated. Will you evolve or die?)

Notice the alternation between the structural and statistical forms of unpredictability over time. Is it coincidental?
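To make the alternating pattern explicit, the six stages above can be written out as data (a small Python sketch; the stage names are just shorthand for the list above):

```python
# The six stages of an innovative business becoming a classical one,
# each paired with its predictability type.
timeline = [
    ("innovative near future", "predictability"),
    ("innovative mid future",  "statistical unpredictability"),
    ("innovative far future",  "structural unpredictability"),
    ("classical near future",  "statistical unpredictability"),
    ("classical mid future",   "predictability"),
    ("classical far future",   "structural unpredictability"),
]

# Strip out the predictable stages and look at what remains:
# the two forms of unpredictability strictly alternate.
unpredictable = [kind for _, kind in timeline if kind != "predictability"]
assert unpredictable == [
    "statistical unpredictability",
    "structural unpredictability",
    "statistical unpredictability",
    "structural unpredictability",
]
```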


Industrial firms thrive on reducing variation (manufacturing errors); creative firms thrive on increasing variation (innovation).
- Patty McCord - How Netflix Reinvented HR

Here Patty McCord’s observation is in line with our analogy. She is basically restating the disparity between the deterministic nature of classical mechanics and the statistical nature of quantum mechanics.

Employees in classical businesses feel like cogs in a wheel, because what needs to be done is already known with great precision and there is nothing preventing the operations from being run with utmost efficiency and predictability. They are (again, just like cogs in a wheel) utterly dispensable and replaceable. (Operating in red oceans, these businesses primarily focus on cost minimization rather than revenue maximization.)

Employees in innovative businesses, on the other hand, are given a lot more space to maneuver because they are the driving force behind an evolutionary product-market fit process that is not yet complete (and in some cases will never be complete).


Investment pitches too have quite opposite dynamics for innovative and classical businesses.

  • Innovative businesses raise money from venture capital investors, while classical businesses raise money from private equity investors who belong to a completely different culture.

  • If an entrepreneur prepares a 10-megabyte Excel document for a venture capital firm, he will be perceived as delusional and naive. If he does not do the same for a private equity firm, he will be perceived as entitled and preposterous.

  • Private equity investors look at data about the past and run statistical, blackbox models. Venture capital investors listen to stories about the future and think in causal, structural models. Remember, classical businesses are at the mercy of the macroeconomy, and a healthy macroeconomy displays maximum unpredictability. (All predictabilities are arbitraged away.) Whatever remnants of causal thinking are left in private equity are mostly about fixing internal operational inefficiencies.

  • The number of reasons for rejecting a private equity investment is more or less equal to the number of reasons for accepting one. In the venture capital world, rejection reasons far outnumber the acceptance reasons.

  • Experienced venture capital investors do not prepare before a pitch. The reason is not that they have a mastery over the subject matter of the entrepreneur’s work, but that there are far too many subject-matter-independent reasons for not making an investment. Private equity investors on the other hand do not have this luxury. They need to be prepared before a pitch because the devil is in the details.

  • For the venture capital investors, it is very hard to tell which company will achieve phenomenal success, but very easy to spot which one will fail miserably. Private equity investors have the opposite problem. They look at companies that have survived for a long time. Hence future-miserable-failures are statistically rare and hard to tell apart.

  • In innovative businesses, founders are (and should be) irreplaceable. In classical businesses, founders are (and should be) replaceable. (Similarly, professionals can successfully turn around failing classical companies, but can never pivot failing innovative companies.)

  • Private equity investors with balls do not shy away from turn-around situations. Venture capital investors with balls do not shy away from pivot situations.

pharma vs diagnostics

The bioinformatics industry is bifurcating into two categories defined by the two extreme value-generation endpoints, namely drug development and data creation.

  • Drugs come with patent protection and therefore create defensible sources of revenue. Data usually suffers from diminishing returns, and data generation can not sustain value indefinitely; but this is not true in the case of biology, which is (almost by definition) the most complex subject in the universe. (The fact that biological data seems to have a shorter half-life makes the situation even worse.)

  • Pharma companies develop the drugs and (the volume driven) diagnostics companies generate (the majority of) the data.

Pharma companies love to dip into data because it drives their precision medicine programs forward by enabling

  • the targeting of the right patient cohorts for existing drugs, and

  • the generation of novel drug targets.

Better precision medicine generates more knowledge about the genetic variants and more drugs targeting them, which in turn render diagnostics tests respectively more accurate and useful. In other words, more data eventually leads to an increase in the demand for diagnostics tests and therefore results in the generation of even more data. (This positive feedback cycle will greatly accelerate the maturation of the precision medicine paradigm in the near future.)

Pharma companies and diagnostics companies behave very differently (as summarized in the table below), and this creates a polarity in the product and business model configuration space for the bioinformatics industry, whose primary customers (in the private domain) are these two types of companies.

[Table: Pharma vs Diagnostics]

The last two rows are very important and worth explaining in greater detail:

  • Pharma companies do basic research and therefore want to tap into all types of data sets. (They also have a greater tendency to use all types of analytical applications, while diagnostics companies ignore the long tail.) These datasets are generally huge and may reside in a private cloud or with some public cloud provider. So pharma companies have to be able to connect to all of these datasets and run computation-heavy analyses that seamlessly weave through them. (When you are dealing with big data, computation needs to go to the data rather than the other way around.) In other words, they naturally belong to the multi-cloud paradigm. Diagnostics companies, on the other hand, belong to the cloud paradigm: since they are optimizing cost, they will just choose a single cloud provider based on price and convenience. (Read this older blog post to better understand the difference and polarity between the multi-cloud and cloud paradigms.)

  • Pharma companies are looking for help to solve their complex problems. Hence they are primarily focused on solutions. This pushes the software layer behind the services layer. In other words, software is still there, but it is the service provider who is mostly using it. Diagnostics companies, on the other hand, focus on their unit economics. They do not need much consulting, since they just optimize the hell out of their production pipelines and leave them alone most of the time.

thoughts on cybersecurity business

  • Cybersecurity and drug development are similar in the sense that neither can harbor deep, long-lived productification processes. Problems are dynamic. Enemies eventually evolve protection against productified attacks.

  • Cybersecurity and number theory are similar in the sense that they contain the hardest problems of their respective fields and are not built on a generally-agreed-upon core body of knowledge. Nothing meaningful is accessible to beginner-level students since all sorts of techniques from other subfields are utilized to crack problems.

Hence, in its essence, cybersecurity is an elite services business. Anyone claiming the opposite (that it is a product business, that it does not necessitate recruiting the best minds of the industry) is selling a sense of security, not real security.

normalization for positioning, coping and filtering

Normalization is a statistical term for adjusting your position with respect to the relevant population norm, which can change across time or space. (For instance, curved grading in academia employs this technique.)
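As a minimal sketch of the idea (all numbers and names here are illustrative, not from the original text), curved grading can be expressed as a z-score: the same raw score means different things depending on the population norm it is measured against.

```python
def z_score(x, population):
    """Position of x relative to the population norm (mean),
    scaled by the population's spread (standard deviation)."""
    mean = sum(population) / len(population)
    variance = sum((v - mean) ** 2 for v in population) / len(population)
    return (x - mean) / variance ** 0.5

# Curved grading: a raw score of 85 is unremarkable in a
# strong class but exceptional in a weak one.
easy_class = [80, 85, 90, 95]
hard_class = [40, 50, 60, 70]
print(z_score(85, easy_class))  # slightly below the class norm
print(z_score(85, hard_class))  # well above the class norm
```

The raw score never changes; only the norm it is compared against does. That is the whole mechanism the rest of this post leans on.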

Here I will use normalization as a unifying theme to make sense of some social, psychological and cognitive phenomena.

Spatial Normalization as a Social Positioning Mechanism

We generally think in relative terms when we compare ourselves to others. All status-based social dynamics take place in this way. We are happy when we are richer than the person next door. It does not matter if we all get richer. Of course, this leads to absurd situations where people are constantly unhappy even though everything is improving.

What is mathematically happening here is that we keep updating the norm (average) against which we make all comparisons. In social domains, this process takes place across space, not time. (You do not see people comparing themselves to historical norms. We all live more comfortable lives than the kings of the past, but no one gives a shit.)

Spatial normalization in sociology exhibits two interesting properties:

  • Two Dimensionality. People are curious about others’ lives for both vertical and horizontal reasons. They look (up and down) at the other castes and (around) at other individuals in their own caste. Precise social positioning requires both.

  • Locality. In both dimensions, practically unreachable positions get disregarded. (That is why greater social mobility actually brings unhappiness. Knowing that everything is possible but you are stuck with your current position hurts more.) In other words, social status is determined locally. This makes it actually easier for the poor to climb up in status. After all, due to the severely nonlinear nature of the wealth distribution, it is easier to reach the top of the bottom ten percent than to reach the top of the top ten percent. (That is why the rich are a miserable bunch.)

Temporal Normalization as a Psychological Coping Mechanism

Normalization occurs across time as well, in the form of adaptivity. After all, in order to survive, we have no choice but to adapt to new norms. It is pointless not to adapt to a change that you cannot change. (This is usually given as advice for achieving inner peace. Most of our frustrations come from our inability to discern what cannot be changed and should therefore be adapted to.)

Due to the one-dimensionality of time, the temporal version of normalization lacks the first property mentioned above. However, locality holds and is even more pronounced.

Example of Locality

[Cult leaders] deliberately induce distress - so that when they relieve it, they will also be the source of your pleasure. This leads to a powerful and, to outside observers, puzzling connection between cult leader and cult member. The same thing can be seen in abusive relationships and in “Stockholm syndrome,” where crime victims fall in love with or become supportive of their captors.

Born for Love - Bruce D. Perry & Maia Szalavitz (Page 237)

Temporal locality of adaptation is actually what gets us stuck in abusive relationships. We slowly get used to the bad treatment and normalize it. We forget that the world used to be much better before the relationship began. We become quite happy just because we are treated less badly.

Temporal Normalization as a Cognitive Filtering Mechanism

We focus on deviations from the norm while the norm itself gets pushed down to and tracked at an unconscious level. The effects of this focus become particularly stark when deviations become very small and we are essentially left with only the norm itself. Such constancy gets completely filtered away from our consciousness. (For interesting examples of this phenomenon, check out this older blog post.)

Remember from our previous discussion that we do not compare ourselves to people who are too far away from us in social distance. (Thanks to marketing people, this is actually becoming increasingly difficult.) Similarly, when we cognitively keep track of deviations, we do not go too far back in time. Our brains calculate the norm in a temporally local fashion, using only recent samplings. In other words, slow change is disregarded even if its cumulative effect is quite large over time. (Think of the fable of the frog being slowly boiled alive.)
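The temporally local norm can be sketched as a moving average over recent samples (a hypothetical illustration, with the window size and the warming scenario chosen arbitrarily): each new sample is compared only against the last few, so a slow drift never produces a noticeable deviation, no matter how large it grows in total.

```python
def recent_norm(history, window=5):
    """Norm computed only from the most recent samples."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def deviation(history, window=5):
    """How far the latest sample sits from the locally tracked norm."""
    return history[-1] - recent_norm(history[:-1], window)

# A slowly warming environment (the boiling frog): every step is a
# tiny deviation from the local norm, even though the cumulative
# change over the whole history is large.
temps = [20 + 0.1 * t for t in range(100)]
print(deviation(temps))      # small: measured against recent samples only
print(temps[-1] - temps[0])  # large: the total drift goes unnoticed
```

The filtering is entirely a consequence of the short window: widen it to cover the whole history and the same drift becomes glaring.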

Example of Locality

We must forgive our memory for yet another reason. It finds it easier to determine what has changed than to tell what has stayed the same. The people we have around us every day change as quickly or slowly as everyone else, but thanks to our daily contacts with them their changes are played out on a scale that makes them seem to stand still. It is unfair to blame our memory for throwing away editions when, on the face of it, the latest imprint differs in no way from the preceding one.

Why Life Speeds Up As You Get Older - Draaisma (Page 131)

In some sense, we are wired to ignore the slow passage of time. In fact, this tendency gets worse as our brain ages and accumulates more patterns against which new norms can be defined, explaining why time seems to flow faster as we grow older.