From AngelList’s study we’ve learned that the best returns are made by investing “in every credible early stage deal” - think a “startup index/ETF”.
Consider a quote by Peter Thiel:
“When you have too many people thinking substantively about shorting the market, the procedural shortcut is to buy an index fund. If everyone does process it won’t work. We are in a world where things are geared towards process and away from substance.”
AngelList’s conclusion is that “you’d be better off blindly investing in every credible deal”, which they translate into “invest in a startup index”. I find this conflates “what the market says about an aggregate” with “what you as an individual ought to do”.
The aggregate result of the market is not a product of a uniform individual action. It’s a product of many very different actions. Reverse-engineering the process from the perspective of the aggregate will not get you the best course of action for an individual.
This is comparable to the notion of scale transformation (e.g. “morality does not scale”):
The intentions and morality of individual agents do not aggregate to groups. And the reverse: attributes of groups do not map to those of agents.
The standard mechanism is well grasped: competition makes prices adequate by pushing them towards the margin; price formation has nothing to do with the individual intentions of agents. But it is the second step, the wedge between intentions and outcomes, and, more generally, scale transformation, that is not generalized.
In other words: what one ought to do according to an ex post reading of the aggregate does not map to what an individual should do ex ante to succeed. Can I go as far as comparing it to “lecturing birds on how to fly”?
Let’s think about a startup index from an evolutionary game theory perspective. (The same goes for passive investing in general.)
Say that at some point the most energy-efficient strategy for a seal is not to go catch fish itself but to wait for other seals to return with fish and steal it from them. Now a Seal Journal publishes these numbers; seals read it and realize that their effort to actually go find fish is suboptimal.
Seals switch strategies to maximize their gains under the equilibrium described by the Seal Journal. But stealing is zero-sum for the seal community and can only work while enough seals are still making the effort to catch fish. Once they switch, the equilibrium has changed: not enough seals are catching fish.
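The seal story is essentially a producer-scrounger game. Here is a minimal sketch in Python — all the payoff numbers (`catch`, `effort_cost`, `steal_share`) are made-up assumptions, not taken from any real study — showing that scrounging only pays while enough producers remain, so the population settles into a mixed equilibrium rather than everyone adopting the “optimal” published strategy:

```python
# Toy producer-scrounger ("seal") game. All payoff numbers below are
# made up for illustration; nothing here comes from a real study.
def payoffs(p_producers):
    """Expected payoff per seal, given the fraction still hunting."""
    catch, effort_cost, steal_share = 10.0, 4.0, 0.5
    # A producer hunts, pays the effort cost, and keeps half its catch.
    producer = catch * (1 - steal_share) - effort_cost
    # A scrounger only eats when it finds a producer to rob.
    scrounger = catch * steal_share * p_producers
    return producer, scrounger

def simulate(p=0.99, steps=500, rate=0.1):
    """Replicator-style dynamics: seals drift toward the better-paying strategy."""
    for _ in range(steps):
        producer, scrounger = payoffs(p)
        p += rate * (producer - scrounger) * p * (1 - p)
    return p

p_eq = simulate()  # settles at an interior mix of hunters and thieves
# If every seal scrounges (p -> 0), the scrounger payoff goes to zero too:
# the strategy the Journal called optimal only works while others still hunt.
```

In this toy model the population settles where both strategies pay the same; raise `steal_share` and the sustainable fraction of hunters shrinks — the published “best strategy” destroys its own preconditions as more seals adopt it.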
I do not mean to belittle the significance of statistics, but I do think it’s dangerous to think about the world in terms of probabilities only. Such thinking leads people to stop targeting the specific options that will unfold; they just figure they will have plenty of options and that it does not matter which ones they pick as long as they pick enough of them.
We cannot calculate what the future will look like; the future depends on people trying (and failing) to calculate it. The antifragility of the system depends on its fragile units. If everyone just buys an antifragile index of startups and no one picks the units (the startups) within the index, the whole system collapses.
Keynesians say that in the long run we’ll all be dead, so let’s only care about the short run. But how long before a long run becomes a short run? This is similar to saying that there’s no way to know the future, so don’t attempt to create it. How long before there is no future if no one is trying to create it?
Talking about startup indices is dangerous. The way I read the results of the AngelList study is that great technologies come from tinkering, not “reason.” That is why technological success is only trivial retrospectively, not prospectively.
Individually bad decisions can aggregate into good collective outcomes. Playing it safe by buying an index and “investing blindly” prevents people from actually having ideas about the future. (Also, under the new equilibrium, the formerly good decision may lead to terrible outcomes.)
Reducing our worldview to math can distort the world in a very bad way: e.g. all LPs investing in an index of every credible deal, with no actual VCs selecting the deals and entrepreneurs merely optimizing for their ideas to look “credible”.
I’ll conclude with a recommendation of Yudkowsky’s Inadequate Equilibria:
There’s no real alternative to sticking your neck out, even knowing that reality might surprise you and chop off your head... Run experiments; place bets; say oops. Anything less is an act of self-sabotage.
Sources
https://angel.co/blog/venture-returns
https://www.academia.edu/38433249/Principia_Politica
https://blakemasters.com/post/23435743973/peter-thiels-cs183-startup-class-13-notes
https://blakemasters.com/post/23787022006/peter-thiels-cs183-startup-class-14-notes
https://equilibriabook.com/