The FEX Model: Building the Next Generation of Exchanges
Special thanks to the Enclave team for helping me compile much of the information in this writeup, and to Victor Costan and Srinivas Devadas for writing their excellent, detailed paper on the inner workings of Intel’s SGX.
Inter-Chain Fusion
This article is evolving. A whole lot of subnets are planned to launch this year, which raises the question: what, if anything, can be done to leverage their existence to further boost their individual security, effectively cross-pollinating state and validators? I’ll go through a few options below. Note that this isn’t meant to be an exhaustive review.
Endgame
So, let’s talk about why standardization is so important. Let’s talk about the internet; I’ll call it “Web2”. Despite its many design faults at various layers of the stack, from the communication protocols to the execution environments, the Internet works exceptionally well. Billions of people interact with web-connected devices pretty seamlessly. Web2 is built in an obvious and sensible way: it’s a collection of horizontally distributed, mostly independent apps (I’m abstracting here; they are really servers/computers that run apps, but ignore that part) that read, write, and process data. The design is so obvious that if independent teams who knew nothing about the internet tried to rebuild it, they’d probably mostly converge on the same architecture. Effectively, the techie version of “convergent evolution”.
On The Tensile Strength of Block Finality
This post concerns itself with the consequences of block re-org MEV, how much stress blockchains can sustain before breaking, and what we can do at the protocol level to permanently rectify this issue. Effectively, it tries to answer this question (thanks, Jeremy!):
A Basic Primer on Dynamic Portfolio Management
There was a recent tweet in the crypto-Twitter sphere (which I can’t seem to find now) that discussed constant-mix – a well-known, albeit simplistic, dynamic portfolio management technique – in the context of Uniswap and impermanent loss. What surprised me about that thread is that people seemed unaware of constant-mix. The fact of the matter is that constant-mix has been known and used since 1985, and potentially much earlier. Yep, you read that right! There are excellent journal articles from Perold et al. on this topic dating back to then. In fact, part of this post draws on those papers.
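To make the idea concrete, here is a minimal sketch of constant-mix rebalancing between two assets (my own illustration, not code from the Perold papers, and ignoring fees and slippage): the portfolio is periodically traded back to fixed target weights, which mechanically sells whatever has outperformed and buys whatever has lagged.

```python
def constant_mix_rebalance(holdings, prices, target_weights):
    """Rebalance a portfolio back to fixed target weights (constant-mix).

    holdings: units held of each asset, e.g. {"ETH": 10.0, "USDC": 20000.0}
    prices: current price of each asset in a common numeraire
    target_weights: desired fraction of total value per asset (sums to 1)
    Returns the new holdings after rebalancing (fees/slippage ignored).
    """
    total_value = sum(holdings[a] * prices[a] for a in holdings)
    return {a: target_weights[a] * total_value / prices[a] for a in holdings}

# Example: a 50/50 ETH/USDC mix after ETH doubles in price.
holdings = {"ETH": 10.0, "USDC": 20000.0}   # started balanced at ETH = 2000
prices = {"ETH": 4000.0, "USDC": 1.0}       # ETH has since doubled
targets = {"ETH": 0.5, "USDC": 0.5}
print(constant_mix_rebalance(holdings, prices, targets))
# -> {'ETH': 7.5, 'USDC': 30000.0}: sell the outperformer, buy the laggard.
```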
Crypto Network's Killer Value Proposition
What is the biggest value proposition of “crypto”? Simply put, crypto is an upgrade to the internet that allows the transfer and programming of “value”, or “assets” (the two terms are used almost interchangeably), in addition to just information. It does this through two key components:
Functionalization Theory
Market evolution (or innovation) is often catalyzed by a multitude of mechanisms: software digitizes and streamlines archaic business models, technological breakthroughs give birth to whole new markets, and so on. In this post, I want to take the time to formalize an important driving force of market innovation, called functionalization (not to be confused with the similar term used in materials science, which means something very different). This will be the first post in (hopefully) a series formalizing what I call “innovation theory”, the formal framework that describes precisely what makes companies valuable.
Azuma-Hoeffding
This is a brief intro to the Azuma-Hoeffding concentration inequality, which can be viewed as a generalization of the Chernoff bound. For those who are unfamiliar, concentration inequalities specify how concentrated around some mean a particular random variable is. This is useful to quickly demonstrate that a particular random process doesn’t deviate from some expected outcome by more than some bounded amount.
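For reference, this is the standard bounded-differences form of the inequality (the textbook statement, not anything specific to this post): if $X_0, X_1, \dots, X_n$ is a martingale whose increments satisfy $|X_i - X_{i-1}| \le c_i$ for all $i$, then for any $t > 0$,

$$\Pr\big[\,|X_n - X_0| \ge t\,\big] \;\le\; 2\exp\!\left(\frac{-t^2}{2\sum_{i=1}^{n} c_i^2}\right).$$

The bound says the deviation from the starting value is exponentially unlikely to exceed the scale set by the step sizes $c_i$, which is exactly the “doesn’t deviate by more than some bounded amount” intuition above.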
