
Whatever they do about tokenisation, incumbents are doomed

  • Writer: Future of Finance
  • May 7, 2024
  • 7 min read

Updated: Sep 9

[Cover image: "Digital Asset Tokenisation Guide", Future of Finance; abstract cityscape in blues and yellows]



The inaugural Future of Finance Digital Asset Tokenisation Guide (DATG) is published at a time when confidence in a fully tokenised future is not high. As the cover story depicts, using data drawn from the Future of Finance database, the number of fund and security token issues is limited, almost all are asset-backed rather than native, and there is a marked reluctance to address the all-important equity market.


Progress so far has depended on a handful of regulated financial institutions, some of which are now visibly losing their internal funding as well as their enthusiasm. Talented individuals are moving on to more immediately lucrative possibilities. Many of the tokenisation FinTechs which ignited the initial interest have stalled too, and some have failed altogether as their venture capital backers also lose faith in rapid adoption of tokenised finance.


Hopes that asset managers would lead the drive to tokenisation were inflated by the encouraging remarks of the CEOs of a pair of household names. But they were soon deflated by the predictably narrow focus of the more adventurous asset managers in terms of asset classes, including the launch of products that are hard to admire: Bitcoin spot market exchange traded funds (ETFs).


An unambitious pair of papers from the Investment Association (IA) in London (which effectively concluded that meaningful change must await legal and regulatory reform) and the Association Française de la Gestion financière (AFG) in Paris (which argued that it may be time for French asset managers to start thinking seriously about tokenisation) were not encouraging either.


True, as an article in this first issue of the Future of Finance DATG explains, tokenisation is making unexpected progress in Japan. But, even there, asset-backing is the default model and real estate is the overwhelming choice of asset class.


So reasons to be cheerful about tokenisation must be found outside existing practices. Such reasons do exist. One of them is that what is happening and not happening in the token markets today is exactly what we should expect in an established industry facing a disruptive technological innovation.



Tokenisation is not exotic but is built of everyday components


Establishment voices in the established securities and funds industries tend to portray tokenisation as interesting but exotic. In fact, tokenisation is not some providential (or calamitous) intervention from a parallel universe. It is a digital technology made up of components that have existed for years. Three of those components are crucial to its long-term success.


The first is digital technology itself. With enough time and memory, computers can simulate any physically possible environment, making digital computation the most powerful general-purpose technology in the history of humanity. The securities and funds industries cannot escape its impact indefinitely any more than manufacturing could escape the invention of electricity or transport the invention of the internal combustion engine.


The second crucial component is the Internet. It is easy to forget that the blockchains which host tokens are an Internet technology. The Internet is itself a compound technology made up of processors and servers, telecommunications equipment and communications standards, but its most important characteristic is its openness. Because it is an open protocol, the Internet cannot be controlled and closed by any profit-seeking private interest, enabling the applications available through it to be built, accessed and used by anyone.


It is this feature which endows the Internet with the incentive power of network effects: as more people use the Internet, its value increases for everybody. The value of this feature is impossible to overstate. The reason the token universes that have existed for years in the multi-player gaming industry remain embryonic is that they are closed networks.


The companies that own computer games make money by ensuring that their customers cannot travel outside the game and the tokens earned from the game can be spent only within the game. Security and fund tokens, by contrast, must be accessible from any network, and portable to any network, or they will fail.
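The network-effect argument above can be made concrete with a rough rule of thumb. Metcalfe's law (an illustrative assumption here, not a claim made in the article) values a network by the number of possible pairwise connections among its participants, which grows quadratically:

```python
def potential_connections(n: int) -> int:
    """Possible pairwise links among n network participants: n*(n-1)/2."""
    return n * (n - 1) // 2

# Doubling participation more than doubles potential value,
# which is why open networks outgrow closed ones.
for n in (10, 20, 40):
    print(n, potential_connections(n))
# 10 ->  45
# 20 -> 190
# 40 -> 780
```

A closed gaming network caps `n` at its own customer base; an open protocol lets `n` grow without limit, which is the incentive the article describes.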


The third crucial component is Open-Source software. Because it can be used, modified and enhanced by anybody, Open-Source software is “composable.” It does not have to be reinvented every time a developer wants to build an application for use on the Internet. As the Ethereum website puts it, “all apps are built on the same blockchain with a shared global state, meaning they can build off each other (like Lego bricks). This allows for better products and experiences and assurances that no-one can remove any tools apps rely upon.”
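The “Lego bricks” idea can be sketched in a few lines. The classes below are hypothetical illustrations, not any real Ethereum API: one application reuses an existing token ledger, unchanged, rather than reinventing it.

```python
class Token:
    """A minimal fungible-token ledger (illustrative only)."""

    def __init__(self):
        self.balances = {}

    def mint(self, account: str, amount: int) -> None:
        self.balances[account] = self.balances.get(account, 0) + amount

    def transfer(self, sender: str, receiver: str, amount: int) -> None:
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount


class Escrow:
    """A second application composed on top of Token, without modifying it."""

    def __init__(self, token: Token):
        self.token = token

    def lock(self, payer: str, amount: int) -> None:
        # Reuses Token.transfer rather than reimplementing a ledger.
        self.token.transfer(payer, "escrow", amount)


token = Token()
token.mint("alice", 100)
Escrow(token).lock("alice", 40)
print(token.balances)  # alice holds 60, the escrow holds 40
```

The point of the sketch is that `Escrow` needed no changes to `Token`: composability means each new application inherits the tools already on the shared platform.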


Incumbent firms in the securities and funds industries are playing familiar roles

It is these three components that have made it possible for tokenisation engines and token exchanges to emerge, along with a host of supporting products and services, including data vendors that can feed the valuation tools and smart contracts that tokens rely on.


In short, there is a horde of new entrants contending to exploit tokenisation as an innovation. Most will fail; some already have. But those that survive will define the dominant design of tokenisation and take it into large-scale production. This is not mere assertion or foolhardy prediction. It is what the entire history of technological innovation tells us.


That history also tells us that the response by incumbents to technological innovations is often inimical to long-term survival because it is condemned to be cautious. Incumbents have revenues and customers. To seek to nurture them, rather than embark on a rapid and wholesale transition to a new technological paradigm, is a condition of remaining in business.


It is also a necessary defence against new entrants that wish to compete those customers and revenues away. When Joseph Schumpeter observed that incumbents do not merely succumb to his “perennial gale of creative destruction” but “can and will fight progress itself,” (1) he meant only that it is natural for established firms to protect the value of their existing assets and revenues from potential threats.


In doing so, they possess the advantage that a successful innovation must disrupt more than the prevailing technology. It must also disturb existing relationships (between customers and suppliers but also between employers and employees, whose instincts will in most cases be even more conservative than those of management) and ways of doing things (the issuance, trading, settlement, servicing and safekeeping of financial assets are deeply embedded processes that can only be disrupted completely and not piecemeal).


The need to continue to service existing clients, to protect revenues and to meet payroll, together with the sheer complexity of replacing even a part of an existing process with a new technology, explains why it is so hard for incumbents to innovate.


The management of the technology budgets of the major global banks betrays this fact. They all draw a clear distinction between what is spent on Running the Bank and what is spent on Changing the Bank. And the sums spent on Running the Bank always outweigh the sums spent on Changing the Bank by a factor of five or even ten.


In this respect, companies are not different from individuals. What we own makes demands and creates liabilities that we cannot ignore. It is also human nature to over-value what we have, to fear what we may lose and to expect others to appreciate the value of what we provide for them as much as we do.


When Steve Sasson presented his prototype of a hand-held digital camera to the senior management of Kodak back in the 1970s, they were understandably unable to envisage the personal computer, the Internet or the mobile telephone, chiefly because their attachment to the existing product set blinded them to these future possibilities. “Where would you store these images?” he was asked. “You’re not making a print. People love prints. People don’t want to look at their pictures on a television set.”



The moment of maximum performance is the moment of maximum weakness


This institutionalised myopia is, paradoxically, the principal reason for believing that tokenisation will eventually succeed. If a technology requires new capabilities, and incumbents are institutionally incapable of mastering them, the technology must ultimately prove fatal for the incumbents.


If tokenisation requires a transfer agent to reconcile digital and analogue registers, or a fund accountant to check security token prices as well as security prices before striking a NAV, or an asset manager to do no more than add a tokenised share class to an existing fund, a tokenised future is not a threatening one for incumbents.


But if tokenisation requires transfer agents to settle fund subscriptions and redemptions instantly in digital money, fund accountants to value commitments rather than portfolios, and asset managers to sell commitments rather than risk-adjusted market returns, a tokenised future will mark most if not all incumbents for extinction.


So it is not hard to fathom why incumbents prefer asset-backed tokens to native tokens, and prefer to tokenise assets which lack an existing infrastructure and set of service providers. By restricting the scope of change in their existing business and confining it to areas where there is little or nothing to change in the first place, incumbents can minimise the amount of change overall.


In any industry, the dominant firms are always going to prefer no change to change and incremental change to radical innovation. Ironically, new entrants indulge this tendency. For them, the current owners of the customers and the revenues appear dauntingly large and entrenched. So challengers condemn themselves to proceed at the pace set by the incumbents.


This inadvertent conspiracy explains the proliferation of unequal “partnerships” between incumbents and challengers, and the willingness of incumbents to contribute to Proofs of Concept (PoCs) and Pilot Tests organised by new entrants. Incumbents can learn a little from a PoC or Pilot Test, and attract favourable publicity for appearing progressive, while in practice doing little to alter the status quo.


The impact of tokenisation is restricted initially to a burnishing of existing products and services to make them more competitive with the promises of tokenisation. Shrinking settlement timetables from trade date plus two days (T+2) to trade date plus one day (T+1) falls into this category. So do the promises to reduce settlement fails, discussed in the third and final article in this inaugural edition of the DATG.


Incremental improvements are unlikely to inoculate the existing system against destruction by atomic settlement of digital objects on blockchain networks. History shows that incumbents are remarkably adept at defending entrenched technologies and established processes, and that their products and services often reach their peak performance at the very juncture where they at last become obsolete. Innovation, as a source of failure in business as well as success, has many mansions.
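The contrast between incremental improvement (T+2 to T+1) and atomic settlement can be sketched roughly. In atomic delivery-versus-payment (DvP), the security leg and the cash leg either both settle or neither does, so a one-sided settlement fail cannot occur. The function below is a simplified illustration with hypothetical names, not any real platform's API:

```python
def atomic_dvp(securities: dict, cash: dict,
               seller: str, buyer: str,
               quantity: int, price: int) -> bool:
    """Settle security and cash legs atomically; return True on success."""
    # Check both legs before touching either balance,
    # so the trade either completes in full or not at all.
    if securities.get(seller, 0) < quantity or cash.get(buyer, 0) < quantity * price:
        return False  # neither leg moves: no one-sided settlement fail
    securities[seller] -= quantity
    securities[buyer] = securities.get(buyer, 0) + quantity
    cash[buyer] -= quantity * price
    cash[seller] = cash.get(seller, 0) + quantity * price
    return True


securities = {"seller": 100}
cash = {"buyer": 5_000}
ok = atomic_dvp(securities, cash, "seller", "buyer", quantity=10, price=500)
```

In the conventional system, the two legs settle days after the trade across separate infrastructures, and a shortfall on either side produces a fail to be managed; in the atomic sketch, the shortfall simply prevents the trade, which is the disruption the article argues incremental T+1 improvements cannot match.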





(1) Joseph A Schumpeter, Capitalism, Socialism and Democracy, Allen & Unwin, 1957, pages 84 and 96.
