Why are the tokenised securities and fund markets failing to scale?

Are vested interests at work?
Regulated financial institutions have adopted a cautious approach to tokenisation. But the main reason is not protection of existing revenue streams. It is a concern that the existing structure of the capital markets reflects the lessons of experience, and that it would be foolish to jettison that knowledge and the safeguards built on it in pursuit of the cost savings afforded by disintermediation. That said, there are cases where incumbents resist tokenisation as a threat to existing revenues. Bank loans, for example, could be distributed and traded more widely and serviced much more efficiently if they were issued onto blockchains, but agent banks benefit from the net interest margin income of the current arrangements. Privately managed credit funds, Schuldschein and Collateralised Loan Obligations (CLOs) have all emerged as tokenisable assets in recent years, though they have also attracted the attention of regulators concerned about tokenising complicated and intrinsically illiquid assets that can easily become the subject of debt restructurings.
Why are sell-side firms not more engaged?
Just six global banks and three international central securities depositories (CSDs) are seriously committed to tokenisation. This reflects the lack of interest on the buy-side in investing in tokenised assets as opposed to conventional equities and bonds. Where sell-side firms see genuine opportunities in tokenisation – such as tokenising collateral, where banks can save substantial sums in capital and credit costs – they do invest. Sell-side firms recognise the potential gains from tokenisation in terms of new products and services. They are also aware of the risk of failing to future-proof their businesses against token markets achieving scale. But there is insufficient participation by buy-side clients to persuade them to invest now in preparing for either possibility.
Why are buy-side firms not more engaged?
The asset managers that are active in the token markets are acting as issuers rather than investors. In other words, they have tokenised their own funds, but they have not bought tokenised securities or funds issued by others, except in occasional Proofs of Concept they were invited to join. So asset managers understand that tokenisation creates the opportunity to sell more funds to a wider range of investors. But in their capacity as investors, the case for engagement with tokens rests on increased operational efficiency. Asset managers can buy highly liquid government bonds without worrying about post-trade inefficiency, whereas a third or more of corporate bond transactions can fail to settle on time, so tokenisation could materially improve the operational efficiency of corporate bond investing. However, asset managers argue that they need a regulated custodian bank to capture that efficiency by agreeing to safekeep their private keys and settle their transactions. Without a custodian bank, tokenisation will add cost for asset managers rather than subtract it, because they will have to invest in the necessary capabilities themselves. In short, asset managers are not fully engaged with the token markets because the custodian banks are not yet ready to support them.
What should be tokenised?
Viable use cases for blockchain technology are not as plentiful as some advocates believe. Although anything can be tokenised, not everything should be. Nor should tokenisation be viewed as akin to asset-backed securities (ABSs, in which existing financial assets are repackaged, sold and traded) or exchange-traded funds (ETFs, in which baskets of securities track an index). ABSs had a powerful commercial rationale in freeing up bank balance sheets. ETFs made commercial sense too, as a convenient tool for acquiring market exposures. Both could be sold through existing distribution networks with – in the case of ABSs – the imprimatur of the credit rating agencies. Money market funds also grew quickly, because they offered depositors a higher rate of interest than they could obtain at a bank. No such commercial incentives apply in the case of tokenisation.
Why has the tokenisation of privately managed assets grown so slowly?
Tokenisers have alighted on privately managed assets – real estate, private equity and private credit – as their earliest use case because there are fewer private interests to disrupt and no established infrastructure to displace. At the same time, neither issuers nor investors see a compelling business case for tokenising these asset classes. Asset managers are waiting for the custodian banks to make it as operationally painless and risk-free to invest in privately managed assets as it is to invest in the public equity and debt markets. On the supply side, only a handful of managers of privately managed assets have judged the opportunities in tokenisation – chiefly access to new investors through lower minimum subscription amounts and greater liquidity – significant enough to issue funds in tokenised form. This limits the incentive for custodians to develop the services that would attract investment at scale. So the incentives for issuers, investors and intermediaries to tokenise privately managed assets are all weak.
Why has tokenisation so far favoured “digital twins” rather than “native” tokens?
There are reasons why tokenisation has taken the form of “digital twins,” in which the underlying funds or securities continue to exist in their current form. It feels familiar, like a form of securitisation. It enables issuers, investors and intermediaries to test models of tokenisation. But it also relieves them of any need to reinvent their systems and the infrastructures they use to accommodate tokens. “Native” tokens must clear that inertial barrier.
Have the limitations of blockchain technology hampered progress?
The proliferation of blockchain protocols certainly slowed progress, because financial institutions had to take a view on which protocols were of institutional quality, which would prevail competitively and which would secure regulatory endorsement. Otherwise, they risked being lumbered with a wasted investment. The proliferation of inter-operability services has alleviated that problem by making the choice of blockchain protocol less of an all-or-nothing decision. One sign of changing attitudes is that potential users are now focused on service expectations rather than technological capabilities such as speed and scalability.
How important is inter-operability?
Inter-operability between blockchains and between blockchains and traditional markets is essential, primarily to guarantee secondary market liquidity. Buy-side institutions will not invest in the people and technology to buy and sell tokenised assets if those assets are trapped in shallow pools of inaccessible liquidity.
Is regulatory uncertainty still a factor militating against progress in tokenisation?
There is now a high degree of regulatory certainty in civil law jurisdictions such as Germany, Liechtenstein, Luxembourg and Switzerland, which have passed blockchain-friendly laws. The position in common law jurisdictions, such as Singapore and the United Kingdom, is also clear. Some jurisdictions, notably Bermuda and Singapore, have had considerable success in using legal and regulatory frameworks to attract innovative token businesses. The problem is that the asset management and securities industries want to distribute their products globally, and certainty of token distribution in a succession of domestic markets does not represent a meaningful advance on the equally fragmented regulatory status quo. Tokenisation is happening in certain domestic markets already but, for it to happen on a global scale, national regulators around the world need to reach a common view – and that is absent even in the traditional capital markets. There remains a risk that different national regulators will take different views of the same issue, partly because their knowledge of tokenisation will always lag that of the institutions they regulate. Managing different regulators and regulatory regimes also consumes resources, leaving less time and money to invest in tokenisation. But the biggest problem is the continuing lack of legal and regulatory clarity in the largest homogeneous capital market in the world – the United States – largely because of competition between regulatory agencies at the federal and state levels. This denies issuers access to a market that could provide the scale and profitability to generate the confidence to tap other markets.
Are tokenised markets being built in the wrong way?
Tokenisation is proven to work but is demonstrably failing to take off into self-sustained growth. The total size of the tokenised securities and fund markets is 0.02 per cent of the global debt and equity markets, 0.07 per cent of the global mutual fund market and 2.24 per cent of the cryptocurrency market. Yet the obvious obstacles – regulatory and legal uncertainty, the lack of fiat currency on-chain, the absence of inter-operability between markets, the shortcomings of blockchain technology, the blight cast by cryptocurrency scandals and the vested interests of incumbents – are not a complete explanation. It is possible that token enthusiasts are trying to grow the market in the wrong way. Perhaps they need to compete less and collaborate more; decentralise less and centralise more; disintermediate less and intermediate more; and ditch novelty for integration with legacy systems.