
Are tokenisers currently focused on alternative assets for want of something better?

  • Writer: Future of Finance
  • Jun 2

Updated: Aug 12

Digital Asset Exchanges 2025 Book



Are tokenisers focusing on privately managed assets because they are the easy option?


This was not the view of the audience. Only a minority thought that tokenising privately managed assets is an easier option than attacking the public markets, and hardly anybody thought issuers are more readily found there. The priorities the audience identified were broader distribution of privately managed asset classes, making privately managed assets more tradeable, and building an infrastructure to support them. In other words, tokenisers believe privately managed assets have problems that tokenisation can solve. Their goal is to raise privately managed asset markets to the same levels of accessibility and efficiency as the public markets. Contrary to the implication of the question, this will be far from easy. Both general partners (GPs) and limited partners (LPs) in private equity investing, for example, retain legacy systems, workflows and processes, and will be reluctant to invest in connecting to blockchain networks.


Are tokenised privately managed assets a growth opportunity for traditional exchanges?


There are obstacles to rapid scaling of the marketplace. Investors are reluctant to engage with privately managed assets until they achieve a certain scale. On the issuance side, private companies considering an equity or debt issue onto a blockchain often lack the prerequisites for issuing a security token, such as a suitable legal structure or an awareness of regulatory requirements, which means they must be hand-held throughout the process by tokenisation platforms and independent consultants, lawyers and advisers. Private issuers can also have unrealistic expectations about what is achievable in terms of price and structure, and are surprised when platforms do not invest heavily in marketing their tokens to investors. Yet such issuers are also averse to high issuance costs: US$20,000 would be at the upper end of what they will pay to issue a security token, which means the advice is not always economic to provide.


Will tokenisation erode the liquidity premium that makes privately managed assets attractive to investors?


In principle, tokenisation will bring product standardisation, price transparency and price discovery to privately managed asset markets, creating the expectation that it will increase the liquidity of the assets as well as broaden their distribution. Already, intermediaries are pressing for full disclosure of information about tokenised private equity funds distributed via private blockchain networks, largely to ensure that valuations and prices remain honest. On public blockchains, such disclosure would be unavoidable, and the application of Artificial Intelligence (AI) to fragmented and unstructured data in capital markets is likely to accelerate the increased transparency of information, with consequent effects on price discovery. So it would be surprising if tokenisation did not erode the extra return investors receive for holding an asset that is less liquid. Eventually, institutional limited partners (LPs) will have to choose whether they prefer a closed market in privately managed assets or an open one.


Can tokenisation make privately managed assets more liquid?


Increased liquidity – the ability to turn an asset into cash – is part of the purpose of tokenising privately managed assets. But offering investors in, say, a private equity fund the option to exit three times a year instead of once or not at all does not meet the proper definition of liquidity. In the case of digital twin tokens, liquidity is constrained by the underlying asset. Real estate, for example, is intrinsically illiquid: it can take months or even years to sell a building for cash. And a private equity fund listed on a digital asset exchange but held by a small number of investors, even in digitally “native” form, will not necessarily be liquid.


Are the tokenised privately managed assets markets over-serviced?


The privately managed assets opportunity has certainly attracted a lot of capital and contenders offering infrastructural and other services to issuers and investors. Paradoxically, this has created an obstacle to growth: tokenomics. Building an infrastructure to accommodate the trading, clearing, settlement and custody of tokenised privately managed assets is easy. Creating the network that brings the marketplace to life is hard. In principle, it is worth doing. Metcalfe’s Law – which holds that, while the cost of a network increases linearly with the number of users, its value grows by the square of the number of users – implies successful networks will be extremely valuable. But the incentive to capture that value is high. This militates against calls for interoperability – let alone greater cooperation and collaboration – between the numerous sub-scale businesses that currently make up the digital asset universe. The tokenised privately managed asset markets would grow faster if the three main blockchain networks that host them – Ethereum, Provenance and R3 Corda – were interoperable, and the various private networks were integrated with the public ones. Potential token issuers and investors are bewildered by the fragmentation of service provision in the tokenised asset markets, but commercial incentives argue against interoperability and consolidation. Product, service and technology silos are as commonplace in the digital asset markets as they are in the traditional securities markets. Indeed, the current dispensation is not unlike the electronic trading revolution of the early 21st century, when challenger equity trading platforms found that the clearing houses controlled by the traditional stock exchanges were closed to them.
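The economics of Metcalfe’s Law can be made concrete with a minimal sketch. The constants below (cost per user, value coefficient) are purely hypothetical illustrations, not figures from the article; the point is only that a linear cost curve and a quadratic value curve must eventually cross, after which each additional user adds more value than cost.

```python
# Illustrative sketch of Metcalfe's Law. All constants are hypothetical.
# Cost grows linearly with users; value grows with the square of users
# (proportional to the number of possible user-to-user connections).

def network_cost(users: int, cost_per_user: float = 100.0) -> float:
    """Linear cost: each connected user adds roughly the same cost."""
    return cost_per_user * users

def network_value(users: int, value_coeff: float = 0.5) -> float:
    """Metcalfe value: proportional to the square of the user count."""
    return value_coeff * users ** 2

# Break-even occurs where value = cost, i.e. users = cost_per_user / value_coeff.
for users in (100, 200, 400):
    print(users, network_cost(users), network_value(users))
```

With these illustrative constants the network breaks even at 200 users; below that point it destroys value, above it value compounds quadratically, which is why the incentive to capture a successful network, rather than share it, is so strong.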


Is the process of silo-isation in token markets reversible?


The initial response of major financial institutions to the emergence of blockchain was to form a consortium. This would have built collaboration into the design of blockchain services from the outset, but it proved unsustainable because some members of the consortium wished to proceed faster than others, so the more progressive firms chose to work alone or with smaller groups of like-minded firms. However, now that the pioneers have proved the technology works and found a series of worthwhile use-cases, there is something of a return to the original collaborative approach in the shape of open networks. In other words, tokenisation platforms are open to connecting with other platforms to create network effects. This creates a countervailing incentive in favour of interoperability.


Can digital asset exchanges encourage interoperability?


Exchanges can build new infrastructure in collaboration with market participants. In this sense, they are “convenors” of markets. But they must also respond to what clients want, as opposed to leading clients in a particular direction.


Are legacy systems an obstacle to the growth of token markets?


Yes. However appealing the user experience at the front end, many banks are still running core systems on IBM mainframes, and much of their code is written in COBOL. As digitised data has become available, Application Programming Interfaces (APIs) have become more important, not less, yet legacy systems are still exchanging data point-to-point via traditional messaging standards such as SWIFT and FIX. Making these systems interoperable with blockchain is not impossible, but it costs time and money. A longstanding preference at traditional financial institutions for proprietary systems has affected even the digital asset services they have developed. An intermediary firm that wanted to connect to all the major institutional-grade digital asset custodians, for example, found that each was using a different blockchain technology and that some of the blockchains were private and others public. Making these systems interoperable is essential, but it adds cost and complexity.


What tokenised privately managed assets have proved to be liquid?


None so far, though voluntary carbon credit markets were expected to be liquid because of the anticipated popularity of the asset class. This proved not to be the case, even though voluntary carbon credits are well suited to blockchain technology in terms of data and trade capture; indeed, they have traded at wide spreads. But creating a basket of high-integrity carbon credits and calculating the net asset value (NAV) of the contents once a week has resulted in relatively stable pricing over time. This has attracted market-makers, allowing the basket of voluntary carbon credits to be broadened, and the NAV is now calculated daily.
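The NAV calculation behind such a basket is straightforward arithmetic: the market value of the credits held, divided by the tokens in issue. A minimal sketch follows; the project names, quantities and prices are hypothetical examples, not data from the article.

```python
# Hedged sketch: NAV per token of a tokenised basket of carbon credits.
# Holdings, prices and token count below are hypothetical illustrations.

def basket_nav(holdings: dict[str, float], prices: dict[str, float],
               tokens_outstanding: float) -> float:
    """NAV per token = total market value of the basket / tokens in issue."""
    total_value = sum(qty * prices[credit] for credit, qty in holdings.items())
    return total_value / tokens_outstanding

holdings = {"solar_project_a": 10_000, "reforestation_b": 5_000}  # credits held
prices = {"solar_project_a": 12.0, "reforestation_b": 8.0}        # USD per credit
print(basket_nav(holdings, prices, tokens_outstanding=40_000))    # → 4.0
```

Moving from a weekly to a daily calculation changes nothing in the formula; it only requires daily price inputs for each credit in the basket, which is what stable pricing and the arrival of market-makers made feasible.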


Is carbon a growth market for tokenisation?


Tokenisation can contribute to building trust and confidence in both the voluntary carbon credit markets and the markets in Energy Attribute Certificates (EACs). In the EAC markets, tokenisation can digitise the issuance and trading processes. Carbon credits are well-suited to a hybrid of a traditional and a digital exchange that makes use of APIs to source data and blockchain technology to host trades, capture trade details and reassure investors about the provenance of carbon reduction projects. Verifiable data can help to counter the mismeasurement and misreporting issues that have undermined trust in the voluntary carbon credit markets. For example, Internet of Things (IoT) sensors were attached to a small-scale solar energy project, which provided sufficient reassurance for the associated solar credits to sell at a premium. But it was tokenisation that made it possible, by using a smart contract to reduce the costs of reporting the electricity output to holders of the credits.


Will tokenisation turn privately managed assets into public assets?


This is the wrong question. The right question is whether the market in an asset, or the participants in that market, are or will be regulated once the asset is tokenised. If a tokenised asset is brokered or invested in by a regulated intermediary, or must be valued by a regulated intermediary, regulators will insist on supervising activity in the market. That is a tighter definition of whether an asset is “public” than, say, being listed on a digital asset exchange open to everybody.


What can accelerate progress towards scalable token markets?


The shortening of the settlement timetable in the traditional securities markets to trade date plus one day (T+1) in the European Union and the United Kingdom in October 2027 provides an opportunity to embrace tokenisation as part of the solution. This will bring Europe into line with the T+1 timetable adopted in Canada and the United States in May 2024, easing transfers of capital between western Europe and North America. If this infrastructural integration is accompanied by efforts to harmonise the regulatory treatment of digital assets between Europe and North America, it will catalyse network effects. It will also facilitate the integration of tokenised asset markets with the traditional financial markets, by giving investors the choice of assuming the same exposure on different terms.


