Security tokens are not about making money, or saving money, but about doing things we could not do before

Security tokens are no longer something that will happen in the future. Security token offerings (STOs) are happening in the private capital markets, which are already large and growing rapidly as the public capital markets shrink. Unlike the public markets, the private markets suffer from operational shortcomings and liquidity problems that tokenisation can solve. Regulatory respectability is being actively sought, and developers are applying solutions tested in the DeFi markets to many of the barriers to institutional engagement. But the real test of security tokens still lies in the future: what can we do now which we could not do before?

The public capital markets are electronic but not digital. Since the 1970s they have gradually shifted issuance, trading, settlement and custody activities from paper to electronic book-entries. This has cut costs. 

The electronic trading platforms that mushroomed at the turn of the century, for example, reduced the cost of trading major equities on established exchanges to a tenth of former levels.

Traditional securities markets are efficient but imperfect

Likewise, settlement and custody costs also fell.

The Depositary Trust and Clearing Corporation (DTCC) today charges US$0.00000029 to look after a large portfolio of securities and US$0.02 to receive or deliver securities net.

Even in Europe, where settlement costs remain outrageous by comparison with the United States, TARGET2-Securities (T2S) charges just 23.5 cents to deliver securities against payment.

To be sure, there is still ample scope to increase efficiency, especially when securities are traded across national borders.

Thanks to fragmented markets and systems, and continuing manual intervention, expenditure on post-trade processing in the securities markets is estimated at US$6-9 billion a year.[1]

Fragmented settlement infrastructures add cross-border cash and securities collateral management and liquidity buffer costs estimated at €3.65 billion a year for large global banks.[2]

Issuance costs are indefensibly high

Securities issuance remains staggeringly expensive.

According to PwC, the weighted average cost of an equity initial public offering (IPO) is 4.4 per cent of gross proceeds, though costs range from 3.5 to 6.6 per cent.

Underwriting costs alone make up more than half these sums, with the take falling only in line with the size of the offering.

A study of IPOs by real estate investment trusts found other direct expenses (such as legal, accounting and regulatory fees) took the weighted average cost to 8.33 per cent of gross proceeds.[3]

Then there are the running costs, which include the listing fees charged by exchanges and the cost of employing a registrar to record shareholders.

Public capital markets are shrinking relative to private capital markets

These costs are among the reasons public equity listings are in decline in both the United States and the United Kingdom, in absolute terms and relative to private equity.

They also help to explain why even listed companies are returning capital to shareholders through buy-backs. Private capital is not only cheaper but is accompanied by more engaged investors who can add value to the business.

Even if none of this were true, there are always many more private firms than public ones, and collectively they make up in numbers what they lack in size.

According to McKinsey, in mid-2020 assets under management in private markets around the world totalled US$7.4 trillion, and fund-raising – even in markets hit by a global pandemic – reached US$858 billion in the previous 12 months.[4]

So it is not surprising that security tokenisation exchanges such as Archax, SDX and Tokeny are finding the readiest audience for their services among small and medium-sized enterprises (SMEs) issuing equity and debt to private investors, private equity firms and real estate companies.

Issuers access lower cost capital and investors a wider range of assets

Private issuers find tokenisation offers access to new investors and lenders and breaks the link between the size of an issue and its cost.

For private investors, on the other hand, tokenisation offers innovative forms of access to new and unusual asset classes.

Many of the assets being tokenised to date – real estate, private equity, agriculture, forestry, collectibles – are of the illiquid variety that investors can otherwise struggle to access.

Indeed, some early token issues bear a more compelling comparison with Non-Fungible Tokens (NFTs) than conventional debt and equity issues.

There is no reason in principle why these innovations and the associated cost savings will not, in due course, spill over from the private capital markets to the public.

Indeed, there is ample reason to believe that, just as digitisation has transformed the economics of the music, publishing and retail industries, so it will transform the economics of issuing, trading, settling and safekeeping securities tout court.

Private markets have yet to benefit from the efficiencies gained by public markets

But for now the focus is on the private markets.

This explains the otherwise inexplicable focus of tokenisation evangelists on the anticipated post-trade cost savings.

Since post-trade costs in traditional securities markets are at nugatory levels already, this has always seemed counter-intuitive. In private markets, by contrast, post-trade is still thick with inefficiencies.

The familiar promises of blockchain technology – to dispense with intermediaries, and with the accompanying need for those intermediaries to reconcile their records of the same transaction with each other – are likely to be fulfilled here.

Smart contracts that automate post-trade asset-servicing processes such as income collection, corporate actions and proxy voting will deliver further savings in manual processing.
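At its simplest, asset-servicing automation of this kind reduces to distributing entitlements pro rata across the token register. The Python sketch below illustrates the income-collection case; the register, holder names and amounts are invented for illustration only, and a real smart contract would encode the same logic on-chain:

```python
from decimal import Decimal

def distribute_coupon(holdings: dict, coupon_total: Decimal) -> dict:
    """Split a coupon payment pro rata across token holders, as an
    income-collection smart contract would do automatically on-chain."""
    total_tokens = sum(holdings.values())
    return {holder: (Decimal(tokens) / Decimal(total_tokens)) * coupon_total
            for holder, tokens in holdings.items()}

# Hypothetical register of a tokenised bond: holder -> tokens held
register = {"fund_a": 600, "fund_b": 300, "fund_c": 100}

payments = distribute_coupon(register, Decimal("50000"))
print(payments)  # fund_a receives 30000, fund_b 15000, fund_c 5000
```

The saving is not in the arithmetic, which is trivial, but in removing the chain of paying agents, custodians and registrars who today each perform and reconcile it manually.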

Operational savings are constrained by lack of fiat currency and the need for fully funded accounts

The missing components in post-trade processing are equally familiar.

Although transactions can be settled by delivery of security tokens against payment tokens (including stablecoins) on a tokenisation network, sellers wanting fiat currency have to settle trades off the blockchain, through payments market infrastructures accessible only via the banking system.

This obstacle is likely to be cleared by the introduction of central bank digital currencies (CBDCs).

In addition, settlement can take place instantaneously on tokenisation platforms only when the account of the buyer is funded with cash and the account of the seller is funded with tokens.
This imposes liquidity costs as the security tokens and the payment tokens have to be sourced.

This need to be fully funded has inhibited the post-trade netting of transactions at the end of the trading day through central counterparty clearing houses (CCPs), a process which makes a significant contribution to reducing liquidity costs in traditional securities markets.
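The liquidity saving that netting provides, and that fully funded token settlement forgoes, is easy to illustrate. In this Python sketch (counterparties and amounts are invented), a day's gross obligations collapse into a single net position per participant, as end-of-day CCP netting achieves:

```python
from collections import defaultdict

def net_positions(trades):
    """Collapse gross bilateral obligations into a single net cash
    position per participant, as end-of-day CCP netting does."""
    net = defaultdict(int)
    for payer, payee, amount in trades:
        net[payer] -= amount
        net[payee] += amount
    return dict(net)

# Hypothetical day's trades: (payer, payee, amount)
trades = [("A", "B", 100), ("B", "C", 80), ("C", "A", 90), ("B", "A", 40)]

gross = sum(amount for _, _, amount in trades)   # 310 changes hands gross
net = net_positions(trades)                      # one net figure each
# Fully funded settlement needs cash for every gross obligation;
# netting reduces each participant's funding need to its net position.
print(gross, net)
```

Here 310 of gross obligations settle with net payments of just 30 (B pays 20, C pays 10, A receives 30), which is the liquidity cost at stake.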

Digital netting and margin management tools are under development

However, netting between crypto-currency counterparties is now being pioneered in the decentralized finance (DeFi) market.

The work CCPs perform in cutting counterparty risk is also being mimicked by developers.

CCPs mitigate counterparty risk with collateral, in the shape of initial margin and daily margin calls.

At present, they calculate variation margin calls once a day. Reconciliation disputes are common, and it might take another day for the collateral actually to be transferred.

In the model now being developed, variation margin will be calculated automatically throughout the trading day, as often as every 15 minutes.

Margin will also be posted continuously throughout the trading day. By this means, counterparty credit risk can be managed bilaterally rather than multilaterally through a CCP.
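The intraday cycle described above amounts to marking each position to market at every interval and calling the change in value. A minimal Python sketch, with invented prices on a 15-minute grid:

```python
def variation_margin_calls(position, prices):
    """Mark a position to market at each intraday interval and return
    the variation margin call (change in value) for every step."""
    return [round(position * (curr - prev), 2)
            for prev, curr in zip(prices, prices[1:])]

# Hypothetical: long 1,000 tokens, marked to market every 15 minutes
prices = [10.00, 10.05, 9.95, 10.10]
calls = variation_margin_calls(1000, prices)
print(calls)  # [50.0, -100.0, 150.0]: positive means the counterparty posts
```

Calculating and settling these calls automatically throughout the day, rather than once a day with a further day for the collateral to move, is what removes both the reconciliation disputes and the overnight exposure.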

It is an approach that delivers the savings in capital allocation which regulators grant to collateralised (as opposed to uncollateralised) trades.

It also reduces the level of risk in the market as a whole by reducing the concentration of counterparty risk at CCPs.

Digital identities are addressing constraints on borrowing cash and stock

In another potentially useful experiment, DeFi participants are developing token market equivalents of another important source of liquidity in traditional securities markets: short-sellers who sell what they do not own in the expectation that the price will fall.

In traditional securities markets, short-selling depends on an active securities lending market, and the token equivalent is being pioneered in the DeFi market.

However, the anonymity of counterparties means loans must be heavily over-collateralised (150 per cent of the value of the loan is normal).

So work is now in hand to build digital identities into the lending process, so lenders can customise collateral “haircuts” to the counterparty risks of different borrowers.
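Tiering collateral "haircuts" to counterparty identity can be sketched as a simple schedule. The tiers and percentages below are invented for illustration; the anonymous tier corresponds to the flat 150 per cent over-collateralisation mentioned above:

```python
# Hypothetical haircut schedule: better-identified, lower-risk borrowers
# post less collateral per unit borrowed than anonymous ones.
HAIRCUTS = {
    "anonymous": 1.50,               # flat DeFi-style over-collateralisation
    "identified": 1.20,
    "rated_investment_grade": 1.05,
}

def required_collateral(loan_value, borrower_tier):
    """Collateral a securities borrower must post, scaled by an
    identity/risk tier instead of a flat anonymous haircut."""
    return round(loan_value * HAIRCUTS[borrower_tier], 2)

print(required_collateral(1_000_000, "anonymous"))               # 1500000.0
print(required_collateral(1_000_000, "rated_investment_grade"))  # 1050000.0
```

The design point is that digital identity turns collateralisation from a blunt, uniform penalty into a price for counterparty risk, freeing 450,000 of collateral in this invented example.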

Tokenisation can reduce the capital and liquidity costs of cross-border trading

The traditional markets, for their part, do not escape the liquidity costs imposed by settlement procedures.

Because transactions must settle by delivery against payment in central bank money via the central securities depository (CSD) into which the securities were issued, counterparty banks have to maintain large buffers of cash and securities in each market where they or their clients are active.

Furthermore, the imposition by regulators of capital weightings to cover counterparty risk in settlement means that settling instantaneously – as security tokens do – could save counterparties significant sums.

Research by Fnality found that shorter settlement timetables saved 15 large global financial institutions an average of US$300 million in risk-weighted capital allocations.

Even if tokenisation markets fragment along national or asset class lines, they could still mitigate the tax that liquidity buffers and capital weightings impose on transactional activity, by the relatively simple device of decoupling the cash leg of the transaction from the delivery of the security token and using technology rather than CSDs to provide delivery versus payment.

Lack of liquidity in private markets is a barrier tokenisation must clear

It matters because liquidity – which is not, in the final analysis, a technological phenomenon – may well prove the highest barrier to adoption of tokenisation.

Naturally, issuers, as well as traders and investors, are reluctant to participate in markets that lack buyers and sellers. The cost of capital is higher where liquidity is absent.

Liquidity is naturally scarce in the private markets where tokenisation has begun.

Ready buyers and sellers are hard to find, negotiations are prolonged, and distribution is often restricted to certain types of investor.

Private assets are intrinsically complex and frequently have unusual conditions attached, such as a right of first refusal.

To that extent, tokenisation is not just being pioneered in private markets untouched by conventional securities markets infrastructure but is actually constrained by the nature of the asset classes being tokenised.

Yet a large part of the point of tokenisation is to provide liquidity, and efforts are being made to overcome the scarcity of traders and investors.

Tokeny, for example, hosts bulletin boards where buyers and sellers can post their intentions and find (occasional) counterparties outside the trading platform.

The extension of these and other forms of price discovery to the tokenisation issuance and trading platforms now proliferating around the world would encourage the development of liquidity by attracting issuers, traders and investors.

Inter-operability (or portability) between blockchain networks will boost liquidity

It would give a valuable boost to liquidity if tokenisation platforms were able to inter-operate or – to put it another way – assets and transactions were portable between platforms.

It would be helpful, for example, if a transaction agreed on one platform could be settled on another. If inter-operability or portability could be applied to traditional as well as blockchain networks, it would help still more.

At present, it is difficult to achieve portability even between different versions of blockchain, let alone between blockchain networks and traditional exchanges, because there is no agreed set of data standards or computer protocols.

A number of standards with varying functionality – ERC-20, ERC-1400, T-REX (Tokeny), ST20 (Polymath) and R-Token (Harbor/BitGo) – are competing for attention.

Bridges between them are hard to build, technically.

A conventional view is to leave them alone and let competition between standards decide the winners.

The emergence of ERC-20, in which an initial profusion of standards eventually settled on a single standard, suggests that network effects will decide the winners.
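For readers unfamiliar with it, ERC-20 standardises a small set of functions (balanceOf, transfer, approve, transferFrom and so on) that every compliant token exposes, which is what makes tokens interchangeable across wallets and exchanges. The Python sketch below mimics those core semantics in memory; it is a toy, not Solidity, and the account names are invented:

```python
class ERC20Like:
    """Toy in-memory mimic of the core ERC-20 token interface:
    balance_of, transfer, approve and transfer_from."""

    def __init__(self, supply: int, owner: str):
        self.balances = {owner: supply}
        self.allowances = {}          # (owner, spender) -> approved amount

    def balance_of(self, who: str) -> int:
        return self.balances.get(who, 0)

    def transfer(self, sender: str, to: str, amount: int) -> bool:
        if self.balance_of(sender) < amount:
            return False              # a real ERC-20 contract reverts here
        self.balances[sender] = self.balance_of(sender) - amount
        self.balances[to] = self.balance_of(to) + amount
        return True

    def approve(self, owner: str, spender: str, amount: int) -> None:
        self.allowances[(owner, spender)] = amount

    def transfer_from(self, spender: str, owner: str, to: str, amount: int) -> bool:
        if self.allowances.get((owner, spender), 0) < amount:
            return False
        self.allowances[(owner, spender)] -= amount
        return self.transfer(owner, to, amount)

token = ERC20Like(1_000, "issuer")
token.transfer("issuer", "alice", 250)
token.approve("alice", "broker", 100)          # delegated spending, as on-chain
token.transfer_from("broker", "alice", "bob", 100)
print(token.balance_of("alice"), token.balance_of("bob"))  # 150 100
```

Because every compliant token answers these same calls, any wallet, exchange or settlement system written against the interface works with all of them, which is precisely the network effect the paragraph above describes.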

Protocols are doing the work of message standards in the security token markets

Again, however, useful work is being done to accelerate inter-operability between blockchain networks.

Transferability of tokenised assets between public Ethereum and Corda-based blockchain networks, for example, is now possible.

The recently published work of the FinP2P group in devising a set of network protocols that allow market participants to buy and sell in marketplaces built on different versions of blockchain is likely to prove still more important in overcoming the fragmentation of liquidity.

Indeed, the variety of technological solutions now being devised to solve the problem of inter-operability or portability is an encouraging sign of the health of the industry.

They will over time almost certainly evolve into a smaller set of standard protocols, as market participants converge on the most effective solutions.

New issues of tokens need advisers to structure them

As liquidity deepens, more issuers will emerge.

They will need banks to design and structure their issues, as the pioneers of the illiquid asset classes now being tokenised have found.

The asset classes lack the established precedents and operational and infrastructural support that a conventional bond or equity issuer can rely upon.

Yet again, specialist firms are doing this work already, notably in Switzerland.

As the market grows, the investment banks that create conventional debt and equity issues will become involved.

Composability – the option to create tokenized financial instruments out of existing components – would expand the range of designs, though it will take time for corporate financiers to educate themselves about what is possible.

Bank balance sheets could become an important source of new issues

Banks themselves could become an important source of new token issues.

Bank balance sheets are crammed with capital-intensive assets that are becoming unattractive for banks to hold under the (mandatory) Basel IV capital adequacy regime.

This could dislodge a little-discussed barrier to adoption: inertia.

Capital is a real, measurable cost, and its application to on-balance sheet assets is mandatory.

In the same way that emerging market debt was securitised 30 years ago, assets currently sitting on bank balance sheets might be tokenised because that offers buyers liquidity as well as discounted prices.

Real estate loans are an obvious starting point.

At the same time, the appetite of SMEs for debt and equity capital in economies recovering from the global pandemic will further increase interest in tokenisation, precisely because the banks will no longer want to accommodate their needs on their (expensive) balance sheets.

A different vocabulary could allow security tokens to achieve regulatory respectability sooner

Of course, tokenisation needs investors as well as issuers.

One lesson of the DeFi boom is that the uncertain regulatory position of many of the crypto-currency derivatives traded there makes it hard for participants in the conventional securities markets to get involved.

The jargon deployed in the DeFi markets is a further hurdle to achieving regulatory respectability quickly, because regulators need time to understand what is going on.

Some observers see an analogous barrier to understanding by regulators in the term “security token” itself.

The idea of a “token,” they say, originated in the Initial Coin Offering (ICO) boom of 2017-18, when it referred not to financial assets for investment but to a form of crypto-currency (“utility tokens”) that gave holders the right to access a product or service.

The label survived regulatory rulings that treated ICOs as unregistered issues of securities, but it continued to make “tokens” sound alien.

On this view, it is better to talk about “digital securities” or “uncertificated securities.”

Unlike “tokens,” uncertificated instruments are covered by a well-established corpus of law and regulation in every major financial market, and security tokens could fit comfortably within it.
That would enable regulators to recognise and regulate security tokens much more readily.

Once that regulatory uncertainty dissipates, the market access and infrastructure methods being pioneered in DeFi – and the liquidity and efficiencies they are creating – could cross over and scale astonishingly quickly in the securities markets.

The institutional preference for private, permissioned networks is a drag on progress

The need for institutional-grade tokenisation platforms to operate as private, permissioned blockchain networks, in which membership is controlled, will act as a drag on that scalability.

After all, it is much easier for public blockchains to interact with each other than it is for private, permissioned ones.

But institutions are uncomfortable about participating in public blockchain networks.

A completely anonymous, continuously updated, fully open ledger that shows flows (transactions) and stocks (who owns what) raises concerns about the privacy of transactions and portfolios and the identity and risk of counterparties.

This is why private, permissioned blockchains are the default option in the institutional marketplace.

They offer the reassuring presence of a trustworthy intermediary to decide who is admitted to the network, govern the behaviour of participants, facilitate inter-operability or portability between networks and even keep some sort of record of who did what and who owns what.

That role of the trusted intermediary is often assigned to CSDs, partly as a consolation prize.

After all, the core activities of an issuer and investor CSD – issuance, registration of ownership, maintaining the integrity of issues, settlement, safekeeping and asset servicing – are in theory quite redundant in a public blockchain model.

The persistence of such intermediaries would also make it much easier to fit security tokens within existing securities laws and regulations.

In unreconstructed form, these can prescribe a CSD, even though the movement of digital assets against digital payment takes place entirely within the systems of a tokenisation platform.

Zero-knowledge proofs could make public blockchains institutionally acceptable

This is why blockchain purists dispute the need for intermediation at all.

As it happens, intermediary-free tokenisation networks for institutions are a more-than-theoretical possibility.

Blockchain networks are now being built on Ethereum that use zero-knowledge proofs to achieve full anonymity, even on public blockchains.
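The idea can be illustrated with the classic Schnorr identification protocol, an interactive proof that the prover knows a secret exponent without revealing it. The Python sketch below uses tiny, insecure parameters purely for illustration; production systems use very large groups and non-interactive variants:

```python
import random

# Toy Schnorr identification: prove knowledge of x where y = g^x mod p,
# without revealing x. Tiny parameters for illustration only (NOT secure).
p, q, g = 23, 11, 2          # g generates a subgroup of order q in Z_p*

x = 7                        # prover's secret
y = pow(g, x, p)             # prover's public key

def prove_and_verify() -> bool:
    r = random.randrange(q)          # prover: fresh random nonce
    t = pow(g, r, p)                 # prover -> verifier: commitment
    c = random.randrange(q)          # verifier -> prover: random challenge
    s = (r + c * x) % q              # prover -> verifier: response
    # Verifier accepts iff g^s == t * y^c (mod p), since
    # g^(r + c*x) = g^r * (g^x)^c; the transcript reveals nothing about x.
    return pow(g, s, p) == (t * pow(y, c, p)) % p

print(all(prove_and_verify() for _ in range(100)))
```

A transaction validated this way can be confirmed by anyone on a public ledger while the underlying positions and identities stay private, which is the institutional concern the technology addresses.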

Costs, poor operational performance and limited scalability inhibit widespread adoption of this technology.

But if those challenges can be overcome, institutional investors may shed their reluctance to engage with public blockchains on grounds of privacy, confidentiality and counterparty risk.

However quickly zero-knowledge-proof solutions develop, it will inevitably take time for institutions to be confident enough to rely on technology rather than gatekeeping and rule-setting intermediaries.

Tokenisation platforms offer institutions the assurance of fully regulated status

Which is why members of the pragmatic school of tokenisers are actively seeking regulatory recognition.

Archax, for instance, is regulated by the Financial Conduct Authority (FCA) as a digital exchange and custodian.

Swiss Digital Exchange (SDX) parent company SIX is regulated by the Swiss Financial Market Supervisory Authority (FINMA).

Pursuit of regulated status reflects the recognition that, in the short term at least, institutions are bound to prefer regulated environments.

It is also invariably easier to adapt a technology to fit a regulation than to bend a regulation to a technology.

The crypto-currency and security token markets are converging

This recognition is affecting even the behaviour of innovators in the crypto-currency markets, where developments in the security token markets are cross-fertilising developments in the DeFi markets.

The Binance crypto-currency platform, for example, chose recently to host trading of a conventional equity (Tesla).

Moves of this kind inch the crypto-currency markets towards the regulated tokenisation platforms. In other words, DeFi and tokenisation are converging already.

Regulation will converge too, and by the same process.

A crypto-currency exchange offering access to a conventional equity via what is effectively a depositary receipt (DR) structure is a clear invitation to regulators to bring the activity within the scope of existing securities market regulations.

The long-term success of tokenisation depends less on gains or savings than on innovation

But ultimately the success of tokenisation hinges on its ability to deliver commercial benefits.

The ability of tokenisation platforms to charge issuance and trading fees below those of traditional exchanges is almost irrelevant. The ability to dispense with underwriting fees will matter more, and that is unproven.

The reduction of post-trade processing costs matters more in private markets than public ones. But the transaction costs racked up by some public blockchain networks in the crypto-currency markets are a poor advertisement for the commercial economics of tokenisation.

One driver of tokenisation may be oblique. Banks looking to reduce the deadweight of capital costs, for example, may find tokenising unwanted assets yields a higher price (or at least a smaller loss) than selling them to another bank, a central bank, or a hedge or private equity fund.

But in the end tokenisation must meet the test of any innovative technology: the creation, through the application of a novel process, of additional value. It is not cost savings, or transfers of value from incumbents to new entrants, that will determine the survival and success of tokenisation. What will make the difference is the ability to do things that were not possible before.

[1] Base60 analysis. Oliver Wyman/Morgan Stanley data. Wholesale Banking & Asset Management: 2014, 2015, 2016. Based on annual spending for core post-trade and related functions within highly standardized asset classes, cited in DTCC, Re-imagining Post-Trade: No-Touch Processing Within Reach, September 2019.
[2] HQLAx, Accelerating Collateral Mobility, page 8.

[3] Ranjit Kumar Bairagi and William Dimovski, The direct costs of raising external equity capital for US REIT IPOs, Journal of Property Investment & Finance, 21 September 2012.
[4] McKinsey & Company, A year of disruption in the private markets: McKinsey Global Private Markets Review 2021, April 2021.

Written by Dominic Hobson - April 2021