Tokenization – does it really make compliance easier?
Key takeaways
- Tokenization is beginning to embed compliance directly into technical infrastructure, with real-time reconciliation, programmable rules, and identity-linked wallets enforcing regulatory requirements automatically.
- Early tokenized securities are already trading at scale – Kraken’s xStocks have seen billions in volume – but only in markets where regulatory clarity enables them.
- Smart-contract automation promises major efficiencies across settlement, reporting, and AML/KYC, yet these benefits rely on synchronizing on-chain activity with off-chain books and records.
- Regulation remains a “hornet’s nest,” with open questions spanning transfer agents, custody rules, disclosure regimes, and beneficial-ownership reporting across SEC, CFTC, FINRA, OCC, the Fed, and MiFID II.
- Tokenization also introduces new risk vectors – from bridge failures to private-key loss and AI-driven proliferation of assets – and increases the chance that retail users drift further out the risk curve than they realise.
Speakers
Glenn Cross – CCO | Coinbase Asset Management
CJ Rinaldi – CCO | Kraken
Colin Cunningham – Head of Tokenization | Chainlink
Moderator: Antoine Scalia – CEO | Cryptio
Tokenization moves from abstraction to adoption
For years, “tokenization” has lived in the realm of theory and prediction – a future state where every asset exists on-chain, trades instantly, and carries its own programmable rules. At Crypto Finance Forum NYC, we heard from industry insiders how that abstraction is finally beginning to emerge as operational reality.
Kraken CCO CJ Rinaldi described how the exchange now supports trading in more than 50 tokenized equities and ETFs, known as xStocks, issued by Swiss-based Backed under a full prospectus. The assets trade 24/5. They move between Kraken clients. And crucially, they’re portable – transferable even to private wallets.
The adoption numbers surprised even seasoned attendees.
“xStocks launched in July,” Rinaldi noted. “We’ve already seen about $4.7 billion in volume on centralized exchanges, $280 million on DEXs, and more than 36,000 unique holders.”
It’s the first clear evidence that tokenized securities can generate meaningful, organic demand – not just experimental activity.
But this reality comes with a major caveat: they are only available in jurisdictions where the regulatory ground is clear. In Europe, xStocks fall squarely under MiFID II as financial instruments. In the United States, where the regulatory picture remains unresolved, Kraken simply cannot offer them.
This tension – legal clarity in some jurisdictions, uncertainty in others – set the stage for the panel’s deeper exploration of whether tokenization really makes compliance easier in today’s world.
Compliance becomes a feature, not a function
The clearest – and arguably most transformative – vision on the panel came from Glenn Cross, CCO of Coinbase Asset Management. Rather than treating tokenization as a new product wrapper, he framed it as a fundamental redesign of the compliance stack itself. In his view, the shift is not about adding controls, but about where those controls live.
Today, compliance is a process: teams gather data, run checks, produce reports, reconcile breaks, and enforce restrictions. In a tokenized world, Cross argued, much of that becomes behavior encoded directly into the asset.
“You’re going to be able to take compliance as a function and collapse it into a feature — embedded into the execution of the actual assets.”
This isn’t abstract futurism; Cross described in detail what it looks like in practice. Reconciliation, for example, shifts from a backwards-looking activity to a real-time property of the ledger itself – a single source of truth that eliminates the gaps between systems.
“Instead of reconciliation happening after the fact, reconciliation is happening in real time.”
If the ledger is the system of record, then settlement becomes functionally T+0, dramatically reducing operational breaks and exception handling. And regulatory reporting – Form PF, ADV, quarterly statements – can be generated directly from on-chain data rather than reconstructed through a patchwork of back-office systems.
“Imagine your Form PF or ADV just being pulled right off your blockchain, because that’s your system of record.”
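To make the idea concrete, here is a minimal sketch – in Python, with hypothetical event and book data, not any panelist’s actual system – of reconciliation as a property of the ledger: positions are derived straight from on-chain transfer events and compared with the off-chain book, so breaks surface as they occur rather than at period end.

```python
from collections import defaultdict
from decimal import Decimal

# Hypothetical on-chain transfer events (the ledger as system of record).
# In practice these would be streamed from a node or indexer.
onchain_events = [
    {"token": "tACME", "from": None,      "to": "0xFundA", "amount": Decimal("1000")},  # mint
    {"token": "tACME", "from": "0xFundA", "to": "0xFundB", "amount": Decimal("250")},
]

# Off-chain books-and-records balances, e.g. from a fund accounting system.
offchain_book = {
    ("tACME", "0xFundA"): Decimal("750"),
    ("tACME", "0xFundB"): Decimal("200"),   # deliberately stale, to show a break
}

def positions_from_ledger(events):
    """Derive positions directly from on-chain events - no separate statement needed."""
    positions = defaultdict(Decimal)
    for e in events:
        if e["from"] is not None:
            positions[(e["token"], e["from"])] -= e["amount"]
        positions[(e["token"], e["to"])] += e["amount"]
    return positions

def reconcile(events, book):
    """Return every (token, holder) where the off-chain book disagrees with the chain."""
    chain = positions_from_ledger(events)
    keys = set(chain) | set(book)
    return {
        k: {"on_chain": chain.get(k, Decimal(0)), "book": book.get(k, Decimal(0))}
        for k in keys
        if chain.get(k, Decimal(0)) != book.get(k, Decimal(0))
    }

print(reconcile(onchain_events, offchain_book))
# {('tACME', '0xFundB'): {'on_chain': Decimal('250'), 'book': Decimal('200')}}
```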
Crucially, the logic of compliance can move on-chain too. Transfer restrictions such as Reg D lockups or jurisdictional eligibility can be encoded into the token itself, executing automatically rather than relying on downstream controls. Identity-linked wallets extend the same logic into AML and sanctions screening, allowing verification and permissions to happen inside the transaction flow rather than as separate checks.
“If you start tagging digital identities to wallets, AML almost becomes instantaneous.”
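As a rough illustration of what “compliance as a feature” could mean at the transfer level – a Python sketch built on assumed wallet metadata and rule fields, not a production token contract – a restriction check might combine a Reg D style lockup, jurisdictional eligibility, investor accreditation, and sanctions screening of identity-linked wallets, and run before a transfer is allowed to settle:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical identity metadata attached to a wallet (e.g. via a verified credential).
@dataclass
class WalletIdentity:
    address: str
    jurisdiction: str          # ISO country code from KYC
    accredited: bool           # investor-eligibility flag
    sanctioned: bool           # result of sanctions screening

@dataclass
class TokenRules:
    lockup_until: date         # Reg D style lockup end date
    allowed_jurisdictions: set # where the security may be held

def check_transfer(rules: TokenRules, sender: WalletIdentity,
                   receiver: WalletIdentity, today: date) -> list[str]:
    """Return the list of rule violations; an empty list means the transfer may settle."""
    violations = []
    if today < rules.lockup_until:
        violations.append("lockup period still in force")
    if receiver.jurisdiction not in rules.allowed_jurisdictions:
        violations.append(f"receiver jurisdiction {receiver.jurisdiction} not eligible")
    if not receiver.accredited:
        violations.append("receiver does not meet investor-eligibility requirements")
    if sender.sanctioned or receiver.sanctioned:
        violations.append("sanctions screening failed")
    return violations

rules = TokenRules(lockup_until=date(2026, 1, 1), allowed_jurisdictions={"CH", "DE", "FR"})
alice = WalletIdentity("0xA11ce", "CH", accredited=True, sanctioned=False)
bob = WalletIdentity("0xB0b", "US", accredited=True, sanctioned=False)

print(check_transfer(rules, alice, bob, today=date(2025, 6, 1)))
# ['lockup period still in force', 'receiver jurisdiction US not eligible']
```

In this framing, the downstream control does not disappear; it simply executes at the point of transfer rather than in a separate review queue.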
The result is not a world without compliance teams, but a world where their work shifts toward judgment rather than administration – interpreting edge cases, assessing real risks, and managing exceptions instead of assembling reports and reconciling mismatched datasets.
The picture Cross painted is not distant or speculative. It is a blueprint for how tokenization can gradually transform today’s compliance overhead into programmatic market logic – where controls are not applied to assets, but built into them.
The missing link: infrastructure for a hybrid market
Even if tokenization pushes compliance logic into the asset itself, one critical piece still determines whether any of this works in practice: the infrastructure that keeps on-chain activity and off-chain systems aligned.
This isn’t a cosmetic detail – it is what separates an asset that is merely programmable from one that is also auditable.
This is where Chainlink’s perspective added essential depth. Tokenization, argued Colin Cunningham, Chainlink’s Head of Tokenization, is not simply the act of putting assets on-chain. It is the restructuring of how data moves between traditional finance and decentralized environments – a redesign of the connective tissue that links custodians, brokers, funds, and blockchains.
After explaining Chainlink’s broader vision, Cunningham pointed to three infrastructure layers that determine whether tokenization can operate at institutional scale:
1. Moving off-chain data on-chain, reliably
Financial markets depend on high-integrity data: equity prices, reference data, reserve attestations, corporate actions. Tokenization only works if that information can be transported onto blockchains securely and verifiably — without introducing latency, ambiguity, or manipulation risk.
2. Enabling true cross-chain interoperability
Tokenized assets will not live on a single chain. Users will expect to post a token as collateral on one network, trade it on another, and settle it elsewhere. That requires secure, generalized interoperability — not the fragile, siloed bridges that dominate today’s ecosystem.
3. Orchestrating dual books-and-records systems
Perhaps the most complex challenge: when the same asset exists in a custodian database and on a blockchain, both representations must remain synchronized.
As Cunningham put it:
“When you think about Tesla stock — the same asset sitting off-chain and on-chain — you need orchestration to keep those records synchronized.”
Chainlink’s “automated compliance engine” and runtime environment are early attempts to solve this synchronization problem: enabling asset managers to encode compliance rules into smart contracts while maintaining integrity across traditional and blockchain-based records.
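The invariant such an orchestration layer has to maintain is simple to state, even if hard to operate: tokens outstanding across every chain must equal the asset held in custody off-chain. The sketch below (Python, with hypothetical record snapshots – not Chainlink’s engine) shows the basic synchronization check behind that invariant:

```python
from decimal import Decimal

# Hypothetical snapshot of the custodian's (off-chain) record for one security.
custodian_record = {
    "asset": "TSLA",
    "shares_held_in_custody": Decimal("10000"),   # shares backing the token
}

# Hypothetical snapshot of the on-chain side: token supply broken down by chain.
onchain_record = {
    "asset": "tTSLA",
    "supply_by_chain": {"ethereum": Decimal("6000"), "arbitrum": Decimal("3990")},
}

def is_synchronized(custodian, onchain) -> bool:
    """The invariant: tokens outstanding across every chain == shares held off-chain."""
    total_onchain = sum(onchain["supply_by_chain"].values())
    return total_onchain == custodian["shares_held_in_custody"]

if not is_synchronized(custodian_record, onchain_record):
    # In a real system this would halt mint/redeem and raise an alert,
    # not just print a message.
    print("Books-and-records break: on-chain supply does not match custody holdings")
```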
For Cryptio, this hybrid world is not a future scenario — it is the operating reality of tokenized markets today. Tokenization does not eliminate reconciliation. It elevates its importance, because regulators, auditors, and institutions need confidence that every on-chain event is faithfully mirrored in the financial system that surrounds it.
Regulatory clarity: a ‘hornet’s nest’ still to be solved
If the panel agreed on anything, it was this: technology may innovate quickly, but compliance cannot outrun regulation. Tokenization is already proving its operational value – yet its regulatory foundations remain fragmented, inconsistent, and in many places undefined.
Kraken’s CJ Rinaldi described the landscape as a “hornet’s nest” of open questions, where long-standing securities rules collide with the realities of on-chain assets. Many of the uncertainties cut to the core of how securities law even applies in a tokenized environment.
A few of the most pressing unresolved issues include:
Do tokenized securities require transfer agents?
If settlement and recordkeeping are handled on-chain, does the traditional transfer-agent role still apply – or does it become redundant?
What counts as beneficial ownership?
If a tokenized equity mirrors economic exposure but lacks voting rights, does a holder crossing the 5% threshold trigger Schedule 13D reporting?
Is USDC a ‘cash equivalent’ under the custody rule?
A single classification decision could reshape advisory obligations across thousands of RIAs.
What is ‘custody’ in a tokenized environment?
Self-custody, multi-sig, qualified custodians, smart-contract wallets – none map neatly onto the existing frameworks.
And who regulates what?
SEC, CFTC, NFA, FINRA, OCC, the Federal Reserve – each applies different definitions, privacy rules, and supervisory mandates.
For institutions trying to navigate this patchwork, the challenge is not simply compliance – it’s determining what compliance is. As Coinbase’s Glenn Cross put it:
“The spectrum of grey is so broad you’re often deciding between ‘we believe we have a defensible position’ and ‘we genuinely don’t know if we’re over the line.’”
Rinaldi added a blunt reminder of why these decisions matter now, not later:
“Statutes of limitations outlast administrations. We have to get it right now – not hope a future administration sees it the same way.”
Until regulators provide greater clarity – particularly in the United States – tokenization will continue to advance unevenly: accelerating where rules are settled, and stalling where they are not.
Tokenization’s new risk landscape
If tokenization promises to automate large parts of compliance, the panel was equally clear that it introduces a new class of risks – some technical, some behavioral, and some systemic. These risks don’t weaken the case for tokenization, but they do change what regulators and institutions must be prepared to manage.
Bridge and interoperability failures
Moving assets across chains is still fragile. When bridges fail, value can become irreversibly trapped or lost, with no recourse to traditional market infrastructure.
As Cunningham noted:
“Sometimes, it can’t be fixed. That’s a very real risk.”
Private-key and wallet-management risks
Tokenized assets may be far more complex than typical crypto tokens. Yet retail users may still self-custody them – often without the operational discipline required to safeguard private keys.
Speculation bleeding into securitized markets
The “meme-coin playbook” – hype cycles, thin liquidity, and predatory incentives – can spill into tokenized assets if governance is weak or if markets conflate regulated products with permissionless tokens.
AI-driven proliferation of assets
The cost of issuing new assets – legitimate or fraudulent – is collapsing. Cunningham warned that the acceleration curve may be steeper than most expect:
“I said AI will be writing blockchains in five years. A Microsoft guy stood up and said: it’s already happening.”
This raises the likelihood of markets flooded with synthetic, auto-generated, or poorly designed assets.
Risk drift and miscalibrated exposure
Perhaps the subtlest risk is behavioral. Users – especially retail – may not realise how far out the risk curve they have moved as tokenization lowers barriers to access.
Cross captured it plainly:
“You can find yourself out in the wilderness without realising it.”
Fraud and social engineering
Wherever new asset types emerge, new forms of exploitation follow. Tokenization widens the attack surface across issuance, custody, identity, and cross-chain transfers.
Taken together, the message was unambiguous: automation reduces manual work, not responsibility.
As tokenized markets scale, regulators will need sharper tools, institutions will need stronger oversight, and users will need clearer guardrails.
The role of digital identity
During the Q&A, the panel turned to a topic that increasingly underpins every discussion about tokenized assets: digital identity. If tokenization is to operate on open networks while meeting the expectations of regulators, auditors, and market infrastructure, identity may be the linchpin.
Kraken’s CJ Rinaldi noted that regulators are becoming more comfortable with non-documentary verification – identity checks that rely on third-party data sources such as credit bureaus, utility records, or postal databases rather than physical documents. That shift allows for smoother onboarding in digital environments, but it does not soften the underlying requirement:
“KYC is a requirement. There’s no getting around it.”
From Coinbase’s perspective, Glenn Cross argued that the real unlock will come when identity is tied directly to wallets. Identity-linked wallets could enable automated jurisdictional controls, sanctions screening, transfer restrictions, investor eligibility checks, and other elements of regulatory compliance – all executed at the moment a transaction is attempted.
In that world, compliance becomes not just automated but embedded.
But both panelists agreed: despite clear technological momentum, the ecosystem is not yet ready. Standards remain fragmented, privacy models vary widely, and no global framework exists for portable, regulator-accepted digital identity.
Until that foundation is built, digital identity will remain one of the core bottlenecks – and one of the biggest opportunities – for tokenized markets.
Compliance gets easier, oversight gets harder
Tokenization’s promise is no longer theoretical. The panel made clear that embedding compliance rules into the asset itself – from real-time reconciliation to automated reporting and identity-aware transfers – can meaningfully reduce the manual friction that defines today’s regulatory workflows.
But they were equally clear that tokenization shifts, rather than removes, the burden.
New risks emerge. Old definitions no longer fit. Regulatory obligations multiply across agencies. Retail users gain access to instruments they may not fully understand. And the speed and finality of on-chain activity raise the stakes for every operational mistake.
Automation will take care of the mechanics. Oversight must take care of everything else.
For tokenized markets to scale safely, regulators will need clearer frameworks, institutions will need deeper data integrity, and users will need stronger guardrails. And the infrastructure that ties these worlds together – including reconciliation and financial-data platforms like Cryptio – will play an increasingly critical role in ensuring that what happens on-chain is trusted off-chain.
About Cryptio
Cryptio is the leading financial-data platform for crypto accounting, tokenization compliance, and institutional reporting – trusted by more than 450 organizations, including publicly listed treasuries.
For stablecoin and RWA issuers, Cryptio provides independent, daily liability data across all wallets, chains, and mint/burn events, enabling issuers to maintain accurate supply records and meet emerging regulatory and assurance requirements.
By translating on-chain activity into audit-ready, GAAP- and IFRS-compliant records, Cryptio gives finance teams the clarity and control needed to operate tokenized markets with confidence.
Book a demo to see how Cryptio supports audit-ready tokenization compliance and reporting.

