DTCC Rejects ‘Walled Garden’ Approach to Tokenization, Emphasizes Open, Interoperable Future
The Depository Trust & Clearing Corporation (DTCC), the backbone of U.S. post-trade market infrastructure, is making it clear that its vision for tokenized securities will not be confined to a single blockchain or a closed ecosystem.
Despite being a legacy institution that processes around $10 trillion in securities transactions every day, DTCC is positioning its digital assets strategy around flexibility and cross-chain connectivity, according to Nadine Chakar, Global Head of DTCC Digital Assets.
Speaking at a virtual forum, Chakar underscored that while DTCC’s approach is firmly grounded in traditional financial risk controls and data standards, it is equally focused on ensuring that tokenized assets can move freely across different blockchain networks.
“We’re not building walled gardens,” Chakar said. “Interoperability, for me, is being able to move things seamlessly from one chain to another, without risk or extra expenses.”
Tokenization Without Lock-In
For many financial institutions experimenting with distributed ledger technology, the temptation is to build proprietary platforms that lock users and assets into specific networks. DTCC is explicitly rejecting that model.
Its goal is to design tokenization solutions that can interact with multiple chains instead of binding clients to a particular provider, protocol, or ecosystem. Practically, this means DTCC wants tokenized securities to live within a broader, connected infrastructure rather than in isolated silos, whether those silos are permissioned enterprise chains or public networks.
In Chakar’s view, the promise of tokenization will only be realized if assets can circulate freely between different environments, much like securities and cash already move among custodians, central securities depositories, and clearinghouses today.
Legacy Infrastructure Meets Digital Assets
DTCC’s cautious but constructive stance is shaped by its role at the core of the traditional financial system. With vast volumes of trades passing through its pipes each day, the firm cannot afford operational or legal missteps—especially not in a nascent area like digital assets.
That’s why Chakar repeatedly emphasized the primacy of risk management and standardized data. Tokenization, in DTCC’s framework, must integrate with robust governance, reliable record-keeping, and regulatory compliance, not bypass them.
Far from abandoning legacy processes, DTCC is looking to extend them into the digital asset world. The aim is to preserve the reliability and predictability of traditional market infrastructure while modernizing how ownership and settlement are represented and executed.
Interoperability as a Design Principle
For DTCC, interoperability isn’t a buzzword; it is a core requirement in the architecture of any tokenized product or platform it touches.
In practice, that includes:
– The ability for the same asset to be represented on different chains without creating conflicting records.
– Mechanisms to synchronize state and prevent double counting or double spending across networks.
– Common data models so that information about a tokenized asset is consistent and machine-readable regardless of the underlying technology.
– Controls to ensure transfers between chains do not introduce new operational or counterparty risks.
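The second of those requirements, preventing double counting when an asset moves between networks, can be illustrated with a toy "burn-and-mint" model: units are removed from the source chain's ledger and created on the destination's, so the global supply is invariant. This is a minimal sketch under assumed names (`ChainLedger`, `BridgeController` are hypothetical), not a description of DTCC's actual architecture.

```python
from dataclasses import dataclass, field

@dataclass
class ChainLedger:
    """Token balances for one network (illustrative)."""
    name: str
    balances: dict = field(default_factory=dict)

    def total(self) -> int:
        return sum(self.balances.values())

class BridgeController:
    """Moves units between chains by debiting the source ledger and
    crediting the destination, so global supply never changes."""
    def __init__(self, *ledgers: ChainLedger):
        self.ledgers = {l.name: l for l in ledgers}

    def global_supply(self) -> int:
        return sum(l.total() for l in self.ledgers.values())

    def transfer(self, holder: str, amount: int, src: str, dst: str) -> None:
        source, dest = self.ledgers[src], self.ledgers[dst]
        if source.balances.get(holder, 0) < amount:
            raise ValueError("insufficient balance on source chain")
        before = self.global_supply()
        source.balances[holder] -= amount                                   # "burn" on source
        dest.balances[holder] = dest.balances.get(holder, 0) + amount       # "mint" on destination
        # Invariant check: a cross-chain move must not create or destroy units.
        assert self.global_supply() == before, "double counting detected"

# Usage: move 250 units from a permissioned chain to a public one.
chain_a = ChainLedger("permissioned", {"fund": 1_000})
chain_b = ChainLedger("public", {})
bridge = BridgeController(chain_a, chain_b)
bridge.transfer("fund", 250, "permissioned", "public")
print(chain_a.total(), chain_b.total(), bridge.global_supply())  # 750 250 1000
```

Real bridge designs add atomicity, finality proofs, and authorization on top of this invariant; the point of the sketch is only that the "same asset on different chains" requirement reduces to keeping one consistent supply across all ledgers.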
Chakar stressed that moving digital assets “seamlessly” is not just about speed or convenience. It’s about ensuring that each hop between networks remains safe, auditable, and cost-effective—key expectations for institutional participants.
Balancing Innovation With Regulation
As a critical financial market utility, DTCC sits at the intersection of innovation and oversight. Any tokenization strategy must satisfy regulators’ demands for transparency, resilience, and investor protection.
This means that while DTCC is open to engaging with both public and private blockchains, it will insist on clear controls over:
– Identity and access management
– Settlement finality
– Custody arrangements
– Data privacy and retention
– Business continuity in the event of outages or network splits
Tokenization, in DTCC’s conception, is not about circumventing regulation but about enhancing existing processes. It should make securities markets more efficient and transparent while still aligning with the legal frameworks that govern today’s financial system.
Why DTCC’s Stance Matters for Institutions
DTCC’s refusal to build “walled gardens” sends a strong signal to banks, asset managers, and market infrastructures that are weighing their own digital asset strategies.
Institutions have been wary of becoming dependent on a single vendor or blockchain, particularly in an environment where technologies and protocols change rapidly. An interoperable approach reduces the risk of technological lock-in and helps ensure that tokenized assets remain usable even as the underlying infrastructure evolves.
For large market participants, DTCC’s stance offers a blueprint: experiment with tokenization, but do it in a way that is chain-agnostic, standards-driven, and designed for integration with the broader market ecosystem.
The Role of Standards in an Interoperable Future
Underpinning DTCC’s strategy is an emphasis on common standards—for data, messaging, and asset representation. Without agreed formats and taxonomies, interoperability becomes fragile or purely theoretical.
Standardization enables different networks and platforms to “speak the same language,” making it easier to:
– Reconcile positions and transactions across systems
– Share reference data about assets and counterparties
– Automate compliance checks and reporting
– Plug tokenized workflows into existing back-office systems
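The reconciliation point above is easiest to see with a toy example: once two systems describe positions with the same fields and identifiers, comparing them becomes a set difference. The schema below (`TokenizedPosition` with an ISIN, holder, and quantity) is a generic assumption for illustration, not a published DTCC data model.

```python
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen makes instances hashable, so sets work
class TokenizedPosition:
    isin: str        # standard instrument identifier (ISO 6166)
    holder: str      # counterparty identifier
    quantity: int    # units held

def reconcile(system_a, system_b):
    """Return the positions that do not match between two systems.
    With a shared data model, reconciliation is a set difference."""
    a, b = set(system_a), set(system_b)
    return {"only_in_a": a - b, "only_in_b": b - a}

# Usage: a custodian's books vs. an on-chain view, with one quantity break.
custodian = [TokenizedPosition("US0000000001", "FUND-1", 500),
             TokenizedPosition("US0000000002", "FUND-1", 200)]
chain_view = [TokenizedPosition("US0000000001", "FUND-1", 500),
              TokenizedPosition("US0000000002", "FUND-1", 150)]

breaks = reconcile(custodian, chain_view)
print(breaks["only_in_a"])  # the 200-unit record with no matching chain entry
```

Without an agreed schema, each pair of systems needs bespoke mapping logic before any comparison can happen, which is exactly the fragility the article describes.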
DTCC’s long history in shaping and adopting financial standards puts it in a strong position to help coordinate how tokenized securities are described, processed, and settled across the industry.
Tokenization as an Evolution, Not a Revolution
Chakar’s comments also reflect a broader philosophical point: for core market infrastructure providers, tokenization is an evolution of existing capabilities, not a wholesale replacement.
The core functions—clearing, settlement, custody, and record-keeping—remain essential. What changes is the way assets are represented and how quickly and efficiently they can move.
By combining blockchain-based representation with tried-and-tested risk controls, DTCC aims to deliver benefits like:
– Faster or more flexible settlement cycles
– Improved transparency into asset ownership and transaction history
– Potentially lower operational costs over time
– Enhanced automation via smart-contract-like logic, where appropriate
But all of this must be done with the same level of robustness that institutional investors and regulators already expect.
The Road Ahead for DTCC and Tokenized Markets
Although DTCC’s digital assets initiative is still evolving, the direction is clear: tokenized securities will not be trapped inside proprietary networks that can’t talk to each other.
Instead, DTCC is working toward an environment where:
– Clients can choose different chains based on their needs and still interact with the same underlying asset.
– Infrastructure-level services—such as clearing, settlement, and asset servicing—extend smoothly into the tokenized world.
– Interoperability is embedded from the start, reducing fragmentation and complexity over time.
For the broader market, this approach could be critical to moving from small-scale pilots to genuine, large-scale adoption of tokenized assets.
What It Means for the Future of Finance
If DTCC succeeds in building interoperable tokenization infrastructure, it will reinforce a central idea: digital assets can coexist with, and strengthen, traditional finance rather than replace it.
A world where tokenized securities can move securely and efficiently across multiple networks—without being restricted to “walled gardens”—could lead to:
– Greater liquidity across markets and venues
– More flexible product design and distribution
– Better integration between traditional financial instruments and emerging digital-native assets
Chakar’s message is that this future is only viable if core principles—risk control, data integrity, and interoperability—are respected from the start. Tokenization, in this vision, is not an experiment at the periphery of the financial system. It is a gradual retooling of its core, built to be open, connected, and durable.