
§ Private Profile · Santa Clara, CA, USA
Semiconductor company developing AI inference chips and platforms for hyperscalers and enterprises, focused on energy-efficient data center AI.
Based in Santa Clara, California, d-Matrix develops specialized semiconductor chips and hardware platforms designed for artificial intelligence inference workloads in enterprise data centers. The company uses proprietary digital in-memory computing technology in its flagship Corsair platform to run large language models and transformer-based applications with significantly reduced power consumption and latency. A venture-backed enterprise targeting global hyperscalers, d-Matrix has raised a total of $450 million in equity funding, including a $275 million Series C financing round completed in November 2025. The firm is backed by prominent institutional investors such as Playground Global and Nautilus Venture Partners, and maintains strategic commercial partnerships with J.P. Morgan and Gimlet Labs to scale operations. The company was founded in 2019 by semiconductor industry veterans Sid Sheth and Sudeep Bhoja.
d-Matrix has raised $429.0M across 3 funding rounds.
d-Matrix is a technology company developing AI compute solutions optimized for generative AI inference, with a focus on ultra-low latency, high throughput, energy efficiency, and scalability.[1][2][4] Its flagship product, Corsair, is a rack-scale inference platform built on innovations in silicon, software, chiplet packaging, and interconnects; it addresses the unsustainable energy and cost trajectory of current AI systems by enabling enterprises and datacenters to run large-scale workloads sustainably.[2][3][4] d-Matrix serves enterprises adopting GenAI, easing bottlenecks in traditional architectures through memory-compute integration such as Digital In-Memory Computing (DIMC) and 3D stacked DRAM (3DIMC). Recent momentum includes partnerships such as those with Alchip for 3D DRAM and Andes for high-performance accelerators, plus the launch of SquadRack for datacenter-scale inference.[1][2]
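To make the memory-compute integration idea concrete, here is a toy sketch of a weight-stationary, in-memory-style matrix-vector multiply. This is an illustrative model only, not d-Matrix's actual DIMC design: the `MemoryTile` abstraction, tile sizes, and dataflow are assumptions made for the example.

```python
# Toy model of in-memory compute: weights stay resident in "memory tiles"
# and multiply-accumulate is modeled as happening inside each tile, so the
# large weight matrix never streams to a central processor. Hypothetical
# names; not d-Matrix's real architecture.

class MemoryTile:
    """A block of weight rows that never leaves its memory array."""
    def __init__(self, rows):
        self.rows = rows  # weight rows resident in this tile

    def matvec(self, x):
        # Compute happens where the weights live: only the small input
        # vector and the partial results cross the tile boundary.
        return [sum(w * xi for w, xi in zip(row, x)) for row in self.rows]

def dimc_matvec(weights, x, tile_rows=2):
    """Split the weight matrix across tiles and gather partial outputs,
    mimicking how in-memory compute avoids moving weights to a CPU/GPU."""
    tiles = [MemoryTile(weights[i:i + tile_rows])
             for i in range(0, len(weights), tile_rows)]
    out = []
    for tile in tiles:
        out.extend(tile.matvec(x))
    return out

W = [[1, 0, 2],
     [0, 1, 1],
     [3, 1, 0]]
print(dimc_matvec(W, [1, 2, 3]))  # → [7, 5, 5]
```

The point of the sketch is the dataflow: for inference, weights dominate traffic, so keeping them stationary and shipping only activations and partial sums is what reduces energy per operation.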
d-Matrix was founded before the generative AI boom by industry veterans with over 20 years of experience, who had shipped more than 100 million chips and generated over $5 billion in business value.[1] The idea emerged from recognizing generative AI's untapped potential, hindered by inefficient compute platforms; the team set out to build a faster, energy-efficient alternative to push AI beyond current limitations.[1][2] Early traction built on this foresight, with the company quietly innovating in DIMC and chiplet architectures, leading to pivotal announcements like the Corsair platform and recent collaborations that validate their first-principles approach.[2][4]
d-Matrix stands out in AI inference through purpose-built hardware-software innovations:

- Digital In-Memory Computing (DIMC), which integrates compute into the memory arrays to cut data movement
- 3D stacked DRAM (3DIMC) for higher memory bandwidth and capacity
- Chiplet packaging and high-speed interconnects for scalable, rack-level systems
- The Corsair rack-scale platform and SquadRack for datacenter-scale inference

These enable performant, sustainable, and scalable AI without compromising speed or usability.[2][4]
d-Matrix rides the explosive growth of generative AI inference, where demand for real-time, large-scale deployment outpaces legacy GPU/TPU architectures strained by energy consumption and costs.[1][2] Timing is ideal amid AI's "unsustainable trajectory," with market forces like rising datacenter power limits and enterprise GenAI adoption favoring efficient alternatives; their DIMC and chiplet innovations transform inference economics, making high-throughput workloads viable for all company sizes.[2][4] By influencing the ecosystem through open-source tools and partnerships (e.g., Alchip, Andes), d-Matrix accelerates sustainable AI scaling, reducing barriers for broader adoption and countering Big Tech dominance in compute.[1][2]
d-Matrix is poised to capture a slice of the trillion-dollar AI infrastructure market with Corsair's rack-scale deployments and upcoming 3D DRAM integrations, targeting enterprise inference dominance.[2] Trends like model scaling, edge-to-cloud inference, and regulatory pushes for energy-efficient AI will propel them, potentially making the company a key enabler of accessible GenAI across industries. As pioneers redefining compute paradigms, their momentum positions them to deliver on AI's promised future, turning unsustainable hype into widespread reality.[1][2][4]
d-Matrix's investors include Per Roman, Temasek, Jeff Huber, EDBI, Industry Ventures, Michael Stewart, Mirae Asset, Nautilus Venture Partners, Qatar Investment Authority, Amino Capital, Innovacom, M12.
Most recently, d-Matrix raised a $275.0M Series C in November 2025.
| Date | Round | Lead Investors | Other Investors | Status |
|---|---|---|---|---|
| Nov 12, 2025 | $275M Series C | Per Roman, Temasek, Jeff Huber | EDBI, Industry Ventures, Michael Stewart, Mirae Asset, Nautilus Venture Partners, Qatar Investment Authority | Announced |
| Sep 1, 2023 | $110M Series B | Temasek | Amino Capital, Innovacom, M12, Triatomic Capital, TSVC, Sue Xu, Paul McNamara, Sasha Ostojic | Announced |
| Apr 1, 2022 | $44M Series A | M12, Sasha Ostojic, SK Hynix | Audrey Capital, Broadway Angels, C2 Investment, Innovacom, Juxtapose Capital, Locus Ventures, M13, Oyster Ventures, Pillar VC, Quiet Capital, SciFi VC, TSVC, WndrCo, Adrian Aoun, Charlie Cheever, Drew Houston, Emmett Shear, Greg Brockman, Jeff Seibert, John Kobs, Kyle Vogt, Mike Vernal, Tom Blomfield, Wayne Chang, Entrada Ventures, Marvell Technology, Nautilus Venture Partners | Announced |