Recently, BIWIN made a prominent appearance at the "Data & Storage Summit 2025" hosted by the industry media outlet DOIT. Its independently developed CXL 2.0 DRAM Module received the "AI Storage Product Gold Award 2025" on the event's Storage Award list.

CXL 2.0 DRAM Module: Breaking the “Memory Wall” in Critical Computing Scenarios
Traditional servers are constrained by the limited number of DIMM slots on the motherboard and by per-channel bandwidth, making it difficult to meet the demands of high-concurrency, large-capacity memory access. BIWIN's CXL 2.0 DRAM Module, built on the open CXL Type-3 (memory expander) specification in the mainstream E3.S 2T form factor, plugs directly into PCIe Gen5 x8 slots that support the E3.S interface and the CXL 2.0 protocol. This enables efficient, low-latency memory expansion and delivers distinct value across multiple mission-critical computing scenarios (a brief sketch of how such a device appears to the host follows the list):
- In data centers, it breaks the traditional coupling of memory to individual CPU sockets, allowing memory resources to be shared and scheduled flexibly across nodes and significantly reducing data-transfer latency. In BIWIN's enterprise-grade lab tests, the product delivered more than 20% higher data throughput and over 30% greater expansion capacity than conventional memory configurations.
- In large-scale AI training environments, it establishes a shared memory access mechanism that lets multiple compute units process ultra-large-parameter models collaboratively and efficiently, alleviating the bandwidth pressure and training delays caused by frequent data synchronization.
- In enterprise server clusters, it empowers individual nodes with dramatically enhanced multitasking capabilities. Whether running real-time AI inference services, high-concurrency databases, or containerized applications, the system can dynamically allocate memory capacity to achieve efficient coexistence of heterogeneous workloads.
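To make the deployment model concrete, here is a minimal sketch assuming a Linux host on which the module has been onlined as system RAM through the kernel's standard CXL/DAX drivers; in that configuration a Type-3 expander typically surfaces as a CPU-less NUMA node that ordinary tools can inspect. The sysfs paths used are standard Linux interfaces, not BIWIN-specific ones.

```python
# Minimal sketch: enumerate NUMA nodes on a Linux host and flag CPU-less
# nodes, which is how a CXL Type-3 memory expander typically appears once
# it has been onlined as system RAM. Uses only standard Linux sysfs paths.
import glob
import os
import re

def node_mem_total_kb(node_path: str) -> int:
    """Parse 'Node N MemTotal: X kB' from the node's meminfo file."""
    with open(os.path.join(node_path, "meminfo")) as f:
        for line in f:
            m = re.search(r"MemTotal:\s+(\d+)\s*kB", line)
            if m:
                return int(m.group(1))
    return 0

for node_path in sorted(glob.glob("/sys/devices/system/node/node[0-9]*")):
    node = os.path.basename(node_path)
    with open(os.path.join(node_path, "cpulist")) as f:
        cpulist = f.read().strip()  # empty for memory-only nodes
    mem_gib = node_mem_total_kb(node_path) / (1024 ** 2)
    kind = "CPU-less (likely CXL-attached)" if not cpulist else f"CPUs {cpulist}"
    print(f"{node}: {mem_gib:.1f} GiB, {kind}")
```

Capacity-hungry workloads can then be steered onto the expander with standard NUMA tooling, for example numactl --membind=<node>.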

Furthermore, the product fully implements the CXL RAS (Reliability, Availability, Serviceability) design philosophy:
- At the link layer: incorporating mechanisms such as link CRC, link retraining, and data-poisoning flags to ensure data integrity over the high-speed PCIe 5.0 channel.
- At the memory layer: integrating standard ECC, demand scrubbing, proactive patrol scrubbing, and hardware-software cooperative page retirement, significantly enhancing system stability during prolonged high-load operation.
- Out-of-band management: leveraging an SMBus out-of-band management interface to provide real-time temperature monitoring and remote telemetry-based O&M, integrating seamlessly with modern data-center intelligent-operations frameworks and delivering full-stack observability and manageability from hardware to platform (a brief illustration follows).
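As a purely illustrative example of the out-of-band path, the following sketch polls a temperature register over SMBus using the Python smbus2 package. The bus number, device address, and register layout are assumptions borrowed from the JEDEC JC42.4 temperature-sensor convention; the module's actual register map is defined by the vendor's management specification.

```python
# Minimal sketch: read a temperature register over SMBus with smbus2
# (pip install smbus2). Bus number, device address, and register offset
# are illustrative assumptions, not the BIWIN module's documented map.
from smbus2 import SMBus

I2C_BUS = 1          # assumption: host SMBus exposed as /dev/i2c-1
DEVICE_ADDR = 0x18   # assumption: management device address
TEMP_REG = 0x05      # assumption: JC42.4-style 16-bit temperature register

def read_temp_c(bus: SMBus) -> float:
    """Decode a JC42.4-style temperature word (0.0625 C per LSB)."""
    word = bus.read_word_data(DEVICE_ADDR, TEMP_REG)
    # SMBus returns the low byte first; the sensor sends MSB first, so swap.
    raw = ((word & 0xFF) << 8) | (word >> 8)
    raw &= 0x1FFF                # 13-bit signed temperature field
    if raw & 0x1000:             # sign bit: apply two's complement
        raw -= 0x2000
    return raw * 0.0625

with SMBus(I2C_BUS) as bus:
    print(f"module temperature: {read_temp_c(bus):.1f} C")
```

A BMC or host agent could run such a poll periodically and feed the readings into the data center's telemetry pipeline.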
To meet diverse server platform deployment needs, BIWIN also offers the SXC-D5-AIC001 CXL AIC adapter as a complementary expansion solution. It provides two DDR5 RDIMM slots, expands memory capacity by up to 2 TB, and enables a smooth, cost-effective migration to a CXL memory architecture on existing server platforms that do not yet natively support E3.S CXL modules.
Building Storage Competitiveness for the AI Era: An End-to-End Technology Stack with Deep Scenario Deployment
Facing the explosive "endpoint-edge-cloud" storage demand driven by the AI boom, BIWIN adheres to its brand philosophy of "Infinite Storage, Unlimited Solutions" and has built a comprehensive product portfolio spanning both smart endpoints and data centers:
- For the endpoint side: BIWIN has already mass-produced LPDDR5X (up to 8,533 Mbps), uMCP, ePOP, PCIe 5.0 SSDs, and high-performance overclocked DDR5 memory modules, widely adopted in AI smartphones, AI PCs, AI glasses, and embodied intelligence devices.
- For the enterprise market: BIWIN has built a complete enterprise-grade storage matrix tailored for AI servers, including CXL memory modules, PCIe Gen4/Gen5 SSDs, SATA SSDs, and RDIMMs, providing a solid foundation for cloud training and edge inference.
Bolstered by integrated IC design, firmware algorithms, and advanced packaging and testing capabilities, BIWIN has developed a highly synergistic full-stack R&D system capable of optimizing performance, power consumption, form factor, and reliability across diverse endpoint AI scenarios. As one of the first companies in the industry to commercialize AI endpoint storage products at scale, BIWIN has become a core storage supplier for several leading AI-glasses manufacturers and is deeply involved in developing and delivering multiple flagship products. The company also holds a dominant market share in emerging fields such as AI smartphones, embodied intelligence, and AI education.
In 2024, revenue from BIWIN’s emerging businesses in edge AI exceeded RMB 1 billion, representing year-on-year growth of approximately 294%, demonstrating tremendous growth momentum. With its comprehensive product layout, technological strength, and deep scenario-based implementation capabilities, BIWIN has established a clear leadership position in the on-device AI market.
