
Compute-In-Memory (CIM) Chip Market Size & Share 2026-2035

Market Size - By Memory Technology Type (SRAM-based CIM, DRAM-based CIM, Flash-based CIM, Others), By Architecture Type (Analog CIM, Digital CIM, Hybrid CIM), By Application (Edge AI, Data Center & Cloud AI, IoT & Embedded, HPC & Industrial Automation, Others), By End-User Industry (IT & Telecom, Automotive, Consumer Electronics, Healthcare, Industrial, Others) - Growth Forecast. The market forecasts are provided in terms of revenue (USD).

Report ID: GMI15788 | Published Date: April 2026 | Report Format: PDF


Compute-In-Memory Chip Market Size

The global compute-in-memory (CIM) chip market was valued at USD 500 million in 2025. According to the latest report published by Global Market Insights Inc., the market is expected to grow from USD 687.7 million in 2026 to USD 3.4 billion in 2031 and USD 12.8 billion in 2035, at a CAGR of 38.4% over the forecast period.

Compute-In-Memory (CIM) Chip Market Key Takeaways

Market Size & Growth

  • 2025 Market Size: USD 500 Million
  • 2026 Market Size: USD 687.7 Million
  • 2035 Forecast Market Size: USD 12.8 Billion
  • CAGR (2026–2035): 38.4%
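The headline figures above can be cross-checked with a short compound-growth calculation. This is a sketch using the report's 2026 base of USD 687.7 million and the 9-year horizon to 2035; the function name is illustrative:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by a start value, end value, and horizon."""
    return (end / start) ** (1 / years) - 1

# Report figures: USD 687.7M (2026) -> USD 12.8B (2035), a 9-year span
implied = cagr(687.7, 12_800, 9)
print(f"Implied CAGR 2026-2035: {implied:.1%}")  # ~38.4%, matching the reported rate
```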

Regional Dominance

  • Largest Market: Asia Pacific
  • Fastest Growing Region: Asia Pacific

Key Market Drivers

  • Rapid growth of artificial intelligence and machine learning workloads.
  • Rising demand for energy‑efficient computing.
  • Increasing deployment of edge and embedded computing devices.
  • Limitations of traditional CPU‑ and GPU‑based architectures.
  • Advancements in memory technologies enabling in‑memory computation.

Challenges

  • High design complexity and limited standardization of compute‑in‑memory architectures.
  • Reliability and accuracy concerns in large‑scale in‑memory computation.

Opportunity

  • Integration of compute‑in‑memory chips into next‑generation data‑center architectures.
  • Adoption of compute‑in‑memory solutions in safety‑critical and regulated industries.

Key Players

  • Market Leader: Cerebras Systems led with over 18.2% market share in 2025.
  • Leading Players: The top five players in this market (Cerebras Systems, Samsung Electronics, SK hynix, Intel, and Groq) collectively held a 53.2% market share in 2025.

The growth of the compute‑in‑memory chip market is driven by the increasing need to improve computing efficiency as data volumes and processing intensity rise across modern applications. Growing use of artificial intelligence and data‑intensive workloads, wider deployment of edge and embedded systems, and constraints faced by traditional processor‑centric architectures are pushing demand for compute‑in‑memory solutions.

The compute‑in‑memory chip market is being driven by the growing need for energy‑efficient computing as artificial intelligence continues to push electricity consumption higher across data‑center infrastructure. The International Energy Agency (IEA) estimates that data centres consumed around 415 TWh of electricity in 2024, and projects this figure to almost double to about 945 TWh by 2030 due to AI‑driven workloads. This sharp rise in energy demand is making efficiency at the hardware level a critical concern for operators. Compute-in-memory architecture solves this problem by minimizing data transfer from memory to processors, thus reducing total energy usage. With increased energy prices and growing concerns regarding sustainability, demand for compute-in-memory chips is rising as a practical solution for managing long‑term AI energy requirements.

Additionally, growth in the compute-in-memory (CIM) chip market is further supported by the increasing shift toward edge and embedded computing systems that must operate under strict power and latency limits. As AI processing moves closer to the data source, conventional cloud‑dependent architectures are becoming inefficient and energy‑intensive. This is increasing demand for hardware that can deliver high performance within tight power budgets. In 2025, TSMC introduced its first compute-in-memory macro for edge-AI devices, with a performance efficiency of 188.4 TOPS/W. Such efficiency figures are encouraging rapid adoption of compute‑in‑memory chips across edge devices and embedded AI platforms, directly supporting market growth.
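The 188.4 TOPS/W figure translates directly into energy per operation, which is how CIM efficiency is often compared against conventional accelerators. A minimal sketch of the unit conversion (TOPS/W is tera-operations per second per watt, i.e. 10^12 operations per joule; the helper name is illustrative):

```python
def energy_per_op_femtojoules(tops_per_watt):
    """Energy per operation in fJ: TOPS/W means tops_per_watt * 1e12 ops per joule."""
    joules_per_op = 1.0 / (tops_per_watt * 1e12)
    return joules_per_op * 1e15  # convert joules to femtojoules

# TSMC's reported edge-AI macro efficiency of 188.4 TOPS/W
print(f"{energy_per_op_femtojoules(188.4):.1f} fJ per operation")  # ~5.3 fJ
```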

The compute-in-memory (CIM) chip market increased steadily from USD 123.8 million in 2022 and reached USD 313.1 million in 2024, driven by rapid adoption of artificial intelligence applications, the focus on energy-efficient computing solutions, and the increasing use of edge and embedded computing platforms. During this period, computing architectures shifted toward memory‑centric designs to address power and latency constraints, data‑intensive AI applications scaled across data centers and edge devices, traditional processor bottlenecks became more evident, and advancements in memory technologies enabled practical in‑memory computation, collectively strengthening adoption and supporting sustained market growth.

Compute-In-Memory (CIM) Chip Market Research Report

Compute-In-Memory Chip Market Trends

  • The shift from research‑centric CIM prototypes to application‑specific CIM chips is reshaping market dynamics, with momentum building from 2023 onward. Instead of designing CIM systems applicable across all forms of computing tasks, companies are designing chips for specific applications, such as edge-level AI inference, vision processing, and signal analysis. This trend is likely to persist until at least 2028, driven by demand for optimized performance and power efficiency. As a result, CIM chips are moving closer to real‑world deployment and revenue generation.
  • The integration of compute-in-memory architecture into standard semiconductor memory chips is an emerging trend that began to gain popularity after 2022 due to its simplicity and scalability. Instead of relying on specialized memory technologies, companies are embedding compute functions into existing CMOS‑based memory. This approach lowers production complexity and cost. The trend is expected to continue through 2030, as manufacturers aim for high‑volume commercialization. The impact is faster market adoption and wider use of compute‑in‑memory chips across mainstream computing systems.
  • Adoption of hybrid digital–analog compute‑in‑memory systems is another notable industry trend, with advancements emerging from 2024 as manufacturers aim to balance speed, accuracy, and reliability. All-analog solutions are energy efficient but suffer from stability issues, while all-digital systems provide robust performance at higher power consumption. To resolve this trade-off, suppliers integrate digital control with analog compute-in-memory operations. This trend is expected to continue through 2030, driven by the need for practical, scalable AI hardware. The impact is improved system reliability and increased commercial confidence, supporting wider adoption of compute‑in‑memory chips.

Compute-In-Memory Chip Market Analysis

Compute-In-Memory (CIM) Chip Market Size, By Memory Technology Type, 2022–2035 (USD Million)

Based on memory technology type, the global compute-in-memory (CIM) chip market is segmented into SRAM-based CIM, DRAM-based CIM, flash-based CIM, and others.

  • The SRAM-based CIM segment led the market in 2025, holding a 40.6% share due to its high speed, low latency, and compatibility with advanced logic processes. SRAM‑based CIM architectures are widely adopted in AI accelerators and data‑center workloads where fast, deterministic performance is critical. Their maturity, reliability, and ease of integration with existing CMOS technology support broad commercial deployment, sustaining their leading market position.
  • The flash-based CIM segment is anticipated to grow at a CAGR of 39.7% over the forecast period. Increasing demand for energy‑efficient, non‑volatile computing solutions is going to spur further growth. Flash‑based CIM enables local data storage and computation with reduced power consumption, making it well suited for edge AI and embedded applications. As edge deployments expand, this technology accelerates adoption, driving strong segment growth.

Compute-In-Memory (CIM) Chip Market Revenue Share, By Architecture Type, 2025 (%)

Based on architecture type, the global compute-in-memory (CIM) chip market is divided into analog CIM, digital CIM, and hybrid CIM.

  • The digital CIM segment dominated the market in 2025, valued at USD 217.5 million, supported by its high accuracy, design maturity, and easier integration with existing digital computing systems. Digital CIM architectures are widely preferred for data‑center and enterprise AI workloads where reliability, programmability, and scalability are critical. Strong compatibility with conventional CMOS design flows has enabled large‑scale commercial adoption, sustaining the segment’s leading market position.
  • The analog CIM segment is expected to witness growth at a CAGR of 39.7% during the forecast period. This growth is driven by its ability to deliver significantly higher energy efficiency and computational density for matrix‑intensive AI workloads. Analog CIM architectures perform operations directly within memory arrays, reducing data movement and power consumption. Rising demand for low‑power AI acceleration in edge and specialized applications is accelerating adoption, supporting rapid segment growth.

Based on application, the global compute-in-memory chip market is divided into edge AI, data center & cloud AI, IoT & embedded, HPC & industrial automation, and others.

  • The data center & cloud AI segment led the market in 2025 with a market share of 32.4%, owing to large‑scale deployment of AI training and inference workloads across hyperscale and enterprise data centers. Compute‑in‑memory architectures are increasingly adopted to address memory bandwidth and power efficiency challenges in centralized computing environments. High investment in AI infrastructure and sustained demand for performance optimization support the segment’s dominant market position.
  • The edge AI segment is expected to grow at a CAGR of 39.7% during the forecast period. This growth is supported by rapid proliferation of smart devices requiring real‑time, low‑latency data processing. Growing adoption of AI at the edge across consumer electronics, industrial automation, and intelligent sensors is accelerating segment growth.

U.S. Compute-In-Memory (CIM) Chip Market Size, 2022–2035 (USD Million)
North America Compute-In-Memory Chip Market

North America held a 31.4% share of the compute-in-memory (CIM) chip industry in 2025.

  • The North American compute‑in‑memory chip market is growing due to strong adoption of advanced computing hardware across data centers, AI development hubs, and defense‑related research environments. The region’s focus on high‑performance and energy‑efficient computing to support large‑scale AI workloads is accelerating interest in memory‑centric architectures. Leading hyperscale operators and technology developers are actively exploring compute‑in‑memory solutions to address power and performance constraints.
  • Federal research programs, national laboratories, and close collaboration between chip designers and system integrators are driving early deployment of compute‑in‑memory architectures. As North America prioritizes leadership in AI and advanced computing, the region is expected to remain a key market for compute‑in‑memory chip development and adoption.

The U.S. compute-in-memory (CIM) chip market was valued at USD 99.6 million in 2022 and USD 158.8 million in 2023. The market grew to USD 254.1 million in 2024 and reached USD 407.4 million in 2025.

  • The compute-in-memory (CIM) chip industry in the U.S. is in a growth phase due to strong federal focus on artificial intelligence leadership, advanced computing, and domestic semiconductor capability building. Government‑backed AI, defense, and high‑performance computing programs are increasing demand for memory‑centric architectures that can deliver higher efficiency and performance.
  • Additionally, large‑scale federal initiatives promoting domestic semiconductor manufacturing and next‑generation computing infrastructure are supporting market growth in the U.S. Programs under the CHIPS and Science Act and national laboratory research frameworks are encouraging innovation in novel chip architectures, including compute‑in‑memory solutions. These efforts strengthen the U.S. position as a key development and commercialization hub for compute‑in‑memory technologies in North America.

Europe Compute-In-Memory Chip Market

The Europe market accounted for USD 87.8 million in 2025 and is anticipated to show lucrative growth over the forecast period.

  • Europe’s compute-in-memory (CIM) chip industry is expanding due to the region’s strong focus on energy‑efficient computing and digital sovereignty. European initiatives emphasizing sustainable data centers, AI efficiency, and reduced energy intensity are encouraging adoption of memory‑centric computing architectures.
  • Market growth is further supported by coordinated research and semiconductor innovation efforts across countries such as Germany, France, and the Netherlands. Public–private collaborations, research programs, and support for advanced chip design are helping transition compute‑in‑memory technologies from pilot projects to practical applications. These efforts position Europe as a steadily developing market for compute‑in‑memory chips, particularly within research‑driven and industrial AI use cases.

Germany dominates the Europe compute-in-memory (CIM) chip market, showcasing strong growth potential.

  • The compute‑in‑memory chip market in Germany is growing due to the country’s strong industrial digitalization focus and high adoption of advanced automation technologies. Germany’s manufacturing, automotive engineering, and industrial equipment sectors increasingly rely on real‑time data processing and AI‑driven optimization.
  • Strong collaboration between research organizations, chip developers, and industrial end users is accelerating the transition of compute‑in‑memory technologies from development to practical deployment. These factors position Germany as a key European market for compute‑in‑memory chips, particularly in industrial and automotive‑linked AI applications.

Asia Pacific Compute-In-Memory Chip Market

The Asia Pacific market is anticipated to grow at the highest CAGR of 40.5% during the forecast period.

  • The compute-in-memory (CIM) chip industry in the Asia Pacific region is growing at a high rate, due to the strong concentration of semiconductor manufacturing, electronics production, and AI‑enabled device development. Countries such as China, South Korea, Taiwan, and Japan are rapidly integrating advanced computing architectures to support high‑volume electronics, AI hardware, and consumer devices.
  • Market growth is further supported by aggressive government‑backed digital transformation and semiconductor development programs across the region. The region is investing heavily in domestic chip innovation, AI infrastructure, and advanced manufacturing to strengthen global competitiveness. These initiatives are encouraging local design, testing, and commercialization of compute‑in‑memory chips, positioning the region as a major growth hub for future adoption.

The China compute-in-memory (CIM) chip market is estimated to grow at a significant CAGR within the Asia Pacific market.

  • The compute‑in‑memory chip market in China is growing due to strong state‑led emphasis on artificial intelligence, advanced computing, and data‑centric technologies. Large‑scale deployment of AI across smart manufacturing, urban infrastructure, and digital services is increasing demand for high‑efficiency computing architectures.
  • Government programs encouraging local chip design and innovation are driving exploration of alternative computing models, including compute‑in‑memory. These efforts are positioning China as a rapidly developing market for CIM chips, particularly in AI‑driven industrial, edge, and embedded computing applications.

Middle East and Africa Compute-In-Memory Chip Market

The Saudi Arabia market is expected to experience substantial growth in the Middle East and Africa.

  • The compute-in-memory (CIM) chips industry in Saudi Arabia is growing at a fast pace due to the country’s strong push toward digital transformation and smart infrastructure under Vision 2030. Large‑scale projects such as NEOM and other smart city initiatives rely on real‑time data processing, AI‑driven automation, and energy‑efficient computing.
  • Market growth is further supported by rising investments in data centers, cloud services, and national AI initiatives aimed at building a knowledge‑based economy. As Saudi Arabia expands its digital and computing infrastructure locally, interest is increasing in next‑generation chip architectures that improve efficiency and system scalability. This positions the country as an emerging growth market for compute‑in‑memory chips within the Middle East technology ecosystem.

Compute-In-Memory Chip Market Share

The compute-in-memory (CIM) chip industry is led by players such as Cerebras Systems, Samsung Electronics, SK hynix, Intel and Groq, which together account for 53.2% share of the global market. These players offer highly specialized computing architectures that minimize data movement, improve processing throughput, and enhance energy efficiency for artificial intelligence and data‑intensive workloads across data centers and advanced computing environments.
Their leadership is supported by strong capabilities in memory‑logic integration, proprietary accelerator designs, and scalable system‑level solutions. In addition, long‑term investments in advanced manufacturing, tight hardware‑software co‑design, and application‑focused product development enable these players to meet evolving performance and efficiency requirements, sustaining their leading position in the compute‑in‑memory chip market.

Compute-In-Memory Chip Market Companies

Prominent players operating in the compute-in-memory (CIM) chip industry are as mentioned below:

  • Mythic
  • d-Matrix
  • Rain Neuromorphics
  • EnCharge AI
  • Untether AI
  • Lightmatter
  • IBM
  • Samsung Electronics
  • SK hynix
  • Micron Technology
  • Intel
  • NVIDIA
  • Graphcore
  • Groq
  • Cerebras Systems

Cerebras Systems focuses on wafer‑scale compute architectures that integrate massive on‑chip memory with processing capability. Its approach enables extremely high bandwidth and reduced data movement, making it well suited for large‑scale AI workloads.

Samsung Electronics leverages its deep expertise in advanced memory manufacturing to develop compute‑in‑memory solutions integrated directly into memory architectures. Its strength lies in scalability, high‑volume production capability, and alignment with AI and data‑center use cases.

SK hynix concentrates on memory‑centric computing innovations by embedding processing capabilities within high‑performance memory products. Its strong position in DRAM and next‑generation memory allows efficient support for data‑intensive computing applications.

Intel differentiates through system‑level integration of compute‑in‑memory concepts across processors, accelerators, and memory platforms. Its broad ecosystem approach enables tighter hardware‑software optimization and smoother adoption across enterprise and data‑center environments.

Groq specializes in deterministic, high‑throughput AI compute architectures that minimize latency and data movement. Its architecture emphasizes predictable performance and efficient execution of AI workloads, positioning it well for real‑time inference applications.

Compute-In-Memory Chip Industry News

  • In October 2025, SK hynix launched its next‑generation Accelerator‑in‑Memory (AiM) solution at the AI Infra Summit 2025, demonstrating enhanced processing‑in‑memory performance for AI inference workloads. The solution addresses memory bottlenecks in AI systems by offloading memory‑bound tasks, accelerating commercial adoption of compute‑in‑memory architectures in data‑center inference platforms.
  • In April 2025, Samsung Electronics showcased its LPDDR‑based Processing‑in‑Memory (PIM) solutions at the OCP Global Summit 2025. These solutions integrate compute functionality directly within memory chips, reducing data movement and improving energy efficiency for AI workloads. The development reinforces Samsung’s leadership in commercializing compute‑in‑memory architectures for data‑center and hyperscale platforms.
  • In April 2025, Cerebras Systems partnered with Meta Platforms to power the LLaMA API using its CS-3 wafer-scale processors, delivering up to 18x faster AI inference compared to traditional GPU-based systems. This development highlights the growing shift toward advanced memory-integrated computing architectures that reduce data movement bottlenecks and improve processing efficiency, supporting the adoption of compute-in-memory approaches and driving growth in the compute-in-memory chip market.

The compute-in-memory chip market research report includes in-depth coverage of the industry with estimates and forecasts in terms of revenue (USD Million) from 2022 to 2035 for the following segments:

Market, By Memory Technology Type

  • SRAM-based CIM
  • DRAM-based CIM
  • Flash-based CIM
  • Others

Market, By Architecture Type

  • Analog CIM
  • Digital CIM
  • Hybrid CIM

Market, By Application

  • Edge AI
  • Data center & cloud AI
  • IoT & embedded
  • HPC & industrial automation
  • Others

Market, By End-User Industry

  • IT & telecom
  • Automotive
  • Consumer electronics
  • Healthcare
  • Industrial
  • Others

The above information is provided for the following regions and countries:

  • North America
    • U.S.
    • Canada
  • Europe
    • Germany
    • UK
    • France
    • Spain
    • Italy
    • Netherlands
  • Asia Pacific
    • China
    • India
    • Japan
    • Australia
    • South Korea
  • Latin America
    • Brazil
    • Mexico
    • Argentina
  • Middle East and Africa
    • South Africa
    • Saudi Arabia
    • UAE
Authors: Suraj Gujar, Ankita Chavan
Frequently Asked Questions (FAQ):
What is the market size of the Compute-In-Memory (CIM) chip in 2025?
The market size was USD 500 million in 2025, with a CAGR of 38.4% expected through 2035, driven by increasing edge and embedded system deployments.
What is the projected value of the CIM chip industry by 2035?
The CIM chip market is expected to reach USD 12.8 billion by 2035, propelled by continued AI expansion, memory-centric architecture adoption, and advancements in SRAM.
What is the current CIM chip industry size in 2026?
The market size is projected to reach USD 687.7 million in 2026.
How much revenue did the SRAM-based CIM segment generate in 2025?
SRAM-based CIM led the market with a 40.6% share in 2025, driven by its high speed, low latency, and compatibility with the advanced logic processes used in AI accelerators and data-center workloads.
What was the valuation of the digital CIM architecture segment in 2025?
Digital CIM held the dominant position and generated USD 217.5 million in 2025, supported by high accuracy, design maturity, and seamless integration with existing digital computing.
What is the growth outlook for the analog CIM segment from 2026 to 2035?
Analog CIM is projected to grow at a CAGR of 39.7% through 2035, due to its superior energy efficiency and computational density for matrix-intensive AI workloads.
Which region leads the CIM chip market?
Asia Pacific leads the CIM chip market and is the fastest-growing region, with a CAGR of 40.5% through 2035, driven by strong semiconductor manufacturing and government-backed programs.
What are the upcoming trends in the CIM chip market?
Key trends include the shift from research prototypes to application-specific CIM chips, integration of compute functions into standard CMOS-based memory for scalability, and adoption of hybrid digital-analog CIM systems balancing energy efficiency with reliability through 2030.
Who are the key players in the CIM chip market?
Key players include Cerebras Systems, Samsung Electronics, SK hynix, Intel, Groq, Mythic, d-Matrix, Rain Neuromorphics, EnCharge AI, Untether AI, Lightmatter, IBM, Micron Technology, NVIDIA, and Graphcore.
Compute-In-Memory (CIM) Chip Market Scope
  • Compute-In-Memory (CIM) Chip Market Size
  • Compute-In-Memory (CIM) Chip Market Trends
  • Compute-In-Memory (CIM) Chip Market Analysis
  • Compute-In-Memory (CIM) Chip Market Share
Premium Report Details:

Base Year: 2025

Companies covered: 15

Tables & Figures: 286

Countries covered: 19

Pages: 174

