EnCharge AI

Embedded Software Products

Santa Clara, California · 3,584 followers

Where the future of AI compute is being defined and built, to unlock new levels of machine intelligence.

About us

EnCharge AI is a leader in advanced hardware and software systems for AI computing. EnCharge’s robust and scalable next-generation in-memory computing technology provides orders-of-magnitude higher compute efficiency and density than today’s best-in-class solutions, at a fraction of the cost. The high-performance architecture is coupled with seamless software and will make the immense potential of AI accessible in power-, energy-, and space-constrained applications. EnCharge AI launched in 2022 and is led by veteran technologists with backgrounds in semiconductor design and AI systems.

Industry
Embedded Software Products
Company size
11-50 employees
Headquarters
Santa Clara, California
Type
Privately Held
Founded
2022

Locations

  • Primary

    4500 Great America Parkway

    Suite 230

    Santa Clara, California 95054, US

Updates

  • By 2027, Gartner predicts, 40% of AI data centers will face operational constraints due to power availability. This isn’t just a technological roadblock; it’s a critical challenge that could slow AI’s growth and its broader potential.

    AI’s growing energy demands are already reshaping the U.S. power grid. Data centers, especially those supporting AI workloads, consume vast amounts of electricity, sometimes more than entire countries. As AI’s energy needs expand, regions like Northern Virginia, home to dense clusters of data centers, are hitting capacity limits. Utilities are scrambling to meet this surging demand, putting significant strain on existing infrastructure.

    Traditional computing systems are nearing their limits. The von Neumann bottleneck, in which data constantly moves between memory and processors, hinders computational efficiency and escalates energy consumption. Analog in-memory computing offers a solution: it performs calculations directly within memory arrays, cutting out the energy-intensive data movement that slows performance. This can deliver up to 20 times the efficiency of traditional systems.

    We’re creating an energy-efficient computing platform capable of driving sophisticated AI systems without the massive power draw that is becoming unsustainable. Our solutions offer precision, reliability, and scalability, outperforming digital solutions while solving the decades-old problems that previously made analog approaches infeasible.

    AI’s future isn’t about consuming more power; it’s about using energy smarter. By creating more efficient, sustainable computing systems, EnCharge is helping ensure AI’s growth can continue without outstripping the grid’s ability to power it.

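The data-movement argument above can be made concrete with a back-of-envelope model. The per-operation energy figures below are illustrative estimates of the kind reported in hardware surveys — they are assumptions for this sketch, not EnCharge's measured numbers — but they show why eliminating off-chip memory traffic dominates the savings:

```python
# Back-of-envelope model of why data movement dominates AI energy.
# Energy constants are illustrative, literature-style estimates
# (assumptions for this sketch, not vendor-measured figures).
DRAM_ACCESS_PJ = 640.0   # fetch one 32-bit word from off-chip DRAM
MAC_PJ = 4.6             # one 32-bit multiply-accumulate on-chip

def von_neumann_energy_pj(num_macs: int) -> float:
    """Worst case with no operand reuse: every MAC pays a DRAM fetch."""
    return num_macs * (DRAM_ACCESS_PJ + MAC_PJ)

def in_memory_energy_pj(num_macs: int) -> float:
    """Operands already reside in the compute array; only MAC energy is paid."""
    return num_macs * MAC_PJ

macs = 1_000_000  # e.g. one small matrix-vector product
conventional = von_neumann_energy_pj(macs)
in_memory = in_memory_energy_pj(macs)
print(f"Conventional: {conventional / 1e6:.1f} uJ")
print(f"In-memory:    {in_memory / 1e6:.1f} uJ")
print(f"Ratio:        {conventional / in_memory:.0f}x")
```

In practice, caching and operand reuse shrink this worst-case gap, which is why realistic efficiency claims land in the 10-20x range rather than at the no-reuse bound.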
  • EnCharge AI proudly welcomes Jason Huang and Leslie Szeto, MSOD to our leadership team! Following our recent $100M Series B, Jason is joining as VP of Finance and Leslie as Director of HR to help scale our operations as we transition from development to commercialization of our breakthrough AI accelerators.

    Jason brings decades of experience steering tech companies through growth milestones, including leading SupplyShift through its recent acquisition and Arteris through its successful IPO. Leslie comes with 15+ years as a strategic HR leader at companies like Box, Intel, and Adobe.

    These key appointments, along with last month's addition of Dr. Shwetank Kumar as Chief Scientist, strengthen our organizational foundation. Together, they position us to bring our analog in-memory computing technology to market, delivering breakthrough compute efficiency for AI from edge to cloud.

    Please join us in welcoming Jason Huang and Leslie Szeto to EnCharge! https://lnkd.in/e9-TyuWa

  • The world is standing at the edge of an AI innovation gap. Breakthroughs in climate science, drug discovery, and personalized medicine are on the horizon if the industry can deliver the compute capacity to support continued AI model development.

    However, the "memory wall" of conventional compute architectures, where processing and memory units operate separately, creates a debilitating bottleneck: continuous data movement that consumes excessive energy, increases costs, and restricts AI deployment to well-resourced data centers with massive power budgets.

    Ultra-efficient AI processing represents a paradigm shift, integrating computation and memory into unified architectures. By performing calculations directly where data resides, these systems eliminate the bottleneck that has constrained AI's potential in many contexts. This approach reduces energy consumption by up to 20x, increases compute density 9x, and will democratize access to powerful computational tools across industries and applications, from edge to cloud.

    Ultra-efficient AI processing will help empower AI to solve critical problems while significantly reducing costs, energy consumption, and carbon footprint.

  • EnCharge AI reposted this

    View profile for Mayank Daga

    VP, Software | EnCharge AI | I'm hiring

    Over the last few weeks, two trends have emerged: 1) demand for AI inference is exploding, and 2) AI inference is melting GPUs (thanks to the Ghibli craze!). At EnCharge AI, we're tackling this head-on by building energy-efficient inference accelerators.

    We're hiring! I'm looking for passionate minds across the SW stack, from firmware to LLM deployment, including quantization, compilers, and runtimes. Check out our openings: https://lnkd.in/g2efQjsW

    We have multiple openings for each of these roles and are flexible on job locations within the US, Canada, and parts of Europe. Whether you're just starting out or a seasoned pro, let’s connect! #ai #inference #compiler #runtime #llms

  • This year, the U.S. will see military operations shift from centralized to edge-based decision-making. Modern autonomous systems often require instant processing capabilities in diverse environments for efficient and responsive operations. As a result, edge computing is becoming essential in modern defense infrastructure, enabling smart decision-making in the most remote environments. This integration of AI at the tactical edge allows military personnel to receive AI-driven insights, informing human choices in dynamic situations.

    Edge computing also brings important advantages for data security in defense applications. By processing information closer to its source, military operations can enhance their security posture while maintaining operational efficiency. This approach naturally complements existing security frameworks while supporting mission performance.

    As a participant in the Defense Advanced Research Projects Agency (DARPA) OPTIMA program and with backing from RTX, we're working to enable advanced AI to run efficiently on edge devices rather than being confined to data centers. This edge-based approach is key to making military operations more efficient, secure, and responsive.

  • Join our CEO Dr. Naveen Verma for a panel discussion hosted by Normal Computing on "Technology Innovations in Next-Gen Computing, Infrastructure, and Algorithms to Tackle the AI Energy and Silicon Complexity Crises" on April 3rd!

    As AI systems consume exponentially more energy, the semiconductor industry faces unprecedented challenges in meeting computational demands while maintaining sustainability. This exclusive workshop, “Solving the AI Energy and Silicon Complexity Crises with Physics-Based Computing,” will bring together leading voices from research, industry, and government to explore physics-based computing approaches that could reduce AI training energy consumption by 1000x.

    Click the link to learn more and register: https://lu.ma/oi711a3v

  • EnCharge AI reposted this

    View profile for Jimmy Kan

    Partner at Anzu Partners | Hardware & Advanced Materials Investor

    AI's energy consumption is rapidly becoming unsustainable. Data centers are projected to consume 500 terawatt-hours annually by 2027, equivalent to France's entire electricity usage. This isn't just an infrastructure challenge; it's an existential threat to AI advancement.

    This is why Anzu Partners is proud to be an early investor in EnCharge AI, where the team is pioneering a fundamental solution through their analog in-memory computing technology, which reduces energy needs by 10-20x. Their approach, based on decades of research at Princeton University, doesn't just shorten the distance between data and processing; it reimagines the entire compute architecture.

    What currently requires a 1GW power plant could potentially run on just 100MW with their technology. This isn't incremental improvement; it's a paradigm shift that makes AI both powerful and sustainable.

    This is why we're excited about their growing team, including new Chief Scientist Dr. Shwetank Kumar, as they work to ensure AI's future isn't constrained by energy limitations.

    Learn more:
    TerraWatt Timebomb: https://lnkd.in/g_NFqD6e
    Meet Dr. Shwetank Kumar: https://lnkd.in/gRVxWdQ7

    #AI #Innovation #SustainableTech #EnergyEfficiency #GTC25

    Naveen Verma, Kailash Gopalakrishnan, Echere Iroaga, Roman Mueller, Joseph F. Krause, Cathryn Paine, Jonathan Morris, Rohit Iragavarapu, John Kwaak, Siddharth Gupta, Nate Bender, Joon Lee, HoChan Lee, Scout Ventures, Philipp Schams, SIP Global Partners, John Tseng, Kai Tsang, Christy Chou, Silicon Catalyst Angels, Defense Advanced Research Projects Agency (DARPA)

  • Join our CEO, Dr. Naveen Verma, at Normal Computing’s “Solving the AI Energy and Silicon Complexity Crises with Physics-Based Computing” event on April 3rd in NYC! Click here for the full agenda and registration link: https://lu.ma/oi711a3v

    View organization page for Normal Computing

    We are excited to host an exclusive workshop, “Solving the AI Energy and Silicon Complexity Crises with Physics-Based Computing,” on April 3rd, 2025, at Normal’s NYC office, in collaboration with NYC Deep Tech Week.

    AI energy demand is growing rapidly, while the challenge of building more efficient and powerful chips continues to intensify. These two issues give rise to what we call the interconnected AI energy and silicon complexity crises. The AI energy crisis refers to the urgent need to develop new software and hardware approaches to reduce the energy demands of AI training and inference. The silicon complexity crisis highlights the need to rethink semiconductor design to accelerate time-to-market, address labor shortages, and scale the development of application-specific hardware products, all while keeping up with increasing demands.

    The event will feature a series of talks and panel discussions, bringing together a distinguished group of experts from industry, research, and the public sector. We will explore technological innovations in next-gen computing, software and hardware infrastructure, and frontier AI model research. Additionally, we will discuss how to build enduring companies that capitalize on the opportunities arising from addressing these crises, and the critical role of private and public sector partnerships in driving progress.

    Featured guests include:
    Dr. Benjamin Weiner (Fellow, ARPA-E)
    Dr. Logan Wright (Assistant Professor, Yale University)
    Dr. Ahmad Beirami (Research Scientist, Google DeepMind)
    Syrus Ziai (Founder and VP Engineering, Eliyan Corporation)
    Dr. Douglas Durian (Professor, University of Pennsylvania)
    Dr. Suraj Bramhavar (Programme Director, Advanced Research + Invention Agency (ARIA))
    Dr. Omar Khattab (Incoming Assistant Professor, Massachusetts Institute of Technology)
    Dr. Naveen Verma (Co-Founder, EnCharge AI; Professor, Princeton University)
    Dr. Steve Fu (Partner, Celesta Capital)
    Dillon Erb (CEO and Co-Founder, Paperspace)

    You can find the full agenda and registration link here: https://lu.ma/oi711a3v. Spots are limited.

    We thank Hyperstition Incorporated and Andrew Côté from NYC Deep Tech Week.

  • “We're here with agentic AI right now: an incredibly powerful technology that military and intelligence agencies should be implementing today, not tomorrow.”

    At last week's Tectonic Defense Summit, our COO, Echere Iroaga, shared how EnCharge is advancing edge AI solutions. Maintaining superiority in the AI era will require moving AI out of the data center and empowering military and intelligence personnel to leverage real-time insights directly on devices in contested environments like forward bases and war zones.

    With ultra-efficient AI chips capable of running advanced AI models securely on power-constrained devices like phones, laptops, and drones, EnCharge is bringing computational power to the tactical edge. We're proud to be part of an $18.6M Defense Advanced Research Projects Agency (DARPA) grant, and to have funding from RTX Ventures and IQT (In-Q-Tel), to advance next-generation AI capabilities essential for defense applications.

  • EnCharge AI proudly welcomes Dr. Shwetank Kumar to the Executive Team as our new Chief Scientist!

    Dr. Kumar brings exceptional experience in advanced AI models, including LLMs and SLMs, with a distinguished career spanning technical and business leadership roles at Intel, IBM, Synaptics, Orbital Insight, and Turo. As the author of the insightful "AI Afterhours" Substack (which will continue alongside his work with us), he combines deep technical expertise with strategic vision.

    "Having known Shwetank since 2007, I couldn’t be more pleased that he’s joined us at EnCharge," said our CTO and Co-Founder Kailash Gopalakrishnan. "Shwetank has drawn on his exceptional background to become a true thought leader on the future of AI."

    As Chief Scientist, Dr. Kumar will help shape our research agenda, product roadmap, and strategic direction as we continue our mission to deliver unparalleled energy-efficient AI inference solutions from edge to cloud.

    In Dr. Kumar's own words: "EnCharge AI is addressing one of the industry's most pressing concerns: managing AI's growing energy demands sustainably. I am thrilled to join this exceptional team and help deliver the computational efficiency breakthroughs needed for AI's future."

    Please join us in welcoming Dr. Kumar to EnCharge! https://lnkd.in/efNUwPp9

