
Expedera

indie Semiconductor Announces Strategic Investment in AI Processor Leader Expedera

Partnership Capitalizes on Expedera’s Breakthrough AI Capabilities in Support of indie’s ADAS Portfolio
Leverages indie’s Multi-modal Sensing Technology and Expedera’s Scalable NPU IP
Yields Class-leading Edge AI Performance at Ultra-low Power and Low Latency

March 20, 2024 04:05 PM Eastern Daylight Time–ALISO VIEJO, Calif. & SANTA CLARA, Calif.–(BUSINESS WIRE)–indie Semiconductor, Inc. (Nasdaq: INDI), an Autotech […]


Expedera NPUs Run Large Language Models Natively on Edge Devices

Highlights
Expedera NPU IP adds native support for LLMs, including Stable Diffusion
Origin NPUs deliver the power-performance profile needed to run LLMs on edge and portable devices

SANTA CLARA, Calif., Jan. 8, 2024 /PRNewswire/ — Expedera, Inc., a leading provider of customizable Neural Processing Unit (NPU) semiconductor intellectual property (IP), announced today that its Origin […]


AI Silicon IP Provider Expedera Opens R&D Office in Singapore

Highlights
Expedera opens a new Singapore R&D center
The company’s fifth development center, with additional locations in Santa Clara (USA), Bath (UK), Shanghai, and Taipei

SANTA CLARA, Calif., Oct. 24, 2023 /PRNewswire/ — Expedera Inc, a leading provider of scalable Neural Processing Unit (NPU) semiconductor intellectual property (IP), today announced the opening of its newest R&D […]


A Packet-based Architecture For Edge AI Inference

This blog post was originally published at Expedera’s website. It is reprinted here with the permission of Expedera. Despite significant improvements in throughput, edge AI accelerators (Neural Processing Units, or NPUs) are still often underutilized. Inefficient management of weights and activations leads to fewer available cores utilized for multiply-accumulate (MAC) operations. Edge AI applications frequently […]
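
The underutilization point is easy to see with a back-of-the-envelope calculation. The sketch below is a minimal illustration, not Expedera’s model: it assumes a hypothetical NPU (the MAC count, clock, and DRAM bandwidth figures are invented) and estimates how much of the MAC array one convolution layer can keep busy when weight and activation traffic is not hidden behind compute.

    # Back-of-the-envelope MAC utilization for one conv layer on a hypothetical NPU.
    # Every hardware figure here (MAC count, clock, DRAM bandwidth) is an assumption
    # chosen purely for illustration.

    def conv_macs(out_h, out_w, out_c, k_h, k_w, in_c):
        # One multiply-accumulate per output element, kernel tap, and input channel.
        return out_h * out_w * out_c * k_h * k_w * in_c

    def mac_utilization(macs, weight_bytes, act_bytes,
                        mac_units=4096, clock_hz=1e9, dram_bytes_per_s=4e9):
        compute_cycles = macs / mac_units                    # cycles if MACs never starve
        memory_cycles = (weight_bytes + act_bytes) / dram_bytes_per_s * clock_hz
        # If data movement is not overlapped with compute, the array idles while waiting.
        return compute_cycles / max(compute_cycles, memory_cycles)

    # Example: 3x3 conv, 32 -> 64 channels, 112x112 output, int8 weights and activations.
    macs = conv_macs(112, 112, 64, 3, 3, 32)
    weight_bytes = 3 * 3 * 32 * 64
    act_bytes = 114 * 114 * 32
    print(f"Estimated MAC utilization: {mac_utilization(macs, weight_bytes, act_bytes):.0%}")

With these assumed numbers the layer lands at roughly half of peak utilization, which is the kind of gap that better scheduling of weights and activations tries to close.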


A Buyer’s Guide to an NPU

This blog post was originally published at Expedera’s website. It is reprinted here with the permission of Expedera. Choosing the right inference NPU (Neural Processing Unit) is a critical decision for a chip architect. There’s a lot at stake because the AI landscape constantly changes, and the choices will impact overall product cost, performance, and […]


Can Compute-in-memory Bring New Benefits To Artificial Intelligence Inference?

This blog post was originally published at Expedera’s website. It is reprinted here with the permission of Expedera. Compute-in-memory (CIM) is not necessarily an Artificial Intelligence (AI) solution; rather, it is a memory management solution. CIM could bring advantages to AI processing by speeding up the multiplication operation at the heart of AI model execution.
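
For context, the “multiplication operation at the heart of AI model execution” is the multiply-accumulate inside a matrix-vector product: each output neuron is a row of stored weights multiplied element-wise against the input activations and summed. The plain-Python sketch below shows that operation in isolation; it is a conceptual illustration of what CIM performs inside the memory array holding the weights, not a model of any particular CIM design.

    # The operation CIM aims to accelerate: multiply-accumulate over stored weights.
    # Plain-Python matrix-vector product for one fully connected layer.

    def layer_forward(weights, activations):
        # weights: one row per output neuron; activations: the input vector.
        outputs = []
        for row in weights:                      # in a CIM array, each row stays in memory
            acc = 0.0
            for w, a in zip(row, activations):   # multiply-accumulate, element by element
                acc += w * a                     # the w * a step CIM performs in place
            outputs.append(acc)
        return outputs

    # Tiny example: 3 outputs from 4 inputs.
    W = [[0.1, -0.2, 0.5, 0.0],
         [0.3, 0.3, -0.1, 0.2],
         [-0.4, 0.1, 0.0, 0.6]]
    x = [1.0, 2.0, -1.0, 0.5]
    print(layer_forward(W, x))   # approximately [-0.8, 1.1, 0.1]

The nested loop makes plain why the operation is memory-heavy: every multiply needs a stored weight fetched from somewhere, and that weight traffic is what compute-in-memory tries to avoid.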


Expedera Announces LittleNPU AI Processors for Always-sensing Camera Applications

Highlights
Specialized NPU IP makes it easier for device makers to implement feature-rich, always-sensing camera systems.
A dedicated processing solution addresses the power and privacy concerns of always-sensing camera deployments.

Santa Clara, California, July 18, 2023—Expedera Inc, a leading provider of scalable Neural Processing Unit (NPU) semiconductor intellectual property (IP), today announced the availability of […]


Expedera Expands Global Footprint with Its First European Development Center

Highlights
Expedera opens its first European regional engineering development center in the UK
The company’s fourth development center focused on edge AI inference, with locations in Santa Clara (USA), Bath (UK), Shanghai and Taipei

Santa Clara, California, June 27, 2023—Expedera Inc, a leading provider of scalable Neural Processing Unit (NPU) semiconductor intellectual property (IP), today […]


“Using a Neural Processor for Always-sensing Cameras,” a Presentation from Expedera

Sharad Chole, Chief Scientist and Co-founder of Expedera, presents the “Using a Neural Processor for Always-sensing Cameras” tutorial at the May 2023 Embedded Vision Summit. Always-sensing cameras are becoming a common AI-enabled feature of consumer devices, much like the always-listening Siri or Google assistants. They can enable a more natural and seamless user experience, such […]


Sometimes Less is More—Introducing the New Origin E1 Edge AI Processor

All neural networks have similar components, including neurons, synapses, weights, biases, and functions. But each network has unique requirements based on the number of operations, weights, and activations that must be processed. This is apparent when comparing popular networks, as shown in the chart below. Still, in the initial wave of edge AI deployments, many […]
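
The comparison chart referenced above appears in the full post; to give a feel for how quickly weight, MAC, and activation counts diverge between networks, the sketch below tallies them for two toy layer stacks. The layer sizes are invented purely for illustration and do not correspond to any published model.

    # Count weights, MACs, and activations for a stack of fully connected layers.
    # Layer sizes below are invented purely for illustration.

    def count_fc_stack(layer_sizes):
        # layer_sizes like [784, 128, 10] means two FC layers: 784x128 and 128x10.
        weights = macs = activations = 0
        for fan_in, fan_out in zip(layer_sizes, layer_sizes[1:]):
            weights += fan_in * fan_out + fan_out   # weight matrix plus biases
            macs += fan_in * fan_out                # one MAC per weight per inference
            activations += fan_out                  # outputs produced by the layer
        return weights, macs, activations

    # Two toy stacks with very different weight/compute/activation balances.
    for name, sizes in [("small classifier head", [784, 128, 10]),
                        ("wide embedding head", [512, 4096, 4096, 256])]:
        w, m, a = count_fc_stack(sizes)
        print(f"{name}: {w:,} weights, {m:,} MACs, {a:,} activations")

Running the sketch shows the second stack needs nearly 200x the weights and MACs of the first while producing comparatively few activations, which is why a one-size-fits-all NPU configuration can leave performance or silicon area on the table.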

