
Media Alert: BrainChip Introduces aTENNuate in New Technical Paper

BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low power, fully digital, event-based, neuromorphic AI, today released a technical paper detailing aTENNuate, the company’s lightweight deep state-space autoencoder that performs denoising, super-resolution and de-quantization on raw audio, optimized for the edge.

“Real-time Speech Enhancement on Raw Signals with Deep State-space Modeling” further expands BrainChip’s leadership in neuromorphic event-based computing by showcasing the company’s approach to advancing state-of-the-art deep learning audio denoising. The white paper covers state-space modeling, the network architecture, event-based benchmark experiments and future directions for such an approach.

BrainChip’s aTENNuate network belongs to the class of state-space networks the company calls Temporal Event-based Neural Networks (TENNs). As a state-space model, aTENNuate can capture long-range temporal relationships in speech signals with stable linear recurrent units. Learning long-range correlations helps the network capture global speech patterns and noise profiles, and may implicitly capture semantic context that aids speech enhancement performance.
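The following is a minimal, illustrative sketch, not BrainChip’s implementation; the class name, shapes and parameter choices are assumptions. It shows how a stable diagonal linear recurrent unit of the kind used in deep state-space models can carry long-range context across a raw audio stream: keeping every recurrent eigenvalue inside the unit circle keeps the dynamics stable, while the slow decay preserves memory over many samples.

```python
# Illustrative sketch (not BrainChip's code): a diagonal linear recurrent
# state-space layer. Stability comes from keeping every recurrent eigenvalue
# inside the unit circle, which lets the state carry long-range context
# over a raw audio stream sample by sample.
import numpy as np


class LinearRecurrentUnit:
    def __init__(self, input_dim: int, state_dim: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        # Complex diagonal recurrence with magnitudes < 1 => stable dynamics;
        # magnitudes close to 1 decay slowly, giving long memory.
        magnitudes = rng.uniform(0.9, 0.999, state_dim)
        phases = rng.uniform(0.0, 2 * np.pi, state_dim)
        self.lam = magnitudes * np.exp(1j * phases)   # recurrent eigenvalues
        self.B = rng.normal(size=(state_dim, input_dim)) / np.sqrt(input_dim)
        self.C = rng.normal(size=(input_dim, state_dim)) / np.sqrt(state_dim)
        self.D = np.ones(input_dim)                   # direct skip connection

    def __call__(self, x: np.ndarray) -> np.ndarray:
        """x: (time, input_dim) raw samples -> (time, input_dim) features."""
        h = np.zeros(self.lam.shape, dtype=np.complex128)
        out = np.empty(x.shape, dtype=np.float64)
        for t, x_t in enumerate(x):
            h = self.lam * h + self.B @ x_t           # recurrent state update
            out[t] = (self.C @ h).real + self.D * x_t  # readout plus skip
        return out


if __name__ == "__main__":
    # Usage sketch: stream one second of 16 kHz audio through the layer.
    audio = np.random.randn(16000, 1)                 # stand-in for a raw waveform
    layer = LinearRecurrentUnit(input_dim=1, state_dim=64)
    features = layer(audio)
    print(features.shape)                             # (16000, 1)
```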

aTENNuate is the latest technological advancement added to BrainChip’s IP portfolio and an expansion of Temporal Event-based Neural Networks (TENNs), the company’s approach to streaming and sequential data. It complements the company’s neural processor, Akida IP, an event-based technology that is inherently lower power than conventional neural network accelerators. Lower power affords greater scalability and lower operational costs. BrainChip’s Akida™ supports incremental learning and high-speed inference across a wide variety of use cases. Among the markets that BrainChip’s event-based AI technology will impact are the next generation of wearable products, such as earbuds, hearing aids, watches and glasses, as well as AIoT devices in the smart home, office, factory or city.

Paper is available here: https://bit.ly/3Tz1xyx

Code is available here: https://bit.ly/3MUsLMg

To access additional learning resources regarding neuromorphic AI, essential AI, TENNs and other technical achievements, interested parties are invited to visit BrainChip’s white paper library at https://brainchip.com/white-papers-case-studies/

About BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY) BrainChip is the worldwide leader in Edge AI on-chip processing and learning. The company’s first-to-market, fully digital, event-based AI processor, Akida™, uses neuromorphic principles to mimic the human brain, analyzing only essential sensor inputs at the point of acquisition and processing data with unparalleled efficiency, precision and economy of energy. Akida uniquely enables Edge learning local to the chip, independent of the cloud, dramatically reducing latency while improving privacy and data security. Akida Neural processor IP, which can be integrated into SoCs on any process technology, has shown substantial benefits on today’s workloads and networks, and offers a platform for developers to create, tune and run their models using standard AI workflows such as TensorFlow/Keras. In enabling effective Edge compute to be universally deployable across real-world applications such as connected cars, consumer electronics and industrial IoT, BrainChip is proving that on-chip AI, close to the sensor, is the future for its customers’ products as well as the planet. Explore the benefits of Essential AI at www.brainchip.com.

Follow BrainChip on Twitter: https://www.twitter.com/BrainChip_inc

Follow BrainChip on LinkedIn: https://www.linkedin.com/company/7792006

Contacts

Media Contact:

Mark Smith

JPR Communications

818-398-1424

Investor Relations:

Tony Dawe

Director, Global Investor Relations

tdawe@brainchip.com
