
Using endpoint AI in vision applications


In 2016, truly high-accuracy facial recognition on a smartphone was a remarkable innovation; today it is close to fully mainstream. Many consumers now see it as a basic smartphone feature, yet it relies on incredibly complex artificial intelligence (AI) technology, and only half of consumers are aware of how essential AI is to enabling it.

Now vision-based AI applications are going one step further, into even smaller edge and endpoint devices. We are beginning to realize automatic office doors that recognize employees, home security systems that alert us to intruders, automated in-store payment systems that recognize us uniquely, and smart public transport systems that help manage rush hour. There is a multitude of opportunities for AI applications at the edge and endpoint.

Edge vs. endpoint

Until relatively recently, most advanced AI processing ran in the cloud. In recent years, however, we have seen that processing expand from the cloud to the edge and endpoint.

What do we mean by edge and endpoint? Edge devices live at the network or cloud edge, before wireless signals reach endpoint devices. Endpoint devices sit at the final end of a communication link, such as a home thermostat or smart speaker.

As AI’s usefulness continues to expand, edge and endpoint AI are distributing workloads from the cloud to smaller devices. This digital revolution will bring new processing topologies and a new wave of smarter, life-changing devices, while also reducing the network bandwidth required to communicate with the billions of devices connected to the cloud.

Endpoint AI has the ability to change our lives

Endpoint AI has great potential to create new and deeper benefits across medicine, education, security, and a range of other areas affecting nearly all aspects of our daily lives.

Think of an autonomous driving system that detects and avoids collisions by recognizing objects, or takes over from the driver in an emergency. Or a portable scanner that detects medical symptoms in areas without readily accessible healthcare infrastructure.

In these scenarios, every second often counts, and they become possible only if endpoint devices are faster, smarter, more secure, and more reliable. These AI-based applications can sound futuristic, but they are closer than many consumers realize.

To see the benefits of these applications, though, tiny devices must deliver low-power, high-performance compute.

Enabling AI everywhere – no matter the device size

If we are to realize these future AI use cases, devices need increased functionality, and that will only be possible with enhanced ML performance: systems need on-device ML capabilities while maintaining or improving power efficiency.

For instance, to give AI hardware and software developers more ways to innovate with endpoint AI, earlier this year we announced the Arm Cortex-M55 and Ethos-U55 processors, which together deliver up to a 480x leap in ML performance, bringing endpoint AI to billions more devices. These processors increase ML performance and help run high-performance applications on an endpoint device without a constant connection to the cloud.
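To make this concrete, here is a minimal sketch of what on-device inference can look like with TensorFlow Lite for Microcontrollers, a runtime commonly used on Cortex-M-class devices. The model array, arena size, and operator list below are illustrative assumptions rather than a reference implementation, and exact APIs vary between runtime releases.

```cpp
// Minimal on-device image-classification sketch using TensorFlow Lite for
// Microcontrollers. g_model_data, the arena size, and the operator list are
// hypothetical; a real application would match them to its own model.
#include <cstdint>
#include <cstring>

#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

extern const unsigned char g_model_data[];  // model baked in as a C array

constexpr int kArenaSize = 100 * 1024;  // scratch memory for all tensors
alignas(16) static uint8_t tensor_arena[kArenaSize];

// Runs one inference on a quantized camera frame; returns the best class.
int ClassifyFrame(const int8_t* frame, size_t frame_len) {
  const tflite::Model* model = tflite::GetModel(g_model_data);

  // Register only the operators the network uses to keep the binary small.
  static tflite::MicroMutableOpResolver<4> resolver;
  resolver.AddConv2D();
  resolver.AddDepthwiseConv2D();
  resolver.AddFullyConnected();
  resolver.AddSoftmax();

  static tflite::MicroInterpreter interpreter(model, resolver, tensor_arena,
                                              kArenaSize);
  if (interpreter.AllocateTensors() != kTfLiteOk) return -1;

  // Copy the frame into the input tensor and run the model, all on-device.
  TfLiteTensor* input = interpreter.input(0);
  std::memcpy(input->data.int8, frame, frame_len);
  if (interpreter.Invoke() != kTfLiteOk) return -1;

  // Pick the highest-scoring class; only this result, not the raw image,
  // ever needs to leave the device.
  const TfLiteTensor* output = interpreter.output(0);
  const int num_classes = output->dims->data[output->dims->size - 1];
  int best = 0;
  for (int i = 1; i < num_classes; ++i) {
    if (output->data.int8[i] > output->data.int8[best]) best = i;
  }
  return best;
}
```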

Just last month, we added the Arm Ethos-U65 to our microNPU line. It maintains the power efficiency of the Ethos-U55 while extending its applicability from Arm Cortex-M to Cortex-A and Neoverse-based systems, and it delivers twice the on-device machine learning performance.

Incorporating a microNPU such as the Ethos-U55 or Ethos-U65, designed specifically to accelerate ML inference in area-constrained embedded devices, lets neural networks run with extremely low silicon area and power consumption. This is crucial for small, power-constrained devices such as smartwatches, allowing them to run more demanding, complex, and advanced applications.
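When a microNPU is present, the model is typically prepared ahead of time. The short sketch below, which assumes Arm’s Vela compiler and the TensorFlow Lite for Microcontrollers Ethos-U port, shows how little of this the application code actually sees: Vela folds the NPU-supported operators into a single custom operator, the application registers that operator, and the runtime hands those layers to the NPU driver while any remaining layers run on the CPU as before.

```cpp
// Sketch of the Ethos-U path, assuming the TensorFlow Lite for
// Microcontrollers Ethos-U port. The quantized model is first compiled
// offline with Arm's Vela tool, for example:
//   vela model_int8.tflite --accelerator-config ethos-u55-128
// which folds all NPU-supported layers into one custom "ethos-u" operator.
static tflite::MicroMutableOpResolver<2> resolver;
resolver.AddEthosU();   // dispatch the compiled subgraph to the microNPU
resolver.AddSoftmax();  // layers Vela leaves on the CPU still need kernels
```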

Optimizing the developer experience will unlock endpoint AI innovation

For endpoint AI innovation to become a reality, software and tools need to help developers move from inspiration to prototype and production quickly. This need for streamlined tools is all the more urgent because demand for new ML designs is held back by tools and an ecosystem that are not simple to use. As with many modern technologies, ecosystem collaboration is crucial to making it easier for developers to deploy endpoint AI.

In October, Arm and Microsoft announced a new effort to accelerate the deployment of AI across billions of IoT devices. The collaboration will focus on optimizing and accelerating the complete AI workload development lifecycle, from training and tuning ML models on Azure Cloud, to optimizing, deploying, and running those models across any Arm-based endpoint device.

Putting the developer experience first and foremost will empower innovators to deliver better solutions and a better future for all of us.

Security cannot be an afterthought 

As developers are equipped with the tools needed to create innovative, vision-based AI applications, it’s important that both privacy and security stay front of mind. For example, data gathered by a facial recognition home security system must be protected, and that protection should start at the chip level. Arm’s processor IP and microNPUs can ensure this sensitive data stays on the endpoint system rather than having to be sent to the cloud.

Security is a shared responsibility, and it’s critical that the industry works together to ensure all this new data is generated by trusted devices. Industry initiatives and independent schemes such as PSA Certified are recommended in government guidance, including that from the National Institute of Standards and Technology (NIST) in the US.

This shift in computing to the edge and endpoint will enable completely new AI capabilities, creating an explosion of intelligent, life-enhancing applications. The possibilities are endless for the future of endpoint AI.
