The emergence of AI at the Edge
We are about to witness the next big technology inflection point, and it is being driven by AI at the edge of the network. We examine how this exciting innovation might transform mobile computing and applications.
When I hear the words Artificial Intelligence, and more recently machine learning, my mind immediately leaps to images from movies like the terrifying synthetic in Alien and the childlike character in Spielberg’s A.I. For those of you who are less of a film fan, it might be Spot, the robot dog from Boston Dynamics that we’ve seen doing backflips. So it was fascinating to hear Intel speak at the recent TOUGHBOOK Innovation Forum about how AI and machine learning at the edge of the network – on our mobile computing devices – are about to be the next major technology inflection point.
In recent years, the much-publicised technology trends have been towards cloud computing, where the data, number-crunching power and “intelligence” of our technology are stored. But in parallel, Intel has been working on the next generation of improvements to our mobile computing devices, called AI at the Edge.
The thinking is that the next generation of smart applications will need much greater processing power, along with AI in the form of machine learning running on the computing device itself, to bring them to life.
The science bit
Two technology breakthroughs are enabling this to happen:
Firstly, the ability to use the Graphics Processing Units (GPUs) in our devices, as well as the Central Processing Units (CPUs), to transform the number of calculations that our devices can process in any given time. For those who like the science, the main difference between CPU and GPU architecture is that a CPU is designed to handle a wide range of tasks quickly (as measured by CPU clock speed) but is limited in how many tasks it can run concurrently. A GPU is designed to render high-resolution images and video quickly and concurrently.
Because GPUs can perform parallel operations on multiple sets of data, they are also ideally suited for non-graphical tasks such as machine learning and scientific computation. Designed with thousands of processor cores running simultaneously, GPUs enable massive parallelism where each core is focused on making efficient calculations.
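To make the difference concrete, here is a minimal sketch in Python contrasting the two styles of computation. NumPy itself runs on the CPU, but the single array expression is exactly the kind of data-parallel operation that a GPU library such as CuPy or PyTorch would spread across thousands of cores at once.

```python
import numpy as np

n = 100_000
a = np.random.rand(n)
b = np.random.rand(n)

# CPU-style thinking: handle one element at a time, sequentially.
def multiply_sequential(a, b):
    out = np.empty_like(a)
    for i in range(len(a)):
        out[i] = a[i] * b[i]
    return out

# GPU-style thinking: one data-parallel operation over every element at once.
# On a GPU, each core would work on its own slice of the array simultaneously.
def multiply_parallel(a, b):
    return a * b

assert np.allclose(multiply_sequential(a, b), multiply_parallel(a, b))
```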
The second major breakthrough is the development of AI and machine learning itself: the ability for software to automatically learn and improve from experience without being explicitly programmed. The two advances combined enable us to take a massive step forward into the next generation of applications.
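As a toy illustration of learning from experience, the few lines below recover a rule from example data rather than having it hand-coded; the data points are invented for the example.

```python
import numpy as np

# "Experience": observed input/output pairs. The underlying rule (roughly
# y = 2x + 1) is never written into the learning step itself.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 3.1, 4.9, 7.2, 8.9])

# A least-squares line fit infers the rule from the examples alone.
slope, intercept = np.polyfit(x, y, deg=1)
print(f"learned rule: y = {slope:.2f}x + {intercept:.2f}")  # close to 2x + 1
```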
Already in use
Intel is using this technology to bring AI to all the devices running its 10th generation and future platforms. For example, Intel’s Threat Detection Technology suite of solutions includes Accelerated Memory Scanning, a machine learning feature to help prevent malware. Traditionally this security task is a very heavy workload on a CPU, as it looks for an ever-growing variety of malware attacks. With Accelerated Memory Scanning, this workload is offloaded to the GPU and machine learning is deployed to make the device ever smarter and more efficient at spotting attacks. A traditional security system would constantly update its list of threats and share it around the network, creating an ever larger workload. But imagine if the device could quickly learn what was normal and what were unusual patterns, and then focus only on the unusual: it would drastically reduce the processing workload, freeing capacity on the device, and be much more efficient at its task.
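Intel has not published the internals of Accelerated Memory Scanning, but the “learn what is normal, focus on the unusual” idea is a standard anomaly-detection pattern. Below is a hypothetical sketch using scikit-learn’s IsolationForest; the feature vectors are invented stand-ins for memory-activity telemetry, not Intel’s actual implementation.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Invented feature vectors summarising "normal" memory-access behaviour.
normal_activity = rng.normal(loc=0.0, scale=1.0, size=(1000, 4))

# Learn the profile of normal behaviour once, up front.
detector = IsolationForest(contamination=0.01, random_state=0)
detector.fit(normal_activity)

# New observations: five normal-looking samples plus one obvious outlier.
new_activity = np.vstack([rng.normal(size=(5, 4)), [[8.0, 8.0, 8.0, 8.0]]])
flags = detector.predict(new_activity)  # 1 = normal, -1 = unusual
print(flags)  # only the unusual pattern needs deeper, more expensive scanning
```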
AI on the edge is also being used to make the latest computing devices smarter in the way they operate for their user. Using machine learning, the device can quickly understand the working habits of its user – for example, recognising when in the working day it is most likely to need applications requiring lots of processing power. The device can then adapt the way its resources are deployed to match the user’s typical workload. The user themselves might spot just a few clues to this technology being at work: a much longer battery life, a device that uses its fan less or remains cooler, and one that remains more reliable throughout its lifetime.
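A drastically simplified sketch of that idea follows: log when heavy workloads occur, learn the typically busy hours, and choose a power profile accordingly. The log, threshold and profile names here are placeholders, not how Intel’s platform actually works.

```python
from collections import defaultdict

# Invented usage log of (hour of day, CPU load %) observations.
usage_log = [(9, 85), (10, 90), (14, 30), (15, 25), (9, 80), (10, 95)]

hourly_loads = defaultdict(list)
for hour, load in usage_log:
    hourly_loads[hour].append(load)

# "Learn" which hours are typically compute-heavy.
busy_hours = {h for h, loads in hourly_loads.items()
              if sum(loads) / len(loads) > 70}

def power_profile(hour):
    # Favour performance when heavy work is expected, battery life otherwise.
    return "performance" if hour in busy_hours else "battery-saver"

print(power_profile(10))  # "performance": heavy work usually happens then
```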
Intel has also already used AI at the edge within its own organisation for instant IT efficiency gains. A small team built a simple telemetry tool to harvest data from its employees’ computing devices to understand more about their performance. Machine learning was used to spot potential issues, which were then automatically corrected via a self-healing application deployed across the devices. Intel calculated that the automated, machine learning solution solved 200,000 computing issues in the space of a year, drastically reducing calls to the helpdesk and allowing employees to continue working without technology interruptions.
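In the same spirit, a bare-bones telemetry and self-healing loop might look like the sketch below. It assumes the psutil library is available for reading device metrics; the threshold check stands in for a trained model, and the remediation step is a placeholder.

```python
import psutil  # assumption: psutil is installed for device telemetry

def collect_telemetry():
    # Harvest a few basic health metrics from the device.
    return {
        "cpu_percent": psutil.cpu_percent(interval=1),
        "memory_percent": psutil.virtual_memory().percent,
        "disk_percent": psutil.disk_usage("/").percent,
    }

def detect_issues(sample, threshold=90.0):
    # A trained model would sit here; a simple threshold stands in for it.
    return [metric for metric, value in sample.items() if value > threshold]

def self_heal(issues):
    # Placeholder remediation: a real tool might clear caches or restart a
    # misbehaving service, then log the fix instead of raising a ticket.
    for issue in issues:
        print(f"Remediating {issue} automatically")

issues = detect_issues(collect_telemetry())
if issues:
    self_heal(issues)
```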
Everyday applications
Outside of the device itself, AI at the edge is already being used in applications such as virtual meetings to automatically reduce visual distractions with blurring, remove background noise or boost screen clarity to super resolution when small-print text documents are shared. In research and development, these advances will also be incredibly useful as our smartest minds crunch data ever faster to deliver health breakthroughs and as the creative industries produce ever more realistic holographic and film entertainment.
But the reality is that the most significant applications and benefits of this type of AI at the edge are probably still to be imagined, and they promise to ultimately transform the way all mobile computing users work. The reassuring thing to know is that the latest mobile computing devices, such as those from TOUGHBOOK, will have the technology inside to take advantage of those developments as they arise.
See the latest Panasonic TOUGHBOOK devices using Intel technology