CES may herald 2018 as the year that artificial intelligence (AI) takes off across all end-market segments. As with IoT, AI applications overlap with other technology areas, including IoT itself, cloud computing, augmented and virtual reality (AR/VR), and big data. AI features have already made a big move into smartphones as a market differentiator and are now finding homes in smart-home products (such as digital assistants), medical devices and automobiles.
AI is opening up greater growth opportunities particularly for semiconductor manufacturers. Many of them are developing AI chip platforms and strategies that can be leveraged across multiple market segments.
AI-related applications are driving up the average semiconductor per box for a variety of devices, and are spurring the development of new solutions and services, according to TrendForce. “AI influences the development of semiconductor sector in two ways: creating demand for new type of technologies and improving the product fabrication process,” said Jian-Hong Lin, TrendForce’s research manager, in a statement.
The global AI chipset market is forecast to reach $16.06 billion by 2022, growing at a compound annual growth rate of 62 percent between 2016 and 2022, according to a recent report released by MarketsandMarkets. The high growth rate is attributed to growing AI applications in a variety of market segments along with larger datasets.
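The compound annual growth rate behind that forecast can be checked with a little arithmetic. A minimal sketch (the 2016 baseline below is back-calculated from the reported figures, not stated in the report):

```python
# CAGR arithmetic behind the MarketsandMarkets forecast.
# The 2016 baseline is implied, not a published figure.

def project(base, cagr, years):
    """Future value after compounding `cagr` for `years` years."""
    return base * (1 + cagr) ** years

# Reported: ~$16.06B by 2022 at a 62% CAGR over 2016-2022 (6 years).
implied_2016_base = 16.06 / (1 + 0.62) ** 6  # in $B
print(round(implied_2016_base, 2))  # roughly 0.89, i.e. just under $1B
```

In other words, a 62 percent CAGR implies the 2016 market was under $1 billion, which is consistent with AI silicon being a nascent category at the time.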
We’ll take a look at a handful of new AI chip launches from CEVA, MediaTek, NXP, Samsung, and startup Gyrfalcon. All of these chips in one way or another fall into the AI categories of machine learning, deep learning and neural networks.
Let’s first take a look at CEVA’s new NeuPro family of AI processors for deep learning at the edge. The NeuPro line is segmented into four groups of specialized AI processors for applications such as IoT, smartphones, surveillance, automotive, robotics, medical and industrial. Extending the use of AI beyond machine vision to edge-based applications, CEVA said, these chips are designed to handle deep neural network workloads on-device: facial recognition, AR face filters, intelligent object classification, natural language processing, real-time translation, workflow management, authentication and real-time malware detection.
The CEVA AI processors offer performance ranging from 2 tera operations per second (TOPS) for the entry-level processor to 12.5 TOPS for the most advanced configuration.
“It’s abundantly clear that AI applications are trending toward processing at the edge, rather than relying on services from the cloud,” said Ilan Yona, vice president and general manager of the Vision Business Unit at CEVA, in a statement. “The computational power required along with the low power constraints for edge processing, calls for specialized processors rather than using CPUs, GPUs or DSPs. We designed the NeuPro processors to reduce the high barriers-to-entry into the AI space in terms of both architecture and software. Our customers now have an optimized and cost-effective standard AI platform that can be utilized for a multitude of AI-based workloads and applications.”
The NeuPro architecture combines hardware-based and software-based engines. Each family member supplies a different level of parallel processing:
- NP500 is the smallest processor, including 512 MAC units and targeting IoT, wearables and cameras
- NP1000 includes 1024 MAC units and targets mid-range smartphones, ADAS, industrial applications and AR/VR headsets
- NP2000 includes 2048 MAC units and targets high-end smartphones, surveillance, robots and drones
- NP4000 includes 4096 MAC units for high-performance edge processing in enterprise surveillance and autonomous driving
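The MAC (multiply-accumulate) counts above map directly onto the quoted 2 to 12.5 TOPS range. A minimal sketch of that relationship, assuming the conventional count of 2 operations per MAC per cycle and a hypothetical clock frequency (CEVA does not publish one here):

```python
# How MAC count relates to peak TOPS. Each MAC performs a multiply and
# an accumulate, conventionally counted as 2 ops per cycle. The 1.5 GHz
# clock below is an assumption for illustration, not a CEVA figure.

def peak_tops(mac_units, clock_ghz, ops_per_mac=2):
    """Peak tera-operations per second for a MAC array."""
    return mac_units * ops_per_mac * clock_ghz * 1e9 / 1e12

for name, macs in [("NP500", 512), ("NP1000", 1024),
                   ("NP2000", 2048), ("NP4000", 4096)]:
    print(name, round(peak_tops(macs, 1.5), 2), "TOPS")
```

At that assumed clock, the NP500 lands near the quoted 2 TOPS entry point and the NP4000 near the quoted 12.5 TOPS top end, so the published range is consistent with a roughly 1.5-GHz design.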
NeuPro will be available for licensing to select customers in the second quarter of 2018 and for general licensing in the third quarter of 2018.
Samsung Electronics Co. Ltd. also launched a new processor aimed at AI applications and multimedia content for smartphones and smart devices. The premium application processor, the Exynos 9 Series 9810, features a 2.9-GHz custom CPU, a 6CA LTE modem and deep learning processing capabilities.
Built on Samsung’s second-generation 10-nanometer (nm) FinFET process, the processor has a new eight-core CPU: four are third-generation custom cores that can reach 2.9 GHz while the other four are optimized for efficiency. Single-core performance is enhanced two-fold and multi-core performance is increased by around 40 percent compared to its predecessor, according to Samsung.
“The Exynos 9 Series 9810 is our most innovative mobile processor yet, with our third-generation custom CPU, ultra-fast gigabit LTE modem and deep learning-enhanced image processing,” said Ben Hur, vice president of System LSI marketing at Samsung Electronics, in a statement. “The Exynos 9810 will be a key catalyst for innovation in smart platforms such as smartphones, personal computing and automotive for the coming AI era.”
The Exynos 9810 also is packed with new features for an improved user experience thanks to neural network-based deep learning and stronger security on advanced mobile devices. This enables “the processor to accurately recognize people or items in photos for fast image searching or categorization, or through depth sensing, scan a user’s face in 3D for hybrid face detection,” according to Samsung.
In addition, the LTE modem makes it much easier to broadcast or stream videos at up to UHD resolution, or in newer visual formats such as 360-degree video, said Samsung. Multimedia experiences will also be more immersive thanks to dedicated image processing and an upgraded multi-format codec (MFC).
The Exynos 9 Series 9810 is currently in mass production.
Also targeting immersive multimedia experiences are NXP Semiconductors NV’s new i.MX 8M applications processors, which address sensory-driven experiences shaped by voice, video and audio demand in IoT applications and home automation. This delivers “one platform that combines A/V and machine learning to create connected products that can be controlled via voice command,” said the company.
The upshot: the i.MX applications processors can reduce the command and question response time in smart connected devices, and can be used for smart TVs, television subscription services, sound bars and other smart speakers, streaming media players and DVR/PVRs. The processors can also manage lighting, thermostats, door locks, home security, smart sprinklers, as well as other smart systems and devices.
“Interacting with machines will be as natural as using your human senses,” said Martyn Humphries, NXP’s vice president of consumer and industrial i.MX applications processors, in a statement. “For instance, you can give a voice command to stream a specific TV episode and then ask a contextual question about the actor which initiates a search and displays results on the screen – all while your show is still streaming.”
The i.MX 8M applications processors are available now. An evaluation kit also is available for prototyping.
MediaTek leveraged CES 2018 to unveil its AI platform strategy to enable AI edge computing with its NeuroPilot AI platform for all types of consumer devices from smartphones and smart homes to automobiles. MediaTek’s existing AI solutions target voice assistants, TVs, and autonomous cars.
The NeuroPilot AI platform is focused on several key areas:
- Edge AI Enabler – MediaTek brings AI closer to the chipset level – for devices at the edge of computing – where deep learning and intelligent decisions need to happen faster. This creates a strong hybrid edge-to-cloud AI computing solution.
- Edge AI Efficiency – Through a balance of performance and power efficiency – a hallmark of MediaTek chipsets – MediaTek makes implementing and running AI applications efficient and practical across devices.
- Enhanced AI – MediaTek’s platform uses AI to enhance features and applications people use every day in mobile devices and at home like intelligent camera imaging and voice and image detection or recognition.
- Supports Mainstream AI Frameworks – MediaTek’s AI solution operates in concert with existing neural processing SDKs including Google TensorFlow, Caffe, Amazon MXNet, Sony NNabla and more. At the OS level, MediaTek offers support for Android and Linux.
- Software & Hardware Solution – Along with designing chipset level AI technology – Artificial Intelligence Processing Unit (APU) – MediaTek will introduce an AI SDK. It will allow developers access to SOC level functions to build AI applications and solutions across MediaTek chipsets and MediaTek powered devices.
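The edge-versus-cloud argument running through the list above comes down to a latency budget: on-device inference avoids the network round trip entirely. A back-of-the-envelope sketch, with all numbers purely illustrative assumptions rather than MediaTek figures:

```python
# The edge-vs-cloud latency argument in back-of-the-envelope form.
# All millisecond values are illustrative assumptions.

def response_ms(inference_ms, network_rtt_ms=0.0):
    """Total response time: network round trip plus inference time."""
    return network_rtt_ms + inference_ms

on_device = response_ms(inference_ms=15)                    # local NPU, no network
via_cloud = response_ms(inference_ms=5, network_rtt_ms=80)  # faster server, but RTT dominates
print(on_device, via_cloud)
```

Even when the cloud runs the model faster, the edge device wins whenever the network round trip exceeds the on-device inference penalty, which is the case the hybrid edge-to-cloud pitch is built on.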
“2018 is a new era in device innovation. MediaTek is committed to enabling our partners and customers with technology advancements consumers demand through the power of AI integration with our chipsets,” said Jerry Yu, corporate vice president and general manager of the Home Entertainment Business Group, MediaTek, in a statement. “AI-enhanced technology is quickly becoming part of the consumer’s everyday experience. MediaTek’s AI platform is designed for today’s smart devices and to pave the way for an AI-powered future.”
MediaTek powered devices showcased at CES included those with existing AI capabilities such as Amazon Echo, Android O DTV, Belkin Wemo Smart Plug, and MediaTek Whole Home Coverage Router.
Startup Gyrfalcon launched a family of low-power, high-performance AI processors that touts energy efficiency of 9.3 TOPS per watt and roughly 28,000 parallel computing cores, while eliminating the need for external memory for AI inference. The intelligent matrix processor, Lightspeeur 2801S, is based on an APiM architecture that uses memory as the AI processing unit, which contributes to the processor’s energy efficiency.
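A TOPS-per-watt figure translates directly into a power budget for a given workload. A minimal sketch of that arithmetic (the throughput values below are hypothetical workload points, not Gyrfalcon specifications):

```python
# What a 9.3 TOPS/W efficiency figure implies for power draw.
# The workload throughputs are hypothetical, not Gyrfalcon specs.

def power_watts(tops, tops_per_watt=9.3):
    """Power needed to sustain `tops` at the stated efficiency."""
    return tops / tops_per_watt

for workload_tops in (1.0, 2.8, 5.6):
    print(f"{workload_tops} TOPS -> {power_watts(workload_tops):.2f} W")
```

At this efficiency, sustaining several TOPS stays well under one watt, which is what makes the chip plausible for USB-dongle and battery-powered form factors.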
Lightspeeur 2801S is in production and available to qualified customers. Turnkey reference designs include USB dongles, multi-chip boards and system development kits. The reference designs provide AI platforms for a variety of applications such as mobile edge computing, AI based IoT, consumer portable devices, smart surveillance video, AR/VR products, face detection/recognition, natural language processing, deep learning enabled devices, AI data center servers, and autonomous driving.