The market for sensors that track hand movements in conjunction with touch screens is growing explosively as smartphones, tablets and other products increasingly incorporate gesture functionality into their user interfaces (UIs).
Revenue for proximity-based gesture sensors is projected to reach $123 million this year, up from just $42,000 in 2012, according to a MEMS & Sensors Report from IHS Inc. (NYSE: IHS), a leading global source of critical information and insight. The primary application of gesture sensors will be in wireless devices, followed by consumer electronics and automotive. Growth will be strongest in the next few years—as much as 68 percent from 2014 to 2015—with sensor revenue hitting $545 million by 2017.
Current handset-based gesture solutions on the market come in two types—capacitive and infrared (IR) proximity. In gesture-based mechanisms, specialized proximity sensors detect movements in two or three dimensions and recognize hand swipes going left, right, up and down. While the current crop of touch screens requires the direct touch of a finger or stylus on the screen, capacitive gesture controls go further, allowing a user to interact with a device just by being close to the touch screen.
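The swipe recognition described above can be illustrated with a simple sketch. This is not any vendor's actual algorithm or API—just a hypothetical example, assuming a 2x2 array of IR photodiodes where each diode reports the time at which it saw peak reflection from a passing hand:

```python
# Illustrative sketch (hypothetical, not a vendor implementation):
# inferring swipe direction from the times at which each photodiode
# in an assumed 2x2 IR proximity array sees its peak reflection.

def swipe_direction(t_left, t_right, t_top, t_bottom):
    """Return 'left', 'right', 'up' or 'down' from peak times (seconds).

    A hand moving rightward passes the left diode before the right one,
    so t_left < t_right; the axis with the larger time gap determines
    the dominant direction of motion.
    """
    dx = t_right - t_left    # positive -> hand moved left-to-right
    dy = t_bottom - t_top    # positive -> hand moved top-to-bottom
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

# Example: left diode peaks at 0.10 s, right at 0.16 s -> rightward swipe
print(swipe_direction(0.10, 0.16, 0.12, 0.13))  # → right
```

Real gesture ASICs do this filtering and classification in hardware, but the underlying idea—comparing the timing of reflections across spatially separated detectors—is the same.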
Samsung makes a gesture
“The Galaxy S4 from Samsung Electronics represented the first major push toward gesture interface capability in a handset when the smartphone was released this year,” said Marwan Boustany, senior analyst, MEMS & Sensors, for IHS. “This is a step that others in the industry are likely to follow, thanks to the rising availability of gesture solutions from suppliers like U.S.-based Maxim Integrated Products and soon from both Japan’s Sharp and Taiwan-based Capella Microsystems.”
Because IHS does not believe that gesture sensors will be available in low-end handsets in the near term, gesture functionality will be limited to midrange and high-end cellphones. IHS believes that handsets will account for the majority of revenue for gesture sensors from 2013 through 2017, even though cellphones are likely to use just one sensor because of the limited gesture use cases imposed by the size of the handset display.
PC and media tablets, meanwhile, will be the fastest-growing category for gesture sensors, boasting a revenue compound annual growth rate (CAGR) of 76 percent between 2014 and 2017. And unlike handsets, tablets could conceivably make use of multiple gesture packages spread around the tablet’s display in order to provide proper functionality.
With handsets and tablets the main devices for gesture-based sensors, wireless communications as a whole will be the major category for gesture functionality. From 2013 to 2017, gesture-based sensor revenue in wireless devices will grow at a sizable 44 percent CAGR.
Consumer and automotive applications get closer
Beyond handsets and media tablets in wireless, gesture-based sensors will also find growth potential in PC tablets and laptops.
In PC tablets, as in media tablets, opportunity exists for multiple gesture components. Tablets offering Microsoft’s Windows 8 operating system (OS), in particular, could be very friendly to gesture interactions due to the touch-friendly design of the OS.
In the automotive sector, which has already embraced gesture interfaces, a different approach will be employed. Automotive will likely favor using multiple simple photodiode-type sensors with intelligence contained in one application-specific integrated circuit (ASIC). Proximity/IR-based gesture solutions could be used with other solutions—such as the capacitive touch sensors on a display, for instance—to offer more robust performance. Such a solution will be able to sense an approaching finger on a large automotive display screen in the dashboard, for example.
Gesture-based automotive solutions can already be found in Volkswagen's Golf VII, using the Halios integrated circuit from Germany's Elmos Semiconductor, and in General Motors' Cadillac User Experience, or CUE.
Proximity sensors go further
Sensor suppliers are working on more advanced IR proximity-based gesture solutions, IHS believes, which could include 3-D gesture capabilities—adding motion detection in the z-axis.
Further in the future, camera-based gesture recognition could enter the handset and tablet market. The main use case at present for camera-based gesture recognition is for high-accuracy and high-resolution gesture performance—the sort offered by products such as Microsoft’s Kinect.