How Google plans to reinvent the user interface
When the late Apple CEO Steve Jobs introduced the iPhone 12 years ago, he also introduced to many the concept of the multitouch user interface. He emphasized the benefit of using “the pointing device we were all born with”: our fingers.
But there’s one thing even more natural than physically poking something with our fingers: in-the-air hand gestures. All humans use hand gestures to communicate with other people.
Now Google wants you to use them to communicate with all your electronics.
Google is working on something it calls Project Soli, which uses radar to control electronics with in-the-air hand gestures.
Soli is quite amazing, actually
Announced in spring of 2015, Soli enables in-the-air gestures to control smartphones, computers, wearable devices and even cars. The project is in the news this week because the FCC just granted a request by Google to operate Soli radar sensors at higher powers than currently allowed in the U.S. It also gave permission for using Soli devices on airplanes.
Soli emerged from Google’s Advanced Technology and Projects group (ATAP), which itself was created inside Motorola Mobility by former DARPA director Regina Dugan. Google acquired ATAP when it bought Motorola in 2012, but kept it when it sold the company to Lenovo two years later.
Research labs are a dime a dozen in Silicon Valley, throughout the tech industry and in universities everywhere. They’re adept at producing impressive technology that never makes it into real products.
ATAP is different in that all projects are expected to move from conception to shipping product in two years. (It often misses that target, but the point is to move aggressively toward productization.)
Commenters are comparing the technology to Minority Report, which was a good movie from 17 years ago, or Replicas, which is a bad movie currently in theaters. (Here’s the trailer for Replicas, which shows the in-the-air UI.)
Those Hollywood user interfaces involve large, conspicuous and sweeping movements of the arms and hands, whereas Soli enables interaction with slight finger movements. For example, rubbing fingers together (the international hand gesture for “money”) can rotate a virtual dial on a smartwatch, or direct a smartphone to cycle through a series of screens.
Soli works with the use of a special and tiny solid-state chip (8mm x 10mm) that captures 3D motion using radar, then processes it with machine-learning algorithms. The radar passes through fabric, so you could interact with a phone in your pocket or a smartwatch while wearing gloves.
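Google hasn’t published Soli’s signal-processing internals, but the pipeline described above — radar frames in, machine-learned gesture label out — can be sketched at a high level. Everything in this example (the feature choices, the toy nearest-centroid classifier, the gesture labels) is an illustrative assumption, not the actual Soli implementation:

```python
# Illustrative sketch only: Google has not published Soli's internals,
# so the features, classifier and gesture names below are assumptions.
import numpy as np

def extract_features(frames: np.ndarray) -> np.ndarray:
    """Collapse a stack of range-Doppler radar frames, shaped
    (time, range_bins, doppler_bins), into a tiny feature vector:
    total reflected energy plus its centroid along the range axis
    (distance) and the Doppler axis (velocity)."""
    energy = frames.sum()
    range_profile = frames.sum(axis=(0, 2))    # energy per range bin
    doppler_profile = frames.sum(axis=(0, 1))  # energy per velocity bin
    r_centroid = (np.arange(range_profile.size) * range_profile).sum() / energy
    d_centroid = (np.arange(doppler_profile.size) * doppler_profile).sum() / energy
    return np.array([energy, r_centroid, d_centroid])

class NearestCentroidGestureClassifier:
    """Toy stand-in for the machine-learning stage: label a gesture by
    the closest mean feature vector seen during training."""
    def __init__(self):
        self.centroids = {}  # gesture label -> mean feature vector

    def fit(self, examples: dict) -> None:
        # examples: gesture label -> list of recorded frame stacks
        for label, stacks in examples.items():
            self.centroids[label] = np.mean(
                [extract_features(s) for s in stacks], axis=0)

    def predict(self, frames: np.ndarray) -> str:
        f = extract_features(frames)
        return min(self.centroids,
                   key=lambda g: np.linalg.norm(f - self.centroids[g]))
```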
ATAP is the research division that brought us Tango, Ara, Spotlight Stories, Jacquard, Abacus, Vault and others. All these projects, by the way, were either canceled or failed after being brought to market.
The difference with Soli is that all previous failed or canceled ATAP projects aimed to create an entirely new kind of product, or a new, untested platform. For example, Jacquard, which emerged as a fabric UI for the clumsily named Levi’s Commuter Trucker Jacket with Jacquard by Google, failed not because of the horrible name or because the technology was bad, but because the public didn’t take to the untested idea of using a sleeve as a user interface. Or, at least, the public didn’t want to pay a $260 premium for sleeve-based smartphone control.
Soli, however, is likely to show up on platforms that are already popular, such as smartphones and laptops, without a significant rise in price. If users don’t take to it right away, the products that support Soli will probably still succeed in the market anyway.
Stated another way, it’s likely that Jacquard would be considered a success if it had simply been added to the existing $90 Trucker jacket instead of being available only in a special $350 version.
Soli may end up being more similar to Google Assistant, which can be baked into all kinds of things — especially Internet of Things things.
In-the-air gesture support means that random appliances and sensors don’t need screens or buttons, and could be designed with natural, intuitive hand-gesture controls. Imagine an alarm clock that gave you a number of minutes of snooze time based on how many fingers you held up, or smartphone notifications that go away when you make a natural “dismiss” hand gesture.
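To make that concrete, here is a minimal sketch of how a screenless appliance might route recognized gestures to actions. The gesture names and the five-minutes-per-finger snooze rule are assumptions for illustration, not anything Google has announced:

```python
# Hypothetical gesture-to-action wiring for a screenless appliance.
from typing import Callable, Dict

class GestureRouter:
    """Dispatch table from recognized gesture labels to device actions."""
    def __init__(self):
        self._handlers: Dict[str, Callable[[], None]] = {}

    def on(self, gesture: str, handler: Callable[[], None]) -> None:
        self._handlers[gesture] = handler

    def dispatch(self, gesture: str) -> None:
        handler = self._handlers.get(gesture)
        if handler is not None:
            handler()

router = GestureRouter()
router.on("dismiss", lambda: print("notification dismissed"))
for fingers in range(1, 6):  # one to five fingers held up
    router.on(f"hold_{fingers}_fingers",
              lambda n=fingers: print(f"snoozing for {5 * n} minutes"))

router.dispatch("hold_3_fingers")  # -> "snoozing for 15 minutes"
```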
One application nobody is talking about is the possibility of interaction with augmented reality objects. For example, a thermostat could show a virtual dial suspended in the air, then detect when you turn the dial to the right or left (thereby adjusting the temperature).
Soli “virtual tool” gestures already developed include buttons, dials, sliders and others.
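A sketch of the thermostat-dial idea from above, assuming the gesture tracker reports rotation in degrees (Soli’s actual virtual-tool API is not public, so this interface is invented for illustration):

```python
# Illustrative only: map in-the-air dial rotation onto a value.
class VirtualDial:
    """Turn accumulated rotation into discrete 'detent' steps on a
    value, e.g. a thermostat setpoint."""
    def __init__(self, value: float, step: float = 0.5,
                 degrees_per_detent: float = 15.0):
        self.value = value                        # current setpoint
        self.step = step                          # change per detent
        self.degrees_per_detent = degrees_per_detent
        self._accumulated = 0.0                   # rotation since last detent

    def rotate(self, delta_degrees: float) -> float:
        self._accumulated += delta_degrees
        while abs(self._accumulated) >= self.degrees_per_detent:
            direction = 1 if self._accumulated > 0 else -1
            self.value += direction * self.step
            self._accumulated -= direction * self.degrees_per_detent
        return self.value

thermostat = VirtualDial(value=20.0)   # start at 20°C
thermostat.rotate(45.0)                # three detents clockwise -> 21.5°C
```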
Imagine factory workers with augmented reality safety glasses that could conjure all kinds of virtual controls, then use Soli technology to manipulate those controls in the air.
I believe Soli is coming to real products, real soon. Google is working with LG, Qualcomm, JBL and others to build Soli into shipping products. The Soli SDK could be used by device makers of all kinds, as long as those devices are based on Android, Chrome or Google’s next-generation OS, called Fuchsia.
Researchers are using Soli, too.
University of St Andrews researchers in Scotland published a paper in December detailing their work exploring the possibilities of Soli hardware. They demonstrated with their “RadarCat” project that Soli is so precise it can accurately count the number of playing cards in a deck, figure out compass orientation and map out the exact configuration of LEGO bricks.
Soli can identify objects or materials. The researchers showed how Soli could identify apples, oranges and hard drives. It could also figure out whether objects were made of glass, copper or steel, or what kind of beverage had been poured into a glass. (Expect Soli-based coasters in bars.)
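The RadarCat paper’s training pipeline isn’t reproduced here, but the core idea — matching a measured radar signature against a library of known materials — can be sketched. The four-number signatures below are made up for illustration:

```python
# Illustrative only: match a radar signature to known materials.
import numpy as np

# Hypothetical reference signatures; real ones would be learned
# from many measurements per material.
MATERIAL_LIBRARY = {
    "glass":  np.array([0.9, 0.2, 0.1, 0.4]),
    "copper": np.array([0.1, 0.7, 0.9, 0.3]),
    "steel":  np.array([0.2, 0.9, 0.6, 0.1]),
}

def identify_material(signature: np.ndarray) -> str:
    """Return the library entry whose signature best matches the
    measurement, scored by cosine similarity."""
    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(MATERIAL_LIBRARY,
               key=lambda m: cosine(signature, MATERIAL_LIBRARY[m]))

print(identify_material(np.array([0.15, 0.85, 0.65, 0.15])))  # -> "steel"
```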
It can even tell how you’re holding a smartphone, or what part of your body you touch your phone to.
Google gets great feedback
Movement in the Solisphere overshadowed news that Google last week announced a multiyear deal with haptics leader Immersion Corp.
Immersion is the leading independent company that enables game controllers and smartphones to buzz, shake and rumble to give touchy-feely feedback to users.
The deal gives Google the right to use or be protected by any or all of Immersion’s 3,000 or so patents.
This matters, because Motorola/Google lost a patent infringement lawsuit brought by Immersion seven years ago, and that lawsuit centered on “Basic Haptics,” which is mostly simple buzzing. (Immersion also won patent lawsuits against Apple, Microsoft, Samsung, Sony, Fitbit and others.)
The Motorola/Google lawsuit covered only Motorola-branded phones, not Pixel or Android phones, or any of the hardware products Google currently makes.
I think a deal licensing all Immersion patents must have been extremely expensive for Google (terms of the deal were not disclosed).
And I think the deal is related to Soli.
Haptic feedback is a necessary feature of Soli, because in-the-air gestures otherwise don’t provide the psychologically necessary “feedback” that something has actually happened on the device. For example, while turning an imaginary dial in space, you need a clicking sensation on the smartwatch to feel like something is really happening.
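Continuing the earlier dial sketch, the feedback loop might look like this: accumulate rotation and fire one haptic pulse per detent crossed, so the user feels the imaginary dial click. The trigger_haptic_pulse function is a placeholder for whatever actuator API the watch or phone actually exposes:

```python
# Illustrative only: pair an in-the-air dial with haptic "clicks."
def trigger_haptic_pulse(duration_ms: int = 10) -> None:
    """Placeholder for a real actuator call on the device."""
    print(f"click ({duration_ms} ms pulse)")

class HapticDial:
    """Emit one haptic click per detent crossed, so turning the
    imaginary dial feels like turning a physical one."""
    def __init__(self, degrees_per_detent: float = 15.0):
        self.degrees_per_detent = degrees_per_detent
        self._accumulated = 0.0  # rotation since the last click

    def rotate(self, delta_degrees: float) -> None:
        self._accumulated += delta_degrees
        while abs(self._accumulated) >= self.degrees_per_detent:
            trigger_haptic_pulse()
            step = self.degrees_per_detent
            self._accumulated -= step if self._accumulated > 0 else -step

dial = HapticDial()
dial.rotate(40.0)  # crosses two detents -> two clicks
```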
The Soli news plus the Immersion news add up to aggressive moves by Google to reinvent the “touch” user interface by taking the touch out of the equation.
Major user interface change is in the air.