Each year I pause to look back and ponder what might have been the most significant new technology to emerge. This year at Google I/O we were introduced to Project Soli. Touch has significantly changed the way we interact with software and the web (as a side note, this year I had the pleasure of meeting James Tagg – the person behind today’s touchscreen technology – who has just released a very interesting book on AI). Project Soli redefines touch, creating ‘touchless’ interfaces driven by gestures; unlike earlier gestural interfaces such as Microsoft’s Kinect, it uses radar rather than cameras (though Kinect is much more than an interface). Its compact size and sensitivity open up a range of new applications, as the technology’s creator Ivan Poupyrev explains in this video:
The sensor can track very fine motions at high speed and with great accuracy, and it uses machine learning, so new gestures can be learnt and added. Soli fits onto a chip, meaning that it can be built into small devices and everyday objects, from watches to washing machines. The first developer kits shipped out in November 2015, with more to follow, so it will be a little while before you see gadgets incorporating the technology, but I’m sure they will be worth the wait when they emerge.
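Google hasn’t published Soli’s recognition pipeline, but as a rough sketch of the idea – learning new gestures from labelled sensor data – a classifier trained on radar-derived features might look something like the following. The feature vectors, gesture labels and choice of model here are entirely hypothetical, purely to illustrate the concept, not Soli’s actual implementation:

```python
# Hypothetical sketch: learning gestures from radar-derived features.
# The data, feature count and labels are invented for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Pretend each sample is a short window of radar-derived features
# (e.g. range, velocity and signal-energy summaries) for one gesture.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 16))                       # 200 labelled windows, 16 features each
y_train = rng.choice(["tap", "swipe", "dial"], size=200)   # made-up gesture labels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Classify a new, unseen window of features as one of the learnt gestures.
new_window = rng.normal(size=(1, 16))
print(clf.predict(new_window))  # e.g. ['swipe']
```

Adding a new gesture in a scheme like this is simply a matter of collecting labelled examples of it and retraining – which is what makes a machine-learning approach so flexible compared with hand-coded gesture rules.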