Computers May Soon Recognize Everyday Objects Through Touch

Meet Project RadarCat, an innovative prototype from the University of St Andrews. Students there have taken Google's Project Soli and created completely new tech – electronics with a sense of touch. Project Soli, which debuted at Google I/O in 2015, is a chip that uses radar to detect hand and finger motions. It gives mobile users a unique way to scroll, twist knobs, or swipe by using hand gestures rather than the touch screens found in today's devices. To read these gestures, the Soli chip emits radar signals and receives their reflections; the signals are then interpreted, and the device carries out the commands the user communicates with his or her hand gestures.

The RadarCat team, however, took Google's chip further, recognizing that different materials produce their own distinctive signals, which Google's Project Soli sensor can read and decipher. By combining machine learning technology – which trains software to identify and classify items in real time – with Project Soli's radar technology, the students of Project RadarCat have been able to produce electronics that can recognize objects through touch. Not only can the software identify an item's structure and material, but RadarCat's device can also detect whether an item is being changed in any manner. For example, if an empty glass resting on the RadarCat sensor is being refilled, the RadarCat software can identify this action and flash REFILL on the screen.
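To make the idea concrete, here is a minimal sketch of the classification step described above. Everything here is an invented illustration: the feature vectors stand in for radar signatures, and a simple nearest-centroid match stands in for the real system's trained machine-learning classifier, which works on Soli's actual multi-channel radar data.

```python
import math

# Toy "trained" signatures: one averaged radar feature vector per
# material. These numbers are invented for illustration only.
SIGNATURES = {
    "glass":     [0.9, 0.2, 0.1],
    "aluminium": [0.1, 0.8, 0.7],
    "wood":      [0.5, 0.5, 0.2],
}

def classify(reading):
    """Return the material whose stored signature is closest
    (by Euclidean distance) to the incoming radar reading."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(SIGNATURES, key=lambda m: dist(SIGNATURES[m], reading))

print(classify([0.85, 0.25, 0.15]))  # closest to the glass signature
```

The same match-against-learned-signatures idea extends to state changes: an empty glass and a glass being filled would produce different signatures, which is how the prototype can flag an event like REFILL.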

This is greatly beneficial in certain areas of technology development, with the Project RadarCat team even pointing out their innovation's ability to aid those with poor vision. The idea, in this case, is that someone who has trouble identifying objects can place an item on the radar sensor and have the device identify it. RadarCat also offers uses in "… areas such as navigation and world knowledge (e.g., low vision users), consumer interaction (e.g., checkout scales), industrial automation (e.g., recycling), or laboratory process control (e.g., traceability)."

Featured photo by David DeHetre
