Wearable Devices has announced that it showcased the use of neural commands from the brain to operate Spot, the autonomous construction robotics solution, at the 2022 Trimble Dimensions+ Conference, held on November 7-9, 2022, in Las Vegas.
At the conference, the operator controlled aspects of the robot’s motion using neural input signals for discrete or continuous control. Discrete control involves moving a single finger or tapping a finger combined with a twist of the wrist. Continuous control lets the operator steer the robot by applying graded pressure of the index finger against the thumb while moving the wrist. These gestures make the robot rise, sit, stand, and move in any direction.
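The split between discrete and continuous control described above can be sketched roughly as follows. This is a minimal illustration, assuming a decoded-gesture event model; the `Gesture` type, gesture names, and command vocabulary are all hypothetical, not the actual Wearable Devices or Spot API.

```python
from dataclasses import dataclass

@dataclass
class Gesture:
    kind: str          # "discrete" (e.g. a finger tap) or "continuous"
    name: str          # illustrative label, e.g. "index_tap", "pinch_pressure"
    value: float = 0.0 # normalised pressure level for continuous gestures

# Discrete gestures map one-to-one onto posture commands (hypothetical mapping).
DISCRETE_COMMANDS = {
    "index_tap": "stand",
    "middle_tap": "sit",
    "wrist_twist": "rise",
}

def translate(gesture: Gesture) -> dict:
    """Turn a decoded gesture into an abstract robot command."""
    if gesture.kind == "discrete":
        return {"command": DISCRETE_COMMANDS.get(gesture.name, "noop")}
    # Continuous control: index-on-thumb pressure scales speed; wrist
    # movement (not modelled here) would set the heading.
    return {"command": "move", "speed": round(gesture.value, 2)}
```

The design point is that a discrete gesture selects an action outright, while a continuous gesture supplies an analogue parameter, which is what allows graded pressure to modulate the robot's motion.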
“Collaborating with Wearable Devices is another step forward in our mission to transform the way our customers interact with and operate robots and machinery,” says Aviad Almagor, vice president, technology innovation at Trimble. “A new interaction paradigm is required to effectively merge the digital and physical environments, and we believe that neural input technology can play a major role in achieving this.”
“As we move forward towards a digitally connected world, creating synergy between the human workforce and smart machines can improve efficiency and minimise human intervention,” states Asher Dahan, chief executive officer and co-founder of Wearable Devices. “Trimble Dimensions+ is an opportunity to showcase how neurotechnology provides a natural and intuitive operator user experience, driving increased safety and productivity in situations where autonomous robots like Spot can be utilised in place of humans in potentially dangerous environments. This is just one example of how neural input technology can provide various industries with a common interface that augments the human workforce to increase both safety and productivity, and we look forward to exploring additional opportunities as our technology continues to gain mainstream recognition.”
Wearable Devices’ Mudra technology provides hands-free and touch-free interaction with digital devices, robots and machines using the Mudra neural input wristband. Mudra tracks neural signals on the surface of the user’s wrist, which algorithms then decipher and identify as finger movements or hand gestures. The interface binds each gesture to a specific digital function, allowing users to input commands without physical touch or contact. Mudra gestures are natural to perform, and gestures can be tailored to a user’s intent, the desired function, and the controlled digital device.
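The binding idea described above, where each recognised gesture is associated with a specific digital function, can be sketched as a small registry. All names here are illustrative assumptions, not the real Mudra SDK.

```python
from typing import Callable, Dict

class GestureBindings:
    """Hypothetical registry mapping decoded gestures to digital functions."""

    def __init__(self) -> None:
        self._bindings: Dict[str, Callable[[], str]] = {}

    def bind(self, gesture: str, action: Callable[[], str]) -> None:
        """Associate a recognised gesture with a digital function."""
        self._bindings[gesture] = action

    def dispatch(self, gesture: str) -> str:
        """Invoke the function bound to a recognised gesture, if any."""
        action = self._bindings.get(gesture)
        return action() if action else "unbound"

# Bindings can be tailored per user and per controlled device.
bindings = GestureBindings()
bindings.bind("pinch", lambda: "select")
bindings.bind("double_tap", lambda: "play_pause")
```

Keeping the mapping in a per-user registry, rather than hard-coding it, is what allows the same gesture to trigger different functions depending on the user's intent and the device being controlled.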