US researchers are building a next-level Alexa

A new “Alexa-like” robot, able to understand everyday commands in a similar way to Amazon’s voice assistant, is being developed by researchers in the US.

The ComText system, created by MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), will be able to carry out nuanced commands which require prior “contextual knowledge” about objects.

The machine will be able to work out what various objects are and to understand simple commands such as “pick it up” – an ability that comes naturally to humans but has so far been limited in robots.

Researchers at MIT are hoping the advanced technology can be applied to other robotics, such as self-driving cars (Tom Buehler/MIT CSAIL)

Rohan Paul, a post-doctoral researcher at CSAIL, said: “Where humans understand the world as a collection of objects and people and abstract concepts, machines view it as pixels, point-clouds and 3-D maps generated from sensors.

“This semantic gap means that, for robots to understand what we want them to do, they need a much richer representation of what we do and say.”

The system was built by combining ComText with a “two-armed, humanoid robot” named Baxter.

Researchers want to develop the robot with “two kinds of memory” – semantic memory and episodic memory.

Robots have previously focused on semantic memory, which is based on general facts such as the colour of the sky.

ComText is designed to add episodic memory, which is based on personal facts, by learning from “a range of visuals and natural language”.

Researchers hope that, by learning an object’s size, shape, position and who it belongs to, the robot will be able to respond to simple commands which require multiple steps.
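To illustrate the idea of pairing general “semantic” facts with an episodic record of what the robot has recently seen, the Python sketch below shows one very simplified way a command like “pick it up” could be grounded. It is a hypothetical illustration only; the class names, memory structures and resolution rule are assumptions and do not represent MIT’s ComText implementation.

```python
# A minimal, illustrative sketch (not MIT's actual ComText code) of how a robot
# might combine semantic facts with an episodic log of observations to resolve
# an ambiguous command such as "pick it up". All names here are hypothetical.

from dataclasses import dataclass, field
from typing import Optional


@dataclass
class ObservedObject:
    name: str                          # e.g. "box of crackers"
    owner: Optional[str] = None        # episodic fact: who it belongs to
    position: Optional[tuple] = None   # episodic fact: where it was last seen


@dataclass
class RobotMemory:
    # Semantic memory: general, slowly changing facts about the world.
    semantic_facts: dict = field(default_factory=lambda: {"sky_colour": "blue"})
    # Episodic memory: a time-ordered log of objects the robot has seen or heard about.
    episodic_log: list = field(default_factory=list)

    def observe(self, obj: ObservedObject) -> None:
        """Record a new observation in episodic memory."""
        self.episodic_log.append(obj)

    def resolve_it(self) -> Optional[ObservedObject]:
        """Crude stand-in for contextual grounding: assume "it" refers to the
        most recently observed object."""
        return self.episodic_log[-1] if self.episodic_log else None


def execute(command: str, memory: RobotMemory) -> str:
    if command == "pick it up":
        target = memory.resolve_it()
        if target is None:
            return "I don't know what 'it' refers to."
        return f"Picking up the {target.name} last seen at {target.position}."
    return "Command not understood."


memory = RobotMemory()
memory.observe(ObservedObject("box of crackers", owner="Rohan", position=(0.4, 0.2)))
print(execute("pick it up", memory))
# -> Picking up the box of crackers last seen at (0.4, 0.2).
```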

Researchers found that Baxter successfully carried out commands 90% of the time.

The team now want to train future robots to understand multi-step commands and “interact with objects more naturally”.

Such innovations could eventually find uses in self-driving cars and other robotic systems.


