Google gives Alphabet’s assistant robots AI language understanding

Alphabet, the parent company of Google, is combining two of its most ambitious research initiatives — robotics and AI language understanding — to create an “assistant robot” that can comprehend directions given in plain language.

The Verge reports that Alphabet has been creating robots that can perform basic jobs like fetching beverages and cleaning surfaces since 2019.

Most robots only respond to short and simple instructions, like “bring me a bottle of water”. But large language models (LLMs) like GPT-3 and Google’s MUM can better parse the intent behind more oblique commands.

In Google’s example, you might tell one of the Everyday Robots prototypes, “I spilled my drink, can you help?” The robot filters this instruction through an internal list of possible actions and interprets it as “fetch me the sponge from the kitchen”.
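The selection step described above can be sketched in a few lines. In this illustrative example, the language model rates how relevant each low-level skill is to the user’s instruction, the robot’s own perception rates how feasible each skill is in the current moment, and the robot picks the skill with the best combined score. The skill names and all numbers here are mock values for illustration, not real model outputs.

```python
# Hypothetical skills with mock scores (not real model outputs).
llm_relevance = {            # "does this skill help with the instruction?"
    "find a sponge": 0.50,
    "pick up the sponge": 0.20,
    "bring me a drink": 0.05,
    "wipe the counter": 0.25,
}
affordance = {               # "can the robot actually do this right now?"
    "find a sponge": 0.9,
    "pick up the sponge": 0.3,  # low: no sponge in view yet
    "bring me a drink": 0.8,
    "wipe the counter": 0.2,
}

def choose_skill(llm_scores, affordance_scores):
    """Pick the skill whose relevance x feasibility product is highest."""
    return max(llm_scores, key=lambda s: llm_scores[s] * affordance_scores[s])

print(choose_skill(llm_relevance, affordance))  # -> find a sponge
```

Weighting relevance by feasibility is what keeps the robot from attempting a sensible-sounding step (like grabbing a sponge it cannot yet see) before the preconditions for that step are met.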

Google calls the resulting system PaLM-SayCan, a name that encapsulates how the model combines the language understanding abilities of LLMs with the “affordance grounding” of its robots.

According to Google, after adding PaLM-SayCan its robots were able to plan the right responses to 101 user commands 84 percent of the time and carry them out successfully 74 percent of the time.