Rachita is a Prototyping Solutions Architect at AWS. She has 15 years of experience in emerging technologies, having worked at Google, Yahoo, and Cisco at the intersection of research and production. She holds a Master's in Electrical and Computer Engineering from Carnegie Mellon.
What if you could control a robot using natural language? In this session, we demonstrate how to enable natural language as a control layer for Physical AI, using AWS Strands with the Boston Dynamics Spot robot as a real-world example. We'll walk through how user intent, expressed in plain language, is translated into structured plans, tool calls, and robot actions, bridging large language models with perception, navigation, and actuation.
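The core pattern the session describes can be sketched independently of any particular SDK: a language model emits a structured plan, and a thin dispatcher maps each step to a registered robot "tool". The sketch below is a minimal, hypothetical illustration of that flow; the tool names (`go_to`, `inspect`), the `TOOLS` registry, and the hard-coded plan are stand-ins, not the Strands or Spot SDK APIs.

```python
# Minimal sketch of a natural-language control layer: an LLM would emit a
# structured plan (hard-coded here for illustration), and a dispatcher maps
# each step to a registered robot "tool". All names are hypothetical stand-ins.
from typing import Callable

TOOLS: dict[str, Callable[..., str]] = {}

def tool(fn: Callable[..., str]) -> Callable[..., str]:
    """Register a function as a callable tool, keyed by its name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def go_to(waypoint: str) -> str:
    # In a real system this would call the robot's navigation API.
    return f"navigating to {waypoint}"

@tool
def inspect(target: str) -> str:
    # Placeholder for a perception action, e.g. capture and analyze an image.
    return f"inspecting {target}"

def execute_plan(plan: list[dict]) -> list[str]:
    """Run each structured step by dispatching to the matching tool."""
    return [TOOLS[step["tool"]](**step["args"]) for step in plan]

# A plan such as an LLM might produce for "check the valve in the pump room":
plan = [
    {"tool": "go_to", "args": {"waypoint": "pump-room"}},
    {"tool": "inspect", "args": {"target": "valve"}},
]
print(execute_plan(plan))  # → ['navigating to pump-room', 'inspecting valve']
```

In practice, an agent framework such as Strands replaces the hard-coded plan with model-generated tool calls, and the stubbed tool bodies wrap real perception and navigation APIs.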