Join industry experts for a panel exploring the technical requirements of developing and deploying humanoid robots. The discussion will cover locomotion, dexterous manipulation, AI and autonomy, safety, and key lessons learned from early pilots and real-world deployments.
The integration of AI into robotics has transformed how robots and autonomous systems perceive, learn, and act across industries, from intelligent bin-picking to collaborative tasks in modern factories. Now, with the rise of Generative AI, we’re witnessing a fundamental shift in how robotics systems are built, trained, and deployed.
This talk will discuss how to develop intelligent robotic systems using both traditional AI approaches and the latest advancements in robotics foundation models. Learn how to design end-to-end workflows that incorporate deep learning, reinforcement learning, and transformer-based vision-language-action (VLA) models, all within a single, simulation-driven platform.
Highlights:
• Design and deploy AI-powered bin-picking and motion planning systems with reduced human supervision.
• Automate data labeling and training for object detection and pose estimation.
• Detect objects with zero-shot text-conditioned models.
• Segment objects across images and videos using vision foundation models.
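The abstract does not name a specific model or library, but the core idea behind zero-shot text-conditioned detection can be sketched: a vision encoder and a text encoder map image regions and free-text prompts into a shared embedding space, and each region is assigned the label whose embedding it most resembles. The toy embeddings, label prompts, and threshold below are illustrative assumptions, not part of the talk.

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def zero_shot_label(region_embedding, text_embeddings, labels, threshold=0.2):
    """Assign the best-matching text label to an image-region embedding.

    region_embedding: vector from a vision encoder (assumed to exist upstream).
    text_embeddings:  one vector per candidate label, from a text encoder.
    labels:           free-text prompts, e.g. ["a bolt", "a gear"] -- no
                      per-class training is needed, only new prompts.
    """
    scores = [cosine_similarity(region_embedding, t) for t in text_embeddings]
    best = int(np.argmax(scores))
    if scores[best] < threshold:
        return None, scores[best]  # no prompt matches this region well enough
    return labels[best], scores[best]

# Toy 4-D embeddings standing in for real encoder outputs.
labels = ["a bolt", "a gear"]
text_emb = [np.array([1.0, 0.0, 0.0, 0.0]),
            np.array([0.0, 1.0, 0.0, 0.0])]
region = np.array([0.9, 0.1, 0.0, 0.0])
label, score = zero_shot_label(region, text_emb, labels)
```

Because classification reduces to comparing embeddings, new object categories are added by writing new prompts rather than retraining, which is what makes the approach "zero-shot".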
Autonomous Mobile Robots (AMRs) are becoming invaluable tools for warehouse leaders looking to increase efficiency, accuracy, and safety in their facilities. A successful implementation involves more than the robots themselves; it requires a holistic approach that integrates hardware, software, and human elements seamlessly into existing operations. When choosing an integrator for your deployment, it’s crucial to find one that brings a comprehensive scope to the table, ensuring that AMRs are not only implemented efficiently but also optimized for peak performance. Integrators provide customized solutions that align with unique operational goals, enhance workflow efficiency, and drive overall productivity. By leveraging their expertise in system design, process integration, and change management, integrators help organizations unlock the full potential of AMR technology, fostering innovation and maintaining a competitive edge in an ever-evolving market. With the right partner, businesses can transform their operations, creating a resilient and adaptive infrastructure ready for future challenges.
Telesurgery represents the next frontier for delivering medical procedures with unmatched quality and consistency to remote and underserved areas. These systems are a complex interplay of robotic platforms, communication infrastructure, and real-time control.
Yet technical challenges are impacting the reliability, precision, and adoption of remote surgery. In particular, system latency must be addressed in order to achieve the clinical precision needed for high-fidelity haptic feedback and real-time movement replication. The ability to capture, process, and rapidly analyze vast amounts of data (real-time visual feeds, instrument kinematics, patient physiological data, etc.) is crucial for providing surgeons with comprehensive situational awareness. Traditional communication architectures are bottlenecked by network uncertainty, introducing latency and jitter that degrade haptic realism and system stability.
System latency can be addressed by a new architectural approach: data centricity. Built on the Data Distribution Service (DDS) standard, this approach shifts the architectural focus from traditional message-passing models to a conceptual Global Data Space. DDS allows multiple subsystems (surgeon console, patient-side robot, imaging sensors, monitoring devices) to asynchronously publish and subscribe to specific data (e.g., control commands, haptic feedback, 4K video feeds) in real time. It also uses robust Quality of Service (QoS) policies to enforce the predictability and ultra-low latency required for human-safe surgical operations.
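The publish/subscribe pattern described above can be illustrated with a minimal in-process sketch. This is not the DDS API: real DDS middleware handles peer discovery and enforces QoS on the wire. The topic name, payload, and 4 ms deadline below are assumed for illustration; the point is that producers and consumers share a topic in a global data space rather than a point-to-point link, and that a deadline-style QoS policy flags late samples.

```python
import time
from collections import defaultdict

class GlobalDataSpace:
    """Illustrative in-process stand-in for DDS's Global Data Space."""

    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> callbacks
        self.deadlines = {}                    # topic -> max seconds between samples
        self.last_sample = {}                  # topic -> timestamp of last publish

    def subscribe(self, topic, callback, deadline=None):
        # Subscribers declare interest in a topic, not in a specific producer.
        self.subscribers[topic].append(callback)
        if deadline is not None:
            self.deadlines[topic] = deadline

    def publish(self, topic, sample):
        # Deadline QoS sketch: flag the sample if the gap since the previous
        # one exceeded the declared bound (real DDS notifies subscribers).
        now = time.monotonic()
        deadline = self.deadlines.get(topic)
        last = self.last_sample.get(topic)
        missed = (deadline is not None and last is not None
                  and (now - last) > deadline)
        self.last_sample[topic] = now
        for callback in self.subscribers[topic]:
            callback(sample, missed)
        return missed

# The surgeon console's haptic loop and a safety monitor both subscribe to
# the same topic; neither needs to know which device produces the data.
gds = GlobalDataSpace()
received = []
gds.subscribe("haptic_feedback",
              lambda sample, missed: received.append((sample, missed)),
              deadline=0.004)  # 4 ms deadline, an assumed requirement
gds.publish("haptic_feedback", {"force_n": 1.2})
```

Decoupling producers from consumers this way is what lets new subsystems (say, an additional monitoring display) join without rewiring existing connections.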
Attendees will learn how data-centricity works to meet the rapid, reliable communication requirements for the next generation of clinically-viable telesurgery systems.
ROS 2 has become a foundational framework for modern robotics development, offering modularity, real-time capabilities, and broad community support. However, integrating ROS-based systems into rigorous validation workflows – such as Hardware-in-the-Loop (HIL), Software-in-the-Loop (SIL), and Model-in-the-Loop (MIL) – presents unique challenges in timing, determinism, and interoperability.
This talk provides a technical overview of methodologies for embedding ROS nodes within closed-loop testing environments. Topics include synchronization of ROS communication with real-time systems, deterministic execution of control and perception algorithms, and interfacing with simulation platforms such as Gazebo, RViz, and FMI-compliant models. We will also examine the use of MCAP for scalable data logging and analysis, and discuss challenges and insights from high-fidelity data replay testing.
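The determinism requirement mentioned above can be made concrete with a small sketch: in a SIL setup, a simulated clock advances only when stepped, so the controller and plant see identical timing on every run and the resulting traces are bit-identical, which is the property replay-based regression testing relies on. The proportional controller, first-order plant, and time step here are assumptions standing in for real ROS 2 nodes, not the session's actual test harness.

```python
class SimClock:
    """Deterministic simulated clock: time advances only when stepped."""

    def __init__(self, dt):
        self.t = 0.0
        self.dt = dt

    def step(self):
        self.t += self.dt
        return self.t

def p_controller(setpoint, measurement, kp=0.5):
    # Proportional controller standing in for a ROS 2 control node (assumed).
    return kp * (setpoint - measurement)

def run_sil_loop(steps, setpoint=1.0, dt=0.01):
    """Closed loop against a first-order-lag plant, driven by SimClock."""
    clock = SimClock(dt)
    state = 0.0
    trace = []
    for _ in range(steps):
        command = p_controller(setpoint, state)
        state += command * dt            # simple explicit-Euler plant update
        trace.append((round(clock.step(), 4), round(state, 6)))
    return trace

# Two runs with identical inputs must produce identical traces; under a
# wall-clock scheduler, jitter would make them diverge.
trace_a = run_sil_loop(100)
trace_b = run_sil_loop(100)
```

The same stepping discipline is what makes logged data (e.g., from MCAP files) replayable: messages are fed back at their recorded simulated timestamps rather than at whatever rate the host machine happens to run.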
Through practical examples and architectural patterns, the session aims to equip robotics engineers and system integrators with strategies to validate ROS-based systems under realistic and reproducible conditions, bridging the gap between open-source development and industrial-grade testing.