Articles tagged with "human-robot-interaction"
Robot Talk Episode 125 – Chatting with robots, with Gabriel Skantze - Robohub
In episode 125 of the Robot Talk podcast, Claire interviews Gabriel Skantze, a Professor of Speech Communication and Technology at KTH Royal Institute of Technology. Skantze specializes in conversational AI and human-robot interaction, focusing on creating natural face-to-face conversations between humans and robots. His research integrates both verbal and non-verbal communication elements, such as prosody, turn-taking, feedback, and joint attention, to improve the fluidity and naturalness of spoken interactions with robots. Skantze also co-founded Furhat Robotics in 2014, where he continues to contribute as Chief Scientist. Furhat Robotics develops social robots designed to engage in human-like conversations, leveraging Skantze’s expertise in computational models of spoken interaction. The episode highlights ongoing advancements in conversational systems and the challenges involved in making robot communication more natural and effective, emphasizing the importance of combining multiple communication cues to enhance human-robot interaction.
robot, robotics, conversational-AI, human-robot-interaction, speech-communication, autonomous-machines, Furhat-Robotics

Tesla sues former Optimus engineer over alleged trade secret theft
Tesla has filed a lawsuit against Zhongjie “Jay” Li, a former engineer in its Optimus humanoid robotics program, accusing him of stealing trade secrets related to advanced robotic hand sensors. Li, who worked at Tesla from August 2022 to September 2024, allegedly downloaded confidential information onto personal devices and conducted research on humanoid robotic hands and startup funding sources during his final months at the company. Shortly after his departure, Li founded a startup called Proception, which claims to have developed advanced humanoid robotic hands resembling Tesla’s designs. The complaint highlights that Proception was incorporated less than a week after Li left Tesla and publicly announced its achievements within five months, raising concerns about the misuse of Tesla’s proprietary technology. Tesla’s Optimus program, launched in 2021, has faced development challenges and delays, with Elon Musk indicating in mid-2024 that the company would continue work on the project despite earlier setbacks. The lawsuit underscores ongoing tensions in the competitive field of humanoid robotics.
robot, humanoid-robotics, Tesla-Optimus, robotic-hand-sensors, trade-secret-theft, robotics-startup, human-robot-interaction

Sensitive skin to help robots detect information about surroundings
Researchers from the University of Cambridge and University College London have developed a highly sensitive, low-cost, and durable robotic skin that can detect various types of touch and environmental information similarly to human skin. This flexible, conductive skin is made from a gelatine-based hydrogel that can be molded into complex shapes, such as a glove for robotic hands. Unlike traditional robotic touch sensors that require multiple sensor types for different stimuli, this new skin acts as a single sensor capable of multi-modal sensing, detecting taps, temperature changes, cuts, and multiple simultaneous touches through over 860,000 tiny conductive pathways. The team employed a combination of physical testing and machine learning to interpret signals from just 32 electrodes placed at the wrist, enabling the robotic skin to process more than 1.7 million data points across the hand. Tests included exposure to heat, gentle and firm touches, and even cutting, with the collected data used to train the system to recognize different types of contact efficiently, though the artificial skin is not yet as sensitive as human skin.
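The approach described above — interpreting signals from a small electrode array with a learned model to recognize contact types — can be illustrated with a toy sketch. Everything here (the synthetic readings, the class names, the nearest-centroid classifier) is hypothetical and far simpler than the actual system, which processes over 1.7 million data points:

```python
import numpy as np

rng = np.random.default_rng(0)
N_ELECTRODES = 32  # signals read at the wrist, as in the article
CLASSES = ["tap", "press", "heat", "cut"]  # hypothetical contact types

def synthetic_reading(cls):
    """Simulate one 32-electrode reading for a contact class (toy data)."""
    base = np.zeros(N_ELECTRODES)
    idx = CLASSES.index(cls)
    base[idx * 8:(idx + 1) * 8] = 1.0  # each class excites a different region
    return base + rng.normal(0.0, 0.1, N_ELECTRODES)

# "Training": average many readings per class to form a centroid template.
centroids = {c: np.mean([synthetic_reading(c) for _ in range(50)], axis=0)
             for c in CLASSES}

def classify(reading):
    """Nearest-centroid classification of a new electrode reading."""
    return min(CLASSES, key=lambda c: np.linalg.norm(reading - centroids[c]))

print(classify(synthetic_reading("heat")))  # prints "heat"
```

The single-sensor, multi-modal idea shows up in the data layout: every stimulus type is read through the same 32-channel vector, and only the learned templates distinguish them.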
robotics, robotic-skin, sensors, flexible-materials, conductive-hydrogel, multi-modal-sensing, human-robot-interaction

Interview with Amar Halilovic: Explainable AI for robotics - Robohub
Amar Halilovic, a PhD student at Ulm University in Germany, is conducting research on explainable AI (XAI) for robotics, focusing on how robots can generate explanations of their actions—particularly in navigation—that align with human preferences and expectations. His work involves developing frameworks for environmental explanations, especially in failure scenarios, using black-box and generative methods to produce textual and visual explanations. He also studies how to plan explanation attributes such as timing, representation, and duration, and is currently exploring dynamic selection of explanation strategies based on context and user preferences. Halilovic finds it particularly interesting how people interpret robot behavior differently depending on urgency or failure context, and how explanation expectations shift accordingly. Moving forward, he plans to extend his framework to enable real-time adaptation, allowing robots to learn from user feedback and adjust explanations on the fly. He also aims to conduct more user studies to validate the effectiveness of these explanations in real-world human-robot interaction settings. His motivation for studying explainable robot navigation stems from a broader interest in human-machine interaction and the importance of understandable AI for trust and usability. Before his PhD, Amar studied Electrical Engineering and Computer Science in Bosnia and Herzegovina and Sweden. Outside of research, he enjoys traveling and photography and values building a supportive network of mentors and peers for success in doctoral studies. His interdisciplinary approach combines symbolic planning and machine learning to create context-sensitive, explainable robot systems that adapt to diverse human needs.
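The idea of planning explanation attributes — timing, representation, duration — from context can be sketched as a small rule-based selector. This is purely illustrative; the names and thresholds below are invented, and Halilovic's framework combines symbolic planning with machine learning rather than fixed rules:

```python
from dataclasses import dataclass

@dataclass
class Context:
    """Hypothetical interaction context for explanation planning."""
    urgency: float            # 0.0 (relaxed) to 1.0 (time-critical)
    is_failure: bool          # did the robot's navigation action fail?
    user_prefers_visual: bool

def plan_explanation(ctx: Context) -> dict:
    """Choose explanation attributes from context — a toy stand-in for
    the dynamic strategy selection described above."""
    timing = "immediate" if (ctx.is_failure or ctx.urgency > 0.7) else "on_request"
    representation = "visual" if ctx.user_prefers_visual else "textual"
    duration = "brief" if ctx.urgency > 0.5 else "detailed"
    return {"timing": timing, "representation": representation, "duration": duration}

# A failure in an urgent situation gets an immediate, brief, textual explanation.
print(plan_explanation(Context(urgency=0.9, is_failure=True, user_prefers_visual=False)))
```

Making the selector adapt from user feedback, rather than hard-coding thresholds like these, is exactly the real-time adaptation the research aims at.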
robotics, explainable-AI, human-robot-interaction, robot-navigation, AI-research, PhD-research, autonomous-robots

Pepper humanoid robot powered by ChatGPT conducts real-world interaction
Researchers from the University of Canberra showcased Pepper, a humanoid robot integrated with ChatGPT, at an Australian innovation festival to study public reactions to AI-powered social robots in real-world settings. Pepper captures audio from users, transcribes it, generates responses via ChatGPT, and communicates back through text-to-speech. The trial involved 88 participants who interacted with Pepper, many for the first time, providing feedback that revealed a broad spectrum of emotions including curiosity, amusement, frustration, and unease. The study underscored the importance of first impressions and real-world contexts in shaping societal acceptance of humanoid robots, especially as they become more common in sectors like healthcare, retail, and education. Key findings highlighted four main themes: user suggestions for improvement, expectations for human-like interaction, emotional responses, and perceptions of Pepper’s physical form. Participants noted a disconnect between Pepper’s human-like appearance and its limited interactive capabilities, such as difficulties in recognizing facial expressions and following social norms like turn-taking. Feedback also pointed to technical and social challenges, including the need for faster responses, greater cultural and linguistic inclusivity—particularly for Indigenous users—and improved accessibility. The study emphasizes that testing social robots “in the wild” provides richer, human-centered insights into how society may adapt to embodied AI companions beyond controlled laboratory environments.
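The interaction loop described — capture audio, transcribe it, generate a reply with ChatGPT, speak it back — can be sketched as a pipeline with pluggable stages. The stage functions here are stubs standing in for real ASR, LLM, and TTS services; none of these names come from the study itself:

```python
def run_interaction(audio, transcribe, generate_response, speak):
    """One interaction turn: audio -> text -> LLM reply -> speech.
    The stage functions are injected so real services (speech recognition,
    ChatGPT, text-to-speech) can be swapped in without changing the loop."""
    user_text = transcribe(audio)
    reply = generate_response(user_text)
    return speak(reply)

# Stub stages standing in for the real ASR / ChatGPT / TTS services.
demo = run_interaction(
    audio=b"",  # placeholder for captured microphone audio
    transcribe=lambda a: "hello Pepper",
    generate_response=lambda t: f"You said: {t}",
    speak=lambda r: f"[TTS] {r}",
)
print(demo)  # prints "[TTS] You said: hello Pepper"
```

Participants' complaints about slow responses map directly onto this loop: each stage adds latency, and the turn cannot complete until all three finish.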
robot, humanoid-robot, ChatGPT, AI-powered-robots, human-robot-interaction, social-robotics, SoftBank-Robotics

Congratulations to the #ICRA2025 best paper award winners - Robohub
The 2025 IEEE International Conference on Robotics and Automation (ICRA), held from May 19-23 in Atlanta, USA, announced its best paper award winners and finalists across multiple categories. The awards recognized outstanding research contributions in areas such as robot learning, field and service robotics, human-robot interaction, mechanisms and design, planning and control, and robot perception. Each category featured a winning paper along with several finalists, highlighting cutting-edge advancements in robotics. Notable winners include "Robo-DM: Data Management for Large Robot Datasets" by Kaiyuan Chen et al. for robot learning, "PolyTouch: A Robust Multi-Modal Tactile Sensor for Contact-Rich Manipulation Using Tactile-Diffusion Policies" by Jialiang Zhao et al. for field and service robotics, and "Human-Agent Joint Learning for Efficient Robot Manipulation Skill Acquisition" by Shengcheng Luo et al. for human-robot interaction. Other winning papers addressed topics such as soft robot worm behaviors, robust sequential task solving via dynamically composed gradient descent, and metrics-aware covariance for stereo visual odometry. The finalists presented innovative work ranging from drone detection to adaptive navigation and assistive robotics, reflecting the broad scope and rapid progress in the robotics field showcased at ICRA 2025.
robotics, robot-learning, human-robot-interaction, tactile-sensors, robot-automation, soft-robotics, robot-navigation

Why Intempus thinks robots should have a human physiological state
robot, robotics, AI, emotional-intelligence, human-robot-interaction, Intempus, machine-learning

What’s coming up at #ICRA2025?
robot, robotics, automation, ICRA2025, human-robot-interaction, soft-robotics, multi-robot-systems

AI model enables controlling robots with spoken commands
robot, AI, MotionGlot, machine-learning, robotics, human-robot-interaction, automation

Robot Talk Episode 110 – Designing ethical robots, with Catherine Menon
robot-ethics, assistive-technology, autonomous-systems, AI-safety, human-robot-interaction, ethical-design, public-trust-in-AI