Multimodal perception-driven decision-making for human-robot interaction: a survey
Multimodal perception is essential for enabling robots to understand and interact with complex environments and human users by integrating diverse sensory data, such as vision, language, and tactile information. This capability plays a crucial role in decision-making in dynamic, complex environments...
| Main Authors: | Wenzheng Zhao, Kruthika Gangaraju, Fengpei Yuan |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Frontiers Media S.A., 2025-08-01 |
| Series: | Frontiers in Robotics and AI |
| Online Access: | https://www.frontiersin.org/articles/10.3389/frobt.2025.1604472/full |
Similar Items
- From Coils to Crawls: A Snake-Inspired Soft Robot for Multimodal Locomotion and Grasping
  by: He Chen, et al.
  Published: (2025-04-01)
- Multimodal Raga Classification from Vocal Performances with Disentanglement and Contrastive Loss
  by: Sujoy Roychowdhury, et al.
  Published: (2025-07-01)
- Robot System Assistant (RoSA): evaluation of touch and speech input modalities for on-site HRI and telerobotics
  by: Dominykas Strazdas, et al.
  Published: (2025-07-01)
- International trade market forecasting and decision-making system: multimodal data fusion under meta-learning
  by: Yiming Bai, et al.
  Published: (2025-08-01)
- A Full‐Range Proximity‐Tactile Sensor Based on Multimodal Perception Fusion for Minimally Invasive Surgical Robots
  by: Dongsheng Li, et al.
  Published: (2025-08-01)