Research Output
Publications
38 papers across RA-L, TNSRE, ICRA, IROS, Scientific Reports, and more.
Journal articles
2026 · IEEE Data Descriptions
To address the scarcity of robust digital resources for these key groups, we present a curated image dataset designed to advance automated identification systems.
Tags: vision, perception
2026 · IEEE Access
This paper expands upon a semi-autonomous control system that combines human-in-the-loop control with lightmyography decoding and affordance-based filtering using object detection and IMU data.
Tags: teleoperation, biosignal, control, grasping, manipulation, dexterous-manipulation, wearable-sensing
2026 · Scientific Data
For this paper, we collected 640 high-resolution images from a commercial farm, spanning multiple growth stages, weed pressures, and lighting variations.
Tags: perception, vision
2025 · IEEE Access
This paper addresses the critical lack of comprehensive studies on feature scaling by systematically evaluating 12 scaling techniques, including several less common transformations, across 14 different Machine Learning algorithms and 16 datasets for classification and regression tasks.
2025 · IEEE Access
In this study, a wearable high-density lightmyography armband is proposed, and offline and real-time grasp prediction schemes are compared to deepen our understanding of real-time decoding of lightmyography signals.
Tags: biosignal, grasping, manipulation, wearable-sensing, control
2024 · IEEE Access
In this paper, we explore this new class of soft robotic grippers by proposing new designs and investigating their post-contact reconfiguration behaviour in a series of experiments covering grasping and grasping force exertion measurements.
Tags: grasping, manipulation
2023 · IEEE Access
In this work, we compare the performance of EMG-based hand gesture decoding models developed using three learning approaches.
Tags: grasping, manipulation, biosignal, wearable-sensing
2023 · Nature Scientific Reports
In this work, we introduce a new muscle-machine interfacing technique called lightmyography (LMG), which can be used to efficiently decode human hand gestures, motion, and forces from the detected contractions of the human muscles.
Tags: biosignal, control, grasping, manipulation, wearable-sensing
2022 · IEEE Access
In this work, we compare various machine learning and feature extraction methods for the creation of EMG-based control frameworks for dexterous robotic telemanipulation.
Tags: teleoperation, biosignal, control, grasping, manipulation, dexterous-manipulation, wearable-sensing
2022 · IEEE Transactions on Neural Systems and Rehabilitation Engineering
In this work, we propose EMG based frameworks for the decoding of object motions in the execution of dexterous, in-hand manipulation tasks using raw EMG signals input and two novel deep learning (DL) techniques called Temporal Multi-Channel Transformers and Vision...
Tags: control, biosignal, manipulation, grasping, dexterous-manipulation, wearable-sensing
2022 · IEEE RA-L
In this work, we propose Temporal Multi-Channel Vision Transformers as a deep learning technique that has the potential to achieve dexterous control of robots and bionic hands. The performance of this method is evaluated and compared with other well-known methods,...
Tags: biosignal, grasping, manipulation, control, wearable-sensing
Conference papers
2025 · 2025 IEEE Latin American Robotics Symposium (LARS)
This paper introduces a deep learning framework designed to enhance the grasping capabilities of quadrupeds equipped with arms, focusing on improved precision and adaptability.
Tags: manipulation, grasping, locomanipulation, mobile-manipulation, legged-robots, perception, vision, control, sim2real
2025 · 2025 IEEE Latin American Robotics Symposium (LARS)
This paper addresses the challenges of data scarcity and high acquisition costs for training robust object detection models in complex industrial environments, such as offshore oil platforms.
Tags: sim2real, perception, vision
2025 · 2025 IEEE Latin American Robotics Symposium (LARS)
This paper investigates the application of DRL for learning stable flight control to address the challenge of performing autonomous UAV navigation in confined spaces.
Tags: control, vision, reinforcement-learning
2025 · 2025 IEEE Latin American Robotics Symposium (LARS)
This paper introduces a vision-based teleoperation shared control framework designed to overcome real-time teleoperation limitations, providing intuitive, real-time control of a quadruped's manipulator.
Tags: teleoperation, shared-control, control, vision, perception, manipulation, grasping, mobile-manipulation, legged-robots
2025 · 2025 IEEE International Conference on BioInformatics and BioEngineering (BIBE 2025)
This paper introduces a vision-based teleoperation shared control framework designed to overcome real-time teleoperation limitations, providing intuitive, real-time control of a quadruped's manipulator.
Tags: teleoperation, shared-control, wearable-sensing, biosignal, manipulation, vision, perception, control
2025 · 2025 47th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)
This study seeks to examine the impact of using different wavelengths of light in the decoding of hand postures using different machine learning methods.
Tags: biosignal, manipulation, grasping, wearable-sensing, control
2025 · IEEE International Conference on Advanced Robotics (ICAR 2025)
This paper presents the MIHRAGe interface, an integrated system that combines gaze tracking, robotic assistance, and mixed reality to create an immersive environment for controlling the robot using only eye movements.
Tags: teleoperation, wearable-sensing, manipulation, human-robot-interaction, control
2024 · 2024 46th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)
In this paper, we present a new dataset capturing user interactions with a wide variety of everyday life objects using a fully actuated, human-like robot hand and an onboard camera.
Tags: vision, grasping, perception, manipulation
2023 · 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
This paper proposes a semi-autonomous framework for robotic telemanipulation that employs Electromyography (EMG) based motion decoding and potential fields to execute complex object stacking tasks with a dexterous robot arm-hand system.
Tags: teleoperation, shared-control, wearable-sensing, control, biosignal, manipulation, grasping
2023 · 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
In this work, we propose an efficient skill transfer methodology comprising intuitive interfaces, efficient optical tracking systems, and compliant control of robotic arm-hand systems.
Tags: control, manipulation, dexterous-manipulation, wearable-sensing
2023 · 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
In this paper, we explore this new class of soft robotic grippers by utilising them for single-grasp object classification and grasping force estimation.
Tags: grasping, vision, control
2023 · 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
In this paper, we propose an intuitive, affordances-oriented EMG-based telemanipulation framework for a robot arm-hand system that allows for dexterous control of the device.
Tags: teleoperation, shared-control, wearable-sensing, control, biosignal, manipulation
2023 · 2023 IEEE 19th International Conference on Automation Science and Engineering (CASE)
In this paper, we propose an anthropomorphic, lightweight, and affordable prosthetic hand equipped with a five-output series elastic differential mechanism and an armband utilizing a new muscle-machine interfacing method called Lightmyography (LMG).
Tags: control, grasping, biosignal, wearable-sensing
2023 · 2023 45th Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC)
In this study, we extend our previous work, experimentally validating the efficiency of the LMG armband in classifying thirty-two different gestures from six participants using a deep learning technique called Temporal Multi-Channel Vision Transformers (TMC-ViT).
Tags: grasping, control, dexterous-manipulation
2023 · 2023 IEEE International Conference on Robotics and Automation (ICRA)
In this work, we focus on a comprehensive data collection and analysis of key attributes involved in the selection of grasping and manipulation strategies for the successful execution of kitchen tasks.
Tags: vision, grasping, manipulation, perception
2022 · 2022 IEEE-RAS 21st International Conference on Humanoid Robots (Humanoids)
In this work, we focus on comparing human and robot performance in the execution of complex kitchen tasks, assessing the grasping and dexterous manipulation skills that are required.
Tags: grasping, manipulation
2022 · 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
In this work, we employ two novel deep learning techniques called Temporal Multi-Channel Transformer (TMC-T) and Temporal Multi-Channel Vision Transformer (TMC-ViT) for the classification of hand gestures based on the LMG data.
Tags: biosignal, grasping, manipulation, wearable-sensing
2022 · 2022 9th IEEE RAS/EMBS International Conference for Biomedical Robotics and Biomechatronics (BioRob)
In this work, we propose Temporal Multi-Channel Vision Transformers as a deep learning technique that has the potential to achieve dexterous control of robots and bionic hands.
Tags: control, biosignal, grasping, manipulation, wearable-sensing
2021 · 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
This paper explores the kinematic relation between the robot configuration in joint space and both the robot tool center point (TCP) position resolution and the robot end-effector orientation resolution with the purpose of reducing error.
Tags: teleoperation, control
2019 · 2019 19th International Conference on Advanced Robotics (ICAR)
This paper presents a Deep Reinforcement Learning agent for a 4-wheeled rover in a multi-goal competition task, under the influence of noisy GPS measurements.
Tags: control, reinforcement-learning
Preprints
2026 · Paper submitted to Elsevier Applied Soft Computing
This study proposes a deep learning-based framework for the automated identification of Ichneumonoidea wasps using a YOLO-based architecture integrated with High-Resolution Class Activation Mapping (HiResCAM) to enhance interpretability.
Tags: vision, perception
2026 · Paper submitted to IROS 2026 - Preprint version available.
We present an end-to-end pipeline for language-guided grasping that bridges open-vocabulary target selection to safe grasp execution on a real robot.
Tags: control, grasping, vision, manipulation, mobile-manipulation, perception
2026 · Paper submitted to RA-L - The preprint version will be made available soon.
Tags: control, manipulation, teleoperation, locomanipulation, shared-control
2026 · Paper submitted to IROS 2026 - Preprint version available.
In this paper, we propose Capability-Aware Traversability (CAT), a unified framework that embeds physical limits directly into the spatial feature space.
Tags: control
Evaluating Zero-Shot and One-Shot Adaptation of Small Language Models in Leader-Follower Interaction
2026 · Paper submitted to BioRob 2026 - arXiv preprint available.
In this paper, we present the first benchmark of SLMs for leader–follower communication, introducing a novel dataset derived from a published database and augmented with synthetic samples to capture interaction-specific dynamics.
Tags: human-robot-interaction
2025 · arXiv
This paper emphasizes the importance of error detection and classification for efficient and safe assembly of threaded fasteners, especially aeronautical collars.
Tags: manipulation, control
2022 · arXiv
In this paper, we developed two deep learning models called Temporal Multi-Channel Transformer (TMC-T) and Temporal Multi-Channel Vision Transformer (TMC-ViT), adaptations of Transformer-based architectures for multi-channel temporal signals.
Tags: biosignal