Tactile, audio, and visual dataset during bare finger interaction with textured surfaces
File(s)
s41597-025-04670-0.pdf (3.46 MB)
Published version
Author(s)
Devillard, Alexis
Ramasamy, Aruna
Cheng, Xiaoxiao
Faux, Damien
Burdet, Etienne
Type
Journal Article
Abstract
This paper presents a comprehensive multi-modal dataset capturing concurrent haptic, audio, and visual signals recorded from ten participants as they interacted with ten different textured surfaces using their bare fingers. The dataset includes stereoscopic images of the textures, as well as fingertip position, speed, applied load, emitted sound, and friction-induced vibrations, providing unprecedented insight into the complex dynamics underlying human tactile perception. Unlike most previous studies, which relied on rigid sensorized probes, our approach records from a bare human finger, enabling naturalistic acquisition of haptic data and addressing a significant gap in resources for studies of human tactile exploration, perceptual mechanisms, and artificial tactile perception. Additionally, fifteen participants completed a questionnaire evaluating their subjective perception of the surfaces. Through carefully designed data collection protocols encompassing both controlled and free exploration scenarios, this dataset offers a rich resource for studying human multi-sensory integration and supports the development of algorithms for texture recognition based on multi-modal inputs. A preliminary analysis demonstrates the dataset's potential: classifiers trained on different combinations of data modalities achieve promising accuracy in surface identification, highlighting its value for advancing research in multi-sensory perception and the development of human-machine interfaces.
Date Issued
2025-03-23
Date Acceptance
2025-02-18
Citation
Scientific Data, 2025, 12
ISSN
2052-4463
Publisher
Nature Portfolio
Journal / Book Title
Scientific Data
Volume
12
Copyright Statement
© The Author(s) 2025 Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
License URL
http://creativecommons.org/licenses/by/4.0/
Identifier
10.1038/s41597-025-04670-0
Publication Status
Published
Article Number
484
Date Publish Online
2025-03-23