Extending Tactile Space With Handheld Tools: A Re-Analysis and Review
Abstract
Tools can extend the sense of touch beyond the body, allowing the user to extract sensory information about distal objects in their environment. Though research on this topic has trickled in over the last few decades, little is known about the neurocomputational mechanisms of extended touch. In 2016, along with our late collaborator Vincent Hayward, we began a series of studies that attempted to fill this gap. We specifically focused on the ability to localize touch on the surface of a rod, as if it were part of the body. We have conducted eight behavioral experiments over the last several years, all of which have found that humans are incredibly accurate at tool-extended tactile localization. In the present article, we perform a model-driven re-analysis of these findings with an eye toward estimating the underlying parameters that map sensory input into spatial perception. This re-analysis revealed that users can almost perfectly localize touch on handheld tools. This raises the question of how humans can be so good at localizing touch on an inert, noncorporeal object. The remainder of the paper focuses on three aspects of this process that occupied much of our collaboration with Vincent: the mechanical information used by participants for localization; the speed with which the nervous system can transform this information into a spatial percept; and whether body-based computations are repurposed for tool-extended touch. In all, these studies underscore the special relationship between bodies and tools.
Domains
Neurosciences
Origin: Publication funded by an institution