Neuroscience Letters 315 (2001) 113–116 www.elsevier.com/locate/neulet

Human multisensory fusion of vision and touch: detecting nonlinearity with small changes in the sensory environment

Kelvin S. Oie a, Tim Kiemel b, John J. Jeka a,c,*

a Program in Neuroscience & Cognitive Science and the Department of Kinesiology, University of Maryland, College Park, MD, USA
b Department of Biology, University of Maryland, College Park, MD, USA
c Department of Psychology, University of Maryland, College Park, MD, USA

Received 12 May 2001; received in revised form 23 July 2001; accepted 17 September 2001

Abstract

Previous investigations using relatively large amplitude sensory stimuli or complete removal of sensory input have demonstrated non-linear processing of sensory information for postural control. In the present study, we asked whether a linear range of sensory fusion exists when smaller amplitude stimuli are used. The amplitudes of visual and somatosensory input were simultaneously co-varied within a trial. The postural responses were characterized by analyzing how the Fourier transform of postural sway at the driving frequency varied with sensory movement amplitudes. If the postural control system is linear with constant weighting of sensory inputs, then the pattern of Fourier transforms should be a linear function of movement amplitude. However, in 28 of 58 trials we observed non-linearity in this function. The results clearly show that even at very small amplitudes of sensory change, the nervous system processes multisensory information in a non-linear fashion. © 2001 Elsevier Science Ltd. All rights reserved.

Keywords: Multisensory; Sensory fusion; Vision; Somatosensation; Posture; Human

The maintenance of upright stance is characterized by continuous, small deviations around the actual vertical upright. The structure of postural sway reflects a control process that integrates information from the visual, vestibular and somatosensory systems (e.g. [1,3,9,13]). Under normal conditions, these deviations produce motion relative to a stationary environment that is interpreted as self-motion rather than as environmental motion. The ‘moving room’ paradigm takes advantage of this fact. Subjects are presented with small amplitude perturbations of the sensory environment that: (1) are not perceived as environmental motion [7,14]; and (2) induce postural sway that is structured by the spatiotemporal pattern of stimulation [2,4,5,14,15]. This induced sway reflects the dependence of the postural response on the imposed sensory motion, but it remains an open question as to how this integration is achieved. We have shown that postural sway trajectories under fixed conditions of sensory input (e.g. with eyes open or closed, or at a given frequency and amplitude of oscillation)

* Corresponding author. Tel.: +1-301-405-2512; fax: +1-301-314-9167. E-mail address: [email protected] (J.J. Jeka).

are well accounted for by a third-order linear stochastic model with additive inputs [6,10]. By contrast, studies using moving visual scenes (e.g. [8,12]) have demonstrated signatures of non-linearity with changes in visual motion amplitude. However, the relatively large motion amplitudes used in previous studies (e.g. [8]), or the complete removal or severe disruption (e.g. through sway-referencing) of a sensory source, may not reflect the changes in sensory information that the nervous system confronts as we move about in the environment. As biological control systems are thought to have both non-linear and linear ranges, our goal in this study was to determine whether non-linearities are evident in multisensory integration across a range of very small changes in the motion amplitude of two sensory sources. Our view is that the present manipulation better reflects what the nervous system confronts under normal conditions than drastic changes in stimulus motion or the complete loss of a sensory input do.

Subjects stood in a heel-to-toe stance approximately 40 cm from a visual stimulus, which consisted of a computer-generated pattern of random dots, while the somatosensory stimulus consisted of low-level forces (<1 N) from fingertip contact with a servo-controlled, smooth, metal surface located to the right of the subject at about hip level (for more

0304-3940/01/$ - see front matter © 2001 Elsevier Science Ltd. All rights reserved. PII: S0304-3940(01)02348-5


details, see Ref. [10]). The stimuli: (1) could be either stationary (i.e. amplitude = 0.0 cm) or oscillatory, with a frequency of 0.2 Hz; (2) were approximately sinusoidal; (3) moved in a frontal plane (i.e. medial-lateral) when oscillatory; (4) varied in peak amplitude inversely with respect to each other, from 0.0–1.0 cm or 1.0–0.0 cm in 0.1 cm increments; and (5) were anti-phase when both stimuli were oscillatory. A typical trial in Condition 1 is illustrated in Fig. 1A. The touch surface started oscillating with an amplitude of 1.0 cm with the visual display stationary. After every 30-s plateau (i.e. six cycles), stimulus amplitudes were inversely incremented by 0.1 cm relative to each other. This pattern of stimulus variation was repeated until the last plateau, where the touch surface was stationary and the visual display oscillated with an amplitude of 1.0 cm, resulting in eleven 30-s plateaus in each 330-s trial. In Condition 2, the

Fig. 1. An exemplar trial (subject 2, condition 2). (A) Time series of center-of-mass displacement (thin line), visual display position (dark line, upper trace) and touch surface motion (dark line, lower trace). (B) The observed and predicted Fourier transforms for the trial depicted in (A). Separately marked are the observed Fourier transform and the Fourier transform predicted by the linear fit for the initial plateau, where the visual display was stationary and the touch surface oscillated with 1.0 cm amplitude. The linear model was found to be the best fit for this trial, consistent with constant weighting of sensory inputs.

pattern of stimulus amplitude variation was reversed. Subjects performed three trials in each condition, with the order of the six trials randomized. Ten subjects with normal or corrected-to-normal vision participated in the experiment. No subjects reported any neurological or physical deficits, and all subjects gave informed consent according to the regulations of the Institutional Review Board at the University of Maryland.

As a measure of the postural response to the sensory stimuli within a plateau, we computed scaled Fourier transforms at the driving frequency as

$X_j = \frac{2i}{T} \int_0^T x(t)\, e^{-2\pi i f_d t}\, dt$,

where j is the plateau index, t is time, x(t) is the position of the body's center-of-mass in the medial-lateral direction, f_d is the frequency of the stimulus, i = √−1 and T is the plateau duration. Thus, the Fourier transform, X_j, describes the postural response at f_d, where the absolute value of X_j is the body's amplitude and the argument of X_j is the body's phase with respect to stimulus motion. The scaling factor, 2i/T, was chosen so that when x(t) is in-phase with the stimulus motion, A_j sin(2π f_d t), X_j is real-valued and positive. Because our stimuli are deterministic, using the Fourier transform to analyze postural sway at the stimulus frequency performs as well as techniques utilizing windowing or cross-spectral methods [12].

Examining how the Fourier transforms changed over the plateaus in a trial allowed characterization of the fusion process as stimulus amplitude was varied. The Fourier transform in the first plateau in Fig. 1B, where vision was stationary and the touch surface oscillated with 1.0 cm amplitude, exhibited a phase of ~180°, or approximately in-phase (~0°) with touch surface motion. It has previously been found that subjects' induced body sway exhibits a stable, approximately in-phase relationship at a stimulus frequency of 0.2 Hz in single-sensory visual and touch surface conditions [2,5], as well as in multisensory conditions [4].
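As a sketch, the plateau schedule and the scaled Fourier transform can be computed numerically with a Riemann-sum approximation of the integral above; the sampling rate and all function names here are illustrative assumptions, not the authors' code.

```python
import numpy as np

F_D, PLATEAU_S, FS = 0.2, 30.0, 100.0  # driving freq (Hz), plateau length (s), assumed sample rate (Hz)

def plateau_amplitudes(n=11, max_cm=1.0):
    """Condition 1 schedule: touch steps 1.0 -> 0.0 cm while vision steps 0.0 -> 1.0 cm."""
    vision = np.linspace(0.0, max_cm, n)
    return max_cm - vision, vision

def scaled_ft(x, fs=FS, fd=F_D):
    """X_j = (2i/T) * integral_0^T x(t) exp(-2*pi*i*fd*t) dt, via a Riemann sum
    over uniformly sampled sway x."""
    t = np.arange(len(x)) / fs
    T = len(x) / fs
    return (2j / T) * np.sum(x * np.exp(-2j * np.pi * fd * t)) / fs

# Sanity check of the scaling: sway exactly in phase with A*sin(2*pi*fd*t)
# over whole cycles yields a real, positive transform equal to A.
t = np.arange(0, PLATEAU_S, 1 / FS)
X = scaled_ft(1.5 * np.sin(2 * np.pi * F_D * t))
# abs(X) ~= 1.5, angle(X) ~= 0
```

The check at the bottom confirms the sign convention: the −2πi exponent together with the 2i/T scaling maps in-phase sway of amplitude A onto the positive real axis.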
As the stimulus amplitudes varied in subsequent plateaus, the respective Fourier transforms varied systematically, with the Fourier transform in the last plateau approximately in-phase with the visual stimulus. With the stimuli used in the present study, we expected subjects to again adopt an in-phase relationship to the oscillatory stimulus in the extreme plateaus, where one stimulus was stationary and the other was oscillatory with an amplitude of 1.0 cm. The intermediate plateaus presented situations in which the oscillatory stimuli were moving in opposite directions, with the Fourier transforms in these plateaus showing how the fusion process resolved these sensory ambiguities. Thus, the observed pattern of Fourier transforms within a trial is interpretable as a reflection of the underlying process of multisensory fusion. For example, if the fusion process is linear with constant sensory weights, the Fourier transforms will be a linear function of stimulus amplitude in the complex plane. Given the structure of our stimuli, the
transforms should travel in a straight line in equal amplitude steps. Any deviations from this predicted pattern (e.g. unequal amplitude steps or a non-straight path) would indicate non-linearities in the postural control system.

To examine the patterns of observed Fourier transforms, we employed a hierarchical linear least squares fitting procedure. Fits of all orders up to and including cubic were made to the observed pattern of Fourier transforms as a function of sensory movement amplitude within a trial. The resulting fits were cubic (8 parameters), quadratic (6 parameters), linear (4 parameters), constant with non-zero mean (2 parameters) and constant with zero mean (0 parameters). Pairwise comparisons between all lower order fits and the cubic fit were performed using F tests at significance level α = 0.05. The order for a given trial was chosen as the minimum order that could not be rejected in favor of the cubic fit. With this selection procedure, the probability that the selected order exceeds the actual order is at most α. The selected fits were interpreted in terms of the nature of the observed postural response: (1) the cubic and quadratic fits indicate a non-linear response; (2) the linear fit is consistent with a linear response; (3) the constant non-zero-mean fit indicates that there was no detectable change in the response across plateaus; and (4) the constant zero-mean fit indicates that there was no detectable coupling to the sensory stimuli.

In roughly half the trials, 28 of 58 (two trials for one subject were discarded due to technical failures during data collection), the subjects' postural responses were non-linear in nature. Fig. 2A shows an exemplar cubic fit of a single trial. Fourteen trials were best accounted for by the cubic fit and 14 trials were best fit by the quadratic fit.
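The hierarchical selection procedure can be sketched in numpy as follows. This is a simplified stand-in, not the authors' analysis code: each complex transform contributes two real observations, parameter counts match the text (8/6/4/2/0), and a single illustrative F critical value replaces the proper α = 0.05 critical values, which depend on the degrees of freedom of each comparison.

```python
import numpy as np

def poly_fit_rss(amps, X, order):
    """Least-squares polynomial fit of the complex transforms X against
    stimulus amplitude; returns (residual sum of squares over Re and Im,
    number of real parameters). order = -1 denotes the zero-mean model."""
    if order < 0:
        return float(np.sum(np.abs(X) ** 2)), 0
    V = np.vander(amps, order + 1).astype(complex)   # design matrix
    coef, *_ = np.linalg.lstsq(V, X, rcond=None)
    resid = X - V @ coef
    return float(np.sum(np.abs(resid) ** 2)), 2 * (order + 1)

def select_order(amps, X, f_crit=3.1):
    """Lowest-order model that cannot be rejected in favour of the cubic.
    f_crit is an illustrative stand-in for the F critical value."""
    n = 2 * len(X)                       # real-valued observations (Re + Im)
    rss_full, p_full = poly_fit_rss(amps, X, 3)
    denom = max(rss_full, 1e-12 * float(np.sum(np.abs(X) ** 2)))  # numerical guard
    for order in (-1, 0, 1, 2):          # zero-mean, constant, linear, quadratic
        rss, p = poly_fit_rss(amps, X, order)
        F = ((rss - rss_full) / (p_full - p)) / (denom / (n - p_full))
        if F <= f_crit:
            return order                 # cannot reject the simpler model
    return 3                             # cubic

# A pattern of transforms that is exactly linear in amplitude selects order 1.
amps = np.linspace(0.0, 1.0, 11)
X_lin = (1 + 2j) * amps + (0.3 - 0.1j)
```

For a pattern that is exactly linear in the complex plane, `select_order(amps, X_lin)` returns 1; the constant and zero-mean models are rejected against the cubic, while the linear model leaves essentially no extra residual for the cubic to explain.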
Of the remaining trials, the linear fit was selected in 15, the constant non-zero-mean fit in 9, and the constant zero-mean fit (an example of which is shown in Fig. 2B) in 6. These results show that the postural response to changing multisensory information is most often non-linear in nature. No linear model with constant, additive inputs can produce those responses that were best accounted for by the non-linear fits. This confirms that previous evidence for non-linear postural responses (e.g. Peterka & Benolken [12]) was not merely a function of the large changes in sensory stimulus amplitude. Peterka & Benolken varied visual movement amplitude between trials with step sizes of at least 0.3°, corresponding to a lateral displacement of 9 mm at an eye level of 1.7 m. Our step size of 1 mm was an order of magnitude smaller, yet we still observed non-linear responses.

If non-linearity is the rule, then why were 15 of the 58 trials consistent with a linear model? It is possible that the re-weighting of both touch and vision might have partially obscured a non-linear response. If the weighting of vision decreases with increasing visual motion and the weighting of touch remains constant, then the step sizes between Fourier transforms would become smaller moving from left to
right in the complex plane. Similarly, if the weighting of touch decreases with increasing touch plate motion and the weighting of vision remains constant, then the step sizes would become smaller moving from right to left in the complex plane. Either one of the cases alone would produce unequal step sizes, a signature of non-linearity. However, if the weighting of both vision and touch varies with the sensory amplitudes, then step sizes might decrease roughly uniformly throughout the trial and end up approximately equal. In other words, two non-linear weightings may produce an overall response that looks approximately linear.
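This argument can be illustrated numerically. The sketch below assumes hypothetical hyperbolic weighting functions, chosen for illustration rather than fitted to data: when only vision re-weights, the steps between successive transforms shrink monotonically from left to right, whereas when both sources re-weight, the step sizes end up much closer to equal, i.e. closer to the linear prediction.

```python
import numpy as np

amps_v = np.linspace(0.0, 1.0, 11)   # visual amplitude per plateau (cm)
amps_t = 1.0 - amps_v                # touch amplitude (inversely co-varied)

def weight(a, gain=1.0):
    """Hypothetical re-weighting: a source's weight falls as that source's
    motion amplitude grows (an illustrative stand-in, not a fitted model)."""
    return 1.0 / (1.0 + gain * a)

def step_spread(X):
    """Ratio of largest to smallest step between successive transforms."""
    steps = np.abs(np.diff(X))
    return steps.max() / steps.min()

# Anti-phase stimuli: vision contributes along +1, touch along -1.
# Vision-only re-weighting (touch weight held at its mid-trial value):
X_single = weight(amps_v) * amps_v - weight(0.5) * amps_t
# Both sources re-weighting:
X_both = weight(amps_v) * amps_v - weight(amps_t) * amps_t

assert step_spread(X_both) < step_spread(X_single)
```

With these weights, the single-source case shows a clearly non-linear signature (steps shrinking monotonically across the trial), while the joint case compresses the spread of step sizes, so its pattern can be hard to distinguish from a linear response at realistic noise levels.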

Fig. 2. (A) Cubic model. A trial (subject 8, condition 1) that indicates non-linearity. The cubic fit for this trial was found to be significantly better than all fits of lower order (P < 0.05). (B) Constant model (zero mean). A trial (subject 10, condition 2) that was not fit significantly better by any of the higher order models. This is consistent with a response uncoupled to stimulus motion.


Given this, one might suspect that non-linearities in the form of sensory re-weighting were present even in trials in which non-linearity was not detected. Recent work from our laboratory provides clear evidence of intersensory re-weighting as a mechanism for multisensory fusion in our vision-touch paradigm [10,11]. Thus, sensory re-weighting provides a potential explanation for the failure to detect non-linearity in those trials best accounted for by the linear fit.

The present study supports non-linear processing of multisensory information for human postural control and discounts the likelihood of a linear processing range. The results suggest that as we move about a changing environment, the nervous system continually fuses multisensory information for postural control, even when sensory changes are extremely small (as little as 1 mm). This level of precision is much finer than previous studies have shown and emphasizes the need for continual online updating of estimates of the center of mass to correct for stochastic fluctuations of postural sway.

[1] Allum, J.H. and Pfaltz, C.R., Visual and vestibular contributions to pitch sway stabilization in the ankle muscles of normals and patients with bilateral peripheral vestibular deficits, Exp. Brain Res., 58 (1985) 82–94.
[2] Dijkstra, T.M.H., Schöner, G., Giese, M.A. and Gielen, C.C.A.M., Frequency dependence of the action-perception cycle for postural control in a moving visual environment: relative phase dynamics, Biol. Cybern., 71 (1994) 489–501.
[3] Horak, F.B. and Macpherson, J.M., Postural orientation and equilibrium, In J. Shepard and L. Rowell (Eds.), Handbook of Physiology, Oxford University Press, New York, 1996, pp. 255–292.
[4] Jeka, J.J., Oie, K.S. and Kiemel, T., Multisensory information for human postural control: integrating touch and vision, Exp. Brain Res., 134 (2000) 107–125.

[5] Jeka, J.J., Oie, K.S., Schöner, G., Dijkstra, T.M.H. and Henson, E.M., Position and velocity coupling of postural sway to somatosensory drive, J. Neurophysiol., 79 (1998) 1661–1674.
[6] Kiemel, T., Oie, K.S. and Jeka, J.J., Multisensory fusion and the stochastic structure of postural sway, Biol. Cybern., (2001) in press.
[7] Lee, D.N. and Lishman, J.R., Visual proprioceptive control of stance, J. Hum. Mov. Stud., 1 (1975) 87–95.
[8] Lestienne, F., Soechting, J. and Berthoz, A., Postural readjustments induced by linear motion of visual scenes, Exp. Brain Res., 28 (1977) 363–384.
[9] Nashner, L.M., Analysis of stance posture in humans, In A. Towe and E. Luschei (Eds.), Handbook of Behavioral Neurobiology, Motor Coordination, Vol. 5, Plenum Press, New York, 1981, pp. 527–565.
[10] Oie, K.S., Kiemel, T. and Jeka, J.J., Multisensory fusion: simultaneous re-weighting of vision and touch for the control of human posture, Cogn. Brain Res., (2001) in press.
[11] Oie, K.S., Kiemel, T. and Jeka, J.J., Multisensory re-weighting in response to small amplitude environmental motion, In J. Duysens, B.C.M. Smits-Engelsman and H. Kingma (Eds.), Control of Posture and Gait, Symposium of the International Society for Posture and Gait Research, 2001, pp. 302–306.
[12] Peterka, R.J. and Benolken, M.S., Role of somatosensory and vestibular cues in attenuating visually induced human postural sway, Exp. Brain Res., 105 (1995) 101–110.
[13] Rabin, E., Bortolami, S.B., DiZio, P. and Lackner, J.R., Haptic stabilization of posture: changes in arm proprioception and cutaneous feedback for different arm orientations, J. Neurophysiol., 82 (1999) 3541–3549.
[14] Stoffregen, T.A., Flow structure versus retinal location in the optical control of stance, J. Exp. Psychol. Hum. Percept. Perform., 11 (1985) 554–565.
[15] van Asten, N.J.C., Gielen, C.C.A.M. and Denier van der Gon, J.J., Postural adjustments induced by simulated motion of differently structured environments, Exp. Brain Res., 73 (1988) 371–383.