Visual Neuroscience (2000), 17, 139–153. Printed in the USA. Copyright © 2000 Cambridge University Press 0952-5238/00 $12.50

Motion coherence affects human perception and pursuit similarly

BRENT R. BEUTTER and LELAND S. STONE
Human Information Processing Research Branch, NASA Ames Research Center, Moffett Field
(Received March 23, 1999; Accepted September 9, 1999)

Abstract

Pursuit and perception both require accurate information about the motion of objects. Recovering the motion of objects by integrating the motion of their components is a difficult visual task. Successful integration produces coherent global object motion, while a failure to integrate leaves the incoherent local motions of the components unlinked. We compared the ability of perception and pursuit to perform motion integration by measuring direction judgments and the concomitant eye-movement responses to line-figure parallelograms moving behind stationary rectangular apertures. The apertures were constructed such that only the line segments corresponding to the parallelogram’s sides were visible; thus, recovering global motion required the integration of the local segment motion. We investigated several potential motion-integration rules by using stimuli with different object, vector-average, and line-segment terminator-motion directions. We used an oculometric decision rule to directly compare direction discrimination for pursuit and perception. For visible apertures, the percept was a coherent object, and both the pursuit and perceptual performance were close to the object-motion prediction. For invisible apertures, the percept was incoherently moving segments, and both the pursuit and perceptual performance were close to the terminator-motion prediction. Furthermore, both psychometric and oculometric direction thresholds were much higher for invisible apertures than for visible apertures. We constructed a model in which both perception and pursuit are driven by a shared motion-processing stage, with perception having an additional input from an independent static-processing stage. Model simulations were consistent with our perceptual and oculomotor data. Based on these results, we propose the use of pursuit as an objective and continuous measure of perceptual coherence. Our results support the view that pursuit and perception share a common motion-integration stage, perhaps within areas MT or MST.

Keywords: Eye movements, Area MT, Area MST, Direction perception, Models

Introduction

Are perceptual motion coherence and smooth-pursuit eye movements linked? In everyday life, one does not worry about coherence, so the issue may seem somewhat esoteric. Indeed, our visual system does a very good job of choosing which parts of the retinal image to group together as a single object, and which to leave separate as independent objects. In fact, errors could be disastrous. For example, incorrectly grouping together dirt on the windshield and a rapidly approaching car would produce large errors in determining the car’s velocity, as would incorrectly segregating the car’s features of differing orientations and assigning them separate velocities. Not only do we depend on the accurate grouping of the pieces of the world around us for veridical perception, but we must also have access to accurate object-motion signals to control our movements. The oculomotor system represents an ideal and particularly simple system for examining the relationship between motion processing for perception and motor control because, unlike other more complex motor systems, eye movements involve only a single joint with a constant load.

Physiological and psychophysical evidence suggests that extrastriate cortical areas MT and MST are involved in the integration of local motions to compute global object-motion signals. The fact that microstimulation and lesions of MT and MST affect both smooth eye movements and motion perception (Newsome et al., 1985; Dürsteler et al., 1987; Dürsteler & Wurtz, 1988; Newsome & Paré, 1988; Komatsu & Wurtz, 1989; Salzman et al., 1990, 1992; Murasugi et al., 1993; Pasternak & Merigan, 1994; Celebrini & Newsome, 1995; Britten & van Wezel, 1998; Rudolph & Pasternak, 1999) suggests that the same neural structures may be involved in both. However, because these studies did not measure perception and pursuit simultaneously, it remains unclear if and how these two functions are related. In fact, some (e.g. Goodale & Milner, 1992) have even argued that there are two independent parallel neural pathways for visual processing, one determining perception and the other controlling motor actions, including pursuit.

Address correspondence and reprint requests to: Brent R. Beutter, MS 262-2, Human Information Processing Research Branch, NASA Ames Research Center, Moffett Field, CA 94035-1000, USA. E-mail: brent@vision.arc.nasa.gov


The first question we address is whether there is a single motion-integration stage that is shared by pursuit and perception. If this is so, then pursuit should reflect the same coherence criteria as perception. If, however, pursuit and perception have different motion-integration pathways, the pursuit response to image motion will not be tightly correlated with perceptual coherence.

The second issue we address is how local motion signals are combined to compute a global motion signal for pursuit and perception. Does the motion-processing system reconstruct the veridical object motion, or is a cruder computation performed? We considered three candidate computational rules and constructed stimuli for which the three predictions are very different. One way veridical object motion can be computed is using the Intersection of Constraints (IOC) rule (Fennema & Thompson, 1979; Adelson & Movshon, 1982). Although other computational methods could also be used to reconstruct the veridical object motion, for convenience we will refer to the veridical object-motion direction as the IOC direction. A second, simpler computation, which often approximates object motion, the vector average (VA) of the component motions, has been proposed for both perception (Wilson et al., 1992; Yo & Wilson, 1992; Wilson & Kim, 1994) and pursuit (Lisberger & Ferrera, 1997; Lisberger & Movshon, 1999). A third possibility is that the direction of global motion is determined simply by the motion of higher-level features such as line-segment end points, or terminator motion (TM). Our goal is to examine pursuit and perception produced in response to stimuli for which these three predictions are very different, to shed light on the algorithm used to compute global motion.

While perceptual coherence seems a rather immediate experience, measuring it experimentally has proven to be difficult. Directly measuring coherence by having observers report the perceived coherence of stimuli over the range from purely incoherent to purely coherent can be biased or imprecise due to the subjective nature of the direct judgment and the resulting criterion drift. Alternatively, motion judgments that are indirectly affected by coherence can be used as more objective means for determining coherence (Welch & Bowne, 1990; Lorenceau & Shiffrar, 1992; Shiffrar & Lorenceau, 1996). We too measured perceptual coherence using an indirect but objective method, by asking observers to judge the direction of object motion.

Our approach was to measure simultaneously both pursuit and perception of line-figure parallelograms moving behind stationary, vertical apertures. Previous work showed that changing only the luminance of the apertures, leaving the object motion identical, alters the percept dramatically from a single object moving coherently to multiple line segments moving incoherently (Lorenceau & Shiffrar, 1992). Tilting the parallelogram relative to its direction of motion (Beutter & Stone, 1997) and flattening it (Lorenceau, 1998) allowed us to control the VA direction of the segment motion, leaving object motion unchanged. The TM direction is determined by the orientation of the apertures, and is thus always purely vertical, independent of the object-motion direction. Thus, for our stimuli, the VA, TM, and object-motion (IOC) directions are significantly different.
Furthermore, to compare directly the effects of coherence on perception and pursuit, we measured the proportion of rightward responses for each by computing psychometric functions for perception and oculometric functions for pursuit (Kowler & McKee, 1987; Beutter & Stone, 1998a; Watamaniuk & Heinen, 1999). Based on these results, we constructed a simple model which quantitatively predicts the visible-aperture psychophysical data from the pursuit and invisible-aperture psychophysical data. We also performed a trial-by-trial analysis of the correlation between the perceptual and oculomotor decisions (Beutter et al.,

1995; Beutter & Stone, 1996). Preliminary results have been presented elsewhere (Beutter & Stone, 1997, 1998b).

General methods

All stimuli consisted of a line-figure parallelogram moving along a straight-line trajectory behind two stationary, vertically oriented rectangular apertures, such that the parallelogram’s vertices were never visible (Fig. 1). Although the entire parallelogram is drawn for clarity in the figure, in the experiments only the four line segments (shown as the thick white line segments) falling within the two apertures were displayed. The IOC direction, which is by definition identical to the direction of object motion, was either slightly to the left or right of vertical. Because the apertures were vertical rectangles and the line-segment terminators moved up and down along the vertical edges, the TM direction was always purely vertical. Finally, because the VA direction is determined predominantly by the average orientation of the line segments, it was always rightward of straight down for parallelograms with a counterclockwise rotation (tilt: +30 deg), always leftward of straight down for parallelograms with a clockwise rotation (tilt: −30 deg), and close to straight down for untilted parallelograms (tilt: 0 deg). We computed the VA direction (θVA) by first determining the segment velocities perpendicular to their orientations, and then taking the average over all four segments.

Fig. 1. Stimuli used in Experiment 1 and the predicted responses of three motion-integration rules. A parallelogram moved behind stationary rectangular apertures in one of two directions (±10 deg). In the experiments, only the line segments falling within the apertures were visible (heavy white lines). In these sketches, the background is white, while the actual background was gray, and only the visible (black) aperture condition is shown. In the invisible-aperture condition, the aperture luminance was equal to the background gray. The last three columns show the predicted directions of motion for the −10 deg conditions as the dashed arrows, and those for the +10 deg conditions as the solid arrows. For each of the three tilts, the IOC prediction is identical to the direction of object motion. The VA predictions, computed from eqn. (1), are close to the mean orientation of the segments for both directions of object motion (−10 deg, +10 deg) and all three tilts (−30 deg tilt: −27 deg and −24 deg; 0 deg tilt: −1.3 deg and +1.3 deg; +30 deg tilt: +24 deg and +27 deg). Because of the vertical orientation of the apertures, the TM prediction is always purely vertical.


For a parallelogram moving in a direction θIOC, with vertices subtending interior angles α and 180 deg − α, and whose bisectors are rotated by an angle β, the relationship between the VA direction and the IOC direction is

θVA = β + arctan[tan(θIOC − β) · tan²(α/2)]    (1)

For squares (α = 90 deg), the VA and IOC directions are identical, but in general they are different.

Stimuli

The stimuli were displayed on a 21-inch Philips Brilliance 21A color monitor using the AT Vista video display system hosted by a 486 personal computer. The monitor was run in non-interlaced 60-Hz refresh-rate mode with 640 × 486 resolution (pixel size: 0.59 mm). At the 57-cm viewing distance, the full display subtended 38 deg × 29 deg. Luminances were measured using a Photo Research 880 photometer. The luminance was 38 cd/m² for the background and 93 cd/m² for the line segments. The apertures were either “visible” with low luminance (0.2 cd/m²) or “invisible” with luminance equal to the background. The parallelogram had vertex angles of 40 deg and 140 deg, and was 13.7 deg wide. The middle vertices were always aligned vertically. As shown in Fig. 1, the untilted parallelogram was symmetric about both the horizontal and vertical axes, while the tilted parallelogram’s sides were rotated by either −30 deg (top row) or +30 deg (bottom row). Therefore, the untilted parallelogram’s left and right vertices were aligned horizontally, while for the tilted parallelogram they were shifted upward or downward. The parallelogram moved sinusoidally along a straight line with a temporal frequency of 0.94 Hz and with horizontal and vertical amplitudes Ax and Ay. The direction of object motion was the arctangent of Ax/Ay. Subsequently, we will refer to directions of motion that are rotated counterclockwise as rightward (the displacement at the bottom is to the right of straight down) and those that are rotated clockwise as leftward (the displacement at the bottom is to the left of straight down).

Experimental procedures

We recorded observers’ eye movements and the associated psychophysical responses on each trial. Observers viewed the stimulus binocularly in a dimly lit room. They initiated each trial with a button press. Trials began with a 500-ms presentation of a 1 deg by 1 deg fixation cross at the center of the screen. The fixation cross was then extinguished and one of the possible directions of stimulus motion was presented. Observers were instructed to track the perceived center of the object and to determine whether it appeared to move to the right or left of straight down. On each trial, the eye movements were recorded and the observer pressed either the left or right mouse button to indicate the perceptual judgment. No feedback was provided.

Eye tracking

Eye movements were measured with an infrared (IR) video-based eye tracker (ISCAN, Inc., custom built for NASA) sampling at 240 Hz, synchronized with our display monitor. Head movements were minimized by using a bite bar. An IR light source illuminated the observer’s left eye, which viewed the stimulus through a mirror which transmitted visible but reflected IR light. The eye tracker computed the horizontal and vertical positions of the pupil in uncalibrated eye-tracker coordinates by measuring the centroid of a thresholded image of the pupil.

Calibration
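For concreteness, eqn. (1) is easy to evaluate numerically; the short Python sketch below (our own illustration; the function name and printout are not from the paper) reproduces the VA predictions quoted in the Fig. 1 caption for the α = 40 deg parallelogram, with β equal to the tilt.

```python
import math

def va_direction(theta_ioc_deg, alpha_deg, beta_deg):
    """Vector-average direction from eqn. (1):
    theta_VA = beta + arctan[tan(theta_IOC - beta) * tan^2(alpha/2)].
    All angles in degrees, measured relative to straight down."""
    t = math.tan(math.radians(theta_ioc_deg - beta_deg))
    return beta_deg + math.degrees(
        math.atan(t * math.tan(math.radians(alpha_deg / 2.0)) ** 2))

# Reproduce the VA predictions quoted in the Fig. 1 caption
# (alpha = 40 deg vertex angle; beta = tilt; theta_IOC = +/-10 deg):
for beta in (-30.0, 0.0, +30.0):
    for theta_ioc in (-10.0, +10.0):
        print(f"tilt {beta:+.0f}, IOC {theta_ioc:+.0f} -> "
              f"VA {va_direction(theta_ioc, 40.0, beta):+.1f} deg")
# tilt -30: -27.2 and -23.7; tilt 0: -1.3 and +1.3; tilt +30: +23.7 and +27.2
```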

Prior to each run, we calibrated the eye tracker by having observers fixate a series of nine crosses arranged in a 3 deg × 3 deg grid. The crosses were presented in a fixed pseudorandom order and each was shown at least twice. The crosses were each presented for 1.5 s, and the eye-movement recording began after 0.5 s and lasted 1.0 s. For each fixation, the mean eye position and its standard deviation were calculated. The standard deviations of the eye positions, which averaged ≈0.15 deg, provide an estimate of the eye-tracker positional noise (neglecting the small fixational eye jitter). The calibrated eye positions were computed as linear functions of the tracker outputs. Two sets of three parameters were used to compute the horizontal and vertical positions, respectively: an offset, a gain, and a cross-talk term (for details see Beutter & Stone, 1998a). The calibration parameters were determined by optimally fitting the mean eye-tracker outputs to the known locations of the fixation points. The calibration data were well fit by these six linear parameters; the reduced χ² ranged from ≈0.5 to ≈1.5.

Computation of steady-state pursuit direction

To determine the pursuit direction, we disregarded the pursuit onset, and fit the remaining horizontal and vertical eye-position data to separate sinusoids at the stimulus temporal frequency. We used a low-pass filtered differentiator (−3 dB cutoff at 42 Hz) to detect saccades. The total eye velocity (the square root of the sum of the squares of the horizontal and vertical velocities) was compared to a threshold, which ranged from 16 deg/s to 23 deg/s. Detecting large saccades is easy, and our saccade-detection algorithm found all saccades that were detected subjectively by eye. However, we deliberately set our threshold low, so as to err on the side of making sure that small saccades were excluded from our analysis (at the expense of possibly discarding some data because of falsely detected saccades), and therefore to ensure that we fit only the smooth portions of the response (Fig. 2A). We fit the saccade-free intervals and determined the sine wave (amplitude and phase) that minimized the total χ², allowing a position offset for each intersaccadic interval (Fig. 2B). This procedure quantifies the component of the pursuit response that is at the stimulus temporal frequency. The example trial in Fig. 2 shows that our analysis is able to detect small saccades and that our fit accurately measures the smooth-pursuit response. The angular direction of motion was computed as the arctangent of the ratio of the horizontal and vertical amplitudes measured by the fits. To correct for any overall direction bias (calibration induced or observer specific), for each observer we subtracted the grand-mean direction across all conditions from each trial’s direction. Trials with χ² > 5.0 or with less than ½ cycle of saccade-free pursuit were discarded. Because the horizontal pursuit amplitude was small, we took special care to make our direction analysis robust to the resulting uncertainty in horizontal temporal phase. Trials with horizontal amplitudes of less than 0.025 deg were defined to be straight down. A horizontal–vertical phase difference of 0 deg corresponds to rightward pursuit, and a phase difference of 180 deg corresponds to leftward pursuit. The horizontal phase uncertainty resulted in some trials with phase differences close to ±90 deg.
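Because the fit is at a known temporal frequency, the amplitude-and-phase estimation is linear in sine and cosine coefficients and can be solved by ordinary least squares. The numpy sketch below illustrates one way to implement such a fit with per-interval position offsets; it is our own reconstruction under the assumptions stated in the comments, not the authors' code.

```python
import numpy as np

def fit_sinusoid(t, pos, interval_id, freq_hz):
    """Least-squares fit of pos(t) = A*sin(2*pi*f*t) + B*cos(2*pi*f*t) + c_k,
    with a separate position offset c_k for each saccade-free interval k.
    t, pos: saccade-free samples only; interval_id: interval label per sample.
    Returns the amplitude and phase (deg) of the fitted sinusoid."""
    t, pos, iid = np.asarray(t), np.asarray(pos), np.asarray(interval_id)
    w = 2.0 * np.pi * freq_hz
    cols = [np.sin(w * t), np.cos(w * t)]
    for k in np.unique(iid):
        cols.append((iid == k).astype(float))   # one offset column per interval
    X = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(X, pos, rcond=None)
    A, B = coef[0], coef[1]
    return np.hypot(A, B), np.degrees(np.arctan2(B, A))

# Pursuit direction: fit the horizontal and vertical traces separately, then
# take the arctangent of the ratio of the fitted amplitudes, arctan(Ax / Ay).
```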


Fig. 2. Raw eye-movement data. A: This panel shows the horizontal (thick trace) and vertical (thin trace) position data from a single trial (observer LS). The displayed data from the last two cycles of motion were low-pass filtered (−3 dB cutoff at 25 Hz). The smooth-pursuit portions are shown as solid lines and the small saccade found by our detection algorithm is indicated by dotted lines. B: This panel shows the same data with the saccade removed (solid lines) along with the best-fitting sinusoids (dashed lines).

Because the direction of these trials is somewhat ambiguous, we also discarded trials with phase differences within 15 deg of ±90 deg. For the remaining trials (≈88%), the average pursuit gain was ≈0.7 and the average vertical phase lag was ≈8 deg. The average reduced χ² for the two experiments was ≈0.8, which strongly suggests that nearly all of the saccades were detected, because fitting position data with missed saccades would increase χ².
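Taken together, the acceptance and discard rules above amount to a small per-trial decision procedure. A hypothetical sketch (the thresholds are those quoted in the text; the function and argument names are ours):

```python
def classify_trial(amp_h_deg, phase_diff_deg, chisq, saccade_free_cycles):
    """Classify one trial from its fitted horizontal amplitude, its
    horizontal-vertical phase difference, and the fit quality."""
    if chisq > 5.0 or saccade_free_cycles < 0.5:
        return "discard"            # poor fit or too little smooth pursuit
    if amp_h_deg < 0.025:
        return "straight down"      # horizontal amplitude negligibly small
    d = (phase_diff_deg + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
    if abs(abs(d) - 90.0) < 15.0:
        return "discard"            # ambiguous: within 15 deg of +/-90
    return "rightward" if abs(d) < 90.0 else "leftward"  # 0 -> right, 180 -> left
```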

Experiment 1

Methods

Four observers participated in this experiment (two authors: BB & LS, and two naïve: JP & TX). All observers had normal or corrected-to-normal vision. Before data collection began, observers practiced making the psychophysical judgments and tracking the stimuli in preliminary runs. The apertures were 3.3 deg wide by 18 deg high and their centers were separated by 6.6 deg. There were two types of apertures: invisible and visible. Two directions of motion were used: ±10 deg from straight down (Ay = 2.70 deg, Ax = ±0.48 deg). The stimulus duration was 2.67 s, and the pursuit data corresponding to the last two cycles of stimulus motion were analyzed to determine the pursuit direction. There were three parallelogram tilts: −30 deg, 0 deg, and +30 deg. Each parallelogram was presented in both directions of motion within both types of apertures using the method of constant stimuli. Data for the visible- and invisible-aperture conditions were collected in separate 60-trial or 120-trial blocks. The two aperture-condition blocks were alternated within two or three runs.

Results

Pursuit data

Eye-position data for both directions and aperture conditions from individual trials are shown in Fig. 3. The top row shows data from the visible-aperture condition, which generally produced a percept of coherent object motion. Notice that the red traces corresponding to +10 deg object motion are oriented about 10 deg to the right, while the blue traces corresponding to −10 deg object motion are oriented about 10 deg to the left. The bottom row shows data from the invisible-aperture condition, which produced a percept of incoherent segment motion. In contrast to the visible-aperture data, the +10 deg (red) and the −10 deg (blue) traces are nearly identical and vertical. Furthermore, despite the fact that each column corresponds to a different parallelogram tilt and thus to a large difference in the VA direction, both the invisible- and visible-aperture data show little dependence on tilt.

The average eye-movement directions are plotted as a function of object-motion direction in Fig. 4. The data for the three tilts and two directions of motion in the visible-aperture condition are shown in Figs. 4A–4C, while those for the invisible-aperture condition are shown in Figs. 4D–4F. Also shown are the IOC, VA, and TM predictions, which are independent of the aperture condition. The IOC direction (dashed line) is identical to the object-motion direction and thus predicts a line of slope 1. The TM direction (solid line) is always purely vertical and predicts a line of slope 0. The VA direction (dotted line) is close to the tilt of the parallelogram and predicts a monotonic function with a low slope and, for the tilted conditions, large vertical shifts.

In the visible-aperture condition, the pursuit directions for all three tilts are largely veridical, and observers reported mostly coherent object motion. For the 0 deg tilt (Fig. 4B), the mean slope (averaged across observers) was 0.65, which is less than the IOC prediction of 1.0. The mean slopes for the −30 deg tilt (0.78) and the +30 deg tilt (0.79) are also close to but lower than the IOC prediction. These less-than-unity slopes could be caused by a bias toward either the VA or TM directions, because both predict low slopes. However, the absence of the large vertical shifts predicted by the VA for the ±30 deg tilts (Figs. 4A and 4C) resolves this ambiguity. Small biases toward the VA direction are reflected in the downward shifts for the −30 deg tilt (mean: −3.7 deg) and upward shifts for the +30 deg tilt (mean: +3.5 deg), but the shifts are much less than the VA prediction of ±25.5 deg. These results show that the pursuit directions are largely determined by the IOC direction, and that the <1.0 slopes reflect a bias toward the TM direction.

In the invisible-aperture condition, the pursuit directions for all three tilts were much closer to vertical, and observers reported that the line segments appeared to move incoherently. Pursuit was predominantly in the TM direction, as reflected by the very low values of the mean slopes: 0.23, 0.22, and 0.31 for the −30 deg, 0 deg, and +30 deg tilts, respectively. The nonzero slopes suggest a small IOC contribution. However, there was little evidence for a VA bias. The mean vertical shifts were −1.0 deg for the −30 deg tilt and +0.8 deg for the +30 deg tilt.

Psychophysical data

The proportion judged rightward was computed for each stimulus condition. These psychophysical data and the model predictions are compared in Fig. 5. The psychophysical data, averaged across observers, are shown in Fig. 5A, along with the predictions of the IOC, VA, and TM rules in Fig. 5B.


Fig. 3. Typical eye-movement traces from Experiment 1. Eye-movement data from single trials (observer LS) for both directions of object motion (±10 deg) and each of the three tilts are shown. The displayed data from the last two cycles of motion were low-pass filtered (−3 dB cutoff at 25 Hz). In the visible-aperture conditions, the pursuit direction was always close to the direction of object motion. In the invisible-aperture conditions, pursuit for both directions of object motion was similar and always close to the vertical motion of the segment terminators. In all conditions, there was little or no effect of tilt.

Both the IOC and TM predictions are independent of tilt. The IOC rule always predicts perfect performance (100% correct), while the TM rule always predicts random performance (50% correct). Biases toward the TM direction thus would produce symmetric shifts in the percent rightward toward 50% (an increase for the −10 deg direction and a decrease for the +10 deg direction). The VA predictions are strongly influenced by the tilt. For both object-motion directions, the VA rule predicts that all judgments for the +30 deg tilt will be rightward, and that all those for the −30 deg tilt will be leftward. Biases toward the VA direction would thus produce a decreased likelihood of responding rightward for the −30 deg tilt, and an increased likelihood of responding rightward for the +30 deg tilt. For the 0 deg tilt, although the VA rule predicts perfect performance, because the actual VA-predicted directions of motion are so close to vertical (±1.3 deg), the addition of even a small amount of random noise would cause the expected proportions rightward to be closer to 50% for both directions.

For the visible-aperture condition, the data are similar to the IOC predictions. The direction judgments (averaged over observers and tilts) were 96% correct; hence the data show little evidence for a TM bias. There are two points with statistically significant differences (P < 0.05, one-tailed t-test) that suggest a small VA bias. When the parallelogram is tilted to the right (+30 deg) and the motion is leftward (−10 deg), the proportion judged rightward increases from 0% for the untilted condition to 8%. When the tilt is to the left (−30 deg) and the motion is rightward (+10 deg), it decreases from 98% for the untilted condition to 89%.

For the invisible-aperture condition, although perceptual performance was worse than in the visible-aperture condition, it was still significantly above chance (50%). For all tilts, the judgments appear to be influenced by both the TM and IOC directions, but not the VA direction. The average percent correct was 78%, which is between the IOC and TM predictions. The data, however, showed no evidence of an overall VA bias. There was no significant (P > 0.05, one-tailed t-test) decrease in rightward judgments for the leftward tilt (−30 deg), nor was there an increase in rightward judgments for the rightward tilt (+30 deg).

Summary

Our data show that under conditions for which the motion is largely coherent (visible-aperture condition), both pursuit and perceptual direction decisions are close to the IOC predictions (veridical), with a small bias toward the TM prediction and an even smaller bias toward the VA prediction. Under conditions for which the motion is largely incoherent (invisible-aperture condition), both pursuit and perceptual direction decisions are closer to the TM prediction, but nevertheless still appear to be influenced by a global motion signal (IOC), suggesting partial coherence. Furthermore, there appear to be residual non-motion cues that influenced the perceptual judgments. For the stimuli in Experiment 1, we identified two cues that could be used to make correct binary direction judgments without the need to group the segment motions into object motion.


Fig. 4. Pursuit directions and model predictions for Experiment 1. The average pursuit directions for each observer are plotted as a function of the object-motion direction for the three tilts (top, middle, and bottom rows), and the visible and invisible apertures (left and right columns). The IOC predictions (dashed lines) are identical to the object-motion direction. The VA predictions (dotted lines) are strongly biased toward the parallelogram’s tilt. The TM predictions (solid lines) are always vertical (0 deg). In the visible-aperture condition, pursuit was predominantly in the object-motion direction, while in the invisible-aperture condition, it was largely vertical. None of the conditions shows the large offsets predicted by the VA direction. For clarity, the standard errors are not shown; they were similar across observers and conditions (mean: 0.5 deg).

First, the distance between the segments is relatively larger in the aperture corresponding to the direction of motion, which provides a static-separation cue. Observers could use this cue and base their judgments on which pair of segments had a larger separation at the bottom of the trajectory. Second, the distance between the segments increases on the side corresponding to the direction of motion and decreases on the opposite side, which provides a cue based on the temporal change in separation. Observers could use this cue and base their judgments on the separation of each pair of segments at the top of the trajectory relative to its separation at the bottom. Indeed, some observers reported using these cues to make their judgments. We therefore performed a second experiment designed to minimize the effectiveness of these cues. First, we introduced a random left/right offset into the parallelogram’s spatial location, which negated the static-separation cue. Second, we randomly zoomed (positively or negatively) the parallelogram, which dynamically changed its size and reduced the effectiveness of the change-in-separation cue.

The first experiment showed that both the perceptual and pursuit responses were mostly related to the IOC and TM directions, with little effect of the VA direction. Therefore, the second experiment used only untilted parallelograms and focused

on examining the differences between the visible (coherent) and invisible (incoherent) aperture conditions in greater detail by increasing the number of directions of motion.

Experiment 2

Methods

Four observers participated in this experiment (two authors: BB & LS, and two naïve: RB & TA). All observers had normal or corrected-to-normal vision. Before data collection began, observers practiced making the psychophysical judgments and tracking the stimuli in preliminary runs. The stimuli were similar to those used in Experiment 1, except for the following differences. The apertures were 2.5 deg wide, only the untilted parallelogram was used, and the motion amplitude was 2.25 deg. Instead of being centered about the middle of the screen, on each trial the parallelogram’s trajectory was randomly chosen to have either a rightward or leftward overall position offset (±0.72 deg). In addition, the parallelogram’s linear

dimensions were not fixed; they changed at a constant rate over time (positive or negative zoom). The parallelogram’s average dimensions were 5.5 deg × 15.0 deg, but each trial was randomly chosen to have either a 14.4% expansion or contraction over the duration of the trial. To minimize the effects of prediction on pursuit, for five trajectories of motion (±9 deg, ±3 deg, and 0 deg) the initial phase was chosen so that the initial vertical component of the motion was downward, while for three other trajectories (±6 deg, 0 deg) the initial phase was chosen so that the initial motion was upward. The stimulus duration was 1.6 s, and the eye-movement data corresponding to the last 1.25 cycles of stimulus motion were analyzed to determine the pursuit direction. Each possible combination of zoom (2), offset (2), trajectory (8), and aperture type (2) was presented, for a total of 64 conditions. Each run consisted of 256 trials (4 repetitions per condition), and data for the two aperture conditions, invisible and visible, were collected in separate 128-trial blocks. The invisible- and visible-aperture blocks were alternated, and three runs were performed.

Fig. 5. Perceptual judgments and model predictions for Experiment 1. A: The histogram bars show the average (across observers) proportion of rightward decisions and the error bars are the standard deviations. B: The proportion rightward predicted by the IOC, VA, and TM rules. The visible-aperture data are similar to the IOC predictions. The invisible-aperture data are between the IOC and TM predictions.

Oculometric analysis

To compare the pursuit data directly with the psychophysical data, we computed an oculometric function (Kowler & McKee, 1987; Beutter & Stone, 1998a). As for the raw pursuit-direction data, we corrected for any overall bias (calibration induced or observer specific) by first subtracting the grand-median direction across all conditions from each trial’s direction. Then for each trial, if the pursuit direction was rightward of straight down, a rightward oculometric response was chosen, and similarly, if the pursuit direction was leftward of straight down, a leftward oculometric response was chosen. From these, the proportion of rightward oculometric responses was computed to generate an oculometric function analogous to the standard psychometric function.
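In code, the oculometric decision rule reduces to a sign test on each trial's bias-corrected pursuit direction. A minimal sketch, with assumed variable names:

```python
import numpy as np

def oculometric_function(pursuit_dirs_deg, motion_dirs_deg):
    """Convert per-trial pursuit directions into an oculometric function:
    the proportion of 'rightward' oculometric responses per motion direction."""
    d = np.asarray(pursuit_dirs_deg, dtype=float)
    m = np.asarray(motion_dirs_deg, dtype=float)
    d -= np.median(d)                   # remove overall (grand-median) bias
    rightward = d > 0.0                 # left/right oculometric decision
    return {md: float(rightward[m == md].mean()) for md in np.unique(m)}
```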

Results

Raw pursuit data

The average pursuit direction for each observer (averaged across zooms and offsets) is plotted as a function of the object-motion direction in Fig. 6. In the visible-aperture condition (filled symbols), the pursuit direction was largely determined by the object-motion direction. The correlation coefficients of the fits were high (average r² across observers: 0.79). The average slope (across observers) was 0.38. Observer LS had a high slope, 0.80, while the other observers’ slopes were lower (RB, 0.38; BB, 0.19; and TA, 0.17), but all were significantly higher than 0 (P < 0.001, t-test). The variability in eye-movement directions for a given object-motion direction, as measured by the standard deviation (averaged across observers and object-motion directions), was 4.5 deg. To determine whether the low slopes were caused by the expectation of a largely vertical trajectory, which could produce higher vertical gains via a cognitive expectation (Kowler, 1990), we performed a control experiment on two observers with low slopes using a fully visible line-figure parallelogram. In this condition, both observers pursued with slopes much closer to 1.0 (BB, 0.94; and RB, 0.83), ruling out this possibility.

In the invisible-aperture condition (open symbols), pursuit was predominantly vertical, with little dependence on the object-motion direction. The correlation coefficients were small (average r² across observers: 0.26). Observer LS’s slope was reduced from 0.80 in the visible-aperture condition to only 0.03 in the invisible-aperture condition. The other observers also had much smaller slopes (RB, 0.01; BB, 0.02; and TA, 0.01). None was significantly different from 0 (P > 0.05), which suggests that the addition of zooms and offsets made the stimulus motion fully incoherent in the invisible-aperture condition (perhaps at the expense of making the visible-aperture condition less coherent). The variability in eye-movement directions for a given object-motion direction was slightly smaller than that for the visible-aperture condition: the standard deviation (averaged across observers and object-motion directions) was 3.6 deg.

Psychophysical analysis

The observers informally reported that in the visible-aperture condition, they perceived a largely coherent moving parallelogram, while in the invisible-aperture condition, the motion appeared to be a set of incoherent moving line segments. The proportion judged rightward was computed for each object-motion direction (averaged across zooms and offsets). The psychophysical data are plotted in the left column of Fig. 7. The visible-aperture data are shown as filled squares and the invisible-aperture data are shown as open squares. The data were fit to cumulative Gaussians (solid lines), and the direction threshold was defined to be the standard deviation of the Gaussian. For the visible-aperture condition, observers were able to judge the direction of object motion precisely. The average threshold was 4.2 deg (s.d. across observers: 2.2 deg). For the invisible-aperture condition, performance was much worse. The average threshold was 21.7 deg (s.d. across observers: 14.9 deg). Thus, changing the aperture from visible to invisible increased the psychophysical thresholds by approximately a factor of 5.
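The threshold extraction amounts to fitting a cumulative Gaussian whose mean captures any left/right bias and whose standard deviation is the direction threshold. A sketch using scipy (the optimizer choice and starting values are our assumptions):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def direction_threshold(motion_dirs_deg, prop_rightward):
    """Fit proportion-rightward vs. direction with a cumulative Gaussian;
    return (bias mu, threshold sigma), where sigma is the fitted s.d."""
    cdf = lambda x, mu, sigma: norm.cdf(x, loc=mu, scale=sigma)
    (mu, sigma), _ = curve_fit(cdf, motion_dirs_deg, prop_rightward,
                               p0=(0.0, 5.0))
    return mu, sigma

# Example with made-up proportions at the Experiment 2 directions:
# direction_threshold([-9, -6, -3, 0, 3, 6, 9],
#                     [0.02, 0.10, 0.30, 0.50, 0.70, 0.90, 0.98])
```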

Fig. 6. Pursuit directions for Experiment 2. The average pursuit directions for each observer are plotted as a function of the object-motion direction for both the visible (filled squares) and invisible apertures (open squares). The lines are the best linear fits to the data. For all observers, pursuit more closely followed the object-motion direction in the visible-aperture condition than in the invisible-aperture condition. For clarity, the standard errors are not shown; they were similar across observers and conditions (mean: 0.5 deg).

Oculometric analysis

As for the psychophysical data, the oculometric proportion judged rightward (right column of Fig. 7) was computed for each object-motion direction (averaged across zooms and offsets), and the data were fit to cumulative Gaussians (solid lines). In the visible-aperture condition (filled squares), the average threshold was 14.1 deg (s.d. across observers: 7.6 deg). For all observers, the slope of the oculometric functions (1/threshold) was significantly different from 0 (P < 0.05). In the invisible-aperture condition (open squares), performance for all directions of object motion was close to random (0.5) for all observers. The invisible-aperture thresholds were nearly infinite (average across observers: 154 deg). For all observers, the slope of the oculometric functions (1/threshold) was not significantly different from 0 (P > 0.05). Despite the higher thresholds for the oculometric as compared to the psychometric data, changing the aperture from visible to invisible produced large increases (ranging from a factor of 4 to 29) in the oculometric thresholds for all observers. Thus, the oculometric results mirror the psychophysical results; performance with visible apertures was much better than that with invisible apertures.

The observed inter-subject differences are also consistent with the view that pursuit and perception are linked. Observers who made more accurate perceptual direction discriminations also generated pursuit that more accurately tracked the object motion, while those who made poorer perceptual discriminations pursued less accurately. In the visible-aperture condition, the observers with low psychometric thresholds (LS and RB) also had low oculometric thresholds and high eye-movement slopes, while those with high psychophysical thresholds (BB and TA) also had high oculometric thresholds and low eye-movement slopes.

Modeling

To explore further the relationships between both the visible- and invisible-aperture data, and the psychophysical and pursuit data, we constructed the simple model shown in Fig. 8. The model assumes that the image is processed by two parallel pathways. A single motion-processing stage is used both to generate pursuit and to help support perceptual direction decisions. A static-processing stage is used to extract non-object-motion cues that do not influence pursuit, but do contribute to the perceptual decision. The Appendix contains a detailed description of the computations and the noise sources used to generate the quantitative predictions. Briefly, the input to the motion-processing stage is the stimulus and its output is a noisy direction signal, which we estimate from the pursuit data and eye-tracker directional noise. Similarly, the input to the static-processing stage is the stimulus and its output is a noisy direction signal, which is based on non-motion cues to the direction of object motion. In the invisible-aperture condition, because the eye movements are essentially vertical and independent of the object-motion direction, the motion-processing stage provides no information about the object-motion direction. Thus, we determined the static-processing stage’s contribution to perception directly from the invisible-aperture psychophysical data. We assume that both visual-processing stages contain Gaussian noise sources, that the motor system adds negligible noise, and that the eye tracker adds additional Gaussian directional noise to the measured pursuit direction.

The oculomotor system has access only to the output of the motion-processing stage, which it uses to generate pursuit. The eye-tracker output is analyzed to compute the direction of pursuit for each trial. The measured pursuit direction is then used to make a left/right oculometric decision. The perceptual system, however, has access to both the motion- and static-processing stages. We assume that it optimally combines the signals from these two stages to generate a perceptual signal. This combined signal is then used to make a left/right perceptual decision.

The model uses a single free parameter that controls the total amount of eye-tracker directional noise to predict the visible-aperture psychophysical data for all four observers (dashed lines in Fig. 7). The model is also consistent with the three other data sets: the visible-aperture oculometrics, and the invisible-aperture psychophysics and oculometrics (the model fits to these data are not shown because they are nearly identical to the cumulative-Gaussian fits).

Fig. 7. Psychometric and oculometric functions for Experiment 2. The average psychophysical (left column) and oculometric (right column) proportion rightward for each observer are plotted as a function of the object-motion direction for both the visible (filled squares) and invisible apertures (open squares). The best-fitting cumulative Gaussians are shown as the solid lines. Model predictions for the visible-aperture psychophysical data are shown as dotted lines. The model prediction for BB is obscured by the Gaussian fit to the data. For both the oculometric and psychophysical data, the curves are much steeper for the visible-aperture condition than for the invisible-aperture condition.

More specifically, it accurately describes the invisible-aperture oculometric data; for this condition, there is no motion-processing signal and the model predicts 50% rightward (random performance) for all object-motion directions. It also accurately describes the invisible-aperture psychophysical data; for this condition, the model predictions are by design identical to the fits to the data, because the motion-processing signal does not contribute and the parameter controlling the strength of the static-processing signal is directly determined by the fit to the invisible-aperture psychophysical data. The visible-aperture oculometric performance is computed from two additional parameters measured directly from the mean pursuit-direction data (slope and directional uncertainty). Finally, the model generates predictions for the visible-aperture psychophysical data using these three measured parameters and the single free parameter that estimates the total eye-tracker directional noise.
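The Appendix (not reproduced here) contains the exact computations; the sketch below records our own reading of the model's core arithmetic, assuming independent Gaussian noise sources whose variances add, with illustrative parameter names.

```python
import numpy as np

def predict_visible_psychophysical_threshold(oculometric_sigma_deg,
                                             tracker_sigma_deg,
                                             static_sigma_deg):
    """Perceptual threshold predicted from an optimal (inverse-variance-
    weighted) combination of the shared motion signal and the static signal.
    The motion-stage noise is taken as the measured oculometric noise with
    the eye-tracker directional noise removed (independent sources, so the
    variances subtract); assumes oculometric_sigma > tracker_sigma."""
    motion_var = oculometric_sigma_deg**2 - tracker_sigma_deg**2
    combined_var = 1.0 / (1.0 / motion_var + 1.0 / static_sigma_deg**2)
    return float(np.sqrt(combined_var))
```

On this reading, removing more tracker noise lowers the inferred motion-stage variance, which is why the value of the tracker-noise parameter matters for the predicted visible-aperture thresholds (see Discussion).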

The predictions were indistinguishable from the data for three observers, LS, RB, and BB (P > 0.05), while observer TA’s measured threshold was 38% lower than the model’s prediction.

Trial-by-trial correlation analysis

A key test of both our model and the hypothesis that pursuit and perception share a common motion-processing stage is provided by measurements of the trial-by-trial correlation between the oculometric and perceptual decisions for the straight-down direction of motion. The straight-down direction (zero-signal condition) is particularly interesting because both the perceptual and pursuit decisions are equally likely to be leftward or rightward (Britten et al., 1992). If a common motion-processing stage is shared by both, then stochastic variations in its output (noise) will affect both perception and pursuit similarly on each trial, and they will therefore be correlated. On the other hand, if there are separate and independent motion-processing stages for perception and pursuit,

then the two decisions will be uncorrelated and will be the same only by chance. In the visible-aperture condition, there is a strong object-motion signal. Therefore, in our model, perceptual decisions are based primarily on this motion signal, and pursuit is driven exclusively by this signal. Thus, the model predicts that the perceptual and pursuit decisions will be correlated, that is, more often the same than predicted by chance. However, the eye tracker adds considerable directional noise to the measured pursuit directions that does not affect the perceptual decisions, and thus lowers the observed correlations. In the invisible-aperture condition, the motion pathway provides little or no information about the object-motion direction. Therefore, in our model, perceptual decisions are based primarily on the independent static-processing signal, while pursuit is now driven only by the irrelevant segment-motion signal. Thus, the model predicts that the perceptual and pursuit decisions will be uncorrelated.

The actual probability of both decisions being the same is the sum of the probability of both decisions being rightward and the probability of both decisions being leftward. If the signals are uncorrelated, chance predicts that the probability of both decisions being rightward/leftward is simply the product of the probabilities of each decision being rightward/leftward. We therefore measured the actual number of decisions that were the same, and compared this with both the chance and model predictions (Fig. 9). In the visible-aperture condition, the data were significantly (P < 0.05) higher than the chance predictions, yet the hypothesis that they were generated by the model could not be rejected. In the invisible-aperture condition, the data were not significantly different from either the chance or model predictions. Thus, for visible apertures, when coherent object motion is perceived, perception and pursuit are correlated on a trial-by-trial basis. This provides strong evidence for a single shared motion-processing signal, as in our model, as opposed to similar but independent signals. For invisible apertures, when there is no global motion percept indicating the direction of object motion, perception and pursuit are uncorrelated and appear to be based on separate and independent processes, again consistent with the model.

Fig. 8. Model of oculomotor and perceptual processing. A motion-processing stage and a static-processing stage are used to predict perceptual and oculometric decisions. Pursuit is based only on the output of the motion-processing stage, and its measurement also contains directional noise added by the eye tracker. Perception is determined by the optimal combination of the outputs of both the motion-processing and static-processing stages. A detailed explanation is presented in the Appendix.

Fig. 9. Trial-by-trial correlation of pursuit and perceptual decisions. The proportions of the trials for which the oculometric and the perceptual decisions were the same are shown for the four observers (LS: horizontal stripes, RB: unfilled, BB: solid, and TA: diagonal stripes). The first column (Data) shows the measured proportion of same decisions for the straight-down condition in Experiment 2. The horizontal dashed line shows the mean across observers and the error bars indicate its 95% confidence region. The second column (Chance) shows the proportion of same judgments that are predicted if pursuit and perception are uncorrelated. The third column (Model) shows the proportion of same judgments predicted by our model. A: The visible-aperture data are not significantly different from the model predictions, but are significantly higher than the chance predictions. B: The invisible-aperture data are not significantly different from either the model or chance predictions.
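The chance prediction described in the text is a one-line computation; a minimal sketch:

```python
def chance_agreement(p_right_percept, p_right_pursuit):
    """P(same decision) if perception and pursuit are independent:
    both rightward or both leftward."""
    p, q = p_right_percept, p_right_pursuit
    return p * q + (1.0 - p) * (1.0 - q)

# In the zero-signal (straight-down) condition both probabilities are near
# 0.5, so chance agreement is near 0.5: chance_agreement(0.5, 0.5) == 0.5.
```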

The above results establish a clear correlation between the direction estimation underlying perceptual and pursuit performance, but from these data alone we cannot infer the direction of causality. Indeed, this correlation could be caused by either shared visual or shared oculomotor links between perception and pursuit (Yasui & Young, 1975; Pola & Wyatt, 1989). Previous studies of direction perception using similar partially occluded line-figure stimuli (Lorenceau & Shiffrar, 1992) have shown that the perceptual dichotomy between the visible- and invisible-aperture conditions is present even during fixation, and therefore is not due to oculomotor feedback. To confirm this older result under our exact stimulus conditions, we compared the direction judgments of one observer (LS) during runs for which the instruction was to pursue (trials for which the pursuit gain was <0.75 were discarded) with those during runs for which the instruction was to maintain fixation on a central cross throughout the trial (trials for which the pursuit gain was >0.15 were discarded). In the visible-aperture conditions, the direction thresholds were ≈2 deg or less, and in the invisible-aperture conditions, they were ≈9 deg or more, regardless of whether the observer was pursuing or fixating. This large pursuit-independent dichotomy demonstrates that the aperture effect on direction perception does not require pursuit, and shows that the observed correlations described above are at least in part due to shared visual processing.

Discussion

Pursuit models have traditionally assumed that motion on the retina (retinal slip) is used to control smooth-pursuit eye movements through an external negative feedback loop with internal positive feedback to enhance performance (e.g. Robinson et al., 1986; Krauzlis & Lisberger, 1991; Ringach, 1995). Most current models implicitly assume that pursuit is a process independent of perception, although there is evidence that perception (Steinbach, 1976; Wyatt & Pola, 1979) and higher-order cognitive expectations (e.g. Kowler, 1989, 1990) can influence pursuit. Our results show that pursuit performance depends on perceived motion, independent of raw retinal slip, indicating that the visual input for pursuit models needs to be reexamined (Stone et al., 1996; Krauzlis & Stone, 1999). The stimulus motion for our visible- and invisible-aperture conditions was identical; all that differed was the aperture luminance. If pursuit depended only on image motion, the eye movements for these two conditions would be identical. Our data clearly show that they are very different. The processing for pursuit must therefore be more complex than the simple minimization of retinal slip. Pursuit computations are apparently able to integrate multiple local motions across space to calculate an object-motion signal, but only under conditions in which perception is also able to do so. This finding complements recent results showing that human pursuit of a partially occluded line-figure object moving along two-dimensional trajectories is largely accurate (Stone & Beutter, 1998) under conditions in which perception is largely accurate (Lorenceau, 1998). Determining the veridical direction of object motion requires a nonlinear combination of the motion of the components (e.g. the IOC rule).
In Experiment 1, we examined three motion-integration rules that might be used to determine the direction of pursuit from the motion of the four segments: intersection of constraints, vector average, and terminator motion. For the visible-aperture condition, which was generally perceived as a parallelogram moving coherently, pursuit and perception were similar, and both were close to the IOC prediction, with small biases toward the TM direction and

little influence of the VA direction. For the invisible-aperture condition, the percept was four incoherently moving line segments, and both perception and the eye movements were close to the TM prediction and again showed little evidence of a VA bias. Thus, for identical stimulus motion, a change in aperture luminance changes the motion-integration process, resulting in different percepts and correspondingly different pursuit.

The similar performance for pursuit and perception would result from a single shared pathway, but could also occur if there were similar but independent processing stages operating in parallel. These two possibilities can be distinguished by examining the trial-by-trial variability of pursuit and perception. If there are two separate processing stages, the pursuit and perceptual responses will only be randomly correlated on a trial-by-trial basis, while if there is a shared processing stage, the responses will be correlated. We found that, for visible apertures (coherent motion), pursuit and perception were correlated, suggesting a shared global motion signal. However, for invisible apertures (incoherent motion), pursuit and perception were uncorrelated, suggesting that, in the absence of a global motion signal, perception makes use of other cues unavailable to pursuit. Therefore, it is likely that the changes in pursuit and perception produced by changes in aperture luminance are the result of a common neural mechanism which processes motion within visible and invisible apertures differently. This shared motion integration may begin in area MT, and has been linked to the perceptual coherence of plaids (Stoner & Albright, 1992; Albright & Stoner, 1995). More recent results illustrating that there are parallel changes in perception, pursuit, and MT responses support the view that MT is part of the neural substrate shared by pursuit and motion perception (Dobkins et al., 1998).

Vector averaging for pursuit and perception

We find little evidence for the vector average of local motions driving pursuit or perception. An important difference between our experiment and those that did find evidence for vector averaging (Wilson et al., 1992; Yo & Wilson, 1992; Lisberger & Ferrera, 1997) is that we measured steady-state pursuit and perception using stimuli with long durations (>1 s), while the other studies examined only the initiation of pursuit or the initial percept. Lisberger and Ferrera (1997) measured the initiation of pursuit in monkeys in the first ≈200 ms after the onset of target motion. Two spots appeared and each was equally likely to become the target. The monkey’s task was to maintain eye position within 3 deg of the target. After 150 ms of motion, one of the spots disappeared and the other became the target. Measurements of the direction of eye acceleration showed that it was consistent with the computation of a weighted vector average of the velocities of the two spots. This strategy enabled the monkeys to maximize their rewards, because it minimizes the distance between eye position and both potential targets. Thus, this study shows that monkeys are able to initiate pursuit in the VA direction when such behavior is rewarded. However, averaging behavior is not observed under other conditions; in fact, Ferrera and Lisberger (1995, 1997) have shown that under conditions in which prior information is available about which spot will be the target, pursuit accurately follows the target motion and vector averaging is not observed.
There is also psychophysical evidence that some brief stimuli are perceived to move in the VA direction. Yo and Wilson (1992) found that for short durations (60 ms), Type II plaids (plaids for which the IOC direction is not between the component directions) are perceived to move in the VA direction, while at longer durations

(150 ms), they are perceived to move in the IOC direction. Although they did not measure eye movements, these psychophysical results combined with Lisberger and Ferrera’s (1997) pursuit results raise the interesting possibility that an early motion signal in the VA direction is used to drive the initial components of both pursuit and perception. However, a recent study of ultra-short-latency vergence eye movements has demonstrated an early smooth eye-movement response that appears independent of perceived motion in depth (Masson et al., 1997). The relationship between the earliest pursuit and perceptual responses is an interesting area for future research.

Other conditions have also been shown to produce perceived motion in the VA direction for longer stimulus durations. Yo and Wilson (1992) showed that both very low contrast Type II plaids and plaids moving in the periphery produce large perceptual biases toward the VA direction. Wilson and Kim (1994) found that non-Fourier (drifting beats) Type II plaids also produced large VA biases in the perceived direction, even for 1-s durations. A single shared motion-processing stage would predict that these stimuli would also produce pursuit biased toward the VA direction. Unfortunately, these studies did not measure eye movements, so comparison of the directions of pursuit and perception under these conditions awaits future studies.

Non-motion cues

A confounding factor in our first experiment was the presence of non-motion cues that aided the perceptual decision. Despite the perceptual incoherence of the stimuli in the invisible-aperture conditions, the psychophysical performance was better than chance (78% correct). To minimize the effects of non-motion cues, we introduced random zooms and offsets in Experiment 2. This produced invisible-aperture psychophysical thresholds that were approximately a factor of 5 higher than the visible-aperture thresholds. For object-motion directions of ±9 deg, performance in the invisible-aperture condition was poor (63% correct) but not completely random, compared to the nearly perfect performance (95% correct) in the visible-aperture condition. These results suggest that perception uses non-motion cues when they are available, and that if direction-discrimination thresholds are to be used as an indirect measure of coherence, the stimuli must be carefully designed to reduce the usefulness of such cues.

A second confounding factor in both experiments could have been the known influence of cognitive expectations on pursuit (e.g. Kowler, 1989, 1990). Because our stimuli always moved in a predominantly vertical direction, expectation might bias the oculomotor system to produce eye movements that were more vertical than the actual object motion. Although cognitive expectations could potentially explain the observed low eye-movement slopes in the visible-aperture condition, the high slopes found in the control experiment with a fully visible parallelogram, but identical cognitive expectations, rule out this possibility.

Modeling

To quantitatively examine the hypothesis that pursuit and perception share a common motion-processing stage, we proposed an explicit model. Our data are well predicted by a simple model which explicitly assumes that a single motion-processing stage is used to drive both pursuit and perception, and that an additional static pathway is available to aid perception, but not pursuit. To determine the model predictions for the visible-aperture

psychophysics, we varied a single parameter which estimates the eye-tracker directional noise, σ_TR. We determined σ_TR separately for each observer using a single overall-scaling free parameter and each observer’s measured positional uncertainty from the calibrations (see Appendix). The estimated values of σ_TR across the observers ranged from 3.4 deg to 4.6 deg. These values appear too high, because they are greater than the ≈1 deg estimate obtained by error propagation from an idealized saccade-free eye movement. Yet using significantly lower tracker-noise values would cause the predicted thresholds to be higher than the observed psychophysics. Our model, however, neglects three potentially important issues.

The first issue is that it assumes that the oculomotor system adds insignificant noise. It is likely that this is untrue and that the oculomotor system does add measurable directional noise, which our model would simply treat as additional eye-tracker directional noise. Although few experiments have measured the directional noise of the pursuit system, one recent study lends support to the idea that oculomotor processing increases directional noise. Watamaniuk and Heinen (1999) measured both pursuit and perceptual direction-discrimination thresholds for moving random-dot fields with added directional noise. They concluded that both pursuit and perception had internal noise sources that were nearly equal, but that pursuit thresholds were higher because “the oculomotor system multiplies the noise that the visual system passes to it.” In fact, estimates obtained from their Fig. 8 suggest that this increase in noise is ≈3 deg, which would be consistent with our noise estimate. We did not add an oculomotor noise source to our model, because having two consecutive noise sources, oculomotor and eye tracker, is canonically equivalent to a single noise source whose variance is simply the sum of the two variances. Thus, adding an oculomotor noise source as an additional parameter to our model would have no impact on its performance.

The second issue is that our model assumes that the motion-processing signal is constant over time. It is likely that within some visible-aperture trials, there were temporal intervals in which the motion was coherent (close to the IOC direction) and other temporal intervals in which it was incoherent (close to straight down). This would not greatly impact the perceptual decision, but would have two major impacts on the measured visible-aperture eye movements. First, if both the coherent and incoherent subintervals were fit to a single direction, it would produce fits with smaller angles, and thus yield measured slopes below 1.0, as observed. Second, it would increase the variability of the fits, and thus yield higher pursuit-direction standard deviations relative to the largely incoherent invisible-aperture data, as observed. In future experiments, we will explore eye-movement analysis procedures that allow identification of subintervals of coherent and incoherent tracking.

The third issue is that our model assumes that visual feedforward mechanisms alone are responsible for the observed correlation between pursuit and perceptual behavior. Eye-movement corollary discharge (oculomotor feedback) could also provide a link between perception and pursuit (e.g. Yasui & Young, 1975; Pola & Wyatt, 1989) and could have contributed to the observed correlations.
The fact that the main perceptual effect of aperture visibility occurs even during fixation (Lorenceau & Shiffrar, 1992) rules out the possibility that the observed link between perception and pursuit is caused entirely by feedback mechanisms. Furthermore, unlike the steady-state pursuit of spot stimuli, even during perfect steady-state pursuit of our stimuli there is still large retinal slip that requires ongoing motion processing. These two additional facts allow us to infer that shared visual input must be at least partially responsible for our observations, and our model assumes that this is the dominant factor. Nonetheless, we cannot rule out the possibility that oculomotor feedback plays an important role as well. Although a large number of studies have examined the role of eye-movement signals in speed perception using small spot targets (e.g. Brenner & van den Berg, 1994; Freeman & Banks, 1998; Turano, 1999), further studies will be needed to determine to what extent eye-movement signals play a role in motion integration and direction perception.

Coherence monitoring

Our results show that monitoring smooth-pursuit eye movements can be used to measure perceptual motion coherence. While the decision to pursue or not to pursue is under volitional control, the speed and direction of pursuit eye movements generally are not (see, however, Barnes et al., 1997), and are instead primarily determined by cortical computations of the target velocity. Thus, if pursuit is in the same direction as the object motion, it indicates that the stimulus was a coherent object. Conversely, for incoherent stimuli, pursuit will be in a different direction; for our invisible-aperture stimuli, pursuit was largely in the TM direction.

The use of pursuit as a coherence measure offers many advantages over either direct coherence judgments (e.g. Alais et al., 1998; Adelson & Movshon, 1982) or inferring coherence from indirect psychophysical judgments of object velocity (e.g. Welch & Bowne, 1990; Lorenceau & Shiffrar, 1992). Direct judgments of coherence are subjective, and thus can both change over time and be biased. Indirect but objective measurements of coherence have the inherent difficulty that relevant differences in object motion must also necessarily produce changes in the component motions that can serve as cues, which are independent of coherence. Thus, even for incoherent stimuli, observers will generally be able to perform the task at a level above chance, depending on the effectiveness of the component cues. Another drawback of relying on an indirect psychophysical approach to determine coherence is that a single judgment is obtained for each trial, which precludes examining how coherence changes over the time course of an individual trial. Using pursuit as the metric could allow coherence to be measured over time throughout a trial, as in the sketch below. Finally, coherence is not an all-or-nothing phenomenon, but instead ranges from fully rigid object motion through nonrigid object motion to independent component motion. Pursuit could be used to provide a continuous measure of coherence during a trial, which is not possible with standard discrete psychophysical judgments (usually binary).
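To make this proposal concrete, the following minimal Python sketch (ours, not a procedure from the paper; the function name, window length, and classification rule are illustrative assumptions) computes a window-by-window pursuit direction from eye-velocity samples and labels each window as object-like (coherent) or terminator-like (incoherent):

```python
import numpy as np

def coherence_index(eye_vx, eye_vy, obj_dir, tm_dir, fs=250.0, win=0.2):
    """Label each time window as object-like (1) or terminator-like (0).

    eye_vx, eye_vy : arrays of horizontal/vertical eye velocity (deg/s)
    obj_dir, tm_dir : object- and terminator-motion directions (deg re: vertical)
    fs : sampling rate (Hz); win : window length (s)
    """
    n = int(win * fs)
    labels = []
    for start in range(0, len(eye_vx) - n + 1, n):
        vx = eye_vx[start:start + n].mean()
        vy = eye_vy[start:start + n].mean()
        pursuit_dir = np.degrees(np.arctan2(vx, vy))  # direction re: vertical
        # nearer to the object direction -> coherent; nearer to TM -> incoherent
        labels.append(int(abs(pursuit_dir - obj_dir) < abs(pursuit_dir - tm_dir)))
    return np.array(labels)
```

Averaging the resulting index over a trial would give a graded coherence measure between 0 and 1, and its time course would expose within-trial switches between coherent and incoherent tracking.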
In conclusion, our data show not only that the aperture luminance affects pursuit and perception similarly, but also that pursuit and perception of coherent stimuli are correlated on a trial-by-trial basis. These results provide strong evidence that pursuit and perception share a motion-integration stage which makes a common decision either to selectively integrate the local motions into an object-motion signal or to leave the local motions unlinked. Physiological evidence from monkeys shows that areas MT and MST are involved in this shared motion-integration process. Our results demonstrate that the primary signal used to drive both perception and steady-state pursuit is a signal related to perceived object motion, rather than the simple vector average of local motions, or retinal slip and its derivatives.

Acknowledgments

The authors thank Dr. Jean Lorenceau for critical insights and helpful discussions. We also thank Drs. David Grosof and Jeffrey W. McCandless and an anonymous reviewer for their constructive comments on earlier drafts, and the NASA Ames Vision Group for their general support. This work was supported by NASA RTOPs 131-20-30 and 548-51-12 to L.S. Stone.

References

Adelson, E.H. & Movshon, J.A. (1982). Phenomenal coherence of moving visual patterns. Nature 300, 523–525.
Alais, D., van der Smagt, M.J., van den Berg, A.V. & van de Grind, W.A. (1998). Local and global factors affecting the coherent motion of gratings presented in multiple apertures. Vision Research 38, 1581–1591.
Albright, T.D. & Stoner, G.R. (1995). Visual motion perception. Proceedings of the National Academy of Sciences of the U.S.A. 92, 2433–2440.
Barnes, G., Grealy, M. & Collins, S. (1997). Volitional control of anticipatory ocular smooth pursuit after viewing, but not pursuing, a moving target: Evidence for a re-afferent velocity store. Experimental Brain Research 116, 445–455.
Beutter, B.R., Mulligan, J.B. & Stone, L.S. (1995). Analysis of the trial-by-trial correlation between eye movements and the perceptual responses to moving plaids. Society for Neuroscience Abstracts 21, 141.
Beutter, B.R. & Stone, L.S. (1996). Quantifying the correlation between eye-movement and perceptual responses to moving plaids. Investigative Ophthalmology and Visual Science (Suppl.) 37, 738.
Beutter, B.R. & Stone, L.S. (1997). Pursuit and direction perception are driven by similar and largely veridical object-motion signals. Investigative Ophthalmology and Visual Science (Suppl.) 38, 693.
Beutter, B.R. & Stone, L.S. (1998a). Human motion perception and smooth eye movements show similar directional biases for elongated apertures. Vision Research 38, 1273–1286.
Beutter, B.R. & Stone, L.S. (1998b). Oculometric analysis of pursuit eye movements can measure motion coherence. Society for Neuroscience Abstracts 24, 1410.
Brenner, E. & van den Berg, A.V. (1994). Judging object velocity during smooth pursuit eye movements. Experimental Brain Research 99, 316–324.
Britten, K.H., Shadlen, M.N., Newsome, W.T. & Movshon, J.A. (1992). The analysis of visual motion: A comparison of neuronal and psychophysical performance. Journal of Neuroscience 12, 4745–4765.
Britten, K.H. & van Wezel, R.J.A. (1998). Electrical microstimulation of cortical area MST biases heading perception in monkeys. Nature Neuroscience 1, 59–63.
Celebrini, S. & Newsome, W.T. (1995). Microstimulation of extrastriate area MST influences performance on a direction discrimination task. Journal of Neurophysiology 73, 437–448.
Dobkins, K.R., Stoner, G.R. & Albright, T.D. (1998). Perceptual, oculomotor and neural responses to moving color plaids. Perception 27, 681–709.
Dürsteler, M.R., Wurtz, R.H. & Newsome, W.T. (1987). Directional pursuit deficits following lesions of the foveal representation within the superior temporal sulcus of the macaque. Journal of Neurophysiology 57, 1262–1287.
Dürsteler, M.R. & Wurtz, R.H. (1988). Pursuit and optokinetic deficits following chemical lesions of cortical areas MT and MST. Journal of Neurophysiology 60, 940–965.
Fennema, C.L. & Thompson, W.B. (1979). Velocity determination in scenes containing several moving objects. Computer Graphics and Image Processing 9, 301–315.
Ferrera, V.P. & Lisberger, S.G. (1995). Attention and target selection for smooth pursuit eye movements. Journal of Neuroscience 15, 7472–7484.
Ferrera, V.P. & Lisberger, S.G. (1997). The effect of a moving distracter on the initiation of smooth pursuit eye movements. Visual Neuroscience 14, 323–338.
Freeman, T.C. & Banks, M.S. (1998). Perceived head-centric speed is affected by both extra-retinal and retinal errors. Vision Research 38, 941–945.
Goodale, M. & Milner, A. (1992). Separate visual pathways for perception and action. Trends in Neurosciences 15, 20–25.
Komatsu, H. & Wurtz, R.H. (1989). Modulation of pursuit eye movements by stimulation of cortical areas MT and MST. Journal of Neurophysiology 62, 31–47.
Kowler, E. & McKee, S.P. (1987). Sensitivity of smooth eye movement to small differences in target velocity. Vision Research 27, 993–1015.
Kowler, E. (1989). Cognitive expectations, not habits, control anticipatory smooth oculomotor pursuit. Vision Research 29, 1049–1057.
Kowler, E. (1990). The role of visual and cognitive processes in the control of eye movements. In Eye Movements and Their Role in Visual and Cognitive Processes, ed. Kowler, E., pp. 1–70. Amsterdam, New York, Oxford: Elsevier.
Krauzlis, R.J. & Lisberger, S.G. (1991). Visual motion commands for pursuit eye movements in the cerebellum. Science 253, 568–571.
Krauzlis, R.J. & Stone, L.S. (1999). Tracking with the mind’s eye. Trends in Neurosciences 22, 544–550.
Lisberger, S.G. & Ferrera, V.P. (1997). Vector averaging for smooth pursuit eye movements initiated by two moving targets in monkeys. Journal of Neuroscience 17, 7490–7502.
Lisberger, S.G. & Movshon, J.A. (1999). Visual motion analysis for pursuit eye movements in area MT of macaque monkeys. Journal of Neuroscience 19, 2224–2246.
Lorenceau, J. & Shiffrar, M. (1992). The influence of terminators on motion integration across space. Vision Research 32, 263–273.
Lorenceau, J. (1998). Veridical perception of global motion from disparate component motions. Vision Research 38, 1605–1610.
Masson, G.S., Busettini, C. & Miles, F.A. (1997). Vergence eye movements in response to binocular disparity without depth perception. Nature 389, 283–286.
Murasugi, C.M., Salzman, C.D. & Newsome, W.T. (1993). Microstimulation in visual area MT: Effects of varying pulse amplitude and frequency. Journal of Neuroscience 13, 1719–1729.
Newsome, W.T. & Paré, E.B. (1988). A selective impairment of motion perception following lesions of the middle temporal visual area (MT). Journal of Neuroscience 8, 2201–2211.
Newsome, W.T., Wurtz, R.H., Dürsteler, M.R. & Mikami, A. (1985). Deficits in visual motion processing following ibotenic acid lesions of the middle temporal visual area of the macaque monkey. Journal of Neuroscience 5, 825–840.
Pasternak, T. & Merigan, W. (1994). Motion perception following lesions of the superior temporal sulcus in the monkey. Cerebral Cortex 4, 247–259.
Pola, J. & Wyatt, H.J. (1989). The perception of target motion during smooth pursuit eye movements in the open-loop condition: Characteristics of retinal and extraretinal signals. Vision Research 29, 471–483.
Ringach, D.L. (1995). A ‘tachometer’ feedback model of smooth pursuit eye movements. Biological Cybernetics 73, 561–568.
Robinson, D.A., Gordon, J.L. & Gordon, S.E. (1986). A model of the smooth pursuit eye movement system. Biological Cybernetics 55, 43–57.
Rudolph, K. & Pasternak, T. (1999). Transient and permanent deficits in motion perception after lesions of cortical areas MT and MST in the macaque monkey. Cerebral Cortex 9, 90–100.
Salzman, C.D., Britten, K.H. & Newsome, W.T. (1990). Cortical microstimulation influences perceptual judgments of motion direction. Nature 346, 174–177.
Salzman, C.D., Murasugi, C.M., Britten, K.H. & Newsome, W.T. (1992). Microstimulation in visual area MT: Effects on direction discrimination performance. Journal of Neuroscience 12, 2331–2355.
Shiffrar, M. & Lorenceau, J. (1996). Increased motion linking across edges with decreased luminance contrast, edge width and duration. Vision Research 36, 2061–2067.
Steinbach, M.J. (1976). Pursuing the perceptual rather than the retinal stimulus. Vision Research 16, 1371–1376.
Stone, L.S., Beutter, B.R. & Lorenceau, J. (1996). On the visual input driving human smooth-pursuit eye movements. NASA Technical Memorandum 110424. Washington, DC: NASA.
Stone, L.S. & Beutter, B.R. (1998). Minimization of retinal slip cannot explain human smooth-pursuit eye movements. Society for Neuroscience Abstracts 24, 1743.
Stoner, G.R. & Albright, T.D. (1992). Neural correlates of perceptual coherence. Nature 358, 412–414.
Turano, K.A. (1999). Eye movements affect the perceived speed of visual motion. Vision Research 39, 1177–1187.
Watamaniuk, S.N.J. & Heinen, S.J. (1999). Human smooth pursuit direction discrimination. Vision Research 39, 59–70.
Welch, L. & Bowne, S.F. (1990). Coherence determines speed discrimination. Perception 19, 425–435.
Wilson, H.R., Ferrera, V.P. & Yo, C. (1992). A psychophysically motivated model for two-dimensional motion perception. Visual Neuroscience 9, 79–97.
Wilson, H.R. & Kim, J. (1994). Perceived motion in the vector sum direction. Vision Research 34, 1835–1842.
Wyatt, H.J. & Pola, J. (1979). The role of perceived motion in smooth pursuit eye movements. Vision Research 19, 613–618.
Yasui, S. & Young, L.R. (1975). Perceived visual motion as effective stimulus to pursuit eye movement system. Science 190, 906–908.
Yo, C. & Wilson, H.R. (1992). Perceived direction of moving two-dimensional patterns depends on duration, contrast and eccentricity. Vision Research 32, 135–147.

Appendix

This appendix describes our model (Fig. 8) and specifies the computations and noise sources used to generate quantitative simulations. The model assumes that the stimulus is processed by two parallel pathways. The eye-movement pathway, shown on the left, begins with a motion-processing stage, whose input is the stimulus motion and whose output is a noisy directional signal:

θ_M = g_M θ_I + η_M    (A1)

Its mean is equal to the direction of object motion (θ_I) multiplied by an angular gain factor (g_M). The added noise (η_M) is Gaussian distributed with a standard deviation of σ_M. The motion signal (θ_M) is then sent to the oculomotor system (which we assume adds a negligible amount of directional noise) and used to generate pursuit in the direction θ_EM = θ_M. The eye tracker outputs the measured eye position on each frame, which adds measurement noise. This positional noise, measured in the calibration procedure, determines the resultant directional noise (η_TR) that is added by the tracker, and is different for each observer. We fit the output of the eye tracker to determine the direction of the eye movement for each trial (θ_ET). We assume θ_ET has the same mean as the output of the motion-processing stage, with additional zero-mean Gaussian directional noise (η_TR) of standard deviation σ_TR:

θ_ET = θ_EM + η_TR    (A2)

For each trial, θ_ET is used to make an oculometric decision, which is rightward if θ_ET > 0 and leftward if it is < 0.

The measured eye movement has directional noise added by both the motion processing and the eye-tracker measurement, so its total standard deviation is

σ_ET = √(σ_M² + σ_TR²)    (A3)
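As a concrete illustration (our sketch, not the authors’ code; the parameter values are arbitrary placeholders, not the fitted values reported below), the eye-movement pathway and oculometric decision of eqs. (A1)–(A3) can be simulated in a few lines of Python:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative values only -- not the fitted parameters from the paper
g_M, sigma_M = 0.8, 3.0        # motion-stage angular gain and noise SD (deg)
sigma_TR = 4.0                 # eye-tracker directional noise SD (deg)
theta_I = np.repeat([-9.0, 9.0], 5000)        # object-motion directions (deg)

theta_M = g_M * theta_I + rng.normal(0.0, sigma_M, theta_I.size)   # eq. (A1)
theta_ET = theta_M + rng.normal(0.0, sigma_TR, theta_I.size)       # eq. (A2)

p_right = (theta_ET[theta_I > 0] > 0).mean()   # oculometric decision rule
slope = np.polyfit(theta_I, theta_ET, 1)[0]    # linear fit recovers g_M
sd_ET = theta_ET[theta_I > 0].std(ddof=1)      # per-direction SD, cf. eq. (A3)
print(p_right, slope, sd_ET, np.hypot(sigma_M, sigma_TR))
```

With these placeholder values, the fitted slope approaches g_M = 0.8 and the per-direction standard deviation approaches √(σ_M² + σ_TR²) = 5 deg, mirroring the estimation procedure described next.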

Measuring the eye movements allows us to estimate both the motion-processing angular gain (g_M), by linearly fitting the pursuit directions to the object-motion direction, and the total eye-movement standard deviation (σ_ET), by computing the average standard deviation across directions. Specifying the directional noise added by the eye tracker (σ_TR) then allows us to estimate the motion-processing noise (σ_M).

The perceptual pathway, shown on the right of Fig. 8, has two inputs. The first is the motion signal (θ_M). The second, a static signal (θ_S), is based on non-motion cues and is independent of θ_M. We assume its output is a noisy directional signal:

θ_S = g_S θ_I + η_S    (A4)

Its mean is equal to the object-motion direction (θ_I) multiplied by an angular gain factor (g_S). The added noise (η_S) is zero-mean and Gaussian distributed, with a standard deviation of σ_S. Perceptual processing optimally combines the static and motion signals, weighting each according to its salience (ε):

θ_P = ε θ_M + (1 − ε) θ_S    (A5)
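To make the combination rule concrete, here is a minimal Python sketch of the perceptual pathway (our illustration, not the authors’ code; the parameter values, including the weight eps, are arbitrary placeholders rather than the fitted or optimal values):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative values only; in the paper g_S/sigma_S and the weight are
# constrained by the psychophysical data, not chosen freely.
g_M, sigma_M, g_S, sigma_S, eps = 0.8, 3.0, 0.5, 6.0, 0.7
theta_I = np.repeat([-9.0, 9.0], 5000)        # object-motion directions (deg)

theta_M = g_M * theta_I + rng.normal(0.0, sigma_M, theta_I.size)   # eq. (A1)
theta_S = g_S * theta_I + rng.normal(0.0, sigma_S, theta_I.size)   # eq. (A4)
theta_P = eps * theta_M + (1.0 - eps) * theta_S                    # eq. (A5)

# Perceptual decision rule: rightward if theta_P > 0 (see text below)
p_right = (theta_P[theta_I > 0] > 0).mean()
print(f"proportion rightward for +9 deg trials: {p_right:.3f}")
```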

For each trial, θ_P is used to make a perceptual decision, which is rightward if θ_P > 0 and leftward if it is < 0.

The pursuit data determine all the parameters associated with the eye-movement pathway, except for the eye-tracker directional noise (σ_TR). From the data in Fig. 6, we found that for the visible-aperture condition the angular gains (g_M) of the observers ranged from 0.17 to 0.80, and the average total pursuit-direction standard deviations (σ_ET) ranged from 4.1 deg to 4.7 deg. For the invisible-aperture condition, the measured pursuit directions are close to zero for all object-motion directions, and all of the observers have angular gains (g_M) that are not significantly different from zero. The motion-processing pathway therefore provides no information about the direction of object motion, and we assume that the perceptual decision is based entirely on the static signal. Thus, the perceptual performance for the invisible-aperture condition determines the parameters associated with the static signal. Signal-detection theory allows us to convert the psychophysical proportion rightward to d′ (detectability index), and to use the relationship d′ = g_S θ_I/σ_S to compute the ratio of g_S to σ_S. This ratio is sufficient for predicting performance in the visible-aperture condition, because we assume that the perceptual decision stage optimally combines the two signals. For two independent signals from Gaussian distributions, the optimal strategy is to compute a weighted sum, and the detectability of the combined signals is then the square root of the sum of the squares of the d′ values of the two signals. Thus, all the parameters in the model were determined from the data, except for the directional noise added by the eye tracker (σ_TR), which is required to predict the visible-aperture psychophysical data.

Because the pursuit was predominantly vertical, the directional uncertainty can be approximated by the ratio of the horizontal tracker positional noise to the vertical pursuit amplitude. Because observers differed significantly in both these parameters, we assumed that for each observer the amount of tracker directional noise, σ_TR, was equal to this ratio multiplied by a single overall scale factor. We determined the single value of the scale factor that optimally fit the psychophysical data for all the observers.
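For reference, the quantitative relations in the two preceding paragraphs can be restated compactly in LaTeX (the symbols σ_x for the horizontal tracker positional noise, A_y for the vertical pursuit amplitude, and k for the overall scale factor are our labels, not notation from the paper):

\[
d'_S = \frac{g_S\,\theta_I}{\sigma_S}, \qquad
d'_P = \sqrt{(d'_M)^2 + (d'_S)^2}, \qquad
\sigma_{TR} \approx k\,\frac{\sigma_x}{A_y}
\]

The middle expression is the standard quadrature rule for optimally combining two independent Gaussian cues; the right-hand approximation is the per-observer tracker-noise model, with k the single free parameter fit across all observers.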