What We Study

Our research is driven by the view that perception and cognition are tuned to the structure of environments and events. We therefore investigate how attention varies in response to behaviorally relevant moments, how events are perceived and shape memory, and how environmental regularities guide attention over space and time.

Attention to Behaviorally Relevant Moments

How do changes in the relevance of events over time influence the way other information is perceived? To answer this question we ask participants to perform a simple task: Respond to a target item that appears in a stream of distractors (e.g., count the white squares in a stream of black distractor squares) while simultaneously remembering an unrelated stream of images. Because attention is limited, increasing attention to the target should impair memory for coinciding background images. Surprisingly, the data indicate that the opposite occurs: Memory for an image presented at the same time as a target square is better than memory for an image presented with a distractor square (Swallow & Jiang, 2010). Increasing attention to a goal-relevant event appears to boost the processing of concurrent information, producing what we have called the attentional boost effect.
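As an illustration, the dual-task structure of this paradigm can be sketched in a few lines of code. The trial count, target rate, and file names below are placeholders, not the published parameters:

```python
import random

def make_trial_stream(n_trials=100, target_rate=0.1, seed=0):
    """Build a toy trial list for a detection-plus-encoding dual task.

    Each trial pairs one item in the detection stream (rarely a white
    target square, usually a black distractor square) with an unrelated
    background image that participants try to remember.
    """
    rng = random.Random(seed)
    trials = []
    for i in range(n_trials):
        is_target = rng.random() < target_rate
        trials.append({
            "square": "white (target)" if is_target else "black (distractor)",
            "image": f"scene_{i:03d}.jpg",  # placeholder image name
        })
    return trials

trials = make_trial_stream()
# The attentional boost effect is the finding that images paired with
# the rare targets are later remembered BETTER than images paired with
# the frequent distractors.
target_trials = [t for t in trials if "target" in t["square"]]
```

The key design feature is that the two streams are unrelated, so any memory advantage for target-paired images reflects temporal selection rather than the content of the images themselves.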

ABE
Participants encoded a series of briefly presented scenes at the same time that they performed a simple target detection task. Later memory for scenes that coincided with a target in the detection task was enhanced relative to those presented with distractors (from Swallow & Jiang, 2010, Cognition).

Ongoing research in the lab investigates the cognitive and neural mechanisms that produce the attentional boost effect and its implications for how people remember events. We recently found that attending to targets boosts memory even for things that people are ignoring, and helps them remember which faces appeared with which scenes during a particular event (Broitman & Swallow, submitted; Turker & Swallow, 2019).

Relational ABE
Participants pressed a button for faces of a particular gender while memorizing scenes presented in the background. They were later tested on their memory for the scenes and then asked to indicate which of two faces had appeared with each scene. Because both faces had appeared as distractors (left examples) or as targets (right examples), this test tapped participants' memory for the relationship between the scene and the face, which was better when the faces were targets (from Turker & Swallow, 2019, Memory & Cognition).

Neural Mechanisms of Temporal Selection

Using functional MRI, we have found that activity throughout primary visual cortex increases in response to targets (Swallow, Makovski, & Jiang, 2012). Surprisingly, this effect occurs even when the targets are auditory tones and there is no new visual information.

V1 Boost
Early visual cortical areas V1-V3 showed larger increases in activity following auditory target tones than following auditory distractor tones. Increases were observed in visual areas representing both the central and peripheral visual fields, even when no new visual stimuli were presented (from Swallow et al., 2012, Journal of Neurophysiology).

New neuroimaging work is exploring the potential contributions of one neuromodulatory system, originating in the locus coeruleus, to these effects (Turker, et al., submitted).

LC iFC
Neuromelanin-sensitive images were used to localize the LC in each individual (white spots); LC locations overlapped across individuals but varied (red = 85% overlap, blue = 5% overlap). Individual LC activity correlated with activity in medial prefrontal, parietal, and occipital areas at rest (from Turker, Riley, Luh, Colcombe, & Swallow, submitted).

For more information, please see:

  • Swallow, K. M., Makovski, T., & Jiang, Y. V. (2012). The selection of events in time enhances activity throughout early visual cortex. Journal of Neurophysiology, 108(12), 3239-3252.
  • Turker, H. B., Riley, E., Luh, W.-M., Colcombe, S. J., & Swallow, K. M. (submitted). Estimates of locus coeruleus function with functional magnetic resonance imaging are influenced by localization approaches and the use of multi-echo data. bioRxiv, 731620.

Event Segmentation and Memory

In another line of research the lab investigates how people perceive and remember everyday events. We ask how people divide their continuous, ongoing experience into meaningful events as they happen. We would like to understand how the moments that separate “What is happening now” from “What just happened” affect when people pay attention to the external world and what they later remember about it.

We have found that shifts from one event to another influence what information is available for later retrieval and when it can be retrieved from memory (Swallow, Zacks & Abrams, 2009). Trying to remember objects from an event that just ended also increased activity in the medial temporal lobe, including the hippocampus, a part of the brain that is critical for episodic memory (Swallow, et al., 2011).

Memory for Objects in Film
Memory for objects presented in a film (circled) was influenced both by event boundaries that occurred during object presentation (boundary vs. nonboundary object) and by boundaries that occurred during the 5-second delay between object presentation and test (within vs. across events). The hippocampus (purple and green) was more active when boundary objects were tested across events (from Swallow et al., 2011).

We also want to account for why people segment when they do. To better understand how low-level visual features of activities (e.g., visual motion and the actor’s body posture) contribute to segmentation, a recent study asked participants to identify meaningful events in videos of an activity filmed from first- and third-person perspectives. Surprisingly, participants identified similar events across perspectives despite differences in their low-level sensory features (Swallow, Kemp, & Candan Simsek, 2018). In ongoing research, we are examining whether people are sensitive to different types of changes in other people’s activities, and whether this sensitivity depends on their familiarity with the activity and the context in which they developed.

Segmentation Across Perspectives
Participants identified events in an activity that was viewed from a third-person perspective or a first-person perspective. Analyses indicated that participants identified similar events despite changes in the visual features of the videos.

Statistical Learning

How do people learn and use statistical structure in the world to better anticipate goal-relevant events? Our research has shown that people are remarkably sensitive to the statistical structure of activities and can use this information to learn to attend at particular moments in time (Swallow & Zacks, 2008). In other work, we found that people can learn which regions of space are likely to contain the items they are searching for (Jiang & Swallow, 2013). However, they tend to learn these locations relative to themselves, rather than as fixed locations in the world. Ongoing research is exploring the relationship between statistical regularities in events and attention.

Probability cueing
Participants searched for a target in a display laid flat on a table. During training the target was more likely to appear in one (rich) quadrant of the display than in the others, and participants found targets in that quadrant more quickly. During testing, the quadrant that participants searched most quickly moved with them, staying in the same location in their visual field rather than in a fixed location on the display (from Jiang & Swallow, 2013, Cognition).
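The biased target placement at the heart of a probability-cueing design can be sketched as follows. The quadrant names, trial count, and exact probabilities here are illustrative assumptions, not the published parameters:

```python
import random

def sample_target_quadrants(n_trials=200, rich="upper_left",
                            rich_prob=0.5, seed=1):
    """Sample which quadrant contains the search target on each trial.

    One 'rich' quadrant holds the target more often than chance
    (here 50% of trials vs. ~16.7% for each sparse quadrant), which
    is what allows attention to be biased toward that region.
    """
    rng = random.Random(seed)
    quadrants = ["upper_left", "upper_right", "lower_left", "lower_right"]
    sparse = [q for q in quadrants if q != rich]
    locations = []
    for _ in range(n_trials):
        if rng.random() < rich_prob:
            locations.append(rich)
        else:
            locations.append(rng.choice(sparse))
    return locations

locations = sample_target_quadrants()
rich_rate = locations.count("upper_left") / len(locations)
```

The viewer-centered finding described above corresponds to the learned bias being defined relative to the participant's visual field, so if the participant moves, the facilitated region moves with them rather than staying fixed on the display.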
