
Visual Behavior Neuropixels

Overview

Our ability to perceive the sensory environment and flexibly interact with the world requires the coordinated action of neuronal populations distributed throughout the brain. Yet, the detailed patterns of spiking activity that underlie perception and behavior are not well understood. To further our understanding of the neural basis of behavior, the Visual Behavior project used the Allen Brain Observatory (diagrammed below) to collect a large-scale, highly standardized dataset consisting of recordings of neural activity in mice that have learned to perform a visually guided task. This dataset can be used to investigate how patterns of spiking activity across the visual cortex and thalamus relate to behavior, and how these activity dynamics are influenced by task engagement and prior visual experience.

The Visual Behavior Neuropixels dataset includes 153 sessions from 81 mice. These data are made openly accessible, with all recorded timeseries, behavioral events, and experimental metadata conveniently packaged in Neurodata Without Borders (NWB) files that can be accessed and analyzed using our open Python software package, the AllenSDK.
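 
The NWB files can be downloaded and opened through a data cache provided by the AllenSDK. The following is a minimal sketch, assuming the class and method names used in recent AllenSDK releases (e.g., VisualBehaviorNeuropixelsProjectCache); consult the AllenSDK documentation for the current interface:

# Minimal sketch of accessing the dataset with the AllenSDK. Class and
# method names follow recent AllenSDK releases and should be verified.
from allensdk.brain_observatory.behavior.behavior_project_cache import (
    VisualBehaviorNeuropixelsProjectCache,
)

# Point the cache at a local directory; NWB files are downloaded on demand.
cache = VisualBehaviorNeuropixelsProjectCache.from_s3_cache(
    cache_dir="/path/to/vbn_cache"  # placeholder path
)

# One row per recording session, with experimental metadata columns.
sessions = cache.get_ecephys_session_table()

# Load one session's NWB file into a session object for analysis.
session_id = sessions.index[0]
session = cache.get_ecephys_session(ecephys_session_id=session_id)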

Visual Change Detection Task

The Visual Behavior project is built upon a change detection behavioral task. In this go/no-go task, mice are shown a continuous series of briefly presented visual images and earn water rewards by correctly reporting when the identity of the image changes (diagrammed below). Five percent of images are omitted, allowing for analysis of expectation signals.
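 
Once a session is loaded (see the sketch above), task performance can be summarized from its trial and stimulus tables. The table and column names below ('trials', 'stimulus_presentations', 'is_change', 'hit', 'omitted') are assumptions based on recent AllenSDK releases and should be checked against the actual files:

# Sketch: summarizing change-detection behavior for one session.
# Column names ('is_change', 'hit', 'omitted') are assumptions to verify.
trials = session.trials

# Hit rate: fraction of image-change trials on which the mouse licked.
change_trials = trials[trials["is_change"]]
hit_rate = change_trials["hit"].mean()
print(f"hit rate: {hit_rate:.2f}")

# Omitted presentations support analyses of expectation signals.
stim = session.stimulus_presentations
print(f"fraction omitted: {stim['omitted'].mean():.2f}")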

 

To allow analysis of the effect of stimulus novelty on neural responses, two different image sets were used in the recording sessions (diagrammed below). Image set “G” consists of 8 images that were used during behavioral training and are highly familiar to the mice. Image set “H” consists of 6 novel images and 2 familiar images. In 10 mice, training was performed with image set H, and image set G was the novel set. For most mice, the familiar image set was used for the first recording day and the novel image set was used for the second recording day. In 3 mice, this ordering was reversed (novel, then familiar).

Image sets G (blue) and H (red) have two common images (purple).
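 
In the session metadata, each recording is labeled by whether the image set shown was familiar or novel to that mouse, so recordings can be split by novelty. A small sketch, assuming an 'experience_level' column as in recent AllenSDK releases:

# Sketch: splitting sessions by stimulus novelty. The column name and its
# values ('Familiar'/'Novel') are assumptions; inspect sessions.columns.
familiar_sessions = sessions[sessions["experience_level"] == "Familiar"]
novel_sessions = sessions[sessions["experience_level"] == "Novel"]
print(len(familiar_sessions), "familiar and", len(novel_sessions), "novel sessions")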

 

During task performance, the timing of behavioral responses (licks) and earned rewards is recorded, along with the mouse's running speed, eye position, and pupil area.
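 
These behavioral signals are packaged with the neural data in each session's NWB file. In rough outline (attribute and column names as in recent AllenSDK releases; verify against the loaded object):

# Sketch: behavioral timeseries attached to a session object. Attribute
# and column names follow recent AllenSDK releases and are assumptions.
licks = session.licks                 # lick times
rewards = session.rewards             # reward delivery times
running = session.running_speed       # running speed over time
eye = session.eye_tracking            # pupil area and eye position per frame

print(running.columns)                # e.g., 'timestamps' and 'speed'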

Visual Behavior Neuropixels Data

This dataset includes multi-regional Neuropixels recordings from up to 6 probes at once. The probes target six visual cortical areas: VISp, VISl, VISal, VISrl, VISam, and VISpm. In addition, because each probe records along ~3 mm of vertical span, multiple subcortical areas are also typically measured, including the visual thalamic areas LGd and LP. A subset of experiments was performed in transgenic mice to permit optotagging of either Sst+ or Vip+ interneurons in the cortex.
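 
Sorted units from all probes can be combined with the anatomical location of their channels to select cells by brain area. A rough sketch, assuming the cache-level unit and channel tables and column names of recent AllenSDK releases:

# Sketch: selecting units by the brain structure of the channel they were
# recorded on. Table and column names are assumptions to verify.
units = cache.get_unit_table()
channels = cache.get_channel_table()

units = units.merge(
    channels[["structure_acronym"]],
    left_on="ecephys_channel_id",
    right_index=True,
)

# Units recorded in primary visual cortex (VISp) in the loaded session.
visp_units = units[
    (units["structure_acronym"] == "VISp")
    & (units["ecephys_session_id"] == session_id)
]
print(len(visp_units), "VISp units")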

Every experimental session consisted of four major stimulus epochs, as diagrammed below: 1) an active behavior epoch, during which the mouse performed the change detection task; 2) a receptive field characterization epoch, during which we presented Gabor stimuli and full-field flashes; 3) a passive replay epoch, during which we replayed, frame for frame, the same stimulus the mouse encountered during active behavior, but with the lick spout removed; and 4) an optotagging epoch, during which we stimulated the surface of the brain with blue light to activate ChR2-expressing cortical interneurons.
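 
Each epoch is labeled in the session's stimulus tables, so analyses can be restricted to, for example, the active task versus the passive replay. A brief sketch, assuming the 'stimulus_block' and 'active' columns used in recent AllenSDK releases:

# Sketch: splitting stimulus presentations by epoch. The 'stimulus_block'
# and 'active' columns are assumptions to check against the loaded table.
stim = session.stimulus_presentations
print(stim["stimulus_block"].unique())   # one block per stimulus epoch

active_task = stim[stim["active"]]       # presentations during the task epoch
passive = stim[~stim["active"]]          # receptive field mapping and replay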