Title: Faces, objects, and agents: How infants and children understand the visual world
Abstract: How we experience the world and interact with it is ultimately tied to how we represent relevant aspects of the world, such as objects and people. In my work, I seek to elucidate how infants represent their visual world, and in particular the faces and facial emotions of other people. The first part of my talk will focus on facial emotion perception, which is well characterized in adults and engages high-level visual cortices and social-emotional processing networks. I will discuss how such stimuli activate specific regions of the infant brain by 5–7 months, as measured by both electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS), with clear overlap with the expected regions in adults. I will then present biases and patterns of errors in the implicit and explicit perception of facial emotions by infants and children. Furthermore, I will make the point that neural activations to facial emotions in early infancy reflect individual risk for anxiety, and thus carry clinical implications. The second part of my talk will question whether the specificity of high-level visual activations early in life suffices to support robust, fast, category-specific representations of visual objects and kinds, such as faces and facial emotions. I will explain how, and under which assumptions, multivariate pattern analysis of EEG and fNIRS data may be used to probe the timing and dimensions of such neural representations, as well as the challenges inherent in applying such methods to infant data. I will then argue that while different types of visual stimuli are robustly represented in the infant brain, they may not yet be robustly and efficiently organized by category; indeed, even a strong domain-specific marker such as inversion may not be represented as robustly or efficiently by young children as it is by adults.
Taken together, these findings provide a roadmap for a program of research bridging behavioral, neurophysiological, and computational approaches, contributing to a mechanistic understanding of the development of high-level, social-emotional perception.