Temporal Binding Window of Audio-Visual Speech Stimuli in Infants

This study explores multisensory perception in infants, specifically the temporal binding window for audio-visual speech stimuli.

The experiment examines whether gender congruency between face and voice stimuli affects the detection of changes in temporal synchrony between the auditory and visual components of audio-visual speech in 8- to 10-month-old infants.

To test this, we plan to employ a habituation paradigm similar to that used by Lewkowicz (1996) to measure infants’ perception of auditory-visual temporal synchrony.

Before the test trials, infants will be habituated to a synchronous face-voice pairing that is either matched or mismatched for gender.

We will then show the infants video clips of faces and voices presented either synchronously or asynchronously, and measure their looking time to the videos.

Determining whether infants’ temporal binding window is also affected by congruency between the speaker’s face and voice may shed light on how much experience with multisensory stimuli is needed before top-down effects on perception emerge.