Tracking the time course of phonological neighborhood clustering effects in spoken word recognition


Charles Redmon, Annie Tremblay, & Michael Vitevitch (under review)
Preprint: pdf

This study used visual-world eye tracking to examine the effect, first observed in Chan and Vitevitch (2009), of the phonological neighborhood clustering coefficient on the time course of lexical access in spoken word recognition. Target words from neighborhoods with relatively high clustering (i.e., neighborhoods in which the target word's neighbors are also neighbors of each other) showed a significant lag in eye fixations relative to words from less clustered neighborhoods, after controlling for neighborhood density, target frequency, neighborhood frequency, and multiple measures of phonotactic probability. The clustering effect was further modulated by lexical frequency, neighborhood density, and neighborhood frequency, suggesting that neighborhood clustering is not only a relevant factor in spoken word recognition but also one that any viable model of spoken word recognition should account for.
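
For readers unfamiliar with the measure, the sketch below (not the authors' code) illustrates how a target word's clustering coefficient can be computed from a phonological neighborhood network using networkx, assuming the standard one-phoneme-difference (substitution, addition, or deletion) definition of a neighbor; the toy lexicon and transcriptions are purely hypothetical.

```python
# Minimal sketch of a phonological neighborhood network and its clustering
# coefficient. The lexicon and neighbor rule below are illustrative
# assumptions, not the stimuli or lexicon used in the study.
import networkx as nx

def is_neighbor(a, b):
    """True if phoneme strings a and b differ by exactly one
    substitution, addition, or deletion."""
    if a == b:
        return False
    if len(a) == len(b):
        return sum(x != y for x, y in zip(a, b)) == 1
    if abs(len(a) - len(b)) != 1:
        return False
    shorter, longer = sorted((a, b), key=len)
    # Deleting one segment from the longer form must yield the shorter form.
    return any(longer[:i] + longer[i + 1:] == shorter for i in range(len(longer)))

# Toy lexicon in a rough phonemic transcription (illustrative only).
lexicon = ["kat", "bat", "kab", "kot", "bot", "at"]

G = nx.Graph()
G.add_nodes_from(lexicon)
G.add_edges_from(
    (w1, w2)
    for i, w1 in enumerate(lexicon)
    for w2 in lexicon[i + 1:]
    if is_neighbor(w1, w2)
)

# Local clustering coefficient of a target word: the proportion of pairs of
# its neighbors that are themselves neighbors of each other.
print(nx.clustering(G, "kat"))
```

In this toy network, a higher value for the target word means its neighbors are densely interconnected (a high-clustering neighborhood), while a value near zero corresponds to the low-clustering condition contrasted in the study.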