We tested the hypothesis that low-frequency primary auditory cortex (AI) contains modules identifiable by their responsiveness to communication calls or to particular regions of space. Units were recorded in anaesthetised guinea pig AI while conspecific vocalizations and a virtual motion stimulus (binaural beats) were presented via a closed sound system. Recording tracks were oriented mainly orthogonal to the cortical surface. In some tracks, all units were time-locked to the structure of the chutter call (14/22 tracks) and/or the purr call (12/22 tracks); in others, units preferred stimuli from a particular region of space (8/20 tracks: four contralateral, two ipsilateral and two midline) or showed a strong asymmetry in their response to beats of opposite direction (two tracks). We conclude that about half of low-frequency AI is organized into modules consistent with separate "what" and "where" pathways.