UCL IRIS
Institutional Research Information Service
Publication Detail
Neuromagnetic signatures of segregation in complex acoustic scenes
  • Publication Type: Conference
  • Authors: Teki S, Payne C, Griffiths TD, Chait M
  • Publication date: 2012
  • Name of conference: International Conference on Auditory Cortex
  • Conference start date: 01/01/2012
Abstract
The natural auditory environment consists of multiple dynamically varying sound sources. In order to make sense of this complex mixture of sounds, we need to segregate individual sources, such as the sound of the violin in an orchestra. The brain has evolved specialized mechanisms for performing such auditory scene analysis, but the underlying mechanisms remain to be fully explained. We improved upon earlier experimental paradigms based on deterministic patterns of pure tones and modelled the acoustic scenes using a stochastic figure-ground stimulus (SFG; Teki et al., 2011). The stimulus comprises a series of chords containing random frequencies that vary from one chord to another over a range from 200 Hz to 7.2 kHz. To study segregation, we introduced a figure by randomly selecting a certain number of frequencies (where that number defines 'coherence') and repeating them over a certain number of chords (where that number defines 'duration'). This manipulation allows us to parametrically control the salience of the figure, which is indistinguishable from the background at any given point in time. The figure can only be extracted by binding across both time and frequency, and we found that, behaviourally, listeners are very sensitive to the emergence of these complex figures. We have previously established a role for the intraparietal sulcus (IPS) in stimulus-driven segregation of these figures (Teki et al., 2011), and ongoing work further suggests a role for temporal coherence (Shamma et al., 2011) in segregation in such complex acoustic scenes (Teki et al., 2012). We used magnetoencephalography (MEG) to investigate mechanisms underlying the emergence of figures with different salience (coherence of 2, 4 or 8; 0.6 s long) presented after statistically similar background segments (0.6 s). Listeners were engaged in an incidental visual task and were naive to the existence of the changes in the SFG stimulus. In another condition, we presented the same stimuli but interspersed with alternating white noise segments, as we previously found that this manipulation does not affect detection performance (Teki et al., 2012). Analysis of time-locked activity in auditory cortex shows early responses to the emergence of the figure within 100 ms of figure onset. Late changes corresponding to the presence of the figure are also seen, persisting for at least a second. Time-frequency analysis is ongoing, using a beamformer approach (Sedley et al., 2011) to identify early and late oscillatory activity in sources within auditory and parietal cortex.
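As an illustration of how the SFG stimulus described in the abstract can be constructed, the Python sketch below generates a chord sequence with a background of randomly drawn frequencies and a repeating figure. Only the 200 Hz to 7.2 kHz frequency range, the coherence values and the 0.6 s background/figure segment lengths come from the abstract; the 50 ms chord duration, the number of background components per chord, the logarithmic frequency grid and the sample rate are assumptions made for this sketch, not details of the published stimulus.

import numpy as np

def sfg_stimulus(coherence=4, figure_chords=12, total_chords=24,
                 freqs_per_chord=10, chord_dur=0.05, fs=44100, seed=0):
    """Sketch of a stochastic figure-ground (SFG) stimulus: a sequence of
    random-frequency chords in which a fixed set of `coherence` frequencies
    (the figure) repeats over the final `figure_chords` chords."""
    rng = np.random.default_rng(seed)
    # Frequency pool spanning 200 Hz to 7.2 kHz (log-spaced grid is an assumption).
    pool = np.geomspace(200.0, 7200.0, 120)
    # Figure components: drawn once and repeated across the figure chords.
    figure = rng.choice(pool, size=coherence, replace=False)
    figure_onset = total_chords - figure_chords  # figure starts after the background
    n = int(chord_dur * fs)
    t = np.arange(n) / fs
    chords = []
    for c in range(total_chords):
        # Background: random frequencies that change from one chord to the next.
        components = rng.choice(pool, size=freqs_per_chord, replace=False)
        if c >= figure_onset:
            components = np.concatenate([components, figure])
        chord = sum(np.sin(2 * np.pi * f * t) for f in components)
        chords.append(chord / len(components))  # crude amplitude normalisation
    return np.concatenate(chords)

# With 50 ms chords, 12 background + 12 figure chords give two 0.6 s segments,
# matching the segment durations mentioned in the abstract.
signal = sfg_stimulus(coherence=8)

Increasing the coherence parameter adds more repeating components and so makes the figure more salient, which is the manipulation the MEG experiment exploits.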
UCL Researchers
  • Author: The Ear Institute