AIMC Topic: Acoustic Stimulation

Showing 51 to 60 of 63 articles

Hierarchical Learning of Statistical Regularities over Multiple Timescales of Sound Sequence Processing: A Dynamic Causal Modeling Study.

Journal of Cognitive Neuroscience
Our understanding of the sensory environment is contextualized on the basis of prior experience. Measurement of auditory ERPs provides insight into automatic processes that contextualize the relevance of sound as a function of how sequences change ov...

Detection of early reflections from a binaural activity map using neural networks.

The Journal of the Acoustical Society of America
Human listeners localize sounds to their sources despite competing directional cues from early room reflections. Binaural activity maps computed from a running signal can provide useful information about the presence of room reflections, but must be ...
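The abstract does not specify how the binaural activity map is computed, but such maps are commonly built from a running interaural cross-correlation over candidate time delays. A minimal sketch under that assumption (function and parameter names are illustrative, not from the paper):

```python
import numpy as np

def binaural_activity_map(left, right, fs, max_itd_ms=1.0):
    """Illustrative binaural activity pattern: normalized circular
    cross-correlation of left/right ear signals over candidate
    interaural time delays up to +/- max_itd_ms."""
    max_lag = int(fs * max_itd_ms / 1e3)
    lags = np.arange(-max_lag, max_lag + 1)
    # Normalize by the geometric mean of the two signal energies
    norm = np.sqrt(np.dot(left, left) * np.dot(right, right)) + 1e-12
    activity = np.array(
        [np.dot(left, np.roll(right, lag)) / norm for lag in lags]
    )
    return lags / fs, activity
```

A direct sound produces one dominant peak at its interaural delay; an early reflection adds a secondary peak, which a classifier (such as the neural network in this article) could then detect.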

Development of an automatic classifier for the prediction of hearing impairment from industrial noise exposure.

The Journal of the Acoustical Society of America
The ISO-1999 (2013, International Organization for Standardization, Geneva, Switzerland) standard is the most commonly used approach for estimating noise-induced hearing trauma. However, its insensitivity to noise characteristics limits its practic...

Deep learning models to remix music for cochlear implant users.

The Journal of the Acoustical Society of America
The severe hearing loss that some people suffer can be treated by providing them with a surgically implanted electrical device called a cochlear implant (CI). CI users struggle to perceive complex audio signals such as music; however, previous...

Augmenting intracortical brain-machine interface with neurally driven error detectors.

Journal of Neural Engineering
OBJECTIVE: Making mistakes is inevitable, but identifying them allows us to correct or adapt our behavior to improve future performance. Current brain-machine interfaces (BMIs) make errors that need to be explicitly corrected by the user, thereby con...

The effects of auditory and visual cues on timing synchronicity for robotic rehabilitation.

IEEE ... International Conference on Rehabilitation Robotics: [proceedings]
In this paper, we explore how the integration of auditory and visual cues can help teach the timing of motor skills for the purpose of motor function rehabilitation. We conducted a study using Amazon's Mechanical Turk in which 106 participants played...

An algorithm to increase intelligibility for hearing-impaired listeners in the presence of a competing talker.

The Journal of the Acoustical Society of America
Individuals with hearing impairment have particular difficulty perceptually segregating concurrent voices and understanding a talker in the presence of a competing voice. In contrast, individuals with normal hearing perform this task quite well. This...

Predicting the perception of performed dynamics in music audio with ensemble learning.

The Journal of the Acoustical Society of America
By varying the dynamics in a musical performance, the musician can convey structure and different expressions. Spectral properties of most musical instruments change in a complex way with the performed dynamics, but dedicated audio features for model...

Auditory inspired machine learning techniques can improve speech intelligibility and quality for hearing-impaired listeners.

The Journal of the Acoustical Society of America
Machine-learning based approaches to speech enhancement have recently shown great promise for improving speech intelligibility for hearing-impaired listeners. Here, the performance of three machine-learning algorithms and one classical algorithm, Wie...
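The classical baseline named here, Wiener filtering, attenuates each frequency bin according to its estimated signal-to-noise ratio. A minimal sketch of the per-bin Wiener gain, assuming power spectral density estimates are already available (function and parameter names are illustrative):

```python
import numpy as np

def wiener_gain(noisy_psd, noise_psd, floor=0.05):
    """Illustrative per-frequency Wiener gain from estimated power
    spectra: gain = SNR / (1 + SNR), with a floor to limit musical
    noise in near-silent bins."""
    # Crude a priori SNR estimate via power spectral subtraction
    snr = np.maximum(noisy_psd - noise_psd, 0.0) / np.maximum(noise_psd, 1e-12)
    gain = snr / (1.0 + snr)
    return np.maximum(gain, floor)
```

The enhanced spectrum is the noisy spectrum scaled by this gain frame by frame; the machine-learning approaches compared in the article instead learn such a gain (or mask) from data.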