Neural Concept Sampler

Build a neural system that learns a conceptual hierarchy of (sound) objects, autonomously searches the underlying conceptual space, and presents the retrieved associative concept sequences audio-visually.

Disciplines:

  • Computational modelling
  • Creative arts
  • Cognitive neuroscience

This project aims to develop a neuro-inspired sound analysis, transformation and synthesis system able to extract and learn hierarchical categorical temporal structure from sound inputs. The system will autonomously create new sound patterns from its internal representations.
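
To make the first stage concrete, the minimal sketch below reduces a waveform to a sparse stream of (time, band) onset events, using an ordinary spectrogram as a stand-in for a biologically detailed cochlear front-end. The function name extract_events and all parameter choices (n_bands, threshold) are illustrative assumptions, not the project's actual pre-processing.

```python
import numpy as np
from scipy import signal

def extract_events(audio, sr, n_bands=64, threshold=3.0):
    """Reduce a waveform to a sparse stream of (time, band) events.

    A spectrogram plays the role of an auditory filterbank here;
    per-band onset detection (half-wave-rectified energy increase
    above a noise-scaled threshold) yields sparse spectro-temporal
    events. The project's actual front-end will differ; this only
    illustrates the idea of a sparse event stream.
    """
    f, t, spec = signal.spectrogram(audio, fs=sr, nperseg=512, noverlap=384)
    # Compress dynamic range, roughly as the auditory system does.
    log_spec = np.log1p(spec)
    # Pool frequency bins into a small number of bands.
    bands = np.array_split(log_spec, n_bands, axis=0)
    band_energy = np.stack([b.mean(axis=0) for b in bands])
    # Half-wave rectified temporal derivative: respond to onsets only.
    onset = np.maximum(np.diff(band_energy, axis=1), 0.0)
    # Keep only increases well above each band's typical fluctuation.
    thresh = threshold * onset.std(axis=1, keepdims=True)
    band_idx, frame_idx = np.where(onset > thresh)
    return [(t[j + 1], i) for j, i in zip(frame_idx, band_idx)]

if __name__ == "__main__":
    sr = 16000
    tt = np.linspace(0, 1.0, sr, endpoint=False)
    # A test tone that jumps in frequency should trigger band onsets.
    audio = np.sin(2 * np.pi * np.where(tt < 0.5, 440, 880) * tt)
    events = extract_events(audio, sr)
    print(f"{len(events)} events, first few: {events[:5]}")
```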

The system will use neural networks that pre-process sounds in a bio-inspired way and extract sparse spectro-temporal event streams. These streams will then be analysed for hierarchical patterns, which are stored in associative networks. Autonomous associative dynamics in these networks will let the system generate novel event streams and sound patterns that reflect the conceptual structure of the training data. Neurally implemented syntactic rules will add a further cognitive level that constrains the creative exploration of this hierarchical conceptual sound space.
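
As a hedged toy version of "autonomous associative dynamics", the sketch below uses a Hopfield-style network (a deliberately simple stand-in for the project's associative networks): a few binary "concept" patterns are stored Hebbian-style, and noisy asynchronous updates let the state settle into one attractor after another, producing a sequence of visited concepts. The perturbation schedule, the kick parameter and the function names store and wander are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def store(patterns):
    """Hebbian weight matrix for a Hopfield-style associative network.

    Each stored +/-1 pattern (coding one learned sound concept in this
    toy) becomes an attractor of the dynamics below.
    """
    P = np.array(patterns, dtype=float)
    W = P.T @ P / P.shape[1]
    np.fill_diagonal(W, 0.0)
    return W

def wander(W, steps=300, kick=0.3):
    """Autonomous dynamics: settle into an attractor, perturb the
    state, settle again. The sequence of visited attractors is the
    generated 'concept sequence'. The kick schedule is an illustrative
    choice, not the project's mechanism."""
    n = W.shape[0]
    s = np.sign(rng.standard_normal(n))
    visited = []
    for step in range(steps):
        i = rng.integers(n)                  # asynchronous unit update
        s[i] = 1.0 if W[i] @ s >= 0 else -1.0
        if step % 50 == 49:                  # record state, then kick
            visited.append(s.copy())
            flip = rng.random(n) < kick
            s[flip] *= -1
    return visited

patterns = [np.sign(rng.standard_normal(32)) for _ in range(3)]
W = store(patterns)
for k, state in enumerate(wander(W)):
    overlaps = [int(state @ p) for p in patterns]
    print(f"visit {k}: overlap with stored concepts = {overlaps}")
```

In the full system, the attractors would code learned sound concepts, and the syntactic-rule layer described above would bias which transitions the wandering dynamics may take.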

Research Fellow
Jack McKay Fletcher
Supervisors
Thomas Wennekers, Sue Denham, Jane Grant, John Matthias, Martin Coath (Plymouth University)