PHYsically informed and Semantically controllable
Interactive Sound synthesis

PHYSIS is an industrial research project focused on improving our knowledge of how to
create interactive real-life sounds and how to interact with them in real time using semantic and physical controls or emergent tangible interfaces. Moreover, PHYSIS takes a systemic approach and considers the problem in its entirety: from the necessary core research on sound analysis and modelling up to optimized compiled code, interactive synthesis models, and transformations. PHYSIS will also provide high-level analysis and synthesis tools, targeting potential industrial users such as video game creators, that give interactive semantic control over the generated content.

PHYSIS is centered on the modelling, transformation, and real-time synthesis of diegetic sounds for interactive virtual worlds (video games, simulations, serious games…) and augmented reality. By diegetic sounds, we mean all sounds generated by identifiable objects in the virtual scene (e.g. glass, weapons, liquids, fire, water, fabric…) and their possible interactions: physical impacts, footsteps, sliding, rolling, sweeping… This does not include musical sounds.