Stan Wijnans (NL)

Stan Wijnans is a Dutch interactive sonic artist, interactive performance researcher and MAX/MSP programmer who has worked as manager of the multimedia studio S.T.A.N., and as a professional sound engineer and bass guitarist. Her work investigates the human-machine relationship in interactive sound performances, exploring (3D surround) sound, choreography, visuals, robotics and sensor systems.
Her main interest lies in the way improvised elements from different live art disciplines can expand the artistic-audiovisual connection created by interactivity, and in the possibilities for interesting (not necessarily objective) realtime digital transformations of live action. The focus is on technical systems that allow body movements to manipulate the interactive architectural and/or virtual environment.

She has created sound and video compositions and installations, radio projects, interactive light installations and multidisciplinary interactive performances. She worked intensively with Australian robotics artist Stelarc on the project 'Muscle Machine', and with choreographers Sophia Lycouris ('Intelligent City': camera tracking, the Very Nervous System), Isabel Rocamora ('Body Memory': motion capture) and Sarah Rubidge ('Global Drifts', 'Echoing Traces', 'Sense and Sensibility': Cricket prototype RF/ultrasonic tracking system by V2_ & PGAcoustics, interactive surround sound, Isadora), amongst others. Her interactive dance and sound performance 'Frozen White' (Soundbeam ultrasound sensor system) was shown at the ICA, London, UK.

She has taught workshops at several universities in the UK and Taiwan (TNTU University of Taipei) and at art institutions (The Junction, Cambridge; Essex Dance, Chelmsford), and worked for several years at Nottingham Trent University and the University of Chichester as an MA/PhD advisor, performance developer and MAX/MSP programmer.

She holds an MA in interactive robotics and sound performance (Middlesex University, UK) and received a full-fee PhD bursary from Bath Spa University in 2005 to investigate the practical and theoretical potential of mapping parameters from spatial body movement into interactive 3D sound composition using sensor systems and the programming environment MAX/MSP/Jitter.