I am a Canadian-born, UK-based electronic music composer, academic and programmer. I make music with the SuperCollider and Tidal programming languages.

I live code techno (and other music) at algoraves around the world, in venues and festivals such as Fortune Sound Club (Vancouver), Norberg Festival (Sweden), EarZoom Festival (Slovenia), Corsica Studios (London, UK) and Incubate Festival (NL), and my live coding work has been covered in media such as Mixmag.

I've run the electronic music label xylem records since 2012, and I compose algorithmic, audio-visual and film scores. In 2014, I won the Canada Council for the Arts Robert Fleming Prize for Outstanding Canadian Composer.

I recently worked on the score for Anthropocene: The Human Epoch, which premiered at the Toronto International Film Festival in 2018 and screened at the Sundance Film Festival and the Berlin International Film Festival in 2019, among others. For this work, I was nominated for a Cinema Eye Honors award for Outstanding Score and was named in the Sundance Institute's "18 Women Composers You Should Know" list in 2019.

I currently lecture and research full time at Falmouth University in Cornwall, United Kingdom, teaching and researching in creative coding, music technology and software engineering. I am also a visiting researcher with the Institute for Globally Distributed Open Research and Education.

I trained at the following places: 

  • Postdoctoral Fellow (University of British Columbia)

  • PhD (University of Birmingham)

  • Masters (University of Calgary)

  • Bachelors (Mount Allison University)

My current research focuses on digital health care, particularly the use of immersive and artificial intelligence technologies to support and assist people with psychiatric and neurological disorders.

I co-founded and run two software/hardware companies:

  • HexDB Labs, which provides software solutions for high-end audio, scientific and healthcare companies.

  • BeeSting Lab, which creates new digital musical instruments using immersive and AI technologies.

I also have a 10+ year background working and researching in Human-Computer Interaction, Spatial Computing, Creative Computing, AI, VR/MR game experiences and immersive sound, and I make wearable tech for a variety of purposes.


I have a background in constructing 3D-printed, sensor-based wearables for sound performance. At the Institute for Computing, Information and Cognitive Systems (ICICS) and the School of Music at the University of British Columbia (Vancouver), I developed a new digital musical instrument called the PIPE (Personal Interface for Performance Environments). I also worked with the Birmingham ElectroAcoustic Sound Theatre (BEAST) and helped form the BEER and BiLE laptop ensembles.