Some Current & Recent
Research Projects
Scorch AI: an ethical AI for the Scorch language (2024-onwards)
​
Scorch AI is an ethically informed generative AI system intended to work as both a collaborator and a help assistant for learning, producing and performing music with the Scorch language.
​
More info soon.
​
Presentations on Scorch AI:
- NYU "Learning to Teach" conference, January 2024
- NoiseFloor Conference, May 2024, Portugal
- AI4Good Lab Creative AI talk, May 2024, Montreal/Toronto
​
Journal Publications:
Edinburgh University Press - coming soon

SuperCollider Book 2nd Edition
MIT Press
Ed. Wilson, Cottle & Collins
I contributed the chapter on the SuperCollider IDE for the 2nd edition of the book, published by MIT Press.
​
Scorch: a new programming language for algorithmic composition and performance (2020-onwards)
​
Scorch is a music programming language designed to be straightforward for those without experience in traditional programming languages. It was initially developed for algorithmic composition as a MIDI-generating VST plugin, but is ultimately intended for live coding and a variety of media computing applications. Scorch has various AI implementations, including an AI collaborator, similar to the Autopia project, which allows AI and human performers to work together.
Scorch is a project by Norah Lorway, Ed Powley and Arthur Wilson (beesting.xyz)
​
Visit Scorch online

Autopia: An AI Collaborator for Live Coding Music Performance (2019-onwards)
​
Autopia is a project by Dr Norah Lorway, Arthur Wilson and Dr Edward Powley. It uses template-based genetic programming to write SuperCollider code, with audience feedback determining the fitness function for the code's evolution. It interfaces with Utopia, a system developed at the University of Birmingham by Dr Scott Wilson et al. for collaborative, networked live coding performances.
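By way of illustration only (this is not Autopia's actual code), the Python sketch below shows the general shape of the approach: each individual fills the numeric slots of a fixed SuperCollider code template, and a fitness score, which in performance would come from audience votes, drives selection, crossover and mutation. Every name, template and parameter here is a hypothetical stand-in.

```python
# Minimal sketch of template-based genetic programming in the spirit of Autopia.
# All names, the template and the ranges are illustrative assumptions.
import random

# A SuperCollider-style template whose numeric "slots" form an individual's genome.
TEMPLATE = "{{ SinOsc.ar({freq}, 0, {amp}) * LFPulse.kr({rate}) }}.play;"
SLOTS = ["freq", "amp", "rate"]
RANGES = {"freq": (60, 2000), "amp": (0.05, 0.5), "rate": (0.5, 16)}

def random_genome():
    return {s: random.uniform(*RANGES[s]) for s in SLOTS}

def render(genome):
    # Fill the template's slots to produce SuperCollider source text.
    return TEMPLATE.format(**{k: round(v, 2) for k, v in genome.items()})

def audience_fitness(genome):
    # Placeholder: in performance this score would come from live audience votes,
    # not a random number.
    return random.random()

def evolve(pop, keep=4, mutation_rate=0.3):
    ranked = sorted(pop, key=audience_fitness, reverse=True)[:keep]
    children = []
    while len(children) < len(pop) - keep:
        a, b = random.sample(ranked, 2)
        child = {s: random.choice((a[s], b[s])) for s in SLOTS}   # crossover
        if random.random() < mutation_rate:                       # mutation
            s = random.choice(SLOTS)
            child[s] = random.uniform(*RANGES[s])
        children.append(child)
    return ranked + children

population = [random_genome() for _ in range(8)]
for generation in range(3):
    population = evolve(population)
print(render(population[0]))  # code that would be handed to SuperCollider
```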
We have presented Autopia at the following conferences:
​
AISB (2019) - The Society for the Study of Artificial Intelligence and Simulation of Behaviour
ICLC (2020) - International Conference on Live Coding
NMF (2020) - Network Music Festival
AIMC (2021) - International Conference on AI Music Creativity
​
HiveSynth (2019-onwards)
​
HiveSynth is an Augmented Reality synthesiser for mobile platforms.
​
We are currently working on training a machine learning model on associations between image and sound, so that sound can be generated from image input (a minimal sketch of the idea appears after this list). Applications include:
​
- Training on footage of instrument performance, then generating sound from "air guitar" style miming of a performance
- Training on dance performance, then generating music to accompany a dancer's movements
- Replacing the image input with immersive controllers (motion capture, VR controllers) to train on 3D movements
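
Purely as an illustrative sketch of the underlying idea (none of this is HiveSynth's actual code or architecture), one way to learn such associations is to map paired image and audio features into a shared embedding space with a contrastive objective, so that a new image can retrieve the sound features that drive synthesis. Every module, dimension and hyperparameter below is an assumption.

```python
# Sketch: contrastive learning of image-sound associations (all details assumed).
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Small MLP mapping precomputed features into a shared embedding space."""
    def __init__(self, in_dim, emb_dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, emb_dim))
    def forward(self, x):
        return F.normalize(self.net(x), dim=-1)

image_enc = Encoder(in_dim=2048)   # e.g. pooled video-frame features (assumed)
audio_enc = Encoder(in_dim=128)    # e.g. mel-spectrogram frames (assumed)

opt = torch.optim.Adam(list(image_enc.parameters()) + list(audio_enc.parameters()), lr=1e-3)

# Dummy batch of paired image/audio features standing in for real footage.
images, audio = torch.randn(16, 2048), torch.randn(16, 128)

for step in range(100):
    zi, za = image_enc(images), audio_enc(audio)
    logits = zi @ za.t() / 0.07                      # pairwise similarity matrix
    targets = torch.arange(len(images))
    loss = F.cross_entropy(logits, targets)          # match each image to its own sound
    opt.zero_grad(); loss.backward(); opt.step()

# At inference, a new image embedding selects the nearest audio embedding,
# whose features then drive sound synthesis.
```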
​​
Release date: Spring 2025
It is being developed by my music tech company, Beestings Labs.

A Virtual Assistant using Artificial Intelligence Technology for the social and logistical support of people with Dementia (2020-onwards)
​
This is a collaborative research project with the NHS Cornwall Partnerships/Plymouth UK.
​
The project involves examining and creating a virtual assistant to support people living with dementia, and their carers, in rural areas.
​
We are currently undertaking an NIHR-funded PPI (patient and public involvement) study and have a paper in progress.
​
​
Across Voids (2018-2019)
​
Across Voids is an immersive experience which explores how AI and immersive technologies can help support the grieving process.
​
It was funded by UKRI Research England in 2018 as part of the South West Creative Technology Network.
​
You can read about the project here: https://swctn.org.uk/2019/10/30/across-voids-an-interactive-experience-on-grief/
​
You can watch an excerpt of the experience here; it premiered at the University of Birmingham's BEAST FEaST 2019 festival.

Birmingham Ensemble for Electroacoustic Research (BEER)
​
The ensemble was founded in 2011 as a research project within the Music Department at the University of Birmingham to explore aspects of real-time electroacoustic music making. Particular interests include networked music performance (generally via our Utopia project), group improvisation and live coding.
​
Past and current members include Scott Wilson, Norah Lorway, Martin Ozvold, Winston Yeung, Luca Danieli and Konstantinos Vasilakos.
​
You can read our Computer Music Journal article:
https://www.mitpressjournals.org/doi/abs/10.1162/COMJ_a_00229
​
Anthropocene: The Human Epoch (2018)
(I composed the original score along with Rose Bolton)
​
A cinematic meditation on humanity's massive reengineering of the planet, ANTHROPOCENE: The Human Epoch is a feature documentary, four years in the making, from the multiple award-winning team of Jennifer Baichwal, Nicholas de Pencier and Edward Burtynsky.
​
World Premiere: Toronto International Film Festival
It has since screened at the Sundance Film Festival, the Berlin International Film Festival and many others.
​
Rose and I were nominated for a Cinema Eye Honors award for Outstanding Achievement in Original Music Score in 2020.
We were also named in Sundance Institute's "18 Women Composers You Should Know" list in 2019.