PROJECTS AND RESEARCH

Selected recent projects are listed below.

FutureWorld (2015/2020)

music made with SuperCollider (2015)

visuals made with JavaScript (2020)

field recordings from Vancouver, Stockholm, Copenhagen and Tokyo

Audio released here: 

https://xylemrecords.bandcamp.com/album/and-then-you-win-2

Stream (2020)

 

"Stream" is an audio visual ambient/techno experience which is based on the idea of swarms moving through an undefined isolated space. I was trying to create a detached yet connected experience. 

Made with JavaScript + SuperCollider.

Video footage: Somewhere above Greenland on an Air Canada flight from London to Vancouver. 
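
The swarm behaviour behind the piece can be illustrated with a very small agent simulation: each point drifts toward the group's centroid while keeping its own random motion, which is roughly the "detached yet connected" quality described above. The JavaScript canvas sketch below is only an illustration of that idea (the canvas id, agent count and weights are arbitrary assumptions), not the code used in Stream.

// Minimal swarm sketch (illustrative only, not the actual Stream code).
// Assumes a page containing <canvas id="swarm" width="800" height="600"></canvas>.
const canvas = document.getElementById('swarm');
const ctx = canvas.getContext('2d');

const N = 120;
const agents = Array.from({ length: N }, () => ({
  x: Math.random() * canvas.width,
  y: Math.random() * canvas.height,
  vx: Math.random() * 2 - 1,
  vy: Math.random() * 2 - 1,
}));

function step() {
  // Fade the previous frame slightly so trails linger in the "undefined space".
  ctx.fillStyle = 'rgba(0, 0, 0, 0.08)';
  ctx.fillRect(0, 0, canvas.width, canvas.height);

  // Compute the swarm centroid once per frame.
  let cx = 0, cy = 0;
  for (const a of agents) { cx += a.x; cy += a.y; }
  cx /= N; cy /= N;

  ctx.fillStyle = 'rgba(200, 220, 255, 0.8)';
  for (const a of agents) {
    // A weak pull toward the centroid keeps the swarm loosely connected...
    a.vx += (cx - a.x) * 0.0005;
    a.vy += (cy - a.y) * 0.0005;
    // ...while random drift keeps each agent detached.
    a.vx += (Math.random() - 0.5) * 0.1;
    a.vy += (Math.random() - 0.5) * 0.1;
    a.x = (a.x + a.vx + canvas.width) % canvas.width;
    a.y = (a.y + a.vy + canvas.height) % canvas.height;
    ctx.fillRect(a.x, a.y, 2, 2);
  }
  requestAnimationFrame(step);
}
step();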

Autopia: An AI Collaborator for Live Coding Music Performance (2019-onwards)

Autopia is a project by Dr Norah Lorway, Arthur Wilson, Dr Edward Powley, John Speakman and Matthew Jarvis from Falmouth University, and Dr Louise Rossiter. It uses template-based genetic programming to write SuperCollider code, with audience feedback determining the fitness function for the code's evolution. It interfaces with Utopia, a system developed at the University of Birmingham by Dr Scott Wilson et al. for collaborative, networked live coding performances.
 

You can read our first paper on it here: repository.falmouth.ac.uk/3326/
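
As a rough illustration of the template-based approach (not the Autopia codebase, whose templates, parameters and voting mechanism differ), a loop of this kind can be sketched as follows: each genome is a set of numeric parameters slotted into a fixed SuperCollider template, and the audience's ratings act as the fitness scores that decide which genomes survive and mutate.

// Loose sketch of template-based genetic programming with an audience-driven
// fitness function (illustrative only; names and templates are invented here).
// Each genome is a set of parameters slotted into a fixed SuperCollider template.
const template = (g) =>
  `{ SinOsc.ar(${g.freq}, 0, ${g.amp}) * LFPulse.kr(${g.rate}) }.play;`;

const randomGenome = () => ({
  freq: 100 + Math.random() * 900,
  amp: Math.random() * 0.5,
  rate: 0.5 + Math.random() * 8,
});

const mutate = (g) => ({
  freq: g.freq * (0.8 + Math.random() * 0.4),
  amp: Math.min(0.5, g.amp * (0.8 + Math.random() * 0.4)),
  rate: g.rate * (0.8 + Math.random() * 0.4),
});

// `audienceVotes` stands in for whatever feedback channel is used
// (e.g. a web form that returns one score per candidate code snippet).
async function evolve(generations, popSize, audienceVotes) {
  let population = Array.from({ length: popSize }, randomGenome);
  for (let gen = 0; gen < generations; gen++) {
    const code = population.map(template);
    const scores = await audienceVotes(code);   // fitness = audience feedback
    const ranked = population
      .map((g, i) => ({ g, s: scores[i] }))
      .sort((a, b) => b.s - a.s);
    const parents = ranked.slice(0, Math.ceil(popSize / 2)).map((r) => r.g);
    // Next generation: keep the best-rated half, refill with mutated copies.
    population = parents.concat(parents.map(mutate)).slice(0, popSize);
  }
  return template(population[0]);
}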

Across Voids (2018-2019)

Across Voids is an immersive experience exploring how AI and immersive technologies can help support the grieving process.

It was funded by UKRI Research England in 2018 as part of the SouthWest Creative Technology Network.

You can read about the project here: https://swctn.org.uk/2019/10/30/across-voids-an-interactive-experience-on-grief/

Anthropocene: The Human Epoch (2018)

(I composed the original score with Rose Bolton)

A cinematic meditation on humanity's massive reengineering of the planet, ANTHROPOCENE: The Human Epoch is a feature documentary, four years in the making, from the multiple award-winning team of Jennifer Baichwal, Nicholas de Pencier and Edward Burtynsky.

World Premiere: Toronto International Film Festival

It has also been shown at the Sundance Film Festival, the Berlin Film Festival and many others.

Rose and I were nominated for a Cinema Eye Honors award for Outstanding Achievement in Original Music Score in 2020.

We were also named in Sundance Institute's "18 Women Composers You Should Know" list in 2019. 

Birmingham Ensemble for Electroacoustic Research (BEER) 

The ensemble was founded in 2011 as a research project within the Music Department at the University of Birmingham to explore aspects of real-time electroacoustic music making. Particular interests include networked music performance (generally via our Utopia project), group improvisation and live coding.

 

Past and current members include Scott Wilson, Norah Lorway, Martin Ozvold, Winston Yeung, Luca Danieli and Konstantinos Vasilakos.

Recent projects have included our Dark Matter collaboration with art@CMS at CERN. https://www.youtube.com/watch?v=U2aDudtCiY4&feature=emb_title

You can read our Computer Music Journal article: 

https://www.mitpressjournals.org/doi/abs/10.1162/COMJ_a_00229

HiveSynth (2019 - onwards)

HiveSynth is an Augmented Reality synthesiser for mobile platforms. It is being developed by my music tech company, Beestings Labs.

"We Have Never Been Asian" 

 

Why is cyberpunk sci-fi always set in Asian cities? Why are Asians often assumed to be good with technology? 《We Have Never Been Asian》 is the world's first short film to investigate this curious linkage between Asia and technology in global media like movies, television, and animation.

Directed by Brent Lin

Music by Norah Lorway

Link: https://www.youtube.com/watch?v=9iahNsfAHUo&t=13s

"Hollow Vertices" (2015-6) 

Live coding - Norah Lorway
Clarinet + effects - Kiran Bhumber
Visuals - Nancy Lee
Premiere - Vivo Media Arts Center, Vancouver, Canada - November 2015
Performances at: NIME 2016, TIES 2016 and ICLC 2016

Hollow Vertices is an improvisatory audio-visual performance environment. The sonic components are co-created through live coding in the real-time audio synthesis language SuperCollider, producing dense percussive and ambient textures. The two sound sources are linked through a custom-built network over which each performer has control of the other's code, introducing developmental elements into the composition. This is combined with an amplified clarinetist using a custom-programmed pedal board in Max/MSP to drive live audio effects.
A projected image displays the performers' live code, framed by another projection of video content manipulated in real time through an internal video feedback process programmed in CoGe VJ. The video feedback is processed by custom-built effects that transform the content into new video feedback abstractions. The effects are programmed so that unpredictable visual outcomes, or glitches, appear; these glitches aid the transitions between visual aesthetics during the performance. Some visual effects are programmed to interact rhythmically with the composition, and others are controlled manually as the piece is improvised.
The composition converges disciplines and mediums by augmenting sensorial modalities through human-computer interaction. This is realized by employing different programming languages and combinations of instruments, and by reacting to the collective output while maintaining awareness of each individual contribution to the composition.
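
As a rough sketch of the video feedback idea described above (the piece itself uses CoGe VJ with custom-built effects; the snippet below is only a generic JavaScript/canvas illustration), a feedback loop can be built by redrawing the canvas into itself with a slight transform each frame, with occasional deliberate glitches.

// Minimal canvas video-feedback sketch (illustrative only; not the CoGe VJ patch).
// Assumes a page containing <canvas id="feedback" width="800" height="600"></canvas>.
const fbCanvas = document.getElementById('feedback');
const fbCtx = fbCanvas.getContext('2d');
const w = fbCanvas.width, h = fbCanvas.height;

function frame(t) {
  fbCtx.save();
  // Redraw the existing canvas into itself, slightly rotated and scaled:
  // this is the feedback loop that folds the image back on itself.
  fbCtx.translate(w / 2, h / 2);
  fbCtx.rotate(0.005);
  fbCtx.scale(1.01, 1.01);
  fbCtx.globalAlpha = 0.96;
  fbCtx.drawImage(fbCanvas, -w / 2, -h / 2);
  fbCtx.restore();

  // Seed the loop with a slowly moving coloured shape.
  fbCtx.fillStyle = `hsl(${(t / 50) % 360}, 70%, 60%)`;
  fbCtx.fillRect(w / 2 + Math.sin(t / 900) * 150, h / 2 + Math.cos(t / 700) * 150, 6, 6);

  // Occasional deliberate "glitch": shift a horizontal strip of the image.
  if (Math.random() < 0.02) {
    const y = Math.random() * h;
    fbCtx.drawImage(fbCanvas, 0, y, w, 20, Math.random() * 40 - 20, y, w, 20);
  }
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);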