A semi-automated/mechanised electric guitar.
Designed to interface with and explore the interval between the physical acoustic, mechanical, electromagnetic, analog electronic, digital electronic and machine learning environments.
Designed for an improvising musician to play freely while, using a machine learning system (AI, if you're a sucker for the marketing line), the guitar attempts to harmonize with or create a counterpoint to the human input, which is mechanised back onto the same instrument.
Somewhere between a human-machine duet and a fight for control over the instrument.
This project is ongoing and still being developed/modified...
Images taken at the Digital Design Weekend at the V&A Museum, London.
Version 3.0
Video/Audio documentation coming soon...
New, more sophisticated M.L. response system.
More nuanced input tracking to feed the M.L. system, using a DIY hexaphonic pickup and audio-to-MIDI conversion (see the sketch after this list).
Sustainer system finally balanced and quiet(!)
New neck.
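The input tracking reduces each hexaphonic channel to note data the M.L. system can respond to. As a rough, hypothetical illustration only (the actual build uses its own DIY pickup and audio-to-MIDI conversion, not necessarily this library), here is a minimal Python sketch of monophonic pitch tracking on one string's channel with librosa; the file name and frequency range are placeholders.

import librosa
import numpy as np

def string_to_midi_notes(path, fmin=82.41, fmax=1318.5):
    # Track pitch on one string's mono signal and return rough MIDI note numbers.
    y, sr = librosa.load(path, sr=None, mono=True)                 # one hexaphonic channel
    f0, voiced, _ = librosa.pyin(y, fmin=fmin, fmax=fmax, sr=sr)   # per-frame pitch in Hz
    midi = librosa.hz_to_midi(f0[voiced])                          # keep voiced frames only
    return np.round(midi).astype(int)                              # quantise to semitones

# e.g. string_to_midi_notes("low_e_channel.wav")   # hypothetical recording of one string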
Version 2.0
Late 2023
Developed during the Patterns in Practice / Watershed artist residency.
With fretboard control and a primitive Markov chain machine learning system (see the sketch after this list).
Sustainer system still noisy and unstable.
M.L. responses trained on small datasets, but quite wily and chaotic.
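For context, a first-order Markov chain of this kind simply learns which note tends to follow which, then walks those probabilities to generate a response. The sketch below is a hypothetical Python illustration of that general technique; the training notes, function names and phrase length are made up and it is not the residency code.

import random
from collections import defaultdict

def train(notes):
    # Count first-order transitions between consecutive MIDI notes.
    table = defaultdict(list)
    for current, nxt in zip(notes, notes[1:]):
        table[current].append(nxt)
    return table

def respond(table, start, length=8):
    # Walk the transition table to generate a short counter-phrase.
    note, phrase = start, [start]
    for _ in range(length - 1):
        choices = table.get(note)
        note = random.choice(choices) if choices else random.choice(list(table))
        phrase.append(note)
    return phrase

training = [64, 67, 69, 67, 64, 62, 64, 67, 71, 69]   # made-up MIDI phrase
print(respond(train(training), start=64))

Trained on a dataset this small, the output loops and lurches unpredictably, which is roughly the "wily and chaotic" behaviour described above.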
Version 1
Early 2023
No fretboard control, no machine learning contribution.
Player-responsive solenoid string attacks and sustainers (fairly unstable/noisy at this point).
The string-attack solenoids and sustainers can also play back automated, precomposed MIDI sequences (see the sketch below).
Also see the piece I wrote for the Nadar Ensemble with the guitar at this stage in its development.
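As a rough illustration of that playback path (hypothetical: the real solenoid driver and MIDI routing are not shown here), a minimal Python sketch using mido that steps through a precomposed MIDI file in real time and hands each note-on to a placeholder solenoid trigger.

import mido

def fire_solenoid(note, velocity):
    # Placeholder: the real build would pulse the solenoid mapped to this string/position.
    print(f"strike note {note} at velocity {velocity}")

def play_sequence(path):
    # Step through a precomposed MIDI file in real time, triggering string attacks.
    for msg in mido.MidiFile(path).play():          # .play() sleeps between messages
        if msg.type == "note_on" and msg.velocity > 0:
            fire_solenoid(msg.note, msg.velocity)

# play_sequence("precomposed_sequence.mid")         # hypothetical file name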