Google’s “Magenta” project has what I’m looking for at the moment: a suite of AI applications I can download and run on my iMac to process and generate MIDI files. You get these applications:
* Generate
* Continue
* Interpolate
* Groove
* Drumify
which can be pipelined together like this (a rough Python sketch of the same idea follows the list):
1) Run “Generate” to create a couple of short melodic variations as MIDI files
2) Feed these files individually into “Continue” to expand them
3) Feed these expanded variations into “Interpolate” to combine them
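Magenta Studio itself is point-and-click, but the same models are also exposed through Magenta’s Python library, so here’s a minimal sketch of steps 1 and 3 using the MusicVAE API. This is just how I understand the API from the Magenta docs, and the config name and checkpoint filename are assumptions — point them at whichever pre-trained model you actually download.

```python
# Minimal sketch of the Generate -> Interpolate idea via Magenta's Python
# MusicVAE API instead of the Studio apps. The config name and checkpoint
# path are placeholders for whatever pre-trained model you download.
import note_seq
from magenta.models.music_vae import configs
from magenta.models.music_vae.trained_model import TrainedModel

config = configs.CONFIG_MAP['cat-mel_2bar_big']          # 2-bar melody model
model = TrainedModel(config, batch_size=4,
                     checkpoint_dir_or_path='cat-mel_2bar_big.ckpt')  # placeholder path

# "Generate": sample two short melodic variations
length = config.hparams.max_seq_len
a, b = model.sample(n=2, length=length, temperature=0.9)

# "Interpolate": blend the two variations into a few in-between melodies
blends = model.interpolate(a, b, num_steps=5, length=length)

# Write everything out as MIDI so Logic (or BIAB) can pick it up
for i, seq in enumerate([a, b] + list(blends)):
    note_seq.sequence_proto_to_midi_file(seq, f'magenta_out_{i}.mid')
```

Step 2 (Continue) uses a different model under the hood, as far as I can tell, so it isn’t covered by this sketch; I’d still do that step in the Studio app.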
So far I have only succeeded in creating some awful noise, but it was MIDI material produced by this software and rendered appropriately by Logic, so I’m where I want to be and ready for some experimenting! I think the most promising route to listenable music will be running Continue on an existing melody, then Interpolate on multiple continuations. In the end, if this works out, I can feed a generated MIDI file into BIAB for chord recognition and have it add a band.
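For that “Continue on an existing melody” route, the existing melody first needs to be a short primer clip. A minimal prep-step sketch, assuming only the note_seq package and a stand-in filename my_melody.mid:

```python
# Load one of my existing melody MIDIs, trim it to the opening bars, and save
# that as the primer clip to drop into Continue. 'my_melody.mid' is a
# stand-in filename; the 8-second cutoff is arbitrary.
import note_seq

melody = note_seq.midi_file_to_note_sequence('my_melody.mid')

# Keep roughly the first 8 seconds as the primer
primer = note_seq.extract_subsequence(melody, start_time=0.0, end_time=8.0)

note_seq.sequence_proto_to_midi_file(primer, 'primer.mid')
print(f'{len(primer.notes)} notes kept, {primer.total_time:.1f}s long')
```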
https://magenta.tensorflow.org <= Magenta project home
https://magenta.tensorflow.org/studio <= Magenta Studio: Ableton Live plugin, macOS / Windows standalones
https://magenta.tensorflow.org/ddsp-vst <= Neural Synthesis: virtual instrument and effect plugins