I would really like to see an MPE-enabled MIDI stream processing tool. I'm imagining a MIDI FX plugin that lets you write Python or JavaScript against the MIDI stream, with MPE-aware functions or something along those lines. With some of the barriers to entry already handled for you, this feels like a place where real innovation could happen.
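To make that concrete, here's a rough sketch of the kind of thing I mean, written against the mido Python library. The port names, the zone layout (member channels 2-16), and the particular transforms are just assumptions to illustrate the idea, not a proposal for the actual API:

```python
# Minimal sketch of an MPE-aware MIDI stream processor using mido.
import mido

MASTER_CHANNEL = 0              # MPE lower-zone master channel (MIDI channel 1)
MEMBER_CHANNELS = range(1, 16)  # per-note member channels (MIDI channels 2-16)

def shape_pressure(value):
    """Apply a gentle exponential curve to channel pressure (0-127)."""
    return int(round(((value / 127.0) ** 1.5) * 127))

def scale_bend(pitch, factor=0.5):
    """Tame per-note pitch bend by a constant factor (range -8192..8191)."""
    return max(-8192, min(8191, int(pitch * factor)))

def process(msg):
    """Transform one message; per-note expression lives on the member channels."""
    if getattr(msg, 'channel', None) not in MEMBER_CHANNELS:
        return msg  # leave master-channel and channel-less messages untouched
    if msg.type == 'aftertouch':                            # per-note pressure
        return msg.copy(value=shape_pressure(msg.value))
    if msg.type == 'pitchwheel':                            # per-note pitch bend
        return msg.copy(pitch=scale_bend(msg.pitch))
    if msg.type == 'control_change' and msg.control == 74:  # timbre / slide
        return msg.copy(value=127 - msg.value)               # e.g. invert the slide axis
    return msg

if __name__ == '__main__':
    # Port names are placeholders; the virtual output needs the rtmidi backend.
    with mido.open_input('Seaboard RISE') as inport, \
         mido.open_output('MPE processed', virtual=True) as outport:
        for msg in inport:
            outport.send(process(msg))
```

The point is that pressure, pitch bend, and CC74 get treated as per-note streams keyed by channel, which is exactly the part a generic, non-MPE-aware MIDI FX gets wrong.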
Personally, I find the controllers' built-in capabilities, and the current approach of putting the parameters for modulating the controller's output into the driver, to be a bad place to be. ROLI's Dashboard doesn't give me the tools I need to properly interpret and change the raw controller output from my Seaboard.
Additionally, since this is a technology that allows for more expressive playing and a deeper relationship with the sound and the modulation of it, I think the best way forward is deeper and deeper patches, with more (and more complex) modulation schemes adapted to the instrument.
Let me explain what I mean: when I play the saxophone I have amazingly fine control over many dimensions of the sound simultaneously. I can control volume separately from breath intensity, separately from pitch (mouth control), separately from pitch (fingering control), and so on. And even though these controls are discrete, they do interact: if you overblow, it has an impact on the pitch, etc. To get to the point where we can make instruments that are expressive to that degree (not to emulate the saxophone, just to have the depth), we need the ability to really finely tune all the data coming from the controllers (with curves, envelopes, compressors/expanders, LFOs, etc.), to have those signals cross-modulate each other, and, most importantly, to store these modulation configs with the patches. So when you bring up a patch, you also bring up all the finely tuned controller processing.
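As a rough sketch of what I mean by "stored with the patch", here's one possible shape for it in Python: per-dimension response curves plus a single cross-modulation term, serialized as JSON alongside the patch. Every field name and the curve model are hypothetical; the point is just that this data travels with the patch:

```python
# Hypothetical per-patch modulation config: curves per expression dimension,
# one cross-modulation term, saved/loaded next to the patch as JSON.
import json
from dataclasses import dataclass, field, asdict

@dataclass
class CurveConfig:
    """Response curve for one expression dimension (input normalized to 0..1)."""
    exponent: float = 1.0      # curve shape (1.0 = linear, >1 = softer onset)
    floor: float = 0.0         # minimum output after scaling
    ceiling: float = 1.0       # maximum output after scaling
    smoothing_ms: float = 0.0  # low-pass smoothing, like a slew limiter

    def apply(self, x):
        return self.floor + (x ** self.exponent) * (self.ceiling - self.floor)

@dataclass
class PatchModConfig:
    """Controller processing that gets saved and recalled with the patch."""
    pressure: CurveConfig = field(default_factory=CurveConfig)
    slide: CurveConfig = field(default_factory=CurveConfig)  # CC74 / timbre
    pitch_bend_range: int = 48                               # semitones
    pressure_to_pitch: float = 0.0  # cross-modulation: overblowing-style pitch pull

    def save(self, path):
        with open(path, 'w') as f:
            json.dump(asdict(self), f, indent=2)

    @classmethod
    def load(cls, path):
        with open(path) as f:
            raw = json.load(f)
        raw['pressure'] = CurveConfig(**raw['pressure'])
        raw['slide'] = CurveConfig(**raw['slide'])
        return cls(**raw)

# Recalling the patch also recalls its finely tuned controller processing:
cfg = PatchModConfig(pressure=CurveConfig(exponent=2.0, smoothing_ms=15.0),
                     pressure_to_pitch=0.1)
cfg.save('my_patch.modconfig.json')
same_cfg = PatchModConfig.load('my_patch.modconfig.json')
```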
In this scheme, should the DAW record the raw controller data, or the processed data, or…?