MusiKraken - modular MIDI controller construction kit

I recently discovered this amazing forum, and a quick search showed me that there is only one mention of MusiKraken so far. As the app is all about musical expression (including poly expression), I thought a quick introduction might be useful.

MusiKraken is an iOS and Android app (and currently also has a web version). It is basically a modular expressive MIDI controller that allows you to (mis-)use all the sensors of your mobile device, tablet or computer as building blocks for your own custom MIDI controller setup.

So, for example, it supports multi-touch, camera (hand tracking including depth sensing on iOS devices with a TrueDepth/LiDAR sensor, body pose tracking, face tracking, color tracking and so on), motion sensors, microphone, game controllers, the Apple Watch, the motion sensors in AirPods… The long-term goal is to have the app with the most human-computer interaction types on the App Store (it might already be, but just to make sure, I will add more…).

The app supports all kinds of MIDI transports: inter-app MIDI, MIDI via USB (if supported by your device), MIDI via WiFi, and MIDI via Bluetooth (including my own implementation of MIDI via WiFi and Bluetooth for Android devices that do not officially support it). The iOS app can also host Audio Units (AUv3) and send raw sensor data via OSC. And it officially supports MIDI 2.0 (as a corporate member of the MIDI Association, I even signed the licensing contract for the MIDI 2.0 logo, which I haven’t even used yet).

And the app has many MIDI effects that can be combined with the sensors and the outputs to create whatever expressive MIDI controller you want.

For poly expression, the keyboard module in the app supports MPE. So you can slide on the keys and use touch radius or pressure (when supported by the device) to control the different MPE parameters. But you can also combine the keyboard with a Chord Splitter module to split chords into separate voices and send each one to a separate MIDI channel. This way you can kind of create your own MPE-like instrument by controlling all kinds of parameters (Control Change, Channel Pressure, Pitch Bend etc.) separately per key.
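
To make that a bit more concrete, here is what such a stream looks like at the MIDI 1.0 level: each note of a chord gets its own channel, so channel-wide messages like Pitch Bend or Channel Pressure only affect that one note. This is just a simplified Swift sketch of the concept, not code from the app:

```swift
// Simplified sketch (not MusiKraken code): an "MPE-like" stream where each note
// of a chord is sent on its own MIDI channel, so per-channel messages such as
// Pitch Bend only affect that single note.

struct MidiMessage {
    let bytes: [UInt8]
}

/// Note On on a given channel index (0-based, so index 1 == "MIDI channel 2").
func noteOn(channel: UInt8, note: UInt8, velocity: UInt8) -> MidiMessage {
    MidiMessage(bytes: [0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])
}

/// 14-bit Pitch Bend on a given channel (0x2000 = center, i.e. no bend).
func pitchBend(channel: UInt8, value14: UInt16) -> MidiMessage {
    MidiMessage(bytes: [0xE0 | (channel & 0x0F),
                        UInt8(value14 & 0x7F),          // LSB
                        UInt8((value14 >> 7) & 0x7F)])  // MSB
}

// A C major chord split onto channels 2, 3 and 4 (indices 1, 2, 3), the way a
// Chord Splitter with starting channel 2 would route it:
let chord: [(channel: UInt8, note: UInt8)] = [(1, 60), (2, 64), (3, 67)]
var stream = chord.map { noteOn(channel: $0.channel, note: $0.note, velocity: 100) }

// Bend only the top note (channel index 3) upward; the other two notes stay put.
stream.append(pitchBend(channel: 3, value14: 0x2800))
```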

And if you enable MIDI 2.0 in the settings (and find a synth that actually supports it, which is still very, very rare), you can also already send MIDI 2.0 registered and assignable per-note controller and per-note pitch bend events (I haven’t found any other MIDI controller yet that supports this. But as long as there are no synths that support it, it is also kind of useless :slight_smile: ).
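
For the curious, a per-note pitch bend in MIDI 2.0 is a 64-bit Universal MIDI Packet that carries the note number directly in the message, so no channel rotation is needed. The sketch below is based on my reading of the UMP packet layout and is not code from the app:

```swift
// Rough illustration of the packet layout (my reading of the MIDI 2.0 / UMP
// spec, not MusiKraken code):
//   word 0: [type = 0x4][group][status = 0x6 | channel][note][reserved]
//   word 1: 32-bit bend value, 0x8000_0000 = center (no bend)

func perNotePitchBendUMP(group: UInt32, channel: UInt32,
                         note: UInt32, bend: UInt32) -> (UInt32, UInt32) {
    let word0: UInt32 = (0x4 << 28) |              // message type 4: MIDI 2.0 channel voice
                        ((group & 0xF) << 24) |
                        (0x6 << 20) |              // status: per-note pitch bend
                        ((channel & 0xF) << 16) |
                        ((note & 0x7F) << 8)
    return (word0, bend)
}

// Bend note 60 slightly upward on group 0, channel 1 (index 0):
let (word0, word1) = perNotePitchBendUMP(group: 0, channel: 0,
                                         note: 60, bend: 0x9000_0000)
```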

Here is a quick overview of what it can do: https://www.youtube.com/watch?v=_WHd4yWv418

www.musikraken.com

8 Likes

Thanks for introducing your synth, I honestly hadn’t seen this before. I played around with it yesterday, and it is fascinating how it makes use of pretty much all the sensors that the iPad has. This is easily one of the most innovative music-related iPad apps to date, which might justify getting an iPad just for that!

3 Likes

Thanks, I’m glad you like it! So far I have concentrated on the sensors that are more or less controllable, but I will probably add support for the more random ones as well (like GPS, which probably doesn’t make much sense as a MIDI controller, or the current memory or processor usage values, which are kind of random), just to have a complete list of input sensors.

2 Likes

Makes sense, not everything that is possible has to be done. I am glad that it doesn’t scan my emails or phone calls to switch between minor and major depending on my extrapolated mood :stuck_out_tongue:

1 Like

Great idea, maybe I should add this, thanks! :grin:

2 Likes

Wow - very impressive. It will take me some time to absorb what it might do. Thanks!

2 Likes

Hello,
I am testing MusiKraken on Android and iOS (the possibility of using it is even what made me buy an iPad!), and it is indeed a tool with great potential.
I first have a question about the MPE mode, where I cannot obtain a continuous pitch bend (X axis): the emitted values always jump from one note to the next. I think I have tried all the parameters, but can you tell me how to do it?
I have also read that the resolution of the finger-radius detection (on iOS), which is supposed to replace pressure, is very low: would it be possible to apply an interpolation at the source? I can of course smooth the received values, but that does not change the fact that there are only 4 or 5 steps…
Still for the iPad: do you plan to add the Pencil as a sensor? I only know of Pen2Bow that uses it, but it doesn’t allow sending the XY position. That would be a great module, right?
Thanks!

2 Likes

Thanks!
I think I accidentally removed the setting for continuous pitch bend that still matches up with the keys :-0, so currently you can either only pitch bend to the exact note, or use the Value-to-MIDI converter to pitch bend based on the x-axis (but then it is more difficult to match up with the keys).
I will add it again in the next version (I just fixed a bug with MIDI 2.0 in Audio Units that appeared in iOS 18.1, so I need to do a release next week anyway).

And yes, the finger radius only has about 5 steps (too bad that Apple removed the touch pressure sensors). I will also add a smoothing setting in the next version; I wanted to add that a while ago anyway (I always used the Value Smoother module as a workaround when necessary, but it would probably be better to have this directly in the keyboard module).
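
For anyone curious what such a smoother boils down to: the basic idea is just a one-pole low-pass that glides between the coarse steps, at the cost of a little lag. This is only a simplified sketch of the concept, not the actual Value Smoother implementation:

```swift
// Simplified sketch of the concept (not the actual Value Smoother code):
// a one-pole low-pass that turns the 4-5 coarse radius steps into a smoother
// control signal, trading a bit of responsiveness for the smoothing.

struct OnePoleSmoother {
    /// 0 = no smoothing, values close to 1 = very slow, very smooth response.
    var smoothing: Double
    private var state: Double?

    init(smoothing: Double = 0.85) {
        self.smoothing = smoothing
    }

    mutating func process(_ input: Double) -> Double {
        guard let previous = state else {
            state = input
            return input
        }
        let output = smoothing * previous + (1 - smoothing) * input
        state = output
        return output
    }
}

// Feeding in the stepped radius values produces a gradually gliding output:
var smoother = OnePoleSmoother(smoothing: 0.85)
let steppedRadius = [0.2, 0.2, 0.4, 0.4, 0.4, 0.6, 0.6]
let smoothed = steppedRadius.map { smoother.process($0) }
```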

The Apple Pencil already works. I didn’t create an extra module for that, but added it to the TouchPad module. You can see a demo of what it can do here:

I currently only support the x and y positions, both tilt angles, and pressure. The newer iPads in combination with the newer Pencils would also support rotation and hovering above the iPad, but you cannot use that yet with MusiKraken, simply because I do not have a newer iPad with the newest Apple Pencil to test it :slight_smile: (it is probably easy to implement)…

1 Like

Thank you very much for your answer.
Yes, continuous PB at the key level is needed, because unless I am wrong, the Value-to-MIDI converter on the X axis is not polyphonic and doesn’t work in MPE.
About the Pencil, I am embarrassed not to have seen it: it works perfectly, and since I bought an iPad 10 it can only use the Pencil 1.
This also means, alas, that I do not have a TrueDepth camera, so the hand tracking is not very good. But even if it is far from my Leap Motion 2, it can be useful to have it on the iPad sometimes.
The integration of all these possible gestures in a single app is incredible :wink:
By the way, would it be possible to mute some modules in order to make them temporarily inactive? Or is there perhaps already such a function?

1 Like

The Value-to-MIDI converter creates both per-note and non-per-note events if the input has per-note information (except for MIDI events that are already polyphonic, like Aftertouch). Currently only the keyboard module and external MPE controllers have per-note information. Both the per-note and the non-per-note events are then routed through all the nodes until one of the nodes decides to use one or the other.
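
As a toy model of that concept (the names are invented, this is not how it is actually implemented): a converted value can carry an optional note reference, the converter emits both flavors, and a downstream node keeps whichever one it wants.

```swift
// Toy model only (invented names, not the actual internals): the converter
// emits both a channel-wide and a per-note event when note information is
// available; a downstream node then consumes one and drops the other.

enum ConvertedEvent {
    case channelPitchBend(value14: UInt16)                // non-per-note
    case perNotePitchBend(note: UInt8, value14: UInt16)   // per-note
}

func convert(x: Double, note: UInt8?) -> [ConvertedEvent] {
    let clamped = min(max(x, 0), 1)
    let value14 = UInt16(clamped * 16383)
    var events: [ConvertedEvent] = [.channelPitchBend(value14: value14)]
    if let note = note {
        events.append(.perNotePitchBend(note: note, value14: value14))
    }
    return events
}

// With per-note info both flavors are produced; something like the Chord
// Splitter can then pick the per-note one and ignore the channel-wide one.
let events = convert(x: 0.75, note: 60)
```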

So if you want to build an MPE controller from that, you could, for example, combine this with the Chord Splitter effect. Connect the MIDI Out of the keyboard with the MIDI In of the Chord Splitter. Then connect the x-port of the keyboard with the MIDI In of the Chord Splitter, and change the newly created Value-to-MIDI converter to generate Pitch Bend events.
Now, in the Chord Splitter settings, change the starting channel to 2 (because the first channel is used as the master channel in MPE), and disable “activate delay”. The delay is only there so that, for example, the topmost note is always sent on channel 2, the second highest on channel 3, etc., but you do not need that in MPE, so you can deactivate it.

By default, the Chord Splitter uses the per-note events instead of the non-per-note events; this way you can control every Control Change, Pitch Bend, and so on of each channel separately. You can also deactivate this in the settings if you want.

So now you should have a setup that supports MPE instruments and sends polyphonic pitch bend. The x-axis in this case is far too sensitive if the MPE instrument is set to support 48 semitones, so either change that in the instrument settings or reduce the x-axis sensitivity and pitch bend range in MusiKraken…
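
The sensitivity mismatch is really just a scaling question: a 14-bit pitch bend value only means something relative to the receiver’s bend range, so the same finger movement covers 48 semitones or 2 semitones depending on that setting. A sketch of the arithmetic (assuming standard 14-bit pitch bend with center 8192):

```swift
// Sketch of the scaling behind the sensitivity mismatch (standard 14-bit
// pitch bend assumed, center = 8192).

/// Map a desired offset in semitones to a 14-bit pitch bend value,
/// given the receiver's bend range in semitones.
func pitchBendValue(semitones: Double, bendRange: Double) -> UInt16 {
    let normalized = max(-1, min(1, semitones / bendRange))
    let value = 8192 + Int((normalized * 8191).rounded())
    return UInt16(max(0, min(16383, value)))
}

// +1 semitone with a 48-semitone bend range barely moves the value...
let wideRange = pitchBendValue(semitones: 1, bendRange: 48)   // 8363
// ...while with a 2-semitone bend range it uses half of the upward span.
let narrowRange = pitchBendValue(semitones: 1, bendRange: 2)  // 12288
```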

But you can also simply activate the MPE mode of the keyboard and use it directly, also with continuous pitch bend, once I have finally released the update (I found another iOS 18.1 bug that needed fixing, but I think I should be able to release the update soon).

But the cool thing about the Chord Splitter is that it works not only with MPE instruments. You can also simply load the same instrument multiple times, assign each instance to a different channel, and then you have your own custom, per-note, MPE-like instrument… :grin:

2 Likes

Here is a video explaining this concept:

The functionality to mute modules is something that is often requested, but I never took the time to implement it. I will move it higher up on my TODO list; it should be easy to implement and might be very useful for complex setups.

2 Likes

@PrS The update was released today. You can now activate continuous pitch bend for MPE in the keyboard again. There is a setting in the keyboard module called “continuous pitch bend”, in the MPE section. Simply enable this and it should compute the correct pitch bend no matter which keyboard layout combination you use.
The continuous pitch bend only starts near the borders of the first key pressed by a finger. This way you won’t accidentally trigger pitch bends if you just want to play the keys. But once you move away from the start key, it should compute the distance to the target pitch of the next key and linearly pitch bend between them (so it should also work if you only have the keys of a specific scale in the layout).
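
Conceptually it works roughly like this (a simplified sketch with made-up parameter names, not the actual implementation): no bend inside a dead zone around the start key, then a linear bend toward the pitch of the neighboring key based on how far the finger has travelled.

```swift
// Simplified sketch of the behavior (made-up parameter names, not the real code).
//
// - xOffset:           finger travel from the start key's center, in key widths
//                      (+1 / -1 reaches the center of the neighboring key)
// - deadZone:          fraction of a key width around the start key with no bend
// - intervalToNextKey: pitch distance to the neighboring key in semitones
//                      (e.g. 2 for a whole step in a scale layout)
func continuousBendSemitones(xOffset: Double,
                             deadZone: Double,
                             intervalToNextKey: Double) -> Double {
    let distance = abs(xOffset)
    guard distance > deadZone else { return 0 }
    // Interpolate from 0 at the dead-zone edge to the full interval at the next key.
    let t = min(1, (distance - deadZone) / (1 - deadZone))
    return t * intervalToNextKey * (xOffset < 0 ? -1 : 1)
}

// Halfway between the dead zone and a neighbor that is a whole step away:
let bend = continuousBendSemitones(xOffset: 0.6, deadZone: 0.2, intervalToNextKey: 2)  // 1.0
```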

3 Likes

It’s perfect, thanks a lot!

1 Like