Need help finding some MPE midi recordings

Does anyone know if there is a place to find MIDI recordings of various MPE devices? Something that could be used to test a virtual instrument's MPE compatibility.

Seaboard 1/2

I can hardly find any MPE-recorded .mid files. Any assistance would be much appreciated, thank you!!


Hi @Davidson and welcome to the polyexpression community!

I can record some “noodling around”, say, with Geert’s “MIDI Tape Recorder”, and export that into a MIDI file you can use for playback…

ContinuuMini with preset:
MPE 4 Voice

Four-note polyphony MPE.
Uses MIDI Channel 1 as the Master Channel, and MIDI Channels 2-5 for notes.

W: a MIDI KeyOn is sent when a finger initially touches the surface, KeyOff when the finger is lifted.
X: pitch is encoded using the default MPE Pitch Bend Range of “48”.
Y: front-to-back position is encoded with controller “74”.
Z: pressure is encoded using Channel Pressure.

RPN 0 is used to transmit the bend range…
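To make the preset description above concrete, here is a minimal Python sketch of the MIDI 1.0 byte sequences such a controller would send for one touch. The function names and structure are my own illustration (not taken from the ContinuuMini), assuming a member channel from the 2-5 range and the default 48-semitone bend range:

```python
# Sketch of the byte stream for one MPE touch. Channel numbers are the
# zero-based values that go into the status byte, so MIDI channel 2 is 1.

def rpn_pitch_bend_range(channel, semitones):
    """RPN 0,0 (pitch bend sensitivity) announcing the bend range."""
    return [
        0xB0 | channel, 101, 0,         # RPN MSB = 0
        0xB0 | channel, 100, 0,         # RPN LSB = 0
        0xB0 | channel, 6, semitones,   # Data Entry MSB = range in semitones
    ]

def mpe_touch(channel, note, velocity, bend_semitones, y, pressure,
              bend_range=48):
    """One finger touch: X, Y and Z state first, then the note-on (W)."""
    bend = int(round(8192 + (bend_semitones / bend_range) * 8192))
    bend = max(0, min(16383, bend))                # clamp to 14 bits
    return [
        0xE0 | channel, bend & 0x7F, bend >> 7,    # X: 14-bit pitch bend
        0xB0 | channel, 74, y,                     # Y: CC74, front-to-back
        0xD0 | channel, pressure,                  # Z: channel pressure
        0x90 | channel, note, velocity,            # W: note-on
    ]

msgs = rpn_pitch_bend_range(1, 48) + mpe_touch(1, 60, 100, 0.0, 0, 0)
```

Each value stays within the 7-bit data-byte range; the pitch bend is the usual 14-bit value split into two 7-bit bytes with centre at 8192.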

  • start with selecting the preset
  • one finger playing: separate notes, then sliding, sometimes using Y (CC74) (33.7 KB)

Let me know if it’s any good.
Tomorrow I’ll do the same with the Osmose, Continuum, and LinnStrument.


as a developer, I’d say testing against MIDI files won’t gain you very much…

you know the MPE spec,
really, if you are supporting pitch bend, CC74 and channel pressure per note (on channels 2-16), then there is not much to it. to test that, you could ‘drive’ your plugin with any DAW that supports MPE,
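To illustrate how little there is to it, here is a rough Python sketch of the receive side (class and function names are my own, not from any SDK): three expression values tracked per member channel.

```python
# Minimal per-channel MPE expression state. Channel numbers are the
# zero-based status-byte values, so member channels 2-16 appear as 1-15.

class MpeChannelState:
    def __init__(self):
        self.bend = 8192      # 14-bit pitch bend, 8192 = centre (X)
        self.y = 0            # CC74 value (Y)
        self.pressure = 0     # channel pressure (Z)

def handle_message(states, status, data1, data2=0):
    """Update per-channel expression state from one MIDI 1.0 message."""
    kind, channel = status & 0xF0, status & 0x0F
    st = states.setdefault(channel, MpeChannelState())
    if kind == 0xE0:                      # pitch bend: combine two 7-bit bytes
        st.bend = data1 | (data2 << 7)
    elif kind == 0xB0 and data1 == 74:    # CC74
        st.y = data2
    elif kind == 0xD0:                    # channel pressure (single data byte)
        st.pressure = data1
```

A real plugin would of course also route these values to the voice sounding on that channel, but the bookkeeping itself is this small.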

the only other thing to consider is that these 3 messages come in quite a bit faster/more frequently than normal MIDI. it does depend on the surface, but upwards of 50 messages/second is not unusual.
… but on a modern computer, I’ve never had an issue with this.
(k, if you were trying to alter FFT parameters real time - perhaps :wink: )

past that, the feel of the sound engine will very much differ depending on the control surface and the ‘preset’ you are using - that’s why a MIDI file won’t help you much…
the ‘issue’ is, when you play an MPE controller, your expression is in a feedback loop with the sound coming back… (this is kind of the point of expressive controllers ;))

SO… even on the same surface, if you play a flute-like preset vs a key-like preset, the MIDI you generate for expression will look very different.
(and of course, depending on the piece of music you’re playing, you may have more or less expression).

also, different surfaces react very differently… a preset on your plugin that works great on a Continuum might not feel/sound that great on an Erae Touch or LinnStrument.

this is probably also why you cannot find many MIDI files… they are just not that useful, since they are so dependent on patches - hence many of us never record MPE MIDI, only the audio.

honestly, if you want to ‘test’ it, it’s probably best to release a demo/beta version to get some feedback.
but again, be aware, someone playing it with an Osmose will have a very different experience to someone using a ROLI or a Continuum…

bonus tip:
be aware that the Osmose is very different to others, due to the Y axis not being independent of Z…
also, by its nature, on the Osmose Y is always unipolar, whereas Y can often be treated as bipolar on other controllers (where 64 is 0).
technically, it’s no different for developers, it’s all just CC74; however, it will influence presets a lot.
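To make the unipolar/bipolar distinction concrete, here is a pair of hypothetical normalisation helpers (my own sketch, not from any spec or controller firmware):

```python
# Two ways the same CC74 byte gets interpreted, depending on the
# controller's convention for the Y axis.

def cc74_bipolar(value):
    """Map 0..127 to -1..+1 with 64 as centre (typical Y-as-tilt surfaces)."""
    return (value - 64) / 64.0 if value < 64 else (value - 64) / 63.0

def cc74_unipolar(value):
    """Map 0..127 to 0..1 (Osmose-style aftertouch-zone Y)."""
    return value / 127.0
```

A synth that hard-codes one interpretation will feel wrong on controllers using the other, which is part of why this influences presets so much.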


Wouldn’t it be easier, for a start, to download a free or inexpensive DAW with MPE support in which you could draw events?
Not ideal, but… you would be able to check basic behaviour.
But as @thetechnobear said very clearly, it’s a complicated game, as you have different output sensitivities in controllers, and the same in the virtual synths receiving. So what’s important IMO is to have curve response settings (a.k.a. smoothing) for several messages, at rise and fall. Not all synths have this, but I found it very useful.
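As a sketch of the “curve response at rise and fall” idea: a one-pole smoother with separate attack and release coefficients. The coefficient values here are illustrative, not taken from any particular synth.

```python
# One-pole smoother with asymmetric rise/fall response, applied to an
# incoming expression value (e.g. pressure) each control tick.

class RiseFallSmoother:
    def __init__(self, rise=0.3, fall=0.05):
        self.rise, self.fall = rise, fall   # 0..1; higher = faster response
        self.value = 0.0

    def process(self, target):
        # Pick the coefficient depending on direction of travel.
        coeff = self.rise if target > self.value else self.fall
        self.value += coeff * (target - self.value)
        return self.value
```

A fast rise with a slower fall keeps note attacks responsive while smoothing out jitter as a finger eases off, which is the kind of per-message shaping described above.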


Additional things you can try to test certain aspects and alternative MPE use-case scenarios:

Use the microtonal devices that some DAWs like Ableton Live and Bitwig offer, which make use of MPE in order to do the pitch adjustment per note. Or the Oddsound MTS-ESP suite set to MPE mode, or other competing ones that do have an MPE mode such as

Also, Fluid Chords is a different take on chord pitch bending which makes use of MPE: Fluid Chords - The Ultimate Chord Bending System | Pitch Innovations - or their other product: Fluid Pitch - The Next-Gen Pitch Bend System | Pitch Innovations

Also, if you have Ableton Live with Max for Live, there is a rather interesting MPE step sequencer called seqMPEror.

These things are niche when it comes to chatter about them on the internet, but it’s good to remember that MPE isn’t just about fancy controllers, and one or more of the above may be useful for repeatable testing purposes and for checking that you are interpreting MPE data with an appropriate sense of timing (e.g. the microtonal stuff tends to send the pitch bend change immediately before the note-on, which the official MPE spec says should work, but I’ve had an MPE synth or two that incorrectly fails to respond to such data).


Oh, and if you have an iPad or iPhone, there are a number of apps that act as virtual MPE controllers, and you could send the MPE MIDI from those to your computer.


I forgot to say that if you are determined to test with data from a range of MPE controllers that are relevant today, it’s probably a good idea to add the Ableton Push 3 to your list.


I’ll just point out that some MPE controllers – notably the LinnStrument – allow you to play true unisons, which appear in MIDI as the same note number on different channels. An unfortunately all-too-common MPE synth implementation error is to end the note on both channels when a Note Off for that note is received on either channel. In the implementation, that means you should track Note Ons by channel + note number, not just note number.
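A sketch of that fix in Python (class and names are my own illustration): key active voices by (channel, note) so a true unison survives a Note Off on one of the channels.

```python
# Voice bookkeeping keyed by (channel, note) rather than note alone, so
# the same note number sounding on two member channels stays independent.

class VoiceTracker:
    def __init__(self):
        self.active = {}   # (channel, note) -> voice id

    def note_on(self, channel, note, voice_id):
        self.active[(channel, note)] = voice_id

    def note_off(self, channel, note):
        # Ends only this channel's instance of the note; a unison
        # partner on another channel keeps sounding.
        return self.active.pop((channel, note), None)
```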


Thank you so much for all the replies, very much appreciated. I have been able to confirm a few controllers working well and have a basic VST3 plugin made. It is helping determine basic 5-axis compatibility of MPE devices. Velocity and release velocity I can derive from pressure values, so these don’t have to be sent by the controller, but they can be used if present.

Apologies I did not check in last day or two to see the messages.

KeymanPal, thank you for the MIDI file - perfect! I tested the Cmini file you sent and, good news, it appears to pick up all the signals. If you would like to test the VST3 plugin or make a few more MIDI files from other controllers, it would be a big help. The plugin is pretty basic right now, but once I know it has fairly good compatibility I can make it more full-featured.

Cmini render in Reaper

Thetechnobear, I appreciate the tips - very helpful insights. I am converting everything, MIDI 1.0 and MPE, to MIDI 2.0. Channel pressure on multiple channels ends up as single-channel poly aftertouch, etc. I agree some real-world beta tests will reveal which controllers need special MPE+ configurations. My goal is to use MIDI 2.0 to bridge the previous devices, possibly automating per-note expressions in some way.
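A rough sketch of that conversion (class name and structure are my own, and values are kept in MIDI 1.0 ranges for simplicity): the key piece of state is the mapping from member channel to the note currently sounding on it, so channel pressure can be re-expressed as per-note poly aftertouch.

```python
# Translate MPE channel pressure (one value per member channel) into
# poly aftertouch (one value per note on a single channel).

class MpeToPolyAftertouch:
    def __init__(self):
        self.note_on_channel = {}   # member channel -> sounding note

    def note_on(self, channel, note):
        self.note_on_channel[channel] = note

    def note_off(self, channel):
        self.note_on_channel.pop(channel, None)

    def channel_pressure(self, channel, pressure):
        """Return the equivalent (note, pressure) poly aftertouch pair,
        or None if nothing is sounding on that member channel."""
        note = self.note_on_channel.get(channel)
        return None if note is None else (note, pressure)
```

This only holds for well-formed MPE, where each member channel carries at most one note at a time; pressure arriving on an idle channel is simply dropped here.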

Philipe, good suggestion - I could just draw in the note and expression events.

SteveElbows, an onscreen controller would also be a good way to test. I think the sequencer app sounds cool, I will check it out.

RimWolf, very good point - certain controllers have special events or messages. MPE seems a bit scattered: there is MPE+, high-def, regular MIDI 1.0 channels. Some controllers are set to 48 semitones, some 12, some 96 for per-note pitch. I figured having a few MIDI files would be a good way to check different controllers real fast.

I really appreciate the info and the time each of you took to help me. If anyone is interested in testing the prototype, or later on a more full-featured version, let me know.



I don’t think it’s really as scattered as it may appear to you at this stage. There is stuff in the MPE spec for auto-configuration of pitch bend range, but most MPE synths also let the user set that manually. There are a few hardware MPE synths out there which just assume 48 semitones, which isn’t a completely safe assumption, but people are mostly used to this being the default. As for MPE+, there are pretty much no synths that support this beyond the world of Haken (and now also Expressive E, who use Haken’s engine), who made this standard up themselves. The only exception I know of is a module someone did for VCV Rack. I’m not sure what you are referring to when you mention high-def and regular MIDI 1.0 channels.

To expand on my MPE+ point a little, here are some reasons why you don’t need to worry about it too much at the moment:

People with controllers that can do MPE+ are used to the situation, and options are available which allow their controller to output normal MPE instead. For example, on the Osmose there is a completely separate layer of MIDI output which is designed for controlling other synths, and this outputs normal MPE, not MPE+. Also, the main difference with MPE+ is extra CC values, and if the destination synth doesn’t support those, the rest still works, so there is a degree of backwards compatibility there. And when it comes to the Haken Continuum, people can reprogram patches on that one so that MPE is output rather than MPE+. Apart from those, I don’t know of any other MPE+ controllers, nor of any plans for anyone else to make any.

I can use the situation with the Osmose, and an ‘accidental’ difference between its MPE and MPE+ outputs, to illustrate the sort of tedious detail that can sometimes cause issues, though. Its MPE output sticks to the spec in terms of sending an initial MPE Y CC74 value of zero when a key is first pressed. But if I remember correctly, if I were to use the Haken MPE+ output from the Osmose instead, it fails to send an initial MPE Y CC74 value when a key is first pressed, sending no CC74 messages at all unless the key being played reaches that stage of travel. This can cause glitches on synths that always expect an initial MPE Y value - depending on the assumptions the synth makes, MPE Y modulation values may jump around or be stuck at unintended levels. The proper solution is for changes to be made on the controller side of things, via a future firmware update, but if I were intending to make an MPE+ synth with Osmose users in mind, I would probably try to account for this anomaly myself for now. This issue has only shown up with the Osmose because on the Haken controllers MPE Y is determined by finger position on the surface, so it always has a value derived from the user’s fingers. But the Osmose uses MPE Y in a completely different way to other controllers, using it for the final aftertouch zone of a key, a zone for which no info initially exists when a finger first touches the surface. And this difference wasn’t taken account of in the initial firmware/Haken engine when it was included in the Osmose.
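One defensive workaround for the anomaly described above (a sketch of my own, not anyone’s actual fix) is to reset the per-note Y value to the spec’s expected default at note-on, so a touch that never sends CC74 cannot inherit a stale value from the previous note on that channel:

```python
# Reset per-channel Y state at note-on so a controller that omits the
# initial CC74 cannot leave Y stuck at the previous note's value.

DEFAULT_Y = 0   # the initial value the MPE spec expects at note start

class PerNoteY:
    def __init__(self):
        self.y = {}   # member channel -> current CC74 value

    def note_on(self, channel):
        self.y[channel] = DEFAULT_Y   # don't inherit stale Y

    def cc74(self, channel, value):
        self.y[channel] = value
```

Whether zero is the right default depends on how the synth maps Y, but the principle is the same: never let a new note start from whatever the last finger left behind.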

The other problem with MPE+ is that DAWs with specific MPE support, which make assumptions about sticking to the MPE spec, may not handle the additional per-channel CC87 messages that MPE+ controllers send. If the DAW is pretty agnostic and just handles MPE by letting all info through on all MIDI channels, then MPE+ will probably still work, but some DAWs probably adhere more rigidly to the formal MPE spec than that. And DAWs that allow MPE-specific editing of per-note expression data are mostly unaware of MPE+ and the extra effort that would be required to edit MPE+ data, where a single expressive value is actually made up of two different CC messages combined.

Anyway, my opinion is that MPE+ has no momentum beyond the Haken world; people just associate it with the synth engines built into those controllers. So I think it’s safe to ignore it completely, unless you want to take the opportunity to specifically market your synth to people with Hakens and Osmoses who think they will get great benefit from the extra resolution that MPE+ offers on paper. And if you do go down that route, be prepared for more customer-support queries, since users who are not nerds about MPE sometimes struggle with all the details of getting things working, even with normal MPE, let alone MPE+.


Since MPE+ is all about additional resolution (plus a little consistency in ordering), it will be irrelevant once MIDI 2.0 gets wide adoption. Since your plugin is already MIDI 2.0 internally, it should be easy to recognize MPE+ and adapt to it – the engines send all the information you need to know what mode they’re running in, and they can be queried.


SteveElbows, thank you very much for this explanation - I am blown away, wow, very detailed!

MPE Y modulation values may jump around or be stuck at unintended levels
Yes, I have noticed this occurrence, which is somewhat difficult to handle polyphonically. The voice allocator has been set up to use poly aftertouch for these instant CC values that do not rise from 0 but instead jump straight to some value. To allow a certain amount of transition time, it needs to be factored in with the note messages.

Doing this for every MIDI 2.0 per-note expression is a challenge. I have found that MIDI 2.0 provides a fair amount of tools to handle this. It is pretty new to me compared to MIDI 1.0, so I am still at a moderate understanding of it. I hope to bring as much compatibility as possible, but some special features of DAWs or controllers may not make it in until further experiments and testing have occurred.

I agree MPE has really good momentum, and more and more DAWs support it. MIDI 2.0 is a little ways off, but I think it will eventually absorb/surpass MPE and MIDI 1.0. I figure if I can convert from MPE to MIDI 2.0 then it should be good; it is not really beneficial to convert from MIDI 2.0 to MPE, though.

I would like to add support for the Seaboard, Haken and Osmose as much as possible. It appears these are the top-tier or flagship MPE controllers, capable of the most technical, fast and expressive playing techniques. Using C++ and VST3, I can do a fair amount of customizing; I just have to get it going to iterate to the next levels of controller support.

I decided to go ahead with making a full-featured plugin. Since yesterday I have upgraded the plugin significantly and added to the roadmap. It feels responsive and expressive, supporting 3 axes. By the weekend I should be able to add velocity, release velocity, pluck, and a physical string model.

Recording Today Reaper MPE

Since your plugin is already MIDI 2.0 internally, it should be easy to recognize MPE+ and adapt it

Yes, this is what I discovered; however, there is one more thing to watch out for.

A MIDI 2.0 note message requires 64 bits rather than the 24 bits of MIDI 1.0 because it carries much more information. So note and controller signals may flood the synth if they are too dense. This must be handled in a way that matches the synthesis engine, to get enough signals but not too many. There is a connection mechanism, but things like active sensing also generally eat into performance. It will be fun to experiment and explore various solutions for these types of functions.
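To illustrate the size difference, here is a Python sketch packing both forms. The UMP field layout follows my reading of the MIDI 2.0 spec (message type 4 channel voice, 16-bit velocity plus an attribute field); treat it as an illustration, not a reference encoder.

```python
# MIDI 1.0 note-on: three bytes (24 bits).
def midi1_note_on(channel, note, velocity):
    return bytes([0x90 | channel, note, velocity])

# MIDI 2.0 note-on: one 64-bit Universal MIDI Packet (two 32-bit words).
def midi2_note_on_ump(group, channel, note, velocity16,
                      attr_type=0, attr_data=0):
    """Pack a MIDI 2.0 note-on as two 32-bit UMP words."""
    word0 = (0x4 << 28) | (group << 24) | ((0x90 | channel) << 16) \
            | (note << 8) | attr_type
    word1 = (velocity16 << 16) | attr_data
    return word0, word1
```

The extra bits buy 16-bit velocity and a per-note attribute, but it also means each event moves more data, which is where the flooding concern above comes from.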

In some cases less is more, and smoothing things out is easier than reacting to very dense controller resolutions. Of course, if the machine it is running on is a beast with a nice audio interface, then cranking everything up - reaction time and controller resolution - would be a prime example of real-world MIDI 2.0 functionality at maximum levels.

My guess, from previous experience with HD MIDI signals, is that it is overkill, and that the same results and good performance can be achieved by finding a happy medium or balance.

I agree with this, particularly with the Z and Y axes, where travel (and therefore manual precision) is limited… most are not going to feel/hear the difference between a (nicely) slewed value vs a native value.
also, let’s bear in mind many controllers only have something like 10-bit resolution in hardware.

however, X is a special case for SOME controllers, and this was the issue Haken was resolving.
they had a 92-semitone surface, so a slide from one end to the other with 14-bit pitch bend is gonna get crunchy… and won’t work if you are trying to support microtonality. so they needed 21 bits for it to be even usable.
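The back-of-envelope arithmetic behind the “crunchy” claim can be sketched like this: cents per step at 14-bit vs 21-bit resolution over a ±48-semitone bend range.

```python
# Resolution of a pitch bend value spread over a +/- bend range.

def cents_per_step(bend_range_semitones, bits):
    total_cents = bend_range_semitones * 2 * 100   # full +/- span, in cents
    return total_cents / (2 ** bits)

coarse = cents_per_step(48, 14)   # 14-bit MIDI 1.0 pitch bend
fine = cents_per_step(48, 21)     # 21-bit (MPE+-style) resolution
```

At 14 bits that works out to roughly 0.59 cents per step; at 21 bits it drops to well under a hundredth of a cent, which is why the wide-range surfaces wanted the extra bits.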

of course, this is less of an issue with smaller surfaces, as you have a more limited slide range, or with something like the Osmose/Eigenharps where PB is like a modwheel (relative).
this is why the pitch bend range is not fixed in MPE, as it’s a trade-off between slide range and precision.

Interestingly, I’ve done quite a bit with OSC in this area, and it’s always felt much better.
(I used the t3d protocol supported/created by Madrona Labs for their Soundplane)
I think this was partly due to floats being 32-bit, but also because all three axes are sent as one (atomic) touch message, rather than serially as in MIDI. it could also be because the plugins are bypassing any DAW routing!?

never really dug into this to determine which is the most important factor… but without doubt, I’ve never come across a VST (using MIDI) which ‘felt’ as good; even the Madrona Labs plugins don’t ‘feel’ as good when using MPE.
(and this is not just the Soundplane… I implemented t3d/OSC on the Eigenharp, and it felt much better there too)

I’d say MIDI 2.0 is years away from any kind of mass-market support… it’s a marathon, not a sprint.
we are getting there… hopefully OS support (Windows and Mac) is not far off, then the DAWs need to do their magic… but it’s taken a (long) while to get this far, and I don’t see it speeding up any time soon.

but indeed, I think expressive controllers could be prime candidates to move early in that adoption cycle of MIDI 1.0/MPE → MIDI 2.0. not only because of MIDI 1.0/MPE’s limitations, but also because it’s already a bit niche, and the instruments/controllers are more recent… so potentially easier to upgrade/move to MIDI 2.0.


In theory, Mac and Android support for MIDI 2.0 was already released a while ago. Here is just one example of some Apple documentation: Publishing and Discovering MIDI Capabilities | Apple Developer Documentation

It’s been Windows support that’s been the major OS omission so far, but their plans have been moving forward this year, and they are now sharing some of it on GitHub and talking about it on their dev blog, e.g. Hello MIDI 2.0 - We're opening the repo! - Windows MIDI and Music dev

In terms of Apple’s support, what isn’t clear to me is exactly when their implementation gets updated to take account of the changes that were made to the official 2.0 spec this year.

Well, I was talking about a specific issue with the MPE+ output on the Osmose not sending an initial CC74 value, at least as things are implemented in current firmware, rather than the broader issue that I think you are referring to. Certainly the broader question of how best to make use of slewing of values is something synth makers need to experiment with to strike the right balance, but that’s not something I can advise on.

yeah, I know, I’ve been following this too… that’s why I said OS support was ‘not far off’.

macOS looks very close… as they already had the CoreMIDI framework. as for Apple being ‘up to date’, that’s a moving target - Apple will keep updating as the MIDI 2.0 spec (inevitably) evolves.
Windows is playing catch-up, which is good and bad… definitely Windows users will see the biggest change for the better, but it could be a bit of a bumpy road initially.

It’s all good news, including the news that the major DAW developers have been meeting/discussing with the MMA… but it’s all early days.

I’m not sure I’d bet on Ableton or even Logic Pro having full MIDI 2.0 support in 2024? even 2025?

and one thing that’s unclear to me is: for the majority of musicians, how important are the MIDI 2.0 features? and which ones? what will make them invest in MIDI 2.0 (upgraded software/hardware)?

(this is important, as dev plans are prioritised by what users want… or rather what they will buy/pay for :laughing: )

my guess would be that it’s not hi-res, per-note expression or microtonality that will drive MIDI 2.0; this is all a bit niche.

rather something much more mundane!
ease of use - MIDI-CI making hardware as easy to integrate as a software plugin.
and probably (the promised) better timing/latency - again, because for many, this is a PITA to set up properly. (and many electronic musicians seem obsessed by it)

but we will see, it will come, and I look forward to the day. :slight_smile:

Yeah, the MIDI-CI stuff also contains opportunities for companies to market the obvious advantages to people within their own product ecosystems. And I have little doubt that plenty of people don’t get the most out of their controllers because the traditional setup process is too tedious or confusing for them.

As for DAWs, they keep their cards close to their chests, so I have no predictions about who will be first, or when that will happen. It could indeed be a long wait, or maybe one day we will be pleasantly surprised. I think there was chatter in late 2021 that a MIDI 2.0 enable option had appeared in Logic, but quite how much of 2.0 it supports remains to be seen.

Likewise, I have no predictions at all as to the timing of the high-res, ‘successor to MPE’ stuff that would be of interest to us round here. Nor have I properly got my head round the different forms of per-note expressivity that some DAWs and plugin standards have been toying with beyond MPE, e.g. some stuff in CLAP.

Musicians with large setups will probably be pleased by the groups stuff in MIDI 2.0 that will let them go far beyond 16 channels, but as with all the other features, a whole ecosystem needs to build up over years for this stuff to become practical. And we’ll have to deal with transports other than DIN in order to use the MIDI 2.0 features that rely on the new message packet format. I wonder if a series of utility bridging devices will eventually become available - e.g. you could have a large network-based MIDI setup with very many channels, plus some utility boxes that connect to that network and offer a bridge to traditional DIN and USB. If any companies do move into that space, they could even take things further and offer a bridge between the new world of smart Capability Inquiry devices and older ones that aren’t ever able to support that directly, by offering bridging devices into which you can load profiles for older equipment.

I hear that the MIDI 2.0 standard for network transport is close to being published. After that eventually we might see ethernet ports on our devices :-).

The Moog One has a network port, but they didn’t succeed with its initial intended use (remote diagnostics, which they abandoned), and it’s not the most likely candidate for large, forward-looking firmware updates at this stage, sadly.

Some other classes of device occasionally offer wired or wireless networking support, currently for other purposes such as updates and content downloading, e.g. some of the modern MPCs, the Maschine+, and the Push 3 standalone.

In theory, devices that have a USB host port could - if they are still being updated with futuristic firmware when the time comes - let users plug a USB→RJ45 or Wi-Fi dongle either directly into that port or via a USB hub.

God, I hope so… RTP-MIDI is a mess currently; it works pretty well on macOS/iOS, but Windows and Linux are a lottery, making hardware solutions unreliable.

It’s a shame, as both Wi-Fi and Ethernet offer so much potential for MIDI, if we can get better compatibility.