I've used it in combination with visgrep from the xautomation package[2] to locate and click on non-MIDI-learnable GUI buttons on a software synth, but it can be anything.
I've also used mididings to control mpv[3], letting me pause, play, rewind, and fast-forward videos with my feet using my MIDI foot pedal controller[4], keeping my hands free to type up a transcript of what's being said in the video.
[1] - http://das.nasophon.de/mididings/
[2] - https://hoopajoo.net/projects/xautomation.html
[3] - https://mpv.io/
[4] - https://www.behringer.com/Categories/Behringer/Accessories/M...
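A minimal sketch of the mpv half of that setup: mpv can be started with `--input-ipc-server=/tmp/mpvsocket` and then driven over that socket with JSON commands (`cycle pause`, `seek`, etc. are real mpv commands, but the socket path and the note numbers below are placeholders; which notes a pedal sends depends on the controller).

```python
import json
import socket

# Hypothetical mapping from MIDI note numbers (sent by the pedal's
# footswitches) to mpv JSON-IPC commands.
NOTE_TO_COMMAND = {
    60: ["cycle", "pause"],   # toggle play/pause
    61: ["seek", -5],         # rewind 5 seconds
    62: ["seek", 5],          # fast-forward 5 seconds
}

MPV_SOCKET = "/tmp/mpvsocket"  # start mpv with --input-ipc-server=/tmp/mpvsocket

def note_to_ipc(note):
    """Return the JSON-IPC message for a note, or None if unmapped."""
    cmd = NOTE_TO_COMMAND.get(note)
    if cmd is None:
        return None
    return json.dumps({"command": cmd}) + "\n"

def send_to_mpv(note, path=MPV_SOCKET):
    """Forward a pedal press to mpv over its IPC socket."""
    msg = note_to_ipc(note)
    if msg is None:
        return
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as s:
        s.connect(path)
        s.sendall(msg.encode())
```

In a mididings patch, a function like `send_to_mpv` could be hooked up with something along the lines of `Filter(NOTEON) >> Call(lambda ev: send_to_mpv(ev.note))`, though the exact patch depends on your setup.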
I wish this were a feature throughout the OS. Being able to bind a foot pedal to pasting from the clipboard, or a slider to a variable in your IDE, or a rotary encoder to a control in a video editing app, would be amazing and would open up entirely new avenues for computing and for letting people customize their setups. I'm particularly thinking of the accessibility implications.
My current strategy for binding automation is using my Razer Naga and mapping all its buttons to various actions depending on which app is in focus.
I'm using MIDI as an input to the control software for an art installation at the moment, and once WebGPU is out and compute shaders are available on the web, the whole thing could be deployed to the browser, which would be cool. At the moment it's built to a native executable, which is fast, but not as easy to share.
(Yup, it lets you draw on an Etch A Sketch with MIDI knobs)