Yes, the primary target for this board is "Accessibility".
But the methods can be shifted from one world to the other. I am thinking about adding "clicking" to my AZ Controller. With many synths I have mapped MIDI controls to the normal keyboard arrows and Enter, but I still have to click on the required region first to choose a preset. Unlike you, I do not use VSTs with mouse-only parameters, but that could be the "next level" in the same direction.
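To illustrate the idea (not AZ Controller's actual API — in reality the mapping is configured in its GUI, and the names below are purely hypothetical), such a mapping could be sketched as a small dispatch table from MIDI CC numbers to keyboard or mouse actions:

```python
# Hypothetical sketch of mapping MIDI CC messages to keystrokes and
# mouse clicks. Action types and the MAPPING table are illustrative;
# a real implementation would forward them to an OS input API
# (e.g. SendInput on Windows).
from dataclasses import dataclass
from typing import Dict, Optional, Union


@dataclass(frozen=True)
class KeyPress:
    key: str          # e.g. "Up", "Down", "Enter"


@dataclass(frozen=True)
class MouseClick:
    x: int            # screen coordinates of the region to click
    y: int


Action = Union[KeyPress, MouseClick]

# MIDI CC number -> action; the click would focus the preset region
# before the arrow keys can be used to step through presets.
MAPPING: Dict[int, Action] = {
    20: KeyPress("Up"),
    21: KeyPress("Down"),
    22: KeyPress("Enter"),
    23: MouseClick(400, 300),   # hypothetical preset-list position
}


def handle_cc(cc: int, value: int) -> Optional[Action]:
    """Return the action for a CC message, triggering on 'press' only."""
    if value == 0:            # ignore button release (value 0)
        return None
    return MAPPING.get(cc)    # None for unmapped controls
```

The point is only that "keyboard mapping" and "clicking" fit the same dispatch structure, so adding clicks is a natural extension rather than a different mechanism.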
Coming back from the accessibility world, there are already useful features:
* In AZ Controller that is audition. For example, I (and some others) have found it useful to hear markers announced as they are passed during playback; it is much less distracting than looking at the screen while playing and wondering "hmm... which verse is that going to be?".
* MarKo has written a utility that converts an AZ Controller preset into a text file by parsing the GUI where the preset is loaded. I must admit I do not have the skills to do this in a reasonable time.
That is why I think exchanging technologies between the accessibility and visual worlds can benefit both sides.