Author Topic: Control Surfaces  (Read 17859 times)

Offline azslow3

  • Administrator
  • Hero Member
  • *****
  • Posts: 1679
Control Surfaces
« on: May 29, 2014, 12:26:03 PM »
This topic does not pretend to be complete. You can easily find descriptions, comparisons, etc. on the Internet.
But several basic definitions are essential for understanding how AZ Controller works.

A Hardware Control is anything that can be physically pushed, moved, turned, etc., as long as such a change can send a signal.
A Hardware Controller is a device with one or more Hardware Controls on it.

By default, not every Hardware Controller can be used as a SONAR Control Surface. For example, your mouse can be moved, has buttons and sends signals once something happens, but it cannot be used as a Control Surface in SONAR (at least not directly). To be usable that way, the Hardware Controller should deliver its signals as MIDI Events. A physical MIDI interface is not required as long as some driver claims to be a MIDI driver. For example, your Tablet PC can be used as a Control Surface once there is a driver which converts your operations into MIDI Events.

SONAR scans the system for all MIDI devices (in fact just for drivers claiming they have some MIDI device behind them). All such devices are then listed in the "MIDI/Devices" section of the SONAR "Preferences" dialog. To be usable as a Control Surface, your device should be listed in the "Inputs" list. Connecting your particular controller to SONAR is outside the scope of this topic.

A Control Surface plug-in can implement its own internal methods to communicate with the Hardware Controller. And many do, supporting network-based (OSC, EUCON) or local (joystick, proprietary) communications. How that works is specific to the concrete device/software, so the following text is about generic MIDI-based devices.

MIDI Events can be categorized as "simple" (short) and "extended" (long, SysEx). Simple events are well defined, and the definition is common for all devices and software which claim to be MIDI compatible. That means an "as loud as possible middle C" note is represented by the same numbers in a MIDI file, on a SONAR MIDI track, and when sent by ANY MIDI keyboard.
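To make that concrete, here is a minimal sketch (plain Python, no MIDI library) of the three bytes every MIDI-compatible device produces for that note. "Middle C" is note number 60 and maximum velocity is 127 in the MIDI specification:

```python
# A "Note On" message is three bytes: status, note number, velocity.
# These bytes are identical whether they come from a MIDI file, a
# SONAR track, or any MIDI keyboard -- the "simple event" part of MIDI.

NOTE_ON = 0x90  # Note On status, MIDI channel 1


def note_on(note: int, velocity: int, channel: int = 0) -> bytes:
    """Build a raw Note On message (channel is 0-based, 0..15)."""
    if not (0 <= note <= 127 and 0 <= velocity <= 127):
        raise ValueError("note and velocity are 7-bit values (0..127)")
    return bytes([NOTE_ON | channel, note, velocity])


msg = note_on(60, 127)  # "as loud as possible middle C"
print(msg.hex())        # -> 903c7f
```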

But what that "middle C" is called (C3, C4, C5...) and how loud "as loud as possible" actually is are not well defined. A completely different story is what your instrument or DAW will do with that Event. It can start to play some complicated pattern starting from A, or just switch off the EQ on some track. For the second kind of action, some special "interpretation" module is required. And SONAR Control Surface plug-ins are such modules.

Other MIDI Events (SysEx) have no standard interpretation. Each company is free to define its own set of such events and their meaning. Probably the best-known such set is defined by Mackie (the HUI MIDI mapping protocol). Since special protocols need special interpretation, they need special plug-ins (Mackie Control, VS-xx, etc.).
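The split between the two categories can be seen directly in the bytes: a sketch of how a receiver could tell a "simple" event (fixed meaning) from a SysEx frame (vendor-defined meaning). The manufacturer ID bytes in the example message are illustrative only:

```python
# Classify an incoming MIDI message. SysEx is framed by 0xF0 ... 0xF7;
# everything in between is manufacturer-defined, which is why each
# proprietary protocol needs its own interpreting plug-in.

def classify(msg: bytes) -> str:
    status = msg[0]
    if status == 0xF0 and msg[-1] == 0xF7:
        return "sysex"   # meaning is vendor-specific
    if 0x80 <= status <= 0xEF:
        return "simple"  # meaning fixed by the MIDI specification
    return "other"       # system common / realtime, etc.


print(classify(bytes([0x90, 60, 127])))                 # simple
print(classify(bytes([0xF0, 0x00, 0x00, 0x66, 0xF7])))  # sysex
```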

The communication with Control Surfaces can be unidirectional or bidirectional. If you want motorized faders to move and/or LEDs to show your DAW values, the plug-in should support sending this information back to the Hardware Controller, in a way it understands.

Another way to describe the communication between your controller and the DAW: each controller speaks (more advanced controllers also understand) some language (or several languages). Most controllers use MIDI "characters" for words (like the Latin alphabet in Western Europe). In case a controller uses a language with other "characters", either they should be "translated" (many Novation devices work this way) or the "brain" should understand it directly (via special plug-ins which bypass MIDI).

As with Latin-based languages, the languages can still differ. There can be some common "words" (the General MIDI specification), but also some special ones (HUI, MCP). Many manufacturers specify the whole "vocabulary" in the documentation, but some prefer to "hide" it. In the latter case, they just mention a "proprietary" (Mackie) or "dedicated" (Nektar) language without going into details. Check that your software can understand it; unless that is mentioned explicitly, it normally cannot.

Another common problem is the claim "This device supports protocol XYZ". That is like the statement "I speak English": by itself it does not mean you can communicate well with the person... As with natural languages, the device probably knows only some "words" from the specified protocol, so you may miss some important functionality available on the "original" (native) device.

While Hardware Controls can produce MIDI Events, not all of them do. There can be controls which, while influencing other operations, do not send that information to the DAW. For example, Bank, Program and Mode switches change their state "silently". The DAW is not informed, so such controls cannot be used as Control Surface controls. But these controls can change what other controls send. That way the same physical Hardware Control can send different MIDI Events depending on the state of some other "silent" control. A plug-in can see MIDI Events only, so in such a case one Physical Control is seen as many Logical Controls. For many operations, the fact that the control is still the same is important for correct processing.
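A sketch of that situation, with hypothetical CC numbers: a "silent" bank switch makes the same physical knob send CC 10 in bank A and CC 20 in bank B, so the plug-in has to map both logical controls back to one physical identity:

```python
# Hypothetical mapping: the plug-in only sees MIDI, so the same knob
# appears as two Logical Controls (CC 10 and CC 20). To process it
# correctly, both must resolve to the same Physical Control.

LOGICAL_TO_PHYSICAL = {
    10: "knob1",  # CC sent while the silent bank switch is in bank A
    20: "knob1",  # CC sent in bank B -- same physical knob!
    11: "knob2",
    21: "knob2",
}


def physical_control(cc_number: int) -> str:
    return LOGICAL_TO_PHYSICAL.get(cc_number, "unknown")


print(physical_control(10), physical_control(20))  # knob1 knob1
```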

Hardware Controls can react differently to the same physical manipulations. For example, a touch-sensitive fader/rotor sends a special MIDI Event when you touch it, then other MIDI Events when (and if) you move it, and yet another one after you take your hand off. Another example is touch-sensitive pads. They can send just one MIDI Event when you press them (with or without information about how hard), or they can also send a second MIDI Event when you release them. Depending on what a particular control supports (in a particular mode), you can plan what it can do in your DAW. For example, you cannot use a simple button which sends no "I am released" Event as a kind of Shift button, while you can use it as a toggle (CapsLock-like) control.

Important controller parameters. While producers proudly advertise the number of controls a surface has, they "forget" to say that the knobs are not endless and the sliders are not motorized, unless they really are. In addition, there are two quite important parameters most producers keep silent about:
  • Resolution. In an analog mixer the resolution is practically endless: you can set any value in the range of any control. For control surfaces that is not true; there is always a fixed number of possible positions for knobs and sliders. "Simple" surfaces support the standard MIDI CC/Note resolution: 128 fixed positions (7 bits!) for any slider or rotor. Advanced surfaces (Mackie, for example) have 1024 positions at least for some controls, but that is still only 10 bits. No wonder they do not like to focus on that in our "modern 64-bit world"... For most cases (notes, synth parameters, etc.) 128 positions are absolutely fine (there are exceptions, for example the Yamaha XP for Disklavier Pro with 1023 velocity levels), but some parameters can suffer from such a low resolution (volume level, frequency selection in an EQ, etc.). Do not be confused by the length of faders; the length does not increase the resolution on its own.
  • Position update speed. When you move a control, say from position 10 to position 20, you may expect the software to receive all intermediate changes: 10, 11, 12, 13 and so on. But the hardware/software reasonably "thinks" that skipping some intermediate positions is acceptable (the only alternative is to create a backlog of such events and so introduce an unpredictable delay). Which events finally reach the DAW depends on many factors: what your surface is able to send, how fast the drivers can process it, and what the DAW and the Control Surface plug-in are able to accept. For just tuning a parameter that is not a problem at all; the only thing interesting for you is the final value. But for writing automation it can be an issue; just do not be surprised to see big jumps there. Note that visual delay is unrelated to this and has no influence on the material (I have a huge delay in all ProChannel visual controls when I use my surface instead of the mouse).
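The resolution point is easy to quantify. A small sketch, assuming a fader range of -60 dB to +6 dB (an illustrative range, not taken from any particular DAW), mapped linearly onto the 128 values of a 7-bit CC:

```python
# Why 7-bit resolution matters for volume: 128 CC steps spread over a
# 66 dB range give roughly half a dB per step -- and a longer physical
# fader does not change that at all.

LOW_DB, HIGH_DB = -60.0, 6.0  # assumed fader range, for illustration


def cc_to_db(cc: int) -> float:
    """Map a 7-bit CC value (0..127) linearly onto the dB range."""
    return LOW_DB + (HIGH_DB - LOW_DB) * cc / 127


step = cc_to_db(1) - cc_to_db(0)
print(round(step, 3))  # -> 0.52 (dB per CC step)
```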

It is nice to have "pro" style controls, but be aware of the consequences:
  • Motorized sliders. In case you have automation (some value changing along the timeline) and/or you switch the controlled parameter, it can be nice to have a control moved mechanically (by the DAW). But such sliders are expensive, heavy and relatively noisy. As with everything "motorized", they also have more parts that can break. In case such a slider is not touch-sensitive, it can also "fight" with your finger for the position.
  • Endless encoders. The behavior is as if the encoder always had the position corresponding to the controlled parameter value. This position is normally indicated with an LED ring around the encoder. Unlike motorized controls, the "position" can be changed instantly and without physical movements. Apart from the price, such encoders are superior to fixed knobs. But these controls are not standardized, so the software/hardware should be able to understand the particular implementation.
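Since endless encoders are not standardized, the receiving software has to know which relative encoding a given device uses. A sketch of two conventions that are commonly found in practice (assumed here for illustration, not tied to any specific device): two's complement of the 7-bit value, and "sign bit" format where bit 6 marks a negative tick:

```python
# Endless encoders usually send relative "ticks", not absolute
# positions, and the encoding varies per device. Two common variants:

def decode_twos_complement(value: int) -> int:
    """7-bit two's complement: 0x01 = +1, 0x7F = -1."""
    return value - 128 if value >= 64 else value


def decode_sign_bit(value: int) -> int:
    """Sign-bit format: bit 6 set means negative, low 6 bits magnitude."""
    return -(value & 0x3F) if value & 0x40 else value


print(decode_twos_complement(0x7F))  # -1
print(decode_sign_bit(0x41))         # -1
print(decode_sign_bit(0x01))         # 1
```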

You can find what AZ Controller supports on the About page.
« Last Edit: October 24, 2017, 11:22:26 AM by azslow3 »
