huggermugger wrote: ↑Sun Jun 02, 2024 12:22 am
I'm intrigued, but unclear on what function(s) this module is meant to fulfill.
Hi, thanks for the feedback. Although morphing seems like a straightforward concept, it's turned out to be difficult to explain how my interpretation of it works.
Custom Control on its own doesn't do anything useful. It extends the functionality of the Custom Panel module in the Adroit Custom bundle.
Custom Panel modules on their own add the same kind of functionality to VM as Combinator does to Reason or MindMeld's PatchMaster does to VCV Rack. They let you build a simplified user interface that is mapped to control key elements of a complicated patch that is usually hidden off-screen.
Custom Control adds the concept of scenes. A scene represents the state of all controls in an Adroit Custom user interface. Although scenes sound very similar to variations in VM, they support morphing and have a well-ordered and deliberately limited structure. Scenes and variations are independent of each other and can be used together - there are effectively 16 scenes inside each variation.
Let's look at some practical applications of Custom Control and scenes.
If you select discrete mode (by clicking on the DISCRETE button) then the 16 numbered buttons on the left switch between scenes when clicked. If the MORPH TIME knob is set to minimum the change is instant and this is reflected by all Custom Panel controls jumping to whatever setting they had when that scene was last selected. You can also select a scene using a CV fed to the SCENE socket or by clicking on a Scene Selector element that you've added to a Custom Panel.
This behaviour is almost identical to switching variations in VM, although variations can only be switched on and off on a per-module basis, while with scenes each control has its own independent Scene Mode parameter. If set to Motorized, the control changes when the scene changes; if set to Shared, the control has the same setting in all scenes.
Now if we turn up the MORPH TIME knob (or adjust the morph time using the CV input socket beneath the knob) then when a scene change happens, instead of the controls instantly jumping to a new setting, they smoothly change over the time specified. The controls gradually accelerate to top speed and then decelerate as they approach their new settings. All controls arrive at their new values at the same time, so controls often change at different speeds depending on how far they need to travel. This is called a discrete morph. The slowest knob setting is 25 seconds, but this can be extended indefinitely by exponential CV control.
If the destination for a discrete morph is changed before the journey is complete then this is called an interrupted morph. This resets the timer and a new morph begins from the current settings to the new destination. Interrupted morphs can themselves be interrupted so it's possible to indefinitely explore the parametric space between scenes - the current settings being based on the history of interruptions while always moving in the direction of the current destination.
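To make the timing behaviour concrete, here's a minimal Python sketch of how a discrete (and interruptible) morph could work. This isn't the module's actual code - the class and method names are mine, and smoothstep is just one plausible accelerate/decelerate curve - but it illustrates the mechanism:

```python
class Morph:
    """Sketch of a discrete, interruptible morph over a set of controls."""

    def __init__(self, values):
        self.start = list(values)
        self.target = list(values)
        self.elapsed = 0.0
        self.morph_time = 1.0   # seconds; up to 25 s from the knob, more via CV

    def current(self):
        t = min(self.elapsed / self.morph_time, 1.0)
        s = t * t * (3.0 - 2.0 * t)   # smoothstep: accelerate, then decelerate
        return [a + (b - a) * s for a, b in zip(self.start, self.target)]

    def retarget(self, new_target):
        # An interrupted morph: restart the timer from the in-flight values.
        self.start = self.current()
        self.target = list(new_target)
        self.elapsed = 0.0

    def tick(self, dt):
        self.elapsed += dt
```

Because every control shares the same normalized clock, controls with further to travel simply move faster, so they all arrive together - and retargeting mid-flight restarts the journey from wherever the controls currently are.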
If you select continuous mode (by clicking on the CONTINUOUS button) then the scenes are arranged in a circular pattern (as represented by the circle of LEDs). There is now a morphing position - a value that runs from Scene 1 all the way around the circle to just before Scene 1 again.
The simplest application here is to select Scene 1 as the base scene by clicking on the 1 button and setting the RANGE slider to its minimum setting of 1. The OFFSET slider and/or CV modulation (via the two attenuverters and sockets below) then morphs all the controls between their settings in Scene 1 and their settings in Scene 2. This is how most synths that support morphing work: they have an A patch and a B patch and one can blend between them using CV. This is what I meant by simple A to B morphing.
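In code terms, A to B morphing is just a per-control crossfade. A hypothetical sketch (the function name and the dict representation of a scene are purely illustrative):

```python
def ab_morph(scene_a, scene_b, position):
    """Blend each control between its A (Scene 1) and B (Scene 2) value.
    position is 0.0 at A and 1.0 at B, e.g. driven by an offset slider or CV."""
    position = max(0.0, min(1.0, position))
    return {name: a + (scene_b[name] - a) * position
            for name, a in scene_a.items()}
```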
One obvious way to use A to B morphing is to have the A (Scene 1) patch set for the attack part of a sound and the B (Scene 2) patch set for the final decayed sound, and use a ramping CV to sweep from A to B over the duration of a note. The amplitude envelope would normally not be morphed though (i.e. we'd set the ADSR controls to Shared Scene Mode) so that we could control the overall change in amplitude independently of other parameters.
But we are not restricted to simple A to B morphing. We can increase the RANGE slider so that as a note plays we morph through 3, 4, 5 or more different scenes. This is what I call micro-morphing, and it's then perhaps best to think of scenes as stages in multiple parallel multi-segment envelope generators.
In continuous mode the control values of each scene are interpolated using a cubic spline function with circularity - in other words scene 1 is treated as being "next" after scene 16. This means one can continuously move around the circle (in either direction) without any glitches.
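Here's a simplified sketch of circular cubic interpolation over one control's 16 scene values - not the actual implementation (I'm using a Catmull-Rom form purely for illustration), but it shows how wrapping the indices makes scene 1 follow scene 16 without a glitch:

```python
def circular_spline(values, position):
    """Cubic (Catmull-Rom) interpolation around a circle of scene values.
    position runs from 0.0 up to len(values), at which point it wraps
    back to the first scene, so movement in either direction is seamless."""
    n = len(values)
    i = int(position) % n
    t = position - int(position)
    # The four neighbouring scenes, with circular (wrapped) indexing.
    p0, p1, p2, p3 = (values[(i + k) % n] for k in (-1, 0, 1, 2))
    return 0.5 * ((2 * p1) + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t * t
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t * t * t)
```

At integer positions the function lands exactly on the stored scene value, and between scenes the curve passes smoothly through its neighbours.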
This circular continuity becomes important when we look at sequenced mode (when the SEQUENCED button is clicked). This is just a variant of continuous mode in which the scene position is proportional to elapsed time.
So in sequenced mode by default we run round and round the scenes in a circle and this is generally done while synced to tempo.
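In pseudocode terms, sequenced mode just derives the morphing position from elapsed, tempo-synced time (a simplified sketch, with names of my own invention):

```python
def scene_position(elapsed_beats, beats_per_cycle, num_scenes=16):
    """Sequenced mode sketch: the scene position is proportional to
    elapsed (tempo-synced) time, wrapping so that playback runs round
    and round the circle of scenes."""
    phase = (elapsed_beats / beats_per_cycle) % 1.0
    return phase * num_scenes
```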
We might consider that each control in the interface has a 16-step sequencer or multi-segment envelope generator behind it. Each control has an interpolation parameter that can be linear, cubic or stepped. With stepped interpolation, controls jump from scene to scene, so we get step sequencer behaviour; in linear and cubic modes we have something closer to multi-segment envelope generator behaviour.
To support step-sequencing behaviour without the use of external sequencers there are special Scene N scene modes. So as well as the Motorized and Shared scene mode options, individual controls can be set to Scene 1, Scene 2 and so on, meaning that a particular control is only active in a particular scene. So we can have 16 knobs all mapped to the same thing, with each knob taking its turn to control it as the scenes change.
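A toy sketch of how the three kinds of scene mode could resolve to a value (the data representation here is invented purely for illustration, not how the module stores things):

```python
def effective_value(control, active_scene):
    """Resolve a control's value given its scene mode.
    Shared: one value everywhere; Motorized: a per-scene value;
    Scene N (mode is an integer): only active in that one scene."""
    if control["mode"] == "shared":
        return control["shared_value"]
    if control["mode"] == "motorized":
        return control["scene_values"][active_scene]
    n = control["mode"]                      # Scene N mode
    if active_scene == n:
        return control["scene_values"][n]
    return None                              # inactive in this scene
```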
Another application is to run sequencing in discrete mode at two different but synchronized rates. One high-level, very slow sequencer controls which scene (in effect, which song part) is selected (by connecting its output to the SCENE socket), and multiple lower-level, faster sequencers control things such as pitch and gates, with their exact settings determined by which scene is active. So a single low-level sequencer can be repeatedly "reprogrammed" depending on which scene (or song part) is active. This reprogramming can be exposed by having knobs or sliders in the interface that "explode" the settings (using Scene N scene mode), so that a single 16-step sequencer could be made to look like a 64-step sequencer, for instance. The sequencers in this scenario would use stepped interpolation, but other controls could use regular cubic interpolation, so that while the sequencers are reprogrammed instantly on a scene change, other things undergo a transition whose duration is set by the morph time.
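The 64-step illusion is easy to illustrate with a toy sketch (again, names and data layout are mine, not the module's):

```python
def step_output(patterns, active_scene, step):
    """A fast low-level sequencer whose pattern is selected
    ('reprogrammed') by whichever scene the slow sequencer has chosen."""
    pattern = patterns[active_scene]
    return pattern[step % len(pattern)]

# Four scenes, each holding a different 16-step pattern. Advancing the
# scene once per 16 fast steps makes the single 16-step sequencer
# behave like a 64-step one.
patterns = [list(range(16 * s, 16 * s + 16)) for s in range(4)]
seq = [step_output(patterns, step // 16, step) for step in range(64)]
assert seq == list(range(64))
```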
By making the mapped sequencers V/Bar driven we can use LSSP time-splitting techniques to make such setups more elegant. By using S-Poly chord and scale signals we can reuse sequence data in different harmonic contexts. And if such sequencers are Groove signal-aware we can apply micro-timing offsets and accenting globally if desired.
Another more experimental application is sequenced mode running at audio-rate frequencies.
One important element I've not touched on yet is live editing. This means that as Custom Panel controls are morphing you can "grab" any control by clicking on it with the mouse and override the automation. Not only is the automation overridden, but the changes are reflected in the control's underlying scene data. This means that there can be constant motion capture and replay happening.
Although there are only 16 steps of resolution, cubic spline interpolation makes the apparent resolution much higher. Reverse interpolation is used so that each control's setting in each scene is affected in proportion: if the scene position is exactly at Scene 1, say, then grabbing and changing a control will only affect the Scene 1 setting for that control, but if the scene position is between Scene 1 and Scene 2 then both the Scene 1 and Scene 2 values are adjusted in proportion.
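To illustrate the reverse interpolation idea, here's a simplified sketch that writes a grabbed edit back to the two nearest scenes. I'm using linear weighting to keep it short (the module interpolates with a cubic spline, so the real arithmetic differs), and the function name is mine:

```python
def grab_write_back(scene_values, position, new_value):
    """Distribute a grabbed control's edit across the two neighbouring
    scenes in proportion to how close the morph position is to each."""
    n = len(scene_values)
    i = int(position) % n
    t = position - int(position)
    if t == 0.0:
        scene_values[i] = new_value          # exactly on a scene: edit it alone
        return
    j = (i + 1) % n
    # How far did the grab move the control from its interpolated value?
    current = scene_values[i] * (1 - t) + scene_values[j] * t
    delta = new_value - current
    # Least-squares split: the nearer scene absorbs the larger share, and
    # re-interpolating at `position` reproduces new_value exactly.
    w = (1 - t) ** 2 + t ** 2
    scene_values[i] += delta * (1 - t) / w
    scene_values[j] += delta * t / w
```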
Making it possible to grab controls and do naturalistic live automation override (like touching motorized faders on a high-end mixing desk) was the most difficult aspect of this project as VM's regular interface couldn't cope with the threading issues this caused. So I had to replace much of VM to make it work reliably and this in itself triggered a whole range of secondary problems.
Gosh, I've been waffling again! I hope the above makes some sense. It's been useful explaining things in a slightly different way than usual and it will probably help me make some adjustments to the official documentation.