cyanobacteria 862 Posted May 26, 2016

Just a random brainstorming shitpost thread for what seems likely given interview mentions and stuff. They can apparently "create an album in realtime". This implies some really slick interface where they can modify parameters, and maybe modify the way in which parameters modify other parameters, because there's obviously way too much going on for everything to be handled by them manually at once.

But how much is sequenced ahead of time by hand, how much is sequenced in realtime by hitting physical pads, and how much is triggered by other events? Is anything triggered by really abstract events?

One speculation of mine is that there's an underlying range of parameters that are allowed at particular times and after certain events, maybe even certain epochs within a track that can be summoned at will, in which the acceptable parameter ranges and even the trigger mappings are dramatically changed without user interaction, as opposed to subtly changed like during live tweaks (say, an increase in the intensity of some filter). Or maybe I'm talking complete bs (probably). Anyone else have random speculations?
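To make the "epoch" idea concrete, here is a minimal Python sketch, with all the names and values invented for illustration (nothing claimed about their actual patches): summoning an epoch swaps the allowed parameter ranges and the trigger mappings in one action, while live tweaks stay clamped inside whatever the current epoch allows.

```python
# Minimal sketch of the "epoch" speculation above. EPOCHS, the parameter
# names, and the trigger targets are all made up for illustration.
import random

EPOCHS = {
    "sparse": {
        "ranges": {"cutoff": (200, 800), "density": (0.1, 0.3)},
        "triggers": {"kick": "sub_thud", "accent": "silence"},
    },
    "frantic": {
        "ranges": {"cutoff": (800, 12000), "density": (0.6, 1.0)},
        "triggers": {"kick": "broken_kick", "accent": "metal_hit"},
    },
}

class EpochState:
    def __init__(self, name):
        self.recall(name)

    def recall(self, name):
        """Summon an epoch: ranges and trigger map change together."""
        self.name = name
        self.ranges = EPOCHS[name]["ranges"]
        self.triggers = EPOCHS[name]["triggers"]
        # re-randomise parameters inside the new allowed ranges
        self.params = {k: random.uniform(lo, hi) for k, (lo, hi) in self.ranges.items()}

    def tweak(self, param, value):
        """A live tweak is clamped to the current epoch's allowed range."""
        lo, hi = self.ranges[param]
        self.params[param] = min(max(value, lo), hi)

    def fire(self, event):
        """An incoming event is routed through the current trigger map."""
        return self.triggers.get(event, "ignore")

state = EpochState("sparse")
state.tweak("cutoff", 5000)   # clamped to 800 while the sparse epoch is active
state.recall("frantic")       # one action, ranges and mappings both remapped
print(state.fire("kick"))     # -> "broken_kick"
```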
rst 0 Posted May 26, 2016

One speculation of mine is that there's an underlying range of parameters that are allowed at particular times and after certain events, maybe even certain epochs within a track that can be summoned at will, in which the acceptable parameter ranges and even the trigger mappings are dramatically changed without user interaction, as opposed to subtly changed like during live tweaks (say, an increase in the intensity of some filter).

that wouldn't be difficult in Max if you know how to use the pattr family and all the data storage objects. of course the entire patches would have to be of wondrous intricacy. so quite plausible.
cyanobacteria 862 Posted May 26, 2016

lel

Also in an interview they said they use "conditionals" a lot. So this could maybe be:

- Conditionals triggering events: really abstract mappings of past events to future event triggers (if this and this happened and that hasn't happened, do this; else if this and that happened, do this; else do that)
- Conditionals creating timbres (doubtful, but an idea I've had): mappings between events and timbre parameters

Max/MSP should definitely allow things like this, especially if they've built up a big infrastructure of patches relating to each other, maybe with one really generalized patch that lets them plug in an arbitrary number of synth patches and modulation patches, plus sequencing/event-trigger-map patches to create the atmosphere and rhythmic uniqueness of a specific track. Probably with a really flexible and slick interface like I mentioned before to control all of this, since in a live environment there can be no fucking fiddly typing of numbers into text boxes, it has to be slider bars and shit. Idfk, who knows.
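The first kind of conditional, past events gating future triggers, can be sketched in a few lines of Python; everything here (the event names, the rules, the steps at which events arrive) is made up purely to illustrate the shape of it.

```python
# Toy sketch of "conditionals triggering events": past events accumulate in a
# history set, and each step the rules decide what fires next. All names are
# invented; in a real patch the events would arrive from other parts of it.
happened = set()

def next_trigger(step):
    if "intro_hit" in happened and "drop" not in happened:
        return "build_noise"
    elif "drop" in happened and step % 16 == 0:
        return "bass_swell"
    elif "drop" in happened and "outro" not in happened:
        return "stutter_fill"
    else:
        return "hat_tick"

for step in range(64):
    event = next_trigger(step)
    happened.add(event)
    # pretend these arrive from elsewhere in the patch at fixed points
    if step == 4:
        happened.add("intro_hit")
    if step == 20:
        happened.add("drop")
```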
TheTransitionary 0 Posted May 26, 2016

I really wish I knew. I'd love to be able to get my hands on one of their recent patches and just play around with it. Anyone know if the sounds are being generated in software, or are they controlling Nords/Machinedrums etc. with MIDI output from their laptops?
psn 521 Posted May 26, 2016

Apparently it's all sequenced/generated and synthesized in Max/Gen. They said in a recent interview that the data that generates patterns also generates the sounds, or something to that effect. I understood it to mean that events can trigger stuff on both a macro and a micro level.
cyanobacteria 862 Posted May 26, 2016

Apparently it's all sequenced/generated and synthesized in Max/Gen. They said in a recent interview that the data that generates patterns also generates the sounds, or something to that effect. I understood it to mean that events can trigger stuff on both a macro and a micro level.

curvcaten seems like a great example of this. The whole thing is kind of recursive with respect to both rhythm and timbre: a pattern reappears in the synth pad of a little motif later on, over and over, in every way. It's almost symmetric. Not sure if anyone else sees what I mean but yeah.
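One way to picture the "same data generates patterns and sounds" idea, and the rhythm/timbre recursion described above, is a toy like the following, entirely invented, where a single seed list is read slowly as a step pattern and quickly as the harmonic content of each note.

```python
# Toy illustration: one seed list drives both the macro level (gates and
# pitches of a step pattern) and the micro level (partial amplitudes of each
# note's waveform). The seed and mappings are invented for illustration only.
import numpy as np

seed = [3, 7, 2, 5, 1, 6, 4, 8]
sr = 44100

# macro level: gate when the value is odd, pitch derived from the value
gates   = [v % 2 for v in seed]
pitches = [110 * v for v in seed]

# micro level: the same seed as harmonic amplitudes of each note's waveform
def render_note(freq, dur=0.25):
    t = np.arange(int(sr * dur)) / sr
    partials = [(amp / sum(seed)) * np.sin(2 * np.pi * freq * (k + 1) * t)
                for k, amp in enumerate(seed)]
    return sum(partials)

audio = np.concatenate([render_note(p) if g else np.zeros(int(sr * 0.25))
                        for g, p in zip(gates, pitches)])
```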
cyanobacteria 862 Posted May 27, 2016

So they use Max/Gen, psn? From what I've gathered online, Gen is a way to compile plugins that then work outside of Max. So where would they be using these self-contained plugins? Other sequencers or something? Or just for the purposes of cleanliness, organization, isolation etc.? Probably a very dumb question but I'm only just starting to look into Max.
mgore10 0 Posted May 27, 2016

You can build sequencers right in Max to control time/events firing off samplers, synths, and effects that you can also make right in Max. Max also hosts VST and Audio Unit plugins, so they can be treated as modular building blocks just like other 'objects' in Max. Gen~ is a way to do optimized digital signal processing in Max that works at the single-sample level without needing to be compiled in C.

I guess the best way to think about Max is as a huge modular machine where almost anything can be patched into anything. The beauty is its ability to finely control sounds and trigger them however you want: randomly, through probabilities, external data, conditional statements (e.g. if X happens then do Z), etc. Whatever you can think up, you can do (e.g. "I want this rhythmic and pitch structure to interpolate/modulate seamlessly to this new structure over the next 30 seconds"). Hope this helps :)
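That last example, morphing one rhythmic and pitch structure into another over a fixed window, is easy to mock up. A hedged Python sketch with invented patterns, where the chance of drawing each step from the target structure rises linearly across the transition:

```python
# Rough illustration of morphing one rhythmic/pitch structure into another:
# the probability of drawing each step from the target pattern rises linearly,
# so the old structure dissolves into the new one rather than switching
# abruptly. The patterns and timings are invented for illustration.
import random

source = {"pitches": [36, 36, 48, 43], "gates": [1, 0, 1, 1]}
target = {"pitches": [41, 53, 41, 38], "gates": [1, 1, 0, 1]}

def morph_step(step, total_steps):
    """Return (pitch, gate) for this step of the transition."""
    t = step / max(total_steps - 1, 1)            # 0.0 .. 1.0 across the window
    pattern = target if random.random() < t else source
    i = step % len(pattern["pitches"])
    return pattern["pitches"][i], pattern["gates"][i]

# e.g. a 30-second transition at 4 steps per second = 120 steps
sequence = [morph_step(s, 120) for s in range(120)]
```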
ignatius 6097 Posted May 27, 2016

max also now has snapshots for the entire patch, so you can save the state of entire patches and recall it instantly. multiple snapshots... like presets for every parameter saved together. bing bang boom.
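Conceptually a snapshot is just "store every parameter value under a name, recall them all at once"; a trivial sketch of that idea (not how Max implements it internally, and the parameter names are invented):

```python
# Toy snapshot store: save the whole parameter state under a name, recall it
# in one go. Parameter names and values are invented for illustration.
params = {"cutoff": 900, "density": 0.4, "fx_mix": 0.2}
snapshots = {}

def store(name):
    snapshots[name] = dict(params)    # copy the entire current state

def recall(name):
    params.update(snapshots[name])    # every saved parameter comes back at once

store("verse_state")
params["cutoff"] = 12000
recall("verse_state")                 # cutoff is 900 again, instantly
```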
digit 247 Posted May 27, 2016

just thinking out loud here: we know that for the recent live stuff they're using Max and no other hardware other than the Kenton Killamix controllers, so all the sounds are being generated in Max. i'm hearing some pitched sample artifacts in elseq, so we know that not everything is synthesized. their setup can also process samples. I remember one of them mentioning they specifically use the Killamix controllers that have the indented knobs that snap to discrete values (the ones with the green rather than blue LEDs?), so that means they're using them to select items rather than tweaking filters or fx by hand in realtime.

if i was trying to recreate what they do, i imagine i would separate the sounds, sequences, melodies, effects, and the control data that modulates all of the above into separate 'blocks' or 'clips' or whatever terminology you want to use, and then recombine them on the fly in different ways. i would also build in some internal feedback system so that the audio or logic output of one element could influence the modulation of others, and set things to evolve gradually in this way. so even if you took your hands off all the controls, the sound would still evolve and change to a certain extent.

consider that they're not using MIDI anymore since their system is not talking to external hardware, so all the modulation happens in the box. you could get really fast, high-resolution modulation of all your parameters. this explains how they get such a fluid, warped sound in their recent stuff.
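The internal-feedback idea is probably the most interesting part of that post. A very rough Python sketch of its shape, with invented numbers and nothing like a real patch, where each element's output level keeps nudging the other's parameters so the system drifts on its own:

```python
# Hand-wavy sketch of cross-modulation feedback: two elements whose smoothed
# output levels modulate each other's rates, so the system keeps evolving with
# no hands on the controls. All constants are invented for illustration; a
# real Max patch would do this per-sample or per-event.
import math

class Voice:
    def __init__(self, rate):
        self.rate = rate        # the parameter being modulated from outside
        self.phase = 0.0
        self.level = 0.0        # smoothed output level, fed to the other voice

    def tick(self, dt=0.01):
        self.phase += self.rate * dt
        out = math.sin(2 * math.pi * self.phase)
        self.level = 0.99 * self.level + 0.01 * abs(out)   # slow envelope follower
        return out

a, b = Voice(rate=1.0), Voice(rate=1.6)
for _ in range(5000):
    a.tick()
    b.tick()
    # cross-modulation: each voice's level nudges the other's rate
    a.rate = 1.0 + 2.0 * b.level
    b.rate = 1.6 + 3.0 * a.level
```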
kausto 65 Posted May 27, 2016

i'm hearing some pitched sample artifacts in elseq, so we know that not everything is synthesized. their setup can also process samples.

It could just be table write / table read with a phasor, i.e. resampling. I don't know about Max particularly, but in Pd I'd do that kind of stuff (buffer tricks, granulation etc.) this way. I also noticed the high-pitched sweeps of retriggering clicks/hi-hats sound pretty aliasing-free, like they use bandlimited stuff where it's necessary.
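The table write / table read trick is easy to show outside Pd or Max. A quick numpy sketch with a toy buffer, where the phasor's frequency relative to the buffer length sets the playback speed (and therefore the pitch):

```python
# Sketch of phasor-driven table reading, the classic Pd/Max resampling move:
# a 0..1 ramp at frequency f scans the table, so f relative to
# (sample rate / table length in samples) sets the transposition. Toy values.
import numpy as np

sr = 44100
table = np.sin(2 * np.pi * 220 * np.arange(sr) / sr)   # pretend this is a 1 s recorded buffer

def phasor(freq, n, sr):
    """A 0..1 ramp at the given frequency, n samples long."""
    return (freq * np.arange(n) / sr) % 1.0

def table_read(table, ramp):
    """Read the table at fractional positions with linear interpolation."""
    pos = ramp * (len(table) - 1)
    i = pos.astype(int)
    frac = pos - i
    j = np.minimum(i + 1, len(table) - 1)
    return table[i] * (1 - frac) + table[j] * frac

# the buffer is 1 s long, so a 2 Hz phasor plays it back at double speed (up an octave)
out = table_read(table, phasor(2.0, sr, sr))
```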
freddy753 3 Posted May 27, 2016

This is off topic, but man, those Killamix controllers are expensive. On the other hand, they look solid as hell. It'd be cool if they made a "clone" of the BCR2000 with the construction of the Killamix.
nerver 4 Posted May 27, 2016

i think they are always changing how they do things though. if you have the time, read the massive thread about them; they spill the beans here and there. but one thing they never do is show pictures of what they have set up :) always evolving!
futureimage 4 Posted May 30, 2016

Just chiming in to say I think the whole AE_LIVE/Exai tour patch must be fascinating. The fact that there are so many memorable rhythmic motifs which appear at the same points in the different sets means that there's something very interesting going on - obviously not pure random, but not pure sequencing either, given the way those motifs are slightly skewed each time. I've equated that patch to having an electronic version of the Bitches Brew band going off. That in itself is something to behold.
AnwarAutokino 122 Posted May 30, 2016

I've equated that patch to having an electronic version of the Bitches Brew band going off. That in itself is something to behold.

good analogy, elseq in particular struck me as very Bitches-Brew-y, it has that sort of meandering, jammy quality to it
MIXL2 855 Posted May 30, 2016

can we rename this thread: "Autechre production methods speculation based on facts we have" out of nostalgia
sweepstakes 873 Posted May 30, 2016

can we rename this thread: "Autechre production methods speculation based on facts we have" out of nostalgia

Boy, that was one of the dumbest thread titles ever and it sat on the front page for months. In protest I started a thread entitled "...based on farts" but it was locked almost immediately.
auxien 2263 Posted May 30, 2016

Fart-based production speculation might be more relevant for an AE thread.

sent using magic space waves
Psychotronic 253 Posted May 30, 2016

I support the fart-based speculation approach.
kausto 65 Posted May 30, 2016

BTW i'm somehow sure c16 deep tread is a mutation of this: deep fuckin' tread
cyanobacteria 862 Posted July 3, 2016

So do AE use Max/MSP for sound synthesis, or just for controlling external peripherals? I've been learning Max/MSP and reading up on how to make high-quality-sounding synths, and apparently it's riddled with issues, including aliasing, scheduler slop dropping events, and things like that. Is this the case? And if so, how are these problems alleviated? Clever programming?
IOS 431 Posted July 3, 2016

my humble guess is they use it for everything, whether it be synthesis, sequencing, laptop comm (using their own MIDI-like protocol), FX etc. They've written externals for it too (presumably in C?). Would be very interested to read what you've gathered re MaxMSP defects.
auxien 2263 Posted July 3, 2016

Yeah, everything I've seen them say in interviews for the past few years suggests everything is done within Max/MSP. They could of course be using samples on occasion, but to me it doesn't sound like it too often. Older stuff, Exai-era and of course before, seems to have some amount of hardware involved. Would be curious when they last directly used hardware in a track... might be able to find out in AAA.