  1. I'm sorry I died for a month, but I'm opening this thread back up to get a few more answers. I just finished a six-week technical writing class, and I didn't even bother with music or programming during that time. It seems like either Max or Max for Live is my best bet. I'll have to ask about this in a more specific thread (I guess), or on a Max forum. But real quick: how???? I'm trying to google this, but I can't find anything. This sounds PERFECT though! This sounds perfect too. But Live Standard plus M4L is too expensive for me, even with student discounts. In M4L, can I do all the audio stuff I can do in ordinary Max? I'd be hesitant to get it if I'm missing useful Max features.
  2. Okay, I'm gonna give a couple of very specific examples of what I'm wanting. I'll generalize the situations using generic tracker jargon, but if this could be done in other software with something like a piano roll or a timeline, that's totally fine. I also asked about this in the Radium Google group.

So we'll assume: 1) I have a generic tracker; 2) I can use scripts to 'place' notes onto the lines in a track's column, or to 'place' values in an automation column; 3) the patterns have 64 lines. (A "pattern" is a collection of tracks and other columns, like in Renoise.)

This first example can actually be done in Renoise with xStream. I have a script that 'spits out' note values onto the lines of a track. In the script, I have a list of note values. On the first line, and on every 8th line in the pattern after that, the script randomly chooses an element (a note value) from the list and 'places it' onto those lines. Let's say these notes are used to play a VST instrument. The script also has another list of values, which it 'places onto' an automation column in the same fashion, maybe to control the cutoff amount in that VST. The script specifies all of this, and does it in a single execution. After execution, I can see the results on each track's lines (I like to call this "logging the results" of the script). This gives me the option to fine-tune the pattern data as I please.

This next example exactly describes what I want the most. Same situation as before, with the script that 'spits out' note data. But now I add a function to the script, keychange(note_list, value), which adds value to every element in the list of notes (and returns the new list). I create a variable, kcval, which I pass to keychange's value parameter. In a new track (same pattern), I put ~something~ on line 32. When the pattern is in playback and it reaches line 32, this ~something~ adjusts the value of kcval. I.e., keychange is automated by the pattern itself. This is what I meant by "scheduling" code updates in my initial post. I know that instead of having the code update via pattern playback, you could make the script 'do the update at the right moment' with a getter for the pattern's current line number, but having the update in the pattern itself would be a really intuitive way of modifying the code. This may be too specific to my own needs, but it's exactly what I want. And again, if this could be achieved with a different paradigm (e.g., piano roll or timeline), I'd like to know.

I'm especially wondering if I could do the second example with Max for Live: i.e., the "code updates" would be placed on Live's timeline, and upon playback they would update values in a Max patch over time. I doubt I could achieve the "logging the results" mentioned above, but I can live with that. Also, any ideas on how I could route software together to achieve this in a tracker? I did just say I was too lazy for that, but I'm realizing this may be my only option.
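To make the two examples concrete, here is a rough sketch of the logic in Python (Renoise tools are actually written in Lua, and names like pattern and the line-32 trigger are illustrative stand-ins, not any real tracker API; the playback-time kcval update is simulated inside the generation loop):

```python
import random

PATTERN_LINES = 64
STEP = 8

note_list = [48, 51, 55, 58]      # candidate note values (MIDI numbers)
cutoff_list = [0.2, 0.5, 0.8]     # candidate automation values

def keychange(notes, value):
    """Example 2: add `value` to every element of the note list."""
    return [n + value for n in notes]

kcval = 0
pattern = {}  # line number -> (note, cutoff); stands in for track + automation columns

# Example 1: on line 0 and every 8th line after, place a random note
# from the list, plus a random automation value.
for line in range(0, PATTERN_LINES, STEP):
    if line == 32:
        # the ~something~ on line 32: bump kcval, transposing everything after
        kcval += 3
    notes = keychange(note_list, kcval)
    pattern[line] = (random.choice(notes), random.choice(cutoff_list))

# "Logging the results": the pattern is now plain data you can inspect
# and fine-tune by hand after the script has run.
for line, (note, cutoff) in sorted(pattern.items()):
    print(line, note, cutoff)
```

The point of the sketch is the split: the script runs once and leaves editable data behind, while kcval is the single knob the pattern itself would turn during playback.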
  3. Ah, yeah, I'm sure I could do exactly what I need by routing software together. But that sounds too hard. I'll probably end up doing it if I absolutely have to, but like I said in my initial post: I need to be hand-held.
  4. This is very interesting, but for now, I'm putting off learning DSP for a couple years.
  5. I looked at Jeskola Buzz immediately after posting this thread. I have the same questions about it as I do above with the Radium/Puredata stuff. So substitute: instead of Pd, a C++ program; and instead of Radium, Jeskola Buzz.
  6. Radium looks super interesting, especially with that Pd integration. Thanks for the suggestion, Entorwellian. Linux only so far, damn! Awwww man, I was about to download it until I saw that the Pd stuff was Linux only. Maybe it's time to partition my hard drive... and I'm actually serious about that. But I need more details about how Pd is built into Radium. I feel like the answer will be "yes" to all of these, but I just want to be sure: Can I use the Pd integration to output notes into the note editor? Can I use it to output ~whatever~ into the automation? Can I use it to track through samples? The paper on the website says "Custom Pd controllers make it possible to control Pd from Radium, and to control Radium from Pd". How crazy does that get? Can I do it recursively? E.g., I enter data into a column in the pattern editor; this data then changes values in a Pd patch, with the changes scheduled by my original pattern; but THEN this Pd patch outputs note data into a separate pattern column. If the Pd integration can't do any of those, what about the "extension language support"? Could I do the things described above with an extension language?
  7. Would I need to buy Reaktor to use that? I've thought about getting it in the past, but I never have enough money. Plus I'd rather drop that amount on a Max/MSP license.
  8. danoise has been very helpful to (and very patient with) me, but I NEED baby steps like SonicPi gives in its tutorial. I play around in xStream, but I quickly run into issues and then have a million questions. I'd rather read docs and tutorials and get an answer right away than type out my issue on the forum and wait for a reply. Clip looked interesting, but it didn't successfully load into the current version of Renoise (is that something I can easily fix myself?). And ReNoam only sequences patterns, not track data. Yeah, I'm trying to learn Lua. It's really difficult, though, because it doesn't natively have OOP; the metatables are stumping me at the moment. Also, I'm a noob programmer. But I do appreciate the flexibility that comes from having only one kind of data structure. I have the same issue with the Renoise API, though: no tutorials. I need to be taught from the ground up.
  9. Hello everyone. I never post here, but I do read here a lot. -----------(Intro)----------------------------------------------------- Okay, so a couple months ago I started learning SonicPi (and Ruby). This wasn't my first time programming, but it was my first foray into algorithmic composition. It was also the first time I ever got serious with OOP. Needless to say, I was blown away. Fucking ecstatic. Here was this algo-comp thing that had always eluded me, and then Sam Aaron comes in like, "Come, my child. I will show you the way. EVERY STEP of the way." That tutorial makes it SO easy to learn this stuff. It was telling me what a "sustain level" is and I was like, "pfft, whatever, I know this". But then in the MagPi tutorial he wrote: "and then when the loop repeats, the random seed is initialized again and the sequence repeats". And then I understood. PRNG seed manipulation, baby. This is how one "does the Aphex Twin". I was finally ready to get serious about making music. But frustration set in quickly, because it's so damn hard to make a song transition. You can make cues and sync threads to them, but it's so unintuitive. And when you have a thread with random sleep periods, it gets even worse. Impossible for me. But really, I was missing my home, Renoise. Renoise is baby's-first-DAW, and boy was I homesick. Whenever I wrote anything in SP (SonicPi), I always wished I could select and paste 'track data'. And slice up drum breaks. And 'drag an FX slider' while a song was running. And don't get me started on how SP doesn't even have native MIDI support. So I found out about the Renoise 'tool' (an add-on, extension, etc.) xStream, and it seemed really cool; it would be the answer to my prayers. I was so eager I bought the 3rd edition of Programming in Lua (Renoise and its tools are made with Lua). But there's no hand-holding tutorial for xStream like there was with SP. Actually, there's NO tutorial. Instead, I was told to "study examples".
These examples had no comments in them. At all. Just a commented description at the top of what the example did. And there are these very important functions/classes, xline and xinc, but there's no complete description of them anywhere. There is documentation, but it's incredibly cryptic and confusing, and it has zero info on how to use or implement the classes in it (and xinc isn't even in the docs, despite its central role). And now I'm running into issues with the PRNG, which is pretty much my favorite part of algo-comp. So fuck. That's no fun. --tldr-- I learned SonicPi, and I loved the flexibility that programming could bring to music. But using pure code to write songs can be frustrating: you can't sequence anything, and you can't go back and change something; you can only come up with better code and hope for the best. I'm used to a tracker interface, so I quickly became frustrated with these limitations in SP. I tried xStream in Renoise, but I need a hand-holding tutorial, and xStream doesn't have one. ------------------------------------------------------------------------ Do you guys have recommendations? For any suggestion: does that software have native VST and MIDI support? Or even better: can it be used as a VST? The ideal (imaginary) situation would be that I could trigger/schedule SC code with Renoise, and that I could write SC code to produce input in Renoise. Something that could do that would be perfect. But how about SuperCollider (SC)? I looked into using SC, especially because SP is a client to scsynth (the SC sound-synthesis server). I could slowly transition by writing SC synthdefs for SP, and then eventually fully switch to SC. But I'm hesitant to dedicate my efforts to SC because I'm certain I'll run into the same frustrations I had with SP. (Does SC even use real-time scripts like SP does?) I know that in SC you can make little GUIs to control your code. Is there a tracker-like SC GUI that will schedule my code for me?
How well does SC integrate with MIDI? How about Max/MSP? It seems like it could have similar issues with scheduling parts of the code. But I heard it can interface with Arduino and RPi pretty easily, which sounds awesome. Also, it can 'cross-modulate' between audio and video; to me, that equals "dank meme machine". Is the community friendly enough to let me rip their patches and use them however I please? Not that I won't learn it. I just need a quick start. I haven't really looked into Csound or ChucK. Anything else is unknown to me.
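The seed trick from the intro above, where re-initializing the random seed on each loop repeat makes a "random" sequence repeat exactly, can be demonstrated in a few lines (plain Python here rather than SonicPi's Ruby; the scale and function name are just illustrative):

```python
import random

def random_riff(seed, length=8):
    """Re-seeding the PRNG makes the 'random' melody deterministic:
    the same seed always yields the same sequence of notes."""
    rng = random.Random(seed)           # fresh generator, seeded like SP does on each loop
    scale = [60, 62, 63, 65, 67, 70]    # C-minor-ish MIDI pitches
    return [rng.choice(scale) for _ in range(length)]

# Each loop "repeat" re-initializes the seed, so the riff repeats exactly,
# while a different seed gives a different (but equally repeatable) riff.
assert random_riff(42) == random_riff(42)
print(random_riff(42))
print(random_riff(1337))
```

This is the whole mechanism behind "doing the Aphex Twin" with a PRNG: the music sounds generated, but any phrase can be recalled at will by replaying its seed.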