Search the Community

Showing results for tags 'Touchdesigner'.

Found 12 results

  1. Through the use of OpenAI’s DALL-E 2 API and TouchDesigner, I’ve managed to create a, let’s say, MIDI sequencer that captures RGB data from AI-generated images in real time and uses it to trigger MIDI events in Ableton Live. Those signals then feed, among other things, two wavetables in the first two examples and my lovely new Prophet 6 in the others. In short: the idea was to have a system that, given certain concepts (a.k.a. prompts), generates chord progressions from the RGB data coming out of the DALL-E-generated images. Concept [prompt] ➨ AI-generated image [DALL-E 2] ➨ capturing RGB data in real time [TouchDesigner] ➨ using that data to trigger MIDI events [Ableton Live]. For more experiments, tutorials, and project files, you can head over to: https://linktr.ee/uisato
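For anyone curious about the plumbing, here is a rough standalone sketch of the core idea, assuming the requests, Pillow and mido packages, an OPENAI_API_KEY in the environment, and a virtual MIDI port name that will differ on your machine. It is not the original project file, just the prompt-to-image-to-RGB-to-chord chain in miniature:

```python
# Rough sketch: prompt -> DALL-E 2 image -> average RGB -> one MIDI chord.
# Assumes OPENAI_API_KEY is set, and that PORT_NAME is a MIDI port routed
# into Ableton Live (both are assumptions, adjust for your setup).
import io, os, time
import requests
import mido
from PIL import Image, ImageStat

PROMPT = "a storm over a violet ocean"     # the "concept"
PORT_NAME = "IAC Driver Bus 1"             # placeholder port name

# 1. Generate an image with the DALL-E 2 images/generations endpoint.
resp = requests.post(
    "https://api.openai.com/v1/images/generations",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={"prompt": PROMPT, "n": 1, "size": "256x256"},
    timeout=60,
)
url = resp.json()["data"][0]["url"]

# 2. Download it and reduce it to a single average RGB triplet.
img = Image.open(io.BytesIO(requests.get(url, timeout=60).content)).convert("RGB")
r, g, b = ImageStat.Stat(img).mean          # each value is 0..255

# 3. Map colour to a chord: red -> root, green -> minor/major, blue -> velocity.
root = 36 + int(r / 255 * 48)               # C2..C6
third = root + (3 if g < 128 else 4)        # darker greens read as minor
fifth = root + 7
velocity = 40 + int(b / 255 * 87)

with mido.open_output(PORT_NAME) as port:
    for note in (root, third, fifth):
        port.send(mido.Message("note_on", note=note, velocity=velocity))
    time.sleep(2.0)
    for note in (root, third, fifth):
        port.send(mido.Message("note_off", note=note))
```

In the actual TouchDesigner version the RGB capture happens per frame on the live texture rather than on a downloaded file, but the mapping idea is the same.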
  2. A couple of example results from my latest system, a video player with an audioreactive playhead. There’s a bit more to it, but I’ll leave that for my patrons to explore. You can access this system plus many more through my Patreon: https://linktr.ee/uisato
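If you want to try the basic idea yourself, a minimal version can live in a single CHOP Execute DAT: an audio level CHOP drives the index of a Movie File In TOP whose Play Mode is set to "Specify Index". The operator names below ('level1', 'moviefilein1') and the scrub-speed constant are placeholders, not the ones from my project file:

```python
# CHOP Execute DAT sketch: nudge a video's playhead with the incoming audio level.
# Assumes the DAT watches a one-channel level CHOP ('level1') and that
# 'moviefilein1' has Play Mode set to "Specify Index" (index in frames).

def onValueChange(channel, sampleIndex, val, prev):
    movie = op('moviefilein1')
    total = movie.numImages                     # frame count of the loaded clip
    # Advance the playhead proportionally to the level (roughly 0..1),
    # wrapping at the end so loud passages scrub faster.
    step = max(0.0, val) * 5.0                  # tune the scrub speed here
    movie.par.index = (movie.par.index.eval() + step) % total
    return
```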
  3. Hello everyone. I present to you my latest work: transforming NASA's asteroid data into MIDI in real time to feed my Pulsar 23:
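For those asking how the data gets in: the sketch below shows the general shape of it, using NASA's public NeoWs feed endpoint and mido, with the asteroid-to-note mapping boiled down to a couple of lines per object. The field names come from the public API; the MIDI port name is a placeholder and the real patch does considerably more:

```python
# Sketch: today's near-earth objects from NASA's NeoWs feed -> MIDI notes.
# Assumes the requests and mido packages and a MIDI port routed to the Pulsar 23.
import datetime, time
import requests
import mido

API_KEY = "DEMO_KEY"                      # fine for light testing
PORT_NAME = "Pulsar 23 MIDI In"           # placeholder: your interface's port name

today = datetime.date.today().isoformat()
feed = requests.get(
    "https://api.nasa.gov/neo/rest/v1/feed",
    params={"start_date": today, "end_date": today, "api_key": API_KEY},
    timeout=30,
).json()

with mido.open_output(PORT_NAME) as port:
    for neo in feed["near_earth_objects"][today]:
        # Bigger rocks -> lower notes, faster rocks -> harder hits.
        diameter_m = neo["estimated_diameter"]["meters"]["estimated_diameter_max"]
        approach = neo["close_approach_data"][0]
        speed_kps = float(approach["relative_velocity"]["kilometers_per_second"])
        note = max(24, 96 - int(min(diameter_m, 1000) / 1000 * 60))
        velocity = min(127, 40 + int(speed_kps * 2))
        port.send(mido.Message("note_on", note=note, velocity=velocity))
        time.sleep(0.25)
        port.send(mido.Message("note_off", note=note))
```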
  4. Hello everyone! I'd like to share a few excerpts of me playing my new chaotic pointcloud system, in which the particles react to the incoming audio signal. You can find these and quite a few more project files on my Patreon page: https://linktr.ee/uisato
  5. Through the use of Ambee’s and NASA’s APIs and TouchDesigner, I’ve managed to capture air quality and near-Earth object [asteroids and fireballs] data in real time and use it to trigger MIDI signals in Ableton Live. Those signals feed two stock samplers and an instance of SketchCassette. One sampler is loaded with Thom Yorke’s vocal stems from Nude and Reckoning, the other with the song that single-handedly introduced me to guitar playing. Without further ado:
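The part people usually ask about is how a raw reading (an AQI number, a miss distance) becomes something musical. Here is a stripped-down version of just that mapping step: scale the value into a fixed musical scale and send it as a note. The fetch is left as a stub because the Ambee and NASA request details depend on your API keys, and the port name is again a placeholder:

```python
# Sketch of the data -> MIDI mapping step: snap a raw sensor reading onto a
# musical scale, then send it with mido. fetch_reading() is a stand-in for
# the actual Ambee / NASA request.
import time
import mido

C_MINOR = [0, 2, 3, 5, 7, 8, 10]           # scale degrees (semitones from the root)
ROOT = 48                                   # C3
PORT_NAME = "IAC Driver Bus 1"              # placeholder: port routed into Ableton

def fetch_reading():
    """Placeholder: return the current AQI (0..500) from whatever API you use."""
    return 87.0

def reading_to_note(value, lo=0.0, hi=500.0):
    # Normalise to 0..1, spread over two octaves, then snap to the scale.
    t = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    degree = int(t * (len(C_MINOR) * 2 - 1))
    octave, step = divmod(degree, len(C_MINOR))
    return ROOT + 12 * octave + C_MINOR[step]

with mido.open_output(PORT_NAME) as port:
    while True:
        note = reading_to_note(fetch_reading())
        port.send(mido.Message("note_on", note=note, velocity=90))
        time.sleep(1.0)
        port.send(mido.Message("note_off", note=note))
        time.sleep(4.0)                     # poll slowly; these APIs update slowly
```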
  6. Hey, everyone! Just came to share my newest video tutorial on a technique that I think a good number of you will find interesting. In the following video, you’re going to learn how to control both analog and digital instruments through movement, with the help of TouchDesigner, Ableton Live and a Kinect camera. Link to video: https://www.youtube.com/watch?v=vHtUXvb6XMM
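If you'd rather read than watch, the core of the technique fits in one CHOP Execute DAT: a single Kinect CHOP channel (say, the height of the left hand) gets remapped to 0-127 and pushed out as a MIDI CC that Ableton maps to whatever parameter you like. The operator names, the CC number and the assumption that the incoming channel is already normalised to 0..1 are all placeholders, not exactly what the video uses:

```python
# CHOP Execute DAT sketch: one Kinect CHOP channel -> one MIDI CC.
# Assumes the DAT watches a Select CHOP holding a single normalised (0..1)
# Kinect channel (e.g. the left hand's height), and a MIDI Out DAT named
# 'midiout1' whose output is routed into Ableton Live.

CC_NUMBER = 20            # map this CC to a device macro in Live
MIDI_CHANNEL = 1

def onValueChange(channel, sampleIndex, val, prev):
    cc_value = int(max(0.0, min(1.0, val)) * 127)
    op('midiout1').sendControl(MIDI_CHANNEL, CC_NUMBER, cc_value)
    return
```

The same pattern scales up: one channel per joint, one CC per parameter.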
  7. Hello everyone! Using a depth camera (Kinect V2), I’ve managed to create this cool real-time filter in TouchDesigner. What do you think? Can't wait to get this bad boy on stage! Music and visuals by me.
  8. Hello everyone! I’ve been working for quite a while on a way to transform images into sound, and came up with a hopefully interesting idea. In this tutorial you’re going to learn how to transform RGB data into MIDI in TouchDesigner and feed it to Ableton Live. I’ve also uploaded both the project files and the final Ableton session for all my patrons: https://linktr.ee/uisato Enjoy! 😄
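The TouchDesigner half of the tutorial boils down to sampling pixels off a TOP and turning the colour channels into note data. A bare-bones version is below, with placeholder operator names and a much cruder mapping than the tutorial's (one pixel, two channels, one note at a time):

```python
# Execute DAT sketch: once per frame, sample one pixel of a TOP and turn its
# colour into a MIDI note for Ableton Live.
# Assumes a TOP named 'image1' and a MIDI Out DAT named 'midiout1'.

MIDI_CHANNEL = 1
last_note = None

def onFrameEnd(frame):
    global last_note
    r, g, b, a = op('image1').sample(u=0.5, v=0.5)   # centre pixel, values 0..1
    note = 36 + int(r * 48)                          # red -> pitch (C2..C6)
    velocity = 30 + int(g * 97)                      # green -> velocity
    if note != last_note:                            # only retrigger on change
        midi = op('midiout1')
        if last_note is not None:
            midi.sendNoteOff(MIDI_CHANNEL, last_note)
        midi.sendNoteOn(MIDI_CHANNEL, note, velocity)
        last_note = note
    return
```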
  9. Hey, everyone! Just wanted to share my latest experimentation with Kinect, Ableton Live and TouchDesigner. It's amazing what can be accomplished with so little investment:
  10. I’ve been kind of obsessed with oscilloscope music/drawing this past week. And kind of sad that the cool analog scopes for sale in my area were either wildly overpriced or in terrible shape. That’s why I decided to build one in TouchDesigner (and I was able to include the precious XY mode!). What you’re hearing in the first video is a segment of Jerobeam Fenderson’s “Planets”, a track from his amazing album “Oscilloscope Music”, and all the visuals you’re seeing are generated simply by feeding that track into the oscilloscope in XY mode. (Yes, Jerobeam is awesome.) There’s still some room for improvement, but I’m quite happy with the “analog vibe” I managed to give the whole thing. PS: I’ve just uploaded a copy of the oscilloscope’s project file for all my patrons. Let the sonorous drawing begin: https://linktr.ee/uisato
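For anyone wondering why XY mode does the drawing: the trick is to stop plotting amplitude against time and instead plot the left channel against the right channel. Outside TouchDesigner you can convince yourself with a few lines of Python (standard wave module plus numpy and matplotlib; the filename and the 16-bit stereo assumption are yours to adjust):

```python
# Sketch: XY-mode "oscilloscope" rendering of a stereo file.
# Left channel drives X, right channel drives Y - exactly what oscilloscope
# music like "Planets" is composed for.
import wave
import numpy as np
import matplotlib.pyplot as plt

with wave.open("your_track.wav", "rb") as wav:       # assumption: 16-bit stereo WAV
    assert wav.getnchannels() == 2
    frames = wav.readframes(wav.getnframes())

samples = np.frombuffer(frames, dtype=np.int16).reshape(-1, 2) / 32768.0
window = samples[:48000]                             # roughly the first second

plt.figure(figsize=(6, 6), facecolor="black")
plt.plot(window[:, 0], window[:, 1], color="lime", linewidth=0.3, alpha=0.7)
plt.axis("off")
plt.gca().set_aspect("equal")
plt.show()
```

The TouchDesigner build applies the same left-to-X, right-to-Y idea to the live audio input.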
  11. Hey, everyone! I've been experimenting for the past couple of months with different software and techniques, and I'm super excited to share what I've achieved! 😄 Hardware used: a PC running Windows 10 with Ableton Live 11 and TouchDesigner running in parallel, plus a Kinect V2. The clips show a Kinect-controlled drum rack, a Kinect-controlled voice FX chain, a Kinect-controlled guitar FX chain, and a Kinect-controlled synth. What do you guys think? What would you like to see in the future with these kinds of experiments? I'm all ears!
  12. My very first tutorial, ever. Hope you guys enjoy it. www.youtube.com/watch?v=rhxB8eq4r88