justanotheruser
Members · 58 posts

Everything posted by justanotheruser

  1. Hello everyone! I've created a bunch of experimental digital oscilloscopes, which I tend to run through Stable Diffusion for more intricate and interesting results. Here are a couple of demonstrations for you to enjoy. You can access these bad boys to embellish your productions through my Patreon page: https://linktr.ee/uisato
  2. A few small fragments of live-improvisations, testing these new real-time audioreactive pointclouds I've been working on. [Which btw, you can access as of today through my Patreon profile. Nine (9) new presets in total.] For more experiments, project files and tutorials, you can head over to: https://linktr.ee/uisato
  3. Bridging the realms of sound, live performance and literature, I've created a system in TouchDesigner that transforms real-time audio data into unique AI-generated poetry, recited back, also in real time. Through the use of AI and data sonification techniques, this artistic installation aims to give life to a digital bard that accompanies musical compositions in a hopefully-beautiful and meaningful way. I’ve been working on this for the past several months and, let me tell you, there has been a significant number of interesting moments while using this system to accompany my live performances. I hope both examples shown in this post give you a glimpse of what I’m writing about. In short, the process happens as follows: Real-time Audio Source ➨ Frequency Analysis ➨ Poetic Element Mapping ➨ Poem Generation [ChatGPT API] ➨ TextToSpeech [ElevenLabs API]. Filled with excitement, I can’t stop thinking about possible cool uses for a system with these characteristics: from using it as a member of a live band, to a 24/7 online radio, to a beautiful interactive installation. PS: I’ve recently got my plane tickets to Europe for this fall, and I’d love to bring Auratura with me. If you are interested in hosting this installation, or know of cool places/venues, please do get in touch. For more experiments, tutorials, and project files, you can head over to: https://linktr.ee/uisato
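
     A minimal, single-shot sketch of that pipeline in Python, outside of TouchDesigner. The OpenAI and ElevenLabs endpoints and payloads reflect my understanding of their public REST APIs and may need adjusting; OPENAI_KEY, ELEVEN_KEY, VOICE_ID and the band-to-mood mapping are placeholder assumptions, not the settings used in Auratura.

        import numpy as np
        import sounddevice as sd
        import requests

        SR = 44100
        OPENAI_KEY, ELEVEN_KEY, VOICE_ID = "sk-...", "xi-...", "your-voice-id"  # placeholders

        # 1. Capture a short mono audio buffer from the default input device.
        audio = sd.rec(int(2 * SR), samplerate=SR, channels=1, dtype="float32")
        sd.wait()
        audio = audio.flatten()

        # 2. Frequency analysis: average energy in low / mid / high bands.
        spectrum = np.abs(np.fft.rfft(audio))
        freqs = np.fft.rfftfreq(len(audio), 1 / SR)
        bands = {
            "low": spectrum[freqs < 250].mean(),
            "mid": spectrum[(freqs >= 250) & (freqs < 2000)].mean(),
            "high": spectrum[freqs >= 2000].mean(),
        }

        # 3. Poetic element mapping: dominant band -> mood (arbitrary example mapping).
        moods = {"low": "slow and brooding", "mid": "warm and nostalgic", "high": "bright and restless"}
        mood = moods[max(bands, key=bands.get)]

        # 4. Poem generation via the ChatGPT API.
        poem = requests.post(
            "https://api.openai.com/v1/chat/completions",
            headers={"Authorization": f"Bearer {OPENAI_KEY}"},
            json={"model": "gpt-3.5-turbo",
                  "messages": [{"role": "user",
                                "content": f"Write a four-line poem that feels {mood}."}]},
        ).json()["choices"][0]["message"]["content"]

        # 5. Text-to-speech via the ElevenLabs API; save the recited poem.
        speech = requests.post(
            f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
            headers={"xi-api-key": ELEVEN_KEY},
            json={"text": poem},
        )
        open("poem.mp3", "wb").write(speech.content)

     A real-time version would presumably keep steps 1-3 running continuously inside TouchDesigner and only call out to the APIs when a new poem is needed.
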
  4. Hey. Just wanted to share my latest system with you. https://www.youtube.com/watch?v=gZprUqsLOyQ https://www.youtube.com/watch?v=z8wgWWOHiqM A couple of examples of what happens when you combine the particle oscilloscope I last shared with Stable Diffusion. I still can’t believe how good this tool is. We’re living in crAIzy times, folks. Be sure to fasten those seat belts. You can access this oscilloscope, plus quite a few more TouchDesigner project files, tutorials and experiments, through: https://linktr.ee/uisato
  5. Hey! I think some of you might find this topic interesting. Here are a couple of my own experiments with Stable Diffusion and video recordings [some footage of my own, some well-known video clips]: 1) Radiohead - Lotus Flower 2) Stimulus - [recording of my own] What do you think? I'd love to read your thoughts on these.
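
     For anyone curious how this kind of look can be produced, here is a rough sketch using the diffusers library: export the video to frames, run each frame through Stable Diffusion's img2img pipeline, then reassemble with ffmpeg. The model id, prompt and strength below are example values only, not the settings used for these clips.

        import glob, os
        import torch
        from PIL import Image
        from diffusers import StableDiffusionImg2ImgPipeline

        pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
        ).to("cuda")

        os.makedirs("out", exist_ok=True)
        for i, path in enumerate(sorted(glob.glob("frames/*.png"))):
            frame = Image.open(path).convert("RGB").resize((512, 512))
            out = pipe(
                prompt="a lotus flower made of liquid chrome, studio lighting",
                image=frame,
                strength=0.45,        # how far each frame is allowed to drift from the original
                guidance_scale=7.5,
            ).images[0]
            out.save(f"out/frame_{i:05d}.png")  # reassemble into a video with ffmpeg afterwards
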
  6. I've just made available my newest experimental oscilloscope, in addition to my latest MIDI generative system [DALL-E 2 generated images to MIDI] and a completely free sample pack. Enjoy! https://linktr.ee/uisato https://www.youtube.com/watch?v=rSvhhgd-gxw
  7. Through the use of OpenAI’s DALL-E 2 API and TouchDesigner, I’ve managed to create a, let’s say, MIDI sequencer that captures RGB data incoming from AI-generated images in real time and uses it to trigger MIDI events in Ableton Live. Said signals then feed, among other things, two wavetables in the first two examples, and my lovely new Prophet 6 in the others. In short: the idea was to have a system that, given certain concepts (a.k.a. prompts), generates chord progressions from the incoming RGB data of the DALL-E-generated images. Concept [prompt] ➨ AI-generated image [DALL-E 2] ➨ capturing RGB data in real time [TouchDesigner] ➨ using that data to trigger MIDI events [Ableton Live] For more experiments, tutorials, and project files, you can head over to: https://linktr.ee/uisato
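
     A rough sketch of the same idea outside of TouchDesigner, assuming a virtual MIDI port (IAC on macOS / loopMIDI on Windows) routed into Ableton Live. The image endpoint and payload reflect my understanding of the DALL-E 2 API; the RGB-to-pitch mapping is an arbitrary illustration, not the exact mapping used in the videos.

        import io, time, requests, mido
        from PIL import Image

        OPENAI_KEY = "sk-..."  # placeholder

        # 1. Generate an image from a concept (prompt) with DALL-E 2.
        resp = requests.post(
            "https://api.openai.com/v1/images/generations",
            headers={"Authorization": f"Bearer {OPENAI_KEY}"},
            json={"prompt": "a slow sunrise over a frozen sea", "n": 1, "size": "256x256"},
        ).json()
        img = Image.open(io.BytesIO(requests.get(resp["data"][0]["url"]).content)).convert("RGB")

        # 2. Sample a handful of pixels along the image diagonal.
        pixels = [img.getpixel((i, i)) for i in range(0, img.width, img.width // 8)]

        # 3. Map RGB to MIDI: red -> pitch, green -> velocity, blue -> note length.
        port = mido.open_output()  # or mido.open_output("name of the virtual port routed into Ableton")
        for r, g, b in pixels:
            note = 36 + r // 4           # pitches 36..99
            velocity = 40 + g // 3       # velocities 40..125
            port.send(mido.Message("note_on", note=note, velocity=velocity))
            time.sleep(0.1 + b / 255)    # blue channel stretches the note
            port.send(mido.Message("note_off", note=note))
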
  8. A couple of example results from my latest system, a video player with an audioreactive playhead. There’s a bit more to it, but I’ll leave that for my patrons to explore. You can access this system plus many more on my Patreon profile: https://linktr.ee/uisato
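
     The core mapping behind an "audioreactive playhead", sketched with OpenCV outside of TouchDesigner: the louder the incoming audio, the further into the clip the playhead jumps. File names, chunk size and normalisation are placeholder choices; as noted above, the actual system has more to it than this.

        import cv2
        import numpy as np
        from scipy.io import wavfile

        sr, audio = wavfile.read("input.wav")
        audio = audio.astype(np.float32)
        if audio.ndim > 1:
            audio = audio.mean(axis=1)     # mix down to mono
        peak = np.abs(audio).max() + 1e-9

        cap = cv2.VideoCapture("clip.mp4")
        n_frames = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
        hop = sr // 30                     # analyse ~1/30 s of audio per displayed frame

        for i in range(0, len(audio) - hop, hop):
            rms = np.sqrt(np.mean(audio[i:i + hop] ** 2))
            level = min(rms / peak, 1.0)                                   # 0..1 loudness
            cap.set(cv2.CAP_PROP_POS_FRAMES, int(level * (n_frames - 1)))  # loudness drives the playhead
            ok, frame = cap.read()
            if ok:
                cv2.imshow("audioreactive playhead", frame)
                cv2.waitKey(1)
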
  9. I've made a little walkthrough for Samplebrain, Aphex Twin's new sound design tool. Hope you all enjoy it.
  10. Hello everyone. I present to you my latest work: transforming NASA's asteroid data into MIDI in real time to feed my Pulsar 23:
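
     A condensed sketch of this kind of data-to-MIDI mapping in Python, pulling from NASA's NeoWs feed endpoint and sending notes out over whatever MIDI interface the Pulsar 23 is connected to. The JSON field names are as I understand them from the NeoWs documentation, DEMO_KEY is NASA's public demo API key, and the note/velocity mapping is an arbitrary illustration rather than the patch heard in the video.

        import time, requests, mido

        feed = requests.get(
            "https://api.nasa.gov/neo/rest/v1/feed",
            params={"start_date": "2022-11-01", "end_date": "2022-11-02", "api_key": "DEMO_KEY"},
        ).json()

        port = mido.open_output()  # route this port to the Pulsar 23
        for date, objects in feed["near_earth_objects"].items():
            for neo in objects:
                diameter = neo["estimated_diameter"]["meters"]["estimated_diameter_max"]
                miss_km = float(neo["close_approach_data"][0]["miss_distance"]["kilometers"])
                # Bigger rocks -> lower notes; closer passes -> louder hits.
                note = max(24, 96 - int(diameter) % 72)
                velocity = max(20, 127 - int(miss_km / 1_000_000) % 100)
                port.send(mido.Message("note_on", note=note, velocity=velocity))
                time.sleep(0.25)
                port.send(mido.Message("note_off", note=note))
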
  11. A couple of hopefully-interesting patches for the Pulsar 23. I also took the opportunity to dust off my real-time ASCII filter [with which I’ve been preparing some cool things for next month]. You can freely access the full WAV files of these sessions [+ some bonuses] for sampling here.
  12. A couple of hopefully-interesting patches for the Pulsar 23. I also took the opportunity to dust off my real-time ASCII filter [with which I’ve been preparing some cool things for next month]. You can freely access the full WAV files of these sessions [+ some bonuses] for sampling here.
  13. Hello everyone! I'd like to share with you a few excerpts of me playing my new chaotic pointcloud system, in which the particles react to the incoming audio signal. You can find these and quite a few more project files on my Patreon page: https://linktr.ee/uisato
  14. Through the use of Ambee’s and NASA’s APIs and TouchDesigner, I’ve managed to capture air quality and near-earth space object [asteroids and fireballs] data in real time and use it to trigger MIDI signals in Ableton Live. Said signals feed two stock samplers and an instance of SketchCassette. One sampler is loaded with Thom Yorke’s vocal stems from Nude and Reckoner, and the other with the song that single-handedly introduced me to guitar playing. Without further ado:
  15. Hey, everyone! Just came to share my newest video tutorial on a technique which I think a good number of you will find interesting. In the following video, you’re going to learn how to control both analog and digital instruments through movement, with the help of TouchDesigner, Ableton Live and a Kinect camera. Link to video: https://www.youtube.com/watch?v=vHtUXvb6XMM
  16. Hey, everyone. I think that some of you might be interested in my latest experiment: through the use of NASA’s API and TouchDesigner, I’ve managed to capture near-earth space object data [asteroids and fireballs] and used it to trigger MIDI signals in Ableton Live. Said signals feed a stock sampler, which happens to be loaded with a couple of vocal takes from one of my favorite artists. To give the experiment a little more musicality, I decided to iterate through the data of the last six objects that passed close to Earth between the selected dates.
  17. Thanks! Now is the time! There's so much documentation and there are so many tutorials out there...
  18. Hello everyone! Through the use of a depth camera (Kinect V2), I’ve managed to create this cool real-time filter in TouchDesigner. What do you think about it? Can't wait to see this bad boy live on stage! Music and visuals by myself.
  19. Hello everyone! I've been working for quite a while on a way to transform images into sound, and came up with a hopefully-interesting idea. In this tutorial you’re going to learn how to transform RGB data into MIDI in TouchDesigner and feed it to Ableton Live. I’ve also uploaded both project files and the final Ableton session for all my patrons: https://linktr.ee/uisato Enjoy!
  20. Hey, everyone! Just wanted to share my latest experimentation with Kinect, Ableton Live and TouchDesigner. It's amazing what can be accomplished with so little investment:
  21. I’ve been kind of obsessed with oscilloscope music/drawing this past week. And kind of sad that the cool analog scopes being sold in my area were either super overpriced or in terrible shape. That’s why I decided to build one in TouchDesigner (and I was able to include the precious XY mode!). What you’re hearing in the first video is a segment of Jerobeam Fenderson’s “Planets”, a track from his amazing album “Oscilloscope Music”. And all the visuals you’re seeing are generated by simply using said track as the input for the oscilloscope in XY mode. (Yes, Jerobeam is awesome.) There’s still some room for improvement, but I’m quite happy with the “analog vibe” I managed to give the whole thing. PS: I’ve just uploaded a copy of the oscilloscope’s project file for all my patrons. Let the sonorous drawing begin: https://linktr.ee/uisato
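
     For anyone wondering what "XY mode" actually does, here is the whole trick in a few lines of Python (the TouchDesigner build presumably does the equivalent once per frame): the left channel drives X, the right channel drives Y, and a stereo track drawn that way becomes a picture. The file name is a placeholder for any stereo WAV, e.g. a track from Oscilloscope Music.

        import numpy as np
        import matplotlib.pyplot as plt
        from scipy.io import wavfile

        sr, data = wavfile.read("oscilloscope_music.wav")   # stereo file, shape (n, 2)
        window = data[: sr // 30].astype(np.float32)        # one "frame" (~1/30 s) of samples

        plt.figure(figsize=(5, 5), facecolor="black")
        plt.plot(window[:, 0], window[:, 1], color="lime", linewidth=0.5)  # X = left, Y = right
        plt.axis("off")
        plt.show()
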