
Transforming DALL-E 2 Generated Images Into Sound [MIDI Events] – [TouchDesigner + Ableton Live]


justanotheruser

Using OpenAI’s DALL-E 2 API and TouchDesigner, I’ve managed to create a, let’s say, MIDI sequencer that captures RGB data from AI-generated images in real time and uses it to trigger MIDI events in Ableton Live. These signals then feed, among other things, two wavetables in the first two examples, and my lovely new Prophet 6 in the rest.
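As a rough sketch of the RGB-to-MIDI step, one colour sample per frame could be mapped to a note like this. The scale choice and channel roles (red picks the pitch, green the velocity, blue the octave) are illustrative assumptions, not my actual TouchDesigner network:

```python
# Hedged sketch: map one RGB sample to a MIDI (note, velocity) pair.
# The scale and the per-channel roles below are assumptions for
# illustration only.

C_MINOR = [0, 2, 3, 5, 7, 8, 10]  # semitone offsets of a natural-minor scale

def rgb_to_note(r, g, b, base_note=48):
    """Map one RGB sample (each channel 0-255) to a (note, velocity) pair."""
    degree = C_MINOR[r * len(C_MINOR) // 256]  # quantize red onto the scale
    octave = 12 * (b * 3 // 256)               # blue shifts up to two octaves
    velocity = max(1, g // 2)                  # green scales velocity (1-127)
    return base_note + degree + octave, velocity
```

In TouchDesigner this kind of function would run per cook on a sampled pixel, and the resulting pair would go out over a MIDI Out CHOP as a note-on for Ableton Live to receive.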

In short: the idea was to have a system that, given certain concepts (a.k.a. prompts), generates chord progressions from the incoming RGB data of the DALL-E generated images.

Concept [prompt] ➨ AI-generated image [DALL-E 2] ➨ capturing RGB data in real time [TouchDesigner] ➨ using that data to trigger MIDI events [Ableton Live]

For more experiments, tutorials, and project files, you can head over to: https://linktr.ee/uisato
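The chord-progression idea could be sketched like so: each generated image is reduced to an average colour, and that colour picks a triad. The voicing rules here (red chooses the root, green chooses major vs. minor) are illustrative assumptions rather than the actual patch:

```python
# Hedged sketch: turn per-image average colours into a chord progression.
# Root selection and chord quality rules are illustrative assumptions.

MAJOR = (0, 4, 7)  # root, major third, fifth
MINOR = (0, 3, 7)  # root, minor third, fifth

def colour_to_chord(avg_r, avg_g, avg_b, base_note=48):
    """Map an average RGB triple (each channel 0-255) to a list of MIDI notes."""
    root = base_note + avg_r * 12 // 256        # red picks the root within one octave
    quality = MAJOR if avg_g >= 128 else MINOR  # green picks major vs. minor
    return [root + interval for interval in quality]

def progression(frames, base_note=48):
    """Map a sequence of per-frame average colours to a chord progression."""
    return [colour_to_chord(r, g, b, base_note) for r, g, b in frames]
```

Each chord list could then be fired into Ableton Live as simultaneous note-on events, one chord per incoming DALL-E image.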
