
Do any of you develop audio plugins or applications using JUCE?

I've just started getting serious about it after playing around with it for a couple of years. I'm glad there's a decent number of tutorials covering different topics, but I'm not super enthused with how they're written.

What is your experience with it?

  • 2 weeks later...

That said, I hit a wall almost immediately with their tutorials. The second one didn't work, and a lot of their "tutorials" aren't really tutorials: they hand you a finished project and explain parts of it, but they don't show you a linear process to completion. You almost have to reverse engineer every project, so I'm starting to spend a lot of time looking through documentation.

However, there is The Audio Programmer on YouTube.

His videos aren't perfect, but they definitely give you a better idea of how to navigate JUCE and which crucial pieces to add or remove.
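For anyone following along: the core of most JUCE tutorials boils down to a processBlock that loops over channels and samples. Here's a rough, framework-free sketch of that idea in plain C++ (no actual JUCE types; applyGain is a hypothetical stand-in for the kind of work you'd do inside juce::AudioProcessor::processBlock):

```cpp
#include <vector>

// Hypothetical stand-in for the per-buffer work done inside a JUCE
// AudioProcessor::processBlock: scale every sample by a gain factor.
void applyGain(std::vector<std::vector<float>>& channels, float gain)
{
    for (auto& channel : channels)      // one vector per audio channel
        for (auto& sample : channel)    // one float per sample frame
            sample *= gain;
}
```

In real JUCE code the buffer would be a juce::AudioBuffer<float>, but the shape of the loop is the same.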


I played around with it a bit when I was trying to use my Android phone as a MIDI control surface over USB. I found the material fairly helpful (even though it was just one tutorial), though I do this kind of thing as a day job, so maybe I already had the required knowledge. It was fairly straightforward to get a quick hello-world app set up that could send MIDI from the phone to the computer.
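To show how little is involved in that hello-world step: a MIDI note-on message is just three bytes (status, note number, velocity). A minimal sketch in plain C++ (makeNoteOn is a hypothetical helper; in JUCE you'd use juce::MidiMessage::noteOn rather than building the bytes by hand):

```cpp
#include <array>
#include <cstdint>

// Build the three raw bytes of a MIDI note-on message.
// channel: 0-15, note: 0-127, velocity: 0-127.
std::array<std::uint8_t, 3> makeNoteOn(std::uint8_t channel,
                                       std::uint8_t note,
                                       std::uint8_t velocity)
{
    return { static_cast<std::uint8_t>(0x90 | (channel & 0x0F)),  // status: note-on + channel
             static_cast<std::uint8_t>(note & 0x7F),              // note number
             static_cast<std::uint8_t>(velocity & 0x7F) };        // velocity
}
```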

Maybe a good way to learn JUCE is to go find some open-source projects built with it. I've sometimes found it simpler to modify an existing project, learn how it works, and tune it to your needs than to build your own thing from square one.

Check this out, for example; it seems pretty good for getting started: https://github.com/getdunne/VanillaJuce


Thanks for the suggestion! I hadn't thought to look through GitHub.



  • Similar Content

    • By Joyrex
      The Guardian has an article on OpenAI's latest accomplishment - getting Frank Sinatra to sing a song he never sang:
    • By BaxOutTheBox
      I am brand new here and look forward to listening to and commenting on your creations.
      I made this over the weekend:

      As noted, these sounds were made with the Arturia DrumBrute analog drum synth:
      kick -
      db kick signal > JDX Direct Drive > thru > hi-cut EQ
      db kick signal > JDX Direct Drive > out > Mooer ShimVerb
      snare -
      db snare signal > Red Panda Particle
      clap -
      db clap signal > Joyo Digital Delay
      mix out -
      db mix out
      "food dye, glassware and an overhead projector" footage
      "drumbrute jam" footage
      random altered art from the internet
      Adobe Premiere Pro CS6
    • By sweepstakes
      Genuine question! The way I see it, gain is a budget, and normalizing just makes you start the game with lots of money in the bank.
      I get it, you also raise the noise floor and lose a little resolution, but is that REALLY that big of a deal? Does that signal loss really matter that much? I thought the whole point of having much higher bit-depth effects processing was that you had an order of magnitude fewer things to give a shit about, as long as the levels sounded right (and of course your monitoring situation was reasonably well calibrated).
      Is this just a pristine audio from soup-to-nuts kind of thing? Is that what everyone is shooting for these days unless you're doing some intentionally, over-the-plate lo-fi witch-house-ecco-vape-goth-rave-seinfeld thing or whatever the kids are calling it these days? 
      P.S. Not trying to pick on anyone here if we might have happened to have a germane discussion. This is one of those nagging "Am I just really stupid(*) or, is this that thing where everyone is afraid of asking the same thing" questions.
      P.P.S. Mods, feel free to merge... this was the most relevant thread I could find: https://forum.watmm.com/topic/70902-normalising-tracks 
      * Also if I am just being stupid please explain. I will not get butthurt about being stupid, I promise.
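For the sake of the discussion above, it may help to spell out what normalizing actually is: measure the loudest sample, then apply one fixed gain so that peak hits a target. That's it, a single gain change, which is why it's fair to frame it as "starting the game with money in the bank." A minimal sketch in plain C++ (normalizePeak is a hypothetical helper, not any DAW's actual API):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Scale the buffer so its largest absolute sample equals targetPeak.
// This is all peak normalization does: one fixed gain, applied once.
void normalizePeak(std::vector<float>& samples, float targetPeak)
{
    float peak = 0.0f;
    for (float s : samples)
        peak = std::max(peak, std::fabs(s));

    if (peak > 0.0f)                       // avoid dividing by zero on silence
    {
        const float gain = targetPeak / peak;
        for (float& s : samples)
            s *= gain;
    }
}
```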
    • By Guest Soyuz
      Sorry for the bad English...
      I've always thought there are more advantages to working with audio files than with MIDI data. For example, it's very easy to cut the impact off a recorded ride, leaving you with only the ambient ringing tone. Cutting out parts of a recording and pitching them differently can also do wonders, and there's the good old reverse.
      For me, working with audio recordings inside my DAW (Reason 7) gives me more creative control over the sounds I'm working with.
      But I don't find myself to be much of a control guy. You can unleash just as much chaos working with audio files. For example, stretching (and then quickly exporting the stretched sound before it has rendered and fixed all the glitches) has given me a lot of chaos in my work.
      I also think there are more advantages (for me) in working with "analog" equipment, or at least the idea and feel of analog equipment: the idea that all sound is audio, just a signal, and you play with it more directly when it's analog gear. You just twiddle some knobs and make up new sounds in the moment, and making a filter follow a rhythm in your head doesn't mean drawing an automation; you record it as you go.
      I like the playability of analog, the "live" one-take feel of recording that way. The problem is that I don't really own any analog gear. I have an old Casio toy synth that sounds very ugly and an SQ1 workstation, but they don't offer that much control over the sounds in real time, that is to say.

      But! I got a Maschine last year, yeah, that beat thing with colours and blinky buttons and stuff. It's basically like an MPC-style MIDI controller.
      But! I love pads! I play melodies so much better on pads, my brain can remember the scales when they're laid out on a grid, and I can play rhythms very nicely on the pads. I just love playing around with the Maschine. The problem is that if I do something live with it, it can sound nice and exciting, but when I want to record something, it all comes down to automations and MIDI notes...
      SO HERE IS WHAT I DID. It's not revolutionary or anything, but it really rid me of working with any MIDI! (:
      I borrowed my friend's soundcard (he doesn't produce anymore, only gets high) and dusted off my old desktop PC. I booted it up, installed Reason 7 on that mothafukka, and now my production setup is like a digital marriage between two computers (PC + Mac), two soundcards (EIE + Balance), two DAWs (Reason 7 + Maschine 2.0). Everything I do in the Maschine software on the Mac is spit out through the Balance soundcard, into the EIE soundcard, into the PC, and recorded in Reason.
      This new way of working has done so much for me and has given me back the playfulness of making electronic computer music. I'm thinking of expanding; you can get very cheap MIDI replica synths with a lot of knobs too, I think Arturia makes them.
      Buying a lot of MIDI knob-twiddling stuff will not sound like analog, DUH!
      But it will have the same live, playful feel in this context.

    • By melancholera
      A test to determine whether people can really tell the difference between low-quality MP3 (128 kbps), high-quality MP3 (320 kbps), and lossless audio (FLAC).
      I got 2/6 FLAC and only chose the lowest quality once. This was through my Logitech 2.1 computer setup, not the best listening station in my house.