Blender DAW component suggestion

How about being able to drive from currently playing audio WITHOUT having to bake a curve first.

To me it seems like having some kind of plugin that acts as a parser of DAW files and can produce an output which can be used to drive animation would be useful. Something that can read FLP, MMP(Z), MIDI, etc. and produce sets of animated empties based on the track data. Then you could just drive whatever you wanted to animate based on the position (or whatever is being animated) of the empty.
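As a rough illustration of the idea, here is a minimal pure-Python sketch of the conversion step: note events in, per-key keyframe data out. The note tuples and the function are hypothetical; a real add-on would extract the events with a MIDI library (e.g. mido) and write actual keyframes onto empties with `obj.keyframe_insert()`.

```python
# Sketch: turn extracted MIDI note events into per-pitch keyframe data
# that could then be applied to empties via bpy. The note tuples here are
# hypothetical; a real script would pull them from a MIDI file and the
# frame rate from the scene.

FPS = 24  # assumed scene frame rate

def notes_to_keyframes(notes, fps=FPS):
    """notes: list of (start_sec, end_sec, pitch, velocity) tuples.
    Returns {pitch: [(frame, z_value), ...]} - one animated channel per key."""
    channels = {}
    for start, end, pitch, velocity in notes:
        height = velocity / 127.0                     # scale MIDI velocity to 0..1
        frames = channels.setdefault(pitch, [])
        frames.append((round(start * fps) - 1, 0.0))  # at rest just before the hit
        frames.append((round(start * fps), height))   # key pressed
        frames.append((round(end * fps), 0.0))        # key released
    return channels

# Example: two hits on middle C (MIDI pitch 60)
keys = notes_to_keyframes([(0.5, 1.0, 60, 127), (2.0, 2.5, 60, 64)])
```

Each empty would then get one channel driven by its pitch, so harmonics in the audio never enter the picture: the animation comes from the score, not from frequency analysis.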

Animating something like percussion via baked F-curves works OK (most of the time) because it’s easy enough to separate percussion by frequency, but individual keys on a piano keyboard, where there are harmonics: not so much. Ditto if you want to animate based on automation-track behavior like filter cutoffs or some such.

I’d still do the actual audio outside of Blender in a proper DAW (I’m cheap so I like LMMS even if it’s a fairly stripped-down DAW), but it’d be neat if there was something that could produce animation with DAW data.

This - an amplitude tracker with attack/release and frequency controls that can listen to channels on VSE and spit out a value that can be sent to a driver.

Though I have a sneaking suspicion that Blender’s architecture might make this harder than it sounds, which is why Bake Sound to F-Curves exists.
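For what it’s worth, the DSP side of such a tracker is simple; a minimal sketch of a peak-envelope follower with separate attack and release smoothing, in plain Python, might look like this. The hard part is exactly what’s suspected above: Blender has no public hook for tapping a VSE strip’s live audio stream, which is presumably why Bake Sound to F-Curves works offline.

```python
# Sketch of the amplitude tracker described above: a peak-envelope
# follower with separate attack/release time constants. In Blender this
# would need to read samples from the VSE strip (an assumption - no such
# API exists today) and feed the smoothed value to a driver.

import math

def envelope(samples, rate, attack_ms=5.0, release_ms=120.0):
    """Follow the amplitude of `samples` (floats in -1..1) at `rate` Hz."""
    atk = math.exp(-1.0 / (rate * attack_ms / 1000.0))
    rel = math.exp(-1.0 / (rate * release_ms / 1000.0))
    env, out = 0.0, []
    for s in samples:
        peak = abs(s)
        coeff = atk if peak > env else rel   # rise fast, fall slowly
        env = coeff * env + (1.0 - coeff) * peak
        out.append(env)
    return out

# A short burst followed by silence: the envelope rises quickly, then decays slowly.
e = envelope([1.0] * 100 + [0.0] * 100, rate=48000)
```

A band-pass filter in front of this (the “frequency controls”) would let one tracker follow kick, another follow hi-hats, and so on.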

If that helps… there is:

PyDAW (Python Digital Audio Workstation)
https://github.com/j3ffhubb/pydaw/releases

:yes:

Ehm, from what I understand of these plugins, VST is a small framework in which free plugins ship their own visual interface inside the binary, so for processing audio in a stream you don’t need a Cycles-style node system.
Once the audio stream is fed into a plugin, there’s not much for Blender to handle about it; and since free hosts for them already exist, would it really be that hard to include?
This would be really nice when you use Blender for video processing and have to sync it all together with the right sound.

Sure, once this starts, the next steps follow.
And I don’t think it would take long for people to realize that VST could be used for triggers in the game engine, or just for stereo effects when you render a movie with objects that have a sound source (birds, car engines, etc.).
I really think something like this should be part of Blender. Blender is more than just 3D editing; once you have your model, or a normal camera-based movie, the fun part starts. Final sound editing is just one of the fields involved in creating a movie or a game.

We have OpenMPT and Audacity. They are pretty advanced suites. Both are open source and support VSTs.

Modern DAW functionality is pretty complex. And even serious video editing suites don’t match their functionality. I think DAW features would be nice, but it’d be astounding if they ever got close to a real DAW. Better to edit audio in dedicated packages. Blender should have good ways to synch actions to audio (maybe even MIDI) and make it easy to pair audio and video, but producing the audio beyond simple fades is overkill. An EQ might be as far as I’d push it…

I say this as an FL Studio power user, with experience on a little bit of other software.

That and the VST / OSS licensing issues…

Implementing DAW functionality in Blender might be a waste of time, and I partly agree with you: the current devs should concentrate on developing the 3D features. However, Blender is what it is now because of community pioneering. Back when Blender was just a 3D modelling app, people probably thought it was crazy to include a game engine and the VSE. Some pioneers did it anyway, and we can see the results today.

Regarding DAWs, I can’t find a free application that uses VST instruments and also allows MIDI note editing (thinking here of FL Studio). Integrating a DAW feature would be awesome. Say you’re a poor indie and want to add simple electro beats to your game: you need to buy software. You want to generate various VFX sounds: you need to buy Reason.
I don’t propose that the actual Blender devs spend time on DAW development; it isn’t worth it to them. However, there are people with programming knowledge who are eager to help with audio.
If I had the time and the programming knowledge, I would have tried to implement DAW functionality in Blender myself. It’s a cool feature that could help the community in their projects.
If you visit a site with audio developers, or have a programmer friend interested in audio development, you could suggest they give a Blender DAW implementation a try.

I have a challenge for the BlenderArtists community: could you make some mockups of a possible Blender DAW interface? Even if this idea never comes true, it’s an exercise of imagination. Even better, some prospective devs with free time might see your mockups and make the DAW feature real. Who knows?

This sounds similar to the problem posed by integrating a time component to the compositor. Nodes sort of ignore time but animation relies on it. Currently there is no integrated visual metaphor for manipulating node assets in time (frame start numbers are it).

Similarly spot sound effects and possibly music can occur in a game at any time (a triggered call?), and can vary depending upon variables generated by play.

This suggests that sound could be manipulated by stacking modifiers, possibly in the same way that the compositor uses nodes. Why not add another node editor, one that uses established effects/EQs as plugins?
I expect that there would be 2 classes:

  1. Spot modifiers = material nodes. They alter the sound as a set effect (maybe influenced by game logic?)
  2. Animated modifiers = Alter sound over time like the Dope Sheet or VSE
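The two classes above could be sketched like this; the names and structure are purely illustrative (a real Blender implementation would be C/C++ node types, not Python classes), and the toy stack just processes a buffer of samples for a given frame:

```python
# Toy illustration of the two proposed modifier classes, applied as a
# stack to a buffer of samples. All names here are hypothetical.

class GainSpot:
    """Spot modifier: a fixed effect, like a material node."""
    def __init__(self, gain):
        self.gain = gain
    def apply(self, samples, frame):
        return [s * self.gain for s in samples]

class FadeAnimated:
    """Animated modifier: the effect changes over time, like a Dope Sheet curve."""
    def __init__(self, start_frame, end_frame):
        self.start, self.end = start_frame, end_frame
    def apply(self, samples, frame):
        t = min(max((frame - self.start) / (self.end - self.start), 0.0), 1.0)
        return [s * t for s in samples]  # fade in from silence

def run_stack(stack, samples, frame):
    for mod in stack:  # filter upon filter, top to bottom
        samples = mod.apply(samples, frame)
    return samples

# Halfway through a 10-frame fade-in, with a 0.5 gain in front:
out = run_stack([GainSpot(0.5), FadeAnimated(0, 10)], [1.0, -1.0], frame=5)
```

Game logic influencing a spot modifier would then just mean setting its parameters from a trigger instead of from the UI.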

EDIT:
So I did a little search; check this out (skip to 0:40):

@darksder

VST would only require a panel; the plugins have their own GUI: http://www.vstplanet.com/
So if Blender can reserve a stackable screen area for them, that’s enough (filter upon filter).

BTW, someone suggested that Blender support JACK; I don’t know where or how this works…
Perhaps there’s even a JACK <==> VST connector? I’m recording a band music clip again, so I miss VST.
A lot of VST plugins are free and come from an era when people made lots of apps just for free, before every little program cost a lot or required a subscription; the audio/video scene has always had a large open-source side where people share the tools they use. And of course people could fund such initiatives… nah, it’s not such a debate.

But if Blender has JACK… (what is it and how do I connect it?), maybe there’s a JACK-to-VST bridge, or something like that could be made inside or outside Blender… so what is JACK?

Doesn’t JACK just sync apps for playback?

Blender already has JACK support. I’m just not sure if it’s enabled in the default build. It is for syncing playback between apps. So theoretically you could use a program like Ardour to edit your sound while you see your animation edits in the VSE. The complication I had the last time I tried this (admittedly, this is a few years back) was that JACK didn’t have good (any?) support for single-frame shuttling or other arbitrary speed controls. That may have changed since then, but I’d have to look and check to be sure.

Can JACK also be used on Windows, or is it Linux-only?
I’ve not heard much about this protocol/plugin/sound format, or whatever it is…
I think JACK is mainly Linux-based…

But perhaps this article offers some hope; it talks about a virtual audio driver. Studying it now.

It won’t allow any kind of control from Blender, but I hope I’ll at least be able to adjust live band recordings with it.

It’s available on all three major platforms (check JACK’s website). Basically, it’s a means of getting multiple applications that handle audio to speak to one another on the same machine in real time. This way, smaller more specialized programs could work together so you don’t have a giant monolithic program. For instance, you could edit audio in a DAW like Ardour, but use Hydrogen as a drum machine and sync it to video in XJadeo or Blender.

That’s the idea at least. It’s a bit hairy to get configured and set up properly, but it can be quite useful on a properly configured box.

Yes, well, for the output, but I couldn’t find software on Windows that actually used JACK to connect to other software.
My goal is VST… so I keep searching.

I’m surprised you haven’t found any free VST-capable DAWs yet.

http://ardour.org/download.html

https://lmms.io/

There are already multiple free, full-featured, VST-capable DAWs for Linux. I guess Ardour is pay-what-you-want, but that’s almost free and if you build it yourself it’s free. These are not obscure apps either.

LMMS is just free.

IMO, adding music-production features to Blender would be like adding motion-graphics features to LMMS. “But what if some poor person doesn’t have software to do motion graphics for their music video…?” So? There’s Blender, after all.

I agree that better audio-as-driver tools would be great, and maybe even using 3D events to generate OSC or MIDI commands would be useful, but trying to shoehorn music production into Blender seems like a bad idea.

Are you serious? How does a DAW fit in with Blender? That is a specific tool for a specific purpose.

Besides, the Blender devs have a hard enough time keeping up with Blender itself.

To be fair, Razorblade is specifically looking in Windows land. Last I checked, Audacity has had VST support in Windows since version 2.0 or so (and it works with JACK). Disclaimer: I haven’t personally tried that combination as I don’t run Windows on any of my workstations, but it should be possible.