
In the early days of Codea the iPad was a finicky and really only single-tasking beast. That has changed - amazingly so, to where iPads are now regularly in use on stage. With AudioBus, MidiBridge and IAA it is now possible to design sound stages of arbitrary complexity (limited only by imagination and hardware, an ever-rising limit), including fully professional synthesizers, effects, and recording workstations, with support for multi-channel audio interfaces and multiple discrete controllers, not to mention Virtual MIDI connectivity entirely inside the box. I have one setup with an iPad Air and a Focusrite 18i20 interface that is capable of full HD surround sound.

But developing controllers for that rig with any kind of significant intelligence is out of the reach of non-developer users, and seemingly beyond many actual developers.

If Codea could just add the basics - enumerate available ports, open an input, open an output, a callback on received messages, send a message, and close - that would be enough. There is even a free code library of professional robustness for iOS called MidiBus (by Audeonics) that makes this simple to implement and reliable, and Apple are already quite happy with it.
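To make those "basics" concrete, here is a rough Lua sketch of how that surface could feel from inside Codea. Every name in it (midi.inputNames, midi.openInput, midiReceived and so on) is made up - it only illustrates the shape of the API being requested, not anything that exists in Codea or MidiBus today.

-- Hypothetical sketch only: none of these midi.* calls exist in Codea yet.
function setup()
    -- enumerate available ports
    for i, name in ipairs(midi.inputNames()) do
        print("MIDI input " .. i .. ": " .. name)
    end
    -- open an input and an output
    inPort = midi.openInput(1)
    outPort = midi.openOutput(1)
end

-- callback on a received message
function midiReceived(status, data1, data2)
    midi.send(outPort, status, data1, data2)  -- echo everything straight back out
end

-- closing would be the mirror image, e.g. midi.close(inPort) and midi.close(outPort)

Even a surface that small, layered on something like MidiBus, would cover step sequencers, control surfaces and hardware-triggered game events.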
And Codea has, for its part, dealt with Apple's squeamishness about allowing end-user access to Lua scripting. This is a unique combination and opportunity. Not to mention that making music has a lot in common with game play.

And, in the context of Codea, MIDI on iOS provides solutions for making really immersive sound effects for games, far beyond the simplistic tone generator that is all one has today. Implementing MIDI also opens up a world of hardware controllers for game interactions.

But there is nothing like Codea for doing scripted interactive musical instrument UIs and algorithmic compositions. (FYI: I'm a seasoned professional audio engineer, MIDI expert, embedded systems programmer, and musician, currently employed by a well-known gaming console company as their consoles' audio engine developer. I have also developed a dataflow mesh scripting engine which embeds Lua and runs natively on standard PC platforms. I know this stuff backwards, I can help, I can teach, and I really want to be able to use Codea for this - there are no alternatives.)

Thanks for reading this far, thanks for thinking about it, and especially thanks for acknowledging this post, as prior and less detailed posts have been ignored. Especially thanks if something comes out of it.

Fundamental access to audio input/output would be so awesome and welcome. I already have a couple of ideas where I could apply this: for example a podcasting app, or a static site generator for audio-driven blogs. We already have access to the speakers (although it would be nice to have more options there), but microphone input is still missing and would be endlessly great to have, even at the very low level (buffers).

I looked at Apple's APIs, and microphone input works in two parts: there is a recorder (start, stop, etc.) and a session (settings). You set up a session and then use the recorder to control it. Something like:

function setup()
    fileFormat = "mp3"  -- file suffix (mp3, wav, m4a ...)
    -- ...
    record_session = recorder(audio_settings)  -- call recorder class with custom audio settings (maybe someone has a better idea)
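To take that idea a step further, here is how the session/recorder split might feel in practice. Again, recorder(), the audio_settings fields and the :start()/:stop() methods are all placeholders for discussion, not real Codea calls.

-- Hypothetical recording API, expanding the sketch above; nothing here exists in Codea yet.
function setup()
    audio_settings = {
        fileFormat = "m4a",   -- file suffix (mp3, wav, m4a ...)
        sampleRate = 44100,   -- Hz
        channels = 1          -- mono microphone input
    }
    record_session = recorder(audio_settings)  -- configure a session, get a recorder back
    recording = false
end

function touched(touch)
    -- tap anywhere to toggle recording
    if touch.state == ENDED then
        if recording then
            record_session:stop()
            print("saved " .. record_session.path)  -- wherever the session wrote its file
        else
            record_session:start()
        end
        recording = not recording
    end
end

A buffer-level variant (a callback handed raw sample frames) could sit underneath the same session object for the low-level use mentioned above.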
