How-To for using External MIDI Instrument
Hi All—I’m loving NS2, and really want to be able to use some of the other synth apps installed on my iPad with it. It seems the External MIDI instrument is the way to do this...or not?
For example, I’d like to use AudioKit Synth One. I open the app, then create an External MIDI instrument track in NS2. Then I select AudioKit Synth One on the Settings page...but I can’t hear Synth One in NS2.
Do I need AudioBus for this?
Thanks!
Scott
Comments
Ugh...I bought AudioBus 3 and am even more confused. It still isn’t working, and I’m sure it’s something simple I’m missing. I don’t know whether to select AudioBus or SynthOne in the Settings page for NS2’s External MIDI Instrument?
For what it’s worth, both the NS2 manual and the AudioBus manual are missing examples of real-world usage: for example, it would be great to have a manual section called “EXAMPLE: How to record AudioKit Synth One in NanoStudio using AudioBus” (or something like that).
Any help is much appreciated!
Scott
Trying to help myself here! 😃
So I’ve played a bit with NS2’s AU Instrument. I have Sunrizer, and pretty easily figured out how to use it inside NS2. Great!
Except...I have several other synth apps that I’m dying to use the same way (like all the AudioKit apps, Animoog, etc.), and I’m guessing none of those are AU apps because they don’t appear inside NS2.
Which I guess means if there is a way to use those other apps inside NS2, I need AudioBus. Which I now have...but I’d love a simple step-by-step on how to use non-AU apps inside NS2 with AudioBus.
Thanks!
Scott
AudioKit Synth One / D1 will soon be updated for AU compatibility .. no ETA, but I think they are now in the beta testing phase ...
AudioBus in NS can be used for 2 things:
1/ routing audio OUT of NS into other apps
2/ sampling audio from other apps INTO NS (into an Obsidian Sampler oscillator or onto a Slate pad)
Thanks Dendy. I’m really excited to hear the AudioKit synths are being updated to AU.
Now that I spent $$ on all those synths, I’m thinking I now need to spend more $$ on AU synths to use in NS2! 🙂
@SWriverstone What you can do now is run Synth One in the Input slot (top) in AudioBus, side by side with Nanostudio also as an Input (next row, top).
Yes, use an External MIDI track in Nanostudio (the MIDI Output icon is at the top of the track there) to play and record notes from the track's keyboard.
Explore the Audiokit synth together with Nanostudio sounds by tapping 'Send' on the side of the external MIDI track ('add MIDI send [+]') and selecting any instrument ...
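Under the hood, an External MIDI track like the one described above is just sending standard MIDI channel voice messages to the other app. As a rough illustrative sketch (this is the MIDI 1.0 wire format, not NS2's actual code), here is what a Note On / Note Off pair looks like as bytes:

```python
# Illustrative sketch of the MIDI messages an External MIDI track sends.
# Status byte = message type (high nibble) | channel (low nibble, 0-15);
# the two data bytes (note number, velocity) are 7-bit values.

def note_on(channel, note, velocity):
    """Build a 3-byte MIDI Note On message."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def note_off(channel, note):
    """Build a 3-byte MIDI Note Off message (velocity 0)."""
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0])

# Middle C (MIDI note 60) on channel 1 at velocity 100:
msg = note_on(0, 60, 100)
print(msg.hex())  # 903c64
```

This is why the MIDI channel settings in both apps have to agree: the receiving synth only responds to messages whose status byte carries the channel it is listening on.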
Spend some quality time with Obsidian. You may find you don’t need those other synths! I use KQ Dixie and SynthMaster One to some extent but can usually get what I need from Obsidian with the benefit of more efficient CPU usage. Check out the IAP’s and user patches on this forum. You’ll find some really excellent and useful stuff! Also, you don’t have to mess with third party apps so much as great as most of them are.
Agree with @anickt 100%. It’s quite an amazing synth. Plus it’s so resource friendly you can easily create many instances and still have great performance on your iPad. I’ve learned a lot about how it works (but not enough yet, haha) by going through presets and tweaking them to get what I want. And you’ll have experts available here like @dendy that are happy to answer questions.
Fixed your post for you.
OK, I guess it would be me that gets this warning:
“Body is 1605 characters too long”
Can anything be done about that @Will?
Multiple posting ensues..
(The history is being simplified for clarity - not intended to document all the details and variations)
It sounds like you may not have been on board for this decade-plus journey, so forgive me if I cover stuff you already know. A lot of us may forget that there are people who are new to the iOS music production environment. It is confusing because the evolution of the platform shows that Apple didn’t have a plan for music production and probably assumed that musicians would use a laptop to compose & record. Initially, music apps were thought of as ‘toss-a-dollar-away toys’, like the Ocarina that you controlled by blowing into the mic. Simple synthesizers started to appear with limited functionality, Beatmaker allowed you to make simple beats using samples, and Nanostudio came along with a fully featured synth, a mixer with FX, and a great sequencer with an easy-to-use piano roll. I don’t like GB so I will omit its part in the history.
In the early days there was no way for music apps to directly integrate. All we had was Copy/Paste: you recorded a note, chord, riff, whatever, in one app, then copied and pasted the sound into another app, such as NS1. There was no MIDI, no routing of audio between apps in anything like real time, and not even a hint of AU. Slowly apps started adding features as the desire grew among iOS users to create finished music inside iOS. MIDI appeared in the form of MIDI In, so you could use a real controller instead of just the on-screen keys, and AudioBus was created to solve the problem of routing audio from synths & drum machines to apps that could record audio. I was using NS1, but other DAW-like apps were adding AB functionality when Apple decided they would regularly change iOS in ways that broke some of these features unless the devs put in lots of extra development work. Apple introduced IAA (Inter-App Audio) as a method of routing audio, and MIDI started to become a common feature in apps. AB3 added MIDI alongside audio, and then we suddenly got AUv3 technology dumped into iOS, with loose guidelines for devs on how it should be implemented.
So the current situation is a bit of a mess with older and newer technologies. The apps that were successful enough to maintain development had features like AU added to them (except stubborn companies like Korg) or will hopefully be added to them soon. Other apps may slowly die because they are not actively being updated. Cherished apps like Nave don’t have AU functionality. I’d pay for that BTW Waldorf.
All this means there are a lot of potential ways of doing things, and each of us gradually works out a workflow that makes the most sense to us, but usually isn’t perfect. Therefore we also suffer from app addiction, because we are always hoping for a new app that will solve some of our problems or add features we want. Or just add more sounds. NS2 is in many ways a great all-in-one tool, and it is worthwhile to make music using only NS2 until you have it down.
When you want to add the sounds of other apps, you have to find a way based on the features of the apps you are using. There are intermediary apps that can help: AudioBus, AUM, and AudioShare are popular ones. I like AudioShare, and it is easy to use from within NS2 since the ‘Copy From’ functionality is built into NS2. Basic AB also allows you to bring audio into NS2. No one way is more right; it depends on how you want to work.
Also, there is the question of what you intend to do with the audio. You can record a series of notes in an app and use those notes in Obsidian to create a sample-based patch. This can simplify the composing process. SynthJacker is an app that speeds up this process when making a lot of samples, but for starters, do it manually to get the concepts down. A lot of apps have the older IAA functionality, so it is easy to record an older app like Animoog right into AudioShare. AudioShare is organized with a standard folder/file structure so you can organize samples. I have folders for each app, and anytime I want to use those samples I can find them in AudioShare.
To create an Obsidian patch, select Edit, then for the OSC Type select Sample. To the right of that, select Load Sample > Library > ••• (upper right hand) > Import > AudioShare > select sample > Import into App. Repeat for the number of samples you want. When you have the samples in the NS2 Library, you can load them into the Sample OSC one at a time and decide what range of the keyboard will be covered by each sample, or, if the sample name ends with the note name, try using Automap Samples to load all the samples in one go.
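The note-name-at-the-end convention mentioned above (e.g. a file called `EPiano_C4.wav`) is what makes automapping possible: the root pitch can be parsed straight out of the filename. A hypothetical sketch of that parsing step, assuming the common convention where C4 = MIDI note 60 (NS2's actual mapper may differ):

```python
# Hypothetical sketch: recover a sample's root MIDI note from a
# filename ending in a note name, e.g. "EPiano_C4.wav" -> 60.

import re

NOTES = {'C': 0, 'C#': 1, 'D': 2, 'D#': 3, 'E': 4, 'F': 5,
         'F#': 6, 'G': 7, 'G#': 8, 'A': 9, 'A#': 10, 'B': 11}

def root_from_filename(filename):
    """Return the MIDI note number encoded at the end of a sample
    name, or None if the name carries no note suffix."""
    m = re.search(r'([A-G]#?)(-?\d+)\.\w+$', filename)
    if not m:
        return None
    name, octave = m.group(1), int(m.group(2))
    return NOTES[name] + (octave + 1) * 12  # C4 = 60 convention

print(root_from_filename('EPiano_C4.wav'))   # 60
print(root_from_filename('EPiano_A#2.wav'))  # 46
```

If your samples use flats or a different octave numbering (some tools treat C3 as middle C), the names won't map the way you expect, which is worth checking before blaming the automapper.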
If you just want some chords, let's say, or a riff, it is easy to use audio in NS2 as Audio Clips in Slate (eventually real audio tracks will be an IAP feature in NS2). You can do this the same way using AudioShare, which is nice because you then have a central spot to store samples, or you can use AB. Open AB and select Animoog as the input and NS2 as the output. In NS2, create a blank Slate instance and select Pad 1. On the left side select Sample and you will see the Rec button to record audio into that pad. Press the Rec button and you will see the Record Input page. You could press Rec here manually to record something with the iPad's mic or an external mic, but since we are using AB, we won't.
Go to Animoog and you will see a tab on the left-hand side. Select the NS2 icon, and at the bottom there will be a Rec button. Select that and it will turn red. Play a chord or riff in Animoog. With full AB integration you could press Rec again and recording would stop; when using NS2 you will have to go back to NS2 and press the Pause button. Then select the ✔️ in the upper left-hand side and you will see the graphic of the audio. You can zoom in and trim the audio to just what you want. Put the line at the very start of the sample, then press: Select > Earlier > Delete. At the end of the sample, press: Select > Later > Delete. When done, press the X in the upper left-hand corner to save the audio. Now when you press the pad, the chord or riff will play. You can trigger this in the piano roll editor to use it in your composition.
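The two trim steps above (Select > Earlier > Delete, then Select > Later > Delete) amount to keeping only the audio between your chosen start and end points. As a conceptual sketch (not NS2's implementation), on a raw buffer of samples that is just a slice:

```python
# Conceptual sketch of the trim operation: keep only the audio
# between a start time and an end time (both in seconds).

def trim(samples, sample_rate, start_sec, end_sec):
    """Return the slice of a mono sample buffer between two times."""
    start = int(start_sec * sample_rate)
    end = int(end_sec * sample_rate)
    return samples[start:end]

# A 1-second buffer at 8 kHz trimmed to its middle half second:
buf = [0.0] * 8000
clip = trim(buf, 8000, 0.25, 0.75)
print(len(clip))  # 4000
```

Trimming tightly matters because the pad plays from the very first sample: any leading silence becomes a delay every time the clip is triggered from the piano roll.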
You may also play and record another app from NS2 by using MIDI. Let's say it is a piano part. Create an External MIDI track. In the piano app you may have to go through its settings to sync MIDI channels or select Background Audio. Once you get the sequence of that part to your liking, you still need to get the audio into NS2 (for most uses). In that case you can record the audio into AudioShare, or use AB to record the audio directly into Slate. The only difference is that you will trigger the MIDI notes by pressing Play in NS2. Once the part is done playing, stop the recording and trim the audio as before.
You can use FX in AB chains. IAA/AB-only FX apps are still around, so you'll be limited with those. Almost all FX apps sold today are AUv3 and can be used in NS2's mixer, either with AUv3 synth apps or with Obsidian & Slate.
Technology marches on, and all of this long-winded post has become obsolete. Now you can use AUv3 synths, which can, when they behave properly, be used just like an instance of Obsidian or Slate. The limits are based on hardware and the CPU demand of the AU, which can vary greatly. Obsidian and Slate are light on CPU demand, so you can do a lot with those.
AUv3 apps in iOS cost a fraction of what the same-sounding synth would be for PC/Mac, so relish it and buy a lot of AU synths to support the platform.
Once I am happy with a part that uses an AU synth, I like to record it to audio on a Slate pad. This keeps CPU demand under control, but also ensures that future changes don't ruin the part. For example, if I forget that This Song uses a certain patch and I accidentally change that patch for another project, now This Song sounds funny. Also, I never know when a change to iOS will make an app unavailable. The 32 > 64 bit change in iOS made all 32-bit apps stop working; if I had relied on Alchemy for a part in a song, there would now be silence. As IAA is being phased out of iOS, there may be a day, perhaps 2 years from now, when IAA apps no longer work, and MIDI parts that play an IAA app will be silent. Using Audio Clips will help prevent future headaches, but for now AUv3 apps seem to be reliable. Don't worry, there is a team at Apple trying to figure out a way to screw it up, but not any time soon.
‘Committing to tape’ is a standard recording-industry technique. Audio Clips in Slate are similar, and audio tracks will be better, but in the end the only versions of your music that exist safely are the ones on a master tape or on printed paper. In the real world. Digitally... back it up in multiple locations (physical or cloud) or risk losing it.
Is there any sort of award for the longest post on the forum?
No? I thought not.... 🙁
@SlapHappy
(consider it an award ))
Thanks! I’ll frame it. 😊😊