Heads up on GeoShred issue
GeoShred is now an AU.
This is great, but may put the spotlight back on AU midi recording (if I’m correct).
GeoShred is now recordable from within Cubasis and Auria Pro, but NS2, BM3 and SL are all unable to record the output from its interface. My guess is that this is because those three apps don't yet have the ability to record AU MIDI, but it is only a guess.
GeoShred has some loading issues within NS2 (often will not load and NS2 reports a crash of the AU) and BM3 (sometimes takes an age to load the AU).
I have reported these issues to the developer of GeoShred.
Comments
Maybe this is somehow connected with the MIDI FX recording issue (I think).
No AU synth with its own keyboard (Primer, Sunrizer) records notes into the NS2 sequencer when you play on that keyboard. This is not a GeoShred issue; it's down to the NS2 implementation. Like you said, it's the same in BM3, for example...
Yes, I guessed it was an AU MIDI issue (MIDI FX), as it's the same for SL (no AU MIDI) and for BM3 and NS2 (no recording from AU MIDI).
As Brambos has posted here before, AU MIDI has undergone some changes, hence the new AU FX moniker used now. When most people say AU MIDI or AU FX, they actually just mean MIDI coming from the AU itself. The technicalities of the different AU specs are beyond most of us - that's for the devs to work their way through, and eventually some semblance of AU MIDI out, in and thru will become standard across all apps, so that us mere mortals can just record these apps lol
I had not tried GeoShred with NS2 until today, after updating to the latest version. As cool as its patches are, I generally only use it as an advanced MIDI controller. I am unable to get Obsidian to respond to GeoShred's MIDI output. GeoShred does show up in the MIDI input list, and it does send MIDI (verified with MidiWrench, and also by sending MIDI to my iWaveStation - all is OK). I have background audio enabled in NS2, and I've also tested another software MIDI controller, MusixPro, with Obsidian on the same patch, and it works as expected.
So my question for @Fruitbat1919 is this: are you able to use GeoShred purely as a MIDI input device with NS2? If so, how are you setting it up?
I've not tried that, to be honest. I use GeoShred as an instrument, but I can see the advantages of using it as a MIDI controller too. I will try it in the NS2 MIDI FX slot, but as you can't record MIDI from that slot as yet, it's unlikely to be much use. As for sending in external MIDI, I haven't been doing that with NS2 for two reasons: 1. I find using external apps to send MIDI to hosts cumbersome for my recording purposes. 2. Without Audiobus switching, external app use is again not fun or easy for those of us who prefer to look at a timeline while recording.
AU devices like GeoShred, with their great touch-screen-centric input surfaces, are without doubt one of the benefits of iOS (or any touch-screen device) that pulled me towards iOS in the beginning. iOS had some of the best touch-screen music-making apps available. It's just a matter of time before we see more of these (or new ones) become AU touch-screen apps. This is the future, and DAWs / hosts need to be ready.
I think it's a pretty grey area for host developers about where they should record MIDI from (well it is for me anyway!). At the moment, MIDI is recorded from the input going to the AU rather than the output coming from it.
I can definitely see the benefit in recording the MIDI output instead (particularly for AUs with unique UIs such as GeoShred), so I've made a note to look into this. What concerns me is that if I switch to recording the output, what if some AUs don't output MIDI events from their UIs? Then recording doesn't work at all, which would be even more serious.
Alternatively, what if they have internal sequencers or arpeggiators which output a whole set of notes rather than the original single note input which triggered them?
It will definitely need some research!
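For anyone curious what "recording the output" would actually involve on the host side: iOS exposes an AU's MIDI output through AUAudioUnit's midiOutputEventBlock (iOS 11+). Here's a minimal sketch - the attachMIDIOutputTap function and the record callback are purely illustrative, not NS2's internals:

```swift
import AudioToolbox

// A minimal sketch of tapping an AU's MIDI output.
// `record` is a hypothetical callback; a real host would push these events
// into a lock-free FIFO rather than calling out from the render thread.
func attachMIDIOutputTap(to audioUnit: AUAudioUnit,
                         record: @escaping (AUEventSampleTime, [UInt8]) -> Void) -> Bool {
    // Not every AU exposes a MIDI output; if this list is empty the tap
    // captures nothing - exactly the compatibility worry described above.
    guard !audioUnit.midiOutputNames.isEmpty else { return false }

    audioUnit.midiOutputEventBlock = { sampleTime, cable, length, bytes in
        // Copy the raw MIDI bytes out (cable = the AU's virtual port index).
        let data = [UInt8](UnsafeBufferPointer(start: bytes, count: length))
        record(sampleTime, data)
        return noErr
    }
    return true
}
```

The guard illustrates the concern above: when midiOutputNames is empty, an output-only recording scheme would capture nothing, so the host would need to fall back to recording its own input for those AUs.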
Yeah, I was grokking the development fun these new MIDI generators have introduced for the guys like you writing the hosts. I read comments like "Logic already does this" or "Ableton yada yada", and while that's true, it's a paradigm shift for the current iOS landscape. How best to implement recording it is still forming while devs are still sussing out the hows on the plugins themselves. If everybody just implemented it exactly like @brambos does, for example, then there's your standard... but they just aren't.
Tough problem to solve.
It would have been great if Apple had laid down the law with this but AFAIK there's no major overview, leaving it to be a bit of a battleground. I'm definitely scared of breaking more than I fix with radical changes like this.
Of course it could be done with a user setting (e.g. something similar to a MIDI thru toggle button), but I don't know if this would just make working towards a common implementation even more error-prone.
The second solution is to do something special for certain plugins, but this puts workarounds into the host, which means it would be susceptible to future AU implementation changes and require constant updates.
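As a sketch of what the user-setting approach might look like (purely illustrative - none of these names come from NS2 or any shipping host), the per-track choice could be as simple as:

```swift
// Hypothetical per-track setting, mirroring the MIDI-thru toggle idea above.
enum MIDIRecordSource: String, Codable {
    case pluginInput   // record what the host sends to the AU (today's behaviour)
    case pluginOutput  // record what the AU's UI, arp or sequencer emits
}

struct TrackMIDISettings: Codable {
    var recordSource: MIDIRecordSource = .pluginInput  // safe default
    var midiThru = true
}
```

Defaulting to pluginInput would keep existing projects behaving exactly as they do now, while letting GeoShred-style AUs opt in to output recording.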
Yes, I can well imagine it's a minefield. All areas of AU MIDI are changing the way we see hosts.
The future of hosts will be interesting. Hosts on iOS need to be different imo for a few reasons:
The touch screen opens up much more in the way we can input our MIDI data. To get the best from this, hosts will need to be more creative with their data pipelines too. This can differentiate iOS music making from PC / Mac and help attract new users and uses.
The iOS marketplace is not ideal for large app projects such as DAWs and large-scale hosts, and as such needs new ways to share the load. AU MIDI (instruments, FX etc.) applications could be one way to achieve this: in essence, certain features many PC / Mac DAWs have built in could be added by other devs, if the MIDI routing is set up to allow it. Arps, new designs for key input, new controllers etc. can all be add-ons. The fixed host keyboards need replacing with AU MIDI slots, imo.
Just a few thoughts
The complexity reminds me of the old days of complex MIDI routing. With iOS apps at this time, you may want three settings:
Just to be clear: in my experiments with GeoShred as a MIDI controller, I'm not using it in NS2 as an AU; I'm running it externally as a stand-alone app, the same way I use MusixPro, so it should just be another MIDI input source (and it does show up in NS2 on the MIDI input panel for Obsidian). That one has me puzzled...
I think the AU spec should have required all AUs to have MIDI in, out and thru ports in place.
My thoughts too. That was the way with the best hardware MIDI implementations. But it seems to me it would be best for hosts to implement that in the wrapper instead.
Where to record the MIDI is a question I never really thought about, though. I guess a pre/post switch, along with the MIDI out/thru selection in the host wrapper, is best, IMO.
That way, if you wanted a track that drives an arpeggiator and lets the arpeggiator provide the output, you could; but if you wanted to take the arpeggiator out of the equation by recording its output instead, you could do that too.
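To make the pre/post idea concrete (a hypothetical sketch - these types are made up for illustration, not a real NS2 API): if the host captured both streams while recording, committing the take would just mean choosing which one lands on the track:

```swift
// Hypothetical event type and pre/post switch, for illustration only.
struct RecordedMIDIEvent {
    var sampleTime: Int64   // when the event occurred
    var bytes: [UInt8]      // the raw MIDI message
}

enum RecordTap { case pre, post }

// `pre` is what the player sent into the AU (e.g. one held note);
// `post` is what the AU emitted (e.g. the arpeggiator's stream of notes).
func commitTake(pre: [RecordedMIDIEvent],
                post: [RecordedMIDIEvent],
                tap: RecordTap) -> [RecordedMIDIEvent] {
    switch tap {
    case .pre:  return pre    // keep the arp live: it re-runs on playback
    case .post: return post   // bake the arp's notes into the track
    }
}
```

Recording "pre" keeps the arpeggiator in the playback chain; recording "post" freezes its output - exactly the trade-off described above.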
Implementing it in the host could settle the question more easily than trying to "herd the cats".
Just my thoughts. (Non expert, non developer, etc. etc.)