057 iPhreaks Show - MIDI


The panelists discuss MIDI.

Transcript

[Would you like to join the conversation with the iPhreaks and their guests? Want to support the show? We have a forum that allows you to join the conversation and support the show at the same time. You can sign up at iphreaksshow.com/forum] CHUCK: Hey everybody and welcome to episode 57 of the iPhreaks Show. This week on our panel we have Andrew Madsen. ANDREW: Hi, from Salt Lake City. CHUCK: Jaim Zuber. JAIM: Hello, from Minneapolis. CHUCK: I'm Charles Max Wood from DevChat.tv, and this week we're going to be talking about working with MIDI, which is the Music-something-something, I think. Andrew can tell us a little better than that. ANDREW: Yeah, so MIDI is originally a standard for connecting electronic musical instruments together. You could hook your keyboard to a synthesizer and play MIDI – that was in the early '80s – and it's really used for the same thing now. It might be a way to hook a piano or keyboard up to your iPad and play into GarageBand and make recordings. CHUCK: That's really kind of interesting. Is MIDI just a protocol? Is it a binary protocol, or is it something else? ANDREW: Yeah, MIDI is a communications protocol. Fundamentally, you just have MIDI events, and a MIDI event might be – for example, one of the most common MIDI events is Note On. That's an event that's sent when a key on a piano keyboard is pressed down, and it tells the device on the other end, 'the user just pushed this key, start playing this note,' and then you get another message, Note Off, when they let the key go. Over the years a lot of other messages have been added to the standard, but those are still the two most common and most important. It is a binary protocol, and it's quite simple, actually – typical MIDI messages are two or three bytes long – and messages come in in real time, and you can do whatever you want with them. But one distinction that's important to understand is that MIDI is not an audio protocol. MIDI is not for transmitting audio – it's for transmitting events, these musical data events – and turning those into audio is a separate step. You get a long string of notes – that's MIDI – and then you can send those into a synthesizer that actually turns them into sound. The synthesizer can sound like anything, basically. It can sound like a piano, it can sound like a guitar – it can sound like any sound you can come up with – so it's not an audio protocol; it's for musical events. CHUCK: I think I kind of understand what you're saying. You have an event which is basically 'this note is playing'; is there some kind of duration on that? I don't quite follow. What are the properties of an event? Maybe that's the best question to ask. ANDREW: Yeah, that is a good question. To answer your first question: in the MIDI that's used to connect devices together, there's not a duration. You get two events – when you press the key down you get a Note On event, and later, when the key is released, you get a Note Off event – and that's how you figure out the duration; it's just the time between those two events. There is some other data in a Note On event that's useful. You get the note in question, and that's like the number of the piano key: if you start on a piano all the way to the left – I don't play piano, so off the top of my head I don't even know what note it is – that would be note zero.
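As a concrete illustration of the events just described, here is a minimal sketch of the raw bytes behind Note On and Note Off. The status byte carries the message type in its high nibble and the channel in its low nibble; the two data bytes are the note number and the velocity. Plain C, runnable as-is.

```objc
#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint8_t channel = 0;                               // channels are 0-15 on the wire
    uint8_t noteOn[3]  = { 0x90 | channel, 60, 100 };  // Note On: middle C, velocity 100
    uint8_t noteOff[3] = { 0x80 | channel, 60, 0 };    // Note Off for the same key

    // Parsing works the same way in reverse: mask off the channel nibble.
    uint8_t type = noteOn[0] & 0xF0;                   // 0x90 == Note On
    printf("type=0x%X note=%d velocity=%d\n", type, noteOn[1], noteOn[2]);
    printf("off: type=0x%X note=%d\n", noteOff[0] & 0xF0, noteOff[1]);
    return 0;
}
```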
They just count up as you go up the piano. You also get a velocity, which is how hard the key was pressed, and you can use velocity to change the sound – the harder a key is pressed, the louder you probably want the sound to be, for example. And there are other events besides note events. They all have data that's specific to that event. You can get pitch wheel events – a lot of electronic keyboards have a pitch wheel; it's a wheel you turn to bend the pitch a little bit, kind of like a whammy bar on a guitar. There are other events like that – useful and applicable to a lot of different kinds of music and musical instruments – that have been added to the spec. Does that answer your question? CHUCK: Yeah, I think so. I am a little curious, though, about other instruments that have a range that's different from the piano. For example, if I'm playing the tuba or something, then maybe it goes lower than a piano does – maybe that doesn't happen, I don't know – but I'm assuming there are instruments that go higher or lower than the piano. How do you manage those? ANDREW: That's a good question. MIDI data bytes are actually seven-bit, so there are 128 notes in MIDI, and there are only 88 keys on a piano keyboard, so it extends past what a piano keyboard can do. I don't know the exact limits, and I've never seen a MIDI tuba, but I actually have MIDI bagpipes, so I wouldn't be too surprised if there's a MIDI tuba. One thing, of course, is that music is mathematical. A C is a C, and middle C is middle C, but really, the only difference between middle C and the C an octave down is convention, right? You could say you want the MIDI note for middle C to represent a C five octaves down if you wanted to. Some keyboards – like, I have a little MIDI keyboard that only has two octaves on it – let you change which MIDI notes they send: you can shift it down by several octaves so you can play the low end or the high end, or anywhere in between, even though you've only got two octaves of piano keys. All it has to do is shift the MIDI notes it's sending up or down. CHUCK: That makes sense. What do people use this for, then? Do they just mix their own sound? Because you said you plug an instrument into your iPad, and then you record it, and then you record another track with another instrument – can you do that kind of thing, or do people do different things with it? ANDREW: That's one of the really big uses. You could hook up your keyboard to GarageBand and play the piano track with the keyboard, but then you could also add another track. You're still playing the piano, but because it's just MIDI – it's not sound – you could make that track a guitar track in GarageBand, or whatever app you're using. Yeah, it's definitely used for multi-track recording: any time you want to record – not audio, but music that you're composing on a computer or an iOS device – you use MIDI. The reason I've gotten into using it is that at Mixed In Key we make software for DJs, and all of the DJ controllers that you see – controllers with two sort of fake turntables and a bunch of knobs and sliders and buttons for a DJ to scratch and mix live – those actually use MIDI too.
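The note-number math Andrew describes is easy to make concrete: note numbers run 0 to 127, one octave is 12 semitones, and by convention note 69 is A4 at 440 Hz. A small sketch of converting a note number to a frequency and shifting octaves, using the standard equal-temperament formula:

```objc
#include <math.h>
#include <stdio.h>

// Equal temperament: note 69 = A4 = 440 Hz, 12 semitones per octave.
static double frequencyForNote(int note) {
    return 440.0 * pow(2.0, (note - 69) / 12.0);
}

// Shifting an octave is just adding or subtracting 12, clamped to MIDI's range.
static int transposeOctaves(int note, int octaves) {
    int shifted = note + 12 * octaves;
    return shifted < 0 ? 0 : (shifted > 127 ? 127 : shifted);
}

int main(void) {
    printf("middle C (60): %.2f Hz\n", frequencyForNote(60));        // ~261.63 Hz
    printf("one octave down: note %d\n", transposeOctaves(60, -1));  // 48
    return 0;
}
```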
They don't send note events; they send a different kind of event called a Control Change event, which is used for buttons and sliders and knobs and that sort of thing. That's another thing MIDI is used for: control of music apps. There's really no reason you couldn't use MIDI for all kinds of different control. You could even, in theory, use it to control lighting at a stage show – I don't think that's typically how it's done, but it would be feasible. It's really just a way to connect musical devices together. CHUCK: So what do you use to do this kind of work on iOS? Let's say I wanted to write something that could record or play back MIDI. Are there pods or libraries or things you can use for that? ANDREW: Yeah, funny you should ask. I have a pod called MIKMIDI that we use in our own apps; it's open source, we wrote it, and it really simplifies some aspects of MIDI. But fundamentally, the framework you use is called Core MIDI, and it's available on both iOS and OS X, and it's identical between the two. Core MIDI allows you to find devices that are plugged into the system – the OS [inaudible] actually doing the global communication with the MIDI device, but you can see which devices are connected, connect to them, receive events from them, and send events to them. There are also parts – not technically in Core MIDI, but in Core Audio and Audio Toolbox – that are for playing MIDI. Core MIDI handles the device stuff, talking to devices, and then Core Audio and Audio Toolbox handle playing MIDI, saving MIDI files, and loading MIDI files. You can save MIDI data in a file and it can be played back later; there's a standard format for that. The problem is that all three of those frameworks – Core MIDI, Audio Toolbox and Core Audio – are pure C, and they're not known for being particularly easy to use. They're not really that well documented; if you open up the documentation for Core MIDI, you basically just get a list of all the functions in it and not a lot of help with what it all means and how it fits together. The same is true of Core Audio, and I think most iOS programmers who have any experience with Core Audio know how difficult that can be; Core MIDI is the same way. But fundamentally, it's not that complicated. Core MIDI is a very consistent, well-written framework, so once you get into it, it's not too bad. MIKMIDI – and there are other libraries out there – is an Objective-C wrapper for Core MIDI. It makes it, I think, much easier to deal with. It's much higher level, it's documented, and you don't have to drop down to pure C. That's what's out there. CHUCK: Did you write MIKMIDI, or [inaudible]? ANDREW: Yeah, I did, I wrote it. JAIM: Let's talk about wiring this up in an app – what are the steps you go through? ANDREW: That's a good question. The first step – and I'm speaking right now about devices, because files and devices are sort of separate in MIDI; they can be connected together, but they're separate parts of the API – the first thing you need to do is enumerate the devices that are connected to the system. There are functions for that, or in MIKMIDI there's a MIKMIDIDeviceManager that you can get a list of all the available devices from.
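For illustration, here is a minimal sketch of that enumeration step using the raw Core MIDI C calls, roughly what MIKMIDI's device manager wraps for you. Error handling is trimmed for brevity.

```objc
#import <CoreMIDI/CoreMIDI.h>
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        MIDIClientRef client;
        MIDIClientCreate(CFSTR("Enumerator"), NULL, NULL, &client);

        ItemCount count = MIDIGetNumberOfDevices();
        for (ItemCount i = 0; i < count; i++) {
            MIDIDeviceRef device = MIDIGetDevice(i);
            CFStringRef name = NULL;
            // kMIDIPropertyName is the (possibly user-edited) device name.
            MIDIObjectGetStringProperty(device, kMIDIPropertyName, &name);
            NSLog(@"Device %lu: %@", (unsigned long)i, (__bridge NSString *)name);
            if (name) CFRelease(name);
        }
    }
    return 0;
}
```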
Each device gives you some information about it, so you can get the name of the device, and that'll be the model name – it's whatever the manufacturer set, and it's actually user-configurable on OS X. Once you get a device, you connect to it, and connecting really just means you're telling the API what you want: you can connect in receive mode, and you can also send messages. Connecting in receive mode just means, 'I want to be notified any time this device sends an event.' You set up a callback in Core MIDI, and that callback, which is a C function, gets called any time an event comes in. You basically get a pointer to a packet, and the main thing a packet has is the raw bytes of data that a MIDI message consists of – the data in MIDI messages is defined by the MIDI spec, and it's actually not that complicated. In MIKMIDI, you can just pass a block to the connection method, and that block will be called any time events are received. Responding to the event is completely up to you; it depends on what you want those events to be used for. In our apps, we're connecting to DJ controllers, and we basically want a knob on the DJ controller to control a knob that's on the screen, or the play button on the DJ controller to do the exact same thing as the play button in our UI. That's not really part of MIDI; that's just how you want your app to respond to MIDI. JAIM: So for your application, are you taking DJ events and transferring them to MIDI and replaying them? Is that how it's working? ANDREW: No, actually, we don't replay them. The DJ uses them to control the app live on stage, just because it's more intuitive and easier to control music the way a DJ does – using physical knobs and the big jog-wheel, turntable-like things – rather than a mouse and keyboard, which is really not ideal. A mouse is a little bit finicky, and you've got to have a pretty good view of the screen – if you're in a dark club, it doesn't work that well. It's a way to control the app with a much better-suited control interface. But we do actually send MIDI to those devices, and that's because most DJ controllers have lights in the buttons, and we want those lights to reflect the state of the app: if the app is currently playing back audio, the play button on the controller should be lit up to indicate that it's in play mode. So we do bidirectional communication, and in MIKMIDI there's just a method to send any message you want out to a device, and it's actually quite simple. To turn an LED on, you just send the same message that the button the LED is in would send if you pressed it – you send that to the controller instead, and it turns the light on and off. CHUCK: So these are just peripheral devices that hook into a Mac that the DJ uses to do stuff with the music playing? ANDREW: Yeah, exactly. I'll put a link to some in the show notes, but a lot of different companies make these and they look sort of like you'd expect. They have two turntables – I mean, they're not really vinyl turntables, but they look like a turntable you can spin – and sliders for volume and fading between two decks, buttons for play and pause, and EQ knobs for high, mid and low – that kind of thing.
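Both directions Andrew describes can be sketched with the raw Core MIDI calls: a C read callback for incoming packets, and sending a message back out, for example to light a button LED. The choice of the first source and destination is illustrative, and error handling is omitted.

```objc
#import <CoreMIDI/CoreMIDI.h>

// Called by Core MIDI whenever connected sources deliver events.
static void MyReadProc(const MIDIPacketList *pktlist, void *refCon, void *srcRefCon) {
    const MIDIPacket *packet = &pktlist->packet[0];
    for (UInt32 i = 0; i < pktlist->numPackets; i++) {
        const Byte *d = packet->data;           // raw bytes, as defined by the MIDI spec
        if ((d[0] & 0xF0) == 0x90 && d[2] > 0) {
            // Note On with nonzero velocity; respond however the app wants.
        }
        packet = MIDIPacketNext(packet);
    }
}

void connectAndLightLED(MIDIClientRef client) {
    MIDIPortRef inPort, outPort;
    MIDIInputPortCreate(client, CFSTR("In"), MyReadProc, NULL, &inPort);
    MIDIOutputPortCreate(client, CFSTR("Out"), &outPort);
    MIDIPortConnectSource(inPort, MIDIGetSource(0), NULL);  // first source, for the sketch

    // Send a Note On to the first destination, e.g. to turn on a button light.
    Byte noteOn[3] = { 0x90, 60, 127 };
    Byte buffer[64];
    MIDIPacketList *pktlist = (MIDIPacketList *)buffer;
    MIDIPacket *cur = MIDIPacketListInit(pktlist);
    MIDIPacketListAdd(pktlist, sizeof(buffer), cur, 0, sizeof(noteOn), noteOn);
    MIDISend(outPort, MIDIGetDestination(0), pktlist);
}
```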
They're just much better than a mouse and keyboard for DJing, the same way a piano player wouldn't play a computer keyboard but a piano – they want a keyboard that's like a piano keyboard. DJs want a surface that's like what they're used to, and it's inspired by turntables – vinyl DJing – but they've added to that, and it just sends MIDI. More recently – I don't know if this is interesting, but more recently I've been working on adding file support to MIKMIDI, and this is for another app that we're developing. That's actually musical MIDI – MIDI that has notes in it, and sequences of notes. The cool thing is, it's the same stuff. A MIDI file is basically just all those events that come from a keyboard, or whatever, saved out to a file, and then you can play them back as if they were being played by a keyboard. That's really quite powerful, because it means you can play a song in on your keyboard, and if you screw up part of it, you can just edit it – it's not audio; it's just the events that came from the keyboard. Or maybe your timing wasn't quite right; well, you can adjust your timing and fix that after you record. That's a pretty cool, powerful part of MIDI that I'm really just starting to work with heavily, but I'm impressed and having a lot of fun. CHUCK: So back in the day, before I had a smartphone, when I got MIDI ringtones, it was just sending a series of events to whatever interpreted them into a signal for the speaker? ANDREW: Yeah, exactly. You have something called a synthesizer, and a synthesizer takes MIDI events and turns them into sounds. They can be any kind of sound: they could be sampled sounds, like a recording of somebody's voice or a recording of a real musical instrument. They can be electronically synthesized sounds – you write a program that mimics the way a piano sounds. I think recently [inaudible] worked on physically modeled instruments, and that's really cool. That's where there's actually a physics simulation of a violin or something running on the computer, with all of the physical properties of a real violin – the kind of wood, the size, the way the air vibrates inside – in the model, and then MIDI events are turned into string plucks or whatever to make sound. That MIDI synthesizer part is really flexible – it's taking events and turning them into sound, and there are all kinds of synthesizers out there. You can find synthesizers that mimic just about any instrument in the world, and there are a bunch that are not real musical instruments – they're just electronic sounds. When you say synthesizer, people think of a synthesizer in music, which usually doesn't sound like a real instrument, right? They sound electronic. But a synthesizer here is just anything that turns MIDI into sound. CHUCK: So is that the way apps work when they take some kind of actions that you're doing and translate them into music – you do some interaction, and then it plays it back for you as music? It just captures those events and translates them to MIDI events? ANDREW: Well, that would be up to whoever implemented the app; that wouldn't be the only way to do it, but it would certainly be a good way, because that's what MIDI is designed for: to take events from a user and eventually turn them into musical sound.
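On the file side mentioned above, the Audio Toolbox C API can load a standard MIDI file and play it back through a built-in synthesizer. A minimal sketch, with error handling omitted; in a real app you would keep the player and sequence alive for the duration of playback.

```objc
#import <AudioToolbox/AudioToolbox.h>
#import <Foundation/Foundation.h>

void playMIDIFile(NSURL *fileURL) {
    // Load the standard MIDI file into a MusicSequence.
    MusicSequence sequence;
    NewMusicSequence(&sequence);
    MusicSequenceFileLoad(sequence, (__bridge CFURLRef)fileURL,
                          kMusicSequenceFile_MIDIType, 0);

    // A MusicPlayer drives the sequence; by default its events are
    // rendered through the system's built-in synthesizer.
    MusicPlayer player;
    NewMusicPlayer(&player);
    MusicPlayerSetSequence(player, sequence);
    MusicPlayerPreroll(player);
    MusicPlayerStart(player);
}
```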
I imagine there are apps that do that; certainly any app that receives MIDI input from a connected instrument does. There are probably more than you'd expect if you're not really in that world – like GarageBand, which is an Apple app that can take MIDI input from a device, and there are all kinds of other music apps on iOS that will connect to a MIDI device. Some new MIDI devices are made with iOS support, so you just plug them into the dock connector or Lightning port and they just work, but even a lot of plain USB MIDI devices will work with the Camera Connection Kit. MIDI has actually been supported for a long time on iOS, and there's quite an ecosystem of MIDI apps for iOS. And of course, on OS X, MIDI has been available forever – since the '80s. [Inaudible] not on OS X, but on the Mac, MIDI has a long history. JAIM: I'm a little curious about the DJ app you're doing. You're doing MIDI controls with things like knobs and faders and switching – like crossfading between one turntable and the next? Is that all done through MIDI? ANDREW: Right. Exactly. JAIM: Are things like scratching – is that something you're converting into MIDI events? ANDREW: We're going the other way – we're taking MIDI input and using it to control the app. But yeah, when the user scratches on the turntable of the controller, you just get a bunch of MIDI events that tell you that the wheel is turning, which direction it's turning, and how fast it's turning. You use that to control whatever it is your app does, but DJ apps can all connect to these MIDI controllers. There are MIDI controllers made by tons of different companies – I mean, there are hundreds of them – and so, yeah, we do that. It's been quite interesting, because of MIDI's history: the part of MIDI that's not for music – the controller stuff, the Control Change events I was talking about – is not really that strictly standardized. Each manufacturer does things the way they think they should be done, and they're not all exactly the same. It's been quite a challenge, actually, to support all the different controllers out there that people use, but fundamentally you still just get MIDI events. It's just a question of exactly what kinds of MIDI events you get and what they mean. As a real example: with some controllers, when you press a button, you get an event, and then when you let go of the button, you get a second event telling you the user let go. Other controllers just send an event when you press the button down and don't send another one when you let go, so you can't actually tell when the user released the button. That's a problem if you're expecting a second event for when a button is released – you might not get it, depending on what controller's hooked up. That's just an example of the kind of challenges we've had. But the music part of MIDI is much more standardized, with not so much variation between devices. JAIM: Okay, so you could theoretically replay everything that's happening – a DJ could record a set and replay it based on the MIDI information. ANDREW: Yeah, absolutely. That actually would not even be very difficult for us. I think we've thought about doing that; the app doesn't do it right now, but there's really no reason we couldn't.
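The Control Change messages those controllers send have a fixed wire format even though their meaning varies by manufacturer: status byte 0xB0 (plus channel), then a controller number and a value, each 0 to 127. A small sketch of decoding one; the handler block is illustrative.

```objc
#include <stdint.h>

typedef void (^ControlHandler)(uint8_t controller, uint8_t value);

// Decode a 3-byte Control Change message and hand it to the app.
void handleControlChange(const uint8_t bytes[3], ControlHandler handler) {
    if ((bytes[0] & 0xF0) == 0xB0) {    // Control Change on any channel
        uint8_t controller = bytes[1];  // which knob, button, or slider
        uint8_t value      = bytes[2];  // 0-127; jog wheels often send relative
                                        // values centered around 64
        handler(controller, value);
    }
}
```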
You just get events, which are just data – there's no reason you couldn't save them to a file and play them back through the exact same code that handles them coming in in the first place, and use them to replay exactly everything the user did – to replay their set. In some sense, our users are playing our app as if it were an instrument, right? They're using it to make music in real time, and that's really what MIDI is good at. JAIM: That's very cool. CHUCK: When would you want [inaudible] traditional recording, like into WAV format or whatever, as opposed to using MIDI, or vice versa? ANDREW: Well, obviously you can't use MIDI for vocals, for example, and you can't use MIDI to capture a really cool guitar effect that you got with your new guitar pedal. I think MIDI is really good when you're composing a song – you're recording a song that you're composing and that you want to be able to edit on the computer and change around. You can change the key and the scale really easily, right? It's just numbers. You want to be able to fix your timing, or play around with a melody – play a melody on your keyboard and then change it, play around with it on the computer – MIDI is great for that. But if you're actually trying to capture a sound – if you're recording your band practicing – well, then of course you want to record audio. And there's no reason you can't mix them. In GarageBand, for example, you might play your piano track with MIDI and record it with MIDI, and once you've got that done and it sounds good – you're playing it through a synthesizer so it sounds like a piano – you record the vocals with a microphone, so they coexist, and certainly they do coexist. When people are using MIDI for music, very often they also have tracks that are regular sound, recorded to a WAV file or an AIFF file. CHUCK: Very, very cool. Are there any things we wouldn't have thought to ask about that are interesting about MIDI? I believe Jaim is a musician, but I'm really not. JAIM: I've done some work with MIDI in some applications that do playback, and part of the formats they supported were MIDI formats. I'm kind of curious – if you [inaudible] listening for events and playing them back, how does your app do the synthesis, converting the MIDI events to actual sound? ANDREW: That's a really good question. In Core Audio – I don't want to digress too much into Core Audio, because that's a whole several-episode topic on its own – but Core Audio has these things called audio units. Audio units typically take sound in: some of them generate sound, some of them take sound in, process it, and output it, and then there are output units that just take sound in and send it to the speakers. But there are also some audio units built into Core Audio that are called instrument units, and those instrument units take MIDI as input and produce audio as output – those are synthesizers. I haven't gotten into this a lot, and I don't know off the top of my head how you change which sounds those audio units output – whether you want a piano, or a guitar, or a trumpet or whatever – but I do know there are lots of third parties that make synthesizers. There are companies where that's basically what they sell: MIDI instruments. Anyway, you use those just like you use any other audio unit in Core Audio, except that instead of feeding them audio, they receive MIDI.
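A sketch of that instrument-unit idea: an AUGraph wiring Apple's built-in synthesizer unit to the default output, then driving it with MusicDeviceMIDIEvent. This uses the OS X unit types; on iOS the usual choices would be the Sampler or MIDISynth subtypes and the RemoteIO output instead. Error handling is omitted.

```objc
#import <AudioToolbox/AudioToolbox.h>
#import <AudioUnit/AudioUnit.h>

void playANote(void) {
    AUGraph graph;
    AUNode synthNode, outNode;
    NewAUGraph(&graph);

    // Instrument unit: takes MIDI in, produces audio out.
    AudioComponentDescription synthDesc = {
        .componentType = kAudioUnitType_MusicDevice,
        .componentSubType = kAudioUnitSubType_DLSSynth,      // OS X built-in synth
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };
    // Output unit: sends audio to the speakers.
    AudioComponentDescription outDesc = {
        .componentType = kAudioUnitType_Output,
        .componentSubType = kAudioUnitSubType_DefaultOutput, // RemoteIO on iOS
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };
    AUGraphAddNode(graph, &synthDesc, &synthNode);
    AUGraphAddNode(graph, &outDesc, &outNode);
    AUGraphOpen(graph);
    AUGraphConnectNodeInput(graph, synthNode, 0, outNode, 0);
    AUGraphInitialize(graph);
    AUGraphStart(graph);

    // Feed the synthesizer a raw MIDI event: Note On, middle C, velocity 100.
    AudioUnit synthUnit;
    AUGraphNodeInfo(graph, synthNode, NULL, &synthUnit);
    MusicDeviceMIDIEvent(synthUnit, 0x90, 60, 100, 0);
}
```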
You give them the MIDI events you're receiving from your device, and they turn them into sound for you. JAIM: Do synthesizers generally build the sound from scratch – just a sine wave with some transformations on it? Or do they play recordings, like SoundFonts used to do, where someone actually recorded piano notes and they play those back? Do they do both, depending on the library? ANDREW: Yeah, depending on the instrument they definitely do both, so there are a lot of instruments that are just samples. I mean, you can find instruments that are – SoundFonts are still around, and you can use those with MIDI synthesizers. Some of them might –. CHUCK: What are SoundFonts? ANDREW: SoundFonts are – think of them like a text font, except that instead of being a way to draw a letter, they're specific sounds; there's a sound for each note. CHUCK: Okay. ANDREW: That might be, say, somebody with a Steinway who mic'ed it really well and recorded every single key on the piano, and when you play MIDI through that, it just plays the recordings that were made of that piano. But there are also synthesizers that do synthesis – audio synthesis is a huge topic and there's a lot to say about it, but you're exactly right: they might generate a sine wave, modulate it with another sine wave, and then send it through a filter. A lot of that is digital mimicry of the old analog synthesizers from the '60s, '70s and '80s. They can play recorded sounds, but there are also synthesizers that generate sounds from scratch. CHUCK: Interesting. JAIM: Okay. Are there any open source, free synthesizer libraries we can use? If I want to add some MIDI to a game I'm doing and I don't want to record it, can I just put in some notes and use a free library I can link into my app? ANDREW: Yeah, absolutely. There are lots of free ones out there that you can download. There are also quite a few synthesizer sounds just built in that come with Core Audio. There's this thing called – I think it's called General MIDI. I get confused because there's a thing called Standard MIDI and a thing called General MIDI, and they're different things even though they sound very similar. General MIDI is a set of sounds that any MIDI synthesizer should be able to make. For example, there's grand piano and electric piano, and synthesizers don't have to use the exact same sound, but it should be basically the same kind of thing. Everybody knows what an electric piano is supposed to sound like, so if a synthesizer has the electric piano sound, it might not be identical to every other synthesizer's, but it'll be pretty close and it'll be what the user expects. If you play a MIDI file that specifies General MIDI instruments, it'll sound basically like you'd expect on any MIDI synthesizer that implements them. So there are these built-in ones – they're just standard MIDI sounds – and there are lots you can get out there, free and paid; there's almost a whole industry around MIDI synthesizers, synthesizers that make certain sounds. A lot of synthesizers you can tweak as they generate sound: you can [inaudible] all kinds of settings – parameters you can change that alter how they make sounds. JAIM: That's pretty cool. I'm curious about the non-audio, non-music recording side of MIDI.
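Before moving on: the General MIDI sounds just mentioned are selected on the wire with a Program Change message (status 0xC0), where the data byte is the GM program number (0 is Acoustic Grand Piano, 4 is Electric Piano 1, and so on). A tiny sketch of building one:

```objc
#include <stdint.h>

// Builds the two-byte Program Change message into `out`.
// gmProgram is a General MIDI program number, 0-127.
void programChange(uint8_t channel, uint8_t gmProgram, uint8_t out[2]) {
    out[0] = 0xC0 | (channel & 0x0F);  // Program Change on the given channel
    out[1] = gmProgram & 0x7F;         // e.g. 0 = Acoustic Grand Piano
}
```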
What other applications are there for that? One thing I'd think of is automation for recording: you get into Pro Tools or whatever and you automate your faders to bring out one part – bring up a guitar part in this section – or adjust different things. That's one thing you can use MIDI for. Are there any kind of cool non-audio or non-music things people are doing with MIDI? ANDREW: I don't actually know off the top of my head. That's an interesting question. MIDI is definitely used the way you described – to automate parameters of a recording or playback system. Actually, Audio Toolbox – I think it's Audio Toolbox – one of the MIDI frameworks on iOS and OS X has support for recording audio unit parameters. This is not sound; this is the knobs and sliders on an audio unit – recording those to MIDI and then playing them back. Maybe you've got a flanger or something and you want to record what you did with it during a song; you can do that using MIDI and play it back. MIDI is made for musical things, but like I said before, fundamentally it's just data. Especially with the kinds of devices we connect to – the DJ controllers – it really is a box with a bunch of knobs and sliders and buttons on it, and you could very much use that to do anything you wanted. It would be sort of weird, but there's no reason you couldn't make it into a game controller, or into something that controls a robot, or whatever. You're really just getting data that tells you which buttons the user is pressing and which knobs they're turning, how far, and all that – the same stuff you'd want from any hardware controller. CHUCK: I went on Wikipedia – I'll put the link in the show notes – and it lists a bunch of things people use it for: show control, theater lighting, special effects, sound design, console automation, recording system synchronization, audio processor control, computer animation. It even lists computer networking, as demonstrated in 1987 by the early first-person shooter game MIDI Maze, and animatronic figure control. JAIM: I think anything that correlates time-based input could be done with this. I wonder if you could do a remote-controlled helicopter and replay your flight, going around in circles or something. That would be kind of cool to play with. ANDREW: That would be cool, and I'm glad you found that, Chuck, because I hadn't seen it. I thought I had read about people using it for theater lighting before – I mentioned that earlier, but I wasn't sure. One really cool thing about that is that it's actually sort of hard to get custom hardware to input things to an iOS device, right? You can't just hook up any USB device to your iOS device and do anything with it. But this actually opens up a lot of possibilities, because there are all kinds of MIDI devices out there and you can use those to talk to and from your iOS device without any – it's not like trying to use the [inaudible] where you have to get Apple's approval and join their program, give them 10% or whatever – it's open to everybody. Core MIDI is just a regular, public framework. CHUCK: Yeah, one other thing that occurs to me is that since it's evented, and since your events can effectively stack up to affect the output of other events or ongoing settings, there really are some powerful things you should be able to do with it to get the behaviors you're looking for. ANDREW: Yeah, absolutely.
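One way the parameter-automation idea above can look in code: map a 7-bit Control Change value onto an audio unit parameter's range and set it. A sketch only; the parameter ID and range are illustrative, since real units publish their own parameter lists.

```objc
#import <AudioToolbox/AudioToolbox.h>
#include <stdint.h>

// Scale a CC value (0-127) into [minValue, maxValue] and apply it.
void applyCCToParameter(AudioUnit unit, AudioUnitParameterID param,
                        uint8_t ccValue, float minValue, float maxValue) {
    float normalized = ccValue / 127.0f;
    float value = minValue + normalized * (maxValue - minValue);
    AudioUnitSetParameter(unit, param, kAudioUnitScope_Global, 0, value, 0);
}
```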
There's a lot of complexity – not in a bad way; complexity in a good way – in MIDI. One of the nice things is that while it is a standard – there's a MIDI Manufacturers Association that maintains the spec – it feels to me like it has evolved over the years somewhat organically. One manufacturer would add new features, and then maybe those would get adopted by others, which means there's actually a lot to it, and a lot of flexibility inherent in it. One thing we haven't talked about yet is that MIDI has support for something called a System Exclusive event, usually shortened to SysEx. That's just a MIDI message that can contain arbitrary data – you can put anything you want in there, data of any type. It's there because some manufacturers might make, say, a controller that has a completely new touchpad on it, and there's no support in MIDI for touchpad events, so you just send whatever you want in these SysEx events. You can make the touchpad send SysEx events, and if somebody else is writing an app that knows about that controller – maybe it only works with that controller – and they want to listen to those SysEx events, they can. There's actually a lot of room for expansion; you're not really limited by what other people have already done with MIDI. JAIM: One thing we didn't talk about in MIDI is time. How does it represent time? What is the base unit? ANDREW: That's an excellent question. If you're receiving MIDI events from a live device, time is just – you get a Note On event and then you get a Note Off event; you can measure the duration between them and figure out what that means for you. You might be recording, and the user has told you their tempo is 100 bpm, so you can convert the time between the Note On and the Note Off into a note length – say, a quarter note or an eighth note. In a MIDI file – which I assume is more interesting – time is actually not in seconds. You get events with timestamps in ticks, and this gets complicated, because you can have songs in different time signatures and different tempos, and there are ways to specify those in a MIDI file, but they're not always specified. Typically, though, by default, one tick is a quarter note, so if you have a note with a duration of half a tick, that's an eighth note. The resolution of a tick is configurable in the MIDI file too – off the top of my head, I think the resolution goes down to a 24th of a quarter note, or maybe a 32nd of a quarter note, so it's actually quite fine. Time is specified in musical terms – in quarter notes and eighth notes and fractions of those. That's powerful, because it means the tempo of a song is not fixed when you record it. When you play it back and want to change the tempo, it just means that in your app you say quarter notes are a tenth of a second instead of a twentieth of a second, or whatever. That's pretty powerful: you can change the tempo of recorded audio too, but that's a complicated process where you have to do an FFT and then go back, and all that. With MIDI, you just play the notes out at a different speed than they were recorded at.
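The beat-based timing just described makes tempo changes trivial: a timestamp in quarter notes only becomes wall-clock time once you pick a BPM, since one quarter note lasts 60/BPM seconds. A tiny worked sketch:

```objc
#include <stdio.h>

// Convert a duration in beats (quarter notes) to seconds at a given tempo.
static double secondsForBeats(double beats, double bpm) {
    return beats * (60.0 / bpm);
}

int main(void) {
    // The same eighth note (0.5 beats) at two tempos:
    printf("%.3f s at 100 BPM\n", secondsForBeats(0.5, 100.0));  // 0.300
    printf("%.3f s at 200 BPM\n", secondsForBeats(0.5, 200.0));  // 0.150
    return 0;
}
```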
JAIM: So the tick is a quarter note – if you have an eighth note, which is twice as fast as a quarter note, is that 0.5, or do they use ½ when representing it? ANDREW: Yeah, it's 0.5. If you open a MIDI file using Core MIDI – actually, using Audio Toolbox – you get the [inaudible] in the API again. It's a C API, but there's what you might call a class, called MusicSequence – it's actually an opaque type with related functions – that allows you to open a file, and for each event you get a start time and a duration, basically, and those are in ticks, and it's just a float. An eighth note is 0.5 and a sixteenth note is 0.25, but it's a float. [Crosstalk] You don't have to be exact, right? JAIM: I guess not. If you have a triplet, that'd be 0.333… – that's not exactly representable. You don't express it as a fraction, but I guess the rounding would be pretty minor. ANDREW: Yeah, and it's actually a 64-bit float, so you have very high resolution. Even though the timing is specified in terms of eighth notes and quarter notes and so on, MIDI is meant to record human musicians playing a real instrument, where note lengths are obviously not exact. That brings up a feature of a lot of MIDI recording apps, which is that they allow you to quantize your recordings. A human player will never play notes exactly on the beat and with exactly the right duration –. JAIM: [Inaudible] ANDREW: Yeah, right. Quantizing lets you snap note boundaries so they are exact. Of course, that can actually sound weird, right? We're not used to hearing music played with exact timings; it sounds robotic and not that good – not that human. There's also something MIDI apps can do called dequantization – I think it's dequantization. That's where you –. JAIM: It's quantization. ANDREW: Well, quantization is making the timings exact, but you can also reverse that process. If you have a recording with exact timings – say you edited the MIDI file on the computer instead of playing it in on a piano – you might want to add a little bit of noise so the boundaries are not exactly on the beat, because it makes it sound more natural. That's possible too. JAIM: Yeah, because you're adding a little bit of air, [inaudible] a little bit of time, so it feels more natural. As you said, if you record something exactly perfect, [inaudible] it doesn't sound good; it sounds like a robot. ANDREW: Yeah, and with MIDI files – you can Google 'Bach MIDI file' or something and find a MIDI file of a Bach piece. They sort of have a reputation for sounding like a robot playing piano, in large part because the timing is so perfect. It's almost like there's no feeling in it, right? A lot of the feeling a musician can put into a piece of music is based on timing and variations in timing. MIDI doesn't have to have that problem; it's not intrinsic to MIDI. CHUCK: Yeah, but if you build all the events just off of the score, without having somebody actually play it in, that's where you get what you're talking about – it feels unnatural, because it's playing exactly, perfectly. ANDREW: Right, exactly. CHUCK: Alright. Anything else we should cover on MIDI before we get to the picks? ANDREW: There's a lot we haven't touched on, but we've talked about most of the stuff that I actually know about.
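Returning to the quantize and dequantize ideas discussed a moment ago, here is a minimal sketch of both operations on beat timestamps. The grid size and jitter amount are illustrative choices.

```objc
#include <math.h>
#include <stdlib.h>

// Snap a timestamp (in beats) to the nearest multiple of `grid`,
// e.g. grid = 0.25 quantizes to sixteenth notes.
double quantize(double beats, double grid) {
    return round(beats / grid) * grid;
}

// "Humanize": nudge a timestamp by up to +/- maxJitter beats so exact
// timings sound less robotic.
double humanize(double beats, double maxJitter) {
    double r = ((double)rand() / RAND_MAX) * 2.0 - 1.0;  // uniform in [-1, 1]
    double nudged = beats + r * maxJitter;
    return nudged < 0 ? 0 : nudged;
}
```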
I can't really talk authoritatively about the rest of the stuff we haven't covered, but MIDI is quite a deep topic, and there are a lot of resources out there for learning more about it; I'll put some in the show notes. MIDI is definitely not an Apple-specific thing, even though Apple has really good implementations of it on both OS X and iOS. It's really widely supported on all kinds of different devices and computers. CHUCK: Alright. Well, if you have any opinions, experience, or other things you want to add to the conversation, go to iphreaksshow.com and leave us a comment. Why don't we get to the picks? Andrew, do you want to start us off with the picks? ANDREW: This is a little bit self-serving and predictable, but I'll pick MIKMIDI. That's our library from Mixed In Key for doing MIDI, and I wrote essentially all of it, so it's been a fun project over the last year. I think it really does make adding MIDI to your iOS or OS X app a lot easier – you don't have to use all the low-level Core MIDI C stuff; it's all Objective-C. I've tried to design it so it works the way you'd expect a Cocoa library to work; you can put it in your app really easily. For my second pick, I'm going to pick my new microphone, which is a Blue Yeti. This is sort of a popular podcast microphone, but this is the first episode I've ever recorded with it, so if you like how it sounds, check it out. They're not very expensive for a decent USB microphone. Those are my picks. CHUCK: Awesome. Jaim, what are your picks? JAIM: I'm going to pick the Hemingway app, which has a website you can go to – hemingwayapp.com. You can input text – if you're doing a blog post, or any type of writing, you put it in there and it'll tell you how complex your writing is and give you ideas for making it simpler. If you've read Hemingway, he uses very simple prose – he doesn't use flowery language. This really helps you find places in your writing that may be hard to understand, [inaudible] bring the grade-level comprehension down. It'll give your writing a grade level – grade seven, grade three, grade two. It's also great for crafting the wording in your app, trying to put [inaudible] messages in there. You can get it as low as possible – grade two, grade one – or if you're doing it for an Android app, maybe kindergarten, because those users don't have a grade of comprehension. But anyway, that's something I've been using. It's helped my writing and done some pretty cool stuff, so check out the Hemingway app. CHUCK: Awesome. It's been kind of a hectic week this week, so I don't have any picks, so we'll go ahead and just wrap up the show. Thanks for sharing your expertise, Andrew. ANDREW: Yeah, it's fun to talk about. CHUCK: Alright, well, we'll catch you all next week! [Hosting and bandwidth provided by the Blue Box Group. Check them out at BlueBox.net.] [Bandwidth for this segment is provided by CacheFly, the world's fastest CDN. Deliver your content fast with CacheFly. Visit cachefly.com to learn more.]
