Salmon drums: drum machine app

Morten Robinson – morob05
Garda Zsolt Barnabas – zsgar17

REPO: https://bitbucket.org/zsgar17/salmon-drums/src/final-handin/machine898/

1. Introduction

1.1 Basic idea

Both of us are interested in sound and music and thus it felt natural to do a project using the AudioKit framework. We decided on implementing a simple drum machine app.

1.2 Why this subject is important

Music plays an important role in human culture and our app is a creative tool for assisting in making music, specifically a tool for making drum beats. Also, making music can be fun and it is important to have fun.

1.3 Common platform

In order to understand this subject, a basic knowledge of app development using Xcode and a basic understanding of digital audio, and of audio in general, are required. A basic understanding of the AudioKit framework, including a basic understanding of the MIDI standard, is also required. The application uses audio sampling, that is, the manipulation and playback of audio samples. In its simplest form this consists of loading an audio file into memory and playing it back when a certain event (e.g. a button press) happens.

1.4 Problem statement

We decided on a minimal specification that our app should satisfy. The app should be playable live, enabling the user to play drum beats by tapping buttons on the iOS device to trigger drum sounds. Furthermore, the app should be user programmable, enabling the user to program a drum pattern and hit a play button to play the pattern back. This is our minimum specification, but the project is easily expandable, in the sense that anyone with basic insight into digital music production can come up with extra features and functionality that could be added to the app.

1.5 Aim

Our aim is to have a working app satisfying at least the minimum specification.

1.6 Objectives

We will implement all the necessary views, controllers and models, and we will make extensive use of the AudioKit framework.

2. Methods and Materials

2.1 Brainstorm

We did a brainstorming session at the library to arrive at the set of features we wanted to implement.

2.2 Conceptual Design and Simple Prototyping

We made a hand-drawn sketch of the user interface we wanted to bring to life, and afterwards we made a mockup using a vector graphics program.

2.3 Evaluation

We agreed on the features to implement and produced a simple prototype that we were able to follow during the implementation of the application.

2.4 Use-Case Diagrams

We used draw.io to create a use-case diagram to follow during implementation.

2.5 Object Diagrams

We used Lucidchart to make the object diagram.

3. Results

3.1 Brainstorm

During our brainstorming session we made a list of features for our app and divided these features into must-haves and nice-to-haves. We formulated the minimum specification for our app based on the features in the must-have list.

3.2 Simple early prototyping

Based on our minimum specification we began talking about how we would make a graphical user interface for our app. For example, we agreed that it would be nice to have a dedicated view for playing the drum pads live. Our minimum specification was as follows:

    • 10 drum sounds / 10-tone timbrality
    • 16-step programmable pattern sequencer
    • At least 2 views in the UI: a pad view and a pattern view
    • A transport section with play and stop controls and a tempo display

Our nice to have list was as follows:

    • Nice to also be able to set the tempo
    • Nice to have a record button so patterns can be programmed from pad mode by tapping drum pads
    • Nice to be able to select between different audio samples and load them to the drum pads
    • Nice to add velocity sensitivity to the drum pads
    • Attack, decay, pan, pitch, reverse
    • An endless list of effects which could be added to the output or to individual drum samples
    • Could be fun to use the IMU of iOS devices to control certain parameters, such as the cut-off frequency of a filter

It was essential for us to have 2 specially dedicated main views, rather than cramming a lot of other stuff into them. Big drum pads make it easier for the user to play drum beats live by tapping the different pads, and the pattern view is also more user friendly with larger buttons. We also decided to make a dedicated view for programming the drum patterns. The pattern view should contain 10 rows, each corresponding to a specific drum sample. A standard drum pattern with a 4/4 time signature contains 16 steps, or 16 1/16th notes, so we should have 16 buttons in each row, one for each of the 16 steps. Based on our minimum specification and these considerations we began drawing sketches of how our app should look, and we came up with a simple design for the user interface.

3.3 Required task

From the sketch of the user interface, planning becomes easier. It is clear from the above that we are dealing with multiple MVCs. We have a drum pad view and a pattern view, and then we have a view for the transport controls (stop, play, record, view toggle, tempo label and metronome) which is always visible and accessible. Therefore we are dealing with 3 MVCs.

There are several options for going about this; we somehow need a controller whose view contains the other MVCs. We decided on using the navigation controller that iOS provides. The navigation controller has its own root view which is always visible, so we can embed the transport control buttons into the root view, since the transport controls should be accessible whether the user is in pad mode or pattern mode.

The Controller for the pad view should trigger playback of individual drum samples when the corresponding drum pad is tapped. This should be an instantaneous single-shot playback.

The pattern view controller needs to toggle the state of the buttons in the pattern view when they are tapped. The pattern buttons toggle between 2 states: they are either on or off. When toggling a button, the controller needs to update the state in a drum pattern data structure accordingly. It also needs to update the view according to the drum pattern data, so the user can clearly see the state of each button.
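
A minimal sketch of this toggle logic in Swift (the names and the 10-by-16 layout here are illustrative assumptions; the actual implementation is described in section 4):

import UIKit

// pattern[row][step] is true when that step is active for that drum row.
var pattern = Array(repeating: Array(repeating: false, count: 16), count: 10)

func toggle(row: Int, step: Int, button: UIButton) {
    pattern[row][step].toggle()               // update the drum pattern data structure...
    button.isSelected = pattern[row][step]    // ...and reflect the new state in the view
}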

The transport view controller needs to enable playback of the current state of the drum pattern, so we need a so-called sequencer for sequencing the programmed drum pattern. The tempo with which the sequencer plays through the 16 steps in the drum pattern is set to a specific BPM (Beats Per Minute). When the user taps the play button, the sequencer should trigger the individual drum samples at the correct times corresponding to the programmed steps in the pattern; when the user taps the stop button, playback stops and the sequencer rewinds to the first step in the sequence. For one pattern our sequence will be 16 steps long, and the sequencer should loop these 16 steps indefinitely, or until the user taps the stop button.

Now the tasks seem clear. We need to implement these 3 MVCs, that is the pad MVC, the pattern MVC and the transport MVC.

3.4 First navigational prototype: A Main.storyboard

The Main.storyboard uses a navigation controller as a container for the pad view and the pattern view and it looks like this:

3.5 Use-Case Diagrams

According to the brainstorming, the user must be able to:

    • Play sounds on instrument pads
    • Program patterns for each instrument in the sequencer
    • Play back the programmed pattern.

Additionally, it would be good for the user to be able to change the sounds of the instruments used on the instrument pads and to change the playback tempo.

3.6 MVC Diagram


The diagram shows the desired separation of the objects according to the MVC pattern.

The TransportViewController has a ContainerView, that can accommodate either a DrumPadsViewController or a PatternViewController as its view.

The DrumPadsViewController makes use of the DrumPad and the Sounds models, and has a view built in the storyboard with 10 instrument buttons.

The PatternViewController makes use of the Sequence model and has a view built in the storyboard as well. In that view there is a TableView and a custom PatternTableViewCell as the prototype cell.

3.7 Framework

We have made extensive use of the AudioKit framework, which provides a large library of classes that make the development process simpler and easier. Using AudioKit we don't have to develop everything from scratch; we just have to learn how to use the framework and how to find classes that are useful and relevant to what we are implementing. Thankfully, AudioKit provides good documentation on its website. In this chapter we provide some insight into the AudioKit classes that we have been able to put to good use during the development of this app. Some code examples using the framework are provided, and the reader should be able to implement similar functionality with the AudioKit framework after reading this chapter.

3.7.1 The AudioKit class

The AudioKit framework provides a top-level class of the same name, which provides the audio and MIDI engines as well as management of AudioKit devices. In order to output any sound, AudioKit's audio engine must first be started. This can advantageously be done in the init function of the class which needs to use it, and it is usually done by executing the following code:

        do {
            try AudioKit.start()
        } catch {
            AKLog("AudioKit did not start!")
        }

3.7.2 Audio Devices and the sampler

In order to use audio files such as .wav files, they must first be loaded into instances of AudioKit's AKAudioFile class. An instance of this class does not on its own function as a device with an audio output, so in order to play back the audio file it needs to be loaded into an instance of a suitable device, such as the AKPlayer, the AKAppleSampler or the AKMIDISampler (which is just a MIDI-enabled AKAppleSampler). Additionally, the audio output of the device needs to be sent to AudioKit's master audio output, which is what comes out of the iOS device's speakers or headphones. This is made very easy by AudioKit, since the top-level class contains a property called output, which is a so-called AKNode. All instruments and devices that produce an audio output are also AKNodes, so it is as simple as doing AudioKit.output = someAKNode, and now AudioKit's master audio output is that AKNode. Instantiating an AKAppleSampler, loading an AKAudioFile and sending the sampler's output to the master output is typically done in this manner:

var sampler = AKAppleSampler()
do {
     let someDrumSample = try AKAudioFile(readFileName: "Path/to/someDrumSample.wav")
     try sampler.loadAudioFile(someDrumSample)
} catch {
     AKLog("File didn't load")
}
AudioKit.output = sampler

Now the sampler is instantiated, the file is opened as an AKAudioFile, the sampler is loaded with the file, and the output of the sampler is sent to AudioKit's master output. When triggering the sampler to play the audio file, the sound should now come out of the speakers or headphones of the iOS device. The AKAppleSampler functions as an instrument which provides the basic functionality you would expect from a standard sampler instrument. That is, it is playable across a range of keys, from C-1 in the deep low end, corresponding to a MIDI note number of 0, all the way up to G9, corresponding to a MIDI note number of 127. Playing a note is done by calling the function play(noteNumber:velocity:channel:). When loaded with a single AKAudioFile, the root key will be middle C, also called C4, which has a MIDI note number of 60. This means that when calling the play() function with 60 as the note number, the audio file will play at normal speed. Note numbers higher than 60 speed up the playback of the audio file, making it higher pitched. For example, an octave above, at note number 72, the sample is played at twice its recorded speed. An octave below, at note number 48, the sample is pitched down and played at half its recorded speed. AKAppleSampler uses interpolation to calculate and generate sample values when the sample is played at notes other than middle C. Like most other sampler instruments, the AKAppleSampler also allows for loading multiple audio files and assigning them to individual keys/MIDI note numbers. This can be done simply by adding, for example, C2 or F#4 at the end of a filename if the goal is to assign the sample to C2 or F#4.
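
Triggering the sampler at different pitches might then look like this (a minimal sketch, assuming the AudioKit 4 API and the sampler set up above; play() throws, so the calls are wrapped in do/catch):

do {
    try sampler.play(noteNumber: 60, velocity: 100, channel: 0)   // root key, original speed
    try sampler.play(noteNumber: 72, velocity: 100, channel: 0)   // one octave up, double speed
    try sampler.play(noteNumber: 48, velocity: 100, channel: 0)   // one octave down, half speed
} catch {
    AKLog("Playback failed")
}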

In our implementation the drum samples are loaded into the AKAppleSampler, and they can be triggered either by tapping the individual drum pads in pad mode or by the sequencer.

3.7.3 The AKMixer

The AKMixer class makes it possible to add audio signals to each other; it basically just sums the signals together. This may of course result in clipping if the audio signals are very loud. We are not too concerned with clipping in this project; it may add some distortion to the output, but that is not necessarily entirely a bad thing, since distortion can be a nice effect. If clipping turned out to be a highly undesired issue, it could easily be avoided by using the AKPeakLimiter, which can control the dynamics of the signal to ensure that it stays within the dynamic range. AKNodes can be added seamlessly to the mixer using the connect(input: AKNode) function or simply by AKNode >>> AKMixer. Instantiating a mixer and some AKNodes might look something like this:

//Creating an instance of the AKMixer class
var mixer = AKMixer()

//Creating some different instruments
var anAKNode = AKAppleSampler()
var anotherAKNode = AKOscillator()
var yetAnotherAKNode = AKMIDISampler()

//Sending these instruments to the mixer
[anAKNode, anotherAKNode, yetAnotherAKNode] >>> mixer

//Send mixer output to master output
AudioKit.output = mixer

3.7.4 The AKSequencer Class

The AKSequencer class enables the sequencing of MIDI data. New MIDI tracks can be seamlessly added to and removed from the sequencer as required by calling the newTrack() and deleteTrack(trackIndex) functions. Several functions are provided for loading MIDI data into the tracks, such as addMIDIFileTracks("Path/to/midifile"). In our case we are generally manipulating the MIDI data directly, inserting notes with the add(noteNumber:velocity:position:duration:) function and removing data with the clearRange(start:duration:) function. The tempo with which the sequencer plays through the sequenced data can be set with the function setTempo(_ bpm: Double). Other notions of time, such as position and duration, are represented as AKDuration objects. AKDuration is a container class for the notion of time in sequencing; it has parameters such as beats, tempo, seconds, etc., and it can be initialised by specifying a number of beats and a tempo. For example, AKDuration(beats: 1, tempo: 60) has a duration of one second, because one beat at a tempo of 60 beats per minute lasts one second. The sequencer can also be set to loop over a sequence or time interval by calling the function enableLooping(_ loopLength: AKDuration).

In our case the pattern is 4 beats long, so we need to set the loopLength to 4 beats. Keeping track of time in this manner, with beats rather than minutes and seconds, is very advantageous: if we change the BPM, we don't need to change the MIDI data to match the new tempo, and we don't need to change the loopLength; we just change the BPM with the setTempo() function and everything else happens automatically. In fact, none of the MIDI data changes at all when changing the BPM; only the MIDI beat clock changes with the BPM. The MIDI beat clock is the clock signal that synchronises all the MIDI devices; its speed is always 24 pulses per quarter note, and thus its speed is BPM dependent. Some relevant code for instantiating a sequencer, initialising it with a BPM value, setting it to loop, creating a track and manipulating data in that track might look something like this:

//Create an instance of the sequencer
var sequencer = AKSequencer()

//Set the BPM
sequencer.setTempo(128.0)

//Set the length of the sequence
sequencer.setLength(AKDuration(beats: 4))

//Enable looping
sequencer.enableLooping(AKDuration(beats: 4))

//Add a MIDI track, since it is the first track it will be track number 0
_ = sequencer.newTrack()

//We might send this data to a MIDI instrument such as the AKMIDISampler
var MIDIInstrument = AKMIDISampler()

//The sequencer should send MIDI output data to the instrument's MIDI input
sequencer.tracks[0].setMIDIOutput(MIDIInstrument.midiIn)

//Let's add 1/16th notes on each beat: four on the floor.
sequencer.tracks[0].add(noteNumber: MIDINoteNumber(60), velocity: MIDIVelocity(100), 
                        position: AKDuration(beats: 0), duration: AKDuration(beats: 1/4))
sequencer.tracks[0].add(noteNumber: MIDINoteNumber(60), velocity: MIDIVelocity(100), 
                        position: AKDuration(beats: 1), duration: AKDuration(beats: 1/4))
sequencer.tracks[0].add(noteNumber: MIDINoteNumber(60), velocity: MIDIVelocity(100), 
                        position: AKDuration(beats: 2), duration: AKDuration(beats: 1/4))
sequencer.tracks[0].add(noteNumber: MIDINoteNumber(60), velocity: MIDIVelocity(100), 
                        position: AKDuration(beats: 3), duration: AKDuration(beats: 1/4))

//Let's remove one of those 1/16th notes
sequencer.tracks[0].clearRange(start: AKDuration(beats: 2), duration: AKDuration(beats: 1/4))

//If we want to listen to our instrument as it receives the MIDI data
AudioKit.output = MIDIInstrument
//Of course we haven't loaded a sample into the MIDIInstrument in this example case...

//...so there will be no sound when we play this sequence
sequencer.play()

//Let's stop
sequencer.stop()

//Let's rewind so we get back to the beginning of the sequence
sequencer.rewind()

4. Implementation

4.1 Models:
4.1.1 Sounds:

This struct contains an enumeration of the instruments usable in the application, for the pads and the patterns. The cases' UInt8 raw values correspond to the standard MIDI drum instrument values, so the rawValue of a case can easily be used to trigger any MIDI-input-enabled class.

It includes a String description (implementing CustomStringConvertible) that is used for the labels of the DrumPadsViewController's pad buttons.

Additionally, a static SoundLibrary variable is present here, containing the sound sample names available for each individual instrument. This can be used to list the choices when presenting the available drum samples for the pads.

Both the struct and the included SoundCategory enumeration implement Codable, so they can be encoded.
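
A minimal sketch of what this model might look like (the case names, note numbers and library entries are illustrative assumptions based on the General MIDI drum map, not the project's exact source):

struct Sounds {
    enum SoundCategory: UInt8, Codable, CustomStringConvertible {
        case kick = 36          // GM "Bass Drum 1"
        case snare = 38         // GM "Acoustic Snare"
        case closedHiHat = 42   // GM "Closed Hi-Hat"
        case openHiHat = 46     // GM "Open Hi-Hat"

        var description: String {
            switch self {
            case .kick: return "Kick"
            case .snare: return "Snare"
            case .closedHiHat: return "Closed Hi-Hat"
            case .openHiHat: return "Open Hi-Hat"
            }
        }
    }

    // The sample names available for each instrument (hypothetical entries).
    static let SoundLibrary: [SoundCategory: [String]] = [
        .kick: ["kick808", "kickAcoustic"],
        .snare: ["snare808", "snareAcoustic"]
    ]
}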

4.1.2 DrumPad:

The model for individual drum pads, encapsulating the instrument (category) and the actual drum sample (as a String, which should come from the Sounds.SoundLibrary entry for the corresponding instrument).
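
A minimal sketch (the property names are assumptions):

struct DrumPad {
    var category: Sounds.SoundCategory   // which instrument this pad triggers
    var sampleName: String               // a name from Sounds.SoundLibrary[category]
}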

4.1.3 Sequence:

Encapsulates the instrument and the corresponding sequence of 16th-note steps as a Bool array.

It implements Codable, so it can be encoded; this is used by the SequencerViewController when saving patterns.
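
A minimal sketch (the property names are assumptions):

struct Sequence: Codable {
    var instrument: Sounds.SoundCategory
    // One Bool per 16th-note step; true means the step is active.
    var steps: [Bool] = Array(repeating: false, count: 16)
}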

4.2 Views:
4.2.1 PatternTableViewCell:

A custom UITableViewCell that has an array of UIButtons, one for each individual step in the sequence.

Also includes a delegate and a corresponding protocol (which the SequencerViewController implements) to communicate that a step button was pressed, so that the Sequence model can be updated.

Additionally, it holds a UIImageView for the instrument icon shown in front of each sequence row in the SequencerViewController.
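
A minimal sketch of the cell and its delegation (the protocol name and the member names besides the delegate method are assumptions):

import UIKit

protocol PatternTableViewCellDelegate: AnyObject {
    func stepButtonPressed(at step: Int, for instrument: Sounds.SoundCategory)
}

class PatternTableViewCell: UITableViewCell {
    @IBOutlet var stepButtons: [UIButton]!                 // the 16 step buttons
    @IBOutlet weak var instrumentImageView: UIImageView!   // the instrument icon

    weak var delegate: PatternTableViewCellDelegate?
    var instrument: Sounds.SoundCategory = .kick

    @IBAction func stepTapped(_ sender: UIButton) {
        // The button's position in the outlet collection is its step index.
        guard let step = stepButtons.firstIndex(of: sender) else { return }
        sender.isSelected.toggle()                         // visual on/off state
        delegate?.stepButtonPressed(at: step, for: instrument)
    }
}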

4.3 Controllers:
4.3.1 TransportViewController:

Includes the transport functionality of the app (play, stop) and contains the other two view controllers via a container view. It is embedded in a Navigation Controller, and the navigation bar is used to accommodate the transport functions as well as the segmented control for view switching.

Properties:

drumPadsViewController: a strong reference to the controller showing the drum pads.

sequencerViewController: a strong reference to the controller showing the sequencer.

Methods:

playPressed(): creates the MIDI sequence from the Sequence models and plays the created sequence (a sketch follows at the end of this list). Hooked to a button in the navbar.

stopPressed(): stops the playing sequence. Hooked to a button in the navbar.

add(asChildViewController:): adds the passed VC as a child and its view as a subview, resulting in the passed VC's view being shown.

remove(asChildViewController:): prepares for removing the passed VC to give place to another one.

updateView(): "changes" the view according to the state of the segmented control by remove()-ing and add()-ing the VCs, as sketched below.
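
A minimal sketch of the containment logic (standard UIKit child view controller containment; the containerView outlet is an assumption):

private func add(asChildViewController viewController: UIViewController) {
    addChild(viewController)                          // register as a child VC
    containerView.addSubview(viewController.view)     // show its view in the container
    viewController.view.frame = containerView.bounds
    viewController.didMove(toParent: self)
}

private func remove(asChildViewController viewController: UIViewController) {
    viewController.willMove(toParent: nil)            // announce the removal
    viewController.view.removeFromSuperview()
    viewController.removeFromParent()
}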

loadSamples(): loads samples from the Resources files to make them available for playback via the contained AKSequencer instance.
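
A minimal sketch of how playPressed() might turn the Bool-array patterns into sequencer notes (the sequencer and patterns properties, the single-track layout and the fixed velocity are assumptions):

func playPressed() {
    sequencer.tracks[0].clear()                 // start from an empty track
    for pattern in patterns {                   // one Sequence per instrument
        for (step, isOn) in pattern.steps.enumerated() where isOn {
            // Each of the 16 steps is a 16th note, i.e. 1/4 of a beat.
            sequencer.tracks[0].add(noteNumber: MIDINoteNumber(pattern.instrument.rawValue),
                                    velocity: MIDIVelocity(100),
                                    position: AKDuration(beats: Double(step) / 4),
                                    duration: AKDuration(beats: 1/4))
        }
    }
    sequencer.rewind()
    sequencer.play()
}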

4.3.2 DrumPadsViewController:

Handles the individual drum pads and the playback of the samples. Playback is provided by a contained AKAppleSampler.

Methods:

givePadRoles(to buttons: [UIButton]): gives each UIButton in the outlet collection a "role", that is, the instrument it represents. This association is stored in the drumPadButtons property, which is used to set the names of the pads and to offer the right drum sound selection when needed.

loadSamples(): similar to the one in the TransportViewController.

addButtonGestureRecognizers(): adds a UILongPressGestureRecognizer to all the buttons to bring up the SoundSelectionTableViewController (experimental extra code) that offers the available sounds for the instrument represented by the button.

longPress(recognizer:): creates a new SoundSelectionTableViewController and presents it modally.

drumPadPressed(_ sender: UIButton, forEvent event: UIEvent): calculates the press point on the button and plays a sample with the calculated velocity (a sketch follows at the end of this list).

play(noteNumber: MIDINoteNumber, velocity: MIDIVelocity): plays the given MIDI note using the AKAppleSampler instance.
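A minimal sketch of the velocity-sensitive pad press (the vertical mapping and the drumPadButtons dictionary type are assumptions; only the method names come from the descriptions above):

@IBAction func drumPadPressed(_ sender: UIButton, forEvent event: UIEvent) {
    guard let touch = event.allTouches?.first else { return }
    let point = touch.location(in: sender)
    // Map the vertical press position to velocity: top of the pad is loud, bottom is soft.
    let fraction = 1.0 - point.y / sender.bounds.height
    let velocity = MIDIVelocity(max(1, min(127, Int(fraction * 127))))
    if let pad = drumPadButtons[sender] {
        play(noteNumber: MIDINoteNumber(pad.category.rawValue), velocity: velocity)
    }
}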

4.3.3 SequencerViewController:

Accommodates a TableView of the custom PatternTableViewCells, thus providing the functionality to program individual instruments in 16 steps. Includes a saving function to persist the pattern currently programmed by the user.

Methods:

saveCurrentPattern() (private): encodes the patterns property (an array of Sequence representing all the instruments' sequences) into JSON format and saves the encoded data to the applicationSupportDirectory (a sketch follows at the end of this list).

loadPatternsFromStorage(): decodes the patterns from the saved data and sets the patterns property to the decoded object.

stepButtonPressed(at step: Int, for instrument: Sounds.SoundCategory): implements the delegate method from the PatternTableViewCell to register the step presses.
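
A minimal sketch of the persistence described above (the file name and the use of try? are assumptions; Sequence is Codable, as noted in 4.1.3):

private var patternFileURL: URL? {
    return try? FileManager.default
        .url(for: .applicationSupportDirectory, in: .userDomainMask,
             appropriateFor: nil, create: true)
        .appendingPathComponent("patterns.json")
}

private func saveCurrentPattern() {
    guard let url = patternFileURL,
          let data = try? JSONEncoder().encode(patterns) else { return }
    try? data.write(to: url)
}

private func loadPatternsFromStorage() {
    guard let url = patternFileURL,
          let data = try? Data(contentsOf: url),
          let decoded = try? JSONDecoder().decode([Sequence].self, from: data) else { return }
    patterns = decoded
}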

5. Discussion


We succeeded in meeting all of our minimum specifications for the app, and we also managed to add some of the extra nice-to-have features, such as recording patterns by tapping drum pads, velocity sensitivity in drum pad mode, a drop-down menu for selecting different audio samples for each drum sound, and a slider to control the BPM.

5.1 Evaluation

Unfortunately we haven't had the opportunity to test the app with potential users, for 2 main reasons: the time frame (there's so much we would rather work on implementing, and so little time…) and the fact that this sort of app targets a very niche market, which makes it difficult to find potential users. Not everybody is interested in music production and audio. We have only tested it amongst ourselves, but since both of us are familiar with similar apps, software and even hardware devices, we both had pretty good ideas of where to improve and what features to add. This still holds true for the current implementation of the app: we have all the basic functionality up and running, and it works very well, but now we want to add all the fun extra stuff, so we most likely won't leave this project entirely after handing in this assignment.

5.2 Comparison

A comparison would not be entirely fair. Even though this app targets a niche market, the market is worldwide and huge. There are many big players in this industry who have been working with this kind of software for decades and with whom we simply can't compete. Yet there are similar apps already on the App Store, and some of them are not much more advanced than our app. Future revisions of this app could definitely have the potential to make it onto the App Store 🙂

5.3 Conclusion

As stated in 5.1, we can conclude that we managed to reach our preliminary goals, and some extras too. What could really be fun now is to add an extra MVC for adding effects and controls to the different samples, and maturing this app for the App Store.
