Gabriel – Florentin Hirjoi [gahir14@student.sdu.dk]

BitBucket link: [ https://bitbucket.org/gahir14/eloops.git ]

Introduction

Since drum machines gained the ability to send MIDI signals to a computer, which can then interpret those signals and act on them accordingly (in this case, by playing predefined sounds), people have increasingly explored their talent not only for producing music, but also for playing it live. Drum machines come in handy in this situation, since their sturdy pad construction allows the user to play sequences on and on for hours.

A drum machine is an electronic instrument designed to imitate the sound of drums, cymbals and other percussion instruments. Even though they have been on the market for quite some time, the use of drum machines for playing live music is fairly new. An important role in this development belongs to connecting these machines to a computer, which lets them control not only the sound samples stored in their own memory, but also the ones stored on the computer, and gives them access to a wide range of effects for these sounds.

A modern drum machine usually includes a set of 16 pressure-sensitive pads, as well as other knobs and buttons to trigger and control effects, sample duration, and much more. One very good example in this category is Native Instruments Maschine, which had a huge impact on the market because of its ability to stay connected to a computer, thereby opening up a new range of possibilities for its users.

The development of smart mobile technology has facilitated the creation of virtual drum machines, with iMaschine 2, offered by Native Instruments, being at the top of the market.

 

Due to the abundance of features they provide, which do help more experienced users, current drum machines discourage newcomers from trying to use them. The difference is that experienced users have worked with drum machines in previous projects and have a working knowledge of the typical feature set: looping sounds, changing sample parameters such as gain, pitch and pan, recording new samples with the provided hardware, and creating new samples with the on-board sound engines. Their knowledge also extends to music theory, which helps when setting up a music project: defining the key a song is written and composed in, the tempo of the song, and, for instance, the chord progression, which contributes greatly to the harmonic rhythm of the melody.

A virtual drum machine with fewer features, concentrating mostly on playing music, should encourage new users to get into production or into playing music.

Providing a simple yet adequate virtual drum machine, which allows users to play music live out of the box while also offering the option to record their sessions, should make the experience more enjoyable for new users.

The basic idea for the app is a simple drum machine to play and record different audio samples in bars and phrases. Most of today’s music is based on repetitive sequences of sound, sequences which drum machines make easy to produce: songs can be created without any prior musical training, just by “beating” pads which trigger different samples.

Prototypes

The first prototype of the app consisted of 4 different screens, drawing inspiration from the iMaschine app, the goal being to provide a light version of it.

The users would be able to play audio samples using 16 pads, as well as control the pitch and modulation for the whole session. On the same screen they would also be able to record their patterns and play back whatever they had recorded previously in the session. An option for undoing their actions would be present, as well as a box for setting the bpm (beats per minute) of the song. A bar on top of the pads would give visual cues to where the user is in the loop sequence.

The prototype was tested with both beginner users, who have no prior experience in producing or playing music, and more experienced users with a basic knowledge of music production and a background in playing music: both of the experienced users are trained guitarists and have been active in a band in recent years, and one of them is also a DJ. The beginner users were my 3 flatmates. 🙂

Starting with the experienced users: they both liked the idea of having multiple options for effects and other characteristics of the sound, but at the same time they argued that having too many of these features available would distract them, and that the experience would become frustrating if, in a live setting, they had to stop every single time to adjust these features for every song they play. The DJ added that whenever he is DJ-ing for a crowd he could use the app to generate build-ups to surprise the crowd (a build-up being, in simple terms, a sudden increase in the number of beats per minute playing at a certain point in time). In that situation he would not need to edit the sounds live himself, since he would choose an existing sound that is in harmony with the song that is playing, and the only features he would likely need at that point are controls for the pitch and modulation of the sound playing from the app.

The non-experienced users also had problems with the sound-editing features of the prototype; none of them understood exactly how those features would benefit them if they are just trying to play a sequence of sounds.

Taking both parties into consideration, a new prototype was created which only includes features for playing samples, the sound-editing ones being discarded. The design for the new prototype therefore includes:

  • a pads view (screen) to allow users to play samples
  • a selection screen to select between different sample kits available in the app
  • a recording screen to allow users to record and use their own samples
  • a record function to record the sequences of sounds triggered by the users

Implementation

As described above, a number of different screens are required for the app. First and foremost, users should be able to play sounds and also select between the different sample kits available in the app.

Using Xcode as the development environment, two screens were designed using Storyboards: one holds the different pads for playing the sounds, while the second contains the list of sample kits available to the user.

ELoops Storyboard

As seen in the screenshot above, the Pads screen includes 8 different buttons which, when pressed, trigger the sounds assigned to them. Using a single colour for the background of all the pads would confuse the user, at least in the beginning while getting used to the app. To make the app more user-friendly, a two-colour scheme has been used to help the user distinguish the borders of the pads: light red and light orange are applied to the pads in an alternating pattern, again to help the user make sense of the pads on the main screen of the application.
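
If the colours were applied in code rather than in the storyboard, the alternating scheme could be expressed roughly as follows; the padButtons outlet collection and the exact colour values are assumptions, not part of the current code.

// Hypothetical sketch: alternate light red and light orange across the pad buttons.
for (index, pad) in padButtons.enumerate() {
    pad.backgroundColor = index % 2 == 0
        ? UIColor(red: 1.0, green: 0.6, blue: 0.6, alpha: 1.0)   // light red
        : UIColor(red: 1.0, green: 0.8, blue: 0.5, alpha: 1.0)   // light orange
}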

In the second screen, the one on the right, the sound kits are represented by a list of their names, displayed using a Table View to make better use of the space on the screen as well as the memory of the device, in case the list should include a large number of sound kits. On the left side of the image we have a Navigation Controller which, as its name suggests, is responsible for the navigation of hierarchical content, in this case the two screens. One of the characteristics of the Navigation Controller is that it includes a bar on top of each of the screens it manages; UI controls can be embedded in this bar and linked to the other screen. As a default behaviour, the controller inserts a “back” button on the left side of the bar of the second screen to facilitate backward navigation, should it be needed.

While the controller maintains the relationship between the screens, a segue is responsible for the actual transition between the pads screen and the selector screen. Using segues means there is no code to write to move from one screen to the other: dragging a segue from a bar button to the next screen is all that is needed to ensure that, when the bar button is tapped, the app transitions to the screen in question. But segues only go one way, and in order to get back to the pads screen while passing along the newly selected sound kit, we need to use the delegate pattern.

The Delegate Pattern

The delegate pattern is a design pattern in which an object, instead of performing one of its stated tasks, delegates that task to an associated object.

The Delegate Pattern

To implement the pattern we have to define a protocol containing the function that is going to be executed by the delegate.

Then the delegator holds a reference to a delegate without knowing the exact identity of the latter, and when a sound kit is selected from the list, the delegate receives the task.

protocol SelectKitViewDelegate {
    func setSoundKit(index: Int);
}

……………………….

var delegate: SelectKitViewDelegate! = nil

………………….

delegate.setSoundKit(indexPath.row)

In order for this action to take place, a class has to adopt the protocol defined above, implement the delegate method and also set the delegate. The latter happens in the prepareForSegue function. In our case the controller for the pads needs to listen for any change in the sound kit and load the appropriate one when the user so wishes; therefore the PadsViewController adopts the SelectKitViewDelegate protocol, implements the setSoundKit function and also sets the delegate.

class PadsViewController: UIViewController, SelectKitViewDelegate{

…………...

// MARK: - Data sent back from the SelectKitViewController
    
    func setSoundKit(index: Int){
        loadUrlToPlayers(soundKitsList[index], ext: "WAV")
    }

…………..

override func prepareForSegue(segue: UIStoryboardSegue, sender: AnyObject?) {
        if(segue.identifier == "soundkits"){
            let destination = segue.destinationViewController as! SelectKitViewController            
            destination.delegate = self
            
        }
    }
}

 

The same effect could have been achieved using a so-called “unwind segue”, which is provided by the storyboards. Creating an unwind segue involves, as in the case of the delegate pattern, creating a function in the first (main) controller that can be called from the second controller. The function has the following template:

@IBAction func FUNCTION_NAME(segue:UIStoryboardSegue) {
   //add code for actions in the main controller here
}
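
For the sound-kit selection, a hedged sketch of such a function could look like this; the unwindToPads name and the selectedKitIndex property on SelectKitViewController are assumptions, not part of the current code.

// Hypothetical unwind action in PadsViewController; selectedKitIndex is assumed to be
// an Int property on SelectKitViewController holding the row the user tapped.
@IBAction func unwindToPads(segue: UIStoryboardSegue) {
    if let source = segue.sourceViewController as? SelectKitViewController {
        loadUrlToPlayers(soundKitsList[source.selectedKitIndex], ext: "WAV")
    }
}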

The next step is to link the button from the second controller to this function. This can be easily done using the storyboards in Xcode.

Linking the button with the functions for the unwind segue
[gif source: http://www.raywenderlich.com/113394/storyboards-tutorial-in-ios-9-part-2]
Playing Sounds

The main purpose of the app is to allow users to play sounds by tapping the pads on the screen, and it does so with the help of AVAudioPlayer from the AVFoundation framework.

AVAudioPlayer enables the app to play sound in any format available in iOS and OS X, but it is the developer’s responsibility to handle interruptions like incoming phone calls, as well as to update the user interface when the player has finished playing a sound. To accommodate these two tasks, the class using the player must conform to the AVAudioPlayerDelegate protocol and implement its delegate methods.

func audioPlayerDidFinishPlaying(player: AVAudioPlayer, successfully flag: Bool) {
    //print("finished playing \(flag)")
}

func audioPlayerDecodeErrorDidOccur(player: AVAudioPlayer, error: NSError?) {
    //print("\(error.localizedDescription)")
}

Since the audio files are stored in the application bundle, they need to be loaded into the players using an NSURL, and each player prepared to play the sound it holds.

var soundKit2: [String] = ["app-sound-kits/kit2/kick_3","app-sound-kits/kit2/kick_4"]

let path = NSBundle.mainBundle().pathForResource(value, ofType: ext)
            
url = NSURL(fileURLWithPath: path!)

try self.playersArray[index] = AVAudioPlayer(contentsOfURL: url)
self.playersArray[index].prepareToPlay()
self.playersArray[index].volume = 2.0

After being loaded and prepared to play, the player is started by the touch of a pad.

playersArray[index].play()
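
How the touch of a pad reaches the right player is not shown above; one possible wiring, as a hedged sketch (the padPressed name and the use of the button’s tag as the pad index are assumptions), could be:

// Hypothetical pad action: each pad button's tag holds its index into playersArray,
// and the action is connected to the pad's Touch Down event in the storyboard.
@IBAction func padPressed(sender: UIButton) {
    playersArray[sender.tag].play()
}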

A more dynamic way of loading the sounds would be to store the sound data in a SoundKit class, which would hold the location of the sound files in the application bundle, the name and a small description of the sound kit, as well as methods to load the different sound files and prepare them for playback. With this class available, the PadsViewController would only have to hold references to the different SoundKit objects in an array. The functions triggered when a touch down is detected on one of the pads could stay almost the same; the only difference is that they would now call the play() function on an instance of the SoundKit class instead of the AVAudioPlayer class, since the AVAudioPlayer would be used inside the SoundKit class itself.
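
A minimal sketch of such a SoundKit class could look like the following; the class, its properties and its initialiser are assumptions and are not part of the current app.

import AVFoundation

// Hypothetical SoundKit: loads and prepares all samples of one kit and plays them by index.
class SoundKit {
    let name: String
    let kitDescription: String
    private var players: [AVAudioPlayer] = []

    init(name: String, description: String, filePaths: [String], ext: String) {
        self.name = name
        self.kitDescription = description
        loadSounds(filePaths, ext: ext)
    }

    // Load every sample of the kit from the application bundle and prepare it for playback.
    private func loadSounds(filePaths: [String], ext: String) {
        for filePath in filePaths {
            guard let path = NSBundle.mainBundle().pathForResource(filePath, ofType: ext) else { continue }
            guard let player = try? AVAudioPlayer(contentsOfURL: NSURL(fileURLWithPath: path)) else { continue }
            player.prepareToPlay()
            players.append(player)
        }
    }

    // Play the sample assigned to the given pad index, if it exists.
    func play(index: Int) {
        guard index < players.count else { return }
        players[index].play()
    }
}

The PadsViewController would then keep an array such as var soundKits: [SoundKit], and the pad action would simply call soundKits[currentKitIndex].play(sender.tag) instead of touching the AVAudioPlayer instances directly.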

Animations

While playing sounds can be fun, the number of sound kits and sounds in the app will at some point confuse users if they have no reference point indicating which button they have pressed. A small scale animation when a button is pressed should therefore ensure that the user always knows which pad has been touched.

A small scale animation can be achieved with the animateWithDuration function, which takes the duration of the animation and a closure in which the UIView is scaled down to the desired size.

UIView.animateWithDuration(0.1,
    animations: {
        button.transform = CGAffineTransformMakeScale(0.9, 0.9)
    },
    completion: { finish in
        UIView.animateWithDuration(0.1){
            button.transform = CGAffineTransformIdentity
        }
})

While the scale animation takes care of the button press, the lengths of the sounds in the app differ, and most of them play for longer than the duration of that animation. The user therefore needs another reference for which sound is playing and for how long: if not by displaying the duration in a time format, then at least by means of a circle drawn on the pressed button, animated so that it takes the duration of the sample to complete. Unfortunately, an animation applied on top of a button seems to reduce the touchable area of the button, so the animation should be placed on the side of the button the user is least likely to touch, which is the side towards the middle of the device’s screen. Accordingly, the buttons on the left of the screen have their animation on the top right, while the buttons placed on the right of the screen have theirs on the top left.

import UIKit

class CircleView: UIView {
    
    var circleLayer: CAShapeLayer!
    
    override init(frame: CGRect){
        super.init(frame: frame)
        self.backgroundColor = UIColor.clearColor()
        
        
        // Use UIBezierPath as an easy way to create the CGPath for the layer.
        // The path should be the entire circle.
        let circlePath = UIBezierPath(arcCenter: CGPoint(x: frame.size.width / 2.0, y: frame.size.height / 2.0), radius: (frame.size.width - 10) / 4, startAngle: 0.0, endAngle: CGFloat(M_PI * 2.0), clockwise: true)
        
        //Setup the CAShapeLayer with the path, colors and line width
        circleLayer = CAShapeLayer()
        circleLayer.path = circlePath.CGPath
        circleLayer.fillColor = UIColor.clearColor().CGColor
        circleLayer.strokeColor = UIColor(white: 0.75, alpha: 0.75).CGColor
        circleLayer.lineWidth = 10.0
        
        //DO NOT DRAW THE CIRCLE INITIALLY
        circleLayer.strokeEnd = 0.0
        
        //add the circleLayer to the view's layer's sublayers
        layer.addSublayer(circleLayer)
        
        
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
    
    
    func animateCircle(duration: NSTimeInterval){
        
        //animate the strokeEnd property of the circleLayer
        let animation = CABasicAnimation(keyPath: "strokeEnd")
        
        
        //set the animation duration
        animation.duration = duration
        
        //animate from 0 (no circle) to 1 (full circle)
        animation.fromValue = 0
        animation.toValue = 1
        animation.removedOnCompletion = true

        //do a linear animation (the speed of the animation stays the same)
        animation.timingFunction = CAMediaTimingFunction(name: kCAMediaTimingFunctionLinear)
       
        animation.delegate = self
        
        //set the circleLayer's strokeEnd property to 1.0 now so that it's the right value when the animation ends
        circleLayer.strokeEnd = 1.0
        
        //do the actual animation
        circleLayer.addAnimation(animation, forKey: "animateCircle")
       
        
        
    }
   
    override func animationDidStop(anim: CAAnimation, finished flag: Bool) {
        //circleLayer.removeAnimationForKey("strokeEnd")
        circleLayer.removeFromSuperlayer()
    }
}

// Helper functions that attach a CircleView to a pad button and start its
// animation on the requested side of the button.
func animateCircleRight(button: UIButton, duration: NSTimeInterval) {
    let rect = CGRectMake(button.frame.size.width - 100, -20, CGFloat(120), CGFloat(120))
    let circleView = CircleView(frame: rect)
    button.addSubview(circleView)
    circleView.animateCircle(duration)
}

func animateCircleLeft(button: UIButton, duration: NSTimeInterval) {
    let rect = CGRectMake(-20, -20, CGFloat(120), CGFloat(120))
    let circleView = CircleView(frame: rect)
    button.addSubview(circleView)
    circleView.animateCircle(duration)
}
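
Putting the two animations together, the hypothetical padPressed action sketched in the Playing Sounds section could scale the pad and start the circle on the correct side; the even/odd check used to decide which column a pad sits in is an assumption about how the pads are numbered.

// Hypothetical extension of the earlier padPressed sketch; not part of the current code.
@IBAction func padPressed(sender: UIButton) {
    let index = sender.tag
    let player = playersArray[index]
    player.play()

    // Scale the pad briefly to acknowledge the touch.
    UIView.animateWithDuration(0.1,
        animations: { sender.transform = CGAffineTransformMakeScale(0.9, 0.9) },
        completion: { _ in
            UIView.animateWithDuration(0.1) { sender.transform = CGAffineTransformIdentity }
        })

    // Assume even-numbered pads sit in the left column and odd-numbered pads in the
    // right column, so the circle is always drawn towards the middle of the screen.
    if index % 2 == 0 {
        animateCircleRight(sender, duration: player.duration)
    } else {
        animateCircleLeft(sender, duration: player.duration)
    }
}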

 

Rivals

When it comes to potential rivals, ELoops provides a small amount of functionality compared to its main competitor and inspiration, Keezy, which, like the current project, offers 8 tiles to record and play sounds, whether for just goofing around or for sketching musical ideas. It comes with 15 different sound boards, but it also offers the possibility to record samples and the entire session.

[ https://itunes.apple.com/us/app/keezy/id605855595?mt=8 ]

Keezy – the different screens

 

Discussion

The user feedback was crucial for the development of the prototype. I was expecting the inexperienced users to react the way they did, because they have no prior experience with playing and creating sounds using a drum machine, so asking them to give feedback on all the features of the prototype would overwhelm them. Being completely new to drum machines, they also could not know how all the sound-editing features would enhance music production or the way they would play samples using the app. The unexpected feedback came from the experienced users who, to my surprise, wanted an app focused more on playing sounds than on editing options. But given the scenarios in which they would use such an app, for instance playing different sounds for a very short period of time during a live DJ set, such a design suits their needs better than one with all the editing features available.

From being a light version of the iMaschine app, the prototype evolved into a light version of the Keezy app. ELoops and Keezy are very similar when it comes to the main screen of the application: both have 8 pads with a scale animation on the touch-down event and a circle animation to display the length of each sound being played. Moreover, both offer sound kits which can be selected from a different screen of the application. It is at this point that the similarities between the two apps stop.

One extra feature that ELoops has and Keezy does not is displaying the name of the sound sample on the pad.

Other than that, Keezy offers better functionality, since it can record loops, play a sound continuously while the user holds down its pad, record new samples using the built-in microphone of the device and, of course, play and record sessions with these new samples. One more feature present in Keezy is the ability to undo actions, all the way back to the first one.

That being said, ELoops provides an intuitive and fun way of playing different sound samples, and it is unfortunate that I did not succeed in implementing any of the recording functions for the app as I had planned at the beginning of the project. Nevertheless, playing music live without any quantization restriction has its advantages (in short, a quantization function would align the samples to a fixed timeframe for the sequence being played; for instance, if a sample is 1 bar long in a 4-bar loop, the user would be allowed to play the sample only 4 times during that sequence). One of the hidden objectives of the app is to give the user the possibility to play the sounds regardless of their starting point in the timeframe of the sequence.
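
To illustrate what such a quantization function would do, the sketch below snaps the moment a pad is hit to the nearest 16th note of the grid defined by the bpm; none of this exists in ELoops, where the hit is played back exactly when it happens.

import Foundation

// Hypothetical illustration of quantization (not implemented in ELoops).
func quantizedTriggerTime(hitTime: NSTimeInterval, bpm: Double) -> NSTimeInterval {
    let slotLength = 60.0 / bpm / 4.0             // a beat lasts 60/bpm seconds; a 16th note is a quarter of that
    let nearestSlot = round(hitTime / slotLength) // index of the closest grid slot
    return nearestSlot * slotLength               // the time at which the sample would actually start
}

At 120 bpm, for instance, a hit at 1.97 seconds would be moved to 2.0 seconds, whereas ELoops plays it back at 1.97 seconds exactly.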

 

Demo

A very short demo for the app:
