Made by: Jakob Høg (jasoe12) & Juri Wulff (juwul10)


YouDrum is an app aimed at drum teaching. Instead of just learning drums from thousands of independent pre-made exercises, as most drum books offer, the drum student in this app must build his own exercises from pre-made building blocks. Combining the building blocks opens up an endless array of combinations and creativity.

The foundation of the concept is based on theories from Piaget’s Constructivism and Seymour Papert’s Constructionism. Piaget argues that humans generate knowledge and meaning from an interaction between their experiences and their ideas.

“The role of the teacher is to create the conditions for invention rather than provide ready-made knowledge.”
– Seymour Papert (from: Richard Saul Wurman, Information Anxiety 2 (2001), p. 240)

With this project we aim to find out what is expected of iOS app design and how the Swift language functions compared to, for example, Objective-C and Java. We want to explore both of these areas, but because we are new to iOS development and the Swift language, we have done a lot of exploring of Swift syntax in general. Most of the tutorials we have followed are not directly related to this app project but were needed to gain knowledge of the language.

Methods and materials

We brainstormed about several ideas we had in our heads before starting this course. We settled on Jakob’s idea for a music teaching app, which came from his experience as a drum teacher at multiple music schools. The concept of building blocks as a foundation for learning and creating music appealed to both of us. We quickly brainstormed from the basic idea and turned it into a concept upon which we built a paper prototype.


After discussing the paper prototype, we started to find out what the requirements for the app were and divided them into “must have” and “nice to have” categories:

Must have:

  • at least three building blocks
  • two “build-to” areas that the building blocks can be built upon
  • ability to drag the building blocks from one area to another
  • a function to check which building blocks are connected
  • a play/stop button that can play the sound of the connected blocks
  • an app designed for iPad in Portrait mode

Nice to have:

  • a looping functionality for the sound playing
  • a metronome that can be switched on or off
  • a focus function where one building block can be zoomed in on (multi-touch pinching) and only the sound of that particular building block will be heard
  • an edit function where the building blocks can be edited with different notes
  • a quick delete function where a downward swipe on a built block would remove it from the build area
  • ability to add “build-to” areas so three or more connected blocks would be possible
  • a universal app designed for both iPhones and iPads in both Portrait and Landscape modes

Use case diagram based on the must-have and nice-to-have lists:


We then tried to sketch out how the model for the MVC design pattern would look, focusing on the “must haves” from the list above, and started sketching out some classes as they might be:

BuildingBlock class

PlayCreation class
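These early class sketches only existed on paper; below is a hedged Swift sketch of how the two model classes might look. Everything beyond the class names BuildingBlock and PlayCreation (properties, method names, the .aif file reference) is our assumption, not code from the app:

```swift
// Sketch only (Swift 1.x style): a possible shape for the model classes.
class BuildingBlock {
    let title: String          // e.g. "beat1"
    let soundFileName: String  // an .aif sample in the app bundle

    init(title: String, soundFileName: String) {
        self.title = title
        self.soundFileName = soundFileName
    }
}

class PlayCreation {
    var blocks = [BuildingBlock]() // the connected blocks, in playing order

    func addBlock(block: BuildingBlock) {
        blocks.append(block)
    }
}
```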

The next step for our app project was many smaller test apps focusing on small parts of the design. Juri worked on the sound functionality and Jakob worked on the GUI elements, making them work with touch-and-drag functionality.

The GUI design elements also needed to be made. We read most of Apple’s design guidelines, keeping them in mind as we designed the elements. Some of the points from the guidelines that caught our attention were:

  • “As much as possible, avoid displaying a splash screen or other startup experience. It’s best when users can begin using your app immediately.”
  • “Make it easy to focus on the main task by elevating important content or functionality. Some good ways to do this are to place principal items in the upper half of the screen and—in left-to-right cultures—near the left side of the screen”
  • “Use visual weight and balance to show users the relative importance of onscreen elements. Large items catch the eye and tend to appear more important than smaller ones. Larger items are also easier for users to tap, which makes them especially useful in apps—such as Phone and Clock—that users often use in distracting surroundings.”
  • “Make it easy for people to interact with content and controls by giving each interactive element ample spacing. Give tappable controls a hit target of about 44 x 44 points.”

We then created a mockup in a vector design program called iDraw (see picture below).


All the elements from the mockup could be exported separately and used as GUI elements within the app afterwards. iDraw has some great export options, which include auto-creation and naming of the @2x images needed for Retina iOS devices.

After the UI design was done, all the GUI elements were imported into our Xcode project. We used the Xcode Interface Builder in Storyboard to set up the elements within our Single View app. Using Auto Layout features, we tried to make the elements as generic as possible, so they would work in different screen rotations and preferably at many screen resolutions and sizes. Our main focus was to make the app work on the iPad, though.

After that we began to connect the elements with our code from the test apps. We implemented all of our code in the ViewController. We had no concerns regarding the MVC pattern at this stage, even though we hadn’t forgotten about it. We figured it would be faster to create a functional prototype with only a View and a ViewController to work on.


Our brainstorm and conceptual designs made us focus on simplicity and the basic, most important parts of the app. The paper prototype made us conscious of the sizing of the GUI elements. Breaking the code up into smaller test apps worked well for us and was an easy way to collaborate. We did not have any issues combining our code in the end.

Here follow some code highlights.

let scaleConstant: CGFloat = 1.6
let intersectBoundaries: Float = 130.0 - 10.0
var intersectID = 0

The scaleConstant is used to resize the building blocks whenever the user drags them around. If they are not in the top drag-to area or in the middle of being dragged, they are downscaled to 1/1.6. The intersectBoundaries constant is used for calibrating the collision check function. The intersectID variable changes between 0 and 2 in the functions described in more detail below. It will always be 0 if a dragged building block is not in the top area of the screen. If the block intersects the drag-to area, it will be either 1 or 2, depending on whether it intersects the left or the right placeholder.
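The function scaleBuildingBlock() is referenced throughout the excerpts below but never shown in full. A minimal sketch of what it might look like, assuming it simply applies scaleConstant as a transform (the implementation details are our guess):

```swift
// Sketch only, assuming scaleConstant = 1.6 as declared above.
func scaleBuildingBlock(up: Bool, draggedBlock: UIView) {
    // up == true: enlarge by scaleConstant; up == false: shrink back by 1/scaleConstant
    let factor: CGFloat = up ? scaleConstant : 1.0 / scaleConstant
    draggedBlock.transform = CGAffineTransformScale(draggedBlock.transform, factor, factor)
}
```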

var buildingBlockStartPositionBeat1 = CGPoint()
var buildingBlockStartPositionBeat2 = CGPoint()
var buildingBlockStartPositionBeat3 = CGPoint()
var buildingBlockStartPositionBeat4 = CGPoint()

override func viewDidLoad() {
    super.viewDidLoad()
    // saves start position of building blocks - used for snapToPlace()
    buildingBlockStartPositionBeat1 = buildingBlockBeat1.frame.origin
    buildingBlockStartPositionBeat2 = buildingBlockBeat2.frame.origin
    buildingBlockStartPositionBeat3 = buildingBlockBeat3.frame.origin
    buildingBlockStartPositionBeat4 = buildingBlockBeat4.frame.origin

The buildingBlockStartPosition variables are set within viewDidLoad(). Because the coordinates aren’t hardcoded, we need to know the original positions so we can make the blocks snap back to them if the user wants.

@IBAction func handleDrag(recognizer: UIPanGestureRecognizer) {
    let translation = recognizer.translationInView(self.view)
    recognizer.view!.center = CGPoint(x: recognizer.view!.center.x + translation.x,
                                      y: recognizer.view!.center.y + translation.y)
    recognizer.setTranslation(CGPointZero, inView: self.view)

This function uses the UIPanGestureRecognizer touch event, which is attached to all the building blocks in Interface Builder in the Storyboard scene. It’s taken from a Ray Wenderlich tutorial and enables the basic dragging of all the building blocks.

    // check for collision / intersection
    checkForBuildingBlockIntersection(recognizer.view!)

Whenever the user drags a building block, this call checks whether it collides with the “build-to” area. More about this function follows below.

    if recognizer.state == .Began {
        recognizer.view!.layer.zPosition = 1 // move dragged block to front
        if recognizer.view!.frame.intersects(dragToAreaView.frame) != true {
            scaleBuildingBlock(true, draggedBlock: recognizer.view!)
        }
    }

When the user starts to drag a block, the view’s z-position is moved to the front, and if the block is not already placed in the top drag-to-area view, it is upscaled with the function scaleBuildingBlock().

    if recognizer.state == .Ended {
        recognizer.view!.layer.zPosition = 0 // move dragged block back to 0
        snapToPlace(recognizer.view!)
    }
}

When the user finishes dragging, the view’s z-position is moved back to 0 and the block snaps into place according to what snapToPlace() tells it.

  func checkForBuildingBlockIntersection(draggedBlock:UIView) {    
    if dragToAreaView.frame.intersects(draggedBlock.frame) {

This function checks whether a dragged building block is close to one of the drag-to areas, using one of UIKit’s built-in functions that checks whether a UIView intersects another. It will only run the code below if the block intersects the container view at the top.

      // distance from the dragged block's center to each placeholder's center
      // (the placeholder centers are offset by the container view's origin)
      var deltaX1 = draggedBlock.center.x - (dragToArea1.center.x + dragToAreaView.frame.origin.x)
      var deltaY1 = draggedBlock.center.y - (dragToArea1.center.y + dragToAreaView.frame.origin.y)
      var distance1 = sqrtf(Float(deltaX1 * deltaX1 + deltaY1 * deltaY1))
      println("distance 1: \(distance1)")
      var deltaX2 = draggedBlock.center.x - (dragToArea2.center.x + dragToAreaView.frame.origin.x)
      var deltaY2 = draggedBlock.center.y - (dragToArea2.center.y + dragToAreaView.frame.origin.y)
      var distance2 = sqrtf(Float(deltaX2 * deltaX2 + deltaY2 * deltaY2))
      println("distance 2: \(distance2)")
      if distance1 <= intersectBoundaries {
        intersectID = 1
      } else if distance2 <= intersectBoundaries {
        intersectID = 2
      } else {
        intersectID = 0
      }
    }
  }

This code is a “home-made” collision check function built on the Pythagorean theorem, c² = a² + b². It basically calculates the distance from the center of a dragged building block to the center of each of the drag-to areas. If the distance is less than or equal to the intersectBoundaries constant, it changes the temporary intersectID variable and highlights the right area with the setHighlight() function.
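The same Pythagorean check can be written as a small stand-alone helper. This is a sketch for illustration; distanceBetween() is our own name and the coordinate values are made up, not taken from the app:

```swift
import UIKit

// Stand-alone version of the distance check described above.
func distanceBetween(a: CGPoint, b: CGPoint) -> Float {
    let dx = Float(a.x - b.x)
    let dy = Float(a.y - b.y)
    return sqrtf(dx * dx + dy * dy)   // c = sqrt(a² + b²)
}

let intersectBoundaries: Float = 120.0
let blockCenter = CGPoint(x: 100, y: 80)
let areaCenter = CGPoint(x: 150, y: 120)
// distance = sqrt(50² + 40²) ≈ 64, which is within the boundary
let isCloseEnough = distanceBetween(blockCenter, areaCenter) <= intersectBoundaries
```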

  func snapToPlace(draggedBlock: UIView) {
    switch intersectID {
    case 1:
      draggedBlock.frame.origin = CGPoint(x: dragToArea1.frame.origin.x + dragToAreaView.frame.origin.x,
                                          y: dragToArea1.frame.origin.y + dragToAreaView.frame.origin.y)
    case 2:
      draggedBlock.frame.origin = CGPoint(x: dragToArea2.frame.origin.x + dragToAreaView.frame.origin.x,
                                          y: dragToArea2.frame.origin.y + dragToAreaView.frame.origin.y)
    default:
      // not dropped on a placeholder: downscale the block and send it back
      scaleBuildingBlock(false, draggedBlock: draggedBlock)
      switch draggedBlock.tag {
      case 1: buildingBlockBeat1.frame.origin = buildingBlockStartPositionBeat1
      case 2: buildingBlockBeat2.frame.origin = buildingBlockStartPositionBeat2
      case 3: buildingBlockBeat3.frame.origin = buildingBlockStartPositionBeat3
      case 4: buildingBlockBeat4.frame.origin = buildingBlockStartPositionBeat4
      default: println("???")
      }
    }
  }
This function makes sure that the building blocks snap to the right place, and it turns off highlights if they exist.
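setHighlight() is mentioned above but never shown. One way it could work, under the assumption that highlighting simply dims the non-targeted placeholder (the alpha values and the body are our guess):

```swift
// Sketch only: highlight the placeholder the dragged block is closest to.
func setHighlight(areaID: Int) {
    // areaID matches intersectID: 1 = left placeholder, 2 = right, 0 = none
    dragToArea1.alpha = (areaID == 1) ? 1.0 : 0.5
    dragToArea2.alpha = (areaID == 2) ? 1.0 : 0.5
}
```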

@IBOutlet var playButton: UIButton!
var player = AVQueuePlayer()
var counter = 1
var titlesForBeats = ["beat1", "beat2", "beat3", "beat4"]
var placeHolderOne = 5
var placeHolderTwo = 5

var urlOne: NSURL {
    return NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource(titlesForBeats[placeHolderOne], ofType: "aif")!)!
}
var urlTwo: NSURL {
    return NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource(titlesForBeats[placeHolderTwo], ofType: "aif")!)!
}
var soundItems: NSArray {
    return [AVPlayerItem(URL: urlOne), AVPlayerItem(URL: urlTwo)]
}

These variables are used for playing the beat samples. The beats are played with an AVQueuePlayer, which takes an array of AVPlayerItem objects.

var nc = NSNotificationCenter.defaultCenter()
nc.addObserver(self, selector: "playerItemDidReachEnd:", name: AVPlayerItemDidPlayToEndTimeNotification, object: nil)

Placed in viewDidLoad(), an NSNotificationCenter observer is registered. The notification center is used to check whether a sound has finished playing: playerItemDidReachEnd() is called whenever an AVPlayerItemDidPlayToEndTimeNotification is posted.

func playerItemDidReachEnd(notification: NSNotification) {
    print("Did finish one beat \n")
    if counter == 2 {
        counter = 0
        // ...
    }
}

@IBAction func playBeat(sender: AnyObject) {
    if (playButton.selected == false) && (placeHolderOne < 5) && (placeHolderTwo < 5) {
        counter = 0
        playButton.selected = true
        initAudioPlayer()   // build the queue and start playing
        player.play()
    } else {
        counter = 3
        playButton.selected = false
        player.pause()
    }
}

func initAudioPlayer() {
    player = AVQueuePlayer(items: soundItems)
}

These are the functions (called methods in the Swift programming language) used for handling the beat samples.
The counter variable works as a start/stop flag for the playback of the beat samples. The two variables placeHolderOne and placeHolderTwo are index numbers into the array of strings that holds all the beat sample titles. The value 5 is the default and lies out of the array’s range, so it should never be looked up.
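The excerpts above never show how a dropped block actually ends up in placeHolderOne or placeHolderTwo. A hedged sketch of how that wiring could look; the function name and the tag-to-index mapping are our assumptions, not code from the app:

```swift
// Sketch only: map a dropped block to a placeholder index via intersectID.
func assignBlockToPlaceholder(droppedBlock: UIView) {
    // building block tags run 1-4; titlesForBeats indices run 0-3
    switch intersectID {
    case 1: placeHolderOne = droppedBlock.tag - 1
    case 2: placeHolderTwo = droppedBlock.tag - 1
    default: break   // dropped outside both placeholders
    }
}
```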

Video of the app, screen recorded from an iPad 4.


Link to complete project, including code:


We have succeeded in creating an app that demonstrates the principles behind our goals. We managed to include all of our “must have” features and some of the “nice to have” features. Some of the functionality is only half-working, though.

The looping of the building block sounds has a minor but noticeable time gap when it starts over, and the metronome is not synchronised with the building block sounds.

The dragging only responds once the user starts to drag; if the user merely touches a building block, nothing happens. The block should probably also scale when the user only taps it.

The code is somewhat generic, but it has no real “Model” in it. A lot of the code needs to be moved out of the view controller to make it even more generic.


The layout design is built upon the Auto Layout feature of Xcode, but it does not work very well on an iPhone yet. The layout of the elements also needs some fine-tuning.

We have gained a lot of knowledge of the Swift language, but we still have a lot to learn. Compared to both Java and Objective-C, we found the syntax both fun and quick. Swift is still a new language, so the community does not offer as many tutorials as older languages do.
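A small, self-contained illustration (Swift 1.x syntax) of what made the language feel quick to us compared to Java and Objective-C: type inference, string interpolation and enumerated loops without boilerplate:

```swift
let beats = ["beat1", "beat2", "beat3", "beat4"]   // type inferred as [String]
for (index, beat) in enumerate(beats) {
    println("Block \(index + 1) plays \(beat).aif")  // string interpolation
}
```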

The future perspective for this app, besides making a model for the ViewController, would be to create a database for the building blocks and make the app and the views more flexible. Perhaps themes of exercises, blocks and even more creativity. The sound issues also need to be fixed.


  • tutorials, including:
  • Sounds and Swift:
  • iBook, iOS Human Interface Guidelines, Apple
  • iBook, The Swift Programming Language, Apple
  • MIT Media Lab,
