The Musical UIButton
When should a musical note start playing? When should it stop?
This is something I’m regularly asking myself and discussing with others as we construct new musical apps.
In this article I cover the basics of using a button, means of interaction, and the impact these choices have on musical expression.
I think it’s helpful to borrow a thought from the history of synthesizers.
The buttons, switches, and sliders we commonly associate with electronic instruments come from industrial backgrounds. They were first built to control machinery and somewhere along the path of innovation, they were co-opted by creative engineers and repurposed to sculpt and control sound.
As creative engineers of the touch screen, we must do the same. We take elements built for one purpose and engineer them to express music instead.
A button is one of the most fundamental user interface elements on touch screens. Its use is simple. A button is touched and (usually) something happens.
Generally, a button is placed on an interface and associated with an action, most commonly by attaching the touchUpInside event to a method. You tap the button, it works, and you don't give it much more thought.
The reason we use touchUpInside is to signal intention. Waiting until the user lifts their finger gives them an opportunity to change their mind. You place a finger down, decide maybe that wasn't the right choice, slide your finger off, and lift. No harm done.
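In UIKit that standard pattern looks something like the sketch below; `PlaybackViewController`, `playButton`, and `togglePlayback` are names invented for illustration, not anything from a real app:

```swift
import UIKit

class PlaybackViewController: UIViewController {
    // Hypothetical button for this example.
    let playButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        playButton.setTitle("Play", for: .normal)
        // Nothing happens until the finger lifts inside the button,
        // giving the user a chance to slide off and cancel.
        playButton.addTarget(self, action: #selector(togglePlayback),
                             for: .touchUpInside)
        view.addSubview(playButton)
    }

    @objc func togglePlayback() {
        // Start or stop playback here.
    }
}
```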
This is a great pattern for common workflows, but not as interesting for music. To make a button musically expressive, we may need to unlearn a few general patterns and think a little deeper about interacting with an everyday control.
As we walk through some of the events, I'll use a touchable piano as a metaphor for designing musical interactions.
A button is a control and fires a set of common events. As related to touch, they are:
- touchDown
- touchDownRepeat
- touchDragInside
- touchDragOutside
- touchDragEnter
- touchDragExit
- touchUpInside
- touchUpOutside
- touchCancel
With touchDown, the sound begins immediately as you press, just as on a piano key. This makes the button feel very responsive, at the cost of occasional unintended sounds. You can't back out by dragging your finger off; you live with the consequence of a mistaken note.
When you lift your finger, the sound stops. touchUpInside is perfect for this, and touchUpOutside handles the case where you move your finger outside the button before lifting.
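A minimal sketch of that wiring for a single key, assuming some synth object with note-on/note-off calls that you supply (the synth calls here are placeholder comments):

```swift
import UIKit

// One piano key: start the note on touchDown, stop it however the touch ends.
final class PianoKeyController: NSObject {
    let key = UIButton(type: .custom)

    override init() {
        super.init()
        // Sound begins the instant the finger lands.
        key.addTarget(self, action: #selector(noteOn), for: .touchDown)
        // Sound stops on any kind of release or cancellation.
        key.addTarget(self, action: #selector(noteOff),
                      for: [.touchUpInside, .touchUpOutside, .touchCancel])
    }

    @objc func noteOn()  { /* synth.startNote(60) */ }
    @objc func noteOff() { /* synth.stopNote(60) */ }
}
```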
Tracking Touch Areas
If you look carefully at how a button interacts with a touch, you’ll notice there is a touch area larger than the control. As you drag your finger away, you see the button continues to track your touch after you’ve left the graphic. This accommodates user error, letting people be a little more sloppy with their interactions. It seems that under the hood Apple is expanding the tracking area by 100 points after a touch is started.
This 100-point expanded area is the boundary that touchDragEnter and touchDragExit fire across, and it also determines whether touchUpInside or touchUpOutside is fired on release.
Because we don’t have an actual physical object to touch, it’s helpful to indicate visually when touches are occurring. Controls use a highlighted state to indicate when a touch is being tracked. It’s common to alter the button image to indicate this.
By default, a UIButton is highlighted while a touch is within the trackable area, and not once the touch moves outside it.
Given the default touch tracking areas and highlighted states of buttons, it makes sense to have a touchable piano triggered in the same way. If a control is highlighted, it should be making a sound.
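One way to make "highlighted means sounding" literal is a small UIButton subclass that reports highlight changes; `onHighlightChange` and `SoundingButton` are names invented for this sketch:

```swift
import UIKit

// A button that tells you when its highlighted state flips,
// so sound can follow highlight exactly.
final class SoundingButton: UIButton {
    var onHighlightChange: ((Bool) -> Void)?

    override var isHighlighted: Bool {
        didSet {
            guard isHighlighted != oldValue else { return }
            onHighlightChange?(isHighlighted)  // true: start sound, false: stop
        }
    }
}
```

Wire `onHighlightChange` to your note-on/note-off calls and the sound tracks the same boundary the visual state does.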
Sound triggering events:
- touchDown
- touchDragEnter
Sound stopping events:
- touchUpInside
- touchUpOutside
- touchDragExit
- touchCancel
Discrete vs Continuous Events
Some events fire just once in an event cycle, like touchDown, touchCancel, and the various touchUp events. Others fire continuously while their condition holds, like touchDragInside and touchDragOutside. touchDragEnter and touchDragExit are somewhere in between: they fire every time a drag crosses the border, but only once per crossing.
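The triggering and stopping rules above, together with the discrete/continuous distinction, can be sketched as one plain-Swift function (the enum and function names are mine, not UIKit's):

```swift
// Models whether a note should be sounding after each touch event.
enum TouchEvent {
    case touchDown, touchDragEnter              // trigger sound
    case touchUpInside, touchUpOutside,
         touchDragExit, touchCancel             // stop sound
    case touchDragInside, touchDragOutside      // continuous; no change
}

func noteShouldSound(after event: TouchEvent, current: Bool) -> Bool {
    switch event {
    case .touchDown, .touchDragEnter:
        return true
    case .touchUpInside, .touchUpOutside, .touchDragExit, .touchCancel:
        return false
    case .touchDragInside, .touchDragOutside:
        // These fire repeatedly while a drag continues,
        // but they don't change whether the note sounds.
        return current
    }
}
```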
Unlike a traditional piano, a piano app will be interrupted by phone calls and other notifications. You'll typically want to stop sounds when cancellation events like touchCancel fire as well.
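Touch cancellation covers the UIKit side, but audio interruptions also arrive separately through AVAudioSession. A sketch of listening for them, where `stopAllNotes` stands in for whatever "silence everything" call your sound engine provides:

```swift
import AVFoundation

// Silences the synth when a phone call or alarm takes over the audio session.
final class InterruptionGuard: NSObject {
    let stopAllNotes: () -> Void

    init(stopAllNotes: @escaping () -> Void) {
        self.stopAllNotes = stopAllNotes
        super.init()
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(handleInterruption(_:)),
            name: AVAudioSession.interruptionNotification,
            object: nil)
    }

    @objc private func handleInterruption(_ note: Notification) {
        guard let raw = note.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
              let type = AVAudioSession.InterruptionType(rawValue: raw),
              type == .began else { return }
        stopAllNotes()
    }

    deinit {
        NotificationCenter.default.removeObserver(self)
    }
}
```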
Throwing it All Away
To better understand actions and states, it's useful to lean on metaphors like a touchable piano. However, as creative engineers, we should not limit our ambitions to reproducing familiar instrument interactions. Now that we know how to make button-sound interactions obvious, it may be a useful exercise to throw this information away.
Some interesting questions to ask:
- What are the trade-offs between responsiveness and musical intention?
- What impact does touch tracking area have on the playability of a touch instrument?
- How might playing styles differ between learning, performing, and composing?
- How can discrete events (on/off) and continuous events (dragging near/far) be used to enhance musical expression?
I'm Greg and I work as a freelance developer building new music technology with artists, startups, and institutions. Connect with me at Artful Medium or check out my new book, The Creative Guide to Building Music Apps.