How To Make a Music Visualizer in iOS
This tutorial shows you how to make your own music visualizer. You’ll learn how to play music with background audio, and make a particle system that dances to the beat of a song!
Selecting a Song
A music player that just plays one song, no matter how cool that song may be, isn’t very useful. So you’ll add the ability to play audio from the device’s music library.
If you don’t plan on running on a device, or know how to set that up already, you can skip to the next section.
The starter project you downloaded is set up so that when the user chooses a song from the media picker, a URL for the selected song is passed to playURL: inside ViewController.m. Currently, playURL: just toggles the icon on the play/pause button.
Inside ViewController.m, add the following code to playURL:, just after the comment that reads // Add audioPlayer configurations here:
self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
[_audioPlayer setNumberOfLoops:-1];
The above code is much the same as what you wrote in configureAudioPlayer. However, instead of hardcoding the filename, you create a new AVAudioPlayer instance with the URL passed into the method.
Build and run on a device, and you’ll be able to choose and play a song from your music library.
Note: If you have iTunes Match, you may see items in the media picker that are not actually on your device. If you choose a song that is not stored locally, the app dismisses the media picker and does not play the audio. So if you want to hear (and soon see) something, be sure to choose a file that’s actually there :]
While running the project on a device, press the home button. You’ll notice that your music is paused. This isn’t a very good experience for a music player application, if a music player is what you’re after.
You can configure your app so that the music will continue to play even when the app enters the background. Keep in mind that this is another feature not supported in the iPhone Simulator, so run the app on a device if you want to see how it works.
To play music in the background, you need to do two things: set the audio session category, then declare the app as supporting background execution.
First, set the audio session category.
An audio session is the intermediary between your application and iOS for configuring audio behavior. Configuring your audio session establishes basic audio behavior for your application. You set your audio session category according to what your app does and how you want it to interact with the device and the system.
Add the following new method to ViewController.m:
- (void)configureAudioSession {
    NSError *error = nil;
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];
    if (error) {
        NSLog(@"Error setting category: %@", [error localizedDescription]);
    }
}
In configureAudioSession, you get the audio session using [AVAudioSession sharedInstance] and set its category to AVAudioSessionCategoryPlayback. This identifies that the current audio session will be used for playing back audio (as opposed to recording or processing audio).
Add the following line to viewDidLoad, just before the call to [self configureAudioPlayer];:
[self configureAudioSession];
This calls configureAudioSession to configure the audio session.
Note: To learn more about audio sessions, read Apple’s Audio Session Programming Guide. Or take a look at our Background Modes in iOS Tutorial which also covers the topic, albeit not in as much detail.
Now you have to declare that your app supports background execution.
Open iPodVisualizer-Info.plist (it’s in the Supporting Files folder), select the last line, and click the plus button to add a new item. Select Required background modes as the Key from the dropdown, and the type of the item will change to Array automatically. (If it does not automatically become Array, double check the Key.)
Expand the item and set the value of Item0 to App plays audio. (If you have a wide Xcode window, you might not notice that the value is a dropdown list. You can access the list by simply tapping the dropdown icon at the end of the field.)
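If you prefer editing the plist as XML (right-click iPodVisualizer-Info.plist and choose Open As\Source Code), the same setting looks like this; the raw key behind Required background modes is UIBackgroundModes:

```xml
<!-- Raw form of "Required background modes" / "App plays audio" -->
<key>UIBackgroundModes</key>
<array>
	<string>audio</string>
</array>
```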
When you are done, build and run on a device, pick a song and play it, press the home button, and this time your music should continue to play without interruption even if your app is in the background.
Visualizing with Music
Your music visualizer will be based on a UIKit particle system. If you don’t know much about particle systems, you may want to read UIKit Particle Systems In iOS 5 or How To Make a Letter / Word Game with UIKit: Part 3/3 to familiarize yourself with the necessary background information; this tutorial does not go into detail explaining the particle system basics.
First, add the QuartzCore.framework to your project (the same way you added the AVFoundation.framework).
Now choose File/New/File…, and select the iOS/Cocoa Touch/Objective-C class template. Name the class VisualizerView, make it a subclass of UIView, click Next and then Create.
Select VisualizerView.m in the Xcode Project Navigator and change its extension from .m to .mm. (You can rename it by clicking the file twice slowly in the Project Navigator. That is, do not click it fast enough to be considered a double-click.) The .mm extension tells Xcode that this file needs to be compiled as Objective-C++, which is necessary because later it will access the C++ class MeterTable.
Open VisualizerView.mm and replace its contents with the following:
#import "VisualizerView.h"
#import <QuartzCore/QuartzCore.h>

@implementation VisualizerView {
    CAEmitterLayer *emitterLayer;
}

// 1
+ (Class)layerClass {
    return [CAEmitterLayer class];
}

- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        [self setBackgroundColor:[UIColor blackColor]];
        emitterLayer = (CAEmitterLayer *)self.layer;

        // 2
        CGFloat width = MAX(frame.size.width, frame.size.height);
        CGFloat height = MIN(frame.size.width, frame.size.height);
        emitterLayer.emitterPosition = CGPointMake(width/2, height/2);
        emitterLayer.emitterSize = CGSizeMake(width-80, 60);
        emitterLayer.emitterShape = kCAEmitterLayerRectangle;
        emitterLayer.renderMode = kCAEmitterLayerAdditive;

        // 3
        CAEmitterCell *cell = [CAEmitterCell emitterCell];
        cell.name = @"cell";
        cell.contents = (id)[[UIImage imageNamed:@"particleTexture.png"] CGImage];

        // 4
        cell.color = [[UIColor colorWithRed:1.0f green:0.53f blue:0.0f alpha:0.8f] CGColor];
        cell.redRange = 0.46f;
        cell.greenRange = 0.49f;
        cell.blueRange = 0.67f;
        cell.alphaRange = 0.55f;

        // 5
        cell.redSpeed = 0.11f;
        cell.greenSpeed = 0.07f;
        cell.blueSpeed = -0.25f;
        cell.alphaSpeed = 0.15f;

        // 6
        cell.scale = 0.5f;
        cell.scaleRange = 0.5f;

        // 7
        cell.lifetime = 1.0f;
        cell.lifetimeRange = 0.25f;
        cell.birthRate = 80;

        // 8
        cell.velocity = 100.0f;
        cell.velocityRange = 300.0f;
        cell.emissionRange = M_PI * 2;

        // 9
        emitterLayer.emitterCells = @[cell];
    }
    return self;
}

@end
The above code mainly configures a UIKit particle system, as follows:
1. Overrides layerClass to return CAEmitterLayer, which allows this view to act as a particle emitter.
2. Shapes the emitter as a rectangle that extends across most of the center of the screen. Particles are initially created within this area.
3. Creates a CAEmitterCell that renders particles using particleTexture.png, included in the starter project.
4. Sets the particle color, along with a range by which each of the red, green, and blue color components may vary.
5. Sets the speed at which the color components change over the lifetime of the particle.
6. Sets the scale and the amount by which the scale can vary for the generated particles.
7. Sets the amount of time each particle will exist to between 0.75 and 1.25 seconds, and sets it to create 80 particles per second.
8. Configures the emitter to create particles with a variable velocity, and to emit them in any direction.
9. Adds the emitter cell to the emitter layer.
Again, read the previously mentioned tutorials if you would like to know more about the fun things you can do with UIKit particle systems and how the above configuration values affect the generated particles.
Next open ViewController.m and make the following changes:
//Add with the other imports
#import "VisualizerView.h"
//Add with the other properties
@property (strong, nonatomic) VisualizerView *visualizer;
Now add the following to viewDidLoad, just before the line that reads [self configureAudioPlayer];:
self.visualizer = [[VisualizerView alloc] initWithFrame:self.view.frame];
[_visualizer setAutoresizingMask:UIViewAutoresizingFlexibleHeight | UIViewAutoresizingFlexibleWidth];
[_backgroundView addSubview:_visualizer];
This creates a VisualizerView instance that will fill its parent view and adds it to _backgroundView. (_backgroundView was defined as part of the starter project, and is just a view layered behind the music controls.)
Build and run; you’ll see the particle system in action immediately:
While that looks very cool indeed, you want the particles to “beat” in sync with your music. This is done by changing the size of particles when the decibel level of the music changes.
First, open VisualizerView.h and make the following changes:
//Add with the other imports
#import <AVFoundation/AVFoundation.h>
//Add within the @interface and @end lines
@property (strong, nonatomic) AVAudioPlayer *audioPlayer;
The new property will give your visualizer access to the app’s audio player, and hence the audio levels, but before you can use that information, you need to set up one more thing.
Switch to ViewController.m and search for setNumberOfLoops. If you skipped the section about running on the device, it will appear only once (in configureAudioPlayer); otherwise, it will appear twice (in configureAudioPlayer and in playURL:).
Add the following code just after each occurrence of the line [_audioPlayer setNumberOfLoops:-1];:
[_audioPlayer setMeteringEnabled:YES];
[_visualizer setAudioPlayer:_audioPlayer];
With the above code, you instruct the AVAudioPlayer instance to make audio-level metering data available. You then pass _audioPlayer to the _visualizer so that it can access that data.
Now switch to VisualizerView.mm and modify it as follows:
// Add with the other imports
#import "MeterTable.h"
// Change the private variable section of the implementation to look like this
@implementation VisualizerView {
    CAEmitterLayer *emitterLayer;
    MeterTable meterTable;
}
The above code gives you access to a MeterTable instance named meterTable. The starter project includes the C++ class MeterTable, which you’ll use to help process the audio levels from AVAudioPlayer.
What’s all this talk about metering? It should be easy to understand once you see the image below:
You’ve most likely seen something similar on the front of a sound system, bouncing along to the music. It simply shows you the relative intensity of the audio at any given time. MeterTable is a helper class that can be used to divide decibel values into ranges used to produce images like the one above.
You will use MeterTable to convert values into a range from 0 to 1, and you will use that new value to adjust the size of the particles in your music visualizer.
Add the following method to VisualizerView.mm:
- (void)update
{
    // 1
    float scale = 0.5;
    if (_audioPlayer.playing) {
        // 2
        [_audioPlayer updateMeters];

        // 3
        float power = 0.0f;
        for (int i = 0; i < [_audioPlayer numberOfChannels]; i++) {
            power += [_audioPlayer averagePowerForChannel:i];
        }
        power /= [_audioPlayer numberOfChannels];

        // 4
        float level = meterTable.ValueAt(power);
        scale = level * 5;
    }
    // 5
    [emitterLayer setValue:@(scale) forKeyPath:@"emitterCells.cell.scale"];
}
Each time the above method is called, it updates the size of the visualizer's particles. Here's how it works:
1. You set scale to a default value of 0.5 and then check whether or not _audioPlayer is playing.
2. If it is playing, you call updateMeters on _audioPlayer, which refreshes the AVAudioPlayer data based on the current audio.
3. This is the meat of the method. For each audio channel (e.g. two for a stereo file), the average power for that channel is added to power. The average power is a decibel value. After the powers of all the channels have been added together, power is divided by the number of channels. This means power now holds the average power, or decibel level, for all of the audio.
4. Here you pass the calculated average power value to meterTable's ValueAt method. It returns a value from 0 to 1, which you multiply by 5 and then set as the scale. Multiplying by 5 accentuates the music's effect on the scale.
5. Finally, the scale of the emitter's particles is set to the new scale value. (If _audioPlayer was not playing, this will be the default scale of 0.5; otherwise, it will be some value based on the current audio levels.)
Note: Why use meterTable to convert power's value? The reason is that it simplifies the code you have to write. Otherwise, your code would have to cover the broad range of values returned by averagePowerForChannel:. A return value of 0 indicates full scale, or maximum power; a return value of -160 indicates minimum power (that is, near silence). But the signal provided to the audio player may actually exceed the range of what's considered full scale, so values can still go beyond those limits. Using meterTable gives you a nice value from 0 to 1. No fuss, no muss.
Right now your app doesn't call update, so the new code has no effect. Fix that by modifying initWithFrame: in VisualizerView.mm, adding the following lines just after emitterLayer.emitterCells = @[cell]; (but still inside the closing curly brace):
CADisplayLink *dpLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(update)];
[dpLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSRunLoopCommonModes];
Here you set up a CADisplayLink. A CADisplayLink is a timer that allows your application to synchronize its drawing to the refresh rate of the display. That is, it behaves much like an NSTimer with a 1/60 second time interval, except that it's guaranteed to fire each time the device prepares to redraw the screen, which is usually at a rate of 60 times per second.
The first line you added above creates an instance of CADisplayLink set up to call update on the target self. That means it will call the update method you just defined during each screen refresh.
The second line calls addToRunLoop:forMode:, which starts the display link timer.
Note: Adding the CADisplayLink to a run loop is a low-level concept related to threading. For this tutorial, you just need to understand that the CADisplayLink will be called for every screen update. But if you want to learn more, you can check out the class references for CADisplayLink or NSRunLoop, or read through the Run Loops chapter in Apple's Threading Programming Guide.
Now build, run, and play some music. You'll notice that the particles change size, but they don't "beat" with the music. That's because the scale change can't affect particles that already exist on screen; only newly created particles pick up the new value.
This needs to be fixed.
Open VisualizerView.mm and modify initWithFrame: as follows:
// Remove this line
// cell.contents = (id)[[UIImage imageNamed:@"particleTexture.png"] CGImage];
// And replace it with the following lines
CAEmitterCell *childCell = [CAEmitterCell emitterCell];
childCell.name = @"childCell";
childCell.lifetime = 1.0f / 60.0f;
childCell.birthRate = 60.0f;
childCell.velocity = 0.0f;
childCell.contents = (id)[[UIImage imageNamed:@"particleTexture.png"] CGImage];
cell.emitterCells = @[childCell];
Like CAEmitterLayer, CAEmitterCell also has a property named emitterCells. This means that a CAEmitterCell can contain another CAEmitterCell, which results in particles emitting particles. That's right, folks, it's particles all the way down! :]
Also notice that you set the child's lifetime to 1/60 of a second. This means that particles emitted by childCell will live exactly as long as one screen refresh. You set birthRate to 60, which means 60 particles will be emitted per second. Since each dies in 1/60th of a second, a new particle is always created just as the previous one dies. And you thought your day was short :]
Build and run; the particle system works the same as it did before, but it still doesn't beat to the music. You can try setting birthRate to 30 to help you understand how the setting works (just don't forget to set it back to 60).
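One way to sanity-check these numbers: the average number of a cell's particles alive at any instant is roughly birthRate multiplied by lifetime. A quick C sketch (the function name is just for illustration):

```c
// Average particles alive at once = particles born per second
// multiplied by how many seconds each one lives.
static float steadyStateCount(float birthRate, float lifetime) {
    return birthRate * lifetime;
}
```

With birthRate 60 and lifetime 1/60, that's 60 * (1/60) = 1: each parent has exactly one child particle at any moment. Drop birthRate to 30 and the average falls to 0.5, meaning each parent has a child only about half the time, which is what the experiment above makes visible.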
So how do you get the particle system to beat to the music?
The last line of update currently looks like this:
[emitterLayer setValue:@(scale) forKeyPath:@"emitterCells.cell.scale"];
Replace that line with the following:
[emitterLayer setValue:@(scale) forKeyPath:@"emitterCells.cell.emitterCells.childCell.scale"];
Now build and run, and you'll see that all the particles beat with your music.
So what did the above change do?
Particles are created and destroyed at the same rate as a screen refresh. That means that every time the screen is redrawn, a new set of particles is created and the previous set is destroyed. Since new particles are always created with a size calculated from the audio-levels at that moment, the particles appear to pulse with the music.
Congratulations, you have just made a cool music visualizer application!