AVPlayer

AVFoundation provides a controller class, AVPlayer, to play timed audio-visual media. AVPlayer can handle playback of local files, progressively downloaded media, and streaming media that conforms to the HLS protocol.

AVPlayer is just a controller. It can play, pause, and jump to a certain time in a video, but it cannot display the video to users by itself. To display the video in the user interface, we need to use AVPlayerLayer.
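
As a minimal sketch (the URL below is only a placeholder), an AVPlayer can be created straight from a URL and told to play; the video will not be visible until the player is attached to an AVPlayerLayer, described next.

#import <AVFoundation/AVFoundation.h>

// Minimal sketch: create a player from a URL and start playback.
// videoURL is a placeholder; it can point to a local file, a progressive download, or an HLS playlist.
NSURL *videoURL = [NSURL URLWithString:@"https://example.com/stream/playlist.m3u8"];
AVPlayer *player = [AVPlayer playerWithURL:videoURL];
[player play]; // no video is visible yet; an AVPlayerLayer is needed for display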

AVPlayerLayer

AVPlayerLayer is built on top of Core Animation. It extends CALayer, so it can be used like any other CALayer: it can be set as the backing layer of a UIView or NSView, or added to an existing layer hierarchy.
AVPlayerLayer has a videoGravity property that specifies how the video is displayed within the layer's bounds; a usage sketch follows the constants below.

typedef NSString * AVLayerVideoGravity NS_STRING_ENUM;

/*!
@constant AVLayerVideoGravityResizeAspect
@abstract Preserve aspect ratio; fit within layer bounds.
*/
AVF_EXPORT AVLayerVideoGravity const AVLayerVideoGravityResizeAspect NS_AVAILABLE(10_7, 4_0); // default


/*!
@constant AVLayerVideoGravityResizeAspectFill
@abstract Preserve aspect ratio; fill layer bounds.
*/
AVF_EXPORT AVLayerVideoGravity const AVLayerVideoGravityResizeAspectFill NS_AVAILABLE(10_7, 4_0);

/*!
@constant AVLayerVideoGravityResize
@abstract Stretch to fill layer bounds.
*/
AVF_EXPORT AVLayerVideoGravity const AVLayerVideoGravityResize NS_AVAILABLE(10_7, 4_0);
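
As a usage sketch (assuming a UIViewController that already has a self.player), the player layer can be attached to the view's layer and its videoGravity set to one of the constants above:

// Sketch: attach an AVPlayerLayer to a view and choose a video gravity
// (assumes self.player already exists inside a UIViewController)
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
playerLayer.frame = self.view.bounds;
playerLayer.videoGravity = AVLayerVideoGravityResizeAspect; // default: fit within the layer bounds
[self.view.layer addSublayer:playerLayer];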

AVPlayerItem

AVPlayerItem stores a reference to an AVAsset object. AVPlayerItem is a dynamic object: many of its property values can change during the item's preparation and playback, and we can use KVO to observe these changes as they occur. For instance, AVPlayerItem has a status property that indicates whether the item is ready for playback. When you first create a player item, its status is AVPlayerItemStatusUnknown, meaning it is not yet ready for playback. We need to wait until the status changes to AVPlayerItemStatusReadyToPlay before using it.

Creating AVPlayer

// Example: creating an AVPlayer

// Context pointer used to identify our KVO registration
static void *PlayerItemContext = &PlayerItemContext;

@property (strong, nonatomic) AVPlayer *player;
@property (strong, nonatomic) AVPlayerItem *playerItem;

- (void)initPlayer:(NSURL *)url {
    // Create an AVAsset
    AVAsset *asset = [AVAsset assetWithURL:url];

    // Create an AVPlayerItem with the asset
    self.playerItem = [AVPlayerItem playerItemWithAsset:asset];

    // Register KVO to observe changes of the item's status
    NSKeyValueObservingOptions options =
        NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew;

    [self.playerItem addObserver:self
                      forKeyPath:@"status"
                         options:options
                         context:&PlayerItemContext];

    // Create an AVPlayer with the player item
    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];

    // Create a player layer to display the video content
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    playerLayer.frame = self.view.bounds;

    // Add the player layer to the current view hierarchy
    [self.view.layer addSublayer:playerLayer];
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary<NSString *, id> *)change
                       context:(void *)context {
    // Only handle observations for the PlayerItemContext
    if (context != &PlayerItemContext) {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
        return;
    }

    if ([keyPath isEqualToString:@"status"]) {
        AVPlayerItemStatus status = AVPlayerItemStatusUnknown;
        // Get the status change from the change dictionary
        NSNumber *statusNumber = change[NSKeyValueChangeNewKey];
        if ([statusNumber isKindOfClass:[NSNumber class]]) {
            status = statusNumber.integerValue;
        }
        // Switch over the status
        switch (status) {
            case AVPlayerItemStatusReadyToPlay:
                // Ready to play
                break;
            case AVPlayerItemStatusFailed:
                // Failed. Examine AVPlayerItem.error
                break;
            case AVPlayerItemStatusUnknown:
                // Not ready
                break;
        }
    }
}

Manage Playback Time

CMTime is the structure that describes media time. A CMTime is represented as a rational number, with a numerator (an int64_t value) and a denominator (an int32_t timescale).

typedef struct {
    CMTimeValue value;     // think of it as the total number of frames
    CMTimeScale timescale; // think of it as the number of frames per second
    CMTimeFlags flags;
    CMTimeEpoch epoch;
} CMTime;

AVPlayerItem has a duration property, which is a CMTime. We can get the total duration in seconds by:

double seconds = (double)playerItem.duration.value / playerItem.duration.timescale;

// or
double seconds = CMTimeGetSeconds(playerItem.duration);

We can create a CMTime with CMTimeMake(int64_t value, int32_t timescale):

CMTime t1 = CMTimeMake(1, 10); // 0.1s
CMTime t2 = CMTimeMake(2, 1); // 2s
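
CMTime values can also be compared and combined with the Core Media utility functions; a small sketch:

// Sketch: a few CMTime utilities from Core Media
CMTime sum = CMTimeAdd(t1, t2);           // 0.1 s + 2 s = 2.1 s
Float64 seconds = CMTimeGetSeconds(sum);  // 2.1
if (CMTimeCompare(t1, t2) < 0) {
    // t1 is earlier than t2
}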

AVPlayer has the following methods to manage time.

/* Returns the current time of the current player item. */
- (CMTime)currentTime;

/* Sets the current playback time to the time specified by the date object.
*/
- (void)seekToDate:(NSDate *)date;

/* Sets the current playback time to the specified time and executes the specified block when the seek operation completes or is interrupted. */
- (void)seekToDate:(NSDate *)date completionHandler:(void (^)(BOOL finished))completionHandler NS_AVAILABLE(10_7, 5_0);

/* Sets the current playback time to the specified time. */
- (void)seekToTime:(CMTime)time;

/* Sets the current playback time within a specified time bound. */
- (void)seekToTime:(CMTime)time toleranceBefore:(CMTime)toleranceBefore toleranceAfter:(CMTime)toleranceAfter;

/* Sets the current playback time to the specified time and executes the specified block when the seek operation completes or is interrupted.
*/
- (void)seekToTime:(CMTime)time completionHandler:(void (^)(BOOL finished))completionHandler NS_AVAILABLE(10_7, 5_0);

/* Sets the current playback time within a specified time bound and invokes the specified block when the seek operation has either been completed or been interrupted.
*/
- (void)seekToTime:(CMTime)time toleranceBefore:(CMTime)toleranceBefore toleranceAfter:(CMTime)toleranceAfter completionHandler:(void (^)(BOOL finished))completionHandler NS_AVAILABLE(10_7, 5_0);
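A usage sketch: a precise seek (tolerances of kCMTimeZero) with a completion handler; the 60-second mark is just an illustrative value.

// Sketch: seek precisely to the 60-second mark; zero tolerances request an exact seek
CMTime target = CMTimeMakeWithSeconds(60.0, NSEC_PER_SEC);
[self.player seekToTime:target
        toleranceBefore:kCMTimeZero
         toleranceAfter:kCMTimeZero
      completionHandler:^(BOOL finished) {
    if (finished) {
        [self.player play];
    }
}];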

Observing Time

AVPlayer also provides methods for observing time.

/* Requests the periodic invocation of a given block during playback to report changing time. */
- (id)addPeriodicTimeObserverForInterval:(CMTime)interval queue:(nullable dispatch_queue_t)queue usingBlock:(void (^)(CMTime time))block;

/* Requests the invocation of a block when specified times are traversed during normal playback. */
- (id)addBoundaryTimeObserverForTimes:(NSArray<NSValue *> *)times queue:(nullable dispatch_queue_t)queue usingBlock:(void (^)(void))block;

/* Cancels a previously registered periodic or boundary time observer. */
- (void)removeTimeObserver:(id)observer;

For example, we can update the player transport UI by adding a periodic time observer.

// Update the transport UI
- (void)addPeriodicTimeObserver {
    // Invoke the callback every half second
    CMTime interval = CMTimeMakeWithSeconds(0.5, NSEC_PER_SEC);
    // Queue on which to invoke the callback
    dispatch_queue_t mainQueue = dispatch_get_main_queue();
    // Add the time observer; use a weak reference to self to avoid a retain cycle
    __weak typeof(self) weakSelf = self;
    self.timeObserverToken =
        [self.player addPeriodicTimeObserverForInterval:interval
                                                   queue:mainQueue
                                              usingBlock:^(CMTime time) {
            // Update the player transport UI with the current time
            float currentPlayTime = (float)CMTimeGetSeconds(time);
            [weakSelf updateTransportBar:currentPlayTime];
        }];
}

- (void)dealloc {
[_player removeTimeObserver:_timeObserverToken];
}
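
Similarly, a boundary time observer fires when playback crosses specific times. Here is a sketch that logs when the 25% and 50% marks are reached; self.boundaryObserverToken is an assumed property kept for later removal, and the method should be called once the item is ready so its duration is known.

- (void)addBoundaryTimeObserver {
    // Assumes the item is ready to play, so its duration is valid
    CMTime duration = self.player.currentItem.duration;
    NSArray<NSValue *> *times = @[
        [NSValue valueWithCMTime:CMTimeMultiplyByFloat64(duration, 0.25)],
        [NSValue valueWithCMTime:CMTimeMultiplyByFloat64(duration, 0.5)]
    ];
    __weak typeof(self) weakSelf = self;
    // boundaryObserverToken is an assumed property used to remove the observer later
    self.boundaryObserverToken =
        [self.player addBoundaryTimeObserverForTimes:times
                                                queue:dispatch_get_main_queue()
                                           usingBlock:^{
            NSLog(@"Boundary time reached at %f s",
                  CMTimeGetSeconds(weakSelf.player.currentTime));
        }];
}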

Handle Playback End

AVPlayerItem posts several notifications for different activities:

  1. AVPlayerItemTimeJumpedNotification
  2. AVPlayerItemDidPlayToEndTimeNotification
  3. AVPlayerItemFailedToPlayToEndTimeNotification
  4. AVPlayerItemPlaybackStalledNotification
  5. AVPlayerItemNewAccessLogEntryNotification
  6. AVPlayerItemNewErrorLogEntryNotification

We can use these notifications to track playback. To handle playback reaching the end:

// Add observer
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handlePlayerDidFinishPlaying:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:[self.player currentItem]];

- (void)handlePlayerDidFinishPlaying:(NSNotification *)notification {
    AVPlayerItem *playerItem = (AVPlayerItem *)notification.object;

    // Jump back to the beginning of the video
    [playerItem seekToTime:kCMTimeZero];
    // Pause the video
    [self.player pause];
}
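
The other notifications in the list above can be handled the same way. For instance, a sketch for AVPlayerItemPlaybackStalledNotification, which is posted when playback stalls (for example, while a stream is buffering):

// Sketch: observe playback stalls on the current item
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handlePlaybackStalled:)
                                             name:AVPlayerItemPlaybackStalledNotification
                                           object:[self.player currentItem]];

- (void)handlePlaybackStalled:(NSNotification *)notification {
    // Playback has stalled; e.g. show a loading indicator until playback resumes
}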

Handle Interruption

When AVPlayer gets interrupted (for example, by an incoming phone call), the current playback pauses. We can listen for the interruption notification posted by AVAudioSession, but don't forget to activate the AVAudioSession first.

// Configure the AVAudioSession and activate it when you initialize your player controller
// Set the audio session category; use AVAudioSessionCategoryPlayback for video playback
AVAudioSession *session = [AVAudioSession sharedInstance];

NSError *error;
if (![session setCategory:AVAudioSessionCategoryPlayback error:&error]) {
    NSLog(@"Category Error: %@", [error localizedDescription]);
}

if (![session setActive:YES error:&error]) {
    NSLog(@"Activation Error: %@", [error localizedDescription]);
}


// Add observer
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleSessionInterruption:)
                                             name:AVAudioSessionInterruptionNotification
                                           object:nil];

- (void)handleSessionInterruption:(NSNotification *)notification {
    AVAudioSessionInterruptionType type =
        [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    if (type == AVAudioSessionInterruptionTypeBegan) {
        // Pause the video
        [self.player pause];
    } else if (type == AVAudioSessionInterruptionTypeEnded) {
        // Resume the video? See the sketch below.
    }
}
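
To decide whether to resume when the interruption ends, the notification's userInfo also carries AVAudioSessionInterruptionOptionKey; a sketch for the ended branch:

// Sketch: inside the AVAudioSessionInterruptionTypeEnded branch
AVAudioSessionInterruptionOptions options =
    [notification.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue];
if (options & AVAudioSessionInterruptionOptionShouldResume) {
    // The system hints that playback can continue
    [self.player play];
}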

Handle Route Change

We again use a notification from AVAudioSession to handle route changes (for example, when headphones are unplugged).

// Add observer
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleRouteChange:)
                                             name:AVAudioSessionRouteChangeNotification
                                           object:nil];

- (void)handleRouteChange:(NSNotification *)notification {

    AVAudioSessionRouteChangeReason reason =
        [notification.userInfo[AVAudioSessionRouteChangeReasonKey] unsignedIntegerValue];

    if (reason == AVAudioSessionRouteChangeReasonOldDeviceUnavailable) {
        AVAudioSessionRouteDescription *previousRoute =
            notification.userInfo[AVAudioSessionRouteChangePreviousRouteKey];
        NSString *portType = [[previousRoute.outputs firstObject] portType];
        if ([portType isEqualToString:AVAudioSessionPortHeadphones]) {
            // This notification may be posted on a background thread; update playback on the main thread
            dispatch_async(dispatch_get_main_queue(), ^{
                // Pause the video
                [self.player pause];
            });
        }
    }
}

Handle Failure to Play to End

In most cases, we receive AVPlayerItemFailedToPlayToEndTimeNotification when playing streaming video and a network problem occurs.

// Add observer
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleAVPlayerItemFailedToPlayToEndTime:)
                                             name:AVPlayerItemFailedToPlayToEndTimeNotification
                                           object:nil];

- (void)handleAVPlayerItemFailedToPlayToEndTime:(NSNotification *)notification {
    NSError *error = notification.userInfo[AVPlayerItemFailedToPlayToEndTimeErrorKey];

    // This notification may be posted on a background thread; show the error message on the main thread
    dispatch_async(dispatch_get_main_queue(), ^{
        [self popErrorAlert:error];
    });
}

Here is the Test App