An AVPlayerItemTrack object manages the presentation state of an individual track. To display video, you use an AVPlayerLayer object. An AVPlayer plays a single asset; to play a sequence of items you can use an AVQueuePlayer, a subclass of AVPlayer. You don't hand an asset to a player directly; instead, you give it an AVPlayerItem object, because the player item manages the presentation state of the asset it is associated with. A player item contains player item tracks, instances of AVPlayerItemTrack, that correspond to the tracks in the asset. (A figure in the original guide illustrates these relationships.)

To create and prepare an HTTP live stream for playback, initialize an AVPlayerItem with the stream's URL and observe its status property:
NSURL *url = [NSURL URLWithString:@"<#Live stream URL#>"];
// You may find a test stream at <http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8>.
self.playerItem = [AVPlayerItem playerItemWithURL:url];
[self.playerItem addObserver:self forKeyPath:@"status" options:0 context:&ItemStatusContext];
self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
When you prepare the player item for playback, you can observe its status property. When the player item's status changes to AVPlayerItemStatusReadyToPlay, its duration can be fetched with:
[[[[[playerItem tracks] objectAtIndex:0] assetTrack] asset] duration];
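A brief note that is not in the original post: on iOS 4.3 and later the player item itself also exposes a duration property, so once the item is ready to play the same value can be read more directly:

CMTime itemDuration = [self.playerItem duration];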
If you simply want to play a live stream, you can create the player directly with the URL and observe the player's status:

self.player = [AVPlayer playerWithURL:<#Live stream URL#>];
[self.player addObserver:self forKeyPath:@"status" options:0 context:&PlayerStatusContext];
Observe the player's status property: when it changes to AVPlayerStatusReadyToPlay, the player is ready to play. You can also observe the currentItem property to access the player item that was created for the stream. If you don't know in advance what kind of content the URL points to, you can still use the status property to determine whether it can be played. When it is ready, start playback:
- (IBAction)play:(UIButton *)sender {
    [self.player play];
}
You can change the speed of playback by setting the player's rate property:
aPlayer.rate = 0.5;   // half speed
aPlayer.rate = 2.0;   // double speed
A negative rate plays the item in reverse. Use the player item's canPlayReverse property (supports a rate value of -1.0), canPlaySlowReverse (supports rates between 0.0 and -1.0), and canPlayFastReverse (supports rate values less than -1.0) to determine which kinds of reverse playback the item supports.
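As an illustration added here (it is not in the original post), a minimal sketch of checking these capabilities before reversing playback might look like this:

AVPlayerItem *currentItem = self.player.currentItem;
if (currentItem.canPlayReverse) {
    // Full-speed reverse playback is supported.
    self.player.rate = -1.0;
} else if (currentItem.canPlaySlowReverse) {
    // Fall back to slow reverse playback.
    self.player.rate = -0.5;
}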
To move the playhead to a particular time, use seekToTime::

CMTime fiveSecondsIn = CMTimeMake(5, 1);
[player seekToTime:fiveSecondsIn];
The seekToTime: method is tuned for performance rather than precision. If you need to move the playhead precisely, use the seekToTime:toleranceBefore:toleranceAfter: method instead:
CMTime fiveSecondsIn = CMTimeMake(5, 1);
[player seekToTime:fiveSecondsIn toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
To reset the playhead when the item finishes playing, for example to loop it, register for the AVPlayerItemDidPlayToEndTimeNotification notification; in the notification's callback, call seekToTime: with kCMTimeZero as the argument.
// Register with the notification center after creating the player item.
[[NSNotificationCenter defaultCenter] addObserver:self
                                          selector:@selector(playerItemDidReachEnd:)
                                              name:AVPlayerItemDidPlayToEndTimeNotification
                                            object:<#The player item#>];

- (void)playerItemDidReachEnd:(NSNotification *)notification {
    [player seekToTime:kCMTimeZero];
}
You can use an AVQueuePlayer object to play a number of items in sequence. AVQueuePlayer is a subclass of AVPlayer, and you initialize a queue player with an array of player items:
NSArray *items = <#An array of player items#>;
AVQueuePlayer *queuePlayer = [[AVQueuePlayer alloc] initWithItems:items];
You can then play the queue using play, just as you would an AVPlayer object; the queue player plays each item in turn. To skip to the next item, send the queue player an advanceToNextItem message. You can modify the queue with the insertItem:afterItem:, removeItem:, and removeAllItems methods. When adding a new item, you should first check whether it can be inserted into the queue using canInsertItem:afterItem:; pass nil as the second argument to test whether the new item can be appended to the queue:
AVPlayerItem *anItem = <#Get a player item#>;
if ([queuePlayer canInsertItem:anItem afterItem:nil]) {
    [queuePlayer insertItem:anItem afterItem:nil];
}
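For completeness (this snippet is not in the original post), the other queue operations mentioned above are used like this:

// Play the queued items in order, just as with an AVPlayer.
[queuePlayer play];

// Skip the rest of the current item and move on to the next one.
[queuePlayer advanceToNextItem];

// Empty the queue.
[queuePlayer removeAllItems];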
While playback is in progress, the presentation state of a player or a player item may change. For example, if the user switches to a different application, the player's rate property drops to 0.0; if playback fails, the status property may change. You can use key-value observing to monitor these properties. AVFoundation invokes observeValueForKeyPath:ofObject:change:context: on the main thread, even if the change was made on another thread.
One change you particularly need to handle is a change of status to AVPlayerStatusFailed or AVPlayerItemStatusFailed. In that case, the object's error property describes why playback failed. If you want to update the user interface in response, make sure the relevant code is executed on the main thread, for example by using dispatch_async. The failure case can be handled like this:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if (context == <#Player status context#>) {
        AVPlayer *thePlayer = (AVPlayer *)object;
        if ([thePlayer status] == AVPlayerStatusFailed) {
            NSError *error = [<#The AVPlayer object#> error];
            // Respond to error: for example, display an alert sheet.
            return;
        }
        // Deal with other status change if appropriate.
    }
    // Deal with other change notifications if appropriate.
    [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    return;
}
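As an extra illustration that is not part of the original post, the rate change described above can be watched in the same way; RateContext here is just an assumed context constant for the sketch:

static const NSString *RateContext;

// Register after creating the player.
[self.player addObserver:self forKeyPath:@"rate"
                 options:NSKeyValueObservingOptionNew context:&RateContext];

// Then, inside observeValueForKeyPath:ofObject:change:context::
if (context == &RateContext) {
    dispatch_async(dispatch_get_main_queue(), ^{
        if (self.player.rate == 0.0) {
            // Playback stopped or was interrupted (for example, the app went to the background).
            NSLog(@"Playback paused");
        }
    });
}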
When you use an AVPlayerLayer object, you may also want to observe the layer's readyForDisplay property so that you are notified when the layer has user-visible content. In particular, you might insert the player layer into the layer tree only at that point and then perform a transition to the content the user wants to see.

To track changes in the position of the playhead, AVPlayer offers two mechanisms. With addPeriodicTimeObserverForInterval:queue:usingBlock:, the block you supply is invoked at the interval you specify, and also if time jumps and when playback starts or stops. With addBoundaryTimeObserverForTimes:queue:usingBlock:, you pass an array of NSValue objects wrapping CMTime structures, and the block is invoked whenever any of those times is traversed:
// Assume a property: @property (strong) id playerObserver;
Float64 durationSeconds = CMTimeGetSeconds([<#An asset#> duration]);
CMTime firstThird = CMTimeMakeWithSeconds(durationSeconds/3.0, 1);
CMTime secondThird = CMTimeMakeWithSeconds(durationSeconds*2.0/3.0, 1);
NSArray *times = @[[NSValue valueWithCMTime:firstThird], [NSValue valueWithCMTime:secondThird]];

self.playerObserver = [<#A player#> addBoundaryTimeObserverForTimes:times queue:NULL usingBlock:^{
    NSString *timeDescription = (NSString *)
        CFBridgingRelease(CMTimeCopyDescription(NULL, [self.player currentTime]));
    NSLog(@"Passed a boundary at %@", timeDescription);
}];
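The post only shows the boundary variant above. As an illustration that is not in the original, a minimal sketch of the periodic variant, plus the removeTimeObserver: cleanup that both variants need, could look like this (it reuses the playerObserver property assumed above):

self.playerObserver = [self.player addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(1.0, NSEC_PER_SEC)
                                                                 queue:dispatch_get_main_queue()
                                                            usingBlock:^(CMTime time) {
    // Invoked roughly once per second during playback, and also when time jumps
    // or playback starts or stops; update the UI from here.
    NSLog(@"Current time: %.1f seconds", CMTimeGetSeconds(time));
}];

// Later, when the observer is no longer needed (this applies to boundary observers too):
[self.player removeTimeObserver:self.playerObserver];
self.playerObserver = nil;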
You can also register to receive an AVPlayerItemDidPlayToEndTimeNotification notification when a player item has completed playback:
[[NSNotificationCenter defaultCenter] addObserver:<#The observer, typically self#>
                                          selector:@selector(<#The selector name#>)
                                              name:AVPlayerItemDidPlayToEndTimeNotification
                                            object:<#A player item#>];
The rest of this section walks through a short example of playing a video file with AVPlayer. To display the visual output of the player you use an AVPlayerLayer layer, which a simple UIView subclass can host:
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface PlayerView : UIView
@property (nonatomic) AVPlayer *player;
@end

@implementation PlayerView
+ (Class)layerClass {
    return [AVPlayerLayer class];
}
- (AVPlayer *)player {
    return [(AVPlayerLayer *)[self layer] player];
}
- (void)setPlayer:(AVPlayer *)player {
    [(AVPlayerLayer *)[self layer] setPlayer:player];
}
@end
To set up and manage playback, you need a simple view controller with an interface along these lines:

@class PlayerView;

@interface PlayerViewController : UIViewController
@property (nonatomic) AVPlayer *player;
@property (nonatomic) AVPlayerItem *playerItem;
@property (nonatomic, weak) IBOutlet PlayerView *playerView;
@property (nonatomic, weak) IBOutlet UIButton *playButton;
- (IBAction)loadAssetFromFile:sender;
- (IBAction)play:sender;
- (void)syncUI;
@end
The syncUI method synchronizes the play button's state with the player's state:
- (void)syncUI {
    if ((self.player.currentItem != nil) &&
        ([self.player.currentItem status] == AVPlayerItemStatusReadyToPlay)) {
        self.playButton.enabled = YES;
    } else {
        self.playButton.enabled = NO;
    }
}
You can call syncUI in the view controller's viewDidLoad method to ensure a consistent user interface when the view is first displayed:
- (void)viewDidLoad {
    [super viewDidLoad];
    [self syncUI];
}
Use AVURLAsset to create the asset from the file's URL, then asynchronously load its tracks key:
- (IBAction)loadAssetFromFile:sender {
    NSURL *fileURL = [[NSBundle mainBundle] URLForResource:<#@"VideoFileName"#> withExtension:<#@"extension"#>];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
    NSString *tracksKey = @"tracks";

    [asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:^{
        // The completion block goes here.
    }];
}
In the completion block, you create an AVPlayerItem instance for the asset, use it to initialize an AVPlayer, and set it as the view's player. As with creating the asset, simply creating the player item does not mean it is ready to use; to determine when it can be played, you observe its status property. You should add this observer before associating the player item with the player:
// Define this constant for the key-value observation context.
static const NSString *ItemStatusContext;

// Completion handler block.
dispatch_async(dispatch_get_main_queue(), ^{
    NSError *error;
    AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];

    if (status == AVKeyValueStatusLoaded) {
        self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
        // Ensure that this is done before the playerItem is associated with the player.
        [self.playerItem addObserver:self forKeyPath:@"status"
                             options:NSKeyValueObservingOptionInitial context:&ItemStatusContext];
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                  selector:@selector(playerItemDidReachEnd:)
                                                      name:AVPlayerItemDidPlayToEndTimeNotification
                                                    object:self.playerItem];
        self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
        [self.playerView setPlayer:self.player];
    }
    else {
        // You should deal with the error appropriately.
        NSLog(@"The asset's tracks were not loaded:\n%@", [error localizedDescription]);
    }
});
The status change is handled in observeValueForKeyPath:ofObject:change:context:, dispatching the UI update onto the main queue:

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if (context == &ItemStatusContext) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self syncUI];
        });
        return;
    }
    [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    return;
}
When the user taps the Play button, the action simply plays the item:

- (IBAction)play:(UIButton *)sender {
    [self.player play];
}
The item plays through only once. To move the playhead back to the beginning so the item can be played again, register for the AVPlayerItemDidPlayToEndTimeNotification notification from the item and, in the notification's callback method, invoke seekToTime: with the argument kCMTimeZero:
// Register with the notification center after creating the player item.
[[NSNotificationCenter defaultCenter] addObserver:self
                                          selector:@selector(playerItemDidReachEnd:)
                                              name:AVPlayerItemDidPlayToEndTimeNotification
                                            object:[self.player currentItem]];

- (void)playerItemDidReachEnd:(NSNotification *)notification {
    [self.player seekToTime:kCMTimeZero];
}
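The original walkthrough does not show teardown. As a small addition not in the original post: if you register observers as above, they should also be removed when the controller goes away, for example along these lines:

- (void)dealloc {
    // Balance the addObserver calls made when the player item was created.
    [self.playerItem removeObserver:self forKeyPath:@"status"];
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                     name:AVPlayerItemDidPlayToEndTimeNotification
                                                   object:self.playerItem];
}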
AVFoundation Programming Guide - Playback
Source: http://blog.csdn.net/longshihua/article/details/54091080