How to set up video streaming in your app with AVPlayer

Watch it over and over…not just once

Payal Gupta · Mar 11

Video streaming is a type of media streaming in which the data from a video file is continuously delivered via the Internet to a remote user.

It allows a video to be viewed online without being downloaded on a host computer or device.

Streaming TV shows, movies, and other types of video over the Internet to all manner of devices, once a fringe habit, is now a squarely mainstream practice.

Even people still paying for cable or satellite service often also have Netflix or Hulu accounts.

In this article, we’ll be discussing how we can get a video playing inside our own app.

Let's get started

Video URL

The first and foremost thing that we need is a video URL.

The URL could either be an online URL or one representing the path to a video inside the app's bundle.

let url = URL(string: "https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4")

Video Playback with AVPlayer

To create a video player, we'll be using AVFoundation's AVPlayer object.

AVPlayer — An AVPlayer is a controller object used to manage the playback and timing of a media asset.

You can use an AVPlayer to play local and remote file-based media, such as QuickTime movies and MP3 audio files, as well as audiovisual media served using HTTP Live Streaming.
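A rough sketch of what this looks like in code, using the url constant from above (the force-unwrap is only for brevity):

let asset = AVAsset(url: url!)                    // the media to be played
let playerItem = AVPlayerItem(asset: asset)       // models timing and presentation state of the asset
let player = AVPlayer(playerItem: playerItem)     // controls playback of the item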

Code Snippet — 1

In the above code:

Line 1 — AVAsset object is created using the video URL.

An AVAsset object represents the media to be played.

Line 2 — AVPlayerItem object is created using the AVAsset object.

It models the timing and presentation state of an asset played by the player.

Line 3 — AVPlayer object is created using the AVPlayerItem object.

Adding AVPlayer to AVPlayerLayer

Now that we have our AVPlayer object in place, the next milestone is to show it on the screen, i.e. adding it to a viewController or some view.

AVPlayerLayer — A CALayer on which an AVPlayer renders its output.
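A rough sketch of creating and configuring the layer (videoView is the UIView we'll render into, as described further below):

let playerLayer = AVPlayerLayer(player: player)   // Line 1 — layer backed by our AVPlayer
playerLayer.frame = self.videoView.bounds         // Line 2 — match the hosting view
playerLayer.videoGravity = .resizeAspectFill      // Line 3 — optional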

Code Snippet — 2

In the above code:

Line 1 — an AVPlayerLayer object is created with the previously constructed AVPlayer object.

Line 2 — the frame of the playerLayer is set to that of the view in which we want the player to be rendered.

Line 3 — set the videoGravity of playerLayer as per the requirement (optional).

videoGravity — determines how the video will be displayed within the playerLayer bounds. Its values can be:

resizeAspect — The player should preserve the video's aspect ratio and fit the video within the layer's bounds.

resizeAspectFill — The player should preserve the video's aspect ratio and fill the layer's bounds.

resize — The video should be stretched to fill the layer's bounds.

Since playerLayer is a CALayer after all, it’s gonna work exactly the same as a normal layer.

We need to add playerLayer to a view’s layer for it to be visible on the screen.

So, here we go.

self.videoView.layer.addSublayer(playerLayer)

Here, videoView is some UIView in which we've added our player.

Play video in AVPlayer

Till now, we've successfully added the player to our view.

But merely adding the player to a view won’t get our video playing.

AVPlayer has 2 very simple methods to play and pause the video.

play() — Begins playback of the current item.

pause() — Pauses playback of the current item.

For now, we only need to play the video in the player.

player.play()

Putting all the bits and pieces together, we're ready to stream our first video in the app.
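A minimal end-to-end sketch of how the pieces fit together, assuming a view controller with a videoView outlet (names here are illustrative):

import AVFoundation
import UIKit

class VideoViewController: UIViewController {

    @IBOutlet weak var videoView: UIView!
    var player: AVPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        guard let url = URL(string: "https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4") else { return }
        let asset = AVAsset(url: url)
        let playerItem = AVPlayerItem(asset: asset)
        player = AVPlayer(playerItem: playerItem)

        let playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = videoView.bounds
        playerLayer.videoGravity = .resizeAspectFill
        videoView.layer.addSublayer(playerLayer)

        player?.play()
    }
}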

Code Snippet — 3

Note: One more thing, don't forget to import AVFoundation in the file where you're going to integrate the AVPlayer.

All the classes that we've used are part of the AVFoundation framework.

Playback Controls

Now that we've got our video player working, one thing we all must have noticed — there is no option to pause the video once it starts playing.

Neither can we mute it, nor fast-forward or rewind it.

In short, AVPlayer doesn’t come with default playback controls.

But, nothing is impossible my friend.

If they didn’t provide it, we’ll create our own controls.

And that, too, as per our own custom requirements.

We’ll create the following controls to manage the video playback.

Play/Pause
Mute/Un-mute video
Forward/Rewind
Track video progress
Playing multiple videos in a queue
Tracking video play status
Replay video

Of course, you need to create your own UI to support playback controls.

Something like in the below illustration.

Play/Pause

This one is the simplest to implement.

As we’ve already discussed, AVPlayer's play() and pause() methods can be used for this purpose.
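A rough sketch of a play/pause toggle wired to a button (the action name and button titles are assumptions):

@objc func playPauseTapped(_ sender: UIButton) {
    if player?.rate == 0 {
        // Player is currently paused/stopped, so start playback
        player?.play()
        sender.setTitle("Pause", for: .normal)
    } else {
        // Player is currently playing, so pause it
        player?.pause()
        sender.setTitle("Play", for: .normal)
    }
}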

Code Snippet — 4

Mute/Un-mute video

AVPlayer's isMuted property can be used to get this working.

isMuted — A Boolean value that indicates whether the audio output of the player is muted.

Set its value to true/false to mute/un-mute the player.

player?.isMuted = true

Fast-Forward/Rewind video

Rewinding and fast-forwarding a video playback simply means changing the current time of the playback, either decreasing it or adding to it.

AVPlayer provides multiple seek methods for managing time in a video playback.

You can find them here.

We’ll be using one of those alternatives.

seek(to:) — Sets the current playback time to the specified time.
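A rough sketch of rewind and fast-forward actions built on currentTime() and seek(to:) (the 5-second step and the action names are assumptions):

@objc func rewindTapped(_ sender: UIButton) {
    guard let currentTime = player?.currentTime() else { return }
    // Jump back 5 seconds, but never before the start of the item
    var newTime = CMTimeGetSeconds(currentTime) - 5.0
    if newTime <= 0 {
        newTime = 0
    }
    player?.seek(to: CMTime(value: CMTimeValue(newTime * 1000), timescale: 1000))
}

@objc func forwardTapped(_ sender: UIButton) {
    guard let currentTime = player?.currentTime(),
          let duration = player?.currentItem?.duration else { return }
    // Jump ahead 5 seconds, but never past the total duration
    var newTime = CMTimeGetSeconds(currentTime) + 5.0
    if newTime >= CMTimeGetSeconds(duration) {
        newTime = CMTimeGetSeconds(duration)
    }
    player?.seek(to: CMTime(value: CMTimeValue(newTime * 1000), timescale: 1000))
}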

Code Snippet — 5

In the above code:

currentTime() — Returns the current time of the current player item.

Lines 1 to 9 — rewind the playback by a given number of seconds, with a lower cap of 0 seconds.

Lines 11 to 19 — fast-forward the playback by a given number of seconds, with an upper cap of the total playback duration.

Tracking video progress

We can track the video playback with AVPlayer's periodic time observer.

addPeriodicTimeObserver(forInterval:queue:using:) — Requests the periodic invocation of a given block during playback to report changing time.
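A rough sketch of registering the observer and driving a progress view from it (progressView is an assumed UIProgressView property; keep the returned token if you need to remove the observer later):

let interval = CMTime(value: 1, timescale: 2)   // half a second
_ = player?.addPeriodicTimeObserver(forInterval: interval, queue: .main) { [weak self] time in
    guard let self = self,
          let duration = self.player?.currentItem?.duration,
          duration.isNumeric else { return }
    // Progress as a fraction of the total duration
    let progress = CMTimeGetSeconds(time) / CMTimeGetSeconds(duration)
    self.progressView.progress = Float(progress)
}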

Code Snippet — 6

In the above code, we've added an observer to the player. This observer will be invoked every 0.5 seconds (an interval of CMTime(value: 1, timescale: 2)).

CMTime — A struct representing a time value such as a timestamp or duration.

You can refer to this link to understand how CMTime actually works.

Lines 2 to 6 — The progress percentage is calculated using the total duration and the current time.

Lines 9 to 11 — A UIProgressView is used to show the playback progress in the UI.

Playing videos in a Queue — Playlist

There is a limitation with AVPlayer playback. It can only be used to play a single video, i.e. there is no initializer that accepts multiple AVPlayerItems.

AVQueuePlayer is the solution we’ll use to play all the videos in our playlist.

AVQueuePlayer — A player used to play a number of items in sequence.

AVQueuePlayer is an AVPlayer with additional features.

It accepts an array of AVPlayerItem objects.

init(items:) — Creates a queue player with player items from the specified array.
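A rough sketch of building the queue (the playlist URLs are placeholders):

let urls: [URL] = [/* your playlist URLs */]
let playerItems = urls.map { AVPlayerItem(url: $0) }   // one item per video
let queuePlayer = AVQueuePlayer(items: playerItems)    // plays the items in sequence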

Code Snippet — 7

Follow the same steps as with AVPlayer, i.e. adding it to a playerLayer and then calling play() on it.

Tracking video play status

During its complete duration, playback can be in multiple states, i.e. it might be playing, or paused at some moment, or it might still be buffering for a particular timespan.

We must track all these states so we can update the UI accordingly.

For example, we could use a loader to show the playback status on screen.

AVPlayer’s timeControlStatus can be used to track the video playback status.

timeControlStatus — A status that indicates whether playback is currently in progress, paused indefinitely, or suspended while waiting for appropriate network conditions.

It accepts a value of type AVPlayer.TimeControlStatus, i.e. one of:

paused — The player is paused.

waitingToPlayAtSpecifiedRate — The player is in a waiting state due to empty buffers or insufficient buffering.

playing — The player is currently playing a media item.

timeControlStatus is observable using Key-value observing.

Since it is an observable property, we can register an observer and track any changes made in its value.

player.addObserver(self, forKeyPath: "timeControlStatus", options: [.old, .new], context: nil)

In the above code, we are tracking the old as well as the new value of timeControlStatus.

We can observe the changes made in NSKeyValueObserving’s observeValue(forKeyPath:of:change:context:) method.

Any class that inherits from NSObject will have this method available, for example anything that is a UIView subclass.

You just need to override it in your class.

NSObject provides an implementation of the NSKeyValueObserving protocol that provides an automatic observing capability for all objects.
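A rough sketch of such an override, toggling an assumed loader (a UIActivityIndicatorView property) based on the new status:

override func observeValue(forKeyPath keyPath: String?, of object: Any?,
                           change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    guard keyPath == "timeControlStatus",
          let change = change,
          let newValue = change[.newKey] as? Int,
          let oldValue = change[.oldKey] as? Int,
          oldValue != newValue,
          let newStatus = AVPlayer.TimeControlStatus(rawValue: newValue) else { return }

    DispatchQueue.main.async { [weak self] in
        // Show the loader only while the player is waiting (buffering)
        self?.loader.isHidden = (newStatus != .waitingToPlayAtSpecifiedRate)
    }
}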

Code Snippet — 8

In the above code, we are using the timeControlStatus's newValue to show/hide the loader (example: UIActivityIndicatorView).

The loader is hidden in case the video is playing/paused and visible when the video is still in buffering state.

Line 5 — we respond to these changes only when the newValue differs from the oldValue.

Replay video

To replay the video once the playback ends, we can observe the notification AVPlayerItemDidPlayToEndTimeNotification.

AVPlayerItemDidPlayToEndTimeNotification — A notification that’s posted when the item has played to its end time.

NotificationCenter.default.addObserver(self, selector: #selector(playerEndedPlaying), name: Notification.Name("AVPlayerItemDidPlayToEndTimeNotification"), object: nil)

As evident from the above code, the playerEndedPlaying method will be called whenever the playback ends.

@objc func playerEndedPlaying(_ notification: Notification) {
    DispatchQueue.main.async { [weak self] in
        self?.player?.seek(to: kCMTimeZero)
        self?.player?.play() // This is optional
    }
}

In the above code, we're seeking the playback time to kCMTimeZero, i.e. to the start.

Once the video is sought to its start position, we call play() on player.

Playing it again is as per our requirement; we can skip the play() call in case we want the video to remain paused after playing once.

AVPlayerViewController

Besides AVPlayer, Apple has also provided support for a full-screen controller for media playback.

AVPlayerViewController — An object that displays the video content from a player object along with system-supplied playback controls.

It is provided by the AVKit framework.

class AVPlayerViewController : UIViewController

Code Snippet — 9

In this snippet (see the sketch after this description):

Line 1 — create an AVPlayerViewController instance.

Line 2 — assign an AVPlayer instance to the controller.

The AVPlayer instance can be created the same way as we discussed earlier.

Lines 3 to 7 — present the controller and call play() on the player once the controller has been presented.
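A rough sketch of that flow, assuming player is the AVPlayer instance created earlier:

let controller = AVPlayerViewController()
controller.player = player
present(controller, animated: true) { [weak self] in
    // Start playback once the controller is on screen
    self?.player?.play()
}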

Full Screen Mode

We can use AVPlayerViewController to show the video in full screen when the expand button is tapped.

Till now, we’ve mainly used 3 classes to show a video playback onscreen — AVPlayer, AVPlayerLayer and AVPlayerViewController.

AVPlayerLayer and AVPlayerViewController, both use the instance of AVPlayer.

To show the video in full screen mode, we just need to open the AVPlayerViewController and sync the playback progress with AVPlayerLayer.

Now, there are 2 alternatives to achieve that:

1. Create different AVPlayer objects for both AVPlayerLayer and AVPlayerViewController, and use seek(to:) to sync the progress between them.

2. Create a single AVPlayer instance and use it for both as and when required. The playback progress will be handled automatically in this case since we're using the same player instance.

Both the approaches will give the expected result.

We’ll follow the 2nd one since it is a bit more efficient.
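A rough sketch of what this flow might look like (the expand action, the CustomAVPlayerViewController subclass, and the avPlayerDidDismiss notification name are assumptions):

@objc func expandTapped(_ sender: UIButton) {
    player?.pause()   // pause the inline playback before going full screen

    // Assumed subclass of AVPlayerViewController (sketched below) that posts avPlayerDidDismiss on dismissal
    let controller = CustomAVPlayerViewController()
    controller.player = player   // reuse the same AVPlayer instance

    NotificationCenter.default.addObserver(self,
                                           selector: #selector(avPlayerClosed),
                                           name: Notification.Name("avPlayerDidDismiss"),
                                           object: nil)
    present(controller, animated: true) { [weak self] in
        self?.player?.play()
    }
}

@objc func avPlayerClosed(_ notification: Notification) {
    // Resume the inline playback from the current progress
    player?.play()
}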

Code Snippet — 10

In the above code:

Line 2 — the player is paused before presenting the AVPlayerViewController.

Lines 3 to 10 — present the controller and call play() on the player once the controller is presented.

Line 5 — register an observer for the custom notification avPlayerDidDismiss.

avPlayerClosed method will be called once the notification is fired.

Apple doesn’t provide any callback/delegate method when the AVPlayerViewController is dismissed.

We actually need this to play the video again when the controller is dismissed.

This is the reason we’re using our own custom notification.

Lines 13 to 15 — the avPlayerClosed method. It'll play the video again from the current progress.

To detect the AVPlayerViewController's dismissal, we’ll use UIViewController lifecycle’s viewWillDisappear(_:) method.
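A rough sketch of such a subclass (the class name and notification name are assumptions, matching the previous sketch):

class CustomAVPlayerViewController: AVPlayerViewController {

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        player?.pause()   // pause before the controller is dismissed
        // Let the presenting screen know the full-screen player has been closed
        NotificationCenter.default.post(name: Notification.Name("avPlayerDidDismiss"), object: nil)
    }
}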

Code Snippet — 11

In the above code:

Line 4 — pause the video before the controller is dismissed.

Line 5 — a custom notification, avPlayerDidDismiss, is posted on the controller's dismissal.

We already added the observer to it earlier.

Don't forget to remove the observers (KVO and Notification Center) once you're done.

deinit {
    NotificationCenter.default.removeObserver(self)
    player?.removeObserver(self, forKeyPath: "timeControlStatus")
}

One last thing…

All iOS apps have a default audio session that comes preconfigured:

– In iOS, setting the Ring/Silent switch to silent mode silences any audio being played by the app.

– When the app plays audio, any other background audio is silenced.

But this is not the general audio behaviour that any media playback should have, i.e. we should still hear the audio even if the device is on silent.

Also, if any other app is playing any audio session, it must be stopped for the time being.

AVAudioSession — An intermediary object that communicates to the system how you intend to use audio in your app.

It’ll configure our app’s audio session.
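A rough sketch of such a configuration; the .playback category with the .moviePlayback mode is one reasonable choice here, but the exact category, mode, and options depend on your requirements:

do {
    // Play audio even when the Ring/Silent switch is on silent,
    // and interrupt other apps' audio while our video plays.
    try AVAudioSession.sharedInstance().setCategory(.playback, mode: .moviePlayback, options: [])
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print("Failed to configure the audio session: \(error)")
}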

Code Snippet — 12

In the above code:

setCategory(_:mode:options:) — the method of AVAudioSession used to configure the audio session. This method accepts 3 different value types, which are solely responsible for the configuration:

AVAudioSession.Category — An audio session category defines a set of audio behaviors. The precise behaviours associated with each category are not under your app's control, but rather are set by the operating system.

AVAudioSession.Mode — While categories set the base behaviors for your app, modes are used to specialize the behavior of an audio session category.

AVAudioSession.CategoryOptions — Constants that specify optional audio behaviors. Each option is valid only with specific audio session categories.

We can use different combinations of the above options to get the desired audio behaviour.

Do refer to the documentation of all 3 from the links provided.

That’s all.

This is everything we need to get a video player working inside our app.

Sample Project

You can download the sample project from here.

I’ve created a cocoapod as well, so you can integrate the player with custom controls into your project directly.

You can find it here.

Further reading

Don't forget to read my other articles:

Everything about Codable in Swift 4
Everything you've always wanted to know about notifications in iOS
Deep copy vs. shallow copy — and how you can use them in Swift
Coding for iOS 11: How to drag & drop into collections & tables
All you need to know about Today Extensions (Widget) in iOS 10
UICollectionViewCell selection made easy..!!

Feel free to leave comments in case you have any questions.
