Take a sneak peek at YouTube Live Streaming APIs – Jarek Wilkiewicz

is all about the scale at which we can reach
our audience and really make an impact. A couple of other events that
were recently conducted on YouTube Live– we did an event with Indian
Premier League. So for those of you that have
heard of cricket, it’s a sport extremely popular in India. It was a very successful
event. 72 matches, over 72
million views. For those of you that follow
the royal family, we were actually live streaming
the royal wedding. And on April 29, there were 72
million live streams. So 72 million people tuned in from
over 180 countries. So virtually the entire globe
was represented watching the royal wedding. And then one of my favorites,
Coachella. It’s a music festival with a lot
of cool bands in the US. And we got over 4 million
streams, 61 acts, concerts. And one of the interesting
features that we implemented as a part of this was the
ability to actually view live streams from multiple
camera angles. And I’ll show you how that
works a little later. So what’s involved in
live streaming? So I have my little reference
architecture diagram here. And, really, it highlights the
steps that one has to take in order to take content from a
live event and stream it to a wide audience over
the internet. So first, event production. This is not something
that we do. This is what our content
partners do. Then capture, so audio video
equipment will actually capture the live stream,
the video stream, the audio stream. Next, after capturing,
is encoding. Encoding is a very
important step. For those of you that have done
computer science, you can relate to that. The fundamental issue is
time versus space. We’re trying to compress the
data stream as efficiently as possible, and that is very
computationally intensive. So therefore, encoding is
typically done by hardware devices, so specialized
hardware. But there are also software
encoders available. Our machines are getting
more and more powerful. And I will give you a couple of examples in the demo session. Next, content delivery
network and player. So CDN is really what optimizes
the delivery to the edge of the network. And then YouTube player is how
the audience can actually consume the video content. So this is our little reference
architecture. And I will describe the steps
involved in all the pieces and how our APIs fit into
this architecture. So to recap, we won’t talk
about event production and capture, because this is not
something that we do. We’ll touch upon the encoding. At YouTube we don’t actually
deal with that. This is something that whoever
wants to live stream would do, either by using a hardware device or a software encoder. Content delivery networks– so
delivery is very important to really optimize the
stream quality. And we’ll talk about
it a little bit. And then the YouTube player. So encoding. If you have dealt with audio
video compression before, this will be fairly familiar
to you. If you haven’t, I just wanted
to highlight some of the important pieces, in case you
want to integrate with our platforms. You will see this. As I mentioned, encoding is
computationally expensive. And for high-quality streams,
typically you will actually have a hardware encoder. But we do support software
encoding, as well. We don’t really have
a preference. And Flash Media Live Encoder is
something you can get for free and try it out. Once the encoding is completed, we actually use RTMP, the Real-Time Messaging Protocol, as the transport to ingest into YouTube. So once the stream is encoded, it’s actually sent over RTMP up to our servers. And from there it’s distributed
through the content delivery network to the
edge of the network and to the devices. We use FLV containers,
so Flash video. We are in the process of
supporting HLS, as well. So HLS live streaming
is coming. And then we use H264 for the
video encoding and AAC audio for audio encoding. And for H264, there are many
options, so you can balance stream quality versus the
bandwidth required. So anything from 400 kbps, which
will give you about 240p quality, to 720p
or even 1080p. OK. So this slide is just
a recap of how these terms fit together. So you have the RTMP transport,
FLV container within which there is an audio
stream and a video stream. The video stream obviously
takes a lot more space. It’s encoded using H264, just
standard technology. And the audio stream is encoded
using AAC, and that’s what we ingest at YouTube. And then we actually use our
platform to broadcast it out to our users. So what’s the experience like
from the perspective of a YouTube partner? One can schedule a live
event on YouTube. One important note is, today,
this functionality is only available to our content
partners. We have more and more of these
partners enabled for live streaming functionality in
most parts of the world. We are working on extending
the reach of this functionality. So this is kind of a preview
for you guys. If you work for one of the
content partners, then you should be able to get
access to it. If you don’t and you want to
build tools for these content partners, then we have
some mechanisms to help you try it out. And then we hope to extend the
ability to actually stream to a larger audience
a little later. There’s still a lot
of work remaining. And this is a very
new product. We only started working on
that product last year. So once the stream
is scheduled, one can discover it. We have a landing page where we feature certain events. We have events that are showing
at this moment, as well as events that
are scheduled and those that have completed. You can also go to a channel
of a partner that has been enabled for live streaming and
see what events they have scheduled, if you would
like to see it. And then finally playback. So we use the regular
YouTube player to do the live stream playback. It’s the same player. You don’t have to do
anything special. So let’s look at what’s up on
YouTube Live right now. OK. So as you see, we have a bunch
of featured events. Austin City Limits, that’s
a cool music festival. But let’s look to see if
there’s something else interesting. OK, let’s try this one. America’s Cup, live racing. So if you’re into sailing, that
might be interesting. Let’s bring it up. [VIDEO PLAYBACK] JAREK WILKIEWICZ: OK, so this
is a live streaming sailing competition from Plymouth,
England. Now, notice one cool feature. You can actually select
a camera. So this is one of the live
streams that is available. But what if I want to check
out a different boat? So this live stream is coming
from another team, from another boat. So you can actually decide what
you want to watch, and select the perspective,
and see it. This one is also interesting,
because it’s actually a live representation of
who is where. So this is a geo version
with a flag representing which country. So you can start watching this
event by orienting yourself, and then decide, oh, I want to
follow the Korean team, and then pick from one of the
available views to actually enjoy the experience the way
you want to enjoy it. So as you can tell, this is
very different from your traditional TV experience,
where somebody makes the decisions for you. It’s a lot more interactive. And it’s being streamed in
real life in real time. This is coming in from abroad,
and I have a very nice experience watching
it here in Brazil. So this is what it looks like. OK, so this is all good, but
we’re here to talk about the APIs, because we are
all developers. So how can I use the APIs to
actually interact with our live streaming platform? What’s available to
us as developers? So we have several types
of APIs, starting with scheduling APIs. So these are the APIs that a
content producer might use to schedule events, modify the
event metadata, start events, end events. We have a discovery API, which will let you discover what’s live. So we have a RESTful API, which
allows you to fetch what are the streams that are
scheduled, what are the events that are live right now, what
are the archived events. And we have APIs for the YouTube
player, which allows you to actually customize the playback
experience using the same API as we have for regular video on demand playback. So let’s look at the discovery
APIs, because these are easy to grasp. We represent the live
events as feeds, and our API is RESTful. So for those of you that have
used any Google APIs, it should be very familiar. For those of you that haven’t,
really the idea here is they run over HTTP. They’re very easy to use. So here I have actually
four feeds. And we call them live
event charts. Featured events– so these are
the ones that we think are interesting. Live now– so those are events
that are being broadcast at this moment. Upcoming, scheduled
in the future. And then recently broadcasted
feed contains video recordings for events that have
completed. So if you missed an event, you
can actually go back to that feed and fetch a video, if the
content owner configured the event to archive off
videos, which is enabled by default, actually. You’ll be able to actually
get a video recording of the past event. So if we want to access one of
these feeds, I can just issue an HTTP GET request. And
as you see, I get an XML document feed. Each feed has entries. Entries correspond to the
individual events. So really what I just showed you
is an API access example. And our API is RESTful. It runs on top of HTTP. You can parse the response
yourself in your code. We support XML. We also support JSON, or you
can use client libraries to access some of that YouTube
API backend. So here’s just a recap
of what the live events feed looks like. So this is a live events
feed that I have in my test account. As you see, I have five
events scheduled. So there are five entries. Each entry will look something
like this. It will have an event
identifier. It will have a video ID
corresponding to the stream. So while the stream
is live, that will actually be a live stream. Once the stream is archived,
it’ll be a video on demand ID. So the ID doesn’t change. So it’s the same video ID. You can use it to watch
a live stream. But if the content has been
archived off because the event completed, you can actually use
it in your YouTube embed, just as a regular video ID. And then we maintain a
small state machine associated with the event. So in this case, the event
is in the pending state. So when I create an event,
it will be pending. Then events transition state
as I start the event and I complete the event. And I also have a start date,
so I know when this event is scheduled to start. OK, so that was the
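As a rough sketch of what consuming such a feed looks like in Python — the element names below are simplified stand-ins for illustration, not the exact Atom schema YouTube returns:

```python
import xml.etree.ElementTree as ET

# Simplified, hypothetical live-events feed. The real feed is an Atom
# document with yt: extension elements; this just mirrors its shape.
SAMPLE_FEED = """<feed>
  <entry>
    <id>live-event-123</id>
    <videoId>abc123XYZ</videoId>
    <status>pending</status>
  </entry>
</feed>"""

def parse_events(xml_text):
    """Return (event id, video id, state) for each entry in a feed."""
    root = ET.fromstring(xml_text)
    return [(entry.findtext("id"),
             entry.findtext("videoId"),
             entry.findtext("status"))
            for entry in root.findall("entry")]

print(parse_events(SAMPLE_FEED))
# -> [('live-event-123', 'abc123XYZ', 'pending')]
```

In a real client you would fetch the feed over HTTP first and parse the response the same way; JSON output is also available if you prefer it to XML.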
discovery API. Now let’s talk a little bit
about the scheduling API. So we offer the ability to
actually interact with the backend using this API. So rather than using the user
interface that I showed you a screenshot of a
little earlier, I can actually use the API to do
the same thing. And as you would expect, I can
create an event, start and stop an event. I can change the metadata
about the event and delete an event. So let’s quickly walk through
the syntax for these API operations. As I mentioned, the API is
RESTful, so here’s an example request that will actually
insert a new event into a live events feed of the currently
authorized user. I supply the authorization
token. So the syntax here should be
familiar to you, if you’ve used OAuth. If you haven’t, probably a
good idea to look at it, because this is how a lot
of the web APIs are evolving, to use OAuth. OAuth 2 is particularly
nice because it’s a lot easier to use. And what that allows you to
do is basically allow your application to act
on the user’s behalf. So we have a scope defined
for YouTube APIs. If you obtain authorization from
the user to access their data within that scope, you’ll
be able to basically invoke all these operations
on the user’s behalf. The developer key is important,
because this is how we can identify your application and
we can offer you the ability to actually track your
application usage, monitor your quotas, and so forth. And then a couple of metadata
fields which are required– title, summary, and
start date. Pretty straightforward. Once the event is scheduled,
it can be started. It triggers a state transition
from pending to active. And then the fields– this is an example of
a partial update. I’m using the HTTP patch
request, which allows the request to be a lot more
compact, because I only specify fields that I want to
modify and their values. So in this case, I
am modifying the start date, so yt:when. And then the start value is now. And that will trigger the
state transition. The event will be started. Ending an event, similar
to the previous slide. Patch request, and I
set the end to now. And that will basically
transition the event from active to completed. Updating, as I mentioned
before– if you want to change
the metadata programmatically, you can. Again, a patch request. I
specify the fields that I want to modify, in this case,
title, summary, and the start date. And I supply the new values,
and I’m done. And then finally deleting
an event. And this one is the simplest.
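Pulling the whole lifecycle together, here is a hedged Python sketch of the requests involved. The paths, header names, and field names are illustrative placeholders rather than the exact wire format, and the requests are only assembled, never sent:

```python
# Illustrative sketch of the scheduling lifecycle described above.
# Endpoint paths, headers, and field names are simplified stand-ins
# for the real YouTube live-events feed, not the exact wire format.

def make_request(method, path, fields=None, token="TOKEN", dev_key="DEV_KEY"):
    """Assemble (but do not send) an HTTP request against the events feed."""
    headers = {"Authorization": "Bearer %s" % token,
               "X-GData-Key": "key=%s" % dev_key}
    return {"method": method, "path": path,
            "headers": headers, "body": fields or {}}

def create_event(title, summary, start):
    # Insert a new event into the authorized user's live events feed.
    return make_request("POST", "/feeds/live/events",
                        {"title": title, "summary": summary,
                         "yt:when": {"start": start}})

def start_event(event_id):
    # Partial update: only the field being changed is sent.
    return make_request("PATCH", "/feeds/live/events/%s" % event_id,
                        {"yt:when": {"start": "now"}})

def end_event(event_id):
    return make_request("PATCH", "/feeds/live/events/%s" % event_id,
                        {"yt:when": {"end": "now"}})

def delete_event(event_id):
    return make_request("DELETE", "/feeds/live/events/%s" % event_id)
```

The important part is the shape: create is a POST with full metadata, start and end are partial PATCH updates carrying only the yt:when field being changed, and delete is a bare request against the event ID.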
It’s just a basic HTTP request. Submit it to the feed
against an event ID. OK, one note. I mentioned that earlier, by
default, the live streams are actually archived off to YouTube
and recorded as video so they can be watched later. If you want to disable that in
your application, you can. So there’s a YT archive flag. You can set it to false. And then another thing that
is interesting is you can actually discover the linkage
between the live event that was scheduled and the video that
will be used to actually archive the content
and vice versa. So if somebody missed an event
and in your application, you want to give them the ability
to watch the video for the event that they have missed, you
can actually discover that relationship through the API. OK, player API. As I mentioned before,
our player API– for those of you that
have used it, it will be very familiar. It works pretty much the same
way, with some exceptions. Some operations are
not supported. I will mention a
couple of them. And then for those of you that
haven’t used the player API, this is the Flash or HTML5 embed
that we have. So today, we recommend using the
iframe embed, and we have an API around it. So you can customize
the experience. And here’s an example of a
player API invocation. So what this code example will
do– it will actually load the YouTube API player library. It’s a JavaScript wrapper. It will load a video ID. And this video could
be a live stream. It doesn’t have to be
video on demand or regular YouTube video. It will register a couple
of event listeners– so onPlayerReady and onPlayerStateChange. And here, what I’m doing upon the onReady event– I invoke my event handler, onPlayerReady, and I invoke the YouTube player API method called playVideo. So what that will do is it will
actually start a video immediately as soon as somebody
navigates to a page which includes this script. So this is a way to programmatically trigger the player. You can start and stop, and
there’s a lot of other operations. You can change the volume
and all that stuff. So this is important if you want
to build your own user experience around the
YouTube player. If you don’t want our controls,
you want your own controls, you can do that
through the player API. As I mentioned, some of the
operations are not supported. For example, seekTo allows you
to actually go to a specific point in the video. So with live streams, you can’t
actually seekTo, because it is a little difficult to
seek into the future. And this is something that
we are still working on. So demos. We have a couple of
examples here. Watchme is a little
application that we wrote for Android. PyLive is a piece of code
running on App Engine, written in Python, that implements the
event management APIs that I showed you. WireCast is a software
encoder. And this is a partner
application from a company called Telestream. And Squrl is a content
discovery and sharing application. So let’s transition into
the demos part. So I’m going to bring
up Watchme. And I have it installed
on my Android device. First I need Wi-Fi. OK. Looks like I’m on Wi-Fi. OK. So I’m going to pick an event
that I have previously created and hit Start Streaming. And then I’m going
to go to YouTube. So this is my channel, which has
a bunch of YouTube events. [VIDEO PLAYBACK] JAREK WILKIEWICZ: As you
can tell, there is a little bit of delay. And I’m going to– Smile, you’re on YouTube. OK, so that was our little
Android application demo. Let me tell you a little bit
about how it’s built. So what we did is we used the
YouTube API to retrieve the event list. This application
also allows you to create a new event. You can start an
event as well. And that’s a RESTful API. And then the part that is a
little tricky is the part that takes care of the stream
encoding and actually transmitting over to our YouTube
live streaming server. So for that, we use a native
application called FFmpeg. For those of you that have
actually done any video processing, transcoding, FFmpeg should be very familiar. And we use FFmpeg’s ability to
actually encode the video and the audio stream, wrap it in the
FLV container, and stream it over RTMP to our YouTube
live server. We access FFmpeg using
a JNI wrapper. So that’s the Java native
interface wrapper. And a couple of interesting
notes about this implementation. We use a YUV byte
array for video. So YUV is a way to represent
uncompressed video. And for audio, we have a mono 16-bit
pulse code modulation– PCM– audio stream. And we encode that as well, wrap
it in an FLV container, ship it up to our live
servers over RTMP. So that concludes the
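Outside of Android, the same pipeline can be driven by the ffmpeg command-line tool. Here is a hedged sketch that assembles (but does not run) such a command — the ingest URL and stream name are placeholders, the flag set is simplified, and older ffmpeg builds may need slightly different AAC options:

```python
# Illustrative ffmpeg command line for the pipeline described above:
# H.264 video plus AAC audio, wrapped in an FLV container, pushed
# over RTMP. The ingest URL and stream name are placeholders.

def build_ffmpeg_cmd(source, ingest_url):
    """Assemble the argv for an RTMP live-streaming ffmpeg run."""
    return ["ffmpeg",
            "-i", source,          # captured input (device or file)
            "-c:v", "libx264",     # H.264 video encoding
            "-b:v", "400k",        # roughly 240p-quality bitrate
            "-c:a", "aac",         # AAC audio encoding
            "-f", "flv",           # FLV container
            ingest_url]            # rtmp://... ingest point

cmd = build_ffmpeg_cmd("input.mp4", "rtmp://example.invalid/live/streamname")
# subprocess.call(cmd) would launch it; not executed here.
```

Raising the video bitrate flag is how you climb the quality ladder mentioned earlier, at the cost of requiring more upstream bandwidth.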
Android demo. Let’s look at this other
application that I wrote. As I mentioned, I was literally
finishing this thing up on the plane. So what I did is I wrote a
Python application which allows you to basically retrieve
a list of events. You can view events, you
can create events. This is what it looks like. I can just look at
the event that we used for live streaming. I’m retrieving all the
information from the YouTube backend using the
RESTful APIs. And what I can also do
is I can actually create a new event. So let me just create one. OK, that works. I’m actually running the
local app engine instance on my machine. So this is the Python code
that is executing as I am invoking the API operations. OK. So I have an event already
scheduled. And then what I would like to
do is I would like to– let me just close some
of these windows. I would like to live
stream to my event. So as you see, because I
scheduled the event, it actually now shows up in the
live events panel for my user account on YouTube. So what I’m going to do now is
switch to WireCast, which is a software application. It’s a software encoder. So what I can do is I can select
Broadcast Settings, and select YouTube from the list
of available stream destinations. Log in. And let me just pick– OK. So what happened is WireCast
logged in. And using the API, it retrieved
the event list for events that are currently
pending. Hello, Brazil. Let me select this one. I’ll hit Save. And now WireCast
is configured. So I don’t have to type stream
destination or any of the stuff that would typically
be involved in doing the live streaming. Let me go back to my little
live event manager. And as you see, the event right
now is in pending state. So let’s start it. And what that will do– and I gave you an
example of that. That will actually issue the
patch request and the transition of state from
pending to active. And that’s what I’m showing
here, is an active event. So let me go to WireCast
now and hit Broadcast to start streaming. I go back to my live event
feed, see what’s active. Oh, it shows now that Hello,
Brazil is an active event. Let’s click on it. And this is me. So what you see happen is
WireCast was able to retrieve the live event list
from our backend. I used my little Python
application to actually transition the event state. And then right now, WireCast
is actually encoding the stream coming in from my webcam,
streaming it out to YouTube, and making
it available for our users to see. So a little note about the
PyLive, which is what I called my little Python
implementation. It’s an App Engine application
running with Python. I used OAuth2 authorization. And for that, I chose the
OAuth2 client library. If you’re thinking about using
OAuth for your application, I highly recommend it. It makes OAuth with
Python super easy. So I had to write very little
code to handle OAuth. All I needed to do is decorate
some of my handlers using the OAuth2-required decorator
and annotation. And that was it. So it makes the OAuth2
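The decorator pattern being described looks roughly like the following toy stand-in. This is not the real oauth2client library, which also handles token acquisition and refresh; it only shows how a decorator can gate a handler on authorization:

```python
import functools

# Toy stand-in for the oauth2client decorator pattern described above.
# A request here is just a dict; real App Engine handlers are classes.

def oauth_required(handler):
    """Redirect to authorization unless the request carries credentials."""
    @functools.wraps(handler)
    def wrapper(request):
        if not request.get("credentials"):
            return {"status": 302, "location": "/oauth2/authorize"}
        return handler(request)
    return wrapper

@oauth_required
def list_events(request):
    # In PyLive, this is where the urllib2 call to the RESTful
    # live events feed would happen, using the stored credentials.
    return {"status": 200, "events": []}
```

The handler body never sees an unauthorized request, which is why so little OAuth-handling code is needed in the application itself.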
processing very easy. And then I am using urllib2 to
actually make the RESTful API invocations. So this stuff is
all very easy. WireCast is a software
application. It’s written by a partner. They were able to actually
pull this functionality together in a very short
amount of time. It literally took them a couple
of days to get it into their app, right before
this event. So I’m very excited that I was
able to bring it here and show you, so that you have
a little better idea of what’s involved. And then another application
that is kind of cool– it’s a content curation
application from a company called Squrl. And what they do is, it’s an
application that lets you discover video content. YouTube video content,
but not only YouTube. The web is a big place. There’s other video sources. So they pull in video sources
from multiple streams. And they basically let users
organize it and share it. So let’s see what
it looks like. OK. This is my Squrl account. And in my account, I actually
subscribe to a feed called Featured Live Playlists. So if I click on that feed,
what you see is they are actually retrieving, using the
API, the feed entries for the feeds that I showed you earlier
in this presentation. So I’m actually looking at the
live events happening now, those that are scheduled, those
that have been broadcast already, and those that
are featured. So let me look at upcoming
live events. OK. See if there’s something
interesting here. Oh, Austin City Limits
music festival. So this is going to
be pretty cool. So let’s see if I can
subscribe to it. So Squrl uses this concept
of a squirrel. Squirrels grab stuff and
take it back home. And that’s their user
interface metaphor. So as you browse content and you
discover things that are interesting, you can just
take it with you. So I will Squrl this channel,
and I’ll save it in my live connection, and create
a new playlist for this type of content. I will call it Music. OK. It was Squrled. So if I go back to my channel,
you see I have my live collection, I have Music. And this event is currently
in my collection. One thing that they also do is
they actually keep track of events that are coming up that
you haven’t seen yet. And they show you a little count
of how many new ones were created, so you can see if
somebody added a new event. And another nice thing is this
account here is actually a curator, representing
somebody that is curating YouTube content. And you can actually
see what else they have curated as well. So what they are able to do
is use the discovery APIs. And this is something that
you can use today if you would like. And they consume the streams.
They wrap the very nice user interface around it. And they also give the user the
ability to keep track of what is new. If they missed an event, because
the API exposes the linkage between the event and
the video ID, you can actually consume the archive event as
well using the same user experience. OK. So finally, a couple of notes
about what’s coming up on YouTube live. So I am personally very happy
that Rock in Rio will be live streamed on YouTube. Not in Brazil, because here you
have to watch it on TV. But when I get back home, I’ll
be able to see it live on YouTube, since it’ll be actually
broadcast in a lot of other countries. And then Trama, if you’re into
indie music, is another interesting channel. It’s a partner, and I know they
will have live events on their channel as well. America’s Cup is something that
I showed you previously, as a part of the demo. Let me just play a promo video
that shows the type of functionality that they have
implemented just as a recap. [VIDEO PLAYBACK] -A new dimension to
the America’s Cup. -Just sheer, raw carbon-fiber
power. -This is where they start
to get serious. -We’ve got these boats
that are going to blow people’s minds. -These guys are hanging on
at the back of the boat. -It’s a hard time physically
for these sailors. -25 knots. Wow. That’s just unheard of. -Three, two, one. Here we go. – [INAUDIBLE] 100-meter-plus lead here. -The heart rate’s up around
about 150 to 180 beats a minute. -This is speedboat speed. This is speedboat speed. I mean, this is crazy stuff. [UNINTELLIGIBLE] -This one slip could
be catastrophic. -The America’s Cup sails
into a new era. [END VIDEO PLAYBACK] JAREK WILKIEWICZ: OK. So in recap, why is this useful
or relevant to you? We have over 20,000 content
partners worldwide. And we are enabling more and
more of them for live streaming support. And those are the type of
customers that will be interested in encoding
solutions, event management solutions, content
management system integration, and so forth. We have a user interface, but developers can be a little more innovative
than what we can do, I’m sure. I’m always pleasantly surprised
by what people are able to put together. And then for our users, content
discovery, curation– we get over 3 billion
views per day. And there are a lot of
interesting YouTube applications that actually
present a different user experience around our content. And we support that. We would like to see
more of that. Here’s a link of a couple
of resources. The usual– we have a blog,
documentation is pretty good. We have a Twitter account
if you’d like to reach us through Twitter. The user forum. And that’s pretty much it. So if you have any questions,
we have a couple more minutes left. And then again I’ll be around
in the YouTube lobby most of the day today. So if you don’t get a chance
to ask me now, you can just come by and chat with
me afterwards. Any questions? MALE SPEAKER: [INAUDIBLE] JAREK WILKIEWICZ: Yeah. So the question was, what’s the
minimum required to be a content partner? So I don’t have a very good
answer to that, because I work on the engineering side,
on the API side. And the content partner team is
a team that evaluates the quality of the content. Typically what they look
for is, do you produce good content? Do you have an audience? Do you already have a YouTube
presence with a lot of subscribers? So that typically
is important. So they want folks who will
produce original content and have the rights to that content, in order to be a content partner. Now for application developers,
some of the partners that I showed
you here– if you are developing an
application for content partners– so if you’re
going to be targeting that group of people– then contact us and we’ll help
you get access to the platform for testing purposes. So for API partners, this is
something that we’ve been working on to make sure that our
developers can actually do the testing, much like what
I showed you today. So you could do your testing
before you roll out your product to the partners. For more information, if you
go to our developer site, there’s a lot of information
there. OK. Another question? MALE SPEAKER: I saw that you
have a lot of simultaneous cameras available on the stream
and they have titles. Is it possible to change the
title using the API? JAREK WILKIEWICZ: Yeah. Each stream is a video. And the video metadata
associated with that stream is very similar to the regular
metadata that you see for regular videos. So the streams in the event
are treated as a playlist. Each playlist has
a video entry. And that metadata corresponds
to what you’ll see with the video. Yeah. MALE SPEAKER:
for translation. FEMALE SPEAKER: [INAUDIBLE] JAREK WILKIEWICZ: Yes. So the question was, the
multiple camera view, is that available for partners? So that’s a feature
of the player. So if the stream actually has
multiple cameras, then the player will be able
to display it. So this is something that
a partner can schedule. And if they actually have
multiple video streams, then our player detects that
through the metadata. And it will actually show
the multiple cameras. OK, I guess we have time
for one more question. So let’s take one
last question. And then after this, I’ll
be available outside. MALE SPEAKER: I just wanted to
know about if there is any time frame or plans to support
VP8 streaming instead of H264? [INTERPOSING VOICES] JAREK WILKIEWICZ: Yeah, that’s
a good question. So I don’t have the time
frame for that. But yeah, that is something that
the team is working on, because that question
has come up before. So I’m pretty sure that this is
going to come, but I don’t have the exact time frame. OK. So I think we ran out of time. Thank you very much for coming
to this YouTube session. I hope that was useful. And please come by and say hello
after the presentation. Thank you.
