
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:media="http://search.yahoo.com/mrss/">
    <channel>
        <title><![CDATA[ The Cloudflare Blog ]]></title>
        <description><![CDATA[ Get the latest news on how Cloudflare products are built and the technologies we use, and join the teams helping to build a better Internet. ]]></description>
        <link>https://blog.cloudflare.com</link>
        <atom:link href="https://blog.cloudflare.com/" rel="self" type="application/rss+xml"/>
        <language>en-us</language>
        <image>
            <url>https://blog.cloudflare.com/favicon.png</url>
            <title>The Cloudflare Blog</title>
            <link>https://blog.cloudflare.com</link>
        </image>
        <lastBuildDate>Thu, 09 Apr 2026 14:38:08 GMT</lastBuildDate>
        <item>
            <title><![CDATA[MoQ: Refactoring the Internet's real-time media stack]]></title>
            <link>https://blog.cloudflare.com/moq/</link>
            <pubDate>Fri, 22 Aug 2025 14:00:00 GMT</pubDate>
            <description><![CDATA[ Media over QUIC (MoQ) is a new IETF standard that resolves the long-standing tension between latency, scale, and complexity, creating a single foundation for sub-second, interactive streaming at global scale. ]]></description>
            <content:encoded><![CDATA[ <p>For over two decades, we've built real-time communication on the Internet using a patchwork of specialized tools. RTMP gave us ingest. <a href="https://www.cloudflare.com/learning/video/what-is-http-live-streaming/"><u>HLS</u></a> and <a href="https://www.mpeg.org/standards/MPEG-DASH/"><u>DASH</u></a> gave us scale. WebRTC gave us interactivity. Each solved a specific problem for its time, and together they power the global streaming ecosystem we rely on today.</p><p>But using them together in 2025 feels like building a modern application with tools from different eras. The seams are starting to show—in complexity, in latency, and in the flexibility needed for the next generation of applications, from sub-second live auctions to massive interactive events. We're often forced to make painful trade-offs between latency, scale, and operational complexity.</p><p>Today Cloudflare is launching the first Media over QUIC (MoQ) relay network, running on every Cloudflare server in datacenters in 330+ cities. MoQ is an open protocol being developed at the <a href="https://www.ietf.org/"><u>IETF</u></a> by engineers from across the industry—not a proprietary Cloudflare technology. MoQ combines the low-latency interactivity of WebRTC, the scalability of HLS/DASH, and the simplicity of a single architecture, all built on a modern transport layer. We're joining Meta, Google, Cisco, and others in building implementations that work seamlessly together, creating a shared foundation for the next generation of real-time applications on the Internet.</p>
    <div>
      <h3><b>An evolutionary ladder of compromise</b></h3>
      <a href="#an-evolutionary-ladder-of-compromise">
        
      </a>
    </div>
    <p>To understand the promise of MoQ, we first have to appreciate the history that led us here—a journey defined by a series of architectural compromises where solving one problem inevitably created another.</p><p><b>The RTMP era: Conquering latency, compromising on scale</b></p><p>In the early 2000s, <b>RTMP (Real-Time Messaging Protocol)</b> was a breakthrough. It solved the frustrating "download and wait" experience of early video playback on the web by creating a persistent, stateful TCP connection between a <a href="https://en.wikipedia.org/wiki/Adobe_Flash"><u>Flash</u></a> client and a server. This enabled low-latency streaming (2-5 seconds), powering the first wave of live platforms like <a href="http://justin.tv"><u>Justin.tv</u></a> (which later became Twitch).</p><p>But its strength was its weakness. That stateful connection, which had to be maintained for every viewer, was architecturally hostile to scale. It required expensive, specialized media servers and couldn't use the commodity HTTP-based <a href="https://www.cloudflare.com/learning/cdn/what-is-a-cdn/"><u>Content Delivery Networks (CDNs)</u></a> that were beginning to power the rest of the web. Its reliance on TCP also meant that a single lost packet could freeze the entire stream—a phenomenon known as <a href="https://blog.cloudflare.com/the-road-to-quic/#head-of-line-blocking"><u>head-of-line blocking</u></a>—creating jarring latency spikes. The industry retained RTMP for the "first mile" from the camera to servers (ingest), but a new solution was needed for the "last mile" from servers to your screen (delivery).</p><p><b>The HLS &amp; DASH era: Solving for scale, compromising on latency</b></p><p>The catalyst for the next era was the iPhone's rejection of Flash. In response, Apple created <a href="https://www.cloudflare.com/learning/video/what-is-http-live-streaming/"><b><u>HLS (HTTP Live Streaming)</u></b></a>. 
HLS and its open-standard counterpart, <b>MPEG-DASH</b>, abandoned stateful connections and treated video as a sequence of small, static files delivered over standard HTTP.</p><p>This enabled much greater scalability. By moving to the interoperable open standard of HTTP for the underlying transport, video could now be distributed by any web server and cached by global CDNs, allowing platforms to reach millions of viewers reliably and relatively inexpensively. The compromise? A <i>significant</i> trade-off in latency. To ensure smooth playback, players needed to buffer at least three video segments before starting. With segment durations of 6-10 seconds, this baked 15-30 seconds of latency directly into the architecture.</p><p>While extensions like <a href="https://developer.apple.com/documentation/http-live-streaming/enabling-low-latency-http-live-streaming-hls"><u>Low-Latency HLS (LL-HLS)</u></a> have more recently emerged to achieve latencies in the 3-second range, they remain complex patches <a href="https://blog.cloudflare.com/the-road-to-quic/#head-of-line-blocking"><u>fighting against the protocol's fundamental design</u></a>. These extensions introduce a layer of stateful, real-time communication—using clever workarounds like holding playlist requests open—that ultimately strains the stateless request-response model central to HTTP's scalability and composability.</p><p><b>The WebRTC era: Conquering conversational latency, compromising on architecture</b></p><p>In parallel, <b>WebRTC (Web Real-Time Communication)</b> emerged to solve a different problem: plugin-free, two-way conversational video with sub-500ms latency within a browser. It worked by creating direct peer-to-peer (P2P) media paths, removing central servers from the equation.</p><p>But this P2P model is fundamentally at odds with broadcast scale. 
<a href="https://blog.cloudflare.com/cloudflare-calls-anycast-webrtc/#webrtc-growing-pains"><u>In a mesh network, the number of connections grows quadratically with each new participant</u></a> (the "N-squared problem"). For more than a handful of users, the model collapses under the weight of its own complexity. To work around this, the industry developed server-based topologies like the Selective Forwarding Unit (SFU) and Multipoint Control Unit (MCU). These are effective but require building what is essentially a <a href="https://blog.cloudflare.com/cloudflare-calls-anycast-webrtc/#is-cloudflare-calls-a-real-sfu"><u>private, stateful, real-time CDN</u></a>—a complex and expensive undertaking that is not standardized across infrastructure providers.</p><p>This journey has left us with a fragmented landscape of specialized, non-interoperable silos, forcing developers to stitch together multiple protocols and accept a painful three-way tension between <b>latency, scale, and complexity</b>.</p>
    <div>
      <h3><b>Introducing MoQ</b></h3>
      <a href="#introducing-moq">
        
      </a>
    </div>
    <p>This is the context into which Media over QUIC (MoQ) emerges. It's not just another protocol; it's a new design philosophy built from the ground up to resolve this historical trilemma. Born out of an open, community-driven effort at the IETF, <u>MoQ aims to be a foundational Internet technology, not a proprietary product</u>.</p><p>Its promise is to unify the disparate worlds of streaming by delivering:</p><ol><li><p><b>Sub-second latency at broadcast scale:</b> Combining the latency of WebRTC with the scale of HLS/DASH and the simplicity of RTMP.</p></li><li><p><b>Architectural simplicity:</b> Creating a single, flexible protocol for ingest, distribution, and interactive use cases, eliminating the need to transcode between different technologies.</p></li><li><p><b>Transport efficiency:</b> Building on <a href="https://blog.cloudflare.com/the-road-to-quic/"><u>QUIC</u></a>, a <a href="https://www.cloudflare.com/learning/ddos/glossary/user-datagram-protocol-udp/"><u>UDP</u></a>-based protocol, to eliminate bottlenecks like TCP <a href="https://blog.cloudflare.com/the-road-to-quic/#head-of-line-blocking"><u>head-of-line blocking</u></a>.</p></li></ol><p>The initial focus was "Media" over QUIC, but the core concepts—named tracks of timed, ordered, but independent data—are so flexible that the working group is now simply calling the protocol "MoQ." The name reflects the power of the abstraction: it's a generic transport for any real-time data that needs to be delivered efficiently and at scale.</p><p>MoQ is now generic enough to serve as a general-purpose data fan-out, or pub/sub, system for everything from audio/video (high-bandwidth data) to sports score updates (low-bandwidth data).</p>
    <div>
      <h3><b>A deep dive into the MoQ protocol stack</b></h3>
      <a href="#a-deep-dive-into-the-moq-protocol-stack">
        
      </a>
    </div>
    <p>MoQ's elegance comes from solving the right problem at the right layer. Let's build up from the foundation to see how it achieves sub-second latency at scale.</p><p>The choice of QUIC as MoQ's foundation isn't arbitrary—it addresses issues that have plagued streaming protocols for decades.</p><p>By building on <b>QUIC</b> (the transport protocol that also powers <a href="https://www.cloudflare.com/learning/performance/what-is-http3/"><u>HTTP/3</u></a>), MoQ solves some key streaming problems:</p><ul><li><p><b>No head-of-line blocking:</b> Unlike TCP where one lost packet blocks everything behind it, QUIC streams are independent. A lost packet on one stream (e.g., an audio track) doesn't block another (e.g., the main video track). This alone eliminates the stuttering that plagued RTMP.</p></li><li><p><b>Connection migration:</b> When your device switches from Wi-Fi to cellular mid-stream, the connection seamlessly migrates without interruption—no rebuffering, no reconnection.</p></li><li><p><b>Fast connection establishment:</b> QUIC's <a href="https://blog.cloudflare.com/even-faster-connection-establishment-with-quic-0-rtt-resumption/"><u>0-RTT resumption</u></a> means returning viewers can start playing instantly.</p></li><li><p><b>Baked-in, mandatory encryption:</b> All QUIC connections are encrypted by default with <a href="https://blog.cloudflare.com/rfc-8446-aka-tls-1-3/"><u>TLS 1.3</u></a>.</p></li></ul>
    <div>
      <h4>The core innovation: Publish/subscribe for media</h4>
      <a href="#the-core-innovation-publish-subscribe-for-media">
        
      </a>
    </div>
    <p>With QUIC solving transport issues, MoQ introduces its key innovation: treating media as subscribable tracks in a publish/subscribe system. But unlike traditional pub/sub, this is designed specifically for real-time media at CDN scale.</p><p>Instead of complex session management (WebRTC) or file-based chunking (HLS), <b>MoQ lets publishers announce named tracks of media that subscribers can request</b>. A relay network handles the distribution without needing to understand the media itself.</p>
    <div>
      <h4>How MoQ organizes media: The data model</h4>
      <a href="#how-moq-organizes-media-the-data-model">
        
      </a>
    </div>
    <p>Before we see how media flows through the network, let's understand how MoQ structures it. MoQ organizes data in a hierarchy:</p><ul><li><p><b>Tracks</b>: Named streams of media, like "video-1080p" or "audio-english". Subscribers request specific tracks by name.</p></li><li><p><b>Groups</b>: Independently decodable chunks of a track. For video, this typically means a GOP (Group of Pictures) starting with a keyframe. New subscribers can join at any Group boundary.</p></li><li><p><b>Objects</b>: The actual packets sent on the wire. Each Object belongs to a Track and has a position within a Group.</p></li></ul><p>This simple hierarchy enables two capabilities:</p><ol><li><p>Subscribers can start playback at <b>Group</b> boundaries without waiting for the next keyframe</p></li><li><p>Relays can forward <b>Objects</b> without parsing or understanding the media format</p></li></ol>
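<p>As a concrete illustration of this hierarchy, here is a small TypeScript sketch. The type and function names are ours, invented for illustration, and do not come from any MoQ library or draft:</p>

```typescript
// Hypothetical sketch of MoQ's data hierarchy. Names are illustrative.
interface MoqObject {
  trackName: string;   // the named Track, e.g. "video-1080p"
  groupId: number;     // which Group (independently decodable unit) it belongs to
  objectId: number;    // position within the Group
  payload: Uint8Array; // opaque bytes; relays forward this without parsing it
}

// A new subscriber can only begin decoding at a Group boundary (the first
// Object of a Group, which for video typically carries the keyframe).
function canStartPlaybackAt(obj: MoqObject): boolean {
  return obj.objectId === 0;
}
```

<p>The relay-friendliness falls out of the types: nothing above requires knowing whether <code>payload</code> is H.264 video, Opus audio, or something else entirely.</p>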
    <div>
      <h5>The network architecture: From publisher to subscriber</h5>
      <a href="#the-network-architecture-from-publisher-to-subscriber">
        
      </a>
    </div>
    <p>MoQ’s network components are also simple:</p><ul><li><p><b>Publishers</b>: Announce track namespaces and send Objects</p></li><li><p><b>Subscribers</b>: Request specific tracks by name</p></li><li><p><b>Relays</b>: Connect publishers to subscribers by forwarding immutable Objects without parsing or <a href="https://www.cloudflare.com/learning/video/video-encoding-formats/"><u>transcoding</u></a> the media</p></li></ul><p>A Relay acts as a subscriber to receive tracks from upstream (like the original publisher) and simultaneously acts as a publisher to forward those same tracks downstream. This model is the key to MoQ's scalability: one upstream subscription can fan out to serve thousands of downstream viewers.</p>
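<p>The fan-out behavior can be sketched in a few lines of TypeScript. This is a conceptual model with invented names, not the API of any real relay:</p>

```typescript
// Minimal relay sketch: one upstream subscription per track name,
// fanned out to every downstream subscriber without parsing the media.
type Forward = (payload: Uint8Array) => void;

class RelaySketch {
  private downstream = new Map<string, Set<Forward>>();

  // Returns true when this is the first local subscriber for the track,
  // i.e. the relay must create exactly one new upstream subscription.
  subscribe(trackName: string, deliver: Forward): boolean {
    let subs = this.downstream.get(trackName);
    const needsUpstream = subs === undefined;
    if (subs === undefined) {
      subs = new Set();
      this.downstream.set(trackName, subs);
    }
    subs.add(deliver);
    return needsUpstream;
  }

  // Called when an Object arrives from upstream; returns the fan-out count.
  onObject(trackName: string, payload: Uint8Array): number {
    const subs = this.downstream.get(trackName);
    if (subs === undefined) return 0;
    for (const deliver of subs) deliver(payload);
    return subs.size;
  }
}
```

<p>However many viewers attach locally, the relay holds a single upstream subscription per track, which is exactly the property that lets one publisher reach thousands of subscribers.</p>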
    <div>
      <h5>The MoQ stack</h5>
      <a href="#the-moq-stack">
        
      </a>
    </div>
    
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4g2MroH24otkzH3LQsFWZe/84ca43ad6c1c933ac395bf4ac767c584/image1.png" />
          </figure><p>MoQ's architecture can be understood as three distinct layers, each with a clear job:</p><ol><li><p><b>The Transport Foundation (QUIC or WebTransport):</b> This is the modern foundation upon which everything is built. MoQT can run directly over raw <b>QUIC</b>, which is ideal for native applications, or over <b>WebTransport</b>, which is required for use in a web browser. Crucially, the <a href="https://www.ietf.org/archive/id/draft-ietf-webtrans-http3-02.html"><u>WebTransport protocol</u></a> and its corresponding <a href="https://w3c.github.io/webtransport/"><u>W3C browser API</u></a> make QUIC's multiplexed reliable streams and unreliable datagrams directly accessible to browser applications. This is a game-changer. Protocols like <a href="https://blog.cloudflare.com/stream-now-supports-srt-as-a-drop-in-replacement-for-rtmp/"><u>SRT</u></a> may be efficient, but their lack of native browser support relegates them to ingest-only roles. WebTransport gives MoQ first-class citizenship on the web, making it suitable for both ingest and massive-scale distribution directly to clients.</p></li><li><p><b>The MoQT Layer:</b> Sitting on top of QUIC (or WebTransport), the MoQT layer provides the signaling and structure for a publish-subscribe system. This is the primary focus of the IETF working group. It defines the core control messages—like ANNOUNCE and SUBSCRIBE—and the basic data model we just covered. MoQT itself is intentionally spartan; it doesn't know or care if the data it's moving is <a href="https://www.cloudflare.com/learning/video/what-is-h264-avc/"><u>H.264</u></a> video, Opus audio, or game state updates.</p></li><li><p><b>The Streaming Format Layer:</b> This is where media-specific logic lives. A streaming format defines things like manifests, codec metadata, and packaging rules.
 <a href="https://datatracker.ietf.org/doc/draft-ietf-moq-warp/"><b><u>WARP</u></b></a> is one such format being developed alongside MoQT at the IETF, but it isn't the only one. Another standards body, like DASH-IF, could define a <a href="https://www.iso.org/standard/85623.html"><u>CMAF</u></a>-based streaming format over MoQT. A company that controls both original publisher and end subscriber can develop its own proprietary streaming format to experiment with new codecs or delivery mechanisms without being constrained by the transport protocol.</p></li></ol><p>This separation of layers is why different organizations can build interoperable implementations while still innovating at the streaming format layer.</p>
    <div>
      <h4>End-to-end data flow</h4>
      <a href="#end-to-end-data-flow">
        
      </a>
    </div>
    <p>Now that we understand the architecture and the data model, let's walk through how these pieces come together to deliver a stream. The protocol is flexible, but a typical broadcast flow relies on the <code>ANNOUNCE</code> and <code>SUBSCRIBE</code> messages to establish a data path from a publisher to a subscriber through the relay network.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2iRTJFdtCjIOcyg7ezoYgJ/e303ea8d1eb438328b60fdb28be47e84/image2.png" />
          </figure><p>Here is a step-by-step breakdown of what happens in this flow:</p><ol><li><p><b>Initiating Connections:</b> The process begins when the endpoints, acting as clients, connect to the relay network. The Original Publisher initiates a connection with its nearest relay (we'll call it Relay A). Separately, an End Subscriber initiates a connection with its own local relay (Relay B). These endpoints perform a <code>SETUP</code> handshake with their respective relays to establish a MoQ session and declare supported parameters.</p></li><li><p><b>Announcing a Namespace:</b> To make its content discoverable, the Publisher sends an <code>ANNOUNCE</code> message to Relay A. This message declares that the publisher is the authoritative source for a given <b>track namespace</b>. Relay A receives this and registers in a shared control plane (a conceptual database) that it is now a source for this namespace within the network.</p></li><li><p><b>Subscribing to a Track:</b> When the End Subscriber wants to receive media, it sends a <code>SUBSCRIBE</code> message to its relay, Relay B. This message is a request for a specific <b>track name</b> within a specific <b>track namespace</b>.</p></li><li><p><b>Connecting the Relays:</b> Relay B receives the <code>SUBSCRIBE</code> request and queries the control plane. It looks up the requested namespace and discovers that Relay A is the source. Relay B then initiates a session with Relay A (if it doesn't already have one) and forwards the <code>SUBSCRIBE</code> request upstream.</p></li><li><p><b>Completing the Path and Forwarding Objects:</b> Relay A, having received the subscription request from Relay B, forwards it to the Original Publisher. With the full path now established, the Publisher begins sending the <code>Objects</code> for the requested track. The Objects flow from the Publisher to Relay A, which forwards them to Relay B, which in turn forwards them to the End Subscriber. 
If another subscriber connects to Relay B and requests the same track, Relay B can immediately start sending them the Objects without needing to create a new upstream subscription.</p></li></ol>
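<p>Steps 2 and 4 hinge on the shared control plane. Here is a hedged TypeScript sketch of that lookup with invented names; it is a conceptual model, not Cloudflare's actual implementation:</p>

```typescript
// Conceptual control plane: ANNOUNCE registers which relay is the source
// for a track namespace; a relay handling a SUBSCRIBE queries it to find
// where to forward the request upstream.
class ControlPlaneSketch {
  private sources = new Map<string, string>(); // namespace -> source relay id

  // Step 2: Relay A records that it is a source for the namespace.
  announce(namespace: string, relayId: string): void {
    this.sources.set(namespace, relayId);
  }

  // Step 4: Relay B discovers which relay to connect to upstream,
  // or undefined if nobody has announced the namespace yet.
  upstreamFor(namespace: string): string | undefined {
    return this.sources.get(namespace);
  }
}
```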
    <div>
      <h5>An alternative flow: The <code>PUBLISH</code> model</h5>
      <a href="#an-alternative-flow-the-publish-model">
        
      </a>
    </div>
    
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/6KJYU1eWNyuSZEHNYonDDn/3898003d5a7f5904787c7ef009b22fe0/image3.png" />
          </figure><p>More recent drafts of the MoQ specification have introduced an alternative, push-based model using a <code>PUBLISH</code> message. In this flow, a publisher can effectively ask for permission to send a track's objects to a relay <i>without</i> waiting for a <code>SUBSCRIBE</code> request. The publisher sends a <code>PUBLISH</code> message, and the relay's <code>PUBLISH_OK</code> response indicates whether it will accept the objects. This is particularly useful for ingest scenarios, where a publisher wants to send its stream to an entry point in the network immediately, ensuring the media is available the instant the first subscriber connects.</p>
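<p>In code, the handshake reduces to a simple request/response. The message shapes below are assumptions for illustration, not the wire format defined in the draft:</p>

```typescript
// Sketch of a relay deciding on a push-based PUBLISH before any
// subscriber exists. Message shapes are illustrative only.
interface PublishMsg { namespace: string; trackName: string; }
interface PublishOk  { accepted: boolean; }

function handlePublish(
  msg: PublishMsg,
  authorized: (namespace: string) => boolean, // stand-in for a real auth policy
): PublishOk {
  // A real relay would also consider capacity, quotas, and session state.
  return { accepted: authorized(msg.namespace) };
}
```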
    <div>
      <h4>Advanced capabilities: Prioritization and congestion control</h4>
      <a href="#advanced-capabilities-prioritization-and-congestion-control">
        
      </a>
    </div>
    <p>MoQ’s benefits really shine when networks get congested. MoQ includes mechanisms for handling the reality of network traffic. One such mechanism is Subgroups.</p><p><b>Subgroups</b> are subdivisions within a Group that effectively map directly to the underlying QUIC streams. All Objects within the same Subgroup are generally sent on the same QUIC stream, guaranteeing their delivery order. Subgroup numbering also presents an opportunity to encode prioritization: within a Group, lower-numbered Subgroups are considered higher priority. </p><p>This enables intelligent quality degradation, especially with layered codecs (e.g. SVC):</p><ul><li><p><b>Subgroup 0</b>: Base video layer (360p) - must deliver</p></li><li><p><b>Subgroup 1</b>: Enhancement to 720p - deliver if bandwidth allows</p></li><li><p><b>Subgroup 2</b>: Enhancement to 1080p - first to drop under congestion</p></li></ul><p>When a relay detects congestion, it can drop Objects from higher-numbered Subgroups, preserving the base layer. Viewers see reduced quality instead of buffering.</p><p>The MoQ specification defines a scheduling algorithm that determines the order for all objects that are "ready to send." When a relay has multiple objects ready, it prioritizes them first by <b>group order</b> (ascending or descending) and then, within a group, by <b>subgroup id</b>. Our implementation supports the <b>group order</b> preference, which can be useful for low-latency broadcasts. If a viewer falls behind and its subscription uses descending group order, the relay prioritizes sending Objects from the newest "live" Group, potentially canceling unsent Objects from older Groups. This can help viewers catch up to the live edge quickly, a highly desirable feature for many interactive streaming use cases. The optimal strategies for using these features to improve QoE for specific use cases are still an open research question. 
We invite developers and researchers to use our network to experiment and help find the answers.</p>
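<p>The two behaviors described above can be sketched as pure functions. This assumes ascending group order and invented shapes; the normative scheduling algorithm lives in the MoQ specification:</p>

```typescript
// Sketch of relay-side scheduling and congestion drops. Illustrative only.
interface QueuedObject { groupId: number; subgroupId: number; }

// Order "ready to send" Objects: by group first (ascending here), then by
// subgroup id, so lower-numbered (higher-priority) Subgroups go out first.
function schedule(queue: QueuedObject[]): QueuedObject[] {
  return [...queue].sort(
    (a, b) => a.groupId - b.groupId || a.subgroupId - b.subgroupId,
  );
}

// Under congestion, keep only Subgroups at or below a budget: a budget of 0
// preserves just the base layer of an SVC-encoded stream.
function dropAboveSubgroup(queue: QueuedObject[], maxSubgroup: number): QueuedObject[] {
  return queue.filter((o) => o.subgroupId <= maxSubgroup);
}
```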
    <div>
      <h3><b>Implementation: building the Cloudflare MoQ relay</b></h3>
      <a href="#implementation-building-the-cloudflare-moq-relay">
        
      </a>
    </div>
    <p>Theory is one thing; implementation is another. To validate the protocol and understand its real-world challenges, we've been building one of the first global MoQ relay networks. Cloudflare's network, which places compute and logic at the edge, is very well suited for this.</p><p>Our architecture connects the abstract concepts of MoQ to the Cloudflare stack. In our deep dive, we mentioned that when a publisher <code>ANNOUNCE</code>s a namespace, relays need to register this availability in a "shared control plane" so that <code>SUBSCRIBE</code> requests can be routed correctly. For this critical piece of state management, we use <a href="https://developers.cloudflare.com/durable-objects/"><u>Durable Objects</u></a>.</p><p>When a publisher announces a new namespace to a relay in, say, London, that relay uses a Durable Object—our strongly consistent, single-threaded storage solution—to record that this namespace is now available at that specific location. When a subscriber in Paris wants a track from that namespace, the network can query this distributed state to find the nearest source and route the <code>SUBSCRIBE</code> request accordingly. This architecture builds upon the technology we developed for Cloudflare's real-time services and provides a solution to the challenge of state management at a global scale.</p>
    <div>
      <h4>An evolving specification</h4>
      <a href="#an-evolving-specification">
        
      </a>
    </div>
    <p>Building on a new protocol in the open means implementing against a moving target. To get MoQ into the hands of the community, we made a deliberate trade-off: our current relay implementation is based on a <b>subset of the features defined in </b><a href="https://www.ietf.org/archive/id/draft-ietf-moq-transport-07.html"><b><u>draft-ietf-moq-transport-07</u></b></a>. This version became a de facto target for interoperability among several open-source projects and pausing there allowed us to put effort towards other aspects of deploying our relay network<b>.</b></p><p>This draft of the protocol makes a distinction between accessing "past" and "future" content. <code><b>SUBSCRIBE</b></code> is used to receive <b>future</b> objects for a track as they arrive—like tuning into a live broadcast to get everything from that moment forward. In contrast, <code><b>FETCH</b></code> provides a mechanism for accessing <b>past</b> content that a relay may already have in its cache—like asking for a recording of a song that just played.</p><p>Both are part of the same specification, but for the most pressing low-latency use cases, a performant implementation of <code>SUBSCRIBE</code> is what matters most. For that reason, we have focused our initial efforts there and have not yet implemented <code>FETCH</code>.</p><p>This is where our roadmap is flexible and where the community can have a direct impact. Do you need <code>FETCH</code> to build on-demand or catch-up functionality? Or is more complete support for the prioritization features within <code>SUBSCRIBE</code> more critical for your use case? The feedback we receive from early developers will help us decide what to build next.</p><p>As always, we will announce our updates and changes to our implementation as we continue with development on our <a href="https://developers.cloudflare.com/moq"><u>developer docs pages</u></a>.</p>
    <div>
      <h3>Kick the tires on the future</h3>
      <a href="#kick-the-tires-on-the-future">
        
      </a>
    </div>
    <p>We believe in building in the open and in interoperability across the community. MoQ is not a Cloudflare technology but a foundational Internet technology. To that end, the first demo client we’re presenting is an open source, community example.</p><p><b>You can access the demo here: </b><a href="https://moq.dev/publish/"><b><u>https://moq.dev/publish/</u></b></a></p><p>Even though this is a preview release, we are running MoQ relays at Cloudflare’s full scale, as we do with every production service. This means every server that is part of the Cloudflare network in more than 330 cities is now a MoQ relay.</p><p>We invite you to experience the "wow" moment of near-instant, sub-second streaming latency that MoQ enables. How would you use a protocol that offers the speed of a video call with the scale of a global broadcast?</p>
    <div>
      <h3><b>Interoperability</b></h3>
      <a href="#interoperability">
        
      </a>
    </div>
    <p>We’ve been working with others in the IETF WG community and beyond on interoperability of publishers, players and other parts of the MoQ ecosystem. So far, we’ve tested with:</p><ul><li><p>Luke Curley’s <a href="https://moq.dev"><u>moq.dev</u></a></p></li><li><p>Lorenzo Miniero’s <a href="https://github.com/meetecho/imquic"><u>imquic</u></a></p></li><li><p>Meta’s <a href="https://github.com/facebookexperimental/moxygen"><u>Moxygen</u></a> </p></li><li><p><a href="https://github.com/englishm/moq-rs"><u>moq-rs</u></a></p></li><li><p><a href="https://github.com/englishm/moq-js"><u>moq-js</u></a></p></li><li><p><a href="https://norsk.video/"><u>Norsk</u></a></p></li><li><p><a href="https://vindral.com/"><u>Vindral</u></a></p></li></ul>
    <div>
      <h3>The road ahead</h3>
      <a href="#the-road-ahead">
        
      </a>
    </div>
    <p>The Internet's media stack is being refactored. For two decades, we've been forced to choose between latency, scale, and complexity. The compromises we made solved some problems, but also led to a fragmented ecosystem.</p><p>MoQ represents a promising new foundation—a chance to unify the silos and build the next generation of real-time applications on a scalable protocol. We're committed to helping build this foundation in the open, and we're just getting started.</p><p>MoQ is a realistic way forward: built on QUIC for future-proofing, simpler to understand than WebRTC, and, unlike RTMP, compatible with browsers.</p><p>The protocol is evolving, the implementations are maturing, and the community is growing. Whether you're building the next generation of live streaming, exploring real-time collaboration, or pushing the boundaries of interactive media, consider whether MoQ may provide the foundation you need.</p>
    <div>
      <h3>Availability and pricing</h3>
      <a href="#availability-and-pricing">
        
      </a>
    </div>
    <p>We want developers to start building with MoQ today. To make that possible, MoQ at Cloudflare is in tech preview: it's available free of charge for testing (at any scale). Visit our <a href="https://developers.cloudflare.com/moq/"><u>developer homepage</u></a> for updates and potential breaking changes.</p><p>Indie developers and large enterprises alike ask about pricing early in their adoption of new technologies. We will be transparent and clear about MoQ pricing. In general availability, self-serve customers should expect to pay 5 cents/GB outbound, with no cost for traffic sent towards Cloudflare.</p><p>Enterprise customers can expect pricing in line with regular media delivery pricing, competitive with incumbent protocols. This means that if you’re already using Cloudflare for media delivery, you should not be wary of adopting new technologies because of cost. We will support you.</p><p>If you’re interested in partnering with Cloudflare in adopting the protocol early or contributing to its development, please reach out to us at <a href="mailto:moq@cloudflare.com"><u>moq@cloudflare.com</u></a>! Engineers excited about the future of the Internet are standing by.</p>
    <div>
      <h3>Get involved</h3>
      <a href="#get-involved">
        
      </a>
    </div>
    <ul><li><p><b>Try the demo:</b> <a href="https://moq.dev/publish/"><u>https://moq.dev/publish/</u></a></p></li><li><p><b>Read the Internet draft:</b> <a href="https://datatracker.ietf.org/doc/draft-ietf-moq-transport/"><u>https://datatracker.ietf.org/doc/draft-ietf-moq-transport/</u></a></p></li><li><p><b>Contribute</b> to the protocol’s development: <a href="https://datatracker.ietf.org/group/moq/documents/"><u>https://datatracker.ietf.org/group/moq/documents/</u></a></p></li><li><p><b>Visit </b>our developer homepage: <a href="https://developers.cloudflare.com/moq/"><u>https://developers.cloudflare.com/moq/</u></a></p></li></ul><p></p> ]]></content:encoded>
            <category><![CDATA[Video]]></category>
            <category><![CDATA[QUIC]]></category>
            <category><![CDATA[Live Streaming]]></category>
            <category><![CDATA[WebRTC]]></category>
            <category><![CDATA[IETF]]></category>
            <category><![CDATA[Standards]]></category>
            <guid isPermaLink="false">2XgF5NjmAy3cqybLPkpMFu</guid>
            <dc:creator>Mike English</dc:creator>
            <dc:creator>Renan Dincer</dc:creator>
        </item>
        <item>
            <title><![CDATA[Orange Me2eets: We made an end-to-end encrypted video calling app and it was easy]]></title>
            <link>https://blog.cloudflare.com/orange-me2eets-we-made-an-end-to-end-encrypted-video-calling-app-and-it-was/</link>
            <pubDate>Thu, 26 Jun 2025 14:00:00 GMT</pubDate>
            <description><![CDATA[ Orange Meets, our open-source video calling web application, now supports end-to-end encryption using the MLS protocol with continuous group key agreement. ]]></description>
            <content:encoded><![CDATA[ <p>Developing a new video conferencing application often begins with a peer-to-peer setup using <a href="https://webrtc.org/"><u>WebRTC</u></a>, facilitating direct data exchange between clients. While effective for small demonstrations, this method runs into scalability hurdles as participants increase: each client must send its media to every other client, so its upload load grows in proportion to the number of users (n-1 streams per sender).</p><p>Selective Forwarding Units (SFUs) are essential to scaling video conferencing applications. An SFU is a media stream routing hub: it receives media and data flows from participants and intelligently determines which streams to forward. By strategically distributing media based on network conditions and participant needs, it minimizes bandwidth usage and greatly enhances scalability. Nearly every video conferencing application today uses SFUs.</p><p>In 2024, we announced <a href="https://blog.cloudflare.com/cloudflare-calls-anycast-webrtc/"><u>Cloudflare Realtime</u></a> (then called Cloudflare Calls), our suite of WebRTC products, and we also released <a href="https://github.com/cloudflare/orange"><u>Orange Meets</u></a>, an open source video chat application built on top of our SFU.</p><p>We also realized that use of an SFU often comes with a privacy cost: there is now a centralized hub that could see and listen to all the media content, even though its sole job is to forward media bytes between clients as a data plane.</p><p>We believe end-to-end encryption should be the industry standard for secure communication, and that’s why today we’re excited to share that we’ve implemented and open sourced end-to-end encryption in Orange Meets. Our generic implementation is client-only, so it can be used with any WebRTC infrastructure. Finally, we verified our new <i>designated committer</i> distributed algorithm in a bounded model checker to ensure it handles edge cases gracefully.</p>
    <div>
      <h2>End-to-end encryption for video conferencing is different than for text messaging</h2>
      <a href="#end-to-end-encryption-for-video-conferencing-is-different-than-for-text-messaging">
        
      </a>
    </div>
    <p>End-to-end encryption describes a secure communication channel whereby only the intended participants can read, see, or listen to the contents of the conversation, not anybody else. WhatsApp and iMessage, for example, are end-to-end-encrypted, which means that the companies that operate those apps or any other infrastructure can’t see the contents of your messages. </p><p>Whereas encrypted group chats are usually long-lived, highly asynchronous, and low bandwidth sessions, video and audio calls are short-lived, highly synchronous, and require high bandwidth. This difference comes with plenty of interesting tradeoffs, which influenced the design of our system.</p><p>We had to consider how factors like the ephemeral nature of calls, compared to the persistent nature of group text messages, also influenced the way we designed E2EE for Orange Meets. In chat messages, users must be able to decrypt messages sent to them while they were offline (e.g. while taking a flight). This is not a problem for real-time communication.</p><p>The bandwidth limitations around audio/video communication and the use of an SFU prevented us from using some of the E2EE technologies already available for text messages. Apple’s iMessage, for example, encrypts a message N-1 times for an N-user group chat. We can't encrypt the video for each recipient, as that could saturate the upload capacity of Internet connections as well as slow down the client. Media has to be encrypted once and decrypted by each client while preserving secrecy around only the current participants of the call.</p>
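<p>To make that bandwidth trade-off concrete, here is a toy model (ours, not taken from any real client): with pairwise encryption a sender uploads one encrypted copy per recipient, while with encrypt-once plus an SFU it uploads a single copy regardless of call size.</p>

```rust
// Toy model of a sender's upload load in an n-person call.
// Pairwise (iMessage-style): one encrypted copy per recipient, n - 1 total.
fn pairwise_upload_copies(participants: u32) -> u32 {
    participants.saturating_sub(1)
}

// Encrypt-once + SFU: a single copy; the SFU fans it out to everyone else.
fn sfu_upload_copies(_participants: u32) -> u32 {
    1
}
```

<p>In an eight-person call, the pairwise sender uploads seven copies of every frame; the SFU sender uploads one.</p>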
    <div>
      <h2>Messaging Layer Security (MLS)</h2>
      <a href="#messaging-layer-security-mls">
        
      </a>
    </div>
    <p>Around the same time we were working on Orange Meets, we saw a lot of excitement around new apps being built with <a href="https://messaginglayersecurity.rocks/"><u>Messaging Layer Security</u></a> (MLS), an IETF-standardized protocol that describes how you can do a group key exchange in order to establish end-to-end-encryption for group communication. </p><p>Previously, the only way to achieve these properties was to essentially run your own fork of the <a href="https://signal.org/docs/"><u>Signal protocol</u></a>, which itself is more of a living protocol than a solidified standard. Since MLS is standardized, we’ve now seen multiple high-quality implementations appear, and we’re able to use them to achieve Signal-level security with far less effort.</p><p>Implementing MLS here wasn’t easy: it required a moderate amount of client modification, and the development and verification of an encrypted room-joining protocol. Nonetheless, we’re excited to be pioneering a standards-based approach that any customer can run on our network, and to share more details about how our implementation works. </p><p>We did not have to make any changes to the SFU to get end-to-end encryption working. Cloudflare’s SFU doesn’t care about the contents of the data forwarded on our data plane and whether it’s encrypted or not.</p>
    <div>
      <h2>Orange Meets: the basics </h2>
      <a href="#orange-meets-the-basics">
        
      </a>
    </div>
    <p>Orange Meets is a video calling application built on <a href="https://workers.cloudflare.com/"><u>Cloudflare Workers</u></a> that uses the <a href="https://developers.cloudflare.com/realtime/calls-vs-sfus/"><u>Cloudflare Realtime SFU service</u></a> as the data plane. The roles played by the three main entities in the application are as follows:</p><ul><li><p>The <i>user</i> is a participant in the video call. They connect to the Orange Meets server and SFU, described below.</p></li><li><p>The <i>Orange Meets Server </i>is a simple service run on a Cloudflare Worker that runs the small-scale coordination logic of Orange Meets, which is concerned with which user is in which video call — called a <i>room </i>— and what the state of the room is. Whenever something in the room changes, like a participant joining or leaving, or someone muting themselves, the app server broadcasts the change to all room participants. You can use any backend server for this component; we chose Cloudflare Workers for its convenience.</p></li><li><p>Cloudflare Realtime <i>Selective Forwarding Unit</i> (SFU) is a service that Cloudflare runs, which takes everyone’s audio and video and broadcasts it to everyone else. These connections use UDP and are therefore potentially lossy. This is deliberate: a video frame dropped five seconds ago is not very important in the context of a live call, and so should not be re-sent, as it would be in a TCP connection.</p></li></ul>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/61htaksggj580PqX02XoVB/3b0f1ed34ee681e41b2009257fdc8525/image2.png" />
          </figure><p><sup><i>The network topology of Orange Meets</i></sup></p><p>Next, we have to define what we mean by end-to-end encryption in the context of video chat.</p>
    <div>
      <h2>End-to-end encrypting Orange Meets </h2>
      <a href="#end-to-end-encrypting-orange-meets">
        
      </a>
    </div>
    <p>The most immediate way to end-to-end encrypt Orange Meets is to simply have the initial users agree on a symmetric encryption/decryption key at the beginning of a call, and just encrypt every video frame using that key. This is sufficient to hide calls from Cloudflare’s SFU. Some source-encrypted video conferencing implementations, such as <a href="https://jitsi.org/e2ee-in-jitsi/"><u>Jitsi Meet</u></a>, work this way.</p><p>The issue, however, is that kicking a malicious user from a call does not invalidate their key, since the keys are negotiated just once. A joining user learns the key that was used to encrypt video from before they joined. These failures are more formally referred to as failures of <i>post-compromise security</i> and <i>perfect forward secrecy</i>. When a protocol successfully implements these in a group setting, we call the protocol a <b>continuous group key agreement protocol</b>.</p><p>Fortunately for us, MLS is a continuous group key agreement protocol that works out of the box, and the nice folks at <a href="https://phnx.im/"><u>Phoenix R&amp;D</u></a> and <a href="https://cryspen.com/"><u>Cryspen</u></a> have a well-documented <a href="https://github.com/openmls/openmls/tree/main"><u>open-source Rust implementation</u></a> of most of the MLS protocol. </p><p>All we needed to do was write an MLS client and compile it to WASM, so we could decrypt video streams in-browser. We’re using WASM since that’s one way of running Rust code in the browser. If you’re running a video conferencing application on a desktop or mobile native environment, there are other MLS implementations in your preferred programming language.</p><p>Our setup for encryption is as follows:</p><p><b>Make a web worker for encryption.</b> We wrote a web worker in Rust that accepts a WebRTC video stream, broken into individual frames, and encrypts each frame. This code is quite simple, as it’s just an MLS encryption:</p>
            <pre><code>// Encrypt one outgoing frame as an MLS application message
group.create_message(
    &amp;self.mls_provider,
    self.my_signing_keys.as_ref()?,
    frame,
)</code></pre>
            <p><b>Postprocess outgoing audio/video.</b> We take our normal stream and, using some newer features of the <a href="https://developer.mozilla.org/en-US/docs/Web/API/WebRTC_API"><u>WebRTC API</u></a>, add a transform step to it. This transform step simply sends the stream to the worker:</p>
            <pre><code>const senderStreams = sender.createEncodedStreams()
const { readable, writable } = senderStreams
this.worker.postMessage(
    {
        type: 'encryptStream',
        in: readable,
        out: writable,
    },
    [readable, writable]
)</code></pre>
            <p>And the same for decryption:</p>
            <pre><code>const receiverStreams = receiver.createEncodedStreams()
const { readable, writable } = receiverStreams
this.worker.postMessage(
    {
        type: 'decryptStream',
        in: readable,
        out: writable,
    },
    [readable, writable]
)</code></pre>
            <p>Once we do this for both audio and video streams, we’re done.</p>
    <div>
      <h2>Handling different codec behaviors</h2>
      <a href="#handling-different-codec-behaviors">
        
      </a>
    </div>
    <p>The streams are now encrypted before sending and decrypted before rendering, but the browser doesn’t know this. To the browser, the stream is still an ordinary video or audio stream. This can cause errors to occur in the browser’s depacketizing logic, which expects to see certain bytes in certain places, depending on the codec. This results in some extremely cypherpunk artifacts every dozen seconds or so:</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/72baLJkLPZPdbjHjGVxSU5/2ea34b02826aacc2b23086b463a4938f/image3.png" />
          </figure><p>Fortunately, this exact issue was discovered by engineers at Discord, who handily documented it in their <a href="https://github.com/discord/dave-protocol/blob/main/protocol.md"><u>DAVE</u></a> E2EE videocalling protocol. For the VP8 codec, which we use by default, the solution is simple: split off the first 1–10 bytes of each packet, and send them unencrypted:</p>
            <pre><code>fn split_vp8_header(frame: &amp;[u8]) -&gt; Option&lt;(&amp;[u8], &amp;[u8])&gt; {
    // If this is a keyframe, keep 10 bytes unencrypted. Otherwise, 1 is enough.
    // Use first() so an empty frame yields None instead of panicking.
    let is_keyframe = *frame.first()? &gt;&gt; 7 == 0;
    let unencrypted_prefix_size = if is_keyframe { 10 } else { 1 };
    frame.split_at_checked(unencrypted_prefix_size)
}</code></pre>
            <p>These bytes are not particularly important to encrypt, since they only contain versioning info, whether or not this frame is a keyframe, some constants, and the width and height of the video.</p><p>And that’s truly it for the stream encryption part! The only thing remaining is to figure out how we will let new users join a room.</p>
    <div>
      <h2>“Join my Orange Meet” </h2>
      <a href="#join-my-orange-meet">
        
      </a>
    </div>
    <p>Usually, the only way to join the call is to click a link. And since the protocol is encrypted, a joining user needs to have some cryptographic information in order to decrypt any messages. How do they receive this information, though? There are a few options.</p><p>DAVE does it by using an MLS feature called <i>external proposals</i>. In short, the Discord server registers itself as an <i>external sender</i>, i.e., a party that can send administrative messages to the group, but cannot receive any. When a user wants to join a room, they provide their own cryptographic material, called a <i>key package</i>, and the server constructs and sends an MLS <a href="https://www.rfc-editor.org/rfc/rfc9420.html#section-12.1.8"><u>External Add message</u></a> to the group to let them know about the new user joining. Eventually, a group member will <i>commit</i> this External Add, sending the joiner a <i>Welcome</i> message containing all information necessary to send and receive video.
</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1gQm3r3Bai8Rks4M82JuSh/87ff851a12505f5c17c241e3f1eade6a/image4.png" />
          </figure><p><sup><i>A user joining a group via MLS external proposals. Recall the Orange Meets app server functions as a broadcast channel for the whole group. We consider a group of 3 members. We write member #2 as the one committing to the proposal, but this can be done by any member. Member #2 also sends a Commit message to the other members, but we omit this for space.</i></sup><sup>  </sup></p><p>This is a perfectly viable way to implement room joining, but implementing it would require us to extend the Orange Meets server logic to have some concept of MLS. Since part of our goal is to keep things as simple as possible, we would like to do all our cryptography client-side.</p><p>So instead we do what we call the <i>designated committer</i> algorithm. When a user joins a group, they send their cryptographic material to one group member, the <i>designated committer</i>, who then constructs and sends the Add message to the rest of the group. Similarly, when notified of a user’s exit, the designated committer constructs and sends a Remove message to the rest of the group. With this setup, the server’s job remains nothing more than broadcasting messages! It’s quite simple too—the full implementation of the designated committer state machine comes out to <a href="https://github.com/cloudflare/orange/blob/66e80d6d9146e2aedd4668e581810c0ee6aeb4a0/rust-mls-worker/src/mls_ops.rs#L90-L446"><u>300 lines of Rust</u></a>, including the MLS boilerplate, and it’s about as efficient.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3k3U7kFcYTwY81XzSrggt8/c27945dec311f251493826542704d370/image1.png" />
          </figure><p><sup><i>A user joining a group via the designated committer algorithm.</i></sup></p><p>One cool property of the designated committer algorithm is that something like this isn’t possible in a text group chat setting, since any given user (in particular, the designated committer) may be offline for an arbitrary period of time. Our method works because it leverages the fact that video calls are an inherently synchronous medium.</p>
    <div>
      <h3>Verifying the Designated Committer Algorithm with TLA<sup>+</sup></h3>
      <a href="#verifying-the-designated-committer-algorithm-with-tla">
        
      </a>
    </div>
    <p>The designated committer algorithm is a pretty neat simplification, but it comes with some non-trivial edge cases that we need to make sure we handle, such as:</p><ul><li><p><i>How do we make sure there is only one designated committer at a time?</i> The designated committer is the alive user with the smallest index in the MLS group state, which all users share.</p></li><li><p><i>What happens if the designated committer exits?</i> Then the next user will take its place. Every user keeps track of pending Adds and Removes, so it can continue where the previous designated committer left off.</p></li><li><p><i>If a user has not caught up to all messages, could they think they’re the designated committer?</i> No, they have to believe first that all prior eligible designated committers are disconnected.</p></li></ul><p>To make extra sure that this algorithm was correct, we formally modeled it and put it through the <a href="https://lamport.azurewebsites.net/tla/high-level-view.html"><u>TLA</u><u><sup>+</sup></u></a> model checker. To our surprise, it caught some low-level bugs! In particular, it found that, if the designated committer dies while adding a user, the protocol does not recover. We fixed these by breaking up MLS operations and enforcing a strict ordering on messages locally (e.g., a Welcome is always sent before its corresponding Add).</p><p>You can find an explainer, lessons learned, and the full <a href="https://learntla.com/core/index.html"><u>PlusCal</u></a> program (a high-level language that compiles to TLA<sup>+</sup>) <a href="https://github.com/cloudflareresearch/orange-e2ee-model-check"><u>here</u></a>. The caveat, as with any use of a bounded model checker, is that the checking is, well, bounded. We verified that no invalid protocol states are possible in a group of up to five users. We think this is good evidence that the protocol is correct for an arbitrary number of users. 
Because there are only two distinct roles in the protocol (designated committer and other group member), any weird behavior ought to be reproducible with two or three users, max.</p>
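<p>The selection rule from the first bullet above is simple enough to sketch directly. This is a toy model, not the actual Orange Meets worker code linked earlier:</p>

```rust
// Toy sketch of designated-committer selection: the designated committer is
// the alive member with the smallest index in the shared MLS group state.
// Members are modeled here as (index, is_alive) pairs.
fn designated_committer(members: &[(u32, bool)]) -> Option<u32> {
    members
        .iter()
        .filter(|&&(_, alive)| alive)
        .map(|&(index, _)| index)
        .min()
}
```

<p>If members 0 and 2 disconnect, member 1 takes over; if everyone is gone, there is no committer, which is fine because there is no one left to add or remove.</p>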
    <div>
      <h2>Preventing Monster-in-the-Middle attacks</h2>
      <a href="#preventing-monster-in-the-middle-attacks">
        
      </a>
    </div>
    <p>One important concern to address in any end-to-end encryption setup is how to prevent the service provider from replacing users’ key packages with their own. If the Orange Meets app server did this, and colluded with a malicious SFU to decrypt and re-encrypt video frames on the fly, then the SFU could see all the video sent through the network, and nobody would know.</p><p>To resolve this, like DAVE, we include a <i>safety number</i> in the corner of the screen for all calls. This number uniquely represents the cryptographic state of the group. If you check out-of-band (e.g., in a Signal group chat) that everyone agrees on the safety number, then you can be sure nobody’s key material has been secretly replaced.</p><p>In fact, you could also read the safety number aloud in the video call itself, but doing this is not provably secure. Reading a safety number aloud is an <i>in-band verification</i> mechanism, i.e., one where a party authenticates a channel within that channel. If a malicious app server colluding with a malicious SFU were able to construct believable video and audio of the user reading the safety number aloud, it could bypass this safety mechanism. So if your threat model includes adversaries that are able to break into a Worker and Cloudflare’s SFU, and simultaneously generate real-time deep-fakes, you should use out-of-band verification 😄.</p>
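<p>To illustrate the idea, and only the idea, here is a toy rendering of some group-state digest (for example, an MLS epoch authenticator) as Signal-style five-digit groups. The real Orange Meets derivation may differ; the point is that agreeing on the number implies agreeing on the underlying cryptographic state.</p>

```rust
// Toy illustration only; NOT the actual Orange Meets safety-number
// derivation. Render the leading bytes of a group-state digest as short
// decimal groups that are practical to compare out of band.
fn render_safety_number(digest: &[u8]) -> String {
    digest
        .chunks(2)
        .take(6) // keep it short enough to compare by hand
        .map(|c| {
            let hi = c[0] as u32;
            let lo = *c.get(1).unwrap_or(&0) as u32;
            format!("{:05}", (hi * 256 + lo) % 100_000)
        })
        .collect::<Vec<_>>()
        .join(" ")
}
```

<p>Any change to the group's key material changes the digest, and with overwhelming probability the displayed number, so mismatched numbers reveal a substituted key package.</p>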
    <div>
      <h2>Future work</h2>
      <a href="#future-work">
        
      </a>
    </div>
    <p>There are some areas we could improve on:</p><ul><li><p>There is another attack vector for a malicious app server: it is possible to simply serve users malicious JavaScript. This problem, more generally called the <a href="https://web.archive.org/web/20200731144044/https://www.nccgroup.com/us/about-us/newsroom-and-events/blog/2011/august/javascript-cryptography-considered-harmful/"><u>JavaScript Cryptography Problem</u></a>, affects any in-browser application where the client wants to hide data from the server. Fortunately, we are working on a standard to address this, called <a href="https://github.com/beurdouche/explainers/blob/main/waict-explainer.md"><u>Web Application Manifest Consistency, Integrity, and Transparency</u></a>. In short, like our <a href="https://blog.cloudflare.com/key-transparency/"><u>Code Verify</u></a> solution for WhatsApp, this would allow every website to commit to the JavaScript it serves, and have a third party create an auditable log of the code. With transparency, malicious JavaScript can still be distributed, but at least now there is a log that records the code.</p></li><li><p>We can make out-of-band authentication easier by placing trust in an identity provider. Using <a href="https://www.bastionzero.com/openpubkey"><u>OpenPubkey</u></a>, it would be possible for a user to get the identity provider to sign their cryptographic material, and then present that. Then all the users would check the signature before using the material. Transparency would also help here to ensure no signatures were made in secret.</p></li></ul>
    <div>
      <h2>Conclusion</h2>
      <a href="#conclusion">
        
      </a>
    </div>
    <p>We built end-to-end encryption into the Orange Meets video chat app without a lot of engineering time, and by modifying just the client code. To do so, we built a WASM <a href="https://github.com/cloudflare/orange/blob/e2ee/rust-mls-worker"><u>service worker</u></a> (compiled from Rust) that sets up an <a href="https://www.rfc-editor.org/rfc/rfc9420.html"><u>MLS</u></a> group and handles stream encryption and decryption. We also designed a new group-joining protocol, the <i>designated committer algorithm</i>, and <a href="https://github.com/cloudflareresearch/orange-e2ee-model-check"><u>formally modeled it in TLA</u><u><sup>+</sup></u></a>. We left comments throughout the code marking optimizations that remain to be done, so please send us a PR if you’re so inclined!</p><p>Try using Orange Meets with E2EE enabled at <a href="https://e2ee.orange.cloudflare.dev/"><u>e2ee.orange.cloudflare.dev</u></a>, or deploy your own instance using the <a href="https://github.com/cloudflare/orange"><u>open source repository</u></a> on GitHub.</p> ]]></content:encoded>
            <category><![CDATA[Research]]></category>
            <category><![CDATA[Privacy]]></category>
            <category><![CDATA[Encryption]]></category>
            <category><![CDATA[Video]]></category>
            <category><![CDATA[Cloudflare Realtime]]></category>
            <guid isPermaLink="false">6X6FQzpKaqVyTLVk7rw6xm</guid>
            <dc:creator>Michael Rosenberg</dc:creator>
            <dc:creator>Kevin Kipp</dc:creator>
            <dc:creator>Renan Dincer</dc:creator>
            <dc:creator>Felipe Astroza Araya</dc:creator>
            <dc:creator>Mari Galicer</dc:creator>
        </item>
        <item>
            <title><![CDATA[Dynamically optimize, clip, and resize video from any origin with Media Transformations]]></title>
            <link>https://blog.cloudflare.com/media-transformations-for-video-open-beta/</link>
            <pubDate>Fri, 07 Mar 2025 14:00:00 GMT</pubDate>
            <description><![CDATA[ With Cloudflare Stream’s new Media Transformations, content owners can resize, crop, clip, and optimize short-form video, all without migrating storage.  ]]></description>
            <content:encoded><![CDATA[ <p>Today, we are thrilled to announce Media Transformations, a new service that brings the magic of <a href="https://developers.cloudflare.com/images/transform-images/"><u>Image Transformations</u></a> to short-form video files wherever they are stored.</p><p>Since 2018, Cloudflare Stream has offered a managed video pipeline that empowers customers to serve rich video experiences at global scale easily, in multiple formats and quality levels. Sometimes, the greatest friction to getting started isn't even about video, but rather the thought of migrating all those files. Customers want a simpler solution that retains their current storage strategy to deliver small, optimized MP4 files. Now you can do that with Media Transformations.</p>
    <div>
      <h3>Short videos, big volume</h3>
      <a href="#short-videos-big-volume">
        
      </a>
    </div>
    <p>For customers with a huge volume of short video, such as generative AI output, e-commerce product videos, social media clips, or short marketing content, uploading those assets to Stream is not always practical. Furthermore, Stream’s key features like adaptive bitrate encoding and HLS packaging offer diminishing returns on short content or small files.</p><p>Instead, content like this should be fetched from our customers' existing storage like R2 or S3 directly, optimized by Cloudflare quickly, and delivered efficiently as small MP4 files. Cloudflare Images customers reading this will note that this sounds just like their existing Image Transformation workflows. Starting today, the same workflow can be applied to your short-form videos.</p>
    <div>
      <h3>What’s in a video?</h3>
      <a href="#whats-in-a-video">
        
      </a>
    </div>
    <p>The distinction between video and images online can sometimes be blurry. Consider an animated GIF: is that an image or a video? (They're usually smaller as MP4s anyway!) As a practical example, consider a selection of product images for a new jacket on an e-commerce site. You want a consumer to know how it looks, but also how it flows. So perhaps the first "image" in that carousel is actually a video of a model simply putting the jacket on. Media Transformations empowers customers to optimize the product video and images with similar tools and identical infrastructure.</p>
    <div>
      <h3>How to get started</h3>
      <a href="#how-to-get-started">
        
      </a>
    </div>
    <p>Any website that is already enabled for Image Transformations is now enabled for Media Transformations. To enable a new zone, navigate to “Transformations” under Stream (or Images), locate your zone in the list, and click Enable. Enabling and disabling a zone for transformations affects both Images and Media transformations.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5hltjlyKF43oV8gTvjr9vF/d904229983fbe9484b08763e22dcac8b/image3.png" />
          </figure><p>After enabling Media Transformations on a website, it is simple to construct a URL that transforms a video. The pattern is similar to Image Transformations, but uses the <code>media</code> endpoint instead of the <code>image</code> endpoint:</p>
            <pre><code>https://example.com/cdn-cgi/media/&lt;OPTIONS&gt;/&lt;SOURCE-VIDEO&gt;</code></pre>
            <p>The <code>&lt;OPTIONS&gt;</code> portion of the URL is a comma-separated <a href="https://developers.cloudflare.com/stream/transform-videos/"><u>list of flags</u></a> written as <code>key=value</code>. A few noteworthy flags:</p><ul><li><p><code>mode</code> can be <code>video</code> (the default) to output a video, <code>frame</code> to pull a still image of a single frame, or even <code>spritesheet</code> to generate an image with multiple frames, which is useful for seek previews or storyboarding.</p></li><li><p><code>time</code> specifies the exact start time in the input video at which to extract a frame or start a clip.</p></li><li><p><code>duration</code> specifies the length of an output video, to make a clip shorter than the original.</p></li><li><p><code>fit</code>, together with <code>height</code> and <code>width</code>, allows resizing and cropping the output video or frame.</p></li><li><p>Setting <code>audio</code> to <code>false</code> removes the sound from the output video.</p></li></ul><p>The <code>&lt;SOURCE-VIDEO&gt;</code> is a full URL to a source file, or a root-relative path if the origin is on the same zone as the transformation request.</p><p>A full list of supported options, examples, and troubleshooting information is <a href="https://developers.cloudflare.com/stream/transform-videos/"><u>available in DevDocs</u></a>.</p>
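<p>Assembling such a URL is pure string concatenation; this hypothetical helper (ours, not part of any Cloudflare SDK) shows the pattern:</p>

```rust
// Hypothetical helper: assemble a Media Transformations URL from key=value
// options and a source video URL, following the documented pattern
// https://<zone>/cdn-cgi/media/<OPTIONS>/<SOURCE-VIDEO>.
fn media_transform_url(zone: &str, options: &[(&str, &str)], source: &str) -> String {
    let opts: Vec<String> = options
        .iter()
        .map(|(key, value)| format!("{key}={value}"))
        .collect();
    format!("https://{zone}/cdn-cgi/media/{}/{source}", opts.join(","))
}
```

<p>Calling it with <code>("width", "760")</code> and the sample R2 source reproduces the example URL used later in this post.</p>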
    <div>
      <h3>A few examples</h3>
      <a href="#a-few-examples">
        
      </a>
    </div>
    <p>I used my phone to take this video of the <a href="https://blog.cloudflare.com/harnessing-office-chaos/"><u>randomness mobile</u></a> in Cloudflare’s Austin Office and put it in an R2 bucket. Of course, it is possible to embed the original video file from R2 directly:</p>  


<p>That video file is almost 30 MB. A more efficient choice would be to resize the video to the width of this blog post template. Let’s apply a width adjustment in the options portion of the URL:</p>
            <pre><code>https://example.com/cdn-cgi/media/width=760/https://pub-d9fcbc1abcd244c1821f38b99017347f.r2.dev/aus-mobile.mp4</code></pre>
            <p>That will deliver the same video, resized and optimized:</p>


<p>Not only is this video the right size for its container, it’s now less than 4 MB. That’s a big bandwidth savings for visitors.</p><p>As I recorded the video, the lobby was pretty quiet, but there was someone talking in the distance. If we wanted to use this video as a background, we should remove the audio, shorten it, and perhaps crop it vertically. All of these options can be combined, comma-separated, in the options portion of the URL:</p>
            <pre><code>https://example.com/cdn-cgi/media/mode=video,duration=10s,width=480,height=720,fit=cover,audio=false/https://pub-d9fcbc1abcd244c1821f38b99017347f.r2.dev/aus-mobile.mp4</code></pre>
            <p>The result:</p>


<p>If this were a product video, we might want a small thumbnail to add to the carousel of images so shoppers can click to zoom in and see it move. Use the “frame” mode and a “time” to generate a static image from a single point in the video. The same size and fit options apply:</p>
            <pre><code>https://example.com/cdn-cgi/media/mode=frame,time=3s,width=120,height=120,fit=cover/https://pub-d9fcbc1abcd244c1821f38b99017347f.r2.dev/aus-mobile.mp4</code></pre>
            <p>Which generates this optimized image:</p> 
<img src="https://blog.cloudflare.com/cdn-cgi/media/mode=frame,time=3s,width=120,height=120,fit=cover/https://pub-d9fcbc1abcd244c1821f38b99017347f.r2.dev/aus-mobile.mp4" /><p>Try it out yourself using our video or one of your own: </p><ul><li><p>Enable transformations on your website/zone and use the endpoint: <code>https://[your-site]/cdn-cgi/media/</code></p></li><li><p>Mobile video: <a href="https://pub-d9fcbc1abcd244c1821f38b99017347f.r2.dev/aus-mobile.mp4"><u>https://pub-d9fcbc1abcd244c1821f38b99017347f.r2.dev/aus-mobile.mp4</u></a> </p></li><li><p>Check out the <a href="https://stream-video-transformer.kristianfreeman.com/"><u>Media Transformation URL Generator</u></a> from Kristian Freeman on our Developer Relations team, which he built using the <a href="https://streamlit.io/"><u>Streamlit</u></a> Python framework on Workers.</p></li></ul>
    <div>
      <h3>Input Limits</h3>
      <a href="#input-limits">
        
      </a>
    </div>
    <p>We are eager to start supporting real customer content, and we will right-size our input limitations with our early adopters. To start:</p><ul><li><p>Video files must be smaller than 40 megabytes.</p></li><li><p>Files must be MP4s and should be H.264 encoded.</p></li><li><p>Videos and images generated with Media Transformations will be cached. However, in our initial beta, the original content will not be cached, which means regenerating a variant will result in a request to the origin.</p></li></ul>
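<p>A sketch of what those beta limits mean for request validation. The function, and the assumption that 40 megabytes means 40,000,000 bytes, are ours, not Cloudflare's implementation:</p>

```rust
// Illustrative check mirroring the beta input limits listed above.
// The exact byte cutoff and codec labels here are our assumptions.
fn input_within_beta_limits(size_bytes: u64, container: &str, video_codec: &str) -> bool {
    size_bytes < 40_000_000      // files must be smaller than 40 megabytes
        && container == "mp4"    // files must be MP4s
        && video_codec == "h264" // and should be H.264 encoded
}
```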
    <div>
      <h3>How it works</h3>
      <a href="#how-it-works">
        
      </a>
    </div>
    <p>Unlike Stream, Media Transformations receives requests on a customer’s own website. Internally, however, these requests are passed to the same <a href="https://blog.cloudflare.com/behind-the-scenes-with-stream-live-cloudflares-live-streaming-service/"><u>On-the-Fly Encoder (“OTFE”) platform that Stream Live uses</u></a>. To achieve this, the Stream team built modules that run on our servers to act as entry points for these requests.</p><p>These entry points perform some initial validation on the URL formatting and flags before building a request to Stream’s own Delivery Worker, which in turn calls OTFE’s set of transformation handlers. The original asset is fetched from the <i>customer’s</i> origin, validated for size and type, and passed to the same OTFE methods responsible for manipulating and optimizing <a href="https://developers.cloudflare.com/stream/viewing-videos/displaying-thumbnails/"><u>video or still frame thumbnails</u></a> for videos uploaded to Stream. These tools do a final inspection of the media type and encoding for compatibility, then generate the requested variant. If any errors were raised along the way, an HTTP error response will be generated using <a href="https://developers.cloudflare.com/images/reference/troubleshooting/#error-responses-from-resizing"><u>similar error codes</u></a> to Image Transformations. When successful, the result is cached for future use and delivered to the requestor as a single file. Even for new or uncached requests, all of this operates much faster than the video’s play time.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7wfYn8FLcgzgIdLT6NFeq3/f6c51134363231ffed964300cb9992b0/flowchart.png" />
          </figure>
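<p>In pseudocode terms, the flow above boils down to: check the variant cache, validate the option flags, fetch and transform the origin asset, then cache the result. A much-simplified, hypothetical sketch (every function name here is a stand-in, not Cloudflare's actual code):</p>

```python
# Hypothetical sketch of the request flow described above. fetch_origin and
# transform are stubs standing in for the origin fetch and the OTFE handlers.

def fetch_origin(url: str) -> str:
    return f"asset({url})"          # stand-in for fetching from the customer origin

def transform(asset: str, flags: dict) -> str:
    return f"{flags['mode']}-variant-of-{asset}"  # stand-in for OTFE's work

def handle_request(options: str, origin_url: str, cache: dict) -> str:
    key = f"{options}/{origin_url}"
    if key in cache:                # cached variants are served directly
        return cache[key]
    flags = dict(kv.split("=", 1) for kv in options.split(","))
    if flags.get("mode") not in ("video", "frame"):
        raise ValueError("400: unsupported mode")
    asset = fetch_origin(origin_url)    # origin is not cached during the beta
    variant = transform(asset, flags)
    cache[key] = variant                # the generated variant is cached
    return variant

print(handle_request("mode=frame,time=3s", "https://example.com/v.mp4", {}))
```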
    <div>
      <h3>What it costs</h3>
      <a href="#what-it-costs">
        
      </a>
    </div>
    <p>Media Transformations will be free for all customers while in beta. We expect the beta period to extend into Q3 2025, and after that, Media Transformations will use the same subscriptions and billing mechanics as Image Transformations — including a free allocation for all websites/zones. Generating a still frame (single image) from a video counts as 1 transformation. Generating an optimized video is billed as 1 transformation <i>per second of the output video.</i> Each unique transformation is only billed once per month. All Media and Image Transformations cost $0.50 per 1,000 monthly unique transformation operations, with a free monthly allocation of 5,000.</p><p>Using this post as an example, recall the two transformed videos and one transformed image above — the big original doesn’t count because it wasn’t transformed. The first video (showing blog post width) was 15 seconds of output. The second video (silent vertical clip) was 10 seconds of output. The preview square is a still frame. These three operations would count as 26 transformations — and they would only bill once per month, regardless of how many visitors this page receives.</p>
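<p>The billing arithmetic above is easy to reproduce. A quick sketch under the stated pricing ($0.50 per 1,000 unique transformations, 5,000 free per month):</p>

```python
# Reproducing the billing example above: a video bills 1 transformation per
# second of output, a still frame bills 1, and each unique transformation
# counts once per month regardless of how often it is served.

RATE_PER_1000 = 0.50      # $0.50 per 1,000 unique transformations
FREE_ALLOCATION = 5000    # free monthly allocation

def monthly_cost(unique_transformations: int) -> float:
    billable = max(0, unique_transformations - FREE_ALLOCATION)
    return billable / 1000 * RATE_PER_1000

# The three examples from this post: 15 s video + 10 s video + 1 still frame.
used = 15 + 10 + 1
print(used)                # 26 transformations
print(monthly_cost(used))  # 0.0 -- well inside the free allocation
```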
    <div>
      <h3>Looking ahead</h3>
      <a href="#looking-ahead">
        
      </a>
    </div>
    <p>Our short-term focus will be on right-sizing input limits based on real customer usage as well as adding a caching layer for origin fetches to reduce any egress fees our customers may be facing from other storage providers. Looking further, we intend to streamline Images and Media Transformations to further simplify the developer experience, unify the features, and streamline enablement: Cloudflare’s Media Transformations will optimize your images and video, quickly and easily, wherever you need them.</p><p>Try it for yourself today using our sample asset above, or get started by enabling Transformations on a zone in your account and uploading a short file to R2, both of which offer a free tier to get you going.</p> ]]></content:encoded>
            <category><![CDATA[Cloudflare Media Platform]]></category>
            <category><![CDATA[Cloudflare Stream]]></category>
            <category><![CDATA[Performance]]></category>
            <category><![CDATA[Video]]></category>
            <guid isPermaLink="false">2KCsgqrpHOVpCClqBBPnYM</guid>
            <dc:creator>Taylor Smith</dc:creator>
            <dc:creator>Mickie Betz</dc:creator>
            <dc:creator>Ben Krebsbach</dc:creator>
        </item>
        <item>
            <title><![CDATA[Introducing high-definition portrait video support for Cloudflare Stream]]></title>
            <link>https://blog.cloudflare.com/introducing-high-definition-portrait-video-support-for-cloudflare-stream/</link>
            <pubDate>Fri, 16 Aug 2024 14:00:00 GMT</pubDate>
            <description><![CDATA[ Cloudflare Stream is an end-to-end solution for video encoding, storage, delivery, and playback. Newly uploaded or ingested portrait videos will now automatically be processed in full HD quality ]]></description>
            <content:encoded><![CDATA[ 
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5P5jObj3tz5QPBuitlTVd4/b620f993a510e7a35adebc58f8819afc/2492-1-Hero.png" />
          </figure><p>Cloudflare Stream is an end-to-end solution for video encoding, storage, delivery, and playback. Our focus has always been on simplifying all aspects of video for developers. This goal continues to motivate us as we introduce first-class portrait (vertical) video support today. Newly uploaded or ingested portrait videos will now automatically be processed in full HD quality.  </p>
    <div>
      <h2>Why portrait video</h2>
      <a href="#why-portrait-video">
        
      </a>
    </div>
    <p>In the past few years, the popularity of portrait video has exploded, driven by short-form video applications such as TikTok and YouTube Shorts. However, Cloudflare customers have been confused as to why their portrait videos appear to be lower quality when viewed on portrait-first devices such as smartphones. This is because our video encoding pipeline previously did not support high-quality portrait video, leaving those videos grainy and low quality. The introduction of high-definition portrait video addresses this pain point.</p>
    <div>
      <h2>The current stream pipeline</h2>
      <a href="#the-current-stream-pipeline">
        
      </a>
    </div>
    <p>When you <a href="https://developers.cloudflare.com/stream/uploading-videos/"><u>upload a video to Stream</u></a>, it is first encoded into several different “renditions” (sizes or resolutions) before delivery. This is done in order to enable playback in a wide variety of network conditions, as well as to standardize the way a video is experienced. By using these adaptive bitrate renditions, we are able to offer viewers the highest quality streaming experience that fits their network bandwidth: someone watching on a slow mobile connection would be served a 240p video (a resolution of 426x240 pixels), while they would receive the 1080p version (a resolution of 1920x1080 pixels) when watching at home on their fiber Internet connection. This encoding pipeline follows one of two different paths:</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7MaG68M8kXxHCmFZGQGypj/82aa200a20d3afe9e4c1f8ba1664906d/2492-2.png" />
          </figure><p>The first path is our video on-demand (VOD) encoding pipeline, which generates and stores a set of encoded video segments at each of our standard video resolutions. The other path is our on-the-fly encoding (OTFE) pipeline, which uses the same process as Stream Live to generate resolutions upon user request. Both pipelines work with the set of standard resolutions, which are identified through a constrained target (output) height. This means that we encode every rendition to heights of 240 pixels, 360 pixels, etc. up to 1080 pixels. </p><p>When originally conceived, this encoding pipeline was not designed with portrait video in mind because portrait video was less common. As a result, portrait videos were encoded with lower quality dimensions that were consistent with landscape video encoding. For example, a portrait HD video has the dimensions 1080x1920 — scaling that down to the 1080-pixel height of a landscape HD video results in the much smaller output of 608x1080. However, current smartphones all have HD displays, making the discrepancy clear when a portrait video is viewed in portrait mode on a phone. With this change to our encoding pipeline, all newly uploaded portrait videos are automatically encoded with a constrained target width, using a new set of standard resolutions for portrait video. These resolutions are the same as the current set of landscape resolutions, but with the dimensions reversed: 240x426 up to 1080x1920.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/L1mjs73r4H6VdRPtX3gzX/44d8dcc8eae3dd5d7de0ecd29222d2f5/2492-3.png" />
          </figure>
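<p>The constrained-dimension rule above can be sketched in a few lines: landscape renditions constrain height, portrait renditions constrain width, and the other dimension scales to preserve aspect ratio. This is our own illustration of the rule, not Stream's actual code:</p>

```python
# Illustrative constrained-target scaling: the constraining dimension is
# fixed to a standard size and the other dimension follows the aspect
# ratio, rounded to an even value as encoders typically require.

STANDARD_SIZES = [240, 360, 480, 720, 1080]   # constrained-dimension targets

def rendition_dims(src_w: int, src_h: int, target: int) -> tuple[int, int]:
    if src_w >= src_h:  # landscape: constrain height
        w = round(src_w * target / src_h / 2) * 2
        return (w, target)
    else:               # portrait: constrain width
        h = round(src_h * target / src_w / 2) * 2
        return (target, h)

print(rendition_dims(1920, 1080, 1080))   # landscape HD -> (1920, 1080)
print(rendition_dims(1080, 1920, 1080))   # portrait HD  -> (1080, 1920)
print(rendition_dims(1080, 1920, 240))    # smallest portrait -> (240, 426)
```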
    <div>
      <h2>Technical details</h2>
      <a href="#technical-details">
        
      </a>
    </div>
    <p>As the Stream intern this summer, I was tasked with this project, with the expectation of shipping a long-requested change by the end of my internship. The first step was to familiarize myself with the complex architecture of Stream’s internal systems. After this, I began weighing a few different implementation decisions, like how to consistently track orientation through the various stages of the pipeline. Following a group discussion to decide which choices would be the most scalable, least complex, and best for users, it was time to write the technical specification.</p><p>Due to the implementation method we chose, making this change involved tracing the life of a video from upload to delivery through both of our encoding pipelines and applying different logic for portrait videos. Previously, all video renditions were identified by their height at each stage of the pipeline, leaving certain parts of the pipeline completely agnostic to a video’s orientation. With the proposed changes, we would now identify a rendition by its constraining dimension and orientation together, so much of the work involved modifying the different portions of the pipeline to use these new parameters.</p><p>The first area to be modified was the Stream API service, which handles all Stream API calls. The API service enqueues the rendition encoding jobs for a video after it is uploaded, so it was necessary to introduce a new set of renditions designed for portrait videos and enqueue the corresponding encoding jobs. Queueing is handled by our in-house queue management system, which treats jobs generically and therefore did not require any changes.</p><p>Following this, I tackled the on-the-fly encoding pipeline. The area of interest here was the delivery portion of our pipeline, which generated the set of encoding resolutions to pass on to our on-the-fly encoder. 
Here I also introduced a new set of portrait renditions and the corresponding logic to encode them for portrait videos. This part of the backend is written and hosted on <a href="https://developers.cloudflare.com/workers/"><u>Cloudflare Workers</u></a>, which made it very easy and quick to deploy and test changes.   </p><p>Finally, we wanted to change how we presented these quality levels to users in the Stream built-in player and thought that using the new constrained dimension rather than always showing the height would feel more familiar. For portrait videos, we now display the size of the <i>constraining dimension</i>, which also means quality selection for portrait videos encoded under our old system now more accurately reflects their quality, too. As an example, a 9:16 portrait video would have been encoded to a maximum size of 608x1080 by the previous pipeline. Now, such a rendition will be marked as 608p rather than the full-quality 1080p, which would be a 1080x1920 rendition.</p><p>Stream as a whole is built on many of our own <a href="https://developers.cloudflare.com/"><u>Developer Platform</u></a> products, such as Workers for handling delivery, <a href="https://developers.cloudflare.com/r2/"><u>R2</u></a> for rendition storage, <a href="https://developers.cloudflare.com/workers-ai/"><u>Workers AI</u></a> for automatic captioning, and <a href="https://developers.cloudflare.com/durable-objects/"><u>Durable Objects</u></a> for bitrate observation, all of which enhance our ability to deploy and ship new updates quickly. Throughout my work on this project, I was able to see all of these pieces in action, as well as gain a new understanding of the powerful tools Cloudflare offers for developers. </p>
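<p>The new quality labels described above come down to labeling by the constraining dimension, which for both orientations is the smaller of the two. A tiny illustrative sketch (ours, not Stream's code):</p>

```python
# Quality labels now reflect the constraining dimension rather than always
# the height. For landscape that's the height; for portrait, the width --
# in both cases the smaller dimension.

def quality_label(width: int, height: int) -> str:
    return f"{min(width, height)}p"

print(quality_label(1920, 1080))  # landscape HD -> 1080p
print(quality_label(1080, 1920))  # portrait HD  -> 1080p
print(quality_label(608, 1080))   # old-pipeline portrait rendition -> 608p
```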
    <div>
      <h2>Results and findings</h2>
      <a href="#results-and-findings">
        
      </a>
    </div>
    <p>After the change, portrait videos are encoded to higher resolutions and are visibly higher quality. To confirm these differences, I analyzed the effect of the pipeline change on four different sample videos using the <a href="https://www.ni.com/en/shop/data-acquisition-and-control/add-ons-for-data-acquisition-and-control/what-is-vision-development-module/peak-signal-to-noise-ratio-as-an-image-quality-metric.html"><u>peak signal-to-noise ratio</u></a> (PSNR, a mathematical measure of image quality). Since the old pipeline produced lower resolution videos, the comparison here is between an upscaled version of the old pipeline’s rendition and the current pipeline’s rendition. In the graph below, higher values reflect higher quality relative to the unencoded original video.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2UFpZKOjtZNgnob3TroqEX/43c2c91d20858629035aceb33c409643/2492-4.png" />
          </figure><p>According to this metric, the pipeline changes increase quality by as much as 8%. However, the quality increase is most noticeable to the human eye in videos that feature fine detail or a lot of movement, which PSNR does not always capture. For example, compare a side-by-side of a frame from the book sample video encoded both ways:</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1cslwJdIonHtk5ChAhCOSi/fbdf72ad50c7a3f28b681a62417d523b/2492-5.png" />
          </figure><p>The difference between the old and new encodings is most clear when zoomed in:</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/6DpHar5bEKhKZi3zAVDtHa/ee90377bfce1262d97dcda4298031997/2492-6.png" />
          </figure><p>Here’s another example (sourced from <a href="https://mixkit.co/free-stock-video/winter-fashion-cold-looking-woman-concept-video-39874/"><u>Mixkit</u></a>):</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5rjQa5v4Dz9OVUave52hKe/4a4ad9344d89b8cdf64a315d7fd3a684/2492-7.png" />
          </figure><p>Magnified:</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/6grWclJIxuZXYzk74jXMfM/42fcd73d9d006dcd6041ca2a2904a718/2492-8.png" />
          </figure><p>Of course, watching these example clips is the clearest way to see:</p><ul><li><p>Book intro: <a href="https://customer-m033z5x00ks6nunl.cloudflarestream.com/6eb2aa73cedb7d647ce628296c49c3aa/iframe?preload=auto&amp;muted=true&amp;autoplay=true"><u>before</u></a> and <a href="https://customer-m033z5x00ks6nunl.cloudflarestream.com/5db8973a6f79dad1c8f4a5f78a9e6aa5/iframe?preload=auto&amp;muted=true&amp;autoplay=true"><u>after</u></a></p></li><li><p>Hair and makeup: <a href="https://customer-m033z5x00ks6nunl.cloudflarestream.com/7900b2ac82a89cf6eb1a1bf6679b8d86/iframe?preload=auto&amp;muted=true&amp;autoplay=true"><u>before</u></a> and <a href="https://customer-m033z5x00ks6nunl.cloudflarestream.com/4636618dd238816ae1a164d645081668/iframe?preload=auto&amp;muted=true&amp;autoplay=true"><u>after</u></a></p></li></ul><p>Maximize the Stream player and look at the quality selector (in the gear menu) to see the new quality level labels – select the highest option to compare. Note the improved sharpness of the text in the book sample as well as the improved detail in the hair and eye shadow of the hair and makeup sample.</p>
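<p>For the curious, the PSNR metric used in the comparison above is only a few lines of arithmetic: 10·log10(MAX²/MSE) over corresponding samples, where higher means closer to the original. A minimal pure-Python sketch over 8-bit samples:</p>

```python
import math

def psnr(original: list[int], encoded: list[int], max_val: int = 255) -> float:
    # mean squared error between corresponding samples
    mse = sum((a - b) ** 2 for a, b in zip(original, encoded)) / len(original)
    if mse == 0:
        return float("inf")   # identical frames
    return 10 * math.log10(max_val ** 2 / mse)

print(psnr([52, 55, 61, 66], [52, 57, 60, 68]))   # ~44.6 dB
```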
    <div>
      <h2>Implementation challenges</h2>
      <a href="#implementation-challenges">
        
      </a>
    </div>
    <p>Due to the complex nature of our encoding pipelines, there were several technical challenges to making a large change like this. Aside from simply uploading videos, many of the features we offer, like downloads or clipping, require tweaking to produce the correct video renditions. This involved modifying many parts of the encoding pipeline to ensure that portrait video logic was handled. </p><p>There were also some edge cases which were not caught until after release. One release of this feature contained a bug in the on-the-fly encoding logic which caused a subset of new portrait livestream renditions to have negative bitrates, making them unusable. This was due to an internal representation of video renditions’ constraining dimensions being mistakenly used for bitrate observation. We remedied this by increasing the scope of our end-to-end testing to include more portrait video tests and live recording interaction tests.</p><p>Another small bug caused downloading very small videos to sometimes fail. This was because for videos under 240p, our smallest encoding resolution, the non-constraining dimension was not being properly scaled to match the aspect ratio of the video, causing some specific combinations of dimensions to be interpreted as portrait when they should have been landscape, and vice versa. This bug was fixed quickly, but was not initially caught after the release since it required a very specific set of conditions to activate. After this experience, I added a few more unit tests involving small videos.</p>
    <div>
      <h2>That’s a wrap</h2>
      <a href="#thats-a-wrap">
        
      </a>
    </div>
    <p>As my internship comes to a close, reflecting on the experience makes me grateful for all the team members who have helped me throughout this time. I am very glad to have shipped this project which addresses a long-standing concern and will have real-world customer impact. Support for high-definition portrait video is now available, and we will continue to make improvements to our video solutions suite. You can see the difference yourself by <a href="https://dash.cloudflare.com/?to=/:account/stream"><u>uploading a portrait video to Stream</u></a>! Or, perhaps you’d like to help build a better Internet, too — our <a href="https://www.cloudflare.com/careers/early-talent/"><u>internship and early talent programs</u></a> are a great way to jumpstart your own journey.</p><p><i>Sample video acknowledgements: The sample video of the book was created by the Stream Product Manager and shows the opening page of </i>The Strange Wonder of Roots<i> by </i><a href="https://www.evangriffithbooks.com/"><i><u>Evan Griffith</u></i></a><i> (HarperCollins). The </i><a href="https://mixkit.co/free-stock-video/winter-fashion-cold-looking-woman-concept-video-39874/"><i><u>hair and makeup fashion video</u></i></a><i> was sourced from </i><a href="https://mixkit.co/"><i><u>Mixkit</u></i></a><i>, a great source of free media for video projects.</i></p> ]]></content:encoded>
            <category><![CDATA[Cloudflare Stream]]></category>
            <category><![CDATA[Developers]]></category>
            <category><![CDATA[Developer Platform]]></category>
            <category><![CDATA[Internship Experience]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Video]]></category>
            <guid isPermaLink="false">7Dgs9IOqQRU1Tj3UKzO8pV</guid>
            <dc:creator>Sasha Huang</dc:creator>
        </item>
        <item>
            <title><![CDATA[Cloudflare Stream Low-Latency HLS support now in Open Beta]]></title>
            <link>https://blog.cloudflare.com/cloudflare-stream-low-latency-hls-open-beta/</link>
            <pubDate>Mon, 25 Sep 2023 13:00:29 GMT</pubDate>
            <description><![CDATA[ Cloudflare Stream’s LL-HLS support enters open beta today. You can deliver video to your audience faster, reducing the latency a viewer may experience on their player to as little as 3 seconds ]]></description>
            <content:encoded><![CDATA[ <p></p><p>Stream Live lets users easily scale their live-streaming apps and websites to millions of creators and concurrent viewers while focusing on the content rather than the infrastructure — Stream manages codecs, protocols, and bit rate automatically.</p><p>For <a href="/recapping-speed-week-2023/">Speed Week</a> this year, we introduced a <a href="/low-latency-hls-support-for-cloudflare-stream/">closed beta of Low-Latency HTTP Live Streaming</a> (LL-HLS), which builds upon the high-quality, feature-rich HTTP Live Streaming (HLS) protocol. Lower latency brings creators even closer to their viewers, empowering customers to build more interactive features like chat and enabling the use of live-streaming in more time-sensitive applications like live e-learning, sports, gaming, and events.</p><p>Today, in celebration of Birthday Week, we’re opening this beta to all customers with even lower latency. With LL-HLS, you can deliver video to your audience faster, reducing the latency a viewer may experience on their player to as little as three seconds. <a href="https://www.cloudflare.com/developer-platform/solutions/live-streaming/">Low Latency streaming</a> is priced the same way, too: $1 per 1,000 minutes delivered, with zero extra charges for encoding or bandwidth.</p>
    <div>
      <h3>Broadcast with latency as low as three seconds.</h3>
      <a href="#broadcast-with-latency-as-low-as-three-seconds">
        
      </a>
    </div>
    <p>LL-HLS is an extension of the <a href="https://www.cloudflare.com/learning/video/what-is-http-live-streaming/">HLS standard</a> that allows us to reduce glass-to-glass latency — the time between something happening on the broadcast end and a user seeing it on their screen. That includes factors like network conditions and transcoding for HLS and adaptive bitrates. We also include client-side buffering in our understanding of latency because we know the experience is driven by what a user sees, not when a byte is delivered into a buffer. Depending on encoder and player settings, broadcasters' content can be playing on viewers' screens in less than three seconds.</p><div>
  
</div><p><i>On the left,</i> <a href="https://obsproject.com/"><i>OBS Studio</i></a> <i>broadcasting from my personal computer to Cloudflare Stream. On the right, watching this livestream using our own built-in player playing LL-HLS with three second latency!</i></p>
    <div>
      <h3>Same pricing, lower latency. Encoding is always free.</h3>
      <a href="#same-pricing-lower-latency-encoding-is-always-free">
        
      </a>
    </div>
    <p>Our addition of LL-HLS support builds on all the best parts of Stream, including simple, predictable pricing. You never have to pay for ingress (broadcasting to us), compute (encoding), or egress. This allows you to stream with peace of mind, knowing there are no surprise fees and no need to trade quality for cost. Regardless of bitrate or resolution, Stream costs $1 per 1,000 minutes of video delivered and $5 per 1,000 minutes of video stored, billed monthly.</p><p>Stream also provides both a built-in web player and HLS/DASH manifests to use in a compatible player of your choosing. This enables you or your users to go live using the same protocols and tools that broadcasters big and small use to go live to YouTube or Twitch, but gives you full control over access and presentation of live streams. We also provide access control with signed URLs and hotlinking prevention measures to protect your content.</p>
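<p>As arithmetic, the pricing above is simply two line items: $1 per 1,000 minutes delivered plus $5 per 1,000 minutes stored. A quick sketch with an illustrative (made-up) workload:</p>

```python
# Stream's pricing from the paragraph above: delivery and storage only;
# ingest, encoding, and egress are free.

def stream_monthly_cost(minutes_delivered: float, minutes_stored: float) -> float:
    return minutes_delivered / 1000 * 1 + minutes_stored / 1000 * 5

# e.g. a 2,000-minute library watched for 50,000 minutes this month:
print(stream_monthly_cost(50_000, 2_000))   # 60.0 dollars
```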
    <div>
      <h3>Powered by the strength of the network</h3>
      <a href="#powered-by-the-strength-of-the-network">
        
      </a>
    </div>
    <p>And of course, Stream is powered by Cloudflare's global network for fast delivery worldwide, with points of presence within 50ms of 95% of the Internet-connected population, a key factor in our quest to slash latency. We ingest live video close to broadcasters and move it rapidly through Cloudflare’s network. We run encoders on demand and generate player manifests as close to viewers as possible.</p>
    <div>
      <h3>Getting started with LL-HLS</h3>
      <a href="#getting-started-with-ll-hls">
        
      </a>
    </div>
    <p>Getting started with Stream Live only takes a few minutes, and by using Live <i>Outputs</i> for restreaming, you can even test it without changing your existing infrastructure. First, create or update a Live Input in the Cloudflare dashboard. While in beta, Live Inputs will have an option to enable LL-HLS called “Low-Latency HLS Support.” Activate this toggle to enable the new pipeline.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3LHUfSgkCln8UFZy12CtHF/17d72972159b165364d5db31c069702f/image1-9.png" />
            
            </figure><p>Stream will automatically provide the RTMPS and SRT endpoints to broadcast your feed to us, just as before. For the best results, we recommend the following broadcast settings:</p><ul><li><p>Codec: h.264</p></li><li><p>GOP size / keyframe interval: 1 second</p></li></ul><p>Optionally, configure a Live Output to point to your existing video ingest endpoint via RTMPS or SRT to test Stream while rebroadcasting to an existing workflow or infrastructure.</p><p>Alongside those endpoints, Stream provides an HTML embed for our built-in player.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1zFDLgJ0iwEuh0Brat0TXz/da52d22ae28234ecd4390cf4dc518f4d/image3-7-1.png" />
            
            </figure><p>This connection information can be added easily to a broadcast application like OBS to start streaming immediately:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4Gn4GlXNvamzD1qnN5vEi2/c503a9c7280c6bc8f88a1bfadf43b876/image2-7.png" />
            
            </figure><p>During the beta, our built-in player will automatically attempt to use low-latency for any enabled Live Input, falling back to regular HLS otherwise. If LL-HLS is being used, you’ll see “Low Latency” noted in the player.</p><p>During this phase of the beta, we are most closely focused on using <a href="https://obsproject.com/">OBS</a> to broadcast and Stream’s built-in player to watch. However, you may test the LL-HLS manifest in a player of your own by appending <code>?protocol=llhls</code> to the end of the HLS manifest URL. This flag may change in the future and is not yet ready for production usage; <a href="https://developers.cloudflare.com/stream/changelog/">watch for changes in DevDocs</a>.</p>
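<p>Appending the beta flag is a one-line string operation, but it's easy to forget that a manifest URL may already carry a query string. A small illustrative helper (the example URL is made up; the flag may change, per the note above):</p>

```python
# Append the beta protocol=llhls flag to an HLS manifest URL, using "&"
# when a query string already exists. Illustrative only; the flag is not
# yet ready for production usage.

def with_llhls(manifest_url: str) -> str:
    sep = "&" if "?" in manifest_url else "?"
    return f"{manifest_url}{sep}protocol=llhls"

# Hypothetical manifest URL for illustration:
print(with_llhls("https://customer-example.cloudflarestream.com/abc123/manifest/video.m3u8"))
```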
    <div>
      <h3>Sign up today</h3>
      <a href="#sign-up-today">
        
      </a>
    </div>
    <p>Low-Latency HLS is Stream Live’s latest tool to bring your creators and audiences together. All new and existing Stream subscriptions are eligible for the LL-HLS open beta today, with no pricing changes or contract requirements — all part of building the fastest, simplest serverless live-streaming platform. Join our beta to start test-driving Low-Latency HLS!</p>
            <category><![CDATA[Birthday Week]]></category>
            <category><![CDATA[Cloudflare Stream]]></category>
            <category><![CDATA[Live Streaming]]></category>
            <category><![CDATA[Restreaming]]></category>
            <category><![CDATA[Video]]></category>
            <category><![CDATA[Latency]]></category>
            <category><![CDATA[Product News]]></category>
            <guid isPermaLink="false">4s1ozw7fIyXQZnMYrnEnX4</guid>
            <dc:creator>Taylor Smith</dc:creator>
        </item>
        <item>
            <title><![CDATA[Introducing scheduled deletion for Cloudflare Stream]]></title>
            <link>https://blog.cloudflare.com/introducing-scheduled-deletion-for-cloudflare-stream/</link>
            <pubDate>Fri, 11 Aug 2023 13:00:45 GMT</pubDate>
            <description><![CDATA[ Easily manage storage with scheduled deletion for Cloudflare Stream, available for live recordings and on-demand video ]]></description>
            <content:encoded><![CDATA[ <p></p><p>Designed with developers in mind, Cloudflare Stream provides a seamless, integrated workflow that simplifies video streaming for creators and platforms alike. With features like <a href="/stream-live-ga/">Stream Live</a> and <a href="/stream-creator-management/">creator management</a>, customers have been looking for ways to streamline storage management.</p><p>Today, August 11, 2023, Cloudflare Stream is introducing scheduled deletion to easily manage video lifecycles from the Stream dashboard or our API, saving time and reducing storage-related costs. Whether you need to retain recordings from a live stream for only a limited time, or preserve direct creator videos for a set duration, scheduled deletion will simplify storage management and reduce costs.</p>
    <div>
      <h2>Stream scheduled deletion</h2>
      <a href="#stream-scheduled-deletion">
        
      </a>
    </div>
    <p>Scheduled deletion allows developers to automatically remove on-demand videos and live recordings from their library at a specified time. Live inputs can be set up with a deletion rule, ensuring that all recordings from the input will have a scheduled deletion date upon completion of the stream.</p><p>Let’s see how it works in those two configurations.</p>
    <div>
      <h2>Getting started with scheduled deletion for on-demand videos</h2>
      <a href="#getting-started-with-scheduled-deletion-for-on-demand-videos">
        
      </a>
    </div>
    <p>Whether you run a learning platform where students can upload videos for review, a platform that allows gamers to share clips of their gameplay, or anything in between, scheduled deletion can help manage storage and ensure you only keep the videos that you need. Scheduled deletion can be applied to both new and existing on-demand videos, as well as recordings from completed live streams. The feature lets you set the exact date and time at which a video will be deleted, either in the Cloudflare dashboard or via the Cloudflare API.</p>
    <div>
      <h3>Cloudflare dashboard</h3>
      <a href="#cloudflare-dashboard">
        
      </a>
    </div>
    
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3D1lNclKMneZTtlGbY2rO9/b917c24ab6e96bc00f78e0f0c9dbde77/Screenshot-2023-08-11-at-12.49.57.png" />
            
            </figure><ol><li><p>From the Cloudflare dashboard, select <b>Videos</b> under <b>Stream</b></p></li><li><p>Select a video</p></li><li><p>Select <b>Automatically Delete Video</b></p></li><li><p>Specify a desired date and time to delete the video</p></li><li><p>Click <b>Submit</b> to save the changes</p></li></ol>
    <div>
      <h3>Cloudflare API</h3>
      <a href="#cloudflare-api">
        
      </a>
    </div>
    <p>The Stream API can also be used to set the scheduled deletion property on new or existing videos. In this example, we’ll create a direct creator upload that will be deleted on December 31, 2023:</p>
            <pre><code>curl -X POST \
-H 'Authorization: Bearer &lt;BEARER_TOKEN&gt;' \
-d '{ "maxDurationSeconds": 10, "scheduledDeletion": "2023-12-31T12:34:56Z" }' \
https://api.cloudflare.com/client/v4/accounts/&lt;ACCOUNT_ID&gt;/stream/direct_upload </code></pre>
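            <p>If you would rather compute retention relative to the upload date than hard-code a timestamp, the request body is easy to build programmatically. Below is a minimal sketch; the <code>directUploadBody</code> helper is our own illustration (not part of any Cloudflare SDK), and <code>scheduledDeletion</code> takes an RFC 3339 timestamp as in the curl example above:</p>
            <pre><code>// Build a direct creator upload request body whose video will be
// deleted `days` after `now`. Illustrative helper, not an SDK API.
function directUploadBody(days, now) {
  const deleteAt = new Date(now.getTime() + days * 24 * 60 * 60 * 1000);
  return {
    maxDurationSeconds: 10,
    // Date.toISOString() emits milliseconds; trim zero millis for readability.
    scheduledDeletion: deleteAt.toISOString().replace('.000Z', 'Z'),
  };
}

// 90 days from October 2, 2023 lands on December 31, 2023:
console.log(JSON.stringify(directUploadBody(90, new Date('2023-10-02T12:00:00Z'))));</code></pre>
            <p>The resulting JSON can then be sent as the <code>-d</code> payload of the request above.</p>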
            <p>For more information on configuring scheduled deletion for on-demand videos in our API, refer to <a href="https://developers.cloudflare.com/api/">the documentation</a>.</p>
    <div>
      <h2>Getting started with automated deletion for Live Input recordings</h2>
      <a href="#getting-started-with-automated-deletion-for-live-input-recordings">
        
      </a>
    </div>
    <p>We love how recordings from live streams allow those who may have missed the stream to catch up, but these recordings aren’t always needed forever. Scheduled recording deletion is a policy that can be configured for new or existing live inputs. Once configured, the recordings of all future streams on that input will have a scheduled deletion date calculated when the recording is available. Setting this retention policy can be done from the Cloudflare dashboard or via API operations to create or edit Live Inputs:</p>
    <div>
      <h3>Cloudflare Dashboard</h3>
      <a href="#cloudflare-dashboard">
        
      </a>
    </div>
    
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2b60PHplltSmxKQQQ3LERW/431a214e7daa2431a198e7f138f26d6c/image3-3.png" />
            
            </figure><ol><li><p>From the Cloudflare dashboard, select <b>Live Inputs</b> under <b>Stream</b></p></li><li><p>Select <b>Create Live Input</b> or an existing live input</p></li><li><p>Select <b>Automatically Delete Recordings</b></p></li><li><p>Specify a number of days after which new recordings should be deleted</p></li><li><p>Click <b>Submit</b> to save the rule or create the new live input</p></li></ol>
    <div>
      <h3>Cloudflare API</h3>
      <a href="#cloudflare-api">
        
      </a>
    </div>
    <p>The Stream API makes it easy to add a deletion policy to new or existing inputs. Here is an example API request to create a live input with recordings that will expire after 30 days:</p>
            <pre><code>curl -X POST \
-H 'Authorization: Bearer &lt;BEARER_TOKEN&gt;' \
-H 'Content-Type: application/json' \
-d '{ "recording": {"mode": "automatic"}, "deleteRecordingAfterDays": 30 }' \
https://api.cloudflare.com/client/v4/accounts/&lt;ACCOUNT_ID&gt;/stream/live_inputs/</code></pre>
            <p>For more information on live inputs and how to configure deletion policies in our API, refer to <a href="https://developers.cloudflare.com/api/">the documentation</a>.</p>
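            <p>Conceptually, the policy is a simple calculation: when a recording becomes available, its scheduled deletion date is the availability time plus the input’s <code>deleteRecordingAfterDays</code>. A rough sketch of that arithmetic (our own illustration, not Stream’s internal code):</p>
            <pre><code>// Given the time a recording became available and the live input's
// deleteRecordingAfterDays policy, compute its scheduled deletion date.
function recordingDeletionDate(readyAt, deleteRecordingAfterDays) {
  const msPerDay = 24 * 60 * 60 * 1000;
  return new Date(readyAt.getTime() + deleteRecordingAfterDays * msPerDay);
}

// A recording available on June 1 under a 30-day policy is
// scheduled for deletion on July 1:
console.log(recordingDeletionDate(new Date('2023-06-01T00:00:00Z'), 30).toISOString());</code></pre>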
    <div>
      <h2>Try out scheduled deletion today</h2>
      <a href="#try-out-scheduled-deletion-today">
        
      </a>
    </div>
    <p>Scheduled deletion is now available to all Cloudflare Stream customers. Try it out now and join our <a href="https://discord.gg/cloudflaredev">Discord community</a> to let us know what you think! To learn more, check out our <a href="https://developers.cloudflare.com/stream/">developer docs</a>. Stay tuned for more exciting Cloudflare Stream updates in the future.</p> ]]></content:encoded>
            <category><![CDATA[Cloudflare Stream]]></category>
            <category><![CDATA[Video]]></category>
            <category><![CDATA[Developers]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Internship Experience]]></category>
            <guid isPermaLink="false">64WSD7SbRipiXRnZ0xoHFt</guid>
            <dc:creator>Austin Christiansen</dc:creator>
            <dc:creator>Taylor Smith</dc:creator>
        </item>
        <item>
            <title><![CDATA[Introducing Low-Latency HLS Support for Cloudflare Stream]]></title>
            <link>https://blog.cloudflare.com/low-latency-hls-support-for-cloudflare-stream/</link>
            <pubDate>Mon, 19 Jun 2023 13:00:57 GMT</pubDate>
            <description><![CDATA[ Broadcast live to websites and applications with less than 10 second latency with Low-Latency HTTP Live Streaming (LL-HLS), now in beta with Cloudflare Stream ]]></description>
            <content:encoded><![CDATA[ <p>Stream Live lets users easily scale their live streaming apps and websites to millions of creators and concurrent viewers without having to worry about bandwidth costs or purchasing hardware for real-time encoding at scale. Stream Live lets users focus on the content rather than the infrastructure — taking care of the codecs, protocols, and bitrate automatically. When we launched Stream Live last year, we focused on bringing high quality, feature-rich streaming to websites and applications with HTTP Live Streaming (HLS).</p><p>Today, we're excited to introduce support for <a href="https://www.cloudflare.com/developer-platform/solutions/live-streaming/"><i>Low-Latency</i> HTTP Live Streaming</a> (LL-HLS) in a closed beta, offering you an even faster streaming experience. LL-HLS will reduce the latency a viewer may experience on their player from highs of around 30 seconds to less than 10 seconds in many cases. Lower latency brings creators even closer to their viewers, empowering customers to build more interactive features like Q&amp;A or chat and enabling the use of live streaming in more time-sensitive applications like sports, gaming, and live events.</p>
    <div>
      <h3>Broadcast with less than 10-second latency</h3>
      <a href="#broadcast-with-less-than-10-second-latency">
        
      </a>
    </div>
    <p>LL-HLS is an extension of HLS and allows us to reduce <i>glass-to-glass latency</i> — the time between something happening on the broadcast end and a user seeing it on their screen. This includes everything from broadcaster encoding to client-side buffering because we know the experience is driven by what a user sees, not when a byte is delivered into a buffer. Depending on encoder and player settings, broadcasters' content can be playing on viewers' screens in less than ten seconds.</p><p>Our addition of LL-HLS support builds on all the best parts of Stream including simple, predictable pricing. You never have to pay for ingest (broadcasting to us), compute (encoding), or egress. It costs $5 per 1,000 minutes of video stored per month and $1 per 1,000 minutes of video viewed per month. This allows you to stream with peace of mind, knowing there are no surprise fees.</p><p>Other platforms tack on live recordings as a separate add-on feature, and those recordings only become available minutes or even hours after a live stream ends. With Cloudflare Stream, live segments are automatically recorded and immediately available for on-demand playback.</p><p>Stream also provides both a built-in web player and HLS manifests to use in a compatible player of your choosing. This enables you or your users to go live using the same protocols and tools that broadcasters big and small use to go live to YouTube or Twitch, but gives you full control over access and presentation of live streams.</p><p>We also provide <a href="https://www.cloudflare.com/learning/access-management/what-is-access-control/">access control</a> with signed URLs, allowing you to protect your content and share it with only certain users. This allows you to restrict access so only logged-in members can watch a particular video, or only let users watch your video for a limited time period.
And of course, Stream is powered by Cloudflare's global network for fast delivery worldwide, with points of presence within 50 ms of 95% of the Internet-connected population.</p>
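            <p>The pricing mentioned above ($5 per 1,000 minutes of video stored per month, $1 per 1,000 minutes viewed per month) is simple enough to estimate in a couple of lines:</p>
            <pre><code>// Estimate a monthly Stream bill. Ingest, encoding, and egress are
// free, so storage and viewing are the only line items.
function monthlyCostUSD(minutesStored, minutesViewed) {
  return (minutesStored / 1000) * 5 + (minutesViewed / 1000) * 1;
}

// A library of 2,000 minutes, watched for 50,000 minutes this month:
console.log(monthlyCostUSD(2000, 50000)); // 60 dollars</code></pre>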
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/r4I8NvvEb90cCrXRJtGLf/d3e66922eae5e29e7d5b4cd56f66c0ed/image2-8.png" />
            
            </figure><p>Left: Broadcasting to Stream Live using OBS. Right: Watching that same Stream. Note the five-second difference in the NIST clock between the source and the playback.</p><p>Powering the LL-HLS experience involved making several improvements to our underlying infrastructure. One of the largest challenges we encountered was that our existing architecture involved a pipeline with multiple buffers, each as long as the keyframe interval. This meant Stream Live would introduce a delay of up to five times the keyframe interval. To resolve this, we simplified a portion of our pipeline — now, we work with individual frames rather than whole keyframe intervals, but without giving up the economies of scale our approach to encoding provides. This decoupling of keyframe interval and internal buffer duration lets us dramatically reduce latency in HLS, to a maximum of twice the keyframe interval.</p>
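    <p>The effect of that change is easy to quantify. A back-of-the-envelope sketch using the multiples above (the 2-second keyframe interval here is an assumption for illustration):</p>
            <pre><code>// Worst-case latency added by internal buffering, as a multiple of
// the keyframe interval (GOP): up to 5x the GOP with the old
// keyframe-sized buffers, up to 2x after the frame-level rework.
function worstCaseBufferDelay(gopSeconds, gopMultiples) {
  return gopSeconds * gopMultiples;
}

const gop = 2; // an assumed 2-second keyframe interval
console.log(worstCaseBufferDelay(gop, 5)); // old pipeline: 10 seconds
console.log(worstCaseBufferDelay(gop, 2)); // new pipeline: 4 seconds</code></pre>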
    <div>
      <h3>Getting started with the LL-HLS beta</h3>
      <a href="#getting-started-with-the-ll-hls-beta">
        
      </a>
    </div>
    <p>As we prepare to ship this new functionality, we're <a href="https://docs.google.com/forms/d/e/1FAIpQLSeZ2NBuAXC75aDJllhVA0itW0TZ1w4s48TvFm-eP7R1h9Hc9g/viewform?usp=sf_link">looking for beta testers</a> to help us test non-production workloads. To participate in the beta, your application should be configured with these settings:</p><ul><li><p>H.264 video codec</p></li><li><p>Constant bitrate</p></li><li><p>Keyframe interval (GOP size) of 2s</p></li><li><p>No B Frames</p></li><li><p>Using the Stream built-in player</p></li></ul><p>Getting started with Stream Live only takes a few minutes. Create a Live Input in the Cloudflare dashboard, then Stream will automatically provide RTMPS and SRT endpoints to broadcast your feed to us as well as an HTML embed for our built-in player and the HLS manifest for a custom player.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5luGmhp7Otff4f8T8Bjwoh/3952e7bcb4998a416ad5e8c4e49b189c/image4-6.png" />
            
            </figure>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/43x3NlJtNgaUAeJnmj3RD7/ee6e9533024c16ecf775813a56d06a2b/image3-7.png" />
            
            </figure><p>This connection information can be added easily to a broadcast application like OBS to start streaming immediately:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/78aLkxTHD6JIknv5Cm8yVw/446ce2661c935560d3e68052b26d284f/image1-9.png" />
            
            </figure><p>Customers in the LL-HLS beta will need to make a minor adjustment to the built-in player embed code, but there are no changes to Live Input configuration, dashboard interface, API, or existing functionality.</p>
    <div>
      <h3>Sign up today</h3>
      <a href="#sign-up-today">
        
      </a>
    </div>
    <p>LL-HLS is Stream Live’s latest tool to bring your creators and audiences together. After the beta period, this feature will be generally available to all new and existing Stream subscriptions with no pricing changes or contract requirements, all part of building the fastest, simplest serverless live streaming platform. <a href="https://docs.google.com/forms/d/e/1FAIpQLSeZ2NBuAXC75aDJllhVA0itW0TZ1w4s48TvFm-eP7R1h9Hc9g/viewform?usp=sf_link">Join our beta</a> to start test-driving Low-Latency HLS!</p>
    <div>
      <h3>Watch on Cloudflare TV</h3>
      <a href="#watch-on-cloudflare-tv">
        
      </a>
    </div>
    <div></div> ]]></content:encoded>
            <category><![CDATA[Speed Week]]></category>
            <category><![CDATA[Live Streaming]]></category>
            <category><![CDATA[Video]]></category>
            <guid isPermaLink="false">1AGHbDsMyWLMDqABUpgMo4</guid>
            <dc:creator>Taylor Smith</dc:creator>
        </item>
        <item>
            <title><![CDATA[Get started with Cloudflare Workers with ready-made templates]]></title>
            <link>https://blog.cloudflare.com/cloudflare-workers-templates/</link>
            <pubDate>Tue, 15 Nov 2022 14:00:00 GMT</pubDate>
            <description><![CDATA[ Today, we’re excited to share a collection of ready-made templates to help you build your next application on Cloudflare Workers ]]></description>
            <content:encoded><![CDATA[ 
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7mcyGMehbK3CHWrn4SVbYc/8667083e5fa68f72bcf2a6f871efd8fb/image1-30.png" />
            
            </figure><p>One of the things we prioritize at Cloudflare is enabling developers to build their applications on our developer platform with ease. We’re excited to share a collection of ready-made templates that’ll help you start building your next application on Workers. We want developers to get started as quickly as possible, so that they can focus on building and innovating and avoid spending so much time configuring and setting up their projects.</p>
    <div>
      <h3>Introducing <a href="https://workers.new/templates">Cloudflare Workers Templates</a></h3>
      <a href="#introducing">
        
      </a>
    </div>
    <p><a href="https://workers.cloudflare.com/">Cloudflare Workers</a> enables you to build applications with exceptional performance, reliability, and scale. We are excited to share a collection of templates that help you get started quickly and give you an idea of what is possible to build on our developer platform.</p><p>We have made available a set of starter templates highlighting different use cases of Workers. We understand that you have ideas you would love to build on top of Workers, and may wonder whether they are possible. These templates go beyond the conventional ‘Hello, World’ starter. They’ll help shape your idea of what kinds of applications you can build with Workers as well as other products in the Cloudflare Developer Ecosystem.</p><p>This new collection of starter templates lives at <a href="https://workers.new/templates">workers.new/templates</a>. This shortcut serves you a collection of templates that you can immediately start using to build your next application.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5jEO1oooCk1wae8CfqkiKP/8cd2f203ec83c3b48f446ff4324e4918/1-3.png" />
            
            </figure><p>Cloudflare Workers Template Collection</p>
    <div>
      <h3>How to use Cloudflare Workers templates</h3>
      <a href="#how-to-use-cloudflare-workers-templates">
        
      </a>
    </div>
    <p>We created this collection of templates to support you in getting started with Workers. Some of the examples combine multiple Cloudflare developer platform products to build applications similar to ones you might use every day. Workers is used today for many different use cases, and we want to showcase some of them to you.</p><p>We have templates for building an <a href="https://github.com/cloudflare/templates/tree/main/pages-image-sharing">image sharing website with Pages Functions</a>, <a href="https://github.com/cloudflare/templates/tree/main/stream/upload/direct-creator-uploads">direct creator upload to Cloudflare Stream</a>, <a href="https://github.com/cloudflare/templates/tree/main/worker-example-request-scheduler">Durable Object-powered request scheduler</a> and many more.</p><p>One example to highlight is a template that lets you accept payment for video content. It is powered by <a href="https://developers.cloudflare.com/pages/platform/functions/">Pages Functions</a>, <a href="https://www.cloudflare.com/products/cloudflare-stream/">Cloudflare Stream</a> and <a href="https://stripe.com/en-gb-nl/payments/checkout">Stripe Checkout</a>. This app shows how you can use Cloudflare Workers with external payment and authentication systems to create a logged-in experience, without ever having to manage a database or persist any state yourself.</p><p>Once a user has paid, Stripe Checkout redirects to a URL that verifies a token from Stripe, and generates a <a href="https://developers.cloudflare.com/stream/viewing-videos/securing-your-stream/">signed URL</a> using Cloudflare Stream to view the video. This signed URL is only valid for a specified amount of time, and access can be restricted by country or IP address.</p><p>To use the template, you need to either click the Deploy with Workers button or open the template with StackBlitz.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7kuOBGeAbjC8nqlBCYRq67/348194b08a7f8985619c5fad7f0f3d8c/image4-11.png" />
            
            </figure><p>The <i>Deploy with Workers</i> button will redirect you to a page where you can authorize GitHub to deploy a fork of the repository using GitHub actions while opening with StackBlitz creates a new fork of the template for you to start working with.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5JEDKuhUdxfJ5nMjSyPEKm/7f73f1b7175825ad544da1ec975268f3/image2-25.png" />
            
            </figure><p>Deploy to Workers Demo</p><p>These templates also come bundled with additional features I would like to share with you:</p>
    <div>
      <h3>Integrated <a href="https://developers.cloudflare.com/workers/platform/deploy-button/#:~:text=Deploy%20buttons%20let%20you%20deploy,make%20sharing%20your%20work%20easier.">Deploy with Workers</a> Button</h3>
      <a href="#integrated-button">
        
      </a>
    </div>
    <p>We added a deploy button to all templates listed in the <a href="https://github.com/cloudflare/templates">templates repository</a> and the <a href="https://workers.new/templates">collection website</a>, so you can get your code up and running quickly. The Deploy with Workers button lets you deploy your code in under five minutes; it uses a <a href="https://github.com/marketplace/actions/deploy-to-cloudflare-workers-with-wrangler">GitHub Action</a> that publishes to Cloudflare Workers with Wrangler.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7kkLQ4b0wvl1V7u7CndQ2d/23a6f6d2374800b1496a1b684dbaf166/image3-19.png" />
            
            </figure><p>GitHub Repository with Deploy to Workers Button</p>
    <div>
      <h3>Support for Test Driven Development</h3>
      <a href="#support-for-test-driven-development">
        
      </a>
    </div>
    <p>As developers, we write tests for our code to ensure we’re shipping quality, production-grade, bug-free software. Workers supports integration and end-to-end tests written with the Wrangler <a href="https://developers.cloudflare.com/workers/wrangler/api/"><i>unstable_dev</i></a> API. We want not only to improve the developer experience, but also to nudge developers to follow best practices and prioritize TDD in development. We configured a number of templates with integration tests that run against a local server, which you can use as a starting point for setting up tests in your own projects.</p><p>Here’s an example using Wrangler’s <a href="https://developers.cloudflare.com/workers/wrangler/api/#unstable_dev">unstable_dev</a> API and the <a href="https://vitest.dev/">Vitest</a> test framework to test the code in an example ‘Hello worker’ starter template:</p>
            <pre><code>import { unstable_dev } from 'wrangler';
import { describe, expect, it, beforeAll, afterAll } from 'vitest';

describe('Worker', () =&gt; {
 let worker;

 beforeAll(async () =&gt; {
   worker = await unstable_dev('index.js', {}, { disableExperimentalWarning: true });
 });

 afterAll(async () =&gt; {
   await worker.stop();
 });

 it('should return Hello worker!', async () =&gt; {
   const resp = await worker.fetch();
   // Assert on the response directly so a missing response fails the
   // test instead of being silently skipped.
   expect(resp).toBeDefined();
   const text = await resp.text();
   expect(text).toMatchInlineSnapshot(`"Hello worker!"`);
 });
});</code></pre>
            
    <div>
      <h3>Online IDE Integration with StackBlitz</h3>
      <a href="#online-ide-integration-with-stackblitz">
        
      </a>
    </div>
    <p>We announced <a href="/cloudflare-stackblitz-partnership/">StackBlitz’s partnership with Cloudflare Workers</a> during Platform Week early this year. We believe the developer experience should be a top priority, because we want you to build with ease on our developer platform.</p><p>StackBlitz is an online development platform for building web applications. It is powered by <a href="https://blog.stackblitz.com/posts/introducing-webcontainers/">WebContainers</a>, the first WebAssembly-based operating system, which boots Node.js environments in milliseconds, securely within your browser tab.</p><p>We made it even easier to get started with Workers with an integrated <b><i>Open with StackBlitz</i></b> button for each starter template, making it simple to create a fork of a template; all you need is a web browser to build your application.</p><div></div><p>Everything we’ve highlighted in this post leads to one question: <i>how can we create a better experience for developers getting started with Workers?</i> We introduced these ready-made templates to make you more efficient, give you a smoother developer experience, and improve your time to deployment. We want you to spend less time getting started with building on the Workers developer platform, so what are you waiting for?</p>
    <div>
      <h3>Next Steps</h3>
      <a href="#next-steps">
        
      </a>
    </div>
    <p>You can start building your own Worker today using the <a href="https://workers.new/template">available templates</a> provided in the templates collection to help you get started. If you would like to contribute your own templates to the collection, be sure to <a href="https://github.com/cloudflare/templates">send in a pull request</a>; we’re more than happy to review it and add it to the growing collection. Share what you have built with us in the <i>#builtwith</i> channel on our Discord community. Make sure to follow us on <a href="https://twitter.com/cloudflaredev">Twitter</a> or join <a href="https://discord.gg/cloudflaredev">our Discord</a> Developers Community server.</p>
            <category><![CDATA[Developer Week]]></category>
            <category><![CDATA[Cloudflare Workers]]></category>
            <category><![CDATA[Wrangler]]></category>
            <category><![CDATA[Cloudflare Stream]]></category>
            <category><![CDATA[Developers]]></category>
            <category><![CDATA[Video]]></category>
            <category><![CDATA[Developer Platform]]></category>
            <guid isPermaLink="false">2FW0xuk7RWElCGndiI6nAu</guid>
            <dc:creator>Gift Egwuenu</dc:creator>
        </item>
        <item>
            <title><![CDATA[Bringing the best live video experience to Cloudflare Stream with AV1]]></title>
            <link>https://blog.cloudflare.com/av1-cloudflare-stream-beta/</link>
            <pubDate>Wed, 05 Oct 2022 17:08:00 GMT</pubDate>
            <description><![CDATA[ Cloudflare Stream now supports the AV1 codec for live video in open beta, unlocking live-streaming at higher resolution, with lower bandwidth ]]></description>
            <content:encoded><![CDATA[ <p></p><p>Consumer hardware is pushing the limits of consumers’ bandwidth.</p><p>VR headsets support 5760 x 3840 resolution — 22.1 million pixels <i>per frame</i> of video. Nearly all new TVs and smartphones sold today now support 4K — 8.8 million pixels per frame. It’s now normal for most people on a subway to be casually streaming video on their phone, even as they pass through a tunnel. People expect all of this to just work, and get frustrated when it doesn’t.</p><p>Consumer Internet bandwidth hasn’t kept up. Even advanced mobile carriers still limit streaming video resolution to prevent network congestion. Many mobile users still have to monitor and limit their mobile data usage. Higher Internet speeds require expensive infrastructure upgrades, and 30% of Americans still say they often have problems <a href="https://www.pewresearch.org/internet/2021/06/03/mobile-technology-and-home-broadband-2021/">simply connecting to the Internet at home</a>.</p><p>We talk to developers every day who are pushing up against these limits, trying to deliver the highest quality streaming video without buffering or jitter, challenged by viewers’ expectations and bandwidth. Developers building live video experiences hit these limits the hardest — buffering doesn’t just delay video playback, it can cause the viewer to get out of sync with the live event. Buffering can cause a sports fan to miss a key moment as playback suddenly skips ahead, or find out in a text message about the outcome of the final play, before they’ve had a chance to watch.</p><p>Today we’re announcing a big step towards breaking the ceiling of these limits — support in <a href="https://www.cloudflare.com/products/cloudflare-stream/">Cloudflare Stream</a> for the <a href="https://aomedia.org/av1-features/">AV1</a> codec for live videos and their recordings, available today to all Cloudflare Stream customers in open beta. 
Read <a href="https://developers.cloudflare.com/stream/viewing-videos/av1-playback/">the docs</a> to get started, or <a href="https://cool-sf-videos.pages.dev/">watch an AV1 video</a> from Cloudflare Stream in your web browser. AV1 is an open and royalty-free video codec that uses <a href="https://engineering.fb.com/2018/04/10/video-engineering/av1-beats-x264-and-libvpx-vp9-in-practical-use-case/">46% less bandwidth than H.264</a>, the most commonly used video codec on the web today.</p>
    <div>
      <h3>What is AV1, and how does it improve live video streaming?</h3>
      <a href="#what-is-av1-and-how-does-it-improve-live-video-streaming">
        
      </a>
    </div>
    <p>Every piece of information that travels across the Internet, from web pages to photos, requires data to be transmitted between two computers. A single character usually takes one byte, so a two-page letter would be 3600 bytes or 3.6 kilobytes of data transferred.</p><p>One pixel in a photo takes 3 bytes, one each for red, green and blue. A 4K photo has 8,294,400 pixels, so it would take 24,883,200 bytes, or about 24.9 megabytes. A video is like a photo that changes 30 times a second, which would make almost 45 gigabytes per minute. That’s a lot!</p><p>To reduce the amount of bandwidth needed to stream video, before video is sent to your device, it is compressed using a codec. When your device receives video, it decodes this into the pixels displayed on your screen. These codecs are essential to both streaming and storing video.</p><p>Video compression codecs combine multiple advanced techniques, and are able to compress video to one percent of the original size, with your eyes barely noticing a difference. This also makes video codecs computationally intensive and hard to run. Smartphones, laptops and TVs have specific media decoding hardware, separate from the main CPU, optimized to decode specific codecs quickly, using the minimum amount of battery life and power.</p><p>Every few years, as researchers invent more efficient compression techniques, standards bodies release new codecs that take advantage of these improvements. Each generation of improvements in compression technology increases the requirements for computers that run them. With higher requirements, new chips are made available with increased compute capacity. These new chips allow your device to display higher quality video while using less bandwidth.</p><p>AV1 takes advantage of recent advances in compute to deliver video with dramatically fewer bytes, even compared to other relatively recent video codecs like VP9 and HEVC.</p>
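    <p>Working through the uncompressed arithmetic makes the case for codecs concrete:</p>
            <pre><code>// Uncompressed 4K video: 3840 x 2160 pixels, 3 bytes per pixel,
// 30 frames per second.
const bytesPerFrame = 3840 * 2160 * 3;
const bytesPerMinute = bytesPerFrame * 30 * 60;

console.log(bytesPerFrame);        // 24883200, about 24.9 MB per frame
console.log(bytesPerMinute / 1e9); // 44.78976, almost 45 GB per minute</code></pre>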
    <div>
      <h3>AV1 leverages the power of new smartphone chips</h3>
      <a href="#av1-leverages-the-power-of-new-smartphone-chips">
        
      </a>
    </div>
    <p>One of the biggest developments of the past few years has been the rise of custom chip designs for smartphones. Much of what’s driven the development of these chips is the need for advanced on-device image and video processing, as companies compete on the basis of which smartphone has the best camera.</p><p>This means the phones we carry around have an incredible amount of compute power. One way to think about AV1 is that it shifts work from the network to the viewer’s device. AV1 is fewer bytes over the wire, but computationally harder to decode than prior formats. When AV1 was first announced in 2018, it was dismissed by some as too slow to encode and decode, but smartphone chips have become radically faster in the past four years, more quickly than many saw coming.</p><p>AV1 hardware decoding is already built into the latest Google Pixel smartphones as part of the Tensor chip design. The <a href="https://semiconductor.samsung.com/processor/mobile-processor/exynos-2200/">Samsung Exynos 2200</a> and <a href="https://www.mediatek.com/products/smartphones-2/dimensity-1000-series">MediaTek Dimensity 1000 SoC</a> mobile chipsets both support hardware accelerated AV1 decoding. It appears that Google will <a href="https://android.googlesource.com/platform/cts/+/9203e0379bbb8991cdfee39e2a894d236bfaca8e?cf_target_id=EB3A10F16F1C7B0D8AE3D87D702DDC4A">require</a> that all devices that support Android 14 support decoding AV1. And AVPlayer, the media playback API built into iOS and tvOS, now <a href="https://developer.apple.com/documentation/coremedia/1564239-video_codec_constants/kcmvideocodectype_av1">includes an option for AV1</a>, which hints at future support. It’s clear that the industry is heading towards hardware-accelerated AV1 decoding in the most popular consumer devices.</p><p>With hardware decoding comes battery life savings — essential for both today’s smartphones and tomorrow’s VR headsets. 
For example, a Google Pixel 6 with AV1 hardware decoding uses only minimal battery and CPU to decode and play our <a href="https://cool-sf-videos.pages.dev/">test video</a>:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1vbKm1xvDV6JE2Z9Jkn0rO/b0809152c1eb4f589c098530f254f223/image1-5.png" />
            
            </figure>
    <div>
      <h3>AV1 encoding requires even more compute power</h3>
      <a href="#av1-encoding-requires-even-more-compute-power">
        
      </a>
    </div>
    <p>Just as decoding is significantly harder for end-user devices, it is also significantly harder to encode video using AV1. When AV1 was announced in 2018, many doubted whether hardware would be able to encode it efficiently enough for the codec to be adopted quickly.</p><p>To demonstrate this, we encoded the 4K rendering of <a href="https://peach.blender.org/">Big Buck Bunny</a> (a classic among video engineers!) into AV1, using an AMD EPYC 7642 48-Core Processor with 256 GB RAM. This CPU continues to be a workhorse of our compute fleet, as we have <a href="/an-epyc-trip-to-rome-amd-is-cloudflares-10th-generation-edge-server-cpu/">written about previously</a>. We used the following command to re-encode the video, <a href="https://trac.ffmpeg.org/wiki/Encode/AV1">based on the example in the ffmpeg AV1 documentation:</a></p><p><code>ffmpeg -i bbb_sunflower_2160p_30fps_normal.mp4 -c:v libaom-av1 -crf 30 -b:v 0 -strict -2 av1_test.mkv</code></p><p>Using a single core, encoding just two seconds of video at 30fps took over 30 minutes. Even if all 48 cores were used to encode, it would take at minimum over 43 seconds to encode just two seconds of video. Live encoding using only CPUs would require over 20 servers running at full capacity.</p><p>Special-purpose AV1 software encoders like <a href="https://github.com/xiph/rav1e">rav1e</a> and <a href="https://gitlab.com/AOMediaCodec/SVT-AV1">SVT-AV1</a> that run on general-purpose CPUs can encode somewhat faster than <a href="https://trac.ffmpeg.org/wiki/Encode/AV1">libaom-av1</a> with ffmpeg, but still consume a huge amount of compute power to encode AV1 in real-time, requiring multiple servers running at full capacity in many scenarios.</p>
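    <p>The server math above can be checked in a few lines, assuming perfect linear scaling across cores, which real encoders do not achieve (hence the measured figure of over 43 seconds):</p>
            <pre><code>// CPU-only AV1 live encoding, from the measurement above: one core
// needs at least 30 minutes of CPU time per 2 seconds of video.
const singleCoreSeconds = 30 * 60; // 1800 s of CPU time
const videoSeconds = 2;
const cores = 48;

const wallClockSeconds = singleCoreSeconds / cores;
console.log(wallClockSeconds);                // 37.5 s, even with perfect scaling
console.log(wallClockSeconds / videoSeconds); // 18.75 servers just to keep up
                                              // with real time; over 20 in practice</code></pre>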
    <div>
      <h3>Cloudflare Stream encodes your video to AV1 in real-time</h3>
      <a href="#cloudflare-stream-encodes-your-video-to-av1-in-real-time">
        
      </a>
    </div>
    <p>At Cloudflare, we control both the hardware and software on our network. So to solve the CPU constraint, we’ve installed dedicated AV1 hardware encoders, designed specifically to encode AV1 at blazing fast speeds. This end-to-end control is what lets us encode your video to AV1 in real-time. This is entirely out of reach for most public cloud customers, including the video infrastructure providers who depend on them for compute power.</p><p>Encoding in real-time means you can use AV1 for live video streaming, where saving bandwidth matters most. With a pre-recorded video, the client video player can fetch future segments of video well in advance, relying on a buffer that can be many tens of seconds long. With live video, buffering is constrained by latency — it’s not possible to build up a large buffer when viewing a live stream. There is less margin for error with live streaming, and every byte saved means that if a viewer’s connection is interrupted, it takes less time to recover before the buffer is empty.</p>
    <div>
      <h3>Stream lets you support AV1 with no additional work</h3>
      <a href="#stream-lets-you-support-av1-with-no-additional-work">
        
      </a>
    </div>
    <p>AV1 has a chicken-and-egg dilemma. And we’re helping solve it.</p><p>Companies with large video libraries often re-encode their entire content library to a new codec before using it. But AV1 is so computationally intensive that re-encoding whole libraries has been cost prohibitive. Companies have to choose specific videos to re-encode, and guess which content will be most viewed ahead of time. This is particularly challenging for apps with user-generated content, where content can suddenly go viral, and viewer patterns are hard to anticipate.</p><p>This has slowed down the adoption of AV1 — content providers wait for more devices to support AV1, and device manufacturers wait for more content to use AV1. Which will come first?</p><p>With Cloudflare Stream there is no need to manually trigger re-encoding, re-upload video, or manage the bulk encoding of a large video library. This is a unique approach that is made possible by integrating encoding and delivery into a single product — it is not possible to encode on-demand using the old way of encoding first, and then pointing a CDN at a bucket of pre-encoded files.</p><p>We think this approach can accelerate the adoption of AV1. Consider a video app with millions of minutes of user-generated video. Most videos will never be watched again. In the old model, developers would have to spend huge sums of money to encode upfront, or pick and choose which videos to re-encode. With Stream, we can help anyone incrementally adopt AV1, without re-encoding upfront. As we work towards making AV1 Generally Available, we’ll be working to make supporting AV1 simple and painless, even for videos already uploaded to Stream, with no special configuration necessary.</p>
    <div>
      <h3>Open, royalty-free, and widely supported</h3>
      <a href="#open-royalty-free-and-widely-supported">
        
      </a>
    </div>
    <p>At Cloudflare, we are committed to open standards and <a href="/tag/patent-troll/">fighting patent trolls</a>. While there are multiple competing options for new video codecs, we chose to support AV1 first in part because it is open source and has royalty-free licensing.</p><p>Other video codecs force device manufacturers to pay royalty fees in order to adopt their standard in consumer hardware, and their backers have been quick to file lawsuits against competing video codecs. The group behind the open and royalty-free VP8 and VP9 codecs has been pushing back against this model for more than a decade, and AV1 is the successor to these codecs, with support from all the <a href="https://aomedia.org/membership/members/">biggest technology companies</a>, both software and hardware. Beyond its technical accomplishments, AV1 is a clear message from the industry that the future of video encoding should be open, royalty-free, and free from patent litigation.</p>
    <div>
      <h3>Try AV1 right now with <b><i>your</i></b> live stream or live recording</h3>
      <a href="#try-av1-right-now-with-your-live-stream-or-live-recording">
        
      </a>
    </div>
    <p>Support for AV1 is currently in open beta. You can try using AV1 on your own live video with Cloudflare Stream right now — just add the <code>?betaCodecSuggestion=av1</code> query parameter to the HLS or DASH manifest URL for any live stream or live recording created after October 1st in Cloudflare Stream. <a href="https://developers.cloudflare.com/stream/viewing-videos/av1-playback/">Read the docs</a> to get started. If you don’t yet have a Cloudflare account, you can sign up <a href="https://dash.cloudflare.com/sign-up/stream">here</a> and start using Cloudflare Stream in just a few minutes.</p><p>We also have a recording of a live video, encoded using AV1, that you can watch <a href="https://cool-sf-videos.pages.dev/">here</a>. Note that Safari does not yet support AV1.</p><p>We encourage you to try AV1 with your test streams, and we’d love your feedback. Join our <a href="https://discord.com/invite/cloudflaredev/">Discord channel</a> and tell us what you’re building, and what kinds of video you’re interested in using AV1 with. We’d love to hear from you!</p> ]]></content:encoded>
            <category><![CDATA[Cloudflare Stream]]></category>
            <category><![CDATA[Video]]></category>
            <category><![CDATA[Developers]]></category>
            <category><![CDATA[Product News]]></category>
            <guid isPermaLink="false">1VBAndDQEb6dfu5nhqoTvb</guid>
            <dc:creator>Renan Dincer</dc:creator>
            <dc:creator>Brendan Irvine-Broque</dc:creator>
            <dc:creator>Chris Howells</dc:creator>
            <dc:creator>Ryan Schachte</dc:creator>
        </item>
        <item>
            <title><![CDATA[Lights, Camera, Action! Business and Pro customers get bundled streaming video]]></title>
            <link>https://blog.cloudflare.com/stream-for-pro-biz-customers/</link>
            <pubDate>Fri, 30 Sep 2022 13:00:00 GMT</pubDate>
            <description><![CDATA[ If you have a Business or Pro subscription, soon you will receive a free allocation of Cloudflare Stream. ]]></description>
            <content:encoded><![CDATA[ <p></p><p>Beginning December 1, 2022, if you have a Business or Pro subscription, you will receive a complimentary allocation of Cloudflare Stream. Here’s what this means:</p><ul><li><p>All Cloudflare customers with a Biz or Pro domain will be able to store up to 100 minutes of video content and deliver up to 10,000 minutes of video content each month at <i>no additional cost</i></p></li><li><p>If you need additional storage or delivery beyond the complimentary allocation, you will be able to upgrade to a paid Stream subscription from the <a href="https://dash.cloudflare.com/?to=/:account/stream">Stream Dashboard</a>.</p></li></ul><p>Cloudflare Stream simplifies storage, encoding and playback of videos. You can use the free allocation of Cloudflare Stream for various use cases, such as background/hero videos, e-commerce product videos, how-to guides and customer testimonials.</p>
    <div>
      <h3>Upload videos with no code</h3>
      <a href="#upload-videos-with-no-code">
        
      </a>
    </div>
    <p>To upload your first video to Stream, simply visit the Stream Dashboard and drag-and-drop the video file:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2Ngq1qsTkBRCPRk5tCFFUa/da505b7d31a84659210258ddc1314d86/image1-83.png" />
            
            </figure><p>Once you upload a video, Stream will store and encode your video. Stream automatically optimizes your video uploads by creating multiple versions at different quality levels. This happens behind the scenes and requires no extra effort from your side. The Stream Player automatically selects the optimal quality level based on your website visitor’s Internet connection, using a technology called <i>adaptive bitrate encoding</i>.</p><p>Your uploaded video will appear on the Dashboard:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1ysjmverw5Lxb1oFl5cuKt/e1b70a15e953ac69d6a60e697742a679/image3-51.png" />
            
            </figure><p>Click on the video in the list of videos to watch a preview, change settings or to grab the embed code:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4yhaNOupXKHQ3xFI786f4L/740f1865050219a1e6a4eae571c31447/image6-14.png" />
            
            </figure>
    <div>
      <h3>Built-in Stream Player</h3>
      <a href="#built-in-stream-player">
        
      </a>
    </div>
    <p>Stream provides an embed code that can be used to place your uploaded videos onto your website. The embed code can be found under the <i>Embed</i> tab:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/6COvthPh21mNvKIbXIJJCP/12b4e673d9cdc34420e7ebb90b2b419f/image4-29.png" />
            
            </figure><p>To include the video on your website, simply copy-and-paste the embed code.</p><p>You’ll notice in the screenshot above that the Embed tab lets you customize the viewing experience. It supports the following optional properties:</p><ul><li><p><b>Poster:</b> The “poster image” is what appears on the video player before the user has started playing the video. By default, the poster image is set to the first frame in the video. However, you can change it by specifying another point in time <i>or</i> by specifying a URL to an image.</p></li><li><p><b>Start Time:</b> Let’s say you have a 10-minute instructional video and your customer writes in with a question that is answered in that video at the 8-minute mark. You can use the <i>Start Time</i> property to have the video playback begin at the 8-minute mark, so your customer with a specific question does not need to watch 8 minutes of the video wondering “when will it answer my question?”. Instead, you can share a link with the customer that begins the video playback at the 8-minute mark.</p></li><li><p><b>Default Text Track:</b> You can upload caption files for multiple languages for a given video. By default, captions are turned off. But if you want the captions to <i>always</i> render when the video plays, you can choose the default language from the Default Text Track dropdown.</p></li><li><p><b>Primary Color:</b> You can choose your brand’s primary color and have it applied to various elements within the player, including the play button and the seek bar. Here is an example of the Stream Player with the Primary Color property configured to the Cloudflare orange:</p></li></ul>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4ygIXB54RyohFgVt1n6xKd/85e83140939afe89ef78087cec5b4ad0/image2-67.png" />
            
            </figure>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3Ni8EfEhC56phN7kpzHxE6/40091b5f5a07f551a6967fa3223b69c6/image5-26.png" />
            
            </figure>
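The customization options above correspond to query parameters on the Stream Player embed URL. A minimal sketch of composing such a URL (the customer subdomain and <code>VIDEO_UID</code> are placeholders; <code>startTime</code> and <code>primaryColor</code> are the parameter names used in the Stream docs):

```javascript
// Sketch: composing a Stream Player embed URL with the options above.
// The customer subdomain and VIDEO_UID are placeholders; startTime and
// primaryColor are the player's documented query parameters.
const embed = new URL(
  "https://customer-example.cloudflarestream.com/VIDEO_UID/iframe"
);
embed.searchParams.set("startTime", "8m");         // begin playback at the 8-minute mark
embed.searchParams.set("primaryColor", "#f48120"); // Cloudflare orange

console.log(embed.toString());
// https://customer-example.cloudflarestream.com/VIDEO_UID/iframe?startTime=8m&primaryColor=%23f48120
```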
    <div>
      <h3>Much, much more…</h3>
      <a href="#much-much-more">
        
      </a>
    </div>
    <p>We live in a video-first world. Many Cloudflare customers already upload their videos to free video hosting services for marketing purposes. However, when you embed a video on your website that is hosted on a free video sharing service, your users often have to engage with unwelcome ads and pixel trackers. Our hope is that by offering a free tier of Stream to Biz and Pro customers, you can use video to show off your products in a way that respects your users’ privacy and reflects your brand identity.</p><p>In addition to the features described in this announcement, Cloudflare Stream includes many more features, including:</p><ul><li><p>Dynamic Thumbnail Generation</p></li><li><p>Multi-language Captions</p></li><li><p>Live Streaming</p></li><li><p>Analytics</p></li></ul><p>For a comprehensive list of features and how to use them, check out the <a href="https://developers.cloudflare.com/stream">Cloudflare Stream Docs</a>.</p>
            <category><![CDATA[Birthday Week]]></category>
            <category><![CDATA[Video]]></category>
            <category><![CDATA[eCommerce]]></category>
            <category><![CDATA[Product News]]></category>
            <guid isPermaLink="false">3VFf0XrUuH4TmLskNymhpa</guid>
            <dc:creator>Zaid Farooqui</dc:creator>
        </item>
        <item>
            <title><![CDATA[WebRTC live streaming to unlimited viewers, with sub-second latency]]></title>
            <link>https://blog.cloudflare.com/webrtc-whip-whep-cloudflare-stream/</link>
            <pubDate>Tue, 27 Sep 2022 13:00:00 GMT</pubDate>
            <description><![CDATA[ Cloudflare Stream now supports live streaming over WebRTC to unlimited concurrent viewers, using open standards WHIP and WHEP, with zero dependencies or client SDKs necessary. ]]></description>
            <content:encoded><![CDATA[ <p></p><p>Creators and broadcasters expect to be able to go live from anywhere, on any device. Viewers expect “live” to mean “real-time”. The protocols that power most live streams are unable to meet these growing expectations.</p><p>In talking to developers building live streaming into their apps and websites, we’ve heard near universal frustration with the limitations of existing live streaming technologies. Developers in 2022 rightly expect to be able to deliver low latency to viewers, broadcast reliably, and use web standards rather than old protocols that date back to the era of Flash.</p><p>Today, we’re excited to announce in open beta that Cloudflare Stream now supports live video streaming over WebRTC, with sub-second latency, to unlimited concurrent viewers. This is a new feature of Cloudflare Stream, and you can start using it right now in the Cloudflare Dashboard — read the <a href="https://developers.cloudflare.com/stream/webrtc-beta/">docs</a> to get started.</p><p>WebRTC with Cloudflare Stream leapfrogs existing tools and protocols, exclusively uses open standards with zero dependency on a specific SDK, and empowers any developer to build both low latency live streaming and playback into their website or app.</p>
    <div>
      <h3>The status quo of streaming live video is broken</h3>
      <a href="#the-status-quo-of-streaming-live-video-is-broken">
        
      </a>
    </div>
    <p>The status quo of streaming live video has high latency, depends on archaic protocols and is incompatible with the way developers build apps and websites. A reasonable person’s expectations of what the Internet should be able to deliver in 2022 are simply unmet by the dominant set of protocols carried over from past eras.</p><p><b>Viewers</b> increasingly expect “live” to mean “real-time”. People want to place bets on sports broadcasts in real-time, interact and ask questions to presenters in real-time, and never feel behind their friends at a live event.</p><p>In practice, the HLS and DASH standards used to deliver video have 10+ seconds of latency. LL-HLS and LL-DASH bring this down to closer to 5 seconds, but only as a hack on top of the existing protocol that delivers segments of video in individual HTTP requests. Sending mini video clips over TCP simply cannot deliver video in real-time. HLS and DASH are here to stay, but aren’t the future of real-time live video.</p><p><b>Creators and broadcasters</b> expect to be able to go live from anywhere, on any device.</p><p>In practice, people creating live content are stuck with a limited set of native apps, and can’t go live using RTMP from a web browser. Because it’s built on top of TCP, the RTMP broadcasting protocol struggles under even the slightest network disruption, making it a poor or often unworkable option when broadcasting from mobile networks. 
RTMP, originally built for use with Adobe Flash Player, was <a href="https://rtmp.veriskope.com/pdf/rtmp_specification_1.0.pdf">last updated in 2012</a>, and while Stream supports the <a href="/magic-hdmi-cable/">newer SRT protocol</a>, creators need an option that works natively on the web and can more easily be integrated in native apps.</p><p><b>Developers</b> expect to be able to build using standard APIs that are built into web browsers and native apps.</p><p>In practice, RTMP can’t be used from a web browser, and creating a native app that supports RTMP broadcasting typically requires diving into lower-level programming languages like C and Rust. Only those with expertise in both live video protocols and these languages have full access to the tools needed to create novel live streaming client applications.</p>
    <div>
      <h3>We’re solving this by using new open WebRTC standards: WHIP and WHEP</h3>
      <a href="#were-solving-this-by-using-new-open-webrtc-standards-whip-and-whep">
        
      </a>
    </div>
    <p>WebRTC is the real-time communications protocol, supported across all web browsers, that powers video calling services like Zoom and Google Meet. Since inception it’s been designed for real-time, ultra low-latency communications.</p><p>While WebRTC is well established, for most of its history it’s lacked standards for:</p><ul><li><p><b>Ingestion</b> — how broadcasters should <b><i>send</i></b> media content (akin to RTMP today)</p></li><li><p><b>Egress</b> — how viewers request and <b><i>receive</i></b> media content (akin to DASH or HLS today)</p></li></ul><p>As a result, developers have had to implement this on their own, and client applications on both sides are often tightly coupled to provider-specific implementations. Developers we talk to often express frustration, having sunk months of engineering work into building around a specific vendor’s SDK, unable to switch without a significant rewrite of their client apps.</p><p>At Cloudflare, our mission is broader — we’re helping to build a better Internet. Today we’re launching not just a new feature of Cloudflare Stream, but a vote of confidence in new WebRTC standards for both ingestion and egress. We think you should be able to start using Stream without feeling locked into an SDK or implementation specific to Cloudflare, and we’re committed to using open standards whenever possible.</p><p>For ingestion, <a href="https://www.ietf.org/archive/id/draft-ietf-wish-whip-03.html">WHIP</a> is an IETF draft on the Standards Track, with many applications already successfully using it in production. For delivery (egress), <a href="https://www.ietf.org/id/draft-murillo-whep-00.html">WHEP</a> is an IETF draft with broad agreement. Combined, they provide a standardized end-to-end way to broadcast one-to-many over WebRTC at scale.</p><p><b>Cloudflare Stream is the first cloud service to let you both broadcast using WHIP and playback using WHEP — no vendor-specific SDK needed.</b> Here’s how it works:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/19Hq5GYMLCWmxGoKQBiifO/0cd900cb0c07cfac25f14c2486a3cb77/image2-44.png" />
            
            </figure><p>Cloudflare Stream is already built on top of the Cloudflare developer platform, using Workers and Durable Objects running on Cloudflare’s global network, within 50ms of 95% of the world’s Internet-connected population.</p><p>Our WebRTC implementation extends this to relay WebRTC video through our network. Broadcasters stream video using WHIP to the point of presence closest to their location, which tells the Durable Object where the live stream can be found. Viewers request streaming video from the point of presence closest to them, which asks the Durable Object where to find the stream, and video is routed through Cloudflare’s network, all with sub-second latency.</p><p>Using Durable Objects, we achieve this with zero centralized state. And just like the rest of Cloudflare Stream, you never have to think about regions, both in terms of pricing and product development.</p><p>While existing ultra low-latency streaming providers charge significantly more to stream over WebRTC, because Stream runs on Cloudflare’s global network, we’re able to offer WebRTC streaming at the same price as delivering video over HLS or DASH. We don’t think you should be penalized with higher pricing when choosing which technology to rely on to stream live video. Once generally available, WebRTC streaming will cost $1 per 1000 minutes of video delivered, just like the rest of Stream.</p>
    <div>
      <h3>What does sub-second latency let you build?</h3>
      <a href="#what-does-sub-second-latency-let-you-build">
        
      </a>
    </div>
    <p>Ultra low latency unlocks interactivity within your website or app, removing the time delay between creators, in-person attendees, and those watching remotely.</p><p>Developers we talk to are building everything from live sports betting, to live auctions, to live viewer Q&amp;A and even real-time collaboration in video post-production. Even streams without in-app interactivity can benefit from real-time — no sports fan wants to get a text from their friend at the game that ruins the moment, before they’ve had a chance to watch the final play. Whether you’re bringing an existing app or have a new idea in mind, we’re excited to see what you build.</p>
    <div>
      <h3>If you can write JavaScript, you can let your users go live from their browser</h3>
      <a href="#if-you-can-write-javascript-you-can-let-your-users-go-live-from-their-browser">
        
      </a>
    </div>
    <p>While hobbyist and professional creators might take the time to download and learn how to use an application like <a href="https://obsproject.com/">OBS Studio</a>, most Internet users won’t get past the friction of learning new tools and copying RTMP keys from one tool to another. To empower more people to go live, they need to be able to broadcast from within your website or app, just by enabling access to the camera and microphone.</p><p>Cloudflare Stream with WebRTC lets you build live streaming into your app as a front-end developer, without any special knowledge of video protocols. And our approach, using the WHIP and WHEP open standards, means you can do this with zero dependencies, using code that is 100% yours and under your control.</p>
    <div>
      <h3>Go live from a web browser with just a few lines of code</h3>
      <a href="#go-live-from-a-web-browser-with-just-a-few-lines-of-code">
        
      </a>
    </div>
    <p>You can go live right now, from your web browser, by creating a live input in the <a href="https://dash.cloudflare.com/?to=/:account/stream/inputs">Cloudflare Stream dashboard</a>, and pasting a URL into the example linked below.</p><p>Read the <a href="https://developers.cloudflare.com/stream/webrtc-beta/">docs</a> or <a href="https://workers.new/stream/webrtc">run the example code below in your browser using Stackblitz</a>.</p>
            <pre><code>&lt;video id="input-video" autoplay autoplay muted&gt;&lt;/video&gt;</code></pre>
            
            <pre><code>import WHIPClient from "./WHIPClient.js";

const url = "&lt;WEBRTC_URL_FROM_YOUR_LIVE_INPUT&gt;";
const videoElement = document.getElementById("input-video");
const client = new WHIPClient(url, videoElement);</code></pre>
            <p>This example uses a sample WHIP client, written in just 100 lines of JavaScript, using APIs that are native to web browsers, with zero dependencies. But because WHIP is an open standard, you can use any WHIP client you choose. Support for WHIP is growing across the video streaming industry — it has recently been added to <a href="https://gstreamer.freedesktop.org/">GStreamer</a>, and one of the authors of the WHIP specification has written a <a href="https://github.com/medooze/whip-js">JavaScript client implementation</a>. We intend to support the full <a href="https://www.ietf.org/archive/id/draft-ietf-wish-whip-03.html">WHIP specification</a>, including supporting <a href="https://www.rfc-editor.org/rfc/rfc8838">Trickle ICE</a> for fast NAT traversal.</p>
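For intuition about what such a client does, here is a minimal sketch of the WHIP publish flow using only browser APIs (the function name is ours; error handling and Trickle ICE are omitted):

```javascript
// Minimal sketch of a WHIP publish: send local tracks to a WHIP endpoint.
// Browser-only APIs (RTCPeerConnection, fetch); error handling and
// Trickle ICE omitted. The function name is illustrative.
async function whipPublish(url, mediaStream) {
  const pc = new RTCPeerConnection();

  // Ingest is one-way: send each local track, receive nothing back.
  for (const track of mediaStream.getTracks()) {
    pc.addTransceiver(track, { direction: "sendonly" });
  }

  // WHIP is just HTTP: POST the SDP offer to the endpoint...
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  const response = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/sdp" },
    body: pc.localDescription.sdp,
  });

  // ...and the response body is the server's SDP answer.
  await pc.setRemoteDescription({ type: "answer", sdp: await response.text() });
  return pc;
}
```

A call site would look like `whipPublish(url, await navigator.mediaDevices.getUserMedia({ video: true, audio: true }))`.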
    <div>
      <h3>Play a live stream in a browser, with sub-second latency, no SDK required</h3>
      <a href="#play-a-live-stream-in-a-browser-with-sub-second-latency-no-sdk-required">
        
      </a>
    </div>
    <p>Once you’ve started streaming, copy the playback URL from the live input you just created, and paste it into the example linked below.</p><p>Read the <a href="https://developers.cloudflare.com/stream/webrtc-beta/">docs</a> or <a href="https://workers.new/stream/webrtc">run the example code below in your browser using Stackblitz</a>.</p>
            <pre><code>&lt;video id="playback" controls autoplay muted&gt;&lt;/video&gt;</code></pre>
            
            <pre><code>import WHEPClient from './WHEPClient.js';
const url = "&lt;WEBRTC_PLAYBACK_URL_FROM_YOUR_LIVE_INPUT&gt;";
const videoElement = document.getElementById("playback");
const client = new WHEPClient(url, videoElement);</code></pre>
            <p>Just like the WHIP example before, this one uses an example WHEP client we’ve written that has zero dependencies. WHEP is an earlier-stage IETF draft than WHIP, <a href="https://www.ietf.org/id/draft-murillo-whep-00.html">published in July of this year</a>, but adoption is moving quickly. People in the community have already written open-source client implementations in both <a href="https://github.com/medooze/whip-js/blob/main/whep.js">JavaScript</a> and <a href="https://github.com/meetecho/simple-whep-client">C</a>, with more to come.</p>
    <div>
      <h3>Start experimenting with real-time live video, in open beta today</h3>
      <a href="#start-experimenting-with-real-time-live-video-in-open-beta-today">
        
      </a>
    </div>
    <p>WebRTC streaming is in open beta today, ready for you to use as an integrated feature of <a href="https://www.cloudflare.com/products/cloudflare-stream/">Cloudflare Stream</a>. Once Generally Available, WebRTC streaming will be priced like the rest of Cloudflare Stream, based on minutes of video delivered and minutes stored.</p><p><a href="https://developers.cloudflare.com/stream/webrtc-beta/">Read the docs</a> to get started.</p> ]]></content:encoded>
            <category><![CDATA[Birthday Week]]></category>
            <category><![CDATA[Cloudflare Stream]]></category>
            <category><![CDATA[Video]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Developers]]></category>
            <guid isPermaLink="false">5PQXX1PxT5vsDahi24H7Cn</guid>
            <dc:creator>Kyle Boutette</dc:creator>
            <dc:creator>Kenny Luong</dc:creator>
            <dc:creator>Brendan Irvine-Broque</dc:creator>
            <dc:creator>Jacob Curtis</dc:creator>
            <dc:creator>Rachel Chen</dc:creator>
            <dc:creator>Felipe Astroza Araya</dc:creator>
            <dc:creator>Renan Dincer</dc:creator>
        </item>
        <item>
            <title><![CDATA[Build real-time video and audio apps on the world’s most interconnected network]]></title>
            <link>https://blog.cloudflare.com/announcing-cloudflare-calls/</link>
            <pubDate>Tue, 27 Sep 2022 13:00:00 GMT</pubDate>
            <description><![CDATA[ We are announcing Cloudflare Calls, a new product that lets developers build real-time audio and video apps ]]></description>
            <content:encoded><![CDATA[ <p></p><p>In the last two years, there has been a rapid rise in real-time apps that help groups of people get together virtually with near-zero latency. User expectations have also increased: your users expect real-time video and audio features to work flawlessly. We found that developers building real-time apps want to spend less time building and maintaining low-level infrastructure. Developers also told us they want to spend more time building features that truly make their idea special.</p><p>So today, we are announcing a new product that lets developers build real-time audio/video apps. Cloudflare Calls exposes a set of APIs that allows you to build things like:</p><ul><li><p>A video conferencing app with a custom UI</p></li><li><p>An interactive conversation where the moderators can invite select audience members “on stage” as speakers</p></li><li><p>A privacy-first group workout app where only the instructor can view all the participants while the participants can only view the instructor</p></li><li><p>Remote 'fireside chats' where one or multiple people can have a video call with an audience of 10,000+ people in real time (&lt;100ms delay)</p></li></ul><p>The protocol that makes all this possible is WebRTC. And Cloudflare Calls is the product that abstracts away the complexity by turning the Cloudflare network into a “super peer,” helping you build reliable and secure real-time experiences.</p>
    <div>
      <h3>What is WebRTC?</h3>
      <a href="#what-is-webrtc">
        
      </a>
    </div>
    <p>WebRTC is a peer-to-peer protocol that enables two or more users’ devices to talk to each other <i>directly</i> and without leaving the browser. In a native implementation, peer-to-peer typically works well for 1:1 calls with only two participants. But as you add additional participants, it is common for participants to experience reliability issues, including video freezes and participants getting out of sync. Why? Because as the number of participants increases, the coordination overhead between users’ devices also increases. Each participant needs to send media to every other participant, so each device’s upload grows with the size of the call and the total number of media streams grows quadratically.</p><p>A selective forwarding unit (SFU) solves this problem. An SFU is a system that connects users with each other in real-time apps by intelligently managing and routing video and audio data between the participants. Apps that use an SFU reduce the data capacity required from each user, because each user doesn’t have to send data to every other user. SFUs are also required when an application needs to determine who is currently speaking, or to send each viewer video at the appropriate resolution using WebRTC simulcast.</p>
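The scaling difference can be made concrete with a few lines of arithmetic (illustrative helpers, not an API): in a full mesh each participant uploads a copy of its media to every other peer, while with an SFU each participant uploads exactly one copy.

```javascript
// Illustrative helpers (not an API): upstream copies of your media per
// participant in an n-person call, full mesh vs. through an SFU.
const meshUploadsPerPeer = (n) => n - 1; // one copy to every other peer
const sfuUploadsPerPeer = () => 1;       // one copy to the SFU, regardless of n

// Total streams crossing the network in a full mesh grows quadratically:
const meshTotal = (n) => n * (n - 1);

console.log(meshUploadsPerPeer(5), sfuUploadsPerPeer(), meshTotal(5)); // 4 1 20
```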
    <div>
      <h3>Beyond SFUs</h3>
      <a href="#beyond-sfus">
        
      </a>
    </div>
    <p>The centralized nature of an SFU is also its weakness. A centralized WebRTC server needs a region, which means that it will be slow in most parts of the world for most users while being fast in only a few select regions.</p><p>Typically, SFUs are built on public clouds. They consume a lot of bandwidth by both receiving and sending high-resolution media to many devices. And they come with significant devops overhead, requiring your team to manually configure regions and scaling.</p><p>We realized that merely offering an SFU-as-a-service wouldn’t solve the problem of cost and bandwidth efficiency.</p>
    <div>
      <h3>Biggest WebRTC server in the world</h3>
      <a href="#biggest-webrtc-server-in-the-world">
        
      </a>
    </div>
    <p>When you are on a five-person video call powered by a classic WebRTC implementation, each person’s device talks directly with the others. In WebRTC parlance, each of the five participants is called a <i>peer.</i> And the reliability of the five-person call will only be as good as the reliability of the person (or peer) with the weakest Internet connection.</p><p>We built Calls with a simple premise: <i>“What if Cloudflare could act as a WebRTC peer?”</i> Calls is a “super peer”, a giant server that spans the whole world, which allows applications to be built beyond the limitations of the lowest-common-denominator peer or a centralized SFU. Developers can focus on the strengths of their app instead of trying to compensate for the weaknesses of the weakest peer in a p2p topology.</p><p>Calls does not use the traditional SFU topology, where every participant connects to a centralized server in a single location. Instead, each participant connects to their local Cloudflare data center. When another participant wants to retrieve that media, the datacenter that is home to the original media stream is found and the tracks are forwarded between datacenters automatically. If two participants are physically close, their media does not travel around the world to a centralized region; instead, they use the same datacenter, greatly reducing latency and improving reliability.</p><p>Calls is a configurable, global, regionless WebRTC server that is the size of Cloudflare's ever-growing network. The WebRTC protocol enables peers to send and receive <i>media tracks.</i> When you are on a video call, your computer is typically sending <i>two</i> tracks: one that contains the audio of you speaking and another that contains the video stream from your camera. Calls implements the WebRTC <a href="https://developer.mozilla.org/en-US/docs/Web/API/RTCPeerConnection">RTCPeerConnection</a> API across the Cloudflare Network where users can push media tracks. 
Calls also exposes an API where other media tracks can be requested within the same Peer Connection context.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/55LIoL4VrFNTx2wTimkShk/473c3054a27deac146389dc237c45a08/image2-41.png" />
            
            </figure><p>Cloudflare Calls will be a good fit if you operate your own WebRTC server, such as Janus or MediaSoup, and it can replace existing deployments of Janus or MediaSoup, especially in cases where you have clients connecting globally to a single, centralized deployment.</p>
    <div>
      <h2>Region: Earth</h2>
      <a href="#region-earth">
        
      </a>
    </div>
    <p>Building and maintaining your own real-time infrastructure comes with unique architecture and scaling challenges. It requires you to answer, and constantly revise your answers to, thorny questions such as <i>“which regions do we support?”</i>, <i>“how many users do we need to justify spinning up more infrastructure in yet another cloud region?”</i>, <i>“how do we scale for unplanned spikes in usage?”</i> and <i>“how do we avoid losing money during the low-usage hours of our infrastructure?”</i></p><p>Cloudflare Calls eliminates the need to answer these questions. Calls uses <a href="https://www.cloudflare.com/learning/cdn/glossary/anycast-network/">anycast</a> for every connection, so every packet is always routed to the closest Cloudflare location. It is global by nature: your users are automatically served from a location close to them. Calls scales with your usage, and your team doesn’t have to build its own auto-scaling logic.</p><p>Calls runs in every Cloudflare location and on every single Cloudflare server. Because the Cloudflare network is within 10 milliseconds of 90% of the world’s population, it does not add any noticeable latency.</p>
    <div>
      <h2>Answer “where’s the problem?”, only faster</h2>
      <a href="#answer-wheres-the-problem-only-faster">
        
      </a>
    </div>
    <p>When we talk to customers with existing WebRTC workloads, there is one consistent theme: customers wish it were easier to troubleshoot issues. When a group of people is talking over a video call, the stakes are much higher when users experience issues. When a web page fails to load, it is common for users to simply retry after a few minutes. When a video call is disrupted, it is often the end of the call.</p><p>Cloudflare Calls’ focus on observability will help customers get to the bottom of issues faster. Because Calls is built on Cloudflare’s infrastructure, we have end-to-end visibility across all layers of the OSI model.</p><p>Calls provides a server-side view of the <a href="https://developer.mozilla.org/en-US/docs/Web/API/WebRTC_Statistics_API">WebRTC Statistics API</a>, so you can drill into issues with each Peer Connection and the flow of media within it, without depending only on data sent from clients. We chose this because the Statistics API is a standardized place developers are used to getting information about their experience. It is the same API available in browsers, and you might already be using it today to gain insight into the performance of your WebRTC connections.</p>
    <div>
      <h3>Privacy and security at the core</h3>
      <a href="#privacy-and-security-at-the-core">
        
      </a>
    </div>
    <p>Calls eliminates the need for participants to share information such as their IP address with each other. Let’s say you are building an app that connects therapists and patients via video calls. With a traditional WebRTC implementation, both the patient’s and therapist’s devices would talk directly with each other, leading to exposure of potentially sensitive data such as the IP address. Exposure of information such as the IP address can leave your users vulnerable to denial-of-service attacks.</p><p>When using Calls, you are still using WebRTC, but the individual participants are connecting to the Cloudflare network. If four people are on a video call powered by Cloudflare Calls, each of the four participants' devices will be talking only with the Cloudflare network. To your end users, the experience will feel just like a peer-to-peer call, only with added security and privacy upside.</p><p>Finally, all video and audio traffic that passes through Cloudflare Calls is encrypted by default. Calls leverages existing Cloudflare products, including Argo, to route the video and audio content in a secure and efficient manner. The Calls API enables granular controls that cannot be implemented with vanilla WebRTC alone. When you build using Calls, you are only limited by your imagination, not the technology.</p>
    <div>
      <h3>What’s next</h3>
      <a href="#whats-next">
        
      </a>
    </div>
    <p>We’re releasing Cloudflare Calls in closed beta today. To try out Cloudflare Calls, <a href="https://www.cloudflare.com/cloudflare-calls-signup-page">request an invitation</a> and check your inbox in the coming weeks. Calls will be free during the beta period. We're looking to work with early customers who want to take Calls from beta to generally available with us. If you are building a real-time video app today, having challenges scaling traditional WebRTC infrastructure, or just have a great idea you want to explore, <a href="https://www.cloudflare.com/cloudflare-calls-signup-page">leave a comment</a> when you are requesting an invitation, and we’ll reach out.</p> ]]></content:encoded>
            <category><![CDATA[Birthday Week]]></category>
            <category><![CDATA[WebRTC]]></category>
            <category><![CDATA[Video]]></category>
            <category><![CDATA[Product News]]></category>
            <guid isPermaLink="false">4PEbQjLrYQwj2Hj7O4b3ah</guid>
            <dc:creator>Zaid Farooqui</dc:creator>
            <dc:creator>Renan Dincer</dc:creator>
        </item>
        <item>
            <title><![CDATA[Stream Live is now Generally Available]]></title>
            <link>https://blog.cloudflare.com/stream-live-ga/</link>
            <pubDate>Wed, 21 Sep 2022 13:15:00 GMT</pubDate>
            <description><![CDATA[ Stream Live is now out of beta, available to everyone, and ready for production traffic at scale ]]></description>
            <content:encoded><![CDATA[ <p>Today, we’re excited to announce that Stream Live is out of beta, available to everyone, and ready for production traffic at scale. Stream Live is a feature of <a href="https://www.cloudflare.com/products/cloudflare-stream/">Cloudflare Stream</a> that allows developers to build live video features in websites and native apps.</p><p>Since its <a href="/stream-live/">beta launch</a>, developers have used Stream to broadcast live concerts from some of the world’s most popular artists directly to fans, build brand-new video creator platforms, operate a global 24/7 live OTT service, and more. While in beta, Stream has ingested millions of minutes of live video and delivered it to viewers all over the world.</p><p><b>Bring your big live events, ambitious new video subscription service, or the next mobile video app with millions of users — we’re ready for it.</b></p>
    <div>
      <h2>Streaming live video at scale is hard</h2>
      <a href="#streaming-live-video-at-scale-is-hard">
        
      </a>
    </div>
    <p><b>Live video uses a massive amount of bandwidth.</b> For example, a one-hour live stream at 1080p at 8Mbps is 3.6GB. At <a href="/aws-egregious-egress/">typical cloud provider egress prices</a>, even a little egress can break the bank.</p><p><b>Live video must be encoded on-the-fly, in real-time.</b> People expect to be able to watch live video on their phone, while connected to mobile networks with less bandwidth, higher latency and frequently interrupted connections. To support this, live video must be re-encoded in real-time into multiple resolutions, allowing someone’s phone to drop to a lower resolution and continue playback. This can be complex (Which bitrates? Which codecs? How many?) and costly: running a fleet of virtual machines doesn’t come cheap.</p><p><b>Ingest location matters</b> — Live streaming protocols like RTMPS send video over TCP. If a single packet is dropped or lost, the entire connection is brought to a halt while the packet is found and re-transmitted. This is known as “head of line blocking”. The further away the broadcaster is from the ingest server, the more network hops, and the more likely packets will be dropped or lost, ultimately resulting in latency and buffering for viewers.</p><p><b>Delivery location matters</b> — Live video must be cached and served from points of presence as close to viewers as possible. The longer the network round trips, the more likely videos will buffer or drop to a lower quality.</p><p><b>Broadcasting protocols are in flux</b> — The most widely used protocol for streaming live video, RTMPS, has been abandoned since 2012, and dates back to the era of Flash video in the early 2000s. A new emerging standard, SRT, is not yet supported everywhere. And WebRTC has only recently evolved into an option for high definition one-to-many broadcasting at scale.</p><p>The old way to solve this has been to stitch together separate cloud services from different vendors. 
One vendor provides excellent content delivery, but no encoding. Another provides APIs or hardware to encode, but leaves you to fend for yourself and build your own storage layer. As a developer, you have to learn, manage and write a layer of glue code around the esoteric details of video streaming protocols, codecs, encoding settings and delivery pipelines.</p>
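<p>The egress arithmetic above is easy to check. A minimal sketch (not from the original post), using decimal units:</p>

```javascript
// Egress for a live stream: bitrate (megabits/s) × duration, converted to gigabytes.
// Decimal units: 8 bits per byte, 1000 MB per GB.
function egressGB(bitrateMbps, hours) {
  const megabits = bitrateMbps * hours * 3600; // seconds of streaming × bitrate
  return megabits / 8 / 1000;
}

console.log(egressGB(8, 1)); // one hour at 8 Mbps → 3.6 GB, matching the figure above
```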
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7epyhfBFSlN7AMdqROyaNE/3a8b0ad09f7f60f6438ace8adf8d1141/image4-8.png" />
            
            </figure><p>We built Stream Live to make streaming live video easy, like adding an HTML tag to a website. Live video is now a fundamental building block of Internet content, and we think any developer should have the tools to add it to their website or native app.</p><p>With Stream, you or your users stream live video directly to Cloudflare, and Cloudflare delivers video directly to your viewers. You never have to manage internal encoding, storage, or delivery systems — it’s just live video in and live video out.</p>
    <div>
      <h2>Our network, our hardware = a solution only Cloudflare can provide</h2>
      <a href="#our-network-our-hardware-a-solution-only-cloudflare-can-provide">
        
      </a>
    </div>
    <p>We’re not the only ones building APIs for live video — but we are the only ones with our <b><i>own</i></b> global network and hardware that we control and optimize for video. That lets us do things that others can’t, like <a href="/magic-hdmi-cable/">sub-second glass-to-glass latency</a> using RTMPS and SRT playback at scale.</p><p>Newer video codecs require specialized hardware encoders, and while others are beholden to the hardware limitations of public cloud providers, we’re already hard at work installing the latest encoding hardware in our own racks, so that you can deliver high resolution video with even less bandwidth. Our goal is to make what is otherwise only available to video giants available directly to you — stay tuned for some exciting updates on this next week.</p><p>Most providers limit how many destinations you can restream or simulcast to. Because we operate our own network, we’ve never even worried about this, and let you <a href="https://developers.cloudflare.com/stream/stream-live/simulcasting/">restream to as many destinations as you need</a>.</p><p>Operating our own network lets us price Stream based on minutes of video delivered — unlike others, we don’t pay someone else for bandwidth and then pass along their costs to you at a markup. The status quo of charging for bandwidth or per-GB storage penalizes you for delivering or storing high resolution content. If you ask why a few times, most of the time you’ll discover that others are pushing their own cost structures on to you.</p><p>Encoding video is compute-intensive, delivering video is bandwidth intensive, and location matters when ingesting live video. When you use Stream, you don't need to worry about optimizing performance, finding a CDN, and/or tweaking configuration endlessly. Stream takes care of this for you.</p>
    <div>
      <h2>Free your live video from the business models of big platforms</h2>
      <a href="#free-your-live-video-from-the-business-models-of-big-platforms">
        
      </a>
    </div>
    <p>Nearly every business uses live video, whether to engage with customers, broadcast events or monetize live content. But few have the specialized engineering resources to deliver live video at scale on their own, or to wire together multiple low-level cloud services. To date, many of the largest content creators have been forced to depend on a shortlist of social media apps and streaming services to deliver live content at scale.</p><p>Unlike the incumbent platforms, which force you to put your live video in <i>their</i> apps and services and fit <i>their</i> business models, Stream gives you full control of your live video, on <i>your</i> website or app, on any device, at scale, without pushing your users to someone else’s service.</p>
    <div>
      <h2>Free encoding. Free ingestion. Free analytics. Simple per-minute pricing.</h2>
      <a href="#free-encoding-free-ingestion-free-analytics-simple-per-minute-pricing">
        
      </a>
    </div>
    
<table>
<thead>
  <tr>
    <th></th>
    <th><span>Others</span></th>
    <th><span>Stream</span></th>
  </tr>
</thead>
<tbody>
  <tr>
    <td><span>Encoding</span></td>
    <td><span>$ per minute</span></td>
    <td><span>Free</span></td>
  </tr>
  <tr>
    <td><span>Ingestion</span></td>
    <td><span>$ per GB</span></td>
    <td><span>Free</span></td>
  </tr>
  <tr>
    <td><span>Analytics</span></td>
    <td><span>Separate product</span></td>
    <td><span>Free</span></td>
  </tr>
  <tr>
    <td><span>Live recordings</span></td>
    <td><span>Minutes or hours later</span></td>
    <td><span>Instant</span></td>
  </tr>
  <tr>
    <td><span>Storage</span></td>
    <td><span>$ per GB </span></td>
    <td><span>$ per minute stored</span></td>
  </tr>
  <tr>
    <td><span>Delivery</span></td>
    <td><span>$ per GB</span></td>
    <td><span>$ per minute delivered</span></td>
  </tr>
</tbody>
</table><p>Other platforms charge for ingestion and encoding. Many even force you to consider where you’re streaming to and from, the bitrate and frames per second of your video, and even which of their datacenters you’re using.</p><p><b>With Stream, encoding and ingestion are free.</b> Other platforms charge for delivery based on bandwidth, penalizing you for delivering high quality video to your viewers. If you stream at a high resolution, you pay more.</p><p><b>With Stream, you don’t pay a penalty for delivering high resolution video.</b> Stream’s pricing is simple — minutes of video delivered and stored. Because you pay per minute, not per gigabyte, you can stream at the ideal resolution for your viewers without worrying about bandwidth costs.</p><p>Other platforms charge separately for analytics, requiring you to buy another product to get metrics about your live streams.</p><p><b>With Stream, analytics are free.</b> Stream provides an <a href="https://developers.cloudflare.com/stream/getting-analytics/fetching-bulk-analytics/">API</a> and <a href="https://dash.cloudflare.com/?to=/:account/stream/analytics">Dashboard</a> for both server-side and client-side analytics that can be queried on a per-video, per-creator, or per-country basis, and more. You can use analytics to identify which creators in your app have the most viewed live streams, inform how much to bill your customers for their own usage, identify where content is going viral, and more.</p><p>Other platforms tack on live recordings or DVR mode as a separate add-on feature, and recordings only become available minutes or even hours after a live stream ends.</p><p><b>With Stream, live recordings are a built-in feature, made available</b> <a href="https://developers.cloudflare.com/stream/stream-live/watch-live-stream/#replaying-recordings"><b>instantly after a live stream ends</b></a><b>.</b> Once a recording is available, it works just like any other video uploaded to Stream, letting you seamlessly use the same APIs for managing both pre-recorded and live content.</p>
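<p>To make the per-gigabyte versus per-minute contrast concrete, here is a small sketch. The $0.05/GB rate is purely illustrative (no vendor's actual price); only the per-minute model reflects the approach described above:</p>

```javascript
// Per-GB billing scales with bitrate; per-minute billing does not.
// The $/GB and $/1,000-minute rates below are illustrative placeholders.
function perGBCost(minutes, bitrateMbps, dollarsPerGB) {
  const gb = (bitrateMbps * minutes * 60) / 8 / 1000; // decimal units
  return gb * dollarsPerGB;
}

function perMinuteCost(minutes, dollarsPer1000Minutes) {
  return (minutes / 1000) * dollarsPer1000Minutes;
}

// Doubling the bitrate doubles a bandwidth-based bill…
console.log(perGBCost(1000, 4, 0.05)); // ≈ $1.50 at 4 Mbps
console.log(perGBCost(1000, 8, 0.05)); // ≈ $3.00 at 8 Mbps
// …but a per-minute bill is the same at any resolution.
console.log(perMinuteCost(1000, 1));
```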
    <div>
      <h2>Build live video into your website or app in minutes</h2>
      <a href="#build-live-video-into-your-website-or-app-in-minutes">
        
      </a>
    </div>
    
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/HKXLvVWCAI0VaDcIf4JN5/97f176769b109ffae59440f0f3073f8b/image1-26.png" />
            
            </figure><p>Cloudflare Stream enables you or your users to go live using the same protocols and tools that broadcasters big and small use to go live to YouTube or Twitch, but gives you full control over access and presentation of live streams.</p>
    <div>
      <h3>Step 1: Create a live input</h3>
      <a href="#step-1-create-a-live-input">
        
      </a>
    </div>
    <p><a href="https://dash.cloudflare.com/?to=/:account/stream/inputs/create">Create a new live input from the Stream Dashboard</a> or use the Stream API:</p><p><b>Request</b></p>
            <pre><code>curl -X POST \
-H "Authorization: Bearer &lt;YOUR_API_TOKEN&gt;" \
-d '{"recording": { "mode": "automatic" } }' \
https://api.cloudflare.com/client/v4/accounts/&lt;YOUR_CLOUDFLARE_ACCOUNT_ID&gt;/stream/live_inputs</code></pre>
            <p><b>Response</b></p>
            <pre><code>{
  "result": {
    "uid": "&lt;UID_OF_YOUR_LIVE_INPUT&gt;",
    "rtmps": {
      "url": "rtmps://live.cloudflare.com:443/live/",
      "streamKey": "&lt;PRIVATE_RTMPS_STREAM_KEY&gt;"
    },
    ...
  }
}</code></pre>
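<p>Wiring the response into a broadcaster is just a matter of pulling out the two RTMPS fields. A minimal sketch, where the UID and key values are placeholders rather than real credentials:</p>

```javascript
// Extract the ingest endpoint from a live_inputs API response.
// The object shape mirrors the example response; values are placeholders.
const response = {
  result: {
    uid: "EXAMPLE_UID",
    rtmps: {
      url: "rtmps://live.cloudflare.com:443/live/",
      streamKey: "EXAMPLE_STREAM_KEY",
    },
  },
};

const { url, streamKey } = response.result.rtmps;

// OBS takes the server URL and stream key as separate fields; some tools
// instead want them concatenated into a single ingest URL.
const ingestUrl = url + streamKey;
console.log(ingestUrl); // rtmps://live.cloudflare.com:443/live/EXAMPLE_STREAM_KEY
```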
            
    <div>
      <h3>Step 2: Use the RTMPS key with any live broadcasting software, or in your own native app</h3>
      <a href="#step-2-use-the-rtmps-key-with-any-live-broadcasting-software-or-in-your-own-native-app">
        
      </a>
    </div>
    <p>Copy the RTMPS URL and key, and use them with your live streaming application. We recommend using <a href="https://obsproject.com/">Open Broadcaster Software (OBS)</a> to get started, but any RTMPS or SRT compatible software should be able to interoperate with Stream Live.</p><p>Enter the Stream RTMPS URL and the Stream Key from Step 1:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/6s7amxQ54W9Ik3FqA8SttE/e6a73200ca353bb12b077d2bef2ad685/image5-7.png" />
            
            </figure>
    <div>
      <h3>Step 3: Preview your live stream in the Cloudflare Dashboard</h3>
      <a href="#step-3-preview-your-live-stream-in-the-cloudflare-dashboard">
        
      </a>
    </div>
    <p>In the Stream Dashboard, within seconds of going live, you will see a preview of what your viewers will see, along with the real-time connection status of your live stream.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4KwHTeXov4qVK1WDYiEBLS/c46f5276ac6bacdb8aa29a0bd1388903/Screen-Shot-2022-09-21-at-12.34.35-PM.png" />
            
            </figure>
    <div>
      <h3>Step 4: Add live video playback to your website or app</h3>
      <a href="#step-4-add-live-video-playback-to-your-website-or-app">
        
      </a>
    </div>
    <p>Stream your video using our <a href="https://developers.cloudflare.com/stream/viewing-videos/using-the-stream-player/">Stream Player embed code</a>, or use <a href="https://developers.cloudflare.com/stream/viewing-videos/using-own-player/">any video player that supports HLS or DASH</a> — live streams can be played on websites and in native iOS and Android apps.</p><p>For example, on iOS, all you need to do is provide AVPlayer with the URL to the HLS manifest for your live input, which you can find <a href="https://developers.cloudflare.com/stream/stream-live/watch-live-stream/">via the API</a> or in the <a href="https://dash.cloudflare.com/?to=/:account/stream">Stream Dashboard</a>.</p>
            <pre><code>import SwiftUI
import AVKit

struct MyView: View {
    // Change the url to the Cloudflare Stream HLS manifest URL
    private let player = AVPlayer(url: URL(string: "https://customer-9cbb9x7nxdw5hb57.cloudflarestream.com/8f92fe7d2c1c0983767649e065e691fc/manifest/video.m3u8")!)

    var body: some View {
        VideoPlayer(player: player)
            .onAppear() {
                player.play()
            }
    }
}

struct MyView_Previews: PreviewProvider {
    static var previews: some View {
        MyView()
    }
}</code></pre>
            <p>To run a complete example app in Xcode, follow <a href="https://developers.cloudflare.com/stream/examples/ios/">this guide</a> in the Stream Developer docs.</p>
    <div>
      <h2>Companies are building whole new video platforms on top of Stream</h2>
      <a href="#companies-are-building-whole-new-video-platforms-on-top-of-stream">
        
      </a>
    </div>
    <p>Developers want control, but most don’t have time to become video experts. And even video experts building innovative new platforms don’t want to manage live streaming infrastructure.</p><p>Switcher Studio's whole business is live video: their iOS app allows creators and businesses to produce their own branded, multi-camera live streams. Switcher uses Stream as an essential part of their live streaming infrastructure. In their own words:</p><blockquote><p><i>“Since 2014, Switcher has helped creators connect to audiences with livestreams. Now, our users create over 100,000 streams per month. As we grew, we needed a scalable content delivery solution. Cloudflare offers secure, fast delivery, and even allowed us to offer new features, like multistreaming. Trusting Cloudflare Stream lets our team focus on the live production tools that make Switcher unique."</i></p></blockquote><p>While Stream Live has been in beta, we’ve worked with many customers like Switcher, where live video isn’t just one product feature, it <b><i>is</i></b> the core of their product. Even as experts in live video, they choose to use Stream, so that they can focus on the unique value they create for their customers, leaving the infrastructure of ingesting, encoding, recording and delivering live video to Cloudflare.</p>
    <div>
      <h2>Start building live video into your website or app today</h2>
      <a href="#start-building-live-video-into-your-website-or-app-today">
        
      </a>
    </div>
    <p>It takes just a few minutes to sign up and start your first live stream using the Cloudflare Dashboard, with no code required to get started, and there are <a href="https://developers.cloudflare.com/stream/">APIs</a> for when you’re ready to start building your own live video features. <a href="https://dash.cloudflare.com/?to=/:account/stream">Give it a try</a> — we’re ready for you, no matter the size of your audience.</p> ]]></content:encoded>
            <category><![CDATA[GA Week]]></category>
            <category><![CDATA[General Availability]]></category>
            <category><![CDATA[Cloudflare Stream]]></category>
            <category><![CDATA[Video]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Developers]]></category>
            <guid isPermaLink="false">2E56cHHAs6xh0x5bjlqC9U</guid>
            <dc:creator>Brendan Irvine-Broque</dc:creator>
            <dc:creator>Kyle Boutette</dc:creator>
            <dc:creator>Mickie Betz</dc:creator>
        </item>
        <item>
            <title><![CDATA[Closed Caption support coming to Stream Live]]></title>
            <link>https://blog.cloudflare.com/stream-live-captions/</link>
            <pubDate>Fri, 13 May 2022 12:59:15 GMT</pubDate>
            <description><![CDATA[ Soon, Stream will auto-detect embedded captions and include them in the live stream delivered to your viewers ]]></description>
            <content:encoded><![CDATA[ <p>Building inclusive technology is core to the Cloudflare mission. Cloudflare Stream has supported captions for on-demand videos for several years. Soon, Stream will auto-detect embedded captions and include them in the live stream delivered to your viewers.</p><p>Thousands of Cloudflare customers use the Stream product to build video functionality into their apps. With live caption support, Stream customers can better serve their users with a more comprehensive viewing experience.</p>
    <div>
      <h3>Enabling Closed Captions in Stream Live</h3>
      <a href="#enabling-closed-captions-in-stream-live">
        
      </a>
    </div>
    <p>Stream Live scans for CEA-608 and CEA-708 captions in incoming live streams ingested via SRT and RTMPS.  Assuming the live streams you are pushing to Cloudflare Stream contain captions, you don’t have to do anything further: the captions will simply get included in the manifest file.</p>
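<p>As a rough illustration of what “included in the manifest file” means, a standard HLS master playlist advertises embedded captions as a CLOSED-CAPTIONS rendition along these lines. This is a generic HLS sketch; the group ID, names, and variant URI are illustrative, not Stream's exact output:</p>

```
#EXTM3U
#EXT-X-MEDIA:TYPE=CLOSED-CAPTIONS,GROUP-ID="cc",NAME="English",LANGUAGE="en",INSTREAM-ID="CC1"
#EXT-X-STREAM-INF:BANDWIDTH=8000000,RESOLUTION=1920x1080,CLOSED-CAPTIONS="cc"
1080p/video.m3u8
```

<p>A player that understands HLS sees the rendition and can offer a captions toggle without any extra work on your part.</p>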
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/oV2EUCKfVZe2OgGxO0o3s/d4f202b76449027e7ec774e3f327620f/image3-23.png" />
            
            </figure><p>If you are using the Stream Player, these captions will be rendered by the Stream Player. If you are using your own player, you simply have to configure your player to display captions.  </p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3qguZaQmzwoCcAbXp5llE9/9788316cc5a72b0df6931422e7b0ddeb/image2-30.png" />
            
            </figure><p>Currently, Stream Live supports captions for a single language during the live event. While the support for captions is limited to one language <i>during</i> the live stream, you can upload captions for multiple languages once the event completes and the live event becomes an on-demand video.</p>
    <div>
      <h3>What is CEA-608 and CEA-708?</h3>
      <a href="#what-is-cea-608-and-cea-708">
        
      </a>
    </div>
    <p>When captions were first introduced in 1973, they were <i>open</i> captions. This means the captions were literally overlaid on top of the picture in the video and therefore could not be turned off. In 1982, we saw the introduction of <i>closed</i> captions during live television. Captions were no longer imprinted on the video and were instead passed via a separate feed and rendered on the video by the television set.</p><p>CEA-608 (also known as Line 21) and CEA-708 are well-established standards used to transmit captions. CEA-708 is a modern iteration of CEA-608, offering support for nearly every language and for text positioning, something not supported with CEA-608.</p>
    <div>
      <h3>Availability</h3>
      <a href="#availability">
        
      </a>
    </div>
    <p>Live caption support will be available in closed beta next month. To request access, <a href="https://www.cloudflare.com/closed-caption-live-stream-private-beta/">sign up for the closed beta</a>.</p><p>Including captions in any video stream is critical to making your content more accessible. For example, the majority of live events are watched on mute, which increases the value of captions. While Stream Live does not generate live captions yet, we plan to build support for automatic live captions in the future.</p> ]]></content:encoded>
            <category><![CDATA[Platform Week]]></category>
            <category><![CDATA[Cloudflare Stream]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Video]]></category>
            <guid isPermaLink="false">10MfzmJyCzaEh070t4CqBU</guid>
            <dc:creator>Mickie Betz</dc:creator>
            <dc:creator>Zaid Farooqui</dc:creator>
        </item>
        <item>
            <title><![CDATA[Stream with sub-second latency is like a magical HDMI cable to the cloud]]></title>
            <link>https://blog.cloudflare.com/magic-hdmi-cable/</link>
            <pubDate>Fri, 13 May 2022 12:59:12 GMT</pubDate>
            <description><![CDATA[ Starting today, in open beta, Cloudflare Stream supports video playback with sub-second latency over SRT or RTMPS at scale ]]></description>
            <content:encoded><![CDATA[ <p>Starting today, in open beta, <a href="https://www.cloudflare.com/products/cloudflare-stream/">Cloudflare Stream</a> supports video playback with sub-second latency over SRT or RTMPS at scale. Just like <a href="https://www.cloudflare.com/learning/video/what-is-http-live-streaming/">HLS</a> and DASH formats, playback over RTMPS and SRT costs $1 per 1,000 minutes delivered, regardless of the video encoding settings used.</p><p>Stream is like a magic HDMI cable to the cloud. You can easily connect a video stream and display it on as many screens as you want, wherever you want, around the world.</p>
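<p>At $1 per 1,000 minutes delivered, cost depends only on watch time, which makes it easy to estimate. A quick sketch of the arithmetic (the viewer counts are made-up inputs, not figures from this post):</p>

```javascript
// Delivery cost at $1 per 1,000 minutes delivered (the rate stated above),
// independent of codec, resolution, or bitrate.
function deliveryCost(viewers, minutesEach, dollarsPer1000Min = 1) {
  const minutesDelivered = viewers * minutesEach;
  return (minutesDelivered / 1000) * dollarsPer1000Min;
}

// 100 viewers watching a 60-minute broadcast → 6,000 minutes delivered → $6.
console.log(deliveryCost(100, 60));
```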
    <div>
      <h3>What do we mean by sub-second?</h3>
      <a href="#what-do-we-mean-by-sub-second">
        
      </a>
    </div>
    <p>Video latency is the time it takes from when a camera sees something happen live to when viewers of a broadcast see the same thing happen on their screens. Although we like to think what’s on TV is happening simultaneously in the studio and your living room, this is not the case. Often, cable TV takes five seconds to reach your home.</p><p>On the Internet, the range of latencies across different services varies widely, from multiple minutes down to a few seconds or less. Live streaming technologies like HLS and DASH, used by the most common video streaming websites, typically offer 10 to 30 seconds of latency, and this is what you can achieve with <a href="/stream-live/">Stream Live</a> today. However, this range does not feel natural for quite a few use cases where the viewers interact with the broadcasters. Imagine a text chat next to an esports live stream or a Q&amp;A session in a remote webinar. These new ways of interacting with the broadcast won’t work with the typical latencies the industry is used to. You need one to two seconds <i>at most</i> to achieve the feeling that the viewer is in the same room as the broadcaster.</p><p>We expect Cloudflare Stream to deliver sub-second latencies reliably in most parts of the world by routing the video as much as possible within the Cloudflare network. For example, when you’re sending video from San Francisco on your Comcast home connection, the video travels directly to the nearest point where Comcast and Cloudflare connect, for example, San Jose. Whenever a viewer joins, say from Austin, the viewer connects to the Cloudflare location in Dallas, which then establishes a connection using the Cloudflare backbone to San Jose. This setup avoids unreliable long-distance connections and allows Cloudflare to monitor the reliability and latency of the video all the way from the broadcaster’s last mile to the viewer’s last mile.</p>
    <div>
      <h3>Serverless, dynamic topology</h3>
      <a href="#serverless-dynamic-topology">
        
      </a>
    </div>
    <p>With Cloudflare Stream, the latency of content from the source to the destination is purely dependent on the physical distance between them: with no centralized routing, each Cloudflare location talks to other Cloudflare locations and they share the video among each other. This results in the minimum possible latency regardless of the locale you are broadcasting from.</p><p>We’ve tested about 500ms of glass-to-glass latency from San Francisco to London, both from and to residential networks. If both the broadcaster and the viewers were in California, this number would be lower, simply because the video has less distance to travel. An early tester was able to achieve 300ms of latency by broadcasting using OBS via RTMPS to Cloudflare Stream and pulling down that content over SRT using ffplay.</p>
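<p>A back-of-the-envelope check on why distance matters: signals in fiber travel at roughly two-thirds the speed of light, so the one-way propagation floor for a path can be estimated directly. The ~8,600 km figure below is an approximate San Francisco–London great-circle distance (my assumption, not a number from the post):</p>

```javascript
// One-way propagation floor over fiber, where signals travel at ~2/3 of c
// (≈ 200,000 km/s). Everything above this floor is encoding, buffering,
// routing, and last-mile variability.
function fiberDelayMs(pathKm) {
  const fiberKmPerSec = 200000;
  return (pathKm * 1000) / fiberKmPerSec;
}

// ≈ 43 ms one way for ~8,600 km, a small slice of the ~500 ms
// glass-to-glass figure measured above.
console.log(fiberDelayMs(8600));
```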
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2X9Jnr7Cj20i62zOz2AS6u/7a672eb97ecd0991a3ad7adc937738a1/RTMPS.png" />
            
            </figure><p>Any server in the Cloudflare Anycast network can receive and publish low-latency video, which means that you're automatically broadcasting to the nearest server with no configuration necessary. To minimize latency and avoid network congestion, we route video traffic between broadcaster and audience servers using the same network telemetry as <a href="https://www.cloudflare.com/products/argo-smart-routing/">Argo</a>.</p><p>On top of this, we construct a dynamic distribution topology, unique to the stream, which grows to meet the capacity needs of the broadcast. We’re just getting started with low-latency video, and we will continue to focus on latency and playback reliability as our real-time video features grow.</p>
    <div>
      <h3>An HDMI cable to the cloud</h3>
      <a href="#an-hdmi-cable-to-the-cloud">
        
      </a>
    </div>
    <p>Most video on the Internet is delivered over HTTP, the same protocol your browser uses to load websites. This has many advantages, such as interoperability across viewer devices. Perhaps more importantly, HTTP can use existing infrastructure like caches, which reduces the cost of video delivery.</p><p>Using HTTP has a cost in latency, however, because it is not a protocol built to deliver video. There have been many attempts to deliver low-latency video over HTTP, some reducing latency to a few seconds, but none reach the levels achievable by protocols designed with video in mind. WebRTC and video delivery over QUIC have the potential to further reduce latency, but face inconsistent support across platforms today.</p><p>Video-oriented protocols, such as RTMPS and SRT, side-step some of these challenges but often require custom client libraries and are not available in modern web browsers. While we support low-latency video over RTMPS and SRT today, we are actively exploring other delivery protocols.</p><p>There’s no silver bullet yet, and our goal is to make video delivery as easy as possible by supporting the set of protocols that enables our customers to meet their unique and creative needs. Today that can mean receiving RTMPS and delivering low-latency SRT, or ingesting SRT while publishing HLS. In the future, that may include ingesting WebRTC or publishing over QUIC, HTTP/3 or WebTransport. There are many interesting technologies on the <a href="https://grnh.se/4bdb03661us">horizon</a>.</p><p>We’re excited to see new use cases emerge as low-latency video becomes easier to integrate and less costly to manage. A remote cycling instructor can ask her students to slow down in response to an increase in heart rate; an esports league can effortlessly repeat their live feed to remote broadcasters to provide timely, localized commentary while interacting with their audience.</p>
    <div>
      <h3>Creative uses of low latency video</h3>
      <a href="#creative-uses-of-low-latency-video">
        
      </a>
    </div>
    <p>The viewer experience at events like concerts and sporting events can be augmented with live video delivered in real time to participants’ phones. This way they can experience the event as it happens and see the goal scored, or the details of what’s happening on the stage.</p><p>In big cities, you can often hear neighbors cheering a goal before you see it scored on your own screen. This disconnect disappears when every video screen shows the same content at the same time.</p><p>In esports games, large company meetings or conferences, presenters and commentators can react in real time to comments in chat. The delay between a fan making a comment and seeing the reaction on the video stream can be eliminated.</p><p>Online exercise bikes can provide even more relevant and timely feedback from live instructors, adding to the sense of community developed while riding them.</p><p>Viewers of esports streams can easily switch from passive viewing to active live participation, as there is no delay in the broadcast.</p><p>Security cameras can be monitored from anywhere in the world without having to open ports or set up centralized servers to receive and relay video.</p>
    <div>
      <h3>Getting Started</h3>
      <a href="#getting-started">
        
      </a>
    </div>
    <p>Get started by using your existing inputs on Cloudflare Stream. Without needing to reconnect, they will be available instantly for playback via the RTMPS/SRT playback URLs.</p><p>If you don’t have any inputs on Stream, <a href="https://dash.cloudflare.com/sign-up/stream">sign up</a> for $5/mo. You will get the ability to push live video, broadcast, record, and now pull video with sub-second latency.</p><p>You will need a program like FFmpeg or OBS to push video. For playback, you can use VLC for RTMPS and ffplay for SRT. To integrate with your native app, you can use FFmpeg wrappers such as <a href="https://github.com/tanersener/ffmpeg-kit">ffmpeg-kit</a> for iOS.</p><p>RTMPS and SRT playback work with the recently launched <a href="/bring-your-own-ingest-domain-to-stream-live/">custom domain support</a>, so you can use the domain of your choice and keep your branding.</p>
            <category><![CDATA[Platform Week]]></category>
            <category><![CDATA[Cloudflare Stream]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Video]]></category>
            <guid isPermaLink="false">2LSEcbl8qZ4zZNAc3VAgBP</guid>
            <dc:creator>J. Scott Miller</dc:creator>
            <dc:creator>Renan Dincer</dc:creator>
        </item>
        <item>
            <title><![CDATA[Cloudflare Stream simplifies creator management for creator platforms]]></title>
            <link>https://blog.cloudflare.com/stream-creator-management/</link>
            <pubDate>Fri, 13 May 2022 12:59:08 GMT</pubDate>
            <description><![CDATA[ Many Stream customers track video usage on a per-creator basis. Today, we are announcing creator management to make usage tracking much easier ]]></description>
            <content:encoded><![CDATA[ <p></p><p>Creator platforms across the world use Cloudflare Stream to rapidly build video experiences into their apps. These platforms serve a diverse range of creators, enabling them to share their passion with their beloved audience. While working with creator platforms, we learned that many Stream customers track video usage on a per-creator basis in order to answer critical questions such as:</p><ul><li><p><b><i>“Who are our fastest growing creators?”</i></b></p></li><li><p><b><i>“How much do we charge or pay creators each month?”</i></b></p></li><li><p><b><i>“What can we do more of in order to serve our creators?”</i></b></p></li></ul>
    <div>
      <h3>Introducing the Creator Property</h3>
      <a href="#introducing-the-creator-property">
        
      </a>
    </div>
    <p>Creator platforms enable artists, teachers and hobbyists to express themselves through various media, including video. We built Cloudflare Stream for these platforms, enabling them to rapidly build video use cases without needing to build and maintain a video pipeline at scale.</p><p>At its heart, every creator platform must manage ownership of user-generated content. When a video is uploaded to Stream, Stream returns a video ID. Platforms using Stream have traditionally had to maintain their own index to track content ownership. For example, when a user with internal user ID <code>83721759</code> uploads a video to Stream with video ID <code>06aadc28eb1897702d41b4841b85f322</code>, the platform must maintain a database table of some sort to keep track of the fact that Stream video ID <code>06aadc28eb1897702d41b4841b85f322</code> belongs to internal user <code>83721759</code>.</p><p>With the introduction of the creator property, platforms no longer need to maintain this index. Stream already has a direct creator upload feature to enable users to upload videos directly to Stream using tokenized URLs and without exposing account-wide auth information. You can now set the creator field with your user’s internal user ID at the time of requesting a tokenized upload URL:</p>
            <pre><code>curl -X POST "https://api.cloudflare.com/client/v4/accounts/023e105f4ecef8ad9ca31a8372d0c353/stream/direct_upload" \
     -H "X-Auth-Email: user@example.com" \
     -H "X-Auth-Key: c2547eb745079dac9320b638f5e225cf483cc5cfdda41" \
     -H "Content-Type: application/json" \
     --data '{"maxDurationSeconds":300,"expiry":"2021-01-02T02:20:00Z","creator": "&lt;CREATOR_ID&gt;", "thumbnailTimestampPct":0.529241,"allowedOrigins":["example.com"],"requireSignedURLs":true,"watermark":{"uid":"ea95132c15732412d22c1476fa83f27a"}}'</code></pre>
            <p>When the user uploads the video, the creator property will automatically be set to your internal user ID and can be leveraged for operations such as pulling analytics data for your creators.</p>
    <div>
      <h3>Query By Creator Property</h3>
      <a href="#query-by-creator-property">
        
      </a>
    </div>
    <p>Setting the creator property on your video uploads is just the beginning. You can now filter Stream Analytics via the Dashboard or the GraphQL API using the creator property.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/AL3kZM4W7vpwNEmeuTAvO/86e0c2c72d73319007c02bef85d94c59/kGrcO1JYFhPo8gGDBCqvgLsaJO3Ky0lGQhiqoq92CJtWSFMQ0dgCod0smGH2ET9les1c3C2SXf6NFoizfKLKn8TwOExFIfEazIJrSMzDVqY42EavfavcDMrUg_O4.png" />
            
            </figure><p><i>Filter Stream Analytics in the Dashboard using the Creator property</i></p><p>Previously, if you wanted to generate a monthly report of all your creators and the number of minutes of watch time recorded for their videos, you’d likely use a scripting language such as Python to do the following:</p><ol><li><p>Call the Stream GraphQL API requesting a list of videos and their watch time</p></li><li><p>Traverse through the list of videos and query your internal index to determine which creator each video belongs to</p></li><li><p>Sum up the video watch time for each creator to get a clean report showing you video consumption grouped by the video creator</p></li></ol><p>The creator property eliminates this three step manual process. You can make a single API call to the GraphQL API to request a list of creators and the consumption of their videos for a given time period. Here is an example GraphQL API query that returns minutes delivered by creator:</p>
            <pre><code>query {
  viewer {
    accounts(filter: { accountTag: "&lt;ACCOUNT_ID&gt;" }) {
      streamMinutesViewedAdaptiveGroups(
        limit: 10
        orderBy: [sum_minutesViewed_DESC]
        filter: { date_gt: "2022-03-31", date_lt: "2022-05-01" }
      ) {
        sum {
          minutesViewed
        }
        dimensions {
          creator
        }
      }
    }
  }
}</code></pre>
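The response can then be turned into a per-creator report with a single pass. Below is a minimal sketch of that reduction, assuming the response follows the shape of the query above; the sample data (creator IDs and minute counts) is made up for illustration:

```javascript
// Reduce a GraphQL analytics response into { creator, minutesViewed } rows.
// The response shape mirrors the query above; the sample data is illustrative.
function minutesByCreator(response) {
  const groups =
    response.data.viewer.accounts[0].streamMinutesViewedAdaptiveGroups;
  return groups.map((g) => ({
    creator: g.dimensions.creator,
    minutesViewed: g.sum.minutesViewed,
  }));
}

// Hypothetical response for two creators:
const sampleResponse = {
  data: {
    viewer: {
      accounts: [{
        streamMinutesViewedAdaptiveGroups: [
          { sum: { minutesViewed: 4210 }, dimensions: { creator: "83721759" } },
          { sum: { minutesViewed: 1893 }, dimensions: { creator: "19480023" } },
        ],
      }],
    },
  },
};

console.log(minutesByCreator(sampleResponse));
// [ { creator: '83721759', minutesViewed: 4210 },
//   { creator: '19480023', minutesViewed: 1893 } ]
```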
            <p>Stream is focused on helping creator platforms innovate and scale. Matt Ober, CTO of NFT media management platform <a href="https://www.pinata.cloud/">Piñata</a>, says "By allowing us to upload and then query using creator IDs, large-scale analytics of Cloudflare Stream is about to get a lot easier."</p>
    <div>
      <h3>Getting Started</h3>
      <a href="#getting-started">
        
      </a>
    </div>
    <p><a href="https://developers.cloudflare.com/stream/uploading-videos/creator-id/">Read the docs</a> to learn more about setting the creator property on new and previously uploaded videos. You can also set the creator property on live inputs, so the recorded videos generated from the live event will already have the creator field populated.</p><p>Being able to filter analytics is just the beginning. We can’t wait to enable more creator-level operations, so you can spend more time on what makes your idea unique and less time maintaining table stakes infrastructure.</p> ]]></content:encoded>
            <category><![CDATA[Platform Week]]></category>
            <category><![CDATA[Cloudflare Stream]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Video]]></category>
            <guid isPermaLink="false">3NQRPsoJomsksaoLFEGlTk</guid>
            <dc:creator>Ben Krebsbach</dc:creator>
            <dc:creator>Zaid Farooqui</dc:creator>
        </item>
        <item>
            <title><![CDATA[Stream now supports SRT as a drop-in replacement for RTMP]]></title>
            <link>https://blog.cloudflare.com/stream-now-supports-srt-as-a-drop-in-replacement-for-rtmp/</link>
            <pubDate>Thu, 10 Mar 2022 18:00:00 GMT</pubDate>
            <description><![CDATA[ RTMP is not the protocol to carry us into the future, so Cloudflare Stream now supports SRT anywhere you would use RTMP. ]]></description>
            <content:encoded><![CDATA[ 
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1hEBY4WtbgFMKdGe11oVGp/4716bdd4d2882483707f9d1245a67425/rtr3vWNrtk3X2HSUuSSIHvNud8K6WeNHokdbx231vi9-TIC87GeD6QJ6dMR0vlSJF4qFzv-heJHGXFzhVDygIdHsm3Jh0kGw47QyVx8vtmxPVShb8YsE7CqBRaWJ.png" />
            
            </figure><p>SRT is a new and modern live video transport protocol. It features many improvements over the incumbent popular video ingest protocol, RTMP, such as lower latency and better resilience against unpredictable network conditions on the public Internet. SRT supports newer video codecs and makes it easier to use accessibility features such as captions and multiple audio tracks. While RTMP development has been abandoned since at least 2012, SRT is maintained by an active community of developers.</p><p>We don’t see RTMP use declining anytime soon, but we can give authors of new broadcasting software, as well as video streaming platforms, an alternative.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/lsYVSFjz5nlK7TSllp2Y8/2ac5c0004a982682f30f7abd147159a2/Screen-Shot-2022-03-10-at-12.05.57-PM.png" />
            
            </figure><p>Starting today, in open beta, you can use <a href="/restream-with-stream-connect/">Stream Connect</a> as a gateway to translate SRT to RTMP or RTMP to SRT with your existing applications. This way, you can get the last-mile reliability benefits of SRT and can continue to use the RTMP service of your choice. It’s priced at $1 per 1,000 minutes, regardless of video encoding parameters.</p><p>You can also use SRT to go live on <a href="/stream-live/">Stream Live</a>, our end-to-end live streaming service to get HLS and DASH manifest URLs from your SRT input, and do simulcasting to multiple platforms whether you use SRT or RTMP.</p><p>Stream’s SRT and RTMP implementation supports adding or removing RTMP or SRT outputs without having to restart the source stream, scales to tens of thousands of concurrent video streams per customer and runs on every Cloudflare server in every Cloudflare location around the world.</p>
    <div>
      <h3>Go live like it’s 2022</h3>
      <a href="#go-live-like-its-2022">
        
      </a>
    </div>
    <p>When we first started developing live video features on Cloudflare Stream early last year, we had to decide whether to reimplement an old and unmaintained protocol, RTMP, or focus on the future and start fresh with a modern protocol. If we launched with RTMP, we would get instant compatibility with existing clients but would give up features that greatly improve performance and reliability. Reimplementing RTMP would also mean handling the complicated state machine that powers it, demuxing the FLV container, parsing AMF, and even writing a server that sends the text “Genuine Adobe Flash Media Server 001” as part of the RTMP handshake.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5LxAJbuV3Vvdt0f0otJGeR/99c1e75da5da943291e08c2b83f6e515/pasted-image-0--3-.png" />
            
            </figure><p>Even though there were a few new protocols to evaluate and choose from in this project, the dominance of RTMP was still overwhelming. We decided to implement RTMP but really don’t want anybody else to do it again.</p>
    <div>
      <h3>Eliminate head of line blocking</h3>
      <a href="#eliminate-head-of-line-blocking">
        
      </a>
    </div>
    <p>A common weakness of TCP when it comes to low-latency video transfer is head-of-line blocking. Imagine a camera app sending video to a live streaming server. The camera puts every frame it captures into packets and sends them over a reliable TCP connection. Regardless of the diverse set of Internet infrastructure it may be passing through, TCP makes sure all packets get delivered in order (so that your video frames don’t jump around) and reliably (so you don’t see any parts of a frame missing). However, this type of connection comes at a cost. If a <i>single</i> packet is dropped or lost in the network somewhere between the two endpoints, as often happens on mobile or Wi-Fi connections, the entire TCP connection is brought to a halt while the lost packet is found and re-transmitted. This means that if one frame is suddenly missing, then <i>everything</i> that would come after the lost video frame needs to wait. This is known as head-of-line blocking.</p><p>RTMP experiences head-of-line blocking because it uses a TCP connection. Since SRT is a UDP-based protocol, it does not. SRT features packet recovery that is aware of the low-latency and high-reliability requirements of video. Similar to <a href="https://www.rfc-editor.org/rfc/rfc9000.html">QUIC</a>, it achieves this by implementing its own logic for a reliable connection on top of UDP, rather than relying on TCP.</p><p>SRT solves this problem by waiting only a little, because it knows that losing a single frame won’t be noticeable to the end viewer in the majority of cases. If the frame is not re-transmitted right away, the video moves on. SRT really shines when the broadcaster is streaming with less-than-stellar Internet connectivity. Using SRT means fewer buffering events, lower latency, and a better overall viewing experience for your viewers.</p>
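To make the difference concrete, here is a small, deliberately simplified model (not Stream’s or SRT’s actual implementation): frames are captured every 33&nbsp;ms, one frame’s packet is lost and its retransmission arrives 200&nbsp;ms late. Under in-order (TCP-style) delivery, every later frame stalls behind the lost one; under a deadline-based (SRT-style) policy, the receiver skips the late frame and playback continues. The 50&nbsp;ms latency budget is an arbitrary illustrative value.

```javascript
// Simplified model: frame i is captured at captureMs[i]. The frame at
// lostIndex is lost and its retransmission arrives retransmitDelayMs late.
// Compare when each frame can be *displayed* under two delivery policies.
function displayTimes(captureMs, lostIndex, retransmitDelayMs, policy) {
  const arrival = captureMs.map((t, i) =>
    i === lostIndex ? t + retransmitDelayMs : t);
  const display = [];
  let lastDisplayed = -Infinity;
  for (let i = 0; i < arrival.length; i++) {
    if (policy === "in-order") {
      // TCP-style: frame i cannot be shown before every earlier frame arrives.
      lastDisplayed = Math.max(lastDisplayed, arrival[i]);
      display.push(lastDisplayed);
    } else {
      // SRT-style: if a frame misses its deadline, skip it and move on.
      const deadline = captureMs[i] + 50; // 50 ms latency budget (illustrative)
      display.push(arrival[i] <= deadline ? arrival[i] : null); // null = skipped
    }
  }
  return display;
}

const capture = [0, 33, 66, 99, 132];
console.log(displayTimes(capture, 2, 200, "in-order"));
// [0, 33, 266, 266, 266] — frames after the loss stall behind the retransmission
console.log(displayTimes(capture, 2, 200, "deadline"));
// [0, 33, null, 99, 132] — the lost frame is skipped; playback stays on time
```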
    <div>
      <h3>RTMP to SRT and SRT to RTMP</h3>
      <a href="#rtmp-to-srt-and-srt-to-rtmp">
        
      </a>
    </div>
    <p>Comparing SRT and RTMP today may not seem that useful to the most pragmatic app developers. Perhaps it’s just another protocol that does the same thing. It’s important to remember that even if there is no big improvement for you today, tomorrow there will be new video use cases that benefit from a UDP-based protocol that avoids head-of-line blocking, supports forward error correction, and handles modern codecs beyond H.264 for high-resolution video.</p><p>Switching protocols requires effort from both software that sends video and software that receives it. This is a frustrating chicken-or-the-egg problem: a video streaming service won’t implement a protocol not in use, and clients won’t implement a protocol not supported by streaming services.</p><p>Starting today, you can use Stream Connect to translate between protocols for you and deprecate RTMP without having to wait for video platforms to catch up. This way, you can use your favorite live video streaming service with the protocol of your choice.</p><p>Stream is useful if you’re a live streaming platform too! You can start using SRT while maintaining compatibility with existing RTMP clients. When creating a video service, you can have Stream Connect terminate RTMP for you and send SRT to the destination you intend instead.</p><p>SRT is already implemented in software like FFmpeg and OBS. Here’s how to get it working from OBS:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5Xtq8EpQEYPGJ1ApiQpSvm/0fafdc93d03b1ba0ad6320fa26b39601/Screen-Shot-2022-03-10-at-12.05.39-PM.png" />
            
            </figure><p>Get started by signing up for Cloudflare Stream and adding a live input.</p>
    <div>
      <h3>Protocol-agnostic Live Streaming</h3>
      <a href="#protocol-agnostic-live-streaming">
        
      </a>
    </div>
    <p>We’re working on adding support for more media protocols in addition to RTMP and SRT. What would you like to see next? Let us know! If this post vibes with you, come work with the engineers building with <a href="https://boards.greenhouse.io/cloudflare/jobs/2953274?gh_jid=2953274">video</a> and <a href="https://boards.greenhouse.io/cloudflare/jobs/3523616?gh_jid=3523616">more</a> at Cloudflare!</p> ]]></content:encoded>
            <category><![CDATA[Cloudflare Stream]]></category>
            <category><![CDATA[Video]]></category>
            <category><![CDATA[Product News]]></category>
            <guid isPermaLink="false">4q8tySaWaNx4DZiOCDRg0R</guid>
            <dc:creator>Renan Dincer</dc:creator>
        </item>
        <item>
            <title><![CDATA[Build your next video application on Cloudflare]]></title>
            <link>https://blog.cloudflare.com/build-video-applications-cloudflare/</link>
            <pubDate>Fri, 19 Nov 2021 13:59:21 GMT</pubDate>
            <description><![CDATA[ Today, we’re going to build a video application inspired by Cloudflare TV. We’ll have user authentication and the ability for administrators to upload recorded videos or livestream new content. Think about being able to build your own YouTube or Twitch using Cloudflare services! ]]></description>
            <content:encoded><![CDATA[ <p></p><p>Historically, building video applications has been very difficult. There's a lot of complicated tech behind recording, encoding, and playing videos. Luckily, <a href="https://www.cloudflare.com/products/cloudflare-stream/">Cloudflare Stream</a> abstracts all the difficult parts away, so you can build custom video and streaming applications easily. Let's look at how we can combine Cloudflare Stream, Access, Pages, and Workers to create a high-performance video application with very little code.</p><p>Today, we’re going to build a video application inspired by <a href="https://cloudflare.tv/live">Cloudflare TV</a>. We’ll have user authentication and the ability for administrators to upload recorded videos or livestream new content. Think about being able to build your own YouTube or Twitch using Cloudflare services!</p><div></div>
    <div>
      <h3>Fetching a list of videos</h3>
      <a href="#fetching-a-list-of-videos">
        
      </a>
    </div>
    <p>On the main page of our application, we want to display a list of all videos. The videos are uploaded and stored with Cloudflare Stream, but more on that later! This code could be changed to display only the "trending" videos or a selection of videos chosen for each user. For now, we'll use the search API and pass in an empty string to return all.</p>
            <pre><code>import { getSignedStreamId } from "../../src/cfStream"

export async function onRequestGet(context) {
    const {
        request,
        env,
        params,
    } = context

    const { id } = params

    if (id) {
        const res = await fetch(`https://api.cloudflare.com/client/v4/accounts/${env.CF_ACCOUNT_ID}/stream/${id}`, {
            method: "GET",
            headers: {
                "Authorization": `Bearer ${env.CF_API_TOKEN_STREAM}`
            }
        })

        const video = (await res.json()).result

        if (video.meta.visibility !== "public") {
            return new Response(null, {status: 401})
        }

        const signedId = await getSignedStreamId(id, env.CF_STREAM_SIGNING_KEY)

        return new Response(JSON.stringify({
            signedId: `${signedId}`
        }), {
            headers: {
                "content-type": "application/json"
            }
        })
    } else {
        const url = new URL(request.url)
        const res = await (await fetch(`https://api.cloudflare.com/client/v4/accounts/${env.CF_ACCOUNT_ID}/stream?search=${url.searchParams.get("search") || ""}`, {
            headers: {
                "Authorization": `Bearer ${env.CF_API_TOKEN_STREAM}`
            }
        })).json()

        const filteredVideos = res.result.filter(x =&gt; x.meta.visibility === "public") 
        const videos = await Promise.all(filteredVideos.map(async x =&gt; {
            const signedId = await getSignedStreamId(x.uid, env.CF_STREAM_SIGNING_KEY)
            return {
                uid: x.uid,
                status: x.status,
                thumbnail: `https://videodelivery.net/${signedId}/thumbnails/thumbnail.jpg`,
                meta: {
                    name: x.meta.name
                },
                created: x.created,
                modified: x.modified,
                duration: x.duration,
            }
        }))
        return new Response(JSON.stringify(videos), {headers: {"content-type": "application/json"}})
    }
}</code></pre>
            <p>We'll go through each video, filter out any private videos, and pull out the metadata we need, such as the thumbnail URL, ID, and created date.</p>
    <div>
      <h3>Playing the videos</h3>
      <a href="#playing-the-videos">
        
      </a>
    </div>
    <p>To allow users to play videos from your application, they need to be public, or you'll have to sign each request. Marking your videos as public makes this process easier. However, there are many reasons you might want to control access to your videos. If you want users to log in before they play them or the ability to limit access in any way, mark them as private and use signed URLs to control access. You can find more information about securing your videos <a href="https://developers.cloudflare.com/stream/viewing-videos/securing-your-stream">here</a>.</p><p>If you are testing your application locally or expect to have fewer than 10,000 requests per day, you can call the /token endpoint to generate a signed token. If you expect more than 10,000 requests per day, sign your own tokens as we do here using <a href="https://jwt.io/">JSON Web Tokens</a>.</p>
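The demo’s `getSignedStreamId` helper lives in the repository, but the token structure is easy to sketch. Below is a hedged outline, assuming a standard Stream signed token: a JWT whose header carries the signing key ID and whose payload names the video (`sub`) and an expiry. The actual signing step, an RS256 signature over `header.payload` made with the key returned by the Stream API, is omitted here; the field names and helper are illustrative, so check the demo source and docs for the authoritative version.

```javascript
// Base64url-encode a string (JWT segments use base64url, not plain base64).
function base64url(s) {
  return btoa(s).replace(/\+/g, "-").replace(/\//g, "_").replace(/=+$/, "");
}

// Build the unsigned `header.payload` portion of a Stream signed token.
// videoId and keyId are placeholders for your video UID and signing key ID.
// A real token appends `.${signature}`, an RS256 signature produced with
// the private signing key (e.g. via WebCrypto's crypto.subtle.sign).
function unsignedStreamToken(videoId, keyId, lifetimeSeconds = 3600) {
  const header = { alg: "RS256", kid: keyId };
  const payload = {
    sub: videoId,                                         // the video this token unlocks
    kid: keyId,
    exp: Math.floor(Date.now() / 1000) + lifetimeSeconds, // expiry, seconds since epoch
  };
  return `${base64url(JSON.stringify(header))}.${base64url(JSON.stringify(payload))}`;
}
```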
    <div>
      <h3>Allowing users to upload videos</h3>
      <a href="#allowing-users-to-upload-videos">
        
      </a>
    </div>
    <p>The next step is to build out an admin page where users can upload their videos. You can find documentation on allowing user uploads <a href="https://developers.cloudflare.com/stream/uploading-videos/direct-creator-uploads">here</a>.</p><p>This process is made easy with the Cloudflare Stream API. You use your <a href="https://developers.cloudflare.com/api/tokens/create">API token</a> and account ID to generate a unique, one-time upload URL. Just make sure your token has the Stream:Edit permission. We hook into all POST requests from our application and return the generated upload URL.</p>
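A handler along these lines can back those POST requests. This is a simplified sketch, not the demo repository’s exact code: it asks the Stream `direct_upload` endpoint for a one-time URL and returns it to the client. The `fetchImpl` parameter is only there so the function is easy to test with a stub; in a Worker you would omit it and use the global `fetch`.

```javascript
// Ask Stream for a one-time direct-creator-upload URL and return it as JSON.
// env.CF_ACCOUNT_ID and env.CF_API_TOKEN_STREAM come from environment bindings.
async function createUploadUrl(env, fetchImpl = fetch) {
  const res = await fetchImpl(
    `https://api.cloudflare.com/client/v4/accounts/${env.CF_ACCOUNT_ID}/stream/direct_upload`,
    {
      method: "POST",
      headers: { "Authorization": `Bearer ${env.CF_API_TOKEN_STREAM}` },
      body: JSON.stringify({ maxDurationSeconds: 3600 }),
    }
  );
  const { result } = await res.json();
  // result.uploadURL is the one-time URL the browser POSTs the video file to.
  return new Response(JSON.stringify({ uploadURL: result.uploadURL }), {
    headers: { "content-type": "application/json" },
  });
}
```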
            <p>The admin page contains a form allowing users to drag and drop or upload videos from their computers. When a logged-in user hits submit on the upload form, the application generates a unique URL and then posts the <a href="https://developer.mozilla.org/en-US/docs/Web/API/FormData">FormData</a> to it. This code would work well for building a video sharing site or with any application that allows user-generated content.</p>
            <pre><code>async function getOneTimeUploadUrl() {
    const res = await fetch('/api/admin/videos', {method: 'POST', headers: {'accept': 'application/json'}})
    const upload = await res.json()
    return upload.uploadURL
}

async function uploadVideo() {
    const videoInput = document.getElementById("video");

    const oneTimeUploadUrl = await getOneTimeUploadUrl();
    const video = videoInput.files[0];
    const formData = new FormData();
    formData.append("file", video);

    const uploadResult = await fetch(oneTimeUploadUrl, {
        method: "POST",
        body: formData,
    })
}</code></pre>
            
    <div>
      <h3>Adding real time video with Stream Live</h3>
      <a href="#adding-real-time-video-with-stream-live">
        
      </a>
    </div>
    <p>You can add a <a href="https://www.cloudflare.com/developer-platform/solutions/live-streaming/">livestreaming</a> section to your application as well, using <a href="/stream-live/">Stream Live</a> in conjunction with the techniques we've already covered.  You could allow logged-in users to start a broadcast and then allow other logged-in users, or even the public, to watch it in real-time! The streams will automatically save to your account, so they can be viewed immediately after the broadcast finishes in the main section of your application.</p>
    <div>
      <h3>Securing our app with middleware</h3>
      <a href="#securing-our-app-with-middleware">
        
      </a>
    </div>
    <p>We put all authenticated pages behind this middleware function. It checks the request headers to make sure the user is sending a valid authenticated user email.</p>
            <pre><code>export const cfTeamsAccessAuthMiddleware = async ({request, data, env, next}) =&gt; {
    try {
        const userEmail = request.headers.get("cf-access-authenticated-user-email")

        if (!userEmail) {
            throw new Error("User not found, make sure application is behind Cloudflare Access")
        }
  
        // Pass user info to next handlers
        data.user = {
            email: userEmail
        }
  
        return next()
    } catch (e) {
        return new Response(e.toString(), {status: 401})
    }
}

export const onRequest = [
    cfTeamsAccessAuthMiddleware
]</code></pre>
            
    <div>
      <h3>Putting it all together with Pages</h3>
      <a href="#putting-it-all-together-with-pages">
        
      </a>
    </div>
    <p>We have Cloudflare Access controlling our log-in flow. We use the Stream APIs to manage uploading, displaying, and watching videos. We use <a href="https://workers.cloudflare.com/">Workers</a> for managing fetch requests and handling API calls. Now it’s time to tie it all together using <a href="https://pages.cloudflare.com/">Cloudflare Pages</a>!</p><p>Pages provides an easy way to <a href="https://www.cloudflare.com/developer-platform/solutions/hosting/">deploy and host static websites</a>. But now, Pages seamlessly integrates with the Workers platform. With this new integration, we can deploy this entire application with a single, readable repository.</p>
    <div>
      <h3>Controlling access</h3>
      <a href="#controlling-access">
        
      </a>
    </div>
    <p>Some applications are better public; others contain sensitive data and should be restricted to specific users. The main page is public for this application, and we've used <a href="https://www.cloudflare.com/teams/access/">Cloudflare Access</a> to limit the admin page to employees. You could just as easily use Access to protect the entire application if you’re building an internal learning service or even if you want to beta launch a new site!</p><p>When a user clicks the admin link on our <a href="https://cf-pages-stream.pages.dev/">demo site</a>, they will be prompted for an email address. If they enter a valid Cloudflare email, the application will send them an access code. Otherwise, they won't be able to access that page.</p><p>Check out the <a href="https://github.com/cloudflare/pages-stream-demo">source code</a> and get started building your own video application today!</p> ]]></content:encoded>
            <category><![CDATA[Full Stack Week]]></category>
            <category><![CDATA[Cloudflare Workers]]></category>
            <category><![CDATA[Cloudflare Pages]]></category>
            <category><![CDATA[Cloudflare Access]]></category>
            <category><![CDATA[Cloudflare Stream]]></category>
            <category><![CDATA[Video]]></category>
            <category><![CDATA[Developers]]></category>
            <category><![CDATA[Developer Platform]]></category>
            <guid isPermaLink="false">6crZId6sbMVFWUz7qedA31</guid>
            <dc:creator>Jonathan Kuperman</dc:creator>
            <dc:creator>Adam Janiš</dc:creator>
        </item>
        <item>
            <title><![CDATA[Real-Time Communications at Scale]]></title>
            <link>https://blog.cloudflare.com/announcing-our-real-time-communications-platform/</link>
            <pubDate>Thu, 30 Sep 2021 12:59:36 GMT</pubDate>
            <description><![CDATA[ We’re making it easier to build and scale real-time communications applications around open technologies, starting with WebRTC Components. ]]></description>
            <content:encoded><![CDATA[ <p>For every successful technology, there is a moment where its time comes. Something happens, usually external, to catalyze it — shifting it from being a good idea with promise, to a reality that we can’t imagine living without. Perhaps the best recent example was what happened to the cloud as a result of the introduction of the iPhone in 2007. Smartphones created a huge addressable market for small developers; and even big developers found their customer base could explode in a way that they couldn’t handle without access to public cloud infrastructure. Both wanted to be able to focus on building amazing applications, without having to worry about what lay underneath.</p><p>Last year, during the outbreak of COVID-19, a similar moment happened to real-time communication. Being able to communicate is the lifeblood of any organization. Before 2020, much of it happened in meeting rooms in offices all around the world. But in March last year, that changed dramatically. Those meeting rooms were suddenly emptied. Fast-forward 18 months, and that massive shift in how we work has persisted.</p><p>While, undoubtedly, many organizations would not have been able to get by without the likes of Slack, Zoom and Teams as real-time collaboration tools, we think today’s iteration of communication tools is just the tip of the iceberg. Looking around, it’s hard to escape the feeling that an explosion of innovation is about to take place to enable organizations to communicate in a remote, or at least hybrid, world.</p><p>With this in mind, today we’re excited to be introducing Cloudflare’s Real Time Communications platform. This is a new suite of products designed to help you build the next generation of real-time, interactive applications. 
Whether it’s one-to-one video calling, group audio or video-conferencing, the demand for real-time communications only continues to grow.</p><p>Running a reliable and scalable real-time communications platform requires building out a large-scale network. You need to <a href="/250-cities-is-just-the-start/">get your network edge within milliseconds of your users</a> in multiple geographies to make sure everyone can always connect with low latency, low packet loss and low jitter. A <a href="/cloudflare-backbone-internet-fast-lane/">backbone to route around</a> Internet traffic jams. <a href="/designing-edge-servers-with-arm-cpus/">Infrastructure that can efficiently scale</a> to serve thousands of participants at once. And then you need to deploy media servers, write business logic, manage multiple client platforms, and keep it all running smoothly. We think we can help with this.</p><p>Launching today, you will be able to leverage Cloudflare’s global edge network to improve connectivity for any existing WebRTC-based video and audio application, with what we’re calling “WebRTC Components”. This includes scaling to (tens of) thousands of participants, leveraging our <a href="/cloudflare-thwarts-17-2m-rps-ddos-attack-the-largest-ever-reported/">DDoS mitigation</a> to protect your services from attacks, and enforcing <a href="https://developers.cloudflare.com/spectrum/reference/configuration-options#ip-access-rules">IP and ASN-based access policies</a> in just a few clicks.</p>
    <div>
      <h3>How Real Time is “Real Time”?</h3>
      <a href="#how-real-time-is-real-time">
        
      </a>
    </div>
    <p>Real-time typically refers to communication that happens in under 500ms: that is, as fast as packets can traverse the fibre optic networks that connect the world together. In 2021, most real-time audio and video applications use <a href="https://webrtcforthecurious.com/docs/01-what-why-and-how/">WebRTC</a>, a set of open standards and browser APIs that define how to connect, secure, and transfer both media and data over UDP. It was designed to bring better, more flexible bi-directional communication when compared to the primary browser-based communication protocol we rely on today, HTTP. And because WebRTC is supported in the browser, it means that users don’t need custom clients, nor do developers need to build them: all they need is a browser.</p><p>Importantly, we’ve seen the need for reliable, real-time communication across time-zones and geographies increase dramatically, as organizations change the way they work (<a href="/the-future-of-work-at-cloudflare/">yes, including us</a>).</p><p>So where is real-time important in practice?</p><ul><li><p>One-to-one calls (think FaceTime). We’re used to almost instantaneous communication over traditional telephone lines, and there’s no reason for us to head backwards.</p></li><li><p>Group calling and conferencing (Zoom or Google Meet), where even just a few seconds of delay results in everyone talking over each other.</p></li><li><p>Social video, gaming and sports. 
You don’t want to be 10 seconds behind the action or miss that key moment in a game because the stream dropped a few frames or decided to buffer.</p></li><li><p>Interactive applications: from 3D modeling in the browser, Augmented Reality on your phone, and even game streaming need to be in real-time.</p></li></ul><p>We believe that we’ve only collectively scratched the surface when it comes to real-time applications — and part of that is because scaling real-time applications to even thousands of users requires new infrastructure paradigms and demands more from the network than traditional HTTP-based communication.</p>
    <div>
      <h3>Enter: WebRTC Components</h3>
      <a href="#enter-webrtc-components">
        
      </a>
    </div>
    <p>Today, we’re launching the closed beta of <i>WebRTC Components</i>, allowing teams running centralized <a href="https://www.cloudflare.com/learning/video/turn-server/">WebRTC TURN servers</a> to offload them to Cloudflare’s distributed, global network and improve reliability, scale to more users, and spend less time managing infrastructure.</p><p><a href="https://webrtcforthecurious.com/docs/03-connecting/#turn">TURN</a>, or Traversal Using Relays Around NAT (Network Address Translation), was designed to navigate the practical shortcomings of WebRTC’s peer-to-peer origins. WebRTC was (and is!) a peer-to-peer technology, but in practice, establishing reliable peer-to-peer connections remains hard due to Carrier-Grade NAT, corporate NATs and firewalls. Further, each peer is limited by its own network connectivity — in a traditional <a href="https://webrtcforthecurious.com/docs/08-applied-webrtc/#full-mesh">peer-to-peer mesh</a>, participants can quickly find their network connections saturated because they have to receive data from every other peer. In a mixed environment with different devices (mobile, desktops) and networks (high-latency 3G through to fast fiber), scaling to more than a handful of peers becomes extremely challenging.</p>
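    <p>The full-mesh saturation problem is easy to quantify: each of the <i>n</i> participants must upload its stream to, and download a stream from, the other <i>n - 1</i> peers. A rough back-of-the-envelope sketch, where the 1.5 Mbps per-stream figure is an illustrative assumption rather than a measured number:</p>

```javascript
// Back-of-the-envelope bandwidth math for a full-mesh WebRTC call.
// Each of the n participants uploads its stream to the other n - 1 peers
// and downloads n - 1 streams in return.
function meshBandwidthMbps(participants, streamMbps) {
  const perDirection = (participants - 1) * streamMbps;
  return { uploadMbps: perDirection, downloadMbps: perDirection };
}

// With a modest 1.5 Mbps stream, a 10-person mesh already demands
// 13.5 Mbps of sustained upload from every participant -- more than
// many residential and mobile uplinks can reliably provide.
```

    <p>This quadratic growth in total streams (<i>n × (n - 1)</i> across the call) is why meshes stop working past a handful of peers, and why relays and media servers exist.</p>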
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/XY5oJWURkYZEvmSeSXGax/8dc00dc851aaa722ed75b8e53df26d87/Before.png" />
            
            </figure><p>Running a TURN service at the edge instead of on your own infrastructure gets you a better connection. Cloudflare operates an anycast network spanning <a href="/250-cities-is-just-the-start/">250+ cities</a>, meaning we’re very close to wherever your users are. When users connect to Cloudflare’s TURN service, they get a fast, low-latency connection onto the Cloudflare network. From there, we leverage our network and <a href="/250-cities-is-just-the-start/">private backbone</a> to get you superior connectivity, all the way back to the other user on the call.</p><p>But even better: stop worrying about scale. WebRTC infrastructure is notoriously difficult to scale: you need to make sure you have the right capacity in the right location. Cloudflare’s TURN service scales automatically, and if you want more endpoints, they’re just an API call away.</p>
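    <p>From the client’s perspective, using a TURN relay is just a standard WebRTC configuration change. The hostnames and credentials in this sketch are placeholders, not real Cloudflare endpoints; your TURN provider issues the actual values:</p>

```javascript
// Standard WebRTC configuration pointing at a TURN relay.
// turn.example.com and the credentials below are placeholder values --
// substitute whatever your TURN provider issues.
const rtcConfig = {
  iceServers: [
    // STUN: lets each peer discover its public address.
    { urls: "stun:stun.example.com:3478" },
    // TURN: relays media when a direct path is blocked by NATs/firewalls.
    {
      urls: [
        "turn:turn.example.com:3478?transport=udp",
        "turns:turn.example.com:5349?transport=tcp",
      ],
      username: "ephemeral-username",
      credential: "ephemeral-credential",
    },
  ],
};

// In the browser, the peer connection tries direct candidates first and
// falls back to the relay automatically:
//   const pc = new RTCPeerConnection(rtcConfig);
```

    <p>Because relay fallback is built into ICE, pointing an existing application at a different TURN deployment usually means changing only this configuration object.</p>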
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5Qt7hVKP49sXw4ceYyA2Q2/73c56a80d17827050b8f90a37a7382ee/unnamed--1--1.png" />
            
            </figure><p>Of course, WebRTC Components is built on the Cloudflare network, benefiting from the DDoS protection that its 100 Tbps network offers. From now on, deploying scalable, secure, production-grade WebRTC relays globally is only a couple of API calls away.</p>
    <div>
      <h3>A Developer First Real-Time Platform</h3>
      <a href="#a-developer-first-real-time-platform">
        
      </a>
    </div>
    <p>But, as we like to say at Cloudflare: we’re just getting started. Managed, scalable TURN infrastructure is a critical building block for building real-time services for one-to-one and small group calling, especially for teams who have been managing their own infrastructure, but things become rapidly more complex when you start adding more participants.</p><p>Whether that’s managing the quality of the streams (“tracks”, in WebRTC parlance) each client is sending and receiving to keep call quality up, permissions systems to determine who can speak or broadcast in large-scale events, and/or building signalling infrastructure to support chat and interactivity on top of the media experience, one thing is clear: there’s a lot to bite off.</p><p>With that in mind, here’s a sneak peek at where we’re headed:</p><ul><li><p>Developer-first APIs that abstract the need to manage and configure low-level infrastructure, authentication, authorization and participant permissions. Think in terms of your participants, rooms and channels, without having to learn the intricacies of ICE, peer connections and media tracks.</p></li><li><p>Integration with <a href="https://www.cloudflare.com/teams/access/">Cloudflare for Teams</a> to support organizational access policies: great for when your company town hall meetings are now conducted remotely.</p></li><li><p>Making it easy to connect any input and output source, including broadcasting to traditional HTTP streaming clients and recording for on-demand playback with <a href="/stream-live/">Stream Live</a>, and ingesting from RTMP sources with <a href="/restream-with-stream-connect/">Stream Connect</a>, or future protocols such as <a href="https://datatracker.ietf.org/doc/html/draft-murillo-whip-02">WHIP</a>.</p></li><li><p>Embedded serverless capabilities via <a href="https://workers.cloudflare.com/">Cloudflare Workers</a>, from triggering Workers on participant events (e.g. 
join, leave) through to building stateful chat and collaboration tools with <a href="/introducing-workers-durable-objects/">Durable Objects</a> and WebSockets.</p></li></ul><p>… and this is just the beginning.</p><p>We’re also looking for ambitious engineers who want to play a role in building our RTC platform. If you’re an engineer interested in building the next generation of real-time, interactive applications, <a href="https://boards.greenhouse.io/cloudflare/jobs/3523616?gh_jid=3523616&amp;gh_src=9b769b781us">join</a> <a href="https://boards.greenhouse.io/cloudflare/jobs/3523626?gh_jid=3523626&amp;gh_src=4bdb03661us">us</a>!</p><p>If you’re interested in working with us to help connect more of the world together, and are struggling with scaling your existing 1-to-1 real-time video &amp; audio platform beyond a few hundred or thousand concurrent users, <a href="https://docs.google.com/forms/d/e/1FAIpQLSeGvMJPTmsdWXq1rSCGHzszce5RdM5iYHxsQQfPk8Kt5rkaKQ/viewform?usp=sf_link">sign up for the closed beta</a> of WebRTC Components. We’re especially interested in partnering with teams at the beginning of their real-time journeys and who are keen to iterate closely with us.</p> ]]></content:encoded>
            <category><![CDATA[Birthday Week]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Video]]></category>
            <category><![CDATA[Cloudflare Stream]]></category>
            <category><![CDATA[WebRTC]]></category>
            <guid isPermaLink="false">29oyPijBN1jb64XSQsGHLy</guid>
            <dc:creator>Matt Silverlock</dc:creator>
            <dc:creator>Achiel van der Mandele</dc:creator>
            <dc:creator>James Allworth</dc:creator>
        </item>
        <item>
            <title><![CDATA[Serverless Live Streaming with Cloudflare Stream]]></title>
            <link>https://blog.cloudflare.com/stream-live/</link>
            <pubDate>Thu, 30 Sep 2021 12:59:23 GMT</pubDate>
            <description><![CDATA[ You can now use Cloudflare to do serverless end-to-end live-streaming. Stream Live offers video ingestion, encoding, recording and a player in a single product. ]]></description>
            <content:encoded><![CDATA[ <p>We’re excited to introduce the open beta of Stream Live, an end-to-end scalable <a href="https://www.cloudflare.com/developer-platform/solutions/live-streaming/">live-streaming platform</a> that allows you to focus on growing your live video apps, not your codebase.</p><p>With Stream Live, you can painlessly grow your streaming app to scale to millions of concurrent broadcasters and millions of concurrent users. Start sending live video from mobile or desktop using the industry-standard RTMPS protocol to millions of viewers instantly. Stream Live works with the most popular live video broadcasting software you already use, including ffmpeg, <a href="https://obsproject.com/">OBS</a> or Zoom. Your broadcasts are automatically recorded, optimized and delivered using the Stream player.</p><p>When you are building your live infrastructure from scratch, you have to answer a few critical questions:</p><ol><li><p><i>“Which codec(s) are we going to use to encode the videos?”</i></p></li><li><p><i>“Which protocols are we going to use to ingest and deliver videos?”</i></p></li><li><p><i>“How are the different components going to impact latency?”</i></p></li></ol><p>We built Stream Live so you don’t have to think about these questions or spend considerable engineering effort answering them. Stream Live abstracts these pesky yet important implementation details by automatically choosing the most compatible codec and streaming protocol for the client device. There is no limit to the number of live broadcasts you can start and viewers you can have on Stream Live. Whether you want to make the next viral video sharing app or securely broadcast all-hands meetings to your company, Stream will scale with you without having to spend months building and maintaining video infrastructure.</p>
    <div>
      <h3>Built-in Player and Access Control</h3>
      <a href="#built-in-player-and-access-control">
        
      </a>
    </div>
    <p>Every live video gets an embed code that can be placed inside your app, enabling your users to watch the live stream. You can also use your own player with included support for the two major HTTP streaming formats — HLS and DASH — for granular control over the user experience.</p><p>You can limit who can view your live videos with self-expiring tokenized links for each viewer. When generating the tokenized links, you can define constraints <a href="https://developers.cloudflare.com/stream/viewing-videos/securing-your-stream">including time-based expiration, geo-fencing and IP restrictions</a>. When building an online learning site or a video sharing app, you can put videos behind authentication, so only logged-in users can view your videos. Or if you are building a live concert platform, you may have agreements to only allow viewers from specific countries or regions. Stream’s signed tokens help you comply with complex and custom rulesets.</p>
    <div>
      <h3>Instant Recordings</h3>
      <a href="#instant-recordings">
        
      </a>
    </div>
    <p>With Stream Live, you don’t have to wait for a recording to be available after the live broadcast ends. Live videos automatically get converted to recordings in less than a second. Viewers get access to the recording instantly, allowing them to catch up on what they missed.</p>
    <div>
      <h3>Instant Scale</h3>
      <a href="#instant-scale">
        
      </a>
    </div>
    <p>Whether your platform has one active broadcaster or ten thousand, Stream Live scales with your use case. You don’t have to worry about adding new compute instances, setting up availability zones or negotiating additional software licenses.</p><p>Legacy live video pipelines built in-house typically ingest and encode the live stream continents away in a single location. Video that is ingested far away makes video streaming unreliable, especially for global audiences. All Cloudflare locations run the necessary software to ingest live video <i>in</i> and deliver video <i>out</i>. Once your video broadcast is in the Cloudflare network, Stream Live uses the <a href="/250-cities-is-just-the-start/">Cloudflare backbone</a> and <a href="https://www.cloudflare.com/products/argo-smart-routing/">Argo</a> to transmit your live video with increased reliability.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3aUAddowRRZXNaqyepq9ey/8509043e6a1a063ee30152199de0e287/unnamed--1--2.png" />
            
            </figure>
    <div>
      <h3>Broadcast with 15-second latency</h3>
      <a href="#broadcast-with-15-second-latency">
        
      </a>
    </div>
    <p>Depending on your video encoder settings, the time between you broadcasting and the video displaying on your viewer’s screens can be as low as fifteen seconds with Stream Live. Low latency allows you to build interactive features such as chat and Q&amp;A into your application. This latency is good for broadcasting meetings, sports, concerts, and worship, but we know it doesn’t cover all uses for live video.</p><p>We’re on a mission to reduce the latency Stream Live adds to near-zero. The Cloudflare network is now <a href="/250-cities-is-just-the-start/">within 50ms for 95% of the world’s population</a>. We believe we can significantly reduce the delay from the broadcaster to the viewer in the coming months. Finally, in the world of live-streaming, latency is only meaningful once you can assume reliability. By using the Cloudflare network spanning over 250 locations, you get unparalleled reliability that is critical for live events.</p>
    <div>
      <h3>Simple and predictable pricing</h3>
      <a href="#simple-and-predictable-pricing">
        
      </a>
    </div>
    <p>Stream Live is available as a pay-as-you-go service based on the duration of videos recorded and the duration of video viewed.</p><ul><li><p>It costs $5 per 1,000 minutes of video storage capacity per month. Live-streamed videos are automatically recorded. There is no additional cost for ingesting the live stream.</p></li><li><p>It costs $1 per 1,000 minutes of video viewed.</p></li><li><p>There are no surprises. You never have to pay the hidden costs for video ingest, compute (encoding), egress or storage found in legacy video pipelines.</p></li><li><p>You can control how much you spend with Stream using billing alerts and restrict viewing by creating signed tokens that only work for authorized viewers.</p></li></ul><p>Cloudflare Stream encodes the live stream in multiple quality levels at no additional cost. This ensures smooth playback for your viewers with varying Internet speeds. As your viewers move from Wi-Fi to mobile networks, videos continue playing without interruption. Other platforms that offer live-streaming infrastructure tend to charge extra for quality levels that cater to a global audience.</p><p>If your use case consists of thousands of concurrent broadcasters or millions of concurrent viewers, <a href="https://www.cloudflare.com/plans/enterprise/contact/">reach out</a> to us for volume pricing.</p>
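    <p>Those two meters are all you need to estimate a monthly bill. A quick sketch of the arithmetic, where the usage figures are invented purely for illustration:</p>

```javascript
// Stream Live's pay-as-you-go pricing model, from the rates above:
//   $5 per 1,000 minutes of recorded video stored per month,
//   $1 per 1,000 minutes of video viewed. Ingest is free.
function monthlyCostUSD(storedMinutes, viewedMinutes) {
  const storage = (storedMinutes / 1000) * 5;
  const delivery = (viewedMinutes / 1000) * 1;
  return storage + delivery;
}

// Example: a platform holding 10,000 minutes of recordings and serving
// 500,000 minutes of viewing in a month:
//   monthlyCostUSD(10000, 500000)  // -> 50 + 500 = 550
```

    <p>Note that delivery, not storage, dominates as an audience grows, which is why per-minute-viewed pricing is the number to watch when projecting costs.</p>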
    <div>
      <h3>Go live with Stream</h3>
      <a href="#go-live-with-stream">
        
      </a>
    </div>
    <p>Stream works independently of any domain on Cloudflare. If you already have a Cloudflare account with a Stream subscription, you can begin using Stream Live by clicking on the “Live Input” tab on the <a href="https://dash.cloudflare.com/?to=/:account/stream">Stream Dashboard</a> and creating a new input:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1iLtxKiyHyGsDb9zkLra2e/804f1abf6a30a86cd214dcc048685705/Stream-Screen.png" />
            
            </figure><p>If you are new to Cloudflare, <a href="https://dash.cloudflare.com/sign-up/stream">sign up for Cloudflare Stream.</a></p> ]]></content:encoded>
            <category><![CDATA[Birthday Week]]></category>
            <category><![CDATA[Live Streaming]]></category>
            <category><![CDATA[WebRTC]]></category>
            <category><![CDATA[Video]]></category>
            <category><![CDATA[Cloudflare Stream]]></category>
            <category><![CDATA[Serverless]]></category>
            <category><![CDATA[Product News]]></category>
            <guid isPermaLink="false">18F1qaXrHhAozgDsfcwjHJ</guid>
            <dc:creator>Zaid Farooqui</dc:creator>
        </item>
    </channel>
</rss>