Membrane WebRTC Plugin
Membrane Plugin for sending and receiving streams via WebRTC. It's based on ex_webrtc.
It's a part of the Membrane Framework.
Installation
The package can be installed by adding membrane_webrtc_plugin to your list of dependencies in mix.exs:
def deps do
  [
    {:membrane_webrtc_plugin, "~> 0.26.0"}
  ]
end

Demos
The examples directory shows how to send and receive streams from a web browser.
There are the following three demos:
- live_view - a simple Phoenix LiveView project using Membrane.WebRTC.Live.Player and Membrane.WebRTC.Live.Capture to echo the video stream captured from the user's browser.
- phoenix_signaling - a simple Phoenix application that uses Membrane.WebRTC.PhoenixSignaling to echo the stream captured from the user's browser and sent via WebRTC. See assets/phoenix_signaling/README.md for details on how to run the demo.
- webrtc_signaling - consists of two scripts: file_to_browser.exs and browser_to_file.exs. The first displays the stream from a fixture file in the user's browser; the latter captures the user's camera input from the browser and saves it to a file. To run one of these demos, type elixir <script_name> and visit http://localhost:4000.
Exchanging Signaling Messages
To establish a WebRTC connection you have to exchange WebRTC signaling messages between peers.
In membrane_webrtc_plugin this can be done by the user with Membrane.WebRTC.Signaling, or by passing a WebSocket address to
Membrane.WebRTC.Source or Membrane.WebRTC.Sink. There are also two additional ways of doing it, dedicated to use within
Phoenix projects:
- The first one is to use Membrane.WebRTC.PhoenixSignaling along with Membrane.WebRTC.PhoenixSignaling.Socket.
- The second one is to use the Phoenix.LiveViews Membrane.WebRTC.Live.Player or Membrane.WebRTC.Live.Capture. These modules expect Membrane.WebRTC.Signaling.t/0 as an argument and take advantage of the WebSocket used by Phoenix.LiveView to exchange WebRTC signaling messages, so there is no need to add any code to handle signaling messages.
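For the plain WebSocket variant mentioned above, a minimal sketch might look as follows. The {:websocket, port: ...} option shape and the port number are assumptions, so check the Membrane.WebRTC.Source documentation for the exact form supported by your version:

# Sketch: let Membrane.WebRTC.Source run its own WebSocket signaling server
# (option shape assumed - verify against the module docs)
child(:webrtc_source, %Membrane.WebRTC.Source{signaling: {:websocket, port: 8829}})
|> ...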
How to use Membrane.WebRTC.PhoenixSignaling in your own Phoenix project?
To see the full example, visit examples/phoenix_signaling.
- Create a new socket in your application endpoint, using Membrane.WebRTC.PhoenixSignaling.Socket, for instance at the /signaling path:

socket "/signaling", Membrane.WebRTC.PhoenixSignaling.Socket,
  websocket: true,
  longpoll: false

- Create a Phoenix signaling channel with the desired signaling ID and use it as Membrane.WebRTC.Signaling.t() for Membrane.WebRTC.Source, Membrane.WebRTC.Sink or Boombox:

signaling = Membrane.WebRTC.PhoenixSignaling.new("<signaling_id>")

# use it with Membrane.WebRTC.Source:
child(:webrtc_source, %Membrane.WebRTC.Source{signaling: signaling})
|> ...

# or with Membrane.WebRTC.Sink:
... |> child(:webrtc_sink, %Membrane.WebRTC.Sink{signaling: signaling})

# or with Boombox:
Boombox.run(
  input: {:webrtc, signaling},
  output: ...
)
Please note that signaling_id is expected to be globally unique for each WebRTC connection about to be established. You can, for instance:
- Generate a unique ID with the :uuid package and assign it to the connection in the page controller:

unique_id = UUID.uuid4()
render(conn, :home, layout: false, signaling_id: unique_id)

- Generate HTML based on a HEEx template, using the previously set assign:

<video id="videoPlayer" controls muted autoplay signaling_id={@signaling_id}></video>

- Access it in your client code:

const videoPlayer = document.getElementById('videoPlayer');
const signalingId = videoPlayer.getAttribute('signaling_id');
- Use the Phoenix Socket to exchange WebRTC signaling data.
let socket = new Socket("/signaling", {params: {token: window.userToken}})
socket.connect()

let channel = socket.channel('<signaling_id>')
channel.join()
  .receive("ok", resp => {
    console.log("Signaling socket joined successfully", resp)
    // here you can exchange WebRTC data
  })
  .receive("error", resp => {
    console.log("Unable to join signaling socket", resp)
  })
Visit examples/phoenix_signaling/assets/js/signaling.js to see what the exchange of WebRTC signaling messages might look like.
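Putting the server-side pieces together, here is a minimal sketch of a page controller that assigns a unique signaling ID, starts Boombox with the corresponding Phoenix signaling, and renders the page. The module name, the Task-based start and the "recording.mp4" output are illustrative assumptions, not part of this plugin's API:

# Hypothetical controller - module name and the file output are illustrative only.
defmodule MyAppWeb.StreamController do
  use MyAppWeb, :controller

  def home(conn, _params) do
    signaling_id = UUID.uuid4()
    signaling = Membrane.WebRTC.PhoenixSignaling.new(signaling_id)

    # Boombox.run/1 blocks until the stream ends, so run it out of band;
    # here the browser stream is saved to an MP4 file.
    Task.start(fn ->
      Boombox.run(input: {:webrtc, signaling}, output: "recording.mp4")
    end)

    render(conn, :home, layout: false, signaling_id: signaling_id)
  end
end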
Integrating Phoenix.LiveView with Membrane WebRTC Plugin
membrane_webrtc_plugin comes with two Phoenix.LiveViews:
- Membrane.WebRTC.Live.Capture - exchanges WebRTC signaling messages between Membrane.WebRTC.Source and the browser. It expects the same Membrane.WebRTC.Signaling that has been passed to the related Membrane.WebRTC.Source. As a result, Membrane.WebRTC.Source will return the media stream captured from the browser where Membrane.WebRTC.Live.Capture has been rendered.
- Membrane.WebRTC.Live.Player - exchanges WebRTC signaling messages between Membrane.WebRTC.Sink and the browser. It expects the same Membrane.WebRTC.Signaling that has been passed to the related Membrane.WebRTC.Sink. As a result, Membrane.WebRTC.Live.Player will play the media streams passed to the related Membrane.WebRTC.Sink. It currently supports at most one video stream and one audio stream.
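The server side of the echo setup described above could be wired up roughly as follows. This is a sketch that assumes Boombox is used to connect the two WebRTC legs; the variable names are illustrative:

# Create one signaling struct per WebRTC connection: one for the browser-to-server
# leg (rendered with Membrane.WebRTC.Live.Capture) and one for the server-to-browser
# leg (rendered with Membrane.WebRTC.Live.Player).
ingress_signaling = Membrane.WebRTC.Signaling.new()
egress_signaling = Membrane.WebRTC.Signaling.new()

# Echo the captured stream back to the browser (Boombox usage assumed here;
# a Membrane.WebRTC.Source -> Membrane.WebRTC.Sink pipeline would work as well).
Boombox.run(
  input: {:webrtc, ingress_signaling},
  output: {:webrtc, egress_signaling}
)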
Usage
To use the Phoenix.LiveViews from this repository, you have to use the related JS hooks. To do so, add the following code snippet to assets/js/app.js:
import { createCaptureHook, createPlayerHook } from "membrane_webrtc_plugin";
let Hooks = {};
const iceServers = [{ urls: "stun:stun.l.google.com:19302" }];
Hooks.Capture = createCaptureHook(iceServers);
Hooks.Player = createPlayerHook(iceServers);

and add Hooks to the LiveSocket constructor. It can be done in the following way:
new LiveSocket("/live", Socket, {
params: SomeParams,
hooks: Hooks,
});

To see the full usage example, go to the examples/live_view/ directory in this repository (take a look especially at examples/live_view/assets/js/app.js and examples/live_view/lib/example_project_web/live_views/echo.ex).
Copyright and License
Copyright 2020, Software Mansion
Licensed under the Apache License, Version 2.0