Ghost Live! (part 3: building your own streaming service)

Posted in selfhosting

So, now that we've looked at existing solutions and some new technologies, it's time to think about actually building our own.

This is part three of a three part series:

  1. Current solutions
  2. New technologies
  3. Building your own streaming service

This part is going to get very technical, and assumes quite a bit of prior knowledge, including both ops and Javascript programming. That's part of the reason this is split up into multiple parts: the earlier ones should make how it all works easier to understand for the not so technically inclined.

I won't be including exact examples here; there are too many specifics to each deployment.


This isn't a comprehensive guide on how to do everything; you'll need to have a few things set up already, and there are plenty of tutorials out there for those that are better than anything I could write. To start with you'll definitely need:

Although it's entirely possible to do without them, the following will definitely help you:


The first thing you'll need to get spun up is your OME server. The getting started guide here is very good and will talk you through the process for both a docker install and a manual installation.

If using docker, make sure to mount /opt/ovenmediaengine/bin/origin_conf to a local directory, so you can manage the config directly. You may need to copy the base Server.xml out of the container first. I leave this as an exercise to the reader.

The config is mostly self-explanatory, read through it and look for anything that needs changing from its defaults. In my case I needed to update the IceCandidates section to use my own coturn instance. The following tags are ones to pay attention to:

Stream security

Under the VirtualHost tag you can configure OME to only accept streams with a signed stream key. The OME docs have a good explanation of the signed policy.

The redacted policy I use is below. This only requires a signature to push a stream, it allows anybody to view:


        <!-- <Publishers>webrtc,hls,dash,lldash</Publishers> -->
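For context, the SignedPolicy section of Server.xml looks roughly like the following, per the OME docs (the secret key is a redacted placeholder, and exact tag contents may vary by OME version). Leaving the Publishers line commented out is what allows anybody to view:

```xml
<SignedPolicy>
    <PolicyQueryKeyName>policy</PolicyQueryKeyName>
    <SignatureQueryKeyName>signature</SignatureQueryKeyName>
    <SecretKey>REDACTED</SecretKey>
    <Enables>
        <!-- Require a signed URL to push a stream -->
        <Providers>rtmp,webrtc,srt</Providers>
        <!-- Playback left unsigned so anybody can view -->
        <!-- <Publishers>webrtc,hls,dash,lldash</Publishers> -->
    </Enables>
</SignedPolicy>
```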

Port forwarding/routing

OME requires a lot of port forwarding; how you do this will depend on your deployment. I have a split deployment, with some ports going direct from my public IP and others getting routed via traefik (mainly for TLS termination with LetsEncrypt).

For my traefik deployment, I run the following config to route everything correctly over HTTPS. The DNS name ovenmediaengine resolves to the container running OME.

    http:
      routers:
        ovenmediaengine-origin:
          entryPoints:
            - https
          rule: Host(``)
          service: ovenmediaengine-origin
          middlewares:
            - ovenmediaengine

        ovenmediaengine-media:
          entryPoints:
            - https
          rule: Host(``) && (Path(`/time`) || Path(`/live/{file:.+/.+\..{2,5}}`))
          service: ovenmediaengine-media
          middlewares:
            - ovenmediaengine

        ovenmediaengine-signal:
          entryPoints:
            - https
          rule: Host(``) && Path(`/live/{stream:.+}`)
          service: ovenmediaengine-signal
          middlewares:
            - ovenmediaengine

      services:
        ovenmediaengine-origin:
          loadBalancer:
            servers:
              - url: http://ovenmediaengine:9000/
        ovenmediaengine-media:
          loadBalancer:
            servers:
              - url: http://ovenmediaengine:8080/
        ovenmediaengine-signal:
          loadBalancer:
            servers:
              - url: http://ovenmediaengine:3333/

      middlewares:
        ovenmediaengine:
          headers:
            accessControlAllowMethods:
              - GET
              - OPTIONS
              - PUT
            accessControlAllowOriginList: '*'
            accessControlMaxAge: 100
            addVaryHeader: true

    tcp:
      routers:
        ovenmediaengine-rtmp:
          entryPoints:
            - rtmp
          rule: "HostSNI(`*`)"
          service: ovenmediaengine-rtmp

        ovenmediaengine-rtmps:
          entryPoints:
            - rtmp
            - https
          rule: "HostSNI(``)"
          service: ovenmediaengine-rtmp
          tls: true

        ovenmediaengine-turn:
          entryPoints:
            - stream-turn
          rule: "HostSNI(`*`)"
          service: ovenmediaengine-turn

      services:
        ovenmediaengine-rtmp:
          loadBalancer:
            servers:
              - address: ovenmediaengine:1935

        ovenmediaengine-turn:
          loadBalancer:
            servers:
              - address: ovenmediaengine:3748

All other ports, especially WebRTC media ports, are forwarded directly from my perimeter firewall.


Setting up Cactus is pretty straightforward; it works just like any other matrix appservice. There is a hosted version available, but I prefer to run my own on my own homeserver. The self hosting guide gives a good explanation of how to do this, and of course it can be done using docker.

Whichever you choose, you'll want to follow the quick start to get a chat room up and going.

Building a frontend

Now all the components are in place, it's time to tie everything together with a frontend. I had to build my own, which I did using Vue; it's available on my GitLab.

I encourage anyone who's made it this far to build their own; you're welcome to fork mine, but building it yourself is a real learning experience. To help with this, here are the key parts of how my UI works.

Stream name

The name of the stream you wish to view is simply pulled from the URL by the Vue router. This value is passed to the top-level component of each view.
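As a minimal sketch of how that wiring can look (the path and component names here are my assumptions, not the actual app's):

```javascript
// Hypothetical player view; the real app's component is more involved.
const StreamView = { props: ['streamName'], template: '<div/>' };

// The :streamName URL segment becomes a prop on the view component,
// thanks to props: true.
const routes = [
  { path: '/:streamName', component: StreamView, props: true },
];

// Then: createRouter({ history: createWebHistory(), routes })
```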

Determining live status

When the Vue app loads, it dispatches a periodic job that checks OME to see if a stream is currently live. This is relatively easy to do: with HLS enabled, OME will create a file at <OME app>/<stream name>/playlist.m3u8 while the stream is live. If this file exists, the stream is live and the player can attempt to start it; if it 404s, the player should display an offline message.
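That check can be sketched like this (the base URL and app path are placeholder assumptions for your own OME deployment):

```javascript
// Probe OME's HLS playlist to infer live status.
// fetchImpl is injectable to make the function easy to test.
async function isLive(streamName, fetchImpl = fetch) {
  const url = `https://stream.example.com/app/${streamName}/playlist.m3u8`;
  try {
    const res = await fetchImpl(url, { method: 'HEAD' });
    return res.ok; // 200 while the stream is live, 404 when offline
  } catch {
    return false; // treat network errors as offline
  }
}

// Poll periodically and flip the player between live and offline states:
// setInterval(async () => { player.live = await isLive('mystream'); }, 10000);
```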

Embedding cactus

CactusChat.vue creates an instance of cactus chat based on the stream name from the URL. It requires a bit of configuration, and needed a post-load in Vue due to the lack of an npm package at the time of writing.
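A rough sketch of what that post-load looks like (every URL and name below is a placeholder assumption; initComments() is the function cactus.js exposes, per its docs):

```javascript
// Build the cactus config for a given stream; one chat room per stream.
// Homeserver URL, server name and site name are placeholders.
function cactusConfig(streamName) {
  return {
    defaultHomeserverUrl: 'https://matrix.example.com',
    serverName: 'example.com',
    siteName: 'stream',
    commentSectionId: streamName,
  };
}

// In the Vue component, roughly:
// mounted() {
//   const script = document.createElement('script');
//   script.src = 'https://latest.cactus.chat/cactus.js';
//   script.onload = () =>
//     initComments({ node: this.$refs.chat, ...cactusConfig(this.streamName) });
//   document.head.appendChild(script);
// }
```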

Upon loading, cactus will automatically register a guest account (persisted to localstorage) and fetch the state of the chat room for the stream name. This allows chatting even when the stream is offline, and persists all chat on the matrix homeserver.

Unfortunately at this time cactus does not support sending images or stickers, the latter being a special case in matrix anyway.

Fetching stream title

One of the parts I'm most proud of is the UI's ability to fetch a stream title from the chat room title in matrix. This makes the UI dynamic whilst also just running on static hosting.

Doing this was a tad complicated because of how matrix does things, but it essentially requires fetching the public room directory of the homeserver and searching through it to find the room that matches the expected room name from cactus. Thankfully the expected room name is very predictable: most of it is statically configured in cactus, it just needs the stream name variable from the URL injected.
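A hedged sketch of that lookup, using the matrix client-server publicRooms endpoint (the alias format and all names are my assumptions about how cactus composes its room aliases):

```javascript
// Search the homeserver's public room directory for the stream's chat
// room and use its name as the stream title. Assumed alias format:
// #comments_<siteName>_<streamName>:<serverName>
async function fetchStreamTitle(homeserverUrl, streamName, fetchImpl = fetch) {
  const alias = `#comments_stream_${streamName}:example.com`;
  const res = await fetchImpl(`${homeserverUrl}/_matrix/client/v3/publicRooms`);
  const { chunk } = await res.json();
  const room = chunk.find((r) => r.canonical_alias === alias);
  return room ? room.name : null; // null → fall back to a default title
}
```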

This is in fact the same technique cactus itself uses to load the room state, only we extract the name variable instead.

Embedding stream

To give that full Discord-like experience, the stream player is embeddable within a matrix client using an integration manager. For this I use dimension, but that is outside the scope of this guide.

The embed URL follows the same logic as all the others, except it doesn't include the stream title or chat window; it's simply the player at full width or height on an empty background. For it to be embeddable, CORS rules need to be set up with your hosting provider to allow this. Being a static site, mine runs on CloudFlare pages, so the _headers file achieves this. The process would be similar for Netlify or other static site hosts.
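As a sketch, the _headers file can be as small as this (the /embed/* path is an assumption for wherever your embed view lives):

```
/embed/*
  Access-Control-Allow-Origin: *
```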


Well, I think I've waffled on enough now, so let's see how my finished streaming service lines up against my initial requirements:

- Browser based
  - With native option
- Guest access
- Access control
  - Signed stream key
- Multiple streams
  - With dynamic title
- OME edge replica
{: .feature-matrix }

I'd call that a success.

There's certainly still room for improvement: ActivityPub would be nice for subscriptions, although users can register with matrix and join natively to be notified by @room messages. For now though, I'm more than happy with it, and server load appears to grow very slowly after the initial viewer (the first transcode is the most intensive).

If you do decide to take this on yourself, I'd welcome pull requests to improve the core logic of the UI. If you have any questions, pop them in the comments below, I'll get a ping on matrix just as I do when streaming ;)