Each year I set up a conference bridge for the Asterisk Developer Conference so that participants who are unable to attend can still hear what is going on and provide input. In the past I have also provided an audio stream using Icecast and the app_ices application. This year will be slightly different.
I recently put up a code review which adds unicast RTP support to the chan_multicast_rtp channel driver and moves it into a new chan_rtp module. This was done to allow media to be pushed elsewhere in a protocol-agnostic fashion. For example… ffmpeg.
Before we start playing with ffmpeg, though, we need to begin feeding it a stream. Using the UnicastRTP channel driver we can originate a channel wherever we desire. For the purposes of this blog post I’ll just use an imaginary extension:
channel originate UnicastRTP/127.0.0.1:5001//g722 extension 1000@test
This sets up a unicast RTP channel which sends RTP to 127.0.0.1 port 5001 in the G.722 format. Since this channel immediately answers, it will execute the dialplan logic at extension 1000 in context test.
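The dialplan at 1000 in context test is whatever you want the pushed channel to hear. A minimal sketch of such a context (the use of MusicOnHold here is purely illustrative; any application producing audio will do):

```
[test]
exten => 1000,1,Answer()
 same => n,MusicOnHold()
```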
Since media is now going out let’s take a look at ffmpeg.
The ffmpeg application is used in many places to take an input, transform it, and output it elsewhere. Chances are a video you’ve watched has gone through it at some point. One of the inputs supported is RTP over UDP. To allow ffmpeg to receive the above RTP stream, an SDP file like the following is required:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
c=IN IP4 127.0.0.1
t=0 0
a=tool:libavformat
m=audio 5001 RTP/AVP 9
This instructs ffmpeg what type of media to expect (in this case G.722, which is static RTP payload type 9) and what IP address and port to listen on.
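Since ffmpeg reads the session description from disk, the SDP needs to be saved to a file first. A minimal sketch (the test.sdp filename is what the ffmpeg invocations in this post assume; note that newer ffmpeg builds may additionally require -protocol_whitelist file,udp,rtp when reading SDP files, depending on your version):

```shell
# Save the SDP to disk; ffmpeg reads session descriptions from a file.
cat > test.sdp <<'EOF'
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
c=IN IP4 127.0.0.1
t=0 0
a=tool:libavformat
m=audio 5001 RTP/AVP 9
EOF

# Sanity-check the media line: G.722 is static RTP payload type 9.
grep '^m=audio' test.sdp
# → m=audio 5001 RTP/AVP 9
```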
Where ffmpeg expects an input to be specified we specify the SDP file above like so:
ffmpeg -i test.sdp
Just accepting a stream is useless though. We need to do something with it! For the Asterisk Developer Conference I’ll be originating a call to the UnicastRTP channel driver and placing it into the conference bridge. The stream will then be fed to ffmpeg, which will push it via RTMP to a media streaming server which expects AAC. My ffmpeg command line will look like:
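Putting the Asterisk side of that together, a sketch of the conference setup might look like the following (the context name and conference number here are hypothetical; the originate uses the same dial string as earlier):

```
; extensions.conf — the originated channel joins the conference
[stream]
exten => 1000,1,ConfBridge(1)

; from the Asterisk CLI
channel originate UnicastRTP/127.0.0.1:5001//g722 extension 1000@stream
```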
ffmpeg -i test.sdp -acodec libfdk_aac -ab 64k -f flv rtmp://127.0.0.1:1935/test
This transcodes the audio stream into AAC at 64 kbps and then pushes it using RTMP to the media streaming server. The media streaming server can then make it available via RTMP, HLS, or other methods. This can cover a wide variety of devices (PC, Mac, Android, iOS, etc.). I can also push this directly to a CDN instead of a local media streaming server. Not bad, eh?
It’s also important to note that you can do anything your ffmpeg build supports with the input stream. It’s not limited to just what I’ve done above.
While the functionality required to achieve this, chan_rtp, is not yet in trunk, I expect it to be there within the coming weeks, just in time for the Asterisk Developer Conference. If you end up listening via the stream you’ll get to experience it in use!