
Recording Decent Quality Video and Audio With Flash and Red5


This was written in June 2011, but it describes a technique I have used since March 2010, when I had to deliver my first exhibit that did high-quality video recording in Flash. Right now we're on the brink of a new H.264 encoder in the Flash Player (version 11), but until it arrives you'll be stuck using the techniques I explain below if you want to record anything worthwhile in Flash. The core challenge is that the encoders in the Flash Player (version 10) are quite old and compare poorly to today's video quality standards.

UPDATE 9/2013: A combined-stream example has been posted for recording with Red5 1.0.2 SNAPSHOT or FMS 4.5: Recording Combined Audio and Video Streams With Flash and Red5 1.0.2 SNAPSHOT or FMS. You may want to check that much shorter article and example before digging into this detailed and lengthy one.

The only way I know of to record video is to use a Flash-capable media server. As of June 2011 there is no way to record video without a media server in the Flash Player, Adobe AIR, or Flex. UPDATE 7/22/2011: I was wrong; it looks like zeropointnine.com has worked on a bitmap-frames-to-FLV writer. In my case I have always used Red5 as the server, though I have experimented with Flash Media Server. This article primarily covers my experiences with Red5.

The general procedure relies on some less-documented behavior of the Flash Player to work around how the NetStream classes send data to the server. To prevent random loss of video frames, you separate audio and video into two completely different NetConnections and NetStreams, then combine them again on the server once recording is complete.

In this tutorial I'll first introduce a Flash programming method that takes care of 80% of the work in a Flash application. If you're really hardcore and need a final file that combines high-quality video and audio, you can continue on to the section covering the server-side complement to the recording application.

Recording Status In Red5 Versions
0.8 Final: capable of recording.
0.9: every release has completely broken recording.
1.0: SVN as of May 2011 had working recording.


Audio/Video Sync Warning

Before you invest a lot of time in this technique, be warned that there will be intermittent audio/video sync issues. There are a few ways to deal with this in the ffmpeg commands, which I discuss further down in the server-side application section.

The first option, though it does not work 100% of the time, is to use ffmpeg's -async option.

Danny Kopping let me know that the second option is to use the -itsoffset parameter to offset the audio stream. He mentioned he has had luck offsetting the audio by 0.75 seconds.
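As a sketch only (the file names are placeholders, and the exact values usually need tuning per recording), the two options look something like this:

```shell
# Option 1: let ffmpeg resample the audio to keep it in sync (-async)
ffmpeg -i video.flv -i audio.flv -map 0:v -map 1:a -async 1 merged.flv

# Option 2: delay the audio by 0.75 seconds (-itsoffset applies to the next -i)
ffmpeg -i video.flv -itsoffset 0.75 -i audio.flv -map 0:v -map 1:a merged.flv
```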

Another Possibility In the Red5 1.0+ Configuration

Thanks to hdfvr's helpful article on recording in 1.0, there is another option to try, with mixed results, in 1.0+. In the conf/red5-common.xml file there is a queueThreshold property that can be modified. Per the hdfvr article, if you are doing high-resolution recording, set the value high, for example:
<property name="queueThreshold" value="3600"/>

Prerequisites

1. You have to know how to program a Flash client application, using either the Flex SDK or one of the Flash CS versions. My example was made using FlashDevelop with the free Flex SDK. If you don't understand what I'm asking here, please see the terminology section in my tutorial on Open Source Flash Programming.

2. You'll need to be able to set up and run Red5, or a media server like Flash Media Server or Wowza.

3. You'll need some decent technical skills. These technologies aren't easy to understand and may take some patience to get going.

4. If you get lost here, check out my other two tutorials on Open Source Flash Programming and setting up a Red5 development environment.

For decent recording, 80% of the programming is done in your Flash-based client application. There are behaviors of the Flash Player we need to understand before trying to record anything. In the case of Red5, the other 20% is handled in your server-side application.

The Flash Client Application


Understanding Flash Player 9 and 10 Behavior

1. The Flash Player prioritizes audio transport to the media server. If your application experiences any bandwidth issues, video frames are thrown out at the client, so you'll get jumpy recorded video. (There is a way to deal with this, but it doesn't work 100% of the time.)

2. Regardless of available network capacity, there is a ceiling on the publish quality that can be achieved when recording with the Flash Player. Push past a certain point and the buffer grows so large that people using your application would have to wait a half hour or longer for recording to finish.

3. The Flash Player's encoders are not controllable. You're going to see the buffer size increase substantially when recording bright scenes and video with lots of motion.

4. The camera model plays a drastic role in the amount of data the Flash Player attempts to send to the media server.

5. There is no way to control the rate at which the Flash Player sends data to the media server, which is extremely frustrating from a programming standpoint.

What is buffer length?

OK, so I mentioned buffer length. What is that? When recording, buffer length represents how much data is sitting in your local Flash Player that still needs to be sent over the wire to the media server. Until the buffer empties after you finish recording, not all of the data has reached the media server; close the stream early and you'll get broken recordings. There is no way to get the actual amount of data that remains to be sent.
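A minimal sketch of the idea (the variable name videoStream is illustrative, not from the full application below): after you stop feeding the stream, poll bufferLength on a timer and only close once it reaches zero.

```actionscript
// Sketch: poll a recording NetStream's bufferLength after recording stops.
// "videoStream" is a hypothetical NetStream that was publishing in "record" mode.
var drainTimer:Timer = new Timer(500); // check every half second
drainTimer.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void {
    trace("Seconds of data still waiting to upload: " + videoStream.bufferLength);
    if (videoStream.bufferLength == 0) {
        drainTimer.stop();
        videoStream.close(); // everything has reached the server; safe to close
    }
});
drainTimer.start();
```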

NetConnection and NetStream

The two primary classes you deal with when recording audio or video in Flash are NetConnection and NetStream. NetConnection establishes the connection to the media server, and all of the more complex publish and record operations are done through NetStream.

So, from a programming standpoint, this is the general procedure I use to ensure decent-quality recordings. (Still not what I would consider high quality in 2011, but it's as good as you can get.)

1. Create two NetStreams, one for video and one for audio, to get around the Flash Player dropping video frames.

2. Determine the quality-setting ceiling of the particular camera you're dealing with (mid 80s for many cameras) and publish at that setting.

3. When done recording, set up timers to monitor each NetStream's bufferLength. Do not close the NetStreams; instead, attach a null microphone and camera and wait for bufferLength to reach 0.

4. Finally, call NetStream's close method once bufferLength has reached zero.
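The steps above boil down to something like this (a sketch only, assuming two NetConnections are already connected; all names are illustrative):

```actionscript
// Step 1: separate streams so audio priority can't starve video frames.
var videoStream:NetStream = new NetStream(videoConnection);
var audioStream:NetStream = new NetStream(audioConnection);

// Large buffers so data queues locally instead of frames being dropped.
videoStream.bufferTime = 60;
audioStream.bufferTime = 60;

// Step 2: publish at the camera's quality ceiling.
var cam:Camera = Camera.getCamera();
cam.setMode(640, 480, 30);
cam.setQuality(0, 85); // 0 = no bandwidth cap, quality around the mid 80s

videoStream.attachCamera(cam);
audioStream.attachAudio(Microphone.getMicrophone());

// "record" asks the media server to write myFile_video.flv and myFile_audio.flv.
videoStream.publish("myFile_video", "record");
audioStream.publish("myFile_audio", "record");

// Steps 3-4: when done, attach null camera/mic and wait for
// bufferLength to reach 0 before calling close() on each stream.
```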

Because the recording is separated into two NetStreams, you end up with two independent FLV files: one with video only and one with audio only. At this point I have my server-side Red5 application use ffmpeg to merge the audio FLV and video FLV into one final file.
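The merge itself can be as simple as a single ffmpeg invocation along these lines (file names are hypothetical; this is a sketch of the kind of command the server-side application runs, not its exact arguments):

```shell
# Mux the video-only and audio-only FLVs into one file, copying both streams
# rather than re-encoding them.
ffmpeg -i myFile_video.flv -i myFile_audio.flv \
    -map 0:v -map 1:a -vcodec copy -acodec copy merged.flv
```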

Now that you have an overview of the process, let's take a look at the client-side Flash Player code for the application above, then the server-side Red5 code.

Client Side Code (Actionscript 3)

package com.technogumbo
{
    import flash.display.Sprite;
    import flash.events.Event;
    import flash.events.MouseEvent;
    import flash.events.NetStatusEvent;
    import flash.events.StatusEvent;
    import flash.events.TimerEvent;
    import flash.utils.Timer;
    
    import flash.media.Camera;
    import flash.media.Microphone;
    import flash.media.Video;
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.text.TextFieldAutoSize;
    
    // These imports come from the swc. They are not natively available to applications
    // built outside the Flash CS versions.
    import fl.controls.Button;
    import fl.controls.TextInput;
    import fl.controls.Label;
    import fl.controls.ComboBox;
    import fl.controls.CheckBox;
    import fl.data.DataProvider;
    
    /**
     * ...
     * @author Charles
     * 
     * This sample was created in July 2011 using
     * SVN version of Red5 RC1/RC2. Don't criticize me on doing this all in one file.
     * I did this on purpose. I've seen too many OO heavy examples that will really lose beginners.
     * 
     * The H.264 encoders in Flash Player 11 may have a great effect on the items I describe below,
     * though I assume they will exhibit the same behavior.  Think about the tough task of
     * encoding video in real time: H.264 implementations are pretty CPU intensive, so they may
     * affect the buffers even more.  In any case, this is how to do it before Flash Player 11.
     * 
     * Be aware that recording performance and behavior varies greatly
     * depending on which version of Red5 you're using.
     * 
     * In RC1 - there is a configuration property in
     * conf/red5-common.xml called queueThreshold
     * By default this is 33, but according to AVChat and hdfvr, you should set this very high,
     * like 3600
     * http://avchathq.com/blog/recording-high-quality-flash-video-over-slow-internet-connections-part-3/
     * http://hdfvr.com/documentation#11
     * 
     * 
     * Also, if you include audio and video in the same NetStream, it will almost ALWAYS mess up the video
     * or cause the video to drop frames, because the Flash player gives audio transport priority and drops
     * video frames.  To get around this, we record audio and video separately, then play them
     * back separately as well.  This sometimes causes slight audio sync issues, but it's better than
     * the trash-quality video you normally get.
     * 
     * Also, be aware that at the time of authoring, there is no way that I am aware of to record video
     * in Flash/AIR/Flex without using a media server.
     */
    public class Main extends Sprite 
    {
        private var activeCamera:Camera;
        private var activeMic:Microphone;
        private var activeConnection:NetConnection;
        private var activeStream:NetStream;
        private var audioOnlyStream:NetStream;
        
        private var onScreenVideo:Video;
        private var audioPlaybackVideo:Video;
        
        private var camAllowed:Boolean = false;
        private var micAllowed:Boolean = false;
        private var recordingHalted:Boolean = false;
        private var currentlyRecordedFileName:String = "";
        
        private var bufferCheckTimer:Timer;
        
        // UI Purposes - These Controls come from the swc
        private var btnRecord:Button;
        private var btnPlay:Button;
        private var btnStartMerge:Button;
        
        private var cbMicHz:ComboBox;
        private var lblMic:Label;
        
        private var lblVidPlaceholder:Label;
        private var lblVidResolution:Label;
        
        private var lblCamWidth:Label;
        private var txtCamWidth:TextInput;
        
        private var lblCamHeight:Label;
        private var txtCamHeight:TextInput;
        
        private var lblCamFPS:Label;
        private var txtCamFPS:TextInput;
        
        private var lblCamBandwidth:Label;
        private var txtCamBandwidth:TextInput;
        
        private var lblCamQuality:Label;
        private var txtCamQuality:TextInput;
        
        private var lblVideoBuffer:Label;
        private var lblAudioBuffer:Label;
        
        private var lblMediaSvr:Label;
        private var txtMediaSvrAdd:TextInput;
        private var btnMediaSrvr:Button;
        private var lblBufferSize:Label;
        
        private var lblNetStreamBuff:Label;
        private var txtNetStreamBuff:TextInput;
        
        private var lblNetStreamBehavior:Label;
        private var cbNetStreamBehavior:CheckBox;
        
        private var lblStatus:Label;
        
        public function Main():void 
        {
            if (stage) init();
            else addEventListener(Event.ADDED_TO_STAGE, init);
        }
        
        private function init(e:Event = null):void 
        {
            removeEventListener(Event.ADDED_TO_STAGE, init);
            // entry point
            
            // Setup the UI
            setupUI();
        }
        
        /**
         * Sets up the UI.
         * Don't get lost in here. The important parts of the example are definitely not this.
        **/
        private function setupUI():void {
            
            onScreenVideo = new Video(320, 240);
            onScreenVideo.smoothing = true;
            onScreenVideo.x = 10;
            onScreenVideo.y = 10;
            
            // We smash the audio-only video behind the real video that contains, well, video. The audio video
            // is only used for playing back audio.
            
            // Make sure to set the video to the same size as your camera!
            // NAH - making it smaller and it will just get crushed
            audioPlaybackVideo = new Video(320, 240);
            audioPlaybackVideo.x = 10;
            audioPlaybackVideo.y = 10;
            
            addChild(audioPlaybackVideo); // -- This is directly behind the onScreenVideo so it's not visible
            addChild(onScreenVideo);
            
            lblVidPlaceholder = new Label();
            lblVidPlaceholder.text = "Once You Record, Video Will Be Displayed Here";
            lblVidPlaceholder.width = 200;
            lblVidPlaceholder.height = 100;
            lblVidPlaceholder.wordWrap = true;
            lblVidPlaceholder.x = 10;
            lblVidPlaceholder.y = 100;
            addChild(lblVidPlaceholder);
            
            lblMediaSvr = new Label();
            lblMediaSvr.text = "Media Server Application URI:";
            lblMediaSvr.width = 200;
            lblMediaSvr.x = 340;
            lblMediaSvr.y = 10;
            addChild(lblMediaSvr);
            
            txtMediaSvrAdd = new TextInput();
            txtMediaSvrAdd.text = "rtmp://localhost/vod";
            txtMediaSvrAdd.x = 340;
            txtMediaSvrAdd.y = 30;
            txtMediaSvrAdd.width = 160;
            addChild(txtMediaSvrAdd);
            
            btnMediaSrvr = new Button();
            btnMediaSrvr.label = "Connect";
            btnMediaSrvr.addEventListener(MouseEvent.CLICK, handleMedSrvrBtn, false, 0, true);
            btnMediaSrvr.x = 520;
            btnMediaSrvr.y = 30;
            addChild(btnMediaSrvr);
            
            lblVidResolution = new Label();
            lblVidResolution.text = "Settings Take Effect When You Connect to the Media Server In This Example. Disconnect then Re-connect to Change Camera and Microphone Settings.";
            lblVidResolution.wordWrap = true;
            lblVidResolution.width = 300;
            lblVidResolution.height = 200;
            //lblVidResolution.autoSize = TextFieldAutoSize.LEFT;
            lblVidResolution.x = 340;
            lblVidResolution.y = 60;
            addChild(lblVidResolution);
            
        
            lblCamWidth = new Label();
            lblCamWidth.text = "Width:";
            lblCamWidth.x = 340;
            lblCamWidth.y = 120;
            addChild(lblCamWidth);
            
            txtCamWidth = new TextInput();
            txtCamWidth.width = 35;
            txtCamWidth.height = 20;
            txtCamWidth.x = 375;
            txtCamWidth.y = 120;
            txtCamWidth.text = "640";
            addChild(txtCamWidth);
            
            lblCamHeight = new Label();
            lblCamHeight.text = "Height:";
            lblCamHeight.y = 120;
            lblCamHeight.x = 435;
            addChild(lblCamHeight);
            
            txtCamHeight = new TextInput();
            txtCamHeight.width = 35;
            txtCamHeight.height = 20;
            txtCamHeight.y = 120;
            txtCamHeight.x = 470;
            txtCamHeight.text = "480";
            addChild(txtCamHeight);
            
            lblCamFPS = new Label();
            lblCamFPS.text = "FPS:";
            lblCamFPS.y = 120;
            lblCamFPS.x = 525;
            addChild(lblCamFPS);
            
            txtCamFPS = new TextInput();
            txtCamFPS.width = 35;
            txtCamFPS.height = 20;
            txtCamFPS.x = 550;
            txtCamFPS.y = 120;
            txtCamFPS.text = "29.97";
            addChild(txtCamFPS);
            
            lblCamBandwidth = new Label();
            lblCamBandwidth.text = "Bandwidth:";
            lblCamBandwidth.x = 340;
            lblCamBandwidth.y = 150;
            lblCamBandwidth.width = 60;
            addChild( lblCamBandwidth );
            
            txtCamBandwidth = new TextInput();
            txtCamBandwidth.width = 30;
            txtCamBandwidth.height = 20;
            txtCamBandwidth.y = 150;
            txtCamBandwidth.x = 400;
            txtCamBandwidth.text = "0";
            addChild( txtCamBandwidth );
            
            lblCamQuality = new Label();
            lblCamQuality.text = "Quality:";
            lblCamQuality.x = 435;
            lblCamQuality.y = 150;
            lblCamQuality.width = 65;
            addChild( lblCamQuality );
            
            txtCamQuality = new TextInput();
            txtCamQuality.width = 30;
            txtCamQuality.height = 20;
            txtCamQuality.y = 150;
            txtCamQuality.x = 470;
            txtCamQuality.text = "85";
            addChild( txtCamQuality );
            
            lblMic = new Label();
            lblMic.text = "Microphone Record Rate";
            lblMic.width = 150;
            lblMic.x = 340;
            lblMic.y = 180;
            addChild(lblMic);
            
            var sampleRates:Array = [ 
            {label:"44kHz", data:"44"}, 
            {label:"22kHz", data:"22"}, 
            {label:"11kHz", data:"11" },
            {label:"8kHz", data:"8"}, 
            {label:"5kHz", data:"5"}
            ]; 
            
            cbMicHz = new ComboBox();
            cbMicHz.dropdownWidth = 100;
            cbMicHz.move(340, 200);
            cbMicHz.dataProvider = new DataProvider(sampleRates);
            addChild(cbMicHz);
        
            lblNetStreamBuff = new Label();
            lblNetStreamBuff.text = "Netstream Buffer Time During Recording (Automatically Set to 1 For Playback)";
            lblNetStreamBuff.x = 340;
            lblNetStreamBuff.y = 240;
            lblNetStreamBuff.width = 200;
            lblNetStreamBuff.height = 60;
            lblNetStreamBuff.wordWrap = true;
            addChild( lblNetStreamBuff );
            
            txtNetStreamBuff = new TextInput();
            txtNetStreamBuff.text = "60";
            txtNetStreamBuff.height = 20;
            txtNetStreamBuff.width = 30;
            txtNetStreamBuff.x = 340;
            txtNetStreamBuff.y = 280;
            addChild( txtNetStreamBuff );
            
            lblNetStreamBehavior = new Label();
            lblNetStreamBehavior.text = "Netstream Waits For Buffer To Empty Before Calling close?";
            lblNetStreamBehavior.width = 200;
            lblNetStreamBehavior.height = 60;
            lblNetStreamBehavior.wordWrap = true;
            lblNetStreamBehavior.x = 340;
            lblNetStreamBehavior.y = 320;
            addChild( lblNetStreamBehavior );
            
            cbNetStreamBehavior = new CheckBox();
            cbNetStreamBehavior.x = 340;
            cbNetStreamBehavior.y = 360;
            cbNetStreamBehavior.selected = true;
            cbNetStreamBehavior.label = "";
            addChild( cbNetStreamBehavior );
            
            // -- Left side UI
            btnRecord = new Button();
            btnRecord.label = "Record";
            btnRecord.addEventListener(MouseEvent.CLICK, handleRecordStartStop, false, 0, true);
            btnRecord.x = 10;
            btnRecord.y = 290;
            addChild(btnRecord);
            btnRecord.enabled = false;
            
            btnPlay = new Button();
            btnPlay.label = "Play Recording";
            btnPlay.addEventListener(MouseEvent.CLICK, handlePlaybackClick, false, 0, true);
            btnPlay.x = 200;
            btnPlay.y = 290;
            addChild(btnPlay);
            btnPlay.enabled = false;
            
            btnStartMerge = new Button();
            btnStartMerge.label = "Start Merge";
            btnStartMerge.addEventListener(MouseEvent.CLICK, handleMediaServerMergeCall, false, 0, true);
            btnStartMerge.x = 10;
            btnStartMerge.y = 320;
            addChild( btnStartMerge );
            btnStartMerge.enabled = false;
            
            lblVideoBuffer = new Label();
            lblVideoBuffer.text = "Video Stream Buffer: 0";
            lblVideoBuffer.x = 10;
            lblVideoBuffer.y = 250;
            lblVideoBuffer.width = 320;
            addChild(lblVideoBuffer);
            
            lblAudioBuffer = new Label();
            lblAudioBuffer.text = "Audio Stream Buffer: 0";
            lblAudioBuffer.x = 10;
            lblAudioBuffer.y = 270;
            lblAudioBuffer.width = 320;
            addChild(lblAudioBuffer);
            
            lblStatus = new Label();
            lblStatus.x = 10;
            lblStatus.y = 360;
            lblStatus.width = 320;
            lblStatus.height = 120;
            lblStatus.wordWrap = true;
            setStatus("Setting up UI:");
            addChild(lblStatus);
            
            
        }
        
        /**
         * Just a utility function for the UI
         * @param    _OnOff
         */
        private function disableEnableRecordingSettings(_OnOff:Boolean):void {
            if (_OnOff == true) {
                
                txtMediaSvrAdd.enabled = true;
                txtCamWidth.enabled = true;
                txtCamHeight.enabled = true;
                txtCamFPS.enabled = true;
                txtCamBandwidth.enabled = true;
                txtCamQuality.enabled = true;
                cbMicHz.enabled = true;
                txtNetStreamBuff.enabled = true;
                cbNetStreamBehavior.enabled = true;
                
            } else {
                
                txtMediaSvrAdd.enabled = false;
                txtCamWidth.enabled = false;
                txtCamHeight.enabled = false;
                txtCamFPS.enabled = false;
                txtCamBandwidth.enabled = false;
                txtCamQuality.enabled = false;
                cbMicHz.enabled = false;
                txtNetStreamBuff.enabled = false;
                cbNetStreamBehavior.enabled = false;
                
            }
        }
        
        /**
         * Another utility function for the UI. Changes the status text in the lower left
         * @param    _NewStatus
         */
        private function setStatus(_NewStatus:String):void {
            lblStatus.text = "STATUS: " + _NewStatus;
        }
        
        /**
         * Mouse Click Event button handler for connect/Disconnect to the media server
         * (The button in the top left)
         * @param    e
         */
        private function handleMedSrvrBtn(e:MouseEvent):void {
            if (btnMediaSrvr.label == "Connect") {
            
                // Disable UI recording settings
                disableEnableRecordingSettings(false);
                
                setupConnection();
                
                btnMediaSrvr.label = "Disconnect";
            } else {
                
                // Destroy the camera
                destroyBufferCheckTimer();
                // Destroy camera and mic
                destroyCameraAndMic();
                // Destroy any leftover or active streams
                destroyStreams();
                // Destroy the original connection
                destroyConnection();
                
                // Enable UI recording settings
                disableEnableRecordingSettings(true);
                
                btnRecord.enabled = false;
                btnPlay.enabled = false;
                btnStartMerge.enabled = false;
                
                btnMediaSrvr.label = "Connect";
            }
        }

        /**
         * The mouse click event handler for the recording start/stop button
         * @param    e
         */
        private function handleRecordStartStop(e:MouseEvent):void {
            if (btnRecord.label == "Record") {
                
                recordingHalted = false;
                
                lblVidPlaceholder.visible = false;
                setupStream();
                setupCameraAndMic();
                    
                btnPlay.label = "Play Recording";
                btnPlay.enabled = false;
                btnStartMerge.enabled = false;
                
                btnRecord.label = "Stop";
            } else {
                
                if (activeStream != null) {
                    setStatus("Done recording, waiting for buffers to empty.");
                    
                    activeStream.attachCamera(null);
                    activeStream.attachAudio(null);
                    
                    audioOnlyStream.attachCamera(null);
                    audioOnlyStream.attachAudio(null);
                    
                    recordingHalted = true;
                    
                    // Looks like we aren't waiting for the buffer to empty. Good luck!
                    if (cbNetStreamBehavior.selected == false) {
                        destroyBufferCheckTimer();
                        activeStream.close();
                        audioOnlyStream.close();
                        
                        setStatus("We didn't wait for the buffers to empty and immediately closed.");
                        // OK - Enable Playback - playback time
                        btnPlay.enabled = true;
                        // Enable calling 
                        btnStartMerge.enabled = true;
                    }
                }
                
                btnRecord.label = "Record";
            }
        }
        
        /**
         * The mouse click handler for the button that allows you to play back a recording.
         * @param    e
         */
        private function handlePlaybackClick(e:MouseEvent):void {
            if (btnPlay.label == "Play Recording") {
                
                doRecordingPlayback();
                
                btnPlay.label = "Stop";
            } else {
                
                stopRecordingPlayback();
                
                btnPlay.label = "Play Recording";
            }
        }
        
        /**
         * Mouse event click handler. This will attempt to do a netconnection.call to 
         * call a function on the media server. This was intended for use with the
         * recording_merger server-side application included in the same tutorial as this application.
         * 
         * I'm ignoring a responder here because the server-side function is void and returns nothing.
         * In the recording_merger application it actually invokes a call back on this application that
         * has nothing to do with a responder.
         * 
         * In the red5 server side application, I'm attempting to call a public function here.
         * @param    e
         */
        private function handleMediaServerMergeCall(e:MouseEvent):void {
            if (activeConnection != null) {
                activeConnection.call( "initiateTranscoder", null, currentlyRecordedFileName );
            }
        }
        
        /**
         * This function is invoked by the media server when transcoding completes, if using the
         * recording_merger application.
         * 
         * @param    _FullFilePath
         */
        public function transcodingCallback(_FullFilePath:String):void {
            setStatus("transcodingCallback Final Video at: " + _FullFilePath);
        }
        
        /**
         * Sets up the Netconnection to the media server
         */
        private function setupConnection():void {
            if(activeConnection == null) {
                activeConnection = new NetConnection();
                // Setting the client to this so when the server attempts to call functions on the client
                // it will go to this class.
                activeConnection.client = this;
                activeConnection.addEventListener(NetStatusEvent.NET_STATUS, handleConnectionStatus, false, 0, true);
                activeConnection.connect( txtMediaSvrAdd.text );
                setStatus("Connecting to: " + txtMediaSvrAdd.text );
            } else {
                destroyConnection(true);
            }
        }
        
        /**
         * A utility function for destroying the active netconnection object.
         * 
         * Please ignore the optionalCallback items, they are just a methodology I use
         * @param    _OptionalCallback
         */
        private function destroyConnection(_OptionalCallback:Boolean = false):void {
            if (activeConnection != null) {
                
                activeConnection.close();
                activeConnection.removeEventListener(NetStatusEvent.NET_STATUS, handleConnectionStatus);
                activeConnection = null;
                
                if (_OptionalCallback == true) {
                    setupConnection();
                }
            }
        }
        
        /**
         * The NetConnections netstatus event handler.  This will fire when a connection is closed
         * or if you successfully connect to a media server.
         * @param    e
         */
        private function handleConnectionStatus(e:NetStatusEvent):void {
            trace("handleConnectionStatus - " + e.info.code );
            switch(e.info.code) {
                case 'NetConnection.Connect.Success':
                    setStatus("Successfully connected to: " + txtMediaSvrAdd.text );
                    btnRecord.enabled = true;
                    
                break;
                case 'NetConnection.Connect.Closed':
                    setStatus("Netconnection Closed to: " + txtMediaSvrAdd.text );
                break;
            }
        }
        
        /**
         * This function begins setup on the audio and video netstream which are part of the
         * core of recording and playback when dealing with a media server.
         */
        private function setupStream():void {
            if (activeStream == null) {
                activeStream = new NetStream(activeConnection);
                audioOnlyStream = new NetStream(activeConnection);
                
                // Set the client object so the netstreams can call onMetaData when we do playback
                activeStream.client = this;
                audioOnlyStream.client = this;
                
                activeStream.addEventListener(NetStatusEvent.NET_STATUS, handleStreamStatus, false, 0, true);
                audioOnlyStream.addEventListener(NetStatusEvent.NET_STATUS, handleAudioOnlyStreamStatus, false, 0, true);
                // Set the buffer time really high so the player doesn't just start dropping frames all over
                // the place to preserve the tiny default 0.1 buffer
                // http://livedocs.adobe.com/flash/9.0/ActionScriptLangRefV3/flash/net/NetStream.html#bufferTime
                activeStream.bufferTime = Number( txtNetStreamBuff.text );
                audioOnlyStream.bufferTime = Number( txtNetStreamBuff.text );
            } else {
                destroyStreams(true);
            }
        }
        
        /**
         * Just a function for cleaning up the NetStreams as completely as possible.
         * Again, ignore the optional callback stuff. It's just a methodology I use in OO programming.
         * @param    _OptionalCallback
         */
        private function destroyStreams(_OptionalCallback:Boolean = false):void {
            if (activeStream != null) {
                
                if(onScreenVideo != null) {
                    onScreenVideo.attachNetStream(null);
                }
                if(audioPlaybackVideo != null) {
                    audioPlaybackVideo.attachNetStream(null);
                }
            
                activeStream.attachCamera(null);
                activeStream.attachAudio(null);
                
                audioOnlyStream.attachCamera(null);
                audioOnlyStream.attachAudio(null);
                    
                activeStream.close();
                audioOnlyStream.close();
                
                activeStream.removeEventListener(NetStatusEvent.NET_STATUS, handleStreamStatus);
                audioOnlyStream.removeEventListener(NetStatusEvent.NET_STATUS, handleAudioOnlyStreamStatus);
                
                activeStream = null;
                audioOnlyStream = null;
                
                if (_OptionalCallback == true) {
                    setupStream();
                }
            }
        }
    
        // When streaming video or doing playback with a NetStream you always have to set up onMetaData,
        // so this is used for both AUDIO and VIDEO.
        // Because we set each NetStream's client to this, the NetStreams can call this function automatically.
        public function onMetaData(info:Object):void {
            trace("playback called onMetaData");
            setStatus("Playback of recording called onMetaData");
        }
        
        // Also a callback needed when streaming video back
        // http://livedocs.adobe.com/flash/9.0/ActionScriptLangRefV3/flash/net/NetStream.html#event:onPlayStatus
        public function onPlayStatus(info:Object):void {
            trace("playback called onPlayStatus");
            setStatus("Playback of recording called onPlayStatus");
        }
        
        /**
         * Handles NetStatus events on the video NetStream.
         * We don't do anything in particular here, but these events are really helpful
         * if you're making a full video player.
         * @param    e
         */
        private function handleStreamStatus(e:NetStatusEvent):void {
            switch(e.info.code) {
                case 'NetStream.Buffer.Empty':
                    setStatus("Video Netstream Buffer Empty");
                break;
                case 'NetStream.Buffer.Full':
                    setStatus("Video Netstream Buffer Full");
                break;
                case 'NetStream.Buffer.Flush':
                    setStatus("Video Netstream Buffer Flushed!!!!");
                break;
            }
        }
        
        /**
         * Handles NetStatus events on the audio NetStream.
         * We don't do anything in particular here, but these events are really helpful
         * if you're making a full video player.
         * @param    e
         */
        private function handleAudioOnlyStreamStatus(e:NetStatusEvent):void {
            switch(e.info.code) {
                case 'NetStream.Buffer.Empty':
                    setStatus("Audio Netstream Buffer Empty");
                break;
                case 'NetStream.Buffer.Full':
                    setStatus("Audio Netstream Buffer Full");
                break;
                case 'NetStream.Buffer.Flush':
                    setStatus("Audio Netstream Buffer Flushed!!!");
                break;
            }
        }
        
        /**
         * Prepares the camera and microphone based on the parameters in the UI.
         */
        private function setupCameraAndMic():void {
            if (activeCamera == null) {
                // Gets the default camera on the system - you can pass "0", "1", etc to manually select devices
                activeCamera = Camera.getCamera();
                activeCamera.addEventListener(StatusEvent.STATUS, handleCameraStatus, false, 0, true);
                // 320x240 at 20fps
                activeCamera.setMode( int( txtCamWidth.text ) , int( txtCamHeight.text ) , Number( txtCamFPS.text ) );
                // You'll probably have to set quality lower depending on your webcam.  All webcams transmit
                // different amounts of data; some will drive the buffer so high it's impractical for users to wait that long!
                activeCamera.setQuality( int( txtCamBandwidth.text ) , int( txtCamQuality.text ) );
                
                activeMic = Microphone.getMicrophone();
                activeMic.addEventListener(StatusEvent.STATUS, handleMicrophoneStatus, false, 0, true);
                // Rate in kHz
                // http://livedocs.adobe.com/flash/9.0/ActionScriptLangRefV3/flash/media/Microphone.html#rate
                activeMic.rate = int( cbMicHz.selectedItem.data  );
                activeMic.gain = 50;
                // Audio won't transmit all the time, causing breaks in the recorded audio stream that screw up the
                // flv.  Make sure the mic is always sending data while recording, even if it's silence
                activeMic.setSilenceLevel(0);
                
                
                onScreenVideo.attachCamera(activeCamera);
                if (activeStream != null) {
                    activeStream.attachCamera(activeCamera);
                }
                if (audioOnlyStream != null) {
                    audioOnlyStream.attachAudio(activeMic);
                }
                
                // Wait for the security settings - if we have already approved these, the fun little dialog won't ever show up again
                // and the events on camera and mic won't fire, so just check initially.
                setStatus("Waiting for camera and microphone security settings.");
                
                // -- Will just proceed to publish the recording if we have already passed the security settings
                finishCamAndMicSetup();
            } else {
                destroyCameraAndMic(true);
            }
        }
        
        /**
         * A utility function for completely cleaning up the camera and microphone.
         * @param    _OptionalCallback
         */
        private function destroyCameraAndMic(_OptionalCallback:Boolean = false):void {
            if (activeCamera != null) {
                
                if ( onScreenVideo != null) {
                    onScreenVideo.attachCamera(null);
                }
                
                if (activeStream != null) {
                    activeStream.attachCamera(null);
                }
                
                if (audioOnlyStream != null) {
                    audioOnlyStream.attachAudio(null);
                }
                
                activeCamera.removeEventListener(StatusEvent.STATUS, handleCameraStatus);
                activeCamera = null;
                
                activeMic.removeEventListener(StatusEvent.STATUS, handleMicrophoneStatus);
                activeMic = null;
                
                // Just in case - will do nothing if already destroyed
                destroyBufferCheckTimer();
                
                if (_OptionalCallback == true) {
                    setupCameraAndMic();
                }
            }
        }
        
        /**
         * We have to wait to actually start recording due to the Flash Player security
         * dialog. So this method is called only after we have the OK on security settings.
         */
        private function finishCamAndMicSetup():void {
            if (camAllowed == true && micAllowed == true) {
                if (activeStream != null) {
                    
                    setStatus("Recording....");
                                    
                    // This will start recording - the name we specify here is the name of the final file
                    // on the server.
                    // I'll use the Date class to generate unique names
                    var tempDate:Date = new Date();
                    var uniqueFileName:String = "RecordTest_" + String(tempDate.getMinutes()) + String(tempDate.getMilliseconds());
                    
                    currentlyRecordedFileName = uniqueFileName;
                    
                    // OK lets start recording - look in the "streams" directory inside of red5
                    // inside of the app we connected to in the NetConnection
                    activeStream.publish(uniqueFileName + "_Video", "record");
                    audioOnlyStream.publish(uniqueFileName + "_Audio", "record");
                    
                    createBufferCheckTimer();
                }
            }
        }
        
        /**
         * Utility function for creating the timer that checks the NetStreams' buffers.
         * This timer fires every 100 milliseconds, i.e. 10 times a second.
         */
        private function createBufferCheckTimer():void {
            if (bufferCheckTimer == null) {
                    bufferCheckTimer = new Timer(100);
                    bufferCheckTimer.addEventListener(TimerEvent.TIMER, handleBufferCheck, false, 0, true);
                    bufferCheckTimer.start();
            } else {
                destroyBufferCheckTimer(true);
            }
        }
        
        /**
         * Utility function for destroying the buffer check timer.
         * @param    _OptionalCallback
         */
        private function destroyBufferCheckTimer(_OptionalCallback:Boolean = false):void {
            if (bufferCheckTimer != null) {
                    bufferCheckTimer.stop();
                    bufferCheckTimer.removeEventListener(TimerEvent.TIMER, handleBufferCheck);
                    bufferCheckTimer = null;
                    
                    if (_OptionalCallback == true) {
                        createBufferCheckTimer();
                    }
            }
        }
        
        /**
         * Whenever the timer fires this function is called, so it runs 10 times
         * a second to check the state of the buffers while recording.
         * @param    e
         */
        private function handleBufferCheck(e:TimerEvent):void {
            if (activeStream != null) {
                
                if (lblAudioBuffer != null) {
                    lblAudioBuffer.text = "Audio Stream Buffer: " + String( audioOnlyStream.bufferLength );
                }

                if (lblVideoBuffer != null) {
                    lblVideoBuffer.text = "Video Stream Buffer: " + String( activeStream.bufferLength );
                }
                
                if (recordingHalted == true) {
                    if ( (activeStream.bufferLength == 0) && (audioOnlyStream.bufferLength == 0) ) {
                        activeStream.close();
                        audioOnlyStream.close();
                        
                        bufferCheckTimer.stop();
                        bufferCheckTimer.removeEventListener(TimerEvent.TIMER, handleBufferCheck);
                        bufferCheckTimer = null;
                        
                        setStatus("Buffers Empty. Press 'Play Recording' to start playback.");
                        // OK - Enable Playback - playback time
                        btnPlay.enabled = true;
                        btnStartMerge.enabled = true;
                    }
                }
            }
            
            if (bufferCheckTimer != null) {
                bufferCheckTimer.reset();
                bufferCheckTimer.start();
            }
        }
        
        /**
         * The flash player security dialog will prompt users for camera access when we request the camera;
         * use this event to figure out if they allowed it or not.
         * http://livedocs.adobe.com/flash/9.0/ActionScriptLangRefV3/flash/media/Camera.html#getCamera%28%29
         * @param    e
         */
        private function handleCameraStatus(e:StatusEvent):void {
            trace("handleCameraStatus - " + e.code);
            switch(e.code) {
                case 'Camera.Muted':
                    // Show a message
                    setStatus("Camera muted - unable to continue");
                break;
                case 'Camera.Unmuted':
                    camAllowed = true;
                    finishCamAndMicSetup();
                break;
            }
        }
        
        /**
         * The flash player security dialog will prompt users for microphone access when we request the microphone;
         * use this event to figure out if they allowed it or not.
         * http://livedocs.adobe.com/flash/9.0/ActionScriptLangRefV3/flash/media/Microphone.html#getMicrophone%28%29
         * @param    e
         */
        private function handleMicrophoneStatus(e:StatusEvent):void {
            trace("handleMicrophoneStatus - " + e.code);
            switch(e.code) {
                case 'Microphone.Muted':
                    // Show a message
                    setStatus("Microphone muted - unable to continue");
                break;
                case 'Microphone.Unmuted':
                    micAllowed = true;
                    finishCamAndMicSetup();
                break;
            }
        }
        
        /**
         * When playing back video/audio, we can reuse the same NetStreams we used to record,
         * as well as the same Video objects.
         * 
         * Take special note of the bufferTime change here.  If I left it at 60 or whatever was set
         * in the UI, the NetStreams would try to download that much time of video/audio before starting
         * playback.
         * 
         * I think the default NetStream bufferTime of .1 sec is really intended for playback, because
         * so many more people do playback than video recording.
         */
        private function doRecordingPlayback():void {
            setStatus("Starting simultaneous playback of recorded audio and video");
            
            onScreenVideo.attachCamera(null);
            onScreenVideo.attachNetStream(activeStream);
            audioPlaybackVideo.attachNetStream(audioOnlyStream);
            
            // Attempt to play back the audio and video files simultaneously
            // First, lower the buffers - the huge recording buffer would be BAD for playback
            activeStream.bufferTime = 1;
            audioOnlyStream.bufferTime = 1;
            
            activeStream.play(currentlyRecordedFileName + "_Video");
            audioOnlyStream.play(currentlyRecordedFileName + "_Audio");
        }
        
        /**
         * If for some reason someone stops the playback, I just pause the audio and
         * video streams.  They get destroyed when we record again anyway.
         */
        private function stopRecordingPlayback():void {
            setStatus("Stopping Recording Playback");
            
            if ( activeStream != null) {
                activeStream.pause();
            }
            if ( audioOnlyStream != null) {
                audioOnlyStream.pause();
            }
        }
        
    }
    
}

Download The Client Side Code

Download the swf of the application above (Save-As)

Download a zip of the entire source and FlashDevelop project

You can download the pre-compiled swf and source code. It was made using the free Flex SDK and FlashDevelop. Please note that when compiling, it is important to reference the “FlashComponents.swc” file.

Red5 Server Side Code For Merging Audio and Video

OK, so playing the recording back through the media server in Flash isn't good enough for you? You want a combined final file! Hold on to your shorts, because we're going to blast off on a Java adventure below.

We had to record audio and video to two separate files to prevent the Flash player from garbling the video due to its inherent audio transport priority behavior. If you always record and play back through the media server, there is no need to create a final merged file with audio and video; you would simply start playback of the audio and video at the same time in your Flash application.

On the server, we need a way to combine the two files into one final file containing both audio and video. There are other techniques for doing this, but the solution I explain uses the popular ffmpeg / libav application. The server-side code launches ffmpeg when recording is finished to merge the two files.
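Under the hood, the merge step reduces to a single ffmpeg invocation. The command below is a sketch of what the server-side code assembles; the file names are illustrative (the server derives them from the recording name), and exact behavior can vary between ffmpeg builds:

```shell
# Merge the separately recorded video-only and audio-only FLVs into one file.
# -y overwrites any existing output; with one video-only and one audio-only
# input, ffmpeg's default stream selection muxes both into the output.
ffmpeg -y -i RecordTest_123_Video.flv -i RecordTest_123_Audio.flv RecordTest_123_Merged.flv
```

Because the two inputs carry no overlapping stream types, no explicit mapping is needed here.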

Making your own application in Red5 is pretty tough. It requires a decent understanding of Java technology and of Red5 itself. In this example I'm building my server-side application against Red5 RC1 from SVN as of July 2011. Before we get into the nuts and bolts of the solution, you can simply download my pre-compiled Red5 server application and drop it into a running instance of Red5. Make sure to read the INSTALLATION.txt file inside the zip archive.

Download the compiled zip archive of the recording_merger server side application


Or, if you're still on our Java adventure you can:

Download the source code for the server-side application which I will describe below.



The real trick when developing a custom server-side application is to insert your application, as you program it, into the Red5 build.xml script for Apache Ant; this lets you build and debug your custom application as the entire server is built. If you're just taking my pre-built application above, you can drop it into the dist/webapps directory (or just webapps if you have no dist directory) as of July 2011, and Red5 will pick up the app and launch it.

Basically, you make your own main class that extends some of the core Red5 classes. Assuming you have entered your application's information into the Ant build script (build.xml) correctly, when you use Apache Ant to run Red5 it will compile your new custom application along with all of Red5. Once you're done debugging your app and have it all built, you can just use the Red5 run scripts, red5.bat or red5.sh depending on your platform.

Explanation of Files (Besides core Red5 Application Files)


recording_merger.properties - A configuration file pointing to the absolute system path of ffmpeg and the relative path to the deployed application's streams directory. This file should be placed in the root of your Red5 directory (alongside build.xml, red5.bat, and red5.sh).
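As an example, a minimal recording_merger.properties for a Windows install might look like the following. The key names are the ones the server code reads, but both paths are illustrative and must match your own system; note that literal backslashes have to be doubled in .properties files:

```properties
# Absolute path to the ffmpeg binary (illustrative - point this at your install)
FFMPEGAbsPath=C:\\Tools\\ffmpeg\\bin\\ffmpeg.exe
# Relative path from the Red5 root to the deployed app's streams directory
StreamsDirRelativePath=\\dist\\webapps\\recording_merger\\streams\\
```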

Manager.java - This is the entry point for the application. Whenever you do a NetConnection.connect to this application from Flash, Red5 will execute certain functions in here, such as appConnect and appDisconnect. There is one public method in this file, "initiateTranscoder", which can be called to perform the process of merging two flv files using ffmpeg. It also contains an inner class that is used to launch the transcoder in a separate thread from Red5, so execution of the server is not halted during transcoding.

UseTranscoderBlocking.java - This handles setting up the command-line parameters for ffmpeg and calling the actual binary to do the transcoding. It blocks until the process finishes and is designed to be run inside a separate thread within Manager.java.

ThreadedTranscoderIO.java - The input and output streams of ffmpeg will cause mischief when run this way if we don't direct them somewhere. This class is used inside UseTranscoderBlocking.java when executing ffmpeg.

Server Side Code Files (Java)

Manager.java

package recording_merger;

// Non-standard Red5 classes
import java.io.IOException;

// For logging
import org.slf4j.Logger;
import org.red5.logging.Red5LoggerFactory;

import org.red5.server.adapter.ApplicationAdapter;
import org.red5.server.api.IScope;
import org.red5.server.api.stream.IServerStream;
import org.red5.server.api.stream.IStreamCapableConnection;

// Function calls on clients
import org.red5.server.api.IConnection;
import org.red5.server.api.Red5;
import org.red5.server.api.service.IServiceCapableConnection; 

// Thread stuff
import java.io.*;
import java.lang.Thread;
import java.lang.Throwable;
    
public class Manager extends ApplicationAdapter {
    
    // ********  Global Class Objects *********
       
    // For Red5
    private IScope appScope;

    private IServerStream serverStream;
    
    private static Logger log = Red5LoggerFactory.getLogger(Manager.class, "recording_merger");
    
    /** {@inheritDoc} */
    public boolean appStart(IScope app) {
        appScope = app;

        log.info("recording_merger appStart");
        
        return true;
    }
    
    @Override
    // This gets called whenever a Flash client connects to the server
    public boolean appConnect(IConnection conn, Object[] params) {
        return super.appConnect(conn, params);
    }
    
    @Override
    // This gets called whenever a Flash client disconnects
    public void appDisconnect(IConnection conn) {
    
        if (appScope == conn.getScope() && serverStream != null) {
            //log.info("Closing server stream using appDisconnect");
            serverStream.close();
        }
        
        super.appDisconnect(conn);
    }
    
// Non-Red5-specific functions
    
    public void initiateTranscoder(String _RecordedVideoFileName) {
        
        // Create a thread to do the ffmpeg encoding without
        // blocking Red5 and preventing its further execution.
        // By passing in an instance of our connection we are able
        // to make a callback to the client once the thread finishes!
        IConnection currentConnection = Red5.getConnectionLocal(); 
        Runnable myRunObject = new ExecuteTranscoderWithCallbacks(_RecordedVideoFileName, currentConnection);
        
        Thread actualThread = new Thread(myRunObject);
        actualThread.start();
    }

    class ExecuteTranscoderWithCallbacks implements Runnable {
        
        private String RecordedFileName = "";
        IConnection currentConnection;
        
        ExecuteTranscoderWithCallbacks(String _RecordedFileName, IConnection _clientConn) {
            RecordedFileName = _RecordedFileName;
            currentConnection = _clientConn;
        }
        
        // In threaded Java programming, run() is called automatically when the thread starts
        public void run(){
            try{
                UseTranscoderBlocking ffmpegTranscoder = new UseTranscoderBlocking();
            
                // This will block until the encoding is done!
                String returnStatus = "";
                
                returnStatus = ffmpegTranscoder.Start_FFMPEG(RecordedFileName);
                
                // Try and invoke a callback on the client!
                // IConnection currentConnection = Red5.getConnectionLocal(); 
                if (currentConnection instanceof IServiceCapableConnection) { 
                    IServiceCapableConnection sc = (IServiceCapableConnection) currentConnection;
                // Send the entire absolute file system path of the encoded file back to the flash client!
                // Calls a function on the netConnections "client" object called transcodingCallback
                    sc.invoke("transcodingCallback", new Object[]{returnStatus}); 
                } 

            }catch (Exception e){
                e.printStackTrace();
            }//end catch
        }//end run
        
    }//end inner class ExecuteTranscoderWithCallbacks
}


UseTranscoderBlocking.java

package recording_merger;

import java.io.*;
import java.util.regex.Pattern;
import java.util.regex.Matcher;

// For Red5 Logging
import org.slf4j.Logger;
import org.red5.logging.Red5LoggerFactory;

// We need this for threads
import java.lang.Thread; 

// For Loading the .properties configuration file
import java.util.Properties;

public class UseTranscoderBlocking {
    
     // DATA MEMBERS
        // Get the logger going - this allows us to write logs to red5root/log/recording_merger.log
        private static Logger log = Red5LoggerFactory.getLogger(UseTranscoderBlocking.class, "recording_merger");
        
        // CONFIG FILE
        private static Properties configLoader;
        
        private String FFMPEGFULLPATH = "";
        
        // The relative path from the Red5 root to the application's "streams" directory
        private String RELATIVEVIDPATH = "\\dist\\webapps\\recording_merger\\streams\\";
        
        // For blocking mode of this class, we return the absolute path to the final encoded file
        // which then gets sent back to the flash player client!
        private String FINALFILEPATH = "";
        
        public UseTranscoderBlocking() {

        }
       
       public String Start_FFMPEG(String _BaseFileName) {
           String result = "";
            
                try {
                    // The config .properties file is set up to reside in the root Red5 directory... that's what will resolve here.
                    // I'd rather not have it in the Red5 root, but if I put it in dist/webapps/..blah blah then that'll probably
                    // change later as Red5 progresses.
                    configLoader = new Properties();
                    configLoader.load(new FileInputStream( "recording_merger.properties" ));
                    
                    // Populate all the global vars
                    FFMPEGFULLPATH = configLoader.getProperty("FFMPEGAbsPath");
                    RELATIVEVIDPATH = configLoader.getProperty("StreamsDirRelativePath");

                    // Get rid of the properties
                    configLoader = null;
                    
                    // Run FFMPEG
                    Run_FFMPEG(_BaseFileName);
                    
                } catch (Exception e) {
                    System.out.println("Start_FFMPEG exception-" + e);
                    log.error("Start_FFMPEG-" + e);
                }
                
            result = FINALFILEPATH;
           return result;
       }
       
       private String Build_FFMPEG_Video_Merge_Command(String _RelativePathToVideoFile) {
           String returnString = "";
           
           try {
            // Get the root Red5 Path
            File tmpDir = new File (".");
                
            // Prepare the full path to the video file
            String FullInputVideoPath = tmpDir.getCanonicalPath() + RELATIVEVIDPATH + _RelativePathToVideoFile + "_Video.flv";
            String FullInputAudioPath = tmpDir.getCanonicalPath() + RELATIVEVIDPATH + _RelativePathToVideoFile + "_Audio.flv";
            
            // Prepare the full path to the output video file - from the root of Red5
            String FullOutputVideoPath = tmpDir.getCanonicalPath() + RELATIVEVIDPATH + _RelativePathToVideoFile + "_Merged.flv";

            // Set the final files absolute path
            FINALFILEPATH = FullOutputVideoPath;

           // I am going to manually construct this command.  You could include it in the external config file if you wanted
           returnString = FFMPEGFULLPATH + " -y -i " + FullInputVideoPath + " -i " + FullInputAudioPath + " " + FullOutputVideoPath;
           
           } catch (IOException e) {
                System.out.println("UseFFMPEG Build_FFMPEG_Video_Merge_Command-" + e.toString());
                log.error("Build_FFMPEG_Video_Merge_Command-" + e.toString());
           }
           
           return returnString;
       }
       
       private void Run_FFMPEG(String _RecordingNameWoExtension) {
            try {
                // If you're going to run this on OS X or Linux, there will be some subtleties here that you'll have to deal with.
                Runtime rt = Runtime.getRuntime();
                
                String cmd = Build_FFMPEG_Video_Merge_Command(_RecordingNameWoExtension);
        
                Process pr = rt.exec(cmd);
        
                ThreadedTranscoderIO errorHandler = new ThreadedTranscoderIO(pr.getErrorStream(), "Error Stream");
                 errorHandler.start();
                 ThreadedTranscoderIO inputHandler = new ThreadedTranscoderIO(pr.getInputStream(), "Output Stream");
                 inputHandler.start();
                 
                 try {
                     pr.waitFor();
                 } catch (InterruptedException e) {
                     throw new IOException("UseTranscoderBlocking - Run_FFMPEG - process interrupted " + e);
                 }
                
            } catch(Exception e) {
                System.out.println("UseTranscoderBlocking Run_FFMPEG-" + e.toString());
                log.error("Run_FFMPEG-" + e.toString());
            }
       }
}


I decided not to show the source of ThreadedTranscoderIO.java because it is so trivial. Download the source and you can look at it yourself. I've also removed exhaustive comments from many of the methods to shrink the overflow of code in this article.
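That said, for readers who don't want to open the download, here is a minimal sketch of what a stream-draining thread like ThreadedTranscoderIO typically looks like, matching the constructor and start() usage in Run_FFMPEG above. This is my reconstruction, not the shipped class; in particular, the getCaptured helper is hypothetical:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

// Illustrative sketch only - the real ThreadedTranscoderIO in the source
// download may differ. It drains one of ffmpeg's output pipes on its own
// thread so the child process can never block on a full pipe buffer.
public class ThreadedTranscoderIO extends Thread {

    private final InputStream stream;
    private final String streamLabel;
    private final StringBuilder captured = new StringBuilder();

    public ThreadedTranscoderIO(InputStream stream, String streamLabel) {
        this.stream = stream;
        this.streamLabel = streamLabel;
    }

    @Override
    public void run() {
        // Read until the stream closes, i.e. until ffmpeg exits.
        try (BufferedReader reader =
                new BufferedReader(new InputStreamReader(stream))) {
            String line;
            while ((line = reader.readLine()) != null) {
                captured.append(line).append('\n');
                System.out.println(streamLabel + "> " + line);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // Hypothetical helper so callers can inspect what ffmpeg printed.
    public String getCaptured() {
        return captured.toString();
    }
}
```

The key design point is simply that each of the process's two output streams gets its own consumer thread; draining them on the main thread would risk deadlocking against pr.waitFor().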

Step Forth, Recording Graduate. Congratulations on Making It.

If you can get this client- and server-side application working, you'll be armed with a recording solution that pushes the limits of the old encoders inside the Flash Player, and of your network connection. I can't wait to check out the H.264 encoder in Flash Player 11! I hope this tutorial has helped demystify some of the tricks to recording with Flash and a media server.


Comments

solarcloud
July 4, 2015 9:35 pm

Hey. I just wanted to say thank you so much for releasing the source code. I'm very apreciative

Fran
December 11, 2014 4:36 pm

hi charles, sorry for my english, i come from Venezuela and here just speak spanih, well this is myproblem, i try to use red5 rtmp, to push on flash player in the web this players have spetrums, in this web you can see the player i want to use http://demo.netandino.com/demobajablerelaywowza3.php, red5 rtmp dont push on this player, my question is, you have a documentation, or example to put on this player with red5 and mi own http audio from my computer.. finally thanks you and congrats for you help and your excelents projects friend...

AZ
February 10, 2014 03:59 am

Hi

Arul
November 5, 2013 10:01 am

I have recorded for 2 min. Video is recorded for 2 min but the audio records for only 58sec.

Charles
July 13, 2013 2:07 pm

Hi Naico!

To set audio settings you use the microphone class to setup codecs and bit rate. If you are using fms you may not have to separate audio and video. I did a project last year with FMS and it worked fine on connections all the way down to dial up.

naico!
July 13, 2013 1:39 pm

i'm using a flash application conected to an fms.

naico!
July 13, 2013 1:38 pm

Hey! super helpfull stuff!! one question, how do i set the audio quality? i mean the bitrate.
I'm handling video and audio separatelly (audio is recorded locally and uploaded after video is finished streaming)

Triangulito
March 17, 2013 12:11 am

Oh, just realized that I'm getting an error that says: Cannot find class [org.red5.server.WebScope] for bean with name 'web.scope' defined in ServletContext resource [/WEB_INF/red5_web.xml]

Is this because of a different red5 than the one used in this tutorial?

Triangulito
March 15, 2013 11:50 pm

Hey, I'm trying to mount red5 with your recording_merger app and keep getting errors. What version of red5 do I need (I read that it was 1.0.0 RC1, but can't get it to work with that) and does it have to be the svn version because I don't know how to compile it for it to work. Any help with this will be greatly appreciated.

Charles
February 13, 2013 11:36 am

Hi Prashant,

I have used Flash Media Server 4.5 recently with a mac, but not Red5. I do know that in the Flash player there are timing differences between when the security dialog is accepted and when video/audio actually starts flowing to the media server between Mac/PC platforms.

If you've separated the streams, it's probably inevitable that you will get some mis-sync between the audio and video.

If you're trying to push the quality of a combined audio/video stream in Red5 .8, you'll have issues no matter what if you go too high as well.

Prashant
February 13, 2013 09:39 am

Thanks tpyo,

Your trick works well with Windows, but I am trying this from the Macintosh platform. I am facing an audio/video sync issue, and the video also hangs for some time while the audio keeps playing behind it.
Please help with this issue if you can.... :)

Charles
December 26, 2012 6:04 pm

Thanks tpyo.

It took me over a week to put this together over summer vacation in 2011 so I appreciate your nice comment.

tpyo
November 28, 2012 11:25 am

Thanks for all the comments in the downloadable source; they are most useful! :) This comment thread is also golden.

well
September 20, 2012 12:48 am

So nice, it could be applied to home security!

Charles
September 13, 2012 6:21 pm

Great work Daniel!

I haven't tested with the latest version of Red5, but with Flash Media Server 4.5 and the latest Flash player, audio and video were not dropped when setting the buffer really high on slow connections with audio and video in the same netstream. I was testing down to 4 kB/second.

Daniel
September 13, 2012 6:12 pm

Thanx Charles for your reply.
I haven't tested with the latest Flash player and the latest Red5, but I will. Are you saying that the latest Flash player and the latest Red5 will not drop video frames when bandwidth is very low?
I am doing tests on a 10Kb/sec bandwidth.

For the streams separation case:
I noticed that the audio stream is being recorded earlier than the video stream (not always, it's intermittent). In this case I offset the audio so the end of both streams match and I cut off the extra audio at the beginning.
It looks like it's working this way... at least until now.

Charles
September 13, 2012 09:24 am

Daniel,

Have you tested on the most recent versions of Red5 whether stream separation is even needed? In August 2012 I did a project with Flash Media Server 4.5 and it would not throw out audio data when recording high quality. You should test the most recent Flash Player against the most recent build of the media server you want to use prior to separating the streams.

Doggystash,
You would probably just need to change the server-side configuration file for this particular red5 application.

No change should be needed in the client side code.

Doggystash
September 13, 2012 08:43 am

Hi,
What is the possible change if I'm running it on a Linux server?

Thanks

Daniel
September 12, 2012 8:46 pm

Hi guys,
I managed to record the 2 streams audio and video separately to the server using Red5.
I was wondering what makes the audio and video be out of sync and how could I know the time difference between the two?
If someone knows I would very much appreciate.
Thank you

Charles
August 20, 2012 08:21 am

Roi,

The primary thing to keep track of is the Netstream's bufferTime vs. its bufferLength. I have not used recent versions of Red5, but I have recently done a Flash Media Server 4.5 project on the Amazon cloud and have been able to record without dropping any audio or video by setting a high bufferTime before starting recording, then monitoring the bufferLength before calling the Netstream close method.

There has been chatter on the Red5 mailing list recently about patches to the recording functionality again, but I have not had the opportunity to examine the recent versions.
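A sketch of the approach described above, assuming a NetStream named ns that is already publishing in record mode; the 60-second buffer and 250 ms polling interval are illustrative values:

```actionscript
import flash.events.TimerEvent;
import flash.utils.Timer;

// Before recording: a large buffer lets the player queue data
// instead of discarding it when bandwidth runs short.
ns.bufferTime = 60;

// When the user stops recording, detach the devices but do NOT
// close immediately; wait for the send buffer to drain first.
function stopRecording():void {
    ns.attachCamera(null);
    ns.attachAudio(null);
    var t:Timer = new Timer(250);
    t.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void {
        if (ns.bufferLength == 0) { // all queued data has reached the server
            t.stop();
            ns.close();
        }
    });
    t.start();
}
```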

Roi
August 19, 2012 12:36 am

Hi Charles,
Thanks again for this great post.
You mention that there is a way to record in one stream without throwing away video frames, but it doesn't work 100% of the time.
Can you plz elaborate on this one?
Best,
Roi

Charles
June 22, 2012 09:12 am

Luke,

If you're using the very newest version of Red5 (I think past RC1), there were major architecture changes that may keep this server-side application from working.

It seems like the logs would be able to help you track down the root problem.

Luke
June 22, 2012 08:02 am

Thanks again. I tested the demo apps and they're running, so I guess I have to do some checking regarding the compiled app.

Thanks.

Charles
June 21, 2012 12:46 am

Luke,

I'd make sure you're using the compiled version of the application. Also, I'd suggest first testing against one of the Red5 sample applications like publisher or oflaDemo. You can record to those applications without doing anything special.

luke
June 20, 2012 1:30 pm

Thanks Charles... I restarted Red5 and am testing it locally on the network. For the Media Application Server URI I'm using "rtmp://10.192.28.210/recording_merger", and I put the "recording_merger" folder in the webapps directory. When I click connect I get "STATUS: Netconnection Closed to: rtmp://10.192.28.210/recording_merger" - should I be using something else?

Thanks again!

Charles
June 20, 2012 11:47 am

Luke,

If you have downloaded the compiled server-side application, it should be picked up by Red5 when you start the server. It's been about a year since I wrote this, but I don't think any config is needed other than what I mention above. You can test by making a NetConnection to the server using the application name. You should see Red5 respond in the scrolling console window.

luke
June 20, 2012 10:21 am

Hi all... sorry for the noob question, as I'm an AS3 developer and not a Java developer, but how do I deploy the app to Red5? I put the files in the webapps dir... what's next?

Thanks again for all your help and can't wait to get this working!

Charles
May 24, 2012 09:22 am

Krx,

I'm glad to hear you were able to sort things out for the most part. Good luck with your project!

Krx
May 24, 2012 06:39 am

Hi Charles
I just wanted to say a BIG THANKS for this excellent post. It helped me a lot to properly understand (most of) the headache-causing issues behind badly recorded videos.
Regards

Charles
March 7, 2012 5:53 pm

Good luck Alfredo. With Flash and Red5, even when on a LAN getting quality can be difficult due to all the issues mentioned in this article.

Good luck with your application!

Alfredo
March 7, 2012 12:25 am

Thank you Charles, I will use:

- one High Quality RTMP stream for recording
- one Medium/Low Quality RTMP stream for live
- one medium/Low Quality RTMPT stream for live for firewalled clients

I suppose that the best quality for recording is by using LAN, but in my case, it is impossible and I have to do it in my server in Amazon Ec2...

Charles
March 6, 2012 1:50 pm

Hi Alfredo,

If lower quality video is acceptable, I'd do exactly what you mentioned and keep all the streams together. It makes things much simpler. In a livestream example it would be acceptable to have temporary video frame drops.

You could also publish one stream at a lower quality live, and a separate at high quality for the recording (which you may want to break up the audio and video for).

I saw someone on the Red5 mailing list mention a solution that I think will solve the sync issue with splitting the audio and video: save the record start times on the server, then use those in conjunction with ffmpeg on the final stream. I haven't had time to test it yet though.

I guess if it were me I'd go with your number 1, in order to provide some flexibility when setting the quality of your live stream vs. recordings.

Alfredo
March 6, 2012 11:19 am

Hi Charles,

I would like to publish a live stream and to record this stream in order to play it in the future...
Which is the best solution?
1. To handle 2 separate streams: one for live stream and one for recording
2. To handle 3 separate streams: one for live, and the other ones for audio+video recording separately
3. To do it all in a single stream

For simplicity, I was thinking of the 3rd one...

Thank you in advance.
Best regards,
Alfredo

Charles
January 17, 2012 2:20 pm

Hi Umar,

I've fooled around with Xuggler and it's a great project. I haven't ever used it in a professional solution though.

Umar
January 17, 2012 01:51 am

Hey Charles,
Have you ever used Xuggler with Red5 to combine and edit videos? Please let me know :D

Charles
January 9, 2012 08:57 am

Also, your mic rate is way too high. If you're streaming video and audio over the same Netstream, I guarantee a mic setting of 44 (44.1 kHz) will cause video frames to be dropped. Try dropping it down to 22 or 11.

Charles
January 9, 2012 08:53 am

Hi Roi and Umar,

Roi,

Concerning FileConsumer, I don't have any experience examining the guts of that class. For my exhibits, all on the same computer or on a LAN, considerations in the client have handled all my problems, so I have not had to dive into the server code. I apologize for not being able to give much insight in that regard.

Umar,

I am suspecting that your quality settings are way too high and that the buffer of your Netstream can be adjusted. The encoders inside the Flash Player (except the new H.264 encoder, which I do not yet have experience with) are very old, so achieving anything comparable to the H.264 video we are all commonly used to in 2012 takes a lot of bandwidth.
I know there are methods to calculate the bandwidth a stream is likely to use; I just can't find them right now. Two articles that may help you are the one about FMS from Adobe, and this.

So let's look at your client code now.
1. For setQuality (here is the Camera code reference), you are basically telling the Flash player to do whatever is necessary to maintain 90% quality at 204800 bytes/second. That means it will drop frames to stay within that bandwidth requirement.

Try camera.setQuality(204800,0) or camera.setQuality(0,90). With your current settings the player has no choice but to drop frames if it encounters problems.

In my experience, even at 640x480 on a local computer, a 90% quality setting with a Logitech QuickCam 9000 would generate so much data that it was impractical to get anything out of the Flash Player.

You may need to look at an alternative to Flash for your solution. I'd check the H.264 encoders (which have special requirements; I think AIR 3) and see if they do any better.

2. Before ditching Flash, also take a look at your Netstream (here's the Netstream reference). On the system with the camera, ensure your buffer time is pretty high, e.g. 60 (this may delay the video getting sent by a bit, but it will ensure it's all intact if the player doesn't exceed that buffer time).

On the receiving system, you may also want to set the buffer time high (like 1 or 2) if the video is stuttering but not losing frames. You're sending a lot of data, so the receiving system may not be getting the data fast enough to display it smoothly. By setting the buffer time higher, you should see behavior where the buffered video plays, then sits frozen for a while, then plays more once the buffer fills up again.

It's a complex balance.
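To make point 1 above concrete, here is a sketch of the two suggested setQuality calls; the 204800 bytes/sec and 90% figures come from Umar's code below, and the rest is illustrative:

```actionscript
import flash.media.Camera;

var camera:Camera = Camera.getCamera();
camera.setMode(640, 480, 15);

// Option A: cap bandwidth at 200 KB/s and let picture quality vary.
camera.setQuality(204800, 0);

// Option B: hold 90% picture quality and let bandwidth vary as needed.
// camera.setQuality(0, 90);

// Setting both (204800, 90) forces the player to drop frames whenever
// it cannot meet the quality target within the bandwidth cap.
```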

Umar
January 8, 2012 11:15 pm

Hi Charles
Thanks for your reply.
Right now I am not recording anything; I am just trying to display video. On the LAN the lag is very small, but on the server the lag is huge. The code on the client side is:
camera = Camera.getCamera();
mic = Microphone.getMicrophone();
mic.setLoopBack(false); // prevent input from being routed back to local speakers - helps reduce feedback in some conditions
mic.setUseEchoSuppression(true);
mic.rate = 44;
camera.setMode(panelWidth, panelHeight, 29.97); // 800*600
camera.setQuality(204800, 90);

We need this level of quality. We have a very fast internet connection, so bandwidth should not be a problem.
What configuration do we need on the Red5 server side?
Note that I am not an expert with Red5.

Roi
January 8, 2012 11:27 am

Hi Charles,

Great tip with the "queueThreshold" issue. It indeed works great - a high threshold for large files, and a low threshold for small files. The thing is, we don't always know in advance what length the user will record.
My suspicion is that a tweak is needed in the "FileConsumer" class (the pushMessage method?) to handle the case where the threshold is not reached.
Any idea?

Best,
Roi

Charles
January 5, 2012 10:11 am

I just updated the article to include the queueThreshold.

Charles
January 5, 2012 09:43 am

Hi Umar,

If your buffertime is high in the client application, let's take a look at the server. I realized I left a critical piece of information out of this article.

In Red5 1 there is an additional configuration item in conf/red5-common.xml that HDFVR highlighted - the queueThreshold. They essentially recommend increasing it a lot.

Be careful with Red5 RC1 if you're using the official release on Google Code. I wasn't able to get recording to work with that.

Umar
January 5, 2012 03:20 am

Hey,
If your application experiences any bandwidth issues, video frames will be thrown out at the client, so you'll experience jumpy recorded video. (There is a way to deal with this, but it doesn't work 100% of the time).

I am experiencing the same issue. Can you tell me how to resolve it? There is a 2-3 second lag on localhost, and when I use a server the lag increases to 5+ seconds and the video is far jerkier.

Charles
January 2, 2012 12:40 am

Hi Guy. Thanks for explaining it better.

I doubt you want to, but probably by far the easiest way to fix the issue you're having would be to record both audio and video on the same netstream into one file. In order to get this to work, you'll just have to ensure your camera and microphone quality settings are set really low.

When I did a few kiosks this way with Red5 and the application on the same computer in AIR I used to have really good luck just playing back the separate streams at the same time when showing visitors a preview of their recording before they saved/posted it. I'm betting your situation is a lot more complicated though.

I guess an easy thing to ask about would be the version of Red5 you're using? I've only ever deployed this solution on .8 Final. All the .9's and 1.0 RC1 have issues with recording.

Regardless even on .8 and the current svn version of 1, if the streams are separated I'm betting you may still encounter sync issues.

Guy
January 2, 2012 04:27 am

Hi Charles.

Thanks for the quick answer.
I indeed implemented the suggested solution with FFMPEG. The problem is that it just takes too much time to transcode the 2 streams into one file.

Since the offset between the streams is already known, I'm trying to schedule the playing of the streams from ActionScript in order to have them synced.
So far I haven't succeeded in achieving this goal.
Various attempts included the Video component and the FLVPlayback component, using the "seek" method (which isn't accurate, as it rounds to the nearest keyframe), cue points, and timer delay.

What is the best way to implement such a solution, when taking into consideration all relevant issues, including buffers etc.?

Best,
Guy

Charles
January 1, 2012 2:49 pm

Hi Guy.

In the doRecordingPlayback method I set the buffer time back to a low value (by default the Netstream classes set it at .1 second) because otherwise the Flash player would have to download however much data is needed to play back that many seconds of content before starting streaming playback.

If you separated the audio and video into separate streams and files, you will likely intermittently encounter a sync problem between the two files. A few methods to address this were listed in the "Audio/Video Sync Warning" section toward the top. If it's happening every single recording, then there may be something else going on.

When the Flash Player's buffer fills (it downloads however much data you have set the buffer time to), it will fire a NetStatus buffer full event. I'd guess that if you're recording over a network and your audio and video streams' buffer times are set to the same value during playback, the audio will likely fill its buffer before the video (audio should have less data than the video and take less time to download).

I hope that gives you some ideas of where to look.
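The buffer events mentioned above can be watched like this on a playback NetStream; the stream name and 2-second buffer are illustrative, and ns is assumed to be an already-connected NetStream:

```actionscript
import flash.events.NetStatusEvent;
import flash.net.NetStream;

ns.bufferTime = 2; // seconds of data to accumulate before playback starts
ns.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
    switch (e.info.code) {
        case "NetStream.Buffer.Full":  // buffer filled; playback (re)starts
            trace("buffer full, playing");
            break;
        case "NetStream.Buffer.Empty": // data ran out; playback stalls
            trace("buffer empty, waiting for more data");
            break;
    }
});
ns.play("flvVideo");
```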

Guy
January 1, 2012 08:16 am

Hi Charles,

This tutorial is indeed very helpful!
My question is about the "doRecordingPlayback" method:
what do you mean by BAD buffer size for playback?

I am facing a scenario in which there is an offset between the audio and video streams. AS behaves in a weird way and I can't manage to sync the streams.
I know the offset from the FLV files' duration gap - what is the best way to play those files? Both the timer solution and using the seek method didn't work for me.

Any clue?

Best!

Roi
November 28, 2011 10:32 am

Hi Charles,
It's indeed just what I needed.
Many thanks!

Charles
November 28, 2011 00:45 am

Hi Roi. I usually work in AIR, but I'm pretty sure this is what you need to check out.

When your application is initialized, you'll need to set your desired scale mode.

http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/display/StageScaleMode.html

I also think that in your embed HTML for the swf you need to set the width and height to 100%.

Roi
November 27, 2011 03:47 am

Great tutorial.
When I embed the swf in an HTML file, the recorder has white margins all around, i.e. it's not fastened to its border.
Any clue how I can solve it?
Thanks!

Jason
September 6, 2011 07:12 am

Thanks. It helped me out.

Mike
July 27, 2011 6:45 pm

Awesome, thanks Charles. Email sent.

Charles
July 25, 2011 11:23 pm

You can e-mail me using the contact information in the "Resume" section of the site.

Thanks Mike.

Mike
July 25, 2011 12:09 am

Hi Charles, thanks for the response and ETA. Is there a more non-public way I can contact you with some questions? Cheers.

Charles
July 22, 2011 9:56 pm

Mike - I can't currently offer these services but towards the end of September 2011 I should be able to if you are still interested.

Thanks for asking.

Charles
July 22, 2011 9:54 pm

Roi & J-
A general good practice I've found with ffmpeg when combining audio and video is to be as specific as you can. So let's say in Flash you recorded at 15 fps, 320x240. Let's also say you recorded audio at 44,100 Hz.

This would translate to the ffmpeg commands -r 15 (-r is framerate) -ar 44100 (-ar is audio rate) -s 320x240 (-s is video size)

So let's say I recorded flvVideo.flv and flvAudio.flv; with the parameters mentioned above my command would be:

ffmpeg -i flvVideo.flv -i flvAudio.flv -ar 44100 -s 320x240 -r 15 CombinedFile.flv

Mike
July 22, 2011 8:29 pm

this is fantastic. I was just wondering if you offered any services to create something similar for my site. I have a site running under PHP, MySQL and Apache. All I'm looking for is a Flash app that allows a visitor to record only audio using their microphone and when they hit a button like "Save", it will upload it as an .mp3 on my server. Thanks!

Roi
July 20, 2011 10:10 am

Thanks a lot, you can't imagine how much you helped me.

I am facing a sync problem when merging the audio and video streams. I have tried to use the ffmpeg "async" flag with various numbers, but I don't really understand how it works.
Any suggestions for possible solutions will be appreciated :)

Regards
Roi

Charles
July 17, 2011 12:05 am

Hi J. Thanks for trying it out. I'd investigate the ffmpeg -async parameter, as that clears up a lot of the issue for me. You may also want to manually specify the audio rate in your ffmpeg command line parameters.
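For example, something along these lines; the file names echo the earlier ffmpeg comment, and the -async value of 1 is illustrative (1 corrects only the start of the audio, while values greater than 1 stretch or squeeze the audio to match the timestamps):

```shell
ffmpeg -i flvVideo.flv -i flvAudio.flv -async 1 -ar 44100 -r 15 CombinedFile.flv
```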

J
July 16, 2011 9:50 pm

Fixed!

Had the streams configuration pointing at the wrong folder.
Great job!

So, any thoughts on what to do about how out of sync the audio and video are even when merged?

J
July 16, 2011 9:44 pm

To be clear, Red5 is producing this error on attempted merge (recording to the server works just fine):

[ERROR] [NioProcessor-1] org.red5.server.service.ServiceInvoker - Method initiateTranscoder with parameters [RecordTest_31816] not found in org.red5.server.adapter.ApplicationAdapter@1e4905a

J
July 16, 2011 9:38 pm

Thanks for this. It's a big step in the right direction.

I seem to be unable to get ant to compile Red5 and your code from source. It successfully compiles Red5, but ignores your project completely, even after modifying the build.xml file.

Might you be able to explain exactly what needs to change to get this auto-compilation to work and the steps involved? Using your pre-compiled version didn't work for me as I am using an older version of the JDK (1.6.0_15) than it appears you used to compile.

Comments are currently disabled.