

From: Bastiaan Jacques
Subject: [Gnash-commit] gnash ChangeLog configure.ac libmedia/Makefile....
Date: Mon, 21 Jan 2008 07:07:28 +0000

CVSROOT:        /sources/gnash
Module name:    gnash
Changes by:     Bastiaan Jacques <bjacques>     08/01/21 07:07:28

Modified files:
        .              : ChangeLog configure.ac 
        libmedia       : Makefile.am 
        server         : namedStrings.cpp namedStrings.h 
        server/asobj   : NetConnection.cpp NetConnection.h NetStream.cpp 
                         NetStream.h NetStreamGst.cpp NetStreamGst.h 
                         Sound.cpp SoundGst.cpp SoundGst.h 
Added files:
        libmedia/gst   : gstflvdemux.c gstflvdemux.h gstflvparse.c 
                         gstflvparse.h 

Log message:
                * configure.ac: Disable the use of mad and ffmpeg, because
                those media backends are no longer maintained.
                * libmedia/Makefile.am: Compile the newly added
                gstflvdemux.{h,c} and gstflvparse.{h,c}. These files are
                Gstreamer's new FLV demuxer, which will be loaded on the fly
                if it is not available on the system.
                * server/namedStrings.{h,cpp}: Add a symbol for onMetaData.
                * server/asobj/NetConnection.cpp: Replace openConnection with
                validateURL, since that name more accurately reflects its
                current purpose. Media handlers are now in charge of starting
                their own network connections. Remove now-unused methods.
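                For illustration, the contract of the renamed method can be
                sketched stand-alone. This is a simplified sketch, not the code
                in the patch: the real implementation resolves the URL against
                the movie's base URL and consults URLAccessManager, whereas the
                allow() predicate below is a hypothetical stand-in.

```cpp
#include <cassert>
#include <string>

// Hypothetical stand-in for gnash's URLAccessManager::allow(); the real
// check consults the configured security policy for the resolved URL.
static bool allow(const std::string& url) {
    return url.rfind("file://", 0) != 0; // e.g. deny local files
}

// Sketch of NetConnection::validateURL(): join the optional prefix given
// to connect() with the requested url, then return the complete URL on
// success or an empty string when access is denied.
std::string validateURL(const std::string& prefixUrl, const std::string& url) {
    std::string completeUrl;
    if (!prefixUrl.empty()) {
        completeUrl = prefixUrl + "/" + url;
    } else {
        completeUrl = url;
    }
    if (!allow(completeUrl)) {
        return ""; // callers treat an empty string as "connection refused"
    }
    return completeUrl;
}
```

                An empty return value signals a refused URL, which is why
                callers check for an empty string rather than a boolean.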
                * server/asobj/NetStream.{cpp,h}: Use symbolic names for pause
                modes. Implement onMetaData through processMetaData.
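                The symbolic pause modes keep the old numeric values
                (-1 toggle, 0 pause, 1 unpause). A minimal sketch of how a
                handler might resolve them against its current state follows;
                resolvePause() is a hypothetical helper, not part of the patch.

```cpp
#include <cassert>

// Mirrors the PauseMode enum added to NetStream.h; the numeric values
// match the old magic numbers for compatibility.
enum PauseMode {
    pauseModeToggle = -1,
    pauseModePause = 0,
    pauseModeUnPause = 1
};

// Hypothetical helper: given the current paused state, compute the new
// paused state for a requested mode.
bool resolvePause(PauseMode mode, bool currentlyPaused) {
    switch (mode) {
        case pauseModeToggle:  return !currentlyPaused;
        case pauseModePause:   return true;
        case pauseModeUnPause: return false;
    }
    return currentlyPaused; // unreachable for valid modes
}
```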
                * server/asobj/NetStreamGst.{h,cpp}: Reimplement the Gstreamer
                NetStream handler. Gstreamer now takes care of URL resolving,
                downloading, buffering, FLV decoding, and all the grocery
                shopping and house cleaning. Because Gstreamer abstracts much
                of the threading code, we rarely need to lock the main
                execution thread. The new handler has been tested to work with
                YouTube, Lulu.tv, streaming Ogg Theora/Vorbis, and so on;
                however, seeking in HTTP streams does not currently work.
                * server/asobj/Sound.cpp: Don't try to start a NetConnection,
                as doing so makes no sense in the base class.
                * server/asobj/SoundGst.cpp: Like NetStreamGst, SoundGst now
                lets Gstreamer handle downloading URLs and buffering data.
                SoundGst has also received a general cleanup.
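As a stand-alone illustration of the stack discipline processMetaData uses around the onMetaData callback: record the stack depth, push the metadata argument, invoke the handler, then drop anything left above the recorded depth. The vector below is a hypothetical stand-in for gnash's as_environment stack; names are illustrative only.

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <vector>

// Stand-in for gnash's as_environment stack; the real code uses
// m_env->push()/drop() around call_method().
using Stack = std::vector<std::string>;

// Sketch of the balancing idea in NetStream::processMetaData(): note the
// stack depth before pushing the argument, and restore it afterwards even
// if the callback leaves extra entries behind.
void invokeBalanced(Stack& stack, const std::string& metadata,
                    const std::function<void(Stack&)>& handler) {
    const size_t initialSize = stack.size();
    stack.push_back(metadata); // push the onMetaData argument
    handler(stack);            // invoke the callback
    if (stack.size() > initialSize) {
        stack.resize(initialSize); // clear leftovers, as the patch does
    }
}
```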

CVSWeb URLs:
http://cvs.savannah.gnu.org/viewcvs/gnash/ChangeLog?cvsroot=gnash&r1=1.5438&r2=1.5439
http://cvs.savannah.gnu.org/viewcvs/gnash/configure.ac?cvsroot=gnash&r1=1.473&r2=1.474
http://cvs.savannah.gnu.org/viewcvs/gnash/libmedia/Makefile.am?cvsroot=gnash&r1=1.9&r2=1.10
http://cvs.savannah.gnu.org/viewcvs/gnash/libmedia/gst/gstflvdemux.c?cvsroot=gnash&rev=1.1
http://cvs.savannah.gnu.org/viewcvs/gnash/libmedia/gst/gstflvdemux.h?cvsroot=gnash&rev=1.1
http://cvs.savannah.gnu.org/viewcvs/gnash/libmedia/gst/gstflvparse.c?cvsroot=gnash&rev=1.1
http://cvs.savannah.gnu.org/viewcvs/gnash/libmedia/gst/gstflvparse.h?cvsroot=gnash&rev=1.1
http://cvs.savannah.gnu.org/viewcvs/gnash/server/namedStrings.cpp?cvsroot=gnash&r1=1.7&r2=1.8
http://cvs.savannah.gnu.org/viewcvs/gnash/server/namedStrings.h?cvsroot=gnash&r1=1.9&r2=1.10
http://cvs.savannah.gnu.org/viewcvs/gnash/server/asobj/NetConnection.cpp?cvsroot=gnash&r1=1.52&r2=1.53
http://cvs.savannah.gnu.org/viewcvs/gnash/server/asobj/NetConnection.h?cvsroot=gnash&r1=1.36&r2=1.37
http://cvs.savannah.gnu.org/viewcvs/gnash/server/asobj/NetStream.cpp?cvsroot=gnash&r1=1.78&r2=1.79
http://cvs.savannah.gnu.org/viewcvs/gnash/server/asobj/NetStream.h?cvsroot=gnash&r1=1.54&r2=1.55
http://cvs.savannah.gnu.org/viewcvs/gnash/server/asobj/NetStreamGst.cpp?cvsroot=gnash&r1=1.64&r2=1.65
http://cvs.savannah.gnu.org/viewcvs/gnash/server/asobj/NetStreamGst.h?cvsroot=gnash&r1=1.29&r2=1.30
http://cvs.savannah.gnu.org/viewcvs/gnash/server/asobj/Sound.cpp?cvsroot=gnash&r1=1.25&r2=1.26
http://cvs.savannah.gnu.org/viewcvs/gnash/server/asobj/SoundGst.cpp?cvsroot=gnash&r1=1.15&r2=1.16
http://cvs.savannah.gnu.org/viewcvs/gnash/server/asobj/SoundGst.h?cvsroot=gnash&r1=1.7&r2=1.8

Patches:
Index: ChangeLog
===================================================================
RCS file: /sources/gnash/gnash/ChangeLog,v
retrieving revision 1.5438
retrieving revision 1.5439
diff -u -b -r1.5438 -r1.5439
--- ChangeLog   20 Jan 2008 19:38:03 -0000      1.5438
+++ ChangeLog   21 Jan 2008 07:07:26 -0000      1.5439
@@ -1,3 +1,31 @@
+2008-01-20 Bastiaan Jacques <address@hidden>
+
+       * configure.ac: Disable the use of mad and ffmpeg, because they
+       are no longer maintained media backends.
+       * libmedia/Makefile.am: Compile the newly added gstflvdemux.{h,cpp}
+       and gstflvparse.{h,cpp}. These files are Gstreamer's new FLV demuxer,
+       which will be loaded on-the-fly if it not available on the system.
+       * server/namedStrings.{h,cpp}: Add a symbol for onMetaData.
+       * server/asobj/NetConnection.cpp: Replace openConnection with
+       validateURL, since that name more accurately reflects its current
+       purpose. Media handlers will now be in charge of starting their own
+       network connections. Remove (now) unused methods.
+       * server/asobj/NetStream.{cpp,h}: Use symbolic names for pause modes.
+       Implement onMetaData through processMetaData.
+       * server/asobj/NetStreamGst.{h,cpp}: Reimplement the Gstreamer
+       NetStream handler. Gstreamer now takes care of URL resolving,
+       downloading, buffering, FLV decoding and all grocery shopping and
+       house cleaning. Because Gstreamer abstracts much of the threading
+       code, we rarely need to lock the main execution thread. The new
+       handler has been tested to work with YouTube, Lulu.tv, streaming
+       ogg-theora-vorbis etc. However, seeking HTTP streams does not
+       currently work.
+       * server/asobj/Sound.cpp: Don't try to start a NetConnection, as it
+       makes no sense in the base class.
+       * server/asobj/SoundGst.cpp: Like NetStreamGst, SoundGst now lets
+       Gstreamer handle downloading URLs and buffering data. SoundGst has
+       received a general cleanup.
+
 2008-01-20 Sandro Santilli <address@hidden>
 
        * server/character.{cpp,h}: keep a pointer from the mask to

Index: configure.ac
===================================================================
RCS file: /sources/gnash/gnash/configure.ac,v
retrieving revision 1.473
retrieving revision 1.474
diff -u -b -r1.473 -r1.474
--- configure.ac        17 Jan 2008 00:18:50 -0000      1.473
+++ configure.ac        21 Jan 2008 07:07:26 -0000      1.474
@@ -15,7 +15,7 @@
 dnl  Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA  02110-1301  USA
 dnl  
 
-dnl $Id: configure.ac,v 1.473 2008/01/17 00:18:50 rsavoye Exp $
+dnl $Id: configure.ac,v 1.474 2008/01/21 07:07:26 bjacques Exp $
 
 AC_PREREQ(2.50)
 AC_INIT(gnash, cvs)
@@ -565,27 +565,19 @@
 
 media_handler_specified=false
 AC_ARG_ENABLE(media,
- AC_HELP_STRING([--enable-media=handler], [Enable media handling support using the specified handler: ffmpeg, mad, GST (Gstreamer) or none (no sound) [[ffmpeg]] ]),
+ AC_HELP_STRING([--enable-media=handler], [Enable media handling support using the specified handler: gst or none (no sound) [[none]] ]),
  [case "${enableval}" in
    GST|gst)
      media_handler=gst
      media_handler_specified=true
      ;;
-   ffmpeg|FFMPEG)
-     media_handler=ffmpeg 
-     media_handler_specified=true
-     ;;
-   mad|MAD)
-     media_handler=mad 
-     media_handler_specified=true
-     ;;
    no|NO|none)
      media_handler=none
      media_handler_specified=true
      ;;
    *) AC_MSG_ERROR([bad value ${enableval} for --enable-media option]) ;;
   esac],
- [media_handler=ffmpeg]
+ [media_handler=gst]
 )
 
 AC_ARG_ENABLE(lirc, AC_HELP_STRING([--enable-lirc], [Disable support for Lirc]),
@@ -1574,14 +1566,6 @@
 
 case "${media_handler}" in
   gst) AC_DEFINE([SOUND_GST],  [1], [Use GSTREAMER for media handling]) ;;
-  ffmpeg) AC_DEFINE([SOUND_SDL],  [1], [Use SDL for sound handing]) ;;
-  mad) AC_DEFINE([SOUND_SDL],  [1], [Use SDL for sound handling]) ;;
-  *)
-esac
-
-case "${media_handler}" in
-  ffmpeg)  AC_DEFINE([USE_FFMPEG],  [1], [Use FFMPEG for media decoding]) ;;
-  mad)  AC_DEFINE([USE_MAD],  [1], [Use MAD for media decoding]) ;;
   *)
 esac
 

Index: libmedia/Makefile.am
===================================================================
RCS file: /sources/gnash/gnash/libmedia/Makefile.am,v
retrieving revision 1.9
retrieving revision 1.10
diff -u -b -r1.9 -r1.10
--- libmedia/Makefile.am        10 Dec 2007 19:21:01 -0000      1.9
+++ libmedia/Makefile.am        21 Jan 2008 07:07:26 -0000      1.10
@@ -103,7 +103,9 @@
                gst/gstappsink.c \
                gst/gstappsrc.c \
                gst/sound_handler_gst.cpp \
-               gst/MediaDecoderGst.cpp
+               gst/MediaDecoderGst.cpp \
+               gst/gstflvdemux.c \
+               gst/gstflvparse.c 
        
    noinst_HEADERS += \
                gst/gstgnashsrc.h \
@@ -113,7 +115,9 @@
                gst/gstappsink.h \
                gst/gstappsrc.h \
                gst/sound_handler_gst.h \
-               gst/MediaDecoderGst.h
+               gst/MediaDecoderGst.h \
+               gst/gstflvdemux.h \
+               gst/gstflvparse.h
 
    libgnashmedia_la_CPPFLAGS += \
                $(GSTREAMER_CFLAGS)

Index: server/namedStrings.cpp
===================================================================
RCS file: /sources/gnash/gnash/server/namedStrings.cpp,v
retrieving revision 1.7
retrieving revision 1.8
diff -u -b -r1.7 -r1.8
--- server/namedStrings.cpp     26 Dec 2007 08:04:59 -0000      1.7
+++ server/namedStrings.cpp     21 Jan 2008 07:07:27 -0000      1.8
@@ -60,6 +60,7 @@
        { "onRollOver", NSV::PROP_ON_ROLL_OVER },
        { "onSelect", NSV::PROP_ON_SELECT },
        { "onStatus", NSV::PROP_ON_STATUS },
+       { "onMetaData", NSV::PROP_ON_META_DATA },
        { "_parent", NSV::PROP_uPARENT },
        { "_root", NSV::PROP_uROOT },
        { "_global", NSV::PROP_uGLOBAL },

Index: server/namedStrings.h
===================================================================
RCS file: /sources/gnash/gnash/server/namedStrings.h,v
retrieving revision 1.9
retrieving revision 1.10
diff -u -b -r1.9 -r1.10
--- server/namedStrings.h       26 Dec 2007 08:04:59 -0000      1.9
+++ server/namedStrings.h       21 Jan 2008 07:07:27 -0000      1.10
@@ -82,6 +82,7 @@
                PROP_ON_ROLL_OVER,
                PROP_ON_SELECT,
                PROP_ON_STATUS,
+               PROP_ON_META_DATA,
                PROP_uPARENT,
                PROP_uROOT,
                PROP_uGLOBAL,

Index: server/asobj/NetConnection.cpp
===================================================================
RCS file: /sources/gnash/gnash/server/asobj/NetConnection.cpp,v
retrieving revision 1.52
retrieving revision 1.53
diff -u -b -r1.52 -r1.53
--- server/asobj/NetConnection.cpp      26 Dec 2007 20:30:14 -0000      1.52
+++ server/asobj/NetConnection.cpp      21 Jan 2008 07:07:27 -0000      1.53
@@ -17,7 +17,7 @@
 // Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA  02110-1301  USA
 //
 
-/* $Id: NetConnection.cpp,v 1.52 2007/12/26 20:30:14 strk Exp $ */
+/* $Id: NetConnection.cpp,v 1.53 2008/01/21 07:07:27 bjacques Exp $ */
 
 #ifdef HAVE_CONFIG_H
 #include "config.h"
@@ -52,8 +52,7 @@
 
 NetConnection::NetConnection()
        :
-       as_object(getNetConnectionInterface()),
-       _loader()
+       as_object(getNetConnectionInterface())
 {
        attachProperties();
 }
@@ -63,30 +62,17 @@
 }
 
 /*public*/
-bool NetConnection::openConnection(const std::string& url)
+std::string NetConnection::validateURL(const std::string& url)
 {
 
-       // if already running there is no need to setup things again
-       if (_loader.get())
-       {
-               log_debug("NetConnection::openConnection() called when already connected to a stream. Checking if the existing connection can be used.");
-               std::string newurl;
+       std::string completeUrl;
                if (_prefixUrl.size() > 0) {
-                       newurl += _prefixUrl + "/" + url;
+               completeUrl += _prefixUrl + "/" + url;
                } else {
-                       newurl += url;
-               }
-               if (newurl.compare(_completeUrl) == 0) return true;
-               else return false;
+               completeUrl += url;
        }
 
-       if (_prefixUrl.size() > 0) {
-               _completeUrl += _prefixUrl + "/" + url;
-       } else {
-               _completeUrl += url;
-       }
-
-       URL uri(_completeUrl, get_base_url());
+       URL uri(completeUrl, get_base_url());
 
        std::string uriStr(uri.str());
        assert(uriStr.find("://")!=string::npos);
@@ -94,22 +80,14 @@
        // Check if we're allowed to open url
        if (!URLAccessManager::allow(uri)) {
                log_security(_("Gnash is not allowed to open this url: %s"), uriStr.c_str());
-               return false;
+               return "";
        }
 
        log_msg(_("Connecting to movie: %s"), uriStr.c_str());
 
-       _loader.reset( new LoadThread() );
-       
-       if (!_loader->setStream(std::auto_ptr<tu_file>(StreamProvider::getDefaultInstance().getStream(uri)))) {
-               log_error(_("Gnash could not open this url: %s"), uriStr.c_str());
-               _loader.reset();
-               return false;
-       }
-
        log_msg(_("Connection etablished to movie: %s"), uriStr.c_str());
 
-       return true;
+       return uriStr;
 }
 
 /*public*/
@@ -126,75 +104,6 @@
        _prefixUrl += url;
 }
 
-/*public*/
-bool
-NetConnection::eof()
-{
-       if (!_loader.get()) return true; // @@ correct ?
-       return _loader->eof();
-}
-
-/*public*/
-size_t
-NetConnection::read(void *dst, size_t bytes)
-{
-       if (!_loader.get()) return 0; // @@ correct ?
-       return _loader->read(dst, bytes);
-}
-
-/*public*/
-bool
-NetConnection::seek(size_t pos)
-{
-       if (!_loader.get()) return false; // @@ correct ?
-       return _loader->seek(pos);
-}
-
-/*public*/
-size_t
-NetConnection::tell()
-{
-       if (!_loader.get()) return 0; // @@ correct ?
-       return _loader->tell();
-}
-
-/*public*/
-long
-NetConnection::getBytesLoaded()
-{
-       if (!_loader.get()) return 0; // @@ correct ?
-       return _loader->getBytesLoaded();
-}
-
-
-/*public*/
-long
-NetConnection::getBytesTotal()
-{
-       if (!_loader.get()) return 0; // @@ correct ?
-       return _loader->getBytesTotal();
-}
-
-/*public*/
-bool
-NetConnection::loadCompleted()
-{
-       if ( !_loader.get() ) return false; // @@ correct ?
-       return _loader->completed();
-}
-
-std::auto_ptr<FLVParser>
-NetConnection::getConnectedParser() const
-{
-       std::auto_ptr<FLVParser> ret;
-
-       if ( _loader.get() )
-       {
-               ret.reset(new FLVParser(*_loader));
-       }
-
-       return ret;
-}
 
 /// \brief callback to instantiate a new NetConnection object.
 /// \param fn the parameters from the Flash movie
@@ -213,6 +122,13 @@
 as_value
 NetConnection::connect_method(const fn_call& fn)
 {
+       // NOTE:
+       //
+       // NetConnection::connect() is *documented*, I repeat, *documented*, to require the
+       // "url" argument to be NULL in AS <= 2. This is *legal* and *required*. Anything
+       // other than NULL is undocumented behaviour, and I would like to know if there
+       // are any movies out there relying on it. --bjacques.
+
        GNASH_REPORT_FUNCTION;
 
        boost::intrusive_ptr<NetConnection> ptr = ensureType<NetConnection>(fn.this_ptr);
@@ -228,15 +144,25 @@
        as_value& url_val = fn.arg(0);
 
        // Check first arg for validity 
-       if ( url_val.is_null() || url_val.is_undefined() )
+       if ( url_val.is_null())
        {
+               // Null URL was passed. This is expected. Of course, it also makes this
+               // function (and, this class) rather useless. We return true, even though
+               // returning true has no meaning.
+               
+               return as_value(true);
+       }
+
+       // The remainder of this function is undocumented.
+       
+       if (url_val.is_undefined()) {
                IF_VERBOSE_ASCODING_ERRORS(
-               std::stringstream ss; fn.dump_args(ss);
-               log_aserror(_("NetConnection.connect(%s): invalid first arg"), ss.str().c_str());
+                log_aserror(_("NetConnection.connect(): first argument shouldn't be undefined"));
                );
                return as_value(false);
        }
 
+
        /// .. TODO: checkme ... addToURL ?? shoudnl't we attempt a connection ??
        ptr->addToURL(url_val.to_string());
 
@@ -289,10 +215,8 @@
 
        if ( fn.nargs == 0 ) // getter
        {
-               // TODO: define a NetConnection::isConnected method
-               return as_value((bool)ptr->_loader.get());
-               //log_unimpl("NetConnection.isConnected get");
-               //return as_value();
+               log_unimpl("NetConnection.isConnected get");
+         return as_value();
        }
        else // setter
        {

Index: server/asobj/NetConnection.h
===================================================================
RCS file: /sources/gnash/gnash/server/asobj/NetConnection.h,v
retrieving revision 1.36
retrieving revision 1.37
diff -u -b -r1.36 -r1.37
--- server/asobj/NetConnection.h        2 Aug 2007 06:03:43 -0000       1.36
+++ server/asobj/NetConnection.h        21 Jan 2008 07:07:27 -0000      1.37
@@ -15,7 +15,7 @@
 // along with this program; if not, write to the Free Software
 // Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA  02110-1301  USA
 
-/* $Id: NetConnection.h,v 1.36 2007/08/02 06:03:43 strk Exp $ */
+/* $Id: NetConnection.h,v 1.37 2008/01/21 07:07:27 bjacques Exp $ */
 
 #ifndef __NETCONNECTION_H__
 #define __NETCONNECTION_H__
@@ -75,65 +75,7 @@
        /// RTMP. Newer Flash movies have a parameter to connect which is a
        /// URL string like rtmp://foobar.com/videos/bar.flv
        ///
-       bool openConnection(const std::string& url);
-
-       /// Put read pointer at given position
-       //
-       /// If the position has not been loaded yet
-       /// this call blocks. If not connected false
-       /// is returned w/out blocking.
-       ///
-       bool seek(size_t pos);
-
-       /// Read 'bytes' bytes into the given buffer.
-       //
-       /// If not enough bytes have been loaded yet
-       /// this call blocks. If not connected false
-       /// is returned w/out blocking.
-       ///
-       /// Return number of actually read bytes
-       ///
-       size_t read(void *dst, size_t bytes);
-
-       /// Return true if EOF has been reached
-       //
-       /// This call never blocks.
-       /// If not connected, true is returned (is this correct behaviour?)
-       ///
-       bool eof();
-
-       /// Report global position within the file
-       //
-       /// This call never blocks.
-       /// If not connected, 0 is returned (is this correct behaviour?)
-       ///
-       size_t tell();
-
-       /// Returns the number of bytes cached
-       //
-       /// This call never blocks.
-       /// If not connected, 0 is returned (is this correct behaviour?)
-       ///
-       long getBytesLoaded();
-
-       /// Returns the total size of the file
-       //
-       /// This call never blocks.
-       /// If not connected, 0 is returned (is this correct behaviour?)
-       ///
-       long getBytesTotal();
-
-       /// Return an FLVParser using our LoadThread for input
-       //
-       /// If not connected, a NULL auto_ptr is returned.
-       ///
-       std::auto_ptr<FLVParser> getConnectedParser() const;
-
-       /// Returns whether the load is complete
-       //
-       /// This call never blocks.
-       ///
-       bool loadCompleted();
+       std::string validateURL(const std::string& url);
 
        /// Register the "NetConnection" constructor to the given global object
        static void registerConstructor(as_object& global);
@@ -146,12 +88,6 @@
        /// the url prefix optionally passed to connect()
        std::string _prefixUrl;
 
-       /// the complete url of the file
-       std::string _completeUrl;
-
-       /// The file/stream loader thread and interface
-       std::auto_ptr<LoadThread> _loader;
-
        /// Attach ActionScript instance properties
        void attachProperties();
 

Index: server/asobj/NetStream.cpp
===================================================================
RCS file: /sources/gnash/gnash/server/asobj/NetStream.cpp,v
retrieving revision 1.78
retrieving revision 1.79
diff -u -b -r1.78 -r1.79
--- server/asobj/NetStream.cpp  17 Dec 2007 22:24:59 -0000      1.78
+++ server/asobj/NetStream.cpp  21 Jan 2008 07:07:27 -0000      1.79
@@ -17,7 +17,7 @@
 // Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA  02110-1301  USA
 //
 
-/* $Id: NetStream.cpp,v 1.78 2007/12/17 22:24:59 strk Exp $ */
+/* $Id: NetStream.cpp,v 1.79 2008/01/21 07:07:27 bjacques Exp $ */
 
 #ifdef HAVE_CONFIG_H
 #include "config.h"
@@ -131,10 +131,11 @@
        boost::intrusive_ptr<NetStream> ns = ensureType<NetStream>(fn.this_ptr);
        
        // mode: -1 ==> toogle, 0==> pause, 1==> play
-       int mode = -1;
+       NetStream::PauseMode mode = NetStream::pauseModeToggle;
        if (fn.nargs > 0)
        {
-               mode = fn.arg(0).to_bool() ? 0 : 1;
+               mode = fn.arg(0).to_bool() ? NetStream::pauseModePause :
+                                            NetStream::pauseModeUnPause;
        }
        ns->pause(mode);        // toggle mode
        return as_value();
@@ -311,22 +312,10 @@
 {
        boost::intrusive_ptr<NetStream> ns = ensureType<NetStream>(fn.this_ptr);
 
-       bool warned = false;
-       if ( ! warned ) {
-               log_unimpl("NetStream.currentFPS getter/setter");
-               warned = true;
-       }
-       if ( fn.nargs == 0 ) // getter
-       {
-               return as_value();
-       }
-       else // setter
-       {
-               return as_value();
-       }
+       return as_value(ns->getCurrentFPS());
 }
 
-// Both a getter and a (do-nothing) setter for bufferLength
+// read-only property bufferLength: amount of time buffered before playback
 static as_value
 netstream_bufferLength(const fn_call& fn)
 {
@@ -451,6 +440,42 @@
 
 }
 
+
+void
+NetStream::processMetaData(boost::intrusive_ptr<as_object>& metadata_obj)
+{
+       // TODO: check for System.onStatus too ! use a private getStatusHandler() method for this.
+       as_value handler;
+       if (!get_member(NSV::PROP_ON_META_DATA, &handler) || ! handler.is_function())
+       {
+#ifdef GNASH_DEBUG_METADATA
+         log_debug("No onMetaData handler");
+#endif
+               return;
+       }
+
+       size_t initialStackSize = m_env->stack_size();
+       if ( initialStackSize > 0 )
+       {
+               log_debug("NetStream environment stack not empty at start of processMetaData");
+       }
+
+#ifdef GNASH_DEBUG_METADATA
+  log_debug(" Invoking onMetaData");
+#endif
+
+  m_env->push(as_value(metadata_obj.get()));
+  call_method(handler, m_env, this, 1, m_env->get_top_index() );
+
+       // clear the stack after method execution
+       if ( m_env->stack_size() > initialStackSize )
+       {
+               log_debug("NetStream environment stack not empty at end of processMetaData. Fixing.");
+               m_env->drop(m_env->stack_size() - initialStackSize);
+       }
+}
+
+
 void
 NetStream::processStatusNotifications()
 {
@@ -517,20 +542,6 @@
 }
 
 long
-NetStream::bytesLoaded()
-{
-       if (_netCon == NULL) return 0;
-       return _netCon->getBytesLoaded();
-}
-
-long
-NetStream::bytesTotal()
-{
-       if (_netCon == NULL) return 0;
-       return _netCon->getBytesTotal();
-}
-
-long
 NetStream::bufferLength()
 {
        if (m_parser.get() == NULL) return 0;

Index: server/asobj/NetStream.h
===================================================================
RCS file: /sources/gnash/gnash/server/asobj/NetStream.h,v
retrieving revision 1.54
retrieving revision 1.55
diff -u -b -r1.54 -r1.55
--- server/asobj/NetStream.h    10 Jan 2008 17:34:46 -0000      1.54
+++ server/asobj/NetStream.h    21 Jan 2008 07:07:28 -0000      1.55
@@ -19,7 +19,7 @@
 //
 //
 
-/*  $Id: NetStream.h,v 1.54 2008/01/10 17:34:46 strk Exp $ */
+/*  $Id: NetStream.h,v 1.55 2008/01/21 07:07:28 bjacques Exp $ */
 
 #ifndef __NETSTREAM_H__
 #define __NETSTREAM_H__
@@ -88,6 +88,7 @@
                invalidTime
        };
 
+
        boost::intrusive_ptr<NetConnection> _netCon;
 
        /// Set stream status.
@@ -122,6 +123,9 @@
        ///
        void processStatusNotifications();
 
+       
+       void processMetaData(boost::intrusive_ptr<as_object>& metadata_obj);
+
        /// The actionscript enviroment for the AS callbacks
        //
        /// TODO: research on safety of this: who's the owner of the as_environment ?
@@ -187,6 +191,12 @@
 
 public:
 
+       enum PauseMode {
+         pauseModeToggle = -1,
+         pauseModePause = 0,
+         pauseModeUnPause = 1  
+       };
+
        NetStream();
 
 #if !defined(sgi) || defined(__GNUC__)
@@ -200,11 +210,7 @@
        //
        /// @param mode
        ///     Defines what mode to put the instance in. 
-       /// -1 : toogle mode
-       /// 0 : switch to pause
-       /// 1 : switch to play
-       ///
-       virtual void pause(int /*mode*/){}
+       virtual void pause(PauseMode /*mode*/){}
 
        /// Starts the playback of the media
        //
@@ -231,6 +237,10 @@
        /// used to find the next video frame to be shown, though this might change.
        virtual void advance(){}
 
+       /// Returns the current framerate in frames per second.
+       virtual double getCurrentFPS() { return 0; }
+       
+
        /// Sets the NetConnection needed to access external files
        //
        /// @param netconnection
@@ -264,17 +274,14 @@
        ///
        boost::uint32_t bufferTime() { return m_bufferTime; }
 
-       /// Returns the number of bytes loaded of the media file
-       //
-       /// @return the number of bytes loaded of the media file
-       ///
-       long bytesLoaded();
+       /// Returns the number of bytes of the media file that have been buffered.
+       virtual long bytesLoaded() { return 0; }
 
        /// Returns the total number of bytes (size) of the media file
        //
        /// @return the total number of bytes (size) of the media file
        ///
-       long bytesTotal();
+       virtual long bytesTotal() { return 0;}
 
        /// Returns the number of millisecond of the media file that is buffered and 
        /// yet to be played
@@ -295,6 +302,8 @@
        ///
        std::auto_ptr<image::image_base> get_video();
 
+
+
 private:
 
        /// Pop next queued status notification from the queue

Index: server/asobj/NetStreamGst.cpp
===================================================================
RCS file: /sources/gnash/gnash/server/asobj/NetStreamGst.cpp,v
retrieving revision 1.64
retrieving revision 1.65
diff -u -b -r1.64 -r1.65
--- server/asobj/NetStreamGst.cpp       12 Dec 2007 10:23:46 -0000      1.64
+++ server/asobj/NetStreamGst.cpp       21 Jan 2008 07:07:28 -0000      1.65
@@ -1,4 +1,3 @@
-// NetStreamGst.cpp:  Audio/video output via Gstreamer library, for Gnash.
 // 
 //   Copyright (C) 2005, 2006, 2007 Free Software Foundation, Inc.
 // 
@@ -15,1696 +14,555 @@
 // You should have received a copy of the GNU General Public License
 // along with this program; if not, write to the Free Software
 // Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA  02110-1301  USA
-//
 
-/* $Id: NetStreamGst.cpp,v 1.64 2007/12/12 10:23:46 zoulunkai Exp $ */
 
 #ifdef HAVE_CONFIG_H
 #include "config.h"
 #endif
 
-#ifdef SOUND_GST
-
-#include "log.h"
 #include "NetStreamGst.h"
-#include "fn_call.h"
-#include "NetStream.h"
-#include "URLAccessManager.h"
-#include "render.h"    
-#include "movie_root.h"
-#include "NetConnection.h"
-//#include "action.h"
 
 #include "gstgnashsrc.h"
+#include "Object.h"
+#include "gstflvdemux.h"
 
-#ifdef GST_HAS_MODERN_PBUTILS
-#include <gst/pbutils/missing-plugins.h>
-#include <gst/pbutils/install-plugins.h>
-#endif // GST_HAS_MODERN_PBUTILS
 
-#include "URL.h"
+//                                        video -> ffmpegcolorspace -> capsfilter -> fakesink
+//                                       /
+// (GstUriHandler) -> queue -> decodebin
+//                                       |
+//                                        audio -> audioconvert -> autoaudiosink
 
-// Define the following macro to enable debugging traces
-//#define GNASH_DEBUG
 
 namespace gnash {
 
-static gboolean
-register_elements (GstPlugin *plugin)
+NetStreamGst::NetStreamGst()
+ : _downloader(NULL),
+   _duration(0)
 {
-       return gst_element_register (plugin, "gnashsrc", GST_RANK_NONE, GST_TYPE_GNASH_SRC);
-}
+  gst_init(NULL, NULL);
 
-static GstPluginDesc gnash_plugin_desc = {
-       0, // GST_VERSION_MAJOR
-       10, // GST_VERSION_MINOR
-       "gnashsrc",
-       "Use gnash as source via callbacks",
-       register_elements,
-       "0.0.1",
-       "LGPL",
-       "gnash",
-       "gnash",
-       "http://www.gnu.org/software/gnash/",
-       GST_PADDING_INIT
-};
-
-NetStreamGst::NetStreamGst():
-
-       pipeline(NULL),
-       audiosink(NULL),
-       videosink(NULL),
-       decoder(NULL),
-       volume(NULL),
-       colorspace(NULL),
-       videorate(NULL),
-       videocaps(NULL),
-       videoflip(NULL),
-       audioconv(NULL),
-
-       audiosource(NULL),
-       videosource(NULL),
-       source(NULL),
-       videodecoder(NULL),
-       audiodecoder(NULL),
-       videoinputcaps(NULL),
-       audioinputcaps(NULL),
-       _handoffVideoSigHandler(0),
-       _handoffAudioSigHandler(0),
+  _pipeline = gst_pipeline_new ("gnash_pipeline");
 
-#ifndef DISABLE_START_THREAD
-       startThread(NULL),
-#endif
-       videowidth(0),
-       videoheight(0),
-       m_clock_offset(0),
-       m_pausePlayback(false)
-{
-       gst_init (NULL, NULL);
+  // Figure out if flvdemux is present on the system. If not load the one from
+  // the Gnash tree.
+  GstElementFactory* factory = gst_element_factory_find ("flvdemux");
+  if (!factory) {
+    if (!gst_element_register (NULL, "flvdemux", GST_RANK_PRIMARY,
+          gst_flv_demux_get_type ())) {
+      log_error("Failed to register our own FLV demuxer. FLV playback may not "
+                "work.");            
+    }    
+  } else {
+    gst_object_unref(GST_OBJECT(factory));
+  }
+
+  // Setup general decoders
+  _dataqueue = gst_element_factory_make ("queue", "gnash_dataqueue");
+  g_signal_connect (_dataqueue, "underrun", G_CALLBACK (NetStreamGst::queue_underrun_cb), this);
+  g_signal_connect (_dataqueue, "running", G_CALLBACK (NetStreamGst::queue_running_cb), this);
+    
+  GstElement* decoder = gst_element_factory_make ("decodebin", NULL);  
+  g_signal_connect (decoder, "new-decoded-pad", G_CALLBACK (NetStreamGst::decodebin_newpad_cb), this);
+
+  gst_bin_add_many (GST_BIN (_pipeline), _dataqueue, decoder, NULL);
+  gst_element_link(_dataqueue, decoder);
+
+  // Setup video conversion and sink
+
+
+  // setup the video colorspaceconverter converter
+  GstElement* colorspace = gst_element_factory_make ("ffmpegcolorspace", "gnash_colorspace");
+
+  GstElement* videocaps = gst_element_factory_make ("capsfilter", NULL);
+
+  // Make sure we receive RGB
+  GstCaps* videooutcaps = gst_caps_new_simple ("video/x-raw-rgb", NULL);
+  g_object_set (G_OBJECT (videocaps), "caps", videooutcaps, NULL);
+  gst_caps_unref (videooutcaps);
+
+  // Videoscale isn't needed currently, but it'll just pass the video through.
+  // At some point we might make this feature available to the renderer, for
+  // example.
+  GstElement* videoscale = gst_element_factory_make ("videoscale", NULL);
+
+  // setup the videosink with callback
+  GstElement* videosink = gst_element_factory_make ("fakesink", NULL);
+
+  g_object_set (G_OBJECT (videosink), "signal-handoffs", TRUE, "sync", TRUE, NULL);
+  g_signal_connect (videosink, "handoff", G_CALLBACK (NetStreamGst::video_data_cb), this);
+
+
+  // Create the video pipeline and link the elements. The pipeline will
+  // dereference the elements when they are destroyed.
+  gst_bin_add_many (GST_BIN (_pipeline), colorspace, videoscale, videocaps, videosink, NULL);
+  gst_element_link_many(colorspace, videoscale, videocaps, videosink, NULL);   
+
+  // Setup audio sink
+  GstElement* audioconvert = gst_element_factory_make ("audioconvert", NULL);  
+
+  GstElement* audiosink = gst_element_factory_make ("autoaudiosink", NULL);
+
+  gst_bin_add_many(GST_BIN(_pipeline), audioconvert, audiosink, NULL);
+  gst_element_link(audioconvert, audiosink);
+
+  _audiopad = gst_element_get_static_pad (audioconvert, "sink");
+  _videopad = gst_element_get_static_pad (colorspace, "sink");
 }
 
 NetStreamGst::~NetStreamGst()
 {
-       close();
-}
+  gst_element_set_state (_pipeline, GST_STATE_NULL);
 
-void NetStreamGst::pause(int mode)
-{
-       if (mode == -1)
-       {
-               m_pause = ! m_pause;
-       }
-       else
-       {
-               m_pause = (mode == 0) ? true : false;
-       }
 
-       if (pipeline)
-       {
-               if (m_pause)
-               { 
-                       log_msg("Pausing pipeline on user request");
-                       if ( ! pausePipeline(false) )
-                       {
-                               log_error("Could not pause pipeline");
-                       }
-               }
-               else
-               {
-                       if ( ! playPipeline() )
-                       {
-                               log_error("Could not play pipeline");
-                       }
-               }
-       }
+  gst_element_get_state(_pipeline, NULL, NULL, 0); // wait for a response
 
-       if (!pipeline && !m_pause && !m_go) {
-               setStatus(playStart);
-               m_go = true;
-               // To avoid blocking while connecting, we use a thread.
-#ifndef DISABLE_START_THREAD
-               startThread = new boost::thread(boost::bind(NetStreamGst::playbackStarter, this));
-#else
-               startPlayback(this);
-#endif
-       }
+  gst_object_unref(GST_OBJECT(_pipeline));
+  
+  gst_object_unref(GST_OBJECT(_videopad));
+  gst_object_unref(GST_OBJECT(_audiopad));
 }
 
-void NetStreamGst::close()
+void
+NetStreamGst::close()
 {
-       if (m_go)
-       {
-               setStatus(playStop);
-               m_go = false;
-#ifndef DISABLE_START_THREAD
-               startThread->join();
-               delete startThread;
-#endif
-       }
+  gst_element_set_state (_pipeline, GST_STATE_NULL);  
 
-       if ( ! disablePipeline() )
-       {
-               log_error("Can't reset pipeline on close");
-       }
+  setStatus(playStop);
 
-       // Should we keep the ref if the above failed ?
-       // Unreffing the pipeline should also unref all elements in it.
-       gst_object_unref (GST_OBJECT (pipeline));
-       pipeline = NULL;
+  processStatusNotifications();
 
        boost::mutex::scoped_lock lock(image_mutex);
 
        delete m_imageframe;
        m_imageframe = NULL;
-
-       _handoffVideoSigHandler = 0;
-       _handoffAudioSigHandler = 0;
-
-       videowidth = 0;
-       videoheight = 0;
-       m_clock_offset = 0;
-       m_pausePlayback = false;
 }
 
-
 void
-NetStreamGst::play(const std::string& c_url)
+NetStreamGst::pause(PauseMode mode)
 {
-
-       // Does it have an associated NetConnection?
-       if ( ! _netCon )
+  GstState newstate;
+  switch(mode) {
+    case pauseModeToggle:
        {
-               IF_VERBOSE_ASCODING_ERRORS(
-               log_aserror(_("No NetConnection associated with this NetStream, won't play"));
-               );
-               return;
-       }
+      GstState cur_state;
 
-       // Is it already playing ?
-       if (m_go)
-       {
-               if (m_pause)
-               {
-                       playPipeline();
-               }
+      GstStateChangeReturn statereturn = gst_element_get_state(_pipeline,
+                                          &cur_state, NULL,
+                                          1000000 /* wait 1 ms */);
+      if (statereturn != GST_STATE_CHANGE_SUCCESS) {
                return;
        }
 
-       if (url.size() == 0) url += c_url;
-       // Remove any "mp3:" prefix. Maybe should use this to mark as audio-only
-       if (url.compare(0, 4, std::string("mp3:")) == 0) {
-               url = url.substr(4);
-       }
-       m_go = true;
-
-       // To avoid blocking while connecting, we use a thread.
-#ifndef DISABLE_START_THREAD
-       startThread = new boost::thread(boost::bind(NetStreamGst::playbackStarter, this));
-#else
-       startPlayback(this);
-#endif
-       return;
-}
+      if (cur_state == GST_STATE_PLAYING) {
+        newstate = GST_STATE_PAUSED;
+      } else {
+        gst_element_set_base_time(_pipeline, 0);
+        newstate = GST_STATE_PLAYING;
+      }
 
+      break;
+    }
+    case pauseModePause:
+      newstate = GST_STATE_PAUSED;
+      break;
+    case pauseModeUnPause:
 
-// Callback function used by Gstreamer to to attached audio and video streams
-// detected by decoderbin to either the video out or audio out elements.
-// Only used when not playing FLV
-void
-NetStreamGst::callback_newpad (GstElement* /*decodebin*/, GstPad *pad, gboolean /*last*/, gpointer data)
-{
+      newstate = GST_STATE_PLAYING;
 
-       NetStreamGst* ns = static_cast<NetStreamGst*>(data);
-       GstCaps *caps;
-       GstStructure *str;
-       GstPad *audiopad, *videopad;
-
-       audiopad = gst_element_get_pad (ns->audioconv, "sink");
-       videopad = gst_element_get_pad (ns->colorspace, "sink");
-
-       // check media type
-       caps = gst_pad_get_caps (pad);
-       str = gst_caps_get_structure (caps, 0);
-       if (g_strrstr (gst_structure_get_name (str), "audio")) {
-               gst_object_unref (videopad);
-
-               // link'n'play
-               gst_pad_link (pad, audiopad);
-
-       } else if (g_strrstr (gst_structure_get_name (str), "video")) {
-               gst_object_unref (audiopad);
-               // Link'n'play
-               gst_pad_link (pad, videopad);
-       } else {
-               gst_object_unref (audiopad);
-               gst_object_unref (videopad);
+      break;
        }
-       gst_caps_unref (caps);
-       return;
+  
+  gst_element_set_state (_pipeline, newstate);
 
 }
 
-// The callback function which unloads the decoded video frames unto the video
-// output imageframe.
 void 
-NetStreamGst::callback_output (GstElement* /*c*/, GstBuffer *buffer, GstPad* /*pad*/, gpointer user_data)
+NetStreamGst::play(const std::string& url)
 {
-       NetStreamGst* ns = static_cast<NetStreamGst*>(user_data);
+  std::string valid_url = _netCon->validateURL(url);
 
-       boost::mutex::scoped_lock lock(ns->image_mutex);
+  if (valid_url.empty()) {
+    // error; TODO: notify user
+    return;
+  }
 
-       // If the video size has not yet been detected, detect them
-       if (ns->videowidth == 0 && ns->videoheight == 0) {
-               GstPad* videopad = gst_element_get_pad (ns->colorspace, "src");
-               GstCaps* caps = gst_pad_get_caps (videopad);
-               GstStructure* str = gst_caps_get_structure (caps, 0);
+  if (_downloader) {
+    gst_element_set_state (_pipeline, GST_STATE_NULL);
 
-               int height, width, ret, framerate1, framerate2;
-               ret = gst_structure_get_int (str, "width", &width);
-               ret &= gst_structure_get_int (str, "height", &height);
-               if (ret) {
-                       ns->videowidth = width;
-                       ns->videoheight = height;
-               }
-               ret = gst_structure_get_fraction (str, "framerate", &framerate1, &framerate2);
-               
-               // Setup the output videoframe
-               if (ns->m_videoFrameFormat == render::YUV) {
-                       ns->m_imageframe = new image::yuv(width, height);
-               } else if (ns->m_videoFrameFormat == render::RGB) {
-                       ns->m_imageframe = new image::rgb(width, height);
-               }
+    gst_bin_remove(GST_BIN(_pipeline), _downloader); // will also unref
        }
 
-       if (ns->m_imageframe) {
-//             ns->m_imageframe->update(GST_BUFFER_DATA(buffer));
-               if (ns->m_videoFrameFormat == render::YUV) {
-                       abort();
-
-               /*      image::yuv* yuvframe = static_cast<image::yuv*>(m_imageframe);
-                       int copied = 0;
-                       boost::uint8_t* ptr = GST_BUFFER_DATA(buffer);
-                       for (int i = 0; i < 3 ; i++)
-                       {
-                               int shift = (i == 0 ? 0 : 1);
-                               boost::uint8_t* yuv_factor = m_Frame->data[i];
-                               int h = ns->videoheight >> shift;
-                               int w = ns->videowidth >> shift;
-                               for (int j = 0; j < h; j++)
-                               {
-                                       copied += w;
-                                       assert(copied <= yuvframe->size());
-                                       memcpy(ptr, yuv_factor, w);
-                                       yuv_factor += m_Frame->linesize[i];
-                                       ptr += w;
-                               }
-                       }
-                       video->m_size = copied;*/
-               } else {
-                       ns->m_imageframe->update(GST_BUFFER_DATA(buffer));
-                       ns->m_newFrameReady = true;
-               }
+  _downloader = gst_element_make_from_uri(GST_URI_SRC, valid_url.c_str(),
+                                          "gnash_uridownloader");
 
-       }
+  bool success = gst_bin_add(GST_BIN(_pipeline), _downloader);
+  assert(success);
+
+  gst_element_link(_downloader, _dataqueue);  
+
+
+  // if everything went well, start playback
+  gst_element_set_state (_pipeline, GST_STATE_PLAYING);
 
 }
 
 
-// The callback function which refills the audio buffer with data
-// Only used when playing FLV
-void NetStreamGst::audio_callback_handoff (GstElement * /*c*/, GstBuffer *buffer, GstPad* /*pad*/, gpointer user_data)
+// FIXME: this does not work for HTTP streams.
+void
+NetStreamGst::seek(boost::uint32_t pos)
 {
-       NetStreamGst* ns = static_cast<NetStreamGst*>(user_data);
+  bool success = gst_element_seek_simple(_pipeline, GST_FORMAT_TIME,
+                   GstSeekFlags(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT),
+                   GST_MSECOND * pos);
 
-       FLVFrame* frame = ns->m_parser->nextAudioFrame();
-       if (!frame) {
-               ns->setStatus(bufferEmpty);
-               ns->m_pausePlayback = true;
-               return;
+  if (success) {
+    setStatus(seekNotify);
+  } else {
+    log_msg(_("Seek failed. This is expected, but we tried it anyway."));
+    setStatus(invalidTime);
        }
-
-//     if (GST_BUFFER_DATA(buffer)) delete [] GST_BUFFER_DATA(buffer);
-       GST_BUFFER_SIZE(buffer) = frame->dataSize;
-       GST_BUFFER_DATA(buffer) = frame->data;
-       GST_BUFFER_TIMESTAMP(buffer) = (frame->timestamp + ns->m_clock_offset) * GST_MSECOND;
-       delete frame;
-       return;
-
 }
 
-// The callback function which refills the video buffer with data
-// Only used when playing FLV
-void
-NetStreamGst::video_callback_handoff (GstElement * /*c*/, GstBuffer *buffer, GstPad* /*pad*/, gpointer user_data)
+boost::int32_t
+NetStreamGst::time()
 {
-       //GNASH_REPORT_FUNCTION;
+  GstFormat fmt = GST_FORMAT_TIME;
 
-       NetStreamGst* ns = static_cast<NetStreamGst*>(user_data);
+  gint64 pos = 0;
 
-       FLVFrame* frame = ns->m_parser->nextVideoFrame();
-       if (!frame) {
-               ns->setStatus(bufferEmpty);
-               ns->m_pausePlayback = true;
-               return;
+  bool rv = gst_element_query_position (_pipeline, &fmt, &pos);  
+  
+  if (!rv) {
+    return 0;
        }
 
-//     if (GST_BUFFER_DATA(buffer)) delete [] GST_BUFFER_DATA(buffer);
-       GST_BUFFER_SIZE(buffer) = frame->dataSize;
-       GST_BUFFER_DATA(buffer) = frame->data;
-       GST_BUFFER_TIMESTAMP(buffer) = (frame->timestamp + ns->m_clock_offset) * GST_MSECOND;
-       delete frame;
-       return;
+  return pos / GST_MSECOND;
 }
 
 void
-NetStreamGst::playbackStarter(NetStreamGst* ns)
+NetStreamGst::advance()
 {
-       ns->startPlayback();
-}
+  GstBus* bus = gst_element_get_bus(_pipeline);
 
-void
-NetStreamGst::unrefElements()
-{
-#ifdef GNASH_DEBUG
-       log_debug("unreffing elements");
-#endif
+  while (gst_bus_have_pending(bus)) {
+    GstMessage* msg = gst_bus_pop(bus);
+    handleMessage(msg);
 
-       boost::mutex::scoped_lock lock(_pipelineMutex);
+    gst_message_unref(msg); 
+  }
 
-       // TODO: Define an GstElement class for storing all these elements,
-       //       and have it's destructor take care of unreffing...
-       //
-       // TODO2: check if calling gst_object_unref is enough to release all
-       //       resources allocated by gst_*_new or gst_element_factory_make
+  gst_object_unref(GST_OBJECT(bus));
 
-       if ( pipeline )
-       {
-               gst_object_unref (GST_OBJECT (pipeline));
-               pipeline = NULL;
-       }
+  processStatusNotifications();
+}
 
-       if ( audiosink )
-       {
-               gst_object_unref (GST_OBJECT (audiosink));
-               audiosink = NULL;
-       }
+double
+NetStreamGst::getCurrentFPS()
+{
+  GstElement* colorspace = gst_bin_get_by_name (GST_BIN(_pipeline), "gnash_colorspace");
 
-       if ( videosink )
-       {
-               gst_object_unref (GST_OBJECT (videosink));
-               videosink = NULL;
-       }
+  GstPad* videopad = gst_element_get_static_pad (colorspace, "src");
 
-       if ( volume )
-       {
-               gst_object_unref (GST_OBJECT (volume));
-               volume = NULL;
-       }
+  gst_object_unref(GST_OBJECT(colorspace));
 
-       if ( colorspace )
-       {
-               gst_object_unref (GST_OBJECT (colorspace));
-               colorspace = NULL;
+  GstCaps* caps = gst_pad_get_negotiated_caps (videopad);
+
+  // The pad is no longer needed once we have its caps; unref it before
+  // any early return so it is not leaked.
+  gst_object_unref(GST_OBJECT(videopad));
+
+  if (!caps) {
+    return 0;
        }

-       if ( videorate )
-       {
-               gst_object_unref (GST_OBJECT (videorate));
-               videorate = NULL;
-       }
 
-       if ( videocaps )
-       {
-               gst_object_unref (GST_OBJECT (videocaps));
-               videocaps = NULL;
-       }
+  // The structure is owned by the caps and must not be freed separately.
+  GstStructure* structure = gst_caps_get_structure (caps, 0);

-       if ( videoflip )
-       {
-               gst_object_unref (GST_OBJECT (videoflip));
-               videoflip = NULL;
-       }

-       if ( audioconv )
-       {
-               gst_object_unref (GST_OBJECT (audioconv));
-               audioconv = NULL;
-       }

-       if (m_isFLV)
-       {
-               if ( audiosource )
-               {
-                       gst_object_unref (GST_OBJECT (audiosource));
-                       audiosource = NULL;
+  gint framerate[2] = {0, 0};
+
+  gst_structure_get_fraction (structure, "framerate", &framerate[0],
+                              &framerate[1]);
+
+  // Only unref the caps after reading from the structure they own.
+  gst_caps_unref(caps);
+
+  if (framerate[1] == 0) {
+    return 0;
                }
 
-               if ( videosource )
-               {
-                       gst_object_unref (GST_OBJECT (videosource));
-                       videosource = NULL;
-               }
+  return double(framerate[0]) / double(framerate[1]);
+}
 
-               if ( videodecoder )
-               {
-                       gst_object_unref (GST_OBJECT (videodecoder));
-                       videodecoder = NULL;
-               }
+long
+NetStreamGst::bytesLoaded()
+{
 
-               if ( audiodecoder )
-               {
-                       gst_object_unref (GST_OBJECT (audiodecoder));
-                       audiodecoder = NULL;
-               }
+  gint64 pos = 0;
+  GstFormat format = GST_FORMAT_BYTES;
+  gst_element_query_position(_downloader, &format, &pos);
 
-               if ( videoinputcaps )
-               {
-                       gst_object_unref (GST_OBJECT (videoinputcaps));
-                       videoinputcaps = NULL;
-               }
+  guint buffer_size = 0;
+  g_object_get(G_OBJECT(_dataqueue), "current-level-bytes", &buffer_size, NULL);
 
-               if ( audioinputcaps )
-               {
-                       gst_object_unref (GST_OBJECT (audioinputcaps));
-                       audioinputcaps = NULL;
-               }
+  guint64 bytesloaded = pos + buffer_size;
 
-               assert(source == NULL);
-               assert(decoder == NULL);
-       }
-       else
-       {
-               if ( source )
-               {
-                       gst_object_unref (GST_OBJECT (source));
-                       source = NULL;
-               }
+  // Sanity check; did we exceed the total data size?
+  guint64 total_bytes = bytesTotal();
 
-               if ( decoder )
-               {
-                       gst_object_unref (GST_OBJECT (decoder));
-                       decoder = NULL;
+  if (total_bytes && bytesloaded > total_bytes) {
+    return total_bytes;
                }
 
-               assert( audiosource == NULL);
-               assert( videosource == NULL);
-               assert( videodecoder == NULL);
-               assert( audiodecoder == NULL);
-               assert( videoinputcaps == NULL);
-               assert( audioinputcaps == NULL);
-       }
+  return bytesloaded;
 }
 
-bool
-NetStreamGst::buildFLVPipeline(bool& video, bool& audio)
+long
+NetStreamGst::bytesTotal()
 {
-       boost::mutex::scoped_lock lock(_pipelineMutex);
-
-       if ( ! buildFLVVideoPipeline(video) ) return false;
-       if ( audio )
-       {
-               if ( ! buildFLVSoundPipeline(audio) ) return false;
-       }
+  gint64 duration = 0;
+  GstFormat format = GST_FORMAT_BYTES;
 
-       return true;
-
-}
+  gst_element_query_duration (_downloader, &format, &duration);
 
-#ifdef GST_HAS_MODERN_PBUTILS
+  if (!duration) {
+    return _duration;
+  }
 
-static void
-GstInstallPluginsResultCb (GstInstallPluginsReturn  result,
-                          gpointer                 /*user_data*/)
-{
-  //g_debug("JAU RESULTO MENDO");
+  return duration;
 }
 
 
-static gboolean
-NetStreamGst_install_missing_codecs(GList *missing_plugin_details)
+void
+metadata(const GstTagList *list, const gchar *tag, gpointer user_data)
 {
+  const gchar* nick = gst_tag_get_nick(tag);
+  as_object* o = static_cast<as_object*>(user_data);
 
-  GstInstallPluginsReturn rv;
-  int i,c;
-  GstInstallPluginsContext *install_ctx = gst_install_plugins_context_new();
+#ifdef DEBUG_METADATA
+  const gchar* descr = gst_tag_get_description(tag);
+  g_print("tag name: %s, description: %s, type: %s.\n", nick, descr,
+          g_type_name(gst_tag_get_type(tag)));
+#endif
 
-  c=g_list_length(missing_plugin_details);
-  gchar **details = g_new0(gchar*, c+1);
   
-  for(i=0; i < c; i++)
+  switch(gst_tag_get_type(tag)) {
+    case G_TYPE_STRING:
   {
-    details[i] = (gchar*) g_list_nth_data(missing_plugin_details, i);
-  }
+      gchar* value;
 
-  rv = gst_install_plugins_sync (details,
-                                install_ctx);
+      gst_tag_list_get_string(list, tag, &value);
 
-  g_strfreev(details);
+      o->init_member(nick, value);
+      
+      g_free(value);
 
-  switch(rv) {
-  case GST_INSTALL_PLUGINS_SUCCESS:
-    if(!gst_update_registry())
-      g_warning("we failed to update gst registry for new codecs");
-    else
-      return true;
-    break;
-  case GST_INSTALL_PLUGINS_NOT_FOUND:
-    g_debug("gst_install_plugins_sync -> GST_INSTALL_PLUGINS_NOT_FOUND");
-    break;
-  case GST_INSTALL_PLUGINS_ERROR:
-    g_debug("gst_install_plugins_sync -> GST_INSTALL_PLUGINS_ERROR");
     break;
-  case GST_INSTALL_PLUGINS_PARTIAL_SUCCESS:
-    g_debug("gst_install_plugins_sync -> GST_INSTALL_PLUGINS_PARTIAL_SUCCESS");
+    }
+    case G_TYPE_DOUBLE:
+    {
+      gdouble value;
+      gst_tag_list_get_double(list, tag, &value);
+      o->init_member(nick, value);
+      
     break;
-  case GST_INSTALL_PLUGINS_USER_ABORT:
-    g_debug("gst_install_plugins_sync -> GST_INSTALL_PLUGINS_USER_ABORT");
+    }
+    case G_TYPE_BOOLEAN:
+    {
+      gboolean value;
+      gst_tag_list_get_boolean(list, tag, &value);
+      o->init_member(nick, value);
     break;
-  case GST_INSTALL_PLUGINS_CRASHED:
-    g_debug("gst_install_plugins_sync -> GST_INSTALL_PLUGINS_CRASHED");
+    }
+    case G_TYPE_UINT64:
+    {
+      guint64 value;
+      gst_tag_list_get_uint64(list, tag, &value);
+      o->init_member(nick, (unsigned long) value); // FIXME: actually, fix as_value().
     break;
-  case GST_INSTALL_PLUGINS_INVALID:
-    g_warning("gst_install_plugins_sync -> GST_INSTALL_PLUGINS_INVALID");
+    }
+    case G_TYPE_UINT:
+    {
+      guint value;
+      gst_tag_list_get_uint(list, tag, &value);
+      o->init_member(nick, value);
     break;
+    }
   default:
-    g_warning("gst_install_plugins_sync -> UNEXPECTED RESULT (undocumented value)");
-    break;                               
-  };
+    {}
+  } // switch
 
-  return false;
 }
 
-static GList*
-NetStreamGst_append_missing_codec_to_details (GList *list,
-                                             GstElement *source,
-                                             const GstCaps* caps)
+
+// TODO: apparently the onStatus message.details property can be set with some
+//       actually useful error description. Investigate and implement.
+void
+NetStreamGst::handleMessage (GstMessage *message)
 {
-  GstMessage *missing_msg;
-  missing_msg = gst_missing_decoder_message_new(source,
-                                               caps);
-  gchar* detail = gst_missing_plugin_message_get_installer_detail(missing_msg);
+#ifdef DEBUG_MESSAGES
+  g_print ("Got %s message\n", GST_MESSAGE_TYPE_NAME (message));
+#endif
 
-  if(!detail)
+  switch (GST_MESSAGE_TYPE (message)) {
+    case GST_MESSAGE_ERROR:
   {
-    g_warning("missing message details not found. No details added.");
-    return list;
-  } 
-
-  return g_list_append(list, detail);
-}
+      GError *err;
+      gchar *debug;
+      gst_message_parse_error (message, &err, &debug);
 
-#endif // GST_HAS_MODERN_PBUTILS
+      log_error(_("NetStream playback halted because: %s\n"), err->message);
 
-bool
-NetStreamGst::buildFLVVideoPipeline(bool &video)
-{
-#ifdef GNASH_DEBUG
-       log_debug("Building FLV video decoding pipeline");
-#endif
+      g_error_free (err);
+      g_free (debug);
 
-       FLVVideoInfo* videoInfo = m_parser->getVideoInfo();
+      setStatus(streamNotFound);
 
-       bool doVideo = video;
+      // Clear any buffers.
+      gst_element_set_state (_pipeline, GST_STATE_NULL);
 
-       GList *missing_plugin_details = NULL;
-#ifdef GST_HAS_MODERN_PBUTILS
- retry:
-#endif
-       if (videoInfo) {
-               doVideo = true;
-               videosource = gst_element_factory_make ("fakesrc", NULL);
-               if ( ! videosource )
-               {
-                       log_error("Unable to create videosource 'fakesrc' element");
-                       return false;
+      break;
                }
+    case GST_MESSAGE_EOS:
+      log_msg(_("NetStream has reached the end of the stream."));
+      break;
+    case GST_MESSAGE_TAG:
+    {
+      GstTagList* taglist;
                
-               // setup fake source
-               g_object_set (G_OBJECT (videosource),
-                                       "sizetype", 2, "can-activate-pull", FALSE, "signal-handoffs", TRUE, NULL);
+      gst_message_parse_tag(message, &taglist);      
 
-               // Setup the callback
-               if ( ! connectVideoHandoffSignal() )
-               {
-                       log_error("Unable to connect the video 'handoff' signal handler");
-                       return false;
-               }
+      boost::intrusive_ptr<as_object> o = new as_object(getObjectInterface());
 
-               // Setup the input capsfilter
-               videoinputcaps = gst_element_factory_make ("capsfilter", NULL);
-               if ( ! videoinputcaps )
-               {
-                       log_error("Unable to create videoinputcaps 'capsfilter' element");
-                       return false;
-               }
+      gst_tag_list_foreach(taglist, metadata, o.get());
 
-               boost::uint32_t fps = m_parser->videoFrameRate(); 
+      processMetaData(o);
 
-               GstCaps* videonincaps;
-               if (videoInfo->codec == media::VIDEO_CODEC_H263) {
-                       videonincaps = gst_caps_new_simple ("video/x-flash-video",
-                               "width", G_TYPE_INT, videoInfo->width,
-                               "height", G_TYPE_INT, videoInfo->height,
-                               "framerate", GST_TYPE_FRACTION, fps, 1,
-                               "flvversion", G_TYPE_INT, 1,
-                               NULL);
-                       videodecoder = gst_element_factory_make ("ffdec_flv", NULL);
-                       if ( ! videodecoder )
+      gst_tag_list_free(taglist); // a GstTagList must be freed with gst_tag_list_free(), not g_free()
+      break;
+    }    
+    case GST_MESSAGE_BUFFERING:
                        {
-                               log_error("Unable to create videodecoder 'ffdec_flv' element");
+      gint percent_buffered;
+      gst_message_parse_buffering(message, &percent_buffered);
 
-#ifdef GST_HAS_MODERN_PBUTILS
-                               missing_plugin_details = NetStreamGst_append_missing_codec_to_details
-                                 (missing_plugin_details,
-                                  videosource,
-                                  videonincaps);
-#else // GST_HAS_MODERN_PBUTILS
-                               return false;
-#endif // GST_HAS_MODERN_PBUTILS
-                       }
-
-               } else if (videoInfo->codec == media::VIDEO_CODEC_VP6) {
-                       videonincaps = gst_caps_new_simple ("video/x-vp6-flash",
-                               "width", G_TYPE_INT, 320, // We don't yet have a size extract for this codec, so we guess...
-                               "height", G_TYPE_INT, 240,
-                               "framerate", GST_TYPE_FRACTION, fps, 1,
-                               NULL);
-                       videodecoder = gst_element_factory_make ("ffdec_vp6f", NULL);
-                       if ( ! videodecoder )
+      if (percent_buffered == 100) {
+        setStatus(bufferFull);      
+      }
+      break;
+    }
+    case GST_MESSAGE_STATE_CHANGED:
                        {
-                               log_error("Unable to create videodecoder 'ffdec_vp6f' element");
+      GstState oldstate;
+      GstState newstate;
+      GstState pending;
 
-#ifdef GST_HAS_MODERN_PBUTILS
-                               missing_plugin_details = NetStreamGst_append_missing_codec_to_details
-                                 (missing_plugin_details,
-                                  videosource,
-                                  videonincaps);
-#else // GST_HAS_MODERN_PBUTILS
-                               return false;
-#endif // GST_HAS_MODERN_PBUTILS
-                       }
-
-               } else if (videoInfo->codec == media::VIDEO_CODEC_SCREENVIDEO) {
-                       videonincaps = gst_caps_new_simple ("video/x-flash-screen",
-                               "width", G_TYPE_INT, 320, // We don't yet have a size extract for this codec, so we guess...
-                               "height", G_TYPE_INT, 240,
-                               "framerate", GST_TYPE_FRACTION, fps, 1,
-                               NULL);
-                       videodecoder = gst_element_factory_make ("ffdec_flashsv", NULL);
-
-                       // Check if the element was correctly created
-                       if (!videodecoder) {
-                               log_error(_("A gstreamer flashvideo (ScreenVideo) decoder element could not be created! You probably need to install gst-ffmpeg."));
-
-#ifdef GST_HAS_MODERN_PBUTILS
-                               missing_plugin_details = NetStreamGst_append_missing_codec_to_details
-                                 (missing_plugin_details,
-                                  videosource,
-                                  videonincaps);
-#else // GST_HAS_MODERN_PBUTILS
-                               return false;
-#endif // GST_HAS_MODERN_PBUTILS
-                       }
+      gst_message_parse_state_changed(message, &oldstate, &newstate, &pending);
 
-               } else {
-                       log_error(_("Unsupported video codec %d"), videoInfo->codec);
-                       return false;
-               }
+      if (oldstate == GST_STATE_READY && (newstate == GST_STATE_PAUSED ||
+          newstate == GST_STATE_PLAYING)) {
 
-               if(g_list_length(missing_plugin_details) == 0)
-               {
-                 g_object_set (G_OBJECT (videoinputcaps), "caps", 
videonincaps, NULL);
-                 gst_caps_unref (videonincaps);
+        setStatus(playStart);
+      }
+      break;
                }
+    case GST_MESSAGE_DURATION:
+    {
+      // Sometimes the pipeline fails to use this number in queries.
+      GstFormat format = GST_FORMAT_BYTES;
+      gst_message_parse_duration(message, &format, &_duration);
+      break;
        }
 
-
-#ifdef GST_HAS_MODERN_PBUTILS
-       if(g_list_length(missing_plugin_details) == 0)
+    default:
        {
-         g_debug("no missing plugins found");
-         video = doVideo;
-         return true;
+#ifdef DEBUG_MESSAGES
+      g_print("unhandled message\n");
+#endif
+    }
        }
 
-       g_debug("try to install missing plugins (count=%d)", g_list_length(missing_plugin_details));
-       if(NetStreamGst_install_missing_codecs(missing_plugin_details))
-       {
-         disconnectVideoHandoffSignal();
-         g_list_free(missing_plugin_details);
-         missing_plugin_details = NULL;
-         g_debug("gst_install_plugins_sync -> GST_INSTALL_PLUGINS_SUCCESS ... one more roundtrip");
-         goto retry;
-       }
-       g_list_free(missing_plugin_details);
-       return false;
-#else // GST_HAS_MODERN_PBUTILS
-       video = doVideo;
-       return true;
-#endif // GST_HAS_MODERN_PBUTILS
 }
 
-bool
-NetStreamGst::buildFLVSoundPipeline(bool &sound)
+// NOTE: callbacks will be called from the streaming thread!
+
+void 
+NetStreamGst::video_data_cb(GstElement* /*c*/, GstBuffer *buffer,
+                            GstPad* /*pad*/, gpointer user_data)
 {
-       bool doSound = sound;
+  NetStreamGst* ns = reinterpret_cast<NetStreamGst*>(user_data);
 
-       FLVAudioInfo* audioInfo = m_parser->getAudioInfo();
-       if (!audioInfo) doSound = false;
+  GstElement*  colorspace = gst_bin_get_by_name (GST_BIN(ns->_pipeline),
+                                                 "gnash_colorspace");
 
-#ifdef GST_HAS_MODERN_PBUTILS
-       GList *missing_plugin_details = NULL;
- retry:
-#endif
-       if (doSound) {
+  GstPad* videopad = gst_element_get_static_pad (colorspace, "src");
+  GstCaps* caps = gst_pad_get_negotiated_caps (videopad);
 
-#ifdef GNASH_DEBUG
-               log_debug("Building FLV video decoding pipeline");
-#endif
+  gint height, width;
 
-               audiosource = gst_element_factory_make ("fakesrc", NULL);
-               if ( ! audiosource )
-               {
-                       log_error("Unable to create audiosource 'fakesrc' element");
-                       return false;
-               }
+  GstStructure* str = gst_caps_get_structure (caps, 0);
 
-               // setup fake source
-               g_object_set (G_OBJECT (audiosource),
-                                       "sizetype", 2, "can-activate-pull", FALSE, "signal-handoffs", TRUE, NULL);
+  gst_structure_get_int (str, "width", &width);
+  gst_structure_get_int (str, "height", &height);
 
-               // Setup the callback
-               if ( ! connectAudioHandoffSignal() )
-               {
-                       log_error("Unable to connect the audio 'handoff' signal handler");
-                       // TODO: what to do in this case ?
+  boost::mutex::scoped_lock lock(ns->image_mutex);
+  
+  if (!ns->m_imageframe || unsigned(width) != ns->m_imageframe->width() ||
+      unsigned(height) != ns->m_imageframe->height()) {
+    delete ns->m_imageframe;
+    ns->m_imageframe = new image::rgb(width, height);
                }
 
+  ns->m_imageframe->update(GST_BUFFER_DATA(buffer));
 
-               if (audioInfo->codec == media::AUDIO_CODEC_MP3) { 
+  ns->m_newFrameReady = true;
 
-                       audiodecoder = gst_element_factory_make ("mad", NULL);
-                       if ( ! audiodecoder )
-                       {
-                               audiodecoder = gst_element_factory_make ("flump3dec", NULL);
-                               // Check if the element was correctly created
-                               if (!audiodecoder)
-                               {
-                                       log_error(_("A gstreamer mp3-decoder element could not be created! You probably need to install a mp3-decoder plugin like gstreamer0.10-mad or gstreamer0.10-fluendo-mp3."));
-                               }
-                       }
+  gst_object_unref(GST_OBJECT (colorspace));
+  gst_object_unref(GST_OBJECT(videopad));
+  gst_caps_unref(caps);
+}
 
 
-                       // Set the info about the stream so that gstreamer knows what it is.
-                       audioinputcaps = gst_element_factory_make ("capsfilter", NULL);
-                       if (!audioinputcaps)
-                       {
-                               log_error("Unable to create audioinputcaps 'capsfilter' element");
-                               return false;
-                       }
+void
+NetStreamGst::decodebin_newpad_cb(GstElement* /*decodebin*/, GstPad* pad,
+                                  gboolean /*last*/, gpointer user_data)
+{
+  NetStreamGst* ns = static_cast<NetStreamGst*>(user_data);  
 
-                       GstCaps* audioincaps = gst_caps_new_simple ("audio/mpeg",
-                               "mpegversion", G_TYPE_INT, 1,
-                               "layer", G_TYPE_INT, 3,
-                               "rate", G_TYPE_INT, audioInfo->sampleRate,
-                               "channels", G_TYPE_INT, audioInfo->stereo ? 2 : 1, NULL);
+  GstCaps* caps = gst_pad_get_caps (pad);
+  GstStructure* str = gst_caps_get_structure (caps, 0);
+  const gchar* structure_name = gst_structure_get_name (str);
 
-                       if(!audiodecoder)
-                       {
-#ifdef GST_HAS_MODERN_PBUTILS
-                         missing_plugin_details = NetStreamGst_append_missing_codec_to_details
-                           (missing_plugin_details,
-                            audiosource,
-                            audioincaps);
-
-                         if(NetStreamGst_install_missing_codecs(missing_plugin_details))
-                         {
-                           disconnectAudioHandoffSignal();
-                           g_list_free(missing_plugin_details);
-                           missing_plugin_details = NULL;
-                           g_debug("gst_install_plugins_sync -> GST_INSTALL_PLUGINS_SUCCESS ... one more roundtrip");
-                           goto retry;
-                         }
-
-                         g_list_free(missing_plugin_details);
-#endif // GST_HAS_MODERN_PBUTILS
-                         return false;
-                       } 
-                       g_object_set (G_OBJECT (audioinputcaps), "caps", audioincaps, NULL);
-                       gst_caps_unref (audioincaps);
-               } else {
-                       log_error(_("Unsupported audio codec %d"), audioInfo->codec);
-                       return false;
-               }
-       }
-
-       sound = doSound;
-
-       return true;
-}
-
-bool
-NetStreamGst::buildPipeline()
-{
-#ifdef GNASH_DEBUG
-       log_debug("Building non-FLV decoding pipeline");
-#endif
-
-       boost::mutex::scoped_lock lock(_pipelineMutex);
-
-       // setup gnashnc source if we are not decoding FLV (our homegrown source element)
-       source = gst_element_factory_make ("gnashsrc", NULL);
-       if ( ! source )
-       {
-               log_error("Failed to create 'gnashrc' element");
-               return false;
-       }
-       gnashsrc_callback* gc = new gnashsrc_callback; // TODO: who's going to delete this ?
-       gc->read = NetStreamGst::readPacket;
-       gc->seek = NetStreamGst::seekMedia;
-       g_object_set (G_OBJECT (source), "data", this, "callbacks", gc, NULL);
-
-       // setup the decoder with callback
-       decoder = gst_element_factory_make ("decodebin", NULL);
-       if (!decoder)
-       {
-               log_error("Unable to create decoder 'decodebin' element");
-               return false;
-       }
-       g_signal_connect (decoder, "new-decoded-pad", G_CALLBACK (NetStreamGst::callback_newpad), this);
-
-       return true;
-}
-
-
-void
-NetStreamGst::startPlayback()
-{
-       // This should only happen if close() is called before this thread is ready
-       if (!m_go) return;
-
-       boost::intrusive_ptr<NetConnection> nc = _netCon;
-       assert(nc);
-
-       // Pass stuff from/to the NetConnection object.
-       if ( !nc->openConnection(url) ) {
-               setStatus(streamNotFound);
-#ifdef GNASH_DEBUG
-               log_debug(_("Gnash could not open movie: %s"), url.c_str());
-#endif
-               return;
-       }
-
-       inputPos = 0;
-
-       boost::uint8_t head[3];
-       if (nc->read(head, 3) < 3) {
-               setStatus(streamNotFound);
-               return;
-       }
-       nc->seek(0);
-       if (head[0] == 'F' && head[1] == 'L' && head[2] == 'V') { 
-               m_isFLV = true;
-               if (!m_parser.get()) {
-                       m_parser = nc->getConnectedParser(); 
-                       if (! m_parser.get() )
-                       {
-                               setStatus(streamNotFound);
-                               log_error(_("Gnash could not open FLV movie: %s"), url.c_str());
-                               return;
-                       }
-               }
-
-       }
-
-       // setup the GnashNC plugin if we are not decoding FLV
-       if (!m_isFLV) _gst_plugin_register_static (&gnash_plugin_desc);
+  gst_caps_unref (caps);
 
-       // setup the pipeline
-       pipeline = gst_pipeline_new (NULL);
+  if (g_strrstr (structure_name, "audio")) {
 
-       // Check if the creation of the gstreamer pipeline was a succes
-       if (!pipeline) {
-               gnash::log_error(_("The gstreamer pipeline element could not be created"));
+    if (GST_PAD_IS_LINKED (ns->_audiopad)) {
                return;
        }
 
-       bool video = false;
-       bool sound = false;
+    gst_pad_link (pad, ns->_audiopad);
 
-       // If sound is enabled we set it up
-       if (get_sound_handler()) sound = true;
+  } else if (g_strrstr (structure_name, "video")) {
        
-       // Setup the decoder and source
-       // TODO: move the m_isFLV test into buildPipeline and just call that one...
-       if (m_isFLV)
-       {
-               if ( ! buildFLVPipeline(video, sound) )
-               {
-                       unrefElements();
-                       return;
-               }
-       }
-       else
-       {
-               if (!buildPipeline())
-               {
-                       unrefElements();
+    if (GST_PAD_IS_LINKED (ns->_videopad)) {
                        return;
                }
-       }
 
-
-       if (sound) {
-               // create an audio sink - use oss, alsa or...? make a commandline option?
-               // we first try autodetect, then alsa, then oss, then esd, then...?
-               // If the gstreamer adder ever gets fixed this should be connected to the
-               // adder in the soundhandler.
-#if !defined(__NetBSD__)
-               audiosink = gst_element_factory_make ("autoaudiosink", NULL);
-               if (!audiosink) audiosink = gst_element_factory_make ("alsasink", NULL);
-               if (!audiosink) audiosink = gst_element_factory_make ("osssink", NULL);
-#endif
-               if (!audiosink) audiosink = gst_element_factory_make ("esdsink", NULL);
-
-               if (!audiosink) {
-                       log_error(_("The gstreamer audiosink element could not be created"));
-                       unrefElements();
-                       return;
-               }
-               // setup the audio converter
-               audioconv = gst_element_factory_make ("audioconvert", NULL);
-               if (!audioconv) {
-                       log_error(_("The gstreamer audioconvert element could not be created"));
-                       unrefElements();
-                       return;
-               }
-
-               // setup the volume controller
-               volume = gst_element_factory_make ("volume", NULL);
-               if (!volume) {
-                       log_error(_("The gstreamer volume element could not be created"));
-                       unrefElements();
-                       return;
-               }
+    gst_pad_link (pad, ns->_videopad);
 
        } else  {
-               audiosink = gst_element_factory_make ("fakesink", NULL);
-               if (!audiosink) {
-                       log_error(_("The gstreamer fakesink element could not be created"));
-                       unrefElements();
-                       return;
-               }
+    log_unimpl(_("Streams of type %s are not expected!"), structure_name);
        }
-
-       if (video) {
-               // setup the video colorspaceconverter converter
-               colorspace = gst_element_factory_make ("ffmpegcolorspace", NULL);
-               if (!colorspace)
-               {
-                       log_error("Unable to create colorspace 'ffmpegcolorspace' element");
-                       unrefElements();
-                       return;
-               }
-
-               // Setup the capsfilter which demands either YUV or RGB videoframe format
-               videocaps = gst_element_factory_make ("capsfilter", NULL);
-               if (!videocaps)
-               {
-                       log_error("Unable to create videocaps 'capsfilter' element");
-                       unrefElements();
-                       return;
-               }
-
-               GstCaps* videooutcaps;
-               if (m_videoFrameFormat == render::YUV) {
-                       videooutcaps = gst_caps_new_simple ("video/x-raw-yuv", NULL);
-               } else {
-                       videooutcaps = gst_caps_new_simple ("video/x-raw-rgb", NULL);
-               }
-               g_object_set (G_OBJECT (videocaps), "caps", videooutcaps, NULL);
-               gst_caps_unref (videooutcaps);
-
-               // Setup the videorate element which makes sure the frames are delivered on time.
-               videorate = gst_element_factory_make ("videorate", NULL);
-               if (!videorate)
-               {
-                       log_error("Unable to create videorate 'videorate' element");
-                       unrefElements();
-                       return;
-               }
-
-               // setup the videosink with callback
-               videosink = gst_element_factory_make ("fakesink", NULL);
-               if (!videosink)
-               {
-                       log_error("Unable to create videosink 'fakesink' element");
-                       unrefElements();
-                       return;
-               }
-
-               g_object_set (G_OBJECT (videosink), "signal-handoffs", TRUE, "sync", TRUE, NULL);
-               // TODO: use connectVideoSincCallback()
-               g_signal_connect (videosink, "handoff", G_CALLBACK (NetStreamGst::callback_output), this);
-       }
-
-       if (video && (!colorspace || !videocaps || !videorate || !videosink)) {
-               log_error(_("Gstreamer element(s) for video movie handling could not be created, you probably need to install gstreamer0.10-base for ffmpegcolorspace and videorate support."));
-               unrefElements();
-               return;
-       }
-
-       // put it all in the pipeline and link the elements
-       if (!m_isFLV) { 
-               if (sound) gst_bin_add_many (GST_BIN (pipeline),audiosink, audioconv, volume, NULL);
-               if (video) gst_bin_add_many (GST_BIN (pipeline), source, decoder, colorspace,
-                                       videosink, videorate, videocaps, NULL);
-
-               if (video || sound) gst_element_link(source, decoder);
-               if (video) gst_element_link_many(colorspace, videocaps, videorate, videosink, NULL);
-               if (sound) gst_element_link_many(audioconv, volume, audiosink, NULL);
-
-       } else {
-               if (video) gst_bin_add_many (GST_BIN (pipeline), videosource, videoinputcaps, videodecoder, colorspace, videocaps, videorate, videosink, NULL);
-               if (sound) gst_bin_add_many (GST_BIN (pipeline), audiosource, audioinputcaps, audiodecoder, audioconv, volume, audiosink, NULL);
-
-               if (sound) gst_element_link_many(audiosource, audioinputcaps, audiodecoder, audioconv, volume, audiosink, NULL);
-               if (video) gst_element_link_many(videosource, videoinputcaps, videodecoder, colorspace, videocaps, videorate, videosink, NULL);
-
-       }
-
-       // start playing        
-       if (!m_isFLV)
-       {
-               if (video || sound)
-               {
-                       // TODO: should call playPipeline() ?
-                       gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
-               }
-       }
-       else
-       {
-               if (video || sound)
-               {
-                       log_msg("Pausing pipeline on startPlayback");
-                       if ( ! pausePipeline(true) )
-                       {
-                               log_error("Could not pause pipeline");
-                       }
-               }
-       }
-
-       setStatus(playStart);
-       return;
 }
 
 void
-NetStreamGst::seek(boost::uint32_t pos)
+NetStreamGst::queue_underrun_cb(GstElement* /*queue*/, gpointer user_data)
 {
-       if (!pipeline) {
-               if (m_parser.get())  {
-                       m_parser->seek(pos);
-                       m_clock_offset = 0;
-               }
-               return;
-       }
-
-       if (m_isFLV) {
-               assert(m_parser.get()); // why assumed here and not above ?
-               boost::uint32_t newpos = m_parser->seek(pos);
-               GstClock* clock = GST_ELEMENT_CLOCK(pipeline);
-               boost::uint64_t currenttime = gst_clock_get_time (clock);
-               gst_object_unref(clock);
-               
-               m_clock_offset = (currenttime / GST_MSECOND) - newpos;
+  NetStreamGst* ns = static_cast<NetStreamGst*>(user_data);
 
-       } else {
-               if (!gst_element_seek (pipeline, 1.0, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH,
-                       GST_SEEK_TYPE_SET, GST_MSECOND * pos,
-                       GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE)) {
-                       log_error("Gstreamer seek failed");
-                       setStatus(invalidTime);
-                       return;
-               }
-       }
-       setStatus(seekNotify);
+  ns->setStatus(bufferEmpty);
 }
 
 void
-NetStreamGst::advance()
+NetStreamGst::queue_running_cb(GstElement* /*queue*/, gpointer  user_data)
 {
-       // Check if we should start the playback when a certain amount is buffered
-       // This can happen in 2 cases: 
-       // 1) When playback has just started and we've been waiting for the buffer 
-       //    to be filled (buffersize set by setBufferTime() and default is 100
-       //    miliseconds).
-       // 2) The buffer has be "starved" (not being filled as quickly as needed),
-       //    and we then wait until the buffer contains some data (1 sec) again.
-       if (m_isFLV && m_pause && m_go && m_start_onbuffer && m_parser.get() && m_parser->isTimeLoaded(m_bufferTime))
-       {
-               if ( ! playPipeline() )
-               {
-                       log_error("Could not enable pipeline");
-                       return;
-               }
-       }
-
-       // If we're out of data, but still not done loading, pause playback,
-       // or stop if loading is complete
-       if (m_pausePlayback)
-       {
-#ifdef GNASH_DEBUG
-               log_debug("Playback paused (out of data?)");
-#endif
-
-               m_pausePlayback = false;
-               if (_netCon->loadCompleted())
-               {
-#ifdef GNASH_DEBUG
-                       log_debug("Load completed, setting playStop status and shutting down pipeline");
-#endif
-                       setStatus(playStop);
-
-                       // Drop gstreamer pipeline so callbacks are not called again
-                       if ( ! disablePipeline() )
-                       {
-                               // the state change failed
-                               log_error("Could not interrupt pipeline!");
-
-                               // @@ eh.. what to do then ?
-                       }
-
-                       m_go = false;
-                       m_clock_offset = 0;
-               }
-               else
-               {
-                       //log_debug("Pausing pipeline on ::advance() [ loadCompleted returned false ]");
-                       if ( !pausePipeline(true) )
-                       {
-                               log_error("Could not pause pipeline");
-                       }
-
-                       boost::int64_t pos;
-                       GstState current, pending;
-                       GstStateChangeReturn ret;
-                       GstFormat fmt = GST_FORMAT_TIME;
-
-                       ret = gst_element_get_state (GST_ELEMENT (pipeline), &current, &pending, 0);
-                       if (ret == GST_STATE_CHANGE_SUCCESS) {
-                               if (current != GST_STATE_NULL && gst_element_query_position (pipeline, &fmt, &pos)) {
-                                       pos = pos / 1000000;
-                               } else {
-                                       pos = 0;
-                               }
-                               // Buffer a second before continuing
-                               m_bufferTime = pos + 1000;
-                       } else {
-                               // the pipeline failed state change
-                               log_error("Pipeline failed to complete state change!");
-
-                               // @@ eh.. what to do then 
-                       }
-               }
-       }
-
-       // Check if there are any new status messages, and if we should
-       // pass them to a event handler
-       processStatusNotifications();
-}
-
-boost::int32_t
-NetStreamGst::time()
-{
-
-       if (!pipeline) return 0;
-
-       GstFormat fmt = GST_FORMAT_TIME;
-       boost::int64_t pos;
-       GstStateChangeReturn ret;
-       GstState current, pending;
-
-       ret = gst_element_get_state (GST_ELEMENT (pipeline), &current, &pending, 0);
-
-       if (current != GST_STATE_NULL && gst_element_query_position (pipeline, &fmt, &pos)) {
-               pos = pos / 1000000;
-
-               return pos - m_clock_offset;
-       } else {
-               return 0;
-       }
-}
-
-// Gstreamer callback function
-int 
-NetStreamGst::readPacket(void* opaque, char* buf, int buf_size){
-
-       NetStreamGst* ns = static_cast<NetStreamGst*>(opaque);
-
-       boost::intrusive_ptr<NetConnection> nc = ns->_netCon;
-       size_t ret = nc->read(static_cast<void*>(buf), buf_size);
-       ns->inputPos += ret;
-
-       return ret;
-
-}
-
-// Gstreamer callback function
-int 
-NetStreamGst::seekMedia(void *opaque, int offset, int whence){
-
-       NetStreamGst* ns = static_cast<NetStreamGst*>(opaque);
-       boost::intrusive_ptr<NetConnection> nc = ns->_netCon;
-
-       bool ret;
-
-       // Offset is absolute new position in the file
-       if (whence == SEEK_SET) {
-               ret = nc->seek(offset);
-               if (!ret) return -1;
-               ns->inputPos = offset;
-
-       // New position is offset + old position
-       } else if (whence == SEEK_CUR) {
-               ret = nc->seek(ns->inputPos + offset);
-               if (!ret) return -1;
-               ns->inputPos = ns->inputPos + offset;
-
-       //      // New position is offset + end of file
-       } else if (whence == SEEK_END) {
-               // This is (most likely) a streamed file, so we can't seek to the end!
-               // Instead we seek to 50.000 bytes... seems to work fine...
-               ret = nc->seek(50000);
-               ns->inputPos = 50000;
-       }
-       return ns->inputPos;
-}
-
-/*private*/
-bool
-NetStreamGst::disconnectVideoHandoffSignal()
-{
-       if (videosource && _handoffVideoSigHandler )
-       {
-#ifdef GNASH_DEBUG
-               log_debug("Disconnecting video handoff signal %lu", _handoffVideoSigHandler);
-#endif
-               g_signal_handler_disconnect(videosource, _handoffVideoSigHandler);
-               _handoffVideoSigHandler = 0;
-       }
-
-       // TODO: check return code from previous call !
-       return true;
-}
-
-/*private*/
-bool
-NetStreamGst::disconnectAudioHandoffSignal()
-{
-       if ( audiosource && _handoffAudioSigHandler )
-       {
-#ifdef GNASH_DEBUG
-               log_debug("Disconnecting audio handoff signal %lu", _handoffAudioSigHandler);
-#endif
-               g_signal_handler_disconnect(audiosource, _handoffAudioSigHandler);
-               _handoffAudioSigHandler = 0;
-       }
-
-       // TODO: check return code from previous call !
-       return true;
-}
-
-/*private*/
-bool
-NetStreamGst::connectVideoHandoffSignal()
-{
-#ifdef GNASH_DEBUG
-       log_debug("Connecting video handoff signal");
-#endif
-
-       assert(_handoffVideoSigHandler == 0);
-
-       _handoffVideoSigHandler = g_signal_connect (videosource, "handoff",
-                       G_CALLBACK (NetStreamGst::video_callback_handoff), this);
-#ifdef GNASH_DEBUG
-       log_debug("New _handoffVideoSigHandler id : %lu", _handoffVideoSigHandler);
-#endif
-
-       assert(_handoffVideoSigHandler != 0);
-
-       // TODO: check return code from previous call !
-       return true;
-}
-
-/*private*/
-bool
-NetStreamGst::connectAudioHandoffSignal()
-{
-#ifdef GNASH_DEBUG
-       log_debug("Connecting audio handoff signal");
-#endif
-
-       assert(_handoffAudioSigHandler == 0);
-
-       _handoffAudioSigHandler = g_signal_connect (audiosource, "handoff",
-                       G_CALLBACK (NetStreamGst::audio_callback_handoff), this);
-
-#ifdef GNASH_DEBUG
-       log_debug("New _handoffAudioSigHandler id : %lu", _handoffAudioSigHandler);
-#endif
-
-       assert(_handoffAudioSigHandler != 0);
-
-       // TODO: check return code from previous call !
-       return true;
-}
-
-/*private*/
-bool
-NetStreamGst::disablePipeline()
-{
-       boost::mutex::scoped_lock lock(_pipelineMutex);
-
-       // Disconnect the handoff handler
-       // TODO: VERIFY THE SIGNAL WILL BE RESTORED WHEN NEEDED !!
-       if ( videosource ) disconnectVideoHandoffSignal();
-       if ( audiosource ) disconnectAudioHandoffSignal();
-
-       // Drop gstreamer pipeline so callbacks are not called again
-       GstStateChangeReturn ret =  gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_NULL);
-       if ( ret == GST_STATE_CHANGE_FAILURE )
-       {
-               // the state change failed
-               log_error("Could not interrupt pipeline!");
-               return false;
-
-               // @@ eh.. what to do then ?
-       }
-       else if ( ret == GST_STATE_CHANGE_SUCCESS )
-       {
-               // the state change succeeded
-#ifdef GNASH_DEBUG
-               log_debug("State change to NULL successful");
-#endif
-
-               // just make sure
-               GstState current, pending;
-               ret = gst_element_get_state (GST_ELEMENT (pipeline), &current, &pending, 0);
-               if (current != GST_STATE_NULL )
-               {
-                       log_error("State change to NULL NOT confirmed !");
-                       return false;
-               }
-       }
-       else if ( ret == GST_STATE_CHANGE_ASYNC )
-       {
-               // The element will perform the remainder of the state change
-               // asynchronously in another thread
-               // We'll wait for it...
-
-#ifdef GNASH_DEBUG
-               log_debug("State change to NULL will be asynchronous.. waiting for it");
-#endif
-
-               GstState current, pending;
-               do {
-                       ret = gst_element_get_state (GST_ELEMENT (pipeline), &current, &pending, GST_SECOND*1);
-
-#ifdef GNASH_DEBUG
-                       log_debug(" NULL state change still not completed after X seconds");
-#endif
-
-               } while ( ret == GST_STATE_CHANGE_ASYNC && current != GST_STATE_NULL );
-
-               if ( ret == GST_STATE_CHANGE_SUCCESS )
-               {
-                       assert ( current == GST_STATE_NULL );
-#ifdef GNASH_DEBUG
-                       log_debug(" Async NULL completed successfully");
-#endif
-               }
-               else if ( ret == GST_STATE_CHANGE_FAILURE )
-               {
-                       assert ( current != GST_STATE_NULL );
-#ifdef GNASH_DEBUG
-                       log_debug(" Async NULL completed failing.");
-#endif
-                       return false;
-               }
-               else abort();
-
-
-       }
-       else if ( ret == GST_STATE_CHANGE_NO_PREROLL )
-       {
-               // the state change succeeded but the element
-               // cannot produce data in PAUSED.
-               // This typically happens with live sources.
-#ifdef GNASH_DEBUG
-               log_debug("State change succeeded but the element cannot produce data in PAUSED");
-#endif
-
-               // @@ what to do in this case ?
-       }
-       else
-       {
-               log_error("Unknown return code from gst_element_set_state");
-               return false;
-       }
-
-       return true;
-
-}
-
-/*private*/
-bool
-NetStreamGst::playPipeline()
-{
-       boost::mutex::scoped_lock lock(_pipelineMutex);
-
-#ifdef GNASH_DEBUG
-       log_debug("Setting status to bufferFull and enabling pipeline");
-#endif
-
-       if ( videosource && ! _handoffVideoSigHandler )
-       {
-               connectVideoHandoffSignal();
-       }
-
-       if ( audiosource && ! _handoffAudioSigHandler )
-       {
-               connectAudioHandoffSignal();
-       }
-
-       if (!m_go) { 
-               setStatus(playStart);
-               m_go = true;
-       }
-       m_pause = false;
-       m_start_onbuffer = false;
-
-
-       // Set pipeline to PLAYING state
-       GstStateChangeReturn ret =  gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
-       if ( ret == GST_STATE_CHANGE_FAILURE )
-       {
-               // the state change failed
-               log_error("Could not set pipeline state to PLAYING!");
-               return false;
-
-               // @@ eh.. what to do then ?
-       }
-       else if ( ret == GST_STATE_CHANGE_SUCCESS )
-       {
-               // the state change succeeded
-#ifdef GNASH_DEBUG
-               log_debug("State change to PLAYING successful");
-#endif
-
-               // just make sure
-               GstState current, pending;
-               ret = gst_element_get_state (GST_ELEMENT (pipeline), &current, &pending, 0);
-               if (current != GST_STATE_PLAYING )
-               {
-                       log_error("State change to PLAYING NOT confirmed !");
-                       return false;
-               }
-       }
-       else if ( ret == GST_STATE_CHANGE_ASYNC )
-       {
-               // The element will perform the remainder of the state change
-               // asynchronously in another thread
-               // We'll wait for it...
-
-#ifdef GNASH_DEBUG
-               log_debug("State change to play will be asynchronous.. waiting for it");
-#endif
-
-               GstState current, pending;
-               do {
-                       ret = gst_element_get_state (GST_ELEMENT (pipeline), &current, &pending, GST_SECOND*1);
-
-#ifdef GNASH_DEBUG
-                       log_debug(" Play still not completed after X seconds");
-#endif
-
-               } while ( ret == GST_STATE_CHANGE_ASYNC && current != GST_STATE_PLAYING );
-
-               if ( ret == GST_STATE_CHANGE_SUCCESS )
-               {
-                       assert ( current == GST_STATE_PLAYING );
-#ifdef GNASH_DEBUG
-                       log_debug(" Async play completed successfully");
-#endif
-               }
-               else if ( ret == GST_STATE_CHANGE_FAILURE )
-               {
-                       assert ( current != GST_STATE_PLAYING );
-#ifdef GNASH_DEBUG
-                       log_debug(" Async play completed failing.");
-#endif
-                       return false;
-               }
-               else abort();
-
-       }
-       else if ( ret == GST_STATE_CHANGE_NO_PREROLL )
-       {
-               // the state change succeeded but the element
-               // cannot produce data in PAUSED.
-               // This typically happens with live sources.
-#ifdef GNASH_DEBUG
-               log_debug("State change succeeded but the element cannot produce data in PAUSED");
-#endif
-
-               // @@ what to do in this case ?
-       }
-       else
-       {
-               log_error("Unknown return code from gst_element_set_state");
-               return false;
-       }
-
-       return true;
-
-}
-
-/*private*/
-bool
-NetStreamGst::pausePipeline(bool startOnBuffer)
-{
-       boost::mutex::scoped_lock lock(_pipelineMutex);
-
-#ifdef GNASH_DEBUG
-       log_debug("Setting pipeline state to PAUSE");
-#endif
-
-       if ( ! m_go )
-       {
-#ifdef GNASH_DEBUG
-               log_debug("Won't set the pipeline to PAUSE state if m_go is false");
-#endif
-               return false;
-       }
-
-
-       if ( videosource && ! _handoffVideoSigHandler )
-       {
-               connectVideoHandoffSignal();
-       }
-
-       if ( audiosource && ! _handoffAudioSigHandler )
-       {
-               connectAudioHandoffSignal();
-       }
-
-       m_pause = true;
-       m_start_onbuffer = startOnBuffer;
-
-       // Set pipeline to PAUSE state
-       GstStateChangeReturn ret =  gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PAUSED);
-       if ( ret == GST_STATE_CHANGE_FAILURE )
-       {
-               // the state change failed
-               log_error("Could not interrupt pipeline!");
-               return false;
-       }
-       else if ( ret == GST_STATE_CHANGE_SUCCESS )
-       {
-               // the state change succeeded
-#ifdef GNASH_DEBUG
-               log_debug("State change to PAUSE successful");
-#endif
-
-               // just make sure
-               GstState current, pending;
-               ret = gst_element_get_state (GST_ELEMENT (pipeline), &current, &pending, 0);
-               if (current != GST_STATE_PAUSED )
-               {
-                       log_error("State change to PLAYING NOT confirmed !");
-                       return false;
-               }
-       }
-       else if ( ret == GST_STATE_CHANGE_ASYNC )
-       {
-               // The element will perform the remainder of the state change
-               // asynchronously in another thread
-               // We'll wait for it...
-
-#ifdef GNASH_DEBUG
-               log_debug("State change to paused will be asynchronous.. waiting for it");
-#endif
-
-               GstState current, pending;
-               do {
-                       ret = gst_element_get_state (GST_ELEMENT (pipeline), &current, &pending, GST_SECOND*1);
-
-#ifdef GNASH_DEBUG
-                       log_debug(" Pause still not completed after X seconds");
-#endif
-
-               } while ( ret == GST_STATE_CHANGE_ASYNC && current != GST_STATE_PAUSED );
-
-               if ( ret == GST_STATE_CHANGE_SUCCESS )
-               {
-                       assert ( current == GST_STATE_PAUSED );
-#ifdef GNASH_DEBUG
-                       log_debug(" Async pause completed successfully");
-#endif
-               }
-               else if ( ret == GST_STATE_CHANGE_FAILURE )
-               {
-                       assert ( current != GST_STATE_PAUSED );
-#ifdef GNASH_DEBUG
-                       log_debug(" Async pause completed failing.");
-#endif
-                       return false;
-               }
-               else abort();
-
-       }
-       else if ( ret == GST_STATE_CHANGE_NO_PREROLL )
-       {
-               // the state change succeeded but the element
-               // cannot produce data in PAUSED.
-               // This typically happens with live sources.
-#ifdef GNASH_DEBUG
-               log_debug("State change succeeded but the element cannot produce data in PAUSED");
-#endif
-
-               // @@ what to do in this case ?
-       }
-       else
-       {
-               log_error("Unknown return code from gst_element_set_state");
-               return false;
-       }
-
-       return true;
+  NetStreamGst* ns = static_cast<NetStreamGst*>(user_data);  
 
+  ns->setStatus(bufferFull);
 }
 
+} // end of gnash namespace
 
-} // gnash namespcae
-
-#endif // SOUND_GST

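The pause/play helpers removed above all branch on the same four gst_element_set_state() outcomes: synchronous success, asynchronous completion, failure, and no-preroll. As a GStreamer-free illustration of that control flow, the sketch below models the decision structure with a stand-in enum (all names here are invented for illustration; the real constants and the polling call gst_element_get_state() live in <gst/gst.h>):

```cpp
#include <cassert>
#include <string>

// Illustrative stand-in for GstStateChangeReturn; the real GStreamer
// enum has the same four outcomes.
enum StateChangeReturn {
    CHANGE_FAILURE,    // transition failed outright
    CHANGE_SUCCESS,    // transition completed synchronously
    CHANGE_ASYNC,      // transition continues in another thread
    CHANGE_NO_PREROLL  // succeeded, but cannot produce data in PAUSED (live source)
};

// Mirrors the decision structure of the removed pause/play helpers:
// only FAILURE aborts; ASYNC means the caller must poll until the
// target state is confirmed; NO_PREROLL is a success for live sources.
std::string classify(StateChangeReturn ret)
{
    switch (ret) {
    case CHANGE_SUCCESS:    return "done";
    case CHANGE_ASYNC:      return "wait";  // caller polls get_state()
    case CHANGE_NO_PREROLL: return "live";  // success, but no preroll
    case CHANGE_FAILURE:    return "error";
    }
    return "unknown";
}
```

The removed code treated "wait" by polling in a loop with a one-second timeout, which is exactly the busy-waiting the rewritten NetStreamGst avoids by reacting to bus messages instead.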
Index: server/asobj/NetStreamGst.h
===================================================================
RCS file: /sources/gnash/gnash/server/asobj/NetStreamGst.h,v
retrieving revision 1.29
retrieving revision 1.30
diff -u -b -r1.29 -r1.30
--- server/asobj/NetStreamGst.h 4 Dec 2007 11:45:31 -0000       1.29
+++ server/asobj/NetStreamGst.h 21 Jan 2008 07:07:28 -0000      1.30
@@ -15,7 +15,7 @@
 // along with this program; if not, write to the Free Software
 // Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA  02110-1301  USA
 
-/* $Id: NetStreamGst.h,v 1.29 2007/12/04 11:45:31 strk Exp $ */
+/* $Id: NetStreamGst.h,v 1.30 2008/01/21 07:07:28 bjacques Exp $ */
 
 #ifndef __NETSTREAMGST_H__
 #define __NETSTREAMGST_H__
@@ -26,22 +26,12 @@
 
 #ifdef SOUND_GST
 
-#include <boost/thread/thread.hpp>
 #include <boost/bind.hpp> 
-#include <boost/thread/mutex.hpp>
 #include "impl.h"
 #include "video_stream_instance.h"
 #include <gst/gst.h>
 #include "image.h"
 #include "NetStream.h" // for inheritance
-#include "FLVParser.h"
-
-/// Define DISABLE_START_THREAD to avoid using threads 
-///
-/// TODO: use a globally-defined thread-disabling routine
-///       when available
-///
-//#define DISABLE_START_THREAD
 
 
 namespace gnash {
@@ -50,185 +40,46 @@
 public:
        NetStreamGst();
        ~NetStreamGst();
+
        void close();
-       void pause(int mode);
-       void play(const std::string& source);
+
+  void pause(PauseMode mode);
+
+  void play(const std::string& url);
+
        void seek(boost::uint32_t pos);
-       boost::int32_t time();
+       
        void advance();
 
-       // Used for gstreamer data read and seek callbacks
-       static int readPacket(void* opaque, char* buf, int buf_size);
-       static int seekMedia(void *opaque, int offset, int whence);
-
-       void startPlayback();
-
-       static void playbackStarter(NetStreamGst* ns);
-       static void callback_output (GstElement* /*c*/, GstBuffer *buffer, GstPad* /*pad*/, gpointer user_data);
-       static void callback_newpad (GstElement *decodebin, GstPad *pad, gboolean last, gpointer data);
-       static void video_callback_handoff (GstElement* /*c*/, GstBuffer *buffer, GstPad* /*pad*/, gpointer user_data);
-       static void audio_callback_handoff (GstElement* /*c*/, GstBuffer *buffer, GstPad* /*pad*/, gpointer user_data);
+  boost::int32_t time();
 
-private:
+  double getCurrentFPS();
 
-       /// Creates the sound decoder and source element for playing FLVs
-       //
-       /// Does NOT Lock the pipeline mutex during operations, use buildFLVPipeline() for that.
-       ///
-       /// @return true on success, false on failure
-       ///
-       /// @param sound
-       ///     Determines if sound should be setup. It is passed by reference 
-       ///     and might be changed.
-       ///
-       bool buildFLVSoundPipeline(bool &sound);
-
-       /// Creates the video decoder and source element for playing FLVs
-       //
-       /// Does NOT Lock the pipeline mutex during operations, use buildFLVPipeline() for that.
-       ///
-       /// @return true on success, false on failure
-       ///
-       /// @param video
-       ///     Determines if video should be setup. It is passed by reference 
-       ///     and might be changed.
-       ///
-       bool buildFLVVideoPipeline(bool &video);
-
-       /// Creates the decoder and source element for playing FLVs
-       //
-       /// Locks the pipeline mutex during operations
-       ///
-       ///
-       /// @param video
-       ///     Determines if video should be setup. It is passed by reference 
-       ///     and might be changed.
-       ///
-       /// @param sound
-       ///     Determines if sound should be setup. It is passed by reference 
-       ///     and might be changed.
-       ///
-       /// @return true on success, false on failure
-       ///
-       bool buildFLVPipeline(bool& video, bool& audio);
-
-       /// Creates the decoder and source element for playing non-FLVs
-       //
-       /// Locks the pipeline mutex during operations
-       ///
-       /// @return true on success, false on failure
-       ///
-       bool buildPipeline();
-
-       /// Unrefs (deletes) all the gstreamer elements. Used when the setup failed.
-       //
-       /// Locks the pipeline mutex during operations
-       ///
-       void unrefElements();
-
-       /// Connect the video "handoff" signal
-       //
-       /// @return true on success, false on failure
-       ///
-       bool connectVideoHandoffSignal();
-
-       /// Connect the audio "handoff" signal
-       //
-       /// @return true on success, false on failure
-       ///
-       bool connectAudioHandoffSignal();
-
-       /// Disconnect the video "handoff" signal
-       //
-       /// @return true on success, false on failure
-       ///
-       bool disconnectVideoHandoffSignal();
-
-       /// Disconnect the audio "handoff" signal
-       //
-       /// @return true on success, false on failure
-       ///
-       bool disconnectAudioHandoffSignal();
-
-       /// \brief
-       /// Set pipeline state to GST_STATE_NULL
-       /// and disconnect handoff signals
-       //
-       /// Locks the pipeline mutex during operations
-       ///
-       /// If the call needs be asynchronous, we'll wait for it.
-       /// TODO: implement the above
-       ///
-       bool disablePipeline();
-
-       /// \brief
-       /// Set pipeline state to GST_STATE_PLAYING,
-       /// connect handoff signals, send appropriate
-       /// notifications.
-       //
-       /// Locks the pipeline mutex during operations
-       ///
-       /// If the call needs be asynchronous, we'll wait for it.
-       /// TOOD: implement the above
-       ///
-       bool playPipeline();
-
-       /// \brief
-       /// Set pipeline state to GST_STATE_PAUSE,
-       /// connect handoff signals if not connected already
-       //
-       /// Locks the pipeline mutex during operations
-       ///
-       /// If the call needs be asynchronous, we'll wait for it.
-       /// TOOD: implement the above
-       ///
-       bool pausePipeline(bool startOnBuffer);
-
-       // gstreamer pipeline objects
-       GstElement *pipeline;
-       GstElement *audiosink;
-       GstElement *videosink;
-       GstElement *decoder;
-       GstElement *volume;
-       GstElement *colorspace;
-       GstElement *videorate;
-       GstElement *videocaps;
-       GstElement *videoflip;
-       GstElement *audioconv;
-
-       // Mutex protecting pipeline control
-       boost::mutex _pipelineMutex;
-
-       // used only for FLV
-       GstElement *audiosource;
-       GstElement *videosource;
-       GstElement *source;
-       GstElement *videodecoder;
-       GstElement *audiodecoder;
-       GstElement *videoinputcaps;
-       GstElement *audioinputcaps;
-
-       // Signal handlers id
-       gulong _handoffVideoSigHandler;
-       gulong _handoffAudioSigHandler;
+  long bytesLoaded();
 
-#ifndef DISABLE_START_THREAD
-       boost::thread *startThread;
-#endif
+  long bytesTotal();
+       
+  static void video_data_cb(GstElement* /*c*/, GstBuffer *buffer, GstPad* pad,
+                           gpointer user_data);
+  
+  static void
+  decodebin_newpad_cb(GstElement* decodebin, GstPad* pad, gboolean last,
+                      gpointer user_data);
 
-       // video info
-       int videowidth;
-       int videoheight;
-
-       // Used when seeking. To make the gst-pipeline more cooperative
-       // we don't tell it when we seek, but just add m_clock_offset to
-       // make it believe the search never happend. A better aproach whould
-       // probably be to make a dedicated gstreamer source element.
-       volatile long m_clock_offset;
+  static void queue_underrun_cb(GstElement *queue, gpointer  user_data);
 
-       // On next advance() should we pause?
-       volatile bool m_pausePlayback;
+  static void queue_running_cb(GstElement *queue, gpointer  user_data);
+
+
+private:
+  void handleMessage (GstMessage *message);
 
+  GstElement* _pipeline;
+  GstElement* _dataqueue;
+  GstElement* _downloader;
+  GstPad*     _videopad;
+  GstPad*     _audiopad;
+  gint64      _duration;
 };
 
 } // gnash namespace

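The slimmed-down header above relies on GLib-style static callbacks (video_data_cb, decodebin_newpad_cb, queue_underrun_cb) that recover the C++ object from an opaque user_data pointer. That pattern can be sketched without GStreamer as follows (Player and data_cb are invented illustration names, not part of the committed API):

```cpp
#include <cassert>

// Stand-in for GLib's gpointer.
typedef void* gpointer;

class Player {
public:
    int buffers_seen = 0;

    // Static callback in the style of video_data_cb: a C library cannot
    // call a member function directly, so it invokes this static
    // function with the opaque pointer we registered at connect time,
    // and we cast it back to recover the object.
    static void data_cb(int /*buffer*/, gpointer user_data)
    {
        Player* self = static_cast<Player*>(user_data);
        ++self->buffers_seen;
    }
};
```

In the real code the registration happens via g_signal_connect(), which stores the `this` pointer as user_data; the static function is the only thing GStreamer ever sees.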
Index: server/asobj/Sound.cpp
===================================================================
RCS file: /sources/gnash/gnash/server/asobj/Sound.cpp,v
retrieving revision 1.25
retrieving revision 1.26
diff -u -b -r1.25 -r1.26
--- server/asobj/Sound.cpp      17 Dec 2007 12:32:54 -0000      1.25
+++ server/asobj/Sound.cpp      21 Jan 2008 07:07:28 -0000      1.26
@@ -63,7 +63,6 @@
 
 Sound::Sound()         :
        as_object(getSoundInterface()),
-       connection(),
        soundId(-1),
        externalSound(false),
        isStreaming(false)
@@ -147,9 +146,10 @@
                log_error(_("%s: This sound already has a connection?  (We try to handle this by overriding the old one...)"), __FUNCTION__);
        }
        externalURL = file;
-
+#if 0
        connection = new NetConnection();
        connection->openConnection(externalURL);
+#endif
 }
 
 void

Index: server/asobj/SoundGst.cpp
===================================================================
RCS file: /sources/gnash/gnash/server/asobj/SoundGst.cpp,v
retrieving revision 1.15
retrieving revision 1.16
diff -u -b -r1.15 -r1.16
--- server/asobj/SoundGst.cpp   17 Dec 2007 12:32:54 -0000      1.15
+++ server/asobj/SoundGst.cpp   21 Jan 2008 07:07:28 -0000      1.16
@@ -29,89 +29,29 @@
 #include "fn_call.h"
 #include "GnashException.h"
 #include "builtin_function.h"
-
-#include "gstgnashsrc.h"
+#include "URL.h"
 
 #include <string>
 
-namespace gnash {
-
-static gboolean
-register_elements (GstPlugin *plugin)
-{
-       return gst_element_register (plugin, "gnashsrc", GST_RANK_NONE, GST_TYPE_GNASH_SRC);
-}
-
-static GstPluginDesc gnash_plugin_desc = {
-       0, // GST_VERSION_MAJOR
-       10, // GST_VERSION_MINOR
-       "gnashsrc",
-       "Use gnash as source via callbacks",
-       register_elements,
-       "0.0.1",
-       "LGPL",
-       "gnash",
-       "gnash",
-       "http://www.gnu.org/software/gnash/",
-       GST_PADDING_INIT
-};
-
-// Gstreamer callback function
-int 
-SoundGst::readPacket(void* opaque, char* buf, int buf_size)
-{
-
-       SoundGst* so = static_cast<SoundGst*>(opaque);
-       boost::intrusive_ptr<NetConnection> nc = so->connection;
-
-       size_t ret = nc->read(static_cast<void*>(buf), buf_size);
-       so->inputPos += ret;
-       return ret;
+// TODO: implement loops
+//       seeking
 
-}
-
-// Gstreamer callback function
-int 
-SoundGst::seekMedia(void *opaque, int offset, int whence){
-
-       SoundGst* so = static_cast<SoundGst*>(opaque);
-       boost::intrusive_ptr<NetConnection> nc = so->connection;
-
-
-       // Offset is absolute new position in the file
-       if (whence == SEEK_SET) {
-               nc->seek(offset);
-               so->inputPos = offset;
-
-       // New position is offset + old position
-       } else if (whence == SEEK_CUR) {
-               nc->seek(so->inputPos + offset);
-               so->inputPos = so->inputPos + offset;
-
-       //      // New position is offset + end of file
-       } else if (whence == SEEK_END) {
-               // This is (most likely) a streamed file, so we can't seek to the end!
-               // Instead we seek to 50.000 bytes... seems to work fine...
-               nc->seek(50000);
-               so->inputPos = 50000;
-               
-       }
-
-       return so->inputPos;
-}
+namespace gnash {
 
-// Callback function used by Gstreamer to to attached audio and video streams
-// detected by decoderbin to either the video out or audio out elements.
+// Callback function used by Gstreamer to attach audio and video streams
+// detected by decodebin to either the video out or audio out elements.
 void
 SoundGst::callback_newpad (GstElement* /*decodebin*/, GstPad *pad, gboolean /*last*/, gpointer data)
 {
+#if 0
        log_msg(_("%s: new pad found"), __FUNCTION__);
+#endif
        SoundGst* so = static_cast<SoundGst*>(data);
        GstCaps *caps;
        GstStructure *str;
        GstPad *audiopad;
 
-       audiopad = gst_element_get_pad (so->audioconv, "sink");
+       audiopad = gst_element_get_static_pad (so->_audioconv, "sink");
 
        // check media type
        caps = gst_pad_get_caps (pad);
@@ -122,135 +62,131 @@
                log_msg(_("%s: new pad connected"), __FUNCTION__);
        } else {
                gst_object_unref (audiopad);
-               log_error(_("%s: Non-audio data found in file %s"), __FUNCTION__,
-                               so->externalURL.c_str());
+               log_msg(_("%s: Non-audio data found in Sound url"), __FUNCTION__);
        }
        gst_caps_unref (caps);
-       return;
+       gst_object_unref(GST_OBJECT(audiopad));
 }
 
 void
-SoundGst::setupDecoder(SoundGst* so)
+SoundGst::setupDecoder(const std::string& url)
 {
-
-       boost::intrusive_ptr<NetConnection> nc = so->connection;
-       assert(nc);
-
-       // Pass stuff from/to the NetConnection object.
-       assert(so);
-       if ( !nc->openConnection(so->externalURL) ) {
-               log_error(_("could not open audio url: %s"), so->externalURL.c_str());
-               delete so->lock;
-               return;
-       }
-
-       so->inputPos = 0;
+       _inputPos = 0;
 
        // init GStreamer
        gst_init (NULL, NULL);
 
-       // setup the GnashNC plugin
-       _gst_plugin_register_static (&gnash_plugin_desc);
-
        // setup the pipeline
-       so->pipeline = gst_pipeline_new (NULL);
+       _pipeline = gst_pipeline_new (NULL);
+
+       if (!_pipeline) {
+               log_error(_("Could not create gstreamer pipeline element"));
+               return;
+       }
 
-       // create an audio sink - use oss, alsa or...? make a commandline option?
-       // we first try atudetect, then alsa, then oss, then esd, then...?
-       // If the gstreamer adder ever gets fixed this should be connected to the
-       // adder in the soundhandler.
 #if !defined(__NetBSD__)
-       so->audiosink = gst_element_factory_make ("autoaudiosink", NULL);
-       if (!so->audiosink) so->audiosink = gst_element_factory_make ("alsasink", NULL);
-       if (!so->audiosink) so->audiosink = gst_element_factory_make ("osssink", NULL);
+       _audiosink = gst_element_factory_make ("autoaudiosink", NULL);
+       if (!_audiosink) _audiosink = gst_element_factory_make ("alsasink", NULL);
+       if (!_audiosink) _audiosink = gst_element_factory_make ("osssink", NULL);
 #endif
-       if (!so->audiosink) so->audiosink = gst_element_factory_make ("esdsink", NULL);
+       if (!_audiosink) _audiosink = gst_element_factory_make ("esdsink", NULL);
 
-       // Check if the creation of the gstreamer pipeline and audiosink was a succes
-       if (!so->pipeline) {
-               gnash::log_error(_("Could not create gstreamer pipeline element"));
-               return;
-       }
-       if (!so->audiosink) {
-               gnash::log_error(_("Could not create gstreamer audiosink element"));
+       if (!_audiosink) {
+               log_error(_("Could not create gstreamer audiosink element"));
+                gst_object_unref(GST_OBJECT(_pipeline));
                return;
        }
 
-       // setup gnashnc source (our homegrown source element)
-       so->source = gst_element_factory_make ("gnashsrc", NULL);
-       gnashsrc_callback* gc = new gnashsrc_callback;
-       gc->read = SoundGst::readPacket;
-       gc->seek = SoundGst::seekMedia;
-       g_object_set (G_OBJECT (so->source), "data", so, "callbacks", gc, NULL);
-
        // setup the audio converter
-       so->audioconv = gst_element_factory_make ("audioconvert", NULL);
+       _audioconv = gst_element_factory_make ("audioconvert", NULL);
 
        // setup the volume controller
-       so->volume = gst_element_factory_make ("volume", NULL);
+       _volume = gst_element_factory_make ("volume", NULL);
 
        // setup the decoder with callback
-       so->decoder = gst_element_factory_make ("decodebin", NULL);
-       g_signal_connect (so->decoder, "new-decoded-pad", G_CALLBACK (SoundGst::callback_newpad), so);
+       _decoder = gst_element_factory_make ("decodebin", NULL);
+       g_signal_connect (_decoder, "new-decoded-pad", G_CALLBACK (SoundGst::callback_newpad), this);
 
 
-       if (!so->source || !so->audioconv || !so->volume || !so->decoder) {
+       if (!_audioconv || !_volume || !_decoder) {
                gnash::log_error(_("Could not create Gstreamer element(s) for movie handling"));
                return;
        }
 
+       GstElement* downloader = gst_element_make_from_uri(GST_URI_SRC, url.c_str(),
+                                                          "gnash_audiodownloader");
+
+       GstElement* queue = gst_element_factory_make ("queue", "gnash_audioqueue");
+
+
        // put it all in the pipeline
-       gst_bin_add_many (GST_BIN (so->pipeline), so->source, so->decoder, so->audiosink, so->audioconv, so->volume, NULL);
+       gst_bin_add_many (GST_BIN (_pipeline), downloader, queue, _decoder,
+                         _audiosink, _audioconv, _volume, NULL);
 
        // link the elements
-       gst_element_link(so->source, so->decoder);
-       gst_element_link_many(so->audioconv, so->volume, so->audiosink, NULL);
+       gst_element_link_many(_audioconv, _volume, _audiosink, NULL);
+  gst_element_link_many(downloader, queue, _decoder, NULL);
        
-       // By deleting this lock we allow start() to start playback
-       delete so->lock;
        return;
 }
 
-SoundGst::~SoundGst() {
+SoundGst::~SoundGst()
+{
 
-       if (externalSound && pipeline) {
-               gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_NULL);
-               gst_object_unref (GST_OBJECT (pipeline));
+       if (externalSound && _pipeline) {
+               gst_element_set_state (_pipeline, GST_STATE_NULL);
+               gst_object_unref (GST_OBJECT (_pipeline));
        }
 }
 
 void
-SoundGst::loadSound(const std::string& file, bool streaming)
+SoundGst::loadSound(const std::string& url, bool streaming)
 {
-       pipeline = NULL;
-       remainingLoops = 0;
+  connection = new NetConnection;
 
-       if (connection) {
-               log_error(_("%s: This sound already has a connection?  (We try to handle this by overriding the old one...)"), __FUNCTION__);
-       }
-       externalURL = file;
+  std::string valid_url = connection->validateURL(url);
 
-       connection = new NetConnection();
+       log_msg("%s: loading URL %s", __FUNCTION__, valid_url.c_str());
 
-       externalSound = true;
-       isStreaming = streaming;
+       _remainingLoops = 0;
+
+       if (_pipeline) {
+               log_msg(_("%s: This sound already has a pipeline. Resetting for new URL connection. (%s)"), __FUNCTION__, valid_url.c_str());
+               gst_element_set_state (_pipeline, GST_STATE_NULL); // FIXME: wait for state?
+
+               GstElement* downloader = gst_bin_get_by_name(GST_BIN(_pipeline), "gnash_audiodownloader");
+               gst_bin_remove(GST_BIN(_pipeline), downloader);
+               gst_object_unref(GST_OBJECT(downloader));
+
+               downloader = gst_element_make_from_uri(GST_URI_SRC, valid_url.c_str(),
+                                                       "gnash_audiodownloader");
 
-       lock = new boost::mutex::scoped_lock(setupMutex);
+               gst_bin_add(GST_BIN(_pipeline), downloader);
 
-       // To avoid blocking while connecting, we use a thread.
-       setupThread = new boost::thread(boost::bind(SoundGst::setupDecoder, this));
+               GstElement* queue = gst_bin_get_by_name(GST_BIN(_pipeline), "gnash_audioqueue");
 
+               gst_element_link(downloader, queue);
+               gst_object_unref(GST_OBJECT(queue));
+       } else {
+               setupDecoder(valid_url);
+       }
+
+       externalSound = true;
+
+       start(0, 0);
 }
 
 void
 SoundGst::start(int offset, int loops)
 {
-       boost::mutex::scoped_lock lock(setupMutex);
+       if (!externalSound) {
+               Sound::start(offset, loops);
+               return;
+       }
 
-       if (externalSound) {
                if (offset > 0) {
                        // Seek to offset position
-                       if (!gst_element_seek (pipeline, 1.0, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH,
+               if (!gst_element_seek (_pipeline, 1.0, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH,
                                GST_SEEK_TYPE_SET, GST_SECOND * static_cast<long>(offset),
                                GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE)) {
                                log_error(_("%s: seeking to offset failed"), 
@@ -260,40 +196,21 @@
                }
                // Save how many loops to do
                if (loops > 0) {
-                       remainingLoops = loops;
+               _remainingLoops = loops;
                }
                // start playing        
-               gst_element_set_state (pipeline, GST_STATE_PLAYING);
-
-       }
-
-
-       // Start sound
-       media::sound_handler* s = get_sound_handler();
-       if (s) {
-               if (!externalSound) {
-               s->play_sound(soundId, loops, offset, 0, NULL);
-           }
-       }
+       gst_element_set_state (_pipeline, GST_STATE_PLAYING);
 }
 
 void
 SoundGst::stop(int si)
 {
-       // stop the sound
-       media::sound_handler* s = get_sound_handler();
-       if (s != NULL)
-       {
-           if (si < 0) {
-               if (externalSound) {
-                               gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_NULL);
-               } else {
-                               s->stop_sound(soundId);
-                       }
-               } else {
-                       s->stop_sound(si);
-               }
+       if (!externalSound) {
+               Sound::stop(si);
+               return;
        }
+
+       gst_element_set_state (GST_ELEMENT (_pipeline), GST_STATE_NULL);
 }
 
 unsigned int
@@ -301,20 +218,14 @@
 {
        // Return the duration of the file in milliseconds
        
-       // If this is a event sound get the info from the soundhandler
        if (!externalSound) {
-               media::sound_handler* s = get_sound_handler();
-               if (s) {                
-               return (s->get_duration(soundId));
-           } else {
-               return 0; // just in case
-               }
+               return Sound::getDuration();
        }
        
        GstFormat fmt = GST_FORMAT_TIME;
        boost::int64_t len;
 
-       if (pipeline && gst_element_query_duration (pipeline, &fmt, &len)) {
+       if (_pipeline && gst_element_query_duration (_pipeline, &fmt, &len)) {
                return static_cast<unsigned int>(len / GST_MSECOND);
        } else {
                return 0;
@@ -326,26 +237,20 @@
 {
        // Return the position in the file in milliseconds
        
-       // If this is a event sound get the info from the soundhandler
        if (!externalSound) {
-               media::sound_handler* s = get_sound_handler();
-               if (s) {
-                       return s->get_position(soundId);        
-           } else {
-               return 0; // just in case
-               }
+               return Sound::getPosition();
        }
        
-       if (!pipeline) return 0;
+       if (!_pipeline) return 0;
 
        GstFormat fmt = GST_FORMAT_TIME;
        boost::int64_t pos;
        GstStateChangeReturn ret;
        GstState current, pending;
 
-       ret = gst_element_get_state (GST_ELEMENT (pipeline), &current, &pending, 0);
+       ret = gst_element_get_state (GST_ELEMENT (_pipeline), &current, &pending, 0);
 
-       if (current != GST_STATE_NULL && gst_element_query_position (pipeline, &fmt, &pos)) {
+       if (current != GST_STATE_NULL && gst_element_query_position (_pipeline, &fmt, &pos)) {
                return static_cast<unsigned int>(pos / GST_MSECOND);
        } else {
                return 0;

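getPosition() and getDuration() above divide GStreamer's nanosecond timestamps by GST_MSECOND to produce the milliseconds that ActionScript expects. The conversion, written out as a standalone helper (MSECOND_NS is an illustrative stand-in for GST_MSECOND, which is one millisecond expressed in nanoseconds):

```cpp
#include <cassert>
#include <cstdint>

// GStreamer clock values are in nanoseconds; one millisecond is
// 1,000,000 ns (GStreamer defines this as GST_MSECOND).
const int64_t MSECOND_NS = 1000000;

// Truncating conversion from a nanosecond position/duration to
// milliseconds, matching the len / GST_MSECOND expression above.
unsigned int to_milliseconds(int64_t gst_time_ns)
{
    return static_cast<unsigned int>(gst_time_ns / MSECOND_NS);
}
```

A 2.5-second stream position (2,500,000,000 ns) therefore comes back as 2500 ms, which is the unit Sound.position and Sound.duration report.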
Index: server/asobj/SoundGst.h
===================================================================
RCS file: /sources/gnash/gnash/server/asobj/SoundGst.h,v
retrieving revision 1.7
retrieving revision 1.8
diff -u -b -r1.7 -r1.8
--- server/asobj/SoundGst.h     17 Dec 2007 12:32:54 -0000      1.7
+++ server/asobj/SoundGst.h     21 Jan 2008 07:07:28 -0000      1.8
@@ -42,17 +42,15 @@
 public:
        SoundGst()
                :
-               pipeline(NULL),
-               audiosink(NULL),
-               source(NULL),
-               decoder(NULL),
-               volume(NULL),
-               audioconv(NULL),
-               setupThread(NULL),
-               lock(NULL), 
-               inputPos(0),
-               isAttached(false),
-               remainingLoops(0)
+               _pipeline(NULL),
+               _audiosink(NULL),
+               _source(NULL),
+               _decoder(NULL),
+               _volume(NULL),
+               _audioconv(NULL),
+               _inputPos(0),
+               _isAttached(false),
+               _remainingLoops(0)
        {}
        ~SoundGst();
 
@@ -62,34 +60,25 @@
        unsigned int getDuration();
        unsigned int getPosition();
 
-       // Used for ffmpeg data read and seek callbacks
-       static int readPacket(void* opaque, char* buf, int buf_size);
-       static int seekMedia(void *opaque, int offset, int whence);
-
-       static void setupDecoder(SoundGst* so);
-       static bool getAudio(void *owner, boost::uint8_t *stream, int len);
+       void setupDecoder(const std::string& url);
 
        static void callback_newpad (GstElement *decodebin, GstPad *pad, gboolean last, gpointer data);
 private:
 
        // gstreamer pipeline objects
-       GstElement *pipeline;
-       GstElement *audiosink;
-       GstElement *source;
-       GstElement *decoder;
-       GstElement *volume;
-       GstElement *audioconv;
-
-       boost::thread *setupThread;
-       boost::mutex setupMutex;
-       boost::mutex::scoped_lock *lock;
+       GstElement* _pipeline;
+       GstElement* _audiosink;
+       GstElement* _source;
+       GstElement* _decoder;
+       GstElement* _volume;
+       GstElement* _audioconv;
 
-       long inputPos;
+       long _inputPos;
 
-       // Are this sound attached to the soundhandler?
-       bool isAttached;
+       // Is this sound attached to the soundhandler?
+       bool _isAttached;
 
-       int remainingLoops;
+       int _remainingLoops;
 };
 
 

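The demuxer added below defines FLV_HEADER_SIZE as 13: the 9-byte FLV file header ('F' 'L' 'V', a version byte, the audio/video type flags, and a 32-bit big-endian data offset) plus the 4-byte PreviousTagSize field that precedes the first tag. A standalone sketch of that header check, based on the public FLV specification rather than on the committed parser (struct and function names are invented for illustration):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// Per the FLV spec: 9-byte header + 4-byte first PreviousTagSize = 13,
// matching FLV_HEADER_SIZE in gstflvdemux.c.
const size_t FLV_HEADER_SIZE = 13;

struct FlvHeader {
    uint8_t  version;
    bool     has_audio;
    bool     has_video;
    uint32_t data_offset;  // big-endian in the stream; usually 9
};

// Returns false if the buffer does not start with a valid FLV header.
bool parse_flv_header(const uint8_t* data, size_t len, FlvHeader* out)
{
    if (len < FLV_HEADER_SIZE) return false;
    if (data[0] != 'F' || data[1] != 'L' || data[2] != 'V') return false;

    out->version   = data[3];
    out->has_audio = (data[4] & 0x04) != 0;  // TypeFlagsAudio (bit 2)
    out->has_video = (data[4] & 0x01) != 0;  // TypeFlagsVideo (bit 0)
    out->data_offset = (uint32_t(data[5]) << 24) | (uint32_t(data[6]) << 16)
                     | (uint32_t(data[7]) << 8)  |  uint32_t(data[8]);
    return true;
}
```

The real demuxer does this incrementally through a GstAdapter so it works in push mode, but the byte layout it validates is the same.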
Index: libmedia/gst/gstflvdemux.c
===================================================================
RCS file: libmedia/gst/gstflvdemux.c
diff -N libmedia/gst/gstflvdemux.c
--- /dev/null   1 Jan 1970 00:00:00 -0000
+++ libmedia/gst/gstflvdemux.c  21 Jan 2008 07:07:27 -0000      1.1
@@ -0,0 +1,1200 @@
+/* GStreamer
+ * Copyright (C) <2007> Julien Moutte <address@hidden>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
+ * Boston, MA 02111-1307, USA.
+ */
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#include "gstflvdemux.h"
+#include "gstflvparse.h"
+
+#include <string.h>
+
+static GstStaticPadTemplate flv_sink_template = GST_STATIC_PAD_TEMPLATE ("sink",
+    GST_PAD_SINK,
+    GST_PAD_ALWAYS,
+    GST_STATIC_CAPS ("video/x-flv")
+    );
+
+static GstStaticPadTemplate audio_src_template =
+GST_STATIC_PAD_TEMPLATE ("audio",
+    GST_PAD_SRC,
+    GST_PAD_SOMETIMES,
+    GST_STATIC_CAPS_ANY);
+
+static GstStaticPadTemplate video_src_template =
+GST_STATIC_PAD_TEMPLATE ("video",
+    GST_PAD_SRC,
+    GST_PAD_SOMETIMES,
+    GST_STATIC_CAPS_ANY);
+
+static GstElementDetails flv_demux_details = {
+  "FLV Demuxer",
+  "Codec/Demuxer",
+  "Demux FLV feeds into digital streams",
+  "Julien Moutte <address@hidden>"
+};
+
+GST_DEBUG_CATEGORY (flvdemux_debug);
+#define GST_CAT_DEFAULT flvdemux_debug
+
+GST_BOILERPLATE (GstFLVDemux, gst_flv_demux, GstElement, GST_TYPE_ELEMENT);
+
+#define FLV_HEADER_SIZE 13
+#define FLV_TAG_TYPE_SIZE 4
+
+static void
+gst_flv_demux_flush (GstFLVDemux * demux, gboolean discont)
+{
+  GST_DEBUG_OBJECT (demux, "flushing queued data in the FLV demuxer");
+
+  gst_adapter_clear (demux->adapter);
+
+  demux->audio_need_discont = TRUE;
+  demux->video_need_discont = TRUE;
+
+  demux->flushing = FALSE;
+
+  /* Only in push mode */
+  if (!demux->random_access) {
+    /* After a flush we expect a tag_type */
+    demux->state = FLV_STATE_TAG_TYPE;
+    /* We reset the offset and will get one from first push */
+    demux->offset = 0;
+  }
+}
+
+static void
+gst_flv_demux_cleanup (GstFLVDemux * demux)
+{
+  GST_DEBUG_OBJECT (demux, "cleaning up FLV demuxer");
+
+  demux->state = FLV_STATE_HEADER;
+
+  demux->flushing = FALSE;
+  demux->need_header = TRUE;
+  demux->audio_need_segment = TRUE;
+  demux->video_need_segment = TRUE;
+  demux->audio_need_discont = TRUE;
+  demux->video_need_discont = TRUE;
+
+  /* By default we consider them as linked */
+  demux->audio_linked = TRUE;
+  demux->video_linked = TRUE;
+
+  demux->has_audio = FALSE;
+  demux->has_video = FALSE;
+  demux->push_tags = FALSE;
+  demux->got_par = FALSE;
+
+  demux->w = demux->h = 0;
+  demux->par_x = demux->par_y = 1;
+  demux->video_offset = 0;
+  demux->audio_offset = 0;
+  demux->offset = demux->cur_tag_offset = 0;
+  demux->tag_size = demux->tag_data_size = 0;
+  demux->duration = GST_CLOCK_TIME_NONE;
+
+  if (demux->new_seg_event) {
+    gst_event_unref (demux->new_seg_event);
+    demux->new_seg_event = NULL;
+  }
+
+  gst_adapter_clear (demux->adapter);
+
+  if (demux->audio_pad) {
+    gst_element_remove_pad (GST_ELEMENT (demux), demux->audio_pad);
+    gst_object_unref (demux->audio_pad);
+    demux->audio_pad = NULL;
+  }
+
+  if (demux->video_pad) {
+    gst_element_remove_pad (GST_ELEMENT (demux), demux->video_pad);
+    gst_object_unref (demux->video_pad);
+    demux->video_pad = NULL;
+  }
+
+  if (demux->times) {
+    g_array_free (demux->times, TRUE);
+    demux->times = NULL;
+  }
+
+  if (demux->filepositions) {
+    g_array_free (demux->filepositions, TRUE);
+    demux->filepositions = NULL;
+  }
+}
+
+static void
+gst_flv_demux_adapter_flush (GstFLVDemux * demux, guint64 bytes)
+{
+  demux->offset += bytes;
+
+  gst_adapter_flush (demux->adapter, bytes);
+}
+
+static GstFlowReturn
+gst_flv_demux_chain (GstPad * pad, GstBuffer * buffer)
+{
+  GstFlowReturn ret = GST_FLOW_OK;
+  GstFLVDemux *demux = NULL;
+
+  demux = GST_FLV_DEMUX (gst_pad_get_parent (pad));
+
+  GST_LOG_OBJECT (demux, "received buffer of %d bytes at offset %"
+      G_GUINT64_FORMAT, GST_BUFFER_SIZE (buffer), GST_BUFFER_OFFSET (buffer));
+
+  if (G_UNLIKELY (GST_BUFFER_OFFSET (buffer) == 0)) {
+    GST_DEBUG_OBJECT (demux, "beginning of file, expect header");
+    demux->state = FLV_STATE_HEADER;
+    demux->offset = 0;
+  }
+
+  if (G_UNLIKELY (demux->offset == 0)) {
+    GST_DEBUG_OBJECT (demux, "offset was zero, synchronizing with buffer's offset");
+    demux->offset = GST_BUFFER_OFFSET (buffer);
+  }
+
+  gst_adapter_push (demux->adapter, buffer);
+
+parse:
+  if (G_UNLIKELY (demux->flushing)) {
+    GST_DEBUG_OBJECT (demux, "we are now flushing, exiting parser loop");
+    ret = GST_FLOW_WRONG_STATE;
+    goto beach;
+  }
+  switch (demux->state) {
+    case FLV_STATE_HEADER:
+    {
+      if (gst_adapter_available (demux->adapter) >= FLV_HEADER_SIZE) {
+        const guint8 *data;
+
+        data = gst_adapter_peek (demux->adapter, FLV_HEADER_SIZE);
+
+        ret = gst_flv_parse_header (demux, data, FLV_HEADER_SIZE);
+
+        gst_flv_demux_adapter_flush (demux, FLV_HEADER_SIZE);
+
+        demux->state = FLV_STATE_TAG_TYPE;
+        goto parse;
+      } else {
+        goto beach;
+      }
+    }
+    case FLV_STATE_TAG_TYPE:
+    {
+      if (gst_adapter_available (demux->adapter) >= FLV_TAG_TYPE_SIZE) {
+        const guint8 *data;
+
+        /* Remember the tag offset in bytes */
+        demux->cur_tag_offset = demux->offset;
+
+        data = gst_adapter_peek (demux->adapter, FLV_TAG_TYPE_SIZE);
+
+        ret = gst_flv_parse_tag_type (demux, data, FLV_TAG_TYPE_SIZE);
+
+        gst_flv_demux_adapter_flush (demux, FLV_TAG_TYPE_SIZE);
+
+        goto parse;
+      } else {
+        goto beach;
+      }
+    }
+    case FLV_STATE_TAG_VIDEO:
+    {
+      if (gst_adapter_available (demux->adapter) >= demux->tag_size) {
+        const guint8 *data;
+
+        data = gst_adapter_peek (demux->adapter, demux->tag_size);
+
+        ret = gst_flv_parse_tag_video (demux, data, demux->tag_size);
+
+        gst_flv_demux_adapter_flush (demux, demux->tag_size);
+
+        demux->state = FLV_STATE_TAG_TYPE;
+        goto parse;
+      } else {
+        goto beach;
+      }
+    }
+    case FLV_STATE_TAG_AUDIO:
+    {
+      if (gst_adapter_available (demux->adapter) >= demux->tag_size) {
+        const guint8 *data;
+
+        data = gst_adapter_peek (demux->adapter, demux->tag_size);
+
+        ret = gst_flv_parse_tag_audio (demux, data, demux->tag_size);
+
+        gst_flv_demux_adapter_flush (demux, demux->tag_size);
+
+        demux->state = FLV_STATE_TAG_TYPE;
+        goto parse;
+      } else {
+        goto beach;
+      }
+    }
+    case FLV_STATE_TAG_SCRIPT:
+    {
+      if (gst_adapter_available (demux->adapter) >= demux->tag_size) {
+        const guint8 *data;
+
+        data = gst_adapter_peek (demux->adapter, demux->tag_size);
+
+        ret = gst_flv_parse_tag_script (demux, data, demux->tag_size);
+
+        gst_flv_demux_adapter_flush (demux, demux->tag_size);
+
+        demux->state = FLV_STATE_TAG_TYPE;
+        goto parse;
+      } else {
+        goto beach;
+      }
+    }
+    default:
+      GST_DEBUG_OBJECT (demux, "unexpected demuxer state");
+  }
+
+beach:
+  if (G_UNLIKELY (ret == GST_FLOW_NOT_LINKED)) {
+    /* If either audio or video is linked we return GST_FLOW_OK */
+    if (demux->audio_linked || demux->video_linked) {
+      ret = GST_FLOW_OK;
+    }
+  }
+
+  gst_object_unref (demux);
+
+  return ret;
+}
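[Editorial note: gst_flv_demux_chain above is a classic push-mode demuxer loop: incoming bytes accumulate in a GstAdapter, and the state machine only advances once a complete unit (header or tag) can be peeked, flushed, and parsed. Stripped of the GStreamer types, the pattern looks roughly like this sketch, with hypothetical names and only two states:]

```c
#include <assert.h>
#include <stddef.h>

/* Minimal stand-in for GstAdapter: a flat buffer plus a read position. */
typedef struct {
  const unsigned char *data;
  size_t len;
  size_t pos;
} adapter;

enum state { STATE_HEADER, STATE_TAG_TYPE };

/* Consume one parsing unit if enough bytes are buffered.
 * Returns the number of bytes consumed, or 0 when we must wait for
 * more data (the equivalent of falling through to "beach"). */
static size_t
demux_step (adapter * a, enum state *st)
{
  size_t need = (*st == STATE_HEADER) ? 13 : 4;

  if (a->len - a->pos < need)
    return 0;                   /* not enough data yet, wait for next chain */

  /* ... a real demuxer would parse a->data + a->pos here ... */
  a->pos += need;               /* the gst_adapter_flush equivalent */
  *st = STATE_TAG_TYPE;         /* after any unit we expect a tag prelude */
  return need;
}
```

[The goto-based loop in the real code replays demux_step until it reports "need more data" or an error.]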
+
+static GstFlowReturn
+gst_flv_demux_pull_tag (GstPad * pad, GstFLVDemux * demux)
+{
+  GstBuffer *buffer = NULL;
+  GstFlowReturn ret = GST_FLOW_OK;
+
+  /* Store tag offset */
+  demux->cur_tag_offset = demux->offset;
+
+  /* Get the first 4 bytes to identify tag type and size */
+  ret = gst_pad_pull_range (pad, demux->offset, FLV_TAG_TYPE_SIZE, &buffer);
+  if (G_UNLIKELY (ret != GST_FLOW_OK)) {
+    GST_WARNING_OBJECT (demux, "failed when pulling %d bytes",
+        FLV_TAG_TYPE_SIZE);
+    goto beach;
+  }
+
+  if (G_UNLIKELY (buffer && GST_BUFFER_SIZE (buffer) != FLV_TAG_TYPE_SIZE)) {
+    GST_WARNING_OBJECT (demux, "partial pull got %d when expecting %d",
+        GST_BUFFER_SIZE (buffer), FLV_TAG_TYPE_SIZE);
+    gst_buffer_unref (buffer);
+    ret = GST_FLOW_UNEXPECTED;
+    goto beach;
+  }
+
+  /* Identify tag type */
+  ret = gst_flv_parse_tag_type (demux, GST_BUFFER_DATA (buffer),
+      GST_BUFFER_SIZE (buffer));
+
+  gst_buffer_unref (buffer);
+
+  /* Jump over tag type + size */
+  demux->offset += FLV_TAG_TYPE_SIZE;
+
+  /* Pull the whole tag */
+  ret = gst_pad_pull_range (pad, demux->offset, demux->tag_size, &buffer);
+  if (G_UNLIKELY (ret != GST_FLOW_OK)) {
+    GST_WARNING_OBJECT (demux,
+        "failed when pulling %" G_GUINT64_FORMAT " bytes", demux->tag_size);
+    goto beach;
+  }
+
+  if (G_UNLIKELY (buffer && GST_BUFFER_SIZE (buffer) != demux->tag_size)) {
+    GST_WARNING_OBJECT (demux,
+        "partial pull got %d when expecting %" G_GUINT64_FORMAT,
+        GST_BUFFER_SIZE (buffer), demux->tag_size);
+    gst_buffer_unref (buffer);
+    ret = GST_FLOW_UNEXPECTED;
+    goto beach;
+  }
+
+  switch (demux->state) {
+    case FLV_STATE_TAG_VIDEO:
+      ret = gst_flv_parse_tag_video (demux, GST_BUFFER_DATA (buffer),
+          GST_BUFFER_SIZE (buffer));
+      break;
+    case FLV_STATE_TAG_AUDIO:
+      ret = gst_flv_parse_tag_audio (demux, GST_BUFFER_DATA (buffer),
+          GST_BUFFER_SIZE (buffer));
+      break;
+    case FLV_STATE_TAG_SCRIPT:
+      ret = gst_flv_parse_tag_script (demux, GST_BUFFER_DATA (buffer),
+          GST_BUFFER_SIZE (buffer));
+      break;
+    default:
+      GST_WARNING_OBJECT (demux, "unexpected state %d", demux->state);
+  }
+
+  gst_buffer_unref (buffer);
+
+  /* Jump over that part we've just parsed */
+  demux->offset += demux->tag_size;
+
+  /* Make sure we reinitialize the tag size */
+  demux->tag_size = 0;
+
+  /* Ready for the next tag */
+  demux->state = FLV_STATE_TAG_TYPE;
+
+  if (G_UNLIKELY (ret == GST_FLOW_NOT_LINKED)) {
+    /* If either audio or video is linked we return GST_FLOW_OK */
+    if (demux->audio_linked || demux->video_linked) {
+      ret = GST_FLOW_OK;
+    } else {
+      GST_WARNING_OBJECT (demux, "parsing this tag returned not-linked and "
+          "neither video nor audio are linked");
+    }
+  }
+
+beach:
+  return ret;
+}
+
+static GstFlowReturn
+gst_flv_demux_pull_header (GstPad * pad, GstFLVDemux * demux)
+{
+  GstBuffer *buffer = NULL;
+  GstFlowReturn ret = GST_FLOW_OK;
+
+  /* Get the first FLV_HEADER_SIZE bytes */
+  ret = gst_pad_pull_range (pad, demux->offset, FLV_HEADER_SIZE, &buffer);
+  if (G_UNLIKELY (ret != GST_FLOW_OK)) {
+    GST_WARNING_OBJECT (demux, "failed when pulling %d bytes", FLV_HEADER_SIZE);
+    goto beach;
+  }
+
+  if (G_UNLIKELY (buffer && GST_BUFFER_SIZE (buffer) != FLV_HEADER_SIZE)) {
+    GST_WARNING_OBJECT (demux, "partial pull got %d when expecting %d",
+        GST_BUFFER_SIZE (buffer), FLV_HEADER_SIZE);
+    gst_buffer_unref (buffer);
+    ret = GST_FLOW_UNEXPECTED;
+    goto beach;
+  }
+
+  ret = gst_flv_parse_header (demux, GST_BUFFER_DATA (buffer),
+      GST_BUFFER_SIZE (buffer));
+
+  /* Jump over the header now */
+  demux->offset += FLV_HEADER_SIZE;
+  demux->state = FLV_STATE_TAG_TYPE;
+
+beach:
+  return ret;
+}
+
+static GstFlowReturn
+gst_flv_demux_seek_to_prev_keyframe (GstFLVDemux * demux)
+{
+  return GST_FLOW_OK;
+}
+
+static void
+gst_flv_demux_loop (GstPad * pad)
+{
+  GstFLVDemux *demux = NULL;
+  GstFlowReturn ret = GST_FLOW_OK;
+
+  demux = GST_FLV_DEMUX (gst_pad_get_parent (pad));
+
+  if (demux->segment->rate >= 0) {
+    /* pull in data */
+    switch (demux->state) {
+      case FLV_STATE_TAG_TYPE:
+        ret = gst_flv_demux_pull_tag (pad, demux);
+        break;
+      case FLV_STATE_DONE:
+        ret = GST_FLOW_UNEXPECTED;
+        break;
+      default:
+        ret = gst_flv_demux_pull_header (pad, demux);
+    }
+
+    /* pause if something went wrong */
+    if (G_UNLIKELY (ret != GST_FLOW_OK))
+      goto pause;
+
+    /* check EOS condition */
+    if ((demux->segment->flags & GST_SEEK_FLAG_SEGMENT) &&
+        (demux->segment->stop != -1) &&
+        (demux->segment->last_stop >= demux->segment->stop)) {
+      ret = GST_FLOW_UNEXPECTED;
+      goto pause;
+    }
+  } else {                      /* Reverse playback */
+    /* pull in data */
+    switch (demux->state) {
+      case FLV_STATE_TAG_TYPE:
+        ret = gst_flv_demux_pull_tag (pad, demux);
+        /* When packet parsing returns UNEXPECTED that means we've reached the
+           point where we want to go to the previous keyframe. This is either
+           the last FLV tag or the keyframe we used last time */
+        if (ret == GST_FLOW_UNEXPECTED) {
+          ret = gst_flv_demux_seek_to_prev_keyframe (demux);
+          demux->state = FLV_STATE_TAG_TYPE;
+        }
+        break;
+      default:
+        ret = gst_flv_demux_pull_header (pad, demux);
+    }
+
+    /* pause if something went wrong */
+    if (G_UNLIKELY (ret != GST_FLOW_OK))
+      goto pause;
+
+    /* check EOS condition */
+    if (demux->segment->last_stop <= demux->segment->start) {
+      ret = GST_FLOW_UNEXPECTED;
+      goto pause;
+    }
+  }
+
+  gst_object_unref (demux);
+
+  return;
+
+pause:
+  {
+    const gchar *reason = gst_flow_get_name (ret);
+
+    GST_LOG_OBJECT (demux, "pausing task, reason %s", reason);
+    gst_pad_pause_task (pad);
+
+    if (GST_FLOW_IS_FATAL (ret) || ret == GST_FLOW_NOT_LINKED) {
+      if (ret == GST_FLOW_UNEXPECTED) {
+        /* perform EOS logic */
+        gst_element_no_more_pads (GST_ELEMENT_CAST (demux));
+        if (demux->segment->flags & GST_SEEK_FLAG_SEGMENT) {
+          gint64 stop;
+
+          /* for segment playback we need to post when (in stream time)
+           * we stopped, this is either stop (when set) or the duration. */
+          if ((stop = demux->segment->stop) == -1)
+            stop = demux->segment->duration;
+
+          if (demux->segment->rate >= 0) {
+            GST_LOG_OBJECT (demux, "Sending segment done, at end of segment");
+            gst_element_post_message (GST_ELEMENT_CAST (demux),
+                gst_message_new_segment_done (GST_OBJECT_CAST (demux),
+                    GST_FORMAT_TIME, stop));
+          } else {              /* Reverse playback */
+            GST_LOG_OBJECT (demux, "Sending segment done, at beginning of "
+                "segment");
+            gst_element_post_message (GST_ELEMENT_CAST (demux),
+                gst_message_new_segment_done (GST_OBJECT_CAST (demux),
+                    GST_FORMAT_TIME, demux->segment->start));
+          }
+        } else {
+          /* normal playback, send EOS to all linked pads */
+          gst_element_no_more_pads (GST_ELEMENT (demux));
+          GST_LOG_OBJECT (demux, "Sending EOS, at end of stream");
+          if (!gst_pad_event_default (demux->sinkpad, gst_event_new_eos ())) {
+            GST_WARNING_OBJECT (demux, "failed pushing EOS on streams");
+            GST_ELEMENT_ERROR (demux, STREAM, FAILED,
+                ("Internal data stream error."),
+                ("Can't push EOS downstream (empty/invalid file "
+                    "with no streams/tags ?)"));
+          }
+        }
+      } else {
+        GST_ELEMENT_ERROR (demux, STREAM, FAILED,
+            ("Internal data stream error."),
+            ("stream stopped, reason %s", reason));
+        gst_pad_event_default (demux->sinkpad, gst_event_new_eos ());
+      }
+    }
+    gst_object_unref (demux);
+    return;
+  }
+}
+
+static guint64
+gst_flv_demux_find_offset (GstFLVDemux * demux, GstSegment * segment)
+{
+  gint64 bytes = 0;
+  gint64 time = 0;
+  GstIndexEntry *entry;
+
+  g_return_val_if_fail (segment != NULL, 0);
+
+  time = segment->start;
+
+  if (demux->index) {
+    /* Let's check if we have an index entry for that seek time */
+    entry = gst_index_get_assoc_entry (demux->index, demux->index_id,
+        GST_INDEX_LOOKUP_BEFORE, GST_ASSOCIATION_FLAG_KEY_UNIT, GST_FORMAT_TIME,
+        time);
+
+    if (entry) {
+      gst_index_entry_assoc_map (entry, GST_FORMAT_BYTES, &bytes);
+      gst_index_entry_assoc_map (entry, GST_FORMAT_TIME, &time);
+
+      GST_DEBUG_OBJECT (demux, "found index entry for %" GST_TIME_FORMAT
+          " at %" GST_TIME_FORMAT ", seeking to %" G_GINT64_FORMAT,
+          GST_TIME_ARGS (segment->start), GST_TIME_ARGS (time), bytes);
+
+      /* Key frame seeking */
+      if (segment->flags & GST_SEEK_FLAG_KEY_UNIT) {
+        /* Adjust the segment so that the keyframe fits in */
+        if (time < segment->start) {
+          segment->start = segment->time = time;
+        }
+        segment->last_stop = time;
+      }
+    } else {
+      GST_DEBUG_OBJECT (demux, "no index entry found for %" GST_TIME_FORMAT,
+          GST_TIME_ARGS (segment->start));
+    }
+  }
+
+  return bytes;
+}
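[Editorial note: gst_flv_demux_find_offset above converts a seek time into a byte offset by asking the index for the last keyframe at or before the target time (GST_INDEX_LOOKUP_BEFORE). The lookup semantics can be sketched without GStreamer as follows, with hypothetical names:]

```c
#include <assert.h>
#include <stddef.h>

typedef struct {
  long long time;               /* keyframe timestamp */
  long long bytes;              /* byte offset of that keyframe */
} index_entry;

/* GST_INDEX_LOOKUP_BEFORE semantics: return the last entry whose time
 * is <= target, or NULL when the target precedes every entry.
 * Entries must be sorted by time. */
static const index_entry *
lookup_before (const index_entry * e, size_t n, long long target)
{
  const index_entry *best = NULL;
  size_t i;

  for (i = 0; i < n; i++) {
    if (e[i].time <= target)
      best = &e[i];
    else
      break;
  }
  return best;
}
```

[Seeking to a keyframe at or before the requested time is what lets the KEY_UNIT branch above move segment->start backwards so the keyframe still fits in the segment.]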
+
+static gboolean
+gst_flv_demux_handle_seek_push (GstFLVDemux * demux, GstEvent * event)
+{
+  GstFormat format;
+  GstSeekFlags flags;
+  GstSeekType start_type, stop_type;
+  gint64 start, stop;
+  gdouble rate;
+  gboolean update, flush, keyframe, ret;
+  GstSegment seeksegment;
+
+  gst_event_parse_seek (event, &rate, &format, &flags,
+      &start_type, &start, &stop_type, &stop);
+
+  if (format != GST_FORMAT_TIME)
+    goto wrong_format;
+
+  flush = flags & GST_SEEK_FLAG_FLUSH;
+  keyframe = flags & GST_SEEK_FLAG_KEY_UNIT;
+
+  /* Work on a copy until we are sure the seek succeeded. */
+  memcpy (&seeksegment, demux->segment, sizeof (GstSegment));
+
+  GST_DEBUG_OBJECT (demux, "segment before configure %" GST_SEGMENT_FORMAT,
+      demux->segment);
+
+  /* Apply the seek to our segment */
+  gst_segment_set_seek (&seeksegment, rate, format, flags,
+      start_type, start, stop_type, stop, &update);
+
+  GST_DEBUG_OBJECT (demux, "segment configured %" GST_SEGMENT_FORMAT,
+      &seeksegment);
+
+  if (flush || seeksegment.last_stop != demux->segment->last_stop) {
+    /* Do the actual seeking */
+    guint64 offset = gst_flv_demux_find_offset (demux, &seeksegment);
+
+    GST_DEBUG_OBJECT (demux, "generating an upstream seek at position %"
+        G_GUINT64_FORMAT, offset);
+    ret = gst_pad_push_event (demux->sinkpad,
+        gst_event_new_seek (seeksegment.rate, GST_FORMAT_BYTES,
+            GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE, GST_SEEK_TYPE_SET,
+            offset, GST_SEEK_TYPE_NONE, 0));
+    if (G_UNLIKELY (!ret)) {
+      GST_WARNING_OBJECT (demux, "upstream seek failed");
+    }
+  } else {
+    ret = TRUE;
+  }
+
+  if (ret) {
+    /* Ok seek succeeded, take the newly configured segment */
+    memcpy (demux->segment, &seeksegment, sizeof (GstSegment));
+
+    /* Notify about the start of a new segment */
+    if (demux->segment->flags & GST_SEEK_FLAG_SEGMENT) {
+      gst_element_post_message (GST_ELEMENT (demux),
+          gst_message_new_segment_start (GST_OBJECT (demux),
+              demux->segment->format, demux->segment->last_stop));
+    }
+
+    /* Tell all the streams a new segment is needed */
+    {
+      demux->audio_need_segment = TRUE;
+      demux->video_need_segment = TRUE;
+      /* Clean any potential newsegment event kept for the streams. The first
+       * stream needing a new segment will create a new one. */
+      if (G_UNLIKELY (demux->new_seg_event)) {
+        gst_event_unref (demux->new_seg_event);
+        demux->new_seg_event = NULL;
+      }
+    }
+  }
+
+  return ret;
+
+/* ERRORS */
+wrong_format:
+  {
+    GST_WARNING_OBJECT (demux, "we only support seeking in TIME format");
+    return FALSE;
+  }
+}
+
+static gboolean
+gst_flv_demux_handle_seek_pull (GstFLVDemux * demux, GstEvent * event)
+{
+  GstFormat format;
+  GstSeekFlags flags;
+  GstSeekType start_type, stop_type;
+  gint64 start, stop;
+  gdouble rate;
+  gboolean update, flush, keyframe, ret;
+  GstSegment seeksegment;
+
+  gst_event_parse_seek (event, &rate, &format, &flags,
+      &start_type, &start, &stop_type, &stop);
+
+  if (format != GST_FORMAT_TIME)
+    goto wrong_format;
+
+  flush = flags & GST_SEEK_FLAG_FLUSH;
+  keyframe = flags & GST_SEEK_FLAG_KEY_UNIT;
+
+  if (flush) {
+    /* Flush start up and downstream to make sure data flow and loops are
+       idle */
+    gst_pad_event_default (demux->sinkpad, gst_event_new_flush_start ());
+    gst_pad_push_event (demux->sinkpad, gst_event_new_flush_start ());
+  } else {
+    /* Pause the pulling task */
+    gst_pad_pause_task (demux->sinkpad);
+  }
+
+  /* Take the stream lock */
+  GST_PAD_STREAM_LOCK (demux->sinkpad);
+
+  if (flush) {
+    /* Stop flushing upstream, we need to pull */
+    gst_pad_push_event (demux->sinkpad, gst_event_new_flush_stop ());
+  }
+
+  /* Work on a copy until we are sure the seek succeeded. */
+  memcpy (&seeksegment, demux->segment, sizeof (GstSegment));
+
+  GST_DEBUG_OBJECT (demux, "segment before configure %" GST_SEGMENT_FORMAT,
+      demux->segment);
+
+  /* Apply the seek to our segment */
+  gst_segment_set_seek (&seeksegment, rate, format, flags,
+      start_type, start, stop_type, stop, &update);
+
+  GST_DEBUG_OBJECT (demux, "segment configured %" GST_SEGMENT_FORMAT,
+      &seeksegment);
+
+  if (flush || seeksegment.last_stop != demux->segment->last_stop) {
+    /* Do the actual seeking */
+    demux->offset = gst_flv_demux_find_offset (demux, &seeksegment);
+
+    /* If we seeked at the beginning of the file parse the header again */
+    if (G_UNLIKELY (!demux->offset)) {
+      demux->state = FLV_STATE_HEADER;
+    } else {                    /* or parse a tag */
+      demux->state = FLV_STATE_TAG_TYPE;
+    }
+    ret = TRUE;
+  } else {
+    ret = TRUE;
+  }
+
+  if (flush) {
+    /* Stop flushing, the sinks are at time 0 now */
+    gst_pad_event_default (demux->sinkpad, gst_event_new_flush_stop ());
+  } else {
+    GST_DEBUG_OBJECT (demux, "closing running segment %" GST_SEGMENT_FORMAT,
+        demux->segment);
+
+    /* Close the current segment for a linear playback, FIXME, queue for
+     * streaming thread. */
+    if (demux->segment->rate >= 0) {
+      /* for forward playback, we played from start to last_stop */
+      gst_pad_event_default (demux->sinkpad, gst_event_new_new_segment (TRUE,
+              demux->segment->rate, demux->segment->format,
+              demux->segment->start, demux->segment->last_stop,
+              demux->segment->time));
+    } else {
+      gint64 stop;
+
+      if ((stop = demux->segment->stop) == -1)
+        stop = demux->segment->duration;
+
+      /* for reverse playback, we played from stop to last_stop. */
+      gst_pad_event_default (demux->sinkpad, gst_event_new_new_segment (TRUE,
+              demux->segment->rate, demux->segment->format,
+              demux->segment->last_stop, stop, demux->segment->last_stop));
+    }
+  }
+
+  if (ret) {
+    /* Ok seek succeeded, take the newly configured segment */
+    memcpy (demux->segment, &seeksegment, sizeof (GstSegment));
+
+    /* Notify about the start of a new segment */
+    if (demux->segment->flags & GST_SEEK_FLAG_SEGMENT) {
+      gst_element_post_message (GST_ELEMENT (demux),
+          gst_message_new_segment_start (GST_OBJECT (demux),
+              demux->segment->format, demux->segment->last_stop));
+    }
+
+    /* Tell all the streams a new segment is needed */
+    {
+      demux->audio_need_segment = TRUE;
+      demux->video_need_segment = TRUE;
+      /* Clean any potential newsegment event kept for the streams. The first
+       * stream needing a new segment will create a new one. */
+      if (G_UNLIKELY (demux->new_seg_event)) {
+        gst_event_unref (demux->new_seg_event);
+        demux->new_seg_event = NULL;
+      }
+    }
+  }
+
+  gst_pad_start_task (demux->sinkpad,
+      (GstTaskFunction) gst_flv_demux_loop, demux->sinkpad);
+
+  GST_PAD_STREAM_UNLOCK (demux->sinkpad);
+
+  return ret;
+
+  /* ERRORS */
+wrong_format:
+  {
+    GST_WARNING_OBJECT (demux, "we only support seeking in TIME format");
+    return FALSE;
+  }
+}
+
+/* If we can pull, that's preferred */
+static gboolean
+gst_flv_demux_sink_activate (GstPad * sinkpad)
+{
+  if (gst_pad_check_pull_range (sinkpad)) {
+    return gst_pad_activate_pull (sinkpad, TRUE);
+  } else {
+    return gst_pad_activate_push (sinkpad, TRUE);
+  }
+}
+
+/* This function gets called when we activate ourselves in push mode.
+ * We cannot seek (ourselves) in the stream */
+static gboolean
+gst_flv_demux_sink_activate_push (GstPad * sinkpad, gboolean active)
+{
+  GstFLVDemux *demux;
+
+  demux = GST_FLV_DEMUX (gst_pad_get_parent (sinkpad));
+
+  demux->random_access = FALSE;
+
+  gst_object_unref (demux);
+
+  return TRUE;
+}
+
+/* This function gets called when we activate ourselves in pull mode.
+ * We can perform random access to the resource and we start a task
+ * to start reading */
+static gboolean
+gst_flv_demux_sink_activate_pull (GstPad * sinkpad, gboolean active)
+{
+  GstFLVDemux *demux;
+
+  demux = GST_FLV_DEMUX (gst_pad_get_parent (sinkpad));
+
+  if (active) {
+    demux->random_access = TRUE;
+    gst_object_unref (demux);
+    return gst_pad_start_task (sinkpad, (GstTaskFunction) gst_flv_demux_loop,
+        sinkpad);
+  } else {
+    demux->random_access = FALSE;
+    gst_object_unref (demux);
+    return gst_pad_stop_task (sinkpad);
+  }
+}
+
+static gboolean
+gst_flv_demux_sink_event (GstPad * pad, GstEvent * event)
+{
+  GstFLVDemux *demux;
+  gboolean ret = FALSE;
+
+  demux = GST_FLV_DEMUX (gst_pad_get_parent (pad));
+
+  GST_DEBUG_OBJECT (demux, "handling event %s", GST_EVENT_TYPE_NAME (event));
+
+  switch (GST_EVENT_TYPE (event)) {
+    case GST_EVENT_FLUSH_START:
+      GST_DEBUG_OBJECT (demux, "trying to force chain function to exit");
+      demux->flushing = TRUE;
+      ret = gst_pad_event_default (demux->sinkpad, event);
+      break;
+    case GST_EVENT_FLUSH_STOP:
+      GST_DEBUG_OBJECT (demux, "flushing FLV demuxer");
+      gst_flv_demux_flush (demux, TRUE);
+      ret = gst_pad_event_default (demux->sinkpad, event);
+      break;
+    case GST_EVENT_EOS:
+      GST_DEBUG_OBJECT (demux, "received EOS");
+      if (demux->index) {
+        GST_DEBUG_OBJECT (demux, "committing index");
+        gst_index_commit (demux->index, demux->index_id);
+      }
+      gst_element_no_more_pads (GST_ELEMENT (demux));
+      if (!gst_pad_event_default (demux->sinkpad, event)) {
+        GST_WARNING_OBJECT (demux, "failed pushing EOS on streams");
+        GST_ELEMENT_ERROR (demux, STREAM, FAILED,
+            ("Internal data stream error."),
+            ("Can't push EOS downstream (empty/invalid file "
+                "with no streams/tags ?)"));
+      }
+      ret = TRUE;
+      break;
+    case GST_EVENT_NEWSEGMENT:
+    {
+      GstFormat format;
+      gdouble rate;
+      gint64 start, stop, time;
+      gboolean update;
+
+      GST_DEBUG_OBJECT (demux, "received new segment");
+
+      gst_event_parse_new_segment (event, &update, &rate, &format, &start,
+          &stop, &time);
+
+      if (format == GST_FORMAT_TIME) {
+        /* time segment, this is perfect, copy over the values. */
+        gst_segment_set_newsegment (demux->segment, update, rate, format, start,
+            stop, time);
+
+        GST_DEBUG_OBJECT (demux, "NEWSEGMENT: %" GST_SEGMENT_FORMAT,
+            demux->segment);
+
+        /* and forward */
+        ret = gst_pad_event_default (demux->sinkpad, event);
+      } else {
+        /* non-time format */
+        demux->audio_need_segment = TRUE;
+        demux->video_need_segment = TRUE;
+        ret = TRUE;
+        gst_event_unref (event);
+      }
+      break;
+    }
+    default:
+      ret = gst_pad_event_default (demux->sinkpad, event);
+      break;
+  }
+
+  gst_object_unref (demux);
+
+  return ret;
+}
+
+gboolean
+gst_flv_demux_src_event (GstPad * pad, GstEvent * event)
+{
+  GstFLVDemux *demux;
+  gboolean ret = FALSE;
+
+  demux = GST_FLV_DEMUX (gst_pad_get_parent (pad));
+
+  GST_DEBUG_OBJECT (demux, "handling event %s", GST_EVENT_TYPE_NAME (event));
+
+  switch (GST_EVENT_TYPE (event)) {
+    case GST_EVENT_SEEK:
+      if (demux->random_access) {
+        ret = gst_flv_demux_handle_seek_pull (demux, event);
+      } else {
+        ret = gst_flv_demux_handle_seek_push (demux, event);
+      }
+      break;
+    default:
+      ret = gst_pad_push_event (demux->sinkpad, event);
+      break;
+  }
+
+  gst_object_unref (demux);
+
+  return ret;
+}
+
+gboolean
+gst_flv_demux_query (GstPad * pad, GstQuery * query)
+{
+  gboolean res = TRUE;
+  GstFLVDemux *demux;
+
+  demux = GST_FLV_DEMUX (gst_pad_get_parent (pad));
+
+  switch (GST_QUERY_TYPE (query)) {
+    case GST_QUERY_DURATION:
+    {
+      GstFormat format;
+
+      gst_query_parse_duration (query, &format, NULL);
+
+      /* duration is time only */
+      if (format != GST_FORMAT_TIME) {
+        GST_DEBUG_OBJECT (demux, "duration query only supported for time "
+            "format");
+        res = FALSE;
+        goto beach;
+      }
+
+      GST_DEBUG_OBJECT (pad, "duration query, replying %" GST_TIME_FORMAT,
+          GST_TIME_ARGS (demux->duration));
+
+      gst_query_set_duration (query, GST_FORMAT_TIME, demux->duration);
+
+      break;
+    }
+    case GST_QUERY_LATENCY:
+    {
+      GstPad *peer;
+
+      if ((peer = gst_pad_get_peer (demux->sinkpad))) {
+        /* query latency on peer pad */
+        res = gst_pad_query (peer, query);
+        gst_object_unref (peer);
+      } else {
+        /* no peer, we don't know */
+        res = FALSE;
+      }
+      break;
+    }
+    default:
+      res = FALSE;
+      break;
+  }
+
+beach:
+  gst_object_unref (demux);
+
+  return res;
+}
+
+static GstStateChangeReturn
+gst_flv_demux_change_state (GstElement * element, GstStateChange transition)
+{
+  GstFLVDemux *demux;
+  GstStateChangeReturn ret;
+
+  demux = GST_FLV_DEMUX (element);
+
+  switch (transition) {
+    case GST_STATE_CHANGE_READY_TO_PAUSED:
+      /* If no index was created, generate one */
+      if (G_UNLIKELY (!demux->index)) {
+        GST_DEBUG_OBJECT (demux, "no index provided, creating our own");
+
+        demux->index = gst_index_factory_make ("memindex");
+
+        gst_index_get_writer_id (demux->index, GST_OBJECT (demux),
+            &demux->index_id);
+      }
+      gst_flv_demux_cleanup (demux);
+      break;
+    default:
+      break;
+  }
+
+  ret = GST_ELEMENT_CLASS (parent_class)->change_state (element, transition);
+  if (ret == GST_STATE_CHANGE_FAILURE)
+    return ret;
+
+  switch (transition) {
+    case GST_STATE_CHANGE_PAUSED_TO_READY:
+      gst_flv_demux_cleanup (demux);
+      break;
+    default:
+      break;
+  }
+
+  return ret;
+}
+
+static void
+gst_flv_demux_set_index (GstElement * element, GstIndex * index)
+{
+  GstFLVDemux *demux = GST_FLV_DEMUX (element);
+
+  GST_OBJECT_LOCK (demux);
+  if (demux->index)
+    gst_object_unref (demux->index);
+  demux->index = gst_object_ref (index);
+  GST_OBJECT_UNLOCK (demux);
+
+  gst_index_get_writer_id (index, GST_OBJECT (element), &demux->index_id);
+}
+
+static GstIndex *
+gst_flv_demux_get_index (GstElement * element)
+{
+  GstIndex *result = NULL;
+
+  GstFLVDemux *demux = GST_FLV_DEMUX (element);
+
+  GST_OBJECT_LOCK (demux);
+  if (demux->index)
+    result = gst_object_ref (demux->index);
+  GST_OBJECT_UNLOCK (demux);
+
+  return result;
+}
+
+static void
+gst_flv_demux_dispose (GObject * object)
+{
+  GstFLVDemux *demux = GST_FLV_DEMUX (object);
+
+  GST_DEBUG_OBJECT (demux, "disposing FLV demuxer");
+
+  if (demux->adapter) {
+    gst_adapter_clear (demux->adapter);
+    g_object_unref (demux->adapter);
+    demux->adapter = NULL;
+  }
+
+  if (demux->segment) {
+    gst_segment_free (demux->segment);
+    demux->segment = NULL;
+  }
+
+  if (demux->taglist) {
+    gst_tag_list_free (demux->taglist);
+    demux->taglist = NULL;
+  }
+
+  if (demux->new_seg_event) {
+    gst_event_unref (demux->new_seg_event);
+    demux->new_seg_event = NULL;
+  }
+
+  if (demux->audio_pad) {
+    gst_object_unref (demux->audio_pad);
+    demux->audio_pad = NULL;
+  }
+
+  if (demux->video_pad) {
+    gst_object_unref (demux->video_pad);
+    demux->video_pad = NULL;
+  }
+
+  if (demux->index) {
+    gst_object_unref (demux->index);
+    demux->index = NULL;
+  }
+
+  if (demux->times) {
+    g_array_free (demux->times, TRUE);
+    demux->times = NULL;
+  }
+
+  if (demux->filepositions) {
+    g_array_free (demux->filepositions, TRUE);
+    demux->filepositions = NULL;
+  }
+
+  GST_CALL_PARENT (G_OBJECT_CLASS, dispose, (object));
+}
+
+static void
+gst_flv_demux_base_init (gpointer g_class)
+{
+  GstElementClass *element_class = GST_ELEMENT_CLASS (g_class);
+
+  gst_element_class_add_pad_template (element_class,
+      gst_static_pad_template_get (&flv_sink_template));
+  gst_element_class_add_pad_template (element_class,
+      gst_static_pad_template_get (&audio_src_template));
+  gst_element_class_add_pad_template (element_class,
+      gst_static_pad_template_get (&video_src_template));
+  gst_element_class_set_details (element_class, &flv_demux_details);
+}
+
+static void
+gst_flv_demux_class_init (GstFLVDemuxClass * klass)
+{
+  GstElementClass *gstelement_class = GST_ELEMENT_CLASS (klass);
+  GObjectClass *gobject_class = G_OBJECT_CLASS (klass);
+
+  gobject_class->dispose = GST_DEBUG_FUNCPTR (gst_flv_demux_dispose);
+
+  gstelement_class->change_state =
+      GST_DEBUG_FUNCPTR (gst_flv_demux_change_state);
+  gstelement_class->set_index = GST_DEBUG_FUNCPTR (gst_flv_demux_set_index);
+  gstelement_class->get_index = GST_DEBUG_FUNCPTR (gst_flv_demux_get_index);
+}
+
+static void
+gst_flv_demux_init (GstFLVDemux * demux, GstFLVDemuxClass * g_class)
+{
+  demux->sinkpad =
+      gst_pad_new_from_static_template (&flv_sink_template, "sink");
+
+  gst_pad_set_event_function (demux->sinkpad,
+      GST_DEBUG_FUNCPTR (gst_flv_demux_sink_event));
+  gst_pad_set_chain_function (demux->sinkpad,
+      GST_DEBUG_FUNCPTR (gst_flv_demux_chain));
+  gst_pad_set_activate_function (demux->sinkpad,
+      GST_DEBUG_FUNCPTR (gst_flv_demux_sink_activate));
+  gst_pad_set_activatepull_function (demux->sinkpad,
+      GST_DEBUG_FUNCPTR (gst_flv_demux_sink_activate_pull));
+  gst_pad_set_activatepush_function (demux->sinkpad,
+      GST_DEBUG_FUNCPTR (gst_flv_demux_sink_activate_push));
+
+  gst_element_add_pad (GST_ELEMENT (demux), demux->sinkpad);
+
+  demux->adapter = gst_adapter_new ();
+  demux->segment = gst_segment_new ();
+  demux->taglist = gst_tag_list_new ();
+  gst_segment_init (demux->segment, GST_FORMAT_TIME);
+
+  gst_flv_demux_cleanup (demux);
+}
+
+static gboolean
+plugin_init (GstPlugin * plugin)
+{
+  GST_DEBUG_CATEGORY_INIT (flvdemux_debug, "flvdemux", 0, "FLV demuxer");
+
+  if (!gst_element_register (plugin, "flvdemux", GST_RANK_PRIMARY,
+          gst_flv_demux_get_type ()))
+    return FALSE;
+
+  return TRUE;
+}
+
+GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR,
+    "flvdemux", "Element demuxing FLV stream",
+    plugin_init, VERSION, "LGPL", "Gnash's internal copy of flvdemux", "Gnash")
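
The dispose handler at the top of this hunk releases each resource and then NULLs the pointer, because GObject may invoke dispose more than once on the same instance. A minimal plain-C sketch of that unref-and-NULL pattern (stand-alone, not part of the diff; names are illustrative):

```c
#include <assert.h>
#include <stdlib.h>

/* Illustrative stand-in for an object holding a disposable resource. */
typedef struct {
  int *buf;
} Demo;

/* Mirror of the pattern in gst_flv_demux_dispose: release the resource,
 * then clear the pointer so a second dispose call is a harmless no-op. */
void demo_dispose(Demo *d)
{
  if (d->buf) {
    free(d->buf);
    d->buf = NULL;
  }
}
```

The same idiom appears for `audio_pad`, `video_pad`, `index`, `times`, and `filepositions` above, with `gst_object_unref` or `g_array_free` in place of `free`.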

Index: libmedia/gst/gstflvdemux.h
===================================================================
RCS file: libmedia/gst/gstflvdemux.h
diff -N libmedia/gst/gstflvdemux.h
--- /dev/null   1 Jan 1970 00:00:00 -0000
+++ libmedia/gst/gstflvdemux.h  21 Jan 2008 07:07:27 -0000      1.1
@@ -0,0 +1,124 @@
+/* GStreamer
+ * Copyright (C) <2007> Julien Moutte <address@hidden>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
+ * Boston, MA 02111-1307, USA.
+ */
+
+#ifndef __FLV_DEMUX_H__
+#define __FLV_DEMUX_H__
+
+#include <gst/gst.h>
+#include <gst/base/gstadapter.h>
+
+G_BEGIN_DECLS
+#define GST_TYPE_FLV_DEMUX \
+  (gst_flv_demux_get_type())
+#define GST_FLV_DEMUX(obj) \
+  (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_FLV_DEMUX,GstFLVDemux))
+#define GST_FLV_DEMUX_CLASS(klass) \
+  (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_FLV_DEMUX,GstFLVDemuxClass))
+#define GST_IS_FLV_DEMUX(obj) \
+  (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_FLV_DEMUX))
+#define GST_IS_FLV_DEMUX_CLASS(klass) \
+  (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_FLV_DEMUX))
+typedef struct _GstFLVDemux GstFLVDemux;
+typedef struct _GstFLVDemuxClass GstFLVDemuxClass;
+
+typedef enum
+{
+  FLV_STATE_HEADER,
+  FLV_STATE_TAG_TYPE,
+  FLV_STATE_TAG_VIDEO,
+  FLV_STATE_TAG_AUDIO,
+  FLV_STATE_TAG_SCRIPT,
+  FLV_STATE_DONE,
+  FLV_STATE_NONE
+} GstFLVDemuxState;
+
+struct _GstFLVDemux
+{
+  GstElement element;
+
+  GstPad *sinkpad;
+
+  GstPad *audio_pad;
+  GstPad *video_pad;
+  
+  GstIndex *index;
+  gint index_id;
+  
+  GArray * times;
+  GArray * filepositions;
+
+  GstAdapter *adapter;
+
+  GstSegment *segment;
+
+  GstEvent *new_seg_event;
+
+  GstTagList *taglist;
+
+  GstFLVDemuxState state;
+
+  guint64 offset;
+  guint64 cur_tag_offset;
+  GstClockTime duration;
+  guint64 tag_size;
+  guint64 tag_data_size;
+
+  /* Audio infos */
+  guint16 rate;
+  guint16 channels;
+  guint16 width;
+  guint16 audio_codec_tag;
+  guint64 audio_offset;
+  gboolean audio_need_discont;
+  gboolean audio_need_segment;
+  gboolean audio_linked;
+
+  /* Video infos */
+  guint32 w;
+  guint32 h;
+  guint32 par_x;
+  guint32 par_y;
+  guint16 video_codec_tag;
+  guint64 video_offset;
+  gboolean video_need_discont;
+  gboolean video_need_segment;
+  gboolean video_linked;
+  gboolean got_par;
+
+  gboolean random_access;
+  gboolean need_header;
+  gboolean has_audio;
+  gboolean has_video;
+  gboolean push_tags;
+  gboolean strict;
+  gboolean flushing;
+};
+
+struct _GstFLVDemuxClass
+{
+  GstElementClass parent_class;
+};
+
+GType gst_flv_demux_get_type (void);
+
+gboolean gst_flv_demux_query (GstPad * pad, GstQuery * query);
+gboolean gst_flv_demux_src_event (GstPad * pad, GstEvent * event);
+
+G_END_DECLS
+#endif /* __FLV_DEMUX_H__ */
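
The parser file that follows opens with two small helpers, `FLV_GET_BEUI24` and `FLV_GET_STRING`, which read a 24-bit big-endian integer and a 16-bit-length-prefixed string out of FLV tag data. A stand-alone plain-C sketch of the same byte-level logic (without the GStreamer read macros, and with a slightly stricter bounds check on the string length):

```c
#include <assert.h>
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* FLV "UI24": 24-bit big-endian unsigned integer, byte 0 most significant. */
uint32_t flv_get_beui24(const uint8_t *data)
{
  return ((uint32_t) data[0] << 16) | ((uint32_t) data[1] << 8) | data[2];
}

/* FLV SCRIPTDATA string: 16-bit big-endian length followed by that many
 * bytes. Returns a NUL-terminated copy the caller must free, or NULL when
 * the buffer is too small to hold the declared length. */
char *flv_get_string(const uint8_t *data, size_t data_size)
{
  if (data_size < 2)
    return NULL;
  uint32_t len = ((uint32_t) data[0] << 8) | data[1];
  if (len > data_size - 2)
    return NULL;
  char *s = calloc(len + 1, 1);
  if (s)
    memcpy(s, data + 2, len);
  return s;
}
```

The demuxer combines the UI24 with an extension byte (`pts |= pts_ext << 24`) to recover a full 32-bit timestamp, since FLV stores only the low 24 bits inline.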

Index: libmedia/gst/gstflvparse.c
===================================================================
RCS file: libmedia/gst/gstflvparse.c
diff -N libmedia/gst/gstflvparse.c
--- /dev/null   1 Jan 1970 00:00:00 -0000
+++ libmedia/gst/gstflvparse.c  21 Jan 2008 07:07:27 -0000      1.1
@@ -0,0 +1,1027 @@
+/* GStreamer
+ * Copyright (C) <2007> Julien Moutte <address@hidden>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
+ * Boston, MA 02111-1307, USA.
+ */
+
+#include "gstflvparse.h"
+
+#include <string.h>
+
+GST_DEBUG_CATEGORY_EXTERN (flvdemux_debug);
+#define GST_CAT_DEFAULT flvdemux_debug
+
+static guint32
+FLV_GET_BEUI24 (const guint8 * data, size_t data_size)
+{
+  guint32 ret = 0;
+
+  g_return_val_if_fail (data != NULL, 0);
+  g_return_val_if_fail (data_size >= 3, 0);
+
+  ret = GST_READ_UINT16_BE (data) << 8;
+  ret |= GST_READ_UINT8 (data + 2);
+
+  return ret;
+}
+
+static gchar *
+FLV_GET_STRING (const guint8 * data, size_t data_size)
+{
+  guint32 string_size = 0;
+  gchar *string = NULL;
+
+  g_return_val_if_fail (data != NULL, NULL);
+  g_return_val_if_fail (data_size >= 2, NULL);
+
+  string_size = GST_READ_UINT16_BE (data);
+  if (G_UNLIKELY (string_size > data_size)) {
+    return NULL;
+  }
+
+  string = g_try_malloc0 (string_size + 1);
+  if (G_UNLIKELY (!string)) {
+    return NULL;
+  }
+
+  memcpy (string, data + 2, string_size);
+
+  return string;
+}
+
+static const GstQueryType *
+gst_flv_demux_query_types (GstPad * pad)
+{
+  static const GstQueryType query_types[] = {
+    GST_QUERY_DURATION,
+    0
+  };
+
+  return query_types;
+}
+
+static size_t
+gst_flv_parse_metadata_item (GstFLVDemux * demux, const guint8 * data,
+    size_t data_size, gboolean * end_marker)
+{
+  gchar *tag_name = NULL;
+  guint8 tag_type = 0;
+  size_t offset = 0;
+
+  /* Initialize the end_marker flag to FALSE */
+  *end_marker = FALSE;
+
+  /* Name of the tag */
+  tag_name = FLV_GET_STRING (data, data_size);
+  if (G_UNLIKELY (!tag_name)) {
+    GST_WARNING_OBJECT (demux, "failed reading tag name");
+    goto beach;
+  }
+
+  offset += strlen (tag_name) + 2;
+
+  /* What kind of object is that */
+  tag_type = GST_READ_UINT8 (data + offset);
+
+  offset++;
+
+  GST_DEBUG_OBJECT (demux, "tag name %s, tag type %d", tag_name, tag_type);
+
+  switch (tag_type) {
+    case 0:                    // Double
+    {                           /* Use a union to read the uint64 and then as a double */
+      union
+      {
+        guint64 value_uint64;
+        gdouble value_double;
+      } value_union;
+
+      value_union.value_uint64 = GST_READ_UINT64_BE (data + offset);
+
+      offset += 8;
+
+      GST_DEBUG_OBJECT (demux, "%s => (double) %f", tag_name,
+          value_union.value_double);
+
+      if (!strcmp (tag_name, "duration")) {
+        demux->duration = value_union.value_double * GST_SECOND;
+
+        gst_tag_list_add (demux->taglist, GST_TAG_MERGE_REPLACE,
+            GST_TAG_DURATION, demux->duration, NULL);
+      } else {
+        if (tag_name) {
+          if (!strcmp (tag_name, "AspectRatioX")) {
+            demux->par_x = value_union.value_double;
+            demux->got_par = TRUE;
+          } else if (!strcmp (tag_name, "AspectRatioY")) {
+            demux->par_y = value_union.value_double;
+            demux->got_par = TRUE;
+          }
+          if (!gst_tag_exists (tag_name)) {
+            gst_tag_register (tag_name, GST_TAG_FLAG_META, G_TYPE_DOUBLE,
+                tag_name, tag_name, gst_tag_merge_use_first);
+          }
+
+          if (gst_tag_get_type (tag_name) == G_TYPE_DOUBLE) {
+            gst_tag_list_add (demux->taglist, GST_TAG_MERGE_REPLACE,
+                tag_name, value_union.value_double, NULL);
+          } else {
+            GST_WARNING_OBJECT (demux, "tag %s already registered with a "
+                "different type", tag_name);
+          }
+        }
+      }
+
+      break;
+    }
+    case 1:                    // Boolean
+    {
+      gboolean value = GST_READ_UINT8 (data + offset);
+
+      offset++;
+
+      GST_DEBUG_OBJECT (demux, "%s => (boolean) %d", tag_name, value);
+
+      if (tag_name) {
+        if (!gst_tag_exists (tag_name)) {
+          gst_tag_register (tag_name, GST_TAG_FLAG_META, G_TYPE_BOOLEAN,
+              tag_name, tag_name, gst_tag_merge_use_first);
+        }
+
+        if (gst_tag_get_type (tag_name) == G_TYPE_BOOLEAN) {
+          gst_tag_list_add (demux->taglist, GST_TAG_MERGE_REPLACE,
+              tag_name, value, NULL);
+        } else {
+          GST_WARNING_OBJECT (demux, "tag %s already registered with a "
+              "different type", tag_name);
+        }
+      }
+
+      break;
+    }
+    case 2:                    // String
+    {
+      gchar *value = NULL;
+
+      value = FLV_GET_STRING (data + offset, data_size - offset);
+      if (G_UNLIKELY (!value)) {
+        GST_WARNING_OBJECT (demux, "failed reading string value");
+        break;
+      }
+
+      offset += strlen (value) + 2;
+
+      GST_DEBUG_OBJECT (demux, "%s => (string) %s", tag_name, value);
+
+      if (tag_name) {
+        if (!gst_tag_exists (tag_name)) {
+          gst_tag_register (tag_name, GST_TAG_FLAG_META, G_TYPE_STRING,
+              tag_name, tag_name, gst_tag_merge_strings_with_comma);
+        }
+
+        if (gst_tag_get_type (tag_name) == G_TYPE_STRING) {
+          gst_tag_list_add (demux->taglist, GST_TAG_MERGE_REPLACE,
+              tag_name, value, NULL);
+        } else {
+          GST_WARNING_OBJECT (demux, "tag %s already registered with a "
+              "different type", tag_name);
+        }
+      }
+
+      g_free (value);
+
+      break;
+    }
+    case 3:                    // Object
+    {
+      gboolean end_of_object_marker = FALSE;
+
+      while (!end_of_object_marker && offset < data_size) {
+        size_t read = gst_flv_parse_metadata_item (demux, data + offset,
+            data_size - offset, &end_of_object_marker);
+
+        if (G_UNLIKELY (!read)) {
+          GST_WARNING_OBJECT (demux, "failed reading a tag, skipping");
+          break;
+        }
+
+        offset += read;
+      }
+
+      break;
+    }
+    case 9:                    // End marker
+    {
+      GST_DEBUG_OBJECT (demux, "end marker ?");
+      if (tag_name[0] == '\0') {
+
+        GST_DEBUG_OBJECT (demux, "end marker detected");
+
+        *end_marker = TRUE;
+      }
+
+      break;
+    }
+    case 10:                   // Array
+    {
+      guint32 nb_elems = GST_READ_UINT32_BE (data + offset);
+
+      offset += 4;
+
+      GST_DEBUG_OBJECT (demux, "array has %d elements", nb_elems);
+
+      if (!strcmp (tag_name, "times")) {
+        if (demux->times) {
+          g_array_free (demux->times, TRUE);
+        }
+        demux->times = g_array_new (FALSE, TRUE, sizeof (gdouble));
+      } else if (!strcmp (tag_name, "filepositions")) {
+        if (demux->filepositions) {
+          g_array_free (demux->filepositions, TRUE);
+        }
+        demux->filepositions = g_array_new (FALSE, TRUE, sizeof (gdouble));
+      }
+
+      while (nb_elems--) {
+        guint8 elem_type = GST_READ_UINT8 (data + offset);
+
+        offset++;
+
+        switch (elem_type) {
+          case 0:
+          {
+            union
+            {
+              guint64 value_uint64;
+              gdouble value_double;
+            } value_union;
+
+            value_union.value_uint64 = GST_READ_UINT64_BE (data + offset);
+
+            offset += 8;
+
+            GST_DEBUG_OBJECT (demux, "element is a double %f",
+                value_union.value_double);
+
+            if (!strcmp (tag_name, "times") && demux->times) {
+              g_array_append_val (demux->times, value_union.value_double);
+            } else if (!strcmp (tag_name, "filepositions") &&
+                demux->filepositions) {
+              g_array_append_val (demux->filepositions,
+                  value_union.value_double);
+            }
+            break;
+          }
+          default:
+            GST_WARNING_OBJECT (demux, "unsupported array element type %d",
+                elem_type);
+        }
+      }
+
+      break;
+    }
+    case 11:                   // Date
+    {
+      union
+      {
+        guint64 value_uint64;
+        gdouble value_double;
+      } value_union;
+
+      value_union.value_uint64 = GST_READ_UINT64_BE (data + offset);
+
+      offset += 8;
+
+      /* There are 2 additional bytes */
+      offset += 2;
+
+      GST_DEBUG_OBJECT (demux, "%s => (date as a double) %f", tag_name,
+          value_union.value_double);
+
+      break;
+    }
+    default:
+      GST_WARNING_OBJECT (demux, "unsupported tag type %d", tag_type);
+  }
+
+  g_free (tag_name);
+
+beach:
+  return offset;
+}
+
+GstFlowReturn
+gst_flv_parse_tag_script (GstFLVDemux * demux, const guint8 * data,
+    size_t data_size)
+{
+  GstFlowReturn ret = GST_FLOW_OK;
+  size_t offset = 7;
+
+  GST_LOG_OBJECT (demux, "parsing a script tag");
+
+  if (GST_READ_UINT8 (data + offset++) == 2) {
+    guint i;
+    gchar *function_name = FLV_GET_STRING (data + offset, data_size - offset);
+
+    GST_LOG_OBJECT (demux, "function name is %s", function_name);
+
+    if (function_name && !strcmp (function_name, "onMetaData")) {
+      guint32 nb_elems = 0;
+      gboolean end_marker = FALSE;
+
+      GST_DEBUG_OBJECT (demux, "we have a metadata script object");
+
+      /* Jump over the onMetaData string and the array indicator */
+      offset += 13;
+
+      nb_elems = GST_READ_UINT32_BE (data + offset);
+
+      /* Jump over the number of elements */
+      offset += 4;
+
+      GST_DEBUG_OBJECT (demux, "there are %d elements in the array", nb_elems);
+
+      while (nb_elems-- && !end_marker) {
+        size_t read = gst_flv_parse_metadata_item (demux, data + offset,
+            data_size - offset, &end_marker);
+
+        if (G_UNLIKELY (!read)) {
+          GST_WARNING_OBJECT (demux, "failed reading a tag, skipping");
+          break;
+        }
+        offset += read;
+      }
+
+      demux->push_tags = TRUE;
+    }
+
+    g_free (function_name);
+
+    if (demux->index && demux->times && demux->filepositions) {
+      /* If an index was found, insert associations */
+      for (i = 0; i < MIN (demux->times->len, demux->filepositions->len); i++) {
+        guint64 time, fileposition;
+
+        time = g_array_index (demux->times, gdouble, i) * GST_SECOND;
+        fileposition = g_array_index (demux->filepositions, gdouble, i);
+        GST_LOG_OBJECT (demux, "adding association %" GST_TIME_FORMAT "-> %"
+            G_GUINT64_FORMAT, GST_TIME_ARGS (time), fileposition);
+        gst_index_add_association (demux->index, demux->index_id,
+            GST_ASSOCIATION_FLAG_KEY_UNIT, GST_FORMAT_TIME, time,
+            GST_FORMAT_BYTES, fileposition, NULL);
+      }
+    }
+  }
+
+
+  return ret;
+}
+
+GstFlowReturn
+gst_flv_parse_tag_audio (GstFLVDemux * demux, const guint8 * data,
+    size_t data_size)
+{
+  GstFlowReturn ret = GST_FLOW_OK;
+  GstBuffer *buffer = NULL;
+  guint32 pts = 0, codec_tag = 0, rate = 5512, width = 8, channels = 1;
+  guint32 codec_data = 0, pts_ext = 0;
+  guint8 flags = 0;
+
+  GST_LOG_OBJECT (demux, "parsing an audio tag");
+
+  GST_LOG_OBJECT (demux, "pts bytes %02X %02X %02X %02X", data[0], data[1],
+      data[2], data[3]);
+
+  /* Grab information about audio tag */
+  pts = FLV_GET_BEUI24 (data, data_size);
+  /* read the pts extension to 32 bits integer */
+  pts_ext = GST_READ_UINT8 (data + 3);
+  /* Combine them */
+  pts |= pts_ext << 24;
+  /* Skip the stream id and go directly to the flags */
+  flags = GST_READ_UINT8 (data + 7);
+
+  /* Channels */
+  if (flags & 0x01) {
+    channels = 2;
+  }
+  /* Width */
+  if (flags & 0x02) {
+    width = 16;
+  }
+  /* Sampling rate */
+  if ((flags & 0x0C) == 0x0C) {
+    rate = 44100;
+  } else if ((flags & 0x0C) == 0x08) {
+    rate = 22050;
+  } else if ((flags & 0x0C) == 0x04) {
+    rate = 11025;
+  }
+  /* Codec tag */
+  codec_tag = flags >> 4;
+  codec_data = 1;
+
+  GST_LOG_OBJECT (demux, "audio tag with %d channels, %dHz sampling rate, "
+      "%d bits width, codec tag %u (flags %02X)", channels, rate, width,
+      codec_tag, flags);
+
+  /* If we don't have our audio pad created, then create it. */
+  if (G_UNLIKELY (!demux->audio_pad)) {
+    GstCaps *caps = NULL;
+    gchar *codec_name = NULL;
+
+    demux->audio_pad = gst_pad_new ("audio", GST_PAD_SRC);
+    if (G_UNLIKELY (!demux->audio_pad)) {
+      GST_WARNING_OBJECT (demux, "failed creating audio pad");
+      ret = GST_FLOW_ERROR;
+      goto beach;
+    }
+
+    /* Make it active */
+    gst_pad_set_active (demux->audio_pad, TRUE);
+
+    switch (codec_tag) {
+      case 1:
+        caps =
+            gst_caps_new_simple ("audio/x-adpcm", "layout", G_TYPE_STRING,
+            "swf", NULL);
+        codec_name = "Shockwave ADPCM";
+        break;
+      case 2:
+        caps = gst_caps_new_simple ("audio/mpeg",
+            "mpegversion", G_TYPE_INT, 1, "layer", G_TYPE_INT, 3, NULL);
+        codec_name = "MPEG 1 Audio, Layer 3 (MP3)";
+        break;
+      case 0:
+      case 3:
+        caps = gst_caps_new_simple ("audio/x-raw-int",
+            "endianness", G_TYPE_INT, G_BYTE_ORDER,
+            "signed", G_TYPE_BOOLEAN, TRUE,
+            "width", G_TYPE_INT, width, "depth", G_TYPE_INT, width, NULL);
+        codec_name = "Raw Audio";
+        break;
+      case 5:
+      case 6:
+        caps = gst_caps_new_simple ("audio/x-nellymoser", NULL);
+        codec_name = "Nellymoser ASAO";
+        break;
+      default:
+        GST_WARNING_OBJECT (demux, "unsupported audio codec tag %u", codec_tag);
+    }
+
+    if (G_UNLIKELY (!caps)) {
+      GST_WARNING_OBJECT (demux, "failed creating caps for audio pad");
+      ret = GST_FLOW_ERROR;
+      gst_object_unref (demux->audio_pad);
+      demux->audio_pad = NULL;
+      goto beach;
+    }
+
+    gst_caps_set_simple (caps,
+        "rate", G_TYPE_INT, rate, "channels", G_TYPE_INT, channels, NULL);
+
+    gst_pad_set_caps (demux->audio_pad, caps);
+    if (codec_name) {
+      if (demux->taglist == NULL)
+        demux->taglist = gst_tag_list_new ();
+      gst_tag_list_add (demux->taglist, GST_TAG_MERGE_REPLACE,
+          GST_TAG_AUDIO_CODEC, codec_name, NULL);
+    }
+
+    GST_DEBUG_OBJECT (demux, "created audio pad with caps %" GST_PTR_FORMAT,
+        caps);
+
+    gst_caps_unref (caps);
+
+    /* Store the caps we have set */
+    demux->audio_codec_tag = codec_tag;
+    demux->rate = rate;
+    demux->channels = channels;
+    demux->width = width;
+
+    /* Set functions on the pad */
+    gst_pad_set_query_type_function (demux->audio_pad,
+        GST_DEBUG_FUNCPTR (gst_flv_demux_query_types));
+    gst_pad_set_query_function (demux->audio_pad,
+        GST_DEBUG_FUNCPTR (gst_flv_demux_query));
+    gst_pad_set_event_function (demux->audio_pad,
+        GST_DEBUG_FUNCPTR (gst_flv_demux_src_event));
+
+    /* We need to set caps before adding */
+    gst_element_add_pad (GST_ELEMENT (demux),
+        gst_object_ref (demux->audio_pad));
+
+    /* We only emit no more pads when we have audio and video. Indeed we can
+     * not trust the FLV header to tell us if there will be only audio or 
+     * only video and we would just break discovery of some files */
+    if (demux->audio_pad && demux->video_pad) {
+      GST_DEBUG_OBJECT (demux, "emitting no more pads");
+      gst_element_no_more_pads (GST_ELEMENT (demux));
+    }
+  }
+
+  /* Check if caps have changed */
+  if (G_UNLIKELY (rate != demux->rate || channels != demux->channels ||
+          codec_tag != demux->audio_codec_tag || width != demux->width)) {
+    GstCaps *caps = NULL;
+    gchar *codec_name = NULL;
+
+    GST_DEBUG_OBJECT (demux, "audio settings have changed, changing caps");
+
+    switch (codec_tag) {
+      case 1:
+        caps =
+            gst_caps_new_simple ("audio/x-adpcm", "layout", G_TYPE_STRING,
+            "swf", NULL);
+        codec_name = "Shockwave ADPCM";
+        break;
+      case 2:
+        caps = gst_caps_new_simple ("audio/mpeg",
+            "mpegversion", G_TYPE_INT, 1, "layer", G_TYPE_INT, 3, NULL);
+        codec_name = "MPEG 1 Audio, Layer 3 (MP3)";
+        break;
+      case 0:
+      case 3:
+        caps = gst_caps_new_simple ("audio/x-raw-int", NULL);
+        codec_name = "Raw Audio";
+        break;
+      case 6:
+        caps = gst_caps_new_simple ("audio/x-nellymoser", NULL);
+        codec_name = "Nellymoser ASAO";
+        break;
+      default:
+        GST_WARNING_OBJECT (demux, "unsupported audio codec tag %u", codec_tag);
+    }
+
+    if (G_UNLIKELY (!caps)) {
+      GST_WARNING_OBJECT (demux, "failed creating caps for audio pad");
+      ret = GST_FLOW_ERROR;
+      goto beach;
+    }
+
+    gst_caps_set_simple (caps,
+        "rate", G_TYPE_INT, rate,
+        "channels", G_TYPE_INT, channels, "width", G_TYPE_INT, width, NULL);
+
+    gst_pad_set_caps (demux->audio_pad, caps);
+    if (codec_name) {
+      if (demux->taglist == NULL)
+        demux->taglist = gst_tag_list_new ();
+      gst_tag_list_add (demux->taglist, GST_TAG_MERGE_REPLACE,
+          GST_TAG_AUDIO_CODEC, codec_name, NULL);
+    }
+
+    gst_caps_unref (caps);
+
+    /* Store the caps we have set */
+    demux->audio_codec_tag = codec_tag;
+    demux->rate = rate;
+    demux->channels = channels;
+    demux->width = width;
+  }
+
+  /* Push taglist if present */
+  if ((demux->has_audio && !demux->audio_pad) ||
+      (demux->has_video && !demux->video_pad)) {
+    GST_DEBUG_OBJECT (demux, "we are still waiting for a stream to come up "
+        "before we can push tags");
+  } else {
+    if (demux->taglist && demux->push_tags) {
+      GST_DEBUG_OBJECT (demux, "pushing tags out");
+      gst_element_found_tags (GST_ELEMENT (demux), demux->taglist);
+      demux->taglist = gst_tag_list_new ();
+      demux->push_tags = FALSE;
+    }
+  }
+
+  /* Check if we have anything to push */
+  if (demux->tag_data_size <= codec_data) {
+    GST_LOG_OBJECT (demux, "Nothing left in this tag, returning");
+    goto beach;
+  }
+
+  /* Create buffer from pad */
+  ret = gst_pad_alloc_buffer (demux->audio_pad, GST_BUFFER_OFFSET_NONE,
+      demux->tag_data_size - codec_data, GST_PAD_CAPS (demux->audio_pad),
+      &buffer);
+  if (G_UNLIKELY (ret != GST_FLOW_OK)) {
+    GST_WARNING_OBJECT (demux, "failed allocating a %" G_GUINT64_FORMAT
+        " bytes buffer: %s", demux->tag_data_size, gst_flow_get_name (ret));
+    if (ret == GST_FLOW_NOT_LINKED) {
+      demux->audio_linked = FALSE;
+    }
+    goto beach;
+  }
+
+  demux->audio_linked = TRUE;
+
+  /* Fill buffer with data */
+  GST_BUFFER_TIMESTAMP (buffer) = pts * GST_MSECOND;
+  GST_BUFFER_DURATION (buffer) = GST_CLOCK_TIME_NONE;
+  GST_BUFFER_OFFSET (buffer) = demux->audio_offset++;
+  GST_BUFFER_OFFSET_END (buffer) = demux->audio_offset;
+
+  if (G_UNLIKELY (demux->audio_need_discont)) {
+    GST_BUFFER_FLAG_SET (buffer, GST_BUFFER_FLAG_DISCONT);
+    demux->audio_need_discont = FALSE;
+  }
+
+  gst_segment_set_last_stop (demux->segment, GST_FORMAT_TIME,
+      GST_BUFFER_TIMESTAMP (buffer));
+
+  /* Do we need a newsegment event ? */
+  if (G_UNLIKELY (demux->audio_need_segment)) {
+    if (!demux->new_seg_event) {
+      GST_DEBUG_OBJECT (demux, "pushing newsegment from %"
+          GST_TIME_FORMAT " to %" GST_TIME_FORMAT,
+          GST_TIME_ARGS (demux->segment->last_stop),
+          GST_TIME_ARGS (demux->segment->stop));
+      demux->new_seg_event =
+          gst_event_new_new_segment (FALSE, demux->segment->rate,
+          demux->segment->format, demux->segment->last_stop,
+          demux->segment->stop, demux->segment->last_stop);
+    } else {
+      GST_DEBUG_OBJECT (demux, "pushing pre-generated newsegment event");
+    }
+
+    gst_pad_push_event (demux->audio_pad, gst_event_ref (demux->new_seg_event));
+
+    demux->audio_need_segment = FALSE;
+  }
+
+  memcpy (GST_BUFFER_DATA (buffer), data + 7 + codec_data,
+      demux->tag_data_size - codec_data);
+
+  GST_LOG_OBJECT (demux, "pushing %d bytes buffer at pts %" GST_TIME_FORMAT
+      " with duration %" GST_TIME_FORMAT ", offset %" G_GUINT64_FORMAT,
+      GST_BUFFER_SIZE (buffer), GST_TIME_ARGS (GST_BUFFER_TIMESTAMP (buffer)),
+      GST_TIME_ARGS (GST_BUFFER_DURATION (buffer)), GST_BUFFER_OFFSET (buffer));
+
+  /* Push downstream */
+  ret = gst_pad_push (demux->audio_pad, buffer);
+
+beach:
+  return ret;
+}
+
+GstFlowReturn
+gst_flv_parse_tag_video (GstFLVDemux * demux, const guint8 * data,
+    size_t data_size)
+{
+  GstFlowReturn ret = GST_FLOW_OK;
+  GstBuffer *buffer = NULL;
+  guint32 pts = 0, codec_data = 1, pts_ext = 0;
+  gboolean keyframe = FALSE;
+  guint8 flags = 0, codec_tag = 0;
+
+  GST_LOG_OBJECT (demux, "parsing a video tag");
+
+  GST_LOG_OBJECT (demux, "pts bytes %02X %02X %02X %02X", data[0], data[1],
+      data[2], data[3]);
+
+  /* Grab information about video tag */
+  pts = FLV_GET_BEUI24 (data, data_size);
+  /* read the pts extension to 32 bits integer */
+  pts_ext = GST_READ_UINT8 (data + 3);
+  /* Combine them */
+  pts |= pts_ext << 24;
+  /* Skip the stream id and go directly to the flags */
+  flags = GST_READ_UINT8 (data + 7);
+
+  /* Keyframe */
+  if ((flags >> 4) == 1) {
+    keyframe = TRUE;
+  }
+  /* Codec tag */
+  codec_tag = flags & 0x0F;
+  if (codec_tag == 4 || codec_tag == 5) {
+    codec_data = 2;
+  }
+
+  GST_LOG_OBJECT (demux, "video tag with codec tag %u, keyframe (%d) "
+      "(flags %02X)", codec_tag, keyframe, flags);
+
+  /* If we don't have our video pad created, then create it. */
+  if (G_UNLIKELY (!demux->video_pad)) {
+    GstCaps *caps = NULL;
+    gchar *codec_name = NULL;
+
+    demux->video_pad = gst_pad_new ("video", GST_PAD_SRC);
+    if (G_UNLIKELY (!demux->video_pad)) {
+      GST_WARNING_OBJECT (demux, "failed creating video pad");
+      ret = GST_FLOW_ERROR;
+      goto beach;
+    }
+    /* Make it active */
+    gst_pad_set_active (demux->video_pad, TRUE);
+
+    /* Generate caps for that pad */
+    switch (codec_tag) {
+      case 2:
+        caps = gst_caps_new_simple ("video/x-flash-video", NULL);
+        codec_name = "Sorenson Video";
+        break;
+      case 3:
+        caps = gst_caps_new_simple ("video/x-flash-screen", NULL);
+        codec_name = "Flash Screen Video";
+        break;
+      case 4:
+      case 5:
+        caps = gst_caps_new_simple ("video/x-vp6-flash", NULL);
+        codec_name = "On2 VP6 Video";
+        break;
+      default:
+        GST_WARNING_OBJECT (demux, "unsupported video codec tag %d", codec_tag);
+    }
+
+    if (G_UNLIKELY (!caps)) {
+      GST_WARNING_OBJECT (demux, "failed creating caps for video pad");
+      gst_object_unref (demux->video_pad);
+      demux->video_pad = NULL;
+      ret = GST_FLOW_ERROR;
+      goto beach;
+    }
+
+    gst_caps_set_simple (caps, "pixel-aspect-ratio", GST_TYPE_FRACTION,
+        demux->par_x, demux->par_y, NULL);
+
+    /* When we ve set pixel-aspect-ratio we use that boolean to detect a 
+     * metadata tag that would come later and trigger a caps change */
+    demux->got_par = FALSE;
+
+    gst_pad_set_caps (demux->video_pad, caps);
+
+    GST_DEBUG_OBJECT (demux, "created video pad with caps %" GST_PTR_FORMAT,
+        caps);
+
+    gst_caps_unref (caps);
+    if (codec_name) {
+      if (demux->taglist == NULL)
+        demux->taglist = gst_tag_list_new ();
+      gst_tag_list_add (demux->taglist, GST_TAG_MERGE_REPLACE,
+          GST_TAG_VIDEO_CODEC, codec_name, NULL);
+    }
+
+    /* Store the caps we have set */
+    demux->video_codec_tag = codec_tag;
+
+    /* Set functions on the pad */
+    gst_pad_set_query_type_function (demux->video_pad,
+        GST_DEBUG_FUNCPTR (gst_flv_demux_query_types));
+    gst_pad_set_query_function (demux->video_pad,
+        GST_DEBUG_FUNCPTR (gst_flv_demux_query));
+    gst_pad_set_event_function (demux->video_pad,
+        GST_DEBUG_FUNCPTR (gst_flv_demux_src_event));
+
+    /* We need to set caps before adding */
+    gst_element_add_pad (GST_ELEMENT (demux),
+        gst_object_ref (demux->video_pad));
+
+    /* We only emit no more pads when we have audio and video. Indeed we can
+     * not trust the FLV header to tell us if there will be only audio or 
+     * only video and we would just break discovery of some files */
+    if (demux->audio_pad && demux->video_pad) {
+      GST_DEBUG_OBJECT (demux, "emitting no more pads");
+      gst_element_no_more_pads (GST_ELEMENT (demux));
+    }
+  }
+
+  /* Check if caps have changed */
+  if (G_UNLIKELY (codec_tag != demux->video_codec_tag || demux->got_par)) {
+    GstCaps *caps = NULL;
+    gchar *codec_name = NULL;
+
+    GST_DEBUG_OBJECT (demux, "video settings have changed, changing caps");
+
+    /* Generate caps for that pad */
+    switch (codec_tag) {
+      case 2:
+        caps = gst_caps_new_simple ("video/x-flash-video", NULL);
+        codec_name = "Sorenson Video";
+        break;
+      case 3:
+        caps = gst_caps_new_simple ("video/x-flash-screen", NULL);
+        codec_name = "Flash Screen Video";
+        break;
+      case 4:
+      case 5:
+        caps = gst_caps_new_simple ("video/x-vp6-flash", NULL);
+        codec_name = "On2 VP6 Video";
+        break;
+      default:
+        GST_WARNING_OBJECT (demux, "unsupported video codec tag %d", codec_tag);
+    }
+
+    if (G_UNLIKELY (!caps)) {
+      GST_WARNING_OBJECT (demux, "failed creating caps for video pad");
+      ret = GST_FLOW_ERROR;
+      goto beach;
+    }
+
+    gst_caps_set_simple (caps, "pixel-aspect-ratio", GST_TYPE_FRACTION,
+        demux->par_x, demux->par_y, NULL);
+
+    /* When we ve set pixel-aspect-ratio we use that boolean to detect a 
+     * metadata tag that would come later and trigger a caps change */
+    demux->got_par = FALSE;
+
+    gst_pad_set_caps (demux->video_pad, caps);
+
+    gst_caps_unref (caps);
+    if (codec_name) {
+      if (demux->taglist == NULL)
+        demux->taglist = gst_tag_list_new ();
+      gst_tag_list_add (demux->taglist, GST_TAG_MERGE_REPLACE,
+          GST_TAG_VIDEO_CODEC, codec_name, NULL);
+    }
+
+    /* Store the caps we have set */
+    demux->video_codec_tag = codec_tag;
+  }
+
+  /* Push taglist if present */
+  if ((demux->has_audio && !demux->audio_pad) ||
+      (demux->has_video && !demux->video_pad)) {
+    GST_DEBUG_OBJECT (demux, "we are still waiting for a stream to come up "
+        "before we can push tags");
+  } else {
+    if (demux->taglist && demux->push_tags) {
+      GST_DEBUG_OBJECT (demux, "pushing tags out");
+      gst_element_found_tags (GST_ELEMENT (demux), demux->taglist);
+      demux->taglist = gst_tag_list_new ();
+      demux->push_tags = FALSE;
+    }
+  }
+
+  /* Check if we have anything to push */
+  if (demux->tag_data_size <= codec_data) {
+    GST_LOG_OBJECT (demux, "Nothing left in this tag, returning");
+    goto beach;
+  }
+
+  /* Create buffer from pad */
+  ret = gst_pad_alloc_buffer (demux->video_pad, GST_BUFFER_OFFSET_NONE,
+      demux->tag_data_size - codec_data, GST_PAD_CAPS (demux->video_pad),
+      &buffer);
+  if (G_UNLIKELY (ret != GST_FLOW_OK)) {
+    GST_WARNING_OBJECT (demux, "failed allocating a %" G_GUINT64_FORMAT
+        " bytes buffer: %s", demux->tag_data_size, gst_flow_get_name (ret));
+    if (ret == GST_FLOW_NOT_LINKED) {
+      demux->video_linked = FALSE;
+    }
+    goto beach;
+  }
+
+  demux->video_linked = TRUE;
+
+  /* Fill buffer with data */
+  GST_BUFFER_TIMESTAMP (buffer) = pts * GST_MSECOND;
+  GST_BUFFER_DURATION (buffer) = GST_CLOCK_TIME_NONE;
+  GST_BUFFER_OFFSET (buffer) = demux->video_offset++;
+  GST_BUFFER_OFFSET_END (buffer) = demux->video_offset;
+
+  if (!keyframe) {
+    GST_BUFFER_FLAG_SET (buffer, GST_BUFFER_FLAG_DELTA_UNIT);
+  } else {
+    if (demux->index) {
+      GST_LOG_OBJECT (demux, "adding association %" GST_TIME_FORMAT "-> %"
+          G_GUINT64_FORMAT, GST_TIME_ARGS (GST_BUFFER_TIMESTAMP (buffer)),
+          demux->cur_tag_offset);
+      gst_index_add_association (demux->index, demux->index_id,
+          GST_ASSOCIATION_FLAG_KEY_UNIT,
+          GST_FORMAT_TIME, GST_BUFFER_TIMESTAMP (buffer),
+          GST_FORMAT_BYTES, demux->cur_tag_offset, NULL);
+    }
+  }
+
+  if (G_UNLIKELY (demux->video_need_discont)) {
+    GST_BUFFER_FLAG_SET (buffer, GST_BUFFER_FLAG_DISCONT);
+    demux->video_need_discont = FALSE;
+  }
+
+  gst_segment_set_last_stop (demux->segment, GST_FORMAT_TIME,
+      GST_BUFFER_TIMESTAMP (buffer));
+
+  /* Do we need a newsegment event ? */
+  if (G_UNLIKELY (demux->video_need_segment)) {
+    if (!demux->new_seg_event) {
+      GST_DEBUG_OBJECT (demux, "pushing newsegment from %"
+          GST_TIME_FORMAT " to %" GST_TIME_FORMAT,
+          GST_TIME_ARGS (demux->segment->last_stop),
+          GST_TIME_ARGS (demux->segment->stop));
+      demux->new_seg_event =
+          gst_event_new_new_segment (FALSE, demux->segment->rate,
+          demux->segment->format, demux->segment->last_stop,
+          demux->segment->stop, demux->segment->last_stop);
+    } else {
+      GST_DEBUG_OBJECT (demux, "pushing pre-generated newsegment event");
+    }
+
+    gst_pad_push_event (demux->video_pad, gst_event_ref (demux->new_seg_event));
+
+    demux->video_need_segment = FALSE;
+  }
+
+  /* FIXME: safety checks */
+  memcpy (GST_BUFFER_DATA (buffer), data + 7 + codec_data,
+      demux->tag_data_size - codec_data);
+
+  GST_LOG_OBJECT (demux, "pushing %d bytes buffer at pts %" GST_TIME_FORMAT
+      " with duration %" GST_TIME_FORMAT ", offset %" G_GUINT64_FORMAT
+      ", keyframe (%d)", GST_BUFFER_SIZE (buffer),
+      GST_TIME_ARGS (GST_BUFFER_TIMESTAMP (buffer)),
+      GST_TIME_ARGS (GST_BUFFER_DURATION (buffer)), GST_BUFFER_OFFSET (buffer),
+      keyframe);
+
+  /* Push downstream */
+  ret = gst_pad_push (demux->video_pad, buffer);
+
+beach:
+  return ret;
+}
+
+GstFlowReturn
+gst_flv_parse_tag_type (GstFLVDemux * demux, const guint8 * data,
+    size_t data_size)
+{
+  GstFlowReturn ret = GST_FLOW_OK;
+  guint8 tag_type = 0;
+
+  tag_type = data[0];
+
+  switch (tag_type) {
+    case 9:
+      demux->state = FLV_STATE_TAG_VIDEO;
+      demux->has_video = TRUE;
+      break;
+    case 8:
+      demux->state = FLV_STATE_TAG_AUDIO;
+      demux->has_audio = TRUE;
+      break;
+    case 18:
+      demux->state = FLV_STATE_TAG_SCRIPT;
+      break;
+    default:
+      GST_WARNING_OBJECT (demux, "unsupported tag type %u", tag_type);
+  }
+
+  /* The tag header is 1 byte of type + 3 bytes of data size + 7 bytes of
+   * timestamp/stream id, i.e. 11 bytes before the tag data; the 4 bytes of
+   * previous tag size that follow each tag are not counted here */
+  demux->tag_data_size = FLV_GET_BEUI24 (data + 1, data_size - 1);
+  demux->tag_size = demux->tag_data_size + 11;
+
+  GST_LOG_OBJECT (demux, "tag data size is %" G_GUINT64_FORMAT,
+      demux->tag_data_size);
+
+  return ret;
+}
+
+GstFlowReturn
+gst_flv_parse_header (GstFLVDemux * demux, const guint8 * data,
+    size_t data_size)
+{
+  GstFlowReturn ret = GST_FLOW_OK;
+
+  /* Check for the FLV tag */
+  if (data[0] == 'F' && data[1] == 'L' && data[2] == 'V') {
+    GST_DEBUG_OBJECT (demux, "FLV header detected");
+  } else {
+    if (G_UNLIKELY (demux->strict)) {
+      GST_WARNING_OBJECT (demux, "invalid header tag detected");
+      ret = GST_FLOW_UNEXPECTED;
+      goto beach;
+    }
+  }
+
+  /* Jump over the 4 first bytes */
+  data += 4;
+
+  /* Now look at audio/video flags */
+  {
+    guint8 flags = data[0];
+
+    demux->has_video = demux->has_audio = FALSE;
+
+    if (flags & 1) {
+      GST_DEBUG_OBJECT (demux, "there is a video stream");
+      demux->has_video = TRUE;
+    }
+    if (flags & 4) {
+      GST_DEBUG_OBJECT (demux, "there is an audio stream");
+      demux->has_audio = TRUE;
+    }
+  }
+
+  /* We don't care about the rest */
+  demux->need_header = FALSE;
+
+beach:
+  return ret;
+}

Index: libmedia/gst/gstflvparse.h
===================================================================
RCS file: libmedia/gst/gstflvparse.h
diff -N libmedia/gst/gstflvparse.h
--- /dev/null   1 Jan 1970 00:00:00 -0000
+++ libmedia/gst/gstflvparse.h  21 Jan 2008 07:07:27 -0000      1.1
@@ -0,0 +1,42 @@
+/* GStreamer
+ * Copyright (C) <2007> Julien Moutte <address@hidden>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
+ * Boston, MA 02111-1307, USA.
+ */
+
+#ifndef __FLV_PARSE_H__
+#define __FLV_PARSE_H__
+
+#include "gstflvdemux.h"
+
+G_BEGIN_DECLS
+
+GstFlowReturn gst_flv_parse_tag_script (GstFLVDemux * demux,
+    const guint8 * data, size_t data_size);
+
+GstFlowReturn gst_flv_parse_tag_audio (GstFLVDemux * demux,
+    const guint8 * data, size_t data_size);
+
+GstFlowReturn gst_flv_parse_tag_video (GstFLVDemux * demux,
+    const guint8 * data, size_t data_size);
+
+GstFlowReturn gst_flv_parse_tag_type (GstFLVDemux * demux, const guint8 * data,
+    size_t data_size);
+
+GstFlowReturn gst_flv_parse_header (GstFLVDemux * demux, const guint8 * data,
+    size_t data_size);
+
+G_END_DECLS
+#endif /* __FLV_PARSE_H__ */



