[Jahshaka-cvs] SF.net SVN: openlibraries: [1320] trunk
From: <gl...@us...> - 2007-10-28 15:23:12
Revision: 1320 http://openlibraries.svn.sourceforge.net/openlibraries/?rev=1320&view=rev Author: glslang Date: 2007-10-28 08:23:10 -0700 (Sun, 28 Oct 2007) Log Message: ----------- + merge cleanups and tidy-ups Added Paths: ----------- trunk/doc/ trunk/doc/CodingGuidelines.html trunk/doc/Installation.html trunk/doc/ReleaseNotes.html trunk/doc/oml.odt trunk/src/openmedialib/plugins/directshow/ trunk/src/openmedialib/plugins/directshow/config.hpp trunk/src/openmedialib/plugins/directshow/directshow_plugin.cpp trunk/src/openmedialib/plugins/directshow/directshow_plugin.hpp trunk/src/openmedialib/plugins/directshow/directshow_plugin.opl trunk/src/openmedialib/plugins/directshow/directshow_vc8.vcproj trunk/src/openmedialib/plugins/sdl/ trunk/src/openmedialib/plugins/sdl/Makefile.am trunk/src/openmedialib/plugins/sdl/sdl_plugin.cpp trunk/src/openmedialib/plugins/sdl/sdl_plugin.opl trunk/src/openmedialib/plugins/sdl/sdl_vc8.vcproj trunk/src/openobjectlib/plugins/Collada/ trunk/src/openobjectlib/plugins/Collada/Collada_vc8.vcproj trunk/src/openobjectlib/plugins/Collada/Makefile.am trunk/src/openobjectlib/plugins/Collada/actions/ trunk/src/openobjectlib/plugins/Collada/actions/dae_parser_action.cpp trunk/src/openobjectlib/plugins/Collada/actions/dae_parser_action.hpp trunk/src/openobjectlib/plugins/Collada/collada_plugin.opl trunk/src/openobjectlib/plugins/Collada/config.hpp trunk/src/openobjectlib/plugins/Collada/dae.cpp trunk/src/openobjectlib/plugins/Collada/dae_content_handler_libxml.cpp trunk/src/openobjectlib/plugins/Collada/dae_content_handler_libxml.hpp trunk/src/openobjectlib/plugins/Collada/dae_content_handler_msxml.cpp trunk/src/openobjectlib/plugins/Collada/dae_content_handler_msxml.hpp trunk/src/openobjectlib/plugins/Collada/dae_plugin.cpp trunk/src/openobjectlib/plugins/Collada/dae_plugin.hpp trunk/src/openobjectlib/plugins/Collada/xml_value_tokenizer.hpp Added: trunk/doc/CodingGuidelines.html =================================================================== 
--- trunk/doc/CodingGuidelines.html (rev 0) +++ trunk/doc/CodingGuidelines.html 2007-10-28 15:23:10 UTC (rev 1320) @@ -0,0 +1,139 @@ +<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Strict//EN" "http://www.w3.org/TR/html4/strict.dtd"> +<html> +<head> + <title>OpenLibraries</title> + <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1"> + <meta name="GENERATOR" content="Quanta Plus"> +</head> +<body> + +<h1 align="center"> + OpenLibraries Coding Guidelines</h1> +<br> +<ol> + <li><a href="#intro">About</a></li> + <li><a href="#dependencies">Dependencies</a></li><li><a href="#repository">Repository Structure</a></li> + <li><a href="#start">Getting Started</a></li> + <li><a href="#trouble">Coding Conventions</a></li> + <li><a href="#faq">FAQ</a></li> +</ol> + +<h2><a name="intro">1. About</a></h2> + +<p>Welcome to OpenLibraries, a unified framework for developing real-time editing, + visual effects and rich-media applications.</p> + <p> + OpenLibraries may be used for any purpose, including commercial usage and +distribution. It is open-source software, released under the LGPL. OpenLibraries + are a true community effort and anyone is welcome to participate. This document + describes some of the guidelines that the OpenLibraries team uses when developing + the code. They are not exhaustive and they don't take away your freedom of expression + when writing code. This document does not tell you where to put your curly braces. + It simply describes a set of conventions that allow for proper usage of CVS + and development of OpenLibraries-aware code.</p> + +<p align="right"><a href="#top">back to top</a></p> + +<h2><a name="dependencies">2. Dependencies</a></h2> + <p> + The libraries depend heavily + on <a href="http://www.boost.org">Boost</a> and <a href="http://glew.sourceforge.net"> + GLEW</a>. RPMs for these dependencies are provided and they should be used. + The dependencies module in CVS contains the RPM spec files to build these RPMs.
+ However, other versions of Boost may be used for development; this is left to + the discretion of the programmer. It is recommended, however, that the latest versions + be used.</p> + <h3> + </h3> + +<p align="right"><a href="#top">back to top</a></p> + +<h2><a name="repository">3. Repository Structure</a></h2> + <p> + The repository directory structure has src, test and doc directories. The top level + directory is reserved for build scripts and global configuration files. + All development should happen in the src directory under a library-name directory + (e.g. src/openobjectlib). The developer manages the directory structure after that. + All unit tests go into the test directory under a library-name directory (e.g. + test/openobjectlib). The doc directory follows the same conventions and should + hold the user and developer guides for each of the libraries. Remember, lack of documentation + prevents adoption. Proper usage of CVS must be respected at all times. Code not + following these conventions will be deleted without warning. Also, never share + CVS sandboxes between multiple operating systems, due to well-known end-of-line + translation issues.</p> + <p style="text-align: right"> + <a href="#top">back to top</a></p> + <h2> + <a name="start">4. Getting Started</a></h2> + <p> + To build the libraries, Visual Studio solution files, XCode 2.1 projects, and autoconf-based + scripts are provided, depending on the platform. + The best way of getting started as a developer is to download the source code and + have a look at the test directory to see how the libraries are being used. It is our intention + to provide full developer documentation. Although much code has been written, some + of it is not fully featured and is effectively entry-point code for add-ons and + more advanced features (this is where you come + in!).
Eventually all of it will be written, hopefully with the + help of the community.</p> + <p> + <strong>Source: </strong>source code tarballs are available from the OpenLibraries + SourceForge web site, and the source code can also be downloaded from the CVS repository:</p> + <ol> + <li>cvs -z3 -d :pserver:ano...@cv...:/cvsroot/openlibraries login</li> + <li>cvs -z3 -d:pserver:ano...@cv...:/cvsroot/openlibraries co -P openlibraries</li> + </ol> + <p> + or, if running on Windows, with a CVS client such as TortoiseCVS.</p> + <p> + To minimise the + number of dependencies, and to avoid elusive ports of external code to different platforms, + the OpenLibraries strive to use as much native functionality as possible. Currently + there is no unified build process across platforms. On Linux, OpenLibraries make + use of the GOAT tools to manage the detection of platform characteristics and the + generation of Makefiles. On Windows, Visual Studio is used and different libraries + have different solution files. At this point they have to be built + separately, in order of dependencies. On OSX, XCode 2.1 projects exist and xcodebuild + is used. In the future, scripts will be made available to simplify the + build process, but they will still depend on the native tools of each platform.</p> + <h3> + </h3> + +<p align="right"><a href="#top">back to top</a></p> + <h2> + <a name="trouble">5. Coding Conventions</a></h2> + <p> + There are no strict coding conventions. The programmer is free to use their existing + conventions, with a few exceptions. All OpenLibraries-related source includes (by + means of #include) are of the form #include <name_of_library/whatever>. + This allows us to keep the number of include directories specified on the command + line to a bare minimum.
It avoids conflicts between files with the same name + and has the potential to speed up compilation times. Namespaces are of the form + namespace olib { namespace library_name {. The programmer may manage their other + namespaces under this structure at will. Visual Studio solution files also follow + a particular set of conventions. The VC runtime is always the DLL multithreaded + version and the configuration name is Multi-threaded Debug DLL or Multi-threaded + Release DLL. Please check the openobjectlib for examples.</p> + <h2> + <a name="faq">6. FAQ</a></h2> + +<ol> + <li><span style="color: #0000ff; text-decoration: underline">What's the relationship + between Jahshaka and OpenLibraries?</span> </li> +</ol> + +<h3> + What's the relationship between Jahshaka and the OpenLibraries?<a name="faq_1">:</a></h3> + We're it! The OpenLibraries project forms the technological foundation for Jahshaka.<br /> + <br /> + +<hr> +<p>Thanks for reading, we hope you enjoy the OpenLibraries project!</p> + +<p><font size=-1>Document version 1.0, November 2005</font></p> + +<p align="right"><a href="#top">back to top</a></p> + +</body> +</html> Property changes on: trunk/doc/CodingGuidelines.html ___________________________________________________________________ Name: svn:eol-style + native Added: trunk/doc/Installation.html =================================================================== --- trunk/doc/Installation.html (rev 0) +++ trunk/doc/Installation.html 2007-10-28 15:23:10 UTC (rev 1320) @@ -0,0 +1,144 @@ +<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Strict//EN" "http://www.w3.org/TR/html4/strict.dtd"> +<html> + <head> + <title>OpenLibraries</title> + <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1"> + <meta name="GENERATOR" content="Quanta Plus"> + </head> + <body> + <h1 align="center"> + OpenLibraries Installation from Source Notes</h1> + <br> + <OL> + <li> + <a href="#intro">About</a> + <li> + <span style="COLOR: #0000ff; TEXT-DECORATION: 
underline">Download</span><a href="#pack"> + and Installation</a></li></OL> + <h2><a name="intro">1. About</a></h2> + <p>Welcome to OpenLibraries, a unified framework for developing real-time editing, + visual effects and rich-media applications.</p> + <p> + OpenLibraries may be used for any purpose, including commercial + usage and distribution. It is open-source software, released under the LGPL. + OpenLibraries are a true community effort and anyone is welcome to participate.</p> + <p align="right"><a href="#top">back to top</a></p> + <h2><a name="pack">2. Download and Installation</a></h2> + <h3> + </h3> + <h3> + Downloading:</h3> + <p> + Installers for Windows and Linux distributions such as Fedora Core 4 and SUSE + 10 are available from both the OpenLibraries web site <a href="http://www.openlibraries.org"> + http://www.openlibraries.org</a> and from the OpenLibraries SourceForge web + site. Other distributions can use the src rpm or a tarball. The OpenLibraries + have hard (must be satisfied) dependencies on Boost (version 1.34) (<A href="http://www.boost.org">www.boost.org</A>) + and GLEW (<A href="http://glew.sourceforge.net">glew.sourceforge.net</A>). + These must be installed prior to OpenLibraries compilation. On Windows, OpenLibraries + assumes these libraries are installed in their default locations (as + set by the respective projects). For reference purposes, GLEW goes into the + appropriate Visual Studio directory and Boost is installed in C:\Boost.</p> + <P>The plugins that are being provided have dependencies on other libraries. 
A + complete list is (in no particular order):</P> + <OL> + <LI> + NVIDIA's Cg 1.4</LI> + <LI> + FFmpeg</LI> + <LI> + Autodesk FBX SDK 200508</LI> + <LI> + NVIDIA Gelato</LI> + <LI> + OpenAL</LI></OL> + <P>Some of these dependencies can be disabled by either disabling the respective + project in the sln file or undefining the preprocessor symbol that controls its + inclusion (to be moved into a user_config.hpp file).</P> + <P>OpenLibraries source code can be accessed from its SourceForge (<A href="http://www.sf.net">www.sf.net</A>) + repository. For anonymous access,</P> + <OL> + <LI> + cvs -z3 -d :pserver:ano...@cv...:/cvsroot/openlibraries login + <LI> + cvs -z3 -d:pserver:ano...@cv...:/cvsroot/openlibraries co -P + openlibraries + </LI> + </OL> + <P>or, if running on Windows, with a CVS client such as TortoiseCVS.</P> + <h3> + Compilation <a name="start_install">notes:</a></h3> + <p><b>Windows:</b> The current build system on Windows is based on Microsoft Visual + Studio. The official version of the Visual Studio compiler supported by the + OpenLibraries is 7.1 (or .NET 2003). However, OpenLibraries also has build support + for VC8 and Intel 9. There is a VC sln file for each of the libraries; this is + easy to manage per developer and minimises CVS conflicts. The + libraries depend on each other and therefore must be built in + the appropriate order. In the future, a build script based on + VCBuildTool will be provided to automate this procedure. The current order to build the + libraries is the following:</p> + <OL> + <LI> + OpenPluginLib + <LI> + OpenImageLib + <LI> + OpenMediaLib + <LI> + OpenObjectLib + <LI> + OpenEffectsLib + <LI> + OpenAssetLib</LI></OL> + <P>The OpenMediaLib has further dependencies on FFmpeg (<A href="http://ffmpeg.sourceforge.net">ffmpeg.sourceforge.net</A>) and + OpenAL (<A href="http://www.openal.org">www.openal.org</A>). 
OpenEffectsLib and + OpenObjectLib also depend on NVIDIA's Cg toolkit (<A href="http://developer.nvidia.com/page/cg_main.html">developer.nvidia.com/page/cg_main.html</A>). + The version is 1.4 (or any of its release candidates).</P> + <P>Installation on Windows is based on NSIS (<A href="http://nsis.sourceforge.net">nsis.sourceforge.net</A>). + Unfortunately, the NSIS build is still not functional (it will be very soon), so + installation is a bit broken at the moment. However, to run the + unit tests (and this includes the OpenLibraries media player), all of the + libraries' main *.dlls need to be copied into the working directory of the test + or somewhere in your PATH variable. Each plugin has its own XML description + file. This is due to how the build system is set up and is not a limitation of + OpenPluginLib itself. Nothing prevents the use of a single XML file, and binary + installation will rely on a single XML file on Windows in the future. For each + unit test there is a plugins directory into which the plugins' *.opl files and + *.dll files must be copied. This is needed since the unit tests on Windows rely + on a particular path to discover and initialise the plugins. Applications based + on the OpenLibraries will use the installed files, and plugin discovery will be + based on the installation directory.</P> + <p><b>Linux:</b> The Linux build uses the standard autoconf and automake tools. + Installations (from CVS) must first call the bootstrap script, followed by + configure. To compile, run make; to install, run make install. + All the usual configure script options are available. Some custom options may + have to be specified depending on your Linux distribution and package setup; + the relevant options are --with-boostprefix, --with-boostversion and --with-boosttoolset. + For example, with Boost 1.34 in /usr/local and gcc, use --with-boostprefix=/usr/local + --with-boostversion=1_34 --with-boosttoolset=gcc. A make check + will compile the unit tests. 
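Put together, a from-CVS build on Linux might look like the following sequence; the paths and versions are illustrative only (Boost 1.34 in /usr/local with gcc, per the example above), and should be adjusted to your own distribution and package setup:

```shell
# Illustrative sequence only -- adjust the --with-boost* values to your setup.
./bootstrap                          # only needed for CVS checkouts
./configure --with-boostprefix=/usr/local \
            --with-boostversion=1_34 \
            --with-boosttoolset=gcc
make                                 # compile
make check                           # build the unit tests
make install                         # install (may require root)
```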
You can test the libraries' functionality by + running these tests.</p> + <P><b>OSX:</b> Not all OpenLibraries are currently supported on Tiger (10.4). + However, the build system is also based on XCode 2.2, and full support for + the OSX platform is planned.</P> + <P> + <br> + Currently, in CVS you can find a unit test used to exercise the OpenMediaLib + player functionality. It is called store and can be found in test/openmedialib. + The syntax for the player is the following: store < input > < store + > where input is any string which can be mapped to an installed plugin (for + example, file.avi, file.mpg, /path/*.jpg/sequence: etc) and store is any string + which can be mapped to an installed store (for example, glew:, caca:, openal:, + file.jpg/sequence: etc).</P> + <P>There is a more comprehensive player example called qtplayer, which has to be + manually built and uses the Qt toolkit. To build it, change + directory to test/openmedialib/qtplayer and run qmake && make. 
The + usage is simply ./qtplayer [ < input > ].</P> + <P> + <hr> + <P></P> + <p>Thanks for reading, we hope you enjoy the OpenLibraries project!</p> + <p><font size="-1">Document version 1.0, 11 January 2006</font></p> + <p align="right"><a href="#top">back to top</a></p> + </body> +</html> Property changes on: trunk/doc/Installation.html ___________________________________________________________________ Name: svn:eol-style + native Added: trunk/doc/ReleaseNotes.html =================================================================== --- trunk/doc/ReleaseNotes.html (rev 0) +++ trunk/doc/ReleaseNotes.html 2007-10-28 15:23:10 UTC (rev 1320) @@ -0,0 +1,142 @@ +<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Strict//EN" "http://www.w3.org/TR/html4/strict.dtd"> +<html> +<head> + <title>OpenLibraries</title> + <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1"> + <meta name="GENERATOR" content="Quanta Plus"> +</head> +<body> + +<h1 align="center"> + OpenLibraries v0.1.0 Release Notes</h1> +<br> +<ol> + <li><a href="#intro">About</a></li> + <li><span style="color: #0000ff; text-decoration: underline">Download</span><a href="#pack"> + and Installation</a></li><li><a href="#start">Getting Started</a></li><li><a href="#trouble">Troubleshooting</a></li> + <li><a href="#faq">FAQ</a></li> +</ol> + +<h2><a name="intro">1. About</a></h2> + +<p>Welcome to OpenLibraries, a unified framework for developing real-time editing, + visual effects and rich-media applications.</p> + <p> + OpenLibraries may be used for any purpose, including commercial usage and +distribution. It is open-source software, released under the LGPL. OpenLibraries + are a true community effort and anyone is welcome to participate.</p> + +<p align="right"><a href="#top">back to top</a></p> + +<h2><a name="pack">2. 
Download and Installation</a></h2> + <h3> + </h3> + +<h3> + Downloading:</h3> + <p> + Installers for Windows and Linux distributions such as Fedora Core 4 and SUSE 10 + are available from both the OpenLibraries web site <a href="http://www.openlibraries.org"> + http://www.openlibraries.org</a> and from the OpenLibraries SourceForge web + site. Other distributions can use the src rpm or a tarball.</p> + <h3> + <a name="start_install">Installation notes:</a></h3> + +<p><b>Windows:</b> the .msi installer handles both the runtime and development environment + needed to run applications based on the OpenLibraries. Additionally, it sets your + PATH environment variable. The installer is self-contained and doesn't install + any files outside its installation directory.</p> + +<p><b>Linux:</b> the Linux RPMs can be used to install both the runtime and development + versions. Additionally, an optional RPM is available that includes sample media + files.</p> + +<p><b>OSX:</b> there is no installer for OSX, although the OpenLibraries are supported + on Tiger (10.4).</p> + +<p> + <strong>Source: </strong>source code tarballs are available from the OpenLibraries + SourceForge web site, and the source code can also be downloaded from the CVS repository:</p> + <ol> + <li>cvs -z3 -d :pserver:ano...@cv...:/cvsroot/openlibraries login</li> + <li>cvs -z3 -d:pserver:ano...@cv...:/cvsroot/openlibraries co -P openlibraries</li> + </ol> + <p> + or, if running on Windows, with a CVS client such as TortoiseCVS.</p> + <p> + To build the libraries, Visual Studio solution files, XCode 2.1 projects, and autoconf-based + scripts are provided, depending on the platform. The libraries depend heavily + on <a href="http://www.boost.org">Boost</a> and <a href="http://glew.sourceforge.net"> + GLEW</a>.</p> + <p> + </p> + <p> + </p> + <p> + </p> + + +<p align="right"><a href="#top">back to top</a></p> + +<h2><a name="start">3. 
Getting Started</a></h2> + <p> + The best way of getting started as a developer is to download the source code and + have a look at the test directory to see how the libraries are being used. It is our intention + to provide full developer documentation. Although much code has been written, some + of it is not fully featured and is effectively entry-point code for add-ons and + more advanced features. Eventually all of it will be written, hopefully with the + help of the community.</p> + <h3> + </h3> + +<p align="right"><a href="#top">back to top</a></p> + +<h2><a name="trouble">4. Troubleshooting</a></h2> + +<h3><a name="trouble_bugt">The Bug Tracker</a></h3> + +<p> + All bug reports will be followed up and taken very seriously. There is no excuse for + not reporting a bug. If something doesn't work, tell us using the bug tracker on SourceForge + or in the OpenLibraries web site forums.</p> + <h3> + The Fixed Function Pipeline</h3> + <p> + The OpenLibraries were primarily designed to take advantage of modern GPUs. +Most of their functionality is based on programmable shaders, most notably by means of + the OpenGL Shading Language and, in the future, Cg and CgFX. However, there is initial + support for falling back to the fixed function pipeline when possible. Initial means + that some functionality may not be available or may simply be buggy. Please help us fix these problems by submitting patches (the best way of solving bugs), or your original + files or test samples that cause the erratic behaviour.</p> + <h3> + </h3> + <h3> + </h3> + <h3> + </h3> + <h3> + </h3> + +<p align="right"><a href="#top">back to top</a></p> + +<h2><a name="faq">5. FAQ</a></h2> + +<ol> + <li><span style="color: #0000ff; text-decoration: underline">What's the relationship + between Jahshaka and OpenLibraries?</span> </li> +</ol> + +<h3> + What's the relationship between Jahshaka and the OpenLibraries?<a name="faq_1">:</a></h3> + We're it! 
The OpenLibraries project forms the technological foundation for Jahshaka.<br /> + <br /> + +<hr> +<p>Thanks for reading, we hope you enjoy the OpenLibraries project!</p> + +<p><font size=-1>Document version 1.0, November 2005</font></p> + +<p align="right"><a href="#top">back to top</a></p> + +</body> +</html> Property changes on: trunk/doc/ReleaseNotes.html ___________________________________________________________________ Name: svn:eol-style + native Added: trunk/doc/oml.odt =================================================================== (Binary files differ) Property changes on: trunk/doc/oml.odt ___________________________________________________________________ Name: svn:mime-type + application/octet-stream Added: trunk/src/openmedialib/plugins/directshow/config.hpp =================================================================== --- trunk/src/openmedialib/plugins/directshow/config.hpp (rev 0) +++ trunk/src/openmedialib/plugins/directshow/config.hpp 2007-10-28 15:23:10 UTC (rev 1320) @@ -0,0 +1,24 @@ +// config.hpp for filesystem storage plugin +// +// Copyright (C) 2005-2006 VM Inc. +// Released under the LGPL. +// For more information, see http://www.openlibraries.org. 
+ +#ifndef FILESYSTEM_STORAGE_PLUGIN_CONFIG_INC_ +#define FILESYSTEM_STORAGE_PLUGIN_CONFIG_INC_ + +#ifdef WIN32 +# ifdef FILESYSTEM_STORAGE_PLUGIN_EXPORTS +# define AL_FILESYSTEM_DECLSPEC __declspec( dllexport ) +# else +# define AL_FILESYSTEM_DECLSPEC __declspec( dllimport ) +# endif +#else +# ifdef FILESYSTEM_STORAGE_PLUGIN_EXPORTS +# define AL_FILESYSTEM_DECLSPEC extern +# else +# define AL_FILESYSTEM_DECLSPEC +# endif +#endif + +#endif Property changes on: trunk/src/openmedialib/plugins/directshow/config.hpp ___________________________________________________________________ Name: svn:eol-style + native Added: trunk/src/openmedialib/plugins/directshow/directshow_plugin.cpp =================================================================== --- trunk/src/openmedialib/plugins/directshow/directshow_plugin.cpp (rev 0) +++ trunk/src/openmedialib/plugins/directshow/directshow_plugin.cpp 2007-10-28 15:23:10 UTC (rev 1320) @@ -0,0 +1,1798 @@ +// directshow_plugin.cpp +// +// Copyright (C) 2005-2006 VM Inc. +// Released under the LGPL. +// For more information, see http://www.openlibraries.org. 
+ +#pragma warning ( push ) +#pragma warning ( disable: 4251 ) + +#include <deque> +#include <boost/lexical_cast.hpp> +#include <openmedialib/plugins/directshow/directshow_plugin.hpp> + +//-------------------------------------------------------------------------- + +namespace olib { namespace openmedialib { namespace ml { + +namespace +{ + void parse_BITMAPINFOHEADER(const BITMAPINFOHEADER& bmi) + { +#if defined(_DEBUG) + try + { + OutputDebugStringA((std::string("BITMAPINFOHEADER.biSize: ") + boost::lexical_cast<std::string>(bmi.biSize) + std::string("\n")).c_str()); + OutputDebugStringA((std::string("BITMAPINFOHEADER.biWidth: ") + boost::lexical_cast<std::string>(bmi.biWidth) + std::string("\n")).c_str()); + OutputDebugStringA((std::string("BITMAPINFOHEADER.biHeight: ") + boost::lexical_cast<std::string>(bmi.biHeight) + std::string("\n")).c_str()); + OutputDebugStringA((std::string("BITMAPINFOHEADER.biPlanes: ") + boost::lexical_cast<std::string>(bmi.biPlanes) + std::string("\n")).c_str()); + OutputDebugStringA((std::string("BITMAPINFOHEADER.biBitCount: ") + boost::lexical_cast<std::string>(bmi.biBitCount) + std::string("\n")).c_str()); + OutputDebugStringA((std::string("BITMAPINFOHEADER.biCompression: 0x") + boost::lexical_cast<std::string>(bmi.biCompression) + std::string("\n")).c_str()); + OutputDebugStringA((std::string("BITMAPINFOHEADER.biSizeImage: ") + boost::lexical_cast<std::string>(bmi.biSizeImage) + std::string("\n")).c_str()); + OutputDebugStringA((std::string("BITMAPINFOHEADER.biXPelsPerMeter: ") + boost::lexical_cast<std::string>(bmi.biXPelsPerMeter) + std::string("\n")).c_str()); + OutputDebugStringA((std::string("BITMAPINFOHEADER.biYPelsPerMeter: ") + boost::lexical_cast<std::string>(bmi.biYPelsPerMeter) + std::string("\n")).c_str()); + OutputDebugStringA((std::string("BITMAPINFOHEADER.biClrUsed: ") + boost::lexical_cast<std::string>(bmi.biClrUsed) + std::string("\n")).c_str()); + OutputDebugStringA((std::string("BITMAPINFOHEADER.biClrImportant: 
") + boost::lexical_cast<std::string>(bmi.biClrImportant) + std::string("\n")).c_str()); + } + catch(const boost::bad_lexical_cast&) + { + } +#endif + } + + void parse_VIDEOINFOHEADER(const VIDEOINFOHEADER& videoinfo) + { +#if defined(_DEBUG) + try + { + OutputDebugStringA((std::string("VIDEOINFOHEADER.rcSource: left=") + boost::lexical_cast<std::string>(videoinfo.rcSource.left) + std::string(" top=") + boost::lexical_cast<std::string>(videoinfo.rcSource.top) + std::string(" right=") + boost::lexical_cast<std::string>(videoinfo.rcSource.right) + std::string(" bottom=") + boost::lexical_cast<std::string>(videoinfo.rcSource.bottom) + std::string("\n")).c_str()); + OutputDebugStringA((std::string("VIDEOINFOHEADER.rcTarget: left=") + boost::lexical_cast<std::string>(videoinfo.rcTarget.left) + std::string(" top=") + boost::lexical_cast<std::string>(videoinfo.rcTarget.top) + std::string(" right=") + boost::lexical_cast<std::string>(videoinfo.rcTarget.right) + std::string(" bottom=") + boost::lexical_cast<std::string>(videoinfo.rcTarget.bottom) + std::string("\n")).c_str()); + OutputDebugStringA((std::string("VIDEOINFOHEADER.dwBitRate: ") + boost::lexical_cast<std::string>(videoinfo.dwBitRate) + std::string("\n")).c_str()); + OutputDebugStringA((std::string("VIDEOINFOHEADER.dwBitErrorRate: ") + boost::lexical_cast<std::string>(videoinfo.dwBitErrorRate) + std::string("\n")).c_str()); + OutputDebugStringA((std::string("VIDEOINFOHEADER.AvgTimePerFrame: ") + boost::lexical_cast<std::string>(videoinfo.AvgTimePerFrame) + std::string("\n")).c_str()); + } + catch(const boost::bad_lexical_cast&) + { + } + + parse_BITMAPINFOHEADER(videoinfo.bmiHeader); +#endif + } + + void parse_VIDEOINFOHEADER2(const VIDEOINFOHEADER2& videoinfo2) + { +#if defined(_DEBUG) + try + { + OutputDebugStringA((std::string("VIDEOINFOHEADER2.rcSource: left=") + boost::lexical_cast<std::string>(videoinfo2.rcSource.left) + std::string(" top=") + boost::lexical_cast<std::string>(videoinfo2.rcSource.top) + 
+		std::string(" right=") + boost::lexical_cast<std::string>(videoinfo2.rcSource.right) + std::string(" bottom=") + boost::lexical_cast<std::string>(videoinfo2.rcSource.bottom) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("VIDEOINFOHEADER2.rcTarget: left=") + boost::lexical_cast<std::string>(videoinfo2.rcTarget.left) + std::string(" top=") + boost::lexical_cast<std::string>(videoinfo2.rcTarget.top) + std::string(" right=") + boost::lexical_cast<std::string>(videoinfo2.rcTarget.right) + std::string(" bottom=") + boost::lexical_cast<std::string>(videoinfo2.rcTarget.bottom) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("VIDEOINFOHEADER2.dwBitRate: ") + boost::lexical_cast<std::string>(videoinfo2.dwBitRate) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("VIDEOINFOHEADER2.dwBitErrorRate: ") + boost::lexical_cast<std::string>(videoinfo2.dwBitErrorRate) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("VIDEOINFOHEADER2.AvgTimePerFrame: ") + boost::lexical_cast<std::string>(videoinfo2.AvgTimePerFrame) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("VIDEOINFOHEADER2.dwInterlaceFlags: 0x") + boost::lexical_cast<std::string>(videoinfo2.dwInterlaceFlags) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("VIDEOINFOHEADER2.dwCopyProtectFlags: 0x") + boost::lexical_cast<std::string>(videoinfo2.dwCopyProtectFlags) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("VIDEOINFOHEADER2.dwPictAspectRatioX: ") + boost::lexical_cast<std::string>(videoinfo2.dwPictAspectRatioX) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("VIDEOINFOHEADER2.dwPictAspectRatioY: ") + boost::lexical_cast<std::string>(videoinfo2.dwPictAspectRatioY) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("VIDEOINFOHEADER2.dwControlFlags: 0x") + boost::lexical_cast<std::string>(videoinfo2.dwControlFlags) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("VIDEOINFOHEADER2.dwReserved2: 0x") + boost::lexical_cast<std::string>(videoinfo2.dwReserved2) + std::string("\n")).c_str());
+	}
+	catch(const boost::bad_lexical_cast&)
+	{
+	}
+
+	parse_BITMAPINFOHEADER(videoinfo2.bmiHeader);
+#endif
+}
+
+void parse_WAVEFORMATEX(const WAVEFORMATEX& waveformat)
+{
+#if defined(_DEBUG)
+	try
+	{
+		OutputDebugStringA((std::string("WAVEFORMATEX.wFormatTag: ") + boost::lexical_cast<std::string>(waveformat.wFormatTag) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("WAVEFORMATEX.nChannels: ") + boost::lexical_cast<std::string>(waveformat.nChannels) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("WAVEFORMATEX.nSamplesPerSec: ") + boost::lexical_cast<std::string>(waveformat.nSamplesPerSec) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("WAVEFORMATEX.nAvgBytesPerSec: ") + boost::lexical_cast<std::string>(waveformat.nAvgBytesPerSec) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("WAVEFORMATEX.nBlockAlign: ") + boost::lexical_cast<std::string>(waveformat.nBlockAlign) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("WAVEFORMATEX.wBitsPerSample: ") + boost::lexical_cast<std::string>(waveformat.wBitsPerSample) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("WAVEFORMATEX.cbSize: ") + boost::lexical_cast<std::string>(waveformat.cbSize) + std::string("\n")).c_str());
+	}
+	catch(const boost::bad_lexical_cast&)
+	{
+	}
+#endif
+}
+
+void parse_AM_MEDIA_TYPE(const AM_MEDIA_TYPE& mediatype)
+{
+#if defined(_DEBUG)
+	if(mediatype.majortype == MEDIATYPE_Video)
+	{
+		OutputDebugStringA("AM_MEDIA_TYPE.majortype: MEDIATYPE_Video\n");
+
+		if(mediatype.formattype == FORMAT_VideoInfo)
+		{
+			OutputDebugStringA("AM_MEDIA_TYPE.formattype: FORMAT_VideoInfo\n");
+			if( (mediatype.pbFormat)
+			 && (mediatype.cbFormat >= sizeof(VIDEOINFOHEADER)) )
+			{
+				parse_VIDEOINFOHEADER(*reinterpret_cast<VIDEOINFOHEADER*>(mediatype.pbFormat));
+			}
+		}
+		else if(mediatype.formattype == FORMAT_VideoInfo2)
+		{
+			OutputDebugStringA("AM_MEDIA_TYPE.formattype: FORMAT_VideoInfo2\n");
+			if( (mediatype.pbFormat)
+			 && (mediatype.cbFormat >= sizeof(VIDEOINFOHEADER2)) )
+			{
+				parse_VIDEOINFOHEADER2(*reinterpret_cast<VIDEOINFOHEADER2*>(mediatype.pbFormat));
+			}
+		}
+		else if(mediatype.formattype == FORMAT_DvInfo)
+		{
+		}
+		else if(mediatype.formattype == FORMAT_MPEG2Video)
+		{
+		}
+		else if(mediatype.formattype == FORMAT_MPEGStreams)
+		{
+		}
+		else if(mediatype.formattype == FORMAT_MPEGVideo)
+		{
+		}
+	}
+	else if(mediatype.majortype == MEDIATYPE_Audio)
+	{
+		OutputDebugStringA("AM_MEDIA_TYPE.majortype: MEDIATYPE_Audio\n");
+
+		if(mediatype.formattype == FORMAT_WaveFormatEx)
+		{
+			OutputDebugStringA("AM_MEDIA_TYPE.formattype: FORMAT_WaveFormatEx\n");
+			if( (mediatype.pbFormat)
+			 && (mediatype.cbFormat >= sizeof(WAVEFORMATEX)) )
+				parse_WAVEFORMATEX(*reinterpret_cast<WAVEFORMATEX*>(mediatype.pbFormat));
+		}
+	}
+
+	try
+	{
+		OutputDebugStringA((std::string("AM_MEDIA_TYPE.subtype: 0x") + boost::lexical_cast<std::string>((void*)*((long*)&mediatype.subtype)) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("AM_MEDIA_TYPE.bFixedSizeSamples: ") + boost::lexical_cast<std::string>(mediatype.bFixedSizeSamples) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("AM_MEDIA_TYPE.bTemporalCompression: ") + boost::lexical_cast<std::string>(mediatype.bTemporalCompression) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("AM_MEDIA_TYPE.lSampleSize: ") + boost::lexical_cast<std::string>(mediatype.lSampleSize) + std::string("\n")).c_str());
+		OutputDebugStringA((std::string("AM_MEDIA_TYPE.cbFormat: ") + boost::lexical_cast<std::string>(mediatype.cbFormat) + std::string("\n")).c_str());
+	}
+	catch(const boost::bad_lexical_cast&)
+	{
+	}
+#endif
+}
+
+//---------------------------------------------------------------------------------------------
+
+opl::string fourcc_descriptor(unsigned long fourcc_format)
+{
+	opl::string format;
+	format += (char)(fourcc_format & 0x000000FF);
+	format += (char)((fourcc_format & 0x0000FF00) >> 8);
+	format += (char)((fourcc_format & 0x00FF0000) >> 16);
+	format += (char)((fourcc_format & 0xFF000000) >> 24);
+	return format;
+}
+
+// FOURCC to openimagelib format conversion
+// Format converter base class...
+typedef boost::shared_ptr<class format_converter> format_converter_ptr;
+
+class format_converter
+{
+public:
+	explicit format_converter(unsigned long fourcc_format);
+	virtual ~format_converter();
+
+	opl::string get_source_format() const;
+	virtual opl::wstring get_target_format() const = 0;
+	virtual void convert(unsigned char* src_buffer, unsigned long src_buffer_size, image_type_ptr image) const = 0;
+
+private:
+	unsigned long fourcc_format_;
+};
+
+format_converter::format_converter(unsigned long fourcc_format)
+	: fourcc_format_(fourcc_format)
+{
+}
+
+format_converter::~format_converter()
+{
+}
+
+opl::string format_converter::get_source_format() const
+{
+	return fourcc_descriptor(fourcc_format_);
+}
+
+// FOURCC Y41P converter...
+class Y41P_converter : public format_converter
+{
+public:
+	explicit Y41P_converter();
+	~Y41P_converter();
+
+	opl::wstring get_target_format() const;
+	void convert(unsigned char* src_buffer, unsigned long src_buffer_size, image_type_ptr image) const;
+};
+
+typedef boost::shared_ptr<Y41P_converter> Y41P_converter_ptr;
+Y41P_converter::Y41P_converter()
+	: format_converter( FCC('Y41P') )
+{
+}
+
+Y41P_converter::~Y41P_converter()
+{
+}
+
+opl::wstring Y41P_converter::get_target_format() const
+{
+	return opl::wstring(L"yuv411p");
+}
+
+void Y41P_converter::convert(unsigned char* src_buffer, unsigned long src_buffer_size, image_type_ptr image) const
+{
+#if 1
+	// Macropixel: u0 y0 v0 y1 u4 y2 v4 y3 y4 y5 y6 y7
+	const unsigned long bytes_per_macropixel = 12;
+	const unsigned long luma_per_macropixel = 8;
+	unsigned long num_macropixels = src_buffer_size/bytes_per_macropixel;
+	unsigned char* src = src_buffer;
+	unsigned char* dest_Y = (unsigned char*)image->data(0);
+	unsigned char* dest_U = (unsigned char*)image->data(1);
+	unsigned char* dest_V = (unsigned char*)image->data(2);
+
+	for(unsigned long idx = 0; idx < num_macropixels; idx++)
+	{
+		*dest_U++ = *src++;
+		*dest_Y++ = *src++;
+		*dest_V++ = *src++;
+		*dest_Y++ = *src++;
+		*dest_U++ = *src++;
+		*dest_Y++ = *src++;
+		*dest_V++ = *src++;
+		*dest_Y++ = *src++;
+		*dest_Y++ = *src++;
+		*dest_Y++ = *src++;
+		*dest_Y++ = *src++;
+		*dest_Y++ = *src++;
+	}
+#else
+	memcpy( (unsigned char*)(image->data(0)),
+		src_buffer,
+		image->width()*image->height() );
+	memset((unsigned char*)(image->data(1)), 128, image->width(1)*image->height(1));
+	memset((unsigned char*)(image->data(2)), 128, image->width(2)*image->height(2));
+#endif
+}
+
+// FOURCC YV12 converter...
+class YV12_converter : public format_converter
+{
+public:
+	explicit YV12_converter();
+	~YV12_converter();
+
+	opl::wstring get_target_format() const;
+	void convert(unsigned char* src_buffer, unsigned long src_buffer_size, image_type_ptr image) const;
+};
+
+typedef boost::shared_ptr<YV12_converter> YV12_converter_ptr;
+YV12_converter::YV12_converter()
+	: format_converter( FCC('YV12') )
+{
+}
+
+YV12_converter::~YV12_converter()
+{
+}
+
+opl::wstring YV12_converter::get_target_format() const
+{
+	return opl::wstring(L"yuv420p");
+}
+
+void YV12_converter::convert(unsigned char* src_buffer, unsigned long src_buffer_size, image_type_ptr image) const
+{
+	// Note: YV12 plane order is Y, V then U.
+	// Is this the case for a yuv420p olib image?
+	memcpy( (unsigned char*)(image->data(0)),
+		src_buffer,
+		image->size() );
+}
+
+// RGB24 converter...
+class RGB24_converter : public format_converter
+{
+public:
+	explicit RGB24_converter();
+	~RGB24_converter();
+
+	opl::wstring get_target_format() const;
+	void convert(unsigned char* src_buffer, unsigned long src_buffer_size, image_type_ptr image) const;
+};
+
+typedef boost::shared_ptr<RGB24_converter> RGB24_converter_ptr;
+RGB24_converter::RGB24_converter()
+	: format_converter( 0x00000000/*FCC('RGB ')*/ )
+{
+}
+
+RGB24_converter::~RGB24_converter()
+{
+}
+
+opl::wstring RGB24_converter::get_target_format() const
+{
+	// http://msdn2.microsoft.com/en-us/library/ms787838.aspx
+	return opl::wstring(L"b8g8r8");
+}
+
+void RGB24_converter::convert(unsigned char* src_buffer, unsigned long src_buffer_size, image_type_ptr image) const
+{
+	memcpy( (unsigned char*)(image->data(0)),
+		src_buffer,
+		image->size() );
+}
+
+// FOURCC RGB32 converter...
+class RGBA32_converter : public format_converter
+{
+public:
+	explicit RGBA32_converter();
+	~RGBA32_converter();
+
+	opl::wstring get_target_format() const;
+	void convert(unsigned char* src_buffer, unsigned long src_buffer_size, image_type_ptr image) const;
+};
+
+typedef boost::shared_ptr<RGBA32_converter> RGBA32_converter_ptr;
+RGBA32_converter::RGBA32_converter()
+	: format_converter( FCC('RGBA') )
+{
+}
+
+RGBA32_converter::~RGBA32_converter()
+{
+}
+
+opl::wstring RGBA32_converter::get_target_format() const
+{
+	// http://msdn2.microsoft.com/en-us/library/ms787838.aspx
+	return opl::wstring(L"b8g8r8a8");
+}
+
+void RGBA32_converter::convert(unsigned char* src_buffer, unsigned long src_buffer_size, image_type_ptr image) const
+{
+	memcpy( (unsigned char*)(image->data(0)),
+		src_buffer,
+		image->size() );
+}
+
+// FOURCC to oil format converter factory function
+format_converter_ptr get_format_converter(unsigned long fourcc_format)
+{
+	switch(fourcc_format)
+	{
+	case 0x00000000: // BI_RGB
+		// return format_converter_ptr(new RGB24_converter());
+	//case FCC('RGBA'):
+		return format_converter_ptr(new RGBA32_converter());
+	case FCC('Y41P'):
+		return format_converter_ptr(new Y41P_converter());
+	case FCC('YV12'):
+		return format_converter_ptr(new YV12_converter());
+	default:
+		return format_converter_ptr();
+	}
+}
+
+//---------------------------------------------------------------------------------------------
+
+// Move these to anonymous namespace. No need to be part of class.
+HRESULT get_pin(IBaseFilter *pFilter, PIN_DIRECTION direction, IPin** ppPin)
+{
+	if(!pFilter)
+		return E_POINTER;
+	if(!ppPin)
+		return E_POINTER;
+
+	CComPtr<IEnumPins> pEnumPins;
+	HRESULT hr = pFilter->EnumPins(&pEnumPins);
+	if(FAILED(hr))
+		return hr;
+
+	CComPtr<IPin> pPin;
+	while(pEnumPins->Next(1, &pPin.p, 0) == S_OK)
+	{
+		PIN_DIRECTION dir;
+		hr = pPin->QueryDirection(&dir);
+		if(FAILED(hr))
+			break;
+
+		if(dir == direction)
+		{
+			// Hand the caller an owning reference rather than the raw
+			// pointer still held by the local CComPtr.
+			*ppPin = pPin.Detach();
+			break;
+		}
+
+		// Release before the next Next() call refills pPin.p.
+		pPin.Release();
+	}
+
+	return hr;
+}
+
+HRESULT get_pins(IBaseFilter *pFilter, PIN_DIRECTION direction, std::deque< CComPtr<IPin> >& pin_container)
+{
+	if(!pFilter)
+		return E_POINTER;
+
+	pin_container.clear();
+
+	CComPtr<IEnumPins> pEnumPins;
+	HRESULT hr = pFilter->EnumPins(&pEnumPins);
+	if(FAILED(hr))
+		return hr;
+
+	CComPtr<IPin> pPin;
+	while(pEnumPins->Next(1, &pPin.p, 0) == S_OK)
+	{
+		PIN_DIRECTION dir;
+		hr = pPin->QueryDirection(&dir);
+		if(FAILED(hr))
+			break;
+
+		if(dir == direction)
+			pin_container.push_back(pPin);
+
+		// Release before the next Next() call refills pPin.p.
+		pPin.Release();
+	}
+
+	return hr;
+}
+
+//---------------------------------------------------------------------------------------------
+
+#if defined(_DEBUG)
+void print_native_video_format(const opl::wstring& uri)
+{
+	CComPtr<IMediaDet> pIMediaDet;
+	HRESULT hr = pIMediaDet.CoCreateInstance(CLSID_MediaDet, NULL, CLSCTX_INPROC_SERVER);
+	if(SUCCEEDED(hr))
+		hr = pIMediaDet->put_Filename(CComBSTR(uri.c_str()));	// a real BSTR, not a raw pointer cast
+
+	long num_output_streams = 0;
+	if(SUCCEEDED(hr))
+		hr = pIMediaDet->get_OutputStreams(&num_output_streams);
+
+	if(SUCCEEDED(hr))
+	{
+		for(long idx = 0; idx < num_output_streams; idx++)
+		{
+			hr = pIMediaDet->put_CurrentStream(idx);
+			if(FAILED(hr))
+				continue;
+
+			AM_MEDIA_TYPE mt;
+			hr = pIMediaDet->get_StreamMediaType(&mt);
+			if(FAILED(hr))
+				continue;
+
+			if(mt.formattype == FORMAT_VideoInfo)
+				OutputDebugStringA((std::string("Native FOURCC format: ") + fourcc_descriptor((*reinterpret_cast<VIDEOINFOHEADER*>(mt.pbFormat)).bmiHeader.biCompression).c_str() + std::string("\n")).c_str());
+
+			FreeMediaType(mt);	// get_StreamMediaType allocates the format block
+		}
+	}
+}
+
+void print_num_filters_in_filter_graph(const CComPtr<IGraphBuilder>& pGraph)
+{
+	CComPtr<IEnumFilters> pEnumFilters;
+	HRESULT hr = pGraph->EnumFilters(&pEnumFilters);
+	if(SUCCEEDED(hr))
+	{
+		const int FILTERS_TO_REQUEST = 100;
+		CComPtr<IBaseFilter> pFiltersArray[FILTERS_TO_REQUEST];
+		ULONG num_got = 0;
+		hr = pEnumFilters->Next(FILTERS_TO_REQUEST, &pFiltersArray[0], &num_got);
+		if(SUCCEEDED(hr))
+		{
+			try
+			{
+				OutputDebugStringA((std::string("Filters in filter graph: ") + boost::lexical_cast<std::string>(num_got) + std::string("\n")).c_str());
+			}
+			catch(const boost::bad_lexical_cast&)
+			{
+			}
+		}
+	}
+}
+
+#endif
+}
+
+//#################################################################################################
+
+class SampleGrabberCallbackDelegate : public ISampleGrabberCB
+{
+public:
+	explicit SampleGrabberCallbackDelegate(directshow_input* input);
+	// Virtual: Release() deletes derived instances through this base class.
+	virtual ~SampleGrabberCallbackDelegate();
+
+	// IUnknown overrides
+	HRESULT STDMETHODCALLTYPE QueryInterface( REFIID iid, void** ppvObject );
+	ULONG STDMETHODCALLTYPE AddRef();
+	ULONG STDMETHODCALLTYPE Release();
+
+	// ISampleGrabberCB overrides
+	virtual HRESULT STDMETHODCALLTYPE SampleCB(double SampleTime, IMediaSample *pSample) = 0;
+	virtual HRESULT STDMETHODCALLTYPE BufferCB(double SampleTime, BYTE *pBuffer, long BufferLen) = 0;
+
+protected:
+	unsigned long refcount_;
+	directshow_input* input_;
+};
+
+SampleGrabberCallbackDelegate::SampleGrabberCallbackDelegate(directshow_input* input)
+	: refcount_(0)
+	, input_(input)
+{
+}
+
+SampleGrabberCallbackDelegate::~SampleGrabberCallbackDelegate()
+{
+}
+
+HRESULT STDMETHODCALLTYPE SampleGrabberCallbackDelegate::QueryInterface( REFIID iid, void** ppv )
+{
+	if(!ppv)
+		return E_POINTER;
+
+	if(iid == IID_IUnknown)
+		*ppv = dynamic_cast<IUnknown*>( this );
+	else if(iid == IID_ISampleGrabberCB)
+		*ppv = dynamic_cast<ISampleGrabberCB*>( this );
+	else
+	{
+		// It didn't match an interface
+		*ppv = NULL;
+		return E_NOINTERFACE;
+	}
+
+	// Increment refcount on the pointer we're about to return
+	AddRef();
+	// Return success
+	return S_OK;
+}
+
+ULONG STDMETHODCALLTYPE SampleGrabberCallbackDelegate::AddRef()
+{
+	return(++refcount_);
+}
+
+ULONG STDMETHODCALLTYPE SampleGrabberCallbackDelegate::Release()
+{
+	// Capture the count before a possible delete so we never read a freed member.
+	ULONG count = --refcount_;
+	if(count == 0)
+		delete this;
+
+	return count;
+}
+
+//-------------------------------------------------------------------------------------------------
+
+class VideoCallbackDelegate : public SampleGrabberCallbackDelegate
+{
+public:
+	explicit VideoCallbackDelegate(directshow_input* input);
+	~VideoCallbackDelegate();
+
+	// ISampleGrabberCB overrides
+	HRESULT STDMETHODCALLTYPE SampleCB(double SampleTime, IMediaSample *pSample);
+	HRESULT STDMETHODCALLTYPE BufferCB(double SampleTime, BYTE *pBuffer, long BufferLen);
+};
+
+VideoCallbackDelegate::VideoCallbackDelegate(directshow_input* input)
+	: SampleGrabberCallbackDelegate(input)
+{
+}
+
+VideoCallbackDelegate::~VideoCallbackDelegate()
+{
+}
+
+HRESULT STDMETHODCALLTYPE VideoCallbackDelegate::SampleCB(double SampleTime, IMediaSample *pSample)
+{
+#ifdef BUFFER_MEDIA_SAMPLES
+	return E_FAIL;
+#else
+	if(!pSample)
+		return E_INVALIDARG;
+	if(pSample->IsPreroll() == S_OK)
+		return S_FALSE;
+
+	CAutoLock mutex(&input_->video_container_cs_);
+#if defined(_DEBUG)
+	try
+	{
+		std::string msg("Received video callback: SampleTime: ");
+		msg += boost::lexical_cast<std::string>(SampleTime);
+		msg += " (frame: ";
+		msg += boost::lexical_cast<std::string>(static_cast<long long>((SampleTime * 10000000)/reinterpret_cast<VIDEOINFOHEADER*>(input_->video_mediatype_.pbFormat)->AvgTimePerFrame));
+		msg += ") pSample: 0x";
+		msg += boost::lexical_cast<std::string>(pSample);
+		msg += "\n";
+		OutputDebugStringA(msg.c_str());
+	}
+	catch(const boost::bad_lexical_cast&)
+	{
+		OutputDebugStringA("Received video callback\n");
+	}
+#endif
+	input_->video_container_[SampleTime] = pSample;
+	SetEvent(input_->hVideoReceivedEvent_);
+	return S_OK;
+#endif
+}
+
+HRESULT STDMETHODCALLTYPE VideoCallbackDelegate::BufferCB(double SampleTime, BYTE *pBuffer, long BufferLen)
+{
+#ifdef BUFFER_MEDIA_SAMPLES
+	if(!pBuffer)
+		return E_INVALIDARG;
+
+	media_sample_ptr pMS = boost::shared_ptr<media_sample>(new media_sample);
+	pMS->SampleTime = SampleTime;
+	pMS->pBuffer = pBuffer;
+	pMS->BufferLen = BufferLen;
+
+	CAutoLock mutex(&input_->video_container_cs_);
+#if defined(_DEBUG)
+	try
+	{
+		std::string msg("Received video callback: SampleTime: ");
+		msg += boost::lexical_cast<std::string>(SampleTime);
+		msg += " (frame: ";
+		msg += boost::lexical_cast<std::string>(static_cast<long long>((SampleTime * 10000000)/reinterpret_cast<VIDEOINFOHEADER*>(input_->video_mediatype_.pbFormat)->AvgTimePerFrame));
+		msg += ") pBuffer: 0x";
+		msg += boost::lexical_cast<std::string>(pBuffer);
+		msg += ", buffer size:";
+		msg += boost::lexical_cast<std::string>(BufferLen);
+		msg += "\n";
+		OutputDebugStringA(msg.c_str());
+	}
+	catch(const boost::bad_lexical_cast&)
+	{
+		OutputDebugStringA("Received video frame\n");
+	}
+#endif
+	input_->video_container_[SampleTime] = pMS;
+	SetEvent(input_->hVideoReceivedEvent_);
+	return S_OK;
+#else
+	return E_FAIL;
+#endif
+}
+
+//-------------------------------------------------------------------------------------------------
+
+class AudioCallbackDelegate : public SampleGrabberCallbackDelegate
+{
+public:
+	explicit AudioCallbackDelegate(directshow_input* input);
+	~AudioCallbackDelegate();
+
+	// ISampleGrabberCB overrides
+	HRESULT STDMETHODCALLTYPE SampleCB(double SampleTime, IMediaSample *pSample);
+	HRESULT STDMETHODCALLTYPE BufferCB(double SampleTime, BYTE *pBuffer, long BufferLen);
+};
+
+AudioCallbackDelegate::AudioCallbackDelegate(directshow_input* input)
+	: SampleGrabberCallbackDelegate(input)
+{
+}
+
+AudioCallbackDelegate::~AudioCallbackDelegate()
+{
+}
+
+HRESULT STDMETHODCALLTYPE AudioCallbackDelegate::SampleCB(double SampleTime, IMediaSample *pSample)
+{
+#ifdef BUFFER_MEDIA_SAMPLES
+	return E_FAIL;
+#else
+	if(!pSample)
+		return S_FALSE;
+	if(pSample->IsPreroll() == S_OK)
+		return S_FALSE;
+	CAutoLock mutex(&input_->audio_container_cs_);
+#if defined(_DEBUG)
+	HRESULT hr = E_FAIL;
+//#define DEBUG_GET_MEDIA_TIME
+#ifdef DEBUG_GET_MEDIA_TIME
+	long long startTime = 0,
+	          endTime = 0;
+	hr = pSample->GetMediaTime(&startTime, &endTime);
+	if(hr == VFW_E_MEDIA_TIME_NOT_SET)
+	{
+		startTime = -1;
+		endTime = -1;
+	}
+#endif
+	long long frame = static_cast<long long>((SampleTime * 10000000)/reinterpret_cast<VIDEOINFOHEADER*>(input_->video_mediatype_.pbFormat)->AvgTimePerFrame);
+//#define DEBUG_CONVERT_TIME_FORMAT
+#ifdef DEBUG_CONVERT_TIME_FORMAT
+	long long targetTime = 0;
+	//hr = input_->pMediaSeeking_->ConvertTimeFormat(&targetTime, &TIME_FORMAT_MEDIA_TIME, frame, NULL/*&TIME_FORMAT_FRAME*/);
+	//hr = input_->pMediaSeeking_->ConvertTimeFormat(&targetTime, &TIME_FORMAT_SAMPLE, long long(SampleTime*10000000), &TIME_FORMAT_MEDIA_TIME);
+	hr = input_->pMediaSeeking_->ConvertTimeFormat(&targetTime, &TIME_FORMAT_SAMPLE, SampleTime, NULL/*&TIME_FORMAT_FRAME*/);
+#endif
+	try
+	{
+		std::string msg("Received audio callback: SampleTime: ");
+		msg += boost::lexical_cast<std::string>(SampleTime);
+		msg += " (frame: ";
+		msg += boost::lexical_cast<std::string>(frame);
+		msg += ") ";
+#ifdef DEBUG_GET_MEDIA_TIME
+		msg += "startTime: ";
+		msg += boost::lexical_cast<std::string>(startTime);
+		msg += "; endTime: ";
+		msg += boost::lexical_cast<std::string>(endTime);
+		msg += "; ";
+#endif
+#ifdef DEBUG_CONVERT_TIME_FORMAT
+		msg += "targetTime: ";
+//		msg += boost::lexical_cast<std::string>(double(targetTime)/10000000);
+		msg += boost::lexical_cast<std::string>(targetTime);
+		msg += "; ";
+#endif
+		msg += "pSample: 0x";
+		msg += boost::lexical_cast<std::string>(pSample);
+		msg += "\n";
+		OutputDebugStringA(msg.c_str());
+	}
+	catch(const boost::bad_lexical_cast&)
+	{
+		OutputDebugStringA("Received audio callback\n");
+	}
+#endif
+	input_->audio_container_[SampleTime] = pSample;
+	SetEvent(input_->hAudioReceivedEvent_);
+	return S_OK;
+#endif
+}
+
+HRESULT STDMETHODCALLTYPE AudioCallbackDelegate::BufferCB(double SampleTime, BYTE *pBuffer, long BufferLen)
+{
+#ifdef BUFFER_MEDIA_SAMPLES
+	if(!pBuffer)
+		return S_FALSE;
+
+	media_sample_ptr pMS = boost::shared_ptr<media_sample>(new media_sample);
+	pMS->SampleTime = SampleTime;
+	pMS->pBuffer = pBuffer;
+	pMS->BufferLen = BufferLen;
+
+	CAutoLock mutex(&input_->audio_container_cs_);
+#if defined(_DEBUG)
+	try
+	{
+		std::string msg("Received audio callback: SampleTime: ");
+		msg += boost::lexical_cast<std::string>(SampleTime);
+		msg += " (frame: ";
+		msg += boost::lexical_cast<std::string>(static_cast<long long>((SampleTime * 10000000)/reinterpret_cast<VIDEOINFOHEADER*>(input_->video_mediatype_.pbFormat)->AvgTimePerFrame));
+		msg += ") pBuffer: 0x";
+		msg += boost::lexical_cast<std::string>(pBuffer);
+		msg += ", buffer size:";
+		msg += boost::lexical_cast<std::string>(BufferLen);
+		msg += "\n";
+		OutputDebugStringA(msg.c_str());
+	}
+	catch(const boost::bad_lexical_cast&)
+	{
+		OutputDebugStringA("Received audio frame\n");
+	}
+#endif
+	input_->audio_container_[SampleTime] = pMS;
+	SetEvent(input_->hAudioReceivedEvent_);
+	return S_OK;
+#else
+	return E_FAIL;
+#endif
+}
+
+//#################################################################################################
+
+directshow_input::directshow_input( opl::wstring resource, const opl::wstring mime_type )
+	: input_type( )
+	, uri_( resource )
+	, mime_type_( mime_type )
+	, valid_(false)
+	, total_frames_(0)
+	, hVideoReceivedEvent_(NULL)
+	, hAudioReceivedEvent_(NULL)
+	, video_callback_delegate_(NULL)
+	, audio_callback_delegate_(NULL)
+{
+	// parse the resource path replacing forward slashes with back slashes.
+	opl::string res(opl::to_string(resource));
+	for(opl::string::iterator it = res.begin();
+	    it != res.end();
+	    it++)
+	{
+		if(*it == '/')
+			*it = '\\';
+	}
+	uri_ = opl::to_wstring(res);
+
+	// initialize video & audio mediatype cache
+	memset(&video_mediatype_, 0, sizeof(AM_MEDIA_TYPE));
+	memset(&audio_mediatype_, 0, sizeof(AM_MEDIA_TYPE));
+
+	// create events for video & audio callbacks to indicate new data has been received
+	hVideoReceivedEvent_ = CreateEvent(NULL, false, false, NULL);
+	if(!hVideoReceivedEvent_)
+		return;
+
+	hAudioReceivedEvent_ = CreateEvent(NULL, false, false, NULL);
+	if(!hAudioReceivedEvent_)
+		return;
+
+	// Create callback delegates for receiving media samples from the sample grabber filters
+	video_callback_delegate_ = new VideoCallbackDelegate(this);
+	if(!video_callback_delegate_)
+		return;
+	video_callback_delegate_->AddRef();
+
+	audio_callback_delegate_ = new AudioCallbackDelegate(this);
+	if(!audio_callback_delegate_)
+		return;
+	audio_callback_delegate_->AddRef();
+
+#if defined(_DEBUG)
+	print_native_video_format(uri_);
+#endif
+
+	HRESULT hr = build_graph();
+
+#if defined(_DEBUG)
+	if(SUCCEEDED(hr))
+		print_num_filters_in_filter_graph(pGraph_);
+#endif
+
+	// Obtain and cache an IMediaSeeking i/f, set the time format to be frame based and read the media duration
+	if(SUCCEEDED(hr))
+	{
+		hr = pGraph_->QueryInterface(IID_IMediaSeeking, (void**)&pMediaSeeking_);
+		if(SUCCEEDED(hr))
+		{
+			hr = pMediaSeeking_->SetTimeFormat(&TIME_FORMAT_FRAME);
+		}
+#if defined(_DEBUG)
+		if(SUCCEEDED(hr))
+		{
+			if(pMediaSeeking_->IsUsingTimeFormat(&TIME_FORMAT_NONE) == S_OK)
+				OutputDebugStringA("TimeFormat: No format\n");
+			else if(pMediaSeeking_->IsUsingTimeFormat(&TIME_FORMAT_FRAME) == S_OK)
+				OutputDebugStringA("TimeFormat: Video frames\n");
+			else if(pMediaSeeking_->IsUsingTimeFormat(&TIME_FORMAT_SAMPLE) == S_OK)
+				OutputDebugStringA("TimeFormat: Samples in the stream\n");
+			else if(pMediaSeeking_->IsUsingTimeFormat(&TIME_FORMAT_FIELD) == S_OK)
+				OutputDebugStringA("TimeFormat: Interlaced video fields\n");
+			else if(pMediaSeeking_->IsUsingTimeFormat(&TIME_FORMAT_BYTE) == S_OK)
+				OutputDebugStringA("TimeFormat: Byte offset within the stream\n");
+			else if(pMediaSeeking_->IsUsingTimeFormat(&TIME_FORMAT_MEDIA_TIME) == S_OK)
+				OutputDebugStringA("TimeFormat: Reference time (100-nanosecond units)\n");
+		}
+#endif
+		if(SUCCEEDED(hr))
+		{
+			LONGLONG duration;
+			hr = pMediaSeeking_->GetDuration(&duration);
+			if(SUCCEEDED(hr))
+				total_frames_ = static_cast<long long>(duration);
+		}
+	}
+
+	// Obtain and cache an IVideoFrameStep i/f. See if the filter will support stepping.
+	if(SUCCEEDED(hr))
+	{
+		HRESULT hr = pGraph_->QueryInterface(IID_IVideoFrameStep, (void**)&pIVideoFrameStep_);
+		if(SUCCEEDED(hr))
+			if(pIVideoFrameStep_->CanStep(0, NULL) != S_OK)
+				pIVideoFrameStep_ = NULL;
+	}
+
+	// start the filter graph going either paused or running - not sure what's the best at this stage
+	if(SUCCEEDED(hr))
+	{
+		hr = pGraph_->QueryInterface(IID_IMediaControl, (void**)&pMediaControl_);
+		if(SUCCEEDED(hr))
+		{
+#define RUN_FILTER_GRAPH
+#if defined(RUN_FILTER_GRAPH)
+			hr = pMediaControl_->Run();
+#else
+			hr = pMediaControl_->Pause();
+#endif
+			if(SUCCEEDED(hr))
+			{
+				OAFilterState state = State_Stopped;
+				int count = 0;
+				while(++count <= 5)
+				{
+					hr = pMediaControl_->GetState(1000, &state);
+					if(hr == VFW_S_STATE_INTERMEDIATE)
+						continue;
+					else if(hr == VFW_S_CANT_CUE)
+						break;
+					else
+						break;
+				}
+
+				if( (SUCCEEDED(hr))
+#if defined(RUN_FILTER_GRAPH)
+				 && (state == State_Running) )
+#else
+				 && (state == State_Paused) )
+#endif
+				{
+#if defined(RUN_FILTER_GRAPH)
+					OutputDebugStringA("filter graph is running!\n");
+#else
+					OutputDebugStringA("filter graph is paused!\n");
+#endif
+					valid_ = true;
+				}
+			}
+		}
+	}
+
+	if(!valid_)
+		OutputDebugStringA("directshow plugin is NOT valid!\n");
+}
+
+directshow_input::~directshow_input( )
+{
+	HRESULT hr = E_FAIL;
+
+	// Stop the graph if not already stopped
+	OAFilterState state;
+	if(pMediaControl_)
+	{
+		hr = pMediaControl_->GetState(1000, &state);
+		if(SUCCEEDED(hr) && state != State_Stopped)
+		{
+			hr = pMediaControl_->Stop();
+			if(SUCCEEDED(hr))
+			{
+				int count = 0;
+				while(++count <= 5)
+				{
+					hr = pMediaControl_->GetState(1000, &state);
+					if(hr == VFW_S_STATE_INTERMEDIATE)
+						continue;
+					else if(hr == VFW_S_CANT_CUE)
+						break;
+					else
+						break;
+				}
+
+				if( (SUCCEEDED(hr))
+				 && (state == State_Stopped) )
+				{
+					OutputDebugStringA("filter graph is stopped!\n");
+				}
+			}
+		}
+	}
+
+#ifdef BUFFER_MEDIA_SAMPLES
+	if(pISampleGrabberVideo_)
+		pISampleGrabberVideo_->SetCallback(NULL, 1);
+#else
+	if(pISampleGrabberVideo_)
+		pISampleGrabberVideo_->SetCallback(NULL, 0);
+#endif
+
+	valid_ = false;
+
+	{
+		CAutoLock video_mutex(&video_container_cs_);
+		video_container_.clear();
+	}
+
+	{
+		CAutoLock audio_mutex(&audio_container_cs_);
+		audio_container_.clear();
+	}
+
+	// Remove all filters from the filter graph
+	CComPtr<IEnumFilters> pEnumFilters;
+	hr = pGraph_->EnumFilters(&pEnumFilters);
+	if(SUCCEEDED(hr))
+	{
+		typedef std::deque<IBaseFilter*> FilterContainer;
+		typedef FilterContainer::const_iterator FilterContainerIterator;
+
+		FilterContainer filterContainer;
+		IBaseFilter* pFilter;
+
+		while(pEnumFilters->Next(1, &pFilter, NULL) == S_OK)
+		{
+			filterContainer.push_back(pFilter);
+		}
+
+		for(FilterContainerIterator it = filterContainer.begin();
+		    it != filterContainer.end();
+		    it++)
+		{
+			pGraph_->RemoveFilter(*it);
+			(*it)->Release();	// balance the reference handed out by the enumerator
+		}
+	}
+
+#if defined(_DEBUG)
+	if(SUCCEEDED(hr))
+		print_num_filters_in_filter_graph(pGraph_);
+#endif
+
+	if(video_callback_delegate_)
+		video_callback_delegate_->Release();
+
+	if(audio_callback_delegate_)
+		audio_callback_delegate_->Release();
+
+	if(hVideoReceivedEvent_)
+	{
+		CloseHandle(hVideoReceivedEvent_);
+		hVideoReceivedEvent_ = NULL;
+	}
+
+	if(hAudioReceivedEvent_)
+	{
+		CloseHandle(hAudioReceivedEvent_);
+		hAudioReceivedEvent_ = NULL;
+	}
+
+	if(get_video_streams())
+		FreeMediaType(video_mediatype_);
+
+	if(get_audio_streams())
+		FreeMediaType(audio_mediatype_);
+}
+
+inline bool directshow_input::is_valid() const
+{
+	return valid_;
+}
+
+inline bool directshow_input::is_seekable( ) const
... [truncated message content]