From: Muthu <gnu...@us...> - 2007-09-29 17:39:00
Update of /cvsroot/octave/octave-forge/main/info-theory/inst
In directory sc8-pr-cvs3.sourceforge.net:/tmp/cvs-serv8746/inst

Added Files:
	hartley_entropy.m kullback_leibler_distance.m renyi_entropy.m
	shannon_entropy.m
Log Message:
kullback_leibler_distance.m shannon_entropy.m hartley_entropy.m
renyi_entropy.m, new entropy definitions

--- NEW FILE: kullback_leibler_distance.m ---
## Copyright (C) 2007 Muthiah Annamalai <mut...@ut...>
##
## This program is free software; you can redistribute it and/or modify
## it under the terms of the GNU General Public License as published by
## the Free Software Foundation; either version 2 of the License, or
## (at your option) any later version.
##
## This program is distributed in the hope that it will be useful,
## but WITHOUT ANY WARRANTY; without even the implied warranty of
## MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
## GNU General Public License for more details.
##
## You should have received a copy of the GNU General Public License
## along with this program; if not, write to the Free Software
## Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA

## -*- texinfo -*-
## @deftypefn {Function File} {} kullback_leibler_distance (@var{P}, @var{Q})
##
## Compute the Kullback-Leibler distance between the two given
## probability distributions @var{P} and @var{Q}:
##
## @math{Dkl(P,Q) = \sum_x{ -P(x).log(Q(x)) + P(x).log(P(x))}
##                = \sum_x{ P(x).log(P(x)/Q(x))}}
##
## @example
## @group
## kullback_leibler_distance([0.2 0.3 0.5],[0.1 0.8 0.1])
## @result{} ans = 0.64910
## @end group
## @end example
## @end deftypefn

function dist = kullback_leibler_distance (P, Q)
  if (nargin < 2)
    print_usage ();
  end
  PQ = P ./ Q;
  ## Remove terms where P or Q vanishes before taking logs.
  idx = [find(P == 0), find(Q == 0)];
  PQ(idx) = [];
  P(idx) = [];
  dist = dot (P, log (PQ));
end

--- NEW FILE: hartley_entropy.m ---
## Copyright (C) 2007 Muthiah Annamalai <mut...@ut...>
##
## This program is free software; you can redistribute it and/or modify
## it under the terms of the GNU General Public License as published by
## the Free Software Foundation; either version 2 of the License, or
## (at your option) any later version.
##
## This program is distributed in the hope that it will be useful,
## but WITHOUT ANY WARRANTY; without even the implied warranty of
## MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
## GNU General Public License for more details.
##
## You should have received a copy of the GNU General Public License
## along with this program; if not, write to the Free Software
## Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA

## -*- texinfo -*-
## @deftypefn {Function File} {} hartley_entropy (@var{P})
##
## Compute the Hartley entropy of the given probability distribution,
## as the Renyi entropy of order 0:
##
## @math{H\alpha(P(x)) = log(\sum_i{P(x_i)^\alpha})/(1-\alpha)}
##
## When @var{alpha} = 0 this reduces to the Hartley entropy,
## H0(X) = log|X|, where |X| is the cardinality of the support of
## @var{P}, the pdf of the random variable x.
##
## @example
## @group
## hartley_entropy([0.2 0.3 0.5])
## @result{} ans = 1.0986
## @end group
## @end example
## @end deftypefn

function R = hartley_entropy (P)
  if (nargin ~= 1)
    print_usage ();
  end
  R = renyi_entropy (0, P);
end

--- NEW FILE: shannon_entropy.m ---
## Copyright (C) 2007 Muthiah Annamalai <mut...@ut...>
##
## This program is free software; you can redistribute it and/or modify
## it under the terms of the GNU General Public License as published by
## the Free Software Foundation; either version 2 of the License, or
## (at your option) any later version.
##
## This program is distributed in the hope that it will be useful,
## but WITHOUT ANY WARRANTY; without even the implied warranty of
## MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
## GNU General Public License for more details.
##
## You should have received a copy of the GNU General Public License
## along with this program; if not, write to the Free Software
## Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA

## -*- texinfo -*-
## @deftypefn {Function File} {} shannon_entropy (@var{P})
##
## Compute the Shannon entropy by redirecting to the @code{entropy}
## function. This naming is consistent with the definition of the
## Renyi entropy of order 1.
##
## @end deftypefn

function E = shannon_entropy (P)
  if (nargin < 1)
    print_usage ();
  end
  E = entropy (P);
end

--- NEW FILE: renyi_entropy.m ---
## Copyright (C) 2007 Muthiah Annamalai <mut...@ut...>
##
## This program is free software; you can redistribute it and/or modify
## it under the terms of the GNU General Public License as published by
## the Free Software Foundation; either version 2 of the License, or
## (at your option) any later version.
##
## This program is distributed in the hope that it will be useful,
## but WITHOUT ANY WARRANTY; without even the implied warranty of
## MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
## GNU General Public License for more details.
##
## You should have received a copy of the GNU General Public License
## along with this program; if not, write to the Free Software
## Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA

## -*- texinfo -*-
## @deftypefn {Function File} {} renyi_entropy (@var{alpha}, @var{P})
##
## Compute the Renyi entropy of order @var{alpha}
## for the given probability distribution @var{P}:
##
## @math{H\alpha(P(x)) = log(\sum_i{P(x_i)^\alpha})/(1-\alpha)}
##
## Special cases: when @var{alpha} = 1, this reduces to the regular
## definition of the Shannon entropy, and when @var{alpha} = 0, it
## reduces to the Hartley entropy.
##
## @example
## @group
## renyi_entropy(0,[0.2 0.3 0.5])
## @result{} ans = 1.0986
## @end group
## @end example
## @end deftypefn

function R = renyi_entropy (alpha, P)
  if (nargin ~= 2)
    print_usage ();
  end
  if (alpha == 1)
    ## The alpha -> 1 limit of the Renyi entropy is the Shannon entropy.
    R = entropy (P);
  else
    S = sum (P .^ alpha);
    R = log (S) / (1 - alpha);
  end
end
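As a quick sanity check on the documented example values, here is a small self-contained Python sketch of the same formulas. This is not part of the committed package; it uses natural logarithms throughout, matching the Octave code above, and it assumes the Shannon case is computed in nats (the base used by the package's `entropy` function is not shown in this commit).

```python
import math

def kullback_leibler_distance(P, Q):
    # D(P||Q) = sum_x P(x) * log(P(x)/Q(x)); terms where P or Q
    # vanishes are dropped, mirroring the Octave code above.
    return sum(p * math.log(p / q) for p, q in zip(P, Q) if p > 0 and q > 0)

def renyi_entropy(alpha, P):
    # H_alpha(P) = log(sum_i P_i^alpha) / (1 - alpha); the alpha = 1
    # case is the Shannon limit, handled separately as in the Octave code.
    if alpha == 1:
        return -sum(p * math.log(p) for p in P if p > 0)
    return math.log(sum(p ** alpha for p in P)) / (1 - alpha)

def hartley_entropy(P):
    # Renyi entropy of order 0: log of the support size of P.
    return renyi_entropy(0, P)

print(kullback_leibler_distance([0.2, 0.3, 0.5], [0.1, 0.8, 0.1]))  # ~0.6491
print(hartley_entropy([0.2, 0.3, 0.5]))                             # ~1.0986
```

Both printed values agree with the @result{} lines in the texinfo examples above, and `renyi_entropy` near alpha = 1 approaches the Shannon entropy, confirming the special-case handling.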