## [46726a]: inst/entropy.m

```
## Copyright (C) 2006 Muthiah Annamalai
##
## This program is free software; you can redistribute it and/or modify
## it under the terms of the GNU General Public License as published by
## the Free Software Foundation; either version 2 of the License, or
## (at your option) any later version.
##
## This program is distributed in the hope that it will be useful,
## but WITHOUT ANY WARRANTY; without even the implied warranty of
## MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
## GNU General Public License for more details.
##
## You should have received a copy of the GNU General Public License
## along with this program; if not, see <http://www.gnu.org/licenses/>.
##
## -*- texinfo -*-
## @deftypefn {Function File} {} entropy (@var{symbol_probabilities}, @var{base})
##
## Computes the Shannon entropy of a discrete source whose symbol
## probabilities are given by @var{symbol_probabilities}; optionally
## the logarithm @var{base} can be specified.  The base defaults to 2,
## in which case the entropy can be thought of as the number of bits
## needed to represent an average message of the source.  For example:
##
## @example
## @group
## entropy([0.25 0.25 0.25 0.25]) @result{} ans = 2
## entropy([0.25 0.25 0.25 0.25],4) @result{} ans = 1
## @end group
## @end example
## @end deftypefn

function val = entropy (symprob, base)
  if (nargin < 1)
    error ("usage: entropy (symbol_probability_list); computes entropy in base-2");
  elseif (nargin < 2)
    base = 2;
  endif

  ## Eliminate zero probabilities, since 0*log(0) is taken as 0.
  x = symprob (symprob > 0);
  val = -sum (log10 (x) .* x) / log10 (base);
endfunction

%!assert(entropy([0.25 0.25 0.25 0.25]),2,0)
```
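The formula the function implements, H = -Σ p·log(p) / log(base) with zero-probability symbols dropped, can be sketched outside Octave as well. The following is a minimal standalone Python/NumPy illustration of the same computation; the function name `shannon_entropy` is hypothetical and not part of this package.

```python
import numpy as np

def shannon_entropy(probs, base=2):
    """Shannon entropy of a discrete distribution, in units of log `base`.

    Hypothetical sketch mirroring entropy.m: drop zero-probability
    symbols (0*log(0) is taken as 0), then apply
    H = -sum(p * log(p)) / log(base).
    """
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]                      # eliminate zero probabilities
    return float(-np.sum(p * np.log(p)) / np.log(base))

# Four equiprobable symbols: 2 bits in base 2, 1 "quat" in base 4,
# matching the examples in the texinfo help text above.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))     # ~2.0
print(shannon_entropy([0.25, 0.25, 0.25, 0.25], 4))  # ~1.0
```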