Usage of entropy function


I used open entropy to check the source code, and it contains these lines:

if ~islogical(I)
  I = im2uint8(I);   % non-logical input is converted (and rescaled) to uint8
end
p = imhist(I(:));    % 256-bin histogram of the converted values

which means that the input is first converted to uint8, and the function then computes the entropy of the histogram of that converted input, not of the input values themselves.

That explains the difference.
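
To make this concrete, here is a minimal sketch (the probability vector is a made-up example, and the Image Processing Toolbox is assumed for im2uint8/imhist) reproducing what entropy computes for a double vector of probabilities:

p = [0.1 0.2 0.3 0.4];        % example probability vector (hypothetical values)

I = im2uint8(p);              % doubles in [0,1] are rescaled to integers 0..255
h = imhist(I(:));             % 256-bin histogram of those integers
h = h(h > 0) ./ numel(I);     % drop empty bins, normalize counts to probabilities
Echeck = -sum(h .* log2(h));  % entropy of the histogram, i.e. what entropy(p) returns

E = -sum(p .* log2(p));       % Shannon entropy of p itself, which is generally different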

Comments

  • atlantis
    atlantis almost 2 years

    I was trying to find the entropy of a certain probability distribution in MATLAB. For p, I tried doing

    E = -sum(p .* log2(p))
    

    and

    Echeck = entropy(p)

    Shouldn't E and Echeck be the same?

    The MATLAB help on entropy does say that entropy is defined as -sum(p.*log2(p)), where p contains the histogram counts returned from imhist. But it also says that entropy converts any class other than logical to uint8 for the histogram count calculation, since it is actually trying to calculate the entropy of a grayscale image and therefore wants the pixel values to be discrete. So I guess it's incorrect to use this function for my purpose? Is there a good alternative?

    • Chris Taylor
      Chris Taylor about 12 years
      What is your variable p? The expression -sum(p .* log2(p)) returns a scalar if p is a vector, whereas the function entropy expects to operate on a matrix representing a grayscale image. In effect, entropy(I) computes -sum(q .* log2(q)), where q is the normalized histogram of I returned by imhist, with empty bins removed.
    • Oli
      Oli about 12 years
      @Chris Taylor, You should post your comment as an answer.
    • atlantis
      atlantis about 12 years
      p is a probability distribution - an N×1 vector of probability values in my case.
    • Chris Taylor
      Chris Taylor about 12 years
      @Oli meh, yours was more complete (also, I don't have the image processing toolbox so I can't see the source of entropy, which I think added to your answer)
    • atlantis
      atlantis about 12 years
      The MATLAB entropy function also returned a scalar for the vector p. The difference in values is probably due to the double probability values in p being converted to uint8. I was wondering if MATLAB had a more straightforward way to calculate the entropy of any probability distribution (one such approach is sketched after these comments).
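
For the "more straightforward way" asked about in the last comment, one option is to apply the definition directly to the probability vector, dropping zero entries first (the variable names below are just illustrative):

p = [0.5 0.25 0.125 0.125];   % hypothetical distribution; nonnegative, sums to 1
q = p(p > 0);                 % drop zero probabilities, since 0*log2(0) is taken as 0
H = -sum(q .* log2(q));       % Shannon entropy in bits; for this p, H = 1.75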