
Ray Phan

Feb 02, 2015

<p><a href="http://www.mathworks.com/help/images/ref/entropy.html" rel="nofollow"><code>entropy</code></a> in MATLAB already computes the Shannon entropy of an entire image. What you need to do first is remove the background pixels from the image: create a mask that segments the image so the background is excluded, then compute the entropy of the remaining pixels. We can build such a mask with a combination of <a href="http://www.mathworks.com/help/images/ref/im2bw.html" rel="nofollow"><code>im2bw</code></a> and <a href="http://www.mathworks.com/help/images/ref/graythresh.html" rel="nofollow"><code>graythresh</code></a>: <code>graythresh</code> finds the optimum threshold for converting a grayscale image into binary, and <code>im2bw</code> performs that conversion, which gives us our mask. Therefore, we can simply do this:</p>
<pre><code>im = imread('http://i.stack.imgur.com/I4hf4.png'); %// Read your image from StackOverflow
out = im2bw(im, graythresh(im));
out_final = im(out);
e = entropy(out_final);
</code></pre>
<p>The first line reads in the image directly from StackOverflow, the second line finds a mask that selects the relevant pixels in the image, the third line extracts only the pixels that belong to the lung, and the last line computes the entropy of those pixels. To convince yourself that the mask is correct, <code>imshow(out)</code> gives this image:</p>
<p><img src="http://i.stack.imgur.com/24c0d.png" alt="Binary mask segmenting the lung region"></p>
<p>Now, the entropy of what is remaining is stored in <code>e</code>, and the answer I get is:</p>
<pre><code>e =
6.1745
</code></pre>
<hr>
<p>However, if you're asking whether your code works, the answer is no. Recall that the Shannon entropy is defined as:</p>
<p><code>H(X) = -&sum;<sub>i</sub> p(x<sub>i</sub>) log<sub>b</sub> p(x<sub>i</sub>)</code></p>
<p><code>b = 2</code> in our case. What you need to do is find the <strong>probability of occurrence</strong> for each symbol / intensity that is encountered in your image. Therefore, you need to find the histogram that tabulates the total number of pixels observed for every possible intensity found in your image, and normalize by the total number of pixels to get the probability distribution function (PDF). We can easily find the histogram through <a href="http://www.mathworks.com/help/images/ref/imhist.html" rel="nofollow"><code>imhist</code></a>. Because this image is <code>uint8</code>, each pixel is represented with 8 bits, so there are <code>2^8 = 256</code> possible intensities, making our histogram and PDF 256 entries long.</p>
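<p>To make the histogram-to-PDF step concrete, here is a small sketch (not part of the original answer; the tiny array <code>A</code> is made up purely for illustration). It builds the 256-bin histogram with <code>imhist</code>, normalizes by the pixel count to get a PDF, and evaluates the entropy formula directly:</p>
<pre><code>%// Toy example: a made-up 2 x 4 uint8 array with three distinct intensities
A = uint8([0 0 50 50; 50 50 200 200]);
h = imhist(A);            %// 256-bin histogram of pixel intensities
p = h / numel(A);         %// normalize by the pixel count to get the PDF
%// sum(p) is 1; p is nonzero only at intensities 0, 50 and 200
p = p(p &gt; 0);             %// keep only nonzero entries before taking the log
H = -sum(p .* log2(p))    %// entropy in bits; here H = 1.5
</code></pre>
<p>With probabilities 0.25, 0.5 and 0.25 for the three intensities, the formula gives <code>H = 1.5</code> bits. Note that dropping the zero entries, as done here, is an alternative to the trick used in the full code below of setting them to 1; both make the zero-probability terms contribute nothing to the sum.</p>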
<p>With regards to your code, you are normalizing each intensity by the total sum of the image and then summing over the image, which unfortunately is not the correct way to find the entropy. If you want to calculate the entropy yourself, you would need to do something like this:</p>
<pre><code>%// Same code as before
im = imread('http://i.stack.imgur.com/I4hf4.png'); %// Read your image from StackOverflow
out = im2bw(im, graythresh(im));
out_final = im(out);
%// Compute PDF
h = imhist(out_final);
pdf = h / numel(out_final);
%// Set any zero entries to 1 so log2(1) = 0 and they contribute nothing
%// to the sum (this avoids log2(0) = -Inf)
pdf(pdf == 0) = 1;
%// Calculate entropy
e = -sum(pdf.*log2(pdf));
</code></pre>
<p>Take a close look at the code above. The first three lines are the same as in my first answer. We then compute the PDF by taking the histogram and normalizing by the total number of pixels in the image. Next, we set any PDF values that are 0 to 1 so that the <code>log</code> terms safely evaluate to 0 rather than <code>-Inf</code>. The last line computes the entropy. If you run the above code, we also get <code>e = 6.1745</code>, which agrees with the first attempt.</p>
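<p>As a final sanity check (a sketch, assuming the Image Processing Toolbox is on the path and the variables <code>out_final</code> and <code>e</code> from the code above are in the workspace), the manual calculation can be compared against MATLAB's built-in <code>entropy</code>, which performs the same histogram-based computation internally:</p>
<pre><code>%// Compare the manual entropy against the built-in on the same masked pixels
e_builtin = entropy(out_final);
fprintf('manual: %.4f, built-in: %.4f\n', e, e_builtin);
%// The two values should agree to within floating-point tolerance
</code></pre>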
<p>This tip was originally posted on <a href="http://stackoverflow.com/questions/27726735/how-to-calculate-the-shannon-entropy-of-a-part-of-image-data/27726789">Stack Overflow</a>.</p>
