CompNeuro Daily Coding Problem 3: What is Entropy?
Question:
What is Entropy? Can you write Python code to compute the entropy of a random variable with the probability distribution [0.2,0.1,0.6,0.1]?
Entropy is the average amount of surprise carried by a random variable. The surprise, or Shannon information, of an outcome x is log(1/p(x)): rare outcomes are more surprising. The entropy is the expectation of this surprise under the distribution, H(X) = -sum_x p(x) log p(x).
Here's a nice video from David MacKay's free Cambridge lectures on Information Theory. This particular one covers entropy.
https://youtu.be/y5VdtQSqiAI
As for the coding part:
import numpy as np

dist = np.array([0.2, 0.1, 0.6, 0.1])
# H = -sum(p * log p), computed here as a dot product
ent = dist @ -np.log(dist)
print(ent)  # ~1.0889 nats
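As a sanity check, here is a sketch that computes the same entropy both in nats (natural log) and in bits (log base 2, the usual unit in information theory), and compares against scipy.stats.entropy (assuming SciPy is available in your environment):

```python
import numpy as np
from scipy.stats import entropy  # assumes SciPy is installed

dist = np.array([0.2, 0.1, 0.6, 0.1])

# entropy in nats: expectation of the surprise log(1/p(x))
ent_nats = -np.sum(dist * np.log(dist))

# entropy in bits: same quantity with log base 2
ent_bits = -np.sum(dist * np.log2(dist))

print(ent_nats)  # ~1.0889 nats
print(ent_bits)  # ~1.5710 bits
print(np.isclose(ent_nats, entropy(dist)))  # scipy defaults to base e
```

Note that entropy is maximized by the uniform distribution; for four outcomes that maximum is log2(4) = 2 bits, and this skewed distribution falls below it.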