Answer:
Differential entropy differs from ordinary (discrete) entropy in that the random variable does not need to be discrete.
The concept of differential entropy is as follows.
Let X be a continuous random variable,
and let [tex]f(x)[/tex] be its probability density function.
The differential entropy h(X) is defined as
[tex]h(X)=-\int_{X} f(x)\log f(x)\,dx[/tex]
Here X is continuous rather than discrete, so the sum in the ordinary entropy formula is replaced by an integral over the density.
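As a quick worked illustration (an added example, assuming X is uniform on the interval [0, a] with density [tex]f(x)=\frac{1}{a}[/tex]):
[tex]h(X)=-\int_{0}^{a}\frac{1}{a}\log\frac{1}{a}\,dx=\log a[/tex]
Note that for a < 1 this value is negative, which cannot happen with the entropy of a discrete random variable.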