Computer Dictionary Online

entropy

<theory> A measure of the disorder of a system. Systems tend to go from a state of order (low entropy) to a state of maximum disorder (high entropy).

The entropy of a system is related to the amount of information it contains. A highly ordered system can be described using fewer bits of information than a disordered one. For example, a string containing one million "0"s can be described using run-length encoding as [("0", 1000000)], whereas a string of random symbols (e.g. bits or characters) will be much harder, if not impossible, to compress in this way.
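
A minimal sketch in Python (not part of the original entry) makes the comparison concrete; the function run_length_encode below is illustrative rather than a standard library routine:

    import random

    def run_length_encode(s):
        # Collapse runs of repeated characters into (character, count) pairs.
        runs = []
        i = 0
        while i < len(s):
            j = i
            while j < len(s) and s[j] == s[i]:
                j += 1
            runs.append((s[i], j - i))
            i = j
        return runs

    ordered = "0" * 1000000
    disordered = "".join(random.choice("01") for _ in range(1000000))

    print(run_length_encode(ordered))          # [('0', 1000000)] -- a single pair
    print(len(run_length_encode(disordered)))  # roughly 500,000 pairs -- no real saving

The ordered string collapses to one (symbol, count) pair, while the random string produces about as many pairs as it has runs, so run-length encoding buys essentially nothing.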

Shannon's formula gives the entropy H(M) of a message M in bits:

	H(M) = -log2 p(M)

where p(M) is the probability of message M.
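
As a rough illustration (not part of the original entry), the sketch below applies the formula under the simplifying assumption that each symbol of a message is drawn independently from a known distribution; the distribution probs and the helper message_entropy are hypothetical:

    from math import log2

    # Hypothetical model: each symbol is drawn independently from a known,
    # heavily biased distribution over "0" and "1".
    probs = {"0": 0.9, "1": 0.1}

    def message_entropy(message, symbol_probabilities):
        # H(M) = -log2 p(M), with p(M) the product of the symbol probabilities.
        p_message = 1.0
        for symbol in message:
            p_message *= symbol_probabilities[symbol]
        return -log2(p_message)

    print(message_entropy("0000000000", probs))  # about 1.5 bits: a likely message
    print(message_entropy("1111111111", probs))  # about 33.2 bits: a surprising message

Under the formula, a less probable message carries more information and therefore has a higher entropy.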

(1998-11-23)

