Entropy

Computes the von Neumann or Rényi entropy of a density matrix

Other toolboxes required: none
Function category: Information theory

Entropy is a function that computes the von Neumann entropy or Rényi entropy of a density matrix. That is, given a density matrix $\rho$, it computes the following quantity:

\[S(\rho) := -\mathrm{Tr}\big(\rho\log_2(\rho)\big)\]

(i.e., the von Neumann entropy) or the following quantity:

\[S_\alpha(\rho) := \frac{1}{1-\alpha}\log_2\big(\mathrm{Tr}(\rho^\alpha)\big)\]

(i.e., the Rényi-$\alpha$ entropy).
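
Equivalently, if $\lambda_1, \ldots, \lambda_d$ are the eigenvalues of $\rho$, then (with the convention $0\log_2(0) = 0$, which is the eigenvalue form that the source code below actually evaluates):

\[S(\rho) = -\sum_{i=1}^{d}\lambda_i\log_2(\lambda_i) \quad \text{and} \quad S_\alpha(\rho) = \frac{1}{1-\alpha}\log_2\bigg(\sum_{i=1}^{d}\lambda_i^\alpha\bigg).\]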

Syntax

  • ENT = Entropy(RHO)
  • ENT = Entropy(RHO,BASE)
  • ENT = Entropy(RHO,BASE,ALPHA)

Argument descriptions

  • RHO: A density matrix to have its entropy computed.
  • BASE (optional, default 2): The base of the logarithm used in the entropy calculation.
  • ALPHA (optional, default 1): A non-negative real parameter that determines which entropy is computed: ALPHA = 1 gives the von Neumann entropy, ALPHA = Inf gives the Rényi-$\infty$ (min-)entropy, and any other non-negative value gives the Rényi-ALPHA entropy. Both optional arguments are illustrated in the example below.
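
For instance, consider the diagonal state $\rho = \mathrm{diag}(1/2, 1/4, 1/8, 1/8)$, whose von Neumann entropy is exactly $7/4$ bits:

>> rho = diag([1/2 1/4 1/8 1/8]);
>> Entropy(rho)        % von Neumann entropy, in bits
 
ans =
 
    1.7500
 
>> Entropy(rho,exp(1)) % the same entropy, measured in nats
 
ans =
 
    1.2130
 
>> Entropy(rho,2,2)    % the Renyi-2 (collision) entropy, in bits
 
ans =
 
    1.5406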

Examples

The extreme cases: pure states and maximally-mixed states

A pure state has entropy zero:

>> Entropy(RandomDensityMatrix(4,0,1)) % entropy of a random 4-by-4 rank-1 density matrix
 
ans =
 
   7.3396e-15 % silly numerical errors: this is effectively zero
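
In exact arithmetic the entropy of a pure state is exactly zero; for example, with a deterministic rank-1 state:

>> Entropy([1 0;0 0])
 
ans =
 
     0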

A $d$-by-$d$ maximally-mixed state has entropy $\log_2(d)$:

>> Entropy(eye(4)/4)
 
ans =
 
     2
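
Choosing BASE equal to the dimension rescales this so that the maximally-mixed state always has entropy 1, regardless of the dimension:

>> Entropy(eye(4)/4,4)
 
ans =
 
     1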

All other states have entropy somewhere between these two extremes:

>> Entropy(RandomDensityMatrix(4))
 
ans =
 
    1.6157

Notes

The Rényi-$\alpha$ entropy approaches the von Neumann entropy as $\alpha \rightarrow 1$.
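
This can be verified with L'Hôpital's rule: both $\log_2\big(\mathrm{Tr}(\rho^\alpha)\big)$ and $1-\alpha$ vanish as $\alpha \rightarrow 1$ (since $\mathrm{Tr}(\rho) = 1$), so

\[\lim_{\alpha\rightarrow 1} S_\alpha(\rho) = \lim_{\alpha\rightarrow 1}\frac{\frac{\mathrm{d}}{\mathrm{d}\alpha}\log_2\big(\mathrm{Tr}(\rho^\alpha)\big)}{\frac{\mathrm{d}}{\mathrm{d}\alpha}(1-\alpha)} = -\lim_{\alpha\rightarrow 1}\frac{\mathrm{Tr}\big(\rho^\alpha\log_2(\rho)\big)}{\mathrm{Tr}(\rho^\alpha)} = -\mathrm{Tr}\big(\rho\log_2(\rho)\big) = S(\rho).\]

For this reason, the source code below treats any ALPHA within roughly $\epsilon^{3/4}$ of 1 (where $\epsilon$ is machine epsilon) as the von Neumann case, rather than dividing by a vanishingly small $1-\alpha$.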

Source code

The MATLAB source code for this function is as follows:

%%  ENTROPY    Computes the von Neumann or Renyi entropy of a density matrix
%   This function has one required argument:
%     RHO: a density matrix
%
%   ENT = Entropy(RHO) is the (base 2) von Neumann entropy of RHO.
%
%   This function has two optional input arguments:
%     BASE (default 2)
%     ALPHA (default 1)
%
%   ENT = Entropy(RHO,BASE,ALPHA) is the entropy of RHO, computed with
%   logarithms in the base specified by BASE. If ALPHA = 1 then this is the
%   von Neumann entropy. If ALPHA <> 1 then this is the Renyi-ALPHA
%   entropy.
%
%   URL: http://www.qetlab.com/Entropy

%   requires: nothing
%   author: Nathaniel Johnston (nathaniel@njohnston.ca)
%   last updated: May 12, 2016

function ent = Entropy(rho,varargin)

% set optional argument defaults: base=2, alpha=1
[base,alpha] = opt_args({ 2, 1 },varargin{:});

lam = eig(full(rho));
lam = lam(lam>0); % handle zero entries better: we want 0*log(0) = 0, not NaN

% If alpha == 1, compute the von Neumann entropy
if(abs(alpha - 1) <= eps^(3/4))
    if(base == 2)
        ent = -sum(real(lam.*log2(lam)));
    else
        ent = -sum(real(lam.*log(lam)))/log(base);
    end
elseif(alpha >= 0)
    if(alpha < Inf) % Renyi-alpha entropy with ALPHA < Inf
        ent = log(sum(lam.^alpha))/(log(base)*(1-alpha));

        % Check whether or not we ran into numerical problems due to ALPHA
        % being large. If so, compute the infinity-entropy instead.
        if(ent == Inf)
            alpha = Inf;
            warning('Entropy:LargeAlpha','Numerical problems were encountered due to a large value of ALPHA. Computing the entropy with ALPHA = Inf instead.');
        end
    end

    % Do not merge the following if statement with the previous one: we
    % need them separate, since this one catches a warning from the
    % previous block.
    if(alpha == Inf) % Renyi-infinity entropy
        ent = -log(max(lam))/log(base);
    end
else
    error('Entropy:InvalidAlpha','ALPHA must be non-negative.');
end
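
As the final branch of the code shows, setting ALPHA = Inf computes the Rényi-$\infty$ entropy (the min-entropy) $S_\infty(\rho) = -\log_{\mathrm{BASE}}(\lambda_{\max}(\rho))$. A quick check on the diagonal state from earlier, whose largest eigenvalue is $1/2$:

>> Entropy(diag([1/2 1/4 1/8 1/8]),2,Inf) % -log2 of the largest eigenvalue
 
ans =
 
     1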