Information Theory

Entropy

QBase.shannon_entropy — Function
shannon_entropy( probabilities :: AbstractVector ) :: Float64

The classical entropy of a probability distribution:

\[S = -\sum_{i=1}^n p_i \log_2(p_i)\]

A DomainError is thrown if input probabilities does not satisfy is_probability_distribution.

source
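The formula above can be sketched in a few lines of plain Julia. This is a minimal illustration of the definition, not the QBase implementation, and it skips the input validation that `shannon_entropy` performs:

```julia
# Minimal sketch of the Shannon entropy S = -Σᵢ pᵢ log2(pᵢ).
# Zero-probability outcomes are skipped, following the convention 0·log2(0) = 0.
function shannon_entropy_sketch(probabilities::AbstractVector)::Float64
    -sum(p * log2(p) for p in probabilities if p > 0)
end

shannon_entropy_sketch([0.5, 0.5])              # a fair coin carries 1 bit
shannon_entropy_sketch([0.25, 0.25, 0.25, 0.25]) # a uniform 4-outcome distribution carries 2 bits
```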
QBase.von_neumann_entropy — Function
von_neumann_entropy( ρ :: AbstractMatrix ) :: Float64

The von Neumann entropy of a density matrix:

\[S(\rho) = - \sum_j \lambda_j \log_2(\lambda_j)\]

where $\lambda_j$ are the eigenvalues of quantum state $\rho$.

A DomainError is thrown if ρ does not satisfy is_density_matrix.

source
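A minimal sketch of the definition, computing the entropy from the eigenvalues of the density matrix (again an illustration, not the QBase implementation):

```julia
using LinearAlgebra

# Sketch: von Neumann entropy from the eigenvalues λⱼ of a density matrix ρ.
# Eigenvalues at (numerical) zero are skipped, since λ·log2(λ) → 0.
function von_neumann_entropy_sketch(ρ::AbstractMatrix)::Float64
    λs = eigvals(Hermitian(ρ))
    -sum(λ * log2(λ) for λ in λs if λ > 1e-12)
end

von_neumann_entropy_sketch([1.0 0.0; 0.0 0.0])  # a pure state has zero entropy
von_neumann_entropy_sketch([0.5 0.0; 0.0 0.5])  # the maximally mixed qubit has 1 bit
```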
QBase.joint_entropy — Function
joint_entropy(priors :: AbstractVector, conditionals :: AbstractMatrix) :: Float64

Returns the entropy of the joint probability distribution $p(x,y)$. The joint entropy is the shannon_entropy of the joint distribution:

\[S(X,Y) = - \sum_{x,y} p(x,y) \log_2(p(x,y))\]

source
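A sketch of the computation, assuming the convention that `conditionals[y, x]` holds $p(y|x)$, so the joint distribution is $p(x,y) = p(y|x)\,p(x)$ (an illustration only, not the QBase implementation):

```julia
# Sketch of the joint entropy S(X,Y), assuming conditionals[y, x] = p(y|x).
function joint_entropy_sketch(priors::AbstractVector, conditionals::AbstractMatrix)::Float64
    # Build the joint distribution p(x, y) = p(y|x) p(x).
    joint = [conditionals[y, x] * priors[x]
             for y in axes(conditionals, 1), x in axes(conditionals, 2)]
    -sum(p * log2(p) for p in joint if p > 0)
end

# Two independent fair coins: S(X,Y) = S(X) + S(Y) = 2 bits.
joint_entropy_sketch([0.5, 0.5], [0.5 0.5; 0.5 0.5])
```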
QBase.conditional_entropy — Function
conditional_entropy(priors::AbstractVector, conditionals::AbstractMatrix) :: Float64

Returns the conditional entropy for the system with specified priors $p(x)$ and conditionals $p(y|x)$:

\[S(Y|X) = - \sum_{x,y} p(x,y)\log_2(p(y|x))\]

source
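A sketch of the conditional entropy under the same `conditionals[y, x] = p(y|x)` convention (an illustration, not the QBase implementation):

```julia
# Sketch of S(Y|X) = -Σ p(x,y) log2(p(y|x)), with p(x,y) = p(y|x) p(x).
function conditional_entropy_sketch(priors::AbstractVector, conditionals::AbstractMatrix)::Float64
    -sum(
        priors[x] * conditionals[y, x] * log2(conditionals[y, x])
        for y in axes(conditionals, 1), x in axes(conditionals, 2)
        if priors[x] > 0 && conditionals[y, x] > 0
    )
end

# If Y copies X exactly, knowing X leaves no uncertainty about Y: S(Y|X) = 0.
conditional_entropy_sketch([0.5, 0.5], [1.0 0.0; 0.0 1.0])

# If Y is uniform regardless of X, one full bit of uncertainty remains.
conditional_entropy_sketch([0.5, 0.5], [0.5 0.5; 0.5 0.5])
```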

Information

QBase.holevo_bound — Function
holevo_bound(
    priors :: AbstractVector,
    ρ_states :: Vector{<:AbstractMatrix}
) :: Float64

The Holevo theorem places a bound on the mutual_information, $I(X : Y) \leq \mathcal{X}$. It limits the amount of information that can be decoded from a set of quantum states. For a mixed state $\rho = \sum_i p_i \rho_i$, the Holevo bound $\mathcal{X}$ is

\[\mathcal{X} := S(\rho) - \sum_i p_i S(\rho_i)\]

where $S(\rho)$ is the von_neumann_entropy.

A DomainError is thrown if priors does not satisfy is_probability_distribution, or if any state in ρ_states does not satisfy is_density_matrix.

source
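The bound $\mathcal{X} = S(\rho) - \sum_i p_i S(\rho_i)$ can be sketched directly from the formula, using a small von Neumann entropy helper (an illustration with hypothetical names, not the QBase implementation):

```julia
using LinearAlgebra

# Helper: von Neumann entropy from eigenvalues (numerical zeros skipped).
vn_entropy(ρ) = -sum(λ * log2(λ) for λ in eigvals(Hermitian(ρ)) if λ > 1e-12)

# Sketch of the Holevo bound χ = S(ρ) - Σᵢ pᵢ S(ρᵢ).
function holevo_bound_sketch(priors::AbstractVector, ρ_states::Vector{<:AbstractMatrix})::Float64
    ρ_mix = sum(priors .* ρ_states)          # the mixed state ρ = Σᵢ pᵢ ρᵢ
    vn_entropy(ρ_mix) - sum(priors[i] * vn_entropy(ρ_states[i]) for i in eachindex(priors))
end

# Two orthogonal pure qubit states with equal priors carry one full bit: χ = 1.
holevo_bound_sketch([0.5, 0.5], [[1.0 0.0; 0.0 0.0], [0.0 0.0; 0.0 1.0]])
```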
QBase.mutual_information — Function
mutual_information(
    priors :: AbstractVector,
    conditionals :: AbstractMatrix
) :: Float64

The information shared between the random variables $X$ and $Y$. The mutual information is directly computed from the shannon_entropy and joint_entropy:

\[I(X : Y) = S(X) + S(Y) - S(X,Y)\]

source
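The identity $I(X:Y) = S(X) + S(Y) - S(X,Y)$ can be sketched as follows, again assuming `conditionals[y, x] = p(y|x)` (an illustration, not the QBase implementation):

```julia
# Shannon entropy helper; 0·log2(0) is taken as 0.
h(dist) = -sum(p * log2(p) for p in dist if p > 0)

# Sketch of I(X:Y) = S(X) + S(Y) - S(X,Y), assuming conditionals[y, x] = p(y|x).
function mutual_information_sketch(priors::AbstractVector, conditionals::AbstractMatrix)::Float64
    joint = [conditionals[y, x] * priors[x]
             for y in axes(conditionals, 1), x in axes(conditionals, 2)]
    marginal_y = vec(sum(joint, dims=2))     # p(y) = Σₓ p(x, y)
    h(priors) + h(marginal_y) - h(joint)
end

# A noiseless channel transmits the full bit: I(X:Y) = 1.
mutual_information_sketch([0.5, 0.5], [1.0 0.0; 0.0 1.0])

# A channel whose output ignores the input transmits nothing: I(X:Y) = 0.
mutual_information_sketch([0.5, 0.5], [0.5 0.5; 0.5 0.5])
```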
mutual_information(
    priors :: AbstractVector,
    ρ_states :: Vector{<:AbstractMatrix},
    Π :: AbstractVector
) :: Float64

Computes the classical mutual information for an encoding into the quantum states ρ_states and a decoding measurement Π. The conditional probabilities are obtained from the quantum states and measurements using measure.

A DomainError is thrown if priors does not satisfy is_probability_distribution, if any state in ρ_states does not satisfy is_density_matrix, or if Π does not satisfy is_povm.

source
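For this quantum form, the conditional probabilities follow the Born rule, $p(y|x) = \text{Tr}[\Pi_y \rho_x]$, after which the classical formula applies. A self-contained sketch (an illustration, not the QBase implementation; `measure` is replaced by an explicit trace):

```julia
using LinearAlgebra

# Shannon entropy helper; 0·log2(0) is taken as 0.
h(dist) = -sum(p * log2(p) for p in dist if p > 0)

# Sketch: Born-rule conditionals p(y|x) = Tr[Π[y] ρ_states[x]], then I(X:Y).
function quantum_mutual_information_sketch(priors, ρ_states, Π)
    joint = [real(tr(Π[y] * ρ_states[x])) * priors[x]
             for y in eachindex(Π), x in eachindex(ρ_states)]
    h(priors) + h(vec(sum(joint, dims=2))) - h(joint)
end

# Orthogonal states measured in their own basis are perfectly distinguishable,
# so the full bit is recovered.
states = [[1.0 0.0; 0.0 0.0], [0.0 0.0; 0.0 1.0]]
quantum_mutual_information_sketch([0.5, 0.5], states, states)
```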

State Discrimination

QBase.success_probability — Function
success_probability(
    priors::AbstractVector,
    states::Vector{<:AbstractMatrix},
    Π::Vector{<:AbstractMatrix}
) :: Float64

The probability of correctly distinguishing quantum states with the specified POVM:

\[P_{\text{Success}} = \sum_{i=1}^n p_i \text{Tr}[\Pi_i \rho_i].\]

The number of states must match the number of POVM elements.

source
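The sum $\sum_i p_i \text{Tr}[\Pi_i \rho_i]$ translates directly into code (an illustration, not the QBase implementation):

```julia
using LinearAlgebra

# Sketch of the success probability Σᵢ pᵢ Tr[Πᵢ ρᵢ] for state discrimination.
function success_probability_sketch(priors, ρ_states, Π)
    sum(priors[i] * real(tr(Π[i] * ρ_states[i])) for i in eachindex(priors))
end

# Orthogonal states with the matching projective measurement are
# distinguished perfectly, so the success probability is 1.
states = [[1.0 0.0; 0.0 0.0], [0.0 0.0; 0.0 1.0]]
success_probability_sketch([0.5, 0.5], states, states)
```

The error probability of the next docstring is then just `1 - success_probability_sketch(...)`.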
QBase.error_probability — Function
error_probability(
    priors::AbstractVector,
    ρ_states::Vector{<:AbstractMatrix},
    Π::AbstractVector{<:AbstractMatrix}
) :: Float64

The probability of incorrectly distinguishing quantum states with the specified POVM. This quantity is simply obtained as $P_{\text{Error}} = 1 - P_{\text{Success}}$.

source