Information Theory
Entropy
QBase.shannon_entropy — Function

```julia
shannon_entropy( probabilities :: AbstractVector ) :: Float64
```

The classical Shannon entropy of a probability distribution:
\[S = -\sum_{i=1}^n p_i \log_2(p_i)\]
A `DomainError` is thrown if the input `probabilities` does not satisfy `is_probability_distribution`.
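As a sanity check, the formula above can be sketched in NumPy (an illustrative re-implementation, not the QBase API):

```python
import numpy as np

def shannon_entropy(probabilities):
    """S = -sum_i p_i * log2(p_i), with the convention 0*log2(0) = 0."""
    p = np.asarray(probabilities, dtype=float)
    if p.min() < 0 or not np.isclose(p.sum(), 1.0):
        raise ValueError("input is not a probability distribution")
    nz = p[p > 0]                      # drop zeros: 0*log2(0) = 0
    return float(-np.sum(nz * np.log2(nz)))

print(shannon_entropy([0.5, 0.5]))    # a uniform bit carries 1.0 bit of entropy
```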
QBase.von_neumann_entropy — Function

```julia
von_neumann_entropy( ρ :: AbstractMatrix ) :: Float64
```

The von Neumann entropy of a density matrix:
\[S(\rho) = - \sum_j \lambda_j \log_2(\lambda_j)\]
where $\lambda_j$ are the eigenvalues of quantum state $\rho$.
A `DomainError` is thrown if `ρ` does not satisfy `is_density_matrix`.
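The eigenvalue formula can be checked with a short NumPy sketch (illustrative only, not the package's implementation):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -sum_j lambda_j * log2(lambda_j) over the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)        # rho is Hermitian
    nz = evals[evals > 1e-12]              # 0*log2(0) = 0
    return float(-np.sum(nz * np.log2(nz)))

# A pure state has zero entropy; the maximally mixed qubit has entropy 1 bit.
pure = np.array([[1.0, 0.0], [0.0, 0.0]])
mixed = np.eye(2) / 2
print(von_neumann_entropy(pure), von_neumann_entropy(mixed))
```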
QBase.joint_entropy — Function

```julia
joint_entropy( priors :: AbstractVector, conditionals :: AbstractMatrix ) :: Float64
```

Returns the entropy of the joint probability distribution $p(x,y)$ constructed from the priors $p(x)$ and conditionals $p(y|x)$. The joint entropy is the `shannon_entropy` of the joint probability distribution:
\[S(X,Y) = - \sum_{x,y} p(x,y) \log_2(p(x,y))\]
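A minimal NumPy sketch of this formula, assuming the conditionals matrix is indexed as `conditionals[y, x] = p(y|x)` (an assumption about orientation, not taken from the source):

```python
import numpy as np

def joint_entropy(priors, conditionals):
    """Shannon entropy of p(x,y) = p(y|x) * p(x).

    Assumes conditionals[y, x] = p(y|x), i.e. columns are indexed by x."""
    priors = np.asarray(priors, dtype=float)
    joint = np.asarray(conditionals, dtype=float) * priors   # broadcast over columns
    nz = joint[joint > 0]
    return float(-np.sum(nz * np.log2(nz)))

# Two independent uniform bits: S(X,Y) = 2 bits.
print(joint_entropy([0.5, 0.5], [[0.5, 0.5], [0.5, 0.5]]))
```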
QBase.conditional_entropy — Function

```julia
conditional_entropy( priors :: AbstractVector, conditionals :: AbstractMatrix ) :: Float64
```

Returns the conditional entropy for the system with specified priors $p(x)$ and conditionals $p(y|x)$:
\[S(Y|X) = - \sum_{x,y} p(x,y)\log_2\left(p(y|x)\right)\]
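The same quantity in a NumPy sketch, again assuming `conditionals[y, x] = p(y|x)` (an illustrative convention, not the package's API):

```python
import numpy as np

def conditional_entropy(priors, conditionals):
    """S(Y|X) = -sum_{x,y} p(x,y) * log2 p(y|x), with conditionals[y, x] = p(y|x)."""
    priors = np.asarray(priors, dtype=float)
    cond = np.asarray(conditionals, dtype=float)
    joint = cond * priors                  # joint[y, x] = p(y|x) * p(x)
    mask = joint > 0
    return float(-np.sum(joint[mask] * np.log2(cond[mask])))

# A noiseless channel leaves no uncertainty about Y once X is known.
print(conditional_entropy([0.5, 0.5], [[1.0, 0.0], [0.0, 1.0]]))
```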
Information
QBase.holevo_bound — Function

```julia
holevo_bound(
    priors :: AbstractVector,
    ρ_states :: Vector{<:AbstractMatrix}
) :: Float64
```

The Holevo theorem places a bound on the `mutual_information`, $I(X : Y) \leq \mathcal{X}$: it limits the amount of information that can be decoded from a set of quantum states. For a mixed state $\rho = \sum_i p_i \rho_i$, the Holevo bound $\mathcal{X}$ is
\[\mathcal{X} := S(\rho) - \sum_i p_i S(\rho_i)\]
where $S(\rho)$ is the `von_neumann_entropy`.
A DomainError is thrown if:
- `priors` does not satisfy `is_probability_distribution`.
- Any `ρ ∈ ρ_states` does not satisfy `is_density_matrix`.
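The bound can be computed directly from the definition; the following NumPy sketch (not the QBase implementation) checks it on orthogonal pure states, where the bound reaches its maximum of one bit:

```python
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    nz = evals[evals > 1e-12]
    return float(-np.sum(nz * np.log2(nz)))

def holevo_bound(priors, rho_states):
    """X = S(sum_i p_i rho_i) - sum_i p_i S(rho_i)."""
    mixture = sum(p * rho for p, rho in zip(priors, rho_states))
    return von_neumann_entropy(mixture) - sum(
        p * von_neumann_entropy(rho) for p, rho in zip(priors, rho_states)
    )

# |0> and |1> encoded with equal priors: the bound is 1 bit.
rho0 = np.array([[1.0, 0.0], [0.0, 0.0]])
rho1 = np.array([[0.0, 0.0], [0.0, 1.0]])
print(holevo_bound([0.5, 0.5], [rho0, rho1]))
```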
QBase.mutual_information — Function

```julia
mutual_information(
    priors :: AbstractVector,
    conditionals :: AbstractMatrix
) :: Float64
```

The information shared between the distributions $p(x)$ and $p(y)$. The mutual information is computed directly from the `shannon_entropy` and `joint_entropy`:
\[I(X : Y) = S(X) + S(Y) - S(X,Y)\]
```julia
mutual_information(
    priors :: AbstractVector,
    ρ_states :: Vector{<:AbstractMatrix},
    Π :: AbstractVector
) :: Float64
```

Computes the classical mutual information for a quantum state-and-measurement encoding and decoding scheme. The conditional probabilities are obtained from the quantum states and measurements using `measure`.
A DomainError is thrown if:
- `priors` does not satisfy `is_probability_distribution`.
- Any `ρ ∈ ρ_states` does not satisfy `is_density_matrix`.
- `Π` does not satisfy `is_povm`.
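For the quantum method, the conditionals follow the Born rule, $p(y|x) = \text{Tr}[\Pi_y \rho_x]$, and the classical formula $I(X : Y) = S(X) + S(Y) - S(X,Y)$ is then applied. A NumPy sketch (illustrative only):

```python
import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    nz = p[p > 1e-15]
    return float(-np.sum(nz * np.log2(nz)))

def mutual_information(priors, rho_states, povm):
    """I(X:Y) = S(X) + S(Y) - S(X,Y), with p(y|x) = Tr[Pi_y rho_x]."""
    priors = np.asarray(priors, dtype=float)
    cond = np.array([[np.trace(Pi @ rho).real for rho in rho_states] for Pi in povm])
    joint = cond * priors                      # joint[y, x] = p(y|x) * p(x)
    return (shannon_entropy(priors)
            + shannon_entropy(joint.sum(axis=1))   # marginal p(y)
            - shannon_entropy(joint.flatten()))

# |0>, |1> measured in the computational basis recover 1 bit.
rho0 = np.array([[1.0, 0.0], [0.0, 0.0]])
rho1 = np.array([[0.0, 0.0], [0.0, 1.0]])
print(mutual_information([0.5, 0.5], [rho0, rho1], [rho0, rho1]))
```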
State Discrimination
QBase.success_probability — Function

```julia
success_probability(
    priors :: AbstractVector,
    states :: Vector{<:AbstractMatrix},
    Π :: Vector{<:AbstractMatrix}
) :: Float64
```

The probability of correctly distinguishing quantum states with the specified POVM:
\[P_{\text{Success}} = \sum_{i=1}^n p_i \text{Tr}[\Pi_i \rho_i].\]
The number of states must match the number of POVM elements.
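The sum above is straightforward to evaluate; here is a NumPy sketch (not the package's implementation), checked on orthogonal states with their matching projectors:

```python
import numpy as np

def success_probability(priors, states, povm):
    """P_success = sum_i p_i * Tr[Pi_i rho_i]."""
    return sum(p * np.trace(Pi @ rho).real
               for p, rho, Pi in zip(priors, states, povm))

# Orthogonal states discriminated by the matching projectors: P_success = 1.
rho0 = np.array([[1.0, 0.0], [0.0, 0.0]])
rho1 = np.array([[0.0, 0.0], [0.0, 1.0]])
print(success_probability([0.5, 0.5], [rho0, rho1], [rho0, rho1]))
```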
QBase.error_probability — Function

```julia
error_probability(
    priors :: AbstractVector,
    ρ_states :: Vector{<:AbstractMatrix},
    Π :: AbstractVector{<:AbstractMatrix}
) :: Float64
```

The probability of incorrectly distinguishing quantum states with the specified POVM. This quantity is simply $P_{\text{Error}} = 1 - P_{\text{Success}}$.