---
title: Graphs and MBL
author: Jacob Ross
date: Dec 2018
geometry: margin=2cm
output: pdf_document
---
- Contents
    - Graph structure
    - The Laplacian
        - Properties of the Laplacian
    - Properties of $\aleph$
    - TODO
A graph state usually refers to a specific construction in quantum computing theory, where a state of several qubits is constructed from a given graph by applying controlled-phase (CZ) gates between pairs of qubits whose corresponding nodes share an edge. In this work I seek a reversal of this definition, whereby one constructs a graph from a given arbitrary state. The quantum mutual information (QMI) captures both classical and quantum correlations and provides rich detail into the structure of many-body states. The quantum mutual information of a pair of subsystems $A$ and $B$ is
$$
I_{AB} = S_A + S_B - S_{AB},
$$
where $S_A$ ($S_B$) is the von Neumann entropy of the reduced density matrix of subsystem $A$ ($B$), and $S_{AB}$ is the von Neumann entropy of the complete system. The QMI is not an entanglement measure, but it has the advantage of being completely basis-independent, and it provides a bound on the degree of correlation observable between the two systems.
Given the mutual information between each pair of subsystems of a many-body system, define
$$
W_{ij} = S(\rho_i) + S(\rho_j) - S(\rho_{ij})
$$
whose non-negative off-diagonal elements $W_{ij}$, $i \neq j$, are the mutual information between subsystems $i$ and $j$.
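As a concrete sketch (my own illustration, not code from this project), the $W$ matrix can be assembled for a small qubit chain by exact partial traces; `reduced_density`, `vn_entropy`, and `qmi_matrix` are hypothetical helper names:

```python
import numpy as np

def reduced_density(psi, keep, n):
    """Reduced density matrix of the qubits in `keep` for an n-qubit pure state."""
    psi = psi.reshape([2] * n)
    drop = [i for i in range(n) if i not in keep]
    # move kept axes to the front, flatten to a (kept, traced-out) matrix
    psi = np.transpose(psi, list(keep) + drop).reshape(2 ** len(keep), -1)
    return psi @ psi.conj().T

def vn_entropy(rho):
    """Von Neumann entropy in bits (base 2)."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return -np.sum(p * np.log2(p))

def qmi_matrix(psi, n):
    """W_ij = S(rho_i) + S(rho_j) - S(rho_ij), with zero diagonal."""
    S1 = [vn_entropy(reduced_density(psi, [i], n)) for i in range(n)]
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            Sij = vn_entropy(reduced_density(psi, [i, j], n))
            W[i, j] = W[j, i] = S1[i] + S1[j] - Sij
    return W
```

For a two-qubit Bell pair this gives $W_{01} = 1 + 1 - 0 = 2$, the maximal QMI between two qubits, while a product state gives $W = 0$.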
I apply this construction to the Pal-Huse model, a variation of the Heisenberg spin chain. The model is given by the Hamiltonian
$$
\mathcal{H} = J \sum_{\langle i,j \rangle} \hat{\vec{S}}_i \cdot \hat{\vec{S}}_j + \mathcal{W} \sum_i h_i \hat{S}_i^{z}
$$
where $J$ is the coupling strength between nearest neighbours (set to $1$ in this work to fix an energy scale), and the $h_i$ are random fields drawn uniformly from $[-1, 1]$, so that $\mathcal{W}$ sets the disorder strength.
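A minimal exact-diagonalization sketch of this Hamiltonian for a short open chain (assuming nearest-neighbour coupling and fields $h_i$ uniform on $[-1,1]$; `site_op` and `pal_huse_hamiltonian` are names of my own choosing):

```python
import numpy as np

# spin-1/2 operators
Sx = np.array([[0, 1], [1, 0]]) / 2
Sy = np.array([[0, -1j], [1j, 0]]) / 2
Sz = np.array([[1, 0], [0, -1]]) / 2
I2 = np.eye(2)

def site_op(op, i, n):
    """Embed a single-site operator at site i of an n-site chain."""
    ops = [I2] * n
    ops[i] = op
    out = ops[0]
    for o in ops[1:]:
        out = np.kron(out, o)
    return out

def pal_huse_hamiltonian(n, W, rng, J=1.0):
    """H = J sum_i S_i . S_{i+1} + W sum_i h_i S_i^z, open boundaries."""
    H = np.zeros((2 ** n, 2 ** n), dtype=complex)
    for i in range(n - 1):
        for S in (Sx, Sy, Sz):
            H += J * site_op(S, i, n) @ site_op(S, i + 1, n)
    h = rng.uniform(-1, 1, size=n)  # random fields, uniform on [-1, 1]
    for i in range(n):
        H += W * h[i] * site_op(Sz, i, n)
    return H
```

As a sanity check, at $\mathcal{W}=0$ and $n=2$ the spectrum is the familiar singlet/triplet pair $\{-3/4,\, 1/4\}$.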
Much extant work studies the variance of certain intensive parameters across the MBL transition. In this work, and indeed in the present literature, many of the distributions under study are heavily skewed, so the mean and variance alone are not enough to characterize them. I instead use the entropy of the distributions as a measure of distribution 'width' that is agnostic about the moments of the probability distribution. Define the entropy of a (binned) distribution $p$ as
$$
H[p] = -\sum_k p_k \ln p_k.
$$
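In practice this is just the Shannon entropy of a histogram; a minimal sketch (`distribution_entropy` is a hypothetical helper name):

```python
import numpy as np

def distribution_entropy(samples, bins=50):
    """Shannon entropy of the binned distribution of `samples`:
    a width measure that makes no reference to the moments."""
    counts, _ = np.histogram(samples, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))
```

A distribution concentrated in a single bin has zero entropy, while a sample spread evenly over all bins approaches the maximal value $\ln(\text{bins})$.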
Define the adjacency matrix of the state graph as the off-diagonal part of $W$,
$$
A_{ij} = (1 - \delta_{ij}) W_{ij}.
$$
In the ergodic phase, the QMI between arbitrary pairs is drawn from a narrow distribution. In the localized phase, the distribution of QMI broadens considerably. I suspect the rise in larger QMI values corresponds to greater correlations on small length scales, and the tails of the distribution to the drop in long-range correlations. I haven't yet looked at QMI versus distance, but the next results corroborate this interpretation:
Define the degree of a node in a graph as the sum of the weights of all the edges connected to it,
$$
d_i = \sum_{j \neq i} W_{ij}.
$$
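For the weighted state graph this is a one-line reduction over the $W$ matrix (a sketch; `node_degrees` is my own naming):

```python
import numpy as np

def node_degrees(W):
    """Degree of each node: total incident edge weight, diagonal excluded."""
    return W.sum(axis=1) - np.diag(W)
```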
In the ergodic phase, most nodes are equally well connected. The log-log plot suggests a power-law scaling of the probability of finding a node with a given degree, which is characteristic of scale-free graphs. In the localized phase, the degrees are generally weaker, consistent with a decay of entanglement with the rest of the system. A multimodal distribution is clearly visible, with a sharp cutoff.
In the ergodic phase, the spins are overwhelmingly likely to have near-unit von Neumann entropy, and are therefore highly entangled. The QMI is employed as a lens to find out where this entanglement lives: how the quantum state of a spin is distributed among the non-local degrees of freedom in the rest of the chain. In the localized phase, spins are more likely to have very low (even near-zero) entropy, suggesting that they could almost be factored out of the global quantum state. The localized phase has a distinct 'hump' just above zero entropy, and a thick tail trailing towards 1, showing that most spins are generally less entangled, but still bound in nonlocal subspaces. The 'hump' is also visible in the degree distribution. This is not surprising, as a nonzero single-spin entropy in a pure state reflects entanglement, so there should be a correlation between the degree (total QMI) and the von Neumann entropy (see later section).
The microscopic statistics of the entropy and the QMI are consistent with the construction of l-bits and the existence of clusters of entangled spins. The prospect of applying graph factorization to the state graph requires another line of inquiry, which directly examines the length scales of entanglement within the system. One way to do this is a recursive cluster-building algorithm (as has been explored in a recent arXiv preprint). Another is to consider a cornerstone of the graph partitioner's toolkit: the graph Laplacian.
Define the Laplacian of the graph by
$$
L = D - A,
$$
where $D$ is the diagonal matrix of node degrees, $D_{ii} = d_i$. One can furnish this definition with some useful properties:
- The graph Laplacian is the lattice approximation to the continuous Laplacian
- The graph Laplacian is positive semidefinite, with a nullspace of dimension equal to the number of connected components of the graph. Thus, a connected graph has exactly one zero eigenvalue.
- By analogy with the continuous Laplacian, which is diagonal in the Fourier basis, the eigenvectors of the graph Laplacian are commonly referred to as the "Fourier modes" of a function on the nodes of a graph, with the eigenvalues playing the role of the corresponding (real, non-negative) frequencies.
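These properties are straightforward to check numerically; a sketch under the definitions above (`graph_laplacian` is an assumed helper name):

```python
import numpy as np

def graph_laplacian(W):
    """L = D - A, with A the off-diagonal part of W and D the degree matrix."""
    A = W - np.diag(np.diag(W))
    D = np.diag(A.sum(axis=1))
    return D - A
```

By construction every row of $L$ sums to zero (the all-ones vector is the zero mode), and for a connected graph all remaining eigenvalues are strictly positive.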
The trace of the Laplacian - by analogy, the total intensity of the lattice function - also shows a clear change across the localization transition. Something weird is going on in the lower right, though...
The large body of literature on graph Laplacians would enable straightforward study of this object. In particular, the use of the Laplacian eigenvectors to approximate graph partitions may be related to the support of l-bits/Q-LIOMs. However, I have not made any advances in this direction. There are also some interpretational issues:
- The physical meaning of the 'degree' of a node is not something I've worked out, and
- it's not clear what function the Laplacian is decomposing, in the Fourier sense, so the Laplacian itself is physically ambiguous.
The final object I describe here is a variation on the Laplacian, with a better information-theoretic grounding. Define the matrix $\aleph$ whose diagonal entries are the single-site von Neumann entropies and whose off-diagonal entries are the pairwise QMI weights:
$$
\aleph_{ij} = \begin{cases} S(\rho_i), & i = j \\ W_{ij}, & i \neq j. \end{cases}
$$
Probability distributions of the eigenvalues of $\aleph$.
The spectrum shows a clear and distinct transition with increasing disorder. The off-diagonal elements, the QMI weights, alter the spectrum from the von Neumann entropy distribution to a remarkable 'tiered' structure, which resembles a combination of the degree and entropy distributions.
Currently I do not have a clear interpretation of this data, but the dramatic tiered structure of the (log) spectrum in the localized phase is in stark contrast with the narrow peak in the ergodic phase. The data is also visually much 'cleaner' than the Laplacian spectrum for the same data set.
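A sketch of how such a spectrum could be computed, assuming the reading above ($S(\rho_i)$ on the diagonal, $W_{ij}$ off it); `aleph_matrix` and `aleph_spectrum` are names of my own choosing:

```python
import numpy as np

def aleph_matrix(W, site_entropies):
    """Single-site von Neumann entropies on the diagonal,
    pairwise QMI weights off the diagonal."""
    A = W - np.diag(np.diag(W))
    return A + np.diag(site_entropies)

def aleph_spectrum(W, site_entropies):
    """Eigenvalues of the real symmetric aleph matrix, ascending."""
    return np.linalg.eigvalsh(aleph_matrix(W, site_entropies))
```

In the limit of vanishing QMI the spectrum reduces exactly to the single-site entropy distribution, which is consistent with the off-diagonal elements being what deforms it into the tiered structure.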
Probability distributions of the trace of $\aleph$ (upper left), the PDF of the log of the trace (upper right), and the Shannon entropy (lower row) of the distributions versus disorder.
The trace of $\aleph$ is simply the sum of the single-site von Neumann entropies.
Observe that for a pure bipartite system, the total entropy vanishes:
$$
I_{AB} = S_A + S_B - S_{AB} \implies S_A + S_B - I_{AB} = S_{AB} = 0,
$$
hence the sum of the reduced entropies minus the QMI must be zero. For multipartite systems, there should exist a hierarchy of conditions whereby the subsystem entropies, the pairwise QMI, and the QMI between tuples of subsystems all sum to the total system entropy: zero, in a pure state.
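This sum rule is easy to verify numerically for a small pure state; a sketch using the Schmidt decomposition (`schmidt_entropy` is a hypothetical helper):

```python
import numpy as np

def schmidt_entropy(psi, dA, dB):
    """Entanglement entropy (base 2) of a pure bipartite state,
    computed from its Schmidt coefficients."""
    s = np.linalg.svd(psi.reshape(dA, dB), compute_uv=False)
    p = s ** 2
    p = p[p > 1e-12]
    return -np.sum(p * np.log2(p))
```

For a Bell pair, $S_A = S_B = 1$ and $S_{AB} = 0$, so $I_{AB} = S_A + S_B = 2$, saturating the bound.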
Therefore define the 'missing' information as the difference between the sum of the single-site entropies and the total pairwise QMI.
The structure of the matrices themselves is also of interest. Here are some examples of the entries of $\aleph$, where lighter colours are larger entries.

Entries of $\aleph$.
TODO

- Graph structure
    - Centrality measures
    - Fiedler vectors/partitions
    - System composition as (generalized) graph product
- Connecting to contemporary studies
    - Area/volume law
    - l-bits ~ decomposition into (approximate) union of subgraphs?
- Length scaling
    - Plot distribution entropies vs $\mathcal{W}$ for various $L$ on the same plot
- Extensions
    - Bose-Hubbard model
    - Extended $\aleph$: two-body reduced entropies
    - Relation to correlation functions
    - Correlation lengths & graph linear algebra: length scales
- Other...
    - Time evolution of $\aleph$
    - What is the interpretation of these spectra?
    - Disorder temperature?