Replies: 3 comments 3 replies
-
Hi @JoeGibbs88, this should be possible a few ways using … If you have the TN of the circuit, note you can specify …
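The way matching outer indices turn composition into a trace can be sketched in plain `numpy`, with `einsum` labels standing in for quimb's index IDs (the random matrices here are purely illustrative stand-ins for circuit TNs):

```python
import numpy as np

n = 2  # number of qubits; dense operators act on 2**n dimensions
rng = np.random.default_rng(0)

# random dense stand-ins for two operator TNs (illustrative only)
A = rng.standard_normal((2**n, 2**n))
B = rng.standard_normal((2**n, 2**n))

# giving A's lower index and B's upper index the same label composes them;
# matching the remaining pair as well closes the loop into Tr[A B]
tr_AB = np.einsum('ab,ba->', A, B)
assert np.isclose(tr_AB, np.trace(A @ B))
```

The same label-matching is what the explicit `upper_ind_id` / `lower_ind_id` assignments below achieve at the tensor-network level.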
-
Yes, once you have extracted the TNs representing the unitaries from the circuit objects, the way they compose is defined by the way their edges are indexed. Since they all have matching outer indices, composing two (A & B) gives Tr[A B], but more than two makes the indices clash. What you need to do is align them like this:

```python
W = W_circ(n, depth, gate2=gate2, tags='W').get_uni()
D = D_circ(n, tags='D').get_uni()
Wdag = W.conj()

# for visualizing
W.add_tag('W')
D.add_tag('D')
Wdag.add_tag('Wdag')

W.upper_ind_id = '__a{}__'
W.lower_ind_id = '__b{}__'
D.upper_ind_id = '__b{}__'
D.lower_ind_id = '__c{}__'
# transpose by switching lower and upper!
Wdag.lower_ind_id = '__c{}__'
Wdag.upper_ind_id = '__a{}__'

(W | D | Wdag).draw(['W', 'D', 'Wdag'], show_inds='all')
```

There is the resulting drawing: *(image omitted)*
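As a dense-matrix sanity check of this index pattern — W carrying (`__a`, `__b`), D (`__b`, `__c`), Wdag (`__c`, `__a`) — every label appears exactly twice, so the contraction is a closed trace. A `numpy` sketch with random stand-ins for the circuit unitaries (not the quimb objects themselves):

```python
import numpy as np

n = 2
rng = np.random.default_rng(1)

# dense stand-ins for the three operators (hypothetical data)
W = rng.standard_normal((2**n, 2**n)) + 1j * rng.standard_normal((2**n, 2**n))
D = np.diag(rng.standard_normal(2**n))
Wdag = W.conj().T

# (a, b), (b, c), (c, a): all labels matched in pairs, scalar result
val = np.einsum('ab,bc,ca->', W, D, Wdag)
assert np.isclose(val, np.trace(W @ D @ Wdag))
```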
-
What I posted constructs …

```python
Wdag.lower_ind_id = '__c{}__'
Wdag.upper_ind_id = '__d{}__'
...
U = qtn.Tensor(
    data=U_dense.reshape([2] * (2 * n)),
    # explicitly put in lower and upper indices to close the trace
    inds=[f'__d{i}__' for i in range(n)] + [f'__a{i}__' for i in range(n)],
    tags={'U_TARGET'},
)
...
```

If you plot it you can check the trace is now closed through `U_TARGET`: *(image omitted)*
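A dense sanity check of the closed four-tensor loop this relabelling creates, `Tr[W D Wdag U]`, again with hypothetical `numpy` stand-ins (here `U` plays the role of `U_TARGET`, and we pretend it exactly matches the ansatz):

```python
import numpy as np

n = 2
d = 2**n
rng = np.random.default_rng(2)

# hypothetical unitary W (QR of a random matrix) and diagonal-phase D
W, _ = np.linalg.qr(rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d)))
D = np.diag(np.exp(1j * rng.standard_normal(d)))

V = W @ D @ W.conj().T  # the ansatz as a dense matrix
U = V.conj().T          # pretend the target exactly matches W D W†

# index cycle W(a,b) D(b,c) Wdag(c,d) U(d,a) -> Tr[W D Wdag U]
loop = np.einsum('ab,bc,cd,da->', W, D, W.conj().T, U)
assert np.isclose(loop, d)  # a perfect match gives trace = dimension
```

This is why such overlaps are normalised by the dimension when used as a fidelity-style cost.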
-
I would like to variationally learn a diagonalization of a short-time Hamiltonian evolution, i.e. exactly what is being done in your example 10, 'Tensor Network Training of Quantum Circuits', but with an ansatz of the form W D W^{\dagger} (see eqn (1) of https://arxiv.org/pdf/1910.04292.pdf).
In quimb you do not explicitly give parameters as an input to the parameterised unitary, so I am unsure how I would 'copy' the W unitary and apply its inverse later.
Another, tangential, reason this would be useful is training a translationally invariant quantum circuit, where you train one cell per layer that is repeated multiple times.
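One way to get the 'copy' behaviour without direct library support is to make the sharing structural: a single parameter vector generates W, and W† is derived from the same matrix, so the two ends of W D W† can never drift apart during optimisation. A minimal dense-matrix sketch (the `exp(iH)` parametrisation and all names here are a hypothetical illustration, not quimb's Circuit API):

```python
import numpy as np

def make_W(params, d):
    """Build a unitary W = exp(iH) from one real parameter vector.

    Since W and W.conj().T are both derived from the same params,
    the two ends of W D W† automatically stay in sync.
    """
    H = params.reshape(d, d)
    H = H + H.conj().T  # Hermitian generator
    evals, evecs = np.linalg.eigh(H)
    return (evecs * np.exp(1j * evals)) @ evecs.conj().T  # exp(iH)

d = 4  # hypothetical dimension (2 qubits)
rng = np.random.default_rng(3)
w_params = rng.standard_normal(d * d)  # shared parameters for W and W†
phis = rng.standard_normal(d)          # parameters of the diagonal D

W = make_W(w_params, d)
V = W @ np.diag(np.exp(1j * phis)) @ W.conj().T  # the W D W† ansatz

assert np.allclose(W @ W.conj().T, np.eye(d))  # W is unitary by construction
```

A training loop would then optimise a loss such as `1 - abs(np.trace(U_target.conj().T @ V)) / d` over `w_params` and `phis`; the same one-set-of-parameters-reused-in-many-places idea is what parameter sharing for a translationally invariant circuit looks like.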