TorchQuantum Qubit Rotation Tutorial#

Note: This tutorial was adapted from PennyLane's Basic tutorial: qubit rotation by Josh Izaac.

To see how TorchQuantum allows the easy construction and optimization of quantum functions, let’s consider the simple case of qubit rotation.

The task at hand is to optimize two rotation gates in order to flip a single qubit from state |0⟩ to state |1⟩.

The quantum circuit#

In the qubit rotation example, we wish to implement the following quantum circuit:

(Circuit diagram: a single qubit, initially in \(|0\rangle\), passes through \(R_x(\phi_1)\) and \(R_y(\phi_2)\) and is then measured in the Pauli-Z basis.)

Breaking this down step by step, we start with a qubit in the ground state \(|0⟩ = [1\ 0]^T\), and rotate it around the x-axis by applying the gate

\(\begin{split}R_x(\phi_1) = e^{-i \phi_1 \sigma_x /2} = \begin{bmatrix} \cos \frac{\phi_1}{2} & -i \sin \frac{\phi_1}{2} \\ -i \sin \frac{\phi_1}{2} & \cos \frac{\phi_1}{2} \end{bmatrix}, \end{split}\)

and then around the y-axis via the gate

\(\begin{split}R_y(\phi_2) = e^{-i \phi_2 \sigma_y/2} = \begin{bmatrix} \cos \frac{\phi_2}{2} & - \sin \frac{\phi_2}{2} \\ \sin \frac{\phi_2}{2} & \cos \frac{\phi_2}{2} \end{bmatrix}.\end{split}\)

After these operations, the qubit is in the state

\(| \psi \rangle = R_y(\phi_2) R_x(\phi_1) | 0 \rangle.\)

Finally, we measure the expectation value \(\langle \psi \mid \sigma_z \mid \psi \rangle\) of the Pauli-Z operator

\(\begin{split}\sigma_z = \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}.\end{split}\)

Using the above to calculate the exact expectation value, we find that

\(\langle \psi \mid \sigma_z \mid \psi \rangle = \langle 0 \mid R_x(\phi_1)^\dagger R_y(\phi_2)^\dagger \sigma_z R_y(\phi_2) R_x(\phi_1) \mid 0 \rangle = \cos(\phi_1)\cos(\phi_2).\)

Depending on the circuit parameters \(\phi_1\) and \(\phi_2\), the output expectation value lies between 1 (if \(|\psi\rangle = |0\rangle\)) and −1 (if \(|\psi\rangle = |1\rangle\)).
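
As a quick sanity check, here is a small standalone NumPy sketch (not part of the original tutorial) that builds the two rotation matrices for the parameters \(\phi_1 = 0.54\) and \(\phi_2 = 0.12\) used later, and confirms that the simulated expectation value matches \(\cos(\phi_1)\cos(\phi_2)\):

import numpy as np

phi1, phi2 = 0.54, 0.12  # example parameters reused later in this tutorial

def rx_matrix(phi):
    # R_x(phi) = exp(-i * phi * sigma_x / 2)
    return np.array([[np.cos(phi / 2), -1j * np.sin(phi / 2)],
                     [-1j * np.sin(phi / 2), np.cos(phi / 2)]])

def ry_matrix(phi):
    # R_y(phi) = exp(-i * phi * sigma_y / 2)
    return np.array([[np.cos(phi / 2), -np.sin(phi / 2)],
                     [np.sin(phi / 2), np.cos(phi / 2)]])

sigma_z = np.diag([1.0, -1.0])
psi = ry_matrix(phi2) @ rx_matrix(phi1) @ np.array([1.0, 0.0])  # |psi> = R_y R_x |0>

print(np.real(psi.conj() @ sigma_z @ psi))  # ≈ 0.8515
print(np.cos(phi1) * np.cos(phi2))          # ≈ 0.8515, same value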

Let’s see how we can easily implement and optimize this circuit using TorchQuantum.

Importing TorchQuantum#

The first thing we need to do is install and import TorchQuantum. To utilize all of TorchQuantum’s features, install it from source.

[1]:
!git clone https://github.com/mit-han-lab/torchquantum.git
!cd torchquantum && pip install --editable .
Cloning into 'torchquantum'...
Successfully installed cryptography-41.0.2 dill-0.3.4 loguru-0.7.0 msgpack-numpy-0.4.8 multimethod-1.9.1 multiprocess-0.70.12.2 pathos-0.2.8 pbr-5.11.1 ply-3.11 pox-0.3.2 ppft-1.7.6.6 pylatexenc-2.10 pyspnego-0.9.1 qiskit-0.38.0 qiskit-aer-0.11.0 qiskit-ibmq-provider-0.19.2 qiskit-terra-0.21.2 requests-ntlm-1.2.0 retworkx-0.13.0 rustworkx-0.13.0 stevedore-5.1.0 symengine-0.10.0 tensorpack-0.11 torchdiffeq-0.2.3 torchpack-0.3.1 torchquantum-0.1.7 tweedledum-1.1.1 websockets-11.0.3

Note: To be able to install TorchQuantum on Colab, you must restart your runtime before continuing!

After installing from source (and restarting if using Colab!), you can import TorchQuantum.

[1]:
import torchquantum as tq

Creating a device#

Before we can construct our quantum node, we need to initialize a device.

Definition

Any computational object that can apply quantum operations and return a measurement value is called a quantum device.

Devices are loaded in TorchQuantum via the class QuantumDevice().

For this tutorial, we are using the qubit model, so let’s initialize the ‘default’ device provided by TorchQuantum.

[4]:
qdev = tq.QuantumDevice(
    n_wires=1, device_name="default", bsz=1, device="cuda", record_op=True
)

For all devices, QuantumDevice() accepts the following arguments:

  • n_wires: number of qubits to initialize the device with

  • device_name: name of the quantum device to be loaded

  • bsz: batch size of the quantum state

  • device: which classical computing device to use, ‘cpu’ or ‘cuda’ (similar to the device option in PyTorch)

  • record_op: whether to record the operations applied to the device so that they can later be used to construct a static computation graph

Here, as we only require a single qubit for this example, we set n_wires=1.
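
If a GPU is not available, the same device can be created on the CPU. As a small illustrative sketch (assuming the get_states_1d() helper, which recent TorchQuantum versions provide for inspecting the state vector), the device starts in the ground state \(|0\rangle\):

import torchquantum as tq

# a CPU-backed single-qubit device with a batch size of 1
qdev_cpu = tq.QuantumDevice(n_wires=1, bsz=1, device="cpu")

# the state vector has shape [bsz, 2**n_wires]; expected: tensor([[1.+0.j, 0.+0.j]])
print(qdev_cpu.get_states_1d())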

Constructing the Circuit#

Now that we have initialized our device, we can begin to construct the circuit. In TorchQuantum there are multiple ways to construct a circuit; we will explore a few of them.

[5]:
# specify parameters
params = [0.54, 0.12]

# create circuit
qdev.rx(params=params[0], wires=0)
qdev.ry(params=params[1], wires=0)

This method calls the gates directly from the QuantumDevice. For each rotation, we specify the wire it acts on (zero-indexed) and a parameter for the rotation angle. The rotation gates also accept several other arguments:

  • wires: which qubits the gate is applied to

  • theta: the rotation angle

  • n_wires: number of qubits the gate is applied to

  • static: whether to use static-mode computation

  • parent_graph: the parent QuantumGraph of the current operation

  • inverse: whether to apply the inverse of the gate

  • comp_method: whether to use the 'bmm' or 'einsum' method to perform the matrix-vector multiplication
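
Another way to construct the same circuit is through the torchquantum.functional module, which applies gates to a device passed as the first argument. A minimal sketch, assuming the functional gates mirror the device methods used above:

import torchquantum as tq
import torchquantum.functional as tqf

qdev2 = tq.QuantumDevice(n_wires=1)

# apply the same rotations via the functional interface
tqf.rx(qdev2, wires=0, params=0.54)
tqf.ry(qdev2, wires=0, params=0.12)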

To obtain the expectation value, we can use two different functions from torchquantum's measurement module.

[6]:
from torchquantum.measurement import expval_joint_analytical, expval_joint_sampling

  • expval_joint_analytical computes the expectation value of a joint observable analytically, assuming the statevector is available. This can only be run on a classical simulator, not on real quantum hardware.

  • expval_joint_sampling computes the expectation value of a joint observable by sampling measurement bitstrings. This can be run on both a classical simulator and real quantum hardware. Since it samples measurements, it requires a parameter for the number of shots, n_shots.

[7]:
exp_a = expval_joint_analytical(qdev, "Z")
exp_s = expval_joint_sampling(qdev, "Z", n_shots=1024)

print(exp_a, exp_s)
tensor([0.8515]) tensor([0.8184])

The two numbers are close, and if we increase the number of shots for the joint sampling, the sampled expectation value should converge to the analytical one.
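
To see this convergence, one could sweep the shot count; a quick illustrative sketch (the sampled values vary from run to run):

import torchquantum as tq
from torchquantum.measurement import expval_joint_analytical, expval_joint_sampling

# rebuild the circuit from above
qdev = tq.QuantumDevice(n_wires=1)
qdev.rx(params=0.54, wires=0)
qdev.ry(params=0.12, wires=0)

print("analytical:", expval_joint_analytical(qdev, "Z"))
for n_shots in (100, 1_000, 10_000, 100_000):
    # the sampled estimate approaches the analytical value as n_shots grows
    print(n_shots, expval_joint_sampling(qdev, "Z", n_shots=n_shots))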

Calculating quantum gradients#

From the expectation values printed above, notice that the analytical expectation value carries an automatically calculated gradient, which can be used when constructing quantum machine learning models; TorchQuantum computes these gradients automatically through PyTorch's autograd. Let's find the gradient with respect to each individual gate.

To do so, we can create the circuit slightly differently, saving each operation as a variable and then applying it to the device. We can then once again get the expectation value with expval_joint_analytical.

[8]:
qdev = tq.QuantumDevice(n_wires=1)

op1 = tq.RX(has_params=True, trainable=True, init_params=0.54)
op1(qdev, wires=0)

op2 = tq.RY(has_params=True, trainable=True, init_params=0.12)
op2(qdev, wires=0)


expval = expval_joint_analytical(qdev, "Z")

We can then call .backward() on the expectation value, just like in PyTorch. Afterwards, we can read the gradient of each operation from its params.grad attribute.

[9]:
expval[0].backward()

# calculate the gradients for each operation!
print(op1.params.grad, op2.params.grad)
tensor([[-0.5104]]) tensor([[-0.1027]])
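
These values agree with the analytic derivatives of \(\cos(\phi_1)\cos(\phi_2)\). A quick check in plain Python (added here for illustration):

import math

phi1, phi2 = 0.54, 0.12
print(-math.sin(phi1) * math.cos(phi2))  # d<Z>/dphi1 ≈ -0.5104
print(-math.cos(phi1) * math.sin(phi2))  # d<Z>/dphi2 ≈ -0.1027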

Optimization#

Next, let’s make use of PyTorch’s optimizers to optimize the two circuit parameters \(\phi_1\) and \(\phi_2\) such that the qubit, originally in state |0⟩, is rotated to be in state |1⟩. This is equivalent to measuring a Pauli-Z expectation value of -1, since the state |1⟩ is an eigenvector of the Pauli-Z matrix with eigenvalue λ=−1.
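
Explicitly, with \(|1\rangle = [0\ 1]^T\),

\(\begin{split}\sigma_z |1\rangle = \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix} \begin{bmatrix} 0 \\ 1 \end{bmatrix} = -|1\rangle, \qquad \langle 1 \mid \sigma_z \mid 1 \rangle = -1.\end{split}\)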

To construct this circuit, we can define a class that extends PyTorch's nn.Module. We begin by importing torch.

[38]:
import torch

We can next create the class, extending the PyTorch module, and add our gates in a similar fashion to the previous steps.

[39]:
import torchquantum as tq
import torchquantum.functional as tqf
from torchquantum.measurement import expval_joint_analytical


class OptimizationModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.rx0 = tq.RX(has_params=True, trainable=True, init_params=0.011)
        self.ry0 = tq.RY(has_params=True, trainable=True, init_params=0.012)

    def forward(self):
        # create a quantum device to run the gates
        qdev = tq.QuantumDevice(n_wires=1)

        # add some trainable gates (need to instantiate ahead of time)
        self.rx0(qdev, wires=0)
        self.ry0(qdev, wires=0)

        return expval_joint_analytical(qdev, "Z")

To optimize the rotation, we need to define a cost function. By minimizing the cost function, the optimizer will determine the values of the circuit parameters that produce the desired outcome.

In this case, our desired outcome is a Pauli-Z expectation value of −1. Since we know that the Pauli-Z expectation value is bounded in [−1, 1], we can define our cost directly as the output of the circuit.

Similar to PyTorch, we can create a train function to compute the gradients of the loss function and have the optimizer perform an optimization step.

[53]:
def train(model, device, optimizer):
    # the cost is the circuit's Pauli-Z expectation value itself;
    # minimizing it drives the qubit toward |1>, where it equals -1
    # (`device` is accepted for a conventional signature but unused here)
    outputs = model()
    loss = outputs

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    return loss.item()

Finally, we can run the model. We use PyTorch's stochastic gradient descent optimizer, torch.optim.SGD, to optimize our model.

[54]:
def main():
    seed = 0
    torch.manual_seed(seed)

    use_cuda = torch.cuda.is_available()
    device = torch.device("cuda" if use_cuda else "cpu")

    model = OptimizationModel()
    n_epochs = 200
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    for epoch in range(1, n_epochs + 1):
        # train
        loss = train(model, device, optimizer)
        output = (model.rx0.params[0].item(), model.ry0.params[0].item())
        print(f"Epoch {epoch}: {output}")

        if epoch % 10 == 0:
            print(f"Loss after step {epoch}: {loss}")

Finally, we can call the main function and run the entire sequence!

[55]:
main()
Epoch 1: (0.012099898420274258, 0.013199898414313793)
Epoch 2: (0.013309753499925137, 0.014519752934575081)
Epoch 3: (0.014640549197793007, 0.015971548855304718)
Epoch 4: (0.01610436476767063, 0.017568465322256088)
Epoch 5: (0.01771448366343975, 0.01932499371469021)
Epoch 6: (0.019485509023070335, 0.02125706896185875)
Epoch 7: (0.021433496847748756, 0.023382212966680527)
Epoch 8: (0.023576095700263977, 0.02571968361735344)
Epoch 9: (0.025932706892490387, 0.028290653601288795)
Epoch 10: (0.028524650260806084, 0.03111839108169079)
Loss after step 10: 0.9992638230323792
Epoch 11: (0.031375348567962646, 0.03422846272587776)
Epoch 12: (0.0345105305314064, 0.03764895722270012)
Epoch 13: (0.037958454340696335, 0.041410721838474274)
Epoch 14: (0.04175013676285744, 0.04554762691259384)
Epoch 15: (0.04591960832476616, 0.050096847116947174)
Epoch 16: (0.05050419643521309, 0.055099159479141235)
Epoch 17: (0.05554480850696564, 0.06059926748275757)
Epoch 18: (0.06108624115586281, 0.06664614379405975)
Epoch 19: (0.06717751175165176, 0.07329340279102325)
Epoch 20: (0.07387219369411469, 0.08059966564178467)
Loss after step 20: 0.9950657486915588
Epoch 21: (0.08122873306274414, 0.08862894773483276)
Epoch 22: (0.08931083232164383, 0.09745106101036072)
Epoch 23: (0.09818772971630096, 0.10714197158813477)
Epoch 24: (0.10793451964855194, 0.11778417229652405)
Epoch 25: (0.11863239109516144, 0.12946699559688568)
Epoch 26: (0.13036876916885376, 0.14228680729866028)
Epoch 27: (0.14323736727237701, 0.15634718537330627)
Epoch 28: (0.15733805298805237, 0.17175881564617157)
Epoch 29: (0.172776460647583, 0.18863925337791443)
Epoch 30: (0.18966329097747803, 0.20711229741573334)
Loss after step 30: 0.9676356315612793
Epoch 31: (0.20811320841312408, 0.2273070216178894)
Epoch 32: (0.2282431572675705, 0.2493562251329422)
Epoch 33: (0.2501700222492218, 0.27339422702789307)
Epoch 34: (0.2740074098110199, 0.29955384135246277)
Epoch 35: (0.2998615801334381, 0.32796236872673035)
Epoch 36: (0.32782599329948425, 0.35873648524284363)
Epoch 37: (0.3579748272895813, 0.39197587966918945)
Epoch 38: (0.3903552293777466, 0.427755743265152)
Epoch 39: (0.42497843503952026, 0.46611812710762024)
Epoch 40: (0.4618101119995117, 0.5070626139640808)
Loss after step 40: 0.8138567805290222
Epoch 41: (0.5007606744766235, 0.5505368709564209)
Epoch 42: (0.5416762828826904, 0.5964280366897583)
Epoch 43: (0.5843320488929749, 0.6445562839508057)
Epoch 44: (0.6284285187721252, 0.6946715116500854)
Epoch 45: (0.6735928058624268, 0.7464552521705627)
Epoch 46: (0.7193858623504639, 0.7995281219482422)
Epoch 47: (0.7653157711029053, 0.8534636497497559)
Epoch 48: (0.8108565211296082, 0.9078077673912048)
Epoch 49: (0.8554709553718567, 0.9621021151542664)
Epoch 50: (0.8986347317695618, 1.0159088373184204)
Loss after step 50: 0.3750203549861908
Epoch 51: (0.9398593902587891, 1.0688340663909912)
Epoch 52: (0.9787107706069946, 1.1205471754074097)
Epoch 53: (1.0148218870162964, 1.1707944869995117)
Epoch 54: (1.0478986501693726, 1.2194054126739502)
Epoch 55: (1.0777196884155273, 1.2662931680679321)
Epoch 56: (1.1041301488876343, 1.311449408531189)
Epoch 57: (1.127032995223999, 1.354935884475708)
Epoch 58: (1.1463772058486938, 1.3968735933303833)
Epoch 59: (1.1621465682983398, 1.4374314546585083)
Epoch 60: (1.1743487119674683, 1.4768157005310059)
Loss after step 60: 0.052838265895843506
Epoch 61: (1.1830050945281982, 1.5152597427368164)
Epoch 62: (1.1881437301635742, 1.553015947341919)
Epoch 63: (1.1897931098937988, 1.590348243713379)
Epoch 64: (1.1879782676696777, 1.6275262832641602)
Epoch 65: (1.1827187538146973, 1.6648198366165161)
Epoch 66: (1.1740283966064453, 1.702493667602539)
Epoch 67: (1.1619168519973755, 1.7408030033111572)
Epoch 68: (1.146392583847046, 1.7799879312515259)
Epoch 69: (1.1274679899215698, 1.820267915725708)
Epoch 70: (1.1051654815673828, 1.8618348836898804)
Loss after step 70: -0.10590392351150513
Epoch 71: (1.0795255899429321, 1.9048453569412231)
Epoch 72: (1.0506160259246826, 1.9494123458862305)
Epoch 73: (1.018541693687439, 1.9955958127975464)
Epoch 74: (0.9834545850753784, 2.043394088745117)
Epoch 75: (0.9455628991127014, 2.0927350521087646)
Epoch 76: (0.9051381945610046, 2.1434707641601562)
Epoch 77: (0.8625186085700989, 2.1953752040863037)
Epoch 78: (0.8181073665618896, 2.2481465339660645)
Epoch 79: (0.7723652124404907, 2.30141544342041)
Epoch 80: (0.7257967591285706, 2.354759931564331)
Loss after step 80: -0.4779837727546692
Epoch 81: (0.6789312362670898, 2.4077253341674805)
Epoch 82: (0.6322994232177734, 2.459847927093506)
Epoch 83: (0.5864096879959106, 2.5106801986694336)
Epoch 84: (0.5417252779006958, 2.5598134994506836)
Epoch 85: (0.4986463487148285, 2.6068966388702393)
Epoch 86: (0.4574976861476898, 2.6516494750976562)
Epoch 87: (0.4185234606266022, 2.6938676834106445)
Epoch 88: (0.38188809156417847, 2.7334227561950684)
Epoch 89: (0.34768232703208923, 2.770256519317627)
Epoch 90: (0.31593260169029236, 2.8043713569641113)
Loss after step 90: -0.876086413860321
Epoch 91: (0.28661224246025085, 2.835820436477661)
Epoch 92: (0.25965315103530884, 2.8646953105926514)
Epoch 93: (0.23495660722255707, 2.891116142272949)
Epoch 94: (0.21240299940109253, 2.915221691131592)
Epoch 95: (0.1918598860502243, 2.937161445617676)
Epoch 96: (0.1731884628534317, 2.957089900970459)
Epoch 97: (0.156248539686203, 2.97516131401062)
Epoch 98: (0.14090220630168915, 2.991525888442993)
Epoch 99: (0.12701639533042908, 3.0063281059265137)
Epoch 100: (0.11446458846330643, 3.019704818725586)
Loss after step 100: -0.9828835129737854
Epoch 101: (0.1031278446316719, 3.0317838191986084)
Epoch 102: (0.09289533644914627, 3.042684316635132)
Epoch 103: (0.08366449177265167, 3.052516460418701)
Epoch 104: (0.07534093409776688, 3.0613811016082764)
Epoch 105: (0.0678381696343422, 3.069370985031128)
Epoch 106: (0.06107722595334053, 3.0765702724456787)
Epoch 107: (0.054986197501420975, 3.0830557346343994)
Epoch 108: (0.0494997613132, 3.088897228240967)
Epoch 109: (0.04455867409706116, 3.0941579341888428)
Epoch 110: (0.040109291672706604, 3.0988948345184326)
Loss after step 110: -0.997883677482605
Epoch 111: (0.03610309213399887, 3.1031599044799805)
Epoch 112: (0.03249623253941536, 3.106999635696411)
Epoch 113: (0.029249126091599464, 3.1104564666748047)
Epoch 114: (0.026326047256588936, 3.1135683059692383)
Epoch 115: (0.023694779723882675, 3.1163694858551025)
Epoch 116: (0.021326277405023575, 3.1188907623291016)
Epoch 117: (0.019194360822439194, 3.1211602687835693)
Epoch 118: (0.01727544330060482, 3.1232030391693115)
Epoch 119: (0.015548276714980602, 3.1250417232513428)
Epoch 120: (0.013993724249303341, 3.1266965866088867)
Loss after step 120: -0.9997422099113464
Epoch 121: (0.012594552710652351, 3.128185987472534)
Epoch 122: (0.011335243470966816, 3.1295266151428223)
Epoch 123: (0.010201825760304928, 3.130733013153076)
Epoch 124: (0.00918172113597393, 3.131819009780884)
Epoch 125: (0.008263605646789074, 3.132796287536621)
Epoch 126: (0.0074372864328324795, 3.1336758136749268)
Epoch 127: (0.006693588104099035, 3.134467363357544)
Epoch 128: (0.006024251226335764, 3.1351797580718994)
Epoch 129: (0.005421841982752085, 3.1358211040496826)
Epoch 130: (0.004879669286310673, 3.1363983154296875)
Loss after step 130: -0.9999685883522034
Epoch 131: (0.004391710739582777, 3.13691782951355)
Epoch 132: (0.0039525460451841354, 3.137385368347168)
Epoch 133: (0.0035572960041463375, 3.1378061771392822)
Epoch 134: (0.003201569663360715, 3.1381847858428955)
Epoch 135: (0.0028814151883125305, 3.1385254859924316)
Epoch 136: (0.0025932753924280405, 3.1388320922851562)
Epoch 137: (0.002333949087187648, 3.139108180999756)
Epoch 138: (0.002100554993376136, 3.1393566131591797)
Epoch 139: (0.0018905001925304532, 3.139580249786377)
Epoch 140: (0.0017014506738632917, 3.1397814750671387)
Loss after step 140: -0.9999963045120239
Epoch 141: (0.001531305955722928, 3.139962673187256)
Epoch 142: (0.0013781756861135364, 3.1401257514953613)
Epoch 143: (0.0012403583386912942, 3.140272378921509)
Epoch 144: (0.0011163227027282119, 3.140404462814331)
Epoch 145: (0.0010046905372291803, 3.1405231952667236)
Epoch 146: (0.0009042215533554554, 3.1406302452087402)
Epoch 147: (0.0008137994445860386, 3.1407265663146973)
Epoch 148: (0.0007324195466935635, 3.140813112258911)
Epoch 149: (0.0006591776036657393, 3.1408910751342773)
Epoch 150: (0.0005932598724029958, 3.140961170196533)
Loss after step 150: -0.9999995231628418
Epoch 151: (0.0005339339259080589, 3.141024351119995)
Epoch 152: (0.00048054056242108345, 3.1410810947418213)
Epoch 153: (0.000432486500358209, 3.141132354736328)
Epoch 154: (0.00038923785905353725, 3.1411783695220947)
Epoch 155: (0.00035031407605856657, 3.1412198543548584)
Epoch 156: (0.0003152826684527099, 3.1412570476531982)
Epoch 157: (0.0002837544016074389, 3.1412906646728516)
Epoch 158: (0.0002553789527155459, 3.1413209438323975)
Epoch 159: (0.00022984105453360826, 3.141348123550415)
Epoch 160: (0.00020685694471467286, 3.1413726806640625)
Loss after step 160: -1.0
Epoch 161: (0.0001861712516983971, 3.14139461517334)
Epoch 162: (0.0001675541279837489, 3.1414144039154053)
Epoch 163: (0.00015079871809575707, 3.141432285308838)
Epoch 164: (0.00013571884483098984, 3.1414482593536377)
Epoch 165: (0.0001221469574375078, 3.141462802886963)
Epoch 166: (0.00010993226169375703, 3.1414756774902344)
Epoch 167: (9.893903188640252e-05, 3.1414873600006104)
Epoch 168: (8.904512651497498e-05, 3.141497850418091)
Epoch 169: (8.014061313588172e-05, 3.141507387161255)
Epoch 170: (7.212655327748507e-05, 3.1415159702301025)
Loss after step 170: -1.0
Epoch 171: (6.491389649454504e-05, 3.141523599624634)
Epoch 172: (5.8422505389899015e-05, 3.1415305137634277)
Epoch 173: (5.258025339571759e-05, 3.1415367126464844)
Epoch 174: (4.732222805614583e-05, 3.1415421962738037)
Epoch 175: (4.2590003431541845e-05, 3.141547203063965)
Epoch 176: (3.833100345218554e-05, 3.1415517330169678)
Epoch 177: (3.4497901651775464e-05, 3.1415557861328125)
Epoch 178: (3.10481118503958e-05, 3.141559362411499)
Epoch 179: (2.794330066535622e-05, 3.1415627002716064)
Epoch 180: (2.5148970962618478e-05, 3.1415657997131348)
Loss after step 180: -1.0
Epoch 181: (2.263407441205345e-05, 3.141568422317505)
Epoch 182: (2.0370667698443867e-05, 3.141570806503296)
Epoch 183: (1.83336014742963e-05, 3.141572952270508)
Epoch 184: (1.6500242054462433e-05, 3.1415748596191406)
Epoch 185: (1.485021766711725e-05, 3.1415765285491943)
Epoch 186: (1.3365195627557114e-05, 3.141578197479248)
Epoch 187: (1.2028675882902462e-05, 3.1415796279907227)
Epoch 188: (1.0825808203662746e-05, 3.141580820083618)
Epoch 189: (9.743227565195411e-06, 3.1415820121765137)
Epoch 190: (8.76890499057481e-06, 3.14158296585083)
Loss after step 190: -1.0
Epoch 191: (7.89201476436574e-06, 3.1415839195251465)
Epoch 192: (7.1028134698281065e-06, 3.141584873199463)
Epoch 193: (6.392532213794766e-06, 3.1415855884552)
Epoch 194: (5.753278855991084e-06, 3.1415863037109375)
Epoch 195: (5.177951152290916e-06, 3.141587018966675)
Epoch 196: (4.660155809688149e-06, 3.141587495803833)
Epoch 197: (4.194140274194069e-06, 3.141587972640991)
Epoch 198: (3.7747263377241325e-06, 3.1415884494781494)
Epoch 199: (3.3972537494264543e-06, 3.1415889263153076)
Epoch 200: (3.057528374483809e-06, 3.141589403152466)
Loss after step 200: -1.0

We can see that the optimization converges after approximately 160 steps.

Substituting the final parameter values \(\phi_1 \approx 0\) and \(\phi_2 \approx \pi\) into the theoretical result \(⟨ψ∣σ_z∣ψ⟩ = \cos ϕ_1 \cos ϕ_2\), we can verify that this is indeed one possible set of circuit parameters that produces \(⟨ψ∣σ_z∣ψ⟩ = −1\), meaning the qubit has been rotated to the state |1⟩.
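
As a last numerical check, plugging the epoch-200 parameters printed above into the formula (plain Python):

import math

phi1, phi2 = 3.057528374483809e-06, 3.141589403152466  # final values from epoch 200
print(math.cos(phi1) * math.cos(phi2))  # ≈ -1.0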