Quantify dataset - examples

See also

The complete source code of this tutorial can be found in

Quantify dataset - examples.py.ipynb

Quantify dataset - examples.py.py

On this page we explore a series of datasets that comply with the Quantify dataset specification.

2D dataset example

We use mk_two_qubit_chevron_dataset() to generate our example dataset. Its source code is conveniently displayed in the drop-down below.

dataset = dataset_examples.mk_two_qubit_chevron_dataset()

assert dataset == round_trip_dataset(dataset)  # confirm read/write
dataset
<xarray.Dataset>
Dimensions:  (repetitions: 5, main_dim: 1200)
Coordinates:
    amp      (main_dim) float64 0.45 0.4534 0.4569 0.4603 ... 0.5431 0.5466 0.55
    time     (main_dim) float64 0.0 0.0 0.0 0.0 0.0 ... 1e-07 1e-07 1e-07 1e-07
Dimensions without coordinates: repetitions, main_dim
Data variables:
    pop_q0   (repetitions, main_dim) float64 0.5 0.5 0.5 ... 0.4886 0.4818 0.5
    pop_q1   (repetitions, main_dim) float64 0.5 0.5 0.5 ... 0.5243 0.5371 0.5
Attributes:
    tuid:                      20211208-140524-981-2bb6bd
    dataset_name:              
    dataset_state:             None
    timestamp_start:           None
    timestamp_end:             None
    quantify_dataset_version:  2.0.0
    software_versions:         {}
    relationships:             []
    json_serialize_exclude:    []

The data within this dataset can be easily visualized using xarray facilities; however, we first need to convert the Quantify dataset to a “gridded” version, as shown below.

Since our dataset contains multiple repetitions of the same experiment, it is convenient to visualize them on different plots.

dataset_gridded = dh.to_gridded_dataset(
    dataset,
    dimension="main_dim",
    coords_names=dattrs.get_main_coords(dataset),
)
dataset_gridded.pop_q0.plot.pcolormesh(x="amp", col="repetitions")
_ = dataset_gridded.pop_q1.plot.pcolormesh(x="amp", col="repetitions")
[Figures: pcolormesh of pop_q0 and pop_q1 vs. amp and time, one panel per repetition]

Among other features, xarray makes it possible to average along a dimension, which can be very convenient for averaging out some of the noise:

_ = dataset_gridded.pop_q0.mean(dim="repetitions").plot(x="amp")
[Figure: pop_q0 averaged over repetitions vs. amp]

A repetitions dimension can be indexed by a coordinate so that each repetition carries a specific label. To showcase this, we will modify the previous dataset by merging it with a dataset containing the relevant extra information.

coord_dims = ("repetitions",)
coord_values = ["A", "B", "C", "D", "E"]
dataset_indexed_rep = xr.Dataset(coords=dict(repetitions=(coord_dims, coord_values)))

dataset_indexed_rep
<xarray.Dataset>
Dimensions:      (repetitions: 5)
Coordinates:
  * repetitions  (repetitions) <U1 'A' 'B' 'C' 'D' 'E'
Data variables:
    *empty*
# merge with the previous dataset
dataset_rep = dataset_gridded.merge(dataset_indexed_rep, combine_attrs="drop_conflicts")

assert dataset_rep == round_trip_dataset(dataset_rep)  # confirm read/write

dataset_rep
<xarray.Dataset>
Dimensions:      (repetitions: 5, amp: 30, time: 40)
Coordinates:
  * repetitions  (repetitions) object 'A' 'B' 'C' 'D' 'E'
  * amp          (amp) float64 0.45 0.4534 0.4569 0.4603 ... 0.5431 0.5466 0.55
  * time         (time) float64 0.0 2.564e-09 5.128e-09 ... 9.744e-08 1e-07
Data variables:
    pop_q0       (repetitions, amp, time) float64 0.5 0.5 0.5 ... 0.5 0.5 0.5
    pop_q1       (repetitions, amp, time) float64 0.5 0.5 0.5 ... 0.5 0.5 0.5
Attributes:
    tuid:                      20211208-140524-981-2bb6bd
    dataset_name:              
    dataset_state:             None
    timestamp_start:           None
    timestamp_end:             None
    quantify_dataset_version:  2.0.0
    software_versions:         {}
    relationships:             []
    json_serialize_exclude:    []

Now we can select a specific repetition by its coordinate, in this case a string label.

_ = dataset_rep.pop_q0.sel(repetitions="E").plot(x="amp")
[Figure: pop_q0 for repetition "E" vs. amp]

T1 dataset examples

The T1 experiment is one of the most common quantum computing experiments. Here we explore how the datasets of such an experiment, for a transmon qubit, can be stored using the Quantify dataset with increasing levels of detail.

We start with the simplest format, containing only processed (averaged) measurements, and finish with a dataset containing the raw digitized signals from the transmon readout during a T1 experiment.

First we define a few parameters of our mock qubit and mock data acquisition.

# parameters of our qubit model
tau = 30e-6
ground = -0.2 + 0.65j  # ground state on the IQ-plane
excited = 0.7 - 0.4j  # excited state on the IQ-plane
centers = ground, excited
sigmas = [0.1] * 2  # NB in general the sigmas are not the same for both states

# mock of data acquisition configuration
# NB usually at least 1000+ shots are taken, here we use less for faster code execution
num_shots = 256
# time delays between exciting the qubit and measuring its state
t1_times = np.linspace(0, 120e-6, 30)

# NB these are the ideal probabilities obtained from repeating the measurement many
# times for a qubit with a lifetime given by tau
probabilities = exp_decay_func(t=t1_times, tau=tau, offset=0, n_factor=1, amplitude=1)

# Ideal experiment result
plt.ylabel("|1> probability")
plt.suptitle("Typical processed data of a T1 experiment")
plt.plot(t1_times * 1e6, probabilities, ".-")
_ = plt.xlabel("Time [µs]")
[Figure: ideal T1 decay, |1> probability vs. time]

# convenience dict with the mock parameters
mock_conf = dict(
    num_shots=num_shots,
    centers=centers,
    sigmas=sigmas,
    t1_times=t1_times,
    probabilities=probabilities,
)
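For reference, exp_decay_func comes from quantify_core's fitting models; a minimal stand-in consistent with the call above (an illustrative sketch, not the actual implementation) is:

```python
import numpy as np

def exp_decay_func(t, tau, amplitude, offset, n_factor):
    """Exponential decay: amplitude * exp(-(t / tau) ** n_factor) + offset."""
    return amplitude * np.exp(-((t / tau) ** n_factor)) + offset

t1_times = np.linspace(0, 120e-6, 30)
probabilities = exp_decay_func(t=t1_times, tau=30e-6, offset=0, n_factor=1, amplitude=1)
# probabilities starts at 1.0 and decays towards 0 as t grows past tau
```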

T1 experiment averaged

In this first example we generate the individual measurement shots and average them, similar to what some instruments are capable of doing directly in hardware.
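The generation-and-averaging step described above can be sketched as follows (average_shots is a hypothetical helper for illustration, not the source of mk_t1_av_dataset):

```python
import numpy as np

rng = np.random.default_rng(42)  # seeded for reproducibility

def average_shots(prob_excited, ground, excited, sigma, num_shots):
    """Draw IQ shots for a single delay and return their mean (illustrative sketch)."""
    # each shot lands near the excited or the ground center, chosen at random
    is_excited = rng.random(num_shots) < prob_excited
    state_centers = np.where(is_excited, excited, ground)
    noise = rng.normal(0, sigma, num_shots) + 1j * rng.normal(0, sigma, num_shots)
    return complex(np.mean(state_centers + noise))

# at 50% excited-state probability the average sits between the two centers
iq_av = average_shots(0.5, ground=-0.2 + 0.65j, excited=0.7 - 0.4j, sigma=0.1, num_shots=256)
```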

Here is how we store this data in the dataset along with the coordinates of these datapoints:

dataset = dataset_examples.mk_t1_av_dataset(**mock_conf)
assert dataset == round_trip_dataset(dataset)  # confirm read/write

dataset
<xarray.Dataset>
Dimensions:   (main_dim: 30)
Coordinates:
    t1_time   (main_dim) float64 0.0 4.138e-06 8.276e-06 ... 0.0001159 0.00012
Dimensions without coordinates: main_dim
Data variables:
    q0_iq_av  (main_dim) complex128 (-0.19894114958423859+0.6515500138845804j...
Attributes:
    tuid:                      20211208-140527-853-b753ca
    dataset_name:              
    dataset_state:             None
    timestamp_start:           None
    timestamp_end:             None
    quantify_dataset_version:  2.0.0
    software_versions:         {}
    relationships:             []
    json_serialize_exclude:    []
dataset.q0_iq_av.shape, dataset.q0_iq_av.dtype
((30,), dtype('complex128'))
dataset_gridded = dh.to_gridded_dataset(
    dataset,
    dimension="main_dim",
    coords_names=dattrs.get_main_coords(dataset),
)
dataset_gridded
<xarray.Dataset>
Dimensions:   (t1_time: 30)
Coordinates:
  * t1_time   (t1_time) float64 0.0 4.138e-06 8.276e-06 ... 0.0001159 0.00012
Data variables:
    q0_iq_av  (t1_time) complex128 (-0.19894114958423859+0.6515500138845804j)...
Attributes:
    tuid:                      20211208-140527-853-b753ca
    dataset_name:              
    dataset_state:             None
    timestamp_start:           None
    timestamp_end:             None
    quantify_dataset_version:  2.0.0
    software_versions:         {}
    relationships:             []
    json_serialize_exclude:    []
plot_xr_complex(dataset_gridded.q0_iq_av)
fig, ax = plot_xr_complex_on_plane(dataset_gridded.q0_iq_av)
_ = plot_complex_points(centers, ax=ax)
[Figures: q0_iq_av real/imaginary parts vs. t1_time; q0_iq_av on the IQ plane with the state centers]

T1 experiment averaged with calibration points

It is common for many experiments to require calibration data in order to interpret the results. Often, these calibration datapoints have a different array shape, e.g. just two datapoints corresponding to the ground and excited states of our transmon.

To accommodate this data in the dataset, we make use of a secondary dimension along which the calibration variables and their coordinate lie.

Additionally, since the secondary variable and coordinate used for calibration can have arbitrary names and relate to other variables in more complex ways, we specify this relationship in the dataset attributes (see QDatasetIntraRelationship). This information can be used later, for example, to run an appropriate analysis on this dataset.

dataset = dataset_examples.mk_t1_av_with_cal_dataset(**mock_conf)
assert dataset == round_trip_dataset(dataset)  # confirm read/write

dataset
<xarray.Dataset>
Dimensions:       (main_dim: 30, cal_dim: 2)
Coordinates:
    t1_time       (main_dim) float64 0.0 4.138e-06 ... 0.0001159 0.00012
    cal           (cal_dim) <U3 '|0>' '|1>'
Dimensions without coordinates: main_dim, cal_dim
Data variables:
    q0_iq_av      (main_dim) complex128 (-0.19894114958423859+0.6515500138845...
    q0_iq_av_cal  (cal_dim) complex128 (0.7010588504157614-0.3984499861154196...
Attributes:
    tuid:                      20211208-140528-394-bd09b7
    dataset_name:              
    dataset_state:             None
    timestamp_start:           None
    timestamp_end:             None
    quantify_dataset_version:  2.0.0
    software_versions:         {}
    relationships:             [{'item_name': 'q0_iq_av', 'relation_type': 'c...
    json_serialize_exclude:    []
dattrs.get_main_dims(dataset), dattrs.get_secondary_dims(dataset)
(['main_dim'], ['cal_dim'])
dataset.relationships
[
    {
        'item_name': 'q0_iq_av',
        'relation_type': 'calibration',
        'related_names': ['q0_iq_av_cal'],
        'relation_metadata': {}
    }
]

As before, the coordinates can be set to index the variables that lie along the same dimensions:

dataset_gridded = dh.to_gridded_dataset(
    dataset,
    dimension="main_dim",
    coords_names=dattrs.get_main_coords(dataset),
)
dataset_gridded = dh.to_gridded_dataset(
    dataset_gridded,
    dimension="cal_dim",
    coords_names=dattrs.get_secondary_coords(dataset_gridded),
)
dataset_gridded
<xarray.Dataset>
Dimensions:       (t1_time: 30, cal: 2)
Coordinates:
  * t1_time       (t1_time) float64 0.0 4.138e-06 ... 0.0001159 0.00012
  * cal           (cal) <U3 '|0>' '|1>'
Data variables:
    q0_iq_av      (t1_time) complex128 (-0.19894114958423859+0.65155001388458...
    q0_iq_av_cal  (cal) complex128 (0.7010588504157614-0.3984499861154196j) (...
Attributes:
    tuid:                      20211208-140528-394-bd09b7
    dataset_name:              
    dataset_state:             None
    timestamp_start:           None
    timestamp_end:             None
    quantify_dataset_version:  2.0.0
    software_versions:         {}
    relationships:             [{'item_name': 'q0_iq_av', 'relation_type': 'c...
    json_serialize_exclude:    []
fig = plt.figure(figsize=(8, 5))

ax = plt.subplot2grid((1, 10), (0, 0), colspan=9, fig=fig)
plot_xr_complex(dataset_gridded.q0_iq_av, ax=ax)

ax_calib = plt.subplot2grid((1, 10), (0, 9), colspan=1, fig=fig, sharey=ax)
for i, color in zip(
    range(2), ["C0", "C1"]
):  # plot each calibration point with same color
    dataset_gridded.q0_iq_av_cal.real[i : i + 1].plot.line(
        marker="o", ax=ax_calib, linestyle="", color=color
    )
    dataset_gridded.q0_iq_av_cal.imag[i : i + 1].plot.line(
        marker="o", ax=ax_calib, linestyle="", color=color
    )
ax_calib.yaxis.set_label_position("right")
ax_calib.yaxis.tick_right()

fig, ax = plot_xr_complex_on_plane(dataset_gridded.q0_iq_av)
_ = plot_complex_points(dataset_gridded.q0_iq_av_cal.values, ax=ax)
[Figures: q0_iq_av with calibration points in a side panel; q0_iq_av on the IQ plane with the calibration points]

We can use the calibration points to normalize the data and obtain the typical T1 decay.
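One common way to do this (a sketch, not part of this tutorial's code; normalize_to_calibration is a hypothetical helper) is to map the data so that the |0> calibration point lands at 0 and the |1> point at 1; the real part then estimates the excited-state population:

```python
import numpy as np

def normalize_to_calibration(data, cal_ground, cal_excited):
    """Project complex IQ data onto the line through the two calibration points."""
    # translate so |0> sits at the origin, then scale/rotate so |1> sits at 1
    return ((data - cal_ground) / (cal_excited - cal_ground)).real

cal_0 = -0.2 + 0.65j  # |0> calibration point
cal_1 = 0.7 - 0.4j  # |1> calibration point
iq_data = np.array([cal_1, (cal_0 + cal_1) / 2, cal_0])
pop = normalize_to_calibration(iq_data, cal_0, cal_1)
# pop: the excited point maps to 1, the midpoint to 0.5, the ground point to 0
```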

T1 experiment storing all shots

Now we will include in the dataset all the individual qubit states (shots) for each measurement.

dataset = dataset_examples.mk_t1_shots_dataset(**mock_conf)
dataset
<xarray.Dataset>
Dimensions:          (main_dim: 30, cal_dim: 2, repetitions: 256)
Coordinates:
    t1_time          (main_dim) float64 0.0 4.138e-06 ... 0.0001159 0.00012
    cal              (cal_dim) <U3 '|0>' '|1>'
Dimensions without coordinates: main_dim, cal_dim, repetitions
Data variables:
    q0_iq_av         (main_dim) complex128 (-0.19894114958423859+0.6515500138...
    q0_iq_av_cal     (cal_dim) complex128 (0.7010588504157614-0.3984499861154...
    q0_iq_shots      (repetitions, main_dim) complex128 (-0.289836545355741+0...
    q0_iq_shots_cal  (repetitions, cal_dim) complex128 (0.610163454644259-0.4...
Attributes:
    tuid:                      20211208-140529-249-b64912
    dataset_name:              
    dataset_state:             None
    timestamp_start:           None
    timestamp_end:             None
    quantify_dataset_version:  2.0.0
    software_versions:         {}
    relationships:             [{'item_name': 'q0_iq_av', 'relation_type': 'c...
    json_serialize_exclude:    []
dataset_gridded = dh.to_gridded_dataset(
    dataset,
    dimension="main_dim",
    coords_names=dattrs.get_main_coords(dataset),
)
dataset_gridded = dh.to_gridded_dataset(
    dataset_gridded,
    dimension="cal_dim",
    coords_names=dattrs.get_secondary_coords(dataset_gridded),
)
dataset_gridded
<xarray.Dataset>
Dimensions:          (cal: 2, t1_time: 30, repetitions: 256)
Coordinates:
  * cal              (cal) <U3 '|0>' '|1>'
  * t1_time          (t1_time) float64 0.0 4.138e-06 ... 0.0001159 0.00012
Dimensions without coordinates: repetitions
Data variables:
    q0_iq_av         (t1_time) complex128 (-0.19894114958423859+0.65155001388...
    q0_iq_av_cal     (cal) complex128 (0.7010588504157614-0.3984499861154196j...
    q0_iq_shots      (repetitions, t1_time) complex128 (-0.289836545355741+0....
    q0_iq_shots_cal  (repetitions, cal) complex128 (0.610163454644259-0.41025...
Attributes:
    tuid:                      20211208-140529-249-b64912
    dataset_name:              
    dataset_state:             None
    timestamp_start:           None
    timestamp_end:             None
    quantify_dataset_version:  2.0.0
    software_versions:         {}
    relationships:             [{'item_name': 'q0_iq_av', 'relation_type': 'c...
    json_serialize_exclude:    []

In this dataset we have both the averaged values and all the shots. The averaged values can be plotted in the same way as before.

_ = plot_xr_complex(dataset_gridded.q0_iq_av)
_, ax = plot_xr_complex_on_plane(dataset_gridded.q0_iq_av)
_ = plot_complex_points(dataset_gridded.q0_iq_av_cal.values, ax=ax)
[Figures: q0_iq_av vs. t1_time; q0_iq_av on the IQ plane with the calibration points]

Here we focus on inspecting how the individual shots are distributed on the IQ plane for some particular time values.

Note that we are plotting the calibration points as well.

chosen_time_values = [
    t1_times[1],  # second value selected otherwise we won't see both centers
    t1_times[len(t1_times) // 5],  # a later value, where the qubit has partially decayed
]
for t_example in chosen_time_values:
    shots_example = (
        dataset_gridded.q0_iq_shots.real.sel(t1_time=t_example),
        dataset_gridded.q0_iq_shots.imag.sel(t1_time=t_example),
    )
    plt.hexbin(*shots_example)
    plt.xlabel("I")
    plt.ylabel("Q")
    calib_0 = dataset_gridded.q0_iq_av_cal.sel(cal="|0>")
    calib_1 = dataset_gridded.q0_iq_av_cal.sel(cal="|1>")
    plot_complex_points([calib_0, calib_1], ax=plt.gca())
    plt.suptitle(f"Shots for t = {t_example:.5f} [s]")
    plt.show()
[Figures: hexbin of the shots on the IQ plane for the two selected t1_time values, with calibration points]

We can collapse (average along) the repetitions dimension:

q0_iq_shots_mean = dataset_gridded.q0_iq_shots.mean(dim="repetitions", keep_attrs=True)
plot_xr_complex(q0_iq_shots_mean)
_, ax = plot_xr_complex_on_plane(q0_iq_shots_mean)
_ = plot_complex_points(centers, ax=ax)
[Figures: shot-averaged q0_iq_shots vs. t1_time; shot-averaged data on the IQ plane with the state centers]

T1 experiment storing digitized signals for all shots

Finally, in addition to the individual shots we will store all the digitized readout signals that are required to obtain the previous measurement results.

dataset = dataset_examples.mk_t1_traces_dataset(**mock_conf)
assert dataset == round_trip_dataset(dataset)  # confirm read/write

dataset
<xarray.Dataset>
Dimensions:          (main_dim: 30, cal_dim: 2, repetitions: 256, trace_dim: 300)
Coordinates:
    t1_time          (main_dim) float64 0.0 4.138e-06 ... 0.0001159 0.00012
    cal              (cal_dim) <U3 '|0>' '|1>'
    trace_time       (trace_dim) float64 0.0 1e-09 2e-09 ... 2.98e-07 2.99e-07
Dimensions without coordinates: main_dim, cal_dim, repetitions, trace_dim
Data variables:
    q0_iq_av         (main_dim) complex128 (-0.19894114958423859+0.6515500138...
    q0_iq_av_cal     (cal_dim) complex128 (0.7010588504157614-0.3984499861154...
    q0_iq_shots      (repetitions, main_dim) complex128 (-0.289836545355741+0...
    q0_iq_shots_cal  (repetitions, cal_dim) complex128 (0.610163454644259-0.4...
    q0_traces        (repetitions, main_dim, trace_dim) complex128 (-0.289836...
    q0_traces_cal    (repetitions, cal_dim, trace_dim) complex128 (0.61016345...
Attributes:
    tuid:                      20211208-140531-302-f9386b
    dataset_name:              
    dataset_state:             None
    timestamp_start:           None
    timestamp_end:             None
    quantify_dataset_version:  2.0.0
    software_versions:         {}
    relationships:             [{'item_name': 'q0_iq_av', 'relation_type': 'c...
    json_serialize_exclude:    []
dataset.q0_traces.shape, dataset.q0_traces_cal.shape
((256, 30, 300), (256, 2, 300))
dataset_gridded = dh.to_gridded_dataset(
    dataset,
    dimension="main_dim",
    coords_names=["t1_time"],
)
dataset_gridded = dh.to_gridded_dataset(
    dataset_gridded,
    dimension="cal_dim",
    coords_names=["cal"],
)
dataset_gridded = dh.to_gridded_dataset(
    dataset_gridded, dimension="trace_dim", coords_names=["trace_time"]
)
dataset_gridded
<xarray.Dataset>
Dimensions:          (trace_time: 300, cal: 2, t1_time: 30, repetitions: 256)
Coordinates:
  * trace_time       (trace_time) float64 0.0 1e-09 2e-09 ... 2.98e-07 2.99e-07
  * cal              (cal) <U3 '|0>' '|1>'
  * t1_time          (t1_time) float64 0.0 4.138e-06 ... 0.0001159 0.00012
Dimensions without coordinates: repetitions
Data variables:
    q0_iq_av         (t1_time) complex128 (-0.19894114958423859+0.65155001388...
    q0_iq_av_cal     (cal) complex128 (0.7010588504157614-0.3984499861154196j...
    q0_iq_shots      (repetitions, t1_time) complex128 (-0.289836545355741+0....
    q0_iq_shots_cal  (repetitions, cal) complex128 (0.610163454644259-0.41025...
    q0_traces        (repetitions, t1_time, trace_time) complex128 (-0.289836...
    q0_traces_cal    (repetitions, cal, trace_time) complex128 (0.61016345464...
Attributes:
    tuid:                      20211208-140531-302-f9386b
    dataset_name:              
    dataset_state:             None
    timestamp_start:           None
    timestamp_end:             None
    quantify_dataset_version:  2.0.0
    software_versions:         {}
    relationships:             [{'item_name': 'q0_iq_av', 'relation_type': 'c...
    json_serialize_exclude:    []
dataset_gridded.q0_traces.shape, dataset_gridded.q0_traces.dims
((256, 30, 300), ('repetitions', 't1_time', 'trace_time'))

All the previous data is also present, but in this dataset we can inspect the IQ signal for each individual shot. Let's inspect the signal of shot number 123 at the last “point” of the T1 experiment:

trace_example = dataset_gridded.q0_traces.sel(
    repetitions=123, t1_time=dataset_gridded.t1_time[-1]
)
trace_example.shape, trace_example.dtype
((300,), dtype('complex128'))

Now we can plot these digitized signals for each quadrature. For clarity, we plot only part of the signal.

trace_example_plt = trace_example[:200]
trace_example_plt.real.plot(figsize=(15, 5), marker=".", label="I-quadrature")
trace_example_plt.imag.plot(marker=".", label="Q-quadrature")
plt.gca().legend()
plt.show()
[Figure: I and Q quadratures of the example trace vs. trace_time]
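For context, such digitized traces are typically reduced back to a single IQ point by digital demodulation and integration. The sketch below illustrates that step on synthetic data; the 50 MHz intermediate frequency and the demodulate_trace helper are assumptions for illustration, not taken from the mock dataset:

```python
import numpy as np

def demodulate_trace(trace, times, intermediate_freq):
    """Demodulate a digitized signal and integrate it into a single IQ point."""
    # multiply by the conjugate reference tone, then average over the trace
    reference = np.exp(-2j * np.pi * intermediate_freq * times)
    return complex(np.mean(trace * reference))

# synthetic trace: a 50 MHz tone whose complex amplitude encodes the qubit state
times = np.arange(300) * 1e-9  # 300 samples at 1 GSa/s, as in the dataset above
iq_point = 0.7 - 0.4j  # e.g. the excited-state center
trace = iq_point * np.exp(2j * np.pi * 50e6 * times)
recovered = demodulate_trace(trace, times, intermediate_freq=50e6)
# recovered is numerically equal to iq_point
```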