
Live SSB on Dectris ARINA data streams in Nion Swift #139

Open
TomaSusi opened this issue Jan 29, 2024 · 38 comments

TomaSusi commented Jan 29, 2024

I am trying to adapt the LiberTEM/Ptychography4.0 SSB code to analyze Nion Swift data collected on the Dectris ARINA detector, with the ultimate goal of running live SSB and piping the results to Nion Swift.

I am getting a TypeError when trying to load the data using the InlineJobExecutor. At first I thought this was due to the Swift data format, but I am able to reproduce the error using one of the sample datasets. My minimal code example reads:

import libertem.api as lt
from libertem.executor.inline import InlineJobExecutor 

inline_executor = InlineJobExecutor() 
inline_ctx = lt.Context(executor=inline_executor)

ds = inline_ctx.load('hdf5', data='/Users/tomasusi/Downloads/calibrationData_circularProbe.h5', ds_path="/data")

This results in a TypeError regardless of what I try to do with the ds_path keyword.

Traceback (most recent call last):                                                                                                                                                                           
  File "<stdin>", line 1, in <module>                                                                                                                                                                        
  File "/Users/tomasusi/miniconda3/envs/test/lib/python3.11/site-packages/libertem/api.py", line 450, in load
    return load(
           ^^^^^
  File "/Users/tomasusi/miniconda3/envs/test/lib/python3.11/site-packages/libertem/io/dataset/__init__.py", line 151, in load
    ds = cls(*args, **kwargs)
         ^^^^^^^^^^^^^^^^^^^^
TypeError: H5DataSet.__init__() got an unexpected keyword argument 'data'

Am I doing something wrong, or is there a bug perhaps? I am using LiberTEM 0.13.1 installed from conda-forge in a Python 3.11.7 conda environment on macOS 14.2.1.

sk1p (Member) commented Jan 29, 2024

First of all, thank you for the report!

ds = inline_ctx.load('hdf5', data='/Users/tomasusi/Downloads/calibrationData_circularProbe.h5', ds_path="/data")

This should instead read (replacing data with path):

ds = inline_ctx.load('hdf5', path='/Users/tomasusi/Downloads/calibrationData_circularProbe.h5', ds_path="/data")

The data kwarg is only used for the memory dataset, which is mostly useful for testing purposes.
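
For reference, a minimal sketch of loading that memory dataset (reusing the inline_ctx from above; the array shape here is just a placeholder):

import numpy as np

# 'memory' is the dataset type that takes a data= kwarg; useful for quick tests
ds_mem = inline_ctx.load('memory', data=np.zeros((8, 8, 32, 32), dtype=np.float32))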

See our docs for the supported kwargs when loading HDF5.

inline_executor = InlineJobExecutor()

Orthogonal to this issue, depending on your system, it can also make sense to try the default executor, for example:

from libertem.api import Context
context = Context.make_with('dask')

macOS 14.2.1

Just for reference, is this on a system with Intel CPUs, or one of the newer ARM-based ones?

I am trying to adapt the LiberTEM/Ptychography4.0 SSB code to analyze Nion Swift data collected on the Dectris ARINA detector.

I'm curious, does Nion Swift use its own HDF5 format, or is it internally storing the data compressed, as it comes from the detector/DCU? If the latter, you may need to install hdf5plugin to add support for the bslz4 compression/filter, and be aware of another issue concerning the way DECTRIS saves data.
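
If it does turn out to be the compressed DECTRIS data, a minimal sketch of enabling the bslz4 filter when reading with h5py (file path and dataset name are placeholders):

# pip install hdf5plugin
import hdf5plugin  # registers the bitshuffle/LZ4 (bslz4) HDF5 filter on import
import h5py

with h5py.File('/path/to/dectris_data.h5', 'r') as f:
    print(f['/data'].shape)  # reading compressed chunks should now work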

Hope this helps, and let me know if there are additional issues.

@TomaSusi (Author)

Ahh... so my mistake – I had first tried the in-memory loader, and must have just copied the code incorrectly. It works perfectly fine when using path. Thanks!

This is an Intel system (which will also be the case for our Nion user PC), but I can also test on my MacBook if you're interested?

Swift datasets work out of the box, I believe they just store them as arrays in an HDF5 container.

However, I am quite interested in trying to do live SSB, so it may be that I will need to work with the DECTRIS datastreams, good to know.

@TomaSusi (Author)

Maybe I can use the opportunity to ask a further question: I am getting a warning about the MaskContainer for my test dataset:

Mask factory size 577508642 larger than warning limit 1048576, may be inefficient

How should I think about this? Should I pick a selection of trotters only to use for the mask factory?

uellue (Member) commented Jan 29, 2024

Maybe I can use the opportunity to ask a further question: I am getting a warning about the MaskContainer for my test dataset:

Mask factory size 577508642 larger than warning limit 1048576, may be inefficient

How should I think about this? Should I pick a selection of trotters only to use for the mask factory?

You can ignore that warning with SSB, as long as it works! By the way, for speed you can try the Binned SSB (Ptychography-4-0/ptychography#68). It works; the PR is just in the backlog for finishing.

@TomaSusi (Author)

It does work, though reconstruction takes about 3 min on my (admittedly old) iMac Pro. I'll probably need to look at GPU acceleration on a Linux box (or the User PC itself).

The ARINA bins to 96 x 96 pixels, so I'm not convinced I need to bin further?

sk1p (Member) commented Jan 29, 2024

Ahh... so my mistake – I had first tried the in-memory loader, and must have just copied the code incorrectly. It works perfectly fine when using path. Thanks!

Good to hear that it works now!

This is an Intel system (which will be also the case for our Nion user PC), but I can also test on my MacBook if you're interested?

I was mostly asking because ARM-based Macs are currently a blind spot in our QA, so there may be issues related to installation etc. on these systems.

Swift datasets work out of the box, I believe they just store them as arrays in an HDF5 container.

Ok.

However, I am quite interested in trying to do live SSB, so it may be that I will need to work with the DECTRIS datastreams, good to know.

That's a tiny bit different, as we don't read the HDF5 files as they are written, but we directly get the data stream via zeromq from the DCU. That is not affected by the above-mentioned issue.

TomaSusi (Author) commented Jan 29, 2024

Swift datasets work out of the box, I believe they just store them as arrays in an HDF5 container.

Ok.

However, I am quite interested in trying to do live SSB, so it may be that I will need to work with the DECTRIS datastreams, good to know.

That's a tiny bit different, as we don't read the HDF5 files as they are written, but we directly get the data stream via zeromq from the DCU. That is not affected by the above-mentioned issue.

Ah there may be a nuance here: Swift saves the DECTRIS Arina dataset internally as .h5. If I export the file, loading it into LiberTEM has no problem. However, if I try to directly read the .h5 file that Swift internally stores the data in, I get an error

ValueError: Invalid vmin or vmax

when trying to plot the sum_result using the default colors.LogNorm. Maybe that is indeed due to the mentioned issue?

uellue (Member) commented Jan 29, 2024

It does work, though reconstruction takes about 3 min on my (admittedly old) iMac Pro. I'll probably need to look at GPU acceleration on a Linux box (or the User PC itself).

The time strongly depends on the scan resolution: going from 128x128 to 256x256 scan positions gives roughly a 16x increase in computational effort, since both the number of frames and the number of spatial frequencies (trotters) grow by 4x. It also depends a lot on the scan step, the size of the primary beam and the convergence angle, so it is hard to say if this is already good or can be improved! The UDF already supports GPUs, if I read the source code and remember correctly.

The ARINA bins to 96 x 96 pixels, so I'm not convinced I need to bin further?

16x16 is enough for SSB, and it also crops to the primary beam. The binned version also processes whole partitions, which might be faster, in particular on GPUs. It also uses improvements in LiberTEM for handling large shared objects for the trotter stack. We've hit, I think, 50,000 fps live with an ARINA (I don't remember if 128x128 or 256x256), so it is worth a try!

@TomaSusi (Author)

The ARINA bins to 96 x 96 pixels, so I'm not convinced I need to bin further?

16x16 is enough for SSB, and it also crops to the primary beam. The binned version also processes whole partitions, which might be faster, in particular on GPUs. It also uses improvements in LiberTEM for handling large shared objects for the trotter stack. We've hit, I think, 50,000 fps live with an ARINA (I don't remember if 128x128 or 256x256), so it is worth a try!

Oh okay, so not just the binning! Yes, I will definitely try this; is there any example code for the live implementation you could share?

uellue (Member) commented Jan 29, 2024

The last section of https://github.com/Ptychography-4-0/ptychography/blob/dec7ce7fd3469decf79ff4b48098cced584c718f/examples/ssb-example.ipynb has an example!

@TomaSusi (Author)

How is that '/cachedata/weber/ssb/slice_00001_thick_1.9525_nm_blocksz100.raw' created? It still sounds like a local file and not a stream on the DECTRIS server PC?

uellue (Member) commented Jan 29, 2024

...for doing it live one switches out the data source: https://libertem.github.io/LiberTEM-live/dectris-acquisition-example.html
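
For orientation, a rough sketch of what the live variant looks like, based on the linked example; the hosts, ports and scan shape are placeholders, and the exact LiberTEM-live API may differ between versions:

from libertem_live.api import LiveContext
from libertem.udf.sum import SumUDF

ctx = LiveContext()
conn = ctx.make_connection('dectris').open(
    api_host='127.0.0.1', api_port=80,      # DCU / SIMPLON API endpoint (placeholder)
    data_host='127.0.0.1', data_port=9999,  # zeromq data stream (placeholder)
)
aq = ctx.make_acquisition(conn=conn, nav_shape=(128, 128))
ctx.run_udf(dataset=aq, udf=SumUDF(), plots=True)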

@sk1p do we have the improved notebooks somewhere? Convenient live processing is very much WIP, contributions welcome! There's also https://github.com/LiberTEM/LiberTEM-live-server where work on a Nion Swift integration was supposed to happen, but I am not sure how that is going ATM.

@TomaSusi (Author)

Now that we have the first ARINA installed on a Nion machine, I believe Andreas Mittelberger is empowered to spend some more time on this with us, and yes we'd obviously be happy to contribute back where it makes sense.

uellue (Member) commented Jan 29, 2024

That would be great! The current architectural idea is that LiberTEM is running between the detector and the GUI. LiberTEM receives the full data stream from the detector and control commands from the GUI, and sends a strongly reduced data stream to the GUI for display. It was already used successfully with SerialEM that way: https://github.com/Pr4Et/SavvyScan

uellue (Member) commented Jan 29, 2024

...also, @sk1p is working on integration with CEOS' Panta Rhei.

uellue (Member) commented Jan 29, 2024

In particular the latter works pretty well! We've done 1 million fps live interactive 4D STEM with an Amsterdam Scientific Instruments CheeTah T3 (Timepix) using a prototype, and as long as Nion Swift can keep up with plot updates it should also be possible there.

@TomaSusi (Author)

I'll rename this issue to talk about live processing to keep this discussion going.

TomaSusi changed the title from "TypeError in loading sample HDF5 dataset" to "Live SSB on Dectris ARINA data streams in Nion Swift" on Jan 29, 2024
sk1p (Member) commented Jan 29, 2024

...for doing it live one switches out the data source: https://libertem.github.io/LiberTEM-live/dectris-acquisition-example.html

@sk1p do we have the improved notebooks somewhere? Convenient live processing is very much WIP, contributions welcome!

The document you linked to is one of our example notebooks, and should be up-to-date. Other than that, the DECTRIS section in our docs should contain further details.

There's also https://github.com/LiberTEM/LiberTEM-live-server where work on a Nion Swift integration was supposed to happen, but I am not sure how that is going ATM.

I'll follow up on their status soon.

Ah there may be a nuance here: Swift saves the DECTRIS Arina dataset internally as .h5. If I export the file, loading it into LiberTEM has no problem. However, if I try to directly read the .h5 file that Swift internally stores the data in, I get an error

ValueError: Invalid vmin or vmax

when trying to plot the sum_result using the default colors.LogNorm. Maybe that is indeed due to the mentioned issue?

Using LogNorm has a caveat: if there are zeros in your data and vmin=0, you get said error message. You can reproduce it with something similar to this:

import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import LogNorm
a = np.zeros((128, 128))
plt.imshow(a, norm=LogNorm())
plt.show()

You can try to mask out the zero values, for example like this:

plt.imshow(np.ma.masked_where(a == 0, a), norm=LogNorm())

That will give you "data missing" markers where you have zero values, but should plot without issues.

TomaSusi (Author) commented Jan 29, 2024

Right, fair point about the LogNorm, but since direct-electron datasets will always have zeros in the data, shouldn't there be some better way to handle this directly?

As a workaround, I find that this works decently for my data:

norm=colors.SymLogNorm(linthresh=1)
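
For completeness, a self-contained version of that workaround, using random counting data as a stand-in for a real sum image:

import numpy as np
import matplotlib.pyplot as plt
from matplotlib import colors

a = np.random.poisson(0.5, size=(128, 128))  # counting data with plenty of zeros
plt.imshow(a, norm=colors.SymLogNorm(linthresh=1))
plt.colorbar()
plt.show()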

sk1p (Member) commented Jan 31, 2024

Right, fair point about the LogNorm, but since direct-electron datasets will always have zeros in the data, shouldn't there be some better way to handle this directly?

I think we do handle these kinds of issues in the live plotting interface (which could surely be improved, as I don't think it supports log scaling by default yet, for example). When getting back the raw results as arrays, we don't really know about the next processing step, so "punching holes" for zeros for log scaling or similar needs to be handled by the data consumer.

norm=colors.SymLogNorm(linthresh=1)

That's a good one, too 👍

TomaSusi (Author) commented Jan 31, 2024

By the way, for speed you can try the Binned SSB (Ptychography-4-0/ptychography#68). It works; the PR is just in the backlog for finishing.

@uellue, I tried to clone and install this fork, which I believe is where you have the binned SSB:
https://github.com/uellue/ptychography/tree/binned

The regular non-binned SSB runs fine on my dataset.

However, trying to run the example code at the end of the notebook you pointed to (https://github.com/Ptychography-4-0/ptychography/blob/dec7ce7fd3469decf79ff4b48098cced584c718f/examples/ssb-example.ipynb) on the same data, I get an error:

---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
File <timed exec>:1

File ~/miniconda3/envs/py4DSTEM_dev/lib/python3.10/site-packages/libertem/api.py:982, in Context.run_udf(self, dataset, udf, roi, corrections, progress, backends, plots, sync)
    980 with tracer.start_as_current_span("Context.run_udf"):
    981     if sync:
--> 982         return self._run_sync(
    983             dataset=dataset,
    984             udf=udf,
    985             roi=roi,
    986             corrections=corrections,
    987             progress=progress,
    988             backends=backends,
    989             plots=plots,
    990             iterate=False,
    991         )
    992     else:
    993         return self._run_async(
    994             dataset=dataset,
    995             udf=udf,
   (...)
   1001             iterate=False,
   1002         )

File ~/miniconda3/envs/py4DSTEM_dev/lib/python3.10/site-packages/libertem/api.py:1275, in Context._run_sync(self, dataset, udf, roi, corrections, progress, backends, plots, iterate)
   1273     return _run_sync_wrap()
   1274 else:
-> 1275     udf_results = run_gen_get_last(_run_sync_wrap())
   1276     if udf_is_list:
   1277         return udf_results.buffers

File ~/miniconda3/envs/py4DSTEM_dev/lib/python3.10/site-packages/libertem/common/async_utils.py:97, in run_gen_get_last(gen)
     95 try:
     96     while True:
---> 97         result = gen.__next__()
     98 except StopIteration:
     99     pass

File ~/miniconda3/envs/py4DSTEM_dev/lib/python3.10/site-packages/libertem/api.py:91, in ResultGenerator.__next__(self)
     90 def __next__(self):
---> 91     return next(self._task_results)

File ~/miniconda3/envs/py4DSTEM_dev/lib/python3.10/site-packages/libertem/api.py:1260, in Context._run_sync.<locals>._run_sync_wrap.<locals>._inner()
   1259 def _inner():
-> 1260     for udf_results in result_iter:
   1261         yield udf_results
   1262         if enable_plotting:

File ~/miniconda3/envs/py4DSTEM_dev/lib/python3.10/site-packages/libertem/udf/base.py:85, in ResultsForDataSet.__next__(self)
     84 def __next__(self) -> Tuple[Tuple["UDFData", ...], TaskProtocol]:
---> 85     return next(self._gen)

File ~/miniconda3/envs/py4DSTEM_dev/lib/python3.10/site-packages/libertem/udf/base.py:2526, in UDFRunner.run_for_dataset_sync.<locals>._inner()
   2524 num_results = 0
   2525 try:
-> 2526     for part_results, task in result_iter:
   2527         num_results += 1
   2528         with tracer.start_as_current_span("_apply_part_result -> UDF.merge"):

File ~/miniconda3/envs/py4DSTEM_dev/lib/python3.10/site-packages/libertem/udf/base.py:2481, in UDFRunner.results_for_dataset_sync.<locals>._inner()
   2478 yield params_handle
   2480 if tasks:
-> 2481     for res in executor.run_tasks(
   2482         tasks,
   2483         params_handle,
   2484         cancel_id,
   2485         task_comm_handler,
   2486     ):
   2487         if progress:
   2488             pman.finalize_task(res[1])

File ~/miniconda3/envs/py4DSTEM_dev/lib/python3.10/site-packages/libertem/executor/inline.py:91, in InlineJobExecutor.run_tasks(self, tasks, params_handle, cancel_id, task_comm_handler)
     89     cloudpickle.loads(cloudpickle.dumps(task))
     90 task_comm_handler.handle_task(task, worker_queue)
---> 91 result = task(env=env, params=self._scattered[params_handle])
     92 if self._debug:
     93     cloudpickle.loads(cloudpickle.dumps(result))

File ~/miniconda3/envs/py4DSTEM_dev/lib/python3.10/site-packages/libertem/udf/base.py:1930, in UDFTask.__call__(self, params, env)
   1925 with self._propagate_tracing(), tracer.start_as_current_span("UDFTask.__call__"):
   1926     udfs = [
   1927         cls.new_for_partition(kwargs, self.partition, params.roi)
   1928         for cls, kwargs in zip(self._udf_classes, params.kwargs)
   1929     ]
-> 1930     return self._runner_cls(udfs, progress=self._progress).run_for_partition(
   1931         self.partition, params, env, self._user_backends,
   1932     )

File ~/miniconda3/envs/py4DSTEM_dev/lib/python3.10/site-packages/libertem/udf/base.py:2026, in UDFPartRunner.run_for_partition(self, partition, params, env, backend_choice)
   2024     if env.worker_context is not None:
   2025         partition.set_worker_context(env.worker_context)
-> 2026     self._run_udfs(
   2027         ds_backend, execution_plan, partition, params.tiling_scheme, roi, dtype
   2028     )
   2029     self._wrapup_udfs(partition)
   2030 finally:

File ~/miniconda3/envs/py4DSTEM_dev/lib/python3.10/site-packages/libertem/udf/base.py:2077, in UDFPartRunner._run_udfs(self, ds_backend, execution_plan, partition, tiling_scheme, roi, dtype)
   2075             methods = udf_methods[backend]
   2076             device_tile = converter.get(backend)
-> 2077             self._run_tile(
   2078                 zip(udfs, methods), partition, tile, backend, device_tile, roi=roi
   2079             )
   2080         partition_progress.signal_tile_complete(tile)
   2081 except AttributeError as e:

File ~/miniconda3/envs/py4DSTEM_dev/lib/python3.10/site-packages/libertem/udf/base.py:2186, in UDFPartRunner._run_tile(self, udfs_and_methods, partition, tile, array_backend, device_tile, roi)
   2183         udf_frame.process_frame(frame)
   2184 elif udf_method == UDFMethod.PARTITION:
   2185     # Internal checks for dataset consistency
-> 2186     assert partition.slice.adjust_for_roi(roi) == tile.tile_slice
   2187     udf.set_views_for_tile(partition, tile)
   2188     udf.set_slice(tile.tile_slice)

AssertionError: 

sk1p (Member) commented Feb 1, 2024

Sorry you are hitting these issues.

However, trying to run the example code at the end of the notebook you pointed to (https://github.com/Ptychography-4-0/ptychography/blob/dec7ce7fd3469decf79ff4b48098cced584c718f/examples/ssb-example.ipynb) on the same data, I get an error:

[...]

File ~/miniconda3/envs/py4DSTEM_dev/lib/python3.10/site-packages/libertem/udf/base.py:2186, in UDFPartRunner._run_tile(self, udfs_and_methods, partition, tile, array_backend, device_tile, roi)
2183 udf_frame.process_frame(frame)
2184 elif udf_method == UDFMethod.PARTITION:
2185 # Internal checks for dataset consistency
-> 2186 assert partition.slice.adjust_for_roi(roi) == tile.tile_slice
2187 udf.set_views_for_tile(partition, tile)
2188 udf.set_slice(tile.tile_slice)

AssertionError:

Hmm, I remember we hit this assertion previously. It's related to partition-by-partition processing with certain data sources. This is on the HDF5 data from Nion Swift, correct? Could you give some details about the file, such as the chunking of the dataset, and/or maybe provide an example file that reproduces the issue (it could be just an empty dataset, as long as the shapes/chunking are similar and it hits the same issue)? This would be very helpful!
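
For reference, a quick way to check those details with h5py (path and dataset name are placeholders):

import h5py

with h5py.File('/path/to/swift_internal_file.h5', 'r') as f:
    dset = f['/data']
    print(dset.shape, dset.dtype, dset.chunks, dset.compression)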

TomaSusi (Author) commented Feb 1, 2024

Hmm, I remember we hit this assertion previously. It's related to partition-by-partition processing with certain data sources. This is on the HDF5 data from Nion Swift, correct? Could you give some details about the file, such as the chunking of the dataset, and/or maybe provide an example file that reproduces the issue (it could be just an empty dataset, as long as the shapes/chunking are similar and it hits the same issue)? This would be very helpful!

The data shape is (256, 256, 96, 96) and chunking seems to be (1, 16, 96, 96).

This is still some of the first test data on our ARINA, so the quality is not great, but this should not matter for testing. I do note that I am able to reconstruct a (somewhat lousy) SSB phase image using PyPtychoSTEM, but the LiberTEM SSB does not seem to yield anything sensible for the data despite okay-looking COM plots – probably binning will help.

I'll send you a link to the data privately, alongside the metadata parameters used for the reconstruction.

sk1p (Member) commented Feb 1, 2024

but the LiberTEM SSB does not seem to yield anything sensible

It may be that the (parameters for...) rotation between detector and scan is off. Or maybe @uellue has another idea?

I'll send you a link to the data privately, alongside the metadata parameters used for the reconstruction

Thank you, that is very helpful.

TomaSusi (Author) commented Feb 1, 2024

but the LiberTEM SSB does not seem to yield anything sensible

It may be that the (parameters for...) rotation between detector and scan is off. Or maybe @uellue has another idea?

That's possible, but I do calculate the rotation by minimizing the curl using py4DSTEM's DPC functionality, and use the same derived rotation for PyPtychoSTEM. But admittedly the data is simply not great; I'm curious to see if binning helps!

uellue (Member) commented Feb 1, 2024

@TomaSusi One has to take care that these software packages interpret these parameters in the same way as LiberTEM and its SSB implementation! In Thermo Fisher software, for example, what they call "scan rotation" is actually an image rotation, i.e. the opposite direction. Having the y axis point up instead of down for the scan and/or detector is common as well, meaning the handedness should be checked, too.

The DPC method might give wrong results if the microscope or zone axis is not aligned perfectly, unless the specimen is very thin (graphene and such).

TomaSusi (Author) commented Feb 1, 2024

Fair enough! How does one determine the rotation using LiberTEM tools? I will not accept having to change the value manually and look at what seems best ;-) We almost always work with very thin specimens, in this case bilayer hBN.

uellue (Member) commented Feb 1, 2024

You can take an overfocused dataset and check where the detector image moves as a function of scan position using frame picking in the web GUI.

TomaSusi (Author) commented Feb 1, 2024

I'm afraid that won't work for me, I do all my analyses in Jupyter.

uellue (Member) commented Feb 1, 2024

If you trust DPC, you can use this function: https://libertem.github.io/LiberTEM/reference/udf.html#libertem.udf.com.guess_corrections :-)
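
A hypothetical usage sketch; the argument names and the exact contents of the returned object are assumptions here, so consult the linked reference for the actual signature:

from libertem.udf.com import guess_corrections

# com_y, com_x: 2D arrays of centre-of-mass deflections over the scan grid,
# obtained from a CoM/DPC analysis (hypothetical variable names)
guess = guess_corrections(com_y, com_x)
print(guess)  # should include an estimate of the scan rotation and flip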

TomaSusi (Author) commented Feb 1, 2024

Of course we're trying to calibrate this at the instrument and save it to the metadata, but it may be tricky to get it exactly right in perpetuity.

uellue (Member) commented Feb 1, 2024

Yes, it should be recalibrated and checked frequently.

TomaSusi (Author) commented Feb 2, 2024

I now have much less noisy datasets for monolayer graphene, where guess_corrections seems to agree with the rotations I get from py4DSTEM (and we also aligned the scan axes in hardware on the Nion), but LiberTEM SSB still fails to reconstruct anything useful, even though PyPtychoSTEM gives beautiful SSB phase images.

uellue (Member) commented Feb 5, 2024

Interesting! As far as I can remember, PyPtychoSTEM has a qualitative reconstruction mode where the trotters are weighted by their noise-normalized transfer function. I'm not sure if it is the default or if one has to activate it. On the high and low end of spatial frequencies the trotters are rather small, so they are sensitive to errors and noise. If the high and low spatial frequencies are filtered out, the result "looks better". LiberTEM SSB can filter out small trotters with the "cutoff" parameter. Maybe results look better with that? Also, binned SSB improves the behavior since it averages. Live WDD also has better noise rejection while being reasonably quantitative.

For qualitative phase contrast, iCoM is pretty good: less sensitive to noise and much faster to compute.
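
For illustration, a simplified iCoM-style integration of CoM deflection maps via Fourier space; this is not LiberTEM's implementation, and is correct only up to a scale factor and the usual periodic-boundary caveats:

import numpy as np

def icom(com_y, com_x):
    # Integrate the CoM vector field, treating it as the gradient of a phase-like signal
    ky = np.fft.fftfreq(com_y.shape[0])[:, None]
    kx = np.fft.fftfreq(com_y.shape[1])[None, :]
    k2 = ky**2 + kx**2
    k2[0, 0] = 1.0                      # avoid division by zero at the DC term
    num = ky * np.fft.fft2(com_y) + kx * np.fft.fft2(com_x)
    phi = num / (2j * np.pi * k2)
    phi[0, 0] = 0.0                     # mean value is undetermined; set it to zero
    return np.real(np.fft.ifft2(phi))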

TomaSusi (Author) commented Feb 6, 2024

I checked in with Christoph, and that qualitative mode was probably only in the original Matlab code; PyPtychoSTEM reconstructs phases quantitatively.

I tried increasing the cutoff in LiberTEM but can't seem to improve the result. Note that I am not talking only about a poor reconstruction, but a straight-up failure to produce anything sensible (output from plt.imshow(udf_result['phase'].data) attached).

[attached image]

I think the problem is that generating the trotters seems to fail for some reason, at least I presume they should not look like this (output of plt.imshow(trotters[1].todense()) resulting from trotters = generate_masks(**rec_params, **mask_params) attached; here I binned the diffraction data by 4 before loading it):

[attached image]

TomaSusi (Author) commented Feb 6, 2024

Ahh... I found the issue – your parameters take the scan step in m, not Å :D Now I am getting something sensible out!
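
A trivial illustration of that pitfall (the values are made up):

scan_step_angstrom = 0.2                   # scan step as read from the microscope metadata, in Å
scan_step_m = scan_step_angstrom * 1e-10   # what the reconstruction parameters expect (metres)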

uellue (Member) commented Feb 6, 2024

Ah, good to hear. That should explain it!

TomaSusi (Author) commented Feb 6, 2024

Got this far on the newer datasets; I have to cut off the smallest trotters quite heavily to get the noise in the LiberTEM reconstruction down to a similar level. Still trying to understand with Christoph why the range of phase values is different between the two reconstructions... (by the way, the py4DSTEM DPC phase is essentially iDPC, though they implement an iterative scheme for it)

With 8x software binning, the LiberTEM-based reconstruction completes in about 7 seconds on my desktop; this might already run fast enough on our beefy new Nion user PC to do semi-live full-frame reconstructions :)
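
For reference, a minimal sketch of the kind of software binning meant here, applied to the detector dimensions of a (ny, nx, dy, dx) block before loading; the function name, factor and shapes are illustrative placeholders:

import numpy as np

def bin_detector(block, factor=4):
    # Sum over factor x factor patches of the detector dimensions
    ny, nx, dy, dx = block.shape
    return block.reshape(ny, nx, dy // factor, factor, dx // factor, factor).sum(axis=(3, 5))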

[attached image]
