Item type: Research Data, Open Access

Deep PACBED: Multitask Analysis of PACBED Images Using Deep Neural Networks

Description

This dataset contains the trained models and the test dataset presented in the paper "Deep PACBED: Multitask Analysis of PACBED Images Using Deep Neural Networks". In this paper, we present a deep learning approach to automatically analyze sample parameters from position-averaged convergent beam electron diffraction (PACBED) images. While previous machine learning approaches analyze each sample parameter individually, our work investigates multitask deep neural networks that simultaneously predict multiple sample parameters, outperforming models trained on a single parameter.

We provide two trained models in the TensorFlow 2 SavedModel format (https://www.tensorflow.org/guide/saved_model) as the archive files "model_EfficientNetV2S_branching.tar" (a branching neural network based on the EfficientNetV2 architecture) and "model_EfficientNetV2S_MTAN.tar" (our proposed adaptation of the multitask attention network (MTAN) to the EfficientNetV2 architecture). A script to load and run the models can be found in our Git repository: https://github.com/umr-ds/Deep-PACBED/.

The archive file "data_experimental.tar" contains our experimental test dataset of 1050 PACBED images. The images are organized into separate folders by material and crystallographic orientation (010 and 110). Each folder contains a "full.csv" file with the metadata and labels for the images in the "img" subfolder, which are stored as .png files. The "md_split" folder contains the annotations split into .csv files for the training and test subsets used in our work.
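To illustrate the branching multitask idea described above, the sketch below builds a shared EfficientNetV2 backbone with one output head per predicted sample parameter. This is only a minimal illustration, not the authors' exact setup: the head names, the number of output bins per parameter, and the single-channel 224x224 input are assumptions. The actual trained models are the SavedModel archives provided with this dataset, and the authors' own loading script lives in the Git repository linked above.

```python
import tensorflow as tf

# Minimal sketch of a branching multitask network: a shared
# EfficientNetV2S backbone feeding several task-specific heads.
# Head names and bin counts are hypothetical placeholders.
def build_branching_model(num_thickness_bins=100, num_tilt_bins=50):
    inputs = tf.keras.Input(shape=(224, 224, 1))  # assumed input size/channels
    backbone = tf.keras.applications.EfficientNetV2S(
        include_top=False, weights=None, input_tensor=inputs, pooling="avg"
    )
    shared = backbone.output  # shared feature vector for all tasks

    # One branch (head) per sample parameter predicted from the same features.
    thickness = tf.keras.layers.Dense(
        num_thickness_bins, activation="softmax", name="thickness"
    )(shared)
    tilt = tf.keras.layers.Dense(
        num_tilt_bins, activation="softmax", name="tilt"
    )(shared)

    return tf.keras.Model(inputs, [thickness, tilt])

model = build_branching_model()
```

A single forward pass through such a model yields predictions for all sample parameters at once, which is what distinguishes the multitask approach from training one network per parameter.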

Files
Schneider, Daniel; Scheunert, Jonas; Heimes, Damien; Firoozabadi, Seyyed Salehedin; Volz, Kerstin; Freisleben, Bernd: Deep PACBED: Multitask Analysis of PACBED Images Using Deep Neural Networks.

License

Except where otherwise noted, this item's license is described as Attribution 4.0 International.