PtychoNet: Fast and High Quality Phase Retrieval for Ptychography
Ziqiao Guan (Stony Brook University), Esther Tsai (Brookhaven National Laboratory), Xiaojing Huang (Brookhaven National Laboratory), Kevin Yager (Brookhaven National Laboratory), Hong Qin (Stony Brook University)
Abstract
Ptychography is a coherent diffractive imaging (CDI) method that captures multiple diffraction patterns of a sample with a set of shifted, localized illuminations ("probes"). The reconstruction problem, known as "phase retrieval", is conventionally solved by iterative algorithms. In this paper, we propose PtychoNet, a deep learning based method that performs phase retrieval for ptychography in a non-iterative manner. We devise a generative network that encodes a full ptychography scan, reverses the diffraction at each scanning point, and computes the amplitude and phase of the object. We demonstrate successful reconstruction using PtychoNet, as well as recovery of fine features in the case of extremely sparse scanning, where conventional methods fail to produce recognizable features.
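To make the problem statement concrete, the following is a minimal, illustrative sketch (not the authors' code) of the ptychography forward model that phase retrieval inverts: each scan position records only the far-field diffraction amplitude of a localized probe multiplied with a shifted region of the complex object, so the detector-plane phase is lost. The probe shape, object size, and scan step below are arbitrary assumptions for illustration.

```python
# Hypothetical sketch of the ptychography forward model, assuming a
# Gaussian probe and a small raster scan; not the paper's implementation.
import numpy as np

rng = np.random.default_rng(0)

# Complex object: amplitude (transmittance) times a phase factor.
obj = rng.uniform(0.5, 1.0, (64, 64)) * np.exp(1j * rng.uniform(-1, 1, (64, 64)))

# Localized illumination ("probe"): an assumed 32x32 Gaussian window.
yy, xx = np.mgrid[-16:16, -16:16]
probe = np.exp(-(xx**2 + yy**2) / (2 * 6.0**2)).astype(complex)

def diffraction_amplitude(obj, probe, top, left):
    """Far-field diffraction amplitude at one scan position.
    The detector measures only |FFT(exit wave)|; its phase is lost,
    which is exactly what phase retrieval must recover."""
    view = obj[top:top + probe.shape[0], left:left + probe.shape[1]]
    exit_wave = probe * view
    return np.abs(np.fft.fftshift(np.fft.fft2(exit_wave)))

# A sparse raster scan with overlapping probe positions.
patterns = [diffraction_amplitude(obj, probe, t, l)
            for t in range(0, 33, 16) for l in range(0, 33, 16)]
print(len(patterns), patterns[0].shape)  # 9 diffraction patterns, each 32x32
```

Iterative solvers alternate between these measured amplitudes and object-space constraints; PtychoNet instead maps the full set of patterns directly to object amplitude and phase in a single forward pass.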
DOI
10.5244/C.33.222
https://dx.doi.org/10.5244/C.33.222
BibTeX
@inproceedings{BMVC2019,
title={PtychoNet: Fast and High Quality Phase Retrieval for Ptychography},
author={Ziqiao Guan and Esther Tsai and Xiaojing Huang and Kevin Yager and Hong Qin},
year={2019},
month={September},
pages={222.1--222.13},
articleno={222},
numpages={13},
booktitle={Proceedings of the British Machine Vision Conference (BMVC)},
publisher={BMVA Press},
editor={Kirill Sidorov and Yulia Hicks},
doi={10.5244/C.33.222},
url={https://dx.doi.org/10.5244/C.33.222}
}