Abstract: In Talbot-Lau interferometry, the sample position that yields the highest phase sensitivity suffers from strong geometric blur. This trade-off between phase sensitivity and spatial resolution is a fundamental challenge in such interferometric imaging applications with either neutron or conventional x-ray sources, owing to their relatively large beam-defining apertures or focal spots. In this study, a deep learning method is introduced that estimates an image with both high phase sensitivity and high spatial resolution from a trained neural network, in an attempt to circumvent this trade-off. To this end, differential phase contrast images at a pair of sample positions, one close to the phase grating and the other close to the detector, are numerically generated and used as the inputs of the training data set for a generative adversarial network. The trained network has been applied to real experimental data sets from a neutron grating interferometer, and the resulting images are improved in both phase sensitivity and spatial resolution.
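The abstract does not specify the network architecture or training procedure. As a point of reference only, the sketch below shows one plausible realization of the stated setup: a pix2pix-style conditional generative adversarial network, written here in PyTorch, in which the two simulated differential phase contrast (DPC) images, one from a sample position near the phase grating (high sensitivity, strongly blurred) and one near the detector (sharp, less sensitive), are stacked as input channels, while the simulated ground-truth image serves as the target. All names (Generator, Discriminator, train_step) and hyper-parameters are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps the 2-channel DPC input pair to one combined DPC image (assumed architecture)."""
    def __init__(self, ch=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, 1, 3, padding=1),
        )

    def forward(self, pair):
        return self.net(pair)

class Discriminator(nn.Module):
    """PatchGAN-style critic on (input pair, candidate image) stacks."""
    def __init__(self, ch=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, ch, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(ch, 2 * ch, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(2 * ch, 1, 4, padding=1),  # raw logits per patch
        )

    def forward(self, pair, image):
        return self.net(torch.cat([pair, image], dim=1))

adv_loss = nn.BCEWithLogitsLoss()  # adversarial term
l1_loss = nn.L1Loss()              # fidelity to the simulated ground truth

def train_step(gen, disc, opt_g, opt_d, pair, target, l1_weight=100.0):
    """One adversarial update on a batch of simulated (input pair, target) images."""
    fake = gen(pair)

    # Discriminator: distinguish simulated targets from generated images.
    opt_d.zero_grad()
    d_real = disc(pair, target)
    d_fake = disc(pair, fake.detach())
    loss_d = adv_loss(d_real, torch.ones_like(d_real)) + \
             adv_loss(d_fake, torch.zeros_like(d_fake))
    loss_d.backward()
    opt_d.step()

    # Generator: fool the discriminator while staying close to the target.
    opt_g.zero_grad()
    d_fake = disc(pair, fake)
    loss_g = adv_loss(d_fake, torch.ones_like(d_fake)) + l1_weight * l1_loss(fake, target)
    loss_g.backward()
    opt_g.step()
    return loss_d.item(), loss_g.item()
```

In such a setup one would instantiate, for example, gen = Generator(), disc = Discriminator(), and Adam optimizers (e.g. torch.optim.Adam(gen.parameters(), lr=2e-4)), loop train_step over the numerically generated training pairs, and then apply the trained generator to measured DPC image pairs from the neutron grating interferometer; again, these choices are assumptions made for illustration.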