We demonstrate that a deep neural network can significantly improve optical microscopy, enhancing its spatial resolution over a large field of view and depth of field. After its training, the only input to this network is an image acquired using a regular optical microscope, without any changes to its design. We blindly tested this deep learning approach using various tissue samples that were imaged with low-resolution, wide-field systems, where the network rapidly outputs an image with better resolution, matching the performance of higher numerical aperture lenses and also significantly surpassing their limited field of view and depth of field. These results are significant for various fields that use microscopy tools, including the life sciences, where optical microscopy is considered one of the most widely used and deployed techniques. Beyond such applications, the presented approach might be applicable to other imaging modalities, also spanning different parts of the electromagnetic spectrum, and can be used to design computational imagers that get better as they continue to image specimens and establish new transformations among different modes of imaging.
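The inference step described above, where a trained network takes a single wide-field image and outputs an enhanced one, can be sketched as a feed-forward pass through stacked convolutional layers. This is a minimal illustrative sketch only: the kernels below are placeholders, not the trained weights, and the real network architecture is not specified in this abstract.

```python
import numpy as np

def conv2d(img, kernel):
    """'Same' 2-D convolution of a single-channel image via zero padding."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

def enhance(low_res_img, layers):
    """Feed-forward pass: convolution layers with ReLU between them.

    `layers` is a list of 2-D kernels standing in for learned filters;
    after training, the only input at inference time is the image itself.
    """
    x = low_res_img.astype(float)
    for k, kernel in enumerate(layers):
        x = conv2d(x, kernel)
        if k < len(layers) - 1:
            x = np.maximum(x, 0.0)  # ReLU on hidden layers only
    return x
```

Note that the output has the same pixel grid as the input; the resolution gain in the paper comes from the learned filters recovering higher-frequency content, not from changing the image size.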
| Original language | English (US) |
| State | Published - Nov 17 2017 |
Bibliographical note: KAUST Repository Item: Exported on 2020-10-01
Acknowledgements: Presidential Early Career Award for Scientists and Engineers (PECASE); Army Research Office (ARO) (W911NF-13-1-0419, W911NF-13-1-0197); ARO Life Sciences Division; National Science Foundation (NSF) (0963183); Division of Chemical, Bioengineering, Environmental, and Transport Systems (CBET) Division Biophotonics Program; Division of Emerging Frontiers in Research and Innovation (EFRI); NSF EAGER Award; NSF INSPIRE Award; NSF Partnerships for Innovation: Building Innovation Capacity (PFI:BIC) Program; Office of Naval Research (ONR); National Institutes of Health (NIH); Howard Hughes Medical Institute (HHMI); Vodafone Foundation; Mary Kay Foundation (TMKF); Steven & Alexandra Cohen Foundation; King Abdullah University of Science and Technology (KAUST); American Recovery and Reinvestment Act of 2009 (ARRA); European Union's Horizon 2020 Framework Programme (H2020); H2020 Marie Sklodowska-Curie Actions (MSCA) (H2020-MSCA-IF-2014-65959).
This publication acknowledges KAUST support, but has no KAUST-affiliated authors.