Mobile phones have facilitated the creation of field-portable, cost-effective imaging and sensing technologies that approach laboratory-grade instrument performance. However, the optical imaging interfaces of mobile phones are not designed for microscopy and introduce distortions when imaging microscopic specimens. Here, we report the use of deep learning to correct such distortions introduced by mobile-phone-based microscopes, producing high-resolution, denoised, and color-corrected images that match the performance of benchtop microscopes with high-end objective lenses, while also extending the limited depth of field of the mobile-phone devices. After training a convolutional neural network, we successfully imaged various samples, including human tissue sections and Papanicolaou and blood smears, where the recorded images were highly compressed to ease storage and transmission. This method is applicable to other low-cost, aberrated imaging systems and could offer alternatives to costly and bulky microscopes, while also providing a framework for the standardization of optical images for clinical and biomedical applications.
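The correction step described above can be pictured as a learned image-to-image convolutional mapping: a network takes an aberrated mobile-phone image and outputs a restored one. A minimal NumPy sketch of that structure is below; the layer count, kernel sizes, and weights here are illustrative assumptions, not the paper's trained network.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution of a single-channel image with a square kernel."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def tiny_enhancer(image, k1, k2):
    """Two-layer convolutional mapping: conv -> ReLU -> conv.
    A stand-in for a trained CNN that maps a distorted mobile-phone
    image toward a benchtop-quality image (weights are arbitrary here)."""
    hidden = np.maximum(conv2d(image, k1), 0.0)  # ReLU nonlinearity
    return conv2d(hidden, k2)

rng = np.random.default_rng(0)
aberrated = rng.random((16, 16))        # stand-in for a distorted field of view
k1 = rng.standard_normal((3, 3)) * 0.1  # untrained, illustrative weights
k2 = rng.standard_normal((3, 3)) * 0.1
restored = tiny_enhancer(aberrated, k1, k2)
print(restored.shape)  # each 3x3 'valid' conv trims 2 pixels per axis
```

In an actual system the kernels would be learned by minimizing a pixel-wise loss between network outputs and co-registered benchtop-microscope images, and the network would be far deeper; this sketch only shows the forward mapping.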
Original language: English (US)
Number of pages: 11
State: Published - Mar 15 2018
Bibliographical note: KAUST Repository Item: Exported on 2020-10-01
Acknowledgements: The Ozcan Research Group at UCLA acknowledges the support of the NSF Engineering Research Center (ERC, PATHS-UP), the Army Research Office (ARO; W911NF-13-1-0419 and W911NF-13-1-0197), the ARO Life Sciences Division, the National Science Foundation (NSF) CBET Division Biophotonics Program, the NSF Emerging Frontiers in Research and Innovation (EFRI) Award, the NSF EAGER Award, the NSF INSPIRE Award, the NSF Partnerships for Innovation: Building Innovation Capacity (PFI:BIC) Program, the Office of Naval Research (ONR), the National Institutes of Health (NIH), the Howard Hughes Medical Institute (HHMI), the Vodafone Americas Foundation, the Mary Kay Foundation, the Steven & Alexandra Cohen Foundation, and KAUST.
This publication acknowledges KAUST support, but has no KAUST-affiliated authors.