Computational Alleviation of Depth-dependent Degradation in Fluorescence Images
This technology includes an approach that substantially reduces the effects of depth-dependent degradation in fluorescence microscopy images. First, we develop realistic ‘forward models’ of the depth-dependent degradation and apply them to shallow imaging planes, which are expected to be relatively free of such degradation. In doing so, we create synthetic image planes that exhibit the degradation found in deeper imaging planes. Second, we train neural networks to remove this degradation, using the shallow images as ground truth. The trained networks can then be applied to images acquired deeper into the volume, reducing the degradation in those planes. A key insight of this procedure is that many acquired image volumes already contain shallow ‘ground truth’ planes that can be used for deep learning, eliminating the need to separately acquire ‘clean’ training data. We have applied this method to diverse microscopy datasets, including imaging volumes acquired with light-sheet microscopes.
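The pairing of a synthetic forward model with clean shallow planes can be sketched as follows. This is a minimal illustration, not the actual forward model used in the technology: the depth-dependent blur, exponential attenuation, shot noise, and all parameter names and values below are hypothetical stand-ins for whatever degradation model fits a given microscope.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def degrade_plane(plane, depth_um, blur_per_um=0.02, atten_per_um=0.005, rng=None):
    """Apply a hypothetical depth-dependent forward model to a clean shallow
    plane: Gaussian blur whose width grows with depth, Beer-Lambert-style
    intensity attenuation, and optional Poisson shot noise. All parameters
    are illustrative assumptions, not values from the original method."""
    sigma = blur_per_um * depth_um             # blur width grows with depth
    atten = np.exp(-atten_per_um * depth_um)   # exponential signal loss
    degraded = gaussian_filter(plane.astype(float), sigma) * atten
    if rng is not None:                        # optionally add shot noise
        degraded = rng.poisson(np.clip(degraded, 0, None)).astype(float)
    return degraded

def make_training_pairs(shallow_planes, depths_um, rng=None):
    """Build (synthetically degraded input, clean shallow target) pairs for
    supervised training of a restoration network."""
    return [(degrade_plane(p, d, rng=rng), p)
            for p in shallow_planes for d in depths_um]
```

Once trained on such pairs, the restoration network is applied to real planes acquired at the corresponding depths; no separately acquired clean volumes are needed.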