Abstract: Fluorescence lifetime imaging microscopy (FLIM) is a powerful technique to probe the molecular environment of fluorophores. The analysis of FLIM images is usually performed with time-consuming fitting methods. To accelerate this analysis, sophisticated deep learning architectures based on convolutional neural networks have been developed for restricted lifetime ranges, but they require long training times. In this work, we present a simple neural network composed only of fully connected layers that is able to analyze fluorescence lifetime images. It is based on the reduction of high-dimensional fluorescence intensity temporal decays into four parameters: the two phasor coordinates and the mean and amplitude-weighted lifetimes. This network, called Phasor-Net, was applied to a time-domain FLIM system excited at an 80 MHz laser repetition rate, with negligible jitter and afterpulsing. Because of the restricted time interval of 12.5 ns, the training range of the lifetimes was limited to between 0.2 and 3.0 ns, and the total photon number was lower than 10⁶, as encountered in live-cell imaging. From simulated biexponential decays, we demonstrate that Phasor-Net is more precise and less biased than standard fitting methods. We also demonstrate that this simple architecture achieves performance almost comparable to that of more sophisticated networks, but with a faster training process (15 min instead of 30 min). We finally apply our method successfully to determine biexponential decay parameters in FLIM experiments on living cells expressing EGFP linked to mCherry and fused to a plasma membrane protein.
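The four-parameter reduction described above can be illustrated with a minimal sketch. Assuming a standard first-harmonic phasor definition at the 80 MHz repetition rate and the textbook formulas for the amplitude-weighted and intensity-mean lifetimes of a biexponential decay (the variable names, bin count, and example lifetimes are illustrative, not taken from the paper's code):

```python
import numpy as np

# Illustrative sketch: reduce a noiseless simulated biexponential decay to the
# four parameters named in the abstract (phasor g, s; mean and
# amplitude-weighted lifetimes). Assumed, not the authors' implementation.

f_rep = 80e6                    # laser repetition rate (80 MHz)
omega = 2 * np.pi * f_rep       # angular frequency for the first harmonic
T = 1.0 / f_rep                 # 12.5 ns measurement window
t = np.linspace(0.0, T, 256, endpoint=False)  # time bins (256 is arbitrary)

# Biexponential decay with lifetimes inside the 0.2-3.0 ns training range
a1, tau1 = 0.6, 0.5e-9
a2, tau2 = 0.4, 2.0e-9
decay = a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

# Phasor coordinates (discrete first-harmonic cosine/sine transforms,
# normalized by total intensity)
g = np.sum(decay * np.cos(omega * t)) / np.sum(decay)
s = np.sum(decay * np.sin(omega * t)) / np.sum(decay)

# Amplitude-weighted and (intensity-weighted) mean lifetimes
tau_amp = (a1 * tau1 + a2 * tau2) / (a1 + a2)
tau_mean = (a1 * tau1**2 + a2 * tau2**2) / (a1 * tau1 + a2 * tau2)
```

For a decay inside the window, (g, s) lands inside the universal semicircle of the phasor plot, and tau_mean exceeds tau_amp whenever the two lifetimes differ; these four numbers form the low-dimensional input that replaces the full temporal decay.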