Computation-based time-resolved multispectral fluorescence microscopy

I. INTRODUCTION

Fluorescence microscopy is a powerful technique to study physical, chemical, and biological processes over a broad range of applications [1]. It generally comprises an imaging system, which aims at revealing the fluorophores' localization, and various methods have been proposed over the years to maximize the information attainable from the multidimensional fluorescence signal. Indeed, the emission spectrum represents a fingerprint of a fluorophore, and consequently, multispectral/hyperspectral microscopy [2] has the capability to discriminate fluorophores and to study their spectroscopic properties. Its applications span from biomedicine [3,4] to materials science [5]. Similarly, the information provided by the fluorescence temporal decay can be exploited to better discriminate spectrally overlapping fluorophores [6,7] and to probe their local environment [8] (e.g., pH, temperature, chemical bonds, and energy transfer [6,9]), which affects the de-excitation pathways. Time-resolved microscopy has found important applications in the study of cellular metabolism [10,11] and of natural and artificial photosynthetic processes [12], and in the characterization of functional properties of engineered materials [13].

In the most common acquisition schemes, the sample is scanned across the area of interest with short light pulses, and the emitted fluorescence, after being spectrally dispersed (e.g., by a grating or a prism), is acquired in parallel by a linear array of detectors or by a single detector while the dispersive element is rotated (monochromator mode) [2]. The fluorescence temporal profile can be measured directly in the time domain by using a gated detector or by the Time-Correlated Single-Photon Counting (TCSPC) technique [7]. In the latter case, each measurement is repeated until a sufficient number of photons has been collected to draw a histogram of their arrival times. The most common detectors are Photomultiplier Tubes (PMTs) and Single-Photon Avalanche Diodes (SPADs). A wide-field scheme is frequently based on a time-gated intensified camera, which has the advantage of directly providing an image (thanks to parallelization in space) and the disadvantages of a limited number of temporal samples (serial acquisition of delayed temporal gates) and a lower temporal resolution, about 100 ps compared to a few tens of ps for TCSPC [7,8]. The time-resolved signal is then typically fitted to a multi-exponential model to retrieve maps of the fluorescence lifetimes and relative amplitudes.

Imaging complemented with both temporal and spectral dimensions results in a four-dimensional dataset [14,15] that increases the specificity of fluorescence microscopy. However, more dimensions generally lead to a longer acquisition time, which should be limited when measuring in vivo or photosensitive specimens [16].

An efficient acquisition scheme is the Single-Pixel Camera (SPC) [17], which makes it possible to acquire multidimensional information with a point-like detector (or a linear array) and to reduce the number of measurements. In the SPC scheme, a target is illuminated with a series of sampling masks (patterns) generated on a Spatial Light Modulator (SLM), and the emitted light is integrated on a point-like detector by a lens. Each measurement is the hardware implementation of the inner product between the target emission and a pattern; a computational step is then required for image reconstruction [18].

In the past decade, much interest has been devoted to Compressive Sensing (CS), a sampling strategy that requires less acquisition effort than conventional schemes while preserving the information content. This is possible by performing the acquisition in a domain that is incoherent with the one in which the information is assumed to be sparse [19,20]. In the SPC design, this is achieved through a proper choice of patterns. Thereby, by combining SPC with CS, the image can be inferred from a number of measurements M lower than the number of pixels N. Recently, compression ratios (CR = 1 − M/N) higher than 90% have been reported [21,22]. It is worth recalling that, in this case, image reconstruction requires the solution of an ill-posed inverse problem, and thus regularization is needed [23].

A few architectures have been proposed for compressive fluorescence microscopy [22]. CS multispectral systems are frequently based on an SPC coupled to a spectrometer [24], on interferometric techniques [25], or on coded imaging [26], while CS time-resolved systems are generally based on an SPC coupled to a PMT or a SPAD detector [15,27–29]. An SPC with a spectrometer and a multichannel time-resolved detector has been presented by Pian et al. [15] for macro-/mesoscopic applications. Streak cameras have also been used as multidimensional detectors, either with an SPC [30] or for compressive ultrafast spectral FLIM [31]. It is worth stressing that all these acquisition schemes share a computational imaging paradigm in which a strong integration between hardware and software improves the performance [32] of the conventional microscope in terms of acquisition speed and information content.

However, the spatial resolution of SPC systems is generally limited by the trade-off among several factors: measurement/reconstruction time, compression ratio, and frame rate. The typical number of pixels of an SPC image is less than 128 × 128 [33] (the actual size of the field-of-view is set by the projection optics). A better spatial resolution is possible at the cost of a longer acquisition time, which is not always compatible with the sample. One strategy to mitigate this limit is provided by Data Fusion (DF). This computational imaging method was first adopted in remote sensing and then spread to several areas, including autonomous vehicles and biomedical imaging [34,35]. With DF, the information obtained with different imaging modalities can be merged into a single dataset. Recently, Soldevila et al. [36] employed two SPCs to measure, respectively, a time-resolved and a multispectral dataset with a low spatial resolution, and a standard Complementary Metal–Oxide–Semiconductor (CMOS) camera to obtain a high-spatial-resolution image. The result, after DF, is a single multispectral and time-resolved dataset with a higher spatial resolution.

In this work, we present a novel wide-field, multispectral, time-resolved fluorescence microscope based on computational imaging techniques. This scheme provides spectral, temporal, and spatial information with a significant reduction in the acquired dataset by combining a single-pixel camera and a conventional pixelated detector [a Charge-Coupled Device (CCD) camera] with compressive sensing and data fusion techniques. We take advantage of the proposed versatile design to demonstrate a non-mechanical zoom capability, which allows us to reconstruct, with high resolution, a region of interest (ROI) smaller than the field-of-view (FOV). We describe and characterize the experimental system, and we demonstrate its potential on cell samples.

II. MATERIALS AND METHODS

The experimental system is shown in Fig. 1.

The illumination path features a 40 MHz mode-locked supercontinuum fiber laser (Fianium Inc., SC-450) with a pulse duration of a few tens of ps. The laser beam is filtered by a bandpass filter [10 nm full width at half maximum (FWHM)] centered at 520 nm (FB520-10, Thorlabs, Inc.), coupled into a graded-index fiber (100 μm diameter), and then expanded and projected by a lens (L1, f = 25 mm) onto a Digital Micromirror Device (DMD, V-7000, ViaLUX GmbH) acting as a binary reflective SLM. The DMD consists of a 1024 × 768 matrix of independently tiltable (±12°) micromirrors, each 13.7 × 13.7 μm² in size. The DMD chip is rotated by 45° to keep both the incident and the reflected beams in the horizontal plane. Since a DMD acts as a two-dimensional grating for coherent light [37], the angle of the incident light is tuned to approach the blaze condition at the illumination wavelength. The system's apertures remove all the diffraction orders of the DMD except the central one, which helps prevent diffraction-induced distortions in the illumination patterns [38].
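For reference, the blaze condition being approached can be summarized with the standard grating relations (a generic reminder rather than a result taken from Ref. 37; d is the effective mirror pitch along the tilt direction, θ_t the micromirror tilt, and angles are measured from the DMD surface normal with signs fixed by the chosen convention):

\[
d\,(\sin\theta_i + \sin\theta_m) = m\lambda , \qquad \theta_m \approx 2\theta_t - \theta_i ,
\]

i.e., the incidence angle θ_i is tuned so that the diffraction order retained by the system's apertures nearly coincides with the specular reflection off the tilted micromirrors.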

The modulated light is collected by a 2-in. achromatic doublet lens (L2, f = 200 mm, AC508-200-A, Thorlabs, Inc.), reflected by a dichroic mirror (DM, cut-on 550 nm, DMLP550R, Thorlabs, Inc.) toward an objective (Obj., Plan N 40×, NA 0.65, infinity corrected, Olympus Corp.), and thus sent to the specimen.

In the detection path, the fluorescence is selected by a long-pass filter at 550 nm (F1, FELH0550, Thorlabs, Inc.) and can be directed to a spectrometer (Acton SP-2151i, Princeton Instruments, Inc.) by means of a lens (L3, f = 50 mm) and an aperture-converting fiber bundle (from a 3 mm diameter to 1 × 10 mm²). The spectrometer includes a diffraction grating (600 lines/mm) that disperses the light toward a 16-channel PMT detector (PML-16-1, Becker & Hickl GmbH), whose channels each have a bandwidth of ∼9 nm FWHM. The PMT channels were corrected for their spectral efficiencies using a commercial spectrometer (Hamamatsu TM-VIS/NIR C10083CA-2100) as a reference. The detected signal is then processed with a TCSPC board (SPC-130-EM, Becker & Hickl GmbH) with 4096 temporal channels. Patterns are preloaded into the DMD memory and then projected for a pre-determined exposure time. A trigger signal generated by the DMD allows us to synchronize the TCSPC acquisition with each pattern. The temporal Instrumental Response Function (IRF) is acquired by means of a reflecting surface (a silicon wafer) at the sample plane, without the long-pass filter F1. All 16 PMT channels show an IRF width of less than 180 ps FWHM.
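As an illustration of the spectral-efficiency correction mentioned above, the following is a minimal sketch (Python/NumPy on synthetic numbers, not the authors' calibration code): per-channel correction factors are derived from a reference spectrum measured with the calibrated spectrometer and then applied to the 16-channel TCSPC data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ch = 16

# Reference emission binned onto the 16 PMT bands, as it would be reported by the
# calibrated spectrometer (synthetic values standing in for real calibration data).
ref_spectrum = rng.uniform(0.5, 1.5, n_ch)

# The same emission as seen by the PMT channels, distorted by unknown efficiencies.
true_eff = rng.uniform(0.5, 1.0, n_ch)      # simulated channel efficiencies
counts_pmt = ref_spectrum * true_eff

# Per-channel correction factors (normalized so the most efficient channel is unchanged).
corr = ref_spectrum / counts_pmt
corr /= corr.min()

# Apply the correction to a (temporal bins x spectral channels) TCSPC histogram stack.
decays = rng.poisson(100, size=(4096, n_ch)).astype(float)
decays_corrected = decays * corr[None, :]
```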

With a flip mirror (FM), light can be delivered to a 16-bit, 512 × 512 pixel cooled CCD camera (VersArray 512, Princeton Instruments) through an achromatic doublet lens (L4, f = 200 mm). The camera is used both for focusing operations and for acquiring a high-resolution image (256 × 256 pixels) of the sample. Another filter (F2, FESH0700, Thorlabs, Inc.) is placed in front of the camera so that this detector works within the same spectral band as the 16-channel PMT.

The acquisition with the SPC has the following mathematical formulation:

\[
\mathbf{m}_{M\times 1} = \mathbf{W}_{M\times N}\,\mathbf{x}_{N\times 1},
\]

with x_{N×1} being the target to be imaged reshaped into a column array, W_{M×N} the measurement matrix, and m_{M×1} the vector of measurements. In this work, W_{M×N} is built from the first M rows of the scrambled Hadamard matrix, which is obtained, in turn, from random permutations of the columns and rows of the Hadamard matrix H_{N×N} with N = 1024 [39,40]. This means that the DMD modulates the light according to the rows of W_{M×N}, reshaped into a collection of M patterns of size N = 32 × 32. We substituted the native (+1, −1) matrix elements with (+1, 0) so that the patterns can be encoded onto the DMD. Moreover, scrambled Hadamard patterns are pseudo-random and, thus, well suited for CS. Each pattern contains 1 and 0 elements in equal amounts, except for one pattern that corresponds to uniform illumination (only 1 values): to prevent the latter from saturating the dynamic range of the detector, a measurement is performed with the complement of another pattern, and the two are subsequently added together [41]. The spectral and temporal dimensions of each element of m_{M×1} are implicit, as they are acquired in parallel.

With the 40× objective, each pattern illuminates an area of about 150 × 150 μm², which is quantized with 32 × 32 pixels (each pixel involves 16 × 16 DMD mirrors). However, the setup design offers a handy zoom feature to increase the spatial resolution [28]: by using a calibrated and registered camera image (e.g., the one obtained at the end of the focusing operations), a smaller area can be selected, and the patterns can be projected therein by simply adjusting the mapping of the patterns onto the DMD micromirrors. Hence, 32 × 32 pixels are exploited over different fields of view. This represents a method for multiscale imaging that preserves the image's aspect ratio between zoom levels and does not require any change to the optical setup.

When the number of patterns M equals the desired number of pixels N, the image can be reconstructed by a straightforward matrix inversion. If M < N patterns have been collected, the dataset m_{M×1} requires a decoding step into pixel space with a reconstruction algorithm. We chose total variation minimization by augmented Lagrangian and alternating direction algorithms (TVAL3) [42], a state-of-the-art solver in CS. A background noise subtraction is performed on the experimental data within each spectral channel, using the mean value of the time-resolved signal over 1 ns before the arrival of the pulse; then, a low-resolution image (32 × 32) is reconstructed for each time bin and each spectral channel, leading to a 4D hypercube y_spc.
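The following is a minimal sketch of the acquisition model above (Python/NumPy with scipy.linalg.hadamard; a toy illustration rather than the actual acquisition software): it builds a scrambled Hadamard measurement matrix, converts it to (+1, 0) DMD patterns, simulates single-pixel measurements, and performs the direct inverse transform valid when M = N. For M < N, only the first M rows would be kept and a CS solver such as TVAL3 would replace the inversion.

```python
import numpy as np
from scipy.linalg import hadamard

N = 1024                                   # number of image pixels (32 x 32)
rng = np.random.default_rng(0)

H = hadamard(N)                            # native (+1, -1) Hadamard matrix
row_perm = rng.permutation(N)
col_perm = rng.permutation(N)
W = H[row_perm][:, col_perm]               # scrambled Hadamard measurement matrix

P = (W + 1) // 2                           # (+1, 0) patterns that can be loaded on the DMD
x = rng.random(N)                          # toy fluorescence image, reshaped to a column

m_dmd = P @ x                              # what the single-pixel detector integrates
# Recover the (+1, -1) measurements: W = 2P - 1 (all-ones), so W x = 2 P x - sum(x).
# The all-ones row (uniform-illumination pattern) directly provides sum(x); in the
# experiment it is obtained by summing a pattern and its complement to avoid saturation.
uniform_idx = int(np.where(row_perm == 0)[0][0])
m = 2.0 * m_dmd - m_dmd[uniform_idx]

# With M = N, W^T W = N I, so the image is recovered by a direct inverse transform.
x_rec = W.T @ m / N
print("max reconstruction error:", np.abs(x_rec - x).max())
```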
Further details on the use and settings of TVAL3 can be found in the supplementary material.

y_spc can be fused with the CCD camera image y_ccd, which has a higher spatial resolution (for every pixel in y_spc, there are 64 in y_ccd), to reconstruct a high-resolution 4D hypercube named X. This is obtained with a variational approach through the minimization of the following merit function F(X) [36]:

\[
F(X) = \frac{1}{2\varepsilon}\,\lVert S\,T\,X - y_{\mathrm{ccd}} \rVert_2^2 \;+\; \frac{1}{2}\,\lVert R_L\,X - y_{\mathrm{spc}} \rVert_2^2 \;+\; \frac{1}{2\delta}\,\lVert R_G\,X - \tilde{y}_{\mathrm{spc}} \rVert_2^2 ,
\]

where S is an operator that integrates the multidimensional variable across the spectral dimension and, analogously, T integrates across the temporal dimension (i.e., S and T applied to X lead to an image with the same dimensions as y_ccd), R_L is a spatial down-sampling operator, and R_G integrates over the spatial dimensions to give a global wavelength–time map. ỹ_spc is the wavelength–time map obtained from y_spc after integration over the spatial dimensions. ε and δ are two hyperparameters that are tuned to find the best trade-off among the three contributions to F(X). The first term expresses the fidelity of the reconstructed data to the CCD image, the second the fidelity of the down-sampled reconstructed data to the SPC data, and the third the global fidelity of the time–wavelength map. Minimization is performed with a conjugate-gradient descent with line search [36,43] (a simplified numerical sketch of this minimization is given at the end of this section). A background noise subtraction is required for y_ccd, too. The camera image is used to create a pixel mask for y_spc, based on a fixed threshold, to exclude noisy pixels lying mainly in the background. The CCD image and the SPC data are normalized to have the same energy.

Two types of samples have been measured to validate the system's capabilities. First, the imaging properties have been characterized with fluorescent beads (4 μm diameter each) deposited on a microscope slide (FocalCheck F36909, Invitrogen). Each bead is stained with four different fluorophores, two of which emit within the detection spectral window. The second is a sample of fixed HEK-293 (ATCC) cells. The cells are treated with poly(3-hexylthiophene-2,5-diyl) nanoparticles (P3HT NPs; abs. 500 nm, em. 650 nm), and their actin filaments are stained with phalloidin conjugated to Alexa Fluor 488 (ALF; abs. 490 nm, em. 520 nm; Sigma-Aldrich). Additional information about the sample preparation is reported in the supplementary material. These samples are of particular interest because of the possible role played by nanoparticles at the interface with living cells: organic and inorganic nanoparticles have proven to be valuable tools for optical cell stimulation, biosensing, and drug delivery, among other applications, even reaching clinical trials [44–46].
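The sketch below illustrates the data-fusion step on toy arrays (Python/NumPy; not the authors' code). It evaluates the merit function F(X) and runs a plain gradient-descent loop instead of the conjugate-gradient scheme with line search used in the paper; the concrete operator choices (plain sums for S, T, and R_G, block averaging for R_L) and all array sizes are assumptions made for illustration.

```python
import numpy as np

ny, nx, nt, nl = 64, 64, 32, 16                 # high-res grid: y, x, time, wavelength (toy sizes)
b = 8                                           # spatial binning factor between CCD and SPC grids
eps, delta = 1e-2, 1e-1                         # hyperparameters (manually tuned in the paper)

rng = np.random.default_rng(1)
y_ccd = rng.random((ny, nx))                    # high-res, spectrally/temporally integrated image
y_spc = rng.random((ny // b, nx // b, nt, nl))  # low-res 4D SPC hypercube
y_glob = y_spc.sum(axis=(0, 1))                 # global wavelength-time map (the y~_spc term)

def ST(X):    # S and T: integrate over wavelength and time -> CCD-like image
    return X.sum(axis=(2, 3))

def RL(X):    # R_L: spatial down-sampling by block averaging (assumed form)
    return X.reshape(ny // b, b, nx // b, b, nt, nl).mean(axis=(1, 3))

def RG(X):    # R_G: integrate over space -> global wavelength-time map
    return X.sum(axis=(0, 1))

def merit(X):
    return (np.sum((ST(X) - y_ccd) ** 2) / (2 * eps)
            + np.sum((RL(X) - y_spc) ** 2) / 2
            + np.sum((RG(X) - y_glob) ** 2) / (2 * delta))

def grad(X):
    g1 = (ST(X) - y_ccd)[:, :, None, None] / eps                           # adjoint of S,T
    g2 = np.repeat(np.repeat(RL(X) - y_spc, b, axis=0), b, axis=1) / b**2  # adjoint of R_L
    g3 = (RG(X) - y_glob)[None, None, :, :] / delta                        # adjoint of R_G
    return g1 + g2 + g3

# Initialize by spreading the SPC hypercube onto the high-res grid, then descend.
X = np.repeat(np.repeat(y_spc, b, axis=0), b, axis=1) / b**2
step = 1.0 / (nt * nl / eps + 1.0 + ny * nx / delta)   # conservative step for stability
for _ in range(500):
    X -= step * grad(X)
print("merit after fusion sketch:", merit(X))
```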

III. RESULTS AND DISCUSSION

Figure 2(a) shows a wide-field fluorescence image of the bead sample acquired with the CCD camera with a 1 s exposure time and uniform illumination. The same field of view (FOV) is acquired by the SPC using 307 patterns (CR = 70%) for 10 ms each (3 s total measurement time), resulting in a 4D dataset (x, y, time, and spectrum). The sample is illuminated with 170 μW, corresponding to ∼1 × 10⁶ counts per second on the TCSPC board. The spectral dimension is resolved with 16 channels over the 550–690 nm band and with 256 temporal bins of 100 ps each. Figure 2(b) shows the reconstructed image obtained by integrating over the entire emission spectrum and over a temporal window of 10 ns from the fluorescence peak.

Data fusion is applied to the two datasets. Figure 2(c) shows the reconstructed images at several iterations of the regularization process, from the acquired SPC image to the one after DF, within the small area indicated by the red square in Fig. 2(a). Initially, the two highlighted spots (blue and orange) contain the same spectral and temporal information (graphs in the upper row), as they belong to the same pixel of the low-resolution image. After DF, the orange spot has merged into the background, while the blue one corresponds to a bead. The temporal and spectral signals (graphs in the lower row) have differentiated, too: the orange plots approach the baseline, while the blue plots still contain a spectrum and a decay.

As previously stated, one of the advantages of the proposed SPC-based microscope is the capability to zoom into a specific area without changing the optics while measuring the emission spectrum and temporal decay. Figures 2(d) and 2(e) show the zoom over a group of three beads within the red square in Fig. 2(a), without and with compression (CR = 90%; M = 102 measurements), respectively. The zoomed area is about 20 × 20 μm², exploiting 64 × 64 DMD mirrors, and it has been discretized into 32 × 32 pixels. Hence, the pixel size of the zoomed image is about 0.6 μm, which is equivalent to the spatial resolution achievable with our CCD camera. The measurement time is 20 ms per pattern, which leads to a total acquisition time of 2 s for CR = 90%. In this case, it is still possible to recognize the shape of the three beads, even if a small loss of spatial detail is observed.

An analysis of the impact of compression on image quality and on the spectral and temporal information has been performed and is reported in the supplementary material. In particular, in Fig. S1(c), the Peak Signal-to-Noise Ratio (PSNR) with respect to the CCD camera image, taken as a reference, is plotted as an indicator of image degradation as the compression level increases. This parameter is calculated on images integrated over the full spectral band and over a 10 ns temporal window from the fluorescence peak. We observe a slow decay from the uncompressed case up to CR = 80%; beyond CR = 90%, there is a rapid degradation. For this reason, we have considered CR = 90% as the highest acceptable compression level for this dataset. Three images at intermediate values of CR (60%, 80%, and 95%) are reported in Fig. S2.

Once the image quality has been evaluated, it is important to quantify the effect of CS and DF on the reconstructed spectral and temporal behaviors.
With the 4D dataset from the zoomed image, we performed an analysis by varying the compression ratio from 10% to 90%, shown in Fig. S1. A quantitative analysis of the similarity among spectra at different CRs is summarized in Table S3 by calculating the mean squared error (MSE) with respect to the uncompressed case, as reported in the supplementary material. Up to CR = 90%, the MSE is less than 0.3%. Temporal curves are fitted to a mono-exponential model, and the corresponding lifetimes are compared at different CRs; up to CR = 80%, the maximum deviation from the uncompressed case remains small (the analysis is detailed in the supplementary material, and a compact numerical sketch of these figures of merit is given below).

We then measured a cellular sample composed of ALF-stained HEK cells and P3HT nanoparticles. Following a workflow similar to the one adopted for the bead sample, a CCD image [Fig. 3(a)] is acquired over a wide field of view (150 × 150 μm²) with an exposure time of 1 s. The same FOV is measured with the SPC using 205 patterns (CR = 80%), projected for 50 ms each (about 10 s of measurement time). The sample is illuminated with 280 μW. The corresponding dataset, integrated over the whole spectral dimension (580–720 nm) and over a 10 ns window from the fluorescence peak, is reported in Fig. 3(b).
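For concreteness, the following sketch (Python/NumPy and SciPy on synthetic data; not the authors' analysis code) shows the three figures of merit used in the compression analysis above: the PSNR of a reconstruction against the CCD reference, the MSE between area-normalized spectra, and a mono-exponential lifetime fit of a TCSPC decay (the instrument response is neglected here, and the 3.14 ns value is only a convenient ground truth).

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)

# --- PSNR of a reconstructed image against the CCD reference ---
def psnr(img, ref):
    mse = np.mean((img - ref) ** 2)
    return 10.0 * np.log10(ref.max() ** 2 / mse)

ref = rng.random((256, 256))
rec = ref + 0.05 * rng.standard_normal((256, 256))
print("PSNR [dB]:", psnr(rec, ref))

# --- MSE between spectra at different CRs (spectra normalized to unit area) ---
def spectral_mse(s, s_ref):
    s, s_ref = s / s.sum(), s_ref / s_ref.sum()
    return np.mean((s - s_ref) ** 2)

# --- Mono-exponential lifetime fit of a TCSPC decay ---
t = np.arange(256) * 0.1                         # 256 bins of 100 ps -> time axis in ns
tau_true = 3.14
decay = 1000.0 * np.exp(-t / tau_true) + rng.poisson(5, t.size)

def model(t, a, tau, bkg):
    return a * np.exp(-t / tau) + bkg

(a, tau, bkg), _ = curve_fit(model, t, decay, p0=(decay.max(), 1.0, 0.0))
print("fitted lifetime [ns]:", round(tau, 2))
```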

One control microscope slide containing HEK cells marked with the ALF fluorophore and another containing P3HT alone (no cells) were measured with the CCD camera [Figs. S3(a) and S3(d)] and with the time-resolved multichannel PMT [Figs. S3(b) and S3(e); Figs. S3(c) and S3(f)], in order to characterize the fluorophores spectrally and temporally. ALF is characterized by a mono-exponential behavior with a lifetime of about 3.14 ns and P3HT by a lifetime shorter than 200 ps.

By means of these preliminary measurements, ALF can be highlighted by selecting a spectral band between 580 and 595 nm and a time gate of 7 ns starting 1.5 ns after the fluorescence peak. Similarly, P3HT is better discriminated with a spectral band of 650–700 nm and a time gate of 1 ns from the peak. Data fusion is applied as described above, and the resulting two gated-and-filtered images, assigned to the corresponding spectral ranges, are visualized in Fig. 3(c) (a minimal sketch of this gating step is given at the end of this section). Multidimensionality and data fusion can thus provide a synthetic representation of the specimen, which is useful for rapid screening or for deciding where to proceed with further measurements. These images allow us to better localize the aggregates of P3HT nanoparticles (the red spots) inside the cells, which is quite relevant for studying their spatial and temporal internalization.

To increase the spatial resolution and, hence, discriminate smaller P3HT clusters, the SPC measurement is repeated in a zoomed FOV [marked in red in Fig. 3(b)] of size 40 × 40 μm². We used 77 patterns of 16 × 16 pixels (CR = 70%), projected for 100 ms each (total time 8 s), and the dataset is fused with the camera image of Fig. 3(a), resulting in Fig. 3(d). From the two selected regions, circled in blue and orange, the spectral [Fig. 3(e)] and temporal [Fig. 3(f)] signals are extracted. While the low signal from the orange region does not allow us to deduce its content from the emission spectrum, the temporal profile clearly shows the presence of a short lifetime, compatible with P3HT. Conversely, the blue region has a stronger signal in the leftmost spectral channels and a slower decay, corresponding to a prevalent contribution of ALF.

This set of measurements demonstrates the capability of the setup to discriminate nanoparticles in a complex environment, such as a biological one. The ability to collect multispectral information from nanoparticles could be used in the field of biosensing, exploiting lifetime-related information to enhance specificity [47,48]. Furthermore, nanoparticles can be sensing/triggering elements for biological hubs, such as synapses and intracellular organelles.
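As a complement, the gating step used for Fig. 3(c) can be written compactly. The sketch below (Python/NumPy; the array layout, channel centers, and peak position are hypothetical) integrates a fused 4D hypercube over a spectral band and over a time gate defined relative to the fluorescence peak.

```python
import numpy as np

ny, nx, nt, nl = 256, 256, 256, 16
dt = 0.1                                            # 100 ps temporal bins, in ns
wl = np.linspace(580, 720, nl)                      # assumed centers of the 16 spectral channels
X = np.random.default_rng(3).random((ny, nx, nt, nl))   # stands in for the fused dataset

def gated_image(X, peak_bin, t_start_ns, t_len_ns, wl_lo, wl_hi):
    """Integrate X over a time gate (relative to the peak) and a spectral band."""
    t0 = peak_bin + int(round(t_start_ns / dt))
    t1 = t0 + int(round(t_len_ns / dt))
    ch = (wl >= wl_lo) & (wl <= wl_hi)
    return X[:, :, t0:t1, :][..., ch].sum(axis=(2, 3))

peak_bin = 40                                       # bin index of the fluorescence peak (hypothetical)
img_alf  = gated_image(X, peak_bin, 1.5, 7.0, 580, 595)   # ALF: 580-595 nm, 7 ns gate, 1.5 ns after peak
img_p3ht = gated_image(X, peak_bin, 0.0, 1.0, 650, 700)   # P3HT: 650-700 nm, 1 ns gate from the peak
```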

IV. CONCLUSION AND FUTURE PERSPECTIVES

In this work, we have proposed and experimentally validated a time-resolved multispectral fluorescence microscope based on single-pixel camera detection and on its symbiotic use with compressive sensing and data fusion algorithms. We have shown that this integration between hardware and algorithms, at the base of the computational imaging approach, is an effective strategy for reducing the acquisition time while preserving multidimensional images with high spatial, temporal, and spectral resolution. Spectral and temporal information is acquired in parallel and fully sampled; for the spatial dimensions, we have reported images with 32 × 32 pixels obtained from a small fraction of the measurements (CR = 80% for the cell sample), whose pixel number is then increased up to 256 × 256 by exploiting a camera image. In this way, the measurement duration is only about 0.3% of that of a fully sampled SPC system with the same number of pixels in the final image and the same integration time for photon counting. This approach is important when a short measurement time is a priority, to avoid bleaching and cell damage or to capture dynamic phenomena. Moreover, it allows a handy zoom capability without changing the magnification optics. Zooming allows us to quickly observe FOVs of different sizes and, consequently, with different spatial resolutions, with relevant opportunities in biological applications.

The use of multiple detectors to observe the same sample and perform data fusion certainly raises the issue of their different spectral efficiencies. Since the PMT data are spectrally calibrated and the response of a CCD sensor is almost flat within the spectral window we used, this aspect was not critical here; in general, however, it might be useful to let the S operator of the data fusion account for the spectral efficiencies. The use of a DMD (a standard component of common light projectors), a state-of-the-art cooled camera, and a linear array of time-resolved detectors makes this design less expensive, less complex, and more sensitive than systems based on pixelated detectors.

Future developments to improve the performance, in terms of acquisition time reduction and higher temporal, spectral, and spatial resolution, involve hardware and software co-design. Algorithms for the optimization of the data fusion hyperparameters can be applied [49,50] in order to automate a process that currently relies on manual tuning. A possible strategy to further reduce the integration time relies on technological improvements of the detection chain, exploiting more efficient TCSPC systems [51,52], more sensitive detectors, or higher parallelism in the multichannel detector architecture [29]. Concerning the reconstruction algorithm, machine learning has recently been shown to be effective [32] for high CRs and lower computation times. Patterns other than those based on the Hadamard matrix [30,32,53], such as wavelet, discrete-cosine, or Morlet patterns [54,55], can be used to increase the CR and the spatial detail. In particular, optimal patterns can be designed with deep learning [56] or even with adaptive methods [57]. Moreover, a static/dynamic combination of the compressed dataset with global analysis and linear fast-fit approaches [58] can provide a faster analysis toward real-time applications. In conclusion, we believe that the proposed experimental system represents an ideal platform for the development of advanced computational imaging approaches exploiting novel algorithms.
