AI-powered cell-imaging technique avoids use of toxic dyes

Published January 11, 2021.
Time-lapse gradient light interference microscopy, or GLIM, left, and phase imaging with computational specificity imaged over seven days. (Credit: Popescu group)

A University of Illinois at Urbana-Champaign research group developed an imaging method that uses artificial intelligence to reveal important visual information of live cells without using fluorescent dyes, which are toxic to some specimens.

Described in a paper published in Nature Communications, the “label-free” method was intended to overcome the weaknesses of fluorescence microscopy, the most common imaging tool in biology. Staining cells and shining high-energy light on them to make certain features glow comes with the downside of destroying them with the dyes or light, which can limit scientific investigations, according to the researchers.

“Once you're able to do label-free imaging, let's say to all live cells, then you are able to watch it over a long period of time without destroying it, without affecting its function,” said Gabriel Popescu, a professor of electrical and computer engineering at the University of Illinois and a coauthor of the paper.

His team’s phase imaging with computational specificity, or PICS, pairs a low-light imaging approach with a trained neural network to simulate dyes, visually highlighting nuclei or other desired cell features without disrupting the specimen. According to the authors, the technique could replace some uses of fluorescence microscopy and open up new research opportunities.

PICS employs phase-contrast microscopy, which measures the phase shift of light waves passing through transparent specimens to capture them more clearly. It is label-free and uses little light, but it cannot distinguish chemical points of interest the way fluorescence microscopy can.
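For readers curious about the underlying optics (standard background, not stated in the article): a transparent specimen of thickness d and refractive-index difference Δn relative to its surroundings delays light of wavelength λ by a phase shift

    φ = (2π / λ) · Δn · d

Phase-contrast methods convert that otherwise invisible delay into visible intensity contrast, which is why they can image transparent cells without stains.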

To address this downside, the researchers developed an artificial-intelligence program that compares how stained cells look through fluorescence microscopy and phase-contrast microscopy, learning to spot otherwise invisible features without the dyes. In less than half a day, it could accurately predict and paint the features of unstained cells onto images and videos.
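The core idea is paired-image regression: each phase-contrast frame is registered against its fluorescence counterpart, and a model is fit to predict one from the other. The paper uses a deep convolutional network; the toy sketch below illustrates only the pairing idea with a per-pixel linear model on synthetic arrays (all data and the linear form are illustrative simplifications, not the authors' architecture).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for a registered image pair: a phase-contrast input and
# the fluorescence channel it should predict (e.g. a nuclear stain).
phase = rng.random((64, 64))
fluorescence = 2.0 * phase + 0.5   # hypothetical ground-truth relation

# Fit a per-pixel linear map y = a*x + b by gradient descent -- a
# drastically simplified stand-in for the paper's deep network.
a, b = 0.0, 0.0
lr = 0.5
for _ in range(500):
    pred = a * phase + b
    err = pred - fluorescence
    a -= lr * np.mean(err * phase)
    b -= lr * np.mean(err)

# "Synthetic fluorescence" image inferred from the label-free input.
synthetic = a * phase + b
print(round(a, 2), round(b, 2))  # → 2.0 0.5
```

In the real system, the learned mapping is far richer than a linear fit, but the training signal is the same: chemically stained images serve as ground truth only during training, after which staining is no longer needed.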

“With that, you don't need any UV source, you don't need any chemicals, you don't need any optical filters,” Popescu said. “It's just a GPU that it trains in a few hours, but then the inference generating that synthetic fluorescence takes 60 milliseconds to 100 milliseconds.”

PICS computed the inferred stains faster than the frame rate of the image-capturing system, so they could be viewed in real time, which the researchers described as a step forward in the field of AI-enhanced imaging. Other milestones they identified include its high sensitivity and its ability to measure the growth of a cell’s nucleus and cytoplasm mass over time.
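As a back-of-the-envelope check on the real-time claim, the 60–100 ms per-frame inference time Popescu cites bounds the acquisition frame rate the synthetic staining can keep up with:

```python
# Reported per-frame inference times for the synthetic stain (from the quote).
inference_ms_low, inference_ms_high = 60, 100

# Even at the slower bound, inference keeps pace with acquisition up to:
max_fps = 1000 / inference_ms_high
print(max_fps)  # → 10.0
```

Any camera acquiring at or below that rate leaves the inference step invisible to the user, which is what makes live viewing possible.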

The image-capturing systems used in the study were produced by a label-free quantitative-imaging company Popescu founded in 2009 and currently leads. The company will eventually market the PICS technology as a commercial product, he said. 

“The user will be able to train the neural network for a particular application ... and then eliminate chemical fluorescence from their routine operation,” Popescu said.

He credited the AI and imaging work of other research groups used in PICS’s development, including contributions from Google and the Allen Institute that helped with translating contrasts within images and with training the neural network.

Popescu’s lab has already applied PICS in other scientific research. One study on bull sperm cells used the technique to measure the dry-mass content of thousands of specimens and found that their body ratios affect their success at different stages of artificial reproduction. PICS is also being used in the lab’s ongoing research on neural-network growth in the brain to tell apart axons and dendrites, two neuron protrusions that usually require fluorescence microscopy to distinguish.

Popescu said it could also see wider medical applications, such as “instantaneously” identifying cancerous regions in biopsies or predicting cancer patient outcomes.

“This is an exciting direction for us where we see a lot of potential for many applications, from basic science to cancer research,” Popescu said.

The article, “Phase imaging with computational specificity (PICS) for measuring dry mass changes in sub-cellular compartments,” was published Dec. 7 in Nature Communications. The authors of the study were Mikhail Kandel, Yuchen He, Young Jae Lee, Taylor Hsuan-Yu Chen, Kathryn Michele Sullivan, Onur Aydin, M. Taher Saif, Hyunjoon Kong, Nahil Sobh and Gabriel Popescu, all of the University of Illinois at Urbana-Champaign. The lead author was Mikhail Kandel.