Coupled advances in live cell imaging and molecular labeling have transformed our ability to visualize cellular dynamics. Yet accurately analyzing data collected in vivo can be complicated by the hidden constraints that tissue structures—the blank spaces of microscopy images—impose on cellular movements. For instance, lymphocytes are diverted along winding, branched blood vessel highways rather than moving directly between locations.
To enable more precise analyses of cellular trajectories, Liepe and colleagues have now developed two computational tools that faithfully map three-dimensional (3D) imaging data onto appropriate 2D representations. The first, “unwrapping,” is designed for instances in which prior knowledge suggests that a cell is likely moving along a flat or gently curved surface, such as the cornea or retina. The second, “Riemannian manifold learning,” is appropriate for more complex topologies, such as the undulating villi of the snaking small intestine. In recovering the underlying targets and directionality of random walks simulated over a variety of surfaces, both tools outperform analyses based on XY-projections or straight-line 3D paths. Applying their methods to in vivo imaging data, the authors then correctly recover the previously confirmed observations that random hemocyte movements in Drosophila lack bias or directionality, and that neutrophil trafficking in response to wounding in zebrafish is chemotactic.
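The advantage of analyzing tracks in surface coordinates can be illustrated with a minimal sketch (not the authors' code; the cylindrical vessel, step sizes, and helper function are all assumptions for illustration). A cell crawling on a curved vessel wall covers more ground than a naive XY-projection of its track suggests, because the projection compresses the circumferential component of its motion; “unwrapping” the surface into a flat chart preserves distances travelled along the wall.

```python
import numpy as np

# Hypothetical example: a cell performing a random walk on the wall of a
# cylindrical vessel of radius r, with angular steps and a slow axial drift.
rng = np.random.default_rng(0)
r = 5.0
theta = np.cumsum(rng.normal(0.0, 0.1, 200))   # angular position over time
z = np.cumsum(rng.normal(0.05, 0.1, 200))      # axial position over time

# Raw 3D coordinates of the track on the cylinder wall.
x, y = r * np.cos(theta), r * np.sin(theta)

# "Unwrapping": map the cylinder to a flat 2D chart (arc length, axial
# position), which preserves distances measured along the surface.
u, v = r * theta, z

def path_length(*coords):
    """Total length of a polyline given per-axis coordinate arrays."""
    steps = np.diff(np.stack(coords, axis=1), axis=0)
    return np.linalg.norm(steps, axis=1).sum()

len_unwrapped = path_length(u, v)   # distance travelled along the surface
len_xy_proj = path_length(x, y)     # naive XY-projection drops axial motion
print(f"unwrapped: {len_unwrapped:.1f}, XY-projection: {len_xy_proj:.1f}")
```

The unwrapped length always exceeds the projected one here: the projection discards axial movement entirely and replaces circumferential arcs with shorter chords, which is exactly the kind of bias the authors' methods are designed to remove.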
By making their software publicly available, Liepe et al. are now poised to help others accurately characterize cellular motions through complex tissues. This could prove especially useful when examining preclinical models for the activity of CAR-T cells in tumors or tracking the invasiveness of native lymphocytes after immunotherapy.
J. Liepe et al., Accurate reconstruction of cell and particle tracks from 3D live imaging data, Cell Syst. 3, 102–107 (2016).
Copyright © 2016, American Association for the Advancement of Science