CellPathfinder, High Content Analysis Software

With an intuitive and user-friendly interface, the software enables multidimensional analysis and graphical visualization of large volumes of image data. In addition, its Machine Learning and Deep Learning capabilities significantly enhance object recognition performance, supporting more complex and advanced analyses, such as those of 3D culture systems and live-cell imaging.

A trial version of the software is available for download.

Simple and Intuitive Workflow

  • Built-in ready-made analysis protocols
  • Easy selection even for first-time image analysis users, thanks to clear icon-based navigation
  • Simultaneous display of multiple images enables easy comparison across wells and time points

3D Analysis

  • Three-dimensional analysis of 3D samples such as spheroids
  • Easy generation of high-resolution 3D images from Z-stack data
  • 3D visualization of object recognition results from image analysis
  • Enables analysis of volume and positional relationships along the Z-axis (see the sketch below)
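
As a rough illustration of this kind of volume measurement, the sketch below labels connected objects in an already segmented Z-stack and converts voxel counts to volumes from the XY pixel size and Z-step. This is a generic example using NumPy/SciPy, not CellPathfinder's internal implementation, and the mask and calibration values are hypothetical.

  # Minimal sketch: object volumes and Z positions from a segmented Z-stack.
  import numpy as np
  from scipy import ndimage

  # Binary mask of shape (z, y, x), e.g. obtained by thresholding a Z-stack.
  mask = np.zeros((10, 64, 64), dtype=bool)
  mask[2:8, 20:40, 20:40] = True                      # one toy object

  # Calibration from the acquisition settings (assumed values, in micrometres).
  pixel_size_um = 0.325                               # XY pixel size
  z_step_um = 3.0                                     # Z-step
  voxel_volume_um3 = pixel_size_um ** 2 * z_step_um

  labels, n_objects = ndimage.label(mask)             # connected components in 3D
  index = list(range(1, n_objects + 1))
  voxel_counts = ndimage.sum(mask, labels, index)     # voxels per object
  volumes_um3 = voxel_counts * voxel_volume_um3

  # Centroid Z position of each object, for positional relationships along Z.
  centroids = ndimage.center_of_mass(mask, labels, index)
  for i, (volume, centroid) in enumerate(zip(volumes_um3, centroids), start=1):
      print(f"object {i}: volume ~ {volume:.0f} um^3, Z ~ {centroid[0] * z_step_um:.1f} um")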

AI Capabilities

  • Equipped with Machine Learning and Deep Learning functions
  • Intuitive operation enables shape recognition, cell counting, and cell classification for cells and intracellular organelles in both fluorescence and brightfield images
  • Comprehensive quantification of complex phenotypes with simple operations, such as selecting negative/positive wells and entering compound concentration information

Graph Generation

  • Visualize calculated numerical data in a variety of graph formats
  • Supports bar charts, line graphs, pie charts, scatter plots, heat maps, and histograms
  • Enables calculation of Z′-factor and EC50/IC50 values (see the note below)
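
For reference, the Z′-factor used for assay-quality assessment is presumably the standard screening statistic computed from the means (μ) and standard deviations (σ) of the positive and negative control wells:

  Z′ = 1 − 3(σ_pos + σ_neg) / |μ_pos − μ_neg|

Values close to 1 indicate a wide separation between the controls relative to their variability. EC50/IC50 values are typically obtained by fitting a sigmoidal dose-response curve to the per-well results; a fitting sketch is given under Deep Image Response below.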

Gating Function

  • Classifies cells into populations with similar characteristics
  • Enables evaluation of the number and proportion of each cell population, as well as feature analysis at the population level

Details

Classification (Gating)

Cells can be classified into populations with similar characteristics. This enables evaluation of the number and proportion of each cell population, as well as feature analysis at the population level.
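
Conceptually, gating partitions per-cell measurements with thresholds and then summarizes each resulting population. The snippet below is a generic sketch, not CellPathfinder's internal logic: hypothetical per-cell intensities in two channels are split into four quadrant populations, and the count, proportion, and a feature average are reported for each.

  # Generic gating sketch on hypothetical per-cell measurements.
  import numpy as np
  import pandas as pd

  rng = np.random.default_rng(0)
  cells = pd.DataFrame({
      "mean_488": rng.lognormal(mean=5, sigma=0.6, size=1000),  # e.g. green channel
      "mean_561": rng.lognormal(mean=5, sigma=0.6, size=1000),  # e.g. red channel
      "area":     rng.normal(loc=300, scale=50, size=1000),     # any per-cell feature
  })

  # Quadrant gates defined by two intensity thresholds (assumed values).
  thr_488, thr_561 = 150.0, 150.0
  cells["population"] = np.select(
      [
          (cells.mean_488 >= thr_488) & (cells.mean_561 >= thr_561),
          (cells.mean_488 >= thr_488) & (cells.mean_561 < thr_561),
          (cells.mean_488 < thr_488) & (cells.mean_561 >= thr_561),
      ],
      ["double positive", "488 only", "561 only"],
      default="double negative",
  )

  # Count, proportion, and a population-level feature summary.
  summary = cells.groupby("population").agg(
      count=("population", "size"),
      mean_area=("area", "mean"),
  )
  summary["fraction"] = summary["count"] / len(cells)
  print(summary)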

Cell Cycle Analysis

Visualization of anticancer drug effects on cell cycle progression in both 2D and 3D cultures.

HepG2 cells stably expressing Fucci(CA)5 were imaged at 1-hour intervals for 72 hours. Using Fucci probes, the G1, S, and G2–M phases during time-lapse imaging were identified, and the image analysis software automatically calculated the number and proportion of cells in each cell cycle phase.
h2-3-hGem(1/110) and AzaleaB5-hCdt1(1/100) Cy(–) were excited at 488 nm and 561 nm, respectively.

A. Cell cycle analysis images of Fucci(CA)5 spheres (3D) obtained with CellPathfinder. Three fluorescence colors distinguish the G1, S, and G2–M phases. One Z-slice image from a Z-range of 70 µm acquired at 7.8 µm steps.
B. Cell cycle information indicated by Fucci(CA)5 (NEB: nuclear envelope breakdown / NER: nuclear envelope reformation).
C. Changes in cell cycle after 24-hour treatment with 30 nM etoposide (a topoisomerase II inhibitor) in 2D (top) and 3D (bottom) cultures, showing cell cycle arrest in S or G2 phase.
D. Time-course changes in cell cycle distribution (G1, S, G2–M) under various concentrations of etoposide (Etp) in 2D (left) and 3D (right) cultures. Shaded areas indicate cell death.

Imaging conditions:
Objective lens: 20×
Excitation: 488 nm (h2-3-hGeminin), 561 nm (AzaleaB5-hCdt1 Cy–)
Z-range: 70 µm, Z-step: 7.8 µm (slice analysis, 3D)
Time-lapse: 1-hour intervals for 72 hours

Deep Area Finder: Deep Learning Function

High-accuracy cell recognition is possible even in brightfield images, simply by painting over cells or intracellular organelles in the image. Analyses that were previously difficult, or had to be abandoned because of insufficient accuracy, can now be performed.

Evaluation of Neurite Outgrowth

Analysis of neurite extension without staining

  • CE Bright Field images used
  • Comparison of Deep Learning and Machine Learning results

A. Deep Learning improves recognition accuracy for low-contrast neurites and cell body morphology.
B. Deep Learning improves cell separation accuracy in densely populated regions.

C. Comparison of analysis results between Deep Learning and Machine Learning.

Improved cell segmentation and recognition accuracy with Deep Learning enables extraction of more precise features:

  • Increased number of recognized cells
  • Increased cell area per cell
  • Increased number of neurite branches per cell

Objective lens: 20×
Excitation: Brightfield (CE Bright Field generated during analysis)
Data courtesy of: Daiichi Sankyo RD Novare Co., Ltd. (Mr. Hayata)

Deep Image Response: Deep Learning Function

Complex phenotypes can be comprehensively quantified with simple operations—no cell recognition protocol required. Users only need to select negative and positive wells and input compound concentration information.

Supports EC50 / IC50 calculation
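
As a rough sketch of how an EC50 can be derived from well-level scores (the concentrations, scores, and the choice of a four-parameter logistic model are illustrative assumptions, not the software's documented implementation):

  # Sketch: EC50 from well-level scores via a four-parameter logistic fit.
  import numpy as np
  from scipy.optimize import curve_fit

  def four_pl(x, bottom, top, ec50, hill):
      """Standard sigmoidal dose-response model."""
      return bottom + (top - bottom) / (1.0 + (ec50 / x) ** hill)

  # Hypothetical compound concentrations (in µM) and normalized response scores.
  conc  = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
  score = np.array([0.04, 0.06, 0.12, 0.30, 0.55, 0.80, 0.93, 0.97])

  params, _ = curve_fit(four_pl, conc, score,
                        p0=[score.min(), score.max(), 1.0, 1.0])
  bottom, top, ec50, hill = params
  print(f"EC50 ~ {ec50:.2f} µM, Hill slope ~ {hill:.2f}")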

Neurotoxicity Assessment

Toxicity evaluation without staining
No need for cell recognition or feature selection

  • Evaluation possible by training with only negative and positive wells
  • No specialized image analysis expertise required
  • Enables evaluation of live cells using unstained images

Cells were treated with 1,287 compounds and brightfield images were acquired using the CellVoyager CV7000S. After generating CE Bright Field images, wells with toxin treatment and no test compound were set as negative, and wells without toxin or test compound were set as positive. Deep Image Response was then used to evaluate toxicity rescue effects, and results were compared with ATP-based assays.

A. Negative and positive wells used for training
B. Comparison between ATP levels and Deep Image Response scores
(a) Intermediate ATP level with high Deep Image Response score
(b) Both ATP level and Deep Image Response score are high
(c) High ATP level but low Deep Image Response score

A correlation between ATP levels and Deep Image Response scores was observed. In cases such as (c), differences undetectable by ATP levels alone could be identified by Deep Image Response.
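
A minimal sketch of this kind of comparison, using hypothetical per-well values, is shown below; wells where the two readouts diverge, as in case (c), are the ones worth a closer look.

  # Sketch: comparing ATP readouts with Deep Image Response scores per well.
  import numpy as np
  from scipy.stats import pearsonr

  # Hypothetical per-well values, both normalized to the 0-1 range.
  atp_level = np.array([0.10, 0.45, 0.80, 0.95, 0.30, 0.88, 0.92, 0.15])
  dir_score = np.array([0.12, 0.85, 0.78, 0.97, 0.25, 0.90, 0.35, 0.10])

  r, p = pearsonr(atp_level, dir_score)
  print(f"Pearson r = {r:.2f} (p = {p:.3f})")

  # Wells where the two readouts disagree may reveal effects that the
  # ATP assay alone does not capture (e.g. high ATP but a low image score).
  diverging = np.flatnonzero(np.abs(atp_level - dir_score) > 0.3)
  print("wells to inspect:", diverging)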

Objective lens: 20×
Excitation: Brightfield
Plate: 384-well plate

 

System Configuration

  • Software
  • Dedicated workstation
  • Display

Dedicated workstation specifications (model: Dell Precision)
CPU: Intel® Xeon
Memory: 128 GB
HDD: System (C:) 1 TB, Storage (D:) 4 TB
OS: Microsoft Windows 10 IoT Enterprise (Japanese / English)
GPU: NVIDIA® RTX A400, RTX 4000 Ada
Display: 2560 × 1440, dual monitors

 


Resources

Application Notes
  • Colony Formation
  • Scratch Wound
  • Cytotoxicity
  • Neurite Outgrowth
  • Co-culture Analysis
  • Cell Tracking
  • Cell stages categorized using Fucci: time-lapse imaging of Fucci-expressing HeLa cells was conducted over 48 hours at 1-hour intervals. Gating was performed based on the mean intensities at 488 nm and 561 nm for each cell; the cells were categorized into four stages, and the cell count for each stage was calculated.

Yokogawa Technical Reports
  • We have been developing a prototype of a genomic drug test support system using our CSU confocal scanner. This system administers chemical compounds that serve as potential drug candidates into living cells, which are the most basic components of all living organisms, records the changes in the amount and localization of target molecules inside cells with the CSU confocal scanner and a highly sensitive CCD camera, and processes and quantifies the captured high-resolution image data.

Tutorials
  • Cell tracking: learn how to perform cell tracking with CellPathfinder through the analysis of test images.
  • Ramified structure analysis: a method for analyzing ramified structures with CellPathfinder, applied to the angiogenic function of vascular endothelial cells.
  • Time-lapse analysis of objects with little movement, demonstrated through calcium imaging of iPS cell-derived cardiomyocytes.
  • Neurite outgrowth: observing the change in the number and length of neurites upon nerve growth factor (NGF) stimulation in PC12 cells.
  • Stress fiber analysis: image analysis of collapsing stress fibers, with concentration-dependence curves drawn for quantitative evaluation.
  • Cell cycle identification: identifying the G1 phase, G2/M phase, and other cell cycle stages from intranuclear DNA content.
  • Spheroid analysis: analyzing spheroid diameter and the number of cells (nuclei) within the spheroid.
  • Zebrafish vasculature: tiling images of zebrafish with EGFP-labeled blood vessels and recognizing blood vessels within an arbitrary region.
  • NFκB translocation: measuring intranuclear and intracytoplasmic NFκB, calculating their ratio, and creating a dose-response curve.

Videos

YOKOGAWA will contribute to technology evolution, particularly in measurement and analytical tools, to help build a world where researchers can increasingly focus on insightful interpretation of data, advancing Life Science to benefit humanity.

