End-to-end solution for connectomic reconstruction

Reconstructing a connectome from volume electron microscopy data is a complex task. With over 10 years of experience in connectomics, our team leverages advanced machine learning techniques and cutting-edge technology to deliver precise and reliable results. Learn more on this page.

"It used to take us at least 3 to 6 months to align and segment an EM dataset. [...] With Voxelytics, a machine learning pipeline developed by scalable minds, we were able to generate a segmentation from an aligned dataset in a matter of days. This accelerated workflow enabled us to focus on the biological analysis much sooner than previously possible." Read more

Sahil Loomba, Post-doc research scientist at Max Planck Institute for Brain Research

"WEBKNOSSOS supports almost all our connectomics work in the retina. It's the best web-based tool out there for integrated team-based annotation, proofreading, display, and output of serial EM data. The toolset is diverse, powerful, and intuitive. The functionality and ease of use just keep getting better. It has been a pleasure to work with the responsive and science-friendly folks at scalable minds."
David Berson, Professor at Brown University

Our process

1. Alignment


Good alignment and registration are crucial for accurate data interpretation. Read more about our alignment process.

2. Neuron segmentation


We run our ML models, evaluate the results, and iterate on the parameters until the segmentation quality is satisfactory.

3. Synapse detection


As with neuron segmentation, we use ML models to detect synapses. Model performance is evaluated through recall metrics.
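
To illustrate what such an evaluation involves, here is a minimal sketch of computing detection recall by matching predicted synapse positions to ground-truth annotations within a distance tolerance (the function name and the 300 nm tolerance are illustrative placeholders, not our production code):

```python
# Illustrative sketch only (not our production pipeline): computing synapse
# detection recall by matching predicted synapse positions to ground-truth
# annotations within a distance tolerance. The 300 nm tolerance is a placeholder.
import numpy as np
from scipy.spatial import cKDTree

def synapse_recall(gt_positions_nm: np.ndarray,
                   pred_positions_nm: np.ndarray,
                   tolerance_nm: float = 300.0) -> float:
    """Fraction of ground-truth synapses with a predicted synapse within tolerance."""
    if len(gt_positions_nm) == 0:
        return 1.0
    if len(pred_positions_nm) == 0:
        return 0.0
    tree = cKDTree(pred_positions_nm)
    distances, _ = tree.query(gt_positions_nm, k=1)
    return float(np.mean(distances <= tolerance_nm))

# Toy example with coordinates in nanometers
gt = np.array([[100.0, 200.0, 50.0], [1200.0, 900.0, 400.0]])
pred = np.array([[120.0, 210.0, 60.0], [5000.0, 5000.0, 5000.0]])
print(f"Recall: {synapse_recall(gt, pred):.2f}")  # Recall: 0.50
```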

4. Connectome & Results


Finally, we generate the connectome. We will provide various deliverables, such as an interactive connectivity map, statistics, and 3D meshes.
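
To give an idea of what a connectivity-map deliverable can look like programmatically, here is a minimal sketch that aggregates a per-synapse table into a neuron-to-neuron connectivity matrix (the column names are hypothetical; actual deliverable formats are agreed upon per project):

```python
# Illustrative sketch only: aggregating a per-synapse table into a
# neuron-to-neuron connectivity matrix. The column names (pre_segment_id,
# post_segment_id) are hypothetical placeholders.
import pandas as pd

synapses = pd.DataFrame({
    "pre_segment_id":  [17, 17, 42, 42, 42],
    "post_segment_id": [42,  8, 17,  8,  8],
})

# Count synapses per (presynaptic, postsynaptic) segment pair and pivot into a matrix
connectome = (
    synapses.groupby(["pre_segment_id", "post_segment_id"])
            .size()
            .unstack(fill_value=0)
)
print(connectome)
```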

Case study

Neuron reconstruction of human cortex from volume EM. Raw data by Loomba et al. (Science 2022). Segmentation, connectome, and animation by scalable minds.

Loomba et al. submitted a comparative study of neural structures in eight mouse, macaque, and human datasets (SBEM, each 0.5-2 TB).
This required a reliable, repeatable, and highly scalable solution for connectome reconstruction across three species with significant differences in cellular morphology.
We rolled out our ML pipeline for automated image alignment, neuron segmentation, and connectome reconstruction. The workflows worked well out of the box for the mouse dataset. To improve the reconstruction quality for the macaque and human tissue, we interactively re-trained the segmentation models with data labeled in WEBKNOSSOS. Using the integrated evaluation methods, we selected the best-performing configuration and rolled it out to the remaining datasets.

All computation was executed in a highly parallelized fashion on an HPC cluster, and the results were instantly available in WEBKNOSSOS for inspection. Our declarative analysis approach, repeatable workflows, and extensible task architecture allowed us to quickly iterate on the eight datasets and derive insights in a matter of days.

How it works


1. Book an intro call

Discuss your research goals and data characteristics with us. Define the analysis tasks and arrange data access on WEBKNOSSOS.



2. Receive a free segmentation sample for your data

Once we have access to your data, we will perform a segmentation on a subsample (typically 1 GVx). Based on this, we can discuss the next steps and evaluate the need for retraining.


3. Optional retraining for your data

We have a large selection of pre-trained models for various types of EM images. However, sometimes models need to be retrained for particular image characteristics. In that case, our annotators can generate the required ground truth, and we will train custom models for optimal results.


4. Automated processing

We roll out our machine learning pipeline on your data. The processing pipeline includes stack alignment, neuron segmentation, neurite type detection, nuclei/somata/blood vessel classification, synapse detection, and connectome assembly.
 


5. Polish your results in WEBKNOSSOS

Visualize and evaluate the results in WEBKNOSSOS. Use the advanced proofreading tools in WEBKNOSSOS to correct any remaining errors on the objects you care about. Benefit from the collaboration features to speed up this process.


6. Work on your scientific analysis

Explore the results in WEBKNOSSOS and use the available Python libraries for scientific analysis. Of course, you can download the data at any time!
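
For example, results hosted on WEBKNOSSOS can be accessed programmatically. A minimal sketch, assuming the open-source webknossos Python package (the dataset URL and layer name are placeholders, and parameter names may differ slightly between package versions):

```python
# Minimal sketch, assuming the open-source `webknossos` Python package
# (pip install webknossos). The dataset URL and layer name are placeholders,
# and parameter names may differ slightly between package versions.
import webknossos as wk

# Open a dataset that was shared with you on WEBKNOSSOS
# (private datasets may additionally require an authentication token).
dataset = wk.Dataset.open_remote(
    "https://webknossos.org/datasets/your_organization/your_dataset"  # placeholder URL
)

# Read a small cutout of the segmentation layer into a numpy array
segmentation_layer = dataset.get_layer("segmentation")  # placeholder layer name
mag_view = segmentation_layer.get_mag(1)
cutout = mag_view.read(absolute_offset=(0, 0, 0), size=(512, 512, 64))

print(cutout.shape)  # (channels, x, y, z)
```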

Pricing and duration

We offer fair prices and fast delivery times. Our goal is to make science accessible to labs of all sizes, and we understand the importance of meeting publication deadlines.
Every project includes:
● Alignment
● Neuron reconstruction
● Synapse detection
● Connectome generation

Custom deliverables can also be added (additional costs will apply), such as:
● PSD area
● spine head volume
● neuron type detection
● and more

Standard project


Dataset size: 500 GB - 5 TB

    Price: 30k - 80k € (+ VAT)


Duration: 2 - 6 months

Custom project


Dataset size: 5 TB - 2 PB

    Price: Custom
Duration: Custom

Some examples


Dense neuron segmentation of mouse layer 4 somatosensory cortex 

Full dense neuron instance segmentation using modified U-Nets and hierarchical agglomeration. Read more in our blog post.


Synapse, vesicle, and mitochondria detection in cortex

CNN-based segmentation of all synapses, vesicles, and mitochondria in preparation for synaptic connectivity mapping.


Axon and dendrite classification
  

Semantic segmentation of neuron subtypes (axon, dendrite, glia, etc.) integrated into the agglomeration to prevent merge errors, based on prior biological knowledge.

Support your publications with rich visuals

Neuron reconstruction made by scalable minds with Voxelytics and WEBKNOSSOS. Raw SBEM data by Motta et al. (Science 2019).