A study of autoencoders as a feature extraction technique for spike sorting

PLoS One. 2023 Mar 9;18(3):e0282810. doi: 10.1371/journal.pone.0282810. eCollection 2023.

Abstract

Spike sorting is the process of grouping the spikes of distinct neurons into their respective clusters. Most frequently, this grouping is performed by relying on the similarity of features extracted from spike shapes. In spite of recent developments, current methods have yet to achieve satisfactory performance, and many investigators favour sorting manually, even though it is an intensive undertaking that requires substantial amounts of time. To automate the process, a diverse array of machine learning techniques has been applied. The performance of these techniques, however, depends critically on the feature extraction step. Here, we propose deep learning using autoencoders as a feature extraction method and extensively evaluate the performance of multiple designs. The models presented are evaluated on publicly available synthetic and real "in vivo" datasets with various numbers of clusters. The proposed methods achieve higher performance for spike sorting when compared to other state-of-the-art techniques.
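As an illustrative sketch of the general idea (not the authors' published code), the pipeline can be reduced to: train an autoencoder to reconstruct spike waveforms, then use the low-dimensional bottleneck activations as features for clustering. The minimal NumPy example below uses two hypothetical spike templates plus noise as stand-ins for recorded waveforms, and a single-hidden-layer autoencoder (64 samples → 2 features → 64 samples) trained by gradient descent on reconstruction error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "spikes": two waveform templates plus noise, standing in for
# real extracellular recordings (64 samples per waveform, 100 of each class).
t = np.linspace(0, 1, 64)
template_a = np.exp(-((t - 0.3) ** 2) / 0.005) - 0.4 * np.exp(-((t - 0.5) ** 2) / 0.01)
template_b = 0.7 * np.exp(-((t - 0.4) ** 2) / 0.003) - 0.8 * np.exp(-((t - 0.6) ** 2) / 0.02)
spikes = np.vstack([template_a + 0.05 * rng.standard_normal((100, 64)),
                    template_b + 0.05 * rng.standard_normal((100, 64))])

# Single-hidden-layer autoencoder: 64 -> 2 -> 64, trained with full-batch
# gradient descent on mean squared reconstruction error.
n_in, n_code = spikes.shape[1], 2
W1 = 0.1 * rng.standard_normal((n_in, n_code)); b1 = np.zeros(n_code)
W2 = 0.1 * rng.standard_normal((n_code, n_in)); b2 = np.zeros(n_in)
lr = 0.05
for _ in range(2000):
    h = np.tanh(spikes @ W1 + b1)      # encoder: bottleneck activations
    out = h @ W2 + b2                  # linear decoder: reconstruction
    err = out - spikes                 # reconstruction error
    # Backpropagation through decoder and encoder
    gW2 = h.T @ err / len(spikes); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)   # tanh derivative
    gW1 = spikes.T @ dh / len(spikes); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

# The 2-D bottleneck activations are the extracted features; a clustering
# algorithm (e.g. k-means or GMM) would then group them into putative neurons.
features = np.tanh(spikes @ W1 + b1)
print(features.shape)  # (200, 2)
```

In practice, the paper evaluates much deeper architectures on standard benchmark datasets; this sketch only shows why the bottleneck serves as a learned replacement for hand-crafted features such as principal components.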

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Action Potentials / physiology
  • Algorithms*
  • Machine Learning*
  • Neurons / physiology
  • Signal Processing, Computer-Assisted

Grants and funding

The research leading to these results has received funding from: NO (Norway) Grants 2014-2021, under Project contract number 20/2020 (RO-NO-2019-0504); four grants from the Romanian National Authority for Scientific Research and Innovation, CNCS-UEFISCDI (codes PN-III-P2-2.1-PED-2019-0277, PN-III-P3-3.6-H2020-2020-0109, ERA-NET-FLAG-ERA-ModelDXConsciousness, and ERANET-NEURON-Unscrambly); and an H2020 grant funded by the European Commission (grant agreement 952096, NEUROTWIN). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.