Large-scale foundation models and generative AI for BigData neuroscience

Neurosci Res. 2025 Jun:215:3-14. doi: 10.1016/j.neures.2024.06.003. Epub 2024 Jun 17.

Abstract

Recent advances in machine learning have led to revolutionary breakthroughs in computer games, image and natural language understanding, and scientific discovery. Foundation models and large language models (LLMs) have recently achieved human-like performance on many tasks, thanks to BigData. With the help of self-supervised learning (SSL) and transfer learning, these models may reshape the landscape of neuroscience research and make a significant impact on the field's future. Here we present a mini-review of recent advances in foundation models and generative AI models as well as their applications in neuroscience, including natural language and speech, semantic memory, brain-machine interfaces (BMIs), and data augmentation. We argue that this paradigm-shifting framework will open new avenues for many neuroscience research directions, and we discuss the accompanying challenges and opportunities.
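To make the self-supervised learning (SSL) idea mentioned above concrete, the following is a minimal illustrative sketch (not from the reviewed article): a toy masked-prediction pretext task on a synthetic "neural" time series, where a model learns to reconstruct hidden samples from their surrounding context without any labels. A closed-form ridge regression stands in for the large transformer a real foundation model would use; the data are entirely simulated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "neural" signal: a noisy sum of slow oscillations (purely illustrative).
t = np.arange(2000)
signal = np.sin(0.02 * t) + 0.5 * np.sin(0.05 * t) + 0.1 * rng.standard_normal(t.size)

# SSL pretext task: predict a masked sample from its k neighbours on each side.
k = 5
X, y = [], []
for i in range(k, signal.size - k):
    context = np.concatenate([signal[i - k:i], signal[i + 1:i + k + 1]])
    X.append(context)
    y.append(signal[i])
X, y = np.asarray(X), np.asarray(y)

# Closed-form ridge regression as the predictor (a stand-in for a transformer).
lam = 1e-3
w = np.linalg.solve(X.T @ X + lam * np.eye(2 * k), X.T @ y)

# Reconstruction error of the learned predictor vs. a trivial always-zero baseline.
mse_model = np.mean((X @ w - y) ** 2)
mse_baseline = np.mean(y ** 2)
print(mse_model < mse_baseline)
```

The point of the sketch is that the supervisory signal comes from the data itself (the masked samples), which is what lets foundation models be pretrained on massive unlabeled corpora before being transferred to downstream tasks such as BMI decoding.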

Keywords: BigData; Brain-machine interface; Embedding; Foundation model; Generative AI; Representation learning; Self-supervised learning; Transfer learning; Transformer.

Publication types

  • Review

MeSH terms

  • Artificial Intelligence*
  • Brain-Computer Interfaces
  • Humans
  • Machine Learning*
  • Neurosciences* / methods