Linear feature projection-based real-time decoding of limb state from dorsal root ganglion recordings

Sungmin Han, Jun Uk Chu, Jong Woong Park, Inchan Youn

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)


Proprioceptive afferent activity recorded with a multichannel microelectrode has been used to decode limb movements and provide sensory feedback signals for closed-loop control in functional electrical stimulation (FES) systems. However, the high dimensionality of such neural activity is a major challenge for real-time applications. This paper proposes a linear feature projection method for real-time decoding of ankle and knee joint angles. Single-unit activity was extracted as a feature vector from proprioceptive afferent signals recorded from the L7 dorsal root ganglion during passive movements of the ankle and knee joints. The dimensionality of this feature vector was then reduced by a linear feature projection combining projection pursuit and negentropy maximization (PP/NEM). Finally, a time-delayed Kalman filter was used to estimate the ankle and knee joint angles. The PP/NEM approach yielded better decoding performance than other feature projection methods, and all processing was completed within real-time constraints. These results suggest that the proposed method could provide real-time feedback signals in closed-loop FES systems.
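
To make the pipeline in the abstract concrete, the sketch below shows one way such a decoder could be assembled: firing-rate features are projected onto a few directions found by negentropy-maximizing projection pursuit, and a Kalman filter maps the projected features to joint angles. This is an illustrative reconstruction, not the authors' implementation; the negentropy maximization is approximated with a FastICA-style fixed-point iteration using the log-cosh contrast, the decoder is a standard rather than time-delayed Kalman filter, and the function names, dimensions, and model-fitting choices are assumptions.

```python
# Minimal sketch (assumed, not the paper's code) of a PP/NEM + Kalman decoding pipeline.
import numpy as np

def pp_negentropy(X, n_dirs=3, n_iter=200):
    """Find projection directions by fixed-point negentropy maximization
    (FastICA-style contrast G(u) = log cosh u) on centered, whitened features X (T x D)."""
    X = X - X.mean(axis=0)
    # Whiten via eigendecomposition of the covariance matrix.
    cov = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    W_white = vecs @ np.diag(1.0 / np.sqrt(np.maximum(vals, 1e-12))) @ vecs.T
    Z = X @ W_white
    W = np.random.default_rng(0).standard_normal((n_dirs, Z.shape[1]))
    for _ in range(n_iter):
        Y = Z @ W.T                                   # current projections (T x n_dirs)
        g, dg = np.tanh(Y), 1.0 - np.tanh(Y) ** 2
        W = (Z.T @ g / len(Z)).T - dg.mean(axis=0)[:, None] * W
        # Symmetric decorrelation keeps the projection directions orthonormal.
        U, _, Vt = np.linalg.svd(W, full_matrices=False)
        W = U @ Vt
    return W @ W_white                                # projection from centered feature space

def kalman_decode(Y, theta_train, Y_train):
    """Estimate joint angles from projected features Y with a linear Kalman filter
    whose state and observation models are fit by least squares on training data."""
    A = np.linalg.lstsq(theta_train[:-1], theta_train[1:], rcond=None)[0].T   # angle dynamics
    H = np.linalg.lstsq(theta_train, Y_train, rcond=None)[0].T                # feature model
    Q = np.cov(theta_train[1:] - theta_train[:-1] @ A.T, rowvar=False)        # process noise
    R = np.cov(Y_train - theta_train @ H.T, rowvar=False)                     # observation noise
    n = A.shape[0]
    x, P = theta_train[0].copy(), np.eye(n)
    estimates = []
    for y in Y:
        x, P = A @ x, A @ P @ A.T + Q                 # predict
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
        x = x + K @ (y - H @ x)                       # update with projected features
        P = (np.eye(n) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)
```

In use, the projection matrix would be fit on training firing-rate features (e.g. `W = pp_negentropy(X_train)`), the state and observation models estimated from the recorded ankle and knee angles, and `kalman_decode` applied sample by sample to the projected test features.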

Original language: English
Pages (from-to): 77-90
Number of pages: 14
Journal: Journal of Computational Neuroscience
Issue number: 1
Publication status: Published - 2019 Feb 15


Keywords

  • Kalman filter
  • Linear feature projection
  • Negentropy maximization
  • Projection pursuit
  • Proprioceptive afferent

ASJC Scopus subject areas

  • Sensory Systems
  • Cognitive Neuroscience
  • Cellular and Molecular Neuroscience


