Presentation Details
Poster
Brain-Machine / Computer Interface
BMI/BCI
| Date | 2014/9/12 |
| --- | --- |
| Time | 14:00 - 15:00 |
| Venue | Poster / Exhibition (Event Hall B) |
Toward brain-machine interface using neural activity in the visual cortex for dexterous control of a prosthetic hand
- P2-372
- Ryusuke Hayashi (1), Satoshi Saga (2)
- 1: System Neuroscience Group, AIST, Tsukuba, Japan
- 2: Div of Info Eng, Univ of Tsukuba, Tsukuba, Japan
Introduction: Advances in brain-machine interfaces (BMIs) have enabled direct control of prosthetic devices from motor cortex activity, replicating dexterous finger movements. Previously reported BMIs, however, are of limited use for patients whose paralysis results from damage to the motor cortex itself. Since we can visually imagine intended hand movements, and since many neurons in the inferior temporal (IT) cortex are activated by images of body parts, including the hand, dexterous control of a prosthetic hand may be achievable by decoding neural ensemble activity in the IT cortex.

Method: To test this hypothesis, we chronically implanted three multi-electrode arrays in the IT cortex of a macaque monkey and recorded multi-unit activity from 190 electrodes while the animal viewed images of three hand signs (rock, scissors, and paper). Spike counts in 100 ms time windows were processed offline with linear discriminant analysis to classify what the animal was viewing at each moment (one of the three hand-sign images or a blank screen). We then controlled a five-finger movable robotic hand (Handroid, ITK Co., Ltd.) according to the predicted hand shape via a motor driver consisting of an mbed microcontroller and Matlab software. A cover made of RTV silicone rubber (durometer below 10 Shore A) that resembles a natural hand in appearance (Satoh Giken Co.) was attached to the robotic hand, in anticipation of future experiments in which the animal controls the hand in real time while viewing it, so that the hand evokes natural activity in the IT cortex.

Results: Classification accuracy exceeded 80% on average. Hand shapes were decoded most accurately (88.8%) at a latency of 400 ms after visual stimulus onset. Analysis of data randomly sampled across different recording days shows that average classification accuracy exceeds 90% when the data contain signals from more than 350 channels. These results suggest that visual information decoded from IT neural activity is useful for dexterous control of the multiple digits of a prosthetic hand.
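As a rough illustration of the decoding step in the Method, the following minimal Python sketch applies scikit-learn's LinearDiscriminantAnalysis to spike-count feature vectors. The synthetic Poisson data, trial counts, and cross-validation scheme are assumptions for illustration only; the authors processed their recordings offline (with Matlab software), not with this code.

```python
# Minimal sketch (not the authors' code) of the offline decoding step:
# linear discriminant analysis on 100 ms spike-count windows, classifying
# which of four stimuli (rock, scissors, paper, or blank) was on display.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_channels = 400, 190        # 190 electrodes, as in the abstract
# X[i, c] stands in for the spike count on channel c in a 100 ms window;
# here it is synthetic Poisson noise, so accuracy stays near chance.
X = rng.poisson(lam=5.0, size=(n_trials, n_channels)).astype(float)
y = rng.integers(0, 4, size=n_trials)  # 0-3: rock, scissors, paper, blank

clf = LinearDiscriminantAnalysis()     # one linear boundary per class pair
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.3f}")
```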
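The abstract states only that the predicted hand shape drove the Handroid through a motor driver built from an mbed microcontroller and Matlab software. The sketch below shows one plausible shape of that control path, mapping each decoded class to a command byte sent over a serial link with pyserial; the port name, baud rate, and one-byte protocol are invented for illustration.

```python
# Hypothetical control-path sketch: send the decoded hand shape to the
# motor-driver microcontroller over a serial link. The command protocol
# below is assumed, not taken from the abstract.
import serial  # pyserial

COMMANDS = {0: b"R", 1: b"S", 2: b"P", 3: b"N"}  # rock, scissors, paper, neutral

def send_hand_shape(port: serial.Serial, predicted_class: int) -> None:
    """Write the command byte for the decoded hand shape to the mbed."""
    port.write(COMMANDS[predicted_class])

if __name__ == "__main__":
    # "/dev/ttyACM0" is a typical device node for an mbed on Linux (assumed).
    with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as mbed:
        send_hand_shape(mbed, 2)  # e.g. the decoder predicted "paper"
```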
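The channel-count analysis in the Results (accuracy versus number of channels, estimated from data randomly sampled across recording days) can be sketched as follows. The pooled channel count and the class-dependent firing rates are synthetic assumptions, so the printed numbers will not reproduce the reported accuracies; the sketch only shows the subsampling procedure.

```python
# Sketch of the channel-count analysis: randomly subsample channels from an
# array pooled across recording days and measure how decoding accuracy grows
# with the number of channels included.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials, n_pooled = 600, 500          # pooled channel count is illustrative
y = rng.integers(0, 4, size=n_trials)

# Give each class slightly different mean rates so accuracy rises with channels.
rates = 5.0 + 0.2 * rng.standard_normal((4, n_pooled))
X = rng.poisson(lam=np.clip(rates[y], 0.1, None)).astype(float)

for n_ch in (50, 150, 350):
    idx = rng.choice(n_pooled, size=n_ch, replace=False)
    acc = cross_val_score(LinearDiscriminantAnalysis(), X[:, idx], y, cv=5).mean()
    print(f"{n_ch:3d} channels: mean accuracy = {acc:.3f}")
```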