Presentation Abstract / Keywords |
Presentation Title |
2014-03-14 09:30
Study of Recognizing Hand Actions from Video Sequences during Suture Surgeries Based on Temporally-Sectioned SIFT and Sliding Window Based Neural Networks ○Ye Li・Jun Ohya(Waseda Univ.)・Toshio Chiba(NCCHD)・Rong Xu(Waseda Univ.)・Hiromasa Yamashita(NCCHD) PRMU2013-193 |
Abstract |
(Japanese) |
(Not yet registered) |
(English) |
Towards the realization of a robotic nurse that can autonomously support surgeries by recognizing surgical situations from video information alone, this paper proposes an improved method, based on sectioned SIFT and a sliding-window neural network, for recognizing two of the surgeon's hand actions: suturing and tying. The hand area is detected using color information, and the video sequence is then partitioned into temporal sections. Sectioned-SIFT descriptors are computed in each section and used to build a word vocabulary. The histogram feature of an action is formed by concatenating the word frequencies of each section. Finally, a sliding window and a neural network are used to recognize the significant actions, suturing and tying. The proposed method achieved a 100% recognition rate for manually extracted actions and a 90% recognition rate for whole surgery video sequences. |
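The feature construction the abstract describes, per-section visual-word histograms concatenated into a single action feature, could be sketched roughly as below. The function name `bow_histogram`, the per-section normalization, and all variable names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def bow_histogram(sections, codebook):
    """Build a sectioned bag-of-words feature (hypothetical helper).

    sections: list of (n_i, d) arrays of descriptors (e.g. SIFT),
              one array per temporal section of the video
    codebook: (k, d) visual-word vocabulary (e.g. from k-means)
    returns:  length k * len(sections) feature vector, the
              concatenation of per-section word-frequency histograms
    """
    k = codebook.shape[0]
    feats = []
    for desc in sections:
        # assign each descriptor to its nearest visual word
        d2 = ((desc[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        words = d2.argmin(axis=1)
        hist = np.bincount(words, minlength=k).astype(float)
        if hist.sum() > 0:
            hist /= hist.sum()  # normalize frequencies within the section
        feats.append(hist)
    return np.concatenate(feats)
```

The concatenated vector would then be fed to a classifier (the paper uses a BP neural network) over a sliding window to locate the suturing and tying actions in the full sequence.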
Keywords |
(Japanese) |
/ / / / / / / |
(English) |
action recognition / sectioned-SIFT / BP neural network / RSN |
Bibliographic Information |
IEICE Technical Report, vol. 113, no. 493, PRMU2013-193, pp. 151-156, March 2014. |
Report Number |
PRMU2013-193 |
Date of Issue |
2014-03-06 (PRMU) |
ISSN |
Print edition: ISSN 0913-5685 / Online edition: ISSN 2432-6380 |
Copyright |
The copyright of papers published in the Technical Report belongs to the Institute of Electronics, Information and Communication Engineers (IEICE). (Permission numbers: 10GA0019/12GB0052/13GB0056/17GB0034/18GB0034) |
PDF Download |
PRMU2013-193 |