Committee 
Date Time 
Place 
Paper Title / Authors 
Abstract 
Paper # 
SCE 
2023-01-20 14:10
Tokyo 
Kikai-Shinko-Kaikan Bldg. (Primary: Onsite, Secondary: Online) 
Introduction of a fluctuation mechanism of the oscillation frequency of the oscillator-based random number generator using Josephson oscillation Takeshi Onomi (Fukuoka Inst. Tech.) SCE2022-15 
An oscillator-based true random number generator using superconducting single flux quantum circuits and Josephson oscill... [more] 
SCE2022-15 pp.12-16 
SCE 
2015-08-05 10:25
Kanagawa 
Yokohama National Univ. 
Demonstration of a relaxation oscillator based on a superconducting Schmitt trigger inverter Takeshi Onomi (Fukuoka Inst. Tech.) SCE2015-17 
A new relaxation oscillator using a superconducting Schmitt trigger inverter is proposed and tested. The superconducting... [more] 
SCE2015-17 pp.53-57 
MBE, NC (Joint) 
2014-11-21 11:00
Miyagi 
Tohoku University 
A Comparison of Back Propagation Learning between the Inverse-function Delayless Model and a Conventional Model Yuta Horiuchi (Tohoku Univ.), Yoshihiro Hayakawa (SNCT), Takeshi Onomi, Koji Nakajima (Tohoku Univ.) NC2014-26 
For the combinatorial optimization problem using the Hopfield model, avoidance of the local minimum problem is important... [more] 
NC2014-26 pp.7-10 
MBE, NC (Joint) 
2014-11-21 11:25
Miyagi 
Tohoku University 
The Relation between Dispersion of Initial Values and Pre-training of Deep Neural Networks Seitaro Shinagawa (Tohoku Univ.), Yoshihiro Hayakawa (SNCT), Takeshi Onomi, Koji Nakajima (Tohoku Univ.) NC2014-27 
[more] 
NC2014-27 pp.11-14 
SCE 
2014-07-23 11:35
Tokyo 
Kikai-Shinko-Kaikan Bldg. 
Superconducting Schmitt trigger inverter and its application Takeshi Onomi (Tohoku Univ.) SCE2014-28 
A new superconducting Schmitt trigger inverter and a relaxation oscillator are proposed. The proposed superconducting Sc... [more] 
SCE2014-28 pp.25-29 
NLP 
2014-06-30 16:00
Miyagi 
Tohoku Univ. 
Backpropagation learning using inverse function delayless model Yuta Horiuchi (Tohoku Univ.), Yoshihiro Hayakawa (SNCT), Takeshi Onomi, Koji Nakajima (Tohoku Univ.) NLP2014-25 
The Inverse function Delayed (ID) model has been proposed as a novel neural model. The ID model has an oscillation capa... [more] 
NLP2014-25 pp.27-30 
NLP 
2014-06-30 16:25
Miyagi 
Tohoku Univ. 
Study on the hardware of the Bidirectional Associative Memories by using the Inverse Function Delayless model Chunyu Bao, Takeshi Onomi, Yoshihiro Hayakawa, Shigeo Sato, Koji Nakajima (Tohoku Univ.) NLP2014-26 
In conventional macro models such as the Hopfield model, the problems that are caused by the solution of the network not... [more] 
NLP2014-26 pp.31-36 
NLP 
2014-07-01 10:00
Miyagi 
Tohoku Univ. 
Learning Restricted Boltzmann Machine with discrete learning parameter Seitaro Shinagawa (Tohoku Univ.), Yoshihiro Hayakawa (SNCT), Shigeo Sato, Takeshi Onomi, Koji Nakajima (Tohoku Univ.) NLP2014-27 
Recently, Deep Neural Networks (DNNs) with hierarchical learning have shown remarkable performance in solv... [more] 
NLP2014-27 pp.37-40 
SCE 
2014-01-24 13:15
Tokyo 
Kikai-Shinko-Kaikan Bldg. 
Analysis of rf-SQUID ladder circuits with a single flux quantum signal for the transmission direction Yuya Tsuji, Takeshi Onomi, Koji Nakajima (Tohoku Univ.) SCE2013-51 
Although the SFQ circuit technique is very advantageous in terms of power consumption, the circuit system of a semiconducto... [more] 
SCE2013-51 pp.97-100 
SCE 
2014-01-24 13:40
Tokyo 
Kikai-Shinko-Kaikan Bldg. 
Comparison of the final addition circuit in SFQ parallel multiplier with a tree structure partial product adder circuit Akifumi Yamada, Takeshi Onomi, Koji Nakajima (Tohoku Univ.) SCE2013-52 
A single flux quantum (SFQ) circuit is capable of high-speed operation at a few tens of GHz, and it has a big advantage compa... [more] 
SCE2013-52 pp.101-104 
NLP 
2010-03-10 10:00
Tokyo 

Neural Networks and the Application to the 4-Queen Problem Yusuke Maenami, Takeshi Onomi (Tohoku Univ.), Yoshihiro Hayakawa (Sendai Nat. Coll. of Tech.), Shigeo Sato, Koji Nakajima (Tohoku Univ.) NLP2009-172 
A combinatorial optimization problem is generally NP-hard or NP-complete. When the problem size becomes large, it is... [more] 
NLP2009-172 pp.81-85 