Print edition: ISSN 0913-5685 Online edition: ISSN 2432-6380
NLP2007-141
A Lossless Steganography Technique for Speech Codec Based on Nonlinear Quantization
Naofumi Aoki (Hokkaido Univ.)
pp. 1 - 4
NLP2007-142
An automatic music composition system by using chaotic signal
Ayumi Kodaira, Kenya Jin'no (Kanto Gakuin Univ.)
pp. 5 - 8
NLP2007-143
Neural Gas Containing Two Kinds of Neurons and its Behaviors
Keiko Kanda, Haruna Matsushita, Yoshifumi Nishio (Tokushima Univ.)
pp. 9 - 12
NLP2007-144
Complex-Valued Multistate Associative Memory with Nonlinear Multilevel Function
Gouhei Tanaka, Kazuyuki Aihara (Tokyo Univ.)
pp. 13 - 18
NLP2007-145
Analysis of Random Boolean Network Dynamics Based on One-dimensional Mapping
Nobumitsu Fujiwara, Shinji Doi, Sadatoshi Kumagai (Osaka Univ.)
pp. 19 - 24
NLP2007-146
A Generalised Entropy-based Associative Memory
Masahiro Nakagawa (Nagaoka Univ. of Tech.)
pp. 25 - 30
NLP2007-147
A modified Particle Swarm Optimization to search an optimal value
Kai Yamasaki, Tomokadu Kaneko, Kenya Jin'no (Kanto Gakuin Univ.)
pp. 31 - 34
NLP2007-148
Hopfield NN Using Scaling Law for Quadratic Assignment Problem
Yoshifumi Tada, Yoko Uwate, Yoshifumi Nishio (Tokushima Univ.)
pp. 35 - 38
NLP2007-149
On Data Clustering by a Stochastic Embedding Method
Naoto Nishikawa, Shinji Doi, Sadatoshi Kumagai (Osaka Univ.)
pp. 39 - 44
NLP2007-150
Batch-Learning Self-Organizing Map with False-Neighbor Degree for Effective Self-Organization
Haruna Matsushita, Yoshifumi Nishio (Tokushima Univ.)
pp. 45 - 50
NLP2007-151
On the global bifurcation structure of a detailed ventricular myocardial cell model and drug sensitivity of ionic channels
Rei Yamaguchi, Satoru Hisakado, Shinji Doi, Sadatoshi Kumagai (Osaka Univ.)
pp. 51 - 56
NLP2007-152
Image Restoration and Interpolation by Radial Basis Function Network
Zhenxing Pan, Nobumitsu Fujiwara, Shinji Doi, Sadatoshi Kumagai (Osaka Univ.)
pp. 57 - 62
Note: Each article is a technical report without peer review, and its polished version will be published elsewhere.