JPH06195485A - Neural network outputting membership function - Google Patents

Neural network outputting membership function

Info

Publication number
JPH06195485A
JPH06195485A, JP4356857A, JP35685792A
Authority
JP
Japan
Prior art keywords
cells
cell
membership function
coupling coefficient
output
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP4356857A
Other languages
Japanese (ja)
Other versions
JP3296609B2 (en)
Inventor
Tetsuro Muraji
哲朗 連
Chinami Tanaka
ちなみ 田中
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mikuni Corp
Original Assignee
Mikuni Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mikuni Corp filed Critical Mikuni Corp
Priority to JP35685792A priority Critical patent/JP3296609B2/en
Publication of JPH06195485A publication Critical patent/JPH06195485A/en
Application granted granted Critical
Publication of JP3296609B2 publication Critical patent/JP3296609B2/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current


Landscapes

  • Feedback Control In General (AREA)
  • Image Analysis (AREA)

Abstract

PURPOSE: To give the structure of the NN flexibility and to obtain membership functions in a balanced arrangement, in an NN that outputs membership functions. CONSTITUTION: This neural network (NN) has a cell 2 and a bias cell 1 that output input signals, a plurality of first intermediate cells 3 to 6 connected to the signals of those cells via prescribed coupling coefficients, a plurality of second intermediate cells 7 to 9 connected to the first intermediate cells via prescribed coupling coefficients, and two output cells 10, 11. A teacher signal of a fixed value is applied via one of the output cells 10.

Description

Detailed Description of the Invention

【0001】[0001]

[Industrial Field of Application] The present invention relates to a neural network (hereinafter referred to as an NN) that outputs membership functions.

【0002】[0002]

[Prior Art] In recent years, neurocomputers have been developed and various research results have been published. A neurocomputer is modeled on neurons (nerve cells), the basic elements of the brain, and attempts to achieve brain-like functions by taking as its hint the NN formed when such neurons are interconnected. The features of these NNs are parallel information processing among the neurons and the ability to learn. As an actual network model, an NN consists of an input layer, an intermediate layer, and an output layer, each provided with cells corresponding to the neurons, and each cell is connected to all the cells in the layers immediately before and after its own. A coupling coefficient, called a weight, is introduced for each connection between cells, and these coefficients are corrected by learning.

【0003】[0003]

[Problems to Be Solved by the Invention] As described above, in the prior art the cells belonging to each layer of an NN are connected to all the cells in the layers before and after their own. A conventional learning program for a hierarchical NN therefore performs its calculations on the assumption that each cell is connected to every cell in the adjacent layers, so the structure of the NN is inflexible and a bias arises in the output waveform. The present invention has been made to solve these drawbacks, and its object is to provide an NN that has a flexible structure and outputs membership functions in a balanced, unbiased arrangement.

【0004】[0004]

[Means for Solving the Problems] To achieve the above object, the present invention provides a cell and a bias cell that output input signals, a plurality of first intermediate cells connected to the signals of those cells via prescribed coupling coefficients, a plurality of second intermediate cells connected to the first intermediate cells via prescribed coupling coefficients, and two output cells, with a teacher signal of a fixed value applied via one of the output cells.

【0005】[0005]

[Embodiments] Embodiments will be described below with reference to the drawings. FIG. 1 is a configuration diagram of one embodiment of an NN according to the present invention. In the figure, 1 to 11 denote cells; among them, cell 1 is a special cell called a bias cell, which always outputs 1. W1 to W7 and WB1 to WB4 are coupling coefficients. The input/output function of cells 3 to 6 is the sigmoid function shown in equation (2), and that of cells 7 to 11 is the identity function shown in equation (1). The coupling coefficients between cell 4 and cell 8 and between cell 6 and cell 9 are each fixed at -1, as shown in the figure, and are not learned. Cell 7 receives the output of cell 3 with coupling coefficient 1 and the output of bias cell 1 with coupling coefficient 1; likewise, cell 9 receives the output of cell 6 with coupling coefficient -1 and the output of bias cell 1 with coupling coefficient 1. Cell 10 is a cell for learning so that the sum of the outputs of cells 7 to 9 approaches 1; for this purpose, a teacher signal of 1 is given to cell 10, and the coupling coefficients between cell 10 and cells 7 to 9 are fixed at 1 and are not learned.

[Equation 1]
y = x                                  (1)
y = (1 - exp(-x)) / (1 + exp(-x))      (2)
where x is the sum of the inputs to a cell and y is the cell's output.

FIG. 3 shows the outputs of cells 7, 8, and 9 of FIG. 1. For cell 7, the sigmoid function from cell 3 is input in its original form (coupling coefficient 1) and is shifted to the right by the coupling coefficient from bias cell 1. The output of cell 8 is the sum of the sigmoid function from cell 4 inverted by the coupling coefficient -1 and the sigmoid function from cell 5 itself. The output of cell 9 results from the sigmoid function from cell 6 inverted by the coupling coefficient -1 and the shift produced by the coupling coefficient 1 from bias cell 1. The coupling coefficients W1 to W4 determine the steepness of the membership functions of cells 7 to 9, and WB1 to WB4 determine their offsets along the X axis.

[0006] FIG. 2 is a configuration diagram of another embodiment. It differs from FIG. 1 in that the NN having the membership-function capability (symbol A in the figure) uses two cells that handle sigmoid functions; the connections between them are as shown in FIG. 2. As in the previous embodiment, cell 8 receives the simply added outputs of cells 5 to 7 as its input, a teacher signal of 1 is given to it, and the coupling coefficients between cell 8 and cells 5 to 7 are fixed at 1 and are not learned. FIG. 4 shows the membership-function waveforms produced by cells 5 to 7; they are formed in the same way as in the case of FIG. 1. This embodiment also provides the same effects as that of FIG. 1.

【0007】[0007]

[Effects of the Invention] As described above, according to the present invention, a teacher signal of a fixed value is given to the cell that outputs the simple sum, so the network can learn so that the sum of the membership-function outputs becomes constant, and membership functions are obtained in a balanced, unbiased arrangement.
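To make this effect concrete, here is a hedged numerical sketch (not the patent's own procedure): the summing output cell receives a fixed teacher signal of 1 with its incoming coefficients frozen at 1, and descent on the remaining coefficients drives the sum of the membership outputs toward 1. Finite differences stand in for backpropagation purely for brevity, and all coefficient values are hypothetical.

```python
import math

def sigmoid(x):
    # Equation (2) of the description.
    return (1.0 - math.exp(-x)) / (1.0 + math.exp(-x))

def sum_cell(u, W, WB):
    # Output of the summing cell (cell 10): cells 7-9 added with coefficients fixed at 1.
    s = [sigmoid(W[i] * u + WB[i]) for i in range(4)]
    return (s[0] + 1.0) + (-s[1] + s[2]) + (-s[3] + 1.0)

def train_step(samples, W, WB, lr=0.02, eps=1e-4):
    # One descent step on the squared error (1 - sum)^2, teacher signal fixed at 1.
    def loss():
        return sum((1.0 - sum_cell(u, W, WB)) ** 2 for u in samples) / len(samples)
    grads_W, grads_B = [], []
    for vec, grads in ((W, grads_W), (WB, grads_B)):
        for i in range(4):
            vec[i] += eps
            up = loss()
            vec[i] -= 2.0 * eps
            down = loss()
            vec[i] += eps                      # restore original value
            grads.append((up - down) / (2.0 * eps))
    for i in range(4):                          # only W and WB are learned;
        W[i] -= lr * grads_W[i]                 # the coefficients into the
        WB[i] -= lr * grads_B[i]                # summing cell stay fixed at 1.
    return loss()
```

Repeated calls to `train_step` lower the error, i.e. the sum of the three membership outputs approaches the constant teacher value 1 over the sampled inputs, which is the balancing effect described above.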

Brief Description of the Drawings

FIG. 1 is a configuration diagram of one embodiment of an NN according to the present invention.

FIG. 2 is a configuration diagram of another embodiment.

FIG. 3 is an output characteristic diagram of cells 7, 8, and 9 of FIG. 1.

FIG. 4 is an output characteristic diagram of cells 5, 6, and 7 of FIG. 2.

Explanation of Symbols

1 to 11: cells; W1 to W7, WB1 to WB4: coupling coefficients.

Claims (1)

1. A neural network that outputs membership functions, comprising: a cell and a bias cell that output input signals; a plurality of first intermediate cells connected to the signals of those cells via prescribed coupling coefficients; a plurality of second intermediate cells connected to the first intermediate cells via prescribed coupling coefficients; and two output cells, wherein a teacher signal of a fixed value is applied via one of the output cells.
JP35685792A 1992-12-22 1992-12-22 Neural network that outputs membership function Expired - Fee Related JP3296609B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP35685792A JP3296609B2 (en) 1992-12-22 1992-12-22 Neural network that outputs membership function

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP35685792A JP3296609B2 (en) 1992-12-22 1992-12-22 Neural network that outputs membership function

Publications (2)

Publication Number Publication Date
JPH06195485A (en) 1994-07-15
JP3296609B2 JP3296609B2 (en) 2002-07-02

Family

ID=18451116

Family Applications (1)

Application Number Title Priority Date Filing Date
JP35685792A Expired - Fee Related JP3296609B2 (en) 1992-12-22 1992-12-22 Neural network that outputs membership function

Country Status (1)

Country Link
JP (1) JP3296609B2 (en)

Also Published As

Publication number Publication date
JP3296609B2 (en) 2002-07-02

Similar Documents

Publication Publication Date Title
Yang et al. Exponential stability and oscillation of Hopfield graded response neural network
JPH07114524A (en) Signal processor
Rujan et al. A geometric approach to learning in neural networks
JPH06195485A (en) Neural network outputting membership function
JPH0652338A (en) Neural network for generating membership function
JPH04237388A (en) Neuro processor
JPH01147657A (en) Chaos circuit network
Chichilnisky et al. Patterns of power: bargaining and incentives in two-person games
JPH0652139A (en) Data structure for neural network
JP3343626B2 (en) Neural networks for fuzzy inference
JPH0652140A (en) Data structure for neural network
JPH06195488A (en) Neural network for fuzzy inference
JPH03257659A (en) Neural network
JPH03257658A (en) Dynamic system modeling method for neural network
JPH03268074A (en) System and device for recognition of pattern
JPH03196250A (en) Neuron simulation circuit
JPH03265077A (en) Feedback neural cell model
JP3343625B2 (en) Neural networks for fuzzy inference
JP3118018B2 (en) Neural network
JP2607351B2 (en) Error Signal Generation Method for Efficient Learning of Multilayer Perceptron Neural Network
Hammer A NP-hardness Result for a Sigmoidal 3-node Neural Network
JPH04275690A (en) Character recognizing device
JP3292495B2 (en) Neuro-fuzzy fusion system
JPH0736181B2 (en) Neural network circuit
JPH0736183B2 (en) Network configuration data processor

Legal Events

Date Code Title Description
LAPS Cancellation because of no payment of annual fees