TW270192B - Artificial neural network architecture

Artificial neural network architecture

Info

Publication number
TW270192B
Authority
TW
Taiwan
Prior art keywords
processor
wij
activation function
neural network
artificial neural
Prior art date
1995-05-11
Application number
TW84104739A
Other languages
Chinese (zh)
Inventor
Jyh-Dar Chiue
Hwai-Juu Jang
Original Assignee
Ind Tech Res Inst
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
1995-05-11
Filing date
1995-05-11
Publication date
1996-02-11
Application filed by Ind Tech Res Inst
Priority to TW84104739A
Application granted
Publication of TW270192B

Landscapes

  • Complex Calculations (AREA)

Abstract

An artificial neural network circuit comprises a one-dimensional systolic array processor with M processor elements. The i-th processor element (i = 1, 2, ..., M) includes a weight-storage memory for storing the synapse weights Wij (j = 1, 2, ..., N) and a sub-processor element that processes the series of input signals Xj together with the weights Wij and accumulates the results into an output gi, where gi is the sum over j of f(Wij, Xj). Each gi is stored in a shift register consisting of M storage elements, the i-th storage element holding the i-th gi value. The one-dimensional systolic array processor further includes an activation-function processor element that independently computes the activation-function value Yi for each gi value shifted out of the register in sequence and outputs Yi, where Yi = S(gi) and S is the activation function.
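
To make the data flow described in the abstract concrete, the following is a minimal behavioral sketch in Python. It is not the patented circuit: the abstract leaves the synapse function f and the activation function S generic, so the sketch assumes f(Wij, Xj) = Wij * Xj (a conventional weighted sum) and a logistic sigmoid for S, purely for illustration. All function and variable names are hypothetical.

    import math

    def systolic_forward(W, X,
                         f=lambda w, x: w * x,                       # assumed synapse function f(Wij, Xj)
                         S=lambda g: 1.0 / (1.0 + math.exp(-g))):    # assumed activation function S
        """Behavioral model: M processor elements, an M-stage shift register,
        and one activation-function processor element."""
        M, N = len(W), len(X)

        # Each of the M processor elements holds its row of synapse weights Wij
        # in local weight-storage memory and accumulates gi = sum_j f(Wij, Xj)
        # as the input signals Xj stream past.
        g = [0.0] * M
        for j in range(N):          # inputs X1..XN enter the array one per cycle
            for i in range(M):      # every processor element sees Xj and updates its own gi
                g[i] += f(W[i][j], X[j])

        # The gi values are held in a shift register with M storage elements;
        # the activation-function element reads them out in sequence and emits Yi = S(gi).
        shift_register = list(g)
        Y = [S(gi) for gi in shift_register]
        return Y

    # Example: M = 2 neurons, N = 3 inputs.
    W = [[0.5, -1.0, 0.25],
         [1.5,  0.0, -0.5]]
    X = [1.0, 2.0, 4.0]
    print(systolic_forward(W, X))

In hardware the inner loop over i would run in parallel across the M processor elements each cycle, while the single activation-function element works through the shift register serially.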

Priority Applications (1)

Application Number: TW84104739A
Priority Date: 1995-05-11
Filing Date: 1995-05-11
Title: Artificial neural network architecture

Applications Claiming Priority (1)

Application Number: TW84104739A
Priority Date: 1995-05-11
Filing Date: 1995-05-11
Title: Artificial neural network architecture

Publications (1)

Publication Number: TW270192B
Publication Date: 1996-02-11

Family

ID=51396980

Family Applications (1)

Application Number: TW84104739A
Title: Artificial neural network architecture
Priority Date: 1995-05-11
Filing Date: 1995-05-11

Country Status (1)

Country: TW
Link: TW270192B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI784816B (en) * 2020-12-23 2022-11-21 Silicon Storage Technology, Inc. Input and digital output mechanisms for analog neural memory in a deep learning artificial neural network
US11729970B2 2018-10-16 2023-08-15 Silicon Storage Technology, Inc. Input and digital output mechanisms for analog neural memory in a deep learning artificial neural network
TWI841222B (en) * 2017-03-09 2024-05-01 Google LLC Vector processing unit and computing system having the same, and computer-implemented method
US12075618B2 2018-10-16 2024-08-27 Silicon Storage Technology, Inc. Input and digital output mechanisms for analog neural memory in a deep learning artificial neural network

Similar Documents

Publication Publication Date Title
US4254474A (en) Information processing system using threshold passive modification
Takefuji et al. Artificial neural networks for four-coloring map problems and K-colorability problems
MS Jr et al. A digital neural network architecture for VLSI
US5087826A (en) Multi-layer neural network employing multiplexed output neurons
EP0314170A2 (en) Multi-layer neural network to which dynamic programming techniques are applicable
US5799134A (en) One dimensional systolic array architecture for neural network
US6366897B1 (en) Cortronic neural networks with distributed processing
Cushing A strong ergodic theorem for some nonlinear matrix models for the dynamics of structured populations
Saeks et al. On the Design of an MIMD Neural Network Processor
TW270192B (en) Artificial neural network architecture
US5384896A (en) Learning machine
JPH05233586A (en) Digital neural circuit and its driving method
Kuan A recurrent Newton algorithm and its convergence properties
Koutroumbas et al. Generalized Hamming networks and applications
Hattori et al. Quick learning for multidirectional associative memories
US5122983A (en) Charged-based multiplier circuit
Jutten et al. Simulation machine and integrated implementation of neural networks: A review of methods, problems and realizations
Zhenjiang et al. An extended BAM neural network model
FI103305B (en) Associative neural network
Wilson Neural Computing on a One Dimensional SIMD Array.
Wang Optimal harvesting for age distribution and weighted size competitive species with diffusion
Yokoi A fundamental element for neural computer-Folthret
Ramacher et al. WSI architecture of a neurocomputer module
Kwan Systolic architectures for Hopfield network, BAM and multi-layer feed-forward network
Lin et al. Identification of dynamic systems using recurrent fuzzy neural network