JPWO2021095361A5 - - Google Patents
- Publication number
- JPWO2021095361A5 (application JP2021555926A)
- Authority
- JP
- Japan
- Prior art keywords
- unit
- input
- data
- neural network
- layer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- neural network model (claims: 10)
- artificial neural network (claims: 2)
- function (claims: 2)
- propagated effect (claims: 1)
Claims (8)
An arithmetic device (1) comprising:
a storage unit (12) that stores a trained neural network model in which the weighting coefficients are defined by probability distributions and which includes a probability layer that propagates the mean and variance of the probability distribution of its output values to the subsequent stage;
an input unit (10) that inputs data;
an inference unit (11) that analyzes the data input by the input unit through inference using the neural network model; and
an output unit (13) that outputs the analysis result of the inference unit.
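The propagation of the mean and variance of a probability layer's output to the subsequent stage can be sketched as moment propagation through a linear layer with independent Gaussian weights. This is a minimal illustration, not the patent's implementation; the layer shapes and weight statistics below are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def prob_layer(m_in, v_in, w_mu, w_var):
    """Propagate mean and variance through a linear layer whose weights
    are independent Gaussians with mean w_mu and variance w_var.
    For y_j = sum_i w_ji * x_i with w and x independent:
      E[y]   = w_mu @ E[x]
      Var[y] = w_mu^2 @ Var[x] + w_var @ E[x]^2 + w_var @ Var[x]
    """
    m_out = w_mu @ m_in
    v_out = (w_mu ** 2) @ v_in + w_var @ (m_in ** 2) + w_var @ v_in
    return m_out, v_out

# A deterministic input is a distribution with variance zero.
x = np.array([1.0, 2.0, 3.0])
w_mu = rng.normal(size=(2, 3))          # illustrative learned weight means
w_var = np.full((2, 3), 0.01)           # illustrative weight variances
m, v = prob_layer(x, np.zeros_like(x), w_mu, w_var)
```

The pair `(m, v)` can then be fed to the next layer in the same form, so uncertainty flows through the network without sampling.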
An arithmetic device (1) comprising:
a storage unit (12) that stores a trained neural network model;
an input unit (10) that inputs data;
an inference unit (11) that analyzes the data input by the input unit through inference using the neural network model; and
an output unit (13) that outputs the analysis result of the inference unit,
wherein the neural network model has one probability layer composed of a combination of a given probability distribution and weighting coefficients set by learning, while the other layers define their weighting coefficients as fixed values, and
wherein, when the data input by the input unit is applied to the neural network model, the inference unit analyzes the input data based on the values appearing in the probability layer and the probability distribution of the weighting coefficients.
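A network with a single probability layer among otherwise deterministic layers can be sketched by sampling only that layer's weights on each forward pass and reading the spread of the results. The architecture, weight values, and Gaussian spread below are illustrative assumptions, not taken from the patent.

```python
import numpy as np

rng = np.random.default_rng(1)

# Deterministic layer: weights fixed at definite values (illustrative).
W1 = rng.normal(size=(4, 3))

# Probability layer: learned weight means combined with a given
# Gaussian distribution of fixed spread (illustrative values).
mu2 = rng.normal(size=(2, 4))
sigma2 = 0.1

def forward(x):
    h = np.maximum(W1 @ x, 0.0)                     # deterministic ReLU layer
    w2 = mu2 + sigma2 * rng.normal(size=mu2.shape)  # sample the one stochastic layer
    return w2 @ h

x = np.array([0.5, -1.0, 2.0])
samples = np.stack([forward(x) for _ in range(1000)])
mean, var = samples.mean(axis=0), samples.var(axis=0)
```

Repeated passes vary only in the probability layer, so `mean` and `var` summarize how the weight distribution of that single layer spreads the output for a given input.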
The arithmetic device according to any one of claims 1 to 4, wherein the input unit inputs image data, voice data, or text data, and the inference unit classifies the image data, voice data, or text data into a plurality of classes and obtains the reliability of the classification into each class.
The arithmetic device (2) according to any one of claims 1 to 4, wherein the input unit inputs an image captured by a camera (20) of an autonomous driving vehicle, the inference unit classifies objects appearing in the image into classes and infers the reliability of the classification, and the output unit outputs the object class and the reliability data to a vehicle control unit (21).
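One common way to obtain both a class and its reliability from a stochastic network is to average softmax probabilities over repeated stochastic passes; the predicted class is the argmax of the averaged probabilities and the reliability is that averaged probability. This is a generic sketch, not the patent's method; the sampled logits below stand in for the outputs of repeated forward passes.

```python
import numpy as np

rng = np.random.default_rng(2)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical logits for 3 classes from 200 stochastic forward passes.
logits = rng.normal(loc=[2.0, 0.5, -1.0], scale=0.3, size=(200, 3))
probs = np.apply_along_axis(softmax, 1, logits)

mean_p = probs.mean(axis=0)
cls = int(mean_p.argmax())        # predicted class
reliability = float(mean_p[cls])  # averaged probability of that class
```

A downstream consumer such as a vehicle control unit could then gate its behavior on `reliability`, e.g. falling back to a conservative maneuver when it is low.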
A trained model for causing a computer to function to analyze input data, the model being configured as a neural network model in which the weighting coefficients are defined by probability distributions and which includes a probability layer that propagates the mean and variance of the probability distribution of its output values to the subsequent stage, wherein, when input data is applied to the neural network model, the input data is analyzed based on the mean and variance of the probability distribution of the output values propagated to the output layer.
A trained model for causing a computer to function to analyze input data, the model being configured as a neural network model that has one probability layer composed of a combination of a given probability distribution and weighting coefficients set by learning, while the other layers define their weighting coefficients as fixed values, wherein, when input data is applied to the neural network model, the input data is analyzed based on the values appearing in the probability layer and the probability distribution of the weighting coefficients.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019203988 | 2019-11-11 | ||
PCT/JP2020/035254 WO2021095361A1 (en) | 2019-11-11 | 2020-09-17 | Arithmetic device and learned model |
Publications (3)
Publication Number | Publication Date |
---|---|
JPWO2021095361A1 (en) | 2021-05-20 |
JPWO2021095361A5 (en) | 2022-07-07 |
JP7386462B2 (en) | 2023-11-27 |
Family
ID=75912167
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2021555926A Active JP7386462B2 (en) | 2019-11-11 | 2020-09-17 | Computing unit and trained model |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7386462B2 (en) |
WO (1) | WO2021095361A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7457752B2 (en) | 2022-06-15 | 2024-03-28 | 株式会社安川電機 | Data analysis system, data analysis method, and program |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11761790B2 (en) * | 2016-12-09 | 2023-09-19 | Tomtom Global Content B.V. | Method and system for image-based positioning and mapping for a road network utilizing object detection |
2020
- 2020-09-17 JP JP2021555926A patent/JP7386462B2/en active Active
- 2020-09-17 WO PCT/JP2020/035254 patent/WO2021095361A1/en active Application Filing
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11922322B2 (en) | Exponential modeling with deep learning features | |
KR102644947B1 (en) | Training method for neural network, recognition method using neural network, and devices thereof | |
US12020167B2 (en) | Gradient adversarial training of neural networks | |
Whittington et al. | An approximation of the error backpropagation algorithm in a predictive coding network with local hebbian synaptic plasticity | |
KR102492318B1 (en) | Model training method and apparatus, and data recognizing method | |
EP3459021B1 (en) | Training neural networks using synthetic gradients | |
US8694451B2 (en) | Neural network system | |
Deng et al. | Driving style recognition method using braking characteristics based on hidden Markov model | |
US20190095301A1 (en) | Method for detecting abnormal session | |
US20180053085A1 (en) | Inference device and inference method | |
US20180129930A1 (en) | Learning method based on deep learning model having non-consecutive stochastic neuron and knowledge transfer, and system thereof | |
US20180121791A1 (en) | Temporal difference estimation in an artificial neural network | |
US20210089867A1 (en) | Dual recurrent neural network architecture for modeling long-term dependencies in sequential data | |
WO2020241356A1 (en) | Spiking neural network system, learning processing device, learning method, and recording medium | |
JPWO2021095361A5 (en) | ||
JP2020191088A (en) | Neural network with layer to solve semidefinite programming problem | |
Verma et al. | Fuzzy inference network with mamdani fuzzy inference system | |
US11521053B2 (en) | Network composition module for a bayesian neuromorphic compiler | |
US11769036B2 (en) | Optimizing performance of recurrent neural networks | |
JP5881030B2 (en) | Artificial intelligence device that expands knowledge in a self-organizing manner | |
Dao | Image classification using convolutional neural networks | |
JP7386462B2 (en) | Computing unit and trained model | |
US20190325294A1 (en) | Recurrent neural network model compaction | |
US20230306259A1 (en) | Information processing apparatus, information processing method and program | |
KR102090109B1 (en) | Learning and inference apparatus and method |