JP6706326B2 - Compression of recurrent neural network models - Google Patents
Compression of recurrent neural network models
- Publication number
- JP6706326B2 (application JP2018534819A)
- Authority
- JP
- Japan
- Prior art keywords
- recurrent
- layer
- weight matrix
- matrix
- rnn
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06N3/044—Recurrent networks, e.g. Hopfield networks
- G06N20/00—Machine learning
- G06N3/049—Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
- G06N3/08—Learning methods
- G05B2219/33025—Recurrent artificial neural network
- G05B2219/40326—Singular value decomposition
- G06F17/16—Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/084—Backpropagation, e.g. using gradient descent
Description
Wh = UΣVᵀ

is satisfied, where U and V are unitary matrices and Σ is a rectangular diagonal matrix of singular values.
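For illustration, this decomposition and its low-rank truncation can be sketched in NumPy; the matrix size n and truncation rank r below are assumed values, not figures from the patent:

```python
import numpy as np

# A recurrent weight matrix Wh for a layer with n units (n = 6 is illustrative).
rng = np.random.default_rng(0)
n, r = 6, 2
Wh = rng.standard_normal((n, n))

# Full singular value decomposition: Wh = U @ diag(s) @ Vt, i.e. Wh = U Σ V^T.
U, s, Vt = np.linalg.svd(Wh)
assert np.allclose(U @ np.diag(s) @ Vt, Wh)  # the factorization is exact

# Keeping only the top-r singular values yields the best rank-r
# approximation of Wh, which is the basis for the compression.
Wh_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
```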
102 neural network input
110 recurrent neural network
120 compressed recurrent layer
122 current layer input
124 new layer state
126 new layer output
130 recurrent layer
134 new layer state
136 new layer output
142 neural network output
200 method
300 method
Claims (20)
wherein the compressed RNN comprises a plurality of recurrent layers,
each of the plurality of recurrent layers of the compressed RNN is configured to, for each time step of a plurality of time steps, receive a respective layer input for the time step and process the layer input for the time step to generate a respective layer output for the time step,
each of the plurality of recurrent layers has a respective recurrent weight matrix Wh and a respective inter-layer weight matrix Wx, and
at least one of the plurality of recurrent layers is compressed, whereby the respective recurrent weight matrix of the compressed layer
A system.
The system of claim 1, configured to generate the respective layer output for the time step by:
applying the inter-layer weight matrix for the preceding layer to the current layer input, and
applying the recurrent weight matrix to the recurrent input.
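The layer computation in this claim corresponds to a standard recurrent update; a minimal NumPy sketch, where the tanh activation and the matrix sizes are illustrative assumptions not specified by the claim:

```python
import numpy as np

def recurrent_layer_step(x, h_prev, Wx, Wh):
    """One time step: apply the inter-layer weight matrix Wx to the current
    layer input x, apply the recurrent weight matrix Wh to the recurrent
    input h_prev, and pass the sum through a nonlinearity (tanh assumed)."""
    return np.tanh(Wx @ x + Wh @ h_prev)

rng = np.random.default_rng(1)
n_in, n = 4, 6  # illustrative input and layer sizes
Wx = rng.standard_normal((n, n_in))
Wh = rng.standard_normal((n, n))
x = rng.standard_normal(n_in)
h = recurrent_layer_step(x, np.zeros(n), Wx, Wh)  # new layer state/output
```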
The system of claim 1 or 2, configured to generate, in part, the respective layer output for the time step by applying the first compressed weight matrix and the projection matrix to the respective recurrent input.
The system of any one of claims 1 to 3, wherein the respective inter-layer weight matrix is defined by the product of the second compressed weight matrix and the projection matrix.
wherein the RNN is implemented by the one or more computers and comprises a plurality of recurrent layers,
each of the plurality of recurrent layers of the RNN is configured to, for each time step of a plurality of time steps, receive a respective layer input for the time step and process the layer input to generate a respective layer output for the time step,
each recurrent layer has a respective recurrent weight matrix Wh and a respective inter-layer weight matrix Wx,
the method comprising, for one of the plurality of recurrent layers:
a first compressed weight matrix
a second compressed weight matrix
, wherein the second compressed weight matrix
setting the first compressed weight matrix to a matrix defined by a truncated first unitary matrix and a truncated rectangular diagonal matrix, and
setting the projection matrix to the transpose of a truncated second unitary matrix.
A method.
The method of claim 9, further comprising:
replacing the respective recurrent weight matrix with the product of the first compressed weight matrix and the projection matrix, and
replacing the respective inter-layer weight matrix with the product of the second compressed weight matrix and the projection matrix.
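Taken together, the construction and replacement steps in the claims above can be sketched end to end: truncate the SVD of Wh, set the first compressed weight matrix Zh from the truncated unitary and rectangular diagonal matrices, set the projection matrix P to the transpose of the truncated second unitary matrix, and replace Wh and Wx with the products Zh·P and Zx·P. The truncation rank r and the least-squares fit used to obtain Zx are assumptions for illustration:

```python
import numpy as np

def compress_recurrent_layer(Wh, Wx, r):
    """Jointly compress Wh (n x n) and Wx (m x n) with a shared projection
    matrix P. The least-squares solve for Zx is an assumed detail."""
    U, s, Vt = np.linalg.svd(Wh)
    Zh = U[:, :r] @ np.diag(s[:r])  # truncated unitary x rect. diagonal
    P = Vt[:r, :]                   # transpose of truncated second unitary
    # Fit Zx so that Zx @ P approximates Wx in the least-squares sense.
    Zx = Wx @ np.linalg.pinv(P)
    return Zh, Zx, P

rng = np.random.default_rng(2)
n, r = 8, 2  # illustrative layer size and rank
Wh = rng.standard_normal((n, n))
Wx = rng.standard_normal((n, n))
Zh, Zx, P = compress_recurrent_layer(Wh, Wx, r)

# Wh and Wx are replaced by Zh @ P and Zx @ P; with r << n the parameter
# count falls from 2*n*n to roughly 3*n*r.
original = 2 * n * n
compressed = Zh.size + Zx.size + P.size
```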
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662290624P | 2016-02-03 | 2016-02-03 | |
US62/290,624 | 2016-02-03 | ||
PCT/US2016/068913 WO2017136070A1 (en) | 2016-02-03 | 2016-12-28 | Compressed recurrent neural network models |
Publications (2)
Publication Number | Publication Date |
---|---|
JP2019509539A JP2019509539A (ja) | 2019-04-04 |
JP6706326B2 true JP6706326B2 (ja) | 2020-06-03 |
Family
ID=57882138
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
- JP2018534819A Active JP6706326B2 (ja) Compression of recurrent neural network models
Country Status (7)
Country | Link |
---|---|
US (2) | US10878319B2 (ja) |
EP (1) | EP3374932B1 (ja) |
JP (1) | JP6706326B2 (ja) |
KR (1) | KR102100977B1 (ja) |
CN (1) | CN107038476A (ja) |
DE (2) | DE102016125918A1 (ja) |
WO (1) | WO2017136070A1 (ja) |
Families Citing this family (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3398119B1 (en) * | 2016-02-05 | 2022-06-22 | Deepmind Technologies Limited | Generative neural networks for generating images using a hidden canvas |
US10783535B2 (en) | 2016-05-16 | 2020-09-22 | Cerebri AI Inc. | Business artificial intelligence management engine |
US10599935B2 (en) * | 2017-02-22 | 2020-03-24 | Arm Limited | Processing artificial neural network weights |
US10762563B2 (en) | 2017-03-10 | 2020-09-01 | Cerebri AI Inc. | Monitoring and controlling continuous stochastic processes based on events in time series data |
US10402723B1 (en) | 2018-09-11 | 2019-09-03 | Cerebri AI Inc. | Multi-stage machine-learning models to control path-dependent processes |
US11037330B2 (en) * | 2017-04-08 | 2021-06-15 | Intel Corporation | Low rank matrix compression |
US11216437B2 (en) | 2017-08-14 | 2022-01-04 | Sisense Ltd. | System and method for representing query elements in an artificial neural network |
US11106975B2 (en) * | 2017-10-20 | 2021-08-31 | Asapp, Inc. | Fast neural network implementations by increasing parallelism of cell computations |
WO2019078885A1 (en) * | 2017-10-20 | 2019-04-25 | Google Llc | PARALLEL EXECUTION OF OPERATIONS OF ACTIVATION UNITS WITH RELEASE |
US11556775B2 (en) * | 2017-10-24 | 2023-01-17 | Baidu Usa Llc | Systems and methods for trace norm regularization and faster inference for embedded models |
CN113807510B (zh) | 2017-12-30 | 2024-05-10 | 中科寒武纪科技股份有限公司 | 集成电路芯片装置及相关产品 |
CN109993290B (zh) | 2017-12-30 | 2021-08-06 | 中科寒武纪科技股份有限公司 | 集成电路芯片装置及相关产品 |
CN109993291B (zh) * | 2017-12-30 | 2020-07-07 | 中科寒武纪科技股份有限公司 | 集成电路芯片装置及相关产品 |
EP3624019A4 (en) | 2017-12-30 | 2021-03-24 | Cambricon Technologies Corporation Limited | CHIP DEVICE WITH INTEGRATED CIRCUIT AND ASSOCIATED PRODUCT |
CN109993292B (zh) | 2017-12-30 | 2020-08-04 | 中科寒武纪科技股份有限公司 | 集成电路芯片装置及相关产品 |
US11586924B2 (en) * | 2018-01-23 | 2023-02-21 | Qualcomm Incorporated | Determining layer ranks for compression of deep networks |
US10657426B2 (en) * | 2018-01-25 | 2020-05-19 | Samsung Electronics Co., Ltd. | Accelerating long short-term memory networks via selective pruning |
US11593068B2 (en) * | 2018-02-27 | 2023-02-28 | New York University | System, method, and apparatus for recurrent neural networks |
CN110533157A (zh) * | 2018-05-23 | 2019-12-03 | 华南理工大学 | 一种基于svd和剪枝用于深度循环神经网络的压缩方法 |
JP2020034625A (ja) * | 2018-08-27 | 2020-03-05 | 日本電信電話株式会社 | 音声認識装置、音声認識方法、及びプログラム |
US11068942B2 (en) | 2018-10-19 | 2021-07-20 | Cerebri AI Inc. | Customer journey management engine |
CN109523995B (zh) * | 2018-12-26 | 2019-07-09 | 出门问问信息科技有限公司 | 语音识别方法、语音识别装置、可读存储介质和电子设备 |
US11599773B2 (en) | 2018-12-27 | 2023-03-07 | Micron Technology, Inc. | Neural networks and systems for decoding encoded data |
CN109670158B (zh) * | 2018-12-27 | 2023-09-29 | 北京及客科技有限公司 | 一种用于根据资讯数据生成文本内容的方法与设备 |
CN109740737B (zh) * | 2018-12-30 | 2021-02-19 | 联想(北京)有限公司 | 卷积神经网络量化处理方法、装置及计算机设备 |
US11444845B1 (en) * | 2019-03-05 | 2022-09-13 | Amazon Technologies, Inc. | Processing requests using compressed and complete machine learning models |
CN110580525B (zh) * | 2019-06-03 | 2021-05-11 | 北京邮电大学 | 适用于资源受限的设备的神经网络压缩方法及系统 |
CN112308197B (zh) * | 2019-07-26 | 2024-04-09 | 杭州海康威视数字技术股份有限公司 | 一种卷积神经网络的压缩方法、装置及电子设备 |
US11922315B2 (en) * | 2019-08-26 | 2024-03-05 | Microsoft Technology Licensing, Llc. | Neural adapter for classical machine learning (ML) models |
US11424764B2 (en) * | 2019-11-13 | 2022-08-23 | Micron Technology, Inc. | Recurrent neural networks and systems for decoding encoded data |
WO2021117942A1 (ko) * | 2019-12-12 | 2021-06-17 | 전자부품연구원 | 저복잡도 딥러닝 가속 하드웨어 데이터 가공장치 |
US11188616B2 (en) | 2020-02-25 | 2021-11-30 | International Business Machines Corporation | Multi-linear dynamical model reduction |
KR20210136706A (ko) * | 2020-05-08 | 2021-11-17 | 삼성전자주식회사 | 전자 장치 및 이의 제어 방법 |
WO2021234967A1 (ja) * | 2020-05-22 | 2021-11-25 | 日本電信電話株式会社 | 音声波形生成モデル学習装置、音声合成装置、それらの方法、およびプログラム |
KR20220064054A (ko) * | 2020-11-11 | 2022-05-18 | 포항공과대학교 산학협력단 | 행렬곱 연산량 감소 방법 및 장치 |
US11563449B2 (en) | 2021-04-27 | 2023-01-24 | Micron Technology, Inc. | Systems for error reduction of encoded data using neural networks |
US11973513B2 (en) | 2021-04-27 | 2024-04-30 | Micron Technology, Inc. | Decoders and systems for decoding encoded data using neural networks |
CA3168515A1 (en) * | 2021-07-23 | 2023-01-23 | Cohere Inc. | System and method for low rank training of neural networks |
US11755408B2 (en) | 2021-10-07 | 2023-09-12 | Micron Technology, Inc. | Systems for estimating bit error rate (BER) of encoded data using neural networks |
Family Cites Families (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5408424A (en) * | 1993-05-28 | 1995-04-18 | Lo; James T. | Optimal filtering by recurrent neural networks |
AU2001295591A1 (en) * | 2000-10-13 | 2002-04-22 | Fraunhofer-Gesellschaft Zur Forderung Der Angewandten Forschung E.V. | A method for supervised teaching of a recurrent artificial neural network |
US9235800B2 (en) * | 2010-04-14 | 2016-01-12 | Siemens Aktiengesellschaft | Method for the computer-aided learning of a recurrent neural network for modeling a dynamic system |
WO2012109407A1 (en) * | 2011-02-09 | 2012-08-16 | The Trustees Of Columbia University In The City Of New York | Encoding and decoding machine with recurrent neural networks |
US8489529B2 (en) * | 2011-03-31 | 2013-07-16 | Microsoft Corporation | Deep convex network with joint use of nonlinear random projection, Restricted Boltzmann Machine and batch-based parallelizable optimization |
US9292787B2 (en) * | 2012-08-29 | 2016-03-22 | Microsoft Technology Licensing, Llc | Computer-implemented deep tensor neural network |
US20140156575A1 (en) * | 2012-11-30 | 2014-06-05 | Nuance Communications, Inc. | Method and Apparatus of Processing Data Using Deep Belief Networks Employing Low-Rank Matrix Factorization |
US9519858B2 (en) * | 2013-02-10 | 2016-12-13 | Microsoft Technology Licensing, Llc | Feature-augmented neural networks and applications of same |
US9728184B2 (en) * | 2013-06-18 | 2017-08-08 | Microsoft Technology Licensing, Llc | Restructuring deep neural network acoustic models |
US9620108B2 (en) * | 2013-12-10 | 2017-04-11 | Google Inc. | Processing acoustic sequences using long short-term memory (LSTM) neural networks that include recurrent projection layers |
US9400955B2 (en) * | 2013-12-13 | 2016-07-26 | Amazon Technologies, Inc. | Reducing dynamic range of low-rank decomposition matrices |
US9552526B2 (en) * | 2013-12-19 | 2017-01-24 | University Of Memphis Research Foundation | Image processing using cellular simultaneous recurrent network |
US9721202B2 (en) * | 2014-02-21 | 2017-08-01 | Adobe Systems Incorporated | Non-negative matrix factorization regularized by recurrent neural networks for audio processing |
US9324321B2 (en) * | 2014-03-07 | 2016-04-26 | Microsoft Technology Licensing, Llc | Low-footprint adaptation and personalization for a deep neural network |
US11256982B2 (en) * | 2014-07-18 | 2022-02-22 | University Of Southern California | Noise-enhanced convolutional neural networks |
US20160035344A1 (en) * | 2014-08-04 | 2016-02-04 | Google Inc. | Identifying the language of a spoken utterance |
US10783900B2 (en) * | 2014-10-03 | 2020-09-22 | Google Llc | Convolutional, long short-term memory, fully connected deep neural networks |
US10229356B1 (en) * | 2014-12-23 | 2019-03-12 | Amazon Technologies, Inc. | Error tolerant neural network model compression |
US10223635B2 (en) * | 2015-01-22 | 2019-03-05 | Qualcomm Incorporated | Model compression and fine-tuning |
CN104598972A (zh) * | 2015-01-22 | 2015-05-06 | 清华大学 | 一种大规模数据回归神经网络快速训练方法 |
CN104700828B (zh) * | 2015-03-19 | 2018-01-12 | 清华大学 | 基于选择性注意原理的深度长短期记忆循环神经网络声学模型的构建方法 |
US10515301B2 (en) * | 2015-04-17 | 2019-12-24 | Microsoft Technology Licensing, Llc | Small-footprint deep neural network |
US20160328644A1 (en) * | 2015-05-08 | 2016-11-10 | Qualcomm Incorporated | Adaptive selection of artificial neural networks |
US10091140B2 (en) * | 2015-05-31 | 2018-10-02 | Microsoft Technology Licensing, Llc | Context-sensitive generation of conversational responses |
US20160350653A1 (en) * | 2015-06-01 | 2016-12-01 | Salesforce.Com, Inc. | Dynamic Memory Network |
US10515307B2 (en) * | 2015-06-05 | 2019-12-24 | Google Llc | Compressed recurrent neural network models |
GB201511887D0 (en) * | 2015-07-07 | 2015-08-19 | Touchtype Ltd | Improved artificial neural network for language modelling and prediction |
CN105184369A (zh) * | 2015-09-08 | 2015-12-23 | 杭州朗和科技有限公司 | 用于深度学习模型的矩阵压缩方法和装置 |
US10217018B2 (en) * | 2015-09-15 | 2019-02-26 | Mitsubishi Electric Research Laboratories, Inc. | System and method for processing images using online tensor robust principal component analysis |
US20170083623A1 (en) * | 2015-09-21 | 2017-03-23 | Qualcomm Incorporated | Semantic multisensory embeddings for video search by text |
US10366158B2 (en) * | 2015-09-29 | 2019-07-30 | Apple Inc. | Efficient word encoding for recurrent neural network language models |
US10395118B2 (en) * | 2015-10-29 | 2019-08-27 | Baidu Usa Llc | Systems and methods for video paragraph captioning using hierarchical recurrent neural networks |
US9807473B2 (en) * | 2015-11-20 | 2017-10-31 | Microsoft Technology Licensing, Llc | Jointly modeling embedding and translation to bridge video and language |
US10332509B2 (en) * | 2015-11-25 | 2019-06-25 | Baidu USA, LLC | End-to-end speech recognition |
US10078794B2 (en) * | 2015-11-30 | 2018-09-18 | Pilot Ai Labs, Inc. | System and method for improved general object detection using neural networks |
US10832120B2 (en) * | 2015-12-11 | 2020-11-10 | Baidu Usa Llc | Systems and methods for a multi-core optimized recurrent neural network |
US10824941B2 (en) * | 2015-12-23 | 2020-11-03 | The Toronto-Dominion Bank | End-to-end deep collaborative filtering |
US10482380B2 (en) * | 2015-12-30 | 2019-11-19 | Amazon Technologies, Inc. | Conditional parallel processing in fully-connected neural networks |
US10515312B1 (en) * | 2015-12-30 | 2019-12-24 | Amazon Technologies, Inc. | Neural network model compaction using selective unit removal |
2016
- 2016-12-28 KR KR1020187017732A patent/KR102100977B1/ko active IP Right Grant
- 2016-12-28 EP EP16829466.8A patent/EP3374932B1/en active Active
- 2016-12-28 JP JP2018534819A patent/JP6706326B2/ja active Active
- 2016-12-28 WO PCT/US2016/068913 patent/WO2017136070A1/en active Application Filing
- 2016-12-29 US US15/394,617 patent/US10878319B2/en active Active
- 2016-12-30 DE DE102016125918.7A patent/DE102016125918A1/de active Pending
- 2016-12-30 DE DE202016008253.2U patent/DE202016008253U1/de active Active
- 2016-12-30 CN CN201611262293.6A patent/CN107038476A/zh active Pending
2020
- 2020-12-04 US US17/112,966 patent/US11948062B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
EP3374932A1 (en) | 2018-09-19 |
US11948062B2 (en) | 2024-04-02 |
CN107038476A (zh) | 2017-08-11 |
KR20180084988A (ko) | 2018-07-25 |
JP2019509539A (ja) | 2019-04-04 |
WO2017136070A1 (en) | 2017-08-10 |
EP3374932B1 (en) | 2022-03-16 |
US20210089916A1 (en) | 2021-03-25 |
KR102100977B1 (ko) | 2020-04-14 |
US10878319B2 (en) | 2020-12-29 |
DE102016125918A1 (de) | 2017-08-03 |
DE202016008253U1 (de) | 2017-05-26 |
US20170220925A1 (en) | 2017-08-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6706326B2 (ja) | Compression of recurrent neural network models | |
US11741366B2 (en) | Compressed recurrent neural network models | |
AU2022201819B2 (en) | Batch normalization layers | |
JP6758406B2 (ja) | Wide and deep machine learning models |
US20210117801A1 (en) | Augmenting neural networks with external memory | |
US11210579B2 (en) | Augmenting neural networks with external memory | |
US20160358073A1 (en) | Whitened neural network layers | |
WO2019157251A1 (en) | Neural network compression | |
JP2019517075A (ja) | Classifying input examples using a comparison set |
US20170154262A1 (en) | Resizing neural networks | |
US20170140271A1 (en) | Neural programming | |
EP3452960A1 (en) | Augmenting neural networks with external memory using reinforcement learning |
Legal Events
Date | Code | Title | Description
---|---|---|---
2018-08-29 | A621 | Written request for application examination | JAPANESE INTERMEDIATE CODE: A621
2019-09-13 | A977 | Report on retrieval | JAPANESE INTERMEDIATE CODE: A971007
2019-09-24 | A131 | Notification of reasons for refusal | JAPANESE INTERMEDIATE CODE: A131
2019-12-24 | A601 | Written request for extension of time | JAPANESE INTERMEDIATE CODE: A601
2020-02-18 | A521 | Request for written amendment filed | JAPANESE INTERMEDIATE CODE: A523
| TRDD | Decision of grant or rejection written |
2020-04-20 | A01 | Written decision to grant a patent or to grant a registration (utility model) | JAPANESE INTERMEDIATE CODE: A01
2020-05-15 | A61 | First payment of annual fees (during grant procedure) | JAPANESE INTERMEDIATE CODE: A61
| R150 | Certificate of patent or registration of utility model | Ref document number: 6706326; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150
| R250 | Receipt of annual fees | JAPANESE INTERMEDIATE CODE: R250
| R250 | Receipt of annual fees | JAPANESE INTERMEDIATE CODE: R250