JPWO2019210237A5 - - Google Patents
- Publication number
- JPWO2019210237A5 (application JP2020558533A)
- Authority
- JP
- Japan
- Prior art keywords
- machine learning
- learning model
- training data
- layer
- layers
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Claims (12)
A computer-implemented method comprising:
receiving training data;
training a machine learning model based on the training data, wherein the machine learning model includes a plurality of layers each having one or more nodes, the one or more nodes having one or more connections with nodes of another layer of the machine learning model;
evaluating weights associated with the connections of the machine learning model, wherein each connection has a corresponding weight;
removing, from the machine learning model, one or more connections having weights that do not satisfy a threshold condition;
evaluating layer weights associated with the layers of the machine learning model, wherein each layer has a corresponding layer weight;
removing, from the machine learning model, one or more layers having layer weights that do not satisfy a layer threshold condition; and
updating the machine learning model after removing the one or more connections and the one or more layers.
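The claimed pruning procedure (remove connections whose weights fail a threshold condition, remove layers whose layer weights fail a layer threshold condition, then update the model) can be sketched as follows. This is a minimal illustration, not the patented implementation: the model is reduced to a list of weight matrices, the layer weight is assumed to be the mean absolute connection weight (a choice the claims do not specify), and `weight_threshold` and `layer_threshold` are hypothetical parameter names.

```python
import numpy as np

def prune_model(layers, weight_threshold=0.05, layer_threshold=0.02):
    """Prune a model given as a list of weight matrices.

    Connections whose magnitude does not meet ``weight_threshold`` are
    zeroed out; layers whose mean absolute weight does not meet
    ``layer_threshold`` are removed entirely.
    """
    pruned = []
    for w in layers:
        # Remove weak connections: zero every weight below the threshold.
        w = np.where(np.abs(w) >= weight_threshold, w, 0.0)
        # One scalar score per layer, computed after connection pruning.
        layer_weight = np.abs(w).mean()
        if layer_weight >= layer_threshold:  # keep only strong-enough layers
            pruned.append(w)
    return pruned

layers = [np.array([[0.9, 0.01], [0.3, 0.6]]),    # strong layer, kept
          np.array([[0.01, 0.02], [0.0, 0.01]])]  # weak layer, dropped
pruned = prune_model(layers)
print(len(pruned))  # 1
```

In a real network, removing a whole layer also requires reconciling the shapes of the adjacent layers (for instance by rewiring the surviving layers); the sketch sidesteps that by treating layers independently.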
The method of claim 1, further comprising reducing the training data by removing a portion of the training data.
The method of claim 2, wherein the training data includes first data associated with a first time point and second data associated with a second time point, and wherein the portion of the training data is removed by removing the first data associated with the first time point from the training data.
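The training-data reduction in the claims above (drop the records tied to an older time point, keep the rest) can be sketched as follows; this is a minimal illustration assuming records are dictionaries with a `time_point` key, which is a hypothetical representation the claims do not prescribe.

```python
from datetime import date

def reduce_training_data(records, stale_time_point):
    """Reduce the training data by removing every record
    associated with the given (first) time point."""
    return [r for r in records if r["time_point"] != stale_time_point]

records = [
    {"time_point": date(2018, 1, 1), "features": [0.1, 0.2]},  # first time point
    {"time_point": date(2019, 1, 1), "features": [0.3, 0.4]},  # second time point
]
reduced = reduce_training_data(records, date(2018, 1, 1))
print(len(reduced))  # 1: only the second-time-point record remains
```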
A non-transitory computer-readable medium storing a set of instructions executable by at least one processor of a computer system to cause the computer system to perform a method for simplifying a machine learning model, the method comprising:
receiving training data;
training a machine learning model based on the training data, wherein the machine learning model includes a plurality of layers each having one or more nodes, the one or more nodes having one or more connections with nodes of another layer of the machine learning model;
evaluating weights associated with the connections of the machine learning model, wherein each connection has a corresponding weight;
removing, from the machine learning model, one or more connections having weights that do not satisfy a threshold condition;
evaluating layer weights associated with the layers of the machine learning model, wherein each layer has a corresponding layer weight;
removing, from the machine learning model, one or more layers having layer weights that do not satisfy a layer threshold condition; and
updating the machine learning model after removing the one or more connections and the one or more layers.
The non-transitory computer-readable medium of claim 6, wherein the method further comprises reducing the training data by removing a portion of the training data.
The non-transitory computer-readable medium of claim 7, wherein the training data includes first data associated with a first time point and second data associated with a second time point, and wherein the portion of the training data is removed by removing the first data associated with the first time point from the training data.
The non-transitory computer-readable medium of claim 6, wherein the set of instructions is further executable by the at least one processor of the computer system to cause the computer system to generate a filter for evaluating input data provided to the machine learning model.
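The filter claim leaves the filter itself unspecified. One plausible reading is a validity check derived from the training data; a minimal sketch under that assumption, where `make_input_filter` is a hypothetical helper that accepts only inputs whose features fall inside the per-feature range seen during training:

```python
import numpy as np

def make_input_filter(training_data):
    """Build a simple range filter from the training data.

    The returned predicate accepts an input only if every feature lies
    inside the [min, max] range observed in the training data.
    """
    lo = training_data.min(axis=0)  # per-feature minimum
    hi = training_data.max(axis=0)  # per-feature maximum
    return lambda x: bool(np.all((x >= lo) & (x <= hi)))

train = np.array([[0.0, 1.0], [1.0, 3.0]])
accept = make_input_filter(train)
print(accept(np.array([0.5, 2.0])))  # True: inside the training range
print(accept(np.array([2.0, 2.0])))  # False: first feature out of range
```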
A computer system comprising:
a memory that stores a set of instructions; and
a processor, wherein the processor is configured to execute the set of instructions to cause the system to:
receive training data;
train a machine learning model based on the training data, wherein the machine learning model includes a plurality of layers each having one or more nodes, the one or more nodes having one or more connections with nodes of another layer of the machine learning model;
evaluate weights associated with the connections of the machine learning model, wherein each connection has a corresponding weight;
remove, from the machine learning model, one or more connections having weights that do not satisfy a threshold condition;
evaluate layer weights associated with the layers of the machine learning model, wherein each layer has a corresponding layer weight;
remove, from the machine learning model, one or more layers having layer weights that do not satisfy a layer threshold condition; and
update the machine learning model after removing the one or more connections and the one or more layers.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862663955P | 2018-04-27 | 2018-04-27 | |
US62/663,955 | 2018-04-27 | ||
PCT/US2019/029450 WO2019210237A1 (en) | 2018-04-27 | 2019-04-26 | Method and system for performing machine learning |
Publications (3)
Publication Number | Publication Date |
---|---|
JP2021522574A (en) | 2021-08-30 |
JPWO2019210237A5 (en) | 2022-04-07 |
JP7162074B2 (en) | 2022-10-27 |
Family
ID=68291171
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2020558533A Active JP7162074B2 (en) | 2018-04-27 | 2019-04-26 | Method and system for performing machine learning |
Country Status (5)
Country | Link |
---|---|
US (1) | US11308395B2 (en) |
EP (1) | EP3785179A4 (en) |
JP (1) | JP7162074B2 (en) |
CN (1) | CN111919226A (en) |
WO (1) | WO2019210237A1 (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20190099930A (en) * | 2018-02-20 | 2019-08-28 | 삼성전자주식회사 | Method and apparatus for controlling data input and output of fully connected network |
CN110533158B (en) * | 2018-05-25 | 2023-10-13 | 宏达国际电子股份有限公司 | Model construction method, system and non-volatile computer readable recording medium |
US10990850B1 (en) * | 2018-12-12 | 2021-04-27 | Amazon Technologies, Inc. | Knowledge distillation and automatic model retraining via edge device sample collection |
US11537436B2 (en) * | 2019-10-02 | 2022-12-27 | Qualcomm Incorporated | Method of configuring a memory block allocation of a machine learning network |
US20210201110A1 (en) * | 2019-12-31 | 2021-07-01 | Alibaba Group Holding Limited | Methods and systems for performing inference with a neural network |
US20210375441A1 (en) * | 2020-05-29 | 2021-12-02 | Regents Of The University Of Minnesota | Using clinical notes for icu management |
US20240095525A1 (en) * | 2021-02-04 | 2024-03-21 | Telefonaktiebolaget Lm Ericsson (Publ) | Building an explainable machine learning model |
US20220405517A1 (en) * | 2021-06-17 | 2022-12-22 | Guangzhou Automobile Group Co., Ltd. | System, method, and vehicle for recognition of traffic signs |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2703010B2 (en) * | 1988-12-23 | 1998-01-26 | 株式会社日立製作所 | Neural net signal processing processor |
US6760776B1 (en) | 2000-04-10 | 2004-07-06 | International Business Machines Corporation | Method and apparatus for processing network frames in a network processor by embedding network control information such as routing and filtering information in each received frame |
US8402543B1 (en) * | 2011-03-25 | 2013-03-19 | Narus, Inc. | Machine learning based botnet detection with dynamic adaptation |
JP6042274B2 (en) | 2013-06-28 | 2016-12-14 | 株式会社デンソーアイティーラボラトリ | Neural network optimization method, neural network optimization apparatus and program |
JP6236296B2 (en) | 2013-11-14 | 2017-11-22 | 株式会社デンソーアイティーラボラトリ | Learning device, learning program, and learning method |
CN104732231B (en) | 2015-04-13 | 2019-02-26 | 广州广电运通金融电子股份有限公司 | A kind of recognition methods of valuable bills |
JP6480644B1 (en) | 2016-03-23 | 2019-03-13 | グーグル エルエルシー | Adaptive audio enhancement for multi-channel speech recognition |
KR102619443B1 (en) * | 2016-09-30 | 2023-12-28 | 삼성전자주식회사 | Wrist temperature rhythm acquisition apparatus and method, core temperature rhythm acquisition apparatus and method, wearable device |
US20180096249A1 (en) * | 2016-10-04 | 2018-04-05 | Electronics And Telecommunications Research Institute | Convolutional neural network system using adaptive pruning and weight sharing and operation method thereof |
US10366302B2 (en) * | 2016-10-10 | 2019-07-30 | Gyrfalcon Technology Inc. | Hierarchical category classification scheme using multiple sets of fully-connected networks with a CNN based integrated circuit as feature extractor |
KR102415506B1 (en) | 2016-10-26 | 2022-07-01 | 삼성전자주식회사 | Device and method to reduce neural network |
- 2019
- 2019-04-26 EP EP19791717.2A patent/EP3785179A4/en active Pending
- 2019-04-26 WO PCT/US2019/029450 patent/WO2019210237A1/en active Application Filing
- 2019-04-26 CN CN201980022623.XA patent/CN111919226A/en active Pending
- 2019-04-26 JP JP2020558533A patent/JP7162074B2/en active Active
- 2019-04-26 US US16/396,563 patent/US11308395B2/en active Active
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104067314B (en) | Humanoid image partition method | |
DE112005001277B4 (en) | Method and device for vectoring multiple input commands | |
CN104751842B (en) | The optimization method and system of deep neural network | |
CN101093518B (en) | Method and system for optimizing of pipeline logical structure arrangement in circuit design | |
CN107169454A (en) | A kind of facial image age estimation method, device and its terminal device | |
CN107330516A (en) | Model parameter training method, apparatus and system | |
US11308399B2 (en) | Method for topological optimization of graph-based models | |
JP2023072025A5 (en) | Information processing system, electronic device, information processing method, and computer program | |
WO2019167603A1 (en) | Neural machine translation model training method and device, and computer program therefor | |
WO2017165693A4 (en) | Use of clinical parameters for the prediction of sirs | |
JPWO2019210237A5 (en) | ||
CN104484527B (en) | Uniform load automatic dynamic amending method in a kind of discrete topology process of topology optimization | |
CN106997373A (en) | A kind of link prediction method based on depth confidence network | |
CN109192226A (en) | A kind of signal processing method and device | |
CN107274425A (en) | A kind of color image segmentation method and device based on Pulse Coupled Neural Network | |
DE202014011350U1 (en) | FFT accelerator | |
JP6180340B2 (en) | Dialog sentence generating apparatus, dialog sentence generating method and program | |
JP6828834B2 (en) | Logical operation device, logical calculation method, and program | |
Heras et al. | An empirical study of encodings for group MaxSAT | |
CN110706808A (en) | Aneurysm rupture state prediction method and device | |
CN109065154B (en) | Decision result determination method, device, equipment and readable storage medium | |
CN106815246A (en) | Document storing method and device in non-relational database | |
US20230162037A1 (en) | Machine learning method and pruning method | |
CN114254764B (en) | Feedback-based machine learning model searching method, system, equipment and medium | |
CN115641282A (en) | Target detection method and device for synthetic aperture radar image |