GB2583623A - Fusing sparse kernels to approximate a full kernel of a convolutional neural network - Google Patents

Fusing sparse kernels to approximate a full kernel of a convolutional neural network
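
The patent body is not reproduced on this page; the sketch below is a minimal NumPy illustration of what the title and the prior-art keywords (kernel, complementary, sparse, pattern) suggest, assuming that sparse kernels with complementary non-zero patterns are fused, here simply summed, so that their combination approximates a dense convolution kernel. The masks, shapes and variable names are illustrative assumptions, not taken from the claims.

    import numpy as np

    # Illustrative only: "fuse" two sparse kernels whose complementary
    # non-zero patterns together cover every position of a dense 3x3 kernel.
    full_kernel = np.random.randn(3, 3)   # dense kernel to approximate

    mask_a = np.array([[1., 0., 1.],
                       [0., 1., 0.],
                       [1., 0., 1.]])      # sparse pattern A
    mask_b = 1.0 - mask_a                  # complementary pattern B

    sparse_a = full_kernel * mask_a        # sparse kernel following pattern A
    sparse_b = full_kernel * mask_b        # sparse kernel following pattern B

    fused = sparse_a + sparse_b            # fusing the sparse kernels

    # With complementary patterns, the fused kernel matches the dense kernel.
    assert np.allclose(fused, full_kernel)

In a trained network each sparse kernel would be learned and applied on its own; the toy example above only shows why complementary sparsity patterns allow the fused result to recover a full kernel.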

Info

Publication number
GB2583623A
GB2583623A (application GB2010475.8A)
Authority
GB
United Kingdom
Prior art keywords
kernel
complementary
sparse
pattern
sample
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB2010475.8A
Other languages
English (en)
Other versions
GB202010475D0 (en)
Inventor
Chen Richard
Fan Quanfu
Pistoia Marco
Suzumura Toyotaro
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
International Business Machines Corp
Original Assignee
International Business Machines Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by International Business Machines Corp filed Critical International Business Machines Corp
Publication of GB202010475D0
Publication of GB2583623A
Current legal status: Withdrawn

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/2134Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on separation criteria, e.g. independent component analysis
    • G06F18/21345Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on separation criteria, e.g. independent component analysis enforcing sparsity or involving a domain transformation
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/0464Convolutional networks [CNN, ConvNet]
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/0495Quantised networks; Sparse networks; Compressed networks
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • G06V10/449Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters
    • G06V10/451Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters with interaction between the filter responses, e.g. cortical complex cells
    • G06V10/454Integrating the filters into a hierarchical structure, e.g. convolutional neural networks [CNN]
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Multimedia (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Image Analysis (AREA)
  • Complex Calculations (AREA)
GB2010475.8A (priority date 2017-12-14, filing date 2018-12-13): Fusing sparse kernels to approximate a full kernel of a convolutional neural network. Status: Withdrawn. Published as GB2583623A (en).

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15/841,480 US10740659B2 (en) 2017-12-14 2017-12-14 Fusing sparse kernels to approximate a full kernel of a convolutional neural network
PCT/IB2018/059993 WO2019116291A1 (en) 2017-12-14 2018-12-13 Fusing sparse kernels to approximate a full kernel of a convolutional neural network

Publications (2)

Publication Number Publication Date
GB202010475D0 (en) 2020-08-19
GB2583623A (en) 2020-11-04

Family

ID=66814568

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2010475.8A Withdrawn GB2583623A (en) 2017-12-14 2018-12-13 Fusing sparse kernels to approximate a full kernel of a convolutional neural network

Country Status (6)

Country Link
US (1) US10740659B2 (en)
JP (1) JP7179850B2 (ja)
CN (1) CN111344720A (zh)
DE (1) DE112018006377T5 (de)
GB (1) GB2583623A (en)
WO (1) WO2019116291A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102548718B1 (ko) * 2017-06-07 2023-06-28 Samsung Electronics Co., Ltd. Electronic apparatus and control method thereof
US12393829B2 (en) * 2019-07-25 2025-08-19 Samsung Electronics Co., Ltd. Methods and systems with convolutional neural network (CNN) performance
US11144290B2 (en) 2019-09-13 2021-10-12 Huawei Technologies Co., Ltd. Method and apparatus for enabling autonomous acceleration of dataflow AI applications
US12265911B2 (en) * 2020-02-06 2025-04-01 Google Llc Neural network layers with a controlled degree of spatial invariance
US20210256385A1 (en) * 2020-02-14 2021-08-19 Northeastern University Computer-implemented methods and systems for dnn weight pruning for real-time execution on mobile devices
US11494875B2 (en) 2020-03-25 2022-11-08 Nintendo Co., Ltd. Systems and methods for machine learned image conversion
US11379951B2 (en) 2020-03-25 2022-07-05 Nintendo Co., Ltd. Systems and methods for machine learned image conversion
CN113344199B (zh) * 2021-06-17 2024-05-03 Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd. Method for training a separable convolutional network, roadside device and cloud control platform
WO2023281371A1 (en) * 2021-07-04 2023-01-12 Numenta, Inc. Hardware architecture for processing tensors with complementary sparsity
US20250138820A1 (en) * 2023-10-26 2025-05-01 Etched.ai, Inc. Model-specific asic compilation using fused kernel replacement

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170103309A1 (en) * 2015-10-08 2017-04-13 International Business Machines Corporation Acceleration of convolutional neural network training using stochastic perforation
WO2017132830A1 (en) * 2016-02-02 2017-08-10 Xiaogang Wang Methods and systems for cnn network adaption and object online tracking
CN107330463A (zh) * 2017-06-29 2017-11-07 Nanjing University of Information Science and Technology Vehicle type recognition method based on CNN multi-feature fusion and multi-kernel sparse representation
US20170337471A1 (en) * 2016-05-18 2017-11-23 Nec Laboratories America, Inc. Passive pruning of filters in a convolutional neural network

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9424513B2 (en) * 2011-11-09 2016-08-23 Qualcomm Incorporated Methods and apparatus for neural component memory transfer of a referenced pattern by including neurons to output a pattern substantially the same as the referenced pattern
CN104077612B (zh) * 2014-07-15 2017-09-22 Hefei Institutes of Physical Science, Chinese Academy of Sciences Pest image recognition method based on multi-feature sparse representation
US9652817B2 (en) 2015-03-12 2017-05-16 Samsung Electronics Co., Ltd. Automated compute kernel fusion, resizing, and interleave
CN105046193B (zh) * 2015-06-05 2018-07-10 Shanghai University Human action recognition method based on fused sparse representation matrices
US9741107B2 (en) * 2015-06-05 2017-08-22 Sony Corporation Full reference image quality assessment based on convolutional neural network
US9972063B2 (en) 2015-07-30 2018-05-15 International Business Machines Corporation Pipelined approach to fused kernels for optimization of machine learning workloads on graphical processing units
US9904874B2 (en) 2015-11-05 2018-02-27 Microsoft Technology Licensing, Llc Hardware-efficient deep convolutional neural networks
US10181188B2 (en) * 2016-02-19 2019-01-15 International Business Machines Corporation Structure-preserving composite model for skin lesion segmentation

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170103309A1 (en) * 2015-10-08 2017-04-13 International Business Machines Corporation Acceleration of convolutional neural network training using stochastic perforation
WO2017132830A1 (en) * 2016-02-02 2017-08-10 Xiaogang Wang Methods and systems for cnn network adaption and object online tracking
US20170337471A1 (en) * 2016-05-18 2017-11-23 Nec Laboratories America, Inc. Passive pruning of filters in a convolutional neural network
CN107330463A (zh) * 2017-06-29 2017-11-07 Nanjing University of Information Science and Technology Vehicle type recognition method based on CNN multi-feature fusion and multi-kernel sparse representation

Also Published As

Publication number Publication date
JP7179850B2 (ja) 2022-11-29
CN111344720A (zh) 2020-06-26
JP2021507345A (ja) 2021-02-22
GB202010475D0 (en) 2020-08-19
US20190188526A1 (en) 2019-06-20
DE112018006377T5 (de) 2020-08-20
WO2019116291A1 (en) 2019-06-20
US10740659B2 (en) 2020-08-11

Similar Documents

Publication Title
GB2583623A (en) Fusing sparse kernels to approximate a full kernel of a convolutional neural network
US12014258B2 (en) Method and device for optimizing simulation data, and computer-readable storage medium
CN109117831B (zh) Training method and device for an object detection network
CN113642659B (zh) Method, apparatus, electronic device and storage medium for generating a training sample set
CN109949219B (zh) Super-resolution image reconstruction method, apparatus and device
CN113704082A (zh) Model evaluation method, apparatus, electronic device and storage medium
CN109919209A (zh) Domain-adaptive deep learning method and readable storage medium
US20220189008A1 (en) Method for detecting data defects and computing device utilizing method
CN107204956B (zh) Website identification method and apparatus
US12475171B2 (en) Identifying similar content in a multi-item embedding space
US20190138899A1 (en) Processing apparatus, processing method, and nonvolatile recording medium
CN112400187A (zh) Knockout autoencoders for detecting anomalies in biomedical images
CN111340785B (zh) Model training method, product surface defect detection method and storage medium
CN109377508B (zh) Image processing method and apparatus
CN113781164B (zh) Virtual try-on model training method, virtual try-on method and related apparatus
CN110728328A (zh) Training method and apparatus for a classification model
CN111768405B (zh) Method, apparatus, device and storage medium for processing annotated images
CN111932438A (zh) Image style transfer method, device and storage apparatus
CN109726195A (zh) Data augmentation method and apparatus
CN111046933B (zh) Image classification method, apparatus, storage medium and electronic device
US20160188680A1 (en) Electronic device and information searching method for the electronic device
US20250371862A1 (en) Image processing method and apparatus, device, and medium
CN112989364A (zh) Method, device and computer program product for data simulation
Kim et al. On efficient language and vision assistants for visually-situated natural language understanding: What matters in reading and reasoning
CN111260757A (zh) Image processing method, apparatus and terminal device

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)