WO2019200544A1 - Application development method for a network model and related products - Google Patents

Application development method for a network model and related products (网络模型的应用开发方法及相关产品)

Info

Publication number
WO2019200544A1
WO2019200544A1 (PCT/CN2018/083433)
Authority
WO
WIPO (PCT)
Prior art keywords
network model
neural network
calculation
training sample
user
Prior art date
Application number
PCT/CN2018/083433
Other languages
English (en)
French (fr)
Inventor
赵睿哲
Original Assignee
深圳鲲云信息科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳鲲云信息科技有限公司
Priority to US17/044,487 (US11954576B2)
Priority to PCT/CN2018/083433 (WO2019200544A1)
Priority to CN201880003164.6A (CN109643229B)
Publication of WO2019200544A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/10Interfaces, programming languages or software development kits, e.g. for simulating neural networks
    • G06N3/105Shells for specifying net layout
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/30Creation or generation of source code
    • G06F8/34Graphical or visual programming
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/40Transformation of program code
    • G06F8/41Compilation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/10Interfaces, programming languages or software development kits, e.g. for simulating neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Definitions

  • The present application relates to the field of information processing technologies, and in particular to an application development method for a network model and related products.
  • Network models, such as neural network models, are being applied more and more widely as technology develops.
  • Devices such as computers and servers can train and run network models, but existing neural network platforms must be developed by professional developers, which makes them inconvenient for users, limits their versatility, and degrades the user experience.
  • The embodiments of the present application provide an application development method for a network model and related products, which allow a graphical neural network platform to compile a neural network model, improving versatility and user experience.
  • In a first aspect, an application development method for a network model is provided, comprising the following steps: receiving a neural network model establishment command and establishing an initial neural network model calculation graph according to the establishment command; acquiring a calculation module selected by a user and the connection relationships of the calculation module, and adding them to the initial calculation graph to obtain an intermediate neural network model; and collecting an establishment end command of the neural network model and verifying the intermediate neural network model according to the end command to determine whether the calculation module conflicts with other calculation modules. If there is no conflict, a built neural network model matching the calculation graph of the intermediate network model is generated, together with execution code matching the calculation graph.
  • In a second aspect, an application development platform for a network model is provided, comprising:
  • a transceiver unit configured to receive a neural network model establishment command;
  • a creating unit configured to establish an initial neural network model calculation graph according to the establishment command;
  • the transceiver unit is further configured to acquire a calculation module selected by a user and the connection relationships of the calculation module;
  • a processing unit configured to add the calculation module and the connection relationships to the initial neural network model calculation graph and obtain an intermediate neural network model;
  • the transceiver unit is further configured to receive an establishment end command of the neural network model;
  • the processing unit is configured to verify the intermediate neural network model according to the end command to determine whether the calculation module conflicts with other calculation modules; if there is no conflict, to generate a built neural network model matching the calculation graph of the intermediate network model and generate execution code matching the calculation graph.
  • In a third aspect, a computer-readable storage medium is provided, storing a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method of the first aspect.
  • In a fourth aspect, a computer program product is provided, comprising a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to perform the method of the first aspect.
  • After the network model is updated, the technical solution provided by the present application simulates the running of the network model to obtain an output result and displays that result, so that the user can judge from it whether the network model suits the corresponding hardware structure, which improves the user experience.
  • FIG. 1 is a schematic flowchart diagram of an application development method of a network model according to an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram of an application development platform of a network model provided by an embodiment of the present application.
  • A reference to "an embodiment" herein means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the present application.
  • The appearances of this phrase in various places in the specification do not necessarily all refer to the same embodiment, nor are they independent or alternative embodiments mutually exclusive with other embodiments. Those skilled in the art understand, explicitly and implicitly, that the embodiments described herein can be combined with other embodiments.
  • Neural networks have broad and attractive prospects in system identification, pattern recognition, intelligent control, and other fields. In intelligent control especially, people are interested in the self-learning capability of neural networks and regard this important feature as one of the keys to solving the long-standing problem of controller adaptability in automatic control.
  • A neural network is a complex network system formed by a large number of simple processing units (called neurons) that are widely interconnected. It reflects many basic features of human brain function and is a highly complex nonlinear dynamic learning system. Neural networks offer massive parallelism, distributed storage and processing, self-organization, adaptivity, and self-learning, and are particularly well suited to imprecise and fuzzy information-processing problems that require many factors and conditions to be considered simultaneously.
  • The development of neural networks draws on neuroscience, mathematical science, cognitive science, computer science, artificial intelligence, information science, cybernetics, robotics, microelectronics, psychology, optical computing, molecular biology, and other fields; it is an emerging interdisciplinary subject.
  • The basis of neural networks is the neuron.
  • Neurons are biological models based on the nerve cells of biological nervous systems. In studying biological nervous systems to explore the mechanisms of artificial intelligence, researchers formalized the neuron mathematically, producing the mathematical model of the neuron.
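The mathematical model of the neuron mentioned above is, in its simplest form, a weighted sum of inputs passed through an activation function. The following minimal sketch illustrates this; the sigmoid activation is an illustrative choice, not something the patent specifies:

```python
import math

def neuron(inputs, weights, bias):
    """McCulloch-Pitts style neuron: weighted sum of inputs plus bias,
    passed through a sigmoid activation."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# A neuron with zero weights is indifferent to its inputs: output 0.5.
y = neuron([1.0, 1.0], [0.0, 0.0], 0.0)
```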
  • A large number of neurons of the same form, connected together, constitute a neural network.
  • A neural network is a highly nonlinear dynamic system. Although the structure and function of each individual neuron are simple, the dynamic behavior of the network as a whole is very complex; neural networks can therefore express a wide range of phenomena in the physical world.
  • A neural network model is described on the basis of the mathematical model of the neuron.
  • An artificial neural network is a description of the first-order properties of the human brain system; simply put, it is a mathematical model.
  • A neural network model is characterized by its network topology, node characteristics, and learning rules.
  • The great appeal of neural networks lies in their parallel distributed processing, high robustness and fault tolerance, distributed storage and learning capability, and ability to closely approximate complex nonlinear relationships.
  • Widely applied neural network models include the BP neural network, the Hopfield network, the ART network, and the Kohonen network.
  • Referring to FIG. 1, FIG. 1 shows an application development method for a network model provided by the present application.
  • The method is implemented by a network model application platform.
  • As shown in FIG. 1, the method includes the following steps:
  • Step S101: receive a neural network model establishment command, and establish an initial neural network model calculation graph according to the establishment command.
  • The establishment command may be presented as an icon: when the user clicks the icon, the platform determines that a neural network model establishment command has been received.
  • In practice, the establishment command can also be obtained in other ways.
  • Step S102: acquire the calculation module selected by the user and the connection relationships of the calculation module, add the calculation module and the connection relationships to the initial neural network model calculation graph, and obtain an intermediate neural network model.
  • The calculation module selected by the user may be obtained, for example, through a drag-and-drop function.
  • The connection relationships of the calculation module may be determined by identifying the module's function, deriving all possible connections from that function, and displaying them for the user to choose from. For a convolution module, for instance, the input may connect to the output of a conversion module or to the output data of the previous layer, and the output may connect to the input of an activation module or to the input data of the next layer. These connection relationships are only examples; the embodiments do not restrict how connections are displayed or selected.
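Steps S101 and S102 amount to appending user-selected modules and their connection relationships to a calculation-graph structure. The sketch below uses assumed names (`add_module`, a dict-based graph); the patent does not define the platform's actual data structures:

```python
def add_module(graph, module, connections):
    """Add a user-selected calculation module and its chosen connection
    relationships (edges) to the calculation graph."""
    graph["modules"].append(module)
    graph["edges"].extend(connections)
    return graph

# Build an intermediate model: input -> conv1 -> relu1
graph = {"modules": [], "edges": []}
add_module(graph, "conv1", [("input", "conv1")])
add_module(graph, "relu1", [("conv1", "relu1")])
```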
  • Step S103: collect an establishment end command of the neural network model, and verify the intermediate neural network model according to the end command to determine whether the calculation module conflicts with other calculation modules.
  • If there is no conflict, a built neural network model matching the calculation graph of the intermediate network model is generated, together with execution code matching the calculation graph.
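The patent does not state which conditions count as a conflict in step S103. One plausible check, shown purely as an assumption, is that every edge must reference a declared module and no module input may be driven twice:

```python
def verify(graph):
    """Illustrative conflict check: every edge endpoint must be a declared
    module (or the graph input), and no module may be driven by two sources."""
    known = set(graph["modules"]) | {"input"}
    driven = set()
    for src, dst in graph["edges"]:
        if src not in known or dst not in known:
            return False  # edge refers to an unknown module
        if dst in driven:
            return False  # two modules drive the same input: conflict
        driven.add(dst)
    return True

ok = verify({"modules": ["conv1", "relu1"],
             "edges": [("input", "conv1"), ("conv1", "relu1")]})
bad = verify({"modules": ["conv1", "relu1"],
              "edges": [("input", "relu1"), ("conv1", "relu1")]})
```

Only when such a check passes would the platform go on to generate the built model and its matching execution code.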
  • The technical solution provided by the present application requires no professional developer: each calculation module is presented graphically, so the user only needs to select the required modules from the various graphical calculation modules and then determine their connection relationships.
  • This makes the building of a neural network model graphical, demands little expertise of the user, and can therefore be widely applied to the building of neural network models.
  • Optionally, after step S103, the foregoing method may further include: receiving a training sample input by the user, determining the type of the established initial neural network model, determining a marking manner and a marking example for the training sample according to that type, displaying the marking example, and prompting the user to mark the training sample in the marking manner to obtain a marked training sample.
  • This technical solution provides a marking function for training samples: through the type-specific marking manner and marking example, it helps the user mark the input training samples and lays the groundwork for subsequent operations or training.
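Choosing a marking manner and example from the model type could be as simple as a lookup table. The type names and label formats below are invented for illustration; the patent does not enumerate the supported types:

```python
MARKING_RULES = {
    # model type -> (marking manner, marking example); both are illustrative
    "classification": ("one class label per sample", "cat.jpg -> 'cat'"),
    "detection": ("bounding boxes with class labels",
                  "img.jpg -> [(x, y, w, h, 'cat')]"),
}

def marking_for(model_type):
    """Return the marking manner and example to display for this model type."""
    return MARKING_RULES.get(model_type, ("free-form label", "sample -> label"))

manner, example = marking_for("classification")
```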
  • Optionally, after the marking, the foregoing method may further include:
  • inputting the marked training samples as input data into the built neural network model for training, to obtain a trained neural network model.
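The training step can be sketched as an ordinary gradient-descent loop. The one-weight linear model and squared-error loss here are stand-ins, since the patent leaves the training algorithm open:

```python
def train(samples, lr=0.1, epochs=100):
    """Fit the weight w of the model y = w * x to labeled samples (x, y)
    by plain gradient descent on the mean squared error."""
    w = 0.0
    for _ in range(epochs):
        grad = sum(2.0 * (w * x - y) * x for x, y in samples) / len(samples)
        w -= lr * grad
    return w

# Labeled samples follow y = 2x, so training should recover w close to 2.
w = train([(1.0, 2.0), (2.0, 4.0)])
```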
  • This technical solution provides a process for training the neural network model: the platform trains the weight data of the established model to obtain a trained neural network model.
  • Optionally, after the marking, the foregoing method may further include:
  • inputting the marked training samples as input data into the built neural network model for a forward operation to obtain a forward operation result, and displaying the forward operation result.
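The forward operation and its display can be sketched as running each marked sample through the model and showing the user's label next to the computed result (function names here are assumptions):

```python
def forward_and_display(model, labeled_samples):
    """Run each marked sample through the built model and pair the user's
    label with the forward-operation result for side-by-side display."""
    rows = [(label, model(x)) for x, label in labeled_samples]
    for label, result in rows:
        print(f"expected: {label}  computed: {result}")
    return rows

# The second sample is labeled 5 but the model computes 6, a mismatch the
# user can spot when judging whether the built model is reasonable.
rows = forward_and_display(lambda x: 2 * x, [(1, 2), (3, 5)])
```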
  • This technical solution computes over the user's input data and displays the forward operation result, so that the user can compare the marked training samples with the computed results to determine whether the built neural network model is reasonable; this gives the user direction and ideas for optimizing the model.
  • Referring to FIG. 2, FIG. 2 provides an application development platform for a network model, comprising:
  • a transceiver unit 201 configured to receive a neural network model establishment command;
  • a creating unit 202 configured to establish an initial neural network model calculation graph according to the establishment command;
  • the transceiver unit 201 is further configured to acquire the calculation module selected by the user and the connection relationships of the calculation module;
  • a processing unit 203 configured to add the calculation module and the connection relationships to the initial neural network model calculation graph and obtain an intermediate neural network model;
  • the transceiver unit 201 is further configured to receive an establishment end command of the neural network model;
  • the processing unit 203 is configured to verify the intermediate neural network model according to the end command to determine whether the calculation module conflicts with other calculation modules; if there is no conflict, to generate a built neural network model matching the calculation graph of the intermediate network model and generate execution code matching the calculation graph.
  • Optionally, the transceiver unit 201 is further configured to receive a training sample input by the user;
  • the processing unit 203 is further configured to determine the type of the established initial neural network model, determine a marking manner and a marking example for the training sample according to that type, display the marking example, and prompt the user to mark the training sample in the marking manner to obtain a marked training sample.
  • This technical solution provides a marking function for training samples: through the type-specific marking manner and marking example, it helps the user mark the input training samples and lays the groundwork for subsequent operations or training.
  • Optionally, the processing unit 203 is further configured to input the marked training samples as input data into the built neural network model for training, to obtain a trained neural network model.
  • Optionally, the processing unit 203 is further configured to input the marked training samples as input data into the built neural network model for a forward operation, obtain a forward operation result, and display the forward operation result.
  • This technical solution computes over the user's input data and displays the forward operation result, so that the user can compare the marked training samples with the computed results to determine whether the built neural network model is reasonable; this gives the user direction and ideas for optimizing the model.
  • The present application also provides a computer-readable storage medium storing a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method shown in FIG. 1 and the refinements of that method.
  • The present application also provides a computer program product comprising a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to perform the method shown in FIG. 1 and the refinements of that method.
  • In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other ways.
  • The apparatus embodiments described above are merely illustrative.
  • The division into units is only a logical functional division; in actual implementation there may be other divisions. For example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed.
  • The mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, apparatuses, or units, and may be electrical or take other forms.
  • The units described as separate components may or may not be physically separate, and components displayed as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment's solution.
  • The functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist physically on its own, or two or more units may be integrated into one unit.
  • The integrated unit may be implemented in the form of hardware or in the form of a software program module.
  • If the integrated unit is implemented as a software program module and sold or used as a standalone product, it may be stored in a computer-readable memory.
  • The computer software product stored in such a memory includes a number of instructions that cause a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present application.
  • The foregoing memory includes media that can store program code, such as a USB flash drive, read-only memory (ROM), random access memory (RAM), removable hard disk, magnetic disk, or optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The present disclosure provides an application development method for a network model and related products. The method comprises the following steps: receiving a neural network model establishment command, and establishing an initial neural network model calculation graph according to the establishment command; acquiring a calculation module selected by a user and the connection relationships of the calculation module, and adding the calculation module and the connection relationships to the initial neural network model calculation graph to obtain an intermediate neural network model; collecting an establishment end command of the neural network model, and verifying the intermediate neural network model according to the end command to determine whether the calculation module conflicts with other calculation modules; if there is no conflict, generating a built neural network model matching the calculation graph of the intermediate network model and generating execution code matching the calculation graph. The technical solution provided by the present application has the advantage of high versatility.

Description

Application development method for a network model and related products

Technical Field

The present application relates to the field of information processing technologies, and in particular to an application development method for a network model and related products.

Background

With the continuous development of information technology and people's ever-growing needs, the demand for timely information keeps rising. Network models, such as neural network models, are being applied more and more widely as technology develops. Devices such as computers and servers can both train and run network models, but existing neural network platforms must be developed by professional developers, which makes them inconvenient for users, limits their versatility, and degrades the user experience.

Summary of the Application

The embodiments of the present application provide an application development method for a network model and related products, which allow a graphical neural network platform to compile a neural network model, improving versatility and user experience.

In a first aspect, an application development method for a network model is provided, the method comprising the following steps:

receiving a neural network model establishment command, and establishing an initial neural network model calculation graph according to the establishment command;

acquiring a calculation module selected by a user and the connection relationships of the calculation module, adding the calculation module and the connection relationships to the initial neural network model calculation graph, and obtaining an intermediate neural network model;

collecting an establishment end command of the neural network model, and verifying the intermediate neural network model according to the end command to determine whether the calculation module conflicts with other calculation modules; if there is no conflict, generating a built neural network model matching the calculation graph of the intermediate network model and generating execution code matching the calculation graph.
In a second aspect, an application development platform for a network model is provided, the platform comprising:

a transceiver unit configured to receive a neural network model establishment command;

a creating unit configured to establish an initial neural network model calculation graph according to the establishment command;

the transceiver unit being further configured to acquire a calculation module selected by the user and the connection relationships of the calculation module;

a processing unit configured to add the calculation module and the connection relationships to the initial neural network model calculation graph and obtain an intermediate neural network model;

the transceiver unit being further configured to receive an establishment end command of the neural network model;

the processing unit being configured to verify the intermediate neural network model according to the end command to determine whether the calculation module conflicts with other calculation modules and, if there is no conflict, to generate a built neural network model matching the calculation graph of the intermediate network model and generate execution code matching the calculation graph.

In a third aspect, a computer-readable storage medium is provided, storing a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method of the first aspect.

In a fourth aspect, a computer program product is provided, comprising a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to perform the method of the first aspect.

After the network model is updated, the technical solution provided by the present application simulates the running of the network model to obtain an output result and then displays that result, so that the user can judge from it whether the network model suits the corresponding hardware structure, thereby improving the user experience.
Brief Description of the Drawings

To explain the technical solutions in the embodiments of the present application more clearly, the drawings needed for describing the embodiments are introduced briefly below. Evidently, the drawings described below show some embodiments of the present application; those of ordinary skill in the art can derive other drawings from them without creative effort.

FIG. 1 is a schematic flowchart of an application development method for a network model provided by an embodiment of the present application.

FIG. 2 is a schematic structural diagram of an application development platform for a network model provided by an embodiment of the present application.

Detailed Description

The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings. Evidently, the described embodiments are some, not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art on the basis of these embodiments without creative effort fall within the scope of protection of the present application.

The terms "first", "second", "third", "fourth", and the like in the specification, claims, and drawings of the present application are used to distinguish different objects, not to describe a particular order. Moreover, the terms "include" and "have", and any variants of them, are intended to cover non-exclusive inclusion: a process, method, system, product, or device comprising a series of steps or units is not limited to the listed steps or units, but optionally also includes unlisted steps or units, or other steps or units inherent to the process, method, product, or device.

A reference to "an embodiment" herein means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the present application. The appearances of this phrase in various places in the specification do not necessarily all refer to the same embodiment, nor are they independent or alternative embodiments mutually exclusive with other embodiments. Those skilled in the art understand, explicitly and implicitly, that the embodiments described herein can be combined with other embodiments.
Since the advent of mathematical methods that simulate real human neural networks, people have gradually grown used to calling such artificial neural networks simply neural networks. Neural networks have broad and attractive prospects in system identification, pattern recognition, intelligent control, and other fields. In intelligent control especially, people are interested in the self-learning capability of neural networks and regard this important feature as one of the keys to solving the long-standing problem of controller adaptability in automatic control.

A neural network (NN) is a complex network system formed by a large number of simple processing units (called neurons) that are widely interconnected. It reflects many basic features of human brain function and is a highly complex nonlinear dynamic learning system. Neural networks offer massive parallelism, distributed storage and processing, self-organization, adaptivity, and self-learning, and are particularly well suited to imprecise and fuzzy information-processing problems that require many factors and conditions to be considered simultaneously. The development of neural networks draws on neuroscience, mathematical science, cognitive science, computer science, artificial intelligence, information science, cybernetics, robotics, microelectronics, psychology, optical computing, molecular biology, and other fields; it is an emerging interdisciplinary subject.

The basis of neural networks is the neuron.

Neurons are biological models based on the nerve cells of biological nervous systems. In studying biological nervous systems to explore the mechanisms of artificial intelligence, researchers formalized the neuron mathematically, producing the mathematical model of the neuron.

A large number of neurons of the same form, connected together, constitute a neural network. A neural network is a highly nonlinear dynamic system. Although the structure and function of each individual neuron are simple, the dynamic behavior of the network as a whole is very complex; neural networks can therefore express a wide range of phenomena in the physical world.

A neural network model is described on the basis of the mathematical model of the neuron. An artificial neural network is a description of the first-order properties of the human brain system; simply put, it is a mathematical model. A neural network model is characterized by its network topology, node characteristics, and learning rules. The great appeal of neural networks lies in their parallel distributed processing, high robustness and fault tolerance, distributed storage and learning capability, and ability to closely approximate complex nonlinear relationships.

Among research topics in the control field, the control of uncertain systems has long been one of the central themes of control theory, yet the problem has never been effectively solved. Using the learning capability of a neural network, the network can automatically learn the characteristics of an uncertain system during control and adapt automatically as those characteristics vary over time, so as to achieve optimal control of the system; this is clearly an exciting aim and approach.

There are now dozens of artificial neural network models; the most widely applied include the BP neural network, the Hopfield network, the ART network, and the Kohonen network.
Referring to FIG. 1, FIG. 1 shows an application development method for a network model provided by the present application. The method is implemented by a network model application platform and, as shown in FIG. 1, includes the following steps:

Step S101: receive a neural network model establishment command, and establish an initial neural network model calculation graph according to the establishment command.

The neural network model establishment command may be presented as an icon: when the user clicks the icon, the platform determines that a neural network model establishment command has been received. In practice, the establishment command can of course also be obtained in other ways.

Step S102: acquire the calculation module selected by the user and the connection relationships of the calculation module, add the calculation module and the connection relationships to the initial neural network model calculation graph, and obtain an intermediate neural network model.

Acquiring the calculation module selected by the user may specifically include:

acquiring the calculation module selected by the user through a drag-and-drop function.

Determining the connection relationships of the calculation module may include:

determining the function of the calculation module, determining all possible connection relationships from that function, and displaying them for the user to choose from. For a convolution calculation module, for example, the input may connect to the output of a conversion module or to the output data of the previous layer, and the output may connect to the input of an activation module or to the input data of the next layer. All the above connection relationships are merely examples; the embodiments of the present application do not restrict how connection relationships are displayed or selected.

Step S103: collect an establishment end command of the neural network model, and verify the intermediate neural network model according to the end command to determine whether the calculation module conflicts with other calculation modules; if there is no conflict, generate a built neural network model matching the calculation graph of the intermediate network model and generate execution code matching the calculation graph.

The technical solution provided by the present application requires no professional developer: each calculation module is presented graphically, so the user only needs to select the required modules from the various graphical calculation modules and then determine their connection relationships to build a neural network model graphically. Because little expertise is demanded of the user, the solution can be widely applied to the building of neural network models.
Optionally, after step S103, the method may further include:

receiving a training sample input by the user, determining the type of the established initial neural network model, determining a marking manner and a marking example for the training sample according to that type, displaying the marking example, and prompting the user to mark the training sample in the marking manner to obtain a marked training sample.

This technical solution provides a marking function for training samples: through the type-specific marking manner and marking example, it helps the user mark the input training samples and lays the groundwork for subsequent operations or training.

Optionally, after the marking, the method may further include:

inputting the marked training samples as input data into the built neural network model for training, to obtain a trained neural network model.

This technical solution provides a process for training the neural network model: the platform trains the weight data of the established model to obtain a trained neural network model.

Optionally, after the marking, the method may further include:

inputting the marked training samples as input data into the built neural network model for a forward operation to obtain a forward operation result, and displaying the forward operation result.

This technical solution computes over the user's input data and displays the forward operation result, so that the user can compare the marked training samples with the computed results to determine whether the built neural network model is reasonable; this gives the user direction and ideas for optimizing the model.
Referring to FIG. 2, FIG. 2 provides an application development platform for a network model, the platform comprising:

a transceiver unit 201 configured to receive a neural network model establishment command;

a creating unit 202 configured to establish an initial neural network model calculation graph according to the establishment command;

the transceiver unit 201 being further configured to acquire the calculation module selected by the user and the connection relationships of the calculation module;

a processing unit 203 configured to add the calculation module and the connection relationships to the initial neural network model calculation graph and obtain an intermediate neural network model;

the transceiver unit 201 being further configured to receive an establishment end command of the neural network model;

the processing unit 203 being configured to verify the intermediate neural network model according to the end command to determine whether the calculation module conflicts with other calculation modules and, if there is no conflict, to generate a built neural network model matching the calculation graph of the intermediate network model and generate execution code matching the calculation graph.

Optionally,

the transceiver unit 201 is further configured to receive a training sample input by the user;

the processing unit 203 is further configured to determine the type of the established initial neural network model, determine a marking manner and a marking example for the training sample according to that type, display the marking example, and prompt the user to mark the training sample in the marking manner to obtain a marked training sample.

This technical solution provides a marking function for training samples: through the type-specific marking manner and marking example, it helps the user mark the input training samples and lays the groundwork for subsequent operations or training.

Optionally,

the processing unit 203 is further configured to input the marked training samples as input data into the built neural network model for training, to obtain a trained neural network model.

Optionally,

the processing unit 203 is further configured to input the marked training samples as input data into the built neural network model for a forward operation to obtain a forward operation result, and to display the forward operation result.

This technical solution computes over the user's input data and displays the forward operation result, so that the user can compare the marked training samples with the computed results to determine whether the built neural network model is reasonable; this gives the user direction and ideas for optimizing the model.
The present application also provides a computer-readable storage medium storing a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method shown in FIG. 1 and the refinements of that method.

The present application also provides a computer program product comprising a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to perform the method shown in FIG. 1 and the refinements of that method.

It should be noted that, for brevity of description, each of the foregoing method embodiments is presented as a series of action combinations; those skilled in the art should know, however, that the present application is not limited by the described order of actions, since according to the present application some steps may be performed in other orders or simultaneously. Those skilled in the art should also know that the embodiments described in the specification are all optional embodiments, and that the actions and modules involved are not necessarily required by the present application.

Each of the above embodiments has its own emphasis; for parts not detailed in one embodiment, refer to the relevant descriptions of the other embodiments.

In the several embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into units is only a logical functional division, and in actual implementation there may be other divisions; multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, apparatuses, or units, and may be electrical or take other forms.

The units described as separate components may or may not be physically separate, and components displayed as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment's solution.

In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist physically on its own, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software program module.

If the integrated unit is implemented as a software program module and sold or used as a standalone product, it may be stored in a computer-readable memory. On this understanding, the technical solution of the present application, in essence or in the part contributing over the prior art, or the whole or part of the solution, may be embodied as a software product. The computer software product is stored in a memory and includes a number of instructions that cause a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present application. The foregoing memory includes media that can store program code, such as a USB flash drive, read-only memory (ROM), random access memory (RAM), removable hard disk, magnetic disk, or optical disc.

Those of ordinary skill in the art can understand that all or some of the steps in the methods of the above embodiments can be completed by a program instructing the relevant hardware; the program may be stored in a computer-readable memory, which may include a flash drive, read-only memory (ROM), random access memory (RAM), magnetic disk, optical disc, or the like.

The embodiments of the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the descriptions of the above embodiments are only intended to help understand the method of the present application and its core idea. Meanwhile, those of ordinary skill in the art, following the idea of the present application, will find changes in the specific implementations and scope of application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

  1. An application development method for a network model, characterized in that the method comprises the following steps:
    receiving a neural network model establishment command, and establishing an initial neural network model calculation graph according to the establishment command;
    acquiring a calculation module selected by a user and the connection relationships of the calculation module, adding the calculation module and the connection relationships to the initial neural network model calculation graph, and obtaining an intermediate neural network model;
    collecting an establishment end command of the neural network model, and verifying the intermediate neural network model according to the end command to determine whether the calculation module conflicts with other calculation modules; if there is no conflict, generating a built neural network model matching the calculation graph of the intermediate network model and generating execution code matching the calculation graph.
  2. The method according to claim 1, characterized in that the method further comprises:
    receiving a training sample input by the user, determining the type of the established initial neural network model, determining a marking manner and a marking example for the training sample according to that type, displaying the marking example, and prompting the user to mark the training sample in the marking manner to obtain a marked training sample.
  3. The method according to claim 2, characterized in that the method further comprises:
    inputting the marked training sample as input data into the built neural network model for training, to obtain a trained neural network model.
  4. The method according to claim 2, characterized in that the method further comprises:
    inputting the marked training sample as input data into the built neural network model for a forward operation to obtain a forward operation result, and displaying the forward operation result.
  5. An application development platform for a network model, characterized in that the platform comprises:
    a transceiver unit configured to receive a neural network model establishment command;
    a creating unit configured to establish an initial neural network model calculation graph according to the establishment command;
    the transceiver unit being further configured to acquire a calculation module selected by a user and the connection relationships of the calculation module;
    a processing unit configured to add the calculation module and the connection relationships to the initial neural network model calculation graph and obtain an intermediate neural network model;
    the transceiver unit being further configured to receive an establishment end command of the neural network model;
    the processing unit being configured to verify the intermediate neural network model according to the end command to determine whether the calculation module conflicts with other calculation modules and, if there is no conflict, to generate a built neural network model matching the calculation graph of the intermediate network model and generate execution code matching the calculation graph.
  6. The application development platform according to claim 5, characterized in that:
    the transceiver unit is further configured to receive a training sample input by the user; and
    the processing unit is further configured to determine the type of the established initial neural network model, determine a marking manner and a marking example for the training sample according to that type, display the marking example, and prompt the user to mark the training sample in the marking manner to obtain a marked training sample.
  7. The application development platform according to claim 6, characterized in that:
    the processing unit is further configured to input the marked training sample as input data into the built neural network model for training, to obtain a trained neural network model.
  8. The application development platform according to claim 6, characterized in that:
    the processing unit is further configured to input the marked training sample as input data into the built neural network model for a forward operation to obtain a forward operation result, and to display the forward operation result.
  9. A computer-readable storage medium, characterized in that it stores a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method according to any one of claims 1 to 4.
  10. A computer program product, characterized in that the computer program product comprises a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to perform the method according to any one of claims 1 to 4.
PCT/CN2018/083433 2018-04-17 2018-04-17 Application development method for a network model and related products WO2019200544A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US17/044,487 US11954576B2 (en) 2018-04-17 2018-04-17 Method for implementing and developing network model and related product
PCT/CN2018/083433 WO2019200544A1 (zh) 2018-04-17 2018-04-17 Application development method for a network model and related products
CN201880003164.6A CN109643229B (zh) 2018-04-17 2018-04-17 Application development method and platform for a network model, and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/083433 WO2019200544A1 (zh) 2018-04-17 2018-04-17 Application development method for a network model and related products

Publications (1)

Publication Number Publication Date
WO2019200544A1 true WO2019200544A1 (zh) 2019-10-24

Family

ID=66060279

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/083433 WO2019200544A1 (zh) 2018-04-17 2018-04-17 Application development method for a network model and related products

Country Status (3)

Country Link
US (1) US11954576B2 (zh)
CN (1) CN109643229B (zh)
WO (1) WO2019200544A1 (zh)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109165249B * 2018-08-07 2020-08-04 阿里巴巴集团控股有限公司 Data processing model construction method and apparatus, server, and client
US11615321B2 (en) 2019-07-08 2023-03-28 Vianai Systems, Inc. Techniques for modifying the operation of neural networks
US11681925B2 (en) 2019-07-08 2023-06-20 Vianai Systems, Inc. Techniques for creating, analyzing, and modifying neural networks
US11640539B2 (en) 2019-07-08 2023-05-02 Vianai Systems, Inc. Techniques for visualizing the operation of neural networks using samples of training data
CN110515626B * 2019-08-20 2023-04-18 Oppo广东移动通信有限公司 Code compilation method for a deep learning computing framework and related products
WO2021077283A1 * 2019-10-22 2021-04-29 深圳鲲云信息科技有限公司 Neural network computation compression method and system, and storage medium
US11593617B2 (en) 2019-12-31 2023-02-28 X Development Llc Reservoir computing neural networks based on synaptic connectivity graphs
US11593627B2 (en) 2019-12-31 2023-02-28 X Development Llc Artificial neural network architectures based on synaptic connectivity graphs
US11631000B2 (en) 2019-12-31 2023-04-18 X Development Llc Training artificial neural networks based on synaptic connectivity graphs
US11620487B2 (en) * 2019-12-31 2023-04-04 X Development Llc Neural architecture search based on synaptic connectivity graphs
US11568201B2 (en) 2019-12-31 2023-01-31 X Development Llc Predicting neuron types based on synaptic connectivity graphs
US11625611B2 (en) 2019-12-31 2023-04-11 X Development Llc Training artificial neural networks based on synaptic connectivity graphs
CN112748953B * 2020-07-02 2023-08-15 腾讯科技(深圳)有限公司 Data processing method and apparatus based on a neural network model, and electronic device
CN113052305B * 2021-02-19 2022-10-21 展讯通信(上海)有限公司 Method for running a neural network model, electronic device, and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5812992A (en) * 1995-05-24 1998-09-22 David Sarnoff Research Center Inc. Method and system for training a neural network with adaptive weight updating and adaptive pruning in principal component space
CN1578968A * 2001-08-29 2005-02-09 霍尼韦尔国际公司 Combinative method for supervised neural network learning
CN107766940A * 2017-11-20 2018-03-06 北京百度网讯科技有限公司 Method and apparatus for generating a model

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7502763B2 (en) * 2005-07-29 2009-03-10 The Florida International University Board Of Trustees Artificial neural network design and evaluation tool
US8818923B1 (en) * 2011-06-27 2014-08-26 Hrl Laboratories, Llc Neural network device with engineered delays for pattern storage and matching
US9460387B2 (en) 2011-09-21 2016-10-04 Qualcomm Technologies Inc. Apparatus and methods for implementing event-based updates in neuron networks
US10095718B2 (en) * 2013-10-16 2018-10-09 University Of Tennessee Research Foundation Method and apparatus for constructing a dynamic adaptive neural network array (DANNA)
US9953171B2 (en) * 2014-09-22 2018-04-24 Infosys Limited System and method for tokenization of data for privacy
EP4202782A1 (en) 2015-11-09 2023-06-28 Google LLC Training neural networks represented as computational graphs
US11093818B2 (en) 2016-04-11 2021-08-17 International Business Machines Corporation Customer profile learning based on semi-supervised recurrent neural network using partially labeled sequence data
US11288573B2 (en) 2016-05-05 2022-03-29 Baidu Usa Llc Method and system for training and neural network models for large number of discrete features for information rertieval
US10319019B2 (en) 2016-09-14 2019-06-11 Ebay Inc. Method, medium, and system for detecting cross-lingual comparable listings for machine translation using image similarity
US10726507B1 (en) * 2016-11-11 2020-07-28 Palantir Technologies Inc. Graphical representation of a complex task
US10032110B2 (en) * 2016-12-13 2018-07-24 Google Llc Performing average pooling in hardware
CN107016175B 2017-03-23 2018-08-31 中国科学院计算技术研究所 Automated design method and apparatus suitable for a neural network processor, and optimization method
CN107239315B 2017-04-11 2019-11-15 赛灵思公司 Programming model for a neural-network heterogeneous computing platform
CN107480194B 2017-07-13 2020-03-13 中国科学院自动化研究所 Construction method and system for an automatic learning model of multimodal knowledge representation
CN107563417A * 2017-08-18 2018-01-09 北京天元创新科技有限公司 Method and system for establishing a deep-learning artificial-intelligence model
CN107862058B 2017-11-10 2021-10-22 北京百度网讯科技有限公司 Method and apparatus for generating information
CN107832837B * 2017-11-28 2021-09-28 南京大学 Convolutional neural network compression method and decompression method based on the compressed-sensing principle
US11205236B1 (en) * 2018-01-24 2021-12-21 State Farm Mutual Automobile Insurance Company System and method for facilitating real estate transactions by analyzing user-provided data


Also Published As

Publication number Publication date
CN109643229A (zh) 2019-04-16
US20210042602A1 (en) 2021-02-11
CN109643229B (zh) 2022-10-04
US11954576B2 (en) 2024-04-09

Similar Documents

Publication Publication Date Title
WO2019200544A1 Application development method for a network model and related products
US11783227B2 Method, apparatus, device and readable medium for transfer learning in machine learning
CN110674869B Classification processing, and training method and apparatus for a graph convolutional neural network model
WO2019091020A1 Weight data storage method and neural network processor based on the method
US20190122409A1 Multi-Dimensional Puppet with Photorealistic Movement
JP7043596B2 Neural architecture search
EP2819089A1 Method and system for providing education service based on knowledge unit, and computer-readable recording medium
CN110110062A Machine-intelligence question answering method and apparatus, and electronic device
CN109101624A Dialogue processing method and apparatus, electronic device, and storage medium
CN107544960B Automatic question-answering method based on variable binding and relation activation
TW201633181A Event-driven temporal convolution for asynchronously pulse-modulated sampled signals
WO2022184124A1 Physiological electrical signal classification processing method and apparatus, computer device, and storage medium
WO2018113260A1 Emotion expression method and apparatus, and robot
Shah et al. Problem solving chatbot for data structures
CN108737491A Information pushing method and apparatus, storage medium, and electronic apparatus
CN111651989A Named-entity recognition method and apparatus, storage medium, and electronic apparatus
WO2019200545A1 Method for running a network model and related products
Sun et al. Education teaching evaluation method aided by adaptive genetic programming and robust scheduling
CN117744759A Text information recognition method and apparatus, storage medium, and electronic device
WO2020118555A1 Network model data access method and apparatus, and electronic device
WO2019200548A1 Network model compiler and related products
WO2023185972A1 Data processing method and apparatus, and electronic device
CN117273105A Module construction method and apparatus for a neural network model
CN117573946A Dialogue sample generation method, chat-dialogue large-model training method, and related apparatus
Flood Neural networks in civil engineering: a review

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18915738

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 25.01.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 18915738

Country of ref document: EP

Kind code of ref document: A1