WO2018133567A1 - Neuron weight information processing method and system, neuron information processing method and system, and computer device - Google Patents
- Publication number
- WO2018133567A1 (PCT/CN2017/114659)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- neuron
- information
- weight
- neurons
- group
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/06—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
Definitions
- receiving the neuron output information of the front-end neuron and the weight index corresponding to the neuron output information includes:
- when the cooperative group is a pulse cooperative group, the current neuron information includes historical membrane potential information;
- the historical membrane potential information of the pulsed effective neurons is updated.
- by setting an issue-enable flag, each neuron is set to allow or not allow issuing data, and a preset number of consecutive neurons are grouped into a cooperative group, so the cooperative group can be flexibly composed according to requirements.
- in the pulse cooperative group, after the pulse effective neuron outputs the cooperative output information, the historical membrane potential information of the pulse effective neuron is updated so that the whole cooperative group can complete subsequent information processing, while the pulse cooperative neurons do not update their historical membrane potential information; in subsequent information processing they perform the weight expansion function, and the pulse cooperative group improves the information processing capability of the entire pulsed neural network.
- the weight information obtaining module is configured to read the weight index correspondence according to the weight index and obtain the weight information, where the weight index correspondence is the correspondence between the weight index and the weight information;
- the front end neuron comprises an artificial neuron or a pulsed neuron.
- the weight index includes a storage address of weight information corresponding to the weight index.
- the front-end neuron output information receiving module includes:
- the invention also provides a neuron information processing system with input weight expansion, comprising:
- a cooperative group determining module configured to determine a preset number of consecutive neurons as a cooperative group, determine the last neuron in the cooperative group as the effective neuron, and determine the neurons in the cooperative group other than the effective neuron as cooperative neurons
- FIG. 2 is a schematic flow chart of a neuron weight information processing method according to another embodiment;
- FIG. 3 is a schematic structural diagram of a neuron weight information processing system according to an embodiment;
- FIG. 4 is a schematic structural diagram of a neuron weight information processing system according to another embodiment;
- FIG. 5 is a schematic flow chart of a neuron information processing method with input weight expansion according to an embodiment;
- FIG. 6 is a schematic flow chart of a neuron information processing method with input weight expansion according to another embodiment;
- FIG. 7 is a schematic structural diagram of a neuron information processing system with input weight expansion according to an embodiment;
- FIG. 1 is a schematic flow chart of a neuron weight information processing method according to an embodiment.
- the neuron weight information processing method shown in FIG. 1 includes:
- the neuron output information of the front-end neuron is output information calculated by the front-end neuron;
- the weight index is index information for the current neuron to retrieve the weight information corresponding to the front-end neuron output information.
- the weight indexing method occupies less information transmission space during information transmission, which not only reduces the processing requirements on the hardware but also means that only the index information needs to be changed, so the weight information in the neural network can be updated more flexibly and conveniently.
- Step S200: the weight index correspondence is read according to the weight index to obtain the weight information, where the weight index correspondence is the correspondence between the weight index and the weight information.
- the weight index correspondence may be stored locally in the current neuron or in another location in the neural network, as long as the current neuron can read it.
- Step S300 acquiring input information of the front-end neuron according to the weight information and the neuron output information.
- the read weight information is subjected to corresponding operation processing according to different neuron models to obtain input information of the front end neurons.
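- As a minimal illustration of steps S100-S300 (not part of the specification), the sketch below models the weight index correspondence as a Python dictionary held by the current neuron and assumes multiplication as the operation defined by the neuron model; all names, table layouts, and values are hypothetical.

```python
# Illustrative sketch of steps S100-S300 (assumed names and data layout).
# The weight index correspondence is modeled as a lookup table held by the
# current neuron; the neuron-model operation is assumed to be a simple product.

weight_table = {0: 0.5, 1: -0.25, 2: 1.0}   # weight index -> weight information

def receive(front_end_output, weight_index):
    """Step S100: receive the front-end neuron output and its weight index."""
    return front_end_output, weight_index

def lookup_weight(weight_index):
    """Step S200: read the weight index correspondence to obtain the weight."""
    return weight_table[weight_index]

def front_end_input(front_end_output, weight_index):
    """Step S300: combine the weight and the output to get the front-end input."""
    weight = lookup_weight(weight_index)
    return weight * front_end_output

# Example: the front-end neuron sends an output of 0.8 tagged with index 1.
print(front_end_input(0.8, 1))   # -> -0.2
```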
- the front-end neuron includes an artificial neuron or a pulsed neuron; that is, both the front-end neuron and the current neuron may be artificial neurons or pulsed neurons, so either an artificial neural network or a pulsed neural network can use the weight indexing method when transmitting weight information.
- receiving the neuron output information of the front-end neuron and the weight index corresponding to the neuron output information includes: the membrane potential information output by the front-end artificial neuron and the connection weight index of the front-end artificial neuron and the current artificial neuron; reading the weight index correspondence according to the weight index to obtain the weight information includes: reading the connection weight of the front-end artificial neuron and the current artificial neuron according to their connection weight index.
- receiving the neuron output information of the front-end neuron and the weight index corresponding to the neuron output information includes: the spike information output by the front-end pulsed neuron and the connection weight index of the front-end pulsed neuron and the current pulsed neuron; reading the weight index correspondence according to the weight index to obtain the weight information includes: reading the connection weight of the front-end pulsed neuron and the current pulsed neuron according to their connection weight index.
- the use of the weight index is applicable not only to the artificial neural network but also to the pulse neural network, and improves the information processing capability of the artificial neural network and the pulsed neural network.
- the weight index includes a storage address of weight information corresponding to the weight index.
- the storage address of the weight information is used as the index information, so that the neuron receiving the index information can directly use the storage address to look up the weight information, improving the efficiency of weight extraction and thereby the information processing efficiency of the entire neural network.
- receiving the neuron output information of the front-end neuron and the weight index corresponding to the neuron output information includes: receiving the routing information output by the front-end neuron, where the routing information includes the neuron output information output by the front-end neuron and the weight index corresponding to the neuron output information; and parsing the routing information to obtain the neuron output information and the weight index.
- the weight index is set in the routing information, and the weight index information is transmitted by using the information transmission data in the existing neural network.
- the index information may be stored using fixed length or variable length information bits.
- the weight index and the neuron output information are set and transmitted in the routing information, and the existing routing data is fully utilized, thereby improving the information use efficiency between the neurons.
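- The packet layout below is a hypothetical fixed-width encoding used only to illustrate how a weight index can ride along with the neuron output information in existing routing data; the field widths, order, and fixed-length choice are assumptions, not mandated by the text.

```python
# Hypothetical routing packet: destination (8 bits) | weight index (8 bits) |
# neuron output information (16 bits). Field widths are illustrative only.

def pack_routing(dest, weight_index, output_info):
    return (dest << 24) | (weight_index << 16) | (output_info & 0xFFFF)

def parse_routing(packet):
    """Parse the routing information to recover the output info and weight index."""
    dest = (packet >> 24) & 0xFF
    weight_index = (packet >> 16) & 0xFF
    output_info = packet & 0xFFFF
    return dest, weight_index, output_info

packet = pack_routing(dest=3, weight_index=17, output_info=0x01A4)
print(parse_routing(packet))   # -> (3, 17, 420)
```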
- the weight information calculated by the weight reduction algorithm limits the weights to certain fixed values according to the preset weight value range and the initial weight information.
- the preset weight value range is the range formed by the maximum and minimum weights required in the network; the weight reduction algorithm may discretize the weights into a set of fixed values, such as a weight binarization algorithm, on the premise of ensuring the accuracy of the algorithm.
- the weight information calculated by the weight reduction algorithm reduces the storage space of the hardware used to store the weight information while maintaining the accuracy of the weight information.
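- A sketch of one possible weight reduction step, assuming a preset range of [-1, 1] and binarization to the fixed set {-1, +1}; the text does not fix the reduction algorithm, so this is only an illustration.

```python
# Illustrative weight reduction: clamp the initial weights to a preset range and
# then discretize them to a small fixed set (here, binarization to {-1, +1}).

def reduce_weights(initial_weights, w_min=-1.0, w_max=1.0):
    reduced = []
    for w in initial_weights:
        w = max(w_min, min(w_max, w))             # keep the weight inside the preset range
        reduced.append(1.0 if w >= 0 else -1.0)   # binarize to a fixed value
    return reduced

print(reduce_weights([0.3, -2.0, 0.0, 1.7]))   # -> [1.0, -1.0, 1.0, 1.0]
```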
- FIG. 2 is a schematic flowchart of a neuron weight information processing method according to another embodiment.
- the neuron weight information processing method shown in FIG. 2 includes:
- Step S100 Receive neuron output information of the front end neuron, and a weight index corresponding to the neuron output information.
- Step S200: the weight index correspondence is read according to the weight index to obtain the weight information, where the weight index correspondence is the correspondence between the weight index and the weight information.
- Step S300 acquiring input information of the front-end neuron according to the weight information and the neuron output information.
- Step S400: calculating the current neuron output information through the neuron output algorithm according to the input information of the front-end neuron and the read current neuron information.
- when the current neuron is an artificial neuron, the current artificial neuron information includes current artificial neuron bias information; calculating the current neuron output information through the neuron output algorithm according to the input information of the front-end neuron and the read current neuron information then includes: calculating the current artificial neuron output information through a preset artificial neuron activation function according to the membrane potential information output by the front-end artificial neuron, the connection weight of the front-end artificial neuron and the current artificial neuron, and the current artificial neuron bias information.
- when the current neuron is a pulsed neuron, the current pulsed neuron information includes historical membrane potential information and membrane potential leakage information; calculating the current neuron output information through the neuron output algorithm according to the input information of the front-end neuron and the read current neuron information then includes: calculating the current pulsed neuron output information through a pulsed neuron calculation model according to the front-end pulsed neuron input information, the connection weight of the front-end pulsed neuron and the current pulsed neuron, the historical membrane potential information, and the membrane potential leakage information, as sketched below.
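- Two minimal sketches of step S400, one per neuron type: a weighted-sum-plus-bias activation for the artificial neuron and a leaky integrate-and-fire style update for the pulsed neuron. The sigmoid activation, the leak model, the threshold, and the reset policy are assumptions; the text only requires a preset activation function and a pulsed neuron calculation model.

```python
import math

def artificial_output(front_end_inputs, bias):
    """Artificial neuron: weighted inputs (already weight-multiplied) plus bias,
    passed through a preset activation function (sigmoid assumed here)."""
    v = sum(front_end_inputs) + bias
    return 1.0 / (1.0 + math.exp(-v))

def pulsed_output(front_end_inputs, v_hist, leak, threshold=1.0):
    """Pulsed neuron: accumulate inputs onto the historical membrane potential,
    apply leakage, and emit a spike when the threshold is crossed."""
    v = v_hist + sum(front_end_inputs) - leak
    spike = 1 if v >= threshold else 0
    v_new = 0.0 if spike else v          # reset after firing (assumed policy)
    return spike, v_new                  # spike output and updated history

print(artificial_output([0.2, -0.1], bias=0.05))
print(pulsed_output([0.6, 0.5], v_hist=0.2, leak=0.1))   # -> (1, 0.0)
```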
- Step S500: determining the destination information of the current neuron output information, and searching the destination index correspondence according to the destination information to acquire the weight index for the destination information, where the destination index correspondence includes the correspondence between destination information and weight indexes.
- the connection relationship between the current neuron and the back-end neurons has already been determined, so the destination information of the current neuron output information is also determined.
- according to the destination information of the current neuron output information, the destination index correspondence is searched and the corresponding weight index can be obtained.
- Step S600 outputting the current neuron output information and the weight index.
- outputting the current neuron output information and the weight index to the neurons of the back end can complete the entire transfer process of the weight index.
- after the current neuron calculates its output information, it searches for the corresponding weight index according to the destination neuron of the output information and then outputs the current neuron output information and the weight index.
- the current neuron sends the weight index to the neurons in the back end, and the weight index information is completely transmitted in the neural network, which improves the information processing capability of the neural network.
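- A sketch of steps S500-S600, with the destination index correspondence modeled as a per-neuron dictionary from destination neuron id to weight index; the table structure and identifiers are assumptions for illustration only.

```python
# Destination index correspondence: destination neuron id -> weight index
# (the index the back-end neuron will use to look up the connection weight).
dest_index_table = {7: 4, 9: 12}

def emit(output_info, destinations):
    """Steps S500-S600: for each destination, look up its weight index and
    output the (destination, output information, weight index) triple."""
    messages = []
    for dest in destinations:
        weight_index = dest_index_table[dest]
        messages.append((dest, output_info, weight_index))
    return messages

print(emit(output_info=0.73, destinations=[7, 9]))
```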
- FIG. 3 is a schematic structural diagram of a neuron weight information processing system according to an embodiment.
- the neuron weight information processing system shown in FIG. 3 includes:
- the front-end neuron output information receiving module 100 is configured to receive neuron output information of the front-end neuron, and a weight index corresponding to the neuron output information; the front-end neuron includes an artificial neuron or a pulsed neuron.
- the weight index includes a storage address of the weight information corresponding to the weight index.
- the front-end neuron output information receiving module includes: a routing information receiving unit, configured to receive the routing information output by the front-end neuron, where the routing information includes the neuron output information output by the front-end neuron and the weight index corresponding to the neuron output information; and a routing information parsing unit, configured to parse the routing information and obtain the neuron output information and the weight index.
- the front-end neuron input information obtaining module 300 is configured to acquire input information of the front-end neuron according to the weight information and the neuron output information.
- the output information of the front-end neuron received by the current neuron carries the weight index of the weight information between the front-end neuron and the current neuron; after the current neuron reads the weight information according to the received weight index, corresponding operation processing is performed according to different neuron models to obtain the input information of the front-end neuron.
- weight information is no longer transmitted directly between neurons; instead, an index of the weight information is transmitted, which not only saves the amount of information transmitted across the network but also allows the weight information setting of each neuron to be changed more flexibly, improving the information processing capability of the neural network.
- weight index is applicable not only to artificial neural networks, but also to pulsed neural networks, which improves the information processing capabilities of artificial neural networks and pulsed neural networks.
- the storage address of the weight information is used as the index information, so that the neuron receiving the index information directly uses the storage address information to query the weight information, thereby improving the extraction efficiency of the weight information, thereby improving the information processing efficiency of the entire neural network.
- the weight index and the neuron output information are set and transmitted in the routing information, and the existing routing data is fully utilized, thereby improving the information use efficiency between the neurons.
- the weight information calculated by the weight reduction algorithm has weight values within the preset value range, which reduces the storage space of the hardware used to store the weight information while maintaining the accuracy of the weight information.
- FIG. 4 is a schematic structural diagram of a neuron weight information processing system according to another embodiment, and the neuron weight information processing system shown in FIG. 4 includes:
- the front-end neuron output information receiving module 100 is configured to receive neuron output information of the front-end neuron, and a weight index corresponding to the neuron output information.
- the weight information obtaining module 200 is configured to read the weight index information corresponding relationship according to the weight index, and obtain weight information, where the weight index information correspondence relationship is a correspondence between the weight index and the weight information.
- the front-end neuron input information obtaining module 300 is configured to acquire input information of the front-end neuron according to the weight information and the neuron output information.
- the current neuron output information obtaining module 400 is configured to calculate current neuron output information according to the neuron output algorithm according to the input information of the front end neuron and the read current neuron information.
- the weight index determining module 500 is configured to determine the destination information of the current neuron output information, and to search the destination index correspondence according to the destination information and obtain the weight index for the destination information, where the destination index correspondence includes the correspondence between the destination information and the weight index.
- the weight index sending module 600 is configured to output the current neuron output information and the weight index.
- after the current neuron calculates its output information, it searches for the corresponding weight index according to the destination neuron of the output information and then outputs the current neuron output information and the weight index.
- the current neuron sends the weight index to the back-end neurons, so that the weight index information is transmitted completely within the neural network, improving the information processing capability of the neural network.
- FIG. 5 is a schematic flowchart of a neuron information processing method with input weight expansion according to an embodiment, and the neuron information processing method with input weight expansion shown in FIG. 5 includes:
- Step S400' the valid neuron acquires cooperative output information according to the received front-end neuron information, the read current neuron information of the effective neuron, and the horizontal accumulation information.
- step S500' the effective neurons output the coordinated output information.
- making full use of the weight information of the multiple inputs overcomes the shortcoming that the input weight types of an existing single neuron are limited, and improves the information processing capability of the neural network.
- by setting the issue-enable flag, a determined preset number of consecutive neurons are set as a cooperative group, and only the last neuron is set to output information.
- a preset number of consecutive artificial neurons are determined as an artificial cooperative group, the last artificial neuron in the artificial cooperative group is determined as the artificial effective neuron, and the artificial neurons in the artificial cooperative group other than the artificial effective neuron are determined as artificial cooperative neurons; or a preset number of consecutive pulsed neurons are determined as a pulse cooperative group, the last pulsed neuron in the pulse cooperative group is determined as the pulse effective neuron, and the pulsed neurons in the pulse cooperative group other than the pulse effective neuron are determined as pulse cooperative neurons.
- a predetermined number of consecutive artificial neurons are determined as an artificial cooperative group, or a preset number of consecutive pulse neurons are determined as a pulse cooperative group, in an artificial neural network or a pulsed neural network,
- the cooperative group can be determined to expand the input weight of a single neuron, and improve the information processing capability of the artificial neural network or the pulse neural network.
- when the cooperative group is a pulse cooperative group, the current neuron information includes historical membrane potential information; after the step of the effective neuron outputting the cooperative output information, the method further includes: updating the historical membrane potential information of the pulse effective neuron.
- in the pulse cooperative group, after the pulse effective neuron outputs the cooperative output information, the historical membrane potential information of the pulse effective neuron is updated so that the whole cooperative group can complete subsequent information processing, while the pulse cooperative neurons do not update their historical membrane potential information; in subsequent information processing they perform the weight expansion function, and the pulse cooperative group improves the information processing capability of the entire pulsed neural network.
- FIG. 6 is a schematic flowchart of a neuron information processing method with input weight expansion according to another embodiment, and the neuron information processing method with input weight expansion shown in FIG. 6 includes:
- the front-end neuron information includes: front-end neuron output information, a connection weight index of the front-end neuron and the current neuron.
- the front-end neuron information includes: membrane potential information outputted by the front-end artificial neurons, and a connection weight index of the front-end artificial neurons and the current artificial neurons.
- the front-end neuron information includes: the spike information output by the front-end pulsed neuron, and the connection weight index of the front-end pulsed neuron and the current pulsed neuron.
- Step S100a': determining a preset number of consecutive neurons as a cooperative group, determining the last neuron in the cooperative group as the effective neuron, and determining the neurons in the cooperative group other than the effective neuron as cooperative neurons.
- Step S200a': the first cooperative neuron in the cooperative group reads the connection weight of the front-end neuron and the current neuron according to the connection weight index of the front-end neuron and the current neuron, and acquires the horizontal accumulation intermediate information of the first cooperative neuron according to the connection weight of the front-end neuron and the current neuron and the front-end neuron information.
- the connection weight index of the front-end artificial neuron and the current artificial neuron may be address information; the current neuron reads the connection weight of the front-end artificial neuron and the current artificial neuron according to the received connection weight index, and with this connection weight information the output information of the front-end neuron can participate in the calculation of the current neuron output information and more accurately reflect the weight of the front-end neuron's output information.
- when the front-end neuron information includes the membrane potential information output by the front-end artificial neuron, that membrane potential information is multiplied by the read connection weight of the front-end neuron and the current neuron to obtain the horizontal accumulation intermediate information of the first artificial cooperative neuron, which is placed into the accumulator.
- when the front-end neuron information includes the spike information output by the front-end pulsed neuron, that spike information is multiplied by the read connection weight of the front-end neuron and the current neuron to obtain the horizontal accumulation intermediate information of the first pulse cooperative neuron, which is placed into the accumulator.
- Step S300a': each subsequent cooperative neuron in the cooperative group in turn reads the connection weight of the front-end neuron and the current neuron according to the connection weight index of the front-end neuron and the current neuron, and acquires its horizontal accumulation intermediate information according to the connection weight of the front-end neuron and the current neuron, the front-end neuron information, and the horizontal accumulation intermediate information of the preceding cooperative neuron; the horizontal accumulation intermediate information of the last cooperative neuron in the cooperative group is determined as the horizontal accumulation information.
- each subsequent cooperative neuron in the cooperative group processes the received front-end neuron output information and the connection weight between the connected front-end neuron and the current neuron according to the preset neuron model, for example by multiplying them, and then accumulates the result with the horizontal accumulation intermediate information of the preceding cooperative neuron to obtain the horizontal accumulation intermediate information of the current cooperative neuron; when the last cooperative neuron obtains its horizontal accumulation intermediate information, it is confirmed as the horizontal accumulation information, as sketched below.
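- A sketch of steps S200a'-S300a', assuming multiplication as the preset neuron operation; each entry pairs a front-end output with a connection weight index, and the running total plays the role of the horizontal accumulation intermediate information passed along the cooperative group. The weight table and values are hypothetical.

```python
# Each cooperative neuron (and the effective neuron) receives one
# (front-end output, connection weight index) pair. Multiplication is assumed
# as the preset neuron operation; the running total is the horizontal
# accumulation intermediate information handed from neuron to neuron.

weight_table = {0: 0.5, 1: -1.0, 2: 0.25}

def horizontal_accumulate(group_inputs):
    accumulated = 0.0
    for front_end_output, weight_index in group_inputs:
        weight = weight_table[weight_index]          # read the weight via its index
        accumulated += front_end_output * weight     # add this neuron's contribution
    return accumulated                               # horizontal accumulation information

# Cooperative group of three neurons; the value after the last one is the
# horizontal accumulation information handed to the effective neuron.
print(horizontal_accumulate([(0.8, 0), (0.4, 1), (2.0, 2)]))   # -> 0.5
```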
- Step S400a' the valid neuron acquires cooperative output information according to the received front-end neuron information, the read current neuron information of the effective neuron, and the horizontal accumulation information.
- the current neuron information includes current artificial neuron offset information.
- when the current neuron is an artificial neuron, the effective neuron acquiring the cooperative output information according to the received front-end neuron information, the read current neuron information of the effective neuron, and the horizontal accumulation information includes: calculating the cooperative output information of the artificial effective neuron through a preset artificial neuron activation function according to the membrane potential information output by the front-end artificial neuron, the connection weight of the front-end neuron and the current neuron, and the current artificial neuron bias information.
- the current neuron information includes historical membrane potential information and membrane potential leakage information.
- when the current neuron is a pulsed neuron, the effective neuron acquiring the cooperative output information according to the received front-end neuron information, the read current neuron information of the effective neuron, and the horizontal accumulation information includes: calculating the cooperative output information of the pulse effective neuron through a pulsed neuron calculation model according to the spike information output by the front-end pulsed neuron, the connection weight of the front-end neuron and the current neuron, the historical membrane potential information, and the membrane potential leakage information.
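- A sketch of how the pulse effective neuron might combine its own weighted spike input, the horizontal accumulation information from the cooperative neurons, its historical membrane potential, and the leakage into the cooperative output; the threshold-and-reset model is an assumption standing in for the pulsed neuron calculation model, and all parameter names are illustrative.

```python
def effective_pulsed_output(own_spike_input, own_weight, horizontal_accum,
                            v_hist, leak, threshold=1.0):
    """Pulse effective neuron: its own weighted spike input plus the horizontal
    accumulation from the cooperative neurons is integrated onto the historical
    membrane potential; leakage is applied and a spike is emitted at threshold."""
    v = v_hist + own_weight * own_spike_input + horizontal_accum - leak
    spike = 1 if v >= threshold else 0
    v_new = 0.0 if spike else v          # only the effective neuron updates its history
    return spike, v_new

print(effective_pulsed_output(own_spike_input=1, own_weight=0.3,
                              horizontal_accum=0.5, v_hist=0.4, leak=0.1))
# -> (1, 0.0)
```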
- after the connection weight of the front-end neuron and the current neuron is read according to the connection weight index carried in the received front-end neuron information and used to calculate the horizontal accumulation intermediate information, the weight information of each cooperative neuron in a cooperative group is fully utilized and embodied in the cooperative output information output by the effective neuron, which is equivalent to expanding the weight information of the effective neuron and improves the information processing capability of the neural network.
- the cooperative group determining module 100' is configured to determine a preset number of consecutive neurons as a cooperative group, determine the last neuron in the cooperative group as the effective neuron, and determine the neurons in the cooperative group other than the effective neuron as cooperative neurons; it is also used to set the issue-enable flag of the neurons in the cooperative group, where the issue-enable flag indicates whether a neuron is allowed to issue data, the issue-enable flag of the effective neuron is set to allow issuing data, and the issue-enable flags of all cooperative neurons are set to not allow issuing data.
- the cooperative group determining module includes: an artificial neuron determining unit, configured to determine a preset number of consecutive artificial neurons as an artificial cooperative group, determine the last artificial neuron in the artificial cooperative group as the artificial effective neuron, and determine the artificial neurons in the artificial cooperative group other than the artificial effective neuron as artificial cooperative neurons; and a pulsed neuron determining unit, configured to determine a preset number of consecutive pulsed neurons as a pulse cooperative group, determine the last pulsed neuron in the pulse cooperative group as the pulse effective neuron, and determine the pulsed neurons in the pulse cooperative group other than the pulse effective neuron as pulse cooperative neurons.
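- A sketch of how the cooperative group determining module could tag a run of consecutive neurons, modeling the issue-enable flag as a boolean per neuron; the neuron identifiers and data structure are hypothetical.

```python
def configure_group(neuron_ids):
    """Mark a run of consecutive neurons as one cooperative group: only the
    last neuron (the effective neuron) is allowed to issue data."""
    flags = {}
    for nid in neuron_ids:
        flags[nid] = False          # cooperative neurons: not allowed to issue data
    flags[neuron_ids[-1]] = True    # effective neuron: allowed to issue data
    return flags

# A cooperative group formed from four consecutive neurons, ids 12..15.
print(configure_group([12, 13, 14, 15]))   # only neuron 15 may issue data
```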
- the horizontal accumulation information obtaining module 200' is configured so that the first cooperative neuron in the cooperative group acquires its horizontal accumulation intermediate information according to the received front-end neuron information, and each subsequent cooperative neuron in the cooperative group in turn acquires its horizontal accumulation intermediate information according to the received front-end neuron information and the horizontal accumulation intermediate information of the preceding cooperative neuron; the horizontal accumulation intermediate information of the last cooperative neuron in the cooperative group is determined as the horizontal accumulation information.
- the front-end neuron information includes: the front-end neuron output information and the connection weight index of the front-end neuron and the current neuron; the horizontal accumulation information obtaining module 200' is then used so that the first cooperative neuron in the cooperative group reads the connection weight of the front-end neuron and the current neuron according to the connection weight index of the front-end neuron and the current neuron, and acquires the horizontal accumulation intermediate information of the first cooperative neuron according to the connection weight of the front-end neuron and the current neuron and the front-end neuron information.
- when the horizontal accumulation information obtaining module 200' is implemented as a hardware circuit with specific components, the horizontal accumulation intermediate information generated by each cooperative neuron in the cooperative group is transmitted through a shared register to the next cooperative neuron or to the effective neuron for membrane potential accumulation, and this feedback-addition scheme can be realized with an accumulator. More specifically, a cooperative neuron obtains the horizontal accumulation intermediate information of the preceding cooperative neuron by reading the shared register; after the effective neuron outputs its information, the shared register needs to be cleared so that the next round or the next cooperative group can work properly.
- the input circuits of the cooperative neurons in the cooperative group and of the last effective neuron may be the same; that is, like the effective neuron, the cooperative neurons also have an input circuit capable of reading the current neuron information, and the current neuron input information of each cooperative neuron is set to 0 through the software design.
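- A software analogue of the shared-register accumulator described above: each cooperative neuron adds its contribution to a shared accumulator read by the next neuron, and the register is cleared after the effective neuron has output so the next round of the cooperative group can proceed. The class is an illustration of the described behavior, not a circuit description.

```python
class SharedAccumulator:
    """Software stand-in for the shared register used for membrane potential
    accumulation inside a cooperative group."""
    def __init__(self):
        self.value = 0.0

    def add(self, contribution):
        self.value += contribution     # a cooperative neuron feeds back its addition

    def read(self):
        return self.value              # the next neuron reads the intermediate info

    def clear(self):
        self.value = 0.0               # cleared after the effective neuron outputs

acc = SharedAccumulator()
for contribution in [0.4, -0.1, 0.25]:    # three cooperative neurons
    acc.add(contribution)
horizontal_info = acc.read()              # handed to the effective neuron
acc.clear()                               # ready for the next cooperative group
print(horizontal_info)                    # -> 0.55
```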
- the collaborative output information output module 400' is configured to output the coordinated output information by the valid neurons.
- to the back-end neurons, all the neurons in the cooperative group are equivalent to one effective node, with multiple inputs corresponding to one effective output; the weight information of the multiple inputs can be fully utilized, which overcomes the shortcoming that the input weight types of an existing neuron are limited and improves the information processing capability of the neural network.
- after the connection weight of the front-end neuron and the current neuron is read according to the connection weight index carried in the received front-end neuron information and used to calculate the horizontal accumulation intermediate information, the weight information of each cooperative neuron in the cooperative group is fully utilized and embodied in the cooperative output information output by the effective neuron, which is equivalent to expanding the weight information of the effective neuron and thereby improves the information processing capability of the neural network.
- determining a preset number of consecutive artificial neurons as an artificial cooperative group, or a preset number of consecutive pulsed neurons as a pulse cooperative group, means that a cooperative group can be determined in either an artificial neural network or a pulsed neural network to expand the input weights of a single neuron, improving the information processing capability of the artificial or pulsed neural network.
- FIG. 8 is a schematic structural diagram of a neuron information processing system for input weight expansion according to another embodiment.
- the neuron information processing system with input weight expansion shown in FIG. 8 includes:
- the cooperative group determining module 100' is configured to determine a preset number of consecutive neurons as a cooperative group, determine the last neuron in the cooperative group as the effective neuron, and determine the neurons in the cooperative group other than the effective neuron as cooperative neurons.
- the horizontal accumulation information obtaining module 200' is configured so that the first cooperative neuron in the cooperative group acquires its horizontal accumulation intermediate information according to the received front-end neuron information, and each subsequent cooperative neuron in the cooperative group in turn acquires its horizontal accumulation intermediate information according to the received front-end neuron information and the horizontal accumulation intermediate information of the preceding cooperative neuron; the horizontal accumulation intermediate information of the last cooperative neuron in the cooperative group is determined as the horizontal accumulation information.
- the collaborative output information obtaining module 300' is configured to obtain the collaborative output information according to the received front-end neuron information, the read current neuron information of the effective neuron, and the horizontal accumulated information.
- the historical membrane potential update module 500' is configured to update the historical membrane potential information of the pulsed effective neurons.
- in the pulse cooperative group, after the pulse effective neuron outputs the cooperative output information, the historical membrane potential information of the pulse effective neuron is updated so that the whole cooperative group can complete subsequent information processing, while the pulse cooperative neurons do not update their historical membrane potential information; in subsequent information processing they perform the weight expansion function, and the pulse cooperative group improves the information processing capability of the entire pulsed neural network.
- an embodiment of the present invention further provides a computer device including a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor executes the computer program to implement the steps of the methods in the above embodiments.
- Non-volatile memory can include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
- Volatile memory can include random access memory (RAM) or external cache memory.
- RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- General Health & Medical Sciences (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- Computational Linguistics (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Artificial Intelligence (AREA)
- Neurology (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
A neuron weight information processing method, a neuron information processing method and system with input weight expansion, and a computer device. The neuron weight information processing method comprises: receiving neuron output information of a front-end neuron and a weight index corresponding to the neuron output information (S100); reading, according to the weight index, a weight index correspondence to acquire weight information, wherein the weight index correspondence is the correspondence between the weight index and the weight information (S200); and acquiring input information of the front-end neuron according to the weight information and the neuron output information (S300). Weight information is no longer transmitted directly between neurons; instead, an index of the weight information is transmitted, so that the amount of information transmitted across the network is reduced and the setting of the weight information can be changed flexibly.
Description
Related application
This application claims priority to two Chinese patent applications filed on January 20, 2017: application No. 201710042090.4, entitled "Neuron information processing method and system with input weight expansion", and application No. 201710042087.2, entitled "Neuron weight information processing method and system", both of which are incorporated herein by reference in their entirety.
The present invention relates to the technical field of artificial neural networks, and in particular to a neuron weight information processing method and system, a neuron information processing method and system with input weight expansion, and a computer device.
Most of today's artificial neural network research is still carried out in von Neumann computer software paired with high-performance GPGPU (General Purpose Graphic Processing Unit) platforms, and the hardware overhead, energy consumption, and information processing speed of the whole process are far from satisfactory. For this reason, the field of neuromorphic computing has developed rapidly in recent years: hardware circuits are used to directly construct neural networks that simulate the functions of the brain, in an attempt to realize a massively parallel, low-power computing platform that can support complex pattern learning.
However, in traditional neuromorphic systems, when the connections between neurons in the neural network are bound to weight information, every connection between neurons needs to carry the corresponding weight information, which occupies considerable hardware processing resources; the hardware then strongly constrains network performance, and the weight information reduces the processing capability and flexibility of the neurons, making such a design unsuitable as a hardware framework for general neural networks. Alternatively, the physical space of a single neuron's weight memory is limited; when the number of input signals of a single neuron exceeds that physical space, the weight information corresponding to some input signals can only reuse existing weights, which greatly affects the application performance of the neurons in networks that are sensitive to parameters.
Summary of the invention
Based on this, it is necessary to provide, in view of the above problems, a neuron weight information processing method and system, a neuron information processing method and system with input weight expansion, and a computer device.
A neuron weight information processing method and system, the method comprising:
receiving neuron output information of a front-end neuron, and a weight index corresponding to the neuron output information;
reading a weight index correspondence according to the weight index to obtain weight information, where the weight index correspondence is the correspondence between the weight index and the weight information; and
acquiring input information of the front-end neuron according to the weight information and the neuron output information.
In one embodiment, the front-end neuron includes an artificial neuron or a pulsed neuron.
In one embodiment, the weight index includes a storage address of the weight information corresponding to the weight index.
In one embodiment, receiving the neuron output information of the front-end neuron and the weight index corresponding to the neuron output information includes:
receiving routing information output by the front-end neuron, the routing information including the neuron output information output by the front-end neuron and the weight index corresponding to the neuron output information; and
parsing the routing information to obtain the neuron output information and the weight index.
In one embodiment, the weight information includes:
the weight information calculated by a weight reduction algorithm according to a preset weight value range and initial weight information.
In one embodiment, after the step of acquiring the input information of the front-end neuron according to the weight information and the neuron output information, the method further includes:
calculating current neuron output information through a neuron output algorithm according to the input information of the front-end neuron and the read current neuron information;
determining destination information of the current neuron output information, and searching a destination index correspondence according to the destination information to acquire the weight index for the destination information, where the destination index correspondence includes the correspondence between destination information and weight indexes; and
outputting the current neuron output information and the weight index.
In one embodiment, the output information of the front-end neuron received by the current neuron carries the weight index of the weight information between the front-end neuron and the current neuron; after the current neuron reads the weight information according to the received weight index, corresponding operation processing is performed according to different neuron models to obtain the input information of the front-end neuron. Weight information is no longer transmitted directly between neurons; instead, an index of the weight information is transmitted, which not only saves the amount of information transmitted across the network but also allows the weight information setting of each neuron to be changed more flexibly, improving the information processing capability of the neural network.
In one embodiment, the use of the weight index is applicable not only to artificial neural networks but also to pulsed neural networks, improving the information processing capability of both.
In one embodiment, the storage address of the weight information is used as its index information, so that the neuron receiving the index information can directly use the storage address to look up the weight information, improving the efficiency of weight extraction and thereby the information processing efficiency of the entire neural network.
In one embodiment, the weight index and the neuron output information are placed in the routing information for transmission, making full use of the existing routing data and improving the efficiency of information use between neurons.
In one embodiment, the weight information calculated by the weight reduction algorithm according to the preset weight value range has weight values within the preset range, limited to a finite set of discrete values in that range, which reduces the storage space of the hardware used to store the weight information while maintaining the accuracy of the weight information.
In one embodiment, after the current neuron calculates its output information, it searches for the corresponding weight index according to the destination neuron of the output information and then outputs the current neuron output information and the weight index. The current neuron sends the weight index to the back-end neurons, so that the weight index information is transmitted completely within the neural network, improving its information processing capability.
In the above embodiments, the output information of the front-end neuron received by the current neuron carries the weight index of the weight information between the front-end neuron and the current neuron; after the current neuron reads the weight information according to the received weight index, corresponding operation processing is performed according to different neuron models to obtain the input information of the front-end neuron. Weight information is no longer transmitted directly between neurons; instead, an index of the weight information is transmitted, which not only saves the amount of information transmitted across the network but also allows the weight information setting of each neuron to be changed more flexibly, improving the information processing capability of the neural network.
In one embodiment, a neuron information processing method with input weight expansion is further provided, the method comprising:
determining a preset number of consecutive neurons as a cooperative group, determining the last neuron in the cooperative group as the effective neuron, and determining the neurons in the cooperative group other than the effective neuron as cooperative neurons;
the first cooperative neuron in the cooperative group acquiring its horizontal accumulation intermediate information according to the received front-end neuron information;
each subsequent cooperative neuron in the cooperative group in turn acquiring its horizontal accumulation intermediate information according to the received front-end neuron information and the horizontal accumulation intermediate information of the preceding cooperative neuron, and determining the horizontal accumulation intermediate information of the last cooperative neuron in the cooperative group as the horizontal accumulation information;
the effective neuron acquiring cooperative output information according to the received front-end neuron information, the read current neuron information of the effective neuron, and the horizontal accumulation information; and
the effective neuron outputting the cooperative output information.
In one embodiment, determining the last neuron in the cooperative group as the effective neuron and the neurons in the cooperative group other than the effective neuron as cooperative neurons includes:
setting an issue-enable flag for the neurons in the cooperative group, where the issue-enable flag indicates whether a neuron is allowed to issue data; the issue-enable flag of the effective neuron is set to allow issuing data, and the issue-enable flags of all cooperative neurons are set to not allow issuing data.
In one embodiment, the front-end neuron information includes: front-end neuron output information and a connection weight index of the front-end neuron and the current neuron;
the first cooperative neuron in the cooperative group acquiring its horizontal accumulation intermediate information according to the received front-end neuron information includes:
the first cooperative neuron in the cooperative group reading the connection weight of the front-end neuron and the current neuron according to the connection weight index of the front-end neuron and the current neuron; and
acquiring the horizontal accumulation intermediate information of the first cooperative neuron according to the connection weight of the front-end neuron and the current neuron and the front-end neuron information;
each subsequent cooperative neuron in the cooperative group in turn acquiring its horizontal accumulation intermediate information according to the received front-end neuron information and the horizontal accumulation intermediate information of the preceding cooperative neuron includes:
each subsequent cooperative neuron in the cooperative group in turn reading the connection weight of the front-end neuron and the current neuron according to the connection weight index of the front-end neuron and the current neuron; and
acquiring the horizontal accumulation intermediate information of each cooperative neuron according to the connection weight of the front-end neuron and the current neuron, the front-end neuron information, and the horizontal accumulation intermediate information of the preceding cooperative neuron.
In one embodiment, determining a preset number of consecutive neurons as a cooperative group, determining the last neuron in the cooperative group as the effective neuron, and determining the neurons in the cooperative group other than the effective neuron as cooperative neurons includes:
determining a preset number of consecutive artificial neurons as an artificial cooperative group, determining the last artificial neuron in the artificial cooperative group as the artificial effective neuron, and determining the artificial neurons in the artificial cooperative group other than the artificial effective neuron as artificial cooperative neurons; or
determining a preset number of consecutive pulsed neurons as a pulse cooperative group, determining the last pulsed neuron in the pulse cooperative group as the pulse effective neuron, and determining the pulsed neurons in the pulse cooperative group other than the pulse effective neuron as pulse cooperative neurons.
In one embodiment, when the cooperative group is a pulse cooperative group, the current neuron information includes historical membrane potential information;
after the step of the effective neuron outputting the cooperative output information, the method further includes:
updating the historical membrane potential information of the pulse effective neuron.
In one embodiment, by determining a preset number of consecutive neurons as a cooperative group, only the information of the last neuron in the cooperative group is output, while the remaining neurons merely accumulate information and superimpose it onto back-end neurons. All the neurons in the cooperative group are thereby equivalent to a single effective node, with multiple inputs corresponding to one effective output, so that the weight information of the multiple inputs can be fully utilized. This overcomes the drawback that an existing single neuron supports only a limited number of input weight types and improves the information processing capability of the neural network.
In one embodiment, by setting a firing enable flag, each neuron is set either to allow or to disallow data output, so that a preset number of consecutive neurons form a cooperative group; cooperative groups can thus be composed flexibly according to requirements.
In one embodiment, the connection weight between the front-end neuron and the current neuron is read according to the connection weight index carried in the received front-end neuron information and is then used to compute the lateral accumulation intermediate information. The weight information of every cooperative neuron in a cooperative group is thereby fully utilized and reflected in the cooperative output information of the effective neuron, which is equivalent to expanding the weight information of the effective neuron and thus improves the information processing capability of the neural network.
In one embodiment, a preset number of consecutive artificial neurons are determined as an artificial cooperative group, or a preset number of consecutive pulse neurons are determined as a pulse cooperative group. A cooperative group can therefore be formed in either an artificial neural network or a pulse neural network to expand the input weights of a single neuron and improve the information processing capability of that network.
In one embodiment, in a pulse cooperative group, after the pulse effective neuron outputs the cooperative output information, the historical membrane potential information of the pulse effective neuron is updated so that the whole cooperative group can carry out subsequent information processing, whereas the pulse cooperative neurons do not update their historical membrane potential information and only perform the weight expansion function in subsequent processing. The pulse cooperative group thereby improves the information processing capability of the whole pulse neural network.
In the above embodiments, by determining a preset number of consecutive neurons as a cooperative group, only the information of the last neuron in the cooperative group is output, while the remaining neurons merely accumulate information and superimpose it onto back-end neurons. All the neurons in the cooperative group are equivalent to a single effective node, and multiple input groups correspond to one effective output, so that the weight information of the multiple input groups can be fully utilized. This overcomes the drawback that an existing single neuron supports only a limited number of input weight types and improves the information processing capability of the neural network.
The present invention further provides a computer device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the method of any of the above embodiments.
The present invention further provides a neuron weight information processing system, including:
a front-end neuron output information receiving module, configured to receive neuron output information of a front-end neuron and a weight index corresponding to the neuron output information;
a weight information obtaining module, configured to read a weight index information correspondence according to the weight index and obtain weight information, the weight index information correspondence being the correspondence between weight indexes and weight information; and
a front-end neuron input information obtaining module, configured to obtain input information of the front-end neuron according to the weight information and the neuron output information.
In one embodiment, the front-end neuron includes an artificial neuron or a pulse neuron.
In one embodiment, the weight index includes a storage address of the weight information corresponding to the weight index.
In one embodiment, the front-end neuron output information receiving module includes:
a routing information receiving unit, configured to receive routing information output by the front-end neuron, the routing information including the neuron output information output by the front-end neuron and the weight index corresponding to the neuron output information; and
a routing information parsing unit, configured to parse the routing information and obtain the neuron output information and the weight index.
The present invention further provides a neuron information processing system with input weight expansion, including:
a cooperative group determining module, configured to determine a preset number of consecutive neurons as a cooperative group, determine the last neuron in the cooperative group as an effective neuron, and determine the neurons in the cooperative group other than the effective neuron as cooperative neurons;
a lateral accumulation information obtaining module, configured for each subsequent cooperative neuron in the cooperative group to obtain, in turn, its lateral accumulation intermediate information according to the received front-end neuron information and the lateral accumulation intermediate information of the preceding cooperative neuron, and to determine the lateral accumulation intermediate information of the last cooperative neuron in the cooperative group as the lateral accumulation information;
a cooperative output information obtaining module, configured for the effective neuron to obtain cooperative output information according to the received front-end neuron information, the read current neuron information of the effective neuron, and the lateral accumulation information; and
a cooperative output information output module, configured for the effective neuron to output the cooperative output information.
In order to describe the technical solutions of the present application clearly, the drawings required for the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those of ordinary skill in the art can derive other drawings from these drawings without creative effort.
FIG. 1 is a schematic flowchart of a neuron weight information processing method according to an embodiment;
FIG. 2 is a schematic flowchart of a neuron weight information processing method according to another embodiment;
FIG. 3 is a schematic structural diagram of a neuron weight information processing system according to an embodiment;
FIG. 4 is a schematic structural diagram of a neuron weight information processing system according to another embodiment;
FIG. 5 is a schematic flowchart of a neuron information processing method with input weight expansion according to an embodiment;
FIG. 6 is a schematic flowchart of a neuron information processing method with input weight expansion according to another embodiment;
FIG. 7 is a schematic structural diagram of a neuron information processing system with input weight expansion according to an embodiment;
FIG. 8 is a schematic structural diagram of a neuron information processing system with input weight expansion according to another embodiment.
To make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described herein are merely intended to explain the present invention and are not intended to limit the present invention.
FIG. 1 is a schematic flowchart of a neuron weight information processing method according to an embodiment. The neuron weight information processing method shown in FIG. 1 includes:
Step S100: receiving neuron output information of a front-end neuron and a weight index corresponding to the neuron output information.
Specifically, the neuron output information of the front-end neuron is the output information computed by the front-end neuron, and the weight index is the index information used by the current neuron to retrieve the weight information corresponding to the front-end neuron output information.
Passing a weight index occupies less space during information transfer, which not only reduces the processing requirements of the hardware but also makes it possible to update weight information more flexibly and conveniently, since only the index information needs to be changed; updating the weight information in the neural network therefore becomes easier.
Step S200: reading a weight index information correspondence according to the weight index and obtaining weight information, the weight index information correspondence being the correspondence between weight indexes and weight information.
Specifically, the weight index information correspondence may be stored locally in the current neuron or at another location in the neural network, as long as the current neuron can read it.
Step S300: obtaining input information of the front-end neuron according to the weight information and the neuron output information.
Specifically, the read weight information is processed by the operation corresponding to the particular neuron model to obtain the input information of the front-end neuron.
In this embodiment, the output information of the front-end neuron received by the current neuron carries the weight index of the weight information between the front-end neuron and the current neuron. After reading the weight information according to the received weight index, the current neuron performs the operation corresponding to its neuron model to obtain the input information of the front-end neuron. Weight information is no longer passed directly between neurons; instead, the index of the weight information is passed, which not only saves the amount of information transferred across the network but also allows the weight settings of each neuron to be changed more flexibly, improving the information processing capability of the neural network.
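The following is a minimal sketch, in Python, of the index-based lookup described above, assuming a simple local weight table keyed by the weight index; the table contents and names are illustrative and not taken from the patent:

```python
# Local weight storage of the current neuron: weight index -> weight value.
WEIGHT_TABLE = {0: 0.5, 1: -1.0, 2: 0.25}

def front_end_input(neuron_output: float, weight_index: int) -> float:
    """Turn a front-end neuron's output plus a weight index into an input value."""
    weight = WEIGHT_TABLE[weight_index]   # read the weight via its index
    return weight * neuron_output         # weighted input to the current neuron

# The front-end neuron sends only its output value and a small index,
# not the weight itself.
print(front_end_input(neuron_output=0.8, weight_index=2))   # -> 0.2
```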
In one embodiment, the front-end neuron includes an artificial neuron or a pulse neuron; that is, the front-end neuron and the current neuron may both be artificial neurons or pulse neurons, so that both an artificial neural network and a pulse neural network can transfer weight information by means of a weight index.
Specifically, when the neurons are artificial neurons, the receiving neuron output information of the front-end neuron and a weight index corresponding to the neuron output information includes receiving the membrane potential information output by the front-end artificial neuron and the connection weight index of the front-end artificial neuron and the current artificial neuron; and the reading a weight index information correspondence according to the weight index and obtaining weight information includes reading the connection weight between the front-end artificial neuron and the current artificial neuron according to the connection weight index of the front-end artificial neuron and the current artificial neuron.
When the neurons are pulse neurons, the receiving neuron output information of the front-end neuron and a weight index corresponding to the neuron output information includes receiving the pulse tip information output by the front-end pulse neuron and the connection weight index of the front-end pulse neuron and the current pulse neuron; and the reading a weight index information correspondence according to the weight index and obtaining weight information includes reading the connection weight between the front-end pulse neuron and the current pulse neuron according to the connection weight index of the front-end pulse neuron and the current pulse neuron.
In this embodiment, the use of the weight index is applicable not only to artificial neural networks but also to pulse neural networks, improving the information processing capability of both.
In one embodiment, the weight index includes a storage address of the weight information corresponding to the weight index.
Specifically, besides the storage address of the weight information, the number of the weight information, or index information obtained in some other way according to actual requirements, may also be used as the weight index.
In this embodiment, using the storage address of the weight information as its index information allows the neuron that receives the index information to query the weight information directly through the storage address, improving the efficiency of retrieving weight information and thus the information processing efficiency of the whole neural network.
In one embodiment, the receiving neuron output information of the front-end neuron and a weight index corresponding to the neuron output information includes: receiving routing information output by the front-end neuron, the routing information including the neuron output information output by the front-end neuron and the weight index corresponding to the neuron output information; and parsing the routing information to obtain the neuron output information and the weight index.
Specifically, the weight index is placed in the routing information, so that the weight index information is carried by the information already transferred in the existing neural network. In the routing information, the index information may be stored in information bits of fixed or variable length.
In this embodiment, the weight index and the neuron output information are placed in the routing information for transmission, which makes full use of the existing routing data and improves the efficiency of information use between neurons.
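As a hedged illustration of how such routing information might be packed and parsed, the sketch below uses fixed-length bit fields; the 16-bit output field and 8-bit index field are assumptions made for the example, not widths specified by the patent:

```python
def pack_routing_info(neuron_output: int, weight_index: int) -> int:
    # Upper 16 bits: neuron output value; lower 8 bits: weight index.
    return (neuron_output & 0xFFFF) << 8 | (weight_index & 0xFF)

def parse_routing_info(packet: int) -> tuple:
    weight_index = packet & 0xFF
    neuron_output = (packet >> 8) & 0xFFFF
    return neuron_output, weight_index

packet = pack_routing_info(neuron_output=300, weight_index=5)
print(parse_routing_info(packet))   # -> (300, 5)
```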
In one embodiment, the weight information is calculated by a weight reduction algorithm according to a preset weight value range and initial weight information, so that the weights are limited to certain fixed values.
Specifically, the preset weight value range is the range formed by the maximum and minimum weights required in the network; the weight reduction algorithm discretizes the weights to certain fixed values while preserving the accuracy of the algorithm, for example, a weight binarization or ternarization algorithm.
In this embodiment, the weight information calculated by the weight reduction algorithm according to the preset weight value range takes only certain fixed values within that range, which reduces the storage space of the hardware used to store the weight information while maintaining the accuracy of the weight information.
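A minimal sketch of one such reduction, a simple ternarization that snaps weights to three fixed values within the preset range; the threshold rule is an illustrative assumption rather than the patent's specific algorithm:

```python
def ternarize(weights, w_max=1.0):
    """Map each weight to one of the fixed values {-w_max, 0, +w_max}."""
    out = []
    for w in weights:
        if w > 0.5 * w_max:
            out.append(w_max)
        elif w < -0.5 * w_max:
            out.append(-w_max)
        else:
            out.append(0.0)
    return out

# Only three distinct values remain, so each weight can be stored or indexed cheaply.
print(ternarize([0.9, -0.7, 0.1, -0.2]))   # -> [1.0, -1.0, 0.0, 0.0]
```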
FIG. 2 is a schematic flowchart of a neuron weight information processing method according to another embodiment. The neuron weight information processing method shown in FIG. 2 includes:
Step S100: receiving neuron output information of a front-end neuron and a weight index corresponding to the neuron output information.
Step S200: reading a weight index information correspondence according to the weight index and obtaining weight information, the weight index information correspondence being the correspondence between weight indexes and weight information.
Step S300: obtaining input information of the front-end neuron according to the weight information and the neuron output information.
Step S400: calculating current neuron output information according to a neuron output algorithm, based on the input information of the front-end neuron and the read current neuron information.
Specifically, when the neurons are artificial neurons, the current artificial neuron information includes current artificial neuron bias information; the calculating current neuron output information according to a neuron output algorithm, based on the input information of the front-end neuron and the read current neuron information, then includes: calculating the current artificial neuron output information through a preset artificial neuron activation function according to the membrane potential information output by the front-end artificial neuron, the connection weight between the front-end artificial neuron and the current artificial neuron, and the current artificial neuron bias information.
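A hedged sketch of this artificial-neuron computation; the sigmoid used as the preset activation function is an illustrative assumption:

```python
import math

def artificial_neuron_output(front_potentials, weights, bias):
    # Weighted membrane-potential inputs plus the current neuron's bias,
    # passed through a preset activation function (sigmoid assumed here).
    weighted_sum = sum(w * v for w, v in zip(weights, front_potentials)) + bias
    return 1.0 / (1.0 + math.exp(-weighted_sum))

print(artificial_neuron_output([0.3, 0.9], [0.5, -0.25], bias=0.1))
```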
When the neurons are pulse neurons, the current pulse neuron information includes historical membrane potential information and membrane potential leakage information; the calculating current neuron output information according to a neuron output algorithm, based on the input information of the front-end neuron and the read current neuron information, then includes: calculating the current pulse neuron output information through a pulse neuron computation model according to the front-end pulse neuron input information, the connection weight between the front-end pulse neuron and the current pulse neuron, the historical membrane potential information, and the membrane potential leakage information.
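A minimal sketch of such a pulse neuron computation model; the leak-then-threshold-and-reset rule below is an assumed LIF-style update, not necessarily the exact model used in the patent:

```python
def pulse_neuron_step(spikes, weights, v_hist, leak, threshold=1.0):
    # Weighted incoming spikes are added to the historical membrane potential,
    # then a leakage term is subtracted; a spike is emitted on threshold crossing.
    v = v_hist + sum(w * s for w, s in zip(weights, spikes)) - leak
    if v >= threshold:
        return 1, 0.0   # emit a spike and reset the membrane potential
    return 0, v         # no spike; keep the updated historical potential

spike_out, v_new = pulse_neuron_step([1, 0, 1], [0.6, 0.2, 0.5], v_hist=0.1, leak=0.05)
print(spike_out, v_new)   # -> 1 0.0
```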
Step S500: determining destination information of the current neuron output information, looking up a destination index correspondence according to the destination information, and obtaining the weight index for the destination information, the destination index correspondence including the correspondence between destination information and weight indexes.
Specifically, once the task of a neural network has been determined, the connection relationship between the current neuron and the back-end neurons is fixed, and the destination information of the current neuron output information is also determined. The corresponding weight index can therefore be found by looking up the destination index correspondence according to the destination information of the current neuron output information.
Step S600: outputting the current neuron output information and the weight index.
Specifically, the whole transfer process of the weight index is completed by outputting the current neuron output information and the weight index to the back-end neuron.
In this embodiment, after the current neuron calculates its output information, it looks up the corresponding weight index according to the destination neuron of that output information and then outputs the current neuron output information together with the weight index. The current neuron sends the weight index to the back-end neuron, so that the weight index information is passed completely through the neural network, improving the information processing capability of the neural network.
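The sketch below illustrates steps S500 and S600 under an assumed in-memory layout: a small table mapping destinations to weight indexes, from which an output message is assembled (the names and the dictionary representation are hypothetical):

```python
# Destination index correspondence: destination neuron -> weight index.
DEST_INDEX_TABLE = {"neuron_B": 3, "neuron_C": 7}

def emit_output(output_value: float, destination: str) -> dict:
    weight_index = DEST_INDEX_TABLE[destination]   # look up the index for this destination
    return {"dest": destination, "output": output_value, "weight_index": weight_index}

print(emit_output(0.42, "neuron_C"))   # output value travels onward with its weight index
```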
FIG. 3 is a schematic structural diagram of a neuron weight information processing system according to an embodiment. The neuron weight information processing system shown in FIG. 3 includes:
a front-end neuron output information receiving module 100, configured to receive neuron output information of a front-end neuron and a weight index corresponding to the neuron output information. The front-end neuron includes an artificial neuron or a pulse neuron. The weight index includes a storage address of the weight information corresponding to the weight index. The front-end neuron output information receiving module includes: a routing information receiving unit, configured to receive routing information output by the front-end neuron, the routing information including the neuron output information output by the front-end neuron and the weight index corresponding to the neuron output information; and a routing information parsing unit, configured to parse the routing information and obtain the neuron output information and the weight index.
a weight information obtaining module 200, configured to read a weight index information correspondence according to the weight index and obtain weight information, the weight index information correspondence being the correspondence between weight indexes and weight information. The weight information includes weight information calculated by a weight reduction algorithm according to a preset weight value range and initial weight information.
a front-end neuron input information obtaining module 300, configured to obtain input information of the front-end neuron according to the weight information and the neuron output information.
In this embodiment, the output information of the front-end neuron received by the current neuron carries the weight index of the weight information between the front-end neuron and the current neuron. After reading the weight information according to the received weight index, the current neuron performs the operation corresponding to its neuron model to obtain the input information of the front-end neuron. Weight information is no longer passed directly between neurons; instead, the index of the weight information is passed, which not only saves the amount of information transferred across the network but also allows the weight settings of each neuron to be changed more flexibly, improving the information processing capability of the neural network. The use of the weight index is applicable not only to artificial neural networks but also to pulse neural networks, improving the information processing capability of both. Using the storage address of the weight information as its index information allows the neuron that receives the index information to query the weight information directly through the storage address, improving the efficiency of retrieving weight information and thus the information processing efficiency of the whole neural network. Placing the weight index and the neuron output information in the routing information for transmission makes full use of the existing routing data and improves the efficiency of information use between neurons. The weight information calculated by the weight reduction algorithm according to the preset weight value range takes values only within that range, which reduces the storage space of the hardware used to store the weight information while maintaining the accuracy of the weight information.
FIG. 4 is a schematic structural diagram of a neuron weight information processing system according to another embodiment. The neuron weight information processing system shown in FIG. 4 includes:
a front-end neuron output information receiving module 100, configured to receive neuron output information of a front-end neuron and a weight index corresponding to the neuron output information;
a weight information obtaining module 200, configured to read a weight index information correspondence according to the weight index and obtain weight information, the weight index information correspondence being the correspondence between weight indexes and weight information;
a front-end neuron input information obtaining module 300, configured to obtain input information of the front-end neuron according to the weight information and the neuron output information;
a current neuron output information obtaining module 400, configured to calculate current neuron output information according to a neuron output algorithm, based on the input information of the front-end neuron and the read current neuron information;
a weight index determining module 500, configured to determine destination information of the current neuron output information, look up a destination index correspondence according to the destination information, and obtain the weight index for the destination information, the destination index correspondence including the correspondence between destination information and weight indexes; and
a weight index sending module 600, configured to output the current neuron output information and the weight index.
In this embodiment, after the current neuron calculates its output information, it looks up the corresponding weight index according to the destination neuron of that output information and then outputs the current neuron output information together with the weight index. The current neuron sends the weight index to the back-end neuron, so that the weight index information is passed completely through the neural network, improving the information processing capability of the neural network.
FIG. 5 is a schematic flowchart of a neuron information processing method with input weight expansion according to an embodiment. The neuron information processing method with input weight expansion shown in FIG. 5 includes:
Step S100': determining a preset number of consecutive neurons as a cooperative group, determining the last neuron in the cooperative group as an effective neuron, and determining the neurons in the cooperative group other than the effective neuron as cooperative neurons.
Specifically, the preset number can be set flexibly according to the requirements of weight expansion. Once a cooperative group has been set up, the whole cooperative group is equivalent to a single effective node. Among the consecutive neurons in the cooperative group, only the last neuron can output information, so the last neuron is determined as the effective neuron; the remaining neurons work cooperatively to expand the weight information without outputting data, so they are determined as cooperative neurons and cooperate with the last, effective, neuron in processing the information.
Step S200': the first cooperative neuron in the cooperative group obtaining the lateral accumulation intermediate information of the first cooperative neuron according to the received front-end neuron information.
Specifically, the first cooperative neuron in the cooperative group obtains, from the received front-end neuron information, the lateral accumulation intermediate information used by subsequent neurons for accumulation; it no longer reads its own neuron information and no longer computes output information.
Step S300': each subsequent cooperative neuron in the cooperative group obtaining, in turn, its lateral accumulation intermediate information according to the received front-end neuron information and the lateral accumulation intermediate information of the preceding cooperative neuron, and determining the lateral accumulation intermediate information of the last cooperative neuron in the cooperative group as the lateral accumulation information.
Specifically, after processing its received front-end neuron information, each subsequent cooperative neuron combines the result with the lateral accumulation intermediate information of the preceding cooperative neuron to obtain its own lateral accumulation intermediate information. That is, each cooperative neuron in the cooperative group only processes the front-end neuron information it receives and further accumulates it with the lateral accumulation intermediate information of the other cooperative neurons in the group, until the last cooperative neuron is reached; the lateral accumulation intermediate information obtained by the last cooperative neuron is determined as the lateral accumulation information and used in the subsequent computation of the effective neuron.
Step S400': the effective neuron obtaining cooperative output information according to the received front-end neuron information, the read current neuron information of the effective neuron, and the lateral accumulation information.
Specifically, the effective neuron performs its computation on the received front-end neuron information, the read current neuron information, and the lateral accumulation information computed by all the preceding cooperative neurons, and obtains the cooperative output information that is finally used for output.
Step S500': the effective neuron outputting the cooperative output information.
In this embodiment, by determining a preset number of consecutive neurons as a cooperative group, only the information of the last neuron in the cooperative group is output, while the remaining neurons merely accumulate information and superimpose it onto back-end neurons. All the neurons in the cooperative group are equivalent to a single effective node, and multiple input groups correspond to one effective output, so that the weight information of the multiple input groups can be fully utilized. This overcomes the drawback that an existing single neuron supports only a limited number of input weight types and improves the information processing capability of the neural network.
In one embodiment, a firing enable flag is set for the neurons in the cooperative group, the firing enable flag indicating whether data output is allowed or not allowed; the firing enable flag of the effective neuron is set to allow data output, and the firing enable flags of all the cooperative neurons are set to disallow data output.
Specifically, setting the firing enable flag is used to configure the determined preset number of consecutive neurons as a cooperative group in which only the last neuron can output information.
In one embodiment, by setting the firing enable flag, each neuron is set either to allow or to disallow data output, so that a preset number of consecutive neurons form a cooperative group; cooperative groups can thus be composed flexibly according to requirements.
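A minimal sketch of this flexible grouping, assuming a simple list-of-records representation; the field names and group size below are illustrative:

```python
def build_groups(num_neurons: int, group_size: int):
    """Mark the last neuron of each group of consecutive neurons as the effective neuron."""
    neurons = []
    for i in range(num_neurons):
        is_effective = (i % group_size) == group_size - 1   # last neuron of its group
        neurons.append({"id": i, "fire_enabled": is_effective})
    return neurons

for n in build_groups(num_neurons=6, group_size=3):
    print(n)   # neurons 2 and 5 are effective; the others are cooperative neurons
```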
In one embodiment, a preset number of consecutive artificial neurons are determined as an artificial cooperative group, the last artificial neuron in the artificial cooperative group is determined as an artificial effective neuron, and the artificial neurons in the artificial cooperative group other than the artificial effective neuron are determined as artificial cooperative neurons; or a preset number of consecutive pulse neurons are determined as a pulse cooperative group, the last pulse neuron in the pulse cooperative group is determined as a pulse effective neuron, and the pulse neurons in the pulse cooperative group other than the pulse effective neuron are determined as pulse cooperative neurons.
In this embodiment, a preset number of consecutive artificial neurons are determined as an artificial cooperative group, or a preset number of consecutive pulse neurons are determined as a pulse cooperative group. A cooperative group can therefore be formed in either an artificial neural network or a pulse neural network to expand the input weights of a single neuron and improve the information processing capability of that network.
When the cooperative group is a pulse cooperative group, the current neuron information includes historical membrane potential information; after the step of outputting the cooperative output information by the effective neuron, the method further includes: updating the historical membrane potential information of the pulse effective neuron.
In this embodiment, in a pulse cooperative group, after the pulse effective neuron outputs the cooperative output information, the historical membrane potential information of the pulse effective neuron is updated so that the whole cooperative group can carry out subsequent information processing, whereas the pulse cooperative neurons do not update their historical membrane potential information and only perform the weight expansion function in subsequent processing. The pulse cooperative group thereby improves the information processing capability of the whole pulse neural network.
FIG. 6 is a schematic flowchart of a neuron information processing method with input weight expansion according to another embodiment. The neuron information processing method with input weight expansion shown in FIG. 6 includes:
The front-end neuron information includes: front-end neuron output information and the connection weight index of the front-end neuron and the current neuron.
For an artificial cooperative group, the front-end neuron information includes the membrane potential information output by the front-end artificial neuron and the connection weight index of the front-end artificial neuron and the current artificial neuron.
For a pulse cooperative group, the front-end neuron information includes the pulse tip information output by the front-end pulse neuron and the connection weight index of the front-end pulse neuron and the current pulse neuron.
Step S100a': determining a preset number of consecutive neurons as a cooperative group, determining the last neuron in the cooperative group as an effective neuron, and determining the neurons in the cooperative group other than the effective neuron as cooperative neurons.
Specifically, this is the same as step S100'.
步骤S200a’,协同组中的第一个协同神经元,根据所述前端神经元与当前神经元的连接权重索引,读取前端神经元与当前神经元的连接权重;根据所述前端神经元与当前神经元的连接权重、所述前端神经元信息,获取第一个协同神经元的横向累加中间信息。Step S200a', the first coordinated neuron in the cooperative group reads the connection weight of the front-end neuron and the current neuron according to the connection weight index of the front-end neuron and the current neuron; according to the front-end neuron and The connection weight of the current neuron, the front-end neuron information, and the horizontal accumulation intermediate information of the first coordinated neuron.
具体的,所述前端人工神经元与当前人工神经元的连接权重索引,是一个地址信息,当前神经元根据接收到的所述前端人工神经元与当前人工神经元的连接权重索引,在当前神经元内的存储器中,读取到前端人工神经元与当前人工神经元的连接权重,根据所述的连接权重信息,可以将前端神经元的输出信息,在参与当前神经元输出信息的计算过程中,更准确的反应出前端神经元的输出信息的权重。Specifically, the connection weight index of the front end artificial neuron and the current artificial neuron is an address information, and the current neuron is indexed according to the received connection weight of the front end artificial neuron and the current artificial neuron, and the current nerve is In the memory in the element, the connection weight of the front end artificial neuron and the current artificial neuron is read, and according to the connection weight information, the output information of the front end neuron can be used in the calculation process of participating in the current neuron output information. More accurately reflects the weight of the output information of the front-end neurons.
当协同组为人工协同组时,所述前端神经元信息包括前端人工神经元输出的膜电位信息,根据所述前端人工神经元输出的膜电位信息,和读取到的前端神经元与当前神经元的连接权重,进行相乘后,获取第一个人工协同神经元的横向累加中间信息,并放入累加器中。When the collaborative group is an artificial cooperative group, the front-end neuron information includes membrane potential information output by the front-end artificial neuron, based on the membrane potential information output by the front-end artificial neuron, and the read front-end neuron and current nerve The weight of the connection of the element, after multiplication, obtains the horizontal accumulation intermediate information of the first artificial coordinated neuron and puts it into the accumulator.
当协同组为脉冲协同组时,所述前端神经元信息包括前端脉冲神经元输出的脉冲尖端信息,根据所述前端脉冲神经元输出的脉冲尖端信息,和读取到的前端神经元与当前神经元的连接权重,进行相乘后,获取第一个脉冲协同神经元的横向累加中间信息,并放入累加器中。When the collaborative group is a pulse cooperative group, the front-end neuron information includes pulse tip information output by the front-end pulse neuron, according to pulse tip information output by the front-end pulse neuron, and the read front-end neuron and current nerve The weight of the connection of the element, after multiplication, obtains the horizontal accumulation intermediate information of the first pulse cooperative neuron and puts it into the accumulator.
步骤S300a’,所述协同组中的后续协同神经元,依次根据所述前端神经元与当前神经元的连接权重索引,读取前端神经元与当前神经元的连接权重;根据所述前端神经元与当前神经元的连接权重、所述前端神经元信息,和前端协同神经元的横向累加中间信息,获取所述各协同神经元的横向累加中间信息,并将所述协同组中最后一个协同神经元的横向累加中间信息确定为横向累加信息。Step S300a', the subsequent cooperative neuron in the collaborative group sequentially reads the connection weight of the front-end neuron and the current neuron according to the connection weight index of the front-end neuron and the current neuron; according to the front-end neuron Obtaining intermediate information of the connection weights of the current neurons, the front-end neuron information, and the lateral accumulation of the front-end cooperative neurons, acquiring the horizontal accumulation intermediate information of the cooperative neurons, and the last synergistic nerve in the synergistic group The horizontal accumulation intermediate information of the element is determined as the horizontal accumulation information.
具体的,协同组中的后续的协同神经元,分别将接收到的前端神经元输出信息和读取到的前端神经元与当前神经元的连接权重,按照预设的神经元模式进行计算,如进行相乘后,再与与之相连的前端的协同神经元的横向累加中间信息进行累加,获取当前协同神经元的横向累加中间信息。直至最后一个协同神经元获取到横向累加中间信息后,确认为横向累加信息。Specifically, the subsequent cooperative neurons in the collaborative group separately calculate the received front-end neuron output information and the connected front-end neuron and the current neuron, and calculate according to the preset neuron mode, such as After multiplying, the horizontal accumulation intermediate information of the coordinated neurons of the front end connected thereto is accumulated to obtain the horizontal accumulation intermediate information of the current coordinated neurons. Until the last coordinated neuron obtains the horizontal accumulation intermediate information, it is confirmed as the horizontal accumulation information.
步骤S400a’,所述有效神经元根据接收的前端神经元信息、读取的所述有效神经元的当前神经元信息和所述横向累加信息,获取协同输出信息。Step S400a', the valid neuron acquires cooperative output information according to the received front-end neuron information, the read current neuron information of the effective neuron, and the horizontal accumulation information.
具体的,当协同组为人工协同组时,所述当前神经元信息包括当前人工神经元偏置信息。所述有效神经元根据接收的前端神经元信息、读取的所述有效神经元的当前神经元信息和所述横向累加信息,获取协同输出信息,包括:根据所述前端人工神经元输出的膜电位信息、
所述前端神经元与当前神经元的连接权重、所述当前人工神经元偏置信息,通过预设的人工神经元激活函数,计算所述人工有效神经元的协同输出信息。Specifically, when the collaboration group is an artificial collaboration group, the current neuron information includes current artificial neuron offset information. The effective neuron obtains collaborative output information according to the received front-end neuron information, the read current neuron information of the effective neuron, and the horizontal accumulation information, including: a membrane output according to the front-end artificial neuron Potential information,
The connection weight of the front-end neuron and the current neuron, the current artificial neuron bias information, and the cooperative output information of the artificial effective neuron are calculated by a preset artificial neuron activation function.
当协同组为脉冲神经元时,所述当前神经元信息包括历史膜电位信息和膜电位泄漏信息。所述有效神经元根据接收的前端神经元信息、读取的所述有效神经元的当前神经元信息和所述横向累加信息,获取协同输出信息,包括:根据所述前端脉冲神经元输出的脉冲尖端信息、所述前端神经元与当前神经元的连接权重、所述历史膜电位信息、所述膜电位泄露信息,通过脉冲神经元计算模型,计算所述脉冲有效神经元的协同输出信息。When the collaborative group is a pulsed neuron, the current neuron information includes historical membrane potential information and membrane potential leakage information. The effective neuron obtains cooperative output information according to the received front-end neuron information, the read current neuron information of the effective neuron, and the horizontal accumulation information, including: according to the pulse output by the front-end pulse neuron The cutting-edge information, the connection weight of the front-end neuron and the current neuron, the historical membrane potential information, and the membrane potential leakage information are calculated by a pulsed neuron calculation model to calculate cooperative output information of the pulsed effective neurons.
步骤S500a’,所述有效神经元输出所述协同输出信息。In step S500a', the effective neurons output the coordinated output information.
In this embodiment, the connection weight between the front-end neuron and the current neuron is read according to the connection weight index carried in the received front-end neuron information and is then used to compute the lateral accumulation intermediate information. The weight information of every cooperative neuron in a cooperative group is thereby fully utilized and reflected in the cooperative output information of the effective neuron, which is equivalent to expanding the weight information of the effective neuron and thus improves the information processing capability of the neural network.
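The following is a hedged end-to-end sketch of the cooperative-group computation described in steps S200a' to S500a' for the artificial case, under the assumptions noted in the comments (a flat index-to-weight table and a sigmoid as the preset activation function):

```python
import math

WEIGHTS = {0: 0.5, 1: -0.3, 2: 0.8}   # weight index -> connection weight (illustrative)

def cooperative_group_output(front_inputs, weight_indexes, bias):
    """Each cooperative neuron adds its weighted front-end input to a running
    lateral accumulation; only the last (effective) neuron adds its bias and
    applies the activation function to produce the single group output."""
    lateral_accumulation = 0.0
    for x, idx in zip(front_inputs, weight_indexes):
        lateral_accumulation += WEIGHTS[idx] * x   # one neuron's contribution per input
    return 1.0 / (1.0 + math.exp(-(lateral_accumulation + bias)))

# Three front-end inputs, each with its own weight index, yield one effective output.
print(cooperative_group_output([0.2, 0.9, 0.4], [0, 1, 2], bias=0.05))
```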
FIG. 7 is a schematic structural diagram of a neuron information processing system with input weight expansion according to an embodiment. The neuron information processing system with input weight expansion shown in FIG. 7 includes:
a cooperative group determining module 100', configured to determine a preset number of consecutive neurons as a cooperative group, determine the last neuron in the cooperative group as an effective neuron, and determine the neurons in the cooperative group other than the effective neuron as cooperative neurons; and configured to set a firing enable flag for the neurons in the cooperative group, the firing enable flag indicating whether data output is allowed or not allowed, the firing enable flag of the effective neuron being set to allow data output and the firing enable flags of all the cooperative neurons being set to disallow data output. The module includes: an artificial neuron determining unit, configured to determine a preset number of consecutive artificial neurons as an artificial cooperative group, determine the last artificial neuron in the artificial cooperative group as an artificial effective neuron, and determine the artificial neurons in the artificial cooperative group other than the artificial effective neuron as artificial cooperative neurons; or a pulse neuron determining unit, configured to determine a preset number of consecutive pulse neurons as a pulse cooperative group, determine the last pulse neuron in the pulse cooperative group as a pulse effective neuron, and determine the pulse neurons in the pulse cooperative group other than the pulse effective neuron as pulse cooperative neurons.
a lateral accumulation information obtaining module 200', configured for the first cooperative neuron in the cooperative group to obtain the lateral accumulation intermediate information of the first cooperative neuron according to the received front-end neuron information, for each subsequent cooperative neuron in the cooperative group to obtain, in turn, its lateral accumulation intermediate information according to the received front-end neuron information and the lateral accumulation intermediate information of the preceding cooperative neuron, and to determine the lateral accumulation intermediate information of the last cooperative neuron in the cooperative group as the lateral accumulation information. The front-end neuron information includes front-end neuron output information and the connection weight index of the front-end neuron and the current neuron. The lateral accumulation information obtaining module 200' is configured for the first cooperative neuron in the cooperative group to read the connection weight between the front-end neuron and the current neuron according to the connection weight index of the front-end neuron and the current neuron, and to obtain the lateral accumulation intermediate information of the first cooperative neuron according to the connection weight between the front-end neuron and the current neuron and the front-end neuron information; and for each subsequent cooperative neuron in the cooperative group to read, in turn, the connection weight between the front-end neuron and the current neuron according to the connection weight index of the front-end neuron and the current neuron, and to obtain its lateral accumulation intermediate information according to the connection weight between the front-end neuron and the current neuron, the front-end neuron information, and the lateral accumulation intermediate information of the preceding cooperative neuron. When the cooperative group is a pulse cooperative group, the current neuron information includes historical membrane potential information.
Specifically, when the lateral accumulation information obtaining module 200' is implemented as a hardware circuit with concrete components, the lateral accumulation intermediate information generated by each cooperative neuron in the cooperative group is passed to the next cooperative neuron or to the effective neuron through a shared register for membrane potential accumulation; this feedback addition can be implemented with an accumulator. More specifically, a cooperative neuron obtains the lateral accumulation intermediate information of the preceding cooperative neuron by reading the shared register. After the effective neuron has output its information, the shared register needs to be cleared to 0 so that the next round, or the next cooperative group, can work correctly. When designing the neural network circuit, to simplify the circuit structure, each cooperative neuron in the cooperative group may use the same input circuit structure as the final effective neuron; that is, like the effective neuron, each cooperative neuron also has an input circuit for reading current neuron information, and through software configuration the current neuron input information of each cooperative neuron is simply set to 0.
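The following is a small software analogy of the shared-register/accumulator behaviour described above, not a description of the actual circuit; the class and method names are illustrative:

```python
class SharedRegister:
    """Software stand-in for the shared register used for lateral accumulation."""
    def __init__(self):
        self.value = 0.0

    def accumulate(self, contribution: float):
        self.value += contribution            # feedback addition via an accumulator

    def read_and_clear(self) -> float:
        total, self.value = self.value, 0.0   # clear to 0 for the next cooperative group
        return total

reg = SharedRegister()
for contribution in [0.5, -0.25, 0.125]:      # three cooperative neurons contribute in turn
    reg.accumulate(contribution)
print(reg.read_and_clear())                   # effective neuron reads 0.375; register is reset
```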
The collaborative output information acquisition module 300' is configured such that the valid neuron acquires the collaborative output information according to the received front-end neuron information, the read current neuron information of the valid neuron, and the horizontal accumulation information.
The collaborative output information output module 400' is configured such that the valid neuron outputs the collaborative output information.
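A minimal sketch of what modules 300' and 400' do for a single valid neuron follows. The threshold-style output rule and the function name are assumptions made for illustration, since the embodiment does not fix a particular neuron output algorithm.

```python
def collaborative_output(front_input: float,
                         current_neuron_info: float,
                         lateral_accumulation: float,
                         threshold: float = 1.0) -> float:
    """Combine the valid neuron's own weighted front-end input, its current
    neuron information (for a pulse neuron, the historical membrane potential),
    and the horizontal accumulation information from the cooperative neurons."""
    membrane = current_neuron_info + front_input + lateral_accumulation
    # Illustrative output rule: issue the value only when it reaches a
    # threshold; below threshold the valid neuron issues nothing (0 here).
    return membrane if membrane >= threshold else 0.0

print(collaborative_output(0.3, 0.2, 1.0))   # 1.5 -> output issued
```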
In this embodiment, by determining a preset number of consecutive neurons as a cooperative group, only the information of the last neuron in the cooperative group is output; the remaining neurons merely accumulate information and pass it on toward the back-end neuron, so that all neurons in the cooperative group are equivalent to a single effective node in which multiple inputs correspond to one effective output. The weight information of the multiple inputs can thus be fully utilized, overcoming the limitation that an existing neuron supports only a limited number of input weight types and improving the information processing capability of the neural network. By setting an issue-enable flag, a neuron is configured either to be allowed or not allowed to issue data, and a preset number of consecutive neurons form one cooperative group, so cooperative groups can be flexibly configured as required. The connection weight between a front-end neuron and the current neuron is read according to the connection weight index carried in the received front-end neuron information and then used to compute the horizontal accumulation intermediate information, so the weight information of every cooperative neuron in a cooperative group is fully utilized and is reflected in the collaborative output information of the valid neuron; this is equivalent to expanding the weight information of the valid neuron, thereby improving the information processing capability of the neural network. A preset number of consecutive artificial neurons may be determined as an artificial cooperative group, or a preset number of consecutive pulse neurons may be determined as a pulse cooperative group; a cooperative group can therefore be formed in either an artificial neural network or a pulse neural network to expand the input weights of a single neuron and improve the information processing capability of the network.
FIG. 8 is a schematic structural diagram of an input-weight-expanded neuron information processing system according to another embodiment. As shown in FIG. 8, the input-weight-expanded neuron information processing system includes:
The cooperative group determination module 100' is configured to determine a preset number of consecutive neurons as a cooperative group, determine the last neuron in the cooperative group as the valid neuron, and determine the neurons in the cooperative group other than the valid neuron as cooperative neurons.
The horizontal accumulation information acquisition module 200' is configured such that the first cooperative neuron in the cooperative group acquires the horizontal accumulation intermediate information of the first cooperative neuron according to the received front-end neuron information; each subsequent cooperative neuron in the cooperative group, in turn, acquires its horizontal accumulation intermediate information according to the received front-end neuron information and the horizontal accumulation intermediate information of the preceding cooperative neuron; and the horizontal accumulation intermediate information of the last cooperative neuron in the cooperative group is determined as the horizontal accumulation information.
The collaborative output information acquisition module 300' is configured such that the valid neuron acquires the collaborative output information according to the received front-end neuron information, the read current neuron information of the valid neuron, and the horizontal accumulation information.
The collaborative output information output module 400' is configured such that the valid neuron outputs the collaborative output information.
The historical membrane potential update module 500' is configured to update the historical membrane potential information of the pulse valid neuron.
In this embodiment, in a pulse cooperative group, after the pulse valid neuron outputs the collaborative output information, the historical membrane potential information of the pulse valid neuron is updated so that the whole cooperative group can complete subsequent information processing, whereas the pulse cooperative neurons do not update historical membrane potential information and, in subsequent information processing, serve only to perform the weight-expansion function. The pulse cooperative group thus improves the information processing capability of the entire pulse neural network.
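For the pulse cooperative group, a hedged sketch of this update step might look as follows; the leaky, reset-to-zero dynamics and the class name are only one possible spiking model, assumed here for concreteness rather than taken from the embodiment.

```python
class PulseValidNeuron:
    """Only the valid neuron of a pulse cooperative group keeps and updates
    historical membrane potential; the pulse cooperative neurons skip this."""

    def __init__(self, threshold: float = 1.0, leak: float = 0.9) -> None:
        self.threshold = threshold
        self.leak = leak
        self.membrane = 0.0        # historical membrane potential information

    def step(self, front_input: float, lateral_accumulation: float) -> int:
        # Integrate the valid neuron's own input plus the group's accumulation.
        self.membrane += front_input + lateral_accumulation
        spike = 1 if self.membrane >= self.threshold else 0
        # Update the historical membrane potential after output: reset when a
        # spike is issued, otherwise decay, ready for the next time step.
        self.membrane = 0.0 if spike else self.membrane * self.leak
        return spike

neuron = PulseValidNeuron()
print(neuron.step(0.3, 0.4), neuron.step(0.3, 0.4))   # 0 then 1
```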
Based on the same inventive concept, an embodiment of the present invention further provides a computer device including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the methods described in the above embodiments.
A person of ordinary skill in the art can understand that all or part of the processes in the methods of the above embodiments can be implemented by a computer program or by instructing related hardware through a computer program, and the program may be stored in a computer-readable storage medium; when executed, the program may include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or other media used in the embodiments provided in this application may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of these technical features involves no contradiction, it should be considered to be within the scope of this specification.
The above embodiments merely express several implementations of the present invention, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the patent. It should be noted that a person of ordinary skill in the art may make several variations and improvements without departing from the concept of the present invention, and all of these fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.
Claims (14)
- 1. A neuron weight information processing method, characterized in that the method comprises: receiving neuron output information of a front-end neuron, and a weight index corresponding to the neuron output information; reading, according to the weight index, a weight index information correspondence to acquire weight information, the weight index information correspondence being a correspondence between weight indexes and weight information; and acquiring input information of the front-end neuron according to the weight information and the neuron output information.
- 2. The neuron weight information processing method according to claim 1, characterized in that the front-end neuron comprises an artificial neuron or a pulse neuron.
- 3. The neuron weight information processing method according to claim 1, characterized in that the weight index comprises a storage address of the weight information corresponding to the weight index.
- 4. The neuron weight information processing method according to claim 1, characterized in that receiving the neuron output information of the front-end neuron and the weight index corresponding to the neuron output information comprises: receiving routing information output by the front-end neuron, the routing information comprising the neuron output information output by the front-end neuron and the weight index corresponding to the neuron output information; and parsing the routing information to acquire the neuron output information and the weight index.
- 5. The neuron weight information processing method according to claim 1, characterized in that the weight information comprises: weight information calculated by a weight reduction algorithm according to a preset weight value range and initial weight information.
- 6. The neuron weight information processing method according to claim 1, characterized in that, after the step of acquiring the input information of the front-end neuron according to the weight information and the neuron output information, the method further comprises: calculating current neuron output information according to a neuron output algorithm, based on the input information of the front-end neuron and read current neuron information; determining destination information of the current neuron output information, searching a destination index correspondence according to the destination information, and acquiring a weight index of the destination information, the destination index correspondence comprising a correspondence between destination information and weight indexes; and outputting the current neuron output information and the weight index.
- 7. An input-weight-expanded neuron information processing method, characterized in that the method comprises: determining a preset number of consecutive neurons as a cooperative group, determining the last neuron in the cooperative group as a valid neuron, and determining the neurons in the cooperative group other than the valid neuron as cooperative neurons; acquiring, by the first cooperative neuron in the cooperative group, horizontal accumulation intermediate information of the first cooperative neuron according to received front-end neuron information; acquiring, by each subsequent cooperative neuron in the cooperative group in turn, its horizontal accumulation intermediate information according to the received front-end neuron information and the horizontal accumulation intermediate information of the preceding cooperative neuron, and determining the horizontal accumulation intermediate information of the last cooperative neuron in the cooperative group as horizontal accumulation information; acquiring, by the valid neuron, collaborative output information according to the received front-end neuron information, read current neuron information of the valid neuron, and the horizontal accumulation information; and outputting, by the valid neuron, the collaborative output information.
- 8. The input-weight-expanded neuron information processing method according to claim 7, characterized in that determining the last neuron in the cooperative group as the valid neuron and determining the neurons in the cooperative group other than the valid neuron as cooperative neurons comprises: setting an issue-enable flag for each neuron in the cooperative group, the issue-enable flag indicating either that issuing data is allowed or that issuing data is not allowed; setting the issue-enable flag of the valid neuron to allow issuing data; and setting the issue-enable flags of all the cooperative neurons to not allow issuing data.
- 9. The input-weight-expanded neuron information processing method according to claim 7, characterized in that: the front-end neuron information comprises front-end neuron output information and a connection weight index between the front-end neuron and the current neuron; acquiring, by the first cooperative neuron in the cooperative group, the horizontal accumulation intermediate information of the first cooperative neuron according to the received front-end neuron information comprises: reading, by the first cooperative neuron in the cooperative group, the connection weight between the front-end neuron and the current neuron according to the connection weight index, and acquiring the horizontal accumulation intermediate information of the first cooperative neuron according to that connection weight and the front-end neuron information; and acquiring, by each subsequent cooperative neuron in the cooperative group in turn, its horizontal accumulation intermediate information according to the received front-end neuron information and the horizontal accumulation intermediate information of the preceding cooperative neuron comprises: reading, by each subsequent cooperative neuron in the cooperative group in turn, the connection weight between the front-end neuron and the current neuron according to the connection weight index, and acquiring the horizontal accumulation intermediate information of each cooperative neuron according to that connection weight, the front-end neuron information, and the horizontal accumulation intermediate information of the preceding cooperative neuron.
- 10. The input-weight-expanded neuron information processing method according to claim 7, characterized in that determining the preset number of consecutive neurons as the cooperative group, determining the last neuron in the cooperative group as the valid neuron, and determining the neurons in the cooperative group other than the valid neuron as cooperative neurons comprises: determining a preset number of consecutive artificial neurons as an artificial cooperative group, determining the last artificial neuron in the artificial cooperative group as an artificial valid neuron, and determining the artificial neurons in the artificial cooperative group other than the artificial valid neuron as artificial cooperative neurons; or determining a preset number of consecutive pulse neurons as a pulse cooperative group, determining the last pulse neuron in the pulse cooperative group as a pulse valid neuron, and determining the pulse neurons in the pulse cooperative group other than the pulse valid neuron as pulse cooperative neurons.
- 11. The input-weight-expanded neuron information processing method according to claim 10, characterized in that: when the cooperative group is a pulse cooperative group, the current neuron information comprises historical membrane potential information; and after the step of outputting the collaborative output information by the valid neuron, the method further comprises: updating the historical membrane potential information of the pulse valid neuron.
- 12. A computer device, characterized by comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 11.
- 13. A neuron weight information processing system, characterized by comprising: a front-end neuron output information receiving module, configured to receive neuron output information of a front-end neuron and a weight index corresponding to the neuron output information; a weight information acquisition module, configured to read, according to the weight index, a weight index information correspondence to acquire weight information, the weight index information correspondence being a correspondence between weight indexes and weight information; and a front-end neuron input information acquisition module, configured to acquire input information of the front-end neuron according to the weight information and the neuron output information.
- 14. An input-weight-expanded neuron information processing system, characterized by comprising: a cooperative group determination module, configured to determine a preset number of consecutive neurons as a cooperative group, determine the last neuron in the cooperative group as a valid neuron, and determine the neurons in the cooperative group other than the valid neuron as cooperative neurons; a horizontal accumulation information acquisition module, configured such that the first cooperative neuron in the cooperative group acquires horizontal accumulation intermediate information of the first cooperative neuron according to received front-end neuron information, each subsequent cooperative neuron in the cooperative group in turn acquires its horizontal accumulation intermediate information according to the received front-end neuron information and the horizontal accumulation intermediate information of the preceding cooperative neuron, and the horizontal accumulation intermediate information of the last cooperative neuron in the cooperative group is determined as horizontal accumulation information; a collaborative output information acquisition module, configured such that the valid neuron acquires collaborative output information according to the received front-end neuron information, read current neuron information of the valid neuron, and the horizontal accumulation information; and a collaborative output information output module, configured such that the valid neuron outputs the collaborative output information.
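Purely as an illustration of the weight-index mechanism recited in claims 1 and 6 above, the lookup chain could be modeled as below; the dictionary-based tables, the additive placeholder output algorithm, and all names in this sketch are assumptions of the example, not a definitive implementation of the claimed method.

```python
# Hypothetical lookup tables: weight index -> weight, destination -> weight index.
weight_index_table = {10: 0.5, 11: -0.3}
destination_index_table = {"neuron_B": 11}

def receive(front_output: float, weight_index: int) -> float:
    """Claim 1: map a front-end output and its weight index to input information."""
    weight = weight_index_table[weight_index]      # read the weight index correspondence
    return weight * front_output                   # input information of that front-end neuron

def forward(front_inputs: list, current_neuron_info: float, destination: str):
    """Claim 6: compute the current neuron's output and attach the weight index
    looked up from the destination information."""
    output = current_neuron_info + sum(front_inputs)   # placeholder output algorithm
    return output, destination_index_table[destination]

x = receive(2.0, 10)                         # 0.5 * 2.0 = 1.0
print(forward([x], 0.5, "neuron_B"))         # (1.5, 11)
```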
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710042087.2 | 2017-01-20 | ||
CN201710042090.4 | 2017-01-20 | ||
CN201710042090.4A CN106815638B (en) | 2017-01-20 | 2017-01-20 | Input weight expanded neuron information processing method and system |
CN201710042087.2A CN106875010B (en) | 2017-01-20 | 2017-01-20 | Neuron weight information processing method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018133567A1 true WO2018133567A1 (en) | 2018-07-26 |
Family
ID=62907871
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2017/114659 WO2018133567A1 (en) | 2017-01-20 | 2017-12-05 | Neuron weight information processing method and system, neuron information processing method and system, and computer device |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2018133567A1 (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106104406A (en) * | 2014-03-06 | 2016-11-09 | 前进公司 | Neutral net and the method for neural metwork training |
CN104036451A (en) * | 2014-06-20 | 2014-09-10 | 深圳市腾讯计算机系统有限公司 | Parallel model processing method and device based on multiple graphics processing units |
CN106022468A (en) * | 2016-05-17 | 2016-10-12 | 成都启英泰伦科技有限公司 | Artificial neural network processor integrated circuit and design method therefor |
CN106815638A (en) * | 2017-01-20 | 2017-06-09 | 清华大学 | The neuronal messages processing method and system that input weight is expanded |
CN106875010A (en) * | 2017-01-20 | 2017-06-20 | 清华大学 | Neuron weight information processing method and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17892935; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 17892935; Country of ref document: EP; Kind code of ref document: A1 |