CN109657784A - Neural network creation method and device, image processing method and electronic equipment - Google Patents

Neural network creation method and device, image processing method and electronic equipment

Info

Publication number: CN109657784A
Application number: CN201811591916.3A
Authority: CN (China)
Prior art keywords: network, neural network, subelement, weighting, unit
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventor: 张祥雨
Current assignee: Beijing Megvii Technology Co Ltd / Beijing Maigewei Technology Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Beijing Maigewei Technology Co Ltd
Application filed by Beijing Maigewei Technology Co Ltd (the priority date is an assumption and is not a legal conclusion)
Priority to CN201811591916.3A
Publication of CN109657784A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/06 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N 3/08 Learning methods


Abstract

The disclosure provides a neural network creation method, a creation apparatus, an image processing method that processes an input image with a neural network, an electronic device, and a computer-readable storage medium. The neural network creation method includes: configuring multiple initial network units, each of which includes at least multiple network sub-units sharing the same input end and output end, and a weighting gating unit connected to the output ends of the multiple network sub-units; configuring an initial neural network that includes at least the multiple initial network units; training the initial neural network based on the target task of the neural network until a predetermined training termination condition is met; and, for each of the multiple initial network units, selecting one network sub-unit through the weighting gating unit and removing the other network sub-units and the weighting gating unit, thereby obtaining the created neural network.

Description

Neural network creation method and device, image processing method and electronic equipment
Technical field
This disclosure relates to the field of image processing, and more specifically to a neural network creation method, a creation apparatus, an image processing method that processes an input image with a neural network, an electronic device, and a computer-readable storage medium.
Background
A neural network is a large-scale, multi-parameter optimization tool. Given a large amount of training data, a neural network can learn hidden features in the data that are difficult to summarize, and thereby accomplish many complex tasks such as face detection, image semantic segmentation, object detection, motion tracking, and natural language translation. Neural networks are widely applied in the artificial intelligence community.
Currently, neural network model design is mainly guided by indirect measures of computational complexity, such as the number of floating-point operations (FLOPs), often billions per inference. However, a direct measure such as processing speed also depends on other factors, for example memory access cost and platform characteristics. That is, when creating a neural network, in addition to considering FLOPs, it is also necessary to evaluate direct measures on the target platform. By theoretically analyzing and empirically verifying the factors that influence the actual execution speed of neural network models, design principles for highly efficient neural networks can be proposed. It is therefore desirable to be able to create neural networks suited to different platforms and usage scenarios according to these design principles.
Summary of the invention
The present disclosure has been proposed in view of the above problems. The disclosure provides a neural network creation method, a creation apparatus, an image processing method that processes an input image with a neural network, an electronic device, and a computer-readable storage medium.
According to one aspect of the disclosure, a neural network creation method is provided, comprising: configuring multiple initial network units, each of which includes at least multiple network sub-units sharing the same input end and output end, and a weighting gating unit connected to the output ends of the multiple network sub-units; configuring an initial neural network that includes at least the multiple initial network units; training the initial neural network based on the target task of the neural network until a predetermined training termination condition is met; and, for each of the multiple initial network units, selecting one network sub-unit through the weighting gating unit and removing the other network sub-units and the weighting gating unit, to obtain the created neural network.
According to another aspect of the disclosure, a neural network creation apparatus is provided, comprising: a configuration module for configuring multiple initial network units, each of which includes at least multiple network sub-units sharing the same input end and output end, and a weighting gating unit connected to the output ends of the multiple network sub-units, and for configuring an initial neural network that includes at least the multiple initial network units; a training module for training the initial neural network based on the target task of the neural network until a predetermined training termination condition is met; and a creation module for, for each of the multiple initial network units, selecting one network sub-unit through the weighting gating unit and removing the other network sub-units and the weighting gating unit, to obtain the created neural network.
According to yet another aspect of the disclosure, an image processing method that processes an input image with a neural network is provided. The neural network includes an input layer, an intermediate layer, and an output layer, and the image processing method includes: creating the neural network; receiving the input image via the input layer; extracting image features of the input image via the intermediate layer; and outputting a processing result for the input image via the output layer, wherein creating the neural network includes using the neural network creation method described above to create the neural network.
According to still another aspect of the disclosure, an electronic device is provided, comprising: a memory for storing computer-readable instructions; and a processor for running the computer-readable instructions to execute the neural network creation method or the image processing method described above.
According to still another aspect of the disclosure, a computer-readable storage medium is provided for storing computer-readable instructions which, when executed by a computer, cause the computer to execute the neural network creation method or the image processing method described above.
As will be described in detail below, the neural network creation method according to embodiments of the present disclosure makes full use of target-task data under the general design principles of highly efficient neural networks, and automatically selects a neural network model structure that suits the target-task data and satisfies those design principles. The neural network creation method according to embodiments of the present disclosure therefore avoids the limitations of hand-designed models while retaining execution efficiency.
It is to be understood that both the foregoing general description and the following detailed description are illustrative and are intended to provide further explanation of the claimed technology.
Brief description of the drawings
The above and other objects, features, and advantages of the present invention will become more apparent from the more detailed description of embodiments of the present invention in conjunction with the accompanying drawings. The accompanying drawings are provided for further understanding of the embodiments of the present invention, constitute a part of the specification, serve to explain the present invention together with its embodiments, and are not to be construed as limiting the present invention. In the drawings, identical reference labels typically represent the same components or steps.
Fig. 1 is a flowchart illustrating a neural network creation method according to an embodiment of the present disclosure;
Fig. 2 is a schematic diagram illustrating an initial neural network according to an embodiment of the present disclosure;
Fig. 3A is a schematic diagram illustrating an initial network unit used by the neural network creation method according to an embodiment of the present disclosure;
Fig. 3B is a schematic diagram illustrating a network unit created using the neural network creation method according to an embodiment of the present disclosure;
Fig. 4A is another schematic diagram illustrating an initial network unit used by the neural network creation method according to an embodiment of the present disclosure;
Fig. 4B is another schematic diagram illustrating a network unit created using the neural network creation method according to an embodiment of the present disclosure;
Fig. 5 is a flowchart illustrating an image processing method according to an embodiment of the present disclosure;
Fig. 6 is a schematic diagram illustrating an image processing method according to an embodiment of the present disclosure;
Fig. 7 is a block diagram illustrating a neural network creation apparatus according to an embodiment of the present disclosure;
Fig. 8 is a hardware block diagram illustrating an electronic device according to an embodiment of the present disclosure; and
Fig. 9 is a schematic diagram illustrating a computer-readable storage medium according to an embodiment of the present disclosure.
Detailed description
To make the objects, technical solutions, and advantages of the present disclosure more apparent, example embodiments according to the present disclosure will be described in detail below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present disclosure rather than all of them, and it should be understood that the present disclosure is not limited by the example embodiments described herein.
First, a neural network creation method according to an embodiment of the present disclosure is described with reference to Figs. 1 to 4B.
Fig. 1 is a flowchart illustrating a neural network creation method according to an embodiment of the present disclosure. As shown in Fig. 1, the neural network creation method according to an embodiment of the present disclosure includes the following steps.
In step S101, multiple initial network units are configured.
In step S102, an initial neural network including at least the multiple initial network units is configured.
Hereinafter, the initial neural network according to an embodiment of the present disclosure is described with particular reference to Fig. 2 and Fig. 3A.
Fig. 2 is a schematic diagram illustrating an initial neural network according to an embodiment of the present disclosure. As shown in Fig. 2, the initial neural network includes at least multiple initial network units MB1, MB2, ..., MBn connected in sequence. It should be noted that the initial neural network shown in Fig. 2 is only schematic; the initial neural network according to embodiments of the present disclosure is not limited thereto, and may include any number of other units according to the demands of the specific task. For example, other network units such as convolution, pooling, and fully connected units may be interleaved among the multiple initial network units to constitute the complete training network.
Fig. 3 A is the one of the initial network unit that diagram uses neural network creation method according to an embodiment of the present disclosure A schematic diagram.
As shown in Figure 3A, multiple initial network unit MB1、MB2…MBmEach MB include at least input having the same Multiple network subelement B at end 301 and output end 3021、B2…Bn, and with the multiple network subelement B1、B2…BnInstitute State the weighting gating unit 303 of the connection of output end 302.The weighting gating unit 302 is the multiple network subelement B1、 B2…BnEach of corresponding weighting gating coefficient is provided.
Referring back to Fig. 1, after the initial neural network and the multiple initial network units therein are configured in steps S101 and S102, in step S103 the initial neural network is trained based on the target task of the neural network until a predetermined training termination condition is met. In one embodiment of the present disclosure, training the initial neural network includes adjusting the corresponding weighting gating coefficient that the weighting gating unit 303 provides for each of the multiple network sub-units B1, B2, ..., Bn. It is to be appreciated that training the initial neural network also covers all parameters of the other convolution, pooling, and fully connected units, as well as those of the multiple network sub-units B1, B2, ..., Bn themselves. In one embodiment of the present disclosure, the predetermined training termination condition depends on the target task; for example, when the neural network performs a classification task, the predetermined training termination condition may be convergence of a cross-entropy loss function.
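For a classification task, the cross-entropy termination condition mentioned above can be sketched numerically. The convergence heuristic below (a tolerance over a sliding window) is an illustrative assumption; the disclosure only requires that the loss converge:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(logits, labels):
    """Mean cross-entropy of softmax(logits) against integer class labels."""
    p = softmax(logits)
    return float(-np.mean(np.log(p[np.arange(len(labels)), labels])))

def converged(loss_history, tol=1e-4, window=3):
    """Terminate when the loss change over the last `window` steps falls below tol."""
    if len(loss_history) <= window:
        return False
    return abs(loss_history[-1] - loss_history[-1 - window]) < tol
```

In training, `cross_entropy` would be evaluated each epoch and `converged` checked on the running history to decide when step S103 ends.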
When the training meets the predetermined training termination condition in step S103, then in step S104, for each of the multiple initial network units, one network sub-unit is selected through the weighting gating unit, and the other network sub-units and the weighting gating unit are removed, to obtain the created neural network.
In one embodiment of the present disclosure, selecting one network sub-unit through the weighting gating unit 303 is expressed as:

$$\mathrm{MB} = \sum_{i=1}^{n} \mathrm{Softmax}(T \cdot d_i)\, B_i, \qquad \mathrm{Softmax}(T \cdot d_i) = \frac{\exp(T \cdot d_i)}{\sum_{j=1}^{n} \exp(T \cdot d_j)},$$

where $B_i$ denotes the output value of each network sub-unit among the multiple network sub-units, Softmax denotes the weighting gate function, $d_i$ denotes the weighting gating coefficient of each network sub-unit, $T$ denotes a temperature parameter, $i$ is a natural number between 1 and $n$, and $n$ is the number of the multiple network sub-units.
In the training process of step S103, T is initialized to a small value (for example, 0.1) at the start of training and is gradually increased as training proceeds. When T is very large, it follows from the properties of the Softmax function in the above expression that, provided the weighting gating coefficients d_i of the network sub-units are not all exactly equal, the Softmax value corresponding to the largest d_i approaches 1 while the others approach 0. That is, after training, selecting one network sub-unit through the weighting gating unit 303 consists in retaining the network sub-unit whose corresponding weighting gating coefficient is the largest.
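The annealing behaviour described above can be checked numerically, assuming the weighting gating unit computes a Softmax over the temperature-scaled coefficients T·d_i; a minimal sketch in which the coefficient values are illustrative:

```python
import numpy as np

def gate_weights(d, T):
    """Softmax over temperature-scaled weighting gating coefficients T * d_i."""
    z = T * np.asarray(d, dtype=float)
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def gated_output(sub_outputs, d, T):
    """Weighted sum of sub-unit outputs B_i under the gate."""
    w = gate_weights(d, T)
    return sum(wi * Bi for wi, Bi in zip(w, sub_outputs))

d = [0.2, 1.0, 0.5]              # trained gating coefficients (illustrative)
soft = gate_weights(d, T=0.1)    # early training: nearly uniform mixture
hard = gate_weights(d, T=100.0)  # late training: nearly one-hot on argmax d_i
```

With small T the unit behaves as a smooth mixture of all branches (so gradients reach every sub-unit), and with large T it approaches a hard selection of the branch with the largest coefficient.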
Fig. 3 B is the one of the network unit that diagram is created using neural network creation method according to an embodiment of the present disclosure A schematic diagram.Utilize the net of the neural network of neural network creation method creation according to an embodiment of the present disclosure shown in FIG. 1 Network unit is for example as shown in Figure 3B, and the training of goal task data set is utilized by executing to initial cell as shown in Figure 3A, Retain wherein corresponding weighting and gates coefficient d2Maximum network subelement B2, while removing other network subelements B1、 B3…BnAnd weighting gating unit 303, to obtain the network unit of creation.Similarly, for as shown in Figure 2 the multiple Initial network unit MB1、MB2…MBmEach of, a network subelement is selected by the weighting gating unit, and Other network subelements and the weighting gating unit are removed, the neural network of creation can be obtained.
In addition, after one network sub-unit is selected in step S104, training on the target-task data may continue in order to fine-tune the model and obtain the final output neural network.
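The selection of step S104 then reduces to an argmax over the trained weighting gating coefficients: keep the sub-unit with the largest d_i, discard the rest together with the gate. A minimal sketch, in which representing sub-units as plain callables is an illustrative assumption:

```python
import numpy as np

def select_subunit(subunits, d):
    """Step S104: retain the sub-unit with the largest weighting gating
    coefficient; the other sub-units and the gating unit are discarded."""
    keep = int(np.argmax(d))
    return subunits[keep]

def create_network(initial_units):
    """Collapse every initial unit (subunits, d) into its single chosen sub-unit."""
    return [select_subunit(subs, d) for subs, d in initial_units]

# Illustrative sub-units: simple callables standing in for conv/pool branches.
units = [
    ([lambda x: x + 1, lambda x: x * 2], [0.3, 1.2]),   # argmax -> x * 2
    ([lambda x: x - 1, lambda x: x * 10], [2.0, 0.1]),  # argmax -> x - 1
]
net = create_network(units)
```

The resulting `net` is the pruned sequence of chosen branches, which would then be fine-tuned on the target-task data.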
The general example of the network units created by the neural network creation method according to embodiments of the present disclosure has been described above with reference to Figs. 3A and 3B. The general example shown in Figs. 3A and 3B places no specific limitation on the initial network units MB1, MB2, ..., MBm. Figs. 4A and 4B further illustrate the application of the neural network creation method according to embodiments of the present disclosure under the design principles of highly efficient neural networks.
As shown in Fig. 4A, the neural network to be created is a second-generation channel shuffle network (ShuffleNet V2), in which each network sub-unit needs to satisfy the preset rules of the second-generation channel shuffle network. In a network unit of the second-generation channel shuffle network, the channels are first split into two paths by a channel split unit 401: one path undergoes no processing, while the other path includes the multiple network sub-units B1, B2, ..., Bn to be created and trained, which share the same input end 402 and output end 403, together with the weighting gating unit 404 connected to the output ends 403 of the multiple network sub-units B1, B2, ..., Bn. Thereafter, the two paths are spliced together by a channel concatenation unit 405, and the information is merged and output through a channel shuffle unit 406.
In this example, each of the multiple network sub-units B1, B2, ..., Bn satisfies the preset rules of the second-generation channel shuffle network. Specifically, each network sub-unit satisfies one or more of the following conditions: the difference between the numbers of input and output channels of the network sub-unit is less than a preset threshold, for example a "balanced" convolution unit whose input and output channel numbers are identical; the network sub-unit performs depthwise separable convolution; the network sub-unit performs pooling; the network sub-unit performs neuron activation.
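The surrounding structure of the unit in Fig. 4A (channel split unit 401, channel concatenation unit 405, channel rearrangement/shuffle unit 406) can be sketched in NumPy; the tensor layout (channels, height, width) and the identity branch used in the test are illustrative assumptions:

```python
import numpy as np

def channel_shuffle(x, groups=2):
    """Interleave channels across groups (the channel rearrangement step).
    x has shape (channels, height, width)."""
    c, h, w = x.shape
    return x.reshape(groups, c // groups, h, w).transpose(1, 0, 2, 3).reshape(c, h, w)

def shuffle_unit(x, branch_fn):
    """Split channels in two (unit 401), process one path with branch_fn,
    concatenate both paths (unit 405), then shuffle channels (unit 406)."""
    c = x.shape[0]
    identity, processed = x[: c // 2], branch_fn(x[c // 2 :])
    return channel_shuffle(np.concatenate([identity, processed], axis=0))
```

With an identity branch the whole unit is just a channel permutation, which makes the shuffle step easy to verify in isolation; in the created network, `branch_fn` would be the selected sub-unit B2.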
By using the neural network creation method according to the embodiment of the present disclosure described with reference to Fig. 1, as shown in Fig. 4B, the initial unit shown in Fig. 4A is trained on the target-task data set, the network sub-unit B2 whose corresponding weighting gating coefficient d2 is the largest is retained, and the other network sub-units B1, B3, ..., Bn and the weighting gating unit 404 are removed, to obtain the created network unit.
Since each of the multiple network sub-units B1, B2, ..., Bn satisfies the preset rules of the second-generation channel shuffle network, the created network unit MB containing the network sub-unit B2 necessarily satisfies those preset rules, and the neural network finally constituted by the multiple network units MB so created also satisfies the preset rules of the second-generation channel shuffle network. With the neural network creation method according to embodiments of the present disclosure described with reference to Figs. 1 to 4B, a neural network model structure that suits the target-task data and satisfies the general design principles of highly efficient neural networks (for example, the second-generation channel shuffle network) is created automatically under those design principles.
The neural network creation method according to embodiments of the present disclosure has been described above. Hereinafter, an image processing method according to an embodiment of the present disclosure will be further described with reference to Fig. 5 and Fig. 6. Fig. 5 is a flowchart illustrating an image processing method according to an embodiment of the present disclosure; Fig. 6 is a schematic diagram illustrating an image processing method according to an embodiment of the present disclosure.
As shown in Fig. 5, in step S501, a neural network is created. The image processing method according to an embodiment of the present disclosure may use the neural network creation method described above with reference to Figs. 1 to 4B to create the neural network.
Further, as shown in Fig. 5 and Fig. 6, in step S502, an input image 600 is received via an input layer 61. As Fig. 6 schematically shows, the convolutional neural network 60 created in step S501 includes the input layer 61, an intermediate layer 62, and an output layer 63.
In step S503, image features of the input image 600 are extracted via the intermediate layer 62. As Fig. 6 schematically shows, the intermediate layer 62 may include multiple cascaded sublayers, including but not limited to convolutional layers, pooling layers, and activation layers.
In step S504, a processing result 601 for the input image 600 is output via the output layer 63. Illustratively, the processing result may be a classification result; as Fig. 6 schematically shows, the output layer 63 outputs the classification result 601 after processing such as global pooling and full connection. The image processing method according to embodiments of the present disclosure is not limited to performing object classification in an image, and further includes object detection, object segmentation, motion prediction of objects, similarity comparison of objects, and the like. Illustratively, the processing result output via the output layer of the neural network system may also be position information of an object, an image segmentation result, a motion prediction result of an object, a similarity of objects, and so on.
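The global pooling and full connection performed by the output layer 63 can be sketched as follows; the feature dimensions and weight shapes are illustrative assumptions, not fixed by the disclosure:

```python
import numpy as np

def classify(features, W, b):
    """Output layer 63: global average pooling over the spatial dimensions,
    followed by a fully connected layer producing class scores.
    features: (channels, height, width); W: (num_classes, channels); b: (num_classes,)."""
    pooled = features.mean(axis=(1, 2))    # global average pool -> (channels,)
    logits = W @ pooled + b                # full connection -> (num_classes,)
    return int(np.argmax(logits)), logits  # predicted class and raw scores
```

For other target tasks the same pooled features would instead feed heads producing boxes, masks, or similarity scores.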
Fig. 7 is a block diagram illustrating a neural network creation apparatus according to an embodiment of the present disclosure. The neural network creation apparatus 70 according to the embodiment of the present disclosure shown in Fig. 7 may be used to execute the neural network creation method according to the embodiment of the present disclosure shown in Fig. 1. As shown in Fig. 7, the neural network creation apparatus 70 includes a configuration module 701, a training module 702, and a creation module 703. Those skilled in the art will understand that these unit modules may be implemented in various ways, by hardware alone, by software alone, or by a combination thereof, and the present disclosure is not limited to any one of them.
The configuration module 701 is used for configuring multiple initial network units, each of which includes at least multiple network sub-units sharing the same input end and output end, and a weighting gating unit connected to the output ends of the multiple network sub-units, and for configuring an initial neural network that includes at least the multiple initial network units.
The training module 702 is used for training the initial neural network based on the target task of the neural network until a predetermined training termination condition is met.
The creation module 703 is used for, for each of the multiple initial network units, selecting one network sub-unit through the weighting gating unit and removing the other network sub-units and the weighting gating unit, to obtain the created neural network.
In one embodiment of the present disclosure, the weighting gating unit provides a corresponding weighting gating coefficient for each of the multiple network sub-units. The training module 702 adjusts the weighting gating coefficients, and the creation module 703 retains the network sub-unit whose corresponding weighting gating coefficient is the largest.
In one embodiment of the present disclosure, the network sub-unit satisfies one or more of the following conditions: the difference between the numbers of input and output channels of the network sub-unit is less than a preset threshold; the network sub-unit performs depthwise separable convolution; the network sub-unit performs pooling; the network sub-unit performs neuron activation.
In one embodiment of the present disclosure, the neural network is a second-generation channel shuffle network, and the network sub-unit satisfies the preset rules of the second-generation channel shuffle network.
Fig. 8 is a hardware block diagram illustrating an electronic device according to an embodiment of the present disclosure. As shown in Fig. 8, the electronic device 80 according to the embodiment of the present disclosure includes a memory 801 and a processor 802. The components in the electronic device 80 are interconnected through a bus system and/or other forms of connection mechanisms (not shown).
The memory 801 is used for storing computer-readable instructions. Specifically, the memory 801 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, random access memory (RAM) and/or cache memory (cache). The non-volatile memory may include, for example, read-only memory (ROM), hard disks, and flash memory.
The processor 802 may be a central processing unit (CPU), a graphics processing unit (GPU), or another form of processing unit with data processing capability and/or instruction execution capability, and may control the other components in the electronic device 80 to perform desired functions. In one embodiment of the present disclosure, the processor 802 is used for running the computer-readable instructions stored in the memory 801, so that the electronic device 80 executes the neural network creation method described with reference to Fig. 1 or the image processing method described with reference to Fig. 5.
Additionally, it should be appreciated that the components and structure of the electronic device 80 shown in Fig. 8 are only exemplary rather than limiting; as needed, the electronic device 80 may also have other components and structures, for example an image acquisition device and an output device (not shown). The image acquisition device may be used to acquire the images to be processed for image processing and store the captured images in the memory 801 for use by other components. Of course, other image capture equipment may also be used to acquire the images to be processed and send them to the electronic device 80, which may store the received images in the memory 801. The output device may output various information, such as image information and image processing results, to the outside (for example, to a user), and may include one or more of a display, a loudspeaker, a projector, a network card, and the like.
Fig. 9 is a schematic diagram illustrating a computer-readable storage medium according to an embodiment of the present disclosure. As shown in Fig. 9, the computer-readable storage medium 900 according to the embodiment of the present disclosure has computer-readable instructions 901 stored thereon. When the computer-readable instructions 901 are run by a processor, the neural network creation method described with reference to Fig. 1 or the image processing method described with reference to Fig. 5 is executed.
The neural network creation method, the creation apparatus, the image processing method that processes an input image with a neural network, the electronic device, and the computer-readable storage medium according to embodiments of the present disclosure have been described above with reference to the accompanying drawings. Under the general design principles of highly efficient neural networks, the neural network creation method according to embodiments of the present disclosure makes full use of target-task data and automatically selects a neural network model structure that suits the target-task data and satisfies those design principles. The neural network creation method according to embodiments of the present disclosure therefore avoids the limitations of hand-designed models while retaining execution efficiency.
The terms "first", "second", "third", and the like in the specification, claims, and drawings of the present disclosure are used to distinguish different objects, not to describe a particular order.
Those of ordinary skill in the art may be aware that the units and algorithm steps described in connection with the embodiments disclosed herein can be realized with electronic hardware or with a combination of computer software and electronic hardware. Whether these functions are implemented in hardware or in software depends on the specific application and the design constraints of the technical solution. Skilled practitioners may use different methods to achieve the described functions for each specific application, but such realization should not be considered to exceed the scope of the present invention.
The basic principles of the present disclosure have been described above in conjunction with specific embodiments. However, it should be pointed out that the advantages, merits, effects, and the like mentioned in the present disclosure are only exemplary rather than limiting, and it must not be assumed that these advantages, merits, and effects are prerequisites for each embodiment of the present disclosure. In addition, the specific details disclosed above are provided merely for the purposes of illustration and ease of understanding, rather than limitation, and the above details do not limit the present disclosure to being realized only with those specific details.
The block diagrams of devices, apparatuses, equipment, and systems involved in the present disclosure are merely illustrative examples and are not intended to require or imply that they must be connected, arranged, or configured in the manner shown in the blocks. As those skilled in the art will recognize, these devices, apparatuses, equipment, and systems may be connected, arranged, or configured in any manner. Words such as "include", "comprise", and "have" are open-ended terms meaning "including but not limited to" and may be used interchangeably therewith. The words "or" and "and" as used herein mean "and/or" and may be used interchangeably therewith, unless the context clearly indicates otherwise. The phrase "such as" used herein means "such as, but not limited to" and may be used interchangeably therewith.
In addition, as used herein, "or" in a list of items beginning with "at least one of" indicates a disjunctive list, such that a list of "at least one of A, B, or C" means A or B or C, or AB or AC or BC, or ABC (i.e., A and B and C). Furthermore, the word "exemplary" does not mean that the described example is preferred or better than other examples.
It should also be noted that, in the systems and methods of the present disclosure, components or steps may be decomposed and/or recombined. Such decompositions and/or recombinations should be regarded as equivalent solutions of the present disclosure.
Various changes, substitutions, and alterations to the techniques described herein may be made without departing from the techniques taught by the appended claims. Furthermore, the scope of the claims of the present disclosure is not limited to the specific aspects of the processes, machines, manufacture, compositions of matter, means, methods, and acts described above. Processes, machines, manufacture, compositions of matter, means, methods, or acts, presently existing or later developed, that perform substantially the same function or achieve substantially the same result as the corresponding aspects described herein may be utilized. Accordingly, the appended claims include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or acts.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the disclosure. Thus, the present disclosure is not intended to be limited to the aspects shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, this description is not intended to limit the embodiments of the disclosure to the forms disclosed herein. Although multiple exemplary aspects and embodiments have been discussed above, those skilled in the art will recognize certain variations, modifications, changes, additions, and sub-combinations thereof.

Claims (13)

1. A neural network creation method, comprising:
configuring a plurality of initial network units, each of the plurality of initial network units comprising at least a plurality of network subelements that share the same input terminal and output terminal, and a weighting gating unit connected to the output terminals of the plurality of network subelements;
configuring an initial neural network, the initial neural network comprising at least the plurality of initial network units;
training the initial neural network based on a target task of the neural network until a predetermined training termination condition is met; and
for each of the plurality of initial network units, selecting one network subelement through the weighting gating unit and removing the other network subelements and the weighting gating unit, so as to obtain the created neural network.
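The scheme of claim 1 can be sketched in a few lines of pure Python. This is a hypothetical illustration only (class and function names such as `MixedUnit` are not from the patent): each "initial network unit" holds several candidate network subelements sharing one input and one output, plus a weighting gating unit that mixes their outputs during training via learnable gating coefficients, after which a single subelement survives.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

class MixedUnit:
    """One 'initial network unit': parallel candidate subelements + a gate."""

    def __init__(self, subelements):
        self.subelements = subelements        # candidate ops, same in/out shape
        self.gate = [0.0] * len(subelements)  # weighting gating coefficients d_i

    def forward(self, x):
        # Training-time behavior: a softmax-weighted mix of all candidates.
        weights = softmax(self.gate)
        return sum(w * op(x) for w, op in zip(weights, self.subelements))

    def finalize(self):
        # After training, keep only the subelement with the largest gating
        # coefficient; the gate and the other candidates are discarded.
        best = max(range(len(self.gate)), key=lambda i: self.gate[i])
        return self.subelements[best]
```

For example, with candidates `[lambda x: x + 1, lambda x: 2 * x]` and trained gate `[3.0, 1.0]`, `finalize()` returns the first candidate, and the whole mixed unit collapses to that single operation in the created network.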
2. The neural network creation method of claim 1, wherein the weighting gating unit provides a corresponding weighting gating coefficient for each of the plurality of network subelements,
wherein training the initial neural network comprises:
adjusting the weighting gating coefficients, and
wherein selecting one network subelement through the weighting gating unit comprises:
retaining the network subelement whose corresponding weighting gating coefficient is the largest.
3. The neural network creation method of claim 1 or 2, wherein the one network subelement satisfies one or more of the following conditions:
the difference between the number of input channels and the number of output channels of the one network subelement is less than a preset threshold;
the one network subelement performs depthwise separable convolution;
the one network subelement performs pooling;
the one network subelement performs neuron activation.
4. The neural network creation method of claim 1 or 2, wherein the neural network is a second-generation channel shuffle network (ShuffleNet V2), and the one network subelement satisfies the preset rules of the second-generation channel shuffle network.
5. The neural network creation method of claim 2, wherein selecting one network subelement through the weighting gating unit is expressed as:
wherein B_i denotes the output value of each network subelement among the plurality of network subelements, d_i denotes the weighting gating coefficient of each network subelement, T denotes a temperature parameter, i is a natural number from 1 to n, and n is the number of the plurality of network subelements,
and wherein training the initial neural network further comprises: gradually increasing T as training proceeds.
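The formula referenced in claim 5 is an image in the original publication and is not reproduced in this text. Given only the definitions that follow it — B_i as subelement outputs, d_i as gating coefficients, and a temperature T that is gradually increased during training — one plausible reading, offered purely as an assumption and not as the patent's literal equation, is a temperature-scaled softmax mixture that sharpens toward the largest-coefficient subelement as T grows:

```python
import math

def gated_output(B, d, T):
    """Mix subelement outputs B with softmax(T * d) weights (assumed form,
    not the patent's literal formula). B and d are equal-length lists."""
    m = max(T * di for di in d)                    # shift for numerical stability
    exps = [math.exp(T * di - m) for di in d]
    s = sum(exps)
    return sum(b * e / s for b, e in zip(B, exps))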
6. A neural network creation apparatus, comprising:
a configuration module, configured to configure a plurality of initial network units, each of the plurality of initial network units comprising at least a plurality of network subelements that share the same input terminal and output terminal, and a weighting gating unit connected to the output terminals of the plurality of network subelements; and to configure an initial neural network, the initial neural network comprising at least the plurality of initial network units;
a training module, configured to train the initial neural network based on a target task of the neural network until a predetermined training termination condition is met; and
a creation module, configured to, for each of the plurality of initial network units, select one network subelement through the weighting gating unit and remove the other network subelements and the weighting gating unit, so as to obtain the created neural network.
7. The neural network creation apparatus of claim 6, wherein the weighting gating unit provides a corresponding weighting gating coefficient for each of the plurality of network subelements,
wherein the training module adjusts the weighting gating coefficients, and
wherein the creation module retains the network subelement whose corresponding weighting gating coefficient is the largest.
8. The neural network creation apparatus of claim 6 or 7, wherein the one network subelement satisfies one or more of the following conditions:
the difference between the number of input channels and the number of output channels of the one network subelement is less than a preset threshold;
the one network subelement performs depthwise separable convolution;
the one network subelement performs pooling;
the one network subelement performs neuron activation.
9. The neural network creation apparatus of claim 6 or 7, wherein the neural network is a second-generation channel shuffle network (ShuffleNet V2), and the one network subelement satisfies the preset rules of the second-generation channel shuffle network.
10. The neural network creation apparatus of claim 7, wherein selecting one network subelement through the weighting gating unit is expressed as:
wherein B_i denotes the output value of each network subelement among the plurality of network subelements, d_i denotes the weighting gating coefficient of each network subelement, T denotes a temperature parameter, i is a natural number from 1 to n, and n is the number of the plurality of network subelements,
and wherein the creation module gradually increases T as training proceeds.
11. An image processing method that uses a neural network to process an input image, the neural network comprising an input layer, an intermediate layer, and an output layer, the image processing method comprising:
creating the neural network;
receiving the input image via the input layer;
extracting image features of the input image via the intermediate layer; and
outputting a processing result for the input image via the output layer,
wherein creating the neural network comprises: creating the neural network using the neural network creation method of any one of claims 1 to 5.
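The pipeline of claim 11 can be sketched as follows, assuming the layers are plain callables (the parameter names are placeholders, and the network itself would first be created by the method of claims 1 to 5):

```python
def process_image(image, input_layer, middle_layers, output_layer):
    """Claim-11 pipeline sketch: input layer -> middle layers -> output layer."""
    x = input_layer(image)       # receive the input image via the input layer
    for layer in middle_layers:  # extract image features via the middle layer(s)
        x = layer(x)
    return output_layer(x)       # output the processing result
```

For instance, `process_image(3, lambda x: x, [lambda x: x * 2], lambda x: x + 1)` threads the value through each stage in order.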
12. An electronic device, comprising:
a memory configured to store computer-readable instructions; and
a processor configured to run the computer-readable instructions to perform the neural network creation method of any one of claims 1 to 5, or the image processing method of claim 11.
13. A computer-readable storage medium storing computer-readable instructions which, when executed by a computer, cause the computer to perform the neural network creation method of any one of claims 1 to 5, or the image processing method of claim 11.
CN201811591916.3A 2018-12-25 2018-12-25 Neural network creation method and device, image processing method and electronic equipment Pending CN109657784A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811591916.3A CN109657784A (en) 2018-12-25 2018-12-25 Neural network creation method and device, image processing method and electronic equipment

Publications (1)

Publication Number Publication Date
CN109657784A true CN109657784A (en) 2019-04-19

Family

ID=66116240

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811591916.3A Pending CN109657784A (en) 2018-12-25 2018-12-25 Neural network creation method and device, image processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN109657784A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110414570A (en) * 2019-07-04 2019-11-05 北京迈格威科技有限公司 Image classification model generating method, device, equipment and storage medium
CN110517180A (en) * 2019-07-24 2019-11-29 北京旷视科技有限公司 Image processing method, device and electronic equipment based on high-precision neural network
CN111523657A (en) * 2020-04-26 2020-08-11 云知声智能科技股份有限公司 Neural network accelerator creating method and device, electronic device and storage medium
CN113011384A (en) * 2021-04-12 2021-06-22 重庆邮电大学 Anchor-frame-free target detection method based on lightweight convolution

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1100541A (en) * 1993-06-14 1995-03-22 莫托罗拉公司 Neural network and method of using same
CN1150847A (en) * 1994-05-02 1997-05-28 摩托罗拉公司 Computer utilizing neural network and method of using same
US8665697B1 (en) * 2009-12-23 2014-03-04 Kbc Research Foundation Pvt. Ltd. Subchannel formation in OFDMA systems
CN108108768A (en) * 2017-12-29 2018-06-01 清华大学 Photovoltaic glass defect classification method and device based on convolutional neural networks
CN108288075A (en) * 2018-02-02 2018-07-17 沈阳工业大学 A kind of lightweight small target detecting method improving SSD
CN108734712A (en) * 2017-04-18 2018-11-02 北京旷视科技有限公司 The method, apparatus and computer storage media of background segment
CN108875752A (en) * 2018-03-21 2018-11-23 北京迈格威科技有限公司 Image processing method and device, computer readable storage medium
CN108875904A (en) * 2018-04-04 2018-11-23 北京迈格威科技有限公司 Image processing method, image processing apparatus and computer readable storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
KOHEI YAMAMOTO等: "PCAS: Pruning Channels with Attention Statistics for Deep Network Compression", 《ARXIV:1806.05382V2》 *
NINGNING MA等: "ShuffleNet V2: Practical Guidelines for Efficient CNN Architecture Design", 《ECCV 2018》 *
KANG KAI: "Design and Implementation of a Surveillance-Video Face Detection System Based on Normalized Pixel Difference Features and Deep Learning", 《China Master's Theses Full-text Database (Information Science and Technology)》 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110414570A (en) * 2019-07-04 2019-11-05 北京迈格威科技有限公司 Image classification model generating method, device, equipment and storage medium
CN110414570B (en) * 2019-07-04 2022-01-28 北京迈格威科技有限公司 Image classification model generation method, device, equipment and storage medium
CN110517180A (en) * 2019-07-24 2019-11-29 北京旷视科技有限公司 Image processing method, device and electronic equipment based on high-precision neural network
CN110517180B (en) * 2019-07-24 2023-09-19 北京旷视科技有限公司 Image processing method and device based on high-precision neural network and electronic equipment
CN111523657A (en) * 2020-04-26 2020-08-11 云知声智能科技股份有限公司 Neural network accelerator creating method and device, electronic device and storage medium
CN111523657B (en) * 2020-04-26 2023-06-20 云知声智能科技股份有限公司 Neural network accelerator creation method and device, electronic equipment and storage medium
CN113011384A (en) * 2021-04-12 2021-06-22 重庆邮电大学 Anchor-frame-free target detection method based on lightweight convolution
CN113011384B (en) * 2021-04-12 2022-11-25 重庆邮电大学 Anchor-frame-free target detection method based on lightweight convolution

Similar Documents

Publication Publication Date Title
CN109992779B (en) Emotion analysis method, device, equipment and storage medium based on CNN
CN109657784A (en) Neural network creation method and device, image processing method and electronic equipment
Gavai et al. MobileNets for flower classification using TensorFlow
CN105981008B (en) Learn depth face representation
CN108875904A (en) Image processing method, image processing apparatus and computer readable storage medium
CN108875752A (en) Image processing method and device, computer readable storage medium
JP7376731B2 (en) Image recognition model generation method, device, computer equipment and storage medium
CN111489412A (en) Semantic image synthesis for generating substantially realistic images using neural networks
CN107844784A (en) Face identification method, device, computer equipment and readable storage medium storing program for executing
CN108140032A (en) Automatic video frequency is summarized
CN108416065A (en) Image based on level neural network-sentence description generates system and method
EP3114540A1 (en) Neural network and method of neural network training
CN111127390B (en) X-ray image processing method and system based on transfer learning
JP7257756B2 (en) Image identification device, image identification method, learning device, and neural network
CN109902192B (en) Remote sensing image retrieval method, system, equipment and medium based on unsupervised depth regression
CN108090498A (en) A kind of fiber recognition method and device based on deep learning
Wu et al. Centroid transformers: Learning to abstract with attention
CN109271516A (en) Entity type classification method and system in a kind of knowledge mapping
CN111079833A (en) Image recognition method, image recognition device and computer-readable storage medium
DE102022106057A1 (en) AUTHENTICATOR-INTEGRATED GENERATIVE ADVERSARIAL NETWORK (GAN) FOR SECURE DEEPFAKE GENERATION
CN111860528A (en) Image segmentation model based on improved U-Net network and training method
CN109711136A (en) Store equipment, identifying code Picture Generation Method and device
CN111340179A (en) Convolutional neural network topology method
Liu Human face expression recognition based on deep learning-deep convolutional neural network
Nguyen et al. Reinforcement learning based navigation with semantic knowledge of indoor environments

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190419