CN106886764A - Panic degree calculation method and device based on deep learning - Google Patents
Panic degree calculation method and device based on deep learning
- Publication number
- CN106886764A CN106886764A CN201710096873.0A CN201710096873A CN106886764A CN 106886764 A CN106886764 A CN 106886764A CN 201710096873 A CN201710096873 A CN 201710096873A CN 106886764 A CN106886764 A CN 106886764A
- Authority
- CN
- China
- Prior art keywords
- panic
- image
- degree
- entropy
- crowd
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G06V20/53—Recognition of crowd images, e.g. recognition of crowd congestion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
Abstract
The invention provides a panic degree calculation method and device based on deep learning. The method includes: obtaining a normal crowd image set and a panicked crowd image set, the normal crowd image set including multiple normal crowd images and the panicked crowd image set including multiple panicked crowd images; processing the normal crowd image set and the panicked crowd image set respectively to obtain a normal crowd panic degree and a panicked crowd panic degree; establishing panic degree grades according to the normal crowd panic degree and the panicked crowd panic degree; and processing each normal crowd image, each panicked crowd image and the panic degree grades with a convolutional neural network model to obtain a panic degree calculation network. The invention also discloses a corresponding panic degree calculation device.
Description
Technical field
The present invention relates to a panic degree calculation method and device based on deep learning.
Background technology
At present, conventional machine learning methods require manually specified image features that describe crowd panic; a model that detects whether panic appears in the current image is then trained with a machine learning method, and crowd panic detection is finally performed with that model. Such methods only detect whether panic occurs in the current image, without measuring the grade of the panic, so panic can only be detected after it has already broken out. Moreover, conventional machine learning relies on hand-crafted features, which are often a poor match for the specific task and are time-consuming to extract, making real-time operation difficult to achieve. Judging the severity of crowd panic by personal experience is highly subjective, the evaluation results carry bias, and it is difficult to establish a unified judgment criterion.
Content of the invention
In view of this, an object of the present invention is to provide a panic degree calculation method and device based on deep learning that seek to solve, or at least alleviate, the problems above.
In a first aspect, the present application provides a panic degree calculation method based on deep learning, including:
obtaining a normal crowd image set and a panicked crowd image set, the normal crowd image set including multiple normal crowd images and the panicked crowd image set including multiple panicked crowd images;
processing the normal crowd image set and the panicked crowd image set respectively to obtain a normal crowd panic degree and a panicked crowd panic degree;
establishing panic degree grades according to the normal crowd panic degree and the panicked crowd panic degree;
processing each normal crowd image, each panicked crowd image and the panic degree grades with a convolutional neural network model to obtain a panic degree calculation network.
Optionally, in the method according to the invention, processing the normal crowd image set and the panicked crowd image set respectively to obtain the normal crowd panic degree and the panicked crowd panic degree includes:
calculating the motion entropy, speed entropy and direction entropy of each normal crowd image and of each panicked crowd image;
processing the motion entropy, speed entropy and direction entropy of each normal crowd image to obtain the normal crowd panic degree;
processing the motion entropy, speed entropy and direction entropy of each panicked crowd image to obtain the panicked crowd panic degree.
Optionally, in the method according to the invention, processing the motion entropy, speed entropy and direction entropy of each normal crowd image to obtain the normal crowd panic degree includes:
calculating the average of the motion entropy, speed entropy and direction entropy of each normal crowd image to obtain multiple normal image metric values;
calculating the mean of the multiple image metric values to obtain the normal crowd panic degree.
Optionally, in the method according to the invention, processing the motion entropy, speed entropy and direction entropy of each panicked crowd image to obtain the panicked crowd panic degree includes:
calculating the average of the motion entropy, speed entropy and direction entropy of each panicked crowd image to obtain multiple panic image metric values;
sorting all the panic image metric values and calculating the average of a threshold number of the panic image metric values to obtain the panicked crowd panic degree.
Optionally, in the method according to the invention, processing each normal crowd image, each panicked crowd image and the panic degree grades with the convolutional neural network model to obtain the panic degree calculation network includes:
establishing, with the convolutional neural network model, the mapping relationship between the color matrix of each normal crowd image, the color matrix of each panicked crowd image and the panic degree grades, to obtain the panic degree calculation network.
In a second aspect, the present application provides a panic degree computing device based on deep learning, including:
an image receiving unit for obtaining a normal crowd image set and a panicked crowd image set, the normal crowd image set including multiple normal crowd images and the panicked crowd image set including multiple panicked crowd images;
a first computing unit for processing the normal crowd image set and the panicked crowd image set respectively to obtain a normal crowd panic degree and a panicked crowd panic degree;
a grade determination unit for establishing panic degree grades according to the normal crowd panic degree and the panicked crowd panic degree;
a second computing unit for processing each normal crowd image, each panicked crowd image and the panic degree grades with a convolutional neural network model to obtain a panic degree calculation network.
Optionally, in the device according to the invention, the first computing unit is further configured to:
calculate the motion entropy, speed entropy and direction entropy of each normal crowd image and of each panicked crowd image;
process the motion entropy, speed entropy and direction entropy of each normal crowd image to obtain the normal crowd panic degree;
process the motion entropy, speed entropy and direction entropy of each panicked crowd image to obtain the panicked crowd panic degree.
Optionally, in the device according to the invention, the first computing unit is further configured to:
calculate the average of the motion entropy, speed entropy and direction entropy of each normal crowd image to obtain multiple normal image metric values;
calculate the mean of the multiple image metric values to obtain the normal crowd panic degree.
Optionally, in the device according to the invention, the first computing unit is further configured to:
calculate the average of the motion entropy, speed entropy and direction entropy of each panicked crowd image to obtain multiple panic image metric values;
sort all the panic image metric values and calculate the average of a threshold number of the panic image metric values to obtain the panicked crowd panic degree.
Optionally, in the device according to the invention, the second computing unit is further configured to:
establish, with the convolutional neural network model, the mapping relationship between the color matrix of each normal crowd image, the color matrix of each panicked crowd image and the panic degree grades, to obtain the panic degree calculation network.
According to the technical scheme of the present invention, deep learning is used to learn the mapping relationship between crowd images and their panic degrees, and a panic degree calculation network is established. The panic degree of an image can thus be obtained automatically and quickly, the detection is fast enough for real-time use, and, since the panic degree is available even when no panic has occurred, crowd panic can be prevented according to the measured panic degree.
To make the above objects, features and advantages of the present invention more apparent, preferred embodiments are described in detail below in conjunction with the appended drawings.
Brief description of the drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings used in the embodiments are briefly described below. It should be understood that the following drawings show only certain embodiments of the present invention and are not to be regarded as limiting its scope; those of ordinary skill in the art can derive other related drawings from them without creative effort.
Fig. 1 shows the flow chart of a panic degree calculation method provided by an embodiment of the present invention;
Fig. 2 shows the flow chart of a panic degree grade determination method provided by an embodiment of the present invention;
Fig. 3 shows the structure of a panic degree computing device provided by an embodiment of the present invention.
Specific embodiment
To make the purpose, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described completely below in conjunction with the accompanying drawings. The described embodiments are obviously only a part of the embodiments of the invention, not all of them. The components of the embodiments of the present invention, as generally described and illustrated in the drawings here, may be arranged and designed in a variety of configurations. The following detailed description of the embodiments provided in the drawings is therefore not intended to limit the scope of the claimed invention, but merely represents selected embodiments of the invention; all other embodiments obtained by those skilled in the art on the basis of these embodiments without creative effort fall within the scope of protection of the present invention.
Fig. 1 shows the flow chart of a panic degree calculation method provided according to an embodiment of the present invention. As shown in Fig. 1, the method starts at step S110.
In step S110, a normal crowd image set and a panicked crowd image set are obtained. The normal crowd image set includes multiple normal crowd images and the panicked crowd image set includes multiple panicked crowd images. Both sets are obtained by manual screening: for example, each normal crowd image is a frame captured from a video of a normal crowd, and each panicked crowd image is a frame captured from a video of a panicked crowd. No limitation is placed here on the number of images in either set.
In step S120, the normal crowd image set and the panicked crowd image set are processed respectively to obtain a normal crowd panic degree and a panicked crowd panic degree.
In one embodiment, the motion entropy, speed entropy and direction entropy of each normal crowd image and of each panicked crowd image are calculated; the motion entropy, speed entropy and direction entropy of each normal crowd image are processed to obtain the normal crowd panic degree, and the motion entropy, speed entropy and direction entropy of each panicked crowd image are processed to obtain the panicked crowd panic degree.
In one embodiment, the average of the motion entropy, speed entropy and direction entropy of each normal crowd image is calculated to obtain multiple normal image metric values, and the mean of these metric values gives the normal crowd panic degree.
Referring to Fig. 2, in one embodiment each normal crowd image is first converted to grayscale, and the motion entropy, speed entropy and direction entropy of each grayscale image are calculated. Many prior-art methods exist for converting an image to grayscale, and they are not detailed here; it should be understood that any method of converting an image to grayscale falls within the scope of protection of the present invention.
The motion entropy is calculated as follows:
the number of occurrences of each gray value in the grayscale normal crowd image is counted, and the ratio of each gray value's count to the total number of pixels in the image is calculated; this ratio is taken as the probability of the gray value, denoted s_x. The formula is:

H_s = -Σ_{x=0}^{255} s_x · log(s_x)

where H_s is the motion entropy of the normal crowd image, and s_x is the ratio of the number of pixels with gray value x to the total number of pixels in the image, x ∈ [0, 255].
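As a minimal sketch of the gray-level (motion) entropy above — the function name `motion_entropy`, the use of NumPy, and the base-2 logarithm are illustrative assumptions, since the patent does not fix a log base or an implementation:

```python
import numpy as np

def motion_entropy(gray_image: np.ndarray) -> float:
    """Shannon entropy H_s over the 256 gray levels of a grayscale image."""
    counts = np.bincount(gray_image.ravel().astype(np.int64), minlength=256)
    s = counts / counts.sum()        # s_x: share of pixels with gray value x
    s = s[s > 0]                     # terms with s_x = 0 contribute nothing
    return float(-np.sum(s * np.log2(s)))
```

A constant image yields entropy 0, while an image split evenly between two gray values yields exactly 1 bit.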
The speed entropy is calculated as follows:
two consecutive frames A and B are selected from the normal crowd images, and the distance each person moves in the current coordinate system between A and B is calibrated manually. The calibrated minimum and maximum distances form the movement-distance interval, which is divided evenly into 8 sub-intervals. For each sub-interval, the ratio of the number of pixels in image A belonging to people moving at that speed to the total number of pixels is calculated and taken as the probability of that speed, denoted v_y. The formula is:

H_v = -Σ_{y=1}^{8} v_y · log(v_y)

where H_v is the speed entropy of the normal crowd image, and v_y is the ratio of the number of pixels of people in sub-interval y to the total number of pixels, y ∈ [1, 8].
The direction entropy is calculated as follows:
two consecutive frames A and B are selected from the normal crowd images, and the 360-degree plane of the image is divided into 8 sectors of 45 degrees each. The direction in which each person moves in the current coordinate system between A and B is calibrated manually, and for each sector the ratio of the number of pixels belonging to people moving in that sector to the total number of pixels is calculated and taken as the probability of that direction, denoted o_z. The formula is:

H_o = -Σ_{z=1}^{8} o_z · log(o_z)

where H_o is the direction entropy of the normal crowd image, and o_z is the ratio of the number of pixels of people in the z-th sector to the total number of pixels, z ∈ [1, 8].
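The speed entropy and direction entropy share the same form: an entropy over 8 bin probabilities. A hedged sketch, with `binned_entropy` and the base-2 logarithm as illustrative choices not specified by the patent:

```python
import numpy as np

def binned_entropy(pixel_counts) -> float:
    """Entropy over the 8 bins (speed sub-intervals or 45-degree sectors).

    pixel_counts[i] is the number of person pixels assigned to bin i;
    the normalised counts play the role of v_y or o_z.
    """
    p = np.asarray(pixel_counts, dtype=float)
    p = p / p.sum()                  # v_y or o_z: share of person pixels per bin
    p = p[p > 0]                     # empty bins contribute nothing
    return float(-np.sum(p * np.log2(p)))
```

Eight equally populated bins give the maximum of 3 bits; all motion in a single bin gives 0.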
After the motion entropy, speed entropy and direction entropy of all normal crowd images have been obtained, the average h_1 of the three entropies is calculated for each normal crowd image, and then the mean H_n of the h_1 values of all normal crowd images is calculated. H_n represents the panic value of a crowd in which no panic occurs.
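The computation of H_n can be sketched as follows; the function name `normal_panic_degree` is an illustrative assumption:

```python
import numpy as np

def normal_panic_degree(entropy_triples) -> float:
    """Mean over all normal crowd images of h_1, the per-image average
    of (motion entropy H_s, speed entropy H_v, direction entropy H_o)."""
    h1 = [float(np.mean(t)) for t in entropy_triples]   # one h_1 per image
    return float(np.mean(h1))                           # H_n
```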
In one embodiment, the average of the motion entropy, speed entropy and direction entropy of each panicked crowd image is calculated to obtain multiple panic image metric values; all the panic image metric values are sorted, and the average of a threshold number of them is calculated to obtain the panicked crowd panic degree.
The motion entropy, speed entropy and direction entropy of each panicked crowd image are calculated exactly as described above, and the process is not repeated here.
After the motion entropy, speed entropy and direction entropy of all panicked crowd images have been obtained, the average h_2 of the three entropies is calculated for each panicked crowd image, all h_2 values are sorted in descending order, and the mean of a threshold share of the data, for example the top 10%, is taken to represent the panic degree H_h of crowd panic at its most severe.
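The computation of H_h can be sketched as follows; the function name `panic_crowd_degree` and the default 10% threshold share are illustrative (the patent gives 10% only as an example):

```python
import numpy as np

def panic_crowd_degree(entropy_triples, top_fraction=0.10) -> float:
    """Mean of the largest top_fraction of h_2 values, where h_2 is the
    per-image average of the three entropies of a panicked crowd image."""
    h2 = sorted((float(np.mean(t)) for t in entropy_triples), reverse=True)
    k = max(1, int(round(top_fraction * len(h2))))   # threshold number of values
    return float(np.mean(h2[:k]))                    # H_h
```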
In step S130, panic degree grades are established according to the normal crowd panic degree and the panicked crowd panic degree.
In one embodiment, the interval [H_n, H_h] is taken as the panic degree interval and divided into 9 sub-intervals of equal length to obtain the panic degree grades. Each panic degree grade corresponds to one of the numbers 1-9; 0 represents a crowd in which no panic occurs, i.e. H_n, and 10 represents crowd panic at its most severe, i.e. H_h.
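The grading scheme above can be sketched as a small mapping function; `panic_grade` and the clamping behaviour at the interval ends are illustrative assumptions:

```python
def panic_grade(h: float, h_n: float, h_h: float) -> int:
    """Map a panic value h onto the 0-10 scale: 0 at H_n, 10 at H_h,
    and grades 1-9 over nine equal sub-intervals of [H_n, H_h]."""
    if h <= h_n:
        return 0
    if h >= h_h:
        return 10
    width = (h_h - h_n) / 9.0                 # length of each sub-interval
    return min(9, 1 + int((h - h_n) / width))
```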
In step S140, each normal crowd image, each panicked crowd image and the panic degree grades are processed with a convolutional neural network model to obtain the panic degree calculation network.
In one embodiment, the convolutional neural network model is used to establish the mapping relationship between the color matrix of each normal crowd image, the color matrix of each panicked crowd image and the panic degree grades, yielding the panic degree calculation network.
To detect the panic degree grade of each image quickly and automatically, the convolutional neural network model is trained with the panicked crowd image set and the normal crowd image set. Each pixel of each image can be represented as a three-dimensional color vector (R, G, B), and the convolutional neural network model computes the mapping relationship between this three-dimensional representation of each image and the image's panic degree, thereby establishing the panic degree calculation network. The internals of the convolutional neural network model are not described in detail here, but it should be understood that any model that can be used to establish the panic degree calculation network falls within the scope of the present invention.
When the panic degree of an image needs to be predicted, the image is fed to the panic degree calculation network as input and its panic degree is obtained directly. The panic degree calculation network thus judges the panic degree of the current image quickly and automatically, achieving real-time detection.
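The (R, G, B) color-matrix representation described above can be sketched as simple training-pair preparation; the function name `make_training_pair`, the `float32` cast and the division by 255 are illustrative assumptions, and the CNN itself is omitted since the patent leaves the model unspecified:

```python
import numpy as np

def make_training_pair(image: np.ndarray, grade: int):
    """Pair an H x W x 3 color matrix with its panic degree grade label."""
    assert image.ndim == 3 and image.shape[2] == 3   # one (R, G, B) vector per pixel
    x = image.astype(np.float32) / 255.0             # scale to [0, 1] for the CNN
    return x, grade
```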
According to the technical scheme of the present invention, deep learning is used to learn the mapping relationship between crowd images and their panic degrees, and a panic degree calculation network is established. The panic degree of an image can thus be obtained automatically and quickly, the detection is fast enough for real-time use, and, since the panic degree is available even when no panic has occurred, crowd panic can be prevented according to the measured panic degree.
Fig. 3 shows the structure of a panic degree computing device provided according to an embodiment of the present invention. As shown in Fig. 3, the device includes an image receiving unit 310, a first computing unit 320, a grade determination unit 330 and a second computing unit 340.
The image receiving unit 310 obtains a normal crowd image set and a panicked crowd image set; the normal crowd image set includes multiple normal crowd images and the panicked crowd image set includes multiple panicked crowd images.
The first computing unit 320 processes the normal crowd image set and the panicked crowd image set respectively to obtain a normal crowd panic degree and a panicked crowd panic degree.
Optionally, the first computing unit 320 is further configured to calculate the motion entropy, speed entropy and direction entropy of each normal crowd image and of each panicked crowd image; to process the motion entropy, speed entropy and direction entropy of each normal crowd image to obtain the normal crowd panic degree; and to process the motion entropy, speed entropy and direction entropy of each panicked crowd image to obtain the panicked crowd panic degree.
Optionally, the first computing unit 320 is further configured to calculate the average of the motion entropy, speed entropy and direction entropy of each normal crowd image to obtain multiple normal image metric values, and to calculate the mean of the multiple image metric values to obtain the normal crowd panic degree.
The first computing unit 320 is further configured to calculate the average of the motion entropy, speed entropy and direction entropy of each panicked crowd image to obtain multiple panic image metric values, to sort all the panic image metric values, and to calculate the average of a threshold number of the panic image metric values to obtain the panicked crowd panic degree.
The grade determination unit 330 establishes panic degree grades according to the normal crowd panic degree and the panicked crowd panic degree.
The second computing unit 340 processes each normal crowd image, each panicked crowd image and the panic degree grades with a convolutional neural network model to obtain the panic degree calculation network.
Optionally, the second computing unit 340 is further configured to establish, with the convolutional neural network model, the mapping relationship between the color matrix of each normal crowd image, the color matrix of each panicked crowd image and the panic degree grades, to obtain the panic degree calculation network.
The calculation of the motion entropy, direction entropy and speed entropy has been described in detail above and is not repeated here.
Numerous specific details are set forth in the specification provided here. It should be understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail so as not to obscure the understanding of this description.
Similarly, it should be appreciated that, to streamline the disclosure and aid the understanding of one or more of the various inventive aspects, the features of the invention are sometimes grouped together in a single embodiment, figure or description thereof in the description of the exemplary embodiments above. The method of the disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. The claims following the specific embodiments are hereby expressly incorporated into the specific embodiments, with each claim standing on its own as a separate embodiment of the invention.
Those skilled in the art should understand that the modules, units or components of the devices in the examples disclosed here may be arranged in a device as described in the embodiment, or may alternatively be located in one or more devices different from the device in the example. The modules in the foregoing examples may be combined into one module or divided into multiple sub-modules.
Those skilled in the art will appreciate that the modules in the device of an embodiment may be adaptively changed and arranged in one or more devices different from that embodiment. The modules, units or components in an embodiment may be combined into one module, unit or component, and may furthermore be divided into multiple sub-modules, sub-units or sub-components. Except where at least some of such features and/or processes or units are mutually exclusive, all features disclosed in this specification (including the accompanying claims, abstract and drawings) and all processes or units of any method or device so disclosed may be combined in any combination. Unless expressly stated otherwise, each feature disclosed in this specification (including the accompanying claims, abstract and drawings) may be replaced by an alternative feature serving the same, an equivalent or a similar purpose.
Furthermore, those skilled in the art will appreciate that, although some embodiments described here include certain features included in other embodiments but not others, combinations of features of different embodiments are meant to be within the scope of the invention and to form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
Furthermore, some of the embodiments are described here as methods, or combinations of method elements, that can be implemented by a processor of a computer system or by other means of carrying out the function. A processor with the instructions necessary for implementing such a method or method element therefore forms a means for implementing the method or method element. Furthermore, an element of a device embodiment described here is an example of a means for carrying out the function performed by the element for the purpose of carrying out the invention.
As used here, unless otherwise specified, the use of the ordinals "first", "second", "third", etc. to describe a common object merely indicates that different instances of like objects are referred to, and is not intended to imply that the objects so described must be in a given order, whether temporally, spatially, in ranking or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having the benefit of the above description, will appreciate that other embodiments can be devised within the scope of the invention thus described. It should also be noted that the language used in this specification has been chosen principally for readability and instructional purposes, not to delineate or circumscribe the inventive subject matter. Many modifications and variations will therefore be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. With respect to the scope of the invention, the disclosure made here is illustrative rather than restrictive, and the scope of the invention is defined by the appended claims.
Claims (10)
1. A panic degree calculation method based on deep learning, characterized by including:
obtaining a normal crowd image set and a panicked crowd image set, the normal crowd image set including multiple normal crowd images and the panicked crowd image set including multiple panicked crowd images;
processing the normal crowd image set and the panicked crowd image set respectively to obtain a normal crowd panic degree and a panicked crowd panic degree;
establishing panic degree grades according to the normal crowd panic degree and the panicked crowd panic degree;
processing each normal crowd image, each panicked crowd image and the panic degree grades with a convolutional neural network model to obtain a panic degree calculation network.
2. The method as claimed in claim 1, characterized in that processing the normal crowd image set and the panicked crowd image set respectively to obtain the normal crowd panic degree and the panicked crowd panic degree includes:
calculating the motion entropy, speed entropy and direction entropy of each normal crowd image and of each panicked crowd image;
processing the motion entropy, speed entropy and direction entropy of each normal crowd image to obtain the normal crowd panic degree;
processing the motion entropy, speed entropy and direction entropy of each panicked crowd image to obtain the panicked crowd panic degree.
3. The method as claimed in claim 2, characterized in that processing the motion entropy, speed entropy and direction entropy of each normal crowd image to obtain the normal crowd panic degree includes:
calculating the average of the motion entropy, speed entropy and direction entropy of each normal crowd image to obtain multiple normal image metric values;
calculating the mean of the multiple image metric values to obtain the normal crowd panic degree.
4. The method as claimed in claim 2, characterized in that processing the motion entropy, speed entropy and direction entropy of each panicked crowd image to obtain the panicked crowd panic degree includes:
calculating the average of the motion entropy, speed entropy and direction entropy of each panicked crowd image to obtain multiple panic image metric values;
sorting all the panic image metric values and calculating the average of a threshold number of the panic image metric values to obtain the panicked crowd panic degree.
5. The method as claimed in claim 1, characterized in that processing each normal crowd image, each panicked crowd image and the panic degree grades with the convolutional neural network model to obtain the panic degree calculation network includes:
establishing, with the convolutional neural network model, the mapping relationship between the color matrix of each normal crowd image, the color matrix of each panicked crowd image and the panic degree grades, to obtain the panic degree calculation network.
6. A panic degree computing device based on deep learning, characterized by including:
an image receiving unit for obtaining a normal crowd image set and a panicked crowd image set, the normal crowd image set including multiple normal crowd images and the panicked crowd image set including multiple panicked crowd images;
a first computing unit for processing the normal crowd image set and the panicked crowd image set respectively to obtain a normal crowd panic degree and a panicked crowd panic degree;
a grade determination unit for establishing panic degree grades according to the normal crowd panic degree and the panicked crowd panic degree;
a second computing unit for processing each normal crowd image, each panicked crowd image and the panic degree grades with a convolutional neural network model to obtain a panic degree calculation network.
7. The device of claim 6, wherein the first calculation unit is further configured to:
calculate the motion entropy, speed entropy and direction entropy of each normal population image and of each panic crowd image;
process the motion entropy, speed entropy and direction entropy of each normal population image, to obtain the normal population panic degree; and
process the motion entropy, speed entropy and direction entropy of each panic crowd image, to obtain the panic crowd panic degree.
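The claim reuses the motion, speed and direction entropies without defining them here; a common construction for such measures is the Shannon entropy of a histogram of per-pixel optical-flow speeds (or directions). A sketch under that assumption, with the bin count and sample values invented for illustration:

```python
import math

def shannon_entropy(values, bins=8, lo=0.0, hi=1.0):
    """Shannon entropy (bits) of a histogram of `values` over [lo, hi)."""
    counts = [0] * bins
    for v in values:
        idx = min(int((v - lo) / (hi - lo) * bins), bins - 1)
        counts[idx] += 1
    total = len(values)
    return -sum((c / total) * math.log2(c / total) for c in counts if c)

# Hypothetical per-pixel speeds from an optical-flow field:
uniform = [i / 100 for i in range(100)]   # spread-out motion -> high entropy
peaked = [0.5] * 100                      # identical motion  -> zero entropy
print(shannon_entropy(uniform) > shannon_entropy(peaked))  # True
```

Intuitively, a panicking crowd produces more disordered speeds and directions, so its histograms are flatter and its entropies higher, which matches the patent's use of entropy as a panic indicator.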
8. The device of claim 7, wherein the first calculation unit is further configured to:
calculate the average of the motion entropy, the speed entropy and the direction entropy of each normal population image, to obtain a plurality of normal image metric values; and
calculate the mean of the plurality of image metric values, to obtain the normal population panic degree.
9. The device of claim 7, wherein the first calculation unit is further configured to:
calculate the average of the motion entropy, the speed entropy and the direction entropy of each panic crowd image, to obtain a plurality of panic image metric values; and
sort all of the panic image metric values, and calculate the mean of a threshold number of the panic image metric values, to obtain the panic crowd panic degree.
10. The device of claim 6, wherein the second calculation unit is further configured to:
use the convolutional neural network model to establish a mapping between the color matrix of each normal population image, the color matrix of each panic crowd image, and the panic degree grades, to obtain the panic degree calculation network.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710096873.0A CN106886764B (en) | 2017-02-22 | 2017-02-22 | Panic degree calculation method and device based on deep learning |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710096873.0A CN106886764B (en) | 2017-02-22 | 2017-02-22 | Panic degree calculation method and device based on deep learning |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106886764A true CN106886764A (en) | 2017-06-23 |
CN106886764B CN106886764B (en) | 2020-11-06 |
Family
ID=59179049
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710096873.0A Expired - Fee Related CN106886764B (en) | 2017-02-22 | 2017-02-22 | Panic degree calculation method and device based on deep learning |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106886764B (en) |
2017-02-22: application CN201710096873.0A filed in CN; granted as CN106886764B; status: not active (Expired - Fee Related).
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104077613A (en) * | 2014-07-16 | 2014-10-01 | 电子科技大学 | Crowd density estimation method based on cascaded multilevel convolution neural network |
CN105184815A (en) * | 2015-07-31 | 2015-12-23 | 江苏诚创信息技术研发有限公司 | Assembly event detection method and system |
CN105138982A (en) * | 2015-08-21 | 2015-12-09 | 中南大学 | Crowd abnormity detection and evaluation method based on multi-characteristic cluster and classification |
CN105447458A (en) * | 2015-11-17 | 2016-03-30 | 深圳市商汤科技有限公司 | Large scale crowd video analysis system and method thereof |
CN106022244A (en) * | 2016-05-16 | 2016-10-12 | 广东工业大学 | Unsupervised crowd abnormity monitoring and positioning method based on recurrent neural network modeling |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109408889A (en) * | 2018-09-21 | 2019-03-01 | 同济大学 | Macroscopical crowd panic measure and its application based on comentropy |
CN109408889B (en) * | 2018-09-21 | 2022-08-12 | 同济大学 | Macroscopic population panic measurement method based on information entropy and application thereof |
CN109472049A (en) * | 2018-09-29 | 2019-03-15 | 同济大学 | Macroscopical crowd panic Dynamical model method for building up and kinetic model application |
CN109472049B (en) * | 2018-09-29 | 2021-03-26 | 同济大学 | Macroscopic population panic propagation dynamics model establishment method and dynamics model application |
CN112150667A (en) * | 2020-09-22 | 2020-12-29 | 杭州海康威视数字技术股份有限公司 | Information processing method and device and information recorder |
Also Published As
Publication number | Publication date |
---|---|
CN106886764B (en) | 2020-11-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102891966B (en) | Focusing method and device for digital imaging device | |
CN112132023A (en) | Crowd counting method based on multi-scale context enhanced network | |
WO2019210555A1 (en) | People counting method and device based on deep neural network and storage medium | |
CN108053449A (en) | Three-dimensional rebuilding method, device and the binocular vision system of binocular vision system | |
JP2017525210A5 (en) | ||
CN109191512A (en) | The depth estimation method and device of binocular image, equipment, program and medium | |
CN106886764A (en) | A kind of panic degree computational methods and device based on deep learning | |
WO2014187223A1 (en) | Method and apparatus for identifying facial features | |
CN105118027A (en) | Image defogging method | |
CN111127435B (en) | No-reference image quality evaluation method based on double-current convolution neural network | |
Liu et al. | Image de-hazing from the perspective of noise filtering | |
CN109284735B (en) | Mouse feelings monitoring method, device and storage medium | |
CN110827269B (en) | Crop growth change condition detection method, device, equipment and medium | |
CN109889733B (en) | Automatic exposure compensation method, storage medium and computer | |
WO2017080410A1 (en) | Method and apparatus for identifying pupil in image | |
CN102663718A (en) | Method and system for deblurring of gloablly inconsistent image | |
CN108305250A (en) | The synchronous identification of unstructured robot vision detection machine components and localization method | |
US8953877B2 (en) | Noise estimation for images | |
US9330340B1 (en) | Noise estimation for images using polynomial relationship for pixel values of image features | |
CN112580585A (en) | Excavator target detection method and device based on stacked dense network | |
US20150187051A1 (en) | Method and apparatus for estimating image noise | |
CN110599532A (en) | Depth estimation model optimization and depth estimation processing method and device for image | |
CN112767385B (en) | No-reference image quality evaluation method based on significance strategy and feature fusion | |
CN111652297B (en) | Fault picture generation method for image detection model training | |
CN116844114A (en) | Helmet detection method and device based on YOLOv7-WFD model |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | PB01 | Publication | |
 | SE01 | Entry into force of request for substantive examination | |
 | GR01 | Patent grant | |
 | CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20201106 |