CN111709340B - Umbrella use behavior detection method and system - Google Patents

Umbrella use behavior detection method and system Download PDF

Info

Publication number
CN111709340B
Authority
CN
China
Prior art keywords
umbrella
target
information matrix
data
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010517896.6A
Other languages
Chinese (zh)
Other versions
CN111709340A (en)
Inventor
简梦雅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Yunshitong Internet Technology Co ltd
Original Assignee
Hangzhou Yunshitong Internet Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Yunshitong Internet Technology Co ltd filed Critical Hangzhou Yunshitong Internet Technology Co ltd
Priority to CN202010517896.6A priority Critical patent/CN111709340B/en
Publication of CN111709340A publication Critical patent/CN111709340A/en
Application granted granted Critical
Publication of CN111709340B publication Critical patent/CN111709340B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07Target detection

Abstract

The application relates to a method and a system for detecting umbrella use behavior. In the method, a target detection network local to the umbrella detection device is trained on original image data, so the network is specifically tuned to recognizing umbrella use behavior in images and its accuracy is greatly improved. An original image to be detected is acquired once every preset time period and passed through the target detection network, so whether umbrella use behavior occurred within that preset time period can be determined; detection is fast and accurate, and the misjudgment rate is reduced.

Description

Umbrella use behavior detection method and system
Technical Field
The present disclosure relates to the field of image recognition and detection technologies, and in particular, to a method and a system for detecting an umbrella use behavior.
Background
Airport safeguard work is of great significance for keeping airport flights running in order. Within airport safeguard work, the detection of foreign objects is a relatively important component. While an aircraft is in service, its engines can ingest nearby foreign objects and cause serious safety accidents. Because there is a risk that an umbrella will be ingested when an aircraft starts up, airport staff and passengers are strictly forbidden from using umbrellas such as rain umbrellas and parasols. Detecting umbrella use by airport staff and passengers makes it possible to discover such behavior in time and take measures to stop it, thereby safeguarding the operational safety of aircraft to a great extent.
Traditional detection of umbrella use behavior is generally carried out by manual inspection. This creates serious problems: detection efficiency is low and detection accuracy is low. On the one hand, relying on dedicated inspectors at the boarding gate easily leads to missed checks, so detection accuracy is low. On the other hand, manual inspection severely restricts passenger throughput and increases transit time, so detection efficiency is low.
Disclosure of Invention
Based on this, it is necessary to provide a method and a system for detecting umbrella use behavior that address the low detection efficiency and low detection accuracy of the conventional approach to detecting umbrella use.
The application provides a method for detecting umbrella use behaviors, which comprises the following steps:
acquiring original image data, and establishing an umbrella data set based on the original image data;
training a target detection network by using the umbrella data set;
obtaining an original image to be detected every preset time period, and inputting the original image to be detected into the target detection network;
operating the target detection network and outputting a detection result;
and acquiring the detection result, and judging whether the umbrella use behavior occurs in the preset time period according to the detection result.
The application also provides a detection system of umbrella use behavior, including:
the image acquisition device is used for acquiring an original image to be detected once every preset time period;
the umbrella detection device is in communication connection with the image acquisition device and is used for executing the method for detecting umbrella use behavior so as to detect offending targets in the original image to be detected;
and the server is in communication connection with the umbrella detection device and is used for storing the target information of the offending targets sent by the umbrella detection device.
In the method and system for detecting umbrella use behavior of the present application, a target detection network local to the umbrella detection device is trained on original image data, so the network is specifically tuned to recognizing umbrella use behavior in images and its accuracy is greatly improved. An original image to be detected is acquired once every preset time period and passed through the target detection network, so whether umbrella use behavior occurred within that preset time period can be determined; detection is fast and accurate, and the misjudgment rate is reduced.
Drawings
FIG. 1 is a flow chart of a method for detecting the use behavior of an umbrella according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of a detection system for umbrella use behavior according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
The application provides a method for detecting umbrella use behaviors.
It should be noted that the method for detecting umbrella use behavior provided by the application is not limited to any particular application field or scenario. Optionally, the method can be applied to airport safeguard work, and in particular to detecting whether boarding passengers are using umbrellas.
The method for detecting umbrella use behavior is likewise not limited to a particular execution subject. Optionally, the execution subject of the umbrella use behavior detection method provided in the present application may be an umbrella detection device 20. Specifically, the umbrella detection device 20 may be any terminal having a data processing function, such as a computer.
As shown in fig. 1, in an embodiment of the present application, the method for detecting the usage behavior of the umbrella includes the following steps:
s100, acquiring original image data, and establishing an umbrella data set based on the original image data.
Specifically, the original image data may include any image of any area to be detected. The area to be detected may be a boarding gate. Because the original image data is used to train the subsequent target detection network, it should be large in volume and wide in coverage.
S200, training a target detection network by using the umbrella data set.
Specifically, the umbrella data set includes a large amount of data related to umbrella use behavior. Based on the umbrella data set, the target detection network can be trained.
S300, obtaining an original image to be detected once every preset time period, and inputting the original image to be detected to the target detection network.
Specifically, the preset time period may be set in advance by the monitoring personnel responsible for umbrella detection. Optionally, the preset time period may be 1 second, so that the original image to be detected is acquired in near real time.
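For illustration only, a minimal sketch of this periodic acquisition is given below; the camera index, the one-second interval value, and the function detect_umbrella_use are assumptions and are not part of the present application:

    import time
    import cv2   # OpenCV, used here only as an illustrative capture backend

    PRESET_PERIOD_S = 1                 # assumed preset time period of 1 second
    cap = cv2.VideoCapture(0)           # hypothetical index of the image acquisition device

    while True:
        ok, frame = cap.read()          # one original image to be detected
        if ok:
            detect_umbrella_use(frame)  # hypothetical call into the target detection network
        time.sleep(PRESET_PERIOD_S)     # wait one preset time period before the next acquisition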
S400, operating the target detection network and outputting a detection result.
Specifically, the target detection network can identify and analyze the original image to be detected to obtain a detection result for umbrella use behavior.
S500, acquiring the detection result, and judging whether umbrella use behaviors occur in the preset time period according to the detection result.
Specifically, according to the detection result, monitoring personnel can promptly find persons using umbrellas in the area to be detected and stop the umbrella use behavior in time, effectively safeguarding aircraft safety.
In this embodiment, because the target detection network local to the umbrella detection device 20 is trained on the original image data, the network is specifically tuned to recognizing umbrella use behavior in images and its accuracy is greatly improved. An original image to be detected is acquired once every preset time period and passed through the target detection network, so whether umbrella use behavior occurred within that preset time period can be determined; detection is fast and accurate, and the misjudgment rate is reduced.
In an embodiment of the present application, the step S100 includes:
s110, acquiring original image data.
Specifically, the original image data may be acquired by the image acquisition device 10. The image acquisition device 10 may be any device having a photographing or video recording function, for example a monitoring camera. The image acquisition device 10 is communicatively connected to the umbrella detection device 20. After acquiring the original image data, the image acquisition device 10 transmits it to the umbrella detection device 20.
And S120, screening data information related to umbrella use behaviors in the original image data to serve as umbrella data.
Specifically, the original image data contains a wide variety of information, such as the population density of the area to be detected, its cleanliness, and the light brightness. In this step, the umbrella detection device 20 screens out only the data information related to umbrella use behavior.
S130, establishing an umbrella data set, and incorporating the umbrella data into the umbrella data set.
Optionally, the umbrella data set may be stored in a local database or uploaded to the server 30 as an original database, so as to implement a backup function.
In this embodiment, by screening the data information related to umbrella use behavior in the original image data and establishing the umbrella data set based on that information, the screening and collection of such data is achieved, providing an original data source for the subsequent training of the target detection network.
In an embodiment of the present application, the data information related to the usage behavior of the umbrella includes one or more of airport personnel data without using the umbrella, passenger data without using the umbrella, airport personnel data with using the umbrella, passenger data with using the umbrella, style data of the umbrella, and color data of the umbrella.
Specifically, considering that a passenger or airport staff member using an umbrella may have the upper body covered by the umbrella, so that the umbrella detection device 20 cannot identify and classify the person, the data information related to umbrella use behavior may also include personnel data for the case in which the umbrella covers the upper body, so as to avoid omissions in data screening.
The specific content of the data information related to umbrella use behavior in this embodiment reflects the comprehensiveness of the data screening.
In an embodiment of the present application, the step S200 includes:
and S210, marking the umbrella data in the umbrella data set with information.
In this step, the umbrella data in the umbrella data set are annotated with a labeling tool. Optionally, the umbrella data may be labeled with both umbrella and person labels. The main purpose of this step is to provide a training set for the target detection network, laying a data foundation for training a target detection network that can detect persons and umbrellas.
S220, selecting a feature extraction network and a detection network frame, and establishing a target detection network according to the feature extraction network and the detection network frame.
Specifically, a res101 (ResNet-101) network, which is fast and has a good feature extraction effect, may be selected as the feature extraction network, and an SSD network, which performs well as a one-stage detector, may be selected as the detection network frame.
And S230, training the target detection network by adopting a transfer learning training method according to the umbrella data marked with the information.
Of course, other methods for training the target detection network may be used in this step.
In this embodiment, the construction of the target detection network is achieved by selecting the feature extraction network and the detection network frame. The target detection network is trained by the transfer learning training method, so the detection accuracy of the trained target detection network is high.
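By way of a hedged illustration, a minimal transfer-learning sketch follows. The patent names a res101 feature extraction network with an SSD detection network frame; torchvision does not ship that exact pairing, so this stand-in fine-tunes the stock SSD300 detector (VGG16 backbone) on the annotated umbrella data, and the class ids and the data loader umbrella_loader are assumptions rather than part of the patent:

    import torch
    import torchvision

    # Assumed classes: 0 = background, 1 = person, 2 = umbrella.
    model = torchvision.models.detection.ssd300_vgg16(
        weights=None, weights_backbone="DEFAULT", num_classes=3)
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

    model.train()
    for images, targets in umbrella_loader:   # hypothetical DataLoader over the labelled umbrella data set
        # images: list of CHW float tensors; targets: list of dicts with
        # "boxes" (N x 4, x1 y1 x2 y2) and "labels" (N,) from the information labeling step
        loss_dict = model(images, targets)    # classification and box-regression losses
        loss = sum(loss_dict.values())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()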
In an embodiment of the present application, the step S400 includes:
s410, collecting target rectangular frames of all targets in the original image to be detected, and generating a target information matrix:
Obj = [ cls_1  x_1  y_1  w_1  h_1
        cls_2  x_2  y_2  w_2  h_2
        ...
        cls_n  x_n  y_n  w_n  h_n ]
wherein the target information matrix consists of n element rows and each element row consists of five elements; each target has one element row. Obj is the target information matrix; n is the sequence number of the target; cls_n is the type of the target; x_n is the abscissa of the upper-left corner coordinate point of the target rectangular frame of the target; y_n is the ordinate of the upper-left corner coordinate point of the target rectangular frame of the target; w_n is the length of the target rectangular frame of the target in the horizontal direction; h_n is the length of the target rectangular frame of the target in the vertical direction. The value of n is a natural number.
Specifically, the original image to be detected is different from the original image data. The original image data is the large set of original images used to train the target detection network, whereas the original image to be detected is an original image that the trained target detection network examines during actual application.
The target detection network can decompose the original image to be detected into a plurality of targets. For example, one passenger is a target; similarly, a flower pot is a target and a luggage case is a target. Each target has a target rectangular frame, and the information of the target rectangular frame can represent the overall information of the target. The target detection network collects the target rectangular frames of all targets in the original image to be detected and generates the target information matrix.
As shown in the matrix above, the target information matrix consists of n element rows. The value of n is a natural number, i.e., n = 0, 1, 2, 3, .... Each target has one element row.
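For illustration only, a minimal sketch of such a target information matrix is shown below as a list of element rows; the class names and coordinate values are invented for the example and are not taken from the patent:

    # Hypothetical example of the target information matrix Obj: one element row of
    # five elements (cls, x, y, w, h) per detected target.
    obj = [
        ["person",    412, 138,  85, 260],   # a passenger
        ["umbrella",  395,  90, 120,  60],   # an umbrella held above that passenger
        ["flowerpot",  20, 300,  40,  45],   # an unrelated object
    ]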
And S420, screening data information related to the umbrella use behaviors from the target information matrix, and generating an umbrella information matrix and a personnel information matrix.
Specifically, the umbrella information matrix includes the umbrella use information of each target. The personnel information matrix includes the personnel information of each target; it can be understood that, through the personnel information matrix, the identity of each target can be determined, for example whether the person is a passenger or airport staff, together with information such as name, age, registration number, and identity card number.
And S430, matching the umbrella information matrix with the personnel information matrix, removing element rows of the targets irrelevant to personnel information in the umbrella information matrix, and outputting the umbrella information matrix after the removal.
Specifically, this step removes targets that are unrelated to personnel information, for example an umbrella lying on the ground by itself. Such an umbrella is not being used by a person; although the target detection network detects it as a target, at this step it is interfering information and needs to be removed.
In this embodiment, the target rectangular frames of all targets in the original image to be detected are collected to generate the target information matrix, realizing the centralized extraction of the key information in the original image to be detected. By matching the umbrella information matrix with the personnel information matrix and removing the element rows of targets irrelevant to personnel information from the umbrella information matrix, interfering information in the umbrella information matrix is removed and the misjudgment rate is reduced.
In an embodiment of the present application, the step S420 includes the following steps:
s421, traversing all elements in the target information matrix to obtain the target type of each target.
Specifically, the target type cls_n of each target can be read from the corresponding element of its element row.
s422, creating an umbrella information matrix, and incorporating element rows of which the target type is that of an umbrella into the umbrella information matrix:
Um = [ x_u1  y_u1  w_u1  h_u1
       x_u2  y_u2  w_u2  h_u2
       ...
       x_uk  y_uk  w_uk  h_uk ]
wherein Um is the umbrella information matrix; k is the sequence number of a target whose target type is umbrella; x_uk is the abscissa of the upper-left corner coordinate point of the target rectangular frame of a target whose target type is umbrella; y_uk is the ordinate of the upper-left corner coordinate point of that target rectangular frame; w_uk is the length of that target rectangular frame in the horizontal direction; h_uk is the length of that target rectangular frame in the vertical direction. The value of k is a natural number.
In particular, the form of the umbrella information matrix is similar to that of the target information matrix, but each element row in the umbrella information matrix has only four elements. The umbrella information matrix contains information about whether to use the umbrella.
S423, creating a personnel information matrix, and incorporating element rows of the targets with the target types of personnel into the personnel information matrix:
Pe = [ x_p1  y_p1  w_p1  h_p1
       x_p2  y_p2  w_p2  h_p2
       ...
       x_ps  y_ps  w_ps  h_ps ]
wherein Pe is the personnel information matrix; s is the sequence number of a target whose target type is person; x_ps is the abscissa of the upper-left corner coordinate point of the target rectangular frame of a target whose target type is person; y_ps is the ordinate of the upper-left corner coordinate point of that target rectangular frame; w_ps is the length of that target rectangular frame in the horizontal direction; h_ps is the length of that target rectangular frame in the vertical direction. The value of s is a natural number.
In particular, the form of the personnel information matrix is the same as the form of the umbrella information matrix, except that the two matrices have different target types. The personnel information matrix contains information of personnel.
In this embodiment, the element rows in the target information matrix are classified and grouped according to their target types, which facilitates the subsequent matching of personnel information with umbrella use behavior.
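As a purely illustrative sketch, the element rows of the target information matrix could be split by target type into the umbrella information matrix and the personnel information matrix as follows; the variable names continue the hypothetical example above:

    # Split the example matrix obj by target type (step S420). Each resulting row keeps
    # only the four box elements (x, y, w, h), as described for Um and Pe.
    um = [row[1:] for row in obj if row[0] == "umbrella"]   # umbrella information matrix
    pe = [row[1:] for row in obj if row[0] == "person"]     # personnel information matrix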
In an embodiment of the present application, the step S430 includes the following steps:
s431, selecting one element row in the umbrella information matrix as an element row to be matched.
Specifically, there is no selection rule in this step; any element row may be selected.
S432, based on Formula 1, sequentially matching the element row to be matched with each element row in the personnel information matrix, and sequentially calculating the matching parameter value of each matching;
(Formula 1 — the expression computing the matching parameter value ov from x_un, w_un, x_pm and w_pm; shown as an image in the original publication.)
wherein ov is the matching parameter value; n is the sequence number of the target corresponding to the element row to be matched; m is the sequence number of the target corresponding to the element row of the personnel information matrix in one matching; x_un is the abscissa of the upper-left corner coordinate point of the target rectangular frame of the target corresponding to the element row to be matched; w_un is the length of that target rectangular frame in the horizontal direction; x_pm is the abscissa of the upper-left corner coordinate point of the target rectangular frame of the target corresponding to the element row of the personnel information matrix in one matching; w_pm is the length of that target rectangular frame in the horizontal direction. The values of n and m are natural numbers.
Specifically, the element row to be matched is matched with each element row in the personnel information matrix in turn, and the matching parameter value of each matching is calculated in turn. For example, if there are 6 element rows in the personnel information matrix, the element row to be matched needs to be matched with those 6 element rows one by one; that is, 6 matchings are performed and 6 matching parameter values are calculated.
S433, judging whether the matching parameter value is greater than 0.3 in a certain matching process.
Specifically, the threshold of the matching parameter value is set to 0.3 in this embodiment; it may be changed to another value depending on the detection algorithm and detection logic of the target detection network.
And S434, if the matching parameter value calculated in a certain matching process is greater than 0.3, taking the element row to be matched as the element row of the target related to personnel information, and reserving the element row to be matched in the umbrella information matrix.
Specifically, this indicates that the target in the element row to be matched is an umbrella being used by a person, so the target and its element row are kept.
And S435, if the calculated matching parameter values are smaller than or equal to 0.3 in all the matching processes, taking the element row to be matched as the element row of the target irrelevant to personnel information, and removing the element row to be matched from the umbrella information matrix.
Specifically, this indicates that the target in the element row to be matched does not correspond to a person using an umbrella; it may be a flowerpot or another object unrelated to umbrella use, so the target and its element row are removed.
And S436, executing the matching step for each element row in the umbrella information matrix, and outputting the umbrella information matrix after the removal processing.
Specifically, steps S431 to S435 are executed for each element row in the umbrella information matrix until all the element rows in the umbrella information matrix have been matched. Then, the umbrella information matrix after the removal processing is output.
In this embodiment, by sequentially matching each element row in the umbrella information matrix with each element row in the personnel information matrix, and sequentially calculating the matching parameter value of each matching, it is possible to determine whether the target corresponding to each element row is related to the umbrella use behavior, and the matching accuracy is high and the omission ratio is small.
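For illustration only, a minimal sketch of the matching procedure of steps S431 to S436 follows. The exact Formula 1 is reproduced as an image in the original publication, so this sketch assumes a horizontal-overlap ratio as the matching parameter, which may differ from the original expression; the 0.3 threshold is taken from the patent:

    # Hedged sketch of steps S431-S436, continuing the hypothetical um / pe example.
    def match_value(u, p):
        # u and p are [x, y, w, h] rows. ASSUMPTION: the matching parameter is the
        # overlap of the two boxes along the horizontal axis, normalised by the
        # narrower box width; the patent's Formula 1 may differ.
        left = max(u[0], p[0])
        right = min(u[0] + u[2], p[0] + p[2])
        return max(0.0, right - left) / min(u[2], p[2])

    # Keep only umbrella rows that match at least one person row (threshold from the patent).
    um_filtered = [u for u in um if any(match_value(u, p) > 0.3 for p in pe)]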
In an embodiment of the present application, the step S500 includes:
s510, traversing all elements in the removed umbrella information matrix, and obtaining the row numbers of all element rows in the removed umbrella information matrix.
S520, judging whether the number of rows of all element rows in the umbrella information matrix after the removal processing is equal to 0.
And S530, if the number of lines of all element lines in the umbrella information matrix after the removal processing is equal to 0, judging that the umbrella use behavior does not occur in the preset time period, and returning to the initial step of acquiring the original image data.
Specifically, all targets remaining in the umbrella information matrix after the removal processing are persons exhibiting umbrella use behavior. If the number of rows of all element rows in the matrix is equal to 0, the umbrella information matrix is empty: there is no element row and therefore no target, so it is determined that no umbrella use behavior occurred within the preset time period.
S540, if the number of lines of all element lines in the removed umbrella information matrix is not equal to 0, judging that umbrella use behaviors occur in the preset time period.
Specifically, if the number of rows of all element rows in the umbrella information matrix is not equal to 0, the umbrella information matrix is not empty: there is at least one element row, and each element row corresponds to one target, so targets are present and it is determined that umbrella use behavior occurred within the preset time period.
In this embodiment, by determining whether the number of rows of all the element rows in the removed umbrella information matrix is equal to 0, accurate and rapid determination of the umbrella use behavior occurring in the preset time period can be achieved.
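Continuing the hypothetical sketch above, the decision of steps S510 to S540 reduces to counting the element rows left in the removal-processed umbrella information matrix:

    # Umbrella use behavior occurred in the preset time period iff the matrix is non-empty.
    umbrella_use_detected = len(um_filtered) != 0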
In an embodiment of the present application, after the step S500, the method further includes the following steps:
and S610, after judging that the umbrella use behavior occurs in the preset time period, acquiring all targets in the umbrella information matrix after the removal processing. Further, all targets are defined as offending targets.
S620, extracting the target information corresponding to each offending target, and sending the target information of each offending target to the server 30 for storage.
Of course, the offending target information may alternatively be stored in a database local to the umbrella detection device 20. The target information of an offending target may include one or more of the target frame information, camera information, and time information of the offending target.
In this embodiment, by sending the target information of each offending target to the server 30 for storage, monitoring personnel can promptly stop umbrella use behavior on site in the area to be detected after it occurs.
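For illustration only, the reporting of steps S610 and S620 could look like the sketch below; the server URL and the payload fields (camera identifier, timestamp) are assumptions and not part of the present application:

    import time
    import requests

    # Hedged sketch of steps S610-S620: every remaining row is an offending target whose
    # target information is sent to the server for storage.
    for x, y, w, h in um_filtered:
        payload = {
            "box": [x, y, w, h],          # target frame information
            "camera": "boarding-gate-1",  # hypothetical camera information
            "time": time.time(),          # time information
        }
        requests.post("http://example.invalid/violations", json=payload)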
The application also provides a detection system for the umbrella use behavior.
As shown in fig. 2, in an embodiment of the present application, the umbrella usage behavior detection system includes an image acquisition device 10, an umbrella detection device 20, and a server 30. The umbrella detecting device 20 is connected with the image acquisition device 10 in a communication manner. The umbrella detection device 20 is also in communication with the server 30.
The image acquisition device 10 is configured to acquire an original image to be detected once every preset time period. The umbrella detection device 20 is configured to execute the method for detecting umbrella use behavior described above, and thereby detects offending targets in the original image to be detected. The server 30 is configured to store the target information of the offending targets transmitted by the umbrella detection device 20.
The technical features of the above embodiments may be combined arbitrarily, and the steps of the method are not limited to the described execution sequence. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of technical features contains no contradiction, it should be considered within the scope of this specification.
The above examples represent only a few embodiments of the present application; although they are described in detail, they are not to be construed as limiting the scope of the patent. It should be noted that those skilled in the art can make various modifications and improvements without departing from the spirit of the present application, all of which fall within the scope of protection of the present application. Accordingly, the scope of protection of the present application shall be subject to the appended claims.

Claims (8)

1. A method for detecting the use behavior of an umbrella, comprising:
acquiring original image data, and establishing an umbrella data set based on the original image data;
training a target detection network by using the umbrella data set;
obtaining an original image to be detected every preset time period, and inputting the original image to be detected into the target detection network;
operating the target detection network and outputting a detection result;
acquiring the detection result, and judging whether umbrella use behaviors occur in the preset time period according to the detection result;
the step of operating the target detection network and outputting a detection result comprises the following steps:
collecting target rectangular frames of all targets in the original image to be detected to generate a target information matrix
Obj = [ cls_1  x_1  y_1  w_1  h_1
        cls_2  x_2  y_2  w_2  h_2
        ...
        cls_n  x_n  y_n  w_n  h_n ]
Wherein the target information matrix consists of n element rows, each element row consists of five elements, and each target has one element row; Obj is the target information matrix, n is the sequence number of the target, cls_n is the type of the target, x_n is the abscissa of the upper-left corner coordinate point of the target rectangular frame of the target, y_n is the ordinate of the upper-left corner coordinate point of the target rectangular frame of the target, w_n is the length of the target rectangular frame of the target in the horizontal direction, and h_n is the length of the target rectangular frame of the target in the vertical direction; the value of n is a natural number;
screening data information related to umbrella use behaviors from the target information matrix to generate an umbrella information matrix and a personnel information matrix;
matching the umbrella information matrix with the personnel information matrix, removing element rows of targets irrelevant to personnel information in the umbrella information matrix, and outputting the umbrella information matrix after the removal treatment;
the step of matching the umbrella information matrix with the personnel information matrix, removing the element rows of targets irrelevant to personnel information in the umbrella information matrix, and outputting the umbrella information matrix after the removal processing comprises the following steps:
selecting one element row in the umbrella information matrix as an element row to be matched;
based on the formula 1, sequentially matching the element rows to be matched with each element row in the personnel information matrix, and sequentially calculating matching parameter values of each matching;
(Formula 1 — the expression computing the matching parameter value ov from x_un, w_un, x_pm and w_pm; shown as an image in the original publication.)
wherein ov is the matching parameter value, n is the sequence number of the target corresponding to the element row to be matched, m is the sequence number of the target corresponding to the element row of the personnel information matrix in one matching, x_un is the abscissa of the upper-left corner coordinate point of the target rectangular frame of the target corresponding to the element row to be matched, w_un is the length of that target rectangular frame in the horizontal direction, x_pm is the abscissa of the upper-left corner coordinate point of the target rectangular frame of the target corresponding to the element row of the personnel information matrix in one matching, and w_pm is the length of that target rectangular frame in the horizontal direction; the values of n and m are natural numbers;
judging whether the matching parameter value is greater than 0.3 in a certain matching process or not;
if the matching parameter value calculated in a certain matching process is larger than 0.3, the element row to be matched is used as the element row of the target related to personnel information, and the element row to be matched is reserved in the umbrella information matrix;
if the calculated matching parameter values are smaller than or equal to 0.3 in all matching processes, taking the element row to be matched as an element row of a target irrelevant to personnel information, and removing the element row to be matched from the umbrella information matrix;
and executing the matching step for each element row in the umbrella information matrix, and outputting the umbrella information matrix after the removal processing.
2. The method of claim 1, wherein the step of acquiring raw image data and creating an umbrella dataset based on the raw image data comprises:
acquiring original image data;
screening data information related to umbrella use behaviors in the original image data to serve as umbrella data;
establishing an umbrella data set, and incorporating the umbrella data into the umbrella data set.
3. The method for detecting the use behavior of an umbrella according to claim 2, wherein the data information related to the use behavior of an umbrella includes one or more of airport personnel data without using an umbrella, passenger data without using an umbrella, airport personnel data with using an umbrella, passenger data with using an umbrella, style data of an umbrella, and color data of an umbrella.
4. A method of detecting umbrella use behavior according to claim 3, wherein the step of training a target detection network using the umbrella data set comprises:
information labeling is carried out on the umbrella data in the umbrella data set;
selecting a feature extraction network and a detection network frame, and establishing a target detection network according to the feature extraction network and the detection network frame;
and training the target detection network by adopting a transfer learning training method according to the umbrella data marked with the information.
5. The method for detecting umbrella use behavior according to claim 4, wherein the step of screening the target information matrix for data information related to umbrella use behavior to generate an umbrella information matrix and a personnel information matrix comprises:
traversing all elements in the target information matrix to obtain a target type of each target;
creating an umbrella information matrix, and incorporating element rows of an object of which the object type is an umbrella into the umbrella information matrix
Um = [ x_u1  y_u1  w_u1  h_u1
       x_u2  y_u2  w_u2  h_u2
       ...
       x_uk  y_uk  w_uk  h_uk ]
Wherein Um is the umbrella information matrix, k is the sequence number of a target whose target type is umbrella, x_uk is the abscissa of the upper-left corner coordinate point of the target rectangular frame of a target whose target type is umbrella, y_uk is the ordinate of the upper-left corner coordinate point of that target rectangular frame, w_uk is the length of that target rectangular frame in the horizontal direction, and h_uk is the length of that target rectangular frame in the vertical direction; the value of k is a natural number;
creating a personnel information matrix, and incorporating element rows of targets with target types of personnel into the personnel information matrix
Pe = [ x_p1  y_p1  w_p1  h_p1
       x_p2  y_p2  w_p2  h_p2
       ...
       x_ps  y_ps  w_ps  h_ps ]
Wherein Pe is the personnel information matrix, s is the sequence number of a target whose target type is person, x_ps is the abscissa of the upper-left corner coordinate point of the target rectangular frame of a target whose target type is person, y_ps is the ordinate of the upper-left corner coordinate point of that target rectangular frame, w_ps is the length of that target rectangular frame in the horizontal direction, and h_ps is the length of that target rectangular frame in the vertical direction; the value of s is a natural number.
6. The method for detecting the umbrella use behavior according to claim 5, wherein the step of acquiring the detection result and determining whether the umbrella use behavior occurs within the preset time period according to the detection result comprises:
traversing all elements in the removed umbrella information matrix, and obtaining the row numbers of all element rows in the removed umbrella information matrix;
judging whether the number of lines of all element lines in the umbrella information matrix after the removal processing is equal to 0;
if the number of lines of all element lines in the removed umbrella information matrix is equal to 0, judging that umbrella use behaviors do not appear in the preset time period, and returning to the initial step of acquiring the original image data;
and if the number of lines of all element lines in the umbrella information matrix after the removal processing is not equal to 0, judging that umbrella use behaviors occur in the preset time period.
7. The method for detecting the umbrella use behavior according to claim 6, wherein after the step of acquiring the detection result and determining whether the umbrella use behavior occurs within the preset time period according to the detection result, the method for detecting the umbrella use behavior further comprises:
when the umbrella use behavior is judged to occur in the preset time period, all targets in the umbrella information matrix after removal processing are obtained, and all targets are defined as illegal targets;
and extracting target information corresponding to each violation target, and sending the target information of each violation target to a server for storage.
8. A system for detecting the use of an umbrella, comprising:
the image acquisition device is used for acquiring an original image to be detected once every preset time period;
umbrella detection means, in communication with said image acquisition means, for performing the method of detecting the use behaviour of an umbrella according to any one of claims 1 to 7, so as to detect an offending target in said original image to be detected;
and the server is in communication connection with the umbrella detection device and is used for storing the target information of the offending targets sent by the umbrella detection device.
CN202010517896.6A 2020-06-09 2020-06-09 Umbrella use behavior detection method and system Active CN111709340B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010517896.6A CN111709340B (en) 2020-06-09 2020-06-09 Umbrella use behavior detection method and system

Publications (2)

Publication Number Publication Date
CN111709340A CN111709340A (en) 2020-09-25
CN111709340B true CN111709340B (en) 2023-05-30

Family

ID=72539026

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010517896.6A Active CN111709340B (en) 2020-06-09 2020-06-09 Umbrella use behavior detection method and system

Country Status (1)

Country Link
CN (1) CN111709340B (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6126759B1 * 2016-06-16 2017-05-10 Optim Corporation Information provision system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110378259A (en) * 2019-07-05 2019-10-25 桂林电子科技大学 A kind of multiple target Activity recognition method and system towards monitor video
CN111126252A (en) * 2019-12-20 2020-05-08 浙江大华技术股份有限公司 Stall behavior detection method and related device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Bingjie Xu, Junnan Li, Yongkang Wong, Qi Zhao and Mohan S. Kankanhalli. Interact as You Intend: Intention-Driven Human-Object Interaction Detection. IEEE Transactions on Multimedia, 2019, Vol. 22, No. 22, pp. 1-10. *
Cheng-Yang Fu, Wei Liu, Ananth Ranga, Ambrish Tyagi, Alexander C. Berg. DSSD: Deconvolutional Single Shot Detector. arXiv, 2017, full text. *

Also Published As

Publication number Publication date
CN111709340A (en) 2020-09-25

Similar Documents

Publication Publication Date Title
CN107358146B (en) Method for processing video frequency, device and storage medium
CN108170813A (en) A kind of method and its system of full media content intelligent checks
CN109547748B (en) Object foot point determining method and device and storage medium
CN106650623A (en) Face detection-based method for verifying personnel and identity document for exit and entry
KR102227756B1 (en) Structure damage judgment service system and service provision method
CN110096606B (en) Foreign roll personnel management method and device and electronic equipment
CN111783718A (en) Target object state identification method and device, storage medium and electronic device
CN114821725A (en) Miner face recognition system based on neural network
CN110796014A (en) Garbage throwing habit analysis method, system and device and storage medium
CN111709340B (en) Umbrella use behavior detection method and system
CN111723656B (en) Smog detection method and device based on YOLO v3 and self-optimization
CN108090473B (en) Method and device for recognizing human face under multiple cameras
CN109448193A (en) Identity information recognition methods and device
CN109979056A (en) Access control system and method based on image recognition technology
CN205942742U (en) Airport identity authentication system based on gait discernment
CN109801394B (en) Staff attendance checking method and device, electronic equipment and readable storage medium
KR102342495B1 (en) Method and Apparatus for Creating Labeling Model with Data Programming
CN112101192B (en) Artificial intelligence-based camouflage detection method, device, equipment and medium
CN114429677A (en) Coal mine scene operation behavior safety identification and assessment method and system
CN114241400A (en) Monitoring method and device of power grid system and computer readable storage medium
CN114038040A (en) Machine room inspection monitoring method, device and equipment
CN113591620A (en) Early warning method, device and system based on integrated mobile acquisition equipment
CN116311080B (en) Monitoring image detection method and device
CN105912663A (en) User tag merging method based on big data
CN113536847A (en) Industrial scene video analysis system and method based on deep learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant