CN117218678A - Behavior detection method and device and electronic equipment - Google Patents


Info

Publication number
CN117218678A
Authority
CN
China
Prior art keywords
behavior
target
matched
distance
related object
Prior art date
Legal status
Pending
Application number
CN202311015030.5A
Other languages
Chinese (zh)
Inventor
李斌
焦继乐
冯雪涛
Current Assignee
Zhejiang Shenxiang Intelligent Technology Co ltd
Original Assignee
Zhejiang Shenxiang Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Shenxiang Intelligent Technology Co ltd
Priority to CN202311015030.5A
Publication of CN117218678A
Legal status: Pending


Landscapes

  • Image Analysis (AREA)

Abstract

The application discloses a behavior detection method, a behavior detection apparatus, an electronic device, and a storage medium, relating to the field of behavior detection. The method comprises the following steps: identifying at least two target subjects in consecutive frames; matching a behavior part of a target subject with a behavior-related object according to the positional relationship between the behavior part and the behavior-related object; and, from the consecutive frames, obtaining the change over time of a first distance between the matched behavior part of a first target subject and the matched behavior part of a second target subject, together with the change process of the behavior-related objects, and determining whether a target behavior exists between the target subjects, the behavior being bidirectional or multidirectional. The second target subject is a target subject, among the at least two target subjects, that is different from the first target subject. The method uses consecutive frames to detect potential target behaviors automatically and accurately, at a low implementation cost.

Description

Behavior detection method and device and electronic equipment
Technical Field
The application relates to the field of behavior detection, and in particular to a bidirectional or multidirectional behavior detection method. The application also relates to a behavior detection apparatus, an electronic device, and a computer-readable storage medium.
Background
With the development of society, people frequently take part in commercial and public-welfare activities in public venues, and increasingly dense interaction behaviors occur between people; there is a need to observe these behaviors.
For example, certain interactions involving commercial transactions, such as shopping payments, need to be checked frequently to confirm that they stay within a specified range, so as to supervise them and safeguard the legal rights of the parties concerned. In other situations, such as robot operations or sports refereeing, it is often necessary to check whether a specific bidirectional or multidirectional behavior exists and to take corresponding measures based on the observation, so as to meet the compliance requirements of these scenarios.
Taking interaction behavior in transactions as an example, to discover compliance risks that may exist in an activity, organizers often employ two methods: first, hiring a professional team to make undercover visits; second, hiring operators to manually spot-check on-site video.
For detecting potential risk behaviors, both methods have high labor cost and low efficiency, and some risk behaviors are easily missed. It therefore becomes necessary to find a behavior detection method that is more efficient, more accurate, and easier to implement.
Disclosure of Invention
The application provides a behavior detection method, aiming to solve the problems of high labor cost and low accuracy of behavior detection in the prior art. The application further provides a behavior detection apparatus, an electronic device, and a computer-readable storage medium.
The application provides a behavior detection method, which comprises the following steps:
identifying at least two target subjects in consecutive frames;
matching a behavior part of a target subject with a behavior-related object according to the positional relationship between the behavior part and the behavior-related object;
from the consecutive frames, obtaining the change over time of a first distance between the matched behavior part of a first target subject and the matched behavior part of a second target subject, together with the change process of the behavior-related object, and determining whether a target behavior exists between the target subjects, the behavior being bidirectional or multidirectional; the second target subject is a target subject, among the at least two target subjects, that is different from the first target subject.
Optionally, the multidirectional behavior involves at least one third target subject other than the first target subject and the second target subject, and the determining whether a target behavior exists between the target subjects comprises:
if a bidirectional behavior exists between the first target subject and the third target subject, and a bidirectional behavior also exists between the second target subject and the third target subject, determining that a multidirectional behavior exists between the target subjects.
Optionally, the determining whether a target behavior exists between the target subjects comprises:
if a first correlation exists between the change of the first distance over time and the change process of the behavior-related object, determining that a target behavior exists between the target subjects.
Optionally, the change over time of the first distance between the matched behavior part of the first target subject and the matched behavior part of the second target subject comprises:
in the consecutive frames, the first distance decreasing continuously for more than a preset first threshold number of consecutive frames, and then increasing continuously for more than a preset second threshold number of consecutive frames.
Optionally, the change over time of the first distance between the matched behavior part of the first target subject and the matched behavior part of the second target subject comprises:
in the consecutive frames, the number of consecutive frames in which the first distance is smaller than a preset third threshold being greater than a preset fourth threshold.
Optionally, the change over time of the first distance between the matched behavior part of the first target subject and the matched behavior part of the second target subject comprises:
in the consecutive frames, the number of consecutive frames in which the first distance lies between a preset fifth threshold and a preset sixth threshold being greater than a preset seventh threshold.
Optionally, the change process of the behavior-related object comprises:
a change, in the consecutive frames, of the positional relationship between the behavior-related object and the matched behavior part.
Optionally, the change process of the behavior-related object comprises:
a change, in the consecutive frames, of the positional relationship between the behavior-related object matched with the behavior part of the first target subject and the behavior-related object matched with the behavior part of the second target subject.
Optionally, the method further comprises:
and identifying at least one category of the target subjects related to the target behaviors as a target category.
Optionally, the matching the behavior part of the target subject with the behavior-related object according to their position information comprises:
if the key-point position of the behavior part falls within a preset distance-threshold range of the behavior-related object, matching the behavior part with the behavior-related object.
The application also provides a behavior detection device, which comprises:
a target detection unit, configured to identify at least two target subjects in consecutive frames;
a behavior matching unit, configured to match a behavior part of a target subject with a behavior-related object according to the positional relationship between the behavior part and the behavior-related object;
a behavior determination unit, configured to obtain, from the consecutive frames, the change over time of a first distance between the matched behavior part of a first target subject and the matched behavior part of a second target subject, together with the change process of the behavior-related object, and to determine whether a target behavior exists between the target subjects, the behavior being bidirectional or multidirectional; the second target subject is a target subject, among the at least two target subjects, that is different from the first target subject.
The present application also provides an electronic device including: a processor, a memory, and computer program instructions stored on the memory and executable on the processor; the processor, when executing the computer program instructions, implements the method described above.
The present application also provides a computer readable storage medium having stored therein computer executable instructions which when executed by a processor are adapted to carry out a method as described above.
Compared with the prior art, the application has the following advantages:
According to the behavior detection method, target subjects are identified and tracked, and matching is performed based on the positional relationship between a behavior part of a target subject and a behavior-related object, thereby establishing the association between the behavior part and the behavior-related object around which the target behavior takes place. Whether a target behavior exists is then judged comprehensively from the change over time of the distance between the subjects' behavior parts, the contemporaneous change of the behavior-related objects, and prior information about the target behavior. Because multiple frames are used and the change information of the behavior parts and the behavior-related objects is fused, the association between the subjects' actions and the target behavior is tighter, and the accuracy of behavior detection is improved. Moreover, because video-based tracking and identification is adopted, no dedicated manpower is needed, so the labor cost is low.
Drawings
Fig. 1 is a flow chart of a method provided by a first embodiment of the present application.
Fig. 2 is a schematic diagram of multi-target body tracking.
Fig. 3 is a schematic diagram of a human body keypoint detection model.
Fig. 4 is a matching scenario of a behavior part and a behavior related object of a target subject.
Fig. 5 is a flow chart of target behavior determination between target subjects.
Fig. 6 is a schematic diagram of a process of a distance change between the behavior parts of the target subject.
Fig. 7 is a schematic diagram of a hand-held handset code scanning.
Fig. 8 is a block diagram of a unit of a behavior detection apparatus provided in a second embodiment of the present application.
Fig. 9 is a block diagram of an electronic device according to a third embodiment of the present application.
Detailed Description
In order that those skilled in the art may better understand the technical solutions of the present application, the application is described clearly and completely below with reference to the accompanying drawings of its embodiments. The application can be practiced in many ways other than those described here; therefore, based on the embodiments provided herein, a person of ordinary skill in the art can obtain all other embodiments within the scope of the application without creative effort.
It should be noted that the terms "first," "second," "third," and the like in the claims, description, and drawings of the present application are used to distinguish similar objects and do not necessarily describe a particular sequence or chronological order. Data so used are interchangeable under appropriate circumstances, so that the embodiments described herein can be practiced in orders other than those illustrated. Furthermore, the terms "comprises," "comprising," and their variants are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements but may include other steps or elements not expressly listed or inherent to it.
In public venues, during commercial and public-welfare activities, various interactions directly related to the theme of the activity occur between the participating behavior subjects. To ensure the good daily operation of such venues, the interaction behavior of the subjects needs to be monitored to the necessary extent on a lawful and reasonable basis. An interaction between subjects generally requires at least two behavior subjects, each invoking its own behavior means, such as a motor organ or an instrument, to interact dynamically with another subject around some item highly related to the behavior. When different interactions are carried out, the behavior subjects, the behavior means they invoke, and the behavior-related objects present different change characteristics, so analyzing these characteristics can reveal potential interactions between the subjects.
A first embodiment of the present application provides a behavior detection method. First, the target subjects in a target behavior area are detected and tracked in video, and the trajectories of target subjects belonging to the same identity are associated; then, body information of each target subject is extracted to obtain the key-point positions of its behavior parts, together with the positions and categories of the behavior-related objects; finally, the change over time of the distance between the behavior parts of any two target subjects in the behavior area is analyzed and, combined with the change process of the behavior-related objects, it is determined whether a target behavior exists.
The behavior detection method provided in this embodiment is described in detail below. Referring to fig. 1, fig. 1 is a flowchart of the behavior detection method.
S101, identifying at least two target subjects in consecutive frames.
Within the video coverage area there may be multiple active target subjects, and a potential target behavior may occur between any two or more of them. To identify the target behavior, target subjects belonging to the same identity must first be identified across consecutive video frames and their complete motion trajectories traced, which is essentially a target tracking task.
To accomplish the target tracking task, the targets in each frame must first be detected; there may be several, each enclosed by a detection box and assigned a unique identifier. Across consecutive frames, each identical target generates its own trajectory. The target category here may be any category of object with a motion attribute, such as pedestrians, animals, or autonomous mobile machines.
Many published techniques implement single-camera target tracking, such as SORT, DeepSORT, JDE, and FairMOT. Since target tracking is not the focus of the present application, the following gives only a brief description, taking a human body as the target subject.
In detection video of a human body, a face image of very high quality is generally unavailable because of camera resolution and shooting angle. When face recognition fails, a pedestrian re-identification technique (hereinafter abbreviated as ReID) may be used to detect and track a target human body. ReID is a computer-vision technique for determining whether a particular pedestrian is present in an image or video sequence, and is widely regarded as a sub-problem of image retrieval.
Specifically, ReID features are extracted from each target human body, and the detection boxes that belong to one identity in different video frames are associated according to the similarity of their ReID features and the positions of the detection boxes, forming a complete motion trajectory. Fig. 2 illustrates the effect of target human detection and tracking on two consecutive frames of pedestrian-detection video: each pedestrian is enclosed by a detection box, and the upper-left corner of each box carries a unique identifier through which the same pedestrian can be associated across the two frames. For the pedestrian identified as "180", for example, the positional shift shows that the pedestrian is moving away from the lens.
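The frame-to-frame association step described above can be sketched as follows. This is a minimal greedy matcher for illustration only, not the application's actual implementation: the `w_feat`/`w_pos` weights, the score form, and the data layout are assumptions, and production trackers such as SORT/DeepSORT instead use Kalman prediction with Hungarian assignment.

```python
import math

def cosine_sim(a, b):
    """Cosine similarity of two ReID feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def associate_detections(prev_tracks, detections, w_feat=0.7, w_pos=0.3, min_score=0.5):
    """Greedily link current-frame detections to existing tracks by combining
    ReID-feature similarity with detection-box proximity, so that boxes
    belonging to one identity are chained into a trajectory."""
    assignments, used = {}, set()
    for tid, track in prev_tracks.items():
        best, best_score = None, min_score
        for i, det in enumerate(detections):
            if i in used:
                continue
            feat_sim = cosine_sim(track["feat"], det["feat"])
            dist = math.dist(track["center"], det["center"])
            pos_sim = 1.0 / (1.0 + dist / 100.0)  # map pixel distance into (0, 1]
            score = w_feat * feat_sim + w_pos * pos_sim
            if score > best_score:
                best, best_score = i, score
        if best is not None:
            assignments[tid] = best  # detection index assigned to this track id
            used.add(best)
    return assignments
```

A track such as the pedestrian "180" of fig. 2 would thus keep its identifier from frame to frame as long as a detection with a similar embedding appears nearby.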
S102, matching the behavior part of the target subject with the behavior-related object according to the positional relationship between the behavior part and the behavior-related object.
When target behaviors occur between target subjects, the subjects usually need to use their own behavior parts to perform the corresponding actions around one or more behavior-related objects. Therefore, before determining the target behavior, a matching relationship between a behavior part of a target subject and a behavior-related object must first be established, to reflect the behavioral intent the part applies to the object. To achieve such matching, the behavior part and the behavior-related object must be further identified on the basis of the identified target subject, and then matched according to the positional relationship between the two; these steps are described in order below.
First, a behavior part of a target subject is identified.
The behavior part refers to the motion instrument of a target subject that acts on a behavior-related object, or a motor organ intended to act on the behavior-related object, or an extension of that organ — for example the hands or feet of a human body, a stick held by a person, or the arm of an autonomous mobile machine.
Identifying the behavior part of a target subject means detecting, with a key-point detection model corresponding to the category of the target subject, the key-point positions that reflect the behavior part according to its appearance features and its structural association with the target subject, recognizing the behavior part from the overall distribution of the key points, and finally determining the positions of the behavior parts. The key points of a behavior part can be visually salient geometric feature points such as boundary points, inflection points, and center points, or regions with specific physiological significance such as facial features and joints.
Taking the human body as an example, a human-body key-point detection model may be adopted. Such a model detects the key points of a human body, which typically describe its skeleton. Fig. 3 shows a schematic application of a human-body key-point detection model: on the left is the preset proportional distribution of human skeleton key points, and on the right are the key-point positions detected on a target human body.
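As an illustration of extracting the "hand" behavior part from a detected skeleton, the sketch below assumes the key-point model outputs 17 points in the common COCO order, in which indices 9 and 10 are the wrists; the data layout and confidence threshold are illustrative assumptions, not an interface prescribed by the application.

```python
# COCO keypoint order (common convention): index 9 = left wrist, 10 = right wrist
COCO_LEFT_WRIST, COCO_RIGHT_WRIST = 9, 10

def hand_keypoints(pose, conf_thresh=0.3):
    """Given one subject's pose as 17 (x, y, confidence) triples in COCO order,
    return the confidently detected wrist positions; the wrists stand in for
    the 'hand' behavior part whose key points are matched in S102."""
    hands = []
    for idx in (COCO_LEFT_WRIST, COCO_RIGHT_WRIST):
        x, y, c = pose[idx]
        if c >= conf_thresh:
            hands.append((x, y))
    return hands
```

Low-confidence wrists (occluded or out of frame) are simply skipped, so a subject may contribute zero, one, or two hand key points to the matching step.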
Second, the behavior-related object is identified.
The behavior-related object refers to an object that the target subjects act on or influence with their respective behavior parts in order to carry out the target behavior. Different target behaviors have different behavior-related objects; for example, the behavior-related object of code scanning is a handheld device, and the behavior-related object of a ball game is a ball.
For a specific target behavior between target subjects, the behavior-related object is generally determined in advance, so before matching the behavior part of the target subject with the behavior-related object, the method further comprises: identifying, in the consecutive frames, the behavior-related objects associated with the target behavior. For example, in code-scanning behavior the behavior-related object is specifically defined as a mobile phone, and in a basketball game it is specifically defined as a basketball. This narrows the matching range between behavior parts and behavior-related objects and improves matching accuracy.
Identifying behavior-related objects is an object detection problem for which many solutions have been published. Since it is not the focus of the present application, only two general methods are briefly described below.
First, a published object detection model can detect the behavior-related object from its visual properties, such as appearance, color, shape, and combinations thereof, and obtain its position. Many published techniques enable object detection, such as Faster R-CNN, SSD, and YOLO.
Second, a recognition method based on image retrieval can be used: gallery images of the various behavior-related objects in the behavior scene are collected, and the specific category of a behavior-related object is judged by a similarity-voting mechanism. This may require maintaining a rich gallery of images.
Finally, the behavior part and the behavior-related object are matched.
The goal of matching the behavior part with the behavior-related object is to establish an association between them that reflects the action of the behavior part on, or its influence over, the behavior-related object.
To avoid interference from behavior-irrelevant items in the video field of view, the present application observes that when a behavior part and a behavior-related object are about to become associated, they need to be spatially close to a certain degree before the association can play a visible role in the behavior. Therefore, matching is performed only when the distance between the behavior part and the behavior-related object satisfies this requirement. Specifically, if the key-point position of the behavior part falls within a preset distance-threshold range of the behavior-related object, the behavior part and the behavior-related object are matched.
The size of the distance threshold is set according to the characteristics of the target behavior. Scenario (1) and scenario (2) in fig. 4 show two different target behaviors that differ markedly in their distance-threshold requirements. For the hand-held phone in scenario (1), the hand and the phone must fit very closely to be considered a match, so the distance threshold can be set very small, down to 0. For the player controlling the ball in scenario (2), the ball and the player are usually both in motion; as long as the ball falls within a certain spatial range of the player's feet, the player very probably controls the ball, so the distance threshold can be set larger, for example with reference to the average stride length of a human body.
Depending on the characteristics of the target behavior, the behavior-related objects matched with the behavior parts of different target subjects may be the same specific object or merely objects of the same class, and the behavior part of one target subject may also be matched with several behavior-related objects; the application is not limited in this regard. As shown in fig. 4, in the acrobatic ball-juggling of scenario (3), the hands are the behavior parts and the balls are the behavior-related objects, and the same two hands can be matched with several balls. In the football tackle of scenario (4), the feet are the behavior parts and the football is the behavior-related object, and the feet of several different players can be matched with the same football.
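The matching rule just described — a behavior part matches a behavior-related object when the part's key point falls within the object's distance threshold, with many-to-many matches allowed — can be sketched as follows; the identifiers and data layout are assumptions for illustration.

```python
import math

def match_parts_to_objects(part_keypoints, objects, dist_thresh):
    """Match each behavior-part key point to every behavior-related object whose
    center lies within dist_thresh. Per the discussion above, the threshold is
    near 0 for a phone held in a hand and larger (e.g. an average stride) for a
    ball at a player's feet; one part may match several objects and vice versa."""
    matches = []
    for part_id, part_pos in part_keypoints.items():
        for obj_id, obj_pos in objects.items():
            if math.dist(part_pos, obj_pos) <= dist_thresh:
                matches.append((part_id, obj_id))
    return matches
```

Because the loop does not stop at the first hit, the juggling case (one pair of hands, several balls) and the tackle case (several feet, one ball) both fall out of the same rule with no extra logic.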
S103, from the consecutive frames, obtaining the change over time of a first distance between the matched behavior part of a first target subject and the matched behavior part of a second target subject, together with the change process of the behavior-related object, and determining whether a target behavior exists between the target subjects, the behavior being bidirectional or multidirectional; the second target subject is a target subject, among the at least two target subjects, that is different from the first target subject.
When a target behavior exists between target subjects, a cooperative relationship necessarily arises between their behavior parts, and this cooperation usually produces a characteristic change over time of the distance between the behavior parts. During the same period, the behavior-related objects that have been matched with the behavior parts also frequently undergo an accompanying, cooperative change. Therefore, based on the change processes of the behavior parts and the behavior-related object, whether a target behavior exists between the target subjects can be judged with high confidence.
To distinguish different target subjects, the present application names them in order: the first target subject, the second target subject, and so on. Broadly, the second target subject is any target subject, among the at least two, that differs from the first target subject; it may be one of several candidate target subjects, or it may refer to the subjects other than two, three, or even more first target subjects. How the second target subject is understood depends on whether the target behavior to be determined is bidirectional or multidirectional, and on the specific characteristics of the behavior.
A bidirectional behavior is a behavior involving two target subjects, called the first target subject and the second target subject; the second target subject is a determined behavior subject different from the first target subject. A typical bidirectional behavior is, for example, a sales transaction.
To connect with the description of bidirectional behavior, in this embodiment the second target subject may be understood narrowly as one specific target subject, and a third target subject may be introduced to represent a target subject other than the first and second target subjects; that is, a multidirectional behavior involves at least one third target subject other than the first target subject and the second target subject.
Since a multidirectional behavior can be regarded as a superposition of several bidirectional behaviors, its determination can be converted entirely into a combined determination of bidirectional behaviors. Specifically, if a bidirectional behavior exists between the first target subject and the third target subject, and a bidirectional behavior also exists between the second target subject and the third target subject, it is determined that a multidirectional behavior exists between the target subjects. In view of this, the present application mainly describes the determination of bidirectional behavior; a person skilled in the art can easily determine multidirectional behavior from the combined-determination description above.
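The combined determination above reduces to a simple check over already-detected bidirectional pairs. A sketch, in which the pair representation is an assumption for illustration:

```python
def has_multidirectional(bidir_pairs, first, second, candidates):
    """Declare a multidirectional behavior when some third subject has a
    bidirectional behavior with both the first and the second target subject.
    bidir_pairs is a set of frozensets, each one a detected bidirectional pair."""
    for third in candidates:
        if third in (first, second):
            continue  # the third subject must be distinct from the first two
        if (frozenset((first, third)) in bidir_pairs and
                frozenset((second, third)) in bidir_pairs):
            return True
    return False
```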
The bidirectional behavior determination process is described in detail below. Referring to fig. 5, fig. 5 provides a schematic diagram of the process. In it, the first behavior part is the matched behavior part of the first target subject and the first behavior-related object is the behavior-related object matched with that part; the second behavior part is the matched behavior part of the second target subject and the second behavior-related object is the behavior-related object matched with that part.
First, a change in a first distance between the first behavior portion and the second behavior portion over a time line is analyzed.
The first distance is the distance between the key-point position of the first behavior part and that of the second behavior part. For simplicity, the present application defines it as the Euclidean distance between the two key-point positions. If the first behavior part and/or the second behavior part has several key points, the first distance may be defined by a preset statistical policy, for example taking the nearest or the average distance between the key points as the distance between the behavior parts; this is not limited here.
The change of the first distance between the first behavior part and the second behavior part over time refers to the pattern in which the distance between the behavior parts changes in magnitude as the timeline advances: increasing, decreasing, holding steady, or combinations of the three. Different target behaviors usually exhibit different change patterns of the first distance, so analyzing the distance-change pattern of a specific scene can help determine whether a target behavior exists.
Fig. 6 illustrates several typical change processes of the distance between the behavior parts, which are described in turn below; the present application is not limited to them.
Process 1: in the continuous frames, the first distance keeps continuously decreasing for a number of continuous frames greater than a preset first threshold, and then the first distance keeps continuously increasing for a number of continuous frames greater than a preset second threshold.
For some target behaviors, the behavior parts of the first target subject and the second target subject gradually approach each other and then gradually move apart, so the distance between the behavior parts follows a statistical pattern of first decreasing and then increasing. One can therefore count, respectively, the number of frames in which the distance between the matched behavior portions keeps decreasing and, later, keeps increasing; if both exceed their preset thresholds, the distance between the behavior portions of the target subjects is considered to follow this change pattern, which may indicate a target behavior between the target subjects.
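A minimal sketch of this check (the function name and thresholds are illustrative) counts a run of consecutive decreases and, after it, a run of consecutive increases over the per-frame distance sequence:

```python
def approach_then_retreat(distances, n1, n2):
    """Process 1: True if the frame-by-frame distance sequence contains a run
    of consecutive decreases longer than n1, followed later by a run of
    consecutive increases longer than n2."""
    dec_end = -1  # index where the qualifying decreasing run was detected
    run = 0
    for i in range(1, len(distances)):
        run = run + 1 if distances[i] < distances[i - 1] else 0
        if run > n1:
            dec_end = i
            break
    if dec_end < 0:
        return False
    run = 0
    for i in range(dec_end + 1, len(distances)):
        run = run + 1 if distances[i] > distances[i - 1] else 0
        if run > n2:
            return True
    return False
```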
Taking a code-scanning transaction as an example: when a transaction occurs between a merchant and a customer through handheld devices, the mobile phone held by the customer and the device held by the merchant first move close to each other, and after the code-scanning action is completed they move apart. This change process can then be found by tracking, along the time line, the distance between the customer's phone-holding hand and the merchant's device-holding hand.
Process 2: in the continuous frames, the continuous frame number of which the first distance is smaller than a preset third threshold value is larger than a preset fourth threshold value.
For some target behaviors, the behavior parts of the first target subject and the second target subject need to remain within a certain proximity for a period of time, so the distance between the behavior parts keeps changing without ever exceeding a certain range. One can therefore count the number of consecutive frames in which the distance between the matched behavior portions is smaller than the preset third threshold; if it exceeds the preset fourth threshold, the distance between the behavior portions of the target subjects is considered to follow this change pattern, which may indicate a target behavior between the target subjects.
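Under the same assumptions about naming, Process 2 reduces to counting consecutive frames whose distance stays under the third threshold:

```python
def stays_close(distances, d_thresh, n_thresh):
    """Process 2: True if the per-frame distance stays below d_thresh for
    more than n_thresh consecutive frames."""
    run = 0
    for d in distances:
        run = run + 1 if d < d_thresh else 0  # reset the run on any far frame
        if run > n_thresh:
            return True
    return False
```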
Taking self-service transactions in a bank business hall as an example: to speed up routine client transactions such as opening a card or making inquiries, business halls often place self-service devices with touch screens, and clients handle the relevant matters on these devices with the assistance of hall staff. By tracking, along the time line, the distance between the client's hand and the staff member's hand as they tap the touch screen, the above change process can be found between the two.
Process 3: in the continuous frames, the continuous frame number of the first distance between the preset fifth threshold value and the preset sixth threshold value is larger than the preset seventh threshold value.
For some target behaviors, the behavior parts of the first target subject and the second target subject need to stay within a certain distance range for a period of time, so the distance between the behavior parts keeps varying within a certain interval. One can therefore count the number of consecutive frames in which the distance between the matched behavior portions is greater than the preset fifth threshold and smaller than the preset sixth threshold; if it exceeds the preset seventh threshold, the distance between the behavior portions of the target subjects is considered to follow this change pattern, which may indicate a target behavior between the target subjects.
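Process 3 is the band-limited variant — the distance must stay between the fifth and sixth thresholds for long enough. A sketch under the same illustrative naming:

```python
def stays_in_band(distances, d_low, d_high, n_thresh):
    """Process 3: True if the per-frame distance stays strictly between
    d_low and d_high for more than n_thresh consecutive frames."""
    run = 0
    for d in distances:
        run = run + 1 if d_low < d < d_high else 0  # reset outside the band
        if run > n_thresh:
            return True
    return False
```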
Taking a basketball game as an example: one tactic often adopted is half-court man-to-man defense. The defender stands between the attacking player and the basket and maintains a certain defending distance from the attacker, such as roughly an arm's length, while waving his hands to actively interfere with the opponent's shooting, passing, breakthroughs and other tactical actions. By tracking, along the time line, the distance between the hands of the attacker and the defender, the above change process can be found between the two.
Second, the change process of the behavior related objects is analyzed.
The behavior related object here is the one matched with the behavior part of the target subject in the previous step.
The change process of the behavior related object refers to the change exhibited, during the occurrence of the target behavior, by the positional relationship between the behavior related objects, or between a behavior related object and a behavior part, as the distance between the behavior parts of the target subjects changes. The positional relationship characterizes the relative spatial relationship between the behavior related object and the behavior part; the change it exhibits may be dynamic, static, or a mixture of both, such as approaching, holding, moving apart, and combinations thereof, without limitation.
The following describes the change process of the behavior related object in detail.
The behavior related object change process comprises: a change process of the positional relationship between the behavior related object and its matched behavior part in the continuous frames.
After the behavior related object is matched with the behavior part of the target subject, the position of the behavior part inevitably changes as the target behavior unfolds, and the positional relationship between the matched behavior related object and the behavior part may change as well.
For example, when a hand-held mobile phone performs the code-scanning action, the phone and the hand stay attached together, i.e., their positional relationship is relatively static; the behavior related object therefore remains relatively static with respect to the behavior part during the change process.
The behavior related object change process also comprises: a change process, in the continuous frames, of the positional relationship between the behavior related object matched with the behavior part of the first target subject and the behavior related object matched with the behavior part of the second target subject.
When a target behavior occurs between target subjects, the matched behavior portions and behavior related objects generally have to participate together, and they may move synchronously; thus, when the distance between the behavior portions changes, the positional relationship between the matched behavior related objects may change at the same time.
For example, during hand-held code scanning, as the distance between the shopping guide's hand and the customer's hand keeps decreasing, the phone held by the shopping guide and the phone held by the customer also keep approaching each other; the change process of the behavior related objects is thus one in which the spatial distance between them keeps decreasing.
Finally, the target behavior is determined based on the two above-described changes.
From the above change process of the distance between the first behavior portion and the second behavior portion on the time line, and the change process of the behavior related object, it can be seen that the two are closely related: each provides valuable clues for determining the target behavior, and together they form the basis of that determination.
The determining whether there is a target behavior between the target subjects includes: if a first correlation exists between the change process of the first distance on the time line and the change process of the behavior related object, determining that the target behavior exists between the target subjects.
A correlation here refers to a relationship in which two variables are related to each other and vary within a certain range according to a certain rule. Specifically, in the present application, the change of the distance between the behavior parts of the target subjects over the time line is correlated with the change process of the behavior related object. If the observed relationship resembles the correlation expected for the target behavior, a potential target behavior is determined to exist between the target subjects.
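The application does not fix a specific correlation measure; as one assumed instantiation, the Pearson correlation between the part-distance series and the related-object-distance series can serve as the "first correlation" check:

```python
def correlated(xs, ys, threshold=0.8):
    """Assumed instantiation of the correlation check: Pearson correlation
    between the behavior-part distance series xs and the related-object
    distance series ys; values near 1 mean they shrink and grow together."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    if sx == 0 or sy == 0:
        return False  # a constant series carries no correlation signal
    return cov / (sx * sy) >= threshold
```

The threshold of 0.8 is a placeholder; in practice it would be tuned per target behavior.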
The following gives examples of target behaviors.
Example 1: at a department store counter, if a phone code-scanning behavior exists, the distance between the shopping guide's phone-holding hand and the customer's phone-holding hand first decreases and then increases, with the duration of each change meeting its preset threshold; meanwhile, each hand-held phone should stay attached to its hand, or the shopping guide's phone and the customer's phone should synchronously approach and then separate. If the change process obtained from video detection meets this situation, it is determined that a phone code-scanning behavior may exist.
Example 2: in a football match, if a first player and a second player contest the same ball, the distance between the first player's foot and the second player's foot stays below a certain threshold; meanwhile, the ball keeps a close positional relationship with the players' feet. If the change process obtained from video detection meets this situation, it is determined that a ball-contesting behavior may exist.
Example 3: in a group game, suppose a tug-of-war match is set up: a red strap is tied to the middle of the rope as a marker and placed on the center line, and the two teams stand at preset positions. After the match starts, both sides pull the rope with force; during this process the two teams see-saw back and forth while keeping their distance, and the marker strap moves back and forth within a small range near the center line. If the change process obtained from video detection meets this situation, it is determined that a tug-of-war behavior may exist.
In some detection scenarios, target subjects may be mixed with other non-target subjects. To further confirm that the target behavior occurs between genuine target subjects, once a potential target behavior has been determined, at least one category of the subjects involved in it must additionally be identified as a target category. The target category characterizes the identity of the target subject.
Identifying the category of the target subject can be regarded as classifying the video image, so published image classification techniques may be employed. For example, a large number of samples in the video scene are collected, images containing a target subject of the target category are taken as positive samples and other images as negative samples, an image classifier is trained, and the trained classifier is applied to the video frames. If a target subject in the video has a category matching the target category, the presence of the target behavior can be further confirmed.
A determined target behavior indicates that the target behavior may exist in the video. If the target behavior is an undesirable one, it may pose a risk to the interested party, and subsequent measures are needed to address that risk. The target behavior determination result can be compared with a preset behavior result record corresponding to the target behavior to check whether the two are consistent, yielding a consistency determination result. Further, if the consistency determination result is inconsistent, an alarm signal is output.
A specific example will be provided below to fully explain the above method.
In a department store, to manage funds uniformly, the mall generally collects all payments at the counters and, after deducting various fees, settles the remaining funds with the merchants that own the counters. To avoid the mall's deductions, a merchant may risk hiding transactions or under-reporting transaction amounts from the mall: for example, a counter shopping guide does not enter a sales order into the sales system designated by the mall, or enters it through third-party sales equipment not permitted by the mall, thereby providing false business data. This behavior is known as a fly order.
The mall needs to collect rent from the counters. In general, a counter's rent is positively correlated with its sales, and once fly-order behavior occurs, the counter under-reports part of its sales data, causing economic loss to the mall. On the other hand, malls often run promotional activities such as points and rebates, which depend on sales orders being entered during the promotion so that the sales record is obtained. In addition, once product quality becomes a problem, goods that were never checked in may create after-sales disputes between the mall and the counter.
In the fly-order behavior shown in fig. 7, a possible scenario is that the shopping guide uses the hand-held mobile phone 3 and the hand-held mobile phone 4 to present a two-dimensional code that does not belong to the mall's cashier system, and the customer uses the hand-held mobile phone 1 to scan the two-dimensional code near the shopping guide's mobile phone 4, completing the code-scanning action so that the funds reach the merchant's own account. During this process, the shopping guide and her hands, the customer and his hands, and their respective mobile phones complete the fly-order behavior through specific changes in their positions and mutual distances.
To detect fly-order behavior, video captured by devices such as cameras covering the counter area of a department store can be used, and the method disclosed in the present application is applied to this video. The method comprises the following steps:
first, the customer and the shopping guide are identified. Both the customer and the shopping guide are target subjects.
Second, hands are matched with mobile phones. Here the hand is the behavior part, and the mobile phone is the behavior related object.
The position coordinates of both hands are obtained through a human body key point detection model. Specifically, define the left-hand coordinates of the i-th person in frame t as pL_{i,t} = (xL_{i,t}, yL_{i,t}), and the right-hand coordinates as pR_{i,t} = (xR_{i,t}, yR_{i,t}), where xL_{i,t}, yL_{i,t} are the left-hand abscissa and ordinate of the i-th person in frame t, and xR_{i,t}, yR_{i,t} are the right-hand abscissa and ordinate.
Define the set of mobile phone detection boxes obtained for the i-th person in frame t as B_{i,t} = {b_1, ..., b_N}, where N is the total number of detected rectangular boxes and each box is b = (x, y, w, h, s, c): x, y are the coordinates of the upper-left corner of the detected rectangle, w, h its width and height, s the confidence of the detection box, and c its category. Here the category indicates whether the target human body holds a mobile phone.
Further, the left- and right-hand coordinates of the i-th person in frame t are matched against the hand-held object detection boxes: for each hand coordinate p_{i,t} of the i-th person in frame t and each hand-held object detection box in B_{i,t}, if p_{i,t} falls within the rectangle, then p_{i,t} is taken as a matched hand coordinate of the i-th person in frame t.
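The point-in-rectangle matching can be sketched as follows; the tuple layout (x, y, w, h, s, c) mirrors the box definition above, while the function name is illustrative:

```python
def match_hand_to_box(hand, boxes):
    """Match a hand key point (x, y) against hand-held object detection
    boxes (x, y, w, h, score, category); returns the index of the first
    box containing the point, or None if the hand matches no box."""
    px, py = hand
    for idx, (x, y, w, h, s, c) in enumerate(boxes):
        if x <= px <= x + w and y <= py <= y + h:
            return idx
    return None
```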
The distance between the matched hand coordinates of two target human bodies i, j in frame t is then d_{i,j,t} = sqrt((x_{i,t} - x_{j,t})^2 + (y_{i,t} - y_{j,t})^2), where x_{i,t}, y_{i,t} are the matched hand coordinates of the i-th person in frame t, and x_{j,t}, y_{j,t} are the matched hand coordinates of the j-th person in frame t.
A global target body pair queue over T frames is maintained: D_{i,j} = {d_{i,j,1}, ..., d_{i,j,T}}, each element of which is the distance between the matched behavior parts of target subjects i, j, and the change pattern of the queue is analyzed. If the hand-to-hand distance shows a statistical pattern of first decreasing and then increasing, the pair of human bodies is regarded as a potential code-scanning behavior.
Specifically, first compute sum_{t=2..T} 1[d_{i,j,t} < d_{i,j,t-1}] to determine whether more than N_1 frames in the queue show a decreasing distance, where 1[·] is the indicator function, taking the value 1 when the condition in brackets is true and 0 when it is false. Then compute sum_{t=2..T} 1[d_{i,j,t} > d_{i,j,t-1}] to determine whether more than N_2 frames show an increasing distance. When both conditions are satisfied at the same time, the distance between target subjects i, j is considered to exhibit this change pattern.
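The two indicator sums can be sketched directly over the distance queue (N_1, N_2 and the function name are placeholders):

```python
def v_pattern_in_queue(queue, n1, n2):
    """Count frames where the paired-hand distance decreases and frames
    where it increases, per the indicator sums; the pair is a potential
    code-scanning behavior if the decrease count exceeds n1 and the
    increase count exceeds n2."""
    dec = sum(1 for t in range(1, len(queue)) if queue[t] < queue[t - 1])
    inc = sum(1 for t in range(1, len(queue)) if queue[t] > queue[t - 1])
    return dec > n1 and inc > n2
```

Note that, unlike a consecutive-run check, these sums count qualifying frames anywhere in the queue.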
After a complete code-scanning transaction behavior has been determined, it is checked whether one of the two involved target human bodies i, j is a counter shopping guide. Since shopping guides generally wear uniforms, images of the uniform can be collected as positive samples and ordinary customers as negative samples to train an image classifier for shopping-guide recognition. If at least one of the two target human bodies is a shopping guide, the behavior is determined to be a code-scanning transaction behavior.
The number of recognized purchasing behaviors is then compared with the counter's sales records; if the former exceeds the latter, the counter is judged to have a fly-order risk. For counters at risk, the video of the recognized purchasing behavior can be saved and the alarm mechanism of the mall security system triggered, reminding relevant staff to perform a manual audit for final confirmation.
A second embodiment of the present application provides a behavior detection apparatus. In the above embodiment, a behavior detection method is provided, and correspondingly, the application also provides a behavior detection device. Referring to FIG. 8, a block diagram of a behavior detection device according to an embodiment of the present application is shown. Since this embodiment is substantially similar to the method embodiment provided in the first embodiment, the description is relatively simple, and the relevant points will be referred to in the description of the method embodiment. The device embodiments described below are merely illustrative.
The behavior detection device provided in this embodiment includes:
a target detection unit 201 for identifying at least two target subjects in successive frames;
a behavior matching unit 202 for matching the behavior part of the target subject with the behavior related object according to the positional relationship between the behavior part of the target subject and the behavior related object;
Before the behavior part of the target subject is matched with the behavior related object, the method further comprises: identifying, in the successive frames, the behavior related objects that are related to the target behavior.
The matching of the behavior part and the behavior related object according to the position information of the behavior part and the behavior related object of the target subject includes: if the key point position of the behavior part falls within the preset distance threshold range of the behavior related object, the behavior part is matched with the behavior related object.
A behavior determination unit 203, configured to obtain, according to the continuous frames, a change process of a first distance between the matched behavior portion of a first target subject and the matched behavior portion of a second target subject on a time line, and a change process of the behavior related object, and determine whether there is a target behavior between the target subjects, where the behavior is a bidirectional behavior or a multidirectional behavior; the second target subject is a target subject different from the first target subject among the at least two target subjects.
The multi-directional behavior involves at least one third target subject other than the first and second target subjects, and the determining whether there is a target behavior between the target subjects comprises: if a bidirectional behavior exists between the first target subject and the third target subject, and at the same time a bidirectional behavior also exists between the second target subject and the third target subject, determining that a multidirectional behavior exists between the target subjects.
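This pairwise rule — a multidirectional behavior holds when some third subject shares a bidirectional behavior with both the first and the second subject — can be sketched as follows (the set-of-pairs representation and the function name are assumptions for illustration):

```python
def multidirectional(bidir_pairs, a, b, candidates):
    """True if some third subject c in candidates has a detected
    bidirectional behavior with both a and b; bidir_pairs is a set of
    frozensets of subject ids with detected bidirectional behavior."""
    return any(
        frozenset({a, c}) in bidir_pairs and frozenset({b, c}) in bidir_pairs
        for c in candidates
    )
```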
The determining whether there is a target behavior between the target subjects includes: if the change process of the first distance on the time line has a correlation with the change process of the behavior related object, determining that a target behavior exists between the target subjects.
A process of changing a distance between the matched behavior portion of the first target subject and the matched behavior portion of the second target subject over a timeline, comprising: in the continuous frames, the first distance keeps continuously decreasing for a number of continuous frames greater than a preset first threshold, and then the first distance keeps continuously increasing for a number of continuous frames greater than a preset second threshold.
A process of changing a distance between the matched behavior portion of the first target subject and the matched behavior portion of the second target subject over a timeline, comprising: in the continuous frames, the continuous frame number of which the first distance is smaller than a preset third threshold value is larger than a preset fourth threshold value.
A process of changing a distance between the matched behavior portion of the first target subject and the matched behavior portion of the second target subject over a timeline, comprising: in the continuous frames, the continuous frame number of the first distance between the preset fifth threshold value and the preset sixth threshold value is larger than the preset seventh threshold value.
The behavior related object change process comprises: a change process of the positional relationship between the behavior related object and its matched behavior part in the continuous frames.
The behavior related object change process comprises: a change process, in the continuous frames, of the positional relationship between the behavior related object matched with the behavior part of the first target subject and the behavior related object matched with the behavior part of the second target subject.
The first distance is a euclidean distance of a keypoint location of the matched behavior portion of the first target subject and a keypoint location of the matched behavior portion of the second target subject.
At least one category of the target subjects involved in the target behavior is further identified as a target category.
The target behavior determination result is compared with a preset behavior result record corresponding to the target behavior, to determine whether the two are consistent, yielding a consistency determination result. If the consistency determination result is inconsistent, an alarm signal is output.
In general, the monitoring system may include a sensing side (e.g., a camera), a computing side (e.g., a server), and an executing side (e.g., an alarm terminal). The device units of the application can adopt different deployment schemes in the monitoring system according to the monitoring performance requirement and the supporting capability of the system. For example, the target detection unit may be disposed on the perception side, the behavior matching unit and the behavior determination unit may be disposed on the calculation side, and the calculation side may transfer the required action execution instruction to the execution side. Alternatively, the target detection unit and the behavior matching unit may be disposed on the perception side, the behavior determination unit may be disposed on the calculation side, and the calculation side may transfer the required action execution instruction to the execution side. Alternatively, the target detection unit, the behavior matching unit and the behavior determination unit may be disposed on the sensing side, and the sensing side directly transmits the required action execution instruction to the execution side without going through the calculation side, or the required calculation amount may be flexibly distributed among units with calculation capability by adopting other more complex manners such as distributed calculation, load balancing, and the like, which is not limited in this aspect of the present application.
In the foregoing embodiments, a behavior detection method and a corresponding apparatus are provided, and in addition, a third embodiment of the present application further provides an electronic device embodiment corresponding to the foregoing method embodiment and apparatus embodiment, and since the electronic device embodiment is substantially similar to the method embodiment, the description of the electronic device embodiment is relatively simple, and details of relevant technical features and implementation effects should be referred to the corresponding description of the foregoing method embodiment, where the following description of the electronic device embodiment is merely illustrative. The electronic device embodiment is as follows:
in a third embodiment of the present application, please refer to fig. 9, which is a schematic diagram of an electronic device according to the present application. The electronic device includes:
a processor 301;
and,
a memory 302, configured to store a computer program; after the device runs the computer program through the processor 301, the steps shown in the above method embodiments are executed, which are not described herein again.
Furthermore, a fourth embodiment of the present application also provides a computer-readable storage medium for implementing the above method. The embodiments of the computer readable storage medium provided by the present application are described more simply, and reference should be made to the corresponding descriptions of the above-described method embodiments, which are merely illustrative.
The computer readable storage medium provided in this embodiment stores computer instructions that, when executed by the processor, implement the steps shown in the foregoing method embodiments, which are not described herein.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or nonvolatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transitory media), such as modulated data signals and carrier waves.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
While the application has been described in terms of preferred embodiments, it is not intended to be limiting, but rather, it will be apparent to those skilled in the art that various changes and modifications can be made herein without departing from the spirit and scope of the application as defined by the appended claims.

Claims (13)

1. A behavior detection method, comprising:
identifying at least two target subjects in successive frames;
matching the behavior part of the target subject with the behavior related object according to the position relation of the behavior part of the target subject and the behavior related object;
according to the continuous frames, a change process of a first distance between the matched behavior part of the first target subject and the matched behavior part of the second target subject on a time line and a change process of the behavior related object are obtained, and whether a target behavior exists between the target subjects is judged, wherein the behavior is a bidirectional behavior or a multidirectional behavior; the second target subject is a target subject different from the first target subject among the at least two target subjects.
2. The method of claim 1, wherein the multi-directional behavior involves at least one third target subject other than the first target subject and the second target subject, the determining whether there is target behavior between the target subjects comprising:
and if a bidirectional behavior exists between the first target subject and the third target subject, and meanwhile a bidirectional behavior also exists between the second target subject and the third target subject, judging that a multidirectional behavior exists between the target subjects.
3. The method of claim 1, wherein said determining whether there is a target behavior between the target subjects comprises:
and if the first correlation exists between the change process of the first distance on the time line and the change process of the behavior related object, judging that the target behavior exists between the target subjects.
4. The method of claim 1, wherein the change in the first distance between the matched behavior portion of the first target subject and the matched behavior portion of the second target subject over the timeline comprises:
in the successive frames, the first distance decreases continuously for more than a preset first threshold number of consecutive frames, and then increases continuously for more than a preset second threshold number of consecutive frames.
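The pattern of claim 4 (a sustained approach followed by a sustained separation) can be checked on the per-frame distance series with a simple run scan. This is an illustrative sketch; the claim leaves the threshold values and the strictness of "decreasing" open, and strict monotonicity is assumed here:

```python
def has_approach_then_separate(distances, t1, t2):
    """True if the series decreases strictly for more than t1 consecutive
    frames and then increases strictly for more than t2 consecutive frames."""
    n = len(distances)
    i = 0
    while i < n - 1:
        # Measure the strictly decreasing run starting at frame i.
        j = i
        while j < n - 1 and distances[j + 1] < distances[j]:
            j += 1
        if j - i > t1:
            # Measure the strictly increasing run immediately after it.
            k = j
            while k < n - 1 and distances[k + 1] > distances[k]:
                k += 1
            if k - j > t2:
                return True
        i = j + 1 if j > i else i + 1
    return False
```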
5. The method of claim 1, wherein the change in the first distance between the matched behavior portion of the first target subject and the matched behavior portion of the second target subject over the timeline comprises:
in the successive frames, the number of consecutive frames in which the first distance is smaller than a preset third threshold is greater than a preset fourth threshold.
6. The method of claim 1, wherein the change in the first distance between the matched behavior portion of the first target subject and the matched behavior portion of the second target subject over the timeline comprises:
in the successive frames, the number of consecutive frames in which the first distance lies between a preset fifth threshold and a preset sixth threshold is greater than a preset seventh threshold.
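Claims 5 and 6 both reduce to counting the longest consecutive run of frames whose first distance satisfies a condition. A small helper, given here as an assumption-free restatement of that counting step, covers both:

```python
def longest_run(distances, pred):
    # Longest number of consecutive frames whose first distance satisfies pred.
    best = cur = 0
    for d in distances:
        cur = cur + 1 if pred(d) else 0
        best = max(best, cur)
    return best
```

Claim 5 then holds when `longest_run(ds, lambda d: d < t3) > t4`, and claim 6 when `longest_run(ds, lambda d: t5 <= d <= t6) > t7`, where the `t*` threshold values are left open by the claims.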
7. The method of claim 1, wherein the change process of the behavior related object comprises:
a change, over the successive frames, in the positional relationship between the behavior related object and its matched behavior part.
8. The method of claim 1, wherein the change process of the behavior related object comprises:
a change, over the successive frames, in the positional relationship between the behavior related object matched with the behavior part of the first target subject and the behavior related object matched with the behavior part of the second target subject.
9. The method according to any one of claims 2-8, further comprising:
identifying the category of at least one target subject involved in the target behavior as a target category.
10. The method according to claim 1, wherein the matching of the behavior part with the behavior related object according to the positional relationship between the behavior part of the target subject and the behavior related object comprises:
if the keypoint position of the behavior part falls within a preset distance threshold of the behavior related object, matching the behavior part with the behavior related object.
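Claim 10's matching rule can be sketched as a nearest-object search under the preset distance threshold. The identifier names and the use of object center points are illustrative assumptions; the claim only requires the keypoint to fall within the threshold distance of the object:

```python
import math

def match_parts_to_objects(part_keypoints, object_centers, dist_threshold):
    """Match each behavior-part keypoint (x, y) to the nearest behavior
    related object whose center lies within dist_threshold of it;
    parts with no object in range remain unmatched."""
    matches = {}
    for part_id, (px, py) in part_keypoints.items():
        best_obj, best_d = None, dist_threshold
        for obj_id, (ox, oy) in object_centers.items():
            d = math.hypot(px - ox, py - oy)
            if d <= best_d:
                best_obj, best_d = obj_id, d
        if best_obj is not None:
            matches[part_id] = best_obj
    return matches
```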
11. A behavior detection apparatus, characterized by comprising:
a target detection unit for identifying at least two target subjects in successive frames;
a behavior matching unit, configured to match the behavior part of the target subject with a behavior related object according to the positional relationship between the behavior part of the target subject and the behavior related object;
a behavior determination unit, configured to obtain, from the successive frames, a change process over the timeline of a first distance between the matched behavior part of a first target subject and the matched behavior part of a second target subject, together with a change process of the behavior related object, and to determine whether a target behavior exists between the target subjects, the target behavior being a bidirectional behavior or a multidirectional behavior; the second target subject is a target subject, among the at least two target subjects, different from the first target subject.
12. An electronic device, comprising: a processor, a memory, and computer program instructions stored on the memory and executable on the processor; wherein the processor, when executing the computer program instructions, implements the method of any one of claims 1 to 10.
13. A computer readable storage medium, characterized in that the computer readable storage medium stores computer executable instructions which, when executed by a processor, implement the method of any one of claims 1 to 10.
CN202311015030.5A 2023-08-11 2023-08-11 Behavior detection method and device and electronic equipment Pending CN117218678A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311015030.5A CN117218678A (en) 2023-08-11 2023-08-11 Behavior detection method and device and electronic equipment


Publications (1)

Publication Number Publication Date
CN117218678A 2023-12-12

Family

ID=89037785

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311015030.5A Pending CN117218678A (en) 2023-08-11 2023-08-11 Behavior detection method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN117218678A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108762497A (en) * 2018-05-18 2018-11-06 深圳壹账通智能科技有限公司 Somatosensory interaction method, apparatus, device and readable storage medium
WO2021180060A1 (en) * 2020-03-11 2021-09-16 杭州海康威视数字技术股份有限公司 Channel gate control method, apparatus and system
CN113516092A (en) * 2021-07-27 2021-10-19 浙江大华技术股份有限公司 Method and device for determining target behavior, storage medium and electronic device
US20220125359A1 (en) * 2019-12-02 2022-04-28 Eknauth Persaud Systems and methods for automated monitoring of human behavior
CN114648719A (en) * 2022-03-28 2022-06-21 上海商汤科技开发有限公司 Article state tracking method and device, electronic equipment and storage medium
CN115641548A (en) * 2022-10-10 2023-01-24 浙江莲荷科技有限公司 Abnormality detection method, apparatus, device and storage medium
CN116206363A (en) * 2023-02-09 2023-06-02 北京四维图新科技股份有限公司 Behavior recognition method, apparatus, device, storage medium, and program product


Similar Documents

Publication Publication Date Title
US11790433B2 (en) Constructing shopper carts using video surveillance
CN111415461B (en) Article identification method and system and electronic equipment
JP6148480B2 (en) Image processing apparatus and image processing method
Yoon et al. Analyzing basketball movements and pass relationships using realtime object tracking techniques based on deep learning
TWI778030B (en) Store apparatus, store management method and program
JP4972491B2 (en) Customer movement judgment system
JP7448065B2 (en) Store equipment, store systems, store management methods, programs
CN108198052A (en) User's free choice of goods recognition methods, device and intelligent commodity shelf system
CN107909443A (en) Information-pushing method, apparatus and system
CN111263224B (en) Video processing method and device and electronic equipment
CN111091025B (en) Image processing method, device and equipment
JP2022539920A (en) Method and apparatus for matching goods and customers based on visual and gravity sensing
US20180293598A1 (en) Personal behavior analysis device, personal behavior analysis system, and personal behavior analysis method
CN110309801A (en) A kind of video analysis method, apparatus, system, storage medium and computer equipment
CN107871111A (en) A kind of behavior analysis method and system
JP2002344946A (en) Monitoring system
CN109145127A (en) Image processing method and device, electronic equipment and storage medium
CN111260685B (en) Video processing method and device and electronic equipment
CN110689389A (en) Computer vision-based shopping list automatic maintenance method and device, storage medium and terminal
Melo et al. Low-cost trajectory-based ball detection for impact indication and recording
CN114565976A (en) Training intelligent test method and device
CN117218678A (en) Behavior detection method and device and electronic equipment
Schwenkreis An Approach to Use Deep Learning to Automatically Recognize Team Tactics in Team Ball Games.
RU2599699C1 (en) Method of detecting and analysing competition game activities of athletes
Quattrocchi et al. Put Your PPE on: A Tool for Synthetic Data Generation and Related Benchmark in Construction Site Scenarios.

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination