CN113657189B - Behavior detection method, electronic device, and computer-readable storage medium - Google Patents


Publication number
CN113657189B
CN113657189B
Authority
CN
China
Prior art keywords
monitored object
preset
behavior
head
line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110845439.4A
Other languages
Chinese (zh)
Other versions
CN113657189A (en)
Inventor
罗亮
孙志亮
黄鹏
徐桢
徐瑾
潘武
Current Assignee
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd
Priority to CN202110845439.4A
Publication of CN113657189A
Application granted
Publication of CN113657189B


Abstract

The application relates to a behavior detection method, an electronic device, and a computer-readable storage medium. The method includes the following steps: acquiring a monitoring video containing a preset monitoring area, and positioning a detection line in the preset monitoring area; performing target tracking based on the monitoring video to obtain tracking information of the head of the monitored object; detecting, based on the tracking information, whether the head of the monitored object crosses the detection line from a preset direction, and whether the head of the monitored object moves outside the preset monitoring area; and in the case that the head of the monitored object crosses the detection line from the preset direction and moves outside the preset monitoring area, determining that the monitored object performs the preset behavior. The method solves the problem that getting-up detection in the related art is prone to false alarms, and achieves more accurate detection of getting-up behavior.

Description

Behavior detection method, electronic device, and computer-readable storage medium
Technical Field
The present application relates to the field of machine vision, and in particular, to a behavior detection method, an electronic device, and a computer-readable storage medium.
Background
In scenarios such as prisons, schools, and kindergartens, detection of getting-up behavior is required for safety reasons. Existing getting-up detection methods generally use sensors or similar means to judge whether the monitored object has risen a certain height above the bed surface, which often causes false alarms when the monitored object turns over, sits up while dreaming, and so on.
Disclosure of Invention
In this embodiment, a behavior detection method, an electronic device, and a computer-readable storage medium are provided to solve the problem that getting-up detection in the related art is prone to false alarms.
In a first aspect, in this embodiment, there is provided a behavior detection method, including:
acquiring a monitoring video containing a preset monitoring area, and positioning a detection line in the preset monitoring area;
performing target tracking based on the monitoring video to obtain tracking information of the head of the monitored object;
detecting whether the head of the monitored object crosses the detection line from a preset direction based on the tracking information, and detecting whether the head of the monitored object moves outside the preset monitoring area;
and determining that the monitored object performs a preset behavior under the condition that the head of the monitored object passes through the detection line from the preset direction and moves out of the preset monitoring area.
In some embodiments, the preset monitoring area is a communal bed area, the detection line is parallel to the communal bed center line, the communal bed center line is perpendicular to the sleeping-posture extending direction of the monitored object on the communal bed, the preset direction is along the sleeping-posture extending direction, and the preset behavior is a getting-up behavior.
In some embodiments, the monitoring video is an overhead (top-down) video containing the preset monitoring area.
In some of these embodiments, detecting whether the head of the monitored object crosses the detection line from a preset direction based on the tracking information, and detecting whether the head of the monitored object moves outside the preset monitoring area includes:
acquiring a motion trail of the head of the monitored object based on the tracking information;
determining that the head of the monitored object crosses the detection line from the sleeping-posture extending direction under the condition that the motion trail intersects the detection line;
and under the condition that at least one track point located outside the communal bed area exists on the motion trail, determining that the head of the monitored object moves outside the communal bed area.
In some embodiments, the monitoring video is a video containing the preset monitoring area captured by an obliquely mounted monitoring camera.
In some of these embodiments,
the method further comprises: acquiring a first intersection point of the trunk and the detection line when the monitored object is in the sleeping-posture state, and a second intersection point of the head-to-foot extension line and the edge of the preset monitoring area when the monitored object is in the sleeping-posture state;
based on the tracking information, detecting whether the head of the monitored object crosses the detection line from a preset direction, and detecting whether the head of the monitored object moves outside the preset monitoring area includes: acquiring a motion trail of the head of the monitored object based on the tracking information; determining that the head of the monitored object crosses the detection line from the sleeping-posture extending direction under the condition that at least one track point whose abscissa value is greater than or equal to the abscissa value of the first intersection point exists on the motion trail; and under the condition that at least one track point whose abscissa value is greater than or equal to the abscissa value of the second intersection point exists on the motion trail, determining that the head of the monitored object moves outside the communal bed area.
In some embodiments, acquiring a first intersection point of the trunk and the detection line when the monitored object is in the sleeping posture state, and acquiring a second intersection point of an extension line from the head to the foot and the edge of the preset monitoring area when the monitored object is in the sleeping posture state includes:
based on the tracking information, acquiring the initial position of the head when the monitored object is in the sleeping-posture state, and acquiring the ordinate value of the initial position;
taking the point on the detection line whose ordinate value is the same as that of the initial position as the first intersection point;
positioning, among the edge lines of the communal bed area, a preset edge line which is parallel to the communal bed center line and relatively far from the initial position;
and taking the point on the preset edge line whose ordinate value is the same as that of the initial position as the second intersection point.
In some embodiments, acquiring a first intersection point of the trunk and the detection line when the monitored object is in the sleeping posture state, and acquiring a second intersection point of an extension line from the head to the foot and the edge of the preset monitoring area when the monitored object is in the sleeping posture state includes:
acquiring the initial position of the head and the initial position of the foot when the monitored object is in the sleeping-posture state based on the monitoring video, and taking the intersection point of the line connecting these two initial positions with the detection line as the first intersection point;
positioning, among the edge lines of the communal bed area, a preset edge line which is parallel to the communal bed center line and relatively far from the initial position;
and taking the point on the preset edge line whose ordinate value is the same as that of the first intersection point as the second intersection point.
In some of these embodiments, the method further comprises: and triggering a behavior alarm under the condition that the monitored object executes the preset behavior.
In some of these embodiments, before triggering a behavior alarm in the case where it is determined that the monitored object performs a preset behavior, the method further includes:
and identifying the identity of the monitored object based on the tracking information of the monitored object, wherein the identity is carried during behavior alarm.
In some of these embodiments, triggering a behavior alarm if it is determined that the monitored object performed a preset behavior includes:
acquiring a preset time range corresponding to the monitored object based on the identity, wherein the preset time range represents a time range allowing the monitored object to execute the preset behavior;
and triggering a behavior alarm under the condition that the monitored object executes the preset behavior and the current time is not in the preset time range.
In a second aspect, in this embodiment, there is provided an electronic device including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the steps of the behavior detection method described in the first aspect.
In a third aspect, in this embodiment, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the behavior detection method of the first aspect described above.
Compared with the related art, the behavior detection method, electronic device, and computer-readable storage medium provided in this embodiment acquire a monitoring video containing a preset monitoring area and position a detection line in the preset monitoring area; perform target tracking based on the monitoring video to obtain tracking information of the head of the monitored object; detect, based on the tracking information, whether the head of the monitored object crosses the detection line from a preset direction and whether the head moves outside the preset monitoring area; and determine that the monitored object performs the preset behavior in the case that the head crosses the detection line from the preset direction and moves outside the preset monitoring area. This solves the problem that getting-up detection in the related art is prone to false alarms, and achieves more accurate detection of getting-up behavior.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below; other features, objects, and advantages of the application will become more apparent from them.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
fig. 1 is a block diagram of the hardware configuration of a terminal of the behavior detection method of the present embodiment.
Fig. 2 is a flowchart of the behavior detection method of the present embodiment.
Fig. 3 is a schematic diagram of behavior detection based on a nodding video of the present embodiment.
Fig. 4 is a schematic diagram of behavior detection based on oblique-shot video of the present embodiment.
Detailed Description
For a clearer understanding of the objects, technical solutions and advantages of the present application, the present application is described and illustrated below with reference to the accompanying drawings and examples.
Unless defined otherwise, technical or scientific terms used herein shall have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terms "a," "an," "the," "these," and the like in this application do not denote a limitation of quantity and may be singular or plural. The terms "comprising," "including," "having," and any variations thereof, as used in this application, are intended to cover a non-exclusive inclusion; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (units) is not limited to the listed steps or modules (units), but may include other steps or modules (units) not listed or inherent to such process, method, article, or apparatus. The terms "connected," "coupled," and the like in this application are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. Reference to "a plurality" in this application means two or more. "And/or" describes an association relationship between associated objects and covers three cases; for example, "A and/or B" may mean: A exists alone, A and B exist together, or B exists alone. Generally, the character "/" indicates an "or" relationship between the associated objects. The terms "first," "second," "third," and the like in this application merely distinguish similar objects and do not represent a particular ordering of the objects.
The method embodiments provided herein may be executed in a terminal, a computer, or a similar computing device. Taking a terminal as an example, fig. 1 is a block diagram of the hardware structure of a terminal for the behavior detection method of this embodiment. As shown in fig. 1, the terminal may include one or more (only one is shown in fig. 1) processors 102 and a memory 104 for storing data, where the processors 102 may include, but are not limited to, a microprocessor (MCU), a programmable logic device (FPGA), or the like. The terminal may also include a transmission device 106 for communication functions and an input-output device 108. It will be appreciated by those skilled in the art that the structure shown in fig. 1 is merely illustrative and is not intended to limit the structure of the terminal. For example, the terminal may also include more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1.
The memory 104 may be used to store a computer program, for example, a software program of application software and a module, such as a computer program corresponding to the behavior detection method in the present embodiment, and the processor 102 executes the computer program stored in the memory 104 to perform various functional applications and data processing, that is, to implement the above-described method. Memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory remotely located relative to the processor 102, which may be connected to the terminal via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. The network includes a wireless network provided by a communication provider of the terminal. In one example, the transmission device 106 includes a network adapter (Network Interface Controller, simply referred to as NIC) that can connect to other network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is configured to communicate with the internet wirelessly.
In this embodiment, a behavior detection method is provided, fig. 2 is a flowchart of the behavior detection method of this embodiment, and as shown in fig. 2, the flowchart includes the following steps:
step S201, a monitoring video containing a preset monitoring area is obtained, and a detection line is positioned in the preset monitoring area.
Step S202, target tracking is carried out based on the monitoring video, and tracking information of the head of the monitored object is obtained.
Step S203 detects whether the head of the monitored object crosses the detection line from the preset direction and detects whether the head of the monitored object moves outside the preset monitoring area based on the tracking information.
In step S204, in the case that the head of the monitored object crosses the detection line from the preset direction and moves outside the preset monitoring area, it is determined that the monitored object performs the preset behavior.
Through the above steps, target tracking is performed on the head of the monitored object, and whether the monitored object performs a getting-up action is judged based on the motion trail of the head. This reduces the false alarm rate compared with related-art approaches that only detect whether the monitored object has risen a certain height above the bed surface.
The preset monitoring area and the detection line can be configured by a user. For example, a user manually draws an area of a preset monitoring area and a detection line in a monitoring video.
The preset monitoring area is typically a quadrilateral determined by four vertices; a user can therefore specify the preset monitoring area by locating the positions of the four vertices within the monitoring video. In this embodiment, the preset monitoring area is mainly a bed area; in the case of a communal bed, the preset monitoring area is the communal bed area, and more specifically, the bed-surface area of the communal bed.
In this embodiment, the detection line is parallel to the communal bed center line in the global coordinate system. Thus, the user can determine the position of the detection line by locating one point within the preset monitoring area in the monitoring video. It should be noted that, although the detection line is parallel to the communal bed center line in the global coordinate system, in the coordinate system of a monitoring-video frame the preset monitoring area may not be rectangular due to perspective, lens distortion, and other causes; in that case the parallel relationship between the detection line and the communal bed center line is determined by the geometric relationship.
The communal bed center line is perpendicular to the sleeping-posture extending direction of the monitored object on the communal bed; generally, the short-side direction of the communal bed is the sleeping-posture extending direction. The preset direction is along the sleeping-posture extending direction, and the preset behavior is the getting-up behavior.
The monitoring video is obtained by a monitoring camera, which can capture overhead video or oblique-shot video.
For example, by installing a monitoring camera on the ceiling and shooting straight down, an overhead video containing the preset monitoring area can be obtained. The overhead video can be regarded as a two-dimensional space. Fig. 3 is a schematic diagram of behavior detection based on overhead video according to this embodiment. As shown in fig. 3, when detecting, based on the tracking information, whether the head of the monitored object crosses the detection line 20 from the preset direction and whether the head moves outside the preset monitoring area 30, the motion trail (1) → (2) → (3) of the head of the monitored object, which is approximately a line segment within the preset monitoring area 30 in the overhead video, can be obtained based on the tracking information. Under the condition that the motion trail intersects the detection line, it is determined that the head of the monitored object crosses the detection line from the sleeping-posture extending direction; and in the case that at least one track point outside the communal bed area (i.e., the preset monitoring area 30) exists on the motion trail, it is determined that the head of the monitored object moves outside the communal bed area.
In the overhead video shown in fig. 3, the upper-left corner of the video frame is taken as the origin, the horizontal direction of the video frame as the X axis, and the vertical direction as the Y axis, with the communal bed extending along the Y axis. The detection line and the edge of the preset monitoring area can then each be represented by a single x value (each being a straight line parallel to the Y axis). When judging whether the motion trail intersects the detection line and whether at least one track point lies outside the communal bed area, it is only necessary to check whether a track point whose x value equals the x value of the detection line exists on the motion trail, and whether a track point whose x value is greater than the x value of the edge line exists, which makes the detection simple and convenient.
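In this overhead setting, the two checks of step S203 reduce to simple x-value comparisons. The following is a minimal illustrative sketch; the function names, the crossing test, and the coordinate convention (origin at top left, x to the right, as in fig. 3) are assumptions for illustration, not the patented implementation:

```python
# Overhead view: the detection line and the far bed edge are both vertical
# lines, so each is represented by a single x value. Track points are
# (x, y) tuples produced by the head tracker.

def crosses_detection_line(track, line_x):
    """Head crossed the line if the trail starts before line_x
    and reaches or passes it."""
    xs = [x for x, _ in track]
    return min(xs) < line_x and max(xs) >= line_x

def leaves_bed_area(track, edge_x):
    """Head left the communal bed area if any point lies beyond the edge."""
    return any(x > edge_x for x, _ in track)

def is_getting_up(track, line_x, edge_x):
    """Both conditions of step S203/S204 must hold."""
    return crosses_detection_line(track, line_x) and leaves_bed_area(track, edge_x)
```

A trail moving from x = 10 to x = 60 across a detection line at x = 30 and a bed edge at x = 50 is flagged; a trail that stays at x < 30 is not.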
However, with a lower ceiling height, the field of view of the overhead video is smaller, and a longer communal bed cannot be completely monitored. Therefore, in other embodiments, behavior detection is performed using video captured by an obliquely mounted monitoring camera installed in a corner of the room. At the same ceiling height, compared with an overhead camera, an obliquely mounted monitoring camera not only has a larger field of view but can also be used for broader daily monitoring of the monitored object, and thus has wider application.
Fig. 4 is a schematic diagram of behavior detection based on oblique-shot video according to this embodiment. As shown in fig. 4, the oblique-shot video cannot simply be regarded as a two-dimensional space; it is a projection of three-dimensional space onto a two-dimensional plane. In this embodiment, behavior detection is based on certain geometric invariants of the projection of three-dimensional space onto the two-dimensional plane.
The behavior detection method of this embodiment may further obtain a first intersection point 11 of the trunk and the detection line when the monitored object is in the sleeping-posture state, and a second intersection point 12 of the head-to-foot extension line and the edge of the preset monitoring area when the monitored object is in the sleeping-posture state.
Wherein, when detecting, based on the tracking information, whether the head of the monitored object crosses the detection line from the preset direction and whether the head moves outside the preset monitoring area 30, the motion trail (1) → (2) → (3) of the head of the monitored object can be obtained based on the tracking information. Under the condition that at least one track point whose abscissa value is greater than or equal to the abscissa value of the first intersection point 11 exists on the motion trail, it is determined that the head of the monitored object crosses the detection line 20 from the sleeping-posture extending direction; and in the case that at least one track point whose abscissa value is greater than or equal to the abscissa value of the second intersection point 12 exists on the motion trail, it is determined that the head of the monitored object moves outside the communal bed area (i.e., the preset monitoring area 30).
In some embodiments, in order to prevent false alarms, it is determined that the head of the monitored object moves outside the communal bed area only when a plurality of track points whose abscissa values are greater than or equal to the abscissa value of the second intersection point exist on the motion trail, where each track point corresponds to one video frame image.
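Under the oblique view, the same two checks compare track-point abscissas against the abscissas of the two intersection points, and the multi-point condition above can serve as a jitter filter. A hypothetical sketch, in which the frame threshold is an illustrative assumption:

```python
def crossed_first_intersection(track, x1):
    """Head crossed the detection line: some track point has
    abscissa >= the first intersection's abscissa x1."""
    return any(x >= x1 for x, _ in track)

def exited_bed_area(track, x2, min_frames=3):
    """Head moved outside the area: at least min_frames track points
    (one per video frame) have abscissa >= the second intersection's
    abscissa x2, suppressing single-frame jitter."""
    return sum(1 for x, _ in track if x >= x2) >= min_frames

def is_getting_up_oblique(track, x1, x2, min_frames=3):
    return crossed_first_intersection(track, x1) and exited_bed_area(track, x2, min_frames)
```

With x1 = 30 and x2 = 50, a trail with three or more points beyond x = 50 triggers the determination, while a single outlier point beyond x = 50 does not.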
In some embodiments, acquiring a first intersection point of the trunk and the detection line when the monitored object is in the sleeping-posture state, and acquiring a second intersection point of the head-to-foot extension line and the edge of the preset monitoring area when the monitored object is in the sleeping-posture state includes: based on the tracking information, acquiring the initial position of the head when the monitored object is in the sleeping-posture state, and acquiring the ordinate value of the initial position; taking the point on the detection line whose ordinate value is the same as that of the initial position as the first intersection point; positioning, among the edge lines of the communal bed area, a preset edge line 40 which is parallel to the communal bed center line and relatively far from the initial position; and taking the point on the preset edge line 40 whose ordinate value is the same as that of the initial position as the second intersection point.
In this embodiment, the sleeping-posture direction of the monitored object is simplified to a strictly horizontal direction (taking fig. 4 as an example), which simplifies the steps of acquiring the first intersection point and the second intersection point.
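With the sleeping posture simplified to the horizontal direction, both intersection points share the head's initial ordinate and can be read off the detection line and the far edge line by linear interpolation. The following is a sketch under that assumption; the helper names are hypothetical, and each line is assumed to be given by two endpoints in image coordinates:

```python
def x_at_y(p, q, y):
    """Abscissa where the line through points p and q reaches ordinate y."""
    (x0, y0), (x1, y1) = p, q
    t = (y - y0) / (y1 - y0)  # assumes the line is not horizontal
    return x0 + t * (x1 - x0)

def intersections_horizontal(head_init, det_line, far_edge):
    """First/second intersection points for a horizontal sleeping posture:
    both share the head's initial ordinate."""
    _, y = head_init
    p1 = (x_at_y(*det_line, y), y)   # first intersection, on the detection line
    p2 = (x_at_y(*far_edge, y), y)   # second intersection, on the far edge line
    return p1, p2
```

For a head initially at (5, 50), a detection line through (30, 0) and (40, 100), and a far edge line through (60, 0) and (70, 100), this yields intersection points at x = 35 and x = 65 on the ordinate 50.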
In some embodiments, acquiring a first intersection point of the trunk and the detection line when the monitored object is in the sleeping-posture state, and acquiring a second intersection point of the head-to-foot extension line and the edge of the preset monitoring area when the monitored object is in the sleeping-posture state includes: based on the monitoring video, acquiring the initial position of the head and the initial position of the foot when the monitored object is in the sleeping-posture state, and taking the intersection point of the line connecting these two initial positions with the detection line as the first intersection point; positioning, among the edge lines of the communal bed area, a preset edge line which is parallel to the communal bed center line and relatively far from the initial position; and taking the point on the preset edge line whose ordinate value is the same as that of the first intersection point as the second intersection point.
In this embodiment, the initial positions of the head and the foot of the monitored object are obtained by a target detection method, and the first intersection point is determined accordingly, so that the getting-up behavior of a monitored object in a non-standard sleeping posture, such as lying diagonally, can be detected more accurately.
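When head and foot initial positions are both detected, the first intersection point is a standard 2-D line-line intersection between the head-to-foot line and the detection line, and the second intersection point then takes the first's ordinate on the far edge line. A hypothetical sketch (the function names are illustrative, not from the patent):

```python
def line_intersection(a1, a2, b1, b2):
    """Intersection of the line through a1, a2 with the line through b1, b2."""
    (x1, y1), (x2, y2) = a1, a2
    (x3, y3), (x4, y4) = b1, b2
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if denom == 0:
        return None  # parallel lines, no unique intersection
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

def x_at_y(p, q, y):
    """Abscissa where the line through p and q reaches ordinate y."""
    (x0, y0), (x1, y1) = p, q
    return x0 + (y - y0) / (y1 - y0) * (x1 - x0)

def intersections_from_pose(head, foot, det_line, far_edge):
    """First intersection: head-foot line x detection line.
    Second intersection: same ordinate, on the far edge line."""
    p1 = line_intersection(head, foot, *det_line)
    p2 = (x_at_y(*far_edge, p1[1]), p1[1])
    return p1, p2
```

For example, with a head at (0, 0), a foot at (10, 10), a vertical detection line through x = 5, and a far edge line through x = 8, the intersections land at (5, 5) and (8, 5).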
In some of these embodiments, a behavior alarm is triggered if it is determined that the monitored object has performed the preset behavior. The alarm information may carry one or more video pictures captured during and/or before the alarm.
For example, in step S203, when it is detected that the head of the monitored object crosses the detection line from the preset direction, snapshot capture of video pictures of the monitored object is started and the pictures are temporarily stored; capture ends when it is detected that the head of the monitored object has moved outside the preset monitoring area. Finally, the video pictures are added to the alarm information for the behavior alarm.
In some embodiments, the identities of the monitored objects may also be identified based on tracking information (e.g., face images in a sequence of tracking images, tracking IDs, etc.) of the monitored objects, and carried in the alarm information.
In some embodiments, when the identity of the monitored object can be identified, different getting-up time ranges may also be set for different monitored objects. For example, based on the identity, a preset time range corresponding to the monitored object is acquired, where the preset time range represents the time range in which the monitored object is allowed to execute the preset behavior; a behavior alarm is triggered in the case that the monitored object executes the preset behavior and the current time is not within the preset time range. In this way, different preset time ranges can be configured for different monitored objects according to their getting-up schedules, achieving personalized getting-up behavior detection.
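The per-identity alarm rule above can be sketched as follows; the identity keys and time windows are illustrative assumptions, not from the patent:

```python
from datetime import time

# Hypothetical per-identity allowed getting-up windows (illustrative data).
ALLOWED_WINDOWS = {
    "id_001": (time(5, 30), time(7, 0)),
}
DEFAULT_WINDOW = (time(6, 0), time(7, 0))

def should_alarm(identity, now):
    """Alarm only when the preset behavior occurs outside the
    monitored object's allowed time range."""
    start, end = ALLOWED_WINDOWS.get(identity, DEFAULT_WINDOW)
    return not (start <= now <= end)
```

An object getting up at 06:00 within its 05:30-07:00 window raises no alarm; the same object at 03:00, or an unknown identity outside the default window, does.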
In addition, this embodiment can also configure a global time range for getting-up behavior detection: the behavior detection method is executed within the global time range, and no detection is performed outside it, so as to save computing resources. For example, the global time range may be set to 20:00 to 07:00.
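Because the example global range 20:00 to 07:00 spans midnight, the containment test is a disjunction rather than a simple range check. A minimal sketch (the function name is illustrative):

```python
from datetime import time

# Example global detection window from the text above; it crosses midnight.
GLOBAL_START, GLOBAL_END = time(20, 0), time(7, 0)

def detection_enabled(now):
    """True inside the 20:00-07:00 window. Since start > end, the window
    wraps around midnight and the test is an 'or', not a range check."""
    return now >= GLOBAL_START or now <= GLOBAL_END
```

Detection runs at 23:00 and 06:59 but is skipped at noon.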
It should be noted that the steps illustrated in the above flow or flow diagrams of the figures may be performed in a computer system, such as by a set of computer-executable instructions, and, although a logical order is illustrated in the flow diagrams, in some cases the steps illustrated or described may be performed in an order different from that given here. For example, the two detection processes in step S203 may be performed simultaneously or sequentially.
There is also provided in this embodiment an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, where the transmission device is connected to the processor, and the input/output device is connected to the processor.
Alternatively, in the present embodiment, the above-described processor may be configured to execute the following steps by a computer program:
step S201, a monitoring video containing a preset monitoring area is obtained, and a detection line is positioned in the preset monitoring area.
Step S202, target tracking is carried out based on the monitoring video, and tracking information of the head of the monitored object is obtained.
Step S203 detects whether the head of the monitored object crosses the detection line from the preset direction and detects whether the head of the monitored object moves outside the preset monitoring area based on the tracking information.
Step S204, in the case that the head of the monitored object crosses the detection line from the preset direction and moves outside the preset monitoring area, it is determined that the monitored object performs the preset behavior.
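Steps S201 to S204 can be sketched roughly as follows. This is an illustrative sketch under stated assumptions, not the patented implementation: a 2-D segment-intersection test stands in for the line-crossing check of step S203, a point-in-rectangle test stands in for the out-of-area check, and the direction constraint ("from a preset direction") is omitted for brevity. All names (`crossed_line`, `left_area`, `is_getting_up`) are hypothetical:

```python
from typing import List, Tuple

Point = Tuple[float, float]

def _ccw(a: Point, b: Point, c: Point) -> float:
    # Signed area test: >0 if a->b->c turns counter-clockwise.
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_intersect(p1: Point, p2: Point, q1: Point, q2: Point) -> bool:
    """True if segment p1-p2 properly crosses segment q1-q2."""
    d1, d2 = _ccw(q1, q2, p1), _ccw(q1, q2, p2)
    d3, d4 = _ccw(p1, p2, q1), _ccw(p1, p2, q2)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def crossed_line(track: List[Point], line: Tuple[Point, Point]) -> bool:
    """Step S203 (first check): does the head trajectory cross the detection line?"""
    return any(segments_intersect(track[i], track[i + 1], *line)
               for i in range(len(track) - 1))

def left_area(track: List[Point], area: Tuple[float, float, float, float]) -> bool:
    """Step S203 (second check): is any track point outside the monitoring area?
    `area` is an axis-aligned rectangle (x_min, y_min, x_max, y_max)."""
    x0, y0, x1, y1 = area
    return any(not (x0 <= x <= x1 and y0 <= y <= y1) for x, y in track)

def is_getting_up(track: List[Point], line: Tuple[Point, Point],
                  area: Tuple[float, float, float, float]) -> bool:
    """Step S204: both conditions must hold before the behavior is reported."""
    return crossed_line(track, line) and left_area(track, area)
```

Requiring both the line crossing and the departure from the area is what suppresses false alarms from a person merely turning over in bed, which is the stated motivation of the method.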
It should be noted that, specific examples in this embodiment may refer to examples described in the foregoing embodiments and alternative implementations, and are not described in detail in this embodiment.
In addition, in combination with the behavior detection method provided in the above embodiment, a storage medium may be provided in this embodiment. The storage medium has a computer program stored thereon; the computer program when executed by a processor implements the steps of any of the behavior detection methods of the above embodiments.
It should be understood that the specific embodiments described herein are merely illustrative of this application and are not intended to be limiting. All other embodiments obtainable by one of ordinary skill in the art without creative effort, based on the embodiments provided herein, fall within the scope of protection of the present application.
It is evident that the drawings are only examples or embodiments of the present application, from which a person skilled in the art can adapt the present application to other similar situations without inventive effort. In addition, it should be appreciated that although such development effort might be complex and lengthy, it would nevertheless be a routine undertaking for those of ordinary skill in the art having the benefit of this disclosure, and the disclosure should therefore not be construed as insufficiently detailed.
The term "embodiment" in this application means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive. It will be explicitly or implicitly understood by those of ordinary skill in the art that the embodiments described in this application can be combined with other embodiments without conflict.
The above examples represent only a few embodiments of the present application; their description is relatively specific and detailed, but it is not to be construed as limiting the scope of the patent. It should be noted that various modifications and improvements could be made by those skilled in the art without departing from the spirit of the present application, and these fall within the scope of protection of the present application. Accordingly, the scope of protection of the present application shall be subject to the appended claims.

Claims (13)

1. A behavior detection method, characterized by comprising:
acquiring a monitoring video containing a preset monitoring area, and positioning a detection line in the preset monitoring area;
performing target tracking based on the monitoring video to obtain tracking information of the head of the monitored object;
detecting whether the head of the monitored object crosses the detection line from a preset direction based on the tracking information, and detecting whether the head of the monitored object moves outside the preset monitoring area;
and determining that the monitored object performs a preset behavior in the case that the head of the monitored object crosses the detection line from the preset direction and moves outside the preset monitoring area.
2. The behavior detection method according to claim 1, wherein the preset monitoring area is a bunk area, the detection line is parallel to a center line of the bunk, the center line of the bunk is perpendicular to the direction in which the sleeping posture of the monitored object extends on the bunk, the preset direction is along the sleeping-posture extending direction, and the preset behavior is a getting-up behavior.
3. The behavior detection method according to claim 2, wherein the monitoring video is an overhead (top-view) video containing the preset monitoring area.
4. The behavior detection method according to claim 3, wherein detecting whether the head of the monitored object crosses the detection line from a preset direction based on the tracking information, and detecting whether the head of the monitored object moves outside the preset monitoring area, comprises:
acquiring a motion trail of the head of the monitored object based on the tracking information;
determining that the head of the monitored object crosses the detection line from the sleeping-posture extending direction in the case that the motion trail intersects the detection line;
and determining that the head of the monitored object moves outside the bunk area in the case that at least one track point on the motion trail is located outside the bunk area.
5. The behavior detection method according to claim 2, wherein the monitoring video is a video, containing the preset monitoring area, captured by an obliquely angled (tilt-shot) monitoring camera.
6. The behavior detection method according to claim 5, wherein,
the method further comprises the steps of: acquiring a first intersection point of a trunk and the detection line when the monitored object is in a sleeping posture state and a second intersection point of an extension line from a head to a foot and the edge of the preset monitoring area when the monitored object is in the sleeping posture state;
based on the tracking information, detecting whether the head of the monitored object crosses the detection line from a preset direction, and detecting whether the head of the monitored object moves outside the preset monitoring area comprises: acquiring a motion trail of the head of the monitored object based on the tracking information; determining that the head of the monitored object crosses the detection line from the sleeping-posture extending direction in the case that at least one track point on the motion trail has an abscissa value greater than or equal to the abscissa value of the first intersection point; and determining that the head of the monitored object moves outside the bunk area in the case that at least one track point on the motion trail has an abscissa value greater than or equal to the abscissa value of the second intersection point.
7. The behavior detection method according to claim 6, wherein acquiring a first intersection of a trunk and the detection line when the monitored object is in a sleeping posture state, and a second intersection of an extension line from a head to a foot and an edge of the preset monitoring area when the monitored object is in a sleeping posture state, comprises:
based on the tracking information, acquiring an initial position of the head when the monitored object is in the sleeping-posture state, and acquiring the ordinate value of the initial position;
taking, as the first intersection point, the point on the detection line whose ordinate value is the same as that of the initial position;
positioning, among the edge lines of the bunk area, a preset edge line that is parallel to the center line of the bunk and relatively far from the initial position;
and taking, as the second intersection point, the point on the preset edge line whose ordinate value is the same as that of the initial position.
8. The behavior detection method according to claim 6, wherein acquiring a first intersection of a trunk and the detection line when the monitored object is in a sleeping posture state, and a second intersection of an extension line from a head to a foot and an edge of the preset monitoring area when the monitored object is in a sleeping posture state, comprises:
acquiring an initial position of the head and an initial position of the feet when the monitored object is in the sleeping-posture state based on the monitoring video, and taking, as the first intersection point, the intersection of the detection line with the line connecting the initial position of the head and the initial position of the feet;
positioning, among the edge lines of the bunk area, a preset edge line that is parallel to the center line of the bunk and relatively far from the initial position of the head;
and taking, as the second intersection point, the point on the preset edge line whose ordinate value is the same as that of the first intersection point.
9. The behavior detection method according to any one of claims 1 to 8, characterized in that the method further comprises:
and triggering a behavior alarm under the condition that the monitored object executes the preset behavior.
10. The behavior detection method according to claim 9, wherein, in a case where it is determined that the monitored object performs a preset behavior, before triggering a behavior alarm, the method further comprises:
and identifying the identity of the monitored object based on the tracking information of the monitored object, wherein the identity is carried during behavior alarm.
11. The behavior detection method according to claim 10, wherein, in the case where it is determined that the monitored object performs a preset behavior, triggering a behavior alarm includes:
acquiring a preset time range corresponding to the monitored object based on the identity, wherein the preset time range represents a time range allowing the monitored object to execute the preset behavior;
and triggering a behavior alarm under the condition that the monitored object executes the preset behavior and the current time is not in the preset time range.
12. An electronic device comprising a memory and a processor, characterized in that the memory has stored therein a computer program, the processor being arranged to run the computer program to perform the steps of the behavior detection method of any one of claims 1 to 11.
13. A computer readable storage medium having stored thereon a computer program, characterized in that the computer program when executed by a processor implements the steps of the behavior detection method of any one of claims 1 to 11.
CN202110845439.4A 2021-07-26 2021-07-26 Behavior detection method, electronic device, and computer-readable storage medium Active CN113657189B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110845439.4A CN113657189B (en) 2021-07-26 2021-07-26 Behavior detection method, electronic device, and computer-readable storage medium


Publications (2)

Publication Number Publication Date
CN113657189A CN113657189A (en) 2021-11-16
CN113657189B (en) 2024-02-09

Family

ID=78478675

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110845439.4A Active CN113657189B (en) 2021-07-26 2021-07-26 Behavior detection method, electronic device, and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN113657189B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102074095A (en) * 2010-11-09 2011-05-25 无锡中星微电子有限公司 System and method for monitoring infant behaviors
JP2015186532A (en) * 2014-03-26 2015-10-29 富士通株式会社 Monitoring method, monitoring device, and monitoring program
CN105868707A (en) * 2016-03-28 2016-08-17 华中科技大学 Method for real-time detection of falling from bed behaviors based on depth image information
CN105938540A (en) * 2015-03-03 2016-09-14 富士通株式会社 Behavior detection method and behavior detection apparatus
CN109964248A (en) * 2017-03-02 2019-07-02 欧姆龙株式会社 Nurse auxiliary system and its control method and program
CN111046832A (en) * 2019-12-24 2020-04-21 广州地铁设计研究院股份有限公司 Image recognition-based retrograde determination method, device, equipment and storage medium
CN111783530A (en) * 2020-05-26 2020-10-16 武汉盛元鑫博软件有限公司 Safety system and method for monitoring and identifying behaviors in restricted area
CN111814587A (en) * 2020-06-18 2020-10-23 浙江大华技术股份有限公司 Human behavior detection method, teacher behavior detection method, and related system and device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2619724A2 (en) * 2010-09-23 2013-07-31 Stryker Corporation Video monitoring system


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Tracking and Predicting Movement Patterns of a Moving Object in Wireless Sensor Network; S. M. Gupta et al.; 2018 2nd International Conference on Trends in Electronics and Informatics (ICOEI); pp. 586-591 *
Kinect-based Dynamic Recognition Method of Learners' Head Posture; Fan Zijian et al.; Computer and Digital Engineering; Vol. 45, No. 02; pp. 360-366 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant