CN111832357A - Mobile event detection method and device


Info

Publication number
CN111832357A
CN111832357A
Authority
CN
China
Prior art keywords
area
monitoring area
determining
monitoring
custom
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910317264.2A
Other languages
Chinese (zh)
Inventor
修明磊
刘晓稳
刘杨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Ripple Information Technology Co ltd
Original Assignee
Suzhou Ripple Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Ripple Information Technology Co ltd
Priority to CN201910317264.2A
Publication of CN111832357A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/44 Event detection

Abstract

The present disclosure relates to a movement event detection method and apparatus. The method comprises the following steps: acquiring at least one custom monitoring area of an image frame to be detected; determining a parameter change area in the custom monitoring area, wherein the parameter change area is a connected region whose change value, relative to a preset image parameter at the same position in the previous image frame, is greater than a preset image parameter threshold value; and determining that a movement event occurs in the custom monitoring area when the area of the parameter change area is greater than a preset area threshold value. With the embodiments provided by the disclosure, on the one hand, detection of movement events in unneeded monitoring scenes can be reduced, thereby reducing the interference of movement event notification messages to the user; on the other hand, whether a movement event occurs in the custom monitoring area is determined based on both the change value of the preset image parameter and the size of the changed area, which can improve the detection accuracy of movement events.

Description

Mobile event detection method and device
Technical Field
The disclosure relates to the technical field of intelligent security and protection, in particular to a mobile event detection method and device.
Background
With the development of technologies such as computing and the Internet of Things, intelligent security technology has advanced greatly. Movement detection is an important component of the intelligent security field: it means determining whether a movement event occurs in a monitored area using images or videos captured by a monitoring camera device. When it is determined that a movement event has occurred within the monitored area, the user may be notified of the movement event.
In the related art, the monitoring area of a monitoring camera apparatus tends to be wide, and a movement event within it may be caused by, for example, a bird flying in the sky or a dog running on the lawn, events for which the user does not wish to receive notification messages. Detecting movement in areas such as the sky, walls, or grass and notifying the user of those movement events may therefore cause unnecessary interference.
Therefore, there is a need in the related art for a movement event detection method that better fits the user's actual application scenario.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a method and an apparatus for detecting a mobile event.
According to a first aspect of the embodiments of the present disclosure, there is provided a mobile event detection method, including:
acquiring at least one custom monitoring area of an image frame to be detected;
determining a parameter change area in the custom monitoring area, wherein the parameter change area is a connected region, and the change value of the parameter change area relative to a preset image parameter at the same position in the previous image frame is greater than a preset image parameter threshold value;
and determining that a movement event occurs in the custom monitoring area under the condition that the area of the parameter change area is larger than a preset area threshold value.
According to a second aspect of the embodiments of the present disclosure, there is provided a movement event detection apparatus including:
the user-defined area acquisition module is used for acquiring at least one user-defined monitoring area of the image frame to be detected;
a change region determination module, configured to determine a parameter change region in the custom monitoring region, where the parameter change region is a connected region, and a change value of the parameter change region with respect to a preset image parameter at the same position in a previous image frame is greater than a preset image parameter threshold;
and the event determining module is used for determining that a movement event occurs in the custom monitoring area under the condition that the area of the parameter change area is larger than a preset area threshold value.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic apparatus comprising an image pickup device, a storage device, the movement event detection device, and a communication device, wherein,
the camera device is used for acquiring the image frame to be detected;
the storage device is used for storing the information of the at least one custom monitoring area;
the communication device is configured to send a notification message, where the notification message includes information of the mobile event.
According to a fourth aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the above-described movement event detection method.
According to a fifth aspect of embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium, wherein instructions of the storage medium, when executed by a processor, enable the processor to perform the above-mentioned movement event detection method.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
The movement event detection method and device provided by the embodiments of the disclosure allow a user to set a custom monitoring area, which can be selected based on the user's area of interest. Movement events may then be detected within the custom monitoring area. Through these embodiments, on the one hand, detection of movement events in unneeded monitoring scenes can be reduced, thereby reducing the interference of movement event notification messages to the user; on the other hand, whether a movement event occurs in the custom monitoring area is determined based on both the change value of the preset image parameter and the size of the changed area, which can improve the detection accuracy of movement events.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart illustrating a method of mobile event detection according to an exemplary embodiment.
FIG. 2 is a diagram illustrating a scenario according to an example embodiment.
FIG. 3 is a diagram illustrating a scenario according to an example embodiment.
FIG. 4 is a diagram illustrating a scenario in accordance with an exemplary embodiment.
FIG. 5 is a diagram illustrating a scenario according to an example embodiment.
FIG. 6 is a block diagram illustrating a mobile event detection device according to an example embodiment.
FIG. 7 is a block diagram illustrating an electronic device in accordance with an example embodiment.
FIG. 8 is a block diagram illustrating an apparatus in accordance with an example embodiment.
FIG. 9 is a block diagram illustrating an apparatus in accordance with an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The following describes the movement event detection method of the present disclosure in detail with reference to fig. 1, which is a flowchart of an embodiment of the movement event detection method provided by the present disclosure. Although the present disclosure presents method steps as in the following embodiments or figures, the method may include more or fewer steps based on conventional or non-inventive effort. For steps that have no necessary logical causal relationship, the order of execution is not limited to that of the disclosed embodiments.
Specifically, an embodiment of a method for detecting a movement event provided by the present disclosure is shown in fig. 1, where the method may include:
s101: and acquiring at least one custom monitoring area of the image frame to be detected.
S103: and determining a parameter change area in the self-defined monitoring area, wherein the parameter change area is a communication area, and the change value of the parameter change area relative to the preset image parameter at the same position in the last image frame is greater than a preset image parameter threshold value.
S105: and under the condition that the area of the parameter change area is larger than a preset area threshold value, determining that a movement event occurs in the self-defined monitoring area.
The execution subject of the embodiments of the present disclosure may be an image pickup apparatus having an image capture function, such as a network camera (IP camera) or an in-vehicle camera. The image pickup apparatus may have image data processing capabilities, and may itself process captured image frames and determine movement events within the monitored area. Of course, in other embodiments, the image pickup apparatus may also have a network communication function and transmit the acquired image frames over a wired or wireless connection to a server or a client, which then processes the image frames and determines movement events in the monitored area.
In the embodiment of the present disclosure, the wired connection may use twisted pair, coaxial cable, optical fiber, and the like. The wireless connection may include 3G/4G, WiFi, Bluetooth, WiMAX, ZigBee, Ultra-Wideband (UWB), and the like. The server can exchange data with the image pickup apparatus through a network communication module based on a network protocol such as HTTP, TCP/IP, or FTP. The client may be a terminal device capable of accessing a communication network based on a network protocol, for example a smartphone, a computer (including laptop and desktop computers), a tablet computer, a personal digital assistant (PDA), or a smart wearable device. The client may also be software running on any of the devices listed above; the disclosure is not limited thereto.
In the embodiment of the disclosure, during the operation of the camera device, images of a monitored area are collected according to a preset sampling frequency, and the images collected within a period of time are connected in sequence to form a monitoring video, so that an image captured by the camera device at a single time can be an image frame corresponding to one frame of image in the monitoring video. In the disclosed embodiment, by comparing the image frame with the previous image frame, the movement event in the monitored area can be determined.
In the embodiment of the present disclosure, a customized monitoring area may be set, where the customized monitoring area is a part of an original monitoring area, and the original monitoring area may be a maximum area where the image capturing device may capture an image. The customized monitoring area may include a monitoring area customized by a user. In one example, a user installs a surveillance camera on an exterior wall of a yard that can capture images of multiple areas of the yard, such as lawns, trees, roads, sky, etc. Then, the user can only select interested areas such as lawn areas and road areas by customizing the monitoring area, and exclude tree areas and sky areas. Of course, in the embodiment of the present disclosure, a plurality of customized monitoring areas may be set in the original monitoring area to meet the requirement of the user for monitoring a plurality of different positions.
In one embodiment of the present disclosure, the customized monitoring area may be set to be obtained as follows:
s201: displaying an original monitoring area in a user interface, wherein a plurality of controllable points are arranged in the original monitoring area;
s203: receiving a moving operation of the plurality of controllable points;
s205: determining the position information of the plurality of controllable points after the movement;
s207: and connecting the plurality of controllable points into a closed area based on the position information, and taking the closed area as the user-defined monitoring area.
In the embodiment of the present disclosure, the original monitoring area may be displayed in a user interface of the client, and a plurality of controllable points may be disposed in the original monitoring area, and positions of the controllable points may be changed in the user interface. In use, after the user selects the controllable point in the user interface, the user can move the controllable point to a required position. The client may receive a moving operation on the plurality of controllable points, and determine location information of the moved controllable points. In the embodiment of the present disclosure, a coordinate system is disposed in the original monitoring area, and therefore, the controllable point has coordinate position information in the coordinate system. In one example, the coordinate system may be established based on pixels in the original monitoring area, and a distance between two adjacent pixels is a minimum distance of the coordinate system. Of course, the coordinate system may also be established based on other criteria, such as actual distance, etc., and the disclosure is not limited thereto.
In the embodiment of the present disclosure, after the position information is determined, the plurality of controllable points may be connected into a closed area based on the position information, and the closed area is used as the custom monitoring area. In one embodiment of the present disclosure, the controllable points may have number information, and after determining the position information, the plurality of controllable points may be connected into a closed area according to the number information. In the embodiment of the present disclosure, the line segment connecting the two controllable points may be a straight line segment, and the closed region formed at this time is an irregular polygon. Of course, in other embodiments, the line segment connecting the two controllable points may also be a curved line segment, and the disclosure is not limited thereto.
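As a non-authoritative sketch of steps S201 to S207, the numbered controllable points can be sorted by their number information after the move operation and connected, in order, into a closed polygon. The function name and the sample coordinates below are illustrative assumptions, not part of the disclosure.

```python
# Sketch of S201-S207: after the user drags the controllable points,
# sort them by their number information and connect them in order into
# a closed polygon serving as the custom monitoring area.

def build_custom_area(controllable_points):
    """controllable_points: list of (number, x, y) after the move operation.
    Returns the polygon as an ordered list of (x, y) vertices; the closing
    edge from the last vertex back to the first is implicit."""
    ordered = sorted(controllable_points, key=lambda p: p[0])
    return [(x, y) for _, x, y in ordered]

# Hypothetical points dragged by the user (number, x, y in pixel coordinates):
points = [(2, 100, 20), (0, 10, 10), (1, 80, 5), (3, 15, 60)]
polygon = build_custom_area(points)
# polygon: [(10, 10), (80, 5), (100, 20), (15, 60)]
```

Sorting by the number information implements the rule above that the closed area is formed "according to the number information" rather than the order in which points were moved.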
In one embodiment of the present disclosure, the customized monitoring area may be set to be obtained as follows:
s301: presenting a plurality of selectable closed graphics in a user interface;
s303: receiving a selection operation, wherein the selection operation comprises information of the selected selectable closed graph;
s305: displaying an original monitoring area in the user interface, and displaying the selected selectable closed graph in the original monitoring area;
s307: receiving a size adjustment operation on the selected selectable closed graph;
s309: and adjusting the size of the selected selectable closed figure based on the size adjustment operation, and taking the area contained in the adjusted figure as the custom monitoring area.
In embodiments of the present disclosure, a plurality of selectable closed graphics may be provided for selection by a user. In particular, a plurality of selectable closed graphics may be presented in the user interface of the client, which may include circles, ovals, polygons, hearts, and the like. The user can select the needed optional closed graphics in the user interface, and based on the selection operation, the client can receive the selection operation of the user, and the selection operation can contain the information of the selected optional closed graphics. After determining the selected selectable closed graph, the client may display the original monitoring area in the user interface, and display the selected selectable closed graph in the original monitoring area.
In the embodiment of the present disclosure, the user may further adjust the size of the selected selectable closed figure in the user interface. In one example, the user may drag four vertices of the quadrilateral, changing the size of the quadrilateral so that the area contained by the quadrilateral is the user region of interest.
In one embodiment of the present disclosure, the customized monitoring area may be set to be obtained as follows:
s401: displaying an original monitoring area in a user interface;
s403: receiving a sliding operation in the user interface;
s405: and under the condition that the graph formed by sliding is determined to be a closed graph, taking the area contained in the closed graph as the custom monitoring area.
In the embodiment of the present disclosure, the user may also perform a sliding operation in the original monitoring area, for example, the user may freely draw a circle, a quadrangle, and other closed areas with any shapes in the user interface to circle the monitoring area of interest. In the case that the graph formed by the sliding of the user is determined to be a closed graph, the area included in the closed graph can be used as the custom monitoring area. In the application scenario shown in fig. 2, the user freely selects the interested area in the original monitoring area, and the area enclosed by the white line in fig. 2 is the customized monitoring area.
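A minimal sketch of how S405 might decide that the slid figure is closed: sample the sliding operation as a sequence of (x, y) points and treat the stroke as closed when its end point returns near its start. The tolerance value and function name are assumptions; the disclosure does not specify how closure is detected.

```python
import math

# Sketch of S401-S405: a stroke is "closed" when the distance between its
# first and last sampled points falls within a tolerance (assumed here).

def is_closed_stroke(stroke, tolerance=10.0):
    if len(stroke) < 3:          # too few points to enclose an area
        return False
    (x0, y0), (x1, y1) = stroke[0], stroke[-1]
    return math.hypot(x1 - x0, y1 - y0) <= tolerance

circle_like = [(50, 0), (100, 50), (50, 100), (0, 50), (52, 3)]
open_line = [(0, 0), (50, 50), (100, 100)]
# is_closed_stroke(circle_like) is True; is_closed_stroke(open_line) is False
```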
It should be noted that a user may set a plurality of custom monitoring areas in the original monitoring area, where the plurality of custom monitoring areas do not overlap, so as to meet the user's monitoring requirements for different locations within the same area. In addition, after the position of the at least one custom monitoring area is determined, information of the custom monitoring area may be generated, for example, coordinate information of the controllable points, shape and position information of the selected selectable closed figure, shape and position information of the closed figure generated by sliding, and the like. In the embodiment of the present disclosure, the information of the at least one custom monitoring area may be stored; when the execution subject implements the method of the embodiments of the present disclosure, this information may be acquired, and parameters such as the shape and position of the at least one custom monitoring area may be determined in the image frame to be detected.
In the embodiment of the present disclosure, after at least one custom monitoring area of the image frame to be detected is obtained, a parameter change area in the custom monitoring area may be determined, where the parameter change area is a connected area, and a change value of the parameter change area relative to a preset image parameter at the same position in the previous image frame is greater than a preset image parameter threshold. In an embodiment of the present disclosure, the determining a parameter change area in the customized monitoring area may include:
s501: dividing the image frame to be detected into a plurality of cell blocks with equal sizes;
s503: determining a plurality of monitoring unit blocks located in the custom monitoring area from the plurality of unit blocks;
s505: screening a plurality of target unit blocks from the plurality of monitoring unit blocks, wherein the variation value of the target unit blocks relative to the preset image parameters of the unit blocks at the same position in the previous image frame is larger than a preset image parameter threshold value;
s507: determining a connected region composed of at least one target unit block, and taking the connected region as a parameter change area of the custom monitoring area.
In the embodiment of the present disclosure, the image frame to be detected may be divided into a plurality of unit blocks of equal size. In an embodiment, a unit block may comprise a plurality of pixel points, for example 2 × 2, 4 × 4, 6 × 4, or 8 × 8 pixels, which is not limited in this disclosure. Fig. 3 is a schematic diagram of an application scenario in which an image frame is divided into unit blocks. When a unit block comprises a plurality of pixel points, the amount of subsequent image processing can be reduced, and the design constraints of some chip suppliers can be met, since some chips can only process pixel blocks of preset sizes, such as 4 × 4 pixel blocks. Of course, in order to improve the accuracy of the image processing result, a unit block may also contain a single pixel; the disclosure is not limited thereto. In addition, the division into unit blocks is not limited to division based on pixel points; the division may also be performed based on parameters such as distance, and the disclosure is not limited herein.
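The block division of S501 can be sketched as follows, assuming the frame is a 2-D list of gray values and each unit block is represented by the mean gray value of its pixels (an illustrative choice; the disclosure only requires a per-block image parameter).

```python
# Sketch of S501: divide a grayscale frame into equal block_size x block_size
# unit blocks and represent each block by the mean gray value of its pixels.

def to_unit_blocks(frame, block_size):
    rows, cols = len(frame), len(frame[0])
    blocks = {}
    for by in range(rows // block_size):
        for bx in range(cols // block_size):
            total = 0
            for y in range(by * block_size, (by + 1) * block_size):
                for x in range(bx * block_size, (bx + 1) * block_size):
                    total += frame[y][x]
            blocks[(bx, by)] = total / (block_size * block_size)
    return blocks

frame = [[10, 10, 200, 200],
         [10, 10, 200, 200],
         [0, 0, 0, 0],
         [0, 0, 0, 0]]
blocks = to_unit_blocks(frame, 2)
# blocks[(0, 0)] == 10.0 and blocks[(1, 0)] == 200.0
```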
In the embodiment of the disclosure, after the image frame to be detected is divided into a plurality of unit blocks of equal size, a plurality of monitoring unit blocks located in the custom monitoring area may be determined from the plurality of unit blocks. As described above, the image to be detected has a corresponding coordinate system; therefore, after the division, each unit block has a corresponding coordinate position. The custom monitoring area likewise has a corresponding position, shape, and other parameters, so that even an irregular polygon can be represented by coordinate positions in the image frame to be detected. In one disclosed embodiment, the determining a plurality of monitoring unit blocks located in the custom monitoring area from among the plurality of unit blocks may include:
s601: determining coordinate positions of a plurality of the unit blocks, respectively;
s603: cutting the user-defined monitoring area into a plurality of first triangles, and respectively determining the directional areas of the first triangles;
s605: taking the coordinate position of the unit block as one vertex, and respectively forming three second triangles with two vertexes in the first triangle;
s607: and under the condition that the sum of the directional areas of the three second triangles is equal to the directional area of the first triangle, determining that the cell block is a monitoring cell block in the self-defined monitoring area.
The following describes an embodiment of S601-S607 with a specific application scenario, as shown in fig. 4, where an irregular octagon is a custom monitoring area disposed on the image frame to be detected shown in fig. 3, and the custom monitoring area has eight vertices. Each unit block in the image frame to be detected has a coordinate position, and when the unit block comprises a plurality of pixel points, the coordinate position of the center position of the unit block can be used as the coordinate position of the unit block. Of course, in other embodiments, the coordinate position of the unit block may also be the position information of the upper left corner, the upper right corner, or any one of the pixel points of the unit block, which is not limited in this disclosure.
When determining the unit blocks contained in the custom monitoring area, whether each unit block is located in the custom monitoring area can be judged. In a specific embodiment, as shown in fig. 5, the custom monitoring area may be cut into a plurality of first triangles; the custom monitoring area in this application scenario is cut into 6 first triangles. For a first triangle with vertices (x1, y1), (x2, y2), (x3, y3), the directed area S is:
S = (x1*y2) + (x3*y1) + (x2*y3) - (x3*y2) - (x1*y3) - (x2*y1)    (1)
In the embodiment of the present disclosure, in the process of determining whether a unit block is located in the custom monitoring area, the coordinate position of the unit block may be used as one vertex and combined with pairs of vertices of the first triangle to form three second triangles, and it is determined whether the sum of the directed areas of the three second triangles is equal to the directed area of the first triangle. In this application scenario, assume the coordinates of unit block A are (xm, ym). Unit block A and the first triangle with vertices (x1, y1), (x2, y2), (x3, y3) can form three second triangles: one with vertices (x1, y1), (x2, y2), (xm, ym); one with vertices (x1, y1), (x3, y3), (xm, ym); and one with vertices (x3, y3), (x2, y2), (xm, ym), with directed areas S1, S2, and S3 respectively. If S1 + S2 + S3 is equal to S, it may be determined that unit block A is located within the custom monitoring area; otherwise, it is located outside the custom monitoring area. Repeating this operation for each unit block in the image frame to be detected determines the monitoring unit blocks contained in the custom monitoring area.
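The point-in-area test of S601 to S607 and formula (1) can be sketched as follows. Absolute values are taken when comparing the directed areas so the comparison is independent of vertex orientation, and the fan triangulation from vertex 0 assumes a convex custom monitoring area; both are implementation choices the disclosure leaves open.

```python
# Sketch of S601-S607: directed-area formula (1) plus the three-second-
# triangle containment test, applied to each first triangle of the area.

def directed_area(p1, p2, p3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Formula (1) from the description (twice the signed triangle area,
    # which is consistent on both sides of the comparison).
    return x1 * y2 + x3 * y1 + x2 * y3 - x3 * y2 - x1 * y3 - x2 * y1

def in_triangle(p, a, b, c, eps=1e-9):
    s = abs(directed_area(a, b, c))
    s1 = abs(directed_area(a, b, p))
    s2 = abs(directed_area(a, c, p))
    s3 = abs(directed_area(c, b, p))
    return abs(s1 + s2 + s3 - s) <= eps   # S1 + S2 + S3 == S test

def in_custom_area(p, polygon):
    # Fan triangulation from vertex 0 (valid for convex areas; a concave
    # polygon would need a proper triangulation step).
    a = polygon[0]
    return any(in_triangle(p, a, polygon[i], polygon[i + 1])
               for i in range(1, len(polygon) - 1))

square = [(0, 0), (10, 0), (10, 10), (0, 10)]
# in_custom_area((5, 5), square) is True; in_custom_area((15, 5), square) is False
```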
In the embodiment of the present disclosure, after determining the plurality of monitoring unit blocks, a plurality of target unit blocks may be screened from the monitoring unit blocks, where the change value of a target unit block relative to the preset image parameter of the unit block at the same position in the previous image frame is greater than the preset image parameter threshold value. The preset image parameter may include at least one of: brightness, gray scale, saturation, contrast, color temperature, tone scale. In one example, the gray value of monitoring unit block B of the image frame to be detected is 200, the gray value of monitoring unit block C located at the same position in the previous image frame is 20, and the change in gray value between monitoring unit blocks B and C is therefore 180. If the preset image parameter threshold for gray value change is 20, it can be determined that monitoring unit block B is a target unit block. After the plurality of target unit blocks are determined, a connected region composed of at least one target unit block may be determined, and the connected region may be used as a parameter change region of the custom monitoring area. Two target unit blocks are connected when one lies in any of the eight directions (up, down, left, right, upper-left, upper-right, lower-left, lower-right) of the other. It should be noted that the same custom monitoring area may include two or more parameter change regions, for example, corresponding to two persons walking independently, and the disclosure is not limited herein.
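The screening of target unit blocks (S505), the grouping into 8-connected regions (S507), and the area-threshold decision (S105) can be sketched together as follows. Representing per-block gray values as a dictionary keyed by block coordinates, and counting region area in unit blocks, are illustrative assumptions.

```python
# Sketch of S505, S507, and the S105 decision: screen blocks whose gray
# value changed beyond the parameter threshold, flood-fill them into
# 8-connected regions, and report a movement event when any region's
# area (here, its block count) exceeds the area threshold.

def movement_event(prev_blocks, curr_blocks, param_threshold, area_threshold):
    targets = {pos for pos in curr_blocks
               if abs(curr_blocks[pos] - prev_blocks[pos]) > param_threshold}
    seen, regions = set(), []
    for start in targets:
        if start in seen:
            continue
        region, stack = set(), [start]   # flood fill over 8 neighbours
        while stack:
            x, y = stack.pop()
            if (x, y) in seen:
                continue
            seen.add((x, y))
            region.add((x, y))
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    n = (x + dx, y + dy)
                    if n in targets and n not in seen:
                        stack.append(n)
        regions.append(region)
    return any(len(r) > area_threshold for r in regions)

prev = {(x, y): 20 for x in range(4) for y in range(4)}
curr = dict(prev)
curr.update({(0, 0): 200, (1, 1): 200, (2, 2): 200})  # diagonal chain of changes
# movement_event(prev, curr, param_threshold=20, area_threshold=2) is True:
# the three changed blocks form one 8-connected region of area 3.
```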
In an embodiment of the present disclosure, the determining a parameter change area in the customized monitoring area may include:
s701: dividing the image frame to be detected into a plurality of cell blocks with equal sizes;
s703: screening out a plurality of target cell blocks from the plurality of cell blocks, wherein the change value of a target cell block relative to the preset image parameter of the cell block at the same position in the previous image frame is larger than a preset image parameter threshold value;
s705: determining a connected region composed of at least one target unit block;
s707: determining an overlapping area between the connected region and the custom monitoring area, and taking the overlapping area as a parameter change area of the custom monitoring area.
In the embodiment of the present disclosure, reference may be made to the embodiments of S501 and S505 to S507 for the implementations of S701 to S705, which are not described herein again. In S707, it may be determined whether each target unit block in the connected region is located in the custom monitoring area; for a specific implementation, reference may be made to S601 to S607, which is not described herein again.
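S707 reduces to a set intersection when both the connected region and the custom monitoring area are represented as sets of unit-block coordinates (an illustrative representation, not mandated by the disclosure):

```python
# Sketch of S707: keep only the target unit blocks of the connected region
# that also lie inside the custom monitoring area; the intersection is the
# parameter change region of that area.

def overlap_region(connected_region, monitoring_blocks):
    return connected_region & monitoring_blocks

connected = {(0, 0), (0, 1), (1, 1), (5, 5)}   # hypothetical changed blocks
monitored = {(0, 1), (1, 1), (1, 2), (2, 2)}   # blocks inside the custom area
# overlap_region(connected, monitored) == {(0, 1), (1, 1)}
```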
In the embodiment of the present disclosure, after it is determined that a movement event has occurred in the custom monitoring area, a notification message may be sent, where the notification message includes information of the movement event. Specifically, the notification message may be sent to the user's client, and may further include information such as the time at which the event occurred, its location, and the size of the moving target object.
The movement event detection method provided by the embodiments of the disclosure allows a user to set a custom monitoring area, which can be selected based on the user's region of interest; movement events are then detected based on that custom monitoring area. Through the embodiments, on the one hand, movement event detection in scenes that do not need monitoring can be reduced, thereby reducing the interference of movement event notification messages to the user; on the other hand, whether a movement event occurs in the custom monitoring area is determined based on both the change value of the preset image parameter and the area over which the change occurs, so the detection accuracy of movement events can be improved.
In another aspect, an embodiment of the present disclosure further provides a mobile event detection apparatus, and fig. 6 shows a block diagram of the mobile event detection apparatus according to an embodiment of the present disclosure, and as shown in fig. 6, the apparatus 600 includes:
a custom region obtaining module 601, configured to obtain at least one custom monitoring region of an image frame to be detected;
a change region determining module 603, configured to determine a parameter change region in the custom monitoring region, where the parameter change region is a connected region, and a change value of the parameter change region relative to a preset image parameter at the same position in a previous image frame is greater than a preset image parameter threshold;
an event determining module 605, configured to determine that a movement event occurs in the custom monitoring area when it is determined that the area of the parameter change area is greater than a preset area threshold.
Optionally, in an embodiment of the present disclosure, the change region determining module includes:
the unit block dividing submodule is used for dividing the image frame to be detected into a plurality of unit blocks of equal size;
the monitoring unit block determining submodule is used for determining a plurality of monitoring unit blocks located in the custom monitoring area from the plurality of unit blocks;
a target unit block determining subunit, configured to screen a plurality of target unit blocks from the plurality of monitoring unit blocks, where the change value of a target unit block relative to the preset image parameter of the unit block at the same position in the previous image frame is greater than a preset image parameter threshold value;
and the connected region determining subunit is used for determining a connected region composed of at least one target unit block, and taking the connected region as a parameter change region of the custom monitoring region.
Optionally, in an embodiment of the present disclosure, the monitoring unit block determining sub-module includes:
a position determination unit for determining coordinate positions of the plurality of unit blocks, respectively;
the triangle cutting unit is used for cutting the custom monitoring area into a plurality of first triangles and respectively determining the directed area of each first triangle;
the triangle construction unit is used for forming three second triangles, each taking the coordinate position of the unit block as one vertex and two vertices of the first triangle as the other two vertices;
and the monitoring unit block determining unit is used for determining that the unit block is a monitoring unit block in the custom monitoring area under the condition that the sum of the directed areas of the three second triangles is equal to the directed area of the first triangle.
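The directed-area membership test performed by these units can be sketched as follows. It is a sketch under stated assumptions, not the patented implementation: the custom monitoring area is assumed convex so that fan triangulation from its first vertex yields the first triangles, the comparison is assumed to be between magnitudes of the directed areas (for an arbitrary point the signed areas always sum to the triangle's signed area, so only the magnitude comparison is discriminating), and a small floating-point tolerance is added. All names are illustrative.

```python
def signed_area(a, b, c):
    """Directed (signed) area of triangle abc via the cross product."""
    return ((b[0] - a[0]) * (c[1] - a[1])
            - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def in_custom_area(point, polygon, eps=1e-9):
    """Point-in-area test: the unit block's coordinate lies inside a first
    triangle iff the magnitudes of the directed areas of the three second
    triangles sum to the magnitude of the first triangle's directed area."""
    for i in range(1, len(polygon) - 1):
        # First triangle: polygon vertex 0 with an adjacent vertex pair.
        a, b, c = polygon[0], polygon[i], polygon[i + 1]
        s1 = abs(signed_area(a, b, c))
        # Three second triangles, each with `point` as one vertex.
        s2 = (abs(signed_area(point, a, b))
              + abs(signed_area(point, b, c))
              + abs(signed_area(point, c, a)))
        if abs(s2 - s1) < eps:
            return True
    return False
```

For a unit square scaled to side 2, an interior coordinate such as (1, 1) passes the test while (3, 3) fails it.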
Optionally, in an embodiment of the present disclosure, the change region determining module includes:
the unit block dividing submodule is used for dividing the image frame to be detected into a plurality of unit blocks of equal size;
a target unit block screening submodule for screening a plurality of target unit blocks from the plurality of unit blocks, where the change value of a target unit block relative to the preset image parameter of the unit block at the same position in the previous image frame is greater than a preset image parameter threshold value;
a connected region determining submodule for determining a connected region composed of at least one target unit block;
and the overlap area determining submodule is used for determining an overlapping area between the connected region and the custom monitoring area, and taking the overlapping area as the parameter change area of the custom monitoring area.
Optionally, in an embodiment of the present disclosure, the custom monitoring area is configured to be obtained as follows:
displaying an original monitoring area in a user interface, wherein a plurality of controllable points are arranged in the original monitoring area;
receiving a moving operation of the plurality of controllable points;
determining the position information of the plurality of controllable points after the movement;
and connecting the plurality of controllable points into a closed area based on the position information, and taking the closed area as the custom monitoring area.
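A minimal sketch of the last step, connecting the moved controllable points into a closed area, is shown below. It assumes (this is not stated in the disclosure) that the user interface supplies the point positions already ordered along the boundary, and it uses the shoelace formula only as a sanity check that the points actually enclose a region.

```python
def build_custom_area(points):
    """Connect controllable points, in order, into a closed polygon and
    return its vertex list together with its enclosed area; the closing
    edge from the last point back to the first is implicit."""
    if len(points) < 3:
        raise ValueError("a closed area needs at least three controllable points")
    # Shoelace formula over the closed ring.
    area = abs(sum(x0 * y1 - x1 * y0
                   for (x0, y0), (x1, y1)
                   in zip(points, points[1:] + points[:1]))) / 2.0
    if area == 0:
        raise ValueError("controllable points are collinear; no closed area")
    return list(points), area
```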
Optionally, in an embodiment of the present disclosure, the custom monitoring area is configured to be obtained as follows:
presenting a plurality of selectable closed graphics in a user interface;
receiving a selection operation, wherein the selection operation comprises information of the selected selectable closed graph;
displaying an original monitoring area in the user interface, and displaying the selected selectable closed graph in the original monitoring area;
receiving a size adjustment operation on the selected selectable closed graph;
and adjusting the size of the selected selectable closed graph based on the size adjustment operation, and taking the area contained in the adjusted closed graph as the custom monitoring area.
Optionally, in an embodiment of the present disclosure, the custom monitoring area is configured to be obtained as follows:
displaying an original monitoring area in a user interface;
receiving a sliding operation in the user interface;
and under the condition that the graph formed by sliding is determined to be a closed graph, taking the area contained in the closed graph as the custom monitoring area.
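The closed-figure check for the sliding operation can be sketched as follows. Treating the slid figure as closed when the stroke ends within a small tolerance of its starting point is an assumption of this sketch, as are the `close_tol` value and the snap-to-start behavior.

```python
def slide_to_custom_area(path, close_tol=10.0):
    """Return the custom monitoring area formed by a slide gesture, or
    None when the drawn figure is not closed. `path` is the ordered list
    of (x, y) points sampled along the user's stroke."""
    if len(path) < 3:
        return None
    (x0, y0), (x1, y1) = path[0], path[-1]
    # Closed figure: the stroke ends near where it began.
    if ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 > close_tol:
        return None  # open curve: no custom monitoring area is formed
    return path[:-1] + [path[0]]  # snap the end onto the start to close
```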
Optionally, in an embodiment of the present disclosure, the shape of the customized monitoring area includes one of: circular, oval, polygonal, heart-shaped.
Optionally, in an embodiment of the present disclosure, the preset image parameter includes at least one of: brightness, gray scale, saturation, contrast, color temperature, tone scale.
Optionally, in an embodiment of the present disclosure, the apparatus further includes:
and the message notification module is used for sending a notification message, wherein the notification message comprises the information of the mobile event.
In another aspect, an embodiment of the present disclosure further provides an electronic device. Fig. 7 shows a block diagram of the electronic device according to an embodiment of the present disclosure; as shown in fig. 7, it may include an image capture device 701, a storage device 703, the movement event detection device 600, and a communication device 705, wherein
the image capture device 701 is configured to acquire the image frame to be detected;
the storage device 703 is configured to store information of the at least one customized monitoring area;
the communication device 705 is configured to send a notification message, where the notification message includes information of the movement event.
An embodiment of the present disclosure further provides an electronic device, including: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to perform the method of the above embodiments.
The electronic device may be provided as a terminal, server, or other form of device.
Fig. 8 is a block diagram illustrating an electronic device 800 in accordance with an example embodiment. For example, the electronic device 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or other such terminal.
Referring to fig. 8, electronic device 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the electronic device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the electronic device 800. Examples of such data include instructions for any application or method operating on the electronic device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 806 provides power to the various components of the electronic device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the electronic device 800.
The multimedia component 808 includes a screen that provides an output interface between the electronic device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front-facing camera and/or a rear-facing camera. The front camera and/or the rear camera may receive external multimedia data when the electronic device 800 is in an operation mode, such as a shooting mode or a video mode. Each of the front camera and the rear camera may be a fixed optical lens system or have focus and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the electronic device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing status assessments of various aspects of the electronic device 800. For example, the sensor assembly 814 may detect the open/closed state of the electronic device 800 and the relative positioning of components, such as the display and keypad of the electronic device 800. The sensor assembly 814 may also detect a change in the position of the electronic device 800 or of a component of the electronic device 800, the presence or absence of user contact with the electronic device 800, the orientation or acceleration/deceleration of the electronic device 800, and a change in the temperature of the electronic device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium, such as the memory 804, is also provided that includes computer program instructions executable by the processor 820 of the electronic device 800 to perform the above-described methods.
Fig. 9 is a block diagram illustrating an electronic device 1900 in accordance with an example embodiment. For example, the electronic device 1900 may be provided as a server. Referring to fig. 9, electronic device 1900 includes a processing component 1922 further including one or more processors and memory resources, represented by memory 1932, for storing instructions, e.g., applications, executable by processing component 1922. The application programs stored in memory 1932 may include one or more modules that each correspond to a set of instructions. Further, the processing component 1922 is configured to execute instructions to perform the above-described method.
The electronic device 1900 may also include a power component 1926 configured to perform power management of the electronic device 1900, a wired or wireless network interface 1950 configured to connect the electronic device 1900 to a network, and an input/output (I/O) interface 1958. The electronic device 1900 may operate based on an operating system stored in memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
In an exemplary embodiment, a non-transitory computer readable storage medium, such as the memory 1932, is also provided that includes computer program instructions executable by the processing component 1922 of the electronic device 1900 to perform the above-described methods.
The present disclosure may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch cards or raised structures in a groove having instructions stored thereon, and any suitable combination of the foregoing. A computer-readable storage medium, as used herein, is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA) can execute the computer-readable program instructions by utilizing state information of the instructions to personalize the electronic circuitry, thereby implementing aspects of the present disclosure.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (23)

1. A method for mobile event detection, comprising:
acquiring at least one custom monitoring area of an image frame to be detected;
determining a parameter change area in the custom monitoring area, wherein the parameter change area is a connected region, and the change value of the parameter change area relative to a preset image parameter at the same position in the previous image frame is greater than a preset image parameter threshold value;
and under the condition that the area of the parameter change area is larger than a preset area threshold value, determining that a movement event occurs in the custom monitoring area.
2. The method of claim 1, wherein the determining a parameter change area within the custom monitoring area comprises:
dividing the image frame to be detected into a plurality of unit blocks of equal size;
determining a plurality of monitoring unit blocks located in the custom monitoring area from the plurality of unit blocks;
screening a plurality of target unit blocks from the plurality of monitoring unit blocks, wherein the change value of a target unit block relative to the preset image parameter of the unit block at the same position in the previous image frame is larger than a preset image parameter threshold value;
and determining a connected region composed of at least one target unit block, and taking the connected region as the parameter change area of the custom monitoring area.
3. The method of claim 2, wherein the determining a plurality of blocks of monitoring units located within the custom monitoring area from among the plurality of blocks of units comprises:
determining coordinate positions of a plurality of the unit blocks, respectively;
cutting the custom monitoring area into a plurality of first triangles, and respectively determining the directed area of each first triangle;
forming three second triangles, each taking the coordinate position of the unit block as one vertex and two vertices of the first triangle as the other two vertices;
and under the condition that the sum of the directed areas of the three second triangles is equal to the directed area of the first triangle, determining that the unit block is a monitoring unit block in the custom monitoring area.
4. The method of claim 1, wherein the determining a parameter change area within the custom monitoring area comprises:
dividing the image frame to be detected into a plurality of unit blocks of equal size;
screening a plurality of target unit blocks from the plurality of unit blocks, wherein the change value of a target unit block relative to the preset image parameter of the unit block at the same position in the previous image frame is larger than a preset image parameter threshold value;
determining a connected region composed of at least one target unit block;
and determining an overlapping area between the connected region and the custom monitoring area, and taking the overlapping area as the parameter change area of the custom monitoring area.
5. The mobile event detection method of claim 1, wherein the custom monitoring area is configured to be obtained as follows:
displaying an original monitoring area in a user interface, wherein a plurality of controllable points are arranged in the original monitoring area;
receiving a moving operation of the plurality of controllable points;
determining the position information of the plurality of controllable points after the movement;
and connecting the plurality of controllable points into a closed area based on the position information, and taking the closed area as the custom monitoring area.
6. The mobile event detection method of claim 1, wherein the custom monitoring area is configured to be obtained as follows:
presenting a plurality of selectable closed graphics in a user interface;
receiving a selection operation, wherein the selection operation comprises information of the selected selectable closed graph;
displaying an original monitoring area in the user interface, and displaying the selected selectable closed graph in the original monitoring area;
receiving a size adjustment operation on the selected selectable closed graph;
and adjusting the size of the selected selectable closed graph based on the size adjustment operation, and taking the area contained in the adjusted closed graph as the custom monitoring area.
7. The mobile event detection method of claim 1, wherein the custom monitoring area is configured to be obtained as follows:
displaying an original monitoring area in a user interface;
receiving a sliding operation in the user interface;
and under the condition that the graph formed by sliding is determined to be a closed graph, taking the area contained in the closed graph as the custom monitoring area.
8. The mobile event detection method of claim 1, wherein the shape of the custom monitoring area comprises one of: circular, oval, polygonal, heart-shaped.
9. The movement event detection method according to claim 1, wherein the preset image parameters include at least one of: brightness, gray scale, saturation, contrast, color temperature, tone scale.
10. The mobile event detection method of claim 1, wherein after said determining that a mobile event has occurred within said custom monitoring area, said method further comprises:
and sending a notification message, wherein the notification message comprises the information of the mobile event.
11. A mobile event detection device, comprising:
the user-defined area acquisition module is used for acquiring at least one user-defined monitoring area of the image frame to be detected;
a change region determination module, configured to determine a parameter change region in the custom monitoring region, where the parameter change region is a connected region, and a change value of the parameter change region with respect to a preset image parameter at the same position in a previous image frame is greater than a preset image parameter threshold;
and the event determining module is used for determining that a movement event occurs in the self-defined monitoring area under the condition that the area of the parameter change area is larger than a preset area threshold value.
12. The movement event detection device of claim 11, wherein the change region determination module comprises:
the unit block dividing submodule is used for dividing the image frame to be detected into a plurality of unit blocks of equal size;
the monitoring unit block determining submodule is used for determining a plurality of monitoring unit blocks located in the custom monitoring area from the plurality of unit blocks;
a target unit block determining subunit, configured to screen a plurality of target unit blocks from the plurality of monitoring unit blocks, where the change value of a target unit block relative to the preset image parameter of the unit block at the same position in the previous image frame is greater than a preset image parameter threshold value;
and the connected region determining subunit is used for determining a connected region composed of at least one target unit block, and taking the connected region as a parameter change region of the custom monitoring region.
13. The movement event detection device of claim 12, wherein the monitoring unit block determination submodule comprises:
a position determination unit for determining coordinate positions of the plurality of unit blocks, respectively;
the triangle cutting unit is used for cutting the custom monitoring area into a plurality of first triangles and respectively determining the directed area of each first triangle;
the triangle construction unit is used for forming three second triangles, each taking the coordinate position of the unit block as one vertex and two vertices of the first triangle as the other two vertices;
and the monitoring unit block determining unit is used for determining that the unit block is a monitoring unit block in the custom monitoring area under the condition that the sum of the directed areas of the three second triangles is equal to the directed area of the first triangle.
14. The movement event detection device of claim 11, wherein the change region determination module comprises:
a cell block division submodule configured to divide the image frame to be detected into a plurality of cell blocks of equal size;
a target cell block screening submodule configured to screen out a plurality of target cell blocks from the plurality of cell blocks, wherein a change value of a preset image parameter of each target cell block, relative to the cell block at the same position in a previous image frame, is greater than a preset image parameter threshold;
a connected region determination submodule configured to determine a connected region composed of at least one target cell block;
and an overlap region determination submodule configured to determine an overlap region between the connected region and the custom monitoring area and to take the overlap region as the parameter change region of the custom monitoring area.
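For illustration only: when both the connected region and the custom monitoring area are represented as sets of block indices, claim 14's overlap step and claim 11's decision reduce to set intersection. A minimal sketch with hypothetical names; the area threshold here is measured in cell blocks.

```python
def parameter_change_region(connected_region, monitoring_blocks):
    # Claim 14's overlap step: restrict a connected region of changed blocks
    # to the custom monitoring area and keep the intersection.
    return connected_region & monitoring_blocks

def movement_event(connected_regions, monitoring_blocks, area_threshold):
    # Claim 11's decision: report a movement event when any overlap region
    # is larger than the preset area threshold.
    return any(len(region & monitoring_blocks) > area_threshold
               for region in connected_regions)
```

Compared with claim 12, this variant screens all blocks first and clips to the monitoring area afterwards; the result is the same, but the membership test runs once per frame layout rather than once per block.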
15. The movement event detection device of claim 11, wherein the custom monitoring area is obtained by:
displaying an original monitoring area in a user interface, wherein a plurality of controllable points are arranged in the original monitoring area;
receiving a moving operation on the plurality of controllable points;
determining position information of the plurality of controllable points after the movement;
and connecting the plurality of controllable points into a closed area based on the position information, and taking the closed area as the custom monitoring area.
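For illustration only: the closing step of claim 15 joins the moved controllable points, in order, into a closed polygon; the shoelace formula then gives the enclosed area, which can serve as a reference for the area threshold of claim 11. The sketch assumes the points are supplied in their on-screen ordering; names are hypothetical.

```python
def close_polygon(points):
    # Join the controllable points, in order, into a closed ring:
    # the last vertex connects back to the first.
    ring = list(points)
    if ring and ring[0] != ring[-1]:
        ring.append(ring[0])
    return ring

def polygon_area(points):
    # Shoelace formula: area enclosed by the polygon through `points`
    # (wraparound indexing, so an explicit closing vertex is optional).
    n = len(points)
    s = sum(points[i][0] * points[(i + 1) % n][1] -
            points[(i + 1) % n][0] * points[i][1]
            for i in range(n))
    return abs(s) / 2.0
```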
16. The movement event detection device of claim 11, wherein the custom monitoring area is obtained by:
presenting a plurality of selectable closed graphics in a user interface;
receiving a selection operation, wherein the selection operation includes information on the selected closed graphic;
displaying an original monitoring area in the user interface, and displaying the selected closed graphic in the original monitoring area;
receiving a size adjustment operation on the selected closed graphic;
and adjusting the size of the selected closed graphic based on the size adjustment operation, and taking the area enclosed by the adjusted closed graphic as the custom monitoring area.
17. The movement event detection device of claim 11, wherein the custom monitoring area is obtained by:
displaying an original monitoring area in a user interface;
receiving a sliding operation in the user interface;
and, when the figure formed by the sliding operation is determined to be a closed figure, taking the area enclosed by the closed figure as the custom monitoring area.
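For illustration only: the claims do not specify how a slide gesture is judged to be closed. One plausible reading is an endpoint-proximity check: the stroke counts as a closed figure when its end point returns within a tolerance of its start point. The function name and tolerance are hypothetical.

```python
import math

def is_closed_stroke(stroke, tolerance=10.0):
    # stroke is a list of (x, y) touch samples from the sliding operation.
    # Treat it as a closed figure when the end point lies within
    # `tolerance` pixels of the start point.
    if len(stroke) < 3:
        return False  # too few samples to enclose any area
    (x0, y0), (x1, y1) = stroke[0], stroke[-1]
    return math.hypot(x1 - x0, y1 - y0) <= tolerance
```

A production implementation would also check that the stroke encloses a non-trivial area, e.g. via the shoelace formula, before accepting it as the monitoring region.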
18. The movement event detection device of claim 11, wherein the shape of the custom monitoring area comprises one of: a circle, an ellipse, a polygon, a heart shape.
19. The movement event detection device of claim 11, wherein the preset image parameter includes at least one of: brightness, gray scale, saturation, contrast, color temperature, and tone scale.
20. The movement event detection device of claim 11, wherein the device further comprises:
a message notification module configured to send a notification message, wherein the notification message includes information on the movement event.
21. An electronic device, comprising a camera device, a storage device, the movement event detection device of any one of claims 11-20, and a communication device, wherein:
the camera device is configured to acquire the image frame to be detected;
the storage device is configured to store information on the at least one custom monitoring area;
and the communication device is configured to send a notification message, wherein the notification message includes information on the movement event.
22. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the movement event detection method of any one of claims 1-10.
23. A non-transitory computer-readable storage medium having stored thereon instructions that, when executed by a processor, cause the processor to perform the movement event detection method of any one of claims 1-10.
CN201910317264.2A 2019-04-19 2019-04-19 Mobile event detection method and device Pending CN111832357A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910317264.2A CN111832357A (en) 2019-04-19 2019-04-19 Mobile event detection method and device


Publications (1)

Publication Number Publication Date
CN111832357A true CN111832357A (en) 2020-10-27

Family

ID=72915269

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910317264.2A Pending CN111832357A (en) 2019-04-19 2019-04-19 Mobile event detection method and device

Country Status (1)

Country Link
CN (1) CN111832357A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103415876A (en) * 2010-11-17 2013-11-27 欧姆龙科学技术公司 A method and apparatus for monitoring zones
CN103634571A (en) * 2013-11-29 2014-03-12 国家电网公司 Irregular polygon monitoring early warning region delimiting method based on mobile client end
CN103679745A (en) * 2012-09-17 2014-03-26 浙江大华技术股份有限公司 Moving target detection method and device
CN105847746A (en) * 2016-04-01 2016-08-10 北京小米移动软件有限公司 Monitoring method and device
CN106131499A (en) * 2016-07-28 2016-11-16 广州紫川电子科技有限公司 Same monitoring position based on thermal infrared imager multizone monitoring method, Apparatus and system
CN106534796A (en) * 2016-11-29 2017-03-22 北京小米移动软件有限公司 Method and device for monitoring safety of infant and child
CN106851209A (en) * 2017-02-28 2017-06-13 北京小米移动软件有限公司 Monitoring method, device and electronic equipment
WO2017107647A1 (en) * 2015-12-25 2017-06-29 北京奇虎科技有限公司 Camera-based monitoring method, apparatus, and system
CN109068099A (en) * 2018-09-05 2018-12-21 济南大学 Virtual electronic fence monitoring method and system based on video monitoring
US10186124B1 (en) * 2017-10-26 2019-01-22 Scott Charles Mullins Behavioral intrusion detection system


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
宁煜西; 周铭; 李广强; 王宁: "Research on key frame extraction methods for civil aviation flight tracking video" [民航航班跟踪视频关键帧提取方法研究], Journal of the Air Force Early Warning Academy, no. 03, 15 June 2018 (2018-06-15) *
星火燎猿: "Determining whether a point lies inside a polygon" [判断点在多边形内部], pages 4, Retrieved from the Internet <URL:https://blog.csdn.net/fwj380891124/article/details/7737143> *
曾韬; 余永权: "Dynamic detection algorithm and implementation for change information in video images" [视频图像变化信息的动态检测算法与实现], Microcomputer Information, no. 09, 30 March 2007 (2007-03-30) *
王雪靖; 戴亚丽: "Research and implementation of a Cell_ID-based area positioning method" [基于Cell_ID的区域定位方法的研究与实现], Software, no. 11, 15 November 2016 (2016-11-15) *
蓝照华; 傅文利; 赵进创; 陈涛: "A motion detection algorithm based on accumulated absolute differences of edge area values" [边缘面积值绝对差数累积运动检测算法], Microcomputer Information, no. 33, 30 November 2006 (2006-11-30) *

Similar Documents

Publication Publication Date Title
US9667774B2 (en) Methods and devices for sending virtual information card
EP3163498A2 (en) Alarming method and device
EP3188066A1 (en) A method and an apparatus for managing an application
US11481975B2 (en) Image processing method and apparatus, electronic device, and computer-readable storage medium
EP2927787B1 (en) Method and device for displaying picture
CN109948494B (en) Image processing method and device, electronic equipment and storage medium
CN110633755A (en) Network training method, image processing method and device and electronic equipment
EP3125163A1 (en) Method and device for flight notification, and method and device for flight setting
CN107657590B (en) Picture processing method and device and storage medium
EP3147802B1 (en) Method and apparatus for processing information
US11216904B2 (en) Image processing method and apparatus, electronic device, and storage medium
CN112465843A (en) Image segmentation method and device, electronic equipment and storage medium
EP3016319A1 (en) Method and apparatus for dynamically displaying device list
CN111104920A (en) Video processing method and device, electronic equipment and storage medium
CN110989905A (en) Information processing method and device, electronic equipment and storage medium
CN110807393A (en) Early warning method and device based on video analysis, electronic equipment and storage medium
CN111523346A (en) Image recognition method and device, electronic equipment and storage medium
CN110807769B (en) Image display control method and device
CN109285126B (en) Image processing method and device, electronic equipment and storage medium
CN111954058B (en) Image processing method, device, electronic equipment and storage medium
CN112837372A (en) Data generation method and device, electronic equipment and storage medium
CN112508020A (en) Labeling method and device, electronic equipment and storage medium
US11410268B2 (en) Image processing methods and apparatuses, electronic devices, and storage media
CN109255839B (en) Scene adjustment method and device
CN107256149B (en) User interface updating method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination