CN112702571A - Monitoring method and device - Google Patents

Monitoring method and device

Info

Publication number
CN112702571A
CN112702571A (application CN202011515035.0A)
Authority
CN
China
Prior art keywords
monitoring
camera
radar
moving object
calibration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011515035.0A
Other languages
Chinese (zh)
Other versions
CN112702571B (en)
Inventor
郑文
张翔
林恒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujian Huichuan Internet Of Things Technology Science And Technology Co ltd
Original Assignee
Fujian Huichuan Internet Of Things Technology Science And Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujian Huichuan Internet Of Things Technology Science And Technology Co ltd filed Critical Fujian Huichuan Internet Of Things Technology Science And Technology Co ltd
Priority to CN202011515035.0A priority Critical patent/CN112702571B/en
Publication of CN112702571A publication Critical patent/CN112702571A/en
Application granted granted Critical
Publication of CN112702571B publication Critical patent/CN112702571B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/695Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

An embodiment of the application provides a monitoring method in the field of monitoring technology, for use in a monitoring device comprising a camera and a monitoring radar whose monitoring ranges differ. The method comprises the following steps: judging, according to the monitoring point cloud data acquired by the monitoring radar, whether a moving object exists within the monitoring range of the monitoring radar; when a moving object exists within the monitoring range of the monitoring radar, judging whether the volume of the moving object is larger than a preset volume threshold; when the volume of the moving object is larger than the volume threshold, acquiring monitoring adjustment parameters; and adjusting the monitoring device according to the monitoring adjustment parameters so that the camera's monitoring video includes the moving object. Implementing this embodiment can therefore eliminate field-of-view blind areas as far as possible, improving the monitoring capability and, in turn, the monitoring effect.

Description

Monitoring method and device
Technical Field
The present application relates to the field of monitoring technologies, and in particular, to a monitoring method and apparatus.
Background
With the continuous advance of security technology, more and more monitoring devices appear in people's lives, recording everything that can be recorded. In practice, however, a monitoring camera can only watch one direction at a time; even if its field of view is enlarged, blind areas remain, which effectively reduces the monitoring capability and degrades the monitoring effect.
Disclosure of Invention
The application aims to provide a monitoring method and a monitoring device that can eliminate field-of-view blind areas as far as possible, thereby improving the monitoring capability and, in turn, the monitoring effect.
A first aspect of an embodiment of the present application provides a monitoring method, which is used in a monitoring device including a camera and a monitoring radar, where monitoring ranges of the camera and the monitoring radar are different, and the method includes:
judging whether a moving object exists in the monitoring range of the monitoring radar or not according to the monitoring point cloud data acquired by the monitoring radar;
when the moving object exists in the monitoring range of the monitoring radar, judging whether the volume of the moving object is larger than a preset volume threshold value or not;
when the volume of the moving object is larger than the volume threshold value, acquiring monitoring adjustment parameters;
and adjusting the monitoring device according to the monitoring adjustment parameters so that the monitoring video of the camera comprises the moving object.
In the implementation process, the monitoring method can be used in a monitoring device comprising a camera and a monitoring radar, where the camera is arranged above the monitoring radar and the monitoring ranges of the two differ. The method first judges, from the monitoring point cloud data acquired by the monitoring radar, whether a moving object exists within the radar's monitoring range; if so, it judges whether the volume of the moving object is larger than a preset volume threshold; when the volume exceeds the threshold, it acquires monitoring adjustment parameters; finally, it adjusts the monitoring device according to those parameters so that the camera's monitoring video includes the moving object. By implementing this embodiment, information about a moving object in the camera's blind area can be obtained through the monitoring radar, and when the object's volume exceeds the volume threshold the camera is prompted to aim at the moving object for video capture. This addresses the field-of-view blind-area problem of existing approaches, improving the monitoring capability and, on that basis, the monitoring effect.
Further, the step of obtaining the monitoring adjustment parameter includes:
acquiring a monitoring calibration parameter;
acquiring object position data corresponding to the moving object from the monitoring point cloud data;
and calculating according to the monitoring calibration parameters and the object position data to obtain monitoring adjustment parameters.
In the implementation process, when obtaining the monitoring adjustment parameters, the method first obtains the monitoring calibration parameters; it then obtains the object position data corresponding to the moving object from the monitoring point cloud data; finally, it computes the monitoring adjustment parameters from the monitoring calibration parameters and the object position data. In this way, a coordinate-system conversion can be applied to the object position data using the pre-calibrated parameters to obtain the monitoring adjustment parameters, so that the monitoring device can control and adjust itself automatically and the camera can capture the moving object.
Further, the camera is arranged above the monitoring radar, and the step of obtaining the monitoring calibration parameter includes:
adjusting the monitoring device to align the camera with a calibration reference object;
acquiring calibration point cloud data of the monitoring radar, and acquiring space coordinate information of the calibration reference object according to the calibration point cloud data;
according to the space coordinate information, calculating a horizontal included angle between the optical axis of the camera and the transverse axis of the monitoring radar coordinate system, and a vertical height between the optical axis of the camera and the horizontal plane of the monitoring radar coordinate system;
and determining the horizontal included angle and the vertical height as monitoring calibration parameters.
In the implementation process, the monitoring device described here has the camera arranged above the monitoring radar, so the camera and the radar share the same central axis. On this basis, to acquire the monitoring calibration parameters, the method first adjusts the monitoring device so that the camera is aligned with the calibration reference object, placing the reference object on the camera's optical axis. It then acquires the radar's calibration point cloud data and derives the spatial coordinate information of the calibration reference object from it, i.e. the reference object's position in the monitoring radar coordinate system. Next, from those spatial coordinates it computes the horizontal included angle between the camera's optical axis and the transverse axis of the radar coordinate system, and the vertical height between the optical axis and the radar's horizontal plane. Finally, it records the horizontal angle and the vertical height as the monitoring calibration parameters. This determines the positional relationship between the camera and the radar, so that when the radar later detects an object's position it can immediately compute how the camera's orientation should be adjusted, improving monitoring efficiency.
Further, the step of calculating according to the monitoring calibration parameter and the object position data to obtain a monitoring adjustment parameter includes:
performing superposition calculation on the monitoring calibration parameters and the object position data to obtain the monitoring adjustment parameters, which comprise a horizontal adjustment angle and a pitch adjustment angle.
In the implementation process, when computing the monitoring adjustment parameters from the monitoring calibration parameters and the object position data, the method superposes the calibration parameters on the object position data to obtain the adjustment parameters, comprising a horizontal adjustment angle and a pitch adjustment angle. In this way, the object's position in the monitoring radar coordinate system is converted into a position relative to the camera's optical axis, so that the monitoring adjustment parameters can be determined and the monitoring device controlled accordingly.
Further, the step of calculating according to the monitoring calibration parameter and the object position data to obtain a monitoring adjustment parameter includes:
calculating according to the object position data and the volume of the moving object to obtain the focusing magnification of the camera;
calculating according to the monitoring calibration parameters and the object position data to obtain a monitoring adjustment angle;
and determining the focusing magnification of the camera and the monitoring adjustment angle as monitoring adjustment parameters.
In the implementation process, the method can automatically determine the camera's zoom factor from the object's position and volume, so that the camera zooms automatically, the object appears at a moderate size in the video frame, and the moving object is displayed more clearly.
A second aspect of the embodiments of the present application provides a monitoring device, where the monitoring device includes a camera and a monitoring radar that have different monitoring ranges, and the monitoring device includes:
the judging unit is used for judging whether a moving object exists in the monitoring range of the monitoring radar or not according to the monitoring point cloud data acquired by the monitoring radar;
the judging unit is further used for judging whether the volume of the moving object is larger than a preset volume threshold when a moving object exists in the monitoring range of the monitoring radar;
the acquisition unit is used for acquiring monitoring adjustment parameters when the volume of the moving object is larger than the volume threshold;
and the adjusting unit is used for adjusting the monitoring device according to the monitoring adjusting parameters so that the monitoring video of the camera comprises the moving object.
In the implementation process, the monitoring device judges, via the judging unit and according to the monitoring point cloud data acquired by the monitoring radar, whether a moving object exists within the radar's monitoring range; when one exists, the judging unit judges whether the object's volume is larger than a preset volume threshold; when the volume exceeds the threshold, the acquisition unit acquires the monitoring adjustment parameters; and the adjusting unit adjusts the monitoring device according to those parameters so that the camera's monitoring video includes the moving object. By implementing this embodiment, information about a moving object in the camera's blind area can be obtained through the monitoring radar, and when the object's volume exceeds the preset threshold the camera is prompted to aim at it for video capture, addressing the field-of-view blind-area problem of existing approaches and thereby improving the monitoring capability and the monitoring effect.
Further, the acquisition unit includes:
the acquisition subunit is used for acquiring monitoring calibration parameters;
the acquisition subunit is further configured to acquire object position data corresponding to the moving object from the monitoring point cloud data;
and the calculating subunit is used for calculating according to the monitoring calibration parameter and the object position data to obtain a monitoring adjustment parameter.
In the implementation process, the acquisition unit may obtain the monitoring calibration parameters through the acquisition subunit; then obtain, through the same subunit, the object position data corresponding to the moving object from the monitoring point cloud data; and finally compute the monitoring adjustment parameters from the calibration parameters and the object position data through the calculating subunit. In this way, a coordinate-system conversion can be applied to the object position data using the pre-calibrated parameters, so that the monitoring device can adjust itself automatically according to the resulting adjustment parameters and the camera can capture the moving object.
Further, the camera is arranged above the monitoring radar, and the acquisition subunit includes:
the adjusting module is used for adjusting the monitoring device so that the camera is aligned with the calibration reference object;
the acquisition module is used for acquiring calibration point cloud data of the monitoring radar and acquiring space coordinate information of the calibration reference object according to the calibration point cloud data;
the calculation module is used for calculating a horizontal included angle between the optical axis of the camera and the transverse axis of the monitoring radar coordinate and a vertical height between the optical axis of the camera and the transverse axis of the monitoring radar coordinate according to the space coordinate information;
and the determining module is used for determining the horizontal included angle and the vertical height as monitoring calibration parameters.
In the implementation process, the acquisition subunit can adjust the monitoring device through the adjusting module so that the camera is aligned with the calibration reference object; acquire the radar's calibration point cloud data through the acquisition module and derive the spatial coordinate information of the calibration reference object from it; compute, through the calculation module and from that spatial coordinate information, the horizontal included angle between the camera's optical axis and the transverse axis of the radar coordinate system and the vertical height between the optical axis and the radar's horizontal plane; and record the horizontal angle and the vertical height as the monitoring calibration parameters through the determining module. This determines the positional relationship between the camera and the radar, so that when the radar later detects an object's position it can immediately compute how the camera's orientation should be adjusted, improving monitoring efficiency.
Further, the calculation subunit is specifically configured to perform superposition calculation on the monitoring calibration parameters and the object position data to obtain the monitoring adjustment parameters, which comprise a horizontal adjustment angle and a pitch adjustment angle.
In the implementation process, when computing the monitoring adjustment parameters from the monitoring calibration parameters and the object position data, the calculating subunit superposes the calibration parameters on the object position data to obtain the adjustment parameters, comprising a horizontal adjustment angle and a pitch adjustment angle. In this way, the object's position in the monitoring radar coordinate system is converted into a position relative to the camera's optical axis, so that the monitoring adjustment parameters can be determined and the monitoring device controlled accordingly.
A third aspect of embodiments of the present application provides an electronic device, including a memory and a processor, where the memory is used to store a computer program, and the processor runs the computer program to enable the electronic device to execute the monitoring method described in any one of the first aspect of embodiments of the present application.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium, which stores computer program instructions, and when the computer program instructions are read and executed by a processor, the monitoring method according to any one of the first aspect of the embodiments of the present application is executed.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a schematic flowchart of a monitoring method according to an embodiment of the present disclosure;
fig. 2 is a schematic flow chart of another monitoring method provided in the embodiment of the present application;
fig. 3 is a schematic structural diagram of a monitoring device according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of another monitoring device provided in an embodiment of the present application;
fig. 5 is an exemplary schematic diagram of a monitoring calibration parameter obtaining process according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
Example 1
Referring to fig. 1, fig. 1 is a schematic flow chart of a monitoring method according to an embodiment of the present application. The method can be used in any monitoring scenario. The monitoring method is used for a monitoring device comprising a camera and a monitoring radar, wherein the monitoring ranges of the camera and the monitoring radar are different, and the method comprises the following steps:
s101, judging whether a moving object exists in a monitoring range of a monitoring radar or not according to monitoring point cloud data acquired by the monitoring radar, and if so, executing S102; if not, the flow is ended.
In this embodiment, the monitoring radar may be a laser radar or a millimeter wave radar.
In this embodiment, the monitoring radar may be, for example, a Bosch MRR4 radar or a Velodyne Ultra Puck (VLP-32C), which is not limited in this embodiment.
In this embodiment, the monitoring point cloud data is point cloud data within a monitoring range of the monitoring radar.
In this embodiment, when a moving object exists in the monitoring range of the monitoring radar, the monitoring point cloud data changes.
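The change-detection idea above can be sketched as a frame-to-frame comparison of radar point clouds. The patent does not specify the algorithm, so the voxelization approach, the cell size, and the change-count threshold below are all illustrative assumptions:

```python
import numpy as np

def occupancy(points, cell=0.5):
    """Quantize radar points (N x 3, metres) onto a voxel grid so two
    frames can be compared cell by cell."""
    return {tuple(v) for v in np.floor(points / cell).astype(int)}

def moving_object_present(prev_frame, cur_frame, min_changed_cells=2):
    """Report a moving object when enough voxels differ between
    consecutive monitoring point clouds."""
    changed = occupancy(prev_frame) ^ occupancy(cur_frame)
    return len(changed) >= min_changed_cells

# Two consecutive frames: a small cluster has shifted, one point is static.
prev = np.array([[10.0, 2.0, 0.0], [10.1, 2.1, 0.0], [15.0, 5.0, 1.0]])
cur = np.array([[12.0, 3.0, 0.0], [12.1, 3.1, 0.0], [15.0, 5.0, 1.0]])
```

A static scene produces an empty symmetric difference, so only genuine motion triggers the subsequent volume check.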
In this embodiment, the moving object monitored by the monitoring radar may not exist in the current monitoring range of the camera.
S102, judging whether the volume of the moving object is larger than a preset volume threshold value, if so, executing the steps S103-S104; if not, the flow is ended.
In this embodiment, the volume threshold may be used to classify the approximate type of the moving object, for example to distinguish a moth from a vehicle.
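As a sketch of the volume check in S102: the patent does not say how the object's volume is estimated from the point cloud, so the axis-aligned bounding box below is one plausible reading, and the threshold value is an assumed placeholder.

```python
import numpy as np

def estimate_volume(cluster_points):
    """Approximate a detected cluster's volume by its axis-aligned
    bounding box in the radar coordinate system (metres)."""
    mins = cluster_points.min(axis=0)
    maxs = cluster_points.max(axis=0)
    dx, dy, dz = maxs - mins
    return dx * dy * dz

VOLUME_THRESHOLD_M3 = 0.5  # assumed placeholder value

# Radar returns clustered on one moving object.
cluster = np.array([
    [10.0, 2.0, 0.0],
    [10.4, 2.3, 0.0],
    [10.2, 2.1, 1.5],
    [11.8, 2.9, 1.4],
])
volume = estimate_volume(cluster)               # ~2.43 m^3 here
is_candidate = volume > VOLUME_THRESHOLD_M3     # large enough to track
```

A person- or vehicle-sized box clears such a threshold while an insect's does not, which matches the moth-versus-vehicle example above.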
S103, acquiring monitoring and adjusting parameters.
In this embodiment, the monitoring adjustment parameter is a parameter for adjusting the monitoring device.
In this embodiment, the monitoring adjustment parameters specifically refer to the adjustment parameters used to align the camera's optical axis with the moving object.
And S104, adjusting the monitoring device according to the monitoring adjustment parameters so that the monitoring video of the camera comprises a moving object.
In this embodiment, the camera may acquire the monitoring video.
In this embodiment, the monitoring device consists of a camera mounted on a pan-tilt head and a monitoring radar (a laser radar or a millimeter-wave radar). The camera acquires the monitoring video and transmits it over a network, while the radar detects moving objects within the covered area.
In this embodiment, the execution subject of the method may be a computing device such as a computer or a server, which is not limited in this embodiment.
In this embodiment, the execution subject of the method may also be a smart device such as a smartphone or a tablet, which is not limited in this embodiment.
It can be seen that the monitoring method described in fig. 1 can be used in a monitoring device comprising a camera and a monitoring radar, where the camera is arranged above the radar and their monitoring ranges differ. The method first judges, from the monitoring point cloud data acquired by the radar, whether a moving object exists within the radar's monitoring range; if so, it judges whether the object's volume is larger than a preset volume threshold; when the volume exceeds the threshold, it acquires monitoring adjustment parameters; finally, it adjusts the monitoring device according to those parameters so that the camera's monitoring video includes the moving object. By implementing this embodiment, information about a moving object in the camera's blind area can be obtained through the radar, and when the object's volume exceeds the threshold the camera is prompted to aim at it for video capture, solving the field-of-view blind-area problem of existing video monitoring methods and thereby improving the monitoring capability and the monitoring effect.
Example 2
Referring to fig. 2, fig. 2 is a schematic flow chart of another monitoring method according to an embodiment of the present application. The method depicted in fig. 2 is an improvement on the method depicted in fig. 1. The monitoring method comprises the following steps:
s201, judging whether a moving object exists in the monitoring range of the monitoring radar or not according to the monitoring point cloud data acquired by the monitoring radar, if so, executing the step S202; if not, the flow is ended.
In this embodiment, the monitoring radar is mounted on the pan-tilt head of the monitoring camera and rotates with it. Depending on the installation position, the radar can cover different areas; mounting the radar below the camera lets it cover a larger area, so the camera is arranged above the monitoring radar.
Optionally, when the effective detection range of the monitoring radar is large, an area may be defined within the effective detection range of the monitoring radar as the monitoring range of the monitoring radar.
S202, judging whether the volume of the moving object is larger than a preset volume threshold value, if so, executing steps S203-S209; if not, the flow is ended.
As an optional implementation manner, after step S202, the method may further include:
and acquiring monitoring calibration parameters.
As an alternative embodiment, the step of acquiring the monitoring calibration parameter may include steps S203 to S206.
S203, adjusting the monitoring device so that the camera is aligned with the calibration reference object.
In this embodiment, the calibration reference object may be set by a user or may be an object in the environment.
And S204, acquiring calibration point cloud data of the monitoring radar, and acquiring space coordinate information of a calibration reference object according to the calibration point cloud data.
In this embodiment, the spatial coordinate information may be coordinate information corresponding to a monitoring radar coordinate system.
And S205, calculating, according to the space coordinate information, a horizontal included angle between the optical axis of the camera and the transverse axis of the monitoring radar coordinate system, and a vertical height between the optical axis of the camera and the horizontal plane of the monitoring radar coordinate system.
And S206, determining the horizontal included angle and the vertical height as monitoring calibration parameters.
In this embodiment, the method calibrates the relative positional relationship between the monitoring radar coordinate system and the camera's optical axis, and records it as the monitoring calibration parameters. When the radar is arranged below the camera and the radar's OXY plane is parallel to the camera's optical axis, the relative position is represented by the horizontal included angle between the optical axis and the radar's OX axis together with the vertical height between the optical axis and the OXY plane.
Referring to fig. 5, fig. 5 is an exemplary schematic diagram of the process for acquiring the monitoring calibration parameters. The user may place a target ball at any position (the target ball serves as the calibration reference object; an object already present in the scene may be used instead, sparing the user a setup step). The camera is adjusted to aim at the target ball, and the radar's point cloud data at that moment is acquired. Analyzing the point cloud shows the target ball at a point P whose projection onto the XY plane of the radar coordinate system is a point Q. The angle between OQ and the X axis is recorded as the horizontal included angle between the camera's optical axis and the radar's transverse axis, and the length of segment PQ is the vertical height between the optical axis and the XY plane.
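The fig. 5 construction can be written down directly: given the target ball's radar-frame coordinates P = (x, y, z), the horizontal included angle is the angle of OQ against the X axis and the vertical height is |PQ| = z. A minimal sketch (function names are illustrative):

```python
import math

def calibration_params(p):
    """Derive the monitoring calibration parameters from the target
    ball's radar-frame coordinates P = (x, y, z): the horizontal
    included angle between OQ and the X axis (Q being P's projection
    onto the XY plane), and the vertical height PQ between the
    camera's optical axis and the XY plane."""
    x, y, z = p
    horizontal_angle = math.degrees(math.atan2(y, x))  # angle of OQ vs X axis
    vertical_height = z                                # length of segment PQ
    return horizontal_angle, vertical_height

# Target ball observed at P = (3, 3, 0.4) in the radar frame.
angle, height = calibration_params((3.0, 3.0, 0.4))
```

These two numbers are stored once at installation time and reused for every subsequent adjustment computation.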
In this embodiment, the method may further determine the camera's visible area in the monitoring radar coordinate system from the camera's horizontal field-of-view range, vertical field-of-view range, and aiming point, and may treat the region outside the visible area as the camera's monitoring blind area.
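Under the same geometry, the visible-area test can be sketched as an angular check: a radar-frame point is inside the camera's view when its azimuth offset from the optical axis and its elevation fall within half the horizontal and vertical field-of-view angles. The FOV widths below are assumptions, not figures from the patent.

```python
import math

def in_camera_view(obj_xyz, optical_axis_deg, h_fov_deg=60.0, v_fov_deg=40.0):
    """Return True when a radar-frame point falls inside the camera's
    current field of view; anything outside is in the monitoring
    blind area. FOV widths are illustrative."""
    x, y, z = obj_xyz
    az = math.degrees(math.atan2(y, x)) - optical_axis_deg
    az = (az + 180.0) % 360.0 - 180.0  # wrap into [-180, 180)
    el = math.degrees(math.atan2(z, math.hypot(x, y)))
    return abs(az) <= h_fov_deg / 2 and abs(el) <= v_fov_deg / 2
```

A point straight down the optical axis passes the test, while one 90° off-axis fails it and is therefore in the blind area the radar must cover.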
And S207, obtaining object position data corresponding to the moving object from the monitoring point cloud data.
And S208, calculating according to the monitoring calibration parameters and the object position data to obtain monitoring adjustment parameters.
In this embodiment, when the monitoring radar detects, within the camera's monitoring blind area, a moving object whose volume exceeds the volume threshold, the moving object can be brought into the camera's monitoring range by rotating the pan-tilt head.
As an optional implementation manner, the step of calculating according to the monitoring calibration parameter and the object position data to obtain the monitoring adjustment parameter includes:
performing superposition calculation on the monitoring calibration parameters and the object position data to obtain the monitoring adjustment parameters; the monitoring adjustment parameters include a horizontal adjustment angle and a pitch adjustment angle.
By implementing this implementation manner, the horizontal adjustment angle and the pitch adjustment angle by which the camera needs to be turned can be calculated accurately, which facilitates precise adjustment of the camera.
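As a hedged sketch of this superposition (the patent does not spell out the arithmetic), the horizontal and pitch adjustment angles might be derived from the object's radar position and the two calibration parameters like this:

```python
import math

def monitoring_adjustment(obj_pos, calib):
    """Superpose the calibration parameters on the object's radar
    position to obtain the monitoring adjustment parameters.

    obj_pos: (x, y, z) of the moving object in radar coordinates.
    calib:   (horizontal_angle, vertical_height) from calibration.
    Returns (horizontal_adjust, pitch_adjust) in radians -- the
    azimuth and pitch the pan/tilt head must rotate through.
    """
    x, y, z = obj_pos
    h_angle, height = calib
    # Azimuth of the object minus the azimuth of the optical axis:
    horizontal_adjust = math.atan2(y, x) - h_angle
    # Pitch of the object relative to the optical-axis height:
    pitch_adjust = math.atan2(z - height, math.hypot(x, y))
    return horizontal_adjust, pitch_adjust

h, p = monitoring_adjustment((0.0, 5.0, 1.0), (math.pi / 2, 1.0))
# Object sits straight along the optical axis: both angles are 0.
```

With this decomposition, driving the pan/tilt head is just commanding its azimuth by `horizontal_adjust` and its pitch by `pitch_adjust`.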
In this embodiment, the method may control the pan/tilt head according to the monitoring adjustment parameters: the azimuth angle through which the pan/tilt head must rotate is the horizontal adjustment angle, and the pitch angle is the pitch adjustment angle.
In this embodiment, the monitoring adjustment parameters may also be determined from the object position data and the camera optical axis data in the monitoring point cloud data; this embodiment places no limitation on the specific manner.
In this embodiment, the camera may be a zoom camera.
As an optional implementation manner, the step of calculating according to the monitoring calibration parameter and the object position data to obtain the monitoring adjustment parameter includes:
calculating according to the object position data and the volume of the moving object to obtain the focusing magnification of the camera;
calculating according to the monitoring calibration parameters and the object position data to obtain a monitoring adjustment angle;
and determining the focusing magnification and the monitoring adjustment angle of the camera as monitoring adjustment parameters.
By implementing this implementation manner, the method can calculate the focusing magnification of the zoom camera from the position and the volume of the moving object, so that the moving object appears at a moderate size in the camera's video picture.
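One plausible zoom heuristic, entirely illustrative since the patent gives no formula: approximate the object's linear extent from its volume, compute its angular size at the measured distance, and pick the magnification that makes it fill a target fraction of the horizontal field of view.

```python
import math

def focus_magnification(obj_pos, volume, base_h_fov, fill_ratio=0.5,
                        min_zoom=1.0, max_zoom=30.0):
    """Pick a zoom factor so the moving object occupies roughly
    `fill_ratio` of the frame width (all parameter names and the
    heuristic itself are assumptions, not from the patent).

    obj_pos:    (x, y, z) radar position of the object.
    volume:     estimated object volume in cubic metres.
    base_h_fov: horizontal field angle at 1x zoom, radians.
    """
    distance = math.sqrt(sum(c * c for c in obj_pos))
    size = volume ** (1.0 / 3.0)              # rough linear extent
    angular_size = 2.0 * math.atan2(size / 2.0, distance)
    desired_fov = angular_size / fill_ratio   # FOV that frames the object
    zoom = base_h_fov / desired_fov           # narrower FOV => higher zoom
    return max(min_zoom, min(zoom, max_zoom))

near = focus_magnification((10.0, 0.0, 0.0), 1.0, math.radians(60))
far = focus_magnification((100.0, 0.0, 0.0), 1.0, math.radians(60))
```

The clamp to `[min_zoom, max_zoom]` models the physical limits of the zoom lens; a farther object of the same volume always receives at least as much magnification.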
S209, adjusting the monitoring device according to the monitoring adjustment parameters so that the monitoring video of the camera includes the moving object.
Therefore, by implementing the monitoring method described in fig. 2, information about a moving object in the camera's blind area can be obtained through the monitoring radar, and when the volume of the moving object is greater than the preset volume threshold, the camera is directed to aim at the moving object for video shooting. This solves the problem of the camera's former blind area of view, and on that basis further improves the monitoring capability and the monitoring effect.
Example 3
Referring to fig. 3, fig. 3 is a schematic structural diagram of a monitoring device according to an embodiment of the present disclosure. The monitoring device includes a camera and a monitoring radar whose monitoring ranges are different, and the monitoring device further includes:
the judging unit 310 is configured to judge whether a moving object exists in a monitoring range of the monitoring radar according to the monitoring point cloud data acquired by the monitoring radar;
the determining unit 310 is further configured to determine whether the volume of the moving object is greater than a preset volume threshold value when the moving object exists within the monitoring range of the monitoring radar;
an obtaining unit 320, configured to obtain a monitoring adjustment parameter when the volume of the moving object is greater than a volume threshold;
the adjusting unit 330 is configured to adjust the monitoring device according to the monitoring adjustment parameter, so that the monitoring video of the camera includes a moving object.
In this embodiment, for the explanation of the monitoring device, reference may be made to the description in embodiment 1 or embodiment 2, and details are not repeated in this embodiment.
Therefore, by implementing the monitoring device described in fig. 3, information about a moving object in the camera's blind area can be obtained through the monitoring radar, and when the volume of the moving object is greater than the preset volume threshold, the camera is directed to aim at the moving object for video shooting. This solves the problem of the camera's former blind area of view, and on that basis further improves the monitoring capability and the monitoring effect.
Example 4
Referring to fig. 4, fig. 4 is a schematic structural diagram of another monitoring device according to an embodiment of the present disclosure. The monitoring device depicted in fig. 4 is a refinement of the monitoring device depicted in fig. 3. The obtaining unit 320 includes:
an obtaining subunit 321, configured to obtain a monitoring calibration parameter;
the obtaining subunit 321 is further configured to obtain object position data corresponding to the moving object from the monitoring point cloud data;
and the calculating subunit 322 is configured to perform calculation according to the monitoring calibration parameter and the object position data to obtain a monitoring adjustment parameter.
As an optional implementation, the camera is disposed above the monitoring radar, and the obtaining subunit 321 includes:
the adjusting module is used for adjusting the monitoring device so that the camera is aimed at the calibration reference object;
the acquisition module is used for acquiring calibration point cloud data of the monitoring radar and acquiring space coordinate information of a calibration reference object according to the calibration point cloud data;
the calculation module is used for calculating, according to the space coordinate information, a horizontal included angle between the optical axis of the camera and the transverse axis of the monitoring radar coordinate system, and a vertical height between the optical axis of the camera and the XY plane of the monitoring radar coordinate system;
and the determining module is used for determining the horizontal included angle and the vertical height as monitoring calibration parameters.
As an optional implementation manner, the calculating subunit 322 is specifically configured to perform superposition calculation on the monitoring calibration parameters and the object position data to obtain the monitoring adjustment parameters; the monitoring adjustment parameters include a horizontal adjustment angle and a pitch adjustment angle.
As an optional implementation manner, the calculating subunit 322 may be further configured to calculate according to the object position data and the volume of the moving object, so as to obtain a focusing magnification of the camera; calculating according to the monitoring calibration parameters and the object position data to obtain a monitoring adjustment angle; and determining the focusing magnification and the monitoring adjustment angle of the camera as monitoring adjustment parameters.
In this embodiment, for the explanation of the monitoring device, reference may be made to the description in embodiment 1 or embodiment 2, and details are not repeated in this embodiment.
Therefore, by implementing the monitoring device described in fig. 4, information about a moving object in the camera's blind area can be obtained through the monitoring radar, and when the volume of the moving object is greater than the preset volume threshold, the camera is directed to aim at the moving object for video shooting. This solves the problem of the camera's former blind area of view, and on that basis further improves the monitoring capability and the monitoring effect.
An embodiment of the present application provides an electronic device, which includes a memory and a processor, where the memory is used to store a computer program, and the processor runs the computer program to enable the electronic device to execute the monitoring method in any one of embodiment 1 or embodiment 2 of the present application.
An embodiment of the present application provides a computer-readable storage medium, which stores computer program instructions, and when the computer program instructions are read and executed by a processor, the monitoring method in any one of embodiment 1 or embodiment 2 of the present application is executed.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.

Claims (10)

1. A monitoring method used in a monitoring apparatus including a camera and a monitoring radar, the monitoring range of the camera and the monitoring radar being different, the method comprising:
judging whether a moving object exists in the monitoring range of the monitoring radar or not according to the monitoring point cloud data acquired by the monitoring radar;
when the moving object exists in the monitoring range of the monitoring radar, judging whether the volume of the moving object is larger than a preset volume threshold value or not;
when the volume of the moving object is larger than the volume threshold value, acquiring monitoring adjustment parameters;
and adjusting the monitoring device according to the monitoring adjustment parameters so that the monitoring video of the camera comprises the moving object.
2. The monitoring method of claim 1, wherein the step of obtaining monitoring adjustment parameters comprises:
acquiring a monitoring calibration parameter;
acquiring object position data corresponding to the moving object from the monitoring point cloud data;
and calculating according to the monitoring calibration parameters and the object position data to obtain monitoring adjustment parameters.
3. The monitoring method according to claim 2, wherein the camera is disposed above the monitoring radar, and the step of obtaining the monitoring calibration parameter includes:
adjusting the monitoring device to aim the camera at a calibration reference object;
acquiring calibration point cloud data of the monitoring radar, and acquiring space coordinate information of the calibration reference object according to the calibration point cloud data;
according to the space coordinate information, calculating a horizontal included angle between the optical axis of the camera and the transverse axis of the monitoring radar coordinate system, and a vertical height between the optical axis of the camera and the XY plane of the monitoring radar coordinate system;
and determining the horizontal included angle and the vertical height as monitoring calibration parameters.
4. The monitoring method according to claim 2, wherein the step of calculating the monitoring adjustment parameter based on the monitoring calibration parameter and the object position data comprises:
performing superposition calculation on the monitoring calibration parameters and the object position data to obtain monitoring adjustment parameters; the monitoring adjustment parameters comprise a horizontal adjustment angle and a pitching adjustment angle.
5. The monitoring method according to claim 2, wherein the step of calculating the monitoring adjustment parameter based on the monitoring calibration parameter and the object position data comprises:
calculating according to the object position data and the volume of the moving object to obtain the focusing magnification of the camera;
calculating according to the monitoring calibration parameters and the object position data to obtain a monitoring adjustment angle;
and determining the focusing magnification of the camera and the monitoring adjustment angle as monitoring adjustment parameters.
6. A monitoring device, comprising a camera and a monitoring radar having different monitoring ranges, wherein the monitoring device comprises:
the judging unit is used for judging whether a moving object exists in the monitoring range of the monitoring radar or not according to the monitoring point cloud data acquired by the monitoring radar;
the judgment unit is further used for judging whether the volume of the moving object is larger than a preset volume threshold value or not when the moving object exists in the monitoring range of the monitoring radar;
the acquisition unit is used for acquiring monitoring adjustment parameters when the volume of the moving object is larger than the volume threshold;
and the adjusting unit is used for adjusting the monitoring device according to the monitoring adjusting parameters so that the monitoring video of the camera comprises the moving object.
7. The monitoring device of claim 6, wherein the obtaining unit comprises:
the acquisition subunit is used for acquiring monitoring calibration parameters;
the acquisition subunit is further configured to acquire object position data corresponding to the moving object from the monitoring point cloud data;
and the calculating subunit is used for calculating according to the monitoring calibration parameter and the object position data to obtain a monitoring adjustment parameter.
8. The monitoring device of claim 7, wherein the camera is disposed above the monitoring radar, and the obtaining subunit includes:
the adjusting module is used for adjusting the monitoring device so that the camera is aimed at the calibration reference object;
the acquisition module is used for acquiring calibration point cloud data of the monitoring radar and acquiring space coordinate information of the calibration reference object according to the calibration point cloud data;
the calculation module is used for calculating, according to the space coordinate information, a horizontal included angle between the optical axis of the camera and the transverse axis of the monitoring radar coordinate system, and a vertical height between the optical axis of the camera and the XY plane of the monitoring radar coordinate system;
and the determining module is used for determining the horizontal included angle and the vertical height as monitoring calibration parameters.
9. An electronic device, characterized in that the electronic device comprises a memory for storing a computer program and a processor for executing the computer program to cause the electronic device to perform the monitoring method of any one of claims 1 to 4.
10. A readable storage medium, having stored thereon computer program instructions, which, when read and executed by a processor, perform the monitoring method of any one of claims 1 to 4.
CN202011515035.0A 2020-12-18 2020-12-18 Monitoring method and device Active CN112702571B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011515035.0A CN112702571B (en) 2020-12-18 2020-12-18 Monitoring method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011515035.0A CN112702571B (en) 2020-12-18 2020-12-18 Monitoring method and device

Publications (2)

Publication Number Publication Date
CN112702571A true CN112702571A (en) 2021-04-23
CN112702571B CN112702571B (en) 2022-10-25

Family ID: 75507684

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011515035.0A Active CN112702571B (en) 2020-12-18 2020-12-18 Monitoring method and device

Country Status (1)

Country Link
CN (1) CN112702571B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113487652A (en) * 2021-06-22 2021-10-08 江西晶浩光学有限公司 Security monitoring method, security monitoring device, storage medium and computer device

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201205344A (en) * 2010-07-30 2012-02-01 Hon Hai Prec Ind Co Ltd Adjusting system and method for screen, advertisement board including the same
WO2016183954A1 (en) * 2015-05-21 2016-11-24 中兴通讯股份有限公司 Calculation method and apparatus for movement locus, and terminal
CN108366217A (en) * 2018-03-14 2018-08-03 成都创信特电子技术有限公司 Monitor video acquisition and storage method
CN109164443A (en) * 2018-08-27 2019-01-08 南京微达电子科技有限公司 Rail track foreign matter detecting method and system based on radar and image analysis
CN110456377A (en) * 2019-08-15 2019-11-15 中国人民解放军63921部队 It is a kind of that foreign matter detecting method and system are attacked based on the satellite of three-dimensional laser radar
CN111025297A (en) * 2019-12-24 2020-04-17 京东数字科技控股有限公司 Vehicle monitoring method and device, electronic equipment and storage medium
CN111045000A (en) * 2018-10-11 2020-04-21 阿里巴巴集团控股有限公司 Monitoring system and method
CN111381232A (en) * 2020-03-27 2020-07-07 深圳市深水水务咨询有限公司 River channel safety control method based on photoelectric integration technology
CN111681426A (en) * 2020-02-14 2020-09-18 深圳市美舜科技有限公司 Method for perception and evaluation of traffic safety road conditions
CN111757098A (en) * 2020-06-30 2020-10-09 北京百度网讯科技有限公司 Debugging method and device of intelligent face monitoring camera, camera and medium
CN111932943A (en) * 2020-10-15 2020-11-13 深圳市速腾聚创科技有限公司 Dynamic target detection method and device, storage medium and roadbed monitoring equipment
CN112017210A (en) * 2020-07-14 2020-12-01 创泽智能机器人集团股份有限公司 Target object tracking method and device

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201205344A (en) * 2010-07-30 2012-02-01 Hon Hai Prec Ind Co Ltd Adjusting system and method for screen, advertisement board including the same
WO2016183954A1 (en) * 2015-05-21 2016-11-24 中兴通讯股份有限公司 Calculation method and apparatus for movement locus, and terminal
CN108366217A (en) * 2018-03-14 2018-08-03 成都创信特电子技术有限公司 Monitor video acquisition and storage method
CN109164443A (en) * 2018-08-27 2019-01-08 南京微达电子科技有限公司 Rail track foreign matter detecting method and system based on radar and image analysis
CN111045000A (en) * 2018-10-11 2020-04-21 阿里巴巴集团控股有限公司 Monitoring system and method
CN110456377A (en) * 2019-08-15 2019-11-15 中国人民解放军63921部队 It is a kind of that foreign matter detecting method and system are attacked based on the satellite of three-dimensional laser radar
CN111025297A (en) * 2019-12-24 2020-04-17 京东数字科技控股有限公司 Vehicle monitoring method and device, electronic equipment and storage medium
CN111681426A (en) * 2020-02-14 2020-09-18 深圳市美舜科技有限公司 Method for perception and evaluation of traffic safety road conditions
CN111381232A (en) * 2020-03-27 2020-07-07 深圳市深水水务咨询有限公司 River channel safety control method based on photoelectric integration technology
CN111757098A (en) * 2020-06-30 2020-10-09 北京百度网讯科技有限公司 Debugging method and device of intelligent face monitoring camera, camera and medium
CN112017210A (en) * 2020-07-14 2020-12-01 创泽智能机器人集团股份有限公司 Target object tracking method and device
CN111932943A (en) * 2020-10-15 2020-11-13 深圳市速腾聚创科技有限公司 Dynamic target detection method and device, storage medium and roadbed monitoring equipment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113487652A (en) * 2021-06-22 2021-10-08 江西晶浩光学有限公司 Security monitoring method, security monitoring device, storage medium and computer device
CN113487652B (en) * 2021-06-22 2023-06-02 江西晶浩光学有限公司 Security monitoring method, security monitoring device, storage medium and computer device

Also Published As

Publication number Publication date
CN112702571B (en) 2022-10-25

Similar Documents

Publication Publication Date Title
US11102417B2 (en) Target object capturing method and device, and video monitoring device
CN110175518B (en) Camera angle adjusting method, device, equipment and system of camera device
CN112616019B (en) Target tracking method and device, holder and storage medium
EP2549738B1 (en) Method and camera for determining an image adjustment parameter
EP3641298B1 (en) Method and device for capturing target object and video monitoring device
CN108537726B (en) Tracking shooting method and device and unmanned aerial vehicle
US20200267309A1 (en) Focusing method and device, and readable storage medium
CN101640788B (en) Method and device for controlling monitoring and monitoring system
CN112799051A (en) Automatic capturing and tracking method and system for low-speed small target
CN111627049B (en) Method and device for determining high-altitude parabolic object, storage medium and processor
AU2013398544B2 (en) A method of determining the location of a point of interest and the system thereof
CN111445531A (en) Multi-view camera navigation method, device, equipment and storage medium
CN109788201B (en) Positioning method and device
JP6723208B2 (en) Improved direction control of surveillance cameras
CN109443305B (en) Distance measuring method and device
CN115063442B (en) Method, equipment and medium for tracking hidden danger targets of power transmission line
CN112702571B (en) Monitoring method and device
JP7128577B2 (en) monitoring device
CN112883866A (en) Method, system and storage medium for detecting regional invasion in real time
CN112243106A (en) Target monitoring method, device and equipment and storage medium
KR20130062489A (en) Device for tracking object and method for operating the same
CN110868584A (en) Focal length calibration method and device of image acquisition device
CN114004876A (en) Dimension calibration method, dimension calibration device and computer readable storage medium
CN114255477A (en) Smoking behavior detection method and related device
CN110738109B (en) Method, device and computer storage medium for detecting user standing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant