CN115002353B - Camera scheduling method and system under video monitoring collaborative coverage scene - Google Patents

Camera scheduling method and system under video monitoring collaborative coverage scene

Info

Publication number
CN115002353B
Authority
CN
China
Prior art keywords
picture quality
camera
coverage area
current
collaborative
Prior art date
Legal status
Active
Application number
CN202210762138.XA
Other languages
Chinese (zh)
Other versions
CN115002353A (en)
Inventor
李兴达
应闻达
李峰
Current Assignee
Tianyi Digital Life Technology Co Ltd
Original Assignee
Tianyi Digital Life Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Tianyi Digital Life Technology Co Ltd filed Critical Tianyi Digital Life Technology Co Ltd
Priority to CN202210762138.XA
Publication of CN115002353A
Application granted
Publication of CN115002353B


Abstract

The invention discloses a camera scheduling method and system for a video monitoring collaborative coverage scene. The monitored area is divided into a plurality of sub-areas; the camera states are adjusted step by step based on the picture quality each camera provides for the sub-areas and the differing importance of the sub-areas, and a finite number of operations yields the optimal picture quality of the total monitored area together with the corresponding optimal camera states. The invention improves camera monitoring efficiency without adding redundant cameras, is operable, is more accurate and efficient, and has low system complexity.

Description

Camera scheduling method and system under video monitoring collaborative coverage scene
Technical Field
The invention relates to the field of video networking, and in particular to a method and system for collaboratively scheduling video surveillance cameras in a scenario where multiple cameras jointly cover a monitored area.
Background
A video monitoring system usually contains a plurality of cameras, so situations arise in which several cameras collaboratively cover a certain monitored area. Without tuning, some parts of the collaborative coverage area are very likely to be covered redundantly by two or more cameras while other parts are left uncovered as blind spots, so the monitoring coverage efficiency is low. In addition, when one or more cameras fail and part of the area suddenly becomes a monitoring blind spot, the remaining normal cameras cannot be quickly adjusted to cover it.
CN111698465A discloses a method and device for adjusting a monitoring coverage area, in which the monitorable areas of a first camera and a second camera partially overlap. Based on the device information and mounting heights of the two cameras, the height to which the second camera should be adjusted is determined on the premise that the first camera is left unchanged, and the second camera's height is adjusted accordingly; this reduces unnecessary overlap with the first camera's coverage while also covering the first camera's monitoring blind area. However, the scheme applies only to the case of two cameras and to a simple height adjustment of the second camera; it is suitable neither for angle-adjustable cameras nor for the case of many cameras.
CN113923406A discloses a method and device for adjusting a video monitoring coverage area, in which the video monitoring area comprises multiple independent monitoring areas, each containing multiple cameras and multiple monitoring points. The monitoring points form a first monitoring point set, the optimal monitoring points among them form a second monitoring point set, and the camera covering the most monitoring points in each independent monitoring area serves as the target camera. If the target camera is not in the working state, the coverage state of each monitoring point is determined, the intersection of the second monitoring point set and the first monitoring point set is taken as a first intersection, and a camera covering the fewest monitoring points in the first intersection is added as the execution object.
None of the above solutions takes into account the differing coverage importance of the parts (sub-areas) within each monitored area, nor the overall picture quality of the monitoring. There is therefore a need for a more intelligent yet simple, more accurate and efficient collaborative scheduling method and system for multiple video surveillance cameras, to improve area monitoring coverage efficiency and monitoring quality under both normal and fault conditions.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter; nor is it intended to be used to determine or limit the scope of the claimed subject matter.
The camera scheduling scheme of the invention introduces a coverage importance index for each sub-area of the collaborative coverage area as that sub-area's picture-quality weight, so as to quantify the picture quality of the whole collaborative coverage area, and traverses the camera angles of the collaborative coverage area in preset step increments to obtain the optimal weighted picture quality of the collaborative coverage area and the corresponding optimal camera states.
The invention discloses a method for scheduling cameras in a video monitoring collaborative coverage scene, comprising the following steps: dividing the monitored collaborative coverage area into a plurality of sub-areas; with every camera in its initial state, calculating the total current picture quality of the monitored collaborative coverage area, recording it as the current optimal picture quality, and recording the corresponding camera states as the current optimal camera states; step-adjusting the camera states and calculating the total current picture quality of the monitored collaborative coverage area; comparing the current picture quality calculated after the step adjustment with the recorded current optimal picture quality and, if it is better, recording it as the new current optimal picture quality and recording the corresponding camera states as the current optimal camera states; and repeating these steps until all camera states have been traversed, whereupon the finally recorded current optimal picture quality is the final optimal picture quality and the corresponding current optimal camera states are the final optimal camera states.
Wherein calculating the total current picture quality of the monitored co-coverage area comprises calculating a maximum value of picture quality of each sub-area and calculating a sum of the maximum values of picture quality of each sub-area as the total current picture quality of the monitored co-coverage area.
The invention also discloses a camera scheduling system for a video monitoring collaborative coverage scene, comprising a collaborative coverage area management module and a terminal management module. The collaborative coverage area management module is used for sub-area division, setting of sub-area coverage importance indexes, and picture quality management of the collaborative coverage area (including calculating, comparing, and recording the weighted picture quality, etc.); the terminal management module is used for communicating with an external terminal control system to control the camera angles and for recording the optimal camera states corresponding to the recorded optimal picture quality.
These and other features and advantages will become apparent upon reading the following detailed description and upon reference to the associated drawings. It is to be understood that both the foregoing general description and the following detailed description are explanatory only and are not restrictive of aspects as claimed.
Drawings
The invention will be described in more detail hereinafter with reference to specific embodiments shown in the drawings.
FIG. 1 is a schematic diagram of the positional relationship of a camera and a sub-area of a collaborative coverage area in accordance with an embodiment of the present invention;
FIG. 2 is a flow chart of a method of camera scheduling in a video surveillance co-coverage scenario according to one embodiment of the invention;
FIG. 3 is a flow chart of a method of camera scheduling in a video surveillance co-coverage scenario according to another embodiment of the invention;
fig. 4 is a block diagram of a camera scheduling system in a video surveillance co-coverage scenario in accordance with the present invention.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods according to embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
Detailed Description
The invention will be described in more detail hereinafter with reference to specific embodiments shown in the drawings. Various advantages and benefits of this invention will become apparent to those of ordinary skill in the art upon reading the following detailed description of the specific embodiments. It should be understood, however, that the present invention may be embodied in various forms and should not be limited to the embodiments set forth herein. The following embodiments are provided to enable a more thorough understanding of the present invention. Unless otherwise defined, technical or scientific terms used herein should be given the ordinary meaning as understood by one of ordinary skill in the art to which this application belongs.
The invention relates to a picture-quality-based scheduling scheme for video monitoring cameras in a collaborative coverage monitoring scene. The horizontal and vertical deflection angles of the cameras covering the collaborative coverage area are adjusted step by step, and the camera states corresponding to the maximum weighted picture quality of the collaborative coverage area are taken as the optimal scheduling scheme. The coverage importance index of each sub-area of the collaborative coverage area may also be taken into account.
A flowchart of a camera scheduling method in a video surveillance co-coverage scenario according to an embodiment of the present invention is shown in fig. 2. It is assumed that each camera is a video surveillance camera installed at a fixed height but with an adjustable angle. The method comprises the following steps:
Step S210, sub-area division: the collaborative coverage area to be monitored is divided into n sub-areas, and the spatial position coordinates of each sub-area are set. The value of n can be large enough that each sub-area can be regarded as a "point". The n sub-areas are denoted A_i, where i is an integer from 1 to n.
In step S220, the initial picture quality of each sub-area is calculated as the maximum of the picture quality of that sub-area over all cameras covering it in their initial states; at this point, the corresponding state of each camera is recorded as that camera's current optimal state. Specifically:
The m cameras covering the collaborative coverage area are denoted B_j, where j is an integer from 1 to m. Camera B_j is deflected by an angle α_j in the horizontal direction and by an angle β_j in the vertical direction. As can be seen from fig. 1, the deflection angle between sub-area A_i of the collaborative coverage area and the optical axis of camera B_j's lens is θ_{i,j}, and the distance between sub-area A_i and camera B_j is l_{i,j}.
In fig. 1, the picture quality D_{i,j} of sub-area A_i under camera B_j is related to the optical-axis deflection angle θ_{i,j} and the camera distance l_{i,j} as follows: D_{i,j} decreases as θ_{i,j} increases and decreases as l_{i,j} increases. Specifically: D_{i,j} = k·cos θ_{i,j} / l_{i,j}, where k is a constant and θ_{i,j} ∈ (-π/2, π/2).
The picture quality D_i of sub-area A_i is the maximum of the picture quality of that sub-area over the cameras B_j (j from 1 to m) covering the area, namely (formula 1): D_i = max_{j=1,…,m} D_{i,j}.
In step S230, the total picture quality D of the collaborative coverage area is calculated as the sum of the picture qualities D_i of the n sub-areas A_i (formula 2): D = Σ_{i=1}^{n} D_i.
the total initial picture quality D of the collaborative coverage area obtained by the above calculation, which is made in the initial state for the first time, is regarded as the current optimal picture quality D0, and recorded.
Step S240, step adjustment of each camera state: the horizontal and vertical deflection angles of each camera in the collaborative coverage area are adjusted by one step. If each of the m cameras can take x angle steps, the total number of passes over all cameras is m times x. The deflection-angle step value therefore directly affects execution efficiency: too large a step value may give insufficient precision, while too small a step value increases the system load and makes the whole scheduling process too slow. The step value should be configured as a reasonable compromise based on the system's computing capability.
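To illustrate the step-value trade-off, the short sketch below enumerates the candidate deflection angles for one axis of one camera; the mechanical limits, the 5-degree step, and the helper name are assumptions for the example only.

```python
def angle_grid(min_angle, max_angle, step):
    # Candidate deflection angles between the mechanical limits, spaced by `step`.
    angles = []
    a = min_angle
    while a <= max_angle + 1e-9:
        angles.append(round(a, 6))
        a += step
    return angles

# A 5-degree step over a +/-60-degree pan range gives 25 candidates per axis;
# halving the step roughly doubles the number of passes to evaluate.
pan_candidates = angle_grid(-60.0, 60.0, 5.0)
```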
In step S250, the total current picture quality D of the collaborative coverage area is calculated for the current camera states using formulas 1 and 2 above.
In step S260, compare picture quality: the current picture quality D obtained in step S250 is compared with the previously recorded current optimal picture quality D0.
If the current picture quality D is not better than the previously recorded current optimal picture quality D0, the process proceeds directly to step S280.
If the current picture quality D is better than the previously recorded current optimal picture quality D0, the process proceeds to step S270, where the current picture quality D is recorded as the new current optimal picture quality D0 and the current camera states are recorded as the current optimal camera states; the process then proceeds to step S280.
In step S280, it is determined whether all camera angles have been traversed. If not, the process returns to step S240 for the next step adjustment; if yes, tuning is complete, and the finally recorded camera states corresponding to the optimal picture quality are the optimal camera states obtained by this round of tuning. This yields the optimal scheduling result for the m cameras of the collaborative coverage area to be monitored.
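A minimal sketch of the tuning loop of steps S220-S280 is given below, assuming (as the m-times-x pass count above suggests) that the cameras are traversed one at a time over their own angle grids; `evaluate` stands for any routine that maps a set of camera states to the total (weighted) picture quality, for example `total_quality` above combined with the scene geometry. All names are illustrative.

```python
def optimize_cameras(cameras, angle_grids, evaluate):
    """Step-wise tuning: keep the best total picture quality seen so far and
    the camera states that produced it (steps S220-S280).

    cameras:     list of camera identifiers
    angle_grids: per-camera list of (horizontal, vertical) deflection angles
    evaluate:    maps {camera: (h, v)} states to the total picture quality D
    """
    # The initial states are the first recorded "current optimal" (steps S220/S230);
    # here the first grid entry is assumed to be the camera's initial angles.
    best_states = {cam: angle_grids[cam][0] for cam in cameras}
    best_quality = evaluate(best_states)

    for cam in cameras:                      # step S240: adjust one camera at a time
        for angles in angle_grids[cam]:      # step through its angle grid
            trial = dict(best_states)
            trial[cam] = angles
            quality = evaluate(trial)        # step S250
            if quality > best_quality:       # steps S260/S270: keep improvements only
                best_quality = quality
                best_states = trial
    return best_states, best_quality         # step S280: final optimal result
```

In the fault case described next, the same routine can simply be re-run with the failed cameras left out of `cameras` (or with their contribution zeroed inside `evaluate`).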
When one or more cameras in the collaborative coverage area fail, steps S220-S280 can be re-executed to re-adjust the angles of the remaining normally functioning cameras, so that the collaborative mechanism provides partial coverage of the failed cameras' original coverage area and a new optimal strategy is reached.
Fig. 3 is a flowchart of a camera scheduling method in a video surveillance co-coverage scenario according to another embodiment of the present invention. The steps in the flowchart of fig. 3, which are numbered identically to those in fig. 2, are not repeated.
Compared with the embodiment of fig. 2, the coverage importance index of each sub-area of the collaborative coverage area is additionally taken into account, because the sub-areas of the overall collaboratively covered monitoring area may differ in importance. Taking this into account makes camera scheduling more accurate and efficient.
To this end, step S315, setting the sub-area importance indexes, is added between steps S210 and S220: the coverage importance index of each of the n sub-areas is set as a parameter value λ_i, where i is an integer from 1 to n. The setting can be done in batches through the human-machine interface according to the importance of the different parts of the coverage area. The index λ_i serves as the picture-quality weight of sub-area A_i.
In step S330, the total weighted picture quality of the collaborative coverage area is calculated as the weighted sum of the picture qualities D_i of the n sub-areas A_i (formula 3): D_w = Σ_{i=1}^{n} λ_i·D_i.
The first time this calculation is made, with every camera in its initial state, the resulting total initial weighted picture quality D_w of the collaborative coverage area is taken as the current optimal weighted picture quality D0_w and recorded.
After each camera angle step, the total current weighted picture quality D_w of the collaborative coverage area at the current deflection angles is calculated in step S350 using formula 3.
In step S360, compare weighted picture quality: the current weighted picture quality D_w obtained in step S350 is compared with the previously recorded current optimal weighted picture quality D0_w.
If the current weighted picture quality D_w is not better than the previously recorded current optimal weighted picture quality D0_w, the process proceeds directly to step S280.
If the current weighted picture quality D_w is better than the previously recorded current optimal weighted picture quality D0_w, the process proceeds to step S370, where the current weighted picture quality D_w is recorded as the new current optimal weighted picture quality D0_w and the current camera states are recorded as the current optimal camera states; the process then proceeds to step S280.
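A one-function sketch of formula 3 follows; the argument names are assumptions, and the per-sub-area qualities D_i are computed exactly as in the unweighted embodiment.

```python
def weighted_total_quality(sub_area_qualities, importance):
    # Formula 3: D_w = sum_i lambda_i * D_i, with the coverage-importance
    # indexes lambda_i acting as picture-quality weights.
    return sum(lam * d for lam, d in zip(importance, sub_area_qualities))
```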
Fig. 4 illustrates a block diagram of a camera scheduling system 400 in a video surveillance co-coverage scenario, according to an embodiment of the invention.
The system includes a collaborative coverage area management module 410 and a terminal management module 420. Wherein:
the collaborative coverage area management module 410 is configured to:
dividing subareas, dividing a collaborative coverage area into n subareas, and setting space position coordinates of each subarea;
sub-region coverage importance index setting; and
collaborative coverage area picture quality management, including calculating the picture quality of each sub-area and the total current (weighted) picture quality of the collaborative coverage area in the current camera states, comparing the current (weighted) picture quality with the recorded current optimal (weighted) picture quality, and recording the optimal (weighted) picture quality.
The terminal management module 420 is configured to communicate with an external terminal control system, controlling and managing camera state changes by calling the camera angle adjustment functions of the existing terminal control system to control each camera's angles, and to record the camera states corresponding to the optimal (weighted) picture quality.
It is understood that the above modules may be implemented in hardware, in software, or in a combination of hardware and software. The functional blocks depicted in fig. 4 may be combined into a single functional block or divided into multiple sub-functional blocks.
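The split between the two modules could look roughly like the following Python sketch; the class and method names, and the `adjust` call on the external control client, are hypothetical placeholders rather than the actual interfaces of any terminal control system.

```python
class CoverageAreaManager:
    """Sub-area division, importance indexes, and picture-quality bookkeeping."""
    def __init__(self, sub_area_coords, importance):
        self.sub_area_coords = sub_area_coords   # spatial position of each sub-area
        self.importance = importance             # lambda_i coverage-importance weights
        self.best_quality = float("-inf")        # recorded current optimal quality

    def record_if_better(self, quality):
        # Compare a newly computed (weighted) quality with the recorded optimum.
        if quality > self.best_quality:
            self.best_quality = quality
            return True
        return False

class TerminalManager:
    """Relays angle commands to the external terminal control system."""
    def __init__(self, control_client):
        self.control_client = control_client     # external system; API assumed
        self.best_states = {}                    # camera states at the optimum

    def set_angles(self, camera_id, horizontal, vertical):
        # Delegates to the control system's own camera angle adjustment function.
        self.control_client.adjust(camera_id, horizontal, vertical)

    def record_states(self, states):
        self.best_states = dict(states)
```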
According to this scheme, based on the monitored picture quality and the differing importance of the sub-areas within the monitoring area, a finite number of operations yields the optimal monitoring picture quality for the area and the corresponding camera monitoring angles, so camera monitoring efficiency can be improved without adding redundant cameras. The camera scheduling method is operable, more accurate and efficient, and of low complexity.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the embodiments, and are intended to be included within the scope of the claims and description.

Claims (10)

1. A method for scheduling cameras in a video monitoring collaborative coverage scene comprises the following steps:
step a) dividing the monitored collaborative coverage area into a plurality of subareas;
step b), calculating the total current picture quality of the monitored collaborative coverage area as the current optimal picture quality to record when each camera is in an initial state, and recording the corresponding state of each camera as the current optimal camera state;
step c), step-by-step adjusting the states of all cameras, and calculating the total current picture quality of the monitored collaborative coverage area;
step d) comparing the current picture quality calculated after step adjustment with the recorded current optimal picture quality, if the current picture quality calculated after step adjustment is better than the recorded current optimal picture quality, recording the current picture quality calculated after step adjustment as the current optimal picture quality, and recording the corresponding camera states as the current optimal camera states;
step e) repeating the steps c) and d) until all camera states are traversed, wherein the finally recorded current optimal picture quality is the final optimal picture quality, and the corresponding current optimal camera state of each camera is the final optimal camera state.
2. The method of claim 1, wherein calculating the total current picture quality of the monitored co-coverage area further comprises:
calculating the maximum value of the picture quality of each subarea; and
the sum of the maximum values of the picture quality of the sub-areas is calculated as the total current picture quality of the monitored co-coverage area.
3. The method of claim 1, wherein calculating the total current picture quality of the monitored co-coverage area further comprises:
calculating the maximum value of the picture quality of each subarea;
setting respective coverage importance indexes of all subareas as weights of maximum values of picture quality of all subareas; and
a weighted sum of the maximum values of the picture quality of the sub-regions is calculated as the total current weighted picture quality of the monitored co-coverage area.
4. A method as claimed in claim 2 or 3, wherein the maximum value of the picture quality of each sub-area is the maximum of the picture quality provided for that sub-area by the respective cameras covering the area.
5. The method of claim 4, wherein the picture quality D_{i,j} that one camera B_j provides for a sub-area A_i decreases with increasing optical-axis deflection angle θ_{i,j} and with increasing distance l_{i,j} of the camera from the sub-area, and is calculated by the following formula:
D_{i,j} = k·cos θ_{i,j} / l_{i,j}
where k is a constant, j is an integer from 1 to m, i is an integer from 1 to n, and θ_{i,j} ∈ (-π/2, π/2).
6. The method of claim 1, further comprising repeating steps b) -e) when one or more cameras in the collaborative coverage area fail.
7. The method of claim 1, wherein the monitored co-coverage area is divided into a sufficient number of sub-areas in step a) such that each sub-area is considered a point.
8. The method of claim 1, wherein incrementally adjusting each camera state comprises incrementally adjusting a horizontal and vertical deflection angle of each camera in the coordinated coverage area.
9. A camera scheduling system for a video monitoring collaborative coverage scene, comprising:
the collaborative coverage area management module is configured to divide a monitored collaborative coverage area into a plurality of sub-areas and manage picture quality of the collaborative coverage area, and includes: dividing the monitored collaborative coverage area into a plurality of subareas, calculating the total current picture quality of the monitored collaborative coverage area under the initial state of each camera to be recorded as the current optimal picture quality, step-adjusting the state of each camera and calculating the total current picture quality of the monitored collaborative coverage area, comparing the calculated current picture quality after step-adjusting with the recorded current optimal picture quality, if the calculated current picture quality after step-adjusting is better than the recorded current optimal picture quality, recording the calculated current picture quality after step-adjusting as the current optimal picture quality, and taking the finally recorded current optimal picture quality as the final optimal picture quality when the step-traversing is completed in all states of each camera; and
the terminal management module is used for communicating with an external terminal control system to control the angles of the cameras and recording the current optimal camera states of the cameras corresponding to the current optimal picture quality.
10. The system of claim 9, wherein the collaborative coverage area management module is further configured to set the coverage importance index of each sub-area.
CN202210762138.XA 2022-06-30 2022-06-30 Camera scheduling method and system under video monitoring collaborative coverage scene Active CN115002353B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210762138.XA CN115002353B (en) 2022-06-30 2022-06-30 Camera scheduling method and system under video monitoring collaborative coverage scene


Publications (2)

Publication Number Publication Date
CN115002353A CN115002353A (en) 2022-09-02
CN115002353B (en) 2023-07-25

Family

ID=83019567

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210762138.XA Active CN115002353B (en) 2022-06-30 2022-06-30 Camera scheduling method and system under video monitoring collaborative coverage scene

Country Status (1)

Country Link
CN (1) CN115002353B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116962649B (en) * 2023-09-19 2024-01-09 安徽送变电工程有限公司 Image monitoring and adjusting system and line construction model

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103400371A (en) * 2013-07-09 2013-11-20 河海大学 Multi-camera synergistic monitoring equipment and method
CN111698465A (en) * 2020-04-29 2020-09-22 视联动力信息技术股份有限公司 Method and device for adjusting monitoring coverage area, electronic equipment and storage medium
CN112583632A (en) * 2020-10-13 2021-03-30 特斯联科技集团有限公司 Camera network topology relation estimation method and system in monitoring scene
CN112969051A (en) * 2021-01-30 2021-06-15 南京新高智联信息技术有限公司 Hydraulic engineering management system based on big data
CN113923406A (en) * 2021-09-29 2022-01-11 四川警察学院 Method, device, equipment and storage medium for adjusting video monitoring coverage area


Also Published As

Publication number Publication date
CN115002353A (en) 2022-09-02


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant