CN115002353A - Camera scheduling method and system under video monitoring cooperative coverage scene - Google Patents

Camera scheduling method and system under video monitoring cooperative coverage scene

Info

Publication number
CN115002353A
CN115002353A (application CN202210762138.XA; granted publication CN115002353B)
Authority
CN
China
Prior art keywords
picture quality
camera
coverage area
current
sub
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210762138.XA
Other languages
Chinese (zh)
Other versions
CN115002353B (en)
Inventor
李兴达
应闻达
李峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianyi Digital Life Technology Co Ltd
Original Assignee
Tianyi Digital Life Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianyi Digital Life Technology Co Ltd filed Critical Tianyi Digital Life Technology Co Ltd
Priority to CN202210762138.XA
Publication of CN115002353A
Application granted
Publication of CN115002353B
Legal status: Active
Anticipated expiration

Landscapes

  • Closed-Circuit Television Systems (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a camera scheduling method and system for a video monitoring cooperative coverage scene. The monitored area is divided into a plurality of sub-areas, and based on the picture quality each camera provides for each sub-area and the differing importance of the sub-areas, the state of each camera is adjusted step by step so that a finite number of operations yields the optimal total picture quality of the monitored area and the corresponding optimal camera states. The invention improves camera monitoring efficiency without adding redundant cameras, and is operable, accurate, efficient, and low in system complexity.

Description

Camera scheduling method and system under video monitoring cooperative coverage scene
Technical Field
The invention relates to the field of video networking, in particular to a video monitoring camera collaborative scheduling method and system under a video monitoring collaborative coverage scene of a plurality of cameras.
Background
A video monitoring system generally contains a plurality of cameras, and a situation may arise in which several cameras cooperatively cover the video monitoring scene of a certain area. Without tuning, parts of the cooperative coverage area are likely to be covered redundantly by two or more cameras while other parts remain uncovered blind areas, so monitoring coverage efficiency is low. In addition, when one or more cameras fail and part of the area suddenly becomes a monitoring blind area, the normally functioning cameras cannot be quickly adjusted to cover it.
CN111698465A discloses a method and an apparatus for adjusting a monitoring coverage area in which the monitorable areas of a first camera and a second camera partially overlap. On the premise that the first camera is not adjusted, the height to which the second camera should be adjusted is determined from the device information and heights of the two cameras, and the height of the second camera is adjusted accordingly; this reduces unnecessary coverage overlap with the first camera while also covering the first camera's monitoring blind area. However, this approach applies only to the case of two cameras, involves only a simple height adjustment of the second camera, and covers neither angle-adjustable cameras nor the case of multiple cameras.
CN113923406A discloses a method and an apparatus for adjusting a video monitoring coverage area in which the video monitoring area includes multiple independent monitoring areas, each containing multiple cameras and multiple monitoring points. The monitoring points form a first monitoring point set, the optimal monitoring points among them form a second monitoring point set, and the camera covering the most monitoring points in each independent monitoring area is taken as the target camera. If the target camera is not in a working state, the coverage state of each monitoring point is determined, the intersection of the second monitoring point set and the first monitoring point set is taken as a first intersection, and the camera covering the fewest monitoring points in the first intersection is added as an execution object.
Neither of the above solutions considers the differing coverage importance of each part (sub-area) of the monitored area, nor the overall picture quality of monitoring. A more intelligent, simple, accurate and efficient method and system for cooperatively scheduling multiple video surveillance cameras is therefore needed to improve the coverage efficiency and quality of regional surveillance under normal and fault conditions.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter; nor is it intended to be used as an aid in determining or limiting the scope of the claimed subject matter.
The camera scheduling solution of the invention introduces a coverage importance index for each sub-area of the cooperative coverage area as that sub-area's picture quality weight, so as to quantify the overall picture quality of the cooperative coverage area. The angle of each camera in the cooperative coverage area is traversed and adjusted according to a preset step value to obtain the optimal weighted picture quality of the cooperative coverage area and the corresponding optimal camera states.
The invention discloses a method for scheduling cameras in a video monitoring cooperative coverage scene, comprising the following steps: dividing the monitored cooperative coverage area into a plurality of sub-areas; with each camera in its initial state, calculating the total current picture quality of the monitored cooperative coverage area as the current optimal picture quality and recording the corresponding state of each camera as the current optimal camera state; adjusting the state of each camera step by step and calculating the total current picture quality of the monitored cooperative coverage area; comparing the current picture quality calculated after the step adjustment with the recorded current optimal picture quality, and if the former is better, recording it as the current optimal picture quality and recording the corresponding camera state as the current optimal camera state; and repeating the preceding two steps until all camera states have been traversed, whereupon the finally recorded current optimal picture quality is the final optimal picture quality and the corresponding current optimal camera state of each camera is the final optimal camera state.
Wherein calculating the total current picture quality of the monitored collaborative coverage area comprises calculating the maximum value of the picture quality of each sub-area, and calculating the sum of the maximum values of the picture quality of each sub-area as the total current picture quality of the monitored collaborative coverage area.
The invention also discloses a system for scheduling cameras in a video monitoring cooperative coverage scene, comprising a cooperative coverage area management module and a terminal management module. The cooperative coverage area management module is used for sub-area division, sub-area coverage importance index setting, and picture quality management of the cooperative coverage area (including calculation, comparison, and recording of the weighted picture quality). The terminal management module is used for communicating with an external terminal control system to control the angle of each camera and to record the optimal camera state corresponding to the recorded optimal picture quality.
These and other features and advantages will become apparent upon reading the following detailed description and upon reference to the accompanying drawings. It is to be understood that both the foregoing general description and the following detailed description are explanatory only and are not restrictive of aspects as claimed.
Drawings
The present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which specific embodiments of the invention are shown.
FIG. 1 is a schematic diagram of a positional relationship between a camera and a sub-region of a cooperative coverage area, according to an embodiment of the present invention;
FIG. 2 is a flowchart of a camera scheduling method in a video surveillance collaborative coverage scenario according to an embodiment of the present invention;
fig. 3 is a flowchart of a camera scheduling method in a video surveillance collaborative coverage scenario according to another embodiment of the present invention;
fig. 4 is a block diagram of a camera scheduling system in a video surveillance collaborative coverage scenario according to the present invention.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
Detailed Description
The present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which specific embodiments of the invention are shown. Various advantages and benefits of the present invention will become apparent to those of ordinary skill in the art upon reading the following detailed description of the specific embodiments. It should be understood, however, that the present invention may be embodied in various forms and should not be limited to the embodiments set forth herein. The following embodiments are provided so that the invention may be more fully understood. Unless otherwise defined, technical or scientific terms used herein shall have the ordinary meaning as understood by one of ordinary skill in the art to which this application belongs.
The invention relates to a picture-quality-based scheduling scheme for video monitoring cameras in a cooperative coverage monitoring scene. The horizontal and vertical deflection angles of each camera in the cooperative coverage area are adjusted in a stepping manner, and the camera states corresponding to the maximum weighted picture quality of the cooperative coverage area are taken as the optimal scheduling scheme. The coverage importance index of each sub-area in the cooperative coverage area may also be taken into account.
A flowchart of a camera scheduling method in a video surveillance collaborative coverage scenario according to an embodiment of the present invention is shown in fig. 2. Suppose that each camera is a fixed height mounted but angle adjustable video surveillance camera. The method comprises the following steps:
Step S210 — dividing sub-areas: the cooperative coverage area to be monitored is divided into n sub-areas, and the spatial position coordinates of each sub-area are set. The value of n may be large enough that each sub-area can be treated as a "point". The n sub-areas are denoted A_i, where i is an integer from 1 to n.
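By way of illustration only, the following Python sketch divides a rectangular coverage area into a grid of sub-areas and records their centre coordinates; the grid layout, dimensions, and function name are assumptions, since the patent does not prescribe any particular division scheme.

```python
def divide_into_subareas(width_m: float, depth_m: float,
                         rows: int, cols: int) -> list[tuple[float, float]]:
    """Return centre coordinates (x, y) of rows*cols rectangular sub-areas A_i.

    One assumed way to obtain the n sub-areas and their spatial position
    coordinates; any other division scheme would serve equally well.
    """
    dx, dy = width_m / cols, depth_m / rows
    return [(c * dx + dx / 2, r * dy + dy / 2)
            for r in range(rows) for c in range(cols)]

# e.g. a 40 m x 20 m area split into 8 x 4 = 32 sub-areas ("points")
subareas = divide_into_subareas(40.0, 20.0, rows=4, cols=8)
```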
In step S220, the initial picture quality of each sub-area is calculated as the maximum of the picture qualities provided for that sub-area by the cameras covering it in their initial states; the corresponding state of each camera is recorded as the current optimal camera state. Specifically:
Each of the m cameras covering the cooperative coverage area is denoted B_j, where j is an integer from 1 to m. Camera B_j has a horizontal deflection angle α_j and a vertical deflection angle β_j. As shown in Fig. 1, the deflection angle between sub-area A_i of the cooperative coverage area and the optical axis of the lens of camera B_j is θ_{i,j}, and the distance between sub-area A_i and camera B_j is l_{i,j}.

In Fig. 1, the picture quality D_{i,j} provided by camera B_j for sub-area A_i depends on the deflection angle θ_{i,j} from the camera's optical axis and on the distance l_{i,j} from the camera: D_{i,j} decreases as θ_{i,j} increases and decreases as l_{i,j} increases. Specifically: D_{i,j} = k·cos θ_{i,j} / l_{i,j}, where k is a constant and θ_{i,j} ∈ (−π/2, π/2).
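A minimal Python sketch of this quality model follows; the function name, the default k = 1.0, and the convention of returning 0 outside the camera's usable field of view are illustrative assumptions, not taken from the patent.

```python
import math

def picture_quality(theta_ij: float, l_ij: float, k: float = 1.0) -> float:
    """D_{i,j} = k * cos(theta_ij) / l_ij.

    theta_ij: deflection angle (radians) between sub-area A_i and the optical
              axis of camera B_j; valid range is (-pi/2, pi/2).
    l_ij:     distance between sub-area A_i and camera B_j (must be > 0).
    """
    if not (-math.pi / 2 < theta_ij < math.pi / 2):
        return 0.0  # assumed convention: no usable picture outside this range
    return k * math.cos(theta_ij) / l_ij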
The picture quality D_i of sub-area A_i is the maximum of the picture qualities provided for that sub-area by the cameras B_j (j = 1 to m) covering it, i.e.:

D_i = max_{j=1..m} D_{i,j}    (Formula 1)
In step S230, the total picture quality D of the cooperative coverage area is calculated as the sum of the picture qualities D_i of the n sub-areas A_i:

D = Σ_{i=1..n} D_i    (Formula 2)
The total initial picture quality D of the cooperative coverage area, obtained from the above calculation performed for the first time with each camera in its initial state, is recorded as the current optimal picture quality D0.
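The following sketch computes Formulas 1 and 2, under the assumption that the per-camera qualities D_{i,j} are already available as a nested list indexed by sub-area and camera; all names and sample values are illustrative.

```python
def total_picture_quality(D: list[list[float]]) -> float:
    """Formula 1 then Formula 2: D_i = max_j D_{i,j}; D = sum_i D_i.

    D[i][j] is the picture quality of camera B_j for sub-area A_i
    (0.0 if camera B_j does not cover sub-area A_i).
    """
    return sum(max(row) for row in D)

# Example: 3 sub-areas, 2 cameras
D = [[0.8, 0.3],
     [0.0, 0.5],
     [0.6, 0.7]]
print(total_picture_quality(D))  # 0.8 + 0.5 + 0.7 = 2.0
```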
Step S240 — step adjustment of each camera state: the horizontal and vertical deflection angles of each camera in the cooperative coverage area are adjusted according to the preset step value. If each of the m cameras can take x angle steps, the number of traversals over all cameras is m times x. The step value of the deflection angle therefore directly affects system execution efficiency: too large a step value may reduce accuracy, while too small a step value increases the system load and makes the whole scheduling process too slow. The step value should be chosen as a trade-off against the computing capacity of the system.
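As a rough illustration of how the step value fixes the number of candidate states per camera, the sketch below builds a discrete angle grid; the angle ranges and the 10° step are assumed values, not taken from the patent.

```python
def angle_grid(min_deg: float, max_deg: float, step_deg: float) -> list[float]:
    """Discrete candidate deflection angles produced by a fixed step value."""
    count = int((max_deg - min_deg) / step_deg)
    return [min_deg + i * step_deg for i in range(count + 1)]

# Assumed pan range -60..60 deg and tilt range -30..30 deg, 10-deg step:
pan_angles = angle_grid(-60, 60, 10)    # 13 candidate horizontal deflections
tilt_angles = angle_grid(-30, 30, 10)   # 7 candidate vertical deflections
states_per_camera = len(pan_angles) * len(tilt_angles)  # 91; a smaller step raises this count
```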
In step S250, the total current picture quality D of the cooperative coverage area in the current camera state is calculated using Formulas 1 and 2.
In step S260, the picture quality is compared: the current picture quality D obtained in step S250 is compared with the last recorded current optimal picture quality D0.
If the current picture quality D is not better than the current optimum picture quality D0 recorded last time, the process proceeds directly to step S280.
If the current picture quality D is better than the last recorded current optimal picture quality D0, the process proceeds to step S270: the current picture quality D is recorded as the new current optimal picture quality D0, the current camera state is recorded as the current optimal camera state, and the process then proceeds to step S280.
In step S280, it is determined whether every angular state of every camera has been traversed. If not, the process returns to step S240 to continue the step adjustment of the camera states. If so, the adjustment is complete, and the finally recorded camera state corresponding to the optimal picture quality is the optimal camera state obtained by the current scheduling, i.e., the optimal scheduling result for the m cameras in the monitored cooperative coverage area.
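A sketch of the traversal in steps S240–S280 is given below. It assumes that each camera's candidate states are a pre-computed list of (horizontal, vertical) deflection angles and that every combination is enumerated; how θ_{i,j} and l_{i,j} follow from a given state is left to a caller-supplied quality function. The function name and the exhaustive enumeration order are illustrative assumptions.

```python
import itertools
import math
from typing import Callable, Optional, Sequence, Tuple

State = Tuple[float, float]  # (horizontal deflection alpha_j, vertical deflection beta_j)

def find_best_states(
    candidate_states: Sequence[Sequence[State]],      # per camera B_j: its candidate (alpha, beta) pairs
    quality_of: Callable[[int, int, State], float],   # (sub-area i, camera j, state) -> D_{i,j}
    n_subareas: int,
) -> Tuple[Optional[Tuple[State, ...]], float]:
    """Traverse all camera-state combinations and keep the best total quality."""
    best_states: Optional[Tuple[State, ...]] = None
    best_quality = -math.inf
    # One combination = one candidate state chosen for every camera.
    for states in itertools.product(*candidate_states):
        total = sum(
            max(quality_of(i, j, s) for j, s in enumerate(states))  # Formula 1
            for i in range(n_subareas)                               # Formula 2
        )
        if total > best_quality:        # steps S260/S270: record the better result
            best_quality, best_states = total, states
    return best_states, best_quality
```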
When one or more cameras in the cooperative coverage area fail, steps S220–S280 can be executed again to readjust the angles of the remaining normally functioning cameras, so that the cooperative mechanism provides partial coverage of the failed camera's original coverage area and a new optimal strategy is reached.
Fig. 3 is a flowchart of a camera scheduling method in a video surveillance collaborative coverage scenario according to another embodiment of the present invention. The steps in the flowchart of fig. 3 that are labeled the same as those in fig. 2 are not repeated.
Compared with the embodiment of Fig. 2, the coverage importance index of each sub-area in the cooperative coverage area is additionally taken into account, because the sub-areas of the overall monitored cooperative coverage area may differ in importance. Taking this into consideration makes camera scheduling more accurate and efficient.
To this end, step S315 is added between steps S210 and S220 — setting the sub-area importance indexes: the coverage importance index of each of the n sub-areas is set as a parameter value λ_i, where i is an integer from 1 to n. The settings may be made in batches through a man-machine interface according to the importance of different parts of the coverage area. The index λ_i serves as the picture quality weight of sub-area A_i.
In step S330, the total weighted picture quality of the cooperative coverage area is calculated as the weighted sum of the picture qualities D_i of the n sub-areas A_i:

D_w = Σ_{i=1..n} λ_i · D_i    (Formula 3)
The total initial weighted picture quality D_w of the cooperative coverage area, obtained from this calculation performed for the first time with each camera in its initial state, is recorded as the current optimal weighted picture quality D0_w.
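A minimal sketch of Formula 3, extending the earlier unweighted sum with the per-sub-area importance indexes λ_i; the sample weights are illustrative assumptions.

```python
def total_weighted_picture_quality(D: list[list[float]], weights: list[float]) -> float:
    """Formula 3: D_w = sum_i lambda_i * max_j D_{i,j}."""
    return sum(lam * max(row) for lam, row in zip(weights, D))

# Example: the second sub-area is considered twice as important as the others.
D = [[0.8, 0.3],
     [0.0, 0.5],
     [0.6, 0.7]]
print(total_weighted_picture_quality(D, [1.0, 2.0, 1.0]))  # 0.8 + 1.0 + 0.7 = 2.5
```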
After each camera angle step, the total current weighted picture quality D_w of the cooperative coverage area at the current deflection angles is calculated in step S350 using Formula 3.

In step S360, the weighted picture quality is compared: the current weighted picture quality D_w obtained in step S350 is compared with the previously recorded current optimal weighted picture quality D0_w.

If the current weighted picture quality D_w is not better than the previously recorded current optimal weighted picture quality D0_w, the process proceeds directly to step S280.

If the current weighted picture quality D_w is better than the previously recorded current optimal weighted picture quality D0_w, the process proceeds to step S370: the current weighted picture quality D_w is recorded as the new current optimal weighted picture quality D0_w, the current camera state is recorded as the current optimal camera state, and the process then proceeds to step S280.
Fig. 4 shows a block diagram of a camera scheduling system 400 in a video surveillance collaborative coverage scenario according to an embodiment of the present invention.
The system includes a cooperative coverage area management module 410 and a terminal management module 420. Wherein:
the cooperative coverage area management module 410 is configured to:
dividing sub-regions, dividing the cooperative coverage area into n sub-regions, and setting spatial position coordinates of each sub-region;
sub-area coverage importance index setting; and
managing the picture quality of the cooperative coverage area, including calculating the picture quality of each sub-area in the current camera state and the total current (weighted) picture quality of the cooperative coverage area, comparing the current (weighted) picture quality with the recorded current optimal (weighted) picture quality, and recording the optimal (weighted) picture quality.
The terminal management module 420 is configured to communicate with an external terminal control system, to control the angle of each camera by calling the camera angle adjustment function of the existing terminal control system so as to manage camera state changes, and to record the camera states corresponding to the optimal (weighted) picture quality.
It is understood that the above modules may be implemented by hardware, software, or a combination of hardware and software. The functional blocks depicted in fig. 4 may be combined into a single functional block or divided into multiple sub-functional blocks.
According to this scheme, based on the quality of the monitoring picture and the differing importance of each sub-area in the monitored area, the optimal monitoring picture quality for the area and the corresponding camera monitoring angles are obtained through a finite number of operations, and camera monitoring efficiency can be improved without adding redundant cameras. The camera scheduling method is operable, more accurate and efficient, and low in complexity.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the present disclosure, and the present disclosure should be construed as being covered by the claims and the specification.

Claims (10)

1. A method for scheduling each camera in a video monitoring collaborative coverage scene comprises the following steps:
step a) dividing the monitored cooperative coverage area into a plurality of sub-areas;
step b) calculating the total current picture quality of the monitored cooperative coverage area as the current optimal picture quality when each camera is in the initial state, and recording the corresponding camera state as the current optimal camera state;
step c) adjusting the state of each camera step by step, and calculating the total current picture quality of the monitored collaborative coverage area;
step d) comparing the calculated current picture quality after the step adjustment with the recorded current optimal picture quality, and if the calculated current picture quality after the step adjustment is superior to the recorded current optimal picture quality, recording the calculated current picture quality after the step adjustment as the current optimal picture quality, and recording the corresponding camera state as the current optimal camera state;
and e) repeating the steps c) and d) until all the camera states are traversed, the finally recorded current optimal picture quality is the final optimal picture quality, and the current optimal camera state of each corresponding camera is the final optimal camera state.
2. The method of claim 1, wherein calculating the total current picture quality of the monitored cooperative coverage area further comprises:
calculating the maximum value of the image quality of each subregion; and
and calculating the sum of the maximum values of the image quality of each subarea to be used as the total current image quality of the monitored cooperative coverage area.
3. The method of claim 1, wherein calculating the total current picture quality of the monitored cooperative coverage area further comprises:
calculating the maximum value of the image quality of each subregion;
setting the respective coverage importance index of each subregion as the weight of the maximum value of the image quality of each subregion; and
and calculating the weighted sum of the maximum values of the image quality of each subarea as the total current weighted image quality of the monitored cooperative coverage area.
4. A method as claimed in claim 2 or 3, wherein the maximum value of the picture quality of each subregion is the maximum value of the picture quality of the subregion for the respective camera covering that region.
5. The method according to claim 4, characterized in that the picture quality (D_{i,j}) of one camera (B_j) for one sub-area (A_i) decreases as the deflection angle (θ_{i,j}) from the camera's optical axis increases and as the distance (l_{i,j}) between the camera and the sub-area increases, and is calculated as:
D_{i,j} = k·cos θ_{i,j} / l_{i,j}
where k is a constant and θ_{i,j} ∈ (−π/2, π/2).
6. The method of claim 1, further comprising repeating steps b)-e) when one or more cameras in the cooperative coverage area fail.
7. The method of claim 1, wherein the monitored cooperative coverage area is divided into a sufficient number of sub-areas in step a) such that each sub-area is treated as a point.
8. The method of claim 1, wherein incrementally adjusting the status of each camera comprises incrementally adjusting a horizontal and vertical yaw angle of each camera of the cooperative coverage area.
9. A scheduling system for each camera in a video monitoring collaborative coverage scene comprises:
a cooperative coverage area management module, configured to divide the monitored cooperative coverage area into a plurality of sub-areas and perform picture quality management on the cooperative coverage area, where the cooperative coverage area management module includes: calculating the total current picture quality of the collaborative coverage area, comparing the calculated current picture quality after stepping adjustment with the recorded current optimal picture quality, recording the calculated current picture quality after stepping adjustment as the current optimal picture quality if the calculated current picture quality after stepping adjustment is superior to the recorded current optimal picture quality, and taking the finally recorded current optimal picture quality when stepping is finished and traversing all the states of all the cameras as the final optimal picture quality; and
and the terminal management module is used for communicating with an external terminal control system to control the angle of each camera and record the current optimal camera state of each camera corresponding to the current optimal picture quality.
10. The system of claim 9, wherein the cooperative coverage area management module is further configured for each sub-area coverage importance index setting.
CN202210762138.XA 2022-06-30 2022-06-30 Camera scheduling method and system under video monitoring collaborative coverage scene Active CN115002353B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210762138.XA CN115002353B (en) 2022-06-30 2022-06-30 Camera scheduling method and system under video monitoring collaborative coverage scene

Publications (2)

Publication Number Publication Date
CN115002353A true CN115002353A (en) 2022-09-02
CN115002353B CN115002353B (en) 2023-07-25

Family

ID=83019567

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210762138.XA Active CN115002353B (en) 2022-06-30 2022-06-30 Camera scheduling method and system under video monitoring collaborative coverage scene

Country Status (1)

Country Link
CN (1) CN115002353B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103400371A (en) * 2013-07-09 2013-11-20 河海大学 Multi-camera synergistic monitoring equipment and method
CN111698465A (en) * 2020-04-29 2020-09-22 视联动力信息技术股份有限公司 Method and device for adjusting monitoring coverage area, electronic equipment and storage medium
CN112583632A (en) * 2020-10-13 2021-03-30 特斯联科技集团有限公司 Camera network topology relation estimation method and system in monitoring scene
CN112969051A (en) * 2021-01-30 2021-06-15 南京新高智联信息技术有限公司 Hydraulic engineering management system based on big data
CN113923406A (en) * 2021-09-29 2022-01-11 四川警察学院 Method, device, equipment and storage medium for adjusting video monitoring coverage area

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116962649A (en) * 2023-09-19 2023-10-27 安徽送变电工程有限公司 Image monitoring and adjusting system and line construction model
CN116962649B (en) * 2023-09-19 2024-01-09 安徽送变电工程有限公司 Image monitoring and adjusting system and line construction model

Also Published As

Publication number Publication date
CN115002353B (en) 2023-07-25

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant