WO2023113754A1 - Calibration and management method of PTZ cameras

Calibration and management method of PTZ cameras

Info

Publication number
WO2023113754A1
WO2023113754A1 (PCT/TR2022/051487)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
vertical
cameras
zoom level
management method
Prior art date
Application number
PCT/TR2022/051487
Other languages
French (fr)
Inventor
Yunus Emre ESIN
Omer OZDIL
Safak OZTURK
Original Assignee
Havelsan Hava Elektronik San. Ve Tic. A.S.
Priority date
Filing date
Publication date
Priority claimed from TR2021/020173 external-priority patent/TR2021020173A2/en
Application filed by Havelsan Hava Elektronik San. Ve Tic. A.S. filed Critical Havelsan Hava Elektronik San. Ve Tic. A.S.
Publication of WO2023113754A1 publication Critical patent/WO2023113754A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources


Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present invention relates to a method for automatic calibration and management of multiple PTZ (pan-tilt-zoom) cameras. With the invention, a method was developed in which the calibration information is stored in paired form with the relevant cameras after the calibration of each PTZ camera, and the cameras can be routed independently of their differences from each other by issuing routing commands in a single, common format for different cameras and converting these routing commands using the relevant calibration information. Thus, PTZ cameras with different viewing angles, motion ranges and control parameters can be managed using only absolute control parameters. Calibration processes for the PTZ camera features are also explained.

Description

DESCRIPTION
CALIBRATION AND MANAGEMENT METHOD OF PTZ CAMERAS
Technical Field
The present invention relates to a method for automatic calibration and management of multiple PTZ (pan-tilt-zoom) cameras.
Background
To control cameras that can be moved along a horizontal curve, that is, rotated around a vertical axis (pan), and along a vertical curve, that is, rotated around a horizontal axis (tilt), and that have a zoom function, it is necessary to know their operating ranges and zoom behavior exactly. Errors in this information cause errors in the location information of the captured image, and this error may grow as camera routing operations are repeated. The basic control parameters of PTZ cameras are defined in the standard text titled “ONVIF PTZ Service Specification”. Accordingly, PTZ cameras can be controlled using relative parameters defined according to their instantaneous location (RelativeMove) or using absolute parameters defined according to a reference point (AbsoluteMove). However, the motion ranges of the cameras are often given only approximately by the manufacturer, and the zoom function is usually not disclosed. Therefore, in order to manage multiple PTZ cameras, solutions are needed that enable the calibration of multiple PTZ cameras so that they can be controlled using absolute parameters.
Document CN104361599A describes a method for determining horizontal, vertical (longitudinal) and zoom correction factors for the calibration of PTZ cameras. For this purpose, the displacement of a target (or size change in case of zooming) is determined during the implementation of a certain change in steps.
Document CN109769116B describes a method for correcting the initial locations of PTZ cameras. For this purpose, common elements are matched in the images captured at the starting location and at certain angles, and the transformation matrix that associates the image at an angle with the initial image is derived. By using images obtained from more than one angle, a curve showing the relationship between angle and displacement is obtained. In this way, the actual angle of the camera moved at a certain angle can be known.
Document CN108476282A describes a method for determining whether received images are of the desired quality. The relationship between the horizontal camera angle and the focal length is also shown in this document.
Brief Summary and Objectives of the Invention
The objective of the present invention is to develop a method for automatic calibration and management of multiple PTZ (pan-tilt-zoom) cameras.
Another objective of the present invention is to develop a method that enables cameras to be managed using absolute orientation and absolute zoom level control parameters defined according to an initial orientation and initial zoom level. These absolute control parameters are defined according to a common coordinate system, and this paves the way for the routing commands for cameras having different viewing angles, operating ranges and control parameters to be produced in the same format for all cameras, regardless of these differences.
For the purposes of the invention, a method was developed in which the calibration information is stored in paired form with the relevant cameras after the calibration of each camera, and the cameras can be routed independently of their differences from each other by issuing routing commands in a single, common format for different cameras and converting these routing commands using the relevant calibration information.
Detailed Description of the Invention
The camera management method of the invention for managing multiple PTZ cameras using absolute orientation and absolute zoom level control parameters defined according to an initial orientation and initial zoom level, basically comprises the steps of calibrating multiple cameras, recording the calibration information associated with the relevant cameras, issuing routing commands defined in the form of a targeted orientation and zoom level according to a common coordinate system for a subset of said multiple cameras from a command client, converting the commands according to the calibration information for each camera separately, and operating the cameras in accordance with the converted commands. Said coordinate system comprises two angular space coordinates corresponding to the environment around the camera and a third coordinate corresponding to the zoom level.
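For illustration only, a routing command in this common coordinate system could be represented as in the minimal Python sketch below; the class name and field names are assumptions introduced here, not terms used in the specification.

```python
from dataclasses import dataclass

@dataclass
class AbsoluteCommand:
    """Hypothetical routing command in the common coordinate system."""
    pan_deg: float     # first angular coordinate, measured from the initial orientation
    tilt_deg: float    # second angular coordinate, measured from the horizontal center axis
    zoom_level: float  # third coordinate: magnification relative to the initial (widest) zoom
```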
The step of calibrating the cameras comprises the following sub-steps: determining the horizontal viewing angle of each camera, determining the vertical range of motion of each camera and the relationship between its vertical movements and the vertical control parameter corresponding to its vertical movements, determining the relationship between the zoom level of each camera and the zoom control parameter corresponding to the zoom level.
Horizontal viewing angles, vertical motion ranges, relations between vertical motion and vertical control parameter, and relations between zoom level and zoom control parameter are recorded in paired form with the relevant cameras.
Orientation commands issued by the command client are defined according to an initial orientation and initial zoom level. The initial orientation and initial zoom level correspond to the orientation and zoom level at which the camera center axis is horizontal and the widest viewing angle is obtained. The horizontal component of the initial orientation, on the other hand, can be the center of the horizontal range of motion of the camera if this range is less than 360°, otherwise it can be any location determined by an operator according to the default or usage area.
Herein, horizontal movement refers to movement of the camera within a horizontal plane, and vertical movement refers to movement of the camera within a vertical plane.
Communication between the cameras and the command client can be achieved via a connection infrastructure such as the Internet, or via connection devices that are connected to a closed network, compatible with different data transfer protocols. The invention enables cameras to be managed at the same location or with a remote command client. The cameras are operated in accordance with the commands provided by the command client and the resulting images can be transmitted to one or more locations comprising the command client. The command client and cameras can also be connected to the relevant network directly or through at least one server. For example, a set of geographically grouped cameras for monitoring a particular facility may be connected to a server at that facility. There may be other servers running on the relevant network for purposes such as controlling the generated commands, arranging connection requests securely, and backing up images. Various measures are also implemented to ensure secure communication between components connected to open networks.
Calibration information can be stored separately for each camera on a driver unit associated with the relevant camera, and the step of converting commands can be performed on the said driver unit. Alternatively, the calibration information can be stored on a server, in paired form with each camera, and the step of converting commands can be performed on the said server. In this case, the converted routing commands are re-issued by the respective server.
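As a rough sketch of this conversion step, the fragment below applies a stored per-camera calibration record to a command given in the common coordinate system. All names, the structure of the record and the lookup strategy are assumptions made for illustration; the record fields simply mirror the calibration quantities listed above.

```python
from bisect import bisect_left
from dataclasses import dataclass

@dataclass
class Calibration:
    """Hypothetical per-camera calibration record."""
    horizontal_viewing_angle: float        # degrees at the initial zoom level
    vertical_range_of_motion: float        # degrees (DHA)
    vertical_param_per_degree: float       # vertical control parameter units per degree of tilt
    zoom_curve: list[tuple[float, float]]  # (size variation, zoom control parameter) pairs

def convert_command(pan_deg: float, tilt_deg: float, zoom_level: float,
                    cal: Calibration) -> dict:
    """Translate an absolute routing command into camera-specific control values."""
    # Vertical: map the absolute tilt angle onto this camera's own parameter scale.
    tilt_param = tilt_deg * cal.vertical_param_per_degree

    # Zoom: pick the control parameter whose measured size variation is closest
    # to the requested zoom level (curve assumed sorted by size variation).
    sizes = [s for s, _ in cal.zoom_curve]
    i = min(bisect_left(sizes, zoom_level), len(sizes) - 1)
    zoom_param = cal.zoom_curve[i][1]

    # Horizontal: the pan angle is already expressed as an angle in the common
    # system; a driver unit would clamp it to the camera's horizontal range.
    return {"pan": pan_deg, "tilt": tilt_param, "zoom": zoom_param}
```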
In order to determine the horizontal viewing angles during the calibration of the cameras, the following steps are carried out for each camera: setting the camera to its initial orientation and initial zoom level, taking a first snapshot, rotating the camera horizontally by a certain horizontal movement angle, taking a second snapshot, measuring the horizontal displacement in terms of the number of pixels corresponding to the horizontal movement angle by detecting one or more fixed objects in both the first snapshot and the second snapshot, and determining the focal length in terms of the number of pixels using the horizontal movement angle and the horizontal displacement. The horizontal movement angle should be chosen so that it is smaller than the smallest expected viewing angle of the camera; this ensures that the first snapshot and the second snapshot overlap.
The focal length is determined using the equation below:
F = YYD / tan(YHA)
Here, F is the focal length, YYD is the horizontal displacement, and YHA is the horizontal movement angle.
After determining the focal length, the horizontal viewing angle is also determined by using the horizontal pixel number and focal length of the camera. For this purpose, the following equation is used:
YGA = 2 · arctan(FFS / (2F))
Here, YGA is the horizontal viewing angle and FFS is the horizontal pixel count.
The average horizontal displacement can also be measured using more than one horizontal movement angle, especially when the camera's rotation sensitivity is low. In this case, the average horizontal displacement is used instead of the horizontal displacement. Said multiple horizontal movement angles are preferably a first horizontal movement angle and a second horizontal movement angle in the opposite direction to the first horizontal movement angle with respect to the initial orientation.
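The two relations above, together with the averaging of opposite movements described in the preceding paragraph, can be sketched numerically as follows; the function names and sample values are illustrative only, not taken from the patent.

```python
import math

def focal_length_px(yyd_px: float, yha_deg: float) -> float:
    """F = YYD / tan(YHA): focal length in pixels from the horizontal pixel
    displacement YYD observed after panning the camera by YHA degrees."""
    return yyd_px / math.tan(math.radians(yha_deg))

def horizontal_viewing_angle_deg(ffs_px: int, f_px: float) -> float:
    """YGA = 2 * arctan(FFS / (2F)): horizontal viewing angle from the
    horizontal pixel count FFS and the focal length F (both in pixels)."""
    return math.degrees(2 * math.atan(ffs_px / (2 * f_px)))

# Two opposite 5-degree pan movements about the initial orientation, as suggested
# for cameras with low rotation sensitivity (displacement values are made up):
displacements = [168.0, 172.0]                # pixels, one per movement
avg_yyd = sum(displacements) / len(displacements)
f = focal_length_px(avg_yyd, 5.0)             # about 1943 px
yga = horizontal_viewing_angle_deg(1920, f)   # about 53 degrees for a 1920-px-wide image
```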
During the calibration of the cameras, the following steps are carried out for each camera whose focal lengths are determined, in order to determine the vertical range of motion and the relationship between their vertical movements and the vertical control parameters corresponding to their vertical movements: setting the camera to its initial orientation and initial zoom level, rotating the camera vertically by a certain vertical control parameter value in terms of angle, assigning a default vertical range of motion in terms of angle corresponding to the vertical control parameter value, taking a third snapshot, measuring the vertical displacement in terms of the number of pixels corresponding to the vertical control parameter value by detecting one or more fixed objects in both the first snapshot and the third snapshot, calculating an actual vertical movement angle using focal length and vertical displacement, determining the vertical range of motion and the relationship between the vertical movements and the corresponding vertical control parameter using the vertical control parameter value, the assumed vertical range of motion, and the actual vertical movement angle.
The default vertical range of motion is preferably assigned as the widest vertical range of motion of the camera that can be used without image inversion, i.e. 90°.
The actual vertical movement angle is calculated using the following equation:
GDHA = arctan(DYD / F)
Here, DYD is the vertical displacement and GDHA is the actual vertical movement angle.
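A short numeric sketch of this calculation is given below; it also applies the range relation presented in the next equation, and all names and sample values are illustrative assumptions.

```python
import math

def actual_vertical_angle_deg(dyd_px: float, f_px: float) -> float:
    """GDHA = arctan(DYD / F): actual vertical movement angle from the vertical
    pixel displacement DYD and the focal length F (both in pixels)."""
    return math.degrees(math.atan(dyd_px / f_px))

def vertical_range_deg(vdha_deg: float, gdha_deg: float, dkpd_deg: float) -> float:
    """DHA = VDHA * (GDHA / DKPD): the default range VDHA rescaled by the ratio
    of the angle actually moved to the angle that was commanded."""
    return vdha_deg * (gdha_deg / dkpd_deg)

# Illustrative values: a 10-degree tilt command (DKPD) issued under an assumed
# 90-degree range (VDHA) shifted the image by 290 px with F = 1943 px.
gdha = actual_vertical_angle_deg(290.0, 1943.0)  # about 8.5 degrees actually moved
dha = vertical_range_deg(90.0, gdha, 10.0)       # about 76 degrees of actual range
```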
The vertical range of motion is also determined using the equation below:
DHA = VDHA · (GDHA / DKPD)
Here, DHA is the vertical range of motion, VDHA is the default vertical range of motion, and DKPD is the vertical control parameter value. The ratio of the vertical range of motion to the assumed vertical range of motion, or the ratio of the actual vertical movement angle to the vertical control parameter value, also reveals the relationship between the vertical movements of the camera and the vertical control parameter corresponding to the vertical movements.
During the calibration of the cameras, the following steps are carried out for each camera in order to determine the relationship between the zoom level and the zoom control parameter corresponding to the zoom level: setting the camera to its initial zoom level, dividing the zoom control parameter range into a certain number of steps, sequentially taking snapshots for the zoom level corresponding to each step, starting from the initial zoom level, detecting one or more fixed objects in the snapshots of the zoom levels corresponding to the consecutive steps, determining the variation of the object sizes for each snapshot according to the sizes of the same object in the snapshot of the initial zoom level, and determining the relationship between the zoom level and the zoom control parameter corresponding to the zoom level using the object size variation.
Object size variation can be determined by calculating homography between snapshots of zoom levels corresponding to consecutive steps. The relationship between the zoom level and the zoom control parameter corresponding to the zoom level can be expressed in the form of a curve constructed with data pairs in the form of image size versus zoom level. Preferably, the gaps between the data pairs are completed by interpolation and the curve is formed.
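One possible implementation of this zoom calibration is sketched below, using OpenCV feature matching to estimate a homography-based size variation between consecutive snapshots and NumPy interpolation to complete the curve. It is an illustration under these assumptions, with made-up function names, not the patent's reference implementation.

```python
import cv2
import numpy as np

def relative_scale(ref_img, img) -> float:
    """Estimate how much larger objects appear in img than in ref_img: match ORB
    features, fit a homography, and read the scale off its upper-left 2x2 block
    (a pure zoom between two snapshots is approximately a uniform scaling)."""
    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(ref_img, None)
    kp2, des2 = orb.detectAndCompute(img, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return float(np.sqrt(abs(np.linalg.det(H[:2, :2]))))

def zoom_curve(snapshots, zoom_params):
    """Chain the scale between consecutive snapshots to get the size variation of
    each zoom step relative to the snapshot at the initial zoom level."""
    sizes, total = [1.0], 1.0
    for prev, cur in zip(snapshots, snapshots[1:]):
        total *= relative_scale(prev, cur)
        sizes.append(total)
    return list(zip(sizes, zoom_params))  # (size variation, zoom control parameter) pairs

def size_variation_at(curve, zoom_param: float) -> float:
    """Interpolate the size variation expected at an intermediate parameter value."""
    sizes, params = zip(*curve)
    return float(np.interp(zoom_param, params, sizes))
```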
By storing the calibration information separately for each camera and converting the routing commands, the invention paves the way for the management of multiple cameras with different viewing angles, operating ranges and control parameters using commands created according to a common coordinate system.

Claims

CLAIMS
1. A camera management method for managing multiple PTZ cameras using the absolute orientation and absolute zoom level control parameters defined according to an initial orientation and initial zoom level where the camera center axis is horizontal and the widest viewing angle is obtained, characterized by the following steps:
- calibrating multiple cameras,
- recording the calibration information associated with the relevant cameras,
- issuing orientation and zoom commands defined in the form of a targeted orientation and zoom level according to a common coordinate system for a subset of said multiple cameras from a command client,
- converting the commands according to calibration information for each camera separately,
- operating the cameras according to converted commands.
2. A camera management method according to Claim 1, characterized in that it comprises the following steps applied for the calibration of the cameras:
- determining the horizontal viewing angle of each camera,
- determining the vertical range of motion of each camera and the relationship between its vertical movements and the vertical control parameter corresponding to its vertical movements,
- determining the relationship between the zoom level of each camera and the zoom control parameter corresponding to the zoom level.
3. A camera management method according to Claim 1, characterized by storing the calibration information separately for each camera on a driver unit associated with the relevant camera and converting the commands on the control units.
4. A camera management method according to Claim 1, characterized by storing the calibration information on a server, in paired form with each camera, converting the commands on the server, and publishing the converted routing commands in paired form with the cameras.
5. A camera management method according to Claim 2, characterized in that it comprises the following steps for each camera:
- setting the camera to its initial orientation and initial zoom level,
- taking a first snapshot,
- rotating the camera horizontally by a certain horizontal movement angle,
- taking a second snapshot,
- measuring the horizontal displacement in terms of the number of pixels corresponding to the horizontal movement angle by detecting one or more fixed objects in both the first snapshot and the second snapshot,
- determining the focal length in terms of the number of pixels using the horizontal movement angle and horizontal displacement.
6. A camera management method according to Claim 5, characterized by determining the horizontal viewing angle by using the horizontal pixel number and focal length of the camera.
7. A camera management method according to Claim 5, characterized by measuring the average horizontal displacement using more than one horizontal movement angle.
8. A camera management method according to Claim 7, characterized by using a first horizontal movement angle and a second horizontal movement angle in the opposite direction to the first horizontal movement angle with respect to the initial orientation.
9. A camera management method according to Claim 5, characterized by the following steps:
- setting the camera to its initial orientation and initial zoom level,
- rotating the camera vertically by a certain vertical control parameter value in terms of angle,
- assigning a default vertical range of motion in terms of angle corresponding to the vertical control parameter value,
- taking a third snapshot,
- measuring the vertical displacement in pixels corresponding to the vertical control parameter value by detecting one or more fixed objects in both the first snapshot and the third snapshot,
- calculating an actual vertical movement angle using focal length and vertical displacement,
- determining the vertical range of motion and the relationship between the vertical movements and the vertical control parameter corresponding to the vertical movements using the vertical control parameter value, the assumed vertical range of motion and the actual vertical movement angle.
10. A camera management method according to Claim 2, characterized by the following steps for each camera:
- setting the camera to its initial zoom level,
- dividing the zoom control parameter range into a certain number of steps,
- sequentially taking snapshots for the zoom level corresponding to each step, starting from the initial zoom level,
- calculating the homography between the snapshots of the zoom levels corresponding to the consecutive steps, detecting one or more fixed objects in all the images, and determining the variation of the object sizes for each snapshot according to the sizes of the same object in the snapshot of the initial zoom level,
- determining the relationship between the zoom level and the zoom control parameter corresponding to the zoom level using object size variation.
PCT/TR2022/051487 2021-12-16 2022-12-13 Calibration and management method of ptz cameras WO2023113754A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TR2021020173 2021-12-16
TR2021/020173 TR2021020173A2 (en) 2021-12-16 CALIBRATION AND MANAGEMENT METHOD OF PTZ CAMERAS

Publications (1)

Publication Number Publication Date
WO2023113754A1 (en) 2023-06-22

Family

ID=86773191

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/TR2022/051487 WO2023113754A1 (en) 2021-12-16 2022-12-13 Calibration and management method of ptz cameras

Country Status (1)

Country Link
WO (1) WO2023113754A1 (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060215031A1 (en) * 2005-03-14 2006-09-28 Ge Security, Inc. Method and system for camera autocalibration
US20100141767A1 (en) * 2008-12-10 2010-06-10 Honeywell International Inc. Semi-Automatic Relative Calibration Method for Master Slave Camera Control
CN103400373A (en) * 2013-07-13 2013-11-20 西安科技大学 Method for automatically identifying and positioning coordinates of image point of artificial mark in camera calibration control field

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117593414A (en) * 2024-01-17 2024-02-23 亿海蓝(北京)数据技术股份公司 Method and system for drawing shooting area of camera, electronic equipment and storage medium


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22908132

Country of ref document: EP

Kind code of ref document: A1