CN111103899A - Pan-tilt positioning method and device - Google Patents

Pan-tilt positioning method and device

Info

Publication number
CN111103899A
CN111103899A (application CN201811247814.XA)
Authority
CN
China
Prior art keywords
information
target object
radar
target
pan
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811247814.XA
Other languages
Chinese (zh)
Inventor
凌振萍
杨建�
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikmicro Sensing Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN201811247814.XA
Publication of CN111103899A
Legal status: Pending

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D3/00Control of position or direction
    • G05D3/12Control of position or direction using feedback

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The application discloses a pan-tilt positioning method and device, comprising: receiving target information transmitted by a radar; converting the target information into omni-directional movement zoom information; and controlling the motion of the pan-tilt camera according to the omni-directional movement zoom information, so that the pan-tilt positions the target object. With the disclosed technical scheme, no complex and expensive CCTV system needs to be deployed with the radar and the pan-tilt: the pan-tilt itself converts the target information and achieves accurate positioning, which reduces the cost of security monitoring.

Description

Pan-tilt positioning method and device
Technical Field
The present application relates to the field of security monitoring technologies, and in particular, to a method and an apparatus for positioning a pan/tilt head.
Background
Existing security monitoring systems are often built around radar, which has inherent advantages in geographic coverage, real-time performance, and accuracy. To let staff see a clear picture of a target object, a pan-tilt camera is introduced into the security monitoring system and linked with the radar. In the prior art, linking the pan-tilt and the radar generally requires deploying a CCTV system. The CCTV system includes a GIS (geographic information system) and can also control the pan-tilt and acquire the pan-tilt video. Once deployed, the CCTV system's computer calculates, from the geographic positions of the pan-tilt camera, the radar, and the target object, the azimuth and angle the pan-tilt camera should adjust to, and then drives the pan-tilt camera accordingly. The pan-tilt camera can thus be aimed at the target object, and its picture adjusted so that staff can clearly observe the target object.
As described above, in the prior art a CCTV system must be deployed with the radar and the pan-tilt to link them, but CCTV systems are expensive and complex to implement, which hinders security monitoring work.
Disclosure of Invention
The embodiment of the application provides a pan-tilt positioning method, which links a radar and a pan-tilt without deploying CCTV, reduces the cost of security monitoring, and simplifies operation. The method specifically comprises the following steps:
a method of pan-tilt positioning, the method comprising:
receiving target information transmitted by a radar, wherein the target information comprises geographic position information of a target object detected by the radar;
converting the target information into omni-directional movement zoom information, wherein the omni-directional movement zoom information is positioning information used for controlling the motion of a pan-tilt;
and controlling the motion of the pan-tilt according to the omni-directional movement zoom information, so that the pan-tilt positions the target object.
Further, the target information further includes attribute information of the target object and attribute information of the radar, and converting the target information into the omni-directional movement zoom information includes:
and converting the geographic position information of the target object, the attribute information of the target object and the attribute information of the radar into the omnidirectional movement zooming information.
Further, the geographic position information includes longitude and latitude of a geographic position where the target object is located, and a distance between the radar and the target object, the attribute information of the radar includes a pitch angle when the radar is directed at the target object, the attribute information of the target object includes a physical size of the target object detected by the radar, and the converting into the omni-directional movement zoom information includes:
calculating the omnibearing movement information of the holder according to the longitude and latitude, the pitch angle and the target distance;
and calculating zoom information of the pan-tilt according to the physical size of the target object and the pan-tilt frame ratio, wherein the omni-directional movement zoom information comprises the omni-directional movement information and the zoom information.
Further, before converting the target information into the omni-directional movement zoom information, the method further includes:
acquiring the relative position of the radar and the pan-tilt in three-dimensional space;
and determining a three-dimensional coordinate system according to that relative position, wherein the coordinate system provides the reference coordinates for the conversion of the target information into the omni-directional movement zoom information.
Further, calculating the omni-directional movement information according to the longitude and latitude, the pitch angle, and the target distance comprises:
determining the three-dimensional coordinates of the target object in a three-dimensional coordinate system according to the longitude and latitude, the pitch angle, and the target distance, wherein the coordinate system is determined by the relative position of the radar and the pan-tilt in three-dimensional space;
and calculating the pan-tilt's omni-directional movement information for the target object according to the three-dimensional coordinates of the target object and the three-dimensional coordinates of the pan-tilt.
Further, calculating the zoom information of the pan-tilt according to the physical size of the target object and the pan-tilt picture ratio comprises:
calculating the horizontal field-of-view half angle of the pan-tilt according to the physical size of the target object and the pan-tilt picture ratio;
and calculating the zoom information of the pan-tilt according to the half angle and the correspondence between the pan-tilt's magnification and field angle.
The embodiment of the application also provides a pan-tilt positioning device, which links the radar and the pan-tilt without deploying CCTV, reduces the cost of security monitoring, and simplifies operation. The device includes:
a data receiving module, configured to receive target information transmitted by the radar, the target information comprising geographic position information of a target object detected by the radar;
a data conversion module, configured to convert the target information into omni-directional movement zoom information, the omni-directional movement zoom information being positioning information used for controlling the motion of the pan-tilt;
and a positioning module, which controls the motion of the pan-tilt according to the omni-directional movement zoom information, so that the pan-tilt positions the target object.
Further, the target information further includes attribute information of the target object and attribute information of the radar; the geographic position information comprises the longitude and latitude of the target object's location and the distance between the radar and the target object; the attribute information of the radar comprises the pitch angle when the radar is aimed at the target object; and the attribute information of the target object comprises the physical size of the target object detected by the radar. The data conversion module includes:
an omni-directional movement information conversion module, configured to calculate the omni-directional movement information of the pan-tilt according to the longitude and latitude, the pitch angle, and the target distance;
and a zoom information conversion module, configured to calculate the zoom information of the pan-tilt according to the physical size of the target object and the pan-tilt picture ratio.
Further, the apparatus further comprises:
a calibration module, configured to acquire the relative position of the radar and the pan-tilt in three-dimensional space, and to determine from it a three-dimensional coordinate system that provides the reference coordinates for the conversion of the target information into the omni-directional movement zoom information.
Further, the omni-directional movement information conversion module includes:
a target object coordinate determination module, configured to determine the three-dimensional coordinates of the target object in a three-dimensional coordinate system according to the longitude and latitude, the pitch angle, and the target distance, the coordinate system being determined by the relative position of the radar and the pan-tilt in three-dimensional space;
and an omni-directional movement information calculation module, configured to calculate the pan-tilt's omni-directional movement information for the target object according to the three-dimensional coordinates of the target object and the three-dimensional coordinates of the pan-tilt.
Further, the zoom information conversion module includes:
a horizontal field-of-view calculation module, configured to calculate the horizontal field-of-view half angle of the pan-tilt according to the size of the target object and the pan-tilt picture ratio;
and a zoom information calculation module, configured to calculate the zoom information of the pan-tilt according to the half angle and the correspondence between the pan-tilt's magnification and field angle.
Embodiments of the present application further provide a computer-readable storage medium storing instructions, which, when executed by a processor, cause the processor to perform the steps in the pan-tilt positioning method as described above.
An embodiment of the present application further provides a pan/tilt head, which includes a camera, the computer-readable storage medium as described above, and a processor capable of executing instructions in the computer-readable storage medium.
According to the above technical scheme, the pan-tilt converts the target information directly into omni-directional movement zoom information and achieves accurate positioning, with no need to deploy a CCTV system alongside the radar and the pan-tilt, which greatly reduces the cost of security monitoring.
Drawings
Fig. 1 is a schematic diagram of a scene in which a pan-tilt works in linkage with a radar according to an embodiment of the present application.
Fig. 2 is a flowchart of a first embodiment of the method of the present application.
Fig. 3 is a flowchart of a second embodiment of the method of the present application.
Fig. 4 is a coordinate diagram of the relative positions of the radar and the pan-tilt in the second method embodiment of the present application.
Fig. 5 is a coordinate diagram of the relative positions of the radar, the pan-tilt, and the target object in the second method embodiment of the present application.
Fig. 6 is a schematic view of the horizontal field angle when the pan-tilt is aimed at the target object in the second method embodiment of the present application.
Fig. 7 is a schematic structural diagram of a first embodiment of the apparatus of the present application.
Fig. 8 is a schematic diagram of an internal structure of the data conversion module 32 according to an embodiment of the present application.
Fig. 9 is a schematic structural diagram of a second embodiment of the apparatus of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the application is described in further detail below with reference to the accompanying drawings and embodiments.
Fig. 1 is a schematic diagram of a scene in which a pan-tilt works in linkage with a radar in an embodiment of the present application. As shown in Fig. 1, the system includes a radar 11, a pan-tilt 12, and a target object 13. The radar 11 is an electronic device that detects the target object 13 using electromagnetic waves: it emits an electromagnetic wave toward the target object 13 and, on receiving the returned echo, extracts target information from it.
To reduce cost and simplify the implementation, the embodiment of the application provides a method by which the pan-tilt positions the target object. With this method, no CCTV system needs to be deployed with the radar and the pan-tilt; the pan-tilt achieves positioning directly from the target information provided by the radar.
Method embodiment one
Fig. 2 is a flowchart of the method according to the first embodiment of the present application. Based on the working scenario of Fig. 1, the method of the first embodiment is executed by the internal processor of the pan-tilt 12. As shown in Fig. 2, the method includes:
step 201: and receiving target information transmitted by the radar, wherein the target information comprises the geographical position information of the target object detected by the radar.
Step 202: and converting the target information into omni-directional movement zooming information, wherein the omni-directional movement zooming information is positioning information used for controlling the motion of the pan-tilt.
The Pan/Tilt Zoom information described herein is information required when controlling the movement of the Pan/Tilt head, and is also generally referred to as PTZ (PTZ, Pan/Tilt/Zoom) information, where P denotes a horizontal angle to be adjusted by the Pan/Tilt head, T denotes a vertical angle to be adjusted by the Pan/Tilt head, and Z denotes Zoom information to be adjusted by the Pan/Tilt head.
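For illustration only (not part of the patent), the PTZ information can be modeled as a simple record; the class and field names here are assumptions:

    from dataclasses import dataclass

    @dataclass
    class PTZ:
        """Container for omni-directional movement zoom (PTZ) information."""
        pan: float   # P: horizontal angle the pan-tilt should turn to (degrees)
        tilt: float  # T: vertical angle the pan-tilt should turn to (degrees)
        zoom: float  # Z: zoom magnification the pan-tilt should apply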
Step 203: controlling the motion of the pan-tilt camera according to the omni-directional movement zoom information, so that the pan-tilt positions the target object.
That is, in the first embodiment no CCTV system is deployed with the radar and the pan-tilt: from the target information provided by the radar, the pan-tilt itself produces omni-directional movement zoom information it can use directly, and controls its own motion with that information, thereby positioning the target object.
Method embodiment two
In practice, the target information obtained when the radar detects a target object includes not only the geographic position information of the target object but also attribute information of the target object and of the radar. The geographic position information comprises the longitude and latitude of the target object's location and the distance between the radar and the target object; the radar's attribute information comprises the pitch angle when the radar is aimed at the target object; and the target object's attribute information comprises its physical size as detected by the radar.
The second embodiment describes in detail how this target information is converted into omni-directional movement zoom information. The conversion is done in two parts: the omni-directional movement information (called PT information in practice), calculated mainly from the longitude and latitude, the pitch angle, and the target distance; and the zoom information (called Z information in practice), calculated mainly from the physical size of the target object and the pan-tilt picture ratio. The two are then combined into the omni-directional movement zoom information, i.e., the PTZ information.
Fig. 3 is a flowchart of a method according to a second embodiment of the present application. As shown in fig. 3, the method includes:
step 301: and acquiring the relative position relation of the radar and the holder in the three-dimensional space.
Step 302: and determining a three-dimensional space coordinate system according to the relative position relation.
Step 301 and step 302 are actually the calibration process of the radar and pan/tilt three-dimensional coordinate system, and the three-dimensional coordinate system is used for the conversion of the subsequent target information to the omnidirectional movement zoom information to provide the reference coordinate.
In practice, the relative position of the radar and the pan-tilt in three-dimensional space can be determined as shown in Fig. 4, a coordinate diagram of their relative positions. Assume the radar and the pan-tilt are mounted at different geographic locations, as is common in practice. The radar is installed at a high point with a wide view, denoted point B; the pan-tilt is denoted point A. The pan-tilt is first aimed at the radar, with the pan-tilt taken as the coordinate origin, due north (the Y axis) as the 0° direction, and clockwise as the direction of increasing angle. The three-dimensional coordinates of the radar relative to the pan-tilt can then be determined from the pan-tilt's omni-directional movement zoom information. In practice, the calibration need not be based at the radar or the pan-tilt; any third reference point works, as long as a unified coordinate system is used.
Fig. 5 shows the relative positions of the radar, the pan-tilt, and the target object. As shown in Fig. 5, besides the radar and the pan-tilt there is now a target object, denoted point D. Suppose the three-dimensional coordinate system of Fig. 5 is established with point C as the origin, north as the Y axis, CB as the Z axis, and CX as the X axis. Point C lies on the same horizontal plane as point A; the radar B has coordinates (0, 0, |CB|) and the pan-tilt A has coordinates (|CJ|, |AJ|, 0). The distances mapped by point D onto the axes of this C-origin system are |GD|, |HG|, and |CH| respectively. Throughout this application, |CB|, |CJ|, |AJ|, |GD|, |HG|, |CH|, and the like denote absolute distances (scalars); the same convention applies below.
In the coordinate system of Fig. 5, let P be the horizontal (pan) angle and T the vertical (tilt) angle read from the pan-tilt when it is aimed at the radar. Then:
∠CAJ = 360° − P;
|CA| = |BC| / tan(T);
|CJ| = |CA| · sin∠CAJ;
|AJ| = |CA| · cos∠CAJ;
the information P and T can be determined when the pan-tilt is calibrated, and | BC | can be obtained by means of GPS, for example. The three-dimensional coordinates (CJ, AJ, 0) of the tripod head relative to the radar can be determined through the calculation of the relation.
In this way, the relative position of the radar and the pan-tilt in three-dimensional space, and hence the three-dimensional coordinate system, can be determined in practice.
Step 303: receiving target information transmitted by the radar, comprising the longitude and latitude of the detected target object's position, the distance between the radar and the target object, the pitch angle when the radar is aimed at the target object, and the physical size of the target object.
Step 304: determining the three-dimensional coordinates of the target object in the three-dimensional coordinate system according to the longitude and latitude, the pitch angle, and the target distance.
The three-dimensional coordinate system is the one determined from the relative position of the radar and the pan-tilt in steps 301 and 302.
For example, in Fig. 5, ∠CBD is the pitch angle of the radar when aimed at the target object, ∠KHD is its horizontal angle when aimed at the target object, and |BD| is the target distance. Then:
|CH| = |BD| · cos∠CBD − |CB|;
|GD| = |HD| · sin∠GHD = |BD| · sin∠CBD · sin(180° − ∠KHD);
|GH| = |HD| · cos∠GHD = |BD| · sin∠CBD · cos(180° − ∠KHD);
From these relations, the three-dimensional coordinates of the target object are determined as (|GD|, |GH|, |CH|).
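A corresponding sketch of step 304 under the same assumptions (angles in degrees, distances in meters; names illustrative):

    import math

    def target_coords_from_radar(target_dist_m, pitch_deg, horiz_deg, radar_height_m):
        """Return the target D's coordinates (|GD|, |GH|, |CH|) in the C-origin
        system from the radar measurements |BD|, ∠CBD, ∠KHD and the radar
        height |CB|, following the Fig. 5 relations above.
        """
        pitch = math.radians(pitch_deg)          # ∠CBD
        horiz = math.radians(180.0 - horiz_deg)  # ∠GHD = 180° − ∠KHD
        ch = target_dist_m * math.cos(pitch) - radar_height_m   # |CH|
        gd = target_dist_m * math.sin(pitch) * math.sin(horiz)  # |GD|
        gh = target_dist_m * math.sin(pitch) * math.cos(horiz)  # |GH|
        return (gd, gh, ch)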
Step 305: calculating the omni-directional movement information for the target object according to the three-dimensional coordinates of the target object and the three-dimensional coordinates of the pan-tilt.
Steps 304 and 305 together yield the omni-directional movement information; that is, they implement the calculation of the pan-tilt's omni-directional movement information from the longitude and latitude, the pitch angle, and the target distance. Step 304 gives the three-dimensional coordinates of the target object, and the pan-tilt's own three-dimensional coordinates are known from its calibrated position relative to the radar, so in step 305 the two known coordinates determine the vector from the pan-tilt to the target object. From this vector, the horizontal and vertical angles from the pan-tilt to the target object in the three-dimensional coordinate system, i.e., the pan-tilt's omni-directional movement information, are easily computed.
Taking Fig. 5 as an example: after step 304 the pan-tilt knows both its own three-dimensional coordinates and those of the target object, so the vector from A to D can be determined. From the vector AD, the horizontal angle P and the vertical angle T from the pan-tilt to the target object are obtained, which is exactly the omni-directional movement information for aiming the pan-tilt at the target object.
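A sketch of this step. Mapping the vector to pan/tilt angles assumes the same convention as the calibration above (0° pan at north, increasing clockwise; tilt as elevation), which the patent implies but does not spell out:

    import math

    def pan_tilt_angles(pan_tilt_xyz, target_xyz):
        """Return (P, T) in degrees for aiming the pan-tilt at the target,
        computed from the vector A→D in the shared C-origin coordinate system.
        """
        dx = target_xyz[0] - pan_tilt_xyz[0]
        dy = target_xyz[1] - pan_tilt_xyz[1]
        dz = target_xyz[2] - pan_tilt_xyz[2]
        pan = math.degrees(math.atan2(dx, dy)) % 360.0           # clockwise from north
        tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # elevation angle
        return pan, tilt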
Step 306: calculating the horizontal field-of-view half angle of the pan-tilt according to the physical size of the target object and the pan-tilt picture ratio, and calculating the zoom information of the pan-tilt according to the half angle and the correspondence between the pan-tilt's magnification and field angle.
Here, the physical size of the target object is detected by the radar, and the pan-tilt picture ratio is the proportion of the whole display picture that the target object should occupy, which can be specified by the user of this scheme. The horizontal field-of-view half angle is half of the pan-tilt's horizontal field of view during shooting, the field of view being the divergence angle of the pan-tilt lens. The visible-light part of the pan-tilt camera consists of a matched sensor core and lens, so the field angle corresponding to each magnification can be calculated from the sensor and optical lens parameters. By querying the camera's table of magnification versus field angle, the required magnification, and hence the zoom information, can be determined.
Take Fig. 6, a schematic view of the horizontal field angle when the pan-tilt is aimed at the target object. As shown in Fig. 6, let L be the physical size of the target object, Θ the pan-tilt's horizontal field-of-view half angle (i.e., half the divergence angle), F half the field width, and |AD| the distance between the pan-tilt and the target object. In practice, the target object can be placed at the center of the picture, occupying ratio R of the picture. Then:
F = L / (2R);
tan(Θ) = F / |AD| = L / (2R · |AD|);
according to the relation, the horizontal viewing field half angle theta of the holder can be determined, and then the zooming information of the holder is calculated according to the horizontal viewing field half angle theta and the corresponding relation between the magnification and the field angle of the holder.
At this point, the omni-directional movement information has been calculated in steps 304 and 305 and the zoom information in step 306, which together form the complete omni-directional movement zoom information, i.e., the PTZ information. In other words, once the pan-tilt acquires the target information from the radar, it can convert it into omni-directional movement zoom information through the above steps. Using the calculated information, the pan-tilt drives its internal moving parts to rotate the camera onto the target object and zooms until the target occupies the specified ratio R of the picture, thereby realizing the positioning and allowing staff to observe the target object clearly.
Step 307: controlling the motion of the pan-tilt according to the omni-directional movement zoom information, so that the pan-tilt positions the target object.
With the scheme of the second embodiment, the omni-directional movement information is calculated from the longitude and latitude, the pitch angle, and the target distance, and the zoom information from the physical size of the target object and the pan-tilt picture ratio; the longitude and latitude, pitch angle, target distance, and physical size are detected by the radar, while the picture ratio is specified by the user as needed. Thus, even without a CCTV system deployed with the radar and the pan-tilt, the pan-tilt can still calculate the omni-directional movement zoom information and position the target object, which simplifies the scheme and reduces the cost of security monitoring.
Apparatus embodiment one
The application also provides apparatus embodiments for realizing the pan-tilt positioning. Fig. 7 is a schematic structural diagram of the first apparatus embodiment of the present application. The apparatus is the logical component of the pan-tilt that implements the method above; in practice the pan-tilt may contain other components as well, but since these are not involved in the embodiments of the application they are not described below.
As shown in Fig. 7, the apparatus includes a data receiving module 31, a data conversion module 32, and a positioning module 33. The data receiving module 31 receives target information sent by the radar, the target information including the geographic position information of the target object detected by the radar; the data conversion module 32 converts the target information into omni-directional movement zoom information, i.e., the positioning information used for controlling the motion of the pan-tilt; and the positioning module 33 controls the motion of the pan-tilt according to that information, so that the pan-tilt positions the target object.
In practice, the target information includes not only the geographic position information of the target object detected by the radar but also attribute information of the target object and of the radar. The geographic position information comprises the longitude and latitude of the target object's location and the distance between the radar and the target object; the radar's attribute information comprises the pitch angle when the radar is aimed at the target object; and the target object's attribute information comprises its physical size as detected by the radar. The longitude and latitude, pitch angle, target distance, physical size, and so on are determined by the radar from the echo signal returned when it transmits electromagnetic waves at the target object. The omni-directional movement zoom information, again, is the information the pan-tilt needs in order to move, commonly called PTZ information: P is the horizontal angle to adjust to, T the vertical angle, and Z the zoom setting. Given this information, and without any CCTV system, the pan-tilt camera can aim at the target object, zoom to magnify it to a suitable level, and display it clearly in the picture for staff to observe or monitor.
Fig. 8 is a schematic diagram of the internal structure of the data conversion module 32 in this apparatus embodiment. As shown in Fig. 8, the data conversion module 32 may include an omni-directional movement information conversion module 41 and a zoom information conversion module 42. The former calculates the omni-directional movement information from the longitude and latitude, the pitch angle, and the target distance; the latter calculates the zoom information from the physical size of the target object and the pan-tilt picture ratio.
With this apparatus, after the pan-tilt receives the target information it obtains from it the longitude and latitude of the target object detected by the radar, the pitch angle of the radar when aimed at the target object, and the distance between the radar and the target object, and from these derives the omni-directional movement information. Likewise, the pan-tilt obtains from the target information the physical size of the target object detected by the radar. Because the target object is some distance from the pan-tilt, its proportion in the picture shot by the pan-tilt will not necessarily meet the user's requirement, so the pan-tilt calculates the zoom information from the physical size of the target object and the pan-tilt picture ratio.
In this apparatus embodiment, the pan-tilt 12 calculates the omni-directional movement information and the zoom information from the spatial relationships, and combines them into the omni-directional movement zoom information that controls the motion of the pan-tilt camera, thereby realizing the pan-tilt positioning.
Further, to describe in more detail how the spatial relationships are used to convert the target information into omni-directional movement zoom information, a practical example is given below.
Device embodiment II
Fig. 9 is a schematic structural diagram of the second apparatus embodiment of the present application. The second embodiment includes not only the logic modules shown in Figs. 7 and 8 but also a calibration module 34, which acquires the relative position of the radar and the pan-tilt in three-dimensional space and determines from it the three-dimensional coordinate system that provides the reference coordinates for converting target information into omni-directional movement zoom information. The omni-directional movement information conversion module 41 includes a target object coordinate determination module 411 and an omni-directional movement information calculation module 412. Module 411 determines the three-dimensional coordinates of the target object 13 in the coordinate system from the longitude and latitude, the pitch angle, and the target distance, the coordinate system being determined by the relative position of the radar 11 and the pan-tilt 12 in three-dimensional space. Module 412 calculates the pan-tilt 12's omni-directional movement information for the target object from the three-dimensional coordinates of the target object 13 and those of the pan-tilt 12. The zoom information conversion module 42 includes a horizontal field-of-view calculation module 421 and a zoom information calculation module 422. Module 421 calculates the pan-tilt 12's horizontal field-of-view half angle from the size of the target object 13 and the pan-tilt picture ratio; module 422 calculates the zoom information from the half angle and the correspondence between the pan-tilt 12's magnification and field angle.
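Pulling the earlier sketches together, the module structure of Figs. 7-9 might be composed as follows. This reuses the helper functions and the PTZ record defined above; all names and the default picture ratio are assumptions, and hardware control is stubbed out:

    import math

    class PanTiltPositioner:
        """Illustrative composition of the calibration, data conversion, and
        positioning modules; a sketch, not the patented device itself.
        """

        def __init__(self, cal_pan_deg, cal_tilt_deg, radar_height_m):
            # Calibration module 34 (steps 301-302): fix the shared coordinate
            # system from readings taken while the pan-tilt aims at the radar.
            self.radar_height = radar_height_m
            self.pan_tilt_xyz = pan_tilt_coords_from_calibration(
                cal_pan_deg, cal_tilt_deg, radar_height_m)

        def ptz_for(self, dist_m, pitch_deg, horiz_deg, size_m, ratio=0.5):
            # Data conversion module 32: radar target info -> PTZ (steps 303-306).
            target_xyz = target_coords_from_radar(
                dist_m, pitch_deg, horiz_deg, self.radar_height)
            pan, tilt = pan_tilt_angles(self.pan_tilt_xyz, target_xyz)
            cam_dist = math.dist(self.pan_tilt_xyz, target_xyz)
            zoom = zoom_for_target(size_m, cam_dist, ratio)
            # The positioning module 33 would now drive the motors and lens to
            # these values; hardware I/O is outside the scope of this sketch.
            return PTZ(pan, tilt, zoom)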
Embodiments of the present application also provide a computer-readable storage medium for storing instructions, that is: instructions associated with the above methods are stored in the pan-tilt's computer-readable storage medium, and when executed by a processor they cause the processor to perform the steps of any of the pan-tilt positioning methods described above.
Of course, an embodiment of the present application may also provide a pan-tilt that includes at least a camera, a computer-readable storage medium, and a processor, the pan-tilt positioning method above being executed by the processor inside the pan-tilt. That is: the instructions associated with the method are stored in the pan-tilt's computer-readable storage medium, and the pan-tilt's processor executes the instructions for the steps of any of the above embodiments, thereby realizing the positioning by the pan-tilt.
With the schemes of the embodiments of the application, no complex and expensive CCTV system needs to be deployed with the radar and the pan-tilt: the pan-tilt converts the target information into omni-directional movement zoom information and is driven to rotate accordingly, accurately positioning the target object, which reduces the cost of security monitoring and simplifies the scheme.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the scope of protection of the present application.

Claims (10)

1. A method of pan-tilt positioning, the method comprising:
receiving target information transmitted by a radar, wherein the target information comprises geographic position information of a target object detected by the radar;
converting the target information into omni-directional movement zoom information, wherein the omni-directional movement zoom information is positioning information used for controlling the motion of a pan-tilt;
and controlling the motion of the pan-tilt according to the omni-directional movement zoom information, so that the pan-tilt positions the target object.
2. The method of claim 1, wherein the target information further includes attribute information of the target object and attribute information of the radar, and wherein converting the target information into omni-directional movement zoom information includes:
converting the geographic position information of the target object, the attribute information of the target object, and the attribute information of the radar into the omni-directional movement zoom information.
3. The method of claim 2, wherein the geographic position information includes the latitude and longitude of the geographic location of the target object and the distance between the radar and the target object, wherein the radar attribute information includes the pitch angle of the radar when aimed at the target object, wherein the target object attribute information includes the physical size of the target object detected by the radar, and wherein the converting into omni-directional movement zoom information includes:
calculating the omni-directional movement information of the pan-tilt according to the longitude and latitude, the pitch angle, and the target distance;
and calculating the zoom information of the pan-tilt according to the physical size of the target object and the pan-tilt picture ratio, wherein the omni-directional movement zoom information comprises the omni-directional movement information and the zoom information.
4. The method according to any of claims 1-3, wherein prior to said converting of the target information into omni-directional movement zoom information, the method further comprises:
acquiring the relative position of the radar and the pan-tilt in three-dimensional space;
and determining a three-dimensional coordinate system according to that relative position, wherein the coordinate system provides the reference coordinates for the conversion of the target information into the omni-directional movement zoom information.
5. The method of claim 3, wherein the calculating of the omni-directional movement information from the longitude and latitude, the pitch angle, and the target distance comprises:
determining the three-dimensional coordinates of the target object in a three-dimensional coordinate system according to the longitude and latitude, the pitch angle, and the target distance, wherein the coordinate system is determined by the relative position of the radar and the pan-tilt in three-dimensional space;
and calculating the pan-tilt's omni-directional movement information for the target object according to the three-dimensional coordinates of the target object and the three-dimensional coordinates of the pan-tilt.
6. The method according to claim 3, wherein the calculating of the zoom information of the pan-tilt according to the physical size of the target object and the pan-tilt picture ratio comprises:
calculating the horizontal field-of-view half angle of the pan-tilt according to the physical size of the target object and the pan-tilt picture ratio;
and calculating the zoom information of the pan-tilt according to the half angle and the correspondence between the pan-tilt's magnification and field angle.
7. A pan-tilt positioning apparatus, comprising:
a data receiving module, configured to receive target information transmitted by the radar, wherein the target information comprises geographic position information of a target object detected by the radar;
a data conversion module, configured to convert the target information into omni-directional movement zoom information, wherein the omni-directional movement zoom information is positioning information used for controlling the motion of the pan-tilt;
and a positioning module, which controls the motion of the pan-tilt according to the omni-directional movement zoom information, so that the pan-tilt positions the target object.
8. The apparatus of claim 7, wherein the target information further includes attribute information of the target object and attribute information of the radar; the geographic position information comprises the longitude and latitude of the target object's location and the distance between the radar and the target object; the attribute information of the radar comprises the pitch angle when the radar is aimed at the target object; and the attribute information of the target object comprises the physical size of the target object detected by the radar; and wherein the data conversion module includes:
an omni-directional movement information conversion module, configured to calculate the omni-directional movement information of the pan-tilt according to the longitude and latitude, the pitch angle, and the target distance;
and a zoom information conversion module, configured to calculate the zoom information of the pan-tilt according to the physical size of the target object and the pan-tilt picture ratio, wherein the omni-directional movement zoom information comprises the omni-directional movement information and the zoom information.
9. A computer-readable storage medium storing instructions which, when executed by a processor, cause the processor to perform the steps in a pan-tilt positioning method according to any one of claims 1 to 6.
10. A pan-tilt comprising a camera, characterized in that it further comprises a computer-readable storage medium according to claim 9, and a processor that can execute instructions in said computer-readable storage medium.
CN201811247814.XA 2018-10-25 2018-10-25 Pan-tilt positioning method and device Pending CN111103899A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811247814.XA CN111103899A (en) 2018-10-25 2018-10-25 Holder positioning method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811247814.XA CN111103899A (en) 2018-10-25 2018-10-25 Holder positioning method and device

Publications (1)

Publication Number Publication Date
CN111103899A true CN111103899A (en) 2020-05-05

Family

ID=70418191

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811247814.XA Pending CN111103899A (en) 2018-10-25 2018-10-25 Holder positioning method and device

Country Status (1)

Country Link
CN (1) CN111103899A (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103034247A (en) * 2012-12-04 2013-04-10 浙江天地人科技有限公司 Controlling method and controlling device for remote monitoring system
CN104135644A (en) * 2014-07-31 2014-11-05 天津市亚安科技股份有限公司 Intelligent tracking cradle head having radar monitoring function and monitoring method
CN205142414U (en) * 2015-11-24 2016-04-06 陕西亿达泰电子科技有限公司 Video linkage monitoring device based on radar
JP2017204795A (en) * 2016-05-13 2017-11-16 キヤノン株式会社 Tracking apparatus
CN108037501A (en) * 2018-01-30 2018-05-15 长沙深之瞳信息科技有限公司 It is a kind of to obtain area outlook radar system and method for the target pitch to angle
CN207965139U (en) * 2018-01-30 2018-10-12 长沙深之瞳信息科技有限公司 A kind of area outlook radar system that can obtain target pitch angle
CN207742335U (en) * 2018-02-06 2018-08-17 上海圆舟电子科技有限公司 A kind of intelligence maritime affairs tracking radar
CN108615321A (en) * 2018-06-07 2018-10-02 湖南安隆软件有限公司 Security pre-warning system and method based on radar detecting and video image behavioural analysis

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
孙申雨: "雷达与视频联动系统应用浅析" [Sun Shenyu, "A brief analysis of the application of radar-video linkage systems"] *
汪永军 等: "一种基于岸基多雷达的船舶监视管理系统" [Wang Yongjun et al., "A ship surveillance and management system based on shore-based multi-radar"] *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111580052A (en) * 2020-05-18 2020-08-25 苏州理工雷科传感技术有限公司 Simulation holder system and device for FOD detection radar joint debugging test
CN111580052B (en) * 2020-05-18 2023-05-23 苏州理工雷科传感技术有限公司 Simulation holder system and device for FOD detection radar joint debugging test
CN112394347A (en) * 2020-11-18 2021-02-23 杭州海康威视数字技术股份有限公司 Target detection method, device and equipment
CN112949466A (en) * 2021-02-26 2021-06-11 重庆若上科技有限公司 Video AI smoke pollution source identification and positioning method
CN112949466B (en) * 2021-02-26 2022-11-22 重庆若上科技有限公司 Video AI smoke pollution source identification and positioning method

Similar Documents

Publication Publication Date Title
CN108574822B (en) Method for realizing target tracking, pan-tilt camera and monitoring platform
US9109889B2 (en) Determining tilt angle and tilt direction using image processing
RU2743112C2 (en) Apparatus and method for analyzing vibrations using high-speed video data and using such a device for contactless analysis of vibrations
US10613231B2 (en) Portable GNSS survey system
CN108447075B (en) Unmanned aerial vehicle monitoring system and monitoring method thereof
CN105700547B (en) A kind of aerial three-dimensional video-frequency streetscape system and implementation method based on navigation dirigible
CN110910459B (en) Camera device calibration method and device and calibration equipment
KR101105606B1 (en) The method and apparatus of topographical map data with movement multi sensor moudle
US20140156219A1 (en) Determining tilt angle and tilt direction using image processing
CN108303078B (en) Omnidirectional ship anti-collision early warning and navigation system based on stereoscopic vision
KR101223242B1 (en) Apparatus for drawing digital map
CN107192377B (en) Method and device for remotely measuring object coordinates and aircraft
KR101308744B1 (en) System for drawing digital map
JP6251142B2 (en) Non-contact detection method and apparatus for measurement object
KR101214081B1 (en) Image expression mapping system using space image and numeric information
CN111103899A (en) Holder positioning method and device
CN109996032B (en) Information display method and device, computer equipment and storage medium
CN115841487B (en) Hidden danger positioning method and terminal along power transmission line
CN113345028A (en) Method and equipment for determining target coordinate transformation information
JP2011095112A (en) Three-dimensional position measuring apparatus, mapping system of flying object, and computer program
JP2018146524A (en) Survey system
JP2015010911A (en) Airborne survey method and device
CN111325790B (en) Target tracking method, device and system
CN111862197B (en) Target tracking method and system in video monitoring and ball machine
EP2948791B1 (en) Improved laser range finding

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
Effective date of registration: 2020-07-08
Address after: Building A1, No. 299 Qiushi Road, Tonglu Economic Development Zone, Tonglu County, Hangzhou, Zhejiang 311501
Applicant after: Hangzhou Hikmicro Sensing Technology Co., Ltd.
Address before: Hikvision Science Park, No. 555 Qianmo Road, Binjiang District, Hangzhou, Zhejiang 310053
Applicant before: Hangzhou Hikvision Digital Technology Co., Ltd.