CN113489945A - Target positioning method, device and system and computer readable storage medium - Google Patents

Target positioning method, device and system and computer readable storage medium

Info

Publication number
CN113489945A
Authority
CN
China
Prior art keywords: information, image, target, acquisition device, image acquisition
Prior art date
Legal status: Pending
Application number
CN202110634844.1A
Other languages
Chinese (zh)
Inventor
吴云松
Current Assignee: Shenzhen Weifei Technology Co., Ltd.
Original Assignee: Shenzhen Weifei Technology Co., Ltd.
Application filed by Shenzhen Weifei Technology Co., Ltd.
Publication of CN113489945A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 13/00 Burglar, theft or intruder alarms
    • G08B 13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B 13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B 13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B 13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects

Abstract

The invention discloses a target positioning method, device, and system, and a computer-readable storage medium, relating to the technical field of computer vision. The target positioning method comprises the following steps: acquiring height information and first angle information of an image acquisition device, the height information being the distance between the image acquisition device and the ground and the first angle information being the included angle between the image acquisition device and the ground; calculating, according to the height information and the first angle information, the correspondence between the image position of a target in the image and the actual position of the target on the ground; acquiring image information through the image acquisition device and obtaining image position information of the target from the image information; and obtaining the actual position information of the target according to the image position information and the correspondence. The target positioning method can automatically and accurately locate the target.

Description

Target positioning method, device and system and computer readable storage medium
Technical Field
The present invention relates to the field of computer vision technologies, and in particular, to a method, an apparatus, a system, and a computer-readable storage medium for target positioning.
Background
In recent years, with the rapid growth in the number of vehicles and the rapid expansion of urban areas, one-way roads easily become heavily congested during morning and evening rush hours on working days and holidays. To cope with heavy road traffic, tidal (reversible) lanes can change their permitted driving direction according to lane indications, making maximum use of road resources. At present, existing tidal lanes all follow fixed switching schemes based on historical traffic-flow data for the road: they are generally opened or closed during fixed time periods, or the bidirectional traffic flow is observed manually on site and controlled by remotely or manually changing signboards, signal lights, movable guardrails, and the like.
Disclosure of Invention
The present invention is directed to solving at least one of the problems of the prior art. Therefore, the invention provides a target positioning method, device, system, and computer-readable storage medium that can automatically and accurately locate a target.
In a first aspect of the present application, a target positioning method is provided, including:
acquiring height information and first angle information of an image acquisition device; the height information is the distance between the image acquisition device and the ground, and the first angle information is the included angle between the image acquisition device and the ground;
calculating the corresponding relation between the image position of the target in the image and the actual position of the target on the ground according to the height information and the first angle information;
acquiring image information through the image acquisition device, and acquiring image position information of the target according to the image information;
and obtaining the actual position information of the target according to the image position information and the corresponding relation.
The target positioning method according to the embodiment of the first aspect of the present invention has at least the following advantages. The image acquisition device is arranged at a certain height above the ground, and a larger shooting angle of view can be obtained by rotating it in the horizontal and vertical directions. By acquiring the height information and the first angle information of the image acquisition device, the correspondence between an image position in a captured image and the actual position of a target on the ground can be calculated. The image information captured by the image acquisition device is processed and analysed to obtain the image position information of the target, and the actual position information of the target on the ground is then calculated from the correspondence and the image position information. The whole process positions the target by automatically analysing the image information.
According to some embodiments of the first aspect of the present invention, the acquiring height information and first angle information of the image acquisition device includes: acquiring the height information and initial angle information of the image acquisition device, the initial angle information being an initial included angle between the image acquisition device and the ground; acquiring rotation angle information of the image acquisition device; and obtaining the first angle information according to the initial angle information and the rotation angle information. When the target moves, the shooting range of the image acquisition device can be adjusted by controlling its horizontal and vertical rotation angles, and the current first angle information can be calculated from the initial angle and the adjusted rotation angle information, so that the correspondence between the image position and the actual position can be calculated accurately.
According to some embodiments of the first aspect of the present invention, the acquiring, by the image acquisition device, image information and acquiring image position information of the target according to the image information includes: controlling the horizontal rotation angle and the vertical rotation angle of the image acquisition device according to the moving direction of the target; acquiring the image information through the image acquisition device; and acquiring the image position information while the target moves according to the image information. When a moving target is detected in the image information, the image acquisition device rotates in the horizontal and/or vertical direction according to the moving direction of the target so as to follow it, and the image position information of the target is tracked on the rotated image information, so that the actual position information of the target can be tracked accurately.
Some embodiments according to the first aspect of the present invention further comprise: issuing an early warning if the actual position information is the same as preset position information, the preset position information indicating a danger early-warning area; and acquiring target state information according to the actual position information and the preset position information, the target state information including the target entering the preset position and the target moving away from the preset position. The actual position information is compared with the preset position information; if they are the same, the target is in the danger early-warning area and an alarm is issued, so that the warning area is monitored automatically. By acquiring the relative state of the actual position and the preset position while the target moves, it can be judged whether the target is approaching or leaving the preset position, and different reactions can be made for different moving directions: when the target approaches the preset position, an early warning is actively issued to prompt an operator to pay attention and take measures, or to drive away, by means of the warning, a target that has entered the preset area.
According to some embodiments of the first aspect of the present invention, the preset position information includes one or a combination of preset area information and preset line information; the preset area is at least one closed area, and the preset line is at least one line segment. Setting a preset area or a preset line amounts to setting a protection area or a warning line: when a target enters the preset area or crosses the warning line, an early warning is actively issued to prompt an operator to pay attention and take measures, or to drive away, by means of the warning, a target that has entered the preset area.
According to some embodiments of the first aspect of the present invention, the image position information includes image abscissa information and image ordinate information, and the obtaining of the actual position information of the target according to the image position information and the correspondence includes: obtaining actual abscissa information of the target according to the image abscissa information and the correspondence; and obtaining actual ordinate information of the target according to the image ordinate information and the correspondence. In this embodiment, the image position information is expressed in coordinate-axis form, and the position of the target on the actual coordinate axes is obtained through the correspondence, so that the position of the target is obtained quickly and accurately.
According to some embodiments of the first aspect of the present invention, the obtaining the actual position information of the target according to the image position information and the correspondence comprises: acquiring the actual position information of the target in real time through a plurality of image acquisition devices according to the image position information and the correspondence. Arranging a plurality of image acquisition devices to obtain the actual position information of the target jointly allows, on the one hand, the actual position of the target to be determined more precisely or confined to a region and, on the other hand, the ground area covered by the cameras to be enlarged, giving a larger monitoring range.
In a second aspect of the present application, there is provided an object localization apparatus comprising: at least one memory; at least one processor; at least one program; the programs are stored in the memory, and the processor executes at least one of the programs to implement the object localization method as described in the embodiments of the first aspect.
In a third aspect of the present application, there is provided an object positioning system comprising: at least one image acquisition device arranged at a first height from the ground and used for acquiring image information; a rotating device connected with the image acquisition device and used for adjusting the rotation angle of the image acquisition device, the rotating device further being provided with a first sensing component for obtaining height information and first angle information and with an origin sensing component for sensing initial position information of the image acquisition device; the target positioning device according to the embodiment of the second aspect, the target positioning device being connected to the image acquisition device and the rotating device respectively, and configured to receive the image information, the height information, and the first angle information and to obtain actual position information of a target according to them; and a supporting mechanism connected with the rotating device and used for supporting the rotating device and the image acquisition device.
The object positioning system according to the embodiment of the third aspect of the present invention has at least the following advantages. Arranging the image acquisition device at a first height from the ground gives a larger scanning range, so that image information covering a larger area can be acquired. The camera is connected with the rotating device, so its shooting angle of view in the horizontal and vertical directions can be adjusted freely. The rotating device contains a first sensing component for obtaining the height information and the first angle information of the image acquisition device and sending them, together with the image information, to the target positioning device, which obtains the actual position information of the target. The rotating device is also provided with an origin sensing component for obtaining the initial position information of the image acquisition device, so that the angles through which the rotating device must rotate in the horizontal and vertical directions can be controlled and image information acquired in real time. The whole process positions the target by automatically analysing the image information and gives automatic early warning for the warning area.
In a fourth aspect of the present application, a computer-readable storage medium is provided, which stores computer-executable instructions for causing a computer to perform the object localization method according to any one of the embodiments of the first aspect.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
Additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a flowchart of a target positioning method according to an embodiment of the present application;
fig. 2 is a flowchart of a target positioning method according to another embodiment of the present application;
fig. 3 is a flowchart of a target positioning method according to another embodiment of the present application;
FIG. 4 is a flowchart of a target location method according to another embodiment of the present application;
FIG. 5 is a flowchart of a target location method according to another embodiment of the present application;
FIG. 6 is a schematic view of an object locating device provided in one embodiment of the present application;
FIG. 7 is a schematic view of an object locating system provided in one embodiment of the present application;
FIG. 8 is a schematic view of an image capture device of a target positioning device according to one embodiment of the present application;
FIG. 9 is an exploded view of a rotating device of a target positioning device according to one embodiment of the present application.
The reference numbers are as follows:
an image acquisition device 100; a rotating device 200; a horizontal rotation structure 210; a vertical rotation structure 220; the support mechanism 300.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
In the description of the present invention, it should be understood that the orientation or positional relationship referred to in the description of the orientation, such as the upper, lower, front, rear, left, right, etc., is based on the orientation or positional relationship shown in the drawings, and is only for convenience of description and simplification of description, and does not indicate or imply that the device or element referred to must have a specific orientation, be constructed and operated in a specific orientation, and thus, should not be construed as limiting the present invention.
In the description of the present invention, if there is any description of "first", "second", etc. for the purpose of distinguishing technical features, it is not to be understood as indicating or implying relative importance, the number of indicated technical features, or the precedence of the indicated technical features. "At least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and means that three relationships may exist; for example, A and/or B may mean that A exists alone, A and B exist simultaneously, or B exists alone, where A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "At least one of the following" and similar expressions refer to any combination of these items, including any combination of singular or plural items. For example, at least one of a, b, and c may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be singular or plural.
In the description of the present invention, unless otherwise explicitly limited, terms such as arrangement, installation, connection and the like should be understood in a broad sense, and those skilled in the art can reasonably determine the specific meanings of the above terms in the present invention in combination with the specific contents of the technical solutions.
In a first aspect, referring to fig. 1, fig. 1 is a flowchart of an object locating method provided in an embodiment of the present application, and the present application provides an object locating method, which includes, but is not limited to, step S110, step S120, step S130, and step S140:
step S110: acquiring height information and first angle information of an image acquisition device; the height information is the distance between the image acquisition device and the ground, and the first angle information is the included angle between the image acquisition device and the ground.
It can be understood that, by disposing the image capturing apparatus 100 on a two-degree-of-freedom pan/tilt head at a certain height above the ground, the image capturing apparatus 100 can rotate freely in both the horizontal and vertical directions, the horizontal direction being parallel to the horizontal plane. The height information and the first angle information of the image capturing apparatus 100 relative to the ground are acquired to determine the range monitored by the image capturing apparatus 100: adjusting the height changes the coverage and the resolution of the acquired image information, and adjusting the first angle information changes the ground area captured in the image information.
It can be understood that a levelling ball (spirit level) is disposed on the horizontal rotation structure 210, and the levelness of the horizontal rotation structure 210 can be adjusted by adjusting its position.
It can also be understood that the horizontal direction need not be parallel to the horizontal plane; in that case the actual orientation can be measured in advance or obtained through calibration, and the height of the image capturing apparatus 100 above the ground can likewise be determined by measurement.
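As a rough illustration of how the mounting height and angle determine the monitored range, the following sketch estimates the near and far ground distances covered by the camera from its height, tilt angle, and vertical field of view. It assumes a flat, level ground and an ideal pinhole camera; the function and parameter names are illustrative and do not come from the patent itself.

```python
import math

def ground_footprint(height_m, tilt_deg, vfov_deg):
    """Near and far ground distances (metres) covered by a camera mounted
    height_m above flat ground, tilted tilt_deg below the horizontal, with a
    vertical field of view of vfov_deg."""
    half_fov = vfov_deg / 2.0
    far_angle = tilt_deg - half_fov    # depression angle of the upper view edge
    near_angle = tilt_deg + half_fov   # depression angle of the lower view edge
    far = math.inf if far_angle <= 0 else height_m / math.tan(math.radians(far_angle))
    near = height_m / math.tan(math.radians(near_angle))
    return near, far

# Example: camera 6 m above ground, tilted 30 degrees down, 40 degree vertical FOV
print(ground_footprint(6.0, 30.0, 40.0))   # roughly (5.0, 34.0)
```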
Step S120: and calculating the corresponding relation between the image position of the target in the image and the actual position of the target on the ground according to the height information and the first angle information.
It can be understood that, for different height information and different first angle information, the correspondence between an image position in the image acquired by the image capturing apparatus 100 and the actual position of the target on the ground differs. Changes in the height information and the first angle information are therefore acquired in real time, and the correspondence between image position and actual position is recalculated in real time through the geometric and mapping relationships.
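One concrete way to express this correspondence for flat, level ground is the plane-induced homography of a pinhole camera. The sketch below builds such a homography from the mounting height, the tilt (first angle), the pan angle, and assumed intrinsic parameters; the coordinate conventions and parameter names are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def ground_homography(h, tilt_deg, pan_deg, fx, fy, cx, cy):
    """Homography H mapping ground coordinates (X, Y, 1) on the plane z = 0
    to homogeneous pixel coordinates (u, v, 1).

    Assumed conventions: world x east, y north, z up; camera at height h above
    the origin, looking north when pan = 0, tilted tilt_deg below horizontal.
    """
    t_r, p_r = np.radians(tilt_deg), np.radians(pan_deg)
    # World-to-camera rotation for zero pan (camera x right, y down, z forward).
    R0 = np.array([[1.0, 0.0, 0.0],
                   [0.0, -np.sin(t_r), -np.cos(t_r)],
                   [0.0,  np.cos(t_r), -np.sin(t_r)]])
    # Pan about the vertical axis (this is Rz(-pan)).
    Rz = np.array([[ np.cos(p_r), np.sin(p_r), 0.0],
                   [-np.sin(p_r), np.cos(p_r), 0.0],
                   [0.0, 0.0, 1.0]])
    R = R0 @ Rz
    C = np.array([0.0, 0.0, h])            # camera centre in world coordinates
    t = -R @ C                             # translation of the world-to-camera transform
    K = np.array([[fx, 0.0, cx], [0.0, fy, cy], [0.0, 0.0, 1.0]])
    H = K @ np.column_stack((R[:, 0], R[:, 1], t))   # plane z = 0 homography
    return H
```

The homography changes whenever the height or the first angle information changes, which is why the correspondence is recomputed from the values acquired in real time; the inverse of H maps image positions back to ground positions.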
Step S130: and acquiring image information through an image acquisition device, and acquiring image position information of the target according to the image information.
It can be understood that the image information is acquired from the image capturing apparatus 100 and the acquired image is subjected to image recognition and analysis to detect whether a target exists in it. The target may be a part added relative to an original reference image or a part that moves within the image information. The image position information is then acquired; specifically, coordinate axes are established in the image and the coordinates at which the target appears in the image are determined.
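Step S130 leaves the choice of detector open. As one possible sketch, assuming OpenCV is available, a background-subtraction detector returns the pixel centroid of the largest moving region as the image position information; the library, thresholds, and minimum area below are illustrative assumptions rather than the patent's own method.

```python
import cv2

back_sub = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

def detect_target(frame, min_area=400):
    """Return the pixel coordinates (u, v) of the largest moving object, or None."""
    mask = back_sub.apply(frame)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)     # remove speckle noise
    found = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contours = found[0] if len(found) == 2 else found[1]      # OpenCV 4.x vs 3.x
    best = max(contours, key=cv2.contourArea, default=None)
    if best is None or cv2.contourArea(best) < min_area:
        return None
    m = cv2.moments(best)
    return m["m10"] / m["m00"], m["m01"] / m["m00"]           # centroid in pixels
```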
Step S140: and obtaining the actual position information of the target according to the image position information and the corresponding relation.
It can be understood that the actual position information is obtained by substituting the image position information into a specific calculation formula of the corresponding relationship between the image position of the target in the image and the actual position of the target on the ground, so that the specific position of the target is accurately positioned.
The target positioning method can position the target and automatically early warn the warning area by automatically analyzing the image information in the whole process.
Referring to fig. 2, fig. 2 is a flowchart of a target positioning method according to another embodiment of the present application, wherein step S110 includes, but is not limited to, the following steps:
step S210: acquiring height information and initial angle information of an image acquisition device; wherein, the initial angle information is the initial included angle between the image acquisition device and the ground.
Step S220: and acquiring the rotation angle information of the image acquisition device.
Step S230: and obtaining first angle information according to the initial angle information and the rotation angle information.
It can be understood that, from the height information and the first angle information, the correspondence between the image position of the target in the image and the actual position of the target on the ground can be calculated. The first angle information is obtained as follows: the initial angle information of the image capturing apparatus 100 is first obtained through the origin sensor, the initial angle information comprising initial horizontal angle information and initial vertical angle information, both generally set to 0 degrees; the rotation angle information of the image capturing apparatus 100 is then obtained, comprising horizontal rotation angle information and vertical rotation angle information; first horizontal angle information is obtained from the initial horizontal angle information and the horizontal rotation angle information, and, similarly, first vertical angle information is obtained from the initial vertical angle information and the vertical rotation angle information. The first horizontal angle information and the first vertical angle information together constitute the first angle information. When the target moves, the shooting range of the image capturing apparatus 100 can be adjusted by controlling its horizontal and vertical rotation angles, and the current first angle information can be calculated from the initial angle and the adjusted rotation angle information, so that the correspondence between the image position and the actual position can be calculated accurately.
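A minimal sketch of this additive relationship between the origin-sensor reading and the commanded rotation is shown below; the wrap-around of the horizontal angle and the parameter names are assumptions made only for illustration.

```python
def first_angle(initial_h_deg, initial_v_deg, rot_h_deg, rot_v_deg):
    """Current pan/tilt (the "first angle information") from the initial angle
    read by the origin sensor plus the rotation angle of the pan/tilt head."""
    horizontal = (initial_h_deg + rot_h_deg) % 360.0   # pan wraps around
    vertical = initial_v_deg + rot_v_deg               # tilt is simply additive
    return horizontal, vertical
```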
Referring to fig. 3, fig. 3 is a flowchart of a target positioning method according to another embodiment of the present application, and step S130 includes, but is not limited to, the following steps:
step S310: and controlling the horizontal rotation angle and the vertical rotation angle of the image acquisition device according to the moving direction of the target.
Step S320: and acquiring image information through an image acquisition device.
Step S330: and acquiring image position information when the target moves according to the image information.
It can be understood that, when a moving target is detected in the image information, the horizontal rotation angle and the vertical rotation angle of the image capturing apparatus 100 are controlled according to the moving direction of the target so as to follow its movement. The image information is acquired in real time and the image position information of the target is obtained through image recognition and analysis, preventing the target from leaving the field of view and thereby tracking the specific position of the target.
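The patent does not prescribe a particular control law for following the target. A simple proportional controller that converts the target's offset from the image centre into pan and tilt corrections is sketched below; the gain and field-of-view parameters are illustrative assumptions.

```python
def track_step(u, v, img_w, img_h, hfov_deg, vfov_deg, gain=0.5):
    """Pan/tilt correction (degrees) that re-centres a target seen at pixel (u, v)."""
    err_u = (u - img_w / 2) / (img_w / 2)   # -1..1, positive when target is right of centre
    err_v = (v - img_h / 2) / (img_h / 2)   # -1..1, positive when target is below centre
    d_pan = gain * err_u * (hfov_deg / 2)   # rotate towards the target horizontally
    d_tilt = gain * err_v * (vfov_deg / 2)  # rotate towards the target vertically
    return d_pan, d_tilt
```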
Referring to fig. 4, the target positioning method provided in the present application further includes, but is not limited to, the following steps:
step S410: if the actual position information is the same as the preset position information, an early warning is sent out; and presetting the position information as a danger early warning area.
Step S420: acquiring target state information according to the actual position information and the preset position information; the target state information comprises that the target enters a preset position and the target is far away from the preset position.
The range indicated by the preset position information is a danger early-warning area. When the actual position information of the target is detected to be the same as the preset position information, the target is within the danger early-warning area and an early warning is issued, reminding the target to leave the danger early-warning area and prompting the relevant staff to drive away, by means of the warning, targets that have entered it. Applied to different fields, the method can warn ordinary personnel against entering a dangerous area by mistake and can prevent lawbreakers from entering a confidential area.
It can be understood that, by obtaining the relative state of the actual position and the preset position while the target moves, it can be judged whether the target is approaching or moving away from the preset position, and different reactions can be made for different moving directions. When the target is detected to be approaching the preset position, an early warning is actively issued to prompt the relevant staff to pay attention and take measures, or the target that has rushed into the preset area is driven away by the warning; when the target is detected to be moving away from the preset position, no early warning needs to be issued and the attention paid to the target can be reduced.
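One way to realise this judgement, sketched below under the assumption that successive ground positions of the target are available, is to compare consecutive distances to the preset position; the hysteresis threshold eps is an illustrative assumption.

```python
import math

def target_state(prev_xy, curr_xy, preset_xy, eps=0.05):
    """Classify the target as approaching, leaving, or holding relative to the
    preset position by comparing successive ground distances (eps in metres)."""
    d_prev = math.dist(prev_xy, preset_xy)
    d_curr = math.dist(curr_xy, preset_xy)
    if d_curr < d_prev - eps:
        return "approaching"
    if d_curr > d_prev + eps:
        return "leaving"
    return "holding"
```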
It can be understood that an early warning is issued if the actual position information is the same as the preset position information, where the preset position information includes one or a combination of preset area information and preset line information, the preset area being at least one closed area and the preset line being at least one line segment. Setting a preset area or a preset line amounts to setting a protection area or a warning line: when a target enters the preset area or crosses the warning line, an early warning is actively issued to prompt an operator to pay attention and take measures, or to drive away, by means of the warning, a target that has entered the preset area. According to the correspondence between the image position and the actual position of the target, a protection area or warning line set in the image corresponds to the actual protection area or warning line on the ground, and an early warning is actively issued when the image position of the target is inside the protection area or beyond the warning line. The method can be applied to anti-theft control and active early warning; it can also be applied to airports, where, when birds or other foreign objects enter a certain area near an aircraft, it prompts an operator to take measures or sounds an alarm to drive away the target that has rushed into the preset area.
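A protection area and a warning line can be checked with elementary geometry once the target's ground coordinates are known. The following sketch shows a ray-casting point-in-polygon test for a closed preset area and a segment-crossing test for a warning line; it is an illustrative implementation, not the patent's own.

```python
def in_preset_area(point, polygon):
    """Ray-casting point-in-polygon test; polygon is a list of (x, y) vertices."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):                          # edge straddles the scan line
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def crossed_line(prev_pt, curr_pt, a, b):
    """True if the target's step from prev_pt to curr_pt crosses the warning line a-b."""
    def side(p, q, r):
        # Sign of the cross product: which side of line p->q the point r lies on.
        return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])
    return (side(a, b, prev_pt) * side(a, b, curr_pt) < 0 and
            side(prev_pt, curr_pt, a) * side(prev_pt, curr_pt, b) < 0)
```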
Referring to fig. 5, in the target positioning method provided by the present application, step S140 includes, but is not limited to, the following steps:
step S510: and obtaining the actual abscissa information of the target according to the image abscissa information and the corresponding relation.
Step S520: and obtaining the actual vertical coordinate information of the target according to the vertical coordinate information of the image and the corresponding relation.
It can be understood that the image position information of the target, that is, the coordinates of the target in the image, includes image abscissa information and image ordinate information and is obtained from the image information. The actual position information of the target is then calculated through the correspondence between image position and actual position at the current height and angle: coordinate axes are established on the ground, and the actual abscissa information and actual ordinate information of the target are marked on them. Expressing the image position information in coordinate-axis form and obtaining the position of the target on the actual coordinate axes through the correspondence allows the position of the target to be obtained quickly and accurately.
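Applied to the plane correspondence sketched earlier (or any equivalent one), the mapping from an image coordinate pair to the actual coordinate pair is a single homogeneous transformation followed by normalisation, as illustrated below.

```python
import numpy as np

def image_to_ground(H, u, v):
    """Map pixel (u, v) to ground coordinates (X, Y) given a ground-to-image
    plane homography H, by applying its inverse and normalising."""
    g = np.linalg.inv(H) @ np.array([u, v, 1.0])
    return g[0] / g[2], g[1] / g[2]
```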
In the target positioning method provided by the present application, step S140 includes, but is not limited to, the following steps: and acquiring the actual position information of the target in real time through a plurality of image acquisition devices according to the image position information and the corresponding relation.
It can be understood that arranging a plurality of image capturing apparatuses 100 to obtain the actual position information of the target jointly allows, on the one hand, the actual position of the target to be determined more precisely or confined to a region and, on the other hand, the ground area covered by the cameras to be enlarged, giving a larger monitoring range.
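How the estimates from several image acquisition devices are combined is left open. The sketch below simply takes a weighted average of the per-camera ground estimates; a real deployment might instead intersect bearing rays or run a filter, so this is only one assumed strategy.

```python
import numpy as np

def fuse_positions(estimates, weights=None):
    """Fuse per-camera ground estimates [(X, Y), ...] into a single position."""
    pts = np.asarray(estimates, dtype=float)
    w = np.ones(len(pts)) if weights is None else np.asarray(weights, dtype=float)
    fused = np.average(pts, axis=0, weights=w)   # weighted centroid of the estimates
    return float(fused[0]), float(fused[1])
```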
In a second aspect, the present application further provides a target positioning device, including: at least one memory, at least one processor and at least one program, the programs being stored in the memory, the processor executing the one or more programs to implement the above-described object localization method. One processor is illustrated in fig. 6.
The processor and memory may be connected by a bus or other means, with fig. 6 taking the example of a connection by a bus.
The memory, as a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as the program instructions/modules corresponding to the processing modules in the embodiments of the present application. The processor executes the non-transitory software programs, instructions, and modules stored in the memory so as to perform various functional applications and data processing, thereby implementing the target positioning method of the above method embodiment.
The memory may include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application program required for at least one function, and the data storage area may store data related to the target positioning method and the like. Further, the memory may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic-disk storage device, flash-memory device, or other non-transitory solid-state storage device. In some embodiments, the memory optionally includes memory located remotely from the processor, and these remote memories may be connected to the processing module via a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The one or more modules are stored in the memory and, when executed by the one or more processors, perform the target positioning method in any of the method embodiments described above, for example, method steps S110 to S140 in fig. 1, method steps S210 to S230 in fig. 2, method steps S310 to S330 in fig. 3, method steps S410 to S420 in fig. 4, and method steps S510 and S520 in fig. 5.
Referring to fig. 7 and 8, in a third aspect, the present application further provides an object positioning system comprising: at least one image acquisition device 100, a rotating device 200, a supporting mechanism 300, and an object positioning device as in the embodiment of the second aspect. The image acquisition device 100 is arranged at a first height from the ground and is used for acquiring image information. The rotating device 200 is connected with the image acquisition device 100 and is used for adjusting the rotation angle of the image acquisition device 100; the rotating device 200 is further provided with a first sensing component for obtaining the height information and the first angle information. The target positioning device is connected to the image acquisition device 100 and the rotating device 200 respectively, and is used for receiving the image information, the height information, and the first angle information and obtaining the actual position information of a target from them; specifically, it acquires the height information of the image acquisition device 100 above the ground and the first angle information of the image acquisition device 100, calculates the correspondence between the image position of the target in the image and the actual position of the target on the ground according to the height information and the first angle information, acquires the image information and obtains the image position information of the target from it, obtains the actual position information of the target according to the image position information and the correspondence, and issues an early warning if the actual position information is the same as the preset position information. The supporting mechanism 300 is connected to the rotating device 200 and is used for supporting the rotating device 200 and the image acquisition device 100. Arranging the image acquisition device 100 at a first height from the ground gives a larger scanning range, so that image information covering a larger area can be acquired; the camera is connected with the rotating device 200, so its shooting angle of view in the horizontal and vertical directions can be adjusted freely; and the rotating device 200 also contains the first sensing component for obtaining the height information and the first angle information of the image acquisition device 100 and sending them, together with the image information, to the target positioning device to obtain the actual position information of the target. The whole process positions the target by automatically analysing the image information and gives automatic early warning for the warning area.
It can be understood that the rotating device 200 is further provided with an origin sensing component for sensing initial position information of the image capturing device 100, the initial position information including initial horizontal angle information and initial vertical angle information of the image capturing device 100. Providing the origin sensing component on the rotating device 200 to obtain the initial position information of the image capturing device 100 makes it possible to control the angles through which the rotating device 200 must rotate in the horizontal and vertical directions, so that image information is acquired in real time.
Referring to fig. 9, it can be understood that the rotating device 200 includes a horizontal rotating structure 210 and a vertical rotating structure 220. The upper end of the vertical rotating structure 220 is connected to the image capturing device 100, and its lower end is rotatably connected to the upper end of the horizontal rotating structure 210. The horizontal rotating structure 210 drives the vertical rotating structure 220, and thereby the image capturing device 100, to rotate in the horizontal direction, while the vertical rotating structure drives the image capturing device 100 to rotate in the vertical direction, so that the area in which the image capturing device 100 captures image information can be adjusted flexibly.
It can be understood that the supporting mechanism 300 may be a telescopic rod, and the height of the telescopic rod is adjusted by a motor, so as to adjust the range of the image acquisition device 100 for acquiring image information.
Referring to fig. 3, MNPQ is the image target surface; MNPQ is fixed once the image capturing device 100 is installed. When the horizontal rotating structure 210 and the vertical rotating structure 220 rotate to a given position, the observation range MNPQ of the image capturing device 100, projected onto the ground, intersects the ground, and the intersecting surface is ABCD; that is, the four ground points ABCD and the four points MNPQ in the camera coordinate system are related by a perspective projection. Every point in the quadrilateral ABCD can be mapped through this perspective projection to a point in the plane determined by the four points MNPQ, and conversely, points in the quadrilateral MNPQ correspond to points in the quadrilateral ABCD through the same perspective projection. A ray OA from the optical centre O of the image capturing device 100 through a point of MNPQ rotates together with the camera and can be used to position the target. For example, the camera may not be at the zero point when it is switched on; it first moves to the zero point and, after the target is captured, rotates by a corresponding angle, for example 30 degrees, so the line OA also rotates by 30 degrees. From the correspondence between the rotation angle of the image capturing device 100 and the coordinates of ABCD, the coordinate values of ABCD and the physical coordinates of the target on the ground can be calculated.
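The ABCD ground quadrilateral and the ray OA described above can be computed explicitly by intersecting the viewing ray through each image point with the ground plane. The sketch below does this for level ground under the same assumed pinhole and pan/tilt conventions as the earlier homography sketch; the example corner pixels correspond to an assumed 1920x1080 sensor.

```python
import numpy as np

def pixel_ray_ground(u, v, h, tilt_deg, pan_deg, fx, fy, cx, cy):
    """Intersect the viewing ray through pixel (u, v) with the flat ground plane.

    Returns the ground point (X, Y), or None if the ray points at or above the horizon.
    """
    t_r, p_r = np.radians(tilt_deg), np.radians(pan_deg)
    R0 = np.array([[1.0, 0.0, 0.0],
                   [0.0, -np.sin(t_r), -np.cos(t_r)],
                   [0.0,  np.cos(t_r), -np.sin(t_r)]])
    Rz = np.array([[ np.cos(p_r), np.sin(p_r), 0.0],
                   [-np.sin(p_r), np.cos(p_r), 0.0],
                   [0.0, 0.0, 1.0]])
    R = R0 @ Rz                                             # world-to-camera rotation
    d_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])   # ray direction in camera frame
    d_world = R.T @ d_cam                                   # camera frame to world frame
    if d_world[2] >= -1e-9:
        return None                                         # ray never reaches the ground
    s = h / -d_world[2]
    ground = np.array([0.0, 0.0, h]) + s * d_world
    return ground[0], ground[1]

# The four image corners M, N, P, Q map to the ground corners A, B, C, D, e.g.:
# corners = [pixel_ray_ground(u, v, 6.0, 30.0, 0.0, 1000, 1000, 960, 540)
#            for (u, v) in [(0, 0), (1919, 0), (1919, 1079), (0, 1079)]]
```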
It can be understood that, if the ground is uneven, the correspondence between the MNPQ plane and the ABCD area can still be obtained by acquiring the three-dimensional topography of the ground. Unlike the level-ground case, the correspondence between the MNPQ plane and the ABCD area on non-level ground is a user-defined non-linear projection: either a lookup table from pixels to ground positions is consulted, or the ground is expressed by a surface equation and the intersection of the ray OA with that surface is calculated, giving the actual position of the target on the non-level ground.
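For the non-level-ground case, one assumed strategy is to march along the ray OA against a terrain elevation lookup (for example, a pre-surveyed height map) until the ray drops below the ground surface, as sketched below; the step size, range limit, and callback name are illustrative.

```python
import numpy as np

def ray_hit_terrain(origin, direction, height_at, step=0.25, max_range=200.0):
    """March the viewing ray from `origin` along `direction` until it falls below
    the terrain given by height_at(x, y); returns an approximate hit point or None."""
    o = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    prev = o
    s = step
    while s <= max_range:
        p = o + s * d
        if p[2] <= height_at(p[0], p[1]):   # ray has passed below the ground surface
            return (prev + p) / 2.0         # midpoint of the bracketing step
        prev = p
        s += step
    return None
```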
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing computer-executable instructions, which when executed by one or more processors, cause the one or more processors to perform an object location method in the above method embodiments. For example, the above-described method steps S110 to S140 in fig. 1, method steps S210 to S230 in fig. 2, method steps S310 to S330 in fig. 3, method steps S410 to S420 in fig. 4 and method steps S510 and S520 in fig. 5 are performed.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
From the above description of embodiments, those of ordinary skill in the art will appreciate that all or some of the steps, systems, and methods disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof. Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, digital signal processor, or microprocessor, as hardware, or as an integrated circuit, such as an application-specific integrated circuit. Such software may be distributed on computer-readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). As is well known to those of ordinary skill in the art, the term computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. In addition, communication media typically embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism and include any information delivery media, as known to those skilled in the art.
The embodiments of the present invention have been described in detail with reference to the accompanying drawings, but the present invention is not limited to the above embodiments, and various changes can be made within the knowledge of those skilled in the art without departing from the gist of the present invention.

Claims (10)

1. A method of locating an object, comprising:
acquiring height information and first angle information of an image acquisition device; the height information is the distance between the image acquisition device and the ground, and the first angle information is the included angle between the image acquisition device and the ground;
calculating the corresponding relation between the image position of the target in the image and the actual position of the target on the ground according to the height information and the first angle information;
acquiring image information through the image acquisition device, and acquiring image position information of the target according to the image information;
and obtaining the actual position information of the target according to the image position information and the corresponding relation.
2. The method of claim 1, wherein the obtaining the height information and the first angle information of the image capturing device comprises:
acquiring the height information and initial angle information of the image acquisition device; the initial angle information is an initial included angle between the image acquisition device and the ground;
acquiring rotation angle information of the image acquisition device;
and obtaining the first angle information according to the initial angle information and the rotation angle information.
3. The target positioning method according to claim 1, wherein the acquiring image information by the image acquisition device and obtaining image position information of the target according to the image information comprises:
controlling the horizontal rotation angle and the vertical rotation angle of the image acquisition device according to the moving direction of the target;
acquiring the image information through the image acquisition device;
and acquiring the image position information when the target moves according to the image information.
4. The method of claim 3, further comprising:
if the actual position information is the same as the preset position information, an early warning is sent out; the preset position information is a danger early warning area;
acquiring target state information according to the actual position information and the preset position information; the target state information comprises that the target enters a preset position and the target is far away from the preset position.
5. The method according to claim 4, wherein the preset position information comprises one or a combination of preset area information and preset line information, the preset area is at least one closed area, and the preset line is at least one-dimensional line segment.
6. The target positioning method according to claim 1, wherein the image position information includes image abscissa information and image ordinate information, and the obtaining of the actual position information of the target according to the image position information and the correspondence relationship includes:
obtaining actual abscissa information of the target according to the image abscissa information and the corresponding relation;
and obtaining the actual longitudinal coordinate information of the target according to the image longitudinal coordinate information and the corresponding relation.
7. The method of claim 1, wherein obtaining the actual position information of the target according to the image position information and the corresponding relationship comprises:
and acquiring actual position information of the target in real time through a plurality of image acquisition devices according to the image position information and the corresponding relation.
8. An object positioning device, comprising:
at least one memory;
at least one processor;
at least one program;
said programs being stored in said memory, said processor executing at least one of said programs to implement the object localization method of any of claims 1 to 6.
9. An object positioning system, comprising:
the system comprises at least one image acquisition device, a ground monitoring device and a control device, wherein the image acquisition device is arranged at a first height from the ground and is used for acquiring image information;
the rotating device is connected with the image acquisition device and used for adjusting the rotation angle of the image acquisition device, the rotating device is further provided with a first sensing component used for obtaining height information and first angle information, and the rotating device is further provided with an original point sensing component used for sensing initial position information of the image acquisition device;
the target positioning device according to claim 8, wherein the target positioning device is connected to the image capturing device and the rotating device, respectively, and configured to receive the image information, the height information, and the first angle information, and obtain actual position information of a target according to the image information, the height information, and the first angle information;
and the supporting mechanism is connected with the rotating device and is used for supporting the rotating device and the image acquisition device.
10. A computer-readable storage medium storing computer-executable instructions for causing a computer to perform the method of object localization according to any one of claims 1 to 7.
CN202110634844.1A 2020-12-18 2021-05-28 Target positioning method, device and system and computer readable storage medium Pending CN113489945A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2020115045797 2020-12-18
CN202011504579 2020-12-18

Publications (1)

Publication Number Publication Date
CN113489945A true CN113489945A (en) 2021-10-08

Family

ID=77934759

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110634844.1A Pending CN113489945A (en) 2020-12-18 2021-05-28 Target positioning method, device and system and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN113489945A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116147648A (en) * 2022-12-31 2023-05-23 珠海泰坦新动力电子有限公司 Positioning adjustment method, positioning tool, device, equipment and storage medium
CN116761079A (en) * 2023-08-21 2023-09-15 国网山西省电力公司电力科学研究院 Fine tracking method, system and device for moving target of power transmission line

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100007736A1 (en) * 2007-02-14 2010-01-14 Panasonic Corporation Monitoring camera and monitoring camera control method
KR20110023472A (en) * 2009-08-31 2011-03-08 주식회사 이미지넥스트 Apparatus and method for tracking object based on ptz camera using coordinate map
CN106791419A (en) * 2016-12-30 2017-05-31 大连海事大学 A kind of supervising device and method for merging panorama and details
CN106973266A (en) * 2017-03-31 2017-07-21 三峡大学 Substation safety operation management and control system and method
US20180360408A1 (en) * 2017-06-15 2018-12-20 Shanghai United Imaging Healthcare Co., Ltd. Imaging systems and methods thereof
KR20190067578A (en) * 2017-12-07 2019-06-17 (주)캠시스 Collision warning device and method using heterogeneous cameras having overlapped capture area
CN108168706A (en) * 2017-12-12 2018-06-15 河南理工大学 A kind of multispectral infrared imaging detecting and tracking system for monitoring low-altitude unmanned vehicle
CN110874953A (en) * 2018-08-29 2020-03-10 杭州海康威视数字技术股份有限公司 Area alarm method and device, electronic equipment and readable storage medium
WO2020063708A1 (en) * 2018-09-28 2020-04-02 杭州海康威视数字技术股份有限公司 Method, device and system for calibrating intrinsic parameters of fisheye camera, calibration device controller and calibration tool
CN109360243A (en) * 2018-09-28 2019-02-19 上海爱观视觉科技有限公司 A kind of scaling method of the movable vision system of multiple degrees of freedom
CN111198608A (en) * 2018-11-16 2020-05-26 广东虚拟现实科技有限公司 Information prompting method and device, terminal equipment and computer readable storage medium
CN109343002A (en) * 2018-11-23 2019-02-15 中国科学院电子学研究所 Auditory localization identification device and method
CN110174093A (en) * 2019-05-05 2019-08-27 腾讯科技(深圳)有限公司 Localization method, device, equipment and computer readable storage medium
CN110969097A (en) * 2019-11-18 2020-04-07 浙江大华技术股份有限公司 Linkage tracking control method, equipment and storage device for monitored target
CN111885361A (en) * 2020-08-03 2020-11-03 南通理工学院 Video monitoring device for image recognition and monitoring method thereof

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116147648A (en) * 2022-12-31 2023-05-23 珠海泰坦新动力电子有限公司 Positioning adjustment method, positioning tool, device, equipment and storage medium
CN116147648B (en) * 2022-12-31 2024-04-05 珠海泰坦新动力电子有限公司 Positioning adjustment method, positioning tool, device, equipment and storage medium
CN116761079A (en) * 2023-08-21 2023-09-15 国网山西省电力公司电力科学研究院 Fine tracking method, system and device for moving target of power transmission line
CN116761079B (en) * 2023-08-21 2023-11-03 国网山西省电力公司电力科学研究院 Fine tracking method, system and device for moving target of power transmission line

Similar Documents

Publication Publication Date Title
JP4003623B2 (en) Image processing system using a pivotable surveillance camera
EP2187166B1 (en) Industrial Machine
JP2609518B2 (en) Device for determining the position of a lens placed on a mobile machine on the field
CA2767312C (en) Automatic video surveillance system and method
CN113489945A (en) Target positioning method, device and system and computer readable storage medium
CN108447075B (en) Unmanned aerial vehicle monitoring system and monitoring method thereof
CN104902246A (en) Video monitoring method and device
EP2660625A1 (en) A method for monitoring a traffic stream and a traffic monitoring device
CN113345019B (en) Method, equipment and medium for measuring potential hazards of transmission line channel target
RU2015141333A (en) SYSTEMS AND METHODS OF TRACKING THE LOCATION OF A MOBILE TARGET OBJECT
JP2022003578A (en) Operation vehicle
CN110705359B (en) Parking space detection method
KR101852057B1 (en) unexpected accident detecting system using images and thermo-graphic image
KR20100110999A (en) Calibration method and apparatus for automotive camera system, and method and ecu for determining angular misalignments of automotive camera system
CN105141912B (en) A kind of method and apparatus of signal lamp reorientation
AU2014259557B2 (en) Method for aligning a laser scanner with respect to a roadway
CN104966062A (en) Video monitoring method and device
CN107458308A (en) A kind of auxiliary driving method and system
CN109087361B (en) Monocular camera-based method for calibrating transverse distance of forward object
CN114239995A (en) Method and system for generating full-area cruising route, electronic device and storage medium
CN111372051B (en) Multi-camera linkage blind area detection method and device and electronic equipment
DE112021002831T5 (en) ROAD-SIDE DETECTION AND WARNING SYSTEM AND PROCEDURES
CN114650395A (en) Method, device and system for positioning flying bird and storage medium
US20220343656A1 (en) Method and system for automated calibration of sensors
CN113096437B (en) Automatic parking method and device and vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination