CN110595443A - Projection device - Google Patents
Projection device
- Publication number
- CN110595443A CN110595443A CN201910780352.6A CN201910780352A CN110595443A CN 110595443 A CN110595443 A CN 110595443A CN 201910780352 A CN201910780352 A CN 201910780352A CN 110595443 A CN110595443 A CN 110595443A
- Authority
- CN
- China
- Prior art keywords
- image
- image acquisition
- acquisition device
- angle
- theta
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
- G01C11/30—Interpretation of pictures by triangulation
- G01C11/34—Aerial triangulation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/45—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/317—Convergence or focusing systems
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Studio Devices (AREA)
Abstract
The invention provides a positioning method and a positioning system. The method includes the following steps: a first image capturing device acquires a first image of a target object, and a second image capturing device acquires a second image of the target object; a first offset angle of the target object relative to the first image capturing device is obtained according to the relative position of the target object in the first image and the shooting angle range of the first image capturing device, and a second offset angle of the target object relative to the second image capturing device is obtained according to the relative position of the target object in the second image and the shooting angle range of the second image capturing device; and the position of the target object is obtained according to the position information or relative position information of the first image capturing device, the second image capturing device and the target object, together with the first offset angle and the second offset angle. The positioning method and system derive the position information of the target object from the positions of the image capturing devices and the image information they capture, thereby realizing positioning monitoring of an object.
Description
Technical Field
The present invention relates to the field of monitoring and positioning, and in particular, to a positioning method and a positioning system.
Background
In existing positioning, tracking and monitoring devices, position information of a target object is acquired and transmitted to a camera device or monitoring device for positioning and tracking. This requires a positioning device such as a GPS receiver to be installed on the target object, and requires that positioning device to communicate with the monitoring device so as to transmit the positioning information to a monitoring platform; however, under actual monitoring conditions a positioning device cannot be installed on every target object. In addition, the monitoring device needs to establish communication with the target object equipped with the positioning device in order to obtain its positioning information. This process is complex, the applicable scenes and occasions are extremely limited, and the approach is very inconvenient in many scenes and unusable in some.
Disclosure of Invention
Accordingly, it is desirable to provide a positioning method and a positioning system that can facilitate positioning monitoring and improve versatility.
Based on the above purpose, the present invention provides a positioning method, which includes the following steps:
A. a first image capturing device acquires a first image of a target object, and a second image capturing device acquires a second image of the target object;
B. acquiring a first offset angle of the target relative to the first image acquisition device according to the relative position of the target in the first image and the shooting angle range of the first image acquisition device, and acquiring a second offset angle of the target relative to the second image acquisition device according to the relative position of the target in the second image and the shooting angle range of the second image acquisition device;
C. and obtaining the position of the target object according to the position information or the relative position information of the first image acquisition device, the second image acquisition device and the target object, the first offset angle and the second offset angle.
Preferably, in the step a, the first image and the second image are respectively a dynamic image frame or a static image obtained by the first image capturing device and the second image capturing device at the same time.
Preferably, in step a: and identifying a common object in the first image and the second image as the target object.
Preferably, in step B: the first offset angle θ'1 is: θ'1 = (image_x1) × (θ11 − θ10) / w1 ………………(1)
Wherein image_x1 is the abscissa of the target object in the first image, w1 is the frame width of the first image, (θ11 − θ10) indicates the shooting angle range of the first image capturing device, and θ10 and θ11 are respectively the start angle and the end angle of the shooting angle range of the first image capturing device;
the second offset angle θ'2 is: θ'2 = (image_x2) × (θ21 − θ20) / w2 ………………(2)
Wherein image_x2 is the abscissa of the target object in the second image, w2 is the frame width of the second image, (θ21 − θ20) indicates the shooting angle range of the second image capturing device, and θ20 and θ21 are respectively the start angle and the end angle of the shooting angle range of the second image capturing device.
Preferably, θ10 and θ11 are absolute angles relative to a preset initial position of the first image capturing device, or relative angles relative to the current pointing direction of the first image capturing device; θ20 and θ21 are absolute angles relative to a preset initial position of the second image capturing device, or relative angles relative to the current pointing direction of the second image capturing device.
Preferably, θ10 and θ11 are fixed values, or variable values adjusted according to the depth of field or focal length of the first image capturing device; θ20 and θ21 are fixed values, or variable values adjusted according to the depth of field or focal length of the second image capturing device.
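Equations (1) and (2) share one mapping from a pixel column to an offset angle. A minimal sketch of that mapping (the function name `offset_angle` and the sample frame width and angles are illustrative assumptions, not values from the patent):

```python
def offset_angle(image_x, frame_w, theta_start, theta_end):
    """Map a horizontal pixel coordinate to an offset angle within the
    camera's shooting angle range, per equations (1)/(2):
        theta' = image_x * (theta_end - theta_start) / frame_w
    Angles are in degrees; image_x is measured from the frame's left edge."""
    return image_x * (theta_end - theta_start) / frame_w

# Example: a 1920-px-wide frame spanning a 60-degree shooting angle range;
# a target at pixel 480 sits a quarter of the way across the frame.
theta1_prime = offset_angle(480, 1920, 0.0, 60.0)   # 15.0 degrees
```

The same helper serves both cameras, with each device's own frame width and start/end angles substituted in.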
Preferably, step C includes: obtaining a third distance and a third angle of a connecting line between the first image acquisition device and the second image acquisition device according to the position information of the first image acquisition device and the second image acquisition device; and obtaining the position of the target object according to the third distance, the third angle, the first offset angle and the second offset angle.
Preferably, step C specifically includes: according to the triangulation principle, obtaining a triangular geometric relation:
d1 is obtained by calculation according to equation (3); then X0 is obtained by calculation according to equation (4), and Y0 by calculation according to equation (6). Alternatively, d2 is obtained by calculation according to equation (3); then X0 is obtained by calculation according to equation (5), and Y0 by calculation according to equation (7);
Wherein the position coordinates of the first image capturing device are expressed as (x1, y1), the position coordinates of the second image capturing device are expressed as (x2, y2), and (X0, Y0) represents the position coordinates of the target object; d1 represents the distance from the first image capturing device to the target object, d2 the distance from the second image capturing device to the target object, θ1 the first offset angle of the target object relative to the first image capturing device, and θ2 the second offset angle of the target object relative to the second image capturing device.
Preferably, the position information of the first image capturing device includes a first GPS position and first orientation information, and the position information of the second image capturing device includes a second GPS position and second orientation information.
Based on the above object, the present invention further provides a positioning system, comprising: the monitoring device is in communication connection with the at least two image acquisition devices; the at least two image acquisition devices comprise a first image acquisition device and a second image acquisition device;
the first image acquisition device is used for acquiring a first image of a target object;
the second image acquisition device is used for acquiring a second image of the target object;
the monitoring device is used for acquiring a first offset angle of the target relative to the first image acquisition device according to the relative position of the target in the first image and the shooting angle range of the first image acquisition device, and acquiring a second offset angle of the target relative to the second image acquisition device according to the relative position of the target in the second image and the shooting angle range of the second image acquisition device; the monitoring device is further used for obtaining the position of the target object according to the position information or the relative position information of the first image acquisition device, the second image acquisition device and the target object, the first offset angle and the second offset angle.
The positioning method and positioning system can realize positioning monitoring of an object in the monitored area according to the self-positioning information of the first and second image capturing devices and the images each captures. By adjusting the parameters of the first and second image capturing devices, their field-of-view range, monitoring range and monitoring area can also be adjusted according to the captured object; the monitoring area can be adjusted as the monitored object moves, so as to track and capture a specific object directionally and realize real-time tracking within a certain area. The invention can perform positioning monitoring of a target object without obtaining the target object's authorization, establishing a dedicated communication protocol, or installing a positioning device such as GPS on the target object, and is not limited by the communication scene or the target object's permission.
Drawings
Fig. 1 is a partial flowchart of a positioning method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an image capturing device according to an embodiment of the present invention, associating with a target object on a captured image;
FIG. 3 is a schematic diagram illustrating a relative position relationship between a first image capturing device, a second image capturing device and a target object according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a relative position relationship between a first image capturing device, a second image capturing device and a target object according to another embodiment of the present invention;
fig. 5 is a partial schematic view of an application scenario of the positioning system according to an embodiment of the invention.
Detailed Description
In order to further understand the objects, structures, features and functions of the present invention, the following embodiments are described in detail.
Certain terms are used throughout the description and the following claims to refer to particular components. As one of ordinary skill in the art will appreciate, manufacturers may refer to a component by different names. This specification and the claims do not distinguish between components that differ in name but not in function. In the following description and in the claims, the terms "include" and "comprise" are used in an open-ended fashion, and thus should be interpreted to mean "including, but not limited to".
As shown in fig. 1, a flow chart of a positioning method according to a first embodiment of the present invention is disclosed, which includes the following steps:
step A, a first image acquisition device 10 acquires a first image I1 of an object 30, and a second image acquisition device 20 acquires a second image of the object 30;
step B, obtaining a first offset angle of the object 30 relative to the first image capturing device 10 according to the relative position of the object 30 in the first image I1 and the shooting angle range of the first image capturing device 10, and obtaining a second offset angle of the object 30 relative to the second image capturing device 20 according to the relative position of the object 30 in the second image and the shooting angle range of the second image capturing device 20;
and step C, obtaining the position of the target object 30 according to the position information or the relative position information of the first image acquisition device 10, the second image acquisition device 20 and the target object 30, and the first offset angle and the second offset angle.
In a preferred embodiment, in step a, the first image I1 and the second image I2 are motion picture frames or still pictures acquired by the first image capturing device 10 and the second image capturing device 20 at the same time, respectively.
Further, in step a of this embodiment: an object common to the first image I1 and the second image I2 is identified as the object 30. Image recognition techniques may be used for recognition, such as recognizing the license plate number of a vehicle in motion, or extracting features of the target object 30 for feature matching.
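The matching of a common object across the two cameras' frames could be sketched as below. This is a hedged illustration only: the `common_targets` helper and its (identifier, abscissa) detection format are assumptions, and the recognition step itself (e.g. license-plate OCR or feature extraction) is assumed to be supplied by an external module, as the patent does not specify one.

```python
def common_targets(detections_a, detections_b):
    """Pair detections that share an identifier (e.g. a recognised
    licence-plate string) across the two cameras' frames.
    Each detection is a tuple (identifier, image_x)."""
    b_index = {ident: x for ident, x in detections_b}
    return [(ident, xa, b_index[ident])
            for ident, xa in detections_a if ident in b_index]

# Camera 10 sees two plates; camera 20 sees one of them.
pairs = common_targets([("AB123", 480), ("CD456", 900)],
                       [("AB123", 1200)])
# pairs -> [("AB123", 480, 1200)]: the common object and its abscissa
# in each frame, ready for equations (1) and (2).
```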
Further, in the present embodiment, the position information of the first image capturing device 10 includes a first GPS location and first orientation information, and the position information of the second image capturing device 20 includes a second GPS location and second orientation information.
Preferably, the photographing angle range of the first image pickup device 10 or the photographing angle range of the second image pickup device 20 in the present embodiment is a horizontal angle range or a horizontal angle of view of the photographing range.
The offset angle in this embodiment is a relative offset angle of the target object 30 with respect to the field angle of the image capturing device, or with respect to the start angle or central angle of the shooting angle range.
The image capturing device of this embodiment may employ a camera to acquire images, for example continuously acquiring dynamic video or intermittently acquiring static pictures. In a system with multiple cameras, an intersection camera acquiring a target object 30 entering a new road section may trigger the image acquisition of the other cameras on that section; likewise, an entrance camera acquiring a target object 30 entering a site may trigger the image acquisition of the other cameras in the site or in a specific area within it. The specific acquisition mode is set according to actual needs.
Further, in the present embodiment, the position information of the first image capturing device 10 and the second image capturing device 20 may be manually recorded in the system when the first image capturing device 10 and the second image capturing device 20 are installed, or may be obtained according to the positioning devices of the first image capturing device 10 and the second image capturing device 20. The positioning device may be built in the image acquisition device or may be externally provided, and is mainly used to acquire position information such as GPS information. The positioning device may also be a built-in or external gyroscope for acquiring the orientation information of the image capturing device, such as the rotation angle, but the invention is not limited thereto.
Before the position of the target object 30 is calculated in step C, the position information of the first image capturing device 10 and the second image capturing device 20 is acquired. For example, if the first image capturing device 10 and the second image capturing device 20 are fixed, their position information may be pre-stored in a database; when the identification information of the two image capturing devices is obtained, the corresponding position information is extracted from the database and analyzed. As another example, the first image capturing device 10 and the second image capturing device 20, whether fixed or mobile, may upload the GPS information and/or orientation information obtained by their built-in or associated GPS and orientation devices when uploading their images.
The positioning method of this embodiment can position a fixed object as well as an object in motion. Preferably, the positioning method of this embodiment monitors a specific object by positioning it while in motion, or monitors moving objects in a specific area or scene in real time. A specific object can be monitored within a certain area, and the shooting angle range or field angle, field-of-view range, scene, shooting area and the like can be adjusted by adjusting the focal length and lens direction of the image capturing device. For example, vehicles traveling on a road can be monitored in real time, as can people entering and leaving venues such as conference sites and stations.
Further, the size of the first image I1 or the second image I2 of this embodiment is expressed as width × height. The width (w1 or w2) of the first image I1 or the second image I2, the first target image coordinates (image_x1, image_y1) of the target object 30 in the first image I1, and the second target image coordinates (image_x2, image_y2) of the target object 30 in the second image I2 may be obtained from the number of image pixels or from measured dimensions. The above coordinates may be those of a specific feature point of the target image, such as its center position, center-of-gravity position, leftmost edge or rightmost edge, and the invention is not limited thereto.
In the present embodiment, the relative position of the target object 30 in the first image I1 is determined based on the first target image abscissa image_x1 and the width w1 of the first image I1. The relative position of the target object 30 in the second image I2 is determined in the same way, which is not repeated herein.
Preferably, the shooting angle range or field angle in this embodiment is a horizontal shooting angle range or horizontal field angle. Images of a stationary or moving object, such as vehicles traveling on a road, are collected according to the horizontal shooting angle range or horizontal field angle projected downward by the first image capturing device 10 and the second image capturing device 20, so as to obtain the relative position of the target object 30 in the first and second images.
The shooting angle range, or field angle, of this embodiment is related to the image sensor (e.g., CCD) size and the lens focal length: horizontal field angle = 2 × arctan(w′/2f); vertical field angle = 2 × arctan(h′/2f); diagonal field angle = 2 × arctan(d′/2f); where w′ is the width of the image sensor (e.g., CCD), h′ is its height, d′ is its diagonal length, and f is the focal length.
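The field-angle formula above can be computed directly; a minimal sketch (the function name and the full-frame sensor/lens numbers are illustrative assumptions):

```python
import math

def field_of_view_deg(sensor_dim_mm, focal_len_mm):
    """Angle of view for one sensor dimension: 2 * arctan(d' / (2f)).
    Pass the sensor width for the horizontal field angle, the height
    for the vertical one, or the diagonal for the diagonal one."""
    return math.degrees(2 * math.atan(sensor_dim_mm / (2 * focal_len_mm)))

# A 36 mm-wide sensor behind an 18 mm lens: arctan(36/36) = 45 deg,
# so the horizontal field angle is about 90 degrees.
hfov = field_of_view_deg(36.0, 18.0)   # ≈ 90.0 degrees
```

Note how zooming in (larger f) shrinks the result, matching the later remark that the field angle (θ11 − θ10) or (θ21 − θ20) becomes smaller when the lens zooms.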
Further, a first offset angle θ'1 of the target object 30 relative to the start line of the shooting angle range or field angle of the first image capturing device 10 is estimated according to the relative position of the target object 30 in the first image I1 and the shooting angle range or field angle of the first image capturing device 10. A second offset angle θ'2 of the target object 30 relative to the start side/start line of the shooting angle range or field angle of the second image capturing device 20 is estimated according to the relative position of the target object 30 in the second image I2 and the shooting angle range or field angle of the second image capturing device 20.
As shown in fig. 2, 3 and 4, further, in step B:
The first offset angle θ'1 is: θ'1 = (image_x1) × (θ11 − θ10) / w1 ………………(1)
Wherein (image_x1)/w1 is the relative position of the target object 30 in the first image I1, image_x1 is the abscissa of the target object 30 in the first image I1, w1 is the width of the first image I1, (θ11 − θ10) indicates the shooting angle range or field angle of the first image capturing device 10, and θ10 and θ11 are respectively the start angle and the end angle of the shooting angle range of the first image capturing device 10.
The second offset angle θ'2 is: θ'2 = (image_x2) × (θ21 − θ20) / w2 ………………(2)
Wherein (image_x2)/w2 is the relative position of the target object 30 in the second image I2, image_x2 is the abscissa of the target object 30 in the second image I2, w2 is the picture width of the second image I2, (θ21 − θ20) indicates the shooting angle range or field angle of the second image capturing device 20, and θ20 and θ21 are respectively the start angle and the end angle of the shooting angle range of the second image capturing device 20.
In the present embodiment, θ10 and θ11 may be relative angles with respect to the current pointing direction of the first image capturing device 10, and θ20 and θ21 may be relative angles with respect to the current pointing direction of the second image capturing device 20. As shown in fig. 2 and 3, the angles associated with the first image capturing device 10 are expressed relative to the current pointing angle θ12 of the first image capturing device 10, and the angles associated with the second image capturing device 20 are expressed relative to the current pointing angle θ22 of the second image capturing device 20. In this case, θ10 and θ11 are generally equal, and θ20 and θ21 are generally equal; the fields of view of the image capturing devices 10 and 20 are generally symmetric about their optical axes.
For example, when the target object image 31 or 32 is to the left of the central axis in image I1 or I2 in fig. 2, the abscissa image_x1 or image_x2 (measured from the central axis in this convention) is a negative value, and the obtained θ'1 or θ'2 is a negative value; when the target object image 31 or 32 is to the right of the central axis in image I1 or I2 in fig. 2, the abscissa is a positive value, and the obtained θ'1 or θ'2 is a positive value. At this time, the line connecting the target object 30 and the first image capturing device 10 forms an angle θ1 with the x-axis (that is, the normalized first offset angle), and the line connecting the target object 30 and the second image capturing device 20 forms an angle θ2 with the x-axis (that is, the normalized second offset angle).
At this time, based on the coordinate relationships, θ1 = θ12 + θ'1 − (θ11 − θ10)/2 and θ2 = θ22 − ((θ21 − θ20)/2 − θ'2).
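Both normalization formulas reduce to the same rule: the camera's pointing angle plus the in-frame offset, minus half the shooting angle range (the θ2 form, θ22 − ((θ21 − θ20)/2 − θ'2), is algebraically identical). A minimal sketch, with the function name and sample angles being illustrative assumptions:

```python
def absolute_bearing(theta_point, theta_offset, theta_start, theta_end):
    """Normalize an in-frame offset angle theta' into the bearing of the
    target's line of sight: theta = theta_point + theta'
    - (theta_end - theta_start) / 2, i.e. the pointing angle plus the
    offset measured from the centre of the shooting angle range."""
    return theta_point + theta_offset - (theta_end - theta_start) / 2

# A camera pointing at 90 deg with a 60-deg shooting angle range; a
# target measured 45 deg from the range's start line bears
# 90 + 45 - 30 = 105 deg.
bearing = absolute_bearing(90.0, 45.0, 0.0, 60.0)   # 105.0
```

A target exactly on the optical axis (offset of 30 deg here) would come out at the pointing angle itself, as expected.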
Further, θ10 and θ11 of this embodiment may be fixed values, or variable values adjusted according to the depth of field or focal length of the first image capturing device 10; θ20 and θ21 may be fixed values, or variable values adjusted according to the depth of field or focal length of the second image capturing device 20. For example, when an image capturing device adjusts its lens to perform optical or digital zooming, the image of the target object is enlarged in the field of view, and the field angle (θ11 − θ10) or (θ21 − θ20) of the device becomes smaller. In other words, the image capturing device 10/20 may be a fixed-focus device with a fixed field-of-view range, or a variable-focus device whose field-of-view range varies with the adjustment of the focal length. The adjustment of the focal length includes, but is not limited to, changing the lens, optical zooming and digital zooming.
In another embodiment, θ10 and θ11 may also be absolute angles relative to a preset initial position of the first image capturing device 10, and θ20 and θ21 absolute angles relative to a preset initial position of the second image capturing device 20. As shown in fig. 4, the angles associated with the first image capturing device 10 are all expressed relative to a first coordinate system established with the first image capturing device 10 as the origin, namely the first offset angle θ'1, the pointing angle θ12 of the first image capturing device 10 (i.e., the lens central axis, optical axis S1), and the start angle θ10 and end angle θ11 of its shooting angle range. The angles associated with the second image capturing device 20 are all expressed relative to a second coordinate system established with the second image capturing device 20 as the origin, namely the second offset angle θ'2, the pointing angle θ22 of the second image capturing device 20 (i.e., the lens central axis, optical axis S2), and the start angle θ20 and end angle θ21 of its shooting angle range. The 0-degree direction and the rotation direction of the two coordinate systems are the same. The θ1 and θ2 of the previous embodiment can be obtained by similar reasoning, which is not repeated herein.
In one embodiment, step C includes: obtaining a third distance and a third angle of the connecting line between the first image capturing device and the second image capturing device according to the position information of the first and second image capturing devices; and obtaining the position of the target object according to the third distance, the third angle, the first offset angle and the second offset angle.
In another embodiment, assume that the position coordinates of the target object 30 are (X0, Y0), the position coordinates of the first image capturing device 10 are (x1, y1), and those of the second image capturing device 20 are (x2, y2); the position coordinates may be GPS coordinates. d1 represents the distance from the first image capturing device 10 to the target object 30, d2 the distance from the second image capturing device 20 to the target object 30, θ1 the first offset angle of the target object 30 relative to the first image capturing device 10, and θ2 the second offset angle of the target object 30 relative to the second image capturing device 20. Step C includes the following geometric relationships according to the triangulation principle:
X0 = x1 + d1 × sinθ1 ………………(4)
X0 = x2 + d2 × sinθ2 ………………(5)
Y0 = y1 + d1 × cosθ1 ………………(6)
Y0 = y2 + d2 × cosθ2 ………………(7)
Solving equations (4)-(7) above for d1 and d2 yields equation (3):
d1 = ((y2 − y1) × sinθ2 − (x2 − x1) × cosθ2) / sin(θ2 − θ1) ………………(3)
d2 = ((y2 − y1) × sinθ1 − (x2 − x1) × cosθ1) / sin(θ2 − θ1)
Since x1, y1, x2 and y2 have been obtained in the foregoing manner, d1 and d2 can be calculated according to equation (3); X0 is then obtained by calculation according to equation (4) or (5), and Y0 according to equation (6) or (7), i.e., the position coordinates of the target object 30 are obtained.
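The whole triangulation step can be sketched as follows. This is a minimal illustration under the patent's conventions (bearings measured from the y-axis, since equations (4)-(7) use sin for X and cos for Y); the function name, the degenerate-geometry guard, and the sample coordinates are assumptions added for the sketch:

```python
import math

def locate_target(x1, y1, theta1_deg, x2, y2, theta2_deg):
    """Triangulate the target position from the two camera positions
    (x1, y1), (x2, y2) and the normalized bearings theta1, theta2.
    Solving equations (4)-(7) for d1 gives equation (3):
        d1 = ((y2 - y1)*sin(t2) - (x2 - x1)*cos(t2)) / sin(t2 - t1)
    then (X0, Y0) follows from equations (4) and (6)."""
    t1, t2 = math.radians(theta1_deg), math.radians(theta2_deg)
    denom = math.sin(t2 - t1)
    if abs(denom) < 1e-12:
        # Parallel lines of sight: the two bearings never intersect.
        raise ValueError("lines of sight are parallel; no position fix")
    d1 = ((y2 - y1) * math.sin(t2) - (x2 - x1) * math.cos(t2)) / denom
    return x1 + d1 * math.sin(t1), y1 + d1 * math.cos(t1)

# Cameras at (0, 0) and (10, 0); a target at (5, 5) is seen at a
# 45-deg bearing from the first camera and -45 deg from the second.
x0, y0 = locate_target(0.0, 0.0, 45.0, 10.0, 0.0, -45.0)  # ≈ (5.0, 5.0)
```

The guard on sin(θ2 − θ1) reflects a real limitation of the method: when the target lies on the baseline through both cameras, the system (4)-(7) is degenerate and no unique position exists.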
The positioning system provided by the present invention may include a plurality of image capturing devices, in which two adjacent image capturing devices monitor and cover the same area, as shown in fig. 5; by adopting the above positioning method, the whole monitoring area is monitored and covered.
The lenses of the first image capturing device 10 and the second image capturing device 20 of the present embodiment may be adjusted or controlled to be adjusted according to needs, and the orientation, the angle, and the parameters may be adjusted according to needs; the coordinate plane may be established by selecting three corresponding points according to the first image capturing device 10, the second image capturing device 20, and the target 30, or the coordinate system may be established by converting the first image capturing device 10, the second image capturing device 20, and the target 30 into one plane to perform the correlation calculation.
In this embodiment, image capturing devices may be set at different positions along a road, and may be staggered on the two sides of the road to enlarge the monitoring range, so as to perform positioning monitoring on moving objects or vehicles on the road, or targeted monitoring on a specific object or vehicle; the image capturing devices may be controlled to rotate their lenses, adjust their focal lengths and so on to position and monitor the object. The positioning method of this embodiment is not limited to straight roads and is also applicable to curved roads or trails: the first image capturing device 10 and the second image capturing device 20 are staggered on different sides relative to the target object 30, and each device pairs with an adjacent device on the opposite side to position and monitor the object. "Opposite sides" in this embodiment refers only to different direction sides of the target object 30, or different sides of the road; it is not limited to symmetric placement, and the devices may be arranged on the two sides of a curved segment, of multiple curved segments, or of a continuously turning lane.
Referring to FIG. 5, a schematic diagram of an embodiment of the positioning system of the present invention is shown. The system includes a monitoring device and at least two image acquisition devices (10, 20), the monitoring device being communicatively connected to the at least two image acquisition devices, wherein the at least two image acquisition devices include a first image acquisition device 10 and a second image acquisition device 20;
the first image acquisition device 10 is used for acquiring a first image I1 of the object 30;
the second image capturing device 20 is used for acquiring a second image I2 of the object 30;
the monitoring device obtains a first offset angle of the object 30 relative to the first image capturing device 10 according to the relative position of the object 30 in the first image I1 and the shooting angle range of the first image capturing device 10, obtains a second offset angle of the object 30 relative to the second image capturing device 20 according to the relative position of the object 30 in the second image I2 and the shooting angle range of the second image capturing device 20, and obtains the position of the object 30 according to the position information or the relative position information of the first image capturing device 10, the second image capturing device 20 and the object 30, and the first offset angle and the second offset angle.
Further, in the present embodiment, the position information of the first image acquisition device 10 includes a first GPS position and first orientation information, and the position information of the second image acquisition device 20 includes a second GPS position and second orientation information.
Preferably, the shooting angle range of the first image acquisition device 10 or of the second image acquisition device 20 in the present embodiment is the horizontal angle range of the shooting range, i.e., the horizontal field of view.
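Under this horizontal-field-of-view interpretation, the mapping from the target's horizontal pixel coordinate to its offset angle (formulas (1) and (2) of the claims) can be sketched as follows; the linear pixel-to-angle mapping assumes lens distortion is negligible or already corrected, and the function name is illustrative:

```python
def pixel_to_offset_angle(image_x, frame_width, theta_start, theta_end):
    """Linearly map a horizontal pixel coordinate image_x to an offset
    angle inside the shooting angle range [theta_start, theta_end].
    Implements theta' = image_x * (theta_end - theta_start) / frame_width;
    units follow whatever theta_start and theta_end use."""
    return image_x * (theta_end - theta_start) / frame_width
```

For example, a target at pixel 960 of a 1920-pixel-wide frame whose horizontal field of view spans 0 to 60 degrees lies 30 degrees from the start of the range.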
The monitoring device of this embodiment may be a module integrated into one of the image acquisition devices, an independent data processing device, a server, a monitoring center, or the like. The operation of the first image acquisition device 10 and the second image acquisition device 20 may also be controlled via the monitoring device, for example to switch operating states, control the lens viewing area or capture area, select a target object, and so on. Each image acquisition device may be communicatively connected to the monitoring device in a wired or wireless manner to upload the corresponding image information, and may receive control information from the monitoring device so as to adjust its own operating state, and so on.
In addition, the positioning system may also include a database that stores and backs up the position information of each image acquisition device and the acquired images, to facilitate subsequent online or offline analysis and processing.
The above-mentioned embodiments express only several embodiments of the present invention, and their description is relatively specific and detailed, but it should not be construed as limiting the scope of the present invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, and these fall within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The present invention has been described in relation to the above embodiments, which are only exemplary of the implementation of the present invention. It should be noted that the disclosed embodiments do not limit the scope of the invention. Rather, it is intended that all such modifications and variations be included within the spirit and scope of this invention.
Claims (10)
1. A method of positioning, comprising:
A. the method comprises the steps that a first image acquisition device acquires a first image of a target object, and a second image acquisition device acquires a second image of the target object;
B. acquiring a first offset angle of the target relative to the first image acquisition device according to the relative position of the target in the first image and the shooting angle range of the first image acquisition device, and acquiring a second offset angle of the target relative to the second image acquisition device according to the relative position of the target in the second image and the shooting angle range of the second image acquisition device;
C. and obtaining the position of the target object according to the position information or the relative position information of the first image acquisition device, the second image acquisition device and the target object, the first offset angle and the second offset angle.
2. The method according to claim 1, wherein in step a, the first image and the second image are respectively a motion picture frame or a still picture acquired by the first image capturing device and the second image capturing device at the same time.
3. The positioning method according to claim 1, wherein in step a: and identifying a common object in the first image and the second image as the target object.
4. The positioning method according to claim 1, wherein in step B: the first offset angle θ1' is: θ1' = (image_x1) × (θ11 - θ10) / w1 ………………(1)
wherein image_x1 is the abscissa of the target object in the first image, w1 is the frame width of the first image, (θ11 - θ10) indicates the shooting angle range of the first image acquisition device, and θ10 and θ11 are respectively the start angle and the end angle of the shooting angle range of the first image acquisition device;
the second offset angle θ2' is: θ2' = (image_x2) × (θ21 - θ20) / w2 ………………(2)
wherein image_x2 is the abscissa of the target object in the second image, w2 is the frame width of the second image, (θ21 - θ20) indicates the shooting angle range of the second image acquisition device, and θ20 and θ21 are respectively the start angle and the end angle of the shooting angle range of the second image acquisition device.
5. The positioning method according to claim 4, wherein θ10 and θ11 are absolute angles relative to a preset initial position of the first image acquisition device, or relative angles relative to the current pointing direction of the first image acquisition device; and θ20 and θ21 are absolute angles relative to a preset initial position of the second image acquisition device, or relative angles relative to the current pointing direction of the second image acquisition device.
6. The positioning method according to claim 4, wherein θ10 and θ11 are fixed values, or variable values adjusted according to the depth of field or focal length of the first image acquisition device; and θ20 and θ21 are fixed values, or variable values adjusted according to the depth of field or focal length of the second image acquisition device.
7. The positioning method according to claim 1, wherein step C comprises: obtaining a third distance and a third angle of a connecting line between the first image acquisition device and the second image acquisition device according to the position information of the first image acquisition device and the second image acquisition device; and obtaining the position of the target object according to the third distance, the third angle, the first offset angle and the second offset angle.
8. The positioning method according to claim 1, wherein step C specifically comprises: according to the triangulation principle, obtaining a triangular geometric relation:
d is obtained by calculation according to the formula (3)1Then obtaining X by calculation according to the formula (4)0Y is obtained by calculation according to the formula (6)0(ii) a Alternatively, d is obtained by calculation according to equation (3)2Then obtaining X by calculation according to the formula (5)0Y is obtained by calculation according to the formula (7)0;
wherein the position coordinates of the first image acquisition device are expressed as (x1, y1), the position coordinates of the second image acquisition device are expressed as (x2, y2), (X0, Y0) represents the position coordinates of the target object, d1 represents the distance from the first image acquisition device to the target object, d2 represents the distance from the second image acquisition device to the target object, θ1 represents the first offset angle of the target object relative to the first image acquisition device, and θ2 represents the second offset angle of the target object relative to the second image acquisition device.
9. The positioning method according to claim 1, wherein the position information of the first image capturing device includes a first GPS position and first orientation information, and the position information of the second image capturing device includes a second GPS position and second orientation information.
10. A positioning system, comprising: the monitoring device is in communication connection with the at least two image acquisition devices; the at least two image acquisition devices comprise a first image acquisition device and a second image acquisition device;
the first image acquisition device is used for acquiring a first image of a target object;
the second image acquisition device is used for acquiring a second image of the target object;
the monitoring device is used for acquiring a first offset angle of the target relative to the first image acquisition device according to the relative position of the target in the first image and the shooting angle range of the first image acquisition device, and acquiring a second offset angle of the target relative to the second image acquisition device according to the relative position of the target in the second image and the shooting angle range of the second image acquisition device; the monitoring device is further used for obtaining the position of the target object according to the position information or the relative position information of the first image acquisition device, the second image acquisition device and the target object, the first offset angle and the second offset angle.
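The distance-based route of claims 7 and 8 can be sketched with the law of sines: form the triangle whose base is the line between the two devices (the third distance and third angle of claim 7), solve for the device-to-target distance d1, and project along the first bearing. This is a hypothetical reconstruction under stated assumptions, since the patent's formulas (3)-(7) are not reproduced above; bearings are assumed to be absolute angles in radians, and all names are illustrative.

```python
import math

def _ang_diff(a, b):
    """Smallest absolute difference between two angles, in radians."""
    d = (a - b) % (2 * math.pi)
    return min(d, 2 * math.pi - d)

def locate_by_triangle(p1, p2, bearing1, bearing2):
    """Triangulate the target from device positions p1, p2 and the
    absolute bearings of the two target rays, via the law of sines."""
    x1, y1 = p1
    x2, y2 = p2
    d3 = math.hypot(x2 - x1, y2 - y1)             # baseline length ("third distance")
    base = math.atan2(y2 - y1, x2 - x1)           # baseline direction ("third angle")
    alpha1 = _ang_diff(bearing1, base)            # interior angle at device 1
    alpha2 = _ang_diff(bearing2, base + math.pi)  # interior angle at device 2
    apex = math.pi - alpha1 - alpha2              # angle at the target
    d1 = d3 * math.sin(alpha2) / math.sin(apex)   # law of sines: d1/sin(alpha2) = d3/sin(apex)
    # Project distance d1 along the first bearing to get the target position
    return (x1 + d1 * math.cos(bearing1), y1 + d1 * math.sin(bearing1))
```

With devices at (0, 0) and (10, 0) and bearings of 45 and 135 degrees, the interior angles are both 45 degrees, d1 is about 7.07, and the target comes out at (5, 5), matching the ray-intersection view of the same geometry.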
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910780352.6A CN110595443A (en) | 2019-08-22 | 2019-08-22 | Projection device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110595443A true CN110595443A (en) | 2019-12-20 |
Family
ID=68855270
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910780352.6A Pending CN110595443A (en) | 2019-08-22 | 2019-08-22 | Projection device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110595443A (en) |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004325072A (en) * | 2003-04-21 | 2004-11-18 | Mamoru Otsuki | Photogrammetry and photogrammetric program |
CN101943580A (en) * | 2009-07-07 | 2011-01-12 | 宏达国际电子股份有限公司 | Method and device for detecting distance from target and computer program product thereof |
CN102622767A (en) * | 2012-03-05 | 2012-08-01 | 广州乐庚信息科技有限公司 | Method for positioning binocular non-calibrated space |
US20130297205A1 (en) * | 2012-05-02 | 2013-11-07 | Korea Institute Of Science And Technology | System and method for indoor navigation |
CN104021538A (en) * | 2013-02-28 | 2014-09-03 | 株式会社理光 | Object positioning method and device |
CN104915965A (en) * | 2014-03-14 | 2015-09-16 | 华为技术有限公司 | Camera tracking method and device |
CN104933718A (en) * | 2015-06-23 | 2015-09-23 | 广东省自动化研究所 | Physical coordinate positioning method based on binocular vision |
CN105072414A (en) * | 2015-08-19 | 2015-11-18 | 浙江宇视科技有限公司 | Method and system for detecting and tracking target |
CN105550670A (en) * | 2016-01-27 | 2016-05-04 | 兰州理工大学 | Target object dynamic tracking and measurement positioning method |
CN106597424A (en) * | 2016-12-22 | 2017-04-26 | 惠州Tcl移动通信有限公司 | Distance measuring method and distance measuring system based on dual cameras, and mobile terminal |
CN106813649A (en) * | 2016-12-16 | 2017-06-09 | 北京远特科技股份有限公司 | A kind of method of image ranging localization, device and ADAS |
CN107292906A (en) * | 2017-08-11 | 2017-10-24 | 阔地教育科技有限公司 | A kind of method for tracking target, storage device and target tracker |
Non-Patent Citations (1)
Title |
---|
ZHAO Xia et al.: "Research Progress of Vision-Based Target Positioning Technology", Computer Science * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113192139A (en) * | 2021-05-14 | 2021-07-30 | 浙江商汤科技开发有限公司 | Positioning method and device, electronic equipment and storage medium |
WO2022237071A1 (en) * | 2021-05-14 | 2022-11-17 | 浙江商汤科技开发有限公司 | Locating method and apparatus, and electronic device, storage medium and computer program |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108111818B (en) | Moving target actively perceive method and apparatus based on multiple-camera collaboration | |
CN107438173B (en) | Video processing apparatus, video processing method, and storage medium | |
CN109151439B (en) | Automatic tracking shooting system and method based on vision | |
US9497388B2 (en) | Zooming factor computation | |
CN103905792B (en) | A kind of 3D localization methods and device based on PTZ CCTV cameras | |
KR101172747B1 (en) | Camera tracking monitoring system and method using thermal image coordinates | |
US8098290B2 (en) | Multiple camera system for obtaining high resolution images of objects | |
US9641754B2 (en) | Monitoring camera for generating 3-dimensional image and method of generating 3-dimensional image using the same | |
US9071819B2 (en) | System and method for providing temporal-spatial registration of images | |
JP2016100696A (en) | Image processing device, image processing method, and image processing system | |
JP2006191524A (en) | Auto framing device and photographing device | |
CN109905641B (en) | Target monitoring method, device, equipment and system | |
CN111914592B (en) | Multi-camera combined evidence obtaining method, device and system | |
Neves et al. | Acquiring high-resolution face images in outdoor environments: A master-slave calibration algorithm | |
JP2011030015A5 (en) | ||
CN111314609A (en) | Method and device for controlling pan-tilt tracking camera shooting | |
CN109543496B (en) | Image acquisition method and device, electronic equipment and system | |
JP6065629B2 (en) | Object detection device | |
KR101033237B1 (en) | Multi-function detecting system for vehicles and security using 360 deg. wide image and method of detecting thereof | |
JP2003179800A (en) | Device for generating multi-viewpoint image, image processor, method and computer program | |
US20190149740A1 (en) | Image tracking device | |
CN110595443A (en) | Projection device | |
JP2018201146A (en) | Image correction apparatus, image correction method, attention point recognition apparatus, attention point recognition method, and abnormality detection system | |
KR20170133666A (en) | Method and apparatus for camera calibration using image analysis | |
JP2010014699A (en) | Shape measuring apparatus and shape measuring method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20191220 |