CN101464132A - Position confirming method and apparatus - Google Patents

Position confirming method and apparatus

Info

Publication number
CN101464132A
Authority
CN
China
Prior art keywords
image
preset
target object
coordinate
imaging
Prior art date
Legal status
Granted
Application number
CNA2008102474971A
Other languages
Chinese (zh)
Other versions
CN101464132B (en)
Inventor
邓亚峰
Current Assignee
Mid Star Technology Ltd By Share Ltd
Original Assignee
Vimicro Corp
Priority date
Filing date
Publication date
Application filed by Vimicro Corp
Priority to CN200810247497.1A
Publication of CN101464132A
Application granted
Publication of CN101464132B
Status: Active

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

An embodiment of the invention provides a position determination method and a device therefor. The method comprises the following steps: capturing an image containing an imaging of a target object; acquiring area description information describing the area occupied by the imaging on the image; and determining the spatial position of the target object by using the area description information together with preset relation parameters that embody the association between the area description information and the spatial position of the target object. The technical scheme of the embodiments of the invention has the advantages that the spatial position can be determined more accurately, and the unique spatial position of the target object relative to the image capturing device can further be determined.

Description

Position determination method and device
Technical Field
The invention relates to the technical field of image processing, in particular to a position determining method and device.
Background
The continuous development of digital image processing technology has advanced the digitization of image capturing equipment and its wide application in ever more fields. In some cases, it is desirable to know, based on image processing technology, the spatial position of a photographed target relative to the image capturing device, and then to perform other operations based on that known spatial position. For example, in a simulation application, the position description information of the spatial position can feed related calculations, such as predicting or determining whether a person in front of the image capturing device will meet an obstacle, so as to test the person's agility. As another example, the position description information of the spatial position can be provided to a decision maker to assist in making relevant decisions, and so on.
In the prior art, for the case where the photographed object is a person, a technical scheme has been provided for determining position description information of the person's spatial position relative to the image capturing device based on the captured image. For convenient processing, the prior art abstracts the person into a cylinder, with facial feature points such as the eyes and mouth distributed on the cylinder.
Referring to fig. 1, fig. 1 is a schematic projection diagram of a person and an image capturing device viewed from above in the prior art. In fig. 1, a two-dimensional rectangular coordinate system calibrates the relevant position information: the origin O(0, 0) is set as the position of the image capturing device; H(x, y) is the center of the circle formed by projecting the cylinder onto the coordinate plane, and the radius of that circle is set as r; the direct distance between H(x, y) and O(0, 0) is set as D. On the coordinates shown in fig. 1, the sector enclosed by the left-eye projection, the mouth projection and H(x, y) has included angle α, and the component along the x axis of the distance between those two projections is a; the sector enclosed by the right-eye projection, the mouth projection and H(x, y) likewise has included angle α, and the x-axis component of the distance between those two projections is b. If the face faces the image capturing device, the coordinate origin, the circle center and the mouth projection lie approximately on one straight line, and the included angle between that line and the optical axis of the image capturing device is θ. Here α can be derived statistically from experience, and a and b can be calculated from the projected coordinates of the two eyes and the mouth. To determine the person's position relative to the image capturing device, both D and θ need to be solved.
The prior art mainly comprises:
calculating the direct distance D between the person and the image capturing device from the distance between the person's two eyes on the image, the relevant imaging parameters of the image capturing device (such as the resolution, focal length and physical size of the photosensitive device), and the imaging principle;
the relationship between r and θ is estimated as follows:
rsin(α+θ)-rsinθ=a
rsin (α - θ) + rsin θ ═ b formula 1
Solving Formula 1 for θ:

$\theta = \arctan\!\left[\frac{(b-a)\sin\alpha}{(b+a)(1-\cos\alpha)}\right]$    (Formula 2)
Further, based on the following calculation formula 3, the position information of the person relative to the image capturing device on the rectangular coordinates is obtained:
$x = D\sin\theta, \quad y = D\cos\theta$    (Formula 3)
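As an illustration of the prior-art computation, a minimal sketch (function and variable names are ours, not the patent's) applying Formulas 2 and 3 to measured values of a, b, α and D might look like:

```python
import math

def prior_art_position(a, b, alpha, D):
    """Prior-art scheme: recover the angle theta between the optical axis
    and the line to the person (Formula 2), then the planar position
    (Formula 3). a, b are the x-axis components of the eye-to-mouth
    projection distances, alpha the empirically derived sector angle,
    D the direct person-to-camera distance."""
    theta = math.atan((b - a) * math.sin(alpha)
                      / ((b + a) * (1.0 - math.cos(alpha))))  # Formula 2
    return D * math.sin(theta), D * math.cos(theta)           # Formula 3: (x, y)
```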
The above conventional technique can determine the position information of a photographed object relative to the image capturing apparatus only in the specific case where the feature points of the object hold a required pose relative to the apparatus, such as the person's face directly facing the image capturing device, so that the above expressions apply. In practical applications it is often difficult to guarantee such regularity of the relative positions between the feature points and the image capturing equipment. Moreover, based on Formula 3 above, it is difficult to determine unique position description information of the person relative to the image capturing apparatus.
Therefore, the technical solutions provided by the prior art for determining the position information of a photographed object relative to the image capturing apparatus still need improvement.
Disclosure of Invention
The embodiment of the invention provides a position determining method and device, and aims to solve the technical problem that the spatial position of a target object relative to an image acquisition device is difficult to accurately determine in the prior art.
To solve the above technical problem, an embodiment of the present invention provides a position determining method, including:
acquiring an image including an image of a target object;
acquiring area description information for describing an area occupied by the imaging on the image;
and determining the spatial position by using the area description information and a preset relation parameter capable of embodying the association between the area description information and the spatial position of the target object.
Preferably, the area description information includes: position information of the area on the image, and/or size description information of the imaging.
Preferably, the relationship parameters include: a first relation parameter reflecting the size of the target object;
the determining the spatial location comprises:
and determining the projection position of the target object on a plane perpendicular to the optical axis of the imaging device for acquisition by using a preset algorithm, the first relation parameter, the position information and the size description information.
Preferably, a two-dimensional coordinate system is preset, and coordinate values in a specified dimension direction in coordinates calibrated by the two-dimensional coordinate system are used for representing the number of pixels in the specified dimension direction; or, the coordinate value in the specified dimension direction in the coordinates specified by the two-dimensional coordinate system is used for representing the ratio of the number of pixels in the specified dimension direction to the resolution of the imaging device in the specified dimension direction.
Preferably, the first relation parameter includes: a first length of the target object in a specified dimensional direction;
the position information comprises a coordinate value of the imaging in the specified dimension direction in the two-dimensional coordinate system and is recorded as M;
the size description information comprises a second length of the imaging in the direction of the specified dimension;
the determining the projection position of the target object on a plane perpendicular to the optical axis of the imaging device used for acquisition comprises:
calibrating a space formed by the optical axis and the plane by using a preset three-dimensional coordinate system, wherein a first coordinate axis and a second coordinate axis calibrate the plane, a third coordinate axis coincides with the optical axis, and the forward direction of the third coordinate axis faces away from the image;
and calculating coordinate values of the projection of the target object on the plane on the first coordinate axis and/or the second coordinate axis by using the calculation formula: first length × M / second length.
Preferably, the calculating of the coordinate values, on the first coordinate axis and/or the second coordinate axis, of the projection of the target object on the plane by using the calculation formula first length × M / second length includes:
calculating a first coordinate value of the projection of the target object on the plane on the first coordinate axis by using the calculation formula;
the specified dimension direction comprises a first specified dimension direction and a second specified dimension direction perpendicular to the first specified dimension direction; and the first specified dimension direction is parallel to the first coordinate axis direction; the second specified dimension direction is parallel to the second coordinate axis direction;
recording the coordinate value of the first specified dimension direction as M1; the coordinate value of the second specified dimension direction is M2;
and calculating a second coordinate value projected on the second coordinate axis according to the calculation formula: first coordinate value × M2 / M1.
Preferably, the first relation parameter is preset, including:
measuring in advance the projection of a preset reference object on a reference plane parallel to the plane where the image lies, specifically the physical length of the projection in a specified direction, the physical length being used as the first relation parameter; or,
measuring the pixel length of an image corresponding to the projection in the specified direction, the pixel coordinate of the reference object in the specified direction, and the coordinate value of the projection of the reference object on a plane perpendicular to the optical axis on a coordinate axis parallel to the specified direction in the three-dimensional coordinate system from the acquired reference image containing the reference object image;
according to the calculation formula: first relation parameter = pixel length × (coordinate value on the coordinate axis parallel to the specified direction) / pixel coordinate, calculating the first relation parameter.
Preferably, the relationship parameters include:
a second relation parameter which embodies the association among the resolution of the imaging device used for capture, the size description information of the imaging, and the distance between the target object and a plane perpendicular to the optical axis of the imaging device;
the area description information includes: size descriptive information of the imaging;
the determining the spatial location comprises:
and determining the distance between the target object and a plane perpendicular to the optical axis of the imaging device by using a preset algorithm, the second relation parameter and the size description information.
Preferably, the acquiring the area description information includes:
presetting a two-dimensional coordinate system capable of calibrating the position of each point in the image, and measuring the length Δl of the imaging in the specified dimension direction on the two-dimensional coordinate system;
the determining the distance of the target object from a plane perpendicular to the optical axis of the imaging device comprises:
using the calculation formula:

$\text{distance} = \frac{\text{second relation parameter}}{\Delta l / S}$

calculating the distance, wherein S is the resolution of the imaging device in the specified dimension direction;
coordinate values in a specified dimension direction in coordinates calibrated by the two-dimensional coordinate system are used for representing the number of pixels in the specified dimension direction; or, the coordinate value in the specified dimension direction in the coordinates specified by the two-dimensional coordinate system is used for representing the ratio of the number of pixels in the specified dimension direction to the resolution of the imaging device in the specified dimension direction.
Preferably, the second relation parameter is preset, including:
measuring a reference distance between a preset reference object and a plane perpendicular to the optical axis of the imaging device;
measuring, in a specified direction, a third length of the projection of the reference object on a reference plane parallel to the plane of the image;
using the calculation formula:

$\text{second relation parameter} = \text{reference distance} \times \frac{\text{third length}}{S}$

and calculating the second relation parameter, wherein S is the resolution of the imaging device in the specified direction.
Preferably, the acquiring of the information describing the imaging position and/or size comprises automatically acquiring the position and/or size of the imaging in the image by using an object detection technique.
Preferably, the target object is a human face and/or a human head.
To solve the above technical problem, an embodiment of the present invention further provides an apparatus comprising a position determination device, the position determination device comprising: a capturing unit, an acquiring unit and a determining unit;
the capturing unit is used for capturing an image including the imaging of the target object;
the acquiring unit is used for acquiring area description information for describing the area occupied by the imaging on the image;
the determining unit is configured to determine the spatial position by using the area description information acquired by the acquiring unit and a preset relation parameter capable of embodying the association between the area description information and the spatial position of the target object.
Preferably, the acquiring unit includes:
a first acquiring subunit, configured to acquire the coordinates of the area in a preset two-dimensional coordinate system, the coordinates being used as the position information; the preset two-dimensional coordinate system can calibrate the positions of all points in the image;
and a second acquiring subunit, configured to measure the length Δl of the imaging in the specified dimension direction on the preset two-dimensional coordinate system.
Preferably, the determination unit includes: the device comprises a preset information receiving unit and a first calculating unit;
the preset information receiving unit is used for receiving a preset first relation parameter reflecting the size of the target object;
the first calculating unit is configured to determine, by using a preset algorithm, the first relation parameter received by the preset information receiving unit, and the area description information acquired by the acquiring subunit, a position of a projection of the target object on a plane perpendicular to an optical axis of the imaging device used for acquisition.
Preferably, the determination unit includes: the preset information receiving unit and the second calculating unit;
the preset information receiving unit is used for receiving a preset second relation parameter which embodies the association among the resolution of the imaging device used by the capturing unit, the size description information of the imaging, and the distance between the target object and a plane perpendicular to the optical axis of the imaging device;
the second calculating unit is configured to determine, by using a preset algorithm, the second relation parameter received by the preset information receiving unit, and the information acquired by the second acquiring subunit, a distance from the target object to a plane perpendicular to the optical axis of the imaging device.
Compared with the prior art, the technical scheme provided by the embodiment of the invention has the following beneficial effects:
according to the position determining method and device provided by the embodiment of the invention, the area description information of the area occupied by the imaging of the target object is acquired from the acquired image, and the spatial position of the target object is determined by combining the preset relation parameters.
Drawings
FIG. 1 is a schematic view of a prior art human and image capture device in a top view;
FIG. 2 is a flow chart of a method of position determination in an embodiment of the present invention;
FIG. 3 is a schematic view of an embodiment of the present invention for calibrating an object and the space in which it is imaged;
FIG. 4 is a flow chart of determining a position of a target object in an embodiment of the invention;
fig. 5 is a schematic structural diagram of a position determination device in an embodiment of the present invention.
Detailed Description
In order to accurately determine the spatial position of a target object relative to image capturing equipment based on a captured image, the inventor, in the course of implementing the invention, carried out derivations based on the projection rules between a real object and its imaging, and converted relational expressions containing hard-to-determine parameters into relational expressions composed of parameters that are comparatively easy to determine. This simplifies the computation needed to determine the spatial position of an object relative to the image capturing equipment and effectively improves the accuracy of the resulting position description information.
The technical solutions provided by the embodiments of the present invention are described in detail below with reference to specific embodiments and accompanying drawings.
Referring to fig. 2, fig. 2 is a flow chart of a position determination method in an embodiment of the invention, which may include the following steps:
step 201, acquiring an image including the target object image.
In step 202, area description information for describing the area occupied by the imaging on the image is obtained.
In an embodiment of the present invention, the area description information may include: position information of the imaging area on the image, size description information of the imaging area, and the like.
Step 203, determining the spatial position by using the area description information and a preset relation parameter capable of embodying the association between the area description information and the spatial position of the target object.
In an embodiment of the invention, the relation parameters used may be preset; that is, relation parameters determined in advance serve as known quantities when determining the spatial position of the target object relative to the image capturing device. Therefore, based on the known relation parameters and the area description information acquired from the image, the spatial position of the target object relative to the image capturing equipment can be determined with the relatively simple calculation formulas previously derived by the inventor.
In the process of presetting the relation parameters, the relation parameters can be determined in advance by capturing the imaging of a reference object. For ease of distinction, the imaging of the reference object is referred to as the reference imaging. The reference object may be the target object itself, or another object, such as one that can be treated as analogous to the target object.
Specifically, the relation parameters in the embodiments of the present invention may include: a first relation parameter representing the size of the target object, and/or a second relation parameter embodying the association among the resolution of the imaging device used for capture, the size description information of the imaging, and the distance between the target object and a plane perpendicular to the optical axis of the imaging device.
In the embodiment of the invention, in the process of determining the spatial position of the target object relative to the image acquisition equipment, if the selected area description information is the position information of the imaging area on the image, the corresponding selected relation parameter is a first relation parameter; and if the selected area description information is the size description information of the imaging area, the corresponding selected relation parameter is the second relation parameter.
Compared with the prior art, the technical scheme provided by the embodiments of the invention does not strictly limit the category of the target object. It is therefore applicable to scenes beyond those in which the target object is a person, and even when the target object is a person it does not impose the limiting condition that the person's face directly face the image capturing equipment.
The following describes in detail the implementation of the technical solutions of the embodiments of the present invention in conjunction with the derivations performed by the inventor; the object involved in the derivation process is referred to as the reference object.
In practical applications, a two-dimensional matrix is used to represent the image; the concrete representation form of the area description information can be given by a preset image coordinate system, and the coordinates calibrated by the image coordinate system can be regarded as indexing the matrix elements of the two-dimensional matrix. Image coordinate systems that may be employed include the image pixel coordinate system, the image physical coordinate system, and the like. In the embodiments of the present invention, for convenience, the image pixel coordinate system is referred to as the first coordinate system, and the image physical coordinate system as the second coordinate system.
The two-dimensional coordinates calibrated by the first and second coordinate systems each comprise the coordinate values of a pixel's position in two dimension directions; the coordinate value in one dimension direction can be called the row serial number in the two-dimensional matrix, and the coordinate value in the other dimension direction the column serial number. For the row and column serial numbers calibrated by the first coordinate system, the unit is the number of pixels; for those calibrated by the second coordinate system, the unit is a unit of length, such as centimeters (cm).
Once an image coordinate system is selected, the concrete expression form of the position information of the imaging area on the image comprises the coordinate values of a representative point of the imaging area in the different dimensions; the size description information of the imaging area may include the lengths of the imaging area in the different dimensions, specifically the length of the imaging area in one specified dimension direction and its width in the other specified dimension direction.
And for the same pixel point, a mapping relation exists between the coordinate calibrated by the first coordinate system and the coordinate calibrated by the second coordinate system. Referring to fig. 3, fig. 3 is a schematic diagram of a space for calibrating an object and imaging the object according to an embodiment of the present invention. The first coordinate system comprises a first origin, and a u-axis and a v-axis which pass through the first origin and are perpendicular to each other, wherein the unit of the u-axis and the unit of the v-axis are the number of pixels. Let the second coordinate system include a second origin, and an x-axis and a y-axis respectively passing through the second origin, and the unit of the x-axis and the y-axis is a length unit, such as cm. Let the x-axis of the second coordinate system be parallel to the column axis of the first coordinate system and the y-axis be parallel to the row axis of the first coordinate system. If the coordinate of a certain pixel point on the first coordinate system is (u, v), the mapping relationship between the coordinates (x, y) and (u, v) of the pixel point in the second coordinate system is:
$u = \frac{x}{dx} + u_0$    (Formula 4)

$v = \frac{y}{dy} + v_0$    (Formula 5)
Wherein dx and dy are the physical lengths occupied by a single pixel point in the x-axis direction and the y-axis direction respectively.
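As a small illustrative helper (names are ours; the default u0 = v0 = 0 follows the note later in this section that both constants may be set to zero), the mapping of Formulas 4 and 5 and its inverse (Formulas 8 and 9 below) amount to:

```python
def physical_to_pixel(x, y, dx, dy, u0=0.0, v0=0.0):
    """Formulas 4 and 5: map second-coordinate-system (physical) coordinates
    (x, y) to first-coordinate-system (pixel) coordinates (u, v); dx and dy
    are the physical lengths of one pixel along the x and y axes."""
    return x / dx + u0, y / dy + v0

def pixel_to_physical(u, v, dx, dy, u0=0.0, v0=0.0):
    """The inverse mapping (Formulas 8 and 9 below)."""
    return (u - u0) * dx, (v - v0) * dy
```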
A third coordinate system is set to calibrate the spatial position of the reference object relative to the image capturing equipment, taking the position of the image capturing equipment as the reference. The third coordinate system is a three-dimensional rectangular coordinate system whose third origin can be set at the position of the optical center of the imaging device on the image capturing equipment; its z axis coincides with the optical axis and points forward toward the object, and the plane formed by its X axis and Y axis is perpendicular to the optical axis of the imaging device. Referring to fig. 3, fig. 3 is a schematic diagram of calibrating the space formed by an object and its imaging according to an embodiment of the present invention. Record the coordinate of the spatial position of the target object as $B(X_c, Y_c, Z_c)$; this coordinate describes the spatial position of the target object, and what the embodiments of the invention seek to derive are formulas for calculating one or more of the coordinate values in this coordinate. The third coordinate system may also be referred to as the world coordinate system.
In practical applications of capturing images with image capturing equipment, a device coordinate system is often adopted to calibrate the relevant information. In the embodiments of the present invention, for convenience of derivation, the third coordinate system is taken to coincide with the device coordinate system, whereupon the perspective projection transformation of the image captured by the image capturing device simplifies to:

$$Z_c \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix}$$
further, the method can be obtained as follows:
<math> <mrow> <mi>X</mi> <mo>=</mo> <mfrac> <mrow> <mi>x</mi> <mo>&CenterDot;</mo> <mi>Z</mi> </mrow> <mi>f</mi> </mfrac> </mrow></math> formula 6
<math> <mrow> <mi>Y</mi> <mo>=</mo> <mfrac> <mrow> <mi>y</mi> <mo>&CenterDot;</mo> <mi>Z</mi> </mrow> <mi>f</mi> </mfrac> </mrow></math> Formula 7
From Formula 4:

$x = (u - u_0) \cdot dx$    (Formula 8)

From Formula 5:

$y = (v - v_0) \cdot dy$    (Formula 9)
Substituting Formula 8 into Formula 6 gives:

$X_c = (u - u_0) \cdot dx \cdot \frac{Z_c}{f}$    (Formula 10)

Substituting Formula 9 into Formula 7 gives:

$Y_c = (v - v_0) \cdot dy \cdot \frac{Z_c}{f}$    (Formula 11)
where $u_0$ and $v_0$ are constants, which can be set to zero for convenience of calculation.
According to the imaging principle of the imaging device on the image capturing equipment, the image distance can be approximated by the focal length f of the imaging device, so that:

$\frac{h}{H} = \frac{f}{A}$

where h is the length of the image in one dimension, H is the corresponding physical length of the reference object, and A is the object distance, i.e. the distance between the reference object and the plane in which the X axis and Y axis of the third coordinate system lie, hereafter called the XY plane.
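For instance (illustrative numbers, not from the patent): with H = 0.20 m, f = 4 mm and A = 2 m, the relation gives h = H·f/A:

```python
H, f, A = 0.20, 0.004, 2.0   # object length (m), focal length (m), object distance (m)
h = H * f / A                # imaged length on the sensor: 4e-4 m, i.e. 0.4 mm
```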
Assume the projection of the reference object on the XY plane has length W in the X-axis direction, and the corresponding image has length h = dx·Δu, where Δu is the number of pixels the image spans in the row direction of the first coordinate system. Then:

$Z_c = \frac{W \cdot f}{\Delta u \cdot dx}$    (Formula 12)
Assume the projection of the reference object on the XY plane has length H in the Y-axis direction, and the corresponding image has length h = dy·Δv, where Δv is the number of pixels the image spans in the column direction of the first coordinate system. Then:

$Z_c = \frac{H \cdot f}{\Delta v \cdot dy}$    (Formula 13)
Substituting Formula 12 into Formula 10 yields:

$X_c = \frac{(u - u_0)}{\Delta u} \cdot W$    (Formula 14)

Substituting Formula 13 into Formula 11 yields:

$Y_c = \frac{(v - v_0)}{\Delta v} \cdot H$    (Formula 15)
In Formulas 14 and 15, u and v are the concrete expression, in the area description information of the embodiments of the present invention, of the position information of the imaging area on the image; u and v may be referred to generically as the coordinate value M of the imaging in the specified dimension direction, and the value of M can be obtained from the captured image containing the imaging of the target object. Δu and Δv above are the sizes of the imaging in the specified dimension directions, namely the width along the row axis and the height along the column axis, respectively.
For an object of fixed size, W and H are constants; they are taken as the concrete expression of the first relation parameter in the embodiments of the present invention:

$C_1 = W; \quad C_2 = H$    (Formula 16)
When the photosensitive device is fixed, the focal length is fixed and the actual height and width of the reference object are fixed, $C_1$ and $C_2$ are constants, and $C_1$ and $C_2$ can be determined in advance.
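The relations derived so far can be sketched as follows (a non-authoritative illustration; function names are ours, and the representative point (u, v) and pixel extents (du, dv) are assumed to come from a detector, as in the flow of fig. 4):

```python
def projection_position(u, v, du, dv, C1, C2, u0=0.0, v0=0.0):
    """Formulas 14 and 15: projection of the target on the plane
    perpendicular to the optical axis, from the imaged region's
    representative point (u, v), its pixel width/height (du, dv), and
    the first relation parameters C1 = W, C2 = H (Formula 16)."""
    Xc = (u - u0) / du * C1   # Formula 14
    Yc = (v - v0) / dv * C2   # Formula 15
    return Xc, Yc

def object_distance(du, W, f, dx):
    """Formula 12: distance Zc from the target to the XY plane, given the
    image's pixel width du, the target's physical width W, the focal
    length f and the physical width dx of a single pixel."""
    return W * f / (du * dx)
```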
The inventor observed that, in practical applications, the resolution of the photosensitive device may be set to different values in different situations. Therefore, in order to make the derived calculation formulas suit more scenes and avoid them becoming inapplicable when the resolution of the photosensitive device is not fixed, Formulas 13 to 15 above are further transformed as follows:
assuming that the physical size of the photosensitive device is P · Q and the corresponding resolution is U · V, then,
dx = P U ; dy = Q V formula 17
Substituting Formula 17 into Formulas 12 and 13 yields:

$Z_c = \frac{W \cdot f}{\frac{\Delta u}{U} \cdot P}$    (Formula 18)

or,

$Z_c = \frac{H \cdot f}{\frac{\Delta v}{V} \cdot Q}$    (Formula 19)
Coordinate values in a specified dimension direction in the coordinates specified by the first coordinate system are used for representing the number of pixels in the specified dimension direction; a normalized coordinate system may be defined, in which coordinate values in a specified dimension direction in the calibrated coordinates are used to represent a ratio of the number of pixels in the specified dimension direction to the resolution of the imaging device in the specified dimension direction.
Let the t-axis of the normalized coordinate system coincide with the u-axis and its r-axis coincide with the v-axis, with $t = \frac{u}{U}$ and $r = \frac{v}{V}$. Then:

$Z_c = \frac{W \cdot f}{\Delta t \cdot P}$, or $Z_c = \frac{H \cdot f}{\Delta r \cdot Q}$    (Formula 20)
Formulas 18 and 19 above can thus be written as:

$Z_c = \frac{C_3}{\Delta t} = \frac{C_3}{\Delta u / U}$, or $Z_c = \frac{C_4}{\Delta r} = \frac{C_4}{\Delta v / V}$    (Formula 21)

where $C_3 = \frac{W \cdot f}{P}$ and $C_4 = \frac{H \cdot f}{Q}$.
Δu and Δv in Formula 21 are the concrete expression, in the embodiments of the present invention, of the size description information of the specified area of the target object's imaging on the image.
Formula 14 becomes:

$X_c = \frac{(u - u_0)}{\Delta u} \cdot W = \frac{(t - t_0)}{\Delta t} \cdot W$    (Formula 22)

Formula 15 becomes:

$Y_c = \frac{(v - v_0)}{\Delta v} \cdot H = \frac{(r - r_0)}{\Delta r} \cdot H$    (Formula 23)
Formulas 22 and 23 above can be described generically as: first length × M / second length, where the first length is, for example, W or H; the second length is, for example, Δu, Δv or Δt, Δr; and M is, for example, (u − u₀) or (t − t₀);

where $t_0 = \frac{u_0}{U}$, $r_0 = \frac{v_0}{V}$.
Generally speaking, Δu and Δv are the lengths Δl of the imaging in the specified dimension directions in the first coordinate system, and Δl can be obtained from the captured image containing the imaging of the target object. Formula 21 can be synthesized as:

$Z_c = \frac{C}{\Delta l / S}$

where S is the resolution of the imaging device in the specified dimension direction, such as U or V, and C is the corresponding second relation parameter.
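In code, the combined form of Formula 21 is a single line (a sketch; names are ours):

```python
def distance_from_size(delta_l, S, C):
    """Combined Formula 21: Zc = C / (delta_l / S), where delta_l is the
    imaged length in pixels in the specified dimension, S the imaging
    device's resolution in that dimension (U or V), and C the matching
    second relation parameter (C3 or C4)."""
    return C / (delta_l / S)
```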
When the photosensitive device is fixed, the focal length is fixed and the actual height and width of the reference object are fixed, $C_3$ and $C_4$ are constants; they are the concrete expression form of the second relation parameter, and $C_3$ and $C_4$ can be determined in advance based on the following calculation formula:

$C_3 = Z_c \cdot \frac{\Delta u}{U}; \quad C_4 = Z_c \cdot \frac{\Delta v}{V}$    (Formula 24)
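Presetting C3 or C4 from a reference object at a measured distance can accordingly be sketched as (assumed names; the measured inputs are those of the calibration procedure described below):

```python
def preset_second_parameter(Zc_ref, delta_l_ref, S):
    """Formula 24: second relation parameter from a reference object at
    known distance Zc_ref whose imaging spans delta_l_ref pixels in the
    specified dimension, on a sensor of resolution S in that dimension."""
    return Zc_ref * delta_l_ref / S
```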
This completes the derivation work of the embodiments of the present invention. In practical applications, any of Formulas 14, 15, 22 and 23, the predetermined relation parameters, the area description information obtained from the currently captured image, and the parameter information of the imaging device may be selected as needed to calculate the corresponding coordinate values, such as only $Z_c$, or only $Z_c$ and $X_c$, and so on. Based on Formulas 14 and 15, or Formulas 22 and 23, a relatively unique world coordinate value can be determined.
Further, the ratio of the size of the real object to the corresponding imaging size can be exploited: for different areas of the imaged real object, the ratio of area size to corresponding imaging size is usually unchanged, which can be expressed as $\frac{W}{\Delta u} = \frac{H}{\Delta v}$. Then, from Formulas 14, 15 and 22, 23:

$\frac{X_c}{Y_c} = \frac{(t - t_0)}{(r - r_0)}$    (Formula 25)
When the horizontal and vertical resolution settings also satisfy the above proportional relationship, i.e. dx = dy, then:

$\frac{X_c}{Y_c} = \frac{(u - u_0)}{(v - v_0)}$    (Formula 26)
Thus it is possible to determine only $C_1$, calculate $X_c$, and then obtain $Y_c$ according to:

$Y_c = X_c \cdot \frac{(r - r_0)}{(t - t_0)}$ or $Y_c = X_c \cdot \frac{(v - v_0)}{(u - u_0)}$    (Formula 27)
It is likewise possible to determine only $C_2$, calculate $Y_c$, and then obtain $X_c$ according to:

$X_c = Y_c \cdot \frac{(t - t_0)}{(r - r_0)}$ or $X_c = Y_c \cdot \frac{(u - u_0)}{(v - v_0)}$    (Formula 28)
The description of Formulas 27 and 28 above can be summarized as: calculate the coordinate value, on the X or Y coordinate axis, of the projection of the target object on the XY plane; then calculate the second coordinate value on the second coordinate axis by the calculation formula: first coordinate value × M2 / M1, where the first coordinate value is the value on the X axis and the second coordinate value is the value on the Y axis; M2 is, for example, (v − v₀) or (r − r₀), and M1 is, for example, (u − u₀) or (t − t₀).
When dx = dy, $C_3 = C_4$.
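A sketch of the shortcut of Formulas 27 and 28 (assumed names):

```python
def other_axis_coordinate(known_coord, m_wanted, m_known):
    """Formulas 27/28: given one projection coordinate, obtain the other
    from the coordinate ratio, e.g. Yc = Xc * (v - v0) / (u - u0) with
    m_wanted = (v - v0) and m_known = (u - u0)."""
    return known_coord * m_wanted / m_known
```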
The procedure for presetting the relationship parameters in the embodiment of the present invention is described in detail below.
A reference object is preset, and the distance between the reference object and the XY plane is measured; this distance is called the reference distance, i.e. the object distance $Z_c$.
The projection of the reference object on a reference plane parallel to the plane where the image lies is measured for a first length in a specified direction. Specifically: place the reference object parallel to the X axis of the third coordinate system and measure its length W in the X-axis direction; $C_1$ is then obtained according to Formula 16.
A second length, in the specified direction, of the image corresponding to the projection is measured from a captured reference image containing the imaging of the reference object. Specifically: measure the number Δu of pixels the imaging spans on the image coordinate system;
$C_3$ is then calculated from Formula 24 above and the U known in advance.
Keeping the object distance unchanged, place the reference object parallel to the Y axis of the third coordinate system and measure its first length H in the Y-axis direction; $C_2$ can be obtained according to Formula 16. Measure the number Δv of pixels the imaging spans on the image coordinate system;
$C_4$ is then calculated using Formula 24 above and the V known in advance.
The method for presetting the second relation parameter can be described as follows:
measuring a reference distance between a preset reference object and a plane perpendicular to the optical axis of the imaging device;
measuring, in a specified direction, a third length of the projection of the reference object on a reference plane parallel to the plane of the image;
using the calculation formula:

$\text{second relation parameter} = \text{reference distance} \times \frac{\text{third length}}{S}$

calculating the second relation parameter;
wherein S is a resolution of the imaging device in the specified direction.
In addition, $C_1$ and $C_2$ can be preset in another way, as follows:
the coordinates of the reference object in the X-axis and in the Y-axis can be measured and based on
<math> <mrow> <mi>W</mi> <mo>=</mo> <mfrac> <mi>&Delta;u</mi> <mrow> <mo>(</mo> <mi>u</mi> <mo>-</mo> <msub> <mi>u</mi> <mn>0</mn> </msub> <mo>)</mo> </mrow> </mfrac> <mo>&CenterDot;</mo> <msub> <mi>X</mi> <mi>c</mi> </msub> <mo>,</mo> </mrow></math> <math> <mrow> <mi>H</mi> <mo>=</mo> <msub> <mi>Y</mi> <mi>c</mi> </msub> <mo>&CenterDot;</mo> <mfrac> <mi>&Delta;v</mi> <mrow> <mo>(</mo> <mi>v</mi> <mo>-</mo> <msub> <mi>v</mi> <mn>0</mn> </msub> <mo>)</mo> </mrow> </mfrac> <mo></mo> </mrow></math> Formula 29
And (4) calculating.
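A sketch of this alternative presetting (assumed names), following Formula 29:

```python
def preset_first_parameters(Xc_ref, Yc_ref, u, v, du, dv, u0=0.0, v0=0.0):
    """Formula 29: first relation parameters W and H from a reference
    object with measured world coordinates (Xc_ref, Yc_ref) and its
    imaged position (u, v) and pixel size (du, dv)."""
    W = du / (u - u0) * Xc_ref
    H = dv / (v - v0) * Yc_ref
    return W, H
```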
At this point, all the relation parameters can be determined. It should be noted that, in practical applications, the process of presetting the relation parameters need not exactly follow the procedure listed in the embodiments of the present invention, as long as sufficiently accurate relation parameters can be determined based on the calculation formulas provided herein. In addition, the horizontal and vertical pixel lengths of the reference object's imaging can be obtained by manual calibration or by automatic image detection.
In addition, $u_0 = 0$, $v_0 = 0$ is applicable to general imaging devices. To improve accuracy, an object can be placed on the Z axis of the camera coordinate system at a position other than the origin, and the pixel coordinates of its imaging in the image obtained; the abscissa of that position is $u_0$ and the ordinate is $v_0$.
Referring to fig. 4, fig. 4 is a flowchart of determining the position of the target object in the embodiment of the present invention, in which the target object is a human. The process may include the steps of:
step 401, collecting an image including a face region, where the face region is a region corresponding to an actual face imaged in the image.
Step 402, determining the position (u, v) and size description information of the face region on the image by using a face detection technology.
If other targets are set as reference targets, the position and size information of the targets can be automatically acquired by adopting corresponding object detection technology. For example, a person's head may be detected using a person's head detection technique to determine the location and size of the person's head region in the image. If the automobile is adopted as a target, the position and size information of the automobile area in the image can be automatically acquired by adopting the automobile detection technology.
The size description information includes the length of the face region in the row axis direction, i.e., the width Δ u, and the length of the face region in the column axis direction, i.e., the height Δ v. In step 402, a central point of each face image may be preset as a representative point of the face image, and the position information of the central point is determined as the position information of the face area.
Step 403, determining the position description information $(X_c, Y_c, Z_c)$ of each person's spatial position relative to the image capturing device according to a preset algorithm and the relation parameters.
In practical applications, if $X_c$ needs to be calculated, it can be solved based on Formula 14 above and the preset $C_1$; or it can be solved according to Formula 22 above, depending on whether the coordinate system used in the practical application is the first coordinate system or the normalized coordinate system; or, if the conditions of Formula 28 are satisfied, $Y_c$ can be calculated first and $X_c$ then solved based on Formula 28;
if $Y_c$ needs to be calculated, it can be solved based on Formula 15 above and the preset $C_2$; or it can be solved according to Formula 23 above, depending on whether the coordinate system used is the first coordinate system or the normalized coordinate system; or, if the conditions of Formula 27 are satisfied, $X_c$ can be calculated first and $Y_c$ then solved based on Formula 27;
if $Z_c$ needs to be calculated, it can be solved using Formula 21 and the preset $C_3$ or $C_4$.
In addition, the above calculation process requires the Δu or Δv of the specific area, which can be measured from the captured image.
Therefore, the spatial position of each person relative to the image acquisition equipment can be accurately determined based on the acquired image.
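Steps 401 to 403 can be strung together roughly as follows; detect_face is a stand-in for whatever face or head detector is used and is not part of this disclosure:

```python
def locate_person(image, detect_face, C1, C2, C3, U, u0=0.0, v0=0.0):
    """End-to-end sketch of fig. 4; detect_face(image) must return the face
    region's representative point and pixel size as (u, v, du, dv)."""
    u, v, du, dv = detect_face(image)   # step 402: position (u, v), width du, height dv
    Zc = C3 / (du / U)                  # Formula 21, preset C3, row resolution U
    Xc = (u - u0) / du * C1             # Formula 14, C1 = W
    Yc = (v - v0) / dv * C2             # Formula 15, C2 = H
    return Xc, Yc, Zc                   # step 403
```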
In practical applications, when a person's posture changes, the face may fail to be detected, or movement of the person's body may make the size of the designated area in the detected face imaging uncertain, which affects the accuracy of the determined spatial position. It is therefore preferable to detect the position and size of the person's head imaging on the image using a human head detection algorithm and then determine the person's spatial position relative to the image capturing apparatus.
In practical applications, other operations may be performed based on the known spatial position of the target object relative to the image capturing device; for example, when a person moves in front of the image capturing device, whether the person's movement satisfies a condition can be determined according to the obtained spatial position of the person relative to the image capturing device, and so on.
Referring to fig. 5, fig. 5 is a schematic structural diagram of a position determination apparatus according to an embodiment of the present invention. In fig. 5, the position determination apparatus 500 may include: a capturing unit 501, an acquiring unit 502 and a determining unit 503;
a capturing unit 501, configured to capture an image including the imaging of the target object;
an acquiring unit 502, configured to acquire area description information for describing the area occupied by the imaging on the image;
a determining unit 503, configured to determine the spatial position by using the area description information acquired by the acquiring unit 502 and a preset relation parameter capable of embodying the association between the area description information and the spatial position of the target object.
In fig. 5, the acquiring unit 502 includes:
a first acquiring subunit 5021, configured to acquire coordinates of the area in a preset two-dimensional coordinate system, where the coordinates are used as the position information; the preset two-dimensional coordinate system can calibrate the positions of all points in the image;
and the second acquiring subunit 5022 is configured to measure the length Δ l of the imaging in the direction of the specified dimension on the preset two-dimensional coordinate system.
The determination unit 503 in fig. 5 includes: a preset information receiving unit 5031 and a first calculating unit 5032;
a preset information receiving unit 5031, configured to receive a preset first relation parameter reflecting the size of the target object;
a first calculating unit 5032, configured to determine, by using a preset algorithm, the first relation parameter received by the preset information receiving unit 5031 and the area description information acquired by the acquiring unit 502, the position of the projection of the target object on a plane perpendicular to the optical axis of the imaging device used for capture.
The determining unit 503 may further include: a second computing unit 5033;
the preset information receiving unit 5031 is further configured to receive a preset second relation parameter that embodies the association among the resolution of the imaging device used by the capturing unit 501, the size description information of the imaging, and the distance from the target object to a plane perpendicular to the optical axis of the imaging device;
the second calculating unit 5033 is configured to determine, by using a preset algorithm, the second relation parameter received by the preset information receiving unit 5031, and the information acquired by the second acquiring subunit 5022, a distance from the target object to a plane perpendicular to the optical axis of the imaging device.
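The three units of fig. 5 might be mirrored in code along these lines (purely illustrative; the camera and detector objects are stand-ins for units 501 and 502):

```python
class PositionDeterminer:
    """Sketch of apparatus 500: capturing unit 501, acquiring unit 502,
    determining unit 503 (holding the preset relation parameters)."""
    def __init__(self, camera, detector, C1, C2, C3, U, u0=0.0, v0=0.0):
        self.camera, self.detector = camera, detector
        self.C1, self.C2, self.C3, self.U = C1, C2, C3, U
        self.u0, self.v0 = u0, v0

    def determine(self):
        image = self.camera.capture()               # unit 501
        u, v, du, dv = self.detector.detect(image)  # unit 502 (subunits 5021/5022)
        Zc = self.C3 / (du / self.U)                # second calculating unit 5033
        Xc = (u - self.u0) / du * self.C1           # first calculating unit 5032
        Yc = (v - self.v0) / dv * self.C2
        return Xc, Yc, Zc                           # determining unit 503 output
```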
In practical applications, the position determining apparatus may be disposed in a specific device, so that the device has related functions.
In summary, the position determining method and device provided by the embodiments of the present invention obtain, from the captured image, the area description information of the area occupied by the imaging of the target object, and determine the spatial position of the target object in combination with the preset relation parameters. Since both the area description information and the relation parameters can be known relatively accurately, the technical solutions of the embodiments can, compared with the prior art, determine the spatial position more accurately and further determine the unique spatial position of the target object relative to the image capturing device.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the present invention, and these improvements and modifications should also be construed as the protection scope of the present invention.

Claims (16)

1. A method of position determination, comprising:
acquiring an image including an image of a target object;
acquiring area description information for describing an area occupied by the imaging on the image;
and determining the spatial position by using the area description information and a preset relation parameter capable of embodying the association between the area description information and the spatial position of the target object.
2. The method of claim 1, wherein the area description information comprises: position information of the area on the image, and/or size description information of the imaging.
3. The method of claim 2,
the relationship parameters include: a first relation parameter reflecting the size of the target object;
the determining the spatial location comprises:
and determining the projection position of the target object on a plane perpendicular to the optical axis of the imaging device for acquisition by using a preset algorithm, the first relation parameter, the position information and the size description information.
4. The method according to claim 3, characterized in that a two-dimensional coordinate system is preset, and coordinate values in a specified dimension direction in coordinates calibrated by the two-dimensional coordinate system are used for representing the number of pixels in the specified dimension direction; or, the coordinate value in the specified dimension direction in the coordinates specified by the two-dimensional coordinate system is used for representing the ratio of the number of pixels in the specified dimension direction to the resolution of the imaging device in the specified dimension direction.
5. The method of claim 4, wherein the first relation parameter comprises: a first length of the target object in a specified dimensional direction;
the position information comprises a coordinate value of the imaging in the specified dimension direction in the two-dimensional coordinate system and is recorded as M;
the size description information comprises a second length of the imaging in the direction of the specified dimension;
the determining the projection position of the target object on a plane perpendicular to the optical axis of the imaging device used for acquisition comprises:
calibrating a space formed by the optical axis and the plane by using a preset three-dimensional coordinate system, wherein a first coordinate axis and a second coordinate axis calibrate the plane, a third coordinate axis coincides with the optical axis, and the forward direction of the third coordinate axis faces away from the image;
and calculating coordinate values of the projection of the target object on the plane on the first coordinate axis and/or the second coordinate axis by using the calculation formula: first length × M / second length.
6. The method according to claim 4, wherein the calculating of the coordinate values, on the first coordinate axis and/or the second coordinate axis, of the projection of the target object on the plane by using the calculation formula first length × M / second length comprises:
calculating a first coordinate value of the projection of the target object on the plane on the first coordinate axis by using the calculation formula;
the specified dimension direction comprises a first specified dimension direction and a second specified dimension direction perpendicular to the first specified dimension direction; and the first specified dimension direction is parallel to the first coordinate axis direction; the second specified dimension direction is parallel to the second coordinate axis direction;
recording the coordinate value of the first specified dimension direction as M1; the coordinate value of the second specified dimension direction is M2;
and calculating a second coordinate value projected on the second coordinate axis according to the calculation formula: first coordinate value × M2 / M1.
7. The method of claim 3, wherein the first relation parameter is preset by:
measuring in advance the physical length, in a specified direction, of the projection of a preset reference object on a reference plane parallel to the plane of the image, and using the physical length as the first relation parameter; or,
measuring, from an acquired reference image containing the imaging of the reference object, the pixel length of the imaging corresponding to the projection in the specified direction, the pixel coordinate of the imaging of the reference object in the specified direction, and the coordinate value, on the coordinate axis of the three-dimensional coordinate system parallel to the specified direction, of the projection of the reference object on the plane perpendicular to the optical axis;
and calculating the first relation parameter according to the formula: pixel length × coordinate value on the coordinate axis parallel to the specified direction / pixel coordinate.
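The claim 7 calibration is the claim 5 formula solved for the first length. A minimal sketch, assuming that reading (all names illustrative):

```python
# Claim 7 calibration, assuming: first relation parameter =
# pixel length * axis coordinate / pixel coordinate (the inverse of the
# claim 5 relation). Names are illustrative, not from the patent.

def calibrate_first_parameter(pixel_length, pixel_coordinate, axis_coordinate):
    """Derive the first relation parameter from one reference image.

    pixel_length     -- imaged length of the reference object, in pixels
    pixel_coordinate -- pixel coordinate of the reference imaging in the
                        specified direction
    axis_coordinate  -- measured coordinate of the reference object's
                        projection on the parallel axis of the 3D system
    """
    return pixel_length * axis_coordinate / pixel_coordinate

# Consistent with the claim 5 example: 0.64 = L * 320 / 80  =>  L = 0.16
print(calibrate_first_parameter(80.0, 320.0, 0.64))  # -> 0.16
```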
8. The method of claim 1, wherein the relation parameters comprise:
a second relation parameter embodying the association among the resolution of the imaging device used for acquisition, the size description information of the imaging, and the distance between the target object and a plane perpendicular to the optical axis of the imaging device;
the area description information comprises: size description information of the imaging;
the determining the spatial position comprises:
determining the distance between the target object and the plane perpendicular to the optical axis of the imaging device by using a preset algorithm, the second relation parameter and the size description information.
9. The method of claim 8, wherein the acquiring the area description information comprises:
presetting a two-dimensional coordinate system capable of calibrating the position of each point in the image, and measuring the length Δl of the imaging in the specified dimension direction in the two-dimensional coordinate system;
the determining the distance between the target object and the plane perpendicular to the optical axis of the imaging device comprises:
calculating the distance according to a preset formula (reproduced only as an image, Figure A200810247497C00041, in the source publication);
a coordinate value in a specified dimension direction of a coordinate calibrated by the two-dimensional coordinate system represents the number of pixels in the specified dimension direction; or, the coordinate value represents the ratio of the number of pixels in the specified dimension direction to the resolution of the imaging device in the specified dimension direction.
10. The method of claim 9, wherein the second relation parameter is preset by:
measuring a reference distance between a preset reference object and the plane perpendicular to the optical axis of the imaging device;
measuring a third length, in a specified direction, of the projection of the reference object on a reference plane parallel to the plane of the image;
and calculating the second relation parameter according to a preset formula (likewise reproduced only as an image in the source publication).
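Because the claim 9 and claim 10 formulas survive only as images in this text, the sketch below assumes the usual pinhole-camera reading, in which distance is inversely proportional to imaged size. The constant k, the helper names, and the exact form of both formulas are assumptions; the patented formulas may differ in detail.

```python
# Assumed reading of claims 8-10: distance = k / imaged length, with k
# calibrated from one reference measurement. Names are illustrative.

def calibrate_second_parameter(reference_distance, reference_imaged_length):
    """Second relation parameter from a reference object whose imaged
    length is known at a known distance (claim 10, assumed form)."""
    return reference_distance * reference_imaged_length

def distance_from_plane(second_parameter, delta_l):
    """Distance between the target object and the plane perpendicular to
    the optical axis (claim 8), assuming an inverse-size relation."""
    return second_parameter / delta_l

k = calibrate_second_parameter(2.0, 80.0)  # reference imaged 80 px wide at 2 m
print(distance_from_plane(k, 40.0))        # imaged 40 px wide now -> 4.0 m
```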
11. The method of claim 2, wherein the position information and/or the size description information of the imaging is acquired by automatically detecting the position and/or size of the imaging in the image using an object detection technique.
12. The method of claim 11, wherein the target object is a human face and/or a human head.
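Claims 11 and 12 leave the detector unspecified. As one concrete possibility, an off-the-shelf Haar-cascade face detector yields exactly the position and size description information the method needs; the sketch below uses OpenCV, with the input file name as a placeholder.

```python
# Hypothetical detection step for claims 11-12 using OpenCV's bundled
# Haar cascade; "frame.jpg" is a placeholder for the captured image.
import cv2

image = cv2.imread("frame.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    # (x, y) serves as the position information and (w, h) as the size
    # description information of the imaging, both in pixels.
    print(f"face at ({x}, {y}), size {w}x{h} px")
```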
13. A position determination apparatus, comprising: a capture unit, an acquisition unit and a determination unit;
the capture unit is configured to capture an image containing the imaging of a target object;
the acquisition unit is configured to acquire area description information describing the area occupied by the imaging on the image;
the determination unit is configured to determine the spatial position of the target object by using the area description information acquired by the acquisition unit and preset relation parameters embodying the association between the area description information and the spatial position of the target object.
14. The apparatus of claim 13, wherein the acquisition unit comprises:
a first acquisition subunit configured to acquire, as the position information, the coordinates of the area in a preset two-dimensional coordinate system, the preset two-dimensional coordinate system being capable of calibrating the position of each point in the image;
and a second acquisition subunit configured to measure the length Δl of the imaging in the specified dimension direction in the preset two-dimensional coordinate system.
15. The apparatus of claim 14, wherein the determination unit comprises: a preset information receiving unit and a first calculating unit;
the preset information receiving unit is configured to receive a preset first relation parameter reflecting the size of the target object;
the first calculating unit is configured to determine the position of the projection of the target object on a plane perpendicular to the optical axis of the imaging device used for acquisition, by using a preset algorithm, the first relation parameter received by the preset information receiving unit, and the area description information acquired by the acquisition subunits.
16. The apparatus of claim 14, wherein the determination unit comprises: a preset information receiving unit and a second calculating unit;
the preset information receiving unit is configured to receive a preset second relation parameter embodying the association among the resolution of the imaging device used by the capture unit, the size description information of the imaging, and the distance between the target object and a plane perpendicular to the optical axis of the imaging device;
the second calculating unit is configured to determine the distance between the target object and the plane perpendicular to the optical axis of the imaging device by using a preset algorithm, the second relation parameter received by the preset information receiving unit, and the information acquired by the second acquisition subunit.
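Read together, claims 13 through 16 describe a pipeline of three units. The following structural sketch mirrors that division of labour; it reuses the assumed formulas from the method claims above, so it illustrates the architecture rather than the patented implementation.

```python
# Structural sketch of the claim 13-16 apparatus; unit names mirror the
# claims, formulas are the assumed readings sketched earlier.

class CaptureUnit:
    def capture(self):
        """Capture an image containing the imaging of the target object."""
        raise NotImplementedError  # e.g. grab a camera frame

class AcquisitionUnit:
    def acquire(self, image):
        """Return ((m1, m2), delta_l): the position information and the
        imaged length, per the first and second acquisition subunits."""
        raise NotImplementedError  # e.g. run a face detector

class DeterminationUnit:
    """Determination unit with the preset information receiving unit and
    the first/second calculating units of claims 15-16 folded in."""

    def __init__(self, first_parameter, second_parameter):
        # preset information receiving unit: stores the relation parameters
        self.first_parameter = first_parameter    # physical size (claim 5)
        self.second_parameter = second_parameter  # distance constant (assumed)

    def determine(self, position, delta_l):
        m1, m2 = position
        # first calculating unit: projection on the perpendicular plane
        x = self.first_parameter * m1 / delta_l
        y = self.first_parameter * m2 / delta_l
        # second calculating unit: distance along the optical axis
        z = self.second_parameter / delta_l
        return x, y, z
```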
CN200810247497.1A 2008-12-31 2008-12-31 Position confirming method and apparatus Active CN101464132B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200810247497.1A CN101464132B (en) 2008-12-31 2008-12-31 Position confirming method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN200810247497.1A CN101464132B (en) 2008-12-31 2008-12-31 Position confirming method and apparatus

Publications (2)

Publication Number Publication Date
CN101464132A true CN101464132A (en) 2009-06-24
CN101464132B CN101464132B (en) 2014-09-10

Family

ID=40804822

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200810247497.1A Active CN101464132B (en) 2008-12-31 2008-12-31 Position confirming method and apparatus

Country Status (1)

Country Link
CN (1) CN101464132B (en)


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100483462C (en) * 2002-10-18 2009-04-29 清华大学 Establishing method of human face 3D model by fusing multiple-visual angle and multiple-thread 2D information
CN1635545A (en) * 2003-12-30 2005-07-06 上海科技馆 Method and apparatus for changing human face image
WO2006051607A1 (en) * 2004-11-12 2006-05-18 Omron Corporation Face feature point detector and feature point detector

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102022979A (en) * 2009-09-21 2011-04-20 鸿富锦精密工业(深圳)有限公司 Three-dimensional optical sensing system
US8657682B2 (en) 2009-09-21 2014-02-25 Hon Hai Precision Industry Co., Ltd. Motion sensing controller and game apparatus having same
CN101876535B (en) * 2009-12-02 2015-11-25 北京中星微电子有限公司 A kind of height measurement method, device and supervisory system
CN101876535A (en) * 2009-12-02 2010-11-03 北京中星微电子有限公司 Method, device and monitoring system for height measurement
CN102735249A (en) * 2011-03-31 2012-10-17 通用电气公司 Method, system and computer program product for correlating information and location
CN102735249B (en) * 2011-03-31 2017-04-12 通用电气公司 Method, system and computer implementation device for correlating information and location
CN105588543A (en) * 2014-10-22 2016-05-18 中兴通讯股份有限公司 Camera-based positioning method, device and positioning system
WO2016062076A1 (en) * 2014-10-22 2016-04-28 中兴通讯股份有限公司 Camera-based positioning method, device, and positioning system
CN105588543B (en) * 2014-10-22 2019-10-18 中兴通讯股份有限公司 A kind of method, apparatus and positioning system for realizing positioning based on camera
CN104267203A (en) * 2014-10-30 2015-01-07 京东方科技集团股份有限公司 Sample testing method and device
CN105635555A (en) * 2014-11-07 2016-06-01 青岛海尔智能技术研发有限公司 Camera focusing control method, image pick-up device and wearable intelligent terminal
CN105635555B (en) * 2014-11-07 2020-12-29 青岛海尔智能技术研发有限公司 Camera focusing control method, camera shooting device and wearable intelligent terminal
CN104776832A (en) * 2015-04-16 2015-07-15 浪潮软件集团有限公司 Method, set top box and system for positioning objects in space
CN105007396A (en) * 2015-08-14 2015-10-28 山东诚海电子科技有限公司 Positioning method and positioning device for classroom teaching
CN106949830A (en) * 2016-06-24 2017-07-14 广州市九州旗建筑科技有限公司 The measuring technology and its computational methods of scale built in a kind of imaging system and application
CN107478155A (en) * 2017-08-24 2017-12-15 苏州光照精密仪器有限公司 Product inspection method, apparatus and system
CN109961455A (en) * 2017-12-22 2019-07-02 杭州萤石软件有限公司 Target detection method and device
CN109961455B (en) * 2017-12-22 2022-03-04 杭州萤石软件有限公司 Target detection method and device
US11367276B2 (en) 2017-12-22 2022-06-21 Hangzhou Ezviz Software Co., Ltd. Target detection method and apparatus
CN109671190A (en) * 2018-11-27 2019-04-23 杭州天翼智慧城市科技有限公司 A kind of multi-pass barrier gate device management method and system based on recognition of face
CN109671190B (en) * 2018-11-27 2021-04-13 杭州天翼智慧城市科技有限公司 Multi-channel gate management method and system based on face recognition
CN111161339A (en) * 2019-11-18 2020-05-15 珠海随变科技有限公司 Distance measuring method, device, equipment and computer readable medium
CN111161339B (en) * 2019-11-18 2020-11-27 珠海随变科技有限公司 Distance measuring method, device, equipment and computer readable medium
CN111813984A (en) * 2020-06-23 2020-10-23 北京邮电大学 Method and device for realizing indoor positioning by using homography matrix and electronic equipment
CN111813984B (en) * 2020-06-23 2022-09-30 北京邮电大学 Method and device for realizing indoor positioning by using homography matrix and electronic equipment

Also Published As

Publication number Publication date
CN101464132B (en) 2014-09-10

Similar Documents

Publication Publication Date Title
CN101464132B (en) Position confirming method and apparatus
CN105698699B (en) A kind of Binocular vision photogrammetry method based on time rotating shaft constraint
JP4147059B2 (en) Calibration data measuring device, measuring method and measuring program, computer-readable recording medium, and image data processing device
US10506225B2 (en) Method of calibrating a camera
JP2874710B2 (en) 3D position measuring device
CN107084680B (en) Target depth measuring method based on machine monocular vision
CN111220129B (en) Focusing measurement method with rotating holder and terminal
CN102768762B (en) Digital camera calibration method targeted to shield tunnel defect digital radiography detection and device thereof
Albarelli et al. Robust camera calibration using inaccurate targets
JP5430138B2 (en) Shape measuring apparatus and program
JP2003130621A (en) Method and system for measuring three-dimensional shape
JP2009017480A (en) Camera calibration device and program thereof
CN106169076B (en) A kind of angle license plate image library building method based on perspective transform
CN105335699B (en) Read-write scene is read and write intelligent identification and the application thereof of element three-dimensional coordinate
TW201203173A (en) Three dimensional distance measuring device and method
CN102798456A (en) Method, device and system for measuring working amplitude of engineering mechanical arm support system
CN107123147A (en) Scaling method, device and the binocular camera system of binocular camera
JP3842988B2 (en) Image processing apparatus for measuring three-dimensional information of an object by binocular stereoscopic vision, and a method for recording the same, or a recording medium recording the measurement program
CN109493378B (en) Verticality detection method based on combination of monocular vision and binocular vision
JP3696336B2 (en) How to calibrate the camera
Hsu et al. Distance and angle measurement of objects on an oblique plane based on pixel number variation of CCD images
CN105354828B (en) Read and write intelligent identification and the application thereof of reading matter three-dimensional coordinate in scene
CN112907647B (en) Three-dimensional space size measurement method based on fixed monocular camera
JP3696335B2 (en) Method for associating each measurement point of multiple images
KR20180008066A (en) Measurement system and method of 3d displacement for structure using of single camera

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20180124

Address after: 519031 Guangdong city of Zhuhai province Hengqin Baohua Road No. 6, room 105, -23898 (central office)

Patentee after: Zhongxing Technology Co., Ltd.

Address before: 100083, Haidian District, Xueyuan Road, Beijing No. 35, Nanjing Ning building, 15 Floor

Patentee before: Beijing Vimicro Corporation

TR01 Transfer of patent right
CP01 Change in the name or title of a patent holder

Address after: 519031 -23898, 105 room 6, Baohua Road, Hengqin New District, Zhuhai, Guangdong (centralized office area)

Patentee after: Mid Star Technology Limited by Share Ltd

Address before: 519031 -23898, 105 room 6, Baohua Road, Hengqin New District, Zhuhai, Guangdong (centralized office area)

Patentee before: Zhongxing Technology Co., Ltd.

CP01 Change in the name or title of a patent holder