CN110148167A - Distance measurement method and terminal device
- Publication number: CN110148167A
- Application number: CN201910310118.7
- Authority: CN (China)
- Prior art keywords: color image, frame color image, terminal device, feature point, frame
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G01B11/14—Measuring arrangements characterised by the use of optical techniques for measuring distance or clearance between spaced objects or spaced apertures
- G01C3/00—Measuring distances in line of sight; optical rangefinders
- G06T7/55—Depth or shape recovery from multiple images
- G06T2207/10016—Video; image sequence
- G06T2207/10024—Color image
Abstract
Embodiments of the present invention provide a distance measurement method and a terminal device, relating to the field of terminal technology, for solving the prior-art problem that measuring the distance between two points in space with a terminal device relies heavily on an IMU. The method comprises: obtaining a first frame of color image and the depth data of each feature point in it; determining first position information according to the first frame of color image and the depth data of a first feature point; obtaining the second through the M-th frames of color image and the depth data of each feature point in each of those frames; determining second pose information according to the first through the M-th frames of color image, the depth data of each feature point, and first pose information; and determining the distance between the first feature point and a second feature point according to the first pose information, the second pose information, the first position information, the M-th frame of color image, and the depth data of the second feature point. Embodiments of the present invention are used to measure the distance between two points in space with a terminal device.
Description
Technical field
The present invention relates to the field of terminal technology, and in particular to a distance measurement method and a terminal device.
Background technique
The distance between two points in space often needs to be measured during navigation, surveying and mapping, and similar activities, so measuring the distance between two unobstructed points in space with a terminal device has become a research hotspot.
In the prior art, one scheme for measuring the distance between two unobstructed points in space with a terminal device is as follows: computer vision analysis is performed on the image sequence acquired by the camera of the terminal device to identify the feature points in each image frame; the data acquired by the inertial measurement unit (IMU) of the terminal device is then combined with the change in position of the feature points between adjacent image frames to obtain the position and deflection information of the terminal device; the feature points are then computed and classified to identify real planes in the environment; finally, a coordinate system is established on an identified real plane and the distance between two points on the plane is calculated. Under certain scenes and conditions, this prior-art scheme can measure the distance between two points in space fairly accurately, but it relies heavily on the IMU: if the IMU of the terminal device stops working, the position and deflection information of the terminal device cannot be obtained and the measurement fails; moreover, even if the IMU can acquire data continuously, poor IMU precision leads to a large error in the final measurement.
Summary of the invention
Embodiments of the present invention provide a distance measurement method and a terminal device, for solving the prior-art problem that measuring the distance between two points in space with a terminal device relies heavily on the IMU.
In order to solve the above technical problem, the embodiments of the present invention are implemented as follows.
In a first aspect, an embodiment of the present invention provides a distance measurement method, applied to a terminal device, the method comprising:
obtaining a first frame of color image and the depth data of each feature point in the first frame of color image, the first frame of color image containing a first feature point;
determining first position information according to the first frame of color image and the depth data of the first feature point, the first position information being used to indicate the position of the first feature point relative to the terminal device at the time the first frame of color image is acquired;
obtaining the second through the M-th frames of color image and the depth data of each feature point in each of those frames, the M-th frame of color image containing a second feature point, M being an integer greater than or equal to 2;
determining second pose information according to the first through the M-th frames of color image, the depth data of each feature point in each of those frames, and first pose information, the first pose information being used to indicate the position and posture of the terminal device when acquiring the first frame of color image, and the second pose information being used to indicate the position and posture of the terminal device when acquiring the M-th frame of color image;
determining the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first position information, the M-th frame of color image, and the depth data of the second feature point.
In a second aspect, an embodiment of the present invention provides a terminal device, comprising:
an acquisition unit, for obtaining a first frame of color image and the depth data of each feature point in the first frame of color image, the first frame of color image containing a first feature point;
a determination unit, for determining first position information according to the first frame of color image and the depth data of the first feature point, the first position information being used to indicate the position of the first feature point relative to the terminal device at the time the first frame of color image is acquired;
the acquisition unit being further used to obtain the second through the M-th frames of color image and the depth data of each feature point in each of those frames, the M-th frame of color image containing a second feature point, M being an integer greater than or equal to 2;
the determination unit being further used to determine second pose information according to the first through the M-th frames of color image, the depth data of each feature point in each of those frames, and first pose information, the first pose information being used to indicate the position and posture of the terminal device when acquiring the first frame of color image, and the second pose information being used to indicate the position and posture of the terminal device when acquiring the M-th frame of color image;
a processing unit, for determining the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first position information, the M-th frame of color image, and the depth data of the second feature point.
In a third aspect, an embodiment of the present invention provides a terminal device, comprising a processor and a memory, the memory storing a computer program that can run on the processor, the computer program, when executed by the processor, implementing the steps of the distance measurement method according to the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium on which a computer program is stored, the computer program, when executed by a processor, implementing the steps of the distance measurement method according to the first aspect.
The distance measurement method provided by the embodiments of the present invention first obtains a first frame of color image containing a first feature point and the depth data of each feature point in that frame, and then determines, according to the first frame of color image and the depth data of the first feature point, the position of the first feature point relative to the terminal device at the time the first frame of color image is acquired. It then obtains the second through the M-th frames of color image and the depth data of each feature point in each of those frames, and determines, according to the first through the M-th frames of color image, the depth data of each feature point in each of those frames, and first pose information indicating the position and posture of the terminal device when acquiring the first frame of color image, second pose information indicating the position and posture of the terminal device when acquiring the M-th frame of color image. Finally, it determines the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first position information, the M-th frame of color image, and the depth data of the second feature point. That is, when measuring the distance between the first feature point and the second feature point, the embodiments of the present invention can determine the second pose information from the color images, the depth data of their feature points, and the first pose information, and thereby obtain the distance between the first feature point and the second feature point without using an IMU. The embodiments of the present invention can therefore solve the prior-art problem that measuring the distance between two points in space with a terminal device relies heavily on the IMU of the terminal device.
Brief description of the drawings
Fig. 1 is an architecture diagram of the Android operating system provided by an embodiment of the present application;
Fig. 2 is a step flow chart of the distance measurement method provided by an embodiment of the present application;
Fig. 3 is a schematic diagram of a terminal device provided by an embodiment of the present invention;
Fig. 4 is a hardware structure diagram of a terminal device provided by an embodiment of the present invention.
Specific embodiment
The technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
The term "and/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate three cases: A alone, both A and B, and B alone. In addition, the character "/" herein generally indicates an "or" relationship between the preceding and following objects; in a formula, the character "/" indicates a "division" relationship between them. Unless otherwise specified, "multiple" herein means two or more.
For ease of clearly describing the technical solutions of the embodiments of the present invention, words such as "first" and "second" are used in the embodiments of the present invention to distinguish identical or similar items with substantially the same function or effect. Those skilled in the art will understand that words such as "first" and "second" do not limit quantity or execution order.
In the embodiments of the present invention, words such as "illustratively" or "for example" are used to indicate an example, illustration, or explanation. Any embodiment or design described as "illustrative" or "for example" in the embodiments of the present invention should not be construed as preferable to, or more advantageous than, other embodiments or designs. Specifically, words such as "illustratively" or "for example" are intended to present a related concept in a specific way. In the embodiments of the present invention, unless otherwise specified, "plurality" means two or more.
Under certain scenes and conditions, the prior-art range measurement scheme can measure the distance between two points in space fairly accurately, but it relies heavily on the IMU of the terminal device: if the IMU stops working, the position and deflection information of the terminal device cannot be obtained and the measurement fails; moreover, even if the IMU can acquire data continuously, poor IMU precision leads to a large error in the final measurement.
To solve the above problems, the embodiments of the present invention provide a distance measurement method and a terminal device. The distance measurement method first obtains a first frame of color image containing a first feature point and the depth data of each feature point in that frame, then determines, according to the first frame of color image and the depth data of the first feature point, the position of the first feature point relative to the terminal device at the time the first frame of color image is acquired. It then obtains the second through the M-th frames of color image and the depth data of each feature point in each of those frames, and determines, according to the first through the M-th frames of color image, the depth data of each feature point in each of those frames, and first pose information indicating the position and posture of the terminal device when acquiring the first frame of color image, second pose information indicating the position and posture of the terminal device when acquiring the M-th frame of color image. Finally, it determines the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first position information, the M-th frame of color image, and the depth data of the second feature point. That is, when measuring the distance between the first feature point and the second feature point, the embodiments of the present invention can determine the second pose information from the color images, the depth data of their feature points, and the first pose information, and thereby obtain the distance between the first feature point and the second feature point without using an IMU. The embodiments of the present invention can therefore solve the prior-art problem that measuring the distance between two points in space with a terminal device relies heavily on the IMU of the terminal device.
The control method of the terminal device provided by the embodiments of the present application can be applied to a terminal device with an operating system. The operating system may be the Android operating system, the iOS operating system, or another possible operating system; the embodiments of the present application are not limited in this respect.
Taking the Android operating system as an example, the software environment to which the control method of the terminal device provided by the embodiments of the present application is applied is introduced below.
Fig. 1 is an architecture diagram of a possible Android operating system provided by an embodiment of the present application. In Fig. 1, the architecture of the Android operating system includes 4 layers: the application layer, the application framework layer, the system runtime library layer, and the kernel layer (which may specifically be the Linux kernel).
The application layer includes the application programs in the Android operating system (both system applications and third-party applications).
The application framework layer is the framework of the application programs; developers can develop application programs based on the application framework layer while abiding by its development principles.
The system runtime library layer includes libraries (also referred to as system libraries) and the Android operating system runtime environment. The libraries mainly provide the various resources needed by the Android operating system, while the runtime environment provides the software environment for the Android operating system.
The kernel layer is the operating system layer of the Android operating system and belongs to the lowest level of the Android operating system software hierarchy. Based on the Linux kernel, the kernel layer provides core system services and hardware-related drivers for the Android operating system.
Taking the Android operating system as an example, in the embodiments of the present application, developers can develop a software program implementing the control method of the terminal device provided by the embodiments of the present application based on the system architecture of the Android operating system shown in Fig. 1, so that the control method runs on the Android operating system shown in Fig. 1. That is, the processor or the terminal device can implement the control method of the terminal device provided by the embodiments of the present application by running the software program in the Android operating system.
The terminal device provided by the embodiments of the present application may be a mobile phone, a tablet computer, a laptop, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a smartwatch, a smart bracelet, or another type of terminal device; the embodiments of the present application are not limited in this respect.
The distance measurement method provided by the embodiments of the present invention is applied to a terminal device. As shown in Fig. 2, the distance measurement method provided by the embodiments of the present invention includes the following steps 11 to 15.
Step 11: obtain a first frame of color image and the depth data of each feature point in the first frame of color image.
The first frame of color image contains a first feature point.
Specifically, the color camera and the depth camera of the terminal device can be controlled to acquire data simultaneously, so that while a frame of color image is obtained, the depth data of each feature point in that color image is obtained as well.
In addition, the hardware device for obtaining the color image may be the RGB camera of the terminal device, and the hardware device for obtaining the depth data may be a time-of-flight (TOF) camera. That is, the first frame of color image may be an RGB image, and the depth data of each feature point in the first frame of color image may be the TOF data of each feature point in the first frame of color image.
Further, the first feature point in the above embodiment can be selected by the user. The process by which the user selects the first feature point may include the following steps 111 and 112.
Step 111: receive a first input from the user, the first input being an input on a certain feature point in the preview screen.
Step 112: in response to the first input, determine the preview screen on which the user's first input is received as the first frame of color image, and determine the feature point on which the user's first input is received in the preview screen as the first feature point.
Step 12: determine first position information according to the first frame of color image and the depth data of the first feature point.
The first position information is used to indicate the position of the first feature point relative to the terminal device at the time the first frame of color image is acquired.
Specifically, since the field of view with which the terminal device acquires color images is a fixed value, and both the position of the first feature point in the first frame of color image and the distance between the first feature point and the terminal device are available, the first position information can be determined according to the first frame of color image and the depth data of the first feature point.
Optionally, determining the first position information according to the first frame of color image and the depth data of the first feature point in step 12 may include the following steps 121 and 122.
Step 121: establish a world coordinate system with the position of the terminal device at the time the first frame of color image is acquired as the origin.
That is, the world coordinates of the terminal device at the time the first frame of color image is acquired are (0, 0, 0).
Step 122: calculate the world coordinates of the first feature point according to the first frame of color image and the depth data of the first feature point.
That is, the coordinates of the first feature point in the established world coordinate system are calculated according to the first frame of color image and the depth data of the first feature point.
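The calculation in step 122 can be sketched as a standard pinhole back-projection. The patent does not specify the camera model or any numeric values, so the focal lengths, principal point, and pixel/depth numbers below are illustrative assumptions:

```python
import numpy as np

def backproject(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with TOF depth (metres) into 3D camera
    coordinates via the pinhole model. With the world coordinate system
    anchored at the device pose for the first frame (step 121), these
    are directly the feature point's world coordinates."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# A feature point imaged at the principal point with 1.5 m depth lies
# on the optical axis, 1.5 m in front of the device: x = y = 0, z = 1.5.
p1_world = backproject(320.0, 240.0, 1.5, 525.0, 525.0, 320.0, 240.0)
```

Any pixel off the principal point maps to a laterally offset 3D point in proportion to its depth, which is how the fixed field of view mentioned above enters the computation.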
Optionally, before step 12 (determining the first position information according to the first frame of color image and the depth data of the first feature point), the method provided by the embodiments of the invention may further include:
performing data preprocessing on the first frame of color image and the depth data of each feature point in the first frame of color image.
Specifically, the data preprocessing performed on the first frame of color image and the depth data of each feature point in the first frame of color image may include one or more of: gamma correction, zero-offset correction, skew correction, temporal filtering, distortion correction, plane correction, and flying-pixel correction.
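As an illustration of one of the listed preprocessing operations, temporal filtering of the TOF depth data can be sketched as an exponential moving average over consecutive depth frames; the patent does not specify a filter, so the filter form and the smoothing factor below are assumptions:

```python
import numpy as np

def temporal_filter(depth_frames, alpha=0.4):
    """Smooth per-pixel TOF depth over consecutive frames with an
    exponential moving average to suppress sensor noise. `alpha`
    weights the newest frame; its value is an illustrative choice."""
    smoothed = depth_frames[0].astype(float)
    for frame in depth_frames[1:]:
        smoothed = alpha * frame + (1.0 - alpha) * smoothed
    return smoothed

# Three noisy readings of a flat 2 m surface converge toward 2 m.
frames = [np.full((2, 2), 2.0 + e) for e in (0.05, -0.03, 0.02)]
smoothed = temporal_filter(frames)
```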
Step 13: obtain the second through the M-th frames of color image and the depth data of each feature point in each of those frames.
The M-th frame of color image contains a second feature point, and M is an integer greater than or equal to 2.
Likewise, the second feature point in the above embodiment can be selected by the user. The process by which the user selects the second feature point may include the following steps 131 and 132.
Step 131: after the user selects the first feature point, continuously acquire color images and the depth data of the feature points in those color images, and display the acquired color images on the screen of the terminal device in real time, until a second input from the user is received.
Step 132: in response to the second input, stop acquiring color images, determine the preview screen on which the second input is received as the M-th frame of color image, and determine the feature point on which the user's second input is received in the preview screen as the second feature point.
Step 14: determine second pose information according to the first through the M-th frames of color image, the depth data of each feature point in each of those frames, and first pose information.
The first pose information is used to indicate the position and posture of the terminal device when acquiring the first frame of color image, and the second pose information is used to indicate the position and posture of the terminal device when acquiring the M-th frame of color image.
Optionally, determining the second pose information in step 14 may be implemented by performing the following steps 141 and 142 successively on the second through the M-th frames of color image.
Step 141: perform feature point matching on the N-th frame of color image and the (N-1)-th frame of color image to obtain matching feature points.
The matching feature points are the feature points successfully matched between the N-th frame of color image and the (N-1)-th frame of color image; N is an integer, and M ≥ N ≥ 2.
Specifically, step 141 may be implemented as follows: extract feature points from the N-th frame of color image and from the (N-1)-th frame of color image respectively, then match a feature point extracted from the N-th frame of color image against all feature points extracted from the (N-1)-th frame of color image. If the match succeeds, determine this feature point as a matching feature point and match the next feature point extracted from the N-th frame of color image against all feature points extracted from the (N-1)-th frame of color image; if the match fails, proceed directly to matching the next feature point extracted from the N-th frame of color image against all feature points extracted from the (N-1)-th frame of color image. Continue until every feature point extracted from the N-th frame of color image has been matched against all feature points extracted from the (N-1)-th frame of color image.
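The matching loop described above can be sketched as a nearest-neighbour search over feature descriptors. The patent fixes neither a descriptor type nor a match criterion, so the toy 2-D descriptors and the mutual-consistency rule used below to decide a "successful match" are illustrative assumptions:

```python
import numpy as np

def match_feature_points(desc_prev, desc_cur):
    """Match each descriptor of the N-th frame (desc_cur) against all
    descriptors of the (N-1)-th frame (desc_prev). A pair counts as a
    successful match only when the two are mutual nearest neighbours;
    returns a list of (current_index, previous_index) pairs."""
    # Pairwise squared distances: rows = current frame, cols = previous frame.
    d2 = ((desc_cur[:, None, :] - desc_prev[None, :, :]) ** 2).sum(axis=2)
    nearest_prev = d2.argmin(axis=1)   # best previous match per current point
    nearest_cur = d2.argmin(axis=0)    # best current match per previous point
    return [(i, j) for i, j in enumerate(nearest_prev) if nearest_cur[j] == i]

# Two toy descriptors per frame, listed in opposite order across frames.
prev = np.array([[0.0, 0.0], [10.0, 10.0]])
cur = np.array([[10.1, 9.9], [0.1, -0.1]])
matches = match_feature_points(prev, cur)
```

A point in the N-th frame with no plausible counterpart simply fails the mutual check and is skipped, mirroring the "if the match fails, proceed to the next feature point" branch.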
Step 142: determine the position and posture of the terminal device when acquiring the N-th frame of color image, according to the intrinsic and extrinsic parameters of the camera of the terminal device, the depth data of the matching feature points in the N-th frame of color image, the depth data of the matching feature points in the (N-1)-th frame of color image, and the position and posture of the terminal device when acquiring the (N-1)-th frame of color image.
Specifically, the camera intrinsic parameters of the terminal device are parameters related to the characteristics of the camera itself. Under normal circumstances, the camera intrinsic parameters consist of two parts: one part is the parameters of the projective transformation itself, i.e. the distance from the focus of the camera to the imaging plane, namely the focal length; the other part is the transformation matrix from the imaging-plane coordinate system to the pixel coordinate system. The camera extrinsic parameters describe the movement of the camera in a static scene or, when the camera is fixed, the movement of the objects captured by the camera; the relative motion of the camera across continuously captured frames can be obtained from the camera extrinsic parameters. The intrinsic and extrinsic parameters of the camera of the terminal device are preset parameters and are generally stored in the terminal device, so they can be read directly from the terminal device.
Further, the intrinsic and extrinsic parameters of the camera of the terminal device include: the intrinsic parameters of the camera that acquires the color images, the extrinsic parameters of the camera that acquires the color images, the intrinsic parameters of the camera that acquires the depth data, and the extrinsic parameters of the camera that acquires the depth data. In the case where the camera acquiring the color images is an RGB camera and the camera acquiring the depth data is a TOF camera, the intrinsic and extrinsic parameters of the camera of the terminal device include: the intrinsic and extrinsic parameters of the RGB camera, and the intrinsic and extrinsic parameters of the TOF camera.
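Step 142 does not spell out the pose computation itself. One conventional way to realise it, which is an assumption here rather than the patent's stated algorithm, is to back-project the matching feature points of both frames to 3D using their depth data and the camera intrinsics, then recover the frame-to-frame rigid transform with the SVD-based (Kabsch) least-squares fit:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) with dst ≈ R @ src + t,
    fitted over matched 3D feature points of two consecutive frames
    (Kabsch/SVD method). src, dst: (n, 3) arrays, row i matching row i."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)     # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t

# Synthetic check: points rotated 90 degrees about z and translated.
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([0.5, -0.2, 1.0])
src = np.array([[0.0, 0.0, 2.0], [1.0, 0.0, 2.5],
                [0.0, 1.0, 3.0], [1.0, 1.0, 2.0]])
dst = src @ R_true.T + t_true
R, t = rigid_transform(src, dst)
```

Chaining this relative transform onto the known pose for the (N-1)-th frame yields the pose for the N-th frame.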
Illustratively, the implementation of step 14 above is described below taking M = 3 as an example. When M = 3, the process of determining the second pose information includes the following steps a to d.

Step a: perform feature point matching between the second frame color image and the first frame color image to obtain first matching feature points.

Step b: according to the intrinsic and extrinsic parameters of the camera of the terminal device, the depth data of the first matching feature points in the first frame color image, the depth data of the first matching feature points in the second frame color image, and the position and posture of the terminal device when the first frame color image was acquired, determine the position and posture of the terminal device when the second frame color image was acquired.

Step c: perform feature point matching between the third frame color image and the second frame color image to obtain second matching feature points.

Step d: according to the intrinsic and extrinsic parameters of the camera of the terminal device, the depth data of the second matching feature points in the third frame color image, the depth data of the second matching feature points in the second frame color image, and the position and posture of the terminal device when the second frame color image was acquired, determine the position and posture of the terminal device when the third frame color image was acquired (i.e., the second pose information).
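The chaining in steps a to d can be pictured as composing relative motions: each pairwise match yields the camera motion between adjacent frames, and multiplying the transforms gives the pose at frame 3 relative to frame 1. A minimal sketch with assumed values (the 4x4 transforms below are stand-ins, not values from the patent):

```python
import numpy as np

def compose(pose_prev, rel):
    """Chain a relative camera motion onto the previous absolute pose.
    Both arguments are 4x4 homogeneous transforms (rotation + translation)."""
    return pose_prev @ rel

# Pose at frame 1: origin of the world coordinate system (identity transform).
pose1 = np.eye(4)

# Assumed relative motions recovered from feature matching plus depth data:
# 10 cm along x between frames 1-2, then 10 cm along x between frames 2-3.
rel12 = np.eye(4); rel12[0, 3] = 0.10
rel23 = np.eye(4); rel23[0, 3] = 0.10

pose2 = compose(pose1, rel12)   # step b: pose when frame 2 was captured
pose3 = compose(pose2, rel23)   # step d: pose when frame 3 was captured
```

With these assumed motions, frame 3 ends up 0.2 m from the frame-1 origin, which is exactly the second pose information for M = 3.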
Likewise, before step 14 above (determining the second pose information according to the first frame color image through the M-th frame color image, the depth data of each feature point in each of those frames, and the first pose information), the embodiment of the present invention may also perform data preprocessing on the second frame color image through the M-th frame color image and on the depth data of each feature point in each of those frames.
Step 15: determine the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first location information, the M-th frame color image, and the depth data of the second feature point.
Optionally, in the case where a world coordinate system is established with the position of the terminal device at the time the first frame color image was acquired as the origin, the determination in step 15 above of the distance between the first feature point and the second feature point (according to the first pose information, the second pose information, the first location information, the M-th frame color image, and the depth data of the second feature point) includes the following steps 151 to 154.
Step 151: determine the world coordinates and deflection angles of the terminal device at the time the M-th frame color image was acquired, according to the first pose information and the second pose information.
Specifically, the first pose information indicates the position and posture of the terminal device when the first frame color image was acquired, and the world coordinate system is established with that position as its origin; the world coordinates of the terminal device at the time the M-th frame color image was acquired can therefore be determined first. Secondly, once the world coordinate system has been established, the deflection angles of the terminal device in all directions when the first frame color image was acquired can be determined. For example, one axis of the world coordinate system can be aligned with the focal direction of the terminal device when the first frame color image was acquired, so that the deflection angles of the terminal device in all directions are zero. Then, from the deflection angles of the terminal device in all directions when the first frame color image was acquired, the deflection angles of the terminal device in all directions when the M-th frame color image was acquired can also be calculated.
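As a sketch of step 151: if the device pose at the M-th frame is held as a 4x4 transform relative to the first-frame world origin (where all deflection angles are zero), the world coordinates and a deflection angle can be read off directly. The sketch below is a simplified illustration that tracks only the rotation about one axis, and all the numeric values are assumptions.

```python
import numpy as np

def rot_z(theta):
    """Rotation about the z axis, one of the 'deflection angles' in the text."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def device_state(pose):
    """World coordinates and z-axis deflection angle of the device,
    extracted from its 4x4 pose relative to the first-frame origin."""
    coords = pose[:3, 3]
    yaw = np.arctan2(pose[1, 0], pose[0, 0])
    return coords, yaw

pose_M = np.eye(4)
pose_M[:3, :3] = rot_z(np.pi / 6)   # device turned 30 degrees since frame 1
pose_M[:3, 3] = [1.0, 2.0, 0.5]     # and moved to (1, 2, 0.5) in world units

coords, yaw = device_state(pose_M)
```

Because the first-frame deflection angles are defined to be zero, the angle recovered here is already the deflection accumulated up to the M-th frame.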
Step 152: determine second location information according to the M-th frame color image and the depth data of the second feature point.

The second location information is used to indicate the position of the second feature point relative to the terminal device at the time the M-th frame color image was acquired.
The principle of determining the second location information from the M-th frame color image and the depth data of the second feature point is the same as that of determining the first location information from the first frame color image and the depth data of the first feature point, and is not repeated here.
Step 153: determine the world coordinates of the second feature point according to the second location information and the world coordinates and deflection angles of the terminal device when the M-th frame color image was acquired.
Specifically, since the world coordinates and deflection angles of the terminal device at the time the M-th frame color image was acquired have been obtained, and the second location information indicates the position of the second feature point relative to the terminal device at that time, the world coordinates of the second feature point can be determined from the second location information together with the world coordinates and deflection angles of the terminal device when the M-th frame color image was acquired.
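A compact way to state step 153: the second location information is a point expressed in the device (camera) frame, and the device's world pose at the M-th frame rotates and translates that point into world coordinates. A sketch under assumed values:

```python
import numpy as np

def to_world(pose, p_cam):
    """Map a point expressed relative to the device into world coordinates,
    using the device's 4x4 pose at the moment the frame was captured."""
    R, t = pose[:3, :3], pose[:3, 3]
    return R @ p_cam + t

# Assumed device pose at the M-th frame: no rotation, moved 1 m along x.
pose_M = np.eye(4)
pose_M[0, 3] = 1.0

p_cam = np.array([0.0, 0.0, 2.0])   # second feature point, 2 m in front of the device
p_world = to_world(pose_M, p_cam)   # world coordinates of the second feature point
```

Here the feature point that sits 2 m in front of the displaced device lands at (1, 0, 2) in the world frame anchored at the first-frame position.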
Step 154: calculate the distance between the first feature point and the second feature point according to the world coordinates of the first feature point and the world coordinates of the second feature point.
Since the coordinates of the two feature points to be measured (the first feature point and the second feature point) in the same coordinate system (the world coordinate system) have been obtained, the world coordinates (x1, y1, z1) of the first feature point and the world coordinates (x2, y2, z2) of the second feature point are substituted into the formula:

|AB| = sqrt((x1 - x2)^2 + (y1 - y2)^2 + (z1 - z2)^2)

so that the distance |AB| between the first feature point and the second feature point can be calculated.
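Step 154 then reduces to a plain Euclidean distance between the two world-coordinate points, for example (with assumed coordinates):

```python
import numpy as np

A = np.array([0.0, 0.0, 0.0])    # world coordinates (x1, y1, z1) of the first feature point
B = np.array([3.0, 4.0, 12.0])   # world coordinates (x2, y2, z2) of the second feature point

# |AB| = sqrt((x1 - x2)^2 + (y1 - y2)^2 + (z1 - z2)^2)
dist = float(np.linalg.norm(A - B))
```

For these sample points the measured distance is 13.0 world units.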
The distance measurement method provided in the embodiment of the present invention first obtains the first frame color image, which includes the first feature point, and the depth data of each feature point in the first frame color image, and then determines, from the first frame color image and the depth data of the first feature point, the position of the first feature point relative to the terminal device when the first frame color image was acquired. It then obtains the second frame color image through the M-th frame color image and the depth data of each feature point in each of those frames, and determines, from the first frame color image through the M-th frame color image, the depth data of each feature point in each of those frames, and the first pose information (which indicates the position and posture of the terminal device when the first frame color image was acquired), the second pose information, which indicates the position and posture of the terminal device when the M-th frame color image was acquired. Finally, it determines the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first location information, the M-th frame color image, and the depth data of the second feature point. That is, when measuring the distance between the first feature point and the second feature point, the embodiment of the present invention can determine the second pose information from the color images, the depth data of their feature points, and the first pose information, and thereby obtain the distance between the first feature point and the second feature point without using an IMU. The embodiment of the present invention can therefore solve the prior-art problem that measuring the distance between two points in space with a terminal device depends heavily on the IMU of the terminal device.
In some embodiments of the present invention, the terminal device may be divided into functional modules according to the above method example. For example, each functional module may correspond to one function, or two or more functions may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of modules in some embodiments of the present invention is schematic and is only a logical functional division; other division manners are possible in actual implementation.
In the case of integrated units, Fig. 3 shows a possible structural schematic diagram of the terminal device involved in the above embodiment. The terminal device 300 includes:
an acquisition unit 31, configured to obtain a first frame color image and the depth data of each feature point in the first frame color image, the first frame color image including a first feature point;

a determination unit 32, configured to determine first location information according to the first frame color image and the depth data of the first feature point, the first location information being used to indicate the position of the first feature point relative to the terminal device when the first frame color image was acquired;
the acquisition unit 31 being further configured to obtain a second frame color image through an M-th frame color image and the depth data of each feature point in each of the second through M-th frame color images, the M-th frame color image including a second feature point, M being an integer greater than or equal to 2;
the determination unit 32 being further configured to determine second pose information according to the first frame color image through the M-th frame color image, the depth data of each feature point in each of those frames, and first pose information, the first pose information being used to indicate the position and posture of the terminal device when the first frame color image was acquired, and the second pose information being used to indicate the position and posture of the terminal device when the M-th frame color image was acquired; and

a processing unit 33, configured to determine the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first location information, the M-th frame color image, and the depth data of the second feature point.
Optionally, the determination unit 32 is specifically configured to perform the following operations in turn on the second frame color image through the M-th frame color image:

performing feature point matching between the N-th frame color image and the (N-1)-th frame color image to obtain matching feature points, the matching feature points being feature points successfully matched between the N-th frame color image and the (N-1)-th frame color image, N being an integer and M >= N >= 2; and

determining the position and posture of the terminal device when the N-th frame color image was acquired, according to the intrinsic and extrinsic parameters of the camera of the terminal device, the depth data of the matching feature points in the N-th frame color image, the depth data of the matching feature points in the (N-1)-th frame color image, and the position and posture of the terminal device when the (N-1)-th frame color image was acquired.
Optionally, the determination unit 32 is specifically configured to establish a world coordinate system with the position of the terminal device when the first frame color image was acquired as the origin, and to calculate the world coordinates of the first feature point according to the first frame color image and the depth data of the first feature point.
Optionally, the processing unit 33 is specifically configured to: determine the world coordinates and deflection angles of the terminal device when the M-th frame color image was acquired, according to the first pose information and the second pose information; determine second location information according to the M-th frame color image and the depth data of the second feature point, the second location information being used to indicate the position of the second feature point relative to the terminal device when the M-th frame color image was acquired; determine the world coordinates of the second feature point according to the second location information and the world coordinates and deflection angles of the terminal device when the M-th frame color image was acquired; and calculate the distance between the first feature point and the second feature point according to the world coordinates of the first feature point and the world coordinates of the second feature point.
Optionally, each frame color image is an RGB image, and the depth data of each feature point in each frame color image is the TOF data of that feature point in the frame color image.
The terminal device provided in the embodiment of the present invention includes an acquisition unit, a determination unit, and a processing unit. The acquisition unit can obtain the first frame color image, which includes the first feature point, and the depth data of each feature point in the first frame color image. The determination unit can determine, from the first frame color image and the depth data of the first feature point, the position of the first feature point relative to the terminal device when the first frame color image was acquired. The acquisition unit can further obtain the second frame color image through the M-th frame color image and the depth data of each feature point in each of those frames, and the determination unit can determine, from the first frame color image through the M-th frame color image, the depth data of each feature point in each of those frames, and the first pose information (indicating the position and posture of the terminal device when the first frame color image was acquired), the second pose information, which indicates the position and posture of the terminal device when the M-th frame color image was acquired. The processing unit can then determine the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first location information, the M-th frame color image, and the depth data of the second feature point. That is, when measuring the distance between the first feature point and the second feature point, the embodiment of the present invention can determine the second pose information from the color images, the depth data of their feature points, and the first pose information, and thereby obtain the distance between the first feature point and the second feature point without using an IMU, thus solving the prior-art problem that measuring the distance between two points in space with a terminal device depends heavily on the IMU of the terminal device.
Fig. 4 is a schematic diagram of the hardware structure of a terminal device implementing an embodiment of the present invention. The terminal device 100 includes, but is not limited to, components such as a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art will appreciate that the terminal device structure shown in Fig. 4 does not constitute a limitation on the terminal device; the terminal device may include more or fewer components than illustrated, combine certain components, or arrange the components differently. In the embodiments of the present application, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a laptop computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The input unit 104 is configured to obtain a first frame color image and the depth data of each feature point in the first frame color image, the first frame color image including a first feature point.

The processor 110 is configured to determine first location information according to the first frame color image and the depth data of the first feature point, the first location information being used to indicate the position of the first feature point relative to the terminal device when the first frame color image was acquired.

The input unit 104 is further configured to obtain a second frame color image through an M-th frame color image and the depth data of each feature point in each of the second through M-th frame color images, the M-th frame color image including a second feature point, M being an integer greater than or equal to 2.
The processor 110 is further configured to determine second pose information according to the first frame color image through the M-th frame color image, the depth data of each feature point in each of those frames, and first pose information, and to determine the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first location information, the M-th frame color image, and the depth data of the second feature point; wherein the first pose information is used to indicate the position and posture of the terminal device when the first frame color image was acquired, and the second pose information is used to indicate the position and posture of the terminal device when the M-th frame color image was acquired.
The terminal device provided in the embodiment of the present invention can obtain the first frame color image, which includes the first feature point, and the depth data of each feature point in the first frame color image; determine, from the first frame color image and the depth data of the first feature point, the position of the first feature point relative to the terminal device when the first frame color image was acquired; obtain the second frame color image through the M-th frame color image and the depth data of each feature point in each of those frames; determine, from the first frame color image through the M-th frame color image, the depth data of each feature point in each of those frames, and the first pose information (indicating the position and posture of the terminal device when the first frame color image was acquired), the second pose information, which indicates the position and posture of the terminal device when the M-th frame color image was acquired; and determine the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first location information, the M-th frame color image, and the depth data of the second feature point. That is, when measuring the distance between the first feature point and the second feature point, the embodiment of the present invention can determine the second pose information from the color images, the depth data of their feature points, and the first pose information, and thereby obtain the distance between the first feature point and the second feature point without using an IMU, thus solving the prior-art problem that measuring the distance between two points in space with a terminal device depends heavily on the IMU of the terminal device.
It should be understood that, in the embodiments of the present application, the radio frequency unit 101 may be used to send and receive signals during messaging or a call; specifically, it receives downlink data from a base station and passes it to the processor 110 for processing, and it sends uplink data to the base station. In general, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 may also communicate with networks and other devices through a wireless communication system.
The terminal device provides the user with wireless broadband Internet access through the network module 102, for example helping the user to send and receive e-mail, browse web pages, and access streaming media.
The audio output unit 103 can convert audio data received by the radio frequency unit 101 or the network module 102, or stored in the memory 109, into an audio signal and output it as sound. Moreover, the audio output unit 103 can also provide audio output related to a specific function performed by the terminal device 100 (for example, a call signal reception sound or a message reception sound). The audio output unit 103 includes a loudspeaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive audio signals, video signals, and optical signals. The input unit 104 may include devices such as a color camera, a depth camera, a graphics processor (Graphics Processing Unit, GPU) 1041, and a microphone 1042. The graphics processor 1041 processes still pictures or color images of video obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or another storage medium) or sent via the radio frequency unit 101 or the network module 102. The microphone 1042 can receive sound and process it into audio data. In a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 and output.
The terminal device 100 further includes at least one sensor 105, such as an optical sensor, a motion sensor, and other sensors. Specifically, the optical sensor includes an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 1061 according to the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear. As a kind of motion sensor, an accelerometer can detect the magnitude of acceleration in all directions (generally three axes) and, when stationary, can detect the magnitude and direction of gravity; it can be used to identify the posture of the terminal device (such as landscape/portrait switching, related games, and magnetometer pose calibration) and for vibration-recognition functions (such as a pedometer or tapping). The sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described in detail here.
The display unit 106 is used to display information input by the user or information provided to the user. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, collects touch operations by the user on or near it (such as operations by the user with a finger, a stylus, or any suitable object or attachment on or near the touch panel 1071). The touch panel 1071 may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects the touch orientation of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection apparatus, converts it into contact coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may also include other input devices 1072. Specifically, the other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as a volume control key and a switch key), a trackball, a mouse, and a joystick, which are not described in detail here.
Further, the touch panel 1071 may cover the display panel 1061. After detecting a touch operation on or near it, the touch panel 1071 transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in Fig. 4 the touch panel 1071 and the display panel 1061 realize the input and output functions of the terminal device as two independent components, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to realize the input and output functions of the terminal device, which is not specifically limited here.
The interface unit 108 is an interface through which an external device is connected to the terminal device 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (for example, data information or electric power) from an external device and transmit the received input to one or more elements in the terminal device 100, or may be used to transmit data between the terminal device 100 and an external device.
The memory 109 may be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book) and the like. In addition, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another volatile solid-state storage device.
The processor 110 is the control center of the terminal device; it connects the parts of the entire terminal device through various interfaces and lines, and performs the various functions of the terminal device and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, thereby monitoring the terminal device as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 110.
The terminal device 100 may also include a power supply 111 (such as a battery) that supplies power to the components. Preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to realize functions such as charging management, discharging management, and power consumption management through the power management system.

In addition, the terminal device 100 includes some functional modules that are not shown, which are not described in detail here.
It should be noted that, in this document, the terms "include", "comprise", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or apparatus. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or apparatus that includes that element.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the preferred implementation. Based on this understanding, the technical solution of the present invention, in essence, or the part that contributes to the prior art, can be embodied in the form of a software product; the computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network-side device, or the like) to execute the methods described in the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the invention is not limited to the above specific embodiments. The above specific embodiments are only illustrative, not restrictive; inspired by the present invention, those skilled in the art can also make many other forms without departing from the purpose of the invention and the scope protected by the claims, all of which fall within the protection of the invention.
Claims (11)
1. A distance measurement method, characterized in that it is applied to a terminal device, the method comprising:

obtaining a first frame color image and the depth data of each feature point in the first frame color image, the first frame color image including a first feature point;
determining first location information according to the first frame color image and the depth data of the first feature point, the first location information being used to indicate the position of the first feature point relative to the terminal device when the first frame color image was acquired;
obtaining a second frame color image through an M-th frame color image and the depth data of each feature point in each of the second through M-th frame color images, the M-th frame color image including a second feature point, M being an integer greater than or equal to 2;
determining second pose information according to the first frame color image through the M-th frame color image, the depth data of each feature point in each of those frames, and first pose information, the first pose information being used to indicate the position and posture of the terminal device when the first frame color image was acquired, and the second pose information being used to indicate the position and posture of the terminal device when the M-th frame color image was acquired; and
According to first posture information, second posture information, the first location information, the M frame color image
And the depth data of the second feature point, determine the distance between the fisrt feature point and second feature point.
2. The method according to claim 1, wherein determining the second pose information according to the first through M-th frame color images, the depth data of each feature point in each of the first through M-th frame color images, and the first pose information comprises:
performing the following operations on the second through M-th frame color images in sequence:
performing feature point matching between an N-th frame color image and an (N-1)-th frame color image to obtain matched feature points, wherein the matched feature points are feature points successfully matched between the N-th frame color image and the (N-1)-th frame color image, and N is an integer with M ≥ N ≥ 2; and
determining a position and an attitude of the terminal device when capturing the N-th frame color image according to intrinsic and extrinsic parameters of a camera of the terminal device, depth data of the matched feature points in the N-th frame color image, depth data of the matched feature points in the (N-1)-th frame color image, and the position and attitude of the terminal device when capturing the (N-1)-th frame color image.
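The frame-to-frame chaining described in claim 2 can be illustrated with a minimal sketch. The claim solves for full position and attitude from the matched feature points, their depth data, and the camera's intrinsic and extrinsic parameters; the sketch below simplifies to a pure-translation model (attitude assumed unchanged between consecutive frames), so the relative motion reduces to the mean displacement of the matched 3D points. All function names and numeric values are illustrative, not from the patent.

```python
def relative_translation(pts_prev, pts_curr):
    """Mean displacement of matched 3D points between two frames.
    Under a pure-translation assumption, a point that appears to move
    by -t in the camera frame implies the camera moved by +t."""
    n = len(pts_prev)
    return tuple(
        sum(p[i] - c[i] for p, c in zip(pts_prev, pts_curr)) / n
        for i in range(3)
    )

def chain_pose(pose_prev, pts_prev, pts_curr):
    """Compose the pose of frame N-1 with the estimated relative
    motion to get the pose of frame N (position only here; the full
    method also carries attitude)."""
    t = relative_translation(pts_prev, pts_curr)
    return tuple(pose_prev[i] + t[i] for i in range(3))

# Matched points seen 0.1 m closer in z in frame N: the camera
# advanced roughly 0.1 m along z.
prev = [(0.0, 0.0, 2.0), (0.5, 0.0, 2.5)]
curr = [(0.0, 0.0, 1.9), (0.5, 0.0, 2.4)]
pose = chain_pose((0.0, 0.0, 0.0), prev, curr)
print(pose)  # camera position after frame N, ~0.1 m along z
```

Applying this operation in sequence from frame 2 through frame M, as the claim specifies, accumulates the pose at frame M from the known pose at frame 1.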
3. The method according to claim 1, wherein determining the first position information according to the first frame color image and the depth data of the first feature point comprises:
establishing a world coordinate system with, as its origin, the position of the terminal device when capturing the first frame color image; and
calculating world coordinates of the first feature point according to the first frame color image and the depth data of the first feature point.
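Claim 3 fixes the world origin at the terminal's position for the first frame, so back-projecting the first feature point's pixel through the pinhole camera model with its depth value directly yields world coordinates. A minimal sketch, assuming example pinhole intrinsics (`fx`, `fy`, `cx`, `cy` are illustrative values, not from the patent):

```python
def backproject(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with metric depth into the camera
    frame using the pinhole model; with the world origin fixed at the
    camera position for frame 1, these coordinates are already world
    coordinates for the first frame."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Illustrative intrinsics (not from the patent).
fx = fy = 500.0
cx, cy = 320.0, 240.0

# First feature point: pixel (400, 260) observed at 2.0 m depth.
p1_world = backproject(400, 260, 2.0, fx, fy, cx, cy)
print(p1_world)  # (0.32, 0.08, 2.0)
```

A point at the principal point back-projects straight down the optical axis, which is a quick sanity check on the intrinsics.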
4. The method according to claim 3, wherein determining the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first position information, the M-th frame color image, and the depth data of the second feature point comprises:
determining world coordinates and a deflection angle of the terminal device when capturing the M-th frame color image according to the first pose information and the second pose information;
determining second position information according to the M-th frame color image and the depth data of the second feature point, wherein the second position information indicates a position of the second feature point relative to the terminal device when the M-th frame color image is captured;
determining world coordinates of the second feature point according to the second position information and the world coordinates and deflection angle of the terminal device when capturing the M-th frame color image; and
calculating the distance between the first feature point and the second feature point according to the world coordinates of the first feature point and the world coordinates of the second feature point.
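The last two steps of claim 4 amount to a rigid transform followed by a Euclidean distance. The claim does not pin down a rotation convention for the deflection angle; the sketch below assumes it is a single yaw rotation about the vertical axis, and all numeric values are illustrative:

```python
import math

def camera_to_world(p_cam, cam_pos, yaw):
    """Rotate a camera-frame point by the camera's yaw (about the y
    axis) and translate by the camera's world position."""
    x, y, z = p_cam
    c, s = math.cos(yaw), math.sin(yaw)
    xw = c * x + s * z + cam_pos[0]
    yw = y + cam_pos[1]
    zw = -s * x + c * z + cam_pos[2]
    return (xw, yw, zw)

def distance(p, q):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# First feature point, already in world coordinates (from frame 1).
p1 = (0.32, 0.08, 2.0)
# Second feature point: 1.5 m straight ahead of the camera at frame M,
# with the camera at world position (0, 0, 0.5) and zero yaw.
p2 = camera_to_world((0.0, 0.0, 1.5), (0.0, 0.0, 0.5), 0.0)
print(distance(p1, p2))
```

Because both feature points end up in the same world frame anchored at the first capture position, the final distance is a plain point-to-point computation, regardless of where the terminal moved in between.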
5. The method according to any one of claims 1 to 4, wherein:
each frame color image is an RGB image; and
the depth data of each feature point in any frame color image is time-of-flight data of that feature point in the frame color image.
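Claim 5's time-of-flight depth data encode the round-trip travel time of emitted light, so metric depth is recovered as half the travel time multiplied by the speed of light. A minimal conversion sketch (the nanosecond figure is illustrative):

```python
# Convert a time-of-flight measurement to metric depth.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_to_depth(round_trip_seconds):
    """Depth is half the round-trip distance travelled by the light."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A ~13.3 ns round trip corresponds to roughly 2 m of depth.
print(tof_to_depth(13.342e-9))
```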
6. A terminal device, comprising:
an acquisition unit, configured to obtain a first frame color image and depth data of each feature point in the first frame color image, wherein the first frame color image comprises a first feature point;
a determination unit, configured to determine first position information according to the first frame color image and the depth data of the first feature point, wherein the first position information indicates a position of the first feature point relative to the terminal device when the first frame color image is captured;
the acquisition unit being further configured to obtain a second frame color image through an M-th frame color image, and depth data of each feature point in each of the second through M-th frame color images, wherein the M-th frame color image comprises a second feature point, and M is an integer greater than or equal to 2;
the determination unit being further configured to determine second pose information according to the first through M-th frame color images, the depth data of each feature point in each of the first through M-th frame color images, and first pose information, wherein the first pose information indicates a position and an attitude of the terminal device when capturing the first frame color image, and the second pose information indicates a position and an attitude of the terminal device when capturing the M-th frame color image; and
a processing unit, configured to determine a distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first position information, the M-th frame color image, and the depth data of the second feature point.
7. The terminal device according to claim 6, wherein the determination unit is specifically configured to perform the following operations on the second through M-th frame color images in sequence:
performing feature point matching between an N-th frame color image and an (N-1)-th frame color image to obtain matched feature points, wherein the matched feature points are feature points successfully matched between the N-th frame color image and the (N-1)-th frame color image, and N is an integer with M ≥ N ≥ 2; and
determining a position and an attitude of the terminal device when capturing the N-th frame color image according to intrinsic and extrinsic parameters of a camera of the terminal device, depth data of the matched feature points in the N-th frame color image, depth data of the matched feature points in the (N-1)-th frame color image, and the position and attitude of the terminal device when capturing the (N-1)-th frame color image.
8. The terminal device according to claim 6, wherein the determination unit is specifically configured to establish a world coordinate system with, as its origin, the position of the terminal device when capturing the first frame color image, and to calculate world coordinates of the first feature point according to the first frame color image and the depth data of the first feature point.
9. The terminal device according to claim 8, wherein the processing unit is specifically configured to: determine world coordinates and a deflection angle of the terminal device when capturing the M-th frame color image according to the first pose information and the second pose information; determine second position information according to the M-th frame color image and the depth data of the second feature point, wherein the second position information indicates a position of the second feature point relative to the terminal device when the M-th frame color image is captured; determine world coordinates of the second feature point according to the second position information and the world coordinates and deflection angle of the terminal device when capturing the M-th frame color image; and calculate the distance between the first feature point and the second feature point according to the world coordinates of the first feature point and the world coordinates of the second feature point.
10. The terminal device according to any one of claims 6 to 9, wherein:
each frame color image is an RGB image; and
the depth data of each feature point in any frame color image is time-of-flight data of that feature point in the frame color image.
11. A terminal device, comprising: a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the distance measurement method according to any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910310118.7A CN110148167B (en) | 2019-04-17 | 2019-04-17 | Distance measuring method and terminal equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910310118.7A CN110148167B (en) | 2019-04-17 | 2019-04-17 | Distance measuring method and terminal equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110148167A true CN110148167A (en) | 2019-08-20 |
CN110148167B CN110148167B (en) | 2021-06-04 |
Family
ID=67589669
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910310118.7A Active CN110148167B (en) | 2019-04-17 | 2019-04-17 | Distance measuring method and terminal equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110148167B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111402344A (en) * | 2020-04-23 | 2020-07-10 | Oppo广东移动通信有限公司 | Calibration method, calibration device and non-volatile computer-readable storage medium |
CN114111704A (en) * | 2020-08-28 | 2022-03-01 | 华为技术有限公司 | Method and device for measuring distance, electronic equipment and readable storage medium |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105761245A (en) * | 2016-01-29 | 2016-07-13 | 速感科技(北京)有限公司 | Automatic tracking method and device based on visual feature points |
US20170094153A1 (en) * | 2015-07-02 | 2017-03-30 | Pixart Imaging Inc. | Distance measurement device based on phase difference and distance measurement method thereof |
CN106646442A (en) * | 2016-12-08 | 2017-05-10 | 努比亚技术有限公司 | Distance measurement method and terminal |
CN107016704A (en) * | 2017-03-09 | 2017-08-04 | Virtual reality implementation method based on augmented reality |
CN107167139A (en) * | 2017-05-24 | 2017-09-15 | Vision-based positioning and navigation method and system for an intelligent mobile robot |
CN107292949A (en) * | 2017-05-25 | 2017-10-24 | Three-dimensional scene reconstruction method, device and terminal device |
CN107369183A (en) * | 2017-07-17 | 2017-11-21 | MAR tracking registration method and system based on graph-optimization SLAM |
US20180041747A1 (en) * | 2016-08-03 | 2018-02-08 | Samsung Electronics Co., Ltd. | Apparatus and method for processing image pair obtained from stereo camera |
CN107705333A (en) * | 2017-09-21 | 2018-02-16 | 歌尔股份有限公司 | Space-location method and device based on binocular camera |
KR101865173B1 (en) * | 2017-02-03 | 2018-06-07 | (주)플레이솔루션 | Method for generating movement of motion simulator using image analysis of virtual reality contents |
CN109029417A (en) * | 2018-05-21 | 2018-12-18 | UAV SLAM method based on hybrid visual odometry and multi-scale map |
CN109102541A (en) * | 2018-07-13 | 2018-12-28 | Distance measurement method and device for a smartphone with an integrated depth camera |
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170094153A1 (en) * | 2015-07-02 | 2017-03-30 | Pixart Imaging Inc. | Distance measurement device based on phase difference and distance measurement method thereof |
US10148864B2 (en) * | 2015-07-02 | 2018-12-04 | Pixart Imaging Inc. | Imaging device having phase detection pixels and regular pixels, and operating method thereof |
CN105761245A (en) * | 2016-01-29 | 2016-07-13 | 速感科技(北京)有限公司 | Automatic tracking method and device based on visual feature points |
US20180041747A1 (en) * | 2016-08-03 | 2018-02-08 | Samsung Electronics Co., Ltd. | Apparatus and method for processing image pair obtained from stereo camera |
CN106646442A (en) * | 2016-12-08 | 2017-05-10 | 努比亚技术有限公司 | Distance measurement method and terminal |
KR101865173B1 (en) * | 2017-02-03 | 2018-06-07 | (주)플레이솔루션 | Method for generating movement of motion simulator using image analysis of virtual reality contents |
CN107016704A (en) * | 2017-03-09 | 2017-08-04 | Virtual reality implementation method based on augmented reality |
CN107167139A (en) * | 2017-05-24 | 2017-09-15 | Vision-based positioning and navigation method and system for an intelligent mobile robot |
CN107292949A (en) * | 2017-05-25 | 2017-10-24 | Three-dimensional scene reconstruction method, device and terminal device |
CN107369183A (en) * | 2017-07-17 | 2017-11-21 | MAR tracking registration method and system based on graph-optimization SLAM |
CN107705333A (en) * | 2017-09-21 | 2018-02-16 | 歌尔股份有限公司 | Space-location method and device based on binocular camera |
CN109029417A (en) * | 2018-05-21 | 2018-12-18 | UAV SLAM method based on hybrid visual odometry and multi-scale map |
CN109102541A (en) * | 2018-07-13 | 2018-12-28 | Distance measurement method and device for a smartphone with an integrated depth camera |
Non-Patent Citations (3)
Title |
---|
BISHWAJIT PAL ET AL: "3D Point Cloud Generation from 2D Depth Camera Images Using Successive Triangulation", International Conference on Innovative Mechanisms for Industry Applications * |
WEIGUO ZHOU ET AL: "Efficient and Fast Implementation of Embedded", IEEE Sensors Journal * |
WEI XU: "Research on Close-Range Relative Pose Measurement Technology for Non-Cooperative Space Targets", China Master's Theses Full-Text Database, Information Science and Technology Series * |
Also Published As
Publication number | Publication date |
---|---|
CN110148167B (en) | 2021-06-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110891144B (en) | Image display method and electronic equipment | |
CN108628985B (en) | Photo album processing method and mobile terminal | |
CN109743498A (en) | Shooting parameter adjustment method and terminal device | |
CN110505403A (en) | Video recording processing method and device | |
CN109862504A (en) | Display method and terminal device | |
CN109618218B (en) | Video processing method and mobile terminal | |
CN110213485A (en) | Image processing method and terminal | |
CN108828692A (en) | Weather prediction method and terminal device | |
CN109241832A (en) | Face liveness detection method and terminal device | |
CN108898040A (en) | Recognition method and mobile terminal | |
CN108564613A (en) | Depth data acquisition method and mobile terminal | |
CN108881723A (en) | Image preview method and terminal | |
CN108317992A (en) | Object distance measurement method and terminal device | |
CN107255812A (en) | Speed measurement method based on 3D technology, mobile terminal and storage medium | |
CN110148167A (en) | Distance measurement method and terminal device | |
CN109005314A (en) | Image processing method and terminal | |
CN110233914A (en) | Terminal device and control method thereof | |
CN109781091A (en) | Map display method for a mobile terminal, mobile terminal and storage medium | |
CN109813300A (en) | Positioning method and terminal device | |
CN108833791A (en) | Image capturing method and device | |
CN109361864A (en) | Shooting parameter setting method and terminal device | |
CN109117037A (en) | Image processing method and terminal device | |
CN108898000A (en) | Screen unlocking method and terminal | |
CN108647566A (en) | Skin feature identification method and terminal | |
CN108965701B (en) | Jitter correction method and terminal equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||