CN110148167B - Distance measuring method and terminal equipment

Distance measuring method and terminal equipment

Info

Publication number
CN110148167B
Authority
CN
China
Prior art keywords
color image
frame
terminal equipment
feature point
Prior art date
Legal status
Active
Application number
CN201910310118.7A
Other languages
Chinese (zh)
Other versions
CN110148167A (en)
Inventor
周协
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201910310118.7A
Publication of CN110148167A
Application granted
Publication of CN110148167B


Classifications

    • G01B 11/14: Measuring arrangements characterised by the use of optical techniques for measuring distance or clearance between spaced objects or spaced apertures
    • G01C 3/00: Measuring distances in line of sight; optical rangefinders
    • G06T 7/55: Depth or shape recovery from multiple images
    • G06T 2207/10016: Image acquisition modality; video; image sequence
    • G06T 2207/10024: Image acquisition modality; color image


Abstract

The embodiment of the invention provides a distance measuring method and a terminal device, relates to the technical field of terminals, and aims to solve the prior-art problem that a terminal device depends heavily on an IMU when measuring the distance between two points in space. The method comprises the following steps: acquiring a first frame color image and the depth data of each feature point in it; determining first position information according to the first frame color image and the depth data of the first feature point; acquiring the second to Mth frame color images and the depth data of each feature point in each of these frames; determining second pose information according to the first to Mth frame color images, the depth data of each feature point, and first pose information; and determining the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first position information, the Mth frame color image and the depth data of the second feature point. The embodiment of the invention is used for measuring the distance between two points in space with a terminal device.

Description

Distance measuring method and terminal equipment
Technical Field
The present invention relates to the field of terminal technologies, and in particular, to a distance measurement method and a terminal device.
Background
In scenarios such as navigation and mapping, the distance between two points in space needs to be measured; measuring the distance between two unobstructed points in space with a terminal device has therefore become a research hotspot.
In the prior art, a scheme for measuring the distance between two unobstructed points in space with a terminal device works as follows: computer vision analysis is performed on an image sequence acquired by a camera of the terminal device to identify feature points in each frame; data acquired by an Inertial Measurement Unit (IMU) of the terminal device are then combined with the position changes of the feature points between adjacent frames to obtain the position and deflection information of the terminal device; a real plane in the environment is identified from the computed feature points, and finally a coordinate system is established on the identified plane to calculate the distance between two points on that plane. Under certain scenes and conditions, this prior-art scheme can measure the distance between two points in space fairly accurately, but it relies heavily on the IMU: if the IMU of the terminal device stops working, the position and deflection information of the terminal device cannot be acquired and the measurement fails; moreover, even if the IMU can continuously collect data, poor IMU accuracy may introduce a large error into the final measurement.
Disclosure of Invention
The embodiment of the invention provides a distance measuring method and a terminal device to solve the prior-art problem that a terminal device depends heavily on the IMU when measuring the distance between two points in space.
In order to solve the above technical problem, the embodiment of the present invention is implemented as follows:
in a first aspect, an embodiment of the present invention provides a distance measuring method, applied to a terminal device, where the method includes:
acquiring a first frame color image and depth data of each feature point in the first frame color image, wherein the first frame color image comprises a first feature point;
determining first position information according to the first frame color image and the depth data of the first feature point, wherein the first position information is used for indicating the relative position of the first feature point and the terminal device when the first frame color image is collected;
acquiring second to Mth frame color images and depth data of each feature point in each of the second to Mth frame color images, wherein the Mth frame color image comprises a second feature point, and M is an integer greater than or equal to 2;
determining second pose information according to the first to Mth frame color images, the depth data of each feature point in each of these frames, and first pose information, wherein the first pose information is used for indicating the position and the pose of the terminal device when the first frame color image is collected, and the second pose information is used for indicating the position and the pose of the terminal device when the Mth frame color image is collected;
and determining the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first position information, the Mth frame color image and the depth data of the second feature point.
In a second aspect, an embodiment of the present invention provides a terminal device, including:
an acquisition unit, configured to acquire a first frame color image and depth data of each feature point in the first frame color image, where the first frame color image includes a first feature point;
a determining unit, configured to determine first position information according to the first frame color image and the depth data of the first feature point, where the first position information is used to indicate the relative position of the first feature point and the terminal device when the first frame color image is collected;
the acquisition unit is further configured to acquire second to Mth frame color images and depth data of each feature point in each of the second to Mth frame color images, where the Mth frame color image includes a second feature point, and M is an integer greater than or equal to 2;
the determining unit is further configured to determine second pose information according to the first to Mth frame color images, the depth data of each feature point in each of these frames, and first pose information, where the first pose information is used to indicate the position and the pose of the terminal device when the first frame color image is collected, and the second pose information is used to indicate the position and the pose of the terminal device when the Mth frame color image is collected;
and a processing unit, configured to determine the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first position information, the Mth frame color image and the depth data of the second feature point.
In a third aspect, an embodiment of the present invention provides a terminal device, including: a processor, a memory, a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the distance measurement method according to the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, realizes the steps of the distance measurement method according to the first aspect.
The distance measuring method provided by the embodiment of the invention first obtains a first frame color image including a first feature point and depth data of each feature point in the first frame color image, and then determines, according to the first frame color image and the depth data of the first feature point, the relative position of the first feature point and the terminal device when the first frame color image is collected. It then obtains the second to Mth frame color images and the depth data of each feature point in each of these frames, and determines, according to the first to Mth frame color images, the depth data of each feature point in each frame, and first pose information indicating the position and posture of the terminal device when the first frame color image is collected, second pose information indicating the position and posture of the terminal device when the Mth frame color image is collected. Finally, it determines the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first position information, the Mth frame color image and the depth data of the second feature point. That is, when measuring the distance between the first feature point and the second feature point, the embodiment of the present invention can determine the second pose information from the color images and the depth data alone, and thereby obtain the distance between the first feature point and the second feature point, without using an IMU.
Drawings
Fig. 1 is an architecture diagram of an android operating system provided in an embodiment of the present application;
FIG. 2 is a flowchart illustrating steps of a distance measurement method according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 4 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The term "and/or" herein merely describes an association between related objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone. In addition, the character "/" herein generally indicates an "or" relationship between the former and latter related objects; in a formula, the character "/" indicates that the preceding and following objects are in a "division" relationship. The term "plurality" herein means two or more, unless otherwise specified.
For the convenience of clearly describing the technical solutions of the embodiments of the present invention, in the embodiments of the present invention, the words "first", "second", and the like are used to distinguish the same items or similar items with basically the same functions or actions, and those skilled in the art can understand that the words "first", "second", and the like do not limit the quantity and execution order.
In the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate an example, illustration or description. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention is not to be construed as preferred or more advantageous than other embodiments or designs; rather, the use of such words is intended to present related concepts in a concrete fashion. In the embodiments of the present invention, "a plurality of" means two or more unless otherwise specified.
In the prior-art distance measurement scheme, the distance between two points in space can be measured fairly accurately under certain scenes and conditions, but the scheme relies heavily on the IMU of the terminal device: if the IMU stops working, the position and deflection information of the terminal device cannot be acquired and the measurement fails; moreover, even if the IMU can continuously collect data, poor IMU accuracy may introduce a large error into the final measurement.
In order to solve the above problem, an embodiment of the present invention provides a distance measurement method and a terminal device. The distance measurement method first obtains a first frame color image including a first feature point and depth data of each feature point in the first frame color image, and then determines, according to the first frame color image and the depth data of the first feature point, the relative position of the first feature point and the terminal device when the first frame color image is acquired. It then obtains the second to Mth frame color images and the depth data of each feature point in each of these frames, and determines, according to the first to Mth frame color images, the depth data of each feature point in each frame, and first pose information indicating the position and posture of the terminal device when the first frame color image is acquired, second pose information indicating the position and posture of the terminal device when the Mth frame color image is acquired. Finally, it determines the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first position information, the Mth frame color image and the depth data of the second feature point. That is, when measuring the distance between the first feature point and the second feature point, the embodiment of the present invention can determine the second pose information from the color images and the depth data alone, and thereby obtain the distance between the first feature point and the second feature point, without using an IMU.
The distance measuring method provided by the embodiment of the application can be applied to a terminal device, and the terminal device may be a terminal device with an operating system. The operating system may be an android operating system, an iOS operating system, or another possible operating system, which is not limited in the embodiment of the present application.
Next, the software environment to which the distance measuring method provided by the embodiment of the present application applies is introduced, taking the android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present application. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is a framework of the application, and a developer can develop some applications based on the application framework layer under the condition of complying with the development principle of the framework of the application.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking the android operating system as an example, in the embodiment of the present application, a developer may develop a software program implementing the distance measuring method provided in the embodiment of the present application based on the system architecture of the android operating system shown in fig. 1, so that the distance measuring method can run on the android operating system shown in fig. 1. That is, the processor or the terminal device may implement the distance measuring method provided in the embodiment of the present application by running the software program in the android operating system.
The terminal device provided in the embodiment of the present application may be a mobile phone, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), an intelligent watch, an intelligent bracelet, or other types of terminal devices, and the embodiment of the present application is not limited.
The distance measuring method provided by the embodiment of the invention is applied to terminal equipment. Referring to fig. 2, a distance measuring method according to an embodiment of the present invention includes the following steps 11 to 15.
And 11, acquiring a first frame color image and depth data of each feature point in the first frame color image.
Wherein, the first frame color image comprises a first feature point.
Specifically, the color camera and the depth camera of the terminal device may be controlled to perform data acquisition simultaneously, so that while a frame of color image is acquired, depth data of each feature point in the color image is acquired.
Further, the hardware device for acquiring the color image may be an RGB camera of the terminal device, and the hardware device for acquiring the depth data may be a Time of Flight (TOF) camera. That is, the first frame color image may be an RGB image, and the depth data of each feature point in the first frame color image may be TOF data of each feature point in the first frame color image.
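As an illustration only, the synchronized capture described above could be represented as follows; the camera objects and their read() methods are hypothetical placeholders, since the actual capture APIs are vendor specific:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class RGBDFrame:
    """One synchronized capture: an RGB color image plus a TOF depth map
    registered to the same view, so every feature point in the color image
    has depth data."""
    color: np.ndarray  # H x W x 3 RGB image
    depth: np.ndarray  # H x W depth map in metres

def capture_frame(rgb_cam, tof_cam) -> RGBDFrame:
    # Hypothetical camera objects; real devices expose vendor-specific APIs.
    color = rgb_cam.read()
    depth = tof_cam.read()
    return RGBDFrame(color=color, depth=depth)
```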
Further, the first feature point in the above embodiments may be selected by a user. The process of the user selecting the first feature point may include the following step 111 and step 112.
And step 111, receiving a first input of a user, wherein the first input is an input of a certain characteristic point in the preview picture.
Step 112, in response to the first input, determining a preview screen receiving the first input of the user as a first frame color image and determining a feature point in the preview screen receiving the first input of the user as a first feature point.
And step 12, determining first position information according to the first frame color image and the depth data of the first feature point.
The first position information is used for indicating the relative position of the first feature point and the terminal device when the first frame color image is collected.
Specifically, the field of view of the color camera of the terminal device is a fixed value, and both the position of the first feature point within the first frame color image and the distance between the first feature point and the terminal device can be obtained; the first position information can therefore be determined from the first frame color image and the depth data of the first feature point.
Optionally, the determining the first position information according to the first frame color image and the depth data of the first feature point in step 12 may include the following steps 121 and 122.
And step 121, establishing a world coordinate system by taking the position of the terminal equipment as an origin when the first frame of color image is acquired.
Namely, the world coordinate of the terminal device when the first frame color image is acquired is (0, 0, 0).
And step 122, calculating the world coordinates of the first feature point according to the first frame color image and the depth data of the first feature point.
Namely, the coordinates of the first feature point in the established world coordinate system are calculated according to the first frame color image and the depth data of the first feature point.
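Steps 121 and 122 amount to a pinhole back-projection. The following sketch is an assumption about the implementation, not part of the patent text; it computes the world coordinates of the first feature point directly, since the world origin coincides with the camera at frame 1:

```python
import numpy as np

def backproject(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with TOF depth (in metres) to a 3-D point,
    using the camera intrinsics (focal lengths fx, fy; principal point
    cx, cy). Because the world coordinate system is anchored at the
    device's pose for frame 1, the result for the first frame is already
    the first feature point's world coordinate."""
    z = depth  # treating the TOF value as depth along the optical axis
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])
```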
Optionally, before step 12 (determining first position information according to the first frame color image and the depth data of the first feature point), the method provided in the embodiment of the present invention may further include:
and carrying out data preprocessing on the first frame color image and the depth data of each feature point in the first frame color image.
Specifically, the data preprocessing of the first frame color image and the depth data of each feature point in the first frame color image may include: performing one or more of nonlinear correction, zero-offset correction, shear correction, temporal filtering, distortion correction, plane correction and flying-pixel correction on the first frame color image and the depth data of each feature point in the first frame color image.
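As a minimal sketch of two of the corrections named above (distortion correction and temporal filtering), assuming calibrated intrinsics K and distortion coefficients are available; the remaining corrections are camera specific and omitted here:

```python
import cv2
import numpy as np

def preprocess(color, depth, K, dist_coeffs, prev_depth=None, alpha=0.8):
    """Undistort the color image with the calibrated camera model and apply
    a simple exponential temporal filter to the depth map to suppress
    frame-to-frame TOF noise."""
    color_corrected = cv2.undistort(color, K, dist_coeffs)
    if prev_depth is not None:
        depth = alpha * depth + (1.0 - alpha) * prev_depth
    return color_corrected, depth
```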
And step 13, acquiring the second to Mth frame color images and the depth data of each feature point in each of the second to Mth frame color images.
The Mth frame color image comprises a second feature point, and M is an integer greater than or equal to 2.
Also, the second feature point in the above-described embodiment may be selected by the user. The process of the user selecting the second feature point may be a process including the following steps 131 and 132.
And step 131, after the user selects the first feature point, continuing to collect color images and the depth data of the feature points in each color image, and displaying the collected color images on the screen of the terminal device in real time until a second input of the user is received.
And step 132, in response to the second input, stopping color image collection, determining the preview picture that received the second input as the Mth frame color image, and determining the feature point in the preview picture that received the second input of the user as the second feature point.
And step 14, determining second pose information according to the first to Mth frame color images, the depth data of each feature point in each of the first to Mth frame color images, and the first pose information.
The first pose information is used for indicating the position and posture of the terminal device when the first frame color image is collected, and the second pose information is used for indicating the position and posture of the terminal device when the Mth frame color image is collected.
Optionally, in step 14, the process of determining the second pose information according to the first to Mth frame color images, the depth data of each feature point in each frame, and the first pose information may include: sequentially performing the following steps 141 and 142 on the second to Mth frame color images.
And step 141, performing feature point matching on the Nth frame color image and the N-1th frame color image to obtain matching feature points.
The matching feature points are feature points successfully matched between the Nth frame color image and the N-1th frame color image, where N is an integer and M ≥ N ≥ 2.
Specifically, the implementation process of step 141 may include: extracting feature points from the Nth frame color image and from the N-1th frame color image respectively, and then matching each feature point extracted from the Nth frame color image against all feature points extracted from the N-1th frame color image. If the matching succeeds, the feature point is determined to be a matching feature point, and the next feature point extracted from the Nth frame color image is matched against all feature points extracted from the N-1th frame color image; if the matching fails, the next feature point extracted from the Nth frame color image is matched directly against all feature points extracted from the N-1th frame color image, until every feature point extracted from the Nth frame color image has been processed.
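The patent does not fix a particular detector or matcher; a common realization of step 141 uses ORB features with brute-force Hamming matching and cross-checking, as sketched below:

```python
import cv2

def match_features(img_prev, img_curr):
    """Extract feature points from frame N-1 (img_prev) and frame N
    (img_curr) and return the successfully matched pairs."""
    orb = cv2.ORB_create(nfeatures=1000)
    gray_prev = cv2.cvtColor(img_prev, cv2.COLOR_BGR2GRAY)
    gray_curr = cv2.cvtColor(img_curr, cv2.COLOR_BGR2GRAY)
    kp_prev, des_prev = orb.detectAndCompute(gray_prev, None)
    kp_curr, des_curr = orb.detectAndCompute(gray_curr, None)
    # crossCheck=True keeps only mutually best matches, i.e. the
    # "successfully matched" feature points of step 141.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_prev, des_curr),
                     key=lambda m: m.distance)
    return kp_prev, kp_curr, matches
```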
And step 142, determining the position and posture of the terminal device when the Nth frame color image is collected, according to the internal and external parameters of the cameras of the terminal device, the depth data of the matching feature points in the Nth frame color image, the depth data of the matching feature points in the N-1th frame color image, and the position and posture of the terminal device when the N-1th frame color image was collected.
Specifically, the intrinsic parameters of a camera are parameters related to the characteristics of the camera itself and generally consist of two parts: the parameters of the projective transformation itself, namely the distance from the optical center of the camera to the imaging plane, i.e. the focal length; and a transformation matrix from the imaging-plane coordinate system to the pixel coordinate system. The extrinsic parameters of a camera describe the motion of the camera in a static scene, or the motion of the photographed object when the camera is fixed; the relative motion between images shot in succession can be obtained through the extrinsic parameters. The intrinsic and extrinsic parameters of the cameras of the terminal device are fixed, are generally stored in the terminal device, and can be read from it directly.
Further, the internal parameter and the external parameter of the camera of the terminal device include: the internal parameters of the camera that collects the color image, the external parameters of the camera that collects the color image, the internal parameters of the camera that collects the depth data, and the external parameters of the camera that collects the depth data. Under the condition that the camera for collecting the color image is an RGB camera and the camera for collecting the depth data is a TOF camera, the internal parameters and the external parameters of the camera of the terminal equipment comprise: intrinsic and extrinsic parameters of an RGB camera, and intrinsic and extrinsic parameters of a TOF camera.
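One standard way to realize step 142, assumed here for illustration (the patent does not prescribe a solver), is to back-project the matching feature points to 3-D in both frames using the intrinsics and depth data, estimate the frame-to-frame rigid motion with a Kabsch/Umeyama-style SVD alignment, and chain it onto the pose at frame N-1:

```python
import numpy as np

def estimate_relative_motion(pts_prev, pts_curr):
    """Rigid alignment of the matching feature points back-projected to 3-D
    in frame N-1 (pts_prev, K x 3) and frame N (pts_curr, K x 3).
    Returns R, t such that pts_prev[i] ~= R @ pts_curr[i] + t."""
    c_prev = pts_prev.mean(axis=0)
    c_curr = pts_curr.mean(axis=0)
    H = (pts_curr - c_curr).T @ (pts_prev - c_prev)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_prev - R @ c_curr
    return R, t

def compose_pose(R_prev, t_prev, R_rel, t_rel):
    # Chain the device pose at frame N-1 with the frame-to-frame motion
    # to obtain the pose at frame N.
    return R_prev @ R_rel, R_prev @ t_rel + t_prev
```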
For example, the implementation process of step 14 is described below taking M = 3 as an example; the process of determining the second pose information when M = 3 includes the following steps a to d.
And a, performing feature point matching on the second frame color image and the first frame color image to obtain a first matching feature point.
And b, determining the position and the posture of the terminal equipment when the second frame of color image is collected according to the internal parameter and the external parameter of the camera of the terminal equipment, the depth data of the first matched feature point in the first frame of color image, the depth data of the first matched feature point in the second frame of color image and the position and the posture of the terminal equipment when the first frame of color image is collected.
And c, performing feature point matching on the third frame of color image and the second frame of color image to obtain a second matching feature point.
And d, determining the position and posture (the second pose information) of the terminal device when the third frame color image is collected, according to the internal and external parameters of the cameras of the terminal device, the depth data of the second matching feature points in the third frame color image, the depth data of the second matching feature points in the second frame color image, and the position and posture of the terminal device when the second frame color image was collected.
Similarly, before step 14 (determining the second pose information), the embodiment of the invention may perform data preprocessing on the second to Mth frame color images and the depth data of each feature point in each of them.
And step 15, determining the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first position information, the Mth frame color image and the depth data of the second feature point.
Optionally, when a world coordinate system is established with the position of the terminal device at the time the first frame color image is collected as the origin, determining the distance between the first feature point and the second feature point in step 15 according to the first pose information, the second pose information, the first position information, the Mth frame color image and the depth data of the second feature point includes the following steps 151 to 154.
And step 151, determining the world coordinate and the deflection angle of the terminal device when the Mth frame color image is collected, according to the first pose information and the second pose information.
Specifically, the first pose information indicates the position and posture of the terminal device when the first frame color image is collected, and the world coordinate system takes that position as its origin, so the world coordinate of the terminal device when the Mth frame color image is collected can be determined first. Further, once the world coordinate system is established, the deflection angle of the terminal device in each direction when the first frame color image is collected is known; for example, one coordinate axis of the world coordinate system can be aligned with the optical axis of the camera when the first frame color image is collected, so that the deflection angle of the terminal device in each direction is zero. From the deflection angles of the terminal device when the first frame color image is collected, the deflection angles of the terminal device in each direction when the Mth frame color image is collected can then be calculated.
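Assuming the world axes are aligned with the camera axes at frame 1 (so all deflection angles start at zero), the deflection angles at frame M can be read off the accumulated rotation matrix; the ZYX Euler convention below is one possible choice:

```python
import numpy as np

def deflection_angles(R_M):
    """Extract yaw/pitch/roll (degrees) from the accumulated rotation
    matrix R_M of the device at frame M, ZYX Euler convention."""
    yaw = np.arctan2(R_M[1, 0], R_M[0, 0])
    pitch = np.arcsin(-np.clip(R_M[2, 0], -1.0, 1.0))
    roll = np.arctan2(R_M[2, 1], R_M[2, 2])
    return np.degrees([yaw, pitch, roll])
```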
And step 152, determining second position information according to the Mth frame color image and the depth data of the second feature point.
The second position information is used for indicating the relative position of the second feature point and the terminal device when the Mth frame color image is acquired.
The principle of determining the second position information according to the mth frame color image and the depth data of the second feature point is the same as the principle of determining the first position information according to the first frame color image and the depth data of the first feature point, and the description is omitted.
And step 153, determining the world coordinates of the second feature point according to the second position information and the world coordinate and deflection angle of the terminal device when the Mth frame color image is collected.
Specifically, since the world coordinate and the deflection angle of the terminal device when the M-th frame color image is acquired are obtained, and the second position information may indicate the relative position of the second feature point and the terminal device when the M-th frame color image is acquired, the world coordinate of the second feature point may be determined according to the second position information and the world coordinate and the deflection angle of the terminal device when the M-th frame color image is acquired.
And step 154, calculating the distance between the first feature point and the second feature point according to the world coordinates of the first feature point and the world coordinates of the second feature point.
Since the coordinates of the two feature points to be measured (the first feature point and the second feature point) are now expressed in the same coordinate system (the world coordinate system), substituting the world coordinates (x1, y1, z1) of the first feature point and the world coordinates (x2, y2, z2) of the second feature point into the formula

$$|AB| = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2 + (z_2 - z_1)^2}$$

yields the distance |AB| between the first feature point and the second feature point.
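Putting steps 151 to 154 together as a sketch (the function names and the pinhole model are illustrative assumptions): back-project the second feature point in frame M, move it into the world frame with the device pose (R_M, t_M), then take the Euclidean distance to the first feature point's world coordinate:

```python
import numpy as np

def distance_between_points(p1_world, uv2, depth2, fx, fy, cx, cy, R_M, t_M):
    """p1_world: world coordinates (x1, y1, z1) of the first feature point.
    uv2/depth2: pixel position and TOF depth of the second feature point in
    the Mth frame. R_M, t_M: world pose of the device at frame M."""
    u, v = uv2
    # Step 152: relative position of the second feature point (camera frame).
    p2_cam = np.array([(u - cx) * depth2 / fx,
                       (v - cy) * depth2 / fy,
                       depth2])
    # Step 153: world coordinates of the second feature point.
    p2_world = R_M @ p2_cam + t_M
    # Step 154: |AB| = sqrt((x2-x1)^2 + (y2-y1)^2 + (z2-z1)^2).
    return float(np.linalg.norm(p2_world - p1_world))
```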
The distance measuring method provided by the embodiment of the invention first obtains a first frame color image including a first feature point and depth data of each feature point in the first frame color image, and then determines, according to the first frame color image and the depth data of the first feature point, the relative position of the first feature point and the terminal device when the first frame color image is collected. It then obtains the second to Mth frame color images and the depth data of each feature point in each of these frames, and determines, according to the first to Mth frame color images, the depth data of each feature point in each frame, and first pose information indicating the position and posture of the terminal device when the first frame color image is collected, second pose information indicating the position and posture of the terminal device when the Mth frame color image is collected. Finally, it determines the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first position information, the Mth frame color image and the depth data of the second feature point. That is, when measuring the distance between the first feature point and the second feature point, the embodiment of the present invention can determine the second pose information from the color images and the depth data alone, and thereby obtain the distance between the first feature point and the second feature point, without using an IMU.
Some embodiments of the present invention may divide the terminal device into functional modules according to the above method example. For example, each functional module may correspond to one function, or two or more functions may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that, in some embodiments of the present invention, the division of the modules is schematic and is only a division of logic functions; there may be another division in actual implementation.
In the case of an integrated unit, fig. 3 shows a schematic diagram of a possible structure of the terminal device involved in the above embodiment, where the terminal device 300 includes:
an acquisition unit 31, configured to acquire a first frame color image and depth data of each feature point in the first frame color image, where the first frame color image includes a first feature point;
a determining unit 32, configured to determine first position information according to the first frame color image and the depth data of the first feature point, where the first position information is used to indicate the relative position of the first feature point and the terminal device when the first frame color image is collected;
the acquisition unit 31 is further configured to acquire the second to Mth frame color images and the depth data of each feature point in each of the second to Mth frame color images, where the Mth frame color image includes the second feature point, and M is an integer greater than or equal to 2;
the determining unit 32 is further configured to determine second pose information according to the first to Mth frame color images, the depth data of each feature point in each of these frames, and the first pose information, where the first pose information is used to indicate the position and the pose of the terminal device when the first frame color image is collected, and the second pose information is used to indicate the position and the pose of the terminal device when the Mth frame color image is collected;
a processing unit 33, configured to determine the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first position information, the Mth frame color image, and the depth data of the second feature point.
Optionally, the determining unit 32 is specifically configured to perform the following operations on the second frame color image to the mth frame color image in sequence:
matching feature points of the Nth frame color image and the N-1th frame color image to obtain matching feature points, where the matching feature points are the feature points successfully matched between the Nth frame color image and the N-1th frame color image, N is an integer, and M ≥ N ≥ 2;
and determining the position and posture of the terminal device when the Nth frame color image is collected, according to the internal and external parameters of the cameras of the terminal device, the depth data of the matching feature points in the Nth frame color image, the depth data of the matching feature points in the N-1th frame color image, and the position and posture of the terminal device when the N-1th frame color image was collected.
Optionally, the determining unit 32 is specifically configured to establish a world coordinate system with the position of the terminal device at the time the first frame color image is collected as the origin, and to calculate the world coordinates of the first feature point according to the first frame color image and the depth data of the first feature point.
Optionally, the processing unit 33 is specifically configured to determine, according to the first pose information and the second pose information, the world coordinate and the deflection angle of the terminal device when the Mth frame color image is collected; determine second position information according to the Mth frame color image and the depth data of the second feature point, where the second position information is used to indicate the relative position of the second feature point and the terminal device when the Mth frame color image is collected; determine the world coordinates of the second feature point according to the second position information and the world coordinate and deflection angle of the terminal device when the Mth frame color image is collected; and calculate the distance between the first feature point and the second feature point according to the world coordinates of the first feature point and the world coordinates of the second feature point.
Optionally, any frame of color image is an RGB image, and the depth data of each feature point in any frame of color image is TOF data of each feature point in that frame of color image.
The terminal device provided by the embodiment of the invention comprises an acquisition unit, a determining unit and a processing unit. The acquisition unit can acquire a first frame color image including a first feature point and depth data of each feature point in the first frame color image. The determining unit can determine, according to the first frame color image and the depth data of the first feature point, the relative position of the first feature point and the terminal device when the first frame color image is collected. The acquisition unit can further acquire the second to Mth frame color images and the depth data of each feature point in each of them; the determining unit can further determine, according to the first to Mth frame color images, the depth data of each feature point in each frame, and first pose information indicating the position and posture of the terminal device when the first frame color image is collected, second pose information indicating the position and posture of the terminal device when the Mth frame color image is collected; and the processing unit can determine the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first position information, the Mth frame color image and the depth data of the second feature point. That is, when measuring the distance between the first feature point and the second feature point, the embodiment of the present invention can determine the second pose information from the color images and the depth data alone, and thereby obtain the distance between the first feature point and the second feature point, without using an IMU.
Fig. 4 is a schematic diagram of a hardware structure of a terminal device for implementing an embodiment of the present invention, where the terminal device 100 includes but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 4 does not constitute a limitation of the terminal device, and that the terminal device may include more or fewer components than shown, or combine certain components, or a different arrangement of components. In the embodiment of the present application, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The input unit 104 is configured to obtain a first frame color image and depth data of each feature point in the first frame color image, where the first frame color image includes a first feature point;
the processor 110 is configured to determine first position information according to the first frame color image and the depth data of the first feature point, where the first position information is used to indicate a relative position of the first feature point and the terminal device when the first frame color image is acquired;
the input unit 104 is further configured to acquire the second to Mth frame color images and depth data of each feature point in each of the second to Mth frame color images, where the Mth frame color image includes a second feature point, and M is an integer greater than or equal to 2;
the processor 110 is further configured to determine second pose information according to the first to Mth frame color images, the depth data of each feature point in each of these frames, and first pose information, and to determine the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first position information, the Mth frame color image, and the depth data of the second feature point; the first pose information is used for indicating the position and posture of the terminal device when the first frame color image is collected, and the second pose information is used for indicating the position and posture of the terminal device when the Mth frame color image is collected.
The terminal device provided by the embodiment of the invention can acquire a first frame color image including a first feature point and depth data of each feature point in the first frame color image; determine, according to the first frame color image and the depth data of the first feature point, the relative position of the first feature point and the terminal device when the first frame color image is collected; acquire the second to Mth frame color images and the depth data of each feature point in each of them; determine, according to the first to Mth frame color images, the depth data of each feature point in each frame, and first pose information indicating the position and posture of the terminal device when the first frame color image is collected, second pose information indicating the position and posture of the terminal device when the Mth frame color image is collected; and determine the distance between the first feature point and the second feature point according to the first pose information, the second pose information, the first position information, the Mth frame color image and the depth data of the second feature point. That is, when measuring the distance between the first feature point and the second feature point, the embodiment of the present invention can determine the second pose information from the color images and the depth data alone, and thereby obtain the distance between the first feature point and the second feature point, without using an IMU.
It should be understood that, in the embodiment of the present application, the radio frequency unit 101 may be used for receiving and sending signals during a message sending and receiving process or a call process, and specifically, receives downlink data from a base station and then processes the received downlink data to the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The terminal device provides wireless broadband internet access to the user through the network module 102, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used for receiving audio signals, video signals and optical signals. The input unit 104 may include: a color camera, a depth camera, a Graphics Processing Unit (GPU) 1041, a microphone 1042, and the like. The graphics processor 1041 processes color image data of still pictures or video obtained by an image capture device (e.g., a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or another storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 can receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101, and output.
The terminal device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear. As one kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes) and the magnitude and direction of gravity when stationary, and can be used to identify the terminal device posture (such as switching between horizontal and vertical screens, related games, magnetometer posture calibration) and for vibration-recognition related functions (such as a pedometer or tapping); the sensors 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not described in detail here.
The display unit 106 is used to display information input by the user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. Touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 1071 (e.g., operations by a user on or near touch panel 1071 using a finger, stylus, or any suitable object or attachment). The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to that type. Although in fig. 4 the touch panel 1071 and the display panel 1061 are two independent components implementing the input and output functions of the terminal device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement these functions; this is not limited here.
The interface unit 108 is an interface for connecting an external device to the terminal device 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data or power) from an external device and transmit the received input to one or more elements within the terminal device 100, or may be used to transmit data between the terminal device 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area: the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the terminal device (such as audio data and a phonebook). Further, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is the control center of the terminal device: it connects the various parts of the entire terminal device using various interfaces and lines, and performs the terminal device's functions and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, thereby monitoring the terminal device as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles the operating system, user interfaces, and application programs, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 110.
The terminal device 100 may further include a power supply 111 (such as a battery) for supplying power to the various components. Preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, which implements charging, discharging, and power consumption management.
In addition, the terminal device 100 includes some functional modules that are not shown, and are not described in detail here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly also by hardware alone, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network-side device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to these embodiments, which are illustrative rather than restrictive; those skilled in the art can make various changes and modifications without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (7)

1. A distance measurement method, applied to a terminal device, the method comprising the following steps:
acquiring a first frame color image and depth data of each feature point in the first frame color image, wherein the first frame color image comprises a first feature point;
establishing a world coordinate system with the position of the terminal device at the time the first frame color image is acquired as the origin, and calculating world coordinates of the first feature point according to the first frame color image and the depth data of the first feature point;
acquiring the second to Mth frame color images and depth data of each feature point in each of the second to Mth frame color images, wherein the Mth frame color image comprises a second feature point, and M is an integer greater than or equal to 2;
determining second pose information according to the first to Mth frame color images, the depth data of each feature point in each of the first to Mth frame color images, and first pose information, wherein the first pose information indicates the position and pose of the terminal device when the first frame color image is acquired, and the second pose information indicates the position and pose of the terminal device when the Mth frame color image is acquired;
determining, according to the first pose information and the second pose information, the world coordinates and deflection angle of the terminal device when the Mth frame color image is acquired;
determining second position information according to the Mth frame color image and the depth data of the second feature point, wherein the second position information indicates the position of the second feature point relative to the terminal device when the Mth frame color image is acquired;
determining the world coordinates of the second feature point according to the second position information and the world coordinates and deflection angle of the terminal device when the Mth frame color image is acquired;
and calculating the distance between the first feature point and the second feature point according to the world coordinates of the first feature point and the world coordinates of the second feature point.
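For illustration only, and not as part of the claims: a minimal Python sketch of the geometry in claim 1, assuming a pinhole camera model. The intrinsics FX, FY, CX, CY, the pixel coordinates, the depths, and the frame-M pose below are all hypothetical placeholder values.

```python
# Illustrative sketch of claim 1 (not the patented implementation).
import numpy as np

FX, FY, CX, CY = 500.0, 500.0, 320.0, 240.0  # assumed pinhole intrinsics

def back_project(u, v, depth):
    """Lift pixel (u, v) with metric depth into camera coordinates."""
    return np.array([(u - CX) * depth / FX,
                     (v - CY) * depth / FY,
                     depth])

def to_world(p_cam, rotation, translation):
    """Map a camera-frame point into the world frame given the device
    pose (rotation matrix and translation) at capture time."""
    return rotation @ p_cam + translation

# The world frame is anchored at the frame-1 capture position, so the
# frame-1 pose is identity rotation and zero translation.
p1_world = to_world(back_project(300, 220, 1.5), np.eye(3), np.zeros(3))

# Frame M: deflection angle and device world coordinates, obtained by
# chaining frame-to-frame estimates as in claim 2 (values hypothetical).
theta = np.deg2rad(5.0)
r_m = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                [0.0, 1.0, 0.0],
                [-np.sin(theta), 0.0, np.cos(theta)]])
t_m = np.array([0.2, 0.0, 0.1])
p2_world = to_world(back_project(350, 260, 2.0), r_m, t_m)

# Distance between the two feature points from their world coordinates.
print("distance (m):", np.linalg.norm(p1_world - p2_world))
```

Because the world coordinate system is anchored at the device position when the first frame is acquired, the frame-1 pose is the identity, and only the frame-M pose is needed to place the second feature point before taking the Euclidean distance.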
2. The method according to claim 1, wherein determining the second pose information according to the depth data of each feature point in each of the first to Mth frame color images and the first pose information comprises:
performing the following operations sequentially on the second to Mth frame color images:
matching feature points between the Nth frame color image and the (N-1)th frame color image to obtain matched feature points, wherein the matched feature points are the feature points successfully matched between the Nth frame color image and the (N-1)th frame color image, N is an integer, and M ≥ N ≥ 2;
and determining the position and pose of the terminal device when the Nth frame color image is acquired according to the intrinsic and extrinsic parameters of the camera of the terminal device, the depth data of the matched feature points in the (N-1)th frame color image, and the position and pose of the terminal device when the (N-1)th frame color image is acquired.
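For illustration only: a hedged sketch of the per-frame pose update in claim 2, written with OpenCV's ORB matcher and solvePnP. These particular tools, and the helper update_pose itself, are assumptions of the sketch; the claim prescribes only feature matching plus the camera parameters, the depth data of the matched points, and the pose at the previous frame.

```python
# Illustrative sketch of the claim-2 pose update (not the patented code).
import cv2
import numpy as np

K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])  # assumed intrinsic matrix

def update_pose(img_prev, img_curr, depth_prev, r_prev, t_prev):
    """Estimate the device pose at frame N from features matched
    against frame N-1 (pose r_prev, t_prev; depth map depth_prev)."""
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)

    obj_pts, img_pts = [], []
    for m in matches:
        u, v = (int(c) for c in kp1[m.queryIdx].pt)
        z = float(depth_prev[v, u])           # depth in frame N-1
        if z <= 0:
            continue                          # no valid depth sample
        # Back-project into the frame-(N-1) camera, then into the world.
        p_cam = np.array([(u - K[0, 2]) * z / K[0, 0],
                          (v - K[1, 2]) * z / K[1, 1],
                          z])
        obj_pts.append(r_prev @ p_cam + t_prev)
        img_pts.append(kp2[m.trainIdx].pt)

    # solvePnP needs several correspondences; a real system would also
    # reject outliers (e.g., with solvePnPRansac).
    ok, rvec, tvec = cv2.solvePnP(np.asarray(obj_pts, dtype=np.float64),
                                  np.asarray(img_pts, dtype=np.float64),
                                  K, None)
    r_wc, _ = cv2.Rodrigues(rvec)             # world-to-camera rotation
    r_curr = r_wc.T                           # camera pose in the world
    t_curr = (-r_wc.T @ tvec).ravel()
    return r_curr, t_curr
```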
3. The method according to claim 1 or 2, wherein
any frame color image is an RGB image, and
the depth data of each feature point in any frame color image is the time-of-flight data of that feature point in the frame color image.
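For illustration only: claim 3 ties the depth data to time-of-flight measurement, which rests on the relation d = c·t/2 for a round-trip time t of the emitted light. A minimal sketch, assuming the depth camera reports round-trip times directly; the 10 ns sample value is illustrative.

```python
# Minimal sketch of the time-of-flight relation behind claim 3.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_to_depth(round_trip_seconds: float) -> float:
    """Convert a measured round-trip time into metric depth."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

print(tof_to_depth(10e-9))  # about 1.5 m for a 10 ns round trip
```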
4. A terminal device, comprising:
the device comprises an acquisition unit, a processing unit and a display unit, wherein the acquisition unit is used for acquiring a first frame color image and depth data of each feature point in the first frame color image, and the first frame color image comprises a first feature point;
the determining unit is used for establishing a world coordinate system by taking the position of the terminal equipment as an origin when the first frame of color image is acquired; calculating world coordinates of the first characteristic point according to the first frame color image and the depth color image of the first characteristic point;
the acquisition unit is further configured to acquire depth data of each feature point in each of the second to mth frames of color images and the second to mth frames of color images, the mth frame of color image includes a second feature point, and M is an integer greater than or equal to 2;
the determining unit is further configured to determine second pose information according to depth data of each feature point in each of the first to M-th frame color images and the first to M-th frame color images, where the first pose information is used to indicate a position and a pose of the terminal device when the first frame color image is acquired, and the second pose information is used to indicate a position and a pose of the terminal device when the M-th frame color image is acquired;
the processing unit is used for determining the world coordinate and the deflection angle of the terminal equipment when the Mth frame of color image is acquired according to the first position information and the second position information; determining second position information according to the M frame color image and the depth data of the second feature point, wherein the second position information is used for indicating the relative position of the second feature point and the terminal equipment when the M frame color image is collected; determining the world coordinate of the second characteristic point according to the second position information and the world coordinate and the deflection angle of the terminal equipment when the M frame of color image is collected; and calculating the distance between the first characteristic point and the second characteristic point according to the world coordinates of the first characteristic point and the world coordinates of the second characteristic point.
5. The terminal device according to claim 4, wherein the determining unit is specifically configured to perform the following operations sequentially on the second to Mth frame color images:
matching feature points between the Nth frame color image and the (N-1)th frame color image to obtain matched feature points, wherein the matched feature points are the feature points successfully matched between the Nth frame color image and the (N-1)th frame color image, N is an integer, and M ≥ N ≥ 2;
and determining the position and pose of the terminal device when the Nth frame color image is acquired according to the intrinsic and extrinsic parameters of the camera of the terminal device, the depth data of the matched feature points in the (N-1)th frame color image, and the position and pose of the terminal device when the (N-1)th frame color image is acquired.
6. The terminal device according to claim 4 or 5, wherein
any frame color image is an RGB image, and
the depth data of each feature point in any frame color image is the time-of-flight data of that feature point in the frame color image.
7. A terminal device, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the distance measurement method according to any one of claims 1 to 3.
CN201910310118.7A 2019-04-17 2019-04-17 Distance measuring method and terminal equipment Active CN110148167B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910310118.7A CN110148167B (en) 2019-04-17 2019-04-17 Distance measuring method and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910310118.7A CN110148167B (en) 2019-04-17 2019-04-17 Distance measuring method and terminal equipment

Publications (2)

Publication Number Publication Date
CN110148167A CN110148167A (en) 2019-08-20
CN110148167B (en) 2021-06-04

Family

ID=67589669

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910310118.7A Active CN110148167B (en) 2019-04-17 2019-04-17 Distance measuring method and terminal equipment

Country Status (1)

Country Link
CN (1) CN110148167B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111402344A (en) * 2020-04-23 2020-07-10 Oppo广东移动通信有限公司 Calibration method, calibration device and non-volatile computer-readable storage medium
CN114111704B (en) * 2020-08-28 2023-07-18 华为技术有限公司 Method and device for measuring distance, electronic equipment and readable storage medium

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10148864B2 (en) * 2015-07-02 2018-12-04 Pixart Imaging Inc. Imaging device having phase detection pixels and regular pixels, and operating method thereof
CN105761245B (en) * 2016-01-29 2018-03-06 速感科技(北京)有限公司 A kind of automatic tracking method and device of view-based access control model characteristic point
US20180041747A1 (en) * 2016-08-03 2018-02-08 Samsung Electronics Co., Ltd. Apparatus and method for processing image pair obtained from stereo camera
CN106646442A (en) * 2016-12-08 2017-05-10 努比亚技术有限公司 Distance measurement method and terminal
KR101865173B1 (en) * 2017-02-03 2018-06-07 (주)플레이솔루션 Method for generating movement of motion simulator using image analysis of virtual reality contents
CN107016704A (en) * 2017-03-09 2017-08-04 杭州电子科技大学 A kind of virtual reality implementation method based on augmented reality
CN107167139A (en) * 2017-05-24 2017-09-15 广东工业大学 A kind of Intelligent Mobile Robot vision positioning air navigation aid and system
CN107292949B (en) * 2017-05-25 2020-06-16 深圳先进技术研究院 Three-dimensional reconstruction method and device of scene and terminal equipment
CN107369183A (en) * 2017-07-17 2017-11-21 广东工业大学 Towards the MAR Tracing Registration method and system based on figure optimization SLAM
CN107705333B (en) * 2017-09-21 2021-02-26 歌尔股份有限公司 Space positioning method and device based on binocular camera
CN109029417B (en) * 2018-05-21 2021-08-10 南京航空航天大学 Unmanned aerial vehicle SLAM method based on mixed visual odometer and multi-scale map
CN109102541A (en) * 2018-07-13 2018-12-28 宁波盈芯信息科技有限公司 A kind of distance measurement method and device of the smart phone of integrated depth camera

Also Published As

Publication number Publication date
CN110148167A (en) 2019-08-20

Similar Documents

Publication Publication Date Title
CN110913132B (en) Object tracking method and electronic equipment
CN110891144B (en) Image display method and electronic equipment
CN109743498B (en) Shooting parameter adjusting method and terminal equipment
CN109862504B (en) Display method and terminal equipment
CN109523253B (en) Payment method and device
CN109495616B (en) Photographing method and terminal equipment
CN111401463A (en) Method for outputting detection result, electronic device, and medium
CN110908750B (en) Screen capturing method and electronic equipment
CN110148167B (en) Distance measuring method and terminal equipment
CN108833791B (en) Shooting method and device
CN109104573B (en) Method for determining focusing point and terminal equipment
CN109067975B (en) Contact person information management method and terminal equipment
CN109117037B (en) Image processing method and terminal equipment
CN111221602A (en) Interface display method and electronic equipment
JP7472281B2 (en) Electronic device and focusing method
CN111147754B (en) Image processing method and electronic device
CN109922256B (en) Shooting method and terminal equipment
CN109618278B (en) Positioning method and mobile terminal
CN108965701B (en) Jitter correction method and terminal equipment
CN109582264B (en) Image display method and mobile terminal
CN109829707B (en) Interface display method and terminal equipment
CN109348212B (en) Image noise determination method and terminal equipment
CN110933305B (en) Electronic equipment and focusing method
CN108986508B (en) Method and terminal for displaying route information
CN111107271B (en) Shooting method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant