CN108317992A - Object distance measurement method and terminal device - Google Patents

Object distance measurement method and terminal device

Info

Publication number
CN108317992A
CN108317992A CN201810036578.0A
Authority
CN
China
Prior art keywords
optical lens
image
terminal device
pixel
difference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810036578.0A
Other languages
Chinese (zh)
Inventor
周燎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201810036578.0A priority Critical patent/CN108317992A/en
Publication of CN108317992A publication Critical patent/CN108317992A/en
Priority to PCT/CN2019/071635 priority patent/WO2019137535A1/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 - Measuring distances in line of sight; Optical rangefinders
    • G01C3/32 - Measuring distances in line of sight; Optical rangefinders by focusing the object, e.g. on a ground glass screen
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10004 - Still image; Photographic image

Abstract

The embodiments of the invention disclose an object distance measurement method and a terminal device, relating to the field of communication technology, which can solve the problem of high cost caused by a terminal device using multiple cameras to obtain the depth information of a photographed object. The concrete scheme is: obtaining a first image acquired by the optical lens of the terminal device at a first position; in the case where the optical lens is moved from the first position to a second position, obtaining a second image acquired by the optical lens at the second position; obtaining a first distance between the first position and the second position; obtaining a first difference; and determining the object distance according to the first distance, the focal length of the optical lens and the first difference. The first difference is the difference between the position coordinates of a first pixel in the first image and the position coordinates of a second pixel in the second image when the first image and the second image are in the same plane. In the embodiments of the present invention the object distance is obtained with a single camera, so the cost of the camera can be reduced.

Description

Object distance measurement method and terminal device
Technical field
The embodiments of the present invention relate to the field of communication technology, and in particular to an object distance measurement method and a terminal device.
Background technology
With the development of terminal technology, the camera functions of terminal devices are constantly enhanced, for example ranging and background blurring performed with the camera of a terminal device.
At present, by installing two cameras on a terminal device, the depth information of a photographed object (i.e. the spatial distance between the photographed object and the lens of the camera) can be obtained, so as to realize functions such as ranging and background blurring on the terminal device. Specifically, the terminal device uses each of the two cameras to obtain image information of the photographed object on the lens of that camera, calculates the difference between the two pieces of image information, and then obtains the depth information of the photographed object according to that difference.
However, in the above method, the terminal device uses two cameras to obtain the depth information of the photographed object, so the cost is relatively high.
Summary of the invention
The embodiments of the present invention provide an object distance measurement method and a terminal device, which can solve the problem of high cost caused by a terminal device using multiple cameras to obtain the depth information of a photographed object.
In order to solve the above technical problem, the present invention adopts the following technical solutions:
A first aspect of the present invention provides an object distance measurement method applied to a terminal device. The object distance measurement method includes: obtaining a first image acquired by the optical lens of the terminal device at a first position; in the case where the optical lens is moved from the first position to a second position, obtaining a second image acquired by the optical lens at the second position; obtaining a first distance between the first position and the second position; obtaining a first difference; and determining the object distance according to the first distance, the focal length of the optical lens and the first difference. The first difference is the difference between the position coordinates of a first pixel in the first image and the position coordinates of a second pixel in the second image in the case where the first image and the second image are in the same plane; the second pixel and the first pixel are pixels at corresponding relative positions.
A second aspect of the present invention provides a terminal device. The terminal device includes an acquiring unit and a determination unit. The acquiring unit is configured to obtain a first image acquired by the optical lens of the terminal device at a first position; to obtain, in the case where the optical lens is moved from the first position to a second position, a second image acquired by the optical lens at the second position; to obtain a first distance between the first position and the second position; and to obtain a first difference. The determination unit is configured to determine the object distance according to the first distance, the focal length of the optical lens and the first difference. The first difference is the difference between the position coordinates of a first pixel in the first image and the position coordinates of a second pixel in the second image in the case where the first image and the second image are in the same plane; the second pixel and the first pixel are pixels at corresponding relative positions.
A third aspect of the present invention provides a terminal device, which includes a processor, a memory and a computer program stored on the memory and executable on the processor. When the computer program is executed by the processor, the steps of the object distance measurement method described in the first aspect are realized.
A fourth aspect of the present invention provides a computer readable storage medium on which a computer program is stored. When the computer program is executed by a processor, the steps of the object distance measurement method described in the first aspect are realized.
In the embodiments of the present invention, the terminal device can determine the object distance according to the first distance between the first position and the second position of the optical lens, the focal length of the optical lens and the first difference. Since the terminal device can control the optical lens of a single camera to move to the first position and the second position, obtain the first distance between the two positions, and obtain the first difference between the first image captured at the first position and the second image captured at the second position, multiple cameras are not needed to obtain the first distance and the first difference, and therefore the cost of the camera can be reduced.
Description of the drawings
Fig. 1 is an architecture diagram of an Android operating system provided by an embodiment of the present invention;
Fig. 2 is the first flowchart of an object distance measurement method provided by an embodiment of the present invention;
Fig. 3 is the second flowchart of an object distance measurement method provided by an embodiment of the present invention;
Fig. 4 is the third flowchart of an object distance measurement method provided by an embodiment of the present invention;
Fig. 5 is a schematic diagram of a space coordinate system provided by an embodiment of the present invention;
Fig. 6 is a schematic example of obtaining the first difference provided by an embodiment of the present invention;
Fig. 7 is the first schematic diagram of example positions of the first image and the second image provided by an embodiment of the present invention;
Fig. 8 is the second schematic diagram of example positions of the first image and the second image provided by an embodiment of the present invention;
Fig. 9 is a structural schematic diagram of a terminal device provided by an embodiment of the present invention;
Fig. 10 is a hardware structure diagram of a terminal device provided by an embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
The terms "first" and "second" in the specification and claims of the embodiments of the present invention are used to distinguish different objects, not to describe a particular order of the objects. For example, the first position and the second position are used to distinguish different positions, not to describe a particular order of positions. Unless otherwise indicated, "multiple" in the description of the embodiments of the present invention means two or more.
The term "and/or" herein describes an association relationship between associated objects and indicates that three relationships may exist. For example, A and/or B can mean: A alone, both A and B, or B alone. The symbol "/" herein indicates an "or" relationship between the associated objects; for example, A/B means A or B.
In the embodiments of the present invention, words such as "illustratively" or "for example" are used to give an example, illustration or explanation. Any embodiment or design described as "illustrative" or "for example" in the embodiments of the present invention should not be construed as being preferable to, or more advantageous than, other embodiments or designs. Rather, the use of words such as "illustratively" or "for example" is intended to present a related concept in a concrete manner.
Some concepts involved in the object distance measurement method and terminal device provided by the embodiments of the present invention are explained below.
The "first difference" refers to the difference between the position coordinates of two pixels at corresponding relative positions in the two images obtained when a camera (the camera includes an optical lens and an image sensor) shoots the same object at two different positions. In the embodiments of the present invention, it is the difference between the position coordinates of the first pixel in the first image and the position coordinates of the second pixel in the second image.
The "focal length of the optical lens" refers to the distance between the optical lens and the image sensor.
The "object distance" refers to the distance between the optical lens and the photographed object.
The object distance measurement method and terminal device provided by the embodiments of the present invention can be applied to the process in which a terminal device obtains an object distance. Specifically, they can be applied to the process in which a terminal device obtains an object distance with a single camera, and can solve the problem in the prior art of the high cost caused by a terminal device using multiple cameras to obtain the depth information of a photographed object.
The terminal device in the embodiments of the present invention may be a terminal device with an operating system. The operating system may be the Android operating system, the iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present invention.
Taking the Android operating system as an example, the software environment to which the object distance measurement method provided by the embodiments of the present invention is applied is introduced below.
As shown in Fig. 1, it is an architecture diagram of a possible Android operating system provided by an embodiment of the present invention. In Fig. 1, the architecture of the Android operating system includes four layers: an application layer, an application framework layer, a system runtime library layer and a kernel layer (which may specifically be a Linux kernel layer).
The application layer includes the application programs in the Android operating system (including system applications and third-party applications).
The application framework layer is the framework of the applications. Developers can develop applications based on the application framework layer while complying with the development principles of the framework.
The system runtime library layer includes libraries (also called system libraries) and the Android operating system runtime environment. The libraries mainly provide the various resources needed by the Android operating system. The Android operating system runtime environment provides the software environment for the Android operating system.
The kernel layer is the operating system layer of the Android operating system and belongs to the lowest level of the Android operating system software. Based on the Linux kernel, the kernel layer provides core system services and hardware-related drivers for the Android operating system.
Taking the Android operating system as an example, in the embodiments of the present invention, developers can develop, based on the system architecture of the Android operating system shown in Fig. 1, a software program that realizes the object distance measurement method provided by the embodiments of the present invention, so that the object distance measurement method can run on the Android operating system shown in Fig. 1. That is, the processor or the terminal device can realize the object distance measurement method provided by the embodiments of the present invention by running the software program on the Android operating system.
In the first embodiment of the present invention, Fig. 2 shows an object distance measurement method provided by an embodiment of the present invention. The method can be applied to a terminal device having the Android operating system shown in Fig. 1. As shown in Fig. 2, the object distance measurement method includes steps 201 to 205:
Step 201: the terminal device obtains a first image acquired by the optical lens of the terminal device at a first position.
In the embodiment of the present invention, the terminal device can determine the first position in the case of receiving an input from the user, and obtain the first image at the first position through the optical lens. The input triggers the optical lens of the terminal device to acquire an image, and the first image is the image acquired when the optical lens is located at the first position.
In the embodiment of the present invention, after the user opens the camera of the terminal device, the user can make an input on the current interface of the terminal device to trigger the optical lens of the terminal device to acquire the first image.
Illustratively, after opening the camera, the user can make a first input on the shooting icon in the current interface, so that the optical lens of the terminal device shoots the shooting object, and the shooting object is imaged as the first image on the optical lens located at the first position in the terminal device.
Optionally, in the embodiment of the present invention, the above input may be a click/press operation of the user on the terminal device, and the click operation may be a single click, a double click, or consecutive clicks a preset number of times.
It can be understood that the user can also make the input through a shortcut key, a programmable button or a combination of programmable buttons on the terminal device, to trigger the optical lens of the terminal device to acquire the first image.
Optionally, in the embodiment of the present invention, the first position may be the initial position of the optical lens. Specifically, in combination with Fig. 2 and as shown in Fig. 3, before the above step 201, the object distance measurement method provided by the embodiment of the present invention further includes step 301:
Step 301: the terminal device determines the initial position of the optical lens as the first position.
Optionally, in the embodiment of the present invention, the first position may be the position reached after the terminal device controls the optical lens to move from the initial position along a second preset direction. Specifically, in combination with Fig. 2 and as shown in Fig. 4, before the above step 201, the object distance measurement method provided by the embodiment of the present invention further includes step 302:
Step 302: the terminal device controls the optical lens to move from the initial position of the optical lens to the first position along a second preset direction of the plane in which the optical lens is located.
Illustratively, the terminal device can control, through the optical image stabilization (OIS) device in the terminal device, the optical lens to move from the initial position to the first position along the second preset direction.
Optionally, in the embodiment of the present invention, the second preset direction includes at least one of a first direction and a second direction; the first direction is the X-axis direction and the second direction is the Y-axis direction.
Illustratively, Fig. 5 shows a space coordinate system provided by an embodiment of the present invention. The optical lens 1 is at position a in the space coordinate system shown in Fig. 5, the plane in which the optical lens is located is the plane formed by the X axis and the Y axis, and point P is the photographed object. The terminal device can control the optical lens 1 through the OIS device to move from position a to the first position along the X axis of the space; or along the Y axis of the space; or along both the X axis and the Y axis of the space.
Step 202: in the case where the optical lens is moved from the first position to a second position, the terminal device obtains a second image acquired by the optical lens at the second position.
In the embodiment of the present invention, in the case of detecting that the optical lens is moved from the first position to the second position, the terminal device determines the second position and obtains the second image at the second position through the optical lens.
Optionally, in the embodiment of the present invention, in the case where the first position is the initial position of the optical lens, the second position may be the position reached after the terminal device controls the optical lens to move from the initial position along a first preset direction. Correspondingly, in combination with Fig. 2 and as shown in Fig. 3, before the above step 202, the object distance measurement method provided by the embodiment of the present invention further includes step 303:
Step 303: the terminal device controls the optical lens to move from the initial position to the second position along a first preset direction of the plane in which the optical lens is located.
Illustratively, the terminal device can control the optical lens through the OIS device to move from the initial position to the second position along the first preset direction.
Optionally, in the embodiment of the present invention, the first preset direction includes at least one of the first direction and the second direction; the first direction is the X-axis direction and the second direction is the Y-axis direction.
Illustratively, with reference to the space coordinate system shown in Fig. 5, the terminal device can control the optical lens 1 to move from position a to the second position along the X axis of the space; or along the Y axis of the space; or, through the OIS device, along both the X axis and the Y axis of the space.
For example, suppose the first position is the initial position of the optical lens and the space coordinates of the initial position are (0,0,0). The terminal device can control the optical lens to move from (0,0,0) to the second position (3,0,0) along the X axis of the space; or from (0,0,0) to the second position (0,4,0) along the Y axis of the space; or from (0,0,0) to the second position (3,4,0) along both the X axis and the Y axis of the space.
Optionally, in the embodiment of the present invention, in the case where the first position is the position reached after the terminal device controls the optical lens to move from the initial position along the second preset direction, the second position may be the position reached after the terminal device controls the optical lens to move from the first position along a third preset direction. Correspondingly, in combination with Fig. 2 and as shown in Fig. 4, before the above step 202, the object distance measurement method provided by the embodiment of the present invention further includes step 304:
Step 304: the terminal device controls the optical lens to move from the first position to the second position along a third preset direction of the plane in which the optical lens is located.
The second preset direction and the third preset direction are different.
Optionally, in the embodiment of the present invention, the second preset direction includes at least one of the first direction and the second direction, the first direction being the X-axis direction and the second direction being the Y-axis direction; the third preset direction includes at least one of the first direction and the second direction.
Optionally, in the embodiment of the present invention, the second preset direction and the third preset direction are opposite.
Illustratively, with reference to the space coordinate system shown in Fig. 5, the terminal device can control the optical lens 1 to move from position a to the first position along the X axis of the space and then from the first position to the second position along the X axis of the space; or from position a to the first position along the Y axis of the space and then from the first position to the second position along the Y axis of the space; or from position a to the first position along both the X axis and the Y axis of the space and then from the first position to the second position along both the X axis and the Y axis of the space.
For example, suppose position a of the optical lens is the initial position and its space coordinates are (0,0,0). The terminal device can control the optical lens to move from (0,0,0) to the first position (3,0,0) along the X axis of the space and then from the first position (3,0,0) to the second position (-3,0,0) along the X axis of the space; or from (0,0,0) to the first position (0,4,0) along the Y axis of the space and then from the first position (0,4,0) to the second position (0,-4,0) along the Y axis of the space; or from (0,0,0) to the first position (3,4,0) along both the X axis and the Y axis of the space and then from the first position (3,4,0) to the second position (-3,-4,0) along both the X axis and the Y axis of the space.
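The lens-position bookkeeping described above can be sketched as follows. This is an illustrative sketch only: LensPosition, OISController and move_to are hypothetical names, not an interface defined by this patent or by any real OIS driver; the coordinates reuse the X-axis example above.

    # Illustrative sketch; the OIS controller interface below is hypothetical.
    from dataclasses import dataclass

    @dataclass
    class LensPosition:
        x: float  # offset along the X axis of the lens plane
        y: float  # offset along the Y axis of the lens plane
        z: float = 0.0  # the lens stays in its own plane

    class OISController:
        def move_to(self, pos: LensPosition) -> None:
            # Drive the OIS actuator to the requested offset (hardware-specific).
            ...

    ois = OISController()
    initial = LensPosition(0.0, 0.0)   # initial position (0, 0, 0)
    first = LensPosition(3.0, 0.0)     # move along +X to the first position (3, 0, 0)
    second = LensPosition(-3.0, 0.0)   # move along -X to the second position (-3, 0, 0)
    ois.move_to(first)    # acquire the first image here
    ois.move_to(second)   # acquire the second image here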
Step 203: the terminal device obtains a first distance between the first position and the second position.
Illustratively, after determining the first position and the second position of the optical lens, the terminal device can calculate the first distance ΔX according to the first position and the second position.
For example, suppose the space coordinates of the first position are (3,0,0) and the space coordinates of the second position are (-3,0,0). The terminal device calculates the first distance ΔX = 6 from (3,0,0) and (-3,0,0).
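A minimal sketch of the first-distance computation in step 203, assuming the two lens positions are available as (x, y, z) coordinates as in the example above (the function name is illustrative):

    import math

    def first_distance(first_pos, second_pos):
        """Distance between two lens positions given as (x, y, z) tuples."""
        return math.dist(first_pos, second_pos)

    delta_x = first_distance((3, 0, 0), (-3, 0, 0))
    print(delta_x)  # 6.0, matching the example in the text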
Step 204: the terminal device obtains a first difference.
The first difference is, in the case where the first image and the second image are in the same plane, the difference between the position coordinates of the first pixel in the first image and the position coordinates of the second pixel in the second image; the second pixel and the first pixel are pixels at corresponding relative positions.
Illustratively, as shown in Fig. 6, suppose O1 is the first position of the optical lens, O2 is the second position of the optical lens, and the first distance is ΔX. Suppose the shooting object viewed by the camera is point P, whose space coordinates are expressed as P(xc, yc, zc), where zc can generally be regarded as the object distance of point P, denoted Z. The first image acquired at the first position O1 is P1, and the position coordinates of its pixel are P1(x1, y1); the second image acquired at the second position O2 is P2, and the position coordinates of the pixel of P2 corresponding in position to the pixel of P1 are P2(x2, y1). The terminal device can obtain the first difference by calculating the difference between the position coordinates P1(x1, y1) and P2(x2, y1), that is, the difference between x1 and x2, |x2 - x1| = d; the first difference is therefore d.
It can be understood that, in the embodiment of the present invention, before obtaining the first difference, the terminal device can first perform compensation and correction processing on the first image and the second image, such as image distortion correction and angle adjustment, so that the first image and the second image are aligned on the Y axis (and/or the X axis) in space (i.e. the first image and the second image are in the same plane).
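Step 204 can be sketched as follows, under the assumption that the two images have already been compensated and aligned as described above and that the corresponding pixel in each image has been located (for example by feature matching, which the patent does not prescribe); the helper name and the numeric values are illustrative:

    def first_difference(p1_coord, p2_coord):
        """First difference between corresponding pixels of the two aligned images.

        p1_coord and p2_coord are the (x, y) position coordinates of the first
        pixel in the first image and the second pixel in the second image.
        After alignment the y coordinates coincide, so the difference reduces
        to the horizontal offset |x2 - x1| = d.
        """
        x1, _y1 = p1_coord
        x2, _y2 = p2_coord
        return abs(x2 - x1)

    d = first_difference((120.0, 64.0), (98.0, 64.0))  # d = 22.0 (illustrative values)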
Step 205: the terminal device determines the object distance according to the first distance, the focal length of the optical lens and the first difference.
In the embodiment of the present invention, the terminal device can determine the object distance in combination with a preset formula, according to the first distance, the focal length of the optical lens and the first difference.
Illustratively, as shown in Fig. 7, Z is the target distance (the object distance), P is the shooting object, P1 is the first image acquired at the first position O1, P2 is the second image acquired at the second position O2, B is the distance between O1 and O2, d is the first difference, and f is the focal length of the optical lens. According to the principle of similar triangles, formula one can be obtained: Z = B·f/d.
Also, as shown in Fig. 8, D is the width of the image sensor and Z is the target distance. The visual difference between the first image and the second image is ΔY = Y2 + ΔX - Y1. According to the principle of similar triangles and the equivalence B = ΔY, formula two can be obtained.
The preset formula can be obtained from formula one and formula two.
The terminal device can use the preset formula Z = f·ΔX/d to calculate the object distance Z from the acquired first distance ΔX, the focal length f of the optical lens and the first difference d.
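A minimal sketch of step 205 using the preset formula Z = f·ΔX/d. The numeric values are illustrative only, and it is assumed that ΔX, f and d are expressed in the same unit (for example, that the pixel difference d has been converted to millimetres using the pixel pitch, a detail the patent does not spell out):

    def object_distance(delta_x, focal_length, first_diff):
        """Object distance Z = f * deltaX / d from the similar-triangles relation."""
        if first_diff == 0:
            raise ValueError("zero first difference: the object is effectively at infinity")
        return focal_length * delta_x / first_diff

    # Illustrative numbers: f = 4 mm, deltaX = 6 mm, d = 0.02 mm  ->  Z = 1200 mm
    Z = object_distance(6.0, 4.0, 0.02)
    print(Z)  # 1200.0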
Compared with the prior art, since the terminal device can control the optical lens of a single camera to move to the first position and the second position, and can obtain the first difference between the first image captured at the first position and the second image captured at the second position, multiple cameras are not needed to obtain the first distance between the first position and the second position or the first difference between the first image and the second image, and therefore the cost of the camera can be reduced.
The embodiment of the present invention provides an object distance measurement method in which the terminal device can determine the object distance according to the first distance between the first position and the second position of the optical lens, the focal length of the optical lens and the first difference. Since the terminal device can control the optical lens of the camera to move to the first position and the second position, obtain the first distance between the first position and the second position, and obtain the first difference between the first image captured at the first position and the second image captured at the second position, multiple cameras are not needed to obtain the first distance and the first difference, and therefore the cost of the camera can be reduced.
Furthermore, since the embodiment of the present invention uses a single camera, it avoids the problem that, when multiple cameras are used to shoot an object, differences in the parameters of the cameras (such as lens curvature and brightness) affect the synthesis accuracy and hence the shooting effect of the terminal device, so the captured image is more accurate.
Furthermore, by obtaining the object distance, the object distance measurement method provided by the embodiment of the present invention can support applications such as background blurring, image matting and 3D.
In the second embodiment of the present invention, Fig. 9 shows a possible structural schematic diagram of the terminal device involved in the embodiment of the present invention. As shown in Fig. 9, the terminal device 90 may include an acquiring unit 91 and a determination unit 92.
The acquiring unit 91 is configured to obtain a first image acquired by the optical lens of the terminal device at a first position; to obtain, in the case where the optical lens is moved from the first position to a second position, a second image acquired by the optical lens at the second position; to obtain a first distance between the first position and the second position; and to obtain a first difference. The determination unit 92 is configured to determine the object distance according to the first distance, the focal length of the optical lens and the first difference. The first difference is, in the case where the first image and the second image are in the same plane, the difference between the position coordinates of the first pixel in the first image and the position coordinates of the second pixel in the second image; the second pixel and the first pixel are pixels at corresponding relative positions.
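The division of work between the acquiring unit 91 and the determination unit 92 could be organized in software roughly as follows; the class, method and attribute names are illustrative and assume a camera object with a capture() method and an OIS controller with a move_to() method, neither of which is specified by the patent:

    class ObjectDistanceMeasurer:
        """Illustrative grouping of the acquiring unit and the determination unit."""

        def __init__(self, camera, ois):
            self.camera = camera  # assumed to expose capture() -> image
            self.ois = ois        # assumed to expose move_to(position)

        # Acquiring unit: images at the two positions.
        def acquire_images(self, first_pos, second_pos):
            self.ois.move_to(first_pos)
            first_image = self.camera.capture()
            self.ois.move_to(second_pos)
            second_image = self.camera.capture()
            return first_image, second_image

        # Determination unit: object distance from first distance, focal length
        # and first difference, using the preset formula Z = f * deltaX / d.
        @staticmethod
        def determine_object_distance(delta_x, focal_length, first_diff):
            return focal_length * delta_x / first_diff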
In one possible implementation, the determination unit 92 is further configured to determine the initial position of the optical lens as the first position before the acquiring unit 91 obtains the first image acquired by the optical lens of the terminal device at the first position. The terminal device in the embodiment of the present invention further includes a first control unit, configured to control, before the acquiring unit 91 obtains the second image acquired by the optical lens at the second position, the optical lens to move from the initial position to the second position along a first preset direction of the plane in which the optical lens is located.
In one possible implementation, the first preset direction includes at least one of a first direction and a second direction, the first direction being the X-axis direction and the second direction being the Y-axis direction.
In one possible implementation, the terminal device in the embodiment of the present invention further includes a second control unit, configured to control, before the acquiring unit 91 obtains the first image acquired by the optical lens of the terminal device at the first position, the optical lens to move from the initial position of the optical lens to the first position along a second preset direction of the plane in which the optical lens is located. The second control unit is further configured to control, before the acquiring unit 91 obtains the second image acquired by the optical lens at the second position, the optical lens to move from the first position to the second position along a third preset direction of the plane in which the optical lens is located.
In one possible implementation, the second preset direction and the third preset direction are opposite.
In one possible implementation, the second preset direction includes at least one of a first direction and a second direction, the first direction being the X-axis direction and the second direction being the Y-axis direction.
In one possible implementation, the third preset direction includes at least one of the first direction and the second direction, the first direction being the X-axis direction and the second direction being the Y-axis direction.
The terminal device 90 provided by the embodiment of the present invention can realize each process realized by the terminal device in the above method embodiments. To avoid repetition, the detailed description and advantageous effects are not repeated here.
In the third embodiment of the present invention, Fig. 10 is a hardware structure diagram of a terminal device for realizing each embodiment of the present invention. The terminal device 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110 and a power supply 111.
It should be noted that those skilled in the art will understand that the terminal device structure shown in Fig. 10 does not constitute a limitation on the terminal device; the terminal device may include more or fewer components than illustrated, combine certain components, or arrange the components differently. In the embodiment of the present application, terminal devices include, but are not limited to, mobile phones, tablet computers, laptop computers, palmtop computers, vehicle-mounted terminals, wearable devices, pedometers and the like.
The processor 110 is configured to: determine a first position in the case of receiving a first input from the user, the first input triggering the optical lens of the terminal device to acquire a first image, the first image being the image acquired when the optical lens is located at the first position; determine a second position in the case of detecting that the optical lens is moved from the first position to the second position; obtain a first distance between the first position and the second position; obtain a first difference, the first difference being the difference between the abscissas of two corresponding pixels in the first image and a second image, the second image being the image acquired when the optical lens is located at the second position; and determine the target object distance in combination with a preset formula, according to the first distance, the focal length of the optical lens and the first difference.
The terminal device 100 provided by the embodiment of the present application can realize each process realized by the terminal device in the above method embodiments. To avoid repetition, the detailed description and advantageous effects are not repeated here.
It should be understood that, in the embodiment of the present application, the radio frequency unit 101 can be used for receiving and sending signals during information transmission and reception or during a call. Specifically, after receiving downlink data from a base station, it passes the data to the processor 110 for processing, and it sends uplink data to the base station. Generally, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer and the like. In addition, the radio frequency unit 101 can also communicate with networks and other devices through a wireless communication system.
The terminal device provides the user with wireless broadband Internet access through the network module 102, for example helping the user to send and receive e-mails, browse web pages and access streaming video.
The audio output unit 103 can convert audio data received by the radio frequency unit 101 or the network module 102, or stored in the memory 109, into an audio signal and output it as sound. Moreover, the audio output unit 103 can also provide audio output related to a specific function performed by the terminal device 100 (for example, a call signal reception sound or a message reception sound). The audio output unit 103 includes a loudspeaker, a buzzer, a receiver and the like.
The input unit 104 is used for receiving audio or video signals. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processing unit 1041 processes image data of still pictures or video obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode, and the processed image frames can be displayed on the display unit 106. The image frames processed by the graphics processing unit 1041 can be stored in the memory 109 (or another storage medium) or sent via the radio frequency unit 101 or the network module 102. The microphone 1042 can receive sound and process it into audio data; in the telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 and output.
The terminal device 100 further includes at least one sensor 105, such as an optical sensor, a motion sensor and other sensors. Specifically, the optical sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear. As a kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes) and can detect the magnitude and direction of gravity when stationary, which can be used to identify the posture of the terminal device (such as horizontal/vertical screen switching, related games and magnetometer posture calibration) and for vibration-recognition related functions (such as a pedometer and tapping). The sensor 105 can also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor and the like, which are not described in detail here.
The display unit 106 is used for displaying information input by the user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display or the like.
The user input unit 107 can be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also called a touch screen, collects touch operations of the user on or near it (for example, operations of the user on or near the touch panel 1071 with a finger, a stylus or any other suitable object or attachment). The touch panel 1071 may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects the touch orientation of the user, detects the signal produced by the touch operation and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection apparatus, converts it into contact coordinates, sends them to the processor 110, and receives and executes the commands sent by the processor 110. Furthermore, the touch panel 1071 can be realized in multiple types such as resistive, capacitive, infrared and surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 can also include other input devices 1072, which can include, but are not limited to, a physical keyboard, function keys (such as a volume control button and a switch button), a trackball, a mouse and a joystick; these are not described in detail here.
Furthermore, the touch panel 1071 can be overlaid on the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides the corresponding visual output on the display panel 1061 according to the type of the touch event. Although in Fig. 10 the touch panel 1071 and the display panel 1061 are two independent components that realize the input and output functions of the terminal device, in some embodiments the touch panel 1071 and the display panel 1061 can be integrated to realize the input and output functions of the terminal device, which is not limited here.
The interface unit 108 is the interface through which an external device is connected to the terminal device 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port and the like. The interface unit 108 can be used to receive input (for example, data information or electric power) from an external device and transfer the received input to one or more elements in the terminal device 100, or to transmit data between the terminal device 100 and an external device.
The memory 109 can be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area: the program storage area can store the operating system and the application programs required by at least one function (such as a sound playing function and an image playing function), and the data storage area can store data created according to the use of the mobile phone (such as audio data and a phone book). In addition, the memory 109 may include a high-speed random access memory and may also include a non-volatile memory, for example at least one magnetic disk storage device, a flash memory device or another volatile solid-state storage component.
The processor 110 is the control center of the terminal device. It connects each part of the entire terminal device through various interfaces and lines, and executes the various functions of the terminal device and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, so as to monitor the terminal device as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 can integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, the application programs and the like, and the modem processor mainly handles wireless communication. It can be understood that the above modem processor may also not be integrated into the processor 110.
The terminal device 100 may also include a power supply 111 (such as a battery) that supplies power to the components. Preferably, the power supply 111 can be logically connected to the processor 110 through a power management system, so as to realize functions such as charge management, discharge management and power consumption management through the power management system.
In addition, the terminal device 100 includes some function modules that are not shown, which are not described in detail here.
Preferably, the embodiment of the present application also provides a terminal device, including a processor 110, a memory 109, and a computer program stored on the memory 109 and executable on the processor 110. When the computer program is executed by the processor 110, each process of the above method embodiments is realized and the same technical effect can be achieved; to avoid repetition, it is not described again here.
The embodiment of the present application also provides a computer readable storage medium on which a computer program is stored. When the computer program is executed by a processor, each process of the above method embodiments is realized and the same technical effect can be achieved; to avoid repetition, it is not described again here. The computer readable storage medium may be, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
It should be noted that, herein, the terms "include", "comprise" or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or device including a series of elements includes not only those elements, but also other elements that are not explicitly listed, or elements that are inherent to such a process, method, article or device. In the absence of further limitations, an element defined by the sentence "including a ..." does not exclude the existence of other identical elements in the process, method, article or device that includes the element.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be realized by software plus the necessary general hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk or an optical disc) and includes a number of instructions to make a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device or the like) execute the methods described in the embodiments of the present application.
The embodiments of the present application are described above with reference to the drawings, but the present application is not limited to the above specific embodiments. The above specific embodiments are only illustrative, not restrictive. Under the inspiration of the present application, those skilled in the art can also devise many other forms without departing from the purpose of the present application and the scope protected by the claims, all of which fall within the protection of the present application.

Claims (14)

1. An object distance measurement method, characterized in that the method comprises:
obtaining a first image acquired by an optical lens of a terminal device at a first position;
in the case where the optical lens is moved from the first position to a second position, obtaining a second image acquired by the optical lens at the second position;
obtaining a first distance between the first position and the second position;
obtaining a first difference;
determining an object distance according to the first distance, a focal length of the optical lens and the first difference;
wherein the first difference is, in the case where the first image and the second image are in the same plane, the difference between the position coordinates of a first pixel in the first image and the position coordinates of a second pixel in the second image; the second pixel and the first pixel are pixels at corresponding relative positions.
2. The method according to claim 1, characterized in that before the obtaining of the first image acquired by the optical lens of the terminal device at the first position, the method further comprises:
determining an initial position of the optical lens as the first position;
and before the obtaining of the second image acquired by the optical lens at the second position, the method further comprises:
controlling the optical lens to move from the initial position to the second position along a first preset direction of the plane in which the optical lens is located.
3. The method according to claim 2, characterized in that the first preset direction comprises at least one of a first direction and a second direction, the first direction being the X-axis direction and the second direction being the Y-axis direction.
4. The method according to claim 1, characterized in that before the obtaining of the first image acquired by the optical lens of the terminal device at the first position, the method further comprises:
controlling the optical lens to move from an initial position of the optical lens to the first position along a second preset direction of the plane in which the optical lens is located;
and before the obtaining of the second image acquired by the optical lens at the second position, the method further comprises:
controlling the optical lens to move from the first position to the second position along a third preset direction of the plane in which the optical lens is located.
5. The method according to claim 4, characterized in that the second preset direction and the third preset direction are opposite.
6. The method according to claim 4 or 5, characterized in that the second preset direction comprises at least one of a first direction and a second direction, the first direction being the X-axis direction and the second direction being the Y-axis direction; and the third preset direction comprises at least one of the first direction and the second direction.
7. A terminal device, characterized in that the terminal device comprises:
an acquiring unit, configured to obtain a first image acquired by an optical lens of the terminal device at a first position;
the acquiring unit being further configured to obtain, in the case where the optical lens is moved from the first position to a second position, a second image acquired by the optical lens at the second position;
the acquiring unit being further configured to obtain a first distance between the first position and the second position;
the acquiring unit being further configured to obtain a first difference;
and a determination unit, configured to determine an object distance according to the first distance, a focal length of the optical lens and the first difference;
wherein the first difference is, in the case where the first image and the second image are in the same plane, the difference between the position coordinates of a first pixel in the first image and the position coordinates of a second pixel in the second image; the second pixel and the first pixel are pixels at corresponding relative positions.
8. The terminal device according to claim 7, characterized in that the determination unit is further configured to determine an initial position of the optical lens as the first position before the acquiring unit obtains the first image acquired by the optical lens of the terminal device at the first position;
the terminal device further comprises:
a first control unit, configured to control, before the acquiring unit obtains the second image acquired by the optical lens at the second position, the optical lens to move from the initial position to the second position along a first preset direction of the plane in which the optical lens is located.
9. The terminal device according to claim 8, characterized in that
the first preset direction comprises at least one of a first direction and a second direction, the first direction being the X-axis direction and the second direction being the Y-axis direction.
10. The terminal device according to claim 7, characterized in that the terminal device further comprises:
a second control unit, configured to control, before the acquiring unit obtains the first image acquired by the optical lens of the terminal device at the first position, the optical lens to move from an initial position of the optical lens to the first position along a second preset direction of the plane in which the optical lens is located;
the second control unit being further configured to control, before the acquiring unit obtains the second image acquired by the optical lens at the second position, the optical lens to move from the first position to the second position along a third preset direction of the plane in which the optical lens is located.
11. The terminal device according to claim 10, characterized in that
the second preset direction and the third preset direction are opposite.
12. The terminal device according to claim 10 or 11, characterized in that
the second preset direction comprises at least one of a first direction and a second direction, the first direction being the X-axis direction and the second direction being the Y-axis direction; and the third preset direction comprises at least one of the first direction and the second direction.
13. A terminal device, characterized by comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein when the computer program is executed by the processor, the steps of the object distance measurement method according to any one of claims 1 to 6 are realized.
14. A computer readable storage medium, characterized in that a computer program is stored on the computer readable storage medium, and when the computer program is executed by a processor, the steps of the object distance measurement method according to any one of claims 1 to 6 are realized.
CN201810036578.0A 2018-01-15 2018-01-15 Object distance measurement method and terminal device Pending CN108317992A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201810036578.0A CN108317992A (en) 2018-01-15 2018-01-15 Object distance measurement method and terminal device
PCT/CN2019/071635 WO2019137535A1 (en) 2018-01-15 2019-01-14 Object distance measurement method and terminal device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810036578.0A CN108317992A (en) 2018-01-15 2018-01-15 Object distance measurement method and terminal device

Publications (1)

Publication Number Publication Date
CN108317992A true CN108317992A (en) 2018-07-24

Family

ID=62894241

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810036578.0A Pending CN108317992A (en) 2018-01-15 2018-01-15 Object distance measurement method and terminal device

Country Status (2)

Country Link
CN (1) CN108317992A (en)
WO (1) WO2019137535A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109859265A (en) * 2018-12-28 2019-06-07 维沃通信科技有限公司 A kind of measurement method and mobile terminal
WO2019137535A1 (en) * 2018-01-15 2019-07-18 维沃移动通信有限公司 Object distance measurement method and terminal device
CN110136114A (en) * 2019-05-15 2019-08-16 厦门理工学院 A kind of wave measurement method, terminal device and storage medium
RU2697822C2 (en) * 2018-11-19 2019-08-21 Алексей Владимирович Зубарь Method of determining coordinates of objects based on their digital images
CN109859265B (en) * 2018-12-28 2024-04-19 维沃移动通信有限公司 Measurement method and mobile terminal

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102980556A (en) * 2012-11-29 2013-03-20 北京小米科技有限责任公司 Distance measuring method and device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103292779B (en) * 2012-02-28 2015-06-24 联想(北京)有限公司 Method for measuring distance and image acquisition equipment
CN103344213A (en) * 2013-06-28 2013-10-09 三星电子(中国)研发中心 Method and device for measuring distance of double-camera
CN106225764A (en) * 2016-07-01 2016-12-14 北京小米移动软件有限公司 Based on the distance-finding method of binocular camera in terminal and terminal
CN106355832A (en) * 2016-10-31 2017-01-25 江苏濠汉信息技术有限公司 Method for monitoring distance from dangerous object to power transmission and distribution line channel
CN108317992A (en) * 2018-01-15 2018-07-24 维沃移动通信有限公司 A kind of object distance measurement method and terminal device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102980556A (en) * 2012-11-29 2013-03-20 北京小米科技有限责任公司 Distance measuring method and device

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019137535A1 (en) * 2018-01-15 2019-07-18 维沃移动通信有限公司 Object distance measurement method and terminal device
RU2697822C2 (en) * 2018-11-19 2019-08-21 Алексей Владимирович Зубарь Method of determining coordinates of objects based on their digital images
CN109859265A (en) * 2018-12-28 2019-06-07 维沃通信科技有限公司 A kind of measurement method and mobile terminal
CN109859265B (en) * 2018-12-28 2024-04-19 维沃移动通信有限公司 Measurement method and mobile terminal
CN110136114A (en) * 2019-05-15 2019-08-16 厦门理工学院 A kind of wave measurement method, terminal device and storage medium
CN110136114B (en) * 2019-05-15 2021-03-02 厦门理工学院 Wave surface height measuring method, terminal equipment and storage medium

Also Published As

Publication number Publication date
WO2019137535A1 (en) 2019-07-18

Similar Documents

Publication Publication Date Title
WO2021136268A1 (en) Photographing method and electronic device
WO2021104195A1 (en) Image display method and electronic device
CN109743498B (en) Shooting parameter adjusting method and terminal equipment
KR20220092937A (en) Screen display control method and electronic device
CN108307109A (en) A kind of high dynamic range images method for previewing and terminal device
CN108337381A (en) A kind of lens control method and mobile terminal
WO2021082744A1 (en) Video viewing method and electronic apparatus
CN110113528A (en) A kind of parameter acquiring method and terminal device
CN107888833A (en) A kind of image capturing method and mobile terminal
CN110445984A (en) A kind of shooting reminding method and electronic equipment
KR20220124244A (en) Image processing method, electronic device and computer readable storage medium
CN108881719A (en) A kind of method and terminal device switching style of shooting
CN110830713A (en) Zooming method and electronic equipment
CN108564613A (en) A kind of depth data acquisition methods and mobile terminal
CN110769154B (en) Shooting method and electronic equipment
CN109005355A (en) A kind of image pickup method and mobile terminal
CN108317992A (en) Object distance measurement method and terminal device
CN108881723A (en) A kind of image preview method and terminal
CN108174109A (en) A kind of photographic method and mobile terminal
CN108174110A (en) A kind of photographic method and flexible screen terminal
CN107959755A (en) A kind of photographic method and mobile terminal
WO2021104265A1 (en) Electronic device and focusing method
JP7413546B2 (en) Photography method and electronic equipment
WO2021136181A1 (en) Image processing method and electronic device
CN110493454A (en) A kind of image pickup method and terminal device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20180724)