WO2019137535A1 - Object distance measurement method and terminal device - Google Patents

Object distance measurement method and terminal device

Info

Publication number
WO2019137535A1
Authority
WO
WIPO (PCT)
Prior art keywords
optical lens
terminal device
image
difference
pixel
Prior art date
Application number
PCT/CN2019/071635
Other languages
French (fr)
Chinese (zh)
Inventor
周燎
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Publication of WO2019137535A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/32 Measuring distances in line of sight; Optical rangefinders by focusing the object, e.g. on a ground glass screen
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image

Definitions

  • the embodiments of the present disclosure relate to the field of communications technologies, and in particular, to an object distance measuring method and a terminal device.
  • with the development of terminal technology, the camera functions of the terminal device are continuously enhanced, for example, the ranging and background blurring functions of the camera in the terminal device.
  • at present, by installing two cameras on the terminal device, the depth information of the object to be photographed (i.e., the spatial distance between the object and the lens of the camera) can be acquired, thereby realizing functions such as ranging and background blurring on the terminal device.
  • specifically, the terminal device uses each of the two cameras to acquire image information of the photographed object on the lens of that camera, calculates the difference between the two pieces of acquired image information, and then obtains the depth information of the photographed object from that difference.
  • however, in the above method, since the terminal device uses two cameras to acquire the depth information of the photographed object, the cost is high.
  • an embodiment of the present disclosure provides an object distance measuring method applied to a terminal device. The object distance measuring method includes: acquiring a first image acquired by an optical lens of the terminal device at a first position; in a case where the optical lens moves from the first position to a second position, acquiring a second image acquired by the optical lens at the second position; acquiring a first distance between the first position and the second position; acquiring a first difference; and determining a target object distance according to the first distance, the focal length of the optical lens, and the first difference. The first difference is, in a case where the first image and the second image are located on the same plane, the difference between the position coordinates of a first pixel in the first image and the position coordinates of a second pixel in the second image; the second pixel is the pixel at the position in the second image corresponding to the first pixel.
  • an embodiment of the present disclosure provides a terminal device, where the terminal device includes: an obtaining unit and a determining unit.
  • the acquiring unit is configured to acquire a first image acquired by the optical lens of the terminal device at the first position.
  • the acquiring unit is further configured to acquire the second image acquired by the optical lens at the second position in a case where the optical lens is moved from the first position to the second position.
  • the acquiring unit is further configured to acquire a first distance between the first location and the second location.
  • the obtaining unit is further configured to acquire the first difference.
  • a determining unit configured to determine a target object distance according to the first distance, the focal length of the optical lens, and the first difference.
  • the first difference is the difference between the position coordinates of the first pixel in the first image and the position coordinates of the second pixel in the second image in a case where the first image and the second image are located on the same plane.
  • the second pixel is the pixel at the position in the second image corresponding to the first pixel.
  • an embodiment of the present disclosure provides a terminal device, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the object distance measuring method described in the above first aspect.
  • an embodiment of the present disclosure provides a computer readable storage medium, where the computer readable storage medium stores a computer program, and the computer program, when executed by a processor, implements the steps of the object distance measuring method described in the above first aspect.
  • FIG. 1 is a schematic structural diagram of an Android operating system according to an embodiment of the present disclosure
  • FIG. 2 is a flowchart of a method for measuring an object distance according to an embodiment of the present disclosure
  • FIG. 3 is a second flowchart of a method for measuring an object distance according to an embodiment of the present disclosure
  • FIG. 4 is a third flowchart of a method for measuring an object distance according to an embodiment of the present disclosure
  • FIG. 5 is a schematic diagram of a space coordinate system according to an embodiment of the present disclosure.
  • FIG. 6 is a schematic diagram of an example of acquiring a first difference according to an embodiment of the present disclosure.
  • FIG. 7 is a schematic diagram of a position example of a first image and a second image according to an embodiment of the present disclosure
  • FIG. 8 is a second schematic diagram of a position example of a first image and a second image according to an embodiment of the present disclosure
  • FIG. 9 is a schematic structural diagram of a terminal device according to an embodiment of the present disclosure.
  • FIG. 10 is a schematic structural diagram of hardware of a terminal device according to an embodiment of the present disclosure.
  • the terms “first” and “second” in the specification and claims of the embodiments of the present disclosure are used to distinguish different objects, not to describe a specific order of the objects.
  • for example, the first position and the second position are used to distinguish different positions, rather than to describe a particular order of positions.
  • the meaning of "a plurality” means two or more unless otherwise indicated.
  • the words “exemplary” or “such as” are used to mean an example, illustration, or illustration. Any embodiment or design described as “exemplary” or “for example” in the disclosed embodiments should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of the words “exemplary” or “such as” is intended to present the concepts in a particular manner.
  • the following explains some concepts involved in the object distance measuring method and the terminal device provided by the embodiments of the present disclosure.
  • “First difference” refers to the difference between the position coordinates of two pixels at corresponding positions in the two images obtained when the camera (the camera includes an optical lens and an image sensor) photographs the same subject at two different positions, for example, the difference between the position coordinates of the first pixel in the first image and the position coordinates of the second pixel in the second image in the embodiments of the present disclosure.
  • “Focal length of an optical lens” refers to the distance between the optical lens and the image sensor.
  • “Object distance” refers to the distance between the optical lens and the photographed object.
  • Embodiments of the present disclosure provide an object distance measuring method and a terminal device, which can be applied to a process in which a terminal device acquires an object distance, and specifically to a process in which the terminal device obtains the object distance through a single camera. They can solve the prior-art problem that the cost is high because the terminal device uses multiple cameras when acquiring the depth information of the object to be photographed.
  • the terminal device in the embodiment of the present disclosure may be a terminal device having an operating system.
  • the operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present disclosure.
  • the following uses the Android operating system as an example to introduce the software environment to which the object distance measuring method provided by the embodiment of the present disclosure is applied.
  • FIG. 1 is a schematic structural diagram of a possible Android operating system provided by an embodiment of the present disclosure.
  • the architecture of the Android operating system includes four layers: the application layer, the application framework layer, the system runtime layer, and the kernel layer (specifically, the Linux kernel layer).
  • the application layer includes various applications (including system applications and third-party applications) in the Android operating system.
  • the application framework layer is the framework of applications; developers can develop applications based on the application framework layer while adhering to its development principles.
  • the system runtime layer includes libraries (also known as system libraries) and the Android operating system runtime environment.
  • the library mainly provides the various resources required by the Android operating system.
  • the Android operating system runtime environment is used to provide a software environment for the Android operating system.
  • the kernel layer is the operating system layer of the Android operating system and belongs to the lowest level of the Android operating system software layer.
  • the kernel layer provides core system services and hardware-related drivers for the Android operating system based on the Linux kernel.
  • a developer can develop, based on the system architecture of the Android operating system shown in FIG. 1, a software program that implements the object distance measuring method provided by the embodiments of the present disclosure.
  • the object distance measuring method can thus run on the Android operating system shown in FIG. 1. That is, the processor or the terminal device can implement the object distance measuring method provided by the embodiments of the present disclosure by running the software program in the Android operating system.
  • FIG. 2 illustrates an object distance measuring method provided by an embodiment of the present disclosure, and the method can be applied to a terminal device having an Android operating system as shown in FIG. 1.
  • the object distance measuring method includes steps 201-205:
  • Step 201 The terminal device acquires a first image acquired by the optical lens of the terminal device at the first position.
  • the terminal device may determine the first position when receiving an input of the user, and acquire the first image at the first position through the optical lens.
  • the input is used to trigger an optical lens of the terminal device to acquire an image, where the first image is an image acquired when the optical lens is in the first position.
  • the user may input on the current interface of the terminal device to trigger the optical lens of the terminal device to acquire the first image.
  • the user may perform a first input on the shooting icon on the current interface after the camera is turned on, so that the optical lens of the terminal device photographs the shooting object.
  • the image formed by the shooting object on the optical lens, which is at the first position in the terminal device, is the first image.
  • the input may be a click/press operation of the user on the terminal device, and the click operation may be a single click, a double click, or a preset number of consecutive clicks.
  • the user can also trigger the optical lens of the terminal device to acquire the first image by operating a shortcut key, a preset button, or a preset button combination of the terminal device.
  • the first position may be an initial position of the optical lens.
  • the object distance measuring method provided by the embodiment of the present disclosure further includes step 301:
  • Step 301 The terminal device determines an initial position of the optical lens as the first position.
  • the first position may be a position after the optical lens is controlled to move from the initial position along a second preset direction.
  • the object distance measuring method provided by the embodiment of the present disclosure further includes step 302:
  • Step 302 The terminal device controls the optical lens to move from the initial position of the optical lens to the first position along a second preset direction of the plane of the optical lens.
  • the terminal device can control the optical lens to move from the initial position along the second preset direction to the first position through an Optical Image Stabilization (OIS) device in the terminal device.
  • the second preset direction includes at least one of a first direction and a second direction, where the first direction is an X-axis direction and the second direction is a Y-axis direction.
  • a spatial coordinate system is provided in an embodiment of the present disclosure.
  • the optical lens 1 is at position a in the spatial coordinate system shown in FIG. 5; the plane in which the optical lens is located is the plane formed by the X-axis and the Y-axis, and point P is the subject.
  • through the OIS device, the terminal device can control the optical lens 1 to move from position a along the X-axis to the first position; or control the optical lens 1 to move from position a along the Y-axis to the first position; or control the optical lens 1 to move from position a along both the X-axis and the Y-axis to the first position.
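  • purely as an illustration (the embodiments describe OIS hardware control, not program code), the lens positions and preset directions in the coordinate system of FIG. 5 can be modeled as simple plane vectors; the move_lens helper below is a hypothetical stand-in for the OIS actuation described above.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class LensPosition:
    """Position of the optical lens in the X-Y plane of FIG. 5 (Z is the optical axis)."""
    x: float
    y: float


# Preset directions lie in the plane of the lens: along X, along Y, or along both.
X_DIRECTION = (1.0, 0.0)
Y_DIRECTION = (0.0, 1.0)
XY_DIRECTION = (1.0, 1.0)


def move_lens(start, direction, step):
    """Hypothetical stand-in for OIS actuation: shift the lens from `start`
    along `direction` by `step` units per axis component."""
    dx, dy = direction
    return LensPosition(start.x + dx * step, start.y + dy * step)


if __name__ == "__main__":
    position_a = LensPosition(0.0, 0.0)              # position a in FIG. 5
    first = move_lens(position_a, X_DIRECTION, 3.0)  # move along X to the first position
    print(first)                                     # LensPosition(x=3.0, y=0.0)
```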
  • Step 202 In a case where the optical lens moves from the first position to the second position, the terminal device acquires the second image acquired by the optical lens at the second position.
  • the terminal device determines the second position when detecting that the optical lens moves from the first position to the second position, and acquires the second image at the second position through the optical lens.
  • in a case where the first position is the initial position of the optical lens, the second position may be a position after the optical lens is controlled to move from the initial position along a first preset direction.
  • the object distance measuring method provided by the embodiment of the present disclosure further includes step 303:
  • Step 303 The terminal device controls the optical lens to move from the initial position to the second position along a first preset direction of the plane of the optical lens.
  • the terminal device can control the optical lens to move from the initial position along the first predetermined direction to the second position by the OIS device.
  • the first preset direction includes at least one of a first direction and a second direction, the first direction is an X-axis direction, and the second direction is a Y-axis direction.
  • for example, through the OIS device, the terminal device can control the optical lens 1 to move from position a along the X-axis to the second position; or control the optical lens 1 to move from position a along the Y-axis to the second position; or control the optical lens 1 to move from position a along both the X-axis and the Y-axis to the second position.
  • for example, assuming that the first position is the initial position of the optical lens and the spatial coordinates of the initial position are (0, 0, 0), the terminal device can control the optical lens to move from (0, 0, 0) along the X-axis to the second position (3, 0, 0); or control the optical lens to move from (0, 0, 0) along the Y-axis to the second position (0, 4, 0); or control the optical lens to move from (0, 0, 0) along both the X-axis and the Y-axis to the second position (3, 4, 0).
  • in a case where the first position is a position reached by controlling the optical lens to move from the initial position along the second preset direction, the second position may be the position reached after the optical lens is controlled to move from the first position along a third preset direction.
  • the object distance measuring method provided by the embodiment of the present disclosure further includes step 304:
  • Step 304 The terminal device controls the optical lens to move from the first position to the second position along a third preset direction of the plane of the optical lens.
  • the second preset direction is different from the third preset direction.
  • the second preset direction includes at least one of a first direction and a second direction, and the third preset direction also includes at least one of the first direction and the second direction, where the first direction is an X-axis direction and the second direction is a Y-axis direction.
  • the second preset direction and the third preset direction are opposite.
  • for example, through the OIS device, the terminal device can control the optical lens 1 to move from position a along the X-axis to the first position and then move the optical lens from the first position along the X-axis to the second position; or control the optical lens 1 to move from position a along the Y-axis to the first position and then move the optical lens from the first position along the Y-axis to the second position; or control the optical lens to move from position a along both the X-axis and the Y-axis to the first position and then move the optical lens from the first position along both the X-axis and the Y-axis to the second position.
  • for example, the terminal device can control the optical lens to move from (0, 0, 0) along the X-axis to the first position (3, 0, 0) and then move the optical lens from the first position (3, 0, 0) along the X-axis to the second position (-3, 0, 0); or control the optical lens to move from (0, 0, 0) along the Y-axis to the first position (0, 4, 0) and then move the optical lens from the first position (0, 4, 0) along the Y-axis to the second position (0, -4, 0); or control the optical lens to move from (0, 0, 0) along both the X-axis and the Y-axis to the first position (3, 4, 0) and then move the optical lens from the first position (3, 4, 0) along both the X-axis and the Y-axis to the second position (-3, -4, 0).
  • Step 203 The terminal device acquires a first distance between the first location and the second location.
  • the terminal device may calculate the first distance ΔX according to the first position and the second position.
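  • a minimal sketch of this step: assuming the two lens positions are known from the OIS control (the coordinate examples above), the first distance ΔX is simply the Euclidean distance between them.

```python
import math


def first_distance(first_position, second_position):
    """First distance ΔX: Euclidean distance between the two lens positions."""
    return math.dist(first_position, second_position)


# Coordinate examples from the text above:
print(first_distance((0, 0, 0), (3, 0, 0)))    # 3.0, movement along the X-axis only
print(first_distance((0, 0, 0), (0, 4, 0)))    # 4.0, movement along the Y-axis only
print(first_distance((0, 0, 0), (3, 4, 0)))    # 5.0, movement along both the X-axis and the Y-axis
print(first_distance((3, 4, 0), (-3, -4, 0)))  # 10.0, opposite movements from the initial position
```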
  • Step 204 The terminal device acquires the first difference.
  • the first difference is the difference between the position coordinates of the first pixel in the first image and the position coordinates of the second pixel in the second image in a case where the first image and the second image are located on the same plane.
  • the second pixel is the pixel at the position in the second image corresponding to the first pixel.
  • O1 is the first position of the optical lens, O2 is the second position of the optical lens, and the first distance is ΔX.
  • the spatial coordinates of point P are expressed as P(xc, yc, zc), and zc can generally be regarded as the object distance of point P, denoted by Z.
  • the first image acquired when the optical lens is at the first position O1 is P1, and the position coordinates of a pixel point in P1 are P1(x1, y1); the second image acquired when the optical lens is at the second position O2 is P2, and the position coordinates of the pixel point in P2 at the position corresponding to that pixel of P1 are P2(x2, y1). The terminal device can calculate the first difference between the position coordinates P1(x1, y1) and the position coordinates P2(x2, y1).
  • the terminal device may perform compensation and correction processing on the first image and the second image, such as image distortion correction and angle adjustment, so that the first image and the second image are aligned on the spatial Y-axis (and/or X-axis) (that is, the first image and the second image are on the same plane).
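  • the embodiments do not spell out how the corresponding pixel is located, so the sketch below simply assumes that the matching pixel P2(x2, y1) of P1(x1, y1) has already been found after the alignment step and that the first difference is taken along the axis of lens movement (here X); the numeric values are hypothetical.

```python
def first_difference(p1, p2):
    """First difference d between corresponding pixels of the two aligned images.

    p1: (x1, y1) position coordinates of the first pixel in the first image.
    p2: (x2, y2) position coordinates of the corresponding pixel in the second image.
    After the compensation/correction step the rows coincide, so the difference
    reduces to the horizontal offset x1 - x2, expressed in pixels.
    """
    (x1, y1), (x2, y2) = p1, p2
    assert abs(y1 - y2) < 1e-6, "images are expected to be row-aligned after correction"
    return x1 - x2


# Hypothetical example: the same point of the subject appears at column 412.0
# in the first image and at column 396.0 in the second image.
print(first_difference((412.0, 250.0), (396.0, 250.0)))  # 16.0 pixels
```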
  • Step 205 The terminal device determines the target object distance according to the first distance, the focal length of the optical lens, and the first difference.
  • the terminal device may determine the target object distance according to a preset formula, based on the first distance, the focal length of the optical lens, and the first difference.
  • the preset formula adopted by the terminal device follows from the geometry shown in FIG. 7, where Z is the target object distance, P is the photographed object, P1 is the first image acquired when the optical lens is at the first position O1, P2 is the second image acquired when the optical lens is at the second position O2, B is the distance between O1 and O2, d is the first difference, and f is the focal length of the optical lens. According to the triangle similarity principle, d/f = B/Z, which gives Equation 1: Z = B·f/d.
  • in Equation 2, D is the width of the image sensor and Z is the target object distance.
  • the preset formula can be obtained from Equation 1 and Equation 2, and the terminal device can use the preset formula to calculate the target object distance Z based on the acquired first distance ΔX, the focal length f of the optical lens, and the first difference d.
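  • the formula images are not reproduced in this text, so the sketch below implements the triangle-similarity relation Z = ΔX·f/d named above; the use of the sensor width D and the horizontal pixel count to convert the pixel difference into a physical offset is an assumption about Equation 2, and all numeric values are hypothetical.

```python
def target_object_distance(delta_x_mm, focal_length_mm, pixel_difference,
                           sensor_width_mm, image_width_px):
    """Estimate the target object distance Z for a single camera whose lens is shifted by ΔX.

    delta_x_mm:       first distance ΔX between the two lens positions, in mm.
    focal_length_mm:  focal length f of the optical lens (lens-to-sensor distance), in mm.
    pixel_difference: first difference d between corresponding pixels, in pixels.
    sensor_width_mm:  width D of the image sensor, in mm.
    image_width_px:   number of pixel columns across the sensor width.
    """
    # Assumed form of Equation 2: convert the pixel disparity into a physical
    # offset on the sensor using the sensor width and its horizontal resolution.
    d_mm = pixel_difference * sensor_width_mm / image_width_px
    # Equation 1 (triangle similarity): d/f = ΔX/Z, hence Z = ΔX * f / d.
    return delta_x_mm * focal_length_mm / d_mm


# Hypothetical numbers: 1 mm lens shift, 4 mm focal length, 16 px difference,
# 6 mm wide sensor with 4000 pixel columns.
print(round(target_object_distance(1.0, 4.0, 16.0, 6.0, 4000), 1))  # ≈ 166.7 (mm)
```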
  • the terminal device can control the optical lens of the single camera to move to the first position and the second position, and acquire the first difference between the first image acquired at the first position and the second image acquired at the second position, without needing multiple cameras to separately acquire the first image and the second image, thereby reducing the cost of the camera.
  • An embodiment of the present disclosure provides an object distance measuring method.
  • in this method, the terminal device may determine the target object distance according to the first distance between the first position and the second position of the optical lens, the focal length of the optical lens, and the first difference. Since the terminal device can control the optical lens of the camera to move to the first position and the second position, acquire the first distance between the first position and the second position, and acquire the first difference between the first image acquired at the first position and the second image acquired at the second position, it does not need to use multiple cameras to acquire the first image and the second image, thereby reducing the cost of the camera.
  • since the embodiment of the present disclosure adopts a single camera, it avoids the problem that, when multiple cameras are used to photograph an object, differences in the parameters of the cameras (such as lens curvature and brightness) affect the shooting precision and therefore the shooting effect of the terminal device, so the captured image is more accurate.
  • the object distance measurement method provided by the embodiment of the present disclosure can use the acquired target object distance to implement functions such as background blurring, mapping, and 3D effects.
  • FIG. 9 is a schematic structural diagram of a terminal device involved in the embodiment of the present disclosure.
  • the terminal device 90 may include: an obtaining unit 91 and a determining unit 92.
  • the obtaining unit 91 is configured to acquire a first image acquired by the optical lens of the terminal device at the first position.
  • the obtaining unit 91 is further configured to acquire a second image acquired by the optical lens at the second position in a case where the optical lens is moved from the first position to the second position.
  • the obtaining unit 91 is further configured to acquire a first distance between the first location and the second location.
  • the obtaining unit 91 is further configured to acquire the first difference.
  • the determining unit 92 is configured to determine the target object distance according to the first distance, the focal length of the optical lens, and the first difference.
  • the first difference is the difference between the position coordinates of the first pixel in the first image and the position coordinates of the second pixel in the second image in a case where the first image and the second image are located on the same plane.
  • the second pixel is the pixel at the position in the second image corresponding to the first pixel.
  • the determining unit 92 is further configured to determine the initial position of the optical lens as the first position before the acquiring unit 91 acquires the first image acquired by the optical lens of the terminal device at the first position.
  • the terminal device in the embodiment of the present disclosure further includes: a first control unit.
  • the first control unit is configured to, before the acquiring unit 91 acquires the second image acquired by the optical lens at the second position, control the optical lens to move from the initial position along the first preset direction of the plane of the optical lens to the second position.
  • the first preset direction includes at least one of a first direction and a second direction, the first direction being an X-axis direction and the second direction being a Y-axis direction.
  • the terminal device in the embodiment of the present disclosure further includes: a second control unit.
  • the second control unit is configured to, before the acquiring unit 91 acquires the first image acquired by the optical lens of the terminal device at the first position, control the optical lens to move from the initial position of the optical lens along the second preset direction of the plane of the optical lens to the first position.
  • the second control unit is further configured to: before the acquiring unit 91 acquires the second image acquired by the optical lens at the second position, control the optical lens to move from the first position to the second position along the third preset direction of the plane of the optical lens .
  • the second preset direction is opposite to the third preset direction.
  • the second preset direction includes at least one of a first direction and a second direction, where the first direction is an X-axis direction, and the second direction is a Y-axis direction.
  • the third preset direction includes at least one of a first direction and a second direction, the first direction being an X-axis direction, and the second direction being a Y-axis direction.
  • the terminal device 90 provided by the embodiment of the present disclosure can implement various processes implemented by the terminal device in the foregoing method embodiments. To avoid repetition, detailed description and beneficial effects are not described herein again.
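  • purely as an illustrative sketch (the embodiment describes functional units, not code), the obtaining unit 91 and the determining unit 92 could be organized as below; capture_image is a hypothetical placeholder for whatever camera interface the terminal device actually exposes.

```python
class ObtainingUnit:
    """Mirrors obtaining unit 91: gathers the two images at the two lens positions."""

    def __init__(self, capture_image):
        # capture_image is a hypothetical callable returning the image formed
        # on the sensor when the lens is at a given position.
        self.capture_image = capture_image

    def acquire_images(self, first_position, second_position):
        first_image = self.capture_image(first_position)
        second_image = self.capture_image(second_position)
        return first_image, second_image


class DeterminingUnit:
    """Mirrors determining unit 92: turns ΔX, f and d into the target object distance."""

    def determine(self, first_distance_mm, focal_length_mm, first_difference_mm):
        # Same triangle-similarity relation as in step 205 (d already in physical units).
        return first_distance_mm * focal_length_mm / first_difference_mm


if __name__ == "__main__":
    obtaining = ObtainingUnit(capture_image=lambda position: f"image@{position}")
    print(obtaining.acquire_images((0, 0), (3, 0)))      # ('image@(0, 0)', 'image@(3, 0)')
    print(DeterminingUnit().determine(1.0, 4.0, 0.024))  # 166.66...
```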
  • FIG. 10 is a schematic diagram of a hardware structure of a terminal device that implements various embodiments of the present disclosure. The terminal device 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power source 111.
  • the terminal device structure shown in FIG. 10 does not constitute a limitation on the terminal device; the terminal device may include more or fewer components than illustrated, combine some components, or have a different component arrangement.
  • the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
  • the processor 110 is configured to: determine a first position when a first input of the user is received, where the first input is used to trigger the optical lens of the terminal device to acquire a first image, and the first image is an image acquired when the optical lens is at the first position; determine a second position when it is detected that the optical lens moves from the first position to the second position; acquire a first distance between the first position and the second position; acquire a first difference between the abscissas of the two corresponding pixels in the first image and the second image, where the second image is an image acquired when the optical lens is at the second position; and, in combination with the preset formula, determine the target object distance according to the first distance, the focal length of the optical lens, and the first difference.
  • the terminal device 100 provided by the embodiment of the present disclosure can implement various processes implemented by the terminal device in the foregoing method embodiments. To avoid repetition, the detailed description and the beneficial effects are not described herein again.
  • the radio frequency unit 101 may be used for receiving and transmitting signals during information transmission and reception or during a call; specifically, it receives downlink data from a base station and forwards it to the processor 110 for processing, and sends uplink data to the base station.
  • radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio unit 101 can also communicate with the network and other devices through a wireless communication system.
  • the terminal device provides the user with wireless broadband Internet access through the network module 102, such as helping the user to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 103 can convert the audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Moreover, the audio output unit 103 can also provide audio output (eg, call signal reception sound, message reception sound, etc.) related to a specific function performed by the terminal device 100.
  • the audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 104 is for receiving an audio or video signal.
  • the input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042.
  • the graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode.
  • the processed image frame can be displayed on the display unit 106.
  • the image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio unit 101 or the network module 102.
  • the microphone 1042 can receive sound and can process such sound as audio data.
  • in a telephone call mode, the processed audio data can be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101 and then output.
  • the terminal device 100 also includes at least one type of sensor 105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor, wherein the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light, and the proximity sensor can close the display panel 1061 when the terminal device 100 moves to the ear. / or backlight.
  • the accelerometer sensor can detect the acceleration of each direction (usually three axes). When it is stationary, it can detect the magnitude and direction of gravity. It can be used to identify the attitude of the terminal device (such as horizontal and vertical screen switching, related games).
  • sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described herein.
  • the display unit 106 is for displaying information input by the user or information provided to the user.
  • the display unit 106 can include a display panel 1061.
  • the display panel 1061 can be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
  • the user input unit 107 can be configured to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device.
  • the user input unit 107 includes a touch panel 1071 and other input devices 1072.
  • the touch panel 1071, also referred to as a touch screen, can collect the user's touch operations on or near it (such as operations performed by the user on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 1071 may include two parts of a touch detection device and a touch controller.
  • the touch detection device detects the touch orientation of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and sends them to the processor 110.
  • the touch panel 1071 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic waves.
  • the user input unit 107 may also include other input devices 1072.
  • the other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as a volume control button, a switch button, etc.), a trackball, a mouse, and a joystick, which are not described herein.
  • the touch panel 1071 can be overlaid on the display panel 1061. After the touch panel 1071 detects a touch operation on or near it, the touch operation is transmitted to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event.
  • although the touch panel 1071 and the display panel 1061 are shown as two independent components implementing the input and output functions of the terminal device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the terminal device, which is not limited herein.
  • the interface unit 108 is an interface in which an external device is connected to the terminal device 100.
  • the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, and an audio input/output. (I/O) port, video I/O port, headphone port, and more.
  • the interface unit 108 can be configured to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more components within the terminal device 100, or can be used to transfer data between the terminal device 100 and an external device.
  • Memory 109 can be used to store software programs as well as various data.
  • the memory 109 may mainly include a storage program area and a storage data area, where the storage program area may store an operating system, an application required by at least one function (such as a sound playing function or an image playing function), and the like, and the storage data area may store data created according to the use of the mobile phone (such as audio data, a phone book, etc.).
  • the memory 109 may include a high speed random access memory, and may also include a nonvolatile memory such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
  • the processor 110 is the control center of the terminal device; it connects various parts of the entire terminal device through various interfaces and lines and, by running or executing software programs and/or modules stored in the memory 109 and invoking data stored in the memory 109, performs the various functions of the terminal device and processes data, thereby monitoring the terminal device as a whole.
  • the processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, applications, and the like, and the modem processor mainly handles wireless communication.
  • it can be understood that the above modem processor may not be integrated into the processor 110.
  • the terminal device 100 may further include a power source 111 (such as a battery) for supplying power to the respective components.
  • optionally, the power source 111 may be logically connected to the processor 110 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
  • the terminal device 100 includes some functional modules not shown, and details are not described herein again.
  • an embodiment of the present disclosure further provides a terminal device, including a processor 110, a memory 109, and a computer program stored on the memory 109 and executable on the processor 110, where the computer program, when executed by the processor 110, implements the processes of the foregoing method embodiments and can achieve the same technical effects; to avoid repetition, details are not described herein again.
  • the embodiment of the present disclosure further provides a computer readable storage medium.
  • the computer readable storage medium stores a computer program.
  • when the computer program is executed by the processor, the processes of the foregoing method embodiments are implemented, and the same technical effects can be achieved; to avoid repetition, details are not described herein again.
  • the computer readable storage medium may be, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
  • the foregoing embodiment method can be implemented by means of software plus a necessary general hardware platform, and of course can also be implemented by hardware, but in many cases the former is the better implementation.
  • based on such an understanding, the technical solution of the present application, in essence or the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc), which includes several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the various embodiments of the present application.

Abstract

Embodiments of the present invention relate to the technical field of communications, and provide an object distance measurement method and a terminal device. The specific solution comprises: obtaining a first image acquired by an optical lens of a terminal device at a first position; obtaining, in the case that the optical lens is moved from the first position to a second position, a second image acquired by the optical lens at the second position; obtaining a first distance between the first position and the second position; obtaining a first difference; and determining a target object distance according to the first distance, the focal length of the optical lens, and the first difference, wherein the first difference is a difference between a position coordinate of a first pixel in the first image and a position coordinate of a second pixel in the second image in the case that the first image and the second image are on one plane. In the embodiments of the present invention, an object distance is obtained by a single camera, so that the camera costs can be reduced.

Description

Object distance measurement method and terminal device
Cross-reference to related applications
This application claims priority to Chinese Patent Application No. 201810036578.0, entitled “Object distance measurement method and terminal device” and filed with the China Patent Office on January 15, 2018, the entire contents of which are incorporated herein by reference.
Technical field
The embodiments of the present disclosure relate to the field of communications technologies, and in particular, to an object distance measuring method and a terminal device.
Background
With the development of terminal technology, the camera functions of terminal devices are continuously enhanced, for example, the ranging and background blurring functions of the camera in the terminal device.
At present, by installing two cameras on a terminal device, the depth information of the object to be photographed (i.e., the spatial distance between the object and the lens of the camera) can be acquired, thereby realizing functions such as ranging and background blurring on the terminal device. Specifically, the terminal device uses each of the two cameras to acquire image information of the photographed object on the lens of that camera, calculates the difference between the two pieces of acquired image information, and then obtains the depth information of the photographed object from that difference.
However, in the above method, since the terminal device uses two cameras to acquire the depth information of the photographed object, the cost is high.
Summary of the invention
In a first aspect, an embodiment of the present disclosure provides an object distance measuring method applied to a terminal device. The object distance measuring method includes: acquiring a first image acquired by an optical lens of the terminal device at a first position; in a case where the optical lens moves from the first position to a second position, acquiring a second image acquired by the optical lens at the second position; acquiring a first distance between the first position and the second position; acquiring a first difference; and determining a target object distance according to the first distance, the focal length of the optical lens, and the first difference. The first difference is, in a case where the first image and the second image are located on the same plane, the difference between the position coordinates of a first pixel in the first image and the position coordinates of a second pixel in the second image; the second pixel is the pixel at the position in the second image corresponding to the first pixel.
In a second aspect, an embodiment of the present disclosure provides a terminal device. The terminal device includes an obtaining unit and a determining unit. The obtaining unit is configured to: acquire a first image acquired by an optical lens of the terminal device at a first position; acquire, in a case where the optical lens moves from the first position to a second position, a second image acquired by the optical lens at the second position; acquire a first distance between the first position and the second position; and acquire a first difference. The determining unit is configured to determine a target object distance according to the first distance, the focal length of the optical lens, and the first difference. The first difference is, in a case where the first image and the second image are located on the same plane, the difference between the position coordinates of a first pixel in the first image and the position coordinates of a second pixel in the second image; the second pixel is the pixel at the position in the second image corresponding to the first pixel.
In a third aspect, an embodiment of the present disclosure provides a terminal device, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the object distance measuring method according to the first aspect.
In a fourth aspect, an embodiment of the present disclosure provides a computer readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the steps of the object distance measuring method according to the first aspect.
Brief description of the drawings
FIG. 1 is a schematic structural diagram of an Android operating system according to an embodiment of the present disclosure;
FIG. 2 is a first flowchart of an object distance measuring method according to an embodiment of the present disclosure;
FIG. 3 is a second flowchart of an object distance measuring method according to an embodiment of the present disclosure;
FIG. 4 is a third flowchart of an object distance measuring method according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a spatial coordinate system according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of an example of acquiring a first difference according to an embodiment of the present disclosure;
FIG. 7 is a first schematic diagram of a position example of a first image and a second image according to an embodiment of the present disclosure;
FIG. 8 is a second schematic diagram of a position example of a first image and a second image according to an embodiment of the present disclosure;
FIG. 9 is a schematic structural diagram of a terminal device according to an embodiment of the present disclosure;
FIG. 10 is a schematic structural diagram of hardware of a terminal device according to an embodiment of the present disclosure.
Detailed description
The technical solutions in the embodiments of the present disclosure are clearly and completely described below with reference to the accompanying drawings in the embodiments of the present disclosure. It is obvious that the described embodiments are a part, rather than all, of the embodiments of the present disclosure. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present disclosure without creative efforts fall within the protection scope of the present disclosure.
The terms “first” and “second” in the specification and claims of the embodiments of the present disclosure are used to distinguish different objects, not to describe a specific order of the objects. For example, the first position and the second position are used to distinguish different positions, rather than to describe a particular order of positions. In the description of the embodiments of the present disclosure, unless otherwise stated, “a plurality of” means two or more.
The term “and/or” herein describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate that A exists alone, A and B exist simultaneously, or B exists alone. The symbol “/” herein indicates an “or” relationship between the associated objects; for example, A/B means A or B.
In the embodiments of the present disclosure, the words “exemplary” or “for example” are used to mean serving as an example, instance, or illustration. Any embodiment or design described as “exemplary” or “for example” in the embodiments of the present disclosure should not be construed as being preferred or advantageous over other embodiments or designs. Rather, the use of the words “exemplary” or “for example” is intended to present the related concepts in a specific manner.
The following explains some concepts involved in the object distance measuring method and the terminal device provided by the embodiments of the present disclosure.
“First difference” refers to the difference between the position coordinates of two pixels at corresponding positions in the two images obtained when the camera (the camera includes an optical lens and an image sensor) photographs the same subject at two different positions, for example, the difference between the position coordinates of the first pixel in the first image and the position coordinates of the second pixel in the second image in the embodiments of the present disclosure.
“Focal length of an optical lens” refers to the distance between the optical lens and the image sensor.
“Object distance” refers to the distance between the optical lens and the photographed object.
The embodiments of the present disclosure provide an object distance measuring method and a terminal device, which can be applied to a process in which a terminal device acquires an object distance, and specifically to a process in which the terminal device obtains the object distance through a single camera. They can solve the prior-art problem that the cost is high because the terminal device uses multiple cameras when acquiring the depth information of the object to be photographed.
The terminal device in the embodiments of the present disclosure may be a terminal device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present disclosure.
The following uses the Android operating system as an example to introduce the software environment to which the object distance measuring method provided by the embodiments of the present disclosure is applied.
FIG. 1 is a schematic structural diagram of a possible Android operating system provided by an embodiment of the present disclosure. In FIG. 1, the architecture of the Android operating system includes four layers: the application layer, the application framework layer, the system runtime library layer, and the kernel layer (specifically, the Linux kernel layer).
The application layer includes the various applications (including system applications and third-party applications) in the Android operating system.
The application framework layer is the framework of applications; developers can develop applications based on the application framework layer while adhering to its development principles.
The system runtime library layer includes libraries (also called system libraries) and the Android operating system runtime environment. The libraries mainly provide the various resources required by the Android operating system. The Android operating system runtime environment is used to provide a software environment for the Android operating system.
The kernel layer is the operating system layer of the Android operating system and is the lowest layer of the Android operating system software hierarchy. Based on the Linux kernel, the kernel layer provides core system services and hardware-related drivers for the Android operating system.
Taking the Android operating system as an example, in the embodiments of the present disclosure, a developer can develop, based on the system architecture of the Android operating system shown in FIG. 1, a software program that implements the object distance measuring method provided by the embodiments of the present disclosure, so that the object distance measuring method can run on the Android operating system shown in FIG. 1. That is, a processor or a terminal device can implement the object distance measuring method provided by the embodiments of the present disclosure by running the software program in the Android operating system.
在本公开的第一种实施例中,图2示出了本公开实施例提供的一种物距测量方法,该方法可以应用于具有如图1所示的安卓操作系统的终端设备。如图2所示,该物距测量方法包括步骤201-步骤205:In a first embodiment of the present disclosure, FIG. 2 illustrates an object distance measuring method provided by an embodiment of the present disclosure, and the method can be applied to a terminal device having an Android operating system as shown in FIG. 1. As shown in FIG. 2, the object distance measuring method includes steps 201-205:
Step 201: The terminal device acquires a first image captured by the optical lens of the terminal device at a first position.
In the embodiments of the present disclosure, upon receiving an input from the user, the terminal device may determine the first position and acquire, through the optical lens, the first image at the first position. The input is used to trigger the optical lens of the terminal device to capture an image, and the first image is the image captured when the optical lens is at the first position.
In the embodiments of the present disclosure, after the user opens the camera of the terminal device, the user may perform an input on the current interface of the terminal device to trigger the optical lens of the terminal device to capture the first image.
For example, the user may perform a first input on the shooting icon of the current interface after the camera is opened, so that the optical lens of the terminal device photographs the object to be photographed; the image formed by that object on the optical lens at the first position is the first image.
Optionally, in the embodiments of the present disclosure, the input may be a tap/press operation performed by the user on the terminal device, and the tap operation may be a single tap, a double tap, or a preset number of consecutive taps.
It can be understood that the user may also trigger the optical lens of the terminal device to capture the first image by operating a shortcut key, a preset key, or a preset key combination of the terminal device.
Optionally, in the embodiments of the present disclosure, the first position may be the initial position of the optical lens. Specifically, with reference to FIG. 2 and as shown in FIG. 3, before step 201, the object distance measurement method provided by this embodiment of the present disclosure further includes step 301:
Step 301: The terminal device determines the initial position of the optical lens as the first position.
Optionally, in the embodiments of the present disclosure, the first position may be the position reached after the optical lens is controlled to move from its initial position along a second preset direction. Specifically, with reference to FIG. 2 and as shown in FIG. 4, before step 201, the object distance measurement method provided by this embodiment of the present disclosure further includes step 302:
Step 302: The terminal device controls the optical lens to move from the initial position of the optical lens to the first position along a second preset direction in the plane of the optical lens.
For example, the terminal device may control the optical lens to move from the initial position to the first position along the second preset direction through an optical image stabilization (OIS) device in the terminal device.
Optionally, in the embodiments of the present disclosure, the second preset direction includes at least one of a first direction and a second direction, where the first direction is the X-axis direction and the second direction is the Y-axis direction.
For example, FIG. 5 shows a spatial coordinate system provided in an embodiment of the present disclosure. Optical lens 1 is at position a in the spatial coordinate system shown in FIG. 5, the plane of the optical lens is the plane formed by the X axis and the Y axis, and point P is the photographed object. The terminal device may control, through the OIS device, optical lens 1 to move from position a along the X axis of the space to the first position; or control optical lens 1 to move from position a along the Y axis of the space to the first position; or control optical lens 1 to move from position a along both the X axis and the Y axis of the space to the first position.
Step 202: When the optical lens has moved from the first position to a second position, the terminal device acquires a second image captured by the optical lens at the second position.
In the embodiments of the present disclosure, upon detecting that the optical lens has moved from the first position to the second position, the terminal device determines the second position and acquires, through the optical lens, the second image at the second position.
Optionally, in the embodiments of the present disclosure, when the first position is the initial position of the optical lens, the second position may be the position reached after the optical lens is controlled to move from the initial position along a first preset direction. Accordingly, with reference to FIG. 2 and as shown in FIG. 3, before step 202, the object distance measurement method provided by this embodiment of the present disclosure further includes step 303:
Step 303: The terminal device controls the optical lens to move from the initial position to the second position along a first preset direction in the plane of the optical lens.
For example, the terminal device may control the optical lens to move from the initial position to the second position along the first preset direction through the OIS device.
Optionally, in the embodiments of the present disclosure, the first preset direction includes at least one of a first direction and a second direction, where the first direction is the X-axis direction and the second direction is the Y-axis direction.
For example, referring to the spatial coordinate system shown in FIG. 5, the terminal device may control optical lens 1 to move from position a along the X axis of the space to the second position; or control optical lens 1 to move from position a along the Y axis of the space to the second position; or control, through the OIS device, the optical lens to move from position a along both the X axis and the Y axis of the space to the second position.
For example, suppose the first position is the initial position of the optical lens and the spatial coordinates of the initial position are (0, 0, 0). The terminal device may control the optical lens to move from (0, 0, 0) along the X axis of the space to the second position (3, 0, 0); or control the optical lens to move from (0, 0, 0) along the Y axis of the space to the second position (0, 4, 0); or control the optical lens to move from (0, 0, 0) along both the X axis and the Y axis of the space to the second position (3, 4, 0).
Optionally, in the embodiments of the present disclosure, when the first position is the position reached after the optical lens is controlled to move from the initial position along the second preset direction, the second position may be the position reached after the optical lens is controlled to move from the first position along a third preset direction. Accordingly, with reference to FIG. 2 and as shown in FIG. 4, before step 202, the object distance measurement method provided by this embodiment of the present disclosure further includes step 304:
Step 304: The terminal device controls the optical lens to move from the first position to the second position along a third preset direction in the plane of the optical lens.
The second preset direction and the third preset direction are different.
Optionally, in the embodiments of the present disclosure, the second preset direction includes at least one of a first direction and a second direction, where the first direction is the X-axis direction and the second direction is the Y-axis direction; the third preset direction includes at least one of the first direction and the second direction.
Optionally, in the embodiments of the present disclosure, the second preset direction and the third preset direction are opposite.
For example, referring to the spatial coordinate system shown in FIG. 5, the terminal device may control optical lens 1 to move from position a along the X axis of the space to the first position and then move the optical lens from the first position along the X axis to the second position; or control optical lens 1 to move from position a along the Y axis of the space to the first position and then move the optical lens from the first position along the Y axis to the second position; or control the optical lens to move from position a along both the X axis and the Y axis of the space to the first position and then move the optical lens from the first position along both the X axis and the Y axis to the second position.
For example, suppose position a of the optical lens is the initial position and the spatial coordinates of the initial position are (0, 0, 0). The terminal device may control the optical lens to move from (0, 0, 0) along the X axis of the space to the first position (3, 0, 0) and then move the optical lens from the first position (3, 0, 0) along the X axis to the second position (-3, 0, 0); or control the optical lens to move from (0, 0, 0) along the Y axis of the space to the first position (0, 4, 0) and then move the optical lens from the first position (0, 4, 0) along the Y axis to the second position (0, -4, 0); or control the optical lens to move from (0, 0, 0) along the X axis and the Y axis of the space to the first position (3, 4, 0) and then move the optical lens from the first position (3, 4, 0) along the X axis and the Y axis to the second position (-3, -4, 0).
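Where the steps above shift the lens through OIS and record the two lens positions, the following minimal sketch shows one way the capture sequence could be organized in code. The OIS driver and camera objects are hypothetical placeholders (no real OIS or camera API is assumed); only the coordinate bookkeeping mirrors the numeric examples above.

```python
from dataclasses import dataclass

@dataclass
class LensPosition:
    x: float        # offset along the X axis of the lens plane
    y: float        # offset along the Y axis of the lens plane
    z: float = 0.0  # the lens moves only within its own plane

def capture_at_two_positions(ois, camera, first_shift, second_shift):
    """Shift the lens with a (hypothetical) OIS driver and capture one image at
    each of the two positions, mirroring steps 302/304 and 201/202 above."""
    initial = LensPosition(0.0, 0.0)

    # Second preset direction: initial position -> first position.
    first_pos = LensPosition(initial.x + first_shift[0], initial.y + first_shift[1])
    ois.move_to(first_pos.x, first_pos.y)   # hypothetical OIS call
    first_image = camera.capture()          # hypothetical camera call

    # Third preset direction (e.g. the opposite direction): first -> second position.
    second_pos = LensPosition(first_pos.x + second_shift[0], first_pos.y + second_shift[1])
    ois.move_to(second_pos.x, second_pos.y)
    second_image = camera.capture()

    return first_pos, second_pos, first_image, second_image

# Example matching the text: (0, 0, 0) -> (3, 4, 0) -> (-3, -4, 0)
# corresponds to first_shift = (3, 4) and second_shift = (-6, -8).
```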
Step 203: The terminal device acquires a first distance between the first position and the second position.
For example, after determining the first position and the second position of the optical lens, the terminal device may calculate the first distance ΔX from the first position and the second position.
For example, suppose the spatial coordinates of the first position are (3, 0, 0) and the spatial coordinates of the second position are (-3, 0, 0). The terminal device calculates the first distance ΔX = 6 from (3, 0, 0) and (-3, 0, 0).
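A minimal sketch of this first-distance calculation, assuming the two positions are given as coordinate tuples in the lens plane (the function name is illustrative, not from the original):

```python
import math

def first_distance(p1, p2):
    """Euclidean distance between two lens positions given as (x, y, z) tuples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

print(first_distance((3, 0, 0), (-3, 0, 0)))  # 6.0, matching the example above
```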
Step 204: The terminal device acquires a first difference.
The first difference is the difference between the position coordinates of a first pixel in the first image and the position coordinates of a second pixel in the second image when the first image and the second image lie in the same plane; the second pixel is the pixel at the position corresponding to the first pixel.
For example, as shown in FIG. 6, let O1 be the first position of the optical lens, O2 the second position, and ΔX the first distance. Suppose the photographed object viewed by the camera is point P, whose spatial coordinates are denoted P(x_c, y_c, z_c); z_c can generally be regarded as the object distance of point P and is denoted Z. The first image captured with the lens at O1 is P1, and the position coordinates of its pixel are P_1(x_1, y_1); the second image captured at O2 is P2, and the position coordinates of the pixel in P2 corresponding to that pixel in P1 are P_2(x_2, y_1). The terminal device can obtain the first difference d by calculating the difference between the position coordinates P_1(x_1, y_1) and P_2(x_2, y_1), that is, the difference between x_1 and x_2: |x_2 - x_1| = d.
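A minimal sketch of obtaining the first difference d, assuming grayscale images that have already been rectified so corresponding pixels share the same row; the simple sum-of-absolute-differences search below is one possible way to locate the corresponding pixel and is not prescribed by the disclosure:

```python
import numpy as np

def find_corresponding_x(img1, img2, x1, y1, window=7, search=40):
    """Find the column x2 in img2 whose neighborhood best matches the window
    around (x1, y1) in img1, searching along the same row (the images are
    assumed to be rectified so corresponding pixels share the Y coordinate)."""
    h = window // 2
    patch = img1[y1 - h:y1 + h + 1, x1 - h:x1 + h + 1].astype(np.float32)
    best_x, best_cost = x1, np.inf
    for x2 in range(max(h, x1 - search), min(img2.shape[1] - h, x1 + search)):
        cand = img2[y1 - h:y1 + h + 1, x2 - h:x2 + h + 1].astype(np.float32)
        cost = np.abs(patch - cand).sum()
        if cost < best_cost:
            best_x, best_cost = x2, cost
    return best_x

def first_difference(x1, x2):
    """First difference d = |x2 - x1| between corresponding pixel columns."""
    return abs(x2 - x1)
```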
It can be understood that, in the embodiments of the present disclosure, before acquiring the first difference the terminal device may first apply compensation and rectification processing to the first image and the second image, such as correcting image distortion and adjusting the angle, so that the first image and the second image are aligned on the spatial Y axis (and/or X axis), that is, so that the first image and the second image lie in the same plane.
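One way such compensation could look in practice, sketched with OpenCV's standard undistortion call; the camera matrix and distortion coefficients are illustrative placeholders that would come from a prior calibration step, and the disclosure does not prescribe this particular library:

```python
import cv2
import numpy as np

def rectify(image, camera_matrix, dist_coeffs):
    """Remove lens distortion so the two captures can be compared pixel to pixel."""
    return cv2.undistort(image, camera_matrix, dist_coeffs)

# Illustrative calibration values only.
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume negligible distortion for this sketch

# first_image_rect = rectify(first_image, K, dist)
# second_image_rect = rectify(second_image, K, dist)
```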
Step 205: The terminal device determines the target object distance according to the first distance, the focal length of the optical lens, and the first difference.
In the embodiments of the present disclosure, the terminal device may determine the target object distance from the first distance, the focal length of the optical lens, and the first difference in combination with a preset formula.
Exemplarily, the preset formula adopted by the terminal device is:
[Equation image PCTCN2019071635-appb-000001 of the original filing, not reproduced here]
As shown in FIG. 7, Z is the target distance, P is the photographed object, P1 is the first image captured with the optical lens at the first position O1, P2 is the second image captured at the second position O2, B is the distance between O1 and O2, d is the first difference, and f is the focal length of the optical lens. According to the principle of similar triangles, the following relation can be obtained:
[Equation image PCTCN2019071635-appb-000002, not reproduced here]
from which Formula 1 is calculated:
[Equation image PCTCN2019071635-appb-000003, not reproduced here]
Moreover, as shown in FIG. 8, D is the width of the image sensor, Z is the target distance, and the visual difference between the first image and the second image is ΔY = Y2 + ΔX - Y1. According to the principle of similar triangles, the following relation can be obtained:
[Equation image PCTCN2019071635-appb-000004, not reproduced here]
Since B = ΔY by the equivalence principle, Formula 2 can be obtained:
[Equation image PCTCN2019071635-appb-000005, not reproduced here]
From Formula 1 and Formula 2, the preset formula can be obtained:
[Equation image PCTCN2019071635-appb-000006, not reproduced here]
The terminal device may use the preset formula
[Equation image PCTCN2019071635-appb-000007, not reproduced here]
to calculate the target object distance Z from the acquired first distance ΔX, the focal length f of the optical lens, and the first difference d.
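Because the preset formula itself appears only as equation images in the filing, the sketch below substitutes the standard similar-triangles result Z = B·f/d and treats the lens displacement ΔX as the baseline B. Both choices are assumptions made for illustration; they are not necessarily the exact formula claimed above.

```python
def target_object_distance(delta_x, focal_length, d):
    """Approximate Z via standard triangulation Z = B * f / d, with the lens
    displacement delta_x used as the baseline B (an assumption of this sketch)."""
    if d == 0:
        raise ValueError("zero first difference: object effectively at infinity")
    return delta_x * focal_length / d

# Illustrative numbers only (consistent units assumed for delta_x, f and d):
# Z = target_object_distance(delta_x=6.0, focal_length=4.0, d=0.012)
```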
Compared with the prior art, because the terminal device can control the optical lens of a single camera to move to the first position and the second position and obtain the first difference between the first image at the first position and the second image at the second position, multiple cameras are not needed to obtain the first position and the second position or the first difference between the first image and the second image, so the cost of the camera can be reduced.
The embodiments of the present disclosure provide an object distance measurement method in which the terminal device can determine the target object distance from the first distance between the first position and the second position of the optical lens, the focal length of the optical lens, and the first difference. Because the terminal device can control the optical lens of the camera to move to the first position and the second position, acquire the first distance between the two positions, and acquire the first difference between the first image at the first position and the second image at the second position, multiple cameras are not needed to obtain the first position and the second position or the first difference between the first image and the second image, so the cost of the camera can be reduced.
Moreover, because the embodiments of the present disclosure use a single camera, they avoid the problem that, when multiple cameras are used to photograph an object, differences in the cameras' parameters (such as lens curvature and brightness) affect the accuracy with which the terminal device composites the shots and hence affect its shooting result; the captured image is therefore more accurate.
Further, the object distance measurement method provided by the embodiments of the present disclosure can, by acquiring the target object distance, support applications such as background blurring, image matting, and 3D.
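As one illustration of how an acquired depth value could feed such an application (the threshold, kernel size, and use of OpenCV below are choices made for this sketch, not part of the disclosure):

```python
import cv2
import numpy as np

def blur_background(image, depth_map, subject_depth, tolerance=0.3):
    """Blur pixels whose estimated depth differs from the subject's depth.
    depth_map holds a per-pixel distance estimate (e.g. from repeated ranging)."""
    blurred = cv2.GaussianBlur(image, (21, 21), 0)
    background = np.abs(depth_map - subject_depth) > tolerance * subject_depth
    out = image.copy()
    out[background] = blurred[background]
    return out
```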
In a second embodiment of the present disclosure, FIG. 9 shows a possible structural diagram of the terminal device involved in the embodiments of the present disclosure. As shown in FIG. 9, the terminal device 90 may include an acquisition unit 91 and a determination unit 92.
The acquisition unit 91 is configured to acquire a first image captured by the optical lens of the terminal device at a first position; to acquire, when the optical lens has moved from the first position to a second position, a second image captured by the optical lens at the second position; to acquire a first distance between the first position and the second position; and to acquire a first difference. The determination unit 92 is configured to determine a target object distance according to the first distance, the focal length of the optical lens, and the first difference. The first difference is the difference between the position coordinates of a first pixel in the first image and the position coordinates of a second pixel in the second image when the first image and the second image lie in the same plane; the second pixel is the pixel at the position corresponding to the first pixel.
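A minimal software sketch of how acquisition unit 91 and determination unit 92 might be organized; every device interface (lens control, capture, pixel matching) is a hypothetical callable supplied by the caller, and the distance formula is the hedged approximation used in the earlier sketches:

```python
class AcquisitionUnit:
    """Mirrors acquisition unit 91: gathers the two images, the first distance
    and the first difference. move_lens, capture and match_pixel are hypothetical
    callables injected by the caller."""
    def __init__(self, move_lens, capture, match_pixel):
        self.move_lens = move_lens
        self.capture = capture
        self.match_pixel = match_pixel

    def acquire(self, first_pos, second_pos):
        """first_pos and second_pos are (x, y, z) tuples in the lens plane."""
        self.move_lens(first_pos)
        first_image = self.capture()
        self.move_lens(second_pos)
        second_image = self.capture()
        first_distance = sum((a - b) ** 2 for a, b in zip(first_pos, second_pos)) ** 0.5
        x1, x2 = self.match_pixel(first_image, second_image)  # corresponding pixel columns
        first_difference = abs(x2 - x1)
        return first_distance, first_difference

class DeterminationUnit:
    """Mirrors determination unit 92: applies the preset formula. Here the hedged
    triangulation approximation Z = delta_x * f / d from the earlier sketch is used."""
    def __init__(self, focal_length):
        self.focal_length = focal_length

    def target_distance(self, first_distance, first_difference):
        return first_distance * self.focal_length / first_difference
```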
In a possible implementation, the determination unit 92 is further configured to determine the initial position of the optical lens as the first position before the acquisition unit 91 acquires the first image captured by the optical lens of the terminal device at the first position. The terminal device in this embodiment of the present disclosure further includes a first control unit, configured to control the optical lens to move from the initial position to the second position along a first preset direction in the plane of the optical lens before the acquisition unit 91 acquires the second image captured by the optical lens at the second position.
In a possible implementation, the first preset direction includes at least one of a first direction and a second direction, where the first direction is the X-axis direction and the second direction is the Y-axis direction.
In a possible implementation, the terminal device in this embodiment of the present disclosure further includes a second control unit, configured to control the optical lens to move from the initial position of the optical lens to the first position along a second preset direction in the plane of the optical lens before the acquisition unit 91 acquires the first image captured by the optical lens of the terminal device at the first position, and further configured to control the optical lens to move from the first position to the second position along a third preset direction in the plane of the optical lens before the acquisition unit 91 acquires the second image captured by the optical lens at the second position.
In a possible implementation, the second preset direction and the third preset direction are opposite.
In a possible implementation, the second preset direction includes at least one of a first direction and a second direction, where the first direction is the X-axis direction and the second direction is the Y-axis direction.
In a possible implementation, the third preset direction includes at least one of a first direction and a second direction, where the first direction is the X-axis direction and the second direction is the Y-axis direction.
The terminal device 90 provided by the embodiments of the present disclosure can implement the processes implemented by the terminal device in the foregoing method embodiments; to avoid repetition, the detailed description and beneficial effects are not repeated here.
In a third embodiment of the present disclosure, FIG. 10 is a schematic diagram of the hardware structure of a terminal device implementing various embodiments of the present disclosure. The terminal device 100 includes, but is not limited to, a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111.
It should be noted that, as those skilled in the art will understand, the terminal device structure shown in FIG. 10 does not constitute a limitation on the terminal device; the terminal device may include more or fewer components than shown, combine certain components, or use a different arrangement of components. In the embodiments of the present disclosure, terminal devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, wearable devices, pedometers, and the like.
The processor 110 is configured to: upon receiving a first input from the user, determine a first position, where the first input is used to trigger the optical lens of the terminal device to capture a first image and the first image is the image captured when the optical lens is at the first position; upon detecting that the optical lens has moved from the first position to a second position, determine the second position; acquire a first distance between the first position and the second position; acquire a first difference, where the first difference is the difference between the abscissas of two corresponding pixels in the first image and a second image, the second image being the image captured when the optical lens is at the second position; and, in combination with a preset formula, determine a target object distance according to the first distance, the focal length of the optical lens, and the first difference.
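Tying the earlier sketches together, the following end-to-end function mirrors the processor flow described above. It reuses the helpers sketched previously (capture_at_two_positions, find_corresponding_x, first_distance, first_difference, target_object_distance), so it is illustrative rather than self-contained, and all device interfaces remain hypothetical:

```python
def measure_object_distance(ois, camera, focal_length, shift=(3.0, 4.0)):
    """End-to-end sketch of steps 201-205 using the helper functions above."""
    first_pos, second_pos, img1, img2 = capture_at_two_positions(
        ois, camera,
        first_shift=shift,
        second_shift=(-2 * shift[0], -2 * shift[1]))  # move to the opposite side
    delta_x = first_distance((first_pos.x, first_pos.y, first_pos.z),
                             (second_pos.x, second_pos.y, second_pos.z))
    x1, y1 = 120, 64  # illustrative pixel of the photographed object
    x2 = find_corresponding_x(img1, img2, x1, y1)
    d = first_difference(x1, x2)
    return target_object_distance(delta_x, focal_length, d)
```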
The terminal device 100 provided by the embodiments of the present disclosure can implement the processes implemented by the terminal device in the foregoing method embodiments; to avoid repetition, the detailed description and beneficial effects are not repeated here.
It should be understood that, in the embodiments of the present disclosure, the radio frequency unit 101 may be used for receiving and sending signals during the sending and receiving of information or during a call; specifically, it receives downlink data from a base station and delivers it to the processor 110 for processing, and sends uplink data to the base station. In general, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 may also communicate with networks and other devices through a wireless communication system.
The terminal device provides the user with wireless broadband Internet access through the network module 102, for example helping the user to send and receive e-mail, browse web pages, and access streaming media.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102, or stored in the memory 109, into an audio signal and output it as sound. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (for example, a call signal reception sound or a message reception sound). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is configured to receive audio or video signals. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processing unit 1041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processing unit 1041 may be stored in the memory 109 (or another storage medium) or sent via the radio frequency unit 101 or the network module 102. The microphone 1042 can receive sound and process it into audio data. In a telephone call mode, the processed audio data may be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 and output.
The terminal device 100 further includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1061 according to the ambient light, and the proximity sensor can switch off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear. As a type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes) and, when stationary, detect the magnitude and direction of gravity; it can be used to recognize the terminal device's posture (such as switching between portrait and landscape, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer or tapping). The sensor 105 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described here.
The display unit 106 is configured to display information input by the user or information provided to the user. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 107 may be configured to receive input numeric or character information and to generate key signal input related to user settings and function control of the terminal device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touchscreen, can collect touch operations by the user on or near it (such as operations performed by the user on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may further include other input devices 1072. Specifically, the other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described here.
Further, the touch panel 1071 may cover the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in FIG. 10 the touch panel 1071 and the display panel 1061 are implemented as two independent components to realize the input and output functions of the terminal device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to realize the input and output functions of the terminal device, which is not limited here.
The interface unit 108 is an interface through which an external device is connected to the terminal device 100. For example, the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be configured to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements within the terminal device 100, or may be used to transmit data between the terminal device 100 and an external device.
The memory 109 may be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area. The program storage area may store an operating system, applications required by at least one function (such as a sound playback function or an image playback function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book). In addition, the memory 109 may include a high-speed random access memory and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is the control center of the terminal device. It connects all parts of the entire terminal device through various interfaces and lines, and performs the various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby monitoring the terminal device as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, applications, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 110.
The terminal device 100 may further include a power supply 111 (such as a battery) that supplies power to the components. Preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system.
In addition, the terminal device 100 includes some functional modules that are not shown, which are not described here.
Preferably, an embodiment of the present disclosure further provides a terminal device, including a processor 110, a memory 109, and a computer program stored in the memory 109 and runnable on the processor 110. When executed by the processor 110, the computer program implements the processes of the foregoing method embodiments and can achieve the same technical effects; to avoid repetition, details are not repeated here.
An embodiment of the present disclosure further provides a computer-readable storage medium storing a computer program. When executed by a processor, the computer program implements the processes of the foregoing method embodiments and can achieve the same technical effects; to avoid repetition, details are not repeated here. The computer-readable storage medium may be, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
It should be noted that, as used herein, the terms "include", "comprise", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that includes a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that includes that element.
From the description of the foregoing embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, or of course by hardware, although in many cases the former is the better implementation. Based on such an understanding, the technical solution of the present application, in essence or the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions that cause a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the specific implementations described above. The specific implementations described above are merely illustrative rather than restrictive. Inspired by the present application, those of ordinary skill in the art can devise many other forms without departing from the purpose of the present application and the scope protected by the claims, all of which fall within the protection of the present application.

Claims (14)

  1. An object distance measurement method, the method comprising:
    acquiring a first image captured by an optical lens of a terminal device at a first position;
    when the optical lens has moved from the first position to a second position, acquiring a second image captured by the optical lens at the second position;
    acquiring a first distance between the first position and the second position;
    acquiring a first difference; and
    determining a target object distance according to the first distance, a focal length of the optical lens, and the first difference;
    wherein the first difference is the difference between position coordinates of a first pixel in the first image and position coordinates of a second pixel in the second image when the first image and the second image lie in the same plane, and the second pixel is the pixel at the position corresponding to the first pixel.
  2. The method according to claim 1, wherein before the acquiring of the first image captured by the optical lens of the terminal device at the first position, the method further comprises:
    determining an initial position of the optical lens as the first position;
    and before the acquiring of the second image captured by the optical lens at the second position, the method further comprises:
    controlling the optical lens to move from the initial position to the second position along a first preset direction in the plane of the optical lens.
  3. The method according to claim 2, wherein the first preset direction comprises at least one of a first direction and a second direction, the first direction being the X-axis direction and the second direction being the Y-axis direction.
  4. The method according to claim 1, wherein before the acquiring of the first image captured by the optical lens of the terminal device at the first position, the method further comprises:
    controlling the optical lens to move from an initial position of the optical lens to the first position along a second preset direction in the plane of the optical lens;
    and before the acquiring of the second image captured by the optical lens at the second position, the method further comprises:
    controlling the optical lens to move from the first position to the second position along a third preset direction in the plane of the optical lens.
  5. The method according to claim 4, wherein the second preset direction and the third preset direction are opposite.
  6. The method according to claim 4 or 5, wherein the second preset direction comprises at least one of a first direction and a second direction, the first direction being the X-axis direction and the second direction being the Y-axis direction, and the third preset direction comprises at least one of the first direction and the second direction.
  7. A terminal device, comprising:
    an acquisition unit, configured to acquire a first image captured by an optical lens of the terminal device at a first position;
    the acquisition unit being further configured to acquire, when the optical lens has moved from the first position to a second position, a second image captured by the optical lens at the second position;
    the acquisition unit being further configured to acquire a first distance between the first position and the second position;
    the acquisition unit being further configured to acquire a first difference; and
    a determination unit, configured to determine a target object distance according to the first distance, a focal length of the optical lens, and the first difference;
    wherein the first difference is the difference between position coordinates of a first pixel in the first image and position coordinates of a second pixel in the second image when the first image and the second image lie in the same plane, and the second pixel is the pixel at the position corresponding to the first pixel.
  8. The terminal device according to claim 7, wherein the determination unit is further configured to determine an initial position of the optical lens as the first position before the acquisition unit acquires the first image captured by the optical lens of the terminal device at the first position;
    and the terminal device further comprises:
    a first control unit, configured to control the optical lens to move from the initial position to the second position along a first preset direction in the plane of the optical lens before the acquisition unit acquires the second image captured by the optical lens at the second position.
  9. The terminal device according to claim 8, wherein the first preset direction comprises at least one of a first direction and a second direction, the first direction being the X-axis direction and the second direction being the Y-axis direction.
  10. The terminal device according to claim 7, further comprising:
    a second control unit, configured to control the optical lens to move from an initial position of the optical lens to the first position along a second preset direction in the plane of the optical lens before the acquisition unit acquires the first image captured by the optical lens of the terminal device at the first position;
    the second control unit being further configured to control the optical lens to move from the first position to the second position along a third preset direction in the plane of the optical lens before the acquisition unit acquires the second image captured by the optical lens at the second position.
  11. The terminal device according to claim 10, wherein the second preset direction and the third preset direction are opposite.
  12. The terminal device according to claim 10 or 11, wherein the second preset direction comprises at least one of a first direction and a second direction, the first direction being the X-axis direction and the second direction being the Y-axis direction, and the third preset direction comprises at least one of the first direction and the second direction.
  13. A terminal device, comprising a processor, a memory, and a computer program stored in the memory and runnable on the processor, wherein the computer program, when executed by the processor, implements the steps of the object distance measurement method according to any one of claims 1 to 6.
  14. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of the object distance measurement method according to any one of claims 1 to 6.
PCT/CN2019/071635 2018-01-15 2019-01-14 Object distance measurement method and terminal device WO2019137535A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810036578.0A CN108317992A (en) 2018-01-15 2018-01-15 A kind of object distance measurement method and terminal device
CN201810036578.0 2018-01-15

Publications (1)

Publication Number Publication Date
WO2019137535A1 true WO2019137535A1 (en) 2019-07-18

Family

ID=62894241

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/071635 WO2019137535A1 (en) 2018-01-15 2019-01-14 Object distance measurement method and terminal device

Country Status (2)

Country Link
CN (1) CN108317992A (en)
WO (1) WO2019137535A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108317992A (en) * 2018-01-15 2018-07-24 维沃移动通信有限公司 A kind of object distance measurement method and terminal device
RU2697822C2 (en) * 2018-11-19 2019-08-21 Алексей Владимирович Зубарь Method of determining coordinates of objects based on their digital images
CN109859265B (en) * 2018-12-28 2024-04-19 维沃移动通信有限公司 Measurement method and mobile terminal
CN110136114B (en) * 2019-05-15 2021-03-02 厦门理工学院 Wave surface height measuring method, terminal equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103292779A (en) * 2012-02-28 2013-09-11 联想(北京)有限公司 Method for measuring distance and image acquisition equipment
CN103344213A (en) * 2013-06-28 2013-10-09 三星电子(中国)研发中心 Method and device for measuring distance of double-camera
CN106225764A (en) * 2016-07-01 2016-12-14 北京小米移动软件有限公司 Based on the distance-finding method of binocular camera in terminal and terminal
CN106355832A (en) * 2016-10-31 2017-01-25 江苏濠汉信息技术有限公司 Method for monitoring distance from dangerous object to power transmission and distribution line channel
CN108317992A (en) * 2018-01-15 2018-07-24 维沃移动通信有限公司 A kind of object distance measurement method and terminal device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102980556B (en) * 2012-11-29 2015-08-12 小米科技有限责任公司 A kind of distance-finding method and device

Also Published As

Publication number Publication date
CN108317992A (en) 2018-07-24

Legal Events

121 (EP): The EPO has been informed by WIPO that EP was designated in this application. Ref document number: 19738643; Country of ref document: EP; Kind code of ref document: A1.
NENP: Non-entry into the national phase. Ref country code: DE.
122 (EP): PCT application non-entry in European phase. Ref document number: 19738643; Country of ref document: EP; Kind code of ref document: A1.