WO2016183723A1 - A measurement method and terminal - Google Patents

A measurement method and terminal

Info

Publication number
WO2016183723A1
WO2016183723A1 · PCT/CN2015/079051 (CN2015079051W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
camera
tested
distance
total amount
Prior art date
Application number
PCT/CN2015/079051
Other languages
English (en)
French (fr)
Inventor
顾星刚
陈绍君
郑士胜
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to EP15892104.9A priority Critical patent/EP3264032B1/en
Priority to US15/561,287 priority patent/US10552971B2/en
Priority to JP2017557076A priority patent/JP6490242B2/ja
Priority to CN201580079722.3A priority patent/CN107532881B/zh
Priority to KR1020177030133A priority patent/KR101971815B1/ko
Priority to BR112017021042-8A priority patent/BR112017021042B1/pt
Priority to PCT/CN2015/079051 priority patent/WO2016183723A1/zh
Publication of WO2016183723A1 publication Critical patent/WO2016183723A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/593Depth or shape recovery from multiple images from stereo images
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/02Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01DMEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D7/00Indicating measured values
    • G01D7/12Audible indication of meter readings, e.g. for the blind
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/0007Image acquisition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20092Interactive image processing based on input by user
    • G06T2207/20101Interactive definition of point of interest, landmark or seed

Definitions

  • the present invention relates to the field of communications technologies, and in particular, to a method and terminal for measurement.
  • the technical problem to be solved by the embodiments of the present invention is to provide a method and a terminal for measurement, so as to solve the problem that a terminal cannot measure the size of an object to be measured.
  • an embodiment of the present invention provides a method for measuring, including:
  • after the comparing of the positions of the object to be measured in the first image and the second image to obtain the total position offset of the object, the method further includes:
  • the method further includes:
  • the imaging data is decoded, and the first image is displayed on a display screen of the terminal.
  • the method further includes:
  • the calculated distance result is notified to the user by voice.
  • the comparing of the positions of the object to be measured in the first image and the second image to obtain the total position offset of the object is performed according to the following formula:
  • D = X1 + X2, where D is the total position offset of the object to be measured, X1 is the position offset component of the object in the first image, and X2 is the position offset component of the object in the second image.
  • the receiving of a measurement point selection instruction input by the user based on the first image, and the calculating of the distance between the selected measurement points according to the total position offset, the distance between the centers of the first camera and the second camera, and the focal length of the first camera, are performed according to the following formula:
  • where L is the distance between the selected measurement points, (Xp, Yp) and Dp are the coordinates in the first image and the total position offset of measurement point p, (Xq, Yq) and Dq are the coordinates in the first image and the total position offset of measurement point q, H1 = h1/a, h1 is the focal length of the first camera, and a is the size constant of a single pixel.
  • the method further includes: calculating the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera according to the following formula:
  • Z = H1*D1/(X1+X2), where Z is the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera, H1 = h1/a, h1 is the focal length of the first camera, a is the size constant of a single pixel, D1 is the distance between the center of the first camera and the center of the second camera, X1 is the position offset component of the object in the first image, and X2 is the position offset component of the object in the second image.
  • in a sixth possible implementation of the first aspect, if the user selects two measurement points, the distance between the two measurement points is calculated;
  • if the user selects more than two measurement points, the distances between adjacent measurement points are calculated in turn according to the order of selection.
  • an embodiment of the present invention provides a terminal, including:
  • An acquiring unit configured to acquire, by the first camera, a first image that includes the object to be tested, and acquire, by the second camera, a second image that includes the object to be tested, where the first camera and the second camera are configured On the same plane;
  • a comparing unit configured to compare positions of the objects to be tested in the first image and the second image, to obtain a total amount of positional deviation of the object to be tested, wherein the total amount of the positional offset is used Characterizing an offset of a position of the object to be tested in the first image relative to a position of the object to be tested in the second image;
  • a calculating unit configured to receive a measurement point selection instruction input by the user based on the first image, according to the total amount of the positional offset, the distance between the center of the first camera and the center of the second camera, and the The focal length of the first camera calculates the distance between the selected measurement points.
  • the terminal further includes:
  • a compression unit configured to compress and store the imaging data of the first image and the total amount of position offsets as a photo file
  • a parsing unit configured to parse the photo file to obtain imaging data of the first image and the total amount of positional offset
  • a decoding unit configured to decode the imaging data, and display the first image on a display screen of the terminal.
  • the terminal further includes:
  • a notification unit configured to display the calculated distance result on a display screen of the terminal
  • the calculated distance result is notified to the user by voice.
  • the comparing unit is specifically configured to compare, by using the following formula, the positions of the object to be measured in the first image and the second image to obtain the total position offset of the object:
  • D = X1 + X2, where D is the total position offset of the object to be measured, X1 is the position offset component of the object in the first image, and X2 is the position offset component of the object in the second image.
  • the calculating unit is specifically configured to calculate the distance between the selected measurement points by using the following formula:
  • where L is the distance between the selected measurement points, (Xp, Yp) and Dp are the coordinates in the first image and the total position offset of measurement point p, (Xq, Yq) and Dq are the coordinates in the first image and the total position offset of measurement point q, H1 = h1/a, h1 is the focal length of the first camera, and a is the size constant of a single pixel.
  • the calculating unit is further configured to calculate the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera according to the following formula:
  • Z = H1*D1/(X1+X2), where Z is the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera, H1 = h1/a, h1 is the focal length of the first camera, a is the size constant of a single pixel, D1 is the distance between the center of the first camera and the center of the second camera, X1 is the position offset component of the object in the first image, and X2 is the position offset component of the object in the second image.
  • the calculating unit is specifically configured to: calculate the distance between the two measurement points if the user selects two measurement points; or
  • calculate, in turn according to the order of selection, the distances between adjacent measurement points if the user selects more than two measurement points.
  • an embodiment of the present invention provides a terminal, including: an input device, an output device, a memory, and a processor, where the input device includes a first camera and a second camera, and is configured to acquire an image of the object to be tested.
  • the input device, the output device, the memory, and the processor are connected to the bus, wherein the memory stores a set of program codes, and the processor is configured to call the program code stored in the memory to perform the following operations:
  • the processor is further configured to:
  • the imaging data is decoded, and the first image is displayed on a display screen of the terminal.
  • the output device is configured to, after the distance between the selected measurement points is calculated, display the calculated distance result on the display screen of the terminal; or
  • the calculated distance result is notified to the user by voice.
  • the processor is specifically configured to compare, by using the following formula, the positions of the object to be measured in the first image and the second image to obtain the total position offset of the object:
  • D = X1 + X2, where D is the total position offset of the object to be measured, X1 is the position offset component of the object in the first image, and X2 is the position offset component of the object in the second image.
  • the processor is specifically configured to calculate the distance between the selected measurement points by using the following formula:
  • where L is the distance between the selected measurement points, (Xp, Yp) and Dp are the coordinates in the first image and the total position offset of measurement point p, (Xq, Yq) and Dq are the coordinates in the first image and the total position offset of measurement point q, H1 = h1/a, h1 is the focal length of the first camera, and a is the size constant of a single pixel.
  • the processor is further configured to calculate the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera according to the following formula:
  • Z = H1*D1/(X1+X2), where Z is the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera, H1 = h1/a, h1 is the focal length of the first camera, a is the size constant of a single pixel, D1 is the distance between the center of the first camera and the center of the second camera, X1 is the position offset component of the object in the first image, and X2 is the position offset component of the object in the second image.
  • when calculating the distance between the selected measurement points, the processor is specifically configured to: calculate the distance between the two measurement points if the user selects two measurement points; or
  • calculate, in turn according to the order of selection, the distances between adjacent measurement points if the user selects more than two measurement points.
  • an embodiment of the present invention provides a computer storage medium, where the computer storage medium stores a program, and the program, when executed, includes the steps according to the first aspect of the embodiments of the present invention or any implementation of the first aspect.
  • when the user selects the measurement points of the distance to be measured, the distance between the selected measurement points can be calculated from the total position offset, the distance between the centers of the first camera and the second camera, and the focal length of the first camera;
  • the user does not need to carry any additional measurement tool and only needs to carry the terminal to take a photograph in order to measure the size of the object to be measured, which enriches the functions of the terminal and extends its practicality and convenience.
  • FIG. 1 is a schematic flow chart of a first embodiment of a method for measuring according to the present invention
  • FIG. 2 is a schematic flow chart of a second embodiment of a method for measuring according to the present invention.
  • FIG. 3 is a schematic diagram showing the composition of a first embodiment of a terminal according to the present invention.
  • FIG. 4 is a schematic diagram showing the composition of a second embodiment of a terminal according to the present invention.
  • Figure 5 is a schematic diagram showing the composition of a third embodiment of a terminal of the present invention.
  • Figure 6 is a schematic diagram showing the principle of calculating the distance of the present invention.
  • the terminal in the embodiments of the present invention may include a smartphone with a dual-camera photographing function (such as an Android phone, an iOS phone, or a Windows Phone), a tablet computer, a digital camera, a palmtop computer, a notebook computer, a mobile internet device (MID), a wearable device, and the like;
  • the above terminals are merely examples rather than an exhaustive list, and the embodiments include but are not limited to the above terminals.
  • FIG. 1 is a schematic flowchart of a first embodiment of a method for measuring according to the present invention.
  • the method includes the following steps:
  • the terminal acquires a first image that includes the object to be tested through the first camera, and acquires a second image that includes the object to be tested by using the second camera.
  • the first camera and the second camera are disposed on the same plane.
  • when the terminal acquires images by using two cameras, photographing may be triggered by an operation of the terminal user, for example by tapping the shutter button of the camera application, pressing a physical button or a combination of physical buttons, inputting a voice command to the terminal (such as "photograph" or "measure"), or making a preset gesture to the terminal; this is not limited in the embodiments of the present invention.
  • the terminal may have two, three, or more cameras; it is only necessary to ensure that the images acquired by at least two cameras contain the object to be measured.
  • one of the two cameras may be set as the main camera and the other as the sub camera, and the main/sub relationship can be switched at will; when images are subsequently saved, the image taken by the main camera is saved, and when the various quantities are calculated, the calculation is based on the parameters of the main camera that provides the image.
  • when the first image and the second image are acquired, they may be acquired simultaneously or successively within a short time interval, and the position of the terminal should be kept as unchanged as possible during shooting so that the first image and the second image have the greatest correlation; if the position of the terminal can be kept unchanged between the first shot and the second shot, the two shots may be separated by any time interval.
  • the total amount of positional offset is used to characterize the offset of the position of the object to be tested in the first image relative to the position of the object to be tested in the second image. Since the two reference positions are distributed in two images, an intermediate reference position such as the center line of the image can be introduced for comparison in a specific calculation.
  • the total amount of positional offset may be calculated according to a positional offset component of the object to be tested in the first image and a positional offset component of the object to be tested in the second image.
  • the position offset component is the distance from the position of the object to be measured in an image to the center line of that image, and the image center line is perpendicular to the line connecting the center of the first camera and the center of the second camera.
  • when the photograph is taken, the left camera acquires the first image, and the object to be measured generally has a leftward offset in the first image;
  • taking the vertical center line of the first image as a reference, that is, the center line perpendicular to the line connecting the centers of the two cameras, the position offset component of the object to be measured in the first image can be obtained;
  • similarly, the position offset component of the object in the second image can be obtained; because the two cameras are arranged horizontally, the ordinate of a given pixel is the same in both coordinate systems, so there is no offset in the vertical direction;
  • the total position offset is therefore the sum of the two position offset components in the horizontal direction; likewise, when the two cameras are arranged vertically, only the position offset components and the total position offset in the vertical direction need to be considered.
  • optionally, the position offset component of the object to be measured in the first image may be obtained first, then the position offset component of the object in the second image, and the total position offset of the object is then calculated according to the following formula:
  • D = X1 + X2, where D is the total position offset of the object to be measured, X1 is the position offset component of the object in the first image, and X2 is the position offset component of the object in the second image.
  • S103: Receive a measurement point selection instruction input by the user based on the first image, and calculate the distance between the selected measurement points according to the total position offset, the distance between the center of the first camera and the center of the second camera, and the focal length of the first camera.
  • after the terminal obtains the total position offset, the user may be prompted to select measurement points based on the first image, where the first image is the image captured by the main camera, that is, the first camera, and saved by the terminal.
  • based on the first image, the user may select an edge point of the object to be measured as the first measurement point and then select a second position on the object as the second measurement point; for example, to measure the size of a wardrobe in the image, the upper edge of the wardrobe may be selected as the first measurement point and the lower edge directly below it as the second measurement point, and the terminal can then calculate the distance between the two measurement points according to the total position offset, the distance between the centers of the two cameras, and the focal length of the first camera.
  • in the embodiments of the present invention, the distance between the center of the first camera and the center of the second camera can be understood as including the distance between the first camera and the second camera, the distance between the lens of the first camera and the lens of the second camera, the distance between the image sensor of the first camera and the image sensor of the second camera, and the distance between the lens of the first camera and the image sensor of the second camera;
  • optionally, it may also include the distance between the photosensitive device of the first camera and any component up to and including the edge of the second camera; this is not limited here.
  • the number of measurement points may be two or more than two. If the measurement points selected by the user are two, the distance between the two measurement points is calculated; If the user selects more than two measurement points, the distance between two adjacent measurement points is sequentially calculated according to the selected order. For example, when it is necessary to measure the length and width of the wardrobe, the upper left edge of the wardrobe can be selected as the first measurement point, then the lower left edge of the wardrobe is selected as the second measurement point, and the lower right edge of the wardrobe is selected as the third measurement point. The terminal will calculate the length of the wardrobe according to the first measurement point and the second measurement point in turn, and then calculate the width of the wardrobe according to the second measurement point and the third measurement point.
  • a reset button or a re-measured button can also be set for the user to select. When the user selects reset or re-measures, the previously selected measurement points will be invalid.
  • where L is the distance between the selected measurement points, (Xp, Yp) and Dp are the coordinates in the first image and the total position offset of measurement point p, (Xq, Yq) and Dq are the coordinates in the first image and the total position offset of measurement point q, H1 = h1/a, h1 is the focal length of the first camera, and a is the size constant of a single pixel.
  • D1, a, and h1 are all known quantities; once the measurement points are selected, the values of (Xp, Yp) and (Xq, Yq) are obtained, and the values of Dp and Dq can be obtained through step S102, so that the distance between point p and point q can be calculated by the above formula.
  • FIG. 6 is a schematic diagram of the principle of calculating the distance according to the present invention.
  • as shown in FIG. 6, the position offset component of the object to be measured imaged through the first camera is X1, and the position offset component of the object imaged through the second camera is X2;
  • the distance between the centers of the first camera and the second camera is D1, the distance from the center of the object to be measured to the line connecting the centers of the two cameras is Z, and the distance from the center of the object to the extension line through the center of the first camera is XW.
  • for a pixel on the first image, such as the edge point X1 with coordinates (x1, y1), its absolute length in the x-axis direction is x1; by the principle of similar triangles, x1/Xw = h1/Z, and similarly in the y-axis direction y1/Yw = h1/Z, from which Xw = x1*D1/D and Yw = y1*D1/D are obtained;
  • therefore, for two measurement points p and q on the object to be measured, their distance equals the Euclidean length between the spatial coordinates of the two measurement points, obtained according to the Euclidean distance formula;
  • here H1 = h1/a, h1 is the focal length of the first camera, and a is the size constant of a single pixel;
  • since the distance D1 between the centers of the first camera and the second camera is expressed in a common length measurement unit, while the lengths Dp and Dq are calculated from numbers of pixels in the image, introducing the constant a makes the units on both sides of the formula consistent, and the unit of the resulting L is the length measurement unit commonly used by the user.
  • when the user selects the measurement points of the distance to be measured, the distance between the selected measurement points can be calculated from the total position offset, the distance between the center of the first camera and the center of the second camera, and the focal length of the first camera;
  • the user does not need to carry any additional measurement tool and only needs to carry the terminal to take a photograph in order to
  • measure the size of the object to be measured, which enriches the functions of the terminal and extends its practicality and convenience.
  • FIG. 2 is a schematic flowchart of a second embodiment of a method for measuring according to the present invention.
  • the method includes the following steps:
  • the terminal acquires a first image that includes the object to be tested through the first camera, and acquires a second image that includes the object to be tested by using the second camera.
  • the first camera and the second camera are disposed on the same plane.
  • the total amount of positional offset is used to characterize the offset of the position of the object to be tested in the first image relative to the position of the object to be tested in the second image.
  • S203 Compress and store the imaging data of the first image and the total position offset as a photo file.
  • the imaging data of the first image and the total amount of the positional offset may be compressed and stored as a photo file, which is convenient for the user to perform measurement at any time.
  • alternatively, the calculated distance result may be notified to the user by voice; there may be multiple notification manners, which the user may select or combine according to his or her own needs, for example text display and voice broadcast performed simultaneously, and this is not limited in the embodiments of the present invention.
  • the calculation method in this embodiment can be performed by referring to the calculation mode of the first embodiment shown in FIG. 1 , and details are not described herein again.
  • the embodiment of the present invention further provides a computer storage medium, wherein the computer storage medium stores a program, and the program includes some or all of the steps of the method of measurement described in the foregoing method embodiments.
  • the terminal includes:
  • the acquiring unit 100 is configured to acquire, by using the first camera, a first image that includes the object to be tested, and acquire, by the second camera, a second image that includes the object to be tested, where the first camera and the second camera Set on the same plane;
  • a comparing unit 200 configured to compare the positions of the object to be measured in the first image and the second image to obtain the total position offset of the object, where the total position offset is used to characterize the offset of the position of the object in the first image relative to its position in the second image.
  • the total position offset may be calculated from the position offset component of the object in the first image and the position offset component of the object in the second image, where a position offset component is the distance from the position of the object in an image to the center line of that image, and the image center line is perpendicular to the line connecting the center of the first camera and the center of the second camera;
  • the calculating unit 300 is configured to receive a measurement point selection instruction input by the user based on the first image, according to the total amount of the positional offset, the distance between the center of the first camera and the center of the second camera, and The focal length of the first camera is calculated, and the distance between the selected measurement points is calculated.
  • the comparing unit 200 is specifically configured to compare, by using the following formula, the positions of the object to be measured in the first image and the second image to obtain the total position offset of the object:
  • D = X1 + X2, where D is the total position offset of the object to be measured, X1 is the position offset component of the object in the first image, and X2 is the position offset component of the object in the second image.
  • the calculating unit 300 is specifically configured to calculate the distance between the selected measurement points by using the following formula:
  • where L is the distance between the selected measurement points, (Xp, Yp) and Dp are the coordinates in the first image and the total position offset of measurement point p, (Xq, Yq) and Dq are the coordinates in the first image and the total position offset of measurement point q, H1 = h1/a, h1 is the focal length of the first camera, and a is the size constant of a single pixel.
  • the calculating unit 300 is further configured to calculate, by using the following formula, the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera:
  • Z = H1*D1/(X1+X2), where Z is the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera, H1 = h1/a, h1 is the focal length of the first camera, a is the size constant of a single pixel, D1 is the distance between the center of the first camera and the center of the second camera, X1 is the position offset component of the object in the first image, and X2 is the position offset component of the object in the second image.
  • the terminal includes:
  • the acquiring unit 100 is configured to acquire, by using the first camera, a first image that includes the object to be tested, and acquire, by the second camera, a second image that includes the object to be tested, where the first camera and the second camera Set on the same plane;
  • a comparing unit 200 configured to compare the positions of the object to be measured in the first image and the second image to obtain the total position offset of the object, where the total position offset is used to characterize the offset of the position of the object in the first image relative to its position in the second image;
  • the total position offset may be calculated from the position offset component of the object in the first image and the position offset component of the object in the second image, where a position offset component is the distance from the position of the object in an image to the center line of that image, and the image center line is perpendicular to the line connecting the center of the first camera and the center of the second camera;
  • a calculating unit 300 configured to receive a measurement point selection instruction input by the user based on the first image, and to calculate the distance between the selected measurement points according to the total position offset, the distance between the center of the first camera and the center of the second camera, and the focal length of the first camera.
  • the comparing unit 200 is specifically configured to compare, by using the following formula, the positions of the object to be measured in the first image and the second image to obtain the total position offset of the object:
  • D = X1 + X2, where D is the total position offset of the object to be measured, X1 is the position offset component of the object in the first image, and X2 is the position offset component of the object in the second image.
  • the calculating unit 300 is specifically configured to calculate the distance between the selected measurement points by using the following formula:
  • where L is the distance between the selected measurement points, (Xp, Yp) and Dp are the coordinates in the first image and the total position offset of measurement point p, (Xq, Yq) and Dq are the coordinates in the first image and the total position offset of measurement point q, H1 = h1/a, h1 is the focal length of the first camera, and a is the size constant of a single pixel.
  • the calculating unit 300 is further configured to calculate, by using the following formula, the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera:
  • Z = H1*D1/(X1+X2), where Z is the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera, H1 = h1/a, h1 is the focal length of the first camera, a is the size constant of a single pixel, D1 is the distance between the center of the first camera and the center of the second camera, X1 is the position offset component of the object in the first image, and X2 is the position offset component of the object in the second image.
  • the terminal in the embodiment of the present invention further includes:
  • a compression unit 400 configured to compress and store the imaging data of the first image and the total amount of position offsets as a photo file
  • the parsing unit 500 is configured to parse the photo file to obtain imaging data of the first image and the total amount of positional offset;
  • the decoding unit 600 is configured to decode the imaging data, and display the first image on a display screen of the terminal.
  • a notification unit 700 configured to display the calculated distance result on a display screen of the terminal;
  • the calculated distance result is notified to the user by voice.
  • the foregoing acquiring unit 100, comparing unit 200, calculating unit 300, compression unit 400, parsing unit 500, decoding unit 600, and notification unit 700 may exist independently or may be integrated.
  • the acquiring unit 100, comparing unit 200, calculating unit 300, compression unit 400, parsing unit 500, decoding unit 600, or notification unit 700 may be provided separately, in hardware form, independently of the processor of the terminal, for example in the form of a microprocessor; it may also be embedded in the processor of the terminal in hardware form, or stored in the memory of the terminal in software form, so that the processor of the terminal can invoke and execute the operations corresponding to the above units.
  • for example, in the first terminal embodiment of the present invention (the embodiment shown in FIG. 3), the calculating unit 300 may be the processor of the terminal, and the functions of the acquiring unit 100 and the comparing unit 200 may be embedded in the processor, set separately and independently of the processor, or stored in the memory in software form and invoked by the processor to implement their functions.
  • the embodiment of the invention does not impose any limitation.
  • the above processor may be a central processing unit (CPU), a microprocessor, a single chip microcomputer, or the like.
  • the terminal includes:
  • an input device 10, an output device 20, a memory 30, and a processor 40, where the input device 10 includes a first camera and a second camera configured to acquire images of the object to be measured;
  • the input device 10, the output device 20, the memory 30, and the processor 40 are connected to a bus, the memory 30 stores a set of program code, and the processor 40 is configured to call the program code stored in the memory 30 to perform the following operations:
  • the total amount of positional offset may be calculated according to a positional offset component of the object to be tested in the first image and a positional offset component of the object to be tested in the second image, the position The offset component is a distance from a position of the object to be tested in an image to a center line of the image, and the image center line is perpendicular to a line connecting a center of the first camera and a center of the second camera;
  • the processor 40 is further configured to:
  • the imaging data is decoded, and the first image is displayed on a display screen of the terminal.
  • the output device 20 is configured to display the calculated distance result on the display screen of the terminal after the calculating the distance between the selected measurement points;
  • the calculated distance result is notified to the user by voice.
  • the processor 40 is specifically configured to compare, by using the following formula, the positions of the object to be measured in the first image and the second image to obtain the total position offset of the object:
  • D = X1 + X2, where D is the total position offset of the object to be measured, X1 is the position offset component of the object in the first image, and X2 is the position offset component of the object in the second image.
  • the processor 40 is further specifically configured to calculate the distance between the selected measurement points by using the following formula:
  • where L is the distance between the selected measurement points, (Xp, Yp) and Dp are the coordinates in the first image and the total position offset of measurement point p, (Xq, Yq) and Dq are the coordinates in the first image and the total position offset of measurement point q, H1 = h1/a, h1 is the focal length of the first camera, and a is the size constant of a single pixel.
  • the processor 40 is further configured to calculate, by using the following formula, the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera:
  • Z = H1*D1/(X1+X2), where Z is the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera, H1 = h1/a, h1 is the focal length of the first camera, a is the size constant of a single pixel, D1 is the distance between the center of the first camera and the center of the second camera, X1 is the position offset component of the object in the first image, and X2 is the position offset component of the object in the second image.
  • the embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for identical or similar parts the embodiments may refer to one another.
  • since the apparatus embodiments are basically similar to the method embodiments, their description is relatively brief, and reference may be made to the relevant description of the method embodiments.
  • through the description of the above embodiments, the present invention has the following advantages:
  • when the user selects the measurement points of the distance to be measured, the distance between the selected measurement points can be calculated from the total position offset, the distance between the center of the first camera and the center of the second camera, and the focal length of the first camera;
  • the user does not need to carry any additional measurement tool and only needs to carry the terminal to take a photograph in order to measure the size of the object to be measured, which enriches the functions of the terminal and extends its practicality and convenience.
  • a person of ordinary skill in the art may understand that all or part of the steps of the above method embodiments may be implemented by a program instructing relevant hardware; the foregoing program may be stored in a computer-readable storage medium, and when the program is executed, it performs the steps of the above method embodiments;
  • the foregoing storage medium includes any medium that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)

Abstract

Embodiments of the present invention disclose a measurement method, including: a terminal acquires, through a first camera, a first image containing an object to be measured, and acquires, through a second camera, a second image containing the object to be measured, where the first camera and the second camera are arranged on the same plane; the positions of the object to be measured in the first image and the second image are compared to obtain a total position offset of the object to be measured; a measurement point selection instruction input by a user based on the first image is received, and the distance between the selected measurement points is calculated according to the total position offset, the distance between the center of the first camera and the center of the second camera, and the focal length of the first camera. Embodiments of the present invention further disclose a terminal. With the present invention, the size of an object to be measured can be measured simply and conveniently by using a terminal equipped with dual cameras.

Description

A measurement method and terminal
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a measurement method and a terminal.
Background
In daily life, users often need to measure distances and lengths. For example, when purchasing furniture, a user needs to measure both the size of the space at home and the length, width, and height of the furniture in order to confirm whether the furniture fits the available space. To meet these measurement needs, users often have to buy a length-measuring tool such as a tape measure and carry it with them, which is inconvenient; once the tool is forgotten, no measurement can be made, which causes considerable trouble. Moreover, because such tools are limited in length, when a long distance beyond their range needs to be measured, multiple measurements have to be made, which is inconvenient and introduces large errors.
With the rapid development and popularization of smart terminals such as smartphones, tablets, and smart watches, most users own a terminal with a camera, and some of these terminals are already equipped with dual cameras. In the prior art, the distance from an object to be measured to the terminal can be measured based on dual cameras, but the size of the object itself cannot be measured.
Summary
The technical problem to be solved by the embodiments of the present invention is to provide a measurement method and a terminal, so as to solve the problem that a terminal cannot measure the size of an object to be measured.
According to a first aspect, an embodiment of the present invention provides a measurement method, including:
acquiring, by a terminal through a first camera, a first image containing an object to be measured, and acquiring, through a second camera, a second image containing the object to be measured, where the first camera and the second camera are arranged on the same plane;
comparing the positions of the object to be measured in the first image and the second image to obtain a total position offset of the object to be measured, where the total position offset is used to characterize the offset of the position of the object to be measured in the first image relative to its position in the second image; and
receiving a measurement point selection instruction input by a user based on the first image, and calculating the distance between the selected measurement points according to the total position offset, the distance between the center of the first camera and the center of the second camera, and the focal length of the first camera.
With reference to the first aspect, in a first possible implementation of the first aspect, after the comparing of the positions of the object to be measured in the first image and the second image to obtain the total position offset of the object to be measured, the method further includes:
compressing the imaging data of the first image and the total position offset and storing them as a photo file;
and before the receiving of the measurement point selection instruction input by the user based on the first image, the method further includes:
parsing the photo file to obtain the imaging data of the first image and the total position offset; and
decoding the imaging data and displaying the first image on a display screen of the terminal.
With reference to the first aspect or the first possible implementation of the first aspect, in a second possible implementation of the first aspect, after the calculating of the distance between the selected measurement points, the method further includes:
displaying the calculated distance result on the display screen of the terminal; or
notifying the user of the calculated distance result by voice.
With reference to the first aspect or either of the first and second possible implementations of the first aspect, in a third possible implementation of the first aspect, the comparing of the positions of the object to be measured in the first image and the second image to obtain the total position offset of the object to be measured is calculated according to the following formula:
D = X1 + X2
where D is the total position offset of the object to be measured, X1 is the position offset component of the object to be measured in the first image, and X2 is the position offset component of the object to be measured in the second image.
With reference to the first aspect or any one of the first to third possible implementations of the first aspect, in a fourth possible implementation of the first aspect, the receiving of the measurement point selection instruction input by the user based on the first image and the calculating of the distance between the selected measurement points according to the total position offset, the distance between the centers of the first camera and the second camera, and the focal length of the first camera are performed according to the following formula:
Figure PCTCN2015079051-appb-000001
where L is the distance between the selected measurement points, (Xp, Yp) are the coordinates of measurement point p in the first image, Dp is the total position offset of measurement point p, (Xq, Yq) are the coordinates of measurement point q in the first image, Dq is the total position offset of measurement point q, H1 = h1/a, h1 is the focal length of the first camera, and a is the size constant of a single pixel.
With reference to the first aspect or any one of the first to fourth possible implementations of the first aspect, in a fifth possible implementation of the first aspect, the method further includes:
calculating the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera according to the following formula:
Z = H1*D1/(X1+X2)
where Z is the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera, H1 = h1/a, h1 is the focal length of the first camera, a is the size constant of a single pixel, D1 is the distance between the center of the first camera and the center of the second camera, X1 is the position offset component of the object to be measured in the first image, and X2 is the position offset component of the object to be measured in the second image.
With reference to the first aspect or any one of the first to fifth possible implementations of the first aspect, in a sixth possible implementation of the first aspect, if the user selects two measurement points, the distance between the two measurement points is calculated;
if the user selects more than two measurement points, the distances between adjacent measurement points are calculated in turn according to the order of selection.
According to a second aspect, an embodiment of the present invention provides a terminal, including:
an acquiring unit configured to acquire, through a first camera, a first image containing an object to be measured, and to acquire, through a second camera, a second image containing the object to be measured, where the first camera and the second camera are arranged on the same plane;
a comparing unit configured to compare the positions of the object to be measured in the first image and the second image to obtain a total position offset of the object to be measured, where the total position offset is used to characterize the offset of the position of the object to be measured in the first image relative to its position in the second image; and
a calculating unit configured to receive a measurement point selection instruction input by a user based on the first image, and to calculate the distance between the selected measurement points according to the total position offset, the distance between the center of the first camera and the center of the second camera, and the focal length of the first camera.
With reference to the second aspect, in a first possible implementation of the second aspect, the terminal further includes:
a compression unit configured to compress the imaging data of the first image and the total position offset and store them as a photo file;
a parsing unit configured to parse the photo file to obtain the imaging data of the first image and the total position offset; and
a decoding unit configured to decode the imaging data and display the first image on a display screen of the terminal.
With reference to the second aspect or the first possible implementation of the second aspect, in a second possible implementation of the second aspect, the terminal further includes:
a notification unit configured to display the calculated distance result on the display screen of the terminal; or
to notify the user of the calculated distance result by voice.
With reference to the second aspect or either of the first and second possible implementations of the second aspect, in a third possible implementation of the second aspect, the comparing unit is specifically configured to compare, by using the following formula, the positions of the object to be measured in the first image and the second image to obtain the total position offset of the object to be measured:
D = X1 + X2
where D is the total position offset of the object to be measured, X1 is the position offset component of the object to be measured in the first image, and X2 is the position offset component of the object to be measured in the second image.
With reference to the second aspect or any one of the first to third possible implementations of the second aspect, in a fourth possible implementation of the second aspect, the calculating unit is specifically configured to calculate the distance between the selected measurement points by using the following formula:
Figure PCTCN2015079051-appb-000002
where L is the distance between the selected measurement points, (Xp, Yp) are the coordinates of measurement point p in the first image, Dp is the total position offset of measurement point p, (Xq, Yq) are the coordinates of measurement point q in the first image, Dq is the total position offset of measurement point q, H1 = h1/a, h1 is the focal length of the first camera, and a is the size constant of a single pixel.
With reference to the second aspect or any one of the first to fourth possible implementations of the second aspect, in a fifth possible implementation of the second aspect, the calculating unit is further configured to calculate, by using the following formula, the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera:
Z = H1*D1/(X1+X2)
where Z is the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera, H1 = h1/a, h1 is the focal length of the first camera, a is the size constant of a single pixel, D1 is the distance between the center of the first camera and the center of the second camera, X1 is the position offset component of the object to be measured in the first image, and X2 is the position offset component of the object to be measured in the second image.
With reference to the second aspect or any one of the first to fifth possible implementations of the second aspect, in a sixth possible implementation of the second aspect, the calculating unit is specifically configured to:
calculate the distance between the two measurement points if the user selects two measurement points; or
calculate, in turn according to the order of selection, the distances between adjacent measurement points if the user selects more than two measurement points.
According to a third aspect, an embodiment of the present invention provides a terminal, including an input device, an output device, a memory, and a processor, where the input device includes a first camera and a second camera configured to acquire images of an object to be measured; the input device, the output device, the memory, and the processor are connected to a bus; the memory stores a set of program code; and the processor is configured to call the program code stored in the memory to perform the following operations:
acquiring, through the first camera, a first image containing the object to be measured, and acquiring, through the second camera, a second image containing the object to be measured, where the first camera and the second camera are arranged on the same plane;
comparing the positions of the object to be measured in the first image and the second image to obtain a total position offset of the object to be measured, where the total position offset is used to characterize the offset of the position of the object to be measured in the first image relative to its position in the second image; and
receiving a measurement point selection instruction input by a user based on the first image, and calculating the distance between the selected measurement points according to the total position offset, the distance between the center of the first camera and the center of the second camera, and the focal length of the first camera.
With reference to the third aspect, in a first possible implementation of the third aspect, the processor is further configured to:
after the total position offset of the object to be measured is obtained, compress the imaging data of the first image and the total position offset and store them as a photo file;
before the measurement point selection instruction input by the user based on the first image is received, parse the photo file to obtain the imaging data of the first image and the total position offset; and
decode the imaging data and display the first image on a display screen of the terminal.
With reference to the third aspect or the first possible implementation of the third aspect, in a second possible implementation of the third aspect, the output device is configured to, after the distance between the selected measurement points is calculated, display the calculated distance result on the display screen of the terminal; or
to notify the user of the calculated distance result by voice.
With reference to the third aspect or either of the first and second possible implementations of the third aspect, in a third possible implementation of the third aspect, the processor is specifically configured to compare, by using the following formula, the positions of the object to be measured in the first image and the second image to obtain the total position offset of the object to be measured:
D = X1 + X2
where D is the total position offset of the object to be measured, X1 is the position offset component of the object to be measured in the first image, and X2 is the position offset component of the object to be measured in the second image.
With reference to the third aspect or any one of the first to third possible implementations of the third aspect, in a fourth possible implementation of the third aspect, the processor is specifically configured to calculate the distance between the selected measurement points by using the following formula:
Figure PCTCN2015079051-appb-000003
where L is the distance between the selected measurement points, (Xp, Yp) are the coordinates of measurement point p in the first image, Dp is the total position offset of measurement point p, (Xq, Yq) are the coordinates of measurement point q in the first image, Dq is the total position offset of measurement point q, H1 = h1/a, h1 is the focal length of the first camera, and a is the size constant of a single pixel.
With reference to the third aspect or any one of the first to fourth possible implementations of the third aspect, in a fifth possible implementation of the third aspect, the processor is further configured to calculate, by using the following formula, the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera:
Z = H1*D1/(X1+X2)
where Z is the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera, H1 = h1/a, h1 is the focal length of the first camera, a is the size constant of a single pixel, D1 is the distance between the center of the first camera and the center of the second camera, X1 is the position offset component of the object to be measured in the first image, and X2 is the position offset component of the object to be measured in the second image.
With reference to the third aspect or any one of the first to fifth possible implementations of the third aspect, in a sixth possible implementation of the third aspect, when calculating the distance between the selected measurement points, the processor is specifically configured to:
calculate the distance between the two measurement points if the user selects two measurement points; or
calculate, in turn according to the order of selection, the distances between adjacent measurement points if the user selects more than two measurement points.
According to a fourth aspect, an embodiment of the present invention provides a computer storage medium, where the computer storage medium stores a program, and the program, when executed, includes the steps according to the first aspect of the embodiments of the present invention or any implementation of the first aspect.
Implementing the embodiments of the present invention has the following beneficial effects:
The first image and the second image of the object to be measured are acquired through two cameras, and the total position offset of the object is obtained by comparison and calculation. When the user selects the measurement points of the distance to be measured, the distance between the selected measurement points can be calculated from the total position offset, the distance between the centers of the first camera and the second camera, and the focal length of the first camera. The user does not need to carry any additional measurement tool; the size of the object to be measured can be obtained simply by taking a photograph with the terminal, which enriches the functions of the terminal and extends its practicality and convenience.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Apparently, the accompanying drawings in the following description show merely some embodiments of the present invention, and a person of ordinary skill in the art may still derive other drawings from these drawings without creative effort.
FIG. 1 is a schematic flowchart of a first embodiment of a measurement method according to the present invention;
FIG. 2 is a schematic flowchart of a second embodiment of a measurement method according to the present invention;
FIG. 3 is a schematic composition diagram of a first embodiment of a terminal according to the present invention;
FIG. 4 is a schematic composition diagram of a second embodiment of a terminal according to the present invention;
FIG. 5 is a schematic composition diagram of a third embodiment of a terminal according to the present invention;
FIG. 6 is a schematic diagram of the principle of calculating a distance according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings in the embodiments of the present invention. Apparently, the described embodiments are merely some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
The terminal in the embodiments of the present invention may include a smartphone with a dual-camera photographing function (such as an Android phone, an iOS phone, or a Windows Phone), a tablet computer, a digital camera, a palmtop computer, a notebook computer, a mobile internet device (MID), a wearable device, or the like. The above terminals are merely examples rather than an exhaustive list, and the embodiments include but are not limited to the above terminals.
Referring to FIG. 1, which is a schematic flowchart of a first embodiment of a measurement method according to the present invention, in this embodiment the method includes the following steps:
S101: A terminal acquires, through a first camera, a first image containing an object to be measured, and acquires, through a second camera, a second image containing the object to be measured.
The first camera and the second camera are arranged on the same plane.
Optionally, in some feasible implementations, when the terminal acquires images through the two cameras, photographing may be triggered by an operation of the terminal user, for example by tapping the shutter button of the camera application, pressing a physical button or a combination of physical buttons, inputting a voice command to the terminal (such as "photograph" or "measure"), or making a preset gesture to the terminal. This is not limited in the embodiments of the present invention.
The terminal may have two, three, or more cameras; it is only necessary to ensure that the images acquired by at least two cameras contain the object to be measured. One of the two cameras may be set as the main camera and the other as the sub camera, and the main/sub relationship may be switched at will. When images are subsequently saved, the image taken by the main camera is saved, and when the various quantities are calculated, the calculation is based on the parameters of the main camera that provides the image.
The first image and the second image may be acquired simultaneously or successively within a short time interval, and the position of the terminal should be kept as unchanged as possible during shooting so that the first image and the second image have the greatest correlation. If the position of the terminal can be kept unchanged between the first shot and the second shot, the two shots may be separated by any time interval.
S102: Compare the positions of the object to be measured in the first image and the second image to obtain a total position offset of the object to be measured.
The total position offset is used to characterize the offset of the position of the object to be measured in the first image relative to its position in the second image. Since the two reference positions are distributed over two images, an intermediate reference position, such as the center line of each image, may be introduced for comparison in the specific calculation. Optionally, the total position offset may be calculated from the position offset component of the object to be measured in the first image and the position offset component of the object to be measured in the second image, where a position offset component is the distance from the position of the object in an image to the center line of that image, and the image center line is perpendicular to the line connecting the center of the first camera and the center of the second camera.
For example, referring to FIG. 6, if the terminal has a left camera and a right camera arranged horizontally, then when the photograph is taken the left camera acquires the first image, and the object to be measured generally has a leftward offset in the first image. Taking the vertical center line of the first image as a reference, that is, the center line perpendicular to the line connecting the centers of the two cameras, the position offset component of the object in the first image can be obtained; similarly, the position offset component of the object in the second image can be obtained. Because the two cameras are arranged horizontally, the ordinate of a given pixel is the same in both coordinate systems, so there is no offset in the vertical direction, and the total position offset is the sum of the two position offset components in the horizontal direction. Likewise, when the two cameras are arranged vertically, only the position offset components and the total position offset in the vertical direction need to be considered.
Optionally, the position offset component of the object to be measured in the first image may be obtained first, then the position offset component of the object in the second image, and the total position offset of the object is then calculated according to the following formula:
D = X1 + X2
where D is the total position offset of the object to be measured, X1 is the position offset component of the object to be measured in the first image, and X2 is the position offset component of the object to be measured in the second image.
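As an illustration of the offset components just defined, the short sketch below computes X1, X2, and D for horizontally arranged cameras. It assumes the pixel column at which the object appears in each image has already been found (the patent does not prescribe how the object is located in each image, for example by feature matching); the column values and image width are made-up numbers.

```python
def offset_component(object_column_px: float, image_width_px: int) -> float:
    """Position offset component: horizontal distance, in pixels, from the object's
    position in an image to that image's vertical centre line."""
    return abs(object_column_px - image_width_px / 2.0)

def total_position_offset(col_in_first_px: float, col_in_second_px: float,
                          image_width_px: int) -> float:
    """Total position offset D = X1 + X2 for horizontally arranged cameras."""
    x1 = offset_component(col_in_first_px, image_width_px)   # component in the first image
    x2 = offset_component(col_in_second_px, image_width_px)  # component in the second image
    return x1 + x2

# Example: the object appears at column 2260 in the first image and at column 1640
# in the second image of a 3840-pixel-wide frame (illustrative values only).
print(total_position_offset(2260, 1640, 3840))   # 340 + 280 = 620 pixels
```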
S103: Receive a measurement point selection instruction input by the user based on the first image, and calculate the distance between the selected measurement points according to the total position offset, the distance between the center of the first camera and the center of the second camera, and the focal length of the first camera.
After the terminal obtains the total position offset, the user may be prompted to select measurement points based on the first image; at this point the first image is the image captured by the main camera, that is, the first camera, and saved by the terminal. Based on the first image, the user may select an edge point of the object to be measured as the first measurement point and then select a second position on the object as the second measurement point. For example, to measure the size of a wardrobe in the image, the upper edge of the wardrobe may be selected as the first measurement point and the lower edge directly below it as the second measurement point; the terminal can then calculate the distance between the two measurement points according to the total position offset, the distance between the centers of the two cameras, and the focal length of the first camera. In the embodiments of the present invention, the distance between the center of the first camera and the center of the second camera can be understood as including the distance between the first camera and the second camera, the distance between the lens of the first camera and the lens of the second camera, the distance between the image sensor of the first camera and the image sensor of the second camera, and the distance between the lens of the first camera and the image sensor of the second camera. Optionally, it may also include the distance between the photosensitive device of the first camera and any component up to and including the edge of the second camera. This is not limited here.
It should be noted that in the embodiments of the present invention the number of measurement points may be two or more than two. If the user selects two measurement points, the distance between the two measurement points is calculated; if the user selects more than two measurement points, the distances between adjacent measurement points are calculated in turn according to the order of selection. For example, when the length and width of a wardrobe need to be measured, the upper-left edge of the wardrobe may be selected as the first measurement point, the lower-left edge as the second measurement point, and the lower-right edge as the third measurement point; the terminal will then calculate the length of the wardrobe from the first and second measurement points, and the width of the wardrobe from the second and third measurement points. Of course, a reset or re-measure button may also be provided for the user; when the user chooses to reset or re-measure, the previously selected measurement points become invalid.
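A minimal sketch of this ordering logic is given below. The actual stereo distance computation is passed in as a function (see the formula and the sketch that follow); a flat pixel-space distance is used here only as a stand-in so the example runs, and the point values are made up.

```python
from typing import Callable, Sequence, Tuple

Point = Tuple[float, float]   # pixel coordinates of a measurement point selected on the first image

def distances_in_selected_order(points: Sequence[Point],
                                distance_fn: Callable[[Point, Point], float]) -> list:
    """With two points, return one distance; with more, return one distance per
    adjacent pair, in the order the points were selected (e.g. length then width)."""
    return [distance_fn(p, q) for p, q in zip(points, points[1:])]

# Wardrobe example: upper-left, lower-left, then lower-right edge points selected in order.
stand_in_distance = lambda p, q: ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
print(distances_in_selected_order([(120, 80), (120, 620), (560, 620)], stand_in_distance))
```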
Optionally, when the distance between the selected measurement points is calculated, it may be done according to the following formula:
Figure PCTCN2015079051-appb-000004
where L is the distance between the selected measurement points, (Xp, Yp) are the coordinates of measurement point p in the first image, Dp is the total position offset of measurement point p, (Xq, Yq) are the coordinates of measurement point q in the first image, Dq is the total position offset of measurement point q, H1 = h1/a, h1 is the focal length of the first camera, and a is the size constant of a single pixel.
D1, a, and h1 are all known quantities. Once the measurement points are selected, the values of (Xp, Yp) and (Xq, Yq) are obtained, and the values of Dp and Dq can be obtained through step S102, so that the distance between point p and point q can be calculated by the above formula.
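The formula referred to above survives only as an image placeholder (Figure PCTCN2015079051-appb-000004). Read together with the variable definitions above and the FIG. 6 derivation below, it would take the following form; this is an editorial reconstruction, not the equation as filed:

```latex
L=\sqrt{\left(\frac{X_{p}D_{1}}{D_{p}}-\frac{X_{q}D_{1}}{D_{q}}\right)^{2}
       +\left(\frac{Y_{p}D_{1}}{D_{p}}-\frac{Y_{q}D_{1}}{D_{q}}\right)^{2}
       +\left(\frac{H_{1}D_{1}}{D_{p}}-\frac{H_{1}D_{1}}{D_{q}}\right)^{2}},
\qquad H_{1}=\frac{h_{1}}{a}
```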
The derivation of the formula for L can be followed in FIG. 6, which is a schematic diagram of the principle of calculating the distance according to the present invention.
As shown in FIG. 6, the position offset component of the object to be measured imaged through the first camera is X1, the position offset component of the object imaged through the second camera is X2, the distance between the centers of the first camera and the second camera is D1, the distance from the center of the object to be measured to the line connecting the centers of the two cameras is Z, and the distance from the center of the object to the extension line through the center of the first camera is XW.
For a pixel on the first image, such as the edge point X1, if its coordinates are (x1, y1), then its absolute length in the x-axis direction is x1.
By the principle of similar triangles, x1/Xw = h1/Z; similarly, in the y-axis direction, y1/Yw = h1/Z, from which Xw = x1*D1/D and Yw = y1*D1/D are obtained.
Therefore, for two measurement points p and q on the object to be measured, their distance equals the Euclidean length between the spatial coordinates of the two measurement points, which, according to the Euclidean distance formula, gives:
Figure PCTCN2015079051-appb-000005
Here H1 = h1/a, h1 is the focal length of the first camera, and a is the size constant of a single pixel. Since the distance D1 between the centers of the first camera and the second camera is expressed in a common length measurement unit, while the lengths Dp and Dq are calculated from numbers of pixels in the image, introducing the constant a makes the units on both sides of the formula consistent, and the unit of the resulting L is the length measurement unit commonly used by the user.
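The derivation above maps each measurement point to camera-frame coordinates (Xw, Yw, Z) = (x·D1/D, y·D1/D, H1·D1/D) and then takes the Euclidean distance. A small sketch of that computation, with made-up camera parameters (60 mm baseline, 4 mm focal length, 1.2 µm pixels) and made-up point data, might look like this; pixel coordinates are taken relative to the image centre, as in FIG. 6:

```python
import math

def world_coords(x_px, y_px, disparity_px, baseline_d1, focal_len_h1, pixel_size_a):
    """Camera-frame coordinates of a measurement point, in the same unit as baseline_d1.
    x_px, y_px: pixel coordinates relative to the centre of the first image;
    disparity_px: total position offset D of this point, in pixels;
    focal_len_h1 and pixel_size_a share one length unit, so H1 = h1/a is in pixels."""
    h1_px = focal_len_h1 / pixel_size_a        # H1 = h1 / a
    scale = baseline_d1 / disparity_px         # D1 / D: pixels -> length units
    return (x_px * scale, y_px * scale, h1_px * scale)   # (Xw, Yw, Z)

def distance_L(p, q, baseline_d1, focal_len_h1, pixel_size_a):
    """Euclidean distance between measurement points p and q, each given as (x_px, y_px, disparity_px)."""
    return math.dist(world_coords(*p, baseline_d1, focal_len_h1, pixel_size_a),
                     world_coords(*q, baseline_d1, focal_len_h1, pixel_size_a))

p = (310.0, -40.0, 95.0)    # (Xp, Yp, Dp), illustrative values
q = (-285.0, -38.0, 93.0)   # (Xq, Yq, Dq)
print(distance_L(p, q, baseline_d1=60.0, focal_len_h1=4.0, pixel_size_a=0.0012))  # result in mm
```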
Of course, in addition to calculating the size of the object to be measured, the embodiments of the present invention can likewise calculate the vertical distance from the object to be measured to the terminal, that is, to the line connecting the centers of the two cameras of the terminal. The specific calculation is also based on the principle of similar triangles: as shown in FIG. 6, (D1-X1-X2)/D1 = (Z-H1)/Z, from which it can be derived that:
Z = H1*D1/(X1+X2), and Z can be obtained quickly from this formula.
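Written out, the similar-triangles step stated above gives Z as follows (same symbols as in the text):

```latex
\frac{D_{1}-X_{1}-X_{2}}{D_{1}}=\frac{Z-H_{1}}{Z}
\;\Longrightarrow\;
Z\,(D_{1}-X_{1}-X_{2})=D_{1}\,(Z-H_{1})
\;\Longrightarrow\;
Z=\frac{H_{1}D_{1}}{X_{1}+X_{2}}
```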
When the calculation is instead based on the second image captured by the second camera, it is only necessary to obtain the focal length h2 of the second camera and substitute h2 for h1; the rest of the calculation is similar and is not repeated here.
The first image and the second image of the object to be measured are acquired through two cameras, and the total position offset of the object is obtained by comparison and calculation. When the user selects the measurement points of the distance to be measured, the distance between the selected measurement points can be calculated from the total position offset, the distance between the center of the first camera and the center of the second camera, and the focal length of the first camera. The user does not need to carry any additional measurement tool; the size of the object to be measured can be obtained simply by taking a photograph with the terminal, which enriches the functions of the terminal and extends its practicality and convenience.
Referring to FIG. 2, which is a schematic flowchart of a second embodiment of a measurement method according to the present invention, in this embodiment the method includes the following steps:
S201: The terminal acquires, through a first camera, a first image containing the object to be measured, and acquires, through a second camera, a second image containing the object to be measured.
The first camera and the second camera are arranged on the same plane.
S202: Compare the positions of the object to be measured in the first image and the second image to obtain a total position offset of the object to be measured.
The total position offset is used to characterize the offset of the position of the object to be measured in the first image relative to its position in the second image.
S203: Compress the imaging data of the first image and the total position offset and store them as a photo file.
After obtaining the total position offset, the user may not need to perform a size measurement immediately. In this case the imaging data of the first image and the total position offset can be compressed and stored as a photo file, which makes it convenient for the user to measure at any time. Besides compressing and storing the imaging data and the total position offset together, the two may of course also be stored separately with a mapping relationship configured between them, so that when the user views the first image and wants to measure a distance, the corresponding total position offset can be retrieved at any time for the calculation.
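As one possible reading of this step, the sketch below stores the first image next to its offset data in a JSON sidecar file and reads both back later. The patent does not specify a container format, so the file layout and names here are assumptions for illustration only.

```python
import json
from pathlib import Path

def save_measurable_photo(jpeg_bytes: bytes, offsets: dict, stem: str) -> None:
    """Store the imaging data of the first image together with its total position
    offsets (here: a mapping from "x,y" pixel keys to D values, in pixels)."""
    Path(f"{stem}.jpg").write_bytes(jpeg_bytes)
    Path(f"{stem}.offsets.json").write_text(json.dumps(offsets))

def load_measurable_photo(stem: str) -> tuple:
    """Parse the stored photo file pair back into imaging data and offset data,
    ready for the measurement-point selection step."""
    return (Path(f"{stem}.jpg").read_bytes(),
            json.loads(Path(f"{stem}.offsets.json").read_text()))
```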
S204: Parse the photo file to obtain the imaging data of the first image and the total position offset.
S205: Decode the imaging data and display the first image on the display screen of the terminal.
Displaying the first image on the display screen of the terminal makes it easy for the user to view it intuitively and to select measurement points more conveniently.
S206: Receive a measurement point selection instruction input by the user based on the first image, and calculate the distance between the selected measurement points according to the total position offset, the distance between the centers of the first camera and the second camera, and the focal length of the first camera.
S207: Display the calculated distance result on the display screen of the terminal.
Optionally, in addition to displaying the result intuitively on the display screen to inform the user of the calculation result, the calculated distance result may also be notified to the user by voice. There may be multiple notification manners, which the user may select or combine according to his or her own needs; for example, text display and voice broadcast may be performed simultaneously. This is not limited in the embodiments of the present invention.
The calculation in this embodiment may be performed with reference to the calculation manner of the first embodiment shown in FIG. 1, and details are not repeated here.
An embodiment of the present invention further provides a computer storage medium, where the computer storage medium stores a program, and the program, when executed, includes some or all of the steps of the measurement method described in the above method embodiments.
Referring to FIG. 3, which is a schematic composition diagram of a first embodiment of a terminal according to the present invention, in this embodiment the terminal includes:
an acquiring unit 100 configured to acquire, through a first camera, a first image containing an object to be measured, and to acquire, through a second camera, a second image containing the object to be measured, where the first camera and the second camera are arranged on the same plane;
a comparing unit 200 configured to compare the positions of the object to be measured in the first image and the second image to obtain a total position offset of the object to be measured, where the total position offset is used to characterize the offset of the position of the object to be measured in the first image relative to its position in the second image; the total position offset may be calculated from the position offset component of the object in the first image and the position offset component of the object in the second image, where a position offset component is the distance from the position of the object in an image to the center line of that image, and the image center line is perpendicular to the line connecting the center of the first camera and the center of the second camera; and
a calculating unit 300 configured to receive a measurement point selection instruction input by the user based on the first image, and to calculate the distance between the selected measurement points according to the total position offset, the distance between the center of the first camera and the center of the second camera, and the focal length of the first camera.
Optionally, the comparing unit 200 is specifically configured to compare, by using the following formula, the positions of the object to be measured in the first image and the second image to obtain the total position offset of the object to be measured:
D = X1 + X2
where D is the total position offset of the object to be measured, X1 is the position offset component of the object to be measured in the first image, and X2 is the position offset component of the object to be measured in the second image.
Optionally, the calculating unit 300 is specifically configured to calculate the distance between the selected measurement points by using the following formula:
Figure PCTCN2015079051-appb-000006
where L is the distance between the selected measurement points, (Xp, Yp) are the coordinates of measurement point p in the first image, Dp is the total position offset of measurement point p, (Xq, Yq) are the coordinates of measurement point q in the first image, Dq is the total position offset of measurement point q, H1 = h1/a, h1 is the focal length of the first camera, and a is the size constant of a single pixel.
Optionally, the calculating unit 300 may be further configured to calculate, by using the following formula, the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera:
Z = H1*D1/(X1+X2)
where Z is the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera, H1 = h1/a, h1 is the focal length of the first camera, a is the size constant of a single pixel, D1 is the distance between the center of the first camera and the center of the second camera, X1 is the position offset component of the object to be measured in the first image, and X2 is the position offset component of the object to be measured in the second image.
Referring to FIG. 4, which is a schematic composition diagram of a second embodiment of a terminal according to the present invention, in this embodiment the terminal includes:
an acquiring unit 100 configured to acquire, through a first camera, a first image containing an object to be measured, and to acquire, through a second camera, a second image containing the object to be measured, where the first camera and the second camera are arranged on the same plane;
a comparing unit 200 configured to compare the positions of the object to be measured in the first image and the second image to obtain a total position offset of the object to be measured, where the total position offset is used to characterize the offset of the position of the object to be measured in the first image relative to its position in the second image; the total position offset may be calculated from the position offset component of the object in the first image and the position offset component of the object in the second image, where a position offset component is the distance from the position of the object in an image to the center line of that image, and the image center line is perpendicular to the line connecting the center of the first camera and the center of the second camera; and
a calculating unit 300 configured to receive a measurement point selection instruction input by the user based on the first image, and to calculate the distance between the selected measurement points according to the total position offset, the distance between the center of the first camera and the center of the second camera, and the focal length of the first camera.
Optionally, the comparing unit 200 is specifically configured to compare, by using the following formula, the positions of the object to be measured in the first image and the second image to obtain the total position offset of the object to be measured:
D = X1 + X2
where D is the total position offset of the object to be measured, X1 is the position offset component of the object to be measured in the first image, and X2 is the position offset component of the object to be measured in the second image.
Optionally, the calculating unit 300 is specifically configured to calculate the distance between the selected measurement points by using the following formula:
Figure PCTCN2015079051-appb-000007
where L is the distance between the selected measurement points, (Xp, Yp) are the coordinates of measurement point p in the first image, Dp is the total position offset of measurement point p, (Xq, Yq) are the coordinates of measurement point q in the first image, Dq is the total position offset of measurement point q, H1 = h1/a, h1 is the focal length of the first camera, and a is the size constant of a single pixel.
Optionally, the calculating unit 300 may be further configured to calculate, by using the following formula, the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera:
Z = H1*D1/(X1+X2)
where Z is the vertical distance from the object to be measured to the line connecting the centers of the first camera and the second camera, H1 = h1/a, h1 is the focal length of the first camera, a is the size constant of a single pixel, D1 is the distance between the center of the first camera and the center of the second camera, X1 is the position offset component of the object to be measured in the first image, and X2 is the position offset component of the object to be measured in the second image.
Optionally, the terminal in this embodiment of the present invention further includes:
a compression unit 400 configured to compress the imaging data of the first image and the total position offset and store them as a photo file;
a parsing unit 500 configured to parse the photo file to obtain the imaging data of the first image and the total position offset;
a decoding unit 600 configured to decode the imaging data and display the first image on the display screen of the terminal; and
a notification unit 700 configured to display the calculated distance result on the display screen of the terminal, or
to notify the user of the calculated distance result by voice.
It should be noted that the above acquiring unit 100, comparing unit 200, calculating unit 300, compression unit 400, parsing unit 500, decoding unit 600, and notification unit 700 may exist independently or may be integrated. In this embodiment, the acquiring unit 100, comparing unit 200, calculating unit 300, compression unit 400, parsing unit 500, decoding unit 600, or notification unit 700 may be provided separately, in hardware form, independently of the processor of the terminal, for example in the form of a microprocessor; it may also be embedded in the processor of the terminal in hardware form, or stored in the memory of the terminal in software form, so that the processor of the terminal can invoke and execute the operations corresponding to the above acquiring unit 100, comparing unit 200, calculating unit 300, compression unit 400, parsing unit 500, decoding unit 600, and notification unit 700.
For example, in the first terminal embodiment of the present invention (the embodiment shown in FIG. 3), the calculating unit 300 may be the processor of the terminal, and the functions of the acquiring unit 100 and the comparing unit 200 may be embedded in the processor, set separately and independently of the processor, or stored in the memory in software form and invoked by the processor to implement their functions. This is not limited in the embodiments of the present invention. The above processor may be a central processing unit (CPU), a microprocessor, a single-chip microcomputer, or the like.
请参照图5,为本发明一种终端的第三实施例的组成示意图,在本实施例中,所述终端包括:
输入设备10、输出设备20、存储器30和处理器40,所述输入设备10包括第一摄像头和第二摄像头,用于获取待测物体的图像,所述输入设备10、输出设备20、存储器30和处理器40与总线连接,其中,所述存储器30中存储一组程序代码,所述处理器40用于调用所述存储器30中存储的程序代码,执行以下操作:
通过第一摄像头获取包含待测物体的第一图像,并通过第二摄像头获取包含所述待测物体的第二图像,其中,所述第一摄像头和所述第二摄像头设置在同一平面上;
比较在所述第一图像和所述第二图像中所述待测物体的位置,得到所述待测物体的位置偏移总量,其中,所述位置偏移总量用于表征所述待测物体在所述第一图像中的位置相对于所述待测物体在所述第二图像中的位置的偏移量。所述位置偏移总量可根据所述待测物体在所述第一图像中的位置偏移分量和所述待测物体在所述第二图像中的位置偏移分量计算得到,所述位置偏移分量为所述待测物体在图像中的位置到该图像中心线的距离,所述图像中心线垂直于所述第一摄像头的中心和所述第二摄像头的中心的连线;
接收用户基于所述第一图像输入的测量点选定指令,根据所述位置偏移总量、所述第一摄像头的中心和所述第二摄像头的中心的距离以及所述第一摄像头的焦距,计算选定的测量点之间的距离。
可选地,所述处理器40还用于:
在所述得到所述待测物体的位置偏移总量之后,将所述第一图像的成像数据和所述位置偏移总量进行压缩并存储为照片文件;
在所述接收用户基于所述第一图像输入的测量点选定指令之前,解析所述照片文件,得到所述第一图像的成像数据和所述位置偏移总量;
解码所述成像数据,将所述第一图像显示在所述终端的显示屏上。
所述输出设备20用于在所述计算选定的测量点之间的距离之后,将计算得到的距离结果显示在所述终端的显示屏上;或者
将计算得到的距离结果以语音的方式告知用户。
可选地,所述处理器40具体用于通过以下公式比较在所述第一图像和所述第二图像中所述待测物体的位置,得到所述待测物体的位置偏移总量:
D=X1+X2
其中,D为所述待测物体的位置偏移总量,X1为待测物体在所述第一图像中的位置偏移分量,X2为待测物体在所述第二图像中的位置偏移分量。
可选地,所述处理器40还用于具体用于通过以下公式计算选定的测量点之间的距离:
Figure PCTCN2015079051-appb-000008
其中,L为选定的测量点之间的距离,(Xp,Yp)为测量点p在所述第一图像中的坐标,Dp为所述测量点P的位置偏移总量,(Xq,Yq)为测量点q在所述第一图像中的坐标,Dq为所述测量点q的位置偏移总量,H1=h1/a,h1为所述第一摄像头的焦距,a为单个像素点的尺寸常量。
可选地,所述处理器40还用于通过以下公式计算所述待测物体到所述第一摄像头和所述第二摄像头的中心连线的垂直距离:
Z=H1*D1/(X1+X2)
其中,Z为所述待测物体到所述第一摄像头和所述第二摄像头的中心连线的垂直距离,H1=h1/a,h1为所述第一摄像头的焦距,a为单个像素点的尺寸常量,D1为所述第一摄像头的中心和所述第二摄像头的中心的距离,X1为待测物体在所述第一图像中的位置偏移分量,X2为待测物体在所述第二图像中的位置偏移分量。
需要说明的是,本说明书中的各个实施例均采用递进的方式描述,每个实施例重点说明的都是与其它实施例的不同之处,各个实施例之间相同相似的部 分互相参见即可。对于装置实施例而言,由于其与方法实施例基本相似,所以描述的比较简单,相关之处参见方法实施例的部分说明即可。
通过上述实施例的描述,本发明具有以下优点:
通过两个摄像头获取待测物体的第一图像和第二图像,并比较计算得到待测物体的位置偏移总量,当用户选定待测距离的测量点时,便可以根据该位置偏移总量、第一摄像头的中心和第二摄像头的中心的距离以及第一摄像头的焦距便可以计算得到选定测量点之间的距离,用户无需额外携带任何测量工具,只需要携带终端进行拍照便可以实现测量待测物体的尺寸,丰富了终端的功能,扩展了终端的实用性和便利性。
A person of ordinary skill in the art will understand that all or some of the steps of the foregoing method embodiments may be implemented by hardware related to program instructions. The program may be stored in a computer-readable storage medium, and when executed it performs the steps of the foregoing method embodiments. The storage medium includes any medium that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
The measurement method and terminal provided in the embodiments of the present invention have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present invention, and the description of the foregoing embodiments is intended only to help understand the method and its core idea. A person of ordinary skill in the art may, based on the idea of the present invention, make changes to the specific implementations and the application scope. In conclusion, the content of this specification shall not be construed as a limitation on the present invention.

Claims (22)

  1. A measurement method, comprising:
    acquiring, by a terminal by using a first camera, a first image comprising an object to be measured, and acquiring, by using a second camera, a second image comprising the object to be measured, wherein the first camera and the second camera are disposed on a same plane;
    comparing positions of the object to be measured in the first image and in the second image to obtain a total position offset of the object to be measured, wherein the total position offset represents an offset of the position of the object to be measured in the first image relative to the position of the object to be measured in the second image; and
    receiving a measurement point selection instruction entered by a user based on the first image, and calculating a distance between selected measurement points according to the total position offset, a distance between a center of the first camera and a center of the second camera, and a focal length of the first camera.
  2. The method according to claim 1, wherein after the comparing of the positions of the object to be measured in the first image and in the second image to obtain the total position offset of the object to be measured, the method further comprises:
    compressing imaging data of the first image together with the total position offset and storing them as a photo file; and
    before the receiving of the measurement point selection instruction entered by the user based on the first image, the method further comprises:
    parsing the photo file to obtain the imaging data of the first image and the total position offset; and
    decoding the imaging data and displaying the first image on a display screen of the terminal.
  3. The method according to claim 1 or 2, wherein after the calculating of the distance between the selected measurement points, the method further comprises:
    displaying the calculated distance result on a display screen of the terminal; or
    announcing the calculated distance result to the user by voice.
  4. The method according to any one of claims 1 to 3, wherein the comparing of the positions of the object to be measured in the first image and in the second image to obtain the total position offset of the object to be measured is performed according to the following formula:
    D=X1+X2
    wherein D is the total position offset of the object to be measured, X1 is the position offset component of the object to be measured in the first image, and X2 is the position offset component of the object to be measured in the second image.
  5. The method according to any one of claims 1 to 4, wherein the receiving of the measurement point selection instruction entered by the user based on the first image and the calculating of the distance between the selected measurement points according to the total position offset, the distance between the centers of the first camera and the second camera, and the focal length of the first camera are performed according to the following formula:
    L = sqrt((Xp*D1/Dp - Xq*D1/Dq)^2 + (Yp*D1/Dp - Yq*D1/Dq)^2 + (H1*D1/Dp - H1*D1/Dq)^2)
    wherein L is the distance between the selected measurement points, (Xp, Yp) are the coordinates of measurement point p in the first image, Dp is the total position offset of measurement point p, (Xq, Yq) are the coordinates of measurement point q in the first image, Dq is the total position offset of measurement point q, H1=h1/a, h1 is the focal length of the first camera, and a is the size constant of a single pixel.
  6. The method according to any one of claims 1 to 5, further comprising:
    calculating, according to the following formula, a perpendicular distance from the object to be measured to the line connecting the centers of the first camera and the second camera:
    Z=H1*D1/(X1+X2)
    wherein Z is the perpendicular distance from the object to be measured to the line connecting the centers of the first camera and the second camera, H1=h1/a, h1 is the focal length of the first camera, a is the size constant of a single pixel, D1 is the distance between the center of the first camera and the center of the second camera, X1 is the position offset component of the object to be measured in the first image, and X2 is the position offset component of the object to be measured in the second image.
  7. The method according to any one of claims 1 to 6, wherein:
    if the user selects two measurement points, the distance between the two measurement points is calculated; and
    if the user selects more than two measurement points, the distances between adjacent measurement points are calculated in sequence according to the order of selection.
  8. A terminal, comprising:
    an acquiring unit, configured to acquire, by using a first camera, a first image comprising an object to be measured, and acquire, by using a second camera, a second image comprising the object to be measured, wherein the first camera and the second camera are disposed on a same plane;
    a comparing unit, configured to compare positions of the object to be measured in the first image and in the second image to obtain a total position offset of the object to be measured, wherein the total position offset represents an offset of the position of the object to be measured in the first image relative to the position of the object to be measured in the second image; and
    a calculating unit, configured to receive a measurement point selection instruction entered by a user based on the first image, and calculate a distance between selected measurement points according to the total position offset, a distance between a center of the first camera and a center of the second camera, and a focal length of the first camera.
  9. The terminal according to claim 8, further comprising:
    a compressing unit, configured to compress imaging data of the first image together with the total position offset and store them as a photo file;
    a parsing unit, configured to parse the photo file to obtain the imaging data of the first image and the total position offset; and
    a decoding unit, configured to decode the imaging data and display the first image on a display screen of the terminal.
  10. The terminal according to claim 8 or 9, further comprising:
    a notifying unit, configured to display the calculated distance result on a display screen of the terminal; or
    to announce the calculated distance result to the user by voice.
  11. The terminal according to any one of claims 8 to 10, wherein the comparing unit is specifically configured to compare the positions of the object to be measured in the first image and in the second image by using the following formula, to obtain the total position offset of the object to be measured:
    D=X1+X2
    wherein D is the total position offset of the object to be measured, X1 is the position offset component of the object to be measured in the first image, and X2 is the position offset component of the object to be measured in the second image.
  12. The terminal according to any one of claims 8 to 11, wherein the calculating unit is specifically configured to calculate the distance between the selected measurement points by using the following formula:
    L = sqrt((Xp*D1/Dp - Xq*D1/Dq)^2 + (Yp*D1/Dp - Yq*D1/Dq)^2 + (H1*D1/Dp - H1*D1/Dq)^2)
    wherein L is the distance between the selected measurement points, (Xp, Yp) are the coordinates of measurement point p in the first image, Dp is the total position offset of measurement point p, (Xq, Yq) are the coordinates of measurement point q in the first image, Dq is the total position offset of measurement point q, H1=h1/a, h1 is the focal length of the first camera, and a is the size constant of a single pixel.
  13. The terminal according to any one of claims 8 to 12, wherein the calculating unit is further configured to calculate, by using the following formula, a perpendicular distance from the object to be measured to the line connecting the centers of the first camera and the second camera:
    Z=H1*D1/(X1+X2)
    wherein Z is the perpendicular distance from the object to be measured to the line connecting the centers of the first camera and the second camera, H1=h1/a, h1 is the focal length of the first camera, a is the size constant of a single pixel, D1 is the distance between the center of the first camera and the center of the second camera, X1 is the position offset component of the object to be measured in the first image, and X2 is the position offset component of the object to be measured in the second image.
  14. The terminal according to any one of claims 8 to 13, wherein the calculating unit is specifically configured to:
    if the user selects two measurement points, calculate the distance between the two measurement points; and
    if the user selects more than two measurement points, calculate the distances between adjacent measurement points in sequence according to the order of selection.
  15. A terminal, comprising:
    an input device, an output device, a memory, and a processor, wherein the input device comprises a first camera and a second camera and is configured to acquire images of an object to be measured; the input device, the output device, the memory, and the processor are connected to a bus; the memory stores a set of program code; and the processor is configured to invoke the program code stored in the memory to perform the following operations:
    acquiring, by using the first camera, a first image comprising the object to be measured, and acquiring, by using the second camera, a second image comprising the object to be measured, wherein the first camera and the second camera are disposed on a same plane;
    comparing positions of the object to be measured in the first image and in the second image to obtain a total position offset of the object to be measured, wherein the total position offset represents an offset of the position of the object to be measured in the first image relative to the position of the object to be measured in the second image; and
    receiving a measurement point selection instruction entered by a user based on the first image, and calculating a distance between selected measurement points according to the total position offset, a distance between a center of the first camera and a center of the second camera, and a focal length of the first camera.
  16. The terminal according to claim 15, wherein the processor is further configured to:
    after the total position offset of the object to be measured is obtained, compress imaging data of the first image together with the total position offset and store them as a photo file;
    before the measurement point selection instruction entered by the user based on the first image is received, parse the photo file to obtain the imaging data of the first image and the total position offset; and
    decode the imaging data and display the first image on a display screen of the terminal.
  17. The terminal according to claim 15 or 16, wherein the output device is configured to: after the distance between the selected measurement points is calculated, display the calculated distance result on the display screen of the terminal; or
    announce the calculated distance result to the user by voice.
  18. The terminal according to any one of claims 15 to 17, wherein the processor is specifically configured to compare the positions of the object to be measured in the first image and in the second image by using the following formula, to obtain the total position offset of the object to be measured:
    D=X1+X2
    wherein D is the total position offset of the object to be measured, X1 is the position offset component of the object to be measured in the first image, and X2 is the position offset component of the object to be measured in the second image.
  19. The terminal according to any one of claims 15 to 18, wherein the processor is specifically configured to calculate the distance between the selected measurement points by using the following formula:
    L = sqrt((Xp*D1/Dp - Xq*D1/Dq)^2 + (Yp*D1/Dp - Yq*D1/Dq)^2 + (H1*D1/Dp - H1*D1/Dq)^2)
    wherein L is the distance between the selected measurement points, (Xp, Yp) are the coordinates of measurement point p in the first image, Dp is the total position offset of measurement point p, (Xq, Yq) are the coordinates of measurement point q in the first image, Dq is the total position offset of measurement point q, H1=h1/a, h1 is the focal length of the first camera, and a is the size constant of a single pixel.
  20. The terminal according to any one of claims 15 to 19, wherein the processor is further configured to calculate, by using the following formula, a perpendicular distance from the object to be measured to the line connecting the centers of the first camera and the second camera:
    Z=H1*D1/(X1+X2)
    wherein Z is the perpendicular distance from the object to be measured to the line connecting the centers of the first camera and the second camera, H1=h1/a, h1 is the focal length of the first camera, a is the size constant of a single pixel, D1 is the distance between the center of the first camera and the center of the second camera, X1 is the position offset component of the object to be measured in the first image, and X2 is the position offset component of the object to be measured in the second image.
  21. The terminal according to any one of claims 15 to 20, wherein when calculating the distance between the selected measurement points, the processor is specifically configured to:
    if the user selects two measurement points, calculate the distance between the two measurement points; and
    if the user selects more than two measurement points, calculate the distances between adjacent measurement points in sequence according to the order of selection.
  22. A computer storage medium, wherein
    the computer storage medium stores a program, and when the program is executed, the steps according to any one of claims 1 to 7 are performed.
PCT/CN2015/079051 2015-05-15 2015-05-15 一种测量的方法及终端 WO2016183723A1 (zh)

Priority Applications (7)

Application Number Priority Date Filing Date Title
EP15892104.9A EP3264032B1 (en) 2015-05-15 2015-05-15 Measurement method and terminal
US15/561,287 US10552971B2 (en) 2015-05-15 2015-05-15 Measurement method, and terminal
JP2017557076A JP6490242B2 (ja) 2015-05-15 2015-05-15 測定方法、および端末
CN201580079722.3A CN107532881B (zh) 2015-05-15 2015-05-15 一种测量的方法及终端
KR1020177030133A KR101971815B1 (ko) 2015-05-15 2015-05-15 측정 방법 및 단말기
BR112017021042-8A BR112017021042B1 (pt) 2015-05-15 2015-05-15 Método de medição, terminal e meio legível por computador
PCT/CN2015/079051 WO2016183723A1 (zh) 2015-05-15 2015-05-15 一种测量的方法及终端

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2015/079051 WO2016183723A1 (zh) 2015-05-15 2015-05-15 一种测量的方法及终端

Publications (1)

Publication Number Publication Date
WO2016183723A1 true WO2016183723A1 (zh) 2016-11-24

Family

ID=57319126

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/079051 WO2016183723A1 (zh) 2015-05-15 2015-05-15 一种测量的方法及终端

Country Status (7)

Country Link
US (1) US10552971B2 (zh)
EP (1) EP3264032B1 (zh)
JP (1) JP6490242B2 (zh)
KR (1) KR101971815B1 (zh)
CN (1) CN107532881B (zh)
BR (1) BR112017021042B1 (zh)
WO (1) WO2016183723A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115046480A (zh) * 2021-03-09 2022-09-13 华为技术有限公司 一种测量长度的方法、电子设备以及移动设备

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108603743B (zh) * 2016-02-04 2020-03-27 富士胶片株式会社 信息处理装置、信息处理方法及程序
US10762658B2 (en) * 2017-10-24 2020-09-01 Altek Corporation Method and image pick-up apparatus for calculating coordinates of object being captured using fisheye images
CN109008937A (zh) * 2018-07-26 2018-12-18 上海鹰瞳医疗科技有限公司 屈光度检测方法和设备
CN110006340B (zh) * 2019-03-26 2020-09-08 华为技术有限公司 一种物体尺寸测量方法和电子设备
WO2021058841A1 (es) * 2019-09-27 2021-04-01 Sigma Technologies, S.L. Método de medida no supervisada de las dimensiones de un objeto usando una vista obtenida con una única cámara
CN111473767B (zh) * 2020-04-16 2022-10-25 福建汇川物联网技术科技股份有限公司 一种远程测距方法及装置
CN113746963B (zh) * 2021-08-30 2023-11-21 苏州灵猴机器人有限公司 一种零部件安装方法、装置、设备及存储介质
KR20230116565A (ko) * 2022-01-28 2023-08-04 삼성전자주식회사 전자 장치 및 전자 장치의 제어 방법

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103063138A (zh) * 2013-01-24 2013-04-24 惠州Tcl移动通信有限公司 移动终端摄像头测量物体尺寸和速度的方法及移动终端
CN103292710A (zh) * 2013-05-27 2013-09-11 华南理工大学 一种应用双目视觉视差测距原理的距离测量方法
WO2013174354A2 (zh) * 2012-11-30 2013-11-28 中兴通讯股份有限公司 一种单摄像头测距的方法和系统
CN104596419A (zh) * 2015-01-22 2015-05-06 宇龙计算机通信科技(深圳)有限公司 一种基于双摄像头测量物体质量的方法、装置和终端

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0518748A (ja) * 1991-07-09 1993-01-26 Kyocera Corp 距離情報等を表示可能なデイジタル電子スチルカメラ
JP3078069B2 (ja) * 1991-12-16 2000-08-21 オリンパス光学工業株式会社 測距装置
JP4468544B2 (ja) 2000-04-03 2010-05-26 オリンパス株式会社 内視鏡装置
US6803270B2 (en) * 2003-02-21 2004-10-12 International Business Machines Corporation CMOS performance enhancement using localized voids and extended defects
US20050036046A1 (en) * 2003-08-14 2005-02-17 Nokia Corporation Method of or device for processing image data, a processed image data format, and a method of or device for displaying at least one image from the processed image data
US20090029078A1 (en) * 2007-07-25 2009-01-29 Gohil Rameshchandra M Oxygen scavenging composition, coating composition and package containing transition metal oxide
JP5167881B2 (ja) * 2008-03-14 2013-03-21 カシオ計算機株式会社 距離測定装置及びそのプログラム
US8345953B2 (en) * 2008-05-22 2013-01-01 Matrix Electronic Measuring Properties, Llc Stereoscopic measurement system and method
JP5441549B2 (ja) * 2009-07-29 2014-03-12 日立オートモティブシステムズ株式会社 道路形状認識装置
JP5018980B2 (ja) 2010-04-08 2012-09-05 カシオ計算機株式会社 撮像装置、長さ測定方法、及びプログラム
JP2013113600A (ja) 2011-11-25 2013-06-10 Sharp Corp ステレオ3次元計測装置
WO2013146269A1 (ja) 2012-03-29 2013-10-03 シャープ株式会社 画像撮像装置、画像処理方法およびプログラム
US20130331145A1 (en) * 2012-06-07 2013-12-12 Sharp Laboratories Of America, Inc. Measuring system for mobile three dimensional imaging system
RU2601421C2 (ru) * 2012-11-29 2016-11-10 Ксир Способ и система калибровки камеры
CN103344213A (zh) 2013-06-28 2013-10-09 三星电子(中国)研发中心 一种双摄像头测量距离的方法和装置
WO2017077906A1 (ja) * 2015-11-06 2017-05-11 富士フイルム株式会社 情報処理装置、情報処理方法、及びプログラム
US20170343367A1 (en) * 2016-05-26 2017-11-30 Hyundai Motor Company Apparatus and method for anticipating and facilitating vehicle occupant's metabolic activity

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013174354A2 (zh) * 2012-11-30 2013-11-28 中兴通讯股份有限公司 一种单摄像头测距的方法和系统
CN103063138A (zh) * 2013-01-24 2013-04-24 惠州Tcl移动通信有限公司 移动终端摄像头测量物体尺寸和速度的方法及移动终端
CN103292710A (zh) * 2013-05-27 2013-09-11 华南理工大学 一种应用双目视觉视差测距原理的距离测量方法
CN104596419A (zh) * 2015-01-22 2015-05-06 宇龙计算机通信科技(深圳)有限公司 一种基于双摄像头测量物体质量的方法、装置和终端

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3264032A4 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115046480A (zh) * 2021-03-09 2022-09-13 华为技术有限公司 一种测量长度的方法、电子设备以及移动设备
WO2022188691A1 (zh) * 2021-03-09 2022-09-15 华为技术有限公司 一种测量长度的方法、电子设备以及移动设备
CN115046480B (zh) * 2021-03-09 2023-11-10 华为技术有限公司 一种测量长度的方法、电子设备以及移动设备

Also Published As

Publication number Publication date
KR101971815B1 (ko) 2019-04-23
CN107532881B (zh) 2020-02-14
EP3264032B1 (en) 2019-07-24
CN107532881A (zh) 2018-01-02
KR20170128779A (ko) 2017-11-23
BR112017021042B1 (pt) 2022-10-25
BR112017021042A2 (zh) 2018-07-24
JP6490242B2 (ja) 2019-03-27
US20180130219A1 (en) 2018-05-10
EP3264032A4 (en) 2018-02-28
US10552971B2 (en) 2020-02-04
JP2018519500A (ja) 2018-07-19
EP3264032A1 (en) 2018-01-03

Similar Documents

Publication Publication Date Title
WO2016183723A1 (zh) 一种测量的方法及终端
JP6116486B2 (ja) 寸法計測方法
US9721346B2 (en) Image assessment device, method, and computer readable medium for 3-dimensional measuring and capturing of image pair range
JP6348611B2 (ja) 自動ピント合わせ方法、装置、プログラム及び記録媒体
CN108050943B (zh) 一种长度测量方法及终端
US8670023B2 (en) Apparatuses and methods for providing a 3D man-machine interface (MMI)
WO2017012269A1 (zh) 通过图像确定空间参数的方法、装置及终端设备
US20180205882A1 (en) Method and system for remote control of photo-taking by a bluetooth smart watch, a smart terminal and spp thereof
WO2017124899A1 (zh) 一种信息处理方法及装置、电子设备
CN108200335B (zh) 基于双摄像头的拍照方法、终端及计算机可读存储介质
WO2016127671A1 (zh) 图像滤镜生成方法及装置
JP6502511B2 (ja) 計算装置、計算装置の制御方法および計算プログラム
US20170032180A1 (en) Method and device for determining associated user
CN106454086B (zh) 一种图像处理方法和移动终端
JP2018503269A (ja) プレビューイメージの表示方法および装置
CN105320695A (zh) 图片处理方法及装置
JP2017168882A (ja) 画像処理装置、画像処理方法及びプログラム
WO2018133305A1 (zh) 一种图像处理的方法及装置
JP2017090420A (ja) 3次元情報復元装置及び3次元情報復元方法
JP5996233B2 (ja) 画像撮像装置
KR102111148B1 (ko) 썸네일 이미지 생성 방법 및 그 전자 장치
JP2019092168A (ja) 撮像装置、撮像方法、撮像プログラム
WO2016180221A1 (zh) 一种拍照的方法及装置
CN109981990B (zh) 一种图像处理方法、装置及终端
CN118138886A (zh) 对象拍摄方法、装置、终端、存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 15892104; Country of ref document: EP; Kind code of ref document: A1)
WWE Wipo information: entry into national phase (Ref document number: 15561287; Country of ref document: US)
REEP Request for entry into the european phase (Ref document number: 2015892104; Country of ref document: EP)
ENP Entry into the national phase (Ref document number: 20177030133; Country of ref document: KR; Kind code of ref document: A)
ENP Entry into the national phase (Ref document number: 2017557076; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
REG Reference to national code (Ref country code: BR; Ref legal event code: B01A; Ref document number: 112017021042; Country of ref document: BR)
ENP Entry into the national phase (Ref document number: 112017021042; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20170929)