WO2020010561A1 - Method and device for measuring object parameters - Google Patents

Method and device for measuring object parameters

Info

Publication number
WO2020010561A1
Authority
WO
WIPO (PCT)
Prior art keywords
measured
depth
area
image
parameter
Prior art date
Application number
PCT/CN2018/095370
Other languages
English (en)
French (fr)
Inventor
张旭
柯政遠
王强
那柏林
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to PCT/CN2018/095370
Priority to CN201880087447.3A
Publication of WO2020010561A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes

Definitions

  • the present application relates to the technical field of terminals, and in particular, to a method and a device for measuring parameters of an object.
  • FIG. 1 is a schematic diagram of non-contact calorie measurement.
  • In FIG. 1, an apple is taken as the example object to be measured.
  • When the user needs to measure the calorie value contained in the apple 1 placed on the table, the user can photograph the apple 1 with the camera of the electronic device 2 to obtain an image 3 containing the apple 1 to be measured, and the electronic device 2 recognizes through image processing technology that the type of food in the image 3 is an apple.
  • The electronic device 2 can then display to the user on its screen the number of calories per 100 grams of apple; or the electronic device 2 estimates the actual size of the apple to be measured from the area occupied by the apple 1 in the image 3, and estimates the calorie value of the apple 1 from the estimated size.
  • FIG. 2 is a schematic diagram of the distance between the electronic device and the object to be measured in the non-contact calorie measurement method
  • FIG. 3 is a schematic diagram of processing the image containing the object to be measured in the non-contact calorie measurement method.
  • When the user photographs the actual apple 1 to be measured with the camera of the electronic device 2, there is no uniform standard for the distance to the object to be measured, and the user may photograph the apple 1 from different distances. When the electronic device 2 is close to the apple 1, the apple occupies a larger area in the captured photo; when it is farther away, the apple 1 occupies a smaller area.
  • For example, after image processing, photo A in FIG. 3, captured with the electronic device 2 at a distance L1 from the apple 1, shows the apple 1 with size S1, while photo B in FIG. 3, captured at a distance L2, shows the apple 1 with size S2, where S1 < S2. Therefore, for the same apple 1 to be measured, the area occupied by the apple 1 differs between photos A and B taken by the electronic device 2 at different distances.
  • Yet when the electronic device estimates the calorie value of the object to be measured, it relies only on the relative area of the food in the photo. The calorie values that the electronic device 2 determines for the same apple from photos A and B will therefore differ, making the estimate of the apple's calorie value inaccurate. That is, in the prior art, when the calorie value of food is estimated, the electronic device estimates it from the size of the food in the captured photo, but that size is only a relative area within the specific photo.
  • the present application provides a method and a device for measuring an object parameter, which improves the accuracy when measuring an object parameter (such as calories).
  • a first aspect of the present application provides a method for measuring an object parameter, including:
  • Determining the depth of the object to be measured in the image, where the depth of the object to be measured is the shooting distance from the device used to capture the first image to the object to be measured when the device captures the first image;
  • a first parameter of the object to be measured is determined according to a category of the object to be measured, the area of the object to be measured, and a depth of the object to be measured.
  • determining the parameters of the object to be measured according to the type of the object to be measured, the area of the object to be measured, and the depth of the object to be measured includes:
  • a parameter of the object to be measured corresponding to the category of the object to be measured, the area of the object to be measured, and the depth of the object to be measured is determined by searching a mapping relationship, and the mapping relationship includes a category of at least one object, Correspondence between area, depth and parameters.
  • the method further includes:
  • the mapping relationship is determined according to a correspondence between a category, an area, a depth, and a parameter of the at least one object.
  • a correspondence between a category, an area, a depth, and a parameter of the at least one object includes:
  • the area and the depth of the at least one object are in a proportional relationship.
  • the area and the depth of the object have a linear or non-linear relationship.
  • the area and the depth of the object are in an inverse proportional relationship, and the shooting distance is in a -2 power relationship with the area.
  • the determining a category of an object to be measured in an image includes: identifying features of the object to be measured in the image by deep learning to determine the category of the object to be measured.
  • the determining an area of the object to be measured in the image includes:
  • A heat map of the image is determined; and the area of the object to be measured is determined according to the heat map of the image.
  • the acquiring the depth of the object to be measured in the image includes: determining, according to environmental information when the device photographs the object to be measured, a ranging method corresponding to the environmental information; and determining the depth of the object to be measured by the ranging method.
  • the ranging method includes: determining the depth of the object to be measured by the parallax of a dual camera of the device; or determining the depth of the object to be measured by an auto-focus (AF) camera of the device; or determining the depth of the object to be measured by laser ranging with a sensor of the device; or determining the depth of the object to be measured by the time of flight (TOF) of a camera of the device; or determining the depth of the object to be measured by the structured light of a camera of the device.
  • the parameter of the object to be measured is used to indicate the heat of the object to be measured.
  • the parameter of the object to be measured is a calorie value of the object to be measured.
  • In summary, in the method for measuring an object parameter provided by the first aspect of the present application, the category of the object to be measured and its area and depth in the image are identified, and the parameters of the object to be measured are determined from the category, the area in the image, and the depth. The influence of depth on the measured parameters is thus brought into the measurement, thereby improving the accuracy of the non-contact measurement of object parameters.
  • a second aspect of the present application provides a device for measuring a parameter of an object, including:
  • Recognition module for identifying the type of object to be measured in an image
  • a first determining module configured to determine an area of the object to be measured in the image
  • a second determining module configured to determine a depth of the object to be measured in the image, where the depth of the object to be measured is the distance from the device used to capture the image to the object to be measured when the image is captured;
  • a third determining module is configured to determine parameters of the object to be measured according to a category of the object to be measured, an area of the object to be measured, and a depth of the object to be measured.
  • the third determining module is specifically configured to:
  • a parameter of the object to be measured corresponding to the category of the object to be measured, the area of the object to be measured, and the depth of the object to be measured is determined by searching a mapping relationship, and the mapping relationship includes a category of at least one object, Correspondence between area, depth and parameters.
  • the device further includes:
  • a fourth determining module configured to determine a correspondence between a category, an area, a depth, and a parameter of at least one object
  • the fourth determining module is further configured to determine the mapping relationship according to a correspondence relationship between a category, an area, a depth, and a parameter of the at least one object.
  • a correspondence between a category, an area, a depth, and a parameter of the at least one object includes:
  • the area and the depth of the at least one object are in a proportional relationship.
  • the area and the depth of the object have a linear or non-linear relationship.
  • the area and the depth of the object are in an inverse proportional relationship, and the shooting distance is in a -2 power relationship with the area.
  • the identification module is specifically configured to: identify features of the object to be measured in the image by deep learning to determine the category of the object to be measured.
  • the first determining module is specifically configured to:
  • determine a heat map of the image; and determine the area of the object to be measured according to the heat map of the image.
  • the second determining module is specifically configured to determine, according to environmental information when the device photographs the object to be measured, a ranging method corresponding to the environmental information, and to determine the depth of the object to be measured by the ranging method.
  • the ranging method includes: determining the depth of the object to be measured by the parallax of a dual camera of the device; or determining the depth of the object to be measured by an auto-focus (AF) camera of the device; or determining the depth of the object to be measured by laser ranging with a sensor of the device; or determining the depth of the object to be measured by the time of flight (TOF) of a camera of the device; or determining the depth of the object to be measured by the structured light of a camera of the device.
  • the parameter of the object to be measured is used to indicate the heat of the object to be measured.
  • the parameter of the object to be measured is a calorie value of the object to be measured.
  • In summary, in the device for measuring object parameters provided by the second aspect of the present application, the category of the object to be measured and its area and depth in the image are identified, and the first parameter of the object to be measured is determined from the category, the area in the image, and the depth.
  • The influence of depth on the measured parameters is thus brought into the measurement, thereby improving the accuracy of the non-contact measurement of object parameters.
  • an embodiment of the present application provides a device for measuring an object parameter, including: a processor and a memory; the memory is used to store a program; the processor is used to call the program stored in the memory to perform the method for measuring an object parameter according to any one of the first aspect of the present application.
  • an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores program code, and when the program code is executed, the method for measuring an object parameter according to any one of the first aspect of the present application is performed.
  • Figure 1 is a schematic diagram of non-contact calorie measurement
  • FIG. 2 is a schematic diagram of the distance between the electronic device and the object to be measured in the non-contact calorie measurement method
  • FIG. 3 is a schematic diagram of processing an image including an object to be measured in a non-contact calorie measurement method
  • FIG. 4 is a schematic flowchart of Embodiment 1 of a method for measuring an object parameter of the present application;
  • FIG. 5 is a schematic diagram of determining a corresponding relationship in a method for measuring an object parameter of the present application
  • FIG. 6 is a schematic diagram showing a relationship between a depth and an area of an object of the present application.
  • FIG. 7 is a principle diagram of determining a depth of a device and an object to be measured according to parallax of a dual camera;
  • FIG. 8 is a flowchart of determining a depth of a device and an object to be measured according to parallax of a dual camera
  • FIG. 9 is a schematic structural diagram of an embodiment of a parameter measurement system of the present application.
  • FIG. 10 is a schematic flowchart of an embodiment of a parameter measurement system of the present application.
  • FIG. 11 is a schematic structural diagram of a first embodiment of a device for measuring an object parameter of the present application
  • FIG. 12 is a schematic structural diagram of a second embodiment of a device for measuring an object parameter of the present application.
  • FIG. 13 is a schematic structural diagram of a third embodiment of a device for measuring an object parameter of the present application.
  • FIG. 4 is a schematic flowchart of Embodiment 1 of a method for measuring an object parameter of the present application.
  • the object parameter measurement method provided in this application includes:
  • S101 Identify the type of the object to be measured in the image.
  • the execution subject of the method for measuring an object parameter in this embodiment may be any device having a data processing function.
  • the device may also be referred to as an electronic device, a terminal, a user equipment (UE), a mobile station (MS), a mobile terminal (MT), and the like.
  • the device can be a mobile phone, a tablet (Pad), a computer with wireless transceiver function, a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, a wireless terminal in industrial control, a wireless terminal in self driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, and so on.
  • the type of the object to be measured in the image may be identified through S101.
  • the image in this embodiment is an image including an object to be measured, and is captured by a device that measures an object parameter.
  • Alternatively, the image containing the object to be measured may be captured by another device and obtained by the device that measures the object parameters.
  • the parameters of different types of objects are different, that is, the categories of the objects to be measured are used to distinguish the objects to be measured, so as to more accurately measure the parameters of different types of objects.
  • For example, when the object to be measured is a food whose calories can be calculated, such as fruit or meat, different foods usually have different calorie values, so the kinds of food, such as fruit and meat, are used as the categories of the objects to be measured.
  • food types can include: apples, pears, chicken and beef.
  • the parameter of the object to be measured may be used to indicate the heat of the object to be measured.
  • the heat of the object to be measured can be expressed by the calorie value of the object to be measured.
  • The following embodiments of the present application are all described with the parameter being a calorie value as an example; the method for measuring object parameters in the present application may also be used when the device measures other physical or length parameters of an object, which is not limited herein.
  • S102 Determine the area of the object to be measured in the image.
  • When the device in this embodiment needs to measure the parameters of the object to be measured, it determines the area of the object to be measured in the image through S102.
  • the area of the object to be measured in the image determined by the device may be the area of the object relative to the image.
  • Photo A and Photo B shown in FIG. 3 may be images taken by the same device and containing the same object to be measured.
  • both photos A and B include images of the object to be measured.
  • Since the size of images captured by the same device is fixed, photos A and B both have length a and width b. The device can determine the area occupied by the object to be measured in the image from the regions S1 and S2 included in photos A and B.
  • The device can then determine the relative area of the apple to be measured from the known photo area a*b, where the relative area of the object to be measured is S1 in photo A and S2 in photo B.
  • S103 Determine the depth of the object to be measured in the image.
  • the depth of the object to be measured needs to be determined through S103.
  • The depth refers to the distance from the device to the object to be measured when the device used to capture the image captures the image. More specifically, the distance from the device to the object to be measured may be the vertical distance from the camera of the device to the nearest cut plane of the object to be measured at shooting time, for example, the distances L1 and L2 when the device 2 shown in FIG. 2 photographs the apple 1 to be measured.
  • the device determines the type of the object to be measured through S101, determines the area of the object to be measured in the image through S102, and determines the depth of the object to be measured in the image through S103.
  • the sequence of the three steps is not limited by their sequence numbers.
  • the device in this embodiment can execute the above steps S101, S102, and S103 simultaneously, or sequentially execute the above steps S101, S102, and S103 in any order.
  • S104 Determine the parameters of the object to be measured according to the type of the object to be measured, the area of the object to be measured, and the depth of the object to be measured.
  • Specifically, in S104 the device jointly determines the parameters of the object to be measured in the image from the three quantities obtained above: the category determined in S101, the area in the image determined in S102, and the depth in the image determined in S103.
  • More specifically, S104 includes: determining the parameters of the object to be measured by searching a mapping relationship for the correspondence among the category of the object to be measured, the area of the object to be measured, the depth of the object to be measured, and the parameters of the object to be measured.
  • the mapping relationship may be stored in a storage space of the device, and when the device needs to find a parameter of an object to be measured, the mapping relationship is read from the storage space.
  • the mapping relationship includes a correspondence between a category, an area, a depth, and a parameter of at least one object.
  • For example, taking the heat of the object as the parameter, the mapping relationship includes the "category-area-depth-heat" correspondence of apples and the corresponding one of pears. There can be multiple different correspondences for objects of the same category.
  • Taking apples as an example, the correspondences for apples can include: "Apple-20 cm²-20 cm-52 kcal" and "Apple-35 cm²-10 cm-70 kcal".
  • In one possible implementation of S104, the mapping relationship includes: "Apple-30 cm²-10 cm-52 kcal", "Apple-20 cm²-20 cm-52 kcal", "Apple-35 cm²-10 cm-70 kcal", and "Apple-10 cm²-39 cm-94 kcal".
  • When the device measures the heat parameter of an apple, the device determines that the category of the object to be measured in the image is apple, that the area of the apple to be measured in the image is 20 cm², and that the depth of the apple to be measured in the image is 20 cm.
  • The device compares the triple "Apple-20 cm²-20 cm" determined above with the correspondences in the mapping relationship.
  • The first three items of the second correspondence "Apple-20 cm²-20 cm-52 kcal" in the above mapping relationship match the three obtained values, so the correspondence among the category, area, depth, and heat of the apple under test is determined to be "Apple-20 cm²-20 cm-52 kcal", and the device determines that the calorie parameter of the apple to be measured is 52 kcal.
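  • To make the lookup concrete, the following is a minimal sketch of this "category-area-depth-parameter" search in Python. The table entries mirror the example correspondences above; the matching tolerance is an illustrative assumption, not something the text specifies.

```python
from typing import Optional

# Example correspondences from this embodiment:
# (category, area in cm^2, depth in cm) -> heat in kcal
MAPPING = {
    ("apple", 30.0, 10.0): 52,
    ("apple", 20.0, 20.0): 52,
    ("apple", 35.0, 10.0): 70,
    ("apple", 10.0, 39.0): 94,
}

def look_up_heat(category: str, area: float, depth: float,
                 tolerance: float = 1.0) -> Optional[int]:
    """Return the heat whose stored (category, area, depth) matches the
    measured triple within `tolerance`, or None if nothing matches."""
    for (cat, a, d), kcal in MAPPING.items():
        if cat == category and abs(a - area) <= tolerance \
                and abs(d - depth) <= tolerance:
            return kcal
    return None

print(look_up_heat("apple", 20.0, 20.0))  # -> 52, as in the example above
```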
  • In summary, in the method for measuring object parameters provided by this embodiment, the category of the object to be measured in the image is determined; the area of the object to be measured in the image is determined; and the depth of the object to be measured in the image is determined, the depth being the distance from the device used to capture the image to the object to be measured at shooting time. The parameters of the object to be measured are then determined according to the category, area, and depth of the object to be measured.
  • In this way, the influence of the shooting distance on the measured parameters is brought into the measurement, and the parameters of the object to be measured are jointly determined from its category and its area and depth in the image, thereby improving the accuracy of measuring object parameters.
  • Further, since in S104 the device needs to determine the parameters of the object to be measured by searching the mapping relationship for the correspondence among the category, area, depth, and parameters of the object to be measured, the mapping relationship needs to be determined in advance and stored in the device.
  • So that the device can always find a correspondence in the mapping relationship when determining the parameters of an object to be measured, and so that the correspondences cover the possible parameter range of objects to be measured as fully as possible to improve the accuracy of the parameters the device determines, it is necessary to cover objects of different categories and to determine the areas and depths in the correspondences of same-category objects under different parameters.
  • the method in the foregoing embodiment further includes: the device determines a correspondence relationship between a category, an area, a depth, and a parameter of at least one object, and determines a mapping relationship according to the determined above-mentioned correspondence relationship.
  • FIG. 5 is a schematic diagram of determining a correspondence relationship in a method for measuring an object parameter of the present application.
  • As shown in FIG. 5, take an apple as the object to be measured and the heat of the apple as the parameter.
  • To collect the areas of an apple of the same heat in images of different depths, an apple with a heat of 52 kcal is placed on the table. The device then photographs the apple at depths D1, D2, D3, D4, and/or D5 and at angles T1, T2, T3, T4, and/or T5, obtaining photos containing the reference object at different angles and different depths.
  • Images captured by the device anywhere on the surface of the sphere centered on the apple with radius D1 can all be considered images captured by the electronic device at depth D1.
  • All the correspondences in the above examples can be added to the mapping relationship, so that by searching these correspondences in the mapping relationship the device can determine the parameters of the object to be measured from the correspondence among the category of the object to be measured, the area of the object to be measured, the depth of the object to be measured, and the parameters of the object to be measured.
  • Obtaining the different correspondences in the mapping relationship in the above example can be summarized in the following steps (a minimal sketch follows): 1. Collect the areas of objects of the same category and the same parameters in photos of different depths. 2. Build the correspondence among the category, area, depth, and parameters of the objects. 3. Determine the correspondences of different categories and different parameters, and add all the determined correspondences to the mapping relationship for the device to query when measuring an object's parameters.
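  • As a rough illustration of these three steps, the sketch below builds mapping entries from calibration shots; the tuples are stand-ins for real measured data, not values from the patent.

```python
# Step 1: areas of same-category, same-parameter objects collected from
# photos taken at different depths (category, kcal, depth cm, area cm^2).
calibration_shots = [
    ("apple", 52, 10, 30.0),
    ("apple", 52, 20, 20.0),
    ("apple", 70, 10, 35.0),
    ("pear", 50, 18, 18.0),
]

# Steps 2 and 3: build the correspondences and add them to the mapping
# relationship the device will later query.
mapping = {}
for category, kcal, depth, area in calibration_shots:
    mapping[(category, area, depth)] = kcal
```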
  • Further, in the correspondence among the category, area, depth, and parameters of at least one object, when the at least one object has the same category and the same parameters, the area and the depth of the object are in a proportional relationship. Specifically, for objects of one category, when the parameters of objects of that category are constant, the area the object occupies in the images the device captures of objects of the same category and the same parameters is in a power of -2 relationship with the depth from the device to the object at shooting time.
  • For the images captured by the device in the above embodiment, the relationship between the area and the depth of objects of the same category and the same parameters can be realized by constructing a proportional characteristic curve, which can be computed from the camera imaging formula below, derived through the principle of pinhole imaging as the correspondence between a point on the image plane and the actual three-dimensional coordinates of the object:

$$u = f_x \frac{X}{T_z} + c_x, \qquad v = f_y \frac{Y}{T_z} + c_y, \qquad L = f_x \frac{a}{T_z}$$

  • Here fx, fy, cx, cy are the internal camera parameters of the capture apparatus used by the device, Tz is the depth value of the object, a is the length of the object in the three-dimensional coordinate system, and L is the length of the object in the two-dimensional coordinate system of the image. From the above formula, the relationship between the area S of the object in the image and the depth Tz can be deduced:

$$S \propto \frac{f_x f_y \, a \, b}{T_z^{2}}, \qquad \frac{S}{S_0} \sim T_z^{-2}$$

  • Since the area on the image plane is in a power of 2 relationship with the disparity disp, it can be deduced that the area S of the object in the image and the depth Tz of the object are in a power of -2 relationship; S0 denotes the area in the correspondence of an object of the same type as the object to be measured, and S denotes the area of the object to be measured in the image.
  • More specifically, the area S of the object in the image and the depth Tz have a linear or non-linear relationship, and more precisely a power of -2 relationship.
  • FIG. 6 is a schematic diagram showing the relationship between the depth and the area of an object in this application.
  • The schematic diagram in FIG. 6 shows that a curve of the above power of -2 relationship can be fitted from measured data of the depth and area of an object.
  • S0 represents the area occupied by the object in the image
  • Tz represents the depth of the object in the image.
  • the device can be used to shoot objects of the same parameter and type from different depths, and determine the relationship between the area and the depth of the same object from the captured images.
  • In FIG. 6, an apple is taken as the object to be measured, and an apple with a heat of 50 kcal is selected for shooting.
  • In the image captured by the device at a depth of 100 mm from the apple, the area of the apple is 30 cm², giving a depth-area pair (100, 30) for the 50 kcal apple; this point is recorded in FIG. 6.
  • Likewise, in the image captured by the device at a distance of 200 mm of the same apple, or of another 50 kcal apple, the area of the apple is 8 cm², giving another depth-area pair (200, 8), which is also recorded in FIG. 6. All depth-area pairs obtained by photographing the same apple at different depths are recorded in FIG. 6. Fitting all the depth-area pairs yields a curve, and it can be seen from FIG. 6 that this curve is exactly the power of -2 relationship between depth and area derived from the formula above.
  • Optionally, the above fitting may use a method commonly used in the prior art, such as the least squares method, interpolation, or approximation, which is not limited herein.
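  • As a sketch of this fitting step, the snippet below fits the coefficient of the S = k·Tz⁻² model by least squares from depth-area samples; the points (100, 30) and (200, 8) come from the example above, and the middle point is an assumed extra measurement.

```python
import numpy as np

# (depth Tz in mm, area S in cm^2)
depths = np.array([100.0, 140.0, 200.0])
areas = np.array([30.0, 15.5, 8.0])

# Linearise S = k * Tz**-2: with x = Tz**-2 this is a slope-only fit,
# whose least-squares solution is sum(x*y) / sum(x*x).
x = depths ** -2.0
k = float(np.dot(x, areas) / np.dot(x, x))

predicted = k * 250.0 ** -2.0  # predicted area at an unseen 250 mm depth
print(f"k = {k:.0f}, area at 250 mm = {predicted:.2f} cm^2")
```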
  • In that case, the correspondences stored in the mapping relationship can also take the form "category-area/depth relationship curve-parameter".
  • For example, the correspondences may be "Apple-area/depth relationship curve 1-52 kcal" and "Apple-area/depth relationship curve 2-70 kcal". Then, when the data the device obtains for the object to be measured is "Apple-35 cm²-10 cm", the measured area and depth are substituted into relationship curve 1 and relationship curve 2 in the correspondences. If the area and depth of the object to be measured satisfy relationship curve 1, the calorie value of the apple to be measured is determined to be 52 kcal.
  • Optionally, the mapping relationship may also contain correspondences in both of the two forms, "category-area-depth-parameter" and "category-area/depth relationship curve-parameter".
  • Optionally, when the device queries the mapping relationship to confirm the parameters of the object to be measured, if both forms of correspondence produce results, the parameters of the object to be measured should be determined from the "category-area/depth relationship curve-parameter" correspondence.
  • Optionally, when determining the category of the object to be measured in S101, deep learning may specifically be used to recognize features of the object to be measured in the image in order to determine the category of the object to be measured.
  • The deep learning algorithm uses the principle of machine learning to identify the category of the object to be measured: the deep learning network determines in advance the features of objects of known categories in a number of images, and records the object categories and the known features in the network.
  • When the device needs to determine the category of an object to be measured, it inputs the image of the object into the deep learning network; the network extracts the features of the object to be measured in the image and compares them with the features stored in the network to determine the category.
  • It should be noted that the method of determining the category of the object to be measured by deep learning provided in this embodiment is merely an example, and the deep learning algorithm itself is not specifically limited; for aspects of the algorithm not listed, reference may be made to methods in the field.
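  • As a hedged illustration only: the patent does not name a specific network, but the classification step could look like the sketch below, which runs a pretrained torchvision ResNet-50 over the photo. The file name and the choice of network and weights are assumptions.

```python
import torch
from PIL import Image
from torchvision import models, transforms

# Pretrained classifier standing in for the deep learning network that
# stores known categories and their features.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open("food.jpg").convert("RGB")  # hypothetical input photo
with torch.no_grad():
    logits = model(preprocess(image).unsqueeze(0))
category_index = int(logits.argmax(dim=1))  # index of the predicted class
```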
  • Optionally, when the area of the object to be measured in the image is determined in S102, the area of the object to be measured may be determined according to a determined heat map of the image.
  • the heat map is used to indicate the heat distribution of objects in the image, and different colors in the heat map are used to represent different heat regions in the image.
  • When the device extracts the area where the object to be measured is located, the device can first obtain the heat map of the entire image, and then frame the region of the heat map whose heat is greater than a confidence level as the region of the object to be measured.
  • The confidence level is a heat value set according to experience; a region whose heat exceeds it can be considered to contain the object to be measured.
  • Optionally, a second segmentation may be performed on the region determined from the heat map, separating the object from the background according to the different colors within the framed region of the object to be measured.
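  • A minimal sketch of this framing step, assuming the heat map is available as a 2-D array: keep the pixels whose value exceeds the empirically set confidence level and take their bounding box as the region of the object to be measured.

```python
import numpy as np

def object_region(heat_map: np.ndarray, confidence: float):
    """Bounding box (row_min, row_max, col_min, col_max) of the region
    whose heat exceeds the confidence level, or None if none does."""
    rows, cols = np.nonzero(heat_map > confidence)
    if rows.size == 0:
        return None
    return rows.min(), rows.max(), cols.min(), cols.max()

heat_map = np.random.rand(224, 224)  # stand-in for a real network heat map
print(object_region(heat_map, confidence=0.95))
```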
  • the device can also perform heat map extraction of the image by means of deep learning.
  • In this case, the deep learning network also needs to perform feature extraction in advance on images of objects of different heat, and store the features of objects of different heat in images, so that the heat corresponding to the object to be measured can be determined from its features in the image.
  • Optionally, the deep learning network model can be implemented by combining residual networks (ResNets), which converge more easily, with inception networks (InceptionNet), which can reduce the number of network parameters.
  • Further, a global average pooling layer is added after the network to generate the heat map of the image, facilitating the positioning of the object to be measured in the image.
  • Finally, after determining the region containing the object to be measured in the image, the device can segment the object by Gaussian mixture modeling and contour filling, and determine the area of the object to be measured in the image from the area of the segmented region.
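  • The snippet below sketches that final area step with OpenCV contour filling; using Otsu thresholding instead of Gaussian mixture modeling is a simplifying assumption, and the pixel count would still need the depth-based scale to become a physical area.

```python
import cv2
import numpy as np

def object_area_pixels(region: np.ndarray) -> float:
    """Area in pixels of the largest filled contour in a grayscale crop
    (uint8) around the object to be measured."""
    _, binary = cv2.threshold(region, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0.0
    return cv2.contourArea(max(contours, key=cv2.contourArea))
```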
  • It should be noted that the method of determining the area of the object to be measured by deep learning provided in this embodiment is merely an example, and the deep learning algorithm itself is not specifically limited; for aspects of the algorithm not listed, reference may be made to methods in the field.
  • Optionally, when the depth of the object to be measured in the image is determined in S103, a ranging method corresponding to the environmental information may be determined according to the environmental information when the device photographs the object to be measured; the electronic device then determines the depth of the object to be measured by the determined ranging method.
  • The ranging method may be: determining the depth of the object to be measured by the parallax of the device's dual cameras; or by the device's auto-focus (AF) camera; or by laser ranging with a sensor of the device; or by the time of flight (TOF) of the device's camera; or by the structured light of the device's camera.
  • The environmental information may be the lighting, the contrast, whether there is fog, the estimated depth of the object to be measured, and the like when the device captures the image.
  • For example, the device may determine the actual depth of the object to be measured, that is, the actual precise distance between the device and the object, by laser ranging; or the device may determine that distance by the parallax of its dual cameras.
  • For example, laser ranging may be used to determine the depth of the object to be measured, or, when strong light at shooting time would interfere with the laser, the dual-camera parallax method may be used to determine the depth of the object to be measured.
  • A possible implementation of measuring the shooting distance in this step is to determine the depth from the device to the object to be measured through the parallax of the device's dual cameras.
  • The parallax of the dual cameras refers to the parallax between the two images generated by the two different cameras of the device when shooting the same scene. Specifically, the parallax of the left image and the right image is determined first, the left image and the right image being the images, each containing the object to be measured, generated by the different cameras of the device when capturing the same scene; the depth of the object to be measured in the foregoing image is then determined according to that parallax.
  • Here, the image of the object to be measured is the single image containing the object that results from combining the left and right images obtained by the device's dual cameras and blurring the background.
  • As shown in FIG. 7, O_L and O_R are the optical centers of the left and right cameras, and the two line segments of length L represent the image planes of the left and right cameras. The shortest distance from each optical center to its image plane is the focal length f, and b is the baseline between the two cameras.
  • P is a point in the world coordinate system, and its imaging points on the left and right image planes are P_L and P_R, whose distances from the left edges of their respective image planes are X_L and X_R. The parallax is X_R - X_L or X_L - X_R.
  • The triangle P_L P_R P is similar to the triangle O_L O_R P, which gives the proportional relationship

$$\frac{b - (X_L - X_R)}{b} = \frac{Z - f}{Z}$$

from which it can be deduced that

$$Z = \frac{f \, b}{X_L - X_R}$$

where X_L - X_R (equivalently X_R - X_L, depending on the sign convention) is the parallax, that is, the difference between the horizontal coordinates at which the target point is imaged on the left and right views. Using the principle of image matching together with the parallax, the spatial three-dimensional coordinates (X, Y, Z) of the scene can be obtained.
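  • A worked example of the derived formula Z = f·b/(X_L - X_R), with illustrative numbers only:

```python
def depth_from_disparity(f_pixels: float, baseline_mm: float,
                         disparity_pixels: float) -> float:
    """Depth Z in millimetres from the stereo triangulation formula."""
    return f_pixels * baseline_mm / disparity_pixels

# Assumed values: focal length 1000 px, baseline 50 mm, disparity 25 px.
print(depth_from_disparity(1000.0, 50.0, 25.0))  # -> 2000.0 mm
```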
  • FIG. 8 is a flowchart of determining the depth of the device and the object to be measured according to the parallax of the dual cameras.
  • As shown in FIG. 8, the dual cameras of the device capture the left and right images of the same scene respectively, and the left and right images are then preprocessed.
  • The preprocessing synchronizes the left and right images (timestamp alignment), after which matching is performed on the synchronized images.
  • Specifically, the device extracts feature points from the synchronized left and right images.
  • The corresponding feature points in the left image and the right image are determined to form feature point pairs, and the parallax X_R - X_L between the points of each pair is calculated.
  • The resulting Z is the vertical distance from the camera of the device to the nearest cut plane of the object to be measured, that is, the depth of the object to be measured. It should be noted that the method of determining the depth of the object to be measured using the parallax of dual cameras provided in this embodiment is only an example, and the algorithm itself is not specifically limited; for aspects of the algorithm not listed, reference may be made to methods in the field.
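  • As a hedged sketch of the flow in FIG. 8, the snippet below matches feature points between the synchronized left and right images and converts each pair's parallax to a depth; ORB features and brute-force matching are assumed choices, since the patent does not name a specific matcher.

```python
import cv2
import numpy as np

def stereo_depths(left: np.ndarray, right: np.ndarray,
                  f_pixels: float, baseline_mm: float) -> np.ndarray:
    """Depths of matched feature points from a rectified left/right pair."""
    orb = cv2.ORB_create()
    kp_l, des_l = orb.detectAndCompute(left, None)
    kp_r, des_r = orb.detectAndCompute(right, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_l, des_r)

    depths = []
    for m in matches:
        x_l = kp_l[m.queryIdx].pt[0]  # horizontal coordinate in left image
        x_r = kp_r[m.trainIdx].pt[0]  # horizontal coordinate in right image
        disparity = abs(x_l - x_r)
        if disparity > 1e-6:
            depths.append(f_pixels * baseline_mm / disparity)
    return np.array(depths)
```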
  • FIG. 9 is a schematic structural diagram of an embodiment of a parameter measurement system of the present application.
  • The system provided by this embodiment can be used to implement the method of measuring object parameters of any of the foregoing embodiments.
  • On the basis of the foregoing method, an AR measurement switch is provided in a module of the system shown in FIG. 9 to enable and disable the object parameter measurement in the device. If the AR measurement switch is turned on, the binocular measurement, classification and segmentation, contour extraction, and calorie estimation modules in the system framework can be used to measure the parameters of the object after the device obtains the image of the food to be measured.
  • The binocular measurement module calculates the depth and parallax information of spatial points based on the principle of stereo vision; a specific implementation is shown in S103 in FIG. 4.
  • the image classification module is used to determine the kind of object to be measured through deep learning.
  • the image pre-segmentation module and contour extraction module are used to determine the area of the object to be measured in the image according to the heat map algorithm.
  • the specific implementation is shown in S102 in Figure 4.
  • the depth map acquisition module is used to obtain the depth of the object to be measured obtained by the binocular measurement module.
  • The depth of the object to be measured can be recorded as 10-bit data, in which the first 8 bits store the integer part and the last 2 bits store the fractional part; the more bits used, the higher the precision of the represented depth.
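  • A small sketch of that 10-bit encoding, assuming the split described above (8 integer bits, 2 fractional bits, i.e. quarter-unit resolution; the depth unit itself is not specified in the text):

```python
def encode_depth(depth: float) -> int:
    """Pack a depth value into 10 bits: 8 integer + 2 fractional bits."""
    quarter_units = round(depth * 4)           # 2 fractional bits = 1/4 steps
    return max(0, min(quarter_units, 0x3FF))   # clamp to the 10-bit range

def decode_depth(code: int) -> float:
    return (code & 0x3FF) / 4.0

print(decode_depth(encode_depth(37.25)))  # -> 37.25
```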
  • the calorie estimation module determines the parameters of the object to be measured according to the type of the object to be measured, the area of the object to be measured, and the depth of the object to be measured.
  • the specific implementation manner is shown as S104 in FIG. 4.
  • FIG. 10 is a schematic flowchart of an embodiment of a parameter measurement system of the present application.
  • The process shown in FIG. 10 can be executed in the system shown in FIG. 9.
  • After the system in the device starts the dual cameras, it first determines whether the AR calorie measurement function is turned on. If it is off, the device records the image after normal shooting and ends the procedure; if it is on, the system in the device continues to execute the procedure that measures the parameters of the object.
  • The procedure for measuring object parameters first determines whether the user-input function switch is turned on. If it is on, the category of the object to be measured is obtained from the user's input to the device; if it is off, the category of the object to be measured in the image is determined according to the method of S101 in the foregoing embodiment.
  • After the category of the object to be measured is determined, the region of the object to be measured in the image is determined according to the heat map algorithm; the area of the object to be measured in the image is further determined through the depth map and the image color information; finally, the calorie estimate of the object to be measured is determined by the "empirical ratio method" of S104 in the foregoing embodiment according to the category, area, and depth of the object to be measured.
  • FIG. 11 is a schematic structural diagram of a first embodiment of a device for measuring an object parameter of the present application.
  • the apparatus for measuring an object parameter in this embodiment includes an identification module 1101, a first determination module 1102, a second determination module 1103, and a third determination module 1104.
  • The identification module 1101 is used to identify the category of the object to be measured in the image.
  • The first determination module 1102 is used to determine the area of the object to be measured in the image.
  • The second determination module 1103 is used to determine the depth of the object to be measured in the image, where the depth of the object is the distance from the device used to capture the image to the object to be measured when the image is captured.
  • The third determination module 1104 is used to determine the parameters of the object to be measured according to the category of the object to be measured, the area of the object to be measured, and the depth of the object to be measured.
  • the apparatus for measuring object parameters in the embodiment shown in FIG. 11 can be used to implement the technical solution of the method embodiment for measuring object parameters shown in FIG. 4.
  • the implementation principles and technical effects are similar, and details are not described herein again.
  • the third determining module 1104 is specifically configured to determine a parameter of the object to be measured corresponding to the category of the object to be measured, the area of the object to be measured, and the depth of the object to be measured by searching the mapping relationship, and the mapping relationship includes at least one object Correspondence between categories, areas, depths, and parameters.
  • FIG. 12 is a schematic structural diagram of a second embodiment of a device for measuring an object parameter of the present application.
  • the apparatus for measuring object parameters in this embodiment further includes a fourth determination module 1105 on the basis of FIG. 11.
  • the fourth determining module 1105 is configured to determine a correspondence between a category, an area, a depth, and a parameter of the at least one object; the fourth determining module 1105 is further configured to, based on the category, the area, the depth, and the parameter, of the at least one object The corresponding relationship determines the mapping relationship.
  • the apparatus for measuring object parameters in the embodiment shown in FIG. 12 may be used to implement the technical solutions of the foregoing method embodiments.
  • the implementation principles and technical effects are similar, and details are not described herein again.
  • the correspondence between the category, area, depth, and parameters of at least one object includes: when the category of at least one object is the same and the parameters are the same, the area and depth of the object are proportional.
  • the second determining module 1103 is specifically configured to determine a ranging method corresponding to the environmental information according to the environmental information when the device photographs the object to be measured; and determine the distance of the object to be measured by the ranging method. depth.
  • The ranging method includes: determining the depth of the object to be measured by the parallax of the device's dual cameras; or determining the depth of the object to be measured by the device's auto-focus (AF) camera; or determining the depth of the object to be measured by laser ranging with a sensor of the device; or determining the depth of the object to be measured by the time of flight (TOF) of the device's camera; or determining the depth of the object to be measured by the structured light of the device's camera.
  • the parameter of the object to be measured is used to indicate the heat of the object to be measured.
  • the apparatus for measuring an object parameter in each of the foregoing embodiments may be used to implement the technical solutions of the foregoing method embodiments, and the implementation principles and technical effects thereof are similar, and details are not described herein again.
  • the division of the modules in the embodiments of the present application is schematic, and is only a logical function division. In actual implementation, there may be another division manner.
  • the functional modules in the embodiments of the present application may be integrated into one processing module, or each module may exist separately physically, or two or more modules may be integrated into one module.
  • the above integrated modules can be implemented in the form of hardware or software functional modules.
  • the integrated module When the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • The technical solution of the present application, in essence the part that contributes beyond the existing technology, or all or part of the technical solution, can be embodied in the form of a software product, which is stored in a storage medium and includes a number of instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application.
  • The aforementioned storage media include media that can store program code, such as USB flash drives, removable hard disks, read-only memory (ROM), random access memory (RAM), magnetic disks, or optical discs.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • The computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another in a wired or wireless manner.
  • The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device, such as a server or a data center, integrating one or more available media.
  • The available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)), and the like.
  • Optionally, part or all of the above modules can also be implemented by being embedded in a chip of the terminal device in the form of an integrated circuit, and they can be implemented separately or integrated together. That is, the above modules can be configured as one or more integrated circuits implementing the above methods, for example: one or more application specific integrated circuits (ASIC), or one or more digital signal processors (DSP), or one or more field programmable gate arrays (FPGA).
  • FIG. 13 is a schematic structural diagram of a third embodiment of a device for measuring an object parameter of the present application.
  • As shown in FIG. 13, the device includes a processor 1301, a memory 1302, and a transceiver 1303.
  • the memory 1302 is configured to store a program that implements each module of the foregoing method embodiment; the processor 1301 calls the program to perform the operations of the above method embodiment of measuring an object parameter.
  • the present application further provides a storage medium including a readable storage medium and a computer program, where the computer program is used to implement the method for measuring an object parameter provided by any of the foregoing embodiments.
  • the present application also provides a program product, which includes a computer program (ie, an execution instruction), and the computer program is stored in a readable storage medium.
  • At least one processor of the device may read the computer program from the readable storage medium, and the at least one processor executes the computer program to cause the device to implement the method for measuring object parameters provided by the foregoing embodiments.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A method and device for measuring object parameters, the method including: S101, identifying the category of an object to be measured in an image; S102, determining the area of the object to be measured in the image; S103, determining the depth of the object to be measured in the image, the depth of the object to be measured being the distance from the device used to capture the image to the object to be measured when the device captures the image; and S104, determining the parameters of the object to be measured according to the category of the object to be measured, the area of the object to be measured, and the depth of the object to be measured. The method and device for measuring object parameters can bring the influence of the shooting distance on the measured parameters into the measurement, jointly determining the parameters of the object to be measured from its category, its area in the image, and its depth, thereby improving the accuracy of measuring object parameters.

Description

Method and Device for Measuring Object Parameters

TECHNICAL FIELD

This application relates to the field of terminal technologies, and in particular, to a method and device for measuring object parameters.

BACKGROUND

With improving living standards, unhealthy conditions such as obesity appear more and more in people's daily lives, making people pay closer attention to the effect on the body of the calories taken in through the daily diet. Service providers and users alike hope to offer advice and help for the user's diet by measuring the calories of food during daily meals, making it convenient for users to maintain good eating habits.

The prior art provides non-contact calorie measurement methods. Such a method usually requires the user to photograph the food with an electronic device such as a mobile phone or camera; the electronic device then recognizes the type of food in the photo and shows the user an empirical value of the number of calories contained in 100 grams of that type of food. Alternatively, the electronic device estimates the calorie value of the food from the size of the food in the captured photo by means of empirical values. For example, FIG. 1 is a schematic diagram of non-contact calorie measurement. In FIG. 1, an apple is taken as the object to be measured. When the user needs to measure the calorie value contained in the apple 1 placed on the table, the user can photograph the apple 1 with the camera of the electronic device 2 to obtain an image 3 containing the apple 1 to be measured, and the electronic device 2 recognizes through image processing technology that the type of food in the image 3 is an apple. The electronic device 2 can then display on its screen the calorie value contained in 100 grams of apple; or the electronic device 2 estimates the actual size of the apple to be measured from the area occupied by the apple 1 in the image 3, and estimates the calorie value of the apple 1 from the estimated size.

However, as shown in FIG. 2 and FIG. 3, FIG. 2 is a schematic diagram of the distance between the electronic device and the object to be measured in the non-contact calorie measurement method, and FIG. 3 is a schematic diagram of processing an image containing the object to be measured in the non-contact calorie measurement method. When the user photographs the actual apple 1 to be measured with the camera of the electronic device 2, there is no uniform standard for the distance to the object to be measured, and the user may photograph the apple 1 from different distances. When the electronic device 2 is close to the apple 1, the apple occupies a larger area in the captured photo; when the electronic device 2 is far from the apple 1, the apple 1 occupies a smaller area in the captured photo. For example, after image processing, photo A in FIG. 3, captured with the electronic device 2 at a distance L1 from the apple 1, shows the apple 1 with size S1, while photo B in FIG. 3, captured at a distance L2, shows the apple 1 with size S2, where S1 < S2. Therefore, for the same apple 1 to be measured, the area occupied by the apple 1 differs between photos A and B taken by the electronic device 2 at different distances. Yet when the electronic device estimates the calorie value of the object to be measured, it relies only on the relative area of the food in the photo. As a result, the calorie values that the electronic device 2 finally determines for the same apple from photos A and B will differ, making the estimate of the apple's calorie value inaccurate. That is, in the prior art, when estimating the calorie value of food, the electronic device estimates it from the size of the food in the captured photo, but that size is only a relative area within the specific photo. In actual use, when the electronic device is close to the food the food occupies a larger area in the photo, and when it is far away the food occupies a smaller area, so that for the same food the difference in its size across photos causes different calorie estimates. This leads to poor accuracy in measuring calories in the prior art.
SUMMARY

This application provides a method and device for measuring object parameters, improving the accuracy of measuring object parameters (such as calories).

A first aspect of this application provides a method for measuring object parameters, including:

identifying the category of an object to be measured in an image;

determining the area of the object to be measured in the image;

determining the depth of the object to be measured in the image, where the depth of the object to be measured is the shooting distance from the device used to capture the first image to the object to be measured when the device captures the first image;

determining a first parameter of the object to be measured according to the category of the object to be measured, the area of the object to be measured, and the depth of the object to be measured.

In an embodiment of the first aspect of this application, the determining the parameters of the object to be measured according to the category, area, and depth of the object to be measured includes:

determining, by searching a mapping relationship, the parameters of the object to be measured corresponding to the category of the object to be measured, the area of the object to be measured, and the depth of the object to be measured, where the mapping relationship contains correspondences among the category, area, depth, and parameters of at least one object.

In an embodiment of the first aspect of this application, the method further includes:

determining the correspondence among the category, area, depth, and parameters of at least one object;

determining the mapping relationship according to the correspondence among the category, area, depth, and parameters of the at least one object.

In an embodiment of the first aspect of this application, the correspondence among the category, area, depth, and parameters of the at least one object includes:

when the category of the at least one object is the same and the parameters are the same, the area and the depth of the at least one object are in a proportional relationship.

In an embodiment of the first aspect of this application, the area and the depth of the object are in a linear or non-linear relationship.

In an embodiment of the first aspect of this application, the area and the depth of the object are in an inverse relationship, and the shooting distance and the area are in a power of -2 relationship.

In an embodiment of the first aspect of this application, the determining the category of the object to be measured in the image includes:

identifying features of the object to be measured in the image by deep learning to determine the category of the object to be measured.

In an embodiment of the first aspect of this application, the determining the area of the object to be measured in the image includes:

determining a heat map of the image;

determining the area of the object to be measured according to the heat map of the image.

In an embodiment of the first aspect of this application, the acquiring the depth of the object to be measured in the image includes:

determining, according to environmental information when the device photographs the object to be measured, a ranging method corresponding to the environmental information;

determining the depth between the device and the object to be measured by the ranging method.

In an embodiment of the first aspect of this application, the ranging method includes: determining the depth of the object to be measured by the parallax of the device's dual cameras; or determining the depth of the object to be measured by the device's auto-focus (AF) camera; or determining the depth of the object to be measured by laser ranging with a sensor of the device; or determining the depth of the object to be measured by the time of flight (TOF) of the device's camera; or determining the depth of the object to be measured by the structured light of the device's camera.

In an embodiment of the first aspect of this application, the parameters of the object to be measured are used to indicate the heat of the object to be measured.

In an embodiment of the first aspect of this application, the parameter of the object to be measured is the calorie value of the object to be measured.

In summary, in the method for measuring object parameters provided by the first aspect of this application, the category of the object to be measured and its area and depth in the image are identified, and the parameters of the object to be measured are determined according to the category, the area in the image, and the depth. The influence of depth on the measured parameters is thus brought into the measurement, improving the accuracy of measuring object parameters by a non-contact method.
A second aspect of this application provides a device for measuring object parameters, including:

an identification module, configured to identify the category of an object to be measured in an image;

a first determining module, configured to determine the area of the object to be measured in the image;

a second determining module, configured to determine the depth of the object to be measured in the image, where the depth of the object to be measured is the distance from the device used to capture the image to the object to be measured when the device captures the image;

a third determining module, configured to determine the parameters of the object to be measured according to the category of the object to be measured, the area of the object to be measured, and the depth of the object to be measured.

In an embodiment of the second aspect of this application, the third determining module is specifically configured to:

determine, by searching a mapping relationship, the parameters of the object to be measured corresponding to the category of the object to be measured, the area of the object to be measured, and the depth of the object to be measured, where the mapping relationship contains correspondences among the category, area, depth, and parameters of at least one object.

In an embodiment of the second aspect of this application, the device further includes:

a fourth determining module, configured to determine the correspondence among the category, area, depth, and parameters of at least one object;

the fourth determining module is further configured to determine the mapping relationship according to the correspondence among the category, area, depth, and parameters of the at least one object.

In an embodiment of the second aspect of this application, the correspondence among the category, area, depth, and parameters of the at least one object includes:

when the category of the at least one object is the same and the parameters are the same, the area and the depth of the at least one object are in a proportional relationship.

In an embodiment of the second aspect of this application, the area and the depth of the object are in a linear or non-linear relationship.

In an embodiment of the second aspect of this application, the area and the depth of the object are in an inverse relationship, and the shooting distance and the area are in a power of -2 relationship.

In an embodiment of the second aspect of this application, the identification module is specifically configured to:

identify features of the object to be measured in the image by deep learning to determine the category of the object to be measured.

In an embodiment of the second aspect of this application, the first determining module is specifically configured to:

determine a heat map of the image;

determine the area of the object to be measured according to the heat map of the image.

In an embodiment of the second aspect of this application, the second determining module is specifically configured to determine, according to environmental information when the device photographs the object to be measured, a ranging method corresponding to the environmental information, and to determine the depth of the object to be measured by the ranging method.

In an embodiment of the second aspect of this application, the ranging method includes: determining the depth of the object to be measured by the parallax of the device's dual cameras; or determining the depth of the object to be measured by the device's auto-focus (AF) camera; or determining the depth of the object to be measured by laser ranging with a sensor of the device; or determining the depth of the object to be measured by the time of flight (TOF) of the device's camera; or determining the depth of the object to be measured by the structured light of the device's camera.

In an embodiment of the second aspect of this application, the parameters of the object to be measured are used to indicate the heat of the object to be measured.

In an embodiment of the second aspect of this application, the parameter of the object to be measured is the calorie value of the object to be measured.

In summary, in the device for measuring object parameters provided by the second aspect of this application, the category of the object to be measured and its area and depth in the image are identified, and the first parameter of the object to be measured is determined according to the category, the area in the image, and the depth. The influence of depth on the measured parameters is thus brought into the measurement, improving the accuracy of measuring object parameters by a non-contact method.

In a third aspect, an embodiment of this application provides a device for measuring object parameters, including: a processor and a memory; the memory is configured to store a program; the processor is configured to call the program stored in the memory to perform the method for measuring object parameters according to any one of the first aspect of this application.

In a fourth aspect, an embodiment of this application provides a computer-readable storage medium, where the computer-readable storage medium stores program code, and when the program code is executed, the method for measuring object parameters according to any one of the first aspect of this application is performed.
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a schematic diagram of non-contact calorie measurement;

FIG. 2 is a schematic diagram of the distance between the electronic device and the object to be measured in the non-contact calorie measurement method;

FIG. 3 is a schematic diagram of processing an image containing the object to be measured in the non-contact calorie measurement method;

FIG. 4 is a schematic flowchart of Embodiment 1 of the method for measuring object parameters of this application;

FIG. 5 is a schematic diagram of determining correspondences in the method for measuring object parameters of this application;

FIG. 6 is a schematic diagram of the relationship between the depth and the area of an object of this application;

FIG. 7 is a principle diagram of determining the depth between the device and the object to be measured from the parallax of dual cameras;

FIG. 8 is a flowchart of determining the depth between the device and the object to be measured from the parallax of dual cameras;

FIG. 9 is a schematic structural diagram of an embodiment of the parameter measurement system of this application;

FIG. 10 is a schematic flowchart of an embodiment of the parameter measurement system of this application;

FIG. 11 is a schematic structural diagram of Embodiment 1 of the device for measuring object parameters of this application;

FIG. 12 is a schematic structural diagram of Embodiment 2 of the device for measuring object parameters of this application;

FIG. 13 is a schematic structural diagram of Embodiment 3 of the device for measuring object parameters of this application.

DETAILED DESCRIPTION

The embodiments of this application are described below with reference to the accompanying drawings in the embodiments of this application.
FIG. 4 is a schematic flowchart of Embodiment 1 of the method for measuring object parameters of this application. As shown in FIG. 4, the object parameter measurement method provided by this application includes:

S101: Identify the category of the object to be measured in the image.

The execution subject of the method for measuring object parameters in this embodiment may be any device with a data processing function. The device may also be called an electronic device, a terminal, user equipment (UE), a mobile station (MS), a mobile terminal (MT), and so on. For example, the device may be a mobile phone, a tablet (Pad), a computer with wireless transceiver function, a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, a wireless terminal in industrial control, a wireless terminal in self driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, and so on.

Specifically, when the device in this embodiment needs to measure the parameters of the object to be measured, it can identify the category of the object to be measured in the image through S101. The image in this embodiment is an image containing the object to be measured, captured by the device that measures the object parameters. Alternatively, the image containing the object to be measured may be captured by another device and obtained by the device that measures the object parameters.

For the parameters to be measured, objects of different categories have different parameters; that is, the category of the object to be measured is used to distinguish objects to be measured, so that the parameters of objects of different categories can be measured more accurately. For example, when the object to be measured is a food whose heat can be calculated, such as fruit or meat, different foods usually have different calorie values, so the kinds of food, such as fruit and meat, are used as the categories of the objects to be measured. For example, food categories can include apples, pears, chicken, beef, and so on.

Optionally, in this embodiment, the parameters of the object to be measured may be used to indicate the heat of the object to be measured, and the heat of the object to be measured can be expressed by its calorie value. The following embodiments of this application are all described with the parameter being a calorie value as an example; the method for measuring object parameters in this application may also be used when the device measures other physical or length parameters of an object, which is not limited herein.
S102: Determine the area of the object to be measured in the image.

Specifically, when the device in this embodiment needs to measure the parameters of the object to be measured, it determines the area of the object to be measured in the image through S102. The area of the object to be measured in the image determined by the device may be the area of the object relative to the image.

For example, photo A and photo B shown in FIG. 3 may be images captured by the same device that contain the same object to be measured. Taking the object to be measured as an apple, both photo A and photo B include the image of the object to be measured. Since the size of images captured by the same device is fixed, photo A and photo B both have length a and width b. The device can determine the area the object to be measured occupies in the image from the regions S1 and S2 included in photos A and B, and can then determine the relative area of the apple to be measured from the known photo area a*b, where the relative area of the object to be measured is S1 in photo A and S2 in photo B.
S103:确定图像中待测量物体的深度。
具体地,本实施例中的设备需要测量待测量物体的参数时,需要通过S103确定待测量物体的深度。其中,深度指的是用于拍摄图像的设备在拍摄图像时,设备到待测量物体的距离。更为具体地,设备到待测量物体的距离可以是设备在拍摄图像时,设备的摄像头到待测量物体最近切面的垂直距离,例如图2中所示的设备2在拍摄待测量物体苹果1时的距离L1和L2。
It should be noted that, in this embodiment, the device determines the category of the object through S101, its area in the image through S102, and its depth in the image through S103. The order of these three steps is not limited by their numbering; the device may perform S101, S102, and S103 simultaneously, or perform them one after another in any order.
S104: Determine the parameter of the object to be measured according to the category, the area, and the depth of the object to be measured.
Specifically, in S104 the device determines the parameter of the object in the image jointly from the three quantities: the category determined in S101, the area in the image determined in S102, and the depth in the image determined in S103.
More specifically, S104 includes: determining the parameter of the object to be measured by looking up a mapping relationship that records the correspondences among the category, the area, the depth, and the parameter of the object. The mapping relationship may be stored in the storage space of the device, and the device reads it from the storage space when it needs to look up the parameter of the object to be measured. The mapping relationship includes correspondences among the category, area, depth, and parameter of at least one object.
For example, taking the heat of the object as the parameter, the mapping relationship includes "category-area-depth-heat" correspondences for apples and for pears. Multiple correspondences may exist for the same category of object; taking apples as an example, the correspondences among the category, area, depth, and heat of apples may include "apple-20 cm²-20 cm-52 kcal" and "apple-35 cm²-10 cm-70 kcal".
In the above example, more specifically, one possible implementation of S104 is as follows. The mapping relationship includes the correspondences "apple-30 cm²-10 cm-52 kcal", "apple-20 cm²-20 cm-52 kcal", "apple-35 cm²-10 cm-70 kcal", and "apple-10 cm²-39 cm-94 kcal". When the device measures the heat parameter of an apple, it determines that the category of the object in the image is apple, that the area of the apple in the image is 20 cm², and that the depth of the apple in the image is 20 cm. The device compares the determined triple "apple-20 cm²-20 cm" with the correspondences in the mapping relationship. The first three items of the second correspondence, "apple-20 cm²-20 cm-52 kcal", match the three obtained quantities, so the correspondence among the category, area, depth, and heat of the apple under test is determined to be "apple-20 cm²-20 cm-52 kcal", and the device determines the heat parameter of the apple to be 52 kcal.
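A minimal Python sketch of this lookup follows; the entries reproduce the example correspondences above, while the matching tolerances are illustrative assumptions, since the application does not specify how near-matches are resolved.

```python
# "category-area-depth-parameter" correspondences from the example above.
MAPPING = [
    ("apple", 30.0, 10.0, 52),
    ("apple", 20.0, 20.0, 52),
    ("apple", 35.0, 10.0, 70),
    ("apple", 10.0, 39.0, 94),
]

def look_up_parameter(category, area_cm2, depth_cm, area_tol=2.0, depth_tol=2.0):
    for cat, area, depth, kcal in MAPPING:
        if cat == category and abs(area - area_cm2) <= area_tol \
                and abs(depth - depth_cm) <= depth_tol:
            return kcal
    return None  # no correspondence found in the mapping relationship

print(look_up_parameter("apple", 20.0, 20.0))  # -> 52
```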
In summary, in the method for measuring a parameter of an object provided by this embodiment, the category of the object to be measured in the image is determined; the area of the object in the image is determined; the depth of the object in the image is determined, the depth being the distance from the device used to capture the image to the object at the time of capture; and the parameter of the object is determined according to the category, the area, and the depth. The influence of shooting distance on the measured parameter is thereby brought into the measurement, and the parameter is determined jointly from the category, the area in the image, and the depth, which improves the accuracy of measuring an object's parameter.
Further, in the above embodiment, since in S104 the device determines the parameter of the object by looking up the correspondences among the category, area, depth, and parameter in the mapping relationship, the mapping relationship needs to be determined in advance and stored in the device. So that the device can always find a correspondence in the mapping relationship when determining the parameter of an object, the correspondences need to cover the possible range of parameters of objects as completely as possible, which improves the accuracy of the parameters determined by the device. It is therefore necessary to determine objects of different categories, and to determine the area and depth in the correspondences for objects of the same category with different parameters. That is, the method in the above embodiment further includes: the device determines correspondences among the category, area, depth, and parameter of at least one object, and determines the mapping relationship according to the determined correspondences.
FIG. 5 is a schematic diagram of determining correspondences in the method for measuring a parameter of an object of the present application. As shown in FIG. 5, take an apple as the object to be measured and the heat of the apple as the parameter. To collect the areas of an apple of the same heat in images at different depths, an apple with a heat of 52 kcal is placed on a table. The device photographs the apple at depths D1, D2, D3, D4 and/or D5 and at angles T1, T2, T3, T4 and/or T5, obtaining photos containing the standard object at different angles and different depths. Images captured by the device anywhere on the surface of a sphere centered on the apple with radius D1 may all be regarded as images captured at depth D1. The area of the apple in the photos at the different depths is determined by the method in the preceding example, finally yielding correspondences such as "apple-30 cm²-10 cm-52 kcal" and "apple-20 cm²-20 cm-52 kcal". Correspondences for an apple with a heat of 70 kcal are then determined in the same way: "apple-35 cm²-10 cm-70 kcal" and "apple-24 cm²-20 cm-70 kcal". Correspondences for pears with heats of 50 kcal and 60 kcal can be determined by the same method: "pear-18 cm²-18 cm-50 kcal" and "pear-25 cm²-18 cm-60 kcal".
Finally, all the correspondences in the above example can be added to the mapping relationship, so that the device, by looking up these correspondences in the mapping relationship, determines the parameter of the object to be measured from the correspondence among the category, the area, the depth, and the parameter.
More specifically, the acquisition of the different correspondences in the mapping relationship in the above example can be summarized as the following steps: 1. collect the areas of an object of the same category and the same parameter in photos taken at different depths; 2. establish a correspondence among the category, area, depth, and parameter of the object; 3. determine correspondences for different categories and different parameters, and add all the determined correspondences to the mapping relationship for the device to query when measuring the parameter of an object.
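A minimal sketch of these steps, where `samples` stands for the calibration shots of FIG. 5 and all values are the illustrative ones from the example above:

```python
# Step 1: samples collected at different depths; each sample is
# (category, calories_kcal, depth_cm, area_cm2).
samples = [
    ("apple", 52, 10, 30), ("apple", 52, 20, 20),
    ("apple", 70, 10, 35), ("apple", 70, 20, 24),
    ("pear", 50, 18, 18), ("pear", 60, 18, 25),
]

# Steps 2 and 3: build one "category-area-depth-parameter" correspondence
# per sample and collect them into the mapping relationship to be queried.
mapping = [(category, area, depth, kcal)
           for category, kcal, depth, area in samples]
print(len(mapping))  # 6 correspondences available for lookup
```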
Further, in the correspondences among the category, area, depth, and parameter of at least one object in the above embodiment, when the objects have the same category and the same parameter, the area and the depth of the object are in a proportional relationship. Specifically, for objects of one category with a fixed parameter, in the images the device captures of such objects, the area occupied by the object follows a power of -2 relationship with the depth from the device to the object at the time of capture.
For the relationship between the area and the depth of objects of the same category and the same parameter in the images captured by the device in the above embodiment, a proportional characteristic curve can be constructed, and this characteristic curve can be calculated by the following formula of the camera imaging principle:
u = fx·X/Tz + cx,  v = fy·Y/Tz + cy  (so a length a at depth Tz images to L = fx·a/Tz)
The above formula, derived from the pinhole imaging principle, gives the correspondence between a point on the image plane where the image lies and the actual three-dimensional coordinates where the object lies; fx, fy, cx, cy are the camera intrinsics of the capture apparatus used by the device to capture the image, Tz is the depth value of the object, a is the length of the object in the three-dimensional coordinate system (b its extent in the other dimension), and L is the length of the object in the two-dimensional coordinate system of the image. From the above formula, the relationship between the area S of the object in the image and the depth Tz can then be derived as:
S = (fx·a/Tz)·(fy·b/Tz) = fx·fy·a·b / Tz^2
S/S0 ~ Tz^(-2)
Here, since the area on the image plane of the image follows a power of 2 relationship with the disparity disp, it can be derived that in this embodiment the area S of the object in the image follows a power of -2 relationship with the depth Tz of the object in the image; S0 denotes the area in the correspondence of an object of the same type as the object to be measured, and S denotes the area of the object to be measured in the image. More specifically, the area S of the object in the image and the depth Tz are in a linear or non-linear relationship, and this relationship is more specifically the power of -2 relationship.
For example, FIG. 6 is a schematic diagram of the relationship between the depth and the area of an object in the present application. The diagram in FIG. 6 shows that a power of -2 curve of the above relationship can be fitted from the measured depth and area data of an object. Specifically, S0 denotes the area the object occupies in the image, and Tz denotes the depth of the object in the image.
When determining the relationship between the depth and the area of an object, the device may photograph an object of the same parameter and the same category from different depths, and determine the area-depth relationship for that object from the captured images. For example, in FIG. 6 the object to be measured is an apple, and an apple with a heat of 50 kcal is selected and photographed. In the image captured by the device at a depth of 100 mm from the apple, the area of the apple is 30 cm², giving a depth-area pair of (100, 30) for a 50 kcal apple, and this point is recorded in FIG. 6. Likewise, in the image obtained when the device photographs the same apple, or another 50 kcal apple, at a distance of 200 mm, the area of the apple is 8 cm², so another depth-area pair (200, 8) is obtained for the apple and recorded in FIG. 6. All depth-area pairs obtained by photographing the same apple at different depths are recorded in FIG. 6. Fitting all these depth-area pairs yields a curve, and it can be seen from FIG. 6 that this curve is exactly the power of -2 relationship between depth and area derived from the above formulas. Optionally, the fitting may adopt fitting manners commonly used in the art, such as the least squares method, interpolation, or approximation, which are not limited here.
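The fit can be sketched as an ordinary least-squares estimate of k in S = k·Tz^(-2); the sample points below are illustrative, not measured data from FIG. 6.

```python
import numpy as np

depths = np.array([100.0, 141.0, 200.0, 283.0])  # Tz in mm
areas = np.array([30.0, 15.0, 8.0, 3.7])         # S in cm^2

x = depths ** -2                  # linearize S = k * Tz**-2 as S = k * x
k = float((x @ areas) / (x @ x))  # closed-form least-squares estimate of k
print(k, k / 200.0 ** 2)          # fitted k, and predicted S at Tz = 200 mm
```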
Further, in the above embodiment, since for objects of the same category with the same parameter the area occupied by the object in an image captured by the device follows a power of -2 relationship with the depth from the device to the object at the time of capture, when the device queries the parameter of an object through the mapping relationship, the correspondences among category, area, depth, and parameter stored in the mapping relationship may take the form "category-relation curve of area and depth-parameter".
For example, for an apple with heat as the parameter, the correspondences may be "apple-relation curve 1 of area and depth-52 kcal" and "apple-relation curve 2 of area and depth-70 kcal". When the device obtains the data "apple-35 cm²-10 cm" for the object to be measured, it substitutes the area-depth pair into relation curve 1 and relation curve 2 of the correspondences. If the area-depth pair of the object satisfies relation curve 1, the calorie value of the apple under measurement is determined to be 52 kcal.
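A sketch of substituting a measured area-depth pair into a stored relation curve; the curve constant k and the tolerance are assumptions, since the application leaves the curve's exact parameterization open.

```python
# Each stored curve is assumed to take the form S = k / Tz**2.
def matches_curve(area_cm2, depth_cm, k, rel_tol=0.1):
    predicted = k / depth_cm ** 2
    return abs(area_cm2 - predicted) <= rel_tol * area_cm2

# Hypothetical relation curve 1 with k = 3500: a 35 cm^2 apple at 10 cm
# satisfies it, so the parameter is read off from that correspondence.
print(matches_curve(35.0, 10.0, k=3500.0))  # -> True
```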
In addition, the mapping relationship may simultaneously include correspondences in the two different forms "category-area-depth-parameter" and "category-relation curve of area and depth-parameter". When the device queries the mapping relationship to confirm the parameter of the object to be measured, if both forms of correspondence yield a result, the parameter should be determined according to the "category-relation curve of area and depth-parameter" correspondence.
Optionally, in the above embodiments, when the category of the object to be measured is determined in S101, features of the object in the image may specifically be identified by deep learning to determine the category of the object. The deep learning algorithm identifies the category of the object on machine learning principles: the deep learning network determines in advance the features of objects of known categories in a number of images and records the categories together with the known features in the network. When the device needs to determine the category of the object to be measured, the image of the object is fed into the deep learning network, which extracts the features of the object in the image, compares them with the stored features of objects of known categories, and finally determines the category of the object whose features are closest to those of the object to be measured; that category is the category of the object to be measured. It should be noted that the method of determining the category of the object by deep learning provided in this embodiment is only an example; the deep learning algorithm itself is not specifically limited, and unlisted details of the algorithm may be calculated with reference to methods in the art.
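The nearest-feature comparison can be sketched as follows; the deep network itself is abstracted away, and the stored prototype feature vectors are invented purely for illustration.

```python
import numpy as np

# Hypothetical feature vectors recorded in advance for known categories.
PROTOTYPES = {
    "apple": np.array([0.9, 0.1, 0.2]),
    "pear": np.array([0.2, 0.8, 0.3]),
}

def classify(feature: np.ndarray) -> str:
    """Return the category whose stored features are closest (cosine)."""
    def cosine(u, v):
        return float(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return max(PROTOTYPES, key=lambda cat: cosine(feature, PROTOTYPES[cat]))

print(classify(np.array([0.85, 0.15, 0.25])))  # -> "apple"
```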
Optionally, in the above embodiments, the area of the object to be measured in the image in S102 may specifically be determined by first determining a heat map of the image and then determining the area of the object according to that heat map. Specifically, the heat map represents the distribution of heat values over the image, with different colors indicating regions of different heat values. When extracting the region where the object is located, the device may first obtain the heat map of the whole image and then frame, as the region of the object to be measured, the part of the heat map whose heat value exceeds a confidence value. The confidence value is an empirically set heat value; when the heat value of a region in the image exceeds the set confidence value, that region can be considered to contain the object to be measured. Optionally, to extract the area of the object more effectively, the region determined from the heat map may be segmented a second time; this secondary segmentation may separate the object from the background at the object's boundary according to the different colors within the framed region. Optionally, the device may also extract the heat map of the image by deep learning; the deep learning network likewise needs to extract in advance the features of images of objects with different heat values and store those features, so that the heat value corresponding to the object to be measured can be determined from its features in the image. The model of the deep learning network may be implemented by combining residual networks (ResNets), which converge more easily, with the Inception network (InceptionNet), which reduces network parameters, and by adding after this network a new global average pooling layer that produces the heat map of the image, to facilitate locating the object to be measured in the image. Finally, the device may further perform Gaussian mixture modeling and contour filling on the region determined to contain the object, so as to segment the region where the object is located, and determine the area of the object in the image from the area of this region. It should be noted that the method of determining the area of the object by deep learning provided in this embodiment is only an example; the deep learning algorithm itself is not specifically limited, and unlisted details of the algorithm may be calculated with reference to methods in the art.
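The confidence-threshold step can be sketched as follows: pixels whose heat value exceeds the confidence are framed as the object region, and the pixel count of that region stands in for the area of S102. The heat map here is synthetic; the real one would come from the global average pooling layer described above.

```python
import numpy as np

def object_region(heatmap: np.ndarray, confidence: float):
    mask = heatmap > confidence            # regions above the confidence
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None, 0                     # nothing exceeds the confidence
    box = (xs.min(), ys.min(), xs.max(), ys.max())
    return box, int(mask.sum())            # framed region and pixel area

# Synthetic heat map with one hot blob as a stand-in for the network output.
yy, xx = np.mgrid[0:240, 0:320]
heat = np.exp(-(((yy - 120) / 40.0) ** 2 + ((xx - 160) / 60.0) ** 2))
print(object_region(heat, confidence=0.5))
```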
Optionally, in the above embodiments, the depth of the object to be measured in the image in S103 may be determined by determining, according to environment information at the time the device photographs the object, a ranging manner corresponding to the environment information, after which the electronic device determines the depth of the object through the determined ranging manner.
Specifically, the ranging manner may be: determining the depth of the object to be measured through the parallax of dual cameras of the device; or through an automatic focus (AF) camera of the device; or through laser ranging performed by a sensor of the device; or through the time of flight (TOF) of a camera of the device; or through structured light of a camera of the device. The environment information may be the light and contrast of the environment in which the device captures the image, whether there is fog, the estimated depth of the object to be measured, and so on.
For example, when the device estimates that the depth of the object to be measured, that is, the distance at which the device photographs the object, is within 15 cm, the device determines the actual depth of the object, that is, the actual precise distance between the device and the object, by laser ranging; when the estimated depth is greater than 15 cm, the device determines the actual precise distance through the parallax of the dual cameras. As another example, when the light is poor while the device photographs the object, laser ranging is selected to determine the depth; or, when strong light affects the laser while the device photographs the object, the dual-camera parallax manner is selected to determine the depth.
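This selection can be sketched as a simple dispatch; the 15 cm threshold comes from the example above, while the light-level encoding is an assumption made only for this sketch.

```python
def pick_ranging_manner(estimated_depth_cm, light_level="normal"):
    if estimated_depth_cm is not None and estimated_depth_cm <= 15:
        return "laser"
    if light_level == "low":
        return "laser"          # poor light favours laser ranging
    if light_level == "strong":
        return "dual_camera"    # strong light can disturb the laser
    return "dual_camera"

print(pick_ranging_manner(10))             # -> "laser"
print(pick_ranging_manner(40, "strong"))   # -> "dual_camera"
```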
Preferably, one possible implementation of measuring the shooting distance in this step is to determine the depth between the device and the object to be measured through the parallax of the dual cameras of the device. The parallax of the dual cameras refers to the disparity between the two images generated when two different cameras of the device capture the same scene. Specifically, the disparity between the left image and the right image is first determined, where the left and right images are images generated by different cameras of the device capturing the same scene, the scene containing the object to be measured; the depth of the object in the aforementioned image is then determined from the disparity between the left image and the right image. In particular, when the device has dual cameras, the images used in this step to determine the depth of the object from the dual-camera parallax are the left and right images obtained by the dual cameras, whereas the image containing the object described in the foregoing embodiments is the single image obtained by merging the left and right images of the dual cameras and applying processing such as background blurring. A specific implementation of determining the depth of the object through the parallax of the dual cameras is given below with reference to FIG. 7:
FIG. 7 is a schematic diagram of the principle of determining the depth between the device and the object to be measured from the parallax of the dual cameras. Any point in the world coordinate system and its imaging points in the left and right cameras lie on the same epipolar plane. O_L and O_R are the optical centers of the left and right cameras, and the two line segments of length L represent the image planes of the left and right cameras. The shortest distance from the optical center to the image plane is the focal length f, and b is the baseline between the two cameras. If P is a point in the world coordinate system, its imaging points on the left and right image planes are P_L and P_R, whose distances from the left edges of their respective image planes are X_L and X_R. The disparity is X_R - X_L or X_L - X_R. After calibration and matching, f, b, X_R, and X_L can all be obtained, and the relationship between the depth Z of the object and the disparity follows from the derivation below:
In triangle O_L O_R P, triangle P_L P_R P is similar to triangle O_L O_R P, giving the proportional relationship:
P_L P_R / O_L O_R = (Z - f) / Z
which can also be written as:
(b - (X_R - X_L)) / b = (Z - f) / Z
Thus, from:
Z·(X_R - X_L) = b·f
it can be derived that:
Z = b·f / (X_R - X_L)
Here X_R - X_L is called the disparity, that is, the difference that exists directly between the horizontal coordinates at which the target point is imaged in the left and right views. Using image matching and the disparity principle, the spatial three-dimensional coordinates (X, Y, Z) of the scene can be obtained.
FIG. 8 is a flowchart of determining the depth between the device and the object to be measured from the parallax of the dual cameras. First, the dual cameras of the device capture a left image and a right image of the same scene, and the left and right images are then preprocessed. Preprocessing synchronizes the left and right images (timestamp alignment), after which the synchronized left and right images are matched. During matching, the device extracts feature points of the synchronized left and right images separately, determines the corresponding feature points in the left and right images to form feature-point pairs, and computes the disparity X_R - X_L for each pair. The disparity map of the left and right images is determined from all the feature-point pairs; finally, according to the formula
Z = b·f / (X_R - X_L)
the disparity map is converted into a depth map. Here Z is the vertical distance from the camera of the device to the nearest tangent plane of the object to be measured, that is, the depth of the object to be measured. It should be noted that the method of determining the depth of the object from the parallax of dual cameras provided in this embodiment is only an example; the algorithm itself is not specifically limited, and unlisted details of the algorithm may be calculated with reference to methods in the art.
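A hedged sketch of this disparity-to-depth conversion using OpenCV's semi-global block matcher follows; the focal length and baseline are placeholder calibration values, the input files are assumed to be rectified left/right views, and SGBM itself is only a stand-in, since the application does not prescribe a particular matching algorithm.

```python
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # assumed rectified
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=5)
disp = matcher.compute(left, right).astype(np.float32) / 16.0  # to pixels

f_px = 700.0   # focal length f in pixels (assumed calibration value)
b_mm = 60.0    # baseline b in millimetres (assumed calibration value)

valid = disp > 0
depth_mm = np.zeros_like(disp)
depth_mm[valid] = f_px * b_mm / disp[valid]  # Z = b*f / (X_R - X_L)
```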
Optionally, this embodiment further provides a system to implement the above methods. FIG. 9 is a schematic structural diagram of an embodiment of the parameter measurement system of the present application; the system provided in this embodiment may be used to implement the method for measuring a parameter of an object of any of the above embodiments. On the basis of the foregoing methods, the AR measurement switch in the settings module of the system shown in FIG. 9 is used to turn the method for measuring a parameter of an object in the device on and off. If the AR measurement switch is turned on, the functional modules in the system framework, such as binocular measurement, classification and segmentation, contour extraction, and calorie estimation, can be used to measure the parameter of an object, and after the device obtains an image of the food to be measured it can determine the parameter value of the object from the image (here the parameter is the heat of the object in calories as an example); conversely, if the AR measurement switch is turned off, the above method for measuring a parameter of an object is disabled. If the user input switch is turned on, then when the device cannot accurately determine the kind of the object to be measured, the device may obtain the kind of the object entered by the user of the device. The binocular measurement module calculates the depth and disparity information of spatial points according to the stereo vision principle, implemented as in S103 of FIG. 4; the image classification module determines the kind of the object to be measured by deep learning, implemented as in S101 of FIG. 4; the image pre-segmentation module and the contour extraction module determine the area of the object in the image according to the heat map algorithm, implemented as in S102 of FIG. 4. The depth map acquisition module obtains the depth of the object acquired by the binocular measurement module; the depth may be recorded as 10-bit data, with the first 8 bits storing the integer part and the last 2 bits storing the fractional part, and the higher the precision of the depth, the better the accuracy of the represented depth. The calorie estimation module determines the parameter of the object according to the category, the area, and the depth of the object, implemented as in S104 of FIG. 4.
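The 10-bit depth record (8 integer bits, 2 fractional bits) amounts to Q8.2 fixed point with a step of 0.25; a minimal sketch, assuming this encoding:

```python
def encode_depth(depth: float) -> int:
    """Pack a depth value into 10-bit Q8.2 fixed point (0.25 steps)."""
    return int(round(depth * 4)) & 0x3FF

def decode_depth(code: int) -> float:
    return code / 4.0

print(decode_depth(encode_depth(20.75)))  # -> 20.75
```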
FIG. 10 is a schematic flowchart of an embodiment of the parameter measurement system of the present application. The flow shown in FIG. 10 may be executed in the system shown in FIG. 9. After the system in the device starts the dual cameras, it first judges whether the AR calorie measurement function is turned on; if it is off, the image is recorded in the device after normal shooting and the program ends; if it is on, the system in the device continues to execute the program for measuring the parameter of an object. In that program, it is first judged whether the user input switch is turned on; if it is on, the kind of the object to be measured entered by the user of the device is obtained; if it is off, the kind of the object in the image is determined according to the method in S101 of the foregoing embodiment. After the kind of the object is determined, the extent of the object in the image is determined according to the heat map algorithm; the area of the object in the image is further determined through the depth map and the color information of the image; finally, the calorie estimate of the object is determined from the category, the area, and the depth of the object through the "empirical ratio method" of S104 in the foregoing embodiment.
FIG. 11 is a schematic structural diagram of Embodiment 1 of the apparatus for measuring a parameter of an object of the present application. As shown in FIG. 11, the apparatus for measuring a parameter of an object in this embodiment includes: an identification module 1101, a first determining module 1102, a second determining module 1103, and a third determining module 1104. The identification module 1101 is configured to identify the category of the object to be measured in an image; the first determining module 1102 is configured to determine the area of the object to be measured in the image; the second determining module 1103 is configured to determine the depth of the object to be measured in the image, the depth being the distance from the device used to capture the image to the object at the time the device captures the image; and the third determining module 1104 is configured to determine the parameter of the object according to the category, the area, and the depth of the object.
The apparatus for measuring a parameter of an object of the embodiment shown in FIG. 11 may be used to execute the technical solution of the method embodiment shown in FIG. 4; its implementation principles and technical effects are similar and are not repeated here.
Optionally, the third determining module 1104 is specifically configured to determine, by looking up a mapping relationship, the parameter of the object to be measured corresponding to the category, the area, and the depth of the object, where the mapping relationship includes correspondences among the category, area, depth, and parameter of at least one object.
FIG. 12 is a schematic structural diagram of Embodiment 2 of the apparatus for measuring a parameter of an object of the present application. As shown in FIG. 12, the apparatus in this embodiment further includes, on the basis of FIG. 11, a fourth determining module 1105. The fourth determining module 1105 is configured to determine correspondences among the category, area, depth, and parameter of at least one object, and is further configured to determine the mapping relationship according to those correspondences.
The apparatus for measuring a parameter of an object of the embodiment shown in FIG. 12 may be used to execute the technical solutions of the above method embodiments; its implementation principles and technical effects are similar and are not repeated here.
Optionally, in the above embodiments, the correspondences among the category, area, depth, and parameter of at least one object include: when the at least one object has the same category and the same parameter, the area and the depth of the object are in a proportional relationship.
Optionally, in the above embodiments, the second determining module 1103 is specifically configured to determine, according to environment information at the time the device photographs the object to be measured, a ranging manner corresponding to the environment information, and to determine the depth of the object through the ranging manner.
Optionally, in the above embodiments, the ranging manner includes: determining the depth of the object to be measured through the parallax of dual cameras of the device; or through an auto-focus AF camera of the device; or through laser ranging performed by a sensor of the device; or through the time of flight TOF of a camera of the device; or through structured light of a camera of the device.
Optionally, in the above embodiments, the parameter of the object to be measured is used to indicate the heat of the object to be measured.
The apparatus for measuring a parameter of an object of each of the above embodiments may be used to execute the technical solutions of the above method embodiments; its implementation principles and technical effects are similar and are not repeated here.
It should be noted that the division into modules in the embodiments of the present application is schematic and is merely a division by logical function; there may be other division manners in actual implementation. The functional modules in the embodiments of the present application may be integrated into one processing module, each module may exist physically alone, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module.
If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented by software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or wirelessly (for example, infrared, radio, or microwave). The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device such as a server or data center integrating one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid state disk (SSD)), or the like.
Alternatively, some or all of the above modules may be implemented by being embedded, in the form of integrated circuits, in a chip of the terminal device, and they may be implemented separately or integrated together. That is, the above modules may be configured as one or more integrated circuits implementing the above methods, for example: one or more application specific integrated circuits (ASIC), one or more digital signal processors (DSP), or one or more field programmable gate arrays (FPGA).
FIG. 13 is a schematic structural diagram of Embodiment 3 of the apparatus for measuring a parameter of an object of the present application. As shown in FIG. 13, the apparatus 13 for measuring a parameter of an object includes: a processor 1301, a memory 1302, and a transceiver 1303. The memory 1302 is configured to store a program implementing the modules of the above method embodiments; the processor 1301 calls the program to execute the operations of the above method embodiments for measuring a parameter of an object.
The present application further provides a storage medium, including a readable storage medium and a computer program, the computer program being used to implement the method for measuring a parameter of an object provided by any one of the foregoing embodiments.
The present application further provides a program product, the program product including a computer program (that is, execution instructions) stored in a readable storage medium. At least one processor of the apparatus may read the computer program from the readable storage medium, and the at least one processor executes the computer program so that the apparatus implements the method for measuring a parameter of an object provided by the foregoing implementations.
Finally, it should be noted that the above embodiments are merely intended to describe the technical solutions of the present invention rather than to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments, or make equivalent replacements to some or all of the technical features therein, and that such modifications or replacements do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (16)

  1. A method for measuring a parameter of an object, comprising:
    identifying the category of an object to be measured in an image;
    determining the area of the object to be measured in the image;
    determining the depth of the object to be measured in the image, wherein the depth of the object to be measured is the distance from the device used to capture the image to the object to be measured when the device captures the image;
    determining the parameter of the object to be measured according to the category of the object to be measured, the area of the object to be measured, and the depth of the object to be measured.
  2. The method according to claim 1, wherein the determining the parameter of the object to be measured according to the category of the object to be measured, the area of the object to be measured, and the depth of the object to be measured comprises:
    determining, by looking up a mapping relationship, the parameter of the object to be measured corresponding to the category of the object to be measured, the area of the object to be measured, and the depth of the object to be measured, wherein the mapping relationship comprises correspondences among the category, area, depth, and parameter of at least one object.
  3. The method according to claim 2, further comprising:
    determining correspondences among the category, area, depth, and parameter of at least one object;
    determining the mapping relationship according to the correspondences among the category, area, depth, and parameter of the at least one object.
  4. The method according to claim 3, wherein the correspondences among the category, area, depth, and parameter of the at least one object comprise:
    when the at least one object has the same category and the same parameter, the area and the depth of the at least one object are in a proportional relationship.
  5. The method according to any one of claims 1-4, wherein the determining the depth of the object to be measured in the image comprises:
    determining, according to environment information at the time the device photographs the object to be measured, a ranging manner corresponding to the environment information;
    determining the depth of the object to be measured through the ranging manner.
  6. The method according to claim 5, wherein the ranging manner comprises:
    determining the depth of the object to be measured through the parallax of dual cameras of the device;
    or,
    determining the depth of the object to be measured through an auto-focus AF camera of the device;
    or,
    determining the depth of the object to be measured through laser ranging performed by a sensor of the device;
    or,
    determining the depth of the object to be measured through the time of flight TOF of a camera of the device;
    or,
    determining the depth of the object to be measured through structured light of a camera of the device.
  7. The method according to any one of claims 1-6, wherein the parameter of the object to be measured is used to indicate the heat of the object to be measured.
  8. An apparatus for measuring a parameter of an object, comprising:
    an identification module, configured to identify the category of an object to be measured in an image;
    a first determining module, configured to determine the area of the object to be measured in the image;
    a second determining module, configured to determine the depth of the object to be measured in the image, wherein the depth of the object to be measured is the distance from the device used to capture the image to the object to be measured when the device captures the image;
    a third determining module, configured to determine the parameter of the object to be measured according to the category of the object to be measured, the area of the object to be measured, and the depth of the object to be measured.
  9. The apparatus according to claim 8, wherein the third determining module is specifically configured to,
    determine, by looking up a mapping relationship, the parameter of the object to be measured corresponding to the category of the object to be measured, the area of the object to be measured, and the depth of the object to be measured, wherein the mapping relationship comprises correspondences among the category, area, depth, and parameter of at least one object.
  10. The apparatus according to claim 9, further comprising:
    a fourth determining module, configured to determine correspondences among the category, area, depth, and parameter of at least one object;
    the fourth determining module being further configured to determine the mapping relationship according to the correspondences among the category, area, depth, and parameter of the at least one object.
  11. The apparatus according to claim 10, wherein the correspondences among the category, area, depth, and parameter of the at least one object comprise:
    when the at least one object has the same category and the same parameter, the area and the depth of the at least one object are in a proportional relationship.
  12. The apparatus according to any one of claims 8-11, wherein the second determining module is specifically configured to,
    determine, according to environment information at the time the device photographs the object to be measured, a ranging manner corresponding to the environment information;
    determine the depth of the object to be measured through the ranging manner.
  13. The apparatus according to claim 12, wherein the ranging manner comprises:
    determining the depth of the object to be measured through the parallax of dual cameras of the device;
    or,
    determining the depth of the object to be measured through an auto-focus AF camera of the device;
    or,
    determining the depth of the object to be measured through laser ranging performed by a sensor of the device;
    or,
    determining the depth of the object to be measured through the time of flight TOF of a camera of the device;
    or,
    determining the depth of the object to be measured through structured light of a camera of the device.
  14. The apparatus according to any one of claims 8-13, wherein the parameter of the object to be measured is used to indicate the heat of the object to be measured.
  15. An apparatus for measuring an object parameter, comprising:
    a processor and a memory;
    the memory being configured to store a program;
    the processor being configured to call the program stored in the memory to perform the method for measuring a parameter of an object according to any one of claims 1-7.
  16. A computer-readable storage medium, wherein the computer-readable storage medium stores program code, and when the program code is executed, the method for measuring a parameter of an object according to any one of claims 1-7 is performed.
PCT/CN2018/095370 2018-07-12 2018-07-12 Method and apparatus for measuring object parameters WO2020010561A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2018/095370 WO2020010561A1 (zh) 2018-07-12 2018-07-12 Method and apparatus for measuring object parameters
CN201880087447.3A CN111630524B (zh) 2018-07-12 2018-07-12 Method and apparatus for measuring object parameters

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/095370 WO2020010561A1 (zh) 2018-07-12 2018-07-12 Method and apparatus for measuring object parameters

Publications (1)

Publication Number Publication Date
WO2020010561A1 true WO2020010561A1 (zh) 2020-01-16

Family

ID=69142949

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/095370 WO2020010561A1 (zh) 2018-07-12 2018-07-12 Method and apparatus for measuring object parameters

Country Status (2)

Country Link
CN (1) CN111630524B (zh)
WO (1) WO2020010561A1 (zh)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112614058B (zh) * 2020-10-31 2021-11-09 温岭市山市金德利电器配件厂 Object cross-sectional area analysis system


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10404969B2 (en) * 2015-01-20 2019-09-03 Qualcomm Incorporated Method and apparatus for multiple technology depth map acquisition and fusion
WO2017149526A2 (en) * 2016-03-04 2017-09-08 May Patents Ltd. A method and apparatus for cooperative usage of multiple distance meters
CN106845345A * 2016-12-15 2017-06-13 重庆凯泽科技股份有限公司 Liveness detection method and apparatus
FR3061275B1 (fr) * 2016-12-23 2020-10-16 Groupe Brandt Interface device associated with a refrigeration appliance
CN108198191B (zh) * 2018-01-02 2019-10-25 武汉斗鱼网络科技有限公司 Image processing method and apparatus

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030076983A1 (en) * 2000-06-06 2003-04-24 Cox Dale W. Personal food analyzer
JP2008217702A (ja) * 2007-03-07 2008-09-18 Fujifilm Corp Imaging apparatus and imaging method
CN107873101A (zh) * 2014-11-21 2018-04-03 克里斯多夫·M·马蒂 Imaging system for object recognition and assessment
CN107851459A (zh) * 2015-07-29 2018-03-27 皮道练 Food information providing method and apparatus
CN106709525A (zh) * 2017-01-05 2017-05-24 北京大学 Method for measuring food nutritional components with a camera
CN106872513A (zh) * 2017-01-05 2017-06-20 深圳市金立通信设备有限公司 Method and terminal for detecting food calories

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111783801A * 2020-07-17 2020-10-16 上海明波通信技术股份有限公司 Object contour extraction method and system and object contour prediction method and system
CN111783801B (zh) * 2020-07-17 2024-04-23 上海明波通信技术股份有限公司 Object contour extraction method and system and object contour prediction method and system
CN112233144A * 2020-09-24 2021-01-15 中国农业大学 Underwater fish body weight measurement method and apparatus
CN112233144B (zh) * 2020-09-24 2024-05-28 中国农业大学 Underwater fish body weight measurement method and apparatus
CN113689422A * 2021-09-08 2021-11-23 理光软件研究所(北京)有限公司 Image processing method and apparatus, and electronic device
CN113763412A * 2021-09-08 2021-12-07 理光软件研究所(北京)有限公司 Image processing method and apparatus, electronic device, and computer-readable storage medium
CN113689422B (zh) * 2021-09-08 2024-07-02 理光软件研究所(北京)有限公司 Image processing method and apparatus, and electronic device
CN113763412B (zh) * 2021-09-08 2024-07-16 理光软件研究所(北京)有限公司 Image processing method and apparatus, electronic device, and computer-readable storage medium
CN116320746A * 2023-05-16 2023-06-23 武汉昊一源科技有限公司 TOF focusing apparatus, focusing method, and photographing device

Also Published As

Publication number Publication date
CN111630524B (zh) 2024-04-30
CN111630524A (zh) 2020-09-04

Similar Documents

Publication Publication Date Title
WO2020010561A1 (zh) Method and apparatus for measuring object parameters
CN108764091B (zh) Liveness detection method and apparatus, electronic device, and storage medium
US20240046571A1 (en) Systems and Methods for 3D Facial Modeling
KR102319177B1 (ko) Method and apparatus, device, and storage medium for determining the pose of an object in an image
CN107093171B (zh) Image processing method, apparatus, and system
CN107392958B (zh) Method and apparatus for determining the volume of an object based on a binocular stereo camera
US7825948B2 (en) 3D video conferencing
CN104363378B (zh) Camera focusing method, apparatus, and terminal
CN108510540B (zh) Stereo vision camera and height acquisition method thereof
WO2018112788A1 (zh) Image processing method and device
CN104363377B (zh) Method, apparatus, and terminal for displaying a focus frame
WO2020063987A1 (zh) Three-dimensional scanning method and apparatus, storage medium, and processor
CN111787303B (zh) Three-dimensional image generation method and apparatus, storage medium, and computer device
CN110213491B (zh) Focusing method, apparatus, and storage medium
WO2022218161A1 (zh) Method, apparatus, device, and storage medium for target matching
CN110738703A (zh) Positioning method and apparatus, terminal, and storage medium
CN109697444A (zh) Object recognition method and apparatus based on depth images, device, and storage medium
WO2023142352A1 (zh) Depth image acquisition method and apparatus, terminal, imaging system, and medium
WO2020042000A1 (zh) Camera device and focusing method
CN110505398A (zh) Image processing method and apparatus, electronic device, and storage medium
CN110443228B (zh) Pedestrian matching method and apparatus, electronic device, and storage medium
KR101867497B1 (ko) Picture processing method and electronic equipment
WO2021193672A1 (ja) Three-dimensional model generation method and three-dimensional model generation device
TW202221646A (zh) Depth detection method, electronic device, and computer-readable storage medium
CN110800020B (zh) Image information acquisition method, image processing device, and computer storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18925876

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18925876

Country of ref document: EP

Kind code of ref document: A1