CN113837056A - Method for determining form information, related device, equipment and storage medium - Google Patents

Method for determining form information, related device, equipment and storage medium

Info

Publication number
CN113837056A
Authority
CN
China
Prior art keywords
target
information
target object
candidate
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111101201.7A
Other languages
Chinese (zh)
Inventor
郭东升
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Sensetime Technology Co Ltd
Original Assignee
Shenzhen Sensetime Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Sensetime Technology Co Ltd filed Critical Shenzhen Sensetime Technology Co Ltd
Priority to CN202111101201.7A priority Critical patent/CN113837056A/en
Publication of CN113837056A publication Critical patent/CN113837056A/en
Pending legal-status Critical Current

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The application discloses a method for determining body shape information, and a related apparatus, device, and storage medium. The method includes: acquiring body type related data of a target object; determining a reference object matched with the target object by using the body type related data; and determining target body information of the target object based on reference body information corresponding to the reference object. In this way, the target body information of the target object can be acquired.

Description

Method for determining form information, related device, equipment and storage medium
Technical Field
The present application relates to the field of health management technologies, and in particular, to a method for determining form information, and a related apparatus, device, and storage medium.
Background
At present, people pay increasing attention to their own health management. A person's body shape information often reflects his or her level of health, so people are concerned about their own body shape information.
However, body shape information is usually measured with professional measuring tools. For the general public, such professional tools are rarely available, so some people cannot conveniently obtain their own body shape information.
Therefore, how to enable a user to conveniently acquire his or her own body shape information is of great significance.
Disclosure of Invention
The application provides a method for determining form information, a related device, equipment and a storage medium.
The first aspect of the present application provides a method for determining form information, where the method includes: acquiring body type related data of a target object; determining a reference object matched with the target object by utilizing the body type related data; and determining the target body information of the target object based on the reference body information corresponding to the reference object.
Therefore, the reference object matched with the target object can be determined by using the body type related data of the target object, and further the target body information of the target object can be determined according to the reference body information corresponding to the reference object, so that the target body information of the target object can be acquired.
Wherein, the reference body information and the target body information both include: at least one of body fat information and muscle tissue distribution information; and/or, the determining the target shape information of the target object based on the reference shape information corresponding to the reference object includes: taking the reference body information of the reference object as the target body information of the target object.
In this way, at least one of the body fat information and the muscle tissue distribution information of the target object can be acquired as its body shape information. The determining a reference object matched with the target object by using the body type related data includes: selecting a reference object matched with the target object from a first candidate object set based on the body type related data; the first candidate object set comprises a plurality of candidate objects, and each candidate object is preset with reference shape information.
Therefore, the candidate object set is pre-stored, and the reference shape information is pre-set for all the objects in the candidate object set, so as to select the reference object matched with the target object from the candidate object set, and the target shape information of the target object can be determined based on the reference shape information of the reference object.
Before determining a reference object matched with the target object by using the body type related data, the method for determining the body type information further comprises the following steps: acquiring at least object information of a target object, wherein the object information comprises one or more of gender and height; at least one candidate object matching the object information is found from the second set of candidate objects to form a first set of candidate objects.
Therefore, by searching for at least one candidate object matching the object information in the second candidate object set, candidate objects that better match the target object can be selected from the second candidate object set to form the first candidate object set. This reduces the number of candidate objects that need to be matched against the body type related data of the target object, thereby improving the efficiency of finding the reference object and hence of determining the body shape information.
The above acquiring data related to the body type of the target object includes: detecting a target image containing a target object to obtain a plurality of human body key points of the target object; and acquiring body type related data of the target object based on the plurality of human body key points.
Therefore, several human key points of the target object are obtained by detecting the target image containing the target object, so that the body type related data of the target object can be determined based on the obtained human key points.
The body type-related data of the target object is a target image including the target object. The determining the reference object matched with the target object by using the body type related data comprises: comparing the target image with the candidate images containing the candidate objects to obtain the body type matching degree of the target object and each candidate object; and selecting the candidate object with the corresponding body type matching degree meeting the preset requirement as a reference object.
Therefore, the target image is compared with the candidate images containing the candidate objects, so that the body type matching degrees of the target object and the candidate objects can be obtained, the candidate objects with the body type matching degrees meeting the preset requirements can be selected as the reference objects, and the matching between the candidate objects and the target object is realized.
Wherein the target image includes: a plurality of target sub-images obtained by respectively photographing the target object from a plurality of angles; the above comparing the target image with the candidate images including the candidate objects to obtain the body type matching degrees of the target object and the candidate objects respectively includes: for each candidate image, comparing the plurality of target sub-images with that candidate image respectively to obtain the body type matching degree of the target object in each target sub-image with each candidate object; the selecting a candidate object whose body type matching degree meets a preset requirement as a reference object includes: selecting the candidate object whose corresponding body type matching degree meets the preset requirement based on the body type matching degree of the target object in each target sub-image with each candidate object.
Therefore, through the plurality of target sub-images obtained by respectively shooting the target object from a plurality of angles, more comprehensive body type related data about the target object can be obtained, so that the target object can be better matched with the candidate object, and the matching accuracy is improved.
After determining the target body information of the target object, the method for determining the body information further comprises the following steps: and displaying the target body information on the target image.
Therefore, the user can conveniently know the target body information of the user by displaying the target body information on the target image. The specific method of displaying the target shape information is, for example, a display method implemented by using an augmented reality technology.
Wherein the target body information includes body fat information. The above displaying the target shape information on the target image includes: determining the calories corresponding to the fat weight of the target object based on the body fat information; determining a caloric material matching the calories; and displaying the caloric material on the target image by using an augmented reality display technology.
Therefore, by determining the caloric material matching the calories corresponding to the fat weight of the target object and displaying the caloric material on the target image using the augmented reality display technology, the body fat information of the target object can be displayed more intuitively, so that the user can conveniently know his or her own body fat information.
A second aspect of the present application provides a device for determining shape information, including: an acquisition module, an object determination module and an information determination module, wherein the acquisition module is used for acquiring body type related data of a target object; the object determination module is used for determining a reference object matched with the target object by utilizing the body type related data; and the information determination module is used for determining the target body information of the target object based on the reference body information corresponding to the reference object.
A third aspect of the application provides an electronic device comprising a processor and a memory coupled to each other, wherein the processor is configured to execute a computer program stored in the memory to perform the method described in the first aspect.
A fourth aspect of the present application provides a computer readable storage medium having stored thereon program instructions which, when executed by a processor, implement the method described in the first aspect above.
According to the above scheme, the reference object matching the target object can be determined using the body type related data of the target object, and the target body information of the target object can then be determined according to the reference body information corresponding to the reference object, so that the target body information of the target object can be acquired.
Drawings
Fig. 1 is a schematic flow chart of an embodiment of a method for determining profile information according to the present application;
FIG. 2 is a schematic flow chart of another embodiment of the method for determining profile information according to the present application;
FIG. 3 is a schematic flow chart of another embodiment of the method for determining profile information according to the present application;
fig. 4 is a schematic flowchart of a method for determining profile information according to another embodiment of the present application;
FIG. 5 is a schematic diagram of a framework of an embodiment of the apparatus for determining physical information of the present application;
FIG. 6 is a block diagram of an embodiment of an electronic device of the present application;
FIG. 7 is a block diagram of an embodiment of a computer-readable storage medium of the present application.
Detailed Description
The following describes in detail the embodiments of the present application with reference to the drawings attached hereto.
In the following description, for purposes of explanation and not limitation, specific details are set forth such as particular system structures, interfaces, techniques, etc. in order to provide a thorough understanding of the present application.
The terms "system" and "network" are often used interchangeably herein. The term "and/or" herein is merely an association describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship. Further, the term "plurality" herein means two or more than two.
The device for executing the method for determining the body information in the application can be a computer, a mobile phone, a tablet computer, smart glasses and other electronic devices.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating an embodiment of a method for determining profile information according to the present application. Specifically, the method may include the steps of:
step S11: and acquiring body type related data of the target object.
The body type related data of the target subject includes, for example, size data of the human body, such as height, upper arm length, forearm length, thigh length, calf length, chest thickness, shoulder height, and the like, and may further include other data related to the body shape, such as contour information of the body shape of the target subject. The body type related data of the target object may be acquired by, for example, instrument measurement or user input; the present application does not specifically limit the acquisition manner. In one embodiment, the height, upper arm length, forearm length, thigh length, calf length, and the like of the target object may be determined by analyzing a target image containing the target object, for example by taking measurements on the target image. In another embodiment, feature extraction may be performed on a target image containing the target object to extract body type feature information of the target object, and the body type feature information may be used as the body type related data of the target object.
In one embodiment, the acquiring of the body type-related data of the target object may specifically include step S111 and step S112.
Step S111: and detecting a target image containing the target object to obtain a plurality of human body key points of the target object.
The method for detecting the target image containing the target object may be a human body key point detection algorithm commonly used in the field of computer vision; for example, a 2D key point detection algorithm or a 3D key point detection algorithm may be used. The 2D key point detection algorithm may specifically be a Convolutional Pose Machines (CPM) algorithm, an Hourglass algorithm, or the like, and the present application does not limit the human body key point detection algorithm. The key points of the human body are, for example, the left and right shoulders, the left and right elbows, the left and right wrists, the left and right hips, the left and right knees, the left and right ankles, and the like.
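For illustration only (the description names CPM and Hourglass but does not mandate any particular detector), a minimal sketch of step S111 using the publicly available MediaPipe Pose estimator is shown below; the library choice, file name, and selected landmark names are assumptions, not part of the disclosure.

# Minimal sketch of step S111: 2D human body key point detection.
# MediaPipe Pose is used here purely as an example detector; the description
# only requires some key point detection algorithm (e.g. CPM or Hourglass).
import cv2
import mediapipe as mp

def detect_keypoints(image_path):
    """Return named 2D key points of the person in the image, in pixel coordinates."""
    image = cv2.imread(image_path)
    h, w = image.shape[:2]
    with mp.solutions.pose.Pose(static_image_mode=True) as pose:
        result = pose.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))
    if not result.pose_landmarks:
        return {}
    landmarks = result.pose_landmarks.landmark
    P = mp.solutions.pose.PoseLandmark
    wanted = {
        "left_shoulder": P.LEFT_SHOULDER, "right_shoulder": P.RIGHT_SHOULDER,
        "left_hip": P.LEFT_HIP, "left_knee": P.LEFT_KNEE, "left_ankle": P.LEFT_ANKLE,
    }
    # Landmark coordinates are normalized to [0, 1]; convert to pixel coordinates.
    return {name: (landmarks[idx.value].x * w, landmarks[idx.value].y * h)
            for name, idx in wanted.items()}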
Step S112: and acquiring body type related data of the target object based on the plurality of human body key points.
Because the human body key points can reflect the body type of the human body, the human body key points can be utilized to acquire the body type related data of the target object. For example, the shoulder width of the target object may be obtained by the distance between the left and right shoulder key points, the left leg length of the target object may be determined by determining the distance between the left hip key point and the left ankle key point, and the left thigh length may be determined by determining the distance between the left hip key point and the left knee key point.
Therefore, several human key points of the target object are obtained by detecting the target image containing the target object, so that the body type related data of the target object can be determined based on the obtained human key points.
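As a minimal sketch of step S112, the code below assumes the key points from step S111 are given as named 2D pixel coordinates and that a pixel-to-centimetre scale is available (for example derived from the subject's known height); the key point names and the scale factor are illustrative assumptions.

import math

def distance(p, q):
    """Euclidean distance between two 2D key points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def body_type_data(keypoints, cm_per_pixel):
    """Derive body type related data (in centimetres) from named 2D key points.

    keypoints maps names such as 'left_shoulder', 'right_shoulder', 'left_hip',
    'left_knee', 'left_ankle' to (x, y) pixel coordinates.
    """
    return {
        # Shoulder width: distance between the left and right shoulder key points.
        "shoulder_width": distance(keypoints["left_shoulder"], keypoints["right_shoulder"]) * cm_per_pixel,
        # Left thigh length: distance between the left hip and left knee key points.
        "left_thigh_length": distance(keypoints["left_hip"], keypoints["left_knee"]) * cm_per_pixel,
        # Left leg length: distance between the left hip and left ankle key points.
        "left_leg_length": distance(keypoints["left_hip"], keypoints["left_ankle"]) * cm_per_pixel,
    }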
Step S12: and determining a reference object matched with the target object by utilizing the body type related data.
Determining the reference object that matches the target object may be by determining the reference object that matches body shape related data of the target object. In one embodiment, the body type-related data of a plurality of candidate objects can be acquired, and then the body type-related data of the candidate objects and the body type-related data of the target object are matched, so as to determine the reference object matched with the target object. For example, if the body shape-related data of the target object is 175 cm in height, 110 cm in thigh length, and 45 cm in upper arm length, the three body shape-related data can be used to determine a reference object matching the target object. In another embodiment, the reference object matched with the target object may be determined by performing feature matching using the body shape feature information of the target object.
In one embodiment, the determining, by using the body type-related data, a reference object matching the target object specifically includes: a reference object matching the target object is selected from the first set of candidate objects based on the body type-related data. In this embodiment, the first set of candidate objects includes several candidate objects, each of which may be predetermined with reference body type-related data. The candidate objects of the candidate object set may be a plurality of objects of different ages, different sexes, different regions, etc. Thus, the reference object can be determined by matching based on the body type-related data of the target object and the reference body type-related data of each candidate object. In addition, reference shape information is preset in each candidate object, so that the reference shape information of the reference object selected from the first candidate object can be used subsequently to obtain the target shape information of the target object.
For example, the body shape related data of the target subject is 175 cm in height, 110 cm in thigh length, and 45 cm in upper arm length. The first candidate set includes 3 candidates, the first candidate has a height 173 cm, a thigh length 105 cm, and an upper arm length 40 cm, the second candidate has a height 177 cm, a thigh length 111 cm, and an upper arm length 47 cm, and the third candidate has a height 175 cm, a thigh length 110 cm, and an upper arm length 44 cm. Thus, the matching degree of the height of each candidate object and the height of the target object can be calculated, the matching degree of the thigh length of each candidate object and the thigh length of the target object is calculated, the matching degree of the upper arm length of each candidate object and the upper arm length of the target object is calculated, and then the reference object matched with the target object is determined by combining the three data matching degrees.
In one embodiment, formula (1) of the method for calculating the degree of matching is as follows:
θ = (1 − |A − B| / A) × 100%    (1)
wherein A is the body type related data of the target object, B is the body type related data of the candidate object, and θ is the calculated matching degree.
For example, the height of the target object is 175 cm, and the height of the candidate object is 173 cm, which is calculated to result in a matching degree of about 98.86%.
After the matching degree of each data is obtained, weighting processing may be performed based on the matching degree of the data, so as to obtain a final matching degree. It is understood that the method for calculating the matching degree of each data and the final matching degree is not limited to the above examples, and the application is not limited thereto.
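The following minimal sketch puts formula (1), the weighting step and the candidate selection together, assuming the matching degree is 1 − |A − B| / A (which reproduces the 98.86% figure for 175 cm against 173 cm) and assuming equal weights; both choices are illustrative, not prescribed by the description.

def matching_degree(a, b):
    """Formula (1): matching degree between one item of body type related data
    of the target object (a) and of a candidate object (b), in [0, 1]."""
    return 1.0 - abs(a - b) / a

def final_matching_degree(target, candidate, weights):
    """Weighted combination of the per-item matching degrees."""
    total = sum(weights.values())
    return sum(w * matching_degree(target[k], candidate[k]) for k, w in weights.items()) / total

# Worked example from the description: target of 175/110/45 cm against three candidates.
target = {"height": 175, "thigh_length": 110, "upper_arm_length": 45}
candidates = [
    {"height": 173, "thigh_length": 105, "upper_arm_length": 40},
    {"height": 177, "thigh_length": 111, "upper_arm_length": 47},
    {"height": 175, "thigh_length": 110, "upper_arm_length": 44},
]
weights = {"height": 1.0, "thigh_length": 1.0, "upper_arm_length": 1.0}  # assumed equal weights
# The candidate with the highest final matching degree is taken as the reference object.
reference = max(candidates, key=lambda c: final_matching_degree(target, c, weights))
print(round(matching_degree(175, 173) * 100, 2))  # about 98.86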
The reference object matching the target object may be selected from the first candidate object set by selecting the candidate object that best matches all of the body type related data of the target object as the reference object, or by selecting the candidate object that best matches part of the body type related data as the reference object.
Therefore, by pre-storing the candidate object set, and presetting the reference shape information on all the objects in the candidate object set, the reference object matched with the target object can be selected from the candidate object set, and the target shape information of the target object can be determined based on the reference shape information of the reference object.
Step S13: and determining the target body information of the target object based on the reference body information corresponding to the reference object.
Based on the reference body information corresponding to the reference object, the reference body information corresponding to the reference object can be directly used as the target body information of the target object; or the target body information of the target object can be determined after the reference body information is processed based on the matching degree of the body type related data of the reference object and the body type related data of the target object; or converting the reference body information corresponding to the reference object to determine the target body information of the target object.
In one embodiment, the reference physical form information and the target physical form information each include: at least one of body fat information and muscle tissue distribution information. The body fat information is, for example, body fat rate and/or fat distribution information. With this, it is possible to achieve acquisition of at least one of body fat information and muscle tissue distribution information of the target object.
Therefore, the reference object matched with the target object can be determined by using the body type related data of the target object, and further the target body information of the target object can be determined according to the reference body information corresponding to the reference object, so that the target body information of the target object can be acquired.
Referring to fig. 2, fig. 2 is a schematic flow chart of another embodiment of the method for determining profile information according to the present application. In the present embodiment, before the step "determining a reference object matching a target object using body type-related data" described above, the method for determining body shape information of the present application further includes steps S21 and S22.
Step S21: at least object information of the target object is acquired.
In this embodiment, the subject information includes one or more of gender and height. In other embodiments, the subject information may also include age, place of residence, place of birth, weight, and the like.
Step S22: at least one candidate object matching the object information is found from the second set of candidate objects to form a first set of candidate objects.
Generally, a candidate object whose object information matches that of the target object tends to have a higher degree of matching with the body type related data of the target object. Therefore, in the present embodiment, at least one candidate object matching the object information may be found from the second candidate object set to constitute the first candidate object set. For example, if the object information of the target object is gender and height, the gender is male, and the height is 180 cm, at least one candidate object whose gender is male and whose height is 180 cm can be found from the second candidate object set to form the first candidate object set. In one embodiment, when the object information of the target object includes height, candidate objects whose height differs from that of the target object by no more than a certain range may also be selected to form the first candidate object set.
Therefore, by searching for at least one candidate object matching the object information in the second candidate object set, candidate objects that better match the target object can be selected from the second candidate object set to form the first candidate object set. This reduces the number of candidate objects that need to be matched against the body type related data of the target object, thereby improving the efficiency of finding the reference object and hence of determining the body shape information.
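A minimal sketch of steps S21 and S22 is given below, assuming each candidate in the second candidate object set carries 'gender' and 'height' fields alongside its preset reference body shape information, and assuming a height tolerance of ±5 cm for "matching"; both assumptions are illustrative only.

def build_first_candidate_set(second_set, gender, height_cm, height_tolerance_cm=5.0):
    """Step S22: keep only candidates whose object information matches the target object."""
    return [
        candidate for candidate in second_set
        if candidate["gender"] == gender
        and abs(candidate["height"] - height_cm) <= height_tolerance_cm
    ]

# Example from the description: a male target object of height 180 cm.
second_set = [
    {"gender": "male", "height": 180, "body_fat_rate": 0.22},
    {"gender": "male", "height": 168, "body_fat_rate": 0.18},
    {"gender": "female", "height": 180, "body_fat_rate": 0.25},
]
first_set = build_first_candidate_set(second_set, gender="male", height_cm=180)
# Only the first candidate remains in the first candidate object set.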
Referring to fig. 3, fig. 3 is a flowchart illustrating a method for determining profile information according to another embodiment of the present application. In this embodiment, the body shape-related data of the target object is a target image including the target object. The target object of the target image may reflect data related to the body shape of the target object, for example, size data such as thigh length, upper arm length, etc. of the target object may be determined through the target image. In addition, the image information on the target image can reflect the body type related information of the target object as a whole. Therefore, the body shape-related data of the target object is determined as the target image including the target object in the present embodiment. In the present embodiment, the "determining a reference object matching the target object using the body type-related data" mentioned in the above embodiments includes steps S31 and S32.
Step S31: and comparing the target image with the candidate images containing the candidate objects to obtain the body type matching degree of the target object and each candidate object.
The target image is compared with the candidate images including the candidate objects, and in one embodiment, the comparison may be performed based on an image matching method, specifically, the target object in the target image is matched with the candidate object in the candidate images. The image matching method may be a general matching method, and is not described herein again. In another specific embodiment, specific body type related data of the target object in the target image may be acquired, for example, by using a measurement method based on an augmented reality technology, or by using a method of detecting human key points of the target object as mentioned in the above embodiment. In addition, specific body type related data of the candidate object in the candidate image can be acquired, and the specific body type related data of the candidate object can be preset data. And then calculating the matching degree of the body type related data of the two. Thus, the body type matching degree between the target object and each candidate object can be obtained.
In one embodiment, the target image includes: a plurality of target sub-images obtained by photographing the target object from a plurality of angles. Based on the plurality of target sub-images, body type related data of the target object at a plurality of angles may be obtained, and the plurality of target sub-images may be compared with candidate sub-images of one candidate object at a plurality of angles, respectively. In one embodiment, prompt information may be displayed to the target subject so that the target subject is photographed wearing close-fitting clothes, so that body type related data more consistent with the actual body shape of the target subject can be obtained.
Specifically, the step of comparing the target image with the candidate images including the candidate objects to obtain the body type matching degrees between the target object and the candidate objects in the above embodiment may include: for each candidate image, comparing the plurality of target sub-images with that candidate image respectively to obtain the body type matching degree of the target object in each target sub-image with each candidate object.
In a specific embodiment, the plurality of target sub-images are respectively compared with each candidate image, or the plurality of target sub-images are respectively compared with candidate sub-images containing one candidate object at a plurality of angles, so as to obtain the body type matching degree between the target object in each target sub-image and the candidate object. For example, if the target sub-images include images of the front, back, and both sides of the target object, and the candidate sub-images also include images of the front, back, and both sides of the candidate object, the front image of the target object can be compared with the front image of the candidate object, the back image of the target object with the back image of the candidate object, and so on, finally obtaining the body type matching degree between the target object in each target sub-image and the candidate object. In another embodiment, the several target sub-images may each be compared with one image containing one candidate object, so as to obtain the body type matching degree between the target object in each target sub-image and each candidate object.
In a specific embodiment, the body type related data of the target object that each target sub-image can provide may be acquired from the plurality of target sub-images, and the body type related data obtained from each target sub-image is then matched against the body type related data of the candidate object, so as to obtain the body type matching degree between the target object in each target sub-image and each candidate object. For example, the chest thickness data acquired from a target sub-image is matched with the chest thickness data of the candidate object, so as to obtain the body type matching degree of the target object in that target sub-image with the candidate object. The body type matching degree may be calculated using the above-mentioned methods for calculating the matching degree and the final matching degree, so as to obtain the body type matching degree between the target object in each target sub-image and each candidate object.
Step S32: and selecting the candidate object with the corresponding body type matching degree meeting the preset requirement as a reference object.
In one embodiment, the body type matching degree meets a preset requirement, and may be the highest body type matching degree. For example, after the body type matching degree of each candidate with the target object is calculated, the candidate with the highest body type matching degree with the target object is selected as the reference object. In a specific embodiment, the body type matching degree meets a preset requirement, or the body type matching degree of the data related to the body types is the highest. For example, the corresponding matching degrees are calculated based on the chest, waist, hip, upper arm, thigh, and lower leg of the target object and the candidate object, respectively, and the candidate object having the highest degree of matching with the body type of the target object based on the chest, waist, and hip may be selected as the reference object.
In one embodiment, corresponding to a case that the target image includes a plurality of target sub-images obtained by respectively photographing the target object from a plurality of angles, the candidate object whose corresponding body type matching degree meets the preset requirement may be selected based on the body type matching degree of the target object in each target sub-image and each candidate object. In a specific embodiment, a final body type matching degree may be obtained based on the body type matching degree between the target object in each target sub-image and each candidate object, and then the corresponding candidate object with the highest final body type matching degree may be selected as the reference object. In another embodiment, a final body type matching degree may be obtained based on the body type matching degrees of the target object and each candidate object in the partial target sub-images, and then the corresponding candidate object with the highest final body type matching degree may be selected as the reference object. Therefore, through the plurality of target sub-images obtained by respectively shooting the target object from a plurality of angles, more comprehensive body type related data about the target object can be obtained, so that the target object can be better matched with the candidate object, and the matching accuracy is improved.
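Below is a minimal sketch of aggregating the per-sub-image body type matching degrees into a final matching degree and selecting the reference object; averaging over all angles is an assumed choice (the description also allows using only part of the sub-images).

def final_body_type_matching(per_angle_degrees):
    """Combine the matching degrees obtained from the individual target sub-images
    (e.g. front, back, left, right) into one final body type matching degree."""
    return sum(per_angle_degrees) / len(per_angle_degrees)

def select_reference_object(per_candidate_degrees):
    """per_candidate_degrees maps a candidate identifier to its list of per-angle
    matching degrees; the candidate with the highest final matching degree is
    selected as the reference object."""
    return max(per_candidate_degrees,
               key=lambda cid: final_body_type_matching(per_candidate_degrees[cid]))

# Example: two candidates compared against front/back/left/right sub-images.
degrees = {
    "candidate_1": [0.97, 0.95, 0.96, 0.94],
    "candidate_2": [0.99, 0.98, 0.97, 0.98],
}
reference_id = select_reference_object(degrees)  # -> "candidate_2"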
Therefore, the target image is compared with the candidate images containing the candidate objects, so that the body type matching degrees of the target object and the candidate objects can be obtained, the candidate objects with the body type matching degrees meeting the preset requirements can be selected as the reference objects, and the matching between the candidate objects and the target object is realized.
In one embodiment, for the body shape related data of the target object as a target image containing the target object, after determining the target body shape information of the target object, the method for determining the body shape information further includes: and displaying the target body information on the target image. For example, the target body shape information is body fat rate and muscle tissue distribution information, and the body fat rate and muscle tissue distribution information may be displayed on the target image. In addition, muscle tissues can be superposed and displayed on the target object in the target image, so that the user can more intuitively know the muscle tissue distribution information of the user, and therefore, the user can conveniently know the target body information of the user by displaying the target body information on the target image. The specific method of displaying the target shape information is, for example, a display method implemented by using an augmented reality technology.
Referring to fig. 4, fig. 4 is a flowchart illustrating a method for determining profile information according to another embodiment of the present application. In the present embodiment, the target body shape information includes body fat information. The above-described "display of target body information on the target image" includes steps S41 to S43.
Step S41: the calorie corresponding to the fat weight of the target subject is determined based on the body fat information.
In this embodiment, if the weight of the target subject is already determined, the fat weight of the target subject may be determined based on the weight and the body fat rate of the target subject, and the calorie corresponding to the fat weight of the target subject may be determined based on the conversion relationship between the fat and the calorie. If the weight of the target object is not determined, the calorie corresponding to the fat weight of the target object can be determined by using the weight information of the reference object.
Step S42: determining thermal material matching the heat.
The thermal material may be material that has been determined to contain a particular amount of heat. The calorie material matched with the calories is that the calories contained in the plurality of calorie materials are the same as the calories corresponding to the fat weight of the target object. For example, if the fat weight of the target subject is determined to correspond to 10000 calories and each hamburger contains 250 calories, then the caloric material matching 10000 calories can be determined to be 40 hamburger.
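A minimal sketch of steps S41 and S42 follows, assuming the target object's weight is known, that one kilogram of body fat corresponds to roughly 7700 kcal (a common approximation, not a figure taken from the description), and that one hamburger contains 250 kcal as in the example above.

KCAL_PER_KG_FAT = 7700.0       # common approximation for body fat, assumed here
KCAL_PER_HAMBURGER = 250.0     # value used in the example above

def fat_calories(weight_kg, body_fat_rate):
    """Step S41: calories corresponding to the fat weight of the target object."""
    fat_kg = weight_kg * body_fat_rate
    return fat_kg * KCAL_PER_KG_FAT

def matching_caloric_material(calories, kcal_per_item=KCAL_PER_HAMBURGER):
    """Step S42: number of items whose total caloric content matches the given calories."""
    return round(calories / kcal_per_item)

# Example from the description: 10000 calories of body fat correspond to 40 hamburgers.
print(matching_caloric_material(10000))  # -> 40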
Step S43: displaying the caloric material on the target image by using an augmented reality display technology.
The caloric material can be displayed superimposed on the target object in the target image, or at another position in the target image. Specifically, an augmented reality display technology may be used to display the caloric material on the target image. In other embodiments, the caloric material may be directly superimposed on the target image.
In one embodiment, the target image may be continuously acquired, the calories corresponding to the fat weight of the target object may be determined based on the acquired target image, and the caloric material may then be displayed on the acquired target image using the augmented reality display technology. In one embodiment, in the case that the target body information further includes fat distribution information, a caloric material matching the fat of a body part may be determined in combination with the fat distribution of the target object, and displayed at the corresponding part of the target object using the augmented reality display technology. For example, if it is determined that the caloric material matching the fat of the target subject's stomach is two roast chickens, the two roast chickens may be displayed on the target subject's stomach.
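The sketch below shows one simple way to overlay a caloric-material icon on the target image with OpenCV; full augmented-reality rendering (tracking, pose estimation, occlusion handling) is outside its scope, and the anchor position and icon file are hypothetical.

import cv2

def overlay_caloric_material(target_img, icon_img, anchor_xy, alpha=0.8):
    """Blend a caloric-material icon onto the target image at a given anchor point,
    e.g. near the stomach region of the target object."""
    x, y = int(anchor_xy[0]), int(anchor_xy[1])
    h, w = icon_img.shape[:2]
    roi = target_img[y:y + h, x:x + w]
    if roi.shape[:2] != (h, w):           # icon would fall outside the image
        return target_img
    target_img[y:y + h, x:x + w] = cv2.addWeighted(icon_img, alpha, roi, 1.0 - alpha, 0.0)
    return target_img

# Hypothetical usage: place a roast-chicken icon over the stomach region.
# target = cv2.imread("target.jpg"); icon = cv2.imread("roast_chicken.png")
# target = overlay_caloric_material(target, icon, anchor_xy=(320, 400))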
Therefore, by determining the caloric material matching the calories corresponding to the fat weight of the target object and displaying the caloric material on the target image using the augmented reality display technology, the body fat information of the target object can be displayed more intuitively, so that the user can conveniently know his or her own body fat information.
Referring to fig. 5, fig. 5 is a schematic diagram of a framework of an embodiment of the apparatus for determining profile information according to the present application. In the present embodiment, the determination device 50 of the body information includes an acquisition module 51, an object determination module 52, and an information determination module 53. The obtaining module 51 is configured to obtain body type related information of the target object; the object determination module 52 is configured to determine a reference object matching the target object by using the body shape-related data; the information determining module 53 is configured to determine target shape information of the target object based on the reference shape information corresponding to the reference object.
Wherein, the reference body information and the target body information both include: at least one of body fat information and muscle tissue distribution information; and/or, the information determining module 53 is configured to determine the target shape information of the target object based on the reference shape information corresponding to the reference object, and includes: and taking the reference body information of the reference object as the reference body information of the target object.
The object determination module 52 is configured to determine a reference object matching the target object by using the body type-related data, and includes: selecting a reference object matched with the target object from the first candidate object set based on the body type related data; the first candidate object set comprises a plurality of candidate objects, and the candidate objects are preset with reference shape information.
The device 50 for determining body shape information further includes a screening module. Before the object determination module 52 determines the reference object matching the target object using the body type related data, the screening module is configured to: obtain at least object information of the target object, where the object information includes one or more of gender and height; and find at least one candidate object matching the object information from the second candidate object set to form the first candidate object set.
The obtaining module 51 is configured to obtain the body type related information of the target object, and specifically includes: detecting a target image containing a target object to obtain a plurality of human body key points of the target object; and acquiring body type related data of the target object based on the plurality of human body key points.
The object determining module 52 is configured to determine a reference object matching the target object by using the body type-related data, and includes: comparing the target image with the candidate images containing the candidate objects to obtain the body type matching degree of the target object and each candidate object; and selecting the candidate object with the corresponding body type matching degree meeting the preset requirement as a reference object.
Wherein the target image includes: a plurality of target sub-images obtained by respectively photographing the target object from a plurality of angles; the object determination module 52 is configured to compare the target image with the candidate images including the candidate objects to obtain the body type matching degrees between the target object and the candidate objects, including: for each candidate image, comparing the plurality of target sub-images with that candidate image respectively to obtain the body type matching degree of the target object in each target sub-image with each candidate object; the object determination module 52 is configured to select a candidate object whose body type matching degree meets a preset requirement as the reference object, including: selecting the candidate object whose corresponding body type matching degree meets the preset requirement based on the body type matching degree of the target object in each target sub-image with each candidate object.
The shape information determining device 50 further comprises a display module, and after the information determining module 53 is used for determining the target shape information of the target object, the display module is used for displaying the target shape information on the target image.
Wherein, the target body information comprises body fat information; the display module is configured to display the target body information on the target image, including: determining the calories corresponding to the fat weight of the target object based on the body fat information; determining a caloric material matching the calories; and displaying the caloric material on the target image by using an augmented reality display technology.
Referring to fig. 6, fig. 6 is a schematic frame diagram of an embodiment of an electronic device according to the present application. The electronic device 60 comprises a memory 601 and a processor 602 coupled to each other, and the processor 602 is configured to execute program instructions stored in the memory 601 to implement the steps of any of the above-mentioned embodiments of the method for determining physical information. In one particular implementation scenario, electronic device 60 may include, but is not limited to: a microcomputer, a server, and in addition, the electronic device 60 may also include a mobile device such as a notebook computer, a tablet computer, and the like, which is not limited herein.
In particular, the processor 602 is configured to control itself and the memory 601 to implement the steps of any of the above-described method embodiments of determining physical information. Processor 602 may also be referred to as a CPU (Central Processing Unit). The processor 602 may be an integrated circuit chip having signal processing capabilities. The Processor 602 may also be a general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. In addition, the processor 602 may be commonly implemented by integrated circuit chips.
Referring to fig. 7, fig. 7 is a block diagram illustrating an embodiment of a computer-readable storage medium according to the present application. The computer readable storage medium 70 stores program instructions 701 executable by the processor, the program instructions 701 being for implementing the steps of any of the above-described method embodiments of determining physical information.
According to the above scheme, the reference object matching the target object can be determined using the body type related data of the target object, and the target body information of the target object can then be determined according to the reference body information corresponding to the reference object, so that the target body information of the target object can be acquired.
The disclosure relates to the field of augmented reality, and aims to detect or identify relevant features, states and attributes of a target object by means of various vision-related algorithms by acquiring image information of the target object in a real environment, so as to obtain an AR effect combining virtual and reality matched with specific applications. For example, the target object may relate to a face, a limb, a gesture, an action, etc. associated with a human body, or a marker or mark associated with an object, or a sand table, a display area, a display item, etc. associated with a venue or place. The vision-related algorithms may involve visual localization, SLAM, three-dimensional reconstruction, image registration, background segmentation, key point extraction and tracking of objects, pose or depth detection of objects, and the like. The specific application can not only relate to interactive scenes such as navigation, explanation, reconstruction, virtual effect superposition display and the like related to real scenes or articles, but also relate to special effect treatment related to people, such as interactive scenes such as makeup beautification, limb beautification, special effect display, virtual model display and the like.
The detection or identification processing of the relevant characteristics, states and attributes of the target object can be realized through the convolutional neural network. The convolutional neural network is a network model obtained by performing model training based on a deep learning framework.
In some embodiments, functions of or modules included in the apparatus provided in the embodiments of the present disclosure may be used to execute the method described in the above method embodiments, and specific implementation thereof may refer to the description of the above method embodiments, and for brevity, will not be described again here.
The foregoing description of the various embodiments is intended to highlight various differences between the embodiments, and the same or similar parts may be referred to each other, and for brevity, will not be described again herein.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a module or a unit is merely one type of logical division, and an actual implementation may have another division, for example, a unit or a component may be combined or integrated with another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some interfaces, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on network elements. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor (processor) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.

Claims (12)

1. A method for determining shape information, comprising:
acquiring body type related data of a target object;
determining a reference object matched with the target object by using the body type related data;
and determining the target body information of the target object based on the reference body information corresponding to the reference object.
2. The method of claim 1, wherein the reference shape information and the target shape information each comprise: at least one of body fat information and muscle tissue distribution information;
and/or, the determining the target body information of the target object based on the reference body information corresponding to the reference object includes: taking the reference body information of the reference object as the target body information of the target object.
3. The method according to claim 1 or 2, wherein the determining a reference object matching the target object using the body type-related data comprises:
selecting a reference object matched with the target object from a first candidate object set based on the body type related data; the first candidate object set comprises a plurality of candidate objects, and the candidate objects are preset with reference shape information.
4. The method of claim 3, wherein prior to said using the body conformation-related data to determine a reference object that matches the target object, the method further comprises:
obtaining at least object information of the target object, wherein the object information includes one or more of gender and height;
at least one candidate object matching the object information is found from a second set of candidate objects to form the first set of candidate objects.
5. The method according to any one of claims 1-4, wherein the obtaining body type-related data of the target subject comprises:
detecting a target image containing the target object to obtain a plurality of human body key points of the target object;
and acquiring body type related data of the target object based on the plurality of human body key points.
6. The method according to claim 3 or 4, wherein the body shape-related data of the target object is a target image containing the target object; the determining, by using the body shape-related data, a reference object matching the target object includes:
comparing the target image with candidate images containing the candidate objects to obtain body type matching degrees of the target object and the candidate objects respectively;
and selecting the candidate object with the corresponding body type matching degree meeting the preset requirement as the reference object.
7. The method of claim 6, wherein the target image comprises: respectively shooting the target object from a plurality of angles to obtain a plurality of target sub-images;
the comparing the target image with the candidate images containing the candidate objects to obtain the body type matching degrees of the target object and the candidate objects respectively comprises:
for each candidate image, comparing the plurality of target sub-images with that candidate image respectively to obtain the body type matching degree of the target object in each target sub-image with each candidate object;
the selecting the candidate object with the body type matching degree meeting the preset requirement as the reference object comprises:
and selecting the candidate objects with the corresponding body type matching degrees meeting preset requirements based on the body type matching degrees of the target objects in the target sub-images and the candidate objects.
8. The method of any one of claims 6-7, wherein after determining the target shape information of the target object, the method further comprises:
and displaying the target body information on the target image.
9. The method of claim 8, wherein the target body information includes body fat information; the displaying the target body information on the target image comprises:
determining the calories corresponding to the fat weight of the target object based on the body fat information;
determining a caloric material matching the calories;
and displaying the caloric material on the target image by using an augmented reality display technology.
10. An apparatus for determining shape information, comprising:
the acquisition module is used for acquiring the body type related data of the target object;
an object determination module for determining a reference object matching the target object by using the body type-related data;
and the information determining module is used for determining the target body information of the target object based on the reference body information corresponding to the reference object.
11. An electronic device comprising a processor and a memory coupled to each other, wherein,
the processor is configured to execute the memory-stored computer program to perform the method of any of claims 1 to 9.
12. A computer-readable storage medium, in which a computer program is stored which can be executed by a processor, the computer program being adapted to carry out the method of any one of claims 1 to 9.
CN202111101201.7A 2021-09-18 2021-09-18 Method for determining form information, related device, equipment and storage medium Pending CN113837056A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111101201.7A CN113837056A (en) 2021-09-18 2021-09-18 Method for determining form information, related device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111101201.7A CN113837056A (en) 2021-09-18 2021-09-18 Method for determining form information, related device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113837056A true CN113837056A (en) 2021-12-24

Family

ID=78960030

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111101201.7A Pending CN113837056A (en) 2021-09-18 2021-09-18 Method for determining form information, related device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113837056A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination