WO2021043204A1 - Data processing method and device, computer device and computer-readable storage medium - Google Patents

Data processing method and device, computer device and computer-readable storage medium

Info

Publication number
WO2021043204A1
WO2021043204A1 (application PCT/CN2020/113198; CN2020113198W)
Authority
WO
WIPO (PCT)
Prior art keywords
human body
posture
data
preset
dimensional human
Prior art date
Application number
PCT/CN2020/113198
Other languages
English (en)
French (fr)
Original Assignee
程立苇
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 程立苇 filed Critical 程立苇
Priority to CN202080061120.6A priority Critical patent/CN114556402A/zh
Priority to US17/639,965 priority patent/US11849790B2/en
Publication of WO2021043204A1 publication Critical patent/WO2021043204A1/zh

Classifications

    • A HUMAN NECESSITIES
    • A41 WEARING APPAREL
    • A41H APPLIANCES OR METHODS FOR MAKING CLOTHES, e.g. FOR DRESS-MAKING OR FOR TAILORING, NOT OTHERWISE PROVIDED FOR
    • A41H 3/00 Patterns for cutting-out; Methods of drafting or marking-out such patterns, e.g. on the cloth
    • A41H 3/007 Methods of drafting or marking-out patterns using computers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/06 Buying, selling or leasing transactions
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G06T 13/20 3D [Three Dimensional] animation
    • G06T 13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30196 Human being; Person
    • G06T 2210/00 Indexing scheme for image generation or computer graphics
    • G06T 2210/16 Cloth
    • G06T 2210/56 Particle system, point based geometry or rendering

Definitions

  • The present invention relates to the field of computer technology, and in particular to data processing methods and devices, computer devices, and computer-readable storage media.
  • Virtual fitting refers to trying on virtual clothes in a virtual space through an avatar that matches the user.
  • Virtual fitting does not require the user to actually change clothes, yet achieves the effect of the user wearing a given garment.
  • The emergence of virtual fitting technology provides users with a convenient and quick fitting experience and has greatly changed traditional ways of trying on clothes.
  • Existing virtual fitting solutions rely on three-dimensional modeling and therefore suffer from low fitting efficiency and poor user experience.
  • An embodiment of the present invention provides a data processing method to improve the efficiency of virtual fitting and the user experience.
  • the method includes:
  • acquiring image data including the dressed human body through an image acquisition device;
  • the image data of the dressed human body reflects the projection data of the three-dimensional human body on a first plane perpendicular to the center line of the field of view of the image acquisition device;
  • the two-dimensional human posture reflects the projection data of the three-dimensional human body on a second plane;
  • the second plane is the plane of symmetry between the front and the back of the dressed human body;
  • the part of each preset human body part in the two-dimensional human posture that is covered by the original clothing is assigned the cloth physical properties of the clothing to be tested.
  • the embodiment of the present invention also provides a data processing device to improve the efficiency of virtual fitting and improve user experience.
  • the device includes:
  • the image data acquisition module is used to acquire image data including the wearing human body through the image acquisition device; the image data of the wearing human body reflects the projection data of the three-dimensional human body on a first plane perpendicular to the center line of the field of view of the image acquisition device;
  • the posture recognition module is used to recognize the two-dimensional human posture of the wearing human body according to the image data; the two-dimensional human posture reflects the projection data of the three-dimensional human body on the second plane; the second plane is the symmetrical plane between the front of the wearing human body and the back of the wearing human body;
  • the posture matching module is used to match the recognized two-dimensional human posture in the human posture database to determine the preset human body part of the two-dimensional human posture;
  • the virtual fitting module is used to assign the cloth physical properties of the garment to be tested to the part of each preset human body part in the two-dimensional human body posture that is covered by the original garment.
  • An embodiment of the present invention also provides a computer device, including a memory, a processor, and a computer program stored in the memory and capable of running on the processor, and the processor implements the above-mentioned data processing method when the computer program is executed.
  • the embodiment of the present invention also provides a computer-readable storage medium, and the computer-readable storage medium stores a computer program for executing the above-mentioned data processing method.
  • In the embodiment of the present invention, image data including the dressed human body is acquired through the image acquisition device; the two-dimensional human body posture of the dressed human body is then recognized from the image data; the preset human body parts of the two-dimensional posture are determined by matching in the human body posture database; and finally the cloth physical properties of the garment to be tested are directly assigned to the parts of the preset human body parts covered by the original garment.
  • The embodiment of the present invention thus needs no three-dimensional human body model or clothing model: it only recognizes the two-dimensional posture of the dressed human body and directly assigns the cloth physical properties of the clothing to be tested to the covered parts, which greatly improves data processing efficiency and thereby the user experience.
  • Fig. 1 is an implementation flowchart of a data processing method provided by an embodiment of the present invention
  • FIG. 2 is a schematic diagram (side view) of the position of the image acquisition device and the first plane provided by an embodiment of the present invention
  • FIG. 3 is a schematic diagram (top view) of the positions of the first plane and the second plane provided by an embodiment of the present invention
  • FIG. 4 is a schematic diagram of a two-dimensional human body posture provided by an embodiment of the present invention (the center line of the field of view is perpendicular to the second plane);
  • FIG. 5 is another implementation flowchart of a data processing method provided by an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of a basic posture of a two-dimensional human body provided by an embodiment of the present invention (the center line of the field of view is perpendicular to the first plane);
  • FIG. 7 is a flowchart of the implementation of step 504 in the data processing method provided by an embodiment of the present invention.
  • FIG. 8 is a schematic diagram of the garment data to be tested covering the original garment data provided by an embodiment of the present invention.
  • Fig. 9(a) is a schematic diagram of the original clothing data being covered by the to-be-tested clothing data provided by an embodiment of the present invention.
  • Fig. 9(b) is another schematic diagram of the original clothing data overlaid with the to-be-tested clothing data provided by the embodiment of the present invention.
  • FIG. 10 is another implementation flowchart of step 504 in the data processing method provided by an embodiment of the present invention.
  • FIG. 11 is a schematic diagram of stretching and rebounding of the garment data to be tested for the arm part provided by an embodiment of the present invention.
  • FIG. 12 is a schematic diagram of stretching and rebounding of the garment data to be tested for the torso provided by an embodiment of the present invention
  • FIG. 13 is another implementation flowchart of step 504 in the data processing method provided by an embodiment of the present invention.
  • FIG. 14 is another implementation flowchart of step 504 in the data processing method provided by an embodiment of the present invention.
  • FIG. 15 is a functional module diagram of a data processing device provided by an embodiment of the present invention.
  • FIG. 16 is a diagram of another functional module of a data processing device according to an embodiment of the present invention.
  • FIG. 17 is a structural block diagram of a clothing data covering module 1604 in a data processing device provided by an embodiment of the present invention.
  • FIG. 18 is another structural block diagram of the clothing data covering module 1604 in the data processing device provided by the embodiment of the present invention.
  • FIG. 19 is another structural block diagram of the clothing data covering module 1604 in the data processing device provided by the embodiment of the present invention.
  • FIG. 20 is another structural block diagram of the clothing data covering module 1604 in the data processing device provided by an embodiment of the present invention.
  • FIG. 1 shows the implementation process of the data processing method provided by the embodiment of the present invention.
  • the details are as follows:
  • the data processing method includes:
  • Step 101 Obtain image data including the dressed human body through the image acquisition device; the image data of the dressed human body reflects projection data of the three-dimensional human body on a first plane perpendicular to the center line of the field of view of the image acquisition device;
  • Step 102 Identify the two-dimensional body posture of the wearing human body according to the image data; the two-dimensional human posture reflects the projection data of the three-dimensional human body on a second plane; the second plane is a symmetrical plane between the front of the wearing human body and the back of the wearing human body;
  • Step 103 Match the recognized two-dimensional human posture in a human posture database to determine a preset human body part of the two-dimensional human posture;
  • Step 104 Assign the cloth physical properties of the garment to be tested to the part of the preset human body part in the two-dimensional human body posture that is covered by the original garment.
  • the image acquisition device may be, for example, a normal camera or a depth camera that can acquire a single frame of image, or a camera that can acquire a video image.
  • the image data may include RGB data, and may also include RGBD data (RGB data and depth data of the distance from the body to the depth camera) and so on. That is, step 101, acquiring image data including the wearing human body through an image acquisition device includes: acquiring RGB data including the wearing human body through a camera, or acquiring RGBD data including the wearing human body through a depth camera.
  • the image acquisition device is a camera
  • the video image captured by the camera can be first converted into a single frame image, and then each frame of image including the human body is acquired.
  • the dressed human body may be a person wearing clothing, a clothing display stand, or a human-body-like carrier such as a mannequin skeleton or a clothes rack, which is not particularly limited in the embodiment of the present invention.
  • FIG. 2 shows a schematic diagram (side view) of the position of the image acquisition device and the first plane provided by the embodiment of the present invention.
  • the details are as follows:
  • the first plane is perpendicular to the center line of the field of view of the image acquisition device.
  • the image acquisition device is located in the middle area near the top of the virtual fitting terminal.
  • the virtual fitting terminal may include, for example, a smart virtual fitting mirror or other fitting terminals including a display screen.
  • the first plane is parallel to the plane where the virtual fitting terminal is located, and the front of the wearing human body faces the virtual fitting terminal.
  • the image data including the wearing human body obtained by the image acquisition device reflects the projection of the three-dimensional human body on the first plane.
  • FIG. 3 shows a schematic diagram (top view) of the positions of the first plane and the second plane provided by an embodiment of the present invention.
  • FIG. 4 shows a schematic diagram of a two-dimensional human body posture provided by an embodiment of the present invention (the center line of the field of view is perpendicular to the second plane).
  • The width range of the three-dimensional human body projected on the second plane is S. Viewed along a field-of-view center line perpendicular to the second plane, the three-dimensional human body projects the two-dimensional human body posture shown in Fig. 4 onto the second plane.
  • The first plane and the second plane may be parallel to each other or the same plane, or they may be different planes.
  • When the first plane and the second plane are parallel or identical, the projection of the three-dimensional human body on the second plane coincides in size and posture with its projection on the first plane, so the projection on the first plane can be used directly as the projection on the second plane.
  • When the first plane and the second plane are neither parallel nor identical, there is an angle θ between them (that is, the human body stands tilted in front of the virtual fitting terminal).
  • In this case, the image data of the dressed human body acquired by the image acquisition device is the projection data of the three-dimensional human body on the first plane, with a width range L.
  • Based on the geometric relationship between the first plane and the second plane, the projection data of the dressed human body on the second plane can be determined from L and θ, and the two-dimensional human body posture of the dressed human body on the second plane can then be recognized.
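Under a simple reading of that geometric relationship, when the two planes differ only by a rotation of θ about the vertical axis, the projected widths relate by L = S·cos θ. The sketch below encodes that assumption; the function name and the cosine model are illustrative, not stated explicitly in the patent.

```python
import math

def width_on_second_plane(width_on_first_plane: float, angle_deg: float) -> float:
    """Estimate the body's projected width S on the second (body-symmetry)
    plane from its measured width L on the first (camera) plane.

    Assumes the planes differ only by a rotation of angle_deg about the
    vertical axis (the person stands tilted in front of the mirror), so
    L = S * cos(theta). A hedged sketch, not the patent's exact formula.
    """
    theta = math.radians(angle_deg)
    if math.cos(theta) <= 0.0:
        raise ValueError("angle must be less than 90 degrees")
    return width_on_first_plane / math.cos(theta)
```

For example, a body measured at half its true width on the camera plane corresponds to a 60° tilt, since cos 60° = 0.5.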
  • the two-dimensional body posture of the wearing human body is recognized based on the obtained image data.
  • the acquired image data is RGBD data (RGB data plus the depth of the dressed human body from the depth camera)
  • the two-dimensional body posture of the wearing body can be recognized through the following steps.
  • The human body posture database is called, and human-figure detection is performed on the pixel-point combinations.
  • The pixel-point combinations are matched one by one against the human body posture models in the database (starting from the T-pose basic human body model), and the database posture most similar to the pixel-point combination is selected. If the similarity is too low or no similar posture is found, the preset similarity standard for the depth values at similar positions in the RGBD image is lowered, the image pixels are re-clustered and combined, and the matching against the posture models (again starting from the T-pose basic model) is repeated.
  • The matched human posture model in the database is taken as the posture of the currently dressed human body.
  • When the acquired image data is RGB data, the two-dimensional body posture of the dressed human body can be recognized through the following steps.
  • The human body posture database is called, and human-figure detection is performed on the pixel-point combinations.
  • The pixel-point combinations are matched one by one against the human body posture models in the database (starting from the T-pose basic human body model), and the database posture most similar to the pixel-point combination is selected.
  • If the similarity is too low or no similar posture is found, the preset similarity standard for the colors at similar positions in the RGB image is lowered, the image pixels are re-clustered and combined, and the matching against the posture models (again starting from the T-pose basic model) is repeated.
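The select-then-relax loop above can be sketched as follows. This is a simplification: the patent lowers the similarity standard used for re-clustering the image pixels, which is folded here into a single matching threshold. The function names, the numeric defaults, and the scalar stand-ins for poses are all illustrative assumptions.

```python
def match_pose(pixel_cluster, pose_db, similarity,
               threshold=0.9, step=0.05, floor=0.5):
    """Pick the database pose most similar to the clustered pixel
    combination; if no pose meets the current similarity standard,
    lower the standard and retry, giving up at a floor value.

    `similarity(cluster, pose)` is assumed to return a score in [0, 1].
    """
    t = threshold
    while t >= floor:
        # scan candidate poses, starting from the T-pose basic model at index 0
        best = max(pose_db, key=lambda pose: similarity(pixel_cluster, pose))
        if similarity(pixel_cluster, best) >= t:
            return best
        t -= step  # relax the preset similarity standard and retry
    return None  # no sufficiently similar pose in the database
```

With a toy similarity over scalar "poses", a near match is eventually accepted after the threshold relaxes, while a hopeless one returns `None`.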
  • the two-dimensional body posture of the wearing human body is projection data of the three-dimensional human body on the second plane.
  • Because the human body can be approximately regarded as a symmetrical structure,
  • the second plane is defined as the plane of symmetry between the front and the back of the human body. That is to say, the embodiment of the present invention does not need to construct a three-dimensional human body model or a three-dimensional clothing model; it directly obtains the two-dimensional posture of the three-dimensional human body on a plane and then uses that posture for the virtual fitting, which can greatly improve the efficiency of the virtual fitting and the user experience.
  • Human body posture recognition can identify the boundary range of the two-dimensional human body posture in the image data, that is, delineate the boundary range of the recognized human body, but cannot recognize the posture of various parts of the human body such as the head, arms, torso, and legs.
  • Therefore, the two-dimensional human posture is matched in the human posture database to determine the posture of each preset human body part in the two-dimensional human posture, so as to improve the accuracy and realism of the virtual fitting.
  • A preset human body part is a human body part designated in advance. Those skilled in the art will understand that it may include one or more of the head, arms, torso, and legs; for example, it may include only the arms and torso (when virtually fitting tops) or only the legs (when virtually fitting bottoms such as pants or skirts), which is not particularly limited in the embodiment of the present invention.
  • the human body pose database is a human body pose database trained in advance through a deep learning framework such as TensorFlow, and the human body pose database obtained by this training contains a large number or even a massive amount of different two-dimensional human body poses.
  • TensorFlow is a second-generation artificial intelligence learning system developed by Google. It transmits complex data structures to artificial neural networks for analysis and processing, can be used in multiple machine learning and deep learning fields such as speech recognition and image recognition, and supports convolutional neural networks, recurrent neural networks, long short-term memory networks, and so on.
  • the recognized two-dimensional human body posture of the wearing human body is matched with the human body posture database to determine the data set and posture of each preset human body part in the two-dimensional human posture.
  • This can be achieved as follows:
  • the bone point position of each preset human body part in the two-dimensional human posture is recognized and then combined with the boundary data of the two-dimensional human body posture to determine the data set of each preset human body part,
  • such as the head data set, torso data set, and leg data set;
  • the data of each preset body part are then clustered, and data parts with similar colors are combined, to further determine the detailed sub-parts of each preset human body part, such as the upper and lower arm, elbow joint, and finger joints of the arm, or the knee joint, thigh, and calf of the leg.
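One minimal way to turn bone points plus boundary data into per-part data sets is to assign each pixel inside the body boundary to its nearest bone point. The nearest-point rule and the part names below are illustrative assumptions; the patent only states that bone point positions are combined with the boundary data.

```python
import math

def split_into_parts(body_pixels, bone_points):
    """Assign each pixel inside the recognized body boundary to its
    nearest bone point, producing per-part data sets (head set,
    torso set, ...). A hedged sketch, not the patent's exact method.

    body_pixels: iterable of (x, y) pixels inside the body boundary
    bone_points: dict mapping part name -> (x, y) bone point position
    """
    parts = {name: [] for name in bone_points}
    for px in body_pixels:
        nearest = min(bone_points,
                      key=lambda name: math.dist(px, bone_points[name]))
        parts[nearest].append(px)
    return parts
```

A pixel near the head bone point lands in the head data set; one near the torso bone point lands in the torso data set.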
  • When the orientation and angle of a two-dimensional human posture in the human posture database differ from those of the recognized two-dimensional human posture,
  • rotation and angle conversion are performed first to make them consistent; that is, the orientation and angle of the two-dimensional human body posture in the database are adjusted to be basically consistent with those of the recognized posture, and then the matching is performed.
  • In this way, the data set of each preset human body part in the two-dimensional human body posture is matched
  • against the two-dimensional human body postures in the posture database.
  • the part of each human body part covered by the original clothing is directly assigned the cloth physical attributes of the clothing to be tested, so as to perform the operation of virtual fitting.
  • the physical properties of cloth refer to the maximum variable distance (stretching scale) and minimum distance (rebound scale) between two adjacent pixels in the cloth.
  • the physical properties of the cloth of each garment to be tested and the original clothing worn by the human body are known and prepared data.
  • An instruction can be received from the user (for example, the user enters the instruction by clicking, or the cloth physical properties of the clothing are pre-encoded and the user enters the number of the desired property).
  • The cloth physical properties of clothing can also be captured through a high-definition camera, and the cloth physical properties of all clothing can be assembled into a cloth physical property database.
  • The resolution of the high-definition camera is generally not limited, for example from 460×320 up to 4K, 8K, or 16K;
  • the maximum upper limit is determined by the hardware.
  • the cloth physical properties of the clothing to be tested can be determined from the cloth physical property database.
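Given the definition above (maximum variable distance as the stretching scale, minimum distance as the rebound scale between two adjacent cloth pixels), enforcing the property amounts to clamping the inter-pixel distance. The function and parameter names below are illustrative, not from the patent.

```python
def constrain_pixel_distance(distance, stretch_max, rebound_min):
    """Clamp the distance between two adjacent cloth pixels to the
    cloth's physical properties: it may stretch at most to stretch_max
    (the stretching scale) and rebound to no less than rebound_min
    (the rebound scale). A minimal sketch of the stated definition.
    """
    if rebound_min > stretch_max:
        raise ValueError("rebound scale cannot exceed stretching scale")
    return max(rebound_min, min(stretch_max, distance))
```

Distances inside the allowed range pass through unchanged; over-stretched or over-compressed pairs are snapped back to the nearest limit.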
  • In the embodiment of the present invention, image data including the dressed human body is acquired through the image acquisition device; the two-dimensional human posture of the dressed human body is then recognized from the image data; the preset human body parts of the two-dimensional posture are determined by matching in the human posture database; and finally the cloth physical properties of the garment to be tested are directly assigned to the parts of the preset human body parts covered by the original garment in the two-dimensional human body posture.
  • The embodiment of the present invention needs no three-dimensional human body model or clothing model: it only recognizes the two-dimensional posture of the dressed human body and directly assigns the cloth physical properties of the clothing to be tested to the covered parts, which can greatly improve the efficiency of the virtual fitting and thereby the user experience.
  • FIG. 5 shows another implementation flow of the data processing method provided by the embodiment of the present invention.
  • the details are as follows:
  • the data processing method further includes:
  • Step 501 Determine the posture change of each preset human body part in the two-dimensional human posture according to the basic posture of the two-dimensional human body;
  • the basic posture of the two-dimensional human body is the projection data of the three-dimensional basic human body model on the first plane;
  • Step 502 Assign the cloth physical properties of the original clothing to the preset body parts covered by the original clothing in the two-dimensional human posture to form original clothing data;
  • Step 503 Assign the cloth physical properties of the garment to be tested to the preset body part covered by the garment to be tested in the basic posture of the two-dimensional human body to form the garment data to be tested;
  • Step 504 According to the posture change of each preset human body part in the two-dimensional human body posture, cover the original clothing data of each preset human body part in the two-dimensional human body posture with the clothing data to be tested of the corresponding part in the basic posture of the two-dimensional human body, to form a two-dimensional human body posture containing the clothing data to be tested.
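Steps 501-504 can be sketched end to end with each posture given as a dict of per-part angles. The angle representation and all names are illustrative assumptions, not the patent's data format.

```python
def fit_garment(pose, base_pose, original_covered, test_covered):
    """Sketch of steps 501-504.

    pose:             recognized 2D posture, e.g. {"arm": 45, "torso": 0}
    base_pose:        2D basic (T-pose) posture, e.g. {"arm": 90, "torso": 0}
    original_covered: parts covered by the original garment -> cloth properties
    test_covered:     parts covered by the garment to be tested -> cloth properties
    """
    # Step 501: posture change of each preset part relative to the basic posture
    changes = {part: pose[part] - base_pose[part] for part in pose}
    # Step 502: original clothing data on the recognized posture
    original_data = dict(original_covered)
    # Step 503: to-be-tested clothing data on the basic posture
    test_data = dict(test_covered)
    # Step 504: rotate each part's test data by its posture change and
    # cover the original clothing data for that part
    fitted = dict(original_data)
    for part, props in test_data.items():
        fitted[part] = {"cloth": props, "rotated_by": changes[part]}
    return fitted
```

Each covered part ends up carrying the to-be-tested cloth properties, posed to match the recognized two-dimensional posture.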
  • the three-dimensional basic human body model is a standard T-pose model.
  • In the standard T-pose three-dimensional basic human body model, the arms are raised horizontally to shoulder height, perpendicular to the torso; it is named the T-pose model because it resembles the letter T.
  • the basic pose of the two-dimensional human body is the projection data of the T-pose three-dimensional basic human body model on the first plane.
  • FIG. 6 shows a schematic diagram of the basic posture of a two-dimensional human body provided by an embodiment of the present invention (the center line of the field of view is perpendicular to the first plane).
  • the center line of the field of view is perpendicular to the first plane.
  • the human body posture shown in the figure is the projection data of the T-pose three-dimensional basic human body model on the first plane, that is, the two-dimensional basic human body posture.
  • Based on the basic posture of the two-dimensional human body, the posture changes of each preset human body part in the two-dimensional human posture shown in Fig. 4 are determined; this is the foundation for covering the cloth physical properties and performing the virtual fitting.
  • The preset body parts covered by the original clothing in the two-dimensional human body posture are directly assigned the cloth physical properties of the original clothing to form the original clothing data, while the preset body parts covered by the clothing to be tested in the basic posture of the two-dimensional human body are assigned the cloth physical properties of the clothing to be tested to form the clothing data to be tested.
  • Then, according to the posture change of each preset human body part in the two-dimensional human posture, the posture of each preset human body part in the basic posture of the two-dimensional human body is adjusted to be consistent with the posture of the corresponding part in the two-dimensional human body posture, and the clothing data to be tested of each part in the basic posture is used to cover the original clothing data of the corresponding part in the two-dimensional human body posture.
  • That is, different human body parts are covered separately, finally forming a two-dimensional human body posture containing the clothing data to be tested and completing the virtual fitting operation.
  • In the embodiment of the present invention, the posture change of each preset human body part in the two-dimensional human posture is determined based on the basic posture of the two-dimensional human body; the cloth physical properties of the original clothing are assigned to the preset human body parts covered by the original clothing in the two-dimensional human posture to form the original clothing data; the cloth physical properties of the clothing to be tested are assigned to the preset human body parts covered by the clothing to be tested in the basic posture to form the clothing data to be tested; and then, according to the posture changes, the original clothing data of each preset human body part is covered with the clothing data to be tested of the corresponding part, which can improve the realism of the virtual fitting.
  • FIG. 7 shows the implementation flow of step 504 in the data processing method provided by the embodiment of the present invention.
  • the details are as follows:
  • the preset human body parts include arms and torso.
  • Step 504, in which, according to the posture changes of each preset human body part in the two-dimensional human body posture, the original clothing data of each preset human body part is covered with the clothing data to be tested of the corresponding part in the basic posture of the two-dimensional human body to form a two-dimensional human body posture containing the clothing data to be tested, includes:
  • Step 701: according to the posture change of the arm part in the two-dimensional human body posture, covering the original clothing data of the arm part in the two-dimensional human body posture with the clothing data to be tested of the arm part in the basic posture of the two-dimensional human body, to form an arm part containing the clothing data to be tested; wherein, the first preset vertex and the first preset boundary of the clothing data to be tested of the arm part in the basic posture of the two-dimensional human body coincide with the first preset vertex and the first preset boundary of the original clothing data of the arm part in the two-dimensional human body posture, or the second preset boundary and the center of the second preset boundary of the clothing data to be tested of the arm part in the basic posture of the two-dimensional human body coincide with the second preset boundary and the center of the second preset boundary of the original clothing data of the arm part in the two-dimensional human body posture;
  • Step 702: covering the original clothing data of the torso part in the two-dimensional human body posture with the clothing data to be tested of the torso part in the basic posture of the two-dimensional human body, to form a torso part containing the clothing data to be tested; wherein, the third preset vertex and the third preset boundary of the clothing data to be tested of the torso part in the basic posture of the two-dimensional human body coincide with the third preset vertex and the third preset boundary of the original clothing data of the torso part in the two-dimensional human body posture, or the fourth preset boundary and the center of the fourth preset boundary of the clothing data to be tested of the torso part in the basic posture of the two-dimensional human body coincide with the fourth preset boundary and the center of the fourth preset boundary of the original clothing data of the torso part in the two-dimensional human body posture.
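The covering in steps 701 and 702 amounts to translating the 2D patch of the clothing data to be tested so that a chosen anchor vertex (e.g. A1) lands on the corresponding vertex of the original clothing data (e.g. A2). A minimal sketch, assuming a hypothetical point-list representation of a garment patch (not the patent's actual data format):

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]

@dataclass
class GarmentPatch:
    """Hypothetical 2D garment patch for one preset body part."""
    vertices: List[Point]   # outline points of the patch
    anchor: int = 0         # index of the preset vertex (e.g. A1)

def cover_by_vertex(test: GarmentPatch, orig_anchor: Point) -> GarmentPatch:
    """Translate the test-garment patch so its preset vertex coincides
    with the preset vertex of the original clothing data (A1 -> A2).
    With the anchor at the upper vertex where the arm meets the torso,
    the shared upper boundary then also coincides."""
    ax, ay = test.vertices[test.anchor]
    dx, dy = orig_anchor[0] - ax, orig_anchor[1] - ay
    return GarmentPatch([(x + dx, y + dy) for x, y in test.vertices],
                        test.anchor)
```

After this translation, the test-garment patch is drawn over the original clothing data, completing the cover for that body part.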
  • In this embodiment, the preset human body parts include the arm part and the torso part (that is, this applies when the clothing to be tested is a top).
  • FIG. 8 shows a schematic diagram of the clothing data to be tested covering the original clothing data of the arm part provided by the embodiment of the present invention.
  • the details are as follows:
  • The posture of the arm part containing the clothing data to be tested in the basic posture of the two-dimensional human body is first adjusted, according to the posture change of the arm part in the two-dimensional human body posture, to be consistent with the posture of the arm part containing the original clothing data in the two-dimensional human body posture; the clothing data to be tested of the arm part then covers the original clothing data of the arm part, forming the arm part containing the clothing data to be tested.
  • The first preset vertex is the upper vertex of the contact position between the arm part and the torso part: the first preset vertex of the clothing data to be tested of the arm part in the basic posture of the two-dimensional human body is A1, and the first preset vertex of the original clothing data of the arm part in the two-dimensional human body posture is A2. The first preset boundary is the upper boundary of the arm part: the first preset boundary of the clothing data to be tested of the arm part in the basic posture of the two-dimensional human body is P1, and the first preset boundary of the original clothing data of the arm part in the two-dimensional human body posture is P2.
  • The first preset vertex A1 of the clothing data to be tested of the arm part in the basic posture of the two-dimensional human body coincides with the first preset vertex A2 of the original clothing data of the arm part in the two-dimensional human body posture, and the first preset boundary P1 of the clothing data to be tested of the arm part in the basic posture coincides with the first preset boundary P2 of the original clothing data of the arm part in the two-dimensional human body posture, to complete the covering.
  • covering is performed in a manner that the first preset vertex and the first preset boundary overlap, which can further improve the authenticity of the virtual fitting.
  • first preset vertex may also be other vertices besides the aforementioned vertex A1 (A2), and the first preset boundary may also be other boundaries than the aforementioned boundary P1 (P2).
  • In the embodiment of the present invention, covering is performed in a manner that the first preset vertex and the first preset boundary of the clothing data to be tested of the arm part coincide with the first preset vertex and the first preset boundary of the original clothing data of the arm part, which can further improve the authenticity of the virtual fitting.
  • The second preset boundary is the upper boundary of the arm part (consistent with the above-mentioned first preset boundary): the second preset boundary of the clothing data to be tested of the arm part in the basic posture of the two-dimensional human body is P1, and the second preset boundary of the original clothing data of the arm part in the two-dimensional human body posture is P2.
  • the coverage is performed in a manner that the second preset boundary and the center of the second preset boundary overlap, which can further improve the authenticity of the virtual fitting.
  • the second preset boundary may also be another boundary besides the foregoing boundary P1 (P2), which is not particularly limited in the embodiment of the present invention.
  • In the embodiment of the present invention, covering is performed in a manner that the second preset boundary and the center of the second preset boundary of the clothing data to be tested of the arm part coincide with the second preset boundary and the center of the second preset boundary of the original clothing data of the arm part, which can further improve the authenticity of the virtual fitting.
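The alternative alignment mode, making a preset boundary and its center coincide (P1 on P2, centers matching), can likewise be sketched as a translation computed from boundary midpoints. The polyline representation below is an illustrative assumption, not the patent's data format:

```python
from typing import List, Tuple

Point = Tuple[float, float]

def boundary_center(boundary: List[Point]) -> Point:
    """Centroid of the boundary polyline's points, used as its center."""
    n = len(boundary)
    return (sum(p[0] for p in boundary) / n,
            sum(p[1] for p in boundary) / n)

def cover_by_boundary_center(test_boundary: List[Point],
                             orig_boundary: List[Point],
                             test_points: List[Point]) -> List[Point]:
    """Translate all test-garment points so the center of their preset
    boundary (e.g. P1) coincides with the center of the original
    clothing's preset boundary (e.g. P2)."""
    (cx1, cy1) = boundary_center(test_boundary)
    (cx2, cy2) = boundary_center(orig_boundary)
    dx, dy = cx2 - cx1, cy2 - cy1
    return [(x + dx, y + dy) for x, y in test_points]
```

Because both boundaries are the upper boundary of the same body part, a translation that matches their centers also makes the boundaries themselves overlap.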
  • FIG. 9(a) shows a schematic diagram of the garment data to be tested covering the original garment data provided by an embodiment of the present invention.
  • the details are as follows:
  • Since the torso part in the two-dimensional plane can be approximately regarded as a square, the clothing data to be tested of the torso part can here directly cover the original clothing data of the torso part, thereby forming the torso part containing the clothing data to be tested.
  • The third preset vertex may be the upper vertex of the contact position between the torso part and the arm part: the third preset vertex of the clothing data to be tested of the torso part is B1 (in this embodiment, it is substantially the same as vertex A1), and the third preset vertex of the original clothing data of the torso part is B2. The third preset boundary consists of the upper and left boundaries of the torso part: the third preset boundary of the clothing data to be tested of the torso part in the basic posture of the two-dimensional human body is the upper boundary Q1 and the left boundary Q2, and the third preset boundary of the original clothing data of the torso part in the two-dimensional human body posture is the upper boundary Q3 and the left boundary Q4.
  • The third preset vertex B1 of the clothing data to be tested of the torso part in the basic posture of the two-dimensional human body coincides with the third preset vertex B2 of the original clothing data of the torso part in the two-dimensional human body posture; the upper boundary Q1 of the clothing data to be tested of the torso part in the basic posture coincides with the upper boundary Q3 of the original clothing data of the torso part in the two-dimensional human body posture; and the left boundary Q2 of the clothing data to be tested of the torso part in the basic posture coincides with the left boundary Q4 of the original clothing data of the torso part in the two-dimensional human body posture, to complete the covering.
  • the third preset vertex of the torso part and the third preset boundary are overlapped for covering, which can further improve the authenticity of the virtual fitting.
  • It should be noted that the third preset vertex may also be a vertex other than the aforementioned vertex B1 (B2), such as the upper vertex of the contact position between the torso part and the right arm, and the third preset boundary may also be boundaries other than the aforementioned boundaries Q1 and Q2 (Q3 and Q4), such as the upper boundary and the right boundary of the torso part, which is not particularly limited in the embodiment of the present invention.
  • In the embodiment of the present invention, covering is performed in a manner that the third preset vertex and the third preset boundary of the clothing data to be tested of the torso part in the basic posture of the two-dimensional human body coincide with the third preset vertex and the third preset boundary of the original clothing data of the torso part in the two-dimensional human body posture, which can further improve the authenticity of the virtual fitting.
  • Fig. 9(b) shows another example of covering the original clothing data with the to-be-tested clothing data provided by the embodiment of the present invention.
  • only the parts related to the embodiment of the present invention are shown, which are described in detail as follows:
  • The fourth preset boundary is the upper boundary of the torso part (consistent with the third preset boundary in the above-mentioned embodiment). The fourth preset boundary of the clothing data to be tested of the torso part in the basic posture of the two-dimensional human body is Q1, and the center of Q1 is O1; the fourth preset boundary of the original clothing data of the torso part in the two-dimensional human body posture is Q3, and the center of Q3 is O2.
  • The fourth preset boundary Q1 of the clothing data to be tested of the torso part in the basic posture of the two-dimensional human body coincides with the fourth preset boundary Q3 of the original clothing data of the torso part in the two-dimensional human body posture, and the center O1 of Q1 coincides with the center O2 of Q3, to complete the covering.
  • covering is performed in a manner that the fourth preset boundary and the center of the fourth preset boundary overlap, which can further improve the authenticity of the virtual fitting.
  • In the embodiment of the present invention, covering is performed in a manner that the fourth preset boundary and the center of the fourth preset boundary of the clothing data to be tested of the torso part in the basic posture of the two-dimensional human body coincide with the fourth preset boundary and the center of the fourth preset boundary of the original clothing data of the torso part in the two-dimensional human body posture, which can further improve the authenticity of the virtual fitting.
  • FIG. 10 shows another implementation flow of step 504 in the data processing method provided by the embodiment of the present invention.
  • the parts related to the embodiment of the present invention are shown, which are described in detail as follows:
  • step 504 on the basis of the method steps shown in FIG. 7, further includes:
  • Step 1001: according to the tensile force of the physical properties of the original clothing fabric and the tensile force of the physical properties of the clothing fabric to be tested, stretching or rebounding the clothing data to be tested of the arm part in the basic posture of the two-dimensional human body along the extension direction of the arm part in the two-dimensional human body posture, to form the arm part containing the clothing data to be tested after stretching or rebounding;
  • Step 1002: according to the tensile force of the physical properties of the original clothing fabric and the tensile force of the physical properties of the clothing fabric to be tested, stretching or rebounding the clothing data to be tested of the torso part in the basic posture of the two-dimensional human body along the extension direction of the torso part in the two-dimensional human body posture, to form the torso part containing the clothing data to be tested after stretching or rebounding.
  • Different fabric physical properties are reflected in different tensile forces of the clothing data. Stretching or rebounding the clothing data to be tested based on the tensile force of the fabric physical properties of the original clothing data and the tensile force of the fabric physical properties of the clothing data to be tested can further improve the authenticity of the virtual fitting. In addition, when the physical properties of a clothing fabric are known, the tensile force of those physical properties is also known.
  • FIG. 11 shows a schematic diagram of stretching and rebounding of the garment data to be tested for an arm part provided by an embodiment of the present invention.
  • the details are as follows:
  • The stretch scale of the arm part or the rebound scale of the arm part is determined according to the tensile force of the physical properties of the original clothing fabric of the arm part and the tensile force of the physical properties of the clothing fabric to be tested of the arm part. Specifically, the stretch scale or the rebound scale of the arm part may be determined according to the difference between the tensile force of the physical properties of the original clothing fabric of the arm part and the tensile force of the physical properties of the clothing fabric to be tested of the arm part.
  • For example, the following formula may be used to determine the stretch scale of the arm part or the rebound scale of the arm part:
  • ΔG12 = G1 - G2
  • where ΔG12 represents the stretch scale of the arm part or the rebound scale of the arm part, G1 represents the tensile force of the physical properties of the original clothing fabric of the arm part in the two-dimensional human body posture, and G2 represents the tensile force of the physical properties of the clothing fabric to be tested of the arm part in the basic posture of the two-dimensional human body.
  • When the difference ΔG12 is the stretch scale of the arm part, the clothing data to be tested of the arm part is stretched, based on this stretch scale, along the extension direction of the arm part in the two-dimensional human body posture (the direction of the arrow shown in the arm part of the last figure in FIG. 11), to form the arm part containing the clothing data to be tested after stretching.
  • When the difference ΔG12 is the rebound scale of the arm part, the clothing data to be tested of the arm part rebounds, based on this rebound scale, in the direction opposite to the extension direction of the arm part in the two-dimensional human body posture (the direction opposite to the arrow shown in the arm part of the last figure in FIG. 11), to form the arm part containing the clothing data to be tested after rebounding.
  • In the embodiment of the present invention, the clothing data to be tested of the arm part is stretched or rebounded along the extension direction of the arm part in the two-dimensional human body posture, which can further improve the authenticity of the virtual fitting.
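The stretch/rebound step above can be sketched as scaling the patch along its extension direction by an amount proportional to the tension difference ΔG12 = G1 - G2. The sign convention (a positive difference stretches, a negative one rebounds) and the proportionality constant k are illustrative assumptions, not values given by the patent:

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def stretch_or_rebound(points: List[Point], direction: Point,
                       g_orig: float, g_test: float,
                       k: float = 0.1) -> List[Point]:
    """Displace each point along the (normalized) extension direction in
    proportion to its coordinate on that axis and to the tension
    difference dG = g_orig - g_test. dG > 0 stretches the patch along
    the direction; dG < 0 rebounds (shrinks) it. Assumed convention."""
    dg = g_orig - g_test
    norm = math.hypot(direction[0], direction[1])
    ux, uy = direction[0] / norm, direction[1] / norm
    result = []
    for x, y in points:
        proj = x * ux + y * uy   # coordinate along the extension axis
        disp = k * dg * proj     # stretch or shrink proportionally
        result.append((x + disp * ux, y + disp * uy))
    return result
```

The same routine applies to the torso and leg parts by swapping in their extension directions and tension pairs (G3/G4, G5/G6).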
  • FIG. 12 shows a schematic diagram of the stretching and rebounding of the garment data to be tested for the torso provided by the embodiment of the present invention.
  • the details are as follows:
  • The stretch scale of the torso part or the rebound scale of the torso part is determined according to the tensile force of the physical properties of the original clothing fabric of the torso part and the tensile force of the physical properties of the clothing fabric to be tested of the torso part. Specifically, the stretch scale or the rebound scale of the torso part may be determined according to the difference between these two tensile forces.
  • The stretch scale of the torso part or the rebound scale of the torso part may be determined by the following formula:
  • ΔG34 = G3 - G4
  • where ΔG34 represents the stretch scale of the torso part or the rebound scale of the torso part, G3 represents the tensile force of the physical properties of the original clothing fabric of the torso part in the two-dimensional human body posture, and G4 represents the tensile force of the physical properties of the clothing fabric to be tested of the torso part in the basic posture of the two-dimensional human body.
  • The principle is similar to that of the above-mentioned arm part. The tensile force of the physical properties of the clothing fabric to be tested of the torso part in the basic posture of the two-dimensional human body is consistent with that of the clothing fabric to be tested of the arm part in the basic posture of the two-dimensional human body; the tensile force of the physical properties of the original clothing fabric of the torso part in the two-dimensional human body posture is consistent with that of the original clothing fabric of the arm part in the two-dimensional human body posture.
  • When the difference ΔG34 is the stretch scale of the torso part, the clothing data to be tested of the torso part is stretched, based on this stretch scale, along the extension direction of the torso part in the two-dimensional human body posture (the direction of the arrow shown in the torso part of the last figure in FIG. 12), to form the torso part containing the clothing data to be tested after stretching.
  • When the difference ΔG34 is the rebound scale of the torso part, the clothing data to be tested of the torso part rebounds, based on this rebound scale, in the direction opposite to the extension direction of the torso part in the two-dimensional human body posture (the direction opposite to the arrow shown in the torso part of the last figure in FIG. 12), to form the torso part containing the clothing data to be tested after rebounding.
  • In the embodiment of the present invention, the clothing data to be tested of the torso part is stretched or rebounded along the extension direction of the torso part in the two-dimensional human body posture, which can further improve the authenticity of the virtual fitting.
  • FIG. 13 shows another implementation flow of step 504 in the data processing method provided by the embodiment of the present invention.
  • the parts related to the embodiment of the present invention are shown, which are described in detail as follows:
  • the preset human body part includes a leg.
  • step 504, according to the posture changes of each preset human body part in the two-dimensional human body posture, respectively covering the original clothing data of each preset human body part in the two-dimensional human body posture with the clothing data to be tested of each preset human body part in the basic posture of the two-dimensional human body, to form a two-dimensional human body posture containing the clothing data to be tested, includes:
  • Step 1301: covering the original clothing data of the leg part in the two-dimensional human body posture with the clothing data to be tested of the leg part in the basic posture of the two-dimensional human body, to form a leg part containing the clothing data to be tested; wherein, the fifth preset vertex and the fifth preset boundary of the clothing data to be tested of the leg part in the basic posture of the two-dimensional human body coincide with the fifth preset vertex and the fifth preset boundary of the original clothing data of the leg part in the two-dimensional human body posture, or the sixth preset boundary and the center of the sixth preset boundary of the clothing data to be tested of the leg part in the basic posture of the two-dimensional human body coincide with the sixth preset boundary and the center of the sixth preset boundary of the original clothing data of the leg part in the two-dimensional human body posture.
  • The covering of the original clothing data of the leg part by the clothing data to be tested of the leg part is similar to the above-mentioned covering principle of the arm part and the torso part. Since the leg part in the two-dimensional plane can be approximately regarded as a regular square, the clothing data to be tested of the leg part can directly cover the original clothing data of the leg part, thereby forming the leg part containing the clothing data to be tested.
  • the fifth preset vertex may be the left vertex where the trunk part and the leg part are in contact
  • the fifth preset boundary is the boundary formed by the part where the trunk part and the leg part are in contact.
  • The fifth preset vertex of the clothing data to be tested of the leg part in the basic posture of the two-dimensional human body coincides with the fifth preset vertex of the original clothing data of the leg part in the two-dimensional human body posture, and the fifth preset boundary of the clothing data to be tested of the leg part in the basic posture coincides with the fifth preset boundary of the original clothing data of the leg part in the two-dimensional human body posture, to complete the covering.
  • covering is performed in a manner that the fifth preset vertex and the fifth preset boundary of the leg part overlap, which can further improve the authenticity of the virtual fitting.
  • It should be noted that the fifth preset vertex may also be a vertex other than the aforementioned vertex, for example, the right vertex of the contact position between the torso part and the leg part, and the fifth preset boundary may also be a boundary other than the aforementioned boundary, such as the left boundary or the right boundary of the leg part, which is not particularly limited in the embodiment of the present invention.
  • In the embodiment of the present invention, covering is performed in a manner that the fifth preset vertex and the fifth preset boundary of the clothing data to be tested of the leg part coincide with the fifth preset vertex and the fifth preset boundary of the original clothing data of the leg part, which can further improve the authenticity of the virtual fitting.
  • the sixth preset boundary is a boundary formed by the part where the trunk part contacts the leg part (consistent with the fifth preset boundary in the above-mentioned embodiment).
  • the sixth preset boundary of the clothing data to be tested on the leg part in the basic pose of the two-dimensional human body coincides with the sixth preset boundary of the original clothing data of the leg part in the two-dimensional human body pose, and the two-dimensional human body
  • the center of the sixth preset boundary of the clothing data to be tested in the leg part in the basic posture coincides with the center of the sixth preset boundary in the original clothing data of the leg part in the two-dimensional human body posture.
  • the coverage is performed in a manner that the sixth preset boundary and the center of the sixth preset boundary overlap, which can further improve the authenticity of the virtual fitting.
  • In the embodiment of the present invention, covering is performed in a manner that the sixth preset boundary and the center of the sixth preset boundary of the clothing data to be tested of the leg part in the basic posture of the two-dimensional human body coincide with the sixth preset boundary and the center of the sixth preset boundary of the original clothing data of the leg part in the two-dimensional human body posture, which can further improve the authenticity of the virtual fitting.
  • FIG. 14 shows another implementation flow of step 504 in the data processing method provided by the embodiment of the present invention.
  • the parts related to the embodiment of the present invention are shown, which are described in detail as follows:
  • step 504 on the basis of the method steps shown in FIG. 13, further includes:
  • Step 1401: according to the tensile force of the physical properties of the original clothing fabric and the tensile force of the physical properties of the clothing fabric to be tested, stretching or rebounding the clothing data to be tested of the leg part in the basic posture of the two-dimensional human body along the extension direction of the leg part in the two-dimensional human body posture, to form the leg part containing the clothing data to be tested after stretching or rebounding.
  • the stretching or rebounding of the garment data to be tested in the leg part is similar to the stretching or rebounding principle of the garment data to be tested in the arm part or the torso part described above.
  • The stretch scale of the leg part or the rebound scale of the leg part is determined according to the tensile force of the physical properties of the original clothing fabric of the leg part and the tensile force of the physical properties of the clothing fabric to be tested of the leg part. Specifically, the stretch scale or the rebound scale of the leg part may be determined according to the difference between these two tensile forces.
  • The stretch scale of the leg part or the rebound scale of the leg part may be determined by the following formula:
  • ΔG56 = G5 - G6
  • where ΔG56 represents the stretch scale of the leg part or the rebound scale of the leg part, G5 represents the tensile force of the physical properties of the original clothing fabric of the leg part in the two-dimensional human body posture, and G6 represents the tensile force of the physical properties of the clothing fabric to be tested of the leg part in the basic posture of the two-dimensional human body.
  • When the difference ΔG56 is the stretch scale of the leg part, the clothing data to be tested of the leg part is stretched, based on this stretch scale, along the extension direction of the leg part in the two-dimensional human body posture, to form the leg part containing the clothing data to be tested after stretching.
  • When the difference ΔG56 is the rebound scale of the leg part, the clothing data to be tested of the leg part rebounds, based on this rebound scale, in the direction opposite to the extension direction of the leg part in the two-dimensional human body posture, to form the leg part containing the clothing data to be tested after rebounding.
  • In the embodiment of the present invention, the clothing data to be tested of the leg part in the basic posture of the two-dimensional human body is stretched or rebounded along the extension direction of the leg part in the two-dimensional human body posture, which can further improve the authenticity of the virtual fitting.
  • the embodiment of the present invention also provides a data processing device, as described in the following embodiment. Since the principle of solving the problems of these devices is similar to the data processing method, the implementation of these devices can refer to the implementation of the method, and the repetition will not be repeated.
  • FIG. 15 shows the functional modules of the data processing device provided by the embodiment of the present invention. For ease of description, only the parts related to the embodiment of the present invention are shown, which are described in detail as follows:
  • the data processing device includes an image data acquisition module 1501, a gesture recognition module 1502, a gesture matching module 1503, and a virtual fitting module 1504.
  • the image data acquisition module 1501 is used to acquire image data including the wearing human body through the image acquisition device; the image data of the wearing human body reflects the projection data of the three-dimensional human body on a first plane perpendicular to the center line of the field of view of the image acquisition device.
  • the posture recognition module 1502 is used for recognizing the two-dimensional posture of the wearing human body according to the image data; the two-dimensional human posture reflects the projection data of the three-dimensional human body on the second plane; the second plane is a symmetrical plane between the front of the wearing human body and the back of the wearing human body.
  • the posture matching module 1503 is used to match the recognized two-dimensional human posture in the human posture database to determine the preset human body part of the two-dimensional human posture.
  • the virtual fitting module 1504 is used for assigning the cloth physical properties of the garment to be tested to the part of the preset human body part of the two-dimensional human body posture that is covered by the original garment.
  • In the embodiment of the present invention, the image data acquisition module 1501 acquires image data including the dressed human body through the image acquisition device; the gesture recognition module 1502 recognizes the two-dimensional human body posture of the dressed human body according to the image data; the posture matching module 1503 performs matching in the human body posture database to determine the preset human body parts of the two-dimensional human body posture; and finally, the virtual fitting module 1504 directly assigns the fabric physical properties of the clothing to be tested to the part of the preset human body parts of the two-dimensional human body posture that is covered by the original clothing.
  • The embodiment of the present invention does not need to construct a three-dimensional human body model or a clothing model; it only needs to recognize the two-dimensional human body posture of the dressed human body and directly assign the fabric physical properties of the clothing to be tested to the part of the preset human body parts of the two-dimensional human body posture that is covered by the original clothing, which can greatly improve the efficiency of the virtual fitting, thereby improving the user experience.
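The four modules form a straight pipeline (capture, recognize, match, assign fabric). A schematic wiring sketch, where every callable is a stand-in for the real capture, recognition, and matching logic rather than an API from the patent:

```python
class DataProcessingDevice:
    """Schematic wiring of modules 1501-1504; no 3D model is built."""

    def __init__(self, capture, recognize_pose, pose_db, apply_fabric):
        self.capture = capture                # image data acquisition (1501)
        self.recognize_pose = recognize_pose  # 2D pose recognition (1502)
        self.pose_db = pose_db                # posture database matcher (1503)
        self.apply_fabric = apply_fabric      # virtual fitting (1504)

    def fit(self, fabric_properties):
        image = self.capture()                # projection of the 3D body
        pose_2d = self.recognize_pose(image)  # two-dimensional human posture
        parts = self.pose_db.match(pose_2d)   # preset human body parts
        return self.apply_fabric(parts, fabric_properties)
```

Each stage consumes only the previous stage's 2D output, which is what lets the device skip 3D reconstruction entirely.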
  • FIG. 16 shows another functional module of the data processing device provided by the embodiment of the present invention.
  • the parts related to the embodiment of the present invention are shown, which are described in detail as follows:
  • each module included in the data processing device is used to perform each step in the embodiment corresponding to FIG. 5.
  • the data processing device further includes a posture change determining module 1601, an original clothing data forming module 1602, a test clothing data forming module 1603, and a clothing data overlay Module 1604.
  • the posture change determination module 1601 is used to determine the posture changes of each preset human body part in the two-dimensional human posture according to the basic posture of the two-dimensional human body; the basic posture of the two-dimensional human body is the projection data of the three-dimensional basic human body model on the first plane.
  • the original clothing data forming module 1602 is used to give the cloth physical properties of the original clothing to the preset human body part covered by the original clothing in the two-dimensional human posture to form the original clothing data.
  • the test clothing data forming module 1603 is used to assign the cloth physical properties of the test clothing to the preset human body part covered by the test clothing in the basic posture of the two-dimensional human body to form the test clothing data.
  • the clothing data covering module 1604 is used to cover the original clothing data of each preset human body part in the two-dimensional human body posture with the clothing data to be tested of each preset human body part in the basic posture of the two-dimensional human body, according to the posture changes of each preset human body part in the two-dimensional human body posture, to form a two-dimensional human body posture containing the clothing data to be tested.
  • In the embodiment of the present invention, the posture change determination module 1601 determines the posture changes of each preset human body part in the two-dimensional human body posture based on the basic posture of the two-dimensional human body; the original clothing data forming module 1602 assigns the fabric physical properties of the original clothing to the preset human body parts covered by the original clothing in the two-dimensional human body posture to form the original clothing data; and the test clothing data forming module 1603 assigns the fabric physical properties of the clothing to be tested to the preset human body parts covered by the clothing to be tested in the basic posture of the two-dimensional human body to form the clothing data to be tested.
  • FIG. 17 shows a schematic structural diagram of a clothing data covering module 1604 in a data processing device provided by an embodiment of the present invention. For ease of description, only parts related to the embodiment of the present invention are shown, which are described in detail as follows:
  • the preset human body parts include arms and torso.
  • each unit included in the clothing data covering module 1604 is used to execute each step in the embodiment corresponding to FIG. 7.
  • the clothing data covering module 1604 includes an arm part covering unit 1701 and a trunk part covering unit 1702.
  • the arm part covering unit 1701 is used to cover, according to the posture change of the arm part in the two-dimensional human body posture, the original clothing data of the arm part in the two-dimensional human body posture with the clothing data to be tested of the arm part in the basic two-dimensional human body posture, forming an arm part containing the clothing data to be tested. Here, the first preset vertex and first preset boundary of the clothing data to be tested of the arm part in the basic posture coincide with the first preset vertex and first preset boundary of the original clothing data of the arm part in the two-dimensional posture; alternatively, the second preset boundary and its center of the clothing data to be tested coincide with the second preset boundary and its center of the original clothing data.
  • the torso part covering unit 1702 is used to cover the original clothing data of the torso part in the two-dimensional human body posture with the clothing data to be tested of the torso part in the basic two-dimensional human body posture, forming a torso part containing the clothing data to be tested. Here, the third preset vertex and third preset boundary of the clothing data to be tested of the torso part in the basic posture coincide with the third preset vertex and third preset boundary of the original clothing data of the torso part in the two-dimensional posture; alternatively, the fourth preset boundary and its center of the clothing data to be tested coincide with the fourth preset boundary and its center of the original clothing data.
  • in the embodiment of the present invention, the arm part covering unit 1701 performs the covering with the first preset vertex and first preset boundary of the clothing data to be tested of the arm part coinciding with those of the original clothing data, or with the second preset boundary and its center of the clothing data to be tested coinciding with those of the original clothing data, which can further improve the realism of the virtual fitting.
  • the torso part covering unit 1702 performs the covering with the third preset vertex and third preset boundary of the clothing data to be tested of the torso part coinciding with those of the original clothing data, or with the fourth preset boundary and its center coinciding with those of the original clothing data, which can further improve the realism of the virtual fitting.
  • FIG. 18 shows another structural diagram of the clothing data covering module 1604 in the data processing device provided by the embodiment of the present invention.
  • the details are as follows:
  • each unit included in the clothing data covering module 1604 is used to execute each step in the embodiment corresponding to FIG. 10.
  • the clothing data covering module 1604 further includes an arm stretch and rebound unit 1801 and a trunk stretch and rebound unit 1802.
  • the arm stretch and rebound unit 1801 is used to stretch or rebound the clothing data to be tested of the arm part in the basic two-dimensional human body posture along the extension direction of the arm part in the two-dimensional human body posture, according to the tensile forces of the cloth physical properties of the original clothing and of the clothing to be tested, forming an arm part containing the stretched or rebounded clothing data to be tested.
  • the torso stretch and rebound unit 1802 is used to stretch or rebound the clothing data to be tested of the torso part in the basic two-dimensional human body posture along the extension direction of the torso part in the two-dimensional human body posture, according to the tensile forces of the cloth physical properties of the original clothing and of the clothing to be tested, forming a torso part containing the stretched or rebounded clothing data to be tested.
  • in the embodiment of the present invention, the arm stretch and rebound unit 1801 stretches or rebounds the clothing data to be tested of the arm part in the basic posture along the extension direction of the arm part in the two-dimensional human body posture, according to the tensile forces of the cloth physical properties of the original clothing and of the clothing to be tested, which can further improve the realism of the virtual fitting.
  • the torso stretch and rebound unit 1802 stretches or rebounds the clothing data to be tested of the torso part in the basic posture along the extension direction of the torso part in the two-dimensional human body posture, according to the tensile forces of the cloth physical properties of the original clothing and of the clothing to be tested, which can further improve the realism of the virtual fitting.
  • FIG. 19 shows another structural diagram of the clothing data covering module 1604 in the data processing device provided by the embodiment of the present invention.
  • the details are as follows:
  • the preset human body part includes a leg.
  • each unit included in the clothing data covering module 1604 is used to perform each step in the embodiment corresponding to FIG. 13.
  • the clothing data covering module 1604 includes a leg part covering unit 1901.
  • the leg part covering unit 1901 is used to cover the original clothing data of the leg part in the two-dimensional human body posture with the clothing data to be tested of the leg part in the basic two-dimensional human body posture, forming a leg part containing the clothing data to be tested. Here, the fifth preset vertex and fifth preset boundary of the clothing data to be tested of the leg part in the basic posture coincide with the fifth preset vertex and fifth preset boundary of the original clothing data of the leg part in the two-dimensional posture; alternatively, the sixth preset boundary and its center of the clothing data to be tested coincide with the sixth preset boundary and its center of the original clothing data.
  • in the embodiment of the present invention, the leg part covering unit 1901 performs the covering with the fifth preset vertex and fifth preset boundary of the clothing data to be tested of the leg part coinciding with those of the original clothing data, or with the sixth preset boundary and its center coinciding with those of the original clothing data, which can further improve the realism of the virtual fitting.
  • FIG. 20 shows another structural diagram of the clothing data covering module 1604 in the data processing device provided by the embodiment of the present invention.
  • the details are as follows:
  • each unit included in the clothing data covering module 1604 is used to perform each step in the embodiment corresponding to FIG. 14.
  • the clothing data covering module 1604 further includes a leg stretch and rebound unit 2001.
  • the leg stretch and rebound unit 2001 is used to stretch or rebound the clothing data to be tested of the leg part in the basic two-dimensional human body posture along the extension direction of the leg part in the two-dimensional human body posture, according to the tensile forces of the cloth physical properties of the original clothing and of the clothing to be tested, forming a leg part containing the stretched or rebounded clothing data to be tested.
  • in the embodiment of the present invention, the leg stretch and rebound unit 2001 stretches or rebounds the clothing data to be tested of the leg part in the basic posture along the extension direction of the leg part in the two-dimensional human body posture, according to the tensile forces of the cloth physical properties of the original clothing and of the clothing to be tested, which can further improve the realism of the virtual fitting.
  • the color changes of the pixels on the first plane and the second plane are synchronized.
  • An embodiment of the present invention also provides a computer device, including a memory, a processor, and a computer program stored in the memory and capable of running on the processor, and the processor implements the above-mentioned data processing method when the computer program is executed.
  • the embodiment of the present invention also provides a computer-readable storage medium, and the computer-readable storage medium stores a computer program for executing the above-mentioned data processing method.
  • in the embodiment of the present invention, image data including the clothed human body is acquired through the image acquisition device; the two-dimensional human body posture of the clothed body is then recognized from the image data; the preset human body parts of the posture are determined by matching in the human body posture database; and finally, the portions of the preset human body parts covered by the original clothing are directly assigned the cloth physical properties of the clothing to be tested.
  • the embodiment of the present invention does not need to construct a three-dimensional human body model or clothing model; it only needs to recognize the two-dimensional posture of the clothed body and directly assign the cloth physical properties of the clothing to be tested to the portions of the preset human body parts covered by the original clothing, which can greatly improve the efficiency of virtual fitting and thereby improve the user experience.
  • the embodiments of the present invention can be provided as a method, a system, or a computer program product. Therefore, the present invention may adopt the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware. Moreover, the present invention may adopt the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program codes.
  • These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device that implements the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.
  • These computer program instructions may also be loaded onto a computer or other programmable data processing equipment, so that a series of operational steps are executed on the computer or other programmable equipment to produce computer-implemented processing; the instructions executed on the computer or other programmable equipment thus provide steps for implementing the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Business, Economics & Management (AREA)
  • Textile Engineering (AREA)
  • Geometry (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Processing Or Creating Images (AREA)
  • Image Analysis (AREA)

Abstract

The present invention discloses a data processing method and apparatus, a computer device, and a computer-readable storage medium. The method includes: acquiring image data of a person; recognizing the person's two-dimensional posture from the image data; comparing the recognized two-dimensional posture against a posture database to determine preset parts of the two-dimensional posture; and assigning the properties of the garment to be tried on to the portions of the preset parts of the two-dimensional posture that are covered by the original clothing. The present invention can greatly improve the efficiency of data processing and thereby improve the user experience.

Description

Data processing method and apparatus, computer device, and computer-readable storage medium
This application claims priority to the Chinese patent application filed on September 3, 2019 with application No. 201910827228.0, entitled "Virtual fitting method and apparatus, computer device, and computer device storage medium", the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to the field of computer technology, and in particular to a data processing method and apparatus, a computer device, and a computer device storage medium.
Background
This section is intended to provide background or context for the embodiments of the present invention set forth in the claims. The description here is not admitted to be prior art merely because it is included in this section.
With the development of computer technology, virtual fitting technology has gradually gained the attention and recognition of many consumers. Virtual fitting refers to trying on virtual clothes in a virtual space through an avatar identical to oneself. Virtual fitting can achieve the effect of a user wearing a certain garment without the user actually changing clothes. The emergence of virtual fitting technology has provided users with a convenient and fast fitting experience and has greatly changed people's traditional ways of trying on clothes.
However, at present, virtual fitting requires collecting three-dimensional human body data to construct a three-dimensional human body model, and additionally constructing a three-dimensional clothing model, and then fusing the three-dimensional clothing model with the three-dimensional human body model to achieve three-dimensional fitting. Constructing either the three-dimensional human body model or the three-dimensional clothing model consumes a large amount of computing resources; the efficiency of three-dimensional modeling is extremely low and unsuited to building such models in bulk, so virtual fitting takes users a long time, the efficiency of virtual fitting is low, and the user's virtual fitting experience is severely degraded.
Therefore, existing virtual fitting suffers from low efficiency caused by three-dimensional modeling and from a poor user experience.
Summary of the Invention
An embodiment of the present invention provides a data processing method for improving the efficiency of virtual fitting and the user experience, the method including:
acquiring image data including a clothed human body through an image acquisition device, the image data of the clothed body reflecting the projection data of the three-dimensional human body on a first plane perpendicular to the center line of the field of view of the image acquisition device;
recognizing the two-dimensional human body posture of the clothed body from the image data, the two-dimensional posture reflecting the projection data of the three-dimensional body on a second plane, the second plane being the plane of symmetry between the front and the back of the clothed body;
matching the recognized two-dimensional human body posture in a human body posture database to determine the preset human body parts of the posture;
assigning the cloth physical properties of the clothing to be tested to the portions of the preset human body parts of the posture that are covered by the original clothing.
An embodiment of the present invention also provides a data processing apparatus for improving the efficiency of virtual fitting and the user experience, the apparatus including:
an image data acquisition module, used to acquire image data including a clothed human body through an image acquisition device, the image data reflecting the projection data of the three-dimensional body on a first plane perpendicular to the center line of the field of view of the image acquisition device;
a posture recognition module, used to recognize the two-dimensional human body posture of the clothed body from the image data, the two-dimensional posture reflecting the projection data of the three-dimensional body on a second plane, the second plane being the plane of symmetry between the front and the back of the clothed body;
a posture matching module, used to match the recognized two-dimensional posture in a human body posture database to determine the preset human body parts of the posture;
a virtual fitting module, used to assign the cloth physical properties of the clothing to be tested to the portions of the preset human body parts of the posture that are covered by the original clothing.
An embodiment of the present invention also provides a computer device, including a memory, a processor, and a computer program stored in the memory and runnable on the processor, the processor implementing the above data processing method when executing the computer program.
An embodiment of the present invention also provides a computer-readable storage medium storing a computer program for executing the above data processing method.
In the embodiment of the present invention, image data including the clothed body is acquired through the image acquisition device; the two-dimensional human body posture of the clothed body is then recognized from the image data; the preset human body parts of the posture are determined by matching in the human body posture database; and finally, the portions of the preset parts covered by the original clothing are directly assigned the cloth physical properties of the clothing to be tested. The embodiment needs no three-dimensional human body model or clothing model; it only recognizes the two-dimensional posture of the clothed body and directly assigns the cloth physical properties of the clothing to be tested to the portions of the preset parts covered by the original clothing, which can greatly improve the efficiency of data processing and thereby improve the user experience.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the drawings required in the description of the embodiments are briefly introduced below. Evidently, the drawings described below are only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort. In the drawings:
FIG. 1 is an implementation flowchart of the data processing method provided by an embodiment of the present invention;
FIG. 2 is a schematic diagram (side view) of the positions of the image acquisition device and the first plane provided by an embodiment of the present invention;
FIG. 3 is a schematic diagram (top view) of the positions of the first plane and the second plane provided by an embodiment of the present invention;
FIG. 4 is a schematic diagram of a two-dimensional human body posture provided by an embodiment of the present invention (center line of the field of view perpendicular to the second plane);
FIG. 5 is another implementation flowchart of the data processing method provided by an embodiment of the present invention;
FIG. 6 is a schematic diagram of the basic two-dimensional human body posture provided by an embodiment of the present invention (center line of the field of view perpendicular to the first plane);
FIG. 7 is an implementation flowchart of step 504 of the data processing method provided by an embodiment of the present invention;
FIG. 8 is a schematic diagram of the clothing data to be tested of the arm part covering the original clothing data, provided by an embodiment of the present invention;
FIG. 9(a) is a schematic diagram of the clothing data to be tested of the torso part covering the original clothing data, provided by an embodiment of the present invention;
FIG. 9(b) is another schematic diagram of the clothing data to be tested of the torso part covering the original clothing data, provided by an embodiment of the present invention;
FIG. 10 is another implementation flowchart of step 504 of the data processing method provided by an embodiment of the present invention;
FIG. 11 is a schematic diagram of the stretching and rebounding of the clothing data to be tested of the arm part, provided by an embodiment of the present invention;
FIG. 12 is a schematic diagram of the stretching and rebounding of the clothing data to be tested of the torso part, provided by an embodiment of the present invention;
FIG. 13 is a further implementation flowchart of step 504 of the data processing method provided by an embodiment of the present invention;
FIG. 14 is yet another implementation flowchart of step 504 of the data processing method provided by an embodiment of the present invention;
FIG. 15 is a functional module diagram of the data processing apparatus provided by an embodiment of the present invention;
FIG. 16 is another functional module diagram of the data processing apparatus provided by an embodiment of the present invention;
FIG. 17 is a structural block diagram of the clothing data covering module 1604 in the data processing apparatus provided by an embodiment of the present invention;
FIG. 18 is another structural block diagram of the clothing data covering module 1604 in the data processing apparatus provided by an embodiment of the present invention;
FIG. 19 is a further structural block diagram of the clothing data covering module 1604 in the data processing apparatus provided by an embodiment of the present invention;
FIG. 20 is yet another structural block diagram of the clothing data covering module 1604 in the data processing apparatus provided by an embodiment of the present invention.
Detailed Description of the Embodiments
To make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the embodiments of the present invention are further described in detail below with reference to the drawings. The illustrative embodiments and their descriptions here are used to explain the present invention and do not limit it.
FIG. 1 shows the implementation flow of the data processing method provided by an embodiment of the present invention. For ease of description, only the parts related to the embodiment of the present invention are shown, described in detail as follows:
As shown in FIG. 1, the data processing method includes:
Step 101: acquiring image data including a clothed human body through an image acquisition device; the image data of the clothed body reflects the projection data of the three-dimensional human body on a first plane perpendicular to the center line of the field of view of the image acquisition device;
Step 102: recognizing the two-dimensional human body posture of the clothed body from the image data; the two-dimensional posture reflects the projection data of the three-dimensional body on a second plane; the second plane is the plane of symmetry between the front and the back of the clothed body;
Step 103: matching the recognized two-dimensional human body posture in a human body posture database to determine the preset human body parts of the posture;
Step 104: assigning the cloth physical properties of the clothing to be tested to the portions of the preset human body parts of the posture that are covered by the original clothing.
In the embodiment of the present invention, the image acquisition device may be, for example, an ordinary camera or a depth camera capable of capturing single-frame images, or a video camera capable of capturing video. Correspondingly, the image data may include RGB data, or RGBD data (RGB data plus the depth of the clothed body from the depth camera), and so on. That is, step 101 — acquiring image data including a clothed human body through an image acquisition device — includes: acquiring RGB data including the clothed body through a camera, or acquiring RGBD data including the clothed body through a depth camera.
In addition, when the image acquisition device is a video camera, the captured video may first be converted into single-frame images, and each frame including the clothed body is then obtained.
The clothed human body may be a person wearing clothing, a garment display mannequin, or a human-like object such as a human skeleton or a clothes hanger; the embodiment of the present invention imposes no particular limitation on this.
FIG. 2 shows the positions of the image acquisition device and the first plane (side view) provided by an embodiment of the present invention. For ease of description, only the related parts are shown, described in detail as follows:
As shown in FIG. 2, the first plane is perpendicular to the center line of the field of view of the image acquisition device. Preferably, the image acquisition device is located in the middle region near the top of a virtual fitting terminal, which may include, for example, a smart virtual fitting mirror or another fitting terminal with a display screen. The first plane is parallel to the plane of the virtual fitting terminal, and the front of the clothed body faces the terminal. The image data including the clothed body obtained through the image acquisition device reflects the projection of the three-dimensional body on the first plane.
FIG. 3 shows the positions of the first plane and the second plane (top view), and FIG. 4 shows a two-dimensional human body posture (center line of the field of view perpendicular to the second plane), both provided by an embodiment of the present invention. For ease of description, only the related parts are shown, described in detail as follows:
As shown in FIGS. 3 and 4, viewed from above in FIG. 3, the width range of the projection of the three-dimensional body on the second plane is S; viewed in FIG. 4 with the center line of the field of view perpendicular to the second plane, the projection of the three-dimensional body on the second plane is the two-dimensional human body posture shown in FIG. 4.
The first plane and the second plane may be parallel planes, the same plane, or different planes. When they are parallel, the projection of the three-dimensional body on the second plane is consistent with its projection on the first plane, i.e., the two projections coincide completely in size and posture. When they are the same plane, the projection of the three-dimensional body on the second plane is simply its projection on the first plane.
In most cases, as shown in FIG. 3, the first plane and the second plane are neither parallel nor the same plane; there is an angle θ between them (i.e., the clothed body stands obliquely in front of the virtual fitting terminal). In this case, the image data of the clothed body acquired through the image acquisition device is the projection data of the three-dimensional body on the first plane, whose width range is L, and from geometric principles:

S = L / cos θ

(The original publication renders this formula as an image; the relation above is reconstructed from the surrounding geometric description.) Based on this geometric relation and the projection data of the clothed body on the first plane, the projection data of the clothed body on the second plane can be determined, and the two-dimensional human body posture of the clothed body on the second plane can then be recognized.
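The geometric relation above can be sketched in code. The function name and the restriction on θ are illustrative assumptions, not part of the source.

```python
import math

def second_plane_width(first_plane_width: float, theta_rad: float) -> float:
    """Recover the projection width S on the second plane from the width L
    measured on the first plane, given the angle theta between the planes:
    L = S * cos(theta), hence S = L / cos(theta)."""
    if not 0.0 <= theta_rad < math.pi / 2:
        raise ValueError("theta must lie in [0, pi/2)")
    return first_plane_width / math.cos(theta_rad)
```

When θ = 0 the planes are parallel and S = L; as θ grows, the first-plane projection narrows and the correction factor grows.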
After the image data of the clothed body is obtained, the two-dimensional human body posture of the clothed body is recognized based on it. Preferably, when the acquired image data is RGBD data (RGB data plus the depth of the clothed body from the depth camera), the two-dimensional posture may be recognized through the following steps.
First, the depth values in the current RGBD image data are extracted, and the pixels at nearby positions with similar RGBD depth values are clustered and combined.
Further, the human body posture database is invoked to perform human-shape detection on the above pixel combinations. Specifically, the pixel combinations are matched one by one against the different human posture models in the database (starting from the basic T-pose human model), and the database posture similar to the pixel combination is selected. If the similarity is not high, or some portion has no similar counterpart, the preset similarity standard for depth values at nearby positions in the RGBD image is lowered, the image pixels are re-clustered and re-combined, and the different posture models in the database are matched one by one again (starting from the basic T-pose model). This continues until the similarity between the pixel combination of the current image and some posture model in the database is not less than the preset similarity, at which point that posture model in the database is determined to be the posture of the currently clothed body.
Next, for the selected posture model (parallel to the second plane), the joints and bone lines are sketched so that it becomes an objective "stick-figure" body marker; the joint points are calibrated and the body boundary is delimited, yielding the two-dimensional posture of the clothed body containing skeletal point position information, i.e., the projection data of the three-dimensional posture on the second plane.
Preferably, when the acquired image data is RGB data, the two-dimensional posture may be recognized through the following steps.
First, the current RGB image data is extracted, and the pixels at nearby positions with similar colors are clustered and combined. Further, face recognition is combined to establish the approximate position of the body. Then the human body posture database is invoked to perform human-shape detection on the pixel combinations. Specifically, the pixel combinations are matched one by one against the different posture models in the database (starting from the basic T-pose model), and the similar database posture is selected. If the similarity is not high, or some portion has no similar counterpart, the preset similarity standard for colors at nearby positions in the RGB image is lowered, the pixels are re-clustered and re-combined, and the matching against the posture models (again starting from the T-pose model) is repeated, and so on, until the similarity between the pixel combination of the current image and some posture model is not less than the preset similarity, at which point that model is determined to be the posture of the currently clothed body.
Next, for the selected posture model (parallel to the second plane), the joints and bone lines are sketched to form an objective "stick-figure" body marker; the joint points are calibrated and the body boundary is delimited, yielding the two-dimensional posture of the clothed body containing skeletal point position information, i.e., the projection data of the three-dimensional posture on the second plane.
In summary, the two-dimensional posture of the clothed body is the projection data of the three-dimensional body on the second plane. When defining the second plane, the human body may be approximated as a symmetric structure, and the second plane is defined as the plane of symmetry between the front and the back of the body. That is, the embodiment of the present invention need not construct a three-dimensional human body model or a three-dimensional clothing model; it directly obtains the two-dimensional posture of the three-dimensional body on a plane and uses that posture for virtual fitting, which can greatly improve the efficiency of virtual fitting and improve the user experience.
Human posture recognition can identify the boundary range of the two-dimensional posture in the image data, i.e., delimit the boundary of the body, but it cannot identify the poses of the individual body parts such as the head, arms, torso, and legs. In view of this, on the basis of the recognized two-dimensional posture, the posture is matched in the human body posture database to determine the poses of the preset body parts in it, so as to improve the accuracy and realism of the virtual fitting.
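The iterative matching loop described above can be sketched as follows. This is a simplified sketch: it relaxes the acceptance threshold directly rather than re-clustering pixels as the text describes, and the function and parameter names (`match_pose`, `relax_step`, `floor`) are illustrative assumptions.

```python
def match_pose(candidate, pose_database, similarity,
               preset_similarity=0.9, relax_step=0.05, floor=0.5):
    """Match a pixel combination against pose models, starting from the
    T-pose model (listed first); if no model reaches the preset similarity,
    relax the standard and try again, mirroring the loop described above."""
    threshold = preset_similarity
    while threshold >= floor:
        for pose in pose_database:            # T-pose model comes first
            if similarity(candidate, pose) >= threshold:
                return pose                   # matched posture model
        threshold -= relax_step               # lower the similarity standard
    return None                               # no sufficiently similar pose
```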
The preset human body parts are body parts set in advance. Those skilled in the art will understand that they may include one or more of the head, arms, torso, and legs. For example, they may include only the arms and torso (for virtual fitting of tops), or only the legs (for virtual fitting of bottoms such as trousers or skirts); the embodiment of the present invention imposes no particular limitation on this.
In addition, the human body posture database is obtained in advance through training with a deep learning framework such as TensorFlow, and the trained database contains a large or even massive number of different two-dimensional human postures. TensorFlow is the second-generation artificial intelligence learning system developed by Google; it transfers complex data structures into artificial neural networks for analysis and processing, can be used in many machine learning and deep learning fields such as speech recognition and image recognition, and supports convolutional neural networks, recurrent neural networks, long short-term memory networks, and the like.
Next, the recognized two-dimensional posture of the clothed body is matched against the human body posture database to determine the data set and pose of each preset body part in the posture. Preferably, this can be achieved as follows:
First, while recognizing the two-dimensional posture, the skeletal point positions of each preset body part can be recognized; combined with the boundary data of the posture, the data sets of the various preset parts are determined, e.g., the head data set, the torso data set, and the leg data set.
Further, to improve the accuracy of pose recognition for each preset part, the data of each part are combined and clustered, portions of similar color are combined, and the detail parts of each preset part are further determined, e.g., the upper arm and lower arm, the elbow joint and the finger joints of the arm; or the knee joint, the thigh, and the calf of the leg.
In addition, to further improve the accuracy of pose recognition for each preset part, when the orientation and angle of a two-dimensional posture in the database are inconsistent with those of the recognized posture, rotation and conversion are performed, i.e., the orientation and angle of the database posture are adjusted to a position substantially consistent with the recognized posture, and the matching then proceeds.
After the data sets of the above preset parts in the two-dimensional posture (denoted A0 for convenience) are determined, the data sets of the preset parts of A0 are matched with the two-dimensional postures in the database. When the similarity between each preset part's data set of A0 and the corresponding preset part of some database posture A1 is not less than the preset similarity, A1 is determined to be the matching posture, and the poses of the preset parts of A0 are determined from the poses of the preset parts of A1.
After the data set and pose of each preset part in the two-dimensional posture are determined, the portions of the parts covered by the original clothing are directly assigned the cloth physical properties of the clothing to be tested, to perform the virtual fitting operation.
A cloth physical property refers to the maximum changeable distance (stretch scale) and the minimum distance (rebound scale) between two adjacent pixels in the cloth. The cloth physical properties of every garment to be tested, and of the original garment worn on the body, are known, pre-prepared data; when assigning the properties of a garment, a user instruction may be received (e.g., the user inputs an instruction by clicking, or the cloth physical properties of garments are pre-coded and the user inputs the corresponding code number).
In addition, the cloth physical properties of garments may be acquired through a high-definition camera, and the properties of all garments form a cloth physical property database. The resolution of the high-definition camera is generally unrestricted, e.g., from 460x320 up to 4K, 8K, or 16K, with the upper limit determined by the hardware. Alternatively, the cloth physical properties of garments may be acquired through machine learning to form the database. When assigning properties, the cloth physical properties of the garment to be tested may be determined from the cloth physical property database.
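The cloth physical property defined above — the maximum (stretch) and minimum (rebound) distance allowed between adjacent cloth pixels — can be represented minimally as a pair of limits. The class and method names are illustrative assumptions, not from the source.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ClothPhysics:
    """Maximum (stretch scale) and minimum (rebound scale) changeable
    distance between two adjacent pixels of the cloth."""
    stretch_limit: float   # maximum inter-pixel distance
    rebound_limit: float   # minimum inter-pixel distance

    def clamp(self, distance: float) -> float:
        """Constrain an inter-pixel distance to the cloth's physical range."""
        return max(self.rebound_limit, min(self.stretch_limit, distance))
```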
In the embodiment of the present invention, image data including the clothed body is acquired through the image acquisition device; the two-dimensional human body posture of the clothed body is then recognized from the image data; the preset human body parts of the posture are determined by matching in the human body posture database; and finally, the portions of the preset parts covered by the original clothing are directly assigned the cloth physical properties of the clothing to be tested. The embodiment needs no three-dimensional human body model or clothing model; it only recognizes the two-dimensional posture and directly assigns the cloth physical properties of the clothing to be tested to the portions covered by the original clothing, which can greatly improve the efficiency of virtual fitting and thereby improve the user experience.
FIG. 5 shows another implementation flow of the data processing method provided by an embodiment of the present invention. For ease of description, only the parts related to the embodiment are shown, described in detail as follows:
In an embodiment of the present invention, as shown in FIG. 5, on the basis of the method steps shown in FIG. 1, the data processing method further includes:
Step 501: determining, according to the basic two-dimensional human body posture, the posture change of each preset body part in the two-dimensional posture; the basic two-dimensional posture is the projection data of the basic three-dimensional human body model on the first plane;
Step 502: assigning the cloth physical properties of the original clothing to the preset body parts covered by the original clothing in the two-dimensional posture, forming the original clothing data;
Step 503: assigning the cloth physical properties of the clothing to be tested to the preset body parts covered by that clothing in the basic two-dimensional posture, forming the clothing data to be tested;
Step 504: according to the posture change of each preset body part, covering the original clothing data of each preset part in the two-dimensional posture with the clothing data to be tested of the corresponding part in the basic posture, forming a two-dimensional posture containing the clothing data to be tested.
In the embodiment of the present invention, there is a basic three-dimensional human body model, a standard T-pose model: in the standard T-pose model, both arms are raised horizontally, perpendicular to the shoulders, and the model is named T-pose for its resemblance to the letter T. The basic two-dimensional posture is the projection data of the T-pose basic three-dimensional human body model on the first plane.
FIG. 6 shows the basic two-dimensional human body posture provided by an embodiment of the present invention (center line of the field of view perpendicular to the first plane). For ease of description, only the related parts are shown, described in detail as follows:
As shown in FIG. 6, the posture shown in the figure is the projection data of the T-pose basic three-dimensional human body model on the first plane, i.e., the basic two-dimensional posture.
To improve the realism of the virtual fitting, the posture change of each preset part in the two-dimensional posture shown in FIG. 4 is first determined based on the basic posture, laying a foundation for the subsequent covering of cloth physical properties and the virtual fitting.
Since directly assigning the cloth physical properties of the clothing to be tested to the preset parts covered by the original clothing would discount the realism of the virtual fitting, the cloth physical properties of the clothing to be tested can instead cover the cloth physical properties of the original clothing to improve realism.
Specifically, the preset parts covered by the original clothing in the two-dimensional posture are directly assigned the cloth physical properties of the original clothing, forming the original clothing data; the preset parts covered by the clothing to be tested in the basic posture are assigned the cloth physical properties of that clothing, forming the clothing data to be tested.
After the original clothing data in the two-dimensional posture and the clothing data to be tested in the basic posture are determined, then according to the posture change of each preset part — for example, adjusting the pose of each preset part in the basic posture to a state consistent with the pose of that part in the two-dimensional posture — the clothing data to be tested of each part in the basic posture covers the original clothing data of the corresponding part in the two-dimensional posture; that is, the covering is performed separately for each body part, finally forming a two-dimensional posture containing the clothing data to be tested, completing the virtual fitting operation.
In the embodiment of the present invention, the posture change of each preset part is determined based on the basic posture; the preset parts covered by the original clothing are assigned the original clothing's cloth physical properties to form the original clothing data; the preset parts covered by the clothing to be tested in the basic posture are assigned that clothing's cloth physical properties to form the clothing data to be tested; and the original clothing data of each preset part is then covered with the corresponding clothing data to be tested according to the posture changes, which can improve the realism of the virtual fitting.
FIG. 7 shows the implementation flow of step 504 in the data processing method provided by an embodiment of the present invention. For ease of description, only the related parts are shown, described in detail as follows:
In an embodiment of the present invention, the preset human body parts include the arms and the torso. As shown in FIG. 7, step 504 — covering the original clothing data of each preset part in the two-dimensional posture with the clothing data to be tested of the corresponding part in the basic posture according to the posture change of each preset part, forming a two-dimensional posture containing the clothing data to be tested — includes:
Step 701: according to the posture change of the arm part in the two-dimensional posture, covering the original clothing data of the arm part with the clothing data to be tested of the arm part in the basic posture, forming an arm part containing the clothing data to be tested; here, the first preset vertex and first preset boundary of the clothing data to be tested of the arm part in the basic posture coincide with the first preset vertex and first preset boundary of the original clothing data of the arm part in the two-dimensional posture, or the second preset boundary and its center of the clothing data to be tested coincide with the second preset boundary and its center of the original clothing data;
Step 702: covering the original clothing data of the torso part in the two-dimensional posture with the clothing data to be tested of the torso part in the basic posture, forming a torso part containing the clothing data to be tested; here, the third preset vertex and third preset boundary of the clothing data to be tested of the torso part in the basic posture coincide with those of the original clothing data of the torso part in the two-dimensional posture, or the fourth preset boundary and its center of the clothing data to be tested coincide with those of the original clothing data.
In the embodiment of the present invention, the preset body parts include the arms and the torso (i.e., applicable when the clothing to be tested is a top).
FIG. 8 shows the clothing data to be tested of the arm part covering the original clothing data, provided by an embodiment of the present invention. For ease of description, only the related parts are shown, described in detail as follows:
As shown in FIG. 8, when the clothing data to be tested covers the original clothing data for the arm part, the pose of the arm part in the basic posture containing the clothing data to be tested is adjusted, according to the posture change of the arm part in the two-dimensional posture, to be consistent with the pose of the arm part in the two-dimensional posture containing the original clothing data; the clothing data to be tested of the arm part then covers the original clothing data of the arm part, forming an arm part containing the clothing data to be tested.
As shown in FIG. 8 (the right arm is described as an example; the left arm is the same), the first preset vertex is the upper vertex where the arm meets the torso: it is A1 for the clothing data to be tested of the arm in the basic posture, and A2 for the original clothing data of the arm in the two-dimensional posture. The first preset boundary is the upper boundary of the arm: it is P1 for the clothing data to be tested, and P2 for the original clothing data. During covering, A1 coincides with A2 and P1 coincides with P2 to complete the covering. Covering with the first preset vertex and first preset boundary coinciding can further improve the realism of the virtual fitting.
In addition, the first preset vertex may also be a vertex other than A1 (A2), and the first preset boundary may also be a boundary other than P1 (P2); the embodiment of the present invention imposes no particular limitation on this.
In the embodiment of the present invention, covering with the first preset vertex and first preset boundary of the clothing data to be tested of the arm coinciding with those of the original clothing data of the arm can further improve the realism of the virtual fitting.
In an embodiment of the present invention, the second preset boundary is the upper boundary of the arm (identical to the first preset boundary above): it is P1 for the clothing data to be tested in the basic posture, and P2 for the original clothing data in the two-dimensional posture. During covering, P1 coincides with P2, and the centers of P1 and P2 coincide (not shown in the figure). Covering with the second preset boundary and its center coinciding can further improve the realism of the virtual fitting.
In addition, the second preset boundary may also be a boundary other than P1 (P2); the embodiment of the present invention imposes no particular limitation on this.
In the embodiment of the present invention, covering with the second preset boundary and its center of the clothing data to be tested of the arm coinciding with those of the original clothing data of the arm can further improve the realism of the virtual fitting.
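Making a preset vertex of the garment data to be tested coincide with the corresponding vertex of the original garment data (e.g., A1 with A2) amounts to a two-dimensional translation of the garment data. The following sketch assumes garments are represented as lists of (x, y) points; the function name is illustrative, not from the source.

```python
def align_by_vertex(garment_points, src_vertex, dst_vertex):
    """Translate every point of the garment data to be tested so that its
    preset vertex (src_vertex, e.g. A1) coincides with the corresponding
    preset vertex of the original garment data (dst_vertex, e.g. A2)."""
    dx = dst_vertex[0] - src_vertex[0]
    dy = dst_vertex[1] - src_vertex[1]
    return [(x + dx, y + dy) for (x, y) in garment_points]
```

Aligning a boundary and its center works the same way: the translation is the offset between the two boundary centers.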
FIG. 9(a) shows the clothing data to be tested of the torso part covering the original clothing data, provided by an embodiment of the present invention. For ease of description, only the related parts are shown, described in detail as follows:
As shown in FIG. 9(a), when the clothing data to be tested covers the original clothing data for the torso part, since in the two-dimensional plane the pose of the torso is essentially unchanged and the torso can be approximated as a square region, the clothing data to be tested of the torso may directly cover the original clothing data of the torso, forming a torso part containing the clothing data to be tested.
As shown in FIG. 9(a), the third preset vertex may be the upper vertex where the torso meets the arm: it is B1 for the clothing data to be tested of the torso in the basic posture (in this embodiment, essentially coinciding with the vertex A1), and B2 for the original clothing data of the torso in the two-dimensional posture. The third preset boundary consists of the upper and left boundaries of the torso: upper boundary Q1 and left boundary Q2 for the clothing data to be tested, and upper boundary Q3 and left boundary Q4 for the original clothing data. During covering, B1 coincides with B2, the upper boundary Q1 coincides with the upper boundary Q3, and the left boundary Q2 coincides with the left boundary Q4 to complete the covering. Covering with the third preset vertex and third preset boundary of the torso coinciding can further improve the realism of the virtual fitting.
In addition, the third preset vertex may also be a vertex other than B1 (B2), for example the upper vertex where the torso meets the right arm, and the third preset boundary may also be boundaries other than Q1 and Q2 (Q3 and Q4), for example the upper and right boundaries of the torso; the embodiment of the present invention imposes no particular limitation on this.
In the embodiment of the present invention, covering with the third preset vertex and third preset boundary of the clothing data to be tested of the torso in the basic posture coinciding with those of the original clothing data of the torso in the two-dimensional posture can further improve the realism of the virtual fitting.
FIG. 9(b) shows another illustration of the clothing data to be tested of the torso part covering the original clothing data, provided by an embodiment of the present invention. For ease of description, only the related parts are shown, described in detail as follows:
As shown in FIG. 9(b), in an embodiment of the present invention, the fourth preset boundary is the upper boundary of the torso (identical to the third preset boundary in the above embodiment): for the clothing data to be tested in the basic posture it is Q1, with center O1; for the original clothing data in the two-dimensional posture it is Q3, with center O2. During covering, Q1 coincides with Q3, and the center O1 of Q1 coincides with the center O2 of Q3. Covering with the fourth preset boundary and its center coinciding can further improve the realism of the virtual fitting.
In the embodiment of the present invention, covering with the fourth preset boundary and its center of the clothing data to be tested of the torso in the basic posture coinciding with those of the original clothing data of the torso in the two-dimensional posture can further improve the realism of the virtual fitting.
FIG. 10 shows another implementation flow of step 504 in the data processing method provided by an embodiment of the present invention. For ease of description, only the related parts are shown, described in detail as follows:
In an embodiment of the present invention, as shown in FIG. 10, step 504 further includes, on the basis of the method steps shown in FIG. 7:
Step 1001: according to the tensile force of the cloth physical properties of the original clothing and the tensile force of the cloth physical properties of the clothing to be tested, stretching or rebounding the clothing data to be tested of the arm part in the basic posture along the extension direction of the arm part in the two-dimensional posture, forming an arm part containing the stretched or rebounded clothing data to be tested;
Step 1002: according to the tensile force of the cloth physical properties of the original clothing and of the clothing to be tested, stretching or rebounding the clothing data to be tested of the torso part in the basic posture along the extension direction of the torso part in the two-dimensional posture, forming a torso part containing the stretched or rebounded clothing data to be tested.
Since the cloth physical properties of different garments differ, different cloth physical properties reflect different tensile forces of the clothing data. Stretching or rebounding the cloth physical properties of the clothing data to be tested, based on the tensile forces of both the original clothing data and the clothing data to be tested, can further improve the realism of the virtual fitting. In addition, when the cloth physical properties of a garment are known, the tensile force of those properties is determined and known.
FIG. 11 shows the stretching and rebounding of the clothing data to be tested of the arm part, provided by an embodiment of the present invention. For ease of description, only the related parts are shown, described in detail as follows:
As shown in FIG. 11, in an embodiment of the present invention, after the clothing data to be tested of the arm part in the basic posture covers the original clothing data of the arm part in the two-dimensional posture, the arm stretch scale or arm rebound scale is determined from the tensile force of the cloth physical properties of the original clothing of the arm and the tensile force of the cloth physical properties of the clothing to be tested of the arm.
Preferably, the arm stretch scale or arm rebound scale may be determined from the difference between the two tensile forces, for example by the following formula:

ΔG₁₂ = G₁ − G₂

where ΔG₁₂ denotes the arm stretch scale or arm rebound scale, G₁ denotes the tensile force of the cloth physical properties of the original clothing of the arm part in the two-dimensional posture, and G₂ denotes the tensile force of the cloth physical properties of the clothing to be tested of the arm part in the basic posture.
When the difference ΔG₁₂ is positive, it is the arm stretch scale; based on it, the cloth physical properties of the clothing to be tested of the arm are stretched along the extension direction of the arm in the two-dimensional posture (the arrow direction of the arm in the last panel of FIG. 11), forming an arm part containing the stretched clothing data to be tested.
When the difference ΔG₁₂ is negative, it is the arm rebound scale; based on it, the cloth physical properties of the clothing to be tested of the arm rebound in the direction opposite to the extension direction of the arm in the two-dimensional posture (opposite the arrow direction in the last panel of FIG. 11), forming an arm part containing the rebounded clothing data to be tested.
In the embodiment of the present invention, stretching or rebounding the clothing data to be tested of the arm along the extension direction of the arm in the two-dimensional posture, according to the tensile forces of the cloth physical properties of the original clothing and of the clothing to be tested, can further improve the realism of the virtual fitting.
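The sign convention above — a positive difference means stretch, a negative difference means rebound — can be sketched as follows; the function name is illustrative, not from the source.

```python
def stretch_or_rebound_scale(g_original: float, g_test: float):
    """Compute the scale dG = G_original - G_test for a body part.

    A positive value is a stretch scale (stretch the garment data to be
    tested along the part's extension direction); a negative value is a
    rebound scale (rebound in the opposite direction)."""
    delta = g_original - g_test
    if delta > 0:
        mode = "stretch"
    elif delta < 0:
        mode = "rebound"
    else:
        mode = "none"
    return delta, mode
```

The same computation applies to the torso (ΔG₃₄) and the legs (ΔG₅₆), with the corresponding tensile forces substituted.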
FIG. 12 shows the stretching and rebounding of the clothing data to be tested of the torso part, provided by an embodiment of the present invention. For ease of description, only the related parts are shown, described in detail as follows:
As shown in FIG. 12, in an embodiment of the present invention, after the clothing data to be tested of the torso in the basic posture covers the original clothing data of the torso in the two-dimensional posture, the torso stretch scale or torso rebound scale is determined from the tensile force of the cloth physical properties of the original clothing of the torso and the tensile force of the cloth physical properties of the clothing to be tested of the torso.
Preferably, the torso stretch scale or torso rebound scale may be determined from the difference between the two tensile forces, specifically by the following formula:

ΔG₃₄ = G₃ − G₄

where ΔG₃₄ denotes the torso stretch scale or torso rebound scale, G₃ denotes the tensile force of the cloth physical properties of the original clothing of the torso in the two-dimensional posture, and G₄ denotes the tensile force of the cloth physical properties of the clothing to be tested of the torso in the basic posture.
The principle for the torso is similar to that for the arm above. In general, the tensile force of the clothing to be tested of the torso in the basic posture is consistent with that of the clothing to be tested of the arm in the basic posture, and the tensile force of the original clothing of the torso in the two-dimensional posture is consistent with that of the original clothing of the arm in the two-dimensional posture.
When the difference ΔG₃₄ is positive, it is the torso stretch scale; based on it, the cloth physical properties of the clothing to be tested of the torso are stretched along the extension direction of the torso in the two-dimensional posture (the arrow direction of the torso in the last panel of FIG. 12), forming a torso part containing the stretched clothing data to be tested.
When the difference ΔG₃₄ is negative, it is the torso rebound scale; based on it, the cloth physical properties of the clothing to be tested of the torso rebound in the direction opposite to the extension direction of the torso in the two-dimensional posture (opposite the arrow direction in the last panel of FIG. 12), forming a torso part containing the rebounded clothing data to be tested.
In the embodiment of the present invention, stretching or rebounding the clothing data to be tested of the torso along the extension direction of the torso in the two-dimensional posture, according to the tensile forces of the cloth physical properties of the original clothing and of the clothing to be tested, can further improve the realism of the virtual fitting.
FIG. 13 shows a further implementation flow of step 504 in the data processing method provided by an embodiment of the present invention. For ease of description, only the related parts are shown, described in detail as follows:
In an embodiment of the present invention, the preset human body parts include the legs. As shown in FIG. 13, step 504 — covering the original clothing data of each preset part in the two-dimensional posture with the clothing data to be tested of the corresponding part in the basic posture according to the posture change of each preset part, forming a two-dimensional posture containing the clothing data to be tested — includes:
Step 1301: covering the original clothing data of the leg part in the two-dimensional posture with the clothing data to be tested of the leg part in the basic posture, forming a leg part containing the clothing data to be tested; here, the fifth preset vertex and fifth preset boundary of the clothing data to be tested of the leg in the basic posture coincide with the fifth preset vertex and fifth preset boundary of the original clothing data of the leg in the two-dimensional posture, or the sixth preset boundary and its center of the clothing data to be tested coincide with the sixth preset boundary and its center of the original clothing data.
Covering the original clothing data of the leg part with the clothing data to be tested follows the same principle as the covering of the arm and torso parts above. When covering, since in the two-dimensional plane the pose of the leg is essentially unchanged and the leg can be approximated as a regular square region, the clothing data to be tested of the leg may directly cover the original clothing data of the leg, forming a leg part containing the clothing data to be tested.
The fifth preset vertex may be the left vertex where the torso meets the leg, and the fifth preset boundary is the boundary formed where the torso meets the leg. During covering, the fifth preset vertex of the clothing data to be tested of the leg in the basic posture coincides with the fifth preset vertex of the original clothing data of the leg in the two-dimensional posture, and the fifth preset boundary of the clothing data to be tested coincides with the fifth preset boundary of the original clothing data, completing the covering. Covering with the fifth preset vertex and fifth preset boundary of the leg coinciding can further improve the realism of the virtual fitting.
In addition, the fifth preset vertex may also be a vertex other than the above, for example the right vertex where the torso meets the leg, and the fifth preset boundary may also be a boundary other than the above, for example the left or right boundary of the leg; the embodiment of the present invention imposes no particular limitation on this.
In the embodiment of the present invention, covering with the fifth preset vertex and fifth preset boundary of the clothing data to be tested of the leg coinciding with those of the original clothing data of the leg can further improve the realism of the virtual fitting.
In an embodiment of the present invention, the sixth preset boundary is the boundary formed where the torso meets the leg (identical to the fifth preset boundary in the above embodiment). During covering, the sixth preset boundary of the clothing data to be tested of the leg in the basic posture coincides with that of the original clothing data of the leg in the two-dimensional posture, and the center of the sixth preset boundary of the clothing data to be tested coincides with the center of the sixth preset boundary of the original clothing data. Covering with the sixth preset boundary and its center coinciding can further improve the realism of the virtual fitting.
In the embodiment of the present invention, covering with the sixth preset boundary and its center of the clothing data to be tested of the leg in the basic posture coinciding with those of the original clothing data of the leg in the two-dimensional posture can further improve the realism of the virtual fitting.
FIG. 14 shows yet another implementation flow of step 504 in the data processing method provided by an embodiment of the present invention. For ease of description, only the related parts are shown, described in detail as follows:
In an embodiment of the present invention, as shown in FIG. 14, step 504 further includes, on the basis of the method steps shown in FIG. 13:
Step 1401: according to the tensile force of the cloth physical properties of the original clothing and of the clothing to be tested, stretching or rebounding the clothing data to be tested of the leg part in the basic posture along the extension direction of the leg part in the two-dimensional posture, forming a leg part containing the stretched or rebounded clothing data to be tested.
Likewise, the stretching or rebounding of the clothing data to be tested of the leg follows the same principle as that of the arm or torso above.
Specifically, after the clothing data to be tested of the leg in the basic posture covers the original clothing data of the leg in the two-dimensional posture, the leg stretch scale or leg rebound scale is determined from the tensile force of the cloth physical properties of the original clothing of the leg and the tensile force of the cloth physical properties of the clothing to be tested of the leg.
Preferably, the leg stretch scale or leg rebound scale may be determined from the difference between the two tensile forces, specifically by the following formula:

ΔG₅₆ = G₅ − G₆

where ΔG₅₆ denotes the leg stretch scale or leg rebound scale, G₅ denotes the tensile force of the cloth physical properties of the original clothing of the leg in the two-dimensional posture, and G₆ denotes the tensile force of the cloth physical properties of the clothing to be tested of the leg in the basic posture.
When the difference ΔG₅₆ is positive, it is the leg stretch scale; based on it, the cloth physical properties of the clothing to be tested of the leg are stretched along the extension direction of the leg in the two-dimensional posture, forming a leg part containing the stretched clothing data to be tested.
When the difference ΔG₅₆ is negative, it is the leg rebound scale; based on it, the cloth physical properties of the clothing to be tested of the leg rebound in the direction opposite to the extension direction of the leg in the two-dimensional posture, forming a leg part containing the rebounded clothing data to be tested.
In the embodiment of the present invention, stretching or rebounding the clothing data to be tested of the leg in the basic posture along the extension direction of the leg in the two-dimensional posture, according to the tensile forces of the cloth physical properties of the original clothing and of the clothing to be tested, can further improve the realism of the virtual fitting.
An embodiment of the present invention also provides a data processing apparatus, as described in the following embodiments. Since the principle by which these apparatuses solve the problem is similar to the data processing method, their implementation may refer to the implementation of the method, and repeated points are not described again.
FIG. 15 shows the functional modules of the data processing apparatus provided by an embodiment of the present invention. For ease of description, only the parts related to the embodiment of the present invention are shown, described in detail as follows:
Referring to FIG. 15, each module included in the data processing apparatus is used to perform each step in the embodiment corresponding to FIG. 1; see FIG. 1 and the related description of its corresponding embodiment for details, which are not repeated here. In the embodiment of the present invention, the data processing apparatus includes an image data acquisition module 1501, a posture recognition module 1502, a posture matching module 1503, and a virtual fitting module 1504.
The image data acquisition module 1501 is used to acquire image data including a clothed human body through an image acquisition device; the image data reflects the projection data of the three-dimensional body on a first plane perpendicular to the center line of the field of view of the image acquisition device.
The posture recognition module 1502 is used to recognize the two-dimensional human body posture of the clothed body from the image data; the two-dimensional posture reflects the projection data of the three-dimensional body on a second plane; the second plane is the plane of symmetry between the front and the back of the clothed body.
The posture matching module 1503 is used to match the recognized two-dimensional posture in a human body posture database to determine the preset human body parts of the posture.
The virtual fitting module 1504 is used to assign the cloth physical properties of the clothing to be tested to the portions of the preset human body parts of the posture that are covered by the original clothing.
In the embodiment of the present invention, the image data acquisition module 1501 acquires the image data including the clothed body through the image acquisition device; the posture recognition module 1502 then recognizes the two-dimensional posture from the image data; the posture matching module 1503 determines the preset body parts of the posture by matching in the posture database; and finally the virtual fitting module 1504 directly assigns the cloth physical properties of the clothing to be tested to the portions of the preset parts covered by the original clothing. The embodiment needs no three-dimensional human body model or clothing model; it only recognizes the two-dimensional posture of the clothed body and directly assigns the properties, which can greatly improve the efficiency of virtual fitting and thereby improve the user experience.
FIG. 16 shows another set of functional modules of the data processing apparatus provided by an embodiment of the present invention. For ease of description, only the related parts are shown, described in detail as follows:
In an embodiment of the present invention, referring to FIG. 16, each module included in the data processing apparatus is used to perform each step in the embodiment corresponding to FIG. 5; see FIG. 5 and the related description of its corresponding embodiment for details, which are not repeated here. In the embodiment of the present invention, on the basis of the module structure shown in FIG. 15, the data processing apparatus further includes a posture change determination module 1601, an original clothing data forming module 1602, a test clothing data forming module 1603, and a clothing data covering module 1604.
The posture change determination module 1601 is used to determine, according to the basic two-dimensional human body posture, the posture change of each preset body part in the two-dimensional posture; the basic posture is the projection data of the basic three-dimensional human body model on the first plane.
The original clothing data forming module 1602 is used to assign the cloth physical properties of the original clothing to the preset body parts covered by the original clothing in the two-dimensional posture, forming the original clothing data.
The test clothing data forming module 1603 is used to assign the cloth physical properties of the clothing to be tested to the preset body parts covered by that clothing in the basic posture, forming the clothing data to be tested.
The clothing data covering module 1604 is used to cover, according to the posture change of each preset part, the original clothing data of each preset part in the two-dimensional posture with the clothing data to be tested of the corresponding part in the basic posture, forming a two-dimensional posture containing the clothing data to be tested.
In the embodiment of the present invention, the posture change determination module 1601 determines the posture change of each preset part based on the basic posture; the original clothing data forming module 1602 assigns the cloth physical properties of the original clothing to the parts it covers, forming the original clothing data; the test clothing data forming module 1603 assigns the cloth physical properties of the clothing to be tested to the parts it covers in the basic posture, forming the clothing data to be tested; and the clothing data covering module 1604 covers the original clothing data of each part with the corresponding clothing data to be tested according to the posture changes, which can improve the realism of the virtual fitting.
Fig. 17 shows a schematic structure of the garment data overlay module 1604 in the data processing apparatus provided by an embodiment of the present invention. For ease of description, only the parts relevant to this embodiment are shown, detailed as follows:
In an embodiment of the present invention, the preset human body parts include the arms and the torso. Referring to Fig. 17, the units of the garment data overlay module 1604 perform the steps of the embodiment corresponding to Fig. 7; for details, refer to Fig. 7 and its related description, which are not repeated here. In this embodiment, the garment data overlay module 1604 includes an arm overlay unit 1701 and a torso overlay unit 1702.
The arm overlay unit 1701 is configured to overlay, according to the posture change of the arm part in the two-dimensional human body posture, the to-be-tried garment data of the arm part in the two-dimensional basic human body posture over the original garment data of the arm part in the two-dimensional human body posture, forming an arm part containing the to-be-tried garment data; wherein the first preset vertex and the first preset boundary of the to-be-tried garment data of the arm part coincide with the first preset vertex and the first preset boundary of the original garment data of the arm part, or the second preset boundary and its center of the to-be-tried garment data of the arm part coincide with the second preset boundary and its center of the original garment data of the arm part.
The torso overlay unit 1702 is configured to overlay the to-be-tried garment data of the torso part in the two-dimensional basic human body posture over the original garment data of the torso part in the two-dimensional human body posture, forming a torso part containing the to-be-tried garment data; wherein the third preset vertex and the third preset boundary of the to-be-tried garment data of the torso part coincide with the third preset vertex and the third preset boundary of the original garment data of the torso part, or the fourth preset boundary and its center of the to-be-tried garment data of the torso part coincide with the fourth preset boundary and its center of the original garment data of the torso part.
In this embodiment of the present invention, the arm overlay unit 1701 performs the overlay with either the first preset vertex and first preset boundary coinciding, or the second preset boundary and its center coinciding, between the to-be-tried garment data and the original garment data of the arm part; the torso overlay unit 1702 likewise performs the overlay with either the third preset vertex and third preset boundary, or the fourth preset boundary and its center, coinciding. Both can further improve the realism of virtual fitting.
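The coincidence constraint above amounts to rigidly translating the trial-garment patch so that a chosen anchor (a preset vertex, or the center of a preset boundary) lands on the corresponding anchor of the original garment data. The sketch below is a hypothetical 2D illustration of that alignment, not the patented implementation; the polyline representation of a boundary is an assumption made here.

```python
from typing import List, Tuple

Point = Tuple[float, float]

def boundary_center(boundary: List[Point]) -> Point:
    """Centroid of a preset boundary given as a polyline of 2D points."""
    n = len(boundary)
    return (sum(x for x, _ in boundary) / n, sum(y for _, y in boundary) / n)

def align_patch(patch: List[Point], anchor: Point, target: Point) -> List[Point]:
    """Translate every point of the trial-garment patch so that `anchor`
    (its preset vertex or boundary center) coincides with `target`
    (the matching anchor on the original garment data)."""
    dx, dy = target[0] - anchor[0], target[1] - anchor[1]
    return [(x + dx, y + dy) for x, y in patch]
```

Either alignment rule (vertex-to-vertex or center-to-center) reduces to the same translation once the anchor pair is chosen.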
Fig. 18 shows another schematic structure of the garment data overlay module 1604 in the data processing apparatus provided by an embodiment of the present invention. For ease of description, only the parts relevant to this embodiment are shown, detailed as follows:
In an embodiment of the present invention, referring to Fig. 18, the units of the garment data overlay module 1604 perform the steps of the embodiment corresponding to Fig. 10; for details, refer to Fig. 10 and its related description, which are not repeated here. In this embodiment, on the basis of the unit structure shown in Fig. 17, the garment data overlay module 1604 further includes an arm stretch-rebound unit 1801 and a torso stretch-rebound unit 1802.
The arm stretch-rebound unit 1801 is configured to stretch or rebound, according to the tension in the cloth physical properties of the original garment and the tension in the cloth physical properties of the garment to be tried on, the to-be-tried garment data of the arm part in the two-dimensional basic human body posture along the extension direction of the arm part in the two-dimensional human body posture, forming an arm part containing the stretched or rebounded to-be-tried garment data.
The torso stretch-rebound unit 1802 is configured to stretch or rebound, according to the two tensions, the to-be-tried garment data of the torso part in the two-dimensional basic human body posture along the extension direction of the torso part in the two-dimensional human body posture, forming a torso part containing the stretched or rebounded to-be-tried garment data.
In this embodiment of the present invention, the arm stretch-rebound unit 1801 and the torso stretch-rebound unit 1802 stretch or rebound the to-be-tried garment data of the arm part and the torso part, respectively, along the corresponding extension directions according to the two tensions, which can further improve the realism of virtual fitting.
Fig. 19 shows a further schematic structure of the garment data overlay module 1604 in the data processing apparatus provided by an embodiment of the present invention. For ease of description, only the parts relevant to this embodiment are shown, detailed as follows:
In an embodiment of the present invention, the preset human body parts include the legs. Referring to Fig. 19, the units of the garment data overlay module 1604 perform the steps of the embodiment corresponding to Fig. 13; for details, refer to Fig. 13 and its related description, which are not repeated here. In this embodiment, the garment data overlay module 1604 includes a leg overlay unit 1901.
The leg overlay unit 1901 is configured to overlay the to-be-tried garment data of the leg part in the two-dimensional basic human body posture over the original garment data of the leg part in the two-dimensional human body posture, forming a leg part containing the to-be-tried garment data; wherein the fifth preset vertex and the fifth preset boundary of the to-be-tried garment data of the leg part coincide with the fifth preset vertex and the fifth preset boundary of the original garment data of the leg part, or the sixth preset boundary and its center of the to-be-tried garment data of the leg part coincide with the sixth preset boundary and its center of the original garment data of the leg part.
In this embodiment of the present invention, the leg overlay unit 1901 performs the overlay with either the fifth preset vertex and fifth preset boundary coinciding, or the sixth preset boundary and its center coinciding, which can further improve the realism of virtual fitting.
Fig. 20 shows yet another schematic structure of the garment data overlay module 1604 in the data processing apparatus provided by an embodiment of the present invention. For ease of description, only the parts relevant to this embodiment are shown, detailed as follows:
In an embodiment of the present invention, referring to Fig. 20, the units of the garment data overlay module 1604 perform the steps of the embodiment corresponding to Fig. 14; for details, refer to Fig. 14 and its related description, which are not repeated here. In this embodiment, on the basis of the unit structure shown in Fig. 19, the garment data overlay module 1604 further includes a leg stretch-rebound unit 2001.
The leg stretch-rebound unit 2001 is configured to stretch or rebound, according to the tension in the cloth physical properties of the original garment and the tension in the cloth physical properties of the garment to be tried on, the to-be-tried garment data of the leg part in the two-dimensional basic human body posture along the extension direction of the leg part in the two-dimensional human body posture, forming a leg part containing the stretched or rebounded to-be-tried garment data.
In this embodiment of the present invention, the leg stretch-rebound unit 2001 stretches or rebounds the to-be-tried garment data of the leg part along the extension direction of the leg part according to the two tensions, which can further improve the realism of virtual fitting.
In this specification, the color changes of the pixels on the first plane and the second plane are synchronized.
An embodiment of the present invention further provides a computer device, including a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the processor implements the above data processing method when executing the computer program.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program that performs the above data processing method.
In summary, in the embodiments of the present invention, image data including a clothed human body is acquired through an image acquisition device; the two-dimensional human body posture of the clothed human body is recognized from the image data; the posture is matched against a human body posture database to determine its preset human body parts; and finally the cloth physical properties of the garment to be tried on are assigned directly to the portions of those parts covered by the original garment. The embodiments need no three-dimensional human body model or garment model: they only recognize the two-dimensional human body posture and assign the trial garment's cloth physical properties directly to the covered portions, which greatly improves the efficiency of virtual fitting and thus the user experience.
Those skilled in the art should understand that embodiments of the present invention may be provided as a method, a system, or a computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical storage) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, such that the instructions executed by the processor produce means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data processing device to work in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operational steps is performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
The specific embodiments described above further explain the objectives, technical solutions, and beneficial effects of the present invention in detail. It should be understood that they are merely specific embodiments of the present invention and are not intended to limit its scope of protection; any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included in the scope of protection of the present invention.

Claims (10)

  1. A data processing method, comprising:
    acquiring, through an image acquisition device, image data including a clothed human body, the image data of the clothed human body reflecting projection data of a three-dimensional human body on a first plane perpendicular to the center line of the field of view of the image acquisition device;
    recognizing a two-dimensional human body posture of the clothed human body from the image data, the two-dimensional human body posture reflecting projection data of the three-dimensional human body on a second plane, the second plane being the plane of symmetry between the front and the back of the clothed human body;
    matching the recognized two-dimensional human body posture against a human body posture database to determine preset human body parts of the two-dimensional human body posture; and
    assigning cloth physical properties of a garment to be tried on to portions of the preset human body parts of the two-dimensional human body posture covered by an original garment.
  2. The data processing method according to claim 1, wherein the first plane and the second plane are the same plane, or the first plane and the second plane are different planes.
  3. The data processing method according to claim 1, further comprising:
    determining, from a two-dimensional basic human body posture, a posture change of each preset human body part of the two-dimensional human body posture, the two-dimensional basic human body posture being projection data of a three-dimensional basic human body model on the first plane;
    assigning the cloth physical properties of the original garment to the preset human body parts of the two-dimensional human body posture covered by the original garment, to form original garment data;
    assigning the cloth physical properties of the garment to be tried on to the preset human body parts of the two-dimensional basic human body posture covered by the garment to be tried on, to form to-be-tried garment data; and
    overlaying, according to the posture change of each preset human body part of the two-dimensional human body posture, the to-be-tried garment data of each preset human body part of the two-dimensional basic human body posture over the original garment data of the corresponding preset human body part of the two-dimensional human body posture, to form a two-dimensional human body posture containing the to-be-tried garment data.
  4. The data processing method according to claim 3, wherein the preset human body parts include arms and a torso, and the overlaying of the to-be-tried garment data over the original garment data according to the posture change of each preset human body part, to form a two-dimensional human body posture containing the to-be-tried garment data, comprises:
    overlaying, according to the posture change of the arm part in the two-dimensional human body posture, the to-be-tried garment data of the arm part in the two-dimensional basic human body posture over the original garment data of the arm part in the two-dimensional human body posture, to form an arm part containing the to-be-tried garment data, wherein a first preset vertex and a first preset boundary of the to-be-tried garment data of the arm part coincide with a first preset vertex and a first preset boundary of the original garment data of the arm part, or a second preset boundary and its center of the to-be-tried garment data of the arm part coincide with a second preset boundary and its center of the original garment data of the arm part; and
    overlaying the to-be-tried garment data of the torso part in the two-dimensional basic human body posture over the original garment data of the torso part in the two-dimensional human body posture, to form a torso part containing the to-be-tried garment data, wherein a third preset vertex and a third preset boundary of the to-be-tried garment data of the torso part coincide with a third preset vertex and a third preset boundary of the original garment data of the torso part, or a fourth preset boundary and its center of the to-be-tried garment data of the torso part coincide with a fourth preset boundary and its center of the original garment data of the torso part.
  5. The data processing method according to claim 4, further comprising:
    stretching or rebounding, according to a tension in the cloth physical properties of the original garment and a tension in the cloth physical properties of the garment to be tried on, the to-be-tried garment data of the arm part in the two-dimensional basic human body posture along the extension direction of the arm part in the two-dimensional human body posture, to form an arm part containing the stretched or rebounded to-be-tried garment data; and
    stretching or rebounding, according to the two tensions, the to-be-tried garment data of the torso part in the two-dimensional basic human body posture along the extension direction of the torso part in the two-dimensional human body posture, to form a torso part containing the stretched or rebounded to-be-tried garment data.
  6. The data processing method according to claim 3, wherein the preset human body parts include legs, and the overlaying of the to-be-tried garment data over the original garment data according to the posture change of each preset human body part, to form a two-dimensional human body posture containing the to-be-tried garment data, comprises:
    overlaying the to-be-tried garment data of the leg part in the two-dimensional basic human body posture over the original garment data of the leg part in the two-dimensional human body posture, to form a leg part containing the to-be-tried garment data, wherein a fifth preset vertex and a fifth preset boundary of the to-be-tried garment data of the leg part coincide with a fifth preset vertex and a fifth preset boundary of the original garment data of the leg part, or a sixth preset boundary and its center of the to-be-tried garment data of the leg part coincide with a sixth preset boundary and its center of the original garment data of the leg part.
  7. The data processing method according to claim 6, further comprising:
    stretching or rebounding, according to a tension in the cloth physical properties of the original garment and a tension in the cloth physical properties of the garment to be tried on, the to-be-tried garment data of the leg part in the two-dimensional basic human body posture along the extension direction of the leg part in the two-dimensional human body posture, to form a leg part containing the stretched or rebounded to-be-tried garment data.
  8. A data processing apparatus, comprising:
    an image data acquisition module, configured to acquire, through an image acquisition device, image data including a clothed human body, the image data reflecting projection data of a three-dimensional human body on a first plane perpendicular to the center line of the field of view of the image acquisition device;
    a posture recognition module, configured to recognize a two-dimensional human body posture of the clothed human body from the image data, the two-dimensional human body posture reflecting projection data of the three-dimensional human body on a second plane, the second plane being the plane of symmetry between the front and the back of the clothed human body;
    a posture matching module, configured to match the recognized two-dimensional human body posture against a human body posture database to determine preset human body parts of the two-dimensional human body posture; and
    a virtual fitting module, configured to assign cloth physical properties of a garment to be tried on to portions of the preset human body parts of the two-dimensional human body posture covered by an original garment.
  9. A computer device, including a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the processor implements the data processing method according to any one of claims 1 to 7 when executing the computer program.
  10. A computer-readable storage medium, storing a computer program that performs the data processing method according to any one of claims 1 to 7.
PCT/CN2020/113198 2019-09-03 2020-09-03 Data processing method and apparatus, computer device, and computer-readable storage medium WO2021043204A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080061120.6A CN114556402A (zh) Data processing method and apparatus, computer device, and computer-readable storage medium
US17/639,965 US11849790B2 (en) 2019-09-03 2020-09-03 Apparel fitting simulation based upon a captured two-dimensional human body posture image

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910827228 2019-09-03
CN201910827228.0 2019-09-03

Publications (1)

Publication Number Publication Date
WO2021043204A1 true WO2021043204A1 (zh) 2021-03-11

Family

ID=74853474

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/113198 WO2021043204A1 (zh) 2019-09-03 2020-09-03 数据处理方法及装置、计算机设备及计算机可读存储介质

Country Status (3)

Country Link
US (1) US11849790B2 (zh)
CN (1) CN114556402A (zh)
WO (1) WO2021043204A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130243259A1 (en) * 2010-12-09 2013-09-19 Panasonic Corporation Object detection device and object detection method
CN106920146A * 2017-02-20 2017-07-04 Ningbo University Stereoscopic fitting method based on somatosensory feature parameter extraction
CN107622428A * 2016-07-14 2018-01-23 Xingfu Online (Beijing) Network Technology Co., Ltd. Method and device for implementing virtual try-on
CN107977885A * 2017-12-12 2018-05-01 Beijing Xiaomi Mobile Software Co., Ltd. Virtual try-on method and device
CN108460338A * 2018-02-02 2018-08-28 Beijing SenseTime Technology Development Co., Ltd. Human body posture estimation method and device, electronic device, storage medium, and program

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7149665B2 (en) * 2000-04-03 2006-12-12 Browzwear International Ltd System and method for simulation of virtual wear articles on virtual models
US9292967B2 (en) * 2010-06-10 2016-03-22 Brown University Parameterized model of 2D articulated human shape
US20150134302A1 (en) * 2013-11-14 2015-05-14 Jatin Chhugani 3-dimensional digital garment creation from planar garment photographs
US10546433B2 (en) * 2017-08-03 2020-01-28 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for modeling garments using single view images


Also Published As

Publication number Publication date
CN114556402A (zh) 2022-05-27
US11849790B2 (en) 2023-12-26
US20220322775A1 (en) 2022-10-13

Similar Documents

Publication Publication Date Title
US10679046B1 (en) Machine learning systems and methods of estimating body shape from images
CN107111833B (zh) Rapid 3D model fitting and anthropometric measurement
CN110399809A (zh) Multi-feature-fusion face key point detection method and device
CN104794722A (zh) Method for computing a three-dimensional net body model of a clothed human body using a single Kinect
US20150235305A1 (en) Garment modeling simulation system and process
Hu et al. 3DBodyNet: fast reconstruction of 3D animatable human body shape from a single commodity depth camera
JP2019096113A (ja) Processing device, method, and program for keypoint data
CN110503681B (zh) Automatic human body model creation method and three-dimensional fitting system
CN111445426B (zh) Target clothing image processing method based on a generative adversarial network model
WO2021063271A1 (zh) Human body model reconstruction method, reconstruction system, and storage medium
Cho et al. An implementation of a garment-fitting simulation system using laser scanned 3D body data
AU2020300067B2 (en) Layered motion representation and extraction in monocular still camera videos
TWI750710B (zh) Image processing method and apparatus, image processing device, and storage medium
CN109407828A (zh) Gaze point estimation method and system, storage medium, and terminal
US20240193899A1 (en) Methods of estimating a bare body shape from a concealed scan of the body
CN114049683A (zh) Post-recovery rehabilitation auxiliary detection system, method, and medium based on a three-dimensional human skeleton model
Cao et al. Human posture recognition using skeleton and depth information
WO2021043204A1 (zh) Data processing method and apparatus, computer device, and computer-readable storage medium
Xu et al. 3D joints estimation of the human body in single-frame point cloud
Lan et al. Data fusion-based real-time hand gesture recognition with Kinect V2
CN111275610A (zh) Face aging image processing method and system
CN114723517A (zh) Virtual fitting method and apparatus, and storage medium
CN115331304A (zh) Running recognition method
CN110148202B (zh) Method, apparatus, device, and storage medium for generating an image
Xiong et al. Gaze estimation based on 3D face structure and pupil centers

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20861096

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20861096

Country of ref document: EP

Kind code of ref document: A1